explabox
The Explabox aims to support data scientists and machine learning (ML) engineers in explaining, testing and documenting AI/ML models, whether developed in-house or acquired externally. The Explabox turns your ingestibles (AI/ML model and/or dataset) into digestibles (statistics, explanations or sensitivity insights)!
To install, run:
$ pip3 install explabox
Currently, the main interface for working with the Explabox is Jupyter Notebook. For more help, read the documentation at https://explabox.rtfd.io.
Explabox is developed by the Dutch National Police Lab AI (NPAI), and released under the GNU Lesser General Public License v3.0 (GNU LGPLv3).
- class explabox.Explabox(ingestibles=None, locale='en', **kwargs)
Bases: Readable, IngestiblesMixin
Use the Explabox to .explore, .examine, .expose and .explain your AI model.
Example
>>> from explabox import Explabox
>>> box = Explabox(data=data, model=model)
- Parameters:
ingestibles (Optional[Ingestible], optional) – Ingestibles (data and model). Defaults to None.
locale (str, optional) – Language of dataset. Defaults to ‘en’.
**kwargs – Arguments used to construct an Ingestible (if the ingestibles argument is None).
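The constructor behaviour described by these parameters (use the supplied ingestibles if given, otherwise build an Ingestible from **kwargs) can be sketched as follows. This is a minimal, illustrative reimplementation, not the actual explabox source; the Ingestible stand-in and attribute names are assumptions for demonstration only:

```python
from typing import Optional


class Ingestible(dict):
    """Illustrative stand-in for explabox's Ingestible: bundles data and model."""


class ExplaboxSketch:
    """Sketch of the documented constructor dispatch, for illustration only."""

    def __init__(self, ingestibles: Optional[Ingestible] = None, locale: str = "en", **kwargs):
        if ingestibles is None:
            # No Ingestible supplied: wrap the keyword arguments
            # (e.g. data=..., model=...) into one.
            ingestibles = Ingestible(**kwargs)
        self.ingestibles = ingestibles
        self.locale = locale
```

Under this sketch, `ExplaboxSketch(data=my_data, model=my_model)` and `ExplaboxSketch(ingestibles=Ingestible(data=my_data, model=my_model))` produce equivalent objects.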
Subpackages:
- explabox.digestibles
- explabox.examine
- explabox.explain
- explabox.explore
- explabox.expose
- explabox.ingestibles
- explabox.ui
- explabox.utils
Submodules:
explabox.config module
Configuration for default paths and variables.
explabox.mixins module
Extensions to classes.
- class explabox.mixins.IngestiblesMixin
Bases: object
- check_requirements(elements=['data', 'model'])
Check if the required elements are in the ingestibles.
- Parameters:
elements (List[str], optional) – Elements to check. Defaults to [‘data’, ‘model’].
- Raises:
ValueError – The required element is not in the ingestibles.
- Returns:
True if all requirements are included.
- Return type:
bool
- property data
All data.
- property labels
Labelprovider.
- property labelset
Names of labels.
- property model
Predictive model.
- property splits
Named splits.