modelstore 0.0.74 (April 2022)
🆕 New functionality
get_domain() returns key metadata about a domain (#141)
delete_model() deletes models from modelstore. If the user attempts to query for a model after it has been deleted, modelstore will raise a ModelDeletedException (#137)
list_model_states() lists all of the existing model states (#131)
You can optionally set a model_id value when uploading a model (#147, #165), thanks @cdknorow.
🆕 Storage improvements
The file system storage can now be configured to create its root directory if it doesn't already exist (#143, thanks @cdknorow)
Public, read-only Google Cloud Storage containers can now be read from using modelstore (#142, thanks @ionicsolutions)
Previously, any extra files you wanted to upload were uploaded separately from the model archive. Now, they are added to the archive in a subdirectory called "extras" so that you can easily download them back (#139). I've also added an example of uploading a model with some additional files (#138).
🐛 Bug fixes & general updates
Fixed a regression: keras models saved with an older version of modelstore couldn't be loaded (#145).
Updated the names of the environment variables that are checked when setting the modelstore storage root (prefixes). Previously, every storage type used the same variable name, which caused issues if you were creating more than one type of modelstore.
ℹ️ General updates
The list_versions() function is deprecated and has been replaced with list_models() (#132)
Python 3.6 has passed its end-of-life, so this library is now tested with Python 3.7 and above.
modelstore 0.0.73 (February 2022)
🆕 New functionality
You can upload multiple models to the same archive, as long as each one is passed with a distinct keyword. For example, modelstore.upload(domain, model=sklearn_model, explainer=shap_explainer) uploads a model and an explainer together, and downloads them together too.
You can now set the root prefix of your model registry storage (thank you, @cdknorow!).
Extended the CLI functionality! You can now run python -m modelstore upload <domain> <model-file> to upload a model. This requires you to set environment variables.
Added support for uploading skorch models.
🐛 Bug fixes
Merged the model managers for keras and tensorflow into one.
modelstore 0.0.72 (November 2021)
🆕 New functionality
Added support for unsetting model states - hat tip to @erosenthal-square, who opened an issue about this.
Added support for uploading shap explainers.
Added CLI functionality! You can now python -m modelstore download <domain> <model-id> <directory> to download a model. This requires you to set environment variables.
Added Prophet support.
Need to upload additional files alongside your model? You can now use the extras= kwarg in modelstore.upload() to point modelstore to a file (or list of files) to upload as well.
🐛 Bug fixes
Saving complex sklearn pipelines was raising a TypeError. This is because the get_params() function, which modelstore uses to save metadata about the model, returns many values that are not JSON serializable. For now, I've patched this by not returning metadata for sklearn.pipeline.Pipeline models.
Colab currently runs fastai==1.0.61, while modelstore was designed for fastai>2, so things would break in Colab notebooks due to the different import paths in the two versions of fastai. The import paths are now version-dependent.
Updated the library so that PyTorch models can be uploaded without an optimizer. This is useful for uploading pretrained embedding models!
Fixed a logging bug when trying to download the latest model in a domain - hat tip to @erosenthal-square who found the issue.
Fixed an ImportError bug when trying to use modelstore on an instance that does not have git installed.
modelstore 0.0.71 (September 2021)
🆕 New functionality
Load models straight into memory! modelstore previously had modelstore.download() to download an artifact archive to a local path; it now also has modelstore.load() to load a model straight into memory.
Upload models from frameworks that are not (yet) supported by modelstore! The modelstore.upload() function now works if you give it a model= kwarg that is a path to a file.
Read a specific model's metadata with modelstore.get_model_info()
Added Annoy, ONNX, and MXNet (hybrid models) support.
🆕 New functionality
Added model states, and updated listing models to listing by state.
Created a unified upload function. You can now use modelstore.upload() for all ML frameworks.
Added Gensim support.
Added Azure blob storage support.
🐛 Bug fixes
Minor fixes to how modelstore uses environment variables for the hosted storage, and bug fixes for local file system storage.
Downgraded requests due to a version conflict with the version in Google Colab.
modelstore 0.0.6 (March 2021)
🆕 New functionality
Added FastAI support.
Added support for scikit-learn pipelines.
modelstore 0.0.52 (February 2021)
🆕 New functionality
Added PyTorch Lightning and LightGBM support.
Added a new type of storage: ModelStore.from_api_key(). If you're reading this and do not want to manage your own storage, get in touch with me for an API key.
Added skeleton functions for summary stats about training data; implemented feature importances for sklearn models.
🐛 Bug fixes
Fixed bugs related to listing domains and the models inside a domain.
modelstore 0.0.4 (December 2020)
🆕 New functionality
Cleaned up how metadata is generated
Added interactive authentication when using Google Colab
Added auto-extraction of model params and model info into the metadata
🐛 Bug fixes
Upgraded dependencies to deal with an issue using modelstore in Colab
modelstore 0.0.3 (November 2020)
🆕 New functionality
Simplified the API to require just upload() (no more create_archive()).
modelstore 0.0.2 (September 2020)
🆕 New functionality
Added models: transformers, tensorflow
Storage: downloading models via download()
Extended support to Python 3.6, 3.7, 3.8
Repo: added Github actions
🆕 First release!
Supports (and tested on) Python 3.7 only. ☢️
Storage: GCP buckets, AWS S3 buckets, file systems. Upload only!
Initial models: catboost, keras, torch, sklearn, xgboost
Metadata: Python runtime, user, dependency versions, git hash