mlrun.frameworks.auto_mlrun#
- class mlrun.frameworks.auto_mlrun.auto_mlrun.AutoMLRun[source]#
Bases:
object
A library of automatic functions for managing models using MLRun's frameworks package.
- static apply_mlrun(model: ModelType | None = None, model_name: str | None = None, tag: str = '', model_path: str | None = None, modules_map: dict[str, Union[NoneType, str, list[str]]] | str | None = None, custom_objects_map: dict[str, Union[str, list[str]]] | str | None = None, custom_objects_directory: str | None = None, context: MLClientCtx | None = None, framework: str | None = None, auto_log: bool = True, **kwargs) → ModelHandler [source]#
Use the 'apply_mlrun' function of the given model's detected framework to wrap the framework's relevant methods and gain the framework's features in MLRun. A ModelHandler initialized with the model will be returned (see the usage sketch after this entry).
- Parameters:
model -- The model to wrap. If not provided, it can be loaded from the given model path instead.
model_name -- The model name to use for storing the model artifact. If not given, a default name will be used according to the framework.
tag -- The model's tag to log with.
model_path -- The model's store object path. Mandatory for evaluation (to know which model to update). If model is not provided, it will be loaded from this path.
modules_map --
A dictionary of all the modules required for loading the model. Each key is a path to a module and its value is the object name to import from it. All the modules will be imported globally. If multiple objects need to be imported from the same module, a list can be given. The map can also be passed as a path to a json file. For example:
{ "module1": None, # import module1 "module2": ["func1", "func2"], # from module2 import func1, func2 "module3.sub_module": "func3", # from module3.sub_module import func3 }
If the model path given is of a store object, the modules map will be read from the logged modules map artifact of the model.
custom_objects_map --
A dictionary of all the custom objects required for loading the model. Each key is a path to a python file and its value is the custom object name to import from it. If multiple objects need to be imported from the same py file, a list can be given. The map can also be passed as a path to a json file. For example:
{ "/.../custom_model.py": "MyModel", "/.../custom_objects.py": ["object1", "object2"], }
All the paths will be accessed from the given 'custom_objects_directory', meaning each py file will be read from 'custom_objects_directory/<MAP VALUE>'. If the model path given is of a store object, the custom objects map will be read from the logged custom object map artifact of the model. Notice: The custom objects will be imported in the order they appear in this dictionary (or json). If a custom object depends on another, make sure to put it below the one it relies on.
custom_objects_directory -- Path to the directory with all the python files required for the custom objects. Can be passed as a zip file as well (will be extracted during the run before loading the model). If the model path given is of a store object, the custom objects files will be read from the logged custom object artifact of the model.
context -- An MLRun context.
auto_log -- Whether to enable MLRun's auto-logging capabilities. Auto-logging will add default artifacts and metrics besides the ones you can pass here.
framework -- The model's framework. If None, AutoMLRun will try to figure out the framework from the provided model or model path. Default: None.
kwargs -- Additional parameters for the specific framework's 'apply_mlrun' function, such as metrics, callbacks and more (read the docs of the relevant framework to know more).
- Returns:
The framework's model handler initialized with the given model.
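A minimal usage sketch, assuming a scikit-learn model; the context name, model name and the exact auto-logged outputs are illustrative and depend on the detected framework:

```python
import mlrun
from mlrun.frameworks.auto_mlrun.auto_mlrun import AutoMLRun
from sklearn.ensemble import RandomForestClassifier

# Illustrative context and model; in a real job the context is usually the
# one passed into the function handler by MLRun.
context = mlrun.get_or_create_ctx("training")
model = RandomForestClassifier()

# The framework (scikit-learn here) is detected from the model instance and
# the framework-specific 'apply_mlrun' wraps its methods for auto-logging.
handler = AutoMLRun.apply_mlrun(
    model=model,
    model_name="my_model",  # hypothetical artifact name
    context=context,
    auto_log=True,
)
# ... training code would follow; 'handler' is the returned ModelHandler.
```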
- static load_model(model_path: str, model_name: str | None = None, context: MLClientCtx | None = None, modules_map: dict[str, Union[NoneType, str, list[str]]] | str | None = None, custom_objects_map: dict[str, Union[str, list[str]]] | str | None = None, custom_objects_directory: str | None = None, framework: str | None = None, **kwargs) → ModelHandler [source]#
Load a model using MLRun's ModelHandler. The loaded model can be accessed from the returned model handler via model_handler.model. If the model is a store object URI (i.e. it was logged in MLRun), the framework will be read automatically; otherwise (for local paths and URLs) it must be given. The other parameters will be read automatically in case the model was logged in MLRun (see the usage sketch after this entry).
- Parameters:
model_path -- A store object path of a logged model object in MLRun.
model_name -- The model name to use for storing the model artifact. If not given, a default name will be used according to the framework.
modules_map --
A dictionary of all the modules required for loading the model. Each key is a path to a module and its value is the object name to import from it. All the modules will be imported globally. If multiple objects need to be imported from the same module, a list can be given. The map can also be passed as a path to a json file. For example:
{ "module1": None, # import module1 "module2": ["func1", "func2"], # from module2 import func1, func2 "module3.sub_module": "func3", # from module3.sub_module import func3 }
If the model path given is of a store object, the modules map will be read from the logged modules map artifact of the model.
custom_objects_map --
A dictionary of all the custom objects required for loading the model. Each key is a path to a python file and its value is the custom object name to import from it. If multiple objects need to be imported from the same py file, a list can be given. The map can also be passed as a path to a json file. For example:
{ "/.../custom_model.py": "MyModel", "/.../custom_objects.py": ["object1", "object2"], }
All the paths will be accessed from the given 'custom_objects_directory', meaning each py file will be read from 'custom_objects_directory/<MAP VALUE>'. If the model path given is of a store object, the custom objects map will be read from the logged custom object map artifact of the model. Notice: The custom objects will be imported in the order they appear in this dictionary (or json). If a custom object depends on another, make sure to put it below the one it relies on.
custom_objects_directory -- Path to the directory with all the python files required for the custom objects. Can be passed as a zip file as well (will be extracted during the run before loading the model). If the model path given is of a store object, the custom objects files will be read from the logged custom object artifact of the model.
context -- An MLRun context.
framework -- The model's framework. It must be provided for local paths and URLs. If None, AutoMLRun will assume the model path is of a store uri model artifact and try to get the framework from it. Default: None.
kwargs -- Additional parameters for the specific framework's ModelHandler class.
- Returns:
The model inside an MLRun model handler.
- Raises:
MLRunInvalidArgumentError -- In case the framework is incorrect or missing.
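A sketch of loading a previously logged model; the store URI, project and model names are hypothetical, and the framework name 'sklearn' in the commented alternative is assumed to be one of the supported framework names:

```python
from mlrun.frameworks.auto_mlrun.auto_mlrun import AutoMLRun

# Hypothetical store URI of a model artifact logged in MLRun; because it is a
# store object, the framework and the modules/custom-objects maps are read
# from the logged artifact.
handler = AutoMLRun.load_model(
    model_path="store://models/my-project/my_model:latest",
)
model = handler.model  # the loaded model object

# For a local path or URL the framework must be stated explicitly, e.g.:
# handler = AutoMLRun.load_model(model_path="./model.pkl", framework="sklearn")
```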
- mlrun.frameworks.auto_mlrun.auto_mlrun.framework_to_apply_mlrun(framework: str) → Callable[..., ModelHandler] [source]#
Get the 'apply_mlrun' shortcut function of the given framework's name.
- Parameters:
framework -- The framework's name.
- Returns:
The framework's 'apply_mlrun' shortcut function.
- Raises:
MLRunInvalidArgumentError -- If the given framework is not supported by AutoMLRun or if it does not have an 'apply_mlrun' yet.
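A short sketch, assuming 'sklearn' is among the supported framework names:

```python
from mlrun.frameworks.auto_mlrun.auto_mlrun import framework_to_apply_mlrun

# Resolve the framework-specific 'apply_mlrun' shortcut by framework name.
apply_mlrun = framework_to_apply_mlrun("sklearn")

# The returned callable is then used like that framework's own apply_mlrun
# shortcut; its keyword arguments are framework specific.
```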
- mlrun.frameworks.auto_mlrun.auto_mlrun.framework_to_model_handler(framework: str) → type[mlrun.frameworks._common.model_handler.ModelHandler] [source]#
Get the ModelHandler class of the given framework's name.
- Parameters:
framework -- The framework's name.
- Returns:
The framework's ModelHandler class.
- Raises:
MLRunInvalidArgumentError -- If the given framework is not supported by AutoMLRun.
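A short sketch, again assuming 'sklearn' as the framework name:

```python
from mlrun.frameworks.auto_mlrun.auto_mlrun import framework_to_model_handler

# Resolve the framework's ModelHandler class by name; constructing an instance
# requires the framework-specific arguments (model, model_path, etc.).
handler_cls = framework_to_model_handler("sklearn")
print(handler_cls.__name__)
```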
- mlrun.frameworks.auto_mlrun.auto_mlrun.get_framework_by_class_name(model: ModelType) → str [source]#
Get the framework name of the given model by its class name.
- Parameters:
model -- The model whose framework to get.
- Returns:
The model's framework.
- Raises:
MLRunInvalidArgumentError -- If the given model's class name is not supported by AutoMLRun or not recognized.
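A short sketch with an illustrative scikit-learn model; the returned string is the framework name AutoMLRun associates with the model's class:

```python
from sklearn.linear_model import LogisticRegression

from mlrun.frameworks.auto_mlrun.auto_mlrun import get_framework_by_class_name

# The framework is inferred from the class name of the given model instance.
framework = get_framework_by_class_name(LogisticRegression())
print(framework)
```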
- mlrun.frameworks.auto_mlrun.auto_mlrun.get_framework_by_instance(model: ModelType) → str [source]#
Get the framework name of the given model by its instance.
- Parameters:
model -- The model whose framework to get.
- Returns:
The model's framework.
- Raises:
MLRunInvalidArgumentError -- If the given model type is not supported by AutoMLRun or not recognized.
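A short sketch, analogous to the class-name lookup above but checking the model object itself; the scikit-learn model is illustrative:

```python
from sklearn.tree import DecisionTreeClassifier

from mlrun.frameworks.auto_mlrun.auto_mlrun import get_framework_by_instance

# The framework is inferred from the model instance rather than only its
# class name.
framework = get_framework_by_instance(DecisionTreeClassifier())
print(framework)
```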