
AttributeError when logging a model with MLFlow #476

Closed
ABerry057 opened this issue Feb 7, 2025 · 2 comments
@ABerry057
What happened + What you expected to happen

Description
When attempting to log a trained MLForecast model using the mlforecast.flavor module, I encounter an AttributeError. The MLForecast object appears to lose its models_ attribute during cross validation; if I omit the cross validation step, the error does not occur.

Expected behavior
Calling mlforecast.flavor.log_model() should successfully save the MLForecast model artifact after performing cross validation.

Logs

Traceback (most recent call last):
  File "/Users/ab/Desktop/mlflow-demo/src/train.py", line 63, in <module>
    mlforecast.flavor.log_model(model=fcst_model, artifact_path="model")
  File "/Users/ab/Desktop/mlflow-demo/.venv/lib/python3.11/site-packages/mlforecast/flavor.py", line 263, in log_model
    return Model.log(
           ^^^^^^^^^^
  File "/Users/ab/Desktop/mlflow-demo/.venv/lib/python3.11/site-packages/mlflow/models/model.py", line 796, in log
    flavor.save_model(path=local_path, mlflow_model=mlflow_model, **kwargs)
  File "/Users/ab/Desktop/mlflow-demo/.venv/lib/python3.11/site-packages/mlforecast/flavor.py", line 142, in save_model
    model.save(model_data_path)
  File "/Users/ab/Desktop/mlflow-demo/.venv/lib/python3.11/site-packages/mlforecast/forecast.py", line 1009, in save
    cloudpickle.dump(self.models_, f)
                     ^^^^^^^^^^^^
AttributeError: 'MLForecast' object has no attribute 'models_'. Did you mean: 'models'?

Versions / Dependencies

  • Python 3.11.4
  • mlforecast==1.0.1
  • mlflow==2.20.1
  • macOS 15.3

Reproduction script

import pandas as pd
import mlflow
from sklearn.linear_model import LinearRegression
from utilsforecast.losses import rmse, smape
from utilsforecast.evaluation import evaluate

import mlforecast.flavor
from mlforecast import MLForecast


data_url = "https://vincentarelbundock.github.io/Rdatasets/csv/AER/FrozenJuice.csv"
freq = 1
h = 2
data_series = (
    pd.read_csv(data_url, usecols=["rownames", "price"])
    .rename({"rownames": "ds", "price": "y"}, axis=1)
    .assign(unique_id="test")
)

# start MLFlow tracking
with mlflow.start_run():
    fcst_object = MLForecast(
        models=[LinearRegression()],
        freq=freq,
        lags=list(range(1, 5))
    )
    fcst_object.fit(data_series)
    if hasattr(fcst_object, "models_"):
        print("MLForecast object has models_ attribute")
    cv_metrics_df = fcst_object.cross_validation(
        df=data_series, h=h,
        n_windows=10
    )
    eval_result_dict = evaluate(
        cv_metrics_df.drop("cutoff", axis=1),
        metrics=[rmse, smape],
        agg_fn="mean"
    ).set_index("metric").squeeze().to_dict()
    mlflow.log_metrics(eval_result_dict)
    # the following print statement is not executed: models_ is gone after cross_validation
    if hasattr(fcst_object, "models_"):
        print("MLForecast object has models_ attribute")
    mlforecast.flavor.log_model(model=fcst_object, artifact_path="model")

Issue Severity

Medium: It is a significant difficulty but I can work around it.

@ABerry057 ABerry057 added the bug label Feb 7, 2025
@jmoralez (Member) commented Feb 7, 2025

You should perform cross validation first and then fit.

@ABerry057 (Author)

@jmoralez That worked, thank you for the quick suggestion!
