Categories: Model monitor functions

MODEL_MONITOR_PERFORMANCE_METRIC

Gets performance metrics from a model monitor. Each model monitor monitors one machine learning model.

See also:

Querying monitoring results

Syntax

MODEL_MONITOR_PERFORMANCE_METRIC( <model_monitor_name>, <metric_name>
    [, <granularity> [, <start_time> [, <end_time> ] ] ] )

Arguments

Required:

MODEL_MONITOR_NAME

Name of the model monitor used to compute the metric.

Valid values:

A string that’s the name of the model monitor. It can be a simple or fully qualified name.
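
For example, assuming a monitor named MY_MONITOR in a database MY_DB and schema MY_SCHEMA (hypothetical names), a call using the fully qualified name might look like the following sketch:

SELECT * FROM TABLE(MODEL_MONITOR_PERFORMANCE_METRIC(
    'MY_DB.MY_SCHEMA.MY_MONITOR',  -- fully qualified monitor name (hypothetical)
    'RMSE'
));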

METRIC_NAME

Name of the performance metric.

Valid values if the model monitor is attached to a regression model:

  • 'RMSE'

  • 'MAE'

  • 'MAPE'

  • 'MSE'

Valid values if the model monitor is attached to a binary classification model:

  • 'ROC_AUC'

  • 'CLASSIFICATION_ACCURACY'

  • 'PRECISION'

  • 'RECALL'

  • 'F1_SCORE'

Optional:

GRANULARITY

Granularity of the time range being queried. The default value is '1 DAY'. A sketch using a non-default granularity follows the list of valid values below.

Valid values:

  • '<num> DAY'

  • '<num> WEEK'

  • '<num> MONTH'

  • '<num> QUARTER'

  • '<num> YEAR'

  • 'ALL'

  • NULL
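
For example, a minimal sketch (reusing the MY_MONITOR monitor from the examples below) that buckets RMSE into one-week intervals instead of the default one-day intervals:

SELECT * FROM TABLE(MODEL_MONITOR_PERFORMANCE_METRIC(
    'MY_MONITOR', 'RMSE',
    '1 WEEK'  -- one result row per week within the queried time range
));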

START_TIME

Start of the time range used to compute the metric. The default is 60 days before the current time, recalculated each time you call the function.

Valid values:

A timestamp expression or NULL.

END_TIME

End of the time range used to compute the metric. The default is the current time, recalculated each time you call the function.

Valid values:

A timestamp expression or NULL.
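
Because both timestamps accept NULL, you can set the granularity explicitly while keeping the default 60-day window. A minimal sketch, again assuming the MY_MONITOR monitor:

SELECT * FROM TABLE(MODEL_MONITOR_PERFORMANCE_METRIC(
    'MY_MONITOR', 'RMSE', '1 DAY',
    NULL,  -- start_time: defaults to 60 days before the current time
    NULL   -- end_time: defaults to the current time
));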

Returns

Column             Description                                               Example values
EVENT_TIMESTAMP    Timestamp at the start of the time range.                 2024-01-01 00:00:00.000
METRIC_VALUE       Value of the metric within the specified time range.      0.5
COUNT_USED         Number of records used to compute the metric.             100
COUNT_UNUSED       Number of records excluded from the metric computation.   10
METRIC_NAME        Name of the metric that has been computed.                ROC_AUC
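
Because this is a table function, the output can be filtered and ordered like any other table. A sketch, assuming the MY_MONITOR monitor, that lists daily metric values in chronological order:

SELECT event_timestamp, metric_value, count_used
FROM TABLE(MODEL_MONITOR_PERFORMANCE_METRIC(
    'MY_MONITOR', 'RMSE', '1 DAY',
    DATEADD('DAY', -7, CURRENT_TIMESTAMP()),
    CURRENT_TIMESTAMP()
))
ORDER BY event_timestamp;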

Usage Notes

Requirements

  • The model monitor must be associated with a model that supports the requested metric type.

  • The model monitor must contain the necessary data for each metric type.

  • The model monitor must meet the following per-metric column requirements (a classification example follows this list):

    • Regression

      • RMSE: Requires prediction_score and actual_score columns

      • MAE: Requires prediction_score and actual_score columns

      • MAPE: Requires prediction_score and actual_score columns

      • MSE: Requires prediction_score and actual_score columns

    • Binary Classification

      • ROC_AUC: Requires prediction_score and actual_class columns

      • CLASSIFICATION_ACCURACY: Requires prediction_class and actual_class columns

      • PRECISION: Requires prediction_class and actual_class columns

      • RECALL: Requires prediction_class and actual_class columns

      • F1_SCORE: Requires prediction_class and actual_class columns
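
For example, assuming a monitor named MY_CLASSIFIER_MONITOR (a hypothetical name) attached to a binary classification model with prediction_score and actual_class columns configured, ROC_AUC could be queried as in this sketch:

-- ROC_AUC needs prediction_score and actual_class in the monitored data
SELECT * FROM TABLE(MODEL_MONITOR_PERFORMANCE_METRIC(
    'MY_CLASSIFIER_MONITOR', 'ROC_AUC', '1 DAY'
));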

Error cases

You might run into errors in the following cases:

  • You request an accuracy metric without setting the corresponding prediction or actual column.

  • There is no data in the actual_score or actual_class columns.
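
One way to check for the second case is to inspect the COUNT_USED column in the output: intervals where it is 0 had no records available for the computation. A sketch, assuming the MY_MONITOR monitor:

SELECT event_timestamp, count_used, count_unused
FROM TABLE(MODEL_MONITOR_PERFORMANCE_METRIC('MY_MONITOR', 'RMSE'))
WHERE count_used = 0;  -- time ranges with no usable records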

Examples

The following example gets the Root Mean Square Error (RMSE) over a one-day period from the model monitor:

SELECT * FROM TABLE(MODEL_MONITOR_PERFORMANCE_METRIC(
    'MY_MONITOR', 'RMSE', '1 DAY',
    TO_TIMESTAMP_TZ('2024-01-01'),
    TO_TIMESTAMP_TZ('2024-01-02')
));

The following example gets the Root Mean Square Error (RMSE) over the last 30 days from the model monitor:

SELECT * FROM TABLE(MODEL_MONITOR_PERFORMANCE_METRIC(
    'MY_MONITOR', 'RMSE', '1 DAY',
    DATEADD('DAY', -30, CURRENT_DATE()),
    CURRENT_DATE()
));