AI Builder – New prediction model performance page

We’ve updated the prediction model performance page in AI Builder. A new performance grade gives you a clearer picture of how your model is performing, and a new Performance tab on the training details pane provides more detailed information to help you evaluate it. Together, these improvements help you decide when a model is ready to publish. In this post, we’ll discuss how to use the new measurements.

Why this change

We’ve received customer feedback that the accuracy score alone wasn’t sufficient to support publishing decisions, and that the performance score was confusing for some users.

Let’s look at an example model that predicts business loan approvals. Here are two contrasting scenarios with the same accuracy score (71%) but vastly different practical performance:

| Business loan approval model | Historical approval rate | Performance score | Accuracy grade |
| ---------------------------- | ------------------------ | ----------------- | -------------- |
| Scenario 1                   | 50%                      | 71%               | B              |
| Scenario 2                   | 70%                      | 71%               | C              |

Both scenarios have a 71% accuracy score, but scenario 1 represents a significant improvement over guessing, while scenario 2 is only marginally better than a random guess. The accuracy score alone therefore isn’t enough to judge the model; the accuracy grade (A, B, C, or D), which is based on improvement relative to a random guess, can help you decide whether your model is ready to use.
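To make the intuition concrete, here’s a minimal sketch of “improvement relative to a guess.” This is an illustration only: the exact formula AI Builder uses to compute grades isn’t documented in this post. The sketch compares the model’s accuracy against a baseline that always guesses the majority outcome (e.g., the historical approval rate) and measures how much of the remaining headroom the model captures:

```python
def improvement_over_baseline(accuracy: float, majority_rate: float) -> float:
    """Fraction of the headroom above a majority-class guess that the
    model captures (0 = no better than guessing, 1 = perfect)."""
    # A naive guesser that always predicts the majority outcome is
    # right `baseline` of the time.
    baseline = max(majority_rate, 1 - majority_rate)
    return (accuracy - baseline) / (1 - baseline)

# Scenario 1: 50% historical approval rate, 71% accuracy
print(round(improvement_over_baseline(0.71, 0.50), 2))  # 0.42

# Scenario 2: 70% historical approval rate, 71% accuracy
print(round(improvement_over_baseline(0.71, 0.70), 2))  # 0.03
```

With the same 71% accuracy, scenario 1 captures about 42% of the available headroom over guessing, while scenario 2 captures only about 3% — which is why the two scenarios earn different grades.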

Existing models

For any existing trained model, you won’t see the new performance information until you retrain it.

Performance grade

After you train your model, the performance grade appears on the model details page. Accuracy may vary based on the data you use, but the grade is more consistent for a given model.

Let’s say your model predicts whether a business loan is approved or denied. If your model grade is B, the model’s performance is generally good. You’ll need to decide whether it’s good enough for your own unique circumstances. Does it have room to improve? Probably, yes. You can follow the instructions in the AI Builder documentation to keep improving your model if you want better accuracy.

For more information about what each grade means and how grades are calculated, see Prediction model performance.

Training details performance tab

In addition to the training summary, we’ve introduced a new Performance tab in the training details pane, which appears when you select See details below the performance grade. The following performance metrics are shown:

  • Accuracy grade
  • Accuracy score

For more information about these metrics, see Prediction model performance.