Prediction Explanation


The Prediction Explanation service is designed to help Hydrosphere users understand the underlying causes of changes in predictions coming from their models.

Prediction Explanation generates explanations of predictions produced by your models and tells you why a model made a particular prediction. Depending on the type of data your model uses, Prediction Explanation provides an explanation as either a set of logical predicates (if your data is in a tabular format) or a saliency map (if your data is in an image format). A saliency map is a heat map that highlights the parts of a picture that a prediction was based on.
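
For intuition, the following purely illustrative snippet shows what an Anchor-style predicate-set explanation for a tabular model could look like. The model, field names, and values are hypothetical and do not reflect Hydrosphere's actual response format.

```python
# Hypothetical, illustrative example of a predicate-set (Anchor-style)
# explanation for a tabular income-classification model.
explanation = {
    "prediction": ">50K",
    "anchor": ["age > 37", "education = Bachelors", "hours_per_week > 40"],
    "precision": 0.96,   # fraction of similar inputs that get the same prediction
    "coverage": 0.11,    # fraction of the dataset the rule applies to
}
print(" AND ".join(explanation["anchor"]), "=>", explanation["prediction"])
# age > 37 AND education = Bachelors AND hours_per_week > 40 => >50K
```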

Hydrosphere uses model-agnostic methods for explaining your model predictions. Such methods can be applied to any machine learning model after it has been uploaded to the platform.

As of now, Hydrosphere supports explaining tabular and image data with the Anchor and RISE tools, respectively.

Image: Tabular explanation for class 0.
Image: Saliency map calculated by RISE.
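
RISE builds a saliency map by repeatedly occluding the input image with random masks and measuring how the model's confidence changes. The sketch below is a simplified, self-contained illustration of that idea, not Hydrosphere's internal implementation; `predict` here stands for any callable that scores a batch of images, for example a thin wrapper around your deployed model's prediction endpoint.

```python
import numpy as np

def rise_saliency(predict, image, n_masks=1000, cell_size=8, p_keep=0.5, seed=0):
    """RISE-style saliency sketch.

    predict: callable taking a batch of images (N, H, W, C) and returning the
             probability of the class of interest, shape (N,).
    image:   a single (H, W, C) array.
    """
    rng = np.random.default_rng(seed)
    h, w = image.shape[:2]
    grid_h = int(np.ceil(h / cell_size))
    grid_w = int(np.ceil(w / cell_size))

    saliency = np.zeros((h, w))
    total = 0.0
    for _ in range(n_masks):
        # Coarse random binary mask upsampled to image resolution (the RISE
        # paper uses smooth bilinear upsampling; nearest-neighbour keeps the
        # sketch short).
        coarse = (rng.random((grid_h, grid_w)) < p_keep).astype(float)
        mask = np.kron(coarse, np.ones((cell_size, cell_size)))[:h, :w]
        score = float(predict((image * mask[..., None])[None])[0])
        saliency += score * mask  # masks that keep the score high add "heat"
        total += score
    return saliency / max(total, 1e-8)  # normalized heat map over the image
```

Regions that consistently keep the model's confidence high when left visible end up with the most heat in the resulting map.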