MLOps with Red Hat OpenShift : a cloud-native approach to machine learning operations.

Build and manage MLOps pipelines with this practical guide to using Red Hat OpenShift Data Science, unleashing the power of machine learning workflows. Key Features: Grasp MLOps and the machine learning project lifecycle through concept introductions; get hands-on with provisioning and configuring Red Hat...

Bibliographic Details
Main Authors: Brigoli, Ross (Author), Masood, Faisal (Author)
Format: Electronic eBook
Language: English
Published: Birmingham, UK : Packt Publishing, Limited, 2024.
Table of Contents:
  • Cover
  • Title Page
  • Copyright and Credits
  • Contributors
  • Table of Contents
  • Preface
  • Part 1: Introduction
  • Chapter 1: Introduction to MLOps and OpenShift
  • What is MLOps?
  • Introduction to OpenShift
  • OpenShift features
  • Understanding operators
  • Understanding how OpenShift supports MLOps
  • Red Hat OpenShift Data Science (RHODS)
  • The advantages of the cloud
  • ROSA
  • Summary
  • References
  • Part 2: Provisioning and Configuration
  • Chapter 2: Provisioning an MLOps Platform in the Cloud
  • Technical requirements
  • Installing OpenShift on AWS
  • Preparing AWS accounts and service quotas
  • Preparing AWS for ROSA provisioning
  • Installing ROSA
  • Adding a new machine pool to the cluster
  • Installing Red Hat ODS
  • Installing partner software on Red Hat ODS
  • Installing Pachyderm
  • Summary
  • Chapter 3: Building Machine Learning Models with OpenShift
  • Technical requirements
  • Using Jupyter Notebooks in OpenShift
  • Provisioning an S3 store
  • Using ML frameworks in OpenShift
  • Using GPU acceleration for model training
  • Enabling GPU support
  • Building custom notebooks
  • Creating a custom notebook image
  • Importing notebook images
  • Summary
  • Part 3: Operating ML Workloads
  • Chapter 4: Managing a Model Training Workflow
  • Technical requirements
  • Configuring Pachyderm
  • Versioning your data with Pachyderm
  • Training a model using Red Hat ODS
  • Building a model training pipeline
  • Installing Red Hat OpenShift Pipelines
  • Attaching a pipeline server to your project
  • Building a basic data science pipeline
  • Summary
  • Chapter 5: Deploying ML Models as a Service
  • Packaging and deploying models as a service
  • Saving and uploading models to S3
  • Updating the pipeline via model upload to S3
  • Creating a model server for Seldon
  • Deploying and accessing your model
  • Autoscaling the deployed models
  • Releasing new versions of the model
  • Automating the model deployment process
  • Rolling back model deployments
  • Canary model deployment
  • Securing model endpoints
  • Summary
  • Chapter 6: Operating ML Workloads
  • Monitoring ML models
  • Installing and configuring Prometheus and Grafana
  • Logging inference calls
  • Optimizing cost
  • Summary
  • References
  • Chapter 7: Building a Face Detector Using the Red Hat ML Platform
  • Architecting a human face detector system
  • Training a model for face detection
  • Deploying the model
  • Validating the deployed model
  • Installing Redis on Red Hat OpenShift
  • Building and deploying the inferencing application
  • Bringing it all together
  • Optimizing cost for your ML platform
  • Machine management in OpenShift
  • Spot Instances
  • Summary
  • Index
  • Other Books You May Enjoy