Tech Topic
Red Hat Accelerates AI/ML Workflows and Delivery of AI-Powered
Intelligent Applications with Red Hat OpenShift
Artificial intelligence (AI)
AI is the capability of machines to imitate intelligent human behavior and perform tasks that normally require human intelligence.
Machine learning (ML)
ML is a subset of AI that gives computers the ability to learn without being explicitly programmed. Computers use algorithms and statistical models to perform specific tasks, relying on patterns and inferences.1
Deep learning (DL)
Deep learning is a subset of ML that uses multiple layers to progressively extract higher-level features from raw input. DL architectures have been applied to computer vision, natural language processing, image analysis, and more.2
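To make the ML definition concrete, the short sketch below shows a model learning a classification rule from labeled examples rather than from explicitly programmed rules. It is a minimal illustration only, assuming Python with scikit-learn installed; the dataset and model choice are not prescribed by Red Hat.

```python
# Minimal illustration of "learning without being explicitly programmed":
# the classifier infers a decision rule from labeled examples.
# Assumes scikit-learn is installed; dataset and model are illustrative.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)  # learn patterns from the training data
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```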
The machine learning lifecycle is a multiphase process that harnesses large volumes and varieties of data, abundant compute, and open source machine learning tools to build intelligent applications.
At a high level, there are four steps in the lifecycle:
Data scientists are primarily responsible for ML modeling, ensuring that the selected ML model continues to provide the highest prediction accuracy.
The key challenges data scientists face are:
Containers and Kubernetes are key to accelerating the ML lifecycle, as these technologies give data scientists the much-needed agility, flexibility, portability, and scalability to train, test, and deploy ML models.
Red Hat OpenShift is the industry's leading containers and Kubernetes hybrid cloud platform. It provides all of these benefits and, through integrated DevOps capabilities and integration with hardware accelerators, enables better collaboration between data scientists and software developers and accelerates the rollout of intelligent applications across the hybrid cloud (data center, edge, and public clouds).
Integrations with popular hardware accelerators, such as NVIDIA GPUs and NGC containers, ensure that OpenShift can seamlessly meet the high compute requirements of training and selecting the ML model with the highest prediction accuracy, as well as of ML inferencing jobs as the model encounters new data in production.
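As an illustration of how a training workload might request GPU resources on OpenShift, the sketch below builds and submits a pod with an NVIDIA GPU limit. It assumes the kubernetes Python client is installed and that the cluster exposes the nvidia.com/gpu extended resource (for example, via the NVIDIA GPU Operator); the pod name, namespace, image, and command are hypothetical placeholders.

```python
# Hypothetical sketch: request one NVIDIA GPU for a containerized training job.
# Assumes the kubernetes Python client and a cluster exposing "nvidia.com/gpu".
from kubernetes import client, config


def build_training_pod() -> client.V1Pod:
    container = client.V1Container(
        name="train-model",                           # placeholder names and image
        image="nvcr.io/nvidia/pytorch:23.10-py3",     # example NGC container
        command=["python", "train.py"],
        resources=client.V1ResourceRequirements(
            limits={"nvidia.com/gpu": "1"},           # schedule onto a GPU node
        ),
    )
    spec = client.V1PodSpec(containers=[container], restart_policy="Never")
    return client.V1Pod(
        api_version="v1",
        kind="Pod",
        metadata=client.V1ObjectMeta(name="ml-training-gpu", namespace="data-science"),
        spec=spec,
    )


if __name__ == "__main__":
    config.load_kube_config()                         # uses your local kubeconfig / oc login
    client.CoreV1Api().create_namespaced_pod(namespace="data-science", body=build_training_pod())
```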
Extending OpenShift DevOps automation capabilities to the ML lifecycle enables collaboration between data scientists, software developers, and IT operations so that ML models can be quickly integrated into the development of intelligent applications. This helps boost productivity and simplifies lifecycle management for ML-powered intelligent applications.
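One common way to hand a model from data scientists to application developers is to wrap it in a small containerizable HTTP service. The sketch below is one hedged example of that pattern, assuming Flask and scikit-learn are installed; the inline training step, route, and port are illustrative rather than a prescribed OpenShift workflow.

```python
# Hedged sketch: expose a trained model as a microservice so application
# developers can consume predictions over HTTP. Assumes Flask and scikit-learn.
from flask import Flask, jsonify, request
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# In a real pipeline the model would be trained elsewhere and loaded from storage;
# it is trained inline here only to keep the example self-contained.
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)

app = Flask(__name__)


@app.route("/predict", methods=["POST"])
def predict():
    features = request.get_json()["features"]        # e.g. [5.1, 3.5, 1.4, 0.2]
    prediction = int(model.predict([features])[0])
    return jsonify({"class": prediction})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

Once containerized, a service like this can be built and rolled out through the same OpenShift DevOps pipelines used for the rest of the application.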
OpenShift is helping organizations across various industries accelerate business- and mission-critical initiatives by developing intelligent applications in the hybrid cloud. Example use cases include fraud detection, data-driven diagnostics and treatment, connected cars, autonomous driving, oil and gas exploration, automated insurance quotes, and claims processing.
Red Hat Decision Manager is a cloud-native business rules and decisioning platform that allows ML models to be integrated with decision models. These models can then be served and made available for inference as microservices on OpenShift. Integration with monitoring tools like Prometheus and Grafana enables monitoring and management of the (business) performance of ML models in production.
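To give a flavor of how model behavior can be watched with Prometheus and Grafana, the sketch below exports custom metrics from a serving process. It is a hedged example, assuming the prometheus_client Python library; the metric names, port, and simulated inference step are placeholders rather than Decision Manager internals.

```python
# Hedged sketch: expose custom metrics from a model-serving process so that
# Prometheus can scrape them and Grafana can chart model behavior in production.
# Assumes the prometheus_client library; names and port are illustrative.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

PREDICTIONS = Counter("model_predictions_total", "Predictions served", ["outcome"])
LATENCY = Histogram("model_prediction_latency_seconds", "Prediction latency")


@LATENCY.time()
def serve_prediction():
    # Placeholder for real inference; here we simply simulate an outcome.
    outcome = random.choice(["approved", "rejected"])
    PREDICTIONS.labels(outcome=outcome).inc()


if __name__ == "__main__":
    start_http_server(8000)   # metrics scrapeable at http://localhost:8000/metrics
    while True:
        serve_prediction()
        time.sleep(1)
```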
For additional information on Red Hat Decision Manager, please consult the Red Hat Decision Manager product page, or visit the Red Hat Developer site.
Red Hat Ceph Storage was built to address petabyte-scale storage requirements across the ML lifecycle, from data ingestion and preparation through ML modeling to the inferencing phase. It is an open source, software-defined storage system that provides comprehensive support for S3 object, block, and file storage and delivers massive scalability on industry-standard commodity hardware.
For example, you can present scalable Ceph storage to containerized Jupyter notebooks on OpenShift via S3 or persistent volumes.
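The sketch below shows what the S3 path might look like from inside a notebook: reading a dataset from Ceph's S3-compatible object gateway. It assumes boto3 and pandas are installed; the endpoint URL, bucket, object key, and credentials are placeholders to be replaced with your own.

```python
# Hedged sketch: read a training dataset from Ceph's S3-compatible object
# gateway (RADOS Gateway) inside a Jupyter notebook. Assumes boto3 and pandas;
# endpoint, bucket, key, and credentials are placeholders.
import boto3
import pandas as pd

s3 = boto3.client(
    "s3",
    endpoint_url="https://ceph-rgw.example.com",   # placeholder RADOS Gateway endpoint
    aws_access_key_id="<ACCESS_KEY>",
    aws_secret_access_key="<SECRET_KEY>",
)

obj = s3.get_object(Bucket="training-data", Key="transactions.csv")
df = pd.read_csv(obj["Body"])                      # stream the object straight into pandas
print(df.head())
```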
The Open Data Hub project is a functional architecture based on OpenShift, Red Hat Ceph Storage, Red Hat AMQ Streams, and several upstream open source projects that helps build an open ML platform with the necessary ML tooling.
For additional information on the Open Data Hub project, read the blogs, and get started here.
Customer success stories
Boston Children’s Hospital speeds up medical image processing with Red Hat
MOD Israel operationalizes ML on OpenShift with NVIDIA DGX
Volkswagen Tests Autonomous Cars with GPUs and OpenShift
Exxon Mobil democratizing Data Science with Kubernetes and Containers
OpenShift for containerizing Spark at Royal Bank of Canada, OpenShift Commons Gathering 2019
Delivering On-demand analytics environment for Data scientists at Discover Financial Services
e-Book
Top considerations for building a production-ready AI/ML environment
Machine learning ecosystem solution examples
NVIDIA and Red Hat team up to Accelerate Enterprise AI
H2O.ai on OpenShift Delivers Data Science Ease and Flexibility at Scale
PerceptiLabs simplifies building complex ML models, Red Hat 2019 keynote
Video recordings from AI/ML focused OpenShift Commons Gathering, Oct 28th, 2019
Videos on AI/ML on OpenShift and Open Data Hub Community Project
Blogs
AI at the Edge in Industrial Manufacturing
INTERACTIVE LEARNING PORTAL
AI and Machine Learning on OpenShift
REFERENCE ARCHITECTURES
Accelerated AI on Red Hat OpenShift, NVIDIA, and HPE
Red Hat and Supermicro Benchmark
AI/ML on Red Hat OpenShift, NVIDIA, and Cisco UCS
GPU-Accelerated Machine Learning with OpenShift, NVIDIA, and DellEMC
Contact
For press inquiries, email us directly