Red Hat Delivers Next Wave of Gen AI Innovation with New Red Hat Enterprise Linux AI Capabilities

RHEL AI 1.3 adds the Granite 3.0 8b model, helps simplify the preparation of AI training data and expands support for the latest accelerated compute hardware

Red Hat, Inc., the world's leading provider of open source solutions, today announced the latest release of Red Hat Enterprise Linux AI (RHEL AI), Red Hat’s foundation model platform for more seamlessly developing, testing and running generative artificial intelligence (gen AI) models for enterprise applications. RHEL AI 1.3 brings support for the latest advancements in the Granite large language model (LLM) family, incorporates open source advancements in data preparation and maintains expanded choice for hybrid cloud deployments, including the underlying accelerated compute architecture.

According to IDC’s “Market Analysis Perspective: Open GenAI, LLMs, and the Evolving Open Source Ecosystem,” 61% of respondents plan to use open source foundation models for gen AI use cases, while more than 56% of deployed foundation models are already open source.1 Red Hat sees this trend as validating the company’s vision for enterprise gen AI, which calls for:

  • Smaller, open source-licensed models that can run anywhere and everywhere needed across the hybrid cloud.
  • Fine-tuning capabilities that enable organizations to more easily customize LLMs to private data and specific use cases.
  • Optimized and more efficient AI models driven by inference performance engineering expertise.
  • The backing of a strong partner and open source ecosystem for broader customer choice.

RHEL AI forms a key pillar for Red Hat’s AI vision, bringing together the open source-licensed Granite model family and InstructLab model alignment tools, based on the Large-scale Alignment for chatBots (LAB) methodology. These components are then packaged as an optimized, bootable Red Hat Enterprise Linux image for individual server deployments anywhere across the hybrid cloud.

Support for Granite 3.0 LLMs

RHEL AI 1.3 extends Red Hat’s commitment to Granite LLMs with support for English-language use cases of Granite 3.0 8b. Granite 3.0 8b is a converged model, supporting not only English but a dozen other natural languages, code generation and function calling. Non-English languages, code generation and function calling are available as a developer preview within RHEL AI 1.3, with the expectation that these capabilities will be supported in future RHEL AI releases.
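
As an illustrative sketch only (not part of the RHEL AI tooling described above), the publicly published Granite 3.0 8b instruct checkpoint can be loaded with the Hugging Face transformers library; the model ID, prompt and generation settings below are assumptions for demonstration.

    # Hedged sketch: loading the public Granite 3.0 8b instruct checkpoint with
    # Hugging Face transformers. Model ID and settings are illustrative only.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "ibm-granite/granite-3.0-8b-instruct"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    # Build a chat-style prompt and generate a short completion.
    messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]
    prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))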

Simplifying data preparation with Docling

Recently open sourced by IBM Research, Docling is an upstream community project that helps parse common document formats and convert them into formats like Markdown and JSON, preparing this content for gen AI applications and training. RHEL AI 1.3 now incorporates this innovation as a supported feature, enabling users to convert PDFs into Markdown for simplified data ingestion for model tuning with InstructLab.
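
As a minimal sketch of that conversion step, assuming Docling’s published Python API (pip install docling) and a placeholder input file, a PDF can be turned into Markdown in a few lines:

    # Minimal sketch: PDF-to-Markdown conversion with the Docling Python API.
    # The input path is a hypothetical placeholder.
    from docling.document_converter import DocumentConverter

    converter = DocumentConverter()
    result = converter.convert("corporate_handbook.pdf")       # placeholder source document
    markdown_text = result.document.export_to_markdown()       # content ready for ingestion

    with open("corporate_handbook.md", "w", encoding="utf-8") as f:
        f.write(markdown_text)

The resulting Markdown file can then be referenced from an InstructLab taxonomy for knowledge tuning.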

Through Docling, RHEL AI 1.3 now also includes context-aware chunking, which takes into account the structure and semantic elements of the documents used for gen AI training. This helps the resulting gen AI applications deliver more coherent and contextually appropriate responses to questions and tasks, which would otherwise require further tuning and alignment.
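
As a rough illustration of how context-aware chunking can be driven from the same converted document, assuming the chunking module shipped with the upstream Docling project (exact class names and defaults may differ between releases):

    # Rough sketch: structure-aware chunking of a converted document, assuming
    # Docling's chunking module; the input path is a hypothetical placeholder.
    from docling.document_converter import DocumentConverter
    from docling.chunking import HybridChunker

    doc = DocumentConverter().convert("corporate_handbook.pdf").document
    chunker = HybridChunker()  # respects document structure (sections, tables, lists)

    for chunk in chunker.chunk(dl_doc=doc):
        print(chunk.text[:80], "...")  # each chunk retains its structural context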

Future RHEL AI releases will continue to support and refine Docling components, including additional document formats and integration with retrieval-augmented generation (RAG) pipelines alongside InstructLab knowledge tuning.

Broadening the gen AI ecosystem

Choice is a fundamental component of the hybrid cloud, and with gen AI serving as a signature workload for hybrid environments, this optionality needs to start with the underlying chip architectures. RHEL AI already supports leading accelerators from NVIDIA and AMD, and the 1.3 release now includes Intel Gaudi 3 as a technology preview.

Beyond chip architecture, RHEL AI is supported across major cloud providers, including the AWS, Google Cloud and Microsoft Azure consoles, as a “bring your own subscription” (BYOS) offering. The platform will also soon be available as an optimized and validated solution option on Azure Marketplace and AWS Marketplace.

RHEL AI is available as a preferred foundation model platform on accelerated hardware offerings from Red Hat partners, including Dell PowerEdge R760xa servers and Lenovo ThinkSystem SR675 V3 servers.

Model serving improvements with Red Hat OpenShift AI

As users look to scale out the serving of LLMs, Red Hat OpenShift AI now supports parallelized serving across multiple nodes with vLLM runtimes, providing the ability to handle multiple requests in real time. Red Hat OpenShift AI also allows users to dynamically change the parameters of an LLM as it is served, such as sharding the model across multiple GPUs or quantizing the model to a smaller footprint. These improvements are aimed at speeding up response times for users, increasing customer satisfaction and lowering churn.
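
As an illustrative sketch of the underlying vLLM capabilities referenced here (model sharding and quantization), assuming the open source vLLM Python API rather than the OpenShift AI serving interface itself; the model ID, GPU count and quantization scheme are assumptions:

    # Illustrative sketch: vLLM inference with tensor parallelism and
    # quantization. Model ID, GPU count and scheme are assumptions; OpenShift AI
    # exposes equivalent options through its own serving runtime.
    from vllm import LLM, SamplingParams

    llm = LLM(
        model="ibm-granite/granite-3.0-8b-instruct",  # example model
        tensor_parallel_size=2,   # shard the model across two GPUs
        quantization="fp8",       # shrink the memory footprint of the weights
    )

    params = SamplingParams(temperature=0.7, max_tokens=256)
    outputs = llm.generate(["Summarize the benefits of open source AI."], params)
    print(outputs[0].outputs[0].text)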

Supporting Red Hat AI

RHEL AI, along with Red Hat OpenShift AI, underpins Red Hat AI, Red Hat’s portfolio of solutions that accelerate time to market and reduce the operational cost of delivering AI solutions across the hybrid cloud. RHEL AI supports individual Linux server environments, while Red Hat OpenShift AI powers distributed Kubernetes platform environments and provides integrated machine learning operations (MLOps) capabilities. The two solutions are compatible with each other, and Red Hat OpenShift AI will incorporate all of RHEL AI’s capabilities for delivery at scale.

Availability

RHEL AI 1.3 is now generally available. More information on additional features, improvements, bug fixes and how to upgrade to the latest version can be found here.

Supporting Quotes

Joe Fernandes, vice president and general manager, Artificial Intelligence Business Unit, Red Hat

“To harness the transformative power of gen AI, we believe that smaller, optimized models are a necessity, and that these models need to be deployed anywhere and everywhere across the hybrid cloud. Our enhancements to RHEL AI build on this belief, making it easier to prepare organizational data for private model training with Docling and incorporating the latest advancements in the Granite family of open source-licensed LLMs.”

Additional Resources

Connect with Red Hat

About Red Hat, Inc.

Red Hat is the world’s leading provider of enterprise open source software solutions, using a community-powered approach to deliver reliable and high-performing Linux, hybrid cloud, container, and Kubernetes technologies. Red Hat helps customers integrate new and existing IT applications, develop cloud-native applications, standardize on our industry-leading operating system, and automate, secure, and manage complex environments. Award-winning support, training, and consulting services make Red Hat a trusted adviser to the Fortune 500. As a strategic partner to cloud providers, system integrators, application vendors, customers, and open source communities, Red Hat can help organizations prepare for the digital future.

Forward-Looking Statements

Except for the historical information and discussions contained herein, statements contained in this press release may constitute forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995. Forward-looking statements are based on the company’s current assumptions regarding future business and financial performance. These statements involve a number of risks, uncertainties and other factors that could cause actual results to differ materially. Any forward-looking statement in this press release speaks only as of the date on which it is made. Except as required by law, the company assumes no obligation to update or revise any forward-looking statements.

Red Hat, Red Hat Enterprise Linux, the Red Hat logo and OpenShift are trademarks or registered trademarks of Red Hat, Inc. or its subsidiaries in the U.S. and other countries. Linux® is the registered trademark of Linus Torvalds in the U.S. and other countries.

1 IDC, Market Analysis Perspective: Open GenAI, LLMs, and the Evolving Open Source Ecosystem, 2024, doc #US51864824, September 2024

 

Contacts
