

The Linux Foundation AI & Data announced the Open Platform for Enterprise AI (OPEA)

The Linux Foundation AI & Data recently announced the Open Platform for Enterprise AI (OPEA). The OPEA will address the fragmentation at the heart of generative AI technologies with standardized components, including frameworks, blueprints, and reference solutions.

by Ellie Ramirez-Camara
Credit: The Linux Foundation AI & Data

The Open Platform for Enterprise AI (OPEA) is the Linux Foundation AI & Data's latest Sandbox Project, developed in part by Intel and industry partners including Anyscale, Cloudera, Hugging Face, Qdrant, VMware, and others. OPEA emerged as a response to the fragmentation of tools, techniques, and solutions in generative AI, largely a consequence of the industry's recent rapid development and increasing competitiveness. OPEA's approach to standardized, modular, and heterogeneous pipelines will rely on collaboration with industry partners to "standardize components, including frameworks, architecture blueprints, and reference solutions that showcase performance, interoperability, trustworthiness, and enterprise-grade readiness."

As one of the main contributors to OPEA, Intel has stated that it plans to publish a technical conceptual framework, release reference implementations for generative AI pipelines based on Intel Xeon processors and Gaudi 2 accelerators, and expand infrastructure capacity in the Intel Tiber Developer Cloud. Intel has also revealed that it has already taken its first steps toward these goals by publishing a set of reference implementations in the OPEA GitHub repository, including a chatbot built on Intel Xeon 6 and Gaudi 2, as well as document summarization, visual question answering, and a code generation copilot on Gaudi 2. Additionally, Intel released an assessment framework that provides a standardized grading and evaluation system based on performance, trustworthiness, scalability, and resilience. Looking forward, OPEA will provide self-evaluation tests based on this framework and perform external grading and evaluation upon request.
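
These reference implementations center on composable generative AI pipelines in which components such as embedding models, vector stores, and LLM backends can be swapped independently across heterogeneous hardware. The sketch below is not OPEA's actual API; it is a minimal, hypothetical Python illustration of that kind of modular retrieval-augmented chatbot pipeline, with invented stand-in components (toy_embed, toy_retrieve, toy_generate) so it runs as-is.

```python
from dataclasses import dataclass
from typing import Callable, List

# Hypothetical component interfaces: each pipeline stage is a plain callable,
# so an embedder, retriever, or LLM backend can be replaced independently
# (for example, a CPU-hosted embedder paired with an accelerator-hosted generator).
Embedder = Callable[[str], List[float]]
Retriever = Callable[[List[float], int], List[str]]
Generator = Callable[[str], str]

@dataclass
class RAGChatPipeline:
    embed: Embedder
    retrieve: Retriever
    generate: Generator

    def answer(self, question: str, top_k: int = 3) -> str:
        query_vec = self.embed(question)            # 1. embed the user query
        context = self.retrieve(query_vec, top_k)   # 2. fetch supporting passages
        prompt = "\n".join(context) + f"\n\nQuestion: {question}\nAnswer:"
        return self.generate(prompt)                # 3. generate the grounded answer

# Toy stand-ins (assumptions, not OPEA code) so the sketch runs end to end.
def toy_embed(text: str) -> List[float]:
    return [float(len(text) % 7), float(len(text) % 11)]

def toy_retrieve(vec: List[float], k: int) -> List[str]:
    corpus = [
        "OPEA standardizes components for enterprise GenAI pipelines.",
        "Reference implementations cover chat, summarization, and code generation.",
    ]
    return corpus[:k]

def toy_generate(prompt: str) -> str:
    return "Answer based on retrieved context: " + prompt.splitlines()[0]

if __name__ == "__main__":
    pipeline = RAGChatPipeline(embed=toy_embed, retrieve=toy_retrieve, generate=toy_generate)
    print(pipeline.answer("What does OPEA standardize?"))
```

In a real deployment, each stand-in would be replaced by a production component, for instance a Qdrant-backed retriever or an accelerator-hosted generation service, without changing the pipeline's overall structure.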



