Cloudera

“Cloudera is thrilled to join industry thought leaders like Intel in the Open Platform for Enterprise AI Alliance, embracing openness and collaboration to drive innovation and empower the future of generative AI.” – Andy Moller, SVP of Global Alliances & Ecosystem, Cloudera

Domino Data Lab

“Enterprises integrating their choice of cutting-edge tools into their AI platforms aren’t just ahead of the AI curve—they’re defining it. OPEA’s vision aligns with our commitment to open, flexible, governed AI innovation, and we’re proud to support it alongside Intel.”  – Thomas Robinson, COO, Domino Data Lab

dstack

“At dstack, we’re building a new approach to AI infrastructure management aimed at leveraging open source and ensuring its portability across multiple infrastructure and model vendors. We believe the mission of the OPEA initiative is crucial for the safety and democratization of enterprise AI. We’re excited to be a part of it.” – Andrey Cheptsov, CEO & Founder, dstack

Hugging Face

“Hugging Face’s mission is to democratize good machine learning and maximize its positive impact across industries and society. By joining OPEA’s open-source consortium to accelerate Generative AI value to enterprise, we will be able to continue advancing open models and simplify GenAI adoption.” – Julien Simon, Chief Evangelist, Hugging Face

KX

“Collaborative, open-source projects like OPEA fuel our excitement for the future of gen AI because of the ability it has to drive acceleration of both innovation and adoption within enterprise organizations. The power of RAG is undeniable, and its integration into gen AI creates a ballast of truth that enables businesses to confidently tap into their data and use it to grow their business.” – Michael Gilfix, Chief Product and Engineering Officer, KX

MariaDB Foundation

“As GenAI matures, integration into existing IT is a natural and necessary step. The world needs GenAI and vectors as part of a general purpose RDBMS, and we have already demonstrated our ability to deliver this through MariaDB Server. We see huge opportunities for core MariaDB users – and users of the related MySQL Server – to build RAG solutions. It’s logical to keep the source data, the AI vector data, and the output data in one and the same RDBMS. The OPEA community, as part of LF AI & Data, is an obvious entity to simplify Enterprise GenAI adoption.” – Kaj Arnö, CEO, MariaDB Foundation

MinIO

“The OPEA initiative is crucial for the future of AI development. Advocating for a foundation of open source and standards – from datasets to formats to APIs and models, enables organizations and enterprises to build transparently. The AI data infrastructure must also be built on these open principles. Only by having open source and open standard solutions, from models to infrastructure and down to the data are we able to create trust, ensure transparency and promote accountability.” – AB Periasamy, CEO and co-Founder, MinIO

Qdrant

“As the leading open-source vector database technology provider, Qdrant is excited to support the launch of the OPEA, underscoring the importance of open standards in AI for innovation and data sovereignty. Our commitment to these principles is rooted in our core, and we look forward to contributing to an ecosystem where AI thrives with a deep respect for data ownership.” – Andre Zayarni, CEO & co-Founder, Qdrant

Red Hat

“As gen AI continues to advance, open source is playing a critical role in the standardization and democratization of models, frameworks, platforms and the tools needed to help enterprises realize value from AI. Red Hat is excited about the potential for AI innovation for our customers through the Open Platform for Enterprise AI.” – Steven Huels, vice president and general manager, AI Business Unit, Red Hat

SAS

“With the potential that generative AI has to shape our future and the ways we do business, it’s imperative that we tap into the power of collaboration for innovation and accuracy in enterprise AI. We’re excited to be a part of the newest LF AI & Data Sandbox project and to work with other industry leaders on OPEA.” – Shadi Shahin, VP, Product Strategy, SAS

VMware (acquired by Broadcom)

“We are seeing tremendous enthusiasm among our customer base for RAG, with organizations deploying RAG applications on-premises to empower employees and customers to find the information they need faster, creating greater efficiencies in customer service and document search. The constructs behind RAG can be universally applied to a variety of use cases, making a community-driven approach that drives consistency and interoperability for RAG applications an important step forward in helping all organizations to safely embrace the many benefits that AI has to offer.” – Chris Wolf, Global Head of AI and Advanced Services, Broadcom

Yellowbrick Data

“We are pleased to collaborate with the Open Platform for Enterprise AI (OPEA), which offers essential guidance in a dense and complex market. Within OPEA, Yellowbrick serves as a data provider—recognizing data as the crucial fuel for AI. Our data warehouse incorporates advanced vector capabilities, enabling seamless integration of AI with current systems and workflows. This ensures that AI augments rather than interrupts business processes, simplifying AI adoption.” – Mark Cusack, CTO, Yellowbrick Data

Zilliz

“We firmly believe that vector databases are integral to the future of open generative AI, which is why we donated the Milvus vector database to the Linux Foundation back in 2020. Our support of OPEA is an extension of that commitment to creating a framework alongside Intel that fosters extensible, accessible, and scalable AI platforms for enterprise developers.” – Charles Xie, CEO & Founder of Zilliz