SAP to Acquire Prior Labs to Establish a Globally Leading Frontier AI Lab in Europe

May 4, 2026

Acquisition doubles down on SAP's early-mover advantage in tabular foundation models


WALLDORF and FREIBURG – SAP (NYSE: SAP) and Prior Labs, the pioneer of Tabular Foundation Models (TFMs), announced that they have entered into a definitive agreement for SAP to purchase Prior Labs, accelerating SAP's success in TFMs that started with SAP-RPT-1 and bringing one of the world's leading TFM research teams into the SAP family.

Prior Labs will continue to operate as an independent entity, with SAP committing to invest more than €1 billion over the next four years to scale it into a globally leading frontier AI lab for the structured data that runs the world's businesses. Terms of the deal were not disclosed. The transaction is pending regulatory approval.

Large language models (LLMs) struggle to make accurate predictions on structured business data because they have only a rudimentary understanding of tables, numbers and statistics. Unlike LLMs, TFMs are purpose-built for this type of data and can accurately predict business outcomes such as payment delays, supplier risks, upsell opportunities, customer churn risk and more directly from tabular data.

"Early on, SAP recognized that the greatest untapped opportunity in enterprise AI wasn't large language models; it was AI built for the structured data that runs the world's businesses," SAP CTO Philipp Herzig said. "We built SAP-RPT-1 to prove that conviction for enterprise data. Prior Labs has built a leading TFM on public benchmarks and assembled one of the leading research teams in this category. Combining their frontier model work with enterprise data and customer reach is how we intend to lead this category globally."

“Over the last 18 months, Prior Labs has built an incredible team, increasing the velocity in tabular foundation models,” Prior Labs CEO Frank Hutter said. “Joining the 麻豆原创 family gives us the resources, data environment and customer reach to take this category to its full potential.”

Once the transaction has closed, SAP and Prior Labs will have a singular opportunity to establish an industry-leading AI research lab and shape a new category in TFMs. The lab will operate as an independent unit to preserve research velocity, while SAP provides long-term investment and a direct path to productization across the SAP portfolio with SAP AI Core and SAP Business Data Cloud, as well as the agentic layer with Joule.

With over 3 million downloads, Prior Labs' TabPFN is a widely adopted open-source tool for tabular AI, supporting a dynamic developer ecosystem. SAP is fully committed to supporting this open-source strategy. Prior Labs cofounders Frank Hutter, Noah Hollmann and Sauraj Gambhir lead a team of world-class AI researchers and practitioners. The company works with leading scientists in the field, including Yann LeCun, ACM A.M. Turing Award winner and executive chairman at Advanced Machine Intelligence, and Bernhard Schoelkopf, director of the Max Planck Institute for Intelligent Systems and ELLIS president, both of whom will serve on Prior Labs' scientific advisory board as it scales into a globally leading frontier AI lab.

Accelerating Innovation

Prior Labs' TabPFN-2.6 is the top-performing model on TabArena, the leading benchmark for TFMs. TabPFN-2.6 matches the accuracy of a four-hour automated machine learning pipeline instantly, in a single model and at a fraction of the complexity.

With a conversational interface layered on top, business users can ask questions in natural language, generate or select datasets and run "what-if" scenarios without needing to be data science or machine learning experts. With Prior Labs' models, SAP will provide in-context learning, allowing users to supply data records and receive instant, reliable predictions without any model training. A single TFM can adapt to any business use case on the fly, resulting in faster time to value with GDPR compliance.
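To make the "no model training" point concrete, here is a minimal sketch of in-context tabular prediction. It assumes the scikit-learn-style interface of Prior Labs' open-source tabpfn Python package, in which fit() conditions the pretrained model on the supplied rows rather than running gradient training; the churn dataset, feature columns and values below are hypothetical.

import numpy as np
from tabpfn import TabPFNClassifier

# Hypothetical historical customer records: tenure (months), monthly spend,
# support tickets opened. Labels: 1 = churned, 0 = retained.
X_context = np.array([
    [24, 80.0, 1],
    [3, 20.5, 4],
    [36, 120.0, 0],
    [6, 35.0, 5],
])
y_context = np.array([0, 1, 0, 1])

# fit() conditions the pretrained tabular foundation model on the context
# rows; no gradient updates occur, so this step is effectively instant.
clf = TabPFNClassifier()
clf.fit(X_context, y_context)

# A single forward pass yields churn probabilities for an unseen customer.
X_new = np.array([[12, 60.0, 2]])
print(clf.predict_proba(X_new))

In practice the model would be given far more context rows; the four shown here only illustrate the call pattern that replaces a conventional training pipeline.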

With Prior Labs, SAP will deliver TFMs with superior predictive capability that understand tables natively and learn statistical reasoning directly from data. These models will power agentic AI systems capable of understanding high-level goals and of combining tables, language and images to reason, integrate domain knowledge, infer causality and adapt dynamically.

After the close, SAP and Prior Labs plan to turn top AI research into enterprise-ready innovation, allowing customers to get even more value out of their tabular business data. True intelligence requires moving beyond correlation to understand causation. Answering "What will happen?" is useful, but answering why it will happen is transformative.

The transaction is expected to close in Q2 or Q3 of 2026, subject to customary closing conditions, including regulatory approvals.


About Prior Labs

Prior Labs is the pioneer of Tabular Foundation Models, a new category of AI purpose-built for structured data. Founded by Frank Hutter, Noah Hollmann and Sauraj Gambhir, Prior Labs created the TabPFN model series, published in Nature, which set the state of the art on tabular benchmarks across hundreds of independent academic studies. Prior Labs is scaling tabular foundation models to handle millions of rows, real-time inference and entirely new data modalities, while building the infrastructure to deploy them in production across some of the most demanding industries on earth.

Headquartered in Freiburg, Germany, with offices in Berlin and New York City, Prior Labs has built one of the leading AI research teams globally, with researchers recruited from Google, Apple, Amazon, Microsoft, G-Research, Jane Street, Goldman Sachs, and CERN.

About SAP

As a global leader in enterprise applications and business AI, SAP (NYSE: SAP) stands at the nexus of business and technology. For over 50 years, organizations have trusted SAP to bring out their best by uniting business-critical operations spanning finance, procurement, HR, supply chain, and customer experience. For more information, visit www.sap.com.

Sign up for the SAP News Center newsletter to receive stories and highlights each week.

Note to editors:
To preview and download broadcast-standard stock footage and press photos digitally, please visit . On this platform, you can find high-resolution material for your media channels.

For customers interested in learning more about SAP products:
Global Customer Center: +49 180 534-34-24
United States Only: 1 (800) 872-1SAP (1-800-872-1727)

For more information, press only:
Daniel Reinhardt, SAP, +49 151 168 10157, daniel.reinhardt@sap.com, CET
Alex Vaught, SAP, +1 (206) 678-5712, alex.vaught@sap.com, PST
Ilaina Jonas, SAP, +1 (646) 923-2834, ilaina.jonas@sap.com, EST
SAP Press Room; press@sap.com

This document contains forward-looking statements, which are predictions, projections, or other statements about future events. These statements are based on current expectations, forecasts, and assumptions that are subject to risks and uncertainties that could cause actual results and outcomes to materially differ. Additional information regarding these risks and uncertainties may be found in our filings with the Securities and Exchange Commission, including but not limited to the risk factors section of SAP's 2025 Annual Report on Form 20-F.
© 2026 SAP SE. All rights reserved.
SAP and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP SE in Germany and other countries. Please see  for additional trademark information and notices.
Please consider our . If you received this press release in your e-mail and you wish to unsubscribe from our mailing list, please contact press@sap.com and write Unsubscribe in the subject line.
