As Databricks stacks more capital, a competitive AI market heats up

CEO Ali Ghodsi chimes in on the investment, growth and more

This morning, data and AI giant Databricks said it had raised a new $500 million funding round from venture capitalists, crossover capital funds, and strategic investors. The new cash values Databricks at $43 billion, a material step up from its last private valuation set in 2021, when the company was worth $38 billion.


It’s not a complete surprise to see Databricks raise more capital; the company was recently reported to be in the market for a nine-figure round. Nor are the new funds a surprise given the excitement for all things AI-related, a part of the technology market that Databricks has embraced, particularly this year.

The company may be best known for its data and analytics work, but it is developing more AI tooling and recently purchased MosaicML to further build its artificial intelligence muscles.

The funding round was more than a mere cash infusion, though. It included several strategic investors, including Nvidia, which has seen demand for AI-related computing power greatly bolster its own growth and profitability in recent quarters.

To better understand the company’s own perspective, TechCrunch+ interviewed Databricks CEO Ali Ghodsi about the investment, its plans for AI, growth, the current market and more.

AI stands for All In

What exactly is Databricks doing to warrant this kind of investment at this value in this market? It’s a combination of a few things, really. For starters, data is the fuel for AI, and Databricks, at its core, stores data in its data lakehouse — think a data lake and a data warehouse combined, giving the best of both worlds.

As AI took off in a big way this year, driven in large part by the release of OpenAI’s ChatGPT last November, Databricks saw a way to put that data to work and drive the message. But Ghodsi said in a March interview with TechCrunch+ that even though LLMs entered the mainstream at the end of last year, they’ve been around for some time and close to 1,000 Databricks customers were already using data stored on its platform to train LLMs.

Before generative AI made models interactive, he said, customers were using LLMs for things like fast language translation, deriving risk data from insurance claims and understanding the contents of medical records, among other things. But as generative AI began to take off, Databricks wanted to be central to that process and not just remain the data store fueling the models.

“I think something shifted in the market this year, where our customers and the world have woken up and they have this awareness of AI,” Ghodsi said, adding that movement spurred Databricks to shift its strategy. That’s when the company began developing Dolly to have an LLM of its own, and later acquired Mosaic, all part of a push to be a player in generative AI, rather than a conduit for it.

The end result is that Databricks is now part of the conversation. When Salesforce announced Einstein Copilot Studio this week with the ability to bring your own model, one of the supported models was that of Databricks.

In it to win it

With a valuation like $43 billion, it’s somewhat easy to forget just how much competition Databricks has. Ghodsi is well aware that he is competing fiercely with a variety of vendors, especially Snowflake, which went public in 2020, but he believes his company’s lakehouse approach separates it from the pack.

“People realize that the lakehouse, which is a lot about what you can do with AI, is actually quite different from, say, things like data warehousing and other technologies that people have been obsessed with in prior years,” he said. In other words, he thinks his architecture is better geared toward current use cases.

He explained that the difference is that data warehouse technology lets you answer questions about the past, while the lakehouse lets you answer questions about the future. “Previously, it was about taking lots of data and analyzing the past, like what happened with my revenue last week or last month. The lakehouse brings in the lake portion, which is really unstructured data that is used for AI and data science, and that lets you ask questions about the future,” he said.

Nvidia is kicking in money as part of today’s investment, which can’t be a coincidence. It’s the leader in GPUs, which are a key part of the underlying infrastructure that runs large language models. It’s worth noting, however, that Nvidia also has a strategic partnership with Databricks’ primary rival, Snowflake, which was announced in June with an eye toward — wait for it — making it easier to build generative AI applications.

While Ghodsi didn’t provide details about how the partnership would play out, he did say it is about working together over many years and covering many different parts of the business. It’s fair to say that Nvidia is covering its bases with this investment, making sure that it can do business with all the key players in the market.

On the subject of capital

Moving from the bigger picture to the more nitty-gritty work of accounting, let’s turn our attention to the market that Databricks sees today.

First, according to Ghodsi, his company wasn’t hurting for ducats. The CEO told TechCrunch+ that his company did not “really need the capital” it raised, and instead emphasized its new partnerships and how they may help it “really double down on generative AI and build custom language models.”

It is, of course, difficult to vet a cash balance comment from the CEO of a private company, but we’ll be able to confirm Ghodsi’s notes when we finally get an S-1 from his camp in due time. Perhaps late next year, if we’re feeling optimistic.

Still, the idea of partnerships to bolster AI work makes good sense; every enterprise software company that deals with lots of customer data is trying to spin those bits into AI gold. Databricks, which rose to prominence thanks to its work to bring data warehouse–like analytics tools to lakes of unstructured data, has a big, big data story. And so it likely has a material AI play ahead of it.

Who wouldn’t want Nvidia on their side for that push?

For other investors, the new round is likely a wager on Databricks continuing to grow quickly and build a business that they hope will be worth closer to $100 billion than $50 billion when it does go public. That’s only an estimate, but we doubt that investors want to tie up large blocks of liquid capital for minor returns in today’s higher interest rate environment.

Still, there’s reason to believe Databricks can keep up the growth. Its disclosed growth rate — more than 50% year-over-year in its most recent fiscal quarter — was accompanied by a note that it added more revenue in that period than in any other quarter in its history. This is Databricks implicitly arguing that, sure, its growth is likely to slow in percentage terms as its revenue base expands, but that doesn’t mean it isn’t grabbing increasingly large handfuls of market spend at the same time.

That growth is coming from all around the world. Ghodsi told TechCrunch+ that his company is seeing “super fast growth” in the Europe, Middle East and Africa region, as well as in the Asia-Pacific area. He name-checked Latin America, later calling out Brazil, India, Japan, Australia and New Zealand, and added that “Europe is on fire” for his company’s products and services.

Demand for data, analysis, and AI services is growing strongly everywhere, then. We can see this in Snowflake’s own impressive growth rates post-IPO, but Databricks’ notes make it clear that the getting is good for the entire market, and not just for one company or another.

Finally, Databricks disclosed that it has “achieved record non-GAAP subscription gross margins of 85%.” That means that if we focus only on Databricks’ software revenues and strip out certain costs, it is seeing top-tier revenue quality.

The greater the gross margin a company can achieve, the higher the quality of its revenue, all other things being equal. Of course, if we did not strip out share-based compensation while observing the company’s subscription gross margins, that figure would be smaller, just as Databricks’ aggregate gross margins would be lower still.

Still, for the core of what it offers, Databricks is spending to build up a revenue base that should generate lots of cash for its operations.

With new capital, new friends, and AI momentum, Databricks appears to be doing well for itself this year. The success that it and some of its peers are seeing today will engender increased competition, though. That may explain why tech shops today are wasting no time in laying claim to the future AI market and mindshare, Databricks included.