As first reported by The Information, then seemingly verified by Salesforce CEO Marc Benioff on X (formerly known as Twitter), AI startup Hugging Face has raised $235 million in a Series D funding round.
The tranche, which had participation from Google, Amazon, Nvidia, Intel, AMD, Qualcomm, IBM, Salesforce and Sound Ventures, values Hugging Face at $4.5 billion. That’s double the startup’s valuation from May 2022 and reportedly more than 100 times Hugging Face’s annualized revenue, reflecting the enormous appetite for AI and platforms to support its development.
Hugging Face has raised a total of $395.2 million to date, placing it among the better-funded AI startups in the space. Those ahead of it are OpenAI ($11.3 billion), Anthropic ($1.6 billion), Inflection AI ($1.5 billion), Cohere ($435 million) and Adept ($415 million).
“AI is the new way of building all software. It’s the most important paradigm shift of the decade and, compared to the software shift, it’s going to be bigger because of new capabilities and faster because software paved the way,” co-founder and CEO Clément Delangue told TechCrunch via email. “Hugging Face intends to be the open platform that empowers this paradigm shift.”
Delangue, a French entrepreneur, launched Brooklyn-based Hugging Face in 2016 alongside Julien Chaumond and Thomas Wolf. The trio originally built a chatbot app targeted at teenagers. But after open sourcing the algorithm behind the app, Hugging Face pivoted to focus on building a platform for creating, testing and deploying machine learning models.
Today, Hugging Face offers a number of data science hosting and development tools, including a GitHub-like hub for AI code repositories, models and data sets as well as web apps to demo AI-powered applications. Hugging Face also provides libraries for tasks like data set processing and evaluating models in addition to an enterprise version of the hub that supports software-as-a-service and on-premises deployments.
Hugging Face’s paid functionality includes AutoTrain, which helps to automate the task of training AI models; Inference API, which allows developers to host models without managing the underlying infrastructure; and Infinity, which is designed to increase the speed with which an in-production model processes data.
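For a rough sense of what the Inference API involves, a hosted model is invoked with a single authenticated HTTP POST. The sketch below (not from the article) only assembles such a request rather than sending it; the endpoint pattern and bearer-token header reflect the API's public documentation at the time, and the model name and token are placeholders:

```python
def build_inference_request(model_id: str, inputs: str, token: str) -> dict:
    """Assemble the URL, headers and JSON payload for a hosted-model call.

    Assumes the public endpoint pattern
    https://api-inference.huggingface.co/models/<model_id> with
    bearer-token authentication.
    """
    return {
        "url": f"https://api-inference.huggingface.co/models/{model_id}",
        "headers": {"Authorization": f"Bearer {token}"},
        "json": {"inputs": inputs},
    }


# Example: a request against a placeholder sentiment model.
req = build_inference_request(
    "distilbert-base-uncased-finetuned-sst-2-english",
    "Hugging Face raised a Series D.",
    "hf_xxx",  # placeholder token, not a real credential
)
print(req["url"])
```

In practice the returned dict would be passed to an HTTP client (e.g. `requests.post(**req)`); the point is that the developer never touches the serving infrastructure, only the model ID and payload.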
Hugging Face has 10,000 customers today, it claims, and more than 50,000 organizations on the platform. And its model hub hosts over 1 million repositories.
Contributing to the growth is the strong, sustained interest in AI from the enterprise. According to a HubSpot poll, 43% of business leaders say that they plan to increase their investment in AI and automation tools over the course of 2023, while 31% say AI and automation tools are very important to their overall business strategy.
Much of what Hugging Face delivers falls into MLOps, a category of tools for streamlining the process of taking AI models to production and then maintaining and monitoring them. The MLOps market is substantial in its own right, with one report estimating that it’ll reach $16.61 billion by 2030.
But Hugging Face dabbles in other areas, too.
In 2021, Hugging Face launched BigScience, a volunteer-led project to produce an open source language model as powerful as OpenAI’s GPT-3, but free and open for anyone to use. It culminated in Bloom, a multilingual model that for more than a year has been available to tinker with on Hugging Face’s model hub.
Bloom is but one of several open source models to which Hugging Face has contributed development resources.
Hugging Face collaborated with ServiceNow, the enterprise software company, to release a free code-generating AI model called StarCoder (a follow-up model, SafeCoder, debuted this week). And the startup made available its own free version of ChatGPT, OpenAI’s viral AI-powered chatbot, in partnership with the German nonprofit LAION.
Hugging Face’s team-ups extend to major cloud providers, some of which are strategic investors.
Hugging Face recently worked with Nvidia to expand access to cloud compute via Nvidia’s DGX computing platform. It has a partnership with Amazon to extend its products to AWS customers and leverage Amazon’s custom Trainium chips to train the next generation of Bloom. And Hugging Face collaborated with Microsoft on Hugging Face Endpoints on Azure, a way to turn Hugging Face-developed AI models into scalable production solutions hosted through Azure.
With this latest investment, Delangue says that Hugging Face plans to “double down” on its efforts to support research, enterprise adoption and startups. The company currently employs 170 people, but plans to recruit new talent over the coming months.