H2O.ai Builds Smaller AI Models

Photo Credit: Benjamin Wiens from Pixabay

The growth of the AI industry has driven a steady increase in model size. OpenAI’s ChatGPT and Google’s Bard, for instance, are built on models with more than 100 billion parameters, and GPT-4 is estimated to contain over 1 trillion. Models of this scale, however, require substantial computing power, carry high operating costs, and can perpetuate harmful biases if not carefully monitored. These resource requirements put large AI models out of reach for smaller players. Mountain View-based H2O.ai is helping democratize AI adoption by working on smaller models, some of which weigh in at as little as 1.8 billion parameters.

H2O.ai’s Offerings

Founded in 2012 by open-source advocates Cliff Click and SriSatish Ambati, H2O.ai began with the idea that there should be freedom around the creation and use of AI. Its open-source framework, H2O, gives data scientists and developers access to a fast machine-learning engine for their applications. It runs on bare metal or on top of existing Hadoop, Spark, or Kubernetes clusters and can ingest data directly from HDFS, Spark, S3, Azure Data Lake, or other data sources into its in-memory distributed key-value store. Today, customers use the H2O AI Hybrid Cloud platform to build, operate, and innovate AI solutions that solve complex business problems.

H2O.ai provides an end-to-end GenAI platform in which customers own every part of the stack. The platform is built for air-gapped, on-premises, or cloud VPC deployments, allowing organizations to retain ownership of their data and their prompts. Its Enterprise h2oGPTe product provides information retrieval over internal data, privately hosts LLMs, and secures customer data. Recognizing how much model size matters to organizations, H2O.ai offers some of the smallest models available, sized to run on the GPUs customers already have. Its retrieval-augmented generation (RAG) technology connects its models to an organization's existing data stores. It offers 13B, 34B, and 70B Llama 2 models that are up to 100x smaller and more affordable.
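H2O.ai does not publish the internals of its RAG pipeline, but the core idea is simple: retrieve the most relevant internal documents for a query, then feed them to the LLM as grounding context. The sketch below illustrates that retrieval step with plain bag-of-words cosine similarity; the document texts, function names, and prompt template are illustrative assumptions, not H2O.ai APIs.

```python
from collections import Counter
import math

# Toy corpus standing in for an organization's internal documents.
DOCS = [
    "Quarterly revenue grew 12 percent driven by cloud subscriptions.",
    "The VPN requires multi-factor authentication for all employees.",
    "Model training jobs run on the on-prem Kubernetes GPU cluster.",
]

def tokenize(text: str) -> list[str]:
    return [w.strip(".,").lower() for w in text.split()]

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two bag-of-words term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Rank documents by similarity to the query and keep the top k.
    q = Counter(tokenize(query))
    ranked = sorted(docs, key=lambda d: cosine(q, Counter(tokenize(d))),
                    reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    # Prepend retrieved context so the LLM answers from internal data.
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("Where do training jobs run?", DOCS))
```

Production systems typically replace the bag-of-words scoring with dense vector embeddings and an approximate-nearest-neighbor index, but the retrieve-then-prompt shape is the same.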

More recently, it released H2O-Danube2-1.8B, a 1.8-billion-parameter model that can run on a single consumer GPU, fits within the RAM capacity common on consumer cards, and can handle small tasks on CPU-only machines. By reducing the infrastructure requirement, H2O-Danube becomes far more accessible to individual developers, researchers, and startups operating under tight resource constraints.

Recent research also suggests that smaller models are less prone to hallucinations. Compared to their bigger counterparts, smaller models are trained on narrower, more targeted datasets specific to their intended use cases. This training helps the model acquire more relevant patterns, vocabulary, and information, reducing the likelihood of irrelevant, unexpected, or inconsistent outputs.

H2O.ai’s Financials

H2O.ai remains privately held and does not disclose its financials. It has raised $251.1 million across 9 funding rounds from investors including Veligera Capital, 2Shares, Goldman Sachs, Commonwealth Bank of Australia, Crane Venture Partners, Pivot Investment Partners, New York Life Insurance, NVIDIA, Celesta Capital, and Wells Fargo. It has not raised funds since November 2021, when it raised $100 million at a valuation of $1.6 billion.

H2O.ai is not the only open-source AI player in the industry. Projects such as KNIME and XGBoost offer similar open-source and visual tools for building efficient machine-learning models. And there is always competition from the likes of Amazon and Google for the company to contend with.

More By This Author:

Is BigML Just Another Small ML Platform Player?
AI Unicorns: Clarifai Brings AI To Third Party Developers
AI Unicorns: DataRobot Delays Listing Plans

Disclosure: All investors should make their own assessments based on their own research, informed interpretations, and risk appetite. This article expresses my own opinions based on my own research ...
