In an industrywide race to unveil the next viral AI tool, Google is pushing forward with plans to publicly release technology it has been developing.
Google unveiled its experimental conversational AI service, Bard, on Monday, an announcement that follows last week's earnings call, which was sprinkled with teases of AI-related tools.
Bard uses information from the internet to generate up-to-date responses and simplify complex topics, according to Alphabet. Google used the example of the AI tool synthesizing the responses to two questions: “is the piano or guitar easier to learn, and how much practice does each need?”
Experts watched OpenAI and Microsoft’s partnership grow alongside the famed launch of ChatGPT, and many looked to Google to see how the rival would respond. Microsoft’s OpenAI investments and reports of a Bing ChatGPT integration have put pressure on Google, most widely known for its AI-powered search engine.
“We have been preparing for this moment since early last year, and you’re going to see a lot from us in the coming few months across three big areas of opportunity,” Alphabet CEO Sundar Pichai said during the company’s Q4 2022 earnings call last week, according to a SeekingAlpha transcript.
Those areas include language models; new tools and APIs related to AI development; and advancements in its cloud AI platform.
The company also plans to make more generative capabilities available.
Here is a breakdown of Google’s near-term AI plans:
Bard
Google made Bard, an experimental conversational AI service, available to “trusted testers” Monday, with wider public availability to follow in the coming weeks, the company said in its announcement.
Bard will initially run on a lightweight version of LaMDA, which Google expects will require less computing power and allow the company to scale the model to more users.
Google will combine external feedback with internal testing to evaluate the accuracy and quality of generated responses.
LaMDA/PaLM
Google plans to make its language models, LaMDA and PaLM, publicly available, the company said in its earnings call last week. Interactions with LaMDA and PaLM will fuel Google’s efforts to test and improve the models, following a launch path similar to ChatGPT’s.
“In the coming weeks and months, we’ll make these language models available, starting with LaMDA, so that people can engage directly with them,” Pichai said on the earnings call. “This will help us continue to get feedback, test and safely improve them.”
LaMDA, short for language model for dialogue applications, was built on Transformer, the company’s neural network architecture, and trained on dialogue, according to a Google blog post. Google first announced the technology in May 2021.
PaLM, the pathways language model, will follow LaMDA’s public release. PaLM is a decoder-only Transformer model trained with Google’s Pathways system. It can understand and generate language, reason and perform code-related tasks, according to a Google blog post. PaLM was first announced in April 2022.
Investing in AI entities
Alphabet is already signaling the importance of AI to its marquee brands. Starting this quarter, the company moved DeepMind out of the “other bets” category in its financial filings and into its corporate costs.
The move signals the AI subsidiary’s expanding collaboration across Google’s portfolio, according to Alphabet CFO Ruth Porat.
DeepMind is not the only entity helping Google reach its AI aspirations. Google confirmed it has invested in AI start-up Anthropic. While it did not say how much it invested, the Financial Times reported the amount to be around $300 million.
Anthropic will use Google Cloud’s infrastructure and Google will have access to Anthropic’s AI capabilities as part of the deal, similar to an arrangement struck between OpenAI and Microsoft.
“Google Cloud intends to build large-scale, next-generation TPU and GPU clusters that Anthropic plans to use to train and deploy its cutting-edge AI systems,” Google announced Friday.
BERT/MUM
As part of the AI push, Google Search is expected to see some improvement as well.
“Language models like BERT and MUM have improved searches for four years now, enabling significant ranking improvements and multimodal search like Google Lens,” Pichai said in the Q4 2022 earnings call. “Very soon, people will be able to interact directly with our newest, most powerful language models as a companion to search in experimental and innovative ways.”
BERT, short for bidirectional encoder representations from transformers, is a method of pre-training language models.
Built by Google, BERT is trained on a large source of text and applied to natural language processing tasks, according to a Google blog post. BERT was first announced in November 2018.
MUM, short for multitask unified model, is 1,000 times more powerful than BERT, understands information from both text and images, and helps users complete complex tasks with fewer search queries than previous systems required, according to a Google blog post. MUM was first announced in May 2021.