Amid a flurry of new hardware including the Pixel 7, the Pixel Buds Pro and a new Pixel Tablet, Google dropped one development at its I/O developer conference that went largely unnoticed: its AI can now understand jokes.
Jokes, sarcasm and humor require understanding the subtleties of language and human behavior. When a comedian says something sarcastic or controversial, the audience can usually discern the tone and recognize the exaggeration, an ability honed by years of human interaction.
But PaLM, or Pathways Language Model, learned it without being explicitly trained on humor and the logic of jokes. After being fed two jokes, it was able to interpret them and spit out an explanation. In a blog post, Google shows how PaLM understands a novel joke not found on the internet.
Understanding dad jokes isn't the end goal for Alphabet, Google's parent company. The ability to parse the nuances of natural language and queries means Google can answer complex questions faster and more accurately across more languages and populations. That, in turn, can break down barriers, letting people interact with machines seamlessly rather than through predetermined commands. It could mean answering a question asked in one language by finding information in another, or writing code for a program as a person describes a task aloud to the model.
The power of this tech seemed lost on some on Twitter: a search for #GoogleIO2022 surfaced top results focused on Pixel hardware. To get people to grasp the power of its AI, Google plans to deliver it through bespoke chips housed in custom Pixel devices.
“How do you tell my mom what LaMDA or PaLM is, let alone what natural language processing is,” said Tuong Nguyen, senior principal analyst at Gartner, a tech research and consulting firm. “The devices serve as, in part, a delivery mechanism for all the amazing things that they’re doing.”
Google can pair that with practical examples of how PaLM can make little annoyances go away. During his on-stage appearance at Google I/O on Wednesday, Alphabet CEO Sundar Pichai pointed to the frustration people who speak lesser-known languages face when trying to find answers to problems in their own tongue. The answer is likely online but will probably be in English or Spanish.
Pichai showed off an example where PaLM was asked in Bengali what popular pizza toppings are in New York. The model can find the answer in English and translate it back to the user in Bengali.
“One day, we hope we can answer questions on more topics in any language you speak, making knowledge even more accessible in Search and across all of Google,” Pichai said.
Surpassing Star Trek’s Data?
PaLM, with 540 billion parameters, is Google's largest AI model to date. It can generate code from text, answer math word problems and explain jokes. It does this through chain-of-thought prompting, which breaks a multi-step problem into a series of intermediate reasoning steps.
On stage, Pichai described it as a teacher giving a step-by-step example to help a student understand how to solve a problem.
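The technique can be sketched in code. Below is a minimal illustration of how a chain-of-thought prompt differs from a standard few-shot prompt; the word problem mirrors the example in Google's published chain-of-thought research, and the variable names are illustrative, not Google's actual PaLM prompt format.

```python
# Standard few-shot prompt: the worked example shows only the final answer.
standard_prompt = (
    "Q: Roger has 5 tennis balls. He buys 2 cans of 3 tennis balls each. "
    "How many tennis balls does he have now?\n"
    "A: The answer is 11.\n\n"
    "Q: The cafeteria had 23 apples. They used 20 and bought 6 more. "
    "How many apples do they have?\n"
    "A:"
)

# Chain-of-thought prompt: the worked example spells out the intermediate
# reasoning steps, so the model imitates that step-by-step structure.
chain_of_thought_prompt = (
    "Q: Roger has 5 tennis balls. He buys 2 cans of 3 tennis balls each. "
    "How many tennis balls does he have now?\n"
    "A: Roger started with 5 balls. 2 cans of 3 tennis balls each is "
    "6 balls. 5 + 6 = 11. The answer is 11.\n\n"
    "Q: The cafeteria had 23 apples. They used 20 and bought 6 more. "
    "How many apples do they have?\n"
    "A:"
)
```

When prompted the second way, a large model tends to produce its own intermediate steps before the answer, which is what lets it tackle multi-step math and, Google says, explain jokes.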
If what Pichai said on stage is accurate, Google has essentially leapfrogged Star Trek and 400 years of fictional AI development, as embodied by the android Data, who never truly grasped the subtleties of humor. What's more, it seems Google has caught up with TARS from the movie Interstellar, which takes place in the year 2090, an AI so adept at humor that Matthew McConaughey's character told it to turn its humor setting down.
PaLM’s ability to understand humor and make logical inferences is helping Google solve novel challenges that before would have taken someone with specific expertise.
In this sense, Google's AI is a time machine: its models can accomplish in seconds what once took years of study and research. Time machine-like companies, the ones that connect people faster or instantaneously, are the ones that dominate global markets and change lives. The value they bring to how humans connect is tremendous.
Google, which has primarily been a search and ad sales company, can use its tech to ensure people keep turning to Google to answer their questions. This, of course, will give Google more data points with which to target advertising to customers. Companies, too, can turn to its servers and AI to solve complex problems. At I/O, Pichai announced a $9.5 billion publicly available data center and machine learning hub in Mayes County, Oklahoma. There, Google Cloud customers can use the facility's nine exaflops of computing power to run complex models and help solve problems in medicine, logistics, sustainability and more.
On the consumer front, Google has invested in custom development with its Tensor line of chips for its Pixel phones.
“Google is fundamentally a data and AI and search company. And they monetize your attention and data,” said Avi Greengart, president and lead analyst at Techsponential. “The hardware is an extension of the platform and an ecosystem that drives all of that.”
For Google to fulfill its vision of “ambient computing,” or a future where people can use computers so intuitively they don’t realize they’re using one, it’s had to invest in its own hardware.
“Another company might not talk about something that’s coming in the fall, when we’re not even in the spring, because you’re afraid of burning yourself,” Greengart said. “Google isn’t afraid of that. They’d rather build towards this vision that they have an interoperable software that does cool stuff.”