Google is redesigning its search engine — and it’s AI all the way down 

From ‘AI Overviews’ to automatic categorization, Google is bringing AI to practically every part of the search process. 

A year ago, Google said that it believed AI was the future of search. That future is apparently here: Google is starting to roll out “AI Overviews,” previously known as the Search Generative Experience, or SGE, to users in the US and soon around the world. Pretty soon, billions of Google users will see an AI-generated summary at the top of many of their search results. And that’s only the beginning of how AI is changing search. 

“What we see with generative AI is that Google can do more of the searching for you,” says Liz Reid, Google’s newly installed head of Search, who has been working on all parts of AI search for the last few years. “It can take a bunch of the hard work out of searching, so you can focus on the parts you want to do to get things done, or on the parts of exploring that you find exciting.” 

Reid ticks off a list of features aimed at making that happen, all of which Google announced publicly on Tuesday at its I/O developer conference. There are the AI Overviews, of course, which are meant to give you a general sense of the answer to your query along with links to resources for more information. There’s also a new feature in Lens that lets you search by capturing a video. There’s a new planning tool designed to automatically generate a trip itinerary or a meal plan based on a single query. There’s a new AI-powered way to organize the results page itself so that when you want to see restaurants in a new city, it might offer you a bunch for date night and a bunch for a business meeting without you even having to ask.  

This is nothing short of a full-stack AI-ification of search. Google is using its Gemini AI to figure out what you’re asking about, whether you’re typing, speaking, taking a picture, or shooting a video. It’s using a new specialized Gemini model to summarize the web and show you an answer. It’s even using Gemini to design and populate the results page.

Not every search needs this much AI, though, Reid says, and not every search will get it. “If you just want to navigate to a URL, you search for Walmart and you want to get to walmart.com. It’s not really beneficial to add AI.” Where she figures Gemini can be most helpful is in more complex situations, the sort of things you’d either need to do a bunch of searches for or never even go to Google for in the first place.  

One example Reid likes is local search. (You hear this one a lot in AI because it can be tricky to wade through tons of same-y listings and reviews to find something actually good.) With Gemini, she says, “we can do things like ‘Find the best yoga or pilates studio in Boston rated over four stars within a half-hour walk of Beacon Hill.’” Maybe, she continues, you also want details on which has the best offers for first-timers. “And so you can get information that’s combined, across the Knowledge Graph and across the web, and pull it together.” 

That combination of the Knowledge Graph and AI — Google’s old search tool and its new one — is key for Reid and her team. Some things in search are a solved problem, like sports scores: “If you just actually want the score, the product works pretty well,” Reid says. Gemini’s job, in that case, is to make sure you get the score no matter how strangely you ask for it. “You can think about expanding the types of questions that would successfully trigger the scores,” she says, “but you still want that canonical sports data.” 

Getting good data is the whole ball game for Google and any other search engine. Part of the impetus for creating the new search-specific Gemini model, Reid tells me, was to focus it on getting things right. “There’s a balance between creativity and factuality” with any language model, she says. “We’re really going to skew it toward the factuality side.” AI Overviews may not be fun or charming, but as a result, they might get things right more often. (Though no model is perfect, and Google is surely going to face plenty of problems from hallucinated and just straight-up false overviews.)  

As AI has come for search, products like Perplexity and Arc have come under scrutiny for combing and summarizing the web without directing users to the actual sources of information. Reid says it’s a tricky but important balance to strike and that one way Google is trying to do the right thing is by simply not triggering overviews on certain things. But she’s also convinced and says early data shows that this new way of searching will actually lead to more clicks to the open web. Sure, it may undercut low-value content, she says, but “if you think about [links] as digging deeper, websites that do a great job of providing perspective or color or experience or expertise — people still want that.” She notes that young users in particular are always looking for a human perspective on their query and says it’s still Google’s job to give that to them. 

Over most of the last decade, Google has been trying to change the way you search. It started as a box where you type keywords; now, it wants to be an all-knowing being that you can query any way you want and get answers back in whatever way is most helpful to you. “You increase the richness, and let people ask the question they naturally would,” Reid says. For Google, that’s the trick to getting even more people to ask even more questions, which makes Google even more money. For users, it could mean a completely new way to interact with the internet: less typing, fewer tabs, and a whole lot more chatting with a search engine.
Content Courtesy – The Verge