Channel: Fast Company

Google’s new AI search will change the way you browse the internet


When Google launched its AI Overviews, a world of normies got access to the power of modern AI search. Instead of a page of links, its Gemini model synthesizes one answer (occasionally adding some glue) in an attempt to offer a single, be-all, end-all search result.

But today, Google is sharing plans that reframe AI search as the exact opposite of that initial strategy. Rather than giving you one perfect answer, it’s designed to help you browse many. The features will come to recipe and restaurant results on mobile devices in the coming weeks, before making their way to travel and shopping.

“We know users love to browse and there’s a joy in discovery . . . it’s not always about that instant answer,” says Rhiannon Bell, VP of UX at Google. “We’re looking at organizing search from a content perspective in new ways that feel more natural to users.”

[Image: Google]

Visually speaking, that means many Google results will look less like a list of links and more like multimedia clusters. Rather than big hero treatments of single recipes from Epicurious or Bon Appétit, the design prioritizes info density. Clusters will include thumbnails that lead to many different recipes, and different clusters can offer more nuanced tangents to explore. One cluster might surface results for pasta primavera, another might focus on recipes for dietary restrictions or specific ingredients, and a third could cover pasta cooking techniques. (And in theory, there are all sorts of possibilities for personalization.)

“AI has made this so much easier for us to do in terms of the diversity of the topics and subtopics that matter to your query,” says Bell. “Before it would have been like, directly matched to your keywords; now we can branch off a bit and offer more content around a subtopic that might interest you as well.”

[Image: Google]

Google has a few ways these clusters can appear (some requiring a photo, others operating on just a headline, others excerpting message boards in short answers), so as you scroll down the page, the search results have a visual variety that looks more like a publication than a search page. Bell calls this visual cadence, with plenty of white space, an important "rhythm" of browsing.

“As the web grows and evolves, there’s so many more formats out there. It’s not just the blue link; we have forums, human perspectives, image rich content, short and long form videos,” says Bell. “As that richness grows, we have to evolve our product strategy to meet people in search.”

A shift in engagement strategy

When it comes to the topic of information design, it’s hard to argue with Google’s strategy. Not all information should be presented the same way. Not every question can be resolved with one simple answer. And AI can unlock design possibilities we never imagined before, so we do need to think beyond the AI companion.

Notably, Google is adding other new capabilities to search through Lens that look beyond typing in the search bar. You can now film a video with Google Lens while asking a question with your voice, or circle an item in a photo to search for it and buy it. These multimodal styles of search will no doubt play a bigger role in Google products down the line.

[Image: Google]

Google is clearly tweaking its approach to search design to expand and reposition its own search strategy. Even as the company pays approximately $15 billion annually to Apple to be the default search engine on iPhones, critics have warned that Google is losing its edge in search, whether to ChatGPT and Bing or to Gen Z's habit of TikToking instead of Googling. The company counters that AI search has actually boosted its search revenue so far. In fact, it will be adding advertisements to its AI Overviews (along with welcome outbound links for sourcing).

As for Google's new AI clustering strategy specifically, which draws you deeper into a more pleasantly designed page of links to explore, it feels like a fresh grab for engagement. It's a way for Google itself to continue evolving from a quick stop before going somewhere else into the place you hang out for a while to peruse the web. In fact, I'd argue the new Google is looking ever more like a publisher, using a mix of content types and presentations to keep stories fresh across the page. Case in point: Google's own content-clustered redesign heavily mirrors the New York Times' new app, which is increasingly organized to help you discover more stories beyond the headlines of the day, to poke around and stay a while.

I suspect much of the public will enjoy Google’s new approach to search, especially as it comes to other categories like travel and shopping in the future. The company does need to evolve—and what we see here is a mere taste of what Google will need to do with video and richer multimedia.

But it's also difficult to look at a page full of recipes, neatly shuffled and sorted with AI, and not see this as yet another AI feature that competes with publishers and standalone websites themselves. Because if you're spending your time discovering on Google, you aren't spending your time discovering on a publication's (or store's) site, app, or social feed. While Google isn't showing ads in these new AI results at launch, the company confirmed that ad units (a major topic in its recent antitrust trial) will work their way "across the experience of search" soon. Meaning Google will be using design to monetize the world's information anew.

What is Google AI search in another two years? An omniscient assistant, like Microsoft Copilot? Or a smarter Flipboard? For now, Google is arguing it can be both. So long as we all promise to keep Googling.
