Google Search On 2022 Announcements and New Features

Google Search On 22 lived up to the annual event’s high expectations, announcing changes and a future roadmap for Google’s search engine. This year’s updates span more intuitive ways to search, easier shopping, more sustainable choices, better ways to find food, richer maps, and new ways to translate and discover the world.

Google’s approach to intuitive and natural search

Google showed off its advancements in artificial intelligence (AI) by introducing a new search experience that lets users search in more natural ways that go beyond the search box.

These new search features build on the historical foundation of text-only searches by incorporating what you see with your camera and what you ask with your voice. They enhance Lens (a visual search aid) and include multisearch (which uses images and text simultaneously), Lens translation (which translates text within images), and immersive view (hello, feeling like you’re in a place) in Google Maps.

Together, these features are meant to work like the human brain by gathering information related to what a user is thinking, then organizing it in a way that makes sense to that user. Users will be able to search with a combination of sounds, images, text, and speech, and to find information with as few words as possible.

Multisearch and “multisearch near me”

Multisearch is a new search method that lets users search with images and text simultaneously. It has been in beta in the U.S., but in the coming months, Google will expand it to more than 70 languages.

Google also announced “multisearch near me.” This feature will let users take a picture of an unfamiliar item, for example, a plant or a shirt, and get results on where to find that item nearby. Multisearch near me will roll out in English in the U.S. this fall.

How Multisearch can be used to find items based on images taken with Google Lens. Source: The Keyword, Google

Google Lens Translation

Thanks to a machine learning technique called generative adversarial networks (GANs), Google can now take its image translation to another level.

Google Lens Translation can now blend translated text into the background of an image. When users point their camera at text on, say, a magazine, Google will erase the original text, re-create the background behind it, and render the translation in its place so it looks like part of the original image.

Google Lens Translation translating text with an image overlay from a screenshot. Source: The Keyword, Google
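To make the “erase and re-render” idea concrete, here’s a minimal, purely illustrative Python sketch using Pillow. This is not Google’s implementation: where Lens uses a GAN to reconstruct the background behind the erased text, this toy version simply fills the text region with its average color before drawing the translation on top. The file names, bounding box, and translated string are hypothetical.

```python
# Conceptual sketch of "blend translated text into the background."
# NOT Google's GAN pipeline: a real system would inpaint the background;
# this toy version fills the text region with its average color instead.
# The image path, bounding box, and translated string are hypothetical.
from PIL import Image, ImageDraw, ImageFont

def overlay_translation(image_path, text_box, translated_text, output_path):
    img = Image.open(image_path).convert("RGB")
    draw = ImageDraw.Draw(img)

    # Estimate the background color by averaging the pixels inside the text box.
    avg_color = img.crop(text_box).resize((1, 1)).getpixel((0, 0))

    # "Erase" the original text by painting over it.
    draw.rectangle(text_box, fill=avg_color)

    # Render the translated text roughly where the original text sat.
    font = ImageFont.load_default()
    draw.text((text_box[0] + 4, text_box[1] + 4), translated_text,
              fill="black", font=font)

    img.save(output_path)

overlay_translation("magazine_page.jpg", (120, 80, 420, 130),
                    "Breakfast served all day", "translated_page.jpg")
```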

Google’s approach to helping users find—and discover—information

Google will roll out new features in the coming months to make the search bar easier to use. These will help users find what they want and even discover information they might not have thought to search for.

New Google app shortcuts on iOS

New shortcuts let users shop for products based on screenshots in their photo library or translate text with their camera. Over time, users will also see shortcuts for other common behaviors Google identifies within the Google Search app, such as solving homework problems with the camera, searching for a photo in the photo library, or identifying a song that’s playing by listening to it. Users can also opt in to price-drop updates on items. These shortcuts are currently available only on iOS devices.

Google Search shortcuts on the Google Search app for iOS. Source: The Keyword, Google

Discover

Discover (found in the Google app, on Android devices, at google.com, or, on some devices, just a swipe right from the home screen) will show users style suggestions based on what they have been shopping for. In other words, you’ll be able to discover clothing items based on styles you’ve shown interest in before. Users can also use Lens in the Google app to take a picture of an item and see nearby options for where to buy it.

Automatic Google suggestions are only getting better

Google’s current autocomplete feature suggests related keywords and questions based on the topic a user is searching for. Google plans to significantly improve these automatic suggestions.

To accomplish this, Google created a feature that lets users explore new topics related to what they are searching for instead of just keywords. As users start typing in Google Search, they’ll see specific keyword or topic options to help them refine their questions. This will help users find—and discover—more relevant results for each search.

Highlighted information

This feature, which will include content from web creators, will highlight the most important information about a search subject. For example, if a user searches about a city they’ve never visited, they’ll see highlighted videos and stories from other people who have visited that city.

Highlighted video story submitted by an online user showcasing an experience in Oaxaca. Source: The Keyword, Google

Refining (and creating?) searches

Google deeply understands how users search, and it’s always learning. Over time, users will be able to add or remove topic results that come up based on what they’re searching for. As they continue to scroll through results, they will also begin to see more topics related to their search that they may not have thought of yet.

New Google Shopping features

Google announced nine new tools and features to make shopping more intuitive. Powered by Google’s enhanced AI model, the Shopping Graph, which understands more than 35 billion product listings, these new tools give shoppers—and sellers—a lot to look forward to. We detail a few top features below.

Add “Shop” to your Google Search

When a user searches the word “Shop” followed by what item they are looking for, they’ll see a visual feed of products, research tools, and nearby inventory related to that item. This shopping feature will expand across many different items, from clothing and beauty products to electronics.

Example of “Shop” results with a bomber jacket. Source: The Keyword, Google

“Shop the Look” now features item suggestions

When a user is searching for a specific piece of clothing like a jacket, Google will give them results for the jacket along with items that complement the jacket. Google will also tell users where they can find all the items nearby.

Shop the Look feature with different clothing items. Source: The Keyword, Google

Shop trending products

Trending products will show popular products in a specific shopping category to help users discover the latest models, styles, and brands. Look for this feature to drop in the U.S. this fall.

Dynamic Filters will automatically apply shopping suggestions based on real-time search trends and will provide users with popular style results. Dynamic Filters are available in the U.S., Japan, and India, with rollouts to many other countries soon.

Shop eco-friendly & pre-owned items

Sustainability featured prominently at this year’s Search On event, based on Google’s recognition that people are increasingly searching for more ways to be sustainable in their daily lives.

Google’s car shopping searches will now create awareness and help people make informed choices about their daily carbon emissions. Consumers will increasingly see a car’s annual fuel costs and emissions estimates. And, for electric cars, Google Search will show estimated costs, range, charging speeds, federal tax incentives, and nearby charging stations.

Since fashion accounts for 10% of global carbon emissions, Google wants to make it easier for consumers to choose more sustainable (and often less expensive) pre-owned clothing in shopping.

Search adds more voices & more context

Google announced two new features that will bring more voices and viewpoints to Search, giving users more context, insights, and choices when interpreting search results.

Discussions and forums

Discussions and forums will add context to users’ searches by presenting, alongside traditional Search results, online forums and discussions with relevant advice and experiences from real people. This feature is now available on mobile in the U.S.

How Discussions and forums look on Google Search. Source: The Keyword, Google

Translating international news sources

In 2023, Google Search will launch a new feature to translate news results from sources outside a user’s preferred language, starting with the ability to translate news results in French, German, and Spanish to English on both mobile and desktop.

Dinner made easier by Search

Now it will be easier for users to find specific foods and restaurants. Google will use improved machine learning to analyze images and reviews so restaurant ratings are more reliable. Google will also combine menu information from merchants, people, and restaurant websites to provide updated digital menus with richer and more accurate information.

Find a place that sells your exact dish in Search

Google will make it possible for users to search for a meal they are craving and filter their results to locations serving that dish. For example, if a user is looking for pasta but wants it to be spicy, they can search “spicy pasta,” and Google will list places nearby where they can get that dish. The multisearch feature can also work from a screenshot or a photo of a dish to help users identify it and find it nearby.

Four Google Maps updates

These new visual, intuitive map features will allow users to experience locations and landmarks in a more realistic view.

Neighborhood vibe

The new “Neighborhood vibe” feature will let users select a neighborhood and see its most popular places, drawn from helpful photos and information posted by other Google Maps community users and viewable right on the map. Google will power Neighborhood vibe with AI that combines and surfaces the 20 million reviews, photos, and videos the Google Maps community posts every day. This feature will be available on Android and iOS soon.

Neighborhood vibe feature highlights popular locations in a neighborhood. Source: The Keyword, Google

Immersive view & Live View Search

Immersive view will use predictive modeling to give users a realistic view and experience of a location before they ever set foot there. Users will be able to visualize things such as the weather and how busy a place will be. Google is also launching photorealistic aerial views of more than 250 landmarks around the world. Immersive view is coming to Maps first in five major cities (Los Angeles, London, New York, San Francisco, and Tokyo), with more to follow.

A new update to Live View will allow users not just to overlay directions on their camera but to search with Live View to find locations more intuitively. Users can search for and ping a nearby location in real-time, then get arrows pointing the way to the searched-for locale.

Immersive view and immersive view with the weather feature. Source: The Keyword, Google

Eco-friendly routing

Google launched eco-friendly routing to let users choose the most fuel-efficient route to their destination. Google will now also offer eco-friendly routing to third-party developers through the Google Maps Platform.
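For developers curious what that access might look like, here’s a hedged sketch of requesting a fuel-efficient route from the Google Maps Platform Routes API. The endpoint, field names, and the FUEL_EFFICIENT reference-route option reflect our reading of the public Routes API documentation and are assumptions, not the exact developer offering announced at Search On; the API key and coordinates are placeholders.

```python
# Rough sketch: ask the Google Maps Platform Routes API for a fuel-efficient
# route alongside the default route. Field names and endpoint are our reading
# of the public Routes API docs, not the announced integration.
import requests

API_KEY = "YOUR_API_KEY"  # hypothetical placeholder

body = {
    "origin": {"location": {"latLng": {"latitude": 33.7490, "longitude": -84.3880}}},
    "destination": {"location": {"latLng": {"latitude": 33.9526, "longitude": -84.5499}}},
    "travelMode": "DRIVE",
    "routingPreference": "TRAFFIC_AWARE_OPTIMAL",
    # Request a fuel-efficient alternative in addition to the default route.
    "requestedReferenceRoutes": ["FUEL_EFFICIENT"],
    "routeModifiers": {"vehicleInfo": {"emissionType": "GASOLINE"}},
}

response = requests.post(
    "https://routes.googleapis.com/directions/v2:computeRoutes",
    json=body,
    headers={
        "X-Goog-Api-Key": API_KEY,
        # The field mask limits the response to the fields we care about.
        "X-Goog-FieldMask": "routes.duration,routes.distanceMeters,routes.routeLabels",
    },
    timeout=10,
)

for route in response.json().get("routes", []):
    print(route.get("routeLabels"), route.get("distanceMeters"), route.get("duration"))
```

In a response shaped like this, the eco-friendly alternative would be the route labeled FUEL_EFFICIENT, which an app could surface next to the default route.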

Stay up to date with Search Discovery

Search Discovery is a Google Premier Partner, and it’s our business to stay up to date on the latest Google announcements and features. Reach out to our digital marketing team to help your business grow as Google continues to evolve.

Get in touch and sign up for our newsletter to stay in the know about the industry’s latest news.
