Google now allows you to combine images and words to search

(CNN Business) – Google wants to make it easier to find things that are hard to describe with just a few words or a single picture.

On Thursday, Google introduced a new search option that lets you combine text and images in a single query. With the feature, you can search for a shirt similar to the one in a photo but with polka dots by typing what you want, or take a picture of your sofa and add the word “chair” to find one that matches it.

The feature, which the company calls “multisearch” and previewed last September, is now available to US users in the Google Lens section of the Google mobile apps. Liz Reid, vice president of Google Search, told CNN Business that the feature should be considered a beta. It is expected to be used mostly for shopping-related searches at first, although it is not limited to such queries.

“It will be a start,” Reid said.

Multisearch is part of Google’s recent push to make search more flexible and less limited to the words you type on a screen. Google has long offered an image search engine, and its Google Lens feature, introduced in 2017, can recognize the objects in an image or instantly translate text viewed through a phone’s camera. Another effort, launched in 2020, lets users hum a song to search for it.

How does the new multisearch feature work?

To use multisearch in the Google mobile apps, tap the camera icon on the right side of the search bar to open Google Lens. You can then take or upload a photo and tap the small bar with a plus sign and the words “Add to your search,” which lets you type the words that best describe what you want.

Multisearch works by using several forms of artificial intelligence. Computer vision identifies what is in the picture, while natural language processing determines the meaning of the words you type. Those results are then combined by the overall system, Reid said.
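Conceptually, pairing an image signal with a text signal can be thought of as building one combined query and ranking candidates against it. The sketch below is a toy illustration of that idea only, not Google's implementation; every function, embedding, and catalog item in it is a made-up placeholder.

```python
# Toy illustration (not Google's system): merge an image embedding and a text
# embedding into a single multimodal query, then rank a small hypothetical
# catalog by cosine similarity.
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # embedding size for this toy example


def embed_image(photo_id: str) -> np.ndarray:
    """Placeholder for a computer-vision model that encodes a photo."""
    return rng.normal(size=DIM)


def embed_text(query: str) -> np.ndarray:
    """Placeholder for a language model that encodes the typed words."""
    return rng.normal(size=DIM)


def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))


# Combine the two signals into one query vector (a simple average here).
image_vec = embed_image("photo_of_sofa.jpg")
text_vec = embed_text("chair")
query_vec = (image_vec + text_vec) / 2

# Rank hypothetical catalog items by similarity to the joint query.
catalog = {name: rng.normal(size=DIM) for name in ["armchair", "lamp", "rug"]}
ranked = sorted(catalog, key=lambda name: cosine(query_vec, catalog[name]), reverse=True)
print(ranked)
```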

When Google previewed this type of search in September, the company said it relied on a powerful machine learning tool called MUM (short for “Multitask Unified Model”), which it unveiled last May. In an interview last week, Reid said that is not the case at launch, but that MUM could be brought in over the coming months, which she hopes will help improve the quality of results.

Asked whether Google will eventually allow other combinations of search input, such as pairing music and lyrics to discover new kinds of music, the company did not comment specifically but said it is interested in combining more types of input in the future.
