Google Lens can now search images and text at the same time
At I/O 2021, Google announced it was using a new technology called Multitask Unified Model to enhance the capabilities of its search engine. Then, in September of that same year, it showed off how MUM would make it possible to search images and text simultaneously within Google Lens. At the time, the company promised that multisearch would launch “in the coming months.” And while it's not using MUM to enhance searches just yet, it has started beta testing multisearch.
Provided you live in the US, you can try the feature out in the Google app on Android and iOS. To do so, tap the Lens icon and then swipe up after snapping a new photo or importing an existing image from your camera roll. Then tap the “Add to your search” icon.
You can use this field either to ask questions about the image in front of you or to refine your search. For instance, you could take a photo of your bike’s rear derailleur (the component that moves the chain from one gear to another) and then search for how to fix or adjust it on your own. By letting you combine text and images, Google says it’s making it easier to complete searches that would be tricky with words alone. After all, most casual cyclists don’t know what a derailleur is or what it does.
As mentioned already, you can also use the feature to refine your searches. Say you see a shirt with a pattern you like and want to find out whether that same pattern is available on socks or other items of clothing. You could type “white floral Victorian socks” into Google, but that depends on you having the fashion vocabulary to describe what you want, and even an accurate description might not produce useful results. For now, Google says the feature works best with shopping-related searches.
“All this is made possible by our latest advancements in artificial intelligence, which is making it easier to understand the world around you in more natural and intuitive ways,” Google said. “We’re also exploring ways in which this feature might be enhanced by MUM – our latest AI model in Search – to improve results for all the questions you could imagine asking.”