From images to videos, how AI is helping you search visually
Elizabeth Reid
VP, Search
Our products at Google have a singular goal: to be as helpful to you as possible, in moments big and small. And we’ve long believed that artificial intelligence can supercharge how we deliver on that goal.
Since the early days of Search, AI has helped us with language understanding, making results more helpful. Over the years, we've deepened our investment in AI and can now understand information in its many forms — from language understanding to image understanding, video understanding and even understanding the real world.
Today, we’re sharing a few new ways we’re applying our advancements in AI to make exploring information even more natural and intuitive.
If you can see it, you can search it
Cameras have become a powerful way to explore and understand the world around you. In fact, Lens is now used more than 10 billion times per month as people search what they see using their camera or images.
With Lens, we want to connect you to the world’s information, one visual at a time. You can already use Lens to search from your camera or photos, right from the Search bar. Now, we’re introducing a major update to help you search what’s on your mobile screen.
In the coming months, you’ll be able to use Lens to “search your screen” on Android globally. With this technology, you can search what you see in photos or videos across the websites and apps you know and love, like messaging and video apps — without having to leave the app or experience.
Say your friend sends you a message with a video of them exploring Paris. If you want to learn more about the landmark you spot in the background, you can simply long-press the power or home button on your Android phone (which invokes your Google Assistant) and then tap “search screen.” Lens identifies it as Palais du Luxembourg — and you can click to learn more.
Mix and match ways to search
With multisearch, you can search with a picture and text at the same time — opening up entirely new ways to express yourself. Today, multisearch is available globally on mobile, in all languages and countries where Lens is available.
We recently took multisearch even further by adding the ability to search locally. You can take a picture and add “near me” to find what you need, whether you’re looking to support neighborhood businesses or just need to find something in a hurry. This is currently available in English in the U.S., and in the coming months, we'll be expanding globally.
And sometimes, you might already be searching when you find something that catches your eye and inspires you. In the next few months, you’ll be able to use multisearch globally on any image you see on the search results page on mobile.
For example, you might be searching for “modern living room ideas” and see a coffee table that you love, but you’d prefer it in another shape, say a rectangle instead of a circle. You’ll be able to use multisearch to add the text “rectangle” to find the style you’re looking for.
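Google hasn’t published how multisearch fuses the photo and the added text under the hood, but the general idea, querying a shared image-and-text embedding space with both inputs at once, can be sketched with open-source tools. The snippet below is a rough illustration only: the model choice (the CLIP model shipped with the sentence-transformers library), the file names and the simple embedding average are assumptions made for the demo, not Google’s implementation.

```python
# Illustrative sketch: rank a small catalog of product photos against a query that
# combines an image ("the coffee table I liked") with a text refinement ("rectangular").
# The catalog paths and query image are placeholders; the averaging step is a crude
# stand-in for whatever fusion a production system like multisearch actually uses.
from PIL import Image
from sentence_transformers import SentenceTransformer, util

# Open CLIP-style model that embeds both images and text into the same vector space.
model = SentenceTransformer("clip-ViT-B-32")

# Hypothetical catalog of furniture photos.
catalog_paths = ["tables/round_oak.jpg", "tables/rect_glass.jpg", "tables/rect_walnut.jpg"]
catalog_emb = model.encode(
    [Image.open(p) for p in catalog_paths],
    convert_to_tensor=True,
    normalize_embeddings=True,
)

# The multisearch-style query: the photo you liked plus a text tweak.
image_emb = model.encode(
    Image.open("liked_coffee_table.jpg"),
    convert_to_tensor=True,
    normalize_embeddings=True,
)
text_emb = model.encode(
    "rectangular coffee table",
    convert_to_tensor=True,
    normalize_embeddings=True,
)

# Average the two embeddings so results stay visually similar to the photo
# while drifting toward the text refinement.
query_emb = (image_emb + text_emb) / 2

# Cosine-similarity search over the catalog; print the top matches.
for hit in util.semantic_search(query_emb, catalog_emb, top_k=3)[0]:
    print(catalog_paths[hit["corpus_id"]], round(hit["score"], 3))
```

Averaging the two vectors is the simplest possible fusion; it just keeps results visually close to the source photo while nudging them toward the wording of the refinement.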
We’re creating search experiences that are more natural and visual — but we’ve only scratched the surface. In the future, with the help of AI, the possibilities will be endless.