In 2018, Google’s former AI chief John Giannandrea was appointed head of Google Search. Keen observers saw the appointment as a signal of the greater role AI would play in Google Search in the years to come. Note that Google Search has always been powered by algorithms that generate responses to queries; the key difference is that earlier, those algorithms worked from a set of rules subject to change at the behest of Google engineers. The incorporation of deep learning into Google’s search engine has been a complete game changer.
So how is AI helping make your Google search more meaningful?
Introduction of Hum-to-Search feature
Ever had a song stuck in your head and can’t seem to remember the lyrics? This simple human ‘irritant’ can now be overcome with the hum-to-search feature: hum or whistle to Google and you’d be surprised at the accurate results this feature generates. This is done with the help of a machine learning algorithm that identifies potential matches for the song you have been looking for. The ML models are trained on people singing, whistling, or humming, as well as on studio recordings. The audio is then converted into a number-based sequence representing the melody of the song, so the model can recognize a tune regardless of the voice or instrument used. Try it today!
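To make the idea of a “number-based sequence representing the melody” concrete, here is a minimal, purely illustrative sketch (Google’s actual system is not public, and the function and catalog names below are our own invention). A tune is reduced to the *intervals* between successive notes, which makes matching independent of the key you hum in, and candidates are scored by how many intervals line up:

```python
# Hypothetical sketch of melody matching -- NOT Google's implementation.
# Pitches are MIDI-style note numbers; intervals are differences between
# successive notes, so the same tune hummed higher or lower still matches.

def to_intervals(pitches):
    """Convert absolute pitch numbers into relative intervals."""
    return [b - a for a, b in zip(pitches, pitches[1:])]

def similarity(hummed, reference):
    """Fraction of positions where the two interval sequences agree."""
    h, r = to_intervals(hummed), to_intervals(reference)
    n = min(len(h), len(r))
    matches = sum(1 for i in range(n) if h[i] == r[i])
    return matches / n if n else 0.0

def best_match(hummed, catalog):
    """Return the catalog entry whose melody best fits the hummed tune."""
    return max(catalog, key=lambda name: similarity(hummed, catalog[name]))
```

For example, a hum of `[60, 62, 64, 65]` matches a catalog melody stored as `[67, 69, 71, 72]` perfectly, because both have the interval sequence `[2, 2, 1]` even though they are in different keys.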
Introduction of BERT technique
The neural network-based technique called BERT, or Bidirectional Encoder Representations from Transformers, for natural language processing (NLP) pre-training was introduced by Google in 2018 for better language understanding, which leads to better and more relevant search results. The system is constantly enhanced for a better search experience, and BERT is now used in every English-language Google query for more precise results.
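The “bidirectional” part is the key idea: unlike left-to-right language models, BERT predicts a masked word from context on *both* sides. The toy below is only a counting-based caricature of that idea (nothing like a real transformer), with a made-up corpus, but it shows why two-sided context disambiguates a word like “bank”:

```python
# Toy illustration of bidirectional context -- NOT how BERT actually works.
# We fill a '[MASK]' token by voting over corpus words whose left AND right
# neighbors match the masked position's neighbors.

from collections import Counter

def predict_masked(sentence_with_mask, corpus):
    """Fill '[MASK]' using the words on both sides of it."""
    tokens = sentence_with_mask.split()
    i = tokens.index("[MASK]")
    votes = Counter()
    for sent in corpus:
        words = sent.split()
        for j, w in enumerate(words):
            left_ok = i == 0 or (j > 0 and words[j - 1] == tokens[i - 1])
            right_ok = i == len(tokens) - 1 or (
                j < len(words) - 1 and words[j + 1] == tokens[i + 1])
            if left_ok and right_ok:
                votes[w] += 1
    return votes.most_common(1)[0][0] if votes else None
```

With a corpus containing both “the bank approved the loan” and “we sat on the river bank”, the query “the [MASK] approved the loan” resolves to “bank” only because the right-hand context (“approved”) is visible; a purely left-to-right model looking at “the ___” alone could not tell.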
Information Access During The Current Pandemic
During the ongoing pandemic, Google upped its search game by providing users with information that helped them go places or get work done. This was enabled through features such as live busyness updates on Google Maps, which show how crowded or busy a place is at the current moment, in view of the social distancing measures being followed. The Live View feature also helps one make informed decisions during the pandemic by letting one visualize their destination in real time. The feature places a more accurate pin, so one doesn’t have to spend as much time navigating through lanes or cities.
Identification of video moments
Moreover, with the help of the latest AI techniques, developers at Google can identify key moments in a video and even let users tag those moments within it. These moments can then be browsed much like one browses text.
In fact, the entire Google experience has AI incorporated to help you during different steps of your browsing as well as non-browsing experience. For example, the meeting times suggested by Google Drive’s smart scheduling are based on your current schedule and habits. Likewise, Google News uses AI to better understand people’s daily routines – the places they go, the things they do, and the news they search for or subscribe to are all organized and put to use after determining how they connect with one another.
During the height of the pandemic, we at IA used AI for contact tracing during the Covid-19 crisis, and put our experience forward in the post Can data science be used for Vaccine Distribution? Do subscribe to our newsletter to know more about our work and how we can help you with yours.