
Google uses artificial intelligence to recognize requests from people in crisis

Google is rolling out its MUM machine learning model to detect “a wider range of personal crisis searches.”

The system was first introduced at I/O 2021, and MUM will now be able to detect search queries related to difficult personal situations.

“MUM can help us understand longer or more complex queries, such as ‘why did he attack me when I said I didn’t love him?’ It may be obvious to humans that this query is about domestic violence, but long natural-language queries like this are difficult for our systems to understand without advanced AI.” — Anne Merritt, Google product manager for health and information quality

Another example of a query the system can now handle is “most common ways to complete suicide.” According to Merritt, algorithms previously “would understand that query as information seeking.” Now, when Google receives such a query, it displays a “Help available” information box, typically accompanied by a phone number or the website of a mental health charity.

However, AI experts warn that the widespread use of machine learning language models may create new problems, such as bias and misinformation in search results. These systems are also opaque, giving engineers only limited insight into how they reach certain conclusions.

The Verge asked Google how it checks which search queries identified by MUM are related to personal crises. Representatives of the search giant either would not or could not answer. The company says it carefully tests changes to its search products with human evaluators, but that is not the same as knowing in advance how an AI system will respond to specific queries.
