Google has pulled its AI Overviews from some medical searches amid concerns over the accuracy of its responses. A recent investigation by The Guardian found that Google's AI Overviews were serving up misleading and false information in response to certain medical queries.
One example cited by experts was particularly alarming: Google advised people with pancreatic cancer to avoid high-fat foods, the opposite of what medical professionals recommend. Experts warned that following this advice could increase patients' risk of dying from the disease, underscoring the serious stakes of such errors.
In another case, the AI gave false information about crucial liver function tests, which could lead people with serious liver disease to incorrectly believe they are healthy.
Following the report, Google disabled its AI Overviews for certain medical searches. Asked to comment on the removals, a spokesperson said that while the majority of AI Overviews provide accurate information, there were instances where additional context was needed, and that the company had made improvements and taken action under its policies where necessary.
This is not the first controversy surrounding the feature, which has faced lawsuits and widespread criticism, including for suggesting that people put glue on pizza or eat rocks. The removal of these AI Overviews comes as part of a broader effort to ensure the accuracy and reliability of the information Google provides.