You should definitely think twice before using an AI chatbot to get a quick summary of the news. A new report from the BBC shows that popular chatbots produce summaries with major flaws.

The test covered ChatGPT, Google Gemini, Microsoft Copilot, and Perplexity AI. To begin, the BBC asked each chatbot 100 questions about the news, instructing them to use BBC News sources where possible.


Experts from the BBC then assessed the quality of these summaries. Of the summaries, 51 percent had some type of error, whether it was a factual inaccuracy, misquotation, or outdated information.

Of those, 19 percent contained a factual mistake, such as an incorrect date. And 13 percent of the quotes attributed to the BBC in the summaries were either altered from their original form or didn't exist in the articles provided to the chatbots.

When broken down by chatbot, Google's Gemini was the worst offender, with more than 60 percent of its summaries containing problematic information. Microsoft Copilot was next at 50 percent, while around 40 percent of the responses from ChatGPT and Perplexity had issues.

In the study’s conclusion, the BBC said that many of the issues were more than just wrong information:

This research also suggests the range of errors introduced by AI assistants is wider than just factual inaccuracies. The AI assistants we tested struggled to differentiate between opinion and fact, editorialized, and often failed to include essential context. Even when each statement in a response is accurate, these types of issues can result in responses which are misleading or biased.

I've never tried to use an AI chatbot to summarize the news, for the simple reason that I don't trust the reliability of the technology. Even so, the sheer number of flawed responses in the study is surprising. AI has a very long way to go before it becomes a trusted way to find out more about the news.

AI Features Are Still a Work In Progress

AI technology, and chatbots in particular, continues to improve rapidly. But as the BBC study shows, getting accurate information about the news from these tools remains hugely problematic.

The BBC has also vocally complained about another AI-powered feature: Apple Intelligence's notification summaries. In December 2024, a notification summary of the outlet's reporting incorrectly stated that Luigi Mangione, the alleged shooter of healthcare CEO Brian Thompson, had shot himself.

In response to complaints from the BBC and others, Apple temporarily turned off the summaries for news and entertainment apps starting with iOS 18.3.

So when you’re looking to catch up on the news, keep it simple: skip an AI summary and do the reading yourself.