What about AI Hallucinations?

By Marcie Terman, Newsguru founder | Jul 21, 2023

I have been reading blogs and listening to podcasts recently in which people take the view that generative AI systems like Bard and ChatGPT are black boxes where no one understands what is going on. One of the concepts these folks use to increase the mystique of AI is that of 'hallucinations.' These are situations where, instead of creating fact-based content, the AI appears to engage in flights of whimsy, making things up or 'hallucinating.'

So the question frequently comes up: does Newsguru hallucinate?

The answer to that question is no, and it relates directly to the reason generative AI can seem to make things up, and to how Newsguru is fundamentally different. Here's why.

'Hallucinations' occur when people ask a generative AI to create content without giving it enough input information. Systems like ChatGPT 3.5 have training data that ends in September 2021 and no direct connection to the internet, so their available information is limited.

Without denigrating the complexity or cleverness of generative AI, in the most general terms these systems work by predicting, from all the information available to them, the most likely next statement in a sequence. If the AI has sufficient data, it will do a good job and produce a statement grounded in facts, resulting in good content. However, if the model has insufficient information to complete a task, it falls back on its own internal weightings and surmises the most likely result. This can lead to inaccuracies, which the public popularly labels 'hallucinations.'
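To make that idea concrete, here is a drastically simplified sketch of next-word prediction in Python. A real language model uses billions of learned weights rather than a word-count table, but the principle is the same: the model answers from whatever statistics it has, and when it has none for your question, it must guess.

```python
from collections import Counter, defaultdict

# Toy next-word predictor: count which word follows each word in a
# tiny training corpus, then predict the most likely continuation.
corpus = "the cat sat on the mat the cat ate the fish".split()

follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def predict(word):
    """Return the most likely next word, or a guess when data is missing."""
    if word in follows:
        return follows[word].most_common(1)[0][0]
    # The model has never seen this word: whatever it returns now is
    # ungrounded -- the toy equivalent of a 'hallucination.'
    return "<no data: model must guess>"

print(predict("the"))   # well supported by the training data
print(predict("dog"))   # never seen: nothing to ground the answer on
```

The second call is the interesting one: the model does not know that it does not know, so a fluent-sounding guess and a fact look identical from the outside.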

Newsguru's AI works the other way around. It has thousands of live news feeds and direct access to the internet to draw data from when it is composing articles. Our news sharding model uses this data to give our generative AI a large pool of knowledge to draw on. It pulls from these resources, cross-checks the available data, then summarizes it and sculpts the information into compelling content. This content is further filtered for search engine optimization to give our customers the best possible result.
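The retrieve-then-cross-check step can be sketched in a few lines of Python. All of the names and data below (`FEEDS`, `retrieve`, `cross_check`, `compose`) are illustrative stand-ins, not Newsguru's actual pipeline; the point is simply that the composer only ever works from claims that appear in the retrieved sources.

```python
from collections import Counter

# Hypothetical stand-in for live news feeds (Newsguru's real feeds and
# sharding model are not shown here).
FEEDS = {
    "wire_a": ["rates held steady", "storm hits coast"],
    "wire_b": ["rates held steady", "new phone released"],
    "wire_c": ["storm hits coast", "rates held steady"],
}

def retrieve(feeds):
    """Pull every claim from the live feeds."""
    return [claim for items in feeds.values() for claim in items]

def cross_check(claims, min_sources=2):
    """Keep only claims reported by at least `min_sources` feeds."""
    counts = Counter(claims)
    return [claim for claim, n in counts.items() if n >= min_sources]

def compose(claims):
    """Summarize the verified claims into draft copy (stubbed)."""
    return "Today's briefing: " + "; ".join(claims) + "."

verified = cross_check(retrieve(FEEDS))
print(compose(verified))
```

Because the single-source claim never reaches `compose`, the output is constrained to material the sources actually agree on, rather than whatever the generator's internal weights would surmise.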

Since Newsguru starts the process of 'thinking' of ideas by pulling in information from the 'real world,' it is not relying on internal weights for information and will not 'hallucinate.'