Feed Eddy selected content in order to combat 'AI hallucinations'
Rux Talinga
As we are aware, AI generators, including Eddy, can sometimes produce inaccurate information when they do not know the answer. This issue, known as "AI hallucination," can be problematic.
To address this, we propose creating a private repository where we can store carefully selected and verified content. This repository will serve as a reference for Eddy, allowing it to access accurate answers to questions it previously answered incorrectly.
The private repository will be a secure, internal space managed by authorised personnel, ensuring that it contains only verified information. It will not be visible to customers or readers, but Eddy will have access to it to fetch correct answers. Regular updates to the repository will ensure that it covers new areas where Eddy might hallucinate, improving accuracy and reliability over time.
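To make the idea concrete, here is a minimal sketch of the "verified content first" flow being proposed. Document360 has not published how Eddy retrieves content, so every class and function name below is hypothetical: the assistant checks the private curated store first and only falls back to generation when no verified answer matches.

```python
# Hedged sketch of a "verified answers first" lookup.
# All names here are illustrative; Eddy's real retrieval pipeline is not public.
from dataclasses import dataclass, field


@dataclass
class VerifiedRepository:
    """Private, internally managed store of curated question/answer pairs."""
    entries: dict[str, str] = field(default_factory=dict)

    def add(self, question: str, answer: str) -> None:
        # Normalise the key so trivial casing/whitespace differences still match.
        self.entries[question.strip().lower()] = answer

    def lookup(self, question: str) -> str | None:
        return self.entries.get(question.strip().lower())


def generate_with_llm(question: str) -> str:
    # Stand-in for the real generative model call.
    return f"Best-effort answer to: {question}"


def answer(question: str, repo: VerifiedRepository) -> str:
    curated = repo.lookup(question)
    if curated is not None:
        return curated  # Verified content takes precedence over generation.
    # No curated answer: fall back to the model and flag lower confidence.
    return "[generated, unverified] " + generate_with_llm(question)


if __name__ == "__main__":
    repo = VerifiedRepository()
    repo.add("How do I reset my password?",
             "Go to Settings > Security > Reset password.")
    print(answer("how do i reset my password?", repo))  # curated answer
    print(answer("How do I export my data?", repo))     # falls back to generation
```

Regularly adding new entries to the store is what shrinks the set of questions that fall through to the unverified path over time.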
Gerard
I think this is related to https://feedback.document360.com/feature-request/p/ai-rules-and-guide-and-hallucination-mechanisms-reports
In my case, using the singular versus the plural form of the word "call" in an otherwise identical sentence made the difference between Eddy giving the correct answer and a completely unrelated one. It would be great if we had more tools to monitor and manage Eddy interactions so we can understand how to make it better.
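For illustration, here is a hedged sketch of why that can happen: an exact-match lookup treats "call" and "calls" as different keys, while normalising both the stored question and the incoming query makes them match. The toy lemmatiser below is deliberately naive (a real system would use a proper NLP library), and nothing here reflects Eddy's actual matching, which is not public.

```python
# Sketch of query normalisation so singular/plural forms retrieve the same entry.
# All names are illustrative, not part of any Document360 API.
import string


def naive_lemmatize(word: str) -> str:
    # Toy heuristic only: strip a trailing "s" from plural-looking words.
    return word[:-1] if len(word) > 3 and word.endswith("s") else word


def normalize(query: str) -> str:
    # Lowercase, drop punctuation, then lemmatise each word.
    cleaned = query.lower().translate(str.maketrans("", "", string.punctuation))
    return " ".join(naive_lemmatize(w) for w in cleaned.split())


knowledge_base = {
    normalize("How do I transfer a call?"): "Use the Transfer button during a call.",
}

# Raw exact matching fails when the user writes "calls" instead of "call" ...
print("how do i transfer a calls?" in knowledge_base)             # False
# ... but normalising the query the same way as the stored key recovers it.
print(normalize("How do I transfer a calls?") in knowledge_base)  # True
```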
Michael Wood
This would be a great improvement, especially for companies that don't want FAQ pages for every topic. These articles could be kept private, so they are not seen by our customers but remain accessible to Eddy to help it provide accurate answers when asked.