AI generators, including Eddy, can produce inaccurate information when they do not know the answer. This problem, known as "AI hallucination," can undermine trust in its responses. To address it, we propose creating a private repository of carefully selected and verified content. The repository will serve as a reference for Eddy, giving it access to accurate answers for questions it has previously answered incorrectly.

The repository will be a secure, internal space managed by authorised personnel, ensuring it contains only verified information. It will not be visible to customers or readers, but Eddy will be able to query it for correct answers. Regular updates will extend its coverage to new areas where Eddy might hallucinate, improving accuracy and reliability over time.
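To illustrate the idea, here is a minimal sketch of the proposed lookup-before-generate flow. All names here (`VerifiedRepository`, `answer_question`, and the `generate` callback) are hypothetical and do not reflect Eddy's actual retrieval interface:

```python
# Hypothetical sketch: check a private store of verified answers
# before falling back to the AI generator.

def normalize(question: str) -> str:
    """Normalize a question so minor wording differences still match."""
    return " ".join(question.lower().strip().rstrip("?").split())


class VerifiedRepository:
    """Private, internally curated store of question -> verified answer pairs."""

    def __init__(self):
        self._answers = {}

    def add(self, question: str, verified_answer: str) -> None:
        """Authorised personnel add or update a verified answer."""
        self._answers[normalize(question)] = verified_answer

    def lookup(self, question: str):
        """Return the verified answer if one exists, else None."""
        return self._answers.get(normalize(question))


def answer_question(question: str, repo: VerifiedRepository, generate) -> str:
    """Prefer a verified answer; only generate when none is stored."""
    verified = repo.lookup(question)
    if verified is not None:
        return verified
    return generate(question)
```

In practice, exact-match lookup would likely be replaced by semantic similarity search, but the control flow stays the same: verified content takes precedence, and the generator is used only for uncovered questions.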