Feature Request

Capturing citation details in Eddy AI analytics
While working with the Eddy AI Analytics page in the Knowledge base portal, I noticed that the citation details for responses are not currently visible. At present, Eddy AI displays references like [citation 1], [citation 2], but it doesn't show which specific articles those citations are pulled from.

Why does it matter?
We regularly review the queries users submit and how Eddy AI responds to them. If the information provided is accurate, there's no concern. However, when the response is incorrect or out of context, we need to identify the source of that information to make the necessary corrections or improvements.

Currently, to find out where Eddy AI fetched the content from, we often need to manually re-enter the same query in the Knowledge base site search. This process:
- Consumes Eddy AI query credits
- May return different results due to frequent content updates or backend changes in Eddy AI
- Sometimes fails to reproduce the original citation, making it harder to trace and verify the response

Suggested improvement:
It would be very helpful if the Analytics section could include the exact article titles or links from which the citations were generated. Providing this visibility within the portal itself would:
- Help us validate responses more efficiently
- Enable quicker identification and correction of outdated or misleading content
- Enhance trust in Eddy AI outputs for both the team and our end-users

Thank you.
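To illustrate the suggested improvement, here is a hypothetical shape for an analytics record that maps each citation marker back to its source article. All field names here are invented for illustration only; this is not Document360's actual schema or API.

```javascript
// Hypothetical analytics record: each [citation N] marker in the AI response
// is mapped to the article it was pulled from, so reviewers can trace a bad
// answer straight to its source without re-running the query.
const exampleRecord = {
  query: "How do I reset my password?",
  response: "You can reset it from the login page. [citation 1]",
  citations: [
    {
      marker: "[citation 1]",
      articleTitle: "Resetting your password",
      articleUrl: "/docs/reset-password",
    },
  ],
};

// Illustrative helper: list the source article behind every citation marker.
function sourcesFor(record) {
  return record.citations.map((c) => `${c.marker} -> ${c.articleTitle}`);
}
```

With a record like this exposed in the Analytics page, validating a response would be a single lookup rather than a fresh (credit-consuming) search.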
1 · AI - Eddy · under review
Conserve AI search credits by requiring a manual AI search
The current design
User action: User types a search query and hits "Enter."
System response: Doc360 automatically generates a comprehensive, AI-powered summary answer. This uses a credit.
The problem: Every time the user hits "Enter," even for a simple, basic search, a credit is consumed. The user is not given a choice. This feels like a "trick" because the user's natural, default action (hitting Enter) leads to a costly outcome. It's difficult to train our teams to use the search button instead of hitting Enter, and we don't have consistent coverage (sometimes gaps of full weeks) because of this design.

Our proposed design
User action (initial): User types a search query and hits "Enter."
System response: The system presents the standard, non-AI search results first. This is the expected, fast, and free (credit-wise) action.
Return the "opt-in" button: Prominently displayed near the standard results is a clear, secondary button, perhaps labeled "Get an AI Summary" or "Summarize with AI." The user would have to press Ctrl+Enter, or use the button (a more deliberate action than plain Enter), to invoke the AI search.
User action (optional): The user can now quickly scan the standard results. If they don't find what they need, or if their query is complex, they can consciously decide to click the "Get an AI Summary" button.
System response (optional): Only after the user's explicit action is a credit consumed and the AI summary generated.

This was pre-existing functionality before 2.0, so it shouldn't be too difficult to revert/fix. While training our support staff not to hit Enter when searching is one workaround, this design forces a stumbling block into the middle of searching the knowledge base that wasn't there before.
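The Enter vs. Ctrl+Enter split proposed above could be sketched roughly as follows. This is a minimal illustration only; the function names and wiring are hypothetical, not Document360's actual implementation.

```javascript
// Sketch of the proposed opt-in behavior: plain Enter runs the free keyword
// search; Ctrl+Enter (or the explicit "Get an AI Summary" button) runs the
// credit-consuming AI search. Names are illustrative.
function resolveSearchAction(event) {
  if (event.key !== "Enter") return "none";
  return event.ctrlKey ? "ai-search" : "keyword-search";
}

// Hypothetical wiring to a search input:
// searchInput.addEventListener("keydown", (e) => {
//   const action = resolveSearchAction(e);
//   if (action === "keyword-search") runKeywordSearch(searchInput.value); // free
//   if (action === "ai-search") runEddyAiSearch(searchInput.value);       // uses a credit
// });
```

The point of the split is that the default, habitual keystroke stays free, and the costly path always requires a deliberate extra signal (modifier key or button click).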
2 · AI - Eddy · backlog