AiChat for iTwinUI Docs #1414
Conversation
…but could not get it working
Spent some time playing around with this. Here are my initial thoughts.

UX feedback

The feature itself

The AI responses are very questionable. They are often full of nonsensical code/text, yet delivered with such confidence that an unsuspecting user would be easily fooled and get confused later. Here are my logs: query_history.json.

...which raises the question: what problem are we even trying to solve with this feature? Docs should be the place to find objectively correct information. When the "official" chatbot is so unhelpful, it reduces overall confidence in the documentation site.
I agree with Mayank's comments about the UX and the feature itself. It needs some improvements before releasing it to users. My query_history.json.txt
It's a great start and an amazing tool to demo.

A couple of additional comments:
I agree with Mayank's comments as well about this feature. It may become more useful as we add more complex components, but I don't think it's needed for the documentation site at the moment. It's still a really cool tool though!

When the AI doesn't know the specific answer to my question, it just rewrites whatever is written on the documentation site (explaining the component, its features, and its variants). At that point, it might as well just link me to the documentation pages so I can read the same text in a better format.

Code examples are good to display, but I also think a rendered output needs to be shown as well, or at least a link to open the code in CodeSandbox.

Here is my query history: query_history.txt
Echoing what everyone else has already said, so I'm just going to add a few other things:

My query history: query_history.txt
Since there has been no activity on this in a while, I'm closing this but leaving the branch intact. The PR can be reopened if/when we want to come back to it.
This is the AI Chat feature that is powered by OpenAI's embedding and text completion APIs.
Video35.webm
Logic/workflow

`features/search/scripts/create_chunks.js` reads all `.mdx` files in `apps/website/src/pages/docs` and all `.tsx` files in `/examples`, and creates `docs_chunks.json` and `example_chunks.json` respectively.

Code structure

- `apps/website/api/search.json.ts`: A backend endpoint in Astro that performs the vector comparison to get the top chunks, then sends a request to and receives a response from the OpenAI completions API. This was done as an alternative to a Python server.
- `features/search/scripts/create_chunks.mjs`: This script creates the text chunks and the code example chunks for the search index. I believe it needs to run in a CI step (or somewhere similar) to create new chunks every time the docs site is updated.

Features

- `apps/website/src/_data/query_history.json`
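The chunking step described above could be sketched roughly as follows. This is only an illustration under assumed names (`Chunk`, `chunkDoc`, the `maxLen` cap); the actual script also handles the `.tsx` examples and likely differs in its splitting rules:

```typescript
// Sketch of the doc-chunking step: split an .mdx file's text into
// heading-delimited chunks, capped at a maximum character length.
// `Chunk` and `maxLen` are illustrative, not the real script's API.
interface Chunk {
  source: string; // originating file path
  text: string;   // chunk contents sent to the embeddings API
}

function chunkDoc(path: string, mdx: string, maxLen = 1500): Chunk[] {
  const chunks: Chunk[] = [];
  // Split on headings so each chunk stays topically coherent.
  for (const section of mdx.split(/\n(?=#{1,3} )/)) {
    // Further split any section that exceeds the length cap.
    for (let i = 0; i < section.length; i += maxLen) {
      const text = section.slice(i, i + maxLen).trim();
      if (text) chunks.push({ source: path, text });
    }
  }
  return chunks;
}

const demo = chunkDoc("docs/button.mdx", "# Button\nA button.\n## Usage\nClick it.");
console.log(demo.length); // one chunk per heading section here
```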
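The vector comparison in the endpoint presumably boils down to cosine similarity between the query embedding and each chunk embedding, keeping the top k chunks as context for the completion prompt. A minimal sketch, assuming embeddings are plain number arrays (names like `topChunks` are hypothetical):

```typescript
interface EmbeddedChunk {
  text: string;
  embedding: number[]; // vector from the OpenAI embeddings API
}

// Cosine similarity: dot product of the vectors over the product of
// their magnitudes; 1 means identical direction, 0 means orthogonal.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Rank all chunks against the query embedding and keep the top k,
// which would then be stuffed into the completion prompt as context.
function topChunks(query: number[], chunks: EmbeddedChunk[], k = 3): EmbeddedChunk[] {
  return [...chunks]
    .sort((x, y) =>
      cosineSimilarity(query, y.embedding) - cosineSimilarity(query, x.embedding))
    .slice(0, k);
}
```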
Testing
Tested a few example prompts. I hope to have a more structured way of testing and evaluating soon.
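One lightweight way such structured evaluation could look: a table of prompts, each with keywords the answer must contain, scored against the logged answers. This is purely a sketch of an idea, not existing tooling in the PR:

```typescript
// Hypothetical regression check over logged answers: each case lists
// substrings the answer must contain to count as a pass.
interface EvalCase {
  prompt: string;
  mustContain: string[];
}

// Returns the fraction of prompts whose logged answer contains all
// required substrings (case-insensitive).
function scoreAnswers(cases: EvalCase[], answers: Map<string, string>): number {
  let passed = 0;
  for (const c of cases) {
    const answer = (answers.get(c.prompt) ?? "").toLowerCase();
    if (c.mustContain.every((s) => answer.includes(s.toLowerCase()))) {
      passed++;
    }
  }
  return passed / cases.length;
}
```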
Docs
On the docs website's `/docs/...` pages, I added an AI Chat button to the header.