Intelligence Service MVP Chat Interface #169

Open
wants to merge 48 commits into base: feature/java-chat-mvp-intelligence-service

Conversation

@milesha (Contributor) commented on Nov 19, 2024

Motivation

To interact with the AI Mentor we need an interface that lets users create reflective sessions, retrieve messages from existing sessions, send messages, and see the AI Mentor's responses.

Description

NB: the bot currently does not persist any context and only reacts to the latest user message

  • Adds an "AI Mentor" button to the header to access the chat page
  • Adds a separate page to access the AI Mentor with the following components:
    • chat: wrapper for the new page
    • messages: message view displaying the chat history
    • input: input field for the chat
    • sessions-card: card with a list of past sessions and a button to create a new one
    • first-session-card: view shown to users who have not started a session yet
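
For illustration, a minimal sketch of the client-side service these components could call. The `ChatService` name, endpoint paths, and DTO shapes are assumptions for this sketch, not the exact code in this PR.

```typescript
// chat.service.ts — hypothetical sketch of the client-side API surface
import { Injectable, inject } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { Observable } from 'rxjs';

export interface Session {
  id: number;
  createdAt: string;
}

export interface ChatMessage {
  sender: 'USER' | 'MENTOR';
  content: string;
  sentAt: string;
}

@Injectable({ providedIn: 'root' })
export class ChatService {
  private http = inject(HttpClient);

  // list the user's past reflective sessions (shown in sessions-card)
  getSessions(): Observable<Session[]> {
    return this.http.get<Session[]>('/api/mentor/sessions');
  }

  // create a new session ("New Session" button)
  createSession(): Observable<Session> {
    return this.http.post<Session>('/api/mentor/sessions', {});
  }

  // load the chat history of one session (messages component)
  getMessages(sessionId: number): Observable<ChatMessage[]> {
    return this.http.get<ChatMessage[]>(`/api/mentor/sessions/${sessionId}/messages`);
  }

  // send a user message; the response carries the AI Mentor's reply
  sendMessage(sessionId: number, content: string): Observable<ChatMessage> {
    return this.http.post<ChatMessage>(`/api/mentor/sessions/${sessionId}/messages`, { content });
  }
}
```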

Testing Instructions

  1. Start the application-server, webapp, and intelligence-service
  2. Log in with your GitHub account (make sure your user has the "ai-maintainer" role in Keycloak)
  3. Click on "AI Mentor" in the top nav-bar
  4. Click "New Session"
  5. You can now chat with the bot through the input field at the bottom of the page

Screenshots (if applicable)

[Screenshots attached: 2024-11-25 at 15:48:31 and 2024-11-24 at 13:32:42]

Checklist

General

  • PR title is clear and descriptive
  • PR description explains the purpose and changes
  • Code follows project coding standards
  • Self-review of the code has been done
  • Changes have been tested locally
  • Screenshots have been attached (if applicable)
  • Documentation has been updated (if applicable)

Client (if applicable)

  • UI changes look good on all screen sizes and browsers
  • No console errors or warnings
  • User experience and accessibility have been tested
  • Added Storybook stories for new components
  • Components follow design system guidelines (if applicable)

@milesha self-assigned this on Nov 19, 2024
@github-actions bot added the client, application-server, feature, and size:L (changes 100-499 lines, ignoring generated files) labels on Nov 19, 2024
@github-actions bot added the size:XL (changes 500-999 lines, ignoring generated files) label and removed the size:L label on Nov 25, 2024
@GODrums (Contributor) left a comment

The UI looks very pleasant on the eye!
I got a few general points:

  • I know Angular is super confusing in the beginning, but try to get used to the new signals (especially input / output); a sketch is below.
  • There are also quite a few places where you can simplify the code (e.g. remove redundant types).
  • We decided on using Skeletons for loading states. You can pass the loading state down into the individual components and handle it there. Feel free to check out the User-Page for examples.
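
As a concrete example, here is a sketch of how the chat input component could expose its submit event through the signal-based `output()` API instead of a decorator-based `@Output()`. The selector, field names, and template are assumptions, not the PR's actual code.

```typescript
// input.component.ts — hypothetical sketch using the signal-based output() API
import { Component, output } from '@angular/core';
import { FormsModule } from '@angular/forms';

@Component({
  selector: 'app-chat-input',
  standalone: true,
  imports: [FormsModule],
  template: `
    <input
      type="text"
      [(ngModel)]="draft"
      (keyup.enter)="submit()"
      placeholder="Message the AI Mentor..." />
    <button (click)="submit()" [disabled]="draft.trim().length === 0">Send</button>
  `,
})
export class ChatInputComponent {
  // output<string>() replaces "@Output() send = new EventEmitter<string>()"
  send = output<string>();
  draft = '';

  submit(): void {
    const text = this.draft.trim();
    if (text.length > 0) {
      this.send.emit(text);
      this.draft = '';
    }
  }
}
```

The parent listens with `(send)="sendMessage($event)"` exactly as it would with a classic `@Output()`; only the declaration changes.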

Review threads (outdated/resolved): webapp/package.json, webapp/angular.json, webapp/src/app/chat/input/input.component.ts (3 threads)
@@ -0,0 +1,28 @@
@if (isLoading()) {
Contributor

In other components we try to go for Skeletons instead of Spinners, since they are much easier on the eye because they avoid layout shifts.

Try to integrate Skeletons here too. You can pass the "loading" state down to the individual components, which can then handle it themselves.
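
A minimal sketch of that pattern, assuming a hypothetical MessagesComponent with signal inputs; the selector, field names, and skeleton markup are illustrative, not the code in this PR.

```typescript
// messages.component.ts — hypothetical sketch: the parent passes the loading flag
// down as a signal input, and the child renders skeleton rows itself.
import { Component, input } from '@angular/core';

// message shape assumed for this sketch
export interface ChatMessage {
  sender: 'USER' | 'MENTOR';
  content: string;
}

@Component({
  selector: 'app-messages',
  standalone: true,
  template: `
    @if (isLoading()) {
      <!-- skeleton placeholders instead of a spinner to avoid layout shifts -->
      @for (i of [1, 2, 3]; track i) {
        <div class="skeleton-message"></div>
      }
    } @else {
      @for (message of messages(); track $index) {
        <div class="message" [class.mentor]="message.sender === 'MENTOR'">
          {{ message.content }}
        </div>
      }
    }
  `,
})
export class MessagesComponent {
  // signal inputs: the parent passes the loading flag and the loaded messages down
  isLoading = input(false);
  messages = input<ChatMessage[]>([]);
}
```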

Contributor Author

@GODrums
Sure, I was thinking of that too. The only problem in this case is that I am not sure what I need to create a Skeleton for: we have one view that is shown if there are no sessions yet and a completely different one for the actual chat interface. What should I make a Skeleton for while it is still "undecided" which view to show (i.e. while the number of user sessions is loading)?

Contributor

Hm, I see your point here. I was a little confused about how the whole "fetching"-logic is fully contained in the ChatComponent.
I would have thought of it this way (inspired by the way https://chatgpt.com/ works):

  • until the sessions are loaded, a spinner should be fine
  • if existing sessions are found, they are displayed in the sidebar and the "chat-window" displays a message similar to "Please select a session or create a new one", basically a placeholder
  • the user is now able to click on a session, replacing the placeholder with Skeletons of sample messages until the real messages are loaded

Maybe I'm overengineering this a little, though; I think your approach is fine too!
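
A rough sketch of that proposed flow (a spinner while sessions load, a placeholder until a session is selected, then skeletons while messages load), using hypothetical signal names rather than this PR's actual ChatComponent:

```typescript
// chat-window sketch — hypothetical three-state template
import { Component, signal } from '@angular/core';

@Component({
  selector: 'app-chat-window',
  standalone: true,
  template: `
    @if (sessionsLoading()) {
      <div class="spinner">Loading sessions…</div>
    } @else if (selectedSessionId() === undefined) {
      <p>Please select a session or create a new one.</p>
    } @else if (messagesLoading()) {
      @for (i of [1, 2, 3]; track i) {
        <div class="skeleton-message"></div>
      }
    } @else {
      <!-- render the loaded messages here, e.g. via the messages component -->
    }
  `,
})
export class ChatWindowComponent {
  sessionsLoading = signal(true);
  selectedSessionId = signal<number | undefined>(undefined);
  messagesLoading = signal(false);
}
```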

@GODrums (Contributor) left a comment

I just did some manual testing and wanted to add the following points:

  • The "no-sessions" screen probably needs extra colors for dark mode (text + button); it's pretty hard to read right now:
    [Screenshot: 2024-11-26 13:19:41]
  • I forgot to start the intelligence-service and the session creation failed. I did not get any feedback about it, though, and the endpoint did not return any errors. So either we should decouple the session creation from the Python server, or the endpoint has to return an actual error, which can then be translated into an on-screen alert/toast (see the sketch below). In production this is a realistic scenario as well, if the Python server isn't working properly.
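
For the second point, here is a minimal sketch of how a failed session creation could be surfaced to the user. It assumes the hypothetical ChatService.createSession() from the earlier sketch and a simple inline alert in place of a toast library.

```typescript
// Hypothetical sketch: show an error when session creation fails
// (e.g. because the intelligence-service is unreachable).
import { Component, inject, signal } from '@angular/core';
import { ChatService } from './chat.service'; // hypothetical service from the sketch above

@Component({
  selector: 'app-new-session-button',
  standalone: true,
  template: `
    <button (click)="createSession()">New Session</button>
    @if (errorMessage()) {
      <div class="alert">{{ errorMessage() }}</div>
    }
  `,
})
export class NewSessionButtonComponent {
  private chatService = inject(ChatService);
  errorMessage = signal<string | null>(null);

  createSession(): void {
    this.errorMessage.set(null);
    this.chatService.createSession().subscribe({
      next: () => {
        // select or navigate to the new session (omitted)
      },
      error: () => {
        // requires the endpoint to actually propagate the failure as an HTTP error
        this.errorMessage.set('Could not create a session. Please try again later.');
      },
    });
  }
}
```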

@milesha added the needs work (PRs that need more work) label on Nov 27, 2024
Labels
application-server, client, feature, intelligence-service, needs work (PRs that need more work), size:XL (This PR changes 500-999 lines, ignoring generated files)
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Implement the AI mentor MVP
4 participants