
Text exercises: Add preliminary AI feedback requests for students on text exercises using Athena #9241

Merged
merged 77 commits into from
Sep 11, 2024

Conversation

@EneaGore (Contributor) commented Aug 22, 2024

ONLY DEPLOY TO TS1, TS2, or ma-schwind servers

Checklist

General

Server

Client

  • Important: I implemented the changes with a very good performance, prevented too many (unnecessary) REST calls and made sure the UI is responsive, even with large data.
  • I strictly followed the client coding and design guidelines.
  • Following the theming guidelines, I specified colors only in the theming variable files and checked that the changes look consistent in both the light and the dark theme.
  • I added multiple integration tests (Jest) related to the features (with a high test coverage), while following the test guidelines.
  • I added authorities to all new routes and checked the course groups for displaying navigation elements (links, buttons).
  • I documented the TypeScript code using JSDoc style.
  • I added multiple screenshots/screencasts of my UI changes.
  • I translated all newly inserted strings into English and German.

Motivation and Context

Allow preliminary AI-generated feedback for students before the exercise due date.

Description

A new option on text exercises that, if enabled, allows students to request AI feedback (with a predefined limit of requests per student).
The results are saved as automatic feedback and attached to a submission.
Students can request feedback for each submission once.
Unlike before, where text exercises had a single, perpetually updated submission, a new submission is now created when a student submits again after the current submission has already received Athena feedback.
Tutor assessment had to be slightly adjusted to ignore Athena results and to retrieve the latest submission (instead of the default first element, `[0]`, which was used until now).
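The submission-handling change above can be sketched as follows. This is a minimal, hypothetical model for illustration only; the class and field names (`Submission`, `hasAthenaFeedback`, `SubmissionPolicy`) are assumptions and not the actual Artemis types:

```java
import java.util.Comparator;
import java.util.List;
import java.util.Optional;

// Hypothetical minimal submission model (not the real Artemis entity).
class Submission {
    long id;
    long submissionDate; // epoch millis, for ordering
    boolean hasAthenaFeedback;

    Submission(long id, long submissionDate, boolean hasAthenaFeedback) {
        this.id = id;
        this.submissionDate = submissionDate;
        this.hasAthenaFeedback = hasAthenaFeedback;
    }
}

class SubmissionPolicy {
    // Retrieve the latest submission by date, instead of blindly taking element [0].
    static Optional<Submission> latest(List<Submission> submissions) {
        return submissions.stream().max(Comparator.comparingLong(s -> s.submissionDate));
    }

    // A fresh submission is created only when the current one already carries Athena feedback;
    // otherwise the existing submission keeps being updated, as before.
    static boolean requiresNewSubmission(Submission current) {
        return current.hasAthenaFeedback;
    }
}
```

The key design point is that feedback is pinned to an immutable submission snapshot: once Athena feedback is attached, further edits must land on a new submission so the feedback still matches the text it was generated for.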

Steps for Testing

Prerequisites:

  • 1 Instructor
  • 1 Student
  • 1 Text Exercise
  1. Create a new exercise with the "allow AI feedback" feature enabled.

  2. (Optional) Ideally, add some grading instructions.

  3. Participate as a student and request AI feedback (the limit must be 10 requests).

  4. Confirm that each AI feedback request returns a result and creates a new submission.

  5. Confirm that the student can continue working on the submission.

  6. Confirm that the results are always visible, even before the due date and before the assessment due date.

  7. Confirm that the results are GRADED and tagged with PRELIMINARY in the detail view.

  8. Log in as the instructor.

  9. Start an assessment.

  10. Confirm that the latest submission is retrieved (you can check this in the participations or scores tab).

  11. Confirm that assessment works as usual and that the student sees the final result.

  12. Confirm that complaints and complaint assessments work as usual.
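The per-student request limit exercised in step 3 can be sketched as below. This is a hypothetical in-memory counter, assuming a fixed limit of 10 as stated above; the class name `FeedbackRequestLimiter` and its API are illustrative, not the actual server implementation:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical per-student request counter (illustrative only).
class FeedbackRequestLimiter {
    private final int maxRequests;
    private final Map<String, Integer> requestsByStudent = new HashMap<>();

    FeedbackRequestLimiter(int maxRequests) {
        this.maxRequests = maxRequests;
    }

    // Returns true and records the request if the student is still under the limit;
    // returns false once the limit is exhausted.
    boolean tryRequest(String studentLogin) {
        int used = requestsByStudent.getOrDefault(studentLogin, 0);
        if (used >= maxRequests) {
            return false;
        }
        requestsByStudent.put(studentLogin, used + 1);
        return true;
    }
}
```

With `new FeedbackRequestLimiter(10)`, the 11th call to `tryRequest` for the same student returns false, which matches the expected behavior in the testing steps.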

Testserver States

Note

These badges show the state of the test servers.
Green = Currently available, Red = Currently locked
Click on the badges to get to the test servers.

Review Progress

Performance Review

  • I (as a reviewer) confirm that the client changes (in particular related to REST calls and UI responsiveness) are implemented with a very good performance
  • I (as a reviewer) confirm that the server changes (in particular related to database calls) are implemented with a very good performance

Code Review

  • Code Review 1
  • Code Review 2

Manual Tests

  • Test 1
  • Test 2

Test Coverage

Screenshots

options
studentView
history
successAndOther
requireSubmitonResult

Screencast

video

Summary by CodeRabbit

  • New Features

    • Introduced automated feedback generation for text exercises via the new TextExerciseFeedbackService.
    • Enhanced filtering logic for displaying results based on assessment types, improving user experience.
    • Added new methods for retrieving student participation data with robust error handling.
    • Improved result display logic to indicate processing status for automatic feedback.
    • Refined logic for managing student participation results and feedback notifications in the Course Exercise Details component.
    • Updated submission handling to differentiate between new submissions and updates based on feedback.
    • Added localized messages for AI feedback status to enhance user communication.
  • Bug Fixes

    • Enhanced handling of feedback notifications to ensure timely user alerts.
  • Tests

    • Expanded test cases to validate the handling of results during automated assessments, ensuring accurate UI feedback.

@github-actions github-actions bot added the client Pull requests that update TypeScript code. (Added Automatically!) label Aug 22, 2024
@github-actions github-actions bot added the server Pull requests that update Java code. (Added Automatically!) label Aug 22, 2024
JohannesWt pushed a commit that referenced this pull request Sep 23, 2024