
Fix deletion of backup schedules #11222

Merged
merged 2 commits into apache:main
Jul 23, 2025

Conversation

bernardodemarco
Member

Description

PR #9666 introduced changes to fix issues with the deletion of backup schedules. Prior to that PR, the deleteBackupSchedule API accepted only one parameter (virtualmachineid) and would delete a random backup schedule associated with the specified VM. The update modified the behavior so that, instead of randomly deleting one schedule, the API deletes all backup schedules for the given VM (when the virtualmachineid parameter is specified).

Additionally, the PR added support for deleting a specific schedule by introducing the id parameter to the deleteBackupSchedule API. However, the BackupScheduleVO class currently does not have a uuid attribute, and the BackupScheduleResponse does not have the id field. As a result, the API currently does not correctly recognize the id parameter in the API calls, preventing the deletion of a specific schedule.

This PR addresses that issue by enabling the deletion of individual backup schedules via their ID. It also updates the UI to send the id parameter when the user selects a specific schedule to delete, rather than using the virtualmachineid parameter.
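For illustration only, the shape of the fix described above can be sketched as follows. This is a simplified, hypothetical stand-in, not the actual CloudStack classes: the `Schedule` class and `resolveTarget` method are invented names, and the real patch touches `BackupScheduleVO`, `BackupScheduleResponse`, and `DeleteBackupScheduleCmd`.

```java
import java.util.UUID;

// Hypothetical, simplified sketch: the schedule entity gains a uuid
// attribute, and the delete command resolves its target from either
// the schedule id or the VM id, preferring the more specific one.
public class DeleteScheduleSketch {

    // Stand-in for BackupScheduleVO, now carrying an externally visible uuid.
    static class Schedule {
        final long dbId;
        final String uuid = UUID.randomUUID().toString();
        Schedule(long dbId) { this.dbId = dbId; }
    }

    // Stand-in for the command's parameter validation: a specific schedule
    // id takes precedence; otherwise all schedules of the VM are targeted.
    static String resolveTarget(String scheduleUuid, String vmUuid) {
        if (scheduleUuid != null) {
            return "delete schedule " + scheduleUuid;
        }
        if (vmUuid != null) {
            return "delete all schedules of VM " + vmUuid;
        }
        throw new IllegalArgumentException(
                "Either instance ID or ID of backup schedule needs to be specified");
    }

    public static void main(String[] args) {
        Schedule s = new Schedule(11);
        System.out.println(resolveTarget(s.uuid, null));
        System.out.println(resolveTarget(null, "c5dd93a7-2060-48db-983e-d6d318e6bafd"));
    }
}
```

Without the uuid attribute on the entity, the first branch can never match, which is exactly the pre-fix failure shown in the test transcript below.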

Types of changes

  • Breaking change (fix or feature that would cause existing functionality to change)
  • New feature (non-breaking change which adds functionality)
  • Bug fix (non-breaking change which fixes an issue)
  • Enhancement (improves an existing feature and functionality)
  • Cleanup (Code refactoring and cleanup, that may add test cases)
  • build/CI
  • test (unit or integration test code)

Feature/Enhancement Scale or Bug Severity

Feature/Enhancement Scale

  • Major
  • Minor

Screenshots (if appropriate):

How Has This Been Tested?

Before applying the PR changes, I verified that it was not possible to delete specific schedules:

Backup Schedules
select * from cloud.backup_schedule\G
*************************** 1. row ***************************
                 id: 11
              vm_id: 7
      schedule_type: 0
           schedule: 12
           timezone: AGT
scheduled_timestamp: 2025-07-16 14:12:00
       async_job_id: NULL
        max_backups: 2
*************************** 2. row ***************************
                 id: 12
              vm_id: 7
      schedule_type: 1
           schedule: 03:30
           timezone: GMT
scheduled_timestamp: 2025-07-17 06:03:00
       async_job_id: NULL
        max_backups: 1
2 rows in set (0.000 sec)
(segregated-lab) 🐱 > delete backupschedule id=12
🙈 Error: (HTTP 530, error code 9999) Either instance ID or ID of backup schedule needs to be specified
(segregated-lab) 🐱 > delete backupschedule id=11
🙈 Error: (HTTP 530, error code 9999) Either instance ID or ID of backup schedule needs to be specified
(segregated-lab) 🐱 > delete backupschedule id="11"
🙈 Error: (HTTP 530, error code 9999) Either instance ID or ID of backup schedule needs to be specified
(segregated-lab) 🐱 > delete backupschedule id="12"
🙈 Error: (HTTP 530, error code 9999) Either instance ID or ID of backup schedule needs to be specified

After applying the changes:

  • Verified that the ID is present in the listBackupSchedule API return
  • Verified that it is possible to delete backup schedules of a specific ID
  • Verified that it is possible to delete specific schedules through the UI
  • Verified that it is possible to delete all backup schedules of a VM
  • Verified access validation


codecov bot commented Jul 16, 2025

Codecov Report

Attention: Patch coverage is 51.85185% with 13 lines in your changes missing coverage. Please review.

Project coverage is 16.58%. Comparing base (65d3592) to head (2dd5fc4).
Report is 13 commits behind head on main.

Files with missing lines Patch % Lines
...loudstack/api/response/BackupScheduleResponse.java 0.00% 5 Missing ⚠️
...org/apache/cloudstack/backup/BackupScheduleVO.java 0.00% 5 Missing ⚠️
...e/cloudstack/backup/dao/BackupScheduleDaoImpl.java 0.00% 2 Missing ⚠️
...i/command/user/backup/DeleteBackupScheduleCmd.java 0.00% 1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff            @@
##               main   #11222   +/-   ##
=========================================
  Coverage     16.57%   16.58%           
- Complexity    14054    14067   +13     
=========================================
  Files          5772     5772           
  Lines        512926   512951   +25     
  Branches      62306    62304    -2     
=========================================
+ Hits          85014    85051   +37     
+ Misses       418426   418418    -8     
+ Partials       9486     9482    -4     
Flag Coverage Δ
uitests 3.89% <ø> (ø)
unittests 17.47% <51.85%> (+<0.01%) ⬆️


@bernardodemarco
Member Author

@blueorangutan package

@blueorangutan

@bernardodemarco a [SL] Jenkins job has been kicked to build packages. It will be bundled with KVM, XenServer and VMware SystemVM templates. I'll keep you posted as I make progress.

@blueorangutan

Packaging result [SF]: ✔️ el8 ✔️ el9 ✔️ debian ✔️ suse15. SL-JID 14216

Collaborator

@abh1sar abh1sar left a comment


Very useful change. Thanks, @bernardodemarco.

Just one comment. cmk is not listing backup schedules automatically. I believe it is because listBackupSchedule doesn't have an id parameter.

(localhost) 🐱 > list backupschedule virtualmachineid=c5dd93a7-2060-48db-983e-d6d318e6bafd 
{
  "backupschedule": [
    {
      "id": "65790631-2654-4743-bc16-2652420f73d1",
      "intervaltype": "HOURLY",
      "maxbackups": 5,
      "schedule": "3",
      "timezone": "ART",
      "virtualmachineid": "c5dd93a7-2060-48db-983e-d6d318e6bafd",
      "virtualmachinename": "VM-c5dd93a7-2060-48db-983e-d6d318e6bafd"
    },
    {
      "id": "5f5d1155-a935-4db3-b28d-7b637634e244",
      "intervaltype": "WEEKLY",
      "maxbackups": 4,
      "schedule": "00:12:1",
      "timezone": "ACT",
      "virtualmachineid": "c5dd93a7-2060-48db-983e-d6d318e6bafd",
      "virtualmachinename": "VM-c5dd93a7-2060-48db-983e-d6d318e6bafd"
    }
  ],
  "count": 2
}
(localhost) 🐱 > delete backupschedule virtualmachineid=c5dd93a7-2060-48db-983e-d6d318e6bafd id=
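A hypothetical sketch of the listing-side improvement this comment suggests: if listBackupSchedule accepted an id parameter, the service layer could narrow the result set to one schedule before building responses. The class and method names here are illustrative, not CloudStack's actual API.

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Illustrative sketch only: filter a VM's schedules by an optional id
// parameter, falling back to the current list-all behavior when absent.
public class ListScheduleFilterSketch {

    static List<String> filterByUuid(List<String> scheduleUuids, String idParam) {
        if (idParam == null) {
            return scheduleUuids; // current behavior: list all schedules of the VM
        }
        return scheduleUuids.stream()
                .filter(uuid -> uuid.equals(idParam))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> all = Arrays.asList(
                "65790631-2654-4743-bc16-2652420f73d1",
                "5f5d1155-a935-4db3-b28d-7b637634e244");
        System.out.println(filterByUuid(all, "65790631-2654-4743-bc16-2652420f73d1"));
    }
}
```

Such a filter would also let cmk auto-complete the id argument when deleting, since completion is driven by the list API.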

Member

@winterhazel winterhazel left a comment


Looks good, tested by:

  • Deleting specific backup schedules of a VM by passing the id parameter via both the UI and CMK
  • Deleting all backup schedules of a VM by passing the virtualmachineid parameter via CMK
  • Checking that users can only delete backup schedules associated with VMs that they have access to

Some remarks:

  • I noticed that when a user deletes the last backup schedule of a VM via the UI, the schedule listing does not update because listBackupSchedule returns a 530 instead of an empty response
  • @abh1sar's comment would also be a nice addition

Both remarks can be addressed in a separate PR, as they are related to listBackupSchedule, and not deleteBackupSchedule.

@bernardodemarco
Member Author

@abh1sar thanks for your review and @winterhazel thanks for testing!

Some remarks:

  • I noticed that when a user deletes the last backup schedule of a VM via the UI, the schedule listing does not update because listBackupSchedule returns a 530 instead of an empty response
  • @abh1sar's comment would also be a nice addition

Yes, I also noticed those behaviors while working on this PR, definitely something important for us to address.

I'm on vacation for the next 10 days and will be taking some time off during this period. So, I suggest we handle these two minor issues in a separate PR.

@winterhazel
Member

@sureshanaparti can we run the CI here?

@sureshanaparti
Contributor

@blueorangutan package

@blueorangutan

@sureshanaparti a [SL] Jenkins job has been kicked to build packages. It will be bundled with KVM, XenServer and VMware SystemVM templates. I'll keep you posted as I make progress.

@blueorangutan

Packaging result [SF]: ✔️ el8 ✔️ el9 ✔️ debian ✔️ suse15. SL-JID 14287

@sureshanaparti
Contributor

@blueorangutan test

@blueorangutan

@sureshanaparti a [SL] Trillian-Jenkins test job (ol8 mgmt + kvm-ol8) has been kicked to run smoke tests

Contributor

@JoaoJandre JoaoJandre left a comment


CLGTM

Contributor

@Pearl1594 Pearl1594 left a comment


code lgtm

@blueorangutan

[SF] Trillian test result (tid-13836)
Environment: kvm-ol8 (x2), Advanced Networking with Mgmt server ol8
Total time taken: 56165 seconds
Marvin logs: https://github.com/blueorangutan/acs-prs/releases/download/trillian/pr11222-t13836-kvm-ol8.zip
Smoke tests completed. 142 look OK, 0 have errors, 0 did not run
Only failed and skipped test results are shown below (none in this run).

@DaanHoogland DaanHoogland merged commit 9d04970 into apache:main Jul 23, 2025
26 checks passed
Projects
Status: Done
Development

Successfully merging this pull request may close these issues.

9 participants