start replication job
2023-12-26 17:19:04 192-0: guest => VM 192, running => 5688
2023-12-26 17:19:04 192-0: volumes => local-zfs:base-9002-disk-0/vm-192-disk-1,local-zfs:base-9002-disk-1/vm-192-disk-0,local-zfs:vm-192-disk-2,local-zfs:vm-192-state-S2023_12_18_10_54,local-zfs:vm-192-state-S_2023_12_23_11_58
2023-12-26 17:19:06 192-0: create snapshot 'replicate_192-0_1703571544' on local-zfs:base-9002-disk-0/vm-192-disk-1
2023-12-26 17:19:06 192-0: create snapshot 'replicate_192-0_1703571544' on local-zfs:base-9002-disk-1/vm-192-disk-0
2023-12-26 17:19:07 192-0: create snapshot 'replicate_192-0_1703571544' on local-zfs:vm-192-disk-2
2023-12-26 17:19:07 192-0: create snapshot 'replicate_192-0_1703571544' on local-zfs:vm-192-state-S2023_12_18_10_54
2023-12-26 17:19:07 192-0: create snapshot 'replicate_192-0_1703571544' on local-zfs:vm-192-state-S_2023_12_23_11_58
2023-12-26 17:19:07 192-0: using secure transmission, rate limit: none
2023-12-26 17:19:07 192-0: full sync 'local-zfs:base-9002-disk-0/vm-192-disk-1' (replicate_192-0_1703571544)
2023-12-26 17:19:08 192-0: full send of rpool/data/vm-192-disk-1@S2023_12_18_10_54 estimated size is 7.65G
2023-12-26 17:19:08 192-0: send from @S2023_12_18_10_54 to rpool/data/vm-192-disk-1@S_2023_12_23_11_58 estimated size is 894M
2023-12-26 17:19:08 192-0: send from @S_2023_12_23_11_58 to rpool/data/vm-192-disk-1@replicate_192-0_1703571544 estimated size is 483M
2023-12-26 17:19:08 192-0: total estimated size is 9.00G
2023-12-26 17:19:08 192-0: TIME SENT SNAPSHOT rpool/data/vm-192-disk-1@S2023_12_18_10_54
2023-12-26 17:19:09 192-0: cannot receive: local origin for clone rpool/data/vm-192-disk-1@S2023_12_18_10_54 does not exist
2023-12-26 17:19:09 192-0: cannot open 'rpool/data/vm-192-disk-1': dataset does not exist
2023-12-26 17:19:09 192-0: command 'zfs recv -F -- rpool/data/vm-192-disk-1' failed: exit code 1
2023-12-26 17:19:09 192-0: warning: cannot send 'rpool/data/vm-192-disk-1@S2023_12_18_10_54': signal received
2023-12-26 17:19:09 192-0: TIME SENT SNAPSHOT rpool/data/vm-192-disk-1@S_2023_12_23_11_58
2023-12-26 17:19:09 192-0: warning: cannot send 'rpool/data/vm-192-disk-1@S_2023_12_23_11_58': Broken pipe
2023-12-26 17:19:09 192-0: TIME SENT SNAPSHOT rpool/data/vm-192-disk-1@replicate_192-0_1703571544
2023-12-26 17:19:09 192-0: warning: cannot send 'rpool/data/vm-192-disk-1@replicate_192-0_1703571544': Broken pipe
2023-12-26 17:19:09 192-0: cannot send 'rpool/data/vm-192-disk-1': I/O error
2023-12-26 17:19:09 192-0: command 'zfs send -Rpv -- rpool/data/vm-192-disk-1@replicate_192-0_1703571544' failed: exit code 1
2023-12-26 17:19:09 192-0: delete previous replication snapshot 'replicate_192-0_1703571544' on local-zfs:base-9002-disk-0/vm-192-disk-1
2023-12-26 17:19:09 192-0: delete previous replication snapshot 'replicate_192-0_1703571544' on local-zfs:base-9002-disk-1/vm-192-disk-0
2023-12-26 17:19:09 192-0: delete previous replication snapshot 'replicate_192-0_1703571544' on local-zfs:vm-192-disk-2
2023-12-26 17:19:09 192-0: delete previous replication snapshot 'replicate_192-0_1703571544' on local-zfs:vm-192-state-S2023_12_18_10_54
2023-12-26 17:19:09 192-0: delete previous replication snapshot 'replicate_192-0_1703571544' on local-zfs:vm-192-state-S_2023_12_23_11_58
2023-12-26 17:19:09 192-0: end replication job with error: command 'set -o pipefail && pvesm export local-zfs:base-9002-disk-0/vm-192-disk-1 zfs - -with-snapshots 1 -snapshot replicate_192-0_1703571544 | /usr/bin/ssh -e none -o 'BatchMode=yes' -o 'HostKeyAlias=dsco-vm-01' [email protected] -- pvesm import local-zfs:base-9002-disk-0/vm-192-disk-1 zfs - -with-snapshots 1 -snapshot replicate_192-0_1703571544 -allow-rename 0' failed: exit code 1
This is the log I am receiving while trying to replicate a VM. What could be the solution?
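For reference, the failure starts at the "cannot receive: local origin for clone rpool/data/vm-192-disk-1@S2023_12_18_10_54 does not exist" line. The volume name local-zfs:base-9002-disk-0/vm-192-disk-1 suggests this disk is a linked clone of the base-9002 template, and zfs recv on the target would need that origin dataset to exist before it can recreate the clone. A rough way to check this (dataset and host names are taken from the log above; adjust them to the actual pool layout) would be:

# On the source node: show which snapshot the clone originates from
zfs get origin rpool/data/vm-192-disk-1

# On the target node (dsco-vm-01 in the log): check whether the base dataset
# and its snapshots exist there
zfs list -t all | grep base-9002-disk-0

If the base dataset really is absent on the target, that would be consistent with the "local origin for clone ... does not exist" error shown above.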
This seems to be an issue with Proxmox, which is not made to run on Debian-based operating systems.