Is it possible to use beta-CROWN directly without the need for alpha-CROWN? #79
Comments
That's probably not supported for now, but you may try setting the learning rate for alpha to 0 to effectively disable alpha optimization.
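For illustration, a minimal sketch of that suggestion in the YAML config (the key paths, e.g. solver.alpha-crown.lr_alpha, are assumptions based on this thread and may differ in your version of the config):
solver:
  alpha-crown:
    lr_alpha: 0  # assumed key: learning rate for optimizing alpha; 0 effectively freezes alpha at its initialization
  beta-crown:
    lr_alpha: 0  # assumed key: learning rate for refining alpha during branch and bound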
Thanks! May I ask why the certified accuracy under the setting with attack: is 64%, while under general: it is 69%? It seems the former setting is beta-CROWN, a complete verifier, while the latter setting is alpha-beta-CROWN, which is partially incomplete. However, the results for the incomplete verifier are more precise than those for the complete verifier, which seems theoretically impossible. Did I make a mistake in the implementation?
The latter one is not an incomplete verifier, as branch-and-bound is enabled by default. The former one is worse probably because some configs are changed (such as the fsb branching heuristic with 1 candidate, which is worse than the default kfsb).
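For reference, a hedged sketch of where the branching heuristic mentioned above might be configured (key names inferred from the fsb/kfsb options discussed in this thread, not verified against your config version):
bab:
  branching:
    method: kfsb   # the default heuristic referred to above; the weaker run used fsb
    candidates: 3  # assumed default; the weaker run used only 1 candidate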
Hi @shizhouxing |
@ytsao It can be turned off by setting
Thanks! May I ask why the certified accuracy under the following setting is still worse than the default one?
enable_incomplete_verification: False
data:
attack:
  pgd_restarts: 50
bab:
@xiaoyuanpigo Could you explain what changes you are adding and why they are introduced?
I aim to use a verifier that primarily relies on BaB (branch and bound) for ReLU, instead of a simpler approach that bounds ReLU with a single pair of upper and lower linear constraints. To achieve this, I've set the timeout to 3600 seconds, the iteration count for beta-CROWN to 100, and the number of FSB candidates to 10 to encourage the verifier to apply BaB on more ReLUs (see the sketch below). However, the results don't seem to match my expectations. Could there be an error in my implementation? Furthermore, I get better certified results under the following settings:
general:
data:
attack:
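A sketch of what the first set of changes described above (3600 s timeout, 100 beta-CROWN iterations, 10 FSB candidates) might look like in the YAML config; the key paths are assumptions inferred from this thread:
bab:
  timeout: 3600     # per-instance time limit in seconds
  branching:
    method: fsb     # FSB branching heuristic
    candidates: 10  # number of FSB candidates
solver:
  beta-crown:
    iteration: 100  # number of beta-CROWN optimization iterations per bound computation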
This number is not the number of ReLUs to apply BaB on; BaB will be applied to all the unstable ReLUs until timeout. The number of FSB candidates is the number of candidate neurons for which some actual bounds are computed when trying to pick the best neuron to branch on. Increasing this number a lot may slow down BaB. Same for the number of beta-CROWN iterations -- it may slow down BaB. So it's possible that you may observe a worse result.
Thanks! @shizhouxing Should I consider skipping beta-CROWN and alpha-CROWN altogether? Like the following setting:
Making the number of iterations too small or too large can both worsen the results.
If alpha-CROWN and beta-CROWN are disabled, bounds will be looser. Although BaB may be able to handle more ReLU neurons, due to the looser bounds, you will likely get worse results.
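Purely as an illustration of what "skipping" both optimizations might look like (hypothetical; the learning-rate keys extend the earlier suggestion in this thread of setting the alpha learning rate to 0, and lr_beta is an additional assumption; as noted above, this is expected to give looser bounds):
solver:
  alpha-crown:
    lr_alpha: 0  # assumed key; alpha stays at its initial value, i.e. a plain CROWN-style relaxation
  beta-crown:
    lr_alpha: 0  # assumed key; no alpha refinement during BaB
    lr_beta: 0   # assumed key; beta stays at its initial value, so the split constraints are not tightened by optimization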
Thanks @shizhouxing! May I ask why the final result is marked as "unknown" when both "global_lb" and "best_l after optimization" are already greater than 0, which seems to suggest that alpha-beta-CROWN has successfully verified the instance? Below is the relevant log:
@xiaoyuanpigo Could you post your full log?
Certainly! Here’s the full log for your reference.
It's probably a timeout. Maybe 224 s exceeded your timeout? What timeout did you set? I think "line 319 of bab.py" is not relevant here. If
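If the timeout was indeed the issue, the limit is typically configured along these lines (key path assumed from this thread, where a 3600 s timeout was mentioned earlier):
bab:
  timeout: 3600  # time budget per instance in seconds; a run that has not finished within this limit is reported as unknown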
I have run into some questions while working with beta-CROWN, and I would appreciate your assistance in clarifying them.
I would like to know if it is possible to use beta-CROWN directly without the need for alpha-CROWN. If this is possible, could you kindly explain why I am encountering an error when setting the following in "configuration.yaml"?
alpha-crown:
  alpha: false
The specific error message is as follows:
BaB round 1
batch: 1
Traceback (most recent call last):
File "abcrown.py", line 658, in
abcrown.main()
File "abcrown.py", line 632, in main
verified_status = self.complete_verifier(
File "abcrown.py", line 424, in complete_verifier
l, nodes, ret = self.bab(
File "abcrown.py", line 244, in bab
result = general_bab(
File "/data/code/alpha-beta-CROWN/complete_verifier/bab.py", line 345, in general_bab
global_lb = act_split_round(
File "/data/code/alpha-beta-CROWN/complete_verifier/bab.py", line 170, in act_split_round
split_domain(net, domains, d, batch, impl_params=impl_params,
File "/data/code/alpha-beta-CROWN/complete_verifier/bab.py", line 75, in split_domain
branching_heuristic.get_branching_decisions(
File "/root/miniconda3/envs/py38/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
return func(*args, **kwargs)
File "/data/code//alpha-beta-CROWN/complete_verifier/heuristics/kfsb.py", line 148, in get_branching_decisions
k_ret_lbs = self.net.update_bounds(
File "/data/code/alpha-beta-CROWN/complete_verifier/beta_CROWN_solver.py", line 309, in update_bounds
lb, _, = self.net.compute_bounds(
File "/data/code/alpha-beta-CROWN/complete_verifier/auto_LiRPA/bound_general.py", line 1209, in compute_bounds
return self._compute_bounds_main(C=C,
File "/data/code/alpha-beta-CROWN/complete_verifier/auto_LiRPA/bound_general.py", line 1314, in _compute_bounds_main
ret = self.backward_general(
File "/data/code/alpha-beta-CROWN/complete_verifier/auto_LiRPA/backward_bound.py", line 256, in backward_general
A, lower_b, upper_b = l.bound_backward(
File "/data/code/alpha-beta-CROWN/complete_verifier/auto_LiRPA/operators/relu.py", line 249, in bound_backward
self._backward_relaxation(last_lA, last_uA, x, start_node, unstable_idx)
File "/data/code/alpha-beta-CROWN/complete_verifier/auto_LiRPA/operators/relu.py", line 555, in _backward_relaxation
selected_alpha, alpha_lookup_idx = self.select_alpha_by_idx(last_lA, last_uA,
File "/data/code/alpha-beta-CROWN/complete_verifier/auto_LiRPA/operators/relu.py", line 207, in select_alpha_by_idx
selected_alpha = self.alpha[start_node.name]
KeyError: '/50'