feat(crypto): CRP-2648 CRP-2600 VetKD API and testing improvements #3283

Open
wants to merge 10 commits into master
1 change: 1 addition & 0 deletions Cargo.lock

Some generated files are not rendered by default.

12 changes: 12 additions & 0 deletions rs/crypto/internal/crypto_lib/bls12_381/type/src/lib.rs
@@ -1908,6 +1908,18 @@ declare_muln_vartime_affine_impl_for!(G1Projective, G1Affine);
impl_debug_using_serialize_for!(G1Affine);
impl_debug_using_serialize_for!(G1Projective);

impl G1Affine {
/// See draft-irtf-cfrg-bls-signature-01 §4.2.2 for details on BLS augmented signatures
pub fn augmented_hash(pk: &G2Affine, data: &[u8]) -> Self {
Review comment (Member): It might make sense to add tests, given that this is new functionality here in the BLS type crate.

let domain_sep = b"BLS_SIG_BLS12381G1_XMD:SHA-256_SSWU_RO_AUG_";

let mut signature_input = vec![];
signature_input.extend_from_slice(&pk.serialize());
signature_input.extend_from_slice(data);
Self::hash(domain_sep, &signature_input)
}
}
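The review comment above suggests adding tests for this new function. A minimal sketch of what such a test could look like (not part of this PR's diff; it assumes the type crate's G1Affine/G2Affine expose hash, serialize, and generator as used in the code above, and that the reproducible_rng test helper is available to this crate as a dev dependency):

```rust
use ic_crypto_internal_bls12_381_type::{G1Affine, G2Affine, Scalar};
use ic_crypto_test_utils_reproducible_rng::reproducible_rng;

#[test]
fn augmented_hash_prepends_public_key_to_message() {
    let rng = &mut reproducible_rng();

    // Illustrative key pair: a random scalar and the matching G2 public key.
    let sk = Scalar::random(rng);
    let pk = G2Affine::from(G2Affine::generator() * &sk);
    let data = b"message";

    // The augmented hash should equal hashing pk.serialize() || data under
    // the BLS12-381 G1 "AUG" domain separator.
    let mut input = pk.serialize().to_vec();
    input.extend_from_slice(data);
    let expected = G1Affine::hash(b"BLS_SIG_BLS12381G1_XMD:SHA-256_SSWU_RO_AUG_", &input);

    assert_eq!(
        G1Affine::augmented_hash(&pk, data).serialize(),
        expected.serialize()
    );

    // Changing the message (or the public key) must change the resulting point.
    assert_ne!(
        G1Affine::augmented_hash(&pk, b"other message").serialize(),
        G1Affine::augmented_hash(&pk, data).serialize()
    );
}
```

Comparing serializations avoids relying on any particular equality impl for G1Affine.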

define_affine_and_projective_types!(G2Affine, G2Projective, 96);
declare_addsub_ops_for!(G2Projective);
declare_mixed_addition_ops_for!(G2Projective, G2Affine);
2 changes: 2 additions & 0 deletions rs/crypto/internal/crypto_lib/bls12_381/vetkd/BUILD.bazel
@@ -17,6 +17,7 @@ DEV_DEPENDENCIES = [
# Keep sorted.
"//rs/crypto/test_utils/reproducible_rng",
"@crate_index//:hex",
"@crate_index//:rand_chacha",
]

MACRO_DEV_DEPENDENCIES = []
@@ -47,6 +48,7 @@ rust_bench(
deps = [
# Keep sorted.
":vetkd",
"//rs/crypto/internal/crypto_lib/bls12_381/type",
"//rs/crypto/test_utils/reproducible_rng",
"@crate_index//:criterion",
"@crate_index//:rand",
1 change: 1 addition & 0 deletions rs/crypto/internal/crypto_lib/bls12_381/vetkd/Cargo.toml
@@ -16,6 +16,7 @@ zeroize = { workspace = true }
criterion = { workspace = true }
hex = { workspace = true }
ic-crypto-test-utils-reproducible-rng = { path = "../../../../test_utils/reproducible_rng" }
rand_chacha = { workspace = true }

[[bench]]
name = "vetkd"
32 changes: 18 additions & 14 deletions rs/crypto/internal/crypto_lib/bls12_381/vetkd/benches/vetkd.rs
@@ -1,4 +1,5 @@
use criterion::*;
use ic_crypto_internal_bls12_381_type::{Polynomial, Scalar};
use ic_crypto_internal_bls12_381_vetkd::*;
use ic_crypto_test_utils_reproducible_rng::reproducible_rng;
use rand::Rng;
@@ -124,22 +125,25 @@ fn vetkd_bench(c: &mut Criterion) {
node_info.push((node as u32, node_pk, eks));
}

group.bench_function(format!("EncryptedKey::combine (n={})", nodes), |b| {
b.iter(|| {
EncryptedKey::combine(
&node_info,
threshold,
&master_pk,
&tpk,
&derivation_path,
&did,
)
.unwrap()
})
});
group.bench_function(
format!("EncryptedKey::combine_valid_shares (n={})", nodes),
|b| {
b.iter(|| {
EncryptedKey::combine_valid_shares(
&node_info,
threshold,
&master_pk,
&tpk,
&derivation_path,
&did,
)
.unwrap()
})
},
);

if threshold == 9 {
let ek = EncryptedKey::combine(
let ek = EncryptedKey::combine_valid_shares(
&node_info,
threshold,
&master_pk,
124 changes: 87 additions & 37 deletions rs/crypto/internal/crypto_lib/bls12_381/vetkd/src/lib.rs
@@ -5,7 +5,9 @@
#![forbid(unsafe_code)]
#![forbid(missing_docs)]

pub use ic_crypto_internal_bls12_381_type::*;
pub use ic_crypto_internal_bls12_381_type::{G1Affine, G2Affine, PairingInvalidPoint, Scalar};
use ic_crypto_internal_bls12_381_type::{G1Projective, G2Prepared, Gt, LagrangeCoefficients};

use rand::{CryptoRng, RngCore};
use zeroize::{Zeroize, ZeroizeOnDrop};

@@ -94,7 +96,7 @@ impl TransportSecretKey {
dpk: &DerivedPublicKey,
did: &[u8],
) -> Option<G1Affine> {
let msg = augmented_hash_to_g1(&dpk.pt, did);
let msg = G1Affine::augmented_hash(&dpk.pt, did);

let k = G1Affine::from(G1Projective::from(&ek.c3) - &ek.c1 * self.secret());

@@ -206,6 +208,11 @@ impl DerivedPublicKey {
self.pt.serialize()
}

/// Return the derived point in G2
pub fn point(&self) -> &G2Affine {
&self.pt
}

/// Deserialize a previously serialized derived public key
pub fn deserialize(bytes: &[u8]) -> Result<Self, DerivedPublicKeyDeserializationError> {
let pt = G2Affine::deserialize(&bytes)
@@ -214,16 +221,6 @@
}
}

/// See draft-irtf-cfrg-bls-signature-01 §4.2.2 for details on BLS augmented signatures
fn augmented_hash_to_g1(pk: &G2Affine, data: &[u8]) -> G1Affine {
let domain_sep = b"BLS_SIG_BLS12381G1_XMD:SHA-256_SSWU_RO_AUG_";

let mut signature_input = vec![];
signature_input.extend_from_slice(&pk.serialize());
signature_input.extend_from_slice(data);
G1Affine::hash(domain_sep, &signature_input)
}

/// Check the validity of an encrypted key or encrypted key share
///
/// The only difference in handling is when checking a share, the
@@ -275,9 +272,10 @@ pub enum EncryptedKeyCombinationError {
DuplicateNodeIndex,
/// There were insufficient shares to perform combination
InsufficientShares,
/// Some of the key shares are invalid; the Vec contains the list of
/// node indexes whose shares were malformed
InvalidKeyShares(Vec<NodeIndex>),
/// Not enough valid shares
InsufficientValidKeyShares,
/// Some of the key shares are invalid
InvalidShares,
}

#[derive(Clone, Eq, PartialEq, Debug)]
@@ -292,14 +290,11 @@ impl EncryptedKey {
/// The length of the serialized encoding of this type
pub const BYTES: usize = 2 * G1Affine::BYTES + G2Affine::BYTES;

/// Combine several shares into an encrypted key
pub fn combine(
nodes: &[(NodeIndex, G2Affine, EncryptedKeyShare)],
/// Combine, unchecked.
/// The returned key may be invalid.
fn combine_unchecked(
nodes: &[(NodeIndex, EncryptedKeyShare)],
reconstruction_threshold: usize,
master_pk: &G2Affine,
tpk: &TransportPublicKey,
derivation_path: &DerivationPath,
did: &[u8],
) -> Result<Self, EncryptedKeyCombinationError> {
if nodes.len() < reconstruction_threshold {
return Err(EncryptedKeyCombinationError::InsufficientShares);
@@ -309,31 +304,86 @@
.map_err(|_| EncryptedKeyCombinationError::DuplicateNodeIndex)?;

let c1 = l
.interpolate_g1(&nodes.iter().map(|i| &i.2.c1).collect::<Vec<_>>())
.interpolate_g1(&nodes.iter().map(|i| &i.1.c1).collect::<Vec<_>>())
.expect("Number of nodes and shares guaranteed equal");
let c2 = l
.interpolate_g2(&nodes.iter().map(|i| &i.2.c2).collect::<Vec<_>>())
.interpolate_g2(&nodes.iter().map(|i| &i.1.c2).collect::<Vec<_>>())
.expect("Number of nodes and shares guaranteed equal");
let c3 = l
.interpolate_g1(&nodes.iter().map(|i| &i.2.c3).collect::<Vec<_>>())
.interpolate_g1(&nodes.iter().map(|i| &i.1.c3).collect::<Vec<_>>())
.expect("Number of nodes and shares guaranteed equal");

let c = Self { c1, c2, c3 };
Ok(Self { c1, c2, c3 })
}

if !c.is_valid(master_pk, derivation_path, did, tpk) {
// Detect and return the invalid share id(s)
let mut invalid = vec![];
/// Combines all the given shares into an encrypted key.
///
/// If the result is Ok(), the returned key is guaranteed to be valid.
/// Returns the combined key, if it is valid.
/// Does not take the nodes individual public keys as input.
Review comment (Member): nit: nodes -> nodes' (this typo comes from my own commit 😄 )

pub fn combine_all(
nodes: &[(NodeIndex, EncryptedKeyShare)],
reconstruction_threshold: usize,
master_pk: &G2Affine,
tpk: &TransportPublicKey,
derivation_path: &DerivationPath,
did: &[u8],
) -> Result<Self, EncryptedKeyCombinationError> {
let c = Self::combine_unchecked(nodes, reconstruction_threshold)?;
if c.is_valid(master_pk, derivation_path, did, tpk) {
Ok(c)
} else {
Err(EncryptedKeyCombinationError::InvalidShares)
}
}

for (node_id, node_pk, node_eks) in nodes {
if !node_eks.is_valid(master_pk, node_pk, derivation_path, did, tpk) {
invalid.push(*node_id);
/// Filters the valid shares from the given ones, and combines them into a valid key, if possible.
/// The returned key is guaranteed to be valid.
/// Returns an error if not sufficient shares are given or if not sufficient valid shares are given.
/// Takes also the node's individual public keys as input, which means the individual public keys
Review comment (Member): nit: node's -> nodes' (again, from my own commit)

/// must be available: calculating them is comparatively expensive. Note that combine_all does not
/// take the individual public keys as input.
pub fn combine_valid_shares(
nodes: &[(NodeIndex, G2Affine, EncryptedKeyShare)],
reconstruction_threshold: usize,
master_pk: &G2Affine,
tpk: &TransportPublicKey,
derivation_path: &DerivationPath,
did: &[u8],
) -> Result<Self, EncryptedKeyCombinationError> {
if nodes.len() < reconstruction_threshold {
return Err(EncryptedKeyCombinationError::InsufficientShares);
}

// Take the first reconstruction_threshold shares which pass validity check
let mut valid_shares = Vec::with_capacity(reconstruction_threshold);

for (node_index, node_pk, node_eks) in nodes.iter() {
if node_eks.is_valid(master_pk, node_pk, derivation_path, did, tpk) {
valid_shares.push((*node_index, node_eks.clone()));

if valid_shares.len() >= reconstruction_threshold {
break;
Review comment (Member): Breaking early was something I could not do with the functional approach I used in the draft PR, so this approach here is certainly better.

}
}
}

return Err(EncryptedKeyCombinationError::InvalidKeyShares(invalid));
if valid_shares.len() < reconstruction_threshold {
return Err(EncryptedKeyCombinationError::InsufficientValidKeyShares);
}

Ok(c)
let c = Self::combine_unchecked(&valid_shares, reconstruction_threshold)?;

// If sufficient shares are available, and all were valid (which we already checked)
// then the resulting signature should always be valid as well.
//
// This check is mostly to catch the case where the reconstruction_threshold was
// somehow incorrect.
if c.is_valid(master_pk, derivation_path, did, tpk) {
Ok(c)
} else {
Err(EncryptedKeyCombinationError::InvalidShares)
Review comment (Member): Would a dedicated error variant make sense here, so that we can distinguish between actual invalid shares and the case where the reconstruction threshold was incorrect?

}
Review comment (Member) on lines +380 to +386:

I'm wondering what could cause the reconstruction_threshold to be incorrect.

If I understand correctly, this would be an implementation error in our code, right? According to the spec for combine_encrypted_key_shares, the reconstruction threshold passed to EncryptedKey::combine_valid_shares is

let reconstruction_threshold = transcript_data_from_store.public_coefficients().coefficients.len();

so it will come from the transcript that Consensus loads. The question is whether that is good enough to skip the validity check and thus save two multi-pairings.

But thinking about it again, we are here in combine_valid_shares, which is only called if the optimistic combination with combine_all fails (see spec). So combine_valid_shares should be called very infrequently, and maybe the additional validity check is somewhat irrelevant after all.

}
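The review discussion above notes that, per the spec, combine_valid_shares is only meant to be called when the optimistic combine_all fails. A caller-side sketch of that fallback pattern (purely illustrative, not part of this PR; the helper name combine_with_fallback and the assumption that NodeIndex and the other listed types are all exported from the vetkd crate are assumptions):

```rust
use ic_crypto_internal_bls12_381_vetkd::{
    DerivationPath, EncryptedKey, EncryptedKeyCombinationError, EncryptedKeyShare, G2Affine,
    NodeIndex, TransportPublicKey,
};

/// Hypothetical caller-side helper: combine optimistically with `combine_all`
/// (which does not need the per-node public keys), and only fall back to the
/// more expensive `combine_valid_shares` if some share turned out to be bad.
fn combine_with_fallback(
    nodes: &[(NodeIndex, G2Affine, EncryptedKeyShare)],
    threshold: usize,
    master_pk: &G2Affine,
    tpk: &TransportPublicKey,
    path: &DerivationPath,
    did: &[u8],
) -> Result<EncryptedKey, EncryptedKeyCombinationError> {
    // Drop the per-node public keys for the optimistic path.
    let all: Vec<(NodeIndex, EncryptedKeyShare)> = nodes
        .iter()
        .map(|(idx, _pk, eks)| (*idx, eks.clone()))
        .collect();

    match EncryptedKey::combine_all(&all, threshold, master_pk, tpk, path, did) {
        Ok(key) => Ok(key),
        // The combined key did not verify: retry, this time filtering out
        // invalid shares (this needs the per-node public keys).
        Err(EncryptedKeyCombinationError::InvalidShares) => {
            EncryptedKey::combine_valid_shares(nodes, threshold, master_pk, tpk, path, did)
        }
        Err(e) => Err(e),
    }
}
```

Only the fallback path needs the per-node public keys and the per-share validity checks, which is what makes combine_all the cheaper default.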

/// Check if this encrypted key is valid with respect to the provided derivation path
@@ -345,7 +395,7 @@ impl EncryptedKey {
tpk: &TransportPublicKey,
) -> bool {
let dpk = DerivedPublicKey::compute_derived_key(master_pk, derivation_path);
let msg = augmented_hash_to_g1(&dpk.pt, did);
let msg = G1Affine::augmented_hash(&dpk.pt, did);
check_validity(&self.c1, &self.c2, &self.c3, tpk, &dpk.pt, &msg)
}

@@ -418,7 +468,7 @@ impl EncryptedKeyShare {

let r = Scalar::random(rng);

let msg = augmented_hash_to_g1(&dpk, did);
let msg = G1Affine::augmented_hash(&dpk, did);

let c1 = G1Affine::from(G1Affine::generator() * &r);
let c2 = G2Affine::from(G2Affine::generator() * &r);
@@ -439,7 +489,7 @@
let dpki = DerivedPublicKey::compute_derived_key(master_pki, derivation_path);
let dpk = DerivedPublicKey::compute_derived_key(master_pk, derivation_path);

let msg = augmented_hash_to_g1(&dpk.pt, did);
let msg = G1Affine::augmented_hash(&dpk.pt, did);

check_validity(&self.c1, &self.c2, &self.c3, tpk, &dpki.pt, &msg)
}