Merge branch 'main' into ss/readme-version-compatibility
Savio-Sou authored Jan 15, 2025
2 parents 874d4eb + b6bbbd1 commit e6666ba
Showing 17 changed files with 253 additions and 1,345 deletions.
28 changes: 24 additions & 4 deletions .github/workflows/test.yml
@@ -8,15 +8,36 @@ on:

env:
CARGO_TERM_COLOR: always
MINIMUM_NOIR_VERSION: v0.37.0

jobs:
noir-version-list:
name: Query supported Noir versions
runs-on: ubuntu-latest
outputs:
noir_versions: ${{ steps.get_versions.outputs.versions }}

steps:
- name: Checkout sources
id: get_versions
run: |
# gh returns the Noir releases in reverse chronological order so we keep all releases published after the minimum supported version.
VERSIONS=$(gh release list -R noir-lang/noir --exclude-pre-releases --json tagName -q 'map(.tagName) | index(env.MINIMUM_NOIR_VERSION) as $index | if $index then .[0:$index+1] else [env.MINIMUM_NOIR_VERSION] end')
echo "versions=$VERSIONS"
echo "versions=$VERSIONS" >> $GITHUB_OUTPUT
env:
GH_TOKEN: ${{ github.token }}
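The version-selection step above relies on `gh` returning releases newest-first: the jq filter keeps every release up to and including the minimum supported version, and falls back to just the minimum when that tag is absent from the list. A minimal Python sketch of the same logic (the tag values here are hypothetical, not from the source):

```python
def supported_versions(releases, minimum):
    """Mimic the jq filter: releases are newest-first; keep every
    release down to and including the minimum supported version,
    falling back to [minimum] if that tag is not in the list."""
    if minimum in releases:
        return releases[: releases.index(minimum) + 1]
    return [minimum]

# hypothetical release list, newest first
tags = ["v0.39.0", "v0.38.0", "v0.37.0", "v0.36.0"]
print(supported_versions(tags, "v0.37.0"))
```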

test:
needs: [noir-version-list]
name: Test on Nargo ${{matrix.toolchain}}
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
toolchain: [nightly, 0.37.0]
toolchain: ${{ fromJson( needs.noir-version-list.outputs.noir_versions )}}
include:
- toolchain: nightly
steps:
- name: Checkout sources
uses: actions/checkout@v4
@@ -38,8 +59,7 @@ jobs:
- name: Install Nargo
uses: noir-lang/[email protected]
with:
toolchain: 0.37.0

toolchain: ${{ env.MINIMUM_NOIR_VERSION }}
- name: Run formatter
run: nargo fmt --check

@@ -64,4 +84,4 @@ jobs:
fi
env:
# We treat any cancelled, skipped or failing jobs as a failure for the workflow as a whole.
FAIL: ${{ contains(needs.*.result, 'failure') || contains(needs.*.result, 'cancelled') || contains(needs.*.result, 'skipped') }}
FAIL: ${{ contains(needs.*.result, 'failure') || contains(needs.*.result, 'cancelled') || contains(needs.*.result, 'skipped') }}
89 changes: 89 additions & 0 deletions CONTRIBUTING.md
@@ -0,0 +1,89 @@
# Contributing

Thank you for your interest in contributing! We value your contributions. 🙏

This guide will discuss how the team handles [Commits](#commits), [Pull Requests](#pull-requests), [Releases](#releases), the [Changelog](#changelog), and [Response time](#response-time).

**Note:** We won't force external contributors to follow this verbatim, but following these guidelines definitely helps us in accepting your contributions.

## Commits

We want to keep our commits small and focused. This allows for easily reviewing individual commits and/or splitting up pull requests when they grow too big. Additionally, this allows us to merge smaller changes quicker.

When committing, it's often useful to use the `git add -p` workflow to decide on what parts of the changeset to stage for commit.

## Pull Requests

Before you create a pull request, search for any issues related to the change you are making. If none exist already, create an issue that thoroughly describes the problem that you are trying to solve. These are used to inform reviewers of the original intent and should be referenced via the pull request template.

Pull Requests should be focused on the specific change they are working towards. If prerequisite work is required to complete the original pull request, that work should be submitted as a separate pull request.

This strategy avoids scenarios where pull requests grow too large/out-of-scope and don't get proper reviews—we want to avoid "LGTM, I trust you" reviews.

### Conventional Commits

We use [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0/) naming conventions for PRs, which help with releases and changelogs. Please use the following format for PR titles:

```
<type>[optional scope]: <description>
```

Generally, we want to only use the three primary types defined by the specification:

- `feat:` - This should be the most used type, as most of the work we do in the project is new features. Commits using this type will always show up in the Changelog.
- `fix:` - When fixing a bug, we should use this type. Commits using this type will always show up in the Changelog.
- `chore:` - The least used type. These are not included in the Changelog unless they are breaking changes, but they remain useful for an understandable commit history.

#### Breaking Changes

Annotating BREAKING CHANGES is extremely important to our release process and versioning. To mark a commit as breaking, we add the `!` character after the type, but before the colon. For example:

```
feat!: Rename nargo build to nargo check (#693)
```

```
feat(nargo)!: Enforce minimum rustc version
```

#### Scopes

Scopes significantly improve the Changelog, so we want to use a scope whenever possible. If we are only changing one part of the project, we can use the name of the crate, like `(nargo)` or `(noirc_driver)`. If a change touches multiple parts of the codebase, there might be a better scope, such as using `(syntax)` for new language features.

```
feat(nargo): Add support for wasm backend (#234)
```

```
feat(syntax): Implement String data type (#123)
```

### Typos and other small changes

Significant changes, like new features or important bug fixes, typically have a more pronounced impact on the project’s overall development. For smaller fixes, such as typos, we encourage you to report them as Issues instead of opening PRs. This approach helps us manage our resources effectively and ensures that every change contributes meaningfully to the project. PRs involving such smaller fixes will likely be closed and incorporated in PRs authored by the core team.

### Reviews

For any repository in the organization, we require code review & approval by **one** team member before the changes are merged, as enforced by GitHub branch protection. Non-breaking pull requests may be merged at any time. Breaking pull requests should only be merged when the team has general agreement on the changes and is preparing a breaking release.

### Documentation

Breaking changes must be documented, either by adding/updating existing docs or the README.md.

## Releases

Releases are managed by [Release Please](https://github.com/googleapis/release-please) which runs in a GitHub Action whenever a commit is made on the master branch.

Release Please parses Conventional Commit messages and opens (or updates) a pull request against the master branch that contains updates to the versions & Changelog within the project. If it doesn't detect any breaking change commits, it will only increment the "patch" version; however, if it detects a breaking change commit, it will increment the "minor" version number to indicate a breaking release.

When we are ready to release the version, we approve and squash merge the release pull request into master. Release Please will detect this merge and generate the appropriate tags for the release. Additional release steps may be triggered inside the GitHub Action to automate other parts of the release process.

There is no strict release cadence, but a new release is usually cut every 1 to 2 months.

## Changelog

The Changelog is automatically managed by Release Please and informed by the Conventional Commits (as discussed above).

## Response time

The team will respond to issues and PRs within 1 week from submission.
2 changes: 1 addition & 1 deletion Nargo.toml
@@ -5,4 +5,4 @@ authors = [""]
compiler_version = ">=0.37.0"

[dependencies]
noir_sort = {tag = "v0.2.0", git = "https://github.com/noir-lang/noir_sort"}
noir_sort = {tag = "v0.2.2", git = "https://github.com/noir-lang/noir_sort"}
3 changes: 2 additions & 1 deletion src/_comparison_tools/bounds_checker.nr
@@ -26,7 +26,8 @@ in this case, i == M
* cost = 3 gates + 2 gates per iteration
**/
pub fn get_validity_flags<let N: u32>(boundary: u32) -> [Field; N] {
let flags: [Field; N] = __get_validity_flags(boundary);
//@Safety: The constraining is happening inside get_validity_flags_inner
let flags: [Field; N] = unsafe { __get_validity_flags(boundary) };
get_validity_flags_inner(boundary, flags)
}

15 changes: 12 additions & 3 deletions src/_comparison_tools/lt.nr
@@ -14,7 +14,8 @@ pub unconstrained fn get_lte_predicate_large(x: Field, y: Field) -> bool {
}

pub fn lte_field_240_bit(x: Field, y: Field) -> bool {
let predicate = get_lte_predicate_large(x, y);
//@Safety: check the comments below
let predicate = unsafe { get_lte_predicate_large(x, y) };
let delta = y as Field - x as Field;

// (x - y) * predicate
@@ -23,6 +24,8 @@ pub fn lte_field_240_bit(x: Field, y: Field) -> bool {
// (y - x) * p + (1 - p) * (x - y + 1)
// (y - x) * p + x - y + 1 + p * (y - x)
let lt_parameter = 2 * (predicate as Field) * delta - predicate as Field - delta + 1;
// checks that the bit length of lt_parameter is 240
// i.e. checks the sign of lt_parameter
lt_parameter.assert_max_bit_size::<240>();

predicate
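The range check above is what makes the unconstrained predicate hint trustworthy: with `p` in `{0, 1}` and `delta = y - x`, the expression `2*p*delta - p - delta + 1` simplifies to `y - x` when `p = 1` and to `x - y + 1` when `p = 0`, both of which are non-negative only when the claimed comparison is honest. A dishonest hint makes the value negative, which wraps to a huge field element and fails the 240-bit check. A small integer sketch of that identity:

```python
def lte_parameter(x, y, claimed_lte):
    """Compute 2*p*delta - p - delta + 1 with delta = y - x and
    p in {0, 1}; non-negative iff the claimed x <= y is honest."""
    delta = y - x
    p = int(claimed_lte)
    return 2 * p * delta - p - delta + 1

# honest hints give a small non-negative value (range check passes)
assert lte_parameter(3, 7, True) == 4   # equals y - x
assert lte_parameter(7, 3, False) == 5  # equals x - y + 1

# dishonest hints go negative (wraps in the field, range check fails)
assert lte_parameter(7, 3, True) == -4
assert lte_parameter(3, 7, False) == -3
```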
@@ -40,18 +43,24 @@ pub fn assert_lte_240_bit(x: Field, y: Field) {
}

pub fn lt_field_16_bit(x: Field, y: Field) -> bool {
let predicate = get_lt_predicate_f(x, y);
//@Safety: check the comments below
let predicate = unsafe { get_lt_predicate_f(x, y) };
let delta = y as Field - x as Field;
let lt_parameter = 2 * (predicate as Field) * delta - predicate as Field - delta;
// checks that the bit length of lt_parameter is 16
// i.e. checks the sign of lt_parameter
lt_parameter.assert_max_bit_size::<16>();

predicate
}

pub fn lt_field_8_bit(x: Field, y: Field) -> bool {
let predicate = get_lt_predicate_f(x, y);
//@Safety: check the comments below
let predicate = unsafe { get_lt_predicate_f(x, y) };
let delta = y as Field - x as Field;
let lt_parameter = 2 * (predicate as Field) * delta - predicate as Field - delta;
// checks that the bit length of lt_parameter is 8
// i.e. checks the sign of lt_parameter
lt_parameter.assert_max_bit_size::<8>();

predicate
5 changes: 4 additions & 1 deletion src/_string_tools/slice_field.nr
@@ -64,8 +64,11 @@ unconstrained fn __slice_200_bits_from_field(f: Field) -> (Field, Field, bool) {
}

pub fn slice_200_bits_from_field(f: Field) -> Field {
let (lo, hi, borrow) = __slice_200_bits_from_field(f);
//@Safety: check the comments below
let (lo, hi, borrow) = unsafe { __slice_200_bits_from_field(f) };
// checks that lo and hi are the correct slices of f
assert(hi * TWO_POW_200 + lo == f);
// checks that lo and hi are the correct bit sizes
lo.assert_max_bit_size::<200>();
hi.assert_max_bit_size::<56>();
let lo_diff = PLO_200_felt - lo + (borrow as Field * TWO_POW_200);
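The pattern in this hunk is the standard way to use an unconstrained hint safely: compute the slices out of circuit, then constrain the results with a recomposition check and bit-size checks. A minimal integer sketch of those three constraints (the `borrow`/`PLO_200_felt` comparison logic that follows is not modeled here):

```python
TWO_POW_200 = 1 << 200

def slice_200_bits(f):
    """Out-of-circuit hint: split f into its low 200 bits and the rest."""
    return f & (TWO_POW_200 - 1), f >> 200

def constrain_slices(f, lo, hi):
    """The in-circuit checks that make the hint sound."""
    assert hi * TWO_POW_200 + lo == f  # slices recompose to f
    assert lo < TWO_POW_200            # lo fits in 200 bits
    assert hi < (1 << 56)              # hi fits in 56 bits

f = (123 << 200) | 456
lo, hi = slice_200_bits(f)
constrain_slices(f, lo, hi)
```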
24 changes: 17 additions & 7 deletions src/_string_tools/slice_packed_field.nr
@@ -607,7 +607,8 @@ unconstrained fn __divmod(numerator: Field, denominator: Field) -> (Field, Field
* we know the quotient will fit into a 14 bit range check which will save us some fractional gates
**/
fn divmod_31(numerator: Field) -> (Field, Field) {
let (quotient, remainder) = __divmod(numerator, 31);
//@Safety: we check the bit lengths of qf and rf and their relation to the numerator with assertions later
let (quotient, remainder) = unsafe { __divmod(numerator, 31) };

let qf = quotient as Field;
let rf = remainder as Field;
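Per the safety comment, the soundness of the unconstrained `__divmod` hint comes from assertions (truncated out of this hunk) on `qf` and `rf` and their relation to the numerator. A plausible integer sketch of that shape, assuming the usual recomposition plus range checks and the 14-bit quotient bound mentioned in the doc comment:

```python
def checked_divmod_31(numerator):
    """Out-of-circuit divmod hint plus the constraints that make it sound."""
    q, r = divmod(numerator, 31)    # untrusted hint
    assert q * 31 + r == numerator  # recomposition
    assert 0 <= r < 31              # remainder in range
    assert q < (1 << 14)            # quotient fits a 14-bit range check
    return q, r

print(checked_divmod_31(1000))
```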
@@ -641,14 +642,17 @@ unconstrained fn decompose(val: Field) -> [Field; 16] {
pub fn get_last_limb_path<let OutputFields: u32>(last_limb_index: Field) -> [Field; OutputFields] {
// TODO we offset by 1 explain why (0 byte length produces 0 - 1 which = invalid array index. we just add 1 and increase array length by 1 to compensate)
let path = LAST_LIMB_PATH[last_limb_index + 1]; // 2
let path_valid_bits = decompose(path);
//@Safety: check the comments below
let path_valid_bits = unsafe { decompose(path) };
let mut path_valid_sum: Field = 0;
let mut path_valid_output: [Field; OutputFields] = [0; OutputFields];
for i in 0..OutputFields {
// we check that the path valid bits are binary
assert(path_valid_bits[i] * path_valid_bits[i] - path_valid_bits[i] == 0);
path_valid_sum += (path_valid_bits[i] * (1 << i as u8) as Field);
path_valid_output[i] = path_valid_bits[i];
}
// we check that the path valid bits sum to the path
assert(path_valid_sum == path);
path_valid_output
}
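The loop above is the classic bit-decomposition constraint pair: `b*b - b == 0` forces each hinted value to be 0 or 1, and the weighted sum forces the bits to recompose to the hinted `path` value. An integer sketch of the same checks:

```python
def constrain_path_bits(path, bits):
    """In-circuit style checks on an untrusted bit decomposition."""
    total = 0
    for i, b in enumerate(bits):
        assert b * b - b == 0   # each hinted value must be 0 or 1
        total += b * (1 << i)   # little-endian weighted sum
    assert total == path        # bits must recompose to path

constrain_path_bits(13, [1, 0, 1, 1])  # 1 + 4 + 8 == 13
```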
@@ -659,7 +663,8 @@ pub fn get_last_limb_path<let OutputFields: u32>(last_limb_index: Field) -> [Fie
* @details cost 46 gates
**/
pub fn slice_field(f: Field, num_bytes: Field) -> (Field, Field) {
let chunks = __slice_field(f, num_bytes);
//@Safety: we check the bit lengths of the chunks with assertions later
let chunks = unsafe { __slice_field(f, num_bytes) };
chunks[0].assert_max_bit_size::<8>(); // 1.25 gates
chunks[1].assert_max_bit_size::<16>(); // 1.5 gates
chunks[2].assert_max_bit_size::<32>(); // 1.75 gates
@@ -863,7 +868,9 @@ mod test {
// let start_byte = 26;
let num_bytes = 0;
let start_byte: u32 = 0;
let mut expected_slices: [Field; 3] = build_slices_for_test(text, start_byte, num_bytes);
//@Safety: this is a test
let mut expected_slices: [Field; 3] =
unsafe { build_slices_for_test(text, start_byte, num_bytes) };
let result_slices: [Field; 3] =
slice_fields(slices, start_byte as Field, num_bytes as Field);
assert(result_slices == expected_slices);
@@ -895,16 +902,19 @@
for j in 0..18 {
let start_byte: u32 = byte_positions[j];
let mut expected_slices: [Field; 3] =
build_slices_for_test(text, start_byte, num_bytes);
//@Safety: this is a test
unsafe { build_slices_for_test(text, start_byte, num_bytes) };
let result_slices: [Field; 3] =
slice_fields(slices, start_byte as Field, num_bytes as Field);
//@Safety: this is a test
unsafe { slice_fields(slices, start_byte as Field, num_bytes as Field) };
assert(result_slices == expected_slices);
}

for j in 0..18 {
let start_byte: u32 = text.len() - num_bytes - byte_positions[j];
let mut expected_slices: [Field; 3] =
build_slices_for_test(text, start_byte, num_bytes);
//@Safety: this is a test
unsafe { build_slices_for_test(text, start_byte, num_bytes) };
let result_slices: [Field; 3] =
slice_fields(slices, start_byte as Field, num_bytes as Field);
assert(result_slices == expected_slices);
4 changes: 3 additions & 1 deletion src/_string_tools/sum_bytes_into_field.nr
@@ -185,10 +185,12 @@ fn sum_var_bytes_into_field<let N: u32>(
body_index: Field,
num_bytes: Field,
) -> Field {
let path = get_path(num_bytes); // 5 gates
//@Safety: check the comments below
let path = unsafe { get_path(num_bytes) }; // 5 gates
let path_f: [Field; 5] =
[path[0] as Field, path[1] as Field, path[2] as Field, path[3] as Field, path[4] as Field];

// checks that path_f is the binary representation of num_bytes
assert(
path_f[0] + path_f[1] * 2 + path_f[2] * 4 + path_f[3] * 8 + path_f[4] * 16
== num_bytes as Field,
2 changes: 1 addition & 1 deletion src/get_literal.nr
@@ -13,7 +13,7 @@ global LITERAL_OFFSET_SHIFT: [Field; 6] =
* @brief a JSON "literal" type has 3 states: "true", "false", "null".
* As such we can't directly convert to a bool and cover all possible cases
**/
struct JSONLiteral {
pub struct JSONLiteral {
value: Field,
}

6 changes: 4 additions & 2 deletions src/get_string.nr
@@ -30,7 +30,8 @@ fn process_escape_sequences<let N: u32>(input: BoundedVec<u8, N>) -> BoundedVec<
+ (1 - is_escape_sequence) * character as Field;

written_byte = written_byte * (1 - skip) + cached_byte * skip;
let written_byte_u8 = to_u8(written_byte);
//@Safety: we assert that the casting is done correctly
let written_byte_u8 = unsafe { to_u8(written_byte) };
assert(written_byte_u8 as Field == written_byte);

result[result_ptr] = written_byte_u8;
Expand All @@ -41,7 +42,8 @@ fn process_escape_sequences<let N: u32>(input: BoundedVec<u8, N>) -> BoundedVec<
}

let written_byte: Field = character as Field * (1 - skip) + cached_byte * skip;
let written_byte_u8 = to_u8(written_byte);
//@Safety: we assert that the casting is done correctly
let written_byte_u8 = unsafe { to_u8(written_byte) };
assert(written_byte_u8 as Field == written_byte);
result[result_ptr] = written_byte_u8;
result_ptr += 1;
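Two patterns recur in this hunk: a branchless select (`a * (1 - flag) + b * flag`, valid since `flag` is 0 or 1) used to pick between the raw character and the cached escape byte, and a checked narrowing cast where the `assert` after the unsafe `to_u8` rules out truncation. An integer sketch of both:

```python
def select(a, b, flag):
    """Branchless mux: returns a when flag == 0, b when flag == 1."""
    return a * (1 - flag) + b * flag

def checked_u8(value):
    """Mirror of the unsafe cast plus the assert that rules out
    truncation: the low byte must equal the original value."""
    as_u8 = value & 0xFF
    assert as_u8 == value
    return as_u8

assert select(ord("n"), ord("\n"), 0) == ord("n")
assert select(ord("n"), ord("\n"), 1) == ord("\n")
assert checked_u8(65) == 65
```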
