
Input-Output format of UINT64 types #754

Open
ckeshava opened this issue Sep 30, 2024 · 0 comments

Inconsistency in str input format:

The UINT32 type accepts str inputs in base-10, whereas the UINT64 type accepts str inputs in base-16 only. This is enforced by the differing HEX_REGEX and .isdigit() checks in the respective UInt<bits>.from_value() methods.

Furthermore, the documentation states that only the UINT64 type is represented as a str, to avoid loss of precision. This raises the question: why does the UINT32 type accept str inputs at all?
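To make the divergence concrete, here is a minimal, self-contained sketch of the two validation paths described above. The function names and the regex are modeled on xrpl-py's binary codec but are simplified stand-ins, not the library's actual implementations:

```python
import re

# Illustrative stand-in for xrpl-py's hex check on UInt64 str inputs.
_HEX_REGEX = re.compile("^[a-fA-F0-9]{1,16}$")

def uint32_from_value(value):
    """UInt32 path: a str is accepted via an .isdigit() check, i.e. base-10."""
    if isinstance(value, str) and value.isdigit():
        value = int(value)  # interpreted as a decimal number
    if isinstance(value, int):
        return value.to_bytes(4, byteorder="big")
    raise ValueError(f"Cannot construct UInt32 from {value!r}")

def uint64_from_value(value):
    """UInt64 path: a str must match a hex regex, i.e. base-16 only."""
    if isinstance(value, int):
        return value.to_bytes(8, byteorder="big")
    if isinstance(value, str) and _HEX_REGEX.fullmatch(value):
        return bytes.fromhex(value.zfill(16))  # interpreted as hexadecimal
    raise ValueError(f"Cannot construct UInt64 from {value!r}")

# The same string means different numbers on the two paths:
print(int.from_bytes(uint32_from_value("10"), "big"))  # 10 (base-10)
print(int.from_bytes(uint64_from_value("10"), "big"))  # 16 (base-16)
```

The string "10" decodes to ten on the UInt32 path but sixteen on the UInt64 path, which is exactly the footgun the issue describes.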

Inconsistency in the .to_json() method output:

- UINT32's .to_json() returns an int in base-10.
- UINT64's .to_json() returns a str in base-16. While I understand the need for a str type (to maintain precision), why is base-16 the preferred output format?
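A simplified sketch of the two divergent .to_json() outputs (again modeled on xrpl-py's behavior, not copied from it). The str return for UInt64 is defensible because JSON numbers are typically parsed as IEEE-754 doubles, which lose exactness above 2**53; the open question is why that str is hex rather than decimal:

```python
def uint32_to_json(buf: bytes) -> int:
    # UInt32 serializes to a plain base-10 JSON number.
    return int.from_bytes(buf, byteorder="big")

def uint64_to_json(buf: bytes) -> str:
    # UInt64 serializes to an uppercase base-16 string, avoiding
    # precision loss in JSON consumers that use IEEE-754 doubles.
    return buf.hex().upper()

print(uint32_to_json((255).to_bytes(4, "big")))  # 255
print(uint64_to_json((255).to_bytes(8, "big")))  # 00000000000000FF
```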

Differences between xrpl.js and xrpl-py in handling the UINT64 type

Comparing xrpl-py's UInt64.from_value() method against this issue in xrpl.js: the JavaScript SDK accepts only hexadecimal strings, whereas the Python SDK accepts both base-10 ints and base-16 str inputs.
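One way to paper over the cross-SDK gap on the caller's side is to normalize either input form that xrpl-py accepts into the canonical hex-string form xrpl.js expects. The helper below is hypothetical, not part of either SDK:

```python
import re

_HEX_REGEX = re.compile("^[a-fA-F0-9]{1,16}$")

def to_canonical_hex(value) -> str:
    """Hypothetical helper: normalize a base-10 int or a base-16 str
    (the two forms xrpl-py accepts for UInt64) into the 16-character
    uppercase hex string form that xrpl.js accepts."""
    if isinstance(value, int):
        if not 0 <= value < 2 ** 64:
            raise ValueError("value out of range for UInt64")
        return f"{value:016X}"
    if isinstance(value, str) and _HEX_REGEX.fullmatch(value):
        return value.upper().zfill(16)
    raise ValueError(f"Cannot interpret {value!r} as a UInt64")

print(to_canonical_hex(255))   # 00000000000000FF
print(to_canonical_hex("ff"))  # 00000000000000FF
```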

Examples of these inconsistencies can be observed in this branch
