
Implement ArrayDecoding #12

Merged · 4 commits · Sep 18, 2021

Conversation

daxpedda (Contributor)

As discussed on Zulip. Any improvements to the documentation are welcome.

@tarcieri (Member)

On second thought, I'm not sure this is needed because I think what you're trying to do here can already be done by the existing ArrayEncoding trait.

You can write code that's generic with respect to the size of the input array like this:

use crypto_bigint::{ArrayEncoding, ByteArray, UInt};

fn parse_uint<const LIMBS: usize>(bytes: ByteArray<UInt<LIMBS>>) -> UInt<LIMBS>
where
    UInt<LIMBS>: ArrayEncoding
{
    UInt::from_be_byte_array(bytes)
}

@tarcieri (Member)

Okay, I see the problem now: the compiler can't infer LIMBS from the size of the GenericArray, which is what you're trying to do.

This looks OK then but I'll leave some line notes.
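To make the inference point concrete, here is a minimal, self-contained sketch (not the real crypto_bigint API; the `Uint` stand-in and the 2-limb impl are illustrative assumptions). Because the decoding trait is implemented on the byte array type itself, its associated `Output` type, and therefore the limb count, is determined by the concrete array at the call site, so no explicit `LIMBS` parameter is needed:

```rust
// Stand-in for crypto_bigint::UInt<LIMBS> (illustrative, not the real type).
#[derive(Debug, PartialEq)]
struct Uint<const LIMBS: usize>([u64; LIMBS]);

// Hypothetical analogue of the ArrayDecoding idea: the trait lives on the
// byte array, so `Output` fixes the limb count for each array size.
trait ArrayDecoding {
    type Output;
    fn into_uint_be(self) -> Self::Output;
}

// A 16-byte big-endian array decodes to a 2-limb (2 x u64) integer.
impl ArrayDecoding for [u8; 16] {
    type Output = Uint<2>;
    fn into_uint_be(self) -> Uint<2> {
        let mut limbs = [0u64; 2];
        // Big-endian input: the first 8-byte chunk is the high limb.
        for (i, chunk) in self.chunks_exact(8).enumerate() {
            limbs[1 - i] = u64::from_be_bytes(chunk.try_into().unwrap());
        }
        Uint(limbs)
    }
}

fn main() {
    let mut bytes = [0u8; 16];
    bytes[15] = 42; // least-significant byte
    // No turbofish needed: LIMBS = 2 is inferred from the [u8; 16] type.
    let n = bytes.into_uint_be();
    assert_eq!(n, Uint([42u64, 0]));
}
```

Contrast this with the earlier `parse_uint::<LIMBS>` formulation, where the caller must name `LIMBS` explicitly because nothing ties the array length back to the limb count.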

Review comments on src/uint/array.rs (outdated, resolved)
@tarcieri tarcieri merged commit 94c24cf into RustCrypto:master Sep 18, 2021
@tarcieri (Member)

Thank you!

@daxpedda daxpedda deleted the array-decoding branch September 18, 2021 00:33