Add Internal SHAKE interfaces and their NIST test vectors. #572
Conversation
Integrate SHA3 low-level APIs from OpenSSL and high-level APIs following the SHA2 API structure. Use ./crypto/crypto_test --gtest_filter=DigestTest.TestVectors to run the test vectors and ./tool/bssl speed -filter SHA3-256 for performance evaluation.
This reverts commit 039bf06.
This reverts commit e314ab9.
Address review comments, replace test vectors with NIST values, and add generated-src
Co-authored-by: dkostic <[email protected]> Co-authored-by: Andrew Hopkins <[email protected]>
crypto/fipsmodule/sha/internal.h
Outdated
// The data block size increases when the capacity decreases.
// The data buffer should have at least the maximum
// block size bytes to fit any digest length.
#define SHA3_MIN_CAPACITY_BYTES (KECCAK1600_WIDTH / 8 - SHAKE128_BLOCKSIZE)
A few things before we continue:
- is this value Keccak specific or SHA3/SHAKE specific? If Keccak specific then I'd prefer Keccak in the name
- is there a simpler way to define what capacity is?
- can we define capacity in the code comment before explaining what increases/decreases?
crypto/fipsmodule/sha/sha3.c
Outdated
KECCAK1600_CTX ctx;

// The SHAKE block size depends on the security level of the algorithm only
// It is independent of the digest size
Nit: digest size or output size? I think SHAKE is XOF not MD, right?
True. Sorry for this mistake! Fixed.
crypto/fipsmodule/sha/sha3.c
Outdated
KECCAK1600_CTX ctx;

// The SHAKE block size depends on the security level of the algorithm only
// It is independent of the digest size
same as above
Thanks.
crypto/fipsmodule/sha/sha3.c
Outdated
// digest length (for SHA3) or the security level (for SHAKE)
if (pad == SHA3_PAD_CHAR) {
  block_size = SHA3_BLOCKSIZE(bit_len);
} else {
What if pad is some value that's different from both SHA3_PAD_CHAR and SHAKE_PAD_CHAR?
I added a return 0 case, since a wrong result would be produced otherwise. Sorry for this.
crypto/fipsmodule/sha/sha3_test.cc
Outdated
uint32_t digest_length = out_len_ / 8;
uint8_t *digest = new uint8_t[digest_length];

#if !defined(OPENSSL_ANDROID)
didn't we get rid of these in the SHA3 tests? Can we do it here as well?
I will keep the branch updated with main. Sorry for this.
crypto/fipsmodule/sha/internal.h
Outdated
// In the sponge construction, the width of the underlying
// function minus the rate (the block size in bits).
// The |SHA3_MIN_CAPACITY_BYTES| corresponds to the
// lowest security level capacity for SHA3/SHAKE.
Suggested change — replace:
// In the sponge construction, the width of the underlying
// function minus the rate (the block size in bits).
// The |SHA3_MIN_CAPACITY_BYTES| corresponds to the
// lowest security level capacity for SHA3/SHAKE.
with:
// In the sponge construction, capacity is the width of the underlying function
// minus the rate. In case of SHA3 and SHAKE, the underlying function is Keccak
// and the rate is the block size of a specific instantiation of SHA3 or SHAKE.
// |SHA3_MIN_CAPACITY_BYTES| corresponds to the minimum capacity taking into
// account all the instantiations of SHA3 and SHAKE we implement: SHA3-256,
// SHAKE128, SHAKE256. Since the block size of SHAKE128 is the largest among
// those, we compute the minimum capacity by subtracting the SHAKE128 block size
// from the Keccak width.
Thank you so much. I have fixed it. I added the source for the capacity definition, NIST FIPS 202:
// [NIST FIPS202] Capacity: In the sponge construction,
crypto/fipsmodule/sha/internal.h
Outdated
// The data blocks increase when the capacity decreases.
// The data buffer should have at least the maximum number of
// block size bytes to fit any SHA3/SHAKE block length.
these lines seem misplaced, so I would delete them from here.
I just fixed that! Sorry.
48d9e86 to 7787a0f
crypto/fipsmodule/sha/sha3_test.cc
Outdated
@@ -83,8 +83,53 @@ class SHA3TestVector {
  }

  void NISTTestVectors_SHAKE128() const {
    uint32_t digest_length = out_len_ / 8;
    uint8_t *digest = new uint8_t[digest_length];
Can we use std::unique_ptr<uint8_t[]> digest(new uint8_t[digest_length]); here and in the previous occurrences? Static analysis is seeing it as a resource leak in the previous occurrences. I saw the unique_ptr usage in other test files.
Issues:
CryptoAlg-1171
Add SHAKE128 and SHAKE256 internal interface support for later use in the PQ protocols. Add NIST test vector checks.
Description of changes:
Add SHAKE128 and SHAKE256 support. Test the correctness through SHAKE128Test.NISTTestVectors and SHAKE256Test.NISTTestVectors, added from the NIST test vectors for SHAKE: https://csrc.nist.gov/projects/cryptographic-algorithm-validation-program/secure-hashing. A detailed description of the applied changes is presented in CryptoAlg-1171?selectedConversation=3abf0a8b-88db-4e0e-b124-9dfc8701c89b.
Testing:
Correct integration of the assembly code is tested by corrupting the assembly code and asserting failure.
TestVectors SHAKE128
SHAKE128Test.NISTTestVectors
TestVectors SHAKE256
SHAKE256Test.NISTTestVectors
Keep the SHA3 tests:
TestVectors SHA3-XXX
./crypto/crypto_test --gtest_filter=DigestTest.TestVectors
SpeedTest SHA3-XXX
./tool/bssl speed -filter SHA3-XXX
Exhaustive TestVectors SHA3-XXX
./crypto/crypto_test --gtest_filter=SHA3Test.NISTTestVectors
./crypto/crypto_test --gtest_filter=SHA3Test.NISTTestVectors_SingleShot
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.