Compare commits

...

141 Commits
lynx ... main

Author SHA1 Message Date
derekpierre 0d074f2a28
Generate release notes for v7.6.0. 2025-08-18 17:03:49 -04:00
Derek Pierre 7a954dd919
Merge pull request #3626 from nucypher/v7.6.x
v7.6.x
2025-08-18 17:01:36 -04:00
Derek Pierre 8943408c62
Merge pull request #3637 from derekpierre/nucypher-core-handover
`nucypher-core` handover version and contract registries
2025-08-16 21:22:09 -04:00
derekpierre 3bc9ffc3d7
Add dev newsfragment for #3637. 2025-08-16 05:55:07 -04:00
derekpierre 70d7df39d7
Update mainnet testnet contract registry to the latest deployments. 2025-08-15 21:33:17 -04:00
derekpierre f6c6cbc719
Update tapir testnet contract registry to the latest deployments. 2025-08-15 21:32:38 -04:00
derekpierre 3d95864175
Relock dependencies to use `nucypher-core` v0.15.0 instead of github dependency. 2025-08-15 21:29:33 -04:00
Derek Pierre 45a2a7c516
Merge pull request #3635 from derekpierre/handover-fixes-from-testing
Handover fixes after testing
2025-08-15 08:37:46 -04:00
derekpierre 63d8fe2cbd
Add TODO regarding use of precomputed variant (which we never currently use). 2025-08-14 15:19:45 -04:00
derekpierre f1c6a4c6e7
Fix typo. 2025-08-14 08:47:23 -04:00
derekpierre e9332eed85
Use codecov gh-action v5. 2025-08-14 08:44:06 -04:00
derekpierre aef4ce0675
Add dev newsfragment for #3635. 2025-08-14 08:44:06 -04:00
derekpierre ce1ba25ee7
Add to handover acceptance test to ensure that ritual metadata caching is properly cleared during handover process, and repopulated afterwards for subsequent decryption request. 2025-08-14 08:44:06 -04:00
derekpierre f4ec095a72
Prune ritual metadata during handover process - at any stage. This ensures that up-to-date values are always used during handover process without issues with timing of receipts of events.
Ensure that checking whether handover event is applicable to node is done before checking for existing cached tx.
2025-08-14 08:44:06 -04:00
Derek Pierre 0b3bbf9763
Merge pull request #3634 from derekpierre/handover-tapir-registry
Update `tapir` contract registry based on Coordinator handover deployment on `tapir`
2025-08-11 11:29:53 -04:00
derekpierre 7378b8c435
Add dev newsfragment for #3634. 2025-08-11 09:43:56 -04:00
derekpierre 4f3f982091
Update tapir contract registry based on Coordinator handover deployment on tapir. 2025-08-11 09:43:44 -04:00
Derek Pierre fb5e8b1c88
Merge pull request #3633 from derekpierre/handover-lynx-registry
Update `lynx` contract registry based on Coordinator handover deployment on `lynx`
2025-08-08 17:13:15 -04:00
derekpierre a2df2cf63b
Add dev newsfragment for #3633. 2025-08-08 16:57:55 -04:00
derekpierre a6203fa1a0
Update lynx contract registry based on Coordinator handover deployment on lynx. 2025-08-08 16:57:25 -04:00
Derek Pierre 20c08cbbc7
Merge pull request #3632 from derekpierre/update-rust-python
Update base `rust-python` image
2025-08-08 15:07:22 -04:00
derekpierre a5f32563b7
Add dev newsfragment for #3632. 2025-08-08 13:34:03 -04:00
derekpierre c7fa3d770b
Built an updated nucypher/rust-python image, and then configure nucypher node images to use it as its base. 2025-08-08 13:34:03 -04:00
Derek Pierre 81ce370025
Merge pull request #3630 from derekpierre/handover-caching
Re-enable DKG storage caching with Handover
2025-08-08 13:33:44 -04:00
derekpierre 64690f15c4
Add dev newsfragment for #3630. 2025-08-08 09:14:20 -04:00
derekpierre 7b62428763
Re-enable tests that relied on caching. 2025-08-08 09:14:20 -04:00
derekpierre 1c00454c55
Use try-catch for clearing txs since dkg storage is not thread-safe. 2025-08-08 09:14:20 -04:00
derekpierre cce01371b4
Clear cached values for validator and active ritual since handover modifies those values.
To prevent timing issues, clear the values whenever a handover event is encountered to force obtaining the relevant information from the contract.
2025-08-08 09:14:20 -04:00
derekpierre c9318c1e63
Give handover phases their own dkg storage keys. 2025-08-08 09:14:20 -04:00
derekpierre a29351911f
Remove commented use of cache so that cache is used again. 2025-08-08 09:14:20 -04:00
Derek Pierre 4913b0188b
Merge pull request #3628 from derekpierre/handover-updates
Handover Updates
2025-08-08 09:13:34 -04:00
derekpierre 9686f34926
Check whether handover blinded share is required before calling function to produce blinded share; it maintains consistency with what is done for handover transcript. 2025-08-08 08:22:07 -04:00
derekpierre b5b15ebe25
Explicitly return None. 2025-08-08 08:22:07 -04:00
derekpierre aafe105a70
Refactor common code for checking whether handover request is applicable to node for transcript and blinded share phases. 2025-08-08 08:22:07 -04:00
derekpierre 61fe5014be
Add dev newsfragment for #3628. 2025-08-08 08:22:07 -04:00
derekpierre 0288a221b3
Update test_dkg_ritual to ensure that successful partial decryption performed by the incoming validator 2025-08-08 08:22:07 -04:00
derekpierre d8c49fa86e
Add handover blinded share phase test to test_ritualist. 2025-08-08 08:22:07 -04:00
derekpierre 90fee93dfc
Update test_ritualist to clarify departing and incoming ursula.
Blinded share, transcript, and static session key properties of Handover object should be empty when in awaiting handover transcript phase.
2025-08-08 08:22:07 -04:00
derekpierre 3b0db030ef
Use constant for dkg size in test_ritualist. 2025-08-08 08:22:07 -04:00
derekpierre 1488b30670
Add to test to check that different randomness produces different public key, and the same randomness generates the same public key. 2025-08-08 08:22:07 -04:00
derekpierre f7f3474799
Allow more randomness of addresses in ferveo test.
Allow all possible handover_slit_indices to be tested.
Remove sorting of validators since no longer necessary for ferveo.
2025-08-08 08:22:07 -04:00
derekpierre 1383a707ce
Rename initiate_handover to produce_handover_transcript. 2025-08-08 08:22:07 -04:00
derekpierre fe6b33dc54
Remove specification of "me" when generating handover transcript since incoming validator is never part of the existing validator set.
The "me" is only needed internally for generating the DKG ferveo object, and therefore the actual validator used as me seemingly doesn't matter and can be any index so why not 0. Therefore, don't expose it as part of initiate_handover.
2025-08-08 08:22:07 -04:00
derekpierre c06c0b455d
Update timeout value used by MockCoordinatorAgent to be dkg_timeout. 2025-08-08 08:22:07 -04:00
derekpierre 62d8f7cee5
Update typehint for validators parameter. 2025-08-08 08:22:07 -04:00
derekpierre e7a4f60f9e
Programmatically calculate index for incoming validator in test. 2025-08-08 08:22:07 -04:00
derekpierre 995baedb6e
Randomize index of departing validator for handover tests. 2025-08-08 08:22:07 -04:00
Derek Pierre f279aa333e
Merge pull request #3608 from cygnusv/handover
Handover: crypto, powers, agents and actors
2025-08-08 08:20:01 -04:00
derekpierre ac8e2e1f6e
Appease linter. 2025-07-31 19:46:31 +02:00
derekpierre 93da405c4e
Update ritual tracker acceptance tests to accomodate Handover* events and logic for action_required.
TimeoutChanged event is no longer available in Coordinator contract so use MaxDkgSizeChanged instead.
2025-07-31 19:46:31 +02:00
derekpierre 791f874819
Comment out integration test which specifically checks for the use of the cache, since currently the cache is not being utilized due to handover. This should be uncommented once we use the cache again when addressing #3623. 2025-07-31 19:46:31 +02:00
derekpierre 1e6193abce
Chaotic characters should store and use validator messages instead of transcripts. 2025-07-31 19:46:31 +02:00
derekpierre 20960dc76c
Remove excess parameter that is no longer needed for aggregating a transcript. 2025-07-31 19:46:31 +02:00
derekpierre ef66d2d027
Relock dependencies so that updated nucypher-core dependency is used.
Update marshmallow and pydantic dependency specification since newer versions have incompatible changes that led to execution errors.
2025-07-31 19:46:30 +02:00
derekpierre 629e667855
Update nucypher-core dependency to point to relevant git dependency for latest handover functionality. This is temporary until we do a nucypher-core release. 2025-07-31 19:46:30 +02:00
derekpierre b85145f49a
Update relock script to remove --regenerate. Not needed since the poetry.lock file gets deleted as part of the script before locking. 2025-07-31 19:46:30 +02:00
David Núñez f15b367638
Acceptance test showing that Operator handles handover actions
The test is high-level enough that it also shows that decryption works after the handover
2025-07-31 19:46:30 +02:00
David Núñez 986cb16053
Be explicit on parameters for make_dkg 2025-07-31 19:46:30 +02:00
David Núñez 4238fe1970
DKG storage cache is temporarily disabled
See #3623. This needs to be restored in the future.
2025-07-31 19:46:30 +02:00
David Núñez 5466866dfe
Workaround for AggregatedTranscript serialization issue
See https://github.com/nucypher/ferveo/issues/209
2025-07-31 19:46:30 +02:00
David Núñez 5da6b40144
Coordinator agent tests for handover 2025-07-31 19:46:30 +02:00
David Núñez bb24e2fd2a
Add handover supervisor functions to CoordinatorAgent
Mostly for testing purposes, although they may have an use if we use the agent as part of a production script
2025-07-31 19:46:30 +02:00
David Núñez 20bfb2027e
Newsfragment for #3608 2025-07-31 19:46:30 +02:00
David Núñez b86859b9c8
Distinction between dkg and handover timeouts 2025-07-31 19:46:30 +02:00
David Núñez bbff665b1a
Extend test_ritualist to cover the first handover phase 2025-07-31 19:46:30 +02:00
David Núñez 9fc838577a
Further integration with recent nucypher-core changes 2025-07-31 19:46:30 +02:00
David Núñez 0203be97f1
Absolute minimum unit test for RitualisticPower 2025-07-31 19:46:30 +02:00
David Núñez 5a633a9bfb
Extend ActiveRitualTracker to handle handover events 2025-07-31 19:46:30 +02:00
David Núñez 6bdf052c66
Canonical validator order is dictated by contract
Validators are not assumed sorted anymore. See https://github.com/nucypher/nucypher-contracts/issues/389 and https://github.com/nucypher/ferveo/issues/204
2025-07-31 19:46:30 +02:00
David Núñez 7147d02ba5
Extend setup async hooks method to handover phases 2025-07-31 19:46:30 +02:00
David Núñez 4f523864e4
Operator methods to create and post handover blinded share
* Method to check proper conditions to create a handover blinded share (e.g. the ritual is active, the handover is in the state of expecting a blinded share, etc)
* A method to produce the actual blinded share from the handover transcript
* A method to handle the async TX that publishes the handover blinded share
2025-07-31 19:46:30 +02:00
David Núñez 8c3cf02984
Operator methods to process handover transcript requests
* Method to check proper conditions to create a handover transcript (e.g. the ritual is active, the handover is in the state of expecting a transcript, the handover expects this node as the incoming validator, etc)
* A method to produce the handover transcript
* A method to handle the async TX that publishes the handover transcript
2025-07-31 19:46:30 +02:00
David Núñez c3b4a3e0b6
Function to read handover status and request data from Coordinator 2025-07-31 19:46:30 +02:00
David Núñez 8911327fca
Transacting methods to post handover transcripts and blind shares 2025-07-31 19:46:30 +02:00
David Núñez b081d98cdc
Define new "phases" for handover and status codes in Coordinator model 2025-07-31 19:46:30 +02:00
David Núñez dd8e9ca159
Integrate handover with RitualisticPower 2025-07-31 19:46:30 +02:00
David Núñez 814b28ee0a
Complete python test for core ferveo handover functionality 2025-07-31 19:46:30 +02:00
David Núñez 1fcaf99bda
Introduce new ferveo API for handover
This is the lowest level layer in nucypher/nucypher
2025-07-31 19:46:30 +02:00
derekpierre 8136ae85b3
Bump version: 7.5.0 → 7.6.0 2025-07-31 19:18:06 +02:00
Derek Pierre 1dff7d830c
Merge pull request #3590 from nucypher/fix-pypi
Fix PyPi Publishing Issues
2025-04-10 13:58:24 -04:00
derekpierre c0d0249a3c
Remove license classifier which is now deprecated. 2025-04-10 12:03:18 -04:00
derekpierre c42c9f65e7
Rely on published versions of nucypher-pychalk and nucypher-snaptime instead of github dependencies. Update pyproject.toml and poetry lock accordingly. 2025-04-09 16:47:48 -04:00
derekpierre fb95182f9a
Update pypi job gh action versions. 2025-04-08 15:31:01 -04:00
derekpierre ac90903805
Generate release notes for 7.5.0. 2025-04-08 13:56:02 -04:00
Derek Pierre 9a262f3ab2
Merge pull request #3520 from nucypher/v7.5.x
[EPIC] v7.5.x
2025-04-08 13:49:30 -04:00
Derek Pierre a216e14989
Merge pull request #3588 from cygnusv/incompatible
Prepare breaking changes to post transcript
2025-04-08 12:11:31 -04:00
Manuel Montenegro 2127289a7d
Update contract registries
These registries were updated using the ones in
3703a8342a321b9197cfe6e960c17e1f77e331bb
2025-04-08 17:19:00 +02:00
Manuel Montenegro 0f5dcabcea
Rename back method name for is_provider_key_set 2025-04-08 16:56:43 +02:00
Manuel Montenegro 275f38f9d5
Rewording newfragment description
Co-authored-by: David Núñez <david@nucypher.com>
2025-04-07 20:45:33 +02:00
Manuel Montenegro 2859ff2a16
Update coordinator contract method name 2025-04-07 19:00:47 +02:00
Manuel Montenegro 73674f1bd5
Update contract registris for lynx and mainnet 2025-04-07 18:46:49 +02:00
Manuel Montenegro 98ec47c8d6
Merge branch 'v7.5.x' into incompatible 2025-04-07 18:34:06 +02:00
Derek Pierre c392457f18
Merge pull request #3589 from derekpierre/fix-regex-warning
Fix BigInt Regex Warning
2025-04-04 16:05:15 -04:00
derekpierre 7999a02bbb
Add dev newsfragment for #3589. 2025-04-04 15:46:11 -04:00
derekpierre 1be5aa0b29
Configure fee model role in coordinator for acceptance tests to pass based on latest changes in nucypher-contracts. 2025-04-04 15:46:10 -04:00
derekpierre dcf0b1b667
Fix python warning about regex string syntax. The string is correct but python is warning about the escape sequence and assumes it was a mistake. 2025-04-04 15:46:06 -04:00
David Núñez 8c449dd0b2
Add newsfragment 2025-04-04 11:47:14 +02:00
David Núñez 98fc240d90
Introduce breaking change when posting transcripts 2025-04-04 11:38:21 +02:00
Derek Pierre 43ba98d36e
Merge pull request #3565 from derekpierre/redact-ip
Redact IP addresses of http requesters from logs
2025-03-25 15:31:54 -04:00
derekpierre bbe75c5368
Add dev newsfragment for #3565. 2025-03-25 11:40:27 -04:00
derekpierre 46f954f191
Redact IP address from twisted's http server log messages. 2025-03-25 11:40:25 -04:00
Derek Pierre 425ab87f17
Merge pull request #3585 from derekpierre/large-numbers-as-strings
Handle BigInts provided in JSON from `taco-web`
2025-03-25 11:39:22 -04:00
derekpierre 4c229120b9
Add acceptance test for handling of big int strings. 2025-03-25 08:55:48 -04:00
derekpierre 8d49a09be3
Make logic more robust for checking for BigInt strings.
Add unit test for utility method - it was already tested based on usage by a field, but let's test it directly.
2025-03-25 08:55:47 -04:00
derekpierre f985433f11
Rename integer field for clarity. 2025-03-25 08:55:46 -04:00
derekpierre 113cd3792e
Add feature newsfragment for #3585. 2025-03-25 08:55:45 -04:00
derekpierre 4198aed384
Allow processing of BigInt strings within context variable values.
Add tests.
2025-03-25 08:55:44 -04:00
derekpierre 46372dde1b
Add contract condition tests to verify use of big number integers as parameters and return value test values (comparator values). 2025-03-25 08:55:43 -04:00
derekpierre d1e2311a97
Rename IntegerField to AnyIntegerField and add unit tests. 2025-03-25 08:55:43 -04:00
derekpierre a4df4662aa
Properly handle bigint strings that are strings of large numbers that end with 'n'. 2025-03-25 08:55:42 -04:00
derekpierre 1b13c50b02
Add generic catch-all field for any data received from JSON (`taco-web`).
We need to account for the case when `taco-web` provides large numbers (bigints) as strings.
Add tests.
2025-03-25 08:55:35 -04:00
Derek Pierre 1cb40795d0
Merge pull request #3586 from derekpierre/fix-invalid-config
Fix setuptools invalid config for specific sub-depdencies
2025-03-25 08:54:01 -04:00
derekpierre 96ed40c396
Add dev newsfragment for #3586. 2025-03-24 16:40:50 -04:00
derekpierre 0cedfb732b
Now that there are github dependencies which need to be built, include rust toolchain in ruff github action. 2025-03-24 16:35:44 -04:00
derekpierre 4803827426
Relock dependencies using our own forks of pychalk and snaptime which have entries in the setup.cfg that now use underscores instead of dashes. setuptools enforces the use of underscores.
Update dev dependencies group to no longer use soon to be deprecated header.
2025-03-24 16:35:33 -04:00
David Núñez c8507105e9 Camel casing 🐫
Co-authored-by: Derek Pierre <derek.pierre@gmail.com>
2025-03-23 10:46:16 +01:00
David Núñez a614bc5e5f Add JWTConditionDict type 2025-03-23 10:46:16 +01:00
David Núñez 10e9d19e66 validate_public_key has void return type so no need to return values
Co-authored-by: Derek Pierre <derek.pierre@gmail.com>
2025-03-23 10:46:16 +01:00
David Núñez d75ddbe6b2 Add JWTConditions to condition lingo tests 2025-03-23 10:46:16 +01:00
David Núñez b022f7a0a0 Refactor JWTCondition to use execution calls instead of inheriting from them
The rationale for this is to keep the ExecutionCall abstraction to encapsulate the JWT verification logic, but without inheriting from ExecutionCallAccessControlConditions, which require ReturnValueTests, something that JWTConditions don't need.
2025-03-23 10:46:16 +01:00
David Núñez d2a6ae919e Better exception handling for JWT condition evaluation 2025-03-23 10:46:16 +01:00
David Núñez 7a33522410 Be consistent with parameters that are not optional
Co-authored-by: Derek Pierre <derek.pierre@gmail.com>
2025-03-23 10:46:16 +01:00
David Núñez 22a18983f4 Specify pyjwt extra dependency on cryptography
See https://pyjwt.readthedocs.io/en/stable/installation.html#cryptographic-dependencies-optional
2025-03-23 10:46:16 +01:00
David Núñez cee5e36519 Test for JWTCondition expiration 2025-03-23 10:46:16 +01:00
David Núñez 9c8db5e688 Validate public key PEM format in JWT conditions
Co-authored-by: James Campbell <james.campbell@tanti.org.uk>
2025-03-23 10:46:16 +01:00
David Núñez 05998b0a9a Apply suggestions from code review
Co-authored-by: Derek Pierre <derek.pierre@gmail.com>
2025-03-23 10:46:16 +01:00
David Núñez 28895c3463 Cleanup unused comments 2025-03-23 10:46:16 +01:00
David Núñez 4ce01d2a60 Validate expected issuer in JWT token 2025-03-23 10:46:16 +01:00
David Núñez 46d26768e7 Some comments and TODO regarding expected issuer 2025-03-23 10:46:16 +01:00
David Núñez 5464006500 Make sure that JWT tokens can include custom claims 2025-03-23 10:46:16 +01:00
David Núñez a734dfca2d In JWT tests, define a token issuance function instead of hardcoding it 2025-03-23 10:46:16 +01:00
David Núñez d9d7757922 Newsfragment for PR#3570 2025-03-23 10:46:16 +01:00
David Núñez 39c6ba8d59 Add JWTCondition to condition resolution 2025-03-23 10:46:16 +01:00
David Núñez bfba37db58 First iteration on JWTConditions 2025-03-23 10:46:16 +01:00
David Núñez c17f174501 Add new JWT condition type name 2025-03-23 10:46:16 +01:00
David Núñez fe36bdc5b4 Add pyjwt to dependencies and relock 2025-03-23 10:46:16 +01:00
Derek Pierre 234d829893
Merge pull request #3580 from beemeeupnow/typo_fix
Fix duplicate assignment typo
2025-03-12 11:44:44 -04:00
beemeeupnow 8d673c5c4d
Fix duplicate assignment typo 2025-03-12 08:10:49 -06:00
Derek Pierre ed2bcae458
Merge pull request #3581 from derekpierre/no-taco-app-commitment
TACoApplication contract no longer accepts commitment duration
2025-03-12 09:35:12 -04:00
derekpierre c61a3ab819
Add dev newsfragment for #3581. 2025-03-12 08:59:00 -04:00
derekpierre 3cae4c84a2
TacoApplication contract no longer takes commitment duration; fix usage in tests. 2025-03-12 08:54:05 -04:00
77 changed files with 12086 additions and 4524 deletions

View File

@ -1,5 +1,5 @@
[bumpversion]
current_version = 7.5.0
current_version = 7.6.0
commit = True
tag = False
parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)(-(?P<stage>[^.]*)\.(?P<devnum>\d+))?

View File

@ -24,7 +24,7 @@ jobs:
- name: Build a binary wheel and a source tarball
run: python3 -m build
- name: Store the distribution packages
uses: actions/upload-artifact@v3
uses: actions/upload-artifact@v4
with:
name: python-package-distributions
path: dist/
@ -43,7 +43,7 @@ jobs:
steps:
- name: Download all the dists
uses: actions/download-artifact@v3
uses: actions/download-artifact@v4
with:
name: python-package-distributions
path: dist/

View File

@ -142,7 +142,7 @@ jobs:
# Only upload coverage files after all tests have passed
- name: Upload unit tests coverage to Codecov
if: matrix.python-version == '3.12'
uses: codecov/codecov-action@v4
uses: codecov/codecov-action@v5
with:
token: ${{ secrets.CODECOV_TOKEN }}
files: unit-coverage.xml
@ -152,7 +152,7 @@ jobs:
- name: Upload integration tests coverage to Codecov
if: matrix.python-version == '3.12'
uses: codecov/codecov-action@v4
uses: codecov/codecov-action@v5
with:
token: ${{ secrets.CODECOV_TOKEN }}
files: integration-coverage.xml
@ -162,7 +162,7 @@ jobs:
- name: Upload acceptance tests coverage to Codecov
if: matrix.python-version == '3.12'
uses: codecov/codecov-action@v4
uses: codecov/codecov-action@v5
with:
token: ${{ secrets.CODECOV_TOKEN }}
directory: tests/acceptance

View File

@ -15,6 +15,9 @@ jobs:
- name: Checkout repo
uses: actions/checkout@v4
- name: Install latest Rust stable
uses: dtolnay/rust-toolchain@stable
- name: Set up Python
uses: actions/setup-python@v4
with:

View File

@ -1,4 +1,4 @@
FROM nucypher/rust-python:3.12.0
FROM nucypher/rust-python:3.12.9
# set default user
USER $USER

View File

@ -1,4 +1,4 @@
FROM rust:slim-buster
FROM rust:slim-bullseye
# prepare container
RUN apt-get update -y

View File

@ -1,183 +1,186 @@
abnf==2.2.0 ; python_version >= "3.9" and python_version < "4.0"
aiohappyeyeballs==2.4.3 ; python_version >= "3.9" and python_version < "4"
aiohttp==3.10.10 ; python_version >= "3.9" and python_version < "4"
aiosignal==1.3.1 ; python_version >= "3.9" and python_version < "4"
annotated-types==0.7.0 ; python_version >= "3.9" and python_version < "4.0"
abnf==2.4.0 ; python_version >= "3.9" and python_version < "4.0"
aiohappyeyeballs==2.6.1 ; python_version >= "3.9" and python_version < "4"
aiohttp==3.12.15 ; python_version >= "3.9" and python_version < "4"
aiosignal==1.4.0 ; python_version >= "3.9" and python_version < "4"
annotated-types==0.7.0 ; python_version >= "3.9" and python_version < "4"
ape-solidity==0.8.5 ; python_version >= "3.9" and python_version < "4"
appdirs==1.4.4 ; python_version >= "3.9" and python_version < "4"
asttokens==2.4.1 ; python_version >= "3.9" and python_version < "4"
async-timeout==4.0.3 ; python_version >= "3.9" and python_version < "3.11"
attrs==24.2.0 ; python_version >= "3.9" and python_version < "4"
asttokens==3.0.0 ; python_version >= "3.9" and python_version < "4"
async-timeout==5.0.1 ; python_version >= "3.9" and python_version < "3.11"
attrs==25.3.0 ; python_version >= "3.9" and python_version < "4"
atxm==0.5.0 ; python_version >= "3.9" and python_version < "4"
autobahn==24.4.2 ; python_version >= "3.9" and python_version < "4"
automat==24.8.1 ; python_version >= "3.9" and python_version < "4"
automat==25.4.16 ; python_version >= "3.9" and python_version < "4"
base58==1.0.3 ; python_version >= "3.9" and python_version < "4"
bitarray==3.0.0 ; python_version >= "3.9" and python_version < "4"
blinker==1.8.2 ; python_version >= "3.9" and python_version < "4"
bitarray==3.6.1 ; python_version >= "3.9" and python_version < "4"
blinker==1.9.0 ; python_version >= "3.9" and python_version < "4"
bytestring-splitter==2.4.1 ; python_version >= "3.9" and python_version < "4"
cached-property==2.0.1 ; python_version >= "3.9" and python_version < "4"
certifi==2024.8.30 ; python_version >= "3.9" and python_version < "4"
certifi==2025.8.3 ; python_version >= "3.9" and python_version < "4"
cffi==1.17.1 ; python_version >= "3.9" and python_version < "4"
cfgv==3.4.0 ; python_version >= "3.9" and python_version < "4"
charset-normalizer==3.4.0 ; python_version >= "3.9" and python_version < "4"
charset-normalizer==3.4.3 ; python_version >= "3.9" and python_version < "4"
ckzg==1.0.2 ; python_version >= "3.9" and python_version < "4"
click==8.1.7 ; python_version >= "3.9" and python_version < "4"
click==8.1.8 ; python_version >= "3.9" and python_version < "4"
colorama==0.4.6 ; python_version >= "3.9" and python_version < "4"
constant-sorrow==0.1.0a9 ; python_version >= "3.9" and python_version < "4"
constantly==23.10.4 ; python_version >= "3.9" and python_version < "4"
coverage==7.6.4 ; python_version >= "3.9" and python_version < "4"
coverage[toml]==7.6.4 ; python_version >= "3.9" and python_version < "4"
coverage==7.10.3 ; python_version >= "3.9" and python_version < "4"
coverage[toml]==7.10.3 ; python_version >= "3.9" and python_version < "4"
cryptography==43.0.3 ; python_version >= "3.9" and python_version < "4"
cytoolz==1.0.0 ; python_version >= "3.9" and python_version < "4" and implementation_name == "cpython"
cytoolz==1.0.1 ; python_version >= "3.9" and python_version < "4" and implementation_name == "cpython"
dataclassy==0.11.1 ; python_version >= "3.9" and python_version < "4"
dateparser==1.2.0 ; python_version >= "3.9" and python_version < "4"
decorator==5.1.1 ; python_version >= "3.9" and python_version < "4"
distlib==0.3.9 ; python_version >= "3.9" and python_version < "4"
eip712==0.2.10 ; python_version >= "3.9" and python_version < "4"
eth-abi==5.1.0 ; python_version >= "3.9" and python_version < "4"
dateparser==1.2.2 ; python_version >= "3.9" and python_version < "4"
decorator==5.2.1 ; python_version >= "3.9" and python_version < "4"
distlib==0.4.0 ; python_version >= "3.9" and python_version < "4"
eip712==0.2.13 ; python_version >= "3.9" and python_version < "4"
eth-abi==5.2.0 ; python_version >= "3.9" and python_version < "4"
eth-account==0.11.3 ; python_version >= "3.9" and python_version < "4"
eth-ape==0.8.12 ; python_version >= "3.9" and python_version < "4"
eth-bloom==3.0.1 ; python_version >= "3.9" and python_version < "4"
eth-hash==0.7.0 ; python_version >= "3.9" and python_version < "4"
eth-hash[pycryptodome]==0.7.0 ; python_version >= "3.9" and python_version < "4"
eth-hash[pysha3]==0.7.0 ; python_version >= "3.9" and python_version < "4" and implementation_name == "cpython"
eth-keyfile==0.8.1 ; python_version >= "3.9" and python_version < "4"
eth-keys==0.6.0 ; python_version >= "3.9" and python_version < "4"
eth-bloom==3.1.0 ; python_version >= "3.9" and python_version < "4"
eth-hash==0.7.1 ; python_version >= "3.9" and python_version < "4"
eth-hash[pycryptodome]==0.7.1 ; python_version >= "3.9" and python_version < "4"
eth-hash[pysha3]==0.7.1 ; python_version >= "3.9" and python_version < "4" and implementation_name == "cpython"
eth-keyfile==0.9.1 ; python_version >= "3.9" and python_version < "4"
eth-keys==0.7.0 ; python_version >= "3.9" and python_version < "4"
eth-pydantic-types==0.1.3 ; python_version >= "3.9" and python_version < "4"
eth-rlp==1.0.1 ; python_version >= "3.9" and python_version < "4"
eth-tester[py-evm]==0.11.0b2 ; python_version >= "3.9" and python_version < "4"
eth-typing==3.5.2 ; python_version >= "3.9" and python_version < "4"
eth-utils==2.3.2 ; python_version >= "3.9" and python_version < "4"
ethpm-types==0.6.18 ; python_version >= "3.9" and python_version < "4"
eval-type-backport==0.2.0 ; python_version >= "3.9" and python_version < "3.10"
evm-trace==0.2.3 ; python_version >= "3.9" and python_version < "4"
ethpm-types==0.6.25 ; python_version >= "3.9" and python_version < "4"
eval-type-backport==0.2.2 ; python_version >= "3.9" and python_version < "3.10"
evm-trace==0.2.4 ; python_version >= "3.9" and python_version < "4"
evmchains==0.0.13 ; python_version >= "3.9" and python_version < "4"
exceptiongroup==1.2.2 ; python_version >= "3.9" and python_version < "3.11"
executing==2.1.0 ; python_version >= "3.9" and python_version < "4"
filelock==3.16.1 ; python_version >= "3.9" and python_version < "4"
flask==3.0.3 ; python_version >= "3.9" and python_version < "4"
frozenlist==1.5.0 ; python_version >= "3.9" and python_version < "4"
greenlet==3.1.1 ; python_version >= "3.9" and python_version < "4"
exceptiongroup==1.3.0 ; python_version >= "3.9" and python_version < "3.11"
executing==2.2.0 ; python_version >= "3.9" and python_version < "4"
filelock==3.19.1 ; python_version >= "3.9" and python_version < "4"
flask==3.1.1 ; python_version >= "3.9" and python_version < "4"
frozenlist==1.7.0 ; python_version >= "3.9" and python_version < "4"
greenlet==3.2.4 ; python_version >= "3.9" and python_version < "4"
hendrix==5.0.0 ; python_version >= "3.9" and python_version < "4"
hexbytes==0.3.1 ; python_version >= "3.9" and python_version < "4"
humanize==4.11.0 ; python_version >= "3.9" and python_version < "4"
humanize==4.12.3 ; python_version >= "3.9" and python_version < "4"
hyperlink==21.0.0 ; python_version >= "3.9" and python_version < "4"
identify==2.6.1 ; python_version >= "3.9" and python_version < "4"
identify==2.6.13 ; python_version >= "3.9" and python_version < "4"
idna==3.10 ; python_version >= "3.9" and python_version < "4"
ijson==3.3.0 ; python_version >= "3.9" and python_version < "4"
importlib-metadata==8.5.0 ; python_version >= "3.9" and python_version < "3.10"
ijson==3.4.0 ; python_version >= "3.9" and python_version < "4"
importlib-metadata==8.7.0 ; python_version >= "3.9" and python_version < "3.10"
incremental==24.7.2 ; python_version >= "3.9" and python_version < "4"
iniconfig==2.0.0 ; python_version >= "3.9" and python_version < "4"
iniconfig==2.1.0 ; python_version >= "3.9" and python_version < "4"
ipython==8.18.1 ; python_version >= "3.9" and python_version < "4"
itsdangerous==2.2.0 ; python_version >= "3.9" and python_version < "4"
jedi==0.19.1 ; python_version >= "3.9" and python_version < "4"
jinja2==3.1.4 ; python_version >= "3.9" and python_version < "4"
jedi==0.19.2 ; python_version >= "3.9" and python_version < "4"
jinja2==3.1.6 ; python_version >= "3.9" and python_version < "4"
jsonpath-ng==1.7.0 ; python_version >= "3.9" and python_version < "4"
jsonschema-specifications==2024.10.1 ; python_version >= "3.9" and python_version < "4"
jsonschema==4.23.0 ; python_version >= "3.9" and python_version < "4"
jsonschema-specifications==2025.4.1 ; python_version >= "3.9" and python_version < "4"
jsonschema==4.25.0 ; python_version >= "3.9" and python_version < "4"
lazyasd==0.1.4 ; python_version >= "3.9" and python_version < "4"
lru-dict==1.2.0 ; python_version >= "3.9" and python_version < "4"
mako==1.3.6 ; python_version >= "3.9" and python_version < "4"
mako==1.3.10 ; python_version >= "3.9" and python_version < "4"
markdown-it-py==3.0.0 ; python_version >= "3.9" and python_version < "4"
markupsafe==3.0.2 ; python_version >= "3.9" and python_version < "4"
marshmallow==3.23.1 ; python_version >= "3.9" and python_version < "4"
marshmallow==3.26.1 ; python_version >= "3.9" and python_version < "4"
matplotlib-inline==0.1.7 ; python_version >= "3.9" and python_version < "4"
maya==0.6.1 ; python_version >= "3.9" and python_version < "4"
mdurl==0.1.2 ; python_version >= "3.9" and python_version < "4"
mnemonic==0.21 ; python_version >= "3.9" and python_version < "4"
morphys==1.0 ; python_version >= "3.9" and python_version < "4"
msgpack-python==0.5.6 ; python_version >= "3.9" and python_version < "4"
msgspec==0.18.6 ; python_version >= "3.9" and python_version < "4"
multidict==6.1.0 ; python_version >= "3.9" and python_version < "4"
msgspec==0.19.0 ; python_version >= "3.9" and python_version < "3.13"
multidict==6.6.4 ; python_version >= "3.9" and python_version < "4"
nodeenv==1.9.1 ; python_version >= "3.9" and python_version < "4"
nucypher-core==0.13.0 ; python_version >= "3.9" and python_version < "4"
nucypher-core==0.15.0 ; python_version >= "3.9" and python_version < "4"
nucypher-pychalk==2.0.2 ; python_version >= "3.9" and python_version < "4"
nucypher-snaptime==0.2.5 ; python_version >= "3.9" and python_version < "4"
numpy==1.26.4 ; python_version >= "3.9" and python_version < "4"
packaging==23.2 ; python_version >= "3.9" and python_version < "4"
pandas==2.2.3 ; python_version >= "3.9" and python_version < "4"
pandas==2.3.1 ; python_version >= "3.9" and python_version < "4"
parsimonious==0.10.0 ; python_version >= "3.9" and python_version < "4"
parso==0.8.4 ; python_version >= "3.9" and python_version < "4"
pendulum==3.0.0 ; python_version >= "3.9" and python_version < "4"
pendulum==3.1.0 ; python_version >= "3.9" and python_version < "4"
pexpect==4.9.0 ; python_version >= "3.9" and python_version < "4" and sys_platform != "win32"
platformdirs==4.3.6 ; python_version >= "3.9" and python_version < "4"
pluggy==1.5.0 ; python_version >= "3.9" and python_version < "4"
platformdirs==4.3.8 ; python_version >= "3.9" and python_version < "4"
pluggy==1.6.0 ; python_version >= "3.9" and python_version < "4"
ply==3.11 ; python_version >= "3.9" and python_version < "4"
pre-commit==2.21.0 ; python_version >= "3.9" and python_version < "4"
prometheus-client==0.21.0 ; python_version >= "3.9" and python_version < "4"
prompt-toolkit==3.0.48 ; python_version >= "3.9" and python_version < "4"
propcache==0.2.0 ; python_version >= "3.9" and python_version < "4"
protobuf==5.28.3 ; python_version >= "3.9" and python_version < "4"
prometheus-client==0.22.1 ; python_version >= "3.9" and python_version < "4"
prompt-toolkit==3.0.51 ; python_version >= "3.9" and python_version < "4"
propcache==0.3.2 ; python_version >= "3.9" and python_version < "4"
protobuf==6.32.0 ; python_version >= "3.9" and python_version < "4"
ptyprocess==0.7.0 ; python_version >= "3.9" and python_version < "4" and sys_platform != "win32"
pure-eval==0.2.3 ; python_version >= "3.9" and python_version < "4"
py-cid==0.3.0 ; python_version >= "3.9" and python_version < "4"
py-ecc==7.0.1 ; python_version >= "3.9" and python_version < "4"
py-ecc==8.0.0 ; python_version >= "3.9" and python_version < "4"
py-evm==0.10.1b1 ; python_version >= "3.9" and python_version < "4"
py-geth==5.0.0 ; python_version >= "3.9" and python_version < "4"
py-geth==5.6.0 ; python_version >= "3.9" and python_version < "4"
py-multibase==1.0.3 ; python_version >= "3.9" and python_version < "4"
py-multicodec==0.2.1 ; python_version >= "3.9" and python_version < "4"
py-multihash==0.2.3 ; python_version >= "3.9" and python_version < "4"
py-solc-x==2.0.3 ; python_version >= "3.9" and python_version < "4"
pyasn1-modules==0.4.1 ; python_version >= "3.9" and python_version < "4"
py-solc-x==2.0.4 ; python_version >= "3.9" and python_version < "4"
pyasn1-modules==0.4.2 ; python_version >= "3.9" and python_version < "4"
pyasn1==0.6.1 ; python_version >= "3.9" and python_version < "4"
pychalk==2.0.1 ; python_version >= "3.9" and python_version < "4"
pycparser==2.22 ; python_version >= "3.9" and python_version < "4"
pycryptodome==3.21.0 ; python_version >= "3.9" and python_version < "4"
pydantic-core==2.23.4 ; python_version >= "3.9" and python_version < "4.0"
pydantic-settings==2.6.1 ; python_version >= "3.9" and python_version < "4"
pydantic==2.9.2 ; python_version >= "3.9" and python_version < "4.0"
pygments==2.18.0 ; python_version >= "3.9" and python_version < "4"
pycryptodome==3.23.0 ; python_version >= "3.9" and python_version < "4"
pydantic-core==2.33.2 ; python_version >= "3.9" and python_version < "4.0"
pydantic-settings==2.10.1 ; python_version >= "3.9" and python_version < "4"
pydantic==2.11.7 ; python_version >= "3.9" and python_version < "4"
pygments==2.19.2 ; python_version >= "3.9" and python_version < "4"
pyjwt[crypto]==2.10.1 ; python_version >= "3.9" and python_version < "4"
pynacl==1.5.0 ; python_version >= "3.9" and python_version < "4"
pyopenssl==24.2.1 ; python_version >= "3.9" and python_version < "4"
pytest-cov==6.0.0 ; python_version >= "3.9" and python_version < "4"
pytest-mock==3.14.0 ; python_version >= "3.9" and python_version < "4"
pytest-timeout==2.3.1 ; python_version >= "3.9" and python_version < "4"
pyopenssl==25.1.0 ; python_version >= "3.9" and python_version < "4"
pytest-cov==6.2.1 ; python_version >= "3.9" and python_version < "4"
pytest-mock==3.14.1 ; python_version >= "3.9" and python_version < "4"
pytest-timeout==2.4.0 ; python_version >= "3.9" and python_version < "4"
pytest-twisted==1.14.3 ; python_version >= "3.9" and python_version < "4"
pytest==8.3.3 ; python_version >= "3.9" and python_version < "4"
pytest==8.4.1 ; python_version >= "3.9" and python_version < "4"
python-baseconv==1.2.2 ; python_version >= "3.9" and python_version < "4"
python-dateutil==2.9.0.post0 ; python_version >= "3.9" and python_version < "4"
python-dotenv==1.0.1 ; python_version >= "3.9" and python_version < "4"
pytz==2024.2 ; python_version >= "3.9" and python_version < "4"
python-dotenv==1.1.1 ; python_version >= "3.9" and python_version < "4"
python-statemachine==2.3.4 ; python_version >= "3.9" and python_version < "4"
pytz==2025.2 ; python_version >= "3.9" and python_version < "4"
pyunormalize==16.0.0 ; python_version >= "3.9" and python_version < "4"
pywin32==308 ; python_version >= "3.9" and python_version < "4" and platform_system == "Windows"
pywin32==311 ; python_version >= "3.9" and python_version < "4" and platform_system == "Windows"
pyyaml==6.0.2 ; python_version >= "3.9" and python_version < "4"
referencing==0.35.1 ; python_version >= "3.9" and python_version < "4"
regex==2024.9.11 ; python_version >= "3.9" and python_version < "4"
requests==2.32.3 ; python_version >= "3.9" and python_version < "4"
referencing==0.36.2 ; python_version >= "3.9" and python_version < "4"
regex==2025.7.34 ; python_version >= "3.9" and python_version < "4"
requests==2.32.4 ; python_version >= "3.9" and python_version < "4"
rich==13.9.4 ; python_version >= "3.9" and python_version < "4"
rlp==4.0.1 ; python_version >= "3.9" and python_version < "4"
rpds-py==0.21.0 ; python_version >= "3.9" and python_version < "4"
safe-pysha3==1.0.4 ; python_version >= "3.9" and python_version < "4" and implementation_name == "cpython"
rlp==4.1.0 ; python_version >= "3.9" and python_version < "4"
rpds-py==0.27.0 ; python_version >= "3.9" and python_version < "4"
safe-pysha3==1.0.5 ; python_version >= "3.9" and python_version < "4" and implementation_name == "cpython"
semantic-version==2.10.0 ; python_version >= "3.9" and python_version < "4"
service-identity==24.2.0 ; python_version >= "3.9" and python_version < "4"
setuptools==75.3.0 ; python_version >= "3.9" and python_version < "4"
setuptools==80.9.0 ; python_version >= "3.9" and python_version < "4"
siwe==4.2.0 ; python_version >= "3.9" and python_version < "4.0"
six==1.16.0 ; python_version >= "3.9" and python_version < "4"
snaptime==0.2.4 ; python_version >= "3.9" and python_version < "4"
six==1.17.0 ; python_version >= "3.9" and python_version < "4"
sortedcontainers==2.4.0 ; python_version >= "3.9" and python_version < "4"
sqlalchemy==2.0.36 ; python_version >= "3.9" and python_version < "4"
sqlalchemy==2.0.43 ; python_version >= "3.9" and python_version < "4"
stack-data==0.6.3 ; python_version >= "3.9" and python_version < "4"
tabulate==0.9.0 ; python_version >= "3.9" and python_version < "4"
time-machine==2.16.0 ; python_version >= "3.9" and python_version < "4"
time-machine==2.17.0 ; python_version >= "3.9" and python_version < "4"
toml==0.10.2 ; python_version >= "3.9" and python_version < "3.11"
tomli==2.0.2 ; python_version >= "3.9" and python_full_version <= "3.11.0a6"
tomli==2.2.1 ; python_version >= "3.9" and python_full_version <= "3.11.0a6"
toolz==1.0.0 ; python_version >= "3.9" and python_version < "4" and (implementation_name == "pypy" or implementation_name == "cpython")
tqdm==4.67.0 ; python_version >= "3.9" and python_version < "4"
tqdm==4.67.1 ; python_version >= "3.9" and python_version < "4"
traitlets==5.14.3 ; python_version >= "3.9" and python_version < "4"
trie==3.0.1 ; python_version >= "3.9" and python_version < "4"
twisted==24.10.0 ; python_version >= "3.9" and python_version < "4"
txaio==23.1.1 ; python_version >= "3.9" and python_version < "4"
types-requests==2.32.0.20241016 ; python_version >= "3.9" and python_version < "4"
typing-extensions==4.12.2 ; python_version >= "3.9" and python_version < "4"
tzdata==2024.2 ; python_version >= "3.9" and python_version < "4"
tzlocal==5.2 ; python_version >= "3.9" and python_version < "4"
urllib3==2.2.3 ; python_version >= "3.9" and python_version < "4"
trie==3.1.0 ; python_version >= "3.9" and python_version < "4"
twisted==24.11.0 ; python_version >= "3.9" and python_version < "4"
txaio==23.6.1 ; python_version >= "3.9" and python_version < "4"
types-requests==2.32.4.20250809 ; python_version >= "3.9" and python_version < "4"
typing-extensions==4.14.1 ; python_version < "4" and python_version >= "3.9"
typing-inspection==0.4.1 ; python_version >= "3.9" and python_version < "4"
tzdata==2025.2 ; python_version >= "3.9" and python_version < "4"
tzlocal==5.3.1 ; python_version >= "3.9" and python_version < "4"
urllib3==2.5.0 ; python_version >= "3.9" and python_version < "4"
varint==1.0.2 ; python_version >= "3.9" and python_version < "4"
virtualenv==20.27.1 ; python_version >= "3.9" and python_version < "4"
virtualenv==20.34.0 ; python_version >= "3.9" and python_version < "4"
watchdog==3.0.0 ; python_version >= "3.9" and python_version < "4"
wcwidth==0.2.13 ; python_version >= "3.9" and python_version < "4"
web3==6.20.1 ; python_version >= "3.9" and python_version < "4"
web3[tester]==6.20.1 ; python_version >= "3.9" and python_version < "4"
websockets==13.1 ; python_version >= "3.9" and python_version < "4"
werkzeug==3.1.2 ; python_version >= "3.9" and python_version < "4"
yarl==1.17.1 ; python_version >= "3.9" and python_version < "4"
zipp==3.20.2 ; python_version >= "3.9" and python_version < "3.10"
zope-interface==7.1.1 ; python_version >= "3.9" and python_version < "4"
websockets==15.0.1 ; python_version >= "3.9" and python_version < "4"
werkzeug==3.1.3 ; python_version >= "3.9" and python_version < "4"
yarl==1.20.1 ; python_version >= "3.9" and python_version < "4"
zipp==3.23.0 ; python_version >= "3.9" and python_version < "3.10"
zope-interface==7.2 ; python_version >= "3.9" and python_version < "4"

View File

@ -1 +0,0 @@
Support for executing multiple conditions sequentially, where the outcome of one condition can be used as input for another.

View File

@ -1 +0,0 @@
Support for offchain JSON endpoint condition expression and evaluation

View File

@ -1 +0,0 @@
Expands recovery CLI to include audit and keystore identification features

View File

@ -1,2 +0,0 @@
Condition that allows for if-then-else branching based on underlying conditions i.e. IF ``CONDITION A`` THEN ``CONDITION B`` ELSE ``CONDITION_C``.
The ELSE component can either be a Condition or a boolean value.

View File

@ -1 +0,0 @@
Enable support for Bearer authorization tokens (e.g., OAuth, JWT) within HTTP GET requests for ``JsonApiCondition``.

View File

@ -1 +0,0 @@
Enhance threshold decryption request efficiency by prioritizing nodes in the cohort with lower communication latency.

View File

@ -1 +0,0 @@
Added plumbing to support EVM condition evaluation on "any" (major) EVM chain outside of Ethereum and Polygon - only enabled on ``lynx`` testnet for now.

View File

@ -1 +0,0 @@
Support for conditions based on APIs provided by off-chain JSON RPC 2.0 endpoints.

View File

@ -1 +0,0 @@
Add support for EIP1271 signature verification for smart contract wallets.

View File

@ -16,7 +16,7 @@ __url__ = "https://github.com/nucypher/nucypher"
__summary__ = "A threshold access control application to empower privacy in decentralized systems."
__version__ = "7.5.0"
__version__ = "7.6.0"
__author__ = "NuCypher"

View File

@ -24,8 +24,10 @@ from nucypher_core.ferveo import (
DecryptionShareSimple,
DkgPublicKey,
FerveoVariant,
HandoverTranscript,
Transcript,
Validator,
ValidatorMessage,
)
from web3 import HTTPProvider, Web3
from web3.types import TxReceipt
@ -47,7 +49,13 @@ from nucypher.blockchain.eth.interfaces import (
BlockchainInterface,
BlockchainInterfaceFactory,
)
from nucypher.blockchain.eth.models import PHASE1, PHASE2, Coordinator
from nucypher.blockchain.eth.models import (
HANDOVER_AWAITING_BLINDED_SHARE,
HANDOVER_AWAITING_TRANSCRIPT,
PHASE1,
PHASE2,
Coordinator,
)
from nucypher.blockchain.eth.registry import ContractRegistry
from nucypher.blockchain.eth.signers import Signer
from nucypher.blockchain.eth.trackers import dkg
@ -362,12 +370,13 @@ class Operator(BaseActor):
return validators
result = list()
for staking_provider_address in ritual.providers:
for i, staking_provider_address in enumerate(ritual.providers):
if self.checksum_address == staking_provider_address:
# Local
external_validator = Validator(
address=self.checksum_address,
public_key=self.ritual_power.public_key(),
share_index=i,
)
else:
# Remote
@ -380,11 +389,12 @@ class Operator(BaseActor):
f"Ferveo public key for {staking_provider_address} is {bytes(public_key).hex()[:-8:-1]}"
)
external_validator = Validator(
address=staking_provider_address, public_key=public_key
address=staking_provider_address,
public_key=public_key,
share_index=i,
)
result.append(external_validator)
result = sorted(result, key=lambda x: x.address)
self.dkg_storage.store_validators(ritual.id, result)
return result
@ -392,29 +402,65 @@ class Operator(BaseActor):
def _setup_async_hooks(
self, phase_id: PhaseId, *args
) -> BlockchainInterface.AsyncTxHooks:
tx_type = "POST_TRANSCRIPT" if phase_id.phase == PHASE1 else "POST_AGGREGATE"
TX_TYPES = {
PHASE1: "POST_TRANSCRIPT",
PHASE2: "POST_AGGREGATE",
HANDOVER_AWAITING_TRANSCRIPT: "HANDOVER_AWAITING_TRANSCRIPT",
HANDOVER_AWAITING_BLINDED_SHARE: "HANDOVER_POST_BLINDED_SHARE",
}
tx_type = TX_TYPES[phase_id.phase]
def resubmit_tx():
if phase_id.phase == PHASE1:
# check status of ritual before resubmitting; prevent infinite loops
if not self._is_phase_1_action_required(ritual_id=phase_id.ritual_id):
self.log.info(
f"No need to resubmit tx: additional action not required for ritual# {phase_id.ritual_id} (status={self.coordinator_agent.get_ritual_status(phase_id.ritual_id)})"
f"No need to resubmit tx: additional action not required for ritual #{phase_id.ritual_id} "
f"(status={self.coordinator_agent.get_ritual_status(phase_id.ritual_id)})"
)
return
async_tx = self.publish_transcript(*args)
else:
elif phase_id.phase == PHASE2:
# check status of ritual before resubmitting; prevent infinite loops
if not self._is_phase_2_action_required(ritual_id=phase_id.ritual_id):
self.log.info(
f"No need to resubmit tx: additional action not required for ritual# {phase_id.ritual_id} (status={self.coordinator_agent.get_ritual_status(phase_id.ritual_id)})"
f"No need to resubmit tx: additional action not required for ritual #{phase_id.ritual_id} "
f"(status={self.coordinator_agent.get_ritual_status(phase_id.ritual_id)})"
)
return
async_tx = self.publish_aggregated_transcript(*args)
elif phase_id.phase == HANDOVER_AWAITING_TRANSCRIPT:
# check status of handover before resubmitting; prevent infinite loops
_, departing_validator, _ = args
if not self._is_handover_transcript_required(
ritual_id=phase_id.ritual_id,
departing_validator=departing_validator,
):
self.log.info(
f"No need to resubmit tx: additional action not required for handover in ritual #{phase_id.ritual_id}"
)
return
async_tx = self._publish_handover_transcript(*args)
elif phase_id.phase == HANDOVER_AWAITING_BLINDED_SHARE:
# check status of handover before resubmitting; prevent infinite loops
if not self._is_handover_blinded_share_required(
ritual_id=phase_id.ritual_id
):
self.log.info(
f"No need to resubmit tx: additional action not required for handover in ritual #{phase_id.ritual_id}"
)
return
async_tx = self._publish_blinded_share_for_handover(*args)
else:
raise ValueError(
f"Unsupported phase {phase_id.phase} for async tx resubmission"
)
self.log.info(
f"{self.transacting_power.account[:8]} resubmitted a new async tx {async_tx.id} "
f"for DKG ritual #{phase_id.ritual_id}"
f"of type {tx_type} for DKG ritual #{phase_id.ritual_id}."
)
def on_broadcast_failure(tx: FutureTx, e: Exception):
@ -694,7 +740,7 @@ class Operator(BaseActor):
)
if async_tx:
self.log.info(
f"Active ritual in progress: {self.transacting_power.account} has submitted tx"
f"Active ritual in progress: {self.transacting_power.account} has submitted tx "
f"for ritual #{ritual_id}, phase #{PHASE2} (final: {async_tx.final})."
)
return async_tx
@ -711,19 +757,20 @@ class Operator(BaseActor):
)
validators = self._resolve_validators(ritual)
transcripts = (Transcript.from_bytes(bytes(t)) for t in ritual.transcripts)
messages = list(zip(validators, transcripts))
transcripts = list(Transcript.from_bytes(bytes(t)) for t in ritual.transcripts)
messages = [ValidatorMessage(v, t) for v, t in zip(validators, transcripts)]
try:
(
aggregated_transcript,
dkg_public_key,
) = self.ritual_power.aggregate_transcripts(
aggregated_transcript = self.ritual_power.aggregate_transcripts(
threshold=ritual.threshold,
shares=ritual.shares,
checksum_address=self.checksum_address,
ritual_id=ritual.id,
transcripts=messages,
validator_messages=messages,
)
dkg_public_key = aggregated_transcript.public_key
# FIXME: Workaround: remove the public key (last 8 + 48 bytes of the aggregated transcript)
# to pass size validation check on contract publish. See ferveo#209
aggregated_transcript = bytes(aggregated_transcript)[: -8 - 48]
except Exception as e:
stack_trace = traceback.format_stack()
self.log.critical(
@ -750,6 +797,330 @@ class Operator(BaseActor):
return async_tx
def _is_handover_applicable_to_node(
self,
ritual_id: int,
departing_validator: ChecksumAddress,
expected_handover_status: int,
require_incoming_validator_match: bool = False,
) -> bool:
"""
Determines if the handover phase is applicable to this node for a given ritual.
Checks ritual status, handover request presence, and handover status.
If require_incoming_validator_match is True, also checks if this node is the incoming validator.
"""
# check ritual status
status = self.coordinator_agent.get_ritual_status(ritual_id=ritual_id)
if status != Coordinator.RitualStatus.ACTIVE:
# This is a normal state when replaying/syncing historical
# blocks that contain StartRitual events of pending or completed rituals.
self.log.debug(
f"Ritual #{ritual_id} is not active so handover is not possible; dkg status={status}."
)
return False
# check handover request present for the ritual
handover_request = self.coordinator_agent.get_handover(
ritual_id=ritual_id, departing_validator=departing_validator
)
# empty handover object could be returned - check departing_validator
if handover_request.departing_validator != departing_validator:
# handover request not found for departing validator
if departing_validator == self.checksum_address:
self.log.debug(
f"Handover request for ritual #{ritual_id} is not for this node."
)
else:
self.log.debug(
f"No handover request for ritual #{ritual_id} with departing validator {departing_validator}."
)
return False
# Optionally check incoming validator
if (
require_incoming_validator_match
and handover_request.incoming_validator != self.checksum_address
):
self.log.debug(
f"Handover request for ritual #{ritual_id} is not for this node."
)
return False
# Check handover status
handover_status = self.coordinator_agent.get_handover_status(
ritual_id=ritual_id,
departing_validator=departing_validator,
)
if handover_status != expected_handover_status:
self.log.debug(
f"Handover status, {handover_status}, for ritual #{ritual_id} is not in the expected state {expected_handover_status}."
)
return False
return True
def _is_handover_transcript_required(
self,
ritual_id: int,
departing_validator: ChecksumAddress,
) -> bool:
"""Check whether node needs to act as the incoming validator in a handover."""
return self._is_handover_applicable_to_node(
ritual_id=ritual_id,
departing_validator=departing_validator,
expected_handover_status=Coordinator.HandoverStatus.HANDOVER_AWAITING_TRANSCRIPT,
require_incoming_validator_match=True,
)
def _produce_handover_transcript(
self, ritual_id: int, departing_validator: ChecksumAddress
) -> HandoverTranscript:
ritual = self._resolve_ritual(ritual_id)
validators = self._resolve_validators(ritual)
# Raises ValueError if departing_validator is not in the providers list
handover_slot_index = ritual.providers.index(departing_validator)
# FIXME: Workaround: add serialized public key to aggregated transcript.
# Since we use serde/bincode in rust, we need a metadata field for the public key, which is the field size,
# as 8 bytes in little-endian. See ferveo#209
public_key_metadata = b"0\x00\x00\x00\x00\x00\x00\x00"
transcript = (
bytes(ritual.aggregated_transcript)
+ public_key_metadata
+ bytes(ritual.public_key)
)
aggregated_transcript = AggregatedTranscript.from_bytes(transcript)
handover_transcript = self.ritual_power.produce_handover_transcript(
nodes=validators,
aggregated_transcript=aggregated_transcript,
handover_slot_index=handover_slot_index,
ritual_id=ritual_id,
shares=ritual.dkg_size,
threshold=ritual.threshold,
)
return handover_transcript
def _publish_handover_transcript(
self,
ritual_id: int,
departing_validator: ChecksumAddress,
handover_transcript: HandoverTranscript,
) -> AsyncTx:
"""Publish a handover transcript to the Coordinator."""
participant_public_key = self.threshold_request_power.get_pubkey_from_ritual_id(
ritual_id
)
handover_transcript = bytes(handover_transcript)
identifier = PhaseId(ritual_id=ritual_id, phase=HANDOVER_AWAITING_TRANSCRIPT)
async_tx_hooks = self._setup_async_hooks(
identifier, ritual_id, departing_validator, handover_transcript
)
async_tx = self.coordinator_agent.post_handover_transcript(
ritual_id=ritual_id,
departing_validator=departing_validator,
handover_transcript=handover_transcript,
participant_public_key=participant_public_key,
transacting_power=self.transacting_power,
async_tx_hooks=async_tx_hooks,
)
self.dkg_storage.store_ritual_phase_async_tx(
phase_id=identifier, async_tx=async_tx
)
return async_tx
def perform_handover_transcript_phase(
self,
ritual_id: int,
departing_participant: ChecksumAddress,
**kwargs,
) -> Optional[AsyncTx]:
if not self._is_handover_transcript_required(
ritual_id=ritual_id, departing_validator=departing_participant
):
self.log.debug(
f"No action required for handover transcript for ritual #{ritual_id}"
)
return None
# check if there is a pending tx for this phase
async_tx = self.dkg_storage.get_ritual_phase_async_tx(
phase_id=PhaseId(ritual_id, HANDOVER_AWAITING_TRANSCRIPT)
)
if async_tx:
self.log.info(
f"Active handover in progress: {self.transacting_power.account} has submitted tx "
f"for ritual #{ritual_id}, handover transcript phase, (final: {async_tx.final})."
)
return async_tx
try:
handover_transcript = self._produce_handover_transcript(
ritual_id=ritual_id,
departing_validator=departing_participant,
)
except Exception as e:
stack_trace = traceback.format_stack()
message = (
f"Failed to produce handover transcript for ritual #{ritual_id} and "
f"departing validator {departing_participant}: {str(e)}\n{stack_trace}"
)
self.log.critical(message)
return None
async_tx = self._publish_handover_transcript(
ritual_id=ritual_id,
departing_validator=departing_participant,
handover_transcript=handover_transcript,
)
self.log.debug(
f"{self.transacting_power.account[:8]} created a handover transcript for "
f"DKG ritual #{ritual_id} and departing validator {departing_participant}."
)
return async_tx
def _is_handover_blinded_share_required(self, ritual_id: int) -> bool:
"""Check whether node needs to post a blind share for handover"""
return self._is_handover_applicable_to_node(
ritual_id=ritual_id,
departing_validator=self.checksum_address,
expected_handover_status=Coordinator.HandoverStatus.HANDOVER_AWAITING_BLINDED_SHARE,
require_incoming_validator_match=False,
)
def _produce_blinded_share_for_handover(self, ritual_id: int) -> bytes:
ritual = self._resolve_ritual(ritual_id)
# FIXME: Workaround: add serialized public key to aggregated transcript.
# Since we use serde/bincode in rust, we need a metadata field for the public key, which is the field size,
# as 8 bytes in little-endian. See ferveo#209
public_key_metadata = b"0\x00\x00\x00\x00\x00\x00\x00"
transcript = (
bytes(ritual.aggregated_transcript)
+ public_key_metadata
+ bytes(ritual.public_key)
)
aggregated_transcript = AggregatedTranscript.from_bytes(transcript)
handover = self.coordinator_agent.get_handover(
ritual_id=ritual_id,
departing_validator=self.checksum_address,
)
self.log.debug(
f"{self.transacting_power.account[:8]} producing a new blinded share "
f"for handover at ritual #{ritual_id}." # and validator index #{handover.share_index}." # TODO: See ferveo#210
)
handover_transcript = HandoverTranscript.from_bytes(bytes(handover.transcript))
try:
new_aggregate = self.ritual_power.finalize_handover(
aggregated_transcript=aggregated_transcript,
handover_transcript=handover_transcript,
)
except Exception as e:
stack_trace = traceback.format_stack()
self.log.critical(
f"Failed to produce blinded share for ritual #{ritual_id}: {str(e)}\n{stack_trace}"
)
return
# extract blinded share from aggregate
# TODO: This is a temporary workaround to extract the blinded share
# See https://github.com/nucypher/nucypher-contracts/issues/400
aggregate_bytes = bytes(new_aggregate)
# TODO: Workaround to find the share index of the current validator. This was supposed to be
# in the handover request, but the rust implementation does not expose it. See ferveo#210
for v in self._resolve_validators(ritual):
if v.address == self.checksum_address:
share_index = v.share_index
break
else:
raise ValueError(
f"Validator {self.checksum_address} not found in ritual #{ritual_id} providers."
)
start = 32 + 48 * ritual.threshold + 96 * share_index
length = 96
blinded_share = aggregate_bytes[start : start + length]
return blinded_share
def _publish_blinded_share_for_handover(
self,
ritual_id: int,
blinded_share: bytes,
) -> AsyncTx:
"""Publish a handover blinded share to the Coordinator."""
blinded_share = bytes(blinded_share)
identifier = PhaseId(ritual_id=ritual_id, phase=HANDOVER_AWAITING_BLINDED_SHARE)
async_tx_hooks = self._setup_async_hooks(
identifier,
ritual_id,
blinded_share,
)
async_tx = self.coordinator_agent.post_blinded_share_for_handover(
ritual_id=ritual_id,
blinded_share=blinded_share,
transacting_power=self.transacting_power,
async_tx_hooks=async_tx_hooks,
)
self.dkg_storage.store_ritual_phase_async_tx(
phase_id=identifier, async_tx=async_tx
)
return async_tx
def perform_handover_blinded_share_phase(
self, ritual_id: int, **kwargs
) -> Optional[AsyncTx]:
if not self._is_handover_blinded_share_required(ritual_id=ritual_id):
self.log.debug(
f"No action required for handover blinded share for ritual #{ritual_id}"
)
return None
# check if there is a pending tx for this phase
async_tx = self.dkg_storage.get_ritual_phase_async_tx(
phase_id=PhaseId(ritual_id, HANDOVER_AWAITING_BLINDED_SHARE)
)
if async_tx:
self.log.info(
f"Active handover in progress: {self.transacting_power.account} has submitted tx "
f"for ritual #{ritual_id}, blinded share phase, (final: {async_tx.final})."
)
return async_tx
if not self._is_handover_blinded_share_required(ritual_id=ritual_id):
self.log.debug(
f"No action required for handover blinded share for ritual #{ritual_id}"
)
return None
try:
blinded_share = self._produce_blinded_share_for_handover(
ritual_id=ritual_id,
)
except Exception as e:
stack_trace = traceback.format_stack()
self.log.critical(
f"Failed to produce handover blinded share for ritual #{ritual_id}: {e}\n{stack_trace}"
)
return None
async_tx = self._publish_blinded_share_for_handover(
ritual_id=ritual_id,
blinded_share=blinded_share,
)
self.log.debug(
f"{self.transacting_power.account[:8]} created a handover blinded share for "
f"DKG ritual #{ritual_id}."
)
return async_tx
def prune_ritual_metadata_due_to_handover(self, ritual_id: int) -> None:
# clear ritual object and validators since handover modifies ritual
self.dkg_storage.clear_active_ritual_object(ritual_id)
self.dkg_storage.clear_validators(ritual_id)
def produce_decryption_share(
self,
ritual_id: int,
@ -759,9 +1130,16 @@ class Operator(BaseActor):
) -> Union[DecryptionShareSimple, DecryptionSharePrecomputed]:
ritual = self._resolve_ritual(ritual_id)
validators = self._resolve_validators(ritual)
aggregated_transcript = AggregatedTranscript.from_bytes(
# FIXME: Workaround: add serialized public key to aggregated transcript.
# Since we use serde/bincode in rust, we need a metadata field for the public key, which is the field size,
# as 8 bytes in little-endian. See ferveo#209
public_key_metadata = b"0\x00\x00\x00\x00\x00\x00\x00"
transcript = (
bytes(ritual.aggregated_transcript)
+ public_key_metadata
+ bytes(ritual.public_key)
)
aggregated_transcript = AggregatedTranscript.from_bytes(transcript)
decryption_share = self.ritual_power.produce_decryption_share(
nodes=validators,
threshold=ritual.threshold,

View File

@ -48,7 +48,14 @@ from nucypher.blockchain.eth.interfaces import (
BlockchainInterface,
BlockchainInterfaceFactory,
)
from nucypher.blockchain.eth.models import PHASE1, PHASE2, Coordinator, Ferveo
from nucypher.blockchain.eth.models import (
HANDOVER_AWAITING_BLINDED_SHARE,
HANDOVER_AWAITING_TRANSCRIPT,
PHASE1,
PHASE2,
Coordinator,
Ferveo,
)
from nucypher.blockchain.eth.registry import (
ContractRegistry,
)
@ -565,8 +572,8 @@ class CoordinatorAgent(EthereumContractAgent):
contract_name: str = "Coordinator"
@contract_api(CONTRACT_CALL)
def get_timeout(self) -> int:
return self.contract.functions.timeout().call()
def get_dkg_timeout(self) -> int:
return self.contract.functions.dkgTimeout().call()
@contract_api(CONTRACT_CALL)
def get_ritual_status(self, ritual_id: int) -> int:
@ -749,9 +756,48 @@ class CoordinatorAgent(EthereumContractAgent):
@contract_api(CONTRACT_CALL)
def is_provider_public_key_set(self, staking_provider: ChecksumAddress) -> bool:
result = self.contract.functions.isProviderPublicKeySet(staking_provider).call()
result = self.contract.functions.isProviderKeySet(staking_provider).call()
return result
@contract_api(CONTRACT_CALL)
def get_handover_timeout(self) -> int:
return self.contract.functions.handoverTimeout().call()
@contract_api(CONTRACT_CALL)
def get_handover_status(
self, ritual_id: int, departing_validator: ChecksumAddress
) -> int:
result = self.contract.functions.getHandoverState(
ritual_id, departing_validator
).call()
return result
@contract_api(CONTRACT_CALL)
def get_handover_key(
self, ritual_id: int, departing_validator: ChecksumAddress
) -> bytes:
result = self.contract.functions.getHandoverKey(
ritual_id, departing_validator
).call()
return bytes(result)
@contract_api(CONTRACT_CALL)
def get_handover(
self, ritual_id: int, departing_validator: ChecksumAddress
) -> Coordinator.Handover:
key = self.get_handover_key(ritual_id, departing_validator)
result = self.contract.functions.handovers(key).call()
handover = Coordinator.Handover(
key=key,
departing_validator=ChecksumAddress(departing_validator),
init_timestamp=int(result[0]),
incoming_validator=ChecksumAddress(result[1]),
transcript=bytes(result[2]),
decryption_request_pubkey=bytes(result[3]),
blinded_share=bytes(result[4]),
)
return handover
@contract_api(TRANSACTION)
def set_provider_public_key(
self, public_key: FerveoPublicKey, transacting_power: TransactingPower
@ -790,7 +836,8 @@ class CoordinatorAgent(EthereumContractAgent):
transacting_power: TransactingPower,
async_tx_hooks: BlockchainInterface.AsyncTxHooks,
) -> AsyncTx:
contract_function: ContractFunction = self.contract.functions.postTranscript(
# See sprints/#145
contract_function: ContractFunction = self.contract.functions.publishTranscript(
ritualId=ritual_id, transcript=bytes(transcript)
)
async_tx = self.blockchain.send_async_transaction(
@ -826,6 +873,104 @@ class CoordinatorAgent(EthereumContractAgent):
)
return async_tx
@contract_api(TRANSACTION)
def request_handover(
self,
ritual_id: int,
departing_validator: ChecksumAddress,
incoming_validator: ChecksumAddress,
transacting_power: TransactingPower,
) -> TxReceipt:
contract_function: ContractFunction = self.contract.functions.handoverRequest(
ritualId=ritual_id,
departingParticipant=departing_validator,
incomingParticipant=incoming_validator,
)
receipt = self.blockchain.send_transaction(
contract_function=contract_function, transacting_power=transacting_power
)
return receipt
@contract_api(TRANSACTION)
def post_handover_transcript(
self,
ritual_id: int,
departing_validator: ChecksumAddress,
handover_transcript: bytes,
participant_public_key: SessionStaticKey,
transacting_power: TransactingPower,
async_tx_hooks: BlockchainInterface.AsyncTxHooks,
) -> AsyncTx:
contract_function: ContractFunction = (
self.contract.functions.postHandoverTranscript(
ritualId=ritual_id,
departingParticipant=departing_validator,
transcript=bytes(handover_transcript),
decryptionRequestStaticKey=bytes(participant_public_key),
)
)
async_tx = self.blockchain.send_async_transaction(
contract_function=contract_function,
gas_estimation_multiplier=1.4,
transacting_power=transacting_power,
async_tx_hooks=async_tx_hooks,
info={"ritual_id": ritual_id, "phase": HANDOVER_AWAITING_TRANSCRIPT},
)
return async_tx
@contract_api(TRANSACTION)
def post_blinded_share_for_handover(
self,
ritual_id: int,
blinded_share: bytes,
transacting_power: TransactingPower,
async_tx_hooks: BlockchainInterface.AsyncTxHooks,
) -> AsyncTx:
contract_function: ContractFunction = self.contract.functions.postBlindedShare(
ritualId=ritual_id,
blindedShare=bytes(blinded_share),
)
async_tx = self.blockchain.send_async_transaction(
contract_function=contract_function,
gas_estimation_multiplier=1.4,
transacting_power=transacting_power,
async_tx_hooks=async_tx_hooks,
info={"ritual_id": ritual_id, "phase": HANDOVER_AWAITING_BLINDED_SHARE},
)
return async_tx
@contract_api(TRANSACTION)
def finalize_handover(
self,
ritual_id: int,
departing_validator: ChecksumAddress,
transacting_power: TransactingPower,
) -> TxReceipt:
contract_function: ContractFunction = self.contract.functions.finalizeHandover(
ritualId=ritual_id,
departingParticipant=departing_validator,
)
receipt = self.blockchain.send_transaction(
contract_function=contract_function, transacting_power=transacting_power
)
return receipt
@contract_api(TRANSACTION)
def cancel_handover(
self,
ritual_id: int,
departing_validator: ChecksumAddress,
transacting_power: TransactingPower,
) -> TxReceipt:
contract_function: ContractFunction = self.contract.functions.cancelHandover(
ritualId=ritual_id,
departingParticipant=departing_validator,
)
receipt = self.blockchain.send_transaction(
contract_function=contract_function, transacting_power=transacting_power
)
return receipt
@contract_api(CONTRACT_CALL)
def get_ritual_id_from_public_key(self, public_key: DkgPublicKey) -> int:
g1_point = Ferveo.G1Point.from_dkg_public_key(public_key)

File diff suppressed because it is too large Load Diff

File diff suppressed because it is too large Load Diff

File diff suppressed because it is too large Load Diff

View File

@ -17,7 +17,8 @@ from nucypher.types import PhaseNumber
PHASE1 = PhaseNumber(1)
PHASE2 = PhaseNumber(2)
HANDOVER_AWAITING_TRANSCRIPT = PhaseNumber(11)
HANDOVER_AWAITING_BLINDED_SHARE = PhaseNumber(12)
@dataclass
class Ferveo:
@ -162,3 +163,21 @@ class Coordinator:
for participant_data in data:
participant = Coordinator.Participant.from_data(data=participant_data)
yield participant
@dataclass
class HandoverStatus:
NON_INITIATED = 0
HANDOVER_AWAITING_TRANSCRIPT = 1
HANDOVER_AWAITING_BLINDED_SHARE = 2
HANDOVER_AWAITING_FINALIZATION = 3
HANDOVER_TIMEOUT = 4
@dataclass
class Handover:
key: bytes
departing_validator: ChecksumAddress
incoming_validator: ChecksumAddress
init_timestamp: int
blinded_share: bytes
transcript: bytes
decryption_request_pubkey: bytes

View File

@ -113,12 +113,17 @@ class ActiveRitualTracker:
self.actions = {
self.contract.events.StartRitual: self.operator.perform_round_1,
self.contract.events.StartAggregationRound: self.operator.perform_round_2,
self.contract.events.HandoverRequest: self.operator.perform_handover_transcript_phase,
self.contract.events.HandoverTranscriptPosted: self.operator.perform_handover_blinded_share_phase,
}
self.events = [
self.contract.events.StartRitual,
self.contract.events.StartAggregationRound,
self.contract.events.EndRitual,
self.contract.events.HandoverRequest,
self.contract.events.HandoverTranscriptPosted,
self.contract.events.HandoverFinalized,
]
# TODO: Remove the default JSON-RPC retry middleware
@ -141,7 +146,7 @@ class ActiveRitualTracker:
self.task = EventScannerTask(scanner=self.scan)
cache_ttl = (
self.coordinator_agent.get_timeout()
self.coordinator_agent.get_dkg_timeout()
+ self._RITUAL_TIMEOUT_ADDITIONAL_TTL_BUFFER
)
self._participation_states = TTLCache(
@ -169,7 +174,7 @@ class ActiveRitualTracker:
Returns the block number to start scanning for events from.
"""
w3 = self.web3
timeout = self.coordinator_agent.get_timeout()
timeout = self.coordinator_agent.get_dkg_timeout()
latest_block = w3.eth.get_block("latest")
if latest_block.number == 0:
@ -211,15 +216,51 @@ class ActiveRitualTracker:
def _action_required(self, ritual_event: AttributeDict) -> bool:
"""Check if an action is required for a given ritual event."""
# Let's handle separately handover events and non-handover events
handover_events = [
self.contract.events.HandoverTranscriptPosted,
self.contract.events.HandoverRequest,
self.contract.events.HandoverFinalized,
]
event_type = getattr(self.contract.events, ritual_event.event)
if event_type in handover_events:
# handover modifies existing ritual metadata; so we need to proactively prune it
# during handover process and at the end to avoid having any stale metadata
# in the cache
self.operator.prune_ritual_metadata_due_to_handover(
ritual_event.args.ritualId
)
if event_type == self.contract.events.HandoverFinalized:
# pruning metadata is sufficient when Handover is finalized;
# no further action required
return False
is_departing_participant_in_handover = (
event_type == self.contract.events.HandoverTranscriptPosted
and ritual_event.args.departingParticipant
== self.operator.checksum_address
)
is_incoming_participant_in_handover = (
event_type == self.contract.events.HandoverRequest
and ritual_event.args.incomingParticipant
== self.operator.checksum_address
)
# for handover events we need to act only if the operator is the departing or incoming participant
return (
is_departing_participant_in_handover
or is_incoming_participant_in_handover
)
# Non-handover events (for the moment, DKG events)
# establish participation state first
participation_state = self._get_participation_state(ritual_event)
if not participation_state.participating:
return False
# does event have an associated action
event_type = getattr(self.contract.events, ritual_event.event)
event_has_associated_action = event_type in self.actions
already_posted_transcript = (
event_type == self.contract.events.StartRitual

View File

@ -87,37 +87,40 @@ class DKGOmniscient:
]
validators = [
ferveo.Validator(checksum_addresses[i], keypair.public_key())
ferveo.Validator(checksum_addresses[i], keypair.public_key(), i)
for i, keypair in enumerate(validator_keypairs)
]
# Validators must be sorted by their public key
validators.sort(key=attrgetter("address"))
# Each validator generates a transcript which is publicly stored
self.transcripts = []
for sender in validators:
self.validator_messages = []
for validator in validators:
transcript = dkg.generate_transcript(
ritual_id=self.tau,
me=sender,
me=validator,
shares=self.shares_num,
threshold=self.security_threshold,
nodes=validators,
)
self.transcripts.append((sender, transcript))
self.validator_messages.append(
dkg.ValidatorMessage(validator, transcript)
)
self.dkg = dkg
self.validators = validators
self.validator_keypairs = validator_keypairs
# any validator can generate the same aggregated transcript
self.server_aggregate, self.dkg_public_key = dkg.aggregate_transcripts(
self.server_aggregate = dkg.aggregate_transcripts(
ritual_id=self.tau,
me=validators[0],
shares=self.shares_num,
threshold=self.security_threshold,
transcripts=self.transcripts,
validator_messages=self.validator_messages,
)
self.dkg_public_key = self.server_aggregate.public_key
_dkg_insight = DKGInsight()
@ -167,10 +170,7 @@ class DKGOmniscientDecryptionClient(ThresholdDecryptionClient):
variant = threshold_decryption_request.variant
# We can obtain the transcripts from the side-channel (deserialize) and aggregate them
validator_messages = [
ferveo.ValidatorMessage(validator, transcript)
for validator, transcript in self._learner._dkg_insight.transcripts
]
validator_messages = self._learner._dkg_insight.validator_messages
aggregate = ferveo.AggregatedTranscript(validator_messages)
assert aggregate.verify(
self._learner._dkg_insight.shares_num,

View File

@ -420,9 +420,7 @@ class CharacterConfiguration(BaseConfiguration):
self.crypto_power = crypto_power
if keystore_path and not keystore:
keystore = Keystore(keystore_path=keystore_path)
self.__keystore = self.__keystore = keystore or NO_KEYSTORE_ATTACHED.bool_value(
False
)
self.__keystore = keystore or NO_KEYSTORE_ATTACHED.bool_value(False)
self.keystore_dir = (
Path(keystore.keystore_path).parent
if keystore

View File

@ -1,4 +1,4 @@
from typing import List, Tuple, Union
from typing import List, Union
from nucypher_core.ferveo import (
AggregatedTranscript,
@ -8,6 +8,7 @@ from nucypher_core.ferveo import (
Dkg,
DkgPublicKey,
FerveoVariant,
HandoverTranscript,
Keypair,
Transcript,
Validator,
@ -43,27 +44,46 @@ def _make_dkg(
return dkg
def generate_transcript(*args, **kwargs) -> Transcript:
dkg = _make_dkg(*args, **kwargs)
def generate_transcript(
me: Validator,
ritual_id: int,
shares: int,
threshold: int,
nodes: List[Validator],
) -> Transcript:
dkg = _make_dkg(
me=me, ritual_id=ritual_id, shares=shares, threshold=threshold, nodes=nodes
)
transcript = dkg.generate_transcript()
return transcript
def derive_public_key(*args, **kwargs) -> DkgPublicKey:
dkg = _make_dkg(*args, **kwargs)
def derive_public_key(
me: Validator, ritual_id: int, shares: int, threshold: int, nodes: List[Validator]
) -> DkgPublicKey:
dkg = _make_dkg(
me=me, ritual_id=ritual_id, shares=shares, threshold=threshold, nodes=nodes
)
return dkg.public_key
def aggregate_transcripts(
transcripts: List[Tuple[Validator, Transcript]], shares: int, *args, **kwargs
) -> Tuple[AggregatedTranscript, DkgPublicKey]:
validators = [t[0] for t in transcripts]
_dkg = _make_dkg(nodes=validators, shares=shares, *args, **kwargs)
validator_msgs = [ValidatorMessage(v[0], v[1]) for v in transcripts]
pvss_aggregated = _dkg.aggregate_transcripts(validator_msgs)
verify_aggregate(pvss_aggregated, shares, validator_msgs)
LOGGER.debug(f"derived final DKG key {bytes(_dkg.public_key).hex()[:10]}")
return pvss_aggregated, _dkg.public_key
me: Validator,
ritual_id: int,
shares: int,
threshold: int,
validator_messages: List[ValidatorMessage],
) -> AggregatedTranscript:
nodes = [vm.validator for vm in validator_messages]
dkg = _make_dkg(
me=me, ritual_id=ritual_id, shares=shares, threshold=threshold, nodes=nodes
)
pvss_aggregated = dkg.aggregate_transcripts(validator_messages)
verify_aggregate(pvss_aggregated, shares, validator_messages)
LOGGER.debug(
f"derived final DKG key {bytes(pvss_aggregated.public_key).hex()[:10]}"
)
return pvss_aggregated
def verify_aggregate(
@ -81,15 +101,23 @@ def produce_decryption_share(
ciphertext_header: CiphertextHeader,
aad: bytes,
variant: FerveoVariant,
*args, **kwargs
me: Validator,
ritual_id: int,
shares: int,
threshold: int,
) -> Union[DecryptionShareSimple, DecryptionSharePrecomputed]:
dkg = _make_dkg(nodes=nodes, *args, **kwargs)
dkg = _make_dkg(
me=me, ritual_id=ritual_id, shares=shares, threshold=threshold, nodes=nodes
)
if not all((nodes, aggregated_transcript, keypair, ciphertext_header, aad)):
raise Exception("missing arguments") # sanity check
try:
derive_share = _VARIANTS[variant]
except KeyError:
raise ValueError(f"Invalid variant {variant}")
# TODO: #3636 - Precomputed variant now requires selected validators, which is not passed here
# However, we never use it in the codebase, so this is not a problem for now.
share = derive_share(
# first arg here is intended to be "self" since the method is unbound
aggregated_transcript,
@ -99,3 +127,45 @@ def produce_decryption_share(
keypair
)
return share
def produce_handover_transcript(
nodes: List[Validator],
aggregated_transcript: AggregatedTranscript,
handover_slot_index: int,
keypair: Keypair,
ritual_id: int,
shares: int,
threshold: int,
) -> HandoverTranscript:
if not all((nodes, aggregated_transcript, keypair)):
raise Exception("missing arguments") # sanity check
dkg = _make_dkg(
# TODO: is fixed 0-index fine here? I don't believe it matters
me=nodes[0],
ritual_id=ritual_id,
shares=shares,
threshold=threshold,
nodes=nodes,
)
handover_transcript = dkg.generate_handover_transcript(
aggregated_transcript,
handover_slot_index,
keypair,
)
return handover_transcript
def finalize_handover(
aggregated_transcript: AggregatedTranscript,
handover_transcript: HandoverTranscript,
keypair: Keypair,
) -> HandoverTranscript:
if not all((aggregated_transcript, handover_transcript, keypair)):
raise Exception("missing arguments") # sanity check
new_aggregate = aggregated_transcript.finalize_handover(
handover_transcript, keypair
)
return new_aggregate

View File

@ -13,7 +13,7 @@ import click
from constant_sorrow.constants import KEYSTORE_LOCKED
from mnemonic.mnemonic import Mnemonic
from nucypher_core import SessionSecretFactory
from nucypher_core.ferveo import Keypair
from nucypher_core.ferveo import Keypair as FerveoKeypair
from nucypher_core.umbral import SecretKeyFactory
from nucypher.config.constants import DEFAULT_CONFIG_ROOT
@ -502,7 +502,7 @@ class Keystore:
elif issubclass(power_class, RitualisticPower):
keypair_class: RitualisticKeypair = power_class._keypair_class
size = Keypair.secure_randomness_size()
size = FerveoKeypair.secure_randomness_size()
blob = __skf.make_secret(info)[:size]
keypair = keypair_class.from_secure_randomness(blob)
power = power_class(keypair=keypair, *power_args, **power_kwargs)

View File

@ -18,10 +18,11 @@ from nucypher_core.ferveo import (
CiphertextHeader,
DecryptionSharePrecomputed,
DecryptionShareSimple,
DkgPublicKey,
FerveoVariant,
HandoverTranscript,
Transcript,
Validator,
ValidatorMessage,
)
from nucypher_core.umbral import PublicKey, SecretKey, SecretKeyFactory, generate_kfrags
@ -263,7 +264,26 @@ class RitualisticPower(KeyPairBasedPower):
_default_private_key_class = ferveo.Keypair
not_found_error = NoRitualisticPower
provides = ("derive_decryption_share", "generate_transcript")
provides = (
"derive_decryption_share",
"generate_transcript",
"initiate_handover",
"finalize_handover",
)
def __find_me_in_validator_set(
self, checksum_address: ChecksumAddress, nodes: List[Validator]
) -> Validator:
"""
Finds the Validator in the list of nodes by checksum address.
Raises ValueError if not found.
"""
for node in nodes:
if node.address == checksum_address:
return node
raise ValueError(
f"Validator with address {checksum_address} not found in nodes."
)
def produce_decryption_share(
self,
@ -279,7 +299,9 @@ class RitualisticPower(KeyPairBasedPower):
) -> Union[DecryptionShareSimple, DecryptionSharePrecomputed]:
decryption_share = dkg.produce_decryption_share(
ritual_id=ritual_id,
me=Validator(address=checksum_address, public_key=self.keypair.pubkey),
me=self.__find_me_in_validator_set(
checksum_address=checksum_address, nodes=nodes
),
shares=shares,
threshold=threshold,
nodes=nodes,
@ -291,17 +313,39 @@ class RitualisticPower(KeyPairBasedPower):
)
return decryption_share
def produce_handover_transcript(
self,
nodes: List[Validator],
aggregated_transcript: AggregatedTranscript,
handover_slot_index: int,
ritual_id: int,
shares: int,
threshold: int,
) -> HandoverTranscript:
handover_transcript = dkg.produce_handover_transcript(
nodes=nodes,
aggregated_transcript=aggregated_transcript,
handover_slot_index=handover_slot_index,
ritual_id=ritual_id,
shares=shares,
threshold=threshold,
keypair=self.keypair._privkey,
)
return handover_transcript
def generate_transcript(
self,
checksum_address: ChecksumAddress,
ritual_id: int,
shares: int,
threshold: int,
nodes: list
self,
checksum_address: ChecksumAddress,
ritual_id: int,
shares: int,
threshold: int,
nodes: List[Validator],
) -> Transcript:
transcript = dkg.generate_transcript(
ritual_id=ritual_id,
me=Validator(address=checksum_address, public_key=self.keypair.pubkey),
me=self.__find_me_in_validator_set(
checksum_address=checksum_address, nodes=nodes
),
shares=shares,
threshold=threshold,
nodes=nodes
@ -314,16 +358,31 @@ class RitualisticPower(KeyPairBasedPower):
checksum_address: ChecksumAddress,
shares: int,
threshold: int,
transcripts: List[Tuple[Validator, Transcript]],
) -> Tuple[AggregatedTranscript, DkgPublicKey]:
aggregated_transcript, dkg_public_key = dkg.aggregate_transcripts(
validator_messages: List[ValidatorMessage],
) -> AggregatedTranscript:
nodes = [vm.validator for vm in validator_messages]
aggregated_transcript = dkg.aggregate_transcripts(
ritual_id=ritual_id,
me=Validator(address=checksum_address, public_key=self.keypair.pubkey),
me=self.__find_me_in_validator_set(
checksum_address=checksum_address, nodes=nodes
),
shares=shares,
threshold=threshold,
transcripts=transcripts
validator_messages=validator_messages,
)
return aggregated_transcript, dkg_public_key
return aggregated_transcript
def finalize_handover(
self,
aggregated_transcript: AggregatedTranscript,
handover_transcript: HandoverTranscript,
) -> AggregatedTranscript:
new_aggregate = dkg.finalize_handover(
aggregated_transcript=aggregated_transcript,
handover_transcript=handover_transcript,
keypair=self.keypair._privkey,
)
return new_aggregate
class DerivedKeyBasedPower(CryptoPowerUp):

View File

@ -6,7 +6,13 @@ from nucypher_core.ferveo import (
Validator,
)
from nucypher.blockchain.eth.models import PHASE1, Coordinator
from nucypher.blockchain.eth.models import (
HANDOVER_AWAITING_BLINDED_SHARE,
HANDOVER_AWAITING_TRANSCRIPT,
PHASE1,
PHASE2,
Coordinator,
)
from nucypher.types import PhaseId
@ -18,6 +24,9 @@ class DKGStorage:
_KEY_VALIDATORS = "validators"
# round 2
_KEY_PHASE_2_TXS = "phase_2_txs"
# handover phases
_KEY_PHASE_AWAITING_TRANSCRIPT_TXS = "handover_transcript_txs"
_KEY_PHASE_AWAITING_BLINDED_SHARE_TXS = "handover_blinded_share_txs"
# active rituals
_KEY_ACTIVE_RITUAL = "active_rituals"
@ -26,6 +35,8 @@ class DKGStorage:
_KEY_VALIDATORS,
_KEY_PHASE_2_TXS,
_KEY_ACTIVE_RITUAL,
_KEY_PHASE_AWAITING_TRANSCRIPT_TXS,
_KEY_PHASE_AWAITING_BLINDED_SHARE_TXS,
]
def __init__(self):
@ -45,7 +56,14 @@ class DKGStorage:
def __get_phase_key(cls, phase: int):
if phase == PHASE1:
return cls._KEY_PHASE_1_TXS
return cls._KEY_PHASE_2_TXS
elif phase == PHASE2:
return cls._KEY_PHASE_2_TXS
elif phase == HANDOVER_AWAITING_TRANSCRIPT:
return cls._KEY_PHASE_AWAITING_TRANSCRIPT_TXS
elif phase == HANDOVER_AWAITING_BLINDED_SHARE:
return cls._KEY_PHASE_AWAITING_BLINDED_SHARE_TXS
else:
raise ValueError(f"Unknown phase: {phase}")
def store_ritual_phase_async_tx(self, phase_id: PhaseId, async_tx: AsyncTx):
key = self.__get_phase_key(phase_id.phase)
@ -54,14 +72,18 @@ class DKGStorage:
def clear_ritual_phase_async_tx(self, phase_id: PhaseId, async_tx: AsyncTx) -> bool:
key = self.__get_phase_key(phase_id.phase)
if self._data[key].get(phase_id.ritual_id) is async_tx:
del self._data[key][phase_id.ritual_id]
return True
try:
del self._data[key][phase_id.ritual_id]
return True
except KeyError:
pass
return False
def get_ritual_phase_async_tx(self, phase_id: PhaseId) -> Optional[AsyncTx]:
key = self.__get_phase_key(phase_id.phase)
return self._data[key].get(phase_id.ritual_id)
# Validators for rituals
def store_validators(self, ritual_id: int, validators: List[Validator]) -> None:
self._data[self._KEY_VALIDATORS][ritual_id] = list(validators)
@ -72,6 +94,13 @@ class DKGStorage:
return list(validators)
def clear_validators(self, ritual_id: int) -> bool:
try:
del self._data[self._KEY_VALIDATORS][ritual_id]
return True
except KeyError:
return False
#
# Active Rituals
#
@ -83,3 +112,10 @@ class DKGStorage:
def get_active_ritual(self, ritual_id: int) -> Optional[Coordinator.Ritual]:
return self._data[self._KEY_ACTIVE_RITUAL].get(ritual_id)
def clear_active_ritual_object(self, ritual_id: int) -> bool:
try:
del self._data[self._KEY_ACTIVE_RITUAL][ritual_id]
return True
except KeyError:
return False

View File

@ -11,7 +11,10 @@ from nucypher.policy.conditions.exceptions import (
InvalidContextVariableData,
RequiredContextVariable,
)
from nucypher.policy.conditions.utils import ConditionProviderManager
from nucypher.policy.conditions.utils import (
ConditionProviderManager,
check_and_convert_big_int_string_to_int,
)
USER_ADDRESS_CONTEXT = ":userAddress"
USER_ADDRESS_EIP4361_EXTERNAL_CONTEXT = ":userAddressExternalEIP4361"
@ -114,6 +117,7 @@ def get_context_value(
try:
# DIRECTIVES are special context vars that will pre-processed by ursula
func = _DIRECTIVES[context_variable]
value = func(providers=providers, **context) # required inputs here
except KeyError:
# fallback for context variable without directive - assume key,value pair
# handles the case for user customized context variables
@ -122,8 +126,9 @@ def get_context_value(
raise RequiredContextVariable(
f'No value provided for unrecognized context variable "{context_variable}"'
)
else:
value = func(providers=providers, **context) # required inputs here
elif isinstance(value, str):
# possible big int value
value = check_and_convert_big_int_string_to_int(value)
return value

View File

@ -33,6 +33,7 @@ from nucypher.policy.conditions.exceptions import (
RPCExecutionFailed,
)
from nucypher.policy.conditions.lingo import (
AnyField,
ConditionType,
ExecutionCallAccessControlCondition,
ReturnValueTest,
@ -71,7 +72,7 @@ class RPCCall(ExecutionCall):
"null": "Undefined method name",
},
)
parameters = fields.List(fields.Field, required=False, allow_none=True)
parameters = fields.List(AnyField, required=False, allow_none=True)
@validates("method")
def validate_method(self, value):

View File

@ -62,3 +62,7 @@ class RPCExecutionFailed(ConditionEvaluationFailed):
class JsonRequestException(ConditionEvaluationFailed):
"""Raised when an exception is raised from a JSON request."""
class JWTException(ConditionEvaluationFailed):
"""Raised when an exception is raised when validating a JWT token"""

View File

@ -17,6 +17,7 @@ from nucypher.policy.conditions.json.base import (
JsonRequestCall,
)
from nucypher.policy.conditions.lingo import (
AnyField,
ConditionType,
ExecutionCallAccessControlCondition,
ReturnValueTest,
@ -26,7 +27,7 @@ from nucypher.policy.conditions.lingo import (
class BaseJsonRPCCall(JsonRequestCall, ABC):
class Schema(JsonRequestCall.Schema):
method = fields.Str(required=True)
params = fields.Field(required=False, allow_none=True)
params = AnyField(required=False, allow_none=True)
query = JSONPathField(required=False, allow_none=True)
authorization_token = fields.Str(required=False, allow_none=True)

View File

@ -0,0 +1,154 @@
from typing import Any, Optional, Tuple
import jwt
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import ec, rsa
from cryptography.hazmat.primitives.serialization import load_pem_public_key
from marshmallow import ValidationError, fields, post_load, validate, validates
from nucypher.policy.conditions.base import AccessControlCondition, ExecutionCall
from nucypher.policy.conditions.context import (
is_context_variable,
resolve_any_context_variables,
)
from nucypher.policy.conditions.exceptions import InvalidCondition, JWTException
from nucypher.policy.conditions.lingo import ConditionType
from nucypher.utilities.logging import Logger
class JWTVerificationCall(ExecutionCall):
_valid_jwt_algorithms = (
"ES256",
"RS256",
) # https://datatracker.ietf.org/doc/html/rfc7518#section-3.1
SECP_CURVE_FOR_ES256 = "secp256r1"
class Schema(ExecutionCall.Schema):
jwt_token = fields.Str(required=True)
# TODO: See #3572 for a discussion about deprecating this in favour of the expected issuer
public_key = fields.Str(
required=True
) # required? maybe a valid PK certificate passed by requester?
expected_issuer = fields.Str(required=False, allow_none=True)
# TODO: StringOrURI as per the spec.
@post_load
def make(self, data, **kwargs):
return JWTVerificationCall(**data)
@validates("jwt_token")
def validate_jwt_token(self, value):
if not is_context_variable(value):
raise ValidationError(
f"Invalid value for JWT token; expected a context variable, but got '{value}'"
)
@validates("public_key")
def validate_public_key(self, value):
try:
public_key = load_pem_public_key(
value.encode(), backend=default_backend()
)
if isinstance(public_key, rsa.RSAPublicKey):
return
elif isinstance(public_key, ec.EllipticCurvePublicKey):
curve = public_key.curve
if curve.name != JWTVerificationCall.SECP_CURVE_FOR_ES256:
raise ValidationError(
f"Invalid EC public key curve: {curve.name}"
)
except Exception as e:
raise ValidationError(f"Invalid public key format: {str(e)}")
def __init__(
self,
jwt_token: str,
public_key: str,
expected_issuer: Optional[str] = None,
):
self.jwt_token = jwt_token
self.public_key = public_key
self.expected_issuer = expected_issuer
self.logger = Logger(__name__)
super().__init__()
def execute(self, **context) -> Any:
jwt_token = resolve_any_context_variables(self.jwt_token, **context)
require = []
if self.expected_issuer:
require.append("iss")
try:
payload = jwt.decode(
jwt=jwt_token,
key=self.public_key,
algorithms=self._valid_jwt_algorithms,
options=dict(require=require),
issuer=self.expected_issuer,
)
except jwt.exceptions.InvalidAlgorithmError:
raise JWTException(f"valid algorithms: {self._valid_jwt_algorithms}")
except jwt.exceptions.InvalidTokenError as e:
raise JWTException(e)
return payload
class JWTCondition(AccessControlCondition):
"""
A JWT condition can be satisfied by presenting a valid JWT token, which not only is
required to be cryptographically verifiable, but also must fulfill certain additional
restrictions defined in the condition.
"""
CONDITION_TYPE = ConditionType.JWT.value
class Schema(AccessControlCondition.Schema, JWTVerificationCall.Schema):
condition_type = fields.Str(
validate=validate.Equal(ConditionType.JWT.value), required=True
)
@post_load
def make(self, data, **kwargs):
return JWTCondition(**data)
def __init__(
self,
jwt_token: str,
public_key: str,
condition_type: str = ConditionType.JWT.value,
name: Optional[str] = None,
expected_issuer: Optional[str] = None,
):
try:
self.execution_call = JWTVerificationCall(
jwt_token=jwt_token,
public_key=public_key,
expected_issuer=expected_issuer,
)
except ExecutionCall.InvalidExecutionCall as e:
raise InvalidCondition(str(e)) from e
super().__init__(condition_type=condition_type, name=name)
@property
def jwt_token(self):
return self.execution_call.jwt_token
@property
def public_key(self):
return self.execution_call.public_key
@property
def expected_issuer(self):
return self.execution_call.expected_issuer
def verify(self, **context) -> Tuple[bool, Any]:
payload = self.execution_call.execute(**context)
result = True
return result, payload

View File

@ -36,7 +36,53 @@ from nucypher.policy.conditions.exceptions import (
ReturnValueEvaluationError,
)
from nucypher.policy.conditions.types import ConditionDict, Lingo
from nucypher.policy.conditions.utils import CamelCaseSchema, ConditionProviderManager
from nucypher.policy.conditions.utils import (
CamelCaseSchema,
ConditionProviderManager,
check_and_convert_big_int_string_to_int,
)
class AnyField(fields.Field):
"""
Catch all field for all data types received in JSON.
However, `taco-web` will provide bigints as strings since typescript can't handle large
numbers as integers, so those need converting to integers.
"""
def _convert_any_big_ints_from_string(self, value):
if isinstance(value, list):
return [self._convert_any_big_ints_from_string(item) for item in value]
elif isinstance(value, dict):
return {
k: self._convert_any_big_ints_from_string(v) for k, v in value.items()
}
elif isinstance(value, str):
return check_and_convert_big_int_string_to_int(value)
return value
def _serialize(self, value, attr, obj, **kwargs):
return value
def _deserialize(self, value, attr, data, **kwargs):
return self._convert_any_big_ints_from_string(value)
class AnyLargeIntegerField(fields.Int):
"""
Integer field that also allows for big int values for large numbers
to be provided from `taco-web`. BigInts will be used for integer values > MAX_SAFE_INTEGER.
"""
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
def _deserialize(self, value, attr, data, **kwargs):
if isinstance(value, str):
value = check_and_convert_big_int_string_to_int(value)
return super()._deserialize(value, attr, data, **kwargs)
class _ConditionField(fields.Dict):
@ -58,7 +104,7 @@ class _ConditionField(fields.Dict):
return instance
# CONDITION = TIME | CONTRACT | RPC | JSON_API | COMPOUND | SEQUENTIAL | IF_THEN_ELSE_CONDITION
# CONDITION = TIME | CONTRACT | RPC | JSON_API | JSON_RPC | JWT | COMPOUND | SEQUENTIAL | IF_THEN_ELSE_CONDITION
class ConditionType(Enum):
"""
Defines the types of conditions that can be evaluated.
@ -69,6 +115,7 @@ class ConditionType(Enum):
RPC = "rpc"
JSONAPI = "json-api"
JSONRPC = "json-rpc"
JWT = "jwt"
COMPOUND = "compound"
SEQUENTIAL = "sequential"
IF_THEN_ELSE = "if-then-else"
@ -510,7 +557,7 @@ class ReturnValueTest:
class ReturnValueTestSchema(CamelCaseSchema):
SKIP_VALUES = (None,)
comparator = fields.Str(required=True, validate=OneOf(_COMPARATOR_FUNCTIONS))
value = fields.Raw(
value = AnyField(
allow_none=False, required=True
) # any valid type (excludes None)
index = fields.Int(
@ -701,6 +748,7 @@ class ConditionLingo(_Serializable):
from nucypher.policy.conditions.evm import ContractCondition, RPCCondition
from nucypher.policy.conditions.json.api import JsonApiCondition
from nucypher.policy.conditions.json.rpc import JsonRpcCondition
from nucypher.policy.conditions.jwt import JWTCondition
from nucypher.policy.conditions.time import TimeCondition
# version logical adjustments can be made here as required
@ -713,6 +761,7 @@ class ConditionLingo(_Serializable):
CompoundAccessControlCondition,
JsonApiCondition,
JsonRpcCondition,
JWTCondition,
SequentialAccessControlCondition,
IfThenElseCondition,
):

View File

@ -75,6 +75,12 @@ class JsonRpcConditionDict(BaseExecConditionDict):
authorizationToken: NotRequired[str]
class JWTConditionDict(_AccessControlCondition):
jwtToken: str
publicKey: str # TODO: See #3572 for a discussion about deprecating this in favour of the expected issuer
expectedIssuer: NotRequired[str]
#
# CompoundCondition represents:
# {
@ -130,6 +136,7 @@ class IfThenElseConditionDict(_AccessControlCondition):
# - CompoundCondition
# - JsonApiCondition
# - JsonRpcCondition
# - JWTCondition
# - SequentialCondition
# - IfThenElseCondition
ConditionDict = Union[
@ -139,6 +146,7 @@ ConditionDict = Union[
CompoundConditionDict,
JsonApiConditionDict,
JsonRpcConditionDict,
JWTConditionDict,
SequentialConditionDict,
IfThenElseConditionDict,
]

View File

@ -1,6 +1,6 @@
import re
from http import HTTPStatus
from typing import Dict, Iterator, List, Optional, Tuple
from typing import Dict, Iterator, List, Optional, Tuple, Union
from marshmallow import Schema, post_dump
from marshmallow.exceptions import SCHEMA
@ -223,3 +223,18 @@ def extract_single_error_message_from_schema_errors(
else ""
)
return f"{message_prefix}{message}"
def check_and_convert_big_int_string_to_int(value: str) -> Union[str, int]:
"""
Check if a string is a big int string and convert it to an integer, otherwise return the string.
"""
if re.fullmatch(r"^-?\d+n$", value):
try:
result = int(value[:-1])
return result
except ValueError:
# ignore
pass
return value

View File

@ -1,4 +1,5 @@
import pathlib
import re
import sys
from contextlib import contextmanager
from enum import Enum
@ -15,6 +16,7 @@ from twisted.logger import (
)
from twisted.logger import Logger as TwistedLogger
from twisted.python.logfile import LogFile
from twisted.web import http
import nucypher
from nucypher.config.constants import (
@ -211,11 +213,35 @@ def get_text_file_observer(name=DEFAULT_LOG_FILENAME, path=USER_LOG_DIR):
return observer
class Logger(TwistedLogger):
"""Drop-in replacement of Twisted's Logger, patching the emit() method to tolerate inputs with curly braces,
i.e., not compliant with PEP 3101.
def _redact_ip_address_when_logging_server_requests():
"""
Monkey-patch of twisted's HttpFactory log formatter so that logging of server requests
will exclude (redact) the IP address of the requester.
"""
original_formatter = http.combinedLogFormatter
ip_address_pattern = r"\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}"
See Issue #724 and, particularly, https://github.com/nucypher/nucypher/issues/724#issuecomment-600190455"""
def redact_ip_address_formatter(timestamp, request):
line = original_formatter(timestamp, request)
# redact any ip address
line = re.sub(ip_address_pattern, "<IP_REDACTED>", line)
return line
http.combinedLogFormatter = redact_ip_address_formatter
class Logger(TwistedLogger):
"""Drop-in replacement of Twisted's Logger:
1. patch the emit() method to tolerate inputs with curly braces,
i.e., not compliant with PEP 3101. See Issue #724 and, particularly,
https://github.com/nucypher/nucypher/issues/724#issuecomment-600190455
2. redact IP addresses for http requests
"""
_redact_ip_address_when_logging_server_requests()
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
@classmethod
def escape_format_string(cls, string):

4556
poetry.lock generated

File diff suppressed because it is too large Load Diff

View File

@ -1,12 +1,13 @@
[tool.poetry]
name = "nucypher"
version = "7.5.0"
version = "7.6.0"
authors = ["NuCypher"]
description = "A threshold access control application to empower privacy in decentralized systems."
[tool.poetry.dependencies]
python = ">=3.9,<4"
nucypher-core = "==0.13.0"
nucypher-core = "==0.15.0"
cryptography = "*"
pynacl = ">=1.4.0"
mnemonic = "*"
@ -21,7 +22,7 @@ mako = "*"
click = '*'
colorama = '*'
tabulate = '*'
marshmallow = '*'
marshmallow = '^3.26.1' # TODO revert back to '*' in the future - breaking change in 4.0 - TypeError: validate_method() got an unexpected keyword argument 'data_key'
appdirs = '*'
constant-sorrow = '^0.1.0a9'
prometheus-client = '*'
@ -29,8 +30,14 @@ siwe = "^4.2.0"
time-machine = "^2.13.0"
twisted = "^24.2.0rc1"
jsonpath-ng = "^1.6.1"
pyjwt = {extras = ["crypto"], version = "^2.10.1"}
nucypher-pychalk = "^2.0.2"
nucypher-snaptime = "^0.2.5"
# TODO: this is to match requirements.txt; update and relock for 7.6.x
python-statemachine = "==2.3.4"
pydantic = "~2.11.0" # TODO 2.12; there some issue - 'pydantic_core._pydantic_core.ValidationInfo' object has no attribute 'manifest'
[tool.poetry.dev-dependencies]
[tool.poetry.group.dev.dependencies]
pytest = '*'
pytest-cov = '*'
pytest-mock = '*'

View File

@ -4,6 +4,54 @@ Releases
.. towncrier release notes start
v7.6.0 (2025-08-18)
-------------------
Features
~~~~~~~~
- Adds support for supervised handover of key material between TACo nodes. (`#3608 <https://github.com/nucypher/nucypher/issues/3608>`__)
Internal Development Tasks
~~~~~~~~~~~~~~~~~~~~~~~~~~
- `#3628 <https://github.com/nucypher/nucypher/issues/3628>`__, `#3630 <https://github.com/nucypher/nucypher/issues/3630>`__, `#3632 <https://github.com/nucypher/nucypher/issues/3632>`__, `#3633 <https://github.com/nucypher/nucypher/issues/3633>`__, `#3634 <https://github.com/nucypher/nucypher/issues/3634>`__, `#3635 <https://github.com/nucypher/nucypher/issues/3635>`__, `#3637 <https://github.com/nucypher/nucypher/issues/3637>`__
v7.5.0 (2025-04-08)
-------------------
Features
~~~~~~~~
- Support for executing multiple conditions sequentially, where the outcome of one condition can be used as input for another. (`#3500 <https://github.com/nucypher/nucypher/issues/3500>`__)
- Support for offchain JSON endpoint condition expression and evaluation (`#3511 <https://github.com/nucypher/nucypher/issues/3511>`__)
- Expands recovery CLI to include audit and keystore identification features (`#3538 <https://github.com/nucypher/nucypher/issues/3538>`__)
- Condition that allows for if-then-else branching based on underlying conditions i.e. IF ``CONDITION A`` THEN ``CONDITION B`` ELSE ``CONDITION_C``.
The ELSE component can either be a Condition or a boolean value. (`#3558 <https://github.com/nucypher/nucypher/issues/3558>`__)
- Enable support for Bearer authorization tokens (e.g., OAuth, JWT) within HTTP GET requests for ``JsonApiCondition``. (`#3560 <https://github.com/nucypher/nucypher/issues/3560>`__)
- Enhance threshold decryption request efficiency by prioritizing nodes in the cohort with lower communication latency. (`#3562 <https://github.com/nucypher/nucypher/issues/3562>`__)
- Added plumbing to support EVM condition evaluation on "any" (major) EVM chain outside of Ethereum and Polygon - only enabled on ``lynx`` testnet for now. (`#3569 <https://github.com/nucypher/nucypher/issues/3569>`__)
- Support for conditions based on verification of JWT tokens. (`#3570 <https://github.com/nucypher/nucypher/issues/3570>`__)
- Support for conditions based on APIs provided by off-chain JSON RPC 2.0 endpoints. (`#3571 <https://github.com/nucypher/nucypher/issues/3571>`__)
- Add support for EIP1271 signature verification for smart contract wallets. (`#3576 <https://github.com/nucypher/nucypher/issues/3576>`__)
- Allow BigInt values from ``taco-web`` typescript library to be provided as strings. (`#3585 <https://github.com/nucypher/nucypher/issues/3585>`__)
Improved Documentation
~~~~~~~~~~~~~~~~~~~~~~
- `#3577 <https://github.com/nucypher/nucypher/issues/3577>`__
Internal Development Tasks
~~~~~~~~~~~~~~~~~~~~~~~~~~
- `#3523 <https://github.com/nucypher/nucypher/issues/3523>`__, `#3535 <https://github.com/nucypher/nucypher/issues/3535>`__, `#3539 <https://github.com/nucypher/nucypher/issues/3539>`__, `#3545 <https://github.com/nucypher/nucypher/issues/3545>`__, `#3547 <https://github.com/nucypher/nucypher/issues/3547>`__, `#3553 <https://github.com/nucypher/nucypher/issues/3553>`__, `#3554 <https://github.com/nucypher/nucypher/issues/3554>`__, `#3556 <https://github.com/nucypher/nucypher/issues/3556>`__, `#3557 <https://github.com/nucypher/nucypher/issues/3557>`__, `#3563 <https://github.com/nucypher/nucypher/issues/3563>`__, `#3564 <https://github.com/nucypher/nucypher/issues/3564>`__, `#3565 <https://github.com/nucypher/nucypher/issues/3565>`__, `#3578 <https://github.com/nucypher/nucypher/issues/3578>`__, `#3581 <https://github.com/nucypher/nucypher/issues/3581>`__, `#3586 <https://github.com/nucypher/nucypher/issues/3586>`__, `#3589 <https://github.com/nucypher/nucypher/issues/3589>`__
- Introduce necessary changes to adapt agents methods to breaking changes in Coordinator contract. Previous methods are now deprecated from the API. (`#3588 <https://github.com/nucypher/nucypher/issues/3588>`__)
v7.4.1 (2024-09-12)
-------------------

View File

@ -1,105 +1,108 @@
abnf==2.2.0 ; python_version >= "3.9" and python_version < "4.0"
aiohappyeyeballs==2.4.3 ; python_version >= "3.9" and python_version < "4"
aiohttp==3.10.10 ; python_version >= "3.9" and python_version < "4"
aiosignal==1.3.1 ; python_version >= "3.9" and python_version < "4"
annotated-types==0.7.0 ; python_version >= "3.9" and python_version < "4.0"
abnf==2.4.0 ; python_version >= "3.9" and python_version < "4.0"
aiohappyeyeballs==2.6.1 ; python_version >= "3.9" and python_version < "4"
aiohttp==3.12.15 ; python_version >= "3.9" and python_version < "4"
aiosignal==1.4.0 ; python_version >= "3.9" and python_version < "4"
annotated-types==0.7.0 ; python_version >= "3.9" and python_version < "4"
appdirs==1.4.4 ; python_version >= "3.9" and python_version < "4"
async-timeout==4.0.3 ; python_version >= "3.9" and python_version < "3.11"
attrs==24.2.0 ; python_version >= "3.9" and python_version < "4"
async-timeout==5.0.1 ; python_version >= "3.9" and python_version < "3.11"
attrs==25.3.0 ; python_version >= "3.9" and python_version < "4"
atxm==0.5.0 ; python_version >= "3.9" and python_version < "4"
autobahn==24.4.2 ; python_version >= "3.9" and python_version < "4"
automat==24.8.1 ; python_version >= "3.9" and python_version < "4"
bitarray==3.0.0 ; python_version >= "3.9" and python_version < "4"
blinker==1.8.2 ; python_version >= "3.9" and python_version < "4"
automat==25.4.16 ; python_version >= "3.9" and python_version < "4"
bitarray==3.6.1 ; python_version >= "3.9" and python_version < "4"
blinker==1.9.0 ; python_version >= "3.9" and python_version < "4"
bytestring-splitter==2.4.1 ; python_version >= "3.9" and python_version < "4"
certifi==2024.8.30 ; python_version >= "3.9" and python_version < "4"
certifi==2025.8.3 ; python_version >= "3.9" and python_version < "4"
cffi==1.17.1 ; python_version >= "3.9" and python_version < "4"
charset-normalizer==3.4.0 ; python_version >= "3.9" and python_version < "4"
charset-normalizer==3.4.3 ; python_version >= "3.9" and python_version < "4"
ckzg==1.0.2 ; python_version >= "3.9" and python_version < "4"
click==8.1.7 ; python_version >= "3.9" and python_version < "4"
click==8.1.8 ; python_version >= "3.9" and python_version < "4"
colorama==0.4.6 ; python_version >= "3.9" and python_version < "4"
constant-sorrow==0.1.0a9 ; python_version >= "3.9" and python_version < "4"
constantly==23.10.4 ; python_version >= "3.9" and python_version < "4"
cryptography==43.0.3 ; python_version >= "3.9" and python_version < "4"
cytoolz==1.0.0 ; python_version >= "3.9" and python_version < "4" and implementation_name == "cpython"
dateparser==1.2.0 ; python_version >= "3.9" and python_version < "4"
eth-abi==5.1.0 ; python_version >= "3.9" and python_version < "4"
cytoolz==1.0.1 ; python_version >= "3.9" and python_version < "4" and implementation_name == "cpython"
dateparser==1.2.2 ; python_version >= "3.9" and python_version < "4"
eth-abi==5.2.0 ; python_version >= "3.9" and python_version < "4"
eth-account==0.11.3 ; python_version >= "3.9" and python_version < "4"
eth-hash==0.7.0 ; python_version >= "3.9" and python_version < "4"
eth-hash[pycryptodome]==0.7.0 ; python_version >= "3.9" and python_version < "4"
eth-keyfile==0.8.1 ; python_version >= "3.9" and python_version < "4"
eth-keys==0.6.0 ; python_version >= "3.9" and python_version < "4"
eth-hash==0.7.1 ; python_version >= "3.9" and python_version < "4"
eth-hash[pycryptodome]==0.7.1 ; python_version >= "3.9" and python_version < "4"
eth-keyfile==0.9.1 ; python_version >= "3.9" and python_version < "4"
eth-keys==0.7.0 ; python_version >= "3.9" and python_version < "4"
eth-rlp==1.0.1 ; python_version >= "3.9" and python_version < "4"
eth-typing==3.5.2 ; python_version >= "3.9" and python_version < "4"
eth-utils==2.3.2 ; python_version >= "3.9" and python_version < "4"
flask==3.0.3 ; python_version >= "3.9" and python_version < "4"
frozenlist==1.5.0 ; python_version >= "3.9" and python_version < "4"
flask==3.1.1 ; python_version >= "3.9" and python_version < "4"
frozenlist==1.7.0 ; python_version >= "3.9" and python_version < "4"
hendrix==5.0.0 ; python_version >= "3.9" and python_version < "4"
hexbytes==0.3.1 ; python_version >= "3.9" and python_version < "4"
humanize==4.11.0 ; python_version >= "3.9" and python_version < "4"
humanize==4.12.3 ; python_version >= "3.9" and python_version < "4"
hyperlink==21.0.0 ; python_version >= "3.9" and python_version < "4"
idna==3.10 ; python_version >= "3.9" and python_version < "4"
importlib-metadata==8.5.0 ; python_version >= "3.9" and python_version < "3.10"
importlib-metadata==8.7.0 ; python_version >= "3.9" and python_version < "3.10"
incremental==24.7.2 ; python_version >= "3.9" and python_version < "4"
itsdangerous==2.2.0 ; python_version >= "3.9" and python_version < "4"
jinja2==3.1.4 ; python_version >= "3.9" and python_version < "4"
jinja2==3.1.6 ; python_version >= "3.9" and python_version < "4"
jsonpath-ng==1.7.0 ; python_version >= "3.9" and python_version < "4"
jsonschema-specifications==2024.10.1 ; python_version >= "3.9" and python_version < "4"
jsonschema==4.23.0 ; python_version >= "3.9" and python_version < "4"
jsonschema-specifications==2025.4.1 ; python_version >= "3.9" and python_version < "4"
jsonschema==4.25.0 ; python_version >= "3.9" and python_version < "4"
lru-dict==1.2.0 ; python_version >= "3.9" and python_version < "4"
mako==1.3.6 ; python_version >= "3.9" and python_version < "4"
mako==1.3.10 ; python_version >= "3.9" and python_version < "4"
markupsafe==3.0.2 ; python_version >= "3.9" and python_version < "4"
marshmallow==3.23.1 ; python_version >= "3.9" and python_version < "4"
marshmallow==3.26.1 ; python_version >= "3.9" and python_version < "4"
maya==0.6.1 ; python_version >= "3.9" and python_version < "4"
mnemonic==0.21 ; python_version >= "3.9" and python_version < "4"
msgpack-python==0.5.6 ; python_version >= "3.9" and python_version < "4"
multidict==6.1.0 ; python_version >= "3.9" and python_version < "4"
nucypher-core==0.13.0 ; python_version >= "3.9" and python_version < "4"
multidict==6.6.4 ; python_version >= "3.9" and python_version < "4"
nucypher-core==0.15.0 ; python_version >= "3.9" and python_version < "4"
nucypher-pychalk==2.0.2 ; python_version >= "3.9" and python_version < "4"
nucypher-snaptime==0.2.5 ; python_version >= "3.9" and python_version < "4"
packaging==23.2 ; python_version >= "3.9" and python_version < "4"
parsimonious==0.10.0 ; python_version >= "3.9" and python_version < "4"
pendulum==3.0.0 ; python_version >= "3.9" and python_version < "4"
pendulum==3.1.0 ; python_version >= "3.9" and python_version < "4"
ply==3.11 ; python_version >= "3.9" and python_version < "4"
prometheus-client==0.21.0 ; python_version >= "3.9" and python_version < "4"
propcache==0.2.0 ; python_version >= "3.9" and python_version < "4"
protobuf==5.28.3 ; python_version >= "3.9" and python_version < "4"
pyasn1-modules==0.4.1 ; python_version >= "3.9" and python_version < "4"
prometheus-client==0.22.1 ; python_version >= "3.9" and python_version < "4"
propcache==0.3.2 ; python_version >= "3.9" and python_version < "4"
protobuf==6.32.0 ; python_version >= "3.9" and python_version < "4"
py-ecc==8.0.0 ; python_version >= "3.9" and python_version < "4"
pyasn1-modules==0.4.2 ; python_version >= "3.9" and python_version < "4"
pyasn1==0.6.1 ; python_version >= "3.9" and python_version < "4"
pychalk==2.0.1 ; python_version >= "3.9" and python_version < "4"
pycparser==2.22 ; python_version >= "3.9" and python_version < "4"
pycryptodome==3.21.0 ; python_version >= "3.9" and python_version < "4"
pydantic-core==2.23.4 ; python_version >= "3.9" and python_version < "4.0"
pydantic==2.9.2 ; python_version >= "3.9" and python_version < "4.0"
pycryptodome==3.23.0 ; python_version >= "3.9" and python_version < "4"
pydantic-core==2.33.2 ; python_version >= "3.9" and python_version < "4.0"
pydantic==2.11.7 ; python_version >= "3.9" and python_version < "4"
pyjwt[crypto]==2.10.1 ; python_version >= "3.9" and python_version < "4"
pynacl==1.5.0 ; python_version >= "3.9" and python_version < "4"
pyopenssl==24.2.1 ; python_version >= "3.9" and python_version < "4"
pyopenssl==25.1.0 ; python_version >= "3.9" and python_version < "4"
python-dateutil==2.9.0.post0 ; python_version >= "3.9" and python_version < "4"
python-statemachine==2.3.4 ; python_version >= "3.9" and python_version < "4"
pytz==2024.2 ; python_version >= "3.9" and python_version < "4"
pytz==2025.2 ; python_version >= "3.9" and python_version < "4"
pyunormalize==16.0.0 ; python_version >= "3.9" and python_version < "4"
pywin32==308 ; python_version >= "3.9" and python_version < "4" and platform_system == "Windows"
referencing==0.35.1 ; python_version >= "3.9" and python_version < "4"
regex==2024.9.11 ; python_version >= "3.9" and python_version < "4"
requests==2.32.3 ; python_version >= "3.9" and python_version < "4"
rlp==4.0.1 ; python_version >= "3.9" and python_version < "4"
rpds-py==0.21.0 ; python_version >= "3.9" and python_version < "4"
pywin32==311 ; python_version >= "3.9" and python_version < "4" and platform_system == "Windows"
referencing==0.36.2 ; python_version >= "3.9" and python_version < "4"
regex==2025.7.34 ; python_version >= "3.9" and python_version < "4"
requests==2.32.4 ; python_version >= "3.9" and python_version < "4"
rlp==4.1.0 ; python_version >= "3.9" and python_version < "4"
rpds-py==0.27.0 ; python_version >= "3.9" and python_version < "4"
service-identity==24.2.0 ; python_version >= "3.9" and python_version < "4"
setuptools==75.3.0 ; python_version >= "3.9" and python_version < "4"
setuptools==80.9.0 ; python_version >= "3.9" and python_version < "4"
siwe==4.2.0 ; python_version >= "3.9" and python_version < "4.0"
six==1.16.0 ; python_version >= "3.9" and python_version < "4"
snaptime==0.2.4 ; python_version >= "3.9" and python_version < "4"
six==1.17.0 ; python_version >= "3.9" and python_version < "4"
tabulate==0.9.0 ; python_version >= "3.9" and python_version < "4"
time-machine==2.16.0 ; python_version >= "3.9" and python_version < "4"
tomli==2.0.2 ; python_version >= "3.9" and python_version < "3.11"
time-machine==2.17.0 ; python_version >= "3.9" and python_version < "4"
tomli==2.2.1 ; python_version >= "3.9" and python_version < "3.11"
toolz==1.0.0 ; python_version >= "3.9" and python_version < "4" and (implementation_name == "pypy" or implementation_name == "cpython")
twisted==24.10.0 ; python_version >= "3.9" and python_version < "4"
txaio==23.1.1 ; python_version >= "3.9" and python_version < "4"
typing-extensions==4.12.2 ; python_version >= "3.9" and python_version < "4"
tzdata==2024.2 ; python_version >= "3.9" and python_version < "4"
tzlocal==5.2 ; python_version >= "3.9" and python_version < "4"
urllib3==2.2.3 ; python_version >= "3.9" and python_version < "4"
twisted==24.11.0 ; python_version >= "3.9" and python_version < "4"
txaio==23.6.1 ; python_version >= "3.9" and python_version < "4"
typing-extensions==4.14.1 ; python_version < "4" and python_version >= "3.9"
typing-inspection==0.4.1 ; python_version >= "3.9" and python_version < "4"
tzdata==2025.2 ; python_version >= "3.9" and python_version < "4"
tzlocal==5.3.1 ; python_version >= "3.9" and python_version < "4"
urllib3==2.5.0 ; python_version >= "3.9" and python_version < "4"
watchdog==3.0.0 ; python_version >= "3.9" and python_version < "4"
web3==6.20.1 ; python_version >= "3.9" and python_version < "4"
websockets==13.1 ; python_version >= "3.9" and python_version < "4"
werkzeug==3.1.2 ; python_version >= "3.9" and python_version < "4"
yarl==1.17.1 ; python_version >= "3.9" and python_version < "4"
zipp==3.20.2 ; python_version >= "3.9" and python_version < "3.10"
zope-interface==7.1.1 ; python_version >= "3.9" and python_version < "4"
websockets==15.0.1 ; python_version >= "3.9" and python_version < "4"
werkzeug==3.1.3 ; python_version >= "3.9" and python_version < "4"
yarl==1.20.1 ; python_version >= "3.9" and python_version < "4"
zipp==3.23.0 ; python_version >= "3.9" and python_version < "3.10"
zope-interface==7.2 ; python_version >= "3.9" and python_version < "4"

View File

@ -1,7 +1,7 @@
import random
from typing import List
from nucypher_core.ferveo import Transcript, Validator
from nucypher_core.ferveo import Transcript, Validator, ValidatorMessage
from nucypher.blockchain.eth import domains
from nucypher.blockchain.eth.agents import ContractAgency, CoordinatorAgent
@ -23,12 +23,14 @@ coordinator_agent = ContractAgency.get_agent(
def resolve_validators() -> List[Validator]:
result = list()
for staking_provider_address in ritual.providers:
for i, staking_provider_address in enumerate(ritual.providers):
public_key = coordinator_agent.get_provider_public_key(
provider=staking_provider_address, ritual_id=ritual.id
)
external_validator = Validator(
address=staking_provider_address, public_key=public_key
address=staking_provider_address,
public_key=public_key,
share_index=i,
)
result.append(external_validator)
result = sorted(result, key=lambda x: x.address)
@ -42,10 +44,10 @@ ritual = coordinator_agent.get_ritual(
validators = resolve_validators()
transcripts = [Transcript.from_bytes(bytes(t)) for t in ritual.transcripts]
messages = list(zip(validators, transcripts))
messages = [ValidatorMessage(v, t) for v, t in zip(validators, transcripts)]
aggregate_transcripts(
transcripts=messages,
validator_messages=messages,
shares=ritual.shares,
threshold=ritual.threshold,
me=random.choice(validators), # this is hacky

View File

@ -12,7 +12,6 @@ BASE_DIR = Path(__file__).parent
PYPI_CLASSIFIERS = [
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+)",
"Natural Language :: English",
"Operating System :: OS Independent",
"Programming Language :: Python",

View File

@ -132,7 +132,7 @@ def test_dkg_failure_with_ferveo_key_mismatch(
)
yield clock.advance(interval)
yield testerchain.time_travel(
seconds=coordinator_agent.get_timeout() // 6
seconds=coordinator_agent.get_dkg_timeout() // 6
) # min. 6 rounds before timeout
assert (

View File

@ -143,6 +143,20 @@ def threshold_message_kit(coordinator_agent, plaintext, condition, signer, ritua
return enrico.encrypt_for_dkg(plaintext=plaintext.encode(), conditions=condition)
@pytest.fixture(scope="module")
def incoming_validator(ursulas, dkg_size, clock):
incoming_validator = ursulas[dkg_size]
incoming_validator.ritual_tracker.task._task.clock = clock
incoming_validator.ritual_tracker.start()
return incoming_validator
@pytest.fixture(scope="module")
def departing_validator(cohort):
# randomize departing validator from the cohort
return cohort[random.randint(0, len(cohort) - 1)]
def test_dkg_initiation(
coordinator_agent,
accounts,
@ -398,3 +412,184 @@ def test_encryption_and_decryption_prometheus_metrics():
assert num_decryption_requests == (
num_decryption_successes + num_decryption_failures
)
def check_nodes_ritual_metadata(nodes_to_check, ritual_id, should_exist=True):
for ursula in nodes_to_check:
ritual = ursula.dkg_storage.get_active_ritual(ritual_id)
validators = ursula.dkg_storage.get_validators(ritual_id)
if should_exist:
assert ritual is not None, f"Ritual {ritual_id} object should be in cache"
assert (
validators is not None
), f"Validators for ritual {ritual_id} should be in cache"
else:
assert ritual is None, f"Ritual {ritual_id} object should not be in cache"
assert (
validators is None
), f"Validators for ritual {ritual_id} should not be in cache"
def test_handover_request(
coordinator_agent,
testerchain,
ritual_id,
cohort,
supervisor_transacting_power,
departing_validator,
incoming_validator,
):
testerchain.tx_machine.start()
print("==================== INITIALIZING HANDOVER ====================")
# check that ritual metadata is present in cache
check_nodes_ritual_metadata(cohort, ritual_id, should_exist=True)
receipt = coordinator_agent.request_handover(
ritual_id=ritual_id,
departing_validator=departing_validator.checksum_address,
incoming_validator=incoming_validator.checksum_address,
transacting_power=supervisor_transacting_power,
)
testerchain.time_travel(seconds=1)
testerchain.wait_for_receipt(receipt["transactionHash"])
handover_status = coordinator_agent.get_handover_status(
ritual_id=ritual_id, departing_validator=departing_validator.checksum_address
)
assert handover_status == Coordinator.HandoverStatus.HANDOVER_AWAITING_TRANSCRIPT
@pytest_twisted.inlineCallbacks
def test_handover_finality(
coordinator_agent,
ritual_id,
cohort,
clock,
interval,
testerchain,
departing_validator,
incoming_validator,
supervisor_transacting_power,
):
print("==================== AWAITING HANDOVER FINALITY ====================")
handover_status = coordinator_agent.get_handover_status(
ritual_id=ritual_id, departing_validator=departing_validator.checksum_address
)
assert handover_status != Coordinator.HandoverStatus.NON_INITIATED
while handover_status not in (
Coordinator.HandoverStatus.NON_INITIATED,
Coordinator.HandoverStatus.HANDOVER_AWAITING_FINALIZATION,
):
handover_status = coordinator_agent.get_handover_status(
ritual_id=ritual_id,
departing_validator=departing_validator.checksum_address,
)
assert handover_status != Coordinator.HandoverStatus.HANDOVER_TIMEOUT
yield clock.advance(interval)
yield testerchain.time_travel(seconds=1)
# check that ritual metadata is not present in cache anymore because in the midst of handover
check_nodes_ritual_metadata(
[*cohort, incoming_validator], ritual_id, should_exist=False
)
_receipt = coordinator_agent.finalize_handover(
ritual_id=ritual_id,
departing_validator=departing_validator.checksum_address,
transacting_power=supervisor_transacting_power,
)
handover_status = coordinator_agent.get_handover_status(
ritual_id=ritual_id, departing_validator=departing_validator.checksum_address
)
assert handover_status == Coordinator.HandoverStatus.NON_INITIATED
testerchain.tx_machine.stop()
assert not testerchain.tx_machine.running
last_scanned_block = REGISTRY.get_sample_value(
"ritual_events_last_scanned_block_number"
)
assert last_scanned_block > 0
yield
@pytest_twisted.inlineCallbacks
def test_decryption_after_handover(
mocker,
bob,
accounts,
coordinator_agent,
threshold_message_kit,
ritual_id,
cohort,
plaintext,
departing_validator,
incoming_validator,
):
print("==================== DKG DECRYPTION POST-HANDOVER ====================")
departing_validator_spy = mocker.spy(
departing_validator, "handle_threshold_decryption_request"
)
incoming_validator_spy = mocker.spy(
incoming_validator, "handle_threshold_decryption_request"
)
# ensure that the incoming validator handled the request;
# the ritual is 3/4 so we need 1 ursula in the cohort to fail to decrypt
# to ensure that the incoming validator is actually used
node_to_fail = None
for u in cohort:
if u.checksum_address != departing_validator.checksum_address:
node_to_fail = u
break
assert node_to_fail is not None
mocker.patch.object(
node_to_fail,
"handle_threshold_decryption_request",
side_effect=ValueError("forcibly failed"),
)
# ritual_id, ciphertext, conditions are obtained from the side channel
bob.start_learning_loop(now=True)
cleartext = yield bob.threshold_decrypt(
threshold_message_kit=threshold_message_kit,
)
assert bytes(cleartext) == plaintext.encode()
# ensure that the departing validator did not handle the request
assert departing_validator_spy.call_count == 0
# ensure that the incoming validator handled the request
assert incoming_validator_spy.call_count == 1
num_successes = REGISTRY.get_sample_value(
"threshold_decryption_num_successes_total"
)
ritual = coordinator_agent.get_ritual(ritual_id)
# at least a threshold of ursulas were successful (concurrency)
assert int(num_successes) >= ritual.threshold
# now that handover is completed (clears cache), and there was a
# successful decryption (populates cache) check that ritual metadata is present again
nodes_to_check_for_participation_in_decryption = list(cohort)
nodes_to_check_for_participation_in_decryption.remove(
departing_validator
) # no longer in cohort
nodes_to_check_for_participation_in_decryption.append(
incoming_validator
) # now part of cohort
nodes_to_check_for_participation_in_decryption.remove(
node_to_fail
) # in the cohort but will fail to decrypt so cache not populated
check_nodes_ritual_metadata(
nodes_to_check_for_participation_in_decryption, ritual_id, should_exist=True
)
# this check reinforces that the departing validator did not participate in the decryption
check_nodes_ritual_metadata([departing_validator], ritual_id, should_exist=False)
print("===================== DECRYPTION SUCCESSFUL =====================")
yield

View File

@ -23,7 +23,7 @@ def cohort(ursulas):
return nodes
def test_action_required_not_participating(cohort):
def test_action_required_not_participating(cohort, get_random_checksum_address):
ursula = cohort[0]
agent = ursula.coordinator_agent
active_ritual_tracker = ActiveRitualTracker(operator=ursula)
@ -42,21 +42,28 @@ def test_action_required_not_participating(cohort):
_my_get_participation_state,
):
for event in agent.contract.events:
arg_values = {
"ritualId": 23,
}
if event.event_name.startswith("Handover"):
# Handover events have additional fields
arg_values["incomingParticipant"] = get_random_checksum_address()
arg_values["departingParticipant"] = get_random_checksum_address()
ritual_event = AttributeDict(
{
"event": event.event_name,
"args": AttributeDict(
{
"ritualId": 23,
}
),
"args": AttributeDict(arg_values),
}
)
# all events are irrelevant because not participating
assert not active_ritual_tracker._action_required(ritual_event)
def test_action_required_only_for_events_with_corresponding_actions(cohort):
def test_action_required_only_for_events_with_corresponding_actions(
cohort, get_random_checksum_address
):
ursula = cohort[0]
agent = ursula.coordinator_agent
active_ritual_tracker = ActiveRitualTracker(operator=ursula)
@ -76,14 +83,22 @@ def test_action_required_only_for_events_with_corresponding_actions(cohort):
):
for event in agent.contract.events:
event_type = getattr(agent.contract.events, event.event_name)
arg_values = {
"ritualId": 23,
}
if event.event_name == "HandoverRequest":
# must be incoming participant
arg_values["incomingParticipant"] = ursula.checksum_address
arg_values["departingParticipant"] = get_random_checksum_address()
elif event.event_name == "HandoverTranscriptPosted":
# must be departing participant
arg_values["incomingParticipant"] = get_random_checksum_address()
arg_values["departingParticipant"] = ursula.checksum_address
ritual_event = AttributeDict(
{
"event": event.event_name,
"args": AttributeDict(
{
"ritualId": 23,
}
),
"args": AttributeDict(arg_values),
}
)
@ -94,7 +109,7 @@ def test_action_required_only_for_events_with_corresponding_actions(cohort):
assert active_ritual_tracker._action_required(ritual_event)
def test_action_required_depending_on_participation_state(cohort):
def test_action_required_depending_on_dkg_participation_state(cohort):
ursula = cohort[0]
agent = ursula.coordinator_agent
active_ritual_tracker = ActiveRitualTracker(operator=ursula)
@ -137,9 +152,10 @@ def test_action_required_depending_on_participation_state(cohort):
assert (
agent.contract.events.StartAggregationRound in active_ritual_tracker.actions
)
assert (
len(active_ritual_tracker.actions) == 2
), "untested event with corresponding action"
# TODO not testing handover states here
# assert (
# len(active_ritual_tracker.actions) == 2
# ), "untested event with corresponding action"
#
# already posted transcript - action only required for aggregation
@ -541,19 +557,19 @@ def test_get_participation_state_unexpected_event_without_ritual_id_arg(cohort):
agent = ursula.coordinator_agent
active_ritual_tracker = ActiveRitualTracker(operator=ursula)
# TimeoutChanged
timeout_changed_event = agent.contract.events.TimeoutChanged()
# MaxDkgSizeChanged
max_dkg_size_changed = agent.contract.events.MaxDkgSizeChanged()
# create args data
args_dict = {"oldTimeout": 1, "newTimeout": 2}
args_dict = {"oldSize": 24, "newSize": 30}
# ensure that test matches latest event information
check_event_args_match_latest_event_inputs(
event=timeout_changed_event, args_dict=args_dict
event=max_dkg_size_changed, args_dict=args_dict
)
event_data = AttributeDict(
{"event": timeout_changed_event.event_name, "args": AttributeDict(args_dict)}
{"event": max_dkg_size_changed.event_name, "args": AttributeDict(args_dict)}
)
with pytest.raises(RuntimeError):
@ -566,12 +582,12 @@ def test_get_participation_state_unexpected_event_with_ritual_id_arg(cohort):
active_ritual_tracker = ActiveRitualTracker(operator=ursula)
# create args data - faked to include ritual id arg
args_dict = {"ritualId": 0, "oldTimeout": 1, "newTimeout": 2}
args_dict = {"ritualId": 0, "oldSize": 24, "newSize": 30}
# TimeoutChanged event
# MaxDkgSizeChanged event
event_data = AttributeDict(
{
"event": agent.contract.events.TimeoutChanged.event_name,
"event": agent.contract.events.MaxDkgSizeChanged.event_name,
"args": AttributeDict(args_dict),
}
)
@ -596,7 +612,7 @@ def test_get_participation_state_purge_expired_cache_entries(
ActiveRitualTracker._RITUAL_TIMEOUT_ADDITIONAL_TTL_BUFFER
)
with patch.object(agent, "get_timeout", return_value=faked_ritual_timeout):
with patch.object(agent, "get_dkg_timeout", return_value=faked_ritual_timeout):
# fake timeout only needed for initialization
active_ritual_tracker = ActiveRitualTracker(operator=ursula)

View File

@ -1,3 +1,6 @@
import os
import random
import pytest
import pytest_twisted
from eth_utils import keccak
@ -6,6 +9,7 @@ from twisted.internet import reactor
from twisted.internet.task import deferLater
from nucypher.blockchain.eth.agents import CoordinatorAgent
from nucypher.blockchain.eth.constants import NULL_ADDRESS
from nucypher.blockchain.eth.models import Coordinator
from nucypher.crypto.powers import TransactingPower
from tests.utils.dkg import generate_fake_ritual_transcript, threshold_from_shares
@ -36,7 +40,7 @@ def cohort_ursulas(cohort, taco_application_agent):
return ursulas_for_cohort
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def transacting_powers(accounts, cohort_ursulas):
return [
TransactingPower(account=ursula, signer=accounts.get_account_signer(ursula))
@ -44,6 +48,19 @@ def transacting_powers(accounts, cohort_ursulas):
]
@pytest.fixture(scope="module")
def incoming_validator(staking_providers, cohort):
return staking_providers[
len(cohort) + 1
] # deployer + cohort ursulas already assigned
@pytest.fixture(scope="module")
def departing_validator(cohort):
# randomize departing validator from the cohort
return cohort[random.randint(0, len(cohort) - 1)]
def test_coordinator_properties(agent):
assert len(agent.contract_address) == 42
assert agent.contract.address == agent.contract_address
@ -243,3 +260,281 @@ def test_post_aggregation(
ritual_dkg_key = agent.get_ritual_public_key(ritual_id=ritual_id)
assert bytes(ritual_dkg_key) == bytes(dkg_public_key)
@pytest.mark.usefixtures("ursulas")
def test_request_handover(
accounts,
agent,
testerchain,
incoming_validator,
departing_validator,
supervisor_transacting_power,
):
ritual_id = agent.number_of_rituals() - 1
handover_status = agent.get_handover_status(
ritual_id=ritual_id, departing_validator=departing_validator
)
assert handover_status == Coordinator.HandoverStatus.NON_INITIATED
receipt = agent.request_handover(
ritual_id=ritual_id,
departing_validator=departing_validator,
incoming_validator=incoming_validator,
transacting_power=supervisor_transacting_power,
)
assert receipt["status"] == 1
handover_events = agent.contract.events.HandoverRequest().process_receipt(receipt)
handover_event = handover_events[0]
assert handover_event["args"]["ritualId"] == ritual_id
assert handover_event["args"]["incomingParticipant"] == incoming_validator
assert handover_event["args"]["departingParticipant"] == departing_validator
handover_status = agent.get_handover_status(
ritual_id=ritual_id, departing_validator=departing_validator
)
assert handover_status == Coordinator.HandoverStatus.HANDOVER_AWAITING_TRANSCRIPT
handover = agent.get_handover(
ritual_id=ritual_id, departing_validator=departing_validator
)
assert handover.departing_validator == departing_validator
assert handover.incoming_validator == incoming_validator
assert handover.transcript == b"" # no transcript available yet
assert (
handover.decryption_request_pubkey == b""
) # no decryption request pubkey available yet
assert handover.blinded_share == b"" # no blinded share available yet
assert handover.key == agent.get_handover_key(
ritual_id=ritual_id, departing_validator=departing_validator
)
@pytest_twisted.inlineCallbacks
def test_post_handover_transcript(
agent,
accounts,
transacting_powers,
testerchain,
clock,
mock_async_hooks,
departing_validator,
incoming_validator,
taco_application_agent,
):
ritual_id = agent.number_of_rituals() - 1
transcript = os.urandom(32) # Randomly generated transcript for testing
participant_public_key = SessionStaticSecret.random().public_key()
operator = taco_application_agent.get_operator_from_staking_provider(
incoming_validator
)
incoming_operator_transacting_power = TransactingPower(
account=operator,
signer=accounts.get_account_signer(operator),
)
handover_status = agent.get_handover_status(
ritual_id=ritual_id, departing_validator=departing_validator
)
assert handover_status == Coordinator.HandoverStatus.HANDOVER_AWAITING_TRANSCRIPT
async_tx = agent.post_handover_transcript(
ritual_id=ritual_id,
departing_validator=departing_validator,
handover_transcript=transcript,
participant_public_key=participant_public_key,
transacting_power=incoming_operator_transacting_power,
async_tx_hooks=mock_async_hooks,
)
testerchain.tx_machine.start()
while not async_tx.final:
yield clock.advance(testerchain.tx_machine._task.interval)
testerchain.tx_machine.stop()
post_transcript_events = (
agent.contract.events.HandoverTranscriptPosted().process_receipt(
async_tx.receipt
)
)
handover_event = post_transcript_events[0]
assert handover_event["args"]["ritualId"] == ritual_id
assert handover_event["args"]["incomingParticipant"] == incoming_validator
assert handover_event["args"]["departingParticipant"] == departing_validator
handover_status = agent.get_handover_status(
ritual_id=ritual_id, departing_validator=departing_validator
)
assert handover_status == Coordinator.HandoverStatus.HANDOVER_AWAITING_BLINDED_SHARE
# ensure relevant hooks are called (once for each tx) OR not called (failure ones)
yield deferLater(reactor, 0.2, lambda: None)
assert mock_async_hooks.on_broadcast.call_count == 1
assert mock_async_hooks.on_finalized.call_count == 1
assert async_tx.successful is True
# failure hooks not called
assert mock_async_hooks.on_broadcast_failure.call_count == 0
assert mock_async_hooks.on_fault.call_count == 0
assert mock_async_hooks.on_insufficient_funds.call_count == 0
# check proper state
handover = agent.get_handover(
ritual_id=ritual_id, departing_validator=departing_validator
)
assert handover.departing_validator == departing_validator
assert handover.incoming_validator == incoming_validator
assert handover.transcript == transcript
assert handover.decryption_request_pubkey == bytes(participant_public_key)
assert handover.blinded_share == b"" # no blinded share available yet
assert handover.key == agent.get_handover_key(
ritual_id=ritual_id, departing_validator=departing_validator
)
@pytest_twisted.inlineCallbacks
def test_post_blinded_share(
agent,
accounts,
transacting_powers,
testerchain,
clock,
mock_async_hooks,
departing_validator,
incoming_validator,
taco_application_agent,
):
ritual_id = agent.number_of_rituals() - 1
blinded_share = os.urandom(96) # Randomly generated bytes for testing
operator = taco_application_agent.get_operator_from_staking_provider(
departing_validator
)
departing_operator_transacting_power = TransactingPower(
account=operator,
signer=accounts.get_account_signer(operator),
)
handover_status = agent.get_handover_status(
ritual_id=ritual_id, departing_validator=departing_validator
)
assert handover_status == Coordinator.HandoverStatus.HANDOVER_AWAITING_BLINDED_SHARE
async_tx = agent.post_blinded_share_for_handover(
ritual_id=ritual_id,
blinded_share=blinded_share,
transacting_power=departing_operator_transacting_power,
async_tx_hooks=mock_async_hooks,
)
testerchain.tx_machine.start()
while not async_tx.final:
yield clock.advance(testerchain.tx_machine._task.interval)
testerchain.tx_machine.stop()
events = agent.contract.events.BlindedSharePosted().process_receipt(
async_tx.receipt
)
handover_event = events[0]
assert handover_event["args"]["ritualId"] == ritual_id
assert handover_event["args"]["departingParticipant"] == departing_validator
handover_status = agent.get_handover_status(
ritual_id=ritual_id, departing_validator=departing_validator
)
assert handover_status == Coordinator.HandoverStatus.HANDOVER_AWAITING_FINALIZATION
# ensure relevant hooks are called (once for each tx) OR not called (failure ones)
yield deferLater(reactor, 0.2, lambda: None)
assert mock_async_hooks.on_broadcast.call_count == 1
assert mock_async_hooks.on_finalized.call_count == 1
assert async_tx.successful is True
# failure hooks not called
assert mock_async_hooks.on_broadcast_failure.call_count == 0
assert mock_async_hooks.on_fault.call_count == 0
assert mock_async_hooks.on_insufficient_funds.call_count == 0
# check proper state
handover = agent.get_handover(
ritual_id=ritual_id, departing_validator=departing_validator
)
assert handover.departing_validator == departing_validator
assert handover.incoming_validator == incoming_validator
assert handover.blinded_share == blinded_share
@pytest.mark.usefixtures("ursulas")
def test_finalize_handover(
accounts,
agent,
testerchain,
incoming_validator,
departing_validator,
supervisor_transacting_power,
cohort,
):
ritual_id = agent.number_of_rituals() - 1
ritual = agent.get_ritual(ritual_id)
old_aggregated_transcript = ritual.aggregated_transcript
blinded_share = agent.get_handover(
ritual_id=ritual_id, departing_validator=departing_validator
).blinded_share
handover_status = agent.get_handover_status(
ritual_id=ritual_id, departing_validator=departing_validator
)
assert handover_status == Coordinator.HandoverStatus.HANDOVER_AWAITING_FINALIZATION
receipt = agent.finalize_handover(
ritual_id=ritual_id,
departing_validator=departing_validator,
transacting_power=supervisor_transacting_power,
)
assert receipt["status"] == 1
handover_events = agent.contract.events.HandoverFinalized().process_receipt(receipt)
handover_event = handover_events[0]
assert handover_event["args"]["ritualId"] == ritual_id
assert handover_event["args"]["incomingParticipant"] == incoming_validator
assert handover_event["args"]["departingParticipant"] == departing_validator
handover_status = agent.get_handover_status(
ritual_id=ritual_id, departing_validator=departing_validator
)
assert handover_status == Coordinator.HandoverStatus.NON_INITIATED
handover = agent.get_handover(
ritual_id=ritual_id, departing_validator=departing_validator
)
# The handover model still contains key data
assert handover.key == agent.get_handover_key(
ritual_id=ritual_id, departing_validator=departing_validator
)
assert handover.departing_validator == departing_validator
# Remaining data should be empty, though
assert handover.incoming_validator == NULL_ADDRESS
assert handover.transcript == b""
assert handover.decryption_request_pubkey == b""
assert handover.blinded_share == b""
# Now let's check that agggregate transcript has been updated
ritual = agent.get_ritual(ritual_id)
new_aggregated_transcript = ritual.aggregated_transcript
assert new_aggregated_transcript != old_aggregated_transcript
index = cohort.index(departing_validator)
threshold = 2
blind_share_position = 32 + index * 96 + threshold * 48
old_aggregate_with_blinded_share = (
old_aggregated_transcript[:blind_share_position]
+ blinded_share
+ old_aggregated_transcript[blind_share_position + 96 :]
)
assert old_aggregate_with_blinded_share == new_aggregated_transcript

View File

@ -45,6 +45,28 @@ from tests.utils.policy import make_message_kits
GET_CONTEXT_VALUE_IMPORT_PATH = "nucypher.policy.conditions.context.get_context_value"
getActiveStakingProviders_abi_2_params = {
"type": "function",
"name": "getActiveStakingProviders",
"stateMutability": "view",
"inputs": [
{"name": "_startIndex", "type": "uint256", "internalType": "uint256"},
{
"name": "_maxStakingProviders",
"type": "uint256",
"internalType": "uint256",
},
],
"outputs": [
{"name": "allAuthorizedTokens", "type": "uint96", "internalType": "uint96"},
{
"name": "activeStakingProviders",
"type": "bytes32[]",
"internalType": "bytes32[]",
},
],
}
def _dont_validate_user_address(context_variable: str, **context):
if context_variable == USER_ADDRESS_CONTEXT:
@ -773,30 +795,9 @@ def test_contract_condition_using_overloaded_function(
#
# valid overloaded function - 2 params
#
valid_abi_2_params = {
"type": "function",
"name": "getActiveStakingProviders",
"stateMutability": "view",
"inputs": [
{"name": "_startIndex", "type": "uint256", "internalType": "uint256"},
{
"name": "_maxStakingProviders",
"type": "uint256",
"internalType": "uint256",
},
],
"outputs": [
{"name": "allAuthorizedTokens", "type": "uint96", "internalType": "uint96"},
{
"name": "activeStakingProviders",
"type": "bytes32[]",
"internalType": "bytes32[]",
},
],
}
condition = ContractCondition(
contract_address=taco_child_application_agent.contract.address,
function_abi=ABIFunction(valid_abi_2_params),
function_abi=ABIFunction(getActiveStakingProviders_abi_2_params),
method="getActiveStakingProviders",
chain=TESTERCHAIN_CHAIN_ID,
return_value_test=ReturnValueTest("==", ":expectedStakingProviders"),
@ -985,3 +986,60 @@ def test_rpc_condition_using_eip1271(
assert condition_result is False
assert call_result != eth_amount
assert call_result == (eth_amount - withdraw_amount)
@pytest.mark.usefixtures("staking_providers")
def test_big_int_string_handling(
accounts, taco_child_application_agent, bob, condition_providers
):
(
total_staked,
providers,
) = taco_child_application_agent._get_active_staking_providers_raw(0, 10, 0)
expected_result = [
total_staked,
[
HexBytes(provider_bytes).hex() for provider_bytes in providers
], # must be json serializable
]
context = {
":expectedStakingProviders": expected_result,
} # user-defined context vars
contract_condition = {
"conditionType": ConditionType.CONTRACT.value,
"contractAddress": taco_child_application_agent.contract.address,
"functionAbi": getActiveStakingProviders_abi_2_params,
"chain": TESTERCHAIN_CHAIN_ID,
"method": "getActiveStakingProviders",
"parameters": ["0n", "10n"], # use bigint notation
"returnValueTest": {
"comparator": "==",
"value": ":expectedStakingProviders",
},
}
rpc_condition = {
"conditionType": ConditionType.RPC.value,
"chain": TESTERCHAIN_CHAIN_ID,
"method": "eth_getBalance",
"parameters": [bob.checksum_address, "latest"],
"returnValueTest": {
"comparator": ">=",
"value": "10000000000000n",
}, # use bigint notation
}
compound_condition = {
"version": ConditionLingo.VERSION,
"condition": {
"conditionType": ConditionType.COMPOUND.value,
"operator": "and",
"operands": [contract_condition, rpc_condition],
},
}
compound_condition_json = json.dumps(compound_condition)
condition_result = ConditionLingo.from_json(compound_condition_json).eval(
providers=condition_providers, **context
)
assert condition_result, "condition executed and passes"

View File

@ -1,6 +1,5 @@
import random
import maya
import pytest
from web3 import Web3
@ -14,6 +13,7 @@ from nucypher.blockchain.eth.agents import (
)
from nucypher.blockchain.eth.interfaces import BlockchainInterfaceFactory
from nucypher.blockchain.eth.registry import ContractRegistry, RegistrySourceManager
from nucypher.crypto.powers import TransactingPower
from nucypher.utilities.logging import Logger
from tests.constants import (
BONUS_TOKENS_FOR_TESTS,
@ -44,18 +44,14 @@ MIN_AUTHORIZATION = Web3.to_wei(40_000, "ether")
REWARD_DURATION = 7 * ONE_DAY # one week in seconds
DEAUTHORIZATION_DURATION = 60 * ONE_DAY # 60 days in seconds
COMMITMENT_DURATION_1 = 182 * ONE_DAY # 182 days in seconds
COMMITMENT_DURATION_2 = 2 * COMMITMENT_DURATION_1 # 365 days in seconds
COMMITMENT_DEADLINE = 100 * ONE_DAY # 100 days after deployment
PENALTY_DEFAULT = 1000 # 10% penalty
PENALTY_INCREMENT = 2500 # 25% penalty increment
PENALTY_DURATION = ONE_DAY # 1 day in seconds
# Coordinator
TIMEOUT = 3600
DKG_TIMEOUT = 3600
HANDOVER_TIMEOUT = 1800
MAX_DKG_SIZE = 8
FEE_RATE = 1
@ -161,8 +157,6 @@ def taco_application(
MIN_OPERATOR_SECONDS,
REWARD_DURATION,
DEAUTHORIZATION_DURATION,
[COMMITMENT_DURATION_1, COMMITMENT_DURATION_2],
maya.now().epoch + COMMITMENT_DEADLINE,
PENALTY_DEFAULT,
PENALTY_DURATION,
PENALTY_INCREMENT,
@ -228,10 +222,12 @@ def coordinator(
_coordinator = deployer_account.deploy(
nucypher_dependency.Coordinator,
taco_child_application.address,
DKG_TIMEOUT,
HANDOVER_TIMEOUT,
)
encoded_initializer_function = _coordinator.initialize.encode_input(
TIMEOUT, MAX_DKG_SIZE, deployer_account.address
MAX_DKG_SIZE, deployer_account.address
)
proxy = deployer_account.deploy(
oz_dependency.TransparentUpgradeableProxy,
@ -255,15 +251,37 @@ def fee_model(nucypher_dependency, deployer_account, coordinator, ritual_token):
ritual_token.address,
FEE_RATE,
)
treasury_role = coordinator.TREASURY_ROLE()
coordinator.grantRole(
treasury_role, deployer_account.address, sender=deployer_account
coordinator.TREASURY_ROLE(), deployer_account.address, sender=deployer_account
)
coordinator.grantRole(
coordinator.FEE_MODEL_MANAGER_ROLE(),
deployer_account.address,
sender=deployer_account,
)
coordinator.approveFeeModel(contract.address, sender=deployer_account)
return contract
@pytest.fixture(scope="module")
def handover_supervisor(deployer_account, coordinator):
coordinator.grantRole(
coordinator.HANDOVER_SUPERVISOR_ROLE(),
deployer_account.address,
sender=deployer_account,
)
return deployer_account
@pytest.fixture(scope="module")
def supervisor_transacting_power(handover_supervisor, accounts):
return TransactingPower(
account=handover_supervisor.address,
signer=accounts.get_account_signer(handover_supervisor.address),
)
@pytest.fixture(scope="module")
def global_allow_list(nucypher_dependency, deployer_account, coordinator):
contract = deployer_account.deploy(

View File

@ -183,3 +183,9 @@ RPC_SUCCESSFUL_RESPONSE = {
"id": 1,
"result": "Geth/v1.9.20-stable-979fc968/linux-amd64/go1.15"
}
# Integers
UINT256_MAX = 2**256 - 1
INT256_MIN = -(2**255)

View File

@ -6,7 +6,6 @@ import tempfile
from datetime import timedelta
from functools import partial
from pathlib import Path
from typing import Tuple
from unittest.mock import PropertyMock
import maya
@ -767,9 +766,9 @@ def ursulas(accounts, ursula_test_config, staking_providers):
@pytest.fixture(scope="session")
def dkg_public_key_data(
def aggregated_transcript(
get_random_checksum_address,
) -> Tuple[AggregatedTranscript, DkgPublicKey]:
) -> AggregatedTranscript:
ritual_id = 0
num_shares = 4
threshold = 3
@ -779,12 +778,13 @@ def dkg_public_key_data(
Validator(
address=get_random_checksum_address(),
public_key=Keypair.random().public_key(),
share_index=i,
)
)
validators.sort(key=lambda x: x.address) # must be sorted
transcripts = []
validator_messages = []
for validator in validators:
transcript = dkg.generate_transcript(
ritual_id=ritual_id,
@ -793,30 +793,22 @@ def dkg_public_key_data(
threshold=threshold,
nodes=validators,
)
transcripts.append((validator, transcript))
validator_messages.append(dkg.ValidatorMessage(validator, transcript))
aggregate_transcript, public_key = dkg.aggregate_transcripts(
aggregate_transcript = dkg.aggregate_transcripts(
ritual_id=ritual_id,
me=validators[0],
shares=num_shares,
threshold=threshold,
transcripts=transcripts,
validator_messages=validator_messages,
)
return aggregate_transcript, public_key
return aggregate_transcript
@pytest.fixture(scope="session")
def dkg_public_key(dkg_public_key_data) -> DkgPublicKey:
_, dkg_public_key = dkg_public_key_data
return dkg_public_key
@pytest.fixture(scope="session")
def aggregated_transcript(dkg_public_key_data) -> AggregatedTranscript:
aggregated_transcript, _ = dkg_public_key_data
return aggregated_transcript
def dkg_public_key(aggregated_transcript) -> DkgPublicKey:
return aggregated_transcript.public_key
#
# DKG Ritual Aggregation

View File

@ -22,7 +22,7 @@ def ritualist(ursulas, mock_coordinator_agent) -> Operator:
ursula = ursulas[0]
mocked_agent = Mock(spec=CoordinatorAgent)
mocked_agent.contract = mock_coordinator_agent.contract
mocked_agent.get_timeout.return_value = 60 # 60s
mocked_agent.get_dkg_timeout.return_value = 60 # 60s
mocked_blockchain = Mock()
mocked_agent.blockchain = mocked_blockchain
mocked_w3 = Mock()
@ -79,7 +79,7 @@ def test_first_scan_start_block_calc_is_perfect(ritualist):
sample_base_block_number = latest_block_number - sample_window
# timeout
ritual_timeout = 60 * 60 * 24 # 24 hours
mocked_agent.get_timeout.return_value = ritual_timeout
mocked_agent.get_dkg_timeout.return_value = ritual_timeout
target_average_block_time = 8 # 8s block time
sample_base_block_timestamp = now.subtract(
seconds=target_average_block_time * sample_window
@ -146,7 +146,7 @@ def test_first_scan_start_block_calc_is_not_perfect_go_back_more_blocks(ritualis
sample_base_block_number = latest_block_number - sample_window
# timeout
ritual_timeout = 60 * 60 * 24 # 24 hours
mocked_agent.get_timeout.return_value = ritual_timeout
mocked_agent.get_dkg_timeout.return_value = ritual_timeout
target_average_block_time = 12 # 12s block tim4e
sample_base_block_timestamp = now.subtract(

View File

@ -40,9 +40,11 @@ class MockCoordinatorAgent(MockContractAgent):
START_RITUAL = 0
START_AGGREGATION_ROUND = 1
def __init__(self, blockchain: MockBlockchain, max_dkg_size: int = 64, timeout: int = 600):
def __init__(
self, blockchain: MockBlockchain, max_dkg_size: int = 64, dkg_timeout: int = 600
):
self.blockchain = blockchain
self.timeout = timeout
self.dkg_timeout = dkg_timeout
self.max_dkg_size = max_dkg_size
# Note that the call to super() is not necessary here
@ -213,8 +215,8 @@ class MockCoordinatorAgent(MockContractAgent):
def is_provider_public_key_set(self, staking_provider: ChecksumAddress) -> bool:
return staking_provider in self._participant_keys_history
def get_timeout(self) -> int:
return self.timeout
def get_dkg_timeout(self) -> int:
return self.dkg_timeout
def number_of_rituals(self) -> int:
return len(self._rituals)
@ -254,7 +256,7 @@ class MockCoordinatorAgent(MockContractAgent):
def get_ritual_status(self, ritual_id: int) -> int:
ritual = self._rituals[ritual_id]
timestamp = int(ritual.init_timestamp)
deadline = timestamp + self.timeout
deadline = timestamp + self.dkg_timeout
if timestamp == 0:
return self.RitualStatus.NON_INITIATED
elif ritual.total_aggregations == ritual.dkg_size:

View File

@ -1,6 +1,8 @@
import json
from collections import namedtuple
import pytest
from marshmallow import ValidationError
from packaging.version import parse as parse_version
import nucypher
@ -9,8 +11,13 @@ from nucypher.policy.conditions.context import USER_ADDRESS_CONTEXT
from nucypher.policy.conditions.exceptions import (
InvalidConditionLingo,
)
from nucypher.policy.conditions.lingo import ConditionLingo, ConditionType
from tests.constants import TESTERCHAIN_CHAIN_ID
from nucypher.policy.conditions.lingo import (
AnyField,
AnyLargeIntegerField,
ConditionLingo,
ConditionType,
)
from tests.constants import INT256_MIN, TESTERCHAIN_CHAIN_ID, UINT256_MAX
@pytest.fixture(scope="module")
@ -83,7 +90,7 @@ def lingo_with_all_condition_types(get_random_checksum_address):
}
json_rpc_condition = {
# JSON RPC
"conditionType": "json-rpc",
"conditionType": ConditionType.JSONRPC.value,
"endpoint": "https://math.example.com/",
"method": "subtract",
"params": [42, 23],
@ -93,6 +100,12 @@ def lingo_with_all_condition_types(get_random_checksum_address):
"value": 19,
},
}
jwt_condition = {
# JWT
"conditionType": ConditionType.JWT.value,
"jwtToken": ":token",
"publicKey": "-----BEGIN PUBLIC KEY-----\nMFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEXHVxB7s5SR7I9cWwry/JkECIReka\nCwG3uOLCYbw5gVzn4dRmwMyYUJFcQWuFSfECRK+uQOOXD0YSEucBq0p5tA==\n-----END PUBLIC KEY-----",
}
sequential_condition = {
"conditionType": ConditionType.SEQUENTIAL.value,
"conditionVariables": [
@ -112,6 +125,10 @@ def lingo_with_all_condition_types(get_random_checksum_address):
"varName": "jsonValue",
"condition": json_api_condition,
},
{
"varName": "jwtValue",
"condition": jwt_condition,
},
],
}
if_then_else_condition = {
@ -141,6 +158,7 @@ def lingo_with_all_condition_types(get_random_checksum_address):
},
}
def test_invalid_condition():
# no version or condition
data = dict()
@ -365,3 +383,106 @@ def test_lingo_data(conditions_test_data):
for name, condition_dict in conditions_test_data.items():
condition_class = ConditionLingo.resolve_condition_class(condition_dict)
_ = condition_class.from_dict(condition_dict)
@pytest.mark.parametrize(
"value",
[
1231323123132,
2121.23211,
False,
'"foo"', # string
"5555555555", # example of a number that was a string and should remain a string
":userAddress", # context variable
"0xaDD9D957170dF6F33982001E4c22eCCdd5539118", # string
"0x1234", # hex string
125, # int
-123456789, # negative int
1.223, # float
True, # bool
[1, 1.2314, False, "love"], # list of different types
["a", "b", "c"], # list
[True, False], # list of bools
{"name": "John", "age": 22}, # dict
namedtuple("MyStruct", ["field1", "field2"])(1, "a"),
[True, 2, 6.5, "0x123"],
],
)
def test_any_field_various_types(value):
field = AnyField()
deserialized_value = field.deserialize(value)
serialized_value = field._serialize(deserialized_value, attr=None, obj=None)
assert deserialized_value == serialized_value
assert deserialized_value == value
@pytest.mark.parametrize(
"integer_value",
[
UINT256_MAX,
INT256_MIN,
123132312, # safe int
-1231231, # safe negative int
],
)
def test_any_field_integer_str_and_no_str_conversion(integer_value):
field = AnyField()
deserialized_raw_integer = field.deserialize(value=integer_value)
deserialized_big_int_string = field.deserialize(value=f"{integer_value}n")
assert deserialized_raw_integer == deserialized_big_int_string
assert (
field._serialize(deserialized_raw_integer, attr=None, obj=None) == integer_value
)
assert (
field._serialize(deserialized_big_int_string, attr=None, obj=None)
== integer_value
)
def test_any_field_nested_integer():
field = AnyField()
regular_number = 12341231
parameters = [
f"{UINT256_MAX}n",
{"a": [f"{INT256_MIN}n", "my_string_value", "0xdeadbeef"], "b": regular_number},
]
# quoted numbers get unquoted after deserialization
expected_parameters = [
UINT256_MAX,
{"a": [INT256_MIN, "my_string_value", "0xdeadbeef"], "b": regular_number},
]
deserialized_parameters = field.deserialize(value=parameters)
assert deserialized_parameters == expected_parameters
@pytest.mark.parametrize(
"json_value, expected_deserialized_value",
[
(123132312, 123132312), # safe int
(-1231231, -1231231), # safe negative int
(f"{UINT256_MAX}n", UINT256_MAX),
(f"{INT256_MIN}n", INT256_MIN),
(f"{UINT256_MAX*2}n", UINT256_MAX * 2), # larger than uint256 max
(f"{INT256_MIN*2}n", INT256_MIN * 2), # smaller than in256 min
# expected failures
("Totally a number", None),
("Totally a number that ends with n", None),
("fallen", None),
],
)
def test_any_large_integer_field(json_value, expected_deserialized_value):
field = AnyLargeIntegerField()
if expected_deserialized_value is not None:
assert field.deserialize(json_value) == expected_deserialized_value
else:
# expected to fail
with pytest.raises(ValidationError, match="Not a valid integer."):
_ = field.deserialize(json_value)

View File

@ -19,6 +19,7 @@ from nucypher.policy.conditions.exceptions import (
from nucypher.policy.conditions.lingo import (
ReturnValueTest,
)
from tests.constants import INT256_MIN, UINT256_MAX
INVALID_CONTEXT_PARAM_NAMES = [
":",
@ -89,6 +90,34 @@ def test_resolve_any_context_variables():
assert resolved_return_value.value == resolved_value
@pytest.mark.parametrize(
"value, expected_resolved_value",
[
(":foo", UINT256_MAX),
(":bar", INT256_MIN),
(
[":foo", 12, ":bar", "5555555555", "endWith_n"],
[UINT256_MAX, 12, INT256_MIN, "5555555555", "endWith_n"],
),
(
[":foo", ":foo", 5, [99, [":bar"]]],
[UINT256_MAX, UINT256_MAX, 5, [99, [INT256_MIN]]],
),
],
)
def test_resolve_big_int_context_variables(value, expected_resolved_value):
# bigints have the 'n' suffix
context = {":foo": f"{UINT256_MAX}n", ":bar": f"{INT256_MIN}n"}
# use with parameters
resolved_value = resolve_any_context_variables(value, **context)
assert resolved_value == expected_resolved_value
return_value_test = ReturnValueTest(comparator="==", value=value)
resolved_return_value = return_value_test.with_resolved_context(**context)
assert resolved_return_value.value == resolved_value
@pytest.mark.parametrize(
"value, expected_resolution",
[

View File

@ -4,11 +4,12 @@ import os
import random
from enum import Enum
from typing import Any, Dict, List, Optional, Sequence, Union
from unittest.mock import Mock
from unittest.mock import ANY, Mock
import pytest
from hexbytes import HexBytes
from marshmallow import post_load
from web3 import Web3
from web3.providers import BaseProvider
from nucypher.policy.conditions.evm import ContractCall, ContractCondition
@ -17,8 +18,11 @@ from nucypher.policy.conditions.exceptions import (
InvalidConditionLingo,
)
from nucypher.policy.conditions.lingo import ConditionType, ReturnValueTest
from nucypher.policy.conditions.utils import ConditionProviderManager
from tests.constants import TESTERCHAIN_CHAIN_ID
from nucypher.policy.conditions.utils import (
ConditionProviderManager,
check_and_convert_big_int_string_to_int,
)
from tests.constants import INT256_MIN, TESTERCHAIN_CHAIN_ID, UINT256_MAX
CHAIN_ID = 137
@ -53,7 +57,7 @@ class FakeExecutionContractCondition(ContractCondition):
def set_execution_return_value(self, value: Any):
self.execution_return_value = value
def execute(self, providers: ConditionProviderManager, **context) -> Any:
def _execute(self, w3: Web3, resolved_parameters: List[Any]) -> Any:
return self.execution_return_value
EXECUTION_CALL_TYPE = FakeRPCCall
@ -77,13 +81,21 @@ def contract_condition_dict():
def _replace_abi_outputs(condition_json: Dict, output_type: str, output_value: Any):
# modify outputs type
condition_json["functionAbi"]["outputs"][0]["internalType"] = output_type
condition_json["functionAbi"]["outputs"][0]["type"] = output_type
for entry in condition_json["functionAbi"]["outputs"]:
entry["internalType"] = output_type
entry["type"] = output_type
# modify return value test
condition_json["returnValueTest"]["value"] = output_value
def _replace_abi_inputs(condition_json: Dict, input_type: str):
# modify inputs type
for entry in condition_json["functionAbi"]["inputs"]:
entry["internalType"] = input_type
entry["type"] = input_type
class ContextVarTest(Enum):
CONTEXT_VAR_ONLY = 0
NO_CONTEXT_VAR_ONLY = 1
@ -126,7 +138,9 @@ def _check_execution_logic(
json.dumps(condition_dict)
)
fake_execution_contract_condition.set_execution_return_value(execution_result)
fake_providers = ConditionProviderManager({CHAIN_ID: {Mock(BaseProvider)}})
fake_providers = Mock(spec=ConditionProviderManager)
fake_providers.web3_endpoints.return_value = [Mock(BaseProvider)]
condition_result, call_result = fake_execution_contract_condition.verify(
fake_providers, **context
)
@ -1286,3 +1300,71 @@ def test_abi_nested_tuples_output_values(
expected_outcome=None,
context_var_testing=ContextVarTest.CONTEXT_VAR_ONLY,
)
@pytest.mark.parametrize(
"io_type, parameter,return_value_test_value,contract_result",
[
("uint256", f"{UINT256_MAX}n", f"{UINT256_MAX}n", UINT256_MAX),
(
"uint256",
f"{int(UINT256_MAX/2)}n",
f"{int(UINT256_MAX/2)}n",
int(UINT256_MAX / 2),
),
("int256", f"{INT256_MIN}n", f"{INT256_MIN}n", INT256_MIN),
(
"int256",
f"{int(INT256_MIN/2)}n",
f"{int(INT256_MIN/2)}n",
int(INT256_MIN / 2),
),
],
)
def test_big_int_inputs_and_outputs(
io_type, parameter, return_value_test_value, contract_result, mocker
):
# tests use of big int values in inputs and outputs (both parameters, and return value test)
contract_condition = {
"conditionType": "contract",
"contractAddress": "0x01B67b1194C75264d06F808A921228a95C765dd7",
"parameters": [parameter],
"method": "getValue",
"functionAbi": {
"inputs": [
{"internalType": "uint256", "name": "value", "type": "uint256"},
],
"name": "getValue",
"outputs": [
{"internalType": "uint256", "name": "result", "type": "uint256"},
],
"stateMutability": "view",
"type": "function",
"constant": True,
},
"chain": CHAIN_ID,
"returnValueTest": {
"comparator": "==",
"value": 0,
},
}
_replace_abi_inputs(contract_condition, input_type=io_type)
_replace_abi_outputs(
contract_condition, output_type=io_type, output_value=return_value_test_value
)
execute_spy = mocker.spy(FakeExecutionContractCondition.FakeRPCCall, "_execute")
_check_execution_logic(
condition_dict=contract_condition,
execution_result=contract_result, # value returned by contract
comparator_value=return_value_test_value, # value set in return value test
comparator="==",
expected_outcome=True,
context_var_testing=ContextVarTest.WITH_AND_WITHOUT_CONTEXT_VAR,
)
execute_spy.assert_called_with(
ANY, ANY, [check_and_convert_big_int_string_to_int(parameter)]
) # (self, w3, [value used for parameter])

View File

@ -0,0 +1,236 @@
import calendar
from datetime import datetime, timezone
import jwt
import pytest
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ec
from marshmallow import validates
from nucypher.policy.conditions.base import ExecutionCall
from nucypher.policy.conditions.exceptions import InvalidCondition, JWTException
from nucypher.policy.conditions.jwt import JWTCondition, JWTVerificationCall
TEST_ECDSA_PRIVATE_KEY_RAW_B64 = (
"MHcCAQEEIHAhM7P6HG3LgkDvgvfDeaMA6uELj+jEKWsSeOpS/SfYoAoGCCqGSM49\n"
"AwEHoUQDQgAEXHVxB7s5SR7I9cWwry/JkECIRekaCwG3uOLCYbw5gVzn4dRmwMyY\n"
"UJFcQWuFSfECRK+uQOOXD0YSEucBq0p5tA=="
)
TEST_ECDSA_PRIVATE_KEY = ( # TODO: Workaround to bypass pre-commit hook that detects private keys in code
"-----BEGIN EC"
+ " PRIVATE KEY"
+ f"-----\n{TEST_ECDSA_PRIVATE_KEY_RAW_B64}\n-----END EC"
+ " PRIVATE KEY-----"
)
TEST_ECDSA_PUBLIC_KEY = (
"-----BEGIN PUBLIC KEY-----\nMFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEXHVxB7s5SR7I9cWwry"
"/JkECIReka\nCwG3uOLCYbw5gVzn4dRmwMyYUJFcQWuFSfECRK+uQOOXD0YSEucBq0p5tA==\n-----END PUBLIC "
"KEY-----"
)
ISSUED_AT = calendar.timegm(datetime.now(tz=timezone.utc).utctimetuple())
TEST_JWT_TOKEN = jwt.encode(
{"iat": ISSUED_AT}, TEST_ECDSA_PRIVATE_KEY, algorithm="ES256"
)
def generate_pem_keypair(elliptic_curve):
# Generate an EC private key
private_key = ec.generate_private_key(elliptic_curve)
# Get the corresponding public key
public_key = private_key.public_key()
# Serialize the private key to PEM format
pem_private_key = private_key.private_bytes(
encoding=serialization.Encoding.PEM,
format=serialization.PrivateFormat.TraditionalOpenSSL,
encryption_algorithm=serialization.NoEncryption(),
).decode("utf-8")
# Serialize the public key to PEM format
pem_public_key = public_key.public_bytes(
encoding=serialization.Encoding.PEM,
format=serialization.PublicFormat.SubjectPublicKeyInfo,
).decode("utf-8")
return pem_public_key, pem_private_key
def jwt_token(
with_iat: bool = True, claims: dict = None, expiration_offset: int = None
):
claims = claims or dict()
if with_iat:
claims["iat"] = ISSUED_AT
if expiration_offset is not None:
claims["exp"] = ISSUED_AT + expiration_offset
return jwt.encode(claims, TEST_ECDSA_PRIVATE_KEY, algorithm="ES256")
class TestJWTVerificationCall(JWTVerificationCall):
class Schema(JWTVerificationCall.Schema):
@validates("jwt_token")
def validate_jwt_token(self, value):
pass
def test_raw_jwt_decode():
token = jwt_token()
# Valid JWT
jwt.decode(token, TEST_ECDSA_PUBLIC_KEY, algorithms=["ES256"])
# Invalid JWT
with pytest.raises(jwt.exceptions.InvalidTokenError):
jwt.decode(token[1:], TEST_ECDSA_PUBLIC_KEY, algorithms=["ES256"])
def test_jwt_verification_call_invalid():
token = jwt_token()
message = r"Invalid value for JWT token; expected a context variable"
with pytest.raises(ExecutionCall.InvalidExecutionCall, match=message):
JWTVerificationCall(jwt_token=token, public_key=TEST_ECDSA_PUBLIC_KEY)
def test_jwt_verification_call_valid():
token = jwt_token()
call = TestJWTVerificationCall(jwt_token=token, public_key=TEST_ECDSA_PUBLIC_KEY)
assert call.execute()
def test_jwt_condition_missing_jwt_token():
with pytest.raises(
InvalidCondition, match="'jwt_token' field - Field may not be null."
):
_ = JWTCondition(jwt_token=None, public_key=None)
def test_jwt_condition_missing_public_key():
with pytest.raises(
InvalidCondition, match="'public_key' field - Field may not be null."
):
_ = JWTCondition(
jwt_token=":ok_ok_this_is_a_variable_for_a_jwt", public_key=None
)
def test_jwt_condition_invalid_public_key():
with pytest.raises(
InvalidCondition,
match="'public_key' field - Invalid public key format: Unable to load PEM",
):
_ = JWTCondition(
jwt_token=":ok_ok_this_is_a_variable_for_a_jwt",
public_key="-----BEGIN PUBLIC KEY----- haha, gotcha! 👌 -----END PUBLIC KEY-----",
)
def test_jwt_condition_but_unsupported_public_key():
pem_secp521_public_key, _ = generate_pem_keypair(ec.SECP521R1())
with pytest.raises(
InvalidCondition,
match="'public_key' field - Invalid public key format: Invalid EC public key curve",
):
_ = JWTCondition(
jwt_token=":ok_ok_this_is_a_variable_for_a_jwt",
public_key=pem_secp521_public_key,
)
def test_jwt_condition_initialization():
condition = JWTCondition(
jwt_token=":aContextVariableForJWTs",
public_key=TEST_ECDSA_PUBLIC_KEY,
)
assert condition.jwt_token == ":aContextVariableForJWTs"
assert condition.public_key == TEST_ECDSA_PUBLIC_KEY
assert condition.condition_type == JWTCondition.CONDITION_TYPE
def test_jwt_condition_verify():
token = jwt_token(with_iat=False)
condition = JWTCondition(
jwt_token=":anotherContextVariableForJWTs",
public_key=TEST_ECDSA_PUBLIC_KEY,
)
context = {":anotherContextVariableForJWTs": token}
success, result = condition.verify(**context)
assert success
assert result == {}
def test_jwt_condition_verify_of_jwt_with_custom_claims():
token = jwt_token(with_iat=False, claims={"foo": "bar"})
condition = JWTCondition(
jwt_token=":anotherContextVariableForJWTs",
public_key=TEST_ECDSA_PUBLIC_KEY,
)
context = {":anotherContextVariableForJWTs": token}
success, result = condition.verify(**context)
assert success
assert result == {"foo": "bar"}
def test_jwt_condition_verify_with_correct_issuer():
token = jwt_token(with_iat=False, claims={"iss": "Isabel"})
condition = JWTCondition(
jwt_token=":anotherContextVariableForJWTs",
public_key=TEST_ECDSA_PUBLIC_KEY,
expected_issuer="Isabel",
)
context = {":anotherContextVariableForJWTs": token}
success, result = condition.verify(**context)
assert success
assert result == {"iss": "Isabel"}
def test_jwt_condition_verify_with_invalid_issuer():
token = jwt_token(with_iat=False, claims={"iss": "Isabel"})
condition = JWTCondition(
jwt_token=":anotherContextVariableForJWTs",
public_key=TEST_ECDSA_PUBLIC_KEY,
expected_issuer="Isobel",
)
context = {":anotherContextVariableForJWTs": token}
with pytest.raises(JWTException, match="Invalid issuer"):
_ = condition.verify(**context)
def test_jwt_condition_verify_expired_token():
# Create a token that expired 100 seconds
expired_token = jwt_token(with_iat=True, expiration_offset=-100)
condition = JWTCondition(
jwt_token=":contextVar",
public_key=TEST_ECDSA_PUBLIC_KEY,
)
context = {":contextVar": expired_token}
with pytest.raises(JWTException, match="Signature has expired"):
_ = condition.verify(**context)
def test_jwt_condition_verify_valid_token_with_expiration():
# Create a token that will expire in 999 seconds
expired_token = jwt_token(with_iat=False, expiration_offset=999)
condition = JWTCondition(
jwt_token=":contextVar",
public_key=TEST_ECDSA_PUBLIC_KEY,
)
context = {":contextVar": expired_token}
success, result = condition.verify(**context)
assert success
assert result == {"exp": ISSUED_AT + 999}

View File

@ -39,9 +39,11 @@ from nucypher.policy.conditions.utils import (
ConditionEvalError,
ConditionProviderManager,
camel_case_to_snake,
check_and_convert_big_int_string_to_int,
evaluate_condition_lingo,
to_camelcase,
)
from tests.constants import INT256_MIN, UINT256_MAX
FAILURE_CASE_EXCEPTION_CODE_MATCHING = [
# (exception, constructor parameters, expected status code)
@ -216,3 +218,32 @@ def test_condition_provider_manager(mocker):
w3_2.eth.chain_id = 2
with patch.object(manager, "_configure_w3", side_effect=[w3_1, w3_2]):
assert list(manager.web3_endpoints(chain_id=2)) == [w3_1, w3_2]
@pytest.mark.parametrize(
"value, expectedValue",
[
# number string
("123132312", None),
("-1231231", None),
# big int string of form "<number>n"
(f"{UINT256_MAX}n", UINT256_MAX),
(f"{INT256_MIN}n", INT256_MIN),
(f"{UINT256_MAX*2}n", UINT256_MAX * 2), # larger than uint256 max
(f"{INT256_MIN*2}n", INT256_MIN * 2), # smaller than in256 min
("9007199254740992n", 9007199254740992), # bigger than max safe
("-9007199254740992n", -9007199254740992), # smaller than min safe
# regular strings
("Totally a number", None),
("Totally a number that ends with n", None),
("0xdeadbeef", None),
("fallen", None),
],
)
def test_conversion_from_big_int_string(value, expectedValue):
result = check_and_convert_big_int_string_to_int(value)
if expectedValue:
assert result == expectedValue
else:
# value unchanged
assert result == value

View File

@ -107,6 +107,7 @@ def random_transcript(get_random_checksum_address):
Validator(
address=get_random_checksum_address(),
public_key=Keypair.random().public_key(),
share_index=i,
)
)

View File

@ -0,0 +1,123 @@
import pytest
from nucypher_core.ferveo import (
AggregatedTranscript,
Dkg,
Keypair,
Validator,
ValidatorMessage,
combine_decryption_shares_simple,
decrypt_with_shared_secret,
encrypt,
)
SHARES_NUM = 4
# This test is a mirror of the handover python test in ferveo
@pytest.mark.parametrize("handover_slot_index", list(range(SHARES_NUM)))
def test_handover_with_encrypt_and_decrypt(
get_random_checksum_address, handover_slot_index
):
tau = 1
security_threshold = 3
shares_num = SHARES_NUM
validators_num = shares_num + 2
validator_keypairs = [Keypair.random() for _ in range(validators_num)]
# validators and associated keypairs must be in the same order
validators = [
Validator(get_random_checksum_address(), keypair.public_key(), i)
for i, keypair in enumerate(validator_keypairs)
]
# Each validator holds their own DKG instance and generates a transcript every
# validator, including themselves
messages = []
for sender in validators:
dkg = Dkg(
tau=tau,
shares_num=shares_num,
security_threshold=security_threshold,
validators=validators,
me=sender,
)
messages.append(ValidatorMessage(sender, dkg.generate_transcript()))
# Now that every validator holds a dkg instance and a transcript for every other validator,
# every validator can aggregate the transcripts
me = validators[0]
dkg = Dkg(
tau=tau,
shares_num=shares_num,
security_threshold=security_threshold,
validators=validators,
me=me,
)
# Server can aggregate the transcripts
server_aggregate = dkg.aggregate_transcripts(messages)
assert server_aggregate.verify(validators_num, messages)
# And the client can also aggregate and verify the transcripts
client_aggregate = AggregatedTranscript(messages)
assert client_aggregate.verify(validators_num, messages)
# In the meantime, the client creates a ciphertext and decryption request
msg = "abc".encode()
aad = "my-aad".encode()
ciphertext = encrypt(msg, aad, client_aggregate.public_key)
# The client can serialize/deserialize ciphertext for transport
_ciphertext_serialized = bytes(ciphertext)
# Let's simulate a handover
incoming_validator_keypair = Keypair.random()
incoming_validator = Validator(
get_random_checksum_address(),
incoming_validator_keypair.public_key(),
handover_slot_index,
)
departing_keypair = validator_keypairs[handover_slot_index]
handover_transcript = dkg.generate_handover_transcript(
server_aggregate,
handover_slot_index,
incoming_validator_keypair,
)
new_aggregate = server_aggregate.finalize_handover(
handover_transcript, departing_keypair
)
validator_keypairs[handover_slot_index] = incoming_validator_keypair
validators[handover_slot_index] = incoming_validator
# Having aggregated the transcripts, the validators can now create decryption shares
decryption_shares = []
for validator, validator_keypair in zip(validators, validator_keypairs):
dkg = Dkg(
tau=tau,
shares_num=shares_num,
security_threshold=security_threshold,
validators=validators,
me=validator,
)
# Create a decryption share for the ciphertext
decryption_share = new_aggregate.create_decryption_share_simple(
dkg, ciphertext.header, aad, validator_keypair
)
decryption_shares.append(decryption_share)
# We only need `threshold` decryption shares in simple variant
decryption_shares = decryption_shares[:security_threshold]
# Now, the decryption share can be used to decrypt the ciphertext
# This part is in the client API
shared_secret = combine_decryption_shares_simple(decryption_shares)
# The client should have access to the public parameters of the DKG
plaintext = decrypt_with_shared_secret(ciphertext, aad, shared_secret)
assert bytes(plaintext) == msg

View File

@ -0,0 +1,30 @@
import os
from nucypher_core.ferveo import FerveoPublicKey
from nucypher_core.ferveo import Keypair as FerveoKeypair
from nucypher.crypto.keypairs import RitualisticKeypair
from nucypher.crypto.powers import RitualisticPower
def test_derive_ritualistic_power(tmpdir):
size = FerveoKeypair.secure_randomness_size()
blob = os.urandom(size)
keypair = RitualisticKeypair.from_secure_randomness(blob)
power = RitualisticPower(keypair=keypair)
assert isinstance(power, RitualisticPower)
assert isinstance(power.keypair, RitualisticKeypair)
public_key = power.public_key()
assert isinstance(public_key, FerveoPublicKey)
# Generate keypair with different randomness
keypair2 = RitualisticKeypair.from_secure_randomness(os.urandom(size))
power2 = RitualisticPower(keypair=keypair2)
assert power.public_key() != power2.public_key()
# Generate keypair with same randomness
keypair3 = RitualisticKeypair.from_secure_randomness(blob)
power3 = RitualisticPower(keypair=keypair3)
assert power.public_key() == power3.public_key()

View File

@ -1,16 +1,25 @@
import os
from unittest.mock import patch
import pytest
from atxm.exceptions import Fault, InsufficientFunds
from nucypher.blockchain.eth.agents import CoordinatorAgent
from nucypher.blockchain.eth.models import PHASE1, PHASE2, Coordinator
from nucypher.blockchain.eth.models import (
HANDOVER_AWAITING_BLINDED_SHARE,
HANDOVER_AWAITING_TRANSCRIPT,
PHASE1,
PHASE2,
Coordinator,
)
from nucypher.crypto.powers import RitualisticPower
from nucypher.types import PhaseId
from tests.constants import MOCK_ETH_PROVIDER_URI
from tests.mock.coordinator import MockCoordinatorAgent
from tests.mock.interfaces import MockBlockchain
DKG_SIZE = 4
@pytest.fixture(scope="module")
def agent(mock_contract_agency, ursulas) -> MockCoordinatorAgent:
@ -23,8 +32,13 @@ def agent(mock_contract_agency, ursulas) -> MockCoordinatorAgent:
if ursula.checksum_address == provider:
return ursula.public_keys(RitualisticPower)
coordinator_agent.post_transcript = lambda *a, **kw: MockBlockchain.mock_async_tx()
coordinator_agent.post_aggregation = lambda *a, **kw: MockBlockchain.mock_async_tx()
def mock_async_tx(*args, **kwargs):
return MockBlockchain.mock_async_tx()
coordinator_agent.post_transcript = mock_async_tx
coordinator_agent.post_aggregation = mock_async_tx
coordinator_agent.post_handover_transcript = mock_async_tx
coordinator_agent.post_blinded_share_for_handover = mock_async_tx
coordinator_agent.get_provider_public_key = mock_get_provider_public_key
return coordinator_agent
@ -34,9 +48,14 @@ def ursula(ursulas):
return ursulas[1]
@pytest.fixture
def handover_incoming_ursula(ursulas):
return ursulas[DKG_SIZE]
@pytest.fixture(scope="module")
def cohort(ursulas):
return [u.staking_provider_address for u in ursulas[:4]]
return [u.staking_provider_address for u in ursulas[:DKG_SIZE]]
@pytest.fixture(scope="module")
@ -77,8 +96,8 @@ def test_initiate_ritual(
initiator=transacting_power.account,
authority=transacting_power.account,
access_controller=global_allow_list,
dkg_size=4,
threshold=MockCoordinatorAgent.get_threshold_for_ritual_size(dkg_size=4),
dkg_size=DKG_SIZE,
threshold=MockCoordinatorAgent.get_threshold_for_ritual_size(dkg_size=DKG_SIZE),
init_timestamp=123456,
end_timestamp=end_timestamp,
participants=participants,
@ -87,7 +106,6 @@ def test_initiate_ritual(
agent.get_ritual = lambda *args, **kwargs: ritual
assert receipt["transactionHash"]
return ritual_id
def test_perform_round_1(
@ -111,11 +129,11 @@ def test_perform_round_1(
initiator=random_address,
authority=random_address,
access_controller=get_random_checksum_address(),
dkg_size=4,
threshold=MockCoordinatorAgent.get_threshold_for_ritual_size(dkg_size=4),
dkg_size=DKG_SIZE,
threshold=MockCoordinatorAgent.get_threshold_for_ritual_size(dkg_size=DKG_SIZE),
init_timestamp=init_timestamp,
end_timestamp=end_timestamp,
total_transcripts=4,
total_transcripts=DKG_SIZE,
participants=list(participants.values()),
fee_model=get_random_checksum_address(),
)
@ -539,3 +557,210 @@ def test_async_tx_hooks_phase_2(ursula, mocker, aggregated_transcript, dkg_publi
assert (
mock_publish_transcript.call_count == current_call_count
), "no action needed, so not called"
def test_perform_handover_transcript_phase(
ursula,
handover_incoming_ursula,
cohort,
agent,
random_transcript,
):
init_timestamp = 123456
handover = Coordinator.Handover(
key=bytes(os.urandom(32)),
departing_validator=ursula.checksum_address,
incoming_validator=handover_incoming_ursula.checksum_address,
init_timestamp=init_timestamp,
blinded_share=b"",
transcript=b"",
decryption_request_pubkey=b"",
)
agent.get_handover = lambda *args, **kwargs: handover
# ensure no operation performed when ritual is not active
no_handover_dkg_states = [
Coordinator.RitualStatus.NON_INITIATED,
Coordinator.RitualStatus.DKG_AWAITING_TRANSCRIPTS,
Coordinator.RitualStatus.DKG_AWAITING_AGGREGATIONS,
Coordinator.RitualStatus.EXPIRED,
Coordinator.RitualStatus.DKG_TIMEOUT,
Coordinator.RitualStatus.DKG_INVALID,
]
for dkg_state in no_handover_dkg_states:
agent.get_ritual_status = lambda *args, **kwargs: dkg_state
assert not handover_incoming_ursula._is_handover_transcript_required(
ritual_id=0, departing_validator=ursula.checksum_address
)
async_tx = handover_incoming_ursula.perform_handover_transcript_phase(
ritual_id=0, departing_participant=ursula.checksum_address
)
assert async_tx is None # no execution performed
# ensure no operation performed when ritual is active but
# handover status is not awaiting transcript
agent.get_ritual_status = lambda *args, **kwargs: Coordinator.RitualStatus.ACTIVE
no_handover_transcript_states = [
Coordinator.HandoverStatus.NON_INITIATED,
Coordinator.HandoverStatus.HANDOVER_AWAITING_BLINDED_SHARE,
Coordinator.HandoverStatus.HANDOVER_AWAITING_FINALIZATION,
Coordinator.HandoverStatus.HANDOVER_TIMEOUT,
]
for handover_state in no_handover_transcript_states:
agent.get_handover_status = lambda *args, **kwargs: handover_state
assert not handover_incoming_ursula._is_handover_transcript_required(
ritual_id=0, departing_validator=ursula.checksum_address
)
async_tx = handover_incoming_ursula.perform_handover_transcript_phase(
ritual_id=0, departing_participant=ursula.checksum_address
)
assert async_tx is None # no execution performed
# set correct state
agent.get_handover_status = (
lambda *args, **kwargs: Coordinator.HandoverStatus.HANDOVER_AWAITING_TRANSCRIPT
)
assert handover_incoming_ursula._is_handover_transcript_required(
ritual_id=0, departing_validator=ursula.checksum_address
)
# cryptographic issue does not raise exception
with patch(
"nucypher.crypto.ferveo.dkg.produce_handover_transcript",
side_effect=Exception("transcript cryptography failed"),
):
async_tx = handover_incoming_ursula.perform_handover_transcript_phase(
ritual_id=0, departing_participant=ursula.checksum_address
)
# exception not raised, but None returned
assert async_tx is None
phase_id = PhaseId(ritual_id=0, phase=HANDOVER_AWAITING_TRANSCRIPT)
assert (
handover_incoming_ursula.dkg_storage.get_ritual_phase_async_tx(
phase_id=phase_id
)
is None
), "no tx data as yet"
# mock the handover transcript production
handover_incoming_ursula._produce_handover_transcript = (
lambda *args, **kwargs: random_transcript
)
# let's perform the handover transcript phase
async_tx = handover_incoming_ursula.perform_handover_transcript_phase(
ritual_id=0, departing_participant=ursula.checksum_address
)
# ensure tx is tracked
assert async_tx
assert (
handover_incoming_ursula.dkg_storage.get_ritual_phase_async_tx(
phase_id=phase_id
)
is async_tx
)
# try again
async_tx2 = handover_incoming_ursula.perform_handover_transcript_phase(
ritual_id=0, departing_participant=ursula.checksum_address
)
assert async_tx2 is async_tx
assert (
handover_incoming_ursula.dkg_storage.get_ritual_phase_async_tx(
phase_id=phase_id
)
is async_tx2
)
def test_perform_handover_blinded_share_phase(
ursula,
handover_incoming_ursula,
cohort,
agent,
):
init_timestamp = 123456
handover = Coordinator.Handover(
key=bytes(os.urandom(32)),
departing_validator=ursula.checksum_address,
incoming_validator=handover_incoming_ursula.checksum_address,
init_timestamp=init_timestamp,
blinded_share=b"",
transcript=bytes(os.urandom(32)),
decryption_request_pubkey=bytes(os.urandom(32)),
)
agent.get_handover = lambda *args, **kwargs: handover
# ensure no operation performed when ritual is not active
no_handover_dkg_states = [
Coordinator.RitualStatus.NON_INITIATED,
Coordinator.RitualStatus.DKG_AWAITING_TRANSCRIPTS,
Coordinator.RitualStatus.DKG_AWAITING_AGGREGATIONS,
Coordinator.RitualStatus.EXPIRED,
Coordinator.RitualStatus.DKG_TIMEOUT,
Coordinator.RitualStatus.DKG_INVALID,
]
for dkg_state in no_handover_dkg_states:
agent.get_ritual_status = lambda *args, **kwargs: dkg_state
assert not ursula._is_handover_blinded_share_required(ritual_id=0)
async_tx = ursula.perform_handover_blinded_share_phase(ritual_id=0)
assert async_tx is None # no execution performed
# ensure ritual is active
agent.get_ritual_status = lambda *args, **kwargs: Coordinator.RitualStatus.ACTIVE
# ensure no operation performed when ritual is active but
# handover status is not awaiting blinding share
agent.get_ritual_status = lambda *args, **kwargs: Coordinator.RitualStatus.ACTIVE
no_handover_transcript_states = [
Coordinator.HandoverStatus.NON_INITIATED,
Coordinator.HandoverStatus.HANDOVER_AWAITING_TRANSCRIPT,
Coordinator.HandoverStatus.HANDOVER_AWAITING_FINALIZATION,
Coordinator.HandoverStatus.HANDOVER_TIMEOUT,
]
for handover_state in no_handover_transcript_states:
agent.get_handover_status = lambda *args, **kwargs: handover_state
assert not ursula._is_handover_blinded_share_required(ritual_id=0)
async_tx = ursula.perform_handover_blinded_share_phase(ritual_id=0)
assert async_tx is None # no execution performed
# set correct state
agent.get_handover_status = (
lambda *args, **kwargs: Coordinator.HandoverStatus.HANDOVER_AWAITING_BLINDED_SHARE
)
assert ursula._is_handover_blinded_share_required(ritual_id=0)
# cryptographic issue does not raise exception
with patch(
"nucypher.crypto.ferveo.dkg.finalize_handover",
side_effect=Exception("blinded share cryptography failed"),
):
async_tx = ursula.perform_handover_blinded_share_phase(ritual_id=0)
# exception not raised, but None returned
assert async_tx is None
phase_id = PhaseId(ritual_id=0, phase=HANDOVER_AWAITING_BLINDED_SHARE)
assert (
ursula.dkg_storage.get_ritual_phase_async_tx(phase_id=phase_id) is None
), "no tx data as yet"
# mock the blinded share production
ursula._produce_blinded_share_for_handover = lambda *args, **kwargs: bytes(
os.urandom(32)
)
# let's perform the handover blinded share phase
async_tx = ursula.perform_handover_blinded_share_phase(ritual_id=0)
# ensure tx is tracked
assert async_tx
assert ursula.dkg_storage.get_ritual_phase_async_tx(phase_id=phase_id) is async_tx
# try again
async_tx2 = ursula.perform_handover_blinded_share_phase(ritual_id=0)
assert async_tx2 is async_tx
assert ursula.dkg_storage.get_ritual_phase_async_tx(phase_id=phase_id) is async_tx2