Merge pull request #3431 from nucypher/v7.3.x

v7.3.x
pull/3498/head
Manuel Montenegro 2024-05-07 15:27:43 +02:00 committed by GitHub
commit 9ce9048605
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
132 changed files with 14050 additions and 11906 deletions

@@ -1,7 +1,7 @@
[bumpversion]
current_version = 7.2.0
current_version = 7.3.0
commit = True
tag = True
tag = False
parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)(-(?P<stage>[^.]*)\.(?P<devnum>\d+))?
serialize =
{major}.{minor}.{patch}-{stage}.{devnum}
@@ -16,4 +16,5 @@ values =
[bumpversion:part:devnum]
[bumpversion:file:pyproject.toml]
[bumpversion:file:nucypher/__about__.py]
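The ``parse`` pattern in the ``[bumpversion]`` section above can be sanity-checked with a short standalone sketch (plain Python ``re``, mirroring the config's regex exactly):

```python
import re

# Same version-parsing pattern as in the [bumpversion] config above.
PARSE = re.compile(
    r"(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)"
    r"(-(?P<stage>[^.]*)\.(?P<devnum>\d+))?"
)

release = PARSE.match("7.3.0")
prerelease = PARSE.match("7.3.0-rc.1")

print(release.group("major", "minor", "patch"))  # ('7', '3', '0')
print(prerelease.group("stage"), prerelease.group("devnum"))  # rc 1
```

The optional ``-{stage}.{devnum}`` suffix is what lets the same config serialize both plain releases and prereleases.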

.github/labeler.yml
@@ -5,8 +5,7 @@ Demos:
- examples/**/*
Dependencies:
- Pipfile
- Pipfile.lock
- poetry.lock
- setup.py
- dev-requirements.txt
- requirements.txt

@@ -18,7 +18,7 @@ jobs:
uses: docker/setup-buildx-action@v3
- name: Login to DockerHub
uses: docker/login-action@v1
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}

@@ -12,7 +12,7 @@ on:
env: # TODO: Use variables when GH supports it for forks. See https://github.com/orgs/community/discussions/44322
DEMO_L1_PROVIDER_URI: "https://sepolia.infura.io/v3/3747007a284045d483c342fb39889a30"
DEMO_L2_PROVIDER_URI: "https://polygon-mumbai.infura.io/v3/3747007a284045d483c342fb39889a30"
DEMO_L2_PROVIDER_URI: "https://polygon-amoy.infura.io/v3/3747007a284045d483c342fb39889a30"
COLLECT_PROFILER_STATS: "" # any value is fine
jobs:

@@ -142,7 +142,7 @@ jobs:
# Only upload coverage files after all tests have passed
- name: Upload unit tests coverage to Codecov
if: matrix.python-version == '3.12'
uses: codecov/codecov-action@v3.1.1
uses: codecov/codecov-action@v4.3.0
with:
token: ${{ secrets.CODECOV_TOKEN }}
files: unit-coverage.xml
@@ -152,7 +152,7 @@
- name: Upload integration tests coverage to Codecov
if: matrix.python-version == '3.12'
uses: codecov/codecov-action@v3.1.1
uses: codecov/codecov-action@v4.3.0
with:
token: ${{ secrets.CODECOV_TOKEN }}
files: integration-coverage.xml
@@ -162,7 +162,7 @@
- name: Upload acceptance tests coverage to Codecov
if: matrix.python-version == '3.12'
uses: codecov/codecov-action@v3.1.1
uses: codecov/codecov-action@v4.3.0
with:
token: ${{ secrets.CODECOV_TOKEN }}
directory: tests/acceptance

@@ -66,7 +66,7 @@ Ensure Rust is Installed
Instructions for installing Rust can be found (\ `here <https://rustup.rs/>`_\ ).
After acquiring a local copy of the application code and installing rust, you will need to
install the project dependencies, we recommend using either ``pip`` or ``pipenv``.
install the project dependencies, we recommend using either ``pip`` or ``poetry``.
Pip Development Installation

@@ -7,8 +7,6 @@ help:
@echo "dist - build wheels and source distribution"
@echo "smoke-test - build a source distribution and spawn an active virtual environment"
@echo "lock - Regenerate dependency locks"
@echo "env - Regenerate locks and create a new development pipenv"
@echo "install - Development installation via pipenv"
clean: clean-build clean-pyc
@@ -51,17 +49,3 @@ smoke-test: clean
lock: clean
# Relock dependencies
scripts/dependencies/relock_dependencies.sh
env: lock
# Relock dependencies and generate a pipenv virtualenv from the result
pipenv run pip install -e .[dev]
pipenv shell
nucypher --version
install: clean
pipenv --rm
# Development installation
pipenv run pip install -e .[dev]
# Show installed version and verify entry point
pipenv shell
nucypher --version

Pipfile
@@ -1,63 +0,0 @@
[[source]]
url = "https://pypi.python.org/simple"
verify_ssl = true
name = "pypi"
[requires]
python_version = "3"
[packages]
nucypher-core = "==0.13.0"
# Cryptography
cryptography = ">=3.2"
mnemonic = "*"
pynacl = ">=1.4.0"
pyopenssl = "*"
# Ethereum
eth-abi = "<5.0.0" # eth-ape restriction
eth-tester = "<0.10.0,>0.9.0" # web3[tester]
eth-account = "<0.9,>=0.8.0"
eth-utils = "*"
web3 = ">=6.0.0"
# Web
flask = "*"
hendrix = ">=4.0"
requests = "*"
mako = "*"
# CLI
click = ">=7.0"
colorama = "*"
tabulate = "*"
# Serialization
bytestring-splitter = ">=2.4.0"
marshmallow = "*"
msgpack = "*"
# Utilities
aiohttp = "*"
appdirs = "*"
constant-sorrow = ">=0.1.0a9"
maya = "*"
pendulum = ">=3.0.0b1"
prometheus-client = "*"
setuptools = "*" # for distutils
urllib3 = "<2,>=1.26.16" # eth-ape
[dev-packages]
# Pytest
pytest = "<7" # See https://github.com/pytest-dev/pytest/issues/9703
pytest-cov = "*"
pytest-mock = "*"
pytest-timeout = "*"
pytest-twisted = "*"
# Tools
ape-solidity = ">=0.6.5"
coverage = ">=7.3.2"
eth-ape = ">=0.6.23"
pre-commit = ">=2.12.1"
[scripts]
nucypher = "python3 nucypher/cli/main.py"
[pipenv]
allow_prereleases = true

Pipfile.lock

File diff suppressed because it is too large.

@@ -11,9 +11,9 @@ secrets management and dynamic access control.*
----
# TACo Access Control
# Threshold Access Control (TACo)
TACo (Threshold Access Control) is end-to-end encrypted data sharing and communication, without the requirement of
TACo is end-to-end encrypted data sharing and communication, without the requirement of
trusting a centralized authority, who might unilaterally deny service or even decrypt private user data. It is the only
access control layer available to Web3 developers that can offer a decentralized service, through a live,
well-collateralized and battle-tested network. See more here: [https://docs.threshold.network/applications/threshold-access-control](https://docs.threshold.network/applications/threshold-access-control)
@@ -22,7 +22,7 @@ well-collateralized and battle-tested network. See more here: [https://docs.thr
NuCypher is a community-driven project and we're very open to outside contributions.
All our development discussions happen in our [Discord server](https://discord.gg/7rmXa3S), where we're happy to answer
All our development discussions happen in our [Discord server](https://discord.gg/threshold), where we're happy to answer
technical questions, discuss feature requests,
and accept bug reports.

@@ -3,10 +3,17 @@ FROM nucypher/rust-python:3.12.0
# set default user
USER $USER
# set default in-container workdir
WORKDIR /code
# Layer 1: Install dependencies
COPY requirements.txt /code
RUN pip3 install --no-cache-dir -r requirements.txt
# Layer 2: Install nucypher entrypoint
COPY . /code
RUN pip3 install . --no-deps
RUN pip3 install .[ursula]
# Layer 3: Set environment variables
RUN export PATH="$HOME/.local/bin:$PATH"
CMD ["/bin/bash"]

@@ -1,130 +1,191 @@
-i https://pypi.python.org/simple
aiohttp==3.9.1; python_version >= '3.8'
aiosignal==1.3.1; python_version >= '3.7'
annotated-types==0.6.0; python_version >= '3.8'
ape-solidity==0.6.11; python_version >= '3.8' and python_version < '4'
asttokens==2.4.1
attrs==23.2.0; python_version >= '3.7'
base58==1.0.3
bitarray==2.9.2
cached-property==1.5.2
certifi==2023.11.17; python_version >= '3.6'
cffi==1.16.0; python_version >= '3.8'
cfgv==3.4.0; python_version >= '3.8'
charset-normalizer==3.3.2; python_full_version >= '3.7.0'
click==8.1.7; python_version >= '3.7'
commonmark==0.9.1
coverage[toml]==7.4.0; python_version >= '3.8'
cryptography==41.0.7; python_version >= '3.7'
cytoolz==0.12.2; implementation_name == 'cpython'
dataclassy==0.11.1; python_version >= '3.6'
decorator==5.1.1; python_version >= '3.5'
deprecated==1.2.14; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'
distlib==0.3.8
eip712==0.2.2; python_version >= '3.8' and python_version < '4'
eth-abi==4.2.1; python_full_version >= '3.7.2' and python_version < '4'
eth-account==0.8.0; python_version >= '3.6' and python_version < '4'
eth-ape==0.6.27; python_version >= '3.8' and python_version < '4'
eth-bloom==3.0.0; python_version >= '3.8' and python_version < '4'
eth-hash[pycryptodome]==0.6.0; python_version >= '3.8' and python_version < '4'
eth-keyfile==0.6.1
eth-keys==0.4.0
eth-pydantic-types==0.1.0a5; python_version >= '3.8' and python_version < '4'
eth-rlp==0.3.0; python_version >= '3.7' and python_version < '4'
eth-tester==0.9.1b1; python_full_version >= '3.6.8' and python_version < '4'
eth-typing==3.5.2; python_full_version >= '3.7.2' and python_version < '4'
eth-utils==2.3.1; python_version >= '3.7' and python_version < '4'
ethpm-types==0.5.11; python_version >= '3.8' and python_version < '4'
evm-trace==0.1.2; python_version >= '3.8' and python_version < '4'
executing==2.0.1; python_version >= '3.5'
filelock==3.13.1; python_version >= '3.8'
frozenlist==1.4.1; python_version >= '3.8'
greenlet==3.0.3; python_version >= '3.7'
hexbytes==0.3.1; python_version >= '3.7' and python_version < '4'
identify==2.5.33; python_version >= '3.8'
idna==3.6; python_version >= '3.5'
ijson==3.2.3
importlib-metadata==7.0.1; python_version >= '3.8'
iniconfig==2.0.0; python_version >= '3.7'
ipython==8.20.0; python_version >= '3.10'
jedi==0.19.1; python_version >= '3.6'
jsonschema==4.20.0; python_version >= '3.8'
jsonschema-specifications==2023.12.1; python_version >= '3.8'
lazyasd==0.1.4
lru-dict==1.2.0
matplotlib-inline==0.1.6; python_version >= '3.5'
morphys==1.0
msgspec==0.18.5; python_version >= '3.8'
multidict==6.0.4; python_version >= '3.7'
mypy-extensions==1.0.0; python_version >= '3.5'
nodeenv==1.8.0; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5, 3.6'
numpy==1.26.3; python_version >= '3.10'
packaging==23.2; python_version >= '3.7'
pandas==1.5.3; python_version >= '3.8'
parsimonious==0.9.0
parso==0.8.3; python_version >= '3.6'
pexpect==4.9.0; sys_platform != 'win32'
platformdirs==4.1.0; python_version >= '3.8'
pluggy==1.3.0; python_version >= '3.8'
pre-commit==3.6.0; python_version >= '3.9'
prompt-toolkit==3.0.43; python_full_version >= '3.7.0'
protobuf==4.25.2; python_version >= '3.8'
ptyprocess==0.7.0
pure-eval==0.2.2
py==1.11.0; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'
py-cid==0.3.0
py-ecc==6.0.0; python_version >= '3.6' and python_version < '4'
py-evm==0.7.0a4
py-geth==3.13.0; python_version >= '3.7' and python_version < '4'
py-multibase==1.0.3
py-multicodec==0.2.1
py-multihash==0.2.3
py-solc-x==2.0.2; python_version >= '3.8' and python_version < '4'
pycparser==2.21
pycryptodome==3.20.0
pydantic==2.5.3; python_version >= '3.7'
pydantic-core==2.14.6; python_version >= '3.7'
pyethash==0.1.27
pygithub==1.59.1; python_version >= '3.7'
pygments==2.17.2; python_version >= '3.7'
pyjwt[crypto]==2.8.0; python_version >= '3.7'
pynacl==1.5.0; python_version >= '3.6'
pytest==6.2.5; python_version >= '3.6'
pytest-cov==4.1.0; python_version >= '3.7'
pytest-mock==3.12.0; python_version >= '3.8'
pytest-timeout==2.2.0; python_version >= '3.7'
pytest-twisted==1.14.0; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'
python-baseconv==1.2.2
python-dateutil==2.8.2; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'
pytz==2023.3.post1
pyunormalize==15.1.0; python_version >= '3.6'
pyyaml==6.0.1; python_version >= '3.6'
referencing==0.32.1; python_version >= '3.8'
regex==2023.12.25; python_version >= '3.7'
requests==2.31.0; python_version >= '3.7'
rich==12.6.0; python_full_version >= '3.6.3' and python_full_version < '4.0.0'
rlp==3.0.0
rpds-py==0.16.2; python_version >= '3.8'
safe-pysha3==1.0.4
semantic-version==2.10.0; python_version >= '2.7'
setuptools==69.0.3; python_version >= '3.8'
six==1.16.0; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'
sortedcontainers==2.4.0
sqlalchemy==2.0.25; python_version >= '3.7'
stack-data==0.6.3
toml==0.10.2; python_version >= '2.6' and python_version not in '3.0, 3.1, 3.2'
toolz==0.12.0; python_version >= '3.5'
tqdm==4.66.1; python_version >= '3.7'
traitlets==5.14.1; python_version >= '3.8'
trie==2.2.0; python_version >= '3.7' and python_version < '4'
typing-extensions==4.9.0; python_version >= '3.8'
urllib3==1.26.18; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5'
varint==1.0.2
virtualenv==20.25.0; python_version >= '3.7'
watchdog==3.0.0; python_version >= '3.7'
wcwidth==0.2.13
web3==6.14.0; python_full_version >= '3.7.2'
websockets==12.0; python_version >= '3.8'
wrapt==1.16.0; python_version >= '3.6'
yarl==1.9.4; python_version >= '3.7'
zipp==3.17.0; python_version >= '3.8'
aiohttp==3.9.4rc0 ; python_version >= "3.8" and python_version < "4"
aiosignal==1.3.1 ; python_version >= "3.8" and python_version < "4"
annotated-types==0.6.0 ; python_version >= "3.8" and python_version < "4"
ape-solidity==0.7.1 ; python_version >= "3.8" and python_version < "4"
appdirs==1.4.4 ; python_version >= "3.8" and python_version < "4"
appnope==0.1.4 ; python_version >= "3.8" and python_version < "4" and sys_platform == "darwin"
asttokens==2.4.1 ; python_version >= "3.8" and python_version < "4"
async-timeout==4.0.3 ; python_version >= "3.8" and python_version < "3.11"
atomicwrites==1.4.1 ; python_version >= "3.8" and python_version < "4" and sys_platform == "win32"
attrs==23.2.0 ; python_version >= "3.8" and python_version < "4"
atxm==0.3.0 ; python_version >= "3.8" and python_version < "4"
autobahn==23.1.2 ; python_version >= "3.8" and python_version < "4"
automat==22.10.0 ; python_version >= "3.8" and python_version < "4"
backcall==0.2.0 ; python_version >= "3.8" and python_version < "4"
backports-zoneinfo==0.2.1 ; python_version >= "3.8" and python_version < "3.9"
base58==1.0.3 ; python_version >= "3.8" and python_version < "4"
bitarray==2.9.2 ; python_version >= "3.8" and python_version < "4"
blinker==1.7.0 ; python_version >= "3.8" and python_version < "4"
bytestring-splitter==2.4.1 ; python_version >= "3.8" and python_version < "4"
cached-property==1.5.2 ; python_version >= "3.8" and python_version < "4"
certifi==2024.2.2 ; python_version >= "3.8" and python_version < "4"
cffi==1.16.0 ; python_version >= "3.8" and python_version < "4"
cfgv==3.4.0 ; python_version >= "3.8" and python_version < "4"
charset-normalizer==3.3.2 ; python_version >= "3.8" and python_version < "4"
click==8.1.7 ; python_version >= "3.8" and python_version < "4"
colorama==0.4.6 ; python_version >= "3.8" and python_version < "4"
constant-sorrow==0.1.0a9 ; python_version >= "3.8" and python_version < "4"
constantly==23.10.4 ; python_version >= "3.8" and python_version < "4"
coverage==7.4.4 ; python_version >= "3.8" and python_version < "4"
coverage[toml]==7.4.4 ; python_version >= "3.8" and python_version < "4"
cryptography==42.0.5 ; python_version >= "3.8" and python_version < "4"
cytoolz==0.12.3 ; python_version >= "3.8" and python_version < "4" and implementation_name == "cpython"
dataclassy==0.11.1 ; python_version >= "3.8" and python_version < "4"
dateparser==1.2.0 ; python_version >= "3.8" and python_version < "4"
decorator==5.1.1 ; python_version >= "3.8" and python_version < "4"
deprecated==1.2.14 ; python_version >= "3.8" and python_version < "4"
distlib==0.3.8 ; python_version >= "3.8" and python_version < "4"
eip712==0.2.5 ; python_version >= "3.8" and python_version < "4"
eth-abi==4.2.1 ; python_version >= "3.8" and python_version < "4"
eth-account==0.10.0 ; python_version >= "3.8" and python_version < "4"
eth-ape==0.7.13 ; python_version >= "3.8" and python_version < "4"
eth-bloom==3.0.0 ; python_version >= "3.8" and python_version < "4"
eth-hash==0.7.0 ; python_version >= "3.8" and python_version < "4"
eth-hash[pycryptodome]==0.7.0 ; python_version >= "3.8" and python_version < "4"
eth-hash[pysha3]==0.7.0 ; python_version >= "3.8" and python_version < "4" and implementation_name == "cpython"
eth-keyfile==0.8.0 ; python_version >= "3.8" and python_version < "4"
eth-keys==0.4.0 ; python_version >= "3.8" and python_version < "4"
eth-pydantic-types==0.1.0 ; python_version >= "3.8" and python_version < "4"
eth-rlp==1.0.1 ; python_version >= "3.8" and python_version < "4"
eth-tester[py-evm]==0.9.1b2 ; python_version >= "3.8" and python_version < "4"
eth-typing==3.5.2 ; python_version >= "3.8" and python_version < "4"
eth-utils==2.3.1 ; python_version >= "3.8" and python_version < "4"
ethpm-types==0.6.9 ; python_version >= "3.8" and python_version < "4"
evm-trace==0.1.3 ; python_version >= "3.8" and python_version < "4"
evmchains==0.0.6 ; python_version >= "3.8" and python_version < "4"
executing==2.0.1 ; python_version >= "3.8" and python_version < "4"
filelock==3.13.4 ; python_version >= "3.8" and python_version < "4"
flask==3.0.3 ; python_version >= "3.8" and python_version < "4"
frozenlist==1.4.1 ; python_version >= "3.8" and python_version < "4"
greenlet==3.0.3 ; python_version >= "3.8" and python_version < "4"
hendrix==5.0.0 ; python_version >= "3.8" and python_version < "4"
hexbytes==0.3.1 ; python_version >= "3.8" and python_version < "4"
humanize==4.9.0 ; python_version >= "3.8" and python_version < "4"
hyperlink==21.0.0 ; python_version >= "3.8" and python_version < "4"
identify==2.5.35 ; python_version >= "3.8" and python_version < "4"
idna==3.7 ; python_version >= "3.8" and python_version < "4"
ijson==3.2.3 ; python_version >= "3.8" and python_version < "4"
importlib-metadata==7.1.0 ; python_version >= "3.8" and python_version < "4"
importlib-resources==6.4.0 ; python_version >= "3.8" and python_version < "3.9"
incremental==22.10.0 ; python_version >= "3.8" and python_version < "4"
iniconfig==2.0.0 ; python_version >= "3.8" and python_version < "4"
ipython==8.12.3 ; python_version >= "3.8" and python_version < "4"
itsdangerous==2.1.2 ; python_version >= "3.8" and python_version < "4"
jedi==0.19.1 ; python_version >= "3.8" and python_version < "4"
jinja2==3.1.3 ; python_version >= "3.8" and python_version < "4"
jsonschema-specifications==2023.12.1 ; python_version >= "3.8" and python_version < "4"
jsonschema==4.21.1 ; python_version >= "3.8" and python_version < "4"
lazyasd==0.1.4 ; python_version >= "3.8" and python_version < "4"
lru-dict==1.2.0 ; python_version >= "3.8" and python_version < "4"
mako==1.3.3 ; python_version >= "3.8" and python_version < "4"
markdown-it-py==3.0.0 ; python_version >= "3.8" and python_version < "4"
markupsafe==2.1.5 ; python_version >= "3.8" and python_version < "4"
marshmallow==3.21.1 ; python_version >= "3.8" and python_version < "4"
matplotlib-inline==0.1.7 ; python_version >= "3.8" and python_version < "4"
maya==0.6.1 ; python_version >= "3.8" and python_version < "4"
mdurl==0.1.2 ; python_version >= "3.8" and python_version < "4"
mnemonic==0.20 ; python_version >= "3.8" and python_version < "4"
morphys==1.0 ; python_version >= "3.8" and python_version < "4"
msgpack-python==0.5.6 ; python_version >= "3.8" and python_version < "4"
msgspec==0.18.6 ; python_version >= "3.8" and python_version < "4"
multidict==6.0.5 ; python_version >= "3.8" and python_version < "4"
mypy-extensions==1.0.0 ; python_version >= "3.8" and python_version < "4"
nodeenv==1.8.0 ; python_version >= "3.8" and python_version < "4"
nucypher-core==0.13.0 ; python_version >= "3.8" and python_version < "4"
numpy==1.24.4 ; python_version >= "3.8" and python_version < "3.9"
numpy==1.26.4 ; python_version >= "3.9" and python_version < "4"
packaging==23.2 ; python_version >= "3.8" and python_version < "4"
pandas==1.5.3 ; python_version >= "3.8" and python_version < "4"
parsimonious==0.9.0 ; python_version >= "3.8" and python_version < "4"
parso==0.8.4 ; python_version >= "3.8" and python_version < "4"
pendulum==3.0.0 ; python_version >= "3.8" and python_version < "4"
pexpect==4.9.0 ; python_version >= "3.8" and python_version < "4" and sys_platform != "win32"
pickleshare==0.7.5 ; python_version >= "3.8" and python_version < "4"
pkgutil-resolve-name==1.3.10 ; python_version >= "3.8" and python_version < "3.9"
platformdirs==4.2.0 ; python_version >= "3.8" and python_version < "4"
pluggy==1.4.0 ; python_version >= "3.8" and python_version < "4"
pre-commit==2.21.0 ; python_version >= "3.8" and python_version < "4"
prometheus-client==0.20.0 ; python_version >= "3.8" and python_version < "4"
prompt-toolkit==3.0.43 ; python_version >= "3.8" and python_version < "4"
protobuf==5.26.1 ; python_version >= "3.8" and python_version < "4"
ptyprocess==0.7.0 ; python_version >= "3.8" and python_version < "4" and sys_platform != "win32"
pure-eval==0.2.2 ; python_version >= "3.8" and python_version < "4"
py-cid==0.3.0 ; python_version >= "3.8" and python_version < "4"
py-ecc==6.0.0 ; python_version >= "3.8" and python_version < "4"
py-evm==0.7.0a4 ; python_version >= "3.8" and python_version < "4"
py-geth==4.4.0 ; python_version >= "3.8" and python_version < "4"
py-multibase==1.0.3 ; python_version >= "3.8" and python_version < "4"
py-multicodec==0.2.1 ; python_version >= "3.8" and python_version < "4"
py-multihash==0.2.3 ; python_version >= "3.8" and python_version < "4"
py-solc-x==2.0.2 ; python_version >= "3.8" and python_version < "4"
py==1.11.0 ; python_version >= "3.8" and python_version < "4"
pyasn1-modules==0.4.0 ; python_version >= "3.8" and python_version < "4"
pyasn1==0.6.0 ; python_version >= "3.8" and python_version < "4"
pychalk==2.0.1 ; python_version >= "3.8" and python_version < "4"
pycparser==2.22 ; python_version >= "3.8" and python_version < "4"
pycryptodome==3.20.0 ; python_version >= "3.8" and python_version < "4"
pydantic-core==2.14.6 ; python_version >= "3.8" and python_version < "4"
pydantic-settings==2.2.1 ; python_version >= "3.8" and python_version < "4"
pydantic==2.5.3 ; python_version >= "3.8" and python_version < "4"
pyethash==0.1.27 ; python_version >= "3.8" and python_version < "4"
pygithub==1.59.1 ; python_version >= "3.8" and python_version < "4"
pygments==2.17.2 ; python_version >= "3.8" and python_version < "4"
pyjwt[crypto]==2.8.0 ; python_version >= "3.8" and python_version < "4"
pynacl==1.5.0 ; python_version >= "3.8" and python_version < "4"
pyopenssl==24.1.0 ; python_version >= "3.8" and python_version < "4"
pysha3==1.0.2 ; python_version < "3.9" and python_version >= "3.8" and implementation_name == "cpython"
pytest-cov==5.0.0 ; python_version >= "3.8" and python_version < "4"
pytest-mock==3.14.0 ; python_version >= "3.8" and python_version < "4"
pytest-timeout==2.2.0 ; python_version >= "3.8" and python_version < "4"
pytest-twisted==1.14.1 ; python_version >= "3.8" and python_version < "4"
pytest==6.2.5 ; python_version >= "3.8" and python_version < "4"
python-baseconv==1.2.2 ; python_version >= "3.8" and python_version < "4"
python-dateutil==2.9.0.post0 ; python_version >= "3.8" and python_version < "4"
python-dotenv==1.0.1 ; python_version >= "3.8" and python_version < "4"
python-statemachine==2.1.2 ; python_version >= "3.8" and python_version < "3.13"
pytz==2024.1 ; python_version >= "3.8" and python_version < "4"
pyunormalize==15.1.0 ; python_version >= "3.8" and python_version < "4"
pywin32==306 ; python_version >= "3.8" and python_version < "4" and platform_system == "Windows"
pyyaml==6.0.1 ; python_version >= "3.8" and python_version < "4"
referencing==0.34.0 ; python_version >= "3.8" and python_version < "4"
regex==2023.12.25 ; python_version >= "3.8" and python_version < "4"
requests==2.31.0 ; python_version >= "3.8" and python_version < "4"
rich==13.7.1 ; python_version >= "3.8" and python_version < "4"
rlp==3.0.0 ; python_version >= "3.8" and python_version < "4"
rpds-py==0.18.0 ; python_version >= "3.8" and python_version < "4"
safe-pysha3==1.0.4 ; python_version >= "3.9" and python_version < "4" and implementation_name == "cpython"
semantic-version==2.10.0 ; python_version >= "3.8" and python_version < "4"
service-identity==24.1.0 ; python_version >= "3.8" and python_version < "4"
setuptools==69.2.0 ; python_version >= "3.8" and python_version < "4"
six==1.16.0 ; python_version >= "3.8" and python_version < "4"
snaptime==0.2.4 ; python_version >= "3.8" and python_version < "4"
sortedcontainers==2.4.0 ; python_version >= "3.8" and python_version < "4"
sqlalchemy==2.0.29 ; python_version >= "3.8" and python_version < "4"
stack-data==0.6.3 ; python_version >= "3.8" and python_version < "4"
tabulate==0.9.0 ; python_version >= "3.8" and python_version < "4"
time-machine==2.14.1 ; python_version >= "3.8" and python_version < "4"
toml==0.10.2 ; python_version >= "3.8" and python_version < "4"
tomli==2.0.1 ; python_full_version <= "3.11.0a6" and python_version >= "3.8"
toolz==0.12.1 ; python_version >= "3.8" and python_version < "4"
tqdm==4.66.2 ; python_version >= "3.8" and python_version < "4"
traitlets==5.14.2 ; python_version >= "3.8" and python_version < "4"
trie==2.2.0 ; python_version >= "3.8" and python_version < "4"
twisted-iocpsupport==1.0.4 ; python_version >= "3.8" and python_version < "4" and platform_system == "Windows"
twisted==24.3.0 ; python_version >= "3.8" and python_version < "4"
txaio==23.1.1 ; python_version >= "3.8" and python_version < "4"
typing-extensions==4.11.0 ; python_version >= "3.8" and python_version < "4"
tzdata==2024.1 ; python_version >= "3.8" and python_version < "4"
tzlocal==5.2 ; python_version >= "3.8" and python_version < "4"
urllib3==2.2.0 ; python_version >= "3.8" and python_version < "4"
varint==1.0.2 ; python_version >= "3.8" and python_version < "4"
virtualenv==20.25.1 ; python_version >= "3.8" and python_version < "4"
watchdog==3.0.0 ; python_version >= "3.8" and python_version < "4"
wcwidth==0.2.13 ; python_version >= "3.8" and python_version < "4"
web3==6.15.1 ; python_version >= "3.8" and python_version < "4"
web3[tester]==6.15.1 ; python_version >= "3.8" and python_version < "4"
websockets==12.0 ; python_version >= "3.8" and python_version < "4"
werkzeug==3.0.2 ; python_version >= "3.8" and python_version < "4"
wrapt==1.16.0 ; python_version >= "3.8" and python_version < "4"
yarl==1.9.4 ; python_version >= "3.8" and python_version < "4"
zipp==3.18.1 ; python_version >= "3.8" and python_version < "4"
zope-interface==6.2 ; python_version >= "3.8" and python_version < "4"
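Each line in the regenerated requirements file above pairs an exact pin with a PEP 508 environment marker after ``;``. A small illustrative parser for such lines (a stdlib sketch for explanation only, not part of the repository):

```python
def split_requirement(line: str):
    """Split 'name==version ; marker' into (name, version, marker-or-None)."""
    spec, _, marker = (part.strip() for part in line.partition(";"))
    name, _, version = spec.partition("==")
    return name.strip(), version.strip(), marker or None

line = 'aiohttp==3.9.4rc0 ; python_version >= "3.8" and python_version < "4"'
name, version, marker = split_requirement(line)
# name == 'aiohttp', version == '3.9.4rc0'
```

Installers evaluate the marker against the running interpreter, which is why platform-specific pins like ``pywin32`` or ``appnope`` can coexist in one lock export.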

@@ -4,6 +4,7 @@ from nucypher_core.ferveo import DkgPublicKey
from nucypher.blockchain.eth import domains
from nucypher.blockchain.eth.agents import CoordinatorAgent
from nucypher.blockchain.eth.domains import EthChain, PolygonChain
from nucypher.blockchain.eth.registry import ContractRegistry
from nucypher.blockchain.eth.signers import InMemorySigner
from nucypher.characters.lawful import Bob, Enrico
@@ -38,7 +39,7 @@ coordinator_agent = CoordinatorAgent(
blockchain_endpoint=polygon_endpoint,
registry=registry,
)
ritual_id = 5 # got this from a side channel
ritual_id = 0 # got this from a side channel
ritual = coordinator_agent.get_ritual(ritual_id)
# known authorized encryptor for ritual 3
@@ -61,28 +62,28 @@ conditions = {
"operands": [
{
"conditionType": ConditionType.RPC.value,
"chain": 1,
"chain": EthChain.MAINNET.id,
"method": "eth_getBalance",
"parameters": ["0x210eeAC07542F815ebB6FD6689637D8cA2689392", "latest"],
"returnValueTest": {"comparator": "==", "value": 0},
},
{
"conditionType": ConditionType.RPC.value,
"chain": 137,
"chain": PolygonChain.MAINNET.id,
"method": "eth_getBalance",
"parameters": ["0x210eeAC07542F815ebB6FD6689637D8cA2689392", "latest"],
"returnValueTest": {"comparator": "==", "value": 0},
},
{
"conditionType": ConditionType.RPC.value,
"chain": 11155111,
"chain": EthChain.SEPOLIA.id,
"method": "eth_getBalance",
"parameters": ["0x210eeAC07542F815ebB6FD6689637D8cA2689392", "latest"],
"returnValueTest": {"comparator": ">", "value": 1},
},
{
"conditionType": ConditionType.RPC.value,
"chain": 80001,
"chain": PolygonChain.AMOY.id,
"method": "eth_getBalance",
"parameters": ["0x210eeAC07542F815ebB6FD6689637D8cA2689392", "latest"],
"returnValueTest": {"comparator": "==", "value": 0},

@@ -4,6 +4,7 @@ from nucypher_core.ferveo import DkgPublicKey
from nucypher.blockchain.eth import domains
from nucypher.blockchain.eth.agents import CoordinatorAgent
from nucypher.blockchain.eth.domains import PolygonChain
from nucypher.blockchain.eth.registry import ContractRegistry
from nucypher.blockchain.eth.signers import InMemorySigner
from nucypher.characters.lawful import Bob, Enrico
@@ -39,7 +40,7 @@ coordinator_agent = CoordinatorAgent(
blockchain_endpoint=polygon_endpoint,
registry=registry,
)
ritual_id = 5 # got this from a side channel
ritual_id = 0 # got this from a side channel
ritual = coordinator_agent.get_ritual(ritual_id)
# known authorized encryptor for ritual 3
@@ -58,7 +59,7 @@ eth_balance_condition = {
"version": ConditionLingo.VERSION,
"condition": {
"conditionType": ConditionType.RPC.value,
"chain": 80001,
"chain": PolygonChain.AMOY.id,
"method": "eth_getBalance",
"parameters": ["0x210eeAC07542F815ebB6FD6689637D8cA2689392", "latest"],
"returnValueTest": {"comparator": "==", "value": 0},

@@ -0,0 +1 @@
Automated transaction speedups and retries
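The newsfragment above refers to ATxM's automated speedups. The core idea — geometrically bumping the fee on each retry so a stuck transaction can be replaced — can be sketched as follows (the 12.5% bump factor is a common client minimum for replacement transactions, assumed here rather than taken from ATxM's code):

```python
def bumped_fee(base_fee_wei: int, attempt: int, factor: float = 1.125) -> int:
    """Fee offered on the Nth retry: base fee raised by `factor` per attempt."""
    return int(base_fee_wei * factor ** attempt)

# Each retry offers a strictly higher fee than the last.
fees = [bumped_fee(100_000_000, n) for n in range(3)]
# fees == [100000000, 112500000, 126562500]
```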

@@ -0,0 +1 @@
Migrate configuration files when the path is explicitly specified as a CLI parameter.

@@ -0,0 +1 @@
Allow staking providers to be filtered/sampled based on an expectation for staking duration.

@@ -0,0 +1 @@
Migrate ``Lynx`` and ``Tapir`` testnets from Polygon Mumbai to Polygon Amoy.

@@ -0,0 +1 @@
Improve DKG transaction fault tolerance by incorporating async transactions and callbacks.

@@ -0,0 +1 @@
Add ``--metrics-listen-address`` CLI option for the Prometheus metrics endpoint.

@@ -0,0 +1 @@
Nodes will better handle the effects of compatible contract changes such as additions to structs and overloaded functions.

@@ -0,0 +1,2 @@
Console logging was not using the correct log format, log levels were not adhered to when limiting log messages, and global log observers were not correctly managed.
Improved the StdoutEmitter to better utilize the logger so that information is not lost.

@@ -0,0 +1 @@
Add ``--json-logs/--no-json-logs`` CLI option to enable/disable logging to a JSON log file; JSON logging is now disabled by default.

@@ -0,0 +1 @@
Fix inconsistent API for ``sign_transaction`` return values; all now return bytes.
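A sketch of the kind of normalization that resolves such an API inconsistency (a hypothetical helper for illustration; the actual signer code differs):

```python
def normalize_signed_tx(result) -> bytes:
    """Coerce a signer's return value (bytes-like or hex string) to bytes."""
    if isinstance(result, (bytes, bytearray)):
        return bytes(result)
    if isinstance(result, str):
        hex_str = result[2:] if result.startswith("0x") else result
        return bytes.fromhex(hex_str)
    raise TypeError(f"Unexpected sign_transaction result: {type(result)!r}")

normalize_signed_tx("0xdeadbeef")       # b'\xde\xad\xbe\xef'
normalize_signed_tx(b"\xde\xad\xbe\xef")  # b'\xde\xad\xbe\xef'
```

Normalizing at the boundary means downstream code can rely on a single return type regardless of which signer backend produced the signature.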

@@ -0,0 +1 @@
Ensure that log observers filter log messages by log level.
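nucypher uses Twisted's logging, but the observer-level filtering described here is analogous to a handler-level threshold in the standard library (a stdlib analogy, not nucypher code):

```python
import io
import logging

stream = io.StringIO()
observer = logging.StreamHandler(stream)
observer.setLevel(logging.WARNING)  # the observer's own threshold

logger = logging.getLogger("demo")
logger.setLevel(logging.DEBUG)      # the logger emits everything...
logger.addHandler(observer)

logger.debug("filtered out")        # below the observer's level; dropped
logger.warning("recorded")          # passes the observer's filter
# stream now contains only "recorded"
```

The bug class being fixed is exactly a missing per-observer threshold: every message the logger emitted reached every observer regardless of level.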

@@ -0,0 +1 @@
Prevent ritual tx starvation when broadcast failures occur by proactively removing the problematic tx from the ATxM queue and resubmitting a fresh tx.

@@ -0,0 +1 @@
RPC connectivity is now agnostic to the EVM client fingerprint and implementation technology.

@@ -0,0 +1 @@
Wallet management and transaction signing via blockchain clients is deprecated for production use.

@@ -16,7 +16,7 @@ __url__ = "https://github.com/nucypher/nucypher"
__summary__ = "A threshold access control application to empower privacy in decentralized systems."
__version__ = "7.2.0"
__version__ = "7.3.0"
__author__ = "NuCypher"

@@ -1,13 +1,15 @@
import json
import random
import time
import traceback
from collections import defaultdict
from decimal import Decimal
from typing import DefaultDict, Dict, List, Optional, Set, Tuple, Union
from typing import DefaultDict, Dict, List, Optional, Set, Union
import maya
from atxm.exceptions import InsufficientFunds
from atxm.tx import AsyncTx, FaultedTx, FinalizedTx, FutureTx, PendingTx
from eth_typing import ChecksumAddress
from hexbytes import HexBytes
from nucypher_core import (
EncryptedThresholdDecryptionRequest,
EncryptedThresholdDecryptionResponse,
@@ -26,7 +28,6 @@ from nucypher_core.ferveo import (
Validator,
)
from web3 import HTTPProvider, Web3
from web3.exceptions import TransactionNotFound
from web3.types import TxReceipt
from nucypher.acumen.nicknames import Nickname
@@ -40,7 +41,10 @@ from nucypher.blockchain.eth.clients import PUBLIC_CHAINS
from nucypher.blockchain.eth.constants import NULL_ADDRESS
from nucypher.blockchain.eth.decorators import validate_checksum_address
from nucypher.blockchain.eth.domains import TACoDomain
from nucypher.blockchain.eth.interfaces import BlockchainInterfaceFactory
from nucypher.blockchain.eth.interfaces import (
BlockchainInterface,
BlockchainInterfaceFactory,
)
from nucypher.blockchain.eth.models import PHASE1, PHASE2, Coordinator
from nucypher.blockchain.eth.registry import ContractRegistry
from nucypher.blockchain.eth.signers import Signer
@ -57,6 +61,7 @@ from nucypher.datastore.dkg import DKGStorage
from nucypher.policy.conditions.evm import _CONDITION_CHAINS
from nucypher.policy.conditions.utils import evaluate_condition_lingo
from nucypher.policy.payment import ContractPayment
from nucypher.types import PhaseId
from nucypher.utilities.emitters import StdoutEmitter
from nucypher.utilities.logging import Logger
@ -78,15 +83,17 @@ class BaseActor:
checksum_address: Optional[ChecksumAddress] = None,
):
if not (bool(checksum_address) ^ bool(transacting_power)):
error = f'Pass transacting power or checksum address, got {checksum_address} and {transacting_power}.'
error = f"Pass transacting power or checksum address, got {checksum_address} and {transacting_power}."
raise ValueError(error)
try:
parent_address = self.checksum_address
if checksum_address is not None:
if parent_address != checksum_address:
raise ValueError(f"Can't have two different ethereum addresses. "
f"Got {parent_address} and {checksum_address}.")
raise ValueError(
f"Can't have two different ethereum addresses. "
f"Got {parent_address} and {checksum_address}."
)
except AttributeError:
if transacting_power:
self.checksum_address = transacting_power.account
@ -114,9 +121,11 @@ class BaseActor:
@property
def eth_balance(self) -> Decimal:
"""Return this actor's current ETH balance"""
blockchain = BlockchainInterfaceFactory.get_interface() # TODO: EthAgent? #1509
blockchain = (
BlockchainInterfaceFactory.get_interface()
) # TODO: EthAgent? #1509
balance = blockchain.client.get_balance(self.wallet_address)
return Web3.from_wei(balance, 'ether')
return Web3.from_wei(balance, "ether")
@property
def wallet_address(self):
@ -223,8 +232,7 @@ class Operator(BaseActor):
self.publish_finalization = (
publish_finalization # publish the DKG final key if True
)
# TODO: #3052 stores locally generated public DKG artifacts
self.dkg_storage = DKGStorage()
self.ritual_power = crypto_power.power_ups(
RitualisticPower
) # ferveo material contained within
@ -236,6 +244,8 @@ class Operator(BaseActor):
condition_blockchain_endpoints
)
self.dkg_storage = DKGStorage()
def set_provider_public_key(self) -> Union[TxReceipt, None]:
# TODO: Here we're assuming there is one global key per node. See nucypher/#3167
node_global_ferveo_key_set = self.coordinator_agent.is_provider_public_key_set(
@ -319,7 +329,7 @@ class Operator(BaseActor):
# Local
external_validator = Validator(
address=self.checksum_address,
public_key=self.ritual_power.public_key()
public_key=self.ritual_power.public_key(),
)
else:
# Remote
@ -341,78 +351,136 @@ class Operator(BaseActor):
return result
def publish_transcript(self, ritual_id: int, transcript: Transcript) -> HexBytes:
tx_hash = self.coordinator_agent.post_transcript(
def _setup_async_hooks(
self, phase_id: PhaseId, *args
) -> BlockchainInterface.AsyncTxHooks:
tx_type = "POST_TRANSCRIPT" if phase_id.phase == PHASE1 else "POST_AGGREGATE"
def resubmit_tx():
if phase_id.phase == PHASE1:
# check status of ritual before resubmitting; prevent infinite loops
if not self._is_phase_1_action_required(ritual_id=phase_id.ritual_id):
self.log.info(
f"No need to resubmit tx: additional action not required for ritual# {phase_id.ritual_id} (status={self.coordinator_agent.get_ritual_status(phase_id.ritual_id)})"
)
return
async_tx = self.publish_transcript(*args)
else:
# check status of ritual before resubmitting; prevent infinite loops
if not self._is_phase_2_action_required(ritual_id=phase_id.ritual_id):
self.log.info(
f"No need to resubmit tx: additional action not required for ritual# {phase_id.ritual_id} (status={self.coordinator_agent.get_ritual_status(phase_id.ritual_id)})"
)
return
async_tx = self.publish_aggregated_transcript(*args)
self.log.info(
f"{self.transacting_power.account[:8]} resubmitted a new async tx {async_tx.id} "
f"for DKG ritual #{phase_id.ritual_id}"
)
def on_broadcast_failure(tx: FutureTx, e: Exception):
# although error, tx was not removed from atxm
self.log.warn(
f"{tx_type} async tx {tx.id} for DKG ritual# {phase_id.ritual_id} "
f"failed to broadcast: {e}; removing it and resubmitting a fresh tx"
)
# either multiple retries already completed for a recoverable error,
# or the error is simply non-recoverable - remove and resubmit
# (analogous to what a node restart accomplished previously)
self.coordinator_agent.blockchain.tx_machine.remove_queued_transaction(tx)
# submit a new one
resubmit_tx()
def on_fault(tx: FaultedTx):
# fault means that tx was removed from atxm
error = f"({tx.error})" if tx.error else ""
self.log.warn(
f"{tx_type} async tx {tx.id} for DKG ritual# {phase_id.ritual_id} "
f"failed with fault {tx.fault.name}{error}; resubmitting a new one"
)
# submit a new one.
resubmit_tx()
def on_finalized(tx: FinalizedTx):
# finalized means that tx was removed from atxm
if not tx.successful:
self.log.warn(
f"{tx_type} async tx {tx.id} for DKG ritual# {phase_id.ritual_id} "
f"was reverted; resubmitting a new one"
)
# submit a new one.
resubmit_tx()
else:
# success and blockchain updated - no need to store tx anymore
self.dkg_storage.clear_ritual_phase_async_tx(
phase_id=phase_id, async_tx=tx
)
def on_insufficient_funds(tx: Union[FutureTx, PendingTx], e: InsufficientFunds):
# although error, tx was not removed from atxm
self.log.error(
f"{tx_type} async tx {tx.id} for DKG ritual# {phase_id.ritual_id} "
f"cannot be executed because {self.transacting_power.account[:8]} "
f"has insufficient funds {e}"
)
async_tx_hooks = BlockchainInterface.AsyncTxHooks(
on_broadcast_failure=on_broadcast_failure,
on_fault=on_fault,
on_finalized=on_finalized,
on_insufficient_funds=on_insufficient_funds,
)
return async_tx_hooks
def publish_transcript(self, ritual_id: int, transcript: Transcript) -> AsyncTx:
identifier = PhaseId(ritual_id, PHASE1)
async_tx_hooks = self._setup_async_hooks(identifier, ritual_id, transcript)
async_tx = self.coordinator_agent.post_transcript(
ritual_id=ritual_id,
transcript=transcript,
transacting_power=self.transacting_power,
async_tx_hooks=async_tx_hooks,
)
return tx_hash
self.dkg_storage.store_ritual_phase_async_tx(
phase_id=identifier, async_tx=async_tx
)
return async_tx
def publish_aggregated_transcript(
self,
ritual_id: int,
aggregated_transcript: AggregatedTranscript,
public_key: DkgPublicKey,
) -> HexBytes:
) -> AsyncTx:
"""Publish an aggregated transcript to publicly available storage."""
# look up the node index for this node on the blockchain
participant_public_key = self.threshold_request_power.get_pubkey_from_ritual_id(
ritual_id
)
tx_hash = self.coordinator_agent.post_aggregation(
identifier = PhaseId(ritual_id=ritual_id, phase=PHASE2)
async_tx_hooks = self._setup_async_hooks(
identifier, ritual_id, aggregated_transcript, public_key
)
async_tx = self.coordinator_agent.post_aggregation(
ritual_id=ritual_id,
aggregated_transcript=aggregated_transcript,
public_key=public_key,
participant_public_key=participant_public_key,
transacting_power=self.transacting_power,
async_tx_hooks=async_tx_hooks,
)
return tx_hash
def get_phase_receipt(
self, ritual_id: int, phase: int
) -> Tuple[Optional[HexBytes], Optional[TxReceipt]]:
if phase == 1:
txhash = self.dkg_storage.get_transcript_txhash(ritual_id=ritual_id)
elif phase == 2:
txhash = self.dkg_storage.get_aggregation_txhash(ritual_id=ritual_id)
else:
raise ValueError(f"Invalid phase: '{phase}'.")
if not txhash:
return None, None
try:
blockchain = self.coordinator_agent.blockchain.client
receipt = blockchain.get_transaction_receipt(txhash)
except TransactionNotFound:
return txhash, None
# at least for now (pre dkg tracker) - clear since receipt obtained
if phase == 1:
self.dkg_storage.clear_transcript_txhash(ritual_id, txhash)
else:
self.dkg_storage.clear_aggregated_txhash(ritual_id, txhash)
status = receipt.get("status")
if status == 1:
return txhash, receipt
else:
return None, None
def _phase_has_pending_tx(self, ritual_id: int, phase: int) -> bool:
tx_hash, _ = self.get_phase_receipt(ritual_id=ritual_id, phase=phase)
if not tx_hash:
return False
self.log.info(
f"Node {self.transacting_power.account} has pending tx {bytes(tx_hash).hex()} "
f"for ritual #{ritual_id}, phase #{phase}; skipping execution"
self.dkg_storage.store_ritual_phase_async_tx(
phase_id=identifier, async_tx=async_tx
)
return True
return async_tx
def _is_phase_1_action_required(self, ritual_id: int) -> bool:
"""Check whether node needs to perform a DKG round 1 action."""
# handle pending transactions
if self._phase_has_pending_tx(ritual_id=ritual_id, phase=PHASE1):
return False
# check ritual status from the blockchain
status = self.coordinator_agent.get_ritual_status(ritual_id=ritual_id)
@ -446,21 +514,58 @@ class Operator(BaseActor):
authority: ChecksumAddress,
participants: List[ChecksumAddress],
timestamp: int,
) -> Optional[HexBytes]:
"""Perform round 1 of the DKG protocol for a given ritual ID on this node."""
) -> Optional[AsyncTx]:
"""
Perform phase 1 of the DKG protocol for a given ritual ID on this node.
This method is idempotent and will not submit a transcript if one has
already been submitted. It is dispatched by the EventActuator when it
receives a StartRitual event from the blockchain. Since the EventActuator
scans overlapping blocks, it is possible that this method will be called
multiple times for the same ritual. This method will check the state of
the ritual and participant on the blockchain before submitting a transcript.
If there is a tracked AsyncTx for the given ritual and round
combination, this method will return the tracked transaction. If there is
no tracked transaction, this method will submit a transcript and return the
resulting FutureTx.
Returning None indicates that no action was required or taken.
Errors raised by this method are not explicitly caught and are expected
to be handled by the EventActuator.
"""
if self.checksum_address not in participants:
# should never get here
self.log.error(
f"Not part of ritual {ritual_id}; no need to submit transcripts"
)
raise RuntimeError(
f"Not participating in ritual {ritual_id}; should not have been notified"
message = (
f"{self.checksum_address}|{self.wallet_address} "
f"is not a member of ritual {ritual_id}"
)
stack_trace = "".join(traceback.format_stack())
self.log.critical(f"{message}\n{stack_trace}")
return
# check phase 1 contract state
if not self._is_phase_1_action_required(ritual_id=ritual_id):
self.log.debug(
f"No action required for phase 1 of DKG ritual #{ritual_id}."
)
return
# check if there is already pending tx for this ritual + round combination
async_tx = self.dkg_storage.get_ritual_phase_async_tx(
phase_id=PhaseId(ritual_id, PHASE1)
)
if async_tx:
self.log.info(
f"Active ritual in progress: {self.transacting_power.account} has submitted tx "
f"for ritual #{ritual_id}, phase #{PHASE1} (final: {async_tx.final})"
)
return async_tx
#
# Perform phase 1 of the DKG protocol
#
ritual = self.coordinator_agent.get_ritual(
ritual_id=ritual_id,
transcripts=False,
@ -490,9 +595,8 @@ class Operator(BaseActor):
raise e
# publish the transcript and store the receipt
tx_hash = self.publish_transcript(ritual_id=ritual.id, transcript=transcript)
self.dkg_storage.store_transcript_txhash(ritual_id=ritual.id, txhash=tx_hash)
self.dkg_storage.store_validators(ritual_id=ritual.id, validators=validators)
async_tx = self.publish_transcript(ritual_id=ritual.id, transcript=transcript)
# logging
arrival = ritual.total_transcripts + 1
@ -500,13 +604,10 @@ class Operator(BaseActor):
f"{self.transacting_power.account[:8]} submitted a transcript for "
f"DKG ritual #{ritual.id} ({arrival}/{ritual.dkg_size}) with authority {authority}."
)
return tx_hash
return async_tx
def _is_phase_2_action_required(self, ritual_id: int) -> bool:
"""Check whether node needs to perform a DKG round 2 action."""
# check if there is a pending tx for this ritual + round combination
if self._phase_has_pending_tx(ritual_id=ritual_id, phase=PHASE2):
return False
# check ritual status from the blockchain
status = self.coordinator_agent.get_ritual_status(ritual_id=ritual_id)
@ -535,12 +636,23 @@ class Operator(BaseActor):
return True
def perform_round_2(self, ritual_id: int, timestamp: int) -> Optional[HexBytes]:
def perform_round_2(self, ritual_id: int, timestamp: int) -> Optional[AsyncTx]:
"""Perform round 2 of the DKG protocol for the given ritual ID on this node."""
# check phase 2 state
if not self._is_phase_2_action_required(ritual_id=ritual_id):
return
# check if there is a pending tx for this ritual + round combination
async_tx = self.dkg_storage.get_ritual_phase_async_tx(
phase_id=PhaseId(ritual_id, PHASE2)
)
if async_tx:
self.log.info(
f"Active ritual in progress: {self.transacting_power.account} has submitted tx"
f"for ritual #{ritual_id}, phase #{PHASE2} (final: {async_tx.final})."
)
return async_tx
ritual = self.coordinator_agent.get_ritual(
ritual_id=ritual_id,
transcripts=True,
@ -556,14 +668,15 @@ class Operator(BaseActor):
transcripts = (Transcript.from_bytes(bytes(t)) for t in ritual.transcripts)
messages = list(zip(validators, transcripts))
try:
aggregated_transcript, dkg_public_key = (
self.ritual_power.aggregate_transcripts(
threshold=ritual.threshold,
shares=ritual.shares,
checksum_address=self.checksum_address,
ritual_id=ritual.id,
transcripts=messages,
)
(
aggregated_transcript,
dkg_public_key,
) = self.ritual_power.aggregate_transcripts(
threshold=ritual.threshold,
shares=ritual.shares,
checksum_address=self.checksum_address,
ritual_id=ritual.id,
transcripts=messages,
)
except Exception as e:
self.log.debug(
@ -573,15 +686,12 @@ class Operator(BaseActor):
# publish the transcript with network-wide jitter to avoid tx congestion
time.sleep(random.randint(0, self.AGGREGATION_SUBMISSION_MAX_DELAY))
tx_hash = self.publish_aggregated_transcript(
async_tx = self.publish_aggregated_transcript(
ritual_id=ritual.id,
aggregated_transcript=aggregated_transcript,
public_key=dkg_public_key,
)
# store the txhash
self.dkg_storage.store_aggregation_txhash(ritual_id=ritual.id, txhash=tx_hash)
# logging
total = ritual.total_aggregations + 1
self.log.debug(
@ -590,7 +700,8 @@ class Operator(BaseActor):
)
if total >= ritual.dkg_size:
self.log.debug(f"DKG ritual #{ritual.id} should now be finalized")
return tx_hash
return async_tx
def derive_decryption_share(
self,
@ -600,13 +711,10 @@ class Operator(BaseActor):
variant: FerveoVariant,
) -> Union[DecryptionShareSimple, DecryptionSharePrecomputed]:
ritual = self._resolve_ritual(ritual_id)
validators = self._resolve_validators(ritual)
aggregated_transcript = AggregatedTranscript.from_bytes(
bytes(ritual.aggregated_transcript)
)
decryption_share = self.ritual_power.derive_decryption_share(
nodes=validators,
threshold=ritual.threshold,
@ -616,7 +724,7 @@ class Operator(BaseActor):
aggregated_transcript=aggregated_transcript,
ciphertext_header=ciphertext_header,
aad=aad,
variant=variant
variant=variant,
)
return decryption_share
@ -861,13 +969,6 @@ class Operator(BaseActor):
color="green",
)
def get_work_is_needed_check(self):
def func(self):
# we have not confirmed yet
return not self.is_confirmed
return func
class PolicyAuthor(NucypherTokenActor):
"""Alice base class for blockchain operations, mocking up new policies!"""

View File

@ -13,10 +13,10 @@ from typing import (
Optional,
Tuple,
Type,
Union,
cast,
)
from atxm.tx import AsyncTx
from constant_sorrow.constants import (
# type: ignore
CONTRACT_CALL,
@ -24,7 +24,6 @@ from constant_sorrow.constants import (
)
from eth_typing.evm import ChecksumAddress
from eth_utils import to_checksum_address, to_int
from hexbytes import HexBytes
from nucypher_core import SessionStaticKey
from nucypher_core.ferveo import (
AggregatedTranscript,
@ -45,8 +44,11 @@ from nucypher.blockchain.eth.constants import (
TACO_CHILD_APPLICATION_CONTRACT_NAME,
)
from nucypher.blockchain.eth.decorators import contract_api
from nucypher.blockchain.eth.interfaces import BlockchainInterfaceFactory
from nucypher.blockchain.eth.models import Coordinator, Ferveo
from nucypher.blockchain.eth.interfaces import (
BlockchainInterface,
BlockchainInterfaceFactory,
)
from nucypher.blockchain.eth.models import PHASE1, PHASE2, Coordinator, Ferveo
from nucypher.blockchain.eth.registry import (
ContractRegistry,
)
@ -68,7 +70,7 @@ class EthereumContractAgent:
# TODO - #842: Gas Management
DEFAULT_TRANSACTION_GAS_LIMITS: Dict[str, Optional[Wei]]
DEFAULT_TRANSACTION_GAS_LIMITS = {'default': None}
DEFAULT_TRANSACTION_GAS_LIMITS = {"default": None}
class ContractNotDeployed(Exception):
"""Raised when attempting to access a contract that is not deployed on the current network."""
@ -86,7 +88,6 @@ class EthereumContractAgent:
contract: Optional[Contract] = None,
transaction_gas: Optional[Wei] = None,
):
self.log = Logger(self.__class__.__name__)
self.registry = registry
@ -103,7 +104,9 @@ class EthereumContractAgent:
self.__contract = contract
self.events = events.ContractEvents(contract)
if not transaction_gas:
transaction_gas = EthereumContractAgent.DEFAULT_TRANSACTION_GAS_LIMITS['default']
transaction_gas = EthereumContractAgent.DEFAULT_TRANSACTION_GAS_LIMITS[
"default"
]
self.transaction_gas = transaction_gas
self.log.info(
@ -133,7 +136,6 @@ class EthereumContractAgent:
class NucypherTokenAgent(EthereumContractAgent):
contract_name: str = NUCYPHER_TOKEN_CONTRACT_NAME
@contract_api(CONTRACT_CALL)
@ -142,57 +144,6 @@ class NucypherTokenAgent(EthereumContractAgent):
balance: int = self.contract.functions.balanceOf(address).call()
return types.NuNits(balance)
@contract_api(CONTRACT_CALL)
def get_allowance(
self, owner: ChecksumAddress, spender: ChecksumAddress
) -> types.NuNits:
"""Check the amount of tokens that an owner allowed to a spender"""
allowance: int = self.contract.functions.allowance(owner, spender).call()
return types.NuNits(allowance)
@contract_api(TRANSACTION)
def increase_allowance(
self,
transacting_power: TransactingPower,
spender_address: ChecksumAddress,
increase: types.NuNits,
) -> TxReceipt:
"""Increase the allowance of a spender address funded by a sender address"""
contract_function: ContractFunction = self.contract.functions.increaseAllowance(spender_address, increase)
receipt: TxReceipt = self.blockchain.send_transaction(contract_function=contract_function,
transacting_power=transacting_power)
return receipt
@contract_api(TRANSACTION)
def decrease_allowance(
self,
transacting_power: TransactingPower,
spender_address: ChecksumAddress,
decrease: types.NuNits,
) -> TxReceipt:
"""Decrease the allowance of a spender address funded by a sender address"""
contract_function: ContractFunction = self.contract.functions.decreaseAllowance(spender_address, decrease)
receipt: TxReceipt = self.blockchain.send_transaction(contract_function=contract_function,
transacting_power=transacting_power)
return receipt
@contract_api(TRANSACTION)
def approve_transfer(
self,
amount: types.NuNits,
spender_address: ChecksumAddress,
transacting_power: TransactingPower,
) -> TxReceipt:
"""Approve the spender address to transfer an amount of tokens on behalf of the sender address"""
self._validate_zero_allowance(amount, spender_address, transacting_power)
payload: TxParams = {'gas': Wei(500_000)} # TODO #842: gas needed for use with geth! <<<< Is this still open?
contract_function: ContractFunction = self.contract.functions.approve(spender_address, amount)
receipt: TxReceipt = self.blockchain.send_transaction(contract_function=contract_function,
payload=payload,
transacting_power=transacting_power)
return receipt
@contract_api(TRANSACTION)
def transfer(
self,
@ -201,41 +152,16 @@ class NucypherTokenAgent(EthereumContractAgent):
transacting_power: TransactingPower,
) -> TxReceipt:
"""Transfer an amount of tokens from the sender address to the target address."""
contract_function: ContractFunction = self.contract.functions.transfer(target_address, amount)
receipt: TxReceipt = self.blockchain.send_transaction(contract_function=contract_function,
transacting_power=transacting_power)
contract_function: ContractFunction = self.contract.functions.transfer(
target_address, amount
)
receipt: TxReceipt = self.blockchain.send_transaction(
contract_function=contract_function, transacting_power=transacting_power
)
return receipt
@contract_api(TRANSACTION)
def approve_and_call(
self,
amount: types.NuNits,
target_address: ChecksumAddress,
transacting_power: TransactingPower,
call_data: bytes = b"",
gas_limit: Optional[Wei] = None,
) -> TxReceipt:
self._validate_zero_allowance(amount, target_address, transacting_power)
payload = None
if gas_limit: # TODO: Gas management - #842
payload = {'gas': gas_limit}
approve_and_call: ContractFunction = self.contract.functions.approveAndCall(target_address, amount, call_data)
approve_and_call_receipt: TxReceipt = self.blockchain.send_transaction(contract_function=approve_and_call,
transacting_power=transacting_power,
payload=payload)
return approve_and_call_receipt
def _validate_zero_allowance(self, amount, target_address, transacting_power):
if amount == 0:
return
current_allowance = self.get_allowance(owner=transacting_power.account, spender=target_address)
if current_allowance != 0:
raise self.RequirementError(f"Token allowance for spender {target_address} must be 0")
class SubscriptionManagerAgent(EthereumContractAgent):
contract_name: str = SUBSCRIPTION_MANAGER_CONTRACT_NAME
class PolicyInfo(NamedTuple):
@ -267,7 +193,7 @@ class SubscriptionManagerAgent(EthereumContractAgent):
end_timestamp=record[2],
size=record[3],
# If the policyOwner addr is null, we return the sponsor addr instead of the owner.
owner=record[0] if record[4] == NULL_ADDRESS else record[4]
owner=record[0] if record[4] == NULL_ADDRESS else record[4],
)
return policy_info
@ -276,27 +202,25 @@ class SubscriptionManagerAgent(EthereumContractAgent):
#
@contract_api(TRANSACTION)
def create_policy(self,
policy_id: bytes,
transacting_power: TransactingPower,
size: int,
start_timestamp: Timestamp,
end_timestamp: Timestamp,
value: Wei,
owner_address: Optional[ChecksumAddress] = None) -> TxReceipt:
def create_policy(
self,
policy_id: bytes,
transacting_power: TransactingPower,
size: int,
start_timestamp: Timestamp,
end_timestamp: Timestamp,
value: Wei,
owner_address: Optional[ChecksumAddress] = None,
) -> TxReceipt:
owner_address = owner_address or transacting_power.account
payload: TxParams = {'value': value}
payload: TxParams = {"value": value}
contract_function: ContractFunction = self.contract.functions.createPolicy(
policy_id,
owner_address,
size,
start_timestamp,
end_timestamp
policy_id, owner_address, size, start_timestamp, end_timestamp
)
receipt = self.blockchain.send_transaction(
contract_function=contract_function,
payload=payload,
transacting_power=transacting_power
transacting_power=transacting_power,
)
return receipt
@ -316,7 +240,7 @@ class StakerSamplingApplicationAgent(EthereumContractAgent):
@abstractmethod
def _get_active_staking_providers_raw(
self, start_index: int, max_results: int
self, start_index: int, max_results: int, duration: int
) -> Tuple[int, List[bytes]]:
raise NotImplementedError
@ -325,21 +249,21 @@ class StakerSamplingApplicationAgent(EthereumContractAgent):
raise NotImplementedError
def get_all_active_staking_providers(
self, pagination_size: Optional[int] = None
self, pagination_size: Optional[int] = None, duration: int = 0
) -> Tuple[types.TuNits, Dict[ChecksumAddress, types.TuNits]]:
n_tokens, staking_providers = self._get_active_stakers(
pagination_size=pagination_size
pagination_size=pagination_size, duration=duration
)
return n_tokens, staking_providers
@contract_api(CONTRACT_CALL)
def get_active_staking_providers(
self, start_index: int, max_results: int
self, start_index: int, max_results: int, duration: int = 0
) -> Tuple[types.TuNits, Dict[ChecksumAddress, types.TuNits]]:
(
total_authorized_tokens,
staking_providers_info,
) = self._get_active_staking_providers_raw(start_index, max_results)
) = self._get_active_staking_providers_raw(start_index, max_results, duration)
staking_providers = self._process_active_staker_info(staking_providers_info)
return types.TuNits(total_authorized_tokens), staking_providers
@ -348,10 +272,11 @@ class StakerSamplingApplicationAgent(EthereumContractAgent):
self,
without: Iterable[ChecksumAddress] = None,
pagination_size: Optional[int] = None,
duration: int = 0,
) -> "StakingProvidersReservoir":
# pagination_size = pagination_size or self.get_staking_providers_population()
n_tokens, stake_provider_map = self.get_all_active_staking_providers(
pagination_size=pagination_size
pagination_size=pagination_size, duration=duration
)
if n_tokens == 0:
@ -385,7 +310,9 @@ class StakerSamplingApplicationAgent(EthereumContractAgent):
return staking_providers
def _get_active_stakers(self, pagination_size: Optional[int] = None):
def _get_active_stakers(
self, pagination_size: Optional[int] = None, duration: int = 0
):
if pagination_size is None:
pagination_size = (
self.DEFAULT_PROVIDERS_PAGINATION_SIZE_LIGHT_NODE
@ -408,7 +335,9 @@ class StakerSamplingApplicationAgent(EthereumContractAgent):
(
batch_authorized_tokens,
batch_staking_providers,
) = self.get_active_staking_providers(start_index, pagination_size)
) = self.get_active_staking_providers(
start_index, pagination_size, duration
)
except Exception as e:
if "timeout" not in str(e):
# exception unrelated to pagination size and timeout
@ -447,6 +376,8 @@ class TACoChildApplicationAgent(StakerSamplingApplicationAgent):
authorized: int
operator_confirmed: bool
index: int
deauthorizing: int
end_deauthorization: int
@contract_api(CONTRACT_CALL)
def get_min_authorization(self) -> int:
@ -472,7 +403,9 @@ class TACoChildApplicationAgent(StakerSamplingApplicationAgent):
self, staking_provider: ChecksumAddress
) -> StakingProviderInfo:
result = self.contract.functions.stakingProviderInfo(staking_provider).call()
return TACoChildApplicationAgent.StakingProviderInfo(*result)
return TACoChildApplicationAgent.StakingProviderInfo(
*result[0 : len(TACoChildApplicationAgent.StakingProviderInfo._fields)]
)
@contract_api(CONTRACT_CALL)
def is_operator_confirmed(self, operator_address: ChecksumAddress) -> bool:
@ -500,13 +433,16 @@ class TACoChildApplicationAgent(StakerSamplingApplicationAgent):
@contract_api(CONTRACT_CALL)
def _get_active_staking_providers_raw(
self, start_index: int, max_results: int
self, start_index: int, max_results: int, duration: int
) -> Tuple[int, List[bytes]]:
active_staking_providers_info = (
self.contract.functions.getActiveStakingProviders(
start_index, max_results
).call()
get_active_providers_overloaded_function = (
self.contract.get_function_by_signature(
"getActiveStakingProviders(uint256,uint256,uint32)"
)
)
active_staking_providers_info = get_active_providers_overloaded_function(
start_index, max_results, duration
).call()
return active_staking_providers_info
@ -518,11 +454,6 @@ class TACoApplicationAgent(StakerSamplingApplicationAgent):
operator_confirmed: bool
operator_start_timestamp: int
class OperatorInfo(NamedTuple):
address: ChecksumAddress
confirmed: bool
start_timestamp: Timestamp
@contract_api(CONTRACT_CALL)
def get_min_authorization(self) -> int:
result = self.contract.functions.minimumAuthorization().call()
@ -566,8 +497,12 @@ class TACoApplicationAgent(StakerSamplingApplicationAgent):
self, staking_provider: ChecksumAddress
) -> StakingProviderInfo:
# remove reserved fields
info: list = self.contract.functions.stakingProviderInfo(staking_provider).call()
return TACoApplicationAgent.StakingProviderInfo(*info[0:3])
info: list = self.contract.functions.stakingProviderInfo(
staking_provider
).call()
return TACoApplicationAgent.StakingProviderInfo(
*info[0 : len(TACoApplicationAgent.StakingProviderInfo._fields)]
)
@contract_api(CONTRACT_CALL)
def get_authorized_stake(self, staking_provider: ChecksumAddress) -> int:
@ -596,11 +531,11 @@ class TACoApplicationAgent(StakerSamplingApplicationAgent):
@contract_api(CONTRACT_CALL)
def _get_active_staking_providers_raw(
self, start_index: int, max_results: int
self, start_index: int, max_results: int, duration: int
) -> Tuple[int, List[bytes]]:
active_staking_providers_info = (
self.contract.functions.getActiveStakingProviders(
start_index, max_results
start_index, max_results, duration
).call()
)
return active_staking_providers_info
@ -610,11 +545,19 @@ class TACoApplicationAgent(StakerSamplingApplicationAgent):
#
@contract_api(TRANSACTION)
def bond_operator(self, staking_provider: ChecksumAddress, operator: ChecksumAddress, transacting_power: TransactingPower) -> TxReceipt:
def bond_operator(
self,
staking_provider: ChecksumAddress,
operator: ChecksumAddress,
transacting_power: TransactingPower,
) -> TxReceipt:
"""For use by threshold operator accounts only."""
contract_function: ContractFunction = self.contract.functions.bondOperator(staking_provider, operator)
receipt = self.blockchain.send_transaction(contract_function=contract_function,
transacting_power=transacting_power)
contract_function: ContractFunction = self.contract.functions.bondOperator(
staking_provider, operator
)
receipt = self.blockchain.send_transaction(
contract_function=contract_function, transacting_power=transacting_power
)
return receipt
@ -703,7 +646,10 @@ class CoordinatorAgent(EthereumContractAgent):
) -> Iterable[Coordinator.Participant]:
if max_results < 0:
raise ValueError("Max results must be greater than or equal to zero.")
data = self.contract.functions.getParticipants(
get_participants_overloaded_function = self.contract.get_function_by_signature(
"getParticipants(uint32,uint256,uint256,bool)"
)
data = get_participants_overloaded_function(
ritual_id, start_index, max_results, transcripts
).call()
participants = Coordinator.Ritual.make_participants(data=data)
@ -796,17 +742,18 @@ class CoordinatorAgent(EthereumContractAgent):
ritual_id: int,
transcript: Transcript,
transacting_power: TransactingPower,
fire_and_forget: bool = True,
) -> Union[TxReceipt, HexBytes]:
async_tx_hooks: BlockchainInterface.AsyncTxHooks,
) -> AsyncTx:
contract_function: ContractFunction = self.contract.functions.postTranscript(
ritualId=ritual_id, transcript=bytes(transcript)
)
receipt = self.blockchain.send_transaction(
async_tx = self.blockchain.send_async_transaction(
contract_function=contract_function,
transacting_power=transacting_power,
fire_and_forget=fire_and_forget,
async_tx_hooks=async_tx_hooks,
info={"ritual_id": ritual_id, "phase": PHASE1},
)
return receipt
return async_tx
@contract_api(TRANSACTION)
def post_aggregation(
@ -816,23 +763,24 @@ class CoordinatorAgent(EthereumContractAgent):
public_key: DkgPublicKey,
participant_public_key: SessionStaticKey,
transacting_power: TransactingPower,
fire_and_forget: bool = True,
) -> Union[TxReceipt, HexBytes]:
async_tx_hooks: BlockchainInterface.AsyncTxHooks,
) -> AsyncTx:
contract_function: ContractFunction = self.contract.functions.postAggregation(
ritualId=ritual_id,
aggregatedTranscript=bytes(aggregated_transcript),
dkgPublicKey=Ferveo.G1Point.from_dkg_public_key(public_key),
decryptionRequestStaticKey=bytes(participant_public_key),
)
receipt = self.blockchain.send_transaction(
async_tx = self.blockchain.send_async_transaction(
contract_function=contract_function,
gas_estimation_multiplier=1.4,
transacting_power=transacting_power,
fire_and_forget=fire_and_forget,
async_tx_hooks=async_tx_hooks,
info={"ritual_id": ritual_id, "phase": PHASE2},
)
return receipt
return async_tx
@contract_api(TRANSACTION)
@contract_api(CONTRACT_CALL)
def get_ritual_initiation_cost(
self, providers: List[ChecksumAddress], duration: int
) -> Wei:
@ -841,7 +789,7 @@ class CoordinatorAgent(EthereumContractAgent):
).call()
return Wei(result)
@contract_api(TRANSACTION)
@contract_api(CONTRACT_CALL)
def get_ritual_id_from_public_key(self, public_key: DkgPublicKey) -> int:
g1_point = Ferveo.G1Point.from_dkg_public_key(public_key)
result = self.contract.functions.getRitualIdFromPublicKey(g1_point).call()
@ -865,7 +813,9 @@ class ContractAgency:
"""Where agents live and die."""
# TODO: Enforce singleton - #1506 - Okay, actually, make this into a module
__agents: Dict[str, Dict[Type[EthereumContractAgent], EthereumContractAgent]] = dict()
__agents: Dict[str, Dict[Type[EthereumContractAgent], EthereumContractAgent]] = (
dict()
)
@classmethod
def get_agent(
@@ -926,7 +876,7 @@ class ContractAgency:
agent_class=agent_class,
registry=registry,
blockchain_endpoint=blockchain_endpoint,
contract_version=contract_version
contract_version=contract_version,
)
return agent
@@ -959,7 +909,9 @@ class WeightedSampler:
return []
if quantity > len(self):
raise ValueError("Cannot sample more than the total amount of elements without replacement")
raise ValueError(
"Cannot sample more than the total amount of elements without replacement"
)
samples = []
@@ -984,7 +936,6 @@ class WeightedSampler:
class StakingProvidersReservoir:
def __init__(self, staking_provider_map: Dict[ChecksumAddress, int]):
self._sampler = WeightedSampler(staking_provider_map)
self._rng = random.SystemRandom()
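The `WeightedSampler` hunk above rejects requests for more samples than there are elements and draws without replacement. A runnable sketch of that contract (a hypothetical standalone helper for illustration only — not the library's implementation, which maintains cumulative weights internally):

```python
import random


def weighted_sample_without_replacement(weights, quantity, rng=None):
    """Draw `quantity` distinct keys, each chosen with probability
    proportional to its weight, removing each pick from the pool."""
    rng = rng or random.SystemRandom()
    if quantity > len(weights):
        raise ValueError(
            "Cannot sample more than the total amount of elements without replacement"
        )
    pool = dict(weights)  # don't mutate the caller's mapping
    samples = []
    for _ in range(quantity):
        keys = list(pool)
        choice = rng.choices(keys, weights=[pool[k] for k in keys], k=1)[0]
        samples.append(choice)
        del pool[choice]  # without replacement: picked keys leave the pool
    return samples
```

Using `random.SystemRandom` mirrors the reservoir's choice of an OS-entropy RNG, which matters when the sample determines which staking providers receive work.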

View File

@@ -1,14 +1,9 @@
import os
import time
from functools import cached_property
from typing import Union
from constant_sorrow.constants import UNKNOWN_DEVELOPMENT_CHAIN_ID
from cytoolz.dicttoolz import dissoc
from eth_account import Account
from eth_account.messages import encode_defunct
from eth_typing.evm import BlockNumber, ChecksumAddress
from eth_utils import to_canonical_address, to_checksum_address
from eth_typing.evm import BlockNumber
from web3 import Web3
from web3._utils.threads import Timeout
from web3.contract.contract import Contract
@@ -38,27 +33,11 @@ class Web3ClientUnexpectedVersionString(Web3ClientError):
pass
# TODO: Consider creating a ChainInventory class and/or moving this to a separate module
PUBLIC_CHAINS = {
0: "Olympic",
1: "Mainnet",
2: "Morden",
3: "Ropsten",
4: "Rinkeby",
5: "Goerli",
6: "Kotti",
42: "Kovan",
77: "Sokol",
100: "xDai",
137: "Polygon/Mainnet",
11155111: "Sepolia",
80001: "Polygon/Mumbai"
}
LOCAL_CHAINS = {
1337: "GethDev",
5777: "Ganache/TestRPC"
80002: "Polygon/Amoy",
}
# This list is not exhaustive,
@@ -72,22 +51,11 @@ POA_CHAINS = {
10200, # gnosis/chiado,
137, # Polygon/Mainnet
80001, # "Polygon/Mumbai"
80002, # "Polygon/Amoy"
}
class EthereumClient:
is_local = False
# These two are used by Infura
GETH = 'Geth'
BOR = 'bor'
PARITY = 'Parity'
ALT_PARITY = 'Parity-Ethereum'
GANACHE = 'EthereumJS TestRPC'
ETHEREUM_TESTER = 'EthereumTester' # (PyEVM)
BLOCK_CONFIRMATIONS_POLLING_TIME = 3 # seconds
TRANSACTION_POLLING_TIME = 0.5 # seconds
COOLING_TIME = 5 # seconds
@@ -123,107 +91,33 @@ class EthereumClient:
block_hash=Web3.to_hex(receipt['blockHash']))
super().__init__(self.message)
def __init__(self,
w3,
node_technology: str,
version: str,
platform: str,
backend: str):
def __init__(self, w3):
self.w3 = w3
self.node_technology = node_technology
self.node_version = version
self.platform = platform
self.backend = backend
self.log = Logger(self.__class__.__name__)
self._add_default_middleware()
def _add_default_middleware(self):
# default retry functionality
self.log.debug('Adding RPC retry middleware to client')
self.add_middleware(RetryRequestMiddleware)
@classmethod
def _get_variant(cls, w3):
return cls
@classmethod
def from_w3(cls, w3: Web3) -> 'EthereumClient':
"""
Client version strings:
Geth -> 'Geth/v1.4.11-stable-fed692f6/darwin/go1.7'
Parity -> 'Parity-Ethereum/v2.5.1-beta-e0141f8-20190510/x86_64-linux-gnu/rustc1.34.1'
Ganache -> 'EthereumJS TestRPC/v2.1.5/ethereum-js'
PyEVM -> 'EthereumTester/0.1.0b39/linux/python3.6.7'
Bor -> 'bor/v0.2.13-beta2-c227a072/linux-amd64/go1.17.5'
"""
clients = {
# Geth
cls.GETH: GethClient,
cls.BOR: BorClient,
# Parity
cls.PARITY: ParityClient,
cls.ALT_PARITY: ParityClient,
# Test Clients
cls.GANACHE: GanacheClient,
cls.ETHEREUM_TESTER: EthereumTesterClient,
}
try:
client_data = w3.client_version.split('/')
node_technology = client_data[0]
ClientSubclass = clients[node_technology]
except (ValueError, IndexError):
raise ValueError(f"Invalid client version string. Got '{w3.client_version}'")
except KeyError:
raise NotImplementedError(f'{w3.client_version} is not a supported ethereum client')
client_kwargs = {
'node_technology': node_technology,
'version': client_data[1],
'backend': client_data[-1],
'platform': client_data[2] if len(client_data) == 4 else None # Platform is optional
}
instance = ClientSubclass._get_variant(w3)(w3, **client_kwargs)
return instance
@property
def peers(self):
raise NotImplementedError
endpoint_uri = getattr(self.w3.provider, "endpoint_uri", "")
if "infura" in endpoint_uri:
self.log.debug("Adding Infura RPC retry middleware to client")
self.add_middleware(InfuraRetryRequestMiddleware)
elif "alchemyapi.io" in endpoint_uri:
self.log.debug("Adding Alchemy RPC retry middleware to client")
self.add_middleware(AlchemyRetryRequestMiddleware)
else:
self.log.debug("Adding RPC retry middleware to client")
self.add_middleware(RetryRequestMiddleware)
@property
def chain_name(self) -> str:
chain_inventory = LOCAL_CHAINS if self.is_local else PUBLIC_CHAINS
name = chain_inventory.get(self.chain_id, UNKNOWN_DEVELOPMENT_CHAIN_ID)
name = PUBLIC_CHAINS.get(self.chain_id, UNKNOWN_DEVELOPMENT_CHAIN_ID)
return name
def lock_account(self, account) -> bool:
if self.is_local:
return True
return NotImplemented
def unlock_account(self, account, password, duration=None) -> bool:
if self.is_local:
return True
return NotImplemented
@property
def is_connected(self):
return self.w3.is_connected()
@property
def etherbase(self) -> str:
return self.w3.eth.accounts[0]
@property
def accounts(self):
return self.w3.eth.accounts
@@ -242,15 +136,8 @@ class EthereumClient:
@cached_property
def chain_id(self) -> int:
result = self.w3.eth.chain_id
try:
# from hex-str
chain_id = int(result, 16)
except TypeError:
# from str
chain_id = int(result)
return chain_id
_chain_id = self._get_chain_id(self.w3)
return _chain_id
@property
def net_version(self) -> int:
@@ -277,10 +164,6 @@ class EthereumClient:
def block_number(self) -> BlockNumber:
return self.w3.eth.block_number
@property
def coinbase(self) -> ChecksumAddress:
return self.w3.eth.coinbase
def wait_for_receipt(self,
transaction_hash: str,
timeout: float,
@@ -359,9 +242,6 @@ class EthereumClient:
# TODO: Consider adding an optional param in this exception to include extra info (e.g. new block)
return True
def sign_transaction(self, transaction_dict: dict) -> bytes:
raise NotImplementedError
def get_transaction(self, transaction_hash) -> dict:
return self.w3.eth.get_transaction(transaction_hash)
@@ -378,14 +258,6 @@ class EthereumClient:
def send_raw_transaction(self, transaction_bytes: bytes) -> str:
return self.w3.eth.send_raw_transaction(transaction_bytes)
def sign_message(self, account: str, message: bytes) -> str:
"""
Calls the appropriate signing function for the specified account on the
backend. If the backend is based on eth-tester, then it uses the
eth-tester signing interface to do so.
"""
return self.w3.eth.sign(account, data=message)
def get_blocktime(self):
highest_block = self.w3.eth.get_block('latest')
now = highest_block['timestamp']
@@ -399,176 +271,14 @@ class EthereumClient:
# check that our local chain data is up to date
return (time.time() - self.get_blocktime()) < self.STALECHECK_ALLOWABLE_DELAY
def parse_transaction_data(self, transaction):
return transaction.input
class GethClient(EthereumClient):
@classmethod
def _get_variant(cls, w3):
endpoint_uri = getattr(w3.provider, 'endpoint_uri', '')
if 'infura' in endpoint_uri:
return InfuraClient
elif 'alchemyapi.io' in endpoint_uri:
return AlchemyClient
return cls
@property
def is_local(self):
return self.chain_id not in PUBLIC_CHAINS
@property
def peers(self):
return self.w3.geth.admin.peers()
def new_account(self, password: str) -> str:
new_account = self.w3.geth.personal.new_account(password)
return to_checksum_address(new_account) # cast and validate
def unlock_account(self, account: str, password: str, duration: int = None):
if self.is_local:
return True
debug_message = f"Unlocking account {account}"
if duration is None:
debug_message += " for 5 minutes"
elif duration == 0:
debug_message += " indefinitely"
elif duration > 0:
debug_message += f" for {duration} seconds"
if password is None:
debug_message += " with no password."
self.log.debug(debug_message)
return self.w3.geth.personal.unlock_account(account, password, duration)
def lock_account(self, account):
return self.w3.geth.personal.lock_account(account)
def sign_transaction(self, transaction_dict: dict) -> bytes:
# Do not include a 'to' field for contract creation.
if transaction_dict['to'] == b'':
transaction_dict = dissoc(transaction_dict, 'to')
# Sign
result = self.w3.eth.sign_transaction(transaction_dict)
# Return RLP bytes
rlp_encoded_transaction = result.raw
return rlp_encoded_transaction
@property
def wallets(self):
return self.w3.geth.personal.list_wallets()
class BorClient(GethClient):
"""Geth to Bor adapter"""
class ParityClient(EthereumClient):
@property
def peers(self) -> list:
"""
TODO: Look for web3.py support for Parity Peers endpoint
"""
return self.w3.manager.request_blocking("parity_netPeers", [])
def new_account(self, password: str) -> str:
new_account = self.w3.parity.personal.new_account(password)
return to_checksum_address(new_account) # cast and validate
def unlock_account(self, account, password, duration: int = None) -> bool:
return self.w3.parity.personal.unlock_account(account, password, duration)
def lock_account(self, account):
return self.w3.parity.personal.lock_account(account)
class GanacheClient(EthereumClient):
is_local = True
def unlock_account(self, *args, **kwargs) -> bool:
return True
class InfuraClient(EthereumClient):
is_local = False
TRANSACTION_POLLING_TIME = 2 # seconds
def _add_default_middleware(self):
# default retry functionality
self.log.debug('Adding Infura RPC retry middleware to client')
self.add_middleware(InfuraRetryRequestMiddleware)
def unlock_account(self, *args, **kwargs) -> bool:
return True
class AlchemyClient(EthereumClient):
def _add_default_middleware(self):
# default retry functionality
self.log.debug('Adding Alchemy RPC retry middleware to client')
self.add_middleware(AlchemyRetryRequestMiddleware)
class EthereumTesterClient(EthereumClient):
is_local = True
def unlock_account(self, account, password, duration: int = None) -> bool:
"""Returns True if the testing backend keystore has control of the given address."""
account = to_checksum_address(account)
keystore_accounts = self.w3.provider.ethereum_tester.get_accounts()
if account in keystore_accounts:
return True
else:
return self.w3.provider.ethereum_tester.unlock_account(account=account,
password=password,
unlock_seconds=duration)
def lock_account(self, account) -> bool:
"""Returns True if the testing backend keystore has control of the given address."""
account = to_canonical_address(account)
keystore_accounts = self.w3.provider.ethereum_tester.backend.get_accounts()
if account in keystore_accounts:
return True
else:
return self.w3.provider.ethereum_tester.lock_account(account=account)
def new_account(self, password: str) -> str:
insecure_account = self.w3.provider.ethereum_tester.add_account(private_key=os.urandom(32).hex(),
password=password)
return insecure_account
def __get_signing_key(self, account: bytes):
"""Get signing key of test account"""
account = to_canonical_address(account)
def _get_chain_id(cls, w3: Web3):
result = w3.eth.chain_id
try:
signing_key = self.w3.provider.ethereum_tester.backend._key_lookup[account]._raw_key
except KeyError:
raise self.UnknownAccount(account)
return signing_key
# from hex-str
chain_id = int(result, 16)
except TypeError:
# from str
chain_id = int(result)
def sign_transaction(self, transaction_dict: dict) -> bytes:
# Sign using a local private key
address = to_canonical_address(transaction_dict['from'])
signing_key = self.__get_signing_key(account=address)
signed_transaction = self.w3.eth.account.sign_transaction(transaction_dict, private_key=signing_key)
rlp_transaction = signed_transaction.rawTransaction
return rlp_transaction
def sign_message(self, account: str, message: bytes) -> str:
"""Sign, EIP-191 (Geth) Style"""
signing_key = self.__get_signing_key(account=account)
signable_message = encode_defunct(primitive=message)
signature_and_stuff = Account.sign_message(signable_message=signable_message, private_key=signing_key)
return signature_and_stuff['signature']
def parse_transaction_data(self, transaction):
return transaction.data # See https://github.com/ethereum/eth-tester/issues/173
return chain_id
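The `_get_chain_id` hunk above normalizes the provider's `eth_chainId` response, which may arrive as an int or as a hex string depending on the client. A standalone sketch of that same try/except (the helper name here is hypothetical, for illustration only):

```python
def normalize_chain_id(result):
    """Normalize an eth_chainId response to an int.

    Mirrors the try/except in the hunk above: a 0x-prefixed hex
    string is parsed base-16; an int raises TypeError for base-16
    parsing and is converted directly.
    """
    try:
        return int(result, 16)  # from hex-str, e.g. "0x89" -> 137
    except TypeError:
        return int(result)  # already an int
```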

File diff suppressed because it is too large

View File

@@ -456,6 +456,18 @@
"internalType": "uint96",
"indexed": false
},
{
"name": "deauthorizing",
"type": "uint96",
"internalType": "uint96",
"indexed": false
},
{
"name": "endDeauthorization",
"type": "uint64",
"internalType": "uint64",
"indexed": false
},
{
"name": "operator",
"type": "address",
@@ -916,6 +928,30 @@
}
]
},
{
"type": "function",
"name": "eligibleStake",
"stateMutability": "view",
"inputs": [
{
"name": "_stakingProvider",
"type": "address",
"internalType": "address"
},
{
"name": "_endDate",
"type": "uint256",
"internalType": "uint256"
}
],
"outputs": [
{
"name": "",
"type": "uint96",
"internalType": "uint96"
}
]
},
{
"type": "function",
"name": "getActiveStakingProviders",
@@ -930,6 +966,11 @@
"name": "_maxStakingProviders",
"type": "uint256",
"internalType": "uint256"
},
{
"name": "_cohortDuration",
"type": "uint32",
"internalType": "uint32"
}
],
"outputs": [
@@ -4585,10 +4626,22 @@
"indexed": true
},
{
"name": "amount",
"name": "authorized",
"type": "uint96",
"internalType": "uint96",
"indexed": false
},
{
"name": "deauthorizing",
"type": "uint96",
"internalType": "uint96",
"indexed": false
},
{
"name": "endDeauthorization",
"type": "uint64",
"internalType": "uint64",
"indexed": false
}
],
"anonymous": false
@@ -4689,6 +4742,64 @@
}
]
},
{
"type": "function",
"name": "eligibleStake",
"stateMutability": "view",
"inputs": [
{
"name": "_stakingProvider",
"type": "address",
"internalType": "address"
},
{
"name": "_endDate",
"type": "uint256",
"internalType": "uint256"
}
],
"outputs": [
{
"name": "",
"type": "uint96",
"internalType": "uint96"
}
]
},
{
"type": "function",
"name": "getActiveStakingProviders",
"stateMutability": "view",
"inputs": [
{
"name": "_startIndex",
"type": "uint256",
"internalType": "uint256"
},
{
"name": "_maxStakingProviders",
"type": "uint256",
"internalType": "uint256"
},
{
"name": "_cohortDuration",
"type": "uint32",
"internalType": "uint32"
}
],
"outputs": [
{
"name": "allAuthorizedTokens",
"type": "uint96",
"internalType": "uint96"
},
{
"name": "activeStakingProviders",
"type": "bytes32[]",
"internalType": "bytes32[]"
}
]
},
{
"type": "function",
"name": "getActiveStakingProviders",
@@ -4776,6 +4887,25 @@
}
]
},
{
"type": "function",
"name": "pendingAuthorizationDecrease",
"stateMutability": "view",
"inputs": [
{
"name": "_stakingProvider",
"type": "address",
"internalType": "address"
}
],
"outputs": [
{
"name": "",
"type": "uint96",
"internalType": "uint96"
}
]
},
{
"type": "function",
"name": "rootApplication",
@@ -4820,6 +4950,16 @@
"name": "index",
"type": "uint248",
"internalType": "uint248"
},
{
"name": "deauthorizing",
"type": "uint96",
"internalType": "uint96"
},
{
"name": "endDeauthorization",
"type": "uint64",
"internalType": "uint64"
}
]
},
@@ -4853,13 +4993,41 @@
"internalType": "address"
},
{
"name": "amount",
"name": "authorized",
"type": "uint96",
"internalType": "uint96"
}
],
"outputs": []
},
{
"type": "function",
"name": "updateAuthorization",
"stateMutability": "nonpayable",
"inputs": [
{
"name": "stakingProvider",
"type": "address",
"internalType": "address"
},
{
"name": "authorized",
"type": "uint96",
"internalType": "uint96"
},
{
"name": "deauthorizing",
"type": "uint96",
"internalType": "uint96"
},
{
"name": "endDeauthorization",
"type": "uint64",
"internalType": "uint64"
}
],
"outputs": []
},
{
"type": "function",
"name": "updateOperator",

File diff suppressed because it is too large

View File

@@ -20,7 +20,7 @@ class EthChain(ChainInfo, Enum):
class PolygonChain(ChainInfo, Enum):
MAINNET = (137, "polygon")
MUMBAI = (80001, "mumbai")
AMOY = (80002, "amoy")
class TACoDomain:
@@ -83,11 +83,11 @@ MAINNET = TACoDomain(
LYNX = TACoDomain(
name="lynx",
eth_chain=EthChain.SEPOLIA,
polygon_chain=PolygonChain.MUMBAI,
polygon_chain=PolygonChain.AMOY,
condition_chains=(
EthChain.MAINNET,
EthChain.SEPOLIA,
PolygonChain.MUMBAI,
PolygonChain.AMOY,
PolygonChain.MAINNET,
),
)
@@ -95,8 +95,8 @@ LYNX = TACoDomain(
TAPIR = TACoDomain(
name="tapir",
eth_chain=EthChain.SEPOLIA,
polygon_chain=PolygonChain.MUMBAI,
condition_chains=(EthChain.SEPOLIA, PolygonChain.MUMBAI),
polygon_chain=PolygonChain.AMOY,
condition_chains=(EthChain.SEPOLIA, PolygonChain.AMOY),
)

View File

@@ -1,53 +1,50 @@
import math
import pprint
from pathlib import Path
from typing import Callable, Dict, NamedTuple, Optional, Union
from urllib.parse import urlparse
import requests
from atxm import AutomaticTxMachine
from atxm.exceptions import InsufficientFunds
from atxm.strategies import ExponentialSpeedupStrategy
from atxm.tx import AsyncTx, FaultedTx, FinalizedTx, FutureTx, PendingTx
from constant_sorrow.constants import (
INSUFFICIENT_FUNDS,
NO_BLOCKCHAIN_CONNECTION,
UNKNOWN_TX_STATUS,
)
from eth_tester import EthereumTester
from eth_tester.exceptions import (
TransactionFailed as TestTransactionFailed,
)
from eth_tester.exceptions import (
ValidationError,
INSUFFICIENT_FUNDS, # noqa
NO_BLOCKCHAIN_CONNECTION, # noqa
UNKNOWN_TX_STATUS, # noqa
)
from eth_utils import to_checksum_address
from hexbytes.main import HexBytes
from web3 import HTTPProvider, IPCProvider, Web3, WebsocketProvider
from web3.contract.contract import Contract, ContractConstructor, ContractFunction
from web3.exceptions import TimeExhausted
from web3.middleware import geth_poa_middleware, simple_cache_middleware
from web3.providers import BaseProvider
from web3.types import TxReceipt
from web3.types import TxParams, TxReceipt
from nucypher.blockchain.eth.clients import POA_CHAINS, EthereumClient, InfuraClient
from nucypher.blockchain.eth.clients import POA_CHAINS, EthereumClient
from nucypher.blockchain.eth.decorators import validate_checksum_address
from nucypher.blockchain.eth.providers import (
_get_auto_provider,
_get_HTTP_provider,
_get_IPC_provider,
_get_http_provider,
_get_mock_test_provider,
_get_pyevm_test_provider,
_get_websocket_provider,
)
from nucypher.blockchain.eth.registry import ContractRegistry
from nucypher.blockchain.eth.utils import get_transaction_name, prettify_eth_amount
from nucypher.blockchain.eth.utils import (
get_transaction_name,
get_tx_cost_data,
prettify_eth_amount,
)
from nucypher.crypto.powers import TransactingPower
from nucypher.utilities.emitters import StdoutEmitter
from nucypher.utilities.gas_strategies import (
WEB3_GAS_STRATEGIES,
construct_datafeed_median_strategy,
max_price_gas_strategy_wrapper,
)
from nucypher.utilities.logging import Logger
Web3Providers = Union[IPCProvider, WebsocketProvider, HTTPProvider, EthereumTester] # TODO: Move to types.py
Web3Providers = Union[
IPCProvider, WebsocketProvider, HTTPProvider
] # TODO: Move to types.py
class BlockchainInterface:
@@ -58,7 +55,7 @@ class BlockchainInterface:
TIMEOUT = 600 # seconds # TODO: Correlate with the gas strategy - #2070
DEFAULT_GAS_STRATEGY = 'fast'
DEFAULT_GAS_STRATEGY = "fast"
GAS_STRATEGIES = WEB3_GAS_STRATEGIES
Web3 = Web3 # TODO: This is name-shadowing the actual Web3. Is this intentional?
@@ -85,15 +82,15 @@ class BlockchainInterface:
}
class TransactionFailed(InterfaceError):
IPC_CODE = -32000
def __init__(self,
message: str,
transaction_dict: dict,
contract_function: Union[ContractFunction, ContractConstructor],
*args):
def __init__(
self,
message: str,
transaction_dict: dict,
contract_function: Union[ContractFunction, ContractConstructor],
*args,
):
self.base_message = message
self.name = get_transaction_name(contract_function=contract_function)
self.payload = transaction_dict
@@ -107,30 +104,62 @@ class BlockchainInterface:
@property
def default(self) -> str:
sender = self.payload["from"]
message = f'{self.name} from {sender[:6]}... \n' \
f'Sender balance: {prettify_eth_amount(self.get_balance())} \n' \
f'Reason: {self.base_message} \n' \
f'Transaction: {self.payload}'
message = (
f"{self.name} from {sender[:6]}... \n"
f"Sender balance: {prettify_eth_amount(self.get_balance())} \n"
f"Reason: {self.base_message} \n"
f"Transaction: {self.payload}"
)
return message
def get_balance(self):
blockchain = BlockchainInterfaceFactory.get_interface()
balance = blockchain.client.get_balance(account=self.payload['from'])
balance = blockchain.client.get_balance(account=self.payload["from"])
return balance
@property
def insufficient_funds(self) -> str:
try:
transaction_fee = self.payload['gas'] * self.payload['gasPrice']
transaction_fee = self.payload["gas"] * self.payload["gasPrice"]
except KeyError:
return self.default
else:
cost = transaction_fee + self.payload.get('value', 0)
message = f'{self.name} from {self.payload["from"][:8]} - {self.base_message}.' \
f'Calculated cost is {prettify_eth_amount(cost)},' \
f'but sender only has {prettify_eth_amount(self.get_balance())}.'
cost = transaction_fee + self.payload.get("value", 0)
message = (
f'{self.name} from {self.payload["from"][:8]} - {self.base_message}. '
f"Calculated cost is {prettify_eth_amount(cost)}, "
f"but sender only has {prettify_eth_amount(self.get_balance())}."
)
return message
class AsyncTxHooks:
def __init__(
self,
on_broadcast_failure: Callable[[FutureTx, Exception], None],
on_fault: Callable[[FaultedTx], None],
on_finalized: Callable[[FinalizedTx], None],
on_insufficient_funds: Callable[
[Union[FutureTx, PendingTx], InsufficientFunds], None
],
on_broadcast: Optional[Callable[[PendingTx], None]] = None,
):
self.on_broadcast_failure = on_broadcast_failure
self.on_fault = on_fault
self.on_finalized = on_finalized
self.on_insufficient_funds = on_insufficient_funds
self.on_broadcast = (
on_broadcast if on_broadcast else self.__default_on_broadcast
)
@staticmethod
def __default_on_broadcast(tx: PendingTx):
emitter = StdoutEmitter()
max_cost, max_price_gwei, tx_type = get_tx_cost_data(tx.params)
emitter.message(
f"Broadcasted {tx_type} async tx {tx.id} with TXHASH {tx.txhash.hex()} ({max_cost} @ {max_price_gwei} gwei)",
color="yellow",
)
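The `AsyncTxHooks` class added above is a plain container that bundles transaction-lifecycle callbacks and falls back to a default broadcast logger. A self-contained sketch of that pattern (plain callables and strings stand in for the atxm transaction types; this is an illustrative stub, not the library class):

```python
from typing import Callable, Optional


class AsyncTxHooks:
    """Bundle lifecycle callbacks for an async transaction."""

    def __init__(
        self,
        on_broadcast_failure: Callable,
        on_fault: Callable,
        on_finalized: Callable,
        on_insufficient_funds: Callable,
        on_broadcast: Optional[Callable] = None,
    ):
        self.on_broadcast_failure = on_broadcast_failure
        self.on_fault = on_fault
        self.on_finalized = on_finalized
        self.on_insufficient_funds = on_insufficient_funds
        # Only on_broadcast is optional; it falls back to a default logger.
        self.on_broadcast = on_broadcast or self.__default_on_broadcast

    @staticmethod
    def __default_on_broadcast(tx):
        # The real default prints the tx id, hash, and cost via StdoutEmitter.
        print(f"Broadcasted async tx {tx}")
```

Requiring the failure/fault/finalized/insufficient-funds hooks while defaulting only `on_broadcast` forces callers to handle every terminal outcome explicitly.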
def __init__(
self,
emitter=None, # TODO # 1754
@@ -204,7 +233,7 @@ class BlockchainInterface:
"""
self.log = Logger('Blockchain')
self.log = Logger("Blockchain")
self.poa = poa
self.endpoint = endpoint
self._provider = provider
@@ -212,8 +241,22 @@ class BlockchainInterface:
self.client: EthereumClient = NO_BLOCKCHAIN_CONNECTION
self.is_light = light
speedup_strategy = ExponentialSpeedupStrategy(
w3=self.w3,
min_time_between_speedups=120,
) # speedup txs if not mined after 2 mins.
self.tx_machine = AutomaticTxMachine(
w3=self.w3, tx_exec_timeout=self.TIMEOUT, strategies=[speedup_strategy]
)
# TODO: Not ready to give users total flexibility. Let's stick for the moment to known values. See #2447
if gas_strategy not in ('slow', 'medium', 'fast', 'free', None): # FIXME: What is 'None' doing here?
if gas_strategy not in (
"slow",
"medium",
"fast",
"free",
None,
): # FIXME: What is 'None' doing here?
raise ValueError(f"'{gas_strategy}' is an invalid gas strategy")
self.gas_strategy = gas_strategy or self.DEFAULT_GAS_STRATEGY
self.max_gas_price = max_gas_price
@@ -241,7 +284,9 @@ class BlockchainInterface:
except KeyError:
if gas_strategy:
if not callable(gas_strategy):
raise ValueError(f"{gas_strategy} must be callable to be a valid gas strategy.")
raise ValueError(
f"{gas_strategy} must be callable to be a valid gas strategy."
)
else:
gas_strategy = cls.GAS_STRATEGIES[cls.DEFAULT_GAS_STRATEGY]
return gas_strategy
@@ -256,7 +301,7 @@ class BlockchainInterface:
# For use with Proof-Of-Authority test-blockchains
if self.poa is True:
self.log.debug('Injecting POA middleware at layer 0')
self.log.debug("Injecting POA middleware at layer 0")
self.client.inject_middleware(geth_poa_middleware, layer=0)
self.log.debug("Adding simple_cache_middleware")
@@ -266,14 +311,8 @@ class BlockchainInterface:
# self.configure_gas_strategy()
def configure_gas_strategy(self, gas_strategy: Optional[Callable] = None) -> None:
if gas_strategy:
reported_gas_strategy = f"fixed/{gas_strategy.name}"
elif isinstance(self.client, InfuraClient):
gas_strategy = construct_datafeed_median_strategy(speed=self.gas_strategy)
reported_gas_strategy = f"datafeed/{self.gas_strategy}"
else:
reported_gas_strategy = f"web3/{self.gas_strategy}"
gas_strategy = self.get_gas_strategy(self.gas_strategy)
@@ -281,8 +320,10 @@ class BlockchainInterface:
configuration_message = f"Using gas strategy '{reported_gas_strategy}'"
if self.max_gas_price:
__price = Web3.to_wei(self.max_gas_price, 'gwei') # from gwei to wei
gas_strategy = max_price_gas_strategy_wrapper(gas_strategy=gas_strategy, max_gas_price_wei=__price)
__price = Web3.to_wei(self.max_gas_price, "gwei") # from gwei to wei
gas_strategy = max_price_gas_strategy_wrapper(
gas_strategy=gas_strategy, max_gas_price_wei=__price
)
configuration_message += f", with a max price of {self.max_gas_price} gwei."
self.client.set_gas_strategy(gas_strategy=gas_strategy)
@@ -295,7 +336,6 @@ class BlockchainInterface:
# self.log.debug(f"Gas strategy currently reports a gas price of {gwei_gas_price} gwei.")
def connect(self):
endpoint = self.endpoint
self.log.info(f"Using external Web3 Provider '{self.endpoint}'")
@@ -311,7 +351,8 @@ class BlockchainInterface:
# Connect if not connected
try:
self.w3 = self.Web3(provider=self._provider)
self.client = EthereumClient.from_w3(w3=self.w3)
self.tx_machine.w3 = self.w3 # share this web3 instance with the tracker
self.client = EthereumClient(w3=self.w3)
except requests.ConnectionError: # RPC
raise self.ConnectionFailed(
f"Connection Failed - {str(self.endpoint)} - is RPC enabled?"
@@ -343,53 +384,33 @@ class BlockchainInterface:
if endpoint and not provider:
uri_breakdown = urlparse(endpoint)
if uri_breakdown.scheme == 'tester':
providers = {
'pyevm': _get_pyevm_test_provider,
'mock': _get_mock_test_provider
}
provider_scheme = uri_breakdown.netloc
provider_scheme = (
uri_breakdown.netloc
if uri_breakdown.scheme == "tester"
else uri_breakdown.scheme
)
if provider_scheme == "pyevm":
self._provider = _get_pyevm_test_provider(endpoint)
elif provider_scheme == "mock":
self._provider = _get_mock_test_provider(endpoint)
elif provider_scheme == "http" or provider_scheme == "https":
self._provider = _get_http_provider(endpoint)
else:
providers = {
'auto': _get_auto_provider,
'ipc': _get_IPC_provider,
'file': _get_IPC_provider,
'ws': _get_websocket_provider,
'wss': _get_websocket_provider,
'http': _get_HTTP_provider,
'https': _get_HTTP_provider,
}
provider_scheme = uri_breakdown.scheme
# auto-detect for file based ipc
if not provider_scheme:
if Path(endpoint).is_file():
# file is available - assume ipc/file scheme
provider_scheme = "file"
self.log.info(
f"Auto-detected provider scheme as 'file://' for provider {endpoint}"
)
try:
self._provider = providers[provider_scheme](endpoint)
except KeyError:
raise self.UnsupportedProvider(
f"{endpoint} is an invalid or unsupported blockchain provider URI"
)
else:
self.endpoint = endpoint or NO_BLOCKCHAIN_CONNECTION
self.endpoint = endpoint or NO_BLOCKCHAIN_CONNECTION
else:
self._provider = provider
@classmethod
def _handle_failed_transaction(cls,
exception: Exception,
transaction_dict: dict,
contract_function: Union[ContractFunction, ContractConstructor],
logger: Logger = None
) -> None:
def _handle_failed_transaction(
cls,
exception: Exception,
transaction_dict: dict,
contract_function: Union[ContractFunction, ContractConstructor],
logger: Logger = None,
) -> None:
"""
Re-raising error handler and context manager for transaction broadcast or
build failure events at the interface layer. This method is a last line of defense
@@ -401,8 +422,8 @@ class BlockchainInterface:
# Assume this error is formatted as an RPC response
try:
code = int(response['code'])
message = response['message']
code = int(response["code"])
message = response["message"]
except Exception:
# TODO: #1504 - Try even harder to determine if this is insufficient funds causing the issue,
# This may be best handled at the agent or actor layer for registry and token interactions.
@@ -417,12 +438,16 @@ class BlockchainInterface:
if logger:
logger.critical(message) # simple context
transaction_failed = cls.TransactionFailed(message=message, # rich error (best case)
contract_function=contract_function,
transaction_dict=transaction_dict)
transaction_failed = cls.TransactionFailed(
message=message, # rich error (best case)
contract_function=contract_function,
transaction_dict=transaction_dict,
)
raise transaction_failed from exception
def __log_transaction(self, transaction_dict: dict, contract_function: ContractFunction):
def __log_transaction(
self, transaction_dict: dict, contract_function: ContractFunction
):
"""
Format and log a transaction dict and return the transaction name string.
This method *must not* mutate the original transaction dict.
@@ -431,30 +456,38 @@ class BlockchainInterface:
tx = dict(transaction_dict).copy()
# Format
if tx.get('to'):
tx['to'] = to_checksum_address(contract_function.address)
if tx.get("to"):
tx["to"] = to_checksum_address(contract_function.address)
try:
tx['selector'] = contract_function.selector
tx["selector"] = contract_function.selector
except AttributeError:
pass
tx['from'] = to_checksum_address(tx['from'])
tx.update({f: prettify_eth_amount(v) for f, v in tx.items() if f in ('gasPrice', 'value')})
payload_pprint = ', '.join("{}: {}".format(k, v) for k, v in tx.items())
tx["from"] = to_checksum_address(tx["from"])
tx.update(
{
f: prettify_eth_amount(v)
for f, v in tx.items()
if f in ("gasPrice", "value")
}
)
payload_pprint = ", ".join("{}: {}".format(k, v) for k, v in tx.items())
# Log
transaction_name = get_transaction_name(contract_function=contract_function)
self.log.debug(f"[TX-{transaction_name}] | {payload_pprint}")
@validate_checksum_address
def build_payload(self,
sender_address: str,
payload: dict = None,
transaction_gas_limit: int = None,
use_pending_nonce: bool = True,
) -> dict:
nonce = self.client.get_transaction_count(account=sender_address, pending=use_pending_nonce)
base_payload = {'nonce': nonce, 'from': sender_address}
def build_payload(
self,
sender_address: str,
payload: dict = None,
transaction_gas_limit: int = None,
use_pending_nonce: bool = True,
) -> dict:
nonce = self.client.get_transaction_count(
account=sender_address, pending=use_pending_nonce
)
base_payload = {"nonce": nonce, "from": sender_address}
# Aggregate
if not payload:
@@ -462,55 +495,72 @@ class BlockchainInterface:
payload.update(base_payload)
# Explicit gas override - will skip gas estimation in next operation.
if transaction_gas_limit:
payload['gas'] = int(transaction_gas_limit)
payload["gas"] = int(transaction_gas_limit)
return payload
@validate_checksum_address
def build_contract_transaction(self,
contract_function: ContractFunction,
sender_address: str,
payload: dict = None,
transaction_gas_limit: Optional[int] = None,
gas_estimation_multiplier: Optional[float] = None,
use_pending_nonce: Optional[bool] = None,
) -> dict:
def build_contract_transaction(
self,
contract_function: ContractFunction,
sender_address: str,
payload: dict = None,
transaction_gas_limit: Optional[int] = None,
gas_estimation_multiplier: Optional[float] = None,
use_pending_nonce: Optional[bool] = None,
log_now: bool = True,
) -> TxParams:
if transaction_gas_limit is not None:
self.log.warn("The transaction gas limit of {transaction_gas_limit} will override gas estimation attempts")
self.log.warn(
f"The transaction gas limit of {transaction_gas_limit} will override gas estimation attempts"
)
# Sanity checks for the gas estimation multiplier
if gas_estimation_multiplier is not None:
if not 1 <= gas_estimation_multiplier <= 3: # Arbitrary upper bound.
raise ValueError(f"The gas estimation multiplier should be a float between 1 and 3, "
f"but we received {gas_estimation_multiplier}.")
raise ValueError(
f"The gas estimation multiplier must be a float between 1 and 3, "
f"but we received {gas_estimation_multiplier}."
)
payload = self.build_payload(sender_address=sender_address,
payload=payload,
transaction_gas_limit=transaction_gas_limit,
use_pending_nonce=use_pending_nonce)
self.__log_transaction(transaction_dict=payload, contract_function=contract_function)
payload = self.build_payload(
sender_address=sender_address,
payload=payload,
transaction_gas_limit=transaction_gas_limit,
use_pending_nonce=use_pending_nonce,
)
if log_now:
self.__log_transaction(
transaction_dict=payload, contract_function=contract_function
)
try:
if 'gas' not in payload: # i.e., transaction_gas_limit is not None
if "gas" not in payload: # i.e., transaction_gas_limit is not None
# As web3 build_transaction() will estimate gas with block identifier "pending" by default,
# explicitly estimate gas here with block identifier 'latest' if not otherwise specified
# as a pending transaction can cause gas estimation to fail, notably in case of worklock refunds.
payload['gas'] = contract_function.estimate_gas(payload, block_identifier='latest')
payload["gas"] = contract_function.estimate_gas(
payload, block_identifier="latest"
)
transaction_dict = contract_function.build_transaction(payload)
except (TestTransactionFailed, ValidationError, ValueError) as error:
except ValueError as error:
# Note: Geth (1.9.15) raises ValueError in the same condition that pyevm raises ValidationError here.
# Treat this condition as "Transaction Failed" during gas estimation.
raise self._handle_failed_transaction(exception=error,
transaction_dict=payload,
contract_function=contract_function,
logger=self.log)
raise self._handle_failed_transaction(
exception=error,
transaction_dict=payload,
contract_function=contract_function,
logger=self.log,
)
# Increase the estimated gas limit according to the gas estimation multiplier, if any.
if gas_estimation_multiplier and not transaction_gas_limit:
gas_estimation = transaction_dict['gas']
gas_estimation = transaction_dict["gas"]
overestimation = int(math.ceil(gas_estimation * gas_estimation_multiplier))
self.log.debug(f"Gas limit for this TX was increased from {gas_estimation} to {overestimation}, "
f"using a multiplier of {gas_estimation_multiplier}.")
transaction_dict['gas'] = overestimation
self.log.debug(
f"Gas limit for this TX was increased from {gas_estimation} to {overestimation}, "
f"using a multiplier of {gas_estimation_multiplier}."
)
transaction_dict["gas"] = overestimation
# TODO: What if we're going over the block limit? Not likely, but perhaps worth checking (NRN)
return transaction_dict
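The multiplier bounds check and the `math.ceil` overestimation above amount to simple arithmetic; a minimal sketch, assuming the same 1–3 bounds (the function name is illustrative, not part of the codebase):

```python
import math

def apply_gas_multiplier(gas_estimation: int, multiplier: float) -> int:
    """Scale an estimated gas limit by a safety multiplier, rounding up,
    as build_contract_transaction does after a successful estimation."""
    if not 1 <= multiplier <= 3:  # same arbitrary upper bound as the sanity check above
        raise ValueError(f"multiplier must be between 1 and 3, got {multiplier}")
    return int(math.ceil(gas_estimation * multiplier))

print(apply_gas_multiplier(100_000, 1.15))  # 115000
```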
@@ -521,141 +571,159 @@ class BlockchainInterface:
transaction_dict: Dict,
transaction_name: str = "",
confirmations: int = 0,
fire_and_forget: bool = False,
) -> Union[TxReceipt, HexBytes]:
) -> TxReceipt:
"""
Takes a transaction dictionary, signs it with the configured signer, then broadcasts the signed
transaction using the ethereum provider's eth_sendRawTransaction RPC endpoint.
Optionally blocks for receipt and confirmation with 'confirmations', and 'fire_and_forget' flags.
If 'fire and forget' is True this method returns the transaction hash only, without waiting for a receipt -
otherwise return the transaction receipt.
Takes a transaction dictionary, signs it with the configured signer,
then broadcasts the signed transaction using the RPC provider's
eth_sendRawTransaction endpoint.
"""
#
# Setup
#
# TODO # 1754 - Move this to singleton - I do not approve... nor does Bogdan?
emitter = StdoutEmitter()
#
# Sign
#
# TODO: Show the USD Price: https://api.coinmarketcap.com/v1/ticker/ethereum/
try:
# post-london fork transactions (Type 2)
max_unit_price = transaction_dict['maxFeePerGas']
tx_type = 'EIP-1559'
except KeyError:
# pre-london fork "legacy" transactions (Type 0)
max_unit_price = transaction_dict['gasPrice']
tx_type = 'Legacy'
max_price_gwei = Web3.from_wei(max_unit_price, 'gwei')
max_cost_wei = max_unit_price * transaction_dict['gas']
max_cost = Web3.from_wei(max_cost_wei, 'ether')
max_cost, max_price_gwei, tx_type = get_tx_cost_data(transaction_dict)
if transacting_power.is_device:
emitter.message(f'Confirm transaction {transaction_name} on hardware wallet... '
f'({max_cost} ETH @ {max_price_gwei} gwei)',
color='yellow')
signed_raw_transaction = transacting_power.sign_transaction(transaction_dict)
emitter.message(
f"Confirm transaction {transaction_name} on hardware wallet... "
f"({max_cost} @ {max_price_gwei} gwei)",
color="yellow",
)
raw_transaction = transacting_power.sign_transaction(transaction_dict)
#
# Broadcast
#
emitter.message(f'Broadcasting {transaction_name} {tx_type} Transaction ({max_cost} ETH @ {max_price_gwei} gwei)',
color='yellow')
emitter.message(
f"Broadcasting {transaction_name} {tx_type} Transaction ({max_cost} @ {max_price_gwei} gwei)",
color="yellow",
)
try:
txhash = self.client.send_raw_transaction(signed_raw_transaction) # <--- BROADCAST
emitter.message(f'TXHASH {txhash.hex()}', color='yellow')
except (TestTransactionFailed, ValueError):
txhash = self.client.send_raw_transaction(raw_transaction) # <--- BROADCAST
emitter.message(f"TXHASH {txhash.hex()}", color="yellow")
except ValueError:
raise # TODO: Unify with Transaction failed handling -- Entry point for _handle_failed_transaction
else:
if fire_and_forget:
return txhash
#
# Receipt
#
try: # TODO: Handle block confirmation exceptions
waiting_for = 'receipt'
try:
waiting_for = "receipt"
if confirmations:
waiting_for = f'{confirmations} confirmations'
emitter.message(f'Waiting {self.TIMEOUT} seconds for {waiting_for}', color='yellow')
receipt = self.client.wait_for_receipt(txhash, timeout=self.TIMEOUT, confirmations=confirmations)
waiting_for = f"{confirmations} confirmations"
emitter.message(
f"Waiting {self.TIMEOUT} seconds for {waiting_for}", color="yellow"
)
receipt = self.client.wait_for_receipt(
txhash, timeout=self.TIMEOUT, confirmations=confirmations
)
except TimeExhausted:
# TODO: #1504 - Handle transaction timeout
raise
else:
self.log.debug(f"[RECEIPT-{transaction_name}] | txhash: {receipt['transactionHash'].hex()}")
self.log.debug(
f"[RECEIPT-{transaction_name}] | txhash: {receipt['transactionHash'].hex()}"
)
#
# Confirmations
#
# Primary check
transaction_status = receipt.get('status', UNKNOWN_TX_STATUS)
transaction_status = receipt.get("status", UNKNOWN_TX_STATUS)
if transaction_status == 0:
failure = f"Transaction transmitted, but receipt returned status code 0. " \
f"Full receipt: \n {pprint.pformat(receipt, indent=2)}"
failure = (
f"Transaction transmitted, but receipt returned status code 0. "
f"Full receipt: \n {pprint.pformat(receipt, indent=2)}"
)
raise self.InterfaceError(failure)
if transaction_status is UNKNOWN_TX_STATUS:
self.log.info(f"Unknown transaction status for {txhash} (receipt did not contain a status field)")
self.log.info(
f"Unknown transaction status for {txhash} (receipt did not contain a status field)"
)
# Secondary check
tx = self.client.get_transaction(txhash)
if tx["gas"] == receipt["gasUsed"]:
raise self.InterfaceError(f"Transaction consumed 100% of transaction gas."
f"Full receipt: \n {pprint.pformat(receipt, indent=2)}")
raise self.InterfaceError(
f"Transaction consumed 100% of transaction gas. "
f"Full receipt: \n {pprint.pformat(receipt, indent=2)}"
)
return receipt
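The primary and secondary receipt checks above (a zero status code, and full gas consumption as a proxy for an out-of-gas failure) can be sketched independently of web3; the receipt dicts and `RuntimeError` here are illustrative stand-ins for real receipts and `InterfaceError`:

```python
def check_receipt(receipt: dict, tx_gas_limit: int) -> None:
    """Mirror the post-broadcast checks in sign_and_broadcast_transaction."""
    status = receipt.get("status")  # may be absent on clients that omit the field
    if status == 0:
        raise RuntimeError("Transaction transmitted, but receipt returned status code 0.")
    # Secondary check: consuming exactly the gas limit suggests the tx ran out of gas.
    if receipt["gasUsed"] == tx_gas_limit:
        raise RuntimeError("Transaction consumed 100% of transaction gas.")

check_receipt({"status": 1, "gasUsed": 21_000}, tx_gas_limit=30_000)  # passes silently
```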
def send_async_transaction(
self,
contract_function: ContractFunction,
transacting_power: TransactingPower,
async_tx_hooks: AsyncTxHooks,
transaction_gas_limit: Optional[int] = None,
gas_estimation_multiplier: float = 1.15,
info: Optional[Dict[str, str]] = None,
payload: dict = None,
) -> AsyncTx:
transaction = self.build_contract_transaction(
contract_function=contract_function,
sender_address=transacting_power.account,
payload=payload,
transaction_gas_limit=transaction_gas_limit,
gas_estimation_multiplier=gas_estimation_multiplier,
log_now=False,
)
basic_info = {
"name": contract_function.fn_name,
"contract": contract_function.address,
}
if info:
basic_info.update(info)
# TODO: This is a bit of a hack. temporary solution until incoming PR #3382 is merged.
signer = transacting_power._signer._get_signer(transacting_power.account)
async_tx = self.tx_machine.queue_transaction(
info=basic_info,
params=transaction,
signer=signer,
on_broadcast=async_tx_hooks.on_broadcast,
on_broadcast_failure=async_tx_hooks.on_broadcast_failure,
on_fault=async_tx_hooks.on_fault,
on_finalized=async_tx_hooks.on_finalized,
on_insufficient_funds=async_tx_hooks.on_insufficient_funds,
)
return async_tx
@validate_checksum_address
def send_transaction(self,
contract_function: Union[ContractFunction, ContractConstructor],
transacting_power: TransactingPower,
payload: dict = None,
transaction_gas_limit: Optional[int] = None,
gas_estimation_multiplier: Optional[float] = 1.15, # TODO: Workaround for #2635, #2337
confirmations: int = 0,
fire_and_forget: bool = False, # do not wait for receipt. See #2385
replace: bool = False,
) -> Union[TxReceipt, HexBytes]:
if fire_and_forget:
if confirmations > 0:
raise ValueError("Transaction Prevented: "
"Cannot use 'confirmations' and 'fire_and_forget' options together.")
use_pending_nonce = False # TODO: #2385
else:
use_pending_nonce = replace # TODO: #2385
transaction = self.build_contract_transaction(contract_function=contract_function,
sender_address=transacting_power.account,
payload=payload,
transaction_gas_limit=transaction_gas_limit,
gas_estimation_multiplier=gas_estimation_multiplier,
use_pending_nonce=use_pending_nonce)
# Get transaction name
def send_transaction(
self,
contract_function: Union[ContractFunction, ContractConstructor],
transacting_power: TransactingPower,
payload: dict = None,
transaction_gas_limit: Optional[int] = None,
gas_estimation_multiplier: Optional[
float
] = 1.15, # TODO: Workaround for #2635, #2337
) -> TxReceipt:
transaction = self.build_contract_transaction(
contract_function=contract_function,
sender_address=transacting_power.account,
payload=payload,
transaction_gas_limit=transaction_gas_limit,
gas_estimation_multiplier=gas_estimation_multiplier,
log_now=True,
)
try:
transaction_name = contract_function.fn_name.upper()
except AttributeError:
transaction_name = 'DEPLOY' if isinstance(contract_function, ContractConstructor) else 'UNKNOWN'
txhash_or_receipt = self.sign_and_broadcast_transaction(transacting_power=transacting_power,
transaction_dict=transaction,
transaction_name=transaction_name,
confirmations=confirmations,
fire_and_forget=fire_and_forget)
return txhash_or_receipt
transaction_name = (
"DEPLOY"
if isinstance(contract_function, ContractConstructor)
else "UNKNOWN"
)
receipt = self.sign_and_broadcast_transaction(
transacting_power=transacting_power,
transaction_dict=transaction,
transaction_name=transaction_name,
)
return receipt
def get_contract_by_name(
self,
@@ -714,12 +782,9 @@ class BlockchainInterfaceFactory:
return bool(cls._interfaces.get(endpoint, False))
@classmethod
def register_interface(cls,
interface: BlockchainInterface,
emitter=None,
force: bool = False
) -> None:
def register_interface(
cls, interface: BlockchainInterface, emitter=None, force: bool = False
) -> None:
endpoint = interface.endpoint
if (endpoint in cls._interfaces) and not force:
raise cls.InterfaceAlreadyInitialized(
@@ -763,7 +828,6 @@ class BlockchainInterfaceFactory:
@classmethod
def get_interface(cls, endpoint: str = None) -> Interfaces:
# Try to get an existing cached interface.
if endpoint:
try:


@@ -13,8 +13,10 @@ from nucypher_core.ferveo import (
FerveoPublicKey,
)
PHASE1 = 1
PHASE2 = 2
from nucypher.types import PhaseNumber
PHASE1 = PhaseNumber(1)
PHASE2 = PhaseNumber(2)
@dataclass


@@ -1,11 +1,6 @@
from typing import Union
from urllib.parse import urlparse
from eth_tester import EthereumTester, PyEVMBackend
from eth_tester.backends.mock.main import MockBackend
from web3 import HTTPProvider, IPCProvider, WebsocketProvider
from web3 import HTTPProvider
from web3.providers import BaseProvider
from web3.providers.eth_tester.main import EthereumTesterProvider
from nucypher.exceptions import DevelopmentInstallationRequired
@@ -14,15 +9,7 @@ class ProviderError(Exception):
pass
def _get_IPC_provider(endpoint) -> BaseProvider:
uri_breakdown = urlparse(endpoint)
from nucypher.blockchain.eth.interfaces import BlockchainInterface
return IPCProvider(ipc_path=uri_breakdown.path,
timeout=BlockchainInterface.TIMEOUT,
request_kwargs={'timeout': BlockchainInterface.TIMEOUT})
def _get_HTTP_provider(endpoint) -> BaseProvider:
def _get_http_provider(endpoint) -> BaseProvider:
from nucypher.blockchain.eth.interfaces import BlockchainInterface
return HTTPProvider(
@@ -31,25 +18,8 @@ def _get_HTTP_provider(endpoint) -> BaseProvider:
)
def _get_websocket_provider(endpoint) -> BaseProvider:
from nucypher.blockchain.eth.interfaces import BlockchainInterface
def _get_pyevm_test_backend():
return WebsocketProvider(
endpoint_uri=endpoint,
websocket_kwargs={"timeout": BlockchainInterface.TIMEOUT},
)
def _get_auto_provider(endpoint) -> BaseProvider:
from web3.auto import w3
# how-automated-detection-works: https://web3py.readthedocs.io/en/latest/providers.html
connected = w3.isConnected()
if not connected:
raise ProviderError('Cannot auto-detect node. Provide a full URI instead.')
return w3.provider
def _get_pyevm_test_backend() -> PyEVMBackend:
try:
# TODO: Consider packaged support of --dev mode with testerchain
from tests.constants import NUMBER_OF_ETH_TEST_ACCOUNTS, PYEVM_GAS_LIMIT
@@ -57,13 +27,21 @@ def _get_pyevm_test_backend() -> PyEVMBackend:
raise DevelopmentInstallationRequired(importable_name='tests.constants')
# Initialize
from eth_tester import PyEVMBackend
genesis_params = PyEVMBackend._generate_genesis_params(overrides={'gas_limit': PYEVM_GAS_LIMIT})
pyevm_backend = PyEVMBackend(genesis_parameters=genesis_params)
pyevm_backend.reset_to_genesis(genesis_params=genesis_params, num_accounts=NUMBER_OF_ETH_TEST_ACCOUNTS)
return pyevm_backend
def _get_ethereum_tester(test_backend: Union[PyEVMBackend, MockBackend]) -> EthereumTesterProvider:
def _get_ethereum_tester(test_backend):
try:
from eth_tester import EthereumTester
from web3.providers.eth_tester.main import EthereumTesterProvider
except ImportError:
raise DevelopmentInstallationRequired(
importable_name="web3.providers.eth_tester"
)
eth_tester = EthereumTester(backend=test_backend, auto_mine_transactions=True)
provider = EthereumTesterProvider(ethereum_tester=eth_tester)
return provider
@@ -79,11 +57,10 @@ def _get_pyevm_test_provider(endpoint) -> BaseProvider:
def _get_mock_test_provider(endpoint) -> BaseProvider:
# https://github.com/ethereum/eth-tester#mockbackend
try:
from eth_tester import MockBackend
except ImportError:
raise DevelopmentInstallationRequired(importable_name="eth_tester.MockBackend")
mock_backend = MockBackend()
provider = _get_ethereum_tester(test_backend=mock_backend)
return provider
def _get_tester_ganache(endpoint=None) -> BaseProvider:
endpoint_uri = endpoint or "http://localhost:7545"
return HTTPProvider(endpoint_uri=endpoint_uri)


@@ -1,5 +1,3 @@
from abc import ABC, abstractmethod
from typing import List
from urllib.parse import urlparse
@@ -79,7 +77,7 @@ class Signer(ABC):
return NotImplemented
@abstractmethod
def sign_transaction(self, transaction_dict: dict) -> HexBytes:
def sign_transaction(self, transaction_dict: dict) -> bytes:
return NotImplemented
@abstractmethod


@@ -1,5 +1,3 @@
import json
from json.decoder import JSONDecodeError
from pathlib import Path
@@ -18,21 +16,27 @@ from nucypher.blockchain.eth.signers.base import Signer
class Web3Signer(Signer):
def __init__(self, client):
super().__init__()
self.__client = client
@validate_checksum_address
def _get_signer(self, account: str):
raise NotImplementedError(
"Can't sign via external blockchain clients; provide an explicit Signer"
)
@classmethod
def uri_scheme(cls) -> str:
return NotImplemented # web3 signer uses a "passthrough" scheme
@classmethod
def from_signer_uri(cls, uri: str, testnet: bool = False) -> 'Web3Signer':
def from_signer_uri(cls, uri: str, testnet: bool = False) -> "Web3Signer":
from nucypher.blockchain.eth.interfaces import (
BlockchainInterface,
BlockchainInterfaceFactory,
)
try:
blockchain = BlockchainInterfaceFactory.get_or_create_interface(
endpoint=uri
@@ -51,41 +55,30 @@ class Web3Signer(Signer):
@validate_checksum_address
def is_device(self, account: str):
try:
# TODO: Temporary fix for #1128 and #1385. It's ugly af, but it works. Move somewhere else?
wallets = self.__client.wallets
except AttributeError:
return False
else:
HW_WALLET_URL_PREFIXES = ('trezor', 'ledger')
hw_accounts = [w['accounts'] for w in wallets if w['url'].startswith(HW_WALLET_URL_PREFIXES)]
hw_addresses = [to_checksum_address(account['address']) for sublist in hw_accounts for account in sublist]
return account in hw_addresses
return False
@validate_checksum_address
def unlock_account(self, account: str, password: str, duration: int = None):
if self.is_device(account=account):
unlocked = True
else:
unlocked = self.__client.unlock_account(account=account, password=password, duration=duration)
return unlocked
raise NotImplementedError(
"Can't manage accounts via external blockchain clients; provide an explicit Signer"
)
@validate_checksum_address
def lock_account(self, account: str):
if self.is_device(account=account):
result = None # TODO: Force Disconnect Devices?
else:
result = self.__client.lock_account(account=account)
return result
raise NotImplementedError(
"Can't manage accounts via external blockchain clients; provide an explicit Signer"
)
@validate_checksum_address
def sign_message(self, account: str, message: bytes, **kwargs) -> HexBytes:
signature = self.__client.sign_message(account=account, message=message)
return HexBytes(signature)
raise NotImplementedError(
"Can't sign via external blockchain clients; provide an explicit Signer"
)
def sign_transaction(self, transaction_dict: dict) -> HexBytes:
signed_raw_transaction = self.__client.sign_transaction(transaction_dict=transaction_dict)
return signed_raw_transaction
def sign_transaction(self, transaction_dict: dict) -> bytes:
raise NotImplementedError(
"Can't sign via external blockchain clients; provide an explicit Signer"
)
class KeystoreSigner(Signer):
@@ -116,7 +109,7 @@ class KeystoreSigner(Signer):
@classmethod
def uri_scheme(cls) -> str:
return 'keystore'
return "keystore"
def __read_keystore(self, path: Path) -> None:
"""Read the keystore directory from the disk and populate accounts."""
@@ -126,11 +119,15 @@ class KeystoreSigner(Signer):
elif path.is_file():
paths = (path,)
else:
raise self.InvalidSignerURI(f'Invalid keystore file or directory "{path}"')
raise self.InvalidSignerURI(
f'Invalid keystore file or directory "{path}"'
)
except FileNotFoundError:
raise self.InvalidSignerURI(f'No such keystore file or directory "{path}"')
except OSError as exc:
raise self.InvalidSignerURI(f'Error accessing keystore file or directory "{path}": {exc}')
raise self.InvalidSignerURI(
f'Error accessing keystore file or directory "{path}": {exc}'
)
for path in paths:
account, key_metadata = self.__handle_keyfile(path=path)
self.__keys[account] = key_metadata
@@ -138,9 +135,9 @@ class KeystoreSigner(Signer):
@staticmethod
def __read_keyfile(path: Path) -> tuple:
"""Read an individual keystore key file from the disk"""
with open(path, 'r') as keyfile:
with open(path, "r") as keyfile:
key_metadata = json.load(keyfile)
address = key_metadata['address']
address = key_metadata["address"]
return address, key_metadata
def __handle_keyfile(self, path: Path) -> Tuple[str, dict]:
@@ -161,12 +158,17 @@ class KeystoreSigner(Signer):
raise self.InvalidKeyfile(error)
else:
if not is_address(address):
raise self.InvalidKeyfile(f"'{path}' does not contain a valid ethereum address")
try:
address = to_checksum_address(address)
except ValueError:
raise self.InvalidKeyfile(
f"'{path}' does not contain a valid ethereum address"
)
address = to_checksum_address(address)
return address, key_metadata
@validate_checksum_address
def __get_signer(self, account: str) -> LocalAccount:
def _get_signer(self, account: str) -> LocalAccount:
"""Lookup a known keystore account by its checksum address or raise an error"""
try:
return self.__signers[account]
@@ -186,14 +188,14 @@ class KeystoreSigner(Signer):
return self.__path
@classmethod
def from_signer_uri(cls, uri: str, testnet: bool = False) -> 'Signer':
"""Return a keystore signer from URI string i.e. keystore:///my/path/keystore """
def from_signer_uri(cls, uri: str, testnet: bool = False) -> "Signer":
"""Return a keystore signer from URI string i.e. keystore:///my/path/keystore"""
decoded_uri = urlparse(uri)
if decoded_uri.scheme != cls.uri_scheme() or decoded_uri.netloc:
raise cls.InvalidSignerURI(uri)
path = decoded_uri.path
if not path:
raise cls.InvalidSignerURI('Blank signer URI - No keystore path provided')
raise cls.InvalidSignerURI("Blank signer URI - No keystore path provided")
return cls(path=Path(path), testnet=testnet)
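The URI handling above relies on `urlparse`; a minimal standalone sketch of the same checks (`parse_keystore_uri` is a hypothetical helper, not part of the codebase):

```python
from pathlib import Path
from urllib.parse import urlparse

def parse_keystore_uri(uri: str) -> Path:
    """Extract the filesystem path from a keystore URI, following the same
    scheme/netloc/path checks as KeystoreSigner.from_signer_uri."""
    decoded = urlparse(uri)
    if decoded.scheme != "keystore" or decoded.netloc:
        raise ValueError(f"Invalid signer URI: {uri}")
    if not decoded.path:
        raise ValueError("Blank signer URI - No keystore path provided")
    return Path(decoded.path)

print(parse_keystore_uri("keystore:///my/path/keystore"))  # /my/path/keystore
```

Note the triple slash: with `keystore://my/path`, urlparse would treat `my` as the netloc, which the check above rejects.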
@validate_checksum_address
@@ -223,10 +225,14 @@ class KeystoreSigner(Signer):
if not password:
# It is possible that password is None here passed from the above layer
# causing Account.decrypt to crash, expecting a value for password.
raise self.AuthenticationFailed('No password supplied to unlock account.')
raise self.AuthenticationFailed(
"No password supplied to unlock account."
)
raise
except ValueError as e:
raise self.AuthenticationFailed("Invalid or incorrect ethereum account password.") from e
raise self.AuthenticationFailed(
"Invalid or incorrect ethereum account password."
) from e
return True
@validate_checksum_address
@@ -244,27 +250,31 @@ class KeystoreSigner(Signer):
return account not in self.__signers
@validate_checksum_address
def sign_transaction(self, transaction_dict: dict) -> HexBytes:
def sign_transaction(self, transaction_dict: dict) -> bytes:
"""
Produce a raw signed ethereum transaction signed by the account specified
in the 'from' field of the transaction dictionary.
"""
sender = transaction_dict['from']
signer = self.__get_signer(account=sender)
sender = transaction_dict["from"]
signer = self._get_signer(account=sender)
# TODO: Handle this at a higher level?
# Do not include a 'to' field for contract creation.
if not transaction_dict['to']:
transaction_dict = dissoc(transaction_dict, 'to')
if not transaction_dict["to"]:
transaction_dict = dissoc(transaction_dict, "to")
raw_transaction = signer.sign_transaction(transaction_dict=transaction_dict).rawTransaction
raw_transaction = signer.sign_transaction(
transaction_dict=transaction_dict
).rawTransaction
return raw_transaction
@validate_checksum_address
def sign_message(self, account: str, message: bytes, **kwargs) -> HexBytes:
signer = self.__get_signer(account=account)
signature = signer.sign_message(signable_message=encode_defunct(primitive=message)).signature
signer = self._get_signer(account=account)
signature = signer.sign_message(
signable_message=encode_defunct(primitive=message)
).signature
return HexBytes(signature)
@@ -309,7 +319,7 @@ class InMemorySigner(Signer):
return True
@validate_checksum_address
def __get_signer(self, account: str) -> LocalAccount:
def _get_signer(self, account: str) -> LocalAccount:
"""Lookup a known keystore account by its checksum address or raise an error"""
try:
return self.__signers[account]
@@ -320,9 +330,9 @@ class InMemorySigner(Signer):
raise self.AccountLocked(account=account)
@validate_checksum_address
def sign_transaction(self, transaction_dict: dict) -> HexBytes:
def sign_transaction(self, transaction_dict: dict) -> bytes:
sender = transaction_dict["from"]
signer = self.__get_signer(account=sender)
signer = self._get_signer(account=sender)
if not transaction_dict["to"]:
transaction_dict = dissoc(transaction_dict, "to")
raw_transaction = signer.sign_transaction(
@@ -332,7 +342,7 @@ class InMemorySigner(Signer):
@validate_checksum_address
def sign_message(self, account: str, message: bytes, **kwargs) -> HexBytes:
signer = self.__get_signer(account=account)
signer = self._get_signer(account=account)
signature = signer.sign_message(
signable_message=encode_defunct(primitive=message)
).signature


@@ -8,7 +8,6 @@ from prometheus_client import REGISTRY, Gauge
from twisted.internet import threads
from web3.datastructures import AttributeDict
from nucypher.blockchain.eth import actors
from nucypher.blockchain.eth.models import Coordinator
from nucypher.policy.conditions.utils import camel_case_to_snake
from nucypher.utilities.cache import TTLCache
@@ -50,15 +49,17 @@ class EventScannerTask(SimpleTask):
INTERVAL = 120 # seconds
def __init__(self, scanner: Callable, *args, **kwargs):
def __init__(self, scanner: Callable):
self.scanner = scanner
super().__init__(*args, **kwargs)
super().__init__()
def run(self):
def run(self) -> None:
self.scanner()
def handle_errors(self, *args, **kwargs):
self.log.warn("Error during ritual event scanning: {}".format(args[0].getTraceback()))
def handle_errors(self, *args, **kwargs) -> None:
self.log.warn(
"Error during ritual event scanning: {}".format(args[0].getTraceback())
)
if not self._task.running:
self.log.warn("Restarting event scanner task!")
self.start(now=False) # take a breather
@@ -67,9 +68,7 @@ class EventScannerTask(SimpleTask):
class ActiveRitualTracker:
CHAIN_REORG_SCAN_WINDOW = 20
MAX_CHUNK_SIZE = 10000
MIN_CHUNK_SIZE = 60 # 60 blocks @ 2s per block on Polygon = 120s of blocks (somewhat related to interval)
# how often to check/purge for expired cached values - 8hrs?
@@ -97,7 +96,7 @@ class ActiveRitualTracker:
def __init__(
self,
operator: "actors.Operator",
operator,
persistent: bool = False, # TODO: use persistent storage?
):
self.log = Logger("RitualTracker")
@@ -172,7 +171,7 @@ class ActiveRitualTracker:
w3 = self.web3
timeout = self.coordinator_agent.get_timeout()
latest_block = w3.eth.get_block('latest')
latest_block = w3.eth.get_block("latest")
if latest_block.number == 0:
return 0
@@ -202,13 +201,13 @@ class ActiveRitualTracker:
# if non-zero block found - return the block before
return expected_start_block.number - 1 if expected_start_block.number > 0 else 0
def start(self):
def start(self) -> None:
"""Start the event scanner task."""
return self.task.start()
self.task.start()
def stop(self):
def stop(self) -> None:
"""Stop the event scanner task."""
return self.task.stop()
self.task.stop()
def _action_required(self, ritual_event: AttributeDict) -> bool:
"""Check if an action is required for a given ritual event."""
@@ -389,8 +388,10 @@ class ActiveRitualTracker:
camel_case_to_snake(k): v for k, v in ritual_event.args.items()
}
event_type = getattr(self.contract.events, ritual_event.event)
def task():
self.actions[event_type](timestamp=timestamp, **formatted_kwargs)
if defer:
d = threads.deferToThread(task)
d.addErrback(self.task.handle_errors)
@@ -416,14 +417,18 @@ class ActiveRitualTracker:
def __scan(self, start_block, end_block, account):
# Run the scan
self.log.debug(f"({account[:8]}) Scanning events in block range {start_block} - {end_block}")
self.log.debug(
f"({account[:8]}) Scanning events in block range {start_block} - {end_block}"
)
start = time.time()
result, total_chunks_scanned = self.scanner.scan(start_block, end_block)
if self.persistent:
self.state.save()
duration = time.time() - start
self.log.debug(f"Scanned total of {len(result)} events, in {duration} seconds, "
f"total {total_chunks_scanned} chunk scans performed")
self.log.debug(
f"Scanned total of {len(result)} events, in {duration} seconds, "
f"total {total_chunks_scanned} chunk scans performed"
)
def scan(self):
"""


@@ -9,11 +9,11 @@ class PrometheusMetricsTracker(SimpleTask):
self.metrics_collectors = collectors
super().__init__(interval=interval)
def run(self):
def run(self) -> None:
for collector in self.metrics_collectors:
collector.collect()
def handle_errors(self, *args, **kwargs):
def handle_errors(self, *args, **kwargs) -> None:
self.log.warn(
"Error during prometheus metrics collection: {}".format(
args[0].getTraceback()


@@ -4,6 +4,7 @@ from typing import Union
from eth_typing import ChecksumAddress
from web3 import Web3
from web3.contract.contract import ContractConstructor, ContractFunction
from web3.types import TxParams
def prettify_eth_amount(amount, original_denomination: str = 'wei') -> str:
@@ -46,3 +47,18 @@ def get_transaction_name(contract_function: Union[ContractFunction, ContractCons
def truncate_checksum_address(checksum_address: ChecksumAddress) -> str:
return f"{checksum_address[:8]}...{checksum_address[-8:]}"
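For illustration, the slicing in `truncate_checksum_address` keeps the first and last eight characters; the address below is arbitrary:

```python
def truncate_checksum_address(checksum_address: str) -> str:
    # Same slicing as the helper above: head and tail of the hex string.
    return f"{checksum_address[:8]}...{checksum_address[-8:]}"

print(truncate_checksum_address("0x52908400098527886E0F7030069857D2E4169EE7"))
# 0x529084...E4169EE7
```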
def get_tx_cost_data(transaction_dict: TxParams):
try:
# post-london fork transactions (Type 2)
max_unit_price = transaction_dict["maxFeePerGas"]
tx_type = "EIP-1559"
except KeyError:
# pre-london fork "legacy" transactions (Type 0)
max_unit_price = transaction_dict["gasPrice"]
tx_type = "Legacy"
max_price_gwei = Web3.from_wei(max_unit_price, "gwei")
max_cost_wei = max_unit_price * transaction_dict["gas"]
max_cost = Web3.from_wei(max_cost_wei, "ether")
return max_cost, max_price_gwei, tx_type
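The fee-type detection and cost math above can be exercised without web3 by swapping `Web3.from_wei` for a simplified `Decimal`-based stand-in covering just the two units used here (a sketch under that assumption, not the library API):

```python
from decimal import Decimal

def from_wei(value: int, unit: str) -> Decimal:
    # Simplified stand-in for Web3.from_wei, supporting only gwei and ether.
    denominations = {"gwei": 10**9, "ether": 10**18}
    return Decimal(value) / denominations[unit]

def tx_cost_data(transaction_dict: dict):
    try:
        max_unit_price = transaction_dict["maxFeePerGas"]  # Type 2 (EIP-1559)
        tx_type = "EIP-1559"
    except KeyError:
        max_unit_price = transaction_dict["gasPrice"]  # Type 0 (legacy)
        tx_type = "Legacy"
    max_price_gwei = from_wei(max_unit_price, "gwei")
    max_cost = from_wei(max_unit_price * transaction_dict["gas"], "ether")
    return max_cost, max_price_gwei, tx_type

cost, price, kind = tx_cost_data({"maxFeePerGas": 30 * 10**9, "gas": 21_000})
# kind == "EIP-1559", price == 30 gwei, cost == 0.00063 ether
```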


@@ -10,8 +10,6 @@ from typing import (
NamedTuple,
Optional,
Sequence,
Set,
Tuple,
Union,
)
@@ -65,7 +63,6 @@ from nucypher_core.umbral import (
reencrypt,
)
from twisted.internet import reactor
from web3.types import TxReceipt
import nucypher
from nucypher.acumen.nicknames import Nickname
@@ -373,58 +370,6 @@ class Alice(Character, actors.PolicyAuthor):
policy_pubkey = alice_delegating_power.get_pubkey_from_label(label)
return policy_pubkey
def revoke(
self, policy: Policy, onchain: bool = True, offchain: bool = True
) -> Tuple[TxReceipt, Dict[ChecksumAddress, Tuple["actors.Revocation", Exception]]]:
if not (offchain or onchain):
raise ValueError("offchain or onchain must be True to issue revocation")
receipt, failed = dict(), dict()
if onchain:
pass
# TODO: Decouple onchain revocation from SubscriptionManager or deprecate.
# receipt = self.policy_agent.revoke_policy(policy_id=bytes(policy.hrac),
# transacting_power=self._crypto_power.power_ups(TransactingPower))
if offchain:
"""
Parses the treasure map and revokes onchain arrangements in it.
If any nodes cannot be revoked, then the node_id is added to a
dict as a key, and the revocation and Ursula's response is added as
a value.
"""
try:
# Wait for a revocation threshold of nodes to be known ((n - m) + 1)
revocation_threshold = (policy.shares - policy.threshold) + 1
self.block_until_specific_nodes_are_known(
policy.revocation_kit.revokable_addresses,
allow_missing=(policy.shares - revocation_threshold),
)
except self.NotEnoughTeachers:
raise # TODO NRN
for node_id in policy.revocation_kit.revokable_addresses:
ursula = self.known_nodes[node_id]
revocation = policy.revocation_kit[node_id]
try:
response = self.network_middleware.request_revocation(
ursula, revocation
)
except self.network_middleware.NotFound:
failed[node_id] = (revocation, self.network_middleware.NotFound)
except self.network_middleware.UnexpectedResponse:
failed[node_id] = (
revocation,
self.network_middleware.UnexpectedResponse,
)
else:
if response.status_code != 200:
message = f"Failed to revocation for node {node_id} with status code {response.status_code}"
raise self.ActorError(message)
return receipt, failed
def decrypt_message_kit(self, label: bytes, message_kit: MessageKit) -> List[bytes]:
"""
Decrypt this Alice's own encrypted data.
@@ -645,9 +590,9 @@ class Bob(Character):
requester_public_key=requester_public_key,
)
shared_secrets[ursula_checksum_address] = shared_secret
decryption_request_mapping[
ursula_checksum_address
] = encrypted_decryption_request
decryption_request_mapping[ursula_checksum_address] = (
encrypted_decryption_request
)
decryption_client = self._threshold_decryption_client_class(learner=self)
successes, failures = decryption_client.gather_encrypted_decryption_shares(
@@ -857,9 +802,6 @@ class Ursula(Teacher, Character, Operator):
TLSHostingPower
).keypair.certificate
# Only *YOU* can prevent forest fires
self.revoked_policies: Set[bytes] = set()
self.log.info(self.banner.format(self.nickname))
else:
@ -953,11 +895,10 @@ class Ursula(Teacher, Character, Operator):
) -> None:
"""Schedule and start select ursula services, then optionally start the reactor."""
# Connect to Provider
if not BlockchainInterfaceFactory.is_interface_initialized(
endpoint=self.eth_endpoint
):
BlockchainInterfaceFactory.initialize_interface(endpoint=self.eth_endpoint)
BlockchainInterfaceFactory.get_or_create_interface(endpoint=self.eth_endpoint)
BlockchainInterfaceFactory.get_or_create_interface(
endpoint=self.polygon_endpoint
)
if preflight:
self.__preflight()
@ -967,12 +908,12 @@ class Ursula(Teacher, Character, Operator):
#
if emitter:
emitter.message("Starting services", color="yellow")
emitter.message("Starting services...", color="yellow")
if discovery and not self.lonely:
self.start_learning_loop(now=eager)
if emitter:
emitter.message(f"Node Discovery ({self.domain})", color="green")
emitter.message(f"P2P Networking ({self.domain})", color="green")
if ritual_tracking:
self.ritual_tracker.start()
@ -997,8 +938,11 @@ class Ursula(Teacher, Character, Operator):
ursula=self, prometheus_config=prometheus_config
)
if emitter:
prometheus_addr = (
prometheus_config.listen_address or self.rest_interface.host
)
emitter.message(
f"✓ Prometheus Exporter http://{self.rest_interface.host}:"
f"✓ Prometheus Exporter http://{prometheus_addr}:"
f"{prometheus_config.port}/metrics",
color="green",
)
@ -1147,7 +1091,6 @@ class Ursula(Teacher, Character, Operator):
seed_uri = f"{seednode_metadata.checksum_address}@{seednode_metadata.rest_host}:{seednode_metadata.rest_port}"
return cls.from_seed_and_stake_info(seed_uri=seed_uri, *args, **kwargs)
@classmethod
def from_teacher_uri(
cls,

View File

@ -2,10 +2,9 @@ from copy import copy
from unittest import mock
from unittest.mock import Mock, patch
from eth_tester.exceptions import ValidationError
from nucypher_core import NodeMetadata
from nucypher.blockchain.eth.signers.software import Web3Signer
from nucypher.blockchain.eth.signers.software import InMemorySigner
from nucypher.characters.lawful import Alice, Ursula
from nucypher.config.constants import TEMPORARY_DOMAIN_NAME
from nucypher.crypto.powers import CryptoPower
@ -50,8 +49,6 @@ class Vladimir(Ursula):
crypto_power = CryptoPower(power_ups=target_ursula._default_crypto_powerups)
cls.attach_transacting_key(blockchain=eth_blockchain)
# Vladimir does not care about payment.
bogus_pre_payment_method = FreeReencryptions()
bogus_pre_payment_method.provider = Mock()
@ -74,7 +71,7 @@ class Vladimir(Ursula):
network_middleware=cls.network_middleware,
checksum_address=cls.fraud_address,
operator_address=cls.fraud_address,
signer=Web3Signer(eth_blockchain.client),
signer=InMemorySigner(private_key=cls.fraud_key),
eth_endpoint=eth_blockchain.endpoint,
polygon_endpoint=polygon_blockchain.endpoint,
pre_payment_method=bogus_pre_payment_method,
@ -115,22 +112,6 @@ class Vladimir(Ursula):
return vladimir
@classmethod
def attach_transacting_key(cls, blockchain):
"""
Upload Vladimir's ETH keys to the keychain via web3.
"""
try:
password = 'iamverybadass'
blockchain.w3.provider.ethereum_tester.add_account(cls.fraud_key, password=password)
except (ValidationError,):
# check if Vlad's key is already on the keystore...
if cls.fraud_address in blockchain.client.accounts:
return True
else:
raise
return True
class Amonia(Alice):
"""

View File

@ -126,6 +126,10 @@ class UrsulaConfigOptions:
config_class=UrsulaConfiguration,
do_auto_migrate=True,
)
else:
# config file specified
migrate(emitter=emitter, config_file=config_file)
try:
return UrsulaConfiguration.from_configuration_file(
emitter=emitter,
@ -385,6 +389,11 @@ def destroy(general_config, config_options, config_file, force):
default=9101,
type=NETWORK_PORT,
)
@click.option(
"--metrics-listen-address",
help="Run a Prometheus metrics exporter on the specified IP address",
default="",
)
@click.option(
"--metrics-interval",
help="The frequency of metrics collection in seconds (if prometheus enabled)",
@ -403,6 +412,7 @@ def run(
dry_run,
prometheus,
metrics_port,
metrics_listen_address,
metrics_interval,
force,
ip_checkup,
@ -418,7 +428,9 @@ def run(
prometheus_config = None
if prometheus:
prometheus_config = PrometheusMetricsConfig(
port=metrics_port, collection_interval=metrics_interval
port=metrics_port,
listen_address=metrics_listen_address,
collection_interval=metrics_interval,
)
ursula_config, URSULA = character_options.create_character(

View File

@ -23,18 +23,21 @@ class GroupGeneralConfig:
sentry_endpoint = os.environ.get("NUCYPHER_SENTRY_DSN", NUCYPHER_SENTRY_ENDPOINT)
log_to_sentry = get_env_bool("NUCYPHER_SENTRY_LOGS", False)
log_to_file = get_env_bool("NUCYPHER_FILE_LOGS", True)
log_to_json_file = get_env_bool("NUCYPHER_JSON_LOGS", False)
def __init__(self,
json_ipc: bool,
verbose: bool,
quiet: bool,
no_logs: bool,
console_logs: bool,
file_logs: bool,
sentry_logs: bool,
log_level: bool,
debug: bool):
def __init__(
self,
json_ipc: bool,
verbose: bool,
quiet: bool,
no_logs: bool,
console_logs: bool,
file_logs: bool,
json_logs: bool,
sentry_logs: bool,
log_level: bool,
debug: bool,
):
self.log = Logger(self.__class__.__name__)
# Session Emitter for pre and post character control engagement.
@ -66,6 +69,8 @@ class GroupGeneralConfig:
# Defaults
if file_logs is None:
file_logs = self.log_to_file
if json_logs is None:
json_logs = self.log_to_json_file
if sentry_logs is None:
sentry_logs = self.log_to_sentry
@ -78,6 +83,7 @@ class GroupGeneralConfig:
if no_logs:
console_logs = False
file_logs = False
json_logs = False
sentry_logs = False
if json_ipc:
console_logs = False
@ -88,6 +94,7 @@ class GroupGeneralConfig:
GlobalLoggerSettings.start_console_logging()
if file_logs:
GlobalLoggerSettings.start_text_file_logging()
if json_logs:
GlobalLoggerSettings.start_json_file_logging()
if sentry_logs:
GlobalLoggerSettings.start_sentry_logging(self.sentry_endpoint)
@ -116,7 +123,12 @@ group_general_config = group_options(
file_logs=click.option(
'--file-logs/--no-file-logs',
help="Enable/disable logging to file. Defaults to NUCYPHER_FILE_LOGS, or to `--file-logs` if it is not set.",
help="Enable/disable logging to text file. Defaults to NUCYPHER_FILE_LOGS, or to `--file-logs` if it is not set.",
default=None,
),
json_logs=click.option(
"--json-logs/--no-json-logs",
help="Enable/disable logging to a json file. Defaults to NUCYPHER_JSON_LOGS, or to `--no-json-logs` if it is not set.",
default=None),
sentry_logs=click.option(

View File

@ -3,7 +3,6 @@ from typing import List, Optional, Tuple, Union
from eth_account._utils.signing import to_standard_signature_bytes
from eth_typing.evm import ChecksumAddress
from hexbytes import HexBytes
from nucypher_core import (
EncryptedThresholdDecryptionRequest,
EncryptedThresholdDecryptionResponse,
@ -201,7 +200,7 @@ class TransactingPower(CryptoPowerUp):
# from the recovery byte, bringing it to the standard choice of {0, 1}.
return to_standard_signature_bytes(signature)
def sign_transaction(self, transaction_dict: dict) -> HexBytes:
def sign_transaction(self, transaction_dict: dict) -> bytes:
"""Signs the transaction with the private key of the TransactingPower."""
return self._signer.sign_transaction(transaction_dict=transaction_dict)

View File

@ -1,14 +1,16 @@
import datetime
from ipaddress import IPv4Address
from pathlib import Path
from typing import ClassVar, Optional, Tuple
from typing import Optional, Tuple, Type
from cryptography import x509
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.backends.openssl.ec import _EllipticCurvePrivateKey
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.asymmetric.ec import EllipticCurve
from cryptography.hazmat.primitives.asymmetric.ec import (
EllipticCurve,
EllipticCurvePrivateKey,
)
from cryptography.hazmat.primitives.serialization import Encoding
from cryptography.x509 import Certificate
from cryptography.x509.oid import NameOID
@ -31,8 +33,8 @@ def generate_self_signed_certificate(
host: str,
secret_seed: Optional[bytes] = None,
days_valid: int = 365,
curve: ClassVar[EllipticCurve] = _TLS_CURVE,
) -> Tuple[Certificate, _EllipticCurvePrivateKey]:
curve: Type[EllipticCurve] = _TLS_CURVE,
) -> Tuple[Certificate, EllipticCurvePrivateKey]:
if secret_seed:
private_bn = int.from_bytes(secret_seed[: _TLS_CURVE.key_size // 8], "big")
private_key = ec.derive_private_key(private_value=private_bn, curve=curve())

View File

@ -39,7 +39,7 @@ def secure_random(num_bytes: int) -> bytes:
def secure_random_range(min: int, max: int) -> int:
"""
Returns a number from a secure random source betwee the range of
Returns a number from a secure random source between the range of
`min` and `max` - 1.
:param min: Minimum number in the range
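The contract documented here is a half-open range: the result lies in `[min, max - 1]`. A stdlib sketch of that contract using the `secrets` module (the project's own implementation may differ; the function name mirrors the one above):

```python
import secrets

def secure_random_range(minimum: int, maximum: int) -> int:
    # secrets.randbelow(k) returns an int in [0, k - 1] from a CSPRNG,
    # so shifting by `minimum` yields a value in [minimum, maximum - 1].
    return minimum + secrets.randbelow(maximum - minimum)
```

For example, `secure_random_range(10, 20)` never returns 20, and a one-wide range such as `secure_random_range(5, 6)` always returns 5.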

View File

@ -1,82 +1,77 @@
from collections import defaultdict
from typing import List, Optional
from hexbytes import HexBytes
from atxm.tx import AsyncTx
from nucypher_core.ferveo import (
Validator,
)
from nucypher.blockchain.eth.models import Coordinator
from nucypher.blockchain.eth.models import PHASE1, Coordinator
from nucypher.types import PhaseId
class DKGStorage:
"""A simple in-memory storage for DKG data"""
# round 1
KEY_TRANSCRIPT_TXS = "transcript_tx_hashes"
KEY_VALIDATORS = "validators"
_KEY_PHASE_1_TXS = "phase_1_txs"
_KEY_VALIDATORS = "validators"
# round 2
KEY_AGGREGATED_TXS = "aggregation_tx_hashes"
_KEY_PHASE_2_TXS = "phase_2_txs"
# active rituals
KEY_ACTIVE_RITUAL = "active_rituals"
_KEY_ACTIVE_RITUAL = "active_rituals"
_KEYS = [
KEY_TRANSCRIPT_TXS,
KEY_VALIDATORS,
KEY_AGGREGATED_TXS,
KEY_ACTIVE_RITUAL,
_KEY_PHASE_1_TXS,
_KEY_VALIDATORS,
_KEY_PHASE_2_TXS,
_KEY_ACTIVE_RITUAL,
]
def __init__(self):
self.data = defaultdict(dict)
self._data = defaultdict(dict)
def clear(self, ritual_id):
for key in self._KEYS:
try:
del self.data[key][ritual_id]
del self._data[key][ritual_id]
except KeyError:
continue
#
# DKG Round 1 - Transcripts
# DKG Phases
#
def store_transcript_txhash(self, ritual_id: int, txhash: HexBytes) -> None:
self.data[self.KEY_TRANSCRIPT_TXS][ritual_id] = txhash
@classmethod
def __get_phase_key(cls, phase: int):
if phase == PHASE1:
return cls._KEY_PHASE_1_TXS
return cls._KEY_PHASE_2_TXS
def clear_transcript_txhash(self, ritual_id: int, txhash: HexBytes) -> bool:
if self.get_transcript_txhash(ritual_id) == txhash:
del self.data[self.KEY_TRANSCRIPT_TXS][ritual_id]
def store_ritual_phase_async_tx(self, phase_id: PhaseId, async_tx: AsyncTx):
key = self.__get_phase_key(phase_id.phase)
self._data[key][phase_id.ritual_id] = async_tx
def clear_ritual_phase_async_tx(self, phase_id: PhaseId, async_tx: AsyncTx) -> bool:
key = self.__get_phase_key(phase_id.phase)
if self._data[key].get(phase_id.ritual_id) is async_tx:
del self._data[key][phase_id.ritual_id]
return True
return False
def get_transcript_txhash(self, ritual_id: int) -> Optional[HexBytes]:
return self.data[self.KEY_TRANSCRIPT_TXS].get(ritual_id)
def get_ritual_phase_async_tx(self, phase_id: PhaseId) -> Optional[AsyncTx]:
key = self.__get_phase_key(phase_id.phase)
return self._data[key].get(phase_id.ritual_id)
def store_validators(self, ritual_id: int, validators: List[Validator]) -> None:
self.data[self.KEY_VALIDATORS][ritual_id] = list(validators)
self._data[self._KEY_VALIDATORS][ritual_id] = list(validators)
def get_validators(self, ritual_id: int) -> Optional[List[Validator]]:
validators = self.data[self.KEY_VALIDATORS].get(ritual_id)
validators = self._data[self._KEY_VALIDATORS].get(ritual_id)
if not validators:
return None
return list(validators)
#
# DKG Round 2 - Aggregation
#
def store_aggregation_txhash(self, ritual_id: int, txhash: HexBytes) -> None:
self.data[self.KEY_AGGREGATED_TXS][ritual_id] = txhash
def clear_aggregated_txhash(self, ritual_id: int, txhash: HexBytes) -> bool:
if self.get_aggregation_txhash(ritual_id) == txhash:
del self.data[self.KEY_AGGREGATED_TXS][ritual_id]
return True
return False
def get_aggregation_txhash(self, ritual_id: int) -> Optional[HexBytes]:
return self.data[self.KEY_AGGREGATED_TXS].get(ritual_id)
#
# Active Rituals
#
@ -84,7 +79,7 @@ class DKGStorage:
if active_ritual.total_aggregations != active_ritual.dkg_size:
# safeguard against a non-active ritual being cached
raise ValueError("Only active rituals can be cached")
self.data[self.KEY_ACTIVE_RITUAL][active_ritual.id] = active_ritual
self._data[self._KEY_ACTIVE_RITUAL][active_ritual.id] = active_ritual
def get_active_ritual(self, ritual_id: int) -> Optional[Coordinator.Ritual]:
return self.data[self.KEY_ACTIVE_RITUAL].get(ritual_id)
return self._data[self._KEY_ACTIVE_RITUAL].get(ritual_id)
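The refactor above collapses the separate "transcript" and "aggregation" transaction maps into storage keyed by `(ritual_id, phase)`, and `clear_ritual_phase_async_tx` only evicts an entry when the caller still holds the *same* object. A self-contained sketch of that pattern (the `PhaseTxStore` name and the plain-object stand-in for `AsyncTx` are illustrative):

```python
from collections import defaultdict
from typing import NamedTuple, Optional

PHASE1, PHASE2 = 1, 2

class PhaseId(NamedTuple):
    ritual_id: int
    phase: int

class PhaseTxStore:
    """Sketch of the phase-keyed storage pattern used by DKGStorage."""

    def __init__(self):
        self._data = defaultdict(dict)

    @staticmethod
    def _key(phase: int) -> str:
        return "phase_1_txs" if phase == PHASE1 else "phase_2_txs"

    def store(self, pid: PhaseId, tx: object) -> None:
        self._data[self._key(pid.phase)][pid.ritual_id] = tx

    def get(self, pid: PhaseId) -> Optional[object]:
        return self._data[self._key(pid.phase)].get(pid.ritual_id)

    def clear(self, pid: PhaseId, tx: object) -> bool:
        # Identity check mirrors clear_ritual_phase_async_tx: only the
        # holder of the live tx object may evict it.
        if self._data[self._key(pid.phase)].get(pid.ritual_id) is tx:
            del self._data[self._key(pid.phase)][pid.ritual_id]
            return True
        return False

store = PhaseTxStore()
pid = PhaseId(ritual_id=0, phase=PHASE1)
tx = object()
store.store(pid, tx)
```

The identity check (`is`, not `==`) guards against a stale caller clearing a transaction that has since been replaced.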

View File

@ -223,15 +223,6 @@ class RestMiddleware:
def __init__(self, eth_endpoint: str, registry=None):
self.client = self._client_class(registry=registry, eth_endpoint=eth_endpoint)
def request_revocation(self, ursula, revocation):
# TODO: Implement offchain revocation #2787
response = self.client.post(
node_or_sprout=ursula,
path="revoke",
data=bytes(revocation),
)
return response
def reencrypt(
self,
ursula: "characters.lawful.Ursula",

View File

@ -1133,7 +1133,7 @@ class Teacher:
if self._staking_provider_is_really_staking(
registry=registry, eth_endpoint=eth_endpoint
): # <-- Blockchain CALL
self.log.info(f"Verified operator {self}")
self.log.debug(f"Verified operator {self}")
self.verified_operator = True
else:
raise self.NotStaking(f"{self.checksum_address} is not staking")

View File

@ -213,11 +213,10 @@ class PRERetrievalClient(ThresholdAccessControlClient):
self.log.info(message)
raise RuntimeError(message) from e
except middleware.NotFound as e:
# This Ursula claims not to have a matching KFrag. Maybe this has been revoked?
# This Ursula claims not to have a matching KFrag.
# TODO: What's the thing to do here?
# Do we want to track these Ursulas in some way in case they're lying? #567
message = (f"Ursula ({ursula}) claims not to not know of the policy {reencryption_request.hrac}. "
f"Has access been revoked?")
message = f"Ursula ({ursula}) claims not to know of the policy {reencryption_request.hrac}."
self.log.warn(message)
raise RuntimeError(message) from e
except middleware.UnexpectedResponse:

View File

@ -214,11 +214,6 @@ def _make_rest_app(this_node, log: Logger) -> Flask:
bob = Bob.from_public_keys(verifying_key=reenc_request.bob_verifying_key)
log.info(f"Reencryption request from {bob} for policy {hrac}")
# TODO: Can this be integrated into reencryption conditions?
# Stateful revocation by HRAC storage below
if hrac in this_node.revoked_policies:
return Response(response=f"Policy with {hrac} has been revoked.", status=HTTPStatus.UNAUTHORIZED)
# Alice or Publisher
publisher_verifying_key = reenc_request.publisher_verifying_key
@ -281,11 +276,6 @@ def _make_rest_app(this_node, log: Logger) -> Flask:
headers = {'Content-Type': 'application/octet-stream'}
return Response(headers=headers, response=bytes(response))
@rest_app.route('/revoke', methods=['POST'])
def revoke():
# TODO: Implement off-chain revocation.
return Response(status=HTTPStatus.OK)
@rest_app.route("/ping", methods=['GET'])
def ping():
"""Asks this node: What is my public IPv4 address?"""

View File

@ -49,7 +49,7 @@ _CONDITION_CHAINS = {
1: "ethereum/mainnet",
11155111: "ethereum/sepolia",
137: "polygon/mainnet",
80001: "polygon/mumbai",
80002: "polygon/amoy",
# TODO: Permit support for these chains
# 100: "gnosis/mainnet",
# 10200: "gnosis/chiado",

View File

@ -13,6 +13,7 @@ def make_staking_provider_reservoir(
exclude_addresses: Optional[Iterable[ChecksumAddress]] = None,
include_addresses: Optional[Iterable[ChecksumAddress]] = None,
pagination_size: Optional[int] = None,
duration: Optional[int] = 0,
):
"""Get a sampler object containing the currently registered staking providers."""
@ -21,7 +22,9 @@ def make_staking_provider_reservoir(
include_addresses = include_addresses or ()
without_set = set(include_addresses) | set(exclude_addresses or ())
try:
reservoir = application_agent.get_staking_provider_reservoir(without=without_set, pagination_size=pagination_size)
reservoir = application_agent.get_staking_provider_reservoir(
without=without_set, pagination_size=pagination_size, duration=duration
)
except StakerSamplingApplicationAgent.NotEnoughStakingProviders:
# TODO: do that in `get_staking_provider_reservoir()`?
reservoir = StakingProvidersReservoir({})

View File

@ -1,9 +1,15 @@
from typing import NewType, TypeVar
from nucypher.blockchain.eth import agents
from typing import NamedTuple, NewType, TypeVar
ERC20UNits = NewType("ERC20UNits", int)
NuNits = NewType("NuNits", ERC20UNits)
TuNits = NewType("TuNits", ERC20UNits)
Agent = TypeVar("Agent", bound="agents.EthereumContractAgent")
Agent = TypeVar("Agent", bound="agents.EthereumContractAgent") # noqa: F821
RitualId = int
PhaseNumber = int
class PhaseId(NamedTuple):
ritual_id: RitualId
phase: PhaseNumber

View File

@ -1,33 +1,16 @@
import os
from functools import partial
from typing import Callable
import click
from nucypher.utilities.logging import Logger
def null_stream():
return open(os.devnull, 'w')
class StdoutEmitter:
class MethodNotFound(BaseException):
"""Cannot find interface method to handle request"""
transport_serializer = str
default_color = 'white'
# sys.stdout.write() TODO: doesn't work well with click_runner's output capture
default_sink_callable = partial(print, flush=True)
def __init__(self,
sink: Callable = None,
verbosity: int = 1):
self.name = self.__class__.__name__.lower()
self.sink = sink or self.default_sink_callable
self.verbosity = verbosity
self.log = Logger(self.name)
@ -41,7 +24,12 @@ class StdoutEmitter:
bold: bool = False,
verbosity: int = 1):
self.echo(message, color=color or self.default_color, bold=bold, verbosity=verbosity)
self.log.debug(message)
# these are application messages that are desired to be
# printed to stdout (with or w/o console logging); send to logger
if verbosity > 1:
self.log.debug(message)
else:
self.log.info(message)
def echo(self,
message: str = None,
@ -49,21 +37,18 @@ class StdoutEmitter:
bold: bool = False,
nl: bool = True,
verbosity: int = 0):
# these are user interactions; don't send to logger
if verbosity <= self.verbosity:
click.secho(message=message, fg=color or self.default_color, bold=bold, nl=nl)
def banner(self, banner):
# these are purely for banners; don't send to logger
if self.verbosity >= 1:
click.echo(banner)
def error(self, e):
e_str = str(e)
if self.verbosity >= 1:
e_str = str(e)
click.echo(message=e_str, color="red")
self.log.info(e_str)
def get_stream(self, verbosity: int = 0):
if verbosity <= self.verbosity:
return click.get_text_stream('stdout')
else:
return null_stream()
# some kind of error; send to logger
self.log.error(e_str)

View File

@ -1,16 +1,17 @@
import pathlib
import sys
from contextlib import contextmanager
from enum import Enum
from typing import Callable
from twisted.logger import (
FileLogObserver,
LogEvent,
LogLevel,
formatEvent,
formatEventAsClassicLogText,
globalLogPublisher,
jsonFileLogObserver,
textFileLogObserver,
)
from twisted.logger import Logger as TwistedLogger
from twisted.python.logfile import LogFile
@ -69,58 +70,97 @@ class GlobalLoggerSettings:
log_level = LogLevel.levelWithName("info")
_json_ipc = False # TODO: Oh no... #1754
_observers = dict()
class LoggingType(Enum):
CONSOLE = "console"
TEXT = "text"
JSON = "json"
SENTRY = "sentry"
@classmethod
def set_log_level(cls, log_level_name):
cls.log_level = LogLevel.levelWithName(log_level_name)
@classmethod
def _stop_logging(cls, logging_type: LoggingType):
observer = cls._observers.pop(logging_type, None)
if observer:
globalLogPublisher.removeObserver(observer)
@classmethod
def _is_already_configured(cls, logging_type: LoggingType) -> bool:
return logging_type in cls._observers
@classmethod
def _start_logging(cls, logging_type: LoggingType):
if cls._is_already_configured(logging_type):
# no-op
return
if logging_type == cls.LoggingType.CONSOLE:
observer = textFileLogObserver(sys.stdout)
elif logging_type == cls.LoggingType.TEXT:
observer = get_text_file_observer()
elif logging_type == cls.LoggingType.JSON:
observer = get_json_file_observer()
else:
# sentry
observer = sentry_observer
# wrap to adhere to log level since other loggers rely on observer to differentiate
wrapped_observer = observer_log_level_wrapper(observer)
globalLogPublisher.addObserver(wrapped_observer)
cls._observers[logging_type] = wrapped_observer
@classmethod
def start_console_logging(cls):
globalLogPublisher.addObserver(console_observer)
cls._start_logging(cls.LoggingType.CONSOLE)
@classmethod
def stop_console_logging(cls):
globalLogPublisher.removeObserver(console_observer)
@classmethod
@contextmanager
def pause_all_logging_while(cls):
former_observers = tuple(globalLogPublisher._observers)
for observer in former_observers:
globalLogPublisher.removeObserver(observer)
yield
for observer in former_observers:
globalLogPublisher.addObserver(observer)
cls._stop_logging(cls.LoggingType.CONSOLE)
@classmethod
def start_text_file_logging(cls):
globalLogPublisher.addObserver(get_text_file_observer())
cls._start_logging(cls.LoggingType.TEXT)
@classmethod
def stop_text_file_logging(cls):
globalLogPublisher.removeObserver(get_text_file_observer())
cls._stop_logging(cls.LoggingType.TEXT)
@classmethod
def start_json_file_logging(cls):
globalLogPublisher.addObserver(get_json_file_observer())
cls._start_logging(cls.LoggingType.JSON)
@classmethod
def stop_json_file_logging(cls):
globalLogPublisher.removeObserver(get_json_file_observer())
cls._stop_logging(cls.LoggingType.JSON)
@classmethod
def start_sentry_logging(cls, dsn: str):
_SentryInitGuard.init(dsn)
globalLogPublisher.addObserver(sentry_observer)
cls._start_logging(cls.LoggingType.SENTRY)
@classmethod
def stop_sentry_logging(cls):
globalLogPublisher.removeObserver(sentry_observer)
cls._stop_logging(cls.LoggingType.SENTRY)
@classmethod
@contextmanager
def pause_all_logging_while(cls):
all_former_observers = tuple(globalLogPublisher._observers)
former_global_observers = dict(cls._observers)
for observer in all_former_observers:
globalLogPublisher.removeObserver(observer)
cls._observers.clear()
def console_observer(event):
if event['log_level'] >= GlobalLoggerSettings.log_level:
print(formatEvent(event))
yield
cls._observers.clear()
for observer in all_former_observers:
globalLogPublisher.addObserver(observer)
cls._observers.update(former_global_observers)
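The `_start_logging`/`_stop_logging` machinery above keeps at most one observer per logging type in a registry, making repeated start calls no-ops and letting stop remove by type. A self-contained sketch of that registry pattern (the `ObserverRegistry` class and the list stand-in for `globalLogPublisher` are illustrative):

```python
from typing import Callable, Dict, List

class ObserverRegistry:
    """Idempotent start/stop of observers, keyed by logging type."""

    def __init__(self):
        self._observers: Dict[str, Callable] = {}
        self._publisher: List[Callable] = []  # stand-in for globalLogPublisher

    def start(self, logging_type: str, observer: Callable) -> None:
        if logging_type in self._observers:
            return  # already configured: no-op, never double-register
        self._publisher.append(observer)
        self._observers[logging_type] = observer

    def stop(self, logging_type: str) -> None:
        observer = self._observers.pop(logging_type, None)
        if observer:
            self._publisher.remove(observer)

registry = ObserverRegistry()
registry.start("console", print)
registry.start("console", print)  # second call is a no-op
```

Registering through a type-keyed dict is what allows `stop_console_logging` to remove exactly the wrapped observer that `start_console_logging` added.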
class _SentryInitGuard:
@ -186,5 +226,14 @@ class Logger(TwistedLogger):
return escaped_string
def emit(self, level, format=None, **kwargs):
clean_format = self.escape_format_string(str(format))
super().emit(level=level, format=clean_format, **kwargs)
if level >= GlobalLoggerSettings.log_level:
clean_format = self.escape_format_string(str(format))
super().emit(level=level, format=clean_format, **kwargs)
def observer_log_level_wrapper(observer: Callable[[LogEvent], None]):
def log_level_wrapper(event: LogEvent):
if event["log_level"] >= GlobalLoggerSettings.log_level:
observer(event)
return log_level_wrapper
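`observer_log_level_wrapper` above is a plain closure that drops events below the global level before they reach the wrapped observer. A self-contained sketch of the same filtering decorator (integer levels stand in for twisted's `LogLevel`, and `GLOBAL_LEVEL` is an assumed module constant):

```python
from typing import Callable, List

# Integer stand-ins for twisted.logger.LogLevel values (assumption).
DEBUG, INFO, WARN = 10, 20, 30
GLOBAL_LEVEL = INFO

def level_filtered(observer: Callable[[dict], None]) -> Callable[[dict], None]:
    """Wrap an observer so it only sees events at or above GLOBAL_LEVEL."""
    def wrapper(event: dict) -> None:
        if event["log_level"] >= GLOBAL_LEVEL:
            observer(event)
    return wrapper

seen: List[dict] = []
observer = level_filtered(seen.append)
observer({"log_level": DEBUG, "msg": "dropped"})
observer({"log_level": WARN, "msg": "kept"})
```

Filtering in the wrapper, rather than in each observer, keeps the level comparison in one place for console, text, json, and sentry alike.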

View File

@ -9,44 +9,44 @@ from nucypher.utilities.logging import Logger
class SimpleTask(ABC):
"""Simple Twisted Looping Call abstract base class."""
INTERVAL = 60 # 60s default
INTERVAL = NotImplemented
CLOCK = reactor
def __init__(self, interval: float = None):
self.interval = interval or self.INTERVAL
self.log = Logger(self.__class__.__name__)
self._task = LoopingCall(self.run)
# self.__task.clock = self.CLOCK
self._task.clock = self.CLOCK
@property
def running(self) -> bool:
"""Determine whether the task is already running."""
return self._task.running
def start(self, now: bool = False):
def start(self, now: bool = False) -> None:
"""Start task."""
if not self.running:
d = self._task.start(interval=self.interval, now=now)
d.addErrback(self.handle_errors)
# return d
def stop(self):
def stop(self) -> None:
"""Stop task."""
if self.running:
self._task.stop()
@abstractmethod
def run(self):
def run(self) -> None:
"""Task method that should be periodically run."""
raise NotImplementedError
@abstractmethod
def handle_errors(self, *args, **kwargs):
def handle_errors(self, *args, **kwargs) -> None:
"""Error callback for error handling during execution."""
raise NotImplementedError
@staticmethod
def clean_traceback(failure: Failure) -> str:
# FIXME: Amazing.
cleaned_traceback = failure.getTraceback().replace('{', '').replace('}', '')
cleaned_traceback = failure.getTraceback().replace("{", "").replace("}", "")
return cleaned_traceback
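`SimpleTask` wraps twisted's `LoopingCall` behind a start/stop/run interface. A stdlib analog of the same interval-task pattern, using `threading.Timer` instead of the twisted reactor (purely illustrative; the real class schedules on the reactor clock):

```python
import threading
from abc import ABC, abstractmethod
from typing import Optional

class SimpleIntervalTask(ABC):
    """Stdlib sketch of the SimpleTask pattern: subclass and implement run()."""

    INTERVAL: float = 60.0

    def __init__(self, interval: Optional[float] = None):
        self.interval = interval or self.INTERVAL
        self._timer: Optional[threading.Timer] = None
        self._running = False

    @property
    def running(self) -> bool:
        return self._running

    def start(self, now: bool = False) -> None:
        if not self._running:
            self._running = True
            self._tick() if now else self._schedule()

    def stop(self) -> None:
        self._running = False
        if self._timer is not None:
            self._timer.cancel()

    def _schedule(self) -> None:
        self._timer = threading.Timer(self.interval, self._tick)
        self._timer.daemon = True
        self._timer.start()

    def _tick(self) -> None:
        if not self._running:
            return
        try:
            self.run()
        except Exception as e:  # mirror handle_errors as the errback
            self.handle_errors(e)
        self._schedule()

    @abstractmethod
    def run(self) -> None:
        """Task method that should be periodically run."""

    def handle_errors(self, error) -> None:
        """Error callback; override to log or re-raise."""

class _CounterTask(SimpleIntervalTask):
    INTERVAL = 60.0

    def __init__(self):
        super().__init__()
        self.count = 0

    def run(self) -> None:
        self.count += 1

task = _CounterTask()
task.start(now=True)  # runs once immediately, then every 60s
task.stop()
```

As in the original, `start(now=True)` fires the first run synchronously, and `stop()` is safe to call whether or not a next run is pending.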

poetry.lock generated

File diff suppressed because it is too large

View File

@ -1,6 +1,44 @@
[build-system]
requires = ["setuptools", "setuptools-rust", "wheel"]
build-backend = "setuptools.build_meta"
[tool.poetry]
name = "nucypher"
version = "7.3.0"
authors = ["NuCypher"]
description = "A threshold access control application to empower privacy in decentralized systems."
[tool.poetry.dependencies]
python = ">=3.8,<4"
nucypher-core = "==0.13.0"
cryptography = "*"
pynacl = ">=1.4.0"
mnemonic = "*"
pyopenssl = "*"
web3 = '^6.15.1'
atxm = "*"
flask = "*"
hendrix = "*"
requests = "*"
maya = '*'
mako = "*"
click = '*'
colorama = '*'
tabulate = '*'
marshmallow = '*'
appdirs = '*'
constant-sorrow = '^0.1.0a9'
prometheus-client = '*'
time-machine = "^2.13.0"
twisted = "^24.2.0rc1"
[tool.poetry.dev-dependencies]
pytest = '<7'
pytest-cov = '*'
pytest-mock = '*'
pytest-timeout = '*'
pytest-twisted = '*'
eth-ape = "*"
ape-solidity = '*'
coverage = '^7.3.2'
pre-commit = '^2.12.1'
[tool.towncrier]
package = "nucypher"

View File

@ -1,91 +1,98 @@
-i https://pypi.python.org/simple
aiohttp==3.9.1; python_version >= '3.8'
aiosignal==1.3.1; python_version >= '3.7'
appdirs==1.4.4
attrs==23.2.0; python_version >= '3.7'
autobahn==23.6.2; python_version >= '3.9'
automat==22.10.0
bitarray==2.9.2
blinker==1.7.0; python_version >= '3.8'
bytestring-splitter==2.4.1
certifi==2023.11.17; python_version >= '3.6'
cffi==1.16.0; python_version >= '3.8'
charset-normalizer==3.3.2; python_full_version >= '3.7.0'
click==8.1.7; python_version >= '3.7'
colorama==0.4.6; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5, 3.6'
constant-sorrow==0.1.0a9; python_version >= '3'
constantly==23.10.4; python_version >= '3.8'
cryptography==41.0.7; python_version >= '3.7'
cytoolz==0.12.2; implementation_name == 'cpython'
dateparser==1.2.0; python_version >= '3.7'
eth-abi==4.2.1; python_version < '4' and python_full_version >= '3.7.2'
eth-account==0.8.0; python_version >= '3.6' and python_version < '4'
eth-hash[pycryptodome]==0.6.0; python_version >= '3.8' and python_version < '4'
eth-keyfile==0.6.1
eth-keys==0.4.0
eth-rlp==0.3.0; python_version >= '3.7' and python_version < '4'
eth-tester==0.9.1b1; python_version < '4' and python_full_version >= '3.6.8'
eth-typing==3.5.2; python_version < '4' and python_full_version >= '3.7.2'
eth-utils==2.3.1; python_version >= '3.7' and python_version < '4'
flask==3.0.0; python_version >= '3.8'
frozenlist==1.4.1; python_version >= '3.8'
hendrix==4.0.0
hexbytes==0.3.1; python_version >= '3.7' and python_version < '4'
humanize==4.9.0; python_version >= '3.8'
hyperlink==21.0.0
idna==3.6; python_version >= '3.5'
incremental==22.10.0
itsdangerous==2.1.2; python_version >= '3.7'
jinja2==3.1.3; python_version >= '3.7'
jsonschema==4.20.0; python_version >= '3.8'
jsonschema-specifications==2023.12.1; python_version >= '3.8'
lru-dict==1.2.0
mako==1.3.0; python_version >= '3.8'
markupsafe==2.1.3; python_version >= '3.7'
marshmallow==3.20.2; python_version >= '3.8'
maya==0.6.1
mnemonic==0.21; python_full_version >= '3.8.1'
msgpack==1.0.7; python_version >= '3.8'
msgpack-python==0.5.6
multidict==6.0.4; python_version >= '3.7'
nucypher-core==0.13.0
packaging==23.2; python_version >= '3.7'
parsimonious==0.9.0
pendulum==3.0.0; python_version >= '3.8'
prometheus-client==0.19.0; python_version >= '3.8'
protobuf==4.25.2; python_version >= '3.8'
pyasn1==0.5.1; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5'
pyasn1-modules==0.3.0; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5'
pychalk==2.0.1
pycparser==2.21
pycryptodome==3.20.0
pynacl==1.5.0; python_version >= '3.6'
pyopenssl==23.3.0; python_version >= '3.7'
python-dateutil==2.8.2; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'
pytz==2023.3.post1
pyunormalize==15.1.0; python_version >= '3.6'
referencing==0.32.1; python_version >= '3.8'
regex==2023.12.25; python_version >= '3.7'
requests==2.31.0; python_version >= '3.7'
rlp==3.0.0
rpds-py==0.16.2; python_version >= '3.8'
semantic-version==2.10.0; python_version >= '2.7'
service-identity==23.1.0; python_version >= '3.8'
setuptools==69.0.3; python_version >= '3.8'
six==1.16.0; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'
snaptime==0.2.4
tabulate==0.9.0; python_version >= '3.7'
time-machine==2.13.0; implementation_name != 'pypy'
toolz==0.12.0; python_version >= '3.5'
twisted==23.10.0; python_full_version >= '3.8.0'
txaio==23.1.1; python_version >= '3.7'
typing-extensions==4.9.0; python_version >= '3.8'
tzdata==2023.4; python_version >= '2'
tzlocal==5.2; python_version >= '3.8'
urllib3==1.26.18; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5'
watchdog==3.0.0; python_version >= '3.7'
web3==6.14.0; python_full_version >= '3.7.2'
websockets==12.0; python_version >= '3.8'
werkzeug==3.0.1; python_version >= '3.8'
yarl==1.9.4; python_version >= '3.7'
zope-interface==6.1; python_version >= '3.7'
aiohttp==3.9.4rc0 ; python_version >= "3.8" and python_version < "4"
aiosignal==1.3.1 ; python_version >= "3.8" and python_version < "4"
appdirs==1.4.4 ; python_version >= "3.8" and python_version < "4"
async-timeout==4.0.3 ; python_version >= "3.8" and python_version < "3.11"
attrs==23.2.0 ; python_version >= "3.8" and python_version < "4"
atxm==0.3.0 ; python_version >= "3.8" and python_version < "4"
autobahn==23.1.2 ; python_version >= "3.8" and python_version < "4"
automat==22.10.0 ; python_version >= "3.8" and python_version < "4"
backports-zoneinfo==0.2.1 ; python_version >= "3.8" and python_version < "3.9"
bitarray==2.9.2 ; python_version >= "3.8" and python_version < "4"
blinker==1.7.0 ; python_version >= "3.8" and python_version < "4"
bytestring-splitter==2.4.1 ; python_version >= "3.8" and python_version < "4"
certifi==2024.2.2 ; python_version >= "3.8" and python_version < "4"
cffi==1.16.0 ; python_version >= "3.8" and python_version < "4"
charset-normalizer==3.3.2 ; python_version >= "3.8" and python_version < "4"
click==8.1.7 ; python_version >= "3.8" and python_version < "4"
colorama==0.4.6 ; python_version >= "3.8" and python_version < "4"
constant-sorrow==0.1.0a9 ; python_version >= "3.8" and python_version < "4"
constantly==23.10.4 ; python_version >= "3.8" and python_version < "4"
cryptography==42.0.5 ; python_version >= "3.8" and python_version < "4"
cytoolz==0.12.3 ; python_version >= "3.8" and python_version < "4" and implementation_name == "cpython"
dateparser==1.2.0 ; python_version >= "3.8" and python_version < "4"
eth-abi==4.2.1 ; python_version >= "3.8" and python_version < "4"
eth-account==0.10.0 ; python_version >= "3.8" and python_version < "4"
eth-hash==0.7.0 ; python_version >= "3.8" and python_version < "4"
eth-hash[pycryptodome]==0.7.0 ; python_version >= "3.8" and python_version < "4"
eth-keyfile==0.8.0 ; python_version >= "3.8" and python_version < "4"
eth-keys==0.4.0 ; python_version >= "3.8" and python_version < "4"
eth-rlp==1.0.1 ; python_version >= "3.8" and python_version < "4"
eth-typing==3.5.2 ; python_version >= "3.8" and python_version < "4"
eth-utils==2.3.1 ; python_version >= "3.8" and python_version < "4"
flask==3.0.3 ; python_version >= "3.8" and python_version < "4"
frozenlist==1.4.1 ; python_version >= "3.8" and python_version < "4"
hendrix==5.0.0 ; python_version >= "3.8" and python_version < "4"
hexbytes==0.3.1 ; python_version >= "3.8" and python_version < "4"
humanize==4.9.0 ; python_version >= "3.8" and python_version < "4"
hyperlink==21.0.0 ; python_version >= "3.8" and python_version < "4"
idna==3.7 ; python_version >= "3.8" and python_version < "4"
importlib-metadata==7.1.0 ; python_version >= "3.8" and python_version < "3.10"
importlib-resources==6.4.0 ; python_version >= "3.8" and python_version < "3.9"
incremental==22.10.0 ; python_version >= "3.8" and python_version < "4"
itsdangerous==2.1.2 ; python_version >= "3.8" and python_version < "4"
jinja2==3.1.3 ; python_version >= "3.8" and python_version < "4"
jsonschema-specifications==2023.12.1 ; python_version >= "3.8" and python_version < "4"
jsonschema==4.21.1 ; python_version >= "3.8" and python_version < "4"
lru-dict==1.2.0 ; python_version >= "3.8" and python_version < "4"
mako==1.3.3 ; python_version >= "3.8" and python_version < "4"
markupsafe==2.1.5 ; python_version >= "3.8" and python_version < "4"
marshmallow==3.21.1 ; python_version >= "3.8" and python_version < "4"
maya==0.6.1 ; python_version >= "3.8" and python_version < "4"
mnemonic==0.20 ; python_version >= "3.8" and python_version < "4"
msgpack-python==0.5.6 ; python_version >= "3.8" and python_version < "4"
multidict==6.0.5 ; python_version >= "3.8" and python_version < "4"
nucypher-core==0.13.0 ; python_version >= "3.8" and python_version < "4"
packaging==23.2 ; python_version >= "3.8" and python_version < "4"
parsimonious==0.9.0 ; python_version >= "3.8" and python_version < "4"
pendulum==3.0.0 ; python_version >= "3.8" and python_version < "4"
pkgutil-resolve-name==1.3.10 ; python_version >= "3.8" and python_version < "3.9"
prometheus-client==0.20.0 ; python_version >= "3.8" and python_version < "4"
protobuf==5.26.1 ; python_version >= "3.8" and python_version < "4"
pyasn1-modules==0.4.0 ; python_version >= "3.8" and python_version < "4"
pyasn1==0.6.0 ; python_version >= "3.8" and python_version < "4"
pychalk==2.0.1 ; python_version >= "3.8" and python_version < "4"
pycparser==2.22 ; python_version >= "3.8" and python_version < "4"
pycryptodome==3.20.0 ; python_version >= "3.8" and python_version < "4"
pynacl==1.5.0 ; python_version >= "3.8" and python_version < "4"
pyopenssl==24.1.0 ; python_version >= "3.8" and python_version < "4"
python-dateutil==2.9.0.post0 ; python_version >= "3.8" and python_version < "4"
python-statemachine==2.1.2 ; python_version >= "3.8" and python_version < "3.13"
pytz==2024.1 ; python_version >= "3.8" and python_version < "4"
pyunormalize==15.1.0 ; python_version >= "3.8" and python_version < "4"
pywin32==306 ; python_version >= "3.8" and python_version < "4" and platform_system == "Windows"
referencing==0.34.0 ; python_version >= "3.8" and python_version < "4"
regex==2023.12.25 ; python_version >= "3.8" and python_version < "4"
requests==2.31.0 ; python_version >= "3.8" and python_version < "4"
rlp==3.0.0 ; python_version >= "3.8" and python_version < "4"
rpds-py==0.18.0 ; python_version >= "3.8" and python_version < "4"
service-identity==24.1.0 ; python_version >= "3.8" and python_version < "4"
setuptools==69.2.0 ; python_version >= "3.8" and python_version < "4"
six==1.16.0 ; python_version >= "3.8" and python_version < "4"
snaptime==0.2.4 ; python_version >= "3.8" and python_version < "4"
tabulate==0.9.0 ; python_version >= "3.8" and python_version < "4"
time-machine==2.14.1 ; python_version >= "3.8" and python_version < "4"
toolz==0.12.1 ; python_version >= "3.8" and python_version < "4"
twisted-iocpsupport==1.0.4 ; python_version >= "3.8" and python_version < "4" and platform_system == "Windows"
twisted==24.3.0 ; python_version >= "3.8" and python_version < "4"
txaio==23.1.1 ; python_version >= "3.8" and python_version < "4"
typing-extensions==4.11.0 ; python_version >= "3.8" and python_version < "4"
tzdata==2024.1 ; python_version >= "3.8" and python_version < "4"
tzlocal==5.2 ; python_version >= "3.8" and python_version < "4"
urllib3==2.2.0 ; python_version >= "3.8" and python_version < "4"
watchdog==3.0.0 ; python_version >= "3.8" and python_version < "4"
web3==6.15.1 ; python_version >= "3.8" and python_version < "4"
websockets==12.0 ; python_version >= "3.8" and python_version < "4"
werkzeug==3.0.2 ; python_version >= "3.8" and python_version < "4"
yarl==1.9.4 ; python_version >= "3.8" and python_version < "4"
zipp==3.18.1 ; python_version >= "3.8" and python_version < "3.10"
zope-interface==6.2 ; python_version >= "3.8" and python_version < "4"


@ -1,40 +0,0 @@
#!/usr/bin/env bash
set -e
# Update lock and build requirements files.
# Use option -k to keep the Pipfile.lock file, so the process is deterministic with respect to that file.
yes | ./scripts/dependencies/relock_dependencies.sh -k circle-requirements
echo "---- Validating requirements.txt ----"
REQSHASH=$(md5sum requirements.txt | cut -d ' ' -f1)
TESTHASH=$(md5sum circle-requirements.txt | cut -d ' ' -f1)
echo "- $REQSHASH"
echo "- $TESTHASH"
if [ $REQSHASH == $TESTHASH ]; then
echo "- requirements.txt is valid ...."
else
echo "- requirements.txt contains inconsistencies ...."
echo "- you may want to run `pipenv sync --dev` and then ./scripts/dependencies/relock_dependencies.sh ...."
echo "- which will rebuild your *requirements.txt files ...."
diff requirements.txt circle-requirements.txt
exit 2
fi
echo "---- Validating dev-requirements.txt ----"
REQSHASH=$(md5sum dev-requirements.txt | cut -d ' ' -f1)
TESTHASH=$(md5sum dev-circle-requirements.txt | cut -d ' ' -f1)
echo "- $REQSHASH"
echo "- $TESTHASH"
if [ $REQSHASH == $TESTHASH ]; then
echo "- dev-requirements.txt is valid ...."
else
echo "- dev-requirements.txt contains inconsistencies ...."
echo "- you may want to run `pipenv sync --dev` and then ./scripts/dependencies/relock_dependencies.sh ...."
echo "- which will rebuild your *requirements.txt files ...."
diff dev-requirements.txt dev-circle-requirements.txt
exit 2
fi


@ -1,6 +1,6 @@
#!/usr/bin/env bash
# Parse optional flag -k, to be used when we want to base the process on an existing Pipfile.lock
# Parse optional flag -k, to be used when we want to base the process on an existing poetry.lock
KEEP_LOCK=false
OPTIND=1
while getopts 'k' opt; do
@ -15,14 +15,21 @@ shift "$(( OPTIND - 1 ))"
# can change output file names with relock_dependencies.sh <prefix>
PREFIX=${1:-requirements}
# setup export plugin
poetry self add poetry-plugin-export
poetry config warnings.export false
# update poetry and pip
poetry self update
pip install --upgrade pip
# these steps might fail, but that's okay.
if ! "$KEEP_LOCK"; then
echo "Removing existing Pipfile.lock file"
rm -f Pipfile.lock
echo "Removing existing poetry.lock file"
rm -f poetry.lock
fi
echo "Removing existing requirement files"
pipenv --rm
rm -f $PREFIX.txt
rm -f dev-$PREFIX.txt
@ -33,11 +40,11 @@ pip cache purge
set -e
echo "Building Development Requirements"
pipenv --python 3.12 lock --clear --pre --dev-only
pipenv requirements --dev-only > dev-$PREFIX.txt
poetry lock
poetry export -o dev-requirements.txt --without-hashes --with dev
echo "Building Standard Requirements"
pipenv --python 3.12 lock --clear --pre
pipenv requirements > $PREFIX.txt
poetry export -o requirements.txt --without-hashes --without dev
echo "OK!"

setup.py

@ -2,36 +2,10 @@
# -*- coding: utf-8 -*-
"""
This file is part of nucypher.
nucypher is free software: you can redistribute it and/or modify
it under the terms of the GNU Affero General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
nucypher is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU Affero General Public License for more details.
You should have received a copy of the GNU Affero General Public License
along with nucypher. If not, see <https://www.gnu.org/licenses/>.
"""
import os
import sys
from pathlib import Path
from typing import Dict
from urllib.parse import urlparse
from setuptools import find_namespace_packages, setup
from setuptools.command.develop import develop
from setuptools.command.install import install
#
# Metadata
#
PACKAGE_NAME = 'nucypher'
BASE_DIR = Path(__file__).parent
@ -57,90 +31,16 @@ with open(str(SOURCE_METADATA_PATH.resolve())) as f:
exec(f.read(), ABOUT)
#
# Utilities
#
class VerifyVersionCommand(install):
"""Custom command to verify that the git tag matches our version"""
description = 'verify that the git tag matches our version'
def run(self):
tag = os.getenv('CIRCLE_TAG')
if tag.startswith('v'):
tag = tag[1:]
version = ABOUT['__version__']
if version.startswith('v'):
version = version[1:]
if tag != version:
info = "Git tag: {0} does not match the version of this app: {1}".format(
os.getenv('CIRCLE_TAG'), ABOUT['__version__']
)
sys.exit(info)
class PostDevelopCommand(develop):
"""
Post-installation for development mode.
Execute manually with python setup.py develop or automatically included with
`pip install -e . -r dev-requirements.txt`.
"""
def run(self):
"""development setup scripts (pre-requirements)"""
develop.run(self)
#
# Requirements
#
def read_requirements(path):
with open(BASE_DIR / path) as f:
_pipenv_flags, *lines = f.read().split('\n')
# TODO remove when will be no more git dependencies in requirements.txt
# Transforms VCS requirements to PEP 508
requirements = []
for line in lines:
if line.startswith('-e git:') or line.startswith('-e git+') or \
line.startswith('git:') or line.startswith('git+'):
# parse out egg=... fragment from VCS URL
parsed = urlparse(line)
egg_name = parsed.fragment.partition("egg=")[-1]
without_fragment = parsed._replace(fragment="").geturl()
requirements.append(f"{egg_name} @ {without_fragment}")
else:
requirements.append(line)
return requirements
return f.read().split("\n")
INSTALL_REQUIRES = read_requirements("requirements.txt")
DEV_REQUIRES = read_requirements("dev-requirements.txt")
BENCHMARK_REQUIRES = [
'pytest-benchmark'
]
DEPLOY_REQUIRES = [
'bumpversion',
'ansible',
'twine',
'wheel'
]
URSULA_REQUIRES = ["sentry-sdk"]
EXTRAS = {
# Admin
"dev": DEV_REQUIRES + URSULA_REQUIRES,
"benchmark": DEV_REQUIRES + BENCHMARK_REQUIRES,
"deploy": DEPLOY_REQUIRES,
"ursula": URSULA_REQUIRES,
"dev": DEV_REQUIRES,
}
setup(
@ -155,7 +55,7 @@ setup(
exclude=["scripts", "nucypher.utilities.templates"]
),
include_package_data=True,
zip_safe=False,
zip_safe=True,
# Entry Points
entry_points={
@ -167,12 +67,6 @@ setup(
]
},
# setup.py commands
cmdclass={
'verify': VerifyVersionCommand,
'develop': PostDevelopCommand
},
# Metadata
name=ABOUT['__title__'],
url=ABOUT['__url__'],
@ -181,8 +75,6 @@ setup(
author_email=ABOUT['__email__'],
description=ABOUT['__summary__'],
license=ABOUT['__license__'],
long_description_content_type="text/markdown",
long_description_markdown_filename='README.md',
keywords="nucypher, proxy re-encryption",
classifiers=PYPI_CLASSIFIERS
keywords="threshold access control, distributed key generation",
classifiers=PYPI_CLASSIFIERS,
)
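The helper this diff simplifies used to rewrite pipenv's VCS requirement lines into PEP 508 form before handing them to setuptools. The removed transformation can be sketched standalone; `vcs_to_pep508` is an illustrative name, not part of the codebase:

```python
from urllib.parse import urlparse

def vcs_to_pep508(line: str) -> str:
    # "git+https://host/repo.git#egg=name" -> "name @ git+https://host/repo.git"
    line = line.removeprefix("-e ")
    parsed = urlparse(line)
    egg_name = parsed.fragment.partition("egg=")[-1]
    without_fragment = parsed._replace(fragment="").geturl()
    return f"{egg_name} @ {without_fragment}"

print(vcs_to_pep508("-e git+https://github.com/org/repo.git#egg=pkg"))
# pkg @ git+https://github.com/org/repo.git
```

With poetry's exporter emitting PEP 508 lines directly, this rewriting step is no longer needed, which is why `read_requirements` collapses to a plain `split("\n")`.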


@ -6,13 +6,11 @@ import pytest
import pytest_twisted
from hexbytes import HexBytes
from prometheus_client import REGISTRY
from twisted.internet.threads import deferToThread
from nucypher.blockchain.eth.agents import ContractAgency, SubscriptionManagerAgent
from nucypher.blockchain.eth.constants import NULL_ADDRESS
from nucypher.blockchain.eth.models import Coordinator
from nucypher.blockchain.eth.signers.software import Web3Signer
from nucypher.blockchain.eth.trackers.dkg import EventScannerTask
from nucypher.blockchain.eth.signers.software import InMemorySigner
from nucypher.characters.lawful import Enrico, Ursula
from nucypher.policy.conditions.evm import ContractCondition, RPCCondition
from nucypher.policy.conditions.lingo import (
@ -24,18 +22,35 @@ from nucypher.policy.conditions.lingo import (
from nucypher.policy.conditions.time import TimeCondition
from tests.constants import TEST_ETH_PROVIDER_URI, TESTERCHAIN_CHAIN_ID
# constants
DKG_SIZE = 4
RITUAL_ID = 0
# This is a hack to make the tests run faster
EventScannerTask.INTERVAL = 1
TIME_TRAVEL_INTERVAL = 60
@pytest.fixture(scope="module")
def ritual_id():
return 0
# The message to encrypt and its conditions
PLAINTEXT = "peace at dawn"
DURATION = 48 * 60 * 60
@pytest.fixture(scope="module")
def dkg_size():
return 4
@pytest.fixture(scope="module")
def duration():
return 48 * 60 * 60
@pytest.fixture(scope="module")
def plaintext():
return "peace at dawn"
@pytest.fixture(scope="module")
def interval(testerchain):
return testerchain.tx_machine._task.interval
@pytest.fixture(scope="module")
def signer():
return InMemorySigner()
@pytest.fixture(scope="module")
@ -94,282 +109,235 @@ def condition(test_registry):
return ConditionLingo(condition_to_use).to_dict()
@pytest.fixture(scope='module')
def cohort(ursulas):
"""Creates a cohort of Ursulas"""
nodes = list(sorted(ursulas[:DKG_SIZE], key=lambda x: int(x.checksum_address, 16)))
assert len(nodes) == DKG_SIZE # sanity check
@pytest.fixture(scope="module", autouse=True)
def transaction_tracker(testerchain, coordinator_agent):
testerchain.tx_machine.w3 = coordinator_agent.blockchain.w3
testerchain.tx_machine.start()
@pytest.fixture(scope="module")
def cohort(testerchain, clock, coordinator_agent, ursulas, dkg_size):
nodes = list(sorted(ursulas[:dkg_size], key=lambda x: int(x.checksum_address, 16)))
assert len(nodes) == dkg_size
for node in nodes:
node.ritual_tracker.task._task.clock = clock
node.ritual_tracker.start()
return nodes
@pytest_twisted.inlineCallbacks()
def test_ursula_ritualist(
condition,
testerchain,
@pytest.fixture(scope="module")
def threshold_message_kit(coordinator_agent, plaintext, condition, signer, ritual_id):
encrypting_key = coordinator_agent.get_ritual_public_key(ritual_id=ritual_id)
enrico = Enrico(encrypting_key=encrypting_key, signer=signer)
return enrico.encrypt_for_dkg(plaintext=plaintext.encode(), conditions=condition)
def test_dkg_initiation(
coordinator_agent,
global_allow_list,
cohort,
initiator,
bob,
ritual_token,
accounts,
initiator,
cohort,
global_allow_list,
testerchain,
ritual_token,
ritual_id,
duration,
):
"""Tests the DKG and the encryption/decryption of a message"""
signer = Web3Signer(client=testerchain.client)
print("==================== INITIALIZING ====================")
cohort_staking_provider_addresses = list(u.checksum_address for u in cohort)
# Round 0 - Initiate the ritual
def initialize():
"""Initiates the ritual"""
print("==================== INITIALIZING ====================")
cohort_staking_provider_addresses = list(u.checksum_address for u in cohort)
# Approve the ritual token for the coordinator agent to spend
amount = coordinator_agent.get_ritual_initiation_cost(
providers=cohort_staking_provider_addresses, duration=duration
)
ritual_token.approve(
coordinator_agent.contract_address,
amount,
sender=accounts[initiator.transacting_power.account],
)
# Approve the ritual token for the coordinator agent to spend
amount = coordinator_agent.get_ritual_initiation_cost(
providers=cohort_staking_provider_addresses, duration=DURATION
)
ritual_token.approve(
coordinator_agent.contract_address,
amount,
sender=accounts[initiator.transacting_power.account],
)
receipt = coordinator_agent.initiate_ritual(
providers=cohort_staking_provider_addresses,
authority=initiator.transacting_power.account,
duration=duration,
access_controller=global_allow_list.address,
transacting_power=initiator.transacting_power,
)
receipt = coordinator_agent.initiate_ritual(
providers=cohort_staking_provider_addresses,
authority=initiator.transacting_power.account,
duration=DURATION,
access_controller=global_allow_list.address,
transacting_power=initiator.transacting_power,
)
return receipt
testerchain.time_travel(seconds=1)
testerchain.wait_for_receipt(receipt["transactionHash"])
# Round 0 - Initiate the ritual
def check_initialize(receipt):
"""Checks the initialization of the ritual"""
print("==================== CHECKING INITIALIZATION ====================")
testerchain.wait_for_receipt(receipt['transactionHash'])
# check that the ritual was created on-chain
assert coordinator_agent.number_of_rituals() == ritual_id + 1
assert (
coordinator_agent.get_ritual_status(ritual_id)
== Coordinator.RitualStatus.DKG_AWAITING_TRANSCRIPTS
)
# check that the ritual was created on-chain
assert coordinator_agent.number_of_rituals() == RITUAL_ID + 1
@pytest_twisted.inlineCallbacks
def test_dkg_finality(
coordinator_agent, ritual_id, cohort, clock, interval, testerchain
):
print("==================== AWAITING DKG FINALITY ====================")
while (
coordinator_agent.get_ritual_status(ritual_id)
!= Coordinator.RitualStatus.ACTIVE
):
yield clock.advance(interval)
yield testerchain.time_travel(seconds=1)
testerchain.tx_machine.stop()
assert not testerchain.tx_machine.running
status = coordinator_agent.get_ritual_status(ritual_id)
assert status == Coordinator.RitualStatus.ACTIVE
last_scanned_block = REGISTRY.get_sample_value(
"ritual_events_last_scanned_block_number"
)
assert last_scanned_block > 0
yield
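test_dkg_finality drives the event scanner with a simulated clock instead of real time: each `clock.advance(interval)` fires the next scheduled scanner pass, and the loop exits once the ritual reports ACTIVE. Stripped of the chain machinery, the pattern reduces to something like this (a minimal hand-rolled clock standing in for twisted's `Clock`, with finality faked after three passes):

```python
class FakeClock:
    """Minimal stand-in for a reactor clock: advancing time fires due calls."""

    def __init__(self):
        self.now = 0.0
        self.pending = []  # list of (due_time, callback)

    def call_later(self, delay, fn):
        self.pending.append((self.now + delay, fn))

    def advance(self, seconds):
        self.now += seconds
        due = [c for c in self.pending if c[0] <= self.now]
        self.pending = [c for c in self.pending if c[0] > self.now]
        for _, fn in due:
            fn()

status = {"value": "DKG_AWAITING_TRANSCRIPTS"}
scans = {"count": 0}

def scan():
    # pretend the ritual finalizes after three scanner passes
    scans["count"] += 1
    if scans["count"] >= 3:
        status["value"] = "ACTIVE"
    else:
        clock.call_later(1.0, scan)

clock = FakeClock()
clock.call_later(1.0, scan)
while status["value"] != "ACTIVE":
    clock.advance(1.0)  # deterministic: no sleeping, no real time

print(status["value"], scans["count"])  # ACTIVE 3
```

The payoff is determinism: the test never blocks on wall-clock time, so DKG finality is awaited in simulated seconds.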
def test_transcript_publication(coordinator_agent, cohort, ritual_id, dkg_size):
print("==================== VERIFYING DKG FINALITY ====================")
for ursula in cohort:
assert (
coordinator_agent.get_ritual_status(RITUAL_ID)
== Coordinator.RitualStatus.DKG_AWAITING_TRANSCRIPTS
)
# time travel has a side effect of mining a block so that the scanner will definitively
# pick up ritual event
testerchain.time_travel(seconds=1)
for ursula in cohort:
# this is a testing hack to make the event scanner work
# normally it's called by the reactor clock in a loop
ursula.ritual_tracker.task.run()
# nodes received `StartRitual` and submitted their transcripts
assert (
len(
coordinator_agent.get_participant(
ritual_id=RITUAL_ID,
provider=ursula.checksum_address,
transcript=True,
).transcript
)
> 0
), "ursula posted transcript to Coordinator"
def block_until_dkg_finalized(_):
"""simulates the passage of time and the execution of the event scanner"""
print("==================== BLOCKING UNTIL DKG FINALIZED ====================")
while (
coordinator_agent.get_ritual_status(RITUAL_ID)
!= Coordinator.RitualStatus.ACTIVE
):
for ursula in cohort:
# this is a testing hack to make the event scanner work,
# normally it's called by the reactor clock in a loop
ursula.ritual_tracker.task.run()
testerchain.time_travel(seconds=TIME_TRAVEL_INTERVAL)
# Ensure that all events processed, including EndRitual
for ursula in cohort:
ursula.ritual_tracker.task.run()
def check_finality(_):
"""Checks the finality of the DKG"""
print("==================== CHECKING DKG FINALITY ====================")
status = coordinator_agent.get_ritual_status(RITUAL_ID)
assert status == Coordinator.RitualStatus.ACTIVE
for ursula in cohort:
participant = coordinator_agent.get_participant(
RITUAL_ID, ursula.checksum_address, True
len(
coordinator_agent.get_participant(
ritual_id=ritual_id,
provider=ursula.checksum_address,
transcript=True,
).transcript
)
assert participant.transcript
assert participant.aggregated
> 0
), "no transcript found for ursula"
print(f"Ursula {ursula.checksum_address} has submitted a transcript")
last_scanned_block = REGISTRY.get_sample_value(
"ritual_events_last_scanned_block_number"
)
assert last_scanned_block > 0
def check_participant_pagination(_):
print("================ PARTICIPANT PAGINATION ================")
pagination_sizes = range(0, DKG_SIZE) # 0 means get all in one call
for page_size in pagination_sizes:
with patch.object(
coordinator_agent, "_get_page_size", return_value=page_size
):
ritual = coordinator_agent.get_ritual(RITUAL_ID, transcripts=True)
for i, participant in enumerate(ritual.participants):
assert participant.provider == cohort[i].checksum_address
assert participant.aggregated is True
assert participant.transcript
assert participant.decryption_request_static_key
def test_get_participants(coordinator_agent, cohort, ritual_id, dkg_size):
pagination_sizes = range(0, dkg_size) # 0 means get all in one call
for page_size in pagination_sizes:
with patch.object(coordinator_agent, "_get_page_size", return_value=page_size):
ritual = coordinator_agent.get_ritual(ritual_id, transcripts=True)
for i, participant in enumerate(ritual.participants):
assert participant.provider == cohort[i].checksum_address
assert participant.aggregated is True
assert participant.transcript
assert participant.decryption_request_static_key
assert len(ritual.participants) == DKG_SIZE
assert len(ritual.participants) == dkg_size
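The pagination test above patches `_get_page_size` so that a page size of 0 means "fetch everything in one call" while any positive size splits the participant list into chunks; either way the reassembled result must match the full cohort. A standalone sketch of that contract (hypothetical pager, not the agent's real internals):

```python
def fetch_participants(addresses, page_size):
    # page_size == 0: a single unpaginated call
    if page_size == 0:
        return list(addresses)
    # otherwise fetch in chunks of page_size and reassemble in order
    pages = [addresses[i:i + page_size] for i in range(0, len(addresses), page_size)]
    return [addr for page in pages for addr in page]

cohort = ["0xA", "0xB", "0xC", "0xD"]
for size in range(0, len(cohort)):  # mirrors pagination_sizes in the test
    assert fetch_participants(cohort, size) == cohort
```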
def check_encrypt(_):
"""Encrypts a message and returns the ciphertext and conditions"""
print("==================== DKG ENCRYPTION ====================")
encrypting_key = coordinator_agent.get_ritual_public_key(ritual_id=RITUAL_ID)
def test_encrypt(
coordinator_agent, condition, ritual_id, plaintext, testerchain, signer
):
print("==================== DKG ENCRYPTION ====================")
encrypting_key = coordinator_agent.get_ritual_public_key(ritual_id=ritual_id)
plaintext = plaintext.encode()
enrico = Enrico(encrypting_key=encrypting_key, signer=signer)
print(f"encrypting for DKG with key {bytes(encrypting_key).hex()}")
tmk = enrico.encrypt_for_dkg(plaintext=plaintext, conditions=condition)
assert tmk.ciphertext_header
# prepare message and conditions
plaintext = PLAINTEXT.encode()
# create Enrico
enrico = Enrico(encrypting_key=encrypting_key, signer=signer)
# encrypt
print(f"encrypting for DKG with key {bytes(encrypting_key).hex()}")
threshold_message_kit = enrico.encrypt_for_dkg(
plaintext=plaintext, conditions=condition
)
return threshold_message_kit
def check_unauthorized_decrypt(threshold_message_kit):
"""Attempts to decrypt a message before Enrico is authorized to use the ritual"""
print("======== DKG DECRYPTION UNAUTHORIZED ENCRYPTION ========")
# ritual_id, ciphertext, conditions are obtained from the side channel
bob.start_learning_loop(now=True)
with pytest.raises(
Ursula.NotEnoughUrsulas,
match=f"Encrypted data not authorized for ritual {RITUAL_ID}",
):
_ = bob.threshold_decrypt(
threshold_message_kit=threshold_message_kit,
)
# check prometheus metric for decryption requests
# since all running on the same machine - the value is not per-ursula but rather all
num_failures = REGISTRY.get_sample_value(
"threshold_decryption_num_failures_total"
)
assert len(cohort) == int(num_failures) # each ursula in cohort had a failure
print("========= UNAUTHORIZED DECRYPTION UNSUCCESSFUL =========")
return threshold_message_kit
def check_decrypt(threshold_message_kit):
"""Decrypts a message and checks that it matches the original plaintext"""
# authorize Enrico to encrypt for ritual
global_allow_list.authorize(
RITUAL_ID,
[signer.accounts[0]],
sender=accounts[initiator.transacting_power.account],
)
print("==================== DKG DECRYPTION ====================")
# ritual_id, ciphertext, conditions are obtained from the side channel
bob.start_learning_loop(now=True)
cleartext = bob.threshold_decrypt(
@pytest_twisted.inlineCallbacks
def test_unauthorized_decryption(bob, cohort, threshold_message_kit, ritual_id):
print("======== DKG DECRYPTION (UNAUTHORIZED) ========")
bob.start_learning_loop(now=True)
with pytest.raises(
Ursula.NotEnoughUrsulas,
match=f"Encrypted data not authorized for ritual {ritual_id}",
):
yield bob.threshold_decrypt(
threshold_message_kit=threshold_message_kit,
)
assert bytes(cleartext) == PLAINTEXT.encode()
# check prometheus metric for decryption requests
# since all running on the same machine - the value is not per-ursula but rather all
num_successes = REGISTRY.get_sample_value(
"threshold_decryption_num_successes_total"
)
# check prometheus metric for decryption requests
# since all running on the same machine - the value is not per-ursula but rather all
num_failures = REGISTRY.get_sample_value("threshold_decryption_num_failures_total")
assert len(cohort) == int(num_failures) # each ursula in cohort had a failure
yield
ritual = coordinator_agent.get_ritual(RITUAL_ID)
# at least a threshold of ursulas were successful (concurrency)
assert int(num_successes) >= ritual.threshold
# decrypt again (should only use cached values)
with patch.object(
coordinator_agent,
"get_provider_public_key",
side_effect=RuntimeError(
"should not be called to create validators; cache should be used"
),
):
# would like to but can't patch agent.get_ritual, since bob uses it
cleartext = bob.threshold_decrypt(
threshold_message_kit=threshold_message_kit,
)
assert bytes(cleartext) == PLAINTEXT.encode()
print("==================== DECRYPTION SUCCESSFUL ====================")
def check_decrypt_without_any_cached_values(
threshold_message_kit, ritual_id, cohort, bob, coordinator_agent, plaintext
):
print("==================== DKG DECRYPTION NO CACHE ====================")
original_validators = cohort[0].dkg_storage.get_validators(ritual_id)
return threshold_message_kit
for ursula in cohort:
ursula.dkg_storage.clear(ritual_id)
assert ursula.dkg_storage.get_validators(ritual_id) is None
assert ursula.dkg_storage.get_active_ritual(ritual_id) is None
def check_decrypt_without_any_cached_values(threshold_message_kit):
print("==================== DKG DECRYPTION NO CACHE ====================")
original_validators = cohort[0].dkg_storage.get_validators(RITUAL_ID)
bob.start_learning_loop(now=True)
cleartext = bob.threshold_decrypt(
threshold_message_kit=threshold_message_kit,
)
assert bytes(cleartext) == plaintext.encode()
for ursula in cohort:
ursula.dkg_storage.clear(RITUAL_ID)
assert ursula.dkg_storage.get_validators(RITUAL_ID) is None
assert ursula.dkg_storage.get_active_ritual(RITUAL_ID) is None
ritual = coordinator_agent.get_ritual(ritual_id)
num_used_ursulas = 0
for ursula_index, ursula in enumerate(cohort):
stored_ritual = ursula.dkg_storage.get_active_ritual(ritual_id)
if not stored_ritual:
# this ursula was not used for threshold decryption; skip
continue
assert stored_ritual == ritual
bob.start_learning_loop(now=True)
cleartext = bob.threshold_decrypt(
threshold_message_kit=threshold_message_kit,
)
assert bytes(cleartext) == PLAINTEXT.encode()
stored_validators = ursula.dkg_storage.get_validators(ritual_id)
num_used_ursulas += 1
for v_index, v in enumerate(stored_validators):
assert v.address == original_validators[v_index].address
assert v.public_key == original_validators[v_index].public_key
ritual = coordinator_agent.get_ritual(RITUAL_ID)
num_used_ursulas = 0
for ursula_index, ursula in enumerate(cohort):
stored_ritual = ursula.dkg_storage.get_active_ritual(RITUAL_ID)
if not stored_ritual:
# this ursula was not used for threshold decryption; skip
continue
assert stored_ritual == ritual
assert num_used_ursulas >= ritual.threshold
print("===================== DECRYPTION SUCCESSFUL =====================")
stored_validators = ursula.dkg_storage.get_validators(RITUAL_ID)
num_used_ursulas += 1
for v_index, v in enumerate(stored_validators):
assert v.address == original_validators[v_index].address
assert v.public_key == original_validators[v_index].public_key
assert num_used_ursulas >= ritual.threshold
print("===================== DECRYPTION SUCCESSFUL =====================")
@pytest_twisted.inlineCallbacks
def test_authorized_decryption(
bob,
global_allow_list,
accounts,
coordinator_agent,
threshold_message_kit,
signer,
initiator,
ritual_id,
plaintext,
):
print("==================== DKG DECRYPTION (AUTHORIZED) ====================")
# authorize Enrico to encrypt for ritual
global_allow_list.authorize(
ritual_id,
[signer.accounts[0]],
sender=accounts[initiator.transacting_power.account],
)
def error_handler(e):
"""Prints the error and raises it"""
print("==================== ERROR ====================")
print(e.getTraceback())
raise e
# ritual_id, ciphertext, conditions are obtained from the side channel
bob.start_learning_loop(now=True)
cleartext = yield bob.threshold_decrypt(
threshold_message_kit=threshold_message_kit,
)
assert bytes(cleartext) == plaintext.encode()
# order matters
callbacks = [
check_initialize,
block_until_dkg_finalized,
check_finality,
check_participant_pagination,
check_encrypt,
check_unauthorized_decrypt,
check_decrypt,
check_decrypt_without_any_cached_values,
]
# check prometheus metric for decryption requests
# since all running on the same machine - the value is not per-ursula but rather all
num_successes = REGISTRY.get_sample_value(
"threshold_decryption_num_successes_total"
)
d = deferToThread(initialize)
for callback in callbacks:
d.addCallback(callback)
d.addErrback(error_handler)
yield d
ritual = coordinator_agent.get_ritual(ritual_id)
# at least a threshold of ursulas were successful (concurrency)
assert int(num_successes) >= ritual.threshold
yield
def test_encryption_and_decryption_prometheus_metrics():
print("==================== METRICS ====================")
# check prometheus metric for decryption requests
# since all running on the same machine - the value is not per-ursula but rather all
num_decryption_failures = REGISTRY.get_sample_value(


@ -1,6 +1,4 @@
from nucypher.blockchain.eth.constants import NULL_ADDRESS
from nucypher.blockchain.eth.signers.software import Web3Signer
from nucypher.crypto.powers import TransactingPower
from nucypher.utilities.logging import Logger
from tests.utils.ursula import select_test_port
@ -19,11 +17,11 @@ def test_ursula_operator_confirmation(
threshold_staking,
taco_application_agent,
taco_child_application_agent,
test_registry,
deployer_account,
accounts,
):
staking_provider = testerchain.stake_provider_account(0)
operator_address = testerchain.ursula_account(0)
staking_provider = accounts.staking_provider_account(0)
operator_address = accounts.ursula_account(0)
min_authorization = taco_application_agent.get_min_authorization()
# make a staking provider and some stakes
@ -49,7 +47,7 @@ def test_ursula_operator_confirmation(
# bond this operator
tpower = TransactingPower(
account=staking_provider, signer=Web3Signer(testerchain.client)
account=staking_provider, signer=accounts.get_account_signer(staking_provider)
)
taco_application_agent.bond_operator(
staking_provider=staking_provider,
@ -59,7 +57,9 @@ def test_ursula_operator_confirmation(
# make an ursula.
ursula = ursula_test_config.produce(
operator_address=operator_address, rest_port=select_test_port()
operator_address=operator_address,
rest_port=select_test_port(),
signer=accounts.get_account_signer(operator_address),
)
# now the worker has a staking provider


@ -0,0 +1,465 @@
import pytest_twisted
from atxm.exceptions import Fault, InsufficientFunds
from twisted.internet import reactor
from twisted.internet.task import deferLater
from nucypher.blockchain.eth.models import PHASE1, PHASE2, Coordinator
from nucypher.types import PhaseId
from tests.mock.interfaces import MockBlockchain
@pytest_twisted.inlineCallbacks
def test_ursula_dkg_rounds_fault_tolerance(
clock,
ursulas,
testerchain,
ritual_token,
global_allow_list,
coordinator_agent,
accounts,
initiator,
mocker,
):
#
# DKG Setup
#
ursula_1, ursula_2 = ursulas[0], ursulas[1]
cohort_addresses = sorted([ursula_1.checksum_address, ursula_2.checksum_address])
initiate_dkg(
accounts,
initiator,
testerchain,
ritual_token,
global_allow_list,
coordinator_agent,
cohort_addresses,
)
ritual_id = 0
# Round 1 (make issues occur)
yield from perform_round_1_with_fault_tolerance(
clock,
mocker,
testerchain,
ritual_id,
ursula_1,
ursula_2,
cohort_addresses,
initiator,
)
# Round 2 (make issues occur)
# Use ursula_2 to experience the problems this time; reusing ursula_1 would
# wrap its methods in a `spy(spy(...))`, since spies were already attached in round 1
yield from perform_round_2_with_fault_tolerance(
clock, mocker, coordinator_agent, ritual_id, testerchain, ursula_2, ursula_1
)
def initiate_dkg(
accounts,
initiator,
testerchain,
ritual_token,
global_allow_list,
coordinator_agent,
cohort_addresses,
):
duration = 24 * 60 * 60
# Approve the ritual token for the coordinator agent to spend
amount = coordinator_agent.get_ritual_initiation_cost(
providers=cohort_addresses, duration=duration
)
ritual_token.approve(
coordinator_agent.contract_address,
amount,
sender=accounts[initiator.transacting_power.account],
)
receipt = coordinator_agent.initiate_ritual(
providers=cohort_addresses,
authority=initiator.transacting_power.account,
duration=duration,
access_controller=global_allow_list.address,
transacting_power=initiator.transacting_power,
)
testerchain.time_travel(seconds=1)
testerchain.wait_for_receipt(receipt["transactionHash"])
def perform_round_1_with_fault_tolerance(
clock,
mocker,
testerchain,
ritual_id,
ursula_experiencing_problems,
ursula_2,
cohort_staking_provider_addresses,
initiator,
):
phase_id = PhaseId(ritual_id=ritual_id, phase=PHASE1)
# pause machine so that txs don't actually get processed until the end
testerchain.tx_machine.pause()
assert len(testerchain.tx_machine.queued) == 0
publish_transcript_spy = mocker.spy(
ursula_experiencing_problems, "publish_transcript"
)
publish_aggregated_transcript_spy = mocker.spy(
ursula_experiencing_problems, "publish_aggregated_transcript"
)
publish_transcript_call_count = 0
assert (
ursula_experiencing_problems.dkg_storage.get_ritual_phase_async_tx(phase_id)
is None
), "nothing cached as yet"
original_async_tx = ursula_experiencing_problems.perform_round_1(
ritual_id=ritual_id,
authority=initiator.transacting_power.account,
participants=cohort_staking_provider_addresses,
timestamp=testerchain.get_blocktime(),
)
publish_transcript_call_count += 1
assert len(testerchain.tx_machine.queued) == 1
assert (
ursula_experiencing_problems.dkg_storage.get_ritual_phase_async_tx(phase_id)
is original_async_tx
)
assert publish_transcript_spy.call_count == publish_transcript_call_count
assert (
publish_aggregated_transcript_spy.call_count == 0
), "phase 2 method never called"
# calling again has no effect
repeat_call_async_tx = ursula_experiencing_problems.perform_round_1(
ritual_id=ritual_id,
authority=initiator.transacting_power.account,
participants=cohort_staking_provider_addresses,
timestamp=testerchain.get_blocktime(),
)
assert repeat_call_async_tx is original_async_tx
assert (
publish_transcript_spy.call_count == publish_transcript_call_count
), "no change"
assert (
publish_aggregated_transcript_spy.call_count == 0
), "phase 2 method never called"
# broadcast callback called - must mock a pending tx since it hasn't actually been broadcast yet
mocked_pending_tx = mocker.Mock()
mocked_pending_tx.params = original_async_tx.params
mocked_pending_tx.id = original_async_tx.id
mocked_pending_tx.txhash = MockBlockchain.FAKE_TX_HASH
original_async_tx.on_broadcast(mocked_pending_tx)
assert (
publish_transcript_spy.call_count == publish_transcript_call_count
), "no change"
assert (
publish_aggregated_transcript_spy.call_count == 0
), "phase 2 method never called"
assert len(testerchain.tx_machine.queued) == 1
# insufficient funds callback called
original_async_tx.on_insufficient_funds(original_async_tx, InsufficientFunds())
assert (
publish_transcript_spy.call_count == publish_transcript_call_count
), "no change"
assert (
publish_aggregated_transcript_spy.call_count == 0
), "phase 2 method never called"
assert len(testerchain.tx_machine.queued) == 1
# broadcast failure callback called
# tx is explicitly removed and resubmitted by callback
original_async_tx.on_broadcast_failure(original_async_tx, Exception())
publish_transcript_call_count += 1 # on_fault should trigger resubmission
assert (
publish_transcript_spy.call_count == publish_transcript_call_count
), "updated call"
assert (
publish_aggregated_transcript_spy.call_count == 0
), "phase 2 method never called"
resubmitted_after_broadcast_failure_async_tx = (
ursula_experiencing_problems.dkg_storage.get_ritual_phase_async_tx(phase_id)
)
assert (
resubmitted_after_broadcast_failure_async_tx is not original_async_tx
), "cache updated with resubmitted tx"
assert len(testerchain.tx_machine.queued) == 1
# on_fault callback called - this should cause a resubmission of tx because
# tx was removed from atxm after faulting
testerchain.tx_machine.remove_queued_transaction(
resubmitted_after_broadcast_failure_async_tx
) # simulate removal from atxm
assert len(testerchain.tx_machine.queued) == 0
resubmitted_after_broadcast_failure_async_tx.fault = Fault.ERROR
resubmitted_after_broadcast_failure_async_tx.error = None
resubmitted_after_broadcast_failure_async_tx.on_fault(
resubmitted_after_broadcast_failure_async_tx
)
publish_transcript_call_count += 1 # on_fault should trigger resubmission
assert (
publish_transcript_spy.call_count == publish_transcript_call_count
), "updated call"
assert (
publish_aggregated_transcript_spy.call_count == 0
), "phase 2 method never called"
resubmitted_after_fault_async_tx = (
ursula_experiencing_problems.dkg_storage.get_ritual_phase_async_tx(phase_id)
)
assert (
resubmitted_after_fault_async_tx
is not resubmitted_after_broadcast_failure_async_tx
), "cache updated with resubmitted tx"
assert len(testerchain.tx_machine.queued) == 1
# on_finalized (unsuccessful) callback called - this should cause a resubmission of tx because
# tx was removed from atxm once finalized
testerchain.tx_machine.remove_queued_transaction(
resubmitted_after_fault_async_tx
) # simulate removal from atxm
assert len(testerchain.tx_machine.queued) == 0
resubmitted_after_fault_async_tx.successful = False
resubmitted_after_fault_async_tx.on_finalized(resubmitted_after_fault_async_tx)
publish_transcript_call_count += (
1 # on_finalized (unsuccessful) should trigger resubmission
)
assert (
publish_transcript_spy.call_count == publish_transcript_call_count
), "updated call"
assert (
publish_aggregated_transcript_spy.call_count == 0
), "phase 2 method never called"
resubmitted_after_finalized_async_tx = (
ursula_experiencing_problems.dkg_storage.get_ritual_phase_async_tx(phase_id)
)
assert (
resubmitted_after_finalized_async_tx is not resubmitted_after_fault_async_tx
), "cache updated with resubmitted tx"
assert len(testerchain.tx_machine.queued) == 1
ursula_1_on_finalized_spy = mocker.spy(
resubmitted_after_finalized_async_tx, "on_finalized"
)
assert ursula_1_on_finalized_spy.call_count == 0
# have ursula_2 also submit their transcript
ursula_2_async_tx = ursula_2.perform_round_1(
ritual_id=ritual_id,
authority=initiator.transacting_power.account,
participants=cohort_staking_provider_addresses,
timestamp=testerchain.get_blocktime(),
)
ursula_2_on_finalized_spy = mocker.spy(ursula_2_async_tx, "on_finalized")
assert ursula_2_on_finalized_spy.call_count == 0
testerchain.tx_machine.resume() # resume processing
interval = testerchain.tx_machine._task.interval
# wait for txs to be processed
while not all(
tx in testerchain.tx_machine.finalized
for tx in [resubmitted_after_finalized_async_tx, ursula_2_async_tx]
):
yield clock.advance(interval)
yield testerchain.time_travel(seconds=1)
# wait for hooks to be called
yield deferLater(reactor, 0.2, lambda: None)
ursula_1_on_finalized_spy.assert_called_once_with(
resubmitted_after_finalized_async_tx
)
ursula_2_on_finalized_spy.assert_called_once_with(ursula_2_async_tx)
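The resubmission behavior asserted above can be boiled down to a small pattern: one cached in-flight tx per phase, repeat calls are idempotent, and failure callbacks re-invoke the publish method so the cache is replaced with a fresh tx. A minimal sketch of that pattern follows; the class and method names are illustrative only, not the nucypher implementation.

```python
# Minimal sketch (illustrative names) of the cache-and-resubmit pattern the
# round-1 test exercises: failure callbacks trigger a fresh publish, and the
# cached async tx is swapped out for the resubmitted one.

class AsyncTx:
    _next_id = 0

    def __init__(self):
        AsyncTx._next_id += 1
        self.id = AsyncTx._next_id


class PhasePublisher:
    """Caches one in-flight tx per phase; failures trigger a resubmission."""

    def __init__(self):
        self.cache = None
        self.publish_calls = 0

    def publish(self):
        # Idempotent: a repeat call returns the cached tx unchanged.
        if self.cache is not None:
            return self.cache
        self.publish_calls += 1
        self.cache = AsyncTx()
        return self.cache

    def on_broadcast_failure(self, tx, error):
        # Drop the failed tx and resubmit, replacing the cached entry.
        assert tx is self.cache
        self.cache = None
        return self.publish()


publisher = PhasePublisher()
original = publisher.publish()
assert publisher.publish() is original  # repeat call: no effect
resubmitted = publisher.on_broadcast_failure(original, Exception())
assert resubmitted is not original      # cache updated with resubmitted tx
assert publisher.publish_calls == 2
```

The same swap happens in the test for `on_fault` and unsuccessful `on_finalized`, which is why each resubmitted tx object is compared with `is not` against its predecessor.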
def perform_round_2_with_fault_tolerance(
clock,
mocker,
coordinator_agent,
ritual_id,
testerchain,
ursula_experiencing_problems,
ursula_2,
):
# ensure we are actually in the 2nd round of the dkg process
while (
coordinator_agent.get_ritual_status(ritual_id=ritual_id)
!= Coordinator.RitualStatus.DKG_AWAITING_AGGREGATIONS
):
yield testerchain.time_travel(seconds=1)
phase_id = PhaseId(ritual_id=ritual_id, phase=PHASE2)
# pause machine so that txs don't actually get processed until the end
testerchain.tx_machine.pause()
assert len(testerchain.tx_machine.queued) == 0
publish_transcript_spy = mocker.spy(
ursula_experiencing_problems, "publish_transcript"
)
publish_aggregated_transcript_spy = mocker.spy(
ursula_experiencing_problems, "publish_aggregated_transcript"
)
publish_aggregated_transcript_call_count = 0
assert (
ursula_experiencing_problems.dkg_storage.get_ritual_phase_async_tx(phase_id)
is None
), "nothing cached as yet"
original_async_tx = ursula_experiencing_problems.perform_round_2(
ritual_id=ritual_id, timestamp=testerchain.get_blocktime()
)
publish_aggregated_transcript_call_count += 1
assert len(testerchain.tx_machine.queued) == 1
assert (
ursula_experiencing_problems.dkg_storage.get_ritual_phase_async_tx(phase_id)
is original_async_tx
)
assert (
publish_aggregated_transcript_spy.call_count
== publish_aggregated_transcript_call_count
)
assert publish_transcript_spy.call_count == 0, "phase 1 method never called"
# calling again has no effect
repeat_call_async_tx = ursula_experiencing_problems.perform_round_2(
ritual_id=ritual_id, timestamp=testerchain.get_blocktime()
)
assert repeat_call_async_tx is original_async_tx
assert (
publish_aggregated_transcript_spy.call_count
== publish_aggregated_transcript_call_count
), "no change"
assert publish_transcript_spy.call_count == 0, "phase 1 method never called"
# broadcast callback called - must mock a pending tx since it hasn't actually been broadcast yet
mocked_pending_tx = mocker.Mock()
mocked_pending_tx.params = original_async_tx.params
mocked_pending_tx.id = original_async_tx.id
mocked_pending_tx.txhash = MockBlockchain.FAKE_TX_HASH
original_async_tx.on_broadcast(mocked_pending_tx)
assert (
publish_aggregated_transcript_spy.call_count
== publish_aggregated_transcript_call_count
), "no change"
assert publish_transcript_spy.call_count == 0, "phase 1 method never called"
assert len(testerchain.tx_machine.queued) == 1
# insufficient funds callback called
original_async_tx.on_insufficient_funds(original_async_tx, InsufficientFunds())
assert (
publish_aggregated_transcript_spy.call_count
== publish_aggregated_transcript_call_count
), "no change"
assert publish_transcript_spy.call_count == 0, "phase 1 method never called"
assert len(testerchain.tx_machine.queued) == 1
# broadcast failure callback called
# tx is explicitly removed and resubmitted by callback
original_async_tx.on_broadcast_failure(original_async_tx, Exception())
publish_aggregated_transcript_call_count += (
1 # on_fault should trigger resubmission
)
assert (
publish_aggregated_transcript_spy.call_count
== publish_aggregated_transcript_call_count
), "updated call"
assert publish_transcript_spy.call_count == 0, "phase 1 method never called"
resubmitted_after_broadcast_failure_async_tx = (
ursula_experiencing_problems.dkg_storage.get_ritual_phase_async_tx(phase_id)
)
assert (
resubmitted_after_broadcast_failure_async_tx is not original_async_tx
), "cache updated with resubmitted tx"
assert len(testerchain.tx_machine.queued) == 1
# on_fault callback called - this should cause a resubmission of tx because
# tx was removed from atxm after faulting
testerchain.tx_machine.remove_queued_transaction(
resubmitted_after_broadcast_failure_async_tx
) # simulate removal from atxm
assert len(testerchain.tx_machine.queued) == 0
resubmitted_after_broadcast_failure_async_tx.fault = Fault.ERROR
resubmitted_after_broadcast_failure_async_tx.error = None
resubmitted_after_broadcast_failure_async_tx.on_fault(
resubmitted_after_broadcast_failure_async_tx
)
publish_aggregated_transcript_call_count += (
1 # on_fault should trigger resubmission
)
assert (
publish_aggregated_transcript_spy.call_count
== publish_aggregated_transcript_call_count
), "updated call"
assert publish_transcript_spy.call_count == 0, "phase 1 method never called"
resubmitted_after_fault_async_tx = (
ursula_experiencing_problems.dkg_storage.get_ritual_phase_async_tx(phase_id)
)
assert (
resubmitted_after_fault_async_tx
is not resubmitted_after_broadcast_failure_async_tx
), "cache updated with resubmitted tx"
assert len(testerchain.tx_machine.queued) == 1
# on_finalized (unsuccessful) callback called - this should cause a resubmission of tx because
# tx was removed from atxm once finalized
testerchain.tx_machine.remove_queued_transaction(
resubmitted_after_fault_async_tx
) # simulate removal from atxm
assert len(testerchain.tx_machine.queued) == 0
resubmitted_after_fault_async_tx.successful = False
resubmitted_after_fault_async_tx.on_finalized(resubmitted_after_fault_async_tx)
publish_aggregated_transcript_call_count += (
1 # on_finalized (unsuccessful) should trigger resubmission
)
assert (
publish_aggregated_transcript_spy.call_count
== publish_aggregated_transcript_call_count
), "updated call"
assert publish_transcript_spy.call_count == 0, "phase 1 method never called"
resubmitted_after_finalized_async_tx = (
ursula_experiencing_problems.dkg_storage.get_ritual_phase_async_tx(phase_id)
)
assert (
resubmitted_after_finalized_async_tx is not resubmitted_after_fault_async_tx
), "cache updated with resubmitted tx"
assert len(testerchain.tx_machine.queued) == 1
ursula_1_on_finalized_spy = mocker.spy(
resubmitted_after_finalized_async_tx, "on_finalized"
)
assert ursula_1_on_finalized_spy.call_count == 0
# have ursula_2 also submit their aggregated transcript
ursula_2_async_tx = ursula_2.perform_round_2(
ritual_id=ritual_id, timestamp=testerchain.get_blocktime()
)
ursula_2_on_finalized_spy = mocker.spy(ursula_2_async_tx, "on_finalized")
assert ursula_2_on_finalized_spy.call_count == 0
testerchain.tx_machine.resume() # resume processing
interval = testerchain.tx_machine._task.interval
# wait for txs to be processed
while not all(
tx in testerchain.tx_machine.finalized
for tx in [resubmitted_after_finalized_async_tx, ursula_2_async_tx]
):
yield clock.advance(interval)
yield testerchain.time_travel(seconds=1)
# wait for hooks to be called
yield deferLater(reactor, 0.2, lambda: None)
ursula_1_on_finalized_spy.assert_called_once_with(
resubmitted_after_finalized_async_tx
)
ursula_2_on_finalized_spy.assert_called_once_with(ursula_2_async_tx)
# ensure ritual is successfully completed
assert (
coordinator_agent.get_ritual_status(ritual_id=ritual_id)
== Coordinator.RitualStatus.ACTIVE
)

View File

@@ -1,14 +1,14 @@
import os
import pytest
import pytest_twisted
from eth_utils import keccak
from nucypher_core import SessionStaticSecret
from twisted.internet import reactor
from twisted.internet.task import deferLater
from nucypher.blockchain.eth.agents import (
CoordinatorAgent,
)
from nucypher.blockchain.eth.agents import CoordinatorAgent
from nucypher.blockchain.eth.models import Coordinator
from nucypher.blockchain.eth.signers.software import Web3Signer
from nucypher.crypto.powers import TransactingPower
@@ -43,9 +43,9 @@ def cohort_ursulas(cohort, taco_application_agent):
@pytest.fixture(scope='module')
def transacting_powers(testerchain, cohort_ursulas):
def transacting_powers(accounts, cohort_ursulas):
return [
TransactingPower(account=ursula, signer=Web3Signer(testerchain.client))
TransactingPower(account=ursula, signer=accounts.get_account_signer(ursula))
for ursula in cohort_ursulas
]
@@ -112,24 +112,48 @@ def test_initiate_ritual(
assert ritual_dkg_key is None # no dkg key available until ritual is completed
def test_post_transcript(agent, transcripts, transacting_powers, testerchain):
@pytest_twisted.inlineCallbacks
def test_post_transcript(
agent, transcripts, transacting_powers, testerchain, clock, mock_async_hooks
):
ritual_id = agent.number_of_rituals() - 1
txs = []
for i, transacting_power in enumerate(transacting_powers):
txhash = agent.post_transcript(
async_tx = agent.post_transcript(
ritual_id=ritual_id,
transcript=transcripts[i],
transacting_power=transacting_power,
async_tx_hooks=mock_async_hooks,
)
txs.append(async_tx)
receipt = testerchain.wait_for_receipt(txhash)
testerchain.tx_machine.start()
while not all([tx.final for tx in txs]):
yield clock.advance(testerchain.tx_machine._task.interval)
testerchain.tx_machine.stop()
for i, async_tx in enumerate(txs):
post_transcript_events = (
agent.contract.events.TranscriptPosted().process_receipt(receipt)
agent.contract.events.TranscriptPosted().process_receipt(async_tx.receipt)
)
# assert len(post_transcript_events) == 1
event = post_transcript_events[0]
assert event["args"]["ritualId"] == ritual_id
assert event["args"]["transcriptDigest"] == keccak(transcripts[i])
# ensure relevant hooks are called (once for each tx) OR not called (failure ones)
yield deferLater(reactor, 0.2, lambda: None)
assert mock_async_hooks.on_broadcast.call_count == len(txs)
assert mock_async_hooks.on_finalized.call_count == len(txs)
for async_tx in txs:
assert async_tx.successful is True
# failure hooks not called
assert mock_async_hooks.on_broadcast_failure.call_count == 0
assert mock_async_hooks.on_fault.call_count == 0
assert mock_async_hooks.on_insufficient_funds.call_count == 0
ritual = agent.get_ritual(ritual_id, transcripts=True)
assert [p.transcript for p in ritual.participants] == transcripts
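The wait loop above (advance the test clock by the machine's polling interval until every async tx reports final) is a recurring idiom in these tests. A toy version with a fake machine standing in for `testerchain.tx_machine` shows the shape; `FakeMachine` and `wait_until_final` are invented here for illustration.

```python
# Toy sketch of "advance the clock until every tx finalizes", with a fake
# machine in place of testerchain.tx_machine. Names are illustrative.

class FakeMachine:
    def __init__(self, txs, ticks_needed=3):
        self.queued = list(txs)
        self.finalized = set()
        self._ticks = ticks_needed

    def tick(self):
        # One polling interval elapses; after enough ticks, everything finalizes.
        self._ticks -= 1
        if self._ticks <= 0 and self.queued:
            self.finalized.update(self.queued)
            self.queued.clear()


def wait_until_final(machine, txs, max_ticks=100):
    """Poll until all txs are finalized; guard against waiting forever."""
    ticks = 0
    while not all(tx in machine.finalized for tx in txs):
        machine.tick()
        ticks += 1
        assert ticks <= max_ticks, "txs never finalized"
    return ticks


machine = FakeMachine(txs=["tx1", "tx2"])
assert wait_until_final(machine, ["tx1", "tx2"]) == 3
```

In the real tests the "tick" is `clock.advance(testerchain.tx_machine._task.interval)`, and a short `deferLater` afterwards gives the reactor a chance to fire the completion hooks.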
@@ -142,6 +166,7 @@ def test_post_transcript(agent, transcripts, transacting_powers, testerchain):
assert ritual_dkg_key is None # no dkg key available until ritual is completed
@pytest_twisted.inlineCallbacks
def test_post_aggregation(
agent,
aggregated_transcript,
@@ -149,24 +174,37 @@ def test_post_aggregation(
transacting_powers,
cohort,
testerchain,
clock,
mock_async_hooks,
):
testerchain.tx_machine.start()
ritual_id = agent.number_of_rituals() - 1
participant_public_keys = {}
txs = []
participant_public_key = SessionStaticSecret.random().public_key()
for i, transacting_power in enumerate(transacting_powers):
participant_public_key = SessionStaticSecret.random().public_key()
txhash = agent.post_aggregation(
async_tx = agent.post_aggregation(
ritual_id=ritual_id,
aggregated_transcript=aggregated_transcript,
public_key=dkg_public_key,
participant_public_key=participant_public_key,
transacting_power=transacting_power,
async_tx_hooks=mock_async_hooks,
)
txs.append(async_tx)
testerchain.tx_machine.start()
while not all([tx.final for tx in txs]):
yield clock.advance(testerchain.tx_machine._task.interval)
testerchain.tx_machine.stop()
for i, async_tx in enumerate(txs):
participant_public_keys[cohort[i]] = participant_public_key
receipt = testerchain.wait_for_receipt(txhash)
post_aggregation_events = (
agent.contract.events.AggregationPosted().process_receipt(receipt)
agent.contract.events.AggregationPosted().process_receipt(async_tx.receipt)
)
# assert len(post_aggregation_events) == 1
assert len(post_aggregation_events) == 1
event = post_aggregation_events[0]
assert event["args"]["ritualId"] == ritual_id
assert event["args"]["aggregatedTranscriptDigest"] == keccak(
@@ -180,6 +218,18 @@ def test_post_aggregation(
participant_public_keys[p.provider]
)
# ensure relevant hooks are called (once for each tx) OR not called (failure ones)
yield deferLater(reactor, 0.2, lambda: None)
assert mock_async_hooks.on_broadcast.call_count == len(txs)
assert mock_async_hooks.on_finalized.call_count == len(txs)
for async_tx in txs:
assert async_tx.successful is True
# failure hooks not called
assert mock_async_hooks.on_broadcast_failure.call_count == 0
assert mock_async_hooks.on_fault.call_count == 0
assert mock_async_hooks.on_insufficient_funds.call_count == 0
ritual = agent.get_ritual(ritual_id)
assert ritual.participant_public_keys == participant_public_keys

View File

@@ -9,7 +9,6 @@ from nucypher.blockchain.eth.agents import (
WeightedSampler,
)
from nucypher.blockchain.eth.constants import NULL_ADDRESS
from nucypher.blockchain.eth.signers.software import Web3Signer
from nucypher.crypto.powers import TransactingPower
@@ -19,9 +18,10 @@ def test_sampling_distribution(
threshold_staking,
coordinator_agent,
deployer_account,
accounts,
):
# setup
stake_provider_accounts = testerchain.stake_providers_accounts
stake_provider_accounts = accounts.staking_providers_accounts
amount = taco_application_agent.get_min_authorization()
all_locked_tokens = len(stake_provider_accounts) * amount
@@ -35,7 +35,10 @@ def test_sampling_distribution(
provider_address, 0, amount, sender=deployer_account
)
power = TransactingPower(account=provider_address, signer=Web3Signer(testerchain.client))
power = TransactingPower(
account=provider_address,
signer=accounts.get_account_signer(provider_address),
)
# We assume that the staking provider knows in advance the account of her operator
taco_application_agent.bond_operator(

View File

@@ -2,9 +2,7 @@ import random
import pytest
from nucypher.blockchain.eth.agents import TACoApplicationAgent
from nucypher.blockchain.eth.constants import NULL_ADDRESS
from nucypher.blockchain.eth.signers.software import Web3Signer
from nucypher.crypto.powers import TransactingPower
@@ -30,13 +28,14 @@ def test_authorized_tokens(
def test_staking_providers_and_operators_relationships(
testerchain,
accounts,
taco_application_agent,
threshold_staking,
taco_application,
deployer_account,
get_random_checksum_address,
):
staking_provider_account, operator_account, *other = testerchain.unassigned_accounts
staking_provider_account, operator_account, *other = accounts.unassigned_accounts
threshold_staking.setRoles(staking_provider_account, sender=deployer_account)
threshold_staking.authorizationIncreased(
staking_provider_account,
@@ -51,7 +50,8 @@ def test_staking_providers_and_operators_relationships(
)
tpower = TransactingPower(
account=staking_provider_account, signer=Web3Signer(testerchain.client)
account=staking_provider_account,
signer=accounts.get_account_signer(staking_provider_account),
)
_txhash = taco_application_agent.bond_operator(
transacting_power=tpower,
@@ -88,7 +88,10 @@ def test_get_staker_population(taco_application_agent, staking_providers):
@pytest.mark.usefixtures("staking_providers", "ursulas")
def test_sample_staking_providers(taco_application_agent):
@pytest.mark.parametrize(
"duration", [0, 60 * 60 * 24, 60 * 60 * 24 * 182, 60 * 60 * 24 * 365]
)
def test_sample_staking_providers(taco_application_agent, duration):
all_staking_providers = list(taco_application_agent.get_staking_providers())
providers_population = taco_application_agent.get_staking_providers_population()
@@ -99,13 +102,15 @@ def test_sample_staking_providers(taco_application_agent):
providers_population + 1
) # One more than we have deployed
providers = taco_application_agent.get_staking_provider_reservoir().draw(3)
providers = taco_application_agent.get_staking_provider_reservoir(
duration=duration
).draw(3)
assert len(providers) == 3 # Three...
assert len(set(providers)) == 3 # ...unique addresses
# Same but with pagination
providers = taco_application_agent.get_staking_provider_reservoir(
pagination_size=1
pagination_size=1, duration=duration
).draw(3)
assert len(providers) == 3
assert len(set(providers)) == 3
@@ -114,7 +119,9 @@ def test_sample_staking_providers(taco_application_agent):
# repeat for opposite blockchain light setting
light = taco_application_agent.blockchain.is_light
taco_application_agent.blockchain.is_light = not light
providers = taco_application_agent.get_staking_provider_reservoir().draw(3)
providers = taco_application_agent.get_staking_provider_reservoir(
duration=duration
).draw(3)
assert len(providers) == 3
assert len(set(providers)) == 3
assert len(set(providers).intersection(all_staking_providers)) == 3
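The reservoir draw being tested (`get_staking_provider_reservoir(duration=...).draw(3)`) yields distinct provider addresses, i.e. weighted sampling without replacement. A self-contained sketch of that sampling idea follows; `draw_unique` and the provider/stake values are invented for illustration (the real sampler is `WeightedSampler` in `nucypher.blockchain.eth.agents`).

```python
# Hedged sketch of weighted sampling without replacement, analogous to what a
# staking-provider reservoir draw guarantees: n distinct addresses, each
# picked with probability proportional to its (remaining) stake weight.
import random


def draw_unique(weights: dict, n: int, rng=None):
    """Sample n distinct keys, probability proportional to weight."""
    rng = rng or random.Random(42)  # fixed seed for a reproducible sketch
    pool = dict(weights)
    picked = []
    for _ in range(n):
        total = sum(pool.values())
        r = rng.uniform(0, total)
        acc = 0.0
        for key, w in pool.items():
            acc += w
            if r <= acc:
                picked.append(key)
                del pool[key]  # without replacement: never pick twice
                break
    return picked


providers = {f"0xProvider{i}": stake for i, stake in enumerate([40, 30, 20, 10])}
sample = draw_unique(providers, 3)
assert len(sample) == 3 and len(set(sample)) == 3  # three unique addresses
```

This is why the tests can assert both `len(providers) == 3` and `len(set(providers)) == 3` after every draw, regardless of pagination or the `duration` parameter.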
@@ -132,17 +139,19 @@ def test_sample_staking_providers(taco_application_agent):
def test_get_staking_provider_info(
testerchain, taco_application_agent, get_random_checksum_address
taco_application_agent, ursulas, get_random_checksum_address
):
staking_provider_account, operator_account, *other = testerchain.unassigned_accounts
info: TACoApplicationAgent.StakingProviderInfo = (
taco_application_agent.get_staking_provider_info(
staking_provider=staking_provider_account
)
# existing staker
staking_provider, operator_address = (
ursulas[0].checksum_address,
ursulas[0].operator_address,
)
info = taco_application_agent.get_staking_provider_info(
staking_provider=staking_provider
)
assert info.operator_start_timestamp > 0
assert info.operator == operator_account
assert info.operator_confirmed is False
assert info.operator == operator_address
assert info.operator_confirmed is True
# non-existent staker
info = taco_application_agent.get_staking_provider_info(

View File

@@ -101,7 +101,10 @@ def test_get_staker_population(taco_child_application_agent, staking_providers):
@pytest.mark.usefixtures("staking_providers", "ursulas")
def test_sample_staking_providers(taco_child_application_agent):
@pytest.mark.parametrize(
"duration", [0, 60 * 60 * 24, 60 * 60 * 24 * 182, 60 * 60 * 24 * 365]
)
def test_sample_staking_providers(taco_child_application_agent, duration):
all_staking_providers = list(taco_child_application_agent.get_staking_providers())
providers_population = (
taco_child_application_agent.get_staking_providers_population()
@@ -110,18 +113,22 @@ def test_sample_staking_providers(taco_child_application_agent):
assert len(all_staking_providers) == providers_population
with pytest.raises(taco_child_application_agent.NotEnoughStakingProviders):
taco_child_application_agent.get_staking_provider_reservoir().draw(
taco_child_application_agent.get_staking_provider_reservoir(
duration=duration
).draw(
providers_population + 1
) # One more than we have deployed
providers = taco_child_application_agent.get_staking_provider_reservoir().draw(3)
providers = taco_child_application_agent.get_staking_provider_reservoir(
duration=duration
).draw(3)
assert len(providers) == 3 # Three...
assert len(set(providers)) == 3 # ...unique addresses
assert len(set(providers).intersection(all_staking_providers)) == 3
# Same but with pagination
providers = taco_child_application_agent.get_staking_provider_reservoir(
pagination_size=1
pagination_size=1, duration=duration
).draw(3)
assert len(providers) == 3
assert len(set(providers)) == 3
@@ -130,7 +137,9 @@ def test_sample_staking_providers(taco_child_application_agent):
# repeat for opposite blockchain light setting
light = taco_child_application_agent.blockchain.is_light
taco_child_application_agent.blockchain.is_light = not light
providers = taco_child_application_agent.get_staking_provider_reservoir().draw(3)
providers = taco_child_application_agent.get_staking_provider_reservoir(
duration=duration
).draw(3)
assert len(providers) == 3
assert len(set(providers)) == 3
assert len(set(providers).intersection(all_staking_providers)) == 3
@@ -145,3 +154,33 @@ def test_sample_staking_providers(taco_child_application_agent):
assert len(set(providers)) == 3
assert len(set(providers).intersection(all_staking_providers)) == 3
assert len(set(providers).intersection(exclude_providers)) == 0
def test_get_staking_provider_info(
taco_child_application_agent, ursulas, get_random_checksum_address
):
# existing staker
staking_provider, operator_address = (
ursulas[0].checksum_address,
ursulas[0].operator_address,
)
info = taco_child_application_agent.staking_provider_info(
staking_provider=staking_provider
)
assert info.operator == operator_address
assert info.authorized > taco_child_application_agent.get_min_authorization()
assert info.operator_confirmed is True
assert info.index == 1
assert info.deauthorizing == 0
assert info.end_deauthorization == 0
# non-existent staker
info = taco_child_application_agent.staking_provider_info(
get_random_checksum_address()
)
assert info.operator == NULL_ADDRESS
assert info.authorized == 0
assert info.operator_confirmed is False
assert info.index == 0
assert info.deauthorizing == 0
assert info.end_deauthorization == 0

View File

@@ -1,116 +0,0 @@
import pytest
from eth_tester.exceptions import TransactionFailed
from nucypher.blockchain.eth.agents import ContractAgency, NucypherTokenAgent
from nucypher.blockchain.eth.signers.software import Web3Signer
from nucypher.crypto.powers import TransactingPower
from tests.constants import TEST_ETH_PROVIDER_URI
@pytest.fixture(scope='module')
def agent(testerchain, test_registry) -> NucypherTokenAgent:
token_agent = ContractAgency.get_agent(
NucypherTokenAgent,
registry=test_registry,
blockchain_endpoint=TEST_ETH_PROVIDER_URI,
)
return token_agent
@pytest.mark.skip()
def test_token_properties(agent):
testerchain = agent.blockchain
# Internal
assert 'NuCypher' == agent.contract.functions.name().call()
assert 18 == agent.contract.functions.decimals().call()
assert 'NU' == agent.contract.functions.symbol().call()
# Cannot transfer any ETH to token contract
with pytest.raises((TransactionFailed, ValueError)):
origin = testerchain.client.coinbase
payload = {'from': origin, 'to': agent.contract_address, 'value': 1}
tx = testerchain.client.send_transaction(payload)
testerchain.wait_for_receipt(tx)
assert len(agent.contract_address) == 42
assert agent.contract.address == agent.contract_address
assert agent.contract_name == NucypherTokenAgent.contract_name
@pytest.mark.skip()
def test_get_balance(agent):
testerchain = agent.blockchain
deployer, someone, *everybody_else = testerchain.client.accounts
balance = agent.get_balance(address=someone)
assert balance == 0
balance = agent.get_balance(address=deployer)
assert balance > 0
@pytest.mark.skip()
def test_approve_transfer(agent, taco_application_agent):
testerchain = agent.blockchain
deployer, someone, *everybody_else = testerchain.client.accounts
tpower = TransactingPower(account=someone, signer=Web3Signer(testerchain.client))
# Approve
receipt = agent.approve_transfer(
amount=taco_application_agent.get_min_authorization(),
spender_address=agent.contract_address,
transacting_power=tpower,
)
assert receipt['status'] == 1, "Transaction Rejected"
assert receipt['logs'][0]['address'] == agent.contract_address
@pytest.mark.skip()
def test_transfer(agent, taco_application_agent):
testerchain = agent.blockchain
origin, someone, *everybody_else = testerchain.client.accounts
tpower = TransactingPower(account=origin, signer=Web3Signer(testerchain.client))
old_balance = agent.get_balance(someone)
receipt = agent.transfer(
amount=taco_application_agent.get_min_authorization(),
target_address=someone,
transacting_power=tpower,
)
assert receipt['status'] == 1, "Transaction Rejected"
assert receipt['logs'][0]['address'] == agent.contract_address
new_balance = agent.get_balance(someone)
assert new_balance == old_balance + taco_application_agent.get_min_authorization()
@pytest.mark.skip()
def test_approve_and_call(agent, taco_application_agent, deploy_contract):
testerchain = agent.blockchain
deployer, someone, *everybody_else = testerchain.client.accounts
mock_target, _ = deploy_contract('ReceiveApprovalMethodMock')
# Approve and call
tpower = TransactingPower(account=someone, signer=Web3Signer(testerchain.client))
call_data = b"Good morning, that's a nice tnetennba."
receipt = agent.approve_and_call(
amount=taco_application_agent.get_min_authorization(),
target_address=mock_target.address,
transacting_power=tpower,
call_data=call_data,
)
assert receipt['status'] == 1, "Transaction Rejected"
assert receipt['logs'][0]['address'] == agent.contract_address
assert mock_target.functions.extraData().call() == call_data
assert mock_target.functions.sender().call() == someone
assert (
mock_target.functions.value().call()
== taco_application_agent.get_min_authorization()
)

View File

@@ -7,6 +7,16 @@ dependencies:
- name: nucypher-contracts
github: nucypher/nucypher-contracts
ref: main
config_override:
solidity:
version: 0.8.23
evm_version: paris
import_remapping:
- "@openzeppelin/contracts=openzeppelin/v5.0.0"
- "@openzeppelin-upgradeable/contracts=openzeppelin-upgradeable/v5.0.0"
- "@fx-portal/contracts=fx-portal/v1.0.5"
- "@threshold/contracts=threshold/v1.2.1"
- name: openzeppelin
github: OpenZeppelin/openzeppelin-contracts
version: 5.0.0
@@ -17,7 +27,6 @@ solidity:
import_remapping:
- "@openzeppelin/contracts=openzeppelin/v5.0.0"
test:
provider:
chain_id: 131277322940537 # ensure ape doesn't change chain id to 1337

View File

@@ -5,11 +5,13 @@ from twisted.logger import LogLevel, globalLogPublisher
from nucypher.acumen.perception import FleetSensor
from nucypher.blockchain.eth.constants import NULL_ADDRESS
from nucypher.blockchain.eth.signers.software import Web3Signer
from nucypher.config.constants import TEMPORARY_DOMAIN_NAME
from nucypher.crypto.powers import TransactingPower
from tests.constants import TEST_ETH_PROVIDER_URI
from tests.utils.ursula import make_ursulas, start_pytest_ursula_services
from tests.utils.ursula import (
make_ursulas,
start_pytest_ursula_services,
)
def test_ursula_stamp_verification_tolerance(ursulas, mocker):
@@ -79,6 +81,7 @@ def test_ursula_stamp_verification_tolerance(ursulas, mocker):
def test_invalid_operators_tolerance(
testerchain,
accounts,
test_registry,
ursulas,
threshold_staking,
@@ -90,15 +93,12 @@ def test_invalid_operators_tolerance(
#
# Setup
#
(
creator,
_staking_provider,
operator_address,
*everyone_else,
) = testerchain.client.accounts
existing_ursula, other_ursula, *the_others = list(ursulas)
_staking_provider, operator_address = (
accounts.unassigned_accounts[0],
accounts.unassigned_accounts[1],
)
# We start with an ursula with no tokens staked
owner, _, _ = threshold_staking.rolesOf(_staking_provider, sender=deployer_account)
assert owner == NULL_ADDRESS
@@ -112,7 +112,7 @@ def test_invalid_operators_tolerance(
# now lets bond this worker
tpower = TransactingPower(
account=_staking_provider, signer=Web3Signer(testerchain.client)
account=_staking_provider, signer=accounts.get_account_signer(_staking_provider)
)
taco_application_agent.bond_operator(
staking_provider=_staking_provider,
@@ -121,7 +121,11 @@
)
# Make the Operator
ursulas = make_ursulas(ursula_test_config, [_staking_provider], [operator_address])
ursulas = make_ursulas(
ursula_test_config,
[_staking_provider],
[accounts.get_account_signer(operator_address)],
)
ursula = ursulas[0]
ursula.run(
preflight=False,


@@ -4,7 +4,6 @@ from eth_utils import to_checksum_address
from nucypher_core.ferveo import Keypair
from nucypher.blockchain.eth.models import Ferveo
from nucypher.blockchain.eth.signers.software import Web3Signer
from nucypher.characters.lawful import Character
from nucypher.config.constants import TEMPORARY_DOMAIN_NAME
from nucypher.crypto.powers import TransactingPower
@@ -15,10 +14,9 @@ from tests.constants import INSECURE_DEVELOPMENT_PASSWORD, TEST_ETH_PROVIDER_URI
TransactingPower.lock_account = LOCK_FUNCTION
def test_character_transacting_power_signing(testerchain, test_registry):
def test_character_transacting_power_signing(testerchain, test_registry, accounts):
# Pretend to be a character.
eth_address = testerchain.etherbase_account
eth_address = accounts.etherbase_account
signer = Character(
is_me=True,
domain=TEMPORARY_DOMAIN_NAME,
@@ -28,9 +26,11 @@ def test_character_transacting_power_signing(testerchain, test_registry):
)
# Manually consume the power up
transacting_power = TransactingPower(password=INSECURE_DEVELOPMENT_PASSWORD,
signer=Web3Signer(testerchain.client),
account=eth_address)
transacting_power = TransactingPower(
password=INSECURE_DEVELOPMENT_PASSWORD,
signer=accounts.get_account_signer(eth_address),
account=eth_address,
)
signer._crypto_power.consume_power_up(transacting_power)
@@ -40,68 +40,80 @@ def test_character_transacting_power_signing(testerchain, test_registry):
assert power == transacting_power
# Sign Message
data_to_sign = b'Premium Select Luxury Pencil Holder'
data_to_sign = b"Premium Select Luxury Pencil Holder"
signature = power.sign_message(message=data_to_sign)
is_verified = verify_eip_191(address=eth_address, message=data_to_sign, signature=signature)
is_verified = verify_eip_191(
address=eth_address, message=data_to_sign, signature=signature
)
assert is_verified is True
# Sign Transaction
transaction_dict = {'nonce': testerchain.client.w3.eth.get_transaction_count(eth_address),
'gasPrice': testerchain.client.w3.eth.gas_price,
'gas': 100000,
'from': eth_address,
'to': testerchain.unassigned_accounts[1],
'value': 1,
'data': b''}
transaction_dict = {
"nonce": testerchain.client.w3.eth.get_transaction_count(eth_address),
"gasPrice": testerchain.client.w3.eth.gas_price,
"gas": 100000,
"from": eth_address,
"to": accounts.unassigned_accounts[1],
"value": 1,
"data": b"",
}
signed_transaction = power.sign_transaction(transaction_dict=transaction_dict)
raw_transaction = power.sign_transaction(transaction_dict=transaction_dict)
# Demonstrate that the transaction is valid RLP encoded.
restored_transaction = Transaction.from_bytes(serialized_bytes=signed_transaction)
restored_transaction = Transaction.from_bytes(serialized_bytes=raw_transaction)
restored_dict = restored_transaction.as_dict()
assert to_checksum_address(restored_dict['to']) == transaction_dict['to']
assert to_checksum_address(restored_dict["to"]) == transaction_dict["to"]
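The `verify_eip_191` calls in these tests check a "personal_sign" signature, whose payload format is defined by EIP-191: a `\x19Ethereum Signed Message:\n` prefix, the ASCII message length, then the message. The hash itself uses keccak-256 (not available in the standard library), so this sketch shows only the prefixing step:

```python
# EIP-191 "personal_sign" (version 0x45) payload construction. In the real
# scheme this payload is keccak-256 hashed before ECDSA signing/recovery.

def eip191_personal_payload(message: bytes) -> bytes:
    # "\x19Ethereum Signed Message:\n" + message length in ASCII + message
    prefix = b"\x19Ethereum Signed Message:\n" + str(len(message)).encode()
    return prefix + message

payload = eip191_personal_payload(b"Premium Select Luxury Pencil Holder")
assert payload.startswith(b"\x19Ethereum Signed Message:\n35")
```

Verification then recovers the signer's address from the signature over this payload and compares it to the expected address, which is why signing with one key and verifying against another account's address fails below.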
def test_transacting_power_sign_message(testerchain):
def test_transacting_power_sign_message(testerchain, accounts):
# Manually create a TransactingPower
eth_address = testerchain.etherbase_account
power = TransactingPower(password=INSECURE_DEVELOPMENT_PASSWORD,
signer=Web3Signer(testerchain.client),
account=eth_address)
eth_address = accounts.etherbase_account
power = TransactingPower(
password=INSECURE_DEVELOPMENT_PASSWORD,
signer=accounts.get_account_signer(eth_address),
account=eth_address,
)
# Manually unlock
power.unlock(password=INSECURE_DEVELOPMENT_PASSWORD)
# Sign
data_to_sign = b'Premium Select Luxury Pencil Holder'
data_to_sign = b"Premium Select Luxury Pencil Holder"
signature = power.sign_message(message=data_to_sign)
# Verify
is_verified = verify_eip_191(address=eth_address, message=data_to_sign, signature=signature)
is_verified = verify_eip_191(
address=eth_address, message=data_to_sign, signature=signature
)
assert is_verified is True
# Test invalid address/pubkey pair
is_verified = verify_eip_191(address=testerchain.client.accounts[1],
message=data_to_sign,
signature=signature)
is_verified = verify_eip_191(
address=accounts.accounts_addresses[1],
message=data_to_sign,
signature=signature,
)
assert is_verified is False
def test_transacting_power_sign_transaction(testerchain):
def test_transacting_power_sign_transaction(testerchain, accounts):
eth_address = accounts.unassigned_accounts[2]
power = TransactingPower(
password=INSECURE_DEVELOPMENT_PASSWORD,
signer=accounts.get_account_signer(eth_address),
account=eth_address,
)
eth_address = testerchain.unassigned_accounts[2]
power = TransactingPower(password=INSECURE_DEVELOPMENT_PASSWORD,
signer=Web3Signer(testerchain.client),
account=eth_address)
transaction_dict = {'nonce': testerchain.client.w3.eth.get_transaction_count(eth_address),
'gasPrice': testerchain.client.w3.eth.gas_price,
'gas': 100000,
'from': eth_address,
'to': testerchain.unassigned_accounts[1],
'value': 1,
'data': b''}
transaction_dict = {
"nonce": testerchain.client.w3.eth.get_transaction_count(eth_address),
"gasPrice": testerchain.client.w3.eth.gas_price,
"gas": 100000,
"from": eth_address,
"to": accounts.unassigned_accounts[1],
"value": 1,
"data": b"",
}
# Sign
power.activate()
@@ -109,39 +121,50 @@ def test_transacting_power_sign_transaction(testerchain):
# Demonstrate that the transaction is valid RLP encoded.
from eth_account._utils.legacy_transactions import Transaction
restored_transaction = Transaction.from_bytes(serialized_bytes=signed_transaction)
restored_dict = restored_transaction.as_dict()
assert to_checksum_address(restored_dict['to']) == transaction_dict['to']
assert to_checksum_address(restored_dict["to"]) == transaction_dict["to"]
# Try signing with missing transaction fields
del transaction_dict['gas']
del transaction_dict['nonce']
del transaction_dict["gas"]
del transaction_dict["nonce"]
with pytest.raises(TypeError):
power.sign_transaction(transaction_dict=transaction_dict)
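The `TypeError` asserted above comes from signing a legacy transaction dict with required fields deleted. A minimal sketch of that kind of up-front validation (a hypothetical helper, not nucypher's or eth-account's actual implementation):

```python
# Hypothetical field check illustrating why deleting "gas" and "nonce"
# from the transaction dict makes signing raise TypeError.

REQUIRED_TX_FIELDS = ("nonce", "gasPrice", "gas", "to", "value", "data")

def check_tx_fields(transaction_dict: dict) -> None:
    missing = [f for f in REQUIRED_TX_FIELDS if f not in transaction_dict]
    if missing:
        raise TypeError(f"Missing transaction fields: {missing}")

tx = {"nonce": 0, "gasPrice": 1, "gas": 100_000,
      "to": "0x" + "00" * 20, "value": 1, "data": b""}
check_tx_fields(tx)  # complete dict passes silently

del tx["gas"], tx["nonce"]
try:
    check_tx_fields(tx)
    raised = False
except TypeError as e:
    raised = "gas" in str(e) and "nonce" in str(e)
assert raised
```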
def test_transacting_power_sign_agent_transaction(testerchain, coordinator_agent):
def test_transacting_power_sign_agent_transaction(
testerchain, coordinator_agent, accounts
):
public_key = Keypair.random().public_key()
g2_point = Ferveo.G2Point.from_public_key(public_key)
contract_function = coordinator_agent.contract.functions.setProviderPublicKey(
g2_point
)
payload = {'chainId': int(testerchain.client.chain_id),
'nonce': testerchain.client.w3.eth.get_transaction_count(testerchain.etherbase_account),
'from': testerchain.etherbase_account,
'gasPrice': testerchain.client.gas_price,
'gas': 500_000}
payload = {
"chainId": int(testerchain.client.chain_id),
"nonce": testerchain.client.w3.eth.get_transaction_count(
accounts.etherbase_account
),
"from": accounts.etherbase_account,
"gasPrice": testerchain.client.gas_price,
"gas": 500_000,
}
unsigned_transaction = contract_function.build_transaction(payload)
# Sign with Transacting Power
transacting_power = TransactingPower(password=INSECURE_DEVELOPMENT_PASSWORD,
signer=Web3Signer(testerchain.client),
account=testerchain.etherbase_account)
transacting_power = TransactingPower(
password=INSECURE_DEVELOPMENT_PASSWORD,
signer=accounts.get_account_signer(accounts.etherbase_account),
account=accounts.etherbase_account,
)
signed_raw_transaction = transacting_power.sign_transaction(unsigned_transaction)
# Demonstrate that the transaction is valid RLP encoded.
restored_transaction = Transaction.from_bytes(serialized_bytes=signed_raw_transaction)
restored_transaction = Transaction.from_bytes(
serialized_bytes=signed_raw_transaction
)
restored_dict = restored_transaction.as_dict()
assert to_checksum_address(restored_dict['to']) == unsigned_transaction['to']
assert to_checksum_address(restored_dict["to"]) == unsigned_transaction["to"]


@@ -8,7 +8,6 @@ from web3 import Web3
from nucypher.blockchain.eth.agents import ContractAgency, TACoApplicationAgent
from nucypher.blockchain.eth.signers import KeystoreSigner
from nucypher.blockchain.eth.signers.software import Web3Signer
from nucypher.cli.main import nucypher_cli
from nucypher.config.characters import UrsulaConfiguration
from nucypher.config.constants import (
@@ -34,6 +33,7 @@ def mock_funded_account_password_keystore(
taco_application_agent,
test_registry,
deployer_account,
accounts,
):
"""
Generate a random keypair & password and create a local keystore. Then prepare a staking provider
@@ -49,14 +49,14 @@ def mock_funded_account_password_keystore(
testerchain.client.w3.eth.send_transaction(
{
"to": account.address,
"from": testerchain.etherbase_account,
"from": accounts.etherbase_account,
"value": Web3.to_wei("100", "ether"),
}
)
)
# initialize threshold stake
provider_address = testerchain.unassigned_accounts[0]
provider_address = accounts.unassigned_accounts[0]
threshold_staking.setRoles(provider_address, sender=deployer_account)
threshold_staking.setStakes(
provider_address,
@@ -66,7 +66,7 @@ def mock_funded_account_password_keystore(
)
provider_power = TransactingPower(
account=provider_address, signer=Web3Signer(testerchain.client)
account=provider_address, signer=accounts.get_account_signer(provider_address)
)
provider_power.unlock(password=INSECURE_DEVELOPMENT_PASSWORD)
@@ -92,6 +92,7 @@ def test_ursula_and_local_keystore_signer_integration(
mocker,
mock_funded_account_password_keystore,
testerchain,
accounts,
):
config_root_path = tmp_path
ursula_config_path = config_root_path / UrsulaConfiguration.generate_filename()
@@ -101,7 +102,7 @@ def test_ursula_and_local_keystore_signer_integration(
testerchain.client.w3.eth.send_transaction(
{
"to": worker_account,
"from": testerchain.etherbase_account,
"from": accounts.etherbase_account,
"value": Web3.to_wei("100", "ether"),
}
)


@@ -3,6 +3,7 @@ from unittest import mock
import pytest
import pytest_twisted as pt
from eth_account import Account
from twisted.internet import threads
from nucypher.characters.base import Learner
@@ -16,7 +17,6 @@ from nucypher.utilities.networking import LOOPBACK_ADDRESS
from tests.constants import (
INSECURE_DEVELOPMENT_PASSWORD,
TEST_ETH_PROVIDER_URI,
TEST_POLYGON_PROVIDER_URI,
)
from tests.utils.ursula import select_test_port, start_pytest_ursula_services
@@ -32,8 +32,9 @@ def test_missing_configuration_file(_default_filepath_mock, click_runner):
@pt.inlineCallbacks
def test_run_lone_default_development_ursula(click_runner, ursulas, testerchain):
def test_run_lone_default_development_ursula(click_runner, mocker, ursulas, accounts):
deploy_port = select_test_port()
operator_address = ursulas[0].operator_address
args = (
"ursula",
"run", # Start Ursula Command
@@ -44,13 +45,18 @@ def test_run_lone_default_development_ursula(click_runner, ursulas, testerchain)
"--dry-run", # Disable twisted reactor in subprocess
"--lonely", # Do not load seednodes,
"--operator-address",
ursulas[0].operator_address,
operator_address,
"--eth-endpoint",
TEST_ETH_PROVIDER_URI,
"--polygon-endpoint",
TEST_ETH_PROVIDER_URI,
"--signer",
"memory://",
)
account = Account.from_key(private_key=accounts[operator_address].private_key)
mocker.patch.object(Account, "create", return_value=account)
result = yield threads.deferToThread(
click_runner.invoke,
nucypher_cli,


@@ -7,8 +7,12 @@ import pytest
from hexbytes import HexBytes
from web3 import Web3
from web3.providers import BaseProvider
from web3.types import ABIFunction
from nucypher.blockchain.eth.agents import ContractAgency, SubscriptionManagerAgent
from nucypher.blockchain.eth.agents import (
ContractAgency,
SubscriptionManagerAgent,
)
from nucypher.blockchain.eth.constants import NULL_ADDRESS
from nucypher.policy.conditions.context import (
USER_ADDRESS_CONTEXT,
@@ -75,7 +79,9 @@ def test_user_address_context_invalid_eip712_typed_data(valid_user_address_conte
get_context_value(USER_ADDRESS_CONTEXT, **context)
def test_user_address_context_variable_verification(testerchain, valid_user_address_context):
def test_user_address_context_variable_verification(
valid_user_address_context, accounts
):
# valid user address context - signature matches address
address = get_context_value(USER_ADDRESS_CONTEXT, **valid_user_address_context)
assert address == valid_user_address_context[USER_ADDRESS_CONTEXT]["address"]
@@ -85,7 +91,7 @@ def test_user_address_context_variable_verification(testerchain, valid_user_addr
mismatch_with_address_context = copy.deepcopy(valid_user_address_context)
mismatch_with_address_context[USER_ADDRESS_CONTEXT][
"address"
] = testerchain.etherbase_account
] = accounts.etherbase_account
with pytest.raises(ContextVariableVerificationFailed):
get_context_value(USER_ADDRESS_CONTEXT, **mismatch_with_address_context)
@@ -115,9 +121,9 @@ def test_user_address_context_variable_verification(testerchain, valid_user_addr
side_effect=_dont_validate_user_address,
)
def test_rpc_condition_evaluation_no_providers(
get_context_value_mock, testerchain, rpc_condition
get_context_value_mock, testerchain, accounts, rpc_condition
):
context = {USER_ADDRESS_CONTEXT: {"address": testerchain.unassigned_accounts[0]}}
context = {USER_ADDRESS_CONTEXT: {"address": accounts.unassigned_accounts[0]}}
with pytest.raises(NoConnectionToChain):
_ = rpc_condition.verify(providers={}, **context)
@@ -132,9 +138,9 @@ def test_rpc_condition_evaluation_no_providers(
side_effect=_dont_validate_user_address,
)
def test_rpc_condition_evaluation_invalid_provider_for_chain(
get_context_value_mock, testerchain, rpc_condition
get_context_value_mock, testerchain, accounts, rpc_condition
):
context = {USER_ADDRESS_CONTEXT: {"address": testerchain.unassigned_accounts[0]}}
context = {USER_ADDRESS_CONTEXT: {"address": accounts.unassigned_accounts[0]}}
new_chain = 23
rpc_condition.chain = new_chain
condition_providers = {new_chain: {testerchain.provider}}
@@ -148,8 +154,10 @@ def test_rpc_condition_evaluation_invalid_provider_for_chain(
GET_CONTEXT_VALUE_IMPORT_PATH,
side_effect=_dont_validate_user_address,
)
def test_rpc_condition_evaluation(get_context_value_mock, testerchain, rpc_condition, condition_providers):
context = {USER_ADDRESS_CONTEXT: {"address": testerchain.unassigned_accounts[0]}}
def test_rpc_condition_evaluation(
get_context_value_mock, accounts, rpc_condition, condition_providers
):
context = {USER_ADDRESS_CONTEXT: {"address": accounts.unassigned_accounts[0]}}
condition_result, call_result = rpc_condition.verify(
providers=condition_providers, **context
)
@@ -164,9 +172,9 @@ def test_rpc_condition_evaluation(get_context_value_mock, testerchain, rpc_condi
side_effect=_dont_validate_user_address,
)
def test_rpc_condition_evaluation_multiple_chain_providers(
get_context_value_mock, testerchain, rpc_condition
get_context_value_mock, testerchain, accounts, rpc_condition
):
context = {USER_ADDRESS_CONTEXT: {"address": testerchain.unassigned_accounts[0]}}
context = {USER_ADDRESS_CONTEXT: {"address": accounts.unassigned_accounts[0]}}
condition_providers = {
"1": {"fake1a", "fake1b"},
@@ -190,9 +198,9 @@ def test_rpc_condition_evaluation_multiple_chain_providers(
side_effect=_dont_validate_user_address,
)
def test_rpc_condition_evaluation_multiple_providers_no_valid_fallback(
get_context_value_mock, mocker, testerchain, rpc_condition
get_context_value_mock, mocker, accounts, rpc_condition
):
context = {USER_ADDRESS_CONTEXT: {"address": testerchain.unassigned_accounts[0]}}
context = {USER_ADDRESS_CONTEXT: {"address": accounts.unassigned_accounts[0]}}
def my_configure_w3(provider: BaseProvider):
return Web3(provider)
@@ -219,9 +227,9 @@ def test_rpc_condition_evaluation_multiple_providers_no_valid_fallback(
side_effect=_dont_validate_user_address,
)
def test_rpc_condition_evaluation_multiple_providers_valid_fallback(
get_context_value_mock, mocker, testerchain, rpc_condition
get_context_value_mock, mocker, testerchain, accounts, rpc_condition
):
context = {USER_ADDRESS_CONTEXT: {"address": testerchain.unassigned_accounts[0]}}
context = {USER_ADDRESS_CONTEXT: {"address": accounts.unassigned_accounts[0]}}
def my_configure_w3(provider: BaseProvider):
return Web3(provider)
@@ -256,9 +264,9 @@ def test_rpc_condition_evaluation_multiple_providers_valid_fallback(
side_effect=_dont_validate_user_address,
)
def test_rpc_condition_evaluation_no_connection_to_chain(
get_context_value_mock, testerchain, rpc_condition
get_context_value_mock, testerchain, accounts, rpc_condition
):
context = {USER_ADDRESS_CONTEXT: {"address": testerchain.unassigned_accounts[0]}}
context = {USER_ADDRESS_CONTEXT: {"address": accounts.unassigned_accounts[0]}}
# condition providers for other unrelated chains
providers = {
@@ -275,9 +283,9 @@ def test_rpc_condition_evaluation_no_connection_to_chain(
side_effect=_dont_validate_user_address,
)
def test_rpc_condition_evaluation_with_context_var_in_return_value_test(
get_context_value_mock, testerchain, condition_providers
get_context_value_mock, testerchain, accounts, condition_providers
):
account, *other_accounts = testerchain.client.accounts
account, *other_accounts = accounts.accounts_addresses
balance = testerchain.client.get_balance(account)
# we have balance stored, use for rpc condition with context variable
@@ -314,15 +322,15 @@ def test_rpc_condition_evaluation_with_context_var_in_return_value_test(
side_effect=_dont_validate_user_address,
)
def test_erc20_evm_condition_evaluation(
get_context_value_mock, testerchain, erc20_evm_condition_balanceof, condition_providers
get_context_value_mock, erc20_evm_condition_balanceof, condition_providers, accounts
):
context = {USER_ADDRESS_CONTEXT: {"address": testerchain.unassigned_accounts[0]}}
context = {USER_ADDRESS_CONTEXT: {"address": accounts.unassigned_accounts[0]}}
condition_result, call_result = erc20_evm_condition_balanceof.verify(
providers=condition_providers, **context
)
assert condition_result is True
context[USER_ADDRESS_CONTEXT]["address"] = testerchain.etherbase_account
context[USER_ADDRESS_CONTEXT]["address"] = accounts.etherbase_account
condition_result, call_result = erc20_evm_condition_balanceof.verify(
providers=condition_providers, **context
)
@@ -330,15 +338,15 @@ def test_erc20_evm_condition_evaluation(
def test_erc20_evm_condition_evaluation_with_custom_context_variable(
testerchain, custom_context_variable_erc20_condition, condition_providers
custom_context_variable_erc20_condition, condition_providers, accounts
):
context = {":addressToUse": testerchain.unassigned_accounts[0]}
context = {":addressToUse": accounts.unassigned_accounts[0]}
condition_result, call_result = custom_context_variable_erc20_condition.verify(
providers=condition_providers, **context
)
assert condition_result is True
context[":addressToUse"] = testerchain.etherbase_account
context[":addressToUse"] = accounts.etherbase_account
condition_result, call_result = custom_context_variable_erc20_condition.verify(
providers=condition_providers, **context
)
@@ -350,9 +358,13 @@ def test_erc20_evm_condition_evaluation_with_custom_context_variable(
side_effect=_dont_validate_user_address,
)
def test_erc721_evm_condition_owner_evaluation(
get_context_value_mock, testerchain, test_registry, erc721_evm_condition_owner, condition_providers
get_context_value_mock,
accounts,
test_registry,
erc721_evm_condition_owner,
condition_providers,
):
account, *other_accounts = testerchain.client.accounts
account, *other_accounts = accounts.accounts_addresses
# valid owner of nft
context = {
USER_ADDRESS_CONTEXT: {"address": account},
@@ -389,9 +401,13 @@ def test_erc721_evm_condition_owner_evaluation(
side_effect=_dont_validate_user_address,
)
def test_erc721_evm_condition_balanceof_evaluation(
get_context_value_mock, testerchain, test_registry, erc721_evm_condition_balanceof, condition_providers
get_context_value_mock,
accounts,
test_registry,
erc721_evm_condition_balanceof,
condition_providers,
):
account, *other_accounts = testerchain.client.accounts
account, *other_accounts = accounts.accounts_addresses
context = {USER_ADDRESS_CONTEXT: {"address": account}} # owner of NFT
condition_result, call_result = erc721_evm_condition_balanceof.verify(
providers=condition_providers, **context
@@ -719,11 +735,11 @@ def test_not_of_simple_compound_conditions_lingo_evaluation(
)
def test_onchain_conditions_lingo_evaluation(
get_context_value_mock,
testerchain,
compound_lingo,
condition_providers,
accounts,
):
context = {USER_ADDRESS_CONTEXT: {"address": testerchain.etherbase_account}}
context = {USER_ADDRESS_CONTEXT: {"address": accounts.etherbase_account}}
result = compound_lingo.eval(providers=condition_providers, **context)
assert result is True
@@ -734,11 +750,11 @@ def test_onchain_conditions_lingo_evaluation(
)
def test_not_of_onchain_conditions_lingo_evaluation(
get_context_value_mock,
testerchain,
compound_lingo,
condition_providers,
accounts,
):
context = {USER_ADDRESS_CONTEXT: {"address": testerchain.etherbase_account}}
context = {USER_ADDRESS_CONTEXT: {"address": accounts.etherbase_account}}
result = compound_lingo.eval(providers=condition_providers, **context)
assert result is True
@@ -788,3 +804,154 @@ def test_single_retrieve_with_onchain_conditions(enacted_policy, bob, ursulas):
)
assert cleartexts == messages
@pytest.mark.usefixtures("staking_providers")
def test_contract_condition_using_overloaded_function(
taco_child_application_agent, condition_providers
):
(
total_staked,
providers,
) = taco_child_application_agent._get_active_staking_providers_raw(0, 10, 0)
expected_result = [
total_staked,
[
HexBytes(provider_bytes).hex() for provider_bytes in providers
], # must be json serializable
]
context = {
":expectedStakingProviders": expected_result,
} # user-defined context vars
#
# valid overloaded function - 2 params
#
valid_abi_2_params = {
"type": "function",
"name": "getActiveStakingProviders",
"stateMutability": "view",
"inputs": [
{"name": "_startIndex", "type": "uint256", "internalType": "uint256"},
{
"name": "_maxStakingProviders",
"type": "uint256",
"internalType": "uint256",
},
],
"outputs": [
{"name": "allAuthorizedTokens", "type": "uint96", "internalType": "uint96"},
{
"name": "activeStakingProviders",
"type": "bytes32[]",
"internalType": "bytes32[]",
},
],
}
condition = ContractCondition(
contract_address=taco_child_application_agent.contract.address,
function_abi=ABIFunction(valid_abi_2_params),
method="getActiveStakingProviders",
chain=TESTERCHAIN_CHAIN_ID,
return_value_test=ReturnValueTest("==", ":expectedStakingProviders"),
parameters=[0, 10],
)
condition_result, call_result = condition.verify(
providers=condition_providers, **context
)
assert condition_result, "results match and condition passes"
json_serializable_result = [
call_result[0],
[HexBytes(provider_bytes).hex() for provider_bytes in call_result[1]],
]
assert expected_result == json_serializable_result
#
# valid overloaded function - 3 params
#
valid_abi_3_params = {
"type": "function",
"name": "getActiveStakingProviders",
"stateMutability": "view",
"inputs": [
{"name": "_startIndex", "type": "uint256", "internalType": "uint256"},
{
"name": "_maxStakingProviders",
"type": "uint256",
"internalType": "uint256",
},
{"name": "_cohortDuration", "type": "uint32", "internalType": "uint32"},
],
"outputs": [
{"name": "allAuthorizedTokens", "type": "uint96", "internalType": "uint96"},
{
"name": "activeStakingProviders",
"type": "bytes32[]",
"internalType": "bytes32[]",
},
],
}
condition = ContractCondition(
contract_address=taco_child_application_agent.contract.address,
function_abi=ABIFunction(valid_abi_3_params),
method="getActiveStakingProviders",
chain=TESTERCHAIN_CHAIN_ID,
return_value_test=ReturnValueTest("==", ":expectedStakingProviders"),
parameters=[0, 10, 0],
)
condition_result, call_result = condition.verify(
providers=condition_providers, **context
)
assert condition_result, "results match and condition passes"
json_serializable_result = [
call_result[0],
[HexBytes(provider_bytes).hex() for provider_bytes in call_result[1]],
]
assert expected_result == json_serializable_result
#
# valid overloaded contract abi but wrong parameters
#
condition = ContractCondition(
contract_address=taco_child_application_agent.contract.address,
function_abi=ABIFunction(valid_abi_3_params),
method="getActiveStakingProviders",
chain=TESTERCHAIN_CHAIN_ID,
return_value_test=ReturnValueTest("==", ":expectedStakingProviders"),
parameters=[0, 10], # 2 params instead of 3 (old overloaded function)
)
with pytest.raises(RPCExecutionFailed):
_ = condition.verify(providers=condition_providers, **context)
#
# invalid abi
#
invalid_abi_all_bool_inputs = {
"type": "function",
"name": "getActiveStakingProviders",
"stateMutability": "view",
"inputs": [
{"name": "_startIndex", "type": "bool", "internalType": "bool"},
{"name": "_maxStakingProviders", "type": "bool", "internalType": "bool"},
{"name": "_cohortDuration", "type": "bool", "internalType": "bool"},
],
"outputs": [
{"name": "allAuthorizedTokens", "type": "uint96", "internalType": "uint96"},
{
"name": "activeStakingProviders",
"type": "bytes32[]",
"internalType": "bytes32[]",
},
],
}
condition = ContractCondition(
contract_address=taco_child_application_agent.contract.address,
function_abi=ABIFunction(invalid_abi_all_bool_inputs),
method="getActiveStakingProviders",
chain=TESTERCHAIN_CHAIN_ID,
return_value_test=ReturnValueTest("==", ":expectedStakingProviders"),
parameters=[False, False, False], # parameters match fake abi
)
with pytest.raises(RPCExecutionFailed):
_ = condition.verify(providers=condition_providers, **context)
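The test above pins down which overload of `getActiveStakingProviders` runs by passing an explicit `ABIFunction` whose inputs match the supplied parameters. A minimal sketch of that disambiguation idea (a hypothetical helper selecting by parameter count, not web3.py's actual resolver, which also checks types):

```python
# Overload resolution sketch: two on-chain functions share a name, so the
# caller selects one by matching the ABI entry's inputs to the parameters.

def select_overload(abis, name, params):
    candidates = [
        abi for abi in abis
        if abi["name"] == name and len(abi["inputs"]) == len(params)
    ]
    if len(candidates) != 1:
        raise ValueError(f"{len(candidates)} overloads of {name} match")
    return candidates[0]

abis = [
    {"name": "getActiveStakingProviders",
     "inputs": [{"type": "uint256"}, {"type": "uint256"}]},
    {"name": "getActiveStakingProviders",
     "inputs": [{"type": "uint256"}, {"type": "uint256"},
                {"type": "uint32"}]},
]

two = select_overload(abis, "getActiveStakingProviders", [0, 10])
three = select_overload(abis, "getActiveStakingProviders", [0, 10, 0])
assert len(two["inputs"]) == 2
assert len(three["inputs"]) == 3
```

This is also why the "wrong parameters" case in the test fails at RPC time rather than at resolution time: the explicit ABI bypasses name-based lookup, so a mismatched call is only rejected by the node.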


@@ -14,19 +14,16 @@ from nucypher.blockchain.eth.agents import (
)
from nucypher.blockchain.eth.interfaces import BlockchainInterfaceFactory
from nucypher.blockchain.eth.registry import ContractRegistry, RegistrySourceManager
from nucypher.blockchain.eth.signers.software import Web3Signer
from nucypher.crypto.powers import TransactingPower
from nucypher.policy.conditions.evm import RPCCondition
from nucypher.utilities.logging import Logger
from tests.constants import (
BONUS_TOKENS_FOR_TESTS,
INSECURE_DEVELOPMENT_PASSWORD,
MIN_OPERATOR_SECONDS,
TEMPORARY_DOMAIN,
TEST_ETH_PROVIDER_URI,
TESTERCHAIN_CHAIN_ID,
)
from tests.utils.blockchain import TesterBlockchain
from tests.utils.blockchain import ReservedTestAccountManager, TesterBlockchain
from tests.utils.registry import ApeRegistrySource
from tests.utils.ursula import (
mock_permitted_multichain_connections,
@@ -78,6 +75,11 @@ def monkeymodule():
#
@pytest.fixture(scope="session")
def accounts():
return ReservedTestAccountManager()
@pytest.fixture(scope="module")
def deployer_account(accounts):
return accounts[0]
@@ -167,11 +169,11 @@ def taco_application(
maya.now().epoch + COMMITMENT_DEADLINE,
)
proxy = oz_dependency.TransparentUpgradeableProxy.deploy(
proxy = deployer_account.deploy(
oz_dependency.TransparentUpgradeableProxy,
taco_application_implementation.address,
deployer_account.address,
b"",
sender=deployer_account,
)
proxy_contract = nucypher_dependency.TACoApplication.at(proxy.address)
@@ -194,11 +196,11 @@ def taco_child_application(
MIN_AUTHORIZATION,
)
proxy = oz_dependency.TransparentUpgradeableProxy.deploy(
proxy = deployer_account.deploy(
oz_dependency.TransparentUpgradeableProxy,
taco_child_application_implementation.address,
deployer_account.address,
b"",
sender=deployer_account,
)
proxy_contract = nucypher_dependency.TACoChildApplication.at(proxy.address)
taco_application.setChildApplication(
@@ -226,11 +228,11 @@ def coordinator(
encoded_initializer_function = _coordinator.initialize.encode_input(
TIMEOUT, MAX_DKG_SIZE, deployer_account.address
)
proxy = oz_dependency.TransparentUpgradeableProxy.deploy(
proxy = deployer_account.deploy(
oz_dependency.TransparentUpgradeableProxy,
_coordinator.address,
deployer_account.address,
encoded_initializer_function,
sender=deployer_account,
)
proxy_contract = nucypher_dependency.Coordinator.at(proxy.address)
@@ -301,10 +303,11 @@ def test_registry(deployed_contracts, module_mocker):
@pytest.mark.usefixtures("test_registry")
@pytest.fixture(scope="module")
def testerchain(project) -> TesterBlockchain:
def testerchain(project, clock, accounts) -> TesterBlockchain:
# Extract the web3 provider containing EthereumTester from the ape project's chain manager
provider = project.chain_manager.provider.web3.provider
testerchain = TesterBlockchain(provider=provider)
testerchain.tx_machine._task.clock = clock
BlockchainInterfaceFactory.register_interface(interface=testerchain, force=True)
yield testerchain
@@ -326,13 +329,8 @@ def staking_providers(
staking_providers = list()
for provider_address, operator_address in zip(
testerchain.stake_providers_accounts, testerchain.ursulas_accounts
accounts.staking_providers_accounts, accounts.ursulas_accounts
):
provider_power = TransactingPower(
account=provider_address, signer=Web3Signer(testerchain.client)
)
provider_power.unlock(password=INSECURE_DEVELOPMENT_PASSWORD)
# for a random amount
amount = minimum_stake + random.randrange(BONUS_TOKENS_FOR_TESTS)
@@ -344,7 +342,9 @@
)
taco_application.bondOperator(
provider_address, operator_address, sender=accounts[provider_address]
provider_address,
operator_address,
sender=accounts[provider_address],
)
# track


@@ -1,68 +0,0 @@
import pytest
from tests.constants import (
DEVELOPMENT_ETH_AIRDROP_AMOUNT,
NUMBER_OF_ETH_TEST_ACCOUNTS,
NUMBER_OF_STAKING_PROVIDERS_IN_BLOCKCHAIN_TESTS,
NUMBER_OF_URSULAS_IN_BLOCKCHAIN_TESTS,
)
# Prevents TesterBlockchain to be picked up by py.test as a test class
from tests.utils.blockchain import TesterBlockchain as _TesterBlockchain
@pytest.fixture()
def another_testerchain():
testerchain = _TesterBlockchain(eth_airdrop=True, light=True)
testerchain.deployer_address = testerchain.etherbase_account
assert testerchain.is_light
yield testerchain
def test_testerchain_creation(testerchain, another_testerchain):
chains = (testerchain, another_testerchain)
for chain in chains:
# Ensure we are testing on the correct network...
assert "tester" in chain.endpoint
# ... and that there are already some blocks mined
chain.w3.eth.w3.testing.mine(1)
assert chain.w3.eth.block_number > 0
# Check that we have enough test accounts
assert len(chain.client.accounts) >= NUMBER_OF_ETH_TEST_ACCOUNTS
# Check that distinguished accounts are assigned
etherbase = chain.etherbase_account
assert etherbase == chain.client.accounts[0]
alice = chain.alice_account
assert alice == chain.client.accounts[1]
bob = chain.bob_account
assert bob == chain.client.accounts[2]
stakers = [chain.stake_provider_account(i) for i in range(NUMBER_OF_STAKING_PROVIDERS_IN_BLOCKCHAIN_TESTS)]
assert stakers == chain.stake_providers_accounts
ursulas = [chain.ursula_account(i) for i in range(NUMBER_OF_URSULAS_IN_BLOCKCHAIN_TESTS)]
assert ursulas == chain.ursulas_accounts
# Check that the remaining accounts are different from the previous ones:
assert set([etherbase, alice, bob] + ursulas + stakers).isdisjoint(set(chain.unassigned_accounts))
# Check that accounts are funded
for account in chain.client.accounts:
assert chain.client.get_balance(account) >= DEVELOPMENT_ETH_AIRDROP_AMOUNT
# Check that accounts can send transactions
for account in chain.client.accounts:
balance = chain.client.get_balance(account)
assert balance
tx = {'to': etherbase, 'from': account, 'value': 100}
txhash = chain.client.send_transaction(tx)
_receipt = chain.wait_for_receipt(txhash)
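The deleted test's distinguished-account checks (etherbase at index 0, Alice at 1, Bob at 2, then node accounts, remainder unassigned) reduce to a simple partition invariant. A dependency-free sketch of that convention; the exact slot layout used by the new `ReservedTestAccountManager` is an assumption here:

```python
# Assumed layout: accounts[0] = etherbase, [1] = alice, [2] = bob,
# then ursula accounts, with the remainder left unassigned.
NUMBER_OF_ETH_TEST_ACCOUNTS = 30
NUMBER_OF_URSULAS = 10

accounts = [f"0x{i:040x}" for i in range(NUMBER_OF_ETH_TEST_ACCOUNTS)]
etherbase, alice, bob = accounts[0], accounts[1], accounts[2]
ursulas = accounts[3:3 + NUMBER_OF_URSULAS]
unassigned = accounts[3 + NUMBER_OF_URSULAS:]

# Distinguished accounts never overlap the unassigned pool.
assert set([etherbase, alice, bob] + ursulas).isdisjoint(unassigned)
assert len(unassigned) == NUMBER_OF_ETH_TEST_ACCOUNTS - 3 - NUMBER_OF_URSULAS
```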


@ -32,14 +32,8 @@ GLOBAL_ALLOW_LIST = "GlobalAllowList"
# Ursula
#
NUMBER_OF_URSULAS_IN_BLOCKCHAIN_TESTS = 10
NUMBER_OF_STAKING_PROVIDERS_IN_BLOCKCHAIN_TESTS = NUMBER_OF_URSULAS_IN_BLOCKCHAIN_TESTS
# Ursulas (Operators) and Staking Providers have their own account
NUMBER_OF_ETH_TEST_ACCOUNTS = NUMBER_OF_URSULAS_IN_BLOCKCHAIN_TESTS + NUMBER_OF_STAKING_PROVIDERS_IN_BLOCKCHAIN_TESTS + 10
NUMBER_OF_URSULAS_IN_DEVELOPMENT_DOMAIN = NUMBER_OF_URSULAS_IN_BLOCKCHAIN_TESTS
NUMBER_OF_ETH_TEST_ACCOUNTS = 30
#
# Local Signer Keystore


@ -20,9 +20,11 @@ from web3 import Web3
import tests
from nucypher.blockchain.eth.actors import Operator
from nucypher.blockchain.eth.interfaces import BlockchainInterfaceFactory
from nucypher.blockchain.eth.interfaces import (
BlockchainInterface,
BlockchainInterfaceFactory,
)
from nucypher.blockchain.eth.signers.software import KeystoreSigner
from nucypher.blockchain.eth.trackers.dkg import EventScannerTask
from nucypher.characters.lawful import Enrico, Ursula
from nucypher.config.characters import (
AliceConfiguration,
@ -45,6 +47,7 @@ from nucypher.policy.payment import SubscriptionManagerPayment
from nucypher.utilities.emitters import StdoutEmitter
from nucypher.utilities.logging import GlobalLoggerSettings, Logger
from nucypher.utilities.networking import LOOPBACK_ADDRESS
from nucypher.utilities.task import SimpleTask
from tests.constants import (
MIN_OPERATOR_SECONDS,
MOCK_CUSTOM_INSTALLATION_PATH,
@ -65,6 +68,7 @@ from tests.mock.performance_mocks import (
mock_rest_app_creation,
mock_verify_node,
)
from tests.utils.blockchain import ReservedTestAccountManager
from tests.utils.config import (
make_alice_test_configuration,
make_bob_test_configuration,
@ -75,7 +79,12 @@ from tests.utils.middleware import (
MockRestMiddlewareForLargeFleetTests,
)
from tests.utils.policy import generate_random_label
from tests.utils.ursula import MOCK_KNOWN_URSULAS_CACHE, make_ursulas, select_test_port
from tests.utils.ursula import (
MOCK_KNOWN_URSULAS_CACHE,
make_random_ursulas,
make_reserved_ursulas,
select_test_port,
)
test_logger = Logger("test-logger")
@ -98,15 +107,22 @@ def tempfile_path():
@pytest.fixture(scope="module")
def temp_dir_path():
temp_dir = tempfile.TemporaryDirectory(prefix='nucypher-test-')
temp_dir = tempfile.TemporaryDirectory(prefix="nucypher-test-")
yield Path(temp_dir.name)
temp_dir.cleanup()
#
# Accounts
#
@pytest.fixture(scope="session", autouse=True)
def accounts():
"""a la ape"""
return ReservedTestAccountManager()
@pytest.fixture(scope="module")
def random_account():
key = Account.create(extra_entropy="lamborghini mercy")
@ -118,19 +134,19 @@ def random_account():
def random_address(random_account):
return random_account.address
#
# Character Configurations
#
@pytest.fixture(scope="module")
def ursula_test_config(test_registry, temp_dir_path, testerchain):
def ursula_test_config(test_registry, temp_dir_path):
config = make_ursula_test_configuration(
eth_endpoint=TEST_ETH_PROVIDER_URI,
polygon_endpoint=TEST_ETH_PROVIDER_URI,
test_registry=test_registry,
rest_port=select_test_port(),
operator_address=testerchain.ursulas_accounts.pop(),
)
yield config
config.cleanup()
@ -139,12 +155,12 @@ def ursula_test_config(test_registry, temp_dir_path, testerchain):
@pytest.fixture(scope="module")
def alice_test_config(ursulas, testerchain, test_registry):
def alice_test_config(ursulas, accounts, test_registry):
config = make_alice_test_configuration(
eth_endpoint=TEST_ETH_PROVIDER_URI,
polygon_endpoint=TEST_ETH_PROVIDER_URI,
known_nodes=ursulas,
checksum_address=testerchain.alice_account,
checksum_address=accounts.alice_account,
test_registry=test_registry,
)
yield config
@ -152,11 +168,11 @@ def alice_test_config(ursulas, testerchain, test_registry):
@pytest.fixture(scope="module")
def bob_test_config(testerchain, test_registry):
def bob_test_config(accounts, test_registry):
config = make_bob_test_configuration(
eth_endpoint=TEST_ETH_PROVIDER_URI,
test_registry=test_registry,
checksum_address=testerchain.bob_account,
checksum_address=accounts.bob_account,
)
yield config
config.cleanup()
@ -224,7 +240,9 @@ def capsule_side_channel(enacted_policy):
self.plaintext_passthrough = False
def __call__(self):
message = "Welcome to flippering number {}.".format(len(self.messages)).encode()
message = "Welcome to flippering number {}.".format(
len(self.messages)
).encode()
message_kit = self.enrico.encrypt_for_pre(message)
self.messages.append((message_kit, self.enrico))
if self.plaintext_passthrough:
@ -250,31 +268,34 @@ def random_policy_label():
# Alice, Bob, and Ursula
#
@pytest.fixture(scope="module")
def alice(alice_test_config, ursulas, testerchain):
alice = alice_test_config.produce()
def alice(alice_test_config, accounts):
alice = alice_test_config.produce(
signer=accounts.get_account_signer(accounts.alice_account)
)
yield alice
alice.disenchant()
@pytest.fixture(scope="module")
def bob(bob_test_config, testerchain):
def bob(bob_test_config, accounts):
bob = bob_test_config.produce(
polygon_endpoint=TEST_ETH_PROVIDER_URI,
signer=accounts.get_account_signer(accounts.bob_account),
)
yield bob
bob.disenchant()
@pytest.fixture(scope="function")
def lonely_ursula_maker(ursula_test_config, testerchain):
def lonely_ursula_maker(ursula_test_config, accounts):
class _PartialUrsulaMaker:
_partial = partial(
make_ursulas,
make_reserved_ursulas,
accounts=accounts,
ursula_config=ursula_test_config,
know_each_other=False,
staking_provider_addresses=testerchain.stake_providers_accounts,
operator_addresses=testerchain.ursulas_accounts,
)
_made = []
@ -290,6 +311,7 @@ def lonely_ursula_maker(ursula_test_config, testerchain):
del MOCK_KNOWN_URSULAS_CACHE[ursula.rest_interface.port]
for ursula in self._made:
ursula._finalize()
_maker = _PartialUrsulaMaker()
yield _maker
_maker.clean()
@ -304,7 +326,7 @@ def mock_registry_sources(module_mocker):
yield
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def mock_testerchain() -> MockBlockchain:
BlockchainInterfaceFactory._interfaces = dict()
testerchain = MockBlockchain()
@ -314,9 +336,7 @@ def mock_testerchain() -> MockBlockchain:
@pytest.fixture()
def light_ursula(temp_dir_path, random_account, mocker):
mocker.patch.object(
KeystoreSigner, "_KeystoreSigner__get_signer", return_value=random_account
)
mocker.patch.object(KeystoreSigner, "_get_signer", return_value=random_account)
pre_payment_method = SubscriptionManagerPayment(
blockchain_endpoint=MOCK_ETH_PROVIDER_URI, domain=TEMPORARY_DOMAIN_NAME
)
@ -340,13 +360,13 @@ def light_ursula(temp_dir_path, random_account, mocker):
return ursula
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def policy_rate():
rate = Web3.to_wei(21, 'gwei')
rate = Web3.to_wei(21, "gwei")
return rate
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def policy_value(policy_rate):
value = policy_rate * MIN_OPERATOR_SECONDS
return value
@ -357,7 +377,7 @@ def policy_value(policy_rate):
#
@pytest.fixture(autouse=True, scope='function')
@pytest.fixture(autouse=True, scope="function")
def log_in_and_out_of_test(request):
test_name = request.node.name
module_name = request.module.__name__
@ -385,28 +405,22 @@ def fleet_of_highperf_mocked_ursulas(ursula_test_config, request, testerchain):
mock_cert_generation,
mock_remember_node,
mock_message_verification,
)
)
try:
quantity = request.param
except AttributeError:
quantity = 5000 # Bigass fleet by default; that's kinda the point.
staking_addresses = (to_checksum_address('0x' + os.urandom(20).hex()) for _ in range(5000))
operator_addresses = (to_checksum_address('0x' + os.urandom(20).hex()) for _ in range(5000))
with GlobalLoggerSettings.pause_all_logging_while():
with contextlib.ExitStack() as stack:
for mock in mocks:
stack.enter_context(mock)
_ursulas = make_ursulas(
_ursulas = make_random_ursulas(
ursula_config=ursula_test_config,
quantity=quantity,
know_each_other=False,
staking_provider_addresses=staking_addresses,
operator_addresses=operator_addresses,
)
all_ursulas = {u.checksum_address: u for u in _ursulas}
@ -415,7 +429,9 @@ def fleet_of_highperf_mocked_ursulas(ursula_test_config, request, testerchain):
# It only needs to see whatever public info we can normally get via REST.
# Also sharing mutable Ursulas like that can lead to unpredictable results.
ursula.known_nodes.current_state._nodes = all_ursulas
ursula.known_nodes.current_state.checksum = b"This is a fleet state checksum..".hex()
ursula.known_nodes.current_state.checksum = (
b"This is a fleet state checksum..".hex()
)
yield _ursulas
@ -427,13 +443,13 @@ def fleet_of_highperf_mocked_ursulas(ursula_test_config, request, testerchain):
def highperf_mocked_alice(
fleet_of_highperf_mocked_ursulas,
monkeymodule,
testerchain,
accounts,
):
config = AliceConfiguration(
dev_mode=True,
domain=TEMPORARY_DOMAIN_NAME,
eth_endpoint=TEST_ETH_PROVIDER_URI,
checksum_address=testerchain.alice_account,
checksum_address=accounts.alice_account,
network_middleware=MockRestMiddlewareForLargeFleetTests(
eth_endpoint=TEST_ETH_PROVIDER_URI
),
@ -474,7 +490,8 @@ def highperf_mocked_bob(fleet_of_highperf_mocked_ursulas):
# CLI
#
@pytest.fixture(scope='function')
@pytest.fixture(scope="function")
def test_emitter(mocker):
# Note that this fixture does not capture console output.
# Whether the output is captured or not is controlled by
@ -482,13 +499,13 @@ def test_emitter(mocker):
return StdoutEmitter()
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def click_runner():
runner = CliRunner()
yield runner
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def nominal_configuration_fields():
config = UrsulaConfiguration(
dev_mode=True,
@ -500,7 +517,7 @@ def nominal_configuration_fields():
del config
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def custom_filepath():
_custom_filepath = MOCK_CUSTOM_INSTALLATION_PATH
with contextlib.suppress(FileNotFoundError):
@ -510,7 +527,7 @@ def custom_filepath():
shutil.rmtree(_custom_filepath, ignore_errors=True)
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def custom_filepath_2():
_custom_filepath = MOCK_CUSTOM_INSTALLATION_PATH_2
with contextlib.suppress(FileNotFoundError):
@ -522,9 +539,11 @@ def custom_filepath_2():
shutil.rmtree(_custom_filepath, ignore_errors=True)
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def worker_configuration_file_location(custom_filepath) -> Path:
_configuration_file_location = MOCK_CUSTOM_INSTALLATION_PATH / UrsulaConfiguration.generate_filename()
_configuration_file_location = (
MOCK_CUSTOM_INSTALLATION_PATH / UrsulaConfiguration.generate_filename()
)
return _configuration_file_location
@ -537,15 +556,15 @@ def mock_teacher_nodes(mocker):
@pytest.fixture(autouse=True)
def disable_interactive_keystore_generation(mocker):
# Do not notify or confirm mnemonic seed words during tests normally
mocker.patch.object(Keystore, '_confirm_generate')
mocker.patch.object(Keystore, "_confirm_generate")
#
# Web Auth
#
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def basic_auth_file(temp_dir_path):
basic_auth = Path(temp_dir_path) / 'htpasswd'
basic_auth = Path(temp_dir_path) / "htpasswd"
with basic_auth.open("w") as f:
# username: "admin", password: "admin"
f.write("admin:$apr1$hlEpWVoI$0qjykXrvdZ0yO2TnBggQO0\n")
@ -553,7 +572,7 @@ def basic_auth_file(temp_dir_path):
basic_auth.unlink()
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def mock_rest_middleware():
return MockRestMiddleware(eth_endpoint=TEST_ETH_PROVIDER_URI)
@ -563,14 +582,14 @@ def mock_rest_middleware():
#
@pytest.fixture(scope='session')
@pytest.fixture(scope="session")
def conditions_test_data():
test_conditions = Path(tests.__file__).parent / "data" / "test_conditions.json"
with open(test_conditions, 'r') as file:
with open(test_conditions, "r") as file:
data = json.loads(file.read())
for name, condition in data.items():
if condition.get('chain'):
condition['chain'] = TESTERCHAIN_CHAIN_ID
if condition.get("chain"):
condition["chain"] = TESTERCHAIN_CHAIN_ID
return data
@ -627,7 +646,7 @@ def rpc_condition():
return condition
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def valid_user_address_context():
return {
USER_ADDRESS_CONTEXT: {
@ -666,30 +685,29 @@ def valid_user_address_context():
}
@pytest.fixture(scope='module', autouse=True)
def control_time():
@pytest.fixture(scope="session", autouse=True)
def clock():
"""Distorts the space-time continuum. Use with caution."""
clock = Clock()
EventScannerTask.CLOCK = clock
EventScannerTask.INTERVAL = .1
clock.llamas = 0
SimpleTask.CLOCK = clock
SimpleTask.INTERVAL = 1
return clock
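The session-scoped `clock` fixture swaps wall-clock time for a twisted `Clock`, so `SimpleTask` intervals fire only when a test advances time explicitly. A dependency-free sketch of that deterministic-scheduling pattern (`FakeClock` is a stand-in, not twisted's API):

```python
import heapq

class FakeClock:
    """Toy deterministic clock: callbacks fire only on explicit advance()."""
    def __init__(self):
        self.now = 0.0
        self._calls = []
    def call_later(self, delay, fn):
        # id(fn) breaks ties so heapq never compares the callables themselves.
        heapq.heappush(self._calls, (self.now + delay, id(fn), fn))
    def advance(self, amount):
        self.now += amount
        while self._calls and self._calls[0][0] <= self.now:
            _, _, fn = heapq.heappop(self._calls)
            fn()

clock = FakeClock()
fired = []
clock.call_later(1.0, lambda: fired.append("tick"))
assert fired == []       # nothing runs until time is advanced
clock.advance(1.0)
assert fired == ["tick"]
```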
@pytest.fixture(scope="module")
def ursulas(testerchain, ursula_test_config, staking_providers):
def ursulas(accounts, ursula_test_config, staking_providers):
if MOCK_KNOWN_URSULAS_CACHE:
# TODO: Is this a safe assumption / test behaviour?
# raise RuntimeError("Ursulas cache was unclear at fixture loading time. Did you use one of the ursula maker functions without cleaning up?")
MOCK_KNOWN_URSULAS_CACHE.clear()
_ursulas = make_ursulas(
_ursulas = make_reserved_ursulas(
accounts=accounts,
ursula_config=ursula_test_config,
staking_provider_addresses=testerchain.stake_providers_accounts,
operator_addresses=testerchain.ursulas_accounts,
know_each_other=True,
)
for u in _ursulas:
u.synchronous_query_timeout = .01 # We expect to never have to wait for content that is actually on-chain during tests.
u.synchronous_query_timeout = 0.01 # We expect to never have to wait for content that is actually on-chain during tests.
_ports_to_remove = [ursula.rest_interface.port for ursula in _ursulas]
yield _ursulas
@ -767,3 +785,16 @@ def mock_operator_aggregation_delay(module_mocker):
"nucypher.blockchain.eth.actors.Operator.AGGREGATION_SUBMISSION_MAX_DELAY",
PropertyMock(return_value=1),
)
@pytest.fixture
def mock_async_hooks(mocker):
hooks = BlockchainInterface.AsyncTxHooks(
on_broadcast=mocker.Mock(),
on_broadcast_failure=mocker.Mock(),
on_fault=mocker.Mock(),
on_finalized=mocker.Mock(),
on_insufficient_funds=mocker.Mock(),
)
return hooks
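The `mock_async_hooks` fixture wires every transaction-lifecycle callback to a `Mock`, letting tests assert which hooks fired and with what arguments. The pattern in isolation; the `broadcast_transaction` helper is hypothetical, not nucypher's API:

```python
from unittest.mock import Mock

# One Mock per lifecycle hook, as in the fixture above.
hooks = dict(
    on_broadcast=Mock(),
    on_broadcast_failure=Mock(),
    on_fault=Mock(),
    on_finalized=Mock(),
    on_insufficient_funds=Mock(),
)

def broadcast_transaction(tx, hooks):
    # Hypothetical helper: a real broadcaster would submit `tx` and then
    # fire on_broadcast on success (or on_broadcast_failure / on_fault).
    hooks["on_broadcast"](tx)

broadcast_transaction({"to": "0x0", "value": 1}, hooks)
hooks["on_broadcast"].assert_called_once_with({"to": "0x0", "value": 1})
hooks["on_fault"].assert_not_called()
```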


@ -1,17 +1,16 @@
import pytest
from nucypher.blockchain.eth.clients import EthereumClient
from nucypher.blockchain.eth.signers.software import Web3Signer
from nucypher.crypto.powers import TransactingPower
@pytest.mark.skip(
"This test need to be refactored to use some other transaction than deployment"
)
def test_block_confirmations(testerchain, test_registry, mocker):
origin = testerchain.etherbase_account
def test_block_confirmations(testerchain, test_registry, mocker, accounts):
origin = accounts.etherbase_account
transacting_power = TransactingPower(
account=origin, signer=Web3Signer(testerchain.client)
account=origin, signer=accounts.get_account_signer(origin)
)
# Mocks and test adjustments


@ -11,7 +11,7 @@ from web3.datastructures import AttributeDict
from nucypher.blockchain.eth.agents import CoordinatorAgent
from nucypher.blockchain.eth.models import Coordinator
from nucypher.blockchain.eth.signers.software import Web3Signer
from nucypher.blockchain.eth.signers.software import InMemorySigner
from nucypher.characters.lawful import Enrico, Ursula
from nucypher.crypto.powers import RitualisticPower
from nucypher.policy.conditions.lingo import ConditionLingo, ConditionType
@ -123,7 +123,6 @@ def execute_round_2(ritual_id: int, cohort: List[Ursula]):
)
@pytest.mark.usefixtures("mock_sign_message")
@pytest.mark.parametrize("dkg_size, ritual_id, variant", PARAMS)
@pytest_twisted.inlineCallbacks()
def test_ursula_ritualist(
@ -212,8 +211,7 @@ def test_ursula_ritualist(
plaintext = PLAINTEXT.encode()
# create Enrico
signer = Web3Signer(client=testerchain.client)
enrico = Enrico(encrypting_key=encrypting_key, signer=signer)
enrico = Enrico(encrypting_key=encrypting_key, signer=InMemorySigner())
# encrypt
print(f"encrypting for DKG with key {bytes(encrypting_key).hex()}")
@ -318,19 +316,21 @@ def test_ursula_ritualist(
ritual = mock_coordinator_agent.get_ritual(ritual_id)
num_used_ursulas = 0
for ursula_index, ursula in enumerate(cohort):
for ursula in cohort:
stored_ritual = ursula.dkg_storage.get_active_ritual(ritual_id)
if not stored_ritual:
# this ursula was not used for threshold decryption; skip
continue
assert stored_ritual == ritual
stored_validators = ursula.dkg_storage.get_validators(ritual_id)
num_used_ursulas += 1
for v_index, v in enumerate(stored_validators):
assert v.address == original_validators[v_index].address
assert v.public_key == original_validators[v_index].public_key
if stored_validators:
for v_index, v in enumerate(stored_validators):
assert v.address == original_validators[v_index].address
assert v.public_key == original_validators[v_index].public_key
# increment here - timing issue since multiple ursulas contacted at the same time
num_used_ursulas += 1
assert num_used_ursulas >= ritual.threshold
print("===================== DECRYPTION SUCCESSFUL =====================")
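The reworked loop above counts only the cohort members that actually stored the ritual, then requires at least `threshold` participants. Reduced to its core, with illustrative values:

```python
# Per-ursula DKG storage: some ursulas never took part in decryption.
ritual_id = 7
threshold = 3
dkg_storage = [
    {ritual_id: "ritual"},
    {ritual_id: "ritual"},
    {},                     # this ursula was not used; skipped in the loop
    {ritual_id: "ritual"},
    {},
]

num_used_ursulas = sum(1 for storage in dkg_storage if storage.get(ritual_id))
assert num_used_ursulas >= threshold
```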


@ -8,10 +8,7 @@ from twisted.internet.task import Clock
from nucypher.characters.lawful import Bob, Enrico
from nucypher.config.constants import TEMPORARY_DOMAIN_NAME
from tests.constants import (
MOCK_ETH_PROVIDER_URI,
NUMBER_OF_URSULAS_IN_DEVELOPMENT_DOMAIN,
)
from tests.constants import MOCK_ETH_PROVIDER_URI
from tests.utils.middleware import MockRestMiddleware
@ -38,7 +35,7 @@ def test_bob_full_retrieve_flow(
assert b"Welcome to flippering number 0." == delivered_cleartexts[0]
def test_bob_retrieves(alice, ursulas):
def test_bob_retrieves(accounts, alice, ursulas):
"""A test to show that Bob can retrieve data from Ursula"""
# Let's partition Ursulas in two parts
@ -60,7 +57,7 @@ def test_bob_retrieves(alice, ursulas):
# Alice creates a policy granting access to Bob
# Just for fun, let's assume she distributes KFrags among Ursulas unknown to Bob
shares = NUMBER_OF_URSULAS_IN_DEVELOPMENT_DOMAIN - 2
shares = accounts.NUMBER_OF_URSULAS_IN_TESTS - 2
label = b'label://' + os.urandom(32)
contract_end_datetime = maya.now() + datetime.timedelta(days=5)
policy = alice.grant(
@ -96,17 +93,6 @@ def test_bob_retrieves(alice, ursulas):
# Indeed, they're the same cleartexts.
assert delivered_cleartexts == cleartexts_delivered_a_second_time
# Let's try retrieve again, but Alice revoked the policy.
receipt, failed_revocations = alice.revoke(policy)
assert len(failed_revocations) == 0
# One thing to note here is that Bob *can* still retrieve with the cached CFrags,
# even though this Policy has been revoked. #892
_cleartexts = bob.retrieve_and_decrypt([message_kit],
alice_verifying_key=alices_verifying_key,
encrypted_treasure_map=policy.treasure_map)
assert _cleartexts == delivered_cleartexts # TODO: 892
bob.disenchant()
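The test above now derives the policy size from the account manager's ursula count instead of a standalone constant: `shares = N - 2`, leaving two ursulas outside the kfrag distribution. A quick sanity sketch; the constant's value is an assumption:

```python
NUMBER_OF_URSULAS_IN_TESTS = 10  # assumed value of accounts.NUMBER_OF_URSULAS_IN_TESTS

# Two ursulas are excluded from the policy's kfrag distribution.
shares = NUMBER_OF_URSULAS_IN_TESTS - 2
assert shares == 8
assert NUMBER_OF_URSULAS_IN_TESTS - shares == 2
```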


@ -1,7 +1,7 @@
import pytest
from nucypher.blockchain.eth import domains
from nucypher.blockchain.eth.signers.software import Web3Signer
from nucypher.blockchain.eth.signers.software import InMemorySigner
from nucypher.characters.chaotic import (
NiceGuyEddie,
ThisBobAlwaysDecrypts,
@ -19,8 +19,7 @@ from tests.constants import (
def _attempt_decryption(BobClass, plaintext, testerchain):
trinket = 80 # Doesn't matter.
signer = Web3Signer(client=testerchain.client)
enrico = NiceGuyEddie(encrypting_key=trinket, signer=signer)
enrico = NiceGuyEddie(encrypting_key=trinket, signer=InMemorySigner())
bob = BobClass(
registry=MOCK_REGISTRY_FILEPATH,
domain=domains.LYNX,
@ -50,14 +49,12 @@ def _attempt_decryption(BobClass, plaintext, testerchain):
return decrypted_cleartext
@pytest.mark.usefixtures("mock_sign_message")
def test_user_controls_success(testerchain):
plaintext = b"ever thus to deadbeats"
result = _attempt_decryption(ThisBobAlwaysDecrypts, plaintext, testerchain)
assert bytes(result) == bytes(plaintext)
@pytest.mark.usefixtures("mock_sign_message")
def test_user_controls_failure(testerchain):
plaintext = b"ever thus to deadbeats"
with pytest.raises(Ursula.NotEnoughUrsulas):

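Both this file and the ritualist test swap `Web3Signer` (which delegates signing to a connected node's client) for `InMemorySigner` (a locally held ephemeral key). A dependency-free sketch of the idea; this class is illustrative only, not nucypher's real `InMemorySigner` API, and HMAC stands in for an actual ECDSA signature:

```python
import hashlib
import hmac
import os

class InMemorySignerSketch:
    """Illustrative only: holds an ephemeral key locally, never persisted."""
    def __init__(self):
        self._key = os.urandom(32)
    def sign_message(self, message: bytes) -> bytes:
        # HMAC stands in for a real ECDSA signature in this sketch.
        return hmac.new(self._key, message, hashlib.sha256).digest()

signer = InMemorySignerSketch()
sig = signer.sign_message(b"hello")
assert len(sig) == 32
assert sig == signer.sign_message(b"hello")  # deterministic for same key + message
```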

@ -74,36 +74,3 @@ def test_alice_can_decrypt(alice, bob):
)
assert [plaintext] == decrypted_data
@pytest.mark.skip("Needs rework post-TMcKF") # TODO: Implement offchain revocation.
@pytest.mark.usefixtures("bursulas")
def test_revocation(alice, bob):
threshold, shares = 2, 3
policy_end_datetime = maya.now() + datetime.timedelta(days=5)
label = b"revocation test"
policy = alice.grant(
bob, label, threshold=threshold, shares=shares, expiration=policy_end_datetime
)
for node_id, encrypted_kfrag in policy.treasure_map:
assert policy.revocation_kit[node_id]
# Test revocation kit's signatures
for revocation in policy.revocation_kit:
assert revocation.verify_signature(alice.stamp.as_umbral_pubkey())
# Test Revocation deserialization
revocation = policy.revocation_kit[node_id]
revocation_bytes = bytes(revocation)
deserialized_revocation = RevocationOrder.from_bytes(revocation_bytes)
assert deserialized_revocation == revocation
# Attempt to revoke the new policy
receipt, failed_revocations = alice.revoke(policy)
assert len(failed_revocations) == 0
# Try to revoke the already revoked policy
receipt, already_revoked = alice.revoke(policy)
assert len(already_revoked) == 3

Some files were not shown because too many files have changed in this diff.