Compare commits

...

341 Commits

Author SHA1 Message Date
Chris Veilleux e10ac91cde stop writing database rows that are eating up a bunch of disk. 2022-11-23 16:37:42 -06:00
Chris Veilleux 625a9fd9d3 fix template variable for password reset url 2022-10-05 17:14:41 -05:00
Chris Veilleux aba999f3f5 fix template variable for password reset url 2022-10-05 17:12:42 -05:00
Chris Veilleux 6cfd54d767 more gracefully handle pantacor API error 2022-10-04 14:39:08 -05:00
Chris Veilleux efa72df42e fix index name 2022-10-04 14:38:33 -05:00
Chris Veilleux 16d2f178dd
Merge pull request #317 from MycroftAI/feature/google-cloud-stt
changed audio transcription endpoint to use Google Cloud
2022-09-28 16:52:11 -05:00
Chris Veilleux a7e5dd9533 add Google STT api key back to fix test. 2022-09-28 16:34:33 -05:00
Chris Veilleux 3f6298a057 add Google STT api key back to fix test. 2022-09-28 16:15:10 -05:00
Chris Veilleux f8e062a5a3 point Google Cloud STT to secrets file on Jenkins host 2022-09-28 15:50:19 -05:00
Chris Veilleux 6b2d4f8699 minor bugfix 2022-09-28 14:17:52 -05:00
Chris Veilleux 6275f5f17a fixed pylint issues 2022-09-28 13:27:32 -05:00
Chris Veilleux 8ed421708d change Google STT environment variable to new Google Cloud API key and remove Assembly AI environment variables. 2022-09-28 13:22:34 -05:00
Chris Veilleux 0c2afdee53 bug fix for not selecting a membership in new account process 2022-09-27 16:04:27 -05:00
Chris Veilleux 7d7bfdd0ec Fixed failing behave test 2022-09-22 15:03:02 -05:00
Chris Veilleux df0c60239d changed audio transcription endpoint to use Google Cloud instead of Assembly AI 2022-09-21 13:57:52 -05:00
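
The commit above swaps the transcription backend. As a rough illustration only (not the actual Selene endpoint code), a synchronous Google Cloud Speech-to-Text call typically looks like the sketch below; the function name, sample rate, and language code are assumptions.

# Illustrative sketch of a Google Cloud Speech-to-Text call; not Selene's code.
# Assumes the google-cloud-speech package is installed and that
# GOOGLE_APPLICATION_CREDENTIALS points at a service-account JSON file.
from google.cloud import speech

def transcribe(audio_bytes: bytes, sample_rate: int = 16000) -> str:
    client = speech.SpeechClient()
    config = speech.RecognitionConfig(
        encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
        sample_rate_hertz=sample_rate,
        language_code="en-US",
    )
    audio = speech.RecognitionAudio(content=audio_bytes)
    response = client.recognize(config=config, audio=audio)
    # Each result carries ranked alternatives; join the top transcript of each.
    return " ".join(result.alternatives[0].transcript for result in response.results)
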
Chris Veilleux c372289480
Merge pull request #316 from MycroftAI/bugfix/skill-family
Added logic to determine skill family for Dinkum skills
2022-09-16 13:43:47 -05:00
Chris Veilleux f6aecd8336 Added logic to determine skill family for Dinkum skills and de-linted. 2022-09-15 10:49:40 -05:00
Chris Veilleux bf4dec8f05
Merge pull request #315 from MycroftAI/feature/new-pantacor-channels
Changed release channels to new values
2022-09-15 10:28:35 -05:00
Chris Veilleux c93055d3a8 Changed release channels to new values 2022-09-14 13:10:04 -05:00
Chris Veilleux 57349b90df
Merge pull request #314 from MycroftAI/feature/improve-membership
minor change to membership request data
2022-09-13 15:43:50 -05:00
Chris Veilleux e625863908 fixed tests to handle new membership request object 2022-09-12 14:28:20 -05:00
Chris Veilleux 04a7961739 minor change to membership request data 2022-09-12 14:13:26 -05:00
Chris Veilleux f06ed19ac8 fixed a bug with usage of the Decimal class 2022-09-02 16:00:27 -05:00
Chris Veilleux 9186c8ed6f make the durations on the new stt metrics table numeric 2022-08-31 19:00:19 -05:00
Chris Veilleux 4ca3be012e fixed an issue surfaced due to a new version of the JWT library 2022-08-30 22:37:20 -05:00
Chris Veilleux b86bbc9133 Merge remote-tracking branch 'origin/dev' into test 2022-08-26 19:12:21 -05:00
Chris Veilleux 00e5cba78f add transcription metrics to Google STT transcription. 2022-08-26 18:42:46 -05:00
Chris Veilleux db053b7892
Merge pull request #313 from MycroftAI/feature/new-stt-endpoint
Assembly AI STT endpoint
2022-08-26 14:51:19 -05:00
Chris Veilleux 91294258f9 add environment variables necessary for STT api call 2022-08-26 14:37:48 -05:00
Chris Veilleux bfc537d973 added the libsndfile system package to resolve a librosa issue. 2022-08-26 08:17:35 -05:00
Chris Veilleux 4199f12375 Merge remote-tracking branch 'origin/dev' into feature/new-stt-endpoint
# Conflicts:
#	api/public/tests/features/get_utterance.feature
2022-08-25 17:40:40 -05:00
Chris Veilleux 977f43ed7e add a database table for STT transcription metrics and logic to populate it. 2022-08-25 17:34:20 -05:00
Chris Veilleux e8ce3ca8e7 add librosa for decoding/encoding audio 2022-08-25 16:36:17 -05:00
Chris Veilleux 83cc505293 Merge remote-tracking branch 'origin/dev' into feature/new-stt-endpoint
# Conflicts:
#	api/public/Pipfile.lock
2022-08-25 15:42:36 -05:00
Chris Veilleux 3d38274961
Merge pull request #311 from MycroftAI/feature/change-email
Change Account API to handle an email change request
2022-08-25 15:41:15 -05:00
Chris Veilleux b9e19aeb35 add allure directory if it wasn't there 2022-08-25 15:22:55 -05:00
Chris Veilleux eda1350203 add allure directory if it wasn't there 2022-08-25 15:17:45 -05:00
Chris Veilleux ae7757a595 Publish allure report to Jenkins 2022-08-25 15:10:18 -05:00
Chris Veilleux 58e2011df7 de-linted password reset endpoint 2022-08-25 10:44:10 -05:00
Chris Veilleux dce580c8c1 changes to pipenv files are already in pyproject.toml 2022-08-24 14:05:21 -05:00
Chris Veilleux 439f8b4f09 applied code review change to include support email address in message. 2022-08-24 13:50:59 -05:00
Chris Veilleux e0c166572f Merge remote-tracking branch 'origin/dev' into feature/change-email 2022-08-24 13:46:43 -05:00
Chris Veilleux f48c79e3dc
Merge pull request #312 from MycroftAI/refactor/poetry
change package manager from pipenv to poetry
2022-08-24 13:45:30 -05:00
Chris Veilleux e552211c68
Merge branch 'dev' into feature/change-email 2022-08-24 13:37:24 -05:00
Chris Veilleux 0a02e28110 change package manager from pipenv to poetry 2022-08-24 13:22:05 -05:00
Chris Veilleux 325c927763
Merge pull request #310 from MycroftAI/feature/change-password
Add ability for user to change password
2022-08-22 17:22:45 -05:00
Chris Veilleux f8c1f830c8 Add a new endpoint to perform STT transcription through Assembly AI. 2022-08-22 14:39:25 -05:00
Chris Veilleux 4c7d2bf664 Merge branch 'feature/change-password' into feature/change-email 2022-08-22 12:58:48 -05:00
Chris Veilleux a6debb983f fixed a small bug introduced in refactoring 2022-08-22 12:54:31 -05:00
Chris Veilleux 57d0600c60 Merge branch 'feature/change-password' into feature/change-email
# Conflicts:
#	shared/selene/util/email/templates/password_change.html
2022-08-15 14:20:45 -05:00
Chris Veilleux c06b729a2d added support email address to password change email 2022-08-15 14:15:12 -05:00
Chris Veilleux 72008f7225 added ability for an authenticated user to change their email address 2022-08-15 13:20:32 -05:00
Chris Veilleux 3633fa96f6 Change the email templates to use Jinja's inheritance feature to reduce code duplication. 2022-08-11 12:49:35 -05:00
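
The inheritance pattern referenced in the commit above can be sketched as follows; the template names and variables are hypothetical, not Selene's actual email templates.

# Minimal Jinja inheritance sketch: a child template extends a shared base.
from jinja2 import Environment, DictLoader

templates = {
    "base_email.html": (
        "<html><body>"
        "{% block body %}{% endblock %}"
        "<p>Questions? Contact {{ support_email }}.</p>"
        "</body></html>"
    ),
    "password_change.html": (
        "{% extends 'base_email.html' %}"
        "{% block body %}<p>Your password was changed.</p>{% endblock %}"
    ),
}

env = Environment(loader=DictLoader(templates))
html = env.get_template("password_change.html").render(support_email="support@example.com")
print(html)
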
Chris Veilleux 2963719747 de-lint setting.py 2022-08-09 12:12:42 -05:00
Chris Veilleux ee0270f6c9 Add an endpoint to the account API that changes the user's password 2022-08-08 13:58:34 -05:00
Chris Veilleux aad4468ed0 fixed an error in the marketplace response for two api calls 2022-08-02 13:07:21 -05:00
Chris Veilleux d5fe923870
Merge pull request #309 from MycroftAI/feature/disable-transcription-save
Disable transcription save
2022-07-20 11:32:22 -05:00
Chris Veilleux 7e65aa26a2 de-linted test step file 2022-07-19 16:57:36 -05:00
Chris Veilleux 40ad9dbdfe fixed tests to no longer check for saved transcription files 2022-07-19 16:01:39 -05:00
Chris Veilleux 6fe19ddde2 remove pylint pre-commit hook as it was not working with the repository setup as it is 2022-07-19 14:22:29 -05:00
Chris Veilleux dc71229474 Merge remote-tracking branch 'origin/dev' into feature/disable-transcription-save
# Conflicts:
#	api/public/Pipfile.lock
2022-07-19 11:49:48 -05:00
Chris Veilleux 6e3fb118ed
Merge pull request #308 from MycroftAI/refactor/update-dependencies
Update account, device, and single sign on APIs to Python 3.9
2022-07-19 11:43:28 -05:00
Chris Veilleux ad40517569 fixed Jenkinsfile to run linting and formatting step on PRs to dev 2022-07-18 15:38:05 -05:00
Chris Veilleux d1f56aefa5 fixed Jenkinsfile to run linting and formatting step on PRs to dev 2022-07-18 15:37:06 -05:00
Chris Veilleux 43284b158d remove no-cache option now that debugging is complete 2022-07-18 15:32:51 -05:00
Chris Veilleux 9298be41b6 upgraded db python environment to 3.9 and changed how city table is loaded due to new version of psycopg 2022-07-18 15:31:46 -05:00
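
For context on the psycopg change mentioned above: psycopg 3 dropped psycopg2's copy_from()/copy_expert() in favor of a copy() context manager, so a bulk load of the city table moves to something like the sketch below. The connection string, schema, and column names are assumptions, not the actual loader code.

# Hypothetical psycopg 3 bulk-load sketch; not Selene's actual script.
import psycopg

rows = [("Montréal", "CA"), ("São Paulo", "BR")]

with psycopg.connect("dbname=mycroft user=selene host=selene-db") as connection:
    with connection.cursor() as cursor:
        with cursor.copy("COPY geography.city (name, country) FROM STDIN") as copy:
            for row in rows:
                copy.write_row(row)
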
Chris Veilleux 3121f091b9 changed github api environment variable to use new github app
2022-07-18 15:31:45 -05:00
Chris Veilleux 07cb73a9bd changed the base image for the database bootstrap step 2022-07-18 15:31:44 -05:00
Chris Veilleux 66f5bf4af3 updated database python environment to 3.9 2022-07-15 12:03:55 -05:00
Chris Veilleux a13fc0f919 updated Dockerfile to use Python 3.9 as a base 2022-07-15 12:03:54 -05:00
Chris Veilleux 424aaee5d6 updated python version on account API, public API and SSO API to 3.9. updated versions of packages that depend on flask 1.0 to get behave tests to run 2022-07-12 15:36:08 -05:00
Chris Veilleux 36d17d9ed8 upgrade python packages 2022-07-12 13:07:29 -05:00
Chris Veilleux 9173a99d46 add pylint to precommit hook 2022-07-12 13:06:51 -05:00
Chris Veilleux 9dc800de88 remove logic that saved TTS transcriptions as part of new privacy policy initiative 2022-07-12 13:04:16 -05:00
Chris Veilleux 33aa803865 upgrade black in pre commit config 2022-07-12 13:03:45 -05:00
Chris Veilleux 2100ebd380 added a logger 2022-03-16 12:26:03 -05:00
Chris Veilleux 69e397eca0 Fixed a bug with JWT encoding 2022-03-16 11:34:51 -05:00
Chris Veilleux 098bf89709 Converted an insert to an upsert to avoid an error on subsequent calls to the new pantacor endpoint. 2022-03-16 11:34:26 -05:00
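
The insert-to-upsert change described above is the standard PostgreSQL ON CONFLICT pattern. The sketch below is illustrative; the schema, column names, and helper function are placeholders rather than the actual pantacor_config code.

# Illustrative upsert: insert on the first call, update on subsequent calls.
UPSERT_SQL = """
    INSERT INTO device.pantacor_config (device_id, pantacor_id, release_channel)
    VALUES (%(device_id)s, %(pantacor_id)s, %(release_channel)s)
    ON CONFLICT (device_id)
    DO UPDATE SET
        pantacor_id = EXCLUDED.pantacor_id,
        release_channel = EXCLUDED.release_channel
"""

def upsert_pantacor_config(cursor, device_id, pantacor_id, release_channel):
    cursor.execute(
        UPSERT_SQL,
        dict(device_id=device_id, pantacor_id=pantacor_id, release_channel=release_channel),
    )
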
Chris Veilleux 133e3575aa
Merge pull request #293 from MycroftAI/refactor/black
"Black"en the rest of the repository
2022-03-11 13:37:44 -06:00
Chris Veilleux 26ed641b48 applied the "Black" formatter to all files and added pre-commit hook to check 2022-03-11 13:22:33 -06:00
Chris Veilleux bbad8e2f3b
Merge pull request #292 from MycroftAI/bugfix/stt-transcriptions
Fix an audio write bug in Google STT
2022-03-11 13:17:32 -06:00
Chris Veilleux 2f6e06839f fixed a bug in the Google STT endpoint where a network disconnect affected writing the audio to the file system 2022-03-11 12:38:27 -06:00
Chris Veilleux adb9f013a1 refactored the authentication code to make more sense 2022-03-11 12:37:29 -06:00
Chris Veilleux edb5da7230
Merge pull request #291 from MycroftAI/feature/better-logging
Improve logging setup code
2022-03-11 12:29:10 -06:00
Chris Veilleux dc14c0ac47
Merge branch 'dev' into feature/better-logging 2022-03-11 12:12:32 -06:00
Chris Veilleux 9dac79dd9e reverting a change that accidentally got committed as part of this PR. 2022-03-10 18:24:04 -06:00
Chris Veilleux ae11c4d01c updated data access layer to use new logging mechanism 2022-03-10 14:31:03 -06:00
Chris Veilleux a0bff8bd39 updated scripting base class to use new logging mechanism 2022-03-10 14:30:24 -06:00
Chris Veilleux 5674a13356 updated behave tests to use new logging mechanism 2022-03-10 14:29:25 -06:00
Chris Veilleux b435f515dd use new logging mechanism in single sign on API endpoints. 2022-03-10 14:28:16 -06:00
Chris Veilleux 21756f6659 removed old logging code as it was no longer used anywhere in the repository 2022-03-10 14:08:57 -06:00
Chris Veilleux 59a7b6449b updated dependencies 2022-03-10 14:03:15 -06:00
Chris Veilleux 57d942519f updated marketplace API dependencies 2022-03-10 12:31:04 -06:00
Chris Veilleux 1d7c71465e use new logging mechanism in marketplace API endpoints. 2022-03-10 12:30:41 -06:00
Chris Veilleux 75cc3c8363 use new logging mechanism in account API endpoints. 2022-03-10 12:10:37 -06:00
Chris Veilleux abb91401c6 remove old logging mechanism as it is no longer used. 2022-03-10 12:03:08 -06:00
Chris Veilleux 7e5a3f94d2 use new logging mechanism in public API endpoints. 2022-03-10 12:02:09 -06:00
Chris Veilleux 116dcf2ce6 change loggers in selene.api to all use module name instead of package 2022-03-10 11:45:18 -06:00
Chris Veilleux 3295ac70a0 change loggers in selene.util to all use module name instead of package 2022-03-10 11:37:39 -06:00
Chris Veilleux 0817ee35a2 changed log file location to be production directory, not test directory 2022-03-09 18:32:43 -06:00
Chris Veilleux 02d79ecad2 added a UUID as a request identifier so that log messages can be related to one another for a single request 2022-03-09 17:54:44 -06:00
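
The request-identifier idea above can be sketched with a request hook plus a logging filter; this is a minimal illustration assuming a Flask app, not the Selene implementation.

# Assign a UUID per request and stamp it on every log record.
import logging
import uuid

from flask import Flask, g, has_request_context

app = Flask(__name__)

class RequestIdFilter(logging.Filter):
    def filter(self, record):
        record.request_id = getattr(g, "request_id", "-") if has_request_context() else "-"
        return True

@app.before_request
def assign_request_id():
    g.request_id = str(uuid.uuid4())

handler = logging.StreamHandler()
handler.addFilter(RequestIdFilter())
handler.setFormatter(logging.Formatter("%(asctime)s %(request_id)s %(levelname)s %(message)s"))
logging.getLogger().addHandler(handler)
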
Chris Veilleux 0db649e4ff added new logger config code but left old in place until new is implemented everywhere 2022-03-09 17:50:09 -06:00
Chris Veilleux 9fdb2077d4 Merge branch 'master' into dev 2022-03-09 17:32:21 -06:00
Chris Veilleux 2d0a033e3a
Merge pull request #284 from MycroftAI/feature/new-pantacor-endpoint
Moved Pantacor sync from device activation to its own endpoint.
2022-03-04 13:00:45 -06:00
Chris Veilleux 525d6b26cc fixed a comment copied from the activation endpoint 2022-03-04 12:10:24 -06:00
Chris Veilleux 506c2d555e Fix GitHub authentication 2022-03-03 13:06:47 -06:00
Chris Veilleux 8c8ff10ab8 Fix GitHub authentication 2022-03-03 12:34:20 -06:00
Chris Veilleux fc34fec409
Merge pull request #279 from MycroftAI/dependabot/pip/shared/starkbank-ecdsa-2.0.1
Bump starkbank-ecdsa from 1.1.1 to 2.0.1 in /shared
2022-03-03 12:13:33 -06:00
Chris Veilleux 063110adb7
Merge pull request #281 from simcop2387/patch-2
Fix client encoding for PG connection
2022-03-03 12:13:08 -06:00
Chris Veilleux a78b33b130 Fix GitHub authentication 2022-03-03 11:36:05 -06:00
Chris Veilleux ccae724ad7 Moved Pantacor sync from device activation to its own endpoint. This will keep Selene pairing and Pantacor registration independent to improve the user experience during pairing. 2022-03-03 10:48:25 -06:00
Ryan Voots 1341fc1555
Fix client encoding for PG connection
Set the encoding to utf8 so that it will correctly handle accented characters in the geography data.
2022-02-28 09:54:43 -05:00
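
The encoding fix above amounts to forcing UTF-8 on the database connection. A hedged psycopg2 sketch follows; the connection parameters and table name are placeholders.

# Force UTF-8 so accented city names in the geography data round-trip intact.
import psycopg2

connection = psycopg2.connect(dbname="mycroft", user="selene", host="selene-db")
connection.set_client_encoding("UTF8")

with connection.cursor() as cursor:
    cursor.execute("SELECT name FROM geography.city WHERE name = %s", ("Córdoba",))
    print(cursor.fetchall())
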
Kris Gesling a8c6251d90 Merge branch 'dev' into test 2021-11-19 10:00:55 +09:30
Kris Gesling 5cbfb41e95
Merge pull request #280 from MycroftAI/feature/new-wa-v2
New Wolfram Alpha v2 endpoint
2021-11-19 09:58:24 +09:30
Kris Gesling 064973e3dc Add new Wolfram Alpha Full Results v2 API endpoint
Useful for fetching images related to Wolfram search results.
2021-11-19 09:55:46 +09:30
Chris Veilleux f4a911ab87 Merge remote-tracking branch 'origin/master' into test 2021-11-12 19:10:38 -06:00
dependabot[bot] 9b6c73eece
Bump starkbank-ecdsa from 1.1.1 to 2.0.1 in /shared
Bumps [starkbank-ecdsa](https://github.com/starkbank/ecdsa-python) from 1.1.1 to 2.0.1.
- [Release notes](https://github.com/starkbank/ecdsa-python/releases)
- [Changelog](https://github.com/starkbank/ecdsa-python/blob/master/CHANGELOG.md)
- [Commits](https://github.com/starkbank/ecdsa-python/compare/v1.1.1...v2.0.1)

---
updated-dependencies:
- dependency-name: starkbank-ecdsa
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2021-11-08 21:56:14 +00:00
Kris Gesling bda7e7f464
Merge pull request #278 from MycroftAI/feature/wolfram-simple-api
Add Wolfram Alpha Simple API endpoint
2021-11-05 07:46:53 +09:30
Kris Gesling 14c1293d1a Fix the copyright date 2021-11-04 07:45:10 +09:30
Kris Gesling 1d3332b8a6 Add Wolfram Alpha Simple API endpoint
Useful for fetching images related to Wolfram search results.
2021-10-26 23:51:49 +09:30
Chris Veilleux 9266a1a385
Merge pull request #277 from MycroftAI/bugfix/support-email
Fixed an issue with sending email on a support ticket.
2021-10-05 20:31:21 -05:00
Chris Veilleux 6d191a24f7 Changed email address for support tickets. 2021-10-05 20:11:54 -05:00
Chris Veilleux f00421864f Fixed an issue with sending email on a support ticket. 2021-10-05 17:14:16 -05:00
Chris Veilleux 5cc0598c02 Merge branch 'master' into dev 2021-10-05 15:48:25 -05:00
Chris Veilleux ee34b4a85e
Merge pull request #276 from MycroftAI/bugfix/missing-milliseconds
No milliseconds in last activity timestamp
2021-10-05 15:47:04 -05:00
Chris Veilleux 4f02f6e9e3 Fixes an obscure bug where the last activity timestamp fails to be converted to a datetime object because there are no milliseconds. 2021-09-15 22:56:50 -05:00
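
A sketch of the kind of fix that commit describes: try the fractional-seconds format first, then fall back to whole seconds. The function and formats are illustrative, not the actual Selene code.

from datetime import datetime

def parse_last_activity(value: str) -> datetime:
    for fmt in ("%Y-%m-%d %H:%M:%S.%f", "%Y-%m-%d %H:%M:%S"):
        try:
            return datetime.strptime(value, fmt)
        except ValueError:
            continue
    raise ValueError(f"unrecognized timestamp: {value}")

parse_last_activity("2021-09-15 22:56:50")         # no milliseconds
parse_last_activity("2021-09-15 22:56:50.123456")  # with fractional seconds
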
Kris Gesling 16ed6268d8
Merge pull request #273 from MycroftAI/docs/first-setup
Docs: minor improvements for first setup instructions
2021-09-06 16:38:01 +09:30
Kris Gesling c685d87d5e Add note about using an application specific user 2021-09-06 16:13:08 +09:30
Chris Veilleux 6df233710e
Merge pull request #274 from MycroftAI/bugfix/ssh-encoding
Change how the SSH key is passed to the API
2021-08-17 23:23:54 -05:00
Chris Veilleux c0138f1b23 Update the SSH key API calls to reflect the new call format. 2021-08-17 23:18:24 -05:00
Chris Veilleux a02db901f2 Change how the SSH key is passed to the API to account for special characters. 2021-08-17 19:22:23 -05:00
Chris Veilleux 6e8cb797ad
Merge pull request #270 from MycroftAI/bugfix/mark-ii-device-edit
Bugfix/mark ii device edit
2021-08-03 22:26:58 -05:00
Chris Veilleux bf554b3db7
Merge branch 'dev' into bugfix/mark-ii-device-edit 2021-08-03 22:15:58 -05:00
Chris Veilleux b2ec51801b Allow for the comment part of an RSA SSH key to be optional. 2021-08-03 21:59:25 -05:00
Kris Gesling 28e8c4db38
Merge pull request #271 from simcop2387/patch-1
Fix endpoint paths in readme
2021-08-02 10:28:30 +09:30
Ryan Voots 85a8fd1b2f
Fix paths in README.md for api installation 2021-07-29 17:40:05 -07:00
Chris Veilleux 7c5321f597 Add tests for the new SSH key validation endpoint. 2021-07-22 17:00:45 -05:00
Chris Veilleux c9ff473bd4 Updated the date from a copyright date copied from another file 2021-07-22 15:10:21 -05:00
Chris Veilleux 6e95be236c Fixed a typo 2021-07-22 15:09:04 -05:00
Chris Veilleux e93ca34662 Merge remote-tracking branch 'origin/dev' into bugfix/mark-ii-device-edit 2021-07-22 15:04:14 -05:00
Chris Veilleux 9074975f13 Merge remote-tracking branch 'origin/master' into dev
# Conflicts:
#	shared/Pipfile.lock
2021-07-22 15:02:05 -05:00
Chris Veilleux 02be1c7da1
Merge pull request #269 from MycroftAI/refactor/simplify-request-validation
Simplify request validation in the account API's device endpoint
2021-07-22 14:52:46 -05:00
Chris Veilleux f5b21f1d7c Improved docstring to provide better clarity. 2021-07-22 14:43:08 -05:00
Chris Veilleux e4d1ad49b4 Removed log statement used in debugging. 2021-07-22 14:40:00 -05:00
Chris Veilleux ae0cbeee71 Add function and API call to check if a RSA SSH key is well-formed. 2021-07-19 18:27:54 -05:00
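
A well-formedness check for an OpenSSH-format RSA public key (type, base64 blob, optional comment) could look roughly like the sketch below; this is illustrative and not the validation code added in the commit.

import base64
import binascii

def is_well_formed_rsa_key(key: str) -> bool:
    parts = key.strip().split(None, 2)  # the comment (third field) is optional
    if len(parts) < 2 or parts[0] != "ssh-rsa":
        return False
    try:
        blob = base64.b64decode(parts[1], validate=True)
    except (binascii.Error, ValueError):
        return False
    # The decoded blob begins with a length-prefixed "ssh-rsa" marker.
    return blob[4:11] == b"ssh-rsa"
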
Chris Veilleux 4584e9b003
Merge branch 'dev' into refactor/simplify-request-validation 2021-07-19 18:25:55 -05:00
Chris Veilleux 3b6908af30
Merge pull request #268 from MycroftAI/upgrade-jwt
Update pyjwt version
2021-07-19 17:20:48 -05:00
Kris Gesling a92b475212 Add section on testing the endpoints 2021-07-15 14:54:39 +09:30
Kris Gesling 0e5110ca09 Fix heading levels 2021-07-15 10:04:28 +09:30
Kris Gesling d670bf6b03 Fix missing text in Redis description 2021-07-15 09:57:52 +09:30
Kris Gesling dcd2b5ec5f Add Precise API endpoint 2021-07-15 09:49:31 +09:30
Kris Gesling 232b7b6213 Fix API endpoint paths 2021-07-15 09:45:53 +09:30
Kris Gesling a90f256218 add work around for postgres auth error 2021-07-15 09:45:03 +09:30
Kris Gesling 7aa466bdf5 reuse env variables for setting passwords 2021-07-15 09:45:03 +09:30
Kris Gesling a1a012de0a Use $USER variable to work regardless of username 2021-07-15 09:45:03 +09:30
Kris Gesling 755e4cb2cf Add link to Ubuntu image download 2021-07-15 07:22:42 +09:30
Chris Veilleux 8c97977c21 update to reflect changes in the shared library requirements 2021-07-14 15:46:01 -05:00
Chris Veilleux f030bf7817 log the InvalidTokenError for more information. 2021-07-14 15:46:01 -05:00
Chris Veilleux 0e65589a01 add descriptors to assertions 2021-07-14 15:46:01 -05:00
Chris Veilleux df22e63a78 Fix a bug that caused an error when attempting to save edits made to device attributes. 2021-07-13 19:26:41 -05:00
Chris Veilleux eba505e0df Found a feature of the schematics library that allowed simplification of the logic that converts the API request into the schematic for validation. 2021-07-13 14:26:48 -05:00
Chris Veilleux d59d49b922 Update pyjwt version in Pipfile and change JWT code to comply with new major version. 2021-07-13 14:16:19 -05:00
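
For context on the pyjwt upgrade: PyJWT 2.x returns a str from jwt.encode() (1.x returned bytes) and requires an explicit algorithms list in jwt.decode(). A small sketch with a placeholder secret and claims:

import jwt

secret = "access-secret"  # placeholder; Selene reads its secrets from the environment
token = jwt.encode({"sub": "account-id"}, secret, algorithm="HS256")
# Under PyJWT 1.x this needed token.decode("utf-8"); 2.x already returns a str.
claims = jwt.decode(token, secret, algorithms=["HS256"])
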
Chris Veilleux cfdc167443
Merge pull request #265 from MycroftAI/dependabot/pip/shared/urllib3-1.26.5
Bump urllib3 from 1.26.3 to 1.26.5 in /shared
2021-07-13 13:14:39 -05:00
Chris Veilleux 3f03b333e5
Merge pull request #266 from MycroftAI/dependabot/pip/batch/urllib3-1.26.5
Bump urllib3 from 1.26.3 to 1.26.5 in /batch
2021-07-13 13:14:18 -05:00
Chris Veilleux 3abdb7e318
Merge pull request #267 from MycroftAI/dependabot/pip/api/precise/urllib3-1.26.5
Bump urllib3 from 1.26.3 to 1.26.5 in /api/precise
2021-07-13 13:13:50 -05:00
Chris Veilleux adf26eb2c1
Merge pull request #262 from MycroftAI/feature/21.02
Bump mycroft-core version to 21.02
2021-06-14 23:07:54 -05:00
dependabot[bot] 10ba7400e6
Bump urllib3 from 1.26.3 to 1.26.5 in /api/precise
Bumps [urllib3](https://github.com/urllib3/urllib3) from 1.26.3 to 1.26.5.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/1.26.3...1.26.5)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2021-06-02 03:49:21 +00:00
dependabot[bot] f026a348d1
Bump urllib3 from 1.26.3 to 1.26.5 in /batch
Bumps [urllib3](https://github.com/urllib3/urllib3) from 1.26.3 to 1.26.5.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/1.26.3...1.26.5)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2021-06-02 00:49:22 +00:00
dependabot[bot] ab74b7f850
Bump urllib3 from 1.26.3 to 1.26.5 in /shared
Bumps [urllib3](https://github.com/urllib3/urllib3) from 1.26.3 to 1.26.5.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/1.26.3...1.26.5)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2021-06-02 00:43:30 +00:00
Kris Gesling 5654572258 Bump mycroft-core version to 21.02 2021-05-24 14:44:17 +09:30
Chris Veilleux d5d2eef6dc
Merge pull request #255 from MycroftAI/bugfix/send-email
Fix email sending logic
2021-03-16 23:07:31 -05:00
Chris Veilleux 80d0abc32e Added a dummy bearer token to the request that expects a 401 to be returned 2021-03-16 23:02:43 -05:00
Chris Veilleux 1bb3c39954 Added a check for both body and template file name being None. 2021-03-15 23:29:56 -05:00
Chris Veilleux 4a9225393c Add a dummy SendGrid key for the behave tests. 2021-03-15 22:16:16 -05:00
Chris Veilleux e268bf2875 Fix the email endpoint of the device API to send an email through SendGrid. 2021-03-15 18:39:32 -05:00
Chris Veilleux 4eef250051 Add logic to script to handle duplicated cities that have devices assigned to them. 2021-03-12 14:02:20 -06:00
Chris Veilleux 58683a7dee Merge remote-tracking branch 'origin/test' into dev 2021-02-25 20:16:17 -06:00
Chris Veilleux 6b9f495247 A change in the mark-2 branch of mycroft core is not yet propagated to the dev or master branches. Put a hack in place to fix it for now. 2021-02-25 19:19:34 -06:00
Chris Veilleux 03d26cf970
Merge pull request #254 from MycroftAI/bugfix/pantacor-config
Bugfix/pantacor config
2021-02-24 17:28:20 -06:00
Chris Veilleux 3e71a377b3 Pantacor API was being called unnecessarily. 2021-02-24 17:24:53 -06:00
Chris Veilleux 2ef4beaa0c fixed test to ignore device id no longer returned from add device call 2021-02-23 18:03:31 -06:00
Chris Veilleux 29b8e52e92 fixed an issue in the web console indicating the response data was not formatted as JSON 2021-02-23 18:03:31 -06:00
Chris Veilleux 4c84f6db48 Merge remote-tracking branch 'origin/dev' into bugfix/pantacor-config 2021-02-23 17:18:14 -06:00
Chris Veilleux f23ab4dc70 Merge remote-tracking branch 'origin/master' into test 2021-02-23 17:17:07 -06:00
Chris Veilleux 1ea76f9b38 make sure the channel is lowercase before applying updates. Don't update the database unless the API call succeeds 2021-02-23 16:21:10 -06:00
Chris Veilleux e25fcc418d Add an exception when the API call fails. 2021-02-23 16:19:59 -06:00
Chris Veilleux 46177a075f Merge branch 'master' into test 2021-02-22 11:53:20 -06:00
Chris Veilleux 9695fd3a5a
Merge pull request #246 from MycroftAI/dependabot/pip/batch/cryptography-3.3.2
Bump cryptography from 3.2.1 to 3.3.2 in /batch
2021-02-21 21:35:32 -06:00
dependabot[bot] 47da41f35d
Bump cryptography from 3.2.1 to 3.3.2 in /batch
Bumps [cryptography](https://github.com/pyca/cryptography) from 3.2.1 to 3.3.2.
- [Release notes](https://github.com/pyca/cryptography/releases)
- [Changelog](https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pyca/cryptography/compare/3.2.1...3.3.2)

Signed-off-by: dependabot[bot] <support@github.com>
2021-02-22 03:32:20 +00:00
Chris Veilleux 4052e4207c
Merge pull request #247 from MycroftAI/dependabot/pip/api/account/cryptography-3.3.2
Bump cryptography from 3.2.1 to 3.3.2 in /api/account
2021-02-21 21:27:33 -06:00
dependabot[bot] f3c4dcc3b3
Bump cryptography from 3.2.1 to 3.3.2 in /api/account
Bumps [cryptography](https://github.com/pyca/cryptography) from 3.2.1 to 3.3.2.
- [Release notes](https://github.com/pyca/cryptography/releases)
- [Changelog](https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pyca/cryptography/compare/3.2.1...3.3.2)

Signed-off-by: dependabot[bot] <support@github.com>
2021-02-22 03:10:39 +00:00
Chris Veilleux 498f622718
Merge pull request #248 from MycroftAI/dependabot/pip/api/public/cryptography-3.3.2
Bump cryptography from 3.2.1 to 3.3.2 in /api/public
2021-02-21 21:04:45 -06:00
dependabot[bot] cc4faa6475
Bump cryptography from 3.2.1 to 3.3.2 in /api/public
Bumps [cryptography](https://github.com/pyca/cryptography) from 3.2.1 to 3.3.2.
- [Release notes](https://github.com/pyca/cryptography/releases)
- [Changelog](https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pyca/cryptography/compare/3.2.1...3.3.2)

Signed-off-by: dependabot[bot] <support@github.com>
2021-02-22 02:22:04 +00:00
Chris Veilleux 7aaf1209f0
Merge pull request #249 from MycroftAI/dependabot/pip/api/precise/cryptography-3.3.2
Bump cryptography from 3.1.1 to 3.3.2 in /api/precise
2021-02-21 20:17:41 -06:00
Chris Veilleux e6821ec1a3
Merge pull request #250 from MycroftAI/dependabot/pip/shared/cryptography-3.3.2
Bump cryptography from 3.2.1 to 3.3.2 in /shared
2021-02-21 20:17:21 -06:00
Chris Veilleux b351c72dfe
Merge pull request #253 from MycroftAI/bugfix/move_pantacor_config
Move Pantacor device retrieval logic
2021-02-21 20:16:35 -06:00
Chris Veilleux e96720590a catch the PantacorError raised when a device is not found and log it. 2021-02-21 19:20:53 -06:00
Chris Veilleux b5bf8ebd43 added pantacor environment variables to fix breaking test 2021-02-21 18:43:20 -06:00
Chris Veilleux b6d7597c33
Merge pull request #252 from MycroftAI/bugfix/remove_duplicate_cities
Remove duplicate cities from database
2021-02-21 18:12:41 -06:00
Chris Veilleux 2f49f797b1 Change the logic to get the Pantacor device configuration to occur at activation time and use the Pantacor device ID sent to the API by the device. 2021-02-20 17:17:53 -06:00
Chris Veilleux 47fb276b4f Add pylint to dependencies 2021-02-18 18:52:27 -06:00
Chris Veilleux 5146c9346f New script and SQL to remove the duplicated cities that are part of the geographical information we download. 2021-02-18 18:50:51 -06:00
Chris Veilleux 79516ecef7
Merge pull request #251 from MycroftAI/feature/pantacor-update
Pantacor software update
2021-02-18 11:00:57 -06:00
Chris Veilleux 365ef3b02b Add the ability to get a pending deployment and trigger it if manual updates are turned on 2021-02-16 15:17:55 -06:00
Chris Veilleux 2b862c989a
Merge pull request #244 from MycroftAI/feature/pantacor-api
Feature/pantacor api
2021-02-09 22:31:59 -06:00
Chris Veilleux 0493aa11dc Added dummy environment variables for pantacor API 2021-02-09 22:21:48 -06:00
Chris Veilleux a56458c24f fixed failing test 2021-02-09 21:57:11 -06:00
dependabot[bot] c954afb67e
Bump cryptography from 3.2.1 to 3.3.2 in /shared
Bumps [cryptography](https://github.com/pyca/cryptography) from 3.2.1 to 3.3.2.
- [Release notes](https://github.com/pyca/cryptography/releases)
- [Changelog](https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pyca/cryptography/compare/3.2.1...3.3.2)

Signed-off-by: dependabot[bot] <support@github.com>
2021-02-10 02:48:17 +00:00
dependabot[bot] 44bafb1631
Bump cryptography from 3.1.1 to 3.3.2 in /api/precise
Bumps [cryptography](https://github.com/pyca/cryptography) from 3.1.1 to 3.3.2.
- [Release notes](https://github.com/pyca/cryptography/releases)
- [Changelog](https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pyca/cryptography/compare/3.1.1...3.3.2)

Signed-off-by: dependabot[bot] <support@github.com>
2021-02-10 02:44:05 +00:00
Chris Veilleux 52685c92b9 Add logic to update the Pantacor config when the device is edited in the UI 2021-02-08 10:57:18 -06:00
Chris Veilleux 3960cff2df Add PATCH methods to update Pantacor config values. 2021-02-05 14:45:48 -06:00
Chris Veilleux 779d3e8fac Rename "release" column in pantacor_config table to "release_channel". 2021-02-04 13:57:48 -06:00
Chris Veilleux d7f40e2e11 Merge remote-tracking branch 'origin/feature/pantacor_config' into feature/pantacor-api 2021-02-04 12:28:19 -06:00
Chris Veilleux f7bb357b7e Added IP address to the pantacor_config table. 2021-02-04 12:27:35 -06:00
Chris Veilleux 32d2dd2931 Changed device activation tests to check for the pantacor ID and fixed bugs as a result of running tests. 2021-02-03 15:15:06 -06:00
Chris Veilleux a396df6aff refactored to split into one test per endpoint and added check for packaging type in redis 2021-02-03 14:22:33 -06:00
Chris Veilleux a9ffb221ed minor refactor to use cache key constants instead of literals 2021-02-03 14:13:13 -06:00
Chris Veilleux 5fbccdaf03 minor refactor 2021-02-03 13:49:21 -06:00
Chris Veilleux 057fcbfd4a Add code to search for a device on Pantacor using their API. Pairing code is sent and their device ID is returned then added to our database. 2021-02-03 12:03:13 -06:00
Chris Veilleux 077c449ea2 Add the new packaging_type device config to the data stored in Redis at pairing time. Subsequent call to activate device using the pairing code will use this information to search for the device using the Pantacor API 2021-02-03 12:01:45 -06:00
Chris Veilleux be0c6212f7 Add CRUD operations for the pantacor_config table. 2021-02-01 12:23:04 -06:00
Chris Veilleux c49de30f20 Add the pantacor_config table to the database. 2021-02-01 12:20:44 -06:00
Chris Veilleux 1713c423d0 Re-apply original merge from test to master 2020-12-28 12:29:19 -06:00
Chris Veilleux 0b3c61d6aa minor bug fixes 2020-12-22 14:25:15 -06:00
Chris Veilleux 05f752d8a6 fixed another sql statement with the wake word subselect issue 2020-12-22 12:24:27 -06:00
Chris Veilleux 99798852b8 Fixed an issue with wake word case not matching database. 2020-12-22 12:21:49 -06:00
Chris Veilleux e63dcaf077 Fixed an issue with the wake word not being selected on the device edit screen 2020-12-22 12:00:50 -06:00
Chris Veilleux ea6084227f Added a check for no defaults 2020-12-22 12:00:15 -06:00
Chris Veilleux 0c11307011 Fixed an issue where the account defaults for wake words was not being properly handled. 2020-12-22 11:45:53 -06:00
Chris Veilleux dd86371b45 Merge remote-tracking branch 'origin/bugfix/pairing' 2020-12-21 20:37:41 -06:00
Chris Veilleux 561fe898b2 Fixed an issue that caused the device edit screen to return a 500 error. The code expected the results of two method calls to be a Device object but one returned a dictionary instead. 2020-12-21 19:41:51 -06:00
Chris Veilleux 1db2bb520f Fixed SQL statement that referenced old wake word data structure. 2020-12-18 14:01:41 -06:00
Kris Gesling 5e47af4117
Merge pull request #240 from MycroftAI/bugfix/pairing-down
Fix ModuleNotFoundError: No module named 'data'
2020-12-18 22:33:03 +09:30
Kris Gesling 70601bda5b remove old branch pull 2020-12-18 22:27:21 +09:30
Kris Gesling b2869dcbe5 fix module reference 2020-12-18 21:34:10 +09:30
Kris Gesling 59fd67cb3a run all tests on PR's to master 2020-12-18 21:33:20 +09:30
Kris Gesling b95b598ff5 prevent KeyError from missing timezone 2020-12-18 21:10:51 +09:30
Kris Gesling 12749da3af Revert "Merge remote-tracking branch 'origin/test'"
This reverts commit 9d3dd24601, reversing
changes made to 7ec565c4d8.
2020-12-18 21:04:48 +09:30
Chris Veilleux 9d3dd24601 Merge remote-tracking branch 'origin/test' 2020-12-17 21:44:04 -06:00
Chris Veilleux 5b44fcaf78 Add the file directory into the response for each file. 2020-12-14 16:35:26 -06:00
Chris Veilleux fecf09b4f4 Merge remote-tracking branch 'origin/dev' into test 2020-12-11 14:21:29 -06:00
Chris Veilleux 588f13f6f8
Merge pull request #236 from MycroftAI/feature/model-building-api
Feature/model building api
2020-12-11 14:12:35 -06:00
Chris Veilleux 4757b9b80a
Merge branch 'dev' into feature/model-building-api 2020-12-11 11:25:01 -06:00
Chris Veilleux 36795adfae fix sql error 2020-12-04 13:21:24 -06:00
Chris Veilleux 83b88ac649 remove authentication 2020-12-04 13:13:02 -06:00
Chris Veilleux 448eee3487
Merge pull request #234 from MycroftAI/feature/add-speech-tag
Add speech tag and enhance tagging selection criteria for wake word tagging
2020-12-01 20:47:07 -06:00
Chris Veilleux 5f17d80435 Merge remote-tracking branch 'origin/feature/add-speech-tag' into feature/model-building-api 2020-12-01 17:55:56 -06:00
Chris Veilleux 605f885018 Merge branch 'dev' into feature/add-speech-tag 2020-12-01 17:55:31 -06:00
Chris Veilleux 55f3ff01cf new precise API endpoint for retrieving a list of sample files for a specified wake word that have been designated since the specified date 2020-12-01 17:54:31 -06:00
Chris Veilleux 404fc3dabd
Merge pull request #232 from MycroftAI/feature/wake-word-designation
New script to make file designations out of file tags.
2020-12-01 17:03:36 -06:00
Chris Veilleux 2510ac916a added a check for removing files designated as non-speaking 2020-12-01 13:37:59 -06:00
Chris Veilleux 483f8a7d4d Merge fix 2020-11-30 18:04:26 -06:00
Chris Veilleux f0f1966e2f Merge branch 'feature/wake-word-designation' into feature/add-speech-tag
# Conflicts:
#	api/precise/precise_api/endpoints/tag.py
2020-11-30 18:03:04 -06:00
Chris Veilleux 4221556515
Merge branch 'dev' into feature/wake-word-designation 2020-11-30 17:59:08 -06:00
Chris Veilleux aded8d85df Move the retrieval/creation of taggers and sessions sooner into the workflow. 2020-11-30 17:15:25 -06:00
Chris Veilleux 5b9f7ff478 Enhance to be smarter about how files are selected for tagging based on the designations and tags that have been applied to the file up to this point. 2020-11-30 11:34:09 -06:00
Chris Veilleux 6e4177b5a5 Add a priority column to the tag table because some tags are prerequisites to others 2020-11-30 11:32:35 -06:00
Chris Veilleux 12050b60a9
Merge pull request #230 from MycroftAI/ci-docker-image-cleanup
Add post processing to Jenkins to clean up Docker containers and images.
2020-11-24 12:20:06 -06:00
Chris Veilleux c72d598bb6 Change the label used for image pruning to prevent PR name clashes across different CI jobs 2020-11-12 15:13:38 -10:00
Chris Veilleux 17bd500394 New script to make file designations out of file tags. 2020-11-05 22:21:36 -06:00
Chris Veilleux 8528bb87d3 Add post processing to Jenkins to clean up Docker containers and images. 2020-11-04 15:05:34 -06:00
Chris Veilleux 93d3efdb55
Merge pull request #229 from MycroftAI/feature/tagger
Feature/tagger
2020-11-03 17:06:21 -06:00
Chris Veilleux 2eb5532c35 Fixed tests that were breaking due to the new request format. 2020-11-03 14:16:40 -06:00
Chris Veilleux af6618fd51 Pin pyjwt to version 1.7.1 to fix a testing issue where one virtualenv was running 2.0.0a1. 2020-11-02 19:28:31 -06:00
Chris Veilleux efcd9b8b4f PyLint fixes 2020-11-02 15:44:08 -06:00
Chris Veilleux df59fe437c PyLint fixes 2020-11-02 14:52:21 -06:00
Chris Veilleux df3efccebd PyLint fixes 2020-11-02 14:43:52 -06:00
Chris Veilleux a9802afac4 PyLint fixes 2020-11-02 14:32:25 -06:00
Chris Veilleux b648ebe3ca update a tag value based on user feedback 2020-10-30 15:46:37 -05:00
Chris Veilleux 8a917ffb81 Small performance improvement. 2020-10-30 15:00:59 -05:00
Chris Veilleux e2ef562125 Improve randomization of query to get a file to tag. Also, change logic to only return non-wake word tags when the sample is designated as a wake word. 2020-10-30 14:19:20 -05:00
Chris Veilleux 144cb0ed89 fix timestamp range issues with tagging.session table 2020-10-29 20:52:30 -05:00
Chris Veilleux 1cd3e90e4b fixed an sql file name 2020-10-29 20:38:30 -05:00
Chris Veilleux dade8fbc36 Fixed PyLint issues 2020-10-29 18:19:51 -05:00
Chris Veilleux bd41dbcfae wake words coming from device may contain a dash. remove it to find the right wake word on the database. 2020-10-29 16:03:12 -05:00
Chris Veilleux 161e1eef09 de-linted 2020-10-29 15:01:33 -05:00
Chris Veilleux 9a599df7f3 change to use new wake word data object 2020-10-29 14:59:15 -05:00
Chris Veilleux 0f6da41e78 change to use new wake word data object 2020-10-29 14:59:02 -05:00
Chris Veilleux 30bb8979f9 fix issue where UI uses title case for wake word names while database uses lower case 2020-10-29 14:58:29 -05:00
Chris Veilleux cfd1747fa3
Merge pull request #226 from MycroftAI/feature/file-name-hashing
Feature/file name hashing
2020-10-27 13:14:33 -05:00
Chris Veilleux 5bcac25bd7 fixed bugs in SFTP logic due to learning to use paramiko 2020-10-26 13:07:46 -05:00
Chris Veilleux 74b6feb702 fixed naming issue 2020-10-24 01:35:01 -05:00
Chris Veilleux 671134ab91 removed unused import 2020-10-24 00:52:21 -05:00
Chris Veilleux e76faf714b added remainder of tagging tables 2020-10-24 00:51:56 -05:00
Chris Veilleux df56cfa372 fix typo in name of private key file 2020-10-24 00:50:53 -05:00
Chris Veilleux 92a7666d56 fixed a copy paste error and named the flask app correctly 2020-10-22 23:40:20 -05:00
Chris Veilleux 36986ef456 pass a session id around to help with the wake word file selection process. 2020-10-21 23:12:57 -05:00
Chris Veilleux 78530189cf fixed an error with password validation found during testing 2020-10-21 17:46:47 -05:00
Chris Veilleux 222bbbe1ca Merge remote-tracking branch 'origin/feature/file-name-hashing' into feature/tagger 2020-10-21 17:24:43 -05:00
Chris Veilleux a088b3bf5f addressed pylint errors 2020-10-21 17:01:01 -05:00
Chris Veilleux 91557c4f73 fixed a breaking environmental cleanup for the behavioral tests. 2020-10-21 16:26:04 -05:00
Chris Veilleux 83795ebb55 Merge remote-tracking branch 'origin/dev' into feature/file-name-hashing 2020-10-21 15:34:14 -05:00
Chris Veilleux 9d97fc09cb
Merge pull request #225 from MycroftAI/feature/wake-word-file-removal
Feature/wake word file removal
2020-10-21 15:32:55 -05:00
Chris Veilleux 70958b80d1 removed staticmethod decorator from validate methods in Schematics models to fix failing tests. 2020-10-21 15:20:51 -05:00
Chris Veilleux 67a40284db fixed a pylint refactor 2020-10-20 17:55:15 -05:00
Chris Veilleux d6d6846752 fixed an import error 2020-10-20 17:49:33 -05:00
Chris Veilleux 23e6101fc4 addressed pylint issues 2020-10-20 17:38:28 -05:00
Chris Veilleux 7ec565c4d8
Merge pull request #220 from hammyMarc/patch-1
Update README.md
2020-10-20 17:35:10 -05:00
Chris Veilleux 04ee634747 fixed import statement 2020-10-20 17:31:27 -05:00
Chris Veilleux 1cf9a3f926 addressed some pylint refactors 2020-10-20 17:02:49 -05:00
Chris Veilleux 0de627da80 removed checkout of devops branch that has been merged 2020-10-20 15:07:54 -05:00
Chris Veilleux 36eec4d253 removed checkout of devops branch that has been merged 2020-10-20 15:07:20 -05:00
Chris Veilleux 5624a73871 changed wake word to be passed as a query parameter. 2020-10-20 14:17:14 -05:00
Chris Veilleux 0f771ca69c implement precise API endpoints for the tagger 2020-10-20 12:22:56 -05:00
Chris Veilleux 435bf70016 New ssh utilities library using paramiko to log in and transfer files over SFTP 2020-10-19 15:07:25 -05:00
Chris Veilleux 2e1c67efd2 New tables to support tagging of wake word files. 2020-10-19 14:55:16 -05:00
Chris Veilleux 9a506a0740 Stub files for new precise API 2020-10-08 13:18:45 -05:00
Chris Veilleux 22ec4ee385 removed duplicate licensing info 2020-10-08 13:12:36 -05:00
Chris Veilleux 22b4fdd685 update the environment.py file to use a new helper function 2020-10-06 01:10:45 -05:00
Chris Veilleux 40dbcd0c8a change logic that names files to use a hash of the file contents as the file name. 2020-10-06 01:08:53 -05:00
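
Content-hash file naming, as described above, can be sketched in a few lines; the hash algorithm, extension, and directory layout are assumptions rather than the actual implementation.

import hashlib
from pathlib import Path

def save_sample(audio: bytes, directory: Path) -> Path:
    # Identical audio content always maps to the same file name.
    file_name = hashlib.sha1(audio).hexdigest() + ".wav"
    destination = directory / file_name
    destination.write_bytes(audio)
    return destination
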
Chris Veilleux 780606cb7f refactor to remove duplicate SQL 2020-10-06 01:06:31 -05:00
Chris Veilleux b4b2ab2b50 new testing helper functions for the tagging schema 2020-10-06 00:36:44 -05:00
Chris Veilleux 0106523772 make the file location on the tagging.file table not nullable 2020-10-06 00:33:29 -05:00
Chris Veilleux 1988266147 Merge branch 'feature/wake-word-file-removal' of https://github.com/MycroftAI/selene-backend into feature/wake-word-file-removal 2020-10-02 13:48:14 -05:00
Chris Veilleux 492f5aa2e9 Merge branch 'dev' into feature/wake-word-file-removal 2020-10-02 13:47:41 -05:00
Chris Veilleux 09fe436cce Fixed PyLint errors discovered in CI process 2020-10-02 13:46:56 -05:00
Chris Veilleux 1d5ea64a40 fix import statement 2020-10-01 14:41:52 -05:00
Chris Veilleux 1c574afb89
Merge branch 'dev' into feature/wake-word-file-removal 2020-10-01 14:37:29 -05:00
Chris Veilleux 2f205868a3 Merge remote-tracking branch 'origin/master' into dev 2020-10-01 14:36:36 -05:00
Chris Veilleux 04dd774c50 Merge remote-tracking branch 'origin/master' into test 2020-10-01 14:36:08 -05:00
Kris Gesling 9d1e07f2bb
Merge pull request #221 from MycroftAI/feature/20.08
Update for mycroft-core v20.08
2020-09-28 09:19:50 +09:30
Chris Veilleux dc114fe3c5
Merge branch 'dev' into feature/20.08 2020-09-25 11:24:35 -05:00
Chris Veilleux 6f2de64f3b
Merge pull request #223 from MycroftAI/feature/wake-word-storage
Feature/wake word storage
2020-09-24 11:54:41 -05:00
Chris Veilleux 69e04bce4e unique index does not propagate down from parent table. 2020-09-24 11:54:06 -05:00
Chris Veilleux 97d06f191a Merge branch 'feature/wake-word-storage' into feature/wake-word-file-removal 2020-09-21 13:39:28 -05:00
Chris Veilleux 7f079dfcab Merge branch 'dev' into feature/wake-word-storage 2020-09-21 13:36:27 -05:00
Chris Veilleux 6f0d074f08 new script to delete files when an account is deleted 2020-09-21 12:53:58 -05:00
Chris Veilleux 15d75621b1 remove file location row if added in a test 2020-09-17 15:45:20 -05:00
Chris Veilleux d521bec189 add logic to set the wake word file status to "pending delete" when an account with associated files is deleted. 2020-09-17 15:30:44 -05:00
Chris Veilleux 2dbfd1265c add logic to existing code handling the new status column on the wake word file table 2020-09-17 15:29:25 -05:00
Chris Veilleux 88943afe37 add status to wake word file table 2020-09-17 13:56:11 -05:00
Chris Veilleux 4f6b9771ed New script to move files from Selene Public API server to longer term storage. 2020-09-15 14:13:37 -05:00
Chris Veilleux 5573912707
Merge pull request #222 from MycroftAI/feature/wake_word_upload
Feature/wake word upload
2020-09-11 20:32:04 -05:00
Chris Veilleux 0c9197d707 Change the cursor factory from NamedTupleCursor to RealDictCursor to match what the API code is using. 2020-09-11 18:45:55 -05:00
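
The cursor-factory switch above changes how rows come back from psycopg2: RealDictCursor yields dictionaries keyed by column name instead of named tuples. Illustrative only; the connection parameters, table, and columns are placeholders.

import psycopg2
from psycopg2.extras import RealDictCursor

connection = psycopg2.connect(dbname="mycroft", user="selene", cursor_factory=RealDictCursor)
with connection.cursor() as cursor:
    cursor.execute("SELECT id, name FROM wake_word.wake_word LIMIT 5")
    for row in cursor:
        print(row["id"], row["name"])  # rows are dicts, not named tuples
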
Chris Veilleux 699ab4f1d2 added endpoint to Public API for uploading a wake word sample file. 2020-09-02 13:57:00 -05:00
Chris Veilleux 333d19efea added tagging schema 2020-09-02 13:55:06 -05:00
Chris Veilleux 628fa90e0d removed wake word engine enum type 2020-09-02 13:54:22 -05:00
Kris Gesling 7307a99c84 update for 20.08 2020-09-02 11:06:45 +09:30
Chris Veilleux 2b074ef10d Moved wake_word and wake_word_settings tables to a new wake_word schema. 2020-08-27 13:16:28 -05:00
hammyMarc 92e056fa57
Update README.md
Line 63: to lower-case P->python3.7

Line 48: Moved the install of package: python3-pip
2020-07-13 12:59:31 -04:00
hammyMarc ef8ef26ccc
Update README.md
For some reason, a fresh install of Ubuntu 18 and python3.7 didn't include pip. Added a line to ensure people have pip before proceeding.
2020-07-12 18:24:47 -04:00
364 changed files with 19862 additions and 6018 deletions

.pre-commit-config.yaml (new file, 11 lines)

@@ -0,0 +1,11 @@
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v2.3.0
hooks:
- id: check-yaml
- id: end-of-file-fixer
- id: trailing-whitespace
- repo: https://github.com/psf/black
rev: 22.3.0
hooks:
- id: black

Dockerfile

@@ -12,21 +12,26 @@
# "--net <network name>" argument.
# Build steps that apply to all of the selene applications.
FROM python:3.7-slim as selene-base
RUN apt-get update && apt-get -y install gcc git
RUN python3 -m pip install pipenv
FROM python:3.9 as base-build
RUN apt-get update && apt-get -y install gcc git libsndfile-dev
RUN curl -sSL https://install.python-poetry.org | python3 -
ENV PATH ${PATH}:/root/.local/bin
RUN poetry --version
RUN mkdir -p /root/allure /opt/selene/selene-backend /root/code-quality /var/log/mycroft
WORKDIR /opt/selene/selene-backend
ENV DB_HOST selene-db
ENV DB_PASSWORD adam
ENV SELENE_ENVIRONMENT dev
ENV DB_NAME mycroft
ENV DB_PASSWORD adam
ENV DB_USER selene
ENV JWT_ACCESS_SECRET access-secret
ENV JWT_REFRESH_SECRET refresh-secret
ENV SALT testsalt
ENV REDIS_HOST selene-cache
ENV REDIS_PORT 6379
ENV SALT testsalt
ENV SELENE_ENVIRONMENT dev
# Put the copy of the shared library code in its own section to avoid reinstalling base software every time
FROM base-build as selene-base
COPY shared shared
# Code quality scripts and user agreements are stored in the MycroftAI/devops repository. This repository is private.
@@ -39,11 +44,9 @@ ARG github_api_key
ENV GITHUB_API_KEY=$github_api_key
RUN mkdir -p /opt/mycroft
WORKDIR /opt/mycroft
RUN git clone https://$github_api_key@github.com/MycroftAI/devops.git
RUN git clone https://${github_api_key}@github.com/MycroftAI/devops.git
WORKDIR /opt/mycroft/devops/jenkins
# TODO: remove when the pull-request-identifier branch is merged.
RUN git checkout bug/pull-request-identifier
RUN pipenv install
RUN poetry install
# Run a linter and code formatter against the API specified in the build argument
FROM devops-build as api-code-check
@@ -51,10 +54,10 @@ ARG api_name
WORKDIR /opt/selene/selene-backend
COPY api/${api_name} api/${api_name}
WORKDIR /opt/selene/selene-backend/api/${api_name}
RUN pipenv install --dev
RUN poetry install
ENV PYTHONPATH=$PYTHONPATH:/opt/selene/selene-backend/api/${api_name}
WORKDIR /opt/mycroft/devops/jenkins
ENTRYPOINT ["pipenv", "run", "python", "-m", "pipeline.code_check", "--repository", "selene-backend", "--base-dir", "/opt/selene"]
ENTRYPOINT ["poetry", "run", "python", "-m", "pipeline.code_check", "--repository", "selene-backend", "--base-dir", "/opt/selene"]
# Bootstrap the Selene database as it will be needed to run any Selene applications.
FROM devops-build as db-bootstrap
@@ -62,19 +65,23 @@ ENV POSTGRES_PASSWORD selene
WORKDIR /opt/selene/selene-backend
COPY db db
WORKDIR /opt/selene/selene-backend/db
RUN pipenv install
ENTRYPOINT ["pipenv", "run", "python", "scripts/bootstrap_mycroft_db.py"]
RUN poetry install
RUN mkdir -p /tmp/selene
ENTRYPOINT ["poetry", "run", "python", "scripts/bootstrap_mycroft_db.py", "--ci"]
# Run the tests defined in the Account API
FROM selene-base as account-api-test
ARG stripe_api_key
ENV ACCOUNT_BASE_URL https://account.mycroft.test
ENV PANTACOR_API_TOKEN pantacor-token
ENV PANTACOR_API_BASE_URL pantacor.test.url
ENV PYTHONPATH=$PYTHONPATH:/opt/selene/selene-backend/api/account
ENV STRIPE_PRIVATE_KEY $stripe_api_key
COPY api/account api/account
WORKDIR /opt/selene/selene-backend/api/account
RUN pipenv install --dev
RUN poetry install
WORKDIR /opt/selene/selene-backend/api/account/tests
ENTRYPOINT ["pipenv", "run", "behave", "-f", "allure_behave.formatter:AllureFormatter", "-o", "/root/allure/allure-result"]
ENTRYPOINT ["poetry", "run", "behave", "-f", "allure_behave.formatter:AllureFormatter", "-o", "/root/allure/allure-result"]
# Run the tests defined in the Single Sign On API
FROM selene-base as sso-api-test
@@ -88,21 +95,27 @@ ENV GITHUB_CLIENT_ID $github_client_id
ENV GITHUB_CLIENT_SECRET $github_client_secret
COPY api/sso api/sso
WORKDIR /opt/selene/selene-backend/api/sso
RUN pipenv install --dev
RUN poetry install
WORKDIR /opt/selene/selene-backend/api/sso/tests
ENTRYPOINT ["pipenv", "run", "behave", "-f", "allure_behave.formatter:AllureFormatter", "-o", "/root/allure/allure-result"]
ENTRYPOINT ["poetry", "run", "behave", "-f", "allure_behave.formatter:AllureFormatter", "-o", "/root/allure/allure-result"]
# Run the tests defined in the Public Device API
FROM selene-base as public-api-test
RUN mkdir -p /opt/selene/data
ARG google_stt_key
ARG stt_api_key
ARG wolfram_alpha_key
ENV GOOGLE_APPLICATION_CREDENTIALS="/root/secrets/transcription-test-363101-6532632520e1.json"
ENV GOOGLE_STT_KEY $google_stt_key
ENV PANTACOR_API_TOKEN pantacor-token
ENV PANTACOR_API_BASE_URL pantacor.test.url
ENV PYTHONPATH=$PYTHONPATH:/opt/selene/selene-backend/api/public
ENV GOOGLE_STT_KEY $google_stt_key
ENV SENDGRID_API_KEY test_sendgrid_key
ENV WOLFRAM_ALPHA_KEY $wolfram_alpha_key
ENV WOLFRAM_ALPHA_URL https://api.wolframalpha.com
COPY api/public api/public
WORKDIR /opt/selene/selene-backend/api/public
RUN pipenv install --dev
RUN poetry install
WORKDIR /opt/selene/selene-backend/api/public/tests
ENTRYPOINT ["pipenv", "run", "behave", "-f", "allure_behave.formatter:AllureFormatter", "-o", "/root/allure/allure-result"]
ENTRYPOINT ["poetry", "run", "behave", "-f", "allure_behave.formatter:AllureFormatter", "-o", "/root/allure/allure-result"]

Jenkinsfile (vendored, 115 changed lines)

@@ -5,6 +5,7 @@ pipeline {
// building the Docker image.
disableConcurrentBuilds()
buildDiscarder(logRotator(numToKeepStr: '5'))
ansiColor('xterm')
}
environment {
// Some branches have a "/" in their name (e.g. feature/new-and-cool)
@@ -17,7 +18,7 @@ pipeline {
).trim()
DOCKER_BUILDKIT=1
//spawns GITHUB_USR and GITHUB_PSW environment variables
GITHUB_API=credentials('38b2e4a6-167a-40b2-be6f-d69be42c8190')
GITHUB_API_KEY=credentials('38b2e4a6-167a-40b2-be6f-d69be42c8190')
GITHUB_CLIENT_ID=credentials('380f58b1-8a33-4a9d-a67b-354a9b0e792e')
GITHUB_CLIENT_SECRET=credentials('71626c21-de59-4450-bfad-5034fd596fb2')
GOOGLE_STT_KEY=credentials('287949f8-2ada-4450-8806-1fe2dd8e4c4d')
@@ -28,38 +29,42 @@ pipeline {
stage('Lint & Format') {
// Run PyLint and Black to check code quality.
when {
changeRequest target: 'dev'
anyOf {
changeRequest target: 'dev'
changeRequest target: 'master'
}
}
steps {
labelledShell label: 'Account API Setup', script: """
docker build \
--build-arg github_api_key=${GITHUB_API_PSW} \
--build-arg github_api_key=${GITHUB_API_KEY} \
--build-arg api_name=account \
--target api-code-check --no-cache \
-t selene-linter:${BRANCH_ALIAS} .
"""
labelledShell label: 'Account API Check', script: """
docker run selene-linter:${BRANCH_ALIAS} --pipenv-dir api/account --pull-request=${BRANCH_NAME}
docker run selene-linter:${BRANCH_ALIAS} --poetry-dir api/account --pull-request=${BRANCH_NAME}
"""
labelledShell label: 'Single Sign On API Setup', script: """
docker build \
--build-arg github_api_key=${GITHUB_API_PSW} \
--build-arg github_api_key=${GITHUB_API_KEY} \
--build-arg api_name=sso \
--target api-code-check --no-cache \
-t selene-linter:${BRANCH_ALIAS} .
"""
labelledShell label: 'Single Sign On API Check', script: """
docker run selene-linter:${BRANCH_ALIAS} --pipenv-dir api/sso --pull-request=${BRANCH_NAME}
docker run selene-linter:${BRANCH_ALIAS} --poetry-dir api/sso --pull-request=${BRANCH_NAME}
"""
labelledShell label: 'Public API Setup', script: """
docker build \
--build-arg github_api_key=${GITHUB_API_PSW} \
--build-arg github_api_key=${GITHUB_API_KEY} \
--build-arg api_name=public \
--target api-code-check --no-cache \
--label job=${JOB_NAME} \
-t selene-linter:${BRANCH_ALIAS} .
"""
labelledShell label: 'Public API Check', script: """
docker run selene-linter:${BRANCH_ALIAS} --pipenv-dir api/public --pull-request=${BRANCH_NAME}
docker run selene-linter:${BRANCH_ALIAS} --poetry-dir api/public --pull-request=${BRANCH_NAME}
"""
}
}
@@ -69,19 +74,23 @@ pipeline {
branch 'dev'
branch 'master'
changeRequest target: 'dev'
changeRequest target: 'master'
}
}
steps {
labelledShell label: 'Building Docker image', script: """
docker build \
--target db-bootstrap \
--build-arg github_api_key=${GITHUB_API_PSW} \
--build-arg github_api_key=${GITHUB_API_KEY} \
--label job=${JOB_NAME} \
-t selene-db:${BRANCH_ALIAS} .
"""
timeout(time: 5, unit: 'MINUTES')
{
labelledShell label: 'Run database bootstrap script', script: """
docker run --net selene-net selene-db:${BRANCH_ALIAS}
docker run \
-v '${HOME}/selene:/tmp/selene' \
--net selene-net selene-db:${BRANCH_ALIAS}
"""
}
}
@@ -92,6 +101,7 @@ pipeline {
branch 'dev'
branch 'master'
changeRequest target: 'dev'
changeRequest target: 'master'
}
}
steps {
@@ -99,18 +109,32 @@ pipeline {
docker build \
--build-arg stripe_api_key=${STRIPE_KEY} \
--target account-api-test \
--label job=${JOB_NAME} \
-t selene-account:${BRANCH_ALIAS} .
"""
timeout(time: 5, unit: 'MINUTES')
{
sh 'mkdir -p $HOME/selene/$BRANCH_ALIAS/allure'
labelledShell label: 'Running behave tests', script: """
docker run \
--net selene-net \
-v '${HOME}/allure/selene/:/root/allure' \
-v '$HOME/selene/$BRANCH_ALIAS/allure/:/root/allure' \
--label job=${JOB_NAME} \
selene-account:${BRANCH_ALIAS}
"""
}
}
post {
always {
sh 'docker run \
-v "$HOME/selene/$BRANCH_ALIAS/allure:/root/allure" \
--entrypoint=/bin/bash \
--label build=${JOB_NAME} \
selene-account:${BRANCH_ALIAS} \
-x -c "chown $(id -u $USER):$(id -g $USER) \
-R /root/allure/"'
}
}
}
stage('Single Sign On API Tests') {
when {
@@ -118,6 +142,7 @@ pipeline {
branch 'dev'
branch 'master'
changeRequest target: 'dev'
changeRequest target: 'master'
}
}
steps {
@@ -126,6 +151,7 @@ pipeline {
--build-arg github_client_id=${GITHUB_CLIENT_ID} \
--build-arg github_client_secret=${GITHUB_CLIENT_SECRET} \
--target sso-api-test \
--label job=${JOB_NAME} \
-t selene-sso:${BRANCH_ALIAS} .
"""
timeout(time: 2, unit: 'MINUTES')
@@ -133,11 +159,22 @@ pipeline {
labelledShell label: 'Running behave tests', script: """
docker run \
--net selene-net \
-v '${HOME}/allure/selene/:/root/allure' \
-v '$HOME/selene/$BRANCH_ALIAS/allure/:/root/allure' \
selene-sso:${BRANCH_ALIAS}
"""
}
}
post {
always {
sh 'docker run \
-v "$HOME/selene/$BRANCH_ALIAS/allure:/root/allure" \
--entrypoint=/bin/bash \
--label build=${JOB_NAME} \
selene-sso:${BRANCH_ALIAS} \
-x -c "chown $(id -u $USER):$(id -g $USER) \
-R /root/allure/"'
}
}
}
stage('Public Device API Tests') {
when {
@@ -145,14 +182,16 @@ pipeline {
branch 'dev'
branch 'master'
changeRequest target: 'dev'
changeRequest target: 'master'
}
}
steps {
labelledShell label: 'Building Docker image', script: """
docker build \
--build-arg google_stt_key=${GOOGLE_STT_KEY} \
--build-arg wolfram_alpha_key=${WOLFRAM_ALPHA_KEY} \
--build-arg google_stt_key=${GOOGLE_STT_KEY} \
--target public-api-test \
--label job=${JOB_NAME} \
-t selene-public:${BRANCH_ALIAS} .
"""
timeout(time: 2, unit: 'MINUTES')
@@ -160,11 +199,59 @@ pipeline {
labelledShell label: 'Running behave tests', script: """
docker run \
--net selene-net \
-v '$HOME/allure/selene/:/root/allure' \
-v '$HOME/selene/$BRANCH_ALIAS/allure/:/root/allure' \
-v '$HOME/selene/secrets/:/root/secrets' \
selene-public:${BRANCH_ALIAS}
"""
}
}
post {
always {
sh 'docker run \
-v "$HOME/selene/$BRANCH_ALIAS/allure:/root/allure" \
--entrypoint=/bin/bash \
--label build=${JOB_NAME} \
selene-account:${BRANCH_ALIAS} \
-x -c "chown $(id -u $USER):$(id -g $USER) \
-R /root/allure/"'
}
}
}
}
post {
always {
sh 'rm -rf allure-result/*'
sh 'mkdir -p $HOME/selene/$BRANCH_ALIAS/allure/allure-result'
sh 'mv $HOME/selene/$BRANCH_ALIAS/allure/allure-result allure-result'
// This directory should now be empty, rmdir will intentionally fail if not.
sh 'rmdir $HOME/selene/$BRANCH_ALIAS/allure'
script {
allure([
includeProperties: false,
jdk: '',
properties: [],
reportBuildPolicy: 'ALWAYS',
results: [[path: 'allure-result']]
])
}
sh(
label: 'Cleanup lingering docker containers and images.',
script: """
docker container prune --force;
docker image prune --force;
"""
)
}
success {
// Docker images should remain upon failure for troubleshooting purposes. However,
// if the stage is successful, there is no reason to look back at the Docker image. In theory
// broken builds will eventually be fixed, so this step should run for every PR.
sh(
label: 'Delete Docker Image on Success',
script: '''
docker image prune --all --force --filter label=job=${JOB_NAME};
'''
)
}
}
}

190
README.md
View File

@ -1,53 +1,57 @@
[![License](https://img.shields.io/badge/License-GNU_AGPL%203.0-blue.svg)](LICENSE)
[![CLA](https://img.shields.io/badge/CLA%3F-Required-blue.svg)](https://mycroft.ai/cla)
[![Team](https://img.shields.io/badge/Team-Mycroft_Backend-violetblue.svg)](https://github.com/MycroftAI/contributors/blob/master/team/Mycroft%20Backend.md)
![Status](https://img.shields.io/badge/-Production_ready-green.svg)
[![PRs Welcome](https://img.shields.io/badge/PRs-welcome-brightgreen.svg)](http://makeapullrequest.com)
[![Join chat](https://img.shields.io/badge/Mattermost-join_chat-brightgreen.svg)](https://chat.mycroft.ai)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
Selene -- Mycroft's Server Backend
==========
Selene provides the services used by [Mycroft Core](https://github.com/mycroftai/mycroft-core) to manage devices, skills
and settings. It consists of two repositories. This one contains Python and SQL representing the database definition,
data access layer, APIs and scripts. The second repository, [Selene UI](https://github.com/mycroftai/selene-ui),
contains Angular web applications that use the APIs defined in this repository.
There are four APIs defined in this repository, account management, single sign on, skill marketplace and device.
The first three support account.mycroft.ai (aka home.mycroft.ai), sso.mycroft.ai, and market.mycroft.ai, respectively.
The device API is how devices running Mycroft Core communicate with the server. Also included in this repository is
a package containing batch scripts for maintenance and the definition of the database schema.
Each API is designed to run independently of the others. Code common to each of the APIs, such as the Data Access Layer,
can be found in the "shared" directory. The shared code is an independent Python package required by each of the APIs.
Each API has its own Pipfile so that it can be run in its own virtual environment.
# Installation
## Installation
The Python code utilizes features introduced in Python 3.7, such as data classes.
[Pipenv](https://pipenv.readthedocs.io/en/latest/) is used for virtual environment and package management.
If you prefer to use pip and pyenv (or virtualenv), you can find the required libraries in the files named "Pipfile".
These instructions will use pipenv commands.
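As a quick reference, the basic pipenv workflow used throughout these instructions looks like the following (the component path is only an example):
```
cd /opt/selene/selene-backend/api/account   # any component containing a Pipfile
pipenv install                              # create the virtual environment and install dependencies
pipenv run python some_script.py            # run a command inside that environment
```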
If the Selene applications will be servicing a large number of devices (enterprise usage, for example), it is
recommended that each of the applications run on their own server or virtual machine. This configuration makes it
easier to scale and monitor each application independently. However, all applications can be run on a single server.
This configuration could be more practical for a household running a handful of devices.
These instructions will assume a multi-server setup for several thousand devices. To run on a single server servicing a
small number of devices, the recommended system requirements are 4 CPU, 8GB RAM and 100GB of disk. There are a lot of
manual steps in this section that will eventually be replaced with an installation script.
All Selene applications are time zone agnostic. It is recommended that the time zone on any server running Selene be UTC.
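On Ubuntu, for example, the time zone can be set to UTC with:
```
sudo timedatectl set-timezone UTC
timedatectl   # the output should now show the time zone as UTC
```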
## Postgres DB
* Recommended server configuration: Ubuntu 18.04 LTS, 2 CPU, 4GB RAM, 50GB disk.
It is recommended to create an application specific user. In these instructions this user will be `mycroft`.
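For example (the sudo grant is optional and only needed if this user will run the sudo commands below):
```
sudo adduser mycroft
sudo usermod -aG sudo mycroft
```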
### Postgres DB
* Recommended server configuration: [Ubuntu 18.04 LTS (server install)](https://releases.ubuntu.com/bionic/), 2 CPU, 4GB RAM, 50GB disk.
* Use the package management system to install Python 3.7, Python 3 pip and PostgreSQL 10
```
sudo apt-get install postgresql python3.7 python
sudo apt-get install postgresql python3.7 python python3-pip
```
* Set Postgres to start on boot
```
sudo systemctl enable postgresql
```
@ -60,7 +64,7 @@ git clone https://github.com/MycroftAI/selene-backend.git
```
* Create the virtual environment for the database code
```
sudo Python3.7 -m pip install pipenv
sudo python3.7 -m pip install pipenv
cd /opt/selene/selene-backend/db
pipenv install
```
@ -73,22 +77,29 @@ wget http://download.geonames.org/export/dump/timeZones.txt
wget http://download.geonames.org/export/dump/admin1CodesASCII.txt
wget http://download.geonames.org/export/dump/cities500.zip
```
* Generate secure passwords for the postgres user and selene user on the database
```
sudo -u postgres psql -c "ALTER USER postgres PASSWORD '<new password>'"
sudo -u postgres psql -c "CREATE ROLE selene WITH LOGIN ENCRYPTED PASSWORD '<password>'"
```
* Add environment variables containing these passwords for the bootstrap script
```
export DB_PASSWORD=<selene user password>
export POSTGRES_PASSWORD=<postgres user password>
```
* Generate secure passwords for the postgres user and selene user on the database
```
sudo -u postgres psql -c "ALTER USER postgres PASSWORD '$POSTGRES_PASSWORD'"
sudo -u postgres psql -c "CREATE ROLE selene WITH LOGIN ENCRYPTED PASSWORD '$DB_PASSWORD'"
```
* Run the bootstrap script
```
cd /opt/selene/selene-backend/db/scripts
pipenv run python bootstrap_mycroft_db.py
```
* Note: if you get an authentication error you can temporarily edit `/etc/postgresql/<version>/main/pg_hba.conf` replacing the following lines:
```
# "local" is for Unix domain socket connections only
local all all trust
# IPv4 local connections:
host all all 127.0.0.1/32 trust
```
* By default, Postgres only listens on localhost. This will not do for a multi-server setup. Change the
`listen_addresses` value in the `postgresql.conf` file to the private IP of the database server. This file is owned by
the `postgres` user so use the following command to edit it (substituting vi for your favorite editor)
```
@ -100,7 +111,7 @@ the `postgres` user so use the following command to edit it (substituting vi for
```
sudo -u postgres vi /etc/postgresql/10/main/pg_hba.conf
```
* Instructions on how to update the `pg_hba.conf` file can be found in
[Postgres' documentation](https://www.postgresql.org/docs/10/auth-pg-hba-conf.html). Below is an example for reference.
```
# IPv4 Selene connections
@ -110,19 +121,23 @@ host mycroft selene <private IP address>/32 md5
```
sudo systemctl restart postgresql
```
## Redis DB
### Redis DB
* Recommended server configuration: Ubuntu 18.04 LTS, 1 CPU, 1GB RAM, 5GB disk.
So as to not reinvent the wheel, here are some easy-to-follow instructions for
[installing Redis on Ubuntu 18.04](https://www.digitalocean.com/community/tutorials/how-to-install-and-secure-redis-on-ubuntu-18-04).
* By default, Redis only listens for One additional step is to change the "bind" variable in /etc/redis/redis.conf to be the private IP of the Redis host.
## APIs
* By default, Redis only listens on local host. For multi-server setups, one additional step is to change the "bind" variable in `/etc/redis/redis.conf` to be the private IP of the Redis host.
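The relevant line in `/etc/redis/redis.conf` would look something like the following (the IP address is illustrative); restart the Redis service afterwards so the new bind address takes effect.
```
bind 10.0.0.5
```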
### APIs
The majority of the setup for each API is the same. This section defines the steps common to all APIs. Steps specific
to each API will be defined in their respective sections.
* Add an application user to the VM. Either give this user sudo privileges or execute the sudo commands below as a user
with sudo privileges. These instructions will assume a user name of "mycroft"
* Use the package management system to install Python 3.7, Python 3 pip and Python 3.7 Developer Tools
```
sudo apt install python3.7 python3-pip python3.7-dev
sudo python3.7 -m pip install pipenv
```
* Setup the Backend Application Directory
@ -141,46 +156,59 @@ cd /opt/selene
git clone https://github.com/MycroftAI/selene-backend.git
```
* If running in a test environment, be sure to checkout the "test" branch of the repository
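For example:
```
cd /opt/selene/selene-backend
git checkout test
```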
## Single Sign On API
#### Single Sign On API
Recommended server configuration: Ubuntu 18.04 LTS, 1 CPU, 1GB RAM, 5GB disk
* Create the virtual environment and install the requirements for the application
```
cd /opt/selene/selene-backend/sso
cd /opt/selene/selene-backend/api/sso
pipenv install
```
## Account API
#### Account API
* Recommended server configuration: Ubuntu 18.04 LTS, 1 CPU, 1GB RAM, 5GB disk
* Create the virtual environment and install the requirements for the application
```
cd /opt/selene/selene-backend/account
cd /opt/selene/selene-backend/api/account
pipenv install
```
## Marketplace API
#### Marketplace API
* Recommended server configuration: Ubuntu 18.04 LTS, 1 CPU, 1GB RAM, 10GB disk
* Create the virtual environment and install the requirements for the application
```
cd /opt/selene/selene-backend/market
cd /opt/selene/selene-backend/api/market
pipenv install
```
## Device API
#### Device API
* Recommended server configuration: Ubuntu 18.04 LTS, 2 CPU, 2GB RAM, 50GB disk
* Create the virtual environment and install the requirements for the application
```
cd /opt/selene/selene-backend/public
cd /opt/selene/selene-backend/api/public
pipenv install
```
# Running the APIs
#### Precise API
* Recommended server configuration: Ubuntu 18.04 LTS, 1 CPU, 1GB RAM, 5GB disk
* Create the virtual environment and install the requirements for the application
```
cd /opt/selene/selene-backend/api/precise
pipenv install
```
### Running the APIs
Each API is configured to run on port 5000. This is not a problem if each is running in its own VM but will be an
issue if all APIs are running on the same server, or if port 5000 is already in use. To address these scenarios,
change the port numbering in the uwsgi.ini file for each API.
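For illustration only (the exact contents of each `uwsgi.ini` will differ), the idea is to give each API a distinct port in its socket setting:
```
[uwsgi]
socket = :5001
```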
## Single Sign On API
#### Single Sign On API
* The SSO application uses three JWTs for authentication. First is an access key, which is required to authenticate a
user for API calls. Second is a refresh key that automatically refreshes the access key when it expires. Third is a
reset key, which is used in a password reset scenario. Generate a secret key for each JWT (a sample command for generating secrets is shown after this list).
* Any data that can identify a user is encrypted. Generate a salt that will be used with the encryption algorithm.
* Access to the Github API is required to support logging in with your Github account. Details can be found
[here](https://developer.github.com/v3/guides/basics-of-authentication/).
* The password reset functionality sends an email to the user with a link to reset their password. Selene uses
SendGrid to send these emails so a SendGrid account and API key are required.
* Define a systemd service to run the API. The service defines environment variables that use the secret and API keys
generated in previous steps.
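One way to generate random values for the JWT secrets and the salt mentioned above (any strong random generator will do):
```
openssl rand -hex 32   # run once for each secret: access, refresh, reset and the salt
```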
@ -222,8 +250,9 @@ WantedBy=multi-user.target
sudo systemctl start sso_api.service
sudo systemctl enable sso_api.service
```
## Account API
#### Account API
* The account API uses the same authentication mechanism as the single sign on API. The JWT_ACCESS_SECRET,
JWT_REFRESH_SECRET and SALT environment variables must be the same values as those on the single sign on API.
* This application uses the Redis database so the service needs to know where it resides.
* Define a systemd service to run the API. The service defines environment variables that use the secret and API keys
@ -264,8 +293,9 @@ WantedBy=multi-user.target
sudo systemctl start account_api.service
sudo systemctl enable account_api.service
```
## Marketplace API
#### Marketplace API
* The marketplace API uses the same authentication mechanism as the single sign on API. The JWT_ACCESS_SECRET,
JWT_REFRESH_SECRET and SALT environment variables must be the same values as those on the single sign on API.
* This application uses the Redis database so the service needs to know where it resides.
* Define a systemd service to run the API. The service defines environment variables that use the secret and API keys
@ -316,8 +346,8 @@ pipenv install
pipenv run python load_skill_display_data.py --core-version <specify core version, e.g. 19.02>
```
## Device API
#### Device API
* The device API uses the same authentication mechanism as the single sign on API. The JWT_ACCESS_SECRET,
JWT_REFRESH_SECRET and SALT environment variables must be the same values as those on the single sign on API.
* This application uses the Redis database so the service needs to know where it resides.
* The weather skill requires a key to the Open Weather Map API
@ -370,9 +400,42 @@ WantedBy=multi-user.target
sudo systemctl start public_api.service
sudo systemctl enable public_api.service
```
### Testing the endpoints
Before we continue, let's make sure that your endpoints are operational - for this we'll use the `public_api` endpoint as an example.
1. As we do not yet have an HTTP router configured, we must change the `uwsgi` configuration for the endpoint we want to test. This is contained in: `/opt/selene/selene-backend/api/public/uwsgi.ini`. Here we want to replace
```
socket = :$PORT
```
with
```
http = :$PORT
```
then restart the service:
```
sudo systemctl restart public_api.service
```
2. Check the status of the systemd service:
```
systemctl status public_api.service
```
Should report the service as "active (running)"
3. Send a GET request from a remote device:
```
curl -v http://$IP_ADDRESS:$PORT/code?state=this-is-a-test
```
You can also monitor this from the service logs by running:
```
journalctl -u public_api.service -f
```
## Other Considerations
### DNS
There are multiple ways to setup DNS. This document will not dictate how to do so for Selene. However, here is an
example, based on how DNS is setup at Mycroft AI...
Each application runs on its own sub-domain. Assuming a top level domain of "mycroft.ai" the subdomains are:
@ -381,28 +444,27 @@ Each application runs on its own sub-domain. Assuming a top level domain of "my
* market.mycroft.ai
* sso.mycroft.ai
The APIs that support the web applications are directories within the sub-domain (e.g. account.mycroft.ai/api). Since
the device API is externally facing, it is versioned. Its subdirectory must be "v1".
### Reverse Proxy
There are multiple tools available for setting up a reverse proxy that will point your DNS entries to your APIs. As such, the decision on how to set this up will be left to the user.
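As one illustration only, not a required setup, an nginx server block that forwards the account sub-domain to the account API's uwsgi socket might look like this:
```
server {
    listen 80;
    server_name account.mycroft.ai;

    location /api {
        include uwsgi_params;
        uwsgi_pass 127.0.0.1:5000;   # port from the account API's uwsgi.ini
    }
}
```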
### SSL
It is recommended that Selene applications be run using HTTPS. To do this an SSL certificate is necessary.
[Let's Encrypt](https://letsencrypt.org) is a great way to easily set up SSL certificates for free.
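For example, with the certbot client and an nginx reverse proxy, a certificate for the account sub-domain could be requested with something like:
```
sudo certbot --nginx -d account.mycroft.ai
```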
# What About the GUI???
## What About the GUI???
Once the database and API setup is complete, the next step is to set up the GUI. The README file for the
[Selene UI](https://github.com/mycroftai/selene-ui) repository contains the instructions for setting up the web
applications.
# Getting Involved
## Getting Involved
This is an open source project and we would love your help. We have prepared a [contributing](.github/CONTRIBUTING.md)
guide to help you get started.
If this is your first PR or you're not sure where to get started,
say hi in [Mycroft Chat](https://chat.mycroft.ai/) and a team member would be happy to mentor you.
say hi in [Mycroft Chat](https://chat.mycroft.ai/) and a team member would be happy to guide you.
Join the [Mycroft Forum](https://community.mycroft.ai/) for questions and answers.

View File

@ -1,20 +0,0 @@
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"
[packages]
flask = "<1.1"
uwsgi = "*"
schematics = "*"
stripe = "*"
selene = {editable = true,path = "./../../shared"}
[dev-packages]
behave = "*"
pyhamcrest = "*"
allure-behave = "*"
pylint = "*"
[requires]
python_version = "3.7"

449
api/account/Pipfile.lock generated
View File

@ -1,449 +0,0 @@
{
"_meta": {
"hash": {
"sha256": "6e3770aff871d60899d0443bb793fbf0a16317e312a80aa0553990f4db437696"
},
"pipfile-spec": 6,
"requires": {
"python_version": "3.7"
},
"sources": [
{
"name": "pypi",
"url": "https://pypi.org/simple",
"verify_ssl": true
}
]
},
"default": {
"certifi": {
"hashes": [
"sha256:5930595817496dd21bb8dc35dad090f1c2cd0adfaf21204bf6732ca5d8ee34d3",
"sha256:8fc0819f1f30ba15bdb34cceffb9ef04d99f420f68eb75d901e9560b8749fc41"
],
"version": "==2020.6.20"
},
"chardet": {
"hashes": [
"sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae",
"sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691"
],
"version": "==3.0.4"
},
"click": {
"hashes": [
"sha256:d2b5255c7c6349bc1bd1e59e08cd12acbbd63ce649f2588755783aa94dfb6b1a",
"sha256:dacca89f4bfadd5de3d7489b7c8a566eee0d3676333fbb50030263894c38c0dc"
],
"version": "==7.1.2"
},
"deprecated": {
"hashes": [
"sha256:525ba66fb5f90b07169fdd48b6373c18f1ee12728ca277ca44567a367d9d7f74",
"sha256:a766c1dccb30c5f6eb2b203f87edd1d8588847709c78589e1521d769addc8218"
],
"version": "==1.2.10"
},
"facebook-sdk": {
"hashes": [
"sha256:2e987b3e0f466a6f4ee77b935eb023dba1384134f004a2af21f1cfff7fe0806e",
"sha256:cabcd2e69ea3d9f042919c99b353df7aa1e2be86d040121f6e9f5e63c1cf0f8d"
],
"version": "==3.1.0"
},
"flask": {
"hashes": [
"sha256:1a21ccca71cee5e55b6a367cc48c6eb47e3c447f76e64d41f3f3f931c17e7c96",
"sha256:ed1330220a321138de53ec7c534c3d90cf2f7af938c7880fc3da13aa46bf870f"
],
"index": "pypi",
"version": "==1.0.4"
},
"idna": {
"hashes": [
"sha256:b307872f855b18632ce0c21c5e45be78c0ea7ae4c15c828c20788b26921eb3f6",
"sha256:b97d804b1e9b523befed77c48dacec60e6dcb0b5391d57af6a65a312a90648c0"
],
"version": "==2.10"
},
"itsdangerous": {
"hashes": [
"sha256:321b033d07f2a4136d3ec762eac9f16a10ccd60f53c0c91af90217ace7ba1f19",
"sha256:b12271b2047cb23eeb98c8b5622e2e5c5e9abd9784a153e9d8ef9cb4dd09d749"
],
"version": "==1.1.0"
},
"jinja2": {
"hashes": [
"sha256:89aab215427ef59c34ad58735269eb58b1a5808103067f7bb9d5836c651b3bb0",
"sha256:f0a4641d3cf955324a89c04f3d94663aa4d638abe8f733ecd3582848e1c37035"
],
"version": "==2.11.2"
},
"markupsafe": {
"hashes": [
"sha256:00bc623926325b26bb9605ae9eae8a215691f33cae5df11ca5424f06f2d1f473",
"sha256:09027a7803a62ca78792ad89403b1b7a73a01c8cb65909cd876f7fcebd79b161",
"sha256:09c4b7f37d6c648cb13f9230d847adf22f8171b1ccc4d5682398e77f40309235",
"sha256:1027c282dad077d0bae18be6794e6b6b8c91d58ed8a8d89a89d59693b9131db5",
"sha256:13d3144e1e340870b25e7b10b98d779608c02016d5184cfb9927a9f10c689f42",
"sha256:24982cc2533820871eba85ba648cd53d8623687ff11cbb805be4ff7b4c971aff",
"sha256:29872e92839765e546828bb7754a68c418d927cd064fd4708fab9fe9c8bb116b",
"sha256:43a55c2930bbc139570ac2452adf3d70cdbb3cfe5912c71cdce1c2c6bbd9c5d1",
"sha256:46c99d2de99945ec5cb54f23c8cd5689f6d7177305ebff350a58ce5f8de1669e",
"sha256:500d4957e52ddc3351cabf489e79c91c17f6e0899158447047588650b5e69183",
"sha256:535f6fc4d397c1563d08b88e485c3496cf5784e927af890fb3c3aac7f933ec66",
"sha256:596510de112c685489095da617b5bcbbac7dd6384aeebeda4df6025d0256a81b",
"sha256:62fe6c95e3ec8a7fad637b7f3d372c15ec1caa01ab47926cfdf7a75b40e0eac1",
"sha256:6788b695d50a51edb699cb55e35487e430fa21f1ed838122d722e0ff0ac5ba15",
"sha256:6dd73240d2af64df90aa7c4e7481e23825ea70af4b4922f8ede5b9e35f78a3b1",
"sha256:717ba8fe3ae9cc0006d7c451f0bb265ee07739daf76355d06366154ee68d221e",
"sha256:79855e1c5b8da654cf486b830bd42c06e8780cea587384cf6545b7d9ac013a0b",
"sha256:7c1699dfe0cf8ff607dbdcc1e9b9af1755371f92a68f706051cc8c37d447c905",
"sha256:88e5fcfb52ee7b911e8bb6d6aa2fd21fbecc674eadd44118a9cc3863f938e735",
"sha256:8defac2f2ccd6805ebf65f5eeb132adcf2ab57aa11fdf4c0dd5169a004710e7d",
"sha256:98c7086708b163d425c67c7a91bad6e466bb99d797aa64f965e9d25c12111a5e",
"sha256:9add70b36c5666a2ed02b43b335fe19002ee5235efd4b8a89bfcf9005bebac0d",
"sha256:9bf40443012702a1d2070043cb6291650a0841ece432556f784f004937f0f32c",
"sha256:ade5e387d2ad0d7ebf59146cc00c8044acbd863725f887353a10df825fc8ae21",
"sha256:b00c1de48212e4cc9603895652c5c410df699856a2853135b3967591e4beebc2",
"sha256:b1282f8c00509d99fef04d8ba936b156d419be841854fe901d8ae224c59f0be5",
"sha256:b2051432115498d3562c084a49bba65d97cf251f5a331c64a12ee7e04dacc51b",
"sha256:ba59edeaa2fc6114428f1637ffff42da1e311e29382d81b339c1817d37ec93c6",
"sha256:c8716a48d94b06bb3b2524c2b77e055fb313aeb4ea620c8dd03a105574ba704f",
"sha256:cd5df75523866410809ca100dc9681e301e3c27567cf498077e8551b6d20e42f",
"sha256:cdb132fc825c38e1aeec2c8aa9338310d29d337bebbd7baa06889d09a60a1fa2",
"sha256:e249096428b3ae81b08327a63a485ad0878de3fb939049038579ac0ef61e17e7",
"sha256:e8313f01ba26fbbe36c7be1966a7b7424942f670f38e666995b88d012765b9be"
],
"version": "==1.1.1"
},
"passlib": {
"hashes": [
"sha256:68c35c98a7968850e17f1b6892720764cc7eed0ef2b7cb3116a89a28e43fe177",
"sha256:8d666cef936198bc2ab47ee9b0410c94adf2ba798e5a84bf220be079ae7ab6a8"
],
"version": "==1.7.2"
},
"psycopg2-binary": {
"hashes": [
"sha256:008da3ab51adc70a5f1cfbbe5db3a22607ab030eb44bcecf517ad11a0c2b3cac",
"sha256:07cf82c870ec2d2ce94d18e70c13323c89f2f2a2628cbf1feee700630be2519a",
"sha256:08507efbe532029adee21b8d4c999170a83760d38249936038bd0602327029b5",
"sha256:107d9be3b614e52a192719c6bf32e8813030020ea1d1215daa86ded9a24d8b04",
"sha256:17a0ea0b0eabf07035e5e0d520dabc7950aeb15a17c6d36128ba99b2721b25b1",
"sha256:3286541b9d85a340ee4ed42732d15fc1bb441dc500c97243a768154ab8505bb5",
"sha256:3939cf75fc89c5e9ed836e228c4a63604dff95ad19aed2bbf71d5d04c15ed5ce",
"sha256:40abc319f7f26c042a11658bf3dd3b0b3bceccf883ec1c565d5c909a90204434",
"sha256:51f7823f1b087d2020d8e8c9e6687473d3d239ba9afc162d9b2ab6e80b53f9f9",
"sha256:6bb2dd006a46a4a4ce95201f836194eb6a1e863f69ee5bab506673e0ca767057",
"sha256:702f09d8f77dc4794651f650828791af82f7c2efd8c91ae79e3d9fe4bb7d4c98",
"sha256:7036ccf715925251fac969f4da9ad37e4b7e211b1e920860148a10c0de963522",
"sha256:7b832d76cc65c092abd9505cc670c4e3421fd136fb6ea5b94efbe4c146572505",
"sha256:8f74e631b67482d504d7e9cf364071fc5d54c28e79a093ff402d5f8f81e23bfa",
"sha256:930315ac53dc65cbf52ab6b6d27422611f5fb461d763c531db229c7e1af6c0b3",
"sha256:96d3038f5bd061401996614f65d27a4ecb62d843eb4f48e212e6d129171a721f",
"sha256:a20299ee0ea2f9cca494396ac472d6e636745652a64a418b39522c120fd0a0a4",
"sha256:a34826d6465c2e2bbe9d0605f944f19d2480589f89863ed5f091943be27c9de4",
"sha256:a69970ee896e21db4c57e398646af9edc71c003bc52a3cc77fb150240fefd266",
"sha256:b9a8b391c2b0321e0cd7ec6b4cfcc3dd6349347bd1207d48bcb752aa6c553a66",
"sha256:ba13346ff6d3eb2dca0b6fa0d8a9d999eff3dcd9b55f3a890f12b0b6362b2b38",
"sha256:bb0608694a91db1e230b4a314e8ed00ad07ed0c518f9a69b83af2717e31291a3",
"sha256:c8830b7d5f16fd79d39b21e3d94f247219036b29b30c8270314c46bf8b732389",
"sha256:cac918cd7c4c498a60f5d2a61d4f0a6091c2c9490d81bc805c963444032d0dab",
"sha256:cc30cb900f42c8a246e2cb76539d9726f407330bc244ca7729c41a44e8d807fb",
"sha256:ccdc6a87f32b491129ada4b87a43b1895cf2c20fdb7f98ad979647506ffc41b6",
"sha256:d1a8b01f6a964fec702d6b6dac1f91f2b9f9fe41b310cbb16c7ef1fac82df06d",
"sha256:e004db88e5a75e5fdab1620fb9f90c9598c2a195a594225ac4ed2a6f1c23e162",
"sha256:eb2f43ae3037f1ef5e19339c41cf56947021ac892f668765cd65f8ab9814192e",
"sha256:fa466306fcf6b39b8a61d003123d442b23707d635a5cb05ac4e1b62cc79105cd"
],
"version": "==2.8.5"
},
"pygithub": {
"hashes": [
"sha256:8375a058ec651cc0774244a3bc7395cf93617298735934cdd59e5bcd9a1df96e",
"sha256:d2d17d1e3f4474e070353f201164685a95b5a92f5ee0897442504e399c7bc249"
],
"version": "==1.51"
},
"pyhamcrest": {
"hashes": [
"sha256:412e00137858f04bde0729913874a48485665f2d36fe9ee449f26be864af9316",
"sha256:7ead136e03655af85069b6f47b23eb7c3e5c221aa9f022a4fbb499f5b7308f29"
],
"version": "==2.0.2"
},
"pyjwt": {
"hashes": [
"sha256:5c6eca3c2940464d106b99ba83b00c6add741c9becaec087fb7ccdefea71350e",
"sha256:8d59a976fb773f3e6a39c85636357c4f0e242707394cadadd9814f5cbaa20e96"
],
"version": "==1.7.1"
},
"python-http-client": {
"hashes": [
"sha256:93d6a26b426e48b04e589c1f103e7c040193e4ccc379ea50cd6e12f94cca7c69"
],
"version": "==3.2.7"
},
"redis": {
"hashes": [
"sha256:0e7e0cfca8660dea8b7d5cd8c4f6c5e29e11f31158c0b0ae91a397f00e5a05a2",
"sha256:432b788c4530cfe16d8d943a09d40ca6c16149727e4afe8c2c9d5580c59d9f24"
],
"version": "==3.5.3"
},
"requests": {
"hashes": [
"sha256:b3559a131db72c33ee969480840fff4bb6dd111de7dd27c8ee1f820f4f00231b",
"sha256:fe75cc94a9443b9246fc7049224f75604b113c36acb93f87b80ed42c44cbb898"
],
"markers": "python_version >= '3.0'",
"version": "==2.24.0"
},
"schedule": {
"hashes": [
"sha256:3f895a1036799a25ab9c335de917073e63cf8256920917e932777382f101f08f",
"sha256:f9fb5181283de4db6e701d476dd01b6a3dd81c38462a54991ddbb9d26db857c9"
],
"version": "==0.6.0"
},
"schematics": {
"hashes": [
"sha256:8fcc6182606fd0b24410a1dbb066d9bbddbe8da9c9509f47b743495706239283",
"sha256:a40b20635c0e43d18d3aff76220f6cd95ea4decb3f37765e49529b17d81b0439"
],
"index": "pypi",
"version": "==2.1.0"
},
"selene": {
"editable": true,
"path": "./../../shared"
},
"sendgrid": {
"hashes": [
"sha256:54e51ca9afbfe1a4706864f42eb1a12d597e375249d80a8ce679e7a4fa91e776",
"sha256:dd0eddf079be040172a4d0afdf9b9becb4e53210ead015a0e6b2d680eea92ac0"
],
"version": "==6.4.1"
},
"starkbank-ecdsa": {
"hashes": [
"sha256:cd17ec9fa7ad8ae3fc81a63ddb7e0d7fb798a048e40c1a9c55afd1a207d1eff9"
],
"version": "==1.0.0"
},
"stripe": {
"hashes": [
"sha256:515fe2cc915e639468f30150a39c162fc0fb090256ae9d6a04e5022925d136f1",
"sha256:bdbbea632b8faa983c670db61debbe0bdb5802ef98fd0613a03aa466e56cdade"
],
"index": "pypi",
"version": "==2.48.0"
},
"urllib3": {
"hashes": [
"sha256:3018294ebefce6572a474f0604c2021e33b3fd8006ecd11d62107a5d2a963527",
"sha256:88206b0eb87e6d677d424843ac5209e3fb9d0190d0ee169599165ec25e9d9115"
],
"version": "==1.25.9"
},
"uwsgi": {
"hashes": [
"sha256:faa85e053c0b1be4d5585b0858d3a511d2cd10201802e8676060fd0a109e5869"
],
"index": "pypi",
"version": "==2.0.19.1"
},
"werkzeug": {
"hashes": [
"sha256:2de2a5db0baeae7b2d2664949077c2ac63fbd16d98da0ff71837f7d1dea3fd43",
"sha256:6c80b1e5ad3665290ea39320b91e1be1e0d5f60652b964a3070216de83d2e47c"
],
"version": "==1.0.1"
},
"wrapt": {
"hashes": [
"sha256:b62ffa81fb85f4332a4f609cab4ac40709470da05643a082ec1eb88e6d9b97d7"
],
"version": "==1.12.1"
}
},
"develop": {
"allure-behave": {
"hashes": [
"sha256:71f7ab8f7afb38ca323bdf0f300cb3a280928e63c2e962a30748c23914c8ee3d",
"sha256:a6ec9968ec6c6ee69ab964cbea65e9dfa81e283f3b55ad5be8d42f3df70f8766"
],
"index": "pypi",
"version": "==2.8.16"
},
"allure-python-commons": {
"hashes": [
"sha256:3cf65bce770e4d6b6b1bd46bfecad8a04f1f7bef44133f9a3ded4295510187e2",
"sha256:f67104a51643f2b0f1807acfe324bc13c1fa97f16d9b5c85670199acabd5c40d"
],
"version": "==2.8.16"
},
"astroid": {
"hashes": [
"sha256:2f4078c2a41bf377eea06d71c9d2ba4eb8f6b1af2135bec27bbbb7d8f12bb703",
"sha256:bc58d83eb610252fd8de6363e39d4f1d0619c894b0ed24603b881c02e64c7386"
],
"version": "==2.4.2"
},
"attrs": {
"hashes": [
"sha256:08a96c641c3a74e44eb59afb61a24f2cb9f4d7188748e76ba4bb5edfa3cb7d1c",
"sha256:f7b7ce16570fe9965acd6d30101a28f62fb4a7f9e926b3bbc9b61f8b04247e72"
],
"version": "==19.3.0"
},
"behave": {
"hashes": [
"sha256:b9662327aa53294c1351b0a9c369093ccec1d21026f050c3bd9b3e5cccf81a86",
"sha256:ebda1a6c9e5bfe95c5f9f0a2794e01c7098b3dde86c10a95d8621c5907ff6f1c"
],
"index": "pypi",
"version": "==1.2.6"
},
"importlib-metadata": {
"hashes": [
"sha256:90bb658cdbbf6d1735b6341ce708fc7024a3e14e99ffdc5783edea9f9b077f83",
"sha256:dc15b2969b4ce36305c51eebe62d418ac7791e9a157911d58bfb1f9ccd8e2070"
],
"markers": "python_version < '3.8'",
"version": "==1.7.0"
},
"isort": {
"hashes": [
"sha256:54da7e92468955c4fceacd0c86bd0ec997b0e1ee80d97f67c35a78b719dccab1",
"sha256:6e811fcb295968434526407adb8796944f1988c5b65e8139058f2014cbe100fd"
],
"version": "==4.3.21"
},
"lazy-object-proxy": {
"hashes": [
"sha256:0c4b206227a8097f05c4dbdd323c50edf81f15db3b8dc064d08c62d37e1a504d",
"sha256:194d092e6f246b906e8f70884e620e459fc54db3259e60cf69a4d66c3fda3449",
"sha256:1be7e4c9f96948003609aa6c974ae59830a6baecc5376c25c92d7d697e684c08",
"sha256:4677f594e474c91da97f489fea5b7daa17b5517190899cf213697e48d3902f5a",
"sha256:48dab84ebd4831077b150572aec802f303117c8cc5c871e182447281ebf3ac50",
"sha256:5541cada25cd173702dbd99f8e22434105456314462326f06dba3e180f203dfd",
"sha256:59f79fef100b09564bc2df42ea2d8d21a64fdcda64979c0fa3db7bdaabaf6239",
"sha256:8d859b89baf8ef7f8bc6b00aa20316483d67f0b1cbf422f5b4dc56701c8f2ffb",
"sha256:9254f4358b9b541e3441b007a0ea0764b9d056afdeafc1a5569eee1cc6c1b9ea",
"sha256:9651375199045a358eb6741df3e02a651e0330be090b3bc79f6d0de31a80ec3e",
"sha256:97bb5884f6f1cdce0099f86b907aa41c970c3c672ac8b9c8352789e103cf3156",
"sha256:9b15f3f4c0f35727d3a0fba4b770b3c4ebbb1fa907dbcc046a1d2799f3edd142",
"sha256:a2238e9d1bb71a56cd710611a1614d1194dc10a175c1e08d75e1a7bcc250d442",
"sha256:a6ae12d08c0bf9909ce12385803a543bfe99b95fe01e752536a60af2b7797c62",
"sha256:ca0a928a3ddbc5725be2dd1cf895ec0a254798915fb3a36af0964a0a4149e3db",
"sha256:cb2c7c57005a6804ab66f106ceb8482da55f5314b7fcb06551db1edae4ad1531",
"sha256:d74bb8693bf9cf75ac3b47a54d716bbb1a92648d5f781fc799347cfc95952383",
"sha256:d945239a5639b3ff35b70a88c5f2f491913eb94871780ebfabb2568bd58afc5a",
"sha256:eba7011090323c1dadf18b3b689845fd96a61ba0a1dfbd7f24b921398affc357",
"sha256:efa1909120ce98bbb3777e8b6f92237f5d5c8ea6758efea36a473e1d38f7d3e4",
"sha256:f3900e8a5de27447acbf900b4750b0ddfd7ec1ea7fbaf11dfa911141bc522af0"
],
"version": "==1.4.3"
},
"mccabe": {
"hashes": [
"sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42",
"sha256:dd8d182285a0fe56bace7f45b5e7d1a6ebcbf524e8f3bd87eb0f125271b8831f"
],
"version": "==0.6.1"
},
"parse": {
"hashes": [
"sha256:a6d4e2c2f1fbde6717d28084a191a052950f758c0cbd83805357e6575c2b95c0"
],
"version": "==1.15.0"
},
"parse-type": {
"hashes": [
"sha256:089a471b06327103865dfec2dd844230c3c658a4a1b5b4c8b6c16c8f77577f9e",
"sha256:7f690b18d35048c15438d6d0571f9045cffbec5907e0b1ccf006f889e3a38c0b"
],
"version": "==0.5.2"
},
"pluggy": {
"hashes": [
"sha256:15b2acde666561e1298d71b523007ed7364de07029219b604cf808bfa1c765b0",
"sha256:966c145cd83c96502c3c3868f50408687b38434af77734af1e9ca461a4081d2d"
],
"version": "==0.13.1"
},
"pyhamcrest": {
"hashes": [
"sha256:412e00137858f04bde0729913874a48485665f2d36fe9ee449f26be864af9316",
"sha256:7ead136e03655af85069b6f47b23eb7c3e5c221aa9f022a4fbb499f5b7308f29"
],
"version": "==2.0.2"
},
"pylint": {
"hashes": [
"sha256:7dd78437f2d8d019717dbf287772d0b2dbdfd13fc016aa7faa08d67bccc46adc",
"sha256:d0ece7d223fe422088b0e8f13fa0a1e8eb745ebffcb8ed53d3e95394b6101a1c"
],
"index": "pypi",
"version": "==2.5.3"
},
"six": {
"hashes": [
"sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259",
"sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced"
],
"version": "==1.15.0"
},
"toml": {
"hashes": [
"sha256:926b612be1e5ce0634a2ca03470f95169cf16f939018233a670519cb4ac58b0f",
"sha256:bda89d5935c2eac546d648028b9901107a595863cb36bae0c73ac804a9b4ce88"
],
"version": "==0.10.1"
},
"typed-ast": {
"hashes": [
"sha256:0666aa36131496aed8f7be0410ff974562ab7eeac11ef351def9ea6fa28f6355",
"sha256:0c2c07682d61a629b68433afb159376e24e5b2fd4641d35424e462169c0a7919",
"sha256:249862707802d40f7f29f6e1aad8d84b5aa9e44552d2cc17384b209f091276aa",
"sha256:24995c843eb0ad11a4527b026b4dde3da70e1f2d8806c99b7b4a7cf491612652",
"sha256:269151951236b0f9a6f04015a9004084a5ab0d5f19b57de779f908621e7d8b75",
"sha256:4083861b0aa07990b619bd7ddc365eb7fa4b817e99cf5f8d9cf21a42780f6e01",
"sha256:498b0f36cc7054c1fead3d7fc59d2150f4d5c6c56ba7fb150c013fbc683a8d2d",
"sha256:4e3e5da80ccbebfff202a67bf900d081906c358ccc3d5e3c8aea42fdfdfd51c1",
"sha256:6daac9731f172c2a22ade6ed0c00197ee7cc1221aa84cfdf9c31defeb059a907",
"sha256:715ff2f2df46121071622063fc7543d9b1fd19ebfc4f5c8895af64a77a8c852c",
"sha256:73d785a950fc82dd2a25897d525d003f6378d1cb23ab305578394694202a58c3",
"sha256:8c8aaad94455178e3187ab22c8b01a3837f8ee50e09cf31f1ba129eb293ec30b",
"sha256:8ce678dbaf790dbdb3eba24056d5364fb45944f33553dd5869b7580cdbb83614",
"sha256:aaee9905aee35ba5905cfb3c62f3e83b3bec7b39413f0a7f19be4e547ea01ebb",
"sha256:bcd3b13b56ea479b3650b82cabd6b5343a625b0ced5429e4ccad28a8973f301b",
"sha256:c9e348e02e4d2b4a8b2eedb48210430658df6951fa484e59de33ff773fbd4b41",
"sha256:d205b1b46085271b4e15f670058ce182bd1199e56b317bf2ec004b6a44f911f6",
"sha256:d43943ef777f9a1c42bf4e552ba23ac77a6351de620aa9acf64ad54933ad4d34",
"sha256:d5d33e9e7af3b34a40dc05f498939f0ebf187f07c385fd58d591c533ad8562fe",
"sha256:fc0fea399acb12edbf8a628ba8d2312f583bdbdb3335635db062fa98cf71fca4",
"sha256:fe460b922ec15dd205595c9b5b99e2f056fd98ae8f9f56b888e7a17dc2b757e7"
],
"markers": "implementation_name == 'cpython' and python_version < '3.8'",
"version": "==1.4.1"
},
"wrapt": {
"hashes": [
"sha256:b62ffa81fb85f4332a4f609cab4ac40709470da05643a082ec1eb88e6d9b97d7"
],
"version": "==1.12.1"
},
"zipp": {
"hashes": [
"sha256:aa36550ff0c0b7ef7fa639055d797116ee891440eac1a56f378e2d3179e0320b",
"sha256:c599e4d75c98f6798c509911d08a22e6c021d074469042177c8c86fb92eefd96"
],
"version": "==3.1.0"
}
}
}

View File

@ -16,4 +16,3 @@
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.

View File

@ -21,29 +21,38 @@
from flask import Flask
from selene.api import get_base_config, selene_api, SeleneResponse
from selene.api.endpoints import AccountEndpoint, AgreementsEndpoint
from selene.api.endpoints import (
AccountEndpoint,
AgreementsEndpoint,
ValidateEmailEndpoint,
)
from selene.util.cache import SeleneCache
from selene.util.log import configure_logger
from selene.util.log import configure_selene_logger
from .endpoints import (
PreferencesEndpoint,
AccountDefaultsEndpoint,
CityEndpoint,
CountryEndpoint,
AccountDefaultsEndpoint,
EmailAddressChangeEndpoint,
DeviceEndpoint,
DeviceCountEndpoint,
GeographyEndpoint,
MembershipEndpoint,
RegionEndpoint,
PairingCodeEndpoint,
PasswordChangeEndpoint,
PreferencesEndpoint,
SkillsEndpoint,
SkillOauthEndpoint,
SkillSettingsEndpoint,
SoftwareUpdateEndpoint,
SshKeyValidatorEndpoint,
TimezoneEndpoint,
VerifyEmailAddressEndpoint,
VoiceEndpoint,
WakeWordEndpoint
WakeWordEndpoint,
)
_log = configure_logger('account_api')
configure_selene_logger("account_api")
# Define the Flask application
@ -51,141 +60,119 @@ acct = Flask(__name__)
acct.config.from_object(get_base_config())
acct.response_class = SeleneResponse
acct.register_blueprint(selene_api)
acct.config['SELENE_CACHE'] = SeleneCache()
acct.config["SELENE_CACHE"] = SeleneCache()
account_endpoint = AccountEndpoint.as_view('account_endpoint')
account_endpoint = AccountEndpoint.as_view("account_endpoint")
acct.add_url_rule(
'/api/account',
view_func=account_endpoint,
methods=['GET', 'PATCH', 'DELETE']
"/api/account", view_func=account_endpoint, methods=["GET", "PATCH", "DELETE"]
)
agreements_endpoint = AgreementsEndpoint.as_view('agreements_endpoint')
agreements_endpoint = AgreementsEndpoint.as_view("agreements_endpoint")
acct.add_url_rule(
'/api/agreement/<string:agreement_type>',
"/api/agreement/<string:agreement_type>",
view_func=agreements_endpoint,
methods=['GET']
methods=["GET"],
)
city_endpoint = CityEndpoint.as_view('city_endpoint')
city_endpoint = CityEndpoint.as_view("city_endpoint")
acct.add_url_rule("/api/cities", view_func=city_endpoint, methods=["GET"])
country_endpoint = CountryEndpoint.as_view("country_endpoint")
acct.add_url_rule("/api/countries", view_func=country_endpoint, methods=["GET"])
defaults_endpoint = AccountDefaultsEndpoint.as_view("defaults_endpoint")
acct.add_url_rule(
'/api/cities',
view_func=city_endpoint,
methods=['GET']
"/api/defaults", view_func=defaults_endpoint, methods=["GET", "PATCH", "POST"]
)
country_endpoint = CountryEndpoint.as_view('country_endpoint')
device_endpoint = DeviceEndpoint.as_view("device_endpoint")
acct.add_url_rule(
'/api/countries',
view_func=country_endpoint,
methods=['GET']
)
defaults_endpoint = AccountDefaultsEndpoint.as_view('defaults_endpoint')
acct.add_url_rule(
'/api/defaults',
view_func=defaults_endpoint,
methods=['GET', 'PATCH', 'POST']
)
device_endpoint = DeviceEndpoint.as_view('device_endpoint')
acct.add_url_rule(
'/api/devices',
defaults={'device_id': None},
"/api/devices",
defaults={"device_id": None},
view_func=device_endpoint,
methods=['GET']
methods=["GET"],
)
acct.add_url_rule("/api/devices", view_func=device_endpoint, methods=["POST"])
acct.add_url_rule(
'/api/devices',
"/api/devices/<string:device_id>",
view_func=device_endpoint,
methods=['POST']
)
acct.add_url_rule(
'/api/devices/<string:device_id>',
view_func=device_endpoint,
methods=['DELETE', 'GET', 'PATCH']
methods=["DELETE", "GET", "PATCH"],
)
device_count_endpoint = DeviceCountEndpoint.as_view('device_count_endpoint')
device_count_endpoint = DeviceCountEndpoint.as_view("device_count_endpoint")
acct.add_url_rule("/api/device-count", view_func=device_count_endpoint, methods=["GET"])
change_email_endpoint = EmailAddressChangeEndpoint.as_view("change_email_endpoint")
acct.add_url_rule("/api/change-email", view_func=change_email_endpoint, methods=["PUT"])
change_password_endpoint = PasswordChangeEndpoint.as_view("change_password_endpoint")
acct.add_url_rule(
'/api/device-count',
view_func=device_count_endpoint,
methods=['GET']
"/api/change-password", view_func=change_password_endpoint, methods=["PUT"]
)
geography_endpoint = GeographyEndpoint.as_view('geography_endpoint')
acct.add_url_rule(
'/api/geographies',
view_func=geography_endpoint,
methods=['GET']
)
geography_endpoint = GeographyEndpoint.as_view("geography_endpoint")
acct.add_url_rule("/api/geographies", view_func=geography_endpoint, methods=["GET"])
membership_endpoint = MembershipEndpoint.as_view('membership_endpoint')
acct.add_url_rule(
'/api/memberships',
view_func=membership_endpoint,
methods=['GET']
)
membership_endpoint = MembershipEndpoint.as_view("membership_endpoint")
acct.add_url_rule("/api/memberships", view_func=membership_endpoint, methods=["GET"])
pairing_code_endpoint = PairingCodeEndpoint.as_view('pairing_code_endpoint')
pairing_code_endpoint = PairingCodeEndpoint.as_view("pairing_code_endpoint")
acct.add_url_rule(
'/api/pairing-code/<string:pairing_code>',
"/api/pairing-code/<string:pairing_code>",
view_func=pairing_code_endpoint,
methods=['GET']
methods=["GET"],
)
preferences_endpoint = PreferencesEndpoint.as_view('preferences_endpoint')
preferences_endpoint = PreferencesEndpoint.as_view("preferences_endpoint")
acct.add_url_rule(
'/api/preferences',
view_func=preferences_endpoint,
methods=['GET', 'PATCH', 'POST']
"/api/preferences", view_func=preferences_endpoint, methods=["GET", "PATCH", "POST"]
)
region_endpoint = RegionEndpoint.as_view('region_endpoint')
acct.add_url_rule(
'/api/regions',
view_func=region_endpoint,
methods=['GET']
)
region_endpoint = RegionEndpoint.as_view("region_endpoint")
acct.add_url_rule("/api/regions", view_func=region_endpoint, methods=["GET"])
setting_endpoint = SkillSettingsEndpoint.as_view('setting_endpoint')
setting_endpoint = SkillSettingsEndpoint.as_view("setting_endpoint")
acct.add_url_rule(
'/api/skills/<string:skill_family_name>/settings',
"/api/skills/<string:skill_family_name>/settings",
view_func=setting_endpoint,
methods=['GET', 'PUT']
methods=["GET", "PUT"],
)
skill_endpoint = SkillsEndpoint.as_view('skill_endpoint')
skill_endpoint = SkillsEndpoint.as_view("skill_endpoint")
acct.add_url_rule("/api/skills", view_func=skill_endpoint, methods=["GET"])
skill_oauth_endpoint = SkillOauthEndpoint.as_view("skill_oauth_endpoint")
acct.add_url_rule(
'/api/skills',
view_func=skill_endpoint,
methods=['GET']
"/api/skills/oauth/<int:oauth_id>", view_func=skill_oauth_endpoint, methods=["GET"]
)
skill_oauth_endpoint = SkillOauthEndpoint.as_view('skill_oauth_endpoint')
software_update_endpoint = SoftwareUpdateEndpoint.as_view("software_update_endpoint")
acct.add_url_rule(
'/api/skills/oauth/<int:oauth_id>',
view_func=skill_oauth_endpoint,
methods=['GET']
"/api/software-update", view_func=software_update_endpoint, methods=["PATCH"]
)
timezone_endpoint = TimezoneEndpoint.as_view('timezone_endpoint')
ssh_key_validation_endpoint = SshKeyValidatorEndpoint.as_view(
"ssh_key_validation_endpoint"
)
acct.add_url_rule(
'/api/timezones',
view_func=timezone_endpoint,
methods=['GET']
"/api/ssh-key",
view_func=ssh_key_validation_endpoint,
methods=["GET"],
)
voice_endpoint = VoiceEndpoint.as_view('voice_endpoint')
timezone_endpoint = TimezoneEndpoint.as_view("timezone_endpoint")
acct.add_url_rule("/api/timezones", view_func=timezone_endpoint, methods=["GET"])
validate_email_endpoint = ValidateEmailEndpoint.as_view("validate_email_endpoint")
acct.add_url_rule(
'/api/voices',
view_func=voice_endpoint,
methods=['GET']
"/api/validate-email", view_func=validate_email_endpoint, methods=["GET"]
)
wake_word_endpoint = WakeWordEndpoint.as_view('wake_word_endpoint')
acct.add_url_rule(
'/api/wake-words',
view_func=wake_word_endpoint,
methods=['GET']
)
verify_email_endpoint = VerifyEmailAddressEndpoint.as_view("verify_email_endpoint")
acct.add_url_rule("/api/verify-email", view_func=verify_email_endpoint, methods=["PUT"])
voice_endpoint = VoiceEndpoint.as_view("voice_endpoint")
acct.add_url_rule("/api/voices", view_func=voice_endpoint, methods=["GET"])
wake_word_endpoint = WakeWordEndpoint.as_view("wake_word_endpoint")
acct.add_url_rule("/api/wake-words", view_func=wake_word_endpoint, methods=["GET"])

View File

@ -16,8 +16,11 @@
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Public API into the endpoints package."""
from .preferences import PreferencesEndpoint
from .change_email_address import EmailAddressChangeEndpoint
from .change_password import PasswordChangeEndpoint
from .city import CityEndpoint
from .country import CountryEndpoint
from .defaults import AccountDefaultsEndpoint
@ -30,6 +33,9 @@ from .region import RegionEndpoint
from .skills import SkillsEndpoint
from .skill_oauth import SkillOauthEndpoint
from .skill_settings import SkillSettingsEndpoint
from .software_update import SoftwareUpdateEndpoint
from .ssh_key_validator import SshKeyValidatorEndpoint
from .timezone import TimezoneEndpoint
from .verify_email_address import VerifyEmailAddressEndpoint
from .voice_endpoint import VoiceEndpoint
from .wake_word_endpoint import WakeWordEndpoint

View File

@ -0,0 +1,90 @@
# Mycroft Server - Backend
# Copyright (c) 2022 Mycroft AI Inc
# SPDX-License-Identifier: AGPL-3.0-or-later
# #
# This file is part of the Mycroft Server.
# #
# The Mycroft Server is free software: you can redistribute it and/or
# modify it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
# #
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# #
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
"""Defines the password change endpoint for the account API.
This endpoint does not update the email address in the database. The user needs to
verify the email address is correct before the change is applied. See the
verify_email_address module in this package for the verification step.
"""
from binascii import a2b_base64, b2a_base64
from http import HTTPStatus
from os import environ
from selene.api import APIError, SeleneEndpoint
from selene.util.email import EmailMessage, SeleneMailer, validate_email_address
class EmailAddressChangeEndpoint(SeleneEndpoint):
"""Adds authentication to the common password changing endpoint."""
def put(self):
"""Executes an HTTP PUT request."""
self._authenticate()
new_email_address = self._validate_request()
self._send_notification()
self._send_verification_email(new_email_address)
return "", HTTPStatus.NO_CONTENT
def _validate_request(self) -> str:
"""Validates the content of the API request.
:returns: A validated and normalized email address
:raises: APIError when email address is invalid
"""
request_token = self.request.json["token"]
new_email_address = a2b_base64(request_token).decode()
normalized_address, error = validate_email_address(new_email_address)
if error is not None:
raise APIError(error)
return normalized_address
def _send_notification(self):
"""Notifies the current email address' owner of the requested change."""
_, error = validate_email_address(self.account.email_address)
if error is None:
email = EmailMessage(
recipient=self.account.email_address,
sender="Mycroft AI<no-reply@mycroft.ai>",
subject="Email Address Changed",
template_file_name="email_change.html",
)
mailer = SeleneMailer(email)
mailer.send(using_jinja=True)
@staticmethod
def _send_verification_email(new_email_address):
"""Sends an email with a link for email verification to the requested address.
:param new_email_address: the recipient of the verification email
"""
base_url = environ["ACCOUNT_BASE_URL"]
token = b2a_base64(new_email_address.encode(), newline=False).decode()
url = f"{base_url}/verify-email?token={token}"
email = EmailMessage(
recipient=new_email_address,
sender="Mycroft AI<no-reply@mycroft.ai>",
subject="Email Change Verification",
template_file_name="email_verification.html",
template_variables=dict(email_verification_url=url),
)
mailer = SeleneMailer(email)
mailer.send(using_jinja=True)

View File

@ -0,0 +1,40 @@
# Mycroft Server - Backend
# Copyright (c) 2022 Mycroft AI Inc
# SPDX-License-Identifier: AGPL-3.0-or-later
# #
# This file is part of the Mycroft Server.
# #
# The Mycroft Server is free software: you can redistribute it and/or
# modify it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
# #
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# #
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
"""Defines the password change endpoint for the account API."""
from selene.api.endpoints import PasswordChangeEndpoint as CommonPasswordChangeEndpoint
from selene.util.email import EmailMessage, SeleneMailer
class PasswordChangeEndpoint(CommonPasswordChangeEndpoint):
"""Adds authentication to the common password changing endpoint."""
@property
def account_id(self):
return self.account.id
def _send_email(self):
email = EmailMessage(
recipient=self.account.email_address,
sender="Mycroft AI<no-reply@mycroft.ai>",
subject="Password Changed",
template_file_name="password_change.html",
)
mailer = SeleneMailer(email)
mailer.send(using_jinja=True)

View File

@ -16,26 +16,7 @@
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
# Mycroft Server - Backend
# Copyright (C) 2019 Mycroft AI Inc
# SPDX-License-Identifier: AGPL-3.0-or-later
#
# This file is part of the Mycroft Server.
#
# The Mycroft Server is free software: you can redistribute it and/or
# modify it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Account API endpoint for retrieving city geographical information."""
from http import HTTPStatus
from selene.api import SeleneEndpoint
@ -43,8 +24,11 @@ from selene.data.geography import CityRepository
class CityEndpoint(SeleneEndpoint):
"""Retrieve a city in a region"""
def get(self):
region_id = self.request.args['region']
"""Process an HTTP GET request."""
region_id = self.request.args["region"]
city_repository = CityRepository(self.db)
cities = city_repository.get_cities_by_region(region_id=region_id)

View File

@ -16,18 +16,22 @@
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Account API endpoint for account defaults."""
from http import HTTPStatus
from flask import json
from schematics import Model
from schematics.types import StringType
from selene.api import SeleneEndpoint
from selene.data.device import DefaultsRepository
from selene.util.log import get_selene_logger
_log = get_selene_logger(__name__)
class DefaultsRequest(Model):
"""Data model of the POST request."""
city = StringType()
country = StringType()
region = StringType()
@ -37,15 +41,18 @@ class DefaultsRequest(Model):
class AccountDefaultsEndpoint(SeleneEndpoint):
"""Handle account default HTTP requests."""
def __init__(self):
super(AccountDefaultsEndpoint, self).__init__()
super().__init__()
self.defaults = None
def get(self):
"""Process a HTTP GET request."""
self._authenticate()
self._get_defaults()
if self.defaults is None:
response_data = ''
response_data = ""
response_code = HTTPStatus.NO_CONTENT
else:
response_data = self.defaults
@ -54,36 +61,46 @@ class AccountDefaultsEndpoint(SeleneEndpoint):
return response_data, response_code
def _get_defaults(self):
"""Get the account defaults from the database."""
default_repository = DefaultsRepository(self.db, self.account.id)
self.defaults = default_repository.get_account_defaults()
if self.defaults is not None and self.defaults.wake_word.name is not None:
self.defaults.wake_word.name = self.defaults.wake_word.name.title()
def post(self):
"""Process a HTTP POST request."""
self._authenticate()
defaults = self._validate_request()
self._upsert_defaults(defaults)
return '', HTTPStatus.NO_CONTENT
return "", HTTPStatus.NO_CONTENT
def patch(self):
"""Process an HTTP PATCH request."""
self._authenticate()
defaults = self._validate_request()
self._upsert_defaults(defaults)
return '', HTTPStatus.NO_CONTENT
return "", HTTPStatus.NO_CONTENT
def _validate_request(self):
def _validate_request(self) -> dict:
"""Validate the data on the POST/PATCH request"""
request_data = json.loads(self.request.data)
defaults = DefaultsRequest()
defaults.city = request_data.get('city')
defaults.country = request_data.get('country')
defaults.region = request_data.get('region')
defaults.timezone = request_data.get('timezone')
defaults.voice = request_data['voice']
defaults.wake_word = request_data['wakeWord']
defaults.city = request_data.get("city")
defaults.country = request_data.get("country")
defaults.region = request_data.get("region")
defaults.timezone = request_data.get("timezone")
defaults.voice = request_data["voice"]
defaults.wake_word = request_data["wakeWord"]
defaults.validate()
return defaults
return defaults.to_native()
def _upsert_defaults(self, defaults):
def _upsert_defaults(self, defaults: dict):
"""Apply the changes in the request to the database."""
defaults_repository = DefaultsRepository(self.db, self.account.id)
defaults_repository.upsert(defaults.to_native())
wake_word_default = defaults.get("wake_word")
if wake_word_default is not None:
defaults["wake_word"] = defaults["wake_word"].lower()
defaults_repository.upsert(defaults)

View File

@ -16,63 +16,104 @@
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Account API endpoint for retrieving and maintaining device information."""
from dataclasses import asdict
from datetime import datetime, timedelta
from http import HTTPStatus
from logging import getLogger
from typing import List, Optional
from flask import json
from schematics import Model
from schematics.exceptions import ValidationError
from schematics.types import StringType
from schematics.types import BooleanType, StringType
from selene.api import SeleneEndpoint
from selene.api.etag import ETagManager
from selene.api.pantacor import get_pantacor_pending_deployment, update_pantacor_config
from selene.api.public_endpoint import delete_device_login
from selene.data.device import DeviceRepository, Geography, GeographyRepository
from selene.util.cache import DEVICE_LAST_CONTACT_KEY, SeleneCache
from selene.data.device import Device, DeviceRepository, Geography, GeographyRepository
from selene.util.cache import (
DEVICE_LAST_CONTACT_KEY,
DEVICE_PAIRING_CODE_KEY,
DEVICE_PAIRING_TOKEN_KEY,
SeleneCache,
)
from selene.util.db import use_transaction
from selene.util.log import get_selene_logger
ONE_DAY = 86400
CONNECTED = 'Connected'
DISCONNECTED = 'Disconnected'
DORMANT = 'Dormant'
CONNECTED = "Connected"
DISCONNECTED = "Disconnected"
DORMANT = "Dormant"
_log = getLogger()
_log = get_selene_logger(__name__)
def validate_pairing_code(pairing_code):
cache_key = 'pairing.code:' + pairing_code
"""Ensure the pairing code exists in the cache of valid pairing codes."""
cache_key = DEVICE_PAIRING_CODE_KEY.format(pairing_code=pairing_code)
cache = SeleneCache()
pairing_cache = cache.get(cache_key)
if pairing_cache is None:
raise ValidationError('pairing code not found')
raise ValidationError("pairing code not found")
class UpdateDeviceRequest(Model):
"""Schematic for a request to update a device."""
city = StringType(required=True)
country = StringType(required=True)
name = StringType(required=True)
placement = StringType()
region = StringType(required=True)
timezone = StringType(required=True)
wake_word = StringType(required=True)
wake_word = StringType(required=True, deserialize_from="wakeWord")
voice = StringType(required=True)
auto_update = BooleanType(deserialize_from="autoUpdate")
ssh_public_key = StringType(deserialize_from="sshPublicKey")
release_channel = StringType(deserialize_from="releaseChannel")
class NewDeviceRequest(UpdateDeviceRequest):
pairing_code = StringType(required=True, validators=[validate_pairing_code])
"""Schematic for a request to add a device."""
pairing_code = StringType(
required=True,
deserialize_from="pairingCode",
validators=[validate_pairing_code],
)
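The deserialize_from arguments are what let the camelCase keys sent by the UI land on the snake_case model fields when the raw JSON is passed to the model constructor. A minimal illustration, assuming schematics behaves as in the pinned version; the field names are copied from the schematic above.

from schematics import Model
from schematics.types import BooleanType, StringType

class ExampleRequest(Model):
    wake_word = StringType(required=True, deserialize_from="wakeWord")
    auto_update = BooleanType(deserialize_from="autoUpdate")

request_json = {"wakeWord": "Hey Mycroft", "autoUpdate": False}
example = ExampleRequest(request_json)   # camelCase keys mapped via deserialize_from
example.validate()
print(example.to_native())               # {'wake_word': 'Hey Mycroft', 'auto_update': False}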
class DeviceEndpoint(SeleneEndpoint):
def __init__(self):
super(DeviceEndpoint, self).__init__()
self.devices = None
self.cache = self.config['SELENE_CACHE']
self.etag_manager: ETagManager = ETagManager(self.cache, self.config)
"""Retrieve and maintain device information for the Account API"""
def get(self, device_id):
_device_repository = None
def __init__(self):
super().__init__()
self.devices = None
self.validated_request = None
self.cache = self.config["SELENE_CACHE"]
self.etag_manager: ETagManager = ETagManager(self.cache, self.config)
self.pantacor_channels = dict(
myc200_dev_test="Development",
myc200_beta_qa_test="Beta QA",
myc200_beta="Beta",
myc200_stable="Stable",
myc200_lts="LTS",
)
@property
def device_repository(self):
"""Lazily instantiate the device repository."""
if self._device_repository is None:
self._device_repository = DeviceRepository(self.db)
return self._device_repository
def get(self, device_id: str):
"""Process an HTTP GET request."""
self._authenticate()
if device_id is None:
response_data = self._get_devices()
@ -81,11 +122,12 @@ class DeviceEndpoint(SeleneEndpoint):
return response_data, HTTPStatus.OK
def _get_devices(self):
device_repository = DeviceRepository(self.db)
devices = device_repository.get_devices_by_account_id(
self.account.id
)
def _get_devices(self) -> List[dict]:
"""Get a list of the devices belonging to the account in the request JWT
:return: list of devices to be returned to the UI.
"""
devices = self.device_repository.get_devices_by_account_id(self.account.id)
response_data = []
for device in devices:
response_device = self._format_device_for_response(device)
@ -93,31 +135,71 @@ class DeviceEndpoint(SeleneEndpoint):
return response_data
def _get_device(self, device_id):
device_repository = DeviceRepository(self.db)
device = device_repository.get_device_by_id(device_id)
def _get_device(self, device_id: str) -> dict:
"""Get the device information for a specific device.
:param device_id: Identifier of the device to retrieve
:return: device information to return to the UI
"""
device = self.device_repository.get_device_by_id(device_id)
response_data = self._format_device_for_response(device)
return response_data
def _format_device_for_response(self, device):
"""Convert device object into a response object for this endpoint."""
def _format_device_for_response(self, device: Device) -> dict:
"""Convert device object into a response object for this endpoint.
:param device: the device data retrieved from the database.
:return: device information formatted for the UI
"""
pantacor_config = self._format_pantacor_config(device.pantacor_config)
device_status, disconnect_duration = self._format_device_status(device)
formatted_device = asdict(device)
formatted_device["pantacor_config"].update(pantacor_config)
formatted_device["wake_word"].update(name=device.wake_word.name.title())
formatted_device.update(
status=device_status,
disconnect_duration=disconnect_duration,
voice=formatted_device.pop("text_to_speech"),
)
return formatted_device
def _format_pantacor_config(self, config) -> dict[str, str]:
"""Converts Pantacor config values in the database into displayable values.
:param config: Pantacor config database values
:returns: Pantacor config displayable values
"""
formatted_config = dict(deployment_id=None)
manual_update = config.auto_update is not None and not config.auto_update
if manual_update:
formatted_config.update(
deployment_id=get_pantacor_pending_deployment(config.pantacor_id)
)
if config.release_channel is not None:
formatted_config.update(
release_channel=self.pantacor_channels.get(config.release_channel)
)
return formatted_config
def _format_device_status(self, device: Device) -> tuple[str, Optional[str]]:
"""Determines the status of the device being returned.
:param device: The device to determine the status of
:return: status of the device and the duration of disconnect (if applicable)
"""
last_contact_age = self._get_device_last_contact(device)
device_status = self._determine_device_status(last_contact_age)
if device_status == DISCONNECTED:
disconnect_duration = self._determine_disconnect_duration(
last_contact_age
)
disconnect_duration = self._determine_disconnect_duration(last_contact_age)
else:
disconnect_duration = None
device_dict = asdict(device)
device_dict['status'] = device_status
device_dict['disconnect_duration'] = disconnect_duration
device_dict['voice'] = device_dict.pop('text_to_speech')
return device_dict
return device_status, disconnect_duration
def _get_device_last_contact(self, device):
def _get_device_last_contact(self, device: Device) -> timedelta:
"""Get the last time the device contacted the backend.
The timestamp returned by this method will be used to determine if a
@ -128,6 +210,9 @@ class DeviceEndpoint(SeleneEndpoint):
If the Redis query returns nothing, the device hasn't contacted the
backend yet. This could be because it was just activated. Give the
device a couple of minutes to make that first call to the backend.
:param device: the device data retrieved from the database.
:return: the timestamp the device was last seen by Selene
"""
last_contact_ts = self.cache.get(
DEVICE_LAST_CONTACT_KEY.format(device_id=device.id)
@ -139,17 +224,18 @@ class DeviceEndpoint(SeleneEndpoint):
last_contact_age = datetime.utcnow() - device.last_contact_ts
else:
last_contact_ts = last_contact_ts.decode()
last_contact_ts = datetime.strptime(
last_contact_ts,
'%Y-%m-%d %H:%M:%S.%f'
)
last_contact_ts = datetime.strptime(last_contact_ts, "%Y-%m-%d %H:%M:%S.%f")
last_contact_age = datetime.utcnow() - last_contact_ts
return last_contact_age
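The last-contact value cached in Redis is a byte string, so it is decoded and parsed with the exact timestamp format before the age can be computed. A small standalone illustration; the timestamp is made up.

from datetime import datetime

last_contact_ts = b"2022-09-28 14:17:52.123456"            # raw bytes from Redis
parsed = datetime.strptime(last_contact_ts.decode(), "%Y-%m-%d %H:%M:%S.%f")
last_contact_age = datetime.utcnow() - parsed              # timedelta fed into the status check
print(last_contact_age)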
@staticmethod
def _determine_device_status(last_contact_age):
"""Derive device status from the last time device contacted servers."""
def _determine_device_status(last_contact_age: timedelta) -> str:
"""Derive device status from the last time device contacted servers.
:param last_contact_age: amount of time since the device was last seen
:return: the status of the device
"""
if last_contact_age <= timedelta(seconds=120):
device_status = CONNECTED
elif timedelta(seconds=120) < last_contact_age < timedelta(days=30):
@ -160,131 +246,193 @@ class DeviceEndpoint(SeleneEndpoint):
return device_status
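The thresholds above treat a device seen within the last two minutes as connected and one seen within the last thirty days as disconnected; anything older presumably falls through to dormant, which the truncated hunk does not show. A standalone sketch of that check under those assumptions.

from datetime import timedelta

CONNECTED, DISCONNECTED, DORMANT = "Connected", "Disconnected", "Dormant"

def determine_device_status(last_contact_age: timedelta) -> str:
    """Connected within 2 minutes, Disconnected up to 30 days, Dormant after that (assumed)."""
    if last_contact_age <= timedelta(seconds=120):
        return CONNECTED
    if last_contact_age < timedelta(days=30):
        return DISCONNECTED
    return DORMANT

assert determine_device_status(timedelta(seconds=90)) == CONNECTED
assert determine_device_status(timedelta(hours=6)) == DISCONNECTED
assert determine_device_status(timedelta(days=45)) == DORMANT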
@staticmethod
def _determine_disconnect_duration(last_contact_age):
"""Derive device status from the last time device contacted servers."""
disconnect_duration = 'unknown'
def _determine_disconnect_duration(last_contact_age: timedelta) -> str:
"""Derive device status from the last time device contacted servers.
:param last_contact_age: amount of time since the device was last seen
:return human readable amount of time since the device was last seen
"""
disconnect_duration = "unknown"
days, _ = divmod(last_contact_age, timedelta(days=1))
if days:
disconnect_duration = str(days) + ' days'
disconnect_duration = str(days) + " days"
else:
hours, remaining = divmod(last_contact_age, timedelta(hours=1))
if hours:
disconnect_duration = str(hours) + ' hours'
disconnect_duration = str(hours) + " hours"
else:
minutes, _ = divmod(remaining, timedelta(minutes=1))
if minutes:
disconnect_duration = str(minutes) + ' minutes'
disconnect_duration = str(minutes) + " minutes"
return disconnect_duration
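divmod works directly on timedelta objects and returns an integer quotient plus a timedelta remainder, which is what drives the days/hours/minutes cascade above. A short demonstration with made-up ages.

from datetime import timedelta

age = timedelta(days=2, hours=5, minutes=30)
days, _ = divmod(age, timedelta(days=1))
print(days)                        # 2
hours, remaining = divmod(timedelta(hours=3, minutes=20), timedelta(hours=1))
print(hours, remaining)            # 3 0:20:00
minutes, _ = divmod(remaining, timedelta(minutes=1))
print(minutes)                     # 20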
def post(self):
"""Handle a HTTP POST request."""
self._authenticate()
device = self._validate_request()
device_id = self._pair_device(device)
self._validate_request()
self._pair_device()
return device_id, HTTPStatus.OK
return "", HTTPStatus.NO_CONTENT
def _validate_request(self):
request_data = json.loads(self.request.data)
if self.request.method == 'POST':
device = NewDeviceRequest()
device.pairing_code = request_data['pairingCode']
else:
device = UpdateDeviceRequest()
device.city = request_data['city']
device.country = request_data['country']
device.name = request_data['name']
device.placement = request_data['placement']
device.region = request_data['region']
device.timezone = request_data['timezone']
device.wake_word = request_data['wakeWord']
device.voice = request_data['voice']
device.validate()
@use_transaction
def _pair_device(self):
"""Add the paired device to the database."""
cache_key = DEVICE_PAIRING_CODE_KEY.format(
pairing_code=self.validated_request["pairing_code"]
)
pairing_data = self._get_pairing_data(cache_key)
device_id = self._add_device()
pairing_data["uuid"] = device_id
self.cache.delete(cache_key)
self._build_pairing_token(pairing_data)
return device
def _get_pairing_data(self, cache_key) -> dict:
"""Checking if there's one pairing session for the pairing code.
def _pair_device(self, device):
self.db.autocommit = False
try:
pairing_data = self._get_pairing_data(device.pairing_code)
device_id = self._add_device(device)
pairing_data['uuid'] = device_id
self.cache.delete('pairing.code:{}'.format(device.pairing_code))
self._build_pairing_token(pairing_data)
except Exception:
self.db.rollback()
raise
else:
self.db.commit()
return device_id
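The use_transaction decorator replaces the manual autocommit/rollback/commit bookkeeping shown in the removed version of _pair_device. Its real implementation lives in selene.util.db and is not part of this diff; a decorator along these general lines would give the same behaviour (a sketch only, not the actual code).

import functools

def use_transaction(func):
    """Wrap an endpoint method in a commit/rollback block (illustrative sketch)."""
    @functools.wraps(func)
    def wrapper(self, *args, **kwargs):
        self.db.autocommit = False
        try:
            result = func(self, *args, **kwargs)
        except Exception:
            self.db.rollback()
            raise
        self.db.commit()
        return result
    return wrapper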
def _get_pairing_data(self, pairing_code: str) -> dict:
"""Checking if there's one pairing session for the pairing code."""
cache_key = 'pairing.code:' + pairing_code
:return: the pairing code information from the Redis database
"""
pairing_cache = self.cache.get(cache_key)
pairing_data = json.loads(pairing_cache)
return pairing_data
def _add_device(self, device: NewDeviceRequest):
"""Creates a device and associate it to a pairing session"""
device_dict = device.to_native()
geography_id = self._ensure_geography_exists(self.db, device_dict)
device_dict.update(geography_id=geography_id)
device_repository = DeviceRepository(self.db)
device_id = device_repository.add(self.account.id, device_dict)
def _add_device(self) -> str:
"""Creates a device and associate it to a pairing session.
:return: the database identifier of the new device
"""
self._ensure_geography_exists()
device_id = self.device_repository.add(self.account.id, self.validated_request)
return device_id
def _ensure_geography_exists(self, db, device: dict):
geography = Geography(
city=device['city'],
country=device['country'],
region=device['region'],
time_zone=device['timezone']
)
geography_repository = GeographyRepository(db, self.account.id)
geography_id = geography_repository.get_geography_id(geography)
if geography_id is None:
geography_id = geography_repository.add(geography)
def _build_pairing_token(self, pairing_data: dict):
"""Add a pairing token to the Redis database.
return geography_id
def _build_pairing_token(self, pairing_data):
:param pairing_data: the pairing data retrieved from Redis
"""
self.cache.set_with_expiration(
key='pairing.token:' + pairing_data['token'],
key=DEVICE_PAIRING_TOKEN_KEY.format(pairing_token=pairing_data["token"]),
value=json.dumps(pairing_data),
expiration=ONE_DAY
expiration=ONE_DAY,
)
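Pairing keeps two Redis entries: the pairing code looked up by _get_pairing_data and the pairing token written here with a one-day expiration. A small sketch of the key layout, assuming the key templates match the literal strings visible in the removed lines and in the test steps further down; the real constants live in selene.util.cache.

import json

DEVICE_PAIRING_CODE_KEY = "pairing.code:{pairing_code}"      # assumed template
DEVICE_PAIRING_TOKEN_KEY = "pairing.token:{pairing_token}"   # assumed template
ONE_DAY = 86400

pairing_data = dict(code="ABC123", state="this is a state", token="this is a token")
code_key = DEVICE_PAIRING_CODE_KEY.format(pairing_code=pairing_data["code"])
token_key = DEVICE_PAIRING_TOKEN_KEY.format(pairing_token=pairing_data["token"])
print(code_key)     # pairing.code:ABC123 (written when the code is generated)
print(token_key)    # pairing.token:this is a token (written here, expires after ONE_DAY)
serialized = json.dumps(pairing_data)   # the value stored under token_key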
def delete(self, device_id):
def delete(self, device_id: str):
"""Handle an HTTP DELETE request.
:param device_id: database identifier of a device
"""
self._authenticate()
self._delete_device(device_id)
return '', HTTPStatus.NO_CONTENT
def _delete_device(self, device_id):
device_repository = DeviceRepository(self.db)
device_repository.remove(device_id)
return "", HTTPStatus.NO_CONTENT
def _delete_device(self, device_id: str):
"""Delete the specified device from the database.
There are other tables related to the device table in the database. This
method assumes that the child tables contain "delete cascade" clauses.
:param device_id: database identifier of a device
"""
self.device_repository.remove(device_id)
delete_device_login(device_id, self.cache)
def patch(self, device_id):
def patch(self, device_id: str):
"""Handle a HTTP PATCH request.
:param device_id: database identifier of a device
"""
self._authenticate()
updates = self._validate_request()
self._update_device(device_id, updates)
self._validate_request()
self._update_device(device_id)
self.etag_manager.expire_device_etag_by_device_id(device_id)
self.etag_manager.expire_device_location_etag_by_device_id(device_id)
self.etag_manager.expire_device_setting_etag_by_device_id(device_id)
return '', HTTPStatus.NO_CONTENT
return "", HTTPStatus.NO_CONTENT
def _update_device(self, device_id, updates):
device_updates = updates.to_native()
geography_id = self._ensure_geography_exists(self.db, device_updates)
device_updates.update(geography_id=geography_id)
device_repository = DeviceRepository(self.db)
device_repository.update_device_from_account(
self.account.id,
device_id,
device_updates
def _validate_request(self):
"""Validate the contents of the HTTP POST request."""
if self.request.method == "POST":
device = NewDeviceRequest(self.request.json)
else:
device = UpdateDeviceRequest(self.request.json)
device.validate()
self.validated_request = device.to_native()
self.validated_request.update(
wake_word=self.validated_request["wake_word"].lower()
)
if self.validated_request["release_channel"] is not None:
self.validated_request.update(
release_channel=self.validated_request["release_channel"].lower()
)
def _ensure_geography_exists(self):
"""If the requested geography is not linked to the account, add it.
:return: database identifier for the geography
"""
geography = Geography(
city=self.validated_request.pop("city"),
country=self.validated_request.pop("country"),
region=self.validated_request.pop("region"),
time_zone=self.validated_request.pop("timezone"),
)
geography_repository = GeographyRepository(self.db, self.account.id)
geography_id = geography_repository.get_geography_id(geography)
if geography_id is None:
geography_id = geography_repository.add(geography)
self.validated_request.update(geography_id=geography_id)
@use_transaction
def _update_device(self, device_id: str):
"""Update the device attributes on the database based on the request.
If the device's continuous delivery is managed by Pantacor, attempt the
Pantacor API calls first. That way, if they fail, the database updates won't
happen and we won't get stuck in a half-updated state.
:param device_id: database identifier of a device
"""
device = self.device_repository.get_device_by_id(device_id)
if device.pantacor_config.pantacor_id is not None:
self._update_pantacor_config(device)
self._ensure_geography_exists()
self.device_repository.update_device_from_account(
self.account.id, device_id, self.validated_request
)
def _update_pantacor_config(self, device: Device):
"""Update the Pantacor configuration on the database based on the request.
:param device: data object representing a Mycroft-enabled device
"""
new_pantacor_config = dict(
auto_update=self.validated_request.pop("auto_update"),
release_channel=self.validated_request.pop("release_channel"),
ssh_public_key=self.validated_request.pop("ssh_public_key"),
)
pantacor_channel_name = self._convert_release_channel(
new_pantacor_config["release_channel"]
)
new_pantacor_config.update(release_channel=pantacor_channel_name)
old_pantacor_config = asdict(device.pantacor_config)
update_pantacor_config(old_pantacor_config, new_pantacor_config)
self.device_repository.update_pantacor_config(device.id, new_pantacor_config)
def _convert_release_channel(self, release_channel: str) -> str:
"""Converts the channel sent in the request to one recognized by Pantacor.
:param release_channel: the value of the release channel in the request
:returns: the release channel as recognized by Pantacor
"""
pantacor_channel_name = None
for channel_name, channel_display in self.pantacor_channels.items():
if channel_display.lower() == release_channel:
pantacor_channel_name = channel_name
_log.info("pantacor channel name: %s", pantacor_channel_name)
return pantacor_channel_name
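pantacor_channels maps internal Pantacor channel names to display names, and _convert_release_channel walks it backwards using the already lower-cased value from the request. A standalone version of that reverse lookup, with the mapping copied from __init__ above.

pantacor_channels = dict(
    myc200_dev_test="Development",
    myc200_beta_qa_test="Beta QA",
    myc200_beta="Beta",
    myc200_stable="Stable",
    myc200_lts="LTS",
)

def convert_release_channel(release_channel: str) -> str:
    """Return the internal channel name whose display name matches the request value."""
    for channel_name, channel_display in pantacor_channels.items():
        if channel_display.lower() == release_channel:
            return channel_name

print(convert_release_channel("beta"))    # myc200_beta
print(convert_release_channel("stable"))  # myc200_stable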

View File

@ -31,8 +31,6 @@ class DeviceCountEndpoint(SeleneEndpoint):
def _get_devices(self):
device_repository = DeviceRepository(self.db)
device_count = device_repository.get_account_device_count(
self.account.id
)
device_count = device_repository.get_account_device_count(self.account.id)
return device_count

View File

@ -25,7 +25,7 @@ from selene.api import SeleneEndpoint
class PairingCodeEndpoint(SeleneEndpoint):
def __init__(self):
super(PairingCodeEndpoint, self).__init__()
self.cache = self.config['SELENE_CACHE']
self.cache = self.config["SELENE_CACHE"]
def get(self, pairing_code):
self._authenticate()
@ -36,7 +36,7 @@ class PairingCodeEndpoint(SeleneEndpoint):
def _get_pairing_data(self, pairing_code: str) -> bool:
"""Checking if there's one pairing session for the pairing code."""
pairing_code_is_valid = False
cache_key = 'pairing.code:' + pairing_code
cache_key = "pairing.code:" + pairing_code
pairing_cache = self.cache.get(cache_key)
if pairing_cache is not None:
pairing_code_is_valid = True

View File

@ -29,29 +29,23 @@ from selene.data.device import AccountPreferences, PreferenceRepository
class PreferencesRequest(Model):
date_format = StringType(
required=True,
choices=['DD/MM/YYYY', 'MM/DD/YYYY']
)
measurement_system = StringType(
required=True,
choices=['Imperial', 'Metric']
)
time_format = StringType(required=True, choices=['12 Hour', '24 Hour'])
date_format = StringType(required=True, choices=["DD/MM/YYYY", "MM/DD/YYYY"])
measurement_system = StringType(required=True, choices=["Imperial", "Metric"])
time_format = StringType(required=True, choices=["12 Hour", "24 Hour"])
class PreferencesEndpoint(SeleneEndpoint):
def __init__(self):
super(PreferencesEndpoint, self).__init__()
self.preferences = None
self.cache = self.config['SELENE_CACHE']
self.cache = self.config["SELENE_CACHE"]
self.etag_manager: ETagManager = ETagManager(self.cache, self.config)
def get(self):
self._authenticate()
self._get_preferences()
if self.preferences is None:
response_data = ''
response_data = ""
response_code = HTTPStatus.NO_CONTENT
else:
response_data = asdict(self.preferences)
@ -68,22 +62,20 @@ class PreferencesEndpoint(SeleneEndpoint):
self._validate_request()
self._upsert_preferences()
self.etag_manager.expire_device_setting_etag_by_account_id(self.account.id)
return '', HTTPStatus.NO_CONTENT
return "", HTTPStatus.NO_CONTENT
def patch(self):
self._authenticate()
self._validate_request()
self._upsert_preferences()
self.etag_manager.expire_device_setting_etag_by_account_id(self.account.id)
return '', HTTPStatus.NO_CONTENT
return "", HTTPStatus.NO_CONTENT
def _validate_request(self):
self.preferences = PreferencesRequest()
self.preferences.date_format = self.request.json['dateFormat']
self.preferences.measurement_system = (
self.request.json['measurementSystem']
)
self.preferences.time_format = self.request.json['timeFormat']
self.preferences.date_format = self.request.json["dateFormat"]
self.preferences.measurement_system = self.request.json["measurementSystem"]
self.preferences.time_format = self.request.json["timeFormat"]
self.preferences.validate()
def _upsert_preferences(self):

View File

@ -25,7 +25,7 @@ from selene.data.geography import RegionRepository
class RegionEndpoint(SeleneEndpoint):
def get(self):
country_id = self.request.args['country']
country_id = self.request.args["country"]
region_repository = RegionRepository(self.db)
regions = region_repository.get_regions_by_country(country_id)

View File

@ -27,17 +27,15 @@ from selene.api import SeleneEndpoint
class SkillOauthEndpoint(SeleneEndpoint):
def __init__(self):
super(SkillOauthEndpoint, self).__init__()
self.oauth_base_url = os.environ['OAUTH_BASE_URL']
self.oauth_base_url = os.environ["OAUTH_BASE_URL"]
def get(self, oauth_id):
self._authenticate()
return self._get_oauth_url(oauth_id)
def _get_oauth_url(self, oauth_id):
url = '{base_url}/auth/{oauth_id}/auth_url?uuid={account_id}'.format(
base_url=self.oauth_base_url,
oauth_id=oauth_id,
account_id=self.account.id
url = "{base_url}/auth/{oauth_id}/auth_url?uuid={account_id}".format(
base_url=self.oauth_base_url, oauth_id=oauth_id, account_id=self.account.id
)
response = requests.get(url)
return response.text, response.status_code

View File

@ -35,8 +35,7 @@ class SkillSettingsEndpoint(SeleneEndpoint):
self.account_skills = None
self.family_settings = None
self.etag_manager: ETagManager = ETagManager(
self.config['SELENE_CACHE'],
self.config
self.config["SELENE_CACHE"], self.config
)
@property
@ -51,8 +50,7 @@ class SkillSettingsEndpoint(SeleneEndpoint):
"""Process an HTTP GET request"""
self._authenticate()
self.family_settings = self.setting_repository.get_family_settings(
self.account.id,
skill_family_name
self.account.id, skill_family_name
)
self._parse_selection_options()
response_data = self._build_response_data()
@ -62,7 +60,7 @@ class SkillSettingsEndpoint(SeleneEndpoint):
return Response(
response=json.dumps(response_data),
status=HTTPStatus.OK,
content_type='application/json'
content_type="application/json",
)
def _parse_selection_options(self):
@ -75,19 +73,16 @@ class SkillSettingsEndpoint(SeleneEndpoint):
"""
for skill_settings in self.family_settings:
if skill_settings.settings_definition is not None:
for section in skill_settings.settings_definition['sections']:
for field in section['fields']:
if field['type'] == 'select':
for section in skill_settings.settings_definition["sections"]:
for field in section["fields"]:
if field["type"] == "select":
parsed_options = []
for option in field['options'].split(';'):
option_display, option_value = option.split('|')
for option in field["options"].split(";"):
option_display, option_value = option.split("|")
parsed_options.append(
dict(
display=option_display,
value=option_value
)
dict(display=option_display, value=option_value)
)
field['options'] = parsed_options
field["options"] = parsed_options
def _build_response_data(self):
"""Build the object to return to the UI."""
@ -100,7 +95,7 @@ class SkillSettingsEndpoint(SeleneEndpoint):
response_skill = dict(
settingsDisplay=skill_settings.settings_definition,
settingsValues=skill_settings.settings_values,
deviceNames=skill_settings.device_names
deviceNames=skill_settings.device_names,
)
response_data.append(response_skill)
@ -111,19 +106,17 @@ class SkillSettingsEndpoint(SeleneEndpoint):
self._authenticate()
self._update_settings_values()
return '', HTTPStatus.OK
return "", HTTPStatus.OK
def _update_settings_values(self):
"""Update the value of the settings column on the device_skill table,"""
for new_skill_settings in self.request.json['skillSettings']:
for new_skill_settings in self.request.json["skillSettings"]:
account_skill_settings = AccountSkillSetting(
settings_definition=new_skill_settings['settingsDisplay'],
settings_values=new_skill_settings['settingsValues'],
device_names=new_skill_settings['deviceNames']
settings_definition=new_skill_settings["settingsDisplay"],
settings_values=new_skill_settings["settingsValues"],
device_names=new_skill_settings["deviceNames"],
)
self.setting_repository.update_skill_settings(
self.account.id,
account_skill_settings,
self.request.json['skillIds']
self.account.id, account_skill_settings, self.request.json["skillIds"]
)
self.etag_manager.expire_skill_etag_by_account_id(self.account.id)

View File

@ -44,11 +44,11 @@ class SkillsEndpoint(SeleneEndpoint):
market_id=skill.market_id,
name=skill.display_name or skill.family_name,
has_settings=skill.has_settings,
skill_ids=skill.skill_ids
skill_ids=skill.skill_ids,
)
else:
response_skill['skill_ids'].extend(skill.skill_ids)
if response_skill['market_id'] is None:
response_skill['market_id'] = skill.market_id
response_skill["skill_ids"].extend(skill.skill_ids)
if response_skill["market_id"] is None:
response_skill["market_id"] = skill.market_id
return sorted(response_data.values(), key=lambda x: x['name'])
return sorted(response_data.values(), key=lambda x: x["name"])

View File

@ -0,0 +1,49 @@
# Mycroft Server - Backend
# Copyright (C) 2021 Mycroft AI Inc
# SPDX-License-Identifier: AGPL-3.0-or-later
#
# This file is part of the Mycroft Server.
#
# The Mycroft Server is free software: you can redistribute it and/or
# modify it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Endpoint to process a user's request to apply an software update on their device."""
from http import HTTPStatus
from schematics import Model
from schematics.types import StringType
from selene.api import SeleneEndpoint
from selene.api.pantacor import apply_pantacor_update
class SoftwareUpdateRequest(Model):
"""Schematic for a request to update software on a device."""
deployment_id = StringType(required=True)
class SoftwareUpdateEndpoint(SeleneEndpoint):
"""Send a request to Pantacor to update a device."""
def patch(self):
"""Handle a HTTP PATCH request."""
self._authenticate()
self._validate_request()
apply_pantacor_update(self.request.json["deploymentId"])
return "", HTTPStatus.NO_CONTENT
def _validate_request(self):
"""Validate the contents of the PATCH request."""
request_validator = SoftwareUpdateRequest()
request_validator.deployment_id = self.request.json["deploymentId"]
request_validator.validate()
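The request body carries only a deployment ID, and the schematic rejects a missing value before the Pantacor call is made. A tiny check of that behaviour, assuming schematics raises DataError as in the pinned version; the ID is illustrative.

from schematics import Model
from schematics.exceptions import DataError
from schematics.types import StringType

class SoftwareUpdateRequest(Model):
    deployment_id = StringType(required=True)

good = SoftwareUpdateRequest()
good.deployment_id = "test_deployment_id"
good.validate()                      # passes

try:
    SoftwareUpdateRequest().validate()
except DataError as error:
    print(error)                     # deployment_id: This field is required.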

View File

@ -0,0 +1,40 @@
# Mycroft Server - Backend
# Copyright (C) 2021 Mycroft AI Inc
# SPDX-License-Identifier: AGPL-3.0-or-later
#
# This file is part of the Mycroft Server.
#
# The Mycroft Server is free software: you can redistribute it and/or
# modify it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Endpoint to validate the contents of the SSH public key."""
from urllib.parse import unquote_plus
from http import HTTPStatus
from selene.api import SeleneEndpoint
from selene.util.ssh import validate_rsa_public_key
class SshKeyValidatorEndpoint(SeleneEndpoint):
"""Validate the contents of an SSH public key."""
def get(self):
"""Handle and HTTP GET request.
The SSH key is encoded in the UI because it can contain characters that are
reserved for URL delimiting.
"""
self._authenticate()
decoded_ssh_key = unquote_plus(self.request.args["key"])
ssh_key_is_valid = validate_rsa_public_key(decoded_ssh_key)
return dict(isValid=ssh_key_is_valid), HTTPStatus.OK
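Because the key arrives as a URL query parameter, the UI percent-encodes it and the endpoint decodes it with unquote_plus before validating. A short round trip of that encoding; the key material is a dummy.

from urllib.parse import quote_plus, unquote_plus

ssh_key = "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDEwmtmRho== foo"
encoded = quote_plus(ssh_key)            # spaces become '+', '=' becomes %3D
print(encoded)                           # ssh-rsa+AAAAB3NzaC1yc2EAAAADAQABAAACAQDEwmtmRho%3D%3D+foo
print(unquote_plus(encoded) == ssh_key)  # True; unquote_plus also decodes %20, as used in the tests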

View File

@ -25,7 +25,7 @@ from selene.data.geography import TimezoneRepository
class TimezoneEndpoint(SeleneEndpoint):
def get(self):
country_id = self.request.args['country']
country_id = self.request.args["country"]
timezone_repository = TimezoneRepository(self.db)
timezones = timezone_repository.get_timezones_by_country(country_id)

View File

@ -0,0 +1,65 @@
# Mycroft Server - Backend
# Copyright (c) 2022 Mycroft AI Inc
# SPDX-License-Identifier: AGPL-3.0-or-later
# #
# This file is part of the Mycroft Server.
# #
# The Mycroft Server is free software: you can redistribute it and/or
# modify it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
# #
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# #
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
"""Account API endpoint to be called when a user is verifying their email address."""
from binascii import a2b_base64
from http import HTTPStatus
from selene.api import SeleneEndpoint, APIError
from selene.data.account import AccountRepository
from selene.util.email import validate_email_address
class VerifyEmailAddressEndpoint(SeleneEndpoint):
"""Updates a user's email address after they have verified it."""
def put(self):
"""Processes an HTTP PUT request to update the email address."""
self._authenticate()
email_address = self._validate_email_address()
self._update_account(email_address)
return "", HTTPStatus.NO_CONTENT
def _validate_email_address(self) -> str:
"""Validates that the email address is well formatted and reachable.
By this point in the email address change process, this validation has
already been done. It is done again here as a protection against malicious
calls to this endpoint.
:returns: a normalized version of the email address in the request
:raises: an API error if the email address validation fails
"""
encoded_email_address = self.request.json["token"]
email_address = a2b_base64(encoded_email_address).decode()
normalized_email_address, error = validate_email_address(email_address)
if error is not None:
raise APIError(f"invalid email address: {error}")
return normalized_email_address
def _update_account(self, email_address: str):
"""Updates the email address on the DB now that it has been verified.
:param email_address: the email address to apply to the account.account table
"""
account_repo = AccountRepository(self.db)
account_repo.update_email_address(self.account.id, email_address)
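The verification token is simply the new email address, base64 encoded when the verification email is built (see the b2a_base64 usage in the test steps below); the endpoint reverses it with a2b_base64 before re-validating the address. A minimal round trip with a dummy address.

from binascii import a2b_base64, b2a_base64

email_address = "bar@mycroft.ai"
token = b2a_base64(email_address.encode(), newline=False).decode()
print(token)                          # YmFyQG15Y3JvZnQuYWk=
print(a2b_base64(token).decode())     # bar@mycroft.ai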

View File

@ -15,33 +15,33 @@
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
# along with this program. If not, see <https://www.gnu.org/licenses/
"""Account API endpoint to return a list of available wake words."""
from http import HTTPStatus
from selene.api import SeleneEndpoint
from selene.data.device import WakeWordRepository
from selene.data.wake_word import WakeWordRepository
class WakeWordEndpoint(SeleneEndpoint):
"""Return a list of available wake words"""
def get(self):
"""Handle a HTTP GET request."""
self._authenticate()
response_data = self._build_response_data()
return response_data, HTTPStatus.OK
def _build_response_data(self):
wake_word_repository = WakeWordRepository(self.db, self.account.id)
wake_words = wake_word_repository.get_wake_words()
"""Build the response to the HTTP GET request."""
response_data = []
wake_word_repository = WakeWordRepository(self.db)
wake_words = wake_word_repository.get_wake_words_for_web()
for wake_word in wake_words:
response_data.append(
dict(
id=wake_word.id,
name=wake_word.wake_word,
user_defined=wake_word.user_defined
)
dict(id=wake_word.id, name=wake_word.name, user_defined=False,)
)
return response_data

api/account/poetry.lock (generated new file, 1257 lines): diff suppressed because it is too large.

View File

@ -0,0 +1,33 @@
[tool.poetry]
name = "account"
version = "0.1.0"
description = "API to support account.mycroft.ai"
authors = ["Chris Veilleux <veilleux.chris@gmail.com>"]
license = "GNU AGPL 3.0"
[tool.poetry.dependencies]
python = "^3.9"
# Version 1.0 of flask is required because later versions do not allow lists to be passed as API responses. The Google
# STT endpoint passes a list of transcriptions to the device. Changing this to return a dictionary would break the
# API's V1 contract with Mycroft Core.
#
# To make flask 1.0 work, older versions of itsdangerous, jinja2, markupsafe and werkzeug are required.
flask = "<1.1"
itsdangerous = "<=2.0.1"
jinja2 = "<=2.10.1"
markupsafe = "<=2.0.1"
schematics = "*"
stripe = "*"
selene = {path = "./../../shared", develop = true}
uwsgi = "*"
werkzeug = "<=2.0.3"
[tool.poetry.dev-dependencies]
allure-behave = "*"
behave = "*"
pyhamcrest = "*"
pylint = "*"
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"

View File

@ -1,4 +1,4 @@
Feature: Pair a device
Feature: Account API -- Pair a device
Test the device add endpoint
Scenario: Add a device

View File

@ -1,4 +1,4 @@
Feature: Get the active agreements
Feature: Account API -- Get the active agreements
We need to be able to retrieve an agreement and display it on the web app.
Scenario: Multiple versions of an agreement exist

View File

@ -1,4 +1,4 @@
Feature: Authentication with JWTs
Feature: Account API - Authentication with JWTs
Some of the API endpoints contain information that is specific to a user.
To ensure that information is seen only by the user that owns it, we will
use a login mechanism coupled with authentication tokens to securely identify

View File

@ -18,7 +18,6 @@
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Setup the environment for the account API behavioral tests."""
from datetime import datetime
from logging import getLogger
from behave import fixture, use_fixture
@ -27,13 +26,12 @@ from selene.data.metric import AccountActivityRepository
from selene.testing.account import add_account, remove_account
from selene.testing.account_geography import add_account_geography
from selene.testing.agreement import add_agreements, remove_agreements
from selene.testing.tagging import remove_wake_word_files
from selene.testing.text_to_speech import add_text_to_speech, remove_text_to_speech
from selene.testing.wake_word import add_wake_word, remove_wake_word
from selene.util.cache import SeleneCache
from selene.util.db import connect_to_db
_log = getLogger()
@fixture
def acct_api_client(context):
@ -54,6 +52,7 @@ def before_all(context):
use_fixture(acct_api_client, context)
context.db = connect_to_db(context.client_config["DB_CONNECTION_CONFIG"])
add_agreements(context)
context.wake_word = add_wake_word(context.db)
def after_all(context):
@ -62,6 +61,7 @@ def after_all(context):
This is data that does not change from test to test so it only needs to be setup
and torn down once.
"""
remove_wake_word(context.db, context.wake_word)
remove_agreements(
context.db, [context.privacy_policy, context.terms_of_use, context.open_dataset]
)
@ -72,7 +72,6 @@ def before_scenario(context, _):
account = add_account(context.db)
context.accounts = dict(foo=account)
context.geography_id = add_account_geography(context.db, account)
context.wake_word = add_wake_word(context.db)
context.voice = add_text_to_speech(context.db)
acct_activity_repository = AccountActivityRepository(context.db)
context.account_activity = acct_activity_repository.get_activity_by_date(
@ -89,9 +88,10 @@ def after_scenario(context, _):
"""
for account in context.accounts.values():
remove_account(context.db, account)
remove_wake_word(context.db, context.wake_word)
remove_text_to_speech(context.db, context.voice)
_clean_cache()
if hasattr(context, "wake_word_file"):
remove_wake_word_files(context.db, context.wake_word_file)
def _clean_cache():

View File

@ -0,0 +1,36 @@
Feature: Account API -- Interact with the Pantacor API
Devices that use Pantacor to manage the software running on them have a set of
additional attributes that can be updated using the Pantacor API
Scenario: Indicate to user that software update is available
Given an account
And the account is authenticated
And a device using Pantacor for continuous delivery
And the device has a pending deployment from Pantacor
When the user requests to view the device
Then the request will be successful
And the response contains the pending deployment ID
Scenario: User elects to apply a software update
Given an account
And the account is authenticated
And a device using Pantacor for continuous delivery
And the device has a pending deployment from Pantacor
When the user selects to apply the update
Then the request will be successful
Scenario: User enters a valid SSH key
Given an account
And the account is authenticated
And a device using Pantacor for continuous delivery
When the user enters a well formed RSA SSH key
Then the request will be successful
And the response indicates that the SSH key is properly formatted
Scenario: User enters an invalid SSH key
Given an account
And the account is authenticated
And a device using Pantacor for continuous delivery
When the user enters a malformed RSA SSH key
Then the request will be successful
And the response indicates that the SSH key is malformed

View File

@ -1,4 +1,4 @@
Feature: Manage account profiles
Feature: Account API -- Manage account profiles
Test the ability of the account API to retrieve and manage a user's profile
settings.
@ -44,3 +44,26 @@ Feature: Manage account profiles
Then the request will be successful
And the account will not have a open dataset agreement
And the deleted agreement will be reflected in the account activity metrics
Scenario: User changes password
Given a user who authenticates with a password
And the account is authenticated
When the user changes their password
Then the request will be successful
And the password on the account will be changed
And a password change notification will be sent
Scenario: User changes email address
Given a user who authenticates with a password
And the account is authenticated
When the user changes their email address
Then the request will be successful
And an email change notification will be sent to the old email address
And an email change verification message will be sent to the new email address
Scenario: User changes email address to a value assigned to an existing account
Given a user who authenticates with a password
And the account is authenticated
When the user changes their email address to that of an existing account
Then the request will be successful
And a duplicate email address error is returned

View File

@ -1,9 +1,24 @@
Feature: Delete an account
Feature: Account API -- Delete an account
Test the API call to delete an account and all its related data from the database.
Scenario: Successful account deletion
Given an account
And the account is authenticated
When a user requests to delete their account
Then the request will be successful
And the user's account is deleted
And the deleted account will be reflected in the account activity metrics
Scenario: Membership removed upon account deletion
Given an account with a monthly membership
When the user's account is deleted
When a user requests to delete their account
Then the request will be successful
And the membership is removed from stripe
And the deleted account will be reflected in the account activity metrics
Scenario: Wake word files removed upon account deletion
Given an account opted into the Open Dataset agreement
And a wake word sample contributed by the user
And the account is authenticated
When a user requests to delete their account
Then the request will be successful
And the wake word contributions are flagged for deletion

View File

@ -16,21 +16,27 @@
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Python code to support the add device feature."""
import json
from behave import given, when, then
from behave import given, when, then # pylint: disable=no-name-in-module
from hamcrest import assert_that, equal_to, none, not_none
from selene.data.device import DeviceRepository
from selene.util.cache import SeleneCache
from selene.util.cache import (
DEVICE_PAIRING_CODE_KEY,
DEVICE_PAIRING_TOKEN_KEY,
SeleneCache,
)
from selene.util.db import connect_to_db
@given("a device pairing code")
def set_device_pairing_code(context):
"""Add dummy data to the Redis cache for the test."""
pairing_data = dict(
code="ABC123",
packaging_type="pantacor",
state="this is a state",
token="this is a token",
expiration=84600,
@ -45,6 +51,7 @@ def set_device_pairing_code(context):
@when("an API request is sent to add a device")
def add_device(context):
"""Call the endpoint to add a device based on user input."""
device = dict(
city="Kansas City",
country="United States",
@ -53,43 +60,52 @@ def add_device(context):
placement="Mycroft Offices",
region="Missouri",
timezone="America/Chicago",
wakeWord="Selene Test Wake Word",
wakeWord="hey selene",
voice="Selene Test Voice",
)
response = context.client.post(
"/api/devices", data=json.dumps(device), content_type="application_json"
"/api/devices", data=json.dumps(device), content_type="application/json"
)
context.response = response
@then("the pairing code is removed from cache")
def validate_pairing_code_removal(context):
"""Ensure that the endpoint removed the pairing code entry from the cache."""
cache = SeleneCache()
pairing_data = cache.get("pairing.code:ABC123")
pairing_data = cache.get(
DEVICE_PAIRING_CODE_KEY.format(pairing_code=context.pairing_code)
)
assert_that(pairing_data, none())
@then("the device is added to the database")
def validate_response(context):
device_id = context.response.data.decode()
"""Ensure that the database was updated as expected."""
account = context.accounts["foo"]
db = connect_to_db(context.client_config["DB_CONNECTION_CONFIG"])
device_repository = DeviceRepository(db)
device = device_repository.get_device_by_id(device_id)
devices = device_repository.get_devices_by_account_id(account.id)
device = None
for device in devices:
if device.name == "Selene Test Device":
break
assert_that(device, not_none())
assert_that(device.name, equal_to("Selene Test Device"))
assert_that(device.placement, equal_to("Mycroft Offices"))
assert_that(device.account_id, equal_to(account.id))
context.device_id = device.id
@then("the pairing token is added to cache")
def validate_pairing_token(context):
device_id = context.response.data.decode()
"""Validate the pairing token data was added to the cache as expected."""
cache = SeleneCache()
pairing_data = cache.get("pairing.token:this is a token")
pairing_data = cache.get(
DEVICE_PAIRING_TOKEN_KEY.format(pairing_token="this is a token")
)
pairing_data = json.loads(pairing_data)
assert_that(pairing_data["uuid"], equal_to(device_id))
assert_that(pairing_data["uuid"], equal_to(context.device_id))
assert_that(pairing_data["state"], equal_to(context.pairing_data["state"]))
assert_that(pairing_data["token"], equal_to(context.pairing_data["token"]))

View File

@ -26,19 +26,19 @@ from hamcrest import assert_that, equal_to
from selene.data.account import PRIVACY_POLICY, TERMS_OF_USE
@when('API request for {agreement} is made')
@when("API request for {agreement} is made")
def call_agreement_endpoint(context, agreement):
if agreement == PRIVACY_POLICY:
url = '/api/agreement/privacy-policy'
url = "/api/agreement/privacy-policy"
elif agreement == TERMS_OF_USE:
url = '/api/agreement/terms-of-use'
url = "/api/agreement/terms-of-use"
else:
raise ValueError('invalid agreement type')
raise ValueError("invalid agreement type")
context.response = context.client.get(url)
@then('{agreement} version {version} is returned')
@then("{agreement} version {version} is returned")
def validate_response(context, agreement, version):
response_data = json.loads(context.response.data)
if agreement == PRIVACY_POLICY:
@ -46,7 +46,7 @@ def validate_response(context, agreement, version):
elif agreement == TERMS_OF_USE:
expected_response = asdict(context.terms_of_use)
else:
raise ValueError('invalid agreement type')
raise ValueError("invalid agreement type")
del(expected_response['effective_date'])
del expected_response["effective_date"]
assert_that(response_data, equal_to(expected_response))

View File

@ -25,79 +25,62 @@ from selene.testing.api import (
generate_refresh_token,
set_access_token_cookie,
set_refresh_token_cookie,
validate_token_cookies
validate_token_cookies,
)
from selene.util.auth import AuthenticationToken
EXPIRE_IMMEDIATELY = 0
@given('an account with a valid access token')
@given("an account with a valid access token")
def use_account_with_valid_access_token(context):
context.username = 'foo'
context.username = "foo"
context.access_token = generate_access_token(context)
set_access_token_cookie(context)
context.refresh_token = generate_refresh_token(context)
set_refresh_token_cookie(context)
@given('an account with an expired access token')
@given("an account with an expired access token")
def generate_expired_access_token(context):
context.username = 'foo'
context.access_token = generate_access_token(
context,
duration=EXPIRE_IMMEDIATELY
)
context.username = "foo"
context.access_token = generate_access_token(context, duration=EXPIRE_IMMEDIATELY)
set_access_token_cookie(context, duration=EXPIRE_IMMEDIATELY)
context.refresh_token = generate_refresh_token(context)
set_refresh_token_cookie(context)
context.old_refresh_token = context.refresh_token.jwt
@given('an account with a refresh token but no access token')
@given("an account with a refresh token but no access token")
def generate_refresh_token_only(context):
context.username = 'foo'
context.username = "foo"
context.refresh_token = generate_refresh_token(context)
set_refresh_token_cookie(context)
context.old_refresh_token = context.refresh_token.jwt
@given('an account with expired access and refresh tokens')
@given("an account with expired access and refresh tokens")
def expire_both_tokens(context):
context.username = 'foo'
context.access_token = generate_access_token(
context,
duration=EXPIRE_IMMEDIATELY
)
context.username = "foo"
context.access_token = generate_access_token(context, duration=EXPIRE_IMMEDIATELY)
set_access_token_cookie(context, duration=EXPIRE_IMMEDIATELY)
context.refresh_token = generate_refresh_token(
context,
duration=EXPIRE_IMMEDIATELY
)
context.refresh_token = generate_refresh_token(context, duration=EXPIRE_IMMEDIATELY)
set_refresh_token_cookie(context, duration=EXPIRE_IMMEDIATELY)
@then('the authentication tokens will remain unchanged')
@then("the authentication tokens will remain unchanged")
def check_for_no_new_cookie(context):
cookies = context.response.headers.getlist('Set-Cookie')
cookies = context.response.headers.getlist("Set-Cookie")
assert_that(cookies, equal_to([]))
@then('the authentication tokens will be refreshed')
@then("the authentication tokens will be refreshed")
def check_for_new_cookies(context):
validate_token_cookies(context)
assert_that(
context.refresh_token,
is_not(equal_to(context.old_refresh_token))
)
refresh_token = AuthenticationToken(
context.client_config['REFRESH_SECRET'],
0
)
assert_that(context.refresh_token, is_not(equal_to(context.old_refresh_token)))
refresh_token = AuthenticationToken(context.client_config["REFRESH_SECRET"], 0)
refresh_token.jwt = context.refresh_token
refresh_token.validate()
assert_that(refresh_token.is_valid, equal_to(True))
assert_that(refresh_token.is_expired, equal_to(False))
assert_that(
refresh_token.account_id,
equal_to(context.accounts['foo'].id))
assert_that(refresh_token.is_valid, equal_to(True), "refresh token valid")
assert_that(refresh_token.is_expired, equal_to(False), "refresh token expired")
assert_that(refresh_token.account_id, equal_to(context.accounts["foo"].id))

View File

@ -0,0 +1,126 @@
# Mycroft Server - Backend
# Copyright (C) 2019 Mycroft AI Inc
# SPDX-License-Identifier: AGPL-3.0-or-later
#
# This file is part of the Mycroft Server.
#
# The Mycroft Server is free software: you can redistribute it and/or
# modify it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Step functions for applying a software update via the account API."""
import json
from unittest.mock import MagicMock, patch
from behave import given, then, when # pylint: disable=no-name-in-module
from hamcrest import assert_that, equal_to
from selene.testing.device import add_device, add_pantacor_config
@given("a device using Pantacor for continuous delivery")
def add_pantacor_device(context):
"""Add a device with a Pantacor config and software update set to manual."""
context.device_id = add_device(
context.db, context.accounts["foo"].id, context.geography_id
)
add_pantacor_config(context.db, context.device_id)
@given("the device has pending deployment from Pantacor")
def add_pantacor_deployment_id(context):
"""Add a dummy deployment ID to the context for use later in tests."""
context.deployment_id = "test_deployment_id"
@when("the user selects to apply the update")
def apply_software_update(context):
"""Make an API call to apply the software update.
The Pantacor API code is patched because there is currently no way to call it
reliably with a test device.
"""
with patch("requests.request") as request_patch:
apply_update_response = MagicMock(spec=["ok", "content"])
apply_update_response.ok = True
apply_update_response.content = '{"response":"ok"}'.encode()
request_patch.side_effect = [apply_update_response]
request_data = dict(deploymentId=context.deployment_id)
response = context.client.patch(
"/api/software-update",
data=json.dumps(request_data),
content_type="application/json",
)
context.response = response
@when("the user enters a malformed RSA SSH key")
def validate_invalid_ssh_key(context):
"""Make an API call to check the validity of a RSA SSH key."""
response = context.client.get(
"/api/ssh-key?key=foo", content_type="application/json"
)
context.response = response
@when("the user enters a well formed RSA SSH key")
def validate_valid_ssh_key(context):
"""Make an API call to check the validity of a RSA SSH key."""
response = context.client.get(
"/api/ssh-key?key=ssh-rsa%20AAAAB3NzaC1yc2EAAAADAQABAAACAQDEwmtmRho==%20foo",
content_type="application/json",
)
context.response = response
@when("the user requests to view the device")
def get_device(context):
"""Make an API call to get device data, including a software update ID.
The Pantacor API code is patched because there is currently no way to call it
reliably with a test device.
"""
with patch("requests.request") as request_patch:
api_response = MagicMock(spec=["ok", "content"])
api_response.ok = True
deployment = dict(id="test_deployment_id")
get_deployment_content = dict(items=[deployment])
api_response.content = json.dumps(get_deployment_content).encode()
request_patch.side_effect = [api_response]
response = context.client.get(
"/api/devices/" + context.device_id, content_type="application/json"
)
context.response = response
@then("the response contains the pending deployment ID")
def check_for_deployment_id(context):
"""Check the response of the device query to ensure the update ID is populated."""
device_attributes = context.response.json
assert_that(
device_attributes["pantacorConfig"]["deploymentId"],
equal_to("test_deployment_id"),
)
@then("the response indicates that the SSH key is malformed")
def check_for_malformed_ssh_key(context):
"""Ensure the response indicates the SSH key passed on the URL is invalid"""
response = context.response
assert_that(response.json, equal_to(dict(isValid=False)))
@then("the response indicates that the SSH key is properly formatted")
def check_for_well_formed_ssh_key(context):
"""Ensure the response indicates the SSH key passed on the URL is valid"""
response = context.response
assert_that(response.json, equal_to(dict(isValid=True)))

View File

@ -19,7 +19,10 @@
"""Step functions for maintaining an account profile via the account API."""
import json
from binascii import b2a_base64
from datetime import datetime
from os import environ
from unittest.mock import patch
from behave import given, then, when # pylint: disable=no-name-in-module
from hamcrest import (
@ -29,6 +32,7 @@ from hamcrest import (
has_item,
is_in,
none,
not_none,
starts_with,
)
@ -47,6 +51,7 @@ from selene.testing.api import (
set_refresh_token_cookie,
)
from selene.testing.membership import MONTHLY_MEMBERSHIP, YEARLY_MEMBERSHIP
from selene.util.email import EmailMessage
BAR_EMAIL_ADDRESS = "bar@mycroft.ai"
STRIPE_METHOD = "Stripe"
@ -65,6 +70,11 @@ def add_membership_to_account(context):
context.refresh_token = generate_refresh_token(context)
set_refresh_token_cookie(context)
_add_membership_via_api(context)
acct_repository = AccountRepository(context.db)
membership = acct_repository.get_active_account_membership(
context.accounts["foo"].id
)
context.accounts["foo"].membership = membership
@given("an account without a membership")
@ -83,6 +93,62 @@ def set_account_open_dataset(context, in_or_out):
account_repo.expire_open_dataset_agreement(account.id)
@given("a user who authenticates with a password")
def setup_user(context):
"""Set user context for use in other steps."""
context.username = "foo"
context.password = "barfoo"
@when("the user changes their password")
def call_password_change_endpoint(context):
"""Call the password change endpoint for the single sign on API."""
change_password_request = dict(
password=b2a_base64(context.password.encode()).decode()
)
with patch("account_api.endpoints.change_password.SeleneMailer") as email_mock:
response = context.client.put(
"/api/change-password",
data=json.dumps(change_password_request),
content_type="application/json",
)
context.response = response
context.email_mock = email_mock
@when("the user changes their email address")
def call_email_address_change_endpoint(context):
"""Call the password change endpoint for the single sign on API."""
context.new_email_address = "bar@mycroft.ai"
encoded_email_address = context.new_email_address.encode()
context.email_verification_token = b2a_base64(
encoded_email_address, newline=False
).decode()
change_email_request = dict(token=context.email_verification_token)
with patch("account_api.endpoints.change_email_address.SeleneMailer") as email_mock:
response = context.client.put(
"/api/change-email",
data=json.dumps(change_email_request),
content_type="application/json",
)
context.response = response
context.email_mock = email_mock
@when("the user changes their email address to that of an existing account")
def call_email_validation_endpoint(context):
"""Call the email validation endpoint on the account API."""
existing_account = context.accounts["foo"]
email_address = existing_account.email_address.encode()
token = b2a_base64(email_address).decode()
context.client.content_type = "application/json"
response = context.client.get(
f"/api/validate-email?platform=Internal&token={token}"
)
context.response = response
@when("a user requests their profile")
def call_account_endpoint(context):
"""Issue API call to retrieve account profile."""
@ -100,7 +166,7 @@ def add_monthly_membership(context):
@when("the membership is cancelled")
def cancel_membership(context):
"""Issue API call to cancel and account's membership."""
membership_data = dict(newMembership=False, membershipType=None)
membership_data = dict(action="cancel")
context.response = context.client.patch(
"/api/account",
data=json.dumps(dict(membership=membership_data)),
@ -111,7 +177,7 @@ def cancel_membership(context):
def _add_membership_via_api(context):
"""Helper function to add account membership via API call"""
membership_data = dict(
newMembership=True,
action="add",
membershipType=MONTHLY_MEMBERSHIP,
paymentMethod=STRIPE_METHOD,
paymentToken=VISA_TOKEN,
@ -126,7 +192,7 @@ def _add_membership_via_api(context):
@when("the membership is changed to yearly")
def change_to_yearly_account(context):
"""Issue API call to change a monthly membership to a yearly membership."""
membership_data = dict(newMembership=False, membershipType=YEARLY_MEMBERSHIP)
membership_data = dict(action="update", membershipType=YEARLY_MEMBERSHIP)
context.response = context.client.patch(
"/api/account",
data=json.dumps(dict(membership=membership_data)),
@ -222,7 +288,8 @@ def check_expired_member_account_metrics(context):
# Membership was added in a previous step so rather than the membership being
# decreased by one, it would net to being the same after the expiration.
assert_that(
account_activity.members, equal_to(context.account_activity.members),
account_activity.members,
equal_to(context.account_activity.members),
)
assert_that(
account_activity.members_expired,
@ -254,3 +321,77 @@ def check_new_open_dataset_account_metrics(context):
def check_deleted_open_dataset_account_metrics(context):
"""Ensure a new agreement is accurately reflected in the metrics."""
check_account_metrics(context, "open_dataset", "open_dataset_deleted")
@then("the password on the account will be changed")
def check_new_password(context):
"""Retrieves the account with the new password to verify it was changed."""
acct_repository = AccountRepository(context.db)
test_account = context.accounts["foo"]
account = acct_repository.get_account_from_credentials(
test_account.email_address, context.password
)
assert_that(account, not_none())
@then("a duplicate email address error is returned")
def check_for_duplicate_account_error(context):
"""Check the API response for an "account exists" error."""
response = context.response
assert_that(response.json["accountExists"], equal_to(True))
@then("an password change notification will be sent")
def check_password_change_notification_sent(context):
"""Ensures the email change notification message was sent.
Using a mock for email as we don't want to be sending emails every time the tests
run.
"""
email_mock = context.email_mock
notification_email = EmailMessage(
recipient=context.accounts["foo"].email_address,
sender="Mycroft AI<no-reply@mycroft.ai>",
subject="Password Changed",
template_file_name="password_change.html",
)
email_mock.assert_any_call(notification_email)
@then("an email change notification will be sent to the old email address")
def check_email_change_notification_sent(context):
"""Ensures the email change notification message was sent.
Using a mock for email as we don't want to be sending emails every time the tests
run.
"""
email_mock = context.email_mock
notification_email = EmailMessage(
recipient=context.accounts["foo"].email_address,
sender="Mycroft AI<no-reply@mycroft.ai>",
subject="Email Address Changed",
template_file_name="email_change.html",
)
email_mock.assert_any_call(notification_email)
@then("an email change verification message will be sent to the new email address")
def check_new_email_verification_sent(context):
"""Ensures the new email verification message was sent.
Using a mock for email as we don't want to be sending emails every time the tests
run.
"""
email_mock = context.email_mock
url = (
f"{environ['ACCOUNT_BASE_URL']}/verify-email?"
+ f"token={context.email_verification_token}"
)
verification_email = EmailMessage(
recipient=context.new_email_address,
sender="Mycroft AI<no-reply@mycroft.ai>",
subject="Email Change Verification",
template_file_name="email_verification.html",
template_variables=dict(email_verification_url=url),
)
email_mock.assert_any_call(verification_email)
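For reference, the verification token exercised by these steps is simply the base64-encoded new email address. A minimal standalone sketch of that round trip, using only the standard library and a hypothetical address:
from binascii import a2b_base64, b2a_base64
new_email_address = "bar@mycroft.ai"  # hypothetical value mirroring the test step
token = b2a_base64(new_email_address.encode(), newline=False).decode()
# The change-email endpoint is expected to decode the token back into the address.
assert a2b_base64(token).decode() == new_email_address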

View File

@ -21,23 +21,56 @@ import os
from datetime import datetime
import stripe
from behave import then, when # pylint: disable=no-name-in-module
from hamcrest import assert_that, equal_to
from behave import given, then, when # pylint: disable=no-name-in-module
from hamcrest import assert_that, equal_to, is_in, none
from stripe.error import InvalidRequestError
from selene.data.account import AccountRepository
from selene.data.metric import AccountActivityRepository
from selene.data.tagging import (
PENDING_DELETE_STATUS,
TaggingFileLocation,
TaggingFileLocationRepository,
WakeWordFile,
WakeWordFileRepository,
)
@when("the user's account is deleted")
@given("a wake word sample contributed by the user")
def add_wake_word_sample(context):
"""Add a sample wake word file to the database to be queried by future steps."""
file_repository = WakeWordFileRepository(context.db)
location_repository = TaggingFileLocationRepository(context.db)
location = TaggingFileLocation(server="127.0.0.1", directory="/opt/selene/data")
location.id = location_repository.add(location)
wake_word_file = WakeWordFile(
wake_word=context.wake_word,
name="test.wav",
origin="mycroft",
submission_date=datetime.utcnow().date(),
account_id=context.accounts["foo"].id,
status="uploaded",
location=location,
)
file_repository.add(wake_word_file)
file_repository.change_file_status(wake_word_file, PENDING_DELETE_STATUS)
context.wake_word_file = wake_word_file
@when("a user requests to delete their account")
def call_account_endpoint(context):
"""Issue API call to delete an account."""
context.response = context.client.delete("/api/account")
@then("the user's account is deleted")
def account_deleted(context):
"""Ensure account no longer exists in database."""
acct_repository = AccountRepository(context.db)
membership = acct_repository.get_active_account_membership(
context.accounts["foo"].id
)
context.accounts["foo"].membership = membership
context.response = context.client.delete("/api/account")
deleted_account = context.accounts["foo"]
account_in_db = acct_repository.get_account_by_id(deleted_account.id)
assert_that(account_in_db, none())
@then("the membership is removed from stripe")
@ -67,3 +100,12 @@ def check_db_for_account_metrics(context):
account_activity.accounts_deleted,
equal_to(context.account_activity.accounts_deleted + 1),
)
@then("the wake word contributions are flagged for deletion")
def check_wake_word_file_status(context):
"""An account that contributed wake word samples has those samples removed."""
deleted_account = context.accounts["foo"]
file_repository = WakeWordFileRepository(context.db)
files_pending_delete = file_repository.get_pending_delete()
assert_that(deleted_account.id, is_in(files_pending_delete))

View File

@ -1,17 +0,0 @@
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"
[packages]
flask = "<1.1"
requests = "*"
pyjwt = "*"
uwsgi = "*"
markdown = "*"
selene = {editable = true, path = "./../../shared"}
[dev-packages]
[requires]
python_version = "3.7"

274
api/market/Pipfile.lock generated
View File

@ -1,274 +0,0 @@
{
"_meta": {
"hash": {
"sha256": "8384eaea32c04faedff8a5da354330595d332d8a676a09e427a1402644d91544"
},
"pipfile-spec": 6,
"requires": {
"python_version": "3.7"
},
"sources": [
{
"name": "pypi",
"url": "https://pypi.org/simple",
"verify_ssl": true
}
]
},
"default": {
"certifi": {
"hashes": [
"sha256:046832c04d4e752f37383b628bc601a7ea7211496b4638f6514d0e5b9acc4939",
"sha256:945e3ba63a0b9f577b1395204e13c3a231f9bc0223888be653286534e5873695"
],
"version": "==2019.6.16"
},
"chardet": {
"hashes": [
"sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae",
"sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691"
],
"version": "==3.0.4"
},
"click": {
"hashes": [
"sha256:2335065e6395b9e67ca716de5f7526736bfa6ceead690adf616d925bdc622b13",
"sha256:5b94b49521f6456670fdb30cd82a4eca9412788a93fa6dd6df72c94d5a8ff2d7"
],
"version": "==7.0"
},
"deprecated": {
"hashes": [
"sha256:a515c4cf75061552e0284d123c3066fbbe398952c87333a92b8fc3dd8e4f9cc1",
"sha256:b07b414c8aac88f60c1d837d21def7e83ba711052e03b3cbaff27972567a8f8d"
],
"version": "==1.2.6"
},
"facebook-sdk": {
"hashes": [
"sha256:2e987b3e0f466a6f4ee77b935eb023dba1384134f004a2af21f1cfff7fe0806e",
"sha256:cabcd2e69ea3d9f042919c99b353df7aa1e2be86d040121f6e9f5e63c1cf0f8d"
],
"version": "==3.1.0"
},
"flask": {
"hashes": [
"sha256:1a21ccca71cee5e55b6a367cc48c6eb47e3c447f76e64d41f3f3f931c17e7c96",
"sha256:ed1330220a321138de53ec7c534c3d90cf2f7af938c7880fc3da13aa46bf870f"
],
"index": "pypi",
"version": "==1.0.4"
},
"idna": {
"hashes": [
"sha256:c357b3f628cf53ae2c4c05627ecc484553142ca23264e593d327bcde5e9c3407",
"sha256:ea8b7f6188e6fa117537c3df7da9fc686d485087abf6ac197f9c46432f7e4a3c"
],
"version": "==2.8"
},
"itsdangerous": {
"hashes": [
"sha256:321b033d07f2a4136d3ec762eac9f16a10ccd60f53c0c91af90217ace7ba1f19",
"sha256:b12271b2047cb23eeb98c8b5622e2e5c5e9abd9784a153e9d8ef9cb4dd09d749"
],
"version": "==1.1.0"
},
"jinja2": {
"hashes": [
"sha256:065c4f02ebe7f7cf559e49ee5a95fb800a9e4528727aec6f24402a5374c65013",
"sha256:14dd6caf1527abb21f08f86c784eac40853ba93edb79552aa1e4b8aef1b61c7b"
],
"version": "==2.10.1"
},
"markdown": {
"hashes": [
"sha256:2e50876bcdd74517e7b71f3e7a76102050edec255b3983403f1a63e7c8a41e7a",
"sha256:56a46ac655704b91e5b7e6326ce43d5ef72411376588afa1dd90e881b83c7e8c"
],
"index": "pypi",
"version": "==3.1.1"
},
"markupsafe": {
"hashes": [
"sha256:00bc623926325b26bb9605ae9eae8a215691f33cae5df11ca5424f06f2d1f473",
"sha256:09027a7803a62ca78792ad89403b1b7a73a01c8cb65909cd876f7fcebd79b161",
"sha256:09c4b7f37d6c648cb13f9230d847adf22f8171b1ccc4d5682398e77f40309235",
"sha256:1027c282dad077d0bae18be6794e6b6b8c91d58ed8a8d89a89d59693b9131db5",
"sha256:24982cc2533820871eba85ba648cd53d8623687ff11cbb805be4ff7b4c971aff",
"sha256:29872e92839765e546828bb7754a68c418d927cd064fd4708fab9fe9c8bb116b",
"sha256:43a55c2930bbc139570ac2452adf3d70cdbb3cfe5912c71cdce1c2c6bbd9c5d1",
"sha256:46c99d2de99945ec5cb54f23c8cd5689f6d7177305ebff350a58ce5f8de1669e",
"sha256:500d4957e52ddc3351cabf489e79c91c17f6e0899158447047588650b5e69183",
"sha256:535f6fc4d397c1563d08b88e485c3496cf5784e927af890fb3c3aac7f933ec66",
"sha256:62fe6c95e3ec8a7fad637b7f3d372c15ec1caa01ab47926cfdf7a75b40e0eac1",
"sha256:6dd73240d2af64df90aa7c4e7481e23825ea70af4b4922f8ede5b9e35f78a3b1",
"sha256:717ba8fe3ae9cc0006d7c451f0bb265ee07739daf76355d06366154ee68d221e",
"sha256:79855e1c5b8da654cf486b830bd42c06e8780cea587384cf6545b7d9ac013a0b",
"sha256:7c1699dfe0cf8ff607dbdcc1e9b9af1755371f92a68f706051cc8c37d447c905",
"sha256:88e5fcfb52ee7b911e8bb6d6aa2fd21fbecc674eadd44118a9cc3863f938e735",
"sha256:8defac2f2ccd6805ebf65f5eeb132adcf2ab57aa11fdf4c0dd5169a004710e7d",
"sha256:98c7086708b163d425c67c7a91bad6e466bb99d797aa64f965e9d25c12111a5e",
"sha256:9add70b36c5666a2ed02b43b335fe19002ee5235efd4b8a89bfcf9005bebac0d",
"sha256:9bf40443012702a1d2070043cb6291650a0841ece432556f784f004937f0f32c",
"sha256:ade5e387d2ad0d7ebf59146cc00c8044acbd863725f887353a10df825fc8ae21",
"sha256:b00c1de48212e4cc9603895652c5c410df699856a2853135b3967591e4beebc2",
"sha256:b1282f8c00509d99fef04d8ba936b156d419be841854fe901d8ae224c59f0be5",
"sha256:b2051432115498d3562c084a49bba65d97cf251f5a331c64a12ee7e04dacc51b",
"sha256:ba59edeaa2fc6114428f1637ffff42da1e311e29382d81b339c1817d37ec93c6",
"sha256:c8716a48d94b06bb3b2524c2b77e055fb313aeb4ea620c8dd03a105574ba704f",
"sha256:cd5df75523866410809ca100dc9681e301e3c27567cf498077e8551b6d20e42f",
"sha256:e249096428b3ae81b08327a63a485ad0878de3fb939049038579ac0ef61e17e7"
],
"version": "==1.1.1"
},
"passlib": {
"hashes": [
"sha256:3d948f64138c25633613f303bcc471126eae67c04d5e3f6b7b8ce6242f8653e0",
"sha256:43526aea08fa32c6b6dbbbe9963c4c767285b78147b7437597f992812f69d280"
],
"version": "==1.7.1"
},
"psycopg2-binary": {
"hashes": [
"sha256:080c72714784989474f97be9ab0ddf7b2ad2984527e77f2909fcd04d4df53809",
"sha256:110457be80b63ff4915febb06faa7be002b93a76e5ba19bf3f27636a2ef58598",
"sha256:171352a03b22fc099f15103959b52ee77d9a27e028895d7e5fde127aa8e3bac5",
"sha256:19d013e7b0817087517a4b3cab39c084d78898369e5c46258aab7be4f233d6a1",
"sha256:249b6b21ae4eb0f7b8423b330aa80fab5f821b9ffc3f7561a5e2fd6bb142cf5d",
"sha256:2ac0731d2d84b05c7bb39e85b7e123c3a0acd4cda631d8d542802c88deb9e87e",
"sha256:2b6d561193f0dc3f50acfb22dd52ea8c8dfbc64bcafe3938b5f209cc17cb6f00",
"sha256:2bd23e242e954214944481124755cbefe7c2cf563b1a54cd8d196d502f2578bf",
"sha256:3e1239242ca60b3725e65ab2f13765fc199b03af9eaf1b5572f0e97bdcee5b43",
"sha256:3eb70bb697abbe86b1d2b1316370c02ba320bfd1e9e35cf3b9566a855ea8e4e5",
"sha256:51a2fc7e94b98bd1bb5d4570936f24fc2b0541b63eccadf8fdea266db8ad2f70",
"sha256:52f1bdafdc764b7447e393ed39bb263eccb12bfda25a4ac06d82e3a9056251f6",
"sha256:5b3581319a3951f1e866f4f6c5e42023db0fae0284273b82e97dfd32c51985cd",
"sha256:63c1b66e3b2a3a336288e4bcec499e0dc310cd1dceaed1c46fa7419764c68877",
"sha256:8123a99f24ecee469e5c1339427bcdb2a33920a18bb5c0d58b7c13f3b0298ba3",
"sha256:85e699fcabe7f817c0f0a412d4e7c6627e00c412b418da7666ff353f38e30f67",
"sha256:8dbff4557bbef963697583366400822387cccf794ccb001f1f2307ed21854c68",
"sha256:908d21d08d6b81f1b7e056bbf40b2f77f8c499ab29e64ec5113052819ef1c89b",
"sha256:af39d0237b17d0a5a5f638e9dffb34013ce2b1d41441fd30283e42b22d16858a",
"sha256:af51bb9f055a3f4af0187149a8f60c9d516cf7d5565b3dac53358796a8fb2a5b",
"sha256:b2ecac57eb49e461e86c092761e6b8e1fd9654dbaaddf71a076dcc869f7014e2",
"sha256:cd37cc170678a4609becb26b53a2bc1edea65177be70c48dd7b39a1149cabd6e",
"sha256:d17e3054b17e1a6cb8c1140f76310f6ede811e75b7a9d461922d2c72973f583e",
"sha256:d305313c5a9695f40c46294d4315ed3a07c7d2b55e48a9010dad7db7a66c8b7f",
"sha256:dd0ef0eb1f7dd18a3f4187226e226a7284bda6af5671937a221766e6ef1ee88f",
"sha256:e1adff53b56db9905db48a972fb89370ad5736e0450b96f91bcf99cadd96cfd7",
"sha256:f0d43828003c82dbc9269de87aa449e9896077a71954fbbb10a614c017e65737",
"sha256:f78e8b487de4d92640105c1389e5b90be3496b1d75c90a666edd8737cc2dbab7"
],
"version": "==2.8.3"
},
"pygithub": {
"hashes": [
"sha256:db415a5aeb5ab1e4a3263b1a091b4f9ffbd85a12a06a0303d5bf083ce7c1b2c8"
],
"version": "==1.43.8"
},
"pyhamcrest": {
"hashes": [
"sha256:6b672c02fdf7470df9674ab82263841ce8333fb143f32f021f6cb26f0e512420",
"sha256:8ffaa0a53da57e89de14ced7185ac746227a8894dbd5a3c718bf05ddbd1d56cd"
],
"version": "==1.9.0"
},
"pyjwt": {
"hashes": [
"sha256:5c6eca3c2940464d106b99ba83b00c6add741c9becaec087fb7ccdefea71350e",
"sha256:8d59a976fb773f3e6a39c85636357c4f0e242707394cadadd9814f5cbaa20e96"
],
"index": "pypi",
"version": "==1.7.1"
},
"python-http-client": {
"hashes": [
"sha256:7e430f4b9dd2b621b0051f6a362f103447ea8e267594c602a5c502a0c694ee38"
],
"version": "==3.1.0"
},
"redis": {
"hashes": [
"sha256:0607faf60d44768e17f65e506fe390679b54be6fd6d5f0c2d28f3ebf4f0535e7",
"sha256:9c96c5bf11a8c47eb33cefdefd41c47cf1ff68db41c51b56b3ec7938b7c627f7"
],
"version": "==3.3.7"
},
"requests": {
"hashes": [
"sha256:11e007a8a2aa0323f5a921e9e6a2d7e4e67d9877e85773fba9ba6419025cbeb4",
"sha256:9cf5292fcd0f598c671cfc1e0d7d1a7f13bb8085e9a590f48c010551dc6c4b31"
],
"index": "pypi",
"version": "==2.22.0"
},
"schedule": {
"hashes": [
"sha256:3f895a1036799a25ab9c335de917073e63cf8256920917e932777382f101f08f",
"sha256:f9fb5181283de4db6e701d476dd01b6a3dd81c38462a54991ddbb9d26db857c9"
],
"version": "==0.6.0"
},
"schematics": {
"hashes": [
"sha256:8fcc6182606fd0b24410a1dbb066d9bbddbe8da9c9509f47b743495706239283",
"sha256:a40b20635c0e43d18d3aff76220f6cd95ea4decb3f37765e49529b17d81b0439"
],
"version": "==2.1.0"
},
"selene": {
"editable": true,
"path": "./../../shared"
},
"sendgrid": {
"hashes": [
"sha256:297d33363a70df9b39419210e1273b165d487730e85c495695e0015bc626db71",
"sha256:8b82c8c801dde8180a567913a9f80d8a63f38e39f209edde302b6df899b4bca1"
],
"version": "==6.0.5"
},
"six": {
"hashes": [
"sha256:3350809f0555b11f552448330d0b52d5f24c91a322ea4a15ef22629740f3761c",
"sha256:d16a0141ec1a18405cd4ce8b4613101da75da0e9a7aec5bdd4fa804d0e0eba73"
],
"version": "==1.12.0"
},
"stripe": {
"hashes": [
"sha256:344cd691a542f08c508b9d12ac201da46b7f0f21a0a7f72f56199b3baee795eb",
"sha256:e07efa567ae0831fe351ddb49de074aa1681569fd234d4f1dc0a9f7f4c017820"
],
"version": "==2.35.0"
},
"urllib3": {
"hashes": [
"sha256:b246607a25ac80bedac05c6f282e3cdaf3afb65420fd024ac94435cabe6e18d1",
"sha256:dbe59173209418ae49d485b87d1681aefa36252ee85884c31346debd19463232"
],
"version": "==1.25.3"
},
"uwsgi": {
"hashes": [
"sha256:4972ac538800fb2d421027f49b4a1869b66048839507ccf0aa2fda792d99f583"
],
"index": "pypi",
"version": "==2.0.18"
},
"werkzeug": {
"hashes": [
"sha256:87ae4e5b5366da2347eb3116c0e6c681a0e939a33b2805e2c0cbd282664932c4",
"sha256:a13b74dd3c45f758d4ebdb224be8f1ab8ef58b3c0ffc1783a8c7d9f4f50227e6"
],
"version": "==0.15.5"
},
"wrapt": {
"hashes": [
"sha256:565a021fd19419476b9362b05eeaa094178de64f8361e44468f9e9d7843901e1"
],
"version": "==1.11.2"
}
},
"develop": {}
}

View File

@ -16,4 +16,3 @@
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.

View File

@ -23,55 +23,41 @@ from flask import Flask
from selene.api import get_base_config, selene_api, SeleneResponse
from selene.api.endpoints import AccountEndpoint
from selene.util.cache import SeleneCache
from selene.util.log import configure_logger
from selene.util.log import configure_selene_logger
from .endpoints import (
AvailableSkillsEndpoint,
SkillDetailEndpoint,
SkillInstallEndpoint,
SkillInstallStatusEndpoint
SkillInstallStatusEndpoint,
)
_log = configure_logger('market_api')
configure_selene_logger("market_api")
# Define the Flask application
market = Flask(__name__)
market.config.from_object(get_base_config())
market.response_class = SeleneResponse
market.register_blueprint(selene_api)
market.config['SELENE_CACHE'] = SeleneCache()
market.config["SELENE_CACHE"] = SeleneCache()
# Define the API and its endpoints.
account_endpoint = AccountEndpoint.as_view('account_endpoint')
account_endpoint = AccountEndpoint.as_view("account_endpoint")
market.add_url_rule("/api/account", view_func=account_endpoint, methods=["GET"])
available_endpoint = AvailableSkillsEndpoint.as_view("available_endpoint")
market.add_url_rule(
'/api/account',
view_func=account_endpoint,
methods=['GET']
"/api/skills/available", view_func=available_endpoint, methods=["GET"]
)
available_endpoint = AvailableSkillsEndpoint.as_view('available_endpoint')
market.add_url_rule(
'/api/skills/available',
view_func=available_endpoint,
methods=['GET']
)
status_endpoint = SkillInstallStatusEndpoint.as_view("status_endpoint")
market.add_url_rule("/api/skills/status", view_func=status_endpoint, methods=["GET"])
status_endpoint = SkillInstallStatusEndpoint.as_view('status_endpoint')
skill_detail_endpoint = SkillDetailEndpoint.as_view("skill_detail_endpoint")
market.add_url_rule(
'/api/skills/status',
view_func=status_endpoint,
methods=['GET']
)
skill_detail_endpoint = SkillDetailEndpoint.as_view('skill_detail_endpoint')
market.add_url_rule(
'/api/skills/<string:skill_display_id>',
"/api/skills/<string:skill_display_id>",
view_func=skill_detail_endpoint,
methods=['GET']
methods=["GET"],
)
install_endpoint = SkillInstallEndpoint.as_view('install_endpoint')
market.add_url_rule(
'/api/skills/install',
view_func=install_endpoint,
methods=['PUT']
)
install_endpoint = SkillInstallEndpoint.as_view("install_endpoint")
market.add_url_rule("/api/skills/install", view_func=install_endpoint, methods=["PUT"])

View File

@ -20,28 +20,31 @@
"""Endpoint to provide skill summary data to the marketplace."""
from collections import defaultdict
from http import HTTPStatus
from logging import getLogger
from typing import List
from selene.api import SeleneEndpoint
from selene.data.skill import SkillDisplay, SkillDisplayRepository
from selene.util.log import get_selene_logger
_log = getLogger(__package__)
_log = get_selene_logger(__name__)
class AvailableSkillsEndpoint(SeleneEndpoint):
"""Marketplace endpoint to get all the skills available in the market."""
authentication_required = False
def __init__(self):
super(AvailableSkillsEndpoint, self).__init__()
super().__init__()
self.available_skills: List[SkillDisplay] = []
self.response_skills: List[dict] = []
self.skills_in_manifests = defaultdict(list)
def get(self):
"""Handles a HTTP GET request."""
self._get_available_skills()
self._build_response_data()
self.response = (self.response_skills, HTTPStatus.OK)
self.response = (dict(skills=self.response_skills), HTTPStatus.OK)
return self.response
@ -69,17 +72,17 @@ class AvailableSkillsEndpoint(SeleneEndpoint):
skills_to_include = []
query_string = self.request.query_string.decode()
search_term = query_string.lower().split('=')[1]
search_term = query_string.lower().split("=")[1]
for skill in self.available_skills:
display_data = skill.display_data
search_term_match = (
search_term is None or
search_term in display_data['title'].lower() or
search_term in display_data['description'].lower() or
search_term in display_data['short_desc'].lower() or
search_term in [c.lower() for c in display_data['categories']] or
search_term in [t.lower() for t in display_data['tags']] or
search_term in [t.lower() for t in display_data['examples']]
search_term is None
or search_term in display_data["title"].lower()
or search_term in display_data["description"].lower()
or search_term in display_data["short_desc"].lower()
or search_term in [c.lower() for c in display_data["categories"]]
or search_term in [t.lower() for t in display_data["tags"]]
or search_term in [t.lower() for t in display_data["examples"]]
)
if search_term_match:
skills_to_include.append(skill)
@ -90,37 +93,35 @@ class AvailableSkillsEndpoint(SeleneEndpoint):
"""Build the response data from the skill service response"""
for skill in skills_to_include:
skill_info = dict(
display_name=skill.display_data.get('display_name'),
icon=skill.display_data.get('icon'),
iconImage=skill.display_data.get('icon_img'),
displayName=skill.display_data.get("display_name"),
icon=skill.display_data.get("icon"),
iconImage=skill.display_data.get("icon_img"),
isMycroftMade=False,
isSystemSkill=False,
marketCategory='Undefined',
marketCategory="Undefined",
id=skill.id,
summary=skill.display_data.get('short_desc'),
trigger=None
summary=skill.display_data.get("short_desc"),
trigger=None,
)
examples = skill.display_data.get('examples')
examples = skill.display_data.get("examples")
if examples is not None and examples:
skill_info.update(trigger=skill.display_data['examples'][0])
tags = skill.display_data.get('tags')
if tags is not None and 'system' in tags:
skill_info.update(trigger=skill.display_data["examples"][0])
tags = skill.display_data.get("tags")
if tags is not None and "system" in tags:
skill_info.update(isSystemSkill=True)
categories = skill.display_data.get('categories')
categories = skill.display_data.get("categories")
if categories is not None and categories:
skill_info.update(marketCategory=categories[0])
skill_credits = skill.display_data.get('credits')
skill_credits = skill.display_data.get("credits")
if skill_credits is not None:
credits_names = [credit.get('name') for credit in skill_credits]
if 'Mycroft AI' in credits_names:
credits_names = [credit.get("name") for credit in skill_credits]
if "Mycroft AI" in credits_names:
skill_info.update(isMycroftMade=True)
self.response_skills.append(skill_info)
def _sort_skills(self):
"""Sort the skills in alphabetical order"""
sorted_skills = sorted(
self.response_skills,
key=lambda skill:
skill['display_name']
self.response_skills, key=lambda skill: skill["displayName"]
)
self.response_skills = sorted_skills
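The refactored endpoint now returns the skill summaries wrapped in a skills key and uses camelCase field names. A representative, purely hypothetical entry matching the keys built in _build_response_data above:
skill_summary = {
    "displayName": "Example Skill",  # hypothetical skill; keys mirror _build_response_data
    "icon": None,
    "iconImage": None,
    "isMycroftMade": True,
    "isSystemSkill": False,
    "marketCategory": "Undefined",
    "id": "example-skill-id",
    "summary": "A short description.",
    "trigger": "hey mycroft, do the thing",
}
response_payload = {"skills": [skill_summary]}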

View File

@ -27,11 +27,12 @@ from selene.data.skill import SkillDisplay, SkillDisplayRepository
class SkillDetailEndpoint(SeleneEndpoint):
""""Supply the data that will populate the skill detail page."""
"""Supply the data that will populate the skill detail page."""
authentication_required = False
def __init__(self):
super(SkillDetailEndpoint, self).__init__()
super().__init__()
self.skill_display_id = None
self.response_skill = None
self.manifest_skills = []
@ -57,39 +58,37 @@ class SkillDetailEndpoint(SeleneEndpoint):
def _build_response_data(self, skill_display: SkillDisplay):
"""Make some modifications to the response skill for the marketplace"""
self.response_skill = dict(
categories=skill_display.display_data.get('categories'),
credits=skill_display.display_data.get('credits'),
categories=skill_display.display_data.get("categories"),
credits=skill_display.display_data.get("credits"),
description=markdown(
skill_display.display_data.get('description'),
output_format='html5'
skill_display.display_data.get("description"), output_format="html5"
),
display_name=skill_display.display_data['display_name'],
icon=skill_display.display_data.get('icon'),
iconImage=skill_display.display_data.get('icon_img'),
displayName=skill_display.display_data["display_name"],
icon=skill_display.display_data.get("icon"),
iconImage=skill_display.display_data.get("icon_img"),
isSystemSkill=False,
worksOnMarkOne=(
'all' in skill_display.display_data['platforms'] or
'platform_mark1' in skill_display.display_data['platforms']
"all" in skill_display.display_data["platforms"]
or "platform_mark1" in skill_display.display_data["platforms"]
),
worksOnMarkTwo=(
'all' in skill_display.display_data['platforms'] or
'platform_mark2' in skill_display.display_data['platforms']
"all" in skill_display.display_data["platforms"]
or "platform_mark2" in skill_display.display_data["platforms"]
),
worksOnPicroft=(
'all' in skill_display.display_data['platforms'] or
'platform_picroft' in skill_display.display_data['platforms']
"all" in skill_display.display_data["platforms"]
or "platform_picroft" in skill_display.display_data["platforms"]
),
worksOnKDE=(
'all' in skill_display.display_data['platforms'] or
'platform_plasmoid' in skill_display.display_data['platforms']
"all" in skill_display.display_data["platforms"]
or "platform_plasmoid" in skill_display.display_data["platforms"]
),
repositoryUrl=skill_display.display_data.get('repo'),
repositoryUrl=skill_display.display_data.get("repo"),
summary=markdown(
skill_display.display_data['short_desc'],
output_format='html5'
skill_display.display_data["short_desc"], output_format="html5"
),
triggers=skill_display.display_data['examples']
triggers=skill_display.display_data["examples"],
)
if skill_display.display_data['tags'] is not None:
if 'system' in skill_display.display_data['tags']:
self.response_skill['isSystemSkill'] = True
if skill_display.display_data["tags"] is not None:
if "system" in skill_display.display_data["tags"]:
self.response_skill["isSystemSkill"] = True

View File

@ -25,7 +25,6 @@ remove the skill.
"""
import ast
from http import HTTPStatus
from logging import getLogger
from typing import List
from schematics import Model
@ -35,39 +34,39 @@ from selene.api import ETagManager, SeleneEndpoint
from selene.data.skill import (
AccountSkillSetting,
SkillDisplayRepository,
SkillSettingRepository
SkillSettingRepository,
)
from selene.util.log import get_selene_logger
INSTALL_SECTION = 'to_install'
UNINSTALL_SECTION = 'to_remove'
INSTALL_SECTION = "to_install"
UNINSTALL_SECTION = "to_remove"
_log = getLogger(__package__)
_log = get_selene_logger(__name__)
class InstallRequest(Model):
"""Defines the expected state of the request JSON data"""
setting_section = StringType(
required=True,
choices=[INSTALL_SECTION, UNINSTALL_SECTION]
required=True, choices=[INSTALL_SECTION, UNINSTALL_SECTION]
)
skill_display_id = StringType(required=True)
class SkillInstallEndpoint(SeleneEndpoint):
"""Install a skill on user device(s)."""
_settings_repo = None
def __init__(self):
super(SkillInstallEndpoint, self).__init__()
super().__init__()
self.installer_settings: List[AccountSkillSetting] = []
self.skill_name = None
self.etag_manager = ETagManager(
self.config['SELENE_CACHE'],
self.config
)
self.etag_manager = ETagManager(self.config["SELENE_CACHE"], self.config)
@property
def settings_repo(self):
"""Lazy instantiation of the skill settings repository."""
if self._settings_repo is None:
self._settings_repo = SkillSettingRepository(self.db)
@ -84,7 +83,7 @@ class SkillInstallEndpoint(SeleneEndpoint):
self._apply_update()
self.etag_manager.expire_skill_etag_by_account_id(self.account.id)
return '', HTTPStatus.NO_CONTENT
return "", HTTPStatus.NO_CONTENT
def _validate_request(self):
"""Ensure the data passed in the request is as expected.
@ -92,8 +91,8 @@ class SkillInstallEndpoint(SeleneEndpoint):
:raises schematics.exceptions.ValidationError if the validation fails
"""
install_request = InstallRequest()
install_request.setting_section = self.request.json['section']
install_request.skill_display_id = self.request.json['skillDisplayId']
install_request.setting_section = self.request.json["section"]
install_request.skill_display_id = self.request.json["skillDisplayId"]
install_request.validate()
def _get_skill_name(self):
@ -104,9 +103,9 @@ class SkillInstallEndpoint(SeleneEndpoint):
"""
display_repo = SkillDisplayRepository(self.db)
skill_display = display_repo.get_display_data_for_skill(
self.request.json['skillDisplayId']
self.request.json["skillDisplayId"]
)
self.skill_name = skill_display.display_data['name']
self.skill_name = skill_display.display_data["name"]
def _apply_update(self):
"""Add the skill in the request to the installer skill settings.
@ -115,7 +114,7 @@ class SkillInstallEndpoint(SeleneEndpoint):
devices associated with an account. It will be updated in the
future to target specific devices.
"""
section = self.request.json['section']
section = self.request.json["section"]
for settings in self.installer_settings:
setting_value = settings.settings_values.get(section, [])
if isinstance(setting_value, str):
@ -127,6 +126,5 @@ class SkillInstallEndpoint(SeleneEndpoint):
def _update_skill_settings(self, new_skill_settings):
"""Update the DB with the new installer skill settings."""
self.settings_repo.update_skill_settings(
self.account.id,
new_skill_settings
self.account.id, new_skill_settings, None
)
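As a reference for clients, the PUT body validated by InstallRequest above carries a section of either to_install or to_remove plus a skillDisplayId. A hedged sketch of the call (the base URL and authentication handling are assumptions, not part of this diff):
import requests  # assumed to be available in the client environment
response = requests.put(
    "https://example.com/api/skills/install",  # hypothetical base URL
    json={"section": "to_install", "skillDisplayId": "example-skill-display-id"},
    timeout=10,
)
response.raise_for_status()  # the endpoint returns 204 NO_CONTENT on success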

View File

@ -26,12 +26,7 @@ from selene.api import SeleneEndpoint
from selene.data.device import DeviceSkillRepository, ManifestSkill
from selene.util.auth import AuthenticationError
VALID_STATUS_VALUES = (
'failed',
'installed',
'installing',
'uninstalling'
)
VALID_STATUS_VALUES = ("failed", "installed", "installing", "uninstalling")
class SkillInstallStatusEndpoint(SeleneEndpoint):
@ -45,7 +40,7 @@ class SkillInstallStatusEndpoint(SeleneEndpoint):
try:
self._authenticate()
except AuthenticationError:
self.response = ('', HTTPStatus.NO_CONTENT)
self.response = ("", HTTPStatus.NO_CONTENT)
else:
self._get_installed_skills()
response_data = self._build_response_data()
@ -55,9 +50,7 @@ class SkillInstallStatusEndpoint(SeleneEndpoint):
def _get_installed_skills(self):
skill_repo = DeviceSkillRepository(self.db)
installed_skills = skill_repo.get_skill_manifest_for_account(
self.account.id
)
installed_skills = skill_repo.get_skill_manifest_for_account(self.account.id)
for skill in installed_skills:
self.installed_skills[skill.skill_id].append(skill)
@ -67,18 +60,13 @@ class SkillInstallStatusEndpoint(SeleneEndpoint):
for skill_id, skills in self.installed_skills.items():
skill_aggregator = SkillManifestAggregator(skills)
skill_aggregator.aggregate_skill_status()
if skill_aggregator.aggregate_skill.install_status == 'failed':
failure_reasons[skill_id] = (
skill_aggregator.aggregate_skill.install_failure_reason
)
install_statuses[skill_id] = (
skill_aggregator.aggregate_skill.install_status
)
if skill_aggregator.aggregate_skill.install_status == "failed":
failure_reasons[
skill_id
] = skill_aggregator.aggregate_skill.install_failure_reason
install_statuses[skill_id] = skill_aggregator.aggregate_skill.install_status
return dict(
installStatuses=install_statuses,
failureReasons=failure_reasons
)
return dict(installStatuses=install_statuses, failureReasons=failure_reasons)
class SkillManifestAggregator(object):
@ -96,7 +84,7 @@ class SkillManifestAggregator(object):
"""
self._validate_install_status()
self._determine_install_status()
if self.aggregate_skill.install_status == 'failed':
if self.aggregate_skill.install_status == "failed":
self._determine_failure_reason()
def _validate_install_status(self):
@ -104,7 +92,7 @@ class SkillManifestAggregator(object):
if skill.install_status not in VALID_STATUS_VALUES:
raise ValueError(
'"{install_status}" is not a supported value of the '
'installation field in the skill manifest'.format(
"installation field in the skill manifest".format(
install_status=skill.install_status
)
)
@ -120,33 +108,24 @@ class SkillManifestAggregator(object):
If the install fails on any device, the install will be flagged as a
failed install in the Marketplace.
"""
failed = [
skill.install_status == 'failed' for skill in self.installed_skills
]
installing = [
s.install_status == 'installing' for s in self.installed_skills
]
failed = [skill.install_status == "failed" for skill in self.installed_skills]
installing = [s.install_status == "installing" for s in self.installed_skills]
uninstalling = [
skill.install_status == 'uninstalling' for skill in
self.installed_skills
]
installed = [
s.install_status == 'installed' for s in self.installed_skills
skill.install_status == "uninstalling" for skill in self.installed_skills
]
installed = [s.install_status == "installed" for s in self.installed_skills]
if any(failed):
self.aggregate_skill.install_status = 'failed'
self.aggregate_skill.install_status = "failed"
elif any(installing):
self.aggregate_skill.install_status = 'installing'
self.aggregate_skill.install_status = "installing"
elif any(uninstalling):
self.aggregate_skill.install_status = 'uninstalling'
self.aggregate_skill.install_status = "uninstalling"
elif all(installed):
self.aggregate_skill.install_status = 'installed'
self.aggregate_skill.install_status = "installed"
def _determine_failure_reason(self):
"""When a skill fails to install, determine the reason"""
for skill in self.installed_skills:
if skill.install_status == 'failed':
self.aggregate_skill.failure_reason = (
skill.install_failure_reason
)
if skill.install_status == "failed":
self.aggregate_skill.failure_reason = skill.install_failure_reason
break
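The aggregation above gives "failed" precedence over "installing", then "uninstalling", with "installed" reported only when every device agrees. A minimal standalone sketch of that precedence (illustrative names, not the endpoint's API):
def aggregate_status(statuses):
    """Collapse per-device install statuses into a single marketplace status."""
    for status in ("failed", "installing", "uninstalling"):
        if status in statuses:
            return status
    if statuses and all(status == "installed" for status in statuses):
        return "installed"
    return None  # mixed or empty manifests leave the aggregate undetermined
assert aggregate_status(["installed", "failed", "installing"]) == "failed"
assert aggregate_status(["installed", "installed"]) == "installed"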

1311
api/market/poetry.lock generated Normal file

File diff suppressed because it is too large

25
api/market/pyproject.toml Normal file
View File

@ -0,0 +1,25 @@
[tool.poetry]
name = "market"
version = "0.1.0"
description = "API for Mycroft Marketplace"
authors = ["Chris Veilleux <veilleux.chris@gmail.com>"]
license = "GNU AGPL 3.0"
[tool.poetry.dependencies]
python = "^3.9"
flask = "*"
requests = "*"
pyjwt = "*"
uwsgi = "*"
markdown = "*"
selene = {path = "./../../shared", develop = true}
[tool.poetry.dev-dependencies]
behave = "*"
pyhamcrest = "*"
allure-behave = "*"
pylint = "*"
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"

1261
api/precise/poetry.lock generated Normal file

File diff suppressed because it is too large

View File

View File

@ -0,0 +1,47 @@
# Mycroft Server - Backend
# Copyright (C) 2020 Mycroft AI Inc
# SPDX-License-Identifier: AGPL-3.0-or-later
#
# This file is part of the Mycroft Server.
#
# The Mycroft Server is free software: you can redistribute it and/or
# modify it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Entry point for the API that supports the Mycroft Marketplace."""
from flask import Flask
from selene.api import get_base_config, selene_api, SeleneResponse
from selene.util.log import configure_logger
from .endpoints import AudioFileEndpoint, DesignationEndpoint, TagEndpoint
_log = configure_logger("precise_api")
# Define the Flask application
precise = Flask(__name__)
precise.config.from_object(get_base_config())
precise.response_class = SeleneResponse
precise.register_blueprint(selene_api)
audio_file_endpoint = AudioFileEndpoint.as_view("audio_file_endpoint")
precise.add_url_rule(
"/api/audio/<string:file_name>", view_func=audio_file_endpoint, methods=["GET"]
)
designation_endpoint = DesignationEndpoint.as_view("designation_endpoint")
precise.add_url_rule(
"/api/designation", view_func=designation_endpoint, methods=["GET"]
)
tag_endpoint = TagEndpoint.as_view("tag_endpoint")
precise.add_url_rule("/api/tag", view_func=tag_endpoint, methods=["GET", "POST"])

View File

@ -0,0 +1,23 @@
# Mycroft Server - Backend
# Copyright (C) 2020 Mycroft AI Inc
# SPDX-License-Identifier: AGPL-3.0-or-later
#
# This file is part of the Mycroft Server.
#
# The Mycroft Server is free software: you can redistribute it and/or
# modify it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Public API into the Precise API endpoint package."""
from .audio_file import AudioFileEndpoint
from .designation import DesignationEndpoint
from .tag import TagEndpoint

View File

@ -0,0 +1,37 @@
# Mycroft Server - Backend
# Copyright (C) 2020 Mycroft AI Inc
# SPDX-License-Identifier: AGPL-3.0-or-later
#
# This file is part of the Mycroft Server.
#
# The Mycroft Server is free software: you can redistribute it and/or
# modify it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Precise API endpoint for presenting a URL to the GUI for the audio file."""
from os import environ
from flask import abort, send_from_directory
from selene.api import SeleneEndpoint
class AudioFileEndpoint(SeleneEndpoint):
"""Precise API endpoint for presenting a URL to the GUI for the audio file."""
def get(self, file_name):
"""Handle an HTTP GET request."""
self._authenticate()
try:
return send_from_directory(environ["SELENE_DATA_DIR"], file_name)
except FileNotFoundError:
abort(404)
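A hedged client-side sketch of fetching a sample through this endpoint (the base URL and file name are hypothetical; the route matches the /api/audio/<string:file_name> rule registered in api.py above):
import requests  # assumed to be available in the client environment
response = requests.get(
    "https://example.com/api/audio/example.wav",  # hypothetical URL and file name
    timeout=10,
)
if response.status_code == 200:
    audio_bytes = response.content  # raw contents of the requested sample file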

View File

@ -0,0 +1,143 @@
# Mycroft Server - Backend
# Copyright (C) 2020 Mycroft AI Inc
# SPDX-License-Identifier: AGPL-3.0-or-later
#
# This file is part of the Mycroft Server.
#
# The Mycroft Server is free software: you can redistribute it and/or
# modify it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Precise API endpoint for retrieving file designations.
A designation is a decision, derived from a set of tags, about the attributes of a
sample file.
"""
from collections import defaultdict
from datetime import datetime
from http import HTTPStatus
from pathlib import Path
from typing import List
from selene.api import SeleneEndpoint
from selene.data.tagging import (
FileDesignation,
FileDesignationRepository,
Tag,
TagRepository,
TagValue,
)
class DesignationEndpoint(SeleneEndpoint):
"""Precise API endpoint for tagging a file.
The HTTP GET request will select all sample files for a specified wake word that
have been designated since a specified date. Optional tag and tag value arguments
can be used to filter the result set.
"""
_tags = None
@property
def tags(self) -> List[Tag]:
"""Get all the possible tags.
:return a list of all tags and their values
"""
if self._tags is None:
tag_repository = TagRepository(self.db)
tags = tag_repository.get_all()
self._tags = sorted(tags, key=lambda tag: tag.priority)
return self._tags
def get(self):
"""Handle an HTTP GET request."""
response_data = self._build_response_data()
return response_data, HTTPStatus.OK
def _build_response_data(self):
"""Build the response from data retrieved from the database.
:return the response
"""
designations = self._get_designations()
response_data = dict()
for designation in designations:
tag = self._get_tag(designation)
tag_value = self._get_tag_value(designation, tag)
if self._include_in_result(tag, tag_value):
if tag.name not in response_data:
response_data[tag.name] = defaultdict(list)
file_path = Path(designation.file_directory).joinpath(
designation.file_name
)
response_data[tag.name][tag_value.value].append(str(file_path))
return response_data
def _get_designations(self) -> List[FileDesignation]:
"""Retrieve the designations from the database that meet the criteria."""
wake_word = self.request.args["wakeWord"].replace("-", " ")
start_date = datetime.strptime(self.request.args["startDate"], "%Y-%m-%d")
designation_repo = FileDesignationRepository(self.db)
designations = designation_repo.get_from_date(wake_word, start_date)
return designations
def _include_in_result(self, tag: Tag, tag_value: TagValue) -> bool:
"""Use the tag associated with a file to determine inclusion in results set.
:param tag: The tag designated to the sample file
:param tag_value: The value of the tag designated to the sample file.
:return a boolean value indicating if the file should be included in response.
"""
requested_tag = self.request.args.get("tag")
requested_tag_value = self.request.args.get("tagValue")
if requested_tag is None:
include = True
else:
include = requested_tag == tag.name and (
requested_tag_value is None or requested_tag_value == tag_value.value
)
return include
def _get_tag(self, designation: FileDesignation) -> Tag:
"""Get the attributes of the tag designated to the file.
:param designation: Object containing attributes of the designated file.
:return: The attributes of the tag
:raises ValueError when the designation tag is not valid.
"""
for tag in self.tags:
if tag.id == designation.tag_id:
return tag
raise ValueError(f"Tag ID {designation.tag_id} not found")
@staticmethod
def _get_tag_value(designation: FileDesignation, tag: Tag):
"""Get the attributes of the tag value designated to the file.
:param designation: Object containing attributes of the designated file.
:param tag: The attributes of the tag, which include valid values
:return: The attributes of the tag value
:raises ValueError when the designation tag value is not valid.
"""
for tag_value in tag.values:
if tag_value.id == designation.tag_value_id:
return tag_value
raise ValueError(f"Tag value ID {designation.tag_value_id} not found")

View File

@ -0,0 +1,222 @@
# Mycroft Server - Backend
# Copyright (C) 2020 Mycroft AI Inc
# SPDX-License-Identifier: AGPL-3.0-or-later
#
# This file is part of the Mycroft Server.
#
# The Mycroft Server is free software: you can redistribute it and/or
# modify it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Precise API endpoint for tagging a file."""
import getpass
from http import HTTPStatus
from os import environ
from pathlib import Path
from typing import List
from flask import jsonify
from schematics import Model
from schematics.types import StringType
from selene.api import SeleneEndpoint
from selene.data.tagging import (
FileTag,
FileTagRepository,
SessionRepository,
Tag,
TaggableFile,
Tagger,
TaggerRepository,
TagRepository,
WakeWordFileRepository,
)
from selene.util.ssh import get_remote_file, SshClientConfig
class TagPostRequest(Model):
"""Define the expected arguments to be passed in the POST request."""
tag_id = StringType(required=True)
tag_value = StringType(required=True)
file_name = StringType(required=True)
session_id = StringType(required=True)
class TagEndpoint(SeleneEndpoint):
"""Precise API endpoint for tagging a file.
The HTTP GET request will randomly select a type of tag, which will in turn be used
to retrieve an audio file that requires the tag. The selected audio file must not
have been tagged in the last hour. This will prevent the same files from being
tagged more times than necessary. The file will also be copied to local storage
for a subsequent API call.
"""
_tags = None
@property
def tags(self) -> List[Tag]:
"""Get all the possible tags.
:return a list of all tags and their values
"""
if self._tags is None:
tag_repository = TagRepository(self.db)
tags = tag_repository.get_all()
self._tags = sorted(tags, key=lambda tag: tag.priority)
return self._tags
def get(self):
"""Handle an HTTP GET request."""
self._authenticate()
session_id = self._ensure_session_exists()
response_data, file_to_tag = self._build_response_data(session_id)
if response_data:
self._copy_audio_file(file_to_tag)
return response_data, HTTPStatus.OK if response_data else HTTPStatus.NO_CONTENT
def _ensure_session_exists(self):
"""If no session ID is provided in the request, get it from the database."""
session_id = self.request.args.get("sessionId")
if session_id is None:
tagger = self._ensure_tagger_exists()
session_repository = SessionRepository(self.db)
session_id = session_repository.ensure_session_exists(tagger)
return session_id
def _ensure_tagger_exists(self):
"""Get the tagger attributes for this account."""
tagger = Tagger(entity_type="account", entity_id=self.account.id)
tagger_repository = TaggerRepository(self.db)
tagger.id = tagger_repository.ensure_tagger_exists(tagger)
return tagger
def _build_response_data(self, session_id: str):
"""Build the response from data retrieved from the database.
:param session_id: Identifier of the user's tagging session
:return the response and the taggable file object
"""
wake_word = self.request.args["wakeWord"].replace("-", " ")
file_to_tag = self._get_taggable_file(wake_word, session_id)
if file_to_tag is None:
response_data = ""
else:
tag = self._select_tag(file_to_tag)
response_data = dict(
audioFileId=file_to_tag.id,
audioFileName=file_to_tag.name,
sessionId=session_id,
tagId=tag.id,
tagInstructions=tag.instructions,
tagName=(wake_word if tag.name == "wake word" else tag.name).title(),
tagTitle=tag.title,
tagValues=tag.values,
)
return response_data, file_to_tag
def _get_taggable_file(self, wake_word: str, session_id: str) -> TaggableFile:
"""Get a file that has still requires some tagging for a specified tag type.
:param wake_word: the wake word being tagged by the user
:param session_id: identifier of the user's tagging session
:return: dataclass instance representing the file to be tagged
"""
file_repository = WakeWordFileRepository(self.db)
file_to_tag = file_repository.get_taggable_file(
wake_word, len(self.tags), session_id
)
return file_to_tag
def _select_tag(self, file_to_tag: TaggableFile) -> Tag:
"""Determine which tag to return in the response.
:param file_to_tag: Attributes of the file that will be tagged by the user
:return: the tag to put in the response
"""
selected_tag = None
for tag in self.tags:
if file_to_tag.designations is None:
selected_tag = tag
elif file_to_tag.tag is None:
if tag.id not in file_to_tag.designations:
selected_tag = tag
else:
if tag.id == file_to_tag.tag:
selected_tag = tag
if selected_tag is not None:
break
return selected_tag or self.tags[0]
@staticmethod
def _copy_audio_file(file_to_tag: TaggableFile):
"""Copy the file from the location specified in the database to local storage
:param file_to_tag: dataclass instance representing the file to be tagged
"""
local_path = Path(environ["SELENE_DATA_DIR"]).joinpath(file_to_tag.name)
if not local_path.exists():
if file_to_tag.location.server == environ["PRECISE_SERVER"]:
remote_user = "precise"
ssh_port = environ["PRECISE_SSH_PORT"]
else:
remote_user = "mycroft"
ssh_port = 22
ssh_config = SshClientConfig(
local_user=getpass.getuser(),
remote_server=file_to_tag.location.server,
remote_user=remote_user,
ssh_port=ssh_port,
)
remote_path = Path(file_to_tag.location.directory).joinpath(
file_to_tag.name
)
get_remote_file(ssh_config, local_path, remote_path)
def post(self):
"""Process HTTP POST request for an account."""
self._authenticate()
self._validate_post_request()
self._add_tag()
return jsonify("File tagged successfully"), HTTPStatus.OK
def _validate_post_request(self):
"""Validate the contents of the request object for a POST request."""
post_request = TagPostRequest(
dict(
session_id=self.request.json.get("sessionId"),
tag_id=self.request.json.get("tagId"),
tag_value=self.request.json.get("tagValueId"),
file_name=self.request.json.get("audioFileId"),
)
)
post_request.validate()
def _add_tag(self):
"""Add the tagging result to the database."""
file_tag = FileTag(
file_id=self.request.json["audioFileId"],
session_id=self.request.json["sessionId"],
tag_id=self.request.json["tagId"],
tag_value_id=self.request.json["tagValueId"],
)
file_tag_repository = FileTagRepository(self.db)
file_tag_repository.add(file_tag)
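For reference, a hedged sketch of submitting a tag through the POST handler above (identifiers are hypothetical; the JSON keys mirror _validate_post_request and _add_tag):
import requests  # assumed to be available in the client environment
response = requests.post(
    "https://example.com/api/tag",  # hypothetical base URL
    json={
        "audioFileId": "example-file-id",
        "sessionId": "example-session-id",
        "tagId": "example-tag-id",
        "tagValueId": "example-tag-value-id",
    },
    timeout=10,
)
response.raise_for_status()  # returns "File tagged successfully" with HTTP 200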

View File

@ -0,0 +1,22 @@
[tool.poetry]
name = "precise"
version = "0.1.0"
description = "API for Precise wake word tagger"
authors = ["Chris Veilleux <veilleux.chris@gmail.com>"]
license = "GNU AGPL 3.0"
[tool.poetry.dependencies]
python = "^3.9"
flask = "*"
selene = {path = "./../../shared", develop = true}
uwsgi = "*"
[tool.poetry.dev-dependencies]
behave = "*"
pyhamcrest = "*"
pylint = "*"
black = "*"
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"

8
api/precise/uwsgi.ini Normal file
View File

@ -0,0 +1,8 @@
[uwsgi]
master = true
module = precise_api.api:precise
processes = 4
socket = :5000
die-on-term = true
lazy = true
lazy-apps = true

View File

@ -1,21 +0,0 @@
[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true
[dev-packages]
behave = "*"
pyhamcrest = "*"
allure-behave = "*"
pylint = "*"
[packages]
flask = "<1.1"
requests = "*"
selene = {editable = true,path = "./../../shared"}
SpeechRecognition = "*"
uwsgi = "*"
stripe = "*"
[requires]
python_version = "3.7"

455
api/public/Pipfile.lock generated
View File

@ -1,455 +0,0 @@
{
"_meta": {
"hash": {
"sha256": "fd790c72c6d273e26491798624b288ea38d3f8a69b2073cb50021a5d10894e82"
},
"pipfile-spec": 6,
"requires": {
"python_version": "3.7"
},
"sources": [
{
"name": "pypi",
"url": "https://pypi.org/simple",
"verify_ssl": true
}
]
},
"default": {
"certifi": {
"hashes": [
"sha256:5930595817496dd21bb8dc35dad090f1c2cd0adfaf21204bf6732ca5d8ee34d3",
"sha256:8fc0819f1f30ba15bdb34cceffb9ef04d99f420f68eb75d901e9560b8749fc41"
],
"version": "==2020.6.20"
},
"chardet": {
"hashes": [
"sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae",
"sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691"
],
"version": "==3.0.4"
},
"click": {
"hashes": [
"sha256:d2b5255c7c6349bc1bd1e59e08cd12acbbd63ce649f2588755783aa94dfb6b1a",
"sha256:dacca89f4bfadd5de3d7489b7c8a566eee0d3676333fbb50030263894c38c0dc"
],
"version": "==7.1.2"
},
"deprecated": {
"hashes": [
"sha256:525ba66fb5f90b07169fdd48b6373c18f1ee12728ca277ca44567a367d9d7f74",
"sha256:a766c1dccb30c5f6eb2b203f87edd1d8588847709c78589e1521d769addc8218"
],
"version": "==1.2.10"
},
"facebook-sdk": {
"hashes": [
"sha256:2e987b3e0f466a6f4ee77b935eb023dba1384134f004a2af21f1cfff7fe0806e",
"sha256:cabcd2e69ea3d9f042919c99b353df7aa1e2be86d040121f6e9f5e63c1cf0f8d"
],
"version": "==3.1.0"
},
"flask": {
"hashes": [
"sha256:1a21ccca71cee5e55b6a367cc48c6eb47e3c447f76e64d41f3f3f931c17e7c96",
"sha256:ed1330220a321138de53ec7c534c3d90cf2f7af938c7880fc3da13aa46bf870f"
],
"index": "pypi",
"version": "==1.0.4"
},
"idna": {
"hashes": [
"sha256:b307872f855b18632ce0c21c5e45be78c0ea7ae4c15c828c20788b26921eb3f6",
"sha256:b97d804b1e9b523befed77c48dacec60e6dcb0b5391d57af6a65a312a90648c0"
],
"version": "==2.10"
},
"itsdangerous": {
"hashes": [
"sha256:321b033d07f2a4136d3ec762eac9f16a10ccd60f53c0c91af90217ace7ba1f19",
"sha256:b12271b2047cb23eeb98c8b5622e2e5c5e9abd9784a153e9d8ef9cb4dd09d749"
],
"version": "==1.1.0"
},
"jinja2": {
"hashes": [
"sha256:89aab215427ef59c34ad58735269eb58b1a5808103067f7bb9d5836c651b3bb0",
"sha256:f0a4641d3cf955324a89c04f3d94663aa4d638abe8f733ecd3582848e1c37035"
],
"version": "==2.11.2"
},
"markupsafe": {
"hashes": [
"sha256:00bc623926325b26bb9605ae9eae8a215691f33cae5df11ca5424f06f2d1f473",
"sha256:09027a7803a62ca78792ad89403b1b7a73a01c8cb65909cd876f7fcebd79b161",
"sha256:09c4b7f37d6c648cb13f9230d847adf22f8171b1ccc4d5682398e77f40309235",
"sha256:1027c282dad077d0bae18be6794e6b6b8c91d58ed8a8d89a89d59693b9131db5",
"sha256:13d3144e1e340870b25e7b10b98d779608c02016d5184cfb9927a9f10c689f42",
"sha256:24982cc2533820871eba85ba648cd53d8623687ff11cbb805be4ff7b4c971aff",
"sha256:29872e92839765e546828bb7754a68c418d927cd064fd4708fab9fe9c8bb116b",
"sha256:43a55c2930bbc139570ac2452adf3d70cdbb3cfe5912c71cdce1c2c6bbd9c5d1",
"sha256:46c99d2de99945ec5cb54f23c8cd5689f6d7177305ebff350a58ce5f8de1669e",
"sha256:500d4957e52ddc3351cabf489e79c91c17f6e0899158447047588650b5e69183",
"sha256:535f6fc4d397c1563d08b88e485c3496cf5784e927af890fb3c3aac7f933ec66",
"sha256:596510de112c685489095da617b5bcbbac7dd6384aeebeda4df6025d0256a81b",
"sha256:62fe6c95e3ec8a7fad637b7f3d372c15ec1caa01ab47926cfdf7a75b40e0eac1",
"sha256:6788b695d50a51edb699cb55e35487e430fa21f1ed838122d722e0ff0ac5ba15",
"sha256:6dd73240d2af64df90aa7c4e7481e23825ea70af4b4922f8ede5b9e35f78a3b1",
"sha256:717ba8fe3ae9cc0006d7c451f0bb265ee07739daf76355d06366154ee68d221e",
"sha256:79855e1c5b8da654cf486b830bd42c06e8780cea587384cf6545b7d9ac013a0b",
"sha256:7c1699dfe0cf8ff607dbdcc1e9b9af1755371f92a68f706051cc8c37d447c905",
"sha256:88e5fcfb52ee7b911e8bb6d6aa2fd21fbecc674eadd44118a9cc3863f938e735",
"sha256:8defac2f2ccd6805ebf65f5eeb132adcf2ab57aa11fdf4c0dd5169a004710e7d",
"sha256:98c7086708b163d425c67c7a91bad6e466bb99d797aa64f965e9d25c12111a5e",
"sha256:9add70b36c5666a2ed02b43b335fe19002ee5235efd4b8a89bfcf9005bebac0d",
"sha256:9bf40443012702a1d2070043cb6291650a0841ece432556f784f004937f0f32c",
"sha256:ade5e387d2ad0d7ebf59146cc00c8044acbd863725f887353a10df825fc8ae21",
"sha256:b00c1de48212e4cc9603895652c5c410df699856a2853135b3967591e4beebc2",
"sha256:b1282f8c00509d99fef04d8ba936b156d419be841854fe901d8ae224c59f0be5",
"sha256:b2051432115498d3562c084a49bba65d97cf251f5a331c64a12ee7e04dacc51b",
"sha256:ba59edeaa2fc6114428f1637ffff42da1e311e29382d81b339c1817d37ec93c6",
"sha256:c8716a48d94b06bb3b2524c2b77e055fb313aeb4ea620c8dd03a105574ba704f",
"sha256:cd5df75523866410809ca100dc9681e301e3c27567cf498077e8551b6d20e42f",
"sha256:cdb132fc825c38e1aeec2c8aa9338310d29d337bebbd7baa06889d09a60a1fa2",
"sha256:e249096428b3ae81b08327a63a485ad0878de3fb939049038579ac0ef61e17e7",
"sha256:e8313f01ba26fbbe36c7be1966a7b7424942f670f38e666995b88d012765b9be"
],
"version": "==1.1.1"
},
"passlib": {
"hashes": [
"sha256:68c35c98a7968850e17f1b6892720764cc7eed0ef2b7cb3116a89a28e43fe177",
"sha256:8d666cef936198bc2ab47ee9b0410c94adf2ba798e5a84bf220be079ae7ab6a8"
],
"version": "==1.7.2"
},
"psycopg2-binary": {
"hashes": [
"sha256:008da3ab51adc70a5f1cfbbe5db3a22607ab030eb44bcecf517ad11a0c2b3cac",
"sha256:07cf82c870ec2d2ce94d18e70c13323c89f2f2a2628cbf1feee700630be2519a",
"sha256:08507efbe532029adee21b8d4c999170a83760d38249936038bd0602327029b5",
"sha256:107d9be3b614e52a192719c6bf32e8813030020ea1d1215daa86ded9a24d8b04",
"sha256:17a0ea0b0eabf07035e5e0d520dabc7950aeb15a17c6d36128ba99b2721b25b1",
"sha256:3286541b9d85a340ee4ed42732d15fc1bb441dc500c97243a768154ab8505bb5",
"sha256:3939cf75fc89c5e9ed836e228c4a63604dff95ad19aed2bbf71d5d04c15ed5ce",
"sha256:40abc319f7f26c042a11658bf3dd3b0b3bceccf883ec1c565d5c909a90204434",
"sha256:51f7823f1b087d2020d8e8c9e6687473d3d239ba9afc162d9b2ab6e80b53f9f9",
"sha256:6bb2dd006a46a4a4ce95201f836194eb6a1e863f69ee5bab506673e0ca767057",
"sha256:702f09d8f77dc4794651f650828791af82f7c2efd8c91ae79e3d9fe4bb7d4c98",
"sha256:7036ccf715925251fac969f4da9ad37e4b7e211b1e920860148a10c0de963522",
"sha256:7b832d76cc65c092abd9505cc670c4e3421fd136fb6ea5b94efbe4c146572505",
"sha256:8f74e631b67482d504d7e9cf364071fc5d54c28e79a093ff402d5f8f81e23bfa",
"sha256:930315ac53dc65cbf52ab6b6d27422611f5fb461d763c531db229c7e1af6c0b3",
"sha256:96d3038f5bd061401996614f65d27a4ecb62d843eb4f48e212e6d129171a721f",
"sha256:a20299ee0ea2f9cca494396ac472d6e636745652a64a418b39522c120fd0a0a4",
"sha256:a34826d6465c2e2bbe9d0605f944f19d2480589f89863ed5f091943be27c9de4",
"sha256:a69970ee896e21db4c57e398646af9edc71c003bc52a3cc77fb150240fefd266",
"sha256:b9a8b391c2b0321e0cd7ec6b4cfcc3dd6349347bd1207d48bcb752aa6c553a66",
"sha256:ba13346ff6d3eb2dca0b6fa0d8a9d999eff3dcd9b55f3a890f12b0b6362b2b38",
"sha256:bb0608694a91db1e230b4a314e8ed00ad07ed0c518f9a69b83af2717e31291a3",
"sha256:c8830b7d5f16fd79d39b21e3d94f247219036b29b30c8270314c46bf8b732389",
"sha256:cac918cd7c4c498a60f5d2a61d4f0a6091c2c9490d81bc805c963444032d0dab",
"sha256:cc30cb900f42c8a246e2cb76539d9726f407330bc244ca7729c41a44e8d807fb",
"sha256:ccdc6a87f32b491129ada4b87a43b1895cf2c20fdb7f98ad979647506ffc41b6",
"sha256:d1a8b01f6a964fec702d6b6dac1f91f2b9f9fe41b310cbb16c7ef1fac82df06d",
"sha256:e004db88e5a75e5fdab1620fb9f90c9598c2a195a594225ac4ed2a6f1c23e162",
"sha256:eb2f43ae3037f1ef5e19339c41cf56947021ac892f668765cd65f8ab9814192e",
"sha256:fa466306fcf6b39b8a61d003123d442b23707d635a5cb05ac4e1b62cc79105cd"
],
"version": "==2.8.5"
},
"pygithub": {
"hashes": [
"sha256:8375a058ec651cc0774244a3bc7395cf93617298735934cdd59e5bcd9a1df96e",
"sha256:d2d17d1e3f4474e070353f201164685a95b5a92f5ee0897442504e399c7bc249"
],
"version": "==1.51"
},
"pyhamcrest": {
"hashes": [
"sha256:412e00137858f04bde0729913874a48485665f2d36fe9ee449f26be864af9316",
"sha256:7ead136e03655af85069b6f47b23eb7c3e5c221aa9f022a4fbb499f5b7308f29"
],
"version": "==2.0.2"
},
"pyjwt": {
"hashes": [
"sha256:5c6eca3c2940464d106b99ba83b00c6add741c9becaec087fb7ccdefea71350e",
"sha256:8d59a976fb773f3e6a39c85636357c4f0e242707394cadadd9814f5cbaa20e96"
],
"version": "==1.7.1"
},
"python-http-client": {
"hashes": [
"sha256:93d6a26b426e48b04e589c1f103e7c040193e4ccc379ea50cd6e12f94cca7c69"
],
"version": "==3.2.7"
},
"redis": {
"hashes": [
"sha256:0e7e0cfca8660dea8b7d5cd8c4f6c5e29e11f31158c0b0ae91a397f00e5a05a2",
"sha256:432b788c4530cfe16d8d943a09d40ca6c16149727e4afe8c2c9d5580c59d9f24"
],
"version": "==3.5.3"
},
"requests": {
"hashes": [
"sha256:b3559a131db72c33ee969480840fff4bb6dd111de7dd27c8ee1f820f4f00231b",
"sha256:fe75cc94a9443b9246fc7049224f75604b113c36acb93f87b80ed42c44cbb898"
],
"index": "pypi",
"version": "==2.24.0"
},
"schedule": {
"hashes": [
"sha256:3f895a1036799a25ab9c335de917073e63cf8256920917e932777382f101f08f",
"sha256:f9fb5181283de4db6e701d476dd01b6a3dd81c38462a54991ddbb9d26db857c9"
],
"version": "==0.6.0"
},
"schematics": {
"hashes": [
"sha256:8fcc6182606fd0b24410a1dbb066d9bbddbe8da9c9509f47b743495706239283",
"sha256:a40b20635c0e43d18d3aff76220f6cd95ea4decb3f37765e49529b17d81b0439"
],
"version": "==2.1.0"
},
"selene": {
"editable": true,
"path": "./../../shared"
},
"sendgrid": {
"hashes": [
"sha256:54e51ca9afbfe1a4706864f42eb1a12d597e375249d80a8ce679e7a4fa91e776",
"sha256:dd0eddf079be040172a4d0afdf9b9becb4e53210ead015a0e6b2d680eea92ac0"
],
"version": "==6.4.1"
},
"speechrecognition": {
"hashes": [
"sha256:4d8f73a0c05ec70331c3bacaa89ecc06dfa8d9aba0899276664cda06ab597e8e"
],
"index": "pypi",
"version": "==3.8.1"
},
"starkbank-ecdsa": {
"hashes": [
"sha256:cd17ec9fa7ad8ae3fc81a63ddb7e0d7fb798a048e40c1a9c55afd1a207d1eff9"
],
"version": "==1.0.0"
},
"stripe": {
"hashes": [
"sha256:515fe2cc915e639468f30150a39c162fc0fb090256ae9d6a04e5022925d136f1",
"sha256:bdbbea632b8faa983c670db61debbe0bdb5802ef98fd0613a03aa466e56cdade"
],
"index": "pypi",
"version": "==2.48.0"
},
"urllib3": {
"hashes": [
"sha256:3018294ebefce6572a474f0604c2021e33b3fd8006ecd11d62107a5d2a963527",
"sha256:88206b0eb87e6d677d424843ac5209e3fb9d0190d0ee169599165ec25e9d9115"
],
"version": "==1.25.9"
},
"uwsgi": {
"hashes": [
"sha256:faa85e053c0b1be4d5585b0858d3a511d2cd10201802e8676060fd0a109e5869"
],
"index": "pypi",
"version": "==2.0.19.1"
},
"werkzeug": {
"hashes": [
"sha256:2de2a5db0baeae7b2d2664949077c2ac63fbd16d98da0ff71837f7d1dea3fd43",
"sha256:6c80b1e5ad3665290ea39320b91e1be1e0d5f60652b964a3070216de83d2e47c"
],
"version": "==1.0.1"
},
"wrapt": {
"hashes": [
"sha256:b62ffa81fb85f4332a4f609cab4ac40709470da05643a082ec1eb88e6d9b97d7"
],
"version": "==1.12.1"
}
},
"develop": {
"allure-behave": {
"hashes": [
"sha256:71f7ab8f7afb38ca323bdf0f300cb3a280928e63c2e962a30748c23914c8ee3d",
"sha256:a6ec9968ec6c6ee69ab964cbea65e9dfa81e283f3b55ad5be8d42f3df70f8766"
],
"index": "pypi",
"version": "==2.8.16"
},
"allure-python-commons": {
"hashes": [
"sha256:3cf65bce770e4d6b6b1bd46bfecad8a04f1f7bef44133f9a3ded4295510187e2",
"sha256:f67104a51643f2b0f1807acfe324bc13c1fa97f16d9b5c85670199acabd5c40d"
],
"version": "==2.8.16"
},
"astroid": {
"hashes": [
"sha256:2f4078c2a41bf377eea06d71c9d2ba4eb8f6b1af2135bec27bbbb7d8f12bb703",
"sha256:bc58d83eb610252fd8de6363e39d4f1d0619c894b0ed24603b881c02e64c7386"
],
"version": "==2.4.2"
},
"attrs": {
"hashes": [
"sha256:08a96c641c3a74e44eb59afb61a24f2cb9f4d7188748e76ba4bb5edfa3cb7d1c",
"sha256:f7b7ce16570fe9965acd6d30101a28f62fb4a7f9e926b3bbc9b61f8b04247e72"
],
"version": "==19.3.0"
},
"behave": {
"hashes": [
"sha256:b9662327aa53294c1351b0a9c369093ccec1d21026f050c3bd9b3e5cccf81a86",
"sha256:ebda1a6c9e5bfe95c5f9f0a2794e01c7098b3dde86c10a95d8621c5907ff6f1c"
],
"index": "pypi",
"version": "==1.2.6"
},
"importlib-metadata": {
"hashes": [
"sha256:90bb658cdbbf6d1735b6341ce708fc7024a3e14e99ffdc5783edea9f9b077f83",
"sha256:dc15b2969b4ce36305c51eebe62d418ac7791e9a157911d58bfb1f9ccd8e2070"
],
"markers": "python_version < '3.8'",
"version": "==1.7.0"
},
"isort": {
"hashes": [
"sha256:54da7e92468955c4fceacd0c86bd0ec997b0e1ee80d97f67c35a78b719dccab1",
"sha256:6e811fcb295968434526407adb8796944f1988c5b65e8139058f2014cbe100fd"
],
"version": "==4.3.21"
},
"lazy-object-proxy": {
"hashes": [
"sha256:0c4b206227a8097f05c4dbdd323c50edf81f15db3b8dc064d08c62d37e1a504d",
"sha256:194d092e6f246b906e8f70884e620e459fc54db3259e60cf69a4d66c3fda3449",
"sha256:1be7e4c9f96948003609aa6c974ae59830a6baecc5376c25c92d7d697e684c08",
"sha256:4677f594e474c91da97f489fea5b7daa17b5517190899cf213697e48d3902f5a",
"sha256:48dab84ebd4831077b150572aec802f303117c8cc5c871e182447281ebf3ac50",
"sha256:5541cada25cd173702dbd99f8e22434105456314462326f06dba3e180f203dfd",
"sha256:59f79fef100b09564bc2df42ea2d8d21a64fdcda64979c0fa3db7bdaabaf6239",
"sha256:8d859b89baf8ef7f8bc6b00aa20316483d67f0b1cbf422f5b4dc56701c8f2ffb",
"sha256:9254f4358b9b541e3441b007a0ea0764b9d056afdeafc1a5569eee1cc6c1b9ea",
"sha256:9651375199045a358eb6741df3e02a651e0330be090b3bc79f6d0de31a80ec3e",
"sha256:97bb5884f6f1cdce0099f86b907aa41c970c3c672ac8b9c8352789e103cf3156",
"sha256:9b15f3f4c0f35727d3a0fba4b770b3c4ebbb1fa907dbcc046a1d2799f3edd142",
"sha256:a2238e9d1bb71a56cd710611a1614d1194dc10a175c1e08d75e1a7bcc250d442",
"sha256:a6ae12d08c0bf9909ce12385803a543bfe99b95fe01e752536a60af2b7797c62",
"sha256:ca0a928a3ddbc5725be2dd1cf895ec0a254798915fb3a36af0964a0a4149e3db",
"sha256:cb2c7c57005a6804ab66f106ceb8482da55f5314b7fcb06551db1edae4ad1531",
"sha256:d74bb8693bf9cf75ac3b47a54d716bbb1a92648d5f781fc799347cfc95952383",
"sha256:d945239a5639b3ff35b70a88c5f2f491913eb94871780ebfabb2568bd58afc5a",
"sha256:eba7011090323c1dadf18b3b689845fd96a61ba0a1dfbd7f24b921398affc357",
"sha256:efa1909120ce98bbb3777e8b6f92237f5d5c8ea6758efea36a473e1d38f7d3e4",
"sha256:f3900e8a5de27447acbf900b4750b0ddfd7ec1ea7fbaf11dfa911141bc522af0"
],
"version": "==1.4.3"
},
"mccabe": {
"hashes": [
"sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42",
"sha256:dd8d182285a0fe56bace7f45b5e7d1a6ebcbf524e8f3bd87eb0f125271b8831f"
],
"version": "==0.6.1"
},
"parse": {
"hashes": [
"sha256:a6d4e2c2f1fbde6717d28084a191a052950f758c0cbd83805357e6575c2b95c0"
],
"version": "==1.15.0"
},
"parse-type": {
"hashes": [
"sha256:089a471b06327103865dfec2dd844230c3c658a4a1b5b4c8b6c16c8f77577f9e",
"sha256:7f690b18d35048c15438d6d0571f9045cffbec5907e0b1ccf006f889e3a38c0b"
],
"version": "==0.5.2"
},
"pluggy": {
"hashes": [
"sha256:15b2acde666561e1298d71b523007ed7364de07029219b604cf808bfa1c765b0",
"sha256:966c145cd83c96502c3c3868f50408687b38434af77734af1e9ca461a4081d2d"
],
"version": "==0.13.1"
},
"pyhamcrest": {
"hashes": [
"sha256:412e00137858f04bde0729913874a48485665f2d36fe9ee449f26be864af9316",
"sha256:7ead136e03655af85069b6f47b23eb7c3e5c221aa9f022a4fbb499f5b7308f29"
],
"version": "==2.0.2"
},
"pylint": {
"hashes": [
"sha256:7dd78437f2d8d019717dbf287772d0b2dbdfd13fc016aa7faa08d67bccc46adc",
"sha256:d0ece7d223fe422088b0e8f13fa0a1e8eb745ebffcb8ed53d3e95394b6101a1c"
],
"index": "pypi",
"version": "==2.5.3"
},
"six": {
"hashes": [
"sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259",
"sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced"
],
"version": "==1.15.0"
},
"toml": {
"hashes": [
"sha256:926b612be1e5ce0634a2ca03470f95169cf16f939018233a670519cb4ac58b0f",
"sha256:bda89d5935c2eac546d648028b9901107a595863cb36bae0c73ac804a9b4ce88"
],
"version": "==0.10.1"
},
"typed-ast": {
"hashes": [
"sha256:0666aa36131496aed8f7be0410ff974562ab7eeac11ef351def9ea6fa28f6355",
"sha256:0c2c07682d61a629b68433afb159376e24e5b2fd4641d35424e462169c0a7919",
"sha256:249862707802d40f7f29f6e1aad8d84b5aa9e44552d2cc17384b209f091276aa",
"sha256:24995c843eb0ad11a4527b026b4dde3da70e1f2d8806c99b7b4a7cf491612652",
"sha256:269151951236b0f9a6f04015a9004084a5ab0d5f19b57de779f908621e7d8b75",
"sha256:4083861b0aa07990b619bd7ddc365eb7fa4b817e99cf5f8d9cf21a42780f6e01",
"sha256:498b0f36cc7054c1fead3d7fc59d2150f4d5c6c56ba7fb150c013fbc683a8d2d",
"sha256:4e3e5da80ccbebfff202a67bf900d081906c358ccc3d5e3c8aea42fdfdfd51c1",
"sha256:6daac9731f172c2a22ade6ed0c00197ee7cc1221aa84cfdf9c31defeb059a907",
"sha256:715ff2f2df46121071622063fc7543d9b1fd19ebfc4f5c8895af64a77a8c852c",
"sha256:73d785a950fc82dd2a25897d525d003f6378d1cb23ab305578394694202a58c3",
"sha256:8c8aaad94455178e3187ab22c8b01a3837f8ee50e09cf31f1ba129eb293ec30b",
"sha256:8ce678dbaf790dbdb3eba24056d5364fb45944f33553dd5869b7580cdbb83614",
"sha256:aaee9905aee35ba5905cfb3c62f3e83b3bec7b39413f0a7f19be4e547ea01ebb",
"sha256:bcd3b13b56ea479b3650b82cabd6b5343a625b0ced5429e4ccad28a8973f301b",
"sha256:c9e348e02e4d2b4a8b2eedb48210430658df6951fa484e59de33ff773fbd4b41",
"sha256:d205b1b46085271b4e15f670058ce182bd1199e56b317bf2ec004b6a44f911f6",
"sha256:d43943ef777f9a1c42bf4e552ba23ac77a6351de620aa9acf64ad54933ad4d34",
"sha256:d5d33e9e7af3b34a40dc05f498939f0ebf187f07c385fd58d591c533ad8562fe",
"sha256:fc0fea399acb12edbf8a628ba8d2312f583bdbdb3335635db062fa98cf71fca4",
"sha256:fe460b922ec15dd205595c9b5b99e2f056fd98ae8f9f56b888e7a17dc2b757e7"
],
"markers": "implementation_name == 'cpython' and python_version < '3.8'",
"version": "==1.4.1"
},
"wrapt": {
"hashes": [
"sha256:b62ffa81fb85f4332a4f609cab4ac40709470da05643a082ec1eb88e6d9b97d7"
],
"version": "==1.12.1"
},
"zipp": {
"hashes": [
"sha256:aa36550ff0c0b7ef7fa639055d797116ee891440eac1a56f378e2d3179e0320b",
"sha256:c599e4d75c98f6798c509911d08a22e6c021d074469042177c8c86fb92eefd96"
],
"version": "==3.1.0"
}
}
}


@ -16,4 +16,3 @@
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.

api/public/poetry.lock (generated, new file, 1964 lines)

File diff suppressed because it is too large


@ -16,4 +16,3 @@
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.


@ -16,7 +16,7 @@
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Define the Selene Public API"""
import os
from flask import Flask
@ -25,7 +25,8 @@ from selene.api import SeleneResponse, selene_api
from selene.api.base_config import get_base_config
from selene.api.public_endpoint import check_oauth_token
from selene.util.cache import SeleneCache
from selene.util.log import configure_logger
from selene.util.log import configure_selene_logger
from .endpoints.audio_transcription import AudioTranscriptionEndpoint
from .endpoints.device import DeviceEndpoint
from .endpoints.device_activate import DeviceActivateEndpoint
from .endpoints.device_code import DeviceCodeEndpoint
@ -33,6 +34,7 @@ from .endpoints.device_email import DeviceEmailEndpoint
from .endpoints.device_location import DeviceLocationEndpoint
from .endpoints.device_metrics import DeviceMetricsEndpoint
from .endpoints.device_oauth import OauthServiceEndpoint
from .endpoints.device_pantacor import DevicePantacorEndpoint
from .endpoints.device_refresh_token import DeviceRefreshTokenEndpoint
from .endpoints.device_setting import DeviceSettingEndpoint
from .endpoints.device_skill import SkillSettingsMetaEndpoint
@ -46,135 +48,157 @@ from .endpoints.oauth_callback import OauthCallbackEndpoint
from .endpoints.open_weather_map import OpenWeatherMapEndpoint
from .endpoints.premium_voice import PremiumVoiceEndpoint
from .endpoints.stripe_webhook import StripeWebHookEndpoint
from .endpoints.wake_word_file import WakeWordFileUpload
from .endpoints.wolfram_alpha import WolframAlphaEndpoint
from .endpoints.wolfram_alpha_simple import WolframAlphaSimpleEndpoint
from .endpoints.wolfram_alpha_spoken import WolframAlphaSpokenEndpoint
from .endpoints.wolfram_alpha_v2 import WolframAlphaV2Endpoint
_log = configure_logger('public_api')
configure_selene_logger("public_api")
public = Flask(__name__)
public.config.from_object(get_base_config())
public.config['GOOGLE_STT_KEY'] = os.environ['GOOGLE_STT_KEY']
public.config['SELENE_CACHE'] = SeleneCache()
public.config["GOOGLE_STT_KEY"] = os.environ["GOOGLE_STT_KEY"]
public.config["SELENE_CACHE"] = SeleneCache()
public.response_class = SeleneResponse
public.register_blueprint(selene_api)
public.add_url_rule(
'/v1/device/<string:device_id>/skill/<string:skill_gid>',
view_func=DeviceSkillSettingsEndpoint.as_view('device_skill_delete_api'),
methods=['DELETE']
"/v1/transcribe",
view_func=AudioTranscriptionEndpoint.as_view("audio_transcription_api"),
methods=["POST"],
)
public.add_url_rule(
'/v1/device/<string:device_id>/skill',
view_func=DeviceSkillSettingsEndpoint.as_view('device_skill_api'),
methods=['GET', 'PUT']
"/v1/device/<string:device_id>/skill/<string:skill_gid>",
view_func=DeviceSkillSettingsEndpoint.as_view("device_skill_delete_api"),
methods=["DELETE"],
)
public.add_url_rule(
'/v1/device/<string:device_id>/skill/settings',
view_func=DeviceSkillSettingsEndpointV2.as_view('skill_settings_api'),
methods=['GET']
"/v1/device/<string:device_id>/skill",
view_func=DeviceSkillSettingsEndpoint.as_view("device_skill_api"),
methods=["GET", "PUT"],
)
public.add_url_rule(
'/v1/device/<string:device_id>/settingsMeta',
view_func=SkillSettingsMetaEndpoint.as_view('device_user_skill_api'),
methods=['PUT']
"/v1/device/<string:device_id>/skill/settings",
view_func=DeviceSkillSettingsEndpointV2.as_view("skill_settings_api"),
methods=["GET"],
)
public.add_url_rule(
'/v1/device/<string:device_id>',
view_func=DeviceEndpoint.as_view('device_api'),
methods=['GET', 'PATCH']
"/v1/device/<string:device_id>/settingsMeta",
view_func=SkillSettingsMetaEndpoint.as_view("device_user_skill_api"),
methods=["PUT"],
)
public.add_url_rule(
'/v1/device/<string:device_id>/setting',
view_func=DeviceSettingEndpoint.as_view('device_settings_api'),
methods=['GET']
"/v1/device/<string:device_id>",
view_func=DeviceEndpoint.as_view("device_api"),
methods=["GET", "PATCH"],
)
public.add_url_rule(
'/v1/device/<string:device_id>/subscription',
view_func=DeviceSubscriptionEndpoint.as_view('device_subscription_api'),
methods=['GET']
"/v1/device/<string:device_id>/setting",
view_func=DeviceSettingEndpoint.as_view("device_settings_api"),
methods=["GET"],
)
public.add_url_rule(
'/v1/geolocation',
view_func=GeolocationEndpoint.as_view('location_api'),
methods=['GET']
"/v1/device/<string:device_id>/subscription",
view_func=DeviceSubscriptionEndpoint.as_view("device_subscription_api"),
methods=["GET"],
)
public.add_url_rule(
'/v1/wa',
view_func=WolframAlphaEndpoint.as_view('wolfram_alpha_api'),
methods=['GET']
"/v1/geolocation",
view_func=GeolocationEndpoint.as_view("location_api"),
methods=["GET"],
)
public.add_url_rule(
"/v1/wa",
view_func=WolframAlphaEndpoint.as_view("wolfram_alpha_api"),
methods=["GET"],
) # TODO: change this path in the API v2
public.add_url_rule(
'/v1/owm/<path:path>',
view_func=OpenWeatherMapEndpoint.as_view('open_weather_map_api'),
methods=['GET']
) # TODO: change this path in the API v2
public.add_url_rule(
'/v1/stt',
view_func=GoogleSTTEndpoint.as_view('google_stt_api'),
methods=['POST']
"/v1/owm/<path:path>",
view_func=OpenWeatherMapEndpoint.as_view("open_weather_map_api"),
methods=["GET"],
) # TODO: change this path in the API v2
public.add_url_rule(
'/v1/device/code',
view_func=DeviceCodeEndpoint.as_view('device_code_api'),
methods=['GET']
"/v1/stt", view_func=GoogleSTTEndpoint.as_view("google_stt_api"), methods=["POST"]
) # TODO: change this path in the API v2
public.add_url_rule(
"/v1/device/code",
view_func=DeviceCodeEndpoint.as_view("device_code_api"),
methods=["GET"],
)
public.add_url_rule(
'/v1/device/activate',
view_func=DeviceActivateEndpoint.as_view('device_activate_api'),
methods=['POST']
"/v1/device/activate",
view_func=DeviceActivateEndpoint.as_view("device_activate_api"),
methods=["POST"],
)
public.add_url_rule(
'/v1/device/<string:device_id>/message',
view_func=DeviceEmailEndpoint.as_view('device_email_api'),
methods=['PUT']
"/v1/device/pantacor",
view_func=DevicePantacorEndpoint.as_view("device_pantacor_api"),
methods=["POST"],
)
public.add_url_rule(
'/v1/device/<string:device_id>/metric/<path:metric>',
view_func=DeviceMetricsEndpoint.as_view('device_metric_api'),
methods=['POST']
"/v1/device/<string:device_id>/message",
view_func=DeviceEmailEndpoint.as_view("device_email_api"),
methods=["PUT"],
)
public.add_url_rule(
'/v1/auth/token',
view_func=DeviceRefreshTokenEndpoint.as_view('refresh_token_api'),
methods=['GET']
"/v1/device/<string:device_id>/metric/<path:metric>",
view_func=DeviceMetricsEndpoint.as_view("device_metric_api"),
methods=["POST"],
)
public.add_url_rule(
'/v1/wolframAlphaSpoken',
view_func=WolframAlphaSpokenEndpoint.as_view('wolfram_alpha_spoken_api'),
methods=['GET']
"/v1/auth/token",
view_func=DeviceRefreshTokenEndpoint.as_view("refresh_token_api"),
methods=["GET"],
)
public.add_url_rule(
'/v1/device/<string:device_id>/location',
view_func=DeviceLocationEndpoint.as_view('device_location_api'),
methods=['GET']
"/v1/wolframAlphaSimple",
view_func=WolframAlphaSimpleEndpoint.as_view("wolfram_alpha_simple_api"),
methods=["GET"],
)
public.add_url_rule(
'/v1/device/<string:device_id>/skillJson',
view_func=DeviceSkillManifestEndpoint.as_view('skill_manifest_api'),
methods=['GET', 'PUT']
"/v1/wolframAlphaSpoken",
view_func=WolframAlphaSpokenEndpoint.as_view("wolfram_alpha_spoken_api"),
methods=["GET"],
)
public.add_url_rule(
'/v1/auth/callback',
view_func=OauthCallbackEndpoint.as_view('oauth_callback_api'),
methods=['GET']
"/v1/wolframAlphaFull",
view_func=WolframAlphaV2Endpoint.as_view("wolfram_alpha_v2_api"),
methods=["GET"],
)
public.add_url_rule(
'/v1/device/<string:device_id>/voice',
view_func=PremiumVoiceEndpoint.as_view('premium_voice_api'),
methods=['GET']
"/v1/device/<string:device_id>/location",
view_func=DeviceLocationEndpoint.as_view("device_location_api"),
methods=["GET"],
)
public.add_url_rule(
'/v1/device/<string:device_id>/<string:oauth_path>/<string:credentials>',
view_func=OauthServiceEndpoint.as_view('oauth_api'),
methods=['GET']
"/v1/device/<string:device_id>/skillJson",
view_func=DeviceSkillManifestEndpoint.as_view("skill_manifest_api"),
methods=["GET", "PUT"],
)
public.add_url_rule(
'/v1/user/stripe/webhook',
view_func=StripeWebHookEndpoint.as_view('stripe_webhook_api'),
methods=['POST']
"/v1/auth/callback",
view_func=OauthCallbackEndpoint.as_view("oauth_callback_api"),
methods=["GET"],
)
public.add_url_rule(
"/v1/device/<string:device_id>/voice",
view_func=PremiumVoiceEndpoint.as_view("premium_voice_api"),
methods=["GET"],
)
public.add_url_rule(
"/v1/device/<string:device_id>/<string:oauth_path>/<string:credentials>",
view_func=OauthServiceEndpoint.as_view("oauth_api"),
methods=["GET"],
)
public.add_url_rule(
"/v1/user/stripe/webhook",
view_func=StripeWebHookEndpoint.as_view("stripe_webhook_api"),
methods=["POST"],
)
public.add_url_rule(
"/v1/device/<string:device_id>/wake-word-file",
view_func=WakeWordFileUpload.as_view("wake_word_file"),
methods=["POST"],
)
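
Each of the routes above is wired to an endpoint class through Flask's as_view registration. For readers unfamiliar with that pattern, a minimal stand-alone sketch of it follows; EchoEndpoint and the /v1/echo route are invented for illustration and are not part of this diff.

# Minimal sketch of the Flask MethodView registration pattern used above.
# EchoEndpoint and /v1/echo are hypothetical examples, not Selene code.
from http import HTTPStatus

from flask import Flask, jsonify, request
from flask.views import MethodView


class EchoEndpoint(MethodView):
    """Echo the posted JSON back to the caller."""

    def post(self):
        return jsonify(request.json), HTTPStatus.OK


app = Flask(__name__)
app.add_url_rule(
    "/v1/echo", view_func=EchoEndpoint.as_view("echo_api"), methods=["POST"]
)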


@ -16,4 +16,3 @@
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.


@ -0,0 +1,134 @@
# Mycroft Server - Backend
# Copyright (C) 2019 Mycroft AI Inc
# SPDX-License-Identifier: AGPL-3.0-or-later
#
# This file is part of the Mycroft Server.
#
# The Mycroft Server is free software: you can redistribute it and/or
# modify it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Public API endpoint for official Mycroft-supported audio transcriptions.
When a device is configured to use the Mycroft STT plugin for transcribing audio,
this endpoint will be called to do the transcription anonymously.
"""
from datetime import datetime
from decimal import Decimal
from http import HTTPStatus
from io import BytesIO
from typing import Optional
import librosa
from google.cloud import speech
from selene.api import PublicEndpoint, track_account_activity
from selene.data.account import AccountRepository
from selene.data.metric import SttTranscriptionMetric, TranscriptionMetricRepository
from selene.util.log import get_selene_logger
SAMPLE_RATE = 16000
_log = get_selene_logger(__name__)
class AudioTranscriptionEndpoint(PublicEndpoint):
"""Transcribes audio data to text and responds with the result."""
def __init__(self):
super().__init__()
self.audio_duration = Decimal(0.0)
self.transcription_duration = Decimal(0.0)
def post(self):
"""Processes an HTTP Post request."""
self._authenticate()
transcription = self._transcribe()
self._add_transcription_metric(transcription)
if transcription is not None:
track_account_activity(self.db, self.device_id)
return dict(transcription=transcription), HTTPStatus.OK
def _transcribe(self) -> Optional[str]:
"""Transcribes the audio in the request to text using a transcription service.
:returns: None if the transcription failed or the transcription
"""
response = self._call_transcription_api()
transcription = self._get_transcription(response)
return transcription
def _call_transcription_api(self) -> Optional[speech.RecognizeResponse]:
"""Calls the configured audio transcription service API.
:returns: None if the call fails or the result of the API call
"""
response = None
client = speech.SpeechClient()
audio = speech.RecognitionAudio(content=self.request.data)
config_values = dict(
encoding=speech.RecognitionConfig.AudioEncoding.FLAC,
sample_rate_hertz=SAMPLE_RATE,
language_code="en-US",
)
config = speech.RecognitionConfig(**config_values)
start_timestamp = datetime.now()
try:
response = client.recognize(config=config, audio=audio)
except Exception:
_log.exception(f"{self.request_id}: Transcription failed.")
finally:
end_timestamp = datetime.now()
transcription_duration = (end_timestamp - start_timestamp).total_seconds()
self.transcription_duration = Decimal(str(transcription_duration))
return response
def _get_transcription(
self, response: Optional[speech.RecognizeResponse]
) -> Optional[str]:
"""Interrogates the response from the transcription service API.
:param response: the transcription service API response
:return: None if the audio could not be transcribed or the transcription
"""
transcription = None
if response:
highest_confidence = 0
for result in response.results:
for alternative in result.alternatives:
if alternative.confidence > highest_confidence:
transcription = alternative.transcript
return transcription
def _add_transcription_metric(self, transcription: str):
"""Adds metrics for this STT transcription to the database."""
account_repo = AccountRepository(self.db)
account = account_repo.get_account_by_device_id(self.device_id)
transcription_metric = SttTranscriptionMetric(
account_id=account.id,
engine="Google Cloud",
success=transcription is not None,
audio_duration=Decimal(str(self._determine_audio_duration())),
transcription_duration=Decimal(str(self.transcription_duration)),
)
transcription_metric_repo = TranscriptionMetricRepository(self.db)
transcription_metric_repo.add(transcription_metric)
def _determine_audio_duration(self) -> float:
"""Determines the duration of the audio data for the metrics."""
with BytesIO(self.request.data) as request_audio:
audio, _ = librosa.load(request_audio, sr=SAMPLE_RATE, mono=True)
return librosa.get_duration(y=audio, sr=SAMPLE_RATE)
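
As a rough illustration of how a device might exercise the new /v1/transcribe route: the sketch below assumes the usual device bearer-token authentication and a FLAC recording sampled at 16 kHz, matching the RecognitionConfig above; the host name, token, and file path are placeholders, not taken from this diff.

# Hypothetical client-side call to the /v1/transcribe route; host, token and
# file path are assumptions for illustration only.
import requests

SELENE_URL = "https://api.mycroft.ai"  # placeholder base URL
DEVICE_TOKEN = "<device-access-token>"  # placeholder bearer token

with open("utterance.flac", "rb") as flac_file:
    audio = flac_file.read()  # FLAC-encoded, 16 kHz mono

response = requests.post(
    f"{SELENE_URL}/v1/transcribe",
    headers={"Authorization": f"Bearer {DEVICE_TOKEN}"},
    data=audio,
)
response.raise_for_status()
print(response.json()["transcription"])  # None when the transcription failed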


@ -29,14 +29,15 @@ from selene.data.device import DeviceRepository
class UpdateDevice(Model):
coreVersion = StringType(default='unknown')
platform = StringType(default='unknown')
coreVersion = StringType(default="unknown")
platform = StringType(default="unknown")
platform_build = StringType()
enclosureVersion = StringType(default='unknown')
enclosureVersion = StringType(default="unknown")
class DeviceEndpoint(PublicEndpoint):
"""Return the device entity using the device_id"""
def __init__(self):
super(DeviceEndpoint, self).__init__()
@ -53,13 +54,13 @@ class DeviceEndpoint(PublicEndpoint):
coreVersion=device.core_version,
enclosureVersion=device.enclosure_version,
platform=device.platform,
user=dict(uuid=device.account_id)
user=dict(uuid=device.account_id),
)
response = response_data, HTTPStatus.OK
self._add_etag(device_etag_key(device_id))
else:
response = '', HTTPStatus.NO_CONTENT
response = "", HTTPStatus.NO_CONTENT
return response
@ -69,10 +70,10 @@ class DeviceEndpoint(PublicEndpoint):
update_device = UpdateDevice(payload)
update_device.validate()
updates = dict(
platform=payload.get('platform') or 'unknown',
enclosure_version=payload.get('enclosureVersion') or 'unknown',
core_version=payload.get('coreVersion') or 'unknown'
platform=payload.get("platform") or "unknown",
enclosure_version=payload.get("enclosureVersion") or "unknown",
core_version=payload.get("coreVersion") or "unknown",
)
DeviceRepository(self.db).update_device_from_core(device_id, updates)
return '', HTTPStatus.OK
return "", HTTPStatus.OK


@ -32,41 +32,64 @@ from http import HTTPStatus
from schematics import Model
from schematics.types import StringType
from selene.api import PublicEndpoint
from selene.api import generate_device_login
from selene.api import generate_device_login, PublicEndpoint
from selene.data.device import DeviceRepository
from selene.util.cache import DEVICE_PAIRING_TOKEN_KEY
class ActivationRequest(Model):
"""Data model of the fields in the request."""
token = StringType(required=True)
state = StringType(required=True)
platform = StringType(default='unknown')
coreVersion = StringType(default='unknown')
enclosureVersion = StringType(default='unknown')
platform = StringType(default="unknown")
core_version = StringType(default="unknown")
enclosure_version = StringType(default="unknown")
platform_build = StringType()
pantacor_device_id = StringType()
class DeviceActivateEndpoint(PublicEndpoint):
"""API endpoint for activating a device, the last step in device pairing."""
_device_repository = None
@property
def device_repository(self):
"""Lazily load an instance of the device repository"""
if self._device_repository is None:
self._device_repository = DeviceRepository(self.db)
return self._device_repository
def post(self):
"""Process a HTTP POST request."""
activation_request = self._validate_request()
pairing_session = self._get_pairing_session()
if pairing_session is not None:
device_id = pairing_session['uuid']
device_id = pairing_session["uuid"]
self._activate(device_id, activation_request)
response = (
generate_device_login(device_id, self.cache),
HTTPStatus.OK
)
response = (generate_device_login(device_id, self.cache), HTTPStatus.OK)
else:
response = '', HTTPStatus.NOT_FOUND
response = "", HTTPStatus.NOT_FOUND
return response
def _validate_request(self):
def _validate_request(self) -> dict:
"""Validate the contents of the API request against the data model."""
# TODO: remove this hack when mycroft-core mark-2 branch is merged into dev
if "coreVersion" in self.request.json:
self.request.json["core_version"] = self.request.json["coreVersion"]
del self.request.json["coreVersion"]
if "enclosureVersion" in self.request.json:
self.request.json["enclosure_version"] = self.request.json[
"enclosureVersion"
]
del self.request.json["enclosureVersion"]
activation_request = ActivationRequest(self.request.json)
activation_request.validate()
return activation_request
return activation_request.to_native()
def _get_pairing_session(self):
"""Get the pairing session from the cache.
@ -74,22 +97,27 @@ class DeviceActivateEndpoint(PublicEndpoint):
The request must have same state value as that stored in the
pairing session.
"""
pairing_session_token = self.request.json['token']
pairing_session_token = self.request.json["token"]
pairing_session_key = DEVICE_PAIRING_TOKEN_KEY.format(
pairing_token=pairing_session_token
)
pairing_session = self.cache.get(pairing_session_key)
if pairing_session:
pairing_session = json.loads(pairing_session)
if self.request.json['state'] == pairing_session['state']:
if self.request.json["state"] == pairing_session["state"]:
self.cache.delete(pairing_session_key)
return pairing_session
def _activate(self, device_id: str, activation_request: ActivationRequest):
"""Updates the core version, platform and enclosure_version columns"""
return pairing_session
def _activate(self, device_id: str, activation_request: dict):
"""Updates the core version, platform and enclosure_version columns
:param device_id: internal identifier of the device
:param activation_request: validated request data
"""
updates = dict(
platform=str(activation_request.platform),
enclosure_version=str(activation_request.enclosureVersion),
core_version=str(activation_request.coreVersion)
platform=str(activation_request["platform"]),
enclosure_version=str(activation_request["enclosure_version"]),
core_version=str(activation_request["core_version"]),
)
DeviceRepository(self.db).update_device_from_core(device_id, updates)
self.device_repository.update_device_from_core(device_id, updates)
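
The coreVersion/enclosureVersion handling above lets older devices keep sending camelCase keys while the data model now uses snake_case. A stand-alone sketch of that normalisation, with an invented payload:

# Illustration of the camelCase -> snake_case key normalisation performed in
# _validate_request; the payload values below are made up.
def normalize_activation_keys(payload: dict) -> dict:
    renames = {"coreVersion": "core_version", "enclosureVersion": "enclosure_version"}
    for old_key, new_key in renames.items():
        if old_key in payload:
            payload[new_key] = payload.pop(old_key)
    return payload


example = {
    "token": "abc123",
    "state": "xyz789",
    "platform": "mycroft_mark_2",
    "coreVersion": "22.2.0",
    "enclosureVersion": "unknown",
    "pantacor_device_id": "pdi-0001",
}
print(normalize_activation_keys(example)["core_version"])  # 22.2.0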


@ -16,7 +16,6 @@
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Endpoint to generate a pairing code and return it to the device.
The response returned to the device consists of:
@ -38,23 +37,21 @@ import json
import random
import uuid
from http import HTTPStatus
from logging import getLogger
from selene.api import PublicEndpoint
from selene.util.cache import DEVICE_PAIRING_CODE_KEY
from selene.util.log import get_selene_logger
# Avoid using ambiguous characters in the pairing code, like 0 and O, that
# are hard to distinguish on a device display.
ALLOWED_CHARACTERS = "ACEFHJKLMNPRTUVWXY3479"
ONE_DAY = 86400
_log = getLogger(__package__)
_log = get_selene_logger(__name__)
class DeviceCodeEndpoint(PublicEndpoint):
# Avoid using ambiguous characters in the pairing code, like 0 and O, that
# are hard to distinguish on a device display.
allowed_characters = "ACEFHJKLMNPRTUVWXY3479"
def __init__(self):
super(DeviceCodeEndpoint, self).__init__()
"""Endpoint to generate a pairing code and send it back to the device."""
def get(self):
"""Return a pairing code to the requesting device.
@ -75,22 +72,15 @@ class DeviceCodeEndpoint(PublicEndpoint):
exist is created.
"""
response_data = dict(
state=self.request.args['state'],
state=self.request.args["state"],
token=self._generate_token(),
expiration=ONE_DAY
expiration=ONE_DAY,
)
pairing_code_added = False
while not pairing_code_added:
added_to_cache = False
while not added_to_cache:
pairing_code = self._generate_pairing_code()
_log.debug('Generated pairing code ' + pairing_code)
response_data.update(code=pairing_code)
pairing_code_added = self.cache.set_if_not_exists_with_expiration(
DEVICE_PAIRING_CODE_KEY.format(pairing_code=pairing_code),
value=json.dumps(response_data),
expiration=ONE_DAY
)
log_msg = 'Pairing code {pairing_code} exists, generating new code'
_log.debug(log_msg.format(pairing_code=pairing_code))
added_to_cache = self._add_pairing_code_to_cache(response_data)
return response_data
@ -98,9 +88,30 @@ class DeviceCodeEndpoint(PublicEndpoint):
def _generate_token():
"""Generate the token used by this API to identify pairing session"""
sha512 = hashlib.sha512()
sha512.update(bytes(str(uuid.uuid4()), 'utf-8'))
sha512.update(bytes(str(uuid.uuid4()), "utf-8"))
return sha512.hexdigest()
def _generate_pairing_code(self):
@staticmethod
def _generate_pairing_code():
"""Generate the pairing code that will be spoken by the device."""
return ''.join(random.choice(self.allowed_characters) for _ in range(6))
pairing_code = "".join(random.choice(ALLOWED_CHARACTERS) for _ in range(6))
_log.info("Generated pairing code {}".format(pairing_code))
return pairing_code
def _add_pairing_code_to_cache(self, response_data):
"""Add data necessary to activate the device to cache for retrieval."""
cache_key = DEVICE_PAIRING_CODE_KEY.format(pairing_code=response_data["code"])
cache_value = dict(**response_data)
core_packaging_type = self.request.args.get("packaging")
if core_packaging_type is not None:
cache_value.update(packaging_type=core_packaging_type)
added_to_cache = self.cache.set_if_not_exists_with_expiration(
cache_key, value=json.dumps(cache_value), expiration=ONE_DAY
)
if not added_to_cache:
log_msg = "Pairing code {pairing_code} exists, generating new code"
_log.debug(log_msg.format(pairing_code=response_data["pairing_code"]))
return added_to_cache
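
A stand-alone sketch of the pairing-code generation above; an in-memory set stands in for the Redis-backed set_if_not_exists_with_expiration check, and everything else is for illustration only.

# Sketch of pairing-code generation with collision retry; the set replaces the cache.
import random

ALLOWED_CHARACTERS = "ACEFHJKLMNPRTUVWXY3479"  # same alphabet as the endpoint
used_codes = set()  # placeholder for the Redis cache


def generate_pairing_code() -> str:
    return "".join(random.choice(ALLOWED_CHARACTERS) for _ in range(6))


code = generate_pairing_code()
while code in used_codes:  # regenerate on collision, as the endpoint does
    code = generate_pairing_code()
used_codes.add(code)
print(code, f"(one of {len(ALLOWED_CHARACTERS) ** 6:,} possible codes)")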


@ -16,11 +16,7 @@
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
import json
import os
import smtplib
from email.message import EmailMessage
"""Device API endpoint to send an email as specified by the device."""
from http import HTTPStatus
from schematics import Model
@ -28,9 +24,12 @@ from schematics.types import StringType
from selene.api import PublicEndpoint
from selene.data.account import AccountRepository
from selene.util.email import EmailMessage, SeleneMailer
class SendEmail(Model):
"""Data model of the incoming PUT request."""
title = StringType(required=True)
sender = StringType(required=True)
body = StringType(required=True)
@ -39,37 +38,27 @@ class SendEmail(Model):
class DeviceEmailEndpoint(PublicEndpoint):
"""Endpoint to send an email to the account associated to a device"""
def __init__(self):
super(DeviceEmailEndpoint, self).__init__()
def put(self, device_id):
"""Handle an HTTP PUT request."""
self._authenticate(device_id)
payload = json.loads(self.request.data)
send_email = SendEmail(payload)
self._validate_request()
account = AccountRepository(self.db).get_account_by_device_id(device_id)
self._send_message(account)
return "", HTTPStatus.OK
def _validate_request(self):
"""Validate that the request is well-formed."""
send_email = SendEmail(self.request.json)
send_email.validate()
account = AccountRepository(self.db).get_account_by_device_id(device_id)
if account:
message = EmailMessage()
message['Subject'] = str(send_email.title)
message['From'] = str(send_email.sender)
message.set_content(str(send_email.body))
message['To'] = account.email_address
self._send_email(message)
response = '', HTTPStatus.OK
else:
response = '', HTTPStatus.NO_CONTENT
return response
def _send_email(self, message: EmailMessage):
email_client = self.config.get('EMAIL_CLIENT')
if email_client is None:
host = os.environ['EMAIL_SERVICE_HOST']
port = os.environ['EMAIL_SERVICE_PORT']
user = os.environ['EMAIL_SERVICE_USER']
password = os.environ['EMAIL_SERVICE_PASSWORD']
email_client = smtplib.SMTP(host, port)
email_client.login(user, password)
email_client.send_message(message)
email_client.quit()
def _send_message(self, account):
"""Send an email to the account that owns the device that requested it."""
message = EmailMessage(
recipient=account.email_address,
sender="support@mycroft.ai",
subject=self.request.json["title"],
body=self.request.json["body"],
)
mailer = SeleneMailer(message)
mailer.send()
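
A hypothetical device-side request to this rewritten endpoint; the host, token, device id and payload values are placeholders. Note that the sender field is still required by the SendEmail model even though the endpoint now always sends from support@mycroft.ai.

# Hypothetical PUT to /v1/device/<device_id>/message; all values are placeholders.
import requests

device_id = "00000000-0000-0000-0000-000000000000"
payload = {
    "title": "Wake word samples uploaded",  # becomes the email subject
    "sender": "no-reply@mycroft.ai",  # validated, but overridden by the endpoint
    "body": "Your device finished uploading its wake word samples.",
}
response = requests.put(
    f"https://api.mycroft.ai/v1/device/{device_id}/message",
    headers={"Authorization": "Bearer <device-access-token>"},
    json=payload,
)
print(response.status_code)  # 200 on success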


@ -25,17 +25,18 @@ from selene.data.device import GeographyRepository
class DeviceLocationEndpoint(PublicEndpoint):
def __init__(self):
super(DeviceLocationEndpoint, self).__init__()
def get(self, device_id):
self._authenticate(device_id)
self._validate_etag(device_location_etag_key(device_id))
location = GeographyRepository(self.db, None).get_location_by_device_id(device_id)
location = GeographyRepository(self.db, None).get_location_by_device_id(
device_id
)
if location:
response = (location, HTTPStatus.OK)
self._add_etag(device_location_etag_key(device_id))
else:
response = ('', HTTPStatus.NOT_FOUND)
response = ("", HTTPStatus.NOT_FOUND)
return response


@ -36,5 +36,6 @@ class DeviceMetricsEndpoint(PublicEndpoint):
core_metric = CoreMetric(
device_id=self.device_id, metric_type=metric, metric_value=self.request.json
)
core_metrics_repo = CoreMetricRepository(self.db)
core_metrics_repo.add(core_metric)
# Writing metrics from devices is being deactivated to enable
# core_metrics_repo = CoreMetricRepository(self.db)
# core_metrics_repo.add(core_metric)


@ -26,18 +26,15 @@ from selene.data.account import AccountRepository
class OauthServiceEndpoint(PublicEndpoint):
def __init__(self):
super(OauthServiceEndpoint, self).__init__()
self.oauth_service_host = os.environ['OAUTH_BASE_URL']
self.oauth_service_host = os.environ["OAUTH_BASE_URL"]
def get(self, device_id, credentials, oauth_path):
account = AccountRepository(self.db).get_account_by_device_id(device_id)
uuid = account.id
url = '{host}/auth/{credentials}/{oauth_path}'.format(
host=self.oauth_service_host,
credentials=credentials,
oauth_path=oauth_path
url = "{host}/auth/{credentials}/{oauth_path}".format(
host=self.oauth_service_host, credentials=credentials, oauth_path=oauth_path
)
params = dict(uuid=uuid)
response = requests.get(url, params=params)


@ -0,0 +1,97 @@
# Mycroft Server - Backend
# Copyright (C) 2022 Mycroft AI Inc
# SPDX-License-Identifier: AGPL-3.0-or-later
#
# This file is part of the Mycroft Server.
#
# The Mycroft Server is free software: you can redistribute it and/or
# modify it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Endpoint to determine if a device has registered with Pantacor.
Device pairing with Selene is considered complete after the device/activate endpoint
is successful, but there is one more step in the pairing process of a device that
uses Pantacor for continuous deployment. This endpoint calls the Pantacor Fleet API to
determine if the device's registration is complete and reports back to the device.
"""
from http import HTTPStatus
from schematics import Model
from schematics.types import StringType
from selene.api import PublicEndpoint
from selene.api.pantacor import get_pantacor_device, PantacorError
from selene.data.device import DeviceRepository
from selene.util.log import get_selene_logger
_log = get_selene_logger(__name__)
class PantacorSyncRequest(Model):
"""Data model of the fields in the request."""
mycroft_device_id = StringType(required=True)
pantacor_device_id = StringType(required=True)
class DevicePantacorEndpoint(PublicEndpoint):
"""API endpoint for devices that use Pantacor for deployments.
Retrieves Pantacor configuration values, such as "auto update" from the
Pantacor Fleet API and adds the config to the device.pantacor table in the
database. The data on this table allows users to view and edit the config
values in the Selene UI. For this endpoint to be successful, the Pantacor Device
ID must be recognized by Pantacor and the device must be "claimed" by Pantacor.
"""
def post(self):
"""Process a HTTP POST request."""
self._validate_request()
pantacor_config = self._get_config_from_pantacor()
if pantacor_config is None:
response = "Pantacor Device ID not found", HTTPStatus.NOT_FOUND
elif not pantacor_config.claimed:
response = (
"Device not yet claimed by Pantacor",
HTTPStatus.PRECONDITION_REQUIRED,
)
else:
self._add_pantacor_config_to_db(pantacor_config)
response = "", HTTPStatus.OK
return response
def _validate_request(self):
"""Validate the contents of the API request against the data model."""
# TODO: remove this hack when mycroft-core mark-2 branch is merged into dev
activation_request = PantacorSyncRequest(self.request.json)
activation_request.validate()
def _get_config_from_pantacor(self):
"""Attempts to get the Pantacor config values from their Fleet API."""
pantacor_config = None
try:
pantacor_config = get_pantacor_device(
self.request.json["pantacor_device_id"]
)
except PantacorError:
_log.exception("Pantacor device ID not found on PantaHub")
return pantacor_config
def _add_pantacor_config_to_db(self, pantacor_config):
"""Adds the software update configs to the database."""
device_repository = DeviceRepository(self.db)
device_repository.upsert_pantacor_config(
self.request.json["mycroft_device_id"], pantacor_config
)
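
A sketch of how a device might call this endpoint and interpret the three outcomes it defines; the host, token and identifiers below are placeholders.

# Hypothetical POST to /v1/device/pantacor; host, token and IDs are made up.
from http import HTTPStatus

import requests

payload = {
    "mycroft_device_id": "00000000-0000-0000-0000-000000000000",
    "pantacor_device_id": "abcdef0123456789",
}
response = requests.post(
    "https://api.mycroft.ai/v1/device/pantacor",
    headers={"Authorization": "Bearer <device-access-token>"},
    json=payload,
)
if response.status_code == HTTPStatus.NOT_FOUND:
    print("Pantacor device ID not known to the Fleet API")
elif response.status_code == HTTPStatus.PRECONDITION_REQUIRED:
    print("Device registered but not yet claimed; retry later")
elif response.status_code == HTTPStatus.OK:
    print("Pantacor config stored; pairing is fully complete")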


@ -35,44 +35,44 @@ class DeviceRefreshTokenEndpoint(PublicEndpoint):
def get(self):
headers = self.request.headers
if 'Authorization' not in headers:
raise AuthenticationError('Oauth token not found')
token_header = self.request.headers['Authorization']
if token_header.startswith('Bearer '):
refresh = token_header[len('Bearer '):]
if "Authorization" not in headers:
raise AuthenticationError("Oauth token not found")
token_header = self.request.headers["Authorization"]
if token_header.startswith("Bearer "):
refresh = token_header[len("Bearer ") :]
session = self._refresh_session_token(refresh)
# Trying to fetch a session using the refresh token
if session:
response = session, HTTPStatus.OK
else:
device = self.request.headers.get('Device')
device = self.request.headers.get("Device")
if device:
# trying to fetch a session using the device uuid
session = self._refresh_session_token_device(device)
if session:
response = session, HTTPStatus.OK
else:
response = '', HTTPStatus.UNAUTHORIZED
response = "", HTTPStatus.UNAUTHORIZED
else:
response = '', HTTPStatus.UNAUTHORIZED
response = "", HTTPStatus.UNAUTHORIZED
else:
response = '', HTTPStatus.UNAUTHORIZED
response = "", HTTPStatus.UNAUTHORIZED
return response
def _refresh_session_token(self, refresh: str):
refresh_key = 'device.token.refresh:{}'.format(refresh)
refresh_key = "device.token.refresh:{}".format(refresh)
session = self.cache.get(refresh_key)
if session:
old_login = json.loads(session)
device_id = old_login['uuid']
device_id = old_login["uuid"]
self.cache.delete(refresh_key)
return generate_device_login(device_id, self.cache)
def _refresh_session_token_device(self, device: str):
refresh_key = 'device.session:{}'.format(device)
refresh_key = "device.session:{}".format(device)
session = self.cache.get(refresh_key)
if session:
old_login = json.loads(session)
device_id = old_login['uuid']
device_id = old_login["uuid"]
self.cache.delete(refresh_key)
return generate_device_login(device_id, self.cache)
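
A sketch of the refresh flow from the device side: the refresh token goes in the Authorization header, and the Device header carries the device UUID as a fallback. The host, tokens and response handling below are assumptions, not taken from this diff.

# Hypothetical token refresh against /v1/auth/token; host and tokens are placeholders.
import requests

SELENE_URL = "https://api.mycroft.ai"


def refresh_login(refresh_token: str, device_uuid: str) -> dict:
    """Return a new device login session from Selene."""
    response = requests.get(
        f"{SELENE_URL}/v1/auth/token",
        headers={"Authorization": f"Bearer {refresh_token}", "Device": device_uuid},
    )
    response.raise_for_status()  # 401 when neither the token nor the UUID is known
    return response.json()


print(refresh_login("<refresh-token>", "<device-uuid>"))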


@ -25,6 +25,7 @@ from selene.data.device import SettingRepository
class DeviceSettingEndpoint(PublicEndpoint):
"""Return the device's settings for the API v1 model"""
def __init__(self):
super(DeviceSettingEndpoint, self).__init__()
@ -36,5 +37,5 @@ class DeviceSettingEndpoint(PublicEndpoint):
response = (setting, HTTPStatus.OK)
self._add_etag(device_setting_etag_key(device_id))
else:
response = ('', HTTPStatus.NO_CONTENT)
response = ("", HTTPStatus.NO_CONTENT)
return response


@ -29,15 +29,9 @@ cannot send it's settings... right? The skill and its relationship to the
device should already be known when this endpoint is called.
"""
from http import HTTPStatus
from logging import getLogger
from schematics import Model
from schematics.types import (
BooleanType,
ListType,
ModelType,
StringType
)
from schematics.types import BooleanType, ListType, ModelType, StringType
from schematics.exceptions import DataError
from selene.api import PublicEndpoint
@ -47,35 +41,38 @@ from selene.data.skill import (
extract_family_from_global_id,
SettingsDisplay,
SettingsDisplayRepository,
SkillRepository
SkillRepository,
)
from selene.data.skill import SkillSettingRepository
from selene.util.log import get_selene_logger
_log = getLogger(__package__)
_log = get_selene_logger(__name__)
def _normalize_field_value(field):
"""The field values in skillMetadata are all strings, convert to native."""
normalized_value = field.get('value')
if field['type'].lower() == 'checkbox':
if field['value'] in ('false', 'False', '0'):
normalized_value = field.get("value")
if field["type"].lower() == "checkbox":
if field["value"] in ("false", "False", "0"):
normalized_value = False
elif field['value'] in ('true', 'True', '1'):
elif field["value"] in ("true", "True", "1"):
normalized_value = True
elif field['type'].lower() == 'number' and isinstance(field['value'], str):
if field['value']:
normalized_value = float(field['value'])
elif field["type"].lower() == "number" and isinstance(field["value"], str):
if field["value"]:
normalized_value = float(field["value"])
if not normalized_value % 1:
normalized_value = int(field['value'])
normalized_value = int(field["value"])
else:
normalized_value = 0
elif field['value'] == "[]":
elif field["value"] == "[]":
normalized_value = []
return normalized_value
class RequestSkillField(Model):
"""Representation of skill setting field for use in validation."""
name = StringType()
type = StringType()
label = StringType()
@ -87,20 +84,28 @@ class RequestSkillField(Model):
class RequestSkillSection(Model):
"""Representation of skill setting section for use in validation."""
name = StringType(required=True)
fields = ListType(ModelType(RequestSkillField))
class RequestSkillMetadata(Model):
"""Representation of skill setting metadata for use in validation."""
sections = ListType(ModelType(RequestSkillSection))
class RequestSkillIcon(Model):
"""Representation of skill icon for use in validation."""
color = StringType()
icon = StringType()
class RequestDeviceSkill(Model):
"""Representation of the PUT request object for use in validation."""
display_name = StringType(required=True)
icon = ModelType(RequestSkillIcon)
icon_img = StringType()
@ -109,6 +114,8 @@ class RequestDeviceSkill(Model):
class SkillSettingsMetaEndpoint(PublicEndpoint):
"""Public API endpoint for maintaining skill settings."""
def __init__(self):
super().__init__()
self.skill = None
@ -118,13 +125,19 @@ class SkillSettingsMetaEndpoint(PublicEndpoint):
self._device_skill_repo = None
@property
def device_skill_repo(self):
def device_skill_repo(self) -> DeviceSkillRepository:
"""Lazily instantiates an instance of the DeviceSkillRepository."""
if self._device_skill_repo is None:
self._device_skill_repo = DeviceSkillRepository(self.db)
return self._device_skill_repo
def put(self, device_id):
def put(self, device_id: str):
"""Handles a HTTP PUT request.
Args:
device_id: Mycroft identifier of a paired device.
"""
self._authenticate(device_id)
self._validate_request()
self._get_skill()
@ -132,7 +145,7 @@ class SkillSettingsMetaEndpoint(PublicEndpoint):
self._ensure_settings_definition_exists()
self._update_device_skill(device_id)
return '', HTTPStatus.NO_CONTENT
return "", HTTPStatus.NO_CONTENT
def _validate_request(self):
"""Ensure the request is well-formed."""
@ -142,14 +155,9 @@ class SkillSettingsMetaEndpoint(PublicEndpoint):
def _get_skill(self):
"""Retrieve the skill associated with the request."""
skill_repo = SkillRepository(self.db)
self.skill = skill_repo.get_skill_by_global_id(
self.request.json['skill_gid']
)
self.skill = skill_repo.get_skill_by_global_id(self.request.json["skill_gid"])
if self.skill is None:
err_msg = (
'No skill on database for skill ' +
self.request.json['skill_gid']
)
err_msg = "No skill on database for skill " + self.request.json["skill_gid"]
_log.error(err_msg)
raise DataError(dict(skill_gid=[err_msg]))
@ -160,20 +168,18 @@ class SkillSettingsMetaEndpoint(PublicEndpoint):
fields that should be boolean or numeric. Ensure all fields are cast
to the correct type before interacting with the database.
"""
self.skill_has_settings = 'skillMetadata' in self.request.json
self.skill_has_settings = "skillMetadata" in self.request.json
if self.skill_has_settings:
skill_metadata = self.request.json['skillMetadata']
skill_metadata = self.request.json["skillMetadata"]
self.default_settings = {}
normalized_sections = []
for section in skill_metadata['sections']:
for field in section['fields']:
if field['type'] != 'label':
field['value'] = _normalize_field_value(field)
self.default_settings[field['name']] = field['value']
for section in skill_metadata["sections"]:
for field in section["fields"]:
if field["type"] != "label":
field["value"] = _normalize_field_value(field)
self.default_settings[field["name"]] = field["value"]
normalized_sections.append(section)
self.request.json['skillMetadata'].update(
sections=normalized_sections
)
self.request.json["skillMetadata"].update(sections=normalized_sections)
def _ensure_settings_definition_exists(self):
"""Add a row to skill.settings_display if it doesn't already exist."""
@ -197,12 +203,9 @@ class SkillSettingsMetaEndpoint(PublicEndpoint):
"""The settings definition does not exist on database so add it."""
settings_def_repo = SettingsDisplayRepository(self.db)
settings_definition = SettingsDisplay(
skill_id=self.skill.id,
display_data=self.request.json
)
self.settings_definition_id = settings_def_repo.add(
settings_definition
skill_id=self.skill.id, display_data=self.request.json
)
self.settings_definition_id = settings_def_repo.add(settings_definition)
def _update_device_skill(self, device_id):
"""Update device.device_skill to match the new settings definition.
@ -215,29 +218,23 @@ class SkillSettingsMetaEndpoint(PublicEndpoint):
device_skill.settings_display_id = self.settings_definition_id
if self.skill_has_settings:
if device_skill.settings_values is None:
new_settings_values = self._initialize_skill_settings(
device_id
)
new_settings_values = self._initialize_skill_settings(device_id)
else:
new_settings_values = self._reconcile_skill_settings(
device_skill.settings_values
)
device_skill.settings_values = new_settings_values
self.device_skill_repo.update_device_skill_settings(
device_id,
device_skill
)
self.device_skill_repo.update_device_skill_settings(device_id, device_skill)
def _get_device_skill(self, device_id):
"""Retrieve the device's skill entry from the database."""
device_skill = self.device_skill_repo.get_skill_settings_for_device(
device_id,
self.skill.id
device_id, self.skill.id
)
if device_skill is None:
error_msg = (
'Received skill setting definition before manifest for '
'skill ' + self.skill.skill_gid
"Received skill setting definition before manifest for "
"skill " + self.skill.skill_gid
)
_log.error(error_msg)
raise DataError(dict(skill_gid=[error_msg]))
@ -247,12 +244,12 @@ class SkillSettingsMetaEndpoint(PublicEndpoint):
def _reconcile_skill_settings(self, settings_values):
"""Fix any new or removed settings."""
new_settings_values = {}
for name, value in self.default_settings.items():
for name in self.default_settings:
if name in settings_values:
new_settings_values[name] = settings_values[name]
else:
new_settings_values[name] = self.default_settings[name]
for name, value in settings_values.items():
for name in settings_values:
if name in self.default_settings:
new_settings_values[name] = settings_values[name]
@ -260,14 +257,13 @@ class SkillSettingsMetaEndpoint(PublicEndpoint):
def _initialize_skill_settings(self, device_id):
"""Use default settings or copy from another device in same account."""
_log.info('Initializing settings for skill ' + self.skill.skill_gid)
_log.info(f"Initializing settings for skill {self.skill.skill_gid}")
account_repo = AccountRepository(self.db)
account = account_repo.get_account_by_device_id(device_id)
skill_settings_repo = SkillSettingRepository(self.db)
skill_family = extract_family_from_global_id(self.skill.skill_gid)
family_settings = skill_settings_repo.get_family_settings(
account.id,
skill_family
account.id, skill_family
)
new_settings_values = self.default_settings
if family_settings is not None:
@ -278,15 +274,12 @@ class SkillSettingsMetaEndpoint(PublicEndpoint):
field_names = settings.settings_values.keys()
if field_names == self.default_settings.keys():
_log.info(
'Copying settings from another device for skill' +
self.skill.skill_gid
"Copying settings from another device for skill"
f"{self.skill.skill_gid}"
)
new_settings_values = settings.settings_values
break
else:
_log.info(
'Using default skill settings for skill ' +
self.skill.skill_gid
)
_log.info(f"Using default skill settings for skill {self.skill.skill_gid}")
return new_settings_values
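
The _normalize_field_value helper above (a copy also appears in the last file diff below) converts the string values devices send in settings_meta.json into native Python types. A stand-alone copy of those rules, exercised with invented fields:

# Copy of the normalisation rules shown above, run against made-up example fields.
def normalize_field_value(field: dict):
    normalized_value = field.get("value")
    if field["type"].lower() == "checkbox":
        if field["value"] in ("false", "False", "0"):
            normalized_value = False
        elif field["value"] in ("true", "True", "1"):
            normalized_value = True
    elif field["type"].lower() == "number" and isinstance(field["value"], str):
        if field["value"]:
            normalized_value = float(field["value"])
            if not normalized_value % 1:
                normalized_value = int(field["value"])
        else:
            normalized_value = 0
    elif field["value"] == "[]":
        normalized_value = []
    return normalized_value


print(normalize_field_value({"type": "checkbox", "value": "true"}))  # True
print(normalize_field_value({"type": "number", "value": "5"}))  # 5
print(normalize_field_value({"type": "number", "value": "2.5"}))  # 2.5
print(normalize_field_value({"type": "text", "value": "[]"}))  # []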


@ -28,7 +28,7 @@ from schematics.types import (
ListType,
IntType,
BooleanType,
TimestampType
TimestampType,
)
from selene.api import PublicEndpoint
@ -43,9 +43,7 @@ class SkillManifestReconciler(object):
self.skill_repo = SkillRepository(self.db)
self.device_manifest = {sm.skill_gid: sm for sm in device_manifest}
self.db_manifest = {ds.skill_gid: ds for ds in db_manifest}
self.device_manifest_global_ids = {
gid for gid in self.device_manifest.keys()
}
self.device_manifest_global_ids = {gid for gid in self.device_manifest.keys()}
self.db_manifest_global_ids = {gid for gid in self.db_manifest}
def reconcile(self):
@ -82,16 +80,14 @@ class SkillManifestReconciler(object):
for gid in skills_to_add:
skill_id = self.skill_repo.ensure_skill_exists(gid)
self.device_manifest[gid].skill_id = skill_id
self.skill_manifest_repo.add_manifest_skill(
self.device_manifest[gid]
)
self.skill_manifest_repo.add_manifest_skill(self.device_manifest[gid])
class RequestManifestSkill(Model):
name = StringType(required=True)
origin = StringType(required=True)
installation = StringType(required=True)
failure_message = StringType(default='')
failure_message = StringType(default="")
status = StringType(required=True)
beta = BooleanType(required=True)
installed = TimestampType(required=True)
@ -126,7 +122,7 @@ class DeviceSkillManifestEndpoint(PublicEndpoint):
self._validate_put_request()
self._update_skill_manifest(device_id)
return '', HTTPStatus.OK
return "", HTTPStatus.OK
def _validate_put_request(self):
request_data = SkillManifestRequest(self.request.json)
@ -137,29 +133,27 @@ class DeviceSkillManifestEndpoint(PublicEndpoint):
device_id
)
device_skill_manifest = []
for manifest_skill in self.request.json['skills']:
for manifest_skill in self.request.json["skills"]:
self._convert_manifest_timestamps(manifest_skill)
device_skill_manifest.append(
ManifestSkill(
device_id=device_id,
install_method=manifest_skill['origin'],
install_status=manifest_skill['installation'],
install_failure_reason=manifest_skill.get('failure_message'),
install_ts=manifest_skill['installed'],
skill_gid=manifest_skill['skill_gid'],
update_ts=manifest_skill['updated']
install_method=manifest_skill["origin"],
install_status=manifest_skill["installation"],
install_failure_reason=manifest_skill.get("failure_message"),
install_ts=manifest_skill["installed"],
skill_gid=manifest_skill["skill_gid"],
update_ts=manifest_skill["updated"],
)
)
reconciler = SkillManifestReconciler(
self.db,
device_skill_manifest,
db_skill_manifest
self.db, device_skill_manifest, db_skill_manifest
)
reconciler.reconcile()
@staticmethod
def _convert_manifest_timestamps(manifest_skill):
for key in ('installed', 'updated'):
for key in ("installed", "updated"):
value = manifest_skill[key]
if value:
manifest_skill[key] = datetime.fromtimestamp(value)
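
The body of reconcile() is outside this hunk, but the class above builds sets of skill global IDs from the device manifest and the database manifest. A sketch of the set arithmetic such a reconciliation presumably relies on, with invented global IDs:

# Illustration of manifest reconciliation via set arithmetic; the IDs are invented.
device_manifest_gids = {"weather|21.02", "timer|21.02", "@device1|my-local-skill"}
db_manifest_gids = {"weather|21.02", "npr-news|21.02"}

skills_to_add = device_manifest_gids - db_manifest_gids  # on the device, not in the db
skills_to_remove = db_manifest_gids - device_manifest_gids  # in the db, gone from the device
skills_in_both = device_manifest_gids & db_manifest_gids  # candidates for updates

print(sorted(skills_to_add))  # ['@device1|my-local-skill', 'timer|21.02']
print(sorted(skills_to_remove))  # ['npr-news|21.02']
print(sorted(skills_in_both))  # ['weather|21.02']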


@ -33,39 +33,37 @@ from selene.data.skill import (
SettingsDisplayRepository,
Skill,
SkillRepository,
SkillSettingRepository
SkillSettingRepository,
)
from selene.util.cache import DEVICE_SKILL_ETAG_KEY
# matches <submodule_name>|<branch>
GLOBAL_ID_PATTERN = '^([^\|@]+)\|([^\|]+$)'
GLOBAL_ID_PATTERN = "^([^\|@]+)\|([^\|]+$)"
# matches @<device_id>|<submodule_name>|<branch>
GLOBAL_ID_DIRTY_PATTERN = '^@(.*)\|(.*)\|(.*)$'
GLOBAL_ID_DIRTY_PATTERN = "^@(.*)\|(.*)\|(.*)$"
# matches @<device_id>|<folder_name>
GLOBAL_ID_NON_MSM_PATTERN = '^@([^\|]+)\|([^\|]+$)'
GLOBAL_ID_ANY_PATTERN = '(?:{})|(?:{})|(?:{})'.format(
GLOBAL_ID_PATTERN,
GLOBAL_ID_DIRTY_PATTERN,
GLOBAL_ID_NON_MSM_PATTERN
GLOBAL_ID_NON_MSM_PATTERN = "^@([^\|]+)\|([^\|]+$)"
GLOBAL_ID_ANY_PATTERN = "(?:{})|(?:{})|(?:{})".format(
GLOBAL_ID_PATTERN, GLOBAL_ID_DIRTY_PATTERN, GLOBAL_ID_NON_MSM_PATTERN
)
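# Illustrative sketch (example global IDs are hypothetical): how the three
# patterns above classify skill global IDs. The raw strings are equivalent to
# the string literals used in the module.
import re
msm_id = re.compile(r"^([^\|@]+)\|([^\|]+$)")        # <submodule_name>|<branch>
dirty_id = re.compile(r"^@(.*)\|(.*)\|(.*)$")        # @<device_id>|<submodule_name>|<branch>
non_msm_id = re.compile(r"^@([^\|]+)\|([^\|]+$)")    # @<device_id>|<folder_name>
assert msm_id.match("mycroft-weather|21.02")
assert dirty_id.match("@d3adb33f|mycroft-weather|21.02")
assert non_msm_id.match("@d3adb33f|my-local-skill")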
def _normalize_field_value(field):
"""The field values in skillMetadata are all strings, convert to native."""
normalized_value = field.get('value')
if field['type'].lower() == 'checkbox':
if field['value'] in ('false', 'False', '0'):
normalized_value = field.get("value")
if field["type"].lower() == "checkbox":
if field["value"] in ("false", "False", "0"):
normalized_value = False
elif field['value'] in ('true', 'True', '1'):
elif field["value"] in ("true", "True", "1"):
normalized_value = True
elif field['type'].lower() == 'number' and isinstance(field['value'], str):
if field['value']:
normalized_value = float(field['value'])
elif field["type"].lower() == "number" and isinstance(field["value"], str):
if field["value"]:
normalized_value = float(field["value"])
if not normalized_value % 1:
normalized_value = int(field['value'])
normalized_value = int(field["value"])
else:
normalized_value = 0
elif field['value'] == "[]":
elif field["value"] == "[]":
normalized_value = []
return normalized_value
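# Illustrative sketch (field dicts are hypothetical): the normalization above
# converts the string values stored in skillMetadata to native Python types.
_normalize_field_value({"type": "checkbox", "value": "true"})  # -> True
_normalize_field_value({"type": "checkbox", "value": "0"})     # -> False
_normalize_field_value({"type": "number", "value": "7"})       # -> 7
_normalize_field_value({"type": "number", "value": "2.5"})     # -> 2.5
_normalize_field_value({"type": "text", "value": "[]"})        # -> []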
@ -78,6 +76,7 @@ class SkillSettingUpdater(object):
request specifies a single device to update, all devices with
the same skill must be updated as well.
"""
_device_skill_repo = None
_settings_display_repo = None
@ -115,28 +114,27 @@ class SkillSettingUpdater(object):
settings_meta.json file before sending the result to this API. The
settings values are stored separately from the metadata in the database.
"""
settings_definition = self.display_data.get('skillMetadata')
settings_definition = self.display_data.get("skillMetadata")
if settings_definition is not None:
self.settings_values = dict()
sections_without_values = []
for section in settings_definition['sections']:
for section in settings_definition["sections"]:
section_without_values = dict(**section)
for field in section_without_values['fields']:
field_name = field.get('name')
field_value = field.get('value')
for field in section_without_values["fields"]:
field_name = field.get("name")
field_value = field.get("value")
if field_name is not None:
if field_value is not None:
field_value = _normalize_field_value(field)
del(field['value'])
del field["value"]
self.settings_values[field_name] = field_value
sections_without_values.append(section_without_values)
settings_definition['sections'] = sections_without_values
settings_definition["sections"] = sections_without_values
def _get_skill_id(self):
"""Get the id of the skill in the request"""
skill_global_id = (
self.display_data.get('skill_gid') or
self.display_data.get('identifier')
skill_global_id = self.display_data.get("skill_gid") or self.display_data.get(
"identifier"
)
skill_repo = SkillRepository(self.db)
skill_id = skill_repo.ensure_skill_exists(skill_global_id)
@ -145,14 +143,9 @@ class SkillSettingUpdater(object):
def _ensure_settings_display_exists(self) -> bool:
"""If the settings display changed, a new row needs to be added."""
new_settings_display = False
self.settings_display = SettingsDisplay(
self.skill.id,
self.display_data
)
self.settings_display.id = (
self.settings_display_repo.get_settings_display_id(
self.settings_display
)
self.settings_display = SettingsDisplay(self.skill.id, self.display_data)
self.settings_display.id = self.settings_display_repo.get_settings_display_id(
self.settings_display
)
if self.settings_display.id is None:
self.settings_display.id = self.settings_display_repo.add(
@ -173,11 +166,8 @@ class SkillSettingUpdater(object):
"""Get all the permutations of settings for a skill"""
account_repo = AccountRepository(self.db)
account = account_repo.get_account_by_device_id(self.device_id)
skill_settings = (
self.device_skill_repo.get_skill_settings_for_account(
account.id,
self.skill.id
)
skill_settings = self.device_skill_repo.get_skill_settings_for_account(
account.id, self.skill.id
)
return skill_settings
@ -187,14 +177,14 @@ class SkillSettingUpdater(object):
for skill_setting in skill_settings:
if self.device_id in skill_setting.device_ids:
device_skill_found = True
if skill_setting.install_method in ('voice', 'cli'):
if skill_setting.install_method in ("voice", "cli"):
devices_to_update = [self.device_id]
else:
devices_to_update = skill_setting.device_ids
self.device_skill_repo.upsert_device_skill_settings(
devices_to_update,
self.settings_display,
self._merge_settings_values(skill_setting.settings_values)
self._merge_settings_values(skill_setting.settings_values),
)
break
@ -225,9 +215,7 @@ class SkillSettingUpdater(object):
manifest endpoint in some cases.
"""
self.device_skill_repo.upsert_device_skill_settings(
[self.device_id],
self.settings_display,
self._merge_settings_values()
[self.device_id], self.settings_display, self._merge_settings_values()
)
@ -267,15 +255,16 @@ class RequestSkill(Model):
identifier = StringType()
def validate_skill_gid(self, data, value):
if data['skill_gid'] is None and data['identifier'] is None:
if data["skill_gid"] is None and data["identifier"] is None:
raise ValidationError(
'skill should have either skill_gid or identifier defined'
"skill should have either skill_gid or identifier defined"
)
return value
class DeviceSkillSettingsEndpoint(PublicEndpoint):
"""Fetch all skills associated with a device using the API v1 format"""
_device_skill_repo = None
_skill_repo = None
_skill_setting_repo = None
@ -317,23 +306,19 @@ class DeviceSkillSettingsEndpoint(PublicEndpoint):
"""
self._authenticate(device_id)
self._validate_etag(DEVICE_SKILL_ETAG_KEY.format(device_id=device_id))
device_skills = self.skill_setting_repo.get_skill_settings_for_device(
device_id
)
device_skills = self.skill_setting_repo.get_skill_settings_for_device(device_id)
if device_skills:
response_data = self._build_response_data(device_skills)
response = Response(
json.dumps(response_data),
status=HTTPStatus.OK,
content_type='application/json'
content_type="application/json",
)
self._add_etag(DEVICE_SKILL_ETAG_KEY.format(device_id=device_id))
else:
response = Response(
'',
status=HTTPStatus.NO_CONTENT,
content_type='application/json'
"", status=HTTPStatus.NO_CONTENT, content_type="application/json"
)
return response
@ -341,7 +326,7 @@ class DeviceSkillSettingsEndpoint(PublicEndpoint):
response_data = []
for skill in device_skills:
response_skill = dict(uuid=skill.skill_id)
settings_definition = skill.settings_display.get('skillMetadata')
settings_definition = skill.settings_display.get("skillMetadata")
if settings_definition:
settings_sections = self._apply_settings_values(
settings_definition, skill.settings_values
@ -350,10 +335,10 @@ class DeviceSkillSettingsEndpoint(PublicEndpoint):
response_skill.update(
skillMetadata=dict(sections=settings_sections)
)
skill_gid = skill.settings_display.get('skill_gid')
skill_gid = skill.settings_display.get("skill_gid")
if skill_gid is not None:
response_skill.update(skill_gid=skill_gid)
identifier = skill.settings_display.get('identifier')
identifier = skill.settings_display.get("identifier")
if identifier is None:
response_skill.update(identifier=skill_gid)
else:
@ -366,10 +351,10 @@ class DeviceSkillSettingsEndpoint(PublicEndpoint):
def _apply_settings_values(settings_definition, settings_values):
"""Build a copy of the settings sections populated with values."""
sections_with_values = []
for section in settings_definition['sections']:
for section in settings_definition["sections"]:
section_with_values = dict(**section)
for field in section_with_values['fields']:
field_name = field.get('name')
for field in section_with_values["fields"]:
field_name = field.get("name")
if field_name is not None and field_name in settings_values:
field.update(value=str(settings_values[field_name]))
sections_with_values.append(section_with_values)
@ -380,9 +365,7 @@ class DeviceSkillSettingsEndpoint(PublicEndpoint):
self._authenticate(device_id)
self._validate_put_request()
skill_id = self._update_skill_settings(device_id)
self.etag_manager.expire(
DEVICE_SKILL_ETAG_KEY.format(device_id=device_id)
)
self.etag_manager.expire(DEVICE_SKILL_ETAG_KEY.format(device_id=device_id))
return dict(uuid=skill_id), HTTPStatus.OK
@ -392,9 +375,7 @@ class DeviceSkillSettingsEndpoint(PublicEndpoint):
def _update_skill_settings(self, device_id):
skill_setting_updater = SkillSettingUpdater(
self.db,
device_id,
self.request.json
self.db, device_id, self.request.json
)
skill_setting_updater.update()
self._delete_orphaned_settings_display(
@ -418,6 +399,7 @@ class DeviceSkillSettingsEndpointV2(PublicEndpoint):
with pre 19.08 versions of mycroft-core. Once those versions are no
longer supported, the older class can be deprecated.
"""
def get(self, device_id):
"""
Retrieve skills installed on device from the database.
@ -433,9 +415,7 @@ class DeviceSkillSettingsEndpointV2(PublicEndpoint):
def _build_response_data(self, device_id):
device_skill_repo = DeviceSkillRepository(self.db)
device_skills = device_skill_repo.get_skill_settings_for_device(
device_id
)
device_skills = device_skill_repo.get_skill_settings_for_device(device_id)
if device_skills is not None:
response_data = {}
for skill in device_skills:
@ -446,15 +426,13 @@ class DeviceSkillSettingsEndpointV2(PublicEndpoint):
def _build_response(self, device_id, response_data):
if response_data is None:
response = Response(
'',
status=HTTPStatus.NO_CONTENT,
content_type='application/json'
"", status=HTTPStatus.NO_CONTENT, content_type="application/json"
)
else:
response = Response(
json.dumps(response_data),
status=HTTPStatus.OK,
content_type='application/json'
content_type="application/json",
)
self._add_etag(DEVICE_SKILL_ETAG_KEY.format(device_id=device_id))

View File

@ -33,10 +33,10 @@ class DeviceSubscriptionEndpoint(PublicEndpoint):
if account:
membership = account.membership
response = (
{'@type': membership.type if membership is not None else 'free'},
HTTPStatus.OK
{"@type": membership.type if membership is not None else "free"},
HTTPStatus.OK,
)
else:
response = '', HTTPStatus.NO_CONTENT
response = "", HTTPStatus.NO_CONTENT
return response
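# Illustrative sketch (membership type string is hypothetical): the response
# shapes the subscription endpoint above can return to a device.
from http import HTTPStatus
paid_response = {"@type": "monthly membership"}, HTTPStatus.OK
free_response = {"@type": "free"}, HTTPStatus.OK
unpaired_response = "", HTTPStatus.NO_CONTENT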

View File

@ -1,17 +1,18 @@
"""Call this endpoint to retrieve the timezone for a given location"""
from dataclasses import asdict
from http import HTTPStatus
from logging import getLogger
from selene.api import PublicEndpoint
from selene.data.geography import CityRepository
from selene.util.log import get_selene_logger
ONE_HUNDRED_MILES = 100
_log = getLogger()
_log = get_selene_logger(__name__)
class GeolocationEndpoint(PublicEndpoint):
"""Selene endpoint that will search for a geography give a city name."""
def __init__(self):
super().__init__()
self.device_id = None
@ -29,7 +30,7 @@ class GeolocationEndpoint(PublicEndpoint):
def get(self):
"""Handle a HTTP GET request."""
self.request_geolocation = self.request.args['location'].lower()
self.request_geolocation = self.request.args["location"].lower()
response_geolocation = self._get_geolocation()
return dict(data=response_geolocation), HTTPStatus.OK
@ -50,12 +51,8 @@ class GeolocationEndpoint(PublicEndpoint):
)
if selected_geolocation is not None:
selected_geolocation.latitude = float(
selected_geolocation.latitude
)
selected_geolocation.longitude = float(
selected_geolocation.longitude
)
selected_geolocation.latitude = float(selected_geolocation.latitude)
selected_geolocation.longitude = float(selected_geolocation.longitude)
return selected_geolocation
@ -74,8 +71,8 @@ class GeolocationEndpoint(PublicEndpoint):
"""
possible_city_names = []
geolocation_words = self.request_geolocation.split()
for index, word in enumerate(geolocation_words):
possible_city_name = ' '.join(geolocation_words[:index + 1])
for index, _ in enumerate(geolocation_words):
possible_city_name = " ".join(geolocation_words[: index + 1])
possible_city_names.append(possible_city_name)
self.cities = self.city_repo.get_geographic_location_by_city(
@ -113,7 +110,7 @@ class GeolocationEndpoint(PublicEndpoint):
"""
city_in_requested_region = None
for city in self.cities:
location_without_city = self.request_geolocation[len(city.city):]
location_without_city = self.request_geolocation[len(city.city) :]
if city.region.lower() in location_without_city.strip():
city_in_requested_region = city
break
@ -129,7 +126,7 @@ class GeolocationEndpoint(PublicEndpoint):
"""
selected_city = None
for city in self.cities:
location_without_city = self.request_geolocation[len(city.city):]
location_without_city = self.request_geolocation[len(city.city) :]
if city.country.lower() in location_without_city.strip():
selected_city = city
break
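# Illustrative sketch (location string is hypothetical): candidate city names
# are built from cumulative word prefixes of the requested location, as in the
# endpoint above.
geolocation_words = "kansas city missouri".split()
possible_city_names = [
    " ".join(geolocation_words[: index + 1])
    for index, _ in enumerate(geolocation_words)
]
# -> ["kansas", "kansas city", "kansas city missouri"]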

View File

@ -16,120 +16,140 @@
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Public API endpoint for transcribing audio using Google's STT API
import os
DEPRECATION WARNING:
This endpoint is being replaced with the audio_transcription endpoint. It will
remain in the V1 API for backwards compatibility.
"""
from datetime import datetime
from decimal import Decimal
from http import HTTPStatus
from io import BytesIO
from time import time
from speech_recognition import AudioFile, Recognizer
import librosa
from speech_recognition import (
AudioData,
AudioFile,
Recognizer,
RequestError,
UnknownValueError,
)
from selene.api import PublicEndpoint, track_account_activity
from selene.data.account import AccountRepository, OPEN_DATASET
from selene.data.metric import SttTranscriptionMetric, TranscriptionMetricRepository
from selene.util.log import get_selene_logger
_log = get_selene_logger(__name__)
SAMPLE_RATE = 16000
SELENE_DATA_DIR = "/opt/selene/data"
class GoogleSTTEndpoint(PublicEndpoint):
"""Endpoint to send a flac audio file with voice and get back a utterance"""
_account_repo = None
"""Endpoint to send a flac audio file with voice and get back a utterance."""
def __init__(self):
super(GoogleSTTEndpoint, self).__init__()
self.google_stt_key = self.config["GOOGLE_STT_KEY"]
super().__init__()
self.recognizer = Recognizer()
self.account = None
self.account_shares_data = False
@property
def account_repo(self):
if self._account_repo is None:
self._account_repo = AccountRepository(self.db)
return self._account_repo
self.transcription_success = False
self.audio_duration = 0
self.transcription_duration = 0
def post(self):
"""Processes an HTTP Post request."""
_log.info(f"{self.request_id}: Google STT transcription requested")
self._authenticate()
self._get_account()
self._check_for_open_dataset_agreement()
self._write_flac_audio_file()
stt_response = self._call_google_stt()
response = self._build_response(stt_response)
self._write_stt_result_file(response)
if response:
request_audio_data = self._extract_audio_from_request()
transcription = self._call_google_stt(request_audio_data)
self._add_transcription_metric()
if transcription is not None:
track_account_activity(self.db, self.device_id)
return response, HTTPStatus.OK
return [transcription], HTTPStatus.OK
def _get_account(self):
if self.device_id is not None:
self.account = self.account_repo.get_account_by_device_id(self.device_id)
"""Retrieves the account associated with the device from the database."""
account_repo = AccountRepository(self.db)
self.account = account_repo.get_account_by_device_id(self.device_id)
def _check_for_open_dataset_agreement(self):
for agreement in self.account.agreements:
if agreement.type == OPEN_DATASET:
self.account_shares_data = True
"""Determines if the account is opted into the Open Dataset Agreement."""
if self.account is not None:
for agreement in self.account.agreements:
if agreement.type == OPEN_DATASET:
self.account_shares_data = True
break
def _write_flac_audio_file(self):
"""Save the audio file for STT tagging"""
self._write_open_dataset_file(self.request.data, file_type="flac")
def _write_stt_result_file(self, stt_result):
"""Save the STT results for tagging."""
file_contents = "\n".join(stt_result)
self._write_open_dataset_file(file_contents.encode(), file_type="stt")
def _write_open_dataset_file(self, content, file_type):
if self.account is not None and self.account_shares_data:
file_name = "{account_id}_{time}.{file_type}".format(
account_id=self.account.id, file_type=file_type, time=time()
)
file_path = os.path.join(SELENE_DATA_DIR, file_name)
with open(file_path, "wb") as flac_file:
flac_file.write(content)
def _call_google_stt(self):
"""Use the audio data from the request to call the Google STT API
def _extract_audio_from_request(self) -> AudioData:
"""Extracts the audio data from the request for use in Google STT API.
We need to replicate the first 16 bytes in the audio due to a bug with
the Google speech recognition library that removes the first 16 bytes
from the flac file we are sending.
Returns:
Object representing the audio data in a format that can be used to call
Google's STT API
"""
_log.info(f"{self.request_id}: Extracting audio data from request")
request_audio = self.request.data[:16] + self.request.data
with AudioFile(BytesIO(request_audio)) as source:
audio_data = self.recognizer.record(source)
with BytesIO(self.request.data) as request_audio:
audio, _ = librosa.load(request_audio, sr=SAMPLE_RATE, mono=True)
self.audio_duration = librosa.get_duration(y=audio, sr=SAMPLE_RATE)
return audio_data
def _call_google_stt(self, audio: AudioData) -> str:
"""Uses the audio data from the request to call the Google STT API
Args:
audio: audio data representing the words spoken by the user
Returns:
text transcription of the audio data
"""
_log.info(f"{self.request_id}: Transcribing audio with Google STT")
lang = self.request.args["lang"]
audio = self.request.data
with AudioFile(BytesIO(audio[:16] + audio)) as source:
recording = self.recognizer.record(source)
response = self.recognizer.recognize_google(
recording, key=self.google_stt_key, language=lang, show_all=True
)
return response
def _build_response(self, stt_response):
"""Build the response to return to the device.
Return the n transcripts with the highest confidence. This is useful when
an ambiguous voice file is sent and the correct utterance is not the one
the API scores with the highest confidence.
"""
limit = int(self.request.args["limit"])
if isinstance(stt_response, dict):
alternative = stt_response.get("alternative")
if "confidence" in alternative:
# Sorting by confidence:
alternative = sorted(
alternative, key=lambda alt: alt["confidence"], reverse=True
)
alternative = [alt["transcript"] for alt in alternative]
# client is interested in testing the utterances found.
if len(alternative) <= limit:
response = alternative
else:
response = alternative[:limit]
else:
response = [alternative[0]["transcript"]]
transcription = None
start_time = datetime.now()
try:
transcription = self.recognizer.recognize_google(
audio, key=self.config["GOOGLE_STT_KEY"], language=lang
)
except RequestError:
_log.exception("Request to Google TTS failed")
except UnknownValueError:
_log.exception("TTS transcription deemed unintelligible by Google")
else:
response = []
log_message = "Google STT request successful"
if self.account_shares_data:
log_message += f": {transcription}"
_log.info(log_message)
self.transcription_success = True
end_time = datetime.now()
self.transcription_duration = (end_time - start_time).total_seconds()
return response
return transcription
def _add_transcription_metric(self):
"""Adds metrics for this STT transcription to the database."""
account_repo = AccountRepository(self.db)
account = account_repo.get_account_by_device_id(self.device_id)
transcription_metric = SttTranscriptionMetric(
account_id=account.id,
engine="Google",
success=self.transcription_success,
audio_duration=Decimal(str(self.audio_duration)),
transcription_duration=Decimal(str(self.transcription_duration)),
)
transcription_metric_repo = TranscriptionMetricRepository(self.db)
transcription_metric_repo.add(transcription_metric)
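# Illustrative, self-contained sketch (file name is an assumption): measuring
# audio duration from raw FLAC bytes with librosa, mirroring the duration
# metric captured by the endpoint above.
from io import BytesIO
import librosa
SAMPLE_RATE = 16000
with open("sample.flac", "rb") as flac_file:
    raw_bytes = flac_file.read()
with BytesIO(raw_bytes) as audio_buffer:
    audio, _ = librosa.load(audio_buffer, sr=SAMPLE_RATE, mono=True)
duration_in_seconds = librosa.get_duration(y=audio, sr=SAMPLE_RATE)
print(f"Audio duration: {duration_in_seconds:.2f} seconds")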

View File

@ -25,13 +25,12 @@ from selene.api import PublicEndpoint
class OauthCallbackEndpoint(PublicEndpoint):
def __init__(self):
super(OauthCallbackEndpoint, self).__init__()
self.oauth_service_host = os.environ['OAUTH_BASE_URL']
self.oauth_service_host = os.environ["OAUTH_BASE_URL"]
def get(self):
params = dict(self.request.args)
url = self.oauth_service_host + '/auth/callback'
url = self.oauth_service_host + "/auth/callback"
response = requests.get(url, params=params)
return response.text, response.status_code

View File

@ -30,20 +30,20 @@ class PremiumVoiceEndpoint(PublicEndpoint):
def get(self, device_id):
self._authenticate(device_id)
arch = self.request.args.get('arch')
arch = self.request.args.get("arch")
account = AccountRepository(self.db).get_account_by_device_id(device_id)
if account and account.membership:
link = self._get_premium_voice_link(arch)
response = {'link': link}, HTTPStatus.OK
response = {"link": link}, HTTPStatus.OK
else:
response = '', HTTPStatus.NO_CONTENT
response = "", HTTPStatus.NO_CONTENT
return response
def _get_premium_voice_link(self, arch):
if arch == 'arm':
response = os.environ['URL_VOICE_ARM']
elif arch == 'x86_64':
response = os.environ['URL_VOICE_X86_64']
if arch == "arm":
response = os.environ["URL_VOICE_ARM"]
elif arch == "x86_64":
response = os.environ["URL_VOICE_X86_64"]
else:
response = ''
response = ""
return response

View File

@ -25,14 +25,13 @@ from selene.data.account import AccountRepository
class StripeWebHookEndpoint(PublicEndpoint):
def __init__(self):
super(StripeWebHookEndpoint, self).__init__()
def post(self):
event = json.loads(self.request.data)
type = event.get('type')
if type == 'customer.subscription.deleted':
customer = event['data']['object']['customer']
type = event.get("type")
if type == "customer.subscription.deleted":
customer = event["data"]["object"]["customer"]
AccountRepository(self.db).end_active_membership(customer)
return '', HTTPStatus.OK
return "", HTTPStatus.OK

View File

@ -0,0 +1,177 @@
# Mycroft Server - Backend
# Copyright (C) 2020 Mycroft AI Inc
# SPDX-License-Identifier: AGPL-3.0-or-later
#
# This file is part of the Mycroft Server.
#
# The Mycroft Server is free software: you can redistribute it and/or
# modify it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""Public Device API endpoint for uploading a sample wake word for tagging."""
import json
from datetime import datetime
from http import HTTPStatus
from os import environ
from pathlib import Path
from flask import jsonify
from schematics import Model
from schematics.types import StringType
from schematics.exceptions import DataError
from selene.api import PublicEndpoint
from selene.data.account import Account, AccountRepository
from selene.data.tagging import (
build_tagging_file_name,
TaggingFileLocationRepository,
UPLOADED_STATUS,
WakeWordFile,
WakeWordFileRepository,
)
from selene.data.wake_word import WakeWordRepository
from selene.util.log import get_selene_logger
LOCAL_IP = "127.0.0.1"
_log = get_selene_logger(__name__)
class UploadRequest(Model):
"""Data class for validating the content of the POST request."""
wake_word = StringType(required=True)
engine = StringType(required=True)
timestamp = StringType(required=True)
model = StringType(required=True)
class WakeWordFileUpload(PublicEndpoint):
"""Endpoint for submitting and retrieving wake word sample files.
Samples will be saved to a temporary location on the API host until a daily batch
job moves them to a permanent one. Each file is logged in the sample table
with its location and classification data.
"""
_file_location = None
_wake_word_repository = None
_wake_word = None
def __init__(self):
super().__init__()
self.request_data = None
@property
def wake_word_repository(self):
"""Lazy instantiation of wake word repository object."""
if self._wake_word_repository is None:
self._wake_word_repository = WakeWordRepository(self.db)
return self._wake_word_repository
@property
def wake_word(self):
"""Build and return a WakeWord object."""
if self._wake_word is None:
self._wake_word = self.wake_word_repository.ensure_wake_word_exists(
name=self.request_data["wake_word"].strip().replace("-", " "),
engine=self.request_data["engine"],
)
return self._wake_word
@property
def file_location(self):
"""Build and return a TaggingFileLocation object."""
if self._file_location is None:
data_dir = Path(environ["SELENE_DATA_DIR"])
wake_word = self.request_data["wake_word"].replace(" ", "-")
wake_word_dir = data_dir.joinpath("wake-word").joinpath(wake_word)
wake_word_dir.mkdir(parents=True, exist_ok=True)
file_location_repository = TaggingFileLocationRepository(self.db)
self._file_location = file_location_repository.ensure_location_exists(
server=LOCAL_IP, directory=str(wake_word_dir)
)
return self._file_location
def post(self, device_id):
"""
Process a HTTP POST request submitting a wake word sample from a device.
:param device_id: UUID of the device that originated the request.
:return: HTTP response indicating status of the request.
"""
self._authenticate(device_id)
self._validate_post_request()
account = self._get_account(device_id)
file_contents = self.request.files["audio"].read()
hashed_file_name = build_tagging_file_name(file_contents)
new_file_name = self._add_wake_word_file(account, hashed_file_name)
if new_file_name is not None:
hashed_file_name = new_file_name
self._save_audio_file(hashed_file_name, file_contents)
return jsonify("Wake word sample uploaded successfully"), HTTPStatus.OK
def _validate_post_request(self):
"""Load the post request into the validation class and perform validations."""
if "audio" not in self.request.files:
raise DataError(dict(audio="No audio file included in request"))
if "metadata" not in self.request.files:
raise DataError(dict(metadata="No metadata file included in request"))
metadata = json.loads(self.request.files["metadata"].read().decode())
upload_request = UploadRequest(
dict(
wake_word=metadata.get("wake_word"),
engine=metadata.get("engine"),
timestamp=metadata.get("timestamp"),
model=metadata.get("model"),
)
)
upload_request.validate()
self.request_data = upload_request.to_native()
def _get_account(self, device_id: str):
"""Use the device ID to find the account.
:param device_id: The database ID for the device that made this API call
"""
account_repository = AccountRepository(self.db)
return account_repository.get_account_by_device_id(device_id)
def _save_audio_file(self, hashed_file_name: str, file_contents: bytes):
"""Build the file path for the audio file."""
file_path = Path(self.file_location.directory).joinpath(hashed_file_name)
with open(file_path, "wb") as audio_file:
audio_file.write(file_contents)
def _add_wake_word_file(self, account: Account, hashed_file_name: str):
"""Add the sample to the database for reference and classification.
:param account: the account from which sample originated
:param hashed_file_name: name of the audio file saved to file system
"""
sample = WakeWordFile(
account_id=account.id,
location=self.file_location,
name=hashed_file_name,
origin="mycroft",
submission_date=datetime.utcnow().date(),
wake_word=self.wake_word,
status=UPLOADED_STATUS,
)
file_repository = WakeWordFileRepository(self.db)
new_file_name = file_repository.add(sample)
return new_file_name
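# Illustrative client-side sketch (URL, token, and file names are assumptions):
# a device submits a wake word sample as multipart form data with the "audio"
# and "metadata" parts expected by the endpoint above.
import json
import requests
metadata = {
    "wake_word": "hey mycroft",
    "engine": "precise",
    "timestamp": "1663862400",
    "model": "hey-mycroft.pb",
}
with open("sample.wav", "rb") as audio_file:
    response = requests.post(
        "https://api.example.com/v1/device/<device_id>/wake-word-file",
        headers={"Authorization": "Bearer <device_token>"},
        files={
            "audio": audio_file,
            "metadata": ("metadata.json", json.dumps(metadata)),
        },
    )
response.raise_for_status()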

View File

@ -27,7 +27,13 @@ from selene.api import PublicEndpoint, track_account_activity
class WolframAlphaEndpoint(PublicEndpoint):
"""Proxy to the Wolfram Alpha API"""
"""Proxy to the Wolfram Alpha API.
WARNING: This Endpoint is deprecated in favor of WolframAlphaV2Endpoint.
The new endpoint allows for the usage of additional query params beyond
the 'input' such as output format to return JSON or XML.
"""
def __init__(self):
super(WolframAlphaEndpoint, self).__init__()

View File

@ -0,0 +1,47 @@
# Mycroft Server - Backend
# Copyright (C) 2021 Mycroft AI Inc
# SPDX-License-Identifier: AGPL-3.0-or-later
#
# This file is part of the Mycroft Server.
#
# The Mycroft Server is free software: you can redistribute it and/or
# modify it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
import os
from http import HTTPStatus
import requests
from selene.api import PublicEndpoint
class WolframAlphaSimpleEndpoint(PublicEndpoint):
"""Endpoint to communicate with the Wolfram Alpha Simple API.
The Simple API returns a universally viewable image format.
https://products.wolframalpha.com/simple-api/
"""
def __init__(self):
super(WolframAlphaSimpleEndpoint, self).__init__()
self.wolfram_alpha_key = os.environ["WOLFRAM_ALPHA_KEY"]
self.wolfram_alpha_url = os.environ["WOLFRAM_ALPHA_URL"]
def get(self):
self._authenticate()
params = dict(self.request.args)
params["appid"] = self.wolfram_alpha_key
response = requests.get(self.wolfram_alpha_url + "/v1/simple", params=params)
code = response.status_code
response = (response.text, code) if code == HTTPStatus.OK else ("", code)
return response

View File

@ -30,14 +30,14 @@ class WolframAlphaSpokenEndpoint(PublicEndpoint):
def __init__(self):
super(WolframAlphaSpokenEndpoint, self).__init__()
self.wolfram_alpha_key = os.environ['WOLFRAM_ALPHA_KEY']
self.wolfram_alpha_url = os.environ['WOLFRAM_ALPHA_URL']
self.wolfram_alpha_key = os.environ["WOLFRAM_ALPHA_KEY"]
self.wolfram_alpha_url = os.environ["WOLFRAM_ALPHA_URL"]
def get(self):
self._authenticate()
params = dict(self.request.args)
params['appid'] = self.wolfram_alpha_key
response = requests.get(self.wolfram_alpha_url + '/v1/spoken', params=params)
params["appid"] = self.wolfram_alpha_key
response = requests.get(self.wolfram_alpha_url + "/v1/spoken", params=params)
code = response.status_code
response = (response.text, code) if code == HTTPStatus.OK else ('', code)
response = (response.text, code) if code == HTTPStatus.OK else ("", code)
return response

View File

@ -0,0 +1,46 @@
# Mycroft Server - Backend
# Copyright (C) 2019 Mycroft AI Inc
# SPDX-License-Identifier: AGPL-3.0-or-later
#
# This file is part of the Mycroft Server.
#
# The Mycroft Server is free software: you can redistribute it and/or
# modify it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
import os
from http import HTTPStatus
import requests
from selene.api import PublicEndpoint, track_account_activity
class WolframAlphaV2Endpoint(PublicEndpoint):
"""Proxy to the Wolfram Alpha Full Results v2 API with JSON output.
https://products.wolframalpha.com/api/documentation/
"""
def __init__(self):
super(WolframAlphaV2Endpoint, self).__init__()
self.wolfram_alpha_key = os.environ["WOLFRAM_ALPHA_KEY"]
self.wolfram_alpha_url = os.environ["WOLFRAM_ALPHA_URL"]
def get(self):
self._authenticate()
track_account_activity(self.db, self.device_id)
params = dict(self.request.args)
params["appid"] = self.wolfram_alpha_key
params["output"] = "json"
response = requests.get(self.wolfram_alpha_url + "/v2/query", params=params)
return response.json(), response.status_code
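# Illustrative sketch (URL and key are assumptions): the upstream request the
# proxy above issues, with the app id and JSON output parameters appended to
# whatever query parameters the device supplied.
import requests
params = {"input": "mass of the moon", "appid": "<WOLFRAM_ALPHA_KEY>", "output": "json"}
response = requests.get("https://api.wolframalpha.com/v2/query", params=params)
result = response.json()["queryresult"]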

api/public/pyproject.toml (new file, 39 lines)
View File

@ -0,0 +1,39 @@
[tool.poetry]
name = "public"
version = "0.1.0"
description = "API for interactions between Selene and Mycroft devices"
authors = ["Chris Veilleux <veilleux.chris@gmail.com>"]
license = "GNU AGPL 3.0"
[tool.poetry.dependencies]
python = "^3.9"
# Version 1.0 of flask is required because later versions do not allow lists to be passed as API responses. The Google
# STT endpoint passes a list of transcriptions to the device. Changing this to return a dictionary would break the
# API's V1 contract with Mycroft Core.
#
# To make flask 1.0 work, older versions of itsdangerous, jinja2, markupsafe and werkzeug are required.
flask = "<1.1"
google-cloud-speech = "^2.15.1"
itsdangerous = "<=2.0.1"
jinja2 = "<=2.10.1"
markupsafe = "<=2.0.1"
requests = "*"
selene = {path = "./../../shared", develop = true}
SpeechRecognition = "*"
stripe = "*"
uwsgi = "*"
werkzeug = "<=2.0.3"
librosa = "^0.9.2"
numpy = "<=1.22"
[tool.poetry.dev-dependencies]
allure-behave = "*"
black = "*"
pyhamcrest = "*"
pylint = "*"
behave = "^1.2.6"
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"

View File

@ -1,11 +1,13 @@
Feature: Send email to a to the account that owns a device
Test the email endpoint
Feature: Device API -- Send email to the account holder
Some skills have the ability to send email upon request. One example of
this is the support skill, which emails device diagnostics.
Scenario: an email payload is passed to the email endpoint
When an email message is sent to the email endpoint
Then an email should be sent to the user's account that owns the device
Scenario: Email sent to account holder
When a user interaction with a device causes an email to be sent
Then the request will be successful
And an email should be sent to the account that owns the device
And the device's last contact time is updated
Scenario: an email payload is passed to the the email endpoint using a not allowed device
When the email endpoint is called by a not allowed device
Then 401 status code should be returned by the email endpoint
Scenario: Email request sent by unauthorized device
When an unpaired or unauthenticated device attempts to send an email
Then the request will fail with an unauthorized error

View File

@ -1,4 +1,4 @@
Feature: Fetch device's location
Feature: Device API -- Request device location
Scenario: Location is successfully retrieved from a device
When an API call to get the location is done

View File

@ -1,4 +1,4 @@
Feature: Save metrics sent to selene from mycroft core
Feature: Device API -- Save device activity metrics
Scenario: User opted into the open dataset uses their device
Given a device registered to a user opted into the open dataset

View File

@ -1,9 +1,28 @@
Feature: Pair a device
Feature: Device API -- Pair a device
Test the device pairing workflow
Scenario: Device activation
Scenario: Pairing code generation
When a device requests a pairing code
And the device is added to an account using the pairing code
And the device is activated
Then the pairing code request is successful
And the device activation request is successful
Then the request will be successful
And the pairing data is stored in Redis
And the pairing data is sent to the device
Scenario: Device activation
Given the user completes the pairing process on the web application
When the device requests to be activated
Then the request will be successful
And the activation data is sent to the device
And the device attributes are stored in the database
Scenario: Pantacor device configuration sync
Given an authorized device
When Pantacor has claimed the device
And a device requests to sync with Pantacor
Then the request will be successful
And the Pantacor device configuration is stored in the database
Scenario: Pantacor device not claimed
Given an authorized device
When Pantacor has not yet claimed the device
And a device requests to sync with Pantacor
Then the request will fail with a precondition required error

View File

@ -1,4 +1,4 @@
Feature: Device can upload and fetch skills manifest
Feature: Device API -- Upload and fetch skills manifest
Scenario: Device uploads an unchanged manifest
Given an authorized device

View File

@ -1,4 +1,4 @@
Feature: Upload and fetch skills and their settings
Feature: Device API -- Upload and fetch skills and their settings
Test all endpoints related to upload and fetch skill settings
Scenario: A device requests the settings for its skills

View File

@ -1,4 +1,4 @@
Feature: Get the subscription type from the account linked to a device
Feature: Device API -- Request account subscription type
Test the endpoint used to fetch the subscription type of a device
Scenario: User has a free subscription

Some files were not shown because too many files have changed in this diff.