Commit Graph

17 Commits (18f4ff3e901faae71351204c66a901b19eed2dc1)

Author SHA1 Message Date
Arron Atchison c1ea573493 fixed pep8 in enclosure client folder 2017-02-15 15:37:52 -06:00
Steve Penrod 0ecc736829 Fixes issue #434. Developers working on both Cerberus and Home durin… (#435)
* Fixes issue #434.  Developers working on both Cerberus and Home during the transition would have to re-pair.

Also bumping enclosure client version.

* Correcting an error from when the Tartarus code was merged.  At startup it was calling Enclosure.system_reset(), which rebooted the Arduino, instead of calling Enclosure.reset(), which sets the UI to a "ready for input" state.

While in here, I also added docstrings for all Enclosure API methods.

* Increment Arduino code version

* Adding a call to reset the face UI when the enclosure service starts up (see the sketch after this entry).  This is needed because the enclosure.reset message posted by the speech service on the messagebus sometimes occurs before the enclosure client is up and listening for it -- especially if there is an Arduino firmware upgrade.

In the future, we may want to consider a core service roll-call that gets triggered whenever any of the core services come up.

* Update dev_setup.sh
2016-12-28 15:00:03 -06:00
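A minimal sketch of the startup reset described in the commit above, assuming a messagebus client with an on()-style subscription and a serial writer to the Arduino; the class name, bus interface, and serial command strings are illustrative assumptions rather than the exact mycroft-core code.

```python
# Minimal sketch: reset the face UI once at enclosure-service startup so a
# missed 'enclosure.reset' bus message (e.g. during a firmware upgrade)
# does not leave the face blank. Interfaces here are assumed, not exact.

class EnclosureService(object):
    def __init__(self, bus, writer):
        self.bus = bus        # messagebus connection (assumed on() interface)
        self.writer = writer  # serial writer to the Arduino (assumed)

        # Listen for UI resets requested by other services.
        self.bus.on('enclosure.reset', self.on_reset)

        # The speech service may have emitted 'enclosure.reset' before this
        # client was listening, so put the face into its "ready for input"
        # state once at startup as well.
        self.on_reset(None)

    def on_reset(self, message):
        # 'mouth.reset' / 'eyes.reset' are illustrative serial commands;
        # the real command set lives in the Arduino firmware.
        self.writer.write('mouth.reset')
        self.writer.write('eyes.reset')
```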
Jonathan D'Orleans 99ff4e3ce8 Issues 351 - Tartarus Integration
- Master rebase
- Renaming Websocket variable from client to ws
- Resetting enclosure when ws open
- Formatting pairing skill
2016-12-17 14:53:22 -05:00
Jonathan D'Orleans 86e712ec84 Issues 356 - Ensuring only the viseme code is sent as a message to the enclosure 2016-12-17 10:16:29 -05:00
Jonathan D'Orleans 4c1ba4e337 Issues 356 - Rebasing with master 2016-12-17 10:16:29 -05:00
Jonathan D'Orleans ccceb62b7a Issues 351 - Renaming metadata to data and simplifying data usage 2016-12-17 10:15:24 -05:00
Jonathan D'Orleans 3304474a22 Issues 354 - Pairing device with remote server
- Getting pairing code from server
- Handling the API response after the request
- Simplifying enclosure mouth event control and the upgrade and test process
2016-12-17 10:12:10 -05:00
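A hypothetical sketch of the pairing flow described in the entry above: request a pairing code from the remote server and handle the response explicitly. The endpoint path, query parameter, and response field are illustrative assumptions.

```python
# Hypothetical pairing-code fetch; endpoint and field names are assumptions.
import requests

def get_pairing_code(api_base, state):
    # e.g. GET <api_base>/pairing/code?state=<uuid>   (illustrative URL)
    response = requests.get(api_base + '/pairing/code',
                            params={'state': state})
    response.raise_for_status()      # treat API errors explicitly
    return response.json()['code']   # assumed response field
```

A pairing skill could then speak this code and show it on the faceplate.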
Steve Penrod e47ac6b895 Renamed the EnclosureAPI "system_reset()" method which generated an
"enclosure.system.reset" on the messagebus (which was intended to
only reset the enclosure's visual elements) to simply "reset()" and
"enclosure.reset" to avoid confusion with the "system.reset" serial
port message (which resets the Arduino).
2016-11-08 01:44:20 -06:00
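In sketch form, the distinction drawn in the commit above might look like the following; the Message stand-in, the bus's emit() interface, and the system_reset() mapping are assumptions, while the method and message names follow the commit text.

```python
from collections import namedtuple

# Minimal stand-in for the messagebus Message type (assumed shape).
Message = namedtuple('Message', ['type'])

class EnclosureAPI(object):
    """Sketch of the renamed API; only the names below come from the commits."""

    def __init__(self, bus):
        self.bus = bus  # messagebus client with an emit() method (assumed)

    def reset(self):
        # Return the enclosure's visual elements to their default state.
        self.bus.emit(Message('enclosure.reset'))

    def system_reset(self):
        # Reboot the Arduino itself -- distinct from the visual reset and
        # ultimately carried by the 'system.reset' serial port message.
        self.bus.emit(Message('enclosure.system.reset'))
```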
Steve Penrod f08bbe5902 Refined the reset mechanism on boot. Now there is a single Enclosure API
system_reset() that means the Enclosure appearance should be reset to
its defaults.  The implementation of this is now a reset of both the
mouth and the eyes.  This command gets sent to the Enclosure once the
speech client has fully opened its connection to the messagebus.
2016-11-03 16:18:08 -05:00
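A brief sketch of the boot sequence described above, assuming the messagebus client exposes a once('open', ...) hook; the hook name and the surrounding wiring are assumptions.

```python
# Once the speech client's messagebus connection is fully open, ask the
# enclosure to return its mouth and eyes to their default appearance.
def reset_enclosure_on_connect(bus, enclosure):
    def handle_open():
        enclosure.system_reset()   # at this point in history, the visual reset
    bus.once('open', handle_open)  # 'once'/'open' are assumed event hooks
```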
Steve c653c43910 Added viseme support for TTS, allowing enclosure to display visemes (#357)
* Added viseme support for TTS, allowing enclosure to display visemes as appropriate

* Enclosure version bump
2016-09-05 16:27:09 -05:00
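An illustrative sketch of the viseme feature added above: the TTS engine supplies phoneme timings, each phoneme is mapped to a viseme code, and only the code is sent to the enclosure (as the Issues 356 entry higher up reiterates). The mapping table and the mouth_viseme() method name are assumptions.

```python
import time

# Tiny example mapping; a real table covers the full phoneme set.
PHONEME_TO_VISEME = {'AA': '0', 'M': '4', 'F': '5'}

def show_visemes(enclosure, phoneme_timings):
    """phoneme_timings: list of (phoneme, duration_in_seconds) pairs from TTS."""
    for phoneme, duration in phoneme_timings:
        code = PHONEME_TO_VISEME.get(phoneme, '4')  # fall back to a neutral shape
        enclosure.mouth_viseme(code)                # assumed enclosure method
        time.sleep(duration)                        # keep the mouth in sync with audio
```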
Isaac Ward ff1aa67269 added eye spinning animation to indicate shutdown 2016-08-18 12:44:31 -05:00
Isaac Ward 0f69e3bdb8 added volume communication with enclosure 2016-07-28 15:23:24 -05:00
isaacnward a7fb9d2fe5 Issues/234: Weather display on faceplate (#255)
* i really need to fork

* Added weather functionality

* pep8

* uncommented line

* removed variable default values

* changed enclosure version

* edited metadata syntax
2016-07-12 10:26:18 -05:00
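A hypothetical sketch of the faceplate weather display from the PR above: a weather condition maps to an icon code, which is shown together with the temperature. The icon codes and the weather_display() method are assumptions for illustration.

```python
# Example condition-to-icon codes; the real codes live in the enclosure firmware.
CONDITION_TO_ICON = {'clear': '0', 'cloudy': '2', 'rain': '3'}

def display_weather(enclosure, condition, temperature_c):
    icon = CONDITION_TO_ICON.get(condition, '2')               # default to a cloud
    enclosure.weather_display(icon, str(int(temperature_c)))   # assumed API
```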
isaacnward 083bda8e03 Issues/122 (#220)
* added mouth spelling

* spell word after talking

* removed speech to test

* test

* re-added speech

* used emitter

* mycroft/skills/spelling/__init__.py

* emitters 102: intermediate emitters

* emitters 103: importing the right method

* trying enclosure methods

* more fixing

* sorry slack

* added brief pause

* added api listener method

* fixed syntax

* further edited syntax

* slightly changed syntax

* added brief pause before listener reactivation

* changed ordering

* testing method

* further tests

* test test test

* logger

* further logger

* altered logic

* i really need to fork

* more debug

* changed boolean logic

* more debug

* fixed it??

* added brief pause again

* final commit

* test

* test

* fixed it

* sleep

* more testing

* stuff

* added constants:

* pep8
2016-06-22 14:36:56 -05:00
Ryan Sipes 8f2c451938 Fixed Missing License Headers on All Files.
GPL License added to the top of each Python file.
2016-05-26 11:16:13 -05:00
Leo Arias d618676089 Issues-4 - Fix pep8 errors. 2016-05-23 17:23:47 +00:00
Arron Atchison 6e42bb1736 In the 1970s computer users had to understand the arcane syntax of the machines they used. They programmed their computers using the machine's native language and hardly gave it a thought.
The 1980s birthed a new form of interaction between computers and users.  For the first time computers became capable of understanding the most basic form of human communication - pointing and grunting.  The mouse and the GUI revolutionized computing and made computers accessible to the masses.

We have now entered a third era.  We are rapidly approaching a time when computer systems will understand human language and respond using the most natural form of human communication – speech.

This is an important development.  Some might even call it revolutionary.

Despite its importance, however, the technologies that will underpin this new method of interaction are the property of major tech firms who don't necessarily have the public's best interests at heart.

Not anymore.

Meet Mycroft – the world's first open source natural language platform.  Mycroft understands human language and responds with speech.  It is being designed to run on anything from a phone to an automobile and will change the way we interact with open source technologies in profound ways.

Our goal here at Mycroft is to improve this technology to the point that when you interact with the software it is impossible to tell if you are talking to a human or a machine.

This initial release of the Mycroft software represents a significant effort by the Mycroft community to give the open source world access to this important technology.  We are all hoping that the software will be useful to the public and will help to usher in a new era of human machine interaction.

Our community welcomes everyone to use Mycroft, improve the software and contribute back to the project.  With your help and support we can truly make Mycroft an AI for everyone.

Joshua W Montgomery – May 17, 2016
2016-05-20 09:16:01 -05:00