* Added mic test to enclosure
* Added system.test enclosure commands
This also improves the audio test method; it now uses the device's own voice, making the test faster to perform (a loopback sketch follows this list)
* Added a set function to the mycroft configuration
This allows setting config values in a particular section of the user config (illustrated after this list)
* Added upload and test-on-first-boot features
* Added mute test to enclosure
* Added mouth spelling to the spelling skill (mycroft/skills/spelling/__init__.py); the word is spelled on the enclosure mouth after it is spoken (a condensed skill sketch follows this list)
* Reworked the spelling skill to use message emitters, importing the correct emitter method
* Moved the spelling skill onto enclosure methods and added a brief pause before the listener is reactivated (sketched after this list)
* Added an API listener method
* Added constants
* Assorted ordering, boolean-logic, and syntax fixes, plus extra logging, from iterative testing and debugging
* PEP8 cleanup
* Added volume control in enclosure.py (see the volume sketch after this list)
* Fixed syntax for skill method usage
* Addressed a problem with the enclosure service by switching to intents
* Fixed PEP8 issues
* Fixed an issue with reversed volume controls
* Bumped the enclosure version number to reflect the updated enclosure code
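
The following sketches illustrate a few of the items above; each is a minimal, assumption-laden example, not the shipped code.

First, the self-voiced audio test as a loopback check: emit a `speak` message so the device talks, then verify the microphone picks the speech up via PyAudio. The payload key, sample rate, and RMS threshold are assumptions, not the actual system.test implementation.

```python
import audioop
import time

import pyaudio
from mycroft.messagebus.message import Message


def audio_self_test(emitter, seconds=3, rms_threshold=300):
    """Speak through the device's own TTS and verify the mic hears it."""
    # Ask the speech service to say a test phrase. The payload key
    # ('utterance') is assumed here.
    emitter.emit(Message('speak', {'utterance': 'Testing one two three'}))

    pa = pyaudio.PyAudio()
    stream = pa.open(format=pyaudio.paInt16, channels=1, rate=16000,
                     input=True, frames_per_buffer=1024)
    peak = 0
    end = time.time() + seconds
    while time.time() < end:
        chunk = stream.read(1024, exception_on_overflow=False)
        peak = max(peak, audioop.rms(chunk, 2))  # RMS of 16-bit samples
    stream.stop_stream()
    stream.close()
    pa.terminate()

    # If the mic heard the spoken phrase, peak RMS should clear the
    # (assumed) ambient-noise threshold.
    return peak >= rms_threshold
```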
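A minimal sketch of what the configuration set helper could look like, assuming the user config is a JSON file at `~/.mycroft/mycroft.conf`; the path and the function name are illustrative, not necessarily the actual `mycroft.configuration` API.

```python
import json
import os

USER_CONFIG = os.path.expanduser('~/.mycroft/mycroft.conf')


def set_config(section, key, value, path=USER_CONFIG):
    """Write `value` under `section`/`key` in the user config file."""
    config = {}
    if os.path.isfile(path):
        with open(path) as f:
            config = json.load(f)
    config.setdefault(section, {})[key] = value  # create the section if missing
    with open(path, 'w') as f:
        json.dump(config, f, indent=4)


# e.g. set_config('enclosure', 'version', '0.1.2')
```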
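A condensed sketch of the spelling skill described above, built on the `MycroftSkill` base class and the enclosure's mouth display; the vocabulary names and the message payload attribute (`metadata` here, `data` in later releases) are assumptions.

```python
import time

from adapt.intent import IntentBuilder
from mycroft.skills.core import MycroftSkill


class SpellingSkill(MycroftSkill):
    def __init__(self):
        super(SpellingSkill, self).__init__(name='SpellingSkill')

    def initialize(self):
        intent = IntentBuilder('SpellingIntent') \
            .require('SpellingKeyword') \
            .require('Word').build()
        self.register_intent(intent, self.handle_intent)

    def handle_intent(self, message):
        word = message.metadata.get('Word')  # 'data' in later releases
        self.speak(', '.join(word))          # read out each letter first
        self.enclosure.mouth_text(word)      # then spell it on the mouth
        time.sleep(1)                        # brief pause, as in the log above
        self.enclosure.reset()

    def stop(self):
        pass


def create_skill():
    return SpellingSkill()
```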
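The brief pause before listener reactivation could look like the following; the `'mycroft.mic.listen'` message type is an assumption for illustration.

```python
import time

from mycroft.messagebus.message import Message


def reactivate_listener(emitter, pause=0.5):
    time.sleep(pause)  # let the spoken audio drain before listening again
    emitter.emit(Message('mycroft.mic.listen'))  # assumed message type
```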
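Finally, a rough sketch of the kind of volume handling added to enclosure.py, assuming the board accepts simple serial commands; the command string and the 0-11 scale are illustrative. Clamping keeps repeated up/down commands within the hardware's range.

```python
class EnclosureVolume(object):
    """Tracks a volume level and forwards changes to the enclosure board."""

    def __init__(self, serial_writer):
        self.writer = serial_writer  # callable that writes one serial command
        self.level = 6               # assumed mid-scale default

    def volume_up(self):
        self.set_volume(self.level + 1)

    def volume_down(self):
        self.set_volume(self.level - 1)

    def set_volume(self, level):
        self.level = max(0, min(11, level))    # clamp to the assumed 0-11 scale
        self.writer('volume=%d' % self.level)  # assumed command format
```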
The 1980s birthed a new form of interaction between computers and users. For the first time, computers became capable of understanding the most basic form of human communication – pointing and grunting. The mouse and the GUI revolutionized computing and made computers accessible to the masses.
We have now entered a third era, after the command line and the GUI. We are rapidly approaching a time when computer systems will understand human language and respond using the most natural form of human communication – speech.
This is an important development. Some might even call it revolutionary.
Despite its importance, however, the technologies that will underpin this new method of interaction are the property of major tech firms who don't necessarily have the public's best interests at heart.
Not anymore.
Meet Mycroft – the world's first open source natural language platform. Mycroft understands human language and responds with speech. It is being designed to run on anything from a phone to an automobile, and it will change the way we interact with open source technologies in profound ways.
Our goal here at Mycroft is to improve this technology to the point that, when you interact with the software, it is impossible to tell whether you are talking to a human or a machine.
This initial release of the Mycroft software represents a significant effort by the Mycroft community to give the open source world access to this important technology. We hope that the software will be useful to the public and will help usher in a new era of human-machine interaction.
Our community welcomes everyone to use Mycroft, improve the software, and contribute back to the project. With your help and support, we can truly make Mycroft an AI for everyone.
Joshua W Montgomery – May 17, 2016