* Added new listener config options
* Fixed audio unit tests
This adds a clearer audio file for the wakeup test and changes for the new LocalRecognizer constructor
* Added .dict files to .gitignore
This is because they are now auto-generated on startup rather than stored permanently
* Fixed audio accuracy test for new LocalRecognizer constructor
* Added support for spaces in wake word config
In the phoneme string a new word is indicated by a period character. Separating the words changes the way pocketsphinx interprets the sound and, in this case, improves it.
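As a rough illustration (the wake phrase and phoneme string below are assumed examples, not necessarily the shipped defaults), splitting the phoneme string on the periods recovers one pronunciation per word, roughly the shape of the auto-generated .dict entries mentioned above:

```python
# Assumed example values; the real config keys and defaults may differ.
wake_word = "hey mycroft"             # wake word config value containing a space
phonemes = "HH EY . M AY K R AO F T"  # ' . ' marks the boundary between words

# Pair each word with its phoneme sequence, as a .dict generator might.
words = wake_word.split()
word_phonemes = [p.strip() for p in phonemes.split(".")]

for word, phones in zip(words, word_phonemes):
    print("{} {}".format(word, phones))
# hey HH EY
# mycroft M AY K R AO F T
```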
* Fixed unit test
* Added new compiled regex for "what's"
This is for cases where there is no space between the question word and the verb
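A minimal sketch of such a pattern (the expression and group names are illustrative assumptions, not the actual EnglishQuestionParser regexes):

```python
import re

# Matches question words fused with "is", e.g. "whats" or "what's",
# where no space separates the question word and the verb.
FUSED_QUESTION = re.compile(
    r"^(?P<QuestionWord>what|who|where)'?s\s+(?P<Query>.+)$",
    re.IGNORECASE)

match = FUSED_QUESTION.match("whats the weather in lawrence")
if match:
    print(match.group("QuestionWord"))  # what
    print(match.group("Query"))         # the weather in lawrence
```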
* Added support for 'whats' in wolfram
* Added tests for the EnglishQuestionParser
* Issue 192 - WA skill will now search again with the top alternative (see the sketch after these entries)
* Issue 192 - Add tests for the new method
* Issue 192 - Fix pep8
* Issue 192 - Update test syntax
* Issue 192 - Change WA skill to use dialog
* Issue 192 - Address feedback
* Issue 158 - Update ScheduledSkill time formatting
* Issue 158 - Add unit tests for the new method changes
* Issue 158 - Change to a non-date-based implementation to fix tests
* Issue 158 - Address feedback
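For Issue 192, a minimal sketch of the retry-with-top-alternative flow mentioned above (the function names and the shape of the response are assumptions, not the actual Wolfram Alpha skill code):

```python
from typing import Callable, Optional

def pick_answer(response: dict) -> Optional[str]:
    """Return the first answer text from a (hypothetical) response dict."""
    answers = response.get("answers", [])
    return answers[0] if answers else None

def search_with_retry(query_fn: Callable[[str], dict], query: str) -> Optional[str]:
    """Query once; if there is no answer, retry with the top alternative suggestion."""
    response = query_fn(query)
    answer = pick_answer(response)
    if answer is not None:
        return answer
    alternatives = response.get("did_you_means", [])
    if alternatives:
        return pick_answer(query_fn(alternatives[0]))
    return None
```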
One unit test, call and response, no longer directly applies. Perhaps we should test that the listener waits for the user to respond; however, for that I think the 1000 audio samples would work better.
Another unit test, for the WakeWord extractor, should be routed through the WakeWord extractor directly. For the moment this has been disabled (it isn't a high priority because the WakeWord extractor is not currently used), but it should be a TODO.
1. To prevent many of the listener "dead states" (avoiding unwanted exceptions that break the loop)
2. Wake up must work for both "mycroft wake up" and "wake up mycroft"
3. To ensure the "mycroft" keyword is always detected whenever it is present
4. To rely on "mycroft" instead of "hey mycroft"
5. To process "ok", "okay" and "alright" as part of the wake word detection (see the sketch after this list)
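A minimal sketch of the wake-up and filler-word handling described in points 2 and 5 (the names are illustrative, not the actual listener code):

```python
FILLERS = ("ok", "okay", "alright")

def contains_wake_up(transcription: str) -> bool:
    """True for both "mycroft wake up" and "wake up mycroft"."""
    text = transcription.lower()
    return "mycroft" in text and "wake up" in text

def strip_fillers(transcription: str) -> str:
    """Drop leading "ok"/"okay"/"alright" so detection still sees "mycroft" first."""
    words = transcription.lower().split()
    while words and words[0].strip(",") in FILLERS:
        words.pop(0)
    return " ".join(words)

print(contains_wake_up("wake up mycroft"))                # True
print(strip_fillers("okay mycroft, what's the weather"))  # mycroft, what's the weather
```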
Some of the possible test cases:
1. wake up mycroft with: "mycroft wake up" and "wake up mycroft"
- it must wake up no matter the position of the "wake up" keyword
- it must always say it's awake the first time (before, it would only say it the second time you tried to wake it up)
2. try "okay mycroft, what's the weather in lawrence" and similar requests with "alright"
- this ensures "okay" and "alright" are ignored in the sentence
3. the "mycroft" keyword should be detected as often as possible, even if the sentence does not trigger a real skill/action
- that means the wake word detection is working
4. try all of those from a close distance to the unit
- this ensures the changes are working
5. try all of those from a far distance to the unit
- to test the auto-gain mic (for those who have one)
The 1980s birthed a new form of interaction between computers and users. For the first time computers became capable of understanding the most basic form of human communication - pointing and grunting. The mouse and the GUI revolutionized computing and made computers accessible to the masses.
We have now entered a third era. We are rapidly approaching a time when computer systems will understand human language and respond using the most natural form of human communication – speech.
This is an important development. Some might even call it revolutionary.
Despite its importance, however, the technologies that will underpin this new method of interaction are the property of major tech firms who don't necessarily have the public's best interests at heart.
Not anymore.
Meet Mycroft – the world's first open source natural language platform. Mycroft understands human language and responds with speech. It is being designed to run on anything from a phone to an automobile and will change the way we interact with open source technologies in profound ways.
Our goal here at Mycroft is to improve this technology to the point that when you interact with the software it is impossible to tell if you are talking to a human or a machine.
This initial release of the Mycroft software represents a significant effort by the Mycroft community to give the open source world access to this important technology. We are all hoping that the software will be useful to the public and will help to usher in a new era of human-machine interaction.
Our community welcomes everyone to use Mycroft, improve the software and contribute back to the project. With your help and support we can truly make Mycroft an AI for everyone.
Joshua W Montgomery – May 17, 2016