Commit Graph

68 Commits (faa3070f3c9e06ada1a638cb4b5bee7bcac0aac3)

Author SHA1 Message Date
Jonathan D'Orleans 6cfe4a765c Issues 96 - Configuration Test
- Adding AbstractConfigurationTest
- Adding ConfigurationManagerTest
2016-06-09 18:26:20 -04:00
Jonathan D'Orleans e75d51b296 Issues 96 - Adding ConfigurationLoaderTest 2016-06-09 18:26:20 -04:00
Jonathan D'Orleans 8d1373387a Issues 96 - Renaming config to mycroft 2016-06-09 18:26:20 -04:00
Jonathan D'Orleans 42453b69d3 Issues 125 - Running test files without the need of 'test' suffix
- Removing unused 'util' folder
- Migrating test runner to module level __init__.py
- Updating start.sh to run new test file
2016-06-09 17:53:39 -04:00
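
The commit above moves the test runner into a module-level __init__.py. A minimal sketch of what such a runner might look like, assuming standard unittest discovery (the file layout, glob pattern, and function name here are illustrative, not the commit's actual code):

    # test/__init__.py -- hypothetical module-level test runner
    import os
    import unittest


    def discover_and_run():
        # Discover every *.py module in this package, so test files no longer
        # need a 'test' suffix to be collected (the pattern is an assumption).
        this_dir = os.path.dirname(os.path.abspath(__file__))
        suite = unittest.defaultTestLoader.discover(this_dir, pattern='*.py')
        return unittest.TextTestRunner(verbosity=2).run(suite)

start.sh could then invoke something like python -c "import test; test.discover_and_run()" -- again an assumption about the wiring, not the commit's actual start.sh.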
Jonathan D'Orleans 2698e14f72 Issues 61 - Adjusting silence threshold
- Preventing short word commands such as 'stop' and 'record' from being ignored
- Adding unit tests for the new silence threshold
2016-06-02 19:31:29 -04:00
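
The silence-threshold change above is about not discarding very short utterances. A rough sketch of the idea, assuming an RMS-energy check on audio chunks (the threshold value, chunking, and names are assumptions, not the commit's implementation):

    import audioop

    SILENCE_THRESHOLD = 300   # RMS energy below this counts as silence (illustrative value)
    MIN_SPEECH_CHUNKS = 2     # keep even very short bursts such as "stop" or "record"


    def is_loud(chunk, sample_width=2):
        # True when the chunk's RMS energy rises above the silence threshold
        return audioop.rms(chunk, sample_width) > SILENCE_THRESHOLD


    def extract_phrase(chunks):
        # Keep the loud chunks; accept phrases as short as MIN_SPEECH_CHUNKS
        speech = [c for c in chunks if is_loud(c)]
        return b"".join(speech) if len(speech) >= MIN_SPEECH_CHUNKS else None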
Sean Fitzgerald 745d4920e1 Handle unexpected errors in the listener more gracefully; the unit test mock class shouldn't generate unexpected errors. 2016-05-28 14:28:12 -07:00
Ryan Sipes fac834cf4a Merge remote-tracking branch 'refs/remotes/origin/master'
Conflicts:
	mycroft/client/speech/listener.py
	mycroft/client/speech/wakeword_recognizer.py
2016-05-27 11:40:24 -05:00
Jonathan D'Orleans 1fdeb65d1b Issues 14 - Listener Improvements
- Adding headers
2016-05-26 16:55:40 -04:00
Ryan Sipes ffb088ef7b Fixed Pyflakes errors
Had to add exception handling and debug statements; may need further work.
2016-05-26 15:28:28 -05:00
Jonathan D'Orleans 11e1230889 Issues 14 - Listener Improvements
- Adding call_response audio test
2016-05-26 16:21:44 -04:00
Jonathan D'Orleans 08d07360da Issues 14 - Listener Improvements
- Renaming method names to convey more meaning
- Renaming variable names to disambiguate
- Adding description for word extractor test
2016-05-26 16:21:44 -04:00
Jonathan D'Orleans 841fc588a6 Issues 14 - Fixing weather audio filename 2016-05-26 16:21:44 -04:00
Jonathan D'Orleans 41027edc89 Issues 14 - Listener Improvements
- Fixing audio tests
- Adding wakeup audio test
2016-05-26 16:21:44 -04:00
Jonathan D'Orleans a7361ba14a Issues 14 - Listener Improvements
- Fixing audio test type
2016-05-26 16:21:43 -04:00
Jonathan D'Orleans d947a8ec98 Issues 14 - Listener Improvements
- Word extractor refactoring
2016-05-26 16:21:43 -04:00
Jonathan D'Orleans b9a34a03b5 Issues 14 - Listener Improvements
1. To prevent many of the listener "dead states" (keeping unwanted exceptions from breaking the loop)
2. Wake up must work for both "mycroft wake up" and "wake up mycroft"
3. To ensure "mycroft" keyword is always detected when it exists
4. To rely on "mycroft" instead of "hey mycroft"
5. To process "ok", "okay" and "alright" as part of the wake word detection
Some of the possible test cases:

1. wake up mycroft with: "mycroft wake up" and "wake up mycroft"
- it must wake up no matter the position of the "wake up" keyword
- it must always say it's awake the first time (before, it would only say it on the second wake-up attempt)

2. try "okay mycroft, what's the weather in lawrence" and similar requests with "alright"
- this ensures we ignore "okay" and "alright" in the sentence

3. "mycroft" keyword should be detected as much as possible even if the sentence does not trigger a real skill/action
- that means the wake word detection is working

4. try all of those from a close distance to the unit
- this ensures the changes are working

5. try all of those from a far distance to the unit
- to test the auto-gain mic (for those who have one)
2016-05-26 16:21:43 -04:00
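
An illustrative sketch of the matching rules described in the commit above (not the listener's actual code; the names and regex are assumptions): "mycroft" is detected anywhere in the transcription, leading "ok"/"okay"/"alright" are treated as part of the wake word, and "wake up" works on either side of the wake word.

    import re

    WAKE_WORD = "mycroft"
    PREFIXES = ("ok", "okay", "alright")


    def contains_wake_word(text):
        # The wake word counts no matter where it appears in the utterance
        return re.search(r"\b%s\b" % WAKE_WORD, text.lower()) is not None


    def strip_prefixes(text):
        # Drop leading "ok"/"okay"/"alright" so they are handled as part of
        # the wake word rather than as content words
        words = text.lower().split()
        while words and words[0] in PREFIXES:
            words.pop(0)
        return " ".join(words)


    def is_wake_up_command(text):
        # Accept both "mycroft wake up" and "wake up mycroft"
        stripped = strip_prefixes(text)
        return contains_wake_word(stripped) and "wake up" in stripped

    # e.g. is_wake_up_command("okay mycroft, wake up") -> True
    #      is_wake_up_command("wake up mycroft") -> True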
Leo Arias d618676089 Issues-4 - Fix pep8 errors. 2016-05-23 17:23:47 +00:00
Arron Atchison 6e42bb1736 In the 1970s, computer users had to understand the arcane syntax of the machines they used. They programmed their computers using the machine's native language and hardly gave it a thought.
The 1980s birthed a new form of interaction between computers and users.  For the first time computers became capable of understanding the most basic form of human communication - pointing and grunting.  The mouse and the GUI revolutionized computing and made computers accessible to the masses.

We have now entered a third era.  We are rapidly approaching a time when computer systems will understand human language and respond using the most natural form of human communication – speech.

This is an important development.  Some might even call it revolutionary.

Despite its importance, however, the technologies that will underpin this new method of interaction are the property of major tech firms who don't necessarily have the public's best interests at heart.

Not anymore.

Meet Mycroft – the world's first open source natural language platform.  Mycroft understands human language and responds with speech.  It is being designed to run on anything from a phone to an automobile and will change the way we interact with open source technologies in profound ways.

Our goal here at Mycroft is to improve this technology to the point that when you interact with the software it is impossible to tell if you are talking to a human or a machine.

This initial release of the Mycroft software represents a significant effort by the Mycroft community to give the open source world access to this important technology.  We are all hoping that the software will be useful to the public and will help to usher in a new era of human machine interaction.

Our community welcomes everyone to use Mycroft, improve the software and contribute back to the project.  With your help and support we can truly make Mycroft an AI for everyone.

Joshua W Montgomery – May 17, 2016
2016-05-20 09:16:01 -05:00