Mycroft
==========

In the 1970s, computer users had to understand the arcane syntax of the machines they used. They programmed their computers using the machine's native language and hardly gave it a thought.

The 1980s birthed a new form of interaction between computers and users. For the first time, computers became capable of understanding the most basic form of human communication - pointing and grunting. The mouse and the GUI revolutionized computing and made computers accessible to the masses.

We have now entered a third era. We are rapidly approaching a time when computer systems will understand human language and respond using the most natural form of human communication – speech. This is an important development. Some might even call it revolutionary.

Despite its importance, however, the technologies that will underpin this new method of interaction are the property of major tech firms who don't necessarily have the public's best interests at heart. Not anymore.

Meet Mycroft – the world's first open source natural language platform. Mycroft understands human language and responds with speech. It is being designed to run on anything from a phone to an automobile and will change the way we interact with open source technologies in profound ways. Our goal here at Mycroft is to improve this technology to the point that when you interact with the software it is impossible to tell if you are talking to a human or a machine.

This initial release of the Mycroft software represents a significant effort by the Mycroft community to give the open source world access to this important technology. We are all hoping that the software will be useful to the public and will help to usher in a new era of human-machine interaction.

Our community welcomes everyone to use Mycroft, improve the software and contribute back to the project. With your help and support we can truly make Mycroft an AI for everyone.

Joshua W Montgomery – May 17, 2016

Full docs at: https://docs.mycroft.ai
Pair your Mycroft instance with the Cerberus Account Management Service: https://cerberus.mycroft.ai
# Getting Started in Ubuntu - Development Environment
- Install `virtualenv` >= 13.1.2 and `virtualenvwrapper`, then restart your session
- Install the following native packages (a combined example follows this list):
  - `libtool`
  - `autoconf`
  - `bison`
  - `swig`
  - `libglib2.0-dev`
  - `portaudio19-dev`
  - `python-dev`
  - `curl`
  - `mpg123`
  - `espeak`
- In addition, if you are running Ubuntu 16.04 or another O/S, install:
  - `libffi-dev`
  - `libssl-dev`
- Run `dev_setup.sh` (feel free to read it, as well)
- Restart your session (reboot the computer; logging out and back in may also work)
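
Taken together, the steps above might look something like the following on a stock Ubuntu system. This is a minimal sketch: the package names come from the list above, `pip` is assumed as the way to install `virtualenv`/`virtualenvwrapper`, and the checkout location `~/mycroft-core` is illustrative.

```
# Native packages from the list above
# (add libffi-dev and libssl-dev on Ubuntu 16.04)
sudo apt-get install libtool autoconf bison swig libglib2.0-dev \
    portaudio19-dev python-dev curl mpg123 espeak

# Python tooling for the isolated environment (assumed to come from pip)
sudo pip install "virtualenv>=13.1.2" virtualenvwrapper

# Run the setup script from the root of the checkout (illustrative path)
cd ~/mycroft-core
./dev_setup.sh
```

After `dev_setup.sh` finishes, restart your session as noted above so that `virtualenvwrapper` is available in new shells.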
## Cerberus Device and Account Manager
Mycroft AI, Inc., the company behind Mycroft, maintains the Cerberus device and account management system. Developers can sign up at https://cerberus.mycroft.ai

By default, the Mycroft software is configured to use Cerberus. Upon a request such as "Hey Mycroft, what is the weather?", you will be informed that you need to pair, and Mycroft will speak a 6-digit code, which you enter into the pairing page on the [Cerberus site](https://cerberus.mycroft.ai).
Once signed up and a device is paired, the unit will use our API keys for services such as the STT (Speech-to-Text) API. It also allows you to use our API keys for weather, Wolfram-Alpha, and various other skills.

Pairing information generated by registering with Cerberus is stored in:

`~/.mycroft/identity/identity.json` <b><-- DO NOT SHARE THIS WITH OTHERS!</b>
It's useful to know the location of the identity file when troubleshooting device pairing issues.
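
When troubleshooting pairing, a quick way to confirm whether pairing information has been stored is to look for that file (a simple sketch using the path above):

```
# Check whether pairing information has been stored for this device
ls -l ~/.mycroft/identity/identity.json

# Keep a private backup before experimenting; never share or publish this file
cp ~/.mycroft/identity/identity.json ~/identity.json.bak
```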
## Using Mycroft without Cerberus
If you do not wish to use our service, you may insert your own API keys into the configuration files listed below in <b>Configuration</b>.
The place to insert the API key looks like the following:
`[WeatherSkill]`
`api_key = ""`
Put the relevant key between the quotes and Mycroft Core should begin to use it immediately.
### API Key services
- [STT API, Google STT](http://www.chromium.org/developers/how-tos/api-keys)
- [Weather Skill API, OpenWeatherMap](http://openweathermap.org/api)
- [Wolfram-Alpha Skill](http://products.wolframalpha.com/api/)
These are the keys currently in use in Mycroft Core.
## Configuration
Mycroft configuration consists of 3 possible config files.
- `defaults.ini`, which lives inside the mycroft codebase/distribution
- `/etc/mycroft/mycroft.ini`
- `$HOME/.mycroft/mycroft.ini`
When the configuration loader starts, it looks in those locations in that order, and loads ALL configuration. Keys that exist in multiple config files will be overridden by the last file to contain that config value. This results in a minimal amount of config being written for a specific device/user, without modifying the distribution files.
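
As a concrete sketch of this layering, a per-user override file only needs to contain the values being changed, for example an API key. The key below is a placeholder, and `[WeatherSkill]` is the section shown earlier:

```
# Create a per-user override; because $HOME/.mycroft/mycroft.ini is loaded
# last, anything set here wins over defaults.ini and /etc/mycroft/mycroft.ini.
# (This overwrites an existing user config; edit the file by hand if you
# already have one.)
mkdir -p ~/.mycroft
cat > ~/.mycroft/mycroft.ini <<'EOF'
[WeatherSkill]
api_key = "your-openweathermap-key-here"
EOF
```

Keys that are not set here simply fall through to the earlier files.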
## Starting the Virtualenv
Everything is installed inside the Mycroft virtualenv, so make sure you are in it before trying to start any of the services. Run:
```
workon mycroft
```
### Running the initial stack
Each component can be started by hand as follows (a combined sketch follows this list):
- run `PYTHONPATH=. python client/speech/main.py` # the main speech detection loop, which prints events to stdout and broadcasts them to a message bus
- run `PYTHONPATH=. python client/messagebus/service/main.py` # the main message bus, implemented via web sockets
- run `PYTHONPATH=. python client/skills/main.py` # main skills executable, loads all skills under skills dir
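
Put together, a development session might look like the following, with each component in its own terminal and the message bus typically started first so the other two have something to connect to (a sketch based on the commands above):

```
# Terminal 1: message bus (WebSocket based)
workon mycroft
PYTHONPATH=. python client/messagebus/service/main.py

# Terminal 2: skills service (loads all skills under the skills dir)
workon mycroft
PYTHONPATH=. python client/skills/main.py

# Terminal 3: speech client (the main speech detection loop)
workon mycroft
PYTHONPATH=. python client/speech/main.py
```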
### Running the stack via the script
- run `./start.sh service`
- run `./start.sh skills`
- run `./start.sh voice`