* Add support for different display currencies in CoinMarketCap sensor.
* Add test for CoinMarketCap sensor.
* Add test dependency to gen_requirements_all.
* Fix review comments: use string formatting and fewer string case changes.
* Trend sensor now uses linear regression to calculate trend
* Added numpy to trend sensor test requirements
* Added trendline tests
* Trend sensor now has max_samples attribute
* Trend sensor uses utcnow from HA utils
* Trend sensor now completes setup in async_added_to_hass
* Fixed linter issues
* Fixed broken import
* Trend tests make use of max_samples
* Added @asyncio.coroutine decorator to trend update callback
* Update trend.py
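For context on the linear-regression change above, here is a minimal sketch of computing a trend gradient from timestamped samples with numpy; the function name and sample values are illustrative, not the component's actual code.
```
import numpy as np

def calculate_gradient(samples):
    """Fit a least-squares line to (timestamp, value) samples and return its slope."""
    timestamps, values = zip(*samples)
    # numpy.polyfit with degree 1 returns [slope, intercept]
    gradient, _intercept = np.polyfit(timestamps, values, 1)
    return gradient

# A positive gradient means the monitored value is trending upward.
print(calculate_gradient([(0, 1.0), (60, 1.5), (120, 2.1)]) > 0)  # True
```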
* Init commit of new whois sensor
* Updated requirements
* Resolved updated date showing as expired, added expired attribute
* Added missing attribute in init, added whois to coveragerc
* Various PR comment changes
- Now more resilient to invalid hostnames
- Removed various assumed STATE_UNKNOWN settings
- Upfront check for a valid hostname, preventing the sensor from starting with a dud
- Resolved unit of measurement Day, Days, None issue
- Datetime formatting now done to the ISO 8601 standard
- Removed all expired usage, not really that useful
- Removed unused hass assignment
* More PR comment resolutions
- Resolved the dilemma with hosts vs. a single host per sensor. Now running a
single domain per sensor.
- Renamed host(s) to domain
* Moved coveragerc sensor location
* Re-phrased the expiration_date warning
* Resolved assumed updated_date existence
* Resolved missing indent
* Resolved discover_info typo
* Update whois.py
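The ISO 8601 item above boils down to using the standard library's isoformat(); a small sketch with a made-up expiration date (the real value comes from the whois lookup):
```
from datetime import datetime

# Hypothetical value; the real one comes from the whois lookup.
expiration_date = datetime(2018, 5, 17, 8, 30, 0)

attributes = {'expiration_date': expiration_date.isoformat()}
print(attributes['expiration_date'])  # 2018-05-17T08:30:00
```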
* Using defusedxml ElementTree for safer parsing of untrusted XML data
* move from core dependency to platform specific dependency
* style difference: put back end of list comma in setup.py
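A minimal sketch of the defusedxml swap: the package mirrors the ElementTree API while refusing dangerous constructs such as entity expansion, so parsing untrusted payloads is safer. The XML payload here is made up.
```
from defusedxml import ElementTree

payload = '<devices><device id="1" name="lamp"/></devices>'
root = ElementTree.fromstring(payload)
for device in root.findall('device'):
    print(device.get('id'), device.get('name'))
```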
* new geo rss events sensor
* SCAN_INTERVAL instead of DEFAULT_SCAN_INTERVAL
* removed redefinition CONF_SCAN_INTERVAL
* definition of self._name not required
* removed unnecessary check and unnecessary parameter
* changed log levels
* fixed default name not used
* streamlined sensor name and entity id generation, removed unnecessary parameter
* fixed issue for entries without geometry data
* fixed tests after code changes
* simplified code
* simplified code; removed unnecessary imports
* fixed invalid variable name
* shorter sensor name and in turn entity id
* increasing test coverage for previously untested code
* fixed indentation and variable usage
* simplified test code
* merged two similar tests
* fixed an issue when no data could be fetched from the external service; added a test case for this
Prometheus (https://prometheus.io/) is an open source metrics and alerting
system. This adds support for exporting some metrics to Prometheus, using
its Python client library.
* Add initial version
* Fix requirements
* Prefer logging over printing
* Set executor thread name on >Py36 only
* Add tests
* Lint
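As a rough illustration of exporting a metric with the Python client library (the metric name and label are assumptions, not the component's actual naming):
```
from prometheus_client import Gauge, generate_latest

# Hypothetical metric; the component's real metric names may differ.
TEMPERATURE = Gauge('homeassistant_sensor_temperature_celsius',
                    'Temperature reported by a sensor', ['entity'])
TEMPERATURE.labels(entity='sensor.living_room').set(21.5)

# The exporter endpoint serves the text format that Prometheus scrapes.
print(generate_latest().decode('utf-8'))
```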
* Add restrictedpython to test dependencies
* Create python_script.py
From doc:
```
However, an empty dict ({}) is treated as is. If you want to specify a list that can contain anything, specify it as dict:
>>> schema = Schema({}, extra=ALLOW_EXTRA) # don't do this
>>> try:
... schema({'extra': 1})
... raise AssertionError('MultipleInvalid not raised')
... except MultipleInvalid as e:
... exc = e
>>> str(exc) == "not a valid value"
True
>>> schema({})
{}
>>> schema = Schema(dict) # do this instead
>>> schema({})
{}
>>> schema({'extra': 1})
{'extra': 1}
```
Using `python setup.py develop` did not manage to install the required dependencies.
This updates `script/setup` to use `pip install -e .` instead in order to resolve the required dependencies.
* No longer require pyunifi during tests
* No longer require cast during tests
* No longer required dependency for tests
* No longer require pymochad for tests
* Astral is a core dependency
* Avoid having to install datadog dependency during tests
* CMUS test doesn't test anything
* Frontier Silicon doesn't test anything
* No longer require mutagen
* Update requirements_test_all.txt
* Remove stale comment
* ps - do not install all dependencies
* Comment out blinkt because it depends on GPIO
* Add pip upgrade check back
* Disable import error blinkt
* Update comment
* Fix comment
* Add new raspihats component
* added raspihats to COMMENT_REQUIREMENTS in gen_requirements_all.py
* disabled pylint import errors
* using hass.data for storing i2c-hats manager
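A minimal sketch of the hass.data pattern referenced above; the DOMAIN key and manager class are stand-ins, not the component's actual names.
```
from types import SimpleNamespace

DOMAIN = 'raspihats'  # assumed key; the component may use another constant


class I2CHatsManager:
    """Stand-in for the real manager that talks to the I2C-HAT boards."""


def setup(hass, config):
    """Create one manager and share it with all platforms via hass.data."""
    hass.data[DOMAIN] = I2CHatsManager()
    return True


def get_manager(hass):
    """Platforms fetch the shared manager instead of creating their own."""
    return hass.data[DOMAIN]


hass = SimpleNamespace(data={})  # minimal stand-in for the hass object
setup(hass, {})
assert isinstance(get_manager(hass), I2CHatsManager)
```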
* Added eddystone_temperature platform.
* Fixed style issues.
* Fixed style issues #2.
* Fixed style issues #3.
* Added new platform to .coveragerc
* Refactored platform to use the beacontools package.
* Fixed style issues and added beacontools to excluded requirements.
* Removed obsolete constants and added pylint exception.
* Added blank line
* Updated beacontools to version 1.0.0
* Updated beacontools to version 1.0.1
* Forgot to regenerate requirements_all
* Minor changes
* Refactors script/setup_docker_prereqs
Refactors script/setup_docker_prereqs to allow toggling of packages
to being installed
* Adds support for openalpr to Docker
* Updates Dockerfile
Comments ENV directives in order to preserve cache.
* Fixes incorrect position of echo
* Fixes telldus installer by updating apt before pkg install
* Remove build dirs from docker image to keep the layers small
* Create setup_docker_prereqs script to prepare docker env
* Add documentation for required packages, drop colorlog and cython in first step of Dockerfile since it will be installed later on anyway. Drop libglib2.0-dev and libbluetooth-dev
* Also remove early install of colorlog and cython in Dockerfile.dev
* Re-add libglib2.0-dev and libbluetooth-dev for Bluetooth LE
* Add a volume to store the tox cache on the host. This gives quite a speed boost when running lint_docker and test_docker.
* Only map .tox directory for cache.
* Allow bower install of frontend components as root. Needed for frontend development in docker since everything runs as root in the docker image.
* Improve development workflow in docker
* Use LANG=C.UTF-8 in tox. Fixes installation of libraries with UTF-8 in their readme.
* Install mysqlclient, psycopg2 and uvloop after requirements_all.txt again, but with --no-cache-dir this time. Allows bootstrap_frontend to be executed in a different path like the other scripts.
* Added Bluetooth Low Energy device tracker
* Added new file(s)
* Fixed pylint errors
* Remove trailing zeros from device names
* recreated deleted file
* Added requirements
* Renamed to bluetooth_le tracker
Removed gattlib from tests
Minor code cleanup
* Fixed .coveragerc bug
- Changed discovery algorithm: new devices will only be added if seen 5 times, to make sure
HA doesn't flood the database with devices just passing by
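The "seen 5 times" rule above can be sketched as a simple sighting counter; the threshold constant and function are illustrative only.
```
MIN_SEEN_NEW = 5  # threshold from the description above

seen_counts = {}
tracked = set()

def see_device(address):
    """Count sightings; only start tracking a device after enough of them."""
    if address in tracked:
        return True
    seen_counts[address] = seen_counts.get(address, 0) + 1
    if seen_counts[address] >= MIN_SEEN_NEW:
        tracked.add(address)
        return True
    return False

# A device just passing by never makes it into the database.
for _ in range(4):
    assert not see_device('AA:BB:CC:DD:EE:FF')
assert see_device('AA:BB:CC:DD:EE:FF')
```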
* Initial work to add Chrome Push Notification support
* Remove push.js from home-assistant since it is now in Polymer
* Chrome->HTML5, general cleanup/fixes
* Make html5 generic, move manifest.json into frontend so that we can dynamically add the gcm_sender_id
* Pylint, flake8, pydocstyle frontend init
* HTML5 push fixes
* Update polymer
* Remove crypto req
* Add notify default platform.
* Fix HTML5 push
* Registration fixes
* Linting fix
* pep257 fix
* Add tests
* pep257 fix
* Update frontend
* Switch to SQLAlchemy for the Recorder component. Gives the ability to use MySQL or other databases.
* fixes for failed lint
* add conversion script
* code review fixes and refactor to use to_native() model methods and execute() helper
* move script to homeassistant.scripts module
* style fixes that my tox lint/flake8 run missed
* move exclusion up
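With SQLAlchemy, switching the recorder to MySQL or another database is mostly a matter of changing the connection URL; a hedged sketch (the URLs and filename are examples, not necessarily the recorder's defaults):
```
from sqlalchemy import create_engine, text
from sqlalchemy.orm import sessionmaker

# Example URLs; the recorder reads its own db_url from configuration.
DB_URL = 'sqlite:///home-assistant_example.db'
# DB_URL = 'mysql://user:password@localhost/homeassistant'

engine = create_engine(DB_URL)
Session = sessionmaker(bind=engine)

with engine.connect() as connection:
    print(connection.execute(text('SELECT 1')).scalar())  # 1
```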
* Bump version of pymysensors to 0.6, which includes the tcp gateway.
* Update requirements_all.txt.
* Replace CONF_PORT with CONF_DEVICE and ATTR_PORT with ATTR_DEVICE.
* Add tcp_port in config.
* Try to guess whether a TCP or serial gateway is configured, by validating the
device name as an IP address. If successful, set up a TCP gateway; if it
fails, set up a serial gateway.
* Update device_state_attributes to show the correct device, ethernet or
serial.
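A sketch of the "guess TCP vs. serial by validating the device name as an IP address" heuristic; the helper name is made up.
```
import socket

def is_ip_address(device):
    """Return True if the configured device parses as an IPv4 address."""
    try:
        socket.inet_aton(device)
        return True
    except OSError:
        return False

# '192.168.1.10' -> tcp gateway, '/dev/ttyACM0' -> serial gateway
for device in ('192.168.1.10', '/dev/ttyACM0'):
    print(device, '->', 'tcp' if is_ip_address(device) else 'serial')
```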
This tracker discovers new devices on boot and tracks Bluetooth devices
periodically based on the interval_seconds value. Discovered devices are
stored with 'BT_' as the prefix for the device MAC.
Requires PyBluez.
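A minimal sketch of the periodic scan with PyBluez; the scan duration and naming are illustrative, and running it requires a local Bluetooth adapter.
```
import bluetooth  # provided by the PyBluez package

def scan():
    """Scan for nearby devices and key tracker entries by 'BT_' + MAC."""
    results = bluetooth.discover_devices(duration=8, lookup_names=True)
    return {'BT_' + address: name for address, name in results}

print(scan())  # e.g. {'BT_00:11:22:33:44:55': 'My phone'}
```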
This converts the testing infrastructure to tox for both local testing
and Travis. This is nearly equivalent to the previous testing, with the
only exception that linting fails at the first tool to fail and won't
process all of them.
Slightly tricky thing is that tox resets *all* of the environment for
its subprocess runs by default. A couple of the dependencies we have
will not install in non-UTF-8 locales: temper-python & XBee.
This adds a default 30 second timeout on every test method so that
deadlocks or broken threads are more obvious in Travis. It also passes
-v by default to make things a little more verbose about where things
fail when they are failing.
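One possible way to implement such a per-test timeout is a SIGALRM-based decorator (Unix only); the actual test helper may differ.
```
import functools
import signal

def timed(seconds=30):
    """Fail a test if it runs longer than the given number of seconds."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            def handler(signum, frame):
                raise AssertionError('Test timed out after %d seconds' % seconds)
            old_handler = signal.signal(signal.SIGALRM, handler)
            signal.alarm(seconds)
            try:
                return func(*args, **kwargs)
            finally:
                signal.alarm(0)
                signal.signal(signal.SIGALRM, old_handler)
        return wrapper
    return decorator

@timed(30)
def test_something():
    pass

test_something()
```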
Added a sample nginx configuration with instructions detailing how to
set up a very secure HTTPS server for HA that serves over standard
ports without requiring HA to run as root.