Merge branch 'master' into 'CentOS_Installer_8_Update'

# Conflicts:
#   INSTALL/CentOS - Quick Install.sh
fix-non-showing-inputs
Mike Hurley 2020-10-01 20:15:03 +00:00
commit a8b20fbb10
230 changed files with 37393 additions and 16968 deletions

.gitignore vendored

@ -9,4 +9,7 @@ conf.json
super.json
dbdata
npm-debug.log
shinobi.sqlite
shinobi.sqlite
package-lock.json
dist
._*

CONTRIBUTING.md Normal file

@ -0,0 +1,9 @@
## Contributing license
By contributing to this repository, you agree that your contributions are bound by the [SHINOBI OPEN SOURCE SOFTWARE LICENSE AGREEMENT](LICENSE.md)
## Suggestions
For suggestions to this project, please refer to the [README](README.md)
## Development
For contributing to the repository, please refer to [DEVELOPMENT.md](DEVELOPMENT.md)


@ -15,8 +15,6 @@ any sort of super user access.
Prerequisites
=============
> To get all of Shinobi at once you can use the Ninja Way. Learn more about that here
https://shinobi.video/docs/start#content-the-ninja-way
- *Node.js* :
You'll need Node and NPM, and for this guide we recommend you set up a user-local install of
@ -34,6 +32,11 @@ instance you're working with by copying it. It is recommended to use at least ve
You'll also need FFmpeg. This is the video processing engine at the core of Shinobi. You will
need at least version 3.3.3.
### Installing prerequisites automatically
To get all of Shinobi at once you can use the Ninja Way. Learn more about that here
https://shinobi.video/docs/start#content-the-ninja-way
However, this will download the repository to /home/Shinobi and start Shinobi. Once you have finished the installation using the Ninja Way, ensure that you stop Shinobi, otherwise it will conflict with your dev instance. To stop Shinobi, run `sudo pm2 stop camera.js` and `sudo pm2 stop cron.js`.
Development
===========
First off you need to clone the Shinobi repository. Either the regular Shinobi repository or the
@ -56,6 +59,8 @@ git clone https://gitlab.com/Shinobi-Systems/ShinobiCE.git
```
Then cd into either the "Shinobi" or "ShinobiCE" directory.
Make sure to add your fork as a remote so you can send Merge Requests.
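Registering a fork as a remote can look like the following sketch (the remote name and URL are placeholders for your own fork; a throwaway repository is created here only so the commands run anywhere):

```shell
# Hypothetical example of adding your fork as a second remote.
# YOUR_USERNAME and the remote name "fork" are placeholders.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q demo          # stand-in for your Shinobi clone
cd demo
git remote add fork https://gitlab.com/YOUR_USERNAME/Shinobi.git
git remote -v             # confirm the remote is registered
```

You would then push feature branches with `git push fork my-branch` and open the Merge Request from your fork.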
Grabbing packages
-----------------
To install the required Node packages you need to install them with NPM:
@ -73,27 +78,7 @@ instances of Shinobi whenever someone updated.
```sh
cp conf.sample.json conf.json
```
Generally, SQLite is going to be easier for development, as we don't have to maintain
a MariaDB/MySQL installation and we may need to reset the database often. To set the DB to SQLite:
```sh
node tools/modifyConfiguration.js databaseType=sqlite
```
You can confirm this worked by checking conf.json for the following line
[near the end of the file]:
```json
"databaseType": "sqlite"
```
Currently Shinobi doesn't use a model framework or seeding system, so setting up the database
basically involves copying the pre-set instance bundled with the repository:
```sh
cp sql/shinobi.sample.sqlite shinobi.sqlite
```
### Resetting the DB
If you need to reset the database, you can now do so by deleting the shinobi.sqlite file and
copying the sql/shinobi.sample.sqlite file again. Just be sure to stop any running Shinobi
instances before you do so.
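The reset described above can be sketched as a small helper (a hypothetical script, not part of the repository; the sample file below is a stand-in so the sketch runs anywhere — in a real checkout you would run the `rm`/`cp` pair from the repository root):

```shell
# Hypothetical DB reset sketch. File names match the repository layout,
# but a throwaway directory and placeholder sample file are used here.
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/sql"
echo "sample-db" > "$tmp/sql/shinobi.sample.sqlite"   # stand-in for the bundled sample DB
cd "$tmp"
# The actual reset: remove the working DB, copy the pristine sample back.
rm -f shinobi.sqlite
cp sql/shinobi.sample.sqlite shinobi.sqlite
echo "reset done"
```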
Enabling the Super User Interface
---------------------------------
@ -113,4 +98,8 @@ and "cron" processes directly. To monitor output, we recommend you use a terminal
multiplexer such as byobu, tmux, or screen. In one terminal window, run ```node cron.js``` and in another run
```node camera.js```. Shinobi should now be running on port 8080 on your local machine (you can
change the port in conf.json) and accessible at http://localhost:8080 in your browser. Any source
code changes you make will require restarting either the camera or cron process [or both].
code changes you make will require restarting either the camera or cron process [or both]. To avoid restarting manually, use the npm package `nodemon` and run these commands in two separate terminals:
```sh
npx nodemon camera.js
npx nodemon cron.js
```

Docker/README.md Normal file

@ -0,0 +1,138 @@
# Install Shinobi with Docker
### There are three ways!
## Docker Ninja Way
> This method uses `docker-compose` and has the ability to quick install the TensorFlow Object Detection plugin.
```
bash <(curl -s https://gitlab.com/Shinobi-Systems/Shinobi-Installer/raw/master/shinobi-docker.sh)
```
## Docker Ninja Way - Version 2
#### Installing Shinobi
> Please remember to check out the Environment Variables table further down this README.
```
docker run -d --name='Shinobi' -p '8080:8080/tcp' -v "/dev/shm/Shinobi/streams":'/dev/shm/streams':'rw' -v "$HOME/Shinobi/config":'/config':'rw' -v "$HOME/Shinobi/customAutoLoad":'/home/Shinobi/libs/customAutoLoad':'rw' -v "$HOME/Shinobi/database":'/var/lib/mysql':'rw' -v "$HOME/Shinobi/videos":'/home/Shinobi/videos':'rw' -v "$HOME/Shinobi/plugins":'/home/Shinobi/plugins':'rw' -v '/etc/localtime':'/etc/localtime':'ro' shinobisystems/shinobi:dev
```
#### Installing Object Detection (TensorFlow.js)
> This requires that you add the plugin key to the Shinobi container. This key is generated and displayed in the startup logs of the Object Detection docker container.
- `-p '8082:8082/tcp'` is an optional flag if you decide to run the plugin in host mode.
- `-e PLUGIN_HOST='10.1.103.113'` Set this as your Shinobi IP Address.
- `-e PLUGIN_PORT='8080'` Set this as your Shinobi Web Port number.
```
docker run -d --name='shinobi-tensorflow' -e PLUGIN_HOST='10.1.103.113' -e PLUGIN_PORT='8080' -v "$HOME/Shinobi/docker-plugins/tensorflow":'/config':'rw' shinobisystems/shinobi-tensorflow:latest
```
More Information about this plugin :
- CPU : https://gitlab.com/Shinobi-Systems/docker-plugin-tensorflow.js
- GPU (NVIDIA CUDA) : https://gitlab.com/Shinobi-Systems/docker-plugin-tensorflow.js/-/tree/gpu
## From Source
> Image is based on Debian Buster (`node:12.18.3-buster-slim`). Node.js 12 is used. MariaDB and FFmpeg are included.
1. Download Repo
```
git clone -b dev https://gitlab.com/Shinobi-Systems/Shinobi.git ShinobiSource
```
2. Enter Repo and Build Image.
```
cd ShinobiSource
docker build --tag shinobi-image:1.0 .
```
3. Create a container with the image.
> This command only works on Linux because of the temporary directory used. This location must exist in RAM. `-v "/dev/shm/shinobiStreams":'/dev/shm/streams':'rw'`. The timezone is also acquired from the host by the volume declaration of `-v '/etc/localtime':'/etc/localtime':'ro'`.
```
docker run -d --name='Shinobi' -p '8080:8080/tcp' -v "/dev/shm/Shinobi/streams":'/dev/shm/streams':'rw' -v "$HOME/Shinobi/config":'/config':'rw' -v "$HOME/Shinobi/customAutoLoad":'/home/Shinobi/libs/customAutoLoad':'rw' -v "$HOME/Shinobi/database":'/var/lib/mysql':'rw' -v "$HOME/Shinobi/videos":'/home/Shinobi/videos':'rw' -v "$HOME/Shinobi/plugins":'/home/Shinobi/plugins':'rw' -v '/etc/localtime':'/etc/localtime':'ro' shinobi-image:1.0
```
> Host mount paths have been updated in this document.
### Volumes
| Volumes | Description |
|-----------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------|
| /dev/shm/Shinobi/streams | **IMPORTANT!** This must be mapped to somewhere in the host's RAM. When running this image on Windows you will need to select a different location. |
| $HOME/Shinobi/config | Put `conf.json` or `super.json` files in here to override the default values. |
| $HOME/Shinobi/customAutoLoad | Maps to the `/home/Shinobi/libs/customAutoLoad` folder for loading your own modules into Shinobi. |
| $HOME/Shinobi/database | A map to `/var/lib/mysql` in the container. This is the database's core files. |
| $HOME/Shinobi/videos | A map to `/home/Shinobi/videos`. The storage location of your recorded videos. |
| $HOME/Shinobi/plugins | A map to `/home/Shinobi/plugins`. Mapped so that plugins can easily be modified or swapped. |
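Before the first run, it can help to pre-create the host-side directories from the table above (paths match the `docker run` examples in this README):

```shell
# Create the host-side directories for the container volume mappings.
mkdir -p "$HOME/Shinobi/config" \
         "$HOME/Shinobi/customAutoLoad" \
         "$HOME/Shinobi/database" \
         "$HOME/Shinobi/videos" \
         "$HOME/Shinobi/plugins"
ls "$HOME/Shinobi"
```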
### Environment Variables
| Environment Variable | Description | Default |
|----------------------|----------------------------------------------------------------------|--------------------|
| SUBSCRIPTION_ID | **THIS IS NOT REQUIRED**. If you are a subscriber to any of the Shinobi services you may use that key as the value for this parameter. If you have donated by PayPal you may use your Transaction ID to activate the license as well. | *None* |
| DB_USER | Username that the Shinobi process will connect to the database with. | majesticflame |
| DB_PASSWORD | Password that the Shinobi process will connect to the database with. | *None* |
| DB_HOST | Address that the Shinobi process will connect to the database with. | localhost |
| DB_DATABASE | Database that the Shinobi process will interact with. | ccio |
| DB_DISABLE_INCLUDED | Disable included database to use your own. Set to `true` to disable.| false |
| PLUGIN_KEYS | The object containing connection keys for plugins running in client mode (non-host, default). | {} |
| SSL_ENABLED | Enable or Disable SSL. | false |
| SSL_COUNTRY | Country Code for SSL. | CA |
| SSL_STATE | Province/State Code for SSL. | BC |
| SSL_LOCATION | Location of where SSL key is being used. | Vancouver |
| SSL_ORGANIZATION | Company Name associated to key. | Shinobi Systems |
| SSL_ORGANIZATION_UNIT | Department associated to key. | IT Department |
| SSL_COMMON_NAME | Common Name associated to key. | nvr.ninja |
> You must add (to the docker container) `/config/ssl/server.key` and `/config/ssl/server.cert`. The `/config` folder is mapped to `$HOME/Shinobi/config` on the host by default with the quick run methods. Place `key` and `cert` in `$HOME/Shinobi/config/ssl`. If `SSL_ENABLED=true` and these files don't exist they will be generated with `openssl`.
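If you want to generate the key pair yourself rather than letting the container do it, here is a sketch using the same `openssl` invocation as the container's init script, with the default SSL_* subject fields (written to a throwaway directory here; for a real install, place the resulting files in `$HOME/Shinobi/config/ssl`):

```shell
# Generate a self-signed key/cert pair; subject fields mirror the
# SSL_* defaults listed above. Output directory is a placeholder.
set -e
tmp=$(mktemp -d)
openssl req -nodes -new -x509 \
  -keyout "$tmp/server.key" -out "$tmp/server.cert" \
  -subj "/C=CA/ST=BC/L=Vancouver/O=Shinobi Systems/OU=IT Department/CN=nvr.ninja"
ls "$tmp"
```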
> For those using `DB_DISABLE_INCLUDED=true` please remember to create a user in your database first. The Docker image will create the `DB_DATABASE` database using the specified connection information.
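For `DB_DISABLE_INCLUDED=true`, a hedged sketch that only generates the SQL for pre-creating that user (the user and password below mirror the image defaults and are placeholders; run the generated file against your own MariaDB/MySQL server):

```shell
# Emit SQL for pre-creating the Shinobi DB user on an external server.
# DB_USER/DB_PASSWORD here are placeholders matching the image defaults.
DB_USER=majesticflame
DB_PASSWORD=changeme
cat > /tmp/shinobi-user.sql <<SQL
CREATE USER IF NOT EXISTS '${DB_USER}'@'%' IDENTIFIED BY '${DB_PASSWORD}';
GRANT ALL PRIVILEGES ON *.* TO '${DB_USER}'@'%';
FLUSH PRIVILEGES;
SQL
cat /tmp/shinobi-user.sql
```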
### Tips
Modifying `conf.json` or Superuser credentials.
> Please read the **Volumes** table in this README. conf.json is for general configuration. super.json is for Superuser credential management.
Get Docker Containers
```
docker ps -a
```
Get Images
```
docker images
```
Container Logs
```
docker logs /Shinobi
```
Enter the Command Line of the Container
```
docker exec -it /Shinobi /bin/bash
```
Stop and Remove
```
docker stop /Shinobi
docker rm /Shinobi
```
**WARNING - DEVELOPMENT ONLY!!!** Kill all Containers and Images
> These commands will completely erase all of your docker containers and images. **You have been warned!**
```
docker stop /Shinobi
docker rm $(docker ps -a -f status=exited -q)
docker rmi $(docker images -a -q)
```

Docker/init.sh Normal file

@ -0,0 +1,105 @@
#!/bin/sh
set -e
cp sql/framework.sql sql/framework1.sql
OLD_SQL_USER_TAG="ccio"
NEW_SQL_USER_TAG="$DB_DATABASE"
sed -i "s/$OLD_SQL_USER_TAG/$NEW_SQL_USER_TAG/g" sql/framework1.sql
if [ "$SSL_ENABLED" = "true" ]; then
if [ -d /config/ssl ]; then
echo "Using provided SSL Key"
cp -R /config/ssl ssl
SSL_CONFIG='{"key":"./ssl/server.key","cert":"./ssl/server.cert"}'
else
echo "Making new SSL Key"
mkdir -p ssl
openssl req -nodes -new -x509 -keyout ssl/server.key -out ssl/server.cert -subj "/C=$SSL_COUNTRY/ST=$SSL_STATE/L=$SSL_LOCATION/O=$SSL_ORGANIZATION/OU=$SSL_ORGANIZATION_UNIT/CN=$SSL_COMMON_NAME"
cp -R ssl /config/ssl
SSL_CONFIG='{"key":"./ssl/server.key","cert":"./ssl/server.cert"}'
fi
else
SSL_CONFIG='{}'
fi
if [ "$DB_DISABLE_INCLUDED" = "false" ]; then
echo "MariaDB Directory ..."
ls /var/lib/mysql
if [ ! -f /var/lib/mysql/ibdata1 ]; then
echo "Installing MariaDB ..."
mysql_install_db --user=mysql --datadir=/var/lib/mysql --silent
fi
echo "Starting MariaDB ..."
/usr/bin/mysqld_safe --user=mysql &
sleep 5s
chown -R mysql /var/lib/mysql
if [ ! -f /var/lib/mysql/ibdata1 ]; then
mysql -u root --password="" -e "SET @@SESSION.SQL_LOG_BIN=0;
USE mysql;
DELETE FROM mysql.user ;
DROP USER IF EXISTS 'root'@'%','root'@'localhost','${DB_USER}'@'localhost','${DB_USER}'@'%';
CREATE USER 'root'@'%' IDENTIFIED BY '${DB_PASSWORD}' ;
CREATE USER 'root'@'localhost' IDENTIFIED BY '${DB_PASSWORD}' ;
CREATE USER '${DB_USER}'@'%' IDENTIFIED BY '${DB_PASSWORD}' ;
CREATE USER '${DB_USER}'@'localhost' IDENTIFIED BY '${DB_PASSWORD}' ;
GRANT ALL PRIVILEGES ON *.* TO 'root'@'%' WITH GRANT OPTION ;
GRANT ALL PRIVILEGES ON *.* TO 'root'@'localhost' WITH GRANT OPTION ;
GRANT ALL PRIVILEGES ON *.* TO '${DB_USER}'@'%' WITH GRANT OPTION ;
GRANT ALL PRIVILEGES ON *.* TO '${DB_USER}'@'localhost' WITH GRANT OPTION ;
DROP DATABASE IF EXISTS test ;
FLUSH PRIVILEGES ;"
fi
# Create MySQL database if it does not exist
if [ -n "${DB_HOST}" ]; then
echo "Waiting for MySQL server ..."
while ! mysqladmin ping -h"$DB_HOST"; do
sleep 1
done
fi
echo "Setting up MySQL database if it does not exist ..."
echo "Creating database schema if it does not exist ..."
mysql -e "source /home/Shinobi/sql/framework.sql" || true
echo "Creating database user if it does not exist ..."
mysql -e "source /home/Shinobi/sql/user.sql" || true
else
echo "Creating database schema if it does not exist ..."
mysql -u "$DB_USER" -h "$DB_HOST" -p"$DB_PASSWORD" --port="$DB_PORT" -e "source /home/Shinobi/sql/framework.sql" || true
fi
DATABASE_CONFIG='{"host": "'$DB_HOST'","user": "'$DB_USER'","password": "'$DB_PASSWORD'","database": "'$DB_DATABASE'","port":'$DB_PORT'}'
cronKey="$(head -c 1024 < /dev/urandom | sha256sum | awk '{print substr($1,1,29)}')"
cd /home/Shinobi
mkdir -p libs/customAutoLoad
if [ -e "/config/conf.json" ]; then
cp /config/conf.json conf.json
fi
if [ ! -e "./conf.json" ]; then
sudo cp conf.sample.json conf.json
fi
sudo sed -i -e 's/change_this_to_something_very_random__just_anything_other_than_this/'"$cronKey"'/g' conf.json
node tools/modifyConfiguration.js cpuUsageMarker=CPU subscriptionId=$SUBSCRIPTION_ID thisIsDocker=true pluginKeys="$PLUGIN_KEYS" db="$DATABASE_CONFIG" ssl="$SSL_CONFIG"
sudo cp conf.json /config/conf.json
echo "============="
echo "Default Superuser : admin@shinobi.video"
echo "Default Password : admin"
echo "Log in at http://HOST_IP:SHINOBI_PORT/super"
if [ -e "/config/super.json" ]; then
cp /config/super.json super.json
fi
if [ ! -e "./super.json" ]; then
sudo cp super.sample.json super.json
sudo cp super.sample.json /config/super.json
fi
# Execute Command
echo "Starting Shinobi ..."
exec "$@"

Docker/pm2.yml Normal file

@ -0,0 +1,7 @@
apps:
- script : '/home/Shinobi/camera.js'
name : 'camera'
kill_timeout : 5000
- script : '/home/Shinobi/cron.js'
name : 'cron'
kill_timeout : 5000

Dockerfile Normal file

@ -0,0 +1,108 @@
FROM node:12.18.3-buster-slim
ENV DB_USER=majesticflame \
DB_PASSWORD='' \
DB_HOST='localhost' \
DB_DATABASE=ccio \
DB_PORT=3306 \
SUBSCRIPTION_ID=sub_XXXXXXXXXXXX \
PLUGIN_KEYS='{}' \
SSL_ENABLED='false' \
SSL_COUNTRY='CA' \
SSL_STATE='BC' \
SSL_LOCATION='Vancouver' \
SSL_ORGANIZATION='Shinobi Systems' \
SSL_ORGANIZATION_UNIT='IT Department' \
SSL_COMMON_NAME='nvr.ninja' \
DB_DISABLE_INCLUDED=false
ARG DEBIAN_FRONTEND=noninteractive
RUN mkdir -p /home/Shinobi /config /var/lib/mysql
RUN apt update -y
RUN apt install wget curl net-tools -y
# Install MariaDB server... the debian way
RUN if [ "$DB_DISABLE_INCLUDED" = "false" ] ; then set -ex; \
{ \
echo "mariadb-server" mysql-server/root_password password '${DB_ROOT_PASSWORD}'; \
echo "mariadb-server" mysql-server/root_password_again password '${DB_ROOT_PASSWORD}'; \
} | debconf-set-selections; \
apt-get update; \
apt-get install -y \
"mariadb-server" \
socat \
; \
find /etc/mysql/ -name '*.cnf' -print0 \
| xargs -0 grep -lZE '^(bind-address|log)' \
| xargs -rt -0 sed -Ei 's/^(bind-address|log)/#&/'; fi
RUN if [ "$DB_DISABLE_INCLUDED" = "false" ] ; then sed -ie "s/^bind-address\s*=\s*127\.0\.0\.1$/#bind-address = 0.0.0.0/" /etc/mysql/my.cnf; fi
# Install FFmpeg
RUN apt install -y software-properties-common \
libfreetype6-dev \
libgnutls28-dev \
libmp3lame-dev \
libass-dev \
libogg-dev \
libtheora-dev \
libvorbis-dev \
libvpx-dev \
libwebp-dev \
libssh2-1-dev \
libopus-dev \
librtmp-dev \
libx264-dev \
libx265-dev \
yasm && \
apt install -y \
build-essential \
bzip2 \
coreutils \
gnutls-bin \
nasm \
tar \
x264
RUN apt install -y \
ffmpeg \
git \
make \
g++ \
gcc \
pkg-config \
python3 \
wget \
tar \
sudo \
xz-utils
WORKDIR /home/Shinobi
COPY . .
RUN rm -rf /home/Shinobi/plugins
COPY ./plugins /home/Shinobi/plugins
RUN chmod -R 777 /home/Shinobi/plugins
RUN npm i npm@latest -g && \
npm install pm2 -g && \
npm install --unsafe-perm && \
npm audit fix --force
COPY ./Docker/pm2.yml ./
# Copy default configuration files
# COPY ./config/conf.json ./config/super.json /home/Shinobi/
RUN chmod -f +x /home/Shinobi/Docker/init.sh
VOLUME ["/home/Shinobi/videos"]
VOLUME ["/home/Shinobi/plugins"]
VOLUME ["/config"]
VOLUME ["/customAutoLoad"]
VOLUME ["/var/lib/mysql"]
EXPOSE 8080
ENTRYPOINT ["/home/Shinobi/Docker/init.sh"]
CMD [ "pm2-docker", "pm2.yml" ]


@ -1,5 +1,7 @@
#!/bin/sh
apt install git -y
git clone https://github.com/ShinobiCCTV/Shinobi.git -b dev Shinobi-dev
cd Shinobi-dev
cd Shinobi-dev || exit
chmod +x INSTALL/ubuntu-easyinstall.sh && INSTALL/ubuntu-easyinstall.sh
bash INSTALL/ubuntu-easyinstall.sh


@ -1,5 +1,7 @@
#!/bin/sh
apt install git -y
git clone https://github.com/ShinobiCCTV/Shinobi.git Shinobi
cd Shinobi
cd Shinobi || exit
chmod +x INSTALL/ubuntu-easyinstall.sh && INSTALL/ubuntu-easyinstall.sh
bash INSTALL/ubuntu-easyinstall.sh


@ -5,16 +5,9 @@ echo "========================================================="
echo "To answer yes type the letter (y) in lowercase and press ENTER."
echo "Default is no (N). Skip any components you already have or don't need."
echo "============="
#Create default configuration file
if [ ! -e "./conf.json" ]; then
cp conf.sample.json conf.json
#Generate a random Cron key for the config file
cronKey=$(< /dev/urandom tr -dc A-Za-z0-9 | head -c${1:-30})
sed -i -e 's/change_this_to_something_very_random__just_anything_other_than_this/'"$cronKey"'/g' conf.json
fi
if [ ! -e "./super.json" ]; then
echo "Default Superuser : admin@shinobi.video"
echo "Default Password : admin"
@ -34,12 +27,13 @@ if [ ! -e "./super.json" ]; then
fi
echo "Shinobi - Run yum update"
sudo yum update -y
sudo yum install gcc gcc-c++ -y
sudo yum install make zip dos2unix -y
if ! [ -x "$(command -v node)" ]; then
echo "============="
echo "Shinobi - Installing Node.js"
#Installs Node.js 12
sudo curl --silent --location https://rpm.nodesource.com/setup_8.x | bash -
sudo curl --silent --location https://rpm.nodesource.com/setup_12.x | bash -
sudo yum install nodejs -y
else
echo "Node.js Found..."
@ -69,50 +63,31 @@ if [ "$ffmpeginstall" = "y" ] || [ "$ffmpeginstall" = "Y" ]; then
fi
fi
echo "============="
echo "Shinobi - Do you want to use MariaDB or SQLite3?"
echo "SQLite3 is better for small installs"
echo "MariaDB (MySQL) is better for large installs"
echo "(S)QLite3 or (M)ariaDB?"
echo "Press [ENTER] for default (MariaDB)"
read sqliteormariadb
if [ "$sqliteormariadb" = "S" ] || [ "$sqliteormariadb" = "s" ]; then
sudo npm install jsonfile
sudo yum install -y sqlite sqlite-devel -y
sudo npm install sqlite3
node ./tools/modifyConfiguration.js databaseType=sqlite3
if [ ! -e "./shinobi.sqlite" ]; then
echo "Creating shinobi.sqlite for SQLite3..."
sudo cp sql/shinobi.sample.sqlite shinobi.sqlite
else
echo "shinobi.sqlite already exists. Continuing..."
fi
else
echo "============="
echo "Shinobi - Do you want to Install MariaDB?"
echo "(y)es or (N)o"
read mysqlagree
if [ "$mysqlagree" = "y" ] || [ "$mysqlagree" = "Y" ]; then
#Add the MariaDB repository to yum, this allows for a more current version of MariaDB to be installed
sudo curl -sS https://downloads.mariadb.com/MariaDB/mariadb_repo_setup | sudo bash -s -- --skip-maxscale
sudo yum install mariadb mariadb-server -y
#Start mysql and enable on boot
sudo systemctl start mariadb
sudo systemctl enable mariadb
#Run mysql install
sudo mysql_secure_installation
fi
echo "============="
echo "Shinobi - Database Installation"
echo "(y)es or (N)o"
read mysqlagreeData
if [ "$mysqlagreeData" = "y" ] || [ "$mysqlagreeData" = "Y" ]; then
echo "What is your SQL Username?"
read sqluser
echo "What is your SQL Password?"
read sqlpass
sudo mysql -u $sqluser -p$sqlpass -e "source sql/user.sql" || true
sudo mysql -u $sqluser -p$sqlpass -e "source sql/framework.sql" || true
fi
echo "============="
echo "Shinobi - Do you want to Install MariaDB?"
echo "(y)es or (N)o"
read mysqlagree
if [ "$mysqlagree" = "y" ] || [ "$mysqlagree" = "Y" ]; then
#Add the MariaDB repository to yum, this allows for a more current version of MariaDB to be installed
sudo curl -sS https://downloads.mariadb.com/MariaDB/mariadb_repo_setup | sudo bash -s -- --skip-maxscale
sudo yum install mariadb mariadb-server -y
#Start mysql and enable on boot
sudo systemctl start mariadb
sudo systemctl enable mariadb
#Run mysql install
sudo mysql_secure_installation
fi
echo "============="
echo "Shinobi - Database Installation"
echo "(y)es or (N)o"
read mysqlagreeData
if [ "$mysqlagreeData" = "y" ] || [ "$mysqlagreeData" = "Y" ]; then
echo "What is your SQL Username?"
read sqluser
echo "What is your SQL Password?"
read sqlpass
sudo mysql -u "$sqluser" -p"$sqlpass" -e "source sql/user.sql" || true
sudo mysql -u "$sqluser" -p"$sqlpass" -e "source sql/framework.sql" || true
fi
echo "============="
echo "Shinobi - Install NPM Libraries"

INSTALL/cuda-10-2.sh Normal file

@ -0,0 +1,49 @@
#!/bin/sh
echo "------------------------------------------"
echo "-- Installing CUDA Toolkit and CUDA DNN --"
echo "------------------------------------------"
# Install CUDA Drivers and Toolkit
if [ -x "$(command -v apt)" ]; then
wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/cuda-ubuntu1804.pin
sudo mv cuda-ubuntu1804.pin /etc/apt/preferences.d/cuda-repository-pin-600
sudo apt-key adv --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub
sudo add-apt-repository "deb http://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/ /"
sudo apt-get update
sudo apt-get update -y
sudo apt-get -o Dpkg::Options::="--force-overwrite" install cuda-toolkit-10-2 -y --no-install-recommends
sudo apt-get -o Dpkg::Options::="--force-overwrite" install --fix-broken -y
# Install CUDA DNN
wget https://cdn.shinobi.video/installers/libcudnn7_7.6.5.32-1+cuda10.2_amd64.deb -O cuda-dnn.deb
sudo dpkg -i cuda-dnn.deb
wget https://cdn.shinobi.video/installers/libcudnn7-dev_7.6.5.32-1+cuda10.2_amd64.deb -O cuda-dnn-dev.deb
sudo dpkg -i cuda-dnn-dev.deb
echo "-- Cleaning Up --"
# Cleanup
sudo rm cuda-dnn.deb
sudo rm cuda-dnn-dev.deb
fi
if [ -x "$(command -v yum)" ]; then
sudo yum-config-manager --add-repo http://developer.download.nvidia.com/compute/cuda/repos/rhel7/x86_64/cuda-rhel7.repo
sudo yum clean all
sudo yum -y install nvidia-driver-latest-dkms cuda
sudo yum -y install cuda-drivers
wget https://cdn.shinobi.video/installers/libcudnn7-7.6.5.33-1.cuda10.2.x86_64.rpm -O cuda-dnn.rpm
sudo yum -y localinstall cuda-dnn.rpm
wget https://cdn.shinobi.video/installers/libcudnn7-devel-7.6.5.33-1.cuda10.2.x86_64.rpm -O cuda-dnn-dev.rpm
sudo yum -y localinstall cuda-dnn-dev.rpm
echo "-- Cleaning Up --"
sudo rm cuda-dnn.rpm
sudo rm cuda-dnn-dev.rpm
fi
echo "------------------------------"
echo "Reboot is required. Do it now?"
echo "------------------------------"
echo "(y)es or (N)o. Default is No."
read rebootTheMachineHomie
if [ "$rebootTheMachineHomie" = "y" ] || [ "$rebootTheMachineHomie" = "Y" ]; then
sudo reboot
fi

INSTALL/cuda-10.sh Normal file

@ -0,0 +1,47 @@
#!/bin/sh
echo "------------------------------------------"
echo "-- Installing CUDA Toolkit and CUDA DNN --"
echo "------------------------------------------"
# Install CUDA Drivers and Toolkit
if [ -x "$(command -v apt)" ]; then
wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/cuda-repo-ubuntu1804_10.0.130-1_amd64.deb
sudo dpkg -i --force-overwrite cuda-repo-ubuntu1804_10.0.130-1_amd64.deb
sudo apt-key adv --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub
sudo apt-get update -y
sudo apt-get -o Dpkg::Options::="--force-overwrite" install cuda-toolkit-10-0 -y --no-install-recommends
sudo apt-get -o Dpkg::Options::="--force-overwrite" install --fix-broken -y
# Install CUDA DNN
wget https://cdn.shinobi.video/installers/libcudnn7_7.6.5.32-1+cuda10.0_amd64.deb -O cuda-dnn.deb
sudo dpkg -i cuda-dnn.deb
wget https://cdn.shinobi.video/installers/libcudnn7-dev_7.6.5.32-1+cuda10.0_amd64.deb -O cuda-dnn-dev.deb
sudo dpkg -i cuda-dnn-dev.deb
echo "-- Cleaning Up --"
# Cleanup
sudo rm cuda-dnn.deb
sudo rm cuda-dnn-dev.deb
fi
if [ -x "$(command -v yum)" ]; then
wget https://developer.download.nvidia.com/compute/cuda/repos/rhel7/x86_64/cuda-repo-rhel7-10.0.130-1.x86_64.rpm
sudo rpm -i cuda-repo-rhel7-10.0.130-1.x86_64.rpm
sudo yum clean all
sudo yum install cuda
wget https://cdn.shinobi.video/installers/libcudnn7-7.6.5.32-1.cuda10.0.x86_64.rpm -O cuda-dnn.rpm
sudo yum -y localinstall cuda-dnn.rpm
wget https://cdn.shinobi.video/installers/libcudnn7-devel-7.6.5.32-1.cuda10.0.x86_64.rpm -O cuda-dnn-dev.rpm
sudo yum -y localinstall cuda-dnn-dev.rpm
echo "-- Cleaning Up --"
sudo rm cuda-dnn.rpm
sudo rm cuda-dnn-dev.rpm
fi
echo "------------------------------"
echo "Reboot is required. Do it now?"
echo "------------------------------"
echo "(y)es or (N)o. Default is No."
read rebootTheMachineHomie
if [ "$rebootTheMachineHomie" = "y" ] || [ "$rebootTheMachineHomie" = "Y" ]; then
sudo reboot
fi

INSTALL/cuda-9-0.sh Normal file

@ -0,0 +1,36 @@
#!/bin/sh
echo "------------------------------------------"
echo "-- Installing CUDA Toolkit and CUDA DNN --"
echo "------------------------------------------"
# Install CUDA Drivers and Toolkit
echo "============="
echo " Detecting Ubuntu Version"
echo "============="
getubuntuversion=$(lsb_release -r | awk '{print $2}' | cut -d . -f1)
echo "============="
echo " Ubuntu Version: $getubuntuversion"
echo "============="
wget http://developer.download.nvidia.com/compute/cuda/repos/ubuntu1704/x86_64/cuda-repo-ubuntu1704_9.0.176-1_amd64.deb -O cuda.deb
sudo apt-key adv --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1704/x86_64/7fa2af80.pub
sudo dpkg -i --force-overwrite cuda.deb
sudo apt-get update -y
sudo apt-get -o Dpkg::Options::="--force-overwrite" install cuda-toolkit-9-0 -y --no-install-recommends
sudo apt-get -o Dpkg::Options::="--force-overwrite" install --fix-broken -y
# Install CUDA DNN
wget https://cdn.shinobi.video/installers/libcudnn7_7.6.3.30-1+cuda9.0_amd64.deb -O cuda-dnn.deb
sudo dpkg -i cuda-dnn.deb
wget https://cdn.shinobi.video/installers/libcudnn7-dev_7.6.3.30-1+cuda9.0_amd64.deb -O cuda-dnn-dev.deb
sudo dpkg -i cuda-dnn-dev.deb
echo "-- Cleaning Up --"
# Cleanup
sudo rm cuda.deb
sudo rm cuda-dnn.deb
sudo rm cuda-dnn-dev.deb
echo "------------------------------"
echo "Reboot is required. Do it now?"
echo "------------------------------"
echo "(y)es or (N)o. Default is No."
read rebootTheMachineHomie
if [ "$rebootTheMachineHomie" = "y" ] || [ "$rebootTheMachineHomie" = "Y" ]; then
sudo reboot
fi

INSTALL/cuda-9-2.sh Normal file

@ -0,0 +1,49 @@
#!/bin/sh
echo "------------------------------------------"
echo "-- Installing CUDA Toolkit and CUDA DNN --"
echo "------------------------------------------"
# Install CUDA Drivers and Toolkit
echo "============="
echo " Detecting Ubuntu Version"
echo "============="
getubuntuversion=$(lsb_release -r | awk '{print $2}' | cut -d . -f1)
echo "============="
echo " Ubuntu Version: $getubuntuversion"
echo "============="
if [ "$getubuntuversion" = "17" ] || [ "$getubuntuversion" -gt 17 ]; then
wget https://cdn.shinobi.video/installers/cuda-repo-ubuntu1710_9.2.148-1_amd64.deb -O cuda.deb
sudo apt-key adv --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1710/x86_64/7fa2af80.pub
sudo dpkg -i --force-overwrite cuda.deb
fi
if [ "$getubuntuversion" = "16" ]; then
wget https://cdn.shinobi.video/installers/cuda-repo-ubuntu1604_9.2.148-1_amd64.deb -O cuda.deb
sudo apt-key adv --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub
sudo dpkg -i --force-overwrite cuda.deb
fi
sudo apt-get update -y
if [ "$getubuntuversion" = "17" ] || [ "$getubuntuversion" -gt 17 ]; then
sudo apt-get -o Dpkg::Options::="--force-overwrite" install cuda -y --no-install-recommends
sudo apt-get -o Dpkg::Options::="--force-overwrite" install --fix-broken -y
fi
if [ "$getubuntuversion" = "16" ]; then
sudo apt-get install libcuda1-384 -y --no-install-recommends
sudo apt-get install nvidia-cuda-toolkit -y
fi
# Install CUDA DNN
wget https://cdn.shinobi.video/installers/libcudnn7_7.2.1.38-1+cuda9.2_amd64.deb -O cuda-dnn.deb
sudo dpkg -i cuda-dnn.deb
wget https://cdn.shinobi.video/installers/libcudnn7-dev_7.2.1.38-1+cuda9.2_amd64.deb -O cuda-dnn-dev.deb
sudo dpkg -i cuda-dnn-dev.deb
echo "-- Cleaning Up --"
# Cleanup
sudo rm cuda.deb
sudo rm cuda-dnn.deb
sudo rm cuda-dnn-dev.deb
echo "------------------------------"
echo "Reboot is required. Do it now?"
echo "------------------------------"
echo "(y)es or (N)o. Default is No."
read rebootTheMachineHomie
if [ "$rebootTheMachineHomie" = "y" ] || [ "$rebootTheMachineHomie" = "Y" ]; then
sudo reboot
fi


@ -3,42 +3,42 @@ echo "------------------------------------------"
echo "-- Installing CUDA Toolkit and CUDA DNN --"
echo "------------------------------------------"
# Install CUDA Drivers and Toolkit
echo "============="
echo " Detecting Ubuntu Version"
echo "============="
getubuntuversion=$(lsb_release -r | awk '{print $2}' | cut -d . -f1)
echo "============="
echo " Ubuntu Version: $getubuntuversion"
echo "============="
if [ "$getubuntuversion" = "17" ] || [ "$getubuntuversion" > "17" ]; then
wget https://cdn.shinobi.video/installers/cuda-repo-ubuntu1710_9.2.148-1_amd64.deb -O cuda.deb
sudo apt-key adv --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1710/x86_64/7fa2af80.pub
sudo dpkg -i cuda.deb
fi
if [ "$getubuntuversion" = "16" ]; then
wget https://cdn.shinobi.video/installers/cuda-repo-ubuntu1604_9.2.148-1_amd64.deb -O cuda.deb
sudo apt-key adv --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub
sudo dpkg -i cuda.deb
fi
sudo apt-get update -y
if [ "$getubuntuversion" = "17" ] || [ "$getubuntuversion" > "17" ]; then
sudo apt-get -o Dpkg::Options::="--force-overwrite" install cuda -y --no-install-recommends
if [ -x "$(command -v apt)" ]; then
wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/cuda-ubuntu1804.pin
sudo mv cuda-ubuntu1804.pin /etc/apt/preferences.d/cuda-repository-pin-600
sudo apt-key adv --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub
sudo add-apt-repository "deb http://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/ /"
sudo apt-get update
sudo apt-get update -y
sudo apt-get -o Dpkg::Options::="--force-overwrite" install cuda-toolkit-10-2 -y --no-install-recommends
sudo apt-get -o Dpkg::Options::="--force-overwrite" install --fix-broken -y
# Install CUDA DNN
wget https://cdn.shinobi.video/installers/libcudnn7_7.6.5.32-1+cuda10.2_amd64.deb -O cuda-dnn.deb
sudo dpkg -i cuda-dnn.deb
wget https://cdn.shinobi.video/installers/libcudnn7-dev_7.6.5.32-1+cuda10.2_amd64.deb -O cuda-dnn-dev.deb
sudo dpkg -i cuda-dnn-dev.deb
echo "-- Cleaning Up --"
# Cleanup
sudo rm cuda-dnn.deb
sudo rm cuda-dnn-dev.deb
fi
if [ "$getubuntuversion" = "16" ]; then
sudo apt-get install libcuda1-384 -y --no-install-recommends
sudo apt-get install nvidia-cuda-toolkit -y
if [ -x "$(command -v yum)" ]; then
sudo yum-config-manager --add-repo http://developer.download.nvidia.com/compute/cuda/repos/rhel7/x86_64/cuda-rhel7.repo
sudo yum clean all
sudo yum -y install nvidia-driver-latest-dkms cuda
sudo yum -y install cuda-drivers
wget https://cdn.shinobi.video/installers/libcudnn7-7.6.5.33-1.cuda10.2.x86_64.rpm -O cuda-dnn.rpm
sudo yum -y localinstall cuda-dnn.rpm
wget https://cdn.shinobi.video/installers/libcudnn7-devel-7.6.5.33-1.cuda10.2.x86_64.rpm -O cuda-dnn-dev.rpm
sudo yum -y localinstall cuda-dnn-dev.rpm
echo "-- Cleaning Up --"
sudo rm cuda-dnn.rpm
sudo rm cuda-dnn-dev.rpm
fi
# Install CUDA DNN
wget https://cdn.shinobi.video/installers/libcudnn7_7.2.1.38-1+cuda9.2_amd64.deb -O cuda-dnn.deb
sudo dpkg -i cuda-dnn.deb
wget https://cdn.shinobi.video/installers/libcudnn7-dev_7.2.1.38-1+cuda9.2_amd64.deb -O cuda-dnn-dev.deb
sudo dpkg -i cuda-dnn-dev.deb
echo "-- Cleaning Up --"
# Cleanup
sudo rm cuda.deb
sudo rm cuda-dnn.deb
sudo rm cuda-dnn-dev.deb
echo "------------------------------"
echo "Reboot is required. Do it now?"
echo "------------------------------"

View File

@ -1,2 +1,4 @@
#!/bin/sh
sudo apt-get -y install cuda-toolkit-9-1
nvidia-smi

View File

@ -7,12 +7,9 @@ pkg install -y nano ffmpeg libav x264 x265 mysql56-server node npm
echo "Enabling mysql..."
sysrc mysql_enable=yes
service mysql-server start
echo "Cloning the official Shinobi Community Edition gitlab repo..."
git clone "https://gitlab.com/Shinobi-Systems/ShinobiCE"
cd ./ShinobiCE
echo "Adding Shinobi user to database..."
mysql -h localhost -u root -e "source sql/user.sql"
ehco "Shinobi database framework setup..."
echo "Shinobi database framework setup..."
mysql -h localhost -u root -e "source sql/framework.sql"
echo "Securing mysql..."
#/usr/local/bin/mysql_secure_installation

View File

@ -1,14 +1,16 @@
#!/bin/bash
# Moe was here
echo "============="
echo "Do you want to purge Desktop components from your Ubuntu 18.04 installation?"
echo "You cannot undo this. Choose wisely."
echo "Do NOT run this as root, instead run it with 'sudo'; if you want a complete wipe."
echo "(y)es or (N)o"
read purgeDesktop
read -r purgeDesktop
if [ "$purgeDesktop" = "Y" ] || [ "$purgeDesktop" = "y" ]; then
echo "Really really sure?"
echo "(y)es or (N)o"
read purgeDesktopSecond
read -r purgeDesktopSecond
if [ "$purgeDesktopSecond" = "Y" ] || [ "$purgeDesktopSecond" = "y" ]; then
echo "!----------------------------!"
echo "Reset network interface to DHCP? (Automatically assign IP Address from network)"
@ -16,9 +18,10 @@ if [ "$purgeDesktop" = "Y" ] || [ "$purgeDesktop" = "y" ]; then
echo "You can edit it after in /etc/network/interfaces"
echo "!----------------------------!"
echo "(y)es or (N)o"
read resetNetworkInterface
read -r resetNetworkInterface
if [ "$resetNetworkInterface" = "Y" ] || [ "$resetNetworkInterface" = "y" ]; then
echo "source-directory /etc/network/interfaces.d" > "/etc/network/interfaces"
echo "auto lo" > "/etc/network/interfaces"
echo "iface lo inet loopback" >> "/etc/network/interfaces"
echo "auto eth0" >> "/etc/network/interfaces"
echo "iface eth0 inet dhcp" >> "/etc/network/interfaces"
fi

View File

@ -1,4 +1,3 @@
#!/bin/bash
echo "========================================================="
echo "==!! Shinobi : The Open Source CCTV and NVR Solution !!=="
@ -6,7 +5,7 @@ echo "=================== Mac OS Install Part 2 ==============="
echo "========================================================="
echo "Shinobi - Database Installation"
echo "(y)es or (N)o"
read mysqlagreeData
read -r mysqlagreeData
if [ "$mysqlagreeData" = "y" ]; then
echo "Shinobi will now use root for database installation..."
sudo mysql -e "source sql/user.sql" || true
@ -42,7 +41,7 @@ echo "=====================================" >> INSTALL/installed.txt
echo "=====================================" >> INSTALL/installed.txt
echo "Shinobi - Start Shinobi and set to start on boot?"
echo "(y)es or (N)o"
read startShinobi
read -r startShinobi
if [ "$startShinobi" = "y" ]; then
sudo pm2 start camera.js
sudo pm2 startup

View File

@ -8,17 +8,17 @@ echo "Default is no (N). Skip any components you already have or don't need."
echo "============="
echo "Shinobi - Do you want to Install Node.js?"
echo "(y)es or (N)o"
read nodejsinstall
read -r nodejsinstall
if [ "$nodejsinstall" = "y" ]; then
curl -o node-v8.9.3.pkg https://nodejs.org/dist/v8.9.3/node-v8.9.3.pkg
sudo installer -pkg node-v8.9.3.pkg -target /
rm node-v8.9.3.pkg
curl -o node-installer.pkg https://nodejs.org/dist/v11.9.0/node-v11.9.0.pkg
sudo installer -pkg node-installer.pkg -target /
rm node-installer.pkg
sudo ln -s /usr/local/bin/node /usr/bin/nodejs
fi
echo "============="
echo "Shinobi - Do you want to Install FFmpeg?"
echo "(y)es or (N)o"
read ffmpeginstall
read -r ffmpeginstall
if [ "$ffmpeginstall" = "y" ]; then
echo "Shinobi - Installing FFmpeg"
curl -o ffmpeg.zip https://cdn.shinobi.video/installers/ffmpeg-3.4.1-macos.zip
@ -65,7 +65,7 @@ echo "=====================================" >> INSTALL/installed.txt
echo "=====================================" >> INSTALL/installed.txt
echo "Shinobi - Start Shinobi and set to start on boot?"
echo "(y)es or (N)o"
read startShinobi
read -r startShinobi
if [ "$startShinobi" = "y" ]; then
pm2 start camera.js
pm2 startup

View File

@ -4,41 +4,46 @@ echo "========"
echo "Select your OS"
echo "If your OS is not on the list please refer to the docs."
echo "========"
echo "1. Ubuntu"
echo "2. CentOS"
echo "3. MacOS"
echo "4. FreeBSD"
echo "5. OpenSUSE"
echo "6. CentOS - Quick Install"
echo "1. Ubuntu - Fast and Touchless"
echo "2. Ubuntu - Advanced"
echo "3. CentOS"
echo "4. CentOS - Quick Install"
echo "5. MacOS"
echo "6. FreeBSD"
echo "7. OpenSUSE"
echo "========"
read -r oschoice
case $oschoice in
"1")
chmod +x INSTALL/ubuntu-touchless.sh
sh INSTALL/ubuntu-touchless.sh
;;
"2")
chmod +x INSTALL/ubuntu.sh
sh INSTALL/ubuntu.sh
;;
"2")
"3")
chmod +x INSTALL/centos.sh
sh INSTALL/centos.sh
;;
"3")
"4")
chmod +x "INSTALL/CentOS - Quick Install.sh"
sh "INSTALL/CentOS - Quick Install.sh" 1
;;
"5")
chmod +x INSTALL/macos.sh
sh INSTALL/macos.sh
;;
"4")
"6")
chmod +x INSTALL/freebsd.sh
sh INSTALL/freebsd.sh
;;
"5")
"7")
chmod +x INSTALL/opensuse.sh
sh INSTALL/opensuse.sh
;;
"6")
chmod +x "INSTALL/CentOS - Quick Install.sh"
sh "INSTALL/CentOS - Quick Install.sh" 1
;;
*)
echo "Choice not found."
;;
esac
esac

212
INSTALL/openbsd.sh Normal file
View File

@ -0,0 +1,212 @@
#!/bin/sh
# Copyright (c) 2020 Jordan Geoghegan <jordan@geoghegan.ca>
# Permission to use, copy, modify, and/or distribute this software for any
# purpose with or without fee is hereby granted, provided that the above
# copyright notice and this permission notice appear in all copies.
# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
# REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY
# AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
# INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
# LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE
# OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
# PERFORMANCE OF THIS SOFTWARE.
# Functions
doas_perms_abort () {
echo "\n!!! doas is not enabled! Please check /etc/doas.conf configuration. Exiting..." ; exit 1
}
package_install_abort () {
echo "\n!!! Package Install Failed! Exiting..." ; exit 1
}
mariadb_setup_abort () {
echo "\n!!! MariaDB configuration failed!. Exiting..." ; exit 1
}
user_create_abort () {
echo '\n!!! Creation of system user "_shinobi" failed. Exiting...' ; exit 1
}
source_dl_abort () {
echo "\n!!! Failed to download Shinobi source code from GitLab. Please check your internet connection. Exiting..." ; exit 1
}
source_extract_abort () {
echo "\n!!! Failed to extract Shinobi source code. Exiting..." ; exit 1
}
schema_install_abort () {
echo "\n!!! Failed to install Shinobi Database Schema. Exiting..." ; exit 1
}
npm_abort () {
echo "\n!!! Failed to install Shinobi Node.js dependencies. Exiting..." ; exit 1
}
package_install () {
while true ; do
echo "\nWould you like to install the required packages?"
printf "(Yes/no) : "
read -r pkg_perm
case $pkg_perm in
[Yy]* ) doas pkg_add node mariadb-server ffmpeg || package_install_abort ; break;;
[Nn]* ) echo "Packages are required to install Shinobi. Exiting..."; exit 0;;
* ) echo "Please answer yes or no.";;
esac
done
}
mariadb_enable () {
echo "\n### Setting up MariaDB... Please follow the on screen instructions\n"
echo '\n### Running "/usr/local/bin/mysql_install_db"'
doas /usr/local/bin/mysql_install_db >/dev/null 2>&1 || mariadb_setup_abort
echo "\n### Configuring MariaDB to start at boot"
doas rcctl enable mysqld || mariadb_setup_abort
echo "\n### Starting MariaDB"
doas rcctl start mysqld || mariadb_setup_abort
echo '\n### Running "mysql_secure_installation"'
doas mysql_secure_installation || mariadb_setup_abort
}
mariadb_setup () {
while true ; do
echo "\nWould you like to setup MariaDB now?"
printf "(Yes/no) : "
read -r mariadb_setup
case $mariadb_setup in
[Yy]* ) mariadb_enable || mariadb_setup_abort ; break;;
[Nn]* ) echo "MariaDB is required to install Shinobi. Exiting..."; exit 0;;
* ) echo "Please answer yes or no.";;
esac
done
}
schema_install () {
echo "\nWhat is the MariaDB root password?"
printf "Password? : "
read -r sqlpass
cd /home/_shinobi/shinobi || schema_install_abort
doas -u _shinobi mysql -u root -p"$sqlpass" -e "source /home/_shinobi/shinobi/sql/user.sql" || schema_install_abort
doas -u _shinobi mysql -u root -p"$sqlpass" -e "source /home/_shinobi/shinobi/sql/framework.sql" || schema_install_abort
}
pro_download () {
echo "\n### Grabbing Shinobi Pro from master branch\n"
doas -u _shinobi ftp -o /home/_shinobi/shinobi.tar.gz https://gitlab.com/Shinobi-Systems/Shinobi/-/archive/master/Shinobi-master.tar.gz
}
gpl_download () {
echo "\n### Grabbing Shinobi CE from master branch\n"
doas -u _shinobi ftp -o /home/_shinobi/shinobi.tar.gz https://gitlab.com/Shinobi-Systems/ShinobiCE/-/archive/master/ShinobiCE-master.tar.gz
}
dev_download () {
echo "\n### Grabbing latest Shinobi from development branch\n"
doas -u _shinobi ftp -o /home/_shinobi/shinobi.tar.gz https://gitlab.com/Shinobi-Systems/Shinobi/-/archive/dev/Shinobi-dev.tar.gz
}
# Script Start
while true ; do
echo "Does $(whoami) have doas permissions?"
printf "(Yes/no) : "
read -r doas_perm
case $doas_perm in
[Yy]* ) doas ls >/dev/null 2>&1 || doas_perms_abort; break;;
[Nn]* ) echo "Please run this script as user with doas permissions"; exit 0;;
* ) echo "Please answer yes or no.";;
esac
done
while true ; do
echo "\nAre the following packages already installed?\n
Node.js
MariaDB
FFmpeg\n"
printf "(Yes/no) : "
read -r package_deps
case $package_deps in
[Yy]* ) echo "Proceeding..." ; break;;
[Nn]* ) package_install || package_install_abort ; break;;
* ) echo "Please answer yes or no.";;
esac
done
while true ; do
echo "\nIs MariaDB already installed and configured on this machine?"
printf "(Yes/no) : "
read -r mariadb_conf
case $mariadb_conf in
[Yy]* ) echo "Proceeding..." ; break;;
[Nn]* ) mariadb_setup || mariadb_setup_abort ; break;;
* ) echo "Please answer yes or no.";;
esac
done
# Shinobi unpriv user creation
echo '\n### Creating "_shinobi" System User\n'
doas useradd -s /sbin/nologin -m -d /home/_shinobi _shinobi || user_create_abort
# Pro vs Community choice
while true ; do
echo "\nWhich version of Shinobi would you like to install?"
echo "[D]evelopment, [P]ro or [C]ommunity Edition"
printf "(Dev/Pro/Community) : "
read -r pro_ce
case $pro_ce in
[Dd]* ) dev_download || source_dl_abort ; break;;
[Pp]* ) pro_download || source_dl_abort ; break;;
[Cc]* ) gpl_download || source_dl_abort ; break;;
* ) echo 'Enter "P" for Pro or "C" for Community Version.';;
esac
done
echo "\n### Extracting to install directory\n"
doas -u _shinobi tar -xzf /home/_shinobi/shinobi.tar.gz -C /home/_shinobi/ || source_extract_abort
doas -u _shinobi find /home/_shinobi/ -type d -name "Shinobi*" -exec mv {} /home/_shinobi/shinobi \; >/dev/null 2>&1
# MariaDB DB schema install
while true ; do
echo '\nInstall Shinobi Database schema? (Answer "No" only if you have already installed it manually)'
printf "(Yes/no) : "
read -r schema_yn
case $schema_yn in
[Yy]* ) schema_install || schema_install_abort ; break;;
[Nn]* ) echo "Proceeding..." ; break;;
* ) echo "Please answer yes or no.";;
esac
done
# NPM Node Module Installation
echo "\n### Installing required Node modules\n"
cd /home/_shinobi/shinobi || npm_abort
doas -u _shinobi npm install --unsafe-perm
doas npm audit fix --force
doas -u _shinobi cp /home/_shinobi/shinobi/conf.sample.json /home/_shinobi/shinobi/conf.json
doas -u _shinobi cp /home/_shinobi/shinobi/super.sample.json /home/_shinobi/shinobi/super.json
doas npm install -g pm2
# Post-Install Info
echo "\nCongratulations, Shinobi is now installed!\n"
echo 'To start Shinobi at boot, add a crontab entry for the user "_shinobi" with something like this:\n'
echo '$ doas crontab -u _shinobi -e'
echo '@reboot /bin/sh -c "cd /home/_shinobi/shinobi && pm2 start camera.js cron.js"'
echo "\nYou can access Shinobi at http://$(ifconfig | grep 'inet ' | awk '!/127.0.0.1/ {print $2}'):8080"
echo "\nPlease create a user by logging in to the admin panel at http://$(ifconfig | grep 'inet ' | awk '!/127.0.0.1/ {print $2}'):8080/super"
echo "\nThe default login credentials are:
username: admin@shinobi.video
password: admin"
echo "\nThe official Shinobi Documentation can be found at: https://shinobi.video/docs/"

View File

@ -9,7 +9,7 @@ if [ ! -e "./opencv" ]; then
echo "Downloading OpenCV..."
git clone https://github.com/opencv/opencv.git
cd opencv
git checkout 3.4.0
git checkout 3.4.10
cd ..
fi
if [ ! -e "./opencv_contrib" ]; then
@ -34,16 +34,19 @@ echo "*****************"
echo "Adding Additional Repository"
echo "http://security.ubuntu.com/ubuntu"
if [ "$flavor" = *"Artful"* ]; then
sudo add-apt-repository "deb http://security.ubuntu.com/ubuntu artful-security main"
sudo add-apt-repository "deb http://security.ubuntu.com/ubuntu artful-security main" -y
fi
if [ "$flavor" = *"Zesty"* ]; then
sudo add-apt-repository "deb http://security.ubuntu.com/ubuntu zesty-security main"
sudo add-apt-repository "deb http://security.ubuntu.com/ubuntu zesty-security main" -y
fi
if [ "$flavor" = *"Xenial"* ]; then
sudo add-apt-repository "deb http://security.ubuntu.com/ubuntu xenial-security main"
sudo add-apt-repository "deb http://security.ubuntu.com/ubuntu xenial-security main" -y
fi
if [ "$flavor" = *"Trusty"* ]; then
sudo add-apt-repository "deb http://security.ubuntu.com/ubuntu trusty-security main"
sudo add-apt-repository "deb http://security.ubuntu.com/ubuntu trusty-security main" -y
fi
if [ "$flavor" = *"Eoan"* ]; then
sudo add-apt-repository "deb http://archive.ubuntu.com/ubuntu/ bionic main restricted universe multiverse" -y
fi
echo "Downloading Libraries"
sudo apt-get install libjpeg-dev libpango1.0-dev libgif-dev build-essential gcc-6 g++-6 -y;
@ -77,4 +80,4 @@ read opencvuninstall
if [ "$opencvuninstall" = "y" ] || [ "$opencvuninstall" = "Y" ]; then
rm -rf opencv
rm -rf opencv_contrib
fi
fi

View File

@ -17,7 +17,7 @@ if [ ! -e "./super.json" ]; then
echo "if you would like to limit accessibility of an"
echo "account for business scenarios."
echo "(y)es or (N)o"
read createSuperJson
read -r createSuperJson
if [ "$createSuperJson" = "y" ] || [ "$createSuperJson" = "Y" ]; then
echo "Default Superuser : admin@shinobi.video"
echo "Default Password : admin"
@ -32,22 +32,22 @@ echo "============="
echo "Shinobi - Do you want to Install Node.js?"
echo "(y)es or (N)o"
NODEJSINSTALL=0
read nodejsinstall
read -r nodejsinstall
if [ "$nodejsinstall" = "y" ] || [ "$nodejsinstall" = "Y" ]; then
sudo zypper install -y nodejs8
sudo zypper install -y nodejs11
NODEJSINSTALL=1
fi
echo "============="
echo "Shinobi - Do you want to Install FFMPEG?"
echo "(y)es or (N)o"
read ffmpeginstall
read -r ffmpeginstall
if [ "$ffmpeginstall" = "y" ] || [ "$ffmpeginstall" = "Y" ]; then
# Without nodejs8 package we can't use npm command
if [ "$NODEJSINSTALL" -eq "1" ]; then
echo "Shinobi - Do you want to Install FFMPEG with 'zypper --version' or download a static version provided with npm 'npm --version'?"
echo "(z)ypper or (N)pm"
echo "Press [ENTER] for default (npm)"
read ffmpegstaticinstall
read -r ffmpegstaticinstall
if [ "$ffmpegstaticinstall" = "z" ] || [ "$ffmpegstaticinstall" = "Z" ]; then
# Install ffmpeg and ffmpeg-devel
sudo zypper install -y ffmpeg ffmpeg-devel
@ -59,48 +59,28 @@ if [ "$ffmpeginstall" = "y" ] || [ "$ffmpeginstall" = "Y" ]; then
fi
fi
echo "============="
echo "Shinobi - Do you want to use MariaDB or SQLite3?"
echo "SQLite3 is better for small installs"
echo "MariaDB (MySQL) is better for large installs"
echo "(S)QLite3 or (M)ariaDB?"
echo "Press [ENTER] for default (MariaDB)"
read sqliteormariadb
if [ "$sqliteormariadb" = "S" ] || [ "$sqliteormariadb" = "s" ]; then
sudo npm install jsonfile
sudo zypper install -y sqlite3 sqlite3-devel
sudo npm install sqlite3
node ./tools/modifyConfiguration.js databaseType=sqlite3
if [ ! -e "./shinobi.sqlite" ]; then
echo "Creating shinobi.sqlite for SQLite3..."
sudo cp sql/shinobi.sample.sqlite shinobi.sqlite
else
echo "shinobi.sqlite already exists. Continuing..."
fi
else
echo "============="
echo "Shinobi - Do you want to Install MariaDB?"
echo "(y)es or (N)o"
read mysqlagree
if [ "$mysqlagree" = "y" ] || [ "$mysqlagree" = "Y" ]; then
sudo zypper install -y mariadb
#Start mysql and enable on boot
sudo systemctl start mariadb
sudo systemctl enable mariadb
#Run mysql install
sudo mysql_secure_installation
fi
echo "============="
echo "Shinobi - Database Installation"
echo "(y)es or (N)o"
read mysqlagreeData
if [ "$mysqlagreeData" = "y" ] || [ "$mysqlagreeData" = "Y" ]; then
echo "What is your SQL Username?"
read sqluser
echo "What is your SQL Password?"
read sqlpass
sudo mysql -u $sqluser -p$sqlpass -e "source sql/user.sql" || true
sudo mysql -u $sqluser -p$sqlpass -e "source sql/framework.sql" || true
fi
echo "Shinobi - Do you want to Install MariaDB?"
echo "(y)es or (N)o"
read -r mysqlagree
if [ "$mysqlagree" = "y" ] || [ "$mysqlagree" = "Y" ]; then
sudo zypper install -y mariadb
#Start mysql and enable on boot
sudo systemctl start mariadb
sudo systemctl enable mariadb
#Run mysql install
sudo mysql_secure_installation
fi
echo "============="
echo "Shinobi - Database Installation"
echo "(y)es or (N)o"
read -r mysqlagreeData
if [ "$mysqlagreeData" = "y" ] || [ "$mysqlagreeData" = "Y" ]; then
echo "What is your SQL Username?"
read -r sqluser
echo "What is your SQL Password?"
read -r sqlpass
sudo mysql -u "$sqluser" -p"$sqlpass" -e "source sql/user.sql" || true
sudo mysql -u "$sqluser" -p"$sqlpass" -e "source sql/framework.sql" || true
fi
echo "============="
echo "Shinobi - Install NPM Libraries"
@ -117,7 +97,7 @@ dos2unix /home/Shinobi/INSTALL/shinobi
ln -s /home/Shinobi/INSTALL/shinobi /usr/bin/shinobi
echo "Shinobi - Start Shinobi and set to start on boot?"
echo "(y)es or (N)o"
read startShinobi
read -r startShinobi
if [ "$startShinobi" = "y" ] || [ "$startShinobi" = "Y" ]; then
sudo pm2 start camera.js
sudo pm2 start cron.js

View File

@ -4,7 +4,7 @@ if [ ! -e "/etc/shinobisystems/path.txt" ]; then
else
installationDirectory=$(cat /etc/shinobisystems/cctv.txt)
fi
cd $installationDirectory
cd "$installationDirectory" || exit
currentBuild=$(git show --oneline -s)
gitOrigin=$(git remote show origin)
splitBuildString=($currentBuild)
@ -72,8 +72,8 @@ fi
if [[ $@ == *'restart'* ]]; then
proccessAlive=$(pm2 list | grep camera)
if [ "$proccessAlive" ]; then
pm2 restart $installationDirectory/camera.js
pm2 restart $installationDirectory/cron.js
pm2 restart "$installationDirectory"/camera.js
pm2 restart "$installationDirectory"/cron.js
else
echo "Shinobi process is not running."
fi
@ -85,11 +85,11 @@ else
else
if [ -e "$installationDirectory/INSTALL/installed.txt" ]; then
echo "Starting Shinobi"
pm2 start $installationDirectory/camera.js
pm2 start $installationDirectory/cron.js
pm2 start "$installationDirectory"/camera.js
pm2 start "$installationDirectory"/cron.js
fi
if [ ! -e "$installationDirectory/INSTALL/installed.txt" ]; then
chmod +x $installationDirectory/INSTALL/now.sh&&INSTALL/now.sh
chmod +x "$installationDirectory"/INSTALL/now.sh&&INSTALL/now.sh
fi
fi
fi
@ -97,8 +97,8 @@ fi
if [[ $@ == *'stop'* ]] || [[ $@ == *'exit'* ]]; then
proccessAlive=$(pm2 list | grep camera)
if [ "$proccessAlive" ]; then
pm2 stop $installationDirectory/camera.js
pm2 stop $installationDirectory/cron.js
pm2 stop "$installationDirectory"/camera.js
pm2 stop "$installationDirectory"/cron.js
else
echo "Shinobi process is not running."
fi
@ -110,7 +110,7 @@ if [[ $@ == *'version'* ]]; then
else
echo "Repository : Shinobi CE"
fi
echo $currentBuild
echo "$currentBuild"
fi
if [[ $@ == *'bootupEnable'* ]] || [[ $@ == *'bootupenable'* ]]; then
pm2 startup
@ -140,17 +140,17 @@ if [[ $@ == *'update'* ]]; then
echo "============="
echo "Shinobi - Are you sure you want to update? This will restart Shinobi."
echo "(y)es or (N)o"
read updateshinobi
read -r updateshinobi
if [ "$updateshinobi" = "y" ] || [ "$updateshinobi" = "Y" ]; then
echo "Beginning Update Process..."
pm2 stop $installationDirectory/camera.js
pm2 stop $installationDirectory/cron.js
pm2 stop "$installationDirectory"/camera.js
pm2 stop "$installationDirectory"/cron.js
npm install --unsafe-perm
npm audit fix --force
git reset --hard
git pull
pm2 start $installationDirectory/camera.js
pm2 start $installationDirectory/cron.js
pm2 start "$installationDirectory"/camera.js
pm2 start "$installationDirectory"/cron.js
else
echo "Cancelled Update Process."
fi
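
The control script above dispatches subcommands by substring-matching the whole argument list (`[[ $@ == *'restart'* ]]`). The same check as a portable standalone sketch (the `has_arg` helper name is ours, not the script's):

```shell
#!/bin/sh
# Return 0 if the joined argument list contains the given substring,
# mirroring the script's `[[ $@ == *'restart'* ]]` checks.
has_arg() {
    needle=$1
    shift
    case "$*" in
        *"$needle"*) return 0 ;;
        *) return 1 ;;
    esac
}
```

Note that substring matching means an argument like `restartall` would also trigger the `restart` branch; exact-word matching would need a per-argument loop.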

View File

@ -1,13 +1,13 @@
#!/bin/bash
echo "Shinobi - Do you want to Install Node.js?"
echo "(y)es or (N)o"
read nodejsinstall
read -r nodejsinstall
if [ "$nodejsinstall" = "y" ]; then
wget https://deb.nodesource.com/setup_8.x
chmod +x setup_8.x
./setup_8.x
wget https://deb.nodesource.com/setup_12.x
chmod +x setup_12.x
./setup_12.x
sudo apt install nodejs -y
rm setup_8.x
rm setup_12.x
fi
#Detect Ubuntu Version
@ -34,28 +34,28 @@ fi
# Install MariaDB
echo "Shinobi - Do you want to Install MariaDB? Choose No if you have MySQL."
echo "(y)es or (N)o"
read mysqlagree
read -r mysqlagree
if [ "$mysqlagree" = "y" ]; then
echo "Shinobi - Installing MariaDB"
echo "Password for root SQL user, If you are installing SQL now then you may put anything:"
read sqlpass
read -r sqlpass
echo "mariadb-server mariadb-server/root_password password $sqlpass" | debconf-set-selections
echo "mariadb-server mariadb-server/root_password_again password $sqlpass" | debconf-set-selections
apt install mariadb-server -y
service mysql start
fi
# Make sure files have correct perms
# Make sure files have correct perms
chmod -R 755 .
# Database Installation
#Check If Mysql-Server is already installed
#Check If Mysql-Server is already installed
echo "============="
echo "Checking for mysql-server"
echo "============="
dpkg -s mysql-server &> /dev/null
if [ $? -eq 0 ]; then
if [ $? -eq 0 ]; then
echo "+====================================+"
echo "| Warning MYSQL SERVER IS INSTALLED! |"
echo "+====================================+"
@ -64,75 +64,75 @@ if [ $? -eq 0 ]; then
echo "+====================================+"
echo "Shinobi - Do you want to Install MariaDB?"
echo "(y)es or (N)o"
read installmariadb
read -r installmariadb
if [ "$installmariadb" = "y" ]; then
echo "+=============================================+"
echo "| This will DESTORY ALL DATA ON MYSQL SERVER! |"
echo "+=============================================+"
echo "Please type the following to continue"
echo "DESTORY!"
read mysqlagree
read -r mysqlagree
if [ "$mysqlagree" = "DESTORY!" ]; then
echo "Shinobi - Installing MariaDB"
echo "Password for root SQL user, If you are installing SQL now then you may put anything:"
read sqlpass
read -r sqlpass
echo "mariadb-server mariadb-server/root_password password $sqlpass" | debconf-set-selections
echo "mariadb-server mariadb-server/root_password_again password $sqlpass" | debconf-set-selections
#Create my.cnf file
#Create my.cnf file
echo "[client]" >> ~/.my.cnf
echo "user=root" >> ~/.my.cnf
echo "password=$sqlpass" >> ~/.my.cnf
chmod 755 ~/.my.cnf
apt install mariadb-server
chmod 755 ~/.my.cnf
apt install mariadb-server
service mysql start
fi
fi
else
else
echo "Shinobi - Do you want to Install MariaDB?"
echo "(y)es or (N)o"
read mysqlagree
read -r mysqlagree
if [ "$mysqlagree" = "y" ]; then
echo "Shinobi - Installing MariaDB"
echo "Password for root SQL user, If you are installing SQL now then you may put anything:"
read sqlpass
read -r sqlpass
echo "mariadb-server mariadb-server/root_password password $sqlpass" | debconf-set-selections
echo "mariadb-server mariadb-server/root_password_again password $sqlpass" | debconf-set-selections
echo "[client]" >> ~/.my.cnf
echo "user=root" >> ~/.my.cnf
echo "password=$sqlpass" >> ~/.my.cnf
chmod 755 ~/.my.cnf
chmod 755 ~/.my.cnf
apt install mariadb-server -y
service mysql start
fi
fi
fi
chmod -R 755 .
echo "Shinobi - Database Installation"
echo "(y)es or (N)o"
read mysqlagreeData
read -r mysqlagreeData
if [ "$mysqlagreeData" = "y" ]; then
mysql -e "source sql/user.sql" || true
mysql -e "source sql/framework.sql" || true
echo "Shinobi - Do you want to Install Default Data (default_data.sql)?"
echo "(y)es or (N)o"
read mysqlDefaultData
read -r mysqlDefaultData
if [ "$mysqlDefaultData" = "y" ]; then
escapeReplaceQuote='\\"'
groupKey=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 7 | head -n 1)
userID=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 6 | head -n 1)
userEmail=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 6 | head -n 1)"@"$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 6 | head -n 1)".com"
userPasswordPlain=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 6 | head -n 1)
userPasswordMD5=$(echo -n "$userPasswordPlain" | md5sum | awk '{print $1}')
groupKey=$(head -c 64 < /dev/urandom | sha256sum | awk '{print substr($1,1,7)}')
userID=$(head -c 64 < /dev/urandom | sha256sum | awk '{print substr($1,1,6)}')
userEmail=$(head -c 64 < /dev/urandom | sha256sum | awk '{print substr($1,1,6)}')"@"$(head -c 64 < /dev/urandom | sha256sum | awk '{print substr($1,1,6)}')".com"
userPasswordPlain=$(head -c 64 < /dev/urandom | sha256sum | awk '{print substr($1,1,7)}')
userPasswordMD5=$(echo -n "$userPasswordPlain" | md5sum | awk '{print $1}')
userDetails='{"days":"10"}'
userDetails=$(echo "$userDetails" | sed -e 's/"/'$escapeReplaceQuote'/g')
echo $userDetailsNew
echo "$userDetailsNew"
apiIP='0.0.0.0'
apiKey=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 32 | head -n 1)
apiKey=$(head -c 64 < /dev/urandom | sha256sum | awk '{print substr($1,1,32)}')
apiDetails='{"auth_socket":"1","get_monitors":"1","control_monitors":"1","get_logs":"1","watch_stream":"1","watch_snapshot":"1","watch_videos":"1","delete_videos":"1"}'
apiDetails=$(echo "$apiDetails" | sed -e 's/"/'$escapeReplaceQuote'/g')
rm sql/default_user.sql || true
echo "USE ccio;INSERT INTO Users (\`ke\`,\`uid\`,\`auth\`,\`mail\`,\`pass\`,\`details\`) VALUES (\"$groupKey\",\"$userID\",\"$apiKey\",\"$userEmail\",\"$userPasswordMD5\",\"$userDetails\");INSERT INTO API (\`code\`,\`ke\`,\`uid\`,\`ip\`,\`details\`) VALUES (\"$apiKey\",\"$groupKey\",\"$userID\",\"$apiIP\",\"$apiDetails\");" > "sql/default_user.sql"
mysql -u $sqluser -p$sqlpass --database ccio -e "source sql/default_user.sql" > "INSTALL/log.txt"
mysql -u "$sqluser" -p"$sqlpass" --database ccio -e "source sql/default_user.sql" > "INSTALL/log.txt"
echo "====================================="
echo "=======!! Login Credentials !!======="
echo "|| Username : $userEmail"
@ -169,9 +169,9 @@ echo "Shinobi - Finished"
touch INSTALL/installed.txt
echo "Shinobi - Start Shinobi?"
echo "(y)es or (N)o"
read startShinobi
read -r startShinobi
if [ "$startShinobi" = "y" ]; then
pm2 start camera.js
pm2 start cron.js
pm2 list
fi
fi

132
INSTALL/ubuntu-touchless.sh Normal file
View File

@ -0,0 +1,132 @@
#!/bin/bash
echo "========================================================="
echo "==!! Shinobi : The Open Source CCTV and NVR Solution !!=="
echo "========================================================="
echo "To answer yes type the letter (y) in lowercase and press ENTER."
echo "Default is no (N). Skip any components you already have or don't need."
echo "============="
#Detect Ubuntu Version
echo "============="
echo " Detecting Ubuntu Version"
echo "============="
getubuntuversion=$(lsb_release -r | awk '{print $2}' | cut -d . -f1)
echo "============="
echo " Ubuntu Version: $getubuntuversion"
echo "============="
echo "Shinobi - Do you want to temporarily disable IPv6?"
echo "Sometimes IPv6 causes Ubuntu package updates to fail. Only do this if your machine doesn't rely on IPv6."
echo "(y)es or (N)o"
read -r disableIpv6
if [ "$disableIpv6" = "y" ] || [ "$disableIpv6" = "Y" ]; then
sudo sysctl -w net.ipv6.conf.all.disable_ipv6=1
sudo sysctl -w net.ipv6.conf.default.disable_ipv6=1
sudo sysctl -w net.ipv6.conf.lo.disable_ipv6=1
fi
if [ "$getubuntuversion" = "18" ] || [ "$getubuntuversion" > "18" ]; then
apt install sudo wget -y
sudo apt install -y software-properties-common
sudo add-apt-repository universe -y
fi
if [ "$getubuntuversion" = "16" ]; then
sudo apt install gnupg-curl -y
fi
sudo apt install gcc-8 g++-8 -y
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-8 800 --slave /usr/bin/g++ g++ /usr/bin/g++-8
#create conf.json
if [ ! -e "./conf.json" ]; then
sudo cp conf.sample.json conf.json
#Generate a random Cron key for the config file
cronKey=$(head -c 1024 < /dev/urandom | sha256sum | awk '{print substr($1,1,29)}')
#Insert key into conf.json
sudo sed -i -e 's/change_this_to_something_very_random__just_anything_other_than_this/'"$cronKey"'/g' conf.json
fi
#create super.json
if [ ! -e "./super.json" ]; then
echo "============="
echo "Default Superuser : admin@shinobi.video"
echo "Default Password : admin"
echo "* You can edit these settings in \"super.json\" located in the Shinobi directory."
sudo cp super.sample.json super.json
fi
if ! [ -x "$(command -v ifconfig)" ]; then
echo "============="
echo "Shinobi - Installing Net-Tools"
sudo apt install net-tools -y
fi
if ! [ -x "$(command -v node)" ]; then
echo "============="
echo "Shinobi - Installing Node.js"
wget https://deb.nodesource.com/setup_12.x
chmod +x setup_12.x
./setup_12.x
sudo apt install nodejs -y
sudo apt install node-pre-gyp -y
rm setup_12.x
else
echo "Node.js Found..."
echo "Version : $(node -v)"
fi
if ! [ -x "$(command -v npm)" ]; then
sudo apt install npm -y
fi
sudo apt install make zip -y
if ! [ -x "$(command -v ffmpeg)" ]; then
if [ "$getubuntuversion" = "16" ] || [ "$getubuntuversion" < "16" ]; then
echo "============="
echo "Shinobi - Get FFMPEG 3.x from ppa:jonathonf/ffmpeg-3"
sudo add-apt-repository ppa:jonathonf/ffmpeg-3 -y
sudo apt update -y && sudo apt install ffmpeg libav-tools x264 x265 -y
else
echo "============="
echo "Shinobi - Installing FFMPEG"
sudo apt install ffmpeg -y
fi
else
echo "FFmpeg Found..."
echo "Version : $(ffmpeg -version)"
fi
echo "============="
echo "Shinobi - Installing MariaDB"
echo "MariaDB will be installed with no password."
sqlpass=""
echo "mariadb-server mariadb-server/root_password password $sqlpass" | debconf-set-selections
echo "mariadb-server mariadb-server/root_password_again password $sqlpass" | debconf-set-selections
sudo apt install mariadb-server -y
sudo service mysql start
echo "============="
echo "Shinobi - Installing Database..."
sqluser="root"
sudo mysql -e "source sql/user.sql" || true
sudo mysql -e "source sql/framework.sql" || true
echo "============="
echo "Shinobi - Install NPM Libraries"
sudo npm i npm -g
sudo npm install --unsafe-perm
sudo npm audit fix --force
echo "============="
echo "Shinobi - Install PM2"
sudo npm install pm2@3.0.0 -g
echo "Shinobi - Finished"
sudo chmod -R 755 .
touch INSTALL/installed.txt
dos2unix /home/Shinobi/INSTALL/shinobi
ln -s /home/Shinobi/INSTALL/shinobi /usr/bin/shinobi
echo "Shinobi - Randomizing cron key"
node /home/Shinobi/tools/modifyConfiguration.js addToConfig="{\"cron\":{\"key\":\"$(head -c 64 < /dev/urandom | sha256sum | awk '{print substr($1,1,60)}')\"}}"
echo "Shinobi - Starting Shinobi and setting to start on boot"
sudo pm2 start camera.js
sudo pm2 start cron.js
sudo pm2 startup
sudo pm2 save
sudo pm2 list
echo "====================================="
echo "||===== Install Completed =====||"
echo "====================================="
echo "|| Login with the Superuser and create a new user!!"
echo "||==================================="
echo "|| Open http://$(ifconfig | sed -En 's/127.0.0.1//;s/.*inet (addr:)?(([0-9]*\.){3}[0-9]*).*/\2/p'):8080/super in your web browser."
echo "||==================================="
echo "|| Default Superuser : admin@shinobi.video"
echo "|| Default Password : admin"
echo "====================================="
echo "====================================="


@ -19,11 +19,17 @@ if [ "$getubuntuversion" = "18" ] || [ "$getubuntuversion" > "18" ]; then
sudo add-apt-repository universe -y
fi
if [ "$getubuntuversion" = "16" ]; then
apt install gnupg-curl -y
sudo apt install gnupg-curl -y
fi
sudo apt install gcc-8 g++-8 -y
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-8 800 --slave /usr/bin/g++ g++ /usr/bin/g++-8
#create conf.json
if [ ! -e "./conf.json" ]; then
sudo cp conf.sample.json conf.json
#Generate a random Cron key for the config file
cronKey=$(head -c 1024 < /dev/urandom | sha256sum | awk '{print substr($1,1,29)}')
#Insert key into conf.json
sudo sed -i -e 's/change_this_to_something_very_random__just_anything_other_than_this/'"$cronKey"'/g' conf.json
fi
#create super.json
if [ ! -e "./super.json" ]; then
@ -35,16 +41,17 @@ if [ ! -e "./super.json" ]; then
fi
if ! [ -x "$(command -v ifconfig)" ]; then
echo "============="
echo "Shinobi - Installing Net-Tools"
sudo apt install net-tools -y
echo "Shinobi - Installing Net-Tools and Dos2Unix"
sudo apt install net-tools dos2unix -y
fi
if ! [ -x "$(command -v node)" ]; then
echo "============="
echo "Shinobi - Installing Node.js"
wget https://deb.nodesource.com/setup_8.x
chmod +x setup_8.x
./setup_8.x
wget https://deb.nodesource.com/setup_12.x
chmod +x setup_12.x
./setup_12.x
sudo apt install nodejs -y
rm setup_12.x
else
echo "Node.js Found..."
echo "Version : $(node -v)"
@ -69,53 +76,34 @@ else
echo "Version : $(ffmpeg -version)"
fi
echo "============="
echo "Shinobi - Do you want to use MariaDB or SQLite3?"
echo "SQLite3 is better for small installs"
echo "MariaDB (MySQL) is better for large installs"
echo "(S)QLite3 or (M)ariaDB?"
echo "Press [ENTER] for default (MariaDB)"
read sqliteormariadb
if [ "$sqliteormariadb" = "S" ] || [ "$sqliteormariadb" = "s" ]; then
sudo npm install jsonfile
sudo apt-get install sqlite3 libsqlite3-dev -y
sudo npm install sqlite3
node ./tools/modifyConfiguration.js databaseType=sqlite3
if [ ! -e "./shinobi.sqlite" ]; then
echo "Creating shinobi.sqlite for SQLite3..."
sudo cp sql/shinobi.sample.sqlite shinobi.sqlite
else
echo "shinobi.sqlite already exists. Continuing..."
fi
else
echo "Shinobi - Do you want to Install MariaDB? Choose No if you already have it."
echo "(y)es or (N)o"
read mysqlagree
echo "Shinobi - Do you want to Install MariaDB? Choose No if you already have it."
echo "(y)es or (N)o"
read -r mysqlagree
if [ "$mysqlagree" = "y" ] || [ "$mysqlagree" = "Y" ]; then
echo "Shinobi - Installing MariaDB"
echo "Password for the root SQL user. If you are installing SQL now, you may put anything:"
read -r sqlpass
echo "mariadb-server mariadb-server/root_password password $sqlpass" | debconf-set-selections
echo "mariadb-server mariadb-server/root_password_again password $sqlpass" | debconf-set-selections
sudo apt install mariadb-server -y
sudo service mysql start
fi
echo "============="
echo "Shinobi - Database Installation"
echo "(y)es or (N)o"
read -r mysqlagreeData
if [ "$mysqlagreeData" = "y" ] || [ "$mysqlagreeData" = "Y" ]; then
if [ "$mysqlagree" = "y" ] || [ "$mysqlagree" = "Y" ]; then
echo "Shinobi - Installing MariaDB"
echo "Password for the root SQL user. If you are installing SQL now, you may put anything:"
read sqlpass
echo "mariadb-server mariadb-server/root_password password $sqlpass" | debconf-set-selections
echo "mariadb-server mariadb-server/root_password_again password $sqlpass" | debconf-set-selections
sudo apt install mariadb-server -y
sudo service mysql start
sqluser="root"
fi
echo "============="
echo "Shinobi - Database Installation"
echo "(y)es or (N)o"
read mysqlagreeData
if [ "$mysqlagreeData" = "y" ] || [ "$mysqlagreeData" = "Y" ]; then
if [ "$mysqlagree" = "y" ] || [ "$mysqlagree" = "Y" ]; then
sqluser="root"
fi
if [ ! "$mysqlagree" = "y" ]; then
echo "What is your SQL Username?"
read sqluser
echo "What is your SQL Password?"
read sqlpass
fi
sudo mysql -u $sqluser -p$sqlpass -e "source sql/user.sql" || true
sudo mysql -u $sqluser -p$sqlpass -e "source sql/framework.sql" || true
if [ ! "$mysqlagree" = "y" ]; then
echo "What is your SQL Username?"
read -r sqluser
echo "What is your SQL Password?"
read -r sqlpass
fi
sudo mysql -u "$sqluser" -p"$sqlpass" -e "source sql/user.sql" || true
sudo mysql -u "$sqluser" -p"$sqlpass" -e "source sql/framework.sql" || true
fi
echo "============="
echo "Shinobi - Install NPM Libraries"
@ -128,11 +116,13 @@ sudo npm install pm2@3.0.0 -g
echo "Shinobi - Finished"
sudo chmod -R 755 .
touch INSTALL/installed.txt
dos2unix /home/Shinobi/INSTALL/shinobi
ln -s /home/Shinobi/INSTALL/shinobi /usr/bin/shinobi
dos2unix INSTALL/shinobi
ln -s `readlink -f INSTALL/shinobi` /usr/bin/shinobi
echo "Shinobi - Randomizing cron key"
node tools/modifyConfiguration.js addToConfig="{\"cron\":{\"key\":\"$(head -c 64 < /dev/urandom | sha256sum | awk '{print substr($1,1,60)}')\"}}"
echo "Shinobi - Start Shinobi and set to start on boot?"
echo "(y)es or (N)o"
read startShinobi
read -r startShinobi
if [ "$startShinobi" = "y" ] || [ "$startShinobi" = "Y" ]; then
sudo pm2 start camera.js
sudo pm2 start cron.js

README.md

@ -1,146 +1,77 @@
# Shinobi Pro
# Shinobi Pro
### (Shinobi Open Source Software)
Shinobi is the Open Source CCTV Solution written in Node.JS. Designed with multiple account system, Streams by WebSocket, and Save to WebM. Shinobi can record IP Cameras and Local Cameras.
Shinobi is the Open Source CCTV Solution written in Node.JS. Designed with a multiple-account system, streaming over WebSocket, and direct saving to MP4. Shinobi can record IP Cameras and Local Cameras.
<a href="http://shinobi.video/gallery"><img src="https://github.com/ShinobiCCTV/Shinobi/blob/master/web/libs/img/demo.jpg?raw=true"></a>
## Install and Use
# Key Aspects
- Installation : http://shinobi.video/docs/start
- Post-Installation Tutorials : http://shinobi.video/docs/configure
- Troubleshooting Guide : https://hub.shinobi.video/articles/view/v0AFPFchfVcFGUS
For an updated list of features visit the official website. http://shinobi.video/features
#### Docker
- Install with **Docker** : https://gitlab.com/Shinobi-Systems/Shinobi/-/tree/dev/Docker
- Time-lapse Viewer (Watch an hour's worth of footage in a few minutes)
- 2-Factor Authentication
- Defeats stream limit imposed by browsers
- With Base64 (Stream Type) and JPEG Mode (Option)
- Records IP Cameras and Local Cameras
- Streams by WebSocket, HLS (includes audio), and MJPEG
- Save to WebM and MP4
- Can save Audio
- Push Events - When a video is finished it will appear in the dashboard without a refresh
- Region Motion Detection (Similar to ZoneMinder Zone Detection)
- Represented by a Motion Gauge on each monitor
- "No Motion" Notifications
- 1 Process for Each Camera to do both, Recording and Streaming
- Timeline for viewing Motion Events and Videos
- Sub-Accounts with permissions
- Monitor Viewing
- Monitor Editing
- Video Deleting
- Separate API keys for sub account
- Cron Filters can be set based on master account
- Stream Analyzer built-in (FFprobe GUI)
- Monitor Groups
- Can snapshot images from stream directly
- Lower Bandwidth Mode (JPEG Mode)
- Snapshot (cgi-bin) must be enabled in Monitor Settings
- Control Cameras from Interface
- API
- Get videos
- Get monitors
- Change monitor modes : Disabled, Watch, Record
- Embedding streams
- Dashboard Framework made with Google Material Design Lite, jQuery, and Bootstrap
## "is my camera supported?"
Ask yourself these questions to get a general sense.
- Does it have ONVIF?
- If yes, then it may have H.264 or H.265 streaming capability.
- Does it have RTSP Protocol for Streaming?
- If yes, then it may have H.264 or H.265 streaming capability.
- Can you stream it in VLC Player?
- If yes, use that same URL in Shinobi. You may need to specify the port number when using `rtsp://` protocol.
- Does it have MJPEG Streaming?
- While this would work in Shinobi, it is far from ideal. Please see if any of the prior questions are applicable.
- Does it have a web interface that you can connect to directly?
- If yes, then you may be able to find model information that can be used to search online for a streaming URL.
Configuration Guides : http://shinobi.video/docs/configure
## Asking for help
Before asking questions it would be nice if you read the docs :) http://shinobi.video
- General Support : https://shinobi.community
- Please be sure to read the `#guidelines` channel after joining.
- Business Inquiries : business@shinobi.video or the Live Chat on https://shinobi.video
After doing so please head on over to the Discord community chat for support. https://discordapp.com/invite/mdhmvuH
## Support the Development
The Issues section is only for bugs with the software. Comments and feature requests may be closed without comment. http://shinobi.video/docs/contribute
It's a proven fact that generosity makes you a happier person :) https://www.nature.com/articles/ncomms15964
Please be considerate of developer efforts. If you have simple questions, like "what does this button do?", please be sure to have read the docs entirely before asking. If you would like to skip reading the docs and ask away you can order a support package :) http://shinobi.video/support
Get a Mobile License to unlock extended features on the Mobile App as well as support the development!
- Shinobi Mobile App : https://cdn.shinobi.video/installers/ShinobiMobile/
- Get a Mobile License : https://licenses.shinobi.video/subscribe?planSubscribe=plan_G31AZ9mknNCa6z
## Making Suggestions or Feature Requests
You can post suggestions on the Forum in the Suggestions category. Please do not treat this channel like a "demands" window. Developer efforts are limited. Much more than many alternatives.
When you have a suggestion, please try to make the changes yourself, then post a pull request to the `dev` branch. Then we can decide if it's a good change for Shinobi. If you don't know how to go about it and want to have me put it higher on my priority list, you can order a support package :) Pretty Ferengi of me... but until we live in a world without money please support Shinobi :) Cheers!
http://shinobi.video/support
## Help make Shinobi the best Open Source CCTV Solution.
Donate - http://shinobi.video/docs/donate
Ordering a License, Paid Support, or anything from <a href="//camera.observer">here</a> will allow a lot more time to be spent on Shinobi.
Order Support - http://shinobi.video/support
# Why make this?
## Why make this?
http://shinobi.video/why
# What others say
## Author
> "After trying zoneminder without success (heavy unstable and slow) I passed to Shinobi that despite being young spins a thousand times better (I have a setup with 16 cameras recording in FHD to ~ 10fps on a pentium of ~ 2009 and I turn with load below 1.5)."
Moe Alam, Shinobi Systems
> *A Reddit user, /r/ItalyInformatica*
Shinobi is developed by many contributors. Please have a look at the commits to see some of who they are :)
https://gitlab.com/Shinobi-Systems/Shinobi/-/commits/dev
&nbsp;
> "I would suggest Shinobi as a NVR. It's still in the early days but works a lot better than ZoneMinder for me. I'm able to record 16 cams at 1080p 15fps continously whith no load on server (Pentium E5500 3GB RAM) where zm crashed with 6 cams at 720p. Not to mention the better interface."
> *A Reddit user, /r/HomeNetworking*
# How to Install and Run
> FOR DOCKER USERS : Docker is not officially supported and is not recommended. The kitematic method is provided for those who wish to quickly test Shinobi. The Docker files included in the master and dev branches are maintained by the community. If you would like support with Docker please find a community member who maintains the Docker files or please refer to Docker's forum.
#### Fast Install (The Ninja Way)
1. Become `root` to use the installer and run Shinobi. Use one of the following to do so.
- Ubuntu 17.04, 17.10
- `sudo su`
- CentOS 7
- `su`
- MacOS 10.7(+)
- `su`
2. Download and run the installer.
```
bash <(curl -s https://gitlab.com/Shinobi-Systems/Shinobi-Installer/raw/master/shinobi-install.sh)
```
#### Elaborate Installs
Installation Tutorials - http://shinobi.video/docs/start
Troubleshooting Guide - http://shinobi.video/docs/start#trouble-section
# Author
Moe Alam
Follow Shinobi on Twitter https://twitter.com/ShinobiCCTV
Join the Community Chat
<a title="Find me on Discord, Get an Invite" href="https://discordapp.com/invite/mdhmvuH"><img src="https://cdn-images-1.medium.com/max/115/1*OoXboCzk0gYvTNwNnV4S9A@2x.png"></a>
# Support the Development
## Support the Development
Ordering a certificate or support package greatly boosts development. Please consider contributing :)
http://shinobi.video/support
# Links
## Links
Documentation - http://shinobi.video/docs
Donate - https://shinobi.video/docs/donate
Tested Cameras and Systems - http://shinobi.video/docs/supported
Features - http://shinobi.video/features
Reddit (Forum) - https://www.reddit.com/r/ShinobiCCTV/
YouTube (Tutorials) - https://www.youtube.com/channel/UCbgbBLTK-koTyjOmOxA9msQ
Discord (Community Chat) - https://discordapp.com/invite/mdhmvuH
Twitter (News) - https://twitter.com/ShinobiCCTV
Facebook (News) - https://www.facebook.com/Shinobi-1223193167773738/?ref=bookmarks
- Articles : http://hub.shinobi.video/articles
- Documentation : http://shinobi.video/docs
- Features List : http://shinobi.video/features
- Some features may not be listed.
- Donation : http://shinobi.video/docs/donate
- Buy Shinobi Stuff : https://licenses.shinobi.video
- User Submitted Configurations : http://shinobi.video/docs/cameras
- Features : http://shinobi.video/features
- Reddit (Forum) : https://www.reddit.com/r/ShinobiCCTV/
- YouTube (Tutorials) : https://www.youtube.com/channel/UCbgbBLTK-koTyjOmOxA9msQ
- Discord (Community Chat) : https://discordapp.com/invite/mdhmvuH
- Twitter (News) : https://twitter.com/ShinobiCCTV
- Facebook (News) : https://www.facebook.com/Shinobi-1223193167773738/?ref=bookmarks

UPDATE.sh Normal file

@ -0,0 +1,5 @@
git reset --hard
git pull
npm install --unsafe-perm
# pm2 restart camera
# pm2 restart cron


@ -1,6 +1,6 @@
//
// Shinobi
// Copyright (C) 2016 Moe Alam, moeiscool
// Copyright (C) 2020 Moe Alam, moeiscool
//
//
// # Donate
@ -9,84 +9,84 @@
// PayPal : paypal@m03.ca
//
var io = new (require('socket.io'))()
//library loader
var loadLib = function(lib){
return require(__dirname+'/libs/'+lib+'.js')
}
//process handlers
var s = loadLib('process')(process,__dirname)
var s = require('./libs/process.js')(process,__dirname)
//load extender functions
loadLib('extenders')(s)
require('./libs/extenders.js')(s)
//configuration loader
var config = loadLib('config')(s)
var config = require('./libs/config.js')(s)
//basic functions
loadLib('basic')(s,config)
require('./libs/basic.js')(s,config)
//language loader
var lang = loadLib('language')(s,config)
var lang = require('./libs/language.js')(s,config)
//working directories : videos, streams, fileBin..
loadLib('folders')(s,config,lang)
require('./libs/folders.js')(s,config,lang)
//code test module
loadLib('codeTester')(s,config,lang)
require('./libs/codeTester.js')(s,config,lang)
//get version
loadLib('version')(s,config,lang)
require('./libs/version.js')(s,config,lang)
//video processing engine
loadLib('ffmpeg')(s,config,lang,function(ffmpeg){
require('./libs/ffmpeg.js')(s,config,lang,async function(ffmpeg){
//ffmpeg coProcessor
loadLib('ffmpegCoProcessor')(s,config,lang,ffmpeg)
require('./libs/ffmpegCoProcessor.js')(s,config,lang,ffmpeg)
//database connection : mysql, sqlite3..
loadLib('sql')(s,config)
require('./libs/sql.js')(s,config)
//authenticator functions : API, dashboard login..
loadLib('auth')(s,config,lang)
require('./libs/auth.js')(s,config,lang)
//express web server with ejs
var app = loadLib('webServer')(s,config,lang,io)
var app = require('./libs/webServer.js')(s,config,lang,io)
//web server routes : page handling..
loadLib('webServerPaths')(s,config,lang,app,io)
require('./libs/webServerPaths.js')(s,config,lang,app,io)
//web server routes for streams : streams..
loadLib('webServerStreamPaths')(s,config,lang,app,io)
require('./libs/webServerStreamPaths.js')(s,config,lang,app,io)
//web server admin routes : create sub accounts, share monitors, share videos
loadLib('webServerAdminPaths')(s,config,lang,app,io)
require('./libs/webServerAdminPaths.js')(s,config,lang,app,io)
//web server superuser routes : create admin accounts and manage system functions
loadLib('webServerSuperPaths')(s,config,lang,app,io)
require('./libs/webServerSuperPaths.js')(s,config,lang,app,io)
//websocket connection handlers : login and streams..
loadLib('socketio')(s,config,lang,io)
require('./libs/socketio.js')(s,config,lang,io)
//user and group functions
loadLib('user')(s,config,lang)
require('./libs/user.js')(s,config,lang)
//timelapse functions
loadLib('timelapse')(s,config,lang,app,io)
require('./libs/timelapse.js')(s,config,lang,app,io)
//fileBin functions
loadLib('fileBin')(s,config,lang,app,io)
require('./libs/fileBin.js')(s,config,lang,app,io)
//monitor/camera handlers
loadLib('monitor')(s,config,lang)
require('./libs/monitor.js')(s,config,lang)
//event functions : motion, object matrix handler
loadLib('events')(s,config,lang)
//built-in detector functions : pam-diff..
loadLib('detector')(s,config)
require('./libs/events.js')(s,config,lang)
//recording functions
loadLib('videos')(s,config,lang)
//branding functions and config defaults
loadLib('videoDropInServer')(s,config,lang,app,io)
require('./libs/videos.js')(s,config,lang)
//plugins : websocket connected services..
loadLib('plugins')(s,config,lang,io)
require('./libs/plugins.js')(s,config,lang,io)
//health : cpu and ram trackers..
loadLib('health')(s,config,lang,io)
require('./libs/health.js')(s,config,lang,io)
//cluster module
loadLib('childNode')(s,config,lang,app,io)
require('./libs/childNode.js')(s,config,lang,app,io)
//cloud uploaders : amazon s3, webdav, backblaze b2..
loadLib('uploaders')(s,config,lang)
require('./libs/uploaders.js')(s,config,lang,app,io)
//notifiers : discord..
loadLib('notification')(s,config,lang)
require('./libs/notification.js')(s,config,lang)
//notifiers : discord..
loadLib('rtmpserver')(s,config,lang)
require('./libs/rtmpserver.js')(s,config,lang)
//dropInEvents server (file manipulation to create event trigger)
loadLib('dropInEvents')(s,config,lang,app,io)
require('./libs/dropInEvents.js')(s,config,lang,app,io)
//form fields to drive the internals
loadLib('definitions')(s,config,lang,app,io)
require('./libs/definitions.js')(s,config,lang,app,io)
//branding functions and config defaults
loadLib('branding')(s,config,lang,app,io)
require('./libs/branding.js')(s,config,lang,app,io)
//custom module loader
loadLib('customAutoLoad')(s,config,lang,app,io)
require('./libs/customAutoLoad.js')(s,config,lang,app,io)
//scheduling engine
loadLib('scheduler')(s,config,lang,app,io)
require('./libs/shinobiHub.js')(s,config,lang,app,io)
//onvif, ptz engine
require('./libs/control.js')(s,config,lang,app,io)
//ffprobe, onvif engine
require('./libs/scanners.js')(s,config,lang,app,io)
//scheduling engine
require('./libs/scheduler.js')(s,config,lang,app,io)
//on-start actions, daemon(s) starter
loadLib('startup')(s,config,lang)
await require('./libs/startup.js')(s,config,lang)
//p2p, commander
require('./libs/commander.js')(s,config,lang)
})


@ -2,6 +2,7 @@
"port": 8080,
"passwordType": "sha256",
"detectorMergePamRegionTriggers": true,
"wallClockTimestampAsDefault": true,
"addStorage": [
{"name":"second","path":"__DIR__/videos2"}
],

cron.js

@ -135,7 +135,142 @@ s.sqlQuery = function(query,values,onMoveOn){
}
})
}
const cleanSqlWhereObject = (where) => {
const newWhere = {}
Object.keys(where).forEach((key) => {
if(key !== '__separator'){
const value = where[key]
newWhere[key] = value
}
})
return newWhere
}
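For example, the helper drops only the `__separator` marker (used to flag OR conditions) and copies every other key. Repeating the function here so the example runs standalone:

```javascript
// Same logic as cleanSqlWhereObject above: copy every key except the
// '__separator' marker used to flag OR conditions.
const cleanSqlWhereObject = (where) => {
    const newWhere = {}
    Object.keys(where).forEach((key) => {
        if(key !== '__separator'){
            newWhere[key] = where[key]
        }
    })
    return newWhere
}

console.log(cleanSqlWhereObject({ ke: 'GRP1', status: '1', __separator: 'or' }))
// { ke: 'GRP1', status: '1' }
```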
const processSimpleWhereCondition = (dbQuery,where,didOne) => {
var whereIsArray = where instanceof Array;
if(where[0] === 'or' || where.__separator === 'or'){
if(whereIsArray){
where.shift()
dbQuery.orWhere(...where)
}else{
where = cleanSqlWhereObject(where)
dbQuery.orWhere(where)
}
}else if(!didOne){
didOne = true
whereIsArray ? dbQuery.where(...where) : dbQuery.where(where)
}else{
whereIsArray ? dbQuery.andWhere(...where) : dbQuery.andWhere(where)
}
}
const processWhereCondition = (dbQuery,where,didOne) => {
var whereIsArray = where instanceof Array;
if(!where[0])return;
if(where[0] && where[0] instanceof Array){
dbQuery.where(function() {
var _this = this
var didOneInsideGroup = false
where.forEach((whereInsideGroup) => {
processWhereCondition(_this,whereInsideGroup,didOneInsideGroup)
})
})
}else if(where[0] && where[0] instanceof Object){
dbQuery.where(function() {
var _this = this
var didOneInsideGroup = false
where.forEach((whereInsideGroup) => {
processSimpleWhereCondition(_this,whereInsideGroup,didOneInsideGroup)
})
})
}else{
processSimpleWhereCondition(dbQuery,where,didOne)
}
}
const knexError = (dbQuery,options,err) => {
console.error('knexError----------------------------------- START')
if(config.debugLogVerbose && config.debugLog === true){
s.debugLog('s.knexQuery QUERY',JSON.stringify(options,null,3))
s.debugLog('STACK TRACE, NOT AN ERROR',new Error())
}
console.error(err)
console.error(dbQuery.toString())
console.error('knexError----------------------------------- END')
}
const knexQuery = (options,callback) => {
try{
if(!s.databaseEngine)return// console.log('Database Not Set');
// options = {
// action: "",
// columns: "",
// table: ""
// }
var dbQuery
switch(options.action){
case'select':
options.columns = options.columns.indexOf(',') === -1 ? [options.columns] : options.columns.split(',');
dbQuery = s.databaseEngine.select(...options.columns).from(options.table)
break;
case'count':
options.columns = options.columns.indexOf(',') === -1 ? [options.columns] : options.columns.split(',');
dbQuery = s.databaseEngine(options.table)
dbQuery.count(options.columns)
break;
case'update':
dbQuery = s.databaseEngine(options.table).update(options.update)
break;
case'delete':
dbQuery = s.databaseEngine(options.table)
break;
case'insert':
dbQuery = s.databaseEngine(options.table).insert(options.insert)
break;
}
if(options.where instanceof Array){
var didOne = false;
options.where.forEach((where) => {
processWhereCondition(dbQuery,where,didOne)
})
}else if(options.where instanceof Object){
dbQuery.where(options.where)
}
if(options.action === 'delete'){
dbQuery.del()
}
if(options.orderBy){
dbQuery.orderBy(...options.orderBy)
}
if(options.groupBy){
dbQuery.groupBy(options.groupBy)
}
if(options.limit){
if(`${options.limit}`.indexOf(',') === -1){
dbQuery.limit(options.limit)
}else{
const limitParts = `${options.limit}`.split(',')
dbQuery.limit(limitParts[0]).offset(limitParts[1])
}
}
if(config.debugLog === true){
console.log(dbQuery.toString())
}
if(callback || options.update || options.insert || options.action === 'delete'){
dbQuery.asCallback(function(err,r) {
if(err){
knexError(dbQuery,options,err)
}
if(callback)callback(err,r)
if(config.debugLogVerbose && config.debugLog === true){
s.debugLog('s.knexQuery QUERY',JSON.stringify(options,null,3))
s.debugLog('s.knexQuery RESPONSE',JSON.stringify(r,null,3))
s.debugLog('STACK TRACE, NOT AN ERROR',new Error())
}
})
}
return dbQuery
}catch(err){
if(callback)callback(err,[])
knexError(dbQuery,options,err)
}
}
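The `where` option accepted by `knexQuery` comes in three shapes: a plain object (equality per key), a `[column, operator, value]` array, and an array of such arrays for a grouped condition. A rough, illustrative renderer (not knex itself — knex builds parameterized queries) showing how each shape reads as SQL:

```javascript
// Illustrative only: approximates how the three `where` shapes accepted by
// knexQuery above would read as a SQL clause. Real knex parameterizes values.
function renderWhere(where) {
    if (Array.isArray(where[0])) {
        // nested group -> parenthesized AND of its members
        return '(' + where.map(renderWhere).join(' AND ') + ')'
    }
    if (Array.isArray(where)) {
        // [column, operator, value] triple
        const [col, op, val] = where
        return `${col} ${op} '${val}'`
    }
    // plain object -> equality on each key
    return Object.keys(where).map(k => `${k} = '${where[k]}'`).join(' AND ')
}

const clause = [
    ['ke', '=', 'GRP1'],
    ['status', '!=', '0'],
].map(renderWhere).join(' AND ')
console.log(clause) // ke = 'GRP1' AND status != '0'
```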
s.debugLog = function(arg1,arg2){
if(config.debugLog === true){
if(!arg2)arg2 = ''
@ -236,29 +371,24 @@ const checkFilterRules = function(v,callback){
var b = v.d.filters[m];
s.debugLog(b)
if(b.enabled==="1"){
b.ar=[v.ke];
b.sql=[];
b.where.forEach(function(j,k){
if(j.p1==='ke'){j.p3=v.ke}
switch(j.p3_type){
case'function':
b.sql.push(j.p1+' '+j.p2+' '+j.p3)
break;
default:
b.sql.push(j.p1+' '+j.p2+' ?')
b.ar.push(j.p3)
break;
}
const whereQuery = [
['ke','=',v.ke],
['status','!=',"0"],
['details','NOT LIKE','%"archived":"1"%'],
]
b.where.forEach(function(condition){
if(condition.p1 === 'ke'){condition.p3 = v.ke}
whereQuery.push([condition.p1,condition.p2 || '=',condition.p3])
})
b.sql='WHERE ke=? AND status != 0 AND details NOT LIKE \'%"archived":"1"%\' AND ('+b.sql.join(' AND ')+')';
if(b.sort_by&&b.sort_by!==''){
b.sql+=' ORDER BY `'+b.sort_by+'` '+b.sort_by_direction
}
if(b.limit&&b.limit!==''){
b.sql+=' LIMIT '+b.limit
}
s.sqlQuery('SELECT * FROM Videos '+b.sql,b.ar,function(err,r){
if(r&&r[0]){
knexQuery({
action: "select",
columns: "*",
table: "Videos",
where: whereQuery,
orderBy: [b.sort_by,b.sort_by_direction.toLowerCase()],
limit: b.limit
},(err,r) => {
if(r && r[0]){
if(r.length > 0 || config.debugLog === true){
s.cx({f:'filterMatch',msg:r.length+' SQL rows match "'+m+'"',ke:v.ke,time:moment()})
}
@ -309,10 +439,19 @@ const deleteRowsWithNoVideo = function(v,callback){
)
){
s.alreadyDeletedRowsWithNoVideosOnStart[v.ke]=true;
es={};
s.sqlQuery('SELECT * FROM Videos WHERE ke=? AND status!=0 AND details NOT LIKE \'%"archived":"1"%\' AND time < ?',[v.ke,s.sqlDate('10 MINUTE')],function(err,evs){
if(evs&&evs[0]){
es.del=[];es.ar=[v.ke];
knexQuery({
action: "select",
columns: "*",
table: "Videos",
where: [
['ke','=',v.ke],
['status','!=','0'],
['details','NOT LIKE','%"archived":"1"%'],
['time','<',s.sqlDate('10 MINUTE')],
]
},(err,evs) => {
if(evs && evs[0]){
const videosToDelete = [];
evs.forEach(function(ev){
var filename
var details
@ -337,8 +476,8 @@ const deleteRowsWithNoVideo = function(v,callback){
s.tx({f:'video_delete',filename:filename+'.'+ev.ext,mid:ev.mid,ke:ev.ke,time:ev.time,end:s.moment(new Date,'YYYY-MM-DD HH:mm:ss')},'GRP_'+ev.ke);
}
});
if(es.del.length>0 || config.debugLog === true){
s.cx({f:'deleteNoVideo',msg:es.del.length+' SQL rows with no file deleted',ke:v.ke,time:moment()})
if(videosToDelete.length>0 || config.debugLog === true){
s.cx({f:'deleteNoVideo',msg:videosToDelete.length+' SQL rows with no file deleted',ke:v.ke,time:moment()})
}
}
setTimeout(function(){
@ -353,7 +492,14 @@ const deleteRowsWithNoVideo = function(v,callback){
const deleteOldLogs = function(v,callback){
if(!v.d.log_days||v.d.log_days==''){v.d.log_days=10}else{v.d.log_days=parseFloat(v.d.log_days)};
if(config.cron.deleteLogs===true&&v.d.log_days!==0){
s.sqlQuery("DELETE FROM Logs WHERE ke=? AND `time` < ?",[v.ke,s.sqlDate(v.d.log_days+' DAY')],function(err,rrr){
knexQuery({
action: "delete",
table: "Logs",
where: [
['ke','=',v.ke],
['time','<',s.sqlDate(v.d.log_days+' DAY')],
]
},(err,rrr) => {
callback()
if(err)return console.error(err);
if(rrr.affectedRows && rrr.affectedRows.length>0 || config.debugLog === true){
@ -368,10 +514,39 @@ const deleteOldLogs = function(v,callback){
const deleteOldEvents = function(v,callback){
if(!v.d.event_days||v.d.event_days==''){v.d.event_days=10}else{v.d.event_days=parseFloat(v.d.event_days)};
if(config.cron.deleteEvents===true&&v.d.event_days!==0){
s.sqlQuery("DELETE FROM Events WHERE ke=? AND `time` < ?",[v.ke,s.sqlDate(v.d.event_days+' DAY')],function(err,rrr){
knexQuery({
action: "delete",
table: "Events",
where: [
['ke','=',v.ke],
['time','<',s.sqlDate(v.d.event_days+' DAY')],
]
},(err,rrr) => {
callback()
if(err)return console.error(err);
if(rrr.affectedRows && rrr.affectedRows.length>0 || config.debugLog === true){
if(rrr.affectedRows && rrr.affectedRows.length > 0 || config.debugLog === true){
s.cx({f:'deleteEvents',msg:(rrr.affectedRows || 0)+' SQL rows older than '+v.d.event_days+' days deleted',ke:v.ke,time:moment()})
}
})
}else{
callback()
}
}
//event counts
const deleteOldEventCounts = function(v,callback){
if(!v.d.event_days||v.d.event_days==''){v.d.event_days=10}else{v.d.event_days=parseFloat(v.d.event_days)};
if(config.cron.deleteEvents===true&&v.d.event_days!==0){
knexQuery({
action: "delete",
table: "Events Counts",
where: [
['ke','=',v.ke],
['time','<',s.sqlDate(v.d.event_days+' DAY')],
]
},(err,rrr) => {
callback()
if(err && err.code !== 'ER_NO_SUCH_TABLE')return console.error(err);
if(rrr.affectedRows && rrr.affectedRows.length > 0 || config.debugLog === true){
s.cx({f:'deleteEvents',msg:(rrr.affectedRows || 0)+' SQL rows older than '+v.d.event_days+' days deleted',ke:v.ke,time:moment()})
}
})
@ -384,7 +559,15 @@ const deleteOldFileBins = function(v,callback){
if(!v.d.fileBin_days||v.d.fileBin_days==''){v.d.fileBin_days=10}else{v.d.fileBin_days=parseFloat(v.d.fileBin_days)};
if(config.cron.deleteFileBins===true&&v.d.fileBin_days!==0){
var fileBinQuery = " FROM Files WHERE ke=? AND `time` < ?";
s.sqlQuery("SELECT *"+fileBinQuery,[v.ke,s.sqlDate(v.d.fileBin_days+' DAY')],function(err,files){
knexQuery({
action: "select",
columns: "*",
table: "Files",
where: [
['ke','=',v.ke],
['time','<',s.sqlDate(v.d.fileBin_days+' DAY')],
]
},(err,files) => {
if(files&&files[0]){
//delete the files
files.forEach(function(file){
@ -393,7 +576,14 @@ const deleteOldFileBins = function(v,callback){
})
})
//delete the database rows
s.sqlQuery("DELETE"+fileBinQuery,[v.ke,v.d.fileBin_days],function(err,rrr){
knexQuery({
action: "delete",
table: "Files",
where: [
['ke','=',v.ke],
['time','<',s.sqlDate(v.d.fileBin_days+' DAY')],
]
},(err,rrr) => {
callback()
if(err)return console.error(err);
if(rrr.affectedRows && rrr.affectedRows.length>0 || config.debugLog === true){
@ -436,7 +626,14 @@ const processUser = function(number,rows){
if(!v.d.size||v.d.size==''){v.d.size=10000}else{v.d.size=parseFloat(v.d.size)};
//days to keep videos
if(!v.d.days||v.d.days==''){v.d.days=5}else{v.d.days=parseFloat(v.d.days)};
s.sqlQuery('SELECT * FROM Monitors WHERE ke=?', [v.ke], function(err,rr) {
knexQuery({
action: "select",
columns: "*",
table: "Monitors",
where: [
['ke','=',v.ke],
]
},(err,rr) => {
if(!v.d.filters||v.d.filters==''){
v.d.filters={};
}
@ -474,14 +671,17 @@ const processUser = function(number,rows){
s.debugLog('--- deleteOldFileBins Complete')
deleteOldEvents(v,function(){
s.debugLog('--- deleteOldEvents Complete')
checkFilterRules(v,function(){
s.debugLog('--- checkFilterRules Complete')
deleteRowsWithNoVideo(v,function(){
s.debugLog('--- deleteRowsWithNoVideo Complete')
checkForOrphanedFiles(v,function(){
//done user, unlock current, and do next
overlapLocks[v.ke]=false;
processUser(number+1,rows)
deleteOldEventCounts(v,function(){
s.debugLog('--- deleteOldEventCounts Complete')
checkFilterRules(v,function(){
s.debugLog('--- checkFilterRules Complete')
deleteRowsWithNoVideo(v,function(){
s.debugLog('--- deleteRowsWithNoVideo Complete')
checkForOrphanedFiles(v,function(){
//done user, unlock current, and do next
overlapLocks[v.ke]=false;
processUser(number+1,rows)
})
})
})
})
@ -504,7 +704,14 @@ const clearCronInterval = function(){
}
const doCronJobs = function(){
s.cx({f:'start',time:moment()})
s.sqlQuery('SELECT ke,uid,details,mail FROM Users WHERE details NOT LIKE \'%"sub"%\'', function(err,rows) {
knexQuery({
action: "select",
columns: "ke,uid,details,mail",
table: "Users",
where: [
['details','NOT LIKE','%"sub"%'],
]
},(err,rows) => {
if(err){
console.error(err)
}


@ -16,7 +16,7 @@ module.exports = function(s,config,lang){
"field": lang.Mode,
"fieldType": "select",
"description": "This is the primary task of the monitor.",
"default": "stop",
"default": "start",
"example": "",
"selector": "h_m",
"possible": [
@ -52,7 +52,7 @@ module.exports = function(s,config,lang){
"name": "name",
"field": lang.Name,
"description": "This is the human-readable display name for the monitor.",
"example": "Bunny"
"example": "Home-Front"
},
{
"name": "detail=max_keep_days",
@ -75,6 +75,41 @@ module.exports = function(s,config,lang){
}
]
},
"Presets": {
id: "monSectionPresets",
"name": lang.Presets,
"color": "purple",
isSection: true,
"info": [
{
"name": lang['Add New'],
"color": "grey",
isFormGroupGroup: true,
"info": [
{
"id": "monitorPresetsName",
"field": lang['Preset Name'],
},
{
"fieldType": "btn",
"class": `btn-success add-new`,
"btnContent": `<i class="fa fa-plus"></i> &nbsp; ${lang['Add']}`,
},
]
},
{
"fieldType": 'ul',
"id": "monitorPresetsSelection",
"class": "mdl-list"
},
{
"fieldType": "btn",
"attribute": `data-toggle="modal" data-target="#schedules"`,
"class": `btn-info`,
"btnContent": `<i class="fa fa-clock-o"></i> &nbsp; ${lang['Schedules']}`,
},
],
},
"Connection": {
"name": lang.Connection,
"color": "orange",
@ -163,8 +198,8 @@ module.exports = function(s,config,lang){
"field": lang.Automatic,
"description": "Feed the individual pieces required to build a stream URL or provide the full URL and allow Shinobi to parse it for you.",
"selector": "h_auto_host",
"form-group-class": "h_t_input h_t_h264 h_t_hls h_t_mp4 h_t_jpeg h_t_mjpeg",
"form-group-class-pre-layer":"h_t_input h_t_h264 h_t_hls h_t_mp4 h_t_jpeg h_t_mjpeg h_t_local",
"form-group-class": "h_t_input h_t_h264 h_t_hls h_t_mp4 h_t_jpeg h_t_mjpeg h_t_mxpeg",
"form-group-class-pre-layer":"h_t_input h_t_h264 h_t_hls h_t_mp4 h_t_jpeg h_t_mjpeg h_t_mxpeg h_t_local",
"default": "",
"example": "",
"fieldType": "select",
@ -342,7 +377,7 @@ module.exports = function(s,config,lang){
"name": "detail=fatal_max",
"field": lang['Retry Connection'],
"description": "The number of times to retry the network connection between the server and camera before setting the monitor to Disabled. No decimals. Set to 0 to retry forever.",
"default": "0",
"default": "10",
"example": "",
"possible": "",
"form-group-class": "h_t_input h_t_h264 h_t_hls h_t_mp4 h_t_jpeg h_t_mjpeg h_t_local",
@ -385,6 +420,25 @@ module.exports = function(s,config,lang){
}
]
},
{
"name": "detail=onvif_non_standard",
"field": lang['Non-Standard ONVIF'],
"description": "Is this a Non-Standard ONVIF camera?",
"default": "0",
"example": "",
"form-group-class": "h_onvif_input h_onvif_1",
"fieldType": "select",
"possible": [
{
"name": lang.No,
"value": "0"
},
{
"name": lang.Yes,
"value": "1"
}
]
},
{
hidden: true,
"name": "detail=onvif_port",
@ -394,6 +448,11 @@ module.exports = function(s,config,lang){
"example": "",
"form-group-class": "h_onvif_input h_onvif_1",
},
{
"fieldType": "btn",
"class": `btn-success probe_config`,
"btnContent": `<i class="fa fa-search"></i> &nbsp; ${lang.FFprobe}`,
},
]
},
"Input": {
@ -478,6 +537,25 @@ module.exports = function(s,config,lang){
"example": "25",
"possible": ""
},
{
"name": "detail=wall_clock_timestamp_ignore",
"field": lang['Use Camera Timestamps'],
"description": "Base all incoming camera data on camera time instead of server time.",
"default": "0",
"example": "",
"form-group-class": "h_t_input h_t_h264",
"fieldType": "select",
"possible": [
{
"name": lang.No,
"value": "0"
},
{
"name": lang.Yes,
"value": "1"
}
]
},
{
hidden: true,
"name": "height",
@ -1427,7 +1505,7 @@ module.exports = function(s,config,lang){
"possible": "1-23"
},
{
"name": "preset_record",
"name": "detail=preset_record",
"field": lang.Preset,
"description": "Preset flag for certain video encoders. If you find your camera is crashing every few seconds : try leaving it blank.",
"default": "",
@ -1493,7 +1571,7 @@ module.exports = function(s,config,lang){
},
{
"name": "fps",
"field": lang["Video Record Rate (FPS)"],
"field": lang["Video Record Rate"],
"description": "The speed at which frames are recorded to files, in Frames Per Second. Be aware there is no default. This can lead to large files. Best to set this camera-side.",
"default": "",
"example": "2",
@ -1787,6 +1865,18 @@ module.exports = function(s,config,lang){
"form-group-class": "h_rec_ti_input h_rec_ti_1",
"fieldType": "select",
"possible": [
{
"name": `.1 ${lang.minutes}`,
"value": "6"
},
{
"name": `.25 ${lang.minutes}`,
"value": "15"
},
{
"name": `.5 ${lang.minutes}`,
"value": "30"
},
{
"name": `5 ${lang.minutes}`,
"value": "300"
@ -1847,7 +1937,7 @@ module.exports = function(s,config,lang){
},
"Timelapse Watermark": {
"id": "monSectionRecordingWatermark",
"name": lang['Recording Watermark'],
"name": lang['Timelapse Watermark'],
"color": "red",
isAdvanced: true,
@ -1977,6 +2067,16 @@ module.exports = function(s,config,lang){
"form-group-class": "shinobi-detector",
"possible": ""
},
{
hidden: true,
"name": "detail=cust_detect_object",
"field": lang["Object Detector Flags"],
"description": "Custom flags that bind to the stream the Detector uses for analysis.",
"default": "",
"example": "",
"form-group-class": "shinobi-detector",
"possible": ""
},
{
hidden: true,
"name": "detail=cust_sip_record",
@ -2076,6 +2176,36 @@ module.exports = function(s,config,lang){
}
]
},
{
hidden: true,
"form-group-class": "h_det_input h_det_1",
"name": "detail=detector_fps",
"field": lang["Detector Rate"],
"description": "How many frames per second to send to the motion detector; 2 is the default.",
"default": "2",
"example": "",
"possible": ""
},
{
hidden: true,
"form-group-class": "h_det_input h_det_1",
"name": "detail=detector_scale_x",
"field": lang["Feed-in Image Width"],
"description": "Width of the image being detected. Smaller sizes take less CPU.",
"default": "",
"example": "640",
"possible": ""
},
{
hidden: true,
"form-group-class": "h_det_input h_det_1",
"name": "detail=detector_scale_y",
"field": lang["Feed-in Image Height"],
"description": "Height of the image being detected. Smaller sizes take less CPU.",
"default": "",
"example": "480",
"possible": ""
},
{
hidden: true,
"name": "detail=detector_lock_timeout",
@ -2106,36 +2236,6 @@ module.exports = function(s,config,lang){
}
]
},
{
hidden: true,
"name": "detail=detector_fps",
"field": lang["Detector Rate"],
"description": "How many frames per second to send to the motion detector; 2 is the default.",
"default": "2",
"example": "",
"form-group-class": "h_det_input h_det_1",
"possible": ""
},
{
hidden: true,
"name": "detail=detector_scale_x",
"field": lang["Feed-in Image Width"],
"description": "Width of the image being detected. Smaller sizes take less CPU.",
"default": "",
"example": "640",
"form-group-class": "h_det_input h_det_1",
"possible": ""
},
{
hidden: true,
"name": "detail=detector_scale_y",
"field": lang["Feed-in Image Height"],
"description": "Height of the image being detected. Smaller sizes take less CPU.",
"default": "",
"example": "480",
"form-group-class": "h_det_input h_det_1",
"possible": ""
},
{
hidden: true,
"name": "detail=detector_record_method",
@ -2294,7 +2394,7 @@ module.exports = function(s,config,lang){
"class": "mdl-list"
},
{
hidden: true,
hidden: true,
"name": "detail=group_detector_multi",
"field": "",
"description": "",
@ -2403,6 +2503,7 @@ module.exports = function(s,config,lang){
"name": "detail=snap_seconds_inward",
"field": lang['Delay for Snapshot'],
"description": lang['in seconds'],
"form-group-class": "h_det_input h_det_1",
"default": "0",
},
{
@ -2431,7 +2532,6 @@ module.exports = function(s,config,lang){
"description": "The amount of time until a trigger is allowed to send another email with motion details and another image.",
"default": "10",
"example": "",
"form-group-class": "h_det_email_input h_det_email_1",
"form-group-class-pre-layer": "h_det_input h_det_1",
"possible": ""
},
@ -2575,37 +2675,37 @@ module.exports = function(s,config,lang){
}
]
},
{
"name": "detail=detector_show_matrix",
"field": lang["Show Matrices"],
"description": "Outline which pixels are detected as changed in one matrix.",
"default": "0",
"example": "",
"fieldType": "select",
"form-group-class": "h_det_pam_input h_det_pam_1",
"possible": [
{
"name": lang.No,
"value": "0"
},
{
"name": lang.Yes,
"value": "1"
}
]
},
// {
// "name": "detail=detector_show_matrix",
// "field": lang["Show Matrices"],
// "description": "Outline which pixels are detected as changed in one matrix.",
// "default": "0",
// "example": "",
// "fieldType": "select",
// "form-group-class": "h_det_pam_input h_det_pam_1",
// "possible": [
// {
// "name": lang.No,
// "value": "0"
// },
// {
// "name": lang.Yes,
// "value": "1"
// }
// ]
// },
{
"name": "detail=detector_sensitivity",
"field": lang.Indifference,
"description": "This can mean multiple things depending on the detector used. Built-In Motion Detection defines this as \"Percentage Changed in View or Region\"",
"default": "0.5",
"field": lang['Minimum Change'],
"description": "The motion confidence rating must exceed this value to be seen as a trigger. This number correlates directly to the confidence rating returned by the motion detector. This option was previously named \"Indifference\".",
"default": "10",
"example": "10",
"possible": ""
},
{
"name": "detail=detector_max_sensitivity",
"field": lang["Max Indifference"],
"description": "An upperbound to indifference. Any value over this amount will be ignored.",
"field": lang["Maximum Change"],
"description": "The motion confidence rating must be lower than this value to be seen as a trigger. Leave blank for no maximum. This option was previously named \"Max Indifference\".",
"default": "",
"example": "75",
"possible": ""
@ -2629,7 +2729,7 @@ module.exports = function(s,config,lang){
{
"name": "detail=detector_frame",
"field": lang["Full Frame Detection"],
"description": "This will read the entire frame for pixel differences.",
"description": "This will read the entire frame for pixel differences. This is the same as creating a region that covers the entire screen.",
"default": "1",
"example": "",
"fieldType": "select",
@ -2936,6 +3036,26 @@ module.exports = function(s,config,lang){
}
]
},
{
hidden: true,
"name": "detail=detector_obj_count_in_region",
"field": lang["Count Objects only inside Regions"],
"description": "Count Objects only inside Regions.",
"default": "0",
"example": "",
"form-group-class": "h_det_count_input h_det_count_1",
"fieldType": "select",
"possible": [
{
"name": lang.No,
"value": "0"
},
{
"name": lang.Yes,
"value": "1"
}
]
},
{
"name": "detail=detector_obj_region",
"field": lang['Require Object to be in Region'],
@ -2978,12 +3098,9 @@ module.exports = function(s,config,lang){
"name": "detail=detector_fps_object",
"field": lang['Frame Rate'],
"description": "",
"default": "1",
"default": "2",
"example": "",
"form-group-class": "h_det_mot_fir_input h_det_mot_fir_1",
"form-group-class-pre-layer": "h_det_pam_input h_det_pam_1",
"fieldType": "number",
"numberMin": "1",
"form-group-class": "h_casc_input h_casc_1",
"possible": ""
},
{
@ -2993,8 +3110,7 @@ module.exports = function(s,config,lang){
"description": "",
"default": "",
"example": "",
"form-group-class": "h_det_mot_fir_input h_det_mot_fir_1",
"form-group-class-pre-layer": "h_det_pam_input h_det_pam_1",
"form-group-class": "h_casc_input h_casc_1",
"fieldType": "number",
"numberMin": "1",
"possible": ""
@ -3006,8 +3122,7 @@ module.exports = function(s,config,lang){
"description": "",
"default": "",
"example": "",
"form-group-class": "h_det_mot_fir_input h_det_mot_fir_1",
"form-group-class-pre-layer": "h_det_pam_input h_det_pam_1",
"form-group-class": "h_casc_input h_casc_1",
"fieldType": "number",
"numberMin": "1",
"possible": ""
@ -3324,6 +3439,53 @@ module.exports = function(s,config,lang){
"form-group-class": "h_cs_input h_cs_1",
"possible": ""
},
{
"name": "detail=detector_ptz_follow",
"field": lang['PTZ Tracking'],
"description": "Follow the largest detected object with PTZ? Requires an Object Detector running or matrices provided with events.",
"default": "0",
"example": "",
"selector": "h_det_tracking",
"fieldType": "select",
"possible": [
{
"name": lang.No,
"value": "0"
},
{
"name": lang.Yes,
"value": "1"
}
]
},
{
"name": "detail=detector_ptz_follow_target",
"field": lang['PTZ Tracking Target'],
"description": "",
"default": "person",
"example": "",
"form-group-class": "h_det_tracking_input h_det_tracking_1",
"possible": ""
},
{
"name": "detail=detector_obj_count",
"field": lang["Count Objects"],
"description": "Count detected objects.",
"default": "0",
"example": "",
"selector": "h_det_count",
"fieldType": "select",
"possible": [
{
"name": lang.No,
"value": "0"
},
{
"name": lang.Yes,
"value": "1"
}
]
},
{
"name": "detail=control_url_center",
"field": lang['Center'],
@ -3465,6 +3627,24 @@ module.exports = function(s,config,lang){
"form-group-class": "h_control_call_input h_control_call_GET h_control_call_PUT h_control_call_POST",
"possible": ""
},
{
"name": "detail=control_invert_y",
"field": lang["Invert Y-Axis"],
"description": "For when your camera is mounted upside down or uses inverted vertical controls.",
"default": "0",
"example": "",
"fieldType": "select",
"possible": [
{
"name": lang.No,
"value": "0"
},
{
"name": lang.Yes,
"value": "1"
}
]
},
]
},
"Grouping": {
@ -3489,7 +3669,6 @@ module.exports = function(s,config,lang){
"Copy Settings": {
id: "monSectionCopying",
"name": lang['Copy Settings'],
"color": "orange",
isSection: true,
"info": [
@ -3851,6 +4030,43 @@ module.exports = function(s,config,lang){
"Account Settings": {
"section": "Account Settings",
"blocks": {
"ShinobiHub": {
"evaluation": "!details.sub && details.use_shinobihub !== '0'",
"name": lang["ShinobiHub"],
"color": "purple",
"info": [
{
"name": "detail=shinobihub",
"selector":"autosave_shinobihub",
"field": lang.Autosave,
"description": "",
"default": "0",
"example": "",
"fieldType": "select",
"possible": [
{
"name": lang.No,
"value": "0"
},
{
"name": lang.Yes,
"value": "1"
}
]
},
{
"hidden": true,
"field": lang['API Key'],
"name": "detail=shinobihub_key",
"placeholder": "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX",
"form-group-class": "autosave_shinobihub_input autosave_shinobihub_1",
"description": "",
"default": "",
"example": "",
"possible": ""
},
]
},
"2-Factor Authentication": {
"name": lang['2-Factor Authentication'],
"color": "grey",
@ -3910,7 +4126,7 @@ module.exports = function(s,config,lang){
}
],
"form-group-class": "u_discord_bot_input u_discord_bot_1"
}
},
]
},
"Profile": {
@ -19,15 +19,41 @@
"Remember Me": "Remember Me",
"Process Already Running": "Process Already Running",
"Process Not Running": "Process Not Running",
"ShinobiHub": "ShinobiHub",
"Oldest": "Oldest",
"Newest": "Newest",
"Title": "Title",
"Subtitle": "Subtitle",
"Date Added": "Date Added",
"Date Updated": "Date Updated",
"Public on ShinobiHub": "Public on ShinobiHub",
"Uploaded Only": "Uploaded Only",
"Play": "Play",
"Pause": "Pause",
"RAM": "RAM",
"CPU": "CPU",
"on": "on",
"OAuth Credentials": "OAuth Credentials",
"Token": "Token",
"OAuth Code": "OAuth Code",
"Google Drive": "Google Drive",
"Invert Y-Axis": "Invert Y-Axis",
"Get Code": "Get Code",
"PTZ Tracking": "PTZ Tracking",
"PTZ Tracking Target": "PTZ Tracking Target",
"Event Counts": "Event Counts",
"Already Processing": "Already Processing",
"No API Key": "No API Key",
"Use Camera Timestamps": "Use Camera Timestamps",
"Power Viewer": "Power Viewer",
"Power Video Viewer": "Power Video Viewer",
"Time-lapse": "Time-lapse",
"Montage": "Montage",
"Open All Monitors": "Open All Monitors",
"Accounts": "Accounts",
"Settings": "Settings",
"Count Objects only inside Regions": "Count Objects only inside Regions",
"Count Objects": "Count Objects",
"Cards": "Cards",
"No Region": "No Region",
"Recording FPS": "Recording FPS",
@ -63,6 +89,9 @@
"Never": "Never",
"API": "API",
"ONVIF": "ONVIF",
"Set Home": "Set Home",
"Set Home Position (ONVIF-only)": "Set Home Position (ONVIF-only)",
"Non-Standard ONVIF": "Non-Standard ONVIF",
"FFprobe": "Probe",
"Monitor States": "Monitor States",
"Schedule": "Schedule",
@ -174,6 +203,7 @@
"File Type": "File Type",
"Filesize": "Filesize",
"Video Status": "Video Status",
"Custom Auto Load": "Custom Auto Load",
"Preferences": "Preferences",
"Equal to": "Equal to",
"Not Equal to": "Not Equal to",
@ -220,10 +250,14 @@
"Video stream only from first feed": "Video stream only from first feed",
"Audio streams only": "Audio streams only",
"Audio stream only from first feed": "Audio stream only from first feed",
"sorryNothingWasFound": "Sorry, nothing was found.",
"ONVIF Port": "ONVIF Port",
"ONVIF Scanner": "ONVIF Scanner",
"ONVIFEventsNotAvailable": "ONVIF Events not Available",
"ONVIFnotCompliantProfileT": "Camera is not ONVIF Profile T Compliant",
"ONVIFErr400": "Found ONVIF port but authorization failed when retrieving the Stream URL. Check username and password used for scan.",
"ONVIFErr405": "Method Not Allowed. Check username and password used for scan.",
"ONVIFErr404": "Not Found. This may just be the web panel for a network device.",
"Scan Settings": "Scan Settings",
"ONVIFnote": "Discover ONVIF devices on networks outside your own or leave it blank to scan your current network. <br>Username and Password can be left blank.",
"Range or Single": "Range or Single",
@ -236,6 +270,8 @@
"Live Stream Toggle": "Live Stream Toggle",
"RegionNote": "Points are only saved when you press <b>Save</b> on the <b>Monitor Settings</b> window.",
"Points": "Points <small>When adding points click on the edge of the polygon.</small>",
"Minimum Change": "Minimum Change",
"Maximum Change": "Maximum Change",
"Indifference": "Indifference",
"Max Indifference": "Max Indifference",
"Trigger Threshold": "Trigger Threshold",
@ -450,7 +486,15 @@
"StreamText": "<p>This section will designate the primary method of streaming out and its settings. This stream will be displayed in the dashboard. If you choose to use HLS, JPEG, or MJPEG then you can consume the stream through other programs.</p><p class=\"h_st_input h_st_jpeg\">Using JPEG stream essentially turns off the primary stream and uses the snapshot bin to get frames.</p>",
"DetectorText": "<p>When the Width and Height boxes are shown you should set them to 640x480 or below. This will optimize the read speed of frames.</p>",
"RecordingText": "It is recommended that you set <b>Record File Type</b> to <b class=\"h_t_input h_t_jpeg h_t_socket\">WebM</b><b class=\"h_t_input h_t_mjpeg h_t_h264 h_t_hls h_t_mp4 h_t_local\">MP4</b> and <b>Video Codec</b> to <b class=\"h_t_input h_t_jpeg h_t_socket\">libvpx</b><b class=\"h_t_input h_t_h264 h_t_hls h_t_mp4\">copy or </b><b class=\"h_t_input h_t_mjpeg h_t_h264 h_t_hls h_t_mp4 h_t_local\">libx264</b> because your <b>Input Type</b> is set to <b class=\"h_t_text\"></b>.",
"'Already Installing...'": "'Already Installing...'",
"Time Created": "Time Created",
"Last Modified": "Last Modified",
"Mode": "Mode",
"Run Installer": "Run Installer",
"Install": "Install",
"Enable": "Enable",
"Disable": "Disable",
"Delete": "Delete",
"Name": "Name",
"Skip Ping": "Skip Ping",
"Retry Connection": "Retry Connection <small>Number of times allowed to fail</small>",
@ -484,6 +528,7 @@
"Width": "Width",
"Height": "Height",
"Rotate": "Rotate",
"Trigger Event": "Trigger Event",
"Primary Engine": "Primary Engine",
"Video Filter": "Video Filter",
"Font Path": "Font Path",
@ -492,7 +537,7 @@
"Text Box Color": "Text Box Color",
"Position X": "Position X",
"Position Y": "Position Y",
"Image Location": "Image Location <small>Absolute Path or leave blank to use global</small>",
"Image Location": "Image Location",
"Image Position": "Image Position",
"Frame Rate": "Frame Rate",
"Image Width": "Image Width",
@ -501,9 +546,14 @@
"Notification Video Length": "Notification Video Length",
"Video Codec": "Video Codec",
"Delete Monitor States Preset": "Delete Monitor States Preset",
"Delete Schedule": "Delete Schedule",
"Delete Monitor State?": "Delete Monitor State",
"deleteMonitorStateText1": "Do you want to delete this Monitor States Preset?",
"deleteScheduleText": "Do you want to delete this Schedule? Associated Monitor Presets will not be modified.",
"deleteMonitorStateText1": "Do you want to delete this Monitor States Preset? The monitor configurations associated cannot be recovered.",
"deleteMonitorStateText2": "Do you want to delete this Monitor's Preset?",
"undoAllUnsaveChanges": "Are you sure you want to do this? This will undo all unsaved changes.",
"monitorStatesError": "Monitor Presets Error",
"monitorStateNotEnoughChanges": "You need to make a change in your monitor configuration before attempting to add it to a Preset.",
"Search Images": "Search Images",
"Launch in New Window": "Launch in New Window",
"Preset": "Preset",
@ -512,7 +562,7 @@
"sizePurgeLockedText": "The Size Purge Lock (deleteOverMax) appears to have failed to unlock. Unlocking now...",
"Use coProcessor": "Use coProcessor",
"Audio Codec": "Audio Codec",
"Video Record Rate": "Video Record Rate <small>(FPS)</small>",
"Video Record Rate": "Video Record Rate",
"Record Width": "Record Width",
"Record Height": "Record Height",
"Double Quote Directory": "Double Quote Directory <small>Some directories have spaces. Using this may crash some cameras.</small>",
@ -520,6 +570,7 @@
"Record Video Filter": "Record Video Filter",
"Input Flags": "Input Flags",
"Snapshot Flags": "Snapshot Flags",
"Object Detector Flags": "Object Detector Flags",
"Detector Flags": "Detector Flags",
"Stream Flags": "Stream Flags",
"Stream to YouTube": "Stream to YouTube",
@ -540,6 +591,7 @@
"Attach Video Clip": "Attach Video Clip",
"Error While Decoding": "Error While Decoding",
"ErrorWhileDecodingText": "Your hardware may have an unstable connection to the network. Check your network connections.",
"ErrorWhileDecodingTextAudio": "Your camera is providing broken data. Try disabling the Audio in the camera's internal settings.",
"Discord": "Discord",
"Discord Alert on Trigger": "Discord Alert on Trigger",
"Allow Next Email": "Allow Next Email <small>in Minutes</small>",
@ -729,7 +781,7 @@
"Event": "Event",
"CPU used by this stream": "CPU used by this stream",
"Detector Buffer": "Detector Buffer",
"EventText1": "Triggered a motion event at",
"EventText1": "Triggered an event at",
"EventText2": "Could not email image, file was not accessible",
"MailError": "MAIL ERROR : Could not send email, Check conf.json. Skipping any features relying on mailing.",
"updateKeyText1": "\"updateKey\" is missing from \"conf.json\", cannot do updates this way until you add it.",
@ -786,6 +838,7 @@
"Select a Monitor": "Select a Monitor",
"Per Monitor": "Per Monitor",
"Matrices": "Matrices",
"Preset Name": "Preset Name",
"Show Matrices": "Show Matrices",
"Show Matrix": "Show Matrix",
"No Monitor ID Present in Form": "No Monitor ID Present in Form",
@ -890,6 +943,8 @@
"vda": "vda (Apple VDA Hardware Acceleration)",
"videotoolbox": "videotoolbox",
"cuvid": "cuvid (NVIDIA NVENC)",
"cuda": "cuda (NVIDIA NVENC)",
"opencl": "OpenCL",
"Main": "Main",
"Storage Location": "Storage Location",
"Recommended": "Recommended",
@ -933,6 +988,7 @@
"Automatic":"Automatic",
"Max Latency":"Max Latency",
"Loop Stream":"Loop Stream",
"Object Count":"Object Count",
"Object Tag":"Object Tag",
"Noise Filter":"Noise Filter",
"Noise Filter Range":"Noise Filter Range",
@ -942,6 +998,8 @@
"TV Channel Group":"TV Channel Group",
"Emotion Average":"Emotion Average",
"Require Object to be in Region":"Require Object to be in Region",
"Numeric criteria unsupported for Region tests, Ignoring Conditional":"Numeric criteria unsupported for Region tests, Ignoring Conditional",
"Text criteria unsupported for Object Count tests, Ignoring Conditional":"Text criteria unsupported for Object Count tests, Ignoring Conditional",
"Show Regions of Interest":"Show Regions of Interest",
"Confidence of Detection":"Confidence of Detection",
"Edit Selected":"Edit Selected",
@ -8,14 +8,30 @@ module.exports = function(s,config,lang){
//
var getUserByUid = function(params,columns,callback){
if(!columns)columns = '*'
s.sqlQuery(`SELECT ${columns} FROM Users WHERE uid=? AND ke=?`,[params.uid,params.ke],function(err,r){
s.knexQuery({
action: "select",
columns: columns,
table: "Users",
where: [
['uid','=',params.uid],
['ke','=',params.ke],
]
},(err,r) => {
if(!r)r = []
var user = r[0]
callback(err,user)
})
}
var getUserBySessionKey = function(params,callback){
s.sqlQuery('SELECT * FROM Users WHERE auth=? AND ke=?',[params.auth,params.ke],function(err,r){
s.knexQuery({
action: "select",
columns: '*',
table: "Users",
where: [
['auth','=',params.auth],
['ke','=',params.ke],
]
},(err,r) => {
if(!r)r = []
var user = r[0]
callback(err,user)
@ -23,7 +39,18 @@ module.exports = function(s,config,lang){
}
var loginWithUsernameAndPassword = function(params,columns,callback){
if(!columns)columns = '*'
s.sqlQuery(`SELECT ${columns} FROM Users WHERE mail=? AND (pass=? OR pass=?) LIMIT 1`,[params.username,params.password,s.createHash(params.password)],function(err,r){
s.knexQuery({
action: "select",
columns: columns,
table: "Users",
where: [
['mail','=',params.username],
['pass','=',params.password],
['or','mail','=',params.username],
['pass','=',s.createHash(params.password)],
],
limit: 1
},(err,r) => {
if(!r)r = []
var user = r[0]
callback(err,user)
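The `where` array above mixes plain entries with an `'or'`-prefixed entry. One plausible reading, in which `'or'` starts a new alternative group, is sketched below; it is logically equivalent to the replaced `WHERE mail=? AND (pass=? OR pass=?)`. The `whereToSql` helper is a hypothetical illustration, not the actual knex wrapper.

```javascript
// Hypothetical interpretation of 'or'-prefixed where entries: each 'or'
// begins a new group, and groups are joined with OR. Illustration only.
function whereToSql(where) {
    const groups = [[]];
    for (const entry of where) {
        if (entry[0] === 'or') {
            // 'or' starts a new alternative group
            groups.push([entry.slice(1)]);
        } else {
            groups[groups.length - 1].push(entry);
        }
    }
    return groups
        .map(group => group.map(([col, op]) => `${col} ${op} ?`).join(' AND '))
        .map(clause => `(${clause})`)
        .join(' OR ');
}

const clause = whereToSql([
    ['mail', '=', 'user'],
    ['pass', '=', 'plain'],
    ['or', 'mail', '=', 'user'],
    ['pass', '=', 'hashed'],
]);
// clause → (mail = ? AND pass = ?) OR (mail = ? AND pass = ?)
```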
@ -31,7 +58,15 @@ module.exports = function(s,config,lang){
}
var getApiKey = function(params,columns,callback){
if(!columns)columns = '*'
s.sqlQuery(`SELECT ${columns} FROM API WHERE code=? AND ke=?`,[params.auth,params.ke],function(err,r){
s.knexQuery({
action: "select",
columns: columns,
table: "API",
where: [
['code','=',params.auth],
['ke','=',params.ke],
]
},(err,r) => {
if(!r)r = []
var apiKey = r[0]
callback(err,apiKey)
@ -135,12 +170,12 @@ module.exports = function(s,config,lang){
activeSession &&
(
activeSession.ip.indexOf('0.0.0.0') > -1 ||
activeSession.ip.indexOf(params.ip) > -1
params.ip.indexOf(activeSession.ip) > -1
)
){
if(!user.lang){
var details = s.parseJSON(user.details).lang
user.lang = s.getDefinitonFile(user.details.lang) || s.copySystemDefaultLanguage()
user.lang = s.getLanguageFile(user.details.lang) || s.copySystemDefaultLanguage()
}
onSuccessComplete(user)
}else{
@ -226,7 +261,14 @@ module.exports = function(s,config,lang){
}
var foundUser = function(){
if(params.users === true){
s.sqlQuery('SELECT * FROM Users WHERE details NOT LIKE ?',['%"sub"%'],function(err,r) {
s.knexQuery({
action: "select",
columns: "*",
table: "Users",
where: [
['details','NOT LIKE','%"sub"%'],
]
},(err,r) => {
adminUsersSelected = r
success()
})
@ -52,6 +52,11 @@ module.exports = function(s,config){
}
return json
}
s.stringContains = function(find,string,toLowerCase){
var newString = string + ''
if(toLowerCase)newString = newString.toLowerCase()
return newString.indexOf(find) > -1
}
s.addUserPassToUrl = function(url,user,pass){
var splitted = url.split('://')
splitted[1] = user + ':' + pass + '@' + splitted[1]
@ -64,6 +69,28 @@ module.exports = function(s,config){
}
return x.replace('__DIR__',s.mainDirectory)
}
s.mergeDeep = function(...objects) {
const isObject = obj => obj && typeof obj === 'object';
return objects.reduce((prev, obj) => {
Object.keys(obj).forEach(key => {
const pVal = prev[key];
const oVal = obj[key];
if (Array.isArray(pVal) && Array.isArray(oVal)) {
prev[key] = pVal.concat(...oVal);
}
else if (isObject(pVal) && isObject(oVal)) {
prev[key] = s.mergeDeep(pVal, oVal);
}
else {
prev[key] = oVal;
}
});
return prev;
}, {});
}
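The new `s.mergeDeep` recursively merges plain objects, concatenates arrays, and lets later values win for scalars. A standalone copy of the same logic with a usage example:

```javascript
// Standalone copy of the mergeDeep logic above, for illustration.
function mergeDeep(...objects) {
    const isObject = obj => obj && typeof obj === 'object';
    return objects.reduce((prev, obj) => {
        Object.keys(obj).forEach(key => {
            const pVal = prev[key];
            const oVal = obj[key];
            if (Array.isArray(pVal) && Array.isArray(oVal)) {
                // arrays are concatenated, not replaced
                prev[key] = pVal.concat(...oVal);
            } else if (isObject(pVal) && isObject(oVal)) {
                // nested objects are merged recursively
                prev[key] = mergeDeep(pVal, oVal);
            } else {
                // scalars: the later object wins
                prev[key] = oVal;
            }
        });
        return prev;
    }, {});
}

const merged = mergeDeep(
    { details: { fps: 2, tags: ['a'] } },
    { details: { width: 640, tags: ['b'] } }
);
// merged → { details: { fps: 2, tags: ['a', 'b'], width: 640 } }
```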
s.md5 = function(x){return crypto.createHash('md5').update(x).digest("hex")}
s.createHash = s.md5
switch(config.passwordType){
@ -187,20 +214,27 @@ module.exports = function(s,config){
if(!e){e=''}
if(config.systemLog===true){
if(typeof q==='string'&&s.databaseEngine){
s.sqlQuery('INSERT INTO Logs (ke,mid,info) VALUES (?,?,?)',['$','$SYSTEM',s.s({type:q,msg:w})]);
s.knexQuery({
action: "insert",
table: "Logs",
insert: {
ke: '$',
mid: '$SYSTEM',
info: s.s({type:q,msg:w}),
}
})
s.tx({f:'log',log:{time:s.timeObject(),ke:'$',mid:'$SYSTEM',info:s.s({type:q,msg:w})}},'$');
}
return console.log(s.timeObject().format(),q,w,e)
}
}
//system log
s.debugLog = function(q,w,e){
s.debugLog = function(...args){
if(config.debugLog === true){
if(!w){w = ''}
if(!e){e = ''}
console.log(s.timeObject().format(),q,w,e)
var logRow = ([s.timeObject().format()]).concat(...args)
console.log(...logRow)
if(config.debugLogVerbose === true){
console.log(new Error())
console.log(new Error('VERBOSE STACK TRACE, THIS IS NOT AN ERROR'))
}
}
}
@ -221,15 +255,32 @@ module.exports = function(s,config){
break;
case'delete':
if(!e){return false;}
return exec('rm -f '+e,{detached: true},function(err){
if(callback)callback(err)
fs.unlink(e,(err)=>{
if(err){
s.debugLog(err)
}
if(s.isWin){
exec('rd /s /q "' + e + '"',{detached: true},function(err){
if(callback)callback(err)
})
}else{
exec('rm -rf '+e,{detached: true},function(err){
if(callback)callback(err)
})
}
})
break;
case'deleteFolder':
if(!e){return false;}
exec('rm -rf '+e,{detached: true},function(err){
if(callback)callback(err)
})
if(s.isWin){
exec('rd /s /q "' + e + '"',{detached: true},function(err){
if(callback)callback(err)
})
}else{
exec('rm -rf '+e,{detached: true},function(err){
if(callback)callback(err)
})
}
break;
case'deleteFiles':
if(!e.age_type){e.age_type='min'};if(!e.age){e.age='1'};
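The hunk above replaces the POSIX-only `rm -f`/`rm -rf` calls with a platform check, using `rd /s /q` on Windows. A minimal sketch of the same branching; `buildRemoveCommand` is a hypothetical helper name, not part of Shinobi, and only mirrors the two `exec()` branches.

```javascript
// Hypothetical helper mirroring the cross-platform removal pattern above.
function buildRemoveCommand(target, isWin) {
    return isWin
        ? `rd /s /q "${target}"` // Windows: remove directory tree, quoted for spaces
        : `rm -rf ${target}`;    // POSIX: recursive force remove
}

// In the real code the result would be passed to child_process.exec
// with { detached: true } and the callback forwarded the error.
```

Note the POSIX branch leaves the path unquoted, matching the diff, so paths containing spaces would still need separate quoting.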
@ -283,7 +334,7 @@ module.exports = function(s,config){
}
s.kilobyteToMegabyte = function(kb,places){
if(!places)places = 2
return (kb/1000000).toFixed(places)
return (kb/1048576).toFixed(places)
}
Object.defineProperty(Array.prototype, 'chunk', {
value: function(chunkSize){
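The divisor change above swaps the decimal 1000000 for 1048576, which is 1024 × 1024, i.e. a binary rather than decimal unit conversion. A standalone copy of the corrected function for reference:

```javascript
// Standalone copy of the corrected conversion. 1048576 = 1024 * 1024,
// so this is a two-step binary conversion (e.g. bytes → MiB, or KiB → GiB)
// rather than the old decimal divide by 1000000.
function kilobyteToMegabyte(kb, places) {
    if (!places) places = 2;
    return (kb / 1048576).toFixed(places);
}

// kilobyteToMegabyte(1048576) → "1.00"
// kilobyteToMegabyte(524288)  → "0.50"
```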
@ -1,46 +1,64 @@
// Matrix In Region Libs >
var SAT = require('sat')
var V = SAT.Vector;
var P = SAT.Polygon;
// Matrix In Region Libs />
var P2P = require('pipe2pam')
// pamDiff is based on https://www.npmjs.com/package/pam-diff
var PamDiff = require('pam-diff')
module.exports = function(s,config){
s.createPamDiffEngine = function(e){
module.exports = function(jsonData,pamDiffResponder){
var noiseFilterArray = {};
const groupKey = jsonData.rawMonitorConfig.ke
const monitorId = jsonData.rawMonitorConfig.mid
const triggerTimer = {}
var pamDiff
var p2p
var writeToStderr = function(text){
try{
stdioWriters[2].write(Buffer.from(`${text}`, 'utf8' ))
// stdioWriters[2].write(Buffer.from(`${new Error('writeToStderr').stack}`, 'utf8' ))
}catch(err){
fs.appendFileSync('/home/Shinobi/test.log',text + '\n','utf8')
}
}
if(typeof pamDiffResponder === 'function'){
var sendDetectedData = function(detectorObject){
pamDiffResponder(detectorObject)
}
}else{
var sendDetectedData = function(detectorObject){
pamDiffResponder.write(Buffer.from(JSON.stringify(detectorObject)))
}
}
createPamDiffEngine = function(){
var width,
height,
globalSensitivity,
globalColorThreshold,
fullFrame = false
if(s.group[e.ke].rawMonitorConfigurations[e.id].details.detector_scale_x===''||s.group[e.ke].rawMonitorConfigurations[e.id].details.detector_scale_y===''){
width = s.group[e.ke].rawMonitorConfigurations[e.id].details.detector_scale_x;
height = s.group[e.ke].rawMonitorConfigurations[e.id].details.detector_scale_y;
}else{
width = e.width
height = e.height
if(jsonData.rawMonitorConfig.details.detector_scale_x===''||jsonData.rawMonitorConfig.details.detector_scale_y===''){
width = jsonData.rawMonitorConfig.details.detector_scale_x;
height = jsonData.rawMonitorConfig.details.detector_scale_y;
}
if(e.details.detector_sensitivity===''){
else{
width = jsonData.rawMonitorConfig.width
height = jsonData.rawMonitorConfig.height
}
if(jsonData.rawMonitorConfig.details.detector_sensitivity===''){
globalSensitivity = 10
}else{
globalSensitivity = parseInt(e.details.detector_sensitivity)
globalSensitivity = parseInt(jsonData.rawMonitorConfig.details.detector_sensitivity)
}
if(e.details.detector_color_threshold===''){
if(jsonData.rawMonitorConfig.details.detector_color_threshold===''){
globalColorThreshold = 9
}else{
globalColorThreshold = parseInt(e.details.detector_color_threshold)
globalColorThreshold = parseInt(jsonData.rawMonitorConfig.details.detector_color_threshold)
}
globalThreshold = parseInt(e.details.detector_threshold) || 0
globalThreshold = parseInt(jsonData.rawMonitorConfig.details.detector_threshold) || 0
var regionJson
try{
regionJson = JSON.parse(s.group[e.ke].rawMonitorConfigurations[e.id].details.cords)
regionJson = JSON.parse(jsonData.rawMonitorConfig.details.cords)
}catch(err){
regionJson = s.group[e.ke].rawMonitorConfigurations[e.id].details.cords
regionJson = jsonData.rawMonitorConfig.details.cords
}
if(Object.keys(regionJson).length === 0 || e.details.detector_frame === '1'){
if(Object.keys(regionJson).length === 0 || jsonData.rawMonitorConfig.details.detector_frame === '1'){
fullFrame = {
name:'FULL_FRAME',
sensitivity:globalSensitivity,
@ -54,26 +72,25 @@ module.exports = function(s,config){
}
}
e.triggerTimer = {}
var regions = s.createPamDiffRegionArray(regionJson,globalColorThreshold,globalSensitivity,fullFrame)
var regions = createPamDiffRegionArray(regionJson,globalColorThreshold,globalSensitivity,fullFrame)
var pamDiffOptions = {
grayscale: 'luminosity',
regions : regions.forPam
}
if(e.details.detector_show_matrix==='1'){
if(jsonData.rawMonitorConfig.details.detector_show_matrix==='1'){
pamDiffOptions.response = 'bounds'
}
s.group[e.ke].activeMonitors[e.id].pamDiff = new PamDiff(pamDiffOptions);
s.group[e.ke].activeMonitors[e.id].p2p = new P2P()
pamDiff = new PamDiff(pamDiffOptions)
p2p = new P2P()
var regionArray = Object.values(regionJson)
if(config.detectorMergePamRegionTriggers === true){
if(jsonData.globalInfo.config.detectorMergePamRegionTriggers === true){
// merge pam triggers for performance boost
var buildTriggerEvent = function(trigger){
var detectorObject = {
f:'trigger',
id:e.id,
ke:e.ke,
id:monitorId,
ke:groupKey,
name:trigger.name,
details:{
plug:'built-in',
@ -82,8 +99,8 @@ module.exports = function(s,config){
confidence:trigger.percent
},
plates:[],
imgHeight:e.details.detector_scale_y,
imgWidth:e.details.detector_scale_x
imgHeight:jsonData.rawMonitorConfig.details.detector_scale_y,
imgWidth:jsonData.rawMonitorConfig.details.detector_scale_x
}
if(trigger.merged){
if(trigger.matrices)detectorObject.details.matrices = trigger.matrices
@ -91,13 +108,13 @@ module.exports = function(s,config){
var filteredCountSuccess = 0
trigger.merged.forEach(function(triggerPiece){
var region = regionArray.find(x => x.name == triggerPiece.name)
s.checkMaximumSensitivity(e, region, detectorObject, function(err1) {
s.checkTriggerThreshold(e, region, detectorObject, function(err2) {
checkMaximumSensitivity(region, detectorObject, function(err1) {
checkTriggerThreshold(region, detectorObject, function(err2) {
++filteredCount
if(!err1 && !err2)++filteredCountSuccess
if(filteredCount === trigger.merged.length && filteredCountSuccess > 0){
detectorObject.doObjectDetection = (s.isAtleatOneDetectorPluginConnected && e.details.detector_use_detect_object === '1')
s.triggerEvent(detectorObject)
detectorObject.doObjectDetection = (jsonData.globalInfo.isAtleatOneDetectorPluginConnected && jsonData.rawMonitorConfig.details.detector_use_detect_object === '1')
sendDetectedData(detectorObject)
}
})
})
@ -105,38 +122,36 @@ module.exports = function(s,config){
}else{
if(trigger.matrix)detectorObject.details.matrices = [trigger.matrix]
var region = regionArray.find(x => x.name == detectorObject.name)
s.checkMaximumSensitivity(e, region, detectorObject, function(err1) {
s.checkTriggerThreshold(e, region, detectorObject, function(err2) {
checkMaximumSensitivity(region, detectorObject, function(err1) {
checkTriggerThreshold(region, detectorObject, function(err2) {
if(!err1 && !err2){
detectorObject.doObjectDetection = (s.isAtleatOneDetectorPluginConnected && e.details.detector_use_detect_object === '1')
s.triggerEvent(detectorObject)
detectorObject.doObjectDetection = (jsonData.globalInfo.isAtleatOneDetectorPluginConnected && jsonData.rawMonitorConfig.details.detector_use_detect_object === '1')
sendDetectedData(detectorObject)
}
})
})
}
}
if(e.details.detector_noise_filter==='1'){
if(!s.group[e.ke].activeMonitors[e.id].noiseFilterArray)s.group[e.ke].activeMonitors[e.id].noiseFilterArray = {}
var noiseFilterArray = s.group[e.ke].activeMonitors[e.id].noiseFilterArray
if(jsonData.rawMonitorConfig.details.detector_noise_filter==='1'){
Object.keys(regions.notForPam).forEach(function(name){
if(!noiseFilterArray[name])noiseFilterArray[name]=[];
})
s.group[e.ke].activeMonitors[e.id].pamDiff.on('diff', (data) => {
pamDiff.on('diff', (data) => {
var filteredCount = 0
var filteredCountSuccess = 0
data.trigger.forEach(function(trigger){
s.filterTheNoise(e,noiseFilterArray,regions,trigger,function(err){
filterTheNoise(noiseFilterArray,regions,trigger,function(err){
++filteredCount
if(!err)++filteredCountSuccess
if(filteredCount === data.trigger.length && filteredCountSuccess > 0){
buildTriggerEvent(s.mergePamTriggers(data))
buildTriggerEvent(mergePamTriggers(data))
}
})
})
})
}else{
s.group[e.ke].activeMonitors[e.id].pamDiff.on('diff', (data) => {
buildTriggerEvent(s.mergePamTriggers(data))
pamDiff.on('diff', (data) => {
buildTriggerEvent(mergePamTriggers(data))
})
}
}else{
@ -145,8 +160,8 @@ module.exports = function(s,config){
var buildTriggerEvent = function(trigger){
var detectorObject = {
f:'trigger',
id:e.id,
ke:e.ke,
id: monitorId,
ke: groupKey,
name:trigger.name,
details:{
plug:'built-in',
@ -155,38 +170,36 @@ module.exports = function(s,config){
confidence:trigger.percent
},
plates:[],
imgHeight:e.details.detector_scale_y,
imgWidth:e.details.detector_scale_x
imgHeight:jsonData.rawMonitorConfig.details.detector_scale_y,
imgWidth:jsonData.rawMonitorConfig.details.detector_scale_x
}
if(trigger.matrix)detectorObject.details.matrices = [trigger.matrix]
var region = Object.values(regionJson).find(x => x.name == detectorObject.name)
s.checkMaximumSensitivity(e, region, detectorObject, function(err1) {
s.checkTriggerThreshold(e, region, detectorObject, function(err2) {
checkMaximumSensitivity(region, detectorObject, function(err1) {
checkTriggerThreshold(region, detectorObject, function(err2) {
if(!err1 && ! err2){
detectorObject.doObjectDetection = (s.isAtleatOneDetectorPluginConnected && e.details.detector_use_detect_object === '1')
s.triggerEvent(detectorObject)
detectorObject.doObjectDetection = (jsonData.globalInfo.isAtleatOneDetectorPluginConnected && jsonData.rawMonitorConfig.details.detector_use_detect_object === '1')
sendDetectedData(detectorObject)
}
})
})
}
if(e.details.detector_noise_filter==='1'){
if(!s.group[e.ke].activeMonitors[e.id].noiseFilterArray)s.group[e.ke].activeMonitors[e.id].noiseFilterArray = {}
var noiseFilterArray = s.group[e.ke].activeMonitors[e.id].noiseFilterArray
if(jsonData.rawMonitorConfig.details.detector_noise_filter==='1'){
Object.keys(regions.notForPam).forEach(function(name){
if(!noiseFilterArray[name])noiseFilterArray[name]=[];
})
s.group[e.ke].activeMonitors[e.id].pamDiff.on('diff', (data) => {
pamDiff.on('diff', (data) => {
data.trigger.forEach(function(trigger){
s.filterTheNoise(e,noiseFilterArray,regions,trigger,function(){
s.createMatrixFromPamTrigger(trigger)
filterTheNoise(noiseFilterArray,regions,trigger,function(){
createMatrixFromPamTrigger(trigger)
buildTriggerEvent(trigger)
})
})
})
}else{
s.group[e.ke].activeMonitors[e.id].pamDiff.on('diff', (data) => {
pamDiff.on('diff', (data) => {
data.trigger.forEach(function(trigger){
s.createMatrixFromPamTrigger(trigger)
createMatrixFromPamTrigger(trigger)
buildTriggerEvent(trigger)
})
})
@ -194,7 +207,7 @@ module.exports = function(s,config){
}
}
s.createPamDiffRegionArray = function(regions,globalColorThreshold,globalSensitivity,fullFrame){
createPamDiffRegionArray = function(regions,globalColorThreshold,globalSensitivity,fullFrame){
var pamDiffCompliantArray = [],
arrayForOtherStuff = [],
json
@ -236,11 +249,11 @@ module.exports = function(s,config){
return {forPam:pamDiffCompliantArray,notForPam:arrayForOtherStuff};
}
s.filterTheNoise = function(e,noiseFilterArray,regions,trigger,callback){
filterTheNoise = function(noiseFilterArray,regions,trigger,callback){
if(noiseFilterArray[trigger.name].length > 2){
var thePreviousTriggerPercent = noiseFilterArray[trigger.name][noiseFilterArray[trigger.name].length - 1];
var triggerDifference = trigger.percent - thePreviousTriggerPercent;
var noiseRange = e.details.detector_noise_filter_range
var noiseRange = jsonData.rawMonitorConfig.details.detector_noise_filter_range
if(!noiseRange || noiseRange === ''){
noiseRange = 6
}
@ -267,48 +280,48 @@ module.exports = function(s,config){
}
}
s.checkMaximumSensitivity = function(monitor, region, detectorObject, callback) {
checkMaximumSensitivity = function(region, detectorObject, callback) {
var logName = detectorObject.id + ':' + detectorObject.name
var globalMaxSensitivity = parseInt(monitor.details.detector_max_sensitivity) || undefined
var globalMaxSensitivity = parseInt(jsonData.rawMonitorConfig.details.detector_max_sensitivity) || undefined
var maxSensitivity = parseInt(region.max_sensitivity) || globalMaxSensitivity
if (maxSensitivity === undefined || detectorObject.details.confidence <= maxSensitivity) {
callback(null)
} else {
callback(true)
if (monitor.triggerTimer[detectorObject.name] !== undefined) {
clearTimeout(monitor.triggerTimer[detectorObject.name].timeout)
monitor.triggerTimer[detectorObject.name] = undefined
if (triggerTimer[detectorObject.name] !== undefined) {
clearTimeout(triggerTimer[detectorObject.name].timeout)
triggerTimer[detectorObject.name] = undefined
}
}
}
s.checkTriggerThreshold = function(monitor, region, detectorObject, callback){
checkTriggerThreshold = function(region, detectorObject, callback){
var threshold = parseInt(region.threshold) || globalThreshold
if (threshold <= 1) {
callback(null)
} else {
if (monitor.triggerTimer[detectorObject.name] === undefined) {
monitor.triggerTimer[detectorObject.name] = {
if (triggerTimer[detectorObject.name] === undefined) {
triggerTimer[detectorObject.name] = {
count : threshold,
timeout : null
}
}
if (--monitor.triggerTimer[detectorObject.name].count == 0) {
if (--triggerTimer[detectorObject.name].count == 0) {
callback(null)
clearTimeout(monitor.triggerTimer[detectorObject.name].timeout)
monitor.triggerTimer[detectorObject.name] = undefined
clearTimeout(triggerTimer[detectorObject.name].timeout)
triggerTimer[detectorObject.name] = undefined
} else {
callback(true)
var fps = parseFloat(monitor.details.detector_fps) || 2
if (monitor.triggerTimer[detectorObject.name].timeout !== null)
clearTimeout(monitor.triggerTimer[detectorObject.name].timeout)
monitor.triggerTimer[detectorObject.name].timeout = setTimeout(function() {
monitor.triggerTimer[detectorObject.name] = undefined
var fps = parseFloat(jsonData.rawMonitorConfig.details.detector_fps) || 2
if (triggerTimer[detectorObject.name].timeout !== null)
clearTimeout(triggerTimer[detectorObject.name].timeout)
triggerTimer[detectorObject.name].timeout = setTimeout(function() {
triggerTimer[detectorObject.name] = undefined
}, ((threshold+0.5) * 1000) / fps)
}
}
}
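The timeout in `checkTriggerThreshold` above gives consecutive triggers a grace window derived from the detector frame rate. A minimal sketch of that formula (the input values here are illustrative, not from the source):

```javascript
// Sketch of the grace-window formula used by checkTriggerThreshold above:
// a trigger must fire `threshold` times, and each arrival resets a timer of
// ((threshold + 0.5) * 1000) / fps milliseconds before the count is discarded.
const thresholdWindowMs = (threshold, fps) => ((threshold + 0.5) * 1000) / fps;

// With threshold 3 and the detector_fps fallback of 2 used above:
console.log(thresholdWindowMs(3, 2)); // 1750 ms allowed between consecutive triggers
```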
s.mergePamTriggers = function(data){
mergePamTriggers = function(data){
if(data.trigger.length > 1){
var n = 0
var sum = 0
@ -318,7 +331,7 @@ module.exports = function(s,config){
name.push(trigger.name + ' ('+trigger.percent+'%)')
++n
sum += trigger.percent
s.createMatrixFromPamTrigger(trigger)
createMatrixFromPamTrigger(trigger)
if(trigger.matrix)matrices.push(trigger.matrix)
})
var average = sum / n
@ -332,47 +345,12 @@ module.exports = function(s,config){
}
}else{
var trigger = data.trigger[0]
s.createMatrixFromPamTrigger(trigger)
createMatrixFromPamTrigger(trigger)
trigger.matrices = [trigger.matrix]
}
return trigger
}
s.isAtleastOneMatrixInRegion = function(regions,matrices,callback){
var regionPolys = []
var matrixPoints = []
regions.forEach(function(region,n){
var polyPoints = []
region.points.forEach(function(point){
polyPoints.push(new V(parseInt(point[0]),parseInt(point[1])))
})
regionPolys[n] = new P(new V(0,0), polyPoints)
})
var collisions = []
var foundInRegion = false
matrices.forEach(function(matrix){
var matrixPoints = [
new V(matrix.x,matrix.y),
new V(matrix.width,matrix.y),
new V(matrix.width,matrix.height),
new V(matrix.x,matrix.height)
]
var matrixPoly = new P(new V(0,0), matrixPoints)
regionPolys.forEach(function(region,n){
var response = new SAT.Response()
var collided = SAT.testPolygonPolygon(matrixPoly, region, response)
if(collided === true){
collisions.push({
matrix: matrix,
region: regions[n]
})
foundInRegion = true
}
})
})
if(callback)callback(foundInRegion,collisions)
return foundInRegion
}
s.createMatrixFromPamTrigger = function(trigger){
createMatrixFromPamTrigger = function(trigger){
if(
trigger.minX &&
trigger.maxX &&
@ -396,4 +374,15 @@ module.exports = function(s,config){
}
return trigger
}
return function(cameraProcess,fallback){
if(jsonData.rawMonitorConfig.details.detector === '1' && jsonData.rawMonitorConfig.coProcessor === false){
//frames from motion detect
if(jsonData.rawMonitorConfig.details.detector_pam === '1'){
createPamDiffEngine()
cameraProcess.stdio[3].pipe(p2p).pipe(pamDiff)
}
}
};
}


@ -0,0 +1,213 @@
const fs = require('fs')
const request = require('request')
const exec = require('child_process').exec
const spawn = require('child_process').spawn
const isWindows = (process.platform === 'win32' || process.platform === 'win64')
process.send = process.send || function () {};
var writeToStderr = function(text){
try{
process.stderr.write(Buffer.from(`${text}`, 'utf8' ))
// stdioWriters[2].write(Buffer.from(`${new Error('writeToStderr').stack}`, 'utf8' ))
}catch(err){
// fs.appendFileSync('/home/Shinobi/test.log',text + '\n','utf8')
}
}
if(!process.argv[2] || !process.argv[3]){
return writeToStderr('Missing FFMPEG Command String or no command operator')
}
var jsonData = JSON.parse(fs.readFileSync(process.argv[3],'utf8'))
const ffmpegAbsolutePath = process.argv[2].trim()
const ffmpegCommandString = jsonData.cmd
const rawMonitorConfig = jsonData.rawMonitorConfig
const stdioPipes = jsonData.pipes || []
var newPipes = []
var stdioWriters = [];
const buildMonitorUrl = function(e,noPath){
var authd = ''
var url
if(e.details.muser&&e.details.muser!==''&&e.host.indexOf('@')===-1) {
e.username = e.details.muser
e.password = e.details.mpass
authd = e.details.muser+':'+e.details.mpass+'@'
}
if(e.port==80&&e.details.port_force!=='1'){e.porty=''}else{e.porty=':'+e.port}
url = e.protocol+'://'+authd+e.host+e.porty
if(noPath !== true)url += e.path
return url
}
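`buildMonitorUrl` above assembles the camera connection URL from the monitor record, embedding credentials and dropping the default HTTP port. A standalone sketch of the same assembly logic with a hypothetical monitor object (the camera values are made up for illustration):

```javascript
// Standalone sketch of buildMonitorUrl's URL assembly (hypothetical camera values).
const buildMonitorUrl = function(e, noPath){
    var authd = ''
    // embed credentials only when muser is set and the host has none already
    if(e.details.muser && e.details.muser !== '' && e.host.indexOf('@') === -1){
        authd = e.details.muser + ':' + e.details.mpass + '@'
    }
    // port 80 is omitted from the URL unless port_force is enabled
    var porty = (e.port == 80 && e.details.port_force !== '1') ? '' : ':' + e.port
    var url = e.protocol + '://' + authd + e.host + porty
    if(noPath !== true) url += e.path
    return url
}

const monitor = {
    protocol: 'rtsp',
    host: '192.168.1.10',
    port: 554,
    path: '/h264',
    details: { muser: 'admin', mpass: 'secret', port_force: '0' }
}
console.log(buildMonitorUrl(monitor))       // rtsp://admin:secret@192.168.1.10:554/h264
console.log(buildMonitorUrl(monitor, true)) // rtsp://admin:secret@192.168.1.10:554
```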
// [CTRL] + [C] = exit
process.on('uncaughtException', function (err) {
writeToStderr('Uncaught Exception occurred!');
writeToStderr(err.stack);
});
const exitAction = function(){
try{
if(isWindows){
spawn("taskkill", ["/pid", cameraProcess.pid, '/f', '/t'])
}else{
process.kill(-cameraProcess.pid)
}
}catch(err){
}
}
process.on('SIGTERM', exitAction);
process.on('SIGINT', exitAction);
process.on('exit', exitAction);
for(var i=0; i < stdioPipes.length; i++){
switch(i){
case 0:
newPipes[i] = 'pipe'
break;
case 1:
newPipes[i] = 1
break;
case 2:
newPipes[i] = 2
break;
case 3:
stdioWriters[i] = fs.createWriteStream(null, {fd: i, end:false});
if(rawMonitorConfig.details.detector === '1' && rawMonitorConfig.details.detector_pam === '1'){
newPipes[i] = 'pipe'
}else{
newPipes[i] = stdioWriters[i]
}
break;
case 5:
stdioWriters[i] = fs.createWriteStream(null, {fd: i, end:false});
newPipes[i] = 'pipe'
break;
default:
stdioWriters[i] = fs.createWriteStream(null, {fd: i, end:false});
newPipes[i] = stdioWriters[i]
break;
}
}
stdioWriters.forEach((writer)=>{
writer.on('error', (err) => {
writeToStderr(err.stack);
});
})
writeToStderr(JSON.stringify(ffmpegCommandString))
var cameraProcess = spawn(ffmpegAbsolutePath,ffmpegCommandString,{detached: true,stdio:newPipes})
cameraProcess.on('close',()=>{
writeToStderr('Process Closed')
stdioWriters.forEach((writer)=>{
writer.end()
})
process.exit();
})
cameraProcess.stdio[5].on('data',(data)=>{
stdioWriters[5].write(data)
})
writeToStderr('Thread Opening')
if(rawMonitorConfig.details.detector === '1' && rawMonitorConfig.details.detector_pam === '1'){
try{
const attachPamDetector = require(__dirname + '/detector.js')(jsonData,stdioWriters[3])
attachPamDetector(cameraProcess,(err)=>{
writeToStderr(err)
})
}catch(err){
writeToStderr(err.stack)
}
}
if(rawMonitorConfig.type === 'jpeg'){
var recordingSnapRequest
var recordingSnapper
var errorTimeout
var errorCount = 0
var capture_fps = parseFloat(rawMonitorConfig.details.sfps || 1)
if(isNaN(capture_fps))capture_fps = 1
try{
cameraProcess.stdio[0].on('error',function(err){
if(err && rawMonitorConfig.details.loglevel !== 'quiet'){
// s.userLog(e,{type:'STDIN ERROR',msg:err});
}
})
}catch(err){
writeToStderr(err.stack)
}
setTimeout(() => {
if(!cameraProcess.stdio[0])return writeToStderr('No Camera Process Found for Snapper');
const captureOne = function(f){
recordingSnapRequest = request({
url: buildMonitorUrl(rawMonitorConfig),
method: 'GET',
encoding: null,
timeout: 15000
},function(err,data){
if(err){
writeToStderr(JSON.stringify(err))
return;
}
// writeToStderr(data.body.length)
cameraProcess.stdio[0].write(data.body)
recordingSnapper = setTimeout(function(){
captureOne()
},1000 / capture_fps)
if(!errorTimeout){
clearTimeout(errorTimeout)
errorTimeout = setTimeout(function(){
errorCount = 0;
delete(errorTimeout)
},3000)
}
}).on('error', function(err){
++errorCount
clearTimeout(errorTimeout)
errorTimeout = null
writeToStderr(JSON.stringify(err))
if(rawMonitorConfig.details.loglevel !== 'quiet'){
// s.userLog(e,{
// type: lang['JPEG Error'],
// msg: {
// msg: lang.JPEGErrorText,
// info: err
// }
// });
switch(err.code){
case'ESOCKETTIMEDOUT':
case'ETIMEDOUT':
// ++s.group[e.ke].activeMonitors[e.id].errorSocketTimeoutCount
// if(
// rawMonitorConfig.details.fatal_max !== 0 &&
// s.group[e.ke].activeMonitors[e.id].errorSocketTimeoutCount > rawMonitorConfig.details.fatal_max
// ){
// // s.userLog(e,{type:lang['Fatal Maximum Reached'],msg:{code:'ESOCKETTIMEDOUT',msg:lang.FatalMaximumReachedText}});
// // s.camera('stop',e)
// }else{
// // s.userLog(e,{type:lang['Restarting Process'],msg:{code:'ESOCKETTIMEDOUT',msg:lang.FatalMaximumReachedText}});
// // s.camera('restart',e)
// }
// return;
break;
}
}
// if(rawMonitorConfig.details.fatal_max !== 0 && errorCount > rawMonitorConfig.details.fatal_max){
// clearTimeout(recordingSnapper)
// process.exit()
// }
})
}
captureOne()
},5000)
}
if(
rawMonitorConfig.type === 'dashcam' ||
rawMonitorConfig.type === 'socket'
){
process.stdin.on('data',(data) => {
//confirmed receiving data this way.
cameraProcess.stdin.write(data)
})
}


@ -0,0 +1,60 @@
const fs = require('fs')
const spawn = require('child_process').spawn
const isWindows = process.platform === "win32";
var writeToStderr = function(text){
// fs.appendFileSync(rawMonitorConfig.sdir + 'errors.log',text + '\n','utf8')
process.stderr.write(Buffer.from(`${text}`, 'utf8' ))
}
if(!process.argv[2] || !process.argv[3]){
return writeToStderr('Missing FFMPEG Command String or no command operator')
}
process.send = process.send || function () {};
process.on('uncaughtException', function (err) {
writeToStderr('Uncaught Exception occurred!');
writeToStderr(err.stack);
});
// [CTRL] + [C] = exit
const exitAction = function(){
if(isWindows){
spawn("taskkill", ["/pid", snapProcess.pid, '/f', '/t'])
}else{
try{
process.kill(-snapProcess.pid)
}catch(err){
}
}
}
process.on('SIGTERM', exitAction);
process.on('SIGINT', exitAction);
process.on('exit', exitAction);
var jsonData = JSON.parse(fs.readFileSync(process.argv[3],'utf8'))
// fs.unlink(process.argv[3],()=>{})
const ffmpegAbsolutePath = process.argv[2].trim()
const ffmpegCommandString = jsonData.cmd
const temporaryImageFile = jsonData.temporaryImageFile
const iconImageFile = jsonData.iconImageFile
const useIcon = jsonData.useIcon
const rawMonitorConfig = jsonData.rawMonitorConfig
// var writeToStderr = function(text){
// process.stderr.write(Buffer.from(text))
// }
var snapProcess = spawn(ffmpegAbsolutePath,ffmpegCommandString,{detached: true})
snapProcess.stderr.on('data',(data)=>{
writeToStderr(data.toString())
})
snapProcess.stdout.on('data',(data)=>{
writeToStderr(data.toString())
})
snapProcess.on('close',function(data){
if(useIcon){
var fileCopy = fs.createReadStream(temporaryImageFile).pipe(fs.createWriteStream(iconImageFile))
fileCopy.on('close',function(){
process.exit();
})
}else{
process.exit();
}
});


@ -3,6 +3,7 @@ var http = require('http');
var https = require('https');
var express = require('express');
module.exports = function(s,config,lang,app,io){
const { cameraDestroy } = require('./monitor/utils.js')(s,config,lang)
//setup Master for childNodes
if(config.childNodes.enabled === true && config.childNodes.mode === 'master'){
s.childNodes = {};
@ -66,6 +67,11 @@ module.exports = function(s,config,lang,app,io){
cn.emit('c',{f:'sqlCallback',rows:rows,err:err,callbackId:d.callbackId});
});
break;
case'knex':
s.knexQuery(d.options,function(err,rows){
cn.emit('c',{f:'sqlCallback',rows:rows,err:err,callbackId:d.callbackId});
});
break;
case'clearCameraFromActiveList':
if(s.childNodes[ipAddress])delete(s.childNodes[ipAddress].activeCameras[d.ke + d.id])
break;
@ -145,16 +151,16 @@ module.exports = function(s,config,lang,app,io){
dir : s.getVideoDirectory(d.d),
file : d.filename,
filename : d.filename,
filesizeMB : parseFloat((d.filesize/1000000).toFixed(2))
filesizeMB : parseFloat((d.filesize/1048576).toFixed(2))
}
s.insertDatabaseRow(d.d,insert)
s.insertCompletedVideoExtensions.forEach(function(extender){
extender(d.d,insert)
})
//purge over max
s.purgeDiskForGroup(d)
s.purgeDiskForGroup(d.ke)
//send new diskUsage values
s.setDiskUsedForGroup(d,insert.filesizeMB)
s.setDiskUsedForGroup(d.ke,insert.filesizeMB)
clearTimeout(s.group[d.ke].activeMonitors[d.mid].recordingChecker)
clearTimeout(s.group[d.ke].activeMonitors[d.mid].streamChecker)
break;
@ -213,11 +219,19 @@ module.exports = function(s,config,lang,app,io){
s.queuedSqlCallbacks[callbackId] = onMoveOn
s.cx({f:'sql',query:query,values:values,callbackId:callbackId});
}
setInterval(function(){
s.cpuUsage(function(cpu){
s.cx({f:'cpu',cpu:parseFloat(cpu)})
})
},2000)
s.knexQuery = function(options,onMoveOn){
var callbackId = s.gid()
if(typeof onMoveOn !== 'function'){onMoveOn=function(){}}
s.queuedSqlCallbacks[callbackId] = onMoveOn
s.cx({f:'knex',options:options,callbackId:callbackId});
}
setInterval(async () => {
const cpu = await s.cpuUsage()
s.cx({
f: 'cpu',
cpu: parseFloat(cpu)
})
},5000)
childIO.on('connect', function(d){
console.log('CHILD CONNECTION SUCCESS')
s.cx({
@ -241,7 +255,7 @@ module.exports = function(s,config,lang,app,io){
break;
case'kill':
s.initiateMonitorObject(d.d);
s.cameraDestroy(s.group[d.d.ke].activeMonitors[d.d.id].spawn,d.d)
cameraDestroy(d.d)
var childNodeIp = s.group[d.d.ke].activeMonitors[d.d.id]
break;
case'sync':


@ -46,7 +46,7 @@ module.exports = function(s,config,lang){
}
],null,3))
setTimeout(function(){
require(s.mainDirectory + '/test/run.js')(s,config,lang,io)
require('../test/run.js')(s,config,lang,io)
},500)
}
}

48
libs/commander.js Normal file

@ -0,0 +1,48 @@
module.exports = function(s,config,lang){
if(config.p2pEnabled){
const { Worker, isMainThread } = require('worker_threads');
const startWorker = () => {
// set the first parameter as a string.
const pathToWorkerScript = __dirname + '/commander/worker.js'
const workerProcess = new Worker(pathToWorkerScript)
workerProcess.on('message',function(data){
switch(data.f){
case'debugLog':
s.debugLog(...data.data)
break;
case'systemLog':
s.systemLog(...data.data)
break;
}
})
setTimeout(() => {
workerProcess.postMessage({
f: 'init',
config: config,
lang: lang
})
},2000)
// workerProcess is an Emitter.
// it also contains a direct handle to the `spawn` at `workerProcess.spawnProcess`
return workerProcess
}
config.machineId = config.p2pApiKey + '' + config.p2pGroupId
config.p2pTargetAuth = config.p2pTargetAuth || s.gid(30)
if(config.p2pTargetGroupId && config.p2pTargetUserId){
startWorker()
}else{
s.knexQuery({
action: "select",
columns: "ke,uid",
table: "Users",
where: [],
limit: 1
},(err,r) => {
const firstUser = r[0]
config.p2pTargetUserId = firstUser.uid
config.p2pTargetGroupId = firstUser.ke
startWorker()
})
}
}
}

230
libs/commander/worker.js Normal file

@ -0,0 +1,230 @@
const { parentPort } = require('worker_threads');
const request = require('request');
const socketIOClient = require('socket.io-client');
const p2pClientConnectionStaticName = 'Commander'
const p2pClientConnections = {}
const runningRequests = {}
const connectedUserWebSockets = {}
const s = {
debugLog: (...args) => {
parentPort.postMessage({
f: 'debugLog',
data: args
})
},
systemLog: (...args) => {
parentPort.postMessage({
f: 'systemLog',
data: args
})
},
}
parentPort.on('message',(data) => {
switch(data.f){
case'init':
initialize(data.config,data.lang)
break;
}
})
const initialize = (config,lang) => {
if(!config.p2pHost)config.p2pHost = 'ws://163.172.180.205:8084'
const parseJSON = function(string){
var parsed = string
try{
parsed = JSON.parse(string)
}catch(err){
}
return parsed
}
const createQueryStringFromObject = function(obj){
var queryString = ''
var keys = Object.keys(obj)
keys.forEach(function(key){
var value = obj[key]
queryString += `&${key}=${value}`
})
return queryString
}
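The two helpers above are small enough to sketch end-to-end. Note that `createQueryStringFromObject` prefixes every pair with `&`, which is tolerated after the `?` that `doRequest` prepends, and that values are not URL-encoded. The inputs below are illustrative:

```javascript
// Standalone sketch of parseJSON and createQueryStringFromObject above.
const parseJSON = function(string){
    var parsed = string
    try{
        parsed = JSON.parse(string)
    }catch(err){
        // invalid JSON: fall through and return the input unchanged
    }
    return parsed
}
const createQueryStringFromObject = function(obj){
    var queryString = ''
    Object.keys(obj).forEach(function(key){
        queryString += `&${key}=${obj[key]}`
    })
    return queryString
}

console.log(parseJSON('{"ok":true}').ok)  // true
console.log(parseJSON('not json'))        // falls back to the raw string
console.log(createQueryStringFromObject({ ke: 'group1', id: 'cam1' }))
// "&ke=group1&id=cam1"
```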
const doRequest = function(url,method,data,callback,onDataReceived){
var requestEndpoint = `${config.sslEnabled ? `https` : 'http'}://localhost:${config.sslEnabled ? config.ssl.port : config.port}` + url
if(method === 'GET' && data){
requestEndpoint += '?' + createQueryStringFromObject(data)
}
return request(requestEndpoint,{
method: method,
json: method !== 'GET' ? (data ? data : null) : null
}, function(err,resp,body){
// var json = parseJSON(body)
if(err)console.error(err,data)
callback(err,body,resp)
}).on('data', function(data) {
onDataReceived(data)
})
}
const createShinobiSocketConnection = (connectionId) => {
const masterConnectionToMachine = socketIOClient(`ws://localhost:${config.port}`, {transports:['websocket']})
p2pClientConnections[connectionId || p2pClientConnectionStaticName] = masterConnectionToMachine
return masterConnectionToMachine
}
//
s.debugLog('p2p',`Connecting to ${config.p2pHost}...`)
const connectionToP2PServer = socketIOClient(config.p2pHost, {transports:['websocket']});
if(!config.p2pApiKey){
s.systemLog('p2p',`Please fill 'p2pApiKey' in your conf.json.`)
}
if(!config.p2pGroupId){
s.systemLog('p2p',`Please fill 'p2pGroupId' in your conf.json.`)
}
connectionToP2PServer.on('connect', () => {
s.systemLog('p2p',`Connected ${config.p2pHost}!`)
connectionToP2PServer.emit('initMachine',{
port: config.port,
apiKey: config.p2pApiKey,
groupId: config.p2pGroupId,
targetUserId: config.p2pTargetUserId,
targetGroupId: config.p2pTargetGroupId,
subscriptionId: config.subscriptionId || 'notActivated'
})
})
connectionToP2PServer.on('httpClose',(requestId) => {
if(runningRequests[requestId] && runningRequests[requestId].abort){
runningRequests[requestId].abort()
delete(runningRequests[requestId])
}
})
connectionToP2PServer.on('http',(rawRequest) => {
runningRequests[rawRequest.rid] = doRequest(
rawRequest.url,
rawRequest.method,
rawRequest.data,
function(err,json,resp){
connectionToP2PServer.emit('httpResponse',{
err: err,
json: rawRequest.bodyOnEnd ? json : null,
rid: rawRequest.rid
})
},
(data) => {
if(!rawRequest.bodyOnEnd)connectionToP2PServer.emit('httpResponseChunk',{
data: data,
rid: rawRequest.rid
})
})
})
const masterConnectionToMachine = createShinobiSocketConnection()
masterConnectionToMachine.on('connect', () => {
masterConnectionToMachine.emit('f',{
f: 'init',
auth: config.p2pTargetAuth,
ke: config.p2pTargetGroupId,
uid: config.p2pTargetUserId
})
})
masterConnectionToMachine.on('f',(data) => {
connectionToP2PServer.emit('f',data)
})
connectionToP2PServer.on('wsInit',(rawRequest) => {
s.debugLog('p2pWsInit',rawRequest)
const user = rawRequest.user
const clientConnectionToMachine = createShinobiSocketConnection(rawRequest.cnid)
connectedUserWebSockets[user.auth_token] = user;
clientConnectionToMachine.on('connect', () => {
s.debugLog('init',user.auth_token)
clientConnectionToMachine.emit('f',{
f: 'init',
auth: user.auth_token,
ke: user.ke,
uid: user.uid,
})
});
([
'f',
]).forEach((target) => {
connectionToP2PServer.on(target,(data) => {
clientConnectionToMachine.emit(target,data)
})
clientConnectionToMachine.on(target,(data) => {
connectionToP2PServer.emit(target,{data: data, cnid: rawRequest.cnid})
})
})
});
([
'a',
'r',
'gps',
'e',
'super',
]).forEach((target) => {
connectionToP2PServer.on(target,(data) => {
var clientConnectionToMachine
if(data.f === 'init'){
clientConnectionToMachine = createShinobiSocketConnection(data.cnid)
clientConnectionToMachine.on('connect', () => {
clientConnectionToMachine.on(target,(fromData) => {
connectionToP2PServer.emit(target,{data: fromData, cnid: data.cnid})
})
clientConnectionToMachine.on('f',(fromData) => {
connectionToP2PServer.emit('f',{data: fromData, cnid: data.cnid})
})
clientConnectionToMachine.emit(target,data)
});
}else{
clientConnectionToMachine = p2pClientConnections[data.cnid]
clientConnectionToMachine.emit(target,data)
}
})
});
([
'h265',
'Base64',
'FLV',
'MP4',
]).forEach((target) => {
connectionToP2PServer.on(target,(initData) => {
if(connectedUserWebSockets[initData.auth]){
const clientConnectionToMachine = createShinobiSocketConnection(initData.auth + initData.ke + initData.id)
clientConnectionToMachine.on('connect', () => {
clientConnectionToMachine.emit(target,initData)
});
clientConnectionToMachine.on('data',(data) => {
connectionToP2PServer.emit('data',{data: data, cnid: initData.cnid})
});
}else{
s.debugLog('disconnect now!')
}
})
});
connectionToP2PServer.on('wsDestroyStream',(clientKey) => {
if(p2pClientConnections[clientKey]){
p2pClientConnections[clientKey].disconnect();
}
delete(p2pClientConnections[clientKey])
});
connectionToP2PServer.on('wsDestroy',(rawRequest) => {
if(p2pClientConnections[rawRequest.cnid]){
p2pClientConnections[rawRequest.cnid].disconnect();
}
delete(p2pClientConnections[rawRequest.cnid])
});
connectionToP2PServer.on('allowDisconnect',(bool) => {
connectionToP2PServer.allowDisconnect = true;
connectionToP2PServer.disconnect()
s.debugLog('p2p','Server Forced Disconnection')
});
const onDisconnect = () => {
s.systemLog('p2p','Disconnected')
if(!connectionToP2PServer.allowDisconnect){
s.systemLog('p2p','Attempting Reconnection...')
setTimeout(() => {
connectionToP2PServer.connect()
},3000)
}
}
connectionToP2PServer.on('error',onDisconnect)
connectionToP2PServer.on('disconnect',onDisconnect)
}

11
libs/common.js Normal file

@ -0,0 +1,11 @@
const async = require("async");
exports.copyObject = (obj) => {
return Object.assign({},obj)
}
exports.createQueue = (timeoutInSeconds, queueItemsRunningInParallel) => {
return async.queue(function(action, callback) {
setTimeout(function(){
action(callback)
},timeoutInSeconds * 1000 || 1000)
},queueItemsRunningInParallel || 3)
}
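`copyObject` is a shallow copy via `Object.assign`, so nested objects remain shared between the copy and the original. A quick sketch of that caveat (re-declared here so the example stands alone; the real helper lives in libs/common.js):

```javascript
// Sketch: shallow-copy behavior of copyObject from libs/common.js.
const copyObject = (obj) => Object.assign({}, obj)

const original = { mid: 'cam1', details: { fps: 2 } }
const copy = copyObject(original)

copy.mid = 'cam2'      // top-level fields are independent copies
copy.details.fps = 5   // nested objects are still shared references

console.log(original.mid)         // 'cam1'
console.log(original.details.fps) // 5 — mutated through the "copy"
```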


@ -4,7 +4,11 @@ module.exports = function(s){
config : s.mainDirectory+'/conf.json',
languages : s.mainDirectory+'/languages'
}
var config = require(s.location.config);
try{
var config = require(s.location.config)
}catch(err){
var config = {}
}
if(!config.productType){
config.productType = 'CE'
}

6
libs/control.js Normal file

@ -0,0 +1,6 @@
var os = require('os');
var exec = require('child_process').exec;
module.exports = function(s,config,lang,app,io){
require('./control/onvif.js')(s,config,lang,app,io)
// const ptz = require('./control/ptz.js')(s,config,lang,app,io)
}

145
libs/control/onvif.js Normal file

@ -0,0 +1,145 @@
var os = require('os');
var exec = require('child_process').exec;
var onvif = require("node-onvif");
module.exports = function(s,config,lang,app,io){
const createOnvifDevice = async (onvifAuth) => {
var response = {ok: false}
const monitorConfig = s.group[onvifAuth.ke].rawMonitorConfigurations[onvifAuth.id]
const controlBaseUrl = monitorConfig.details.control_base_url || s.buildMonitorUrl(monitorConfig, true)
const controlURLOptions = s.cameraControlOptionsFromUrl(controlBaseUrl,monitorConfig)
//create onvif connection
const device = new onvif.OnvifDevice({
address : controlURLOptions.host + ':' + controlURLOptions.port,
user : controlURLOptions.username,
pass : controlURLOptions.password
})
s.group[onvifAuth.ke].activeMonitors[onvifAuth.id].onvifConnection = device
try{
const info = await device.init()
response.ok = true
response.device = device
}catch(err){
response.msg = 'Device responded with an error'
response.error = err
}
return response
}
const replaceDynamicInOptions = (Camera,options) => {
const newOptions = {}
Object.keys(options).forEach((key) => {
const value = options[key]
if(typeof value === 'string'){
newOptions[key] = value.replace(/__CURRENT_TOKEN/g,Camera.current_profile.token)
}else if(value !== undefined && value !== null){
newOptions[key] = value
}
})
return newOptions
}
const runOnvifMethod = async (onvifOptions,callback) => {
var onvifAuth = onvifOptions.auth
var response = {ok: false}
var errorMessage = function(msg,error){
response.ok = false
response.msg = msg
response.error = error
callback(response)
}
var actionCallback = function(onvifActionResponse){
response.ok = true
if(onvifActionResponse.data){
response.responseFromDevice = onvifActionResponse.data
}else{
response.responseFromDevice = onvifActionResponse
}
if(onvifActionResponse.soap)response.soap = onvifActionResponse.soap
callback(response)
}
var isEmpty = function(obj) {
for(var key in obj) {
if(obj.hasOwnProperty(key))
return false;
}
return true;
}
var doAction = function(Camera){
var completeAction = function(command){
if(command && command.then){
command.then(actionCallback).catch(function(error){
errorMessage('Device Action responded with an error',error)
})
}else if(command){
response.ok = true
response.responseFromDevice = command
callback(response)
}else{
response.error = 'Big Errors, Please report it to Shinobi Development'
callback(response)
}
}
var action
if(onvifAuth.service){
if(Camera.services[onvifAuth.service] === undefined){
return errorMessage('This is not an available service. Please use one of the following : '+Object.keys(Camera.services).join(', '))
}
if(Camera.services[onvifAuth.service] === null){
return errorMessage('This service is not activated. Maybe you are not connected through ONVIF. You can test by attempting to use the "Control" feature with ONVIF in Shinobi.')
}
action = Camera.services[onvifAuth.service][onvifAuth.action]
}else{
action = Camera[onvifAuth.action]
}
if(!action || typeof action !== 'function'){
errorMessage(onvifAuth.action+' is not an available ONVIF function. See https://github.com/futomi/node-onvif for functions.')
}else{
var argNames = s.getFunctionParamNames(action)
var options
var command
if(argNames[0] === 'options' || argNames[0] === 'params'){
options = replaceDynamicInOptions(Camera,onvifOptions.options || {})
response.options = options
}
if(onvifAuth.service){
command = Camera.services[onvifAuth.service][onvifAuth.action](options)
}else{
command = Camera[onvifAuth.action](options)
}
completeAction(command)
}
}
if(!s.group[onvifAuth.ke].activeMonitors[onvifAuth.id].onvifConnection){
const response = await createOnvifDevice(onvifAuth)
if(response.ok){
doAction(response.device)
}else{
errorMessage(response.msg,response.error)
}
}else{
doAction(s.group[onvifAuth.ke].activeMonitors[onvifAuth.id].onvifConnection)
}
}
/**
* API : ONVIF Method Controller
*/
app.all([
config.webPaths.apiPrefix+':auth/onvif/:ke/:id/:action',
config.webPaths.apiPrefix+':auth/onvif/:ke/:id/:service/:action'
],function (req,res){
s.auth(req.params,function(user){
const options = s.getPostData(req,'options',true) || s.getPostData(req,'params',true)
runOnvifMethod({
auth: {
ke: req.params.ke,
id: req.params.id,
action: req.params.action,
service: req.params.service,
},
options: options,
},(endData) => {
s.closeJsonResponse(res,endData)
})
},res,req);
})
s.createOnvifDevice = createOnvifDevice
s.runOnvifMethod = runOnvifMethod
}
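The `__CURRENT_TOKEN` substitution performed by `replaceDynamicInOptions` can be exercised on its own; the `fakeCamera` object below is a minimal stand-in for a connected node-onvif device, not the real thing:

```javascript
// Standalone copy of replaceDynamicInOptions above: swaps the
// __CURRENT_TOKEN placeholder in string options for the device's
// current profile token, and drops null/undefined values.
const replaceDynamicInOptions = (Camera, options) => {
    const newOptions = {};
    Object.keys(options).forEach((key) => {
        const value = options[key];
        if (typeof value === 'string') {
            newOptions[key] = value.replace(/__CURRENT_TOKEN/g, Camera.current_profile.token);
        } else if (value !== undefined && value !== null) {
            newOptions[key] = value;
        }
    });
    return newOptions;
};

// Hypothetical stand-in for a node-onvif device.
const fakeCamera = { current_profile: { token: 'Profile_1' } };
const result = replaceDynamicInOptions(fakeCamera, {
    ProfileToken: '__CURRENT_TOKEN',
    Speed: { x: 1 },
    Empty: null, // filtered out
});
console.log(result.ProfileToken); // 'Profile_1'
```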

libs/control/ptz.js Normal file

@ -0,0 +1,382 @@
var os = require('os');
var exec = require('child_process').exec;
var request = require('request')
module.exports = function(s,config,lang){
const moveLock = {}
const ptzTimeoutsUntilResetToHome = {}
const startMove = async function(options,callback){
let device = s.group[options.ke].activeMonitors[options.id].onvifConnection
if(!device){
await s.createOnvifDevice({
ke: options.ke,
id: options.id,
})
device = s.group[options.ke].activeMonitors[options.id].onvifConnection
}
options.controlOptions.ProfileToken = device.current_profile.token
s.runOnvifMethod({
auth: {
ke: options.ke,
id: options.id,
action: 'continuousMove',
service: 'ptz',
},
options: options.controlOptions,
},callback)
}
const stopMove = function(options,callback){
const device = s.group[options.ke].activeMonitors[options.id].onvifConnection
s.runOnvifMethod({
auth: {
ke: options.ke,
id: options.id,
action: 'stop',
service: 'ptz',
},
options: {
'PanTilt': true,
'Zoom': true,
ProfileToken: device.current_profile.token
},
},callback)
}
const moveOnvifCamera = function(options,callback){
const monitorConfig = s.group[options.ke].rawMonitorConfigurations[options.id]
const invertedVerticalAxis = monitorConfig.details.control_invert_y === '1'
const controlUrlStopTimeout = parseInt(monitorConfig.details.control_url_stop_timeout) || 1000
switch(options.direction){
case'center':
callback({type:'Center button inactive'})
break;
case'stopMove':
callback({type:'Control Trigger Ended'})
stopMove({
ke: options.ke,
id: options.id,
},(response) => {
})
break;
default:
try{
var controlOptions = {
Velocity : {}
}
if(options.axis){
options.axis.forEach((axis) => {
controlOptions.Velocity[axis.direction] = axis.amount
})
}else{
var onvifDirections = {
"left": [-1.0,'x'],
"right": [1.0,'x'],
"down": [invertedVerticalAxis ? 1.0 : -1.0,'y'],
"up": [invertedVerticalAxis ? -1.0 : 1.0,'y'],
"zoom_in": [1.0,'z'],
"zoom_out": [-1.0,'z']
}
var direction = onvifDirections[options.direction]
controlOptions.Velocity[direction[1]] = direction[0]
}
(['x','y','z']).forEach(function(axis){
if(!controlOptions.Velocity[axis])
controlOptions.Velocity[axis] = 0
})
if(monitorConfig.details.control_stop === '1'){
startMove({
ke: options.ke,
id: options.id,
controlOptions: controlOptions
},(response) => {
if(response.ok){
if(controlUrlStopTimeout != '0'){
setTimeout(function(){
stopMove({
ke: options.ke,
id: options.id,
},(response) => {
if(!response.ok){
console.log(response)
}
})
callback({type: 'Control Triggered'})
},controlUrlStopTimeout)
}
}else{
s.debugLog(response)
}
})
}else{
controlOptions.Speed = {'x': 1, 'y': 1, 'z': 1}
controlOptions.Translation = Object.assign({},controlOptions.Velocity)
delete(controlOptions.Velocity)
s.runOnvifMethod({
auth: {
ke: options.ke,
id: options.id,
action: 'relativeMove',
service: 'ptz',
},
options: controlOptions,
},(response) => {
if(response.ok){
callback({type: 'Control Triggered'})
}else{
callback({type: 'Control Triggered', error: response.error})
}
})
}
}catch(err){
console.log(err)
console.log(new Error())
}
break;
}
}
const ptzControl = async function(options,callback){
if(!s.group[options.ke] || !s.group[options.ke].activeMonitors[options.id]){return}
const monitorConfig = s.group[options.ke].rawMonitorConfigurations[options.id]
const controlUrlMethod = monitorConfig.details.control_url_method || 'GET'
const controlBaseUrl = monitorConfig.details.control_base_url || s.buildMonitorUrl(monitorConfig, true)
if(monitorConfig.details.control !== "1"){
s.userLog(monitorConfig,{type:lang['Control Error'],msg:lang.ControlErrorText1});
return
}
if(monitorConfig.details.control_url_stop_timeout === '0' && monitorConfig.details.control_stop === '1' && s.group[options.ke].activeMonitors[options.id].ptzMoving === true){
options.direction = 'stopMove'
s.group[options.ke].activeMonitors[options.id].ptzMoving = false
}else{
s.group[options.ke].activeMonitors[options.id].ptzMoving = true
}
if(controlUrlMethod === 'ONVIF'){
try{
//create onvif connection
if(
!s.group[options.ke].activeMonitors[options.id].onvifConnection ||
!s.group[options.ke].activeMonitors[options.id].onvifConnection.current_profile ||
!s.group[options.ke].activeMonitors[options.id].onvifConnection.current_profile.token
){
const response = await s.createOnvifDevice({
ke: options.ke,
id: options.id,
})
if(response.ok){
moveOnvifCamera({
ke: options.ke,
id: options.id,
direction: options.direction,
axis: options.axis,
},(msg) => {
msg.msg = options.direction
callback(msg)
})
}else{
s.userLog(monitorConfig,{type:lang['Control Error'],msg:response.error})
}
}else{
moveOnvifCamera({
ke: options.ke,
id: options.id,
direction: options.direction,
axis: options.axis,
},(msg) => {
if(!msg.msg)msg.msg = {direction: options.direction}
callback(msg)
})
}
}catch(err){
s.debugLog(err)
callback({
type: lang['Control Error'],
msg: {
msg: lang.ControlErrorText2,
error: err,
direction: options.direction
}
})
}
}else{
const controlUrlStopTimeout = parseInt(monitorConfig.details.control_url_stop_timeout) || 1000
var stopCamera = function(){
let stopURL = controlBaseUrl + monitorConfig.details[`control_url_${options.direction}_stop`]
let controlOptions = s.cameraControlOptionsFromUrl(stopURL,monitorConfig)
let requestOptions = {
url : stopURL,
method : controlOptions.method,
auth : {
user : controlOptions.username,
pass : controlOptions.password
}
}
if(monitorConfig.details.control_digest_auth === '1'){
requestOptions.sendImmediately = true
}
request(requestOptions,function(err,data){
const msg = {
ok: true,
type:'Control Trigger Ended'
}
if(err){
msg.ok = false
msg.type = 'Control Error'
msg.msg = err
}
callback(msg)
s.userLog(monitorConfig,msg);
})
}
if(options.direction === 'stopMove'){
stopCamera()
}else{
let controlURL = controlBaseUrl + monitorConfig.details[`control_url_${options.direction}`]
let controlOptions = s.cameraControlOptionsFromUrl(controlURL,monitorConfig)
let requestOptions = {
url: controlURL,
method: controlOptions.method,
auth: {
user: controlOptions.username,
pass: controlOptions.password
}
}
if(monitorConfig.details.control_digest_auth === '1'){
requestOptions.sendImmediately = true
}
request(requestOptions,function(err,data){
if(err){
callback({ok:false,type:'Control Error',msg:err})
return
}
if(monitorConfig.details.control_stop == '1' && options.direction !== 'center' ){
s.userLog(monitorConfig,{type:'Control Triggered Started'});
if(controlUrlStopTimeout > 0){
setTimeout(function(){
stopCamera()
},controlUrlStopTimeout)
}
}else{
callback({ok:true,type:'Control Triggered'})
}
})
}
}
}
const getPresetPositions = (options,callback) => {
const profileToken = options.ProfileToken || "__CURRENT_TOKEN"
return s.runOnvifMethod({
auth: {
ke: options.ke,
id: options.id,
service: 'ptz',
action: 'getPresets',
},
options: {
ProfileToken: profileToken
},
},callback)
}
const setPresetForCurrentPosition = (options,callback) => {
const nonStandardOnvif = s.group[options.ke].rawMonitorConfigurations[options.id].details.onvif_non_standard === '1'
const profileToken = options.ProfileToken || "__CURRENT_TOKEN"
s.runOnvifMethod({
auth: {
ke: options.ke,
id: options.id,
service: 'ptz',
action: 'setPreset',
},
options: {
ProfileToken: profileToken,
PresetToken: nonStandardOnvif ? null : options.PresetToken || profileToken,
PresetName: options.PresetName || (nonStandardOnvif ? '1' : profileToken)
},
},(endData) => {
callback(endData)
})
}
const moveToPresetPosition = (options,callback) => {
const nonStandardOnvif = s.group[options.ke].rawMonitorConfigurations[options.id].details.onvif_non_standard === '1'
const profileToken = options.ProfileToken || "__CURRENT_TOKEN"
return s.runOnvifMethod({
auth: {
ke: options.ke,
id: options.id,
service: 'ptz',
action: 'gotoPreset',
},
options: {
ProfileToken: profileToken,
PresetToken: options.PresetToken || (nonStandardOnvif ? '1' : profileToken),
Speed: {
"x": 1,
"y": 1,
"z": 1
},
},
},callback)
}
const getLargestMatrix = (matrices) => {
var largestMatrix = {width: 0, height: 0}
matrices.forEach((matrix) => {
if(matrix.width > largestMatrix.width && matrix.height > largestMatrix.height)largestMatrix = matrix
})
return largestMatrix.width ? largestMatrix : null
}
const moveCameraPtzToMatrix = function(event,trackingTarget){
if(moveLock[event.ke + event.id])return;
clearTimeout(moveLock[event.ke + event.id])
moveLock[event.ke + event.id] = setTimeout(() => {
delete(moveLock[event.ke + event.id])
},1000)
const imgHeight = event.details.imgHeight
const imgWidth = event.details.imgWidth
const thresholdX = imgWidth * 0.125
const thresholdY = imgHeight * 0.125
const imageCenterX = imgWidth / 2
const imageCenterY = imgHeight / 2
const matrices = event.details.matrices
const largestMatrix = getLargestMatrix(matrices.filter(matrix => matrix.tag === (trackingTarget || 'person')))
// console.log(matrices.find(matrix => matrix.tag === 'person'))
if(!largestMatrix)return;
const matrixCenterX = largestMatrix.x + (largestMatrix.width / 2)
const matrixCenterY = largestMatrix.y + (largestMatrix.height / 2)
const rawDistanceX = (matrixCenterX - imageCenterX)
const rawDistanceY = (matrixCenterY - imageCenterY)
const distanceX = imgWidth / rawDistanceX
const distanceY = imgHeight / rawDistanceY
const axisX = rawDistanceX > thresholdX || rawDistanceX < -thresholdX ? distanceX : 0
const axisY = largestMatrix.y < 30 && largestMatrix.height > imgHeight * 0.8 ? 0.5 : rawDistanceY > thresholdY || rawDistanceY < -thresholdY ? -distanceY : 0
if(axisX !== 0 || axisY !== 0){
ptzControl({
axis: [
{direction: 'x', amount: axisX === 0 ? 0 : axisX > 0 ? 0.01 : -0.01},
{direction: 'y', amount: axisY === 0 ? 0 : axisY > 0 ? 0.01 : -0.01},
{direction: 'z', amount: 0},
],
// axis: [{direction: 'x', amount: 1.0}],
id: event.id,
ke: event.ke
},(msg) => {
s.userLog(event,msg)
// console.log(msg)
clearTimeout(ptzTimeoutsUntilResetToHome[event.ke + event.id])
ptzTimeoutsUntilResetToHome[event.ke + event.id] = setTimeout(() => {
moveToPresetPosition({
ke: event.ke,
id: event.id,
},(endData) => {
console.log(endData)
})
},7000)
})
}
}
return {
ptzControl: ptzControl,
startMove: startMove,
stopMove: stopMove,
getPresetPositions: getPresetPositions,
setPresetForCurrentPosition: setPresetForCurrentPosition,
moveToPresetPosition: moveToPresetPosition,
moveCameraPtzToMatrix: moveCameraPtzToMatrix
}
}
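The direction-to-velocity mapping inside `moveOnvifCamera` above reduces to a small pure function. This sketch reproduces that lookup (including the Y-axis inversion option) outside the module:

```javascript
// Sketch of the onvifDirections lookup from moveOnvifCamera above.
// Returns a Velocity object with the chosen axis set, unset axes zeroed.
function directionToVelocity(direction, invertedVerticalAxis) {
    const onvifDirections = {
        left: [-1.0, 'x'],
        right: [1.0, 'x'],
        down: [invertedVerticalAxis ? 1.0 : -1.0, 'y'],
        up: [invertedVerticalAxis ? -1.0 : 1.0, 'y'],
        zoom_in: [1.0, 'z'],
        zoom_out: [-1.0, 'z'],
    };
    const Velocity = { x: 0, y: 0, z: 0 };
    const found = onvifDirections[direction];
    if (found) Velocity[found[1]] = found[0];
    return Velocity;
}

console.log(directionToVelocity('up', false)); // { x: 0, y: 1, z: 0 }
console.log(directionToVelocity('up', true));  // { x: 0, y: -1, z: 0 }
```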


@ -1,174 +1,463 @@
var fs = require('fs')
var express = require('express')
module.exports = function(s,config,lang,app,io){
function mergeDeep(...objects) {
const isObject = obj => obj && typeof obj === 'object';
return objects.reduce((prev, obj) => {
Object.keys(obj).forEach(key => {
const pVal = prev[key];
const oVal = obj[key];
if (Array.isArray(pVal) && Array.isArray(oVal)) {
prev[key] = pVal.concat(...oVal);
}
else if (isObject(pVal) && isObject(oVal)) {
prev[key] = mergeDeep(pVal, oVal);
}
else {
prev[key] = oVal;
}
});
return prev;
}, {});
const fs = require('fs-extra');
const express = require('express')
const request = require('request')
const unzipper = require('unzipper')
const fetch = require("node-fetch")
const spawn = require('child_process').spawn
module.exports = async (s,config,lang,app,io) => {
const runningInstallProcesses = {}
const modulesBasePath = s.mainDirectory + '/libs/customAutoLoad/'
const searchText = function(searchFor,searchIn){
return searchIn.indexOf(searchFor) > -1
}
s.customAutoLoadModules = {}
s.customAutoLoadTree = {
pages: [],
PageBlocks: [],
LibsJs: [],
LibsCss: [],
adminPageBlocks: [],
adminLibsJs: [],
adminLibsCss: [],
superPageBlocks: [],
superLibsJs: [],
superLibsCss: []
const extractNameFromPackage = (filePath) => {
const filePathParts = filePath.split('/')
const packageName = filePathParts[filePathParts.length - 1].split('.')[0]
return packageName
}
var folderPath = s.mainDirectory + '/libs/customAutoLoad'
var search = function(searchFor,searchIn){return searchIn.indexOf(searchFor) > -1}
fs.readdir(folderPath,function(err,folderContents){
if(!err && folderContents){
folderContents.forEach(function(filename){
s.customAutoLoadModules[filename] = {}
var customModulePath = folderPath + '/' + filename
if(filename.indexOf('.js') > -1){
s.customAutoLoadModules[filename].type = 'file'
try{
require(customModulePath)(s,config,lang,app,io)
}catch(err){
console.log('Failed to Load Module : ' + filename)
console.log(err)
}
}else{
if(fs.lstatSync(customModulePath).isDirectory()){
s.customAutoLoadModules[filename].type = 'folder'
try{
require(customModulePath)(s,config,lang,app,io)
fs.readdir(customModulePath,function(err,folderContents){
folderContents.forEach(function(name){
switch(name){
case'web':
var webFolder = s.checkCorrectPathEnding(customModulePath) + 'web/'
fs.readdir(webFolder,function(err,webFolderContents){
webFolderContents.forEach(function(name){
switch(name){
case'libs':
case'pages':
if(name === 'libs'){
if(config.webPaths.home !== '/'){
app.use('/libs',express.static(webFolder + '/libs'))
}
app.use(s.checkCorrectPathEnding(config.webPaths.home)+'libs',express.static(webFolder + '/libs'))
app.use(s.checkCorrectPathEnding(config.webPaths.admin)+'libs',express.static(webFolder + '/libs'))
app.use(s.checkCorrectPathEnding(config.webPaths.super)+'libs',express.static(webFolder + '/libs'))
}
var libFolder = webFolder + name + '/'
fs.readdir(libFolder,function(err,webFolderContents){
webFolderContents.forEach(function(libName){
var thirdLevelName = libFolder + libName
switch(libName){
case'js':
case'css':
case'blocks':
fs.readdir(thirdLevelName,function(err,webFolderContents){
webFolderContents.forEach(function(filename){
var fullPath = thirdLevelName + '/' + filename
var blockPrefix = ''
switch(true){
case search('super.',filename):
blockPrefix = 'super'
break;
case search('admin.',filename):
blockPrefix = 'admin'
break;
}
switch(libName){
case'js':
s.customAutoLoadTree[blockPrefix + 'LibsJs'].push(filename)
break;
case'css':
s.customAutoLoadTree[blockPrefix + 'LibsCss'].push(filename)
break;
case'blocks':
s.customAutoLoadTree[blockPrefix + 'PageBlocks'].push(fullPath)
break;
}
})
})
break;
default:
if(libName.indexOf('.ejs') > -1){
s.customAutoLoadTree.pages.push(thirdLevelName)
}
break;
}
})
})
break;
}
})
})
break;
case'languages':
var languagesFolder = s.checkCorrectPathEnding(customModulePath) + 'languages/'
fs.readdir(languagesFolder,function(err,files){
if(err)return console.log(err);
files.forEach(function(filename){
var fileData = require(languagesFolder + filename)
var rule = filename.replace('.json','')
if(config.language === rule){
lang = Object.assign(lang,fileData)
}
if(s.loadedLanguages[rule]){
s.loadedLanguages[rule] = Object.assign(s.loadedLanguages[rule],fileData)
}else{
s.loadedLanguages[rule] = Object.assign(s.copySystemDefaultLanguage(),fileData)
}
})
})
break;
case'definitions':
var definitionsFolder = s.checkCorrectPathEnding(customModulePath) + 'definitions/'
fs.readdir(definitionsFolder,function(err,files){
if(err)return console.log(err);
files.forEach(function(filename){
var fileData = require(definitionsFolder + filename)
var rule = filename.replace('.json','').replace('.js','')
if(config.language === rule){
s.definitions = mergeDeep(s.definitions,fileData)
}
if(s.loadedDefinitons[rule]){
s.loadedDefinitons[rule] = mergeDeep(s.loadedDefinitons[rule],fileData)
}else{
s.loadedDefinitons[rule] = mergeDeep(s.copySystemDefaultDefinitions(),fileData)
}
})
})
break;
}
})
const getModulePath = (name) => {
return modulesBasePath + name + '/'
}
const getModule = (moduleName) => {
const modulePath = modulesBasePath + moduleName
const stats = fs.lstatSync(modulePath)
const isDirectory = stats.isDirectory()
const newModule = {
name: moduleName,
path: modulePath + '/',
size: stats.size,
lastModified: stats.mtime,
created: stats.ctime,
isDirectory: isDirectory,
}
if(isDirectory){
var hasInstaller = false
if(!fs.existsSync(modulePath + '/index.js')){
hasInstaller = true
newModule.noIndex = true
}
if(fs.existsSync(modulePath + '/package.json')){
hasInstaller = true
newModule.properties = getModuleProperties(moduleName)
}else{
newModule.properties = {
name: moduleName
}
}
newModule.hasInstaller = hasInstaller
}else{
newModule.isIgnitor = (moduleName.indexOf('.js') > -1)
newModule.properties = {
name: moduleName
}
}
return newModule
}
const getModules = (asArray) => {
const foundModules = {}
fs.readdirSync(modulesBasePath).forEach((moduleName) => {
foundModules[moduleName] = getModule(moduleName)
})
return asArray ? Object.values(foundModules) : foundModules
}
const downloadModule = (downloadUrl,packageName) => {
const downloadPath = modulesBasePath + packageName
return new Promise(async (resolve, reject) => {
fs.mkdir(downloadPath, () => {
request(downloadUrl).pipe(fs.createWriteStream(downloadPath + '.zip'))
.on('finish',() => {
const zip = fs.createReadStream(downloadPath + '.zip')
.pipe(unzipper.Parse())
.on('entry', async (file) => {
if(file.type === 'Directory'){
try{
fs.mkdirSync(modulesBasePath + file.path, { recursive: true })
}catch(err){
}
}else{
const content = await file.buffer();
fs.writeFile(modulesBasePath + file.path,content,(err) => {
if(err)console.log(err)
})
}
})
.promise()
.then(() => {
fs.remove(downloadPath + '.zip', () => {})
resolve()
})
})
})
})
}
const getModuleProperties = (name) => {
const modulePath = getModulePath(name)
const propertiesPath = modulePath + 'package.json'
const properties = fs.existsSync(propertiesPath) ? s.parseJSON(fs.readFileSync(propertiesPath)) : {
name: name
}
return properties
}
const installModule = (name) => {
return new Promise((resolve, reject) => {
if(!runningInstallProcesses[name]){
//depending on module this may only work for Ubuntu
const modulePath = getModulePath(name)
const properties = getModuleProperties(name);
const installerPath = modulePath + `INSTALL.sh`
const propertiesPath = modulePath + 'package.json'
var installProcess
// check for INSTALL.sh (ubuntu only)
if(fs.existsSync(installerPath)){
installProcess = spawn(`sh`,[installerPath])
}else if(fs.existsSync(propertiesPath)){
// no INSTALL.sh found, check for package.json and do `npm install --unsafe-perm`
installProcess = spawn(`npm`,['install','--unsafe-perm','--prefix',modulePath])
}else{
resolve()
}
if(installProcess){
const sendData = (data,channel) => {
const clientData = {
f: 'module-info',
module: name,
process: 'install-' + channel,
data: data.toString(),
}
s.tx(clientData,'$')
s.debugLog(clientData)
}
installProcess.stderr.on('data',(data) => {
sendData(data,'stderr')
})
installProcess.stdout.on('data',(data) => {
sendData(data,'stdout')
})
installProcess.on('exit',(data) => {
runningInstallProcesses[name] = null;
resolve()
})
runningInstallProcesses[name] = installProcess
}
}else{
resolve(lang['Already Installing...'])
}
})
}
const disableModule = (name,status) => {
// set status to `false` to enable
const modulePath = getModulePath(name)
const properties = getModuleProperties(name);
const propertiesPath = modulePath + 'package.json'
var packageJson = {
name: name
}
try{
packageJson = JSON.parse(fs.readFileSync(propertiesPath))
}catch(err){
}
packageJson.disabled = status;
fs.writeFileSync(propertiesPath,s.prettyPrint(packageJson))
}
const deleteModule = (name) => {
// requires restart for changes to take effect
try{
const modulePath = modulesBasePath + name
fs.remove(modulePath, (err) => {
console.log(err)
})
return true
}catch(err){
console.log(err)
return false
}
}
const loadModule = (shinobiModule) => {
const moduleName = shinobiModule.name
s.customAutoLoadModules[moduleName] = {}
var customModulePath = modulesBasePath + '/' + moduleName
if(shinobiModule.isIgnitor){
s.customAutoLoadModules[moduleName].type = 'file'
try{
require(customModulePath)(s,config,lang,app,io)
}catch(err){
s.systemLog('Failed to Load Module : ' + moduleName)
s.systemLog(err)
}
}else if(shinobiModule.isDirectory){
s.customAutoLoadModules[moduleName].type = 'folder'
try{
require(customModulePath)(s,config,lang,app,io)
fs.readdir(customModulePath,function(err,folderContents){
folderContents.forEach(function(name){
switch(name){
case'web':
var webFolder = s.checkCorrectPathEnding(customModulePath) + 'web/'
fs.readdir(webFolder,function(err,webFolderContents){
webFolderContents.forEach(function(name){
switch(name){
case'libs':
case'pages':
if(name === 'libs'){
if(config.webPaths.home !== '/'){
app.use('/libs',express.static(webFolder + '/libs'))
}
app.use(s.checkCorrectPathEnding(config.webPaths.home)+'libs',express.static(webFolder + '/libs'))
app.use(s.checkCorrectPathEnding(config.webPaths.admin)+'libs',express.static(webFolder + '/libs'))
app.use(s.checkCorrectPathEnding(config.webPaths.super)+'libs',express.static(webFolder + '/libs'))
}
var libFolder = webFolder + name + '/'
fs.readdir(libFolder,function(err,webFolderContents){
webFolderContents.forEach(function(libName){
var thirdLevelName = libFolder + libName
switch(libName){
case'js':
case'css':
case'blocks':
fs.readdir(thirdLevelName,function(err,webFolderContents){
webFolderContents.forEach(function(filename){
var fullPath = thirdLevelName + '/' + filename
var blockPrefix = ''
switch(true){
case searchText('super.',filename):
blockPrefix = 'super'
break;
case searchText('admin.',filename):
blockPrefix = 'admin'
break;
}
switch(libName){
case'js':
s.customAutoLoadTree[blockPrefix + 'LibsJs'].push(filename)
break;
case'css':
s.customAutoLoadTree[blockPrefix + 'LibsCss'].push(filename)
break;
case'blocks':
s.customAutoLoadTree[blockPrefix + 'PageBlocks'].push(fullPath)
break;
}
})
})
break;
default:
if(libName.indexOf('.ejs') > -1){
s.customAutoLoadTree.pages.push(thirdLevelName)
}
break;
}
})
})
break;
}
})
})
break;
case'languages':
var languagesFolder = s.checkCorrectPathEnding(customModulePath) + 'languages/'
fs.readdir(languagesFolder,function(err,files){
if(err)return console.log(err);
files.forEach(function(filename){
var fileData = require(languagesFolder + filename)
var rule = filename.replace('.json','')
if(config.language === rule){
lang = Object.assign(lang,fileData)
}
if(s.loadedLanguages[rule]){
s.loadedLanguages[rule] = Object.assign(s.loadedLanguages[rule],fileData)
}else{
s.loadedLanguages[rule] = Object.assign(s.copySystemDefaultLanguage(),fileData)
}
})
})
break;
case'definitions':
var definitionsFolder = s.checkCorrectPathEnding(customModulePath) + 'definitions/'
fs.readdir(definitionsFolder,function(err,files){
if(err)return console.log(err);
files.forEach(function(filename){
var fileData = require(definitionsFolder + filename)
var rule = filename.replace('.json','').replace('.js','')
if(config.language === rule){
s.definitions = s.mergeDeep(s.definitions,fileData)
}
if(s.loadedDefinitons[rule]){
s.loadedDefinitons[rule] = s.mergeDeep(s.loadedDefinitons[rule],fileData)
}else{
s.loadedDefinitons[rule] = s.mergeDeep(s.copySystemDefaultDefinitions(),fileData)
}
})
})
break;
}
})
})
}catch(err){
s.systemLog('Failed to Load Module : ' + moduleName)
s.systemLog(err)
}
}
}
const moveModuleToNameInProperties = (modulePath,packageRoot,properties) => {
return new Promise((resolve,reject) => {
const packageRootParts = packageRoot.split('/')
const filename = packageRootParts[packageRootParts.length - 1]
fs.move(modulePath + packageRoot,modulesBasePath + filename,(err) => {
if(packageRoot){
fs.remove(modulePath, (err) => {
if(err)console.log(err)
resolve(filename)
})
}else{
resolve(filename)
}
})
}else{
fs.mkdirSync(folderPath)
})
}
const initializeAllModules = async () => {
s.customAutoLoadModules = {}
s.customAutoLoadTree = {
pages: [],
PageBlocks: [],
LibsJs: [],
LibsCss: [],
adminPageBlocks: [],
adminLibsJs: [],
adminLibsCss: [],
superPageBlocks: [],
superLibsJs: [],
superLibsCss: []
}
fs.readdir(modulesBasePath,function(err,folderContents){
if(!err && folderContents.length > 0){
getModules(true).forEach((shinobiModule) => {
if(shinobiModule.properties.disabled){
return;
}
loadModule(shinobiModule)
})
}else{
fs.mkdir(modulesBasePath,() => {})
}
})
}
/**
* API : Superuser : Custom Auto Load Package Download.
*/
app.get(config.webPaths.superApiPrefix+':auth/package/list', async (req,res) => {
s.superAuth(req.params, async (resp) => {
s.closeJsonResponse(res,{
ok: true,
modules: getModules()
})
},res,req)
})
/**
* API : Superuser : Custom Auto Load Package Download.
*/
app.post(config.webPaths.superApiPrefix+':auth/package/download', async (req,res) => {
s.superAuth(req.params, async (resp) => {
try{
const url = req.body.downloadUrl
const packageRoot = req.body.packageRoot || ''
const packageName = req.body.packageName || extractNameFromPackage(url)
const modulePath = getModulePath(packageName)
await downloadModule(url,packageName)
const properties = getModuleProperties(packageName)
const newName = await moveModuleToNameInProperties(modulePath,packageRoot,properties)
const chosenName = newName ? newName : packageName
disableModule(chosenName,true)
s.closeJsonResponse(res,{
ok: true,
moduleName: chosenName,
newModule: getModule(chosenName)
})
}catch(err){
s.closeJsonResponse(res,{
ok: false,
error: err
})
}
},res,req)
})
// /**
// * API : Superuser : Custom Auto Load Package Update.
// */
// app.post(config.webPaths.superApiPrefix+':auth/package/update', async (req,res) => {
// s.superAuth(req.params, async (resp) => {
// try{
// const url = req.body.downloadUrl
// const packageRoot = req.body.packageRoot || ''
// const packageName = req.body.packageName || extractNameFromPackage(url)
// const modulePath = getModulePath(packageName)
// await downloadModule(url,packageName)
// const properties = getModuleProperties(packageName)
// const newName = await moveModuleToNameInProperties(modulePath,packageRoot,properties)
// const chosenName = newName ? newName : packageName
//
// disableModule(chosenName,true)
// s.closeJsonResponse(res,{
// ok: true,
// moduleName: chosenName,
// newModule: getModule(chosenName)
// })
// }catch(err){
// s.closeJsonResponse(res,{
// ok: false,
// error: err
// })
// }
// },res,req)
// })
/**
* API : Superuser : Custom Auto Load Package Install.
*/
app.post(config.webPaths.superApiPrefix+':auth/package/install', (req,res) => {
s.superAuth(req.params, async (resp) => {
const packageName = req.body.packageName
const response = {ok: true}
const error = await installModule(packageName)
if(error){
response.ok = false
response.msg = error
}
s.closeJsonResponse(res,response)
},res,req)
})
/**
* API : Superuser : Custom Auto Load Package set Status (Enabled or Disabled).
*/
app.post(config.webPaths.superApiPrefix+':auth/package/status', (req,res) => {
s.superAuth(req.params, async (resp) => {
const status = req.body.status
const packageName = req.body.packageName
const selection = status == 'true' ? true : false
disableModule(packageName,selection)
s.closeJsonResponse(res,{ok: true, status: selection})
},res,req)
})
/**
* API : Superuser : Custom Auto Load Package Delete
*/
app.post(config.webPaths.superApiPrefix+':auth/package/delete', async (req,res) => {
s.superAuth(req.params, async (resp) => {
const packageName = req.body.packageName
const response = deleteModule(packageName)
s.closeJsonResponse(res,{ok: response})
},res,req)
})
/**
* API : Superuser : Custom Auto Load Package Reload All
*/
app.post(config.webPaths.superApiPrefix+':auth/package/reloadAll', async (req,res) => {
s.superAuth(req.params, async (resp) => {
await initializeAllModules();
s.closeJsonResponse(res,{ok: true})
},res,req)
})
// Initialize Modules on Start
await initializeAllModules();
}
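The package-name extraction used by the download endpoint above is a plain path/extension split: take the last URL segment, then keep everything before the first dot. Shown standalone (the example URL is hypothetical):

```javascript
// Mirrors extractNameFromPackage above: last path segment of a
// download URL with everything after the first dot stripped.
const extractNameFromPackage = (filePath) => {
    const filePathParts = filePath.split('/');
    return filePathParts[filePathParts.length - 1].split('.')[0];
};

console.log(extractNameFromPackage('https://example.com/mods/FaceDetect.zip')); // 'FaceDetect'
console.log(extractNameFromPackage('LicensePlate.tar.gz'));                     // 'LicensePlate'
```

Note that a dot inside the base name (e.g. `my.module.zip`) would be truncated to `my`, since the split keeps only the first segment.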


@ -1,692 +0,0 @@
//
// Shinobi - fork of pam-diff
// Copyright (C) 2018 Kevin Godell
// Author : Kevin Godell, https://github.com/kevinGodell
// npmjs : https://www.npmjs.com/package/pam-diff
// Github : https://github.com/kevinGodell/pam-diff
//
'use strict';
const { Transform } = require('stream');
const PP = require('polygon-points');
/**
*
* @param chunk
* @private
*/
var _getMatrixFromPoints = function(thisRegion) {
var coordinates = [
thisRegion.topLeft,
{"x" : thisRegion.bottomRight.x, "y" : thisRegion.topLeft.y},
thisRegion.bottomRight
]
var width = Math.sqrt( Math.pow(coordinates[1].x - coordinates[0].x, 2) + Math.pow(coordinates[1].y - coordinates[0].y, 2));
var height = Math.sqrt( Math.pow(coordinates[2].x - coordinates[1].x, 2) + Math.pow(coordinates[2].y - coordinates[1].y, 2))
return {
x: coordinates[0].x,
y: coordinates[0].y,
width: width,
height: height,
tag: thisRegion.name
}
}
class PamDiff extends Transform {
/**
*
* @param [options] {Object}
* @param [callback] {Function}
*/
constructor(options, callback) {
super(options);
Transform.call(this, {objectMode: true});
this.difference = PamDiff._parseOptions('difference', options);//global option, can be overridden per region
this.percent = PamDiff._parseOptions('percent', options);//global option, can be overridden per region
this.regions = PamDiff._parseOptions('regions', options);//can be no regions or a single region or multiple regions. if no regions, all pixels will be compared.
this.drawMatrix = PamDiff._parseOptions('drawMatrix', options);//when set to "1", bounding box matrices are calculated for grayscale diffs and attached to diff events
this.callback = callback;//callback function to be called when pixel difference is detected
this._parseChunk = this._parseFirstChunk;//first parsing will be reading settings and configuring internal pixel reading
}
/**
*
* @param option {String}
* @param options {Object}
* @return {*}
* @private
*/
static _parseOptions(option, options) {
if (options && options.hasOwnProperty(option)) {
return options[option];
}
return null;
}
/**
*
* @param number {Number}
* @param def {Number}
* @param low {Number}
* @param high {Number}
* @return {Number}
* @private
*/
static _validateNumber(number, def, low, high) {
if (isNaN(number)) {
return def;
} else if (number < low) {
return low;
} else if (number > high) {
return high;
} else {
return number;
}
}
/**
*
* @deprecated
* @param string {String}
*/
setGrayscale(string) {
console.warn('grayscale option has been removed, "average" has proven to be the most accurate and is the default');
}
/**
*
* @param number {Number}
*/
set difference(number) {
this._difference = PamDiff._validateNumber(parseInt(number), 5, 1, 255);
}
/**
*
* @return {Number}
*/
get difference() {
return this._difference;
}
/**
*
* @param number {Number}
* @return {PamDiff}
*/
setDifference(number) {
this.difference = number;
return this;
}
/**
*
* @param number {Number}
*/
set percent(number) {
this._percent = PamDiff._validateNumber(parseInt(number), 5, 1, 100);
}
/**
*
* @return {Number}
*/
get percent() {
return this._percent;
}
/**
*
* @param number {Number}
* @return {PamDiff}
*/
setPercent(number) {
this.percent = number;
return this;
}
/**
*
* @param array {Array}
*/
set regions(array) {
if (!array) {
if (this._regions) {
delete this._regions;
delete this._regionsLength;
delete this._minDiff;
}
this._diffs = 0;
} else if (!Array.isArray(array) || array.length < 1) {
throw new Error(`Regions must be an array of at least 1 region object {name: 'region1', difference: 10, percent: 10, polygon: [[0, 0], [0, 50], [50, 50], [50, 0]]}`);
} else {
this._regions = [];
this._minDiff = 255;
for (const region of array) {
if (!region.hasOwnProperty('name') || !region.hasOwnProperty('polygon')) {
throw new Error('Region must include a name and a polygon property');
}
const polygonPoints = new PP(region.polygon);
const difference = PamDiff._validateNumber(parseInt(region.difference), this._difference, 1, 255);
const percent = PamDiff._validateNumber(parseInt(region.percent), this._percent, 1, 100);
this._minDiff = Math.min(this._minDiff, difference);
this._regions.push(
{
name: region.name,
polygon: polygonPoints,
difference: difference,
percent: percent,
diffs: 0
}
);
}
this._regionsLength = this._regions.length;
this._createPointsInPolygons(this._regions, this._width, this._height);
}
}
/**
*
* @return {Array}
*/
get regions() {
return this._regions;
}
/**
*
* @param array {Array}
* @return {PamDiff}
*/
setRegions(array) {
this.regions = array;
return this;
}
/**
*
* @param func {Function}
*/
set callback(func) {
if (!func) {
delete this._callback;
} else if (typeof func === 'function' && func.length === 1) {
this._callback = func;
} else {
throw new Error('Callback must be a function that accepts 1 argument.');
}
}
/**
*
* @return {Function}
*/
get callback() {
return this._callback;
}
/**
*
* @param func {Function}
* @return {PamDiff}
*/
setCallback(func) {
this.callback = func;
return this;
}
/**
*
* @return {PamDiff}
*/
resetCache() {
//delete this._oldPix;
//delete this._newPix;
//delete this._width;
//delete this._length;
this._parseChunk = this._parseFirstChunk;
return this;
}
/**
*
* @param regions {Array}
* @param width {Number}
* @param height {Number}
* @private
*/
_createPointsInPolygons(regions, width, height) {
if (regions && width && height) {
this._pointsInPolygons = [];
for (const region of regions) {
const bitset = region.polygon.getBitset(this._width, this._height);
region.pointsLength = bitset.count;
this._pointsInPolygons.push(bitset.buffer);
}
}
}
/**
*
* @param chunk
* @private
*/
_blackAndWhitePixelDiff(chunk) {
this._newPix = chunk.pixels;
for (let y = 0, i = 0; y < this._height; y++) {
for (let x = 0; x < this._width; x++, i++) {
const diff = this._oldPix[i] !== this._newPix[i];
if (this._regions && diff === true) {
for (let j = 0; j < this._regionsLength; j++) {
if (this._pointsInPolygons[j][i]) {
this._regions[j].diffs++;
}
}
} else if (diff === true) {
this._diffs++;
}
}
}
if (this._regions) {
const regionDiffArray = [];
for (let i = 0; i < this._regionsLength; i++) {
const percent = Math.floor(100 * this._regions[i].diffs / this._regions[i].pointsLength);
if (percent >= this._regions[i].percent) {
regionDiffArray.push({name: this._regions[i].name, percent: percent});
}
this._regions[i].diffs = 0;
}
if (regionDiffArray.length > 0) {
const data = {trigger: regionDiffArray, pam: chunk.pam};
if (this._callback) {
this._callback(data);
}
if (this._readableState.pipesCount > 0) {
this.push(data);
}
if (this.listenerCount('diff') > 0) {
this.emit('diff', data);
}
}
} else {
const percent = Math.floor(100 * this._diffs / this._length);
if (percent >= this._percent) {
const data = {trigger: [{name: 'percent', percent: percent}], pam: chunk.pam};
if (this._callback) {
this._callback(data);
}
if (this._readableState.pipesCount > 0) {
this.push(data);
}
if (this.listenerCount('diff') > 0) {
this.emit('diff', data);
}
}
this._diffs = 0;
}
this._oldPix = this._newPix;
}
/**
*
* @param chunk
* @private
*/
_grayScalePixelDiffWithMatrices(chunk) {
this._newPix = chunk.pixels;
for (let j = 0; j < this._regionsLength; j++) {
this._regions[j].topLeft = {
x: this._width,
y: this._height
}
this._regions[j].bottomRight = {
x: 0,
y: 0
}
}
this.topLeft = {
x: this._width,
y: this._height
}
this.bottomRight = {
x: 0,
y: 0
}
for (let y = 0, i = 0; y < this._height; y++) {
for (let x = 0; x < this._width; x++, i++) {
if (this._oldPix[i] !== this._newPix[i]) {
const diff = Math.abs(this._oldPix[i] - this._newPix[i]);
if (this._regions && diff >= this._minDiff) {
for (let j = 0; j < this._regionsLength; j++) {
if (this._pointsInPolygons[j][i] && diff >= this._regions[j].difference) {
var theRegion = this._regions[j]
theRegion.diffs++;
if(theRegion.topLeft.x > x && theRegion.topLeft.y > y){
theRegion.topLeft.x = x
theRegion.topLeft.y = y
}
if(theRegion.bottomRight.x < x && theRegion.bottomRight.y < y){
theRegion.bottomRight.x = x
theRegion.bottomRight.y = y
}
}
}
} else if (diff >= this._difference) {
this._diffs++;
if(this.topLeft.x > x && this.topLeft.y > y){
this.topLeft.x = x
this.topLeft.y = y
}
if(this.bottomRight.x < x && this.bottomRight.y < y){
this.bottomRight.x = x
this.bottomRight.y = y
}
}
}
}
}
if (this._regions) {
const regionDiffArray = [];
for (let i = 0; i < this._regionsLength; i++) {
var thisRegion = this._regions[i]
const percent = Math.floor(100 * thisRegion.diffs / thisRegion.pointsLength);
if (percent >= thisRegion.percent) {
// create matrix from points >>
thisRegion._matrix = _getMatrixFromPoints(thisRegion)
// create matrix from points />>
regionDiffArray.push({name: thisRegion.name, percent: percent, matrix: thisRegion._matrix});
}
thisRegion.diffs = 0;
}
if (regionDiffArray.length > 0) {
this._matrix = _getMatrixFromPoints(this)
const data = {trigger: regionDiffArray, pam: chunk.pam, matrix: this._matrix};
if (this._callback) {
this._callback(data);
}
if (this._readableState.pipesCount > 0) {
this.push(data);
}
if (this.listenerCount('diff') > 0) {
this.emit('diff', data);
}
}
} else {
const percent = Math.floor(100 * this._diffs / this._length);
if (percent >= this._percent) {
this._matrix = _getMatrixFromPoints(this)
const data = {trigger: [{name: 'percent', percent: percent, matrix: this._matrix}], pam: chunk.pam};
if (this._callback) {
this._callback(data);
}
if (this._readableState.pipesCount > 0) {
this.push(data);
}
if (this.listenerCount('diff') > 0) {
this.emit('diff', data);
}
}
this._diffs = 0;
}
this._oldPix = this._newPix;
}
/**
*
* @param chunk
* @private
*/
_grayScalePixelDiff(chunk) {
this._newPix = chunk.pixels;
for (let y = 0, i = 0; y < this._height; y++) {
for (let x = 0; x < this._width; x++, i++) {
if (this._oldPix[i] !== this._newPix[i]) {
const diff = Math.abs(this._oldPix[i] - this._newPix[i]);
if (this._regions && diff >= this._minDiff) {
for (let j = 0; j < this._regionsLength; j++) {
if (this._pointsInPolygons[j][i] && diff >= this._regions[j].difference) {
this._regions[j].diffs++;
}
}
} else {
if (diff >= this._difference) {
this._diffs++;
}
}
}
}
}
if (this._regions) {
const regionDiffArray = [];
for (let i = 0; i < this._regionsLength; i++) {
const percent = Math.floor(100 * this._regions[i].diffs / this._regions[i].pointsLength);
if (percent >= this._regions[i].percent) {
regionDiffArray.push({name: this._regions[i].name, percent: percent});
}
this._regions[i].diffs = 0;
}
if (regionDiffArray.length > 0) {
const data = {trigger: regionDiffArray, pam: chunk.pam};
if (this._callback) {
this._callback(data);
}
if (this._readableState.pipesCount > 0) {
this.push(data);
}
if (this.listenerCount('diff') > 0) {
this.emit('diff', data);
}
}
} else {
const percent = Math.floor(100 * this._diffs / this._length);
if (percent >= this._percent) {
const data = {trigger: [{name: 'percent', percent: percent}], pam: chunk.pam};
if (this._callback) {
this._callback(data);
}
if (this._readableState.pipesCount > 0) {
this.push(data);
}
if (this.listenerCount('diff') > 0) {
this.emit('diff', data);
}
}
this._diffs = 0;
}
this._oldPix = this._newPix;
}
/**
*
* @param chunk
* @private
*/
_rgbPixelDiff(chunk) {
this._newPix = chunk.pixels;
for (let y = 0, i = 0, p = 0; y < this._height; y++) {
for (let x = 0; x < this._width; x++, i += 3, p++) {
if (this._oldPix[i] !== this._newPix[i] || this._oldPix[i + 1] !== this._newPix[i + 1] || this._oldPix[i + 2] !== this._newPix[i + 2]) {
const diff = Math.abs(this._oldPix[i] + this._oldPix[i + 1] + this._oldPix[i + 2] - this._newPix[i] - this._newPix[i + 1] - this._newPix[i + 2])/3;
if (this._regions && diff >= this._minDiff) {
for (let j = 0; j < this._regionsLength; j++) {
if (this._pointsInPolygons[j][p] && diff >= this._regions[j].difference) {
this._regions[j].diffs++;
}
}
} else {
if (diff >= this._difference) {
this._diffs++;
}
}
}
}
}
if (this._regions) {
const regionDiffArray = [];
for (let i = 0; i < this._regionsLength; i++) {
const percent = Math.floor(100 * this._regions[i].diffs / this._regions[i].pointsLength);
if (percent >= this._regions[i].percent) {
regionDiffArray.push({name: this._regions[i].name, percent: percent});
}
this._regions[i].diffs = 0;
}
if (regionDiffArray.length > 0) {
const data = {trigger: regionDiffArray, pam: chunk.pam};
if (this._callback) {
this._callback(data);
}
if (this._readableState.pipesCount > 0) {
this.push(data);
}
if (this.listenerCount('diff') > 0) {
this.emit('diff', data);
}
}
} else {
const percent = Math.floor(100 * this._diffs / this._length);
if (percent >= this._percent) {
const data = {trigger: [{name: 'percent', percent: percent}], pam: chunk.pam};
if (this._callback) {
this._callback(data);
}
if (this._readableState.pipesCount > 0) {
this.push(data);
}
if (this.listenerCount('diff') > 0) {
this.emit('diff', data);
}
}
this._diffs = 0;
}
this._oldPix = this._newPix;
}
/**
*
* @param chunk
* @private
*/
_rgbAlphaPixelDiff(chunk) {
this._newPix = chunk.pixels;
for (let y = 0, i = 0, p = 0; y < this._height; y++) {
for (let x = 0; x < this._width; x++, i += 4, p++) {
if (this._oldPix[i] !== this._newPix[i] || this._oldPix[i + 1] !== this._newPix[i + 1] || this._oldPix[i + 2] !== this._newPix[i + 2]) {
const diff = Math.abs(this._oldPix[i] + this._oldPix[i + 1] + this._oldPix[i + 2] - this._newPix[i] - this._newPix[i + 1] - this._newPix[i + 2])/3;
if (this._regions && diff >= this._minDiff) {
for (let j = 0; j < this._regionsLength; j++) {
if (this._pointsInPolygons[j][p] && diff >= this._regions[j].difference) {
this._regions[j].diffs++;
}
}
} else {
if (diff >= this._difference) {
this._diffs++;
}
}
}
}
}
if (this._regions) {
const regionDiffArray = [];
for (let i = 0; i < this._regionsLength; i++) {
const percent = Math.floor(100 * this._regions[i].diffs / this._regions[i].pointsLength);
if (percent >= this._regions[i].percent) {
regionDiffArray.push({name: this._regions[i].name, percent: percent});
}
this._regions[i].diffs = 0;
}
if (regionDiffArray.length > 0) {
const data = {trigger: regionDiffArray, pam: chunk.pam};
if (this._callback) {
this._callback(data);
}
if (this._readableState.pipesCount > 0) {
this.push(data);
}
if (this.listenerCount('diff') > 0) {
this.emit('diff', data);
}
}
} else {
const percent = Math.floor(100 * this._diffs / this._length);
if (percent >= this._percent) {
const data = {trigger: [{name: 'percent', percent: percent}], pam: chunk.pam};
if (this._callback) {
this._callback(data);
}
if (this._readableState.pipesCount > 0) {
this.push(data);
}
if (this.listenerCount('diff') > 0) {
this.emit('diff', data);
}
}
this._diffs = 0;
}
this._oldPix = this._newPix;
}
/**
*
* @param chunk
* @private
*/
_parseFirstChunk(chunk) {
this._width = parseInt(chunk.width);
this._height = parseInt(chunk.height);
this._oldPix = chunk.pixels;
this._length = this._width * this._height;
this._createPointsInPolygons(this._regions, this._width, this._height);
switch (chunk.tupltype) {
case 'blackandwhite' :
this._parseChunk = this._blackAndWhitePixelDiff;
break;
case 'grayscale' :
if(this.drawMatrix === "1"){
this._parseChunk = this._grayScalePixelDiffWithMatrices;
}else{
this._parseChunk = this._grayScalePixelDiff;
}
break;
case 'rgb' :
this._parseChunk = this._rgbPixelDiff;
//this._increment = 3;//future use
break;
case 'rgb_alpha' :
this._parseChunk = this._rgbAlphaPixelDiff;
//this._increment = 4;//future use
break;
default :
throw Error(`Unsupported tupltype: ${chunk.tupltype}. Supported tupltypes include grayscale(gray), blackandwhite(monob), rgb(rgb24), and rgb_alpha(rgba).`);
}
}
/**
*
* @param chunk
* @param encoding
* @param callback
* @private
*/
_transform(chunk, encoding, callback) {
this._parseChunk(chunk);
callback();
}
/**
*
* @param callback
* @private
*/
_flush(callback) {
this.resetCache();
callback();
}
}
/**
*
* @type {PamDiff}
*/
module.exports = PamDiff;
//todo get bounding box of all regions combined to exclude some pixels before checking if they exist inside specific regions
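For reference, the core counting step that every `_*PixelDiff` method above performs can be sketched in isolation. This is a hypothetical standalone sketch, not part of the module: it mirrors the grayscale branch, counting pixels whose absolute difference meets the threshold and expressing the count as a percentage of all pixels, as `_grayScalePixelDiff` does per frame.

```javascript
// Standalone sketch of the grayscale diff logic in _grayScalePixelDiff:
// count pixels whose absolute difference meets the threshold, then express
// that count as a percentage of the total pixel count.
function percentChanged(oldPix, newPix, difference) {
  let diffs = 0;
  for (let i = 0; i < oldPix.length; i++) {
    if (Math.abs(oldPix[i] - newPix[i]) >= difference) diffs++;
  }
  return Math.floor(100 * diffs / oldPix.length);
}

// 2 of 4 pixels change by more than the threshold of 5.
const frameA = Uint8Array.from([10, 10, 10, 10]);
const frameB = Uint8Array.from([10, 200, 200, 10]);
console.log(percentChanged(frameA, frameB, 5)); // 50
```

In the class, a `diff` event fires when this percentage meets the configured `percent`, either globally or per region.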


@@ -1,9 +1,142 @@
var fs = require('fs')
var exec = require('child_process').exec
module.exports = function(s,config,lang,app,io){
if(config.dropInEventServer === true){
if(config.dropInEventForceSaveEvent === undefined)config.dropInEventForceSaveEvent = true
if(config.dropInEventDeleteFileAfterTrigger === undefined)config.dropInEventDeleteFileAfterTrigger = true
var beforeMonitorsLoadedOnStartup = function(){
var fileQueueForDeletion = {}
var fileQueue = {}
var search = function(searchIn,searchFor){
return searchIn.indexOf(searchFor) > -1
}
var getFileNameFromPath = function(filePath){
var fileParts = filePath.split('/')
return fileParts[fileParts.length - 1]
}
var clipPathEnding = function(filePath){
var newPath = filePath + ''
if (newPath.substring(newPath.length-1) == "/"){
newPath = newPath.substring(0, newPath.length-1);
}
return newPath;
}
var processFile = function(filePath,monitorConfig){
var ke = monitorConfig.ke
var mid = monitorConfig.mid
var filename = getFileNameFromPath(filePath)
if(search(filename,'.jpg') || search(filename,'.jpeg')){
var snapPath = s.dir.streams + ke + '/' + mid + '/s.jpg'
fs.unlink(snapPath,function(err){
fs.createReadStream(filePath).pipe(fs.createWriteStream(snapPath))
s.triggerEvent({
id: mid,
ke: ke,
details: {
confidence: 100,
name: filename,
plug: "dropInEvent",
reason: "ftpServer"
},
},config.dropInEventForceSaveEvent)
})
}else{
var reason = "ftpServer"
if(search(filename,'.mp4')){
fs.stat(filePath,function(err,stats){
if(err)return;
var startTime = stats.ctime
var endTime = stats.mtime
var shinobiFilename = s.formattedTime(startTime) + '.mp4'
var recordingPath = s.getVideoDirectory(monitorConfig) + shinobiFilename
var writeStream = fs.createWriteStream(recordingPath)
fs.createReadStream(filePath).pipe(writeStream)
writeStream.on('finish', () => {
s.insertCompletedVideo(s.group[monitorConfig.ke].rawMonitorConfigurations[monitorConfig.mid],{
file: shinobiFilename,
events: [
{
id: mid,
ke: ke,
time: new Date(),
details: {
confidence: 100,
name: filename,
plug: "dropInEvent",
reason: "ftpServer"
}
}
],
},function(){
})
})
})
}
var completeAction = function(){
s.triggerEvent({
id: mid,
ke: ke,
details: {
confidence: 100,
name: filename,
plug: "dropInEvent",
reason: reason
},
doObjectDetection: (s.isAtleatOneDetectorPluginConnected && s.group[ke].rawMonitorConfigurations[mid].details.detector_use_detect_object === '1')
},config.dropInEventForceSaveEvent)
}
if(search(filename,'.txt')){
fs.readFile(filePath,{encoding: 'utf-8'},function(err,data){
if(data){
reason = data.split('\n')[0] || filename
}else if(filename){
reason = filename
}
completeAction()
})
}else{
completeAction()
}
}
}
var onFileOrFolderFound = function(filePath,deletionKey,monitorConfig){
fs.stat(filePath,function(err,stats){
if(!err){
if(stats.isDirectory()){
fs.readdir(filePath,function(err,files){
if(files){
files.forEach(function(filename){
onFileOrFolderFound(clipPathEnding(filePath) + '/' + filename,deletionKey,monitorConfig)
})
}else if(err){
console.log(err)
}
})
}else{
if(!fileQueue[filePath]){
processFile(filePath,monitorConfig)
if(config.dropInEventDeleteFileAfterTrigger){
clearTimeout(fileQueue[filePath])
fileQueue[filePath] = setTimeout(function(){
exec('rm -rf "' + filePath + '"',function(err){
delete(fileQueue[filePath])
})
},1000 * 60 * 5)
}
}
}
if(config.dropInEventDeleteFileAfterTrigger){
clearTimeout(fileQueueForDeletion[deletionKey])
fileQueueForDeletion[deletionKey] = setTimeout(function(){
exec('rm -rf "' + deletionKey + '"',function(err){
delete(fileQueueForDeletion[deletionKey])
})
},1000 * 60 * 5)
}
}
})
}
var createDropInEventsDirectory = function(){
if(!config.dropInEventsDir){
config.dropInEventsDir = s.dir.streams + 'dropInEvents/'
}
@@ -37,6 +170,7 @@ module.exports = function(s,config,lang,app,io){
directory = s.dir.dropInEvents + e.ke + '/' + (e.id || e.mid) + '/'
fs.mkdir(directory,function(err){
s.handleFolderError(err)
exec('rm -rf "' + directory + '*"',function(){})
callback(err,directory)
})
})
@@ -45,125 +179,36 @@
var ke = monitorConfig.ke
var mid = monitorConfig.mid
var groupEventDropDir = s.dir.dropInEvents + ke
createDropInEventDirectory(monitorConfig,function(err,monitorEventDropDir){
var monitorEventDropDir = getDropInEventDir(monitorConfig)
var fileQueue = {}
s.group[monitorConfig.ke].activeMonitors[monitorConfig.mid].dropInEventFileQueue = fileQueue
var search = function(searchIn,searchFor){
return searchIn.indexOf(searchFor) > -1
}
var processFile = function(filename){
var filePath = monitorEventDropDir + filename
if(search(filename,'.jpg') || search(filename,'.jpeg')){
var snapPath = s.dir.streams + ke + '/' + mid + '/s.jpg'
fs.unlink(snapPath,function(err){
fs.createReadStream(filePath).pipe(fs.createWriteStream(snapPath))
s.triggerEvent({
id: mid,
ke: ke,
details: {
confidence: 100,
name: filename,
plug: "dropInEvent",
reason: "ftpServer"
},
},config.dropInEventForceSaveEvent)
})
}else{
var reason = "ftpServer"
if(search(filename,'.mp4')){
fs.stat(filePath,function(err,stats){
var startTime = stats.ctime
var endTime = stats.mtime
var shinobiFilename = s.formattedTime(startTime) + '.mp4'
var recordingPath = s.getVideoDirectory(monitorConfig) + shinobiFilename
var writeStream = fs.createWriteStream(recordingPath)
fs.createReadStream(filePath).pipe(writeStream)
writeStream.on('finish', () => {
s.insertCompletedVideo(s.group[monitorConfig.ke].rawMonitorConfigurations[monitorConfig.mid],{
file : shinobiFilename
},function(){
})
})
})
}
var completeAction = function(){
s.triggerEvent({
id: mid,
ke: ke,
details: {
confidence: 100,
name: filename,
plug: "dropInEvent",
reason: reason
}
},config.dropInEventForceSaveEvent)
}
if(search(filename,'.txt')){
fs.readFile(filePath,{encoding: 'utf-8'},function(err,data){
if(data){
reason = data.split('\n')[0] || filename
}else if(filename){
reason = filename
}
completeAction()
})
}else{
completeAction()
}
}
if(config.dropInEventDeleteFileAfterTrigger){
setTimeout(function(){
fs.unlink(filePath,function(err){
})
},1000 * 60 * 5)
}
}
var eventTrigger = function(eventType,filename,stats){
if(stats.isDirectory()){
fs.readdir(monitorEventDropDir + filename,function(err,files){
if(files){
files.forEach(function(filename){
processFile(filename)
})
}else if(err){
console.log(err)
}
})
}else{
processFile(filename)
}
}
var directoryWatch = fs.watch(monitorEventDropDir,function(eventType,filename){
fs.stat(monitorEventDropDir + filename,function(err,stats){
if(!err){
clearTimeout(fileQueue[filename])
fileQueue[filename] = setTimeout(function(){
eventTrigger(eventType,filename,stats)
},1750)
}
})
})
s.group[monitorConfig.ke].activeMonitors[monitorConfig.mid].dropInEventWatcher = directoryWatch
})
createDropInEventDirectory(monitorConfig,function(err,monitorEventDropDir){})
}
// FTP Server
if(config.ftpServer === true){
createDropInEventsDirectory()
if(!config.ftpServerPort)config.ftpServerPort = 21
if(!config.ftpServerUrl)config.ftpServerUrl = `ftp://0.0.0.0:${config.ftpServerPort}`
config.ftpServerUrl = config.ftpServerUrl.replace('{{PORT}}',config.ftpServerPort)
const FtpSrv = require('ftp-srv')
const ftpServer = new FtpSrv({
url: config.ftpServerUrl,
log: require('bunyan').createLogger({
name: 'ftp-srv',
level: 100
}),
})
ftpServer.on('login', (data, resolve, reject) => {
var username = data.username
var password = data.password
ftpServer.on('login', ({connection, username, password}, resolve, reject) => {
s.basicOrApiAuthentication(username,password,function(err,user){
if(user){
connection.on('STOR', (error, fileName) => {
if(!fileName)return;
var pathPieces = fileName.replace(s.dir.dropInEvents,'').split('/')
var ke = pathPieces[0]
var mid = pathPieces[1]
var firstDroppedPart = pathPieces[2]
var monitorEventDropDir = s.dir.dropInEvents + ke + '/' + mid + '/'
var deleteKey = monitorEventDropDir + firstDroppedPart
onFileOrFolderFound(monitorEventDropDir + firstDroppedPart,deleteKey,{ke: ke, mid: mid})
})
resolve({root: s.dir.dropInEvents + user.ke})
}else{
// reject(new Error('Failed Authorization'))
@@ -171,7 +216,7 @@ module.exports = function(s,config,lang,app,io){
})
})
ftpServer.on('client-error', ({connection, context, error}) => {
console.log('error')
console.log('client-error',error)
})
ftpServer.listen().then(() => {
s.systemLog(`FTP Server running on port ${config.ftpServerPort}...`)
@@ -180,7 +225,6 @@
})
}
//add extensions
s.beforeMonitorsLoadedOnStartup(beforeMonitorsLoadedOnStartup)
s.onMonitorInit(onMonitorInit)
s.onMonitorStop(onMonitorStop)
}
@@ -190,7 +234,9 @@
if(config.smtpServerHideStartTls === undefined)config.smtpServerHideStartTls = null
var SMTPServer = require("smtp-server").SMTPServer;
if(!config.smtpServerPort && (config.smtpServerSsl && config.smtpServerSsl.enabled !== false || config.ssl)){config.smtpServerPort = 465}else if(!config.smtpServerPort){config.smtpServerPort = 25}
var smtpOptions = {
config.smtpServerOptions = config.smtpServerOptions ? config.smtpServerOptions : {}
var smtpOptions = Object.assign({
logger: config.debugLog || config.smtpServerLog,
hideSTARTTLS: config.smtpServerHideStartTls,
onAuth(auth, session, callback) {
var username = auth.username
@@ -207,7 +253,7 @@
var split = address.address.split('@')
var monitorId = split[0]
var ke = session.user
if(s.group[ke].rawMonitorConfigurations[monitorId] && s.group[ke].activeMonitors[monitorId].isStarted === true){
if(s.group[ke] && s.group[ke].rawMonitorConfigurations[monitorId] && s.group[ke].activeMonitors[monitorId].isStarted === true){
session.monitorId = monitorId
}else{
return callback(new Error(lang['No Monitor Exists with this ID.']))
@@ -218,6 +264,7 @@
if(session.monitorId){
var ke = session.user
var monitorId = session.monitorId
var details = s.group[ke].rawMonitorConfigurations[monitorId].details
var reasonTag = 'smtpServer'
var text = ''
stream.on('data',function(data){
@@ -255,7 +302,8 @@
name: 'smtpServer',
plug: "dropInEvent",
reason: reasonTag
}
},
doObjectDetection: (s.isAtleatOneDetectorPluginConnected && details.detector_use_detect_object === '1')
},config.dropInEventForceSaveEvent)
callback()
})
@@ -263,7 +311,7 @@
callback()
}
}
}
},config.smtpServerOptions)
if(config.smtpServerSsl && config.smtpServerSsl.enabled !== false || config.ssl && config.ssl.cert && config.ssl.key){
var key = config.ssl.key || fs.readFileSync(config.smtpServerSsl.key)
var cert = config.ssl.cert || fs.readFileSync(config.smtpServerSsl.cert)


@@ -3,7 +3,83 @@ var execSync = require('child_process').execSync;
var exec = require('child_process').exec;
var spawn = require('child_process').spawn;
var request = require('request');
// Matrix In Region Libs >
var SAT = require('sat')
var V = SAT.Vector;
var P = SAT.Polygon;
var B = SAT.Box;
// Matrix In Region Libs />
module.exports = function(s,config,lang){
const {
moveCameraPtzToMatrix,
} = require('./control/ptz.js')(s,config,lang)
const countObjects = async (event) => {
const matrices = event.details.matrices
const eventsCounted = s.group[event.ke].activeMonitors[event.id].eventsCounted || {}
if(matrices){
matrices.forEach((matrix)=>{
const id = matrix.tag
if(!eventsCounted[id])eventsCounted[id] = {times: [], count: {}, tag: matrix.tag}
if(!isNaN(matrix.id))eventsCounted[id].count[matrix.id] = 1
eventsCounted[id].times.push(new Date().getTime())
})
}
return eventsCounted
}
const isAtleastOneMatrixInRegion = function(regions,matrices,callback){
var regionPolys = []
regions.forEach(function(region,n){
var polyPoints = []
region.points.forEach(function(point){
polyPoints.push(new V(parseInt(point[0]),parseInt(point[1])))
})
regionPolys[n] = new P(new V(0,0), polyPoints)
})
var collisions = []
var foundInRegion = false
matrices.forEach(function(matrix){
var matrixPoly = new B(new V(matrix.x, matrix.y), matrix.width, matrix.height).toPolygon()
regionPolys.forEach(function(region,n){
var response = new SAT.Response()
var collided = SAT.testPolygonPolygon(matrixPoly, region, response)
if(collided === true){
collisions.push({
matrix: matrix,
region: regions[n]
})
foundInRegion = true
}
})
})
if(callback)callback(foundInRegion,collisions)
return foundInRegion
}
const scanMatricesforCollisions = function(region,matrices){
var collisions = []
if (!region || !matrices){
return collisions
}
var polyPoints = []
region.points.forEach(function(point){
polyPoints.push(new V(parseInt(point[0]),parseInt(point[1])))
})
var regionPoly = new P(new V(0,0), polyPoints)
matrices.forEach(function(matrix){
if (matrix){
var matrixPoly = new B(new V(matrix.x, matrix.y), matrix.width, matrix.height).toPolygon()
var response = new SAT.Response()
var collided = SAT.testPolygonPolygon(matrixPoly, regionPoly, response)
if(collided === true){
collisions.push(matrix)
}
}
})
return collisions
}
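The SAT checks above build a box polygon from each detection matrix and test it against the region polygons. As a minimal illustration, a plain-JS stand-in (not the `sat` package used above) for the degenerate case where the region is a plain rectangle:

```javascript
// Simplified stand-in for the SAT collision test: when both shapes are
// axis-aligned rectangles, SAT.testPolygonPolygon reduces to this
// standard box-overlap check.
function boxesOverlap(a, b) {
  return a.x < b.x + b.width && a.x + a.width > b.x &&
         a.y < b.y + b.height && a.y + a.height > b.y;
}

// A detection matrix partially inside a 100x100 region.
const region = { x: 0, y: 0, width: 100, height: 100 };
const matrix = { x: 80, y: 80, width: 50, height: 50, tag: 'person' };
console.log(boxesOverlap(matrix, region)); // true
```

For arbitrary region polygons the full separating-axis test from the `sat` package is still required, as in `isAtleastOneMatrixInRegion`.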
const nonEmpty = (element) => element.length !== 0;
s.addEventDetailsToString = function(eventData,string,addOps){
//d = event data
if(!addOps)addOps = {}
@@ -15,12 +91,19 @@ module.exports = function(s,config,lang){
.replace(/{{REGION_NAME}}/g,d.details.name)
.replace(/{{SNAP_PATH}}/g,s.dir.streams+'/'+d.ke+'/'+d.id+'/s.jpg')
.replace(/{{MONITOR_ID}}/g,d.id)
.replace(/{{MONITOR_NAME}}/g,s.group[d.ke].rawMonitorConfigurations[d.id].name)
.replace(/{{GROUP_KEY}}/g,d.ke)
.replace(/{{DETAILS}}/g,detailString)
if(d.details.confidence){
newString = newString
.replace(/{{CONFIDENCE}}/g,d.details.confidence)
}
if(newString.includes("REASON")) {
if(d.details.reason) {
newString = newString
.replace(/{{REASON}}/g, d.details.reason)
}
}
return newString
}
s.filterEvents = function(x,d){
@@ -41,7 +124,8 @@ module.exports = function(s,config,lang){
extender(x,d)
})
}
s.triggerEvent = function(d,forceSave){
s.triggerEvent = async (d,forceSave) => {
var didCountingAlready = false
var filter = {
halt : false,
addToMotionCounter : true,
@ -50,17 +134,18 @@ module.exports = function(s,config,lang){
webhook : true,
command : true,
record : true,
indifference : false
indifference : false,
countObjects : true
}
s.onEventTriggerBeforeFilterExtensions.forEach(function(extender){
extender(d,filter)
})
var detailString = JSON.stringify(d.details);
if(!s.group[d.ke]||!s.group[d.ke].activeMonitors[d.id]){
return s.systemLog(lang['No Monitor Found, Ignoring Request'])
}
d.mon=s.group[d.ke].rawMonitorConfigurations[d.id];
var currentConfig = s.group[d.ke].activeMonitors[d.id].details
var currentConfig = s.group[d.ke].rawMonitorConfigurations[d.id].details
s.onEventTriggerBeforeFilterExtensions.forEach(function(extender){
extender(d,filter)
})
var hasMatrices = (d.details.matrices && d.details.matrices.length > 0)
//read filters
if(
@@ -126,6 +211,7 @@ module.exports = function(s,config,lang){
case'y':
case'height':
case'width':
case'confidence':
if(d.details.matrices){
d.details.matrices.forEach(function(matrix,position){
modifyFilters(matrix,position)
@@ -188,10 +274,14 @@ module.exports = function(s,config,lang){
var eventTime = new Date()
//motion counter
if(filter.addToMotionCounter && filter.record){
if(!s.group[d.ke].activeMonitors[d.id].detector_motion_count){
s.group[d.ke].activeMonitors[d.id].detector_motion_count=0
}
s.group[d.ke].activeMonitors[d.id].detector_motion_count+=1
s.group[d.ke].activeMonitors[d.id].detector_motion_count.push(d)
}
if(filter.countObjects && currentConfig.detector_obj_count === '1' && currentConfig.detector_obj_count_in_region !== '1'){
didCountingAlready = true
countObjects(d)
}
if(currentConfig.detector_ptz_follow === '1'){
moveCameraPtzToMatrix(d,currentConfig.detector_ptz_follow_target)
}
if(filter.useLock){
if(s.group[d.ke].activeMonitors[d.id].motion_lock){
@@ -214,9 +304,12 @@ module.exports = function(s,config,lang){
// check if object should be in region
if(hasMatrices && currentConfig.detector_obj_region === '1'){
var regions = s.group[d.ke].activeMonitors[d.id].parsedObjects.cords
var isMatrixInRegions = s.isAtleastOneMatrixInRegion(regions,d.details.matrices)
var isMatrixInRegions = isAtleastOneMatrixInRegion(regions,d.details.matrices)
if(isMatrixInRegions){
s.debugLog('Matrix in region!')
if(filter.countObjects && currentConfig.detector_obj_count === '1' && currentConfig.detector_obj_count_in_region === '1' && !didCountingAlready){
countObjects(d)
}
}else{
return
}
@@ -236,7 +329,9 @@ module.exports = function(s,config,lang){
time : s.formattedTime(),
frame : s.group[d.ke].activeMonitors[d.id].lastJpegDetectorFrame
})
}else{
}
//
if(currentConfig.detector_use_motion === '0' || d.doObjectDetection !== true ){
if(currentConfig.det_multi_trig === '1'){
s.getCamerasForMultiTrigger(d.mon).forEach(function(monitor){
if(monitor.mid !== d.id){
@ -255,7 +350,16 @@ module.exports = function(s,config,lang){
}
//save this detection result in SQL, coordinates only, not the image.
if(forceSave || (filter.save && currentConfig.detector_save === '1')){
s.sqlQuery('INSERT INTO Events (ke,mid,details,time) VALUES (?,?,?,?)',[d.ke,d.id,detailString,eventTime])
s.knexQuery({
action: "insert",
table: "Events",
insert: {
ke: d.ke,
mid: d.id,
details: detailString,
time: eventTime,
}
})
}
if(currentConfig.detector === '1' && currentConfig.detector_notrigger === '1'){
s.setNoEventsDetector(s.group[d.ke].rawMonitorConfigurations[d.id])
@ -299,13 +403,9 @@ module.exports = function(s,config,lang){
}
d.currentTime = new Date()
d.currentTimestamp = s.timeObject(d.currentTime).format()
d.screenshotName = 'Motion_'+(d.mon.name.replace(/[^\w\s]/gi,''))+'_'+d.id+'_'+d.ke+'_'+s.formattedTime()
d.screenshotName = d.details.reason + '_'+(d.mon.name.replace(/[^\w\s]/gi,''))+'_'+d.id+'_'+d.ke+'_'+s.formattedTime()
d.screenshotBuffer = null
s.onEventTriggerExtensions.forEach(function(extender){
extender(d,filter)
})
if(filter.webhook && currentConfig.detector_webhook === '1'){
var detector_webhook_url = s.addEventDetailsToString(d,currentConfig.detector_webhook_url)
var webhookMethod = currentConfig.detector_webhook_method
@ -325,6 +425,11 @@ module.exports = function(s,config,lang){
if(err)s.debugLog(err)
})
}
for (var i = 0; i < s.onEventTriggerExtensions.length; i++) {
const extender = s.onEventTriggerExtensions[i]
await extender(d,filter)
}
}
//show client machines the event
d.cx={f:'detector_trigger',id:d.id,ke:d.ke,details:d.details,doObjectDetection:d.doObjectDetection};
@ -349,6 +454,7 @@ module.exports = function(s,config,lang){
s.group[d.ke].activeMonitors[d.id].eventBasedRecording.allowEnd = true
s.group[d.ke].activeMonitors[d.id].eventBasedRecording.process.stdin.setEncoding('utf8')
s.group[d.ke].activeMonitors[d.id].eventBasedRecording.process.stdin.write('q')
s.group[d.ke].activeMonitors[d.id].eventBasedRecording.process.kill('SIGINT')
delete(s.group[d.ke].activeMonitors[d.id].eventBasedRecording.timeout)
},detector_timeout * 1000 * 60)
}
@ -371,7 +477,8 @@ module.exports = function(s,config,lang){
return
}
s.insertCompletedVideo(d.mon,{
file : filename
file : filename,
events: s.group[d.ke].activeMonitors[d.id].detector_motion_count
})
s.userLog(d,{type:lang["Traditional Recording"],msg:lang["Detector Recording Complete"]})
s.userLog(d,{type:lang["Traditional Recording"],msg:lang["Clear Recorder Process"]})

View File

@ -46,6 +46,7 @@ module.exports = function(s,config){
}
//
s.cloudDiskUseStartupExtensions = {}
s.cloudDiskUseOnGetVideoDataExtensions = {}
////// EVENTS //////
s.onEventTriggerExtensions = []
@ -143,6 +144,11 @@ module.exports = function(s,config){
s.onWebSocketDisconnectionExtensions.push(callback)
}
//
s.onWebsocketMessageSendExtensions = []
s.onWebsocketMessageSend = function(callback){
s.onWebsocketMessageSendExtensions.push(callback)
}
//
s.onGetCpuUsageExtensions = []
s.onGetCpuUsage = function(callback){
s.onGetCpuUsageExtensions.push(callback)

View File

@ -103,6 +103,8 @@ module.exports = function(s,config,lang,onFinish){
auto: {label:lang['Auto'],value:'auto'},
drm: {label:lang['drm'],value:'drm'},
cuvid: {label:lang['cuvid'],value:'cuvid'},
cuda: {label:lang['cuda'],value:'cuda'},
opencl: {label:lang['opencl'],value:'opencl'},
vaapi: {label:lang['vaapi'],value:'vaapi'},
qsv: {label:lang['qsv'],value:'qsv'},
vdpau: {label:lang['vdpau'],value:'vdpau'},
@ -158,8 +160,10 @@ module.exports = function(s,config,lang,onFinish){
})
}else{
var primaryMap = '0:0'
if(e.details.primary_input && e.details.primary_input !== '')primaryMap = e.details.primary_input
string += ' -map ' + primaryMap
if(e.details.primary_input && e.details.primary_input !== ''){
var primaryMap = e.details.primary_input || '0:0'
string += ' -map ' + primaryMap
}
}
}
return string;
@ -421,6 +425,8 @@ module.exports = function(s,config,lang,onFinish){
//
x.hwaccel = ''
x.cust_input = ''
//wallclock fix for strangely long, single frame videos
if((config.wallClockTimestampAsDefault || e.details.wall_clock_timestamp_ignore !== '1') && e.type === 'h264' && x.cust_input.indexOf('-use_wallclock_as_timestamps 1') === -1){x.cust_input+=' -use_wallclock_as_timestamps 1';}
//input - frame rate (capture rate)
if(e.details.sfps && e.details.sfps!==''){x.input_fps=' -r '+e.details.sfps}else{x.input_fps=''}
//input - analyze duration
@ -635,21 +641,17 @@ module.exports = function(s,config,lang,onFinish){
//add input feed map
x.pipe += s.createFFmpegMap(e,e.details.input_map_choices.snap)
}
if(!e.details.snap_fps || e.details.snap_fps === ''){e.details.snap_fps = 1}
if(e.details.snap_vf && e.details.snap_vf !== '' || e.cudaEnabled){
var snapVf = e.details.snap_vf.split(',')
if(e.details.snap_vf === '')snapVf.shift()
if(e.cudaEnabled){
snapVf.push('hwdownload,format=nv12')
}
//-vf "thumbnail_cuda=2,hwdownload,format=nv12"
x.snap_vf=' -vf "'+snapVf.join(',')+'"'
}else{
x.snap_vf=''
var snapVf = e.details.snap_vf ? e.details.snap_vf.split(',') : []
if(e.details.snap_vf === '')snapVf.shift()
if(e.cudaEnabled){
snapVf.push('hwdownload,format=nv12')
}
if(e.details.snap_scale_x && e.details.snap_scale_x !== '' && e.details.snap_scale_y && e.details.snap_scale_y !== ''){x.snap_ratio = ' -s '+e.details.snap_scale_x+'x'+e.details.snap_scale_y}else{x.snap_ratio=''}
if(e.details.cust_snap && e.details.cust_snap !== ''){x.cust_snap = ' '+e.details.cust_snap}else{x.cust_snap=''}
x.pipe+=' -update 1 -r '+e.details.snap_fps+x.cust_snap+x.snap_ratio+x.snap_vf+' "'+e.sdir+'s.jpg" -y';
snapVf.push(`fps=${e.details.snap_fps || '1'}`)
//-vf "thumbnail_cuda=2,hwdownload,format=nv12"
x.pipe += ` -vf "${snapVf.join(',')}"`
if(e.details.snap_scale_x && e.details.snap_scale_x !== '' && e.details.snap_scale_y && e.details.snap_scale_y !== '')x.pipe += ' -s '+e.details.snap_scale_x+'x'+e.details.snap_scale_y
if(e.details.cust_snap)x.pipe += ' ' + e.details.cust_snap
x.pipe += ` -update 1 "${e.sdir}s.jpg" -y`
}
//custom - output
if(e.details.custom_output&&e.details.custom_output!==''){x.pipe+=' '+e.details.custom_output;}
@ -670,7 +672,7 @@ module.exports = function(s,config,lang,onFinish){
x.dimensions = e.details.stream_scale_x+'x'+e.details.stream_scale_y;
}
//record - segmenting
x.segment = ' -f segment -segment_atclocktime 1 -reset_timestamps 1 -strftime 1 -segment_list pipe:2 -segment_time '+(60*e.cutoff)+' "'+e.dir+'%Y-%m-%dT%H-%M-%S.'+e.ext+'"';
x.segment = ' -f segment -segment_atclocktime 1 -reset_timestamps 1 -strftime 1 -segment_list pipe:8 -segment_time '+(60*e.cutoff)+' "'+e.dir+'%Y-%m-%dT%H-%M-%S.'+e.ext+'"';
//record - set defaults for extension, video quality
switch(e.ext){
case'mp4':
@ -692,7 +694,6 @@ module.exports = function(s,config,lang,onFinish){
if(e.details.acodec&&e.details.acodec!==''&&e.details.acodec!=='default'){x.acodec=e.details.acodec}
if(e.details.cust_record.indexOf('-strict -2') === -1){x.cust_record.push(' -strict -2')}
if(e.details.cust_record.indexOf('-threads')===-1){x.cust_record.push(' -threads 1')}
// if(e.details.cust_input&&(e.details.cust_input.indexOf('-use_wallclock_as_timestamps 1')>-1)===false){e.details.cust_input+=' -use_wallclock_as_timestamps 1';}
//record - ready or reset codecs
if(x.acodec!=='no'){
if(x.acodec.indexOf('none')>-1){x.acodec=''}else{x.acodec=' -acodec '+x.acodec}
@ -794,40 +795,58 @@ module.exports = function(s,config,lang,onFinish){
//add input feed map
x.pipe += s.createFFmpegMap(e,e.details.input_map_choices.detector)
}
if(!e.details.detector_fps || e.details.detector_fps === ''){x.detector_fps = 2}else{x.detector_fps = parseInt(e.details.detector_fps)}
if(e.details.detector_scale_x && e.details.detector_scale_x !== '' && e.details.detector_scale_y && e.details.detector_scale_y !== ''){x.dratio=' -s '+e.details.detector_scale_x+'x'+e.details.detector_scale_y}else{x.dratio=' -s 320x240'}
x.dratio = ` -s ${e.details.detector_scale_x ? e.details.detector_scale_x : '640'}x${e.details.detector_scale_y ? e.details.detector_scale_y : '480'}`
if(e.details.cust_detect&&e.details.cust_detect!==''){x.cust_detect+=e.details.cust_detect;}
if(sendFramesGlobally)x.pipe += ' -r ' + x.detector_fps + x.dratio + x.cust_detect
if(sendFramesGlobally)x.pipe += x.dratio + x.cust_detect
x.detector_vf = []
if(e.cudaEnabled){
x.detector_vf.push('hwdownload,format=nv12')
}
x.detector_vf.push('fps=' + (e.details.detector_fps ? e.details.detector_fps : '2'))
if(sendFramesGlobally && x.detector_vf.length > 0)x.pipe += ' -vf "'+x.detector_vf.join(',')+'"'
var h264Output = ' -q:v 1 -an -c:v libx264 -f hls -tune zerolatency -g 1 -hls_time 2 -hls_list_size 3 -start_number 0 -live_start_index 3 -hls_allow_cache 0 -hls_flags +delete_segments+omit_endlist "'+e.sdir+'detectorStreamX.m3u8"'
var setObjectDetectValues = () => {
//for object detection
if(
e.details.detector_scale_x_object &&
e.details.detector_scale_x_object !=='' &&
e.details.detector_scale_y_object &&
e.details.detector_scale_y_object!==''
){
x.dobjratio = ' -s '+e.details.detector_scale_x_object+'x'+e.details.detector_scale_y_object
}else{
x.dobjratio = x.dratio
}
if(e.details.cust_detect_object)x.pipe += e.details.cust_detect_object
x.pipe += x.dobjratio + ' -vf fps=' + (e.details.detector_fps_object || '2')
}
if(e.details.detector_pam === '1'){
if(sendFramesGlobally && e.cudaEnabled){
x.pipe += ' -vf "hwdownload,format=nv12"'
}
if(sendFramesGlobally)x.pipe += ' -an -c:v pam -pix_fmt gray -f image2pipe pipe:3'
if(e.details.detector_use_detect_object === '1'){
//for object detection
x.detector_fps_object = '2'
if(sendFramesGlobally){
x.pipe += s.createFFmpegMap(e,e.details.input_map_choices.detector)
if(e.details.detector_scale_x_object&&e.details.detector_scale_x_object!==''&&e.details.detector_scale_y_object&&e.details.detector_scale_y_object!==''){x.dobjratio=' -s '+e.details.detector_scale_x_object+'x'+e.details.detector_scale_y_object}else{x.dobjratio=x.dratio}
if(e.details.detector_fps_object){x.detector_fps_object = e.details.detector_fps_object}
x.pipe += ' -r ' + x.detector_fps_object + x.dobjratio + x.cust_detect
x.pipe += ' -an -c:v pam -pix_fmt gray -f image2pipe pipe:3'
}
if(e.details.detector_use_detect_object === '1'){
x.pipe += s.createFFmpegMap(e,e.details.input_map_choices.detector)
setObjectDetectValues()
if(e.details.detector_h264 === '1'){
x.pipe += h264Output
}else{
x.pipe += ' -an -f singlejpeg pipe:4'
}
}
}else if(sendFramesGlobally){
}else if(sendFramesGlobally || sendFramesToObjectDetector){
if(sendFramesToObjectDetector){
x.pipe += s.createFFmpegMap(e,e.details.input_map_choices.detector)
setObjectDetectValues()
}
if(e.details.detector_h264 === '1'){
x.pipe += h264Output
}else{
x.pipe += ' -an -f singlejpeg pipe:3'
x.pipe += ' -an -f singlejpeg pipe:4'
}
}
}
@ -908,39 +927,38 @@ module.exports = function(s,config,lang,onFinish){
}
ffmpeg.buildTimelapseOutput = function(e,x){
if(e.details.record_timelapse === '1'){
x.record_timelapse_video_filters = []
var recordTimelapseVideoFilters = []
var flags = []
if(e.details.input_map_choices&&e.details.input_map_choices.record_timelapse){
//add input feed map
x.pipe += s.createFFmpegMap(e,e.details.input_map_choices.record_timelapse)
}
var flags = []
if(e.details.record_timelapse_fps && e.details.record_timelapse_fps !== ''){
flags.push('-r 1/' + e.details.record_timelapse_fps)
}else{
flags.push('-r 1/900') // 15 minutes
}
recordTimelapseVideoFilters.push('fps=1/' + (e.details.record_timelapse_fps ? e.details.record_timelapse_fps : '900'))
if(e.details.record_timelapse_vf && e.details.record_timelapse_vf !== '')flags.push('-vf ' + e.details.record_timelapse_vf)
if(e.details.record_timelapse_scale_x && e.details.record_timelapse_scale_x !== '' && e.details.record_timelapse_scale_y && e.details.record_timelapse_scale_y !== '')flags.push(`-s ${e.details.record_timelapse_scale_x}x${e.details.record_timelapse_scale_y}`)
//record - watermark for -vf
if(e.details.record_timelapse_watermark&&e.details.record_timelapse_watermark=="1"&&e.details.record_timelapse_watermark_location&&e.details.record_timelapse_watermark_location!==''){
switch(e.details.record_timelapse_watermark_position){
case'tl'://top left
x.record_timelapse_watermark_position='10:10'
x.record_timelapse_watermark_position = '10:10'
break;
case'tr'://top right
x.record_timelapse_watermark_position='main_w-overlay_w-10:10'
x.record_timelapse_watermark_position = 'main_w-overlay_w-10:10'
break;
case'bl'://bottom left
x.record_timelapse_watermark_position='10:main_h-overlay_h-10'
x.record_timelapse_watermark_position = '10:main_h-overlay_h-10'
break;
default://bottom right
x.record_timelapse_watermark_position='(main_w-overlay_w-10)/2:(main_h-overlay_h-10)/2'
x.record_timelapse_watermark_position = '(main_w-overlay_w-10)/2:(main_h-overlay_h-10)/2'
break;
}
x.record_timelapse_video_filters.push('movie='+e.details.record_timelapse_watermark_location+'[watermark],[in][watermark]overlay='+x.record_timelapse_watermark_position+'[out]');
recordTimelapseVideoFilters.push(
'movie=' + e.details.record_timelapse_watermark_location,
`[watermark],[in][watermark]overlay=${x.record_timelapse_watermark_position}[out]`
)
}
if(x.record_timelapse_video_filters.length > 0){
var videoFilter = `-vf "${x.record_timelapse_video_filters.join(',').trim()}"`
if(recordTimelapseVideoFilters.length > 0){
var videoFilter = `-vf "${recordTimelapseVideoFilters.join(',').trim()}"`
flags.push(videoFilter)
}
x.pipe += ` -f singlejpeg ${flags.join(' ')} -an -q:v 1 pipe:7`
@ -951,7 +969,7 @@ module.exports = function(s,config,lang,onFinish){
x.ffmpegCommandString = x.loglevel+x.input_fps;
//progress pipe
x.ffmpegCommandString += ' -progress pipe:5';
const url = s.buildMonitorUrl(e);
switch(e.type){
case'dashcam':
x.ffmpegCommandString += x.cust_input+x.hwaccel+' -i -';
@ -960,17 +978,17 @@ module.exports = function(s,config,lang,onFinish){
x.ffmpegCommandString += ' -pattern_type glob -f image2pipe'+x.record_fps+' -vcodec mjpeg'+x.cust_input+x.hwaccel+' -i -';
break;
case'mjpeg':
x.ffmpegCommandString += ' -reconnect 1 -f mjpeg'+x.cust_input+x.hwaccel+' -i "'+e.url+'"';
x.ffmpegCommandString += ' -reconnect 1 -f mjpeg'+x.cust_input+x.hwaccel+' -i "'+url+'"';
break;
case'mxpeg':
x.ffmpegCommandString += ' -reconnect 1 -f mxg'+x.cust_input+x.hwaccel+' -i "'+e.url+'"';
x.ffmpegCommandString += ' -reconnect 1 -f mxg'+x.cust_input+x.hwaccel+' -i "'+url+'"';
break;
case'rtmp':
if(!e.details.rtmp_key)e.details.rtmp_key = ''
x.ffmpegCommandString += x.cust_input+x.hwaccel+` -i "rtmp://127.0.0.1:1935/${e.ke + '_' + e.mid + '_' + e.details.rtmp_key}"`;
break;
case'h264':case'hls':case'mp4':
x.ffmpegCommandString += x.cust_input+x.hwaccel+' -i "'+e.url+'"';
x.ffmpegCommandString += x.cust_input+x.hwaccel+' -i "'+url+'"';
break;
case'local':
x.ffmpegCommandString += x.cust_input+x.hwaccel+' -i "'+e.path+'"';
@ -1013,11 +1031,40 @@ module.exports = function(s,config,lang,onFinish){
ffmpeg.assembleMainPieces(e,x)
ffmpeg.createPipeArray(e,x)
//hold ffmpeg command for log stream
s.group[e.ke].activeMonitors[e.mid].ffmpeg = x.ffmpegCommandString
var sanitizedCmd = x.ffmpegCommandString
if(e.details.muser && e.details.mpass){
sanitizedCmd = sanitizedCmd
.replace(`//${e.details.muser}:${e.details.mpass}@`,'//')
.replace(`=${e.details.muser}`,'=USERNAME')
.replace(`=${e.details.mpass}`,'=PASSWORD')
}else if(e.details.muser){
sanitizedCmd = sanitizedCmd.replace(`//${e.details.muser}:@`,'//').replace(`=${e.details.muser}`,'=USERNAME')
}
s.group[e.ke].activeMonitors[e.mid].ffmpeg = sanitizedCmd
//clean the string of spatial impurities and split for spawn()
x.ffmpegCommandString = s.splitForFFPMEG(x.ffmpegCommandString)
//launch that bad boy
return spawn(config.ffmpegDir,x.ffmpegCommandString,{detached: true,stdio:x.stdioPipes})
// return spawn(config.ffmpegDir,x.ffmpegCommandString,{detached: true,stdio:x.stdioPipes})
try{
fs.unlinkSync(e.sdir + 'cmd.txt')
}catch(err){
}
fs.writeFileSync(e.sdir + 'cmd.txt',JSON.stringify({
cmd: x.ffmpegCommandString,
pipes: x.stdioPipes.length,
rawMonitorConfig: s.group[e.ke].rawMonitorConfigurations[e.id],
globalInfo: {
config: config,
isAtleatOneDetectorPluginConnected: s.isAtleatOneDetectorPluginConnected
}
},null,3),'utf8')
var cameraCommandParams = [
'./libs/cameraThread/singleCamera.js',
config.ffmpegDir,
e.sdir + 'cmd.txt'
]
return spawn('node',cameraCommandParams,{detached: true,stdio:x.stdioPipes})
}
if(!config.ffmpegDir){
ffmpeg.checkForWindows(function(){

View File

@ -1,13 +1,254 @@
var fs = require('fs')
var moment = require('moment')
module.exports = function(s,config,lang,app,io){
s.getFileBinDirectory = function(e){
if(e.mid&&!e.id){e.id=e.mid}
s.checkDetails(e)
if(e.details&&e.details.dir&&e.details.dir!==''){
return s.checkCorrectPathEnding(e.details.dir)+e.ke+'/'+e.id+'/'
}else{
return s.dir.fileBin+e.ke+'/'+e.id+'/';
}
const getFileBinDirectory = function(monitor){
return s.dir.fileBin + monitor.ke + '/' + monitor.mid + '/'
}
const getFileBinEntry = (options) => {
return new Promise((resolve, reject) => {
s.knexQuery({
action: "select",
columns: "*",
table: "Files",
where: options
},(err,rows) => {
if(rows[0]){
resolve(rows[0])
}else{
resolve()
}
})
})
}
const getFileBinEntries = (options) => {
return new Promise((resolve, reject) => {
s.knexQuery({
action: "select",
columns: "*",
table: "Files",
where: options
},(err,rows) => {
if(rows){
resolve(rows)
}else{
resolve([])
}
})
})
}
const updateFileBinEntry = (options) => {
return new Promise((resolve, reject) => {
const groupKey = options.ke
const monitorId = options.mid
const filename = options.name
const update = options.update
if(!filename){
resolve('No Filename')
return
}
if(!update){
resolve('No Update Options')
return
}
s.knexQuery({
action: "select",
columns: "size",
table: "Files",
where: {
ke: groupKey,
mid: monitorId,
name: filename,
}
},(err,rows) => {
if(rows[0]){
const fileSize = rows[0].size
s.knexQuery({
action: "update",
table: "Files",
where: {
ke: groupKey,
mid: monitorId,
name: filename,
},
update: update
},(err) => {
resolve()
if(update.size){
s.setDiskUsedForGroup(groupKey,-(fileSize/1048576),'fileBin')
s.setDiskUsedForGroup(groupKey,(update.size/1048576),'fileBin')
s.purgeDiskForGroup(groupKey)
}
})
}else{
resolve()
}
})
})
}
const deleteFileBinEntry = (options) => {
return new Promise((resolve, reject) => {
const groupKey = options.ke
const monitorId = options.mid
const filename = options.name
if(!filename){
resolve('No Filename')
return
}
s.knexQuery({
action: "select",
columns: "size",
table: "Files",
where: {
ke: groupKey,
mid: monitorId,
name: filename,
}
},(err,rows) => {
if(rows[0]){
const fileSize = rows[0].size
s.knexQuery({
action: "delete",
table: "Files",
where: {
ke: groupKey,
mid: monitorId,
name: filename,
}
},(err) => {
resolve()
s.setDiskUsedForGroup(groupKey,-(fileSize/1048576),'fileBin')
s.purgeDiskForGroup(groupKey)
})
}else{
resolve()
}
})
})
}
const insertFileBinEntry = (options) => {
return new Promise((resolve, reject) => {
const groupKey = options.ke
const monitorId = options.mid
const filename = options.name
if(!filename){
resolve('No Filename')
return
}
const monitorFileBinDirectory = getFileBinDirectory({ke: groupKey,mid: monitorId,})
const fileSize = options.size || fs.lstatSync(monitorFileBinDirectory + filename).size
const details = options.details instanceof Object ? JSON.stringify(options.details) : options.details
const status = options.status || 0
const time = options.time || new Date()
s.knexQuery({
action: "insert",
table: "Files",
insert: {
ke: groupKey,
mid: monitorId,
name: filename,
size: fileSize,
details: details,
status: status,
time: time,
}
},(err) => {
resolve()
s.setDiskUsedForGroup(groupKey,(fileSize/1048576),'fileBin')
s.purgeDiskForGroup(groupKey)
})
})
}
s.getFileBinDirectory = getFileBinDirectory
s.getFileBinEntry = getFileBinEntry
s.insertFileBinEntry = insertFileBinEntry
s.updateFileBinEntry = updateFileBinEntry
s.deleteFileBinEntry = deleteFileBinEntry
/**
* API : Get fileBin file rows
*/
app.get([config.webPaths.apiPrefix+':auth/fileBin/:ke',config.webPaths.apiPrefix+':auth/fileBin/:ke/:id'],async (req,res) => {
s.auth(req.params,(user) => {
const userDetails = user.details
const monitorId = req.params.id
const groupKey = req.params.ke
const hasRestrictions = userDetails.sub && userDetails.allmonitors !== '1';
s.sqlQueryBetweenTimesWithPermissions({
table: 'Files',
user: user,
groupKey: req.params.ke,
monitorId: req.params.id,
startTime: req.query.start,
endTime: req.query.end,
startTimeOperator: req.query.startOperator,
endTimeOperator: req.query.endOperator,
limit: req.query.limit,
endIsStartTo: true,
noFormat: true,
noCount: true,
preliminaryValidationFailed: (
user.permissions.get_monitors === "0"
)
},(response) => {
response.forEach(function(v){
v.details = s.parseJSON(v.details)
v.href = '/'+req.params.auth+'/fileBin/'+req.params.ke+'/'+req.params.id+'/'+v.details.year+'/'+v.details.month+'/'+v.details.day+'/'+v.name;
})
s.closeJsonResponse(res,{
ok: true,
files: response
})
})
},res,req);
});
/**
* API : Get fileBin file
*/
app.get(config.webPaths.apiPrefix+':auth/fileBin/:ke/:id/:year/:month/:day/:file', async (req,res) => {
s.auth(req.params,function(user){
var failed = function(){
res.end(user.lang['File Not Found'])
}
if (!s.group[req.params.ke].fileBin[req.params.id+'/'+req.params.file]){
const groupKey = req.params.ke
const monitorId = req.params.id
const monitorRestrictions = s.getMonitorRestrictions(user.details,monitorId)
if(user.details.sub && user.details.allmonitors === '0' && (user.permissions.get_monitors === "0" || monitorRestrictions.length === 0)){
s.closeJsonResponse(res,{
ok: false,
msg: lang['Not Permitted']
})
return
}
s.knexQuery({
action: "select",
columns: "*",
table: "Files",
where: [
['ke','=',groupKey],
['mid','=',req.params.id],
['name','=',req.params.file],
monitorRestrictions
]
},(err,r) => {
if(r && r[0]){
r = r[0]
r.details = JSON.parse(r.details)
req.dir = s.dir.fileBin + req.params.ke + '/' + req.params.id + '/' + r.details.year + '/' + r.details.month + '/' + r.details.day + '/' + req.params.file;
fs.stat(req.dir,function(err,stats){
if(!err){
res.on('finish',function(){res.end()})
fs.createReadStream(req.dir).pipe(res)
}else{
failed()
}
})
}else{
failed()
}
})
}else{
res.end(user.lang['Please Wait for Completion'])
}
},res,req);
});
}
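The fileBin helpers above all follow the same pattern: wrap the callback-style `s.knexQuery` in a Promise that resolves with the first row (or nothing). A minimal standalone sketch of that pattern, using a hypothetical `query` stub in place of Shinobi's real database layer:

```javascript
// Hypothetical callback-style query helper, standing in for s.knexQuery.
// It pretends the database returned one matching row.
function query(options, callback) {
  setImmediate(() => callback(null, [{ name: options.where.name, size: 1024 }]))
}

// Promise wrapper in the same shape as getFileBinEntry above:
// resolve with the first row, or undefined when nothing matched.
const getEntry = (options) => {
  return new Promise((resolve) => {
    query({ action: 'select', table: 'Files', where: options }, (err, rows) => {
      if (rows && rows[0]) {
        resolve(rows[0])
      } else {
        resolve()
      }
    })
  })
}

getEntry({ name: 'clip.mp4' }).then((row) => {
  console.log(row.size) // → 1024
})
```

Resolving with `undefined` instead of rejecting keeps callers simple: an `await` plus a truthiness check, with no try/catch needed for the "not found" case.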

View File

@ -11,6 +11,7 @@ module.exports = function(s,config,lang){
}else{
config.streamDir = config.windowsTempDir
}
config.shmDir = `${s.checkCorrectPathEnding(config.streamDir)}`
if(!fs.existsSync(config.streamDir)){
config.streamDir = s.mainDirectory+'/streams/'
}else{

View File

@ -1,79 +1,130 @@
var fs = require('fs');
var exec = require('child_process').exec;
var spawn = require('child_process').spawn;
const { getCpuUsageOnLinux, getRamUsageOnLinux } = require('./health/utils.js')
module.exports = function(s,config,lang,io){
s.heartBeat = function(){
setTimeout(s.heartBeat, 8000);
io.sockets.emit('ping',{beat:1});
}
s.heartBeat()
s.cpuUsage = function(callback){
k={}
switch(s.platform){
case'win32':
k.cmd="@for /f \"skip=1\" %p in ('wmic cpu get loadpercentage') do @echo %p%"
break;
case'darwin':
k.cmd="ps -A -o %cpu | awk '{s+=$1} END {print s}'";
break;
case'linux':
k.cmd='LANG=C top -b -n 2 | grep "^'+config.cpuUsageMarker+'" | awk \'{print $2}\' | tail -n1';
break;
case'freebsd':
k.cmd='vmstat 1 2 | tail -1 | awk \'{print $17}\''
break;
let hasProcStat = false
try{
fs.statSync("/proc/stat")
hasProcStat = true
}catch(err){
}
if(hasProcStat){
s.cpuUsage = async () => {
const percent = await getCpuUsageOnLinux()
return percent
}
if(config.customCpuCommand){
exec(config.customCpuCommand,{encoding:'utf8',detached: true},function(err,d){
if(s.isWin===true) {
d = d.replace(/(\r\n|\n|\r)/gm, "").replace(/%/g, "")
}
callback(d)
s.onGetCpuUsageExtensions.forEach(function(extender){
extender(d)
})
})
} else if(k.cmd){
exec(k.cmd,{encoding:'utf8',detached: true},function(err,d){
if(s.isWin===true){
d=d.replace(/(\r\n|\n|\r)/gm,"").replace(/%/g,"")
}
callback(d)
s.onGetCpuUsageExtensions.forEach(function(extender){
extender(d)
})
})
} else {
callback(0)
}else{
s.cpuUsage = () => {
return new Promise((resolve, reject) => {
var k = {}
switch(s.platform){
case'win32':
k.cmd = "@for /f \"skip=1\" %p in ('wmic cpu get loadpercentage') do @echo %p%"
break;
case'darwin':
k.cmd = "ps -A -o %cpu | awk '{s+=$1} END {print s}'";
break;
case'linux':
k.cmd = 'top -b -n 2 | awk \'toupper($0) ~ /^.?CPU/ {gsub("id,","100",$8); gsub("%","",$8); print 100-$8}\' | tail -n 1';
break;
case'freebsd':
k.cmd = 'vmstat 1 2 | awk \'END{print 100-$19}\''
break;
case'openbsd':
k.cmd = 'vmstat 1 2 | awk \'END{print 100-$18}\''
break;
}
if(config.customCpuCommand){
exec(config.customCpuCommand,{encoding:'utf8',detached: true},function(err,d){
if(s.isWin===true) {
d = d.replace(/(\r\n|\n|\r)/gm, "").replace(/%/g, "")
}
resolve(d)
s.onGetCpuUsageExtensions.forEach(function(extender){
extender(d)
})
})
} else if(k.cmd){
exec(k.cmd,{encoding:'utf8',detached: true},function(err,d){
if(s.isWin===true){
d=d.replace(/(\r\n|\n|\r)/gm,"").replace(/%/g,"")
}
resolve(d)
s.onGetCpuUsageExtensions.forEach(function(extender){
extender(d)
})
})
} else {
resolve(0)
}
})
}
}
s.ramUsage = function(callback){
k={}
switch(s.platform){
case'win32':
k.cmd = "wmic OS get FreePhysicalMemory /Value"
break;
case'darwin':
k.cmd = "vm_stat | awk '/^Pages free: /{f=substr($3,1,length($3)-1)} /^Pages active: /{a=substr($3,1,length($3-1))} /^Pages inactive: /{i=substr($3,1,length($3-1))} /^Pages speculative: /{s=substr($3,1,length($3-1))} /^Pages wired down: /{w=substr($4,1,length($4-1))} /^Pages occupied by compressor: /{c=substr($5,1,length($5-1)); print ((a+w)/(f+a+i+w+s+c))*100;}'"
break;
case'freebsd':
k.cmd = "echo \"scale=4; $(vmstat -H | tail -1 | awk '{print $5}')*1024*100/$(sysctl hw.physmem | awk '{print $2}')\" | bc"
break;
default:
k.cmd = "LANG=C free | grep Mem | awk '{print $7/$2 * 100.0}'";
break;
let hasProcMeminfo = false
try{
fs.statSync("/proc/meminfo")
hasProcMeminfo = true
}catch(err){
}
if(hasProcMeminfo){
s.ramUsage = async () => {
const used = await getRamUsageOnLinux()
return used
}
if(k.cmd){
exec(k.cmd,{encoding:'utf8',detached: true},function(err,d){
if(s.isWin===true){
d=(parseInt(d.split('=')[1])/(s.totalmem/1000))*100
}
callback(d)
s.onGetRamUsageExtensions.forEach(function(extender){
extender(d)
})
})
}else{
callback(0)
}else{
s.ramUsage = () => {
return new Promise((resolve, reject) => {
var k = {}
switch(s.platform){
case'win32':
k.cmd = "wmic OS get FreePhysicalMemory /Value"
break;
case'darwin':
k.cmd = "vm_stat | awk '/^Pages free: /{f=substr($3,1,length($3)-1)} /^Pages active: /{a=substr($3,1,length($3-1))} /^Pages inactive: /{i=substr($3,1,length($3-1))} /^Pages speculative: /{s=substr($3,1,length($3-1))} /^Pages wired down: /{w=substr($4,1,length($4-1))} /^Pages occupied by compressor: /{c=substr($5,1,length($5-1)); print ((a+w)/(f+a+i+w+s+c))*100;}'"
break;
case'freebsd':
k.cmd = "echo \"scale=4; $(vmstat -H | awk 'END{print $5}')*1024*100/$(sysctl -n hw.physmem)\" | bc"
break;
case'openbsd':
k.cmd = "echo \"scale=4; $(vmstat | awk 'END{ gsub(\"M\",\"\",$4); print $4 }')*104857600/$(sysctl -n hw.physmem)\" | bc"
break;
default:
k.cmd = "LANG=C free | grep Mem | awk '{print $7/$2 * 100.0}'";
break;
}
if(k.cmd){
exec(k.cmd,{encoding:'utf8',detached: true},function(err,d){
if(s.isWin===true){
d=(parseInt(d.split('=')[1])/(s.totalmem/1000))*100
}
resolve(d)
s.onGetRamUsageExtensions.forEach(function(extender){
extender(d)
})
})
}else{
resolve(0)
}
})
}
}
if(config.childNodes.mode !== 'child'){
setInterval(async () => {
const cpu = await s.cpuUsage()
const ram = await s.ramUsage()
s.tx({
f: 'os',
cpu: cpu,
ram: ram
},'CPU')
},10000)
}
}

66
libs/health/utils.js Normal file
View File

@ -0,0 +1,66 @@
// This file's contents were referenced from https://gist.github.com/sidwarkd/9578213
const fs = require('fs');
const calculateCPUPercentage = function(oldVals, newVals){
var totalDiff = newVals.total - oldVals.total;
var activeDiff = newVals.active - oldVals.active;
return Math.ceil((activeDiff / totalDiff) * 100);
};
function getValFromLine(line){
var match = line.match(/[0-9]+/gi);
if(match !== null)
return parseInt(match[0]);
else
return null;
};
const currentCPUInfo = {
total: 0,
active: 0
}
const lastCPUInfo = {
total: 0,
active: 0
}
exports.getCpuUsageOnLinux = () => {
lastCPUInfo.active = currentCPUInfo.active;
lastCPUInfo.idle = currentCPUInfo.idle;
lastCPUInfo.total = currentCPUInfo.total;
return new Promise((resolve,reject) => {
const getUsage = function(callback){
fs.readFile("/proc/stat" ,'utf8', function(err, data){
var lines = data.split('\n');
var cpuTimes = lines[0].match(/[0-9]+/gi);
currentCPUInfo.total = 0;
currentCPUInfo.idle = parseInt(cpuTimes[3]) + parseInt(cpuTimes[4]);
for (var i = 0; i < cpuTimes.length; i++){
currentCPUInfo.total += parseInt(cpuTimes[i]);
}
currentCPUInfo.active = currentCPUInfo.total - currentCPUInfo.idle
currentCPUInfo.percentUsed = calculateCPUPercentage(lastCPUInfo, currentCPUInfo);
callback(currentCPUInfo.percentUsed)
})
}
getUsage(function(percentage){
setTimeout(function(){
getUsage(function(percentage){
resolve(percentage);
})
}, 3000)
})
})
}
exports.getRamUsageOnLinux = () => {
return new Promise((resolve,reject) => {
fs.readFile("/proc/meminfo", 'utf8', function(err, data){
const lines = data.split('\n');
const total = Math.floor(getValFromLine(lines[0]) / 1024);
const free = Math.floor(getValFromLine(lines[1]) / 1024);
const cached = Math.floor(getValFromLine(lines[4]) / 1024);
const used = total - free;
const percentUsed = Math.ceil(((used - cached) / total) * 100);
resolve({
used: used,
percent: percentUsed,
});
})
})
}
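The CPU figure produced by `getCpuUsageOnLinux` is a delta between two samples of the `/proc/stat` `cpu` line: active jiffies over total jiffies since the previous read. The arithmetic can be checked in isolation with fabricated counter samples (field order here follows the standard `user nice system idle iowait` layout):

```javascript
// Two fabricated /proc/stat "cpu" samples: user, nice, system, idle, iowait.
const sampleA = [100, 0, 50, 800, 50]
const sampleB = [160, 0, 70, 900, 70]

function summarize(times) {
  const total = times.reduce((a, b) => a + b, 0)
  const idle = times[3] + times[4] // idle + iowait, as in utils.js
  return { total, active: total - idle }
}

// Same math as calculateCPUPercentage: active delta over total delta.
function cpuPercent(oldVals, newVals) {
  const totalDiff = newVals.total - oldVals.total
  const activeDiff = newVals.active - oldVals.active
  return Math.ceil((activeDiff / totalDiff) * 100)
}

const pct = cpuPercent(summarize(sampleA), summarize(sampleB))
console.log(pct) // → 40
```

Using deltas rather than absolute counters is what makes the 3-second double read in `getCpuUsageOnLinux` necessary: a single read only describes usage since boot.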

File diff suppressed because it is too large

89
libs/monitor/utils.js Normal file
View File

@ -0,0 +1,89 @@
const { spawn } = require('child_process')
module.exports = (s,config,lang) => {
const cameraDestroy = function(e,p){
if(
s.group[e.ke] &&
s.group[e.ke].activeMonitors[e.id] &&
s.group[e.ke].activeMonitors[e.id].spawn !== undefined
){
const activeMonitor = s.group[e.ke].activeMonitors[e.id];
const proc = s.group[e.ke].activeMonitors[e.id].spawn;
if(proc){
activeMonitor.allowStdinWrite = false
s.txToDashcamUsers({
f : 'disable_stream',
ke : e.ke,
mid : e.id
},e.ke)
// if(activeMonitor.p2pStream){activeMonitor.p2pStream.unpipe();}
try{
proc.removeListener('end',activeMonitor.spawn_exit);
proc.removeListener('exit',activeMonitor.spawn_exit);
delete(activeMonitor.spawn_exit);
}catch(er){
}
}
if(activeMonitor.audioDetector){
activeMonitor.audioDetector.stop()
delete(activeMonitor.audioDetector)
}
activeMonitor.firstStreamChunk = {}
clearTimeout(activeMonitor.recordingChecker);
delete(activeMonitor.recordingChecker);
clearTimeout(activeMonitor.streamChecker);
delete(activeMonitor.streamChecker);
clearTimeout(activeMonitor.checkSnap);
delete(activeMonitor.checkSnap);
clearTimeout(activeMonitor.watchdog_stop);
delete(activeMonitor.watchdog_stop);
delete(activeMonitor.lastJpegDetectorFrame);
delete(activeMonitor.detectorFrameSaveBuffer);
clearTimeout(activeMonitor.recordingSnapper);
clearInterval(activeMonitor.getMonitorCpuUsage);
clearInterval(activeMonitor.objectCountIntervals);
delete(activeMonitor.onvifConnection)
if(activeMonitor.onChildNodeExit){
activeMonitor.onChildNodeExit()
}
activeMonitor.spawn.stdio.forEach(function(stdio){
try{
stdio.unpipe()
}catch(err){
console.log(err)
}
})
if(activeMonitor.mp4frag){
var mp4FragChannels = Object.keys(activeMonitor.mp4frag)
mp4FragChannels.forEach(function(channel){
activeMonitor.mp4frag[channel].removeAllListeners()
delete(activeMonitor.mp4frag[channel])
})
}
if(config.childNodes.enabled === true && config.childNodes.mode === 'child' && config.childNodes.host){
s.cx({f:'clearCameraFromActiveList',ke:e.ke,id:e.id})
}
if(activeMonitor.childNode){
s.cx({f:'kill',d:s.cleanMonitorObject(e)},activeMonitor.childNodeId)
}else{
s.coSpawnClose(e)
if(proc && proc.kill){
if(s.isWin){
spawn("taskkill", ["/pid", proc.pid, '/t'])
}else{
proc.kill('SIGTERM')
}
setTimeout(function(){
try{
proc.kill()
}catch(err){
s.debugLog(err)
}
},1000)
}
}
}
}
return {
cameraDestroy: cameraDestroy
}
}
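The new `libs/monitor/utils.js` follows the same factory pattern as the rest of `libs/`: the module is handed `(s, config, lang)` and returns only its public functions, closing over the shared state. A minimal, hypothetical stand-in for that pattern (the teardown body and names here are illustrative, not the real implementation):

```javascript
// Hypothetical stand-in for the (s, config, lang) factory pattern used
// by libs/monitor/utils.js: shared state is captured in the closure and
// only the public API is returned.
const makeUtils = (s, config, lang) => {
    const cameraDestroy = (e) => {
        // real teardown logic (timers, child processes, sockets) goes here
        return `destroyed ${e.ke}/${e.id}`
    }
    return { cameraDestroy }
}

const utils = makeUtils({}, {}, {})
console.log(utils.cameraDestroy({ ke: 'GROUP', id: 'cam1' }))
```

This keeps each monitor helper testable in isolation while still giving it access to the running server context.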


@ -1,6 +1,16 @@
var fs = require("fs")
var Discord = require("discord.js")
var template = require("./notifications/emailTemplate.js")
module.exports = function(s,config,lang){
const checkEmail = (email) => {
if(email.toLowerCase().indexOf('@shinobi') > -1 && !config.allowSpammingViaEmail){
console.log('CHANGE YOUR ACCOUNT EMAIL!')
console.log(email + ' IS NOT ALLOWED TO BE USED')
console.log('YOU CANNOT EMAIL TO THIS ADDRESS')
return 'cannot@email.com'
}
return email
}
//discord bot
if(config.discordBot === true){
try{
@ -11,7 +21,7 @@ module.exports = function(s,config,lang){
s.userLog({ke:groupKey,mid:'$USER'},{type:lang.DiscordFailedText,msg:lang.DiscordNotEnabledText})
return
}
var sendBody = Object.assign({
const sendBody = Object.assign({
color: 3447003,
title: 'Alert from Shinobi',
description: "",
@ -22,7 +32,7 @@ module.exports = function(s,config,lang){
text: "Shinobi Systems"
}
},data)
var discordChannel = bot.channels.get(s.group[groupKey].init.discordbot_channel)
const discordChannel = bot.channels.cache.get(s.group[groupKey].init.discordbot_channel)
if(discordChannel && discordChannel.send){
discordChannel.send({
embed: sendBody,
@ -44,10 +54,10 @@ module.exports = function(s,config,lang){
})
}
}
var onEventTriggerBeforeFilterForDiscord = function(d,filter){
const onEventTriggerBeforeFilterForDiscord = function(d,filter){
filter.discord = true
}
var onEventTriggerForDiscord = function(d,filter){
const onEventTriggerForDiscord = async (d,filter) => {
// d = event object
//discord bot
if(filter.discord && s.group[d.ke].discordBot && d.mon.details.detector_discordbot === '1' && !s.group[d.ke].activeMonitors[d.id].detector_discordbot){
@ -59,28 +69,11 @@ module.exports = function(s,config,lang){
}
//lock mailer so you don't get emailed on EVERY trigger event.
s.group[d.ke].activeMonitors[d.id].detector_discordbot = setTimeout(function(){
//unlock so you can mail again.
clearTimeout(s.group[d.ke].activeMonitors[d.id].detector_discordbot);
delete(s.group[d.ke].activeMonitors[d.id].detector_discordbot);
},detector_discordbot_timeout)
var files = []
var sendAlert = function(){
s.discordMsg({
author: {
name: s.group[d.ke].rawMonitorConfigurations[d.id].name,
icon_url: config.iconURL
},
title: lang.Event+' - '+d.screenshotName,
description: lang.EventText1+' '+d.currentTimestamp,
fields: [],
timestamp: d.currentTime,
footer: {
icon_url: config.iconURL,
text: "Shinobi Systems"
}
},files,d.ke)
}
if(d.mon.details.detector_discordbot_send_video === '1'){
// TODO: change to a function that watches the ongoing video capture, waits, grabs the new video file, slices a portion (up to the max transmission size) and prepares it for delivery
s.mergeDetectorBufferChunks(d,function(mergedFilepath,filename){
s.discordMsg({
author: {
@ -102,21 +95,34 @@ module.exports = function(s,config,lang){
],d.ke)
})
}
s.getRawSnapshotFromMonitor(d.mon,{
const {screenShot, isStaticFile} = await s.getRawSnapshotFromMonitor(d.mon,{
secondsInward: d.mon.details.snap_seconds_inward
},function(data){
if(data[data.length - 2] === 0xFF && data[data.length - 1] === 0xD9){
d.screenshotBuffer = data
files.push({
attachment: d.screenshotBuffer,
name: d.screenshotName+'.jpg'
})
}
sendAlert()
})
if(screenShot[screenShot.length - 2] === 0xFF && screenShot[screenShot.length - 1] === 0xD9){
d.screenshotBuffer = screenShot
s.discordMsg({
author: {
name: s.group[d.ke].rawMonitorConfigurations[d.id].name,
icon_url: config.iconURL
},
title: lang.Event+' - '+d.screenshotName,
description: lang.EventText1+' '+d.currentTimestamp,
fields: [],
timestamp: d.currentTime,
footer: {
icon_url: config.iconURL,
text: "Shinobi Systems"
}
},[
{
attachment: screenShot,
name: d.screenshotName+'.jpg'
}
],d.ke)
}
}
}
var onTwoFactorAuthCodeNotificationForDiscord = function(r){
const onTwoFactorAuthCodeNotificationForDiscord = function(r){
// r = user
if(r.details.factor_discord === '1'){
s.discordMsg({
@ -135,13 +141,13 @@ module.exports = function(s,config,lang){
},[],r.ke)
}
}
var loadDiscordBotForUser = function(user){
ar=JSON.parse(user.details);
const loadDiscordBotForUser = function(user){
const userDetails = s.parseJSON(user.details);
//discordbot
if(!s.group[user.ke].discordBot &&
config.discordBot === true &&
ar.discordbot === '1' &&
ar.discordbot_token !== ''
userDetails.discordbot === '1' &&
userDetails.discordbot_token !== ''
){
s.group[user.ke].discordBot = new Discord.Client()
s.group[user.ke].discordBot.on('ready', () => {
@ -153,16 +159,16 @@ module.exports = function(s,config,lang){
msg: s.group[user.ke].discordBot.user.tag
})
})
s.group[user.ke].discordBot.login(ar.discordbot_token)
s.group[user.ke].discordBot.login(userDetails.discordbot_token)
}
}
var unloadDiscordBotForUser = function(user){
const unloadDiscordBotForUser = function(user){
if(s.group[user.ke].discordBot && s.group[user.ke].discordBot.destroy){
s.group[user.ke].discordBot.destroy()
delete(s.group[user.ke].discordBot)
}
}
var onDetectorNoTriggerTimeoutForDiscord = function(e){
const onDetectorNoTriggerTimeoutForDiscord = function(e){
//e = monitor object
var currentTime = new Date()
if(e.details.detector_notrigger_discord === '1'){
@ -204,14 +210,22 @@ module.exports = function(s,config,lang){
if(config.mail.from === undefined){config.mail.from = '"ShinobiCCTV" <no-reply@shinobi.video>'}
s.nodemailer = require('nodemailer').createTransport(config.mail);
}
var onDetectorNoTriggerTimeoutForEmail = function(e){
const onDetectorNoTriggerTimeoutForEmail = function(e){
//e = monitor object
if(config.mail && e.details.detector_notrigger_mail === '1'){
s.sqlQuery('SELECT mail FROM Users WHERE ke=? AND details NOT LIKE ?',[e.ke,'%"sub"%'],function(err,r){
s.knexQuery({
action: "select",
columns: "mail",
table: "Users",
where: [
['ke','=',e.ke],
['details','NOT LIKE','%"sub"%'],
]
},(err,r) => {
r = r[0]
var mailOptions = {
from: config.mail.from, // sender address
to: r.mail, // list of receivers
to: checkEmail(r.mail), // list of receivers
subject: lang.NoMotionEmailText1+' '+e.name+' ('+e.id+')', // Subject line
html: '<i>'+lang.NoMotionEmailText2+' ' + (e.details.detector_notrigger_timeout || 10) + ' '+lang.minutes+'.</i>',
}
@ -228,16 +242,15 @@ module.exports = function(s,config,lang){
})
}
}
var onTwoFactorAuthCodeNotificationForEmail = function(r){
const onTwoFactorAuthCodeNotificationForEmail = function(r){
// r = user object
if(r.details.factor_mail !== '0'){
var mailOptions = {
s.nodemailer.sendMail({
from: config.mail.from,
to: r.mail,
to: checkEmail(r.mail),
subject: r.lang['2-Factor Authentication'],
html: r.lang['Enter this code to proceed']+' <b>'+s.factorAuth[r.ke][r.uid].key+'</b>. '+r.lang.FactorAuthText1,
};
s.nodemailer.sendMail(mailOptions, (error, info) => {
}, (error, info) => {
if (error) {
s.systemLog(r.lang.MailError,error)
return
@ -245,14 +258,14 @@ module.exports = function(s,config,lang){
})
}
}
var onFilterEventForEmail = function(x,d){
const onFilterEventForEmail = function(x,d){
// x = filter function
// d = filter event object
if(x === 'email'){
if(d.videos && d.videos.length > 0){
d.mailOptions = {
from: config.mail.from, // sender address
to: d.mail, // list of receivers
to: checkEmail(d.mail),
subject: lang['Filter Matches']+' : '+d.name, // Subject line
html: lang.FilterMatchesText1+' '+d.videos.length+' '+lang.FilterMatchesText2,
};
@ -274,13 +287,25 @@ module.exports = function(s,config,lang){
}
}
}
var onEventTriggerBeforeFilterForEmail = function(d,filter){
filter.mail = true
const onEventTriggerBeforeFilterForEmail = function(d,filter){
if(d.mon.details.detector_mail === '1'){
filter.mail = true
}else{
filter.mail = false
}
}
var onEventTriggerForEmail = function(d,filter){
if(filter.mail && config.mail && !s.group[d.ke].activeMonitors[d.id].detector_mail && d.mon.details.detector_mail === '1'){
s.sqlQuery('SELECT mail FROM Users WHERE ke=? AND details NOT LIKE ?',[d.ke,'%"sub"%'],function(err,r){
r=r[0];
const onEventTriggerForEmail = async (d,filter) => {
if(filter.mail && config.mail && !s.group[d.ke].activeMonitors[d.id].detector_mail){
s.knexQuery({
action: "select",
columns: "mail",
table: "Users",
where: [
['ke','=',d.ke],
['details','NOT LIKE','%"sub"%'],
]
},async (err,r) => {
r = r[0];
var detector_mail_timeout
if(!d.mon.details.detector_mail_timeout||d.mon.details.detector_mail_timeout===''){
detector_mail_timeout = 1000*60*10;
@ -288,24 +313,35 @@ module.exports = function(s,config,lang){
detector_mail_timeout = parseFloat(d.mon.details.detector_mail_timeout)*1000*60;
}
//lock mailer so you don't get emailed on EVERY trigger event.
s.group[d.ke].activeMonitors[d.id].detector_mail=setTimeout(function(){
s.group[d.ke].activeMonitors[d.id].detector_mail = setTimeout(function(){
//unlock so you can mail again.
clearTimeout(s.group[d.ke].activeMonitors[d.id].detector_mail);
delete(s.group[d.ke].activeMonitors[d.id].detector_mail);
},detector_mail_timeout);
var files = []
var mailOptions = {
from: config.mail.from, // sender address
to: r.mail, // list of receivers
subject: lang.Event+' - '+d.screenshotName, // Subject line
html: '<i>'+lang.EventText1+' '+d.currentTimestamp+'.</i>',
attachments: files
}
var sendMail = function(){
Object.keys(d.details).forEach(function(v,n){
mailOptions.html+='<div><b>'+v+'</b> : '+d.details[v]+'</div>'
const sendMail = function(files){
const infoRows = []
Object.keys(d.details).forEach(function(key){
var value = d.details[key]
var text = value
if(value instanceof Object){
text = JSON.stringify(value,null,3)
}
infoRows.push(template.createRow({
title: key,
text: text
}))
})
s.nodemailer.sendMail(mailOptions, (error, info) => {
s.nodemailer.sendMail({
from: config.mail.from,
to: checkEmail(r.mail),
subject: lang.Event+' - '+d.screenshotName,
html: template.createFramework({
title: lang.EventText1 + ' ' + d.currentTimestamp,
subtitle: 'Shinobi Event',
body: infoRows.join(''),
}),
attachments: files || []
}, (error, info) => {
if (error) {
s.systemLog(lang.MailError,error)
return false;
@ -313,12 +349,13 @@ module.exports = function(s,config,lang){
})
}
if(d.mon.details.detector_mail_send_video === '1'){
// TODO: change to a function that watches the ongoing video capture, waits, grabs the new video file, slices a portion (up to the max transmission size) and prepares it for delivery
s.mergeDetectorBufferChunks(d,function(mergedFilepath,filename){
fs.readFile(mergedFilepath,function(err,buffer){
if(buffer){
s.nodemailer.sendMail({
from: config.mail.from,
to: r.mail,
to: checkEmail(r.mail),
subject: filename,
html: '',
attachments: [
@ -337,24 +374,18 @@ module.exports = function(s,config,lang){
})
})
}
if(d.screenshotBuffer){
files.push({
if(!d.screenshotBuffer){
const {screenShot, isStaticFile} = await s.getRawSnapshotFromMonitor(d.mon,{
secondsInward: d.mon.details.snap_seconds_inward
})
d.screenshotBuffer = screenShot
}
sendMail([
{
filename: d.screenshotName + '.jpg',
content: d.screenshotBuffer
})
sendMail()
}else{
s.getRawSnapshotFromMonitor(d.mon,{
secondsInward: d.mon.details.snap_seconds_inward
},function(data){
d.screenshotBuffer = data
files.push({
filename: d.screenshotName + '.jpg',
content: data
})
sendMail()
})
}
}
])
})
}
}
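The rewritten mailer builds the email body by flattening each key of the event's `details` object into a template row, pretty-printing any nested objects as JSON first. A self-contained sketch of that flattening step (the markup here is a simplified stand-in for `template.createRow`, and the sample `details` values are invented):

```javascript
// Simplified stand-in for the infoRows loop in onEventTriggerForEmail:
// nested objects (e.g. detection matrices) are JSON-stringified before
// being rendered into a row.
const details = { reason: 'motion', matrices: [{ x: 1, y: 2 }] }
const infoRows = []
Object.keys(details).forEach(function(key){
    let text = details[key]
    if(text instanceof Object) text = JSON.stringify(text, null, 3)
    infoRows.push(`<div><b>${key}</b> : ${text}</div>`)
})
console.log(infoRows.length)
```

The joined rows are then handed to `template.createFramework` as the email body instead of being concatenated inline as before.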


@ -0,0 +1,250 @@
// Example of how to generate HTML for an email.
// createFramework({
// title: 'Password Reset',
// subtitle: 'If you did not make this request please change your password.',
// body: [
// createRow({
// title: 'Customer',
// text: `<span style="border:0;margin:0;padding:0;color:inherit;text-decoration:none">${customer.email}</span> — ${customer.id}`
// }),
// createRow({
// btn: {
// text: 'Confirm Password Reset',
// href: `https://licenses.shinobi.video/forgot/reset?code=${newCode}`
// }
// }),
// createRow({
// title: 'Reset Code',
// text: newCode
// }),
// ].join(''),
// })
module.exports = {
createRow : (options) => {
const trFillers = `<tr>
<td colspan="3" height="11" style="border:0;margin:0;padding:0;font-size:1px;line-height:1px">
<div>&nbsp;</div>
</td>
</tr>
<tr>
<td colspan="3" height="1" style="border:0;margin:0;padding:0;border:1px solid #ffffff;border-width:1px 0 0 0;font-size:1px;line-height:1px;max-height:1px">
<div>&nbsp;</div>
</td>
</tr>`
if(options.btn){
return `<tr>
<td style="border:0;margin:0;padding:0">
<table border="0" cellpadding="0" cellspacing="0" width="100%">
<tbody>
<tr>
<td colspan="3" height="12" style="border:0;margin:0;padding:0;font-size:1px;line-height:1px">
<div>&nbsp;</div>
</td>
</tr>
<tr>
<td style="border:0;margin:0;padding:0;font-size:1px;line-height:1px" width="16">
<div>&nbsp;</div>
</td>
<td style="border:0;margin:0;padding:0;color:#525f7f;font-family:-apple-system,BlinkMacSystemFont,'Segoe UI',Roboto,'Helvetica Neue',Ubuntu,sans-serif;font-size:16px;line-height:24px">
<table border="0" cellpadding="0" cellspacing="0" width="100%">
<tbody>
<tr>
<td align="center" height="38" valign="middle" style="border:0;margin:0;padding:0;background-color:#666ee8;border-radius:5px;text-align:center">
<a style="border:0;margin:0;padding:0;color:#ffffff;display:block;height:38px;text-align:center;text-decoration:none" href="${options.btn.href}" target="_blank">
<span style="border:0;margin:0;padding:0;color:#ffffff;font-family:-apple-system,BlinkMacSystemFont,'Segoe UI',Roboto,'Helvetica Neue',Ubuntu,sans-serif;font-size:16px;font-weight:bold;height:38px;line-height:38px;text-decoration:none;vertical-align:middle;white-space:nowrap;width:100%">
${options.btn.text}
</span>
</a>
</td>
</tr>
</tbody>
</table>
</td>
<td style="border:0;margin:0;padding:0;font-size:1px;line-height:1px" width="16">
<div>&nbsp;</div>
</td>
</tr>
${trFillers}
</tbody>
</table>
</td>
</tr>`
}
return `<tr>
<td style="border:0;margin:0;padding:0">
<table border="0" cellpadding="0" cellspacing="0" width="100%">
<tbody>
<tr>
<td colspan="3" height="12" style="border:0;margin:0;padding:0;font-size:1px;line-height:1px">
<div>&nbsp;</div>
</td>
</tr>
<tr>
<td style="border:0;margin:0;padding:0;font-size:1px;line-height:1px" width="16">
<div>&nbsp;</div>
</td>
<td style="border:0;margin:0;padding:0;color:#525f7f;font-family:-apple-system,BlinkMacSystemFont,'Segoe UI',Roboto,'Helvetica Neue',Ubuntu,sans-serif;font-size:16px;line-height:24px">
<table border="0" cellpadding="0" cellspacing="0" width="100%">
<tbody>
<tr>
<td style="border:0;margin:0;padding:0;color:#8898aa;font-family:-apple-system,BlinkMacSystemFont,'Segoe UI',Roboto,'Helvetica Neue',Ubuntu,sans-serif;font-size:12px;font-weight:bold;line-height:16px;text-transform:uppercase">
${options.title}
</td>
</tr>
</tbody>
</table>
<table border="0" cellpadding="0" cellspacing="0" width="100%">
<tbody>
<tr>
<td height="4" style="border:0;margin:0;padding:0;font-size:1px;line-height:1px;max-height:1px">
<div>&nbsp;</div>
</td>
</tr>
</tbody>
</table>
<table border="0" cellpadding="0" cellspacing="0" width="100%">
<tbody>
<tr>
<td style="border:0;margin:0;padding:0;color:#525f7f;font-family:-apple-system,BlinkMacSystemFont,'Segoe UI',Roboto,'Helvetica Neue',Ubuntu,sans-serif;font-size:16px;line-height:24px">
${options.text}
</td>
</tr>
</tbody>
</table>
</td>
<td style="border:0;margin:0;padding:0;font-size:1px;line-height:1px" width="16">
<div>&nbsp;</div>
</td>
</tr>
${trFillers}
</tbody>
</table>
</td>
</tr>`
},
createFramework : (options) => {
return `<div bgcolor="f6f9fc" style="border:0;margin:40px 0 40px 0;padding:0;min-width:100%;width:100%;/* text-align: center; */">
<table border="0" cellpadding="0" cellspacing="0" width="600" style="min-width:600px">
<tbody>
<tr>
<td style="border:0;margin:0;padding:0;font-size:1px;line-height:1px" width="64">
<div>&nbsp;</div>
</td>
<td style="border:0;margin:0;padding:0;color:#32325d;font-family:-apple-system,BlinkMacSystemFont,'Segoe UI',Roboto,'Helvetica Neue',Ubuntu,sans-serif;font-size:24px;line-height:32px">
<span class="m_4782828237464230064st-Delink m_4782828237464230064st-Delink--title" style="border:0;margin:0;padding:0;color:#32325d;text-decoration:none">${options.title}</span>
</td>
<td style="border:0;margin:0;padding:0;font-size:1px;line-height:1px" width="64">
<div>&nbsp;</div>
</td>
</tr>
<tr>
<td colspan="3" height="8" style="border:0;margin:0;padding:0;font-size:1px;line-height:1px">
<div>&nbsp;</div>
</td>
</tr>
</tbody>
</table>
<table border="0" cellpadding="0" cellspacing="0" width="600" style="min-width:600px">
<tbody>
<tr>
<td style="border:0;margin:0;padding:0;font-size:1px;line-height:1px" width="64">
<div>&nbsp;</div>
</td>
<td style="border:0;margin:0;padding:0;color:#525f7f!important;font-family:-apple-system,BlinkMacSystemFont,'Segoe UI',Roboto,'Helvetica Neue',Ubuntu,sans-serif;font-size:16px;line-height:24px">${options.subtitle}</td>
<td style="border:0;margin:0;padding:0;font-size:1px;line-height:1px" width="64">
<div>&nbsp;</div>
</td>
</tr>
<tr>
<td colspan="3" height="12" style="border:0;margin:0;padding:0;font-size:1px;line-height:1px">
<div>&nbsp;</div>
</td>
</tr>
</tbody>
</table>
<table border="0" cellpadding="0" cellspacing="0" width="600" style="min-width:600px">
<tbody>
<tr>
<td colspan="3" height="4" style="border:0;margin:0;padding:0;font-size:1px;line-height:1px">
<div>&nbsp;</div>
</td>
</tr>
<tr>
<td style="border:0;margin:0;padding:0;font-size:1px;line-height:1px" width="64">
<div>&nbsp;</div>
</td>
<td style="border:0;margin:0;padding:0">
<table bgcolor="f6f9fc" border="0" cellpadding="0" cellspacing="0" style="border-radius:5px" width="100%">
<tbody>
${options.body}
<tr>
<td colspan="3" height="12" style="border:0;margin:0;padding:0;font-size:1px;line-height:1px">
<div>&nbsp;</div>
</td>
</tr>
</tbody>
</table>
</td>
</tr>
</tbody>
</table>
</td>
<td style="border:0;margin:0;padding:0;font-size:1px;line-height:1px" width="64">
<div>&nbsp;</div>
</td>
</tr>
<tr>
<td colspan="3" height="16" style="border:0;margin:0;padding:0;font-size:1px;line-height:1px">
<div>&nbsp;</div>
</td>
</tr>
</tbody>
</table>
<table border="0" cellpadding="0" cellspacing="0" width="600" style="min-width:600px">
<tbody>
<tr>
<td colspan="3" height="20" style="border:0;margin:0;padding:0;font-size:1px;line-height:1px;max-height:1px">
<div>&nbsp;</div>
</td>
</tr>
<tr>
<td style="border:0;margin:0;padding:0;font-size:1px;line-height:1px;max-height:1px" width="64">
<div>&nbsp;</div>
</td>
<td bgcolor="#e6ebf1" height="1" style="border:0;margin:0;padding:0;font-size:1px;line-height:1px;max-height:1px">
<div>&nbsp;</div>
</td>
<td style="border:0;margin:0;padding:0;font-size:1px;line-height:1px;max-height:1px" width="64">
<div>&nbsp;</div>
</td>
</tr>
<tr>
<td colspan="3" height="31" style="border:0;margin:0;padding:0;font-size:1px;line-height:1px;max-height:1px">
<div>&nbsp;</div>
</td>
</tr>
</tbody>
</table>
${options.footerText ? `<table border="0" cellpadding="0" cellspacing="0" width="600" style="min-width:600px">
<tbody>
<tr>
<td style="border:0;margin:0;padding:0;font-size:1px;line-height:1px" width="64">
<div>&nbsp;</div>
</td>
<td style="border:0;margin:0;padding:0;color:#525f7f!important;font-family:-apple-system,BlinkMacSystemFont,'Segoe UI',Roboto,'Helvetica Neue',Ubuntu,sans-serif;font-size:16px;line-height:24px">
${options.footerText}
</td>
<td style="border:0;margin:0;padding:0;font-size:1px;line-height:1px" width="64">
<div>&nbsp;</div>
</td>
</tr>
<tr>
<td colspan="3" height="12" style="border:0;margin:0;padding:0;font-size:1px;line-height:1px">
<div>&nbsp;</div>
</td>
</tr>
</tbody>
</table>` : ''}
</div>`
}
}


@ -14,11 +14,14 @@ module.exports = function(s,config,lang,io){
case's.tx':
s.tx(d.data,d.to)
break;
case'log':
s.systemLog('PLUGIN : '+d.plug+' : ',d)
break;
case's.sqlQuery':
s.sqlQuery(d.query,d.values)
break;
case'log':
s.systemLog('PLUGIN : '+d.plug+' : ',d)
case's.knexQuery':
s.knexQuery(d.options)
break;
}
}
@ -72,10 +75,84 @@ module.exports = function(s,config,lang,io){
s.debugLog(`resetDetectorPluginArray : ${JSON.stringify(pluginArray)}`)
s.detectorPluginArray = pluginArray
}
s.sendToAllDetectors = function(data){
s.detectorPluginArray.forEach(function(name){
s.connectedPlugins[name].tx(data)
})
var currentPluginCpuUsage = {}
var currentPluginGpuUsage = {}
var currentPluginFrameProcessingCount = {}
var pluginHandlersSet = {}
if(config.detectorPluginsCluster){
if(config.clusterUseBasicFrameCount === undefined)config.clusterUseBasicFrameCount = true;
if(config.clusterUseBasicFrameCount){
// overAllProcessingCount
var getPluginWithLowestUtilization = () => {
var selectedPluginServer = null
var lowestUsed = 1000
s.detectorPluginArray.forEach((pluginName) => {
const processCount = currentPluginFrameProcessingCount[pluginName] || 0
if(processCount < lowestUsed){
selectedPluginServer = pluginName
lowestUsed = processCount
}
})
if(selectedPluginServer){
return s.connectedPlugins[selectedPluginServer]
}else{
return {tx: () => {}}
}
}
}else{
if(config.clusterBasedOnGpu){
var getPluginWithLowestUtilization = () => {
var selectedPluginServer = null
var lowestUsed = 1000
s.detectorPluginArray.forEach((pluginName) => {
var overAllPercent = 0
                        var gpus = currentPluginGpuUsage[pluginName] || []
                        gpus.forEach((gpu) => {
                            overAllPercent += gpu.utilization
                        })
                        // compare and store the average GPU utilization, not the raw sum
                        var averagePercent = gpus.length ? overAllPercent / gpus.length : 0
                        if(averagePercent < lowestUsed){
                            selectedPluginServer = pluginName
                            lowestUsed = averagePercent
                        }
})
if(selectedPluginServer){
return s.connectedPlugins[selectedPluginServer]
}else{
return {tx: () => {}}
}
}
}else{
var getPluginWithLowestUtilization = () => {
var selectedPluginServer = null
var lowestUsed = 1000
s.detectorPluginArray.forEach((pluginName) => {
const percent = currentPluginCpuUsage[pluginName]
if(percent < lowestUsed){
selectedPluginServer = pluginName
lowestUsed = percent
}
})
if(selectedPluginServer){
return s.connectedPlugins[selectedPluginServer]
}else{
return {tx: () => {}}
}
}
}
}
s.debugLog(`Detector Plugins running in Cluster Mode`)
s.sendToAllDetectors = function(data){
getPluginWithLowestUtilization().tx(data)
}
}else{
s.sendToAllDetectors = function(data){
s.detectorPluginArray.forEach(function(name){
s.connectedPlugins[name].tx(data)
})
}
}
s.sendDetectorInfoToClient = function(data,txFunction){
s.detectorPluginArray.forEach(function(name){
@ -212,6 +289,7 @@ module.exports = function(s,config,lang,io){
if(cn.ocv && s.ocv){
s.tx({f:'detector_unplugged',plug:s.ocv.plug},'CPU')
delete(s.ocv);
delete(pluginHandlersSet[pluginName])
}
}
var onSocketAuthentication = function(r,cn,d,tx){
@ -223,17 +301,32 @@ module.exports = function(s,config,lang,io){
tx({f:'detector_plugged',plug:s.ocv.plug,notice:s.ocv.notice})
}
}
var addCpuUsageHandler = (cn,pluginName) => {
if(pluginHandlersSet[pluginName])return;
pluginHandlersSet[pluginName] = true
cn.on('cpuUsage',function(percent){
currentPluginCpuUsage[pluginName] = percent
})
cn.on('gpuUsage',function(gpus){
currentPluginGpuUsage[pluginName] = gpus
})
cn.on('processCount',function(count){
currentPluginFrameProcessingCount[pluginName] = count
})
}
var onWebSocketConnection = function(cn){
cn.on('ocv',function(d){
if(!cn.pluginEngine && d.f === 'init'){
if(config.pluginKeys[d.plug] === d.pluginKey){
s.pluginInitiatorSuccess("client",d,cn)
if(config.detectorPluginsCluster)addCpuUsageHandler(cn,d.plug)
}else{
s.pluginInitiatorFail("client",d,cn)
}
}else{
if(config.pluginKeys[d.plug] === d.pluginKey){
s.pluginEventController(d)
if(config.detectorPluginsCluster)addCpuUsageHandler(cn,d.plug)
}else{
cn.disconnect()
}
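With `detectorPluginsCluster` enabled and `clusterUseBasicFrameCount` left at its default, each frame is routed to whichever connected plugin is currently processing the fewest frames. A self-contained sketch of that selection (plugin names and counts are illustrative):

```javascript
// Hypothetical sketch of the clusterUseBasicFrameCount strategy: pick
// the detector plugin with the lowest in-flight frame count, falling
// back to 0 for plugins that have not reported yet.
const currentPluginFrameProcessingCount = { plugA: 5, plugB: 2, plugC: 9 }
function getPluginWithLowestCount(counts){
    let selected = null
    let lowest = 1000
    Object.keys(counts).forEach((name) => {
        const processCount = counts[name] || 0
        if(processCount < lowest){
            selected = name
            lowest = processCount
        }
    })
    return selected
}
console.log(getPluginWithLowestCount(currentPluginFrameProcessingCount)) // → plugB
```

The CPU- and GPU-based variants follow the same shape, only swapping the metric that is compared.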


@ -31,7 +31,7 @@ module.exports = function(process,__dirname){
//UTC Offset
utcOffset : require('moment')().utcOffset(),
//directory path for this file
mainDirectory : __dirname
mainDirectory : process.cwd()
}
s.packageJson = packageJson
if(packageJson.mainDirectory){

libs/scanners.js Normal file

@ -0,0 +1,173 @@
var os = require('os');
var exec = require('child_process').exec;
var onvif = require("node-onvif");
module.exports = function(s,config,lang,app,io){
const activeProbes = {}
const runFFprobe = (url,auth,callback) => {
var endData = {ok: false}
if(!url){
endData.error = 'Missing URL'
callback(endData)
return
}
if(activeProbes[auth]){
endData.error = 'Account is already probing'
callback(endData)
return
}
activeProbes[auth] = 1
const probeCommand = s.splitForFFPMEG(`-v quiet -print_format json -show_format -show_streams -i "${url}"`).join(' ')
exec('ffprobe ' + probeCommand,function(err,stdout,stderr){
delete(activeProbes[auth])
if(err){
endData.error = (err)
}else{
endData.ok = true
endData.result = s.parseJSON(stdout)
}
endData.probe = probeCommand
callback(endData)
})
}
const runOnvifScanner = (options,foundCameraCallback) => {
var ip = options.ip.replace(/ /g,'')
var ports = options.port.replace(/ /g,'')
if(options.ip === ''){
var interfaces = os.networkInterfaces()
var addresses = []
for (var k in interfaces) {
for (var k2 in interfaces[k]) {
var address = interfaces[k][k2]
if (address.family === 'IPv4' && !address.internal) {
addresses.push(address.address)
}
}
}
const addressRange = []
addresses.forEach(function(address){
if(address.indexOf('0.0.0')>-1){return false}
var addressPrefix = address.split('.')
delete(addressPrefix[3]);
addressPrefix = addressPrefix.join('.')
addressRange.push(`${addressPrefix}1-${addressPrefix}254`)
})
ip = addressRange.join(',')
}
if(ports === ''){
ports = '80,8080,8000,7575,8081,9080,8090,8999,8899'
}
if(ports.indexOf('-') > -1){
ports = ports.split('-')
var portRangeStart = ports[0]
var portRangeEnd = ports[1]
ports = s.portRange(portRangeStart,portRangeEnd);
}else{
ports = ports.split(',')
}
var ipList = options.ipList
var onvifUsername = options.user || ''
var onvifPassword = options.pass || ''
ip.split(',').forEach(function(addressRange){
var ipRangeStart
        var ipRangeEnd
        if(addressRange.indexOf('-') > -1){
            addressRange = addressRange.split('-');
            ipRangeStart = addressRange[0]
            ipRangeEnd = addressRange[1]
        }else{
            ipRangeStart = addressRange
            ipRangeEnd = addressRange
        }
if(!ipList){
ipList = s.ipRange(ipRangeStart,ipRangeEnd);
}else{
ipList = ipList.concat(s.ipRange(ipRangeStart,ipRangeEnd))
}
})
var hitList = []
ipList.forEach((ipEntry,n) => {
ports.forEach((portEntry,nn) => {
hitList.push({
xaddr : 'http://' + ipEntry + ':' + portEntry + '/onvif/device_service',
user : onvifUsername,
pass : onvifPassword,
ip: ipEntry,
port: portEntry,
})
})
})
var responseList = []
hitList.forEach(async (camera) => {
try{
var device = new onvif.OnvifDevice(camera)
var info = await device.init()
var date = await device.services.device.getSystemDateAndTime()
var stream = await device.services.media.getStreamUri({
ProfileToken : device.current_profile.token,
Protocol : 'RTSP'
})
var cameraResponse = {
ip: camera.ip,
port: camera.port,
info: info,
date: date,
uri: stream.data.GetStreamUriResponse.MediaUri.Uri
}
responseList.push(cameraResponse)
if(foundCameraCallback)foundCameraCallback(Object.assign(cameraResponse,{f: 'onvif'}))
}catch(err){
const searchError = (find) => {
return s.stringContains(find,err.message,true)
}
var foundDevice = false
var errorMessage = ''
switch(true){
//ONVIF camera found but denied access
case searchError('400'): //Bad Request - Sender not Authorized
foundDevice = true
errorMessage = lang.ONVIFErr400
break;
case searchError('405'): //Method Not Allowed
foundDevice = true
errorMessage = lang.ONVIFErr405
break;
//Webserver exists but undetermined if IP Camera
case searchError('404'): //Not Found
foundDevice = true
errorMessage = lang.ONVIFErr404
break;
}
if(foundDevice && foundCameraCallback)foundCameraCallback({
f: 'onvif',
ff: 'failed_capture',
ip: camera.ip,
port: camera.port,
error: errorMessage
});
s.debugLog(err)
}
})
return responseList
}
const onWebSocketConnection = async (cn) => {
const tx = function(z){if(!z.ke){z.ke=cn.ke;};cn.emit('f',z);}
cn.on('f',(d) => {
switch(d.f){
case'onvif':
runOnvifScanner(d,tx)
break;
}
})
}
s.onWebSocketConnection(onWebSocketConnection)
/**
* API : FFprobe
*/
app.get(config.webPaths.apiPrefix+':auth/probe/:ke',function (req,res){
s.auth(req.params,function(user){
runFFprobe(req.query.url,req.params.auth,(endData) => {
s.closeJsonResponse(res,endData)
})
},res,req);
})
}
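Inside `runOnvifScanner`, the expanded IP list and port list are combined into a cross-product `hitList`, one ONVIF device-service URL per IP/port pair, each of which is then probed. A simplified, self-contained sketch of that construction (the sample addresses are invented):

```javascript
// Simplified sketch of the hitList construction in runOnvifScanner:
// every candidate IP is paired with every candidate port to form the
// ONVIF device_service URL that will be probed.
const ipList = ['192.168.1.10', '192.168.1.11']
const ports = ['80', '8080']
const hitList = []
ipList.forEach((ipEntry) => {
    ports.forEach((portEntry) => {
        hitList.push({
            xaddr: 'http://' + ipEntry + ':' + portEntry + '/onvif/device_service',
            ip: ipEntry,
            port: portEntry
        })
    })
})
console.log(hitList.length) // → 4
```

Wide IP or port ranges grow this list multiplicatively, which is why the scanner narrows the defaults to common camera ports when none are supplied.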


@ -3,7 +3,11 @@ module.exports = function(s,config,lang,app,io){
//Get all Schedules
s.getAllSchedules = function(callback){
s.schedules = {}
s.sqlQuery('SELECT * FROM Schedules',function(err,rows){
s.knexQuery({
action: "select",
columns: "*",
table: "Schedules"
},(err,rows) => {
rows.forEach(function(schedule){
s.updateSchedule(schedule)
})
@ -114,10 +118,11 @@ module.exports = function(s,config,lang,app,io){
var scheduleNames = Object.keys(s.schedules[key])
scheduleNames.forEach(function(name){
var schedule = s.schedules[key][name]
if(!schedule.active && schedule.enabled === 1 && schedule.start && schedule.details.monitorStates){
if(schedule.enabled === 1 && schedule.start && schedule.details.monitorStates){
var timePasses = checkTimeAgainstSchedule(schedule)
var daysPasses = checkDaysAgainstSchedule(schedule)
if(timePasses && daysPasses){
var passed = timePasses && daysPasses
if(passed && !schedule.active){
schedule.active = true
var monitorStates = schedule.details.monitorStates
monitorStates.forEach(function(stateName){
@ -131,7 +136,7 @@ module.exports = function(s,config,lang,app,io){
// console.log(endData)
})
})
}else{
}else if(!passed && schedule.active){
schedule.active = false
}
}
@ -141,7 +146,16 @@ module.exports = function(s,config,lang,app,io){
//
s.findSchedule = function(groupKey,name,callback){
//presetQueryVals = [ke, type, name]
s.sqlQuery("SELECT * FROM Schedules WHERE ke=? AND name=? LIMIT 1",[groupKey,name],function(err,schedules){
s.knexQuery({
action: "select",
columns: "*",
table: "Schedules",
where: [
['ke','=',groupKey],
['name','=',name],
],
limit: 1
},function(err,schedules) {
var schedule
var notFound = false
if(schedules && schedules[0]){
@ -184,22 +198,24 @@ module.exports = function(s,config,lang,app,io){
s.closeJsonResponse(res,endData)
return
}
var theQuery = "SELECT * FROM Schedules WHERE ke=?"
var theQueryValues = [req.params.ke]
var whereQuery = [
['ke','=',req.params.ke]
]
if(req.params.name){
theQuery += ' AND name=?'
theQueryValues.push(req.params.name)
whereQuery.push(['name','=',req.params.name])
}
s.sqlQuery(theQuery,theQueryValues,function(err,schedules){
if(schedules && schedules[0]){
endData.ok = true
schedules.forEach(function(schedule){
s.checkDetails(schedule)
})
endData.schedules = schedules
}else{
endData.msg = user.lang['Not Found']
}
s.knexQuery({
action: "select",
columns: "*",
table: "Schedules",
where: whereQuery,
},function(err,schedules) {
endData.ok = true
schedules = schedules || []
schedules.forEach(function(schedule){
s.checkDetails(schedule)
})
endData.schedules = schedules
s.closeJsonResponse(res,endData)
})
})
@ -243,7 +259,11 @@ module.exports = function(s,config,lang,app,io){
end: form.end,
enabled: form.enabled
}
s.sqlQuery('INSERT INTO Schedules ('+Object.keys(insertData).join(',')+') VALUES (?,?,?,?,?,?)',Object.values(insertData))
s.knexQuery({
action: "insert",
table: "Schedules",
insert: insertData
})
s.tx({
f: 'add_schedule',
insertData: insertData,
@ -256,14 +276,23 @@ module.exports = function(s,config,lang,app,io){
details: s.stringJSON(form.details),
start: form.start,
end: form.end,
enabled: form.enabled,
ke: req.params.ke,
name: req.params.name
enabled: form.enabled
}
s.sqlQuery('UPDATE Schedules SET details=?,start=?,end=?,enabled=? WHERE ke=? AND name=?',Object.values(insertData))
s.knexQuery({
action: "update",
table: "Schedules",
update: insertData,
where: [
['ke','=',req.params.ke],
['name','=',req.params.name],
]
})
s.tx({
f: 'edit_schedule',
insertData: insertData,
insertData: Object.assign(insertData,{
ke: req.params.ke,
name: req.params.name,
}),
ke: req.params.ke,
name: req.params.name
},'GRP_'+req.params.ke)
@@ -286,7 +315,14 @@ module.exports = function(s,config,lang,app,io){
endData.msg = user.lang['Schedule Configuration Not Found']
s.closeJsonResponse(res,endData)
}else{
s.sqlQuery('DELETE FROM Schedules WHERE ke=? AND name=?',[req.params.ke,req.params.name],function(err){
s.knexQuery({
action: "delete",
table: "Schedules",
where : {
ke: req.params.ke,
name: req.params.name,
}
},function(err){
if(!err){
endData.msg = lang["Deleted Schedule Configuration"]
endData.ok = true

libs/shinobiHub.js (new file, 154 lines)

@@ -0,0 +1,154 @@
var fs = require('fs')
var request = require('request')
module.exports = function(s,config,lang,app,io){
if(config.shinobiHubEndpoint === undefined){config.shinobiHubEndpoint = `https://hub.shinobi.video/`}else{config.shinobiHubEndpoint = s.checkCorrectPathEnding(config.shinobiHubEndpoint)}
var stripUsernameAndPassword = function(string,username,password){
if(username)string = string.split(username).join('_USERNAME_')
if(password)string = string.split(password).join('_PASSWORD_')
return string
}
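`stripUsernameAndPassword` masks credentials with a split/join replace-all before a configuration is shared. A standalone copy of the same logic shows the behavior:

```javascript
// Standalone copy of the split/join replacement used by
// stripUsernameAndPassword: every occurrence of the username and
// password is masked so credentials never reach the hub.
function stripCreds(string, username, password) {
    if (username) string = string.split(username).join('_USERNAME_')
    if (password) string = string.split(password).join('_PASSWORD_')
    return string
}

stripCreds('rtsp://admin:secret@192.168.1.10/stream', 'admin', 'secret')
// → 'rtsp://_USERNAME_:_PASSWORD_@192.168.1.10/stream'
```

Note that because this replaces every occurrence, a username that happens to appear elsewhere in the URL (for example in the path) would be masked there as well.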
var validatePostConfiguration = function(fields){
var response = {ok: true}
var fieldsJson
if(!fields.json)fields.json = '{}'
try{
fieldsJson = JSON.parse(fields.json)
}catch(err){
response.ok = false
response.msg = ('Configuration is not JSON format.')
}
if(!fields.details)fields.details = '{}'
try{
fieldsDetails = JSON.parse(fields.details)
}catch(err){
fieldsDetails = {}
}
if(
fields.name === '' ||
fields.brand === '' ||
fields.json === ''
){
response.ok = false
response.msg = ('The form is incomplete.')
}
if(!fields.description)fields.description = ''
if(!fieldsJson.mid){
response.ok = false
response.msg = ('The monitor configuration is incomplete.')
}
var monitorDetails = s.parseJSON(fieldsJson.details)
fieldsJson.details = monitorDetails || {}
fieldsJson.details.auto_host = stripUsernameAndPassword(fieldsJson.details.auto_host,fieldsJson.details.muser,fieldsJson.details.mpass)
fieldsJson.path = stripUsernameAndPassword(fieldsJson.path,fieldsJson.details.muser,fieldsJson.details.mpass)
fieldsJson.details.muser = '_USERNAME_'
fieldsJson.details.mpass = '_PASSWORD_'
response.json = JSON.stringify(fieldsJson)
response.details = JSON.stringify(fieldsDetails)
return response
}
const uploadConfiguration = (shinobiHubApiKey,type,monitorConfig,callback) => {
var validated = validatePostConfiguration({
json: JSON.stringify(monitorConfig)
})
if(validated.ok === true){
request.post({
url: `${config.shinobiHubEndpoint}api/${shinobiHubApiKey}/postConfiguration`,
form: {
"type": type,
"brand": monitorConfig.ke,
"name": monitorConfig.name,
"description": "Backup at " + (new Date()),
"json": validated.json,
"details": JSON.stringify({
// maybe ip address?
})
}
}, function(err,httpResponse,body){
callback(err,s.parseJSON(body) || {ok: false})
})
}else{
callback(new Error(validated.msg),{ok: false})
}
}
const onMonitorSave = async (monitorConfig,form) => {
if(config.shinobiHubAutoBackup === true && config.shinobiHubApiKey){
uploadConfiguration(config.shinobiHubApiKey,'cam',monitorConfig,() => {
// s.userLog({ke:monitorConfig.ke,mid:'$USER'},{type:lang['Websocket Connected'],msg:{for:lang['Superuser'],id:cn.mail,ip:cn.ip}})
})
}
if(s.group[monitorConfig.ke] && s.group[monitorConfig.ke].init.shinobihub === '1'){
uploadConfiguration(s.group[monitorConfig.ke].init.shinobihub_key,'cam',monitorConfig,() => {
// s.userLog({ke:monitorConfig.ke,mid:'$USER'},{type:lang['Websocket Connected'],msg:{for:lang['Superuser'],id:cn.mail,ip:cn.ip}})
})
}
}
app.get([
config.webPaths.apiPrefix + ':auth/getShinobiHubConfigurations/:ke/:type',
config.webPaths.apiPrefix + ':auth/getShinobiHubConfigurations/:ke/:type/:id'
],function (req,res){
s.auth(req.params,function(user){
//query defaults : rowLimit=5, skipOver=0, explore=0
res.setHeader('Content-Type', 'application/json');
var shinobiHubApiKey = s.group[req.params.ke].init.shinobihub_key
if(shinobiHubApiKey){
var queryString = []
if(req.query){
Object.keys(req.query).forEach((key) => {
var value = req.query[key]
queryString.push(key + '=' + value)
})
}
request(`${config.shinobiHubEndpoint}api/${shinobiHubApiKey}/getConfiguration/${req.params.type}${req.params.id ? '/' + req.params.id : ''}${queryString.length > 0 ? '?' + queryString.join('&') : ''}`).pipe(res)
}else{
s.closeJsonResponse(res,{
ok: false,
msg: user.lang['No API Key']
})
}
},res,req)
})
app.get([
config.webPaths.apiPrefix + ':auth/backupMonitorsAllToShinobHub/:ke'
],function (req,res){
s.auth(req.params,function(user){
//query defaults : rowLimit=5, skipOver=0, explore=0
res.setHeader('Content-Type', 'application/json');
var shinobiHubApiKey = s.group[req.params.ke].init.shinobihub_key
if(shinobiHubApiKey){
if(!s.group[req.params.ke].uploadingAllMonitorsToShinobiHub){
s.group[req.params.ke].uploadingAllMonitorsToShinobiHub = true
var current = 0;
var monitorConfigs = s.group[req.params.ke].rawMonitorConfigurations
var monitorIds = Object.keys(monitorConfigs)
var doOneUpload = () => {
if(!monitorIds[current]){
s.group[req.params.ke].uploadingAllMonitorsToShinobiHub = false
s.closeJsonResponse(res,{
ok: true,
})
return;
};
uploadConfiguration(s.group[req.params.ke].init.shinobihub_key,'cam',Object.assign(monitorConfigs[monitorIds[current]],{}),() => {
++current
doOneUpload()
})
}
doOneUpload()
}else{
s.closeJsonResponse(res,{
ok: false,
msg: lang['Already Processing']
})
}
}else{
s.closeJsonResponse(res,{
ok: false,
msg: user.lang['No API Key']
})
}
},res,req)
})
s.onMonitorSave(onMonitorSave)
}
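The `backupMonitorsAllToShinobHub` route uploads one monitor configuration at a time: `doOneUpload` processes the current monitor and only recurses from the upload callback. A dependency-free sketch of that sequential pattern:

```javascript
// Minimal sketch of the one-at-a-time loop used by the backup route:
// process item `current`, then recurse from the callback so only one
// async task is in flight at any moment.
function processSequentially(items, doTask, onDone) {
    let current = 0
    const doOne = function () {
        if (current >= items.length) {
            onDone()
            return
        }
        doTask(items[current], function () {
            ++current
            doOne()
        })
    }
    doOne()
}

const order = []
processSequentially(['a', 'b', 'c'], function (item, next) {
    order.push(item) // stand-in for the HTTP upload
    next()           // call next() when the async work finishes
}, function () {
    console.log(order.join(',')) // prints "a,b,c"
})
```

This avoids flooding the hub with parallel requests, at the cost of total time scaling linearly with the number of monitors.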

(File diff suppressed because it is too large)


@@ -61,6 +61,222 @@ module.exports = function(s,config){
.raw(data.query,data.values)
.asCallback(callback)
}, 4);
const cleanSqlWhereObject = (where) => {
const newWhere = {}
Object.keys(where).forEach((key) => {
if(key !== '__separator'){
const value = where[key]
newWhere[key] = value
}
})
return newWhere
}
const processSimpleWhereCondition = (dbQuery,where,didOne) => {
var whereIsArray = where instanceof Array;
if(where[0] === 'or' || where.__separator === 'or'){
if(whereIsArray){
where.shift()
dbQuery.orWhere(...where)
}else{
where = cleanSqlWhereObject(where)
dbQuery.orWhere(where)
}
}else if(!didOne){
didOne = true
whereIsArray ? dbQuery.where(...where) : dbQuery.where(where)
}else{
whereIsArray ? dbQuery.andWhere(...where) : dbQuery.andWhere(where)
}
}
const processWhereCondition = (dbQuery,where,didOne) => {
var whereIsArray = where instanceof Array;
if(!where[0])return;
if(where[0] && where[0] instanceof Array){
dbQuery.where(function() {
var _this = this
var didOneInsideGroup = false
where.forEach((whereInsideGroup) => {
processWhereCondition(_this,whereInsideGroup,didOneInsideGroup)
})
})
}else if(where[0] && where[0] instanceof Object){
dbQuery.where(function() {
var _this = this
var didOneInsideGroup = false
where.forEach((whereInsideGroup) => {
processSimpleWhereCondition(_this,whereInsideGroup,didOneInsideGroup)
})
})
}else{
processSimpleWhereCondition(dbQuery,where,didOne)
}
}
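`processWhereCondition` dispatches on the shape of each entry: a `[column, operator, value]` triplet, the same triplet prefixed with `'or'`, an object form (optionally carrying `__separator: 'or'`), or a nested array that becomes a parenthesized group. This simplified renderer (not the knex output, just an illustration of the dispatch) makes the shapes concrete:

```javascript
// Illustration of the where shapes processWhereCondition accepts:
//   ['col','=',value]        → AND col = value
//   ['or','col','=',value]   → OR col = value   (array form)
//   { col: value }           → AND col = value  (object form)
//   [ [...], [...] ]         → parenthesized group
function renderCondition(where) {
    if (Array.isArray(where[0])) {
        // nested group → wrap the rendered children in parentheses
        return '(' + where.map(renderCondition).join(' ') + ')'
    }
    if (Array.isArray(where)) {
        const parts = where.slice()
        const joiner = parts[0] === 'or' ? 'OR' : 'AND'
        if (parts[0] === 'or') parts.shift()
        return joiner + ' ' + parts[0] + ' ' + parts[1] + ' ' + JSON.stringify(parts[2])
    }
    // object form; __separator switches the joiner, other keys are columns
    const joiner = where.__separator === 'or' ? 'OR' : 'AND'
    return Object.keys(where).filter(k => k !== '__separator')
        .map(k => joiner + ' ' + k + ' = ' + JSON.stringify(where[k])).join(' ')
}

renderCondition(['ke', '=', 'GRP'])        // → 'AND ke = "GRP"'
renderCondition(['or', 'mid', '=', 'cam1']) // → 'OR mid = "cam1"'
```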
const knexError = (dbQuery,options,err) => {
console.error('knexError----------------------------------- START')
if(config.debugLogVerbose && config.debugLog === true){
s.debugLog('s.knexQuery QUERY',JSON.stringify(options,null,3))
s.debugLog('STACK TRACE, NOT AN ERROR',new Error())
}
console.error(err)
console.error(dbQuery.toString())
console.error('knexError----------------------------------- END')
}
const knexQuery = (options,callback) => {
try{
if(!s.databaseEngine)return// console.log('Database Not Set');
// options = {
// action: "",
// columns: "",
// table: ""
// }
var dbQuery
switch(options.action){
case'select':
options.columns = options.columns.indexOf(',') === -1 ? [options.columns] : options.columns.split(',');
dbQuery = s.databaseEngine.select(...options.columns).from(options.table)
break;
case'count':
options.columns = options.columns.indexOf(',') === -1 ? [options.columns] : options.columns.split(',');
dbQuery = s.databaseEngine(options.table)
dbQuery.count(options.columns)
break;
case'update':
dbQuery = s.databaseEngine(options.table).update(options.update)
break;
case'delete':
dbQuery = s.databaseEngine(options.table)
break;
case'insert':
dbQuery = s.databaseEngine(options.table).insert(options.insert)
break;
}
if(options.where instanceof Array){
var didOne = false;
options.where.forEach((where) => {
processWhereCondition(dbQuery,where,didOne)
})
}else if(options.where instanceof Object){
dbQuery.where(options.where)
}
if(options.action === 'delete'){
dbQuery.del()
}
if(options.orderBy){
dbQuery.orderBy(...options.orderBy)
}
if(options.groupBy){
dbQuery.groupBy(options.groupBy)
}
if(options.limit){
if(`${options.limit}`.indexOf(',') === -1){
dbQuery.limit(options.limit)
}else{
const limitParts = `${options.limit}`.split(',')
dbQuery.limit(limitParts[1]).offset(limitParts[0])
}
}
if(config.debugLog === true){
console.log(dbQuery.toString())
}
if(callback || options.update || options.insert || options.action === 'delete'){
dbQuery.asCallback(function(err,r) {
if(err){
knexError(dbQuery,options,err)
}
if(callback)callback(err,r)
if(config.debugLogVerbose && config.debugLog === true){
s.debugLog('s.knexQuery QUERY',JSON.stringify(options,null,3))
s.debugLog('s.knexQuery RESPONSE',JSON.stringify(r,null,3))
s.debugLog('STACK TRACE, NOT AN ERROR',new Error())
}
})
}
return dbQuery
}catch(err){
if(callback)callback(err,[])
knexError(dbQuery,options,err)
}
}
const getDatabaseRows = function(options,callback){
//current cant handle `end` time
var whereQuery = [
['ke','=',options.groupKey],
]
const monitorRestrictions = options.monitorRestrictions
var frameLimit = options.limit
const endIsStartTo = options.endIsStartTo
// reassigned below when a single `date` is given, so these cannot be const
var chosenDate = options.date
var startDate = options.startDate ? s.stringToSqlTime(options.startDate) : null
var endDate = options.endDate ? s.stringToSqlTime(options.endDate) : null
const startOperator = options.startOperator || '>='
const endOperator = options.endOperator || '<='
const rowType = options.rowType || 'rows'
if(chosenDate){
if(chosenDate.indexOf('-') === -1 && !isNaN(chosenDate)){
chosenDate = parseInt(chosenDate)
}
var selectedDate = chosenDate
if(typeof chosenDate === 'string' && chosenDate.indexOf('.') > -1){
selectedDate = chosenDate.split('.')[0]
}
selectedDate = new Date(selectedDate)
var utcSelectedDate = new Date(selectedDate.getTime() + selectedDate.getTimezoneOffset() * 60000)
startDate = moment(utcSelectedDate).format('YYYY-MM-DD HH:mm:ss')
var dayAfter = utcSelectedDate
dayAfter.setDate(dayAfter.getDate() + 1)
endDate = moment(dayAfter).format('YYYY-MM-DD HH:mm:ss')
}
if(startDate){
if(endDate){
whereQuery.push(['time',startOperator,startDate])
whereQuery.push([endIsStartTo ? 'time' : 'end',endOperator,endDate])
}else{
whereQuery.push(['time',startOperator,startDate])
}
}
if(monitorRestrictions && monitorRestrictions.length > 0){
whereQuery.push(monitorRestrictions)
}
if(options.archived){
whereQuery.push(['details','LIKE',`%"archived":"1"%`])
}
if(options.filename){
whereQuery.push(['filename','=',options.filename])
frameLimit = "1";
}
options.orderBy = options.orderBy ? options.orderBy : ['time','desc']
if(options.count)options.groupBy = options.groupBy ? options.groupBy : options.orderBy[0]
knexQuery({
action: options.count ? "count" : "select",
columns: options.columns || "*",
table: options.table,
where: whereQuery,
orderBy: options.orderBy,
groupBy: options.groupBy,
limit: frameLimit || '500'
},(err,r) => {
if(err){
callback({
ok: false,
total: 0,
limit: frameLimit,
[rowType]: []
})
}else{
r.forEach(function(file){
file.details = s.parseJSON(file.details)
})
callback({
ok: true,
total: r.length,
limit: frameLimit,
[rowType]: r
})
}
})
}
s.knexQuery = knexQuery
s.getDatabaseRows = getDatabaseRows
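The `limit` option accepted by `knexQuery` is either a plain number or a MySQL-style `"offset,count"` string, split on the comma into `.offset()` and `.limit()` calls. A small mirror of that parsing:

```javascript
// Mirrors knexQuery's limit handling: a bare number limits rows,
// while "offset,count" also skips rows (MySQL LIMIT syntax).
function parseLimit(limit) {
    const asString = `${limit}`
    if (asString.indexOf(',') === -1) {
        return { offset: 0, count: parseInt(asString) }
    }
    const parts = asString.split(',')
    return { offset: parseInt(parts[0]), count: parseInt(parts[1]) }
}

parseLimit('500')    // → { offset: 0, count: 500 }
parseLimit('100,50') // → { offset: 100, count: 50 }
```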
s.sqlQuery = function(query,values,onMoveOn,hideLog){
if(!values){values=[]}
if(typeof values === 'function'){
@@ -109,55 +325,209 @@ module.exports = function(s,config){
var mySQLtail = ''
if(config.databaseType === 'mysql'){
mySQLtail = ' ENGINE=InnoDB DEFAULT CHARSET=utf8'
//add Presets table and modernize
var createPresetsTableQuery = 'CREATE TABLE IF NOT EXISTS `Presets` ( `ke` varchar(50) DEFAULT NULL, `name` text, `details` text, `type` varchar(50) DEFAULT NULL)'
s.sqlQuery( createPresetsTableQuery + mySQLtail + ';',[],function(err){
if(err)console.error(err)
if(config.databaseType === 'sqlite3'){
var aQuery = "ALTER TABLE Presets RENAME TO _Presets_old;"
aQuery += createPresetsTableQuery
aQuery += "INSERT INTO Presets (`ke`, `name`, `details`, `type`) SELECT `ke`, `name`, `details`, `type` FROM _Presets_old;COMMIT;DROP TABLE _Presets_old;"
}else{
s.sqlQuery('ALTER TABLE `Presets` CHANGE COLUMN `type` `type` VARCHAR(50) NULL DEFAULT NULL AFTER `details`;',[],function(err){
if(err)console.error(err)
},true)
}
},true)
//add Schedules table, will remove in future
s.sqlQuery("CREATE TABLE IF NOT EXISTS `Schedules` (`ke` varchar(50) DEFAULT NULL,`name` text,`details` text,`start` varchar(10) DEFAULT NULL,`end` varchar(10) DEFAULT NULL,`enabled` int(1) NOT NULL DEFAULT '1')" + mySQLtail + ';',[],function(err){
if(err)console.error(err)
},true)
//add Timelapses and Timelapse Frames tables, will remove in future
s.sqlQuery("CREATE TABLE IF NOT EXISTS `Timelapses` (`ke` varchar(50) NOT NULL,`mid` varchar(50) NOT NULL,`details` longtext,`date` date NOT NULL,`time` timestamp NOT NULL,`end` timestamp NOT NULL,`size` int(11)NOT NULL)" + mySQLtail + ';',[],function(err){
if(err)console.error(err)
},true)
s.sqlQuery("CREATE TABLE IF NOT EXISTS `Timelapse Frames` (`ke` varchar(50) NOT NULL,`mid` varchar(50) NOT NULL,`details` longtext,`filename` varchar(50) NOT NULL,`time` timestamp NULL DEFAULT NULL,`size` int(11) NOT NULL)" + mySQLtail + ';',[],function(err){
if(err)console.error(err)
},true)
//Add index to Videos table
s.sqlQuery('CREATE INDEX `videos_index` ON Videos(`time`);',[],function(err){
if(err && err.code !== 'ER_DUP_KEYNAME'){
console.error(err)
}
},true)
//Add index to Events table
s.sqlQuery('CREATE INDEX `events_index` ON Events(`ke`, `mid`, `time`);',[],function(err){
if(err && err.code !== 'ER_DUP_KEYNAME'){
console.error(err)
}
},true)
//Add index to Logs table
s.sqlQuery('CREATE INDEX `logs_index` ON Logs(`ke`, `mid`, `time`);',[],function(err){
if(err && err.code !== 'ER_DUP_KEYNAME'){
console.error(err)
}
},true)
//Add index to Monitors table
s.sqlQuery('CREATE INDEX `monitors_index` ON Monitors(`ke`, `mode`, `type`, `ext`);',[],function(err){
if(err && err.code !== 'ER_DUP_KEYNAME'){
console.error(err)
}
},true)
//Add index to Timelapse Frames table
s.sqlQuery('CREATE INDEX `timelapseframes_index` ON `Timelapse Frames`(`ke`, `mid`, `time`);',[],function(err){
if(err && err.code !== 'ER_DUP_KEYNAME'){
console.error(err)
}
},true)
//add Cloud Videos table, will remove in future
s.sqlQuery('CREATE TABLE IF NOT EXISTS `Cloud Videos` (`mid` varchar(50) NOT NULL,`ke` varchar(50) DEFAULT NULL,`href` text NOT NULL,`size` float DEFAULT NULL,`time` timestamp NULL DEFAULT NULL,`end` timestamp NULL DEFAULT NULL,`status` int(1) DEFAULT \'0\',`details` text)' + mySQLtail + ';',[],function(err){
if(err)console.error(err)
},true)
//add Events Counts table, will remove in future
s.sqlQuery('CREATE TABLE IF NOT EXISTS `Events Counts` (`ke` varchar(50) NOT NULL,`mid` varchar(50) NOT NULL,`details` longtext NOT NULL,`time` timestamp NOT NULL DEFAULT current_timestamp(),`end` timestamp NOT NULL DEFAULT current_timestamp(),`count` int(10) NOT NULL DEFAULT 1,`tag` varchar(30) DEFAULT NULL)' + mySQLtail + ';',[],function(err){
if(err && err.code !== 'ER_TABLE_EXISTS_ERROR'){
console.error(err)
}
s.sqlQuery('ALTER TABLE `Events Counts` ADD COLUMN `time` TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP AFTER `details`;',[],function(err){
// console.error(err)
},true)
},true)
//add Cloud Timelapse Frames table, will remove in future
s.sqlQuery('CREATE TABLE IF NOT EXISTS `Cloud Timelapse Frames` (`ke` varchar(50) NOT NULL,`mid` varchar(50) NOT NULL,`href` text NOT NULL,`details` longtext,`filename` varchar(50) NOT NULL,`time` timestamp NULL DEFAULT NULL,`size` int(11) NOT NULL)' + mySQLtail + ';',[],function(err){
if(err)console.error(err)
},true)
//create Files table
var createFilesTableQuery = "CREATE TABLE IF NOT EXISTS `Files` (`ke` varchar(50) NOT NULL,`mid` varchar(50) NOT NULL,`name` tinytext NOT NULL,`size` float NOT NULL DEFAULT '0',`details` text NOT NULL,`status` int(1) NOT NULL DEFAULT '0',`time` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP)"
s.sqlQuery(createFilesTableQuery + mySQLtail + ';',[],function(err){
if(err)console.error(err)
//add time column to Files table
if(config.databaseType === 'sqlite3'){
var aQuery = "ALTER TABLE Files RENAME TO _Files_old;"
aQuery += createFilesTableQuery // fixed: the Files migration must recreate the Files table, not Presets
aQuery += "INSERT INTO Files (`ke`, `mid`, `name`, `details`, `size`, `status`, `time`) SELECT `ke`, `mid`, `name`, `details`, `size`, `status`, `time` FROM _Files_old;COMMIT;DROP TABLE _Files_old;"
}else{
s.sqlQuery('ALTER TABLE `Files` ADD COLUMN `time` TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP AFTER `status`;',[],function(err){
if(err && err.sqlMessage && err.sqlMessage.indexOf('Duplicate') === -1)console.error(err)
},true)
}
},true)
}
//add Presets table and modernize
var createPresetsTableQuery = 'CREATE TABLE IF NOT EXISTS `Presets` ( `ke` varchar(50) DEFAULT NULL, `name` text, `details` text, `type` varchar(50) DEFAULT NULL)'
s.sqlQuery( createPresetsTableQuery + mySQLtail + ';',[],function(err){
if(err)console.error(err)
if(config.databaseType === 'sqlite3'){
var aQuery = "ALTER TABLE Presets RENAME TO _Presets_old;"
aQuery += createPresetsTableQuery
aQuery += "INSERT INTO Presets (`ke`, `name`, `details`, `type`) SELECT `ke`, `name`, `details`, `type` FROM _Presets_old;COMMIT;DROP TABLE _Presets_old;"
}else{
s.sqlQuery('ALTER TABLE `Presets` CHANGE COLUMN `type` `type` VARCHAR(50) NULL DEFAULT NULL AFTER `details`;',[],function(err){
if(err)console.error(err)
},true)
}
},true)
//add Schedules table, will remove in future
s.sqlQuery("CREATE TABLE IF NOT EXISTS `Schedules` (`ke` varchar(50) DEFAULT NULL,`name` text,`details` text,`start` varchar(10) DEFAULT NULL,`end` varchar(10) DEFAULT NULL,`enabled` int(1) NOT NULL DEFAULT '1')" + mySQLtail + ';',[],function(err){
if(err)console.error(err)
},true)
//add Timelapses and Timelapse Frames tables, will remove in future
s.sqlQuery("CREATE TABLE IF NOT EXISTS `Timelapses` (`ke` varchar(50) NOT NULL,`mid` varchar(50) NOT NULL,`details` longtext,`date` date NOT NULL,`time` timestamp NOT NULL,`end` timestamp NOT NULL,`size` int(11)NOT NULL)" + mySQLtail + ';',[],function(err){
if(err)console.error(err)
},true)
s.sqlQuery("CREATE TABLE IF NOT EXISTS `Timelapse Frames` (`ke` varchar(50) NOT NULL,`mid` varchar(50) NOT NULL,`details` longtext,`filename` varchar(50) NOT NULL,`time` timestamp NULL DEFAULT NULL,`size` int(11) NOT NULL)" + mySQLtail + ';',[],function(err){
if(err)console.error(err)
},true)
//add Cloud Videos table, will remove in future
s.sqlQuery('CREATE TABLE IF NOT EXISTS `Cloud Videos` (`mid` varchar(50) NOT NULL,`ke` varchar(50) DEFAULT NULL,`href` text NOT NULL,`size` float DEFAULT NULL,`time` timestamp NULL DEFAULT NULL,`end` timestamp NULL DEFAULT NULL,`status` int(1) DEFAULT \'0\',`details` text)' + mySQLtail + ';',[],function(err){
if(err)console.error(err)
},true)
//add Cloud Timelapse Frames table, will remove in future
s.sqlQuery('CREATE TABLE IF NOT EXISTS `Cloud Timelapse Frames` (`ke` varchar(50) NOT NULL,`mid` varchar(50) NOT NULL,`href` text NOT NULL,`details` longtext,`filename` varchar(50) NOT NULL,`time` timestamp NULL DEFAULT NULL,`size` int(11) NOT NULL)' + mySQLtail + ';',[],function(err){
if(err)console.error(err)
},true)
//create Files table
var createFilesTableQuery = "CREATE TABLE IF NOT EXISTS `Files` (`ke` varchar(50) NOT NULL,`mid` varchar(50) NOT NULL,`name` tinytext NOT NULL,`size` float NOT NULL DEFAULT '0',`details` text NOT NULL,`status` int(1) NOT NULL DEFAULT '0',`time` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP)"
s.sqlQuery(createFilesTableQuery + mySQLtail + ';',[],function(err){
if(err)console.error(err)
//add time column to Files table
if(config.databaseType === 'sqlite3'){
var aQuery = "ALTER TABLE Files RENAME TO _Files_old;"
aQuery += createFilesTableQuery // fixed: the Files migration must recreate the Files table, not Presets
aQuery += "INSERT INTO Files (`ke`, `mid`, `name`, `details`, `size`, `status`, `time`) SELECT `ke`, `mid`, `name`, `details`, `size`, `status`, `time` FROM _Files_old;COMMIT;DROP TABLE _Files_old;"
}else{
s.sqlQuery('ALTER TABLE `Files` ADD COLUMN `time` TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP AFTER `status`;',[],function(err){
if(err && err.sqlMessage && err.sqlMessage.indexOf('Duplicate') === -1)console.error(err)
},true)
}
},true)
delete(s.preQueries)
}
s.sqlQueryBetweenTimesWithPermissions = (options,callback) => {
// options = {
// table: 'Events Counts',
// user: user,
// monitorId: req.params.id,
// startTime: req.query.start,
// endTime: req.query.end,
// startTimeOperator: req.query.startOperator,
// endTimeOperator: req.query.endOperator,
// limit: req.query.limit,
// archived: req.query.archived,
// endIsStartTo: !!req.query.endIsStartTo,
// parseRowDetails: true,
// rowName: 'counts'
// }
const rowName = options.rowName || 'rows'
const preliminaryValidationFailed = options.preliminaryValidationFailed || false
if(preliminaryValidationFailed){
if(options.noFormat){
callback([]);
}else{
callback({
ok: true,
[rowName]: [],
})
}
return
}
const user = options.user
const groupKey = options.groupKey
const monitorId = options.monitorId
const archived = options.archived
const theTableSelected = options.table
const endIsStartTo = options.endIsStartTo
const userDetails = user.details
var endTime = options.endTime
var startTimeOperator = options.startTimeOperator
var endTimeOperator = options.endTimeOperator
var startTime = options.startTime
var limitString = `${options.limit}`
const monitorRestrictions = s.getMonitorRestrictions(options.user.details,monitorId)
getDatabaseRows({
monitorRestrictions: monitorRestrictions,
table: theTableSelected,
groupKey: groupKey,
startDate: startTime,
endDate: endTime,
startOperator: startTimeOperator,
endOperator: endTimeOperator,
limit: options.limit,
archived: archived,
rowType: rowName,
endIsStartTo: endIsStartTo
},(response) => {
const limit = response.limit
const r = response[rowName];
if(!r){
callback({
total: 0,
limit: response.limit,
skip: 0,
[rowName]: []
});
return
}
if(options.parseRowDetails){
r.forEach((row) => {
row.details = JSON.parse(row.details)
})
}
if(options.noCount){
if(options.noFormat){
callback(r)
}else{
callback({
ok: true,
limit: response.limit,
[rowName]: r,
endIsStartTo: endIsStartTo
})
}
}else{
getDatabaseRows({
monitorRestrictions: monitorRestrictions,
columns: 'time',
count: true,
table: theTableSelected,
groupKey: groupKey,
startDate: startTime,
endDate: endTime,
startOperator: startTimeOperator,
endOperator: endTimeOperator,
archived: archived,
type: 'count',
endIsStartTo: endIsStartTo
},(response) => {
const countRows = response.rows || []
var skipOver = 0
if(limitString.indexOf(',') > -1){
skipOver = parseInt(limitString.split(',')[0])
limitString = parseInt(limitString.split(',')[1])
}else{
limitString = parseInt(limitString)
}
callback({
total: countRows[0] ? countRows[0]['count(*)'] : 0,
limit: response.limit,
skip: skipOver,
[rowName]: r,
endIsStartTo: endIsStartTo
})
})
}
})
}
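`getDatabaseRows` normalizes its time bounds to `'YYYY-MM-DD HH:mm:ss'` strings via moment after shifting by the timezone offset. The target format itself can be produced without moment; this UTC-getter sketch shows only the formatting, not the offset shift the real code performs:

```javascript
// Moment-free sketch of the 'YYYY-MM-DD HH:mm:ss' format that
// getDatabaseRows produces for its SQL time bounds. The real code
// uses moment plus a getTimezoneOffset() shift; this only shows
// the output format.
function toSqlTime(date) {
    const pad = (n) => `${n}`.padStart(2, '0')
    return date.getUTCFullYear() + '-' +
        pad(date.getUTCMonth() + 1) + '-' +
        pad(date.getUTCDate()) + ' ' +
        pad(date.getUTCHours()) + ':' +
        pad(date.getUTCMinutes()) + ':' +
        pad(date.getUTCSeconds())
}

toSqlTime(new Date(Date.UTC(2020, 9, 1, 20, 15, 3)))
// → '2020-10-01 20:15:03'
```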
}


@@ -6,384 +6,446 @@ var crypto = require('crypto');
var exec = require('child_process').exec;
var execSync = require('child_process').execSync;
module.exports = function(s,config,lang,io){
console.log('FFmpeg version : '+s.ffmpegVersion)
console.log('Node.js version : '+process.version)
s.processReady = function(){
s.systemLog(lang.startUpText5)
s.onProcessReadyExtensions.forEach(function(extender){
extender(true)
})
process.send('ready')
}
var checkForTerminalCommands = function(callback){
var next = function(){
if(callback)callback()
}
if(!s.isWin && s.packageJson.mainDirectory !== '.'){
var etcPath = '/etc/shinobisystems/cctv.txt'
fs.stat(etcPath,function(err,stat){
if(err || !stat){
exec('node '+ s.mainDirectory + '/INSTALL/terminalCommands.js',function(err){
if(err)console.log(err)
})
}
next()
return new Promise((resolve, reject) => {
var checkedAdminUsers = {}
console.log('FFmpeg version : '+s.ffmpegVersion)
console.log('Node.js version : '+process.version)
s.processReady = function(){
delete(checkedAdminUsers)
resolve()
s.systemLog(lang.startUpText5)
s.onProcessReadyExtensions.forEach(function(extender){
extender(true)
})
}else{
next()
process.send('ready')
}
}
var loadedAccounts = []
var foundMonitors = []
var loadMonitors = function(callback){
s.beforeMonitorsLoadedOnStartupExtensions.forEach(function(extender){
extender()
})
s.systemLog(lang.startUpText4)
//preliminary monitor start
s.sqlQuery('SELECT * FROM Monitors', function(err,monitors) {
foundMonitors = monitors
if(err){s.systemLog(err)}
var checkForTerminalCommands = function(callback){
var next = function(){
if(callback)callback()
}
if(!s.isWin && s.packageJson.mainDirectory !== '.'){
var etcPath = '/etc/shinobisystems/cctv.txt'
fs.stat(etcPath,function(err,stat){
if(err || !stat){
exec('node '+ s.mainDirectory + '/INSTALL/terminalCommands.js',function(err){
if(err)console.log(err)
})
}
next()
})
}else{
next()
}
}
var loadedAccounts = []
var foundMonitors = []
var loadMonitors = function(callback){
s.beforeMonitorsLoadedOnStartupExtensions.forEach(function(extender){
extender()
})
s.systemLog(lang.startUpText4)
//preliminary monitor start
s.knexQuery({
action: "select",
columns: "*",
table: "Monitors",
},function(err,monitors) {
foundMonitors = monitors
if(err){s.systemLog(err)}
if(monitors && monitors[0]){
var didNotLoad = 0
var loadCompleted = 0
var orphanedVideosForMonitors = {}
var loadMonitor = function(monitor){
const checkAnother = function(){
++loadCompleted
if(monitors[loadCompleted]){
loadMonitor(monitors[loadCompleted])
}else{
if(didNotLoad > 0)console.log(`${didNotLoad} Monitor${didNotLoad === 1 ? '' : 's'} not loaded because no Admin user exists for them. The account may have been deleted.`);
callback()
}
}
if(checkedAdminUsers[monitor.ke]){
setTimeout(function(){
if(!orphanedVideosForMonitors[monitor.ke])orphanedVideosForMonitors[monitor.ke] = {}
if(!orphanedVideosForMonitors[monitor.ke][monitor.mid])orphanedVideosForMonitors[monitor.ke][monitor.mid] = 0
s.initiateMonitorObject(monitor)
s.group[monitor.ke].rawMonitorConfigurations[monitor.mid] = monitor
s.sendMonitorStatus({id:monitor.mid,ke:monitor.ke,status:'Stopped'});
var monObj = Object.assign(monitor,{id : monitor.mid})
s.camera(monitor.mode,monObj)
checkAnother()
},1000)
}else{
++didNotLoad
checkAnother()
}
}
loadMonitor(monitors[loadCompleted])
}else{
callback()
}
})
}
var checkForOrphanedVideos = function(callback){
var monitors = foundMonitors
if(monitors && monitors[0]){
var loadCompleted = 0
var orphanedVideosForMonitors = {}
var loadMonitor = function(monitor){
setTimeout(function(){
if(!orphanedVideosForMonitors[monitor.ke])orphanedVideosForMonitors[monitor.ke] = {}
if(!orphanedVideosForMonitors[monitor.ke][monitor.mid])orphanedVideosForMonitors[monitor.ke][monitor.mid] = 0
s.initiateMonitorObject(monitor)
s.group[monitor.ke].rawMonitorConfigurations[monitor.mid] = monitor
s.sendMonitorStatus({id:monitor.mid,ke:monitor.ke,status:'Stopped'});
var monObj = Object.assign(monitor,{id : monitor.mid})
s.camera(monitor.mode,monObj)
var checkForOrphanedVideosForMonitor = function(monitor){
if(!orphanedVideosForMonitors[monitor.ke])orphanedVideosForMonitors[monitor.ke] = {}
if(!orphanedVideosForMonitors[monitor.ke][monitor.mid])orphanedVideosForMonitors[monitor.ke][monitor.mid] = 0
s.orphanedVideoCheck(monitor,null,function(orphanedFilesCount){
if(orphanedFilesCount){
orphanedVideosForMonitors[monitor.ke][monitor.mid] += orphanedFilesCount
}
++loadCompleted
if(monitors[loadCompleted]){
loadMonitor(monitors[loadCompleted])
checkForOrphanedVideosForMonitor(monitors[loadCompleted])
}else{
s.systemLog(lang.startUpText6+' : '+s.s(orphanedVideosForMonitors))
delete(foundMonitors)
callback()
}
},2000)
})
}
loadMonitor(monitors[loadCompleted])
checkForOrphanedVideosForMonitor(monitors[loadCompleted])
}else{
callback()
}
})
}
var checkForOrphanedVideos = function(callback){
var monitors = foundMonitors
if(monitors && monitors[0]){
var loadCompleted = 0
var orphanedVideosForMonitors = {}
var checkForOrphanedVideosForMonitor = function(monitor){
if(!orphanedVideosForMonitors[monitor.ke])orphanedVideosForMonitors[monitor.ke] = {}
if(!orphanedVideosForMonitors[monitor.ke][monitor.mid])orphanedVideosForMonitors[monitor.ke][monitor.mid] = 0
s.orphanedVideoCheck(monitor,null,function(orphanedFilesCount){
if(orphanedFilesCount){
orphanedVideosForMonitors[monitor.ke][monitor.mid] += orphanedFilesCount
}
++loadCompleted
if(monitors[loadCompleted]){
checkForOrphanedVideosForMonitor(monitors[loadCompleted])
}else{
s.systemLog(lang.startUpText6+' : '+s.s(orphanedVideosForMonitors))
delete(foundMonitors)
callback()
}
})
}
checkForOrphanedVideosForMonitor(monitors[loadCompleted])
}else{
callback()
}
}
var loadDiskUseForUser = function(user,callback){
s.systemLog(user.mail+' : '+lang.startUpText0)
var userDetails = JSON.parse(user.details)
s.group[user.ke].sizeLimit = parseFloat(userDetails.size) || 10000
s.group[user.ke].sizeLimitVideoPercent = parseFloat(userDetails.size_video_percent) || 90
s.group[user.ke].sizeLimitTimelapseFramesPercent = parseFloat(userDetails.size_timelapse_percent) || 10
s.sqlQuery('SELECT * FROM Videos WHERE ke=? AND status!=?',[user.ke,0],function(err,videos){
s.sqlQuery('SELECT * FROM `Timelapse Frames` WHERE ke=?',[user.ke],function(err,timelapseFrames){
s.sqlQuery('SELECT * FROM `Files` WHERE ke=?',[user.ke],function(err,files){
var usedSpaceVideos = 0
var usedSpaceTimelapseFrames = 0
var usedSpaceFilebin = 0
var addStorageData = {
files: [],
videos: [],
timelapeFrames: [],
}
var loadDiskUseForUser = function(user,callback){
s.systemLog(user.mail+' : '+lang.startUpText0)
var userDetails = JSON.parse(user.details)
s.group[user.ke].sizeLimit = parseFloat(userDetails.size) || 10000
s.group[user.ke].sizeLimitVideoPercent = parseFloat(userDetails.size_video_percent) || 90
s.group[user.ke].sizeLimitTimelapseFramesPercent = parseFloat(userDetails.size_timelapse_percent) || 10
s.knexQuery({
action: "select",
columns: "*",
table: "Videos",
where: [
['ke','=',user.ke],
['status','!=',0],
]
},function(err,videos) {
s.knexQuery({
action: "select",
columns: "*",
table: "Timelapse Frames",
where: [
['ke','=',user.ke],
]
},function(err,timelapseFrames) {
s.knexQuery({
action: "select",
columns: "*",
table: "Files",
where: [
['ke','=',user.ke],
]
},function(err,files) {
var usedSpaceVideos = 0
var usedSpaceTimelapseFrames = 0
var usedSpaceFilebin = 0
var addStorageData = {
files: [],
videos: [],
timelapeFrames: [],
}
if(videos && videos[0]){
videos.forEach(function(video){
video.details = s.parseJSON(video.details)
if(!video.details.dir){
usedSpaceVideos += video.size
}else{
addStorageData.videos.push(video)
}
})
}
if(timelapseFrames && timelapseFrames[0]){
timelapseFrames.forEach(function(frame){
frame.details = s.parseJSON(frame.details)
if(!frame.details.dir){
usedSpaceTimelapseFrames += frame.size
}else{
addStorageData.timelapeFrames.push(frame)
}
})
}
if(files && files[0]){
files.forEach(function(file){
file.details = s.parseJSON(file.details)
if(!file.details.dir){
usedSpaceFilebin += file.size
}else{
addStorageData.files.push(file)
}
})
}
s.group[user.ke].usedSpace = (usedSpaceVideos + usedSpaceTimelapseFrames + usedSpaceFilebin) / 1048576
s.group[user.ke].usedSpaceVideos = usedSpaceVideos / 1048576
s.group[user.ke].usedSpaceFilebin = usedSpaceFilebin / 1048576
s.group[user.ke].usedSpaceTimelapseFrames = usedSpaceTimelapseFrames / 1048576
loadAddStorageDiskUseForUser(user,addStorageData,function(){
callback()
})
})
})
})
}
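The `usedSpace` counters above are accumulated in bytes from the row sizes and converted to megabytes by dividing by 1048576 (1024 × 1024):

```javascript
// The disk-use totals are stored in bytes and converted to megabytes
// by dividing by 1048576 (1024 * 1024), as s.group[ke].usedSpace does.
const BYTES_PER_MB = 1024 * 1024 // 1048576

function bytesToMegabytes(bytes) {
    return bytes / BYTES_PER_MB
}

bytesToMegabytes(1048576)  // → 1
bytesToMegabytes(52428800) // → 50
```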
var loadCloudDiskUseForUser = function(user,callback){
var userDetails = JSON.parse(user.details)
user.cloudDiskUse = {}
user.size = 0
user.limit = userDetails.size
s.cloudDisksLoaded.forEach(function(storageType){
user.cloudDiskUse[storageType] = {
usedSpace : 0,
firstCount : 0
}
if(s.cloudDiskUseStartupExtensions[storageType])s.cloudDiskUseStartupExtensions[storageType](user,userDetails)
})
var loadCloudVideos = function(callback){
s.knexQuery({
action: "select",
columns: "*",
table: "Cloud Videos",
where: [
['ke','=',user.ke],
['status','!=',0],
]
},function(err,videos) {
if(videos && videos[0]){
videos.forEach(function(video){
var storageType = JSON.parse(video.details).type
if(!storageType)storageType = 's3'
var videoSize = video.size / 1048576
user.cloudDiskUse[storageType].usedSpace += videoSize
user.cloudDiskUse[storageType].usedSpaceVideos += videoSize
++user.cloudDiskUse[storageType].firstCount
})
s.cloudDisksLoaded.forEach(function(storageType){
var firstCount = user.cloudDiskUse[storageType].firstCount
s.systemLog(user.mail+' : '+lang.startUpText1+' : '+firstCount,storageType,user.cloudDiskUse[storageType].usedSpace)
delete(user.cloudDiskUse[storageType].firstCount)
})
}
callback()
})
}
var loadCloudTimelapseFrames = function(callback){
s.knexQuery({
action: "select",
columns: "*",
table: "Cloud Timelapse Frames",
where: [
['ke','=',user.ke],
]
},function(err,frames) {
if(frames && frames[0]){
frames.forEach(function(frame){
var storageType = JSON.parse(frame.details).type
if(!storageType)storageType = 's3'
var frameSize = frame.size / 1048576
user.cloudDiskUse[storageType].usedSpace += frameSize
user.cloudDiskUse[storageType].usedSpaceTimelapseFrames += frameSize
})
}
callback()
})
}
loadCloudVideos(function(){
loadCloudTimelapseFrames(function(){
s.group[user.ke].cloudDiskUse = user.cloudDiskUse
callback()
})
})
}
var loadAddStorageDiskUseForUser = function(user,data,callback){
var videos = data.videos
var timelapseFrames = data.timelapeFrames //key matches the (misspelled) "timelapeFrames" property set when addStorageData is built
var files = data.files
var userDetails = JSON.parse(user.details)
var userAddStorageData = s.parseJSON(userDetails.addStorage) || {}
var currentStorageNumber = 0
var readStorageArray = function(){
var storage = s.listOfStorage[currentStorageNumber]
if(!storage){
//done all checks, move on to next user
callback()
return
}
var path = storage.value
if(path === ''){
++currentStorageNumber
readStorageArray()
return
}
var storageId = path
var storageData = userAddStorageData[storageId] || {}
if(!s.group[user.ke].addStorageUse[storageId])s.group[user.ke].addStorageUse[storageId] = {}
var storageIndex = s.group[user.ke].addStorageUse[storageId]
storageIndex.name = storage.name
storageIndex.path = path
storageIndex.usedSpace = 0
storageIndex.sizeLimit = parseFloat(storageData.limit) || parseFloat(userDetails.size) || 10000
var usedSpaceVideos = 0
var usedSpaceTimelapseFrames = 0
var usedSpaceFilebin = 0
if(videos && videos[0]){
videos.forEach(function(video){
if(video.details.dir === storage.value){
usedSpaceVideos += video.size
}
})
}
if(timelapseFrames && timelapseFrames[0]){
timelapseFrames.forEach(function(frame){
if(frame.details.dir === storage.value){
usedSpaceTimelapseFrames += frame.size
}
})
}
if(files && files[0]){
files.forEach(function(file){
if(file.details.dir === storage.value){
usedSpaceFilebin += file.size
}
})
}
storageIndex.usedSpace = (usedSpaceVideos + usedSpaceTimelapseFrames + usedSpaceFilebin) / 1048576
storageIndex.usedSpaceVideos = usedSpaceVideos / 1048576
storageIndex.usedSpaceFilebin = usedSpaceFilebin / 1048576
storageIndex.usedSpaceTimelapseFrames = usedSpaceTimelapseFrames / 1048576
s.systemLog(user.mail+' : '+path+' : '+videos.length,storageIndex.usedSpace)
++currentStorageNumber
readStorageArray()
}
readStorageArray()
}
var loadAdminUsers = function(callback){
//get current disk used for each isolated account (admin user) on startup
s.knexQuery({
action: "select",
columns: "*",
table: "Users",
where: [
['details','NOT LIKE','%"sub"%']
]
},function(err,users) {
if(users && users[0]){
var loadLocalDiskUse = function(callback){
var count = users.length
var countFinished = 0
users.forEach(function(user){
s.loadGroup(user)
s.loadGroupApps(user)
loadedAccounts.push(user.ke)
loadDiskUseForUser(user,function(){
++countFinished
if(countFinished === count){
callback()
}
checkedAdminUsers[user.ke] = user
})
})
}
var loadCloudDiskUse = function(callback){
var count = users.length
var countFinished = 0
users.forEach(function(user){
loadCloudDiskUseForUser(user,function(){
++countFinished
if(countFinished === count){
callback()
}
})
})
}
loadLocalDiskUse(function(){
loadCloudDiskUse(function(){
callback()
})
})
}else{
s.processReady()
}
})
}
config.userHasSubscribed = false
var checkSubscription = function(callback){
var subscriptionFailed = function(){
console.error('!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!')
console.error('This Install of Shinobi is NOT Activated')
console.error('!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!')
console.log('!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!')
s.systemLog('This Install of Shinobi is NOT Activated')
console.log('!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!')
console.log('https://licenses.shinobi.video/subscribe')
}
if(config.subscriptionId && config.subscriptionId !== 'sub_XXXXXXXXXXXX'){
var url = 'https://licenses.shinobi.video/subscribe/check?subscriptionId=' + config.subscriptionId
request(url,{
method: 'GET',
timeout: 30000
}, function(err,resp,body){
var json = s.parseJSON(body)
if(err)console.log(err,json)
var hasSubscribed = json && !!json.ok
config.userHasSubscribed = hasSubscribed
callback(hasSubscribed)
if(config.userHasSubscribed){
s.systemLog('This Install of Shinobi is Activated')
if(!json.expired){
s.systemLog(`This License expires on ${json.timeExpires}`)
}
}else{
subscriptionFailed()
}
})
}else{
subscriptionFailed()
callback(false)
}
}
//drop filesystem caches every 20 minutes
if(config.autoDropCache===true){
setInterval(function(){
exec('echo 3 > /proc/sys/vm/drop_caches',{detached: true})
},60000*20)
}
if(config.childNodes.mode !== 'child'){
//master node - startup functions
setInterval(function(){
s.cpuUsage(function(cpu){
s.ramUsage(function(ram){
s.tx({f:'os',cpu:cpu,ram:ram},'CPU');
})
})
},10000)
//hourly check to see if sizePurge has failed to unlock
//checks to see if request count is the number of monitors + 10
s.checkForStalePurgeLocks()
//sql/database connection with knex
s.databaseEngine = require('knex')(s.databaseOptions)
//run prerequisite queries, load users and monitors
s.preQueries()
setTimeout(() => {
//check for subscription
checkSubscription(function(){
//check terminal commander
checkForTerminalCommands(function(){
//load administrators (groups)
loadAdminUsers(function(){
//load monitors (for groups)
loadMonitors(function(){
//check for orphaned videos
checkForOrphanedVideos(async () => {
s.processReady()
})
})
})
})
})
},1500)
}
})
}


@ -74,9 +74,13 @@ module.exports = function(s,config,lang,app,io){
}
}
s.insertTimelapseFrameDatabaseRow = function(e,queryInfo,filePath){
s.sqlQuery('INSERT INTO `Timelapse Frames` ('+Object.keys(queryInfo).join(',')+') VALUES (?,?,?,?,?,?)',Object.values(queryInfo))
s.setDiskUsedForGroup(e,queryInfo.size / 1000000,'timelapeFrames')
s.purgeDiskForGroup(e)
s.knexQuery({
action: "insert",
table: "Timelapse Frames",
insert: queryInfo
})
s.setDiskUsedForGroup(e.ke,queryInfo.size / 1048576,'timelapeFrames')
s.purgeDiskForGroup(e.ke)
s.onInsertTimelapseFrameExtensions.forEach(function(extender){
extender(e,queryInfo,filePath)
})
@ -103,11 +107,26 @@ module.exports = function(s,config,lang,app,io){
s.deleteTimelapseFrameFromCloud = function(e){
// e = video object
s.checkDetails(e)
var frameSelector = [e.id,e.ke,new Date(e.time)]
s.sqlQuery('SELECT * FROM `Cloud Timelapse Frames` WHERE `mid`=? AND `ke`=? AND `time`=?',frameSelector,function(err,r){
if(r&&r[0]){
var frameSelector = {
ke: e.ke,
mid: e.id,
time: new Date(e.time),
}
s.knexQuery({
action: "select",
columns: "*",
table: "Cloud Timelapse Frames",
where: frameSelector,
limit: 1
},function(err,r){
if(r && r[0]){
r = r[0]
s.sqlQuery('DELETE FROM `Cloud Timelapse Frames` WHERE `mid`=? AND `ke`=? AND `time`=?',frameSelector,function(){
s.knexQuery({
action: "delete",
table: "Cloud Timelapse Frames",
where: frameSelector,
limit: 1
},function(){
s.onDeleteTimelapseFrameFromCloudExtensionsRunner(e,r)
})
}else{
@ -131,112 +150,54 @@ module.exports = function(s,config,lang,app,io){
var hasRestrictions = user.details.sub && user.details.allmonitors !== '1'
if(
user.permissions.watch_videos==="0" ||
hasRestrictions && (!user.details.video_view || user.details.video_view.indexOf(req.params.id)===-1)
hasRestrictions &&
(
!user.details.video_view ||
user.details.video_view.indexOf(req.params.id) === -1
)
){
res.end(s.prettyPrint([]))
s.closeJsonResponse(res,[])
return
}
req.sql='SELECT * FROM `Timelapse Frames` WHERE ke=?';req.ar=[req.params.ke];
if(req.query.archived=='1'){
req.sql+=' AND details LIKE \'%"archived":"1"\''
}
if(!req.params.id){
if(user.details.sub&&user.details.monitors&&user.details.allmonitors!=='1'){
try{user.details.monitors=JSON.parse(user.details.monitors);}catch(er){}
req.or=[];
user.details.monitors.forEach(function(v,n){
req.or.push('mid=?');req.ar.push(v)
})
req.sql+=' AND ('+req.or.join(' OR ')+')'
}
}else{
if(!user.details.sub||user.details.allmonitors!=='0'||user.details.monitors.indexOf(req.params.id)>-1){
req.sql+=' and mid=?'
req.ar.push(req.params.id)
}else{
res.end('[]');
return;
}
}
var isMp4Call = false
if(req.query.mp4){
isMp4Call = true
}
if(req.params.date){
if(req.params.date.indexOf('-') === -1 && !isNaN(req.params.date)){
req.params.date = parseInt(req.params.date)
}
var selectedDate = req.params.date
if(typeof req.params.date === 'string' && req.params.date.indexOf('.') > -1){
isMp4Call = true
selectedDate = req.params.date.split('.')[0]
}
selectedDate = new Date(selectedDate)
var utcSelectedDate = new Date(selectedDate.getTime() + selectedDate.getTimezoneOffset() * 60000)
req.query.start = moment(utcSelectedDate).format('YYYY-MM-DD HH:mm:ss')
var dayAfter = utcSelectedDate
dayAfter.setDate(dayAfter.getDate() + 1)
req.query.end = moment(dayAfter).format('YYYY-MM-DD HH:mm:ss')
}
if(req.query.start||req.query.end){
if(!req.query.startOperator||req.query.startOperator==''){
req.query.startOperator='>='
}
if(!req.query.endOperator||req.query.endOperator==''){
req.query.endOperator='<='
}
if(req.query.start && req.query.start !== '' && req.query.end && req.query.end !== ''){
req.query.start = s.stringToSqlTime(req.query.start)
req.query.end = s.stringToSqlTime(req.query.end)
req.sql+=' AND `time` '+req.query.startOperator+' ? AND `time` '+req.query.endOperator+' ?';
req.ar.push(req.query.start)
req.ar.push(req.query.end)
}else if(req.query.start && req.query.start !== ''){
req.query.start = s.stringToSqlTime(req.query.start)
req.sql+=' AND `time` '+req.query.startOperator+' ?';
req.ar.push(req.query.start)
}
}
// if(!req.query.limit||req.query.limit==''){req.query.limit=288}
req.sql+=' ORDER BY `time` DESC'
s.sqlQuery(req.sql,req.ar,function(err,r){
if(isMp4Call){
if(r && r[0]){
s.createVideoFromTimelapse(r,req.query.fps,function(response){
if(response.fileExists){
if(req.query.download){
res.setHeader('Content-Type', 'video/mp4')
s.streamMp4FileOverHttp(response.fileLocation,req,res)
}else{
res.setHeader('Content-Type', 'application/json')
res.end(s.prettyPrint({
ok : response.ok,
fileExists : response.fileExists,
msg : response.msg,
}))
}
const monitorRestrictions = s.getMonitorRestrictions(user.details,req.params.id)
s.getDatabaseRows({
monitorRestrictions: monitorRestrictions,
table: 'Timelapse Frames',
groupKey: req.params.ke,
date: req.query.date,
startDate: req.query.start,
endDate: req.query.end,
startOperator: req.query.startOperator,
endOperator: req.query.endOperator,
limit: req.query.limit,
archived: req.query.archived,
rowType: 'frames',
endIsStartTo: true
},(response) => {
var isMp4Call = !!(req.query.mp4 || (req.params.date && typeof req.params.date === 'string' && req.params.date.indexOf('.') > -1))
if(isMp4Call && response.frames[0]){
s.createVideoFromTimelapse(response.frames,req.query.fps,function(response){
if(response.fileExists){
if(req.query.download){
res.setHeader('Content-Type', 'video/mp4')
s.streamMp4FileOverHttp(response.fileLocation,req,res)
}else{
res.setHeader('Content-Type', 'application/json')
res.end(s.prettyPrint({
s.closeJsonResponse(res,{
ok : response.ok,
fileExists : response.fileExists,
msg : response.msg,
}))
})
}
})
}else{
res.setHeader('Content-Type', 'application/json');
res.end(s.prettyPrint([]))
}
}else{
s.closeJsonResponse(res,{
ok : response.ok,
fileExists : response.fileExists,
msg : response.msg,
})
}
})
}else{
if(r && r[0]){
r.forEach(function(file){
file.details = s.parseJSON(file.details)
})
res.end(s.prettyPrint(r))
}else{
res.end(s.prettyPrint([]))
}
s.closeJsonResponse(res,response.frames)
}
})
},res,req);
@ -257,35 +218,18 @@ module.exports = function(s,config,lang,app,io){
res.end(s.prettyPrint([]))
return
}
req.sql='SELECT * FROM `Timelapse Frames` WHERE ke=?';req.ar=[req.params.ke];
if(req.query.archived=='1'){
req.sql+=' AND details LIKE \'%"archived":"1"\''
}
if(!req.params.id){
if(user.details.sub&&user.details.monitors&&user.details.allmonitors!=='1'){
try{user.details.monitors=JSON.parse(user.details.monitors);}catch(er){}
req.or=[];
user.details.monitors.forEach(function(v,n){
req.or.push('mid=?');req.ar.push(v)
})
req.sql+=' AND ('+req.or.join(' OR ')+')'
}
}else{
if(!user.details.sub||user.details.allmonitors!=='0'||user.details.monitors.indexOf(req.params.id)>-1){
req.sql+=' and mid=?'
req.ar.push(req.params.id)
}else{
res.end('[]');
return;
}
}
req.sql+=' AND filename=?'
req.ar.push(req.params.filename)
req.sql+=' ORDER BY `time` DESC'
s.sqlQuery(req.sql,req.ar,function(err,r){
if(r && r[0]){
var frame = r[0]
frame.details = s.parseJSON(frame.details)
const monitorRestrictions = s.getMonitorRestrictions(user.details,req.params.id)
s.getDatabaseRows({
monitorRestrictions: monitorRestrictions,
table: 'Timelapse Frames',
groupKey: req.params.ke,
archived: req.query.archived,
filename: req.params.filename,
rowType: 'frames',
endIsStartTo: true
},(response) => {
var frame = response.frames[0]
if(frame){
var fileLocation
if(frame.details.dir){
fileLocation = `${s.checkCorrectPathEnding(frame.details.dir)}`
@ -303,11 +247,11 @@ module.exports = function(s,config,lang,app,io){
res.on('finish',function(){res.end()})
fs.createReadStream(fileLocation).pipe(res)
}else{
res.end(s.prettyPrint({ok: false, msg: lang[`Nothing exists`]}))
s.closeJsonResponse(res,{ok: false, msg: lang[`Nothing exists`]})
}
})
}else{
res.end(s.prettyPrint({ok: false, msg: lang[`Nothing exists`]}))
s.closeJsonResponse(res,{ok: false, msg: lang[`Nothing exists`]})
}
})
},res,req);
@ -338,7 +282,15 @@ module.exports = function(s,config,lang,app,io){
if(hoursNow === 1){
var dateNowMoment = moment(dateNow).utc().format('YYYY-MM-DDTHH:mm:ss')
var dateMinusOneDay = moment(dateNow).utc().subtract(1, 'days').format('YYYY-MM-DDTHH:mm:ss')
s.sqlQuery('SELECT * FROM `Timelapse Frames` WHERE time => ? AND time =< ?',[dateMinusOneDay,dateNowMoment],function(err,frames){
s.knexQuery({
action: "select",
columns: "*",
table: "Timelapse Frames",
where: [
['time','>=',dateMinusOneDay],
['time','<=',dateNowMoment],
]
},function(err,frames) {
console.log(frames.length)
var groups = {}
frames.forEach(function(frame){


@ -1,21 +1,20 @@
module.exports = function(s,config,lang){
module.exports = function(s,config,lang,app,io){
s.uploaderFields = []
var loadLib = function(lib){
var uploadersFolder = __dirname + '/uploaders/'
var libraryPath = uploadersFolder + lib + '.js'
var loadedLib = require(libraryPath)(s,config,lang)
if(lib !== 'loader'){
loadedLib.isFormGroupGroup = true
s.uploaderFields.push(loadedLib)
}
return loadedLib
require('./uploaders/loader.js')(s,config,lang,app,io)
const loadedLibraries = {
//cloud storage
s3based: require('./uploaders/s3based.js'),
backblazeB2: require('./uploaders/backblazeB2.js'),
amazonS3: require('./uploaders/amazonS3.js'),
webdav: require('./uploaders/webdav.js'),
//oauth
googleDrive: require('./uploaders/googleDrive.js'),
//simple storage
sftp: require('./uploaders/sftp.js'),
}
loadLib('loader')
//cloud storage
loadLib('s3based')
loadLib('backblazeB2')
loadLib('amazonS3')
loadLib('webdav')
//simple storage
loadLib('sftp')
Object.keys(loadedLibraries).forEach((key) => {
var loadedLib = loadedLibraries[key](s,config,lang,app,io)
loadedLib.isFormGroupGroup = true
s.uploaderFields.push(loadedLib)
})
}


@ -3,8 +3,8 @@ module.exports = function(s,config,lang){
//Amazon S3
var beforeAccountSaveForAmazonS3 = function(d){
//d = save event
d.form.details.aws_use_global=d.d.aws_use_global
d.form.details.use_aws_s3=d.d.use_aws_s3
d.formDetails.aws_use_global=d.d.aws_use_global
d.formDetails.use_aws_s3=d.d.use_aws_s3
}
var cloudDiskUseStartupForAmazonS3 = function(group,userDetails){
group.cloudDiskUse['s3'].name = 'Amazon S3'
@ -100,23 +100,26 @@ module.exports = function(s,config,lang){
s.userLog(e,{type:lang['Amazon S3 Upload Error'],msg:err})
}
if(s.group[e.ke].init.aws_s3_log === '1' && data && data.Location){
var save = [
e.mid,
e.ke,
k.startTime,
1,
s.s({
type : 's3',
location : saveLocation
}),
k.filesize,
k.endTime,
data.Location
]
s.sqlQuery('INSERT INTO `Cloud Videos` (mid,ke,time,status,details,size,end,href) VALUES (?,?,?,?,?,?,?,?)',save)
s.setCloudDiskUsedForGroup(e,{
amount : k.filesizeMB,
storageType : 's3'
s.knexQuery({
action: "insert",
table: "Cloud Videos",
insert: {
mid: e.mid,
ke: e.ke,
time: k.startTime,
status: 1,
details: s.s({
type : 's3',
location : saveLocation
}),
size: k.filesize,
end: k.endTime,
href: data.Location
}
})
s.setCloudDiskUsedForGroup(e.ke,{
amount: k.filesizeMB,
storageType: 's3'
})
s.purgeCloudDiskForGroup(e,'s3')
}
@ -142,19 +145,22 @@ module.exports = function(s,config,lang){
s.userLog(e,{type:lang['Wasabi Hot Cloud Storage Upload Error'],msg:err})
}
if(s.group[e.ke].init.aws_s3_log === '1' && data && data.Location){
var save = [
queryInfo.mid,
queryInfo.ke,
queryInfo.time,
s.s({
type : 's3',
location : saveLocation,
}),
queryInfo.size,
data.Location
]
s.sqlQuery('INSERT INTO `Cloud Timelapse Frames` (mid,ke,time,details,size,href) VALUES (?,?,?,?,?,?)',save)
s.setCloudDiskUsedForGroup(e,{
s.knexQuery({
action: "insert",
table: "Cloud Timelapse Frames",
insert: {
mid: queryInfo.mid,
ke: queryInfo.ke,
time: queryInfo.time,
details: s.s({
type : 's3',
location : saveLocation
}),
size: queryInfo.size,
href: data.Location
}
})
s.setCloudDiskUsedForGroup(e.ke,{
amount : s.kilobyteToMegabyte(queryInfo.size),
storageType : 's3'
},'timelapseFrames')
@ -405,4 +411,4 @@ module.exports = function(s,config,lang){
},
]
}
}
}


@ -3,8 +3,8 @@ module.exports = function(s,config,lang){
//Backblaze B2
var beforeAccountSaveForBackblazeB2 = function(d){
//d = save event
d.form.details.b2_use_global=d.d.b2_use_global
d.form.details.use_bb_b2=d.d.use_bb_b2
d.formDetails.b2_use_global=d.d.b2_use_global
d.formDetails.use_bb_b2=d.d.use_bb_b2
}
var cloudDiskUseStartupForBackblazeB2 = function(group,userDetails){
group.cloudDiskUse['b2'].name = 'Backblaze B2'
@ -44,7 +44,7 @@ module.exports = function(s,config,lang){
}
var backblazeErr = function(err){
// console.log(err)
s.userLog({mid:'$USER',ke:e.ke},{type:lang['Backblaze Error'],msg:err.data || err})
s.userLog({mid:'$USER',ke:e.ke},{type:lang['Backblaze Error'],msg:err.stack || err.data || err})
}
var createB2Connection = function(){
var b2 = new B2({
@ -129,23 +129,26 @@ module.exports = function(s,config,lang){
}).then(function(resp){
if(s.group[e.ke].init.bb_b2_log === '1' && resp.data.fileId){
var backblazeDownloadUrl = s.group[e.ke].bb_b2_downloadUrl + '/file/' + s.group[e.ke].init.bb_b2_bucket + '/' + backblazeSavePath
var save = [
e.mid,
e.ke,
k.startTime,
1,
s.s({
type : 'b2',
bucketId : resp.data.bucketId,
fileId : resp.data.fileId,
fileName : resp.data.fileName
}),
k.filesize,
k.endTime,
backblazeDownloadUrl
]
s.sqlQuery('INSERT INTO `Cloud Videos` (mid,ke,time,status,details,size,end,href) VALUES (?,?,?,?,?,?,?,?)',save)
s.setCloudDiskUsedForGroup(e,{
s.knexQuery({
action: "insert",
table: "Cloud Videos",
insert: {
mid: e.mid,
ke: e.ke,
time: k.startTime,
status: 1,
details: s.s({
type : 'b2',
bucketId : resp.data.bucketId,
fileId : resp.data.fileId,
fileName : resp.data.fileName
}),
size: k.filesize,
end: k.endTime,
href: backblazeDownloadUrl
}
})
s.setCloudDiskUsedForGroup(e.ke,{
amount : k.filesizeMB,
storageType : 'b2'
})


@ -0,0 +1,418 @@
var fs = require('fs');
const {google} = require('googleapis');
module.exports = (s,config,lang,app,io) => {
const initializeOAuth = (credentials) => {
const creds = s.parseJSON(credentials)
if(!creds || !creds.installed)return;
const {client_secret, client_id, redirect_uris} = creds.installed;
const oAuth2Client = new google.auth.OAuth2(client_id, client_secret, redirect_uris[0]);
return oAuth2Client
}
const getVideoDirectoryId = (video) => {
return new Promise((resolve, reject) => {
const videoDirectory = s.group[video.ke].init.googd_dir + video.ke + '/' + video.mid
if(!s.group[video.ke].googleDriveFolderIds[video.ke + video.mid]){
var pageToken = null;
s.group[video.ke].googleDrive.files.list({
q: `name='${videoDirectory}'`,
fields: 'nextPageToken, files(id, name)',
spaces: 'drive',
pageToken: pageToken
}, function (err, res) {
if (err) {
reject(err)
} else {
var file = res.data.files
if(file[0]){
file = file[0]
s.group[video.ke].googleDriveFolderIds[video.ke + video.mid] = file.id
resolve(file.id)
}else{
s.group[video.ke].googleDrive.files.create({
resource: {
'name': videoDirectory,
'mimeType': 'application/vnd.google-apps.folder'
},
fields: 'id'
}, function (err, response) {
if (err) {
reject(err)
} else {
s.group[video.ke].googleDriveFolderIds[video.ke + video.mid] = response.data.id
resolve(response.data.id)
}
})
}
}
})
}else{
resolve(s.group[video.ke].googleDriveFolderIds[video.ke + video.mid])
}
})
}
//Google Drive Storage
var beforeAccountSaveForGoogleDrive = function(d){
//d = save event
d.formDetails.googd_use_global = d.d.googd_use_global
d.formDetails.use_googd = d.d.use_googd
}
var cloudDiskUseStartupForGoogleDrive = function(group,userDetails){
group.cloudDiskUse['googd'].name = 'Google Drive Storage'
group.cloudDiskUse['googd'].sizeLimitCheck = (userDetails.use_googd_size_limit === '1')
if(!userDetails.googd_size_limit || userDetails.googd_size_limit === ''){
group.cloudDiskUse['googd'].sizeLimit = 10000
}else{
group.cloudDiskUse['googd'].sizeLimit = parseFloat(userDetails.googd_size_limit)
}
}
var loadGoogleDriveForUser = async function(e){
// e = user
var userDetails = JSON.parse(e.details)
if(userDetails.googd_use_global === '1' && config.cloudUploaders && config.cloudUploaders.GoogleDrive){
// {
// googd_accessKeyId: "",
// googd_secretAccessKey: "",
// googd_region: "",
// googd_bucket: "",
// googd_dir: "",
// }
userDetails = Object.assign(userDetails,config.cloudUploaders.GoogleDrive)
}
if(userDetails.googd_save === '1'){
s.group[e.ke].googleDriveFolderIds = {}
var oAuth2Client
if(!s.group[e.ke].googleDriveOAuth2Client){
oAuth2Client = initializeOAuth(userDetails.googd_credentials)
s.group[e.ke].googleDriveOAuth2Client = oAuth2Client
}else{
oAuth2Client = s.group[e.ke].googleDriveOAuth2Client
}
if(userDetails.googd_code && userDetails.googd_code !== 'Authorized. Token Set.' && !s.group[e.ke].googleDrive){
oAuth2Client.getToken(userDetails.googd_code, (err, token) => {
if (err) return console.error('Error retrieving access token', err)
oAuth2Client.setCredentials(token)
s.accountSettingsEdit({
ke: e.ke,
uid: e.uid,
form: {
details: JSON.stringify(Object.assign(userDetails,{
googd_code: 'Authorized. Token Set.',
googd_token: token,
}))
},
},true)
})
}else if(userDetails.googd_token && !s.group[e.ke].googleDrive){
oAuth2Client.setCredentials(userDetails.googd_token)
const auth = oAuth2Client
const drive = google.drive({version: 'v3', auth});
s.group[e.ke].googleDrive = drive
}
}
}
var unloadGoogleDriveForUser = function(user){
s.group[user.ke].googleDrive = null
s.group[user.ke].googleDriveOAuth2Client = null
}
var deleteVideoFromGoogleDrive = function(e,video,callback){
// e = user
var videoDetails = s.parseJSON(video.details)
s.group[e.ke].googleDrive.files.delete({
fileId: videoDetails.id
}, function(err, resp){
if (err) {
console.log('Error code:', err.code)
} else {
// console.log('Successfully deleted', file);
}
callback()
})
}
var uploadVideoToGoogleDrive = async function(e,k){
//e = video object
//k = temporary values
if(!k)k={};
//cloud saver - Google Drive Storage
if(s.group[e.ke].googleDrive && s.group[e.ke].init.use_googd !== '0' && s.group[e.ke].init.googd_save === '1'){
var ext = k.filename.split('.')
ext = ext[ext.length - 1]
var fileStream = fs.createReadStream(k.dir+k.filename);
fileStream.on('error', function (err) {
console.error(err)
})
s.group[e.ke].googleDrive.files.create({
requestBody: {
name: k.filename,
parents: [await getVideoDirectoryId(e)],
mimeType: 'video/'+ext
},
media: {
mimeType: 'video/'+ext,
body: fileStream
}
}).then(function(response){
const data = response.data
if(s.group[e.ke].init.googd_log === '1' && data && data.id){
s.knexQuery({
action: "insert",
table: "Cloud Videos",
insert: {
mid: e.mid,
ke: e.ke,
time: k.startTime,
status: 1,
details: s.s({
type: 'googd',
id: data.id
}),
size: k.filesize,
end: k.endTime,
href: ''
}
})
s.setCloudDiskUsedForGroup(e.ke,{
amount : k.filesizeMB,
storageType : 'googd'
})
s.purgeCloudDiskForGroup(e,'googd')
}
}).catch((err) => {
if(err){
s.userLog(e,{type:lang['Google Drive Storage Upload Error'],msg:err})
}
console.log(err)
})
}
}
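uploadVideoToGoogleDrive derives the MIME type by splitting the filename on `.` and taking the last segment. A standalone sketch of that derivation (the helper name is hypothetical, not part of Shinobi):

```javascript
// Derive a video MIME type from a filename, mirroring the
// split-on-dot approach used in uploadVideoToGoogleDrive.
// `videoMimeFromFilename` is an illustrative name only.
function videoMimeFromFilename(filename) {
    const parts = filename.split('.')
    const ext = parts[parts.length - 1]
    return 'video/' + ext
}

console.log(videoMimeFromFilename('2020-10-01T20-15-03.mp4')) // video/mp4
```

Note that a filename without an extension would yield `video/<whole name>`; the original code makes the same assumption.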
var onInsertTimelapseFrame = function(monitorObject,queryInfo,filePath){
var e = monitorObject
if(s.group[e.ke].googleDrive && s.group[e.ke].init.use_googd !== '0' && s.group[e.ke].init.googd_save === '1'){
var fileStream = fs.createReadStream(filePath)
fileStream.on('error', function (err) {
console.error(err)
})
var saveLocation = s.group[e.ke].init.googd_dir + e.ke + '/' + e.mid + '_timelapse/' + queryInfo.filename
s.group[e.ke].googleDrive.files.create({
requestBody: {
name: saveLocation,
mimeType: 'image/jpeg'
},
media: {
mimeType: 'image/jpeg',
body: fileStream
}
}).then(function(response){
const data = response.data
if(s.group[e.ke].init.googd_log === '1' && data && data.id){
s.knexQuery({
action: "insert",
table: "Cloud Timelapse Frames",
insert: {
mid: queryInfo.mid,
ke: queryInfo.ke,
time: queryInfo.time,
details: s.s({
type : 'googd',
id : data.id,
}),
size: queryInfo.size,
href: ''
}
})
s.setCloudDiskUsedForGroup(e.ke,{
amount : s.kilobyteToMegabyte(queryInfo.size),
storageType : 'googd'
},'timelapseFrames')
s.purgeCloudDiskForGroup(e,'googd','timelapseFrames')
}
}).catch(function(err){
s.userLog(e,{type:lang['Google Drive Storage Upload Error'],msg:err})
console.log(err)
})
}
}
var onDeleteTimelapseFrameFromCloud = function(e,frame,callback){
// e = user
var frameDetails = s.parseJSON(frame.details)
if(frameDetails.type !== 'googd'){
return
}
s.group[e.ke].googleDrive.files.delete({
fileId: frameDetails.id
}, function(err, resp){
if (err) console.log(err);
callback()
});
}
var onGetVideoData = async (video) => {
// video = video database row
var videoDetails = s.parseJSON(video.details)
const fileId = videoDetails.id
if(videoDetails.type !== 'googd'){
return
}
return new Promise((resolve, reject) => {
s.group[video.ke].googleDrive.files
.get({fileId, alt: 'media'}, {responseType: 'stream'})
.then(res => {
resolve(res.data)
}).catch(reject)
})
}
//
app.get([config.webPaths.apiPrefix+':auth/googleDriveOAuthRequest/:ke'], (req,res) => {
s.auth(req.params,async (user) => {
var response = {ok: false}
var oAuth2Client = s.group[req.params.ke].googleDriveOAuth2Client
if(!oAuth2Client){
oAuth2Client = initializeOAuth(s.group[req.params.ke].googd_credentials)
s.group[req.params.ke].googleDriveOAuth2Client = oAuth2Client
}
if(oAuth2Client){
const authUrl = oAuth2Client.generateAuthUrl({
access_type: 'offline',
scope: ['https://www.googleapis.com/auth/drive.file'],
})
response.ok = true
response.authUrl = authUrl
}
s.closeJsonResponse(res,response)
})
});
// Google Drive
s.addCloudUploader({
name: 'googd',
loadGroupAppExtender: loadGoogleDriveForUser,
unloadGroupAppExtender: unloadGoogleDriveForUser,
insertCompletedVideoExtender: uploadVideoToGoogleDrive,
deleteVideoFromCloudExtensions: deleteVideoFromGoogleDrive,
cloudDiskUseStartupExtensions: cloudDiskUseStartupForGoogleDrive,
beforeAccountSave: beforeAccountSaveForGoogleDrive,
onAccountSave: cloudDiskUseStartupForGoogleDrive,
onInsertTimelapseFrame: onInsertTimelapseFrame,
onDeleteTimelapseFrameFromCloud: onDeleteTimelapseFrameFromCloud,
onGetVideoData: onGetVideoData
})
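s.addCloudUploader fans each hook above out into the core's extension arrays, which are iterated later (e.g. loadGroupAppExtensions.forEach in loadGroupApps). A reduced sketch of that registry pattern; every name here is illustrative, not Shinobi's actual API:

```javascript
// Minimal hook registry mirroring how addCloudUploader collects
// extender functions that the core later runs for each user group.
const loadGroupAppExtensions = []

function addCloudUploader(opt) {
    if (opt.loadGroupAppExtender) loadGroupAppExtensions.push(opt.loadGroupAppExtender)
}

const loaded = []
addCloudUploader({
    name: 'demo',
    loadGroupAppExtender: (user) => loaded.push(user.ke)
})

// The core later runs every registered extender for a user.
loadGroupAppExtensions.forEach((extender) => extender({ ke: 'GROUP1' }))
console.log(loaded) // [ 'GROUP1' ]
```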
return {
"evaluation": "details.use_googd !== '0'",
"name": lang["Google Drive"],
"color": "forestgreen",
"info": [
{
"name": "detail=googd_save",
"selector":"autosave_googd",
"field": lang.Autosave,
"description": "",
"default": lang.No,
"example": "",
"fieldType": "select",
"possible": [
{
"name": lang.No,
"value": "0"
},
{
"name": lang.Yes,
"value": "1"
}
]
},
{
"hidden": true,
"field": lang['OAuth Credentials'],
"name": "detail=googd_credentials",
"form-group-class": "autosave_googd_input autosave_googd_1",
"description": "",
"default": "",
"example": "",
"possible": ""
},
{
"hidden": true,
"fieldType": "btn",
"attribute": `style="margin-bottom:1em" href="javascript:$.get(getApiPrefix() + '/googleDriveOAuthRequest/' + $user.ke,function(data){if(data.ok)window.open(data.authUrl, 'Google Drive Authentication', 'width=800,height=400');})"`,
"class": `btn-success`,
"form-group-class": "autosave_googd_input autosave_googd_1",
"btnContent": `<i class="fa fa-plus"></i> &nbsp; ${lang['Get Code']}`,
},
{
"hidden": true,
"field": lang['OAuth Code'],
"name": "detail=googd_code",
"form-group-class": "autosave_googd_input autosave_googd_1",
"description": "",
"default": "",
"example": "",
"possible": ""
},
{
"hidden": true,
"name": "detail=googd_log",
"field": lang['Save Links to Database'],
"fieldType": "select",
"selector": "h_googdsld",
"form-group-class":"autosave_googd_input autosave_googd_1",
"description": "",
"default": "",
"example": "",
"possible": [
{
"name": lang.No,
"value": "0"
},
{
"name": lang.Yes,
"value": "1"
}
]
},
{
"hidden": true,
"name": "detail=use_googd_size_limit",
"field": lang['Use Max Storage Amount'],
"fieldType": "select",
"selector": "h_googdzl",
"form-group-class":"autosave_googd_input autosave_googd_1",
"form-group-class-pre-layer":"h_googdsld_input h_googdsld_1",
"description": "",
"default": "",
"example": "",
"possible": [
{
"name": lang.No,
"value": "0"
},
{
"name": lang.Yes,
"value": "1"
}
]
},
{
"hidden": true,
"name": "detail=googd_size_limit",
"field": lang['Max Storage Amount'],
"form-group-class":"autosave_googd_input autosave_googd_1",
"form-group-class-pre-layer":"h_googdsld_input h_googdsld_1",
"description": "",
"default": "10000",
"example": "",
"possible": ""
},
{
"hidden": true,
"name": "detail=googd_dir",
"field": lang['Save Directory'],
"form-group-class":"autosave_googd_input autosave_googd_1",
"description": "",
"default": "/",
"example": "",
"possible": ""
},
]
}
}


@@ -10,6 +10,7 @@ module.exports = function(s){
s.beforeAccountSave(opt.beforeAccountSave)
s.onAccountSave(opt.onAccountSave)
s.cloudDisksLoader(opt.name)
if(opt.onGetVideoData)s.cloudDiskUseOnGetVideoDataExtensions[opt.name] = opt.onGetVideoData
}
s.addSimpleUploader = function(opt){
s.loadGroupAppExtender(opt.loadGroupAppExtender)


@@ -3,8 +3,8 @@ module.exports = function(s,config,lang){
//Wasabi Hot Cloud Storage
var beforeAccountSaveForWasabiHotCloudStorage = function(d){
//d = save event
d.form.details.whcs_use_global=d.d.whcs_use_global
d.form.details.use_whcs=d.d.use_whcs
d.formDetails.whcs_use_global=d.d.whcs_use_global
d.formDetails.use_whcs=d.d.use_whcs
}
var cloudDiskUseStartupForWasabiHotCloudStorage = function(group,userDetails){
group.cloudDiskUse['whcs'].name = 'Wasabi Hot Cloud Storage'
@@ -117,21 +117,24 @@
if(s.group[e.ke].init.whcs_log === '1' && data && data.Location){
var cloudLink = data.Location
cloudLink = fixCloudianUrl(e,cloudLink)
var save = [
e.mid,
e.ke,
k.startTime,
1,
s.s({
type : 'whcs',
location : saveLocation
}),
k.filesize,
k.endTime,
cloudLink
]
s.sqlQuery('INSERT INTO `Cloud Videos` (mid,ke,time,status,details,size,end,href) VALUES (?,?,?,?,?,?,?,?)',save)
s.setCloudDiskUsedForGroup(e,{
s.knexQuery({
action: "insert",
table: "Cloud Videos",
insert: {
mid: e.mid,
ke: e.ke,
time: k.startTime,
status: 1,
details: s.s({
type : 'whcs',
location : saveLocation
}),
size: k.filesize,
end: k.endTime,
href: cloudLink
}
})
s.setCloudDiskUsedForGroup(e.ke,{
amount : k.filesizeMB,
storageType : 'whcs'
})
@@ -159,19 +162,22 @@ module.exports = function(s,config,lang){
s.userLog(e,{type:lang['Wasabi Hot Cloud Storage Upload Error'],msg:err})
}
if(s.group[e.ke].init.whcs_log === '1' && data && data.Location){
var save = [
queryInfo.mid,
queryInfo.ke,
queryInfo.time,
s.s({
type : 'whcs',
location : saveLocation,
}),
queryInfo.size,
data.Location
]
s.sqlQuery('INSERT INTO `Cloud Timelapse Frames` (mid,ke,time,details,size,href) VALUES (?,?,?,?,?,?)',save)
s.setCloudDiskUsedForGroup(e,{
s.knexQuery({
action: "insert",
table: "Cloud Timelapse Frames",
insert: {
mid: queryInfo.mid,
ke: queryInfo.ke,
time: queryInfo.time,
details: s.s({
type : 'whcs',
location : saveLocation
}),
size: queryInfo.size,
href: data.Location
}
})
s.setCloudDiskUsedForGroup(e.ke,{
amount : s.kilobyteToMegabyte(queryInfo.size),
storageType : 'whcs'
},'timelapseFrames')


@@ -2,13 +2,12 @@ var fs = require('fs');
var ssh2SftpClient = require('node-ssh')
module.exports = function(s,config,lang){
//SFTP
var sftpErr = function(err){
// console.log(err)
s.userLog({mid:'$USER',ke:e.ke},{type:lang['SFTP Error'],msg:err.data || err})
var sftpErr = function(groupKey,err){
s.userLog({mid:'$USER',ke:groupKey},{type:lang['SFTP Error'],msg:err.data || err})
}
var beforeAccountSaveForSftp = function(d){
//d = save event
d.form.details.use_sftp = d.d.use_sftp
d.formDetails.use_sftp = d.d.use_sftp
}
var loadSftpForUser = function(e){
// e = user
@@ -37,7 +36,9 @@
if(userDetails.sftp_username && userDetails.sftp_username !== '')connectionDetails.username = userDetails.sftp_username
if(userDetails.sftp_password && userDetails.sftp_password !== '')connectionDetails.password = userDetails.sftp_password
if(userDetails.sftp_privateKey && userDetails.sftp_privateKey !== '')connectionDetails.privateKey = userDetails.sftp_privateKey
sftp.connect(connectionDetails).catch(sftpErr)
sftp.connect(connectionDetails).catch((err) => {
sftpErr(e.ke,err)
})
s.group[e.ke].sftp = sftp
}
}
@@ -54,14 +55,16 @@
if(s.group[e.ke].sftp && s.group[e.ke].init.use_sftp !== '0' && s.group[e.ke].init.sftp_save === '1'){
var localPath = k.dir + k.filename
var saveLocation = s.group[e.ke].init.sftp_dir + e.ke + '/' + e.mid + '/' + k.filename
s.group[e.ke].sftp.putFile(localPath, saveLocation).catch(sftpErr)
s.group[e.ke].sftp.putFile(localPath, saveLocation).catch((err) => {
sftpErr(e.ke,err)
})
}
}
var createSftpDirectory = function(monitorConfig){
var monitorSaveDirectory = s.group[monitorConfig.ke].init.sftp_dir + monitorConfig.ke + '/' + monitorConfig.mid
s.group[monitorConfig.ke].sftp.mkdir(monitorSaveDirectory, true).catch(function(err){
if(err.code !== 'ERR_ASSERTION'){
sftpErr(err)
sftpErr(monitorConfig.ke,err)
}
})
}


@@ -4,8 +4,8 @@ module.exports = function(s,config,lang){
// WebDAV
var beforeAccountSaveForWebDav = function(d){
//d = save event
d.form.details.webdav_use_global=d.d.webdav_use_global
d.form.details.use_webdav=d.d.use_webdav
d.formDetails.webdav_use_global=d.d.webdav_use_global
d.formDetails.use_webdav=d.d.use_webdav
}
var cloudDiskUseStartupForWebDav = function(group,userDetails){
group.cloudDiskUse['webdav'].name = 'WebDAV'
@@ -81,23 +81,26 @@
fs.createReadStream(k.dir + k.filename).pipe(wfs.createWriteStream(webdavUploadDir + k.filename))
if(s.group[e.ke].init.webdav_log === '1'){
var webdavRemoteUrl = s.addUserPassToUrl(s.checkCorrectPathEnding(s.group[e.ke].init.webdav_url),s.group[e.ke].init.webdav_user,s.group[e.ke].init.webdav_pass) + s.group[e.ke].init.webdav_dir + e.ke + '/'+e.mid+'/'+k.filename
var save = [
e.mid,
e.ke,
k.startTime,
1,
s.s({
type : 'webdav',
location : webdavUploadDir + k.filename
}),
k.filesize,
k.endTime,
webdavRemoteUrl
]
s.sqlQuery('INSERT INTO `Cloud Videos` (mid,ke,time,status,details,size,end,href) VALUES (?,?,?,?,?,?,?,?)',save)
s.setCloudDiskUsedForGroup(e,{
amount : k.filesizeMB,
storageType : 'webdav'
s.knexQuery({
action: "insert",
table: "Cloud Videos",
insert: {
mid: e.mid,
ke: e.ke,
time: k.startTime,
status: 1,
details: s.s({
type : 'webdav',
location : webdavUploadDir + k.filename
}),
size: k.filesize,
end: k.endTime,
href: webdavRemoteUrl
}
})
s.setCloudDiskUsedForGroup(e.ke,{
amount: k.filesizeMB,
storageType: 'webdav'
})
s.purgeCloudDiskForGroup(e,'webdav')
}


@@ -2,278 +2,63 @@ var fs = require('fs');
var events = require('events');
var spawn = require('child_process').spawn;
var exec = require('child_process').exec;
var async = require("async");
module.exports = function(s,config,lang){
s.purgeDiskForGroup = function(e){
if(config.cron.deleteOverMax === true && s.group[e.ke] && s.group[e.ke].sizePurgeQueue){
s.group[e.ke].sizePurgeQueue.push(1)
if(s.group[e.ke].sizePurging !== true){
s.group[e.ke].sizePurging = true
var finish = function(){
//remove value just used from queue
s.group[e.ke].sizePurgeQueue.shift()
//do next one
if(s.group[e.ke].sizePurgeQueue.length > 0){
checkQueue()
}else{
s.group[e.ke].sizePurging = false
s.sendDiskUsedAmountToClients(e)
}
}
var checkQueue = function(){
//get first in queue
var currentPurge = s.group[e.ke].sizePurgeQueue[0]
var reRunCheck = function(){}
var deleteSetOfVideos = function(err,videos,storageIndex,callback){
var videosToDelete = []
var queryValues = [e.ke]
var completedCheck = 0
if(videos){
videos.forEach(function(video){
video.dir = s.getVideoDirectory(video) + s.formattedTime(video.time) + '.' + video.ext
videosToDelete.push('(mid=? AND `time`=?)')
queryValues.push(video.mid)
queryValues.push(video.time)
fs.chmod(video.dir,0o777,function(err){
fs.unlink(video.dir,function(err){
++completedCheck
if(err){
fs.stat(video.dir,function(err){
if(!err){
s.file('delete',video.dir)
}
})
}
if(videosToDelete.length === completedCheck){
videosToDelete = videosToDelete.join(' OR ')
s.sqlQuery('DELETE FROM Videos WHERE ke =? AND ('+videosToDelete+')',queryValues,function(){
reRunCheck()
})
}
})
})
if(storageIndex){
s.setDiskUsedForGroupAddStorage(e,{
size: -(video.size/1000000),
storageIndex: storageIndex
})
}else{
s.setDiskUsedForGroup(e,-(video.size/1000000))
}
s.tx({
f: 'video_delete',
ff: 'over_max',
filename: s.formattedTime(video.time)+'.'+video.ext,
mid: video.mid,
ke: video.ke,
time: video.time,
end: s.formattedTime(new Date,'YYYY-MM-DD HH:mm:ss')
},'GRP_'+e.ke)
})
}else{
console.log(err)
}
if(videosToDelete.length === 0){
if(callback)callback()
}
}
var deleteSetOfTimelapseFrames = function(err,frames,storageIndex,callback){
var framesToDelete = []
var queryValues = [e.ke]
var completedCheck = 0
if(frames){
frames.forEach(function(frame){
var selectedDate = frame.filename.split('T')[0]
var dir = s.getTimelapseFrameDirectory(frame)
var fileLocationMid = `${dir}` + frame.filename
framesToDelete.push('(mid=? AND `time`=?)')
queryValues.push(frame.mid)
queryValues.push(frame.time)
fs.unlink(fileLocationMid,function(err){
++completedCheck
if(err){
fs.stat(fileLocationMid,function(err){
if(!err){
s.file('delete',fileLocationMid)
}
})
}
if(framesToDelete.length === completedCheck){
framesToDelete = framesToDelete.join(' OR ')
s.sqlQuery('DELETE FROM `Timelapse Frames` WHERE ke =? AND ('+framesToDelete+')',queryValues,function(){
reRunCheck()
})
}
})
if(storageIndex){
s.setDiskUsedForGroupAddStorage(e,{
size: -(frame.size/1000000),
storageIndex: storageIndex
},'timelapseFrames')
}else{
s.setDiskUsedForGroup(e,-(frame.size/1000000),'timelapseFrames')
}
// s.tx({
// f: 'timelapse_frame_delete',
// ff: 'over_max',
// filename: s.formattedTime(video.time)+'.'+video.ext,
// mid: video.mid,
// ke: video.ke,
// time: video.time,
// end: s.formattedTime(new Date,'YYYY-MM-DD HH:mm:ss')
// },'GRP_'+e.ke)
})
}else{
console.log(err)
}
if(framesToDelete.length === 0){
if(callback)callback()
}
}
var deleteSetOfFileBinFiles = function(err,files,storageIndex,callback){
var filesToDelete = []
var queryValues = [e.ke]
var completedCheck = 0
if(files){
files.forEach(function(file){
var dir = s.getFileBinDirectory(file)
var fileLocationMid = `${dir}` + file.name
filesToDelete.push('(mid=? AND `name`=?)')
queryValues.push(file.mid)
queryValues.push(file.name)
fs.unlink(fileLocationMid,function(err){
++completedCheck
if(err){
fs.stat(fileLocationMid,function(err){
if(!err){
s.file('delete',fileLocationMid)
}
})
}
if(filesToDelete.length === completedCheck){
filesToDelete = filesToDelete.join(' OR ')
s.sqlQuery('DELETE FROM `Files` WHERE ke =? AND ('+filesToDelete+')',queryValues,function(){
reRunCheck()
})
}
})
if(storageIndex){
s.setDiskUsedForGroupAddStorage(e,{
size: -(file.size/1000000),
storageIndex: storageIndex
},'fileBin')
}else{
s.setDiskUsedForGroup(e,-(file.size/1000000),'fileBin')
}
})
}else{
console.log(err)
}
if(filesToDelete.length === 0){
if(callback)callback()
}
}
var deleteMainVideos = function(callback){
reRunCheck = function(){
return deleteMainVideos(callback)
}
//run purge command
if(s.group[e.ke].usedSpaceVideos > (s.group[e.ke].sizeLimit * (s.group[e.ke].sizeLimitVideoPercent / 100) * config.cron.deleteOverMaxOffset)){
s.sqlQuery('SELECT * FROM Videos WHERE status != 0 AND details NOT LIKE \'%"archived":"1"%\' AND ke=? AND details NOT LIKE \'%"dir"%\' ORDER BY `time` ASC LIMIT 3',[e.ke],function(err,rows){
deleteSetOfVideos(err,rows,null,callback)
})
}else{
callback()
}
}
var deleteAddStorageVideos = function(callback){
reRunCheck = function(){
return deleteAddStorageVideos(callback)
}
var currentStorageNumber = 0
var readStorageArray = function(finishedReading){
setTimeout(function(){
reRunCheck = readStorageArray
var storage = s.listOfStorage[currentStorageNumber]
if(!storage){
//done all checks, move on to next user
callback()
return
}
var storageId = storage.value
if(storageId === '' || !s.group[e.ke].addStorageUse[storageId]){
++currentStorageNumber
readStorageArray()
return
}
var storageIndex = s.group[e.ke].addStorageUse[storageId]
//run purge command
if(storageIndex.usedSpace > (storageIndex.sizeLimit * (storageIndex.deleteOffset || config.cron.deleteOverMaxOffset))){
s.sqlQuery('SELECT * FROM Videos WHERE status != 0 AND details NOT LIKE \'%"archived":"1"%\' AND ke=? AND details LIKE ? ORDER BY `time` ASC LIMIT 3',[e.ke,`%"dir":"${storage.value}"%`],function(err,rows){
deleteSetOfVideos(err,rows,storageIndex,callback)
})
}else{
++currentStorageNumber
readStorageArray()
}
})
}
readStorageArray()
}
var deleteTimelapseFrames = function(callback){
reRunCheck = function(){
return deleteTimelapseFrames(callback)
}
//run purge command
if(s.group[e.ke].usedSpaceTimelapseFrames > (s.group[e.ke].sizeLimit * (s.group[e.ke].sizeLimitTimelapseFramesPercent / 100) * config.cron.deleteOverMaxOffset)){
s.sqlQuery('SELECT * FROM `Timelapse Frames` WHERE ke=? AND details NOT LIKE \'%"archived":"1"%\' ORDER BY `time` ASC LIMIT 3',[e.ke],function(err,frames){
deleteSetOfTimelapseFrames(err,frames,null,callback)
})
}else{
callback()
}
}
var deleteFileBinFiles = function(callback){
if(config.deleteFileBinsOverMax === true){
reRunCheck = function(){
return deleteSetOfFileBinFiles(callback)
}
//run purge command
if(s.group[e.ke].usedSpaceFileBin > (s.group[e.ke].sizeLimit * (s.group[e.ke].sizeLimitFileBinPercent / 100) * config.cron.deleteOverMaxOffset)){
s.sqlQuery('SELECT * FROM `Files` WHERE ke=? ORDER BY `time` ASC LIMIT 1',[e.ke],function(err,frames){
deleteSetOfFileBinFiles(err,frames,null,callback)
})
}else{
callback()
}
}else{
callback()
}
}
deleteMainVideos(function(){
deleteTimelapseFrames(function(){
deleteFileBinFiles(function(){
deleteAddStorageVideos(function(){
finish()
const {
deleteSetOfVideos,
deleteSetOfTimelapseFrames,
deleteSetOfFileBinFiles,
deleteAddStorageVideos,
deleteMainVideos,
deleteTimelapseFrames,
deleteFileBinFiles,
deleteCloudVideos,
deleteCloudTimelapseFrames,
} = require("./user/utils.js")(s,config,lang);
let purgeDiskGroup = () => {}
const runQuery = async.queue(function(groupKey, callback) {
purgeDiskGroup(groupKey,callback)
}, 1);
if(config.cron.deleteOverMax === true){
purgeDiskGroup = (groupKey,callback) => {
if(s.group[groupKey]){
if(s.group[groupKey].sizePurging !== true){
s.group[groupKey].sizePurging = true
s.debugLog(`${groupKey} deleteMainVideos`)
deleteMainVideos(groupKey,() => {
s.debugLog(`${groupKey} deleteTimelapseFrames`)
deleteTimelapseFrames(groupKey,() => {
s.debugLog(`${groupKey} deleteFileBinFiles`)
deleteFileBinFiles(groupKey,() => {
s.debugLog(`${groupKey} deleteAddStorageVideos`)
deleteAddStorageVideos(groupKey,() => {
s.group[groupKey].sizePurging = false
s.sendDiskUsedAmountToClients(groupKey)
callback();
})
})
})
})
}else{
s.sendDiskUsedAmountToClients(groupKey)
}
checkQueue()
}
}else{
s.sendDiskUsedAmountToClients(e)
}
}
s.setDiskUsedForGroup = function(e,bytes,storagePoint){
s.purgeDiskForGroup = (groupKey) => {
return runQuery.push(groupKey,function(){
//...
})
}
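The rewritten purge path pushes group keys through async.queue with a concurrency of 1, so purges for different groups never overlap. A dependency-free sketch of that serialization using a promise chain (names are illustrative, not Shinobi's API):

```javascript
// Serialize jobs the way async.queue(worker, 1) does:
// each pushed task starts only after the previous one settles.
function makeSerialQueue(worker) {
    let tail = Promise.resolve()
    return function push(task) {
        tail = tail.then(() => worker(task))
        return tail
    }
}

const order = []
const push = makeSerialQueue(async (groupKey) => {
    order.push(groupKey + ':start')
    await new Promise((r) => setTimeout(r, 10))
    order.push(groupKey + ':done')
})

push('GRP_A')
push('GRP_B').then(() => console.log(order))
// GRP_A:start, GRP_A:done, GRP_B:start, GRP_B:done
```

This is why the `sizePurging` flag alone suffices in the new code: the queue guarantees only one purge job runs at a time.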
s.setDiskUsedForGroup = function(groupKey,bytes,storagePoint){
//`bytes` will be used as the value to add or subtract
if(s.group[e.ke] && s.group[e.ke].diskUsedEmitter){
s.group[e.ke].diskUsedEmitter.emit('set',bytes,storagePoint)
if(s.group[groupKey] && s.group[groupKey].diskUsedEmitter){
s.group[groupKey].diskUsedEmitter.emit('set',bytes,storagePoint)
}
}
s.setDiskUsedForGroupAddStorage = function(e,data,storagePoint){
if(s.group[e.ke] && s.group[e.ke].diskUsedEmitter){
s.group[e.ke].diskUsedEmitter.emit('setAddStorage',data,storagePoint)
s.setDiskUsedForGroupAddStorage = function(groupKey,data,storagePoint){
if(s.group[groupKey] && s.group[groupKey].diskUsedEmitter){
s.group[groupKey].diskUsedEmitter.emit('setAddStorage',data,storagePoint)
}
}
s.purgeCloudDiskForGroup = function(e,storageType,storagePoint){
@@ -281,33 +66,44 @@ module.exports = function(s,config,lang){
s.group[e.ke].diskUsedEmitter.emit('purgeCloud',storageType,storagePoint)
}
}
s.setCloudDiskUsedForGroup = function(e,usage,storagePoint){
s.setCloudDiskUsedForGroup = function(groupKey,usage,storagePoint){
//`usage` will be used as the value to add or subtract
if(s.group[e.ke].diskUsedEmitter){
s.group[e.ke].diskUsedEmitter.emit('setCloud',usage,storagePoint)
if(s.group[groupKey].diskUsedEmitter){
s.group[groupKey].diskUsedEmitter.emit('setCloud',usage,storagePoint)
}
}
s.sendDiskUsedAmountToClients = function(e){
s.sendDiskUsedAmountToClients = function(groupKey){
//send the amount used disk space to connected users
if(s.group[e.ke]&&s.group[e.ke].init){
if(s.group[groupKey]&&s.group[groupKey].init){
s.tx({
f: 'diskUsed',
size: s.group[e.ke].usedSpace,
usedSpace: s.group[e.ke].usedSpace,
usedSpaceVideos: s.group[e.ke].usedSpaceVideos,
usedSpaceFilebin: s.group[e.ke].usedSpaceFilebin,
usedSpaceTimelapseFrames: s.group[e.ke].usedSpaceTimelapseFrames,
limit: s.group[e.ke].sizeLimit,
addStorage: s.group[e.ke].addStorageUse
},'GRP_'+e.ke);
size: s.group[groupKey].usedSpace,
usedSpace: s.group[groupKey].usedSpace,
usedSpaceVideos: s.group[groupKey].usedSpaceVideos,
usedSpaceFilebin: s.group[groupKey].usedSpaceFilebin,
usedSpaceTimelapseFrames: s.group[groupKey].usedSpaceTimelapseFrames,
limit: s.group[groupKey].sizeLimit,
addStorage: s.group[groupKey].addStorageUse
},'GRP_'+groupKey);
}
}
//user log
s.userLog = function(e,x){
if(e.id && !e.mid)e.mid = e.id
if(!x||!e.mid){return}
if((e.details&&e.details.sqllog==='1')||e.mid.indexOf('$')>-1){
s.sqlQuery('INSERT INTO Logs (ke,mid,info) VALUES (?,?,?)',[e.ke,e.mid,s.s(x)]);
if(
(e.details && e.details.sqllog === '1') ||
e.mid.indexOf('$') > -1
){
s.knexQuery({
action: "insert",
table: "Logs",
insert: {
ke: e.ke,
mid: e.mid,
info: s.s(x),
}
})
}
s.tx({f:'log',ke:e.ke,mid:e.mid,log:x,time:s.timeObject()},'GRPLOG_'+e.ke);
}
@@ -334,22 +130,31 @@ module.exports = function(s,config,lang){
s.group[e.ke].sizeLimitTimelapseFramesPercent = parseFloat(s.group[e.ke].init.size_timelapse_percent) || 5
s.group[e.ke].sizeLimitFileBinPercent = parseFloat(s.group[e.ke].init.size_filebin_percent) || 5
//save global used space as a megabyte (MiB) value
s.group[e.ke].usedSpace = s.group[e.ke].usedSpace || ((e.size || 0) / 1000000)
s.group[e.ke].usedSpace = s.group[e.ke].usedSpace || ((e.size || 0) / 1048576)
//emit the changes to connected users
s.sendDiskUsedAmountToClients(e)
s.sendDiskUsedAmountToClients(e.ke)
}
s.loadGroupApps = function(e){
// e = user
if(!s.group[e.ke].init){
s.group[e.ke].init={};
}
s.sqlQuery('SELECT * FROM Users WHERE ke=? AND details NOT LIKE ?',[e.ke,'%"sub"%'],function(ar,r){
s.knexQuery({
action: "select",
columns: "*",
table: "Users",
where: [
['ke','=',e.ke],
['details','NOT LIKE',`%"sub"%`],
],
limit: 1
},(err,r) => {
if(r && r[0]){
r = r[0];
ar = JSON.parse(r.details);
const details = JSON.parse(r.details);
//load extenders
s.loadGroupAppExtensions.forEach(function(extender){
extender(r,ar)
extender(r,details)
})
//disk Used Emitter
if(!s.group[e.ke].diskUsedEmitter){
@@ -381,82 +186,15 @@ module.exports = function(s,config,lang){
break;
}
})
s.group[e.ke].diskUsedEmitter.on('purgeCloud',function(storageType,storagePoint){
if(config.cron.deleteOverMax === true){
var cloudDisk = s.group[e.ke].cloudDiskUse[storageType]
//set queue processor
var finish=function(){
// s.sendDiskUsedAmountToClients(e)
}
var deleteVideos = function(){
//run purge command
if(cloudDisk.sizeLimitCheck && cloudDisk.usedSpace > (cloudDisk.sizeLimit*config.cron.deleteOverMaxOffset)){
s.sqlQuery('SELECT * FROM `Cloud Videos` WHERE status != 0 AND ke=? AND details LIKE \'%"type":"'+storageType+'"%\' ORDER BY `time` ASC LIMIT 2',[e.ke],function(err,videos){
var videosToDelete = []
var queryValues = [e.ke]
if(!videos)return console.log(err)
videos.forEach(function(video){
video.dir = s.getVideoDirectory(video) + s.formattedTime(video.time) + '.' + video.ext
videosToDelete.push('(mid=? AND `time`=?)')
queryValues.push(video.mid)
queryValues.push(video.time)
s.setCloudDiskUsedForGroup(e,{
amount : -(video.size/1000000),
storageType : storageType
})
s.deleteVideoFromCloudExtensionsRunner(e,storageType,video)
})
if(videosToDelete.length > 0){
videosToDelete = videosToDelete.join(' OR ')
s.sqlQuery('DELETE FROM `Cloud Videos` WHERE ke =? AND ('+videosToDelete+')',queryValues,function(){
deleteVideos()
})
}else{
finish()
}
})
}else{
finish()
}
}
var deleteTimelapseFrames = function(callback){
reRunCheck = function(){
return deleteTimelapseFrames(callback)
}
//run purge command
if(cloudDisk.usedSpaceTimelapseFrames > (cloudDisk.sizeLimit * (s.group[e.ke].sizeLimitTimelapseFramesPercent / 100) * config.cron.deleteOverMaxOffset)){
s.sqlQuery('SELECT * FROM `Cloud Timelapse Frames` WHERE ke=? AND details NOT LIKE \'%"archived":"1"%\' ORDER BY `time` ASC LIMIT 3',[e.ke],function(err,frames){
var framesToDelete = []
var queryValues = [e.ke]
if(!frames)return console.log(err)
frames.forEach(function(frame){
frame.dir = s.getVideoDirectory(frame) + s.formattedTime(frame.time) + '.' + frame.ext
framesToDelete.push('(mid=? AND `time`=?)')
queryValues.push(frame.mid)
queryValues.push(frame.time)
s.setCloudDiskUsedForGroup(e,{
amount : -(frame.size/1000000),
storageType : storageType
})
s.deleteVideoFromCloudExtensionsRunner(e,storageType,frame)
})
s.sqlQuery('DELETE FROM `Cloud Timelapse Frames` WHERE ke =? AND ('+framesToDelete+')',queryValues,function(){
deleteTimelapseFrames(callback)
})
})
}else{
callback()
}
}
deleteVideos(function(){
deleteTimelapseFrames(function(){
if(config.cron.deleteOverMax === true){
s.group[e.ke].diskUsedEmitter.on('purgeCloud',function(storageType,storagePoint){
deleteCloudVideos(storageType,storagePoint,function(){
deleteCloudTimelapseFrames(storageType,storagePoint,function(){
})
})
}else{
// s.sendDiskUsedAmountToClients(e)
}
})
})
}
//s.setDiskUsedForGroup
s.group[e.ke].diskUsedEmitter.on('set',function(currentChange,storageType){
//validate current values
@@ -482,7 +220,7 @@ module.exports = function(s,config,lang){
break;
}
//remove value just used from queue
s.sendDiskUsedAmountToClients(e)
s.sendDiskUsedAmountToClients(e.ke)
})
s.group[e.ke].diskUsedEmitter.on('setAddStorage',function(data,storageType){
var currentSize = data.size
@@ -510,60 +248,76 @@ module.exports = function(s,config,lang){
break;
}
//remove value just used from queue
s.sendDiskUsedAmountToClients(e)
s.sendDiskUsedAmountToClients(e.ke)
})
}
Object.keys(ar).forEach(function(v){
s.group[e.ke].init[v] = ar[v]
Object.keys(details).forEach(function(v){
s.group[e.ke].init[v] = details[v]
})
}
})
}
s.accountSettingsEdit = function(d){
s.sqlQuery('SELECT details FROM Users WHERE ke=? AND uid=?',[d.ke,d.uid],function(err,r){
if(r&&r[0]){
r=r[0];
d.d=JSON.parse(r.details);
if(!d.d.sub || d.d.user_change !== "0"){
s.accountSettingsEdit = function(d,dontRunExtensions){
s.knexQuery({
action: "select",
columns: "details",
table: "Users",
where: [
['ke','=',d.ke],
['uid','=',d.uid],
]
},(err,r) => {
if(r && r[0]){
r = r[0];
const details = JSON.parse(r.details);
if(!details.sub || details.user_change !== "0"){
if(d.cnid){
if(d.d.get_server_log==='1'){
if(details.get_server_log === '1'){
s.clientSocketConnection[d.cnid].join('GRPLOG_'+d.ke)
}else{
s.clientSocketConnection[d.cnid].leave('GRPLOG_'+d.ke)
}
}
///unchangeable from client side, so reset them in case they did.
d.form.details=JSON.parse(d.form.details)
s.beforeAccountSaveExtensions.forEach(function(extender){
extender(d)
})
var form = d.form
var formDetails = JSON.parse(form.details)
if(!dontRunExtensions){
s.beforeAccountSaveExtensions.forEach(function(extender){
extender({
form: form,
formDetails: formDetails,
d: details
})
})
}
//admin permissions
d.form.details.permissions=d.d.permissions
d.form.details.edit_size=d.d.edit_size
d.form.details.edit_days=d.d.edit_days
d.form.details.use_admin=d.d.use_admin
d.form.details.use_ldap=d.d.use_ldap
d.form.details.landing_page=d.d.landing_page
formDetails.permissions = details.permissions
formDetails.edit_size = details.edit_size
formDetails.edit_days = details.edit_days
formDetails.use_admin = details.use_admin
formDetails.use_ldap = details.use_ldap
formDetails.landing_page = details.landing_page
//check
if(d.d.edit_days == "0"){
d.form.details.days = d.d.days;
if(details.edit_days == "0"){
formDetails.days = details.days;
}
if(d.d.edit_size == "0"){
d.form.details.size = d.d.size;
if(details.edit_size == "0"){
formDetails.size = details.size;
formDetails.addStorage = details.addStorage;
}
if(d.d.sub){
d.form.details.sub=d.d.sub;
if(d.d.monitors){d.form.details.monitors=d.d.monitors;}
if(d.d.allmonitors){d.form.details.allmonitors=d.d.allmonitors;}
if(d.d.monitor_create){d.form.details.monitor_create=d.d.monitor_create;}
if(d.d.video_delete){d.form.details.video_delete=d.d.video_delete;}
if(d.d.video_view){d.form.details.video_view=d.d.video_view;}
if(d.d.monitor_edit){d.form.details.monitor_edit=d.d.monitor_edit;}
if(d.d.size){d.form.details.size=d.d.size;}
if(d.d.days){d.form.details.days=d.d.days;}
delete(d.form.details.mon_groups)
if(details.sub){
formDetails.sub = details.sub;
if(details.monitors){formDetails.monitors = details.monitors;}
if(details.allmonitors){formDetails.allmonitors = details.allmonitors;}
if(details.monitor_create){formDetails.monitor_create = details.monitor_create;}
if(details.video_delete){formDetails.video_delete = details.video_delete;}
if(details.video_view){formDetails.video_view = details.video_view;}
if(details.monitor_edit){formDetails.monitor_edit = details.monitor_edit;}
if(details.size){formDetails.size = details.size;}
if(details.days){formDetails.days = details.days;}
delete(formDetails.mon_groups)
}
var newSize = parseFloat(d.form.details.size) || 10000
var newSize = parseFloat(formDetails.size) || 10000
//load addStorageUse
var currentStorageNumber = 0
var readStorageArray = function(){
@@ -578,7 +332,7 @@ module.exports = function(s,config,lang){
readStorageArray()
return
}
var detailContainer = d.form.details || s.group[r.ke].init
var detailContainer = formDetails || s.group[r.ke].init
var storageId = path
var detailsContainerAddStorage = s.parseJSON(detailContainer.addStorage)
if(!s.group[d.ke].addStorageUse[storageId])s.group[d.ke].addStorageUse[storageId] = {}
@ -594,30 +348,43 @@ module.exports = function(s,config,lang){
}
readStorageArray()
///
d.form.details = JSON.stringify(d.form.details)
formDetails = JSON.stringify(s.mergeDeep(details,formDetails))
///
d.set=[],d.ar=[];
if(d.form.pass&&d.form.pass!==''){d.form.pass=s.createHash(d.form.pass);}else{delete(d.form.pass)};
delete(d.form.password_again);
d.for=Object.keys(d.form);
d.for.forEach(function(v){
d.set.push(v+'=?'),d.ar.push(d.form[v]);
});
d.ar.push(d.ke),d.ar.push(d.uid);
s.sqlQuery('UPDATE Users SET '+d.set.join(',')+' WHERE ke=? AND uid=?',d.ar,function(err,r){
if(!d.d.sub){
var user = Object.assign(d.form,{ke : d.ke})
var userDetails = JSON.parse(d.form.details)
const updateQuery = {}
if(form.pass && form.pass !== ''){
form.pass = s.createHash(form.pass)
}else{
delete(form.pass)
}
delete(form.password_again)
Object.keys(form).forEach(function(key){
const value = form[key]
updateQuery[key] = value
})
s.knexQuery({
action: "update",
table: "Users",
update: updateQuery,
where: [
['ke','=',d.ke],
['uid','=',d.uid],
]
},() => {
if(!details.sub){
var user = Object.assign(form,{ke : d.ke})
var userDetails = JSON.parse(formDetails)
s.group[d.ke].sizeLimit = parseFloat(newSize)
s.onAccountSaveExtensions.forEach(function(extender){
extender(s.group[d.ke],userDetails,user)
})
s.unloadGroupAppExtensions.forEach(function(extender){
extender(user)
})
s.loadGroupApps(d)
if(!dontRunExtensions){
s.onAccountSaveExtensions.forEach(function(extender){
extender(s.group[d.ke],userDetails,user)
})
s.unloadGroupAppExtensions.forEach(function(extender){
extender(user)
})
s.loadGroupApps(d)
}
}
if(d.cnid)s.tx({f:'user_settings_change',uid:d.uid,ke:d.ke,form:d.form},d.cnid)
if(d.cnid)s.tx({f:'user_settings_change',uid:d.uid,ke:d.ke,form:form},d.cnid)
})
}
}
@ -625,7 +392,17 @@ module.exports = function(s,config,lang){
}
s.findPreset = function(presetQueryVals,callback){
//presetQueryVals = [ke, type, name]
s.sqlQuery("SELECT * FROM Presets WHERE ke=? AND type=? AND name=? LIMIT 1",presetQueryVals,function(err,presets){
s.knexQuery({
action: "select",
columns: "*",
table: "Presets",
where: [
['ke','=',presetQueryVals[0]],
['type','=',presetQueryVals[1]],
['name','=',presetQueryVals[2]],
],
limit: 1
},function(err,presets) {
var preset
var notFound = false
if(presets && presets[0]){

libs/user/utils.js Normal file

@ -0,0 +1,463 @@
var fs = require('fs');
module.exports = (s,config,lang) => {
const deleteSetOfVideos = function(options,callback){
const groupKey = options.groupKey
const err = options.err
const videos = options.videos
const storageIndex = options.storageIndex
const reRunCheck = options.reRunCheck
var completedCheck = 0
var whereGroup = []
var whereQuery = [
['ke','=',groupKey],
]
if(videos){
videos.forEach(function(video){
video.dir = s.getVideoDirectory(video) + s.formattedTime(video.time) + '.' + video.ext
const queryGroup = {
mid: video.mid,
time: video.time,
}
if(whereGroup.length > 0)queryGroup.__separator = 'or'
whereGroup.push(queryGroup)
fs.chmod(video.dir,0o777,function(err){
fs.unlink(video.dir,function(err){
++completedCheck
if(err){
fs.stat(video.dir,function(err){
if(!err){
s.file('delete',video.dir)
}
})
}
const whereGroupLength = whereGroup.length
if(whereGroupLength > 0 && whereGroupLength === completedCheck){
whereQuery[1] = whereGroup
s.knexQuery({
action: "delete",
table: "Videos",
where: whereQuery
},(err,info) => {
setTimeout(reRunCheck,1000)
})
}
})
})
if(storageIndex){
s.setDiskUsedForGroupAddStorage(groupKey,{
size: -(video.size/1048576),
storageIndex: storageIndex
})
}else{
s.setDiskUsedForGroup(groupKey,-(video.size/1048576))
}
s.tx({
f: 'video_delete',
ff: 'over_max',
filename: s.formattedTime(video.time)+'.'+video.ext,
mid: video.mid,
ke: video.ke,
time: video.time,
end: s.formattedTime(new Date,'YYYY-MM-DD HH:mm:ss')
},'GRP_'+groupKey)
})
}else{
console.log(err)
}
if(whereGroup.length === 0){
if(callback)callback()
}
}
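The `completedCheck` counter in `deleteSetOfVideos` above implements a callback join: one delete fires per row, and the batched database purge runs only after the last callback lands. A minimal standalone sketch of that pattern (names here are illustrative, not from the source):

```javascript
// Sketch of the completedCheck pattern used in deleteSetOfVideos above:
// kick off one callback-based delete per item and run the follow-up
// only after the final callback has fired.
function deleteAll(items, removeOne, onAllDone) {
  let completed = 0;
  items.forEach(function (item) {
    removeOne(item, function () {
      ++completed;
      if (completed === items.length) onAllDone();
    });
  });
}

// Usage with a stub remover that records what it "deleted"
const removed = [];
let finished = false;
deleteAll(['a.mp4', 'b.mp4'], function (name, cb) {
  removed.push(name);
  cb();
}, function () {
  finished = true;
});
console.log(removed, finished); // [ 'a.mp4', 'b.mp4' ] true
```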
const deleteSetOfTimelapseFrames = function(options,callback){
const groupKey = options.groupKey
const err = options.err
const frames = options.frames
const storageIndex = options.storageIndex
var whereGroup = []
var whereQuery = [
['ke','=',groupKey],
[]
]
var completedCheck = 0
if(frames){
frames.forEach(function(frame){
var selectedDate = frame.filename.split('T')[0]
var dir = s.getTimelapseFrameDirectory(frame)
var fileLocationMid = `${dir}` + frame.filename
const queryGroup = {
mid: frame.mid,
time: frame.time,
}
if(whereGroup.length > 0)queryGroup.__separator = 'or'
whereGroup.push(queryGroup)
fs.unlink(fileLocationMid,function(err){
++completedCheck
if(err){
fs.stat(fileLocationMid,function(err){
if(!err){
s.file('delete',fileLocationMid)
}
})
}
const whereGroupLength = whereGroup.length
if(whereGroupLength > 0 && whereGroupLength === completedCheck){
whereQuery[1] = whereGroup
s.knexQuery({
action: "delete",
table: "Timelapse Frames",
where: whereQuery
},() => {
deleteTimelapseFrames(groupKey,callback)
})
}
})
if(storageIndex){
s.setDiskUsedForGroupAddStorage(groupKey,{
size: -(frame.size/1048576),
storageIndex: storageIndex
},'timelapeFrames')
}else{
s.setDiskUsedForGroup(groupKey,-(frame.size/1048576),'timelapeFrames')
}
// s.tx({
// f: 'timelapse_frame_delete',
// ff: 'over_max',
// filename: s.formattedTime(video.time)+'.'+video.ext,
// mid: video.mid,
// ke: video.ke,
// time: video.time,
// end: s.formattedTime(new Date,'YYYY-MM-DD HH:mm:ss')
// },'GRP_'+groupKey)
})
}else{
console.log(err)
}
if(whereGroup.length === 0){
if(callback)callback()
}
}
const deleteSetOfFileBinFiles = function(options,callback){
const groupKey = options.groupKey
const err = options.err
const files = options.files
const storageIndex = options.storageIndex
var whereGroup = []
var whereQuery = [
['ke','=',groupKey],
[]
]
var completedCheck = 0
if(files){
files.forEach(function(file){
var dir = s.getFileBinDirectory(file)
var fileLocationMid = `${dir}` + file.name
const queryGroup = {
mid: file.mid,
name: file.name,
}
if(whereGroup.length > 0)queryGroup.__separator = 'or'
whereGroup.push(queryGroup)
fs.unlink(fileLocationMid,function(err){
++completedCheck
if(err){
fs.stat(fileLocationMid,function(err){
if(!err){
s.file('delete',fileLocationMid)
}
})
}
const whereGroupLength = whereGroup.length
if(whereGroupLength > 0 && whereGroupLength === completedCheck){
whereQuery[1] = whereGroup
s.knexQuery({
action: "delete",
table: "Files",
where: whereQuery
},() => {
deleteFileBinFiles(groupKey,callback)
})
}
})
if(storageIndex){
s.setDiskUsedForGroupAddStorage(groupKey,{
size: -(file.size/1048576),
storageIndex: storageIndex
},'fileBin')
}else{
s.setDiskUsedForGroup(groupKey,-(file.size/1048576),'fileBin')
}
})
}else{
console.log(err)
}
if(whereGroup.length === 0){
if(callback)callback()
}
}
const deleteAddStorageVideos = function(groupKey,callback){
reRunCheck = function(){
s.debugLog('deleteAddStorageVideos')
return deleteAddStorageVideos(groupKey,callback)
}
var currentStorageNumber = 0
var readStorageArray = function(){
setTimeout(function(){
reRunCheck = readStorageArray
var storage = s.listOfStorage[currentStorageNumber]
if(!storage){
//done all checks, move on to next user
callback()
return
}
var storageId = storage.value
if(storageId === '' || !s.group[groupKey].addStorageUse[storageId]){
++currentStorageNumber
readStorageArray()
return
}
var storageIndex = s.group[groupKey].addStorageUse[storageId]
//run purge command
if(storageIndex.usedSpace > (storageIndex.sizeLimit * (storageIndex.deleteOffset || config.cron.deleteOverMaxOffset))){
s.knexQuery({
action: "select",
columns: "*",
table: "Videos",
where: [
['ke','=',groupKey],
['status','!=','0'],
['details','NOT LIKE',`%"archived":"1"%`],
['details','LIKE',`%"dir":"${storage.value}"%`],
],
orderBy: ['time','asc'],
limit: 3
},(err,rows) => {
deleteSetOfVideos({
groupKey: groupKey,
err: err,
videos: rows,
storageIndex: storageIndex,
reRunCheck: () => {
return readStorageArray()
}
},callback)
})
}else{
++currentStorageNumber
readStorageArray()
}
})
}
readStorageArray()
}
const deleteMainVideos = function(groupKey,callback){
// //run purge command
// s.debugLog('!!!!!!!!!!!deleteMainVideos')
// s.debugLog('s.group[groupKey].usedSpaceVideos > (s.group[groupKey].sizeLimit * (s.group[groupKey].sizeLimitVideoPercent / 100) * config.cron.deleteOverMaxOffset)')
// s.debugLog(s.group[groupKey].usedSpaceVideos > (s.group[groupKey].sizeLimit * (s.group[groupKey].sizeLimitVideoPercent / 100) * config.cron.deleteOverMaxOffset))
// s.debugLog('s.group[groupKey].usedSpaceVideos')
// s.debugLog(s.group[groupKey].usedSpaceVideos)
// s.debugLog('s.group[groupKey].sizeLimit * (s.group[groupKey].sizeLimitVideoPercent / 100) * config.cron.deleteOverMaxOffset')
// s.debugLog(s.group[groupKey].sizeLimit * (s.group[groupKey].sizeLimitVideoPercent / 100) * config.cron.deleteOverMaxOffset)
// s.debugLog('s.group[groupKey].sizeLimitVideoPercent / 100')
// s.debugLog(s.group[groupKey].sizeLimitVideoPercent / 100)
// s.debugLog('s.group[groupKey].sizeLimit')
// s.debugLog(s.group[groupKey].sizeLimit)
if(s.group[groupKey].usedSpaceVideos > (s.group[groupKey].sizeLimit * (s.group[groupKey].sizeLimitVideoPercent / 100) * config.cron.deleteOverMaxOffset)){
s.knexQuery({
action: "select",
columns: "*",
table: "Videos",
where: [
['ke','=',groupKey],
['status','!=','0'],
['details','NOT LIKE',`%"archived":"1"%`],
['details','NOT LIKE',`%"dir"%`],
],
orderBy: ['time','asc'],
limit: 3
},(err,rows) => {
deleteSetOfVideos({
groupKey: groupKey,
err: err,
videos: rows,
storageIndex: null,
reRunCheck: () => {
return deleteMainVideos(groupKey,callback)
}
},callback)
})
}else{
callback()
}
}
const deleteTimelapseFrames = function(groupKey,callback){
//run purge command
if(s.group[groupKey].usedSpaceTimelapseFrames > (s.group[groupKey].sizeLimit * (s.group[groupKey].sizeLimitTimelapseFramesPercent / 100) * config.cron.deleteOverMaxOffset)){
s.knexQuery({
action: "select",
columns: "*",
table: "Timelapse Frames",
where: [
['ke','=',groupKey],
['details','NOT LIKE',`%"archived":"1"%`],
],
orderBy: ['time','asc'],
limit: 3
},(err,frames) => {
deleteSetOfTimelapseFrames({
groupKey: groupKey,
err: err,
frames: frames,
storageIndex: null
},callback)
})
}else{
callback()
}
}
const deleteFileBinFiles = function(groupKey,callback){
if(config.deleteFileBinsOverMax === true){
//run purge command
if(s.group[groupKey].usedSpaceFileBin > (s.group[groupKey].sizeLimit * (s.group[groupKey].sizeLimitFileBinPercent / 100) * config.cron.deleteOverMaxOffset)){
s.knexQuery({
action: "select",
columns: "*",
table: "Files",
where: [
['ke','=',groupKey],
],
orderBy: ['time','asc'],
limit: 1
},(err,files) => {
deleteSetOfFileBinFiles({
groupKey: groupKey,
err: err,
files: files,
storageIndex: null
},callback)
})
}else{
callback()
}
}else{
callback()
}
}
const deleteCloudVideos = function(groupKey,storageType,storagePoint,callback){
const whereGroup = []
const cloudDisk = s.group[groupKey].cloudDiskUse[storageType]
//run purge command
if(cloudDisk.sizeLimitCheck && cloudDisk.usedSpace > (cloudDisk.sizeLimit * config.cron.deleteOverMaxOffset)){
s.knexQuery({
action: "select",
columns: "*",
table: "Cloud Videos",
where: [
['status','!=','0'],
['ke','=',groupKey],
['details','LIKE',`%"type":"${storageType}"%`],
],
orderBy: ['time','asc'],
limit: 2
},function(err,videos) {
if(!videos)return console.log(err)
var whereQuery = [
['ke','=',groupKey],
]
var didOne = false
videos.forEach(function(video){
video.dir = s.getVideoDirectory(video) + s.formattedTime(video.time) + '.' + video.ext
const queryGroup = {
mid: video.mid,
time: video.time,
}
if(whereGroup.length > 0)queryGroup.__separator = 'or'
whereGroup.push(queryGroup)
s.setCloudDiskUsedForGroup(groupKey,{
amount : -(video.size/1048576),
storageType : storageType
})
s.deleteVideoFromCloudExtensionsRunner({ke: groupKey},storageType,video)
})
const whereGroupLength = whereGroup.length
if(whereGroupLength > 0){
whereQuery[1] = whereGroup
s.knexQuery({
action: "delete",
table: "Cloud Videos",
where: whereQuery
},() => {
deleteCloudVideos(groupKey,storageType,storagePoint,callback)
})
}else{
callback()
}
})
}else{
callback()
}
}
const deleteCloudTimelapseFrames = function(groupKey,storageType,storagePoint,callback){
const whereGroup = []
var cloudDisk = s.group[groupKey].cloudDiskUse[storageType]
//run purge command
if(cloudDisk.usedSpaceTimelapseFrames > (cloudDisk.sizeLimit * (s.group[groupKey].sizeLimitTimelapseFramesPercent / 100) * config.cron.deleteOverMaxOffset)){
s.knexQuery({
action: "select",
columns: "*",
table: "Cloud Timelapse Frames",
where: [
['ke','=',groupKey],
['details','NOT LIKE',`%"archived":"1"%`],
],
orderBy: ['time','asc'],
limit: 3
},(err,frames) => {
if(!frames)return console.log(err)
var whereQuery = [
['ke','=',groupKey],
]
frames.forEach(function(frame){
frame.dir = s.getVideoDirectory(frame) + s.formattedTime(frame.time) + '.' + frame.ext
const queryGroup = {
mid: frame.mid,
time: frame.time,
}
if(whereGroup.length > 0)queryGroup.__separator = 'or'
whereGroup.push(queryGroup)
s.setCloudDiskUsedForGroup(groupKey,{
amount : -(frame.size/1048576),
storageType : storageType
})
s.deleteVideoFromCloudExtensionsRunner({ke: groupKey},storageType,frame)
})
const whereGroupLength = whereGroup.length
if(whereGroupLength > 0){
whereQuery[1] = whereGroup
s.knexQuery({
action: "delete",
table: "Cloud Timelapse Frames",
where: whereQuery
},() => {
deleteCloudTimelapseFrames(groupKey,storageType,storagePoint,callback)
})
}else{
callback()
}
})
}else{
callback()
}
}
return {
deleteSetOfVideos: deleteSetOfVideos,
deleteSetOfTimelapseFrames: deleteSetOfTimelapseFrames,
deleteSetOfFileBinFiles: deleteSetOfFileBinFiles,
deleteAddStorageVideos: deleteAddStorageVideos,
deleteMainVideos: deleteMainVideos,
deleteTimelapseFrames: deleteTimelapseFrames,
deleteFileBinFiles: deleteFileBinFiles,
deleteCloudVideos: deleteCloudVideos,
deleteCloudTimelapseFrames: deleteCloudTimelapseFrames,
}
}
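The `s.knexQuery` calls throughout this file share a where-clause shape: an array of `[column, operator, value]` triples AND'ed together, plus an optional array of row-selector objects OR'd via `__separator: 'or'`. A hypothetical helper illustrating that convention as it appears in the delete queries above (this is a sketch inferred from usage, not Shinobi's actual query builder):

```javascript
// Hypothetical sketch of the where-clause shape passed to s.knexQuery in
// this file: triples are AND'ed, object groups are OR'ed together.
function buildWhere(where) {
  return where.map(function (item) {
    if (typeof item[0] === 'string') {
      // simple [column, operator, value] triple
      return `${item[0]} ${item[1]} '${item[2]}'`;
    }
    // an array of row-selector objects, OR'ed together
    const group = item.map(function (row) {
      return '(' + Object.keys(row)
        .filter((k) => k !== '__separator')
        .map((k) => `${k} = '${row[k]}'`)
        .join(' AND ') + ')';
    });
    return '(' + group.join(' OR ') + ')';
  }).join(' AND ');
}

// Mirrors the batched video-delete queries above
const clause = buildWhere([
  ['ke', '=', 'GROUP1'],
  [
    { mid: 'cam1', time: '2020-10-01' },
    { mid: 'cam2', time: '2020-10-01', __separator: 'or' },
  ],
]);
console.log(clause);
```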


@ -31,8 +31,19 @@ module.exports = function(s,config,lang,app,io){
details.dir = monitor.details.dir
}
var timeNow = new Date(s.nameToTime(filename))
s.sqlQuery('INSERT INTO `Timelapse Frames` (ke,mid,details,filename,size,time) VALUES (?,?,?,?,?,?)',[ke,mid,s.s(details),filename,fileStats.size,timeNow])
s.setDiskUsedForGroup(monitor,fileStats.size / 1000000)
s.knexQuery({
action: "insert",
table: "Timelapse Frames",
insert: {
ke: ke,
mid: mid,
details: s.s(details),
filename: filename,
size: fileStats.size,
time: timeNow,
}
})
s.setDiskUsedForGroup(monitor.ke,fileStats.size / 1048576)
}
// else{
// s.insertDatabaseRow(
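The hunks in this commit repeatedly replace `/1000000` with `/1048576` in the disk-accounting calls: byte counts are now converted to mebibytes (1024 × 1024 bytes) rather than decimal megabytes. A minimal sketch of the unit change:

```javascript
// Sketch of the unit change applied across these hunks: byte counts are
// divided by 1048576 (1024 * 1024) for MiB, not by 1000000 (decimal MB).
const bytesToMiB = (bytes) => bytes / 1048576;
const bytesToMB = (bytes) => bytes / 1000000;

const size = 52428800; // bytes reported by fs.stat for a 50 MiB file
console.log(bytesToMiB(size)); // 50
console.log(bytesToMB(size));  // 52.4288
```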

View File

@ -64,17 +64,20 @@ module.exports = function(s,config,lang){
k.details.dir = e.details.dir
}
if(config.useUTC === true)k.details.isUTC = config.useUTC;
var save = [
e.mid,
e.ke,
k.startTime,
e.ext,
1,
s.s(k.details),
k.filesize,
k.endTime,
]
s.sqlQuery('INSERT INTO Videos (mid,ke,time,ext,status,details,size,end) VALUES (?,?,?,?,?,?,?,?)',save,function(err){
s.knexQuery({
action: "insert",
table: "Videos",
insert: {
ke: e.ke,
mid: e.mid,
time: k.startTime,
ext: e.ext,
status: 1,
details: s.s(k.details),
size: k.filesize,
end: k.endTime,
}
},(err) => {
if(callback)callback(err)
fs.chmod(k.dir+k.file,0o777,function(err){
@ -90,7 +93,11 @@ module.exports = function(s,config,lang){
e.dir = s.getVideoDirectory(e)
k.dir = e.dir.toString()
if(s.group[e.ke].activeMonitors[e.id].childNode){
s.cx({f:'insertCompleted',d:s.group[e.ke].rawMonitorConfigurations[e.id],k:k},s.group[e.ke].activeMonitors[e.id].childNodeId);
s.cx({
f: 'insertCompleted',
d: s.group[e.ke].rawMonitorConfigurations[e.id],
k: k
},s.group[e.ke].activeMonitors[e.id].childNodeId);
}else{
//get file directory
k.fileExists = fs.existsSync(k.dir+k.file)
@ -108,10 +115,10 @@ module.exports = function(s,config,lang){
}
if(k.fileExists===true){
//close video row
k.details = {}
k.details = k.details && k.details instanceof Object ? k.details : {}
k.stat = fs.statSync(k.dir+k.file)
k.filesize = k.stat.size
k.filesizeMB = parseFloat((k.filesize/1000000).toFixed(2))
k.filesizeMB = parseFloat((k.filesize/1048576).toFixed(2))
k.startTime = new Date(s.nameToTime(k.file))
k.endTime = new Date(k.endTime || k.stat.mtime)
@ -126,58 +133,54 @@ module.exports = function(s,config,lang){
if(!e.ext){e.ext = k.filename.split('.')[1]}
//send event for completed recording
if(config.childNodes.enabled === true && config.childNodes.mode === 'child' && config.childNodes.host){
const response = {
mid: e.mid,
ke: e.ke,
filename: k.filename,
d: s.cleanMonitorObject(e),
filesize: k.filesize,
time: s.timeObject(k.startTime).format('YYYY-MM-DD HH:mm:ss'),
end: s.timeObject(k.endTime).format('YYYY-MM-DD HH:mm:ss')
}
fs.createReadStream(k.dir+k.filename,{ highWaterMark: 500 })
.on('data',function(data){
s.cx({
s.cx(Object.assign(response,{
f:'created_file_chunk',
mid:e.mid,
ke:e.ke,
chunk:data,
filename:k.filename,
d:s.cleanMonitorObject(e),
filesize:e.filesize,
time:s.timeObject(k.startTime).format(),
end:s.timeObject(k.endTime).format()
})
chunk: data,
}))
})
.on('close',function(){
clearTimeout(s.group[e.ke].activeMonitors[e.id].recordingChecker)
clearTimeout(s.group[e.ke].activeMonitors[e.id].streamChecker)
s.cx({
s.cx(Object.assign(response,{
f:'created_file',
mid:e.id,
ke:e.ke,
filename:k.filename,
d:s.cleanMonitorObject(e),
filesize:k.filesize,
time:s.timeObject(k.startTime).format(),
end:s.timeObject(k.endTime).format()
})
}))
})
}else{
var href = '/videos/'+e.ke+'/'+e.mid+'/'+k.filename
if(config.useUTC === true)href += '?isUTC=true';
s.txWithSubPermissions({
f:'video_build_success',
hrefNoAuth:href,
filename:k.filename,
mid:e.mid,
ke:e.ke,
time:k.startTime,
size:k.filesize,
end:k.endTime
f: 'video_build_success',
hrefNoAuth: href,
filename: k.filename,
mid: e.mid,
ke: e.ke,
time: k.startTime,
size: k.filesize,
end: k.endTime,
events: k.events && k.events.length > 0 ? k.events : null
},'GRP_'+e.ke,'video_view')
//purge over max
s.purgeDiskForGroup(e)
s.purgeDiskForGroup(e.ke)
//send new diskUsage values
var storageIndex = s.getVideoStorageIndex(e)
if(storageIndex){
s.setDiskUsedForGroupAddStorage(e,{
s.setDiskUsedForGroupAddStorage(e.ke,{
size: k.filesizeMB,
storageIndex: storageIndex
})
}else{
s.setDiskUsedForGroup(e,k.filesizeMB)
s.setDiskUsedForGroup(e.ke,k.filesizeMB)
}
s.onBeforeInsertCompletedVideoExtensions.forEach(function(extender){
extender(e,k)
@ -210,8 +213,17 @@ module.exports = function(s,config,lang){
time = e.time
}
time = new Date(time)
var queryValues = [e.id,e.ke,time];
s.sqlQuery('SELECT * FROM Videos WHERE `mid`=? AND `ke`=? AND `time`=?',queryValues,function(err,r){
const whereQuery = {
ke: e.ke,
mid: e.id,
time: time,
}
s.knexQuery({
action: "select",
columns: "*",
table: "Videos",
where: whereQuery
},(err,r) => {
if(r && r[0]){
r = r[0]
fs.chmod(e.dir+filename,0o777,function(err){
@ -225,14 +237,18 @@ module.exports = function(s,config,lang){
},'GRP_'+e.ke);
var storageIndex = s.getVideoStorageIndex(e)
if(storageIndex){
s.setDiskUsedForGroupAddStorage(e,{
size: -(r.size / 1000000),
s.setDiskUsedForGroupAddStorage(e.ke,{
size: -(r.size / 1048576),
storageIndex: storageIndex
})
}else{
s.setDiskUsedForGroup(e,-(r.size / 1000000))
s.setDiskUsedForGroup(e.ke,-(r.size / 1048576))
}
s.sqlQuery('DELETE FROM Videos WHERE `mid`=? AND `ke`=? AND `time`=?',queryValues,function(err){
s.knexQuery({
action: "delete",
table: "Videos",
where: whereQuery
},(err) => {
if(err){
s.systemLog(lang['File Delete Error'] + ' : '+e.ke+' : '+' : '+e.id,err)
}
@ -253,9 +269,8 @@ module.exports = function(s,config,lang){
}
s.deleteListOfVideos = function(videos){
var deleteSetOfVideos = function(videos){
var query = 'DELETE FROM Videos WHERE '
var videoQuery = []
var queryValues = []
const whereQuery = []
var didOne = false;
videos.forEach(function(video){
s.checkDetails(video)
//e = video object
@ -276,37 +291,45 @@ module.exports = function(s,config,lang){
time = video.time
}
time = new Date(time)
fs.chmod(video.dir+filename,0o777,function(err){
fs.chmod(video.dir + filename,0o777,function(err){
s.tx({
f: 'video_delete',
filename: filename,
mid: video.id,
mid: video.mid,
ke: video.ke,
time: s.nameToTime(filename),
end: s.formattedTime(new Date,'YYYY-MM-DD HH:mm:ss')
},'GRP_'+video.ke);
var storageIndex = s.getVideoStorageIndex(video)
if(storageIndex){
s.setDiskUsedForGroupAddStorage(video,{
size: -(video.size / 1000000),
s.setDiskUsedForGroupAddStorage(video.ke,{
size: -(video.size / 1048576),
storageIndex: storageIndex
})
}else{
s.setDiskUsedForGroup(video,-(video.size / 1000000))
s.setDiskUsedForGroup(video.ke,-(video.size / 1048576))
}
fs.unlink(video.dir+filename,function(err){
fs.stat(video.dir+filename,function(err){
fs.unlink(video.dir + filename,function(err){
fs.stat(video.dir + filename,function(err){
if(!err){
s.file('delete',video.dir+filename)
s.file('delete',video.dir + filename)
}
})
})
})
videoQuery.push('(`mid`=? AND `ke`=? AND `time`=?)')
queryValues = queryValues.concat([video.id,video.ke,time])
const queryGroup = {
ke: video.ke,
mid: video.mid,
time: time,
}
if(whereQuery.length > 0)queryGroup.__separator = 'or'
whereQuery.push(queryGroup)
})
query += videoQuery.join(' OR ')
s.sqlQuery(query,queryValues,function(err){
s.knexQuery({
action: "delete",
table: "Videos",
where: whereQuery
},(err) => {
if(err){
s.systemLog(lang['List of Videos Delete Error'],err)
}
@ -338,11 +361,24 @@ module.exports = function(s,config,lang){
s.deleteVideoFromCloud = function(e){
// e = video object
s.checkDetails(e)
var videoSelector = [e.id,e.ke,new Date(e.time)]
s.sqlQuery('SELECT * FROM `Cloud Videos` WHERE `mid`=? AND `ke`=? AND `time`=?',videoSelector,function(err,r){
const whereQuery = {
ke: e.ke,
mid: e.mid,
time: new Date(e.time),
}
s.knexQuery({
action: "select",
columns: "*",
table: "Cloud Videos",
where: whereQuery
},(err,r) => {
if(r&&r[0]){
r = r[0]
s.sqlQuery('DELETE FROM `Cloud Videos` WHERE `mid`=? AND `ke`=? AND `time`=?',videoSelector,function(){
s.knexQuery({
action: "delete",
table: "Cloud Videos",
where: whereQuery
},(err,r) => {
s.deleteVideoFromCloudExtensionsRunner(e,r)
})
}else{
@ -374,18 +410,23 @@ module.exports = function(s,config,lang){
}
fiveRecentFiles.forEach(function(filename){
if(/T[0-9][0-9]-[0-9][0-9]-[0-9][0-9]./.test(filename)){
var queryValues = [
monitor.ke,
monitor.mid,
s.nameToTime(filename)
]
s.sqlQuery('SELECT * FROM Videos WHERE ke=? AND mid=? AND time=? LIMIT 1',queryValues,function(err,rows){
s.knexQuery({
action: "select",
columns: "*",
table: "Videos",
where: [
['ke','=',monitor.ke],
['mid','=',monitor.mid],
['time','=',s.nameToTime(filename)],
],
limit: 1
},(err,rows) => {
if(!err && (!rows || !rows[0])){
++orphanedFilesCount
var video = rows[0]
s.insertCompletedVideo(monitor,{
file : filename
},function(){
},() => {
fileComplete()
})
}else{
@ -463,14 +504,14 @@ module.exports = function(s,config,lang){
createLocation = fileLocationMid
}
})
if(concatFiles.length > 30){
if(concatFiles.length > framesPerSecond){
var commandTempLocation = `${s.dir.streams}${ke}/${mid}/mergeJpegs_${finalFileName}.sh`
var finalMp4OutputLocation = `${s.dir.fileBin}${ke}/${mid}/${finalFileName}.mp4`
if(!s.group[ke].activeMonitors[mid].buildingTimelapseVideo){
if(!fs.existsSync(finalMp4OutputLocation)){
var currentFile = 0
var completionTimeout
var commandString = `ffmpeg -y -pattern_type glob -f image2pipe -vcodec mjpeg -r ${framesPerSecond} -analyzeduration 10 -i - -q:v 1 -c:v libx264 -r ${framesPerSecond} "${finalMp4OutputLocation}"`
var commandString = `ffmpeg -y -f image2pipe -vcodec mjpeg -r ${framesPerSecond} -analyzeduration 10 -i - -q:v 1 -c:v libx264 -r ${framesPerSecond} "${finalMp4OutputLocation}"`
fs.writeFileSync(commandTempLocation,commandString)
var videoBuildProcess = spawn('sh',[commandTempLocation])
videoBuildProcess.stderr.on('data',function(data){
@ -486,8 +527,19 @@ module.exports = function(s,config,lang){
var timeNow = new Date()
var fileStats = fs.statSync(finalMp4OutputLocation)
var details = {}
s.sqlQuery('INSERT INTO `Files` (ke,mid,details,name,size,time) VALUES (?,?,?,?,?,?)',[ke,mid,s.s(details),finalFileName + '.mp4',fileStats.size,timeNow])
s.setDiskUsedForGroup({ke: ke},fileStats.size / 1000000,'fileBin')
s.knexQuery({
action: "insert",
table: "Files",
insert: {
ke: ke,
mid: mid,
details: s.s(details),
name: finalFileName + '.mp4',
size: fileStats.size,
time: timeNow,
}
})
s.setDiskUsedForGroup(ke,fileStats.size / 1048576,'fileBin')
fs.unlink(commandTempLocation,function(){
})


@ -57,7 +57,9 @@ module.exports = function(s,config,lang,io){
//SSL options
var wellKnownDirectory = s.mainDirectory + '/web/.well-known'
if(fs.existsSync(wellKnownDirectory))app.use('/.well-known',express.static(wellKnownDirectory))
config.sslEnabled = false
if(config.ssl&&config.ssl.key&&config.ssl.cert){
config.sslEnabled = true
config.ssl.key=fs.readFileSync(s.checkRelativePath(config.ssl.key),'utf8')
config.ssl.cert=fs.readFileSync(s.checkRelativePath(config.ssl.cert),'utf8')
if(config.ssl.port === undefined){
@ -93,6 +95,12 @@ module.exports = function(s,config,lang,io){
path:s.checkCorrectPathEnding(config.webPaths.super)+'socket.io',
transports: ['websocket']
})
app.use(function(req, res, next) {
if(!req.secure) {
return res.redirect(['https://', req.hostname,":",config.ssl.port, req.url].join(''));
}
next();
})
}
//start HTTP
var server = http.createServer(app);
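The redirect middleware added in the hunk above can be reduced to a pure function for illustration; `sslPort` here is a stand-in for `config.ssl.port`, and the object literal stands in for an Express request:

```javascript
// Sketch of the force-HTTPS rule added above: requests that did not arrive
// over TLS are redirected to the same host and path on the SSL port.
function httpsRedirectTarget(req, sslPort) {
  if (req.secure) return null; // already HTTPS: fall through to next()
  return ['https://', req.hostname, ':', sslPort, req.url].join('');
}

const target = httpsRedirectTarget(
  { secure: false, hostname: 'shinobi.local', url: '/embed/GRP/cam1' },
  8443
);
console.log(target); // https://shinobi.local:8443/embed/GRP/cam1
```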


@ -25,15 +25,18 @@ module.exports = function(s,config,lang,app){
var mail = form.mail || s.getPostData(req,'mail',false)
if(form){
var keys = ['details']
var condition = []
var value = []
keys.forEach(function(v){
condition.push(v+'=?')
if(form[v] instanceof Object)form[v] = JSON.stringify(form[v])
value.push(form[v])
const updateQuery = {
details: s.stringJSON(form.details)
}
s.knexQuery({
action: "update",
table: "Users",
update: updateQuery,
where: [
['ke','=',req.params.ke],
['uid','=',uid],
]
})
value = value.concat([req.params.ke,uid])
s.sqlQuery("UPDATE Users SET "+condition.join(',')+" WHERE ke=? AND uid=?",value)
s.tx({
f: 'edit_sub_account',
ke: req.params.ke,
@ -42,7 +45,15 @@ module.exports = function(s,config,lang,app){
form: form
},'ADM_'+req.params.ke)
endData.ok = true
s.sqlQuery("SELECT * FROM API WHERE ke=? AND uid=?",[req.params.ke,uid],function(err,rows){
s.knexQuery({
action: "select",
columns: "*",
table: "API",
where: [
['ke','=',req.params.ke],
['uid','=',uid],
]
},function(err,rows){
if(rows && rows[0]){
rows.forEach(function(row){
delete(s.api[row.code])
@ -71,13 +82,36 @@ module.exports = function(s,config,lang,app){
var form = s.getPostData(req) || {}
var uid = form.uid || s.getPostData(req,'uid',false)
var mail = form.mail || s.getPostData(req,'mail',false)
s.sqlQuery('DELETE FROM Users WHERE uid=? AND ke=? AND mail=?',[uid,req.params.ke,mail])
s.sqlQuery("SELECT * FROM API WHERE ke=? AND uid=?",[req.params.ke,uid],function(err,rows){
s.knexQuery({
action: "delete",
table: "Users",
where: {
ke: req.params.ke,
uid: uid,
mail: mail,
}
})
s.knexQuery({
action: "select",
columns: "*",
table: "API",
where: [
['ke','=',req.params.ke],
['uid','=',uid],
]
},function(err,rows){
if(rows && rows[0]){
rows.forEach(function(row){
delete(s.api[row.code])
})
s.sqlQuery('DELETE FROM API WHERE uid=? AND ke=?',[uid,req.params.ke])
s.knexQuery({
action: "delete",
table: "API",
where: {
ke: req.params.ke,
uid: uid,
}
})
}
})
s.tx({
@ -112,8 +146,15 @@ module.exports = function(s,config,lang,app){
var form = s.getPostData(req)
if(form.mail !== '' && form.pass !== ''){
if(form.pass === form.password_again || form.pass === form.pass_again){
s.sqlQuery('SELECT * FROM Users WHERE mail=?',[form.mail],function(err,r) {
if(r&&r[0]){
s.knexQuery({
action: "select",
columns: "*",
table: "Users",
where: [
['mail','=',form.mail],
]
},function(err,r){
if(r && r[0]){
//found one exist
endData.msg = 'Email address is in use.'
}else{
@ -125,7 +166,17 @@ module.exports = function(s,config,lang,app){
sub: "1",
allmonitors: "1"
})
s.sqlQuery('INSERT INTO Users (ke,uid,mail,pass,details) VALUES (?,?,?,?,?)',[req.params.ke,newId,form.mail,s.createHash(form.pass),details])
s.knexQuery({
action: "insert",
table: "Users",
insert: {
ke: req.params.ke,
uid: newId,
mail: form.mail,
pass: s.createHash(form.pass),
details: details,
}
})
s.tx({
f: 'add_sub_account',
details: details,
@ -199,8 +250,22 @@ module.exports = function(s,config,lang,app){
s.userLog(s.group[req.params.ke].rawMonitorConfigurations[req.params.id],{type:'Monitor Deleted',msg:'by user : '+user.uid});
req.params.delete=1;s.camera('stop',req.params);
s.tx({f:'monitor_delete',uid:user.uid,mid:req.params.id,ke:req.params.ke},'GRP_'+req.params.ke);
s.sqlQuery('DELETE FROM Monitors WHERE ke=? AND mid=?',[req.params.ke,req.params.id])
// s.sqlQuery('DELETE FROM Files WHERE ke=? AND mid=?',[req.params.ke,req.params.id])
s.knexQuery({
action: "delete",
table: "Monitors",
where: {
ke: req.params.ke,
mid: req.params.id,
}
})
// s.knexQuery({
// action: "delete",
// table: "Files",
// where: {
// ke: req.params.ke,
// mid: req.params.id,
// }
// })
if(req.query.deleteFiles === 'true'){
//videos
s.dir.addStorage.forEach(function(v,n){
@ -250,28 +315,28 @@ module.exports = function(s,config,lang,app){
}
var form = s.getPostData(req)
if(form){
var insert = {
const insertQuery = {
ke : req.params.ke,
uid : user.uid,
code : s.gid(30),
ip : form.ip,
details : s.stringJSON(form.details)
}
var escapes = []
Object.keys(insert).forEach(function(column){
escapes.push('?')
});
s.sqlQuery('INSERT INTO API ('+Object.keys(insert).join(',')+') VALUES ('+escapes.join(',')+')',Object.values(insert),function(err,r){
insert.time = s.formattedTime(new Date,'YYYY-DD-MM HH:mm:ss');
s.knexQuery({
action: "insert",
table: "API",
insert: insertQuery
},(err,r) => {
insertQuery.time = s.formattedTime(new Date,'YYYY-MM-DD HH:mm:ss');
if(!err){
s.tx({
f: 'api_key_added',
uid: user.uid,
form: insert
form: insertQuery
},'GRP_' + req.params.ke)
endData.ok = true
}
endData.api = insert
endData.api = insertQuery
s.closeJsonResponse(res,endData)
})
}else{
@ -305,16 +370,15 @@ module.exports = function(s,config,lang,app){
s.closeJsonResponse(res,endData)
return
}
var row = {
ke : req.params.ke,
uid : user.uid,
code : form.code
}
var where = []
Object.keys(row).forEach(function(column){
where.push(column+'=?')
})
s.sqlQuery('DELETE FROM API WHERE '+where.join(' AND '),Object.values(row),function(err,r){
s.knexQuery({
action: "delete",
table: "API",
where: {
ke: req.params.ke,
uid: user.uid,
code: form.code,
}
},(err,r) => {
if(!err){
s.tx({
f: 'api_key_deleted',
@ -345,15 +409,16 @@ module.exports = function(s,config,lang,app){
var endData = {
ok : false
}
var row = {
const whereQuery = {
ke : req.params.ke,
uid : user.uid
}
var where = []
Object.keys(row).forEach(function(column){
where.push(column+'=?')
})
s.sqlQuery('SELECT * FROM API WHERE '+where.join(' AND '),Object.values(row),function(err,rows){
s.knexQuery({
action: "select",
columns: "*",
table: "API",
where: whereQuery
},function(err,rows) {
if(rows && rows[0]){
rows.forEach(function(row){
row.details = JSON.parse(row.details)
@ -383,16 +448,22 @@ module.exports = function(s,config,lang,app){
s.closeJsonResponse(res,endData)
return
}
s.sqlQuery("SELECT * FROM Presets WHERE ke=? AND type=?",[req.params.ke,'monitorStates'],function(err,presets){
s.knexQuery({
action: "select",
columns: "*",
table: "Presets",
where: [
['ke','=',req.params.ke],
['type','=','monitorStates'],
]
},function(err,presets) {
if(presets && presets[0]){
endData.ok = true
presets.forEach(function(preset){
preset.details = JSON.parse(preset.details)
})
endData.presets = presets
}else{
endData.msg = user.lang['State Configuration Not Found']
}
endData.presets = presets || []
s.closeJsonResponse(res,endData)
})
})
@ -437,7 +508,11 @@ module.exports = function(s,config,lang,app){
details: s.s(details),
type: 'monitorStates'
}
s.sqlQuery('INSERT INTO Presets ('+Object.keys(insertData).join(',')+') VALUES (?,?,?,?)',Object.values(insertData))
s.knexQuery({
action: "insert",
table: "Presets",
insert: insertData
})
s.tx({
f: 'add_group_state',
details: details,
@ -449,7 +524,17 @@ module.exports = function(s,config,lang,app){
var details = Object.assign(preset.details,{
monitors : form.monitors
})
s.sqlQuery('UPDATE Presets SET details=? WHERE ke=? AND name=?',[s.s(details),req.params.ke,req.params.stateName])
s.knexQuery({
action: "update",
table: "Presets",
update: {
details: s.s(details)
},
where: [
['ke','=',req.params.ke],
['name','=',req.params.stateName],
]
})
s.tx({
f: 'edit_group_state',
details: details,
@ -467,7 +552,14 @@ module.exports = function(s,config,lang,app){
endData.msg = user.lang['State Configuration Not Found']
s.closeJsonResponse(res,endData)
}else{
s.sqlQuery('DELETE FROM Presets WHERE ke=? AND name=?',[req.params.ke,req.params.stateName],function(err){
s.knexQuery({
action: "delete",
table: "Presets",
where: {
ke: req.params.ke,
name: req.params.stateName,
}
},(err) => {
if(!err){
endData.msg = lang["Deleted State Configuration"]
endData.ok = true
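
The hunks above replace hand-built SQL strings (`s.sqlQuery('SELECT ... WHERE '+where.join(' AND ')`, ...)`) with a declarative `s.knexQuery({action, table, where}, callback)` descriptor, where `where` is either a plain object of equality matches or an array of `[column, operator, value]` triples. A minimal sketch of how such a descriptor reduces to parameterized SQL (a hypothetical stand-in for illustration only; the real `s.knexQuery` delegates to the knex query builder):

```javascript
// Sketch of a knexQuery-style descriptor -> parameterized SQL reducer.
// Hypothetical helper; Shinobi's actual implementation uses knex.
function buildQuery(options) {
  const where = [];
  const values = [];
  // `where` may be an array of [column, operator, value] triples...
  if (Array.isArray(options.where)) {
    options.where.forEach(([col, op, val]) => {
      where.push(`${col} ${op} ?`);
      values.push(val);
    });
  // ...or a plain object of column: value equality matches.
  } else if (options.where) {
    Object.entries(options.where).forEach(([col, val]) => {
      where.push(`${col} = ?`);
      values.push(val);
    });
  }
  const whereSql = where.length ? ' WHERE ' + where.join(' AND ') : '';
  switch (options.action) {
    case 'select':
      return { sql: `SELECT ${options.columns} FROM ${options.table}${whereSql}`, values };
    case 'delete':
      return { sql: `DELETE FROM ${options.table}${whereSql}`, values };
    case 'insert': {
      const keys = Object.keys(options.insert);
      const marks = keys.map(() => '?').join(',');
      return {
        sql: `INSERT INTO ${options.table} (${keys.join(',')}) VALUES (${marks})`,
        values: Object.values(options.insert),
      };
    }
  }
}

const q = buildQuery({
  action: 'select',
  columns: '*',
  table: 'Presets',
  where: [['ke', '=', 'GROUP1'], ['type', '=', 'monitorStates']],
});
console.log(q.sql); // SELECT * FROM Presets WHERE ke = ? AND type = ?
```

The payoff visible throughout the diff is that callers no longer assemble `where.push(column+'=?')` loops by hand, and values always travel as bound parameters.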

File diff suppressed because it is too large

View File

@ -11,12 +11,18 @@ var httpProxy = require('http-proxy');
var proxy = httpProxy.createProxyServer({})
var ejs = require('ejs');
module.exports = function(s,config,lang,app){
var noCache = function(res){
res.setHeader('Cache-Control', 'private, no-cache, no-store, must-revalidate')
res.setHeader('Expires', '-1')
res.setHeader('Pragma', 'no-cache')
}
/**
* Page : Get Embed Stream
*/
app.get([config.webPaths.apiPrefix+':auth/embed/:ke/:id',config.webPaths.apiPrefix+':auth/embed/:ke/:id/:addon'], function (req,res){
req.params.protocol=req.protocol;
s.auth(req.params,function(user){
noCache(res)
if(user.permissions.watch_stream==="0"||user.details.sub&&user.details.allmonitors!=='1'&&user.details.monitors.indexOf(req.params.id)===-1){
res.end(user.lang['Not Permitted'])
return
@ -117,10 +123,10 @@ module.exports = function(s,config,lang,app){
Emitter = s.group[req.params.ke].activeMonitors[req.params.id].emitterChannel[parseInt(req.params.channel)+config.pipeAddition]
}
res.writeHead(200, {
'Content-Type': 'multipart/x-mixed-replace; boundary=shinobi',
'Cache-Control': 'no-cache',
'Connection': 'keep-alive',
'Pragma': 'no-cache'
'Content-Type': 'multipart/x-mixed-replace; boundary=shinobi',
'Cache-Control': 'no-cache',
'Connection': 'keep-alive',
'Pragma': 'no-cache'
});
var contentWriter
fs.readFile(config.defaultMjpeg,'binary',function(err,content){
@ -164,6 +170,7 @@ module.exports = function(s,config,lang,app){
app.get([config.webPaths.apiPrefix+':auth/hls/:ke/:id/:file',config.webPaths.apiPrefix+':auth/hls/:ke/:id/:channel/:file'], function (req,res){
req.fn=function(user){
s.checkChildProxy(req.params,function(){
noCache(res)
req.dir=s.dir.streams+req.params.ke+'/'+req.params.id+'/'
if(req.params.channel){
req.dir+='channel'+(parseInt(req.params.channel)+config.pipeAddition)+'/'+req.params.file;
@ -186,6 +193,7 @@ module.exports = function(s,config,lang,app){
app.get(config.webPaths.apiPrefix+':auth/jpeg/:ke/:id/s.jpg', function(req,res){
s.auth(req.params,function(user){
s.checkChildProxy(req.params,function(){
noCache(res)
if(user.details.sub&&user.details.allmonitors!=='1'&&user.details.monitors&&user.details.monitors.indexOf(req.params.id)===-1){
res.end(user.lang['Not Permitted'])
return
@ -206,11 +214,34 @@ module.exports = function(s,config,lang,app){
},res,req);
});
/**
* API : Get JPEG Snapshot
*/
app.get(config.webPaths.apiPrefix+':auth/icon/:ke/:id', function(req,res){
s.auth(req.params,async (user) => {
if(user.details.sub&&user.details.allmonitors!=='1'&&user.details.monitors&&user.details.monitors.indexOf(req.params.id)===-1){
res.end(user.lang['Not Permitted'])
return
}
res.writeHead(200, {
'Content-Type': 'image/jpeg',
'Cache-Control': 'no-cache',
'Pragma': 'no-cache'
});
res.end(await s.getCameraSnapshot({
ke: req.params.ke,
mid: req.params.id,
},{
useIcon: true
}))
},res,req);
});
/**
* API : Get FLV Stream
*/
app.get([config.webPaths.apiPrefix+':auth/flv/:ke/:id/s.flv',config.webPaths.apiPrefix+':auth/flv/:ke/:id/:channel/s.flv'], function(req,res) {
s.auth(req.params,function(user){
s.checkChildProxy(req.params,function(){
noCache(res)
var Emitter,chunkChannel
if(!req.params.channel){
Emitter = s.group[req.params.ke].activeMonitors[req.params.id].emitter
@ -261,6 +292,7 @@ module.exports = function(s,config,lang,app){
app.get([config.webPaths.apiPrefix+':auth/h265/:ke/:id/s.hevc',config.webPaths.apiPrefix+':auth/h265/:ke/:id/:channel/s.hevc'], function(req,res) {
s.auth(req.params,function(user){
s.checkChildProxy(req.params,function(){
noCache(res)
var Emitter,chunkChannel
if(!req.params.channel){
Emitter = s.group[req.params.ke].activeMonitors[req.params.id].emitter
@ -310,6 +342,7 @@ module.exports = function(s,config,lang,app){
], function (req, res) {
s.auth(req.params,function(user){
s.checkChildProxy(req.params,function(){
noCache(res)
if(!req.query.feed){req.query.feed='1'}
var Emitter
if(!req.params.feed){

View File

@ -7,62 +7,31 @@ var exec = require('child_process').exec;
var spawn = require('child_process').spawn;
var execSync = require('child_process').execSync;
module.exports = function(s,config,lang,app){
/**
* API : Superuser : Get Logs
*/
app.all([config.webPaths.supersuperApiPrefix+':auth/logs'], function (req,res){
app.all([config.webPaths.superApiPrefix+':auth/logs'], function (req,res){
req.ret={ok:false};
s.superAuth(req.params,function(resp){
req.sql='SELECT * FROM Logs WHERE ke=?';req.ar=['$'];
if(!req.params.id){
if(user.details.sub&&user.details.monitors&&user.details.allmonitors!=='1'){
try{user.details.monitors=JSON.parse(user.details.monitors);}catch(er){}
req.or=[];
user.details.monitors.forEach(function(v,n){
req.or.push('mid=?');req.ar.push(v)
})
req.sql+=' AND ('+req.or.join(' OR ')+')'
}
}else{
if(!user.details.sub||user.details.allmonitors!=='0'||user.details.monitors.indexOf(req.params.id)>-1||req.params.id.indexOf('$')>-1){
req.sql+=' and mid=?';req.ar.push(req.params.id)
}else{
res.end('[]');
return;
}
}
if(req.query.start||req.query.end){
if(!req.query.startOperator||req.query.startOperator==''){
req.query.startOperator='>='
}
if(!req.query.endOperator||req.query.endOperator==''){
req.query.endOperator='<='
}
if(req.query.start && req.query.start !== '' && req.query.end && req.query.end !== ''){
req.query.start = s.stringToSqlTime(req.query.start)
req.query.end = s.stringToSqlTime(req.query.end)
req.sql+=' AND `time` '+req.query.startOperator+' ? AND `time` '+req.query.endOperator+' ?';
req.ar.push(req.query.start)
req.ar.push(req.query.end)
}else if(req.query.start && req.query.start !== ''){
req.query.start = s.stringToSqlTime(req.query.start)
req.sql+=' AND `time` '+req.query.startOperator+' ?';
req.ar.push(req.query.start)
}
}
if(!req.query.limit||req.query.limit==''){req.query.limit=50}
req.sql+=' ORDER BY `time` DESC LIMIT '+req.query.limit+'';
s.sqlQuery(req.sql,req.ar,function(err,r){
if(err){
err.sql=req.sql;
res.end(s.prettyPrint(err));
return
}
if(!r){r=[]}
r.forEach(function(v,n){
r[n].info=JSON.parse(v.info)
const monitorRestrictions = s.getMonitorRestrictions(user.details,req.params.id)
s.getDatabaseRows({
monitorRestrictions: monitorRestrictions,
table: 'Logs',
groupKey: req.params.ke,
date: req.query.date,
startDate: req.query.start,
endDate: req.query.end,
startOperator: req.query.startOperator,
endOperator: req.query.endOperator,
limit: req.query.limit,
archived: req.query.archived,
endIsStartTo: true
},(response) => {
response.rows.forEach(function(v,n){
v.info = JSON.parse(v.info)
})
res.end(s.prettyPrint(r))
s.closeJsonResponse(res,response.rows)
})
},res,req)
})
@ -71,11 +40,16 @@ module.exports = function(s,config,lang,app){
*/
app.all(config.webPaths.superApiPrefix+':auth/logs/delete', function (req,res){
s.superAuth(req.params,function(resp){
s.sqlQuery('DELETE FROM Logs WHERE ke=?',['$'],function(){
var endData = {
ok : true
s.knexQuery({
action: "delete",
table: "Logs",
where: {
ke: '$'
}
res.end(s.prettyPrint(endData))
},() => {
s.closeJsonResponse(res,{
ok : true
})
})
},res,req)
})
@ -99,7 +73,7 @@ module.exports = function(s,config,lang,app){
var endData = {
ok : true
}
res.end(s.prettyPrint(endData))
s.closeJsonResponse(res,endData)
},res,req)
})
/**
@ -124,7 +98,7 @@ module.exports = function(s,config,lang,app){
s.systemLog('Flush PM2 Logs',{by:resp.$user.mail,ip:resp.ip})
endData.logsOuput = execSync('pm2 flush')
}
res.end(s.prettyPrint(endData))
s.closeJsonResponse(res,endData)
},res,req)
})
/**
@ -145,11 +119,23 @@ module.exports = function(s,config,lang,app){
ip: resp.ip,
old:jsonfile.readFileSync(s.location.config)
})
try{
if(config.thisIsDocker){
const dockerConfigFile = '/config/conf.json'
fs.stat(dockerConfigFile,(err) => {
if(!err){
fs.writeFile(dockerConfigFile,JSON.stringify(postBody,null,3),function(){})
}
})
}
}catch(err){
console.log(err)
}
jsonfile.writeFile(s.location.config,postBody,{spaces: 2},function(){
s.tx({f:'save_configuration'},'$')
})
}
res.end(s.prettyPrint(endData))
s.closeJsonResponse(res,endData)
},res,req)
})
/**
@ -163,22 +149,23 @@ module.exports = function(s,config,lang,app){
var endData = {
ok : true
}
searchQuery = 'SELECT ke,uid,auth,mail,details FROM Users'
queryVals = []
const whereQuery = []
switch(req.params.type){
case'admin':case'administrator':
searchQuery += ' WHERE details NOT LIKE ?'
queryVals.push('%"sub"%')
whereQuery.push(['details','NOT LIKE','%"sub"%'])
break;
case'sub':case'subaccount':
searchQuery += ' WHERE details LIKE ?'
queryVals.push('%"sub"%')
whereQuery.push(['details','LIKE','%"sub"%'])
break;
}
// ' WHERE details NOT LIKE ?'
s.sqlQuery(searchQuery,queryVals,function(err,users) {
s.knexQuery({
action: "select",
columns: "ke,uid,auth,mail,details",
table: "Users",
where: whereQuery
},(err,users) => {
endData.users = users
res.end(s.prettyPrint(endData))
s.closeJsonResponse(res,endData)
})
},res,req)
})
@ -192,7 +179,7 @@ module.exports = function(s,config,lang,app){
}
var form = s.getPostData(req)
if(form){
var currentSuperUserList = jsonfile.readFileSync(s.location.super)
var currentSuperUserList = JSON.parse(fs.readFileSync(s.location.super))
var currentSuperUser = {}
var currentSuperUserPosition = -1
//find this user in current list
@ -230,14 +217,26 @@ module.exports = function(s,config,lang,app){
currentSuperUserList.push(currentSuperUser)
}
//update master list in system
jsonfile.writeFile(s.location.super,currentSuperUserList,{spaces: 2},function(){
try{
if(config.thisIsDocker){
const dockerSuperFile = '/config/super.json'
fs.stat(dockerSuperFile,(err) => {
if(!err){
fs.writeFile(dockerSuperFile,JSON.stringify(currentSuperUserList,null,3),function(){})
}
})
}
}catch(err){
console.log(err)
}
fs.writeFile(s.location.super,JSON.stringify(currentSuperUserList,null,3),function(){
s.tx({f:'save_preferences'},'$')
})
}else{
endData.ok = false
endData.msg = lang.postDataBroken
}
res.end(s.prettyPrint(endData))
s.closeJsonResponse(res,endData)
},res,req)
})
/**
@ -249,7 +248,7 @@ module.exports = function(s,config,lang,app){
ok : false
}
var close = function(){
res.end(s.prettyPrint(endData))
s.closeJsonResponse(res,endData)
}
var isCallbacking = false
var form = s.getPostData(req)
@ -257,7 +256,14 @@ module.exports = function(s,config,lang,app){
if(form.mail !== '' && form.pass !== ''){
if(form.pass === form.password_again || form.pass === form.pass_again){
isCallbacking = true
s.sqlQuery('SELECT * FROM Users WHERE mail=?',[form.mail],function(err,r) {
s.knexQuery({
action: "select",
columns: "*",
table: "Users",
where: [
['mail','=',form.mail]
]
},(err,r) => {
if(r&&r[0]){
//found address already exists
endData.msg = lang['Email address is in use.'];
@ -277,16 +283,17 @@ module.exports = function(s,config,lang,app){
form.details = JSON.stringify(form.details)
}
//write user to db
s.sqlQuery(
'INSERT INTO Users (ke,uid,mail,pass,details) VALUES (?,?,?,?,?)',
[
form.ke,
form.uid,
form.mail,
s.createHash(form.pass),
form.details
]
)
s.knexQuery({
action: "insert",
table: "Users",
insert: {
ke: form.ke,
uid: form.uid,
mail: form.mail,
pass: s.createHash(form.pass),
details: form.details
}
})
s.tx({f:'add_account',details:form.details,ke:form.ke,uid:form.uid,mail:form.mail},'$')
endData.user = Object.assign(form,{})
//init user
@ -315,12 +322,19 @@ module.exports = function(s,config,lang,app){
ok : false
}
var close = function(){
res.end(s.prettyPrint(endData))
s.closeJsonResponse(res,endData)
}
var form = s.getPostData(req)
if(form){
var account = s.getPostData(req,'account')
s.sqlQuery('SELECT * FROM Users WHERE mail=?',[account.mail],function(err,r) {
s.knexQuery({
action: "select",
columns: "*",
table: "Users",
where: [
['mail','=',account.mail]
]
},(err,r) => {
if(r && r[0]){
r = r[0]
var details = JSON.parse(r.details)
@ -337,25 +351,16 @@ module.exports = function(s,config,lang,app){
}
delete(form.password_again);
delete(form.pass_again);
var keys = Object.keys(form)
var set = []
var values = []
keys.forEach(function(v,n){
if(
set === 'ke' ||
!form[v]
){
//skip
return
}
set.push(v+'=?')
if(v === 'details'){
form[v] = s.stringJSON(Object.assign(details,s.parseJSON(form[v])))
}
values.push(form[v])
})
values.push(account.mail)
s.sqlQuery('UPDATE Users SET '+set.join(',')+' WHERE mail=?',values,function(err,r) {
delete(form.ke);
form.details = s.stringJSON(Object.assign(details,s.parseJSON(form.details)))
s.knexQuery({
action: "update",
table: "Users",
update: form,
where: [
['mail','=',account.mail],
]
},(err,r) => {
if(err){
console.log(err)
endData.error = err
@ -388,32 +393,78 @@ module.exports = function(s,config,lang,app){
ok : true
}
var close = function(){
res.end(s.prettyPrint(endData))
s.closeJsonResponse(res,endData)
}
var account = s.getPostData(req,'account')
s.sqlQuery('DELETE FROM Users WHERE uid=? AND ke=? AND mail=?',[account.uid,account.ke,account.mail])
s.sqlQuery('DELETE FROM API WHERE uid=? AND ke=?',[account.uid,account.ke])
s.knexQuery({
action: "delete",
table: "Users",
where: {
ke: account.ke,
uid: account.uid,
mail: account.mail,
}
})
s.knexQuery({
action: "delete",
table: "API",
where: {
ke: account.ke,
uid: account.uid,
}
})
if(s.getPostData(req,'deleteSubAccounts',false) === '1'){
s.sqlQuery('DELETE FROM Users WHERE ke=?',[account.ke])
s.knexQuery({
action: "delete",
table: "Users",
where: {
ke: account.ke,
}
})
}
if(s.getPostData(req,'deleteMonitors',false) == '1'){
s.sqlQuery('SELECT * FROM Monitors WHERE ke=?',[account.ke],function(err,monitors){
s.knexQuery({
action: "select",
columns: "*",
table: "Monitors",
where: {
ke: account.ke,
}
},(err,monitors) => {
if(monitors && monitors[0]){
monitors.forEach(function(monitor){
s.camera('stop',monitor)
})
s.sqlQuery('DELETE FROM Monitors WHERE ke=?',[account.ke])
s.knexQuery({
action: "delete",
table: "Monitors",
where: {
ke: account.ke,
}
})
}
})
}
if(s.getPostData(req,'deleteVideos',false) == '1'){
s.sqlQuery('DELETE FROM Videos WHERE ke=?',[account.ke])
s.knexQuery({
action: "delete",
table: "Videos",
where: {
ke: account.ke,
}
})
fs.chmod(s.dir.videos+account.ke,0o777,function(err){
fs.unlink(s.dir.videos+account.ke,function(err){})
})
}
if(s.getPostData(req,'deleteEvents',false) == '1'){
s.sqlQuery('DELETE FROM Events WHERE ke=?',[account.ke])
s.knexQuery({
action: "delete",
table: "Events",
where: {
ke: account.ke,
}
})
}
s.tx({f:'delete_account',ke:account.ke,uid:account.uid,mail:account.mail},'$')
close()
@ -449,7 +500,11 @@ module.exports = function(s,config,lang,app){
if(tableName){
var tableIsSelected = s.getPostData(req,tableName) == 1
if(tableIsSelected){
s.sqlQuery('SELECT * FROM `' + tableName +'`',[],function(err,dataRows){
s.knexQuery({
action: "select",
columns: "*",
table: tableName
},(err,dataRows) => {
endData.database[tableName] = dataRows
++completedTables
tableExportLoop(callback)
@ -551,26 +606,26 @@ module.exports = function(s,config,lang,app){
])
break;
}
var keysToCheck = []
var valuesToCheck = []
const whereQuery = []
fieldsToCheck.forEach(function(key){
keysToCheck.push(key + '= ?')
valuesToCheck.push(row[key])
whereQuery.push([key,'=',row[key]])
})
s.sqlQuery('SELECT * FROM ' + tableName + ' WHERE ' + keysToCheck.join(' AND '),valuesToCheck,function(err,selected){
s.knexQuery({
action: "select",
columns: "*",
table: tableName,
where: whereQuery
},(err,selected) => {
if(selected && selected[0]){
selected = selected[0]
rowsExistingAlready[tableName].push(selected)
callback()
}else{
var rowKeys = Object.keys(row)
var insertEscapes = []
var insertValues = []
rowKeys.forEach(function(key){
insertEscapes.push('?')
insertValues.push(row[key])
})
s.sqlQuery('INSERT INTO ' + tableName + ' (' + rowKeys.join(',') +') VALUES (' + insertEscapes.join(',') + ')',insertValues,function(){
s.knexQuery({
action: "insert",
table: tableName,
insert: row
},(err) => {
if(!err){
++countOfRowsInserted[tableName]
}
@ -631,7 +686,7 @@ module.exports = function(s,config,lang,app){
ok : true
}
s.checkForStalePurgeLocks()
res.end(s.prettyPrint(endData))
s.closeJsonResponse(res,endData)
},res,req)
})
/**
@ -669,7 +724,7 @@ module.exports = function(s,config,lang,app){
ok : true,
childNodes: childNodesJson,
}
res.end(s.prettyPrint(endData))
s.closeJsonResponse(res,endData)
},res,req)
})
}

View File

@ -4,11 +4,6 @@
"version": "2.0.0",
"description": "CCTV and NVR in Node.js",
"main": "camera.js",
"bin": "camera.js",
"scripts": {
"test": "node camera.js test",
"start": "chmod +x INSTALL/start.sh && INSTALL/start.sh"
},
"repository": {
"type": "git",
"url": "git+https://gitlab.com/Shinobi-Systems/Shinobi.git"
@ -17,51 +12,68 @@
"bugs": {
"url": "https://gitlab.com/Shinobi-Systems/Shinobi/issues"
},
"pkg": {
"assets": [
"libs/**/*",
"libs/**/**/*",
"libs/**/**/**/*",
"libs/**/**/**/**/*",
"languages/*",
"web/*",
"node_modules/ffmpeg-static/*",
"definitions/*"
]
},
"homepage": "https://gitlab.com/Shinobi-Systems/Shinobi#readme",
"dependencies": {
"async": "^3.1.0",
"aws-sdk": "^2.279.1",
"backblaze-b2": "^1.0.4",
"body-parser": "^1.18.3",
"connection-tester": "^0.1.1",
"cws": "^1.0.0",
"discord.js": "^11.3.2",
"aws-sdk": "^2.731.0",
"backblaze-b2": "^1.5.0",
"body-parser": "^1.19.0",
"connection-tester": "^0.2.0",
"cws": "^1.2.11",
"discord.js": "^12.2.0",
"ejs": "^2.5.5",
"express": "^4.16.4",
"ftp-srv": "^4.0.0",
"ftp-srv": "4.3.4",
"http-proxy": "^1.17.0",
"jsonfile": "^3.0.1",
"knex": "^0.19.5",
"ldapauth-fork": "^4.0.2",
"moment": "^2.17.0",
"mp4frag": "^0.0.22",
"mysql": "^2.16.0",
"node-onvif": "^0.1.4",
"knex": "^0.21.4",
"ldapauth-fork": "^4.3.3",
"moment": "^2.27.0",
"mp4frag": "^0.2.0",
"mysql": "^2.18.1",
"node-onvif": "^0.1.7",
"node-ssh": "^5.1.2",
"nodemailer": "^4.0.1",
"pam-diff": "^0.12.1",
"nodemailer": "^6.4.11",
"pam-diff": "^1.0.0",
"path": "^0.12.7",
"pipe2pam": "^0.6.2",
"request": "^2.88.0",
"sat": "^0.7.1",
"shinobi-sound-detection": "^0.1.7",
"shinobi-sound-detection": "^0.1.8",
"smtp-server": "^3.5.0",
"socket.io": "^2.2.0",
"socket.io-client": "^2.2.0",
"socket.io": "^2.3.0",
"socket.io-client": "^2.3.0",
"webdav-fs": "^1.11.0",
"express-fileupload": "^1.1.6-alpha.6"
"express-fileupload": "^1.1.6-alpha.6",
"googleapis": "^39.2.0",
"tree-kill":"1.2.2",
"unzipper":"0.10.11",
"node-fetch":"2.6.0",
"fs-extra": "9.0.1"
},
"devDependencies": {}
"devDependencies": {},
"bin": "camera.js",
"scripts": {
"test": "node camera.js test",
"start": "chmod +x INSTALL/start.sh && INSTALL/start.sh",
"package": "pkg package.json -t linux,macos,win --out-path dist",
"package-x64": "pkg package.json -t linux-x64,macos-x64,win-x64 --out-path dist/x64",
"package-x86": "pkg package.json -t linux-x86,macos-x86,win-x86 --out-path dist/x86",
"package-armv6": "pkg package.json -t linux-armv6,macos-armv6,win-armv6 --out-path dist/armv6",
"package-armv7": "pkg package.json -t linux-armv7,macos-armv7,win-armv7 --out-path dist/armv7",
"package-all": "npm run package && npm run package-x64 && npm run package-x86 && npm run package-armv6 && npm run package-armv7"
},
"pkg": {
"targets": [
"node12"
],
"scripts": [
],
"assets": [
"definitions/*",
"languages/*",
"web/*",
"test/*"
]
}
}

View File

@ -1,50 +0,0 @@
#!/bin/bash
THE_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null && pwd )"
sudo apt update -y
sudo apt-get install libx11-dev -y
sudo apt-get install libpng-dev -y
sudo apt-get install libopenblas-dev -y
echo "----------------------------------------"
echo "-- Installing Dlib Plugin for Shinobi --"
echo "----------------------------------------"
if ! [ -x "$(command -v nvidia-smi)" ]; then
echo "You need to install NVIDIA Drivers to use this."
echo "inside the Shinobi directory run the following :"
echo "sh INSTALL/cuda.sh"
exit 1
else
echo "NVIDIA Drivers found..."
echo "$(nvidia-smi |grep 'Driver Version')"
fi
echo "-----------------------------------"
if [ ! -d "/usr/local/cuda" ]; then
echo "You need to install CUDA Toolkit to use this."
echo "inside the Shinobi directory run the following :"
echo "sh INSTALL/cuda.sh"
exit 1
else
echo "CUDA Toolkit found..."
fi
echo "-----------------------------------"
if [ ! -e "./conf.json" ]; then
echo "Creating conf.json"
sudo cp conf.sample.json conf.json
else
echo "conf.json already exists..."
fi
npm i npm -g
echo "-----------------------------------"
echo "Getting node-gyp to build C++ modules"
npm install node-gyp -g --unsafe-perm
echo "-----------------------------------"
echo "Getting C++ module : face-recognition"
echo "https://gitlab.com/Shinobi-Systems/face-recognition-js-cuda"
npm install --unsafe-perm
npm audit fix --force
cd $THE_DIR
echo "-----------------------------------"
echo "Start the plugin with pm2 like so :"
echo "pm2 start shinobi-dlib.js"
echo "-----------------------------------"
echo "Start the plugin without pm2 :"
echo "node shinobi-dlib.js"

View File

@ -1,9 +0,0 @@
{
"plug":"Dlib",
"host":"localhost",
"port":8080,
"key":"Dlib123123",
"mode":"client",
"type":"detector",
"connectionType":"websocket"
}

View File

@ -1,19 +0,0 @@
{
"name": "shinobi-dlib",
"version": "1.0.0",
"description": "Dlib plugin for Shinobi that uses C++ functions for detection.",
"main": "shinobi-dlib.js",
"dependencies": {
"socket.io-client": "^1.7.4",
"express": "^4.16.2",
"moment": "^2.19.2",
"socket.io": "^2.0.4",
"face-recognition-cuda": "0.9.3"
},
"devDependencies": {},
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"author": "Moe Alam",
"license": "ISC"
}

View File

@ -1,97 +0,0 @@
//
// Shinobi - Dlib Plugin
// Copyright (C) 2016-2025 Moe Alam, moeiscool
//
// # Donate
//
// If you like what I am doing here and want me to continue please consider donating :)
// PayPal : paypal@m03.ca
//
// Base Init >>
var fs = require('fs');
var config = require('./conf.json')
var s
try{
s = require('../pluginBase.js')(__dirname,config)
}catch(err){
console.log(err)
try{
s = require('./pluginBase.js')(__dirname,config)
}catch(err){
console.log(err)
return console.log(config.plug,'Plugin start has failed. pluginBase.js was not found.')
}
}
// Base Init />>
var fr = require('face-recognition-cuda');//modified "binding.gyp" file for "face-recognition" to build dlib with cuda
const detector = fr.FaceDetector()
s.detectObject=function(buffer,d,tx,frameLocation){
var detectStuff = function(frame){
try{
var buffer = fr.loadImage(frame)
var faceRectangles = detector.locateFaces(buffer)
var matrices = []
faceRectangles.forEach(function(v){
var coordinates = [
{"x" : v.rect.left, "y" : v.rect.top},
{"x" : v.rect.right, "y" : v.rect.top},
{"x" : v.rect.right, "y" : v.rect.bottom}
]
var width = Math.sqrt( Math.pow(coordinates[1].x - coordinates[0].x, 2) + Math.pow(coordinates[1].y - coordinates[0].y, 2));
var height = Math.sqrt( Math.pow(coordinates[2].x - coordinates[1].x, 2) + Math.pow(coordinates[2].y - coordinates[1].y, 2))
matrices.push({
x: coordinates[0].x,
y: coordinates[0].y,
width: width,
height: height,
tag: 'UNKNOWN FACE',
confidence: v.confidence,
})
})
if(matrices.length > 0){
tx({
f: 'trigger',
id: d.id,
ke: d.ke,
details:{
plug: config.plug,
name: 'dlib',
reason: 'object',
matrices: matrices,
imgHeight: parseFloat(d.mon.detector_scale_y),
imgWidth: parseFloat(d.mon.detector_scale_x)
}
})
}
fs.unlink(frame,function(){
})
}catch(err){
console.log(err)
}
}
if(frameLocation){
detectStuff(frameLocation)
}else{
d.tmpFile=s.gid(5)+'.jpg'
if(!fs.existsSync(s.dir.streams)){
fs.mkdirSync(s.dir.streams);
}
d.dir=s.dir.streams+d.ke+'/'
if(!fs.existsSync(d.dir)){
fs.mkdirSync(d.dir);
}
d.dir=s.dir.streams+d.ke+'/'+d.id+'/'
if(!fs.existsSync(d.dir)){
fs.mkdirSync(d.dir);
}
fs.writeFile(d.dir+d.tmpFile,buffer,function(err){
if(err) return s.systemLog(err);
try{
detectStuff(d.dir+d.tmpFile)
}catch(error){
console.error('Catch: ' + error);
}
})
}
}

View File

@ -1,2 +0,0 @@
conf.json
data

View File

@ -1,19 +0,0 @@
#!/bin/bash
mkdir data
chmod -R 777 data
wget https://cdn.shinobi.video/weights/dnnCocoData.zip -O dnnCocoData.zip
unzip dnnCocoData.zip -d data
if [ $(dpkg-query -W -f='${Status}' opencv_version 2>/dev/null | grep -c "ok installed") -eq 0 ]; then
echo "Shinobi - Do you want to let the 'opencv4nodejs' npm package install OpenCV?"
echo "Only do this if you do not have OpenCV already or will not use a GPU (Hardware Acceleration)."
echo "(y)es or (N)o"
read nodejsinstall
if [ "$nodejsinstall" = "y" ] || [ "$nodejsinstall" = "Y" ]; then
export OPENCV4NODEJS_DISABLE_AUTOBUILD=0
else
export OPENCV4NODEJS_DISABLE_AUTOBUILD=1
fi
else
export OPENCV4NODEJS_DISABLE_AUTOBUILD=1
fi
npm install opencv4nodejs moment express canvas@1.6 --unsafe-perm

View File

@ -1,9 +0,0 @@
{
"plug":"Coco",
"host":"localhost",
"port":8080,
"hostPort":8082,
"key":"change_this_to_something_very_random____make_sure_to_match__/plugins/opencv/conf.json",
"mode":"client",
"type":"detector"
}

View File

@ -1,83 +0,0 @@
module.exports = [
'background',
'person',
'bicycle',
'car',
'motorcycle',
'airplane',
'bus',
'train',
'truck',
'boat',
'traffic light',
'fire hydrant',
'stop sign',
'parking meter',
'bench',
'bird',
'cat',
'dog',
'horse',
'sheep',
'cow',
'elephant',
'bear',
'zebra',
'giraffe',
'backpack',
'umbrella',
'handbag',
'tie',
'suitcase',
'frisbee',
'skis',
'snowboard',
'sports ball',
'kite',
'baseball bat',
'baseball glove',
'skateboard',
'surfboard',
'tennis racket',
'bottle',
'wine glass',
'cup',
'fork',
'knife',
'spoon',
'bowl',
'banana',
'apple',
'sandwich',
'orange',
'broccoli',
'carrot',
'hot dog',
'pizza',
'donut',
'cake',
'chair',
'couch',
'potted plant',
'bed',
'dining table',
'toilet',
'tv',
'laptop',
'mouse',
'remote',
'keyboard',
'cell phone',
'microwave',
'oven',
'toaster',
'sink',
'refrigerator',
'book',
'clock',
'vase',
'scissors',
'teddy bear',
'hair drier',
'toothbrush'
];

View File

@ -1,94 +0,0 @@
; Specify the path to the runtime data directory
runtime_dir = ${CMAKE_INSTALL_PREFIX}/share/openalpr/runtime_data
ocr_img_size_percent = 1.33333333
state_id_img_size_percent = 2.0
; Calibrating your camera improves detection accuracy in cases where vehicle plates are captured at a steep angle
; Use the openalpr-utils-calibrate utility to calibrate your fixed camera to adjust for an angle
; Once done, update the prewarp config with the values obtained from the tool
prewarp =
; detection will ignore plates that are too large. This is a good efficiency technique to use if the
; plates are going to be a fixed distance away from the camera (e.g., you will never see plates that fill
; up the entire image
max_plate_width_percent = 100
max_plate_height_percent = 100
; detection_iteration_increase is the percentage that the LBP frame increases each iteration.
; It must be greater than 1.0. A value of 1.01 means increase by 1%, 1.10 increases it by 10% each time.
; So a 1% increase would be ~10x slower than 10% to process, but it has a higher chance of landing
; directly on the plate and getting a strong detection
detection_iteration_increase = 1.1
; The minimum detection strength determines how sure the detection algorithm must be before signaling that
; a plate region exists. Technically this corresponds to LBP nearest neighbors (e.g., how many detections
; are clustered around the same area). For example, 2 = very lenient, 9 = very strict.
detection_strictness = 3
; The detection doesn't necessarily need an extremely high resolution image in order to detect plates
; Using a smaller input image should still find the plates and will do it faster
; Tweaking the max_detection_input values will resize the input image if it is larger than these sizes
; max_detection_input_width/height are specified in pixels
max_detection_input_width = 1280
max_detection_input_height = 720
; detector is the technique used to find license plate regions in an image. Value can be set to
; lbpcpu - default LBP-based detector uses the system CPU
; lbpgpu - LBP-based detector that uses Nvidia GPU to increase recognition speed.
; lbpopencl - LBP-based detector that uses OpenCL GPU to increase recognition speed. Requires OpenCV 3.0
; morphcpu - Experimental detector that detects white rectangles in an image. Does not require training.
detector = lbpgpu
; If set to true, all results must match a postprocess text pattern if a pattern is available.
; If not, the result is disqualified.
must_match_pattern = 0
; Bypasses plate detection. If this is set to 1, the library assumes that each region provided is a likely plate area.
skip_detection = 0
; Specifies the full path to an image file that constrains the detection area. Only the plate regions allowed through the mask
; will be analyzed. The mask image must match the resolution of your image to be analyzed. The mask is black and white.
; Black areas will be ignored, white areas will be searched. An empty value means no mask (scan the entire image)
detection_mask_image =
; OpenALPR can scan the same image multiple times with different randomization. Setting this to a value larger than
; 1 may increase accuracy, but will increase processing time linearly (e.g., analysis_count = 3 is 3x slower)
analysis_count = 1
; OpenALPR detects high-contrast plate crops and uses an alternative edge detection technique. Setting this to 0.0
; would classify ALL images as high-contrast, setting it to 1.0 would classify no images as high-contrast.
contrast_detection_threshold = 0.3
max_plate_angle_degrees = 15
ocr_min_font_point = 6
; Minimum OCR confidence percent to consider.
postprocess_min_confidence = 65
; Any OCR character lower than this will also add an equally likely
; chance that the character is incorrect and will be skipped. Value is a confidence percent
postprocess_confidence_skip_level = 80
debug_general = 0
debug_timing = 0
debug_detector = 0
debug_prewarp = 0
debug_state_id = 0
debug_plate_lines = 0
debug_plate_corners = 0
debug_char_segment = 0
debug_char_analysis = 0
debug_color_filter = 0
debug_ocr = 0
debug_postprocess = 0
debug_show_images = 0
debug_pause_on_frame = 0

Some files were not shown because too many files have changed in this diff