Half-baked guide to automation
I see a lot of nice turn-key automation setups, along with some really slick custom ones.
I have neither. I do, however, have an embedded PC, some I2C sensors, and the will to make do with what I have.
Reading a sensor can be achieved through the Linux sysfs tree without any special software:
#!/bin/bash
function readsensor {
    local loghost="10.10.10.10"
    local logport="420"
    local facility="local0.info"
    ## The I2C address for a BMP280 is either 0x76 or 0x77
    for x in 76 77; do
        local device="/sys/class/i2c-adapter/i2c-2/2-00${x}/iio:device1"
        ## Skip addresses with no sensor attached
        [ -d "${device}" ] || continue
        local sensor=$(cat "${device}/name")
        ## If you're going to get fancy with sed, make notes about what it is you wrote:
        ## this keeps the first four digits and puts a decimal point in the middle,
        ## turning millidegrees like "26220" into "26.22"
        local temp=$(sed -e 's/\(..\)\(..\).*/\1.\2/' "${device}/in_temp_input")
        local address=$x
        logger -n "${loghost}" -P "${logport}" -p "${facility}" -t "${sensor}-${address}" "${temp}"
    done
}
readsensor
#EOF
Automating sensor readings is as simple as adding such a bash script to your crontab:
* * * * * /usr/local/bin/logtemp.sh > /dev/null
But where does this data go, and how is it stored? A fancy, resource-intensive SQL database? Absolutely not.
Let the system logger deal with it. I just use the built-in syslog software, which is already running in the background on 99% of Linux distros. Log rotation, compression, and backup are all handled by the logger already. No forgotten SQL credentials, no intricate backup procedures, no worries.
In my case I just drop in a custom rsyslog configuration to handle local and UDP logging:
# /etc/rsyslog.d/10-temperature.conf
input(type="imudp" port="420" ruleset="templogData")
ruleset(name="templogData") {
    action(type="omfile" dynaFile="templogTemplate" template="templogFormat")
    action(type="omfile" file="/dev/tty2" template="ttylogFormat")
    stop
}
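The config above references three templates I haven't shown. As a rough sketch, definitions could look something like this; the template names come from the config, but the file path and format strings here are my own guesses, so adjust to taste:

```
# Hypothetical template definitions to pair with the config above
# templogTemplate builds a per-host, per-program log file path
template(name="templogTemplate" type="string"
         string="/var/log/remotelogs/%hostname%-%programname%.log")
# templogFormat writes an RFC 3339 timestamp, host, tag, and message
template(name="templogFormat" type="string"
         string="%timegenerated:::date-rfc3339% %hostname% %programname% %msg%\n")
# ttylogFormat is a shorter line for the console display
template(name="ttylogFormat" type="string"
         string="%timegenerated% %programname% %msg%\n")
```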
Logging is not limited to local devices either. With UDP logging I can send data from a battery-powered ESP-01 with a BMP280 over the network for months on a single charge. For robustness I don't even bother sending timestamps with environmental data; I use the receive time on the logging machine as the timestamp. This also cuts reliance on local or external NTP services.
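The wire format is simple enough to poke at from any Linux box, no microcontroller required. Bash can emit a raw syslog datagram through its /dev/udp pseudo-device; the host and port are from my setup above, and the sample reading is made up:

```shell
#!/bin/bash
# Send a fake BMP280 reading to the remote logger as a raw syslog datagram.
# <134> is the syslog priority value for facility local0, severity info.
loghost="10.10.10.10"
logport="420"
echo "<134>BMP280-76: 26.22 68074.22" > "/dev/udp/${loghost}/${logport}"
```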
What to do with all this data? Here's some remote log data:
2021-04-04T06:30:26.972864-04:00 logger BMP280 26.22 68074.22
2021-04-04T06:30:31.972951-04:00 logger BMP280 26.22 68074.22
How about the lowest value? The highest?
cat remotelogs/esp_183da3.local.domain-BMP280.log-20210405 | cut -d' ' -f4 | sort -nu | head -n1
23.73
cat remotelogs/esp_183da3.local.domain-BMP280.log-20210405 | cut -d' ' -f4 | sort -nu | tail -n1
26.22
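If one pass over the file should produce several statistics at once, awk does the whole job. This sketch assumes the same log layout as above, with temperature in the fourth space-separated field:

```shell
# One pass over the log: minimum, maximum, and average of field 4
awk '
    { t = $4 + 0; sum += t }      # coerce to a number, accumulate for the average
    NR == 1 { min = t; max = t }  # seed min/max from the first line
    t < min { min = t }
    t > max { max = t }
    END { printf "min %.2f max %.2f avg %.2f\n", min, max, sum / NR }
' remotelogs/esp_183da3.local.domain-BMP280.log-20210405
```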
Certainly you can use your imagination in this department. Data can be parsed in nearly limitless ways, and periodic cron jobs can be used for analysis and email alerts.
Graphing can also be automated with tools like gnuplot, either to a fixed dashboard display or by periodically pumping out SVG or PNG graphs for display on a web page.
Please note, when it comes to graphing I will never be satisfied. This type of flexibility opens a Pandora's box of possibilities and can consume a lot of valuable time. Do you really need gradients and transparent historical data behind your plot line? I wish I could share some gnuplot scripts with you here, but it feels like an unfinished masterpiece.
Couldn't you just use MQTT for your IoT setup? Sure, if you want to install, maintain, and secure a bunch of extra network services. There are also third-party IoT MQTT services one can use if you have absolutely no regard for sending private data out on the internet, and are 100% confident that you will never suffer network connectivity issues. I realize this is a major technology implemented in industrial automation systems, chat programs, and more, but the protocol and tools just seem so long-winded to me. Half the time you have to roll your own publisher or subscriber daemons anyway; it's just another middleman.