Monitoring station preparation
In this blog entry we will focus on preparing our monitoring station.
In the previous part we reinstalled the LattePanda with Ubuntu Linux 16.04; now it's time for some customization.
First of all, let's change the hostname. It is stored in /etc/hostname and can be reloaded at runtime using the following command:
# hostname -F /etc/hostname
The new hostname should also be added to the /etc/hosts file (using the loopback or an external address) to prevent name resolution errors in various components.
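For example (the hostname below matches the one visible in the command prompts later in this post; a 127.0.1.1 entry is the usual Debian/Ubuntu convention):

```
# /etc/hosts - map the new hostname to a loopback address
127.0.0.1   localhost
127.0.1.1   e14-waterproof-challenger
```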
The default Clonezilla Ubuntu image (as provided by the manufacturer) uses the English language, but with Chinese locales - for example, the date is printed like this:
root@e14-waterproof-challenger:~# date
2023年 04月 25日 星期二 15:39:03 CST
Investigating further, only the language is set to en_US.UTF-8; the rest of the locale parameters are Chinese:
root@e14-waterproof-challenger:~# locale
LANG=en_US.UTF-8
LANGUAGE=
LC_CTYPE="en_US.UTF-8"
LC_NUMERIC=zh_CN.UTF-8
LC_TIME=zh_CN.UTF-8
LC_COLLATE="en_US.UTF-8"
LC_MONETARY=zh_CN.UTF-8
LC_MESSAGES="en_US.UTF-8"
LC_PAPER=zh_CN.UTF-8
LC_NAME=zh_CN.UTF-8
LC_ADDRESS=zh_CN.UTF-8
LC_TELEPHONE=zh_CN.UTF-8
LC_MEASUREMENT=zh_CN.UTF-8
LC_IDENTIFICATION=zh_CN.UTF-8
LC_ALL=
Using the official Debian reconfiguration procedure:
https://wiki.debian.org/Locale
is not sufficient in this case - it either ends with an error (when the zh_CN locales are not generated) or fails to reconfigure the zh_CN entries listed above.
It turns out that the file /etc/default/locale was modified manually to allow for a mixed locale configuration (English language, but with Chinese settings). Commenting out all the LC_ entries in this file, like this:
# File generated by update-locale
LANG=en_US.UTF-8
#LC_NUMERIC="zh_CN.UTF-8"
#LC_TIME="zh_CN.UTF-8"
#LC_MONETARY="zh_CN.UTF-8"
#LC_PAPER="zh_CN.UTF-8"
#LC_NAME="zh_CN.UTF-8"
#LC_ADDRESS="zh_CN.UTF-8"
#LC_TELEPHONE="zh_CN.UTF-8"
#LC_MEASUREMENT="zh_CN.UTF-8"
#LC_IDENTIFICATION="zh_CN.UTF-8"
then running
# dpkg-reconfigure locales
allows the locale to be switched correctly:
root@e14-waterproof-challenger:~# locale
LANG=en_US.UTF-8
LANGUAGE=
LC_CTYPE="en_US.UTF-8"
LC_NUMERIC="en_US.UTF-8"
LC_TIME="en_US.UTF-8"
LC_COLLATE="en_US.UTF-8"
LC_MONETARY="en_US.UTF-8"
LC_MESSAGES="en_US.UTF-8"
LC_PAPER="en_US.UTF-8"
LC_NAME="en_US.UTF-8"
LC_ADDRESS="en_US.UTF-8"
LC_TELEPHONE="en_US.UTF-8"
LC_MEASUREMENT="en_US.UTF-8"
LC_IDENTIFICATION="en_US.UTF-8"
LC_ALL=
Next point - changing the timezone. After installation it defaults to:
root@e14-waterproof-challenger:~# timedatectl
Local time: Tue 2023-04-25 16:31:45 CST
Universal time: Tue 2023-04-25 08:31:45 UTC
RTC time: Tue 2023-04-25 08:31:45
Time zone: Asia/Chongqing (CST, +0800)
Network time on: yes
NTP synchronized: yes
RTC in local TZ: no
so it probably isn't the zone we are in. Let's find a more suitable one:
root@e14-waterproof-challenger:~# timedatectl list-timezones
Africa/Abidjan
Africa/Accra
Africa/Addis_Ababa
[...]
then make a switch:
root@e14-waterproof-challenger:~# timedatectl set-timezone Europe/Warsaw
As can be seen from the initial timedatectl output, NTP synchronization is already enabled, so our monitoring will get correct timestamps.
Installing the InfluxDB database and the Grafana data presentation software
An initial remark - this is a traditional, on-premises approach. Every element is connected directly, without any additional intermediaries or cloud infrastructure - so everything can be accessed locally (inside the LAN), and external access should be configured as needed. The positive side? The proof of concept was installed on an OrangePi Zero with 256 MB of RAM and works correctly to this day (it has been storing and presenting data from a few sensors for several weeks now).
Another, probably more modern approach was presented (for example) here:
InfluxDB
Let's begin with some theory - what is a time-series database (like InfluxDB)? It is a database optimized for storing time-stamped data (like sensor readings). Every record is referenced by its timestamp and usually stores one or more values plus some additional information. In InfluxDB these are:
- measurement - can be interpreted as a table in a relational database; it groups together data of a similar type,
- field - like a column in an RDBMS; a key-value pair that stores one data value,
- tag (set) - a list of optional tags that can be used, for example, to differentiate data from different sources. As tag keys are indexed, they can be used to accelerate queries.
And the beautiful part? Data can be stored in the database using simple HTTP POSTs, as described here:
https://docs.influxdata.com/influxdb/v1.8/write_protocols/line_protocol_tutorial/
so interfacing even with low-powered MCUs is very simple.
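A minimal sketch of such a write, assuming the field and tag names used later in this post (the database "tests" and the local InfluxDB instance are created further below, so the POST itself is left commented out here):

```shell
# A hypothetical line-protocol point: measurement "environment",
# one tag ("sensor") and four fields.
POINT='environment,sensor=METAR Temperature=9,Pressure=1005,Humidity=93,DewPoint=8'
echo "$POINT"
# Writing it is a single HTTP POST (run once InfluxDB is listening on :8086):
#   curl -XPOST 'http://localhost:8086/write?db=tests' --data-binary "$POINT"
```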
Enough of this theory, let's install something.
As the versions provided in the default Ubuntu repository are archaic, we will install from the dedicated repositories:
https://docs.influxdata.com/influxdb/v1.8/introduction/install/
$ wget -q https://repos.influxdata.com/influxdata-archive_compat.key
$ echo '393e8779c89ac8d958f81f942f9ad7fb82a25e133faddaf92e15b16e6ac9ce4c influxdata-archive_compat.key' | sha256sum -c && cat influxdata-archive_compat.key | gpg --dearmor | sudo tee /etc/apt/trusted.gpg.d/influxdata-archive_compat.gpg > /dev/null
$ echo 'deb [signed-by=/etc/apt/trusted.gpg.d/influxdata-archive_compat.gpg] https://repos.influxdata.com/debian stable main' | sudo tee /etc/apt/sources.list.d/influxdata.list
root@e14-waterproof-challenger:~# apt-get install influxdb
Grafana
Grafana will be installed from a downloaded package - there are some certificate problems with current Grafana repository that prevent correct operation on Ubuntu 16.04.
wget https://dl.grafana.com/oss/release/grafana_9.4.7_amd64.deb
dpkg -i ./grafana_9.4.7_amd64.deb
systemctl daemon-reload
systemctl enable grafana-server
systemctl start grafana-server
And let's see what happened:
http://panda_address:3000
(initially log in using admin:admin)
Data gathering script and presentation layer configuration
METAR client
Before we configure the presentation layer, let's get some real data. As a reference (because of sensor tolerances) I have chosen weather data obtained from METAR reports. METAR reports are freely available (in contrast to other weather data sources, which are usually paid-for or dedicated to one kind of application, with their authors actively preventing other forms of usage), but usually only for places with a nearby airport or weather observation station.
So - let's install standard METAR client tool:
root@e14-waterproof-challenger:~# apt-get install metar
A quick check - and a failure:
root@e14-waterproof-challenger:~# metar -d ehgr
METAR pattern not found in NOAA data.
It turns out that Ubuntu 16.04 ships with an outdated version of the metar utility - it queries using HTTP instead of HTTPS (a method discontinued by the provider - the National Oceanic and Atmospheric Administration). We could upgrade Ubuntu, build a newer version of the metar utility from source, or learn how to parse METAR reports ourselves.
We will go the last route. The first thing we need to know is the code of a nearby airport. Using for example
https://airportcodes.aero/search
we can find the ICAO code of - for example - Warsaw Chopin Airport (which is EPWA).
Then, using the URL from our failed metar execution, we can download a report:
root@e14-waterproof-challenger:~# wget http://tgftp.nws.noaa.gov/data/observations/metar/stations/EPWA.TXT
URL transformed to HTTPS due to an HSTS policy
--2023-04-25 11:50:17-- https://tgftp.nws.noaa.gov/data/observations/metar/stations/EPWA.TXT
Resolving tgftp.nws.noaa.gov (tgftp.nws.noaa.gov)... 140.172.138.79
Connecting to tgftp.nws.noaa.gov (tgftp.nws.noaa.gov)|140.172.138.79|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 79 [text/plain]
Saving to: ‘EPWA.TXT’
EPWA.TXT 100%[==========================================================>] 79 --.-KB/s in 0s
2023-04-25 11:50:18 (2.59 MB/s) - ‘EPWA.TXT’ saved [79/79]
As can be seen, wget correctly upgrades the connection to HTTPS (following the HSTS policy) when plain HTTP is no longer available.
Now let's see our report:
root@e14-waterproof-challenger:~# cat EPWA.TXT
2023/04/25 09:30
EPWA 250930Z 30011KT 9999 -RA BKN012 09/07 Q1005 BECMG BKN020
We can decode it manually using data from
https://metar-taf.com/explanation
Interesting fields are:
- 250930Z (day of month 25, time 09:30 UTC)
- 09/07 (temperature of 9 deg. C with a dew point of 7 deg. C)
- Q1005 (air pressure of 1005 hPa)
So - let's create a simple script to decode those values and store them in our database ("Go away or I will replace you with a simple shell script", says the angry sysadmin to a persistent user).
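A minimal sketch of what such a script could look like (the parsing steps and the humidity formula are my assumptions, not the original script). For clarity it parses the sample report shown above; the real script would first fetch the current report, e.g. with wget -qO- from the NOAA URL used earlier:

```shell
#!/bin/sh
# Sample METAR report (normally fetched from tgftp.nws.noaa.gov).
REPORT='EPWA 250930Z 30011KT 9999 -RA BKN012 09/07 Q1005 BECMG BKN020'

# Temperature/dew-point group ("09/07"; an "M" prefix marks negative values).
TD=$(echo "$REPORT" | grep -oE 'M?[0-9]{2}/M?[0-9]{2}' | head -n1)
TEMP=$(echo "$TD" | cut -d/ -f1 | sed 's/M/-/' | awk '{ print int($0) }')
DEW=$(echo "$TD" | cut -d/ -f2 | sed 's/M/-/' | awk '{ print int($0) }')

# Pressure group ("Q1005" - QNH in hPa).
PRES=$(echo "$REPORT" | grep -oE 'Q[0-9]{4}' | head -n1 | tr -d Q)

# Relative humidity approximated from temperature and dew point using the
# Magnus formula (an assumption - METAR has no direct humidity field).
HUM=$(awk -v t="$TEMP" -v d="$DEW" 'BEGIN {
    printf "%d", 100 * exp(17.625*d/(243.04+d)) / exp(17.625*t/(243.04+t)) }')

# Assemble a line-protocol point and print it; storing it is one HTTP POST:
#   curl -XPOST 'http://localhost:8086/write?db=tests' --data-binary "$LINE"
LINE="environment,sensor=METAR Temperature=$TEMP,DewPoint=$DEW,Pressure=$PRES,Humidity=$HUM"
echo "$LINE"
```

Note that the humidity this approximation yields will not match the METAR station's own reading exactly; for our reference purposes that is good enough.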
To execute it, we first need to create an InfluxDB database:
root@e14-waterproof-challenger:~# influx
Connected to http://localhost:8086 version 1.8.10
InfluxDB shell version: 1.8.10
> create database tests
> exit
Then run our script and check whether the data is present in the database:
root@e14-waterproof-challenger:~# influx --database tests
Connected to http://localhost:8086 version 1.8.10
InfluxDB shell version: 1.8.10
> select * from environment
name: environment
-----------------
time DewPoint Humidity Pressure Temperature sensor
1682418600000000000 8 93 1005 9 METAR
> exit
Data visualisation
Now that we have our first data, let's graph it. We will need to create a datasource for the local InfluxDB instance, then create some graphs. Graphs are created using panels and grouped into dashboards.
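A typical panel query for such a datasource could look like this (a hypothetical InfluxQL sketch against the "environment" measurement created above; the $timeFilter and $__interval macros are substituted by Grafana itself):

```
SELECT mean("Temperature")
FROM "environment"
WHERE ("sensor" = 'METAR') AND $timeFilter
GROUP BY time($__interval) fill(null)
```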
And our first graphs will look like this:
There are many customization options, but they will be used as needed in the later project stages.
Now we have created a simple weather station gathering data from METAR reports for a nearby airport.
In the next part we will focus on creating sensors that will send data to our monitoring station.