#255 Node-Red, InfluxDB, and Grafana Tutorial on a Raspberry Pi

Captions
Grafana, InfluxDB, and Node-RED on a Raspberry Pi form a dream team for the visualization of IoT data. The setup is not easy, but this is what we will do together.

Grüezi YouTubers, here is the guy with the Swiss accent with a new episode and fresh ideas around sensors and microcontrollers. Viewers might remember my video about the visualization of weather data in Node-RED. I called it "porn" because it was too much; back then I promised to show you a simpler method. So today we will install everything necessary on a Raspberry Pi using Peter Scargill's newest script. We will compare time series databases like InfluxDB with SQL databases and see their advantages for IoT. We will create an InfluxDB database and connect it to a Node-RED flow to read data from my weather station. We will create a Grafana data source to get access to the InfluxDB database. We will develop a dashboard with several panels showing the different weather data. We will see the power of a time series database combined with Grafana, and in the end you can download a ready-made SD card for your Raspberry Pi. Based on what you learn today, you should be able to take the example and create a flow in Node-RED to store sensor data in InfluxDB and display it from there on a Grafana dashboard. Very important: the data is only stored in InfluxDB; Node-RED does only the conditioning, and Grafana only the display of the data. A lot to do, let's start.

As usual, first we have to copy Raspbian to an SD card. I use standard Raspbian without any additional software; later Peter's script will install what is needed. You should also be able to use Raspbian Lite if you do not want a desktop. Then we start our new Raspbian, do all the required configuration, and the standard update and upgrade to get everything to the latest status. Please do not forget to enable SSH if you want to connect from a remote PC to your Raspberry, and to choose your timezone. You find a link in the description on how to set up your Raspbian. Now you can follow Peter's instructions to get his script and run it. In Peter's script you can select the programs you want to install; here you see my choice. Do not choose Node-RED twice; I picked this one to install the newest version. Because I had some issues, I had to manually install "moment"; I hope this will no longer be necessary in the future.

As a last installation step we have to do some admin stuff for InfluxDB. First we edit influxdb.conf using sudo nano. The only two lines we have to change are the HTTP "enabled" setting to true and the bind address to the standard port. Then we have to create an admin user. Node-RED, InfluxDB, and Grafana should now work. ifconfig shows the address of my Raspberry: 192.168.0.12. We can test the installation by connecting with the browser to Node-RED on port 1880 and to Grafana on port 3000; both use admin/admin as credentials. In Node-RED you should already find many nodes installed, also the node to connect to InfluxDB. Unfortunately, during my install Peter's script had some issues and I had to install the InfluxDB node from the palette. I hope this will be solved in the future.

But why do we use InfluxDB and not SQLite as in previous projects? What is the difference between the two? SQL databases like SQLite are very flexible and the workhorse of many business applications. They are organized in rows of data like time, temperature, humidity, wind speed, etc. If we want to store our sensor data in SQLite, we have to create a database, here called "weather", a table called "stations", and then add all needed fields including their data types. Not a big deal. Then we can insert the sensor data line by line.
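To make the comparison concrete, here is a minimal SQL sketch of the SQLite approach described above; the table and column names follow the example in the video, but the exact column set, data types, and the sample row are assumptions:

```sql
-- SQLite: the schema must be declared up front, one column per sensor value
CREATE TABLE stations (
    time      TEXT,   -- the timestamp has to be stored and managed by us
    sensor    TEXT,   -- name of the weather station
    temp      REAL,
    humidity  REAL,
    wind      REAL
);

-- every new reading is inserted as one row
INSERT INTO stations (time, sensor, temp, humidity, wind)
VALUES ('2019-02-10 12:00:00', 'DKW2012', 4.2, 81.0, 3.6);
```

In InfluxDB, as explained next, no such schema is needed: the fields and their types are created on the first write, and the timestamp is added automatically.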
InfluxDB, on the other hand, is a time series database, a relatively new sort of database. They have a similar structure to SQLite; the difference is that a timestamp is automatically added every time we insert data, and we do not need to define the fields as we did before in SQLite. As we will see, they are added as soon as we insert data into the database. So it is a little easier to work with a time series database. But this is not the most important difference. Much more important is the reading of the data stored in the database, and how these new databases deal with historical data.

What are typical questions we have about our sensor data? We are interested in, for example, all data for the last hour or the last 24 hours, or we want average temperatures for all days of the previous year. What is common to all these questions? Time slots are always one part of the question. Time series databases are optimized for such queries. Of course, you can also create standard SQL queries to answer those questions, but this involves a lot of work and you have to write all the different queries yourself. In contrast, look at the out-of-the-box queries here, each with one line of code. Precisely the stuff for non-programmers and for guys like me: we like programming, but we love results.

The second difference is how databases deal with historical data. My weather station, for example, creates data every few minutes. This real-time data is interesting for, let's say, one week; later on I am more interested in hourly, daily, or even weekly data. Standard SQL databases store and retain all data ever written to them, and to get the data for last year they have to read all sensor data and create averages when you issue a query. This can create a heavy load and is slow, especially on machines like a Raspberry Pi with limited memory. Time series databases offer so-called retention policies, where you can decide how long you want to keep the real-time data and how the data should be averaged and compressed after that. You could, for example, decide you want to keep all sensor data for a week; after that week you only keep hourly averages for a year, and after this year only daily averages. You can imagine how much the database size will be reduced with such a retention policy, and how fast a query runs against this reduced data set. The retention policy is only created once, and then the database does it automatically. This is why these time series databases are usually preferred for IoT scenarios, especially on small machines like a Raspberry Pi.
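As an illustration, here is a hedged InfluxQL sketch of such a time-slot query and of the retention policy idea described above; the database, measurement, and field names follow this video's example, while the policy names and exact durations are just assumptions:

```sql
-- one line gives hourly temperature averages for the last 24 hours
SELECT MEAN("temp") FROM "stations" WHERE time > now() - 24h GROUP BY time(1h)

-- keep the raw sensor data for one week only (policy names and durations are illustrative)
CREATE RETENTION POLICY "one_week" ON "weather" DURATION 7d REPLICATION 1 DEFAULT

-- keep hourly averages for a year in a second policy,
-- filled automatically by a continuous query
CREATE RETENTION POLICY "one_year" ON "weather" DURATION 52w REPLICATION 1
CREATE CONTINUOUS QUERY "cq_hourly" ON "weather"
BEGIN
  SELECT MEAN(*) INTO "weather"."one_year"."stations_hourly"
  FROM "stations" GROUP BY time(1h), *
END
```

Once such a policy and continuous query are in place, the database downsamples and expires the data on its own, which is exactly the behaviour described above.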
By the way, why do I use Node-RED, InfluxDB, and Grafana? There are other software packages to choose from. The first reason is cost: all three packages are open source and free to use for makers. The next reason is support: most of us use Node-RED anyway in one or the other way for our projects, and at the moment the combination of InfluxDB and Grafana is on everyone's lips, so you get decent community support. And it is supported by Peter's script, also an advantage.

Let's now look at how we get our data from the sensor into the database. I use the same flow as in video #242. The flow parses an MQTT message from RFLink and creates the message object shown here. This object contains all sensor data like temperature, humidity, and wind speed; it also includes the name of the sensor. Because most of you do not have the same weather station, I added an inject node so that you can simulate my weather station. Like that, you can play with this setup even if you do not have sensors attached.

At the end of this flow we use an ordinary function block to create the needed message for the InfluxDB connector. The fields in InfluxDB are mapped from each variable, named temp, wind, etc.; the variables are the ones created by the flow from the sensor. The next node inserts these fields into InfluxDB. Because Node-RED and InfluxDB run on the same machine, we can use 127.0.0.1 or localhost as the address, and port 8086 as defined earlier in the setup of InfluxDB. We call our database "weather", use the credentials of our "pi" user created before, and we use the measurement "stations". Done. Of course, we now get errors because no database exists; this is what we have to do next.

It is quite simple: we log in to our Raspberry and open the influx client to the database. Then we create our database. Now Node-RED does not create any errors anymore. If we look at the measurements in our database, we see that Node-RED automatically created the measurement "stations", and if we ask for the fields, we get the automatically created fields including their data types, like string or float. Everything done automatically. And if we query "stations", we already see field data from our weather station. Cool.

Now we are ready to fire up Grafana on its port 3000. We add a new data source and fill in these parameters: a name and the type, the default address, and the basic authentication for InfluxDB, then the name of the database we created before and the credentials again. Now we are ready to test the connection. Next we can go back and start to create our first dashboard. We call it "weather". Of course it is empty till we add a panel. The first is the outside temperature, and we edit the panel data. The source is our only measurement, called "stations", and because our weather station has the name DKW2012, we have to put this information into the WHERE clause; we do not want to read data from other sensors like the XT200 or the Auriol. SQL lovers can look at the query inspector to see what happens. Now we select the temp field and select "mean" to get an average value. We select "group by time" and choose 5 minutes as the time to average. We give it the name "roof temperature" and go on to define the axes: for the y-axis we choose degrees Celsius and a label. Next we add a label with the current temperature; like that, we always see the last value. And we can remove the filling to get better readability of the curves; this is especially important if you want to display more than one curve in a panel. Done, our first panel is finished and can be named and saved. The wind panel has two curves, one for the wind speed and one for the gusts; you create the first measurement and copy it for the second to save time. The last number on the dashboard is the wind direction: it is a gauge, not a line graph. After a day or so the dashboard looks like that. We can now freely select the time scale to display the details, and even choose a particular period with the mouse.
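For reference, this is roughly the kind of InfluxQL statement the query inspector shows for the outside-temperature panel described above; the tag key "sensor" is an assumption (the video only mentions filtering on the station name DKW2012), and $timeFilter is the placeholder Grafana substitutes for the selected dashboard time range:

```sql
-- 5-minute averages of the temp field, restricted to the DKW2012 station
-- and to the time range currently selected in the dashboard
SELECT MEAN("temp") AS "roof temperature"
FROM "stations"
WHERE "sensor" = 'DKW2012' AND $timeFilter
GROUP BY time(5m)
```

The other panels work the same way; only the field (wind, gust, direction) and the visualization type change.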
To get all this running was not a simple task and took some time. This is why you can download a copy of my SD card image and, using Win32DiskImager, create your card for your Raspberry Pi 3. The users and passwords are pi/raspberry, admin/admin, and user/user. I used an 8 gigabyte card; a 16 gigabyte card should be OK, maybe even an 8 gigabyte one. Of course, you can do much more with Grafana and also define retention strategies with InfluxDB. With this first installation you have a working example where you can do your tests and of course also add your own sensors in Node-RED, as I did with my YouTube statistics, where I created a whole new Node-RED flow, a different database, and a new dashboard.

Summarized: we installed the needed software using Peter Scargill's newest script; we know the difference between time series databases and SQL databases; we created an InfluxDB database and connected it to Node-RED to read data from my weather station; we connected Grafana to InfluxDB; we developed a dashboard with several panels showing the weather data; we played around to see the power of a time series database and Grafana; and you got a ready-made SD card to easily start your experiments.

I want to especially thank Peter and Antonio for their work on the script, and also my supporters on Patreon and viewers using my links for their purchases for supporting the channel. Without you it would be difficult for me to do what I do. Bye.
Info
Channel: Andreas Spiess
Views: 179,390
Rating: 4.9647207 out of 5
Keywords: arduino, arduino project, beginners, diy, do-it-yourself, eevblog, electronics, esp32 project, esp8266, esp8266 project, greatscott, guide, hack, hobby, how to, iot, nodemcu, project, simple, smart home, ttgo, wifi, Grafana, influxdb tutorial, influxDB, Node-red, node-red tutorial, node-red influxdb grafana, influxdb grafana, influxdb grafana tutorial, raspbery pi, IOT visualization, weather station, raspbian, influxdata, influxdata tutorial, influxdb time series, time series database, spiess
Id: JdV4x925au0
Length: 16min 31sec (991 seconds)
Published: Sun Feb 10 2019