In this project I show how to use a web server on a Raspberry Pi, together with an RTL-SDR software defined radio dongle, to display weather data from an old wireless weather station. The system uses rtl_433 to decode the weather station data, and Python code based on Dash graphs the weather data in a browser. The code is tested with a Fine Offset Electronics WH1080/WH3080 compatible weather station (Alecto WS-4000).


I have an old weather station running at home, consisting of some outdoor sensors and an indoor display unit. Communication is wireless at 868 MHz. Here is a picture of the unit.

The display nicely shows the current weather data, but it is difficult to view historical data or to graph trends. There are (or were) some options using now-outdated Windows or macOS software, but that is no longer a very useful setup, and it also requires a USB connection to the display unit. Recently I came across a very nice and useful piece of software called rtl_433. It is a generic data receiver, mainly for the 433.92 MHz, 868 MHz (SRD), 315 MHz, 345 MHz, and 915 MHz ISM bands, based on the well-known RTL-SDR dongle and capable of decoding all kinds of FSK protocols, including my weather station's. So I decided to use rtl_433 to receive the wireless data from my weather station sensors and write some code to graph it in a web page.

The RTL-SDR dongle in the home automation Raspberry Pi. The case of the dongle is removed, as it was too big to use in combination with other USB devices.

The antenna is on top of the case.


The general setup is simple: data received by rtl_433 is stored in a SQLite database, and a web server extracts the data from the database and displays it in the browser. This is all implemented on a home Raspberry Pi server that is also used for home automation. I'm using a Pi 2, which is less power hungry and produces less heat than the Pi 3 and 4, yet still has enough power for home automation and this weather serving task. When testing rtl_433 it appeared to use 30% of the CPU. Since I prefer to limit power consumption, I decided to run rtl_433 at a 10 minute interval, which gives enough weather data to plot. Crontab is used to run rtl_433 at this interval. The output of rtl_433 is piped to a Python script that writes the received weather records to a SQLite database. A Python script using Dash then serves the graphs to a web browser.

Store rtl_433 decoded weather data to database

First we need to put the rtl_433 command in cron.
sudo crontab -e   
And add:
# weather station logger
0,10,20,30,40,50 * * * *        /usr/local/bin/rtl_433 -p 69 -f 868M -F json -R 155 -T 90 -E quit | /usr/bin/python /home/pi/bin/ >> /var/log/temperature/ws_error.log
The command is executed at minutes 0, 10, 20, 30, 40, and 50 of every hour, every day of the month, every month, and every day of the week (*). Other intervals can be used by adapting this.
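As a side note, standard cron also supports step values, so the same 10 minute schedule can be written more compactly (the command part stays exactly the same as above):

```
# weather station logger, equivalent step-value notation
*/10 * * * *        /usr/local/bin/rtl_433 -p 69 -f 868M -F json -R 155 -T 90 -E quit | ...
```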

The rtl_433 options used are:
  • -p 69 : to compensate the ppm error (tested with rtl_test -p)
  • -f 868M : 868 MHz is the frequency used at the WS-4000
  • -F json : output json format
  • -R 155 : only output protocol 155 (Fine Offset Electronics WH1080/WH3080 Weather Station (FSK))
  • -T 90 : timeout if nothing received in 90 seconds
  • -E quit : quit command after successful event
The -E quit option causes rtl_433 to exit after the first successful decode, so every 10 minutes we get just one weather data record in JSON format, except when nothing is received within the 90 second timeout. Here is an example output from the rtl_433 command:
rtl_433 version 21.05-125-g73a2edb0 branch master at 202111080848 inputs file rtl_tcp RTL-SDR
Use -h for usage help and see for documentation.
Trying conf file at "rtl_433.conf"...
Trying conf file at "/home/pi/.config/rtl_433/rtl_433.conf"...
Trying conf file at "/usr/local/etc/rtl_433/rtl_433.conf"...
Trying conf file at "/etc/rtl_433/rtl_433.conf"...

New defaults active, use "-Y classic -s 250k" for the old defaults!

Registered 1 out of 200 device decoding protocols [ 155 ]
Found Rafael Micro R820T tuner
Exact sample rate is: 1000000.026491 Hz
[R82XX] PLL not locked!
Sample rate set to 1000000 S/s.
Tuner gain set to Auto.
Frequency correction set to 69 ppm.
Tuned to 868.000MHz.
baseband_demod_FM: low pass filter for 1000000 Hz at cutoff 200000 Hz, 5.0 us
{"time" : "2021-12-09 21:11:17", "model" : "Fineoffset-WHx080", "subtype" : 0, "id" : 57, "battery_ok" : 1, "temperature_C" : 4.600, "humidity" : 99, "wind_dir_deg" : 135, "wind_avg_km_h" : 1.224, "wind_max_km_h" : 4.896, "rain_mm" : 126.600, "mic" : "CRC"}   

As you can see, there is a lot of debugging info, with at the end one line of JSON formatted weather data. This output is piped to the script, which saves the data in a SQLite database. The script reads the lines one by one, throws away everything that is not JSON, and reads the JSON into a Python dictionary. Non-numeric and unneeded values are removed from the dictionary with the pop command.

for line in fileinput.input():
    try:
        d = json.loads(line)
    except ValueError:
        continue                   # not JSON (debug output), skip this line
    # remove non numeric and unneeded values
    for key in ('model', 'subtype', 'mic'):
        d.pop(key, None)
    store_in_database(d)

Now that the weather record is stored in the Python dictionary d, it is written to the database with the store_in_database(d) procedure:

# write to sqlite database
def store_in_database(d):
    conn = sqlite3.connect(sqlite_file)
    c = conn.cursor()
    # create the database table if it does not exist yet
    if not c.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='{tn}'"\
         .format(tn=table_name)).fetchall():
        create_database()
    # insert a new row holding the date part of the record
    c.execute("INSERT INTO {tn} ({cn}) VALUES(DATE('{thedate}'))"\
       .format(tn=table_name, cn=date_col, thedate=d['time']))
    lrid = c.lastrowid
    # store the time part in the same row
    c.execute("UPDATE {tn} SET {cn}=TIME('{thedate}') WHERE {idf}=({rid})"\
         .format(tn=table_name, idf=index_col, rid=lrid, cn=time_col, thedate=d['time']))
    # store all remaining sensor values
    for field in d:
        if field != 'time':
            c.execute("UPDATE {tn} SET {cn}={val} WHERE {idf}=({rid})"\
                 .format(tn=table_name, idf=index_col, rid=lrid, cn=field, val=d[field]))
    conn.commit()
    conn.close()

The store_in_database procedure first checks whether the database table already exists; if not, it creates one using a separate function call. Then it stores the date and the time. Although date and time are one field in the weather data, and consequently in the dictionary, I decided to store them separately in the database. Finally, the other values from the dictionary are stored in the database. The complete code can be found on GitHub.
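The table-creation helper is not listed above. As an illustration, here is a minimal sketch of what such a helper could look like; the function name, table name, and column set are assumptions, based on the fields of the JSON record shown earlier and on the separate date and time columns described above:

```python
import sqlite3

# minimal sketch, not the actual helper from the repository;
# the table and column names here are assumptions
def create_database(sqlite_file):
    conn = sqlite3.connect(sqlite_file)
    c = conn.cursor()
    # date and time from the record are stored in separate columns
    c.execute("""CREATE TABLE IF NOT EXISTS weather (
                     id INTEGER PRIMARY KEY AUTOINCREMENT,
                     date TEXT,
                     time TEXT,
                     battery_ok INTEGER,
                     temperature_C REAL,
                     humidity REAL,
                     wind_dir_deg REAL,
                     wind_avg_km_h REAL,
                     wind_max_km_h REAL,
                     rain_mm REAL)""")
    conn.commit()
    return conn

conn = create_database(":memory:")  # in-memory database just for demonstration
columns = [row[1] for row in conn.execute("PRAGMA table_info(weather)")]
print(columns)
```

One column per sensor value keeps the UPDATE loop in store_in_database simple: every dictionary key maps directly to a column name.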

Serve graphs to browser

The Python script extracts the saved weather data from the SQLite database and displays nice graphs in a web browser. Dash by Plotly is the framework used for this. A Dash DatePickerRange at the bottom of the page is used to select the dates for which the weather data is plotted. In this example the internal Dash web server is used, but Dash can also be used in combination with your standard Raspberry Pi web server such as apache2. When running standalone, the web page is served at port 8050.
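The extraction step itself is plain SQL on the date column. A minimal standalone sketch, assuming the table and column names from the storage part (the actual script in the repository may differ), with an in-memory database standing in for the real SQLite file:

```python
import sqlite3

# build a small throwaway database; the real script opens the SQLite file on disk
conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute("CREATE TABLE weather (id INTEGER PRIMARY KEY, date TEXT, time TEXT, temperature_C REAL)")
c.execute("INSERT INTO weather (date, time, temperature_C) VALUES ('2021-12-09', '21:11:17', 4.6)")
c.execute("INSERT INTO weather (date, time, temperature_C) VALUES ('2021-12-10', '09:30:00', 3.1)")

# the DatePickerRange callback hands a start and end date to a query like this
start_date, end_date = "2021-12-09", "2021-12-09"
c.execute("SELECT date, time, temperature_C FROM weather WHERE date BETWEEN ? AND ?",
          (start_date, end_date))
rows = c.fetchall()
print(rows)  # [('2021-12-09', '21:11:17', 4.6)]
```

Storing the date in its own column pays off here: the BETWEEN clause maps directly onto the start and end dates that the DatePickerRange provides.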

When browsing to the IP address of the Raspberry Pi at port 8050, we see the output of the Python script: on top six graphs with weather data, at the bottom the date range that is displayed. Most graphs are plotted directly from the database, except for the rain per day, which is calculated as the total amount of rain fallen between 00:00 and 23:59 for each day. Note that this is not the way official meteorological values are calculated, as they use 8:00 UTC as the separation.

from datetime import datetime as dt

def calc_rain_per_day(timestamp, rain):
    format = "%Y-%m-%d %H:%M:%S"
    startrain = rain[0]
    datestamp = []
    rain_per_day = []
    for i, ts in enumerate(timestamp):
        if i > 0:
            dt_object_prev = dt.strptime(timestamp[i-1], format)
            dt_object = dt.strptime(ts, format)
            # a new day starts: store its date and the rain fallen since the last day change
            if !=
                #print(, rain[i] - startrain)
                rain_per_day.append(round(rain[i] - startrain, 1))
                startrain = rain[i]
    return {'datestamp': datestamp, 'rain_per_day': rain_per_day}
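As an aside, since the rain value is a cumulative counter, the same day totals can also be obtained by grouping the readings on the date part of the timestamp. A simplified standalone sketch with made-up counter values (it ignores any rain falling between the last reading of one day and the first of the next, which is negligible at a 10 minute interval):

```python
from itertools import groupby

# hypothetical cumulative rain counter readings (mm), as the sensor reports them
readings = [
    ("2021-12-08 06:00:00", 10.0),
    ("2021-12-08 18:00:00", 12.5),
    ("2021-12-09 06:00:00", 14.0),
    ("2021-12-09 18:00:00", 14.3),
    ("2021-12-10 06:00:00", 14.3),
]

rain_per_day = {}
# group on the date part of the timestamp (the text before the space)
for day, group in groupby(readings, key=lambda r: r[0].split(" ")[0]):
    values = [r[1] for r in group]
    # rain fallen this day = last counter value minus first counter value
    rain_per_day[day] = round(values[-1] - values[0], 1)

print(rain_per_day)  # {'2021-12-08': 2.5, '2021-12-09': 0.3, '2021-12-10': 0.0}
```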

The date range at the bottom of the page is a Dash DatePickerRange that can be used to select the period to plot.

python3 rpi_rtlsdr_weather_station/
Dash is running on

 * Serving Flask app 'show_weather_station' (lazy loading)
 * Environment: production
   WARNING: This is a development server. Do not use it in a production deployment.
   Use a production WSGI server instead.
 * Debug mode: off
 * Running on all addresses.
   WARNING: This is a development server. Do not use it in a production deployment.
 * Running on (Press CTRL+C to quit) - - [30/Nov/2021 20:16:15] "GET / HTTP/1.1" 200 - - - [30/Nov/2021 20:16:15] "GET /_dash-layout HTTP/1.1" 200 - - - [30/Nov/2021 20:16:15] "GET /_dash-dependencies HTTP/1.1" 200 - - - [30/Nov/2021 20:16:15] "GET /_dash-component-suites/dash/dcc/async-graph.js HTTP/1.1" 304 - - - [30/Nov/2021 20:16:15] "GET /_dash-component-suites/dash/dcc/async-datepicker.js HTTP/1.1" 304 - - - [30/Nov/2021 20:16:15] "GET /_dash-component-suites/dash/dcc/async-plotlyjs.js HTTP/1.1" 304 - - - [30/Nov/2021 20:16:18] "POST /_dash-update-component HTTP/1.1" 200 -

Add the command to /etc/rc.local if you want the application to start automatically at rpi boot time.

# show weather station logs in dash application
python3 /home/pi/rpi_rtlsdr_weather_station/ >> /var/log/show_weather_station.log 2>&1 &

The complete code can be found on GitHub.

Questions and comments welcome, please use the Contact button at the top of the page.