Notes on using BME280 sensor with Raspberry Pi

My BME280 sensor setup to monitor temperature and humidity

I was looking for a sensor to monitor temperature, humidity and pressure with a Raspberry Pi and came across the BME280. The sensor is very nicely packaged by Adafruit Industries, as you can see in the image above. In particular, the STEMMA QT connectors on the board make it very easy to connect the sensor to the GPIO pins of the Raspberry Pi. From a hardware-assembly point of view, getting the sensor connected was straightforward, and I used this diagram to wire it to the Raspberry Pi Zero.

The sensor communicates over the I2C bus, which needs to be enabled on the Pi first (for example via raspi-config under Interface Options). In my case the device showed up at address 0x77, as i2cdetect shows below:

$ i2cdetect -y 1
     0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f
00:          -- -- -- -- -- -- -- -- -- -- -- -- --
10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
20: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
30: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
40: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
50: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
60: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
70: -- -- -- -- -- -- -- 77

With the hardware set up, let's now look at the software. Below is the final outcome: temperature and humidity values captured overnight on a Grafana dashboard. You can see the temperature dropping through the night and the effect of the heater kicking in early in the morning, around 5 am. The wiggles are the heater's on/off cycles as it reaches its threshold temperature. This is quite fascinating, and I can see myself using this setup for a variety of experiments and measurements. There are several software components at play here, including a Kubernetes cluster, but in this post I'll focus on the piece that captures and publishes the data on the RPi Zero.

Temperature and Humidity captured using BME280 and Raspberry Pi Zero WH

Architecture

Schematic of various components in the pipeline

Before I talk about the software component running on the far right, on the IoT devices (an RPi Zero in this case), let's go over the general architecture of the entire system. The idea is to leverage the Kubernetes ecosystem to manage the following subsystems:

  • Grafana for dashboards
  • Prometheus for metrics collection
  • MQTT for messaging pipeline
  • CRDs for IoT device management

In other words, Kubernetes gives us robustness, scale, security and a rich ecosystem of tools to manage our applications. So if we can figure out what it takes to connect software running on individual IoT devices to the control plane running on Kubernetes, that combination can be very powerful. With that as the motivation, let's now dig into how the sensor data was captured and sent to the Kubernetes cluster.

Data Capture and Messaging

As you might have already guessed, MQTT was used for messaging in this case. MQTT is a widely used protocol for communication in the IoT ecosystem. So essentially the software component running on each IoT device needs to do two things:

  • Capture data from the sensor
  • Publish data to MQTT server on a particular topic

There are many ways to put together such software, and MQTT allows a great deal of flexibility. In my case the software materialized as a pure-Go binary built on a few key open-source pieces, all of which come up later in this post: the periph.io I2C package, the TinyGo sensor drivers, the controller-runtime logging helpers, and Dapr for the MQTT interface.
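At a high level, the binary is just a loop around the two tasks listed above. The snippet below is a simplified sketch of that loop rather than the actual code; the capture and publish callbacks stand in for the sensor and messaging pieces covered in the rest of this post:

package run

import "time"

// runLoop is a simplified sketch of the capture-and-publish cycle: every
// interval it captures a reading and hands it off for publishing, stopping
// after total iterations (mirroring the --interval-seconds and --total-count
// flags used below).
func runLoop(interval time.Duration, total int,
    capture func() (map[string]float64, error),
    publish func(map[string]float64) error) error {
    ticker := time.NewTicker(interval)
    defer ticker.Stop()
    for i := 0; i < total; i++ {
        <-ticker.C
        data, err := capture() // task 1: capture data from the sensor
        if err != nil {
            return err
        }
        if err := publish(data); err != nil { // task 2: publish to MQTT (via daprd)
            return err
        }
    }
    return nil
}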

I tested the binary by running it in a mode that simply prints out the temperature and other values:

pi@raspberry:~ $ ./bme280 run --interval-seconds=2 --total-count=100 | jq '.'
{
  "level": "info",
  "time": "2021-05-21T17:53:10.210-0700",
  "name": "bme280",
  "msg": "detected bme280 with device id: 0x60"
}
{
  "level": "info",
  "time": "2021-05-21T17:53:12.228-0700",
  "name": "bme280.data",
  "msg": "data",
  "temperature": 23.87,
  "pressure": 1003.94,
  "humidity": 32.5,
  "altitude": 77
}
{
  "level": "info",
  "time": "2021-05-21T17:53:14.251-0700",
  "name": "bme280.data",
  "msg": "data",
  "temperature": 23.87,
  "pressure": 1003.93,
  "humidity": 32.39,
  "altitude": 77
}
...<redacted>

As a side note, the Kubernetes software ecosystem has excellent tooling for logging, and I used this logger package to generate the JSON logs you see above. Below is a snippet of the logger setup.

package run

import (
    "io"

    "go.uber.org/zap/zapcore"
    ctrl "sigs.k8s.io/controller-runtime"
    "sigs.k8s.io/controller-runtime/pkg/log/zap"
)

var log = ctrl.Log.WithName("bme280")

// setupLogger configures logger
func setupLogger(out io.Writer) {
    ctrl.SetLogger(
        zap.New(
            zap.Encoder(
                zapcore.NewJSONEncoder(
                    zapcore.EncoderConfig{
                        MessageKey:     "msg",
                        LevelKey:       "level",
                        TimeKey:        "time",
                        NameKey:        "name",
                        CallerKey:      "caller",
                        StacktraceKey:  "stacktrace",
                        LineEnding:     zapcore.DefaultLineEnding,
                        EncodeLevel:    zapcore.LowercaseLevelEncoder,
                        EncodeTime:     zapcore.ISO8601TimeEncoder,
                        EncodeDuration: zapcore.SecondsDurationEncoder,
                        EncodeCaller:   zapcore.ShortCallerEncoder,
                        EncodeName:     zapcore.FullNameEncoder,
                    },
                ),
            ),
            zap.UseDevMode(true),
            zap.WriteTo(out),
        ),
    )
}
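For completeness, the "bme280.data" entries in the output above come from logr-style key/value calls on this logger, roughly like this (the values shown are from the run above):

// Emit one sensor reading as structured key/value pairs; with the JSON
// encoder configured above this renders the "bme280.data" lines shown earlier.
log.WithName("data").Info("data",
    "temperature", 23.87,
    "pressure", 1003.94,
    "humidity", 32.5,
    "altitude", 77,
)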

I won’t go over the detailed setup of the MQTT interface, but will just mention that the workflow involved running daprd as a systemd service on the RPi. This way the Go binary does not need to know about the MQTT protocol or the MQTT server address; it simply sends messages to the daprd service running on localhost. My systemd unit file looks as follows:

pi@raspberrypi:~ $ cat /etc/systemd/system/daprd.service 
[Unit]
Description=daprd service.
[Service]
ExecStart=/usr/local/bin/dapr run --app-id rpi0w-0 --dapr-http-port 3500 --dapr-grpc-port 50001 --config /etc/dapr/config.yaml --components-path /etc/dapr/components
[Install]
WantedBy=multi-user.target
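With daprd listening on localhost:3500 (the --dapr-http-port above), publishing a reading from the Go binary boils down to a plain HTTP POST against Dapr's pub/sub endpoint. Here is a minimal sketch; the pubsub component name (mqtt-pubsub) and topic (bme280) are illustrative placeholders:

package run

import (
    "bytes"
    "encoding/json"
    "fmt"
    "net/http"
)

// reading mirrors the fields logged by the binary and the gauge names
// declared later in the Device manifest.
type reading struct {
    Temperature float64 `json:"temperature"`
    Pressure    float64 `json:"pressure"`
    Humidity    float64 `json:"humidity"`
    Altitude    float64 `json:"altitude"`
}

// publish posts one reading to the local daprd sidecar, which forwards it
// to the MQTT broker configured under /etc/dapr/components.
func publish(r reading) error {
    body, err := json.Marshal(r)
    if err != nil {
        return err
    }
    // Dapr pub/sub API: POST /v1.0/publish/<pubsub-name>/<topic>
    url := "http://localhost:3500/v1.0/publish/mqtt-pubsub/bme280"
    resp, err := http.Post(url, "application/json", bytes.NewReader(body))
    if err != nil {
        return err
    }
    defer resp.Body.Close()
    if resp.StatusCode >= 300 {
        return fmt.Errorf("publish failed: %s", resp.Status)
    }
    return nil
}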

As for data capture, I was interested in a pure-Go solution that allows easy distribution of binaries to the target platforms. I came across the periph.io I2C package, which is written in pure Go. Using it as the I2C layer underneath the TinyGo drivers required implementing an interface defined by TinyGo. The code took the form shown below; I have removed error handling for brevity, and it highlights the instantiation of the bus that gets passed to TinyGo's bme280.New() function:

bus, _ := i2c.NewBus("") // open the default I2C bus (error handling omitted)
defer bus.Close()

device := bme280.New(bus) // TinyGo BME280 driver on top of the bus
device.Address = 0x77     // address reported by i2cdetect
device.Configure()

The bus needs to implement the following interface, which is part of the TinyGo drivers package:

type I2C interface {
    ReadRegister(addr uint8, r uint8, buf []byte) error
    WriteRegister(addr uint8, r uint8, buf []byte) error
    Tx(addr uint16, w, r []byte) error
}
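Wrapping a periph.io bus so that it satisfies this interface is fairly mechanical. The sketch below (using periph.io's v3 module paths) shows one way to do it; my actual i2c.NewBus wrapper is a variation on this idea, opening the bus via i2creg and implementing register reads and writes on top of Tx:

package i2c

import (
    periphi2c "periph.io/x/conn/v3/i2c"
    "periph.io/x/conn/v3/i2c/i2creg"
    "periph.io/x/host/v3"
)

// Bus adapts a periph.io I2C bus to the TinyGo drivers I2C interface above.
type Bus struct {
    bus periphi2c.BusCloser
}

// NewBus opens an I2C bus by name; an empty name picks the first available bus.
func NewBus(name string) (*Bus, error) {
    if _, err := host.Init(); err != nil {
        return nil, err
    }
    b, err := i2creg.Open(name)
    if err != nil {
        return nil, err
    }
    return &Bus{bus: b}, nil
}

// Close releases the underlying bus.
func (b *Bus) Close() error { return b.bus.Close() }

// Tx performs a write followed by a read in a single I2C transaction.
func (b *Bus) Tx(addr uint16, w, r []byte) error { return b.bus.Tx(addr, w, r) }

// ReadRegister reads len(buf) bytes starting at register r of device addr.
func (b *Bus) ReadRegister(addr uint8, r uint8, buf []byte) error {
    return b.bus.Tx(uint16(addr), []byte{r}, buf)
}

// WriteRegister writes buf to register r of device addr.
func (b *Bus) WriteRegister(addr uint8, r uint8, buf []byte) error {
    return b.bus.Tx(uint16(addr), append([]byte{r}, buf...), nil)
}

With the device configured, the TinyGo driver returns readings as scaled integers (ReadTemperature, for example, reports milli-degrees Celsius), which the binary converts to the floating-point values seen in the log output earlier.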

Putting it all together

With the pure-Go bme280 binary producing MQTT messages from the BME280 sensor data in place, all we need to do is intercept these messages and make them flow through the pipeline running on the Kubernetes cluster. In my case the Kubernetes cluster runs on a Raspberry Pi 4, but it could be any Kubernetes cluster, on-prem or in the cloud, on x86 or ARM. The pipeline starts with the creation of a Device CRD object in Kubernetes, which instantiates the Prometheus metrics associated with the device.

apiVersion: edge.deoras-labs.io/v1beta1
kind: Device
metadata:
  name: rpi0w-0
spec:
  counters:
  - name: counter1
    help: "Help for counter1"
  - name: counter2
    help: "Help for counter2"
  gauges:
  - name: temperature
    help: "Temperature in degree C"
  - name: pressure
    help: "Pressure in hPa"
  - name: humidity
    help: "Humidity in percentage"
  - name: altitude
    help: "Altitude in unknown units"

The YAML manifest above declares a contract between the IoT device and the Kubernetes cluster: as long as the IoT device sends messages whose metric names match the gauge names listed here, they get captured and processed correctly. These are, of course, high-level notes, and my intent here is to describe the overall plan rather than go into the code for each of these pieces. For most situations a simple Python script to capture the data would suffice; however, in order to run this as a so-called cloud-native application, we need to build out the whole spectrum of tooling. That's it for now, and I hope this gave you an overview of how my particular data capture setup runs.
