Data Collection and Management: The Pulse of GardenPi
With the hardware and database set up, we can now write the scripts that populate everything and put them on a cron job timer to bring the system to life.
How it is organized
There are 3 scripts that power the data loading and processing:
RecordSensorMeasurements.py - Inserts records into the sensorMeasurements table
SenseWatering.py - Inserts records into the sensorWatering table
SensorMoved.py - Inserts records into the sensorMoved table
All 3 scripts run from a Linux cron job (the built-in job scheduler) every 15 minutes to update the tables accordingly.
step 1: create the scripts
Using VSCode on your PC, create a new file for each of the following scripts:
RecordSensorMeasurements.py
```python
# import packages
import datetime
import RPi.GPIO as GPIO
import spidev
import bme280
import smbus2
import MySQLdb

# MCP3008 SPI setup (moisture sensors)
spi = spidev.SpiDev()
spi.open(0, 0)
spi.max_speed_hz = 1000000

# BME280 sensor setup
port = 1
address = 0x76
bus = smbus2.SMBus(port)

# GPIO setup
GPIO.setwarnings(False)
GPIO.setmode(GPIO.BCM)
GPIO.setup(4, GPIO.IN)  # Light sensor

# DB access setup
db_config = {
    'host': 'localhost',
    'user': 'exampleuser',  # change these
    'password': 'password',
    'database': 'SensorDataDB',
}

# database insert query
add_measurement = (
    "INSERT INTO sensorMeasurements (timestamp, sensorOI, sensorRead) "
    "VALUES (%s, %s, %s)"
)

# Read one channel of the MCP3008 ADC (moisture sensors)
def readData(channel):
    adc = spi.xfer2([1, (8 + channel) << 4, 0])
    data = ((adc[1] & 3) << 8) + adc[2]
    return data

# Main logic
conn = None
cursor = None
try:
    # Connect to the database
    conn = MySQLdb.connect(**db_config)
    cursor = conn.cursor()
    timestamp = datetime.datetime.now()

    # Read measurements from the BME280
    calibration_params = bme280.load_calibration_params(bus, address)
    bmeData = bme280.sample(bus, address, calibration_params)
    humidity = bmeData.humidity
    temperature = bmeData.temperature
    pressure = bmeData.pressure

    # Read light sensor value
    light_value = GPIO.input(4)

    # Insert BME280 and light sensor measurements
    # (the auto-incrementing ID column is excluded)
    sensor_ois = [5, 6, 7, 8]  # BME280 and light sensor OI values
    sensor_reads = [temperature, pressure, humidity, light_value]
    for sensor_oi, sensor_read in zip(sensor_ois, sensor_reads):
        cursor.execute(add_measurement, (timestamp, sensor_oi, sensor_read))

    # Insert moisture readings for each sensor, sharing the same timestamp
    sensor_ois = [1, 2, 3, 4]  # Moisture sensor OI values
    sensor_reads = [readData(0), readData(1), readData(2), readData(3)]
    for sensor_oi, sensor_read in zip(sensor_ois, sensor_reads):
        cursor.execute(add_measurement, (timestamp, sensor_oi, sensor_read))

    conn.commit()
finally:
    GPIO.cleanup()
    if cursor:
        cursor.close()
    if conn:
        conn.close()
```
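The bit math inside readData() can look opaque, so here is a standalone sketch of how a raw MCP3008 SPI response decodes into a 10-bit reading (the response bytes below are made up for illustration; on the Pi, spi.xfer2 returns them):

```python
# Decode a raw 3-byte MCP3008 SPI response into a 10-bit value (0-1023).
# This mirrors the bit math used in readData() above.
def decode_mcp3008(adc):
    # adc[1] carries the top 2 bits of the result, adc[2] the bottom 8 bits
    return ((adc[1] & 3) << 8) + adc[2]

print(decode_mcp3008([0, 3, 255]))  # full-scale reading: 1023
print(decode_mcp3008([0, 1, 0]))    # 256
```

Higher numbers mean drier soil with these capacitive sensors, which is why the watering script below looks for drops.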
SenseWatering.py
```python
import MySQLdb
from datetime import datetime, timedelta

# Database credentials
db_config = {
    'host': 'localhost',
    'user': 'exampleuser',  # change this
    'password': 'password',
    'database': 'SensorDataDB',
}

# Modifiable threshold variable
threshold_percentage = 0.10  # Change this value as needed

# Connect to the database
db = MySQLdb.connect(**db_config)
cursor = db.cursor()

# Get moisture sensor IDs
cursor.execute("SELECT sensorOI FROM sensorData WHERE sensorType = 'Moisture'")
moisture_sensor_ids = [row[0] for row in cursor.fetchall()]

# Calculate the date 7 days ago
seven_days_ago = datetime.now() - timedelta(days=7)

for sensor_id in moisture_sensor_ids:
    # Get moisture readings for the sensor from the last 7 days, oldest first
    cursor.execute(
        "SELECT timestamp, sensorRead FROM sensorMeasurements "
        "WHERE sensorOI = %s AND timestamp >= %s ORDER BY timestamp",
        (sensor_id, seven_days_ago),
    )
    readings = cursor.fetchall()

    # Check for a moisture drop > threshold percentage
    for i in range(1, len(readings)):
        current_value = readings[i][1]
        previous_value = readings[i - 1][1]
        if previous_value * (1 - threshold_percentage) > current_value:
            # Check if this drop is already recorded
            cursor.execute(
                "SELECT * FROM sensorWatering WHERE sensorOI = %s AND datetime = %s",
                (sensor_id, readings[i][0]),
            )
            if cursor.fetchone() is None:
                perc_drop = 100 * (previous_value - current_value) / previous_value
                # Populate sensorWatering table
                insert_query = """
                    INSERT INTO sensorWatering
                        (datetime, sensorOI, preValue, postValue, percDrop)
                    VALUES (%s, %s, %s, %s, %s)
                """
                cursor.execute(
                    insert_query,
                    (readings[i][0], sensor_id, previous_value,
                     current_value, perc_drop),
                )
                db.commit()

# Close database connection
cursor.close()
db.close()
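To see the drop-detection rule in isolation, here is a pure-Python sketch of the same comparison run on made-up readings, with no database involved (the helper name and sample values are my own for illustration):

```python
# Flag any reading that fell more than `threshold` below the one before it,
# mirroring the comparison in SenseWatering.py.
def find_drops(readings, threshold=0.10):
    """Return (index, perc_drop) for each reading below the threshold line."""
    drops = []
    for i in range(1, len(readings)):
        prev, curr = readings[i - 1], readings[i]
        if prev * (1 - threshold) > curr:
            drops.append((i, 100 * (prev - curr) / prev))
    return drops

sample = [800, 790, 700, 695]  # hypothetical moisture values
print(find_drops(sample))      # only the 790 -> 700 step exceeds a 10% drop
```

Only drops between consecutive readings count, so a slow dry-out spread over many 15-minute intervals will not be logged as a watering event.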
SensorMoved.py
```python
import MySQLdb
from datetime import datetime, timedelta

# Database credentials
db_config = {
    'host': 'localhost',
    'user': 'exampleuser',  # change this
    'password': 'password',
    'database': 'SensorDataDB',
}

# Modifiable threshold variable for increase
threshold_percentage = 0.10  # Change this value to reflect the desired increase

# Connect to the database
db = MySQLdb.connect(**db_config)
cursor = db.cursor()

# Get moisture sensor IDs
cursor.execute("SELECT sensorOI FROM sensorData WHERE sensorType = 'Moisture'")
moisture_sensor_ids = [row[0] for row in cursor.fetchall()]

# Calculate the date 14 days ago
fourteen_days_ago = datetime.now() - timedelta(days=14)

for sensor_id in moisture_sensor_ids:
    # Get moisture readings for the sensor from the last 14 days, oldest first
    cursor.execute(
        "SELECT timestamp, sensorRead FROM sensorMeasurements "
        "WHERE sensorOI = %s AND timestamp >= %s ORDER BY timestamp",
        (sensor_id, fourteen_days_ago),
    )
    readings = cursor.fetchall()

    # Check for a moisture increase > threshold percentage
    for i in range(1, len(readings)):
        current_value = readings[i][1]
        previous_value = readings[i - 1][1]
        if previous_value * (1 + threshold_percentage) < current_value:
            # Check if this increase is already recorded in sensorMoved
            cursor.execute(
                "SELECT * FROM sensorMoved WHERE sensorOI = %s AND datetime = %s",
                (sensor_id, readings[i][0]),
            )
            if cursor.fetchone() is None:
                perc_incr = 100 * (current_value - previous_value) / previous_value
                # Populate sensorMoved table to reflect the increase
                insert_query = """
                    INSERT INTO sensorMoved
                        (datetime, sensorOI, preValue, postValue, percIncr)
                    VALUES (%s, %s, %s, %s, %s)
                """
                cursor.execute(
                    insert_query,
                    (readings[i][0], sensor_id, previous_value,
                     current_value, perc_incr),
                )
                db.commit()

# Close database connection
cursor.close()
db.close()
```
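This script is the mirror image of SenseWatering.py: a sudden jump in the raw reading suggests the sensor was pulled out or moved. The same sketch inverted, again on made-up values (helper name and samples are my own):

```python
# Flag any reading that rose more than `threshold` above the one before it,
# mirroring the comparison in SensorMoved.py.
def find_increases(readings, threshold=0.10):
    """Return (index, perc_incr) for each reading above the threshold line."""
    increases = []
    for i in range(1, len(readings)):
        prev, curr = readings[i - 1], readings[i]
        if prev * (1 + threshold) < curr:
            increases.append((i, 100 * (curr - prev) / prev))
    return increases

sample = [400, 410, 600]       # hypothetical moisture values
print(find_increases(sample))  # only the 410 -> 600 jump exceeds 10%
```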
Save each of these scripts as a .py file and move them via WinSCP to a folder on the Raspberry Pi, ready for use.
step 2: modify the cron job
Before the scheduler can run the Python scripts, it needs to know where they are and when to run them. In a terminal, enter the following command to open the cron file:
sudo crontab -e
This opens the cron file in a terminal editor, usually nano on Raspberry Pi OS, which has a weird key control setup in my opinion, so be careful what you press. The contents of the cron file will appear; press the down arrow to reach the bottom and add the following rows of text:
```shell
*/15 * * * * /usr/bin/python3 /path/to/RecordSensorMeasurements.py
*/15 * * * * /usr/bin/python3 /path/to/SenseWatering.py
*/15 * * * * /usr/bin/python3 /path/to/SensorMoved.py
```
With the above copy/pasted into the bottom of the cron file, save by hitting Ctrl-O and Return, then exit with Ctrl-X to get back to the main terminal. To make sure your cron jobs are set up correctly, you can list them with the following command:
sudo crontab -l
If the 3 Python scripts show up, you are good to go and they should run at the next quarter hour. This is a huge accomplishment if you have come this far successfully. Congrats!
At this point you can query the database via terminal MySQL to see the data that has loaded, but it would be much easier to have a local website with a graph, wouldn’t it?
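As a quick sanity check before building the website, a query along these lines (using the table and column names from the scripts above; adjust to taste) shows the most recent readings:

```sql
-- Show the ten most recent sensor readings
SELECT timestamp, sensorOI, sensorRead
FROM sensorMeasurements
ORDER BY timestamp DESC
LIMIT 10;
```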
Next we will set up a very simple website to visualize the collected data for Exploratory Data Analysis.