40 changes: 40 additions & 0 deletions .github/workflows/linter.yaml
@@ -0,0 +1,40 @@
name: Lint

on:
  push:
    branches:
      - main
      - dev
  pull_request:
    branches:
      - main

permissions:
  checks: write
  contents: write

jobs:
  run-linters:
    name: Run linters
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.11"

      - name: Install Python dependencies
        run: pip install black==24.4.0 flake8==7.0.0

      - name: Run linters
        uses: wearerequired/lint-action@v2
        with:
          black: true
          black_args: "-l88"
          black_auto_fix: true
          flake8: true
          flake8_args: "--max-line-length=100 --ignore=E203"
26 changes: 26 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,26 @@
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.6.0
    hooks:
      - id: check-merge-conflict
      - id: debug-statements
      - id: mixed-line-ending
      - id: check-added-large-files
  - repo: https://github.com/kynan/nbstripout
    rev: 0.6.1
    hooks:
      - id: nbstripout
  - repo: https://github.com/psf/black
    rev: 24.10.0
    hooks:
      - id: black
        args:
          - --line-length=88
        language_version: python3
  - repo: https://github.com/PyCQA/flake8
    rev: 7.0.0
    hooks:
      - id: flake8
        args:
          - --max-line-length=100
          - --ignore=E203
79 changes: 77 additions & 2 deletions README.md
@@ -1,2 +1,77 @@
# River Depth
This repository displays the latest river depth reading for the river Mersey, scraped from [GOV.UK](https://check-for-flooding.service.gov.uk/), on a [Sense HAT](https://projects.raspberrypi.org/en/projects/getting-started-with-the-sense-hat) attached to a Raspberry Pi.

## Why do this?
Good question. This is a small side project to get more practice with Python, particularly web scraping, and to play around with the Sense HAT.

## Repository Structure
``` plaintext
├───README.md
├───LICENSE
├───requirements.txt
├───.gitignore
├───.pre-commit-config.yaml
├───.github/workflows
│   └───linter.yaml
└───src
    ├───config.py
    ├───output.py
    └───scrape.py
```

- `README.md`: This file, containing an overview of and instructions for using the repository.
- `LICENSE`: License information for the repository.
- `requirements.txt`: Python dependencies required.
- `.gitignore`: Specifies the files and folders that are ignored (not tracked) in the repository.
- `.pre-commit-config.yaml`: Pre-commit hook configuration.
- `.github/workflows/linter.yaml`: GitHub Actions workflow that lints on pushes to `main` and `dev` and on pull requests to `main`.
- `src`: Contains the Python scripts for scraping and output.
  - `config.py`: Sets the `station_id` to scrape data for.
  - `output.py`: Builds the message displayed on the Sense HAT and the graphic of the river height.
  - `scrape.py`: Contains the function that scrapes the latest river-height reading for the specified station.
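As an illustration of one step in `scrape.py`: the station page reports the normal range as free text, and the numeric bounds are pulled out with a regular expression. A minimal sketch of that step, using hypothetical page text:

```python
import re

# Hypothetical text as it might appear on a station page
range_text = "Normal range between 0.14m and 2.50m"

# Same pattern as scrape.py: extract the decimal numbers
range_values = re.findall(r"\d+\.\d+", range_text)
range_lower = float(range_values[0])
range_upper = float(range_values[1])
print(range_lower, range_upper)
```

The pattern only matches numbers with a decimal point, which is what the flood service pages use for heights in metres.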

# Using the repository

## Setup

Firstly, either fork or clone the repository:
```bash
git clone https://github.com/ASW-Analyst/river_depth.git
cd river_depth
```
Now create a virtual environment:

``` bash
python -m venv venv
source venv/bin/activate  # On Linux/Mac
venv\Scripts\activate     # On Windows
```
Install dependencies:

``` bash
pip install -r requirements.txt
```

## Configuration
Open `src/config.py` and set the `station_id` for the station you wish to use.
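`config.py` is a one-line module; for example, to use station `5008` (the value shipped in this repository):

```python
# src/config.py
station_id = "5008"
```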

## Running
You can then run the output script `output.py` to scrape the data and display the message and graphic on the Sense HAT:

```bash
python src/output.py
```
This script will:
- Scrape the latest data from the specified station
- Display a scrolling message with the latest readings
- Show a visual graphic representing the current water level relative to the top of the normal range.
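The graphic maps the latest reading onto the 8-row LED matrix: each row represents one eighth of the normal-range upper bound, and rows are lit from the bottom up. A sketch of the calculation from `output.py`, with hypothetical numbers:

```python
# Hypothetical values: normal-range upper bound and latest reading (metres)
range_upper = 3.5
height_float = 1.9

rows_total = 8
row_height = range_upper / rows_total  # metres represented by one LED row
# Cap at 8 so readings above the normal range still fit the matrix
rows_to_light = min(int(height_float / row_height), rows_total)
print(rows_to_light)  # 1.9 m fills 4 of the 8 rows
```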

![River depth message demo](sense_hat.gif)

# Acknowledgements
This is only possible because river height data is made available via the flood information service at [GOV.UK](https://check-for-flooding.service.gov.uk/).


12 changes: 12 additions & 0 deletions requirements.txt
@@ -0,0 +1,12 @@
beautifulsoup4==4.13.4
bs4==0.0.2
certifi==2025.4.26
charset-normalizer==3.4.2
idna==3.10
numpy==2.2.6
pillow==11.2.1
requests==2.32.3
sense-hat==2.6.0
soupsieve==2.7
typing_extensions==4.14.0
urllib3==2.4.0
Binary file added sense_hat.gif
1 change: 1 addition & 0 deletions src/config.py
@@ -0,0 +1 @@
station_id = "5008"
48 changes: 48 additions & 0 deletions src/output.py
@@ -0,0 +1,48 @@
from sense_hat import SenseHat
from time import sleep
from src.scrape import scraper
from src.config import station_id

# Run scraper function
(
    location,
    date_time,
    height,
    height_float,
    trend,
    state,
    range_text,
    range_lower,
    range_upper,
) = scraper(station_id)

sense = SenseHat()

# Write the message to display
message = f"{location} {date_time}: | Height: {height}|"

# Clear LED matrix
sense.clear()

# Print message
sense.show_message(message, scroll_speed=0.1, text_colour=[0, 255, 0])

# Clear Display
sleep(1)
sense.clear()

# Calculate how many rows to fill
rows_total = 8
row_height = range_upper / rows_total
rows_to_light = min(int(height_float / row_height), 8)

# Prepare an 8x8 graphic:
for y in range(8):
    for x in range(8):
        if y >= (8 - rows_to_light):
            sense.set_pixel(x, y, (0, 0, 255))
        else:
            sense.set_pixel(x, y, (0, 0, 0))

sleep(8)
sense.clear()
57 changes: 57 additions & 0 deletions src/scrape.py
@@ -0,0 +1,57 @@
from bs4 import BeautifulSoup
import requests
import re


def scraper(station: str):
    """
    Function to scrape the latest information for the
    specified station. Function will return the scraped objects
    """

    url = "https://check-for-flooding.service.gov.uk/station/" + station
    response = requests.get(url)
    doc = BeautifulSoup(response.text, "html.parser")

    location = doc.select_one("#main-content > div:nth-child(1) > div > h1")
    location_text = " ".join(location.get_text(strip=True).split())

    date_time = doc.select_one(
        "#main-content > div:nth-child(3) > div > div > div > h2"
    )
    date_time_text = date_time.get_text(strip=True)

    height = doc.select_one(
        "#main-content > div:nth-child(3) > div > div > dl > div:nth-child(1) > dd"
    )
    height_text = height.get_text(strip=True)
    height_float = float(height_text.replace("m", ""))

    trend = doc.select_one(
        "#main-content > div:nth-child(3) > div > div > dl > div:nth-child(2) > dd > span"
    )
    trend_text = trend.get_text(strip=True)

    state = doc.select_one(
        "#main-content > div:nth-child(3) > div > div > dl > div:nth-child(3) > dd > span"
    )
    state_text = state.get_text(strip=True)

    range_doc = doc.select_one("#main-content > div:nth-child(3) > div > div > p")
    range_text = range_doc.get_text(strip=True)

    range_values = re.findall(r"\d+\.\d+", range_text)
    range_lower = float(range_values[0])
    range_upper = float(range_values[1])

    return (
        location_text,
        date_time_text,
        height_text,
        height_float,
        trend_text,
        state_text,
        range_text,
        range_lower,
        range_upper,
    )