SSA 2023 oral presentation slidedeck
SQUAC (Seismic Quality Assessment Console) is built from:
squac: the web app (GUI) with dashboards and alarms. squac.pnsn.org
squacapi: an API for interacting with the backend database of billions of measurements from dozens of metrics on thousands of channels. squacapi.pnsn.org
squacapi_client: the user-friendly Python client that makes reading from and writing to the SQUAC database easy. Great for data mining.
Request an account & questions: squac-help@uw.edu
Requirements
Installation
Note
Import modules & initialize client
Metrics
Channels
Channel Groups
GET measurements from the database
POST measurements in the database
Finding the Latest Breach
Python 2.7 and 3.4+
Latest version number can be seen here: https://github.com/pnsn/squacapi_client
Install or update using pip; it is recommended to do this in a conda env. Note that ObsPy & pytz are not required, but installing them makes a fully functional environment for most seismologists, with all of their linked dependencies:
conda create -n SQUAC python=3.9 obspy pytz
conda activate SQUAC
pip install git+https://github.com/pnsn/squacapi_client.git
Check your current version:
pip show squacapi_client
Note: after writing updates with the squacapi_client, open SQUAC in a new tab to see the results; a simple refresh might not show updates due to caching.
#!/path_to_your_conda_environments_python e.g.: /home/usr/miniconda3/envs/SQUAC/bin/python
import os
import sys

# Import modules
try:
    from squacapi_client.models.write_only_measurement_serializer \
        import WriteOnlyMeasurementSerializer
    from squacapi_client.models.write_only_group_serializer \
        import WriteOnlyGroupSerializer
    from squacapi_client.pnsn_utilities \
        import get_client, make_channel_map, make_metric_map, perform_bulk_create
except ImportError:
    print("Info: squacapi_client not available, cannot use --squac option")

try:
    USER = os.environ['SQUACAPI_USER']
    PASSWORD = os.environ['SQUACAPI_PASSWD']
except KeyError:
    sys.exit("Requires ENV vars SQUACAPI_USER, SQUACAPI_PASSWD")
# Initiate the client.
HOST = 'https://squacapi.pnsn.org'
squac_client = get_client(USER, PASSWORD)

# Example: retrieve all metrics
metrics = squac_client.api_measurement_metrics_list()
# Get dictionary of metric_name to metric_id
# e.g. {'hourly_min': 82, 'acc_spikes_gt_.34': 90...
metric_map = make_metric_map(metrics)
# Example: Using params to filter
my_metric = squac_client.api_measurement_metrics_list(name='hourly_min')
print(my_metric)
print(my_metric[0].sample_rate, type(my_metric[0].sample_rate))
[{'code': 'hourly_min',
'created_at': datetime.datetime(2020, 6, 9, 22, 7, 13, 651106, tzinfo=tzutc()),
'default_maxval': 25000.0,
'default_minval': -25000.0,
'description': 'Hourly minimum of raw data.',
'id': 82,
'name': 'hourly_min',
'reference_url': 'https://github.com/pnsn/station_metrics/blob/master/station_metrics.md',
'sample_rate': 3600,
'unit': 'counts',
'updated_at': datetime.datetime(2022, 9, 13, 0, 2, 38, 471259, tzinfo=tzutc()),
'user': 5}]
Create a metric (POST at bottom of page): http://squacapi.pnsn.org/api/measurement/metrics/
Or with squacapi_client:
# Create a metric:
# https://github.com/pnsn/squacapi_client/blob/main/docs/WriteOnlyMetricSerializer.md
from squacapi_client.models.write_only_metric_serializer \
    import WriteOnlyMetricSerializer
name = 'hourly_min'
code = 'hourly_min'
description = 'Hourly minimum of raw data.' # optional
unit = 'counts'
default_minval = -25000. # optional
default_maxval = 25000. # optional
reference_url = 'https://github.com/pnsn/station_metrics/blob/master/station_metrics.md'
sample_rate = 3600 # optional
newmetric = WriteOnlyMetricSerializer(name=name, code=code, description=description,
                                      unit=unit, default_minval=default_minval,
                                      default_maxval=default_maxval,
                                      reference_url=reference_url,
                                      sample_rate=sample_rate)
response = squac_client.api_measurement_metrics_create(newmetric)
A handy daily updated mapping of SCNLs to channel_id is here.
# Example: retrieve all channels. Note can take up to 10-30s depending on cache.
channels = squac_client.api_nslc_channels_list()
# Get dictionary of SCNL to channel_id
# e.g. {'5473.HNE.NP.to': 7599, 'CFS.HNE.CI.01': 16189...
channel_map = make_channel_map(channels)
# Example: Using params to filter and chan_search to get all ?NZ channels
# all_params = ['network', 'channel', 'chan_search', 'station', 'location',
# 'startafter', 'startbefore', 'endafter', 'endbefore',
# 'lat_min', 'lat_max', 'lon_min', 'lon_max']
import datetime
T1 = datetime.datetime(2021, 5, 5, 0, 0, 0)
channels = squac_client.api_nslc_channels_list(network='uw', station='alki,hood', \
chan_search='.nz', endafter=T1, lat_min=45.0, lat_max=48.0)
print(channels)
[{'azimuth': 0.0,
'class_name': 'channel',
'code': 'hnz',
'created_at': datetime.datetime(2019, 10, 10, 20, 27, 52, 303509, tzinfo=tzutc()),
'description': None,
'dip': -90.0,
'elev': 1.0,
'endtime': datetime.datetime(2599, 12, 31, 23, 59, 59, tzinfo=tzutc()),
'id': 777,
'lat': 47.5751,
'loc': '--',
'lon': -122.4176,
'name': 'Alki Wastewater Plant, Seattle, WA, USA',
'network': 'uw',
'nslc': 'uw.alki.--.hnz',
'sample_rate': 200.0,
'starttime': datetime.datetime(2017, 5, 2, 0, 0, tzinfo=tzutc()),
'station_code': 'alki',
'station_name': '',
'updated_at': datetime.datetime(2022, 12, 12, 10, 3, 8, 395912, tzinfo=tzutc()),
'user': 2}, {'azimuth': 0.0,
'class_name': 'channel',
'code': 'enz',
'created_at': datetime.datetime(2019, 10, 10, 20, 28, 7, 164189, tzinfo=tzutc()),
'description': None,
'dip': -90.0,
'elev': 1512.0,
'endtime': datetime.datetime(2599, 12, 31, 23, 59, 59, tzinfo=tzutc()),
'id': 1626,
'lat': 45.322262,
'loc': '--',
'lon': -121.650932,
'name': 'Mt Hood Meadows, OR CREST BB SMO',
'network': 'uw',
'nslc': 'uw.hood.--.enz',
'sample_rate': 100.0,
'starttime': datetime.datetime(2018, 9, 18, 0, 0, tzinfo=tzutc()),
'station_code': 'hood',
'station_name': '',
'updated_at': datetime.datetime(2022, 12, 12, 10, 3, 14, 859192, tzinfo=tzutc()),
'user': 2}]
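Going the other way, from a channel_id back to its SCNL, is a one-line inversion of the map. A minimal sketch, using stand-in entries in place of a real make_channel_map result:

```python
# Stand-in entries mimicking a make_channel_map result (assumption: keys are
# SCNL strings, values are integer channel ids, as in the examples above).
channel_map = {'5473.HNE.NP.to': 7599, 'CFS.HNE.CI.01': 16189}

# Channel ids are unique, so the map can be safely inverted once and reused.
id_to_scnl = {cid: scnl for scnl, cid in channel_map.items()}

print(id_to_scnl[7599])  # 5473.HNE.NP.to
```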
# Example: updating one of your existing channel groups
all_groups = squac_client.api_nslc_groups_list()
for group in all_groups:
    if group.name == "Seattle HNZ":
        print("OLD GROUP: ", group)
        group_name = group.name
        group_channels = group.channels
        group_id = group.id
        group_org_id = group.organization
        group_description = group.description
        group_share_org = group.share_org
        group_share_all = group.share_all

channels = squac_client.api_nslc_channels_list()
channel_map = make_channel_map(channels)

stations_to_add = ['ALKI', 'HOOD']
channels_to_add = ['ENZ', 'HNZ']
stations_to_remove = ['RNAV']

for SCNL in channel_map:
    station = SCNL.split('.')[0]
    chan = SCNL.split('.')[1]
    if (station in stations_to_add and chan in channels_to_add and
            channel_map[SCNL] not in group_channels):
        group_channels.append(channel_map[SCNL])
    if station in stations_to_remove and channel_map[SCNL] in group_channels:
        group_channels.remove(channel_map[SCNL])

updated_group_data = WriteOnlyGroupSerializer(name=group_name,
                                              description=group_description,
                                              channels=group_channels,
                                              organization=group_org_id,
                                              share_org=group_share_org,
                                              share_all=group_share_all)
print("NEW GROUP: ", updated_group_data)
response = squac_client.api_nslc_groups_update(group_id, updated_group_data)
print("RESPONSE :", response)
import datetime
from pytz import timezone

T1 = datetime.datetime(2021, 5, 5, 0, 0, 0).replace(tzinfo=timezone('UTC'))
T2 = datetime.datetime(2021, 5, 6, 0, 0, 0).replace(tzinfo=timezone('UTC'))
metric_id = metric_map['hourly_mean']
for SCNL in channel_map:
    if 'ALKI.HNZ' in SCNL:
        channel_id = channel_map[SCNL]
        measurements = squac_client.api_measurement_measurements_list(
            metric=[metric_id], channel=[channel_id], starttime=T1,
            endtime=T2)
        for measurement in measurements:
            print('SCNL, starttime, value: ', SCNL, measurement.starttime,
                  measurement.value)
print('LAST MEASUREMENT: ', measurement)
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 00:00:00+00:00 -8052.620478744429
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 01:00:00+00:00 -8038.201500393198
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 02:00:00+00:00 -8025.302139634486
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 03:00:00+00:00 -8015.824732216291
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 04:00:00+00:00 -8008.67494948066
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 05:00:00+00:00 -7996.802052950648
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 06:00:00+00:00 -7981.238208571021
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 07:00:00+00:00 -7982.982831052508
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 08:00:00+00:00 -7988.429545457698
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 09:00:00+00:00 -7998.945236619136
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 10:00:00+00:00 -8013.859587440414
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 11:00:00+00:00 -8031.319104701593
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 12:00:00+00:00 -8046.1992965433255
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 13:00:00+00:00 -8062.899260898932
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 14:00:00+00:00 -8085.732994364857
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 15:00:00+00:00 -8102.885438641019
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 16:00:00+00:00 -8114.799750627938
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 17:00:00+00:00 -8117.516571869222
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 18:00:00+00:00 -8112.275245453953
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 19:00:00+00:00 -8109.202730610213
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 20:00:00+00:00 -8102.458216310153
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 21:00:00+00:00 -8091.838130070138
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 22:00:00+00:00 -8071.972390157709
SCNL, starttime, value: ALKI.HNZ.UW.-- 2021-05-05 23:00:00+00:00 -8050.53769221274
LAST MEASUREMENT: {'channel': 777,
'created_at': datetime.datetime(2021, 5, 6, 1, 37, 23, 739173, tzinfo=tzutc()),
'endtime': datetime.datetime(2021, 5, 6, 0, 0, tzinfo=tzutc()),
'id': 637009565,
'metric': 84,
'starttime': datetime.datetime(2021, 5, 5, 23, 0, tzinfo=tzutc()),
'user_id': '5',
'value': -8050.53769221274}
Notes:
- The actions in this example are "create or overwrite", depending on whether a value already exists for the given metric_id, channel_id, and starttime.
- chunk is the internal POST size of the list. If your list is longer than chunk (an optional parameter), it will automatically be split up for you when POSTing. The default chunk=100 measurements is usually fine, but feel free to experiment if your perform_bulk_create calls are huge and prohibitively slow.
- Numbers uploaded as floats must be 64-bit. By default, Python floats are 64-bit; however, some libraries use 32-bit floats.
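To see why the 64-bit note matters, here is a small self-contained illustration (no SQUAC calls involved) of the precision lost when a value passes through 32-bit storage:

```python
import struct

# Round-trip a native Python float (64-bit) through 32-bit storage.
value64 = 0.1
value32 = struct.unpack('f', struct.pack('f', value64))[0]

# The round-tripped value is no longer equal to the original, so values from
# 32-bit sources should be checked (and cast with float()) before uploading.
print(value64 == value32)  # False
```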
metric_id = 124
channel_id = 777
my_value = 90
T0 = datetime.datetime(2021, 5, 5, 0, 0, 0).replace(tzinfo=timezone('UTC'))

measurements = []
for i in range(0, 3):
    T1 = T0 + datetime.timedelta(seconds=i * 3600)
    my_value = my_value + 1
    measurement = WriteOnlyMeasurementSerializer(metric=metric_id,
                                                 channel=channel_id,
                                                 value=my_value,
                                                 starttime=T1, endtime=T1)
    measurements.append(measurement)

response, errors = perform_bulk_create(measurements, squac_client, chunk=500)
print("RESPONSE: ", response)
print("ERRORS: ", errors)
print("RESPONSE: ", response)
print("ERRORS: ", errors)
RESPONSE: [{'channel': 777,
'created_at': datetime.datetime(2021, 5, 10, 19, 57, 35, 678238, tzinfo=tzutc()),
'endtime': datetime.datetime(2021, 5, 5, 0, 0, tzinfo=tzutc()),
'id': 669930555,
'metric': 124,
'starttime': datetime.datetime(2021, 5, 5, 0, 0, tzinfo=tzutc()),
'user_id': '5',
'value': 91.0}, {'channel': 777,
'created_at': datetime.datetime(2021, 5, 10, 19, 57, 35, 748250, tzinfo=tzutc()),
'endtime': datetime.datetime(2021, 5, 5, 1, 0, tzinfo=tzutc()),
'id': 669930556,
'metric': 124,
'starttime': datetime.datetime(2021, 5, 5, 1, 0, tzinfo=tzutc()),
'user_id': '5',
'value': 92.0}, {'channel': 777,
'created_at': datetime.datetime(2021, 5, 10, 19, 57, 35, 833526, tzinfo=tzutc()),
'endtime': datetime.datetime(2021, 5, 5, 2, 0, tzinfo=tzutc()),
'id': 669930557,
'metric': 124,
'starttime': datetime.datetime(2021, 5, 5, 2, 0, tzinfo=tzutc()),
'user_id': '5',
'value': 93.0}]
ERRORS: []
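The chunking behavior described in the notes above amounts to splitting the measurement list before POSTing. An illustrative stand-in, not the actual perform_bulk_create internals:

```python
def chunked(items, chunk=100):
    """Yield successive chunk-sized slices of a list. Illustrative stand-in
    for the internal splitting that perform_bulk_create is described as doing."""
    for i in range(0, len(items), chunk):
        yield items[i:i + chunk]

# 250 measurements with chunk=100 become three POSTs: 100, 100, and 50.
batches = list(chunked(list(range(250)), chunk=100))
print([len(b) for b in batches])  # [100, 100, 50]
```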
The find_latest_breach function (in find_latest_breach.py) can be used to locate the most recent time a given NSCL (Network.Station.Channel.Location) crossed a threshold value. The function queries SQUAC data at multiple time granularities (hour, day, week, and month) to efficiently narrow down the breach timestamp.

File Location: find_latest_breach.py

Function:
find_latest_breach(nscl, T1, T2, metric_id, user_threshold_dict, channel_map, verbosity, breach_mode)

Hierarchical Search
The function first checks hourly data near T2 (the end of the search window). If no breach is found there, it looks at daily data for the last 7 days, then weekly data for up to 6 weeks, and finally monthly. This approach speeds up searches for the most recent breaches.

Partial-Block Handling for Weekly Archives
Because SQUAC weekly archives may start before T1, the function can query from (T1 - 7 days) to avoid missing a weekly block that crosses T1.

Above or Below Threshold (breach_mode)
- The default is "above", meaning a breach is value > threshold.
- If breach_mode="below", the function uses measurement.min (if available) to detect value < threshold.

Custom Thresholds
- By default, if metric_id is 85 (hourly range) or 101 (5s power), the function applies hard-coded thresholds.
- You can pass your own thresholds for other metrics, or override the defaults, by providing user_threshold_dict.

Latest Breach Returned
The function always returns the latest time of breach within [T1, T2], or (None, None) if no breach was found.
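The hour-then-day-then-week-then-month narrowing described above can be sketched roughly as follows; the helper names, data shapes, and granularity handling here are illustrative stand-ins, not the actual find_latest_breach internals:

```python
import datetime

def latest_breach(values, threshold, breach_mode='above'):
    """Scan (time, value) pairs newest-first and return the most recent
    breach, or (None, None) if nothing crosses the threshold."""
    for t, v in sorted(values, reverse=True):
        if breach_mode == 'above' and v > threshold:
            return t, v
        if breach_mode == 'below' and v < threshold:
            return t, v
    return None, None

def find_latest(sources, threshold):
    """Try finer granularities first, as the hierarchical search does,
    falling back to coarser archives only when nothing is found."""
    for granularity in ('hour', 'day', 'week', 'month'):
        t, v = latest_breach(sources.get(granularity, []), threshold)
        if t is not None:
            return t, v
    return None, None

# Toy data: six hourly values, 100 through 105.
hourly = [(datetime.datetime(2021, 5, 5, h), 100 + h) for h in range(6)]
print(latest_breach(hourly, 104))                     # newest value above 104 wins
print(find_latest({'hour': [], 'day': hourly}, 104))  # falls back to daily data
```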
Calling the function:
from find_latest_breach import find_latest_breach
from datetime import datetime, timedelta, timezone
import logging

nscl = "UW.ALLI.--.HNN"
T1 = datetime.now(timezone.utc) - timedelta(days=180)
T2 = datetime.now(timezone.utc)
metric_id = 85  # hourly range in counts
override_thresh_dict = {'HN': 5000}
breach_time, breach_value = find_latest_breach(nscl, T1, T2, metric_id,
                                               override_thresh_dict,
                                               verbosity=logging.DEBUG,
                                               breach_mode='above')
# breach_time, breach_value = find_latest_breach(nscl, T1, T2, metric_id)
# Also valid if metric_id is 85 or 101, which have internal default thresholds