Changes from all commits (65 commits)
f4871e8
Better ini parsing
cboulay Aug 18, 2021
3362cf7
Fixup auditory state management, esp spike_only
cboulay Aug 18, 2021
41ffced
move threshold change to data_source interface, enable threshold locking
cboulay Aug 18, 2021
f1c9ac8
Move more ini parsing into super
cboulay Aug 19, 2021
3f11d17
CerebusDataSource - Get more parameters from ini
cboulay Aug 19, 2021
f6e1da7
unit_scaling from ini, other cleanup
cboulay Aug 19, 2021
6e79c6e
range text from ini
cboulay Aug 19, 2021
a387e24
RasterGUI data abstraction and ini loading
cboulay Aug 19, 2021
f99f642
WaveformGUI - Data abstraction and ini settings loading.
cboulay Aug 19, 2021
1554c54
MappingGUI - data abstraction
cboulay Aug 23, 2021
9581f1a
Update build instructions to use qt 5.15
cboulay Aug 23, 2021
d342c6c
not functional; switching computers
cboulay Sep 21, 2021
2fdfec2
Moved top-level windows out of scripts and into dbsgui sub-package
cboulay May 2, 2022
dfe6d3f
Reorganization; relative imports.
cboulay May 2, 2022
74f7eb5
modernize setuptools
cboulay May 2, 2022
44fc4eb
Fixup gui position in settings
cboulay May 3, 2022
931f3ff
PySide6 compatibility
cboulay May 3, 2022
e985eb3
Re-add DDU detection lost in merge.
cboulay May 30, 2023
1e9c49f
Qt6 compat fixup.
cboulay May 30, 2023
b5589fa
Add DTT widget.
cboulay May 30, 2023
eda1bcb
Finished renaming from neuroport_dbs to open_mer. Unfortunately, git …
cboulay Aug 13, 2023
db022ae
Update a couple more paths to open_mer
cboulay Aug 13, 2023
c77b660
Add CbSdkConnection.ini to settings files.
cboulay Aug 14, 2023
6411150
Updates to GUIs. Some regressions...
cboulay Aug 14, 2023
41512a4
Big docs update.
cboulay Aug 14, 2023
2a834ea
Update Qt RegExp names.
cboulay Aug 14, 2023
418af22
Still working
cboulay Aug 15, 2023
d2d472c
Fix misplaced argument.
cboulay Aug 15, 2023
3ffde9b
WIP update
cboulay Aug 16, 2023
2d8fcf7
First attempt at channel_select pub/sub.
cboulay Aug 18, 2023
26794d0
Fix FHC DDU settings in DepthGUI
cboulay Aug 18, 2023
aa5e149
doc update
cboulay Aug 18, 2023
0aff8af
ini update
cboulay Aug 18, 2023
6298011
more specific naming; comment out unused code.
cboulay Aug 18, 2023
45d1a6d
Many fixes for feature plots.
cboulay Aug 18, 2023
9e35eae
Fixup DepthGUI for StarDrive
cboulay Aug 18, 2023
fc6c223
pass correct object when monitor selected via keypress
cboulay Aug 31, 2023
577137c
Update features chanselect when sweep changes chan.
cboulay Aug 31, 2023
5db110d
Better use of space in feature plots
cboulay Aug 31, 2023
45017f2
Add script to copy ini files to home dir.
cboulay Sep 22, 2023
968191e
Moved ini files from config to settings
cboulay Sep 22, 2023
58ac2a6
Rearrange ini parsing in widgets.
cboulay Sep 22, 2023
1ae7d2d
Add try-except around GUI application exec.
cboulay Sep 22, 2023
3fdd31f
forgotten file in move from config to settings
cboulay Sep 22, 2023
34969ea
Refactor docs for development (especially macOS).
cboulay Sep 22, 2023
b7995f0
Docs update
cboulay Sep 27, 2023
c372a7e
Add script to monitor ZeroMQ
cboulay Sep 27, 2023
1a779d4
Rename ProcessGUI to ProcedureGUI
cboulay Sep 27, 2023
de5983a
Major refactor:
cboulay Sep 28, 2023
6e1f032
Continue major refactor:
cboulay Sep 29, 2023
ded5c17
DepthGUI.ini return default source to FHCSerial
cboulay Sep 29, 2023
d09d573
Add sleep to MonitorZeroMQ to reduce CPU usage.
cboulay Sep 29, 2023
dd31e43
Fix reference to child widget settings.
cboulay Sep 29, 2023
9d0f15f
update paths to snippet and features processes
cboulay Sep 29, 2023
5a11794
Moved most of Depth_Process to dbsgui/process/trajectory.py
cboulay Nov 2, 2023
873f056
Procedure SettingsDialog - eliminate widgets from member variable nam…
cboulay Nov 2, 2023
b2a22cf
Debug prints;
cboulay Nov 2, 2023
759f3ca
Update feature-processing logic.
cboulay Nov 20, 2023
5ac8f7e
doc update
cboulay Nov 20, 2023
6b765c2
SweepGUI visualize snippet status.
cboulay Nov 30, 2023
25d9c9c
Update FeaturesGUI.ini
cboulay Nov 30, 2023
757d022
Fix depth-to-cbsdk
cboulay Nov 30, 2023
4c31b5c
tiny readme update.
cboulay Dec 8, 2023
4228081
Modify default features categories in FeaturesGUI.ini
cboulay Jan 5, 2024
035de05
procedure launches snippets- and features- subprocesses.
cboulay Apr 1, 2024
3 changes: 2 additions & 1 deletion .gitignore
@@ -38,7 +38,8 @@ QtSerialPort/build*
build-*
.idea
CereStimGUI/matlab/cerestim_dbs/
neuroport_dbs.egg-info
*.egg-info
*.whl

/site
build/
2 changes: 1 addition & 1 deletion docs/README.md
@@ -1,6 +1,6 @@
# OpenMER

A collection of software developed in the Sachs Lab at the Ottawa Hospital for for deep brain stimulation (DBS) surgery intraoperative mapping with microelectrode recording (MER).
OpenMER is a collection of software developed in the Sachs Lab at the Ottawa Hospital for deep brain stimulation (DBS) surgery intraoperative mapping with microelectrode recording (MER).

![Image of vis apps](https://github.com/SachsLab/OpenMER/blob/master/vis_apps_screenshot.PNG?raw=true)

164 changes: 72 additions & 92 deletions docs/for-developers.md
@@ -1,8 +1,10 @@
## Repository Organization

* Library and Application code in the /neuroport_dbs folder
* Unit tests in the /tests folder. Note this uses the pytest framework and conventions.
* Documentation in the /docs folder
* `/open_mer`: Library and Application code
* `/open_mer/scripts`: Entry points
* `/tests`: Unit tests (TODO)
* `/docs`: Documentation
* `/scripts`: Scratch scripts

## Maintaining the Documentation

@@ -12,15 +14,17 @@ You will need to install several Python packages to maintain the documentation.

The /docs/{top-level-section} folders contain a mix of .md and .ipynb documentation. The latter are converted to .md by the [mknotebooks plugin](https://github.com/greenape/mknotebooks/projects) during building.

Run `mkdocs gh-deploy` to build the documentation, commit to the `gh-deploy` branch, and push to GitHub. This will make the documentation available at https://SachsLab.github.io/NeuroportDBS/
Run `mkdocs gh-deploy` to build the documentation, commit to the `gh-deploy` branch, and push to GitHub. This will make the documentation available at https://SachsLab.github.io/OpenMER/

### Autogenerated Documentation

The /docs/neuroport_dbs folder can hold stubs to tell the [mkdocstrings plugin](https://github.com/mkdocstrings/mkdocstrings) to build the API documentation from the docstrings in the library code itself. Currently this is empty. If any stubs are added then it's necessary to build the documentation from a Python environment that has the package installed. A stub takes the form
The /docs/open_mer folder can hold stubs to tell the [mkdocstrings plugin](https://github.com/mkdocstrings/mkdocstrings) to build the API documentation from the docstrings in the library code itself. Currently, this is empty. If any stubs are added then it's necessary to build the documentation from a Python environment that has the package installed.

A stub takes the form

```
# Title
::: neuroport_dbs.module.name
::: open_mer.module.name
```

### Testing the documentation locally
@@ -33,92 +37,68 @@ If you build the docs locally then you'll also get the /site directory, but this

TODO

## Maintaining the Zip Distribution

* Download the latest [WinPython release](https://github.com/winpython/winpython/releases/latest).
* These instructions were tested with Winpython64-3.8.5.0
* Run the WinPython self-extracting executable. This will create a folder containing a full Python distribution with many useful packages installed (see full list [here](https://github.com/winpython/winpython/blob/master/changelogs/WinPython-64bit-3.8.5.0.md)).
* [Edit the `WPy64-3850\settings\winpython.ini` file](https://sourceforge.net/p/winpython/wiki/Environment/) and add the following line: `PATH = %WINPYDIR%\Lib\site-packages\PyQt5\Qt\bin;%PATH%`
* Download [MySQL Windows ZIP Archive](https://dev.mysql.com/downloads/mysql/)
* Tested with mysql-8.0.29-win64.zip
* Next to the WinPython folder, extract the mysql zip and rename the extracted folder to `mysql`
* In the WinPython folder, run "WinPython Command Prompt". This will open a Command Prompt with all the paths configured to use this new Python distribution.
* Install all of the Python packages listed in the table below.
* Version numbers may not be important. Please try the latest version and report to us if it does not work.
* The method to install the packages isn't important. If you're on an internet-connected computer then you can use the pip commands. Otherwise you can first download the wheels then bring them to the development computer to pip install the wheels.
* If you wish to be able to modify any of the SachsLab packages that are pure python (mspacman, cerebuswrapper, serf, **neuroport_dbs**) then you may do so by first cloning the repository to get the source and installing the package in-place: Using the WinPython command prompt, run `pip install -e .` from within the cloned directory.
* The `cerebus` package may complain "DLL load failed". This happens when cerebus.cbpy can't find Qt5 or it finds the wrong version. This SHOULD be fixed by editing the PATH in the 3rd step above, but I also found it necessary to copy Qt5Core.dll and Qt5Xml.dll from the above path directly into the site-packages\cerebus folder. We hope to remove the qt dependency from cerebus to avoid this in the future.
* In the command prompt, `cd` into the `bin` subfolder of the unzipped mysql folder.
* Create a mysql\data folder along with the base databases: `mysqld --initialize-insecure --console`
* You can change the default data directory, username, and password. See the section below "Configuring MySQL Database Server"
* Run `mysqld` in the `mysql\bin` folder. (Windows: `start /B mysqld.exe`; allow network access if asked.)
* Back in the command prompt, run `mysqladmin --user=root create serf`
* Install the serf databases with the following commands:
```
serf-makemigrations
serf-migrate
```
* Make a batch file `WPy64-3850\scripts\NeuroportDBS.bat` with the following contents:
```shell script
@echo off
call "%~dp0env_for_icons.bat"
start "" "%WINPYDIR%\Scripts\dbs-sweep.exe" /command:%1 /B
start "" "%WINPYDIR%\Scripts\dbs-raster.exe" /command:%1 /B
start "" "%WINPYDIR%\Scripts\dbs-waveform.exe" /command:%1 /B
start "" "%WINPYDIR%\Scripts\dbs-ddu.exe" /command:%1 /B
start "" "%WINPYDIR%\Scripts\dbs-features.exe" /command:%1 /B
```
* Jump ahead to [Usage Instructions](#usage-instructions) below.

### Required Python Packages

| Package | Version | Wheel | pip command |
| ------- | ------- | ----- | ----------- |
| pyFFTW | 0.12.0 | [Link](https://files.pythonhosted.org/packages/2b/e4/822d4cf12cd907fb8e80844db48aef7adf9e888c89256feb510fa81ae83f/pyFFTW-0.12.0-cp38-cp38-win_amd64.whl)
| mysqlclient | 2.0.1 | [Link](https://files.pythonhosted.org/packages/b2/72/e205fcf877dd0ec05d71b975def8ecef3ae4bb7fee14434615140ebdc168/mysqlclient-2.0.1-cp38-cp38-win_amd64.whl)
| Django | 3.1 | [Link](https://files.pythonhosted.org/packages/2b/5a/4bd5624546912082a1bd2709d0edc0685f5c7827a278d806a20cf6adea28/Django-3.1-py3-none-any.whl)
| quantities | 0.12.4 | | |
| python-neo | 0.9.0 | | `pip install git+https://github.com/NeuralEnsemble/python-neo.git`
| pylsl | 1.13.6 | [Link](https://files.pythonhosted.org/packages/02/c2/7b58adda02dbfa8f76bf835879d36b83dfc1da2eaa50d124d13a515e148c/pylsl-1.13.6-py2.py3-none-win_amd64.whl)
| pytf | 0.1 | [Link](https://github.com/SachsLab/pytf/releases/download/v0.1/pytf-0.1-py2.py3-none-any.whl) |`pip install git+https://github.com/SachsLab/pytf.git`|
| mspacman | 0.1 | [Link](https://github.com/SachsLab/mspacman/releases/download/v0.1/mspacman-0.1-py2.py3-none-any.whl) |`pip install git+https://github.com/SachsLab/mspacman.git`|
| cerebus | 0.0.4 | [Link](https://github.com/dashesy/CereLink/releases/download/v7.0.5/cerebus-0.0.4-cp38-cp38-win_amd64.whl) |N/A - must use wheel|
| cerebuswrapper | 0.1 | [Link](https://github.com/SachsLab/cerebuswrapper/releases/download/v0.1/cerebuswrapper-0.1.0-py3-none-any.whl) |`pip install git+https://github.com/SachsLab/cerebuswrapper.git`|
| serf | 1.1 | [Link](https://github.com/cboulay/SERF/releases/download/v1.1/serf-1.1-py3-none-any.whl) | `pip install git+https://github.com/cboulay/SERF.git#subdirectory=python`|
| neurport_dbs | 1.0 | [Link](https://github.com/SachsLab/NeuroportDBS/releases/download/v1.0/neuroport_dbs-1.0.0-py3-none-any.whl) | `pip install git+https://github.com/SachsLab/NeuroportDBS.git`|

### Configuring MySQL Database Server

* If you wish to use a different datadir then you must first create a `my.cnf` file in the root `mysql` folder with the following contents (commented out lines aren't necessary, just keeping them here for reference):
```
[mysqld]
datadir=path/to/data
#port = 3306
#socket = /tmp/mysql.sock
#pid-file = /Volumes/STORE/eerfdata/Chadwicks-MacBook-Pro.local.pid
#default-storage-engine = MyISAM
#default_tmp_storage_engine = MyISAM
#query_cache_type = 1
#key_buffer_size = 2G
#query_cache_limit = 400M
```
* If you wish to secure the database then you'll need to give the root account a password. Do so with `mysql_secure_installation`.
* If you change from the default username (`root`) and password (none) then you will have to tell `serf` what the username and password are. Create a file named `my_serf.cnf` and put it in the path identified by the following command: `python -c "import os; print(os.path.expanduser('~'))"` The file contents should be
```
[client]
user = root
password = {password}
#port = 3306
#socket = /tmp/mysql.sock
```

## For experts who want to use their existing environment

We assume you know how to work with conda environments and that you have a MySQL database server running and configured to your liking.

* Install the Python packages from the table above.
* Adapt the instructions at [Segmented Electrophys Recordings and Features Database (SERF)](https://github.com/cboulay/SERF) to prepare the database server for these tools.
* If you have a hybrid distribution/system-MySQL environment (i.e., your name is Guillaume) then you may also wish to use some of the MySQL DB config settings from above.
## Interprocess Communication

This section covers communication among the applications within the OpenMER suite, including mysqld and roughly eight Python applications. Communication to and from the data sources is out of scope here.

The applications all run independently of each other, but most of them work better in combination. To communicate information between applications we use [ZeroMQ](https://zeromq.org/).

| Publisher | Port | Topic | Message | Subscribers |
|-------------------|-------|--------------------|--------------------------------------------------------|-------------------|
| ProcedureGUI | 60001 | procedure_settings | json of settings. "procedure": {}, "running": bool | Segments_Process |
| Segments_Process | 60002 | snippet_status | (startup, notrecording, recording, accumulating, done) | ProcedureGUI |
| SweepGUI | 60003 | channel_select | json with channel, range, highpass | FeaturesGUI |
| Features_Process | 60004 | features | refresh | ProcedureGUI |
| DepthGUI | 60005 | ddu | float of depth | Segments_Process |
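
Below is a minimal pyzmq sketch of what one of these subscriptions might look like. It is illustrative only: the port and topic come from the table above, but the localhost address, the single-string "topic payload" framing, and the payload keys are assumptions rather than a description of the exact wire format the OpenMER apps use.

```python
import json
import zmq

ctx = zmq.Context.instance()
sub = ctx.socket(zmq.SUB)
sub.connect("tcp://localhost:60003")                    # SweepGUI publisher port (see table)
sub.setsockopt_string(zmq.SUBSCRIBE, "channel_select")  # filter on the topic prefix

while True:
    msg = sub.recv_string()                  # assumed framing: 'channel_select {"channel": ...}'
    topic, _, payload = msg.partition(" ")
    settings = json.loads(payload)           # expected keys per the table: channel, range, highpass
    print(topic, settings)
```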

We also have one LSL stream coming from the DepthGUI. Older versions of the SERF Depth_Process might still use it, but they should be migrated.

| Stream Name | Stream Type | Content | Inlets |
|-----------------|-------------|-------------------------|----------------------------|
| electrode_depth | depth | 1 float32 of elec depth | *old* Depth_Process (SERF) |
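
If you need to read this stream (e.g., while migrating an old Depth_Process), a minimal pylsl sketch might look like the following. The stream name and single-float payload come from the table above; everything else is illustrative.

```python
from pylsl import StreamInlet, resolve_byprop

# Find the DepthGUI outlet by name (5 s timeout), then pull one depth sample.
streams = resolve_byprop("name", "electrode_depth", timeout=5.0)
if streams:
    inlet = StreamInlet(streams[0])
    sample, timestamp = inlet.pull_sample(timeout=1.0)
    if sample is not None:
        print(f"electrode depth at t={timestamp:.3f}: {sample[0]:.3f}")
```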

## Development Environment

### Blackrock Neuroport

When using Blackrock hardware, the following tools and SDKs are needed.

The Blackrock NSP has its own [NeuroPort Central Suite](https://www.blackrockmicro.com/technical-support/software-downloads/) to manage the configuration of the device and to store data. However, its data visualization capabilities are rather limited and not suited for DBS MER.

The NSP data stream is accessible via an open source API [CereLink](https://github.com/CerebusOSS/CereLink) which includes a Python interface called `cerebus.cbpy`. These are maintained by Sachs Lab member Chadwick Boulay. Most of our OpenMER software is written in Python and depends on `cerebus.cbpy` and a custom [cerebuswrapper](https://github.com/SachsLab/cerebuswrapper) to communicate with the NSP.
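
OpenMER itself goes through `cerebuswrapper`, but as a rough sketch of raw `cerebus.cbpy` usage (the call names below are believed correct; check the CereLink documentation for exact signatures and return values before relying on them):

```python
from cerebus import cbpy

# Open a connection to the NSP (or nPlayServer) using CereLink's defaults.
result, con_info = cbpy.open(parameter=cbpy.defaultConParams())
print("Connected:", con_info)

# Configure trial buffering, then fetch whatever continuous data has accumulated.
cbpy.trial_config(reset=True)
result, data = cbpy.trial_continuous(reset=True)  # list of [channel, samples] pairs
for chan, samples in data:
    print(chan, len(samples))

cbpy.close()
```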

#### nPlayServer

For development, it is useful to play back previously recorded data without needing hardware or patients.
nPlayServer emulates a Blackrock NeuroPort almost perfectly: it simply has a different IP address and plays back an old data file instead of providing new data. See below for platform-specific instructions.

When using nPlayServer, you can follow the general [Usage Instructions](./usage-instructions.md) with one modification:
Modify the [DepthGUI settings](settings.md#depthguiini) to use "cbsdk playback" as its source. The depth value might not update until the file playback emits a change in depth.
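
For example, the relevant DepthGUI.ini entry might look something like the fragment below. The section and key names here are hypothetical placeholders (see [Settings](settings.md#depthguiini) for the real ones); only the values "cbsdk playback" and "FHCSerial" are taken from the actual settings.

```ini
; Hypothetical DepthGUI.ini fragment -- section/key names are illustrative only.
[depth-source]
source = cbsdk playback   ; the shipped default is FHCSerial (real drive attached)
```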

**Windows**

* Run "C:\Program Files (x86)\Blackrock Microsystems\NeuroPort Windows Suite\runNPlayAndCentral.bat"
* Select a recording to play back
* Use Central's hardware configuration tool to enable continuous recording and spike extraction on the recorded channels.

**macOS / Linux**

* Blackrock does not distribute nPlayServer on macOS or Linux. However, it does exist. Contact Chad directly.
* Central is unavailable on these platforms. Use `pycbsdk` to quickly change the hardware configuration.

### Playback XDF file

If you have a correctly formatted file, it may be enough to use [pyxdf playback](https://github.com/xdf-modules/pyxdf/blob/main/pyxdf/examples/playback_lsl.py).

TODO: More instructions needed.

### Dependencies

We assume you know how to work with conda / mamba environments and that you have a MySQL database server running and configured to your liking.

* Create an `openmer` conda environment (see the example commands below).
* Install the Python packages from the [table](preparing-distribution.md#required-python-packages).
* Adapt the instructions at [Segmented Electrophys Recordings and Features Database (SERF)](https://github.com/cboulay/SERF) to prepare the database server for your development environment.
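
A sketch of the environment setup is shown below; the Python version is an assumption, and the packages to install are those from the linked table.

```shell
conda create -n openmer python=3.10
conda activate openmer
# From a clone of this repository (editable install for development):
pip install -e .
```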

## Future goal - installable package

35 changes: 20 additions & 15 deletions docs/getting-started.md
@@ -1,25 +1,30 @@
Our primary method of distributing the full OpenMER Suite is as a giant zip file. TODO: Link!
The PC that runs our software is directly connected to the acquisition system, but it is never connected to the internet. Thus, we create a portable install on a thumb drive, which we then copy to the clinical PC.

If you want to setup the individual pieces on an internet-connected computer (e.g., for development or testing) then please look at the [For Developers](./for-developers.md) documentation.
> For development or testing on an internet-connected computer, or a non-Windows computer, please look at the [For Developers](for-developers.md) documentation.

It is expected that the target computer is a standalone computer that has a dedicated connection to the data acquisition system, such as a manufacturer-provided PC which is usually not connected to the internet. Testing without the hardware is also possible using a signal generator source or a data playback source (see below for example).
## Installation

Extract the zip file to the target computer. Choose a destination with a lot of disk space because the data segments will be saved within.
### Distribution

Updates may come in the form of a smaller zip file to extract within a specific subfolder of the extracted distribution.
Copy the `<distribution>` folder from the thumb drive to the instrument PC.

Proceed with the [Usage Instructions](./usage-instructions.md)
> Be sure to choose a destination with lots of disk space because many recording segments will be stored within the database located in this folder.

## Test Environment - Without Hardware
> If you do not have the `<distribution>` folder then follow the [Preparing Distribution](preparing-distribution.md) instructions to create it.

### Emulate Blackrock NSP
### Configure

* Run "C:\Program Files (x86)\Blackrock Microsystems\NeuroPort Windows Suite\runNPlayAndCentral.bat"
* Select a recording to play back
* Use Central's hardware configuration tool to enable continuous recording and spike extraction on the recorded channels.
* Follow the general [Usage Instructions](./usage-instructions.md) with one modification:
* When running `dbs-ddu`, choose "cbsdk playback" from the dropdown menu to reuse the depths from the recording. The value might not update until the file plays back a change in depth.
The `<distribution>` folder is ready to use as-is. However, a few additional steps make it more convenient to use on the target PC.

### Playback XDF file
#### Shortcuts

More instructions are needed. If you have a correctly formatted file, it may be enough to use [XDFStreamer](https://github.com/labstreaminglayer/App-XDFStreamer).
* Make a desktop shortcut to `<distribution>\mysql\bin\mysqld.exe`.
* Make a desktop shortcut to `<distribution>\<python>\scripts\OpenMER.bat`.

#### Settings files

OpenMER functionality can be modified via INI files. See [Settings](settings.md) for more information.

## Using OpenMER

See [Usage Instructions](usage-instructions.md).