Physiological Responses To Audio-Visual Input (IRB-FY2022-6088)

Project Overview

This research study is designed to validate a new physiological sensing device called EmotiBit against gold-standard equipment, specifically the Brain Products LiveAmp and V-Amp. The overall goal is to learn more about the measurement of human physiological responses to visual input.

The project involves capturing research-grade physiological signals (such as heart rate, respiration, skin conductance responses, and body temperature) to assess changes in sympathetic and parasympathetic nervous system activity. The study is conducted by researchers Sean Montgomery, Franck Porteous, Esteban Romero, Ryan Morgenstern, and Principal Investigator Dr. Suzanne Dikker, affiliated with the NYU Department of Psychology.
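
As an illustration of how the device validation could be quantified, the sketch below resamples heart-rate estimates from the two devices onto a common time base and correlates them. The file names, column names, and resampling rate are assumptions; the study's actual analysis pipeline is not described in this README.

```python
# Minimal validation sketch (hypothetical file names and columns): compare
# heart-rate estimates from EmotiBit against a gold-standard (LiveAmp)
# ECG-derived series by resampling both onto a common time base and
# computing their Pearson correlation.
import numpy as np
import pandas as pd

def load_hr(path, time_col="timestamp_s", hr_col="hr_bpm"):
    """Load a heart-rate series as (time in seconds, HR in BPM)."""
    df = pd.read_csv(path)
    return df[time_col].to_numpy(), df[hr_col].to_numpy()

def compare_hr(emotibit_csv, liveamp_csv, fs=4.0):
    """Resample both series to `fs` Hz over their overlapping window
    and return the Pearson correlation between them."""
    t_e, hr_e = load_hr(emotibit_csv)
    t_l, hr_l = load_hr(liveamp_csv)
    t0, t1 = max(t_e[0], t_l[0]), min(t_e[-1], t_l[-1])
    t_common = np.arange(t0, t1, 1.0 / fs)
    hr_e_i = np.interp(t_common, t_e, hr_e)
    hr_l_i = np.interp(t_common, t_l, hr_l)
    return np.corrcoef(hr_e_i, hr_l_i)[0, 1]

if __name__ == "__main__":
    r = compare_hr("emotibit_hr.csv", "liveamp_hr.csv")
    print(f"EmotiBit vs. LiveAmp heart-rate correlation: r = {r:.3f}")
```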

Key Components and Stimuli

| Component | Description |
| --- | --- |
| New device | EmotiBit: an open-source multi-modal sensor designed for capturing research-grade physiological signals from various parts of the body. |
| Validation devices | Brain Products LiveAmp and V-Amp: used as "gold-standard" devices, intended for amplifying and digitizing electrophysiological signals (e.g., EEG, EMG, ECG, EOG). |
| Stimuli | Open Affective Standardized Image Set (OASIS): images from this set are presented to generate a range of affective physiological responses. OASIS contains 900 color images depicting animals, objects, people, and scenes, normed on valence (positivity/negativity) and arousal (intensity). The images are open-access and free for reuse and modification in research. |
| Data types | Physiological signals (via sensors attached to the perimeter, middle and ring finger, shoulder, and face) and video data (via webcam) for behavioral analysis. |
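
As an illustration of how stimuli might be drawn from OASIS, the sketch below samples images across the valence × arousal space so that a block spans a range of affective content. The norms file name and column names are assumptions, and the study's actual selection procedure is not described here.

```python
# Hypothetical sketch of selecting OASIS images spread across the
# valence x arousal space. The CSV path and column names ("Theme",
# "Valence_mean", "Arousal_mean") are assumptions about the norms file.
import pandas as pd

def sample_stimuli(norms_csv="oasis_norms.csv", n_per_quadrant=15, seed=0):
    """Return image identifiers sampled from the four quadrants formed by
    splitting the normed valence and arousal ratings at their medians."""
    norms = pd.read_csv(norms_csv)
    v_mid = norms["Valence_mean"].median()
    a_mid = norms["Arousal_mean"].median()
    selected = []
    for hi_valence in (True, False):
        for hi_arousal in (True, False):
            quadrant = norms[
                ((norms["Valence_mean"] >= v_mid) == hi_valence)
                & ((norms["Arousal_mean"] >= a_mid) == hi_arousal)
            ]
            picks = quadrant.sample(
                n=min(n_per_quadrant, len(quadrant)), random_state=seed
            )
            selected.append(picks)
    return pd.concat(selected)["Theme"].tolist()

if __name__ == "__main__":
    print(sample_stimuli()[:5])
```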

Methodology Summary

The study employs a dyadic design: two participants complete the procedure simultaneously while positioned next to each other, although they do not interact as part of the experimental protocol.

Participants undergo an A-B-A physical activity sequence (approx. 1 hour total):

  1. Seated (10 minutes): Participants are initially seated while watching images.
  2. Standing (10 minutes): Participants are asked to stand and continue watching images.
  3. Seated (10 minutes): Participants sit again to assess the renormalization of the sympathetic/parasympathetic response.

The dyadic design and A-B-A sequence serve two primary purposes: to validate protocols for synchronizing data streams across multiple people in real time, and to replicate past research suggesting that physiological responses may co-vary between individuals in the same room. To ensure attention, participants must respond with a button press whenever a new image appears.
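
The attention check and the need to align events across recording systems both depend on accurate timestamping of stimulus onsets and button presses. Below is a minimal sketch of how such event logging could be implemented; the use of PsychoPy, the response key, the per-image duration, and the file names are assumptions, since this README does not describe the actual experimental software.

```python
# Hypothetical trial loop: present images and log each image onset and
# button press against a shared clock, so behavioral events can later be
# aligned with the physiological recordings. The use of PsychoPy, the key
# mapping, and the timing values are assumptions, not the study's actual
# implementation.
from psychopy import core, event, visual

IMAGE_DURATION_S = 6.0  # assumed per-image display time

def run_block(image_paths, log_path="events.csv"):
    win = visual.Window(fullscr=True, color="grey")
    clock = core.Clock()  # shared time base for onsets and responses
    with open(log_path, "w") as log:
        log.write("event,image,time_s\n")
        for path in image_paths:
            visual.ImageStim(win, image=path).draw()
            win.flip()
            onset = clock.getTime()
            log.write(f"image_onset,{path},{onset:.4f}\n")
            event.clearEvents()
            # poll for button presses while the image is on screen
            while clock.getTime() - onset < IMAGE_DURATION_S:
                for key, t in event.getKeys(keyList=["space"], timeStamped=clock):
                    log.write(f"button_press,{path},{t:.4f}\n")
                core.wait(0.01)
    win.close()

if __name__ == "__main__":
    run_block(["img_001.jpg", "img_002.jpg"])
```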

References

  • Kurdi, B., Lozano, S., & Banaji, M. R. (2017). Introducing the Open Affective Standardized Image Set (OASIS). Behavior Research Methods, 49(2), 457–470.
  • EmotiBit.
  • Brain Products (LiveAmp and V-Amp).
