Prediction Error VR (reach-to-touch)

uploaded by Lukas Gehrke on 2021-03-01
last modified on 2021-03-03
authored by Lukas Gehrke, Sezen Akman, Albert Chen, Pedro Lopes, Klaus Gramann

OpenNeuro Accession Number: ds003552
Files: 157, Size: 6.53GB, Subjects: 19, Session: 1
Available Tasks: ReachToTouchPredictionError
Available Modalities: channels, coordsystem, eeg, electrodes, events

README

Prediction Error EEG & Motion Study

This is the dataset of the study "Prediction Error". In short, 19 participants were tested in a virtual reality (VR) reach-to-object task. In the task, participants experienced visual or visual with vibrotactile feedback (all subjects), or visual with vibrotactile and electrical muscle stimulation (EMS) feedback (10 subjects). In 25% of trials the feedback, the 'button selection', was provided prematurely, resulting in prediction error / mismatch ERPs (oddball-style paradigm). Participants rated their interactive experience on the Immersion and Presence Questionnaire (IPQ) and their workload on the NASA-TLX.
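
For orientation, a minimal Python sketch for identifying the mismatch trials in the BIDS events files follows (the file path and the 'trial_type' column name are assumptions; check the released events.tsv files for the actual entities and labels):

```python
from pathlib import Path
import pandas as pd

# Hypothetical path and column names -- inspect the released events.tsv
# files for the actual BIDS entities and labels.
bids_root = Path("ds003552")
events_file = bids_root / "sub-001" / "eeg" / "sub-001_task-ReachToTouchPredictionError_events.tsv"

events = pd.read_csv(events_file, sep="\t")
print(events.columns.tolist())            # inspect the available columns first

# Assuming a column such as 'trial_type' separates normal from mismatch trials:
if "trial_type" in events.columns:
    proportions = events["trial_type"].value_counts(normalize=True)
    print(proportions)                    # mismatch trials should make up roughly 25%
```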

Details about the study can be found in the following publication: Detecting Visuo-Haptic Mismatches in Virtual Reality using the Prediction Error Negativity of Event-Related Brain Potentials. Lukas Gehrke, Sezen Akman, Pedro Lopes, Albert Chen, Avinash Kumar Singh, Hsiang-Ting Chen, Chin-Teng Lin and Klaus Gramann. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). ACM, New York, NY, USA, Paper 427, 11 pages. DOI: https://doi.org/10.1145/3290605.3300657

A full repository including data, experimental VR protocol (Unity), and publication resources can be found at OSF, doi: 10.17605/OSF.IO/X7HNM

Available Data

Data include 64-channel EEG + 1 reference channel and 12-channel motion data (6 DOF right hand, 6 DOF head). The motion metadata is formatted according to the current preliminary BIDS Motion specification. Derivatives of the motion data (velocity and acceleration) are appended as additional channels in the motion recordings.
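
As an illustration, one subject's EEG and motion recordings could be loaded with MNE-Python roughly as in the sketch below (the exact file names and the motion channel labels, including the appended velocity and acceleration channels, should be checked against the dataset):

```python
from pathlib import Path
import mne

# Hypothetical file names -- adjust to the actual BIDS entities in the dataset.
bids_root = Path("ds003552")
sub = "sub-001"
eeg_set = bids_root / sub / "eeg" / f"{sub}_task-ReachToTouchPredictionError_eeg.set"
motion_set = bids_root / sub / "motion" / f"{sub}_task-ReachToTouchPredictionError_motion.set"

# EEG: 64 channels + 1 reference, stored as EEGLAB .set/.fdt
raw_eeg = mne.io.read_raw_eeglab(eeg_set, preload=True)
print(len(raw_eeg.ch_names), raw_eeg.info["sfreq"])   # expect 65 channels at 250 Hz

# Motion: 6 DOF hand + 6 DOF head, with velocity/acceleration channels appended
raw_motion = mne.io.read_raw_eeglab(motion_set, preload=True)
print(raw_motion.ch_names)
```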

Data Format and Derivation

Original data were recorded in .xdf format using Lab Streaming Layer (labstreaminglayer). A /sourcedata directory is currently missing because our .xdf files did not comply with GDPR. Both EEG and motion data are made available in EEGLAB-compatible format (.set and .fdt files). To derive them from the raw .xdf files, each participant's recordings were appended and resampled to 250 Hz. The synchronized EEG and motion data were then separated and are available as independent data sources in a BIDS-like structure.
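
Since /sourcedata is not shipped, this derivation cannot be reproduced from the dataset itself, but the described steps correspond roughly to a pipeline like the following sketch (the .xdf file name, LSL stream names, and channel layout are assumptions):

```python
import numpy as np
import pyxdf

# Illustrative only: the raw .xdf files are not part of this dataset (GDPR).
streams, _ = pyxdf.load_xdf("recording.xdf")        # hypothetical file name

# Separate the synchronized EEG and motion streams by their LSL stream names
by_name = {s["info"]["name"][0]: s for s in streams}
eeg = by_name["EEG"]                                # assumed stream name
motion = by_name["Motion"]                          # assumed stream name

# Bring both streams onto a common 250 Hz time base. A real pipeline would
# low-pass filter before downsampling; linear interpolation keeps this short.
fs = 250.0
t0 = max(eeg["time_stamps"][0], motion["time_stamps"][0])
t1 = min(eeg["time_stamps"][-1], motion["time_stamps"][-1])
t_new = np.arange(t0, t1, 1.0 / fs)

def resample(stream):
    data = np.asarray(stream["time_series"], dtype=float)
    return np.column_stack(
        [np.interp(t_new, stream["time_stamps"], data[:, ch]) for ch in range(data.shape[1])]
    )

eeg_250, motion_250 = resample(eeg), resample(motion)
```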

Authors

  • Lukas Gehrke
  • Sezen Akman
  • Albert Chen
  • Pedro Lopes
  • Klaus Gramann

Dataset DOI

10.18112/openneuro.ds003552.v1.1.0

License

CC0

Acknowledgements

We thank Avinash Singh, Tim Chen and C.-T. Lin from the University of Sydney (New South Wales, Australia) for their help in developing the task.

How to Acknowledge

Please cite: Lukas Gehrke, Sezen Akman, Albert Chen, Pedro Lopes, Klaus Gramann (2021, March 1). Prediction Error: A reach-to-touch Mobile Brain/Body Imaging Dataset. https://doi.org/10.17605/OSF.IO/X7HNM

Funding

References and Links

  • Detecting Visuo-Haptic Mismatches in Virtual Reality using the Prediction Error Negativity of Event-Related Brain Potentials. Lukas Gehrke, Sezen Akman, Pedro Lopes, Albert Chen, Avinash Kumar Singh, Hsiang-Ting Chen, Chin-Teng Lin and Klaus Gramann. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). ACM, New York, NY, USA, Paper 427, 11 pages. DOI: https://doi.org/10.1145/3290605.3300657

Ethics Approvals

  • GR1020180603

How To Cite

Lukas Gehrke, Sezen Akman, Albert Chen, Pedro Lopes and Klaus Gramann (2021). Prediction Error VR (reach-to-touch). OpenNeuro. [Dataset] doi: 10.18112/openneuro.ds003552.v1.1.0

Dataset File Tree

Git Hash: e359722 

