Repo for Jones, Thyer, Suplica & Awh, 2024 - Cortically disparate visual features evoke content-independent load signals during storage in working memory
Experiment 1 compares colors and orientations. Experiment 2 compares colors and motion coherences.
The GitHub repository contains the code; the OSF repository contains the code and the data.
- `ExperimentN/` contains the code to run the task in PsychoPy (`experiment/`), preprocess the data (`preprocessing/`), and run the analyses for Experiment N (`analysis/`). Each analysis directory contains another README walking through the specific scripts.
- `eye_movement_comparisons.ipynb` pools the decodability results across experiments to examine the role of eye movements in predicting decodability, for Figure 8. It calls functions from `eeg_decoder.py` for loading in the data.
- `env.yml` is a minimal requirements file for building a Python environment that should be able to run everything.
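As a setup sketch (assuming `env.yml` is a conda environment file, which the `.yml` requirements format suggests), the environment can typically be built and activated like this; the environment name used below is hypothetical and would be whatever is declared inside `env.yml`:

```shell
# Build a conda environment from the repo's requirements file
# (assumes conda or mamba is installed and on the PATH)
conda env create -f env.yml

# Activate it; replace "wm-load" with the name defined in env.yml
conda activate wm-load
```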