NEUROSCAPES
Composing with the Human Mind
The Process
This project explores a musical system that translates brainwave patterns into a four-movement composition, structured much like a symphony.
Dataset
I used the Visually Evoked Potential EEG (VEP-EEG) dataset hosted on NITRC, which contains recordings from an 18-subject visual oddball experiment. I chose it for its detailed metadata and consistency, which made it well suited to mapping neural responses.
Preprocessing
Before translation, I performed preprocessing (e.g. filtering, segmentation around event windows) and feature extraction (e.g. power in frequency bands, amplitude peaks) using scripts written in VS Code, then graphed the extracted data points.
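For readers curious about what this stage looks like in code, here is a minimal, runnable sketch of the same kind of pipeline in Python on synthetic data. The sampling rate, filter band, epoch window, and event times are illustrative assumptions, not the VEP-EEG dataset's actual values.

```python
# A sketch of the preprocessing pipeline on synthetic data; all
# numeric choices below are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, welch

fs = 256                                      # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
eeg_uv = rng.normal(0.0, 10.0, size=fs * 60)  # 60 s of stand-in EEG (µV)

# 1. Band-pass filter to 1-40 Hz, a range commonly kept for ERP work.
b, a = butter(4, [1.0, 40.0], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, eeg_uv)

# 2. Segment around event windows: -200 ms to +800 ms per stimulus.
onsets = np.arange(2 * fs, 58 * fs, 2 * fs)   # stand-in stimulus onsets
pre, post = int(0.2 * fs), int(0.8 * fs)
epochs = np.stack([filtered[s - pre : s + post] for s in onsets])

# 3. Feature extraction per epoch: alpha-band (8-12 Hz) power and
#    peak amplitude, the kinds of values later mapped to sound.
features = []
for epoch in epochs:
    freqs, psd = welch(epoch, fs=fs, nperseg=128)
    band = (freqs >= 8) & (freqs <= 12)
    alpha_power = psd[band].sum() * (freqs[1] - freqs[0])  # approximate band power
    peak_uv = epoch[np.argmax(np.abs(epoch))]
    features.append((alpha_power, peak_uv))

print(f"{len(epochs)} epochs; first (alpha power, peak µV): {features[0]}")
```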
[Figure: Standard and Oddball ERP (µV), Subject 10]
Composing
Next, I converted the EEG signals into MIDI (Musical Instrument Digital Interface) data using csv-to-midi.evanking.io, mapping the signals to both pitch and amplitude. Finally, I imported the MIDI data into Ableton Live, where I arranged, layered, and orchestrated the material into what became NEUROSCAPES!
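The website handled the conversion for this project; the sketch below shows an equivalent mapping done offline with the mido library. The pitch range (C3 to C6), velocity range, tempo, note length, and inline sample values are all assumptions for illustration, not the tool's actual settings.

```python
# A sketch of an amplitude-to-MIDI mapping equivalent to the
# CSV-to-MIDI step; ranges and sample values are illustrative.
import mido

def rescale(value, lo, hi, out_lo, out_hi):
    """Linearly map value from [lo, hi] into the integer range [out_lo, out_hi]."""
    frac = (value - lo) / (hi - lo)
    return int(round(out_lo + frac * (out_hi - out_lo)))

# Stand-in ERP samples in µV (one column of an exported CSV).
samples_uv = [-3.2, 1.5, 8.7, 12.1, 5.0, -1.8, 0.4]
lo, hi = min(samples_uv), max(samples_uv)

mid = mido.MidiFile()                       # default 480 ticks per beat
track = mido.MidiTrack()
mid.tracks.append(track)
track.append(mido.MetaMessage('set_tempo', tempo=mido.bpm2tempo(96)))

for v in samples_uv:
    pitch = rescale(v, lo, hi, 48, 84)      # map voltage to C3..C6 (assumed range)
    velocity = rescale(abs(v), 0.0, max(abs(lo), abs(hi)), 40, 110)
    track.append(mido.Message('note_on', note=pitch, velocity=velocity, time=0))
    track.append(mido.Message('note_off', note=pitch, velocity=0, time=240))  # eighth note

mid.save('neuroscapes_sketch.mid')          # importable into Ableton Live
```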
EP Outline
I. Introduction
A melodic opening that ushers the listener into an uncanny classical world. The tone is at once playful and haunting, preparing the ground for the contrasts to come.
II. The Prairie (after the Standard ERP dataset)
A wide-open landscape rendered in music. The rhythm is deliberately nonstandard, unfolding in a cadenza-like manner. A solo flute runs freely across the texture, evoking a sense of boundless motion.
III. Clouds on the Prairie (after the Oddball ERP dataset)
Shadows gather over the openness. Here the rhythm becomes further unsettled, with deviations in the signal's peaks and maxima mirroring the distortions of perception. The texture thickens, charged with irregularity and shifting patterns.
IV. Rain
Order returns. The rhythm settles back into a standard pulse, as if the release of rain brings balance and resolution to the restless landscape.
