Google recently released a paper called Brain2Music, which uses artificial intelligence to analyze functional magnetic resonance imaging (fMRI) data and automatically generate music from the recorded brain activity.
The researchers recruited five volunteers, had them listen to 15-second music clips spanning genres such as blues, classical, country, disco, hip-hop, jazz, metal, pop, reggae, and rock, and recorded the volunteers’ fMRI data while they listened.
The researchers then used this data to train a deep neural network to learn the relationship between brain-activity patterns and elements of music such as rhythm and emotion.
Brain2Music builds on MusicLM, the text-to-music model Google introduced previously, replacing the text conditioning with features derived from fMRI to generate 15-second music clips.
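The core mapping step described above can be sketched in miniature. The toy code below is a hypothetical illustration, not the paper's actual implementation: it fits a ridge regression from synthetic "fMRI response" vectors to synthetic "music embedding" vectors, standing in for the learned mapping whose predicted embedding would then condition the music generator. All array shapes and data here are made-up placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic placeholder dimensions (not from the paper):
n_clips, n_voxels, emb_dim = 200, 500, 128

# Fake ground-truth mapping used only to generate toy training data.
true_W = rng.normal(size=(n_voxels, emb_dim))
X = rng.normal(size=(n_clips, n_voxels))                     # fMRI responses, one row per clip
Y = X @ true_W + 0.1 * rng.normal(size=(n_clips, emb_dim))   # target music embeddings

# Ridge regression in closed form: W = (X^T X + lam*I)^-1 X^T Y
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_voxels), X.T @ Y)

# For a new scan, the predicted embedding would be handed to the
# generative music model as its conditioning signal.
x_new = rng.normal(size=(1, n_voxels))
pred_emb = x_new @ W
print(pred_emb.shape)  # (1, 128)
```

The closed-form ridge solve keeps the sketch self-contained; a real brain-decoding pipeline would involve preprocessing the fMRI volumes and a far richer embedding model.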
The results of the study show that one 15-second clip generated by Brain2Music, reconstructed from a volunteer listening to “Oops! …I Did It Again”, was virtually identical to the original music heard in the test.