Researchers created an AI model that produces short music clips based on what someone listened to. (Image Credit: Mohamed_hassan)
Google researchers have developed Brain2Music, an AI tool that generates short song snippets from the fMRI brain scans of people listening to music. Five volunteers participated in the study, listening to 15-second music clips while the team recorded their fMRI data. Anyone interested can head over to Google's page and hear how closely the reconstructions resemble the original songs.
The team used part of the brain imaging data and the accompanying music clips to train an AI model that looks for links between musical features, including genre, instruments, rhythm, and mood, and the volunteers' brain signals. Each clip's mood was categorized as happy, exciting, tender, scary, or angry. The AI formed these links separately for each participant, matching that person's brain activity patterns to the musical features.
Once trained, the AI translated the remaining, unseen brain imaging data into a representation capturing musical aspects of the original songs. The team then fed that representation into MusicLM, another Google AI model designed to produce music from text prompts such as "a calming violin melody backed by a distorted guitar riff." The reconstructed music agreed with the original on mood about 60% of the time, and the AI identified classical music more accurately than other genres.
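For a sense of what that 60% figure means, mood agreement can be read as the fraction of clips whose reconstruction shares the original's mood label. A minimal sketch, assuming hypothetical per-clip mood labels (the study's actual evaluation procedure may differ):

```python
# Illustrative sketch, not Google's code: a simple mood-agreement score.
# The mood labels below are hypothetical examples, one per 15-second clip.

MOODS = ["happy", "exciting", "tender", "scary", "angry"]

def mood_agreement(original_moods, reconstructed_moods):
    """Fraction of clips whose reconstruction matches the original's mood."""
    assert len(original_moods) == len(reconstructed_moods)
    matches = sum(o == r for o, r in zip(original_moods, reconstructed_moods))
    return matches / len(original_moods)

# Example: 3 of 5 hypothetical clips match in mood.
orig = ["happy", "tender", "scary", "angry", "exciting"]
recon = ["happy", "tender", "angry", "angry", "tender"]
print(mood_agreement(orig, recon))  # 0.6
```

A score of 0.6 here would mirror the roughly 60% mood agreement the team reported across the reconstructed clips.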
Ultimately, the goal behind this project is to offer more insight into how the human brain processes music. So far, the team found that regions of the primary auditory cortex became active as participants listened to the songs, and that the lateral prefrontal cortex appears to play a crucial role in interpreting a song's meaning, though this finding requires further research. Previous work showed that activity shifted across various prefrontal cortex regions while a freestyle rapper improvised.
In the future, researchers could examine how the brain processes different musical moods or genres. The team also hopes to determine whether AI can recreate music that people merely imagine.
Have a story tip? Message me at: http://twitter.com/Cabe_Atwell