
Secret Mathematical Patterns Revealed in Bach’s Music

Physicists found that the music of Johann Sebastian Bach contains mathematical patterns that help convey information

[Image: Bach's prelude for klavier score.]

Baroque German composer Johann Sebastian Bach produced music that is so scrupulously structured that it’s often compared to math. Although few among us are emotionally affected by mathematics, Bach’s works—and music in general—move us. It’s more than sound; it’s a message. And now, thanks to tools from information theory, researchers are starting to understand how Bach’s music gets that message across.

By representing scores as simple networks of dots, called nodes, connected by lines, called edges, scientists quantified the information conveyed by hundreds of Bach’s compositions. An analysis of these musical networks published on February 2 in Physical Review Research revealed that Bach’s many musical styles, such as chorales and toccatas, differed markedly in how much information they communicated—and that the musical networks contained structures that could make their messages easier for human listeners to understand.

“I just found the idea really cool,” says physicist Suman Kulkarni of the University of Pennsylvania, lead author of the new study. “We used tools from physics without making assumptions about the musical pieces, just starting with this simple representation and seeing what that can tell us about the information that is being conveyed.”



Researchers quantified the information content of everything from simple sequences to tangled networks using information entropy, a concept introduced by mathematician Claude Shannon in 1948.

As its name suggests, information entropy is mathematically and conceptually related to thermodynamic entropy. It can be thought of as a measure of how surprising a message is—where a “message” can be anything that conveys information, from a sequence of numbers to a piece of music. That perspective may feel counterintuitive, given that, colloquially, information is often equated with certainty. But the key insight of information entropy is that learning something you already know isn’t learning at all.

A conversation with a person who can only ever say one thing, such as the character Hodor in the television series Game of Thrones, who only says “Hodor,” would be predictable but uninformative. A chat with Pikachu would be a bit better; the Pokémon can only say the syllables in its name, but it can rearrange them, unlike Hodor. Likewise, a musical piece with just one note would be relatively easy for the brain to “learn,” or accurately reproduce as a mental model, but the piece would struggle to get any kind of message across. And watching flips of a double-headed coin would yield no information at all.
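The arithmetic behind these examples is simple. Here is a minimal sketch of Shannon’s measure in Python (the function name and the example distributions are illustrative, not taken from the study):

```python
from math import log2

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits.
    Impossible outcomes (p = 0) contribute nothing to the surprise."""
    return -sum(p * log2(p) for p in probs if p > 0)

entropy_bits([1.0])       # Hodor, or a double-headed coin: 0 bits, no surprise
entropy_bits([0.5, 0.5])  # a fair coin flip: 1 bit per flip
entropy_bits([0.9, 0.1])  # a heavily biased coin: about 0.47 bits
```

The more evenly spread the possibilities, the more surprising each outcome is on average, and the more information each one carries.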

Of course, packing a message full of information isn’t much good if whatever—or whoever—receives it can’t accurately understand that information. And when it comes to musical messages, researchers are still working out how we learn what music is trying to tell us.

“There are a few different theories,” says cognitive scientist Marcus Pearce of Queen Mary University of London, who wasn’t involved in the recent Physical Review Research study. “The main one, I think, at the moment, is based on probabilistic learning.”

In this framework, “learning” music means building up accurate mental representations of the real sounds we hear—what researchers call a model—through an interplay of anticipation and surprise. Our mental models predict how likely it is that a given sound will come next, based on what came before. Then, Pearce says, “you find out whether the prediction was right or wrong, and then you can update your model accordingly.”
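This anticipate-and-update loop can be caricatured with a toy model. The sketch below—a hypothetical count-based bigram predictor, not a model from either study—assigns each incoming note a surprisal (the negative log of the probability the model gave it) and then updates its counts:

```python
from math import log2
from collections import defaultdict

def surprisal_stream(notes):
    """Toy probabilistic learner: predict each next note from the previous
    one using transition counts with add-one smoothing, report the surprisal
    of what actually arrives, then update the counts."""
    counts = defaultdict(lambda: defaultdict(int))
    alphabet = set(notes)  # assume the listener knows the possible notes
    surprisals = []
    for prev, nxt in zip(notes, notes[1:]):
        total = sum(counts[prev].values()) + len(alphabet)  # add-one smoothing
        p = (counts[prev][nxt] + 1) / total
        surprisals.append(-log2(p))  # how wrong was the prediction?
        counts[prev][nxt] += 1       # update the model accordingly
    return surprisals

# A repeating phrase becomes less surprising as the model learns it:
s = surprisal_stream(["C", "E", "G", "C", "E", "G", "C", "E", "G"])
```

Fed the repeating phrase C–E–G, each recurrence of a transition yields a smaller surprisal than the one before, which is the flavor of learning the theory describes.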

Kulkarni and her colleagues are physicists, not musicians. They wanted to use the tools of information theory to scour music for informational structures that could have something to do with how humans glean meaning from melody.

So Kulkarni boiled down 337 Bach compositions into webs of interconnected nodes and calculated the information entropy of the resulting networks. In these networks, each note of the original score is a node, and each transition between notes is an edge. For example, if a piece included an E note followed by a C and a G played together, the node representing E would be connected to the nodes representing C and G.
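Under that description, building the network amounts to linking every note sounded at one moment to every note sounded at the next. A minimal sketch of the construction (the function and the input encoding are illustrative assumptions, not the authors’ code):

```python
def transition_network(events):
    """Build a directed note-transition network. `events` is a sequence of
    sets of simultaneous notes (a single note or a chord); nodes are notes
    and an edge links each note to every note that follows it."""
    edges = set()
    for current, following in zip(events, events[1:]):
        for a in current:
            for b in following:
                edges.add((a, b))
    return edges

# The example from the text: an E followed by a C and a G played together.
transition_network([{"E"}, {"C", "G"}])  # edges E->C and E->G
```

Repeating this over a full score yields the web of nodes and edges whose entropy the researchers then computed.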

Networks of note transitions in Bach’s music packed more of an informational punch than randomly generated networks of the same size—the result of greater variation in the networks’ nodal degrees, or the number of edges connected to each node. Additionally, the scientists uncovered variation in the information structure and content of Bach’s many compositional styles. Chorales, a type of hymn meant to be sung, yielded networks that were relatively sparse in information, though still more information-rich than randomly generated networks of the same size. Toccatas and preludes, musical styles that are often written for keyboard instruments such as the organ, harpsichord and piano, had higher information entropy.
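One standard way to tie a network’s degrees to the information it generates is the entropy rate of an unbiased random walk on it: H = Σᵢ (kᵢ/2m) log₂ kᵢ, where kᵢ is the degree of node i and m is the number of edges. The sketch below uses that common measure to illustrate the role of degree variation; it is an assumption for illustration, and the paper’s exact pipeline may differ:

```python
from math import log2

def walk_entropy(degrees):
    """Entropy rate, in bits per step, of an unbiased random walk on an
    undirected network with the given degree sequence:
    H = sum_i (k_i / 2m) * log2(k_i), where 2m = sum(degrees)."""
    two_m = sum(degrees)
    return sum((k / two_m) * log2(k) for k in degrees if k > 0)

# Same size (8 nodes, 8 edges), different degree spread:
hubby   = walk_entropy([4, 4, 2, 2, 1, 1, 1, 1])  # two hubs: 1.25 bits/step
uniform = walk_entropy([2] * 8)                    # a ring:   1.0 bit/step
```

Holding the node and edge counts fixed, the hub-heavy degree sequence carries more bits per step than the uniform one, echoing how degree variation boosted the information in Bach’s networks.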

“I was particularly excited by the higher levels of surprise in the toccatas than in the chorale works,” says study co-author and physicist Dani Bassett of the University of Pennsylvania. “These two sorts of pieces feel different in my bones, and I was interested to see that distinction manifest in the compositional information.”

Network structures in Bach’s compositions might also make it easier for human listeners to learn those networks accurately. Humans don’t learn networks perfectly. “We have biases,” Bassett says. “We kind of ignore some of the local information in favor of seeing the bigger informational picture across the entire system,” they add. By modeling this bias in how we build our mental models of complex networks, the researchers compared the total information of each musical network to the amount of information a human listener would glean from it.

The musical networks contained clusters of note transitions that might help our biased brains “learn” the music—to reproduce the music’s informational structure accurately as a mental model—without sacrificing much information.

“The particular kind of way in which they capture learnability is pretty interesting,” says Peter Harrison of the University of Cambridge, who wasn’t involved in the study. “It's very reductive in a certain sense. But it’s quite complementary to other theories we have out there, and learnability is a pretty hard thing to get a handle on.”

This type of network analysis isn’t particular to Bach—it could work for any composer. Pearce says it would be interesting to use the approach to compare different composers or look for informational trends through music history. For her part, Kulkarni is excited to analyze the informational properties of scores from beyond the Western musical tradition.

Music isn’t just a sequence of notes, though, Harrison notes. Rhythm, volume, instruments’ timbre—these elements and more are important dimensions of the musical messages that weren’t considered in this study. Kulkarni says she’d be interested in including these aspects of music in her networks. The process could also work the other way, Harrison adds: rather than boiling musical features down to a network, he’s curious how network features translate to things that a musician would recognize.

“A musician would say, ‘What are the actual musical rules, or the musical characteristics, that are driving this? Can I hear this on a piano?’” Harrison says.

Finally, it’s not yet clear how, exactly, the network patterns identified in the new study translate into the lived experience of listening to a Bach piece—or any music, Pearce says. Settling that will be a matter for music psychology, he continues. Experiments could reveal “if, actually, those kinds of things are perceivable by people and then what effects they have on the pleasure that people have when they're listening to music.” Likewise, Harrison says he’d be interested in experiments testing whether the types of network-learning mistakes the researchers modeled in this study are actually important for how people learn music.

“The fact that humans have this kind of imperfect, biased perception of complex informational systems is critical for understanding how we engage in music,” Bassett says. “Understanding the informational complexity of Bach’s compositions opens new questions regarding the cognitive processes that underlie how we each appreciate different sorts of music.”