The science behind Adele's extraordinary success
Have you ever noticed how successful songs don't always have great lyrics? Despite logic suggesting that the most successful songs would have the best combination of great music and great lyrics, that doesn't always appear to be the case. Why? Because writing a song with both great music and great lyrics can actually be counter-productive. But get the combination right, and you might just be the next Adele.
Music and/or lyrics?
If you want to rile up a group of music fans, one of the easiest ways is to raise the issue of music versus lyrics. Which would you say is more important to you when listening to a song? I’m probably more music- than lyric-oriented, personally, though my focus definitely changes depending on the context.
Whichever way you lean, have you ever considered how the features of both the music and the lyrics of a song might affect the impact they have on you? More specifically, how they combine and interact to produce the most compelling experience possible?
If writing a hit song is your aim, perhaps you should.
Music, language and emotions
Music and language draw on similar processing mechanisms in the brain: their perception, their semantics — the meaning inherent in the words or notes — and their syntax, i.e., the rules governing their structure. For language, syntax means grammar; for music, note hierarchies. In fact, music is often viewed as a precursor to language, and both music and language are considered universal human traits across cultures.
What's more, both language and music can evoke powerful emotions. Indeed, one of the most commonly-reported reasons for engaging with music at all is its emotion-related qualities. We enjoy music because it moves us, makes us feel something.
And experiencing different emotions can actually make us process language — and thus song lyrics — in different ways. For example, if we're in a negative mood, we tend to process language more systematically, more analytically, in a more detail-oriented fashion. If we're in a positive mood, on the other hand, we tend to process language in a more heuristic fashion — using mental shortcuts — with less focused attention and in a more creative manner.
Grounded in evolution
The reasons for this are grounded in evolution. Negative moods are usually associated with an uncertain environment. To our ancient ancestors, such environments could be life-threatening, so vigilance and precise processing of new information was critical. Positive moods, on the other hand, tend to be associated with safer, more predictable environments that require less attention to detail and allow us to get by using heuristic 'rules of thumb’.
This can be easily demonstrated by 'priming' someone to feel a particular emotion or mood. Inducing a positive mood, for example, leads to poorer performance in sentence processing, while inducing a negative mood leads to better performance. Thus, transient mood states can affect the way language is processed. And what do you know, the valence of music (positive or negative, happy or sad) affects how music and lyrics interact in our processing!
Emotion and lyric processing
In a study carried out by Laura Jiménez-Ortega and colleagues at the Center for Human Evolution and Behavior in Madrid, volunteers were asked to rate the emotional connotation of happy, sad, angry and calm music played with and without lyrics. For happy and calm melodies, participants gave higher ratings of the intended emotion when there were no lyrics. For sad and angry melodies, however, participants gave higher ratings of the intended emotion when there were lyrics.
In other words, lyrics enhanced the perception of negative emotions (sadness and anger) but detracted from the perception of positive ones (happiness and calm).
In a similar vein, fMRI scans of listeners' brains reveal larger neural activations in response to sad music with lyrics, and to happy music without lyrics, compared with sad music without lyrics and happy music with them — particularly in the limbic system, the brain's 'emotion centre'. What's more, sad music with lyrics also activates Brodmann Area 47 — a region that specialises in processing both music and language syntax — to a greater extent than happy music with lyrics.
Rolling In The Deep
Consequently, if you wish to engage your audience on the deepest possible level — and why wouldn't you, since emotional engagement is the primary reason we listen to music in the first place — pairing negatively-valenced (sad) music with lyrics that cut like a knife is the optimal route to a deeply compelling song.
Which is exactly the pattern we see time and time again in the songs of Adele, who has sold over 100 million records. From Someone Like You to Hello, and even the more up-tempo Rolling In The Deep, the negatively-valenced music causes us to tune into the exquisite, deeply personal lyrics that touch our very core.
Not that Adele's voice doesn't work its own magic, for it surely does, and I'll be writing about that in a future post. But equally key to her success is the way her songs combine music and lyrics to engage us on the deepest level possible.
To learn more about how to maximise audience engagement through music, get your copy of The Experience Factor today.