Questions posed at the beginning of this book include: “What is music? Where does it come from? Why do some sequences of sound move us while others do not? Where does creativity come from? What role does perception play in all of this?” The author presents these along with broader questions about memory, perception, creativity, and the common instrument underlying them all: the human brain. Music is dynamic, unfolding across time; meter and rhythm are what move it forward, the engine driving virtually all music. Virtually every civilization and culture considers movement an integral part of music making and listening, and every known culture bases its music on octaves.
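The octave relationship behind that last claim is simple arithmetic: a pitch one octave higher has exactly double the frequency, and in Western twelve-tone equal temperament each semitone multiplies frequency by the twelfth root of two. A minimal sketch (my illustration, not from the book):

```python
# Sketch (not from the book) of the octave relationship: an octave up
# doubles a pitch's frequency, and in twelve-tone equal temperament
# each semitone multiplies frequency by 2**(1/12).

def octave_up(freq_hz: float) -> float:
    """Return the pitch one octave above the given frequency."""
    return freq_hz * 2.0

def semitone_up(freq_hz: float, n: int = 1) -> float:
    """Return the pitch n equal-tempered semitones above the given frequency."""
    return freq_hz * 2.0 ** (n / 12)

a4 = 440.0                              # concert A
a5 = octave_up(a4)                      # 880.0 Hz, heard as the "same" note
a5_via_semitones = semitone_up(a4, 12)  # twelve semitones also give 880.0 Hz
```

Listeners across cultures treat 440 Hz and 880 Hz as versions of the same note, which is why octave equivalence is such a plausible universal.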
There is a popular notion that art and music are processed on the right side of the brain while language and mathematics are processed on the left, but music is in fact distributed throughout the brain and involves nearly every neural subsystem. Music tells us how to feel; it evokes and manipulates emotions. Levitin believes that by better understanding what music is and where it comes from, we may better understand our motives, fears, desires, memories, and even communication in the broadest sense, creating the story of how the brain and music coevolved. Levitin defines the fundamental building blocks of musical sound as loudness, pitch, contour, duration, rhythm, tempo, timbre, spatial location, and reverberation. All of these attributes are separable: each can be varied without altering the others. Our brains organize them into higher-level concepts such as meter, key, harmony, and melody.
Pitch is said to be a primary means by which emotion is portrayed: mood, excitement, calm, romance, and danger are signaled most decisively by pitch. Our brains also subconsciously keep track of how many times each note is sounded, which helps us identify the tonic. A composer may intend C major, but if A is hit repeatedly, we will hear A minor; in establishing a key, it is the observed notes, not the composer's intention, that count. Some studies suggest that the associations of happy with major and sad with minor might be innate, but they are more likely specific cultural associations. By the age of five, most children have internalized rules about which chord progressions are legal or typical of their culture's music, and they can readily detect deviations from standard sequences. The brain accomplishes this by creating networks of neurons that form abstract representations of musical structure and rules.
One study referenced, the owl experiment, demonstrated restoration of the missing fundamental. When Strauss's The Blue Danube Waltz was played with the fundamental frequency removed, scientists found that they “were hearing the firing rates of the neurons and they were identical to the frequency of the missing fundamental.” Another study, on the importance of the attack in how we perceive notes, had a scientist cut the attack from a set of recordings. She found that it was then impossible for most people to identify the instrument being played, showing that the initial portion of a sound has a major influence on what we hear.
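The restoration effect can be sketched numerically. A complex tone consists of harmonics at integer multiples of a fundamental; even when the fundamental itself is removed, its frequency is still implied as the greatest common divisor of the remaining harmonics, which is roughly the rate at which the owl's neurons fired. This is my illustration of the arithmetic, not the book's:

```python
# Hedged sketch (my illustration, not the book's): harmonics sit at
# integer multiples of a fundamental, so a removed fundamental can be
# inferred as the greatest common divisor of the remaining partials.
from functools import reduce
from math import gcd

def missing_fundamental(harmonics_hz: list[int]) -> int:
    """Infer the fundamental (in Hz) implied by a set of harmonics."""
    return reduce(gcd, harmonics_hz)

# Harmonics of a 110 Hz tone with the 110 Hz component itself removed:
print(missing_fundamental([220, 330, 440, 550]))  # -> 110
```

The brain performs something like this inference automatically, which is why we still hear the pitch of the absent fundamental.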
At a neural level, playing an instrument requires the orchestration of regions in our cerebellum and brain stem as well as higher cognitive systems such as the motor cortex (in the posterior frontal lobe) and the planning regions of our frontal lobes, the most advanced regions of the brain. Loudness is a psychological phenomenon that exists only in the mind; it corresponds to a sound's amplitude and is measured in decibels, a dimensionless logarithmic unit. The brain stem and the dorsal cochlear nucleus can distinguish between consonance and dissonance, and this happens before the cortex even gets involved. Neuroscientists deconstruct sound into its components and study which brain regions are involved in processing them. Gestalt theorists apply the idea of mental grouping to music: your mind can simultaneously group the orchestra, its sections, and individual players. Our brains are capable of analyzing dozens of different frequencies at once and of putting them back together in the right way. Time, timbre, and amplitude are all factors in grouping, along with frequency and pitch. Neurobiological subsystems for different attributes of sound separate early on, which suggests that the brain's general mechanisms process them independently of one another; yet experience and attention can influence grouping, suggesting that portions of the grouping process are under conscious, cognitive control.
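The decibel is dimensionless because it expresses a ratio, not an absolute quantity: a sound's intensity relative to a reference intensity, on a logarithmic scale. A minimal sketch of the standard formula (my addition, assuming the conventional hearing-threshold reference, not an example from the book):

```python
# Hedged sketch (not from the book): decibels express the ratio of a
# sound's intensity to a reference intensity on a logarithmic scale,
# which is why the unit is dimensionless.
from math import log10

REFERENCE_INTENSITY = 1e-12  # W/m^2, conventional threshold of human hearing

def intensity_to_db(intensity_w_m2: float) -> float:
    """Convert an absolute intensity to decibels relative to the hearing threshold."""
    return 10 * log10(intensity_w_m2 / REFERENCE_INTENSITY)

print(intensity_to_db(1e-12))  # -> 0.0   (threshold of hearing)
print(intensity_to_db(1e-2))   # -> 100.0 (very loud, e.g. heavy machinery)
```

The logarithmic scale mirrors perception: equal ratios of intensity are heard as roughly equal steps in loudness.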
The human brain is divided into four lobes: the frontal lobe, associated with planning, self-control, and making sense of the signals our senses receive; the temporal lobe, associated with hearing and memory; the parietal lobe, associated with motor movement and spatial skill; and the occipital lobe, associated with vision. The cerebellum is involved in emotions and the planning of movements. Musical activity involves nearly every region of the brain that we know about and nearly every neural subsystem, with different aspects of the music handled by different neural regions. Listening to music starts with the subcortical structures: the cochlear nuclei, the brain stem, and the cerebellum. From there it moves to the auditory cortex on both sides of the brain. Following along with familiar music recruits additional regions, including the hippocampus (the memory center) and subsections of the frontal lobe. Tapping along involves the cerebellum. Performing involves the frontal lobes for planning behavior, the motor cortex in the posterior frontal lobe, and the sensory cortex for tactile feedback. Reading music involves the visual cortex in the occipital lobe. Listening to or recalling lyrics uses language centers in the temporal and frontal lobes. The emotional response involves structures including the cerebellar vermis and the amygdala.
The brain is massively parallel, with processes distributed throughout and a capacity for reorganization far beyond what was thought until recently. This capacity, called neuroplasticity, suggests that regional specificity may be temporary and that important mental functions can move to other regions after trauma or brain damage, a finding with implications for therapy. It also furthers our understanding of how particular patterns in music give rise to particular patterns of activity in the brain. To acknowledge that the brain constructs a version of reality, we must reject the idea that the brain holds an isomorphic representation of the world; instead, it represents the world in mental, or neural, codes. Left-brained people are said to be writers, businessmen, and engineers, while right-brained people are said to be artists, dancers, and musicians, but this view is overly simplistic. All of these activities require both hemispheres, though some tasks are lateralized, and lateralization is found in music: processing melodic shape is largely right-hemisphere work, while discriminating individual tones is left-hemisphere work. Musical training seems to shift some music processing, along with many other aspects of musical interpretation, from the right hemisphere to the left as musicians learn to talk and think about music in linguistic terms.
One of the most important qualifiers is that music happens over time: as tones unfold, they lead us to predict what comes next. Neural firings produce small electric currents that can be measured with an EEG (electroencephalogram). The EEG is well suited to observing musical behavior because it is time based and offers the best temporal resolution of any tool for studying the human brain. According to studies using the EEG, within 150-400 milliseconds the brain is putting together musical syntax to predict what comes next. Other tools, such as fMRI and PET scans, trace blood flow through the brain and show where it is concentrated; these can localize brain activity to within millimeters. Left hemisphere regions were found to actively track musical structure, and they are associated with regions that respond to speech. Sound processing begins at the eardrum, where sound is segregated by pitch and then routed through separate processing circuits. Speech circuits decompose and organize phonemes, while pitch, timbre, contour, and rhythm are sorted; the frontal lobe then puts it all together to find structure or patterns. This generally covers the neurobiology and moves us to the brain mechanisms underlying emotion and memory.
Levitin, D. (2006). This is your brain on music: The science of a human obsession. New York, N.Y.: Dutton.