Our results show that a hierarchical distributed network is synchronized between individuals during the processing of extended musical sequences, and provide new insight
into the temporal integration of complex and biologically salient auditory sequences.

Music is a cultural universal and a rich part of the human experience. Brain imaging studies have identified an array of structures that underlie critical components of music, including pitch (Zatorre et al., 1994; Patel & Balaban, 2001), harmony (Janata et al., 2002; Passynkova et al., 2005), rhythm (Snyder & Large, 2005; Grahn & Rowe, 2009), timbre (Menon et al., 2002; Deike et al., 2004) and musical syntax (Levitin & Menon, 2005; Abrams et al., 2011; Oechslin et al., 2012). A drawback of probing neural substrates of individual musical features is that artificially constructed laboratory stimuli do not represent music as it is commonly heard, limiting the ecological validity of such studies. Furthermore, this componential approach fails to tap into one of the most important aspects of listeners’ musicality – the ability to integrate components of musical information over extended time periods (on the order of minutes)
into a coherent perceptual gestalt (Leaver et al., 2009). Examining the synchronization of brain responses across listeners constitutes a novel approach for exploring neural substrates of musical information processing. Inter-subject synchronization (ISS) using functional magnetic resonance imaging
(fMRI) detects common stimulus-driven brain structures by calculating voxel-wise correlations in fMRI activity over time between subjects (Hasson et al., 2004). The theoretical basis for using this approach is that brain structures that are consistently synchronized across subjects during an extended stimulus constitute core brain regions responsible for tracking structural elements of that stimulus over time (Hasson et al., 2010). ISS represents a fundamentally different approach, and offers advantages, relative to conventional fMRI methods (Wilson et al., 2008; see Fig. S1). ISS allows us to examine cognitive processes that require the integration of information over extended time periods; this is critical for the study of music, in which the structure of musical elements is manifested over time. Furthermore, ISS does not rely on a priori assumptions about specific stimulus events or on subtraction paradigms that require comparison of discrete perceptual or cognitive events. Our goal was to examine shared neural representations underlying the processing of natural musical stimuli (‘Natural Music’; Fig. 1). We used ISS to identify brain regions that showed synchronized activity across individuals in response to music.
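The core computation behind ISS, as described above, is a voxel-wise Pearson correlation of fMRI time series between subjects. The following is a minimal illustrative sketch of that idea in NumPy, not the authors' analysis pipeline; the function name, the array layout (subjects × voxels × timepoints), and the choice to average all pairwise correlations are assumptions for illustration, and real ISS analyses involve additional steps (preprocessing, spatial normalization, statistical thresholding) that are omitted here.

```python
import numpy as np

def inter_subject_synchronization(data):
    """Mean pairwise inter-subject correlation per voxel (illustrative sketch).

    data : array of shape (n_subjects, n_voxels, n_timepoints),
           one fMRI time series per voxel per subject.
    Returns an (n_voxels,) array: the average Pearson r, over all
    subject pairs, of that voxel's time series.
    """
    n_subj, n_vox, n_t = data.shape
    # z-score each voxel's time series within each subject
    z = (data - data.mean(axis=2, keepdims=True)) / data.std(axis=2, keepdims=True)
    iss = np.zeros(n_vox)
    n_pairs = 0
    for i in range(n_subj):
        for j in range(i + 1, n_subj):
            # For z-scored series, Pearson r is the mean of the product
            iss += (z[i] * z[j]).mean(axis=1)
            n_pairs += 1
    return iss / n_pairs
```

A voxel driven by a shared stimulus-locked signal across subjects yields a high mean correlation, whereas a voxel dominated by idiosyncratic noise yields a value near zero, which is exactly the contrast ISS exploits to identify stimulus-tracking regions.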