
EDM...Visual Effects But With Glow Sticks

Intro

Well, I’m back into the swing of things. Before jumping back into my science articles, I wanted to discuss a branch of visual effects that needs more community love and support: EDM. It’s also an excellent avenue for CGI artists to work in if you ever get the chance.

How is EDM a branch of visual effects and computer animation? Well… we’ll dive into that soon. But first I need to thank the rave team down at Promise Cherry Beach in Toronto who, it turns out, are just a bunch of retired visual effects people, and others who have worked with Derivative Software, projecting lights and sounds into the trees and the beach every Sunday and Monday of the Toronto summer. Thank you so much for sliding me into the DJ booths and teaching me how you operated the lasers. That was awesome.

This article focuses on the art and visuals created for EDM rather than the music and by no means covers the entire industry. It is also for those who like EDM but would like to explore it for “research purposes.”

Let’s get started.

History Of EDM

The history of EDM, or Electronic Dance Music, goes farther back than you might think. Electronic music can be characterized as a genre produced using electronic, electromechanical, and digital instruments. Those instruments can include electronic oscillators, the theremin, synthesizers, electronic pianos, electric guitars, vocals, or whatever the artist feels like bringing on stage. Electronic music can be made with an extensive variety of resources, from electronic oscillators to computer installations, software, and microprocessors. These sounds are recorded and edited on tape, transformed, and then played back and reproduced using loudspeakers and other equipment. EDM is one of the most popular musical genres on the globe.

Electronic Dance Music is a compilation of electronic music subgenres. Some of these are as follows:

Ambient: https://www.youtube.com/watch?v=mehLx_Fjv_c
------------------------------------------------------------------------------------------

-- Ambient music focuses on atmosphere and tonal textures rather than rhythm and structure. It surfaced in the 1970s when synthesizers were used to create experimental music. The term “ambient music” was reportedly coined by English producer Brian Eno.

Big room: https://www.youtube.com/watch?v=zlCL9z-cDy8
--------------------------------------------------------------------------------------

-- Big room (or big room house) is one of the newer genres. Initially a subgenre of electro house, big room rose to prominence in the early 2010s through acts such as Swedish House Mafia and Martin Garrix. It has minimal melodies and electro-house-style drops.

Chill-out: https://www.youtube.com/watch?v=pUjE9H8QlA4
---------------------------------------------------------------------------------------

-- Chill-out can be characterized by slow rhythms, relaxed moods, and sometimes even jazzy or classical influences.

Deep house: https://www.youtube.com/watch?v=SM-BT9cijI4
----------------------------------------------------------------------------------------

-- What classifies as deep house is up for debate. It is influenced by the house tracks of ’80s artists like Marshall Jefferson and Larry Heard. It is a style of house music with a lower tempo (BPM) and a more soulful feel, through gospel-like vocals or jazzy vibes, compared to “regular” house. However, it can also mean the commercialized form of house music of recent times. This ‘deep house,’ often upbeat, radio-friendly, and very summer-like, leans more toward pop-dance or electro-pop due to its frequent application in pop music.

Disco: https://www.youtube.com/watch?v=6dYWe1c3OyU
------------------------------------------------------------------------------------

-- Disco is where music styles such as rhythm and blues, funk, soul, and pop collide. It was popular in the 1970s and 1980s and was one of the first genres of electronic dance music to gain a following. Disco largely featured more acoustic instruments than electronically produced elements, but it was also one of the main influences for house music.

Drum and Bass: https://www.youtube.com/watch?v=rkjNL4dX-U4
------------------------------------------------------------------------------------------------

-- Commonly abbreviated to DnB, it originated in the U.K. in the 1990s. It is built on fast-paced breakbeats in non-standard rhythms and can use raw, heavy bass rips to achieve an aggressive sound. By contrast, liquid drum and bass (a subgenre) is gentler and more melodic.

Dubstep: https://www.youtube.com/watch?v=WSeNSzJ2-Jw
--------------------------------------------------------------------------------------------

-- Often confused with drum and bass, dubstep is less fast-paced and generally not break-beat-like. It features unconventional rhythms and is widely known for a specific sound element: the wobble bass (the 'wub').

Electro House: https://www.traxsource.com/track/11760407/blade-extended-mix
-----------------------------------------------------------------------------------------------------------------------

-- Electro house surfaced when electro, originally a fusion of funk, early hip-hop, and New York boogie, met the house scene of the late 1990s. Electro house is defined by raw, prominent basslines and powerful kick drums, making it a pumped-up version of ‘regular’ house music.

Hardcore: https://www.youtube.com/watch?v=leLiXABWg68
------------------------------------------------------------------------------------------

-- Hardcore, once known as hardcore techno, originated in the Netherlands in the 1990s and is arguably the hardest style in this list. It is incredibly fast-paced (160 to 200 beats per minute) and often described as violent by those who favor more delicate music. The true signature of hardcore lies in the kick drums, which are unrivaled in both intensity and the application of distortion effects.

Hardstyle: https://www.youtube.com/watch?v=XsG_zDV83mo
-------------------------------------------------------------------------------------------

-- Hardstyle is one of the harder styles. It is more melodically oriented than hardcore, and certainly not as noise-heavy. It is somewhere in between hard techno and hardcore, but it also draws from trance music.

House: https://www.youtube.com/watch?v=NUVCQXMUVnI
---------------------------------------------------------------------------------------

-- House music is the genre that began it all. It started in Chicago through artists such as Frankie Knuckles and spread to other U.S. cities such as Detroit and New York, then to Europe and the rest of the world, becoming one of the most prominent and popular genres within the dance music space.

Progressive House: https://www.youtube.com/watch?v=3C2KSaqm3so
-------------------------------------------------------------------------------------------------------

-- Progressive house is subject to the same definitional issues as deep house. There are two different ways to describe it. It is sometimes defined as the style of music with the sound of artists such as Sasha and John Digweed, Eric Prydz, and Deadmau5, which also draws from the sound of early trance tracks. The other form of progressive house features radio-friendly vocals, simple song structures, and easy-to-listen-to melodies, sharing traits with big room. This form of progressive house was the first to cross over into the pop realm. Some of the first examples are tracks like ‘Clarity’ by Zedd feat. Foxes, ‘Take Over Control’ by Afrojack feat. Eva Simons, and Avicii’s ‘Wake Me Up’.

Techno: https://www.youtube.com/watch?v=oC-GflRB0y4
------------------------------------------------------------------------------------

-- The foundations of the genre were largely created by Juan Atkins, Derrick May, and Kevin Saunderson, jointly known as the Belleville Three. With emphasis on atmosphere and an often extended build-up of the track, techno music has a rawer, less polished sound set than house music.

Trance: https://www.youtube.com/watch?v=y6120QOlsfU
----------------------------------------------------------------------------------

-- Trance music is all about the atmospheres, epic breakdowns, and melodies. It started as a minor subgenre within the house spectrum and shared traits with house, techno, new age, and synthesizer pop. The first tracks classified as trance emerged in the late 1980s, although the early 1990s is when the genre began to flourish. Artists such as Armin van Buuren, Chicane, DJ Tiësto, and Ferry Corsten helped trance music grow into one of the most popular genres within the music realm.

//////////////////////////////////////////
///////////////////////

Richard James Burgess was the man who coined the phrase “EDM”, or electronic dance music, in the 1980s. Coining this phrase placed him at the front of the electronic music revolution. Born in London and raised in New Zealand, Burgess was drawn to percussion at an early age. By the time he was 15, Burgess was making enough money as a musician to pursue a full-time career. He would travel the globe and be attached to prominent record labels such as RCA and Capitol.

Burgess was different from other artists at the time as he mainly used synthesizers in his work. This began with the EMS Synthi A, an analog synthesizer and matrix pad built into a suitcase.

He later got his hands on devices like the ARP 2600 and the Roland MC-8 and invented his own instrument, the prototype of the SDS-5 (described as “a piece of wood with wires and components on it”). He then formed a band known as Landscape with Christopher Heaton, Andy Pask, Peter Thoms, and John L. Walters. The band was known for its early, synth-wave-era embrace of computer programming.

Walters played the Lyricon, the world’s first electronic wind controller, while Heaton, Pask, and Thoms developed and mastered synthesizers that manipulated their acoustic instruments. “Avant-garde electronic stuff, 100% improvised,” Burgess would affirm. One of their tracks, “Einstein a Go-Go,” was the first computer-driven song to become a top-five hit in the U.K.

------------------------------------------------------
----------------------------------------

In 1980, Landscape’s record “European Man” carried the now-famous phrase: “EDM; computer programmed to perfection for your listening pleasure.” The name for the genre was born.

Burgess may have been at the forefront of making the genre popular, but the techniques and musical styles it grew from originated long before his time in the music scene.

A turning point for the overall music industry was the invention of the phonograph by Thomas Alva Edison and Emile Berliner around the 1870s and 1880s. Phonographs were the first means of recording and reproducing sound, marking the beginning of the recording industry.

The genre originated at the turn of the 19th and 20th centuries. Emerging electronics allowed for experimentation with sounds and electronic devices. Several electronic instruments were developed, including the Telharmonium, an electrical organ developed in 1896.

From the late 1920s, the increased practicality of electronic instruments influenced composers such as Joseph Schillinger and Maria Schuppel. The instruments were typically used within orchestras, and most composers wrote parts for the theremin that could otherwise be performed with string instruments. The Hammond organ and the Trautonium would also be developed, as would the Ondes Martenot, an early electronic instrument played with a keyboard or a ring sliding along a wire. These early innovations were mostly used for demonstrations and public performances, as they were still too complex and impractical for everyday music-making.

The introduction of electrical recording in 1925 was followed by experimentation with record players. Paul Hindemith and Ernst Toch composed several pieces in 1930 by layering recordings of instruments and vocals at adjusted speeds. In 1935, the first practical audio tape was invented. Influenced by these techniques, John Cage composed Imaginary Landscape No. 1 in 1939 by adjusting the speeds of recorded tones.
----------------------------------------------
------------------

In 1948, musique concrète, a type of music composition, was invented in Paris, France, by the French composers Pierre Schaeffer and Pierre Henry in the Studio d'Essai at Radiodiffusion Française (RDF). The musique concrète technique created tape collages or montages of recorded sounds. All the sounds produced by a person and their environment were considered materials taken from ‘concrete’ means and situations. Musique concrète was therefore opposed to the use of oscillators, which were considered ‘artificial’ and ‘anti-humanistic’, but it still used electric devices to record its material.

The work in electronic music taking place in America during this time resulted from two projects undertaken by three composers: John Cage, and Columbia University's Otto Luening and Vladimir Ussachevsky. Between 1942 and 1958, Cage completed Williams Mix (1952) and Fontana Mix (1958) and formed The Music for Magnetic Tape Project along with other composers and members of the New York School. The emphasis of the project was on experimenting with the recording of electronic and natural sounds and combining them with instrumental music, dance, and visual arts.

The goal of the project led by Luening and Ussachevsky was to create a professional tape studio that would demonstrate the musical possibilities of tape as a medium. Joined by Milton Babbitt, the two composers established the Columbia-Princeton Electronic Music Center, known today as the Computer Music Center or CMC, in 1959. It is the oldest center of electronic and computer music research in the USA. Soon after, the Experimental Music Studio at the University of Illinois (1958) and the University of Toronto Electronic Music Studio (1959) were also established.

Karlheinz Stockhausen, who worked in Schaeffer's studio in 1952, had a different idea of how sounds and music could be transformed and altered. Shortly after joining Schaeffer, he left and joined WDR Cologne's Studio for Electronic Music, established by Herbert Eimert. His focus was on electronic sound modification rather than tape manipulation. What he wanted to achieve through sound alteration, filtering, and modulation was electroacoustic composition. This marked the birth of elektronische Musik, the German branch of electronic music.

Concerts and festivals featuring electronic instruments started to appear in the early 1900s. These mostly featured electronic sounds used in experimental music, particularly tape and electroacoustic works using the instruments above. Live electronic music expanded further in the 1950s, alongside the use of the electric bass and electric guitar.

--------------------------------
--------------

Japanese electronic musical instruments also started having a strong influence on the international music industry. Various Japanese manufacturers, such as Ace Tone, Korg, Matsushita, Roland, and Yamaha, developed their own electronic music devices in the postwar decades. These included percussion instruments, early drum machines, electric organs, and synthesizers.

The 1960s were crucial for music festivals and their evolution into mass-cultural entertainment. With the postwar economic boom and the rise of the American car industry, festivals and other cultural events became less elite and more accessible to a wider range of the middle class and even parts of the working class. Music festivals were also centered around various cultural matters popular among younger audiences, such as fashion, food, art, and literature. Through this, music festivals started to become a valuable part of the popular culture of the time.

One of the first ‘pop-culture’ music festivals to be founded was the Monterey Pop Festival, held in June 1967 in Monterey. It included stars like Jimi Hendrix, Otis Redding, Janis Joplin, the Who, and Simon & Garfunkel. The greatest musical event of the 1960s was the legendary Woodstock festival, organized on a dairy farm in upstate New York from the 15th to the 18th of August 1969, which attracted almost 500,000 people. The festival offered a total of 32 different performers, including some who had previously appeared at Monterey.

------------------------------------------
------------------

The late 1960s saw popular electronic music merge with other musical genres, especially pop and rock, which led to the establishment of new genres. Musicians of that time, such as the Beatles or the Beach Boys, started integrating electronic instruments, including the theremin or Mellotron, into their sound. Electronica was pioneered by the American duo Silver Apples and bands such as White Noise, which are known for adding oscillators and synthesizers to their psychedelic sound.

Moog synthesizers became particularly popular among progressive rock bands, including Pink Floyd, Genesis, Yes, and Emerson, Lake & Palmer. This movement also gave birth to a whole new sub-genre of progressive rock in West Germany in the late 1960s and early 1970s.

Electronic art music also branched into new-age and ambient styles, and into ambient dub, which grew out of the dub pioneered by Jamaican musicians such as King Tubby and was later adopted by international artists like Dreadzone, The Orb, and Ott.

After disco became highly popular in the 1970s, the late 1970s and 1980s saw the emergence of synth-pop, featuring the synthesizer. The sub-genre was greatly influenced by musicians, such as Ultravox, with the song ‘Hiroshima Mon Amour’ (1977), Depeche Mode, with a track called ‘Dreaming of Me’ (1980), and New Order, with their song ‘Ceremony’ (1981).

Disco music aimed to move crowds of people on the dancefloor, using drum machines and electronic instruments to create synthesized rhythms. Popular disco music that helped to create the EDM scene included Donna Summer’s 1977 hit “I Feel Love,” which was written by Pete Bellotte and Giorgio Moroder, who would later collaborate with Daft Punk, and George McCrae's 1974 hit “Rock Your Baby,” which used a drum machine and Roland rhythm machine.

------------------------------------------------------
----------------------------

Massive music festivals were developing in the UK, too. Some of the most renowned pop festivals in the UK were the Isle of Wight Festival (1967-present), Glastonbury Festival (1970-present), and Reading Festival (1971-present). The 1970 edition of the Isle of Wight exceeded the attendance at Woodstock, attracting more than 700,000 participants. International festivals were also established in the 1970s, including the Canadian Strawberry Fields Festival (1970), the Sunbury Pop Festival in Australia (1972-1975), and Festival Rock Y Ruedas De Avándaro in Mexico (1971).

Synth music, or synthpop, developed alongside house music and electro during the “post-disco” era of the ’80s. Music of this era began to be produced in the mainstream music industry in Europe as EDM became more sophisticated with ever-changing technological advancements. Drum machines and synthesizers were used more heavily after the 1970s. Paired with the use of computers, electronic music took off as an accessible art form that could be replicated and transformed by anyone with musical experience and a computer. Music produced during this time included hits like A-ha’s “Take On Me” and the song that would come to be recognized as one of the first house records.

----------------------------------------------
----------------

Synth-pop continued through the 1980s, moving closer to dance music, with acts including Pet Shop Boys, Erasure, and The Communards. In 1987, British DJ Danny Rampling started organizing a weekly party called Shoom in one of London's fitness clubs. Soon, such parties, taking place illegally, spread to other European countries, including Germany. In Frankfurt, around this time, a sub-genre, trance, was born.

Artists such as Lime and Men Without Hats from Canada, Propaganda, Sandra, and Modern Talking from Germany, Yello from Switzerland, and Telex from Belgium became popular. The sound of synth-pop also became the defining feature of Italo-disco. Tracks such as Van Halen's ‘Jump’ (1983) and Europe's ‘The Final Countdown’ (1986) became instant hits.

The acid house, dance, and techno scenes became established in Germany and the UK, spread to the rest of Europe, and fed back and forth across the Atlantic with America's Detroit techno scene. Detroit techno had many early pioneers, but one of the main ones was Juan Atkins, who in 1981 partnered with Rik Davis as Cybotron and issued the single “Alleys of Your Mind.” Shortly after releasing an album, Enter (1983), the duo split up, at which point Atkins started his label, Metroplex, and began releasing 12-inch vinyl singles under the name Model 500. The sound that emerged from the Detroit scene was an abstract instrumental funk; Kevin Saunderson often used vocalists and had his biggest hits with the soul-influenced duo Inner City. It became formalized as a style after Atkins named a track “Techno Music,” released in 1988.

The 1980s also saw the invention of the Musical Instrument Digital Interface (MIDI), a technical standard that defines a communication protocol, a digital interface, and electrical connectors linking electronic musical instruments, computer software, and other audio gear for recording, editing, and playing music. The MIDI specification was completed in 1983, and the technology made the development of electronic music much easier.
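
MIDI comes up again and again later in this article (Plastikman’s custom console, TouchDesigner’s MIDI in/out, Reza Ali’s MIDI-mappable visuals), so it’s worth seeing how simple the protocol really is. Here is a minimal Python sketch of a MIDI message under the standard, just a status byte plus two data bytes; the helper functions are my own illustration, not part of any particular library.

```python
# A raw MIDI "Note On" is just three bytes: a status byte (0x90 plus the
# channel number, 0-15) and two data bytes (note number and velocity, each 0-127).

def note_on(note: int, velocity: int, channel: int = 0) -> bytes:
    """Build a raw MIDI Note On message (hypothetical helper for illustration)."""
    assert 0 <= note <= 127 and 0 <= velocity <= 127 and 0 <= channel <= 15
    return bytes([0x90 | channel, note, velocity])

def note_off(note: int, channel: int = 0) -> bytes:
    """Build the matching Note Off message."""
    return bytes([0x80 | channel, note, 0])

# Middle C (note 60) struck fairly hard, then released.
print(note_on(60, 112).hex())  # 903c70
print(note_off(60).hex())      # 803c00
```

Every fader move and pad hit in the shows described below ultimately travels as a handful of bytes like these.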

-----------------------------------------------
------------------

By 1988, acid house had become the most prominent music genre in the UK, mostly in nightclubs in London, Sheffield, Manchester, and Birmingham. However, after conflicts with local police and crowds exceeding the capacities of the venues, ‘raves’ moved to the countryside near the big cities, taking place in various outdoor locations and disused industrial sites.

The official era of open-air raves didn’t last very long. In 1992-1993, the British government passed new laws to effectively ban them. While the new laws didn’t stop local promoters from organizing raves, the events moved back to already established clubs or spread to the European continent, where no laws prohibiting raves existed.

The rave culture of the 1990s formed out of techno, house, hardcore rave (hardstyle), dub, trance, and drum and bass. In the ’90s, EDM was still breaking into the mainstream, and it began to rise in popularity, especially in Europe. It was then that the rave scene began to grow into what it is today. Popular songs included the British group Orbital’s singles “Chime” and “Halcyon On and On.”

This music was popularized under the name “electronica” during the 1990s. Bands like Underworld, Fatboy Slim, The Prodigy, and The Chemical Brothers came from England and had a massive impact. In the U.S., Detroit techno, which had emerged in Detroit in the late ’80s, grew in popularity through the ’90s. Detroit techno was pioneered by the “Belleville Three,” including Derrick May, who produced music under the names Mayday and Rhythim Is Rhythim.

The fall of the Berlin Wall happened in 1989, just on the verge of the 1990s. The political events unfolding in Berlin, Germany, unintentionally shaped the development of above-ground electronic music festivals. Shortly before the fall of the Wall, in the summer of that year, a group of 150 people, organized by the Berlin underground, took part in a political demonstration in the form of a musical parade held on the Kurfürstendamm. Officially billed as a political march, the parade was the beginning of Love Parade, a massive electronic dance music festival and techno parade. The following year, after the fall of the Wall, the Love Parade took place again, this time attracting around 2,000 people.

----------------------------------------------------
--------------------

By 1997, the Love Parade was attended by more than a million partygoers. It had moved to the nearby Tiergarten park, with the parade route ending at the Siegessäule, the Victory Column. The free-access festival featured stages as well as floats with music, DJs, and dancers moving through the audience, and it was held annually in Berlin from 1989 to 2003, when funding difficulties forced its cancellation. The parade saw a short revival in 2007, when it was organized in the Ruhr region, centered around the cities of Dortmund, Essen, Duisburg, and Gelsenkirchen. The last Love Parade took place in 2010 in the city of Duisburg and was marked by a fatal crowd crush in which attendees suffocated while trying to leave the ramp leading to the festival area. The tragedy caused the deaths of 21 people, and at least 500 more were injured. The organizers decided to cancel the festival permanently. However, in 2022, an equivalent of Love Parade was revived under the name Rave The Planet.

Alongside Love Parade, Mayday (1991-present) and the Swiss festival Street Parade (1992-present) were the biggest electronic music festivals in Europe throughout the 1990s and early 2000s. City-based electronic music festivals were trendy in the 1990s and were eventually seen as lucrative sources of income and tourism. Other popular city-based festivals of that time included Time Warp in Mannheim (1994-present), Sónar in Barcelona (1994-present), I Love Techno in Ghent (1995-present), Awakenings in Rotterdam and Amsterdam (1997-present), and Hradhouse in Boskovice (1998-2010).

------------------------------------------------
------------------

In the early 2000s, the ‘urban arts festival’ model was the go-to format, and more festivals were being established, including Club Transmediale in Berlin (1999-present), EXIT in Novi Sad (2000-present), MUTEK in Montreal (2000-present), and Decibel Festival in Seattle (2003-present). Non-urban festivals were organized, too, such as Destiny/World Electronic Music Festival near Toronto (1995-2012), Kazantip in Ukraine (1992-2013), Boom in Portugal (1997-present), and Melt! in Germany (1998-present).

In the 2000s, EDM became popular in the U.S. thanks to international DJs and music producers drawing attention to the genre, as well as Madonna’s techno-inflected 1998 album Ray of Light. From there, producers like Tiësto, Daft Punk, and David Guetta rose to popularity. Dubstep was later introduced to the U.S. music industry through artists like Skrillex.

The genre during this time was strongly shaped by technological advances and the growing accessibility of technology and music software. Many innovations introduced around the new millennium, such as CDs and DVDs, the digital audio workstation (DAW) Ableton Live (2001), and the studio emulation Reason (2000), shaped the music. These tools provided less complex, more cost-effective alternatives to traditional hardware-based production studios. Ableton Live is considered one of the first music applications to automatically beat-match a song, and DJs have widely used it in shows and live performances as well as for composing, recording, and mastering records.

---------------------------------------------------
---------------

Today, EDM is often characterized by remixes and original mixes produced by DJs like The Chainsmokers, Steve Aoki, Martin Garrix, and others. EDM elements are used by a large number of pop and hip-hop artists, including Coldplay, Justin Bieber, Selena Gomez, Drake, Taylor Swift, Katy Perry, Alessia Cara, and countless others.

The new millennium was marked by the continuous growth of EDM in the US, including the dubstep sound promoted by Skrillex. Daft Punk changed the game and the global perception of electronic dance music when they launched their 2006-2007 worldwide tour at the Coachella festival near Indio, California.

Then came the global establishment of EDM festivals in the first decade of the 21st century, taking the format of outdoor mega-events. The most popular are Tomorrowland in Belgium (2005), Electric Daisy Carnival in Las Vegas, US (1991), Ultra Music Festival in Miami (1999), Airbeat One Festival in Germany (2002), and Balaton Sound in Hungary (2007).

As of 2023, the industry's value has grown in 16 countries, led by market share gains mostly in the UK and Germany. Recorded electronic music grew by up to 18%, physical format sales grew for the first time in 20 years, and artists’ and DJs’ earnings went up by 111%.

----------------------------------------------------------------------------------------------------------------------------------------------------------

Norman McLaren

This might seem like an odd transition from EDM to talking about the people at the National Film Board of Canada. However, EDM and the visuals it is known for weren’t created together. Instead, “EDM visuals,” or electronic music visuals, grew out of multiple experiments by multiple people. One of those people who experimented with audio and visuals was Norman McLaren.

For those who have never heard of him, McLaren is a national hero. There are few heroes in VFX or animation, but McLaren is an exception to that rule because he helped create foundational techniques that countless animation and art industries still use, shaped Canadian culture and Canada’s global image, and tirelessly promoted love, anti-war sentiments, inclusivity, and peace throughout his lifetime. He pioneered techniques in keyframe animation, pixilation, drawn-on-film animation, visual music, abstract film, and graphical sound. When he died of a heart attack in 1987, 400 people from the NFB showed up to honor him, and every newspaper across Canada reported his loss. His boyfriend, producer, and long-time partner, Guy Glover, would die of a broken heart just over a year later. He was and still is the most honored animator in Canadian history.

Born in Scotland in 1914, McLaren studied interior design at the Glasgow School of Art from 1932 to 1936. McLaren started making animations as a teenager in Scotland; he would lie back with his eyes closed, listen to music on the radio, and watch dancing shapes projected by his mind. He started making films, always with “a musical script” instead of dialogue, attempting a “visual translation of the music.” During his time at art school, he became interested in motion pictures and experimental film, leading him to set up a production group for himself and his fellow students. As he couldn’t afford a camera, he instead washed the emulsion off a cinema’s discarded 35mm reel and painted directly onto each frame.

---------------------------------------------------------------------
------------------------------------------

The visuals of Colour Cocktail, his first amateur silent short, which led to his discovery by documentarist John Grierson, were so well matched to a gramophone record that people were convinced the sound and image had been pre-synced. Grierson went on to found the National Film Board and hire Norman McLaren because of the extreme talent he saw.

In his early 20s, while working at Grierson’s British GPO Film Unit, McLaren noticed that knife marks on a film’s soundtrack played back as interesting sounds. McLaren would then scratch-compose a soundtrack for a film, but his producer rejected it at the time.

A few years later, after emigrating to New York, he animated both the images and sounds of his vibrant handmade shorts for the Guggenheim, including Dots (1940) and Loops (1940), to avoid paying for music rights.

McLaren would then relocate to the National Film Board of Canada, where he spent most of his career. During his first years there, he founded Studio A and oversaw a series of proto-music videos set to folk songs. Later, he would work with musicians such as Pete Seeger, Oscar Peterson, Ravi Shankar, and Glenn Gould, but he told friends of his plans “eventually to be able to compose my music… I’ve had no formal training – but I know I could compose.”

He would continue to use physical film itself to create music. McLaren made animated, electronic, optical-graphical music, like “a small orchestra of clicking, thudding, buzzing and drumlike timbres.” His work was and still is often classified as ‘synthetic’ music, as it didn’t imitate instruments the way a keyboard synthesizer does. Other filmmakers, including Claude Jutra and the Whitney brothers, would also try their hand at animated sound, but McLaren developed it further than anyone else.

He would mark a filmstrip’s soundtrack with pencils, pens, brushes, razor blades, and other tools, often working freehand. McLaren controlled tone quality through the shape of the marks, volume through their size, and pitch through the spacing of his slashes on the film strip. One result was the Morse-code-like percussion of Mosaic (1965).
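
To make that mapping concrete, here is a rough Python sketch, my own illustration rather than McLaren’s actual process, that “draws” identical marks onto a simulated soundtrack: the spacing between marks sets the pitch, their size sets the volume, and their shape colors the timbre.

```python
import numpy as np

SAMPLE_RATE = 44100  # stand-in for how fast the "optical soundtrack" is read

def drawn_tone(spacing, mark_width, amplitude, seconds=1.0):
    """Simulate a strip with identical marks drawn every `spacing` samples.

    spacing    -> pitch  (closer marks = more marks per second = higher note)
    amplitude  -> volume (bigger marks = louder)
    mark_width -> timbre (wider, softer marks = rounder, less buzzy tone)
    """
    n = int(SAMPLE_RATE * seconds)
    strip = np.zeros(n)
    strip[::spacing] = amplitude          # place a "mark" every `spacing` samples
    shape = np.hanning(mark_width)        # the profile of each mark
    return np.convolve(strip, shape, mode="same")

low_note  = drawn_tone(spacing=200, mark_width=40, amplitude=0.8)  # ~220 Hz
high_note = drawn_tone(spacing=100, mark_width=40, amplitude=0.4)  # ~441 Hz, quieter
```

Halving the spacing doubles the number of marks per second and therefore doubles the frequency, which is exactly the relationship the Pen Point Percussion narrator describes further down.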

At the Film Board, McLaren was inspired by musicians such as Maurice Blackburn and Louis Applebaum. McLaren was invited to join the National Composers’ Association and lecture at Juilliard and the Acoustical Society of America. John Cage, who helped found some of the first electronic music experiments in the U.S.A., would play McLaren’s music at his New York concerts.

--------------------------------------------------------------------------------
------------------------------

With filmmaker Evelyn Lambart, McLaren developed a deck of pitch cards, a ‘keyboard’ that allowed the exact musical pitches of a piano to be photographed onto a soundtrack. This card-animated music, along with the blips and whirrs of his hand-etched scores, resembled synthesizer, 8-bit, or digital music decades in advance. He first used his cards for Now Is the Time (1951), a 3D film with a parallel multidimensional audio system of independent speakers that wrapped around the audience. He used the same cards for Neighbours (1952), his most famous film.

One of McLaren’s other films was Blinkity Blank, which employed many techniques of shapes drawn directly onto film. It showed everyday household objects like fruit and umbrellas alongside a rough narrative involving a bird attempting to escape its cage. Blinkity Blank made great use of sound, combining musical sequences and sound effects McLaren created in the film to convey its message. McLaren described how one note of his “animated sound” could require 50 individual lines on his 35mm film, with its pitch and intensity adjustable through the shape and size of the strokes.

Synchromy (1971), one of the last of McLaren’s many films, is best described as a moving kaleidoscope on screen whose images correspond exactly to the soundtrack. McLaren took the portion of the filmstrip that contained the audio and colored, multiplied, and reorganized it so that the picture looks synchronized to the sound it is generating. The BBC documentary The Eye Hears, the Ear Sees (1971) filmed him writing the ‘perpetual motion’ tune on a piano for Synchromy, photographing pitch cards to match it, and finally transforming the soundtrack into visuals, explaining that “this man is doing something that no one in the world has ever done before: he’s writing a film on the piano.”

A National Film Board short entitled Pen Point Percussion documented Norman McLaren’s work as an introduction to hand-drawn sound. It covered the fundamental process of how sound is captured on film, providing an excellent glimpse into sound processes for film.

-----------------------------------------------------------------
---------------------------------------

As the Narrator states:

“Well, if a sound makes a pattern on a film, a pattern on a film will make a sound. You can even create your own sounds by drawing directly on the film. Norman McLaren, the motion picture artist, has been making some hand-drawn sounds, and now he's going to hear them for the first time, played on a Moviola, a sort of miniature movie projector. It's not just guesswork; the sound of any of these patterns of lines can be calculated. The size controls the loudness…”

“That's simple enough, and then there's the tone quality. It's controlled by the shape of the marks. This series of thin, straight lines gives a sharp, rather unpleasant sound, but these round dots are a bit smoother. The marks can be any shape you like. Now, how about this? Using the brush, Norman tries to make a row of small triangles with these sharp angular forms. What would they sound like? The distance the lines are apart controls pitch: lines far apart give a low note, closer together a medium note, and closer still a high note.”

“Now, what can they be used for, these hand-drawn sounds? Norman McLaren finds they're a perfectly natural accompaniment to some of his hand-drawn motion pictures. Each movement on the screen can have its own specially designed sound. Making movies this way, the artist has direct personal control at every stage of the film's production. Sound and picture are planned and closely related as they are drawn. Now Norman is checking to ensure that each bit of sound is perfectly matched to its accompanying screen action. Finally, the picture and the sound will be printed together onto one length of film, and color will be added during the process.”

Even though Norman’s process runs opposite to how we sync graphics to sound now, it is still worth learning.

Video Here: https://www.youtube.com/watch?v=TxZe4hL73m8

---------------------------------------------------------------------------------------------------------------------------------------------------------

The Toronto Connection

TouchDesigner

Let’s talk about one of the most used software packages in the EDM world: TouchDesigner.

TouchDesigner is software created by Derivative, a company in Toronto, Canada. It generates visuals procedurally in a node-based environment and has real-time capabilities. Derivative was founded in 2000 by Greg Hermanovic (Founder, President, and CTO), Rob Bairos (Head of R&D), and Jarrett Smith (Product Architect).

If you are familiar with the history of SideFX Houdini, you might have heard of TouchDesigner once or twice. Like Houdini, TouchDesigner descends from a software package called PRISMS, developed at Omnibus Computer Graphics in Toronto, Los Angeles, and New York from 1984 to 1987. When Omnibus collapsed and went bankrupt, PRISMS was purchased from the Omnibus liquidators by Kim Davidson and Greg Hermanovic, who formed a company called Side Effects. It was pure luck that they acquired the software and its accompanying SGI computer. Greg Hermanovic served as CEO and Director of Strategic Technology of Side Effects until 2000. Hermanovic’s background was different from the traditional route into CGI: before working on CGI software, he spent six years in aerospace, developing simulations for the Canadarm and various flight simulators.

Side Effects developed and licensed PRISMS till 1998, when Houdini became the primary package that clients started to swarm around. PRISMS was used in over 200 feature films and won its first Academy Award in 1998. Houdini’s first release was in 1995. In 2003, Side Effects Software received its second Academy Award for the innovations within Houdini, and the Software has been used in well over a thousand productions and countless video games.

------------------------------------------------------------
-----------------------------------

In fxguide’s Side Effects Software – 25 Years Retrospective, written by Mike Seymour in 2012, it’s clear that PRISMS received much love. “From the earliest days, Side Effects threw good parties. As one original user, Harry Magiros (who went on to be an Australian reseller for Side Effects), recalls, “Greg would throw these amazing parties with interactive stuff happening – all live – which back then was unheard of, today of course you can do it with a PS3. But it was such a great family atmosphere at the first user groups. We all had Greg and Kim’s home phone numbers, and they never minded us calling in the middle of the night if we had problems. While the other companies were spending a fortune on marketing, [Side Effects] got to know everyone personally, which was a great move then.”

Greg Hermanovic left Side Effects to form Derivative in 2000, working from the most recently released version of Houdini, 4.1. Derivative’s first product, TouchDesigner, spanned versions 007 through 017, released between late 2001 and the end of 2004. In 2008, Derivative released its next-gen TouchDesigner package, 077, in beta form, a rewrite of the software incorporating OpenGL workflows, compositing and effects, a new user interface, and other additional features.

Like SideFx’s Houdini, TouchDesigner is based on procedural workflows and has Operators. Operators are the building blocks of a TouchDesigner project. These objects are represented as user interface nodes and connected to create procedural effects and animation. TouchDesigner was also built to implement visual programming languages for easier user interaction and lightweight scripting.

TouchDesigner’s operator families also share several names with Houdini’s. These include COMPs, Components representing 3D objects and other operators; TOPs, Texture Operators for 2D images and animations; and CHOPs, Channel Operators for motion, audio, animation, and control signals. Then there are SOPs, Surface Operators for native 3D objects and points; MATs, nodes for applying materials; and DATs, Data Operators for plain text, scripts, XML, and tables.
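
To make the operator idea concrete, here is a small conceptual sketch in Python, not the actual TouchDesigner API, of the same pattern: a CHOP-like step reduces audio to a control channel, which then drives a parameter of a TOP-like step that renders an image. The names (Chop, audio_level, circle_top) are placeholders of my own.

```python
import numpy as np

class Chop:
    """Conceptual CHOP: a named set of channels, each an array of samples."""
    def __init__(self, **channels):
        self.channels = {k: np.asarray(v, dtype=float) for k, v in channels.items()}

def audio_level(samples, window=512):
    """CHOP-style operator: reduce raw audio to a low-rate 'level' channel (RMS)."""
    rms = [np.sqrt(np.mean(samples[i:i + window] ** 2))
           for i in range(0, len(samples), window)]
    return Chop(level=rms)

def circle_top(size, radius):
    """TOP-style operator: render a filled circle into a greyscale image."""
    y, x = np.mgrid[-1:1:size * 1j, -1:1:size * 1j]
    return (x ** 2 + y ** 2 <= radius ** 2).astype(float)

# Wire them together the way you would wire nodes: audio -> CHOP -> TOP parameter.
audio = np.random.uniform(-1, 1, 48000)          # stand-in for a second of live input
level = audio_level(audio).channels["level"]
frames = [circle_top(size=128, radius=0.2 + 0.8 * v) for v in level[:30]]
```

In the real package the wiring is done visually by connecting nodes, but the underlying idea is the same: each operator is a small function whose output feeds the parameters of the next.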

TouchDesigner's other features included rendering and compositing workflows, scalable architecture, video, and audio in and out, multi-display support, video mapping, animation and control channels, custom control panels, and a 3D engine.

-----------------------------------------------------
-------------------------

TouchDesigner 007 launched at the end of 2001, on December 17th. Introduced as the Touch 101 product family, this included TouchDesigner 007, which cost $1,199; TouchPlayer 007, a free player of Touch artworks; and TouchMixer 007, which cost around $199 and was aimed at virtual DJs who wished to use the software. TouchMixer 007 added capabilities on top of TouchPlayer, allowing it to use MIDI in/out, record and edit mixes, and replace textures.

The first year and a half of Derivative’s existence was spent setting up the company. Much of this time went into creating a website and brand and finding employees to help. Rob Bairos started work on adapting Houdini into a real-time engine, adding inputs for MIDI controllers and essential interface-building tools for live performances, while Jarrett Smith worked on the design of new features. The other goal for the first few years was building the community and getting clients excited about the new products. The team created an artwork section for their site where artists could upload their files and tracks as presets for the software they had designed.

Some of TouchDesigner’s first work was for Canadian rock icons RUSH on their North American Vapor Trails tour in 2002, as well as a collaboration with Smirnoff Toronto and Warp Records LA. Derivative produced a unique visual synth for that event, and Touch artists “Topcat” (Greg Hermanovic), “Mordka” (Jarrett Smith), and Christian Smith ran visuals alongside techno pioneers Derrick May and Richard Devine.

Touch 008 was released on February 28, 2002, and Touch 009 on April 2, 2002. Many of the updates revolved around bug fixes and new artwork. Jarrett Smith, under his Mordka label, released the first synth with a music track and a recorded performance mix: Niosumed. A complete Niosumed walkthrough was featured in 3D World magazine. Another feature added for Touch 009 was a Touch plugin for Internet Explorer and Netscape Navigator 4, allowing you to embed Touch synths into browsers.

Touch 012 was released on October 1, 2002. Touch 010 and 011 were internal builds and were never publicly released. Alongside its release was the TouchArt CD, a collection of all the artwork from the website’s catalog for $19.95. It included its own installer and a Flash interface to navigate and launch the artwork. Touch 013 was released on January 13, 2003, followed by a major software update on April 10, 2003, in Touch 015. This included features like dual-monitor support for the first time, offscreen rendering, QuickTime movie support for playback as textures, and support for new Nvidia and ATI drivers.

-------------------------------------------------------
-------------------------

In the summer of 2004, the new PRADA flagship store opened in Aoyama, Tokyo, designed by Herzog & de Meuron. Working alongside the architects, Derivative integrated five real-time architectural projections onto the walls and ceiling of the building. This was one of the first projection mapping projects done with TouchDesigner.

At the 5th edition of MUTEK Montreal, Jarrett Smith completed and led the work for the beginning of Plastikman’s world tour. Richie Hawtin (aka Plastikman) wanted to control the entire show from his console, a custom MIDI controller dubbed ‘CTRL LIVE’ built by his dad. Because of this, Team Derivative had to “hack” Ableton Live 3 and build a bi-directional audio-visual control system between Live and Touch to get the system to work with Plastikman’s console. The show succeeded, but the prep had to overcome several hurdles.

The show's beginning had the most challenging content: a satellite image of the Detroit-Windsor area was loaded into Touch as a huge texture and then slowly zoomed in over the first track. The piece crashed frequently because of the memory requirements. On the night, everything worked smoothly until three minutes into the show, when everything went black. After minutes of struggling, it turned out a projector shutter had closed, preventing the visuals from reaching the audience.

In December 2004, Derivative released Touch 017. Some highlights in 017 were a keyframe editor, live video input, and its own movie file format, .tmv (Touch Movie file), with a converter to encode it. Touch 017 was the final version of the Touch tools in this era; after its release, there would be a long break before Derivative’s next software release.

-------------------------------------------------
---------------------

In 2014, TouchDesigner 088 (build 11600) added support for the Oculus Rift.

Derivative kept TouchDesigner up to date with all the latest tools and the flexibility it needed to survive. In June 2017, they made TouchDesigner 099 available on Windows and macOS. TouchDesigner 099 added support for the latest SDI video cards from AJA, Bluefish, and Blackmagic up to 4K and 12G, with low latency, high frame rates, and deep pixel depth. Physically based rendering was also implemented during this time, along with compositing improvements and Ableton sync support.

The options for getting video between TouchDesigner and other systems included Syphon/Spout, streaming H.264 over IP, NewTek NDI, and HAP video. TouchDesigner’s most famous component, Kantan Mapper, was also fully re-engineered for mapping video onto shapes.

Most recently, TouchDesigner 2023 was released. It brings considerable improvements in device and sensor support, timecode, and other features that help the software work in broadcast and production environments. The new improvements include new operators: a Bloom TOP for bloom effects, a GLSL COMP, an NVIDIA Upscaler TOP for integrated upscaling, and an overhauled Engine COMP. It also now includes expanded OAK-D camera support (for stereoscopic images, depth sensing, and IR night vision) as well as machine-learning tracking capabilities and the Body Track CHOP for NVIDIA RTX GPUs. And we can’t forget the lasers… 2023 also included integration for Mo-Sys StarTracker and laser devices.

------------------------------------------------------
--------------------------

In July 2023, TouchDesigner integration with Unreal Engine 5 became possible within the software. TouchDesigner release 2022.33910 works alongside the latest GitHub build of the TouchEngine-UE plugin (1.2.0) to provide this new integration of the two tools. The only downside of TouchEngine is that it does not work with Non-Commercial licenses. CHOPs in Unreal work the same way as CHOPs in TouchDesigner, meaning users can work with samples and channels, by name or index, and with CHOP objects directly within the engine.

Following TouchDesigner 2023, Derivative announced a promising new update that would add POPs for GPU-powered 3D operations: a whole new set of nodes for simulations of everything from fog to cloth and collisions. POPs in TouchDesigner are Point Operators, a set of GPU-accelerated operators that manipulate 3D data. They join Surface Operators (SOPs, which are CPU-bound), Texture Operators (TOPs), and Channel Operators (CHOPs) inside the software package.

Just as SideFX hosts HUGs (Houdini User Groups), Derivative also hosts many workshops on how to use its software successfully in the music scene.

For example, a TouchDesigner and Ableton Live workshop hosted on March 24, 2023, by London’s Music Hackspace was led by TouchDesigner artist and educator Bileam Tschepe, aka elekktronaut, with presentations from Grigory Gromov (generative AV systems), Stanislav Glazov (interactive music in TouchDesigner), and Roberto Teran (an interactive waterfall). The presentation covered CHOPs, combining TouchDesigner with other software such as Houdini, and using the VST CHOP in some examples.

Another partnership was with The Interactive & Immersive HQ on September 7th, 2021, to launch a beginner series on making expressive media with TouchDesigner. The series highlights the basics of operators and various other music functions.

------------------------------------------------------------
---------------------------

In July 2023, Derivative launched its new TouchDesigner Curriculum in partnership with SudoMagic. This educational resource started with 80+ units of materials and lessons for the community to enjoy and learn from, including videos, files, written documentation, and downloads.

TouchDesigner can also work with other digital audio workstations (DAWs) and music production software such as Bitwig Studio. This is useful for users who wish to use Bitwig as a sound, music, and audio mixing engine and then bring their creations into TouchDesigner. This can be done with TDBitwig, which includes tools such as bitwigSong for transport/arranger communication (including timeline cues), bitwigTrack for bidirectional volume/mute/solo controls and control-rate amplitude envelopes, and bitwigClip and bitwigClipSlot for bidirectional access to color, loop length, quantization, plus launching ability, playback control, and clip events with callbacks.

As of 2024, Team Derivative is still working its tech magic and maintaining active community involvement. The most recent example was a presentation in Brooklyn, New York, at a Pataphysics show in February 2024, showing off GeoSynth, an experimental audio synthesizer built in TouchDesigner (and a little bit of Houdini) that repurposes the point coordinates of 3D objects as raw audio samples. GeoSynth is a prototype instrument, a digital audio synthesizer that uses 3D objects as the source input for real-time audio synthesis. The idea is that sounds can be sculpted visually by manipulating the form of each object, creating an animation in perfect sync with the audio, but in the opposite manner to the typical audio-driven approach. It is very similar to what Norman McLaren did all those years ago, except in 3D.
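
The core trick, treating geometry as a waveform, is easy to sketch outside TouchDesigner. The Python below is my own toy illustration of the idea, not Derivative’s implementation: one coordinate axis of a point cloud becomes a single wave cycle, so reshaping the object reshapes the timbre.

```python
import numpy as np
import wave

def object_to_audio(points, sample_rate=44100, seconds=2.0):
    """Treat one coordinate axis of a point cloud as one waveform cycle, looped."""
    cycle = points[:, 1].astype(float)          # use the Y coordinates
    cycle -= cycle.mean()                       # remove DC offset
    cycle /= (np.abs(cycle).max() + 1e-9)       # normalize to [-1, 1]
    reps = int(seconds * sample_rate / len(cycle)) + 1
    return np.tile(cycle, reps)[: int(seconds * sample_rate)]

# A deformed circle of 200 points stands in for a sculpted 3D object;
# changing its shape changes the waveform, and therefore the sound.
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
points = np.stack([np.cos(t), np.sin(t) + 0.3 * np.sin(5 * t), np.zeros_like(t)], axis=1)

samples = (object_to_audio(points) * 32767).astype(np.int16)
with wave.open("geosynth_sketch.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)        # 16-bit samples
    f.setframerate(44100)
    f.writeframes(samples.tobytes())
```

Animate the object and re-sample it every frame, and you get exactly the kind of picture-perfect sync, driven from the visual side, that GeoSynth demonstrates.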

---------------------------------------------------------------------------------------------------------------------------------------------------------

Bill Buxton

For those unfamiliar with Bill Buxton's work, he is one of the pioneers of computer engineering in Toronto, and indeed Canada. Although Buxton's work covers a variety of larger subjects, human-computer interaction, the psychology of how humans and computers relate, and software engineering in general, it is well worth mentioning here. Buxton also worked as Chief Scientist for the legendary Alias|Wavefront, the company that made the one and only Alias and then the famous Maya 3D animation tools (1994-2002). Buxton now works with Microsoft.

Born in 1949, Bill Buxton conducted early research that paved the way for the trackpads and touch screens that are ubiquitous today, which is why his work is so relevant to the electronic music field. The more interactive music is, the more connections we can derive from it. His 1985 paper on the capacitive multi-touch tablet was the first in the peer-reviewed literature to discuss a multi-touch device. He holds over 20 patents on these and other techniques.

Buxton was introduced to HCI (human-computer interaction) in the early 1970s while studying music at Queen's University. He then participated in developing digital music systems at the National Research Council of Canada in Ottawa. There, he was introduced to the world of computer animation through the work of Peter Foldes, an NFB computer animator best known for Hunger (La Faim, 1974) and Metadata (1971).

Buxton also has a unique background as a composer and performer. In 1975, he arrived in Toronto at the invitation of Leslie Mezei as an informal "Artist in Residence" in the Dynamic Graphics Project (also known as DGP), which he co-directed at the University of Toronto. Buxton didn't have much of a technical background in computers at this time, but he did have a robust skill set in electronic and computer music. That experience was precisely what DGP needed. In 1978, Buxton earned his Master of Science in computer science.

Buxton has been awarded four honorary degrees, including an honorary doctorate of science from the University of Toronto in 2013. In 2024, Buxton was also presented with one of the country's highest honors, becoming an Officer of the Order of Canada. In 2011, he received the first annual GRAND Canadian Digital Media Pioneer Award, recognizing his work pioneering multi-touch systems and novel user interface designs for computer music systems. He has also been named one of the top five designers in Canada by Time Magazine and was one of the first dozen recipients of ACM SIGCHI's most prestigious honor, the Lifetime Achievement Award for contributions to the field of HCI.

Ron Baecker, Leslie Mezei, and the chair of the Department of Electrical Engineering, K.C. Smith, worked alongside Buxton to get his endeavors off the ground at the university. The Structured Sound Synthesis Project (SSSP) was born out of this. Buxton also ran the Input Research Group (IRG) at U of T. The project was based in the Computer Systems Research Institute, received funding from 1976 to 1977, and existed until 1984. During this time, SSSP built one of the world's first portable digital synthesizers and many other user interfaces for music that are now commonly used worldwide.

One of SSSP's other inventions was a 6-voice digital synthesizer whose sounds were controlled in real time via a DEC LSI-11 microcomputer. Two project engineers, Tom Duff and Rob Pike, wrote a real-time package that let it run as a slave to a PDP-11/45 minicomputer, which ran an early version of UNIX. The LSI-11 communicated with the UNIX machine through dual-port memory, using a modification to UNIX written by Bill Reeves.

This synthesizer was then demoed, played in concerts, and eventually modified to run as a stand-alone microcomputer system. This involved laying out the control panel on the screen like a spreadsheet and controlling the cells with a tablet and other graphical controllers. The synthesizer did not have MIDI controls, as MIDI had not been invented yet, and it ran on C code.

Buxton has also written several books and scientific papers, including his 2007 book Sketching User Experiences, a major work in the theory and practice of holistic design.

From 1989 to 1994 he traveled between Toronto, where he was Scientific Director of the Ontario Telepresence Project, and Palo Alto, California, where he was a consulting researcher at Xerox PARC. From 1998 to 2004, Bill was on the board of the Canadian Film Centre, and from 1998 to 1999 he chaired a panel advising the Premier of Ontario on developing long-term policy to foster innovation through the Ontario Jobs and Investment Board. In the fall of 2004, he became a part-time instructor in the Department of Industrial Design at the Ontario College of Art and Design (OCAD). He then moved on to the Knowledge Media Design Institute (KMDI) at the University of Toronto as a visiting professor.

It’s hard to condense all of Buxton’s work into a complete summary of his achievements. I highly recommend checking out his website below to explore everything he has contributed to the worlds of electronic music and computing:
https://www.billbuxton.com/

---------------------------------------------------------------------------------------------------------------------------------------------------------

Artist Spotlight

We can now start to talk about the various EDM artists of Toronto. There are too many to count, but let’s look at a few whose acts feature the incredible computer graphics packages created here in Canada. One act that left me awestruck was Deadmau5, at one of my first EDM concerts ever.

There was something breathtaking about being ushered into the dark, closed-off room of Rebel Toronto, blasted with psychedelic imagery, lasers, and rotating mouse-head visuals, and then pushed back out into the February cold at 6 a.m. It was the equivalent of seeing fireworks for the first time.

It wouldn’t be right not to give one of Ontario’s most famous EDM artists a closer look. Joel Zimmerman, better known as Deadmau5, is a six-time Grammy nominee who produces progressive house and electro house. Some of his hits include "Ghosts 'n' Stuff," "Hi Friend," and "Strobe." Arguably, the best part of his shows is the visuals and how Zimmerman consistently succeeds in elevating them to the next level.

Here is a brief breakdown of his recent shows and the artists, tools, and visuals behind them.

Reza Ali was one of the artists who provided visuals for Deadmau5’s Meowingtons Hax 2011/2012 tour. Ali created the audio-synced visuals using custom software written in C++ with openFrameworks. His application output generative visuals that acted as masks for the pre-rendered content, and those generative masks also made the content MIDI-mappable: if Deadmau5 toggled a button or moved a fader up or down, the show’s visuals reflected it. The app also had to pixel-map the visuals so they displayed correctly on Deadmau5’s LED stage setup, which consisted of two stealth LED wedges, an LED back wall, nine small cubes (with LED panels on each side), and the main cube (with large LED panels on each visible side).

Ali dubbed the application “Rezanator.” It output images built from five visual layers that could be composited together or run one at a time, with each layer generating a unique-looking mask. The first was a “blackout layer” for partially or completely hiding images, followed by a “whiteout layer” that revealed the pre-recorded visuals. These were followed by a “dynamic mask layer,” which produced simple squares and rectangles that changed in size, oscillating at different rates and offsets.
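To make the layered-mask idea concrete, here is a minimal Python/NumPy sketch, not Ali’s actual C++/openFrameworks code: the resolution, layer parameters, and the stand-in “fader” value are illustrative assumptions, and only two of the five layers are shown.

```python
# Minimal NumPy sketch of the layered-mask idea described above.
# NOT the Rezanator source; layer names, sizes, and the fake "fader"
# value are illustrative assumptions only.
import numpy as np

H, W = 192, 192          # hypothetical LED-panel resolution

def blackout_layer(amount):
    """0.0 = fully hidden, 1.0 = fully visible."""
    return np.full((H, W), amount, dtype=np.float32)

def dynamic_mask_layer(t, fader, cols=6, rows=4):
    """Grid of rectangles whose sizes oscillate at per-cell offset rates;
    `fader` (0-1) stands in for a MIDI fader scaling the oscillation."""
    mask = np.zeros((H, W), dtype=np.float32)
    ch, cw = H // rows, W // cols
    for r in range(rows):
        for c in range(cols):
            phase = 0.7 * (r * cols + c)                # per-cell offset
            size = (0.5 + 0.5 * np.sin(2.0 * t + phase)) * fader
            h = max(1, int(ch * size))
            w = max(1, int(cw * size))
            y0 = r * ch + (ch - h) // 2
            x0 = c * cw + (cw - w) // 2
            mask[y0:y0 + h, x0:x0 + w] = 1.0
    return mask

def composite(pre_rendered, t, fader, blackout=1.0):
    """Multiply the pre-rendered content by the stacked masks."""
    mask = blackout_layer(blackout) * dynamic_mask_layer(t, fader)
    return pre_rendered * mask[..., None]   # broadcast over RGB

# Usage: fake pre-rendered frame, hypothetical MIDI fader at 80%.
frame = np.random.rand(H, W, 3).astype(np.float32)
out = composite(frame, t=1.25, fader=0.8)
```

The point is simply that the generative layers never carry content themselves; they gate the pre-rendered footage, which is what makes the result respond to knobs and faders in real time.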

---------------------------------------------------------------------------------------------------------------------------------------------------------

2011 was also the year Cube v1 debuted on tour. Cube v1 was a massive, static, LED-covered cube that projected visuals into the audience while also serving as a platform and DJ booth. It had 36 tiles totaling over 2,800 individual F11 LEDs, a native resolution of 1600 x 1200, and could display virtually any color. The opening motion-graphics sequence was an old-school Super Mario-style simulation, with a deadmau5 character trying to beat Mario to the castle.

Zimmerman kept control of all the visuals and matching sounds through his interactive mouse-head helmet. The helmet came in two versions: one completely covered with LEDs and a simpler one with neon-lined edges. The neon version weighed over eleven pounds; the LED model, which contained over 1,000 individual diodes, weighed almost three times as much.

In both helmets, a front-mounted camera showed what was going on outside, since the helmet was completely solid with no eye holes. A set of color video goggles inside displayed whatever the camera saw, so all interaction with knobs and sliders had to be handled from that indirect perspective. Eight fans around the back kept Zimmerman’s head cool while he spun tracks; on the LED model, six sat at the rear and two in the neck area to keep air circulating.

The central server output signals to another box loaded with Pixel Mad, software designed for digital signage. Data then flowed through a router that broke the content into separate windows for each tile. Each track had 15 to 16 video layers of motion graphics. One person at front of house controlled the system and linked the graphics to whatever track was playing. The entire production was designed by Martin Phillips, who was also responsible for visual effects in Daft Punk productions.
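The exact signage pipeline isn’t detailed here, but the core idea of carving one master frame into a separate window per LED tile via a pixel map can be sketched in a few lines of Python; the tile layout, frame size, and function names below are hypothetical stand-ins, not the tour’s actual configuration.

```python
# Generic sketch of the "separate window per tile" idea: carve a master
# frame into per-tile crops using a pixel map. Tile layout and sizes are
# hypothetical, not the real Cube v1 mapping.
import numpy as np

# Hypothetical pixel map: tile id -> (x, y, width, height) in the master frame.
PIXEL_MAP = {
    f"tile_{r}_{c}": (c * 200, r * 200, 200, 200)
    for r in range(6) for c in range(6)          # 36 tiles, as on Cube v1
}

def split_frame(master: np.ndarray) -> dict[str, np.ndarray]:
    """Return one cropped window per tile, ready to be sent downstream."""
    windows = {}
    for tile_id, (x, y, w, h) in PIXEL_MAP.items():
        windows[tile_id] = master[y:y + h, x:x + w]
    return windows

# Usage: a 1200 x 1200 master frame split into 36 windows of 200 x 200 each.
master = np.zeros((1200, 1200, 3), dtype=np.uint8)
tiles = split_frame(master)
assert tiles["tile_0_0"].shape == (200, 200, 3)
```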

In 2017, Zimmerman unveiled Cube 2.1, which traveled with him on the tour for his sixth studio album. The show's visuals were created with Maxon Cinema 4D and built with the help of over 40 people and Tait Towers, an entertainment engineering company. This version of the Cube was 33% larger than its predecessor and had moving parts, including a three-axis motor. The cube played multiple animation loops over 600 frames long. Some of the visuals were tied to specific songs, and the animations had their own dedicated production pipeline.

Zimmerman approaches visuals the way he approaches his music: as a “jam session.” For Cube 2.1, he compiled over 300 clips, which he and his associates refined over a few months; for each approved clip, he then created four or five variations.

In a 2017 article with LiveDesign, Zimmerman also mentioned that his software of choice is SideFX Houdini, as he prefers its procedural workflows. The skill set he developed there helped him get up to speed in Cinema 4D faster.

Projection mapping the visuals was the next step in making Cube 2.1 work. For the Deadmau5 tour, the images needed a resolution of at least 768 x 768 pixels. The visuals also could not look flat, so they had to hold up from different perspectives depending on where viewers watched the stage. Zimmerman took 3D models or scans of his stage, broke them down, and modeled the LED panels around specific crowd perspectives, then rendered his final images in C4D’s OpenGL renderer and Octane.
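As a rough illustration of that perspective problem (not Zimmerman’s actual Cinema 4D/Octane workflow), the following Python sketch pinhole-projects a hypothetical LED panel’s corners into cameras placed at different audience positions, showing how the same panel reads from different seats; the panel geometry and seat positions are invented for the example.

```python
# Rough sketch of the perspective check: project an LED panel's corners
# into a hypothetical audience-member camera to see where (and how skewed)
# that panel appears from that seat. Plain pinhole math, not the C4D setup.
import numpy as np

def look_at(cam_pos, target=np.zeros(3), up=np.array([0.0, 1.0, 0.0])):
    """Rotation matrix whose rows are the camera's right/up/forward axes."""
    fwd = target - cam_pos
    fwd = fwd / np.linalg.norm(fwd)
    right = np.cross(fwd, up)
    right = right / np.linalg.norm(right)
    true_up = np.cross(right, fwd)
    return np.stack([right, true_up, fwd])

def project(points, cam_pos, focal=1.0):
    """Pinhole-project world-space points (N, 3) into normalized image coords."""
    R = look_at(cam_pos)
    cam = (points - cam_pos) @ R.T           # world -> camera space
    x, y, z = cam[:, 0], cam[:, 1], cam[:, 2]
    return np.stack([focal * x / z, focal * y / z], axis=1)

# Hypothetical panel: a 2 m x 2 m LED face centred 4 m above the stage floor.
panel = np.array([[-1, 3, 0], [1, 3, 0], [1, 5, 0], [-1, 5, 0]], dtype=float)

# Two audience viewpoints: front of the floor vs. far off to one side.
for seat in (np.array([0.0, 1.7, 12.0]), np.array([15.0, 1.7, 20.0])):
    uv = project(panel, seat)
    print(seat, "->", np.round(uv, 3))   # how skewed/small the panel reads
```

Checking the projected footprint from several such viewpoints is one simple way to verify that content built for a 3D cube doesn’t collapse into something flat-looking from the cheap seats.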

Cube v3 was a whole different matter. Presented in 2019, it was a different endeavor and required different software skills. For his 2019 tour, Deadmau5 wrote custom code and visuals in hotel rooms using Derivative’s TouchDesigner. These visuals included images from Bosch’s “The Garden of Earthly Delights” and deadmau5’s signature “mau5head” logo, set to hits like “Faxing Berlin,” “Raise Your Weapon,” and “Ghosts N Stuff.” With Cube v3, Zimmerman could cue up visuals in real time, leaving space for spontaneity during sets, and re-time visuals on the fly whenever he felt like switching up the setup.
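Zimmerman’s TouchDesigner networks aren’t public, but the flavor of real-time cueing can be sketched generically in Python (TouchDesigner itself is scripted with Python). The CueController class, clip names, and MIDI note numbers below are purely hypothetical and assume nothing about his actual setup.

```python
# Generic sketch of real-time visual cueing: a cue list mapping MIDI notes
# to clips, so a performer can fire or re-time visuals on the fly.
# All names and trigger values here are made up for illustration.
import time

class CueController:
    def __init__(self, cues):
        self.cues = cues              # MIDI note -> clip name
        self.active = None
        self.speed = 1.0              # playback-rate multiplier for re-timing

    def on_trigger(self, note):
        """Fire the clip mapped to an incoming MIDI note, if any."""
        clip = self.cues.get(note)
        if clip is not None:
            self.active = clip
            print(f"[{time.strftime('%H:%M:%S')}] cue -> {clip}")

    def on_fader(self, value):
        """Map a 0-127 MIDI fader onto a 0.25x-2x re-time of the active clip."""
        self.speed = 0.25 + (value / 127.0) * 1.75
        print(f"re-time {self.active} @ {self.speed:.2f}x")

# Usage with made-up cues:
ctrl = CueController({36: "garden_of_earthly_delights.mov",
                      37: "mau5head_logo.mov"})
ctrl.on_trigger(36)
ctrl.on_fader(96)
```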

In 2020, Zimmerman also helped produce the visuals for the earworm song of the summer, Saint JHN’s “Roses,” for its performance at the Billboard Music Awards. Zimmerman made custom 3D designs using Epic Games’ Unreal Engine.

In 2024, Deadmau5 also dipped his toes into virtual reality, teaming up with Soundscape to perform a series of exclusive virtual concerts on the platform. These performances included spatial audio, psychedelic visuals, and a 30-meter-tall Deadmau5 avatar.

Soundscape is an interactive audio platform that lets you create and mix your own audio with matching visuals based on the genre you enjoy. It now supports Unreal Engine 5 and was originally launched in 2017 as an app for the Burning Man festival.

---------------------------------------------------------------------------------------------------------------------------------------------------------
