Music has been an integral part of human life for thousands of years. From ancient civilizations to modern times, music has been enjoyed and cherished by people of all ages and backgrounds. However, with the advent of technology, music has undergone a significant transformation. The invention of music software has revolutionized the way we create, produce, and consume music. In this article, we will explore the history of music software and its impact on the music industry. Join us as we take a journey through the evolution of digital music technology and discover when music software was first invented.
The Emergence of Electronic Music and Early Music Software
The Beginnings of Electronic Music
The origins of electronic music can be traced back to the early 20th century, with the invention of one of the first electronic musical instruments, the theremin, around 1920. However, it was not until the 1950s and 1960s that electronic music began to gain widespread recognition, with the development of new technologies and the establishment of electronic music studios.
One of the pioneers of electronic music was the German composer Karlheinz Stockhausen, who in the 1950s began experimenting with electronic sound generation and manipulation at the WDR studio in Cologne, working with sine-wave generators and tape techniques rather than conventional instruments. Stockhausen’s works, such as “Gesang der Jünglinge” (1955-56), showcased the potential of electronic music to create new and unique sounds that were not possible with traditional acoustic instruments.
Another significant development in the early history of electronic music was the establishment of the first electronic music studios, such as the Columbia-Princeton Electronic Music Center in the United States and the Studio für Elektronische Musik in West Germany. These studios provided composers and musicians with access to the latest technology and equipment, enabling them to create and record electronic music compositions.
In the 1960s, the availability of affordable tape recorders and the rise of popular music led to the emergence of electronic music in popular culture. Electronic instruments such as the Moog synthesizer, introduced in the mid-1960s, became popular among musicians and composers, and electronic sounds began to appear on popular music recordings.
Overall, the beginnings of electronic music can be seen as a response to the limitations of traditional acoustic instruments and the desire to explore new sounds and musical possibilities. The development of technology and the establishment of electronic music studios provided the infrastructure for electronic music to flourish, paving the way for the evolution of digital music technology.
The First Electronic Musical Instruments
In the early 20th century, a group of composers and inventors began experimenting with electronic music, using machines and devices to create sounds that were previously impossible to produce with traditional instruments. Some of the first practical electronic musical instruments appeared in the 1920s and 1930s and were used primarily in research, academic, and avant-garde concert settings. These early instruments included the Ondes Martenot, the Theremin, and the Trautonium.
The Ondes Martenot, introduced by French inventor Maurice Martenot in 1928, was one of the first electronic musical instruments to enter the concert repertoire. It produced its sound from a vacuum-tube oscillator, controlled either from a keyboard or by sliding a ring along a wire, with a drawer of controls for shaping volume and timbre. The result was an expressive, voice-like instrument capable of smooth glissandi and subtly evolving textures.
The Theremin, developed by Russian inventor Leon Theremin around 1920, was another early electronic musical instrument. It was a pioneering device played without any physical contact: the performer moves their hands near two metal antennae, one controlling pitch and the other volume. This gesture-based control anticipated ideas that would later reappear in many electronic instruments and controllers.
The Trautonium, developed by German engineer Friedrich Trautwein around 1930, was another important early electronic instrument. It was played by pressing a resistive wire against a metal rail, which allowed continuous control of pitch, and it became known for its unusual subharmonic timbres, later explored extensively by the performer and composer Oskar Sala.
These early electronic musical instruments paved the way for the development of digital music technology, and inspired a new generation of composers and inventors to explore the possibilities of electronic sound. They demonstrated that it was possible to create a wide range of sounds using electronic means, and opened up new avenues for musical expression and experimentation.
The First Music Software Programs
In the early days of computing, the first music software programs were created as a means of experimentation with electronic music. These programs were designed to emulate the sounds of traditional instruments, as well as create new and unique sounds that were not possible with acoustic instruments.
One of the earliest music software programs was MUSIC (later known as MUSIC I), written in 1957 by Max Mathews at Bell Labs. The program allowed a user to specify notes that the computer then rendered as sound, marking the beginning of computer-generated music.
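To make the idea concrete, here is a minimal Python sketch of what “render a list of notes as sound” means in practice. It is purely illustrative: the note format and function name are ours, not the syntax of the original MUSIC program.

```python
import math
import struct
import wave

SAMPLE_RATE = 44100

def render_notes(notes, filename="melody.wav"):
    """Render a list of (frequency_hz, duration_s) pairs to a mono 16-bit WAV file."""
    samples = []
    for freq, dur in notes:
        n = int(SAMPLE_RATE * dur)
        for i in range(n):
            # Simple sine oscillator, scaled into the 16-bit sample range.
            samples.append(int(12000 * math.sin(2 * math.pi * freq * i / SAMPLE_RATE)))
    with wave.open(filename, "w") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(struct.pack("<" + "h" * len(samples), *samples))

# A short C major arpeggio: C4, E4, G4, C5.
render_notes([(261.63, 0.5), (329.63, 0.5), (392.00, 0.5), (523.25, 1.0)])
```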
Another significant early system was GROOVE, developed around 1970 by Max Mathews and F. Richard Moore at Bell Labs. GROOVE used a computer to control analog synthesis hardware in real time, pointing the way toward interactive, performance-oriented music software.
Other notable early efforts grew out of the MUSIC family of programs, culminating in Barry Vercoe’s Csound in the mid-1980s, which let composers describe instruments and scores as text and remains in use today. On home computers, programs such as Laurie Spiegel’s Music Mouse (1986) offered accessible, algorithmically assisted composition through a simple interface.
Overall, the early music software programs were instrumental in paving the way for the development of digital music technology, and laid the foundation for the sophisticated music software programs and technologies that we see today.
The Rise of Personal Computers and Digital Audio Workstations
The Evolution of Personal Computers
The evolution of personal computers has played a significant role in the development of digital music technology. From the early days of the Apple II and Commodore 64 to the modern-day laptops and desktops, personal computers have become an essential tool for musicians and producers alike.
In the early 1980s, personal computers were primarily used for basic tasks such as word processing and data management. However, as technology advanced, the capabilities of personal computers expanded, and they became increasingly powerful tools for music production.
One of the earliest computer-based systems designed specifically for music production was the Fairlight CMI, introduced in 1979. This revolutionary machine combined digital sampling with a light-pen-driven graphical interface, allowing musicians and producers to capture and manipulate sound in ways that had not been possible before.
Throughout the 1980s, personal computers continued to evolve, and new software emerged that allowed musicians to create and record their own music. A key enabler was MIDI (Musical Instrument Digital Interface), a communication standard introduced in 1983. MIDI is not itself software but a protocol that lets instruments and computers exchange performance data, such as which note was played, when, and how hard, and it made computer-based composing, recording, and sequencing practical for the first time.
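The MIDI 1.0 specification describes these messages as short sequences of bytes. As a small illustration, the Python sketch below builds the three-byte Note On message; the helper function name is ours, not part of the standard.

```python
def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Build a 3-byte MIDI 1.0 Note On message.

    channel:  0-15  (channel 1 is 0 on the wire)
    note:     0-127 (60 = middle C)
    velocity: 0-127 (how hard the key was struck)
    """
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

print(note_on(0, 60, 100).hex())  # "903c64": Note On, channel 1, middle C
```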
As personal computers became more powerful and more affordable, music software continued to advance, and new technologies emerged that transformed the way musicians created and recorded music. One of the most significant developments was the arrival of digital audio workstations (DAWs) in the early 1990s. Programs like Pro Tools and Logic (later Logic Pro) allowed musicians and producers to record, edit, and mix audio digitally, revolutionizing the production process and paving the way for the digital music revolution.
The Development of Digital Audio Workstations
In the late 1970s and early 1980s, the first digital audio workstations (DAWs) were developed. These early DAWs were large, expensive, and limited in their capabilities. However, as personal computers became more powerful and affordable, the technology behind DAWs improved rapidly.
One of the first commercially successful systems was the Synclavier, introduced by New England Digital in the late 1970s and expanded during the 1980s with sampling and direct-to-disk recording. It was a powerful, and extremely expensive, instrument that could produce high-quality audio and was used by many prominent artists of the time.
Another significant development in the history of DAWs was the release of the MIDI (Musical Instrument Digital Interface) standard in 1983. MIDI allowed different devices to communicate with each other and enabled musicians to use a wide range of electronic instruments and equipment in their recordings.
In the 1990s, DAWs became more accessible and user-friendly, with the development of software such as Pro Tools and Logic Pro. These programs allowed musicians and producers to create and record music on their personal computers, using high-quality audio interfaces and digital recording equipment.
Today, DAWs are an essential part of the music industry, and there are many different options available for musicians and producers to choose from. Whether you’re a beginner or a professional, there is a DAW out there that will suit your needs and help you create the music you want to make.
The Impact of Personal Computers on Music Production
With the advent of personal computers in the 1980s, music production underwent a significant transformation. Prior to this, the creation of music was limited to a select few professionals who had access to expensive recording equipment. However, with the widespread availability of personal computers, even amateurs could produce high-quality music from the comfort of their homes.
One of the most significant impacts of personal computers on music production was the introduction of digital audio workstations (DAWs). These software applications allowed musicians to record, edit, and mix their tracks digitally, rather than relying on analog tape. This revolutionized the process of music production, making it possible to manipulate and fine-tune every aspect of a recording with greater precision than ever before.
Additionally, personal computers enabled musicians to explore new sounds and experiment with different production techniques that were previously impossible. With the advent of digital signal processing (DSP) algorithms, musicians could manipulate and transform sounds in real-time, creating entirely new textures and sonic landscapes. This opened up a world of possibilities for music production, leading to a wide range of new musical styles and genres.
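As one concrete example of the kind of real-time DSP effect described above, the following Python sketch implements a simple feedback delay (echo). It assumes the audio is already available as a list of floating-point samples; the parameter values are arbitrary defaults, not taken from any particular product.

```python
def feedback_delay(samples, sample_rate=44100, delay_s=0.3, feedback=0.5, mix=0.5):
    """Apply a simple feedback delay (echo) to a list of float samples in [-1, 1]."""
    delay_n = int(sample_rate * delay_s)
    buffer = [0.0] * delay_n          # circular delay line, delay_n samples long
    out = []
    for i, x in enumerate(samples):
        delayed = buffer[i % delay_n]          # sample written delay_n samples ago
        y = x + mix * delayed                  # dry signal plus delayed copy
        buffer[i % delay_n] = x + feedback * delayed  # feed energy back into the line
        out.append(y)
    return out
```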
Moreover, personal computers enabled collaboration among musicians from all over the world. With the ability to share digital files, musicians could work together remotely, exchanging ideas and creative input without the need for physical proximity. This allowed for a much more diverse and global music scene, with artists from different cultures and backgrounds coming together to create new and innovative sounds.
In conclusion, the rise of personal computers and digital audio workstations revolutionized the music industry, democratizing the production process and enabling a new generation of musicians to create and share their music with the world.
The Digital Revolution in Music: Sampling, Synthesis, and Sequencing
The Emergence of Sampling Technology
Sampling technology, which involves capturing fragments of recorded audio and reusing them, has played a significant role in the development of digital music technology. Its roots go back to the tape-based musique concrète experiments of the late 1940s and the tape manipulation of the 1950s and 1960s, when composers first began reworking pre-recorded sounds. However, it was not until the 1980s that sampling became widely accessible, with the arrival of relatively affordable samplers such as the Ensoniq Mirage and the Akai S900.
These early samplers allowed musicians to capture and manipulate sound bites from a variety of sources, including records, movies, and even live performances. The resulting soundscapes were often unpredictable and experimental, and helped to push the boundaries of what was possible in electronic music.
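One of the most basic manipulations those samplers performed was transposition by changing playback rate. The Python sketch below is a minimal, illustrative version of that idea; it assumes the recording is a list of float samples and uses simple linear interpolation.

```python
def repitch(sample, semitones):
    """Resample a recording so it plays back 'semitones' higher (or lower if negative).

    A higher playback rate raises the pitch but also shortens the sound,
    the classic 'chipmunk' effect of early hardware samplers.
    """
    ratio = 2 ** (semitones / 12.0)   # equal-tempered pitch ratio
    out = []
    pos = 0.0
    while pos < len(sample) - 1:
        i = int(pos)
        frac = pos - i
        # Linear interpolation between neighbouring source samples.
        out.append(sample[i] * (1 - frac) + sample[i + 1] * frac)
        pos += ratio
    return out
```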
Sampling technology continued to evolve throughout the 1990s, with the emergence of new hardware and software tools that made it easier for musicians to create complex, layered sounds. Today, sampling is an integral part of many music genres, from hip-hop and electronic dance music to rock and pop.
The Development of Synthesis Techniques
In the early days of digital music technology, the development of synthesis techniques played a crucial role in shaping the future of music production. These techniques allowed musicians and composers to create new sounds and textures that were previously impossible to achieve with traditional instruments and recording methods.
One of the most influential synthesis techniques was frequency modulation synthesis (FM synthesis), developed in the late 1960s by John Chowning at Stanford University. FM synthesis uses one oscillator (the modulator) to vary the frequency of another (the carrier), producing rich, complex spectra from very simple building blocks. Stanford licensed the technique to Yamaha, whose DX7 synthesizer brought FM to a mass audience in 1983.
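The core of FM synthesis fits in a few lines: the carrier’s phase is offset by a scaled copy of the modulator signal. Here is a minimal, illustrative Python sketch; the parameter values are arbitrary.

```python
import math

def fm_tone(carrier_hz, modulator_hz, index, duration_s=1.0, sample_rate=44100):
    """Generate one FM tone: y(t) = sin(2*pi*fc*t + index * sin(2*pi*fm*t))."""
    n = int(sample_rate * duration_s)
    return [
        math.sin(2 * math.pi * carrier_hz * t / sample_rate
                 + index * math.sin(2 * math.pi * modulator_hz * t / sample_rate))
        for t in range(n)
    ]

# A 2:1 carrier-to-modulator ratio gives a hollow, clarinet-like timbre.
tone = fm_tone(440.0, 220.0, index=3.0)
```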
Another important technique is additive synthesis, which builds a timbre by summing many sine-wave partials, each with its own amplitude. The underlying idea is old, dating back to tonewheel instruments such as the Telharmonium and the Hammond organ, but flexible digital implementations only became practical in the 1970s as computing power grew.
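In code, additive synthesis is little more than a weighted sum of harmonically related sine waves. A minimal sketch follows; the amplitude list is arbitrary and purely illustrative.

```python
import math

def additive_tone(fundamental_hz, partial_amps, duration_s=1.0, sample_rate=44100):
    """Sum sine partials at 1x, 2x, 3x... the fundamental, weighted by partial_amps."""
    n = int(sample_rate * duration_s)
    return [
        sum(amp * math.sin(2 * math.pi * fundamental_hz * (k + 1) * t / sample_rate)
            for k, amp in enumerate(partial_amps))
        for t in range(n)
    ]

# Odd harmonics at 1/k amplitude approximate a square-like tone.
tone = additive_tone(220.0, [1.0, 0.0, 1/3, 0.0, 1/5, 0.0, 1/7])
```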
In the 1980s, wavetable synthesis became popular. The technique stores single cycles of waveforms in a table and reads through them at a rate determined by the desired pitch, often morphing between tables to create evolving timbres. Wavetable synthesis was pioneered by Wolfgang Palm in the late 1970s and reached a wide audience in the 1980s through the PPG Wave synthesizers; it remains a staple of modern software instruments.
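A single-table oscillator shows the principle: precompute one cycle, then step through it with a phase increment proportional to the target frequency. This Python sketch is illustrative only (one fixed sawtooth table, linear interpolation); real wavetable synths hold many tables and morph between them.

```python
TABLE_SIZE = 2048
# One stored cycle of a rising sawtooth in [-1, 1].
SAW_TABLE = [2.0 * i / TABLE_SIZE - 1.0 for i in range(TABLE_SIZE)]

def wavetable_tone(freq_hz, duration_s=1.0, sample_rate=44100, table=SAW_TABLE):
    """Read the stored cycle at a phase increment proportional to the desired pitch."""
    n = int(sample_rate * duration_s)
    phase = 0.0
    step = len(table) * freq_hz / sample_rate   # table positions to advance per sample
    out = []
    for _ in range(n):
        i = int(phase)
        frac = phase - i
        nxt = (i + 1) % len(table)
        out.append(table[i] * (1 - frac) + table[nxt] * frac)  # linear interpolation
        phase = (phase + step) % len(table)
    return out

tone = wavetable_tone(110.0)
```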
In the 1990s, virtual synthesis became a popular synthesis technique. This technique involves the use of software to create and manipulate sounds. Virtual synthesizers can emulate the sound of traditional analog synthesizers, or they can create entirely new sounds that would be impossible to achieve with hardware synthesizers. The emergence of affordable digital audio workstations (DAWs) in the 1990s made virtual synthesis more accessible to musicians and composers.
Today, digital synthesis techniques continue to evolve and expand the possibilities of music production. From granular synthesis to physical modelling, there are now many different techniques available to musicians and composers looking to create new and innovative sounds.
The Rise of Sequencing Software
In the late 1970s and early 1980s, the rise of sequencing technology marked a significant turning point in the history of digital music technology. Sequencers let musicians and composers enter, record, and edit musical performances as note data, which could then be played back with a precision that was difficult to achieve by hand.
The earliest sequencing appeared in hardware rather than software: the Roland MC-8 MicroComposer of 1977, for example, let users enter pitch, duration, and timing values numerically and then drove analog synthesizers from that data. Research systems at computer-music centers worked along similar lines, storing a score as data that a machine could perform.
Software sequencers followed quickly once MIDI arrived in 1983 and affordable home computers could talk to instruments directly. Early examples such as Steinberg’s Pro 16 and Dr. T’s Keyboard Controlled Sequencer ran on the Commodore 64 in the mid-1980s, recording and playing back multiple tracks of MIDI data.
Dedicated hardware also pushed the idea forward. The Fairlight CMI (Computer Musical Instrument), introduced in 1979, combined a digital synthesizer, sampler, and sequencer in one machine; its “Page R” sequencer, added in 1982, let users arrange sampled sounds into on-screen patterns, an early glimpse of the modern DAW workflow.
By the mid-to-late 1980s, a number of influential sequencers had been developed, including Dr. T’s KCS, Steinberg’s Pro-24 and later Cubase on the Atari ST, and C-Lab’s Creator and Notator. On the hardware side, the Akai MPC (1988) combined sampling and sequencing in a single box and became a fixture of hip-hop production. Together these tools gave electronic musicians and producers new ways to build and rearrange music from recorded and programmed material.
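At its heart, a MIDI sequencer of this era was a time-ordered list of note events that a playback engine walked through. The Python sketch below illustrates that core data structure; the field names and methods are hypothetical, not taken from any particular product.

```python
from dataclasses import dataclass, field

@dataclass(order=True)
class NoteEvent:
    time_beats: float                          # start time, in beats from song start
    note: int = field(compare=False)           # MIDI note number, 60 = middle C
    velocity: int = field(compare=False)       # 0-127
    duration_beats: float = field(compare=False)

class Sequence:
    """A time-ordered list of note events, the core data structure of a MIDI sequencer."""
    def __init__(self):
        self.events: list[NoteEvent] = []

    def record(self, event: NoteEvent):
        self.events.append(event)
        self.events.sort()                     # keep playback order by start time

    def events_between(self, start_beat: float, end_beat: float):
        """Return the events a playback engine should send in this time window."""
        return [e for e in self.events if start_beat <= e.time_beats < end_beat]

seq = Sequence()
seq.record(NoteEvent(0.0, note=60, velocity=100, duration_beats=1.0))
seq.record(NoteEvent(1.0, note=64, velocity=90, duration_beats=1.0))
print(seq.events_between(0.0, 2.0))
```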
The rise of sequencing software was a turning point because it separated composing from performing: a part could be entered slowly, edited endlessly, and played back perfectly every time. The continued development of sequencers in the years that followed would play a crucial role in shaping electronic music and digital audio production.
The Impact of Music Software on the Music Industry
The Democratization of Music Production
The introduction of music software has revolutionized the music industry in several ways. One of the most significant impacts has been the democratization of music production.
Traditionally, music production was the domain of a select few professionals with expensive equipment and technical expertise. However, with the advent of digital music technology, the barriers to entry have been significantly reduced, allowing anyone with a computer and basic software to create and record their own music.
This democratization of music production has had a profound impact on the music industry. It has given rise to a new generation of independent artists who have been able to bypass traditional gatekeepers and reach audiences directly through digital platforms. It has also enabled new forms of collaboration and experimentation, as artists from different genres and backgrounds can easily share and remix each other’s work.
Moreover, the accessibility of music software has allowed for a greater diversity of voices and styles to be represented in the music industry. This has led to a proliferation of niche genres and subcultures, as well as a broader representation of cultural and regional influences in mainstream music.
However, it is important to note that while the democratization of music production has opened up new opportunities for artists, it has also led to a glut of content and increased competition for attention. As a result, many artists still struggle to gain exposure and make a living in the industry. Nonetheless, the overall impact of music software on the music industry has been largely positive, allowing for greater creativity, innovation, and accessibility in the production and distribution of music.
The Changing Role of Producers and Engineers
Music software has had a profound impact on the music industry, transforming the way music is created, produced, and distributed. One of the most significant changes has been the evolution of the role of producers and engineers.
Traditionally, producers and engineers have played a crucial role in the recording process, overseeing the technical aspects of recording, mixing, and mastering. However, with the advent of digital music technology, the role of producers and engineers has become more creative and collaborative.
Here are some ways in which the role of producers and engineers has changed:
Increased Collaboration
Music software has made it easier for producers and engineers to collaborate with artists and other creatives. Digital audio workstations (DAWs) allow for easy file sharing and real-time collaboration, enabling producers and engineers to work remotely with artists from anywhere in the world.
Greater Creative Control
With the advent of music software, producers and engineers have gained greater creative control over the recording process. Digital tools such as plugins and virtual instruments have expanded the sonic palette available to producers and engineers, enabling them to create unique sounds and textures that were previously impossible.
Faster Workflows
Music software has also accelerated the workflow of producers and engineers. DAWs allow for non-linear editing, enabling producers and engineers to work on different parts of a song simultaneously. This has significantly reduced the time required to complete a project, allowing for more efficient and cost-effective production.
More Specialized Roles
As the role of producers and engineers has evolved, so too have the specialized roles within the production process. For example, there are now specialized software applications for mastering, mixing, and processing audio, each with their own unique features and functions. This has led to a greater specialization within the field, with producers and engineers focusing on specific areas of expertise.
In conclusion, the evolution of music software has had a profound impact on the role of producers and engineers in the music industry. With increased collaboration, greater creative control, faster workflows, and more specialized roles, the role of producers and engineers has become more complex and dynamic, reflecting the ever-changing landscape of digital music technology.
The Emergence of Electronic Dance Music and Other Genres
Electronic Dance Music (EDM) has its roots in the early development of music technology. In the 1970s, pioneers such as Kraftwerk and the disco producer Giorgio Moroder built tracks around synthesizers, sequencers, and drum machines, creating some of the earliest templates for what would become EDM. These works laid the foundation for a new family of electronic genres.
The 1980s saw electronic music move toward the mainstream, helped by affordable synthesizers and drum machines such as the Roland TR-808 and TB-303. In Chicago, DJs and producers developed house music, while in Detroit, artists such as Juan Atkins, Derrick May, and Kevin Saunderson created techno, a harder, more futuristic blend of funk and machine rhythms. By the late 1980s these sounds had crossed the Atlantic and fueled the rave scene in the UK and Europe.
Dance music continued to evolve and diversify throughout the 1990s and 2000s, with the emergence of new sub-genres such as trance, drum and bass, and countless regional styles. This was due in part to the increased availability of affordable music technology, which allowed producers to create more complex and diverse sounds. The use of digital audio workstations (DAWs) and music software also allowed for greater control over the creative process, enabling producers to fine-tune their sounds and experiment with new techniques.
Today, EDM has become one of the most popular genres of music in the world, with a global market worth billions of dollars. It has also had a profound impact on other genres of music, inspiring new sounds and techniques that have been incorporated into hip-hop, pop, and rock music. The use of music software and technology has played a key role in this evolution, enabling producers to push the boundaries of what is possible in music production and create new and innovative sounds.
The Future of Music Software: Trends and Innovations
Artificial Intelligence and Machine Learning
As technology continues to advance, artificial intelligence (AI) and machine learning (ML) are increasingly being used in music software to create new and innovative tools for musicians and producers.
One of the key areas where AI and ML are being used in music software is in the creation of intelligent instruments. These instruments use machine learning algorithms to analyze a musician’s playing style and then adjust the instrument’s sound to mimic that style. This allows musicians to experiment with different sounds and styles without having to learn how to play each instrument from scratch.
Another area where AI and ML are being used in music software is in the creation of intelligent compositional tools. These tools use machine learning algorithms to analyze a musician’s compositions and then suggest new ideas and approaches based on that analysis. This can help musicians to explore new creative avenues and to develop their skills as composers.
AI and ML are also being used in music software to create personalized music recommendations. By analyzing a user’s listening history and musical preferences, these tools can suggest new songs and artists that the user is likely to enjoy. This can help musicians to discover new music and to connect with new audiences.
Overall, the use of AI and ML in music software is expanding the possibilities for musicians and producers, and is likely to continue to shape the future of music technology.
Virtual and Augmented Reality
Virtual and augmented reality technologies have the potential to revolutionize the way we experience music. Virtual reality (VR) and augmented reality (AR) both involve the use of computer-generated images and sound, but they differ in the way they present this information to the user.
Virtual Reality
Virtual reality is a fully immersive experience that transports the user to a digital environment. In a VR environment, users can move around and interact with computer-generated objects and other users. VR has already been used in the music industry for events such as concerts and festivals, allowing attendees to experience a more immersive and interactive music experience.
Augmented Reality
Augmented reality, on the other hand, is a partially immersive experience that overlays computer-generated images and sound onto the real world. AR has already been used in the music industry for things like live concerts and music videos, allowing users to interact with computer-generated objects and visuals in real-time.
Both VR and AR have the potential to be used in the music industry in a variety of ways, such as:
- Virtual music concerts and festivals
- Interactive music videos
- Virtual music studios
- Music education and training
- Music therapy
As technology continues to advance, it is likely that we will see more and more innovative uses for VR and AR in the music industry.
The Internet of Things and Smart Instruments
The Internet of Things (IoT) is a network of interconnected devices that can communicate with each other, exchange data, and perform tasks without human intervention. This technology has been gradually making its way into the music industry, and smart instruments are one of the most promising applications.
Smart instruments are musical instruments that have been equipped with sensors, microcontrollers, and other electronic components to enable them to communicate with other devices and computers. These instruments can be connected to the internet and can be used to create and record music, as well as to control other devices and software.
One of the main advantages of smart instruments is that they can be used to create more expressive and dynamic music. For example, a smart guitar can detect the pressure and angle of the player’s fingers on the strings, allowing for more precise and nuanced playing. Similarly, a smart drum kit can detect the force and angle of the player’s strikes on the drums, enabling more expressive drumming.
Smart instruments are also opening up new possibilities for music education and collaboration. For example, students can use smart instruments to practice and record their playing, and teachers can use the data collected by the instruments to provide feedback and guidance. In addition, smart instruments can be used to create virtual music ensembles, allowing musicians to collaborate and perform together even if they are located in different parts of the world.
Another promising application of IoT in music is the development of smart stages and venues. These are venues that are equipped with sensors, cameras, and other electronic devices to create an immersive and interactive experience for the audience. For example, a smart stage can detect the movements and actions of the performers and respond to them in real-time, creating a more dynamic and engaging performance.
Overall, the IoT and smart instruments are transforming the music industry, creating new opportunities for musicians, educators, and audiences alike. As these technologies continue to evolve and improve, it is likely that we will see even more innovative and exciting applications in the years to come.
The Continuing Evolution of Music Software
Music software has come a long way since its inception in the 1950s. From the first computer-music programs to today’s digital audio workstations (DAWs), the technology has evolved rapidly, and the trend is set to continue. Here are some of the key innovations we can expect to see in the future of music software.
Improved AI and Machine Learning Algorithms
One of the most exciting areas of innovation in music software is the integration of artificial intelligence (AI) and machine learning algorithms. These technologies are already being used to create new sounds and textures, and to assist with tasks such as music composition and mixing. As the technology continues to advance, we can expect to see even more sophisticated AI-powered tools that can help musicians and producers to create and refine their music.
Enhanced Virtual and Augmented Reality Experiences
Virtual and augmented reality (VR/AR) technologies are already being used in music software to create immersive audio experiences. From VR music videos to live concerts, these technologies are revolutionizing the way we experience music. As the hardware and software continue to improve, we can expect to see even more sophisticated VR/AR experiences that will transport listeners to new worlds and dimensions.
Increased Interactivity and Collaboration
Music software is becoming increasingly interactive, allowing musicians and producers to collaborate and share their work with a global audience. This trend is set to continue, with new technologies that will enable even more seamless collaboration between musicians and producers from around the world. From real-time collaboration tools to social media platforms for music creation, the possibilities are endless.
Expanded Integration with Other Technologies
Music software is also becoming increasingly integrated with other technologies, such as mobile devices and wearables. This integration is making it easier for musicians and producers to create and share their music on the go, and to access a wide range of tools and resources from their mobile devices. As the technology continues to evolve, we can expect to see even more seamless integration between music software and other technologies.
Overall, the future of music software looks bright, with many exciting innovations on the horizon. From AI and machine learning to VR/AR and collaboration, the technology is set to transform the way we create, share, and experience music.
The Exciting Possibilities for the Future of Music Technology
With the rapid advancements in technology, the future of music software is full of exciting possibilities. Here are some of the trends and innovations that are expected to shape the future of music technology:
AI-powered Music Creation
Artificial intelligence (AI) is becoming increasingly important in the music industry, and it is expected to play a significant role in the future of music software. AI-powered music creation tools can analyze and learn from existing music, and then generate new songs that are similar in style and genre. This technology has the potential to revolutionize the way music is created, and could potentially lead to the creation of entirely new musical styles.
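Real products in this space rely on large machine-learning models, but the learn-from-examples-then-generate loop can be illustrated with something far simpler. The Python sketch below trains a toy first-order Markov chain on a melody and random-walks it to produce a new one; it stands in for the idea only, not for how any commercial AI tool actually works.

```python
import random
from collections import defaultdict

def train_markov(melody):
    """Count note-to-note transitions in a melody (a list of MIDI note numbers)."""
    transitions = defaultdict(list)
    for a, b in zip(melody, melody[1:]):
        transitions[a].append(b)
    return transitions

def generate(transitions, start, length=16):
    """Random-walk the learned transitions to produce a new, stylistically similar melody."""
    out = [start]
    for _ in range(length - 1):
        choices = transitions.get(out[-1])
        if not choices:                 # dead end: fall back to the seed note
            choices = [start]
        out.append(random.choice(choices))
    return out

# Learn from a simple C major phrase, then generate a variation.
source = [60, 62, 64, 65, 67, 65, 64, 62, 60, 64, 67, 72, 67, 64, 60]
model = train_markov(source)
print(generate(model, start=60))
```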
Virtual and Augmented Reality
Virtual and augmented reality (VR/AR) technologies are already being used in the music industry, and their use is expected to become even more widespread in the future. VR/AR technologies can create immersive musical experiences, allowing listeners to feel like they are part of the music. This technology has the potential to revolutionize the way music is consumed, and could potentially lead to the creation of entirely new musical genres.
Music Analytics and Data Mining
Music analytics and data mining are becoming increasingly important in the music industry, and are expected to play a significant role in the future of music software. These technologies can analyze large amounts of data, such as streaming data and social media activity, to gain insights into consumer behavior and preferences. This information can then be used to inform marketing and promotional strategies, and to identify new trends and opportunities in the music industry.
Cloud-based Music Production
Cloud-based music production is becoming increasingly popular, and is expected to become even more widespread in the future. Cloud-based music production allows musicians and producers to collaborate and share files online, rather than having to be in the same physical location. This technology has the potential to revolutionize the way music is produced, and could potentially lead to the creation of entirely new musical collaborations.
Overall, the future of music software is full of exciting possibilities, and these trends and innovations are sure to shape the way music is created, consumed, and experienced in the years to come.
FAQs
1. When was music software first invented?
The first music software appeared in the late 1950s, during the early days of computing. Max Mathews’s MUSIC program, written at Bell Labs in 1957, generated sound directly on a computer, and Lejaren Hiller’s Illiac Suite, produced the same year at the University of Illinois, used a computer to compose music. These early programs were used mainly by researchers, professional composers, and academic institutions.
2. Who invented the first music software?
The first music software was not invented by a single person, but rather developed by a number of pioneers in the field of computer music. Some of the earliest programs were created at research institutions such as Bell Labs and the University of Illinois, and were used primarily for research and experimentation.
3. How has music software evolved over time?
Music software has come a long way since its early days. In the 1980s, Miller Puckette created Max at IRCAM, and its descendants Max/MSP and Pure Data later allowed musicians and composers to build interactive multimedia installations and performances. In the 1990s and 2000s, digital audio workstations (DAWs) such as Pro Tools and Logic became popular, allowing musicians to record, edit, and produce music entirely on their computers.
4. What are some of the most popular music software programs today?
Today, there are many popular music software programs available, each with its own unique features and capabilities. Some of the most popular DAWs include Ableton Live, FL Studio, and Cubase. Other widely used tools include software synthesizers such as Serum and Massive, and orchestral sample libraries from developers such as Spitfire Audio and EastWest.
5. What is the future of music software?
The future of music software is likely to be shaped by emerging technologies such as artificial intelligence and machine learning. These technologies have the potential to revolutionize the way we create and experience music, and could lead to the development of new tools and techniques for composition, performance, and production. Additionally, the rise of virtual and augmented reality could also have a significant impact on the way we interact with music software in the future.