What is music?
Music is anything that involves oscillations. It may be a single sound, an ordered sequence of sounds (a melody), a simultaneous combination of sounds, a sequence of such combinations, and so on. Pauses (silence) are also part of music. Despite being often underestimated, they are as important as the sounds themselves: through the contrast of sound and silence, music becomes clearly defined and focused.
Music obeys precise mathematical rules, from the most basic physical level (a note produced by an instrument does not contain a single frequency, but a fundamental frequency plus a number of harmonics, determined by the physical properties of the instrument itself - its timbre) up to harmonic constructs and their flow.
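The harmonic series mentioned above can be sketched in a few lines. This is a minimal illustration, not a model of any particular instrument: the overtones of a note sit at integer multiples of the fundamental frequency, and timbre lies in their relative strengths.

```python
def harmonic_series(fundamental_hz, n_harmonics=5):
    """Return the first n_harmonics frequencies (Hz) of a note:
    the fundamental plus its integer-multiple overtones."""
    return [fundamental_hz * k for k in range(1, n_harmonics + 1)]

# The note A2 (110 Hz) and its first overtones:
print(harmonic_series(110.0))  # [110.0, 220.0, 330.0, 440.0, 550.0]
```

Note that the third harmonic of A2 is 440 Hz, i.e. concert pitch A4: octaves and other consonant intervals emerge directly from this integer structure.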
Traditionally, music has only been associated with sound. However, we think that this is not correct. To understand why, we must first examine the electromagnetic spectrum.
We will notice that the audible (sound) sits at the lower end of the frequency scale. Moving up the spectrum, we find the visible (light), and beyond it x-rays and gamma rays.
The first observation we can make is that we have two different organs - the eyes and the ears - that sense the same physical phenomenon: electromagnetic radiation. Between the two lies a wide gap, spanning from radio waves up through microwaves and infrared. We have no organ capable of sensing those bands, and therefore we are completely blind to them. Certain animals, such as bats, can sense ultrasound, which is beyond our reach.
Light is usually described by its wavelength rather than its frequency, but the two are related by a simple equation:
frequency = speed of light / wavelength
The speed of light is 299,792,458 m/s.
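The conversion above is straightforward to compute. As a worked example (the 500 nm value is simply an illustrative choice, roughly green light):

```python
SPEED_OF_LIGHT = 299_792_458  # m/s

def wavelength_to_frequency(wavelength_m):
    """Convert a wavelength in metres to a frequency in hertz,
    using frequency = speed of light / wavelength."""
    return SPEED_OF_LIGHT / wavelength_m

# Green light at about 500 nm:
print(wavelength_to_frequency(500e-9))  # ≈ 6.0e14 Hz
```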
Therefore, we need to stop considering hearing and sight as two different things. We actually have to apply the same rules to both and beyond, spanning through the entire electromagnetic spectrum.
The next thing we are going to figure out is that we can translate music into colors and colors into music, through the same equation above. We only need to transpose a piece of music to "higher octaves", doubling its frequencies until they land in the visible spectrum, to complete the translation.
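The octave-transposition idea can be sketched directly. The visible-band limits below are approximate round figures, chosen only for illustration (roughly 750 nm red to 380 nm violet):

```python
SPEED_OF_LIGHT = 299_792_458   # m/s
VISIBLE_LOW_HZ = 4.0e14        # ~750 nm, red end (approximate)
VISIBLE_HIGH_HZ = 7.9e14       # ~380 nm, violet end (approximate)

def transpose_to_visible(note_hz):
    """Raise a note by octaves (doubling its frequency) until it falls
    in the visible band; return (frequency_hz, wavelength_nm), or None
    if no octave of the note lands inside the band."""
    f = note_hz
    while f < VISIBLE_LOW_HZ:
        f *= 2.0
    if f > VISIBLE_HIGH_HZ:
        return None
    return f, SPEED_OF_LIGHT / f * 1e9

# Concert A (440 Hz), raised 40 octaves:
print(transpose_to_visible(440.0))  # ≈ (4.84e14 Hz, ~620 nm)
```

Under these assumed band limits, concert A maps to a wavelength of about 620 nm, in the orange-red region.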
Realizing this is already a great step, but things are not as easy as they might seem in theory. Visual arts rely heavily on the sensibility of the artist. A painter chooses a combination of colors to their liking, without any underlying thought about the wavelengths involved, and no mathematical construct is employed to make the painting "attuned" and harmonically correct.
Further research is needed in this field, but we may infer that, if the wavelengths of the chosen colors are harmonically consistent, this would immediately be perceived by the viewer, and the painting would meet with broader approval. The sense of harmony is intrinsic to people: as we sense it in music, we sense it in visual arts too.
Moreover, by knowing which color is a multiple (an octave) of the musical tone in the piece we are playing, we can design better choreographies. The relation between color and sound should not be underestimated, but taken as a whole instead.
As we saw in the onion model, electromagnetic radiation tends to acquire different properties depending on the slice of the spectrum it belongs to. Audible low frequencies (LF) and ultrasound do not produce an electromagnetic field; from radio waves onward, however, an electromagnetic field is established. At a certain point in the UV, the wavelength becomes so short that the radiation starts interacting at the atomic level. There we have ionizing radiation, which can dislodge electrons and break chemical bonds.
We can exploit those properties for our benefit. For example, we can use LF to halve the recovery time of a bone injury and UVC light to destroy unwanted pathogens, as we will see in the section about music therapy.