An interactive audio-visual instrument
“Formaphone” structures the similarity of music through visual patterns. A camera captures the light that enters its frame as visual data, and the software generates a sound frequency from the speed at which that light moves across the frame: the faster the light travels, the higher the pitch; the slower it moves, the deeper the tone. The software does more than translate the light's speed into frequency. It also projects the path the light traces onto a screen in real time, from the recording camera's point of view, so the user sees their drawing appear as they draw it.

Users are free to draw whatever they wish, but the sound their drawing produces tends to be heard rhythmically, so they naturally begin to move in rhythm. Because the speed of the zigzags and curves drawn with the shoulders and arms widens the variety of notes, users are also inclined to diversify their movements. To vary speed while keeping a steady rhythm, the user plays with the scale of the movement: large curves are drawn quickly, small ones slowly. This interplay of motion and speed gives the audio-visual pattern its waveform. The visual patterns corresponding to movements in the most common time signatures, 2/4, 4/4, and 3/8, take the form of waves.
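The text does not specify how speed is converted to frequency, so the following is only a minimal sketch of the speed-to-pitch mapping described above. The speed units (pixels per frame), the speed and frequency ranges, and the exponential interpolation are all assumptions; an exponential curve is chosen because pitch perception is roughly logarithmic, so equal steps in speed yield equal musical intervals.

```python
def speed_to_frequency(speed, min_speed=0.0, max_speed=500.0,
                       low_hz=110.0, high_hz=880.0):
    """Map the tracked light's speed (assumed pixels/frame) to a pitch in Hz.

    Faster motion yields a higher pitch; slower motion a deeper one,
    matching the behavior described in the text. Ranges are illustrative.
    """
    # Normalize speed into [0, 1], clamping values outside the range.
    t = (speed - min_speed) / (max_speed - min_speed)
    t = max(0.0, min(1.0, t))
    # Interpolate exponentially between the low and high frequencies.
    return low_hz * (high_hz / low_hz) ** t


# A stationary light sits at the lowest pitch; the fastest motion
# reaches the highest; intermediate speeds fall in between.
print(speed_to_frequency(0.0))    # lowest pitch
print(speed_to_frequency(500.0))  # highest pitch
```

In a real implementation, the speed would come from frame-to-frame tracking of the brightest region in the camera image, smoothed over a few frames to avoid jittery pitch changes.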