Using pmidic (open source) I'm able to translate video motion (on the x, y, and z axes) into MIDI events. It detects movement and focuses on the area with the most white (or black). I could of course program this myself using OpenCV with Processing or Pure Data, but pmidic does the same thing and is much easier ('why walk on water when there's a boat?').
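For anyone curious what the "roll your own" version might look like, here is a minimal sketch of the same idea (not pmidic itself, and only x/y, not z): track the brightest spot in a webcam frame with OpenCV and send its position as MIDI control changes. The MIDI port choice, CC numbers, and blur size are illustrative assumptions, not anything pmidic actually uses.

```python
# Sketch: brightest-spot tracking -> MIDI CC (assumes a webcam at index 0
# and at least one MIDI output port; CC numbers 20/21 are arbitrary choices).
import cv2
import mido

PORT_NAME = mido.get_output_names()[0]  # first available MIDI output (assumption)
CC_X, CC_Y = 20, 21                     # arbitrary CC numbers for x and y

cap = cv2.VideoCapture(0)
out = mido.open_output(PORT_NAME)

try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        gray = cv2.GaussianBlur(gray, (21, 21), 0)  # smooth single-pixel noise
        # Location of the brightest spot ~ "the area with the most white"
        _, _, _, (x, y) = cv2.minMaxLoc(gray)
        h, w = gray.shape
        # Scale pixel coordinates to the 0-127 MIDI value range
        out.send(mido.Message('control_change', control=CC_X, value=int(x * 127 / w)))
        out.send(mido.Message('control_change', control=CC_Y, value=int(y * 127 / h)))
        cv2.circle(frame, (x, y), 10, (0, 255, 0), 2)
        cv2.imshow('tracker', frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
finally:
    cap.release()
    out.close()
    cv2.destroyAllWindows()
```

Because this simply follows the brightest pixel, it is also exactly the kind of setup that will jump around on camera noise and exposure changes, which is the "glitch" behaviour described below.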
What I liked about this is that when you change the sensitivity, it picks up artifacts as the light changes and the camera re-adjusts. In this video the camera is facing a wall, and the glitch, as it picks up on video artifacts, is converted into noise:
The same happens in this video. Normally this would not be shown in an installation, but it almost makes a nice video in itself. The little dot becomes like a moving insect, a fly, or a microbe, flitting about in a non-linear way. Perhaps glitches are more valuable in understanding processes such as 'thinking'!