
Look ma! No hands.

One of the problems with making software is that it doesn’t film well. There’s nothing to touch and a lot of reading, in different languages and in symbols.

For the last few days we’ve been trying to get everything to talk to each other. The goal is to let anyone near the pixel wall control it, say to turn it off or change the visuals, using the pieces already in PixelController, like the Open Sound Control (OSC) integration. After re-sizing and adding some new buttons, the iPhone 5 TouchOSC interface is ready. The best feature is Random Mode. You can now set the number of seconds, like 3600, before a new, completely random visual is displayed. Stand-alone operation. Check.
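(For the curious, here’s roughly what a control message looks like from the scripting side. This is a minimal sketch using the node-osc package for NodeJS; the host, port, and the /RANDOMIZE address are placeholders, not PixelController’s actual settings, so check the OSC configuration for the real values.)

```js
// Minimal sketch: fire one OSC message at PixelController from Node.
// Assumes `npm install node-osc`; the host, port (9876), and the
// /RANDOMIZE address are placeholders -- check your PixelController
// OSC settings for the real values.
const { Client } = require('node-osc');

const client = new Client('192.168.1.50', 9876); // the CuBox running PixelController

// Ask for a new random visual (hypothetical address).
client.send('/RANDOMIZE', 1, (err) => {
  if (err) console.error(err);
  client.close();
});
```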

Of course, to use it you need an iPhone 5, the TouchOSC app (not free), and access to a desktop running the TouchOSC Editor (free) or iTunes to transfer the files to the phone first. It works well, but not for everybody. We need something more accessible, like… the Internet.

This gets a little more complicated - a NodeJS web server on the CuBox sending the OSC messages to PixelController - so let’s just skip to the… and now we have a web page that can control the pixels! Any web browser. Any device. GOAL!!!
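To give a sense of how the bridge works: a tiny NodeJS server serves the button page and relays each press as an OSC message. Here’s a minimal sketch; Express and node-osc are just stand-ins (any HTTP and OSC library would do), and the port and message addresses are placeholders rather than PixelController’s actual ones.

```js
// Minimal sketch of the web-to-OSC bridge: a button press on the page
// becomes an OSC message to PixelController. Express and node-osc are
// stand-ins; the OSC port and addresses are placeholders.
const express = require('express');
const { Client } = require('node-osc');

const app = express();
const osc = new Client('127.0.0.1', 9876); // PixelController's OSC port (placeholder)

// Serve a bare-bones page of buttons.
app.get('/', (req, res) => {
  res.send(`
    <button onclick="fetch('/osc/RANDOMIZE')">Random</button>
    <button onclick="fetch('/osc/CHANGE_GENERATOR_A')">Next visual</button>
  `);
});

// Relay a button press as an OSC message.
app.get('/osc/:cmd', (req, res) => {
  osc.send('/' + req.params.cmd, 1, (err) => res.sendStatus(err ? 500 : 200));
});

app.listen(3000, () => console.log('Pixel wall control on port 3000'));
```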

[Screenshot: the web control page]

Yeah, it’s all buttons. It works. Sliders, dropdown menus, and an actual interface layout are for the next iteration.