However, a quick note to let you all know that multi-touch works just brilliantly in every ScopeSync device.

I have Win7 x86 with a GeChic OnLap 1502i...a touch-enabled monitor that complies with "Windows Touch", so there are no special drivers or other background detection involved. The monitor has a USB port and connects as a normal HID device.

Basically, all ScopeSync interfaces work, as-is, as a multi-touch interface to Scope devices. I think I just stepped into modular heaven, and now have 10 fingers to stroke my patches with. No MIDI assignment needed, no other editing.

Now...how can I control a device that doesn't have ScopeSync built in, such as a mixer?

Congrats, guys. This might be a side-effect of using JUCE, but it's a fabulous way to interact with devices. No need to build a Lemur patch on an iPad...just use your touch screen.

Well, that's good to hear! Thanks for trying this out. For those feeling jealous, it's worth noting that JUCE supports iOS and Android as build targets, so it could be quite a fun project to create a standalone version of the ScopeSync VST and compile it for those platforms. Also, for Mac users, it should be even easier to compile an AudioUnit version. There's no Windows-specific code in the ScopeSync VST, only in the Scope module (as the Scope libs are currently Windows-only). We'll probably look at builds for the different platforms once the core features are complete, but we'd certainly support anyone who fancies trying in the meantime!
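As a side note on why multi-touch "just works" here: JUCE delivers each finger to a component as a separate mouse-style input source with its own index, so existing drag-handling code runs once per finger without any extra wiring. A rough, framework-free sketch of that idea is below (this is illustrative only, not ScopeSync or JUCE code; `TouchTracker` and the event names are hypothetical):

```cpp
#include <cassert>
#include <cstddef>
#include <map>

// Each touch event arrives tagged with a finger index, so the UI can
// track every finger's position independently. The framework routes
// each source to the control under it, which is why a widget written
// for single-mouse dragging handles ten fingers for free.
struct TouchPoint { float x; float y; };

class TouchTracker {
public:
    // Finger pressed down: start tracking it under its index.
    void touchDown(int fingerIndex, float x, float y) {
        points[fingerIndex] = {x, y};
    }

    // Finger moved: update only that finger's position.
    void touchMove(int fingerIndex, float x, float y) {
        auto it = points.find(fingerIndex);
        if (it != points.end())
            it->second = {x, y};
    }

    // Finger lifted: stop tracking it.
    void touchUp(int fingerIndex) {
        points.erase(fingerIndex);
    }

    std::size_t activeFingers() const { return points.size(); }

private:
    std::map<int, TouchPoint> points; // one entry per finger currently down
};
```

The key design point is that state is keyed by finger index rather than stored in a single "the mouse" slot, so two fingers on two different faders never interfere with each other.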

Regarding using ScopeSync with devices that don't have built-in support, the best we have so far is Simon's (in-development) Remote Control Shell. That allows ScopeSync to control Modular modules that don't have control pads. We have some ideas about how something similar could be incorporated into the Routing Window, but we'll most likely need some help from S|C to get that working.