This is an open call for feature requests, overall philosophies, dream/cadillac solutions, etc.

Just trying to get a feel for what we should attempt to build here.

Nothing is off the table. It should be MUCH easier than the EuCon support project, because it is comparatively unconstrained.

For example: I don't even know if this is possible yet, but imagine being able to control the faders from an MCU, the transport from an Artist Transport, and the VST compressors, EQ's, reverbs, etc. from an iPad (perhaps using OSC).

I will contribute more later, but here's what I'd be looking for:
1) Extender support for plugin control please (more than 9 faders)
2) Double click rewind brings the transport to 0:00:00, and double click fast forward brings the transport to the end of the project ("Did you want to hear that back from the top?")
3) Easy plug-in control mapping
4) Support for fader/pan 'flip' in plug-in control mode
5) Ability to assign faders to the control surface and have them identified by colour in Reaper. E.g. maybe I want Bass, Kick, Snr, Rest of Kit, GTRs, LdVox, BGV, FX1, FX2, and not a bunch of sub-folder tracks.

Quote:

For example: don't even know if this is possible yet, but imagine being able to control the faders from an MCU, the transport from an Artist Transport, and the VST compressors, EQ's, reverbs, etc. from an iPad (perhaps using OSC).

I think this would work. At one point I tested using Klinke's csurf DLL (the one that supported extenders, I forget the version number) on my hardware MCU and on Neyrinck V-Control at the same time. It worked. I can't imagine why throwing OSC into the mix wouldn't work too.

EuCon's "wheel actions", which you can map to the jog/shuttle wheel, were very interesting: I would have loved being able to use the wheel to jog, zoom, trim, nudge, move envelope points up and down, select the next item, track, point, etc.

I would love to be able to lock tracks to faders, and to have faders write to selected envelope on their track... being able to cycle through envelopes with the wheel would make that very powerful.

It would be amazing to have compressor gain reduction meters (OK, that's too much, I'm sure)...

Being able to open the first or next plugin per track and have the params spill onto the surface would be great.

- A 'native' (built into the .dll) MCU display emulator (for those using an MCU emu without a built-in display, like me with the BCR2000).
Not a high priority, as Mountain Utilities' BC Manager contains an MCU display emulator that's working nicely (for me anyway). A built-in display emulator would eliminate the need for virtual MIDI connections though.

- It might also be nice if the tracks currently controlled by the MCU could automatically change to some custom colour.

A SynthEdit-like compiler.
Take all the things that are possible in Reaper's controller SDK and put them in graphical compiler form.
That way we can create our own controller support designed specifically around the controllers we have.

Support logical aggregation of control surfaces
e.g. an iPad controlling VSTs and an MCU controlling faders, pan, etc.
Each controller is aware of its neighbours, itself, and its role in the "community aggregate".

Support numerous workflow configurations, often by pruning functionality -- is anyone not hiding the playrate control?

Support scripting -- EEL/JSFX/Lua/Python, etc.

Support the usual controller specific suspects -- midi learn, etc.

Support more than just MIDI -- support OSC etc.

Coming up with the functionality is far from trivial, but that pales in comparison to the effort required to get a workable UI for all this.

I certainly hope all of you out there will jump in with the great ideas needed for success.

Connecting to Reaper via OSC (hence doing a separate program) supposedly will allow for the requested functionality, but when doing something "within" Reaper, do you intend to access Reaper via the Tool API? I suppose it will provide the necessary functionality as well.

How should such a tool access the script interpreters to "Support scripting"? OSC2Midi has a built-in EEL interpreter and is open source.

I've been thinking:
What is the ultimate control surface solution for Reaper / a DAW?
The answer depends a little on the user scenario, I guess,
but a multifunction touchscreen with multifunction knobs seems quite good.

Lots of knobs for the most common transport/edit tasks,
and screen knobs changing for the chosen task (think toolbar icons).

Then, the old paradigm of one fader per DAW track seems pretty useless.
Personally, one fader (for the selected track) is enough for most tasks.
Or assignable faders, 8/16 of them, for typical busses.

Then, those faders should work for plugin automation as well,
but then two-way communication for LED labels is crucial.

It is of little use having a ton of faders/knobs if you don't know what they are assigned to or what their value is.

I believe track_ID is sorely missing in Reaper.

But otherwise, most of this should be doable.
Generic, GUI-less plugins should be OK,
but a GR meter is maybe harder.

As far as the ultimate control surface goes, I've been thinking about / working on this for over 10 years now -- the first programming I ever did in this space was the Faderport drivers for various DAWs in 2006.

I now think the ultimate control surface is a hybrid made up of various phones/pads/fader packs/scrub jog wheels, etc. depending on the user's role at the moment - mixer, editor, post-pro, etc.

So workflow has to be the BIG number 1 with a bullet.

Otherwise, why would you spend the money, time, and effort on all this mess?

You have to be able to change workflows quickly, perhaps in a way very like the layers on the live digital boards.

Most often the idea would be to remove buttons (ones that are out of context) so that you don't press them mistakenly.

For instance:

example 1: If you are not tracking, the record arm and record buttons just clutter things up; they can and should be hidden so that they are not inadvertently pressed.

example 2: If you use an Apollo or equivalent, the input monitoring section is generally just clutter; the Apollo console takes care of that chore.

Quote:

I believe track_ID is sorely missing in Reaper

I seem to recall a track GUID; I'll have to check.

Please let's keep this discussion going everyone, the product will only get better with your input.

The custom plugin layouts in Klinke's csurf were good, though Avid Icon consoles were really good at that.

If setup complexity can be reduced that would already help. More people would use it.

As for Reaper <-> OSC <-> MIDI controllers,
that is a serious can of worms.

Acquisition of MIDI control resources, and then customisable ways of using them. For example, having an action to "Control plugin 1 of currently selected track". This should present the parameter values on the predefined output capabilities of the MIDI device, or any other for that matter.

You'd have the example I showed in my FR thread on this subject in post #1. That output command set could also be a separate OSC tablet, which I might use to show the parameters of the currently targeted plugin.

MIDI knob to OSC, Reaper sends back plugin data via OSC, and this magic thing sends whatever I told it to when THAT MIDI knob is being used.
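A minimal sketch of that routing idea, assuming hypothetical OSC addresses (these are illustrative, not Reaper's actual OSC patterns): a MIDI CC is translated to an OSC message, and feedback on the same address is routed back to the knob's indicator.

```python
# Sketch of the MIDI-knob -> OSC routing idea above. All names and OSC
# addresses here are hypothetical, not Reaper's actual OSC pattern config.

class KnobRouter:
    """Maps (midi_channel, cc_number) to an OSC address, and routes
    feedback from that address back to the knob's indicator."""

    def __init__(self):
        self.to_osc = {}    # (chan, cc) -> OSC address
        self.to_knob = {}   # OSC address -> (chan, cc)

    def bind(self, chan, cc, address):
        self.to_osc[(chan, cc)] = address
        self.to_knob[address] = (chan, cc)

    def on_cc(self, chan, cc, value):
        """Turn an incoming CC (0-127) into an OSC message (address, 0.0-1.0)."""
        address = self.to_osc.get((chan, cc))
        if address is None:
            return None
        return (address, value / 127.0)

    def on_osc(self, address, value):
        """Turn plugin feedback into a CC for the knob's LED ring."""
        knob = self.to_knob.get(address)
        if knob is None:
            return None
        chan, cc = knob
        return (chan, cc, round(value * 127))

router = KnobRouter()
router.bind(0, 16, "/fx/1/param/3")          # hypothetical address
print(router.on_cc(0, 16, 64))               # knob moved
print(router.on_osc("/fx/1/param/3", 0.5))   # Reaper echoes the value back
```

The two dictionaries make the routing bidirectional, which is the "2-way communication" others in this thread keep stressing.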

Just thinking about the interaction of actions and surfaces -- all of this might be obvious, but just riffing...

For the surfaces to be maximally useful, they need to be modal.

So modes need to be settable, and indicated by toggles on the surface.

We also need to be able to set what the surface controls from Reaper with actions.

An action might assign the focused envelope to fader 1.

A macro could add ReaEQ AND open it on the surface's faders for editing.

Or a script might seek input from the surface: create a send and wait for a button press on a 'select' button to set the target.

A search field in a script could narrow down tracks, and the results could be dynamically assigned to the surface.

In the mapping process, selecting params from lists can block flow. It would be nice to be able to use text to map params -- I could imagine an on-the-fly mapping window that prompts for knob 1 and accepts a dynamic search, auto-selecting and advancing when there is only one result, so you type 'low gain', 'low q', 'mid 1 gain' to map the first 3 knobs.
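That auto-advancing search could be sketched like this (the parameter names are invented for illustration): each typed query is matched against the plugin's parameter names, and a knob is assigned only when the match is unique.

```python
# Sketch of the on-the-fly text-mapping idea above: type a search, and when
# exactly one parameter matches, it is auto-assigned to the current knob and
# the prompt advances. Parameter names are made up for illustration.

def map_by_search(param_names, queries):
    """For each typed query, assign the uniquely matching parameter to the
    next knob. Ambiguous or empty matches leave that knob unassigned."""
    assignments = {}
    knob = 1
    for query in queries:
        matches = [p for p in param_names if query.lower() in p.lower()]
        if len(matches) == 1:
            assignments[knob] = matches[0]   # unique hit: auto-select, advance
        knob += 1
    return assignments

params = ["Low Gain", "Low Q", "Low Freq", "Mid 1 Gain", "Mid 1 Q", "High Gain"]
print(map_by_search(params, ["low gain", "low q", "mid 1 gain"]))
```

A real version would narrow interactively keystroke by keystroke, but the unique-match rule is the core of the flow.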

Likewise, a command to set the jog wheel function based on the last action could be interesting. Say you press zoom out twice... then invoke the command, and the jog wheel is set to zoom. Or trim item start, etc.

Justin and Schwa might provide a better framework, but they're not going to do the grunt work that a major force of users could.

Ok here goes.

Some groundwork.

Plugin parameters need to be sorted by the user into a prioritized list.

The plugin parameters most important to the user come first. Call them Plugin Parameter A, Plugin Parameter B, and so on. Past the user assignments, the rest of the parameters are just strung along. Maybe the user should also be able to keep some of them out, for those unwieldy mega-beast plugins with hundreds of parameters.

This will come in handy later.
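As a sketch, the prioritized list above could be modeled like this (the plugin and parameter names are made up):

```python
# Minimal sketch of the prioritized-parameter idea above: the user picks and
# orders the parameters that matter, excludes the noise, and the rest of the
# plugin's parameters just trail along in their original order.

def prioritize(all_params, priority, excluded=()):
    """Return user-priority params first (A, B, ...), then the remaining
    params in plugin order, skipping excluded ones."""
    rest = [p for p in all_params if p not in priority and p not in excluded]
    return list(priority) + rest

plugin_params = ["Input", "Attack", "Release", "Ratio", "Threshold", "Makeup", "Mix"]
surface_order = prioritize(plugin_params,
                           priority=["Threshold", "Ratio", "Attack", "Release"],
                           excluded=["Input"])
print(surface_order)
```

The surface would then hand out knobs in `surface_order`, so the user's "Parameter A" always lands on knob 1 regardless of how the plugin enumerates its parameters.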

It has to stay simple. You have a bunch of resources and need to set up how they ought to be used.

At our disposal is stuff like

MCU controllers
which are a bunch of motorized faders, a few endless knobs, a slew of buttons, and a jog wheel.

Indicators include lights on some buttons and a set of scribble strips.

MIDI knob/fader boxes like the Akai MIDImix or the Midi Fighter Twister, with or without indicators. This also includes the Behringer X-Touch Mini, which has a MIDI mode with two layers.

Button boxes
They all usually have indicators, like the Midi Fighter 3D or the Novation Dicer (6! colours).

OSC Touchscreens

Now for some ways of using it. This design work is by far the hardest part, I find.

Custom Zones
It's a specialized mode that some folks are likely to spend most of their time in.

This is akin to the Custom Zones on consoles like the Avid Icon (a good, older Pro Tools-only controller).

Here is how it works:

A bunch of control resources are designated as a custom zone.

The user can switch this custom zone on and off. Some control resources (faders, buttons, knobs...) might still be used to select tracks and control vanilla stuff like track volume.

Zone types:

Custom plugin control
"Show Plugin 1 in custom zone" and so on. Each plugin can be given its own custom layout. Bank functions to page through plugins and parameter banks can be assigned too.

Access to this custom zone mode is done via actions. Each action also takes you out of the mode again when executed a second time without another zone action having been triggered in between. So you hit the "Custom Track Zone" button, then hit it again to get out. Hit "Custom Plugin Control", then "Custom Track Zone", then "Custom Plugin Control" again, and you end up in "Custom Plugin Control"; you'd need to hit that once more to turn the zone thing off.
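The toggling described above boils down to a tiny state machine (the zone names are just examples): triggering the same zone action twice in a row exits zone mode, while triggering a different one simply switches zones.

```python
# Sketch of the zone-toggling behaviour described above: the same zone
# action twice in a row (with no other zone action in between) turns zones
# off; a different zone action just switches to that zone.

class ZoneState:
    def __init__(self):
        self.active = None   # name of the active zone, or None

    def trigger(self, zone):
        """Handle a zone action; returns the new active zone (or None)."""
        if self.active == zone:
            self.active = None   # second hit in a row: exit zone mode
        else:
            self.active = zone   # enter a zone, or switch zones
        return self.active

z = ZoneState()
print(z.trigger("Custom Track Zone"))     # enter track zone
print(z.trigger("Custom Track Zone"))     # same again: zones off
z.trigger("Custom Plugin Control")
z.trigger("Custom Track Zone")
print(z.trigger("Custom Plugin Control")) # switch back to plugin control
```

Whatever drives the surface's indicator LEDs would watch `active` to light the right toggle button, per the note below about states needing indicators.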

That's a fair start and a fair warning of what this could end up being.

Remember that each state also needs indicators if possible.

Indicators are another thing entirely. Control knobs could be assigned default indicators. But what happens if that clashes with another default assignment? Knob Z might want to show the state of the parameter it controls, as does Knob A. Who knows. Let's get there first is my motto.

Tackling the MCU as the big test case might just be a good place to start. Many folks have such devices.

The way Klinke's csurf gave us all the extra layers could be the starting point.

If MIDI gear is addressed as control-enabled MIDI devices, the MCU would have to live like this too. On the other hand, that gives you more flexibility in putting your personal control setup together.

Several types of presets could come into play.
Device resource presets that let users skip over the acquisition of midi-learn-collected resources.
The composition of these resources into a control setup is another, and it likely encompasses the device presets.

Layers, such as one that controls the currently selected plugin, could be filled with device presets, whose control resources get auto-assigned according to a list of preferred parameters for each plugin.

First off, there is the simplest: MIDI controllers in all their forms (absolute, toggle, toggle-through-a-number-of-states, relative 1-3).
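Those value forms could be decoded with something like the sketch below. Note that relative-mode conventions vary between devices and hosts; the three assumed here (two's complement, binary offset, sign bit) are common, but should be verified against the actual hardware.

```python
# Sketch of decoding the controller value forms listed above. The relative
# conventions are assumptions: relative1 = two's complement (127 means -1),
# relative2 = binary offset (64 means 0), relative3 = sign bit (0x40 flag).

def decode(mode, value, state=0, num_states=2):
    """Return the new parameter state for a 7-bit CC value (0-127)."""
    if mode == "absolute":
        return value
    if mode == "toggle":
        return 0 if state else 127
    if mode == "cycle":                      # toggle through N states
        return (state + 1) % num_states
    if mode == "relative1":                  # two's complement
        delta = value - 128 if value > 64 else value
        return state + delta
    if mode == "relative2":                  # binary offset
        return state + (value - 64)
    if mode == "relative3":                  # sign/magnitude
        delta = -(value & 0x3F) if value & 0x40 else value
        return state + delta
    raise ValueError(mode)

print(decode("relative1", 127, state=50))   # one tick down from 50
print(decode("relative2", 65, state=50))    # one tick up
print(decode("cycle", 0, state=1, num_states=3))
```

Getting these per-device conventions right once, and storing them in the device preset, is exactly the kind of grunt work the acquisition step should hide from users.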

But how do you capture OSC resources?
I'm inexperienced with OSC, though.

Presets of resources should be shareable, perhaps via clipboard (similar to what ValhallaDSP plugins do) or file exchange. Paste/load a preset, get asked which device it refers to, and we have a bunch of new resources ready for use.
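A sketch of such a pasteable preset, assuming a simple JSON text format (the format itself is made up for illustration): exporting produces a string you can paste anywhere, and importing re-binds the resources to a locally chosen device.

```python
# Sketch of clipboard-shareable resource presets. The JSON layout here is
# an assumption, purely for illustration.

import json

def export_preset(name, resources):
    """Serialize a device's control resources to a pasteable string."""
    return json.dumps({"preset": name, "resources": resources}, sort_keys=True)

def import_preset(text, device_id):
    """Parse a pasted preset and re-bind it to a locally chosen device."""
    data = json.loads(text)
    return {"device": device_id,
            "name": data["preset"],
            "resources": data["resources"]}

blob = export_preset("X-Touch Mini layer A",
                     [{"type": "knob", "cc": 1}, {"type": "button", "cc": 8}])
preset = import_preset(blob, device_id="midi-in-3")   # user picks the device
print(preset["name"], len(preset["resources"]))
```

Keeping the device binding out of the exported blob is what makes the preset portable between users with different MIDI port setups.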

Once we have resource acquisition, we can turn to layers and zones. Zones can have their layers switched with actions, which in turn can be triggered from anywhere. That'll be the bigger GUI challenge: representing the GUI resources, perhaps creating a simplistic layout to help users remember stuff, and then creating zones from them.


What do you think?

I was fortunate in having done an iOS app last year.

Learned a hell of a lot about simplicity, pragmatism, taste, etc.

The part that applies to this project is the part that says "The best way to simplify configuration is to eliminate it".

To that end I have started a little POC (proof of concept) that will test some of these ideas.

I'm going to write an iPad app that has an LA2A compressor, complete with multitouch controls.

This app will control (via OSC I think right now) the VST plugin in Reaper.

The goal is to have zero setup.

OSC Gateway apps run on both the host computer and the iPad and discover each other -- zero-config.
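One way such zero-config discovery could work is for each gateway to periodically broadcast a small announce datagram that peers parse to learn where to send OSC. The payload format and port below are pure assumptions for illustration, not the actual app's protocol.

```python
# Sketch of a zero-config discovery handshake: each gateway broadcasts an
# announce packet; peers parse it to learn the sender's name and OSC port.
# The payload format and port number are hypothetical.

import json

ANNOUNCE_PORT = 9009   # hypothetical broadcast port

def make_announce(name, osc_port):
    """Build the UDP broadcast payload a gateway would send."""
    return json.dumps({"app": "osc-gateway", "name": name,
                       "osc_port": osc_port}).encode()

def parse_announce(payload):
    """Return (name, osc_port) if the datagram is one of ours, else None."""
    try:
        data = json.loads(payload.decode())
    except (UnicodeDecodeError, ValueError):
        return None
    if data.get("app") != "osc-gateway":
        return None
    return data["name"], data["osc_port"]

pkt = make_announce("iPad LA-2A panel", 8000)
print(parse_announce(pkt))
print(parse_announce(b"not ours"))
```

A production version would more likely ride on an existing zero-config mechanism such as mDNS/Bonjour, but the principle is the same: the user never types an IP address.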

After we have reduced the setup/config as much as possible, we will then have to tackle all the mapping, layering, zoning, etc.

This will likely be an ongoing project with multiple releases of multiple apps on multiple platforms over a number of years.

I'm certainly in favour of keeping all the intelligence in the DAW, and relegating all external controller hardware to being what they are, dumb-fuck devices that send data the user enters and have blinking lights.

OSC tablet apps are different. They can be controllers, but they have to be designed on the pad, not in the DAW. However, if you use them as a blinking-light display, that will work too.

Quote:

I'm certainly in favour of keeping all the intelligence in the DAW, and relegating all external controller hardware to being what they are, dumb-fuck devices that send data the user enters and have blinking lights.

OSC tablet apps are different. They can be controllers but have to be designed on the pad, not the DAW. However if you use them to be a blinking light display, it will work too.

If that's what you meant, I'm all +1.

Well, I think you've cut to the chase here. Yes, it is the OSC pad devices I'm considering. I think we're on the same page: sort of very smart, but still glorified blinking-light / touch-input displays.

Although I currently use a pair of nanoKONTROLs and am happy with how they do what I bought them for, I also use TouchOSC for remote control of Reaper & would LOVE to see easier/better integration at any level with Reaper.

Mind you fwiw the ACt IMPLEMENTATION IN sONAR IS NO BETTER THAN WHAT WE CURRENTLY HAVE ON rEAPER... Aw shit! arthritis again
