On 07 Jun 2007, at 18:42, Dan Nigrin wrote:
>
> Considering the Cycling/Ableton relationship, I think we all should
> beg Brad Garton to make us a Python Max external… ;-) Or perhaps
> there’s a way at Python via Java… Hmmmm:

Dan Nigrin skrev:
> OK, now I feel stupid! That'll teach me to post before checking maxobjects first… Anyway, props to Mr. Grill, now off to learn Python and think of cool stuff to do with Max <-> Live!
>
> Dan
> –
>
The way I see it you don't even need to learn Python – just run the
Python scripts like they instruct and control Live over OSC… Looks
rad, to be honest. I hope it pushes Ableton towards an official stance
on OSC.
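
For anyone who wants to poke at this before reading the scripts themselves: an OSC message is just a UDP packet with a null-padded address string, a type-tag string, and big-endian arguments. A minimal sketch in Python – note that the `/live/play/clip` address and port 9000 are assumptions here; check the LiveAPI docs for the actual address space and listening port:

```python
import socket
import struct

def osc_message(address, *args):
    """Hand-encode a minimal OSC message (int32 arguments only)."""
    def pad(b):
        # OSC strings are null-terminated and padded to 4-byte boundaries
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode())
    msg += pad(("," + "i" * len(args)).encode())  # type-tag string, e.g. ",ii"
    for a in args:
        msg += struct.pack(">i", a)  # big-endian int32
    return msg

# Fire clip 0 on track 1 (address and port are assumptions, see above).
packet = osc_message("/live/play/clip", 1, 0)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 9000))
```

The same three-part layout (address, tags, arguments) is all the CNMAT objects speak on the Max side, so a packet built like this is readable there too.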

At 8:34 PM +0200 6/7/07, Andreas Wetterberg wrote:
>Dan Nigrin skrev:
>>OK, now I feel stupid! That'll teach me to post before checking
>>maxobjects first… Anyway, props to Mr. Grill, now off to learn
>>Python and think of cool stuff to do with Max <-> Live!
>>
>>
>The way I see it you don't even need to learn Python – just run the
>Python scripts like they instruct and control Live over OSC… Looks
>rad, to be honest. I hope it pushes Ableton towards an official
>stance on OSC.

Good point – will have to explore whether everything that's possible
via the API is accessible via OSC…

over here it’s running fine under max 4.6 (ppc).
thomas grill has done an update some time ago.
couln’t find it on his site though.
v

On 07 Jun 2007, at 21:14, Dan Nigrin wrote:

> Yeah, just noticed the same thing. But the sources are there –
> perhaps someone who knows this stuff better than I could build a
> Mac UB version? Or perhaps Thomas himself??
>
> Thanks,
> Dan
>
> At 2:41 PM -0400 6/7/07, Gary Lee Nelson wrote:
>> Had a look. No 4.6 version and apparently no Intel.
>>

Jorge schrieb:
> Wouldn’t it be possible to do the same in, let’s say, C? If we can, it’s
> just a matter of building an object with some networking functions
> specific to this process, and importing it into Max.
>
> Maybe I’m wrong, I’m an absolute beginner in Max, but I know
> something about networking.

It has been done – look at the OSC objects from CNMAT… (search
maxobjects.com)

Okay, went over the instructions – that helped a lot.
Amazing how helpful instructions are when you read them ;)

So I installed all the files and configured ports and ran the
demo Live project. When I call getAllClips and getAllTracks
only some of the info makes it over. The clips that do make it
over I am able to trigger via the buttons.
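
One way to see exactly what does and doesn't make it over is to dump the raw replies before Max gets involved. A rough OSC decoder in Python – it only handles int32/float32/string arguments, and the reply port 9001 is an assumption; use whatever you configured:

```python
import socket
import struct

def parse_osc(data):
    """Decode one OSC message with int32 (i), float32 (f), string (s) args."""
    def read_str(buf, i):
        end = buf.index(b"\x00", i)
        s = buf[i:end].decode()
        i = end + 1
        i += (-i) % 4  # OSC strings are padded to 4-byte boundaries
        return s, i
    addr, i = read_str(data, 0)
    tags, i = read_str(data, i)
    args = []
    for t in tags.lstrip(","):
        if t == "i":
            args.append(struct.unpack(">i", data[i:i + 4])[0]); i += 4
        elif t == "f":
            args.append(struct.unpack(">f", data[i:i + 4])[0]); i += 4
        elif t == "s":
            s, i = read_str(data, i)
            args.append(s)
    return addr, args

def dump_replies(port=9001):
    """Print every reply Live sends; run this while calling getAllClips."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        data, _ = sock.recvfrom(4096)
        print(parse_osc(data))
```

Running `dump_replies()` while triggering getAllClips should tell you whether the clip/track names are missing on the wire or being dropped inside the patch.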

Quote: Anthony Palomba wrote on Thu, 30 October 2008 13:07
—————————————————-
> Okay, went over the instructions – that helped a lot.
> Amazing how helpful instructions are when you read them ;)
>
> So I installed all the files and configured ports and ran the
> demo Live project. When I call getAllClips and getAllTracks
> only some of the info makes it over. The clips that do make it
> over I am able to trigger via the buttons.
>
> Any idea why clip/track names are missing?
—————————————————-

Not sure, things seem to work for me here.

I just wish they added more functionality. How can you adjust BPM? How about Global Quantize? Etc… right now I am only able to launch clips/scenes.

marcos wrote:
> I just wish they added more functionality. How can you adjust BPM? How about Global Quantize? Etc…right now I am only able to launch clips/scenes.
>
BPM and Global Quantize are accessible via regular MIDI.
The focus of the LiveAPI work has been on actually expanding Live's
data streams (especially output).
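
For the regular-MIDI route, the message itself is tiny: a Control Change is three bytes, and Live's MIDI Map mode (Cmd/Ctrl+M) binds whatever CC you send to the tempo or quantize control you click. A sketch of building the raw bytes – the channel and CC numbers here are arbitrary examples, and how you get them into Live (virtual MIDI port, rtmidi, a hardware loopback) depends on your setup:

```python
def control_change(channel, cc, value):
    """Raw 3-byte MIDI Control Change: status byte, controller number, value."""
    if not (0 <= channel < 16 and 0 <= cc < 128 and 0 <= value < 128):
        raise ValueError("channel/cc/value out of MIDI range")
    # status byte 0xB0-0xBF = Control Change on channels 1-16
    return bytes([0xB0 | channel, cc, value])

# e.g. after MIDI-mapping Live's tempo to CC 20 on channel 1 (0-based here):
msg = control_change(0, 20, 100)
```

Live then scales the 0–127 CC range across whatever min/max you set for the mapping, which is how a 7-bit controller can still give a usable tempo sweep.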

Thanks. I did check out the documentation on the MIDI API and it does seem outdated. The only things that have worked are clip launching and scene launching. I'll try to go through the API and see if I can do something with it.

I don't have any plans to add anything in particular to that version of the APIMIDI setup; what I have started (quite some time ago now) is a system for configuring your own setup (MultiAPI).

With that you would use a text file to define how you want to get the data in and out (via OSC or MIDI), and then define whether you want to use MIDI notes, CCs, particular OSC addresses, etc. So you could say you want to send in clip triggers via MIDI CCs but send out clip information via OSC… that kind of thing.
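
As an illustration of the idea only – the file syntax below is entirely made up, and MultiAPI's real format will differ – such a definition file could be one route per line, parsed into routing dicts:

```python
# Hypothetical mapping file: name, direction, transport, then details.
EXAMPLE = """
clip_trigger  in   midi  channel=4 cc=13
clip_info     out  osc   /live/clip/info
"""

def parse_mapping(text):
    """Parse 'name direction transport detail...' lines into routing dicts."""
    routes = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # strip comments and whitespace
        if not line:
            continue
        name, direction, transport, *rest = line.split()
        detail = {}
        for tok in rest:
            if "=" in tok:
                k, v = tok.split("=", 1)
                detail[k] = int(v) if v.isdigit() else v
            else:
                detail.setdefault("args", []).append(tok)  # e.g. OSC address
        routes.append({"name": name, "dir": direction,
                       "via": transport, **detail})
    return routes

routes = parse_mapping(EXAMPLE)
```

The point being that the whole in/out wiring becomes data, so swapping clip triggers from MIDI CCs to OSC is a one-line edit rather than a code change.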

Obviously the rationale for this is that people can then build the system they want without having to worry about coding, and it would also open up some more interesting integration scenarios, like MIDI in from some controller and OSC data out going to Quartz Composer or a remote SuperCollider setup – oh yeah, and Max/MSP ;)

Anyway… there is some code in SVN for this, but I haven't actually tried running it with Live yet, just in my test setup.

The problem at the moment is that I have no time to spare for any of these 'home projects', but I'll try to answer any questions to the best of my ability.

Quote: marcoskohler wrote on Mon, 17 November 2008 15:43
—————————————————-
> For example, according to the documentation, if I send a CC message to ctlout n on Channel 4, MIDI CC #13, Value 0, Scene 1 should fire… however it doesn’t.
—————————————————-

The best I can suggest at the moment is the typical 'follow the flow' procedure.

Verify that the data is correct at each junction in the system and see where it breaks down.

Again, that may well mean digging into the Python code to add some logging or other kind of feedback info.
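
Since the scripts run inside Live, where there's no console to print to, the usual trick is to append to a log file at each junction. A throwaway helper – the log location here is just a writable temp path, put it wherever you can tail it:

```python
import os
import tempfile
import time

# Path is an assumption; any file you can watch with `tail -f` will do.
LOG_PATH = os.path.join(tempfile.gettempdir(), "liveapi_debug.log")

def log(*parts):
    """Append one timestamped line per call; safe to sprinkle anywhere."""
    with open(LOG_PATH, "a") as f:
        f.write("%s %s\n" % (time.strftime("%H:%M:%S"),
                             " ".join(str(p) for p in parts)))

log("cc received", 13, 0)  # e.g. inside the handler that should fire Scene 1
```

One `log()` call where the CC arrives and another where the scene-fire is dispatched narrows the break down to a single hop.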

Quote: marcoskohler wrote on Mon, 17 November 2008 18:34
—————————————————-

> My overall question is: to hack together/steal my own functions from other examples and .py files, what declarations/vars/etc. should I be copying over, and to WHERE?
>
> It would be great if you could walk me through implementing this into MIDIAPI. I know it’s asking for a lot, but I think once I can understand one example, the rest will flow. Thanks!
>
—————————————————-

Sorry for the delay in replying… I'm not ignoring your request, I've just been too busy with deadlines recently. I'll try to put something together over the weekend.