Cycling 74 » ableton
https://cycling74.com
Makers of Max Visual Programming Software for Artists, Musicians & Performers

Journey in Push Programming
https://cycling74.com/2013/12/02/journey-in-push-programming/
Mon, 02 Dec 2013

During the month of November, I took a little journey into a new programming area: creating content specifically for the Ableton Push control device. This hardware has a unique place within the Max community due to its tight integration with Ableton Live (and therefore Max for Live), but it is also a powerful control surface in its own right.

With help from Mark Egloff of Ableton, I started with a goal: to create a device that would be a usable performance tool, but would “take over” the button grid on the Push to make it easy to manipulate in real time. I chose an 8-band EQ-like device that I called the Frequency Mixer, and created the code necessary to run it solely from the Push. See the result (along with some video).

While this first device used Mark’s comprehensive Push programming abstractions, I really wanted to dive into the depths of the Push – regardless of how difficult it might be. Since I’m comfortable with the JavaScript implementation in Max (using the js object), I also wanted to see how difficult it would be to program an entire device based on JavaScript interaction with the Push. See the result — a MIDI-based gating device.

Next up was to work directly with the Push in Max – completely outside the Live environment. Based on some information that Mark (Egloff) provided, I was able to determine the values needed to update the Push button matrix RGB values, and created an interesting, if rather useless, 8×8 image display. I can imagine using this to modify a program based on the display values, but have left this as an exercise for the willing Push student!
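The exact values Mark provided aren’t reproduced in the article, but the commonly documented Push 1 protocol lights pads with ordinary MIDI note-on messages: the 64 pads are notes 36–99 starting at the bottom-left corner, and the velocity selects a color from the device’s palette. A minimal sketch under that assumption (the row/column convention and function name are illustrative):

```javascript
// Sketch: build a MIDI note-on message that lights one pad on the
// Push 1 button matrix. Assumes the commonly documented mapping:
// pads are MIDI notes 36-99, counted left-to-right from the
// bottom-left corner; velocity picks a palette color index.
function padMessage(row, col, colorIndex) {
  if (row < 0 || row > 7 || col < 0 || col > 7) {
    throw new RangeError("row and col must be 0-7");
  }
  const note = 36 + row * 8 + col; // 36 = bottom-left pad
  return [0x90, note, colorIndex]; // note-on, MIDI channel 1
}

// Light the bottom-left pad with palette color 127:
const msg = padMessage(0, 0, 127); // [144, 36, 127]
```

An 8×8 image display like the one described would loop over all 64 row/column pairs and send one such message per pad.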

Finally, based on feedback received on YouTube, I modified the first (Frequency Mixer) project to act on other tracks in a Live set. This way, you could either mix multiple channels, or (by inverting the values) crossfade multiple tracks from a single instance of the Frequency Mixer. This is based on the use of send and receive objects that share a specific name, which is propagated through the entire Live set. See the result — a fun extension to the original device.
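The inversion trick can be sketched numerically: if a shared control value v (0–127) is propagated through named send/receive pairs, one track scales its level by v while another uses the inverse, so a single fader crossfades between them. The function names and scaling here are illustrative assumptions, not code from the device:

```javascript
// Sketch: two tracks receiving the same shared value. One applies it
// directly, the other inverts it, so one control crossfades between
// them. Names and 0-127 scaling are illustrative assumptions.
function gainDirect(v)   { return v / 127; }         // track A follows v
function gainInverted(v) { return (127 - v) / 127; } // track B gets the inverse

// At any v the two gains sum to 1, giving a linear crossfade:
const v = 96;
const a = gainDirect(v);   // ~0.756
const b = gainInverted(v); // ~0.244
```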

While I created some specific devices and projects, the implication should be much greater – that the Push, like many other controller devices, is an interesting playground for the creative coder. Hopefully you will find tips and techniques that can help you get more extensive use out of your Push!

]]>https://cycling74.com/2013/12/02/journey-in-push-programming/feed/1Max for Live at San Francisco Ableton User Grouphttps://cycling74.com/2013/08/09/max-for-live-at-san-francisco-ableton-user-group/
https://cycling74.com/2013/08/09/max-for-live-at-san-francisco-ableton-user-group/#commentsFri, 09 Aug 2013 16:06:23 +0000http://cycling74.com/?p=258384Thursday, August 29, 2013, 7-9PM at 450 Bryant, Suite 100, San Francisco, I’ll be presenting an introduction to programming in Max for Live for the Ableton User Group Meeting. Here’s the Facebook for the event. I’ll have about an hour to explain what Max is, show how it works in Live, and offer some tips […]

]]>https://cycling74.com/2013/08/09/max-for-live-at-san-francisco-ableton-user-group/feed/0Pushing the Edit Buttonhttps://cycling74.com/2012/10/30/pushing-the-edit-button/
https://cycling74.com/2012/10/30/pushing-the-edit-button/#commentsTue, 30 Oct 2012 17:03:19 +0000http://cycling74.com/?p=21471Helpful tutorials and Max for Live projects to get you started.

Digging into Max for Live for the first time and need a little nudge? Has the edit button been calling your name? To help you get started, we’ve gathered a few links to helpful tutorials and Max for Live projects that you might not have seen.

Max for Live Devices

First of all, if you want to find some ready-to-use Max for Live devices, it’s hard to beat the high-quality ones found in the Packs section of Ableton.com, which includes Robert Henke’s Granulator and Alexkid’s Instant Haus devices.

The unofficial Max for Live community site MaxforLive.com has collected over 900 user-created devices that are freely downloadable.

Soundflower’s role in multi-track Skype group call recordings
https://cycling74.com/2012/07/24/soundflowers-role-in-multi-track-skype-group-call-recordings/
Tue, 24 Jul 2012

Lots of people use Soundflower for producing podcasts and doing various work where audio needs to get from one app to another. One great thing about Soundflower is that it is hackable, so you can customize it to your needs. A clever Soundflower user — who hosts “LuBlog” — contacted us with a great example where a group call can be recorded using Ableton Live with each participant on their own track, so the whole thing can be mixed properly in post-production.

Sufi Plug Ins
https://cycling74.com/2012/07/16/sufi-plug-ins/

A fabulous suite of free and artistic audio plugins using Max for Live, designed to challenge the Eurocentric / Western norms prevalent in electronic music software. Definitely worth checking out from both technical and cultural perspectives.

Max for Live Update
https://cycling74.com/2011/03/29/max-for-live-update/
Tue, 29 Mar 2011

In 18 months, Max for Live has already become an essential tool for artists working with live media. The combined force of Ableton Live and Max/MSP in the hands of dynamic and creative people has created a synergy that has been amazing for us to watch.

Today, with the release of Ableton Live 8.2.2 and Max/MSP 5.1.8, Max for Live receives its most significant update since the initial release. We’ve listened to feedback from users and put our heads together with Ableton to come up with some exciting new features, devices, and lessons. Visit the Max download page to get the 5.1.8 update and the Live download page to get 8.2.2. Don’t miss the featured devices and special offers on the Ableton Max for Live page.

New Max for Live Lessons

Max for Live now ships with six new Live Lessons (17 total) that will help you get started using the various features available in Max for Live. Explore cool Live API tricks, making your own LFOs, multichannel routing tools, and more in these clearly documented and interactive Lessons, complete with Max Devices and example sessions.

Featured Devices

Ableton has selected exceptional user-created devices that highlight the range of possibilities available with Max for Live. To download these devices and learn more about the artists behind them, visit Ableton.com.

Pluggo Devices

Included in Max for Live is a selection of the most popular plugins from our much-adored, but now defunct Pluggo package. For this update, we’ve combed through every single Pluggo device to make the patches more readable, better looking, and more consistent in patching style. Now you can dig in and repurpose that Space Echo, and make it work the way you want it to. If you are learning to build your own instruments and audio effects, the Pluggo devices are a rich source of inspiration and clues.

Live API Tools

The Live API allows you to control various aspects of your Live Set from within a Max Device. For this update, Manuel Poletti created a collection of attractive new devices and Max abstractions to make using the most popular features of the Live API much easier. Now you can add an LFO, automate clips, and navigate through your Live Set with ease, all from within a Max Device. Combined with updates to the Live Object Model (including access to Racks and Persistent IDs) these new tools open the door to some serious Max for Live magic! To learn about some of the new Live API features in this update, be sure to check out our new Max for Live video tutorials.
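Under the hood, Live API objects address parts of the Live set with path strings following the Live Object Model, e.g. "live_set tracks 0 devices 1". A tiny illustrative helper for composing such paths (the function itself is hypothetical; in a device the resulting string would be handed to live.object, live.remote~, or the js LiveAPI object):

```javascript
// Sketch: compose Live Object Model path strings of the kind that
// live.object and the js LiveAPI object accept. The helper name is
// illustrative, not part of the Live API itself.
function lomPath(trackIndex, deviceIndex) {
  let path = "live_set tracks " + trackIndex;
  if (deviceIndex !== undefined) {
    path += " devices " + deviceIndex;
  }
  return path;
}

lomPath(0);    // "live_set tracks 0"
lomPath(2, 1); // "live_set tracks 2 devices 1"
```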

Max for Live Video Tutorials
https://cycling74.com/2011/03/29/max-for-live-video-tutorials/
Tue, 29 Mar 2011

In addition to new devices and lessons in the Max for Live update (Max 5.1.8/Live 8.2.2), there are a couple of new Live API features we would like to share with you in these short video tutorials.

Many Max for Live devices utilize the Live API to control and keep track of different parts of a Live Set. Due to popular demand, these mappings are now saved within a Live Set and saved Racks, making it easier to use these smart devices in your Set.

Another often-requested feature was the ability to easily map live.object and live.remote~ to whatever is currently selected by the mouse. This is similar to how Live’s MIDI-mapping system works, and it made sense to extend this functionality to the Live API.

We’ve also added a new Max object called live.thisdevice. Max for Live users wanted the ability to keep track of the various states of their device, and know for certain when their device is completely loaded and ready to go. The new live.thisdevice provides this info and more.

An Interview with Tom Erbe
https://cycling74.com/2011/02/02/an-interview-with-tom-erbe/

Audio Wizard Tom Erbe is a generous guy. His SoundHack program is a legendary and beloved tool for mangling sound, and he gives it away for free. Now he has made VST plug-ins, so I called him up to see what the man behind this benevolent act was like. I found a funny and wise educator and musician who loves what he does.

So where did you grow up?

I grew up in the Midwest, in Milwaukee and Chicago, and went to school at the University of Illinois Urbana-Champaign. I got involved with music technology pretty early on. My great uncle was a radio engineer at WCFL, and my grandpa was a police radio operator, back in the ‘30s. My great uncle gave me an oscilloscope and microphone when I was about ten or so.

So I ended up at a high school in Illinois, which happened to have a radio station, and as soon as my friends found out that I soldered, and knew about microphones, I became the technical director there when I was 15.

I had a nice crowd of friends in high school that were very into music. We all had our own radio shows, and we all were always competing to find the more obscure music. The weekly trip to the record store was the basis of high-school social life for me.

Did you play an instrument?

Not really. I really got into things from being a DJ. Also, being the guy who could fix all the gear, as well. We used to do a little outdoor fair that the community put on, and our radio station would go there and put on the music. I would be the one climbing the telephone pole to hang the speakers. It was all fun, and it was just getting together with friends who were really interested in music.

It seems so unusual for a high school to have a radio station.

Yeah. This was back in the ‘70s, and back then there was a provision by the FCC for a Class D radio station license. It allowed small organizations to have low powered radio stations, under 10 watts.

So there used to be a lot of high-school radio stations in the ‘70s. I was lucky, because I could get into music technology very early.

Then, when I was finishing high school, I found a recording studio that I was able to intern at for a bit. I learned quite a bit there as well. They let me play with the mixing board, and some of the equipment.

When I got into college, I went into computer science and music. At the same time, I worked at a record store, DJ’ed at a couple radio stations, interned at Faithful Sound Studio, which was where Mark Rubel worked. He runs Pogo Records now, a recording studio in Champaign.

I learned quite a bit then. I also played synthesizer in a band. I just got really, really interested in electronic music. Of course, as my tastes got more adventurous, I got into weirder, as well as more serious, electronic music.

So yeah, after I got out of college, I found a job—I found that there was an opening at the Computer Audio Research Lab at UCSD, and took it. It was just perfect for someone with a computer science degree and a music minor.

What year was that?

This was 1984.

That must have been quite a culture shock, coming from the Midwest.

Yeah, I guess so. [Laughs.] The options in the Midwest for jobs at the time were very conservative, or at least that’s what it seemed like. Someone with an engineering background was going to work for an industrial company. I wanted to get into something more related to music. So it really seemed like quite the right thing to come out to California.

It must have been amazing. There were a lot of really interesting composers that came through back then.

Oh, yeah. A lot of people were visiting. Gordon Mumma was here for a good long time. That was very fun. John Cage visited. Of course, Roger Reynolds was here—and still is. This was also the time when the Computer Audio Research Lab [CARL] was here, at the Center for Music Experiment. We were developing a lot of software tools for signal processing that all ran on the mainframe computer. The music software all ran in non-real time.

Mark Dolson was at CARL. He developed sound file convolution and phase vocoder software—a lot of cool stuff. And Dick Moore, who developed C Music, which was one of the more interesting Music 5 languages in C.

So there was a lot of good software development going on, and I just spent all my time trying to figure out how all of it worked. [Laughs.] I worked on a project developing a real-time pitch detector for electronic violin. Also more basic things like a MIDI interface for a Sun workstation. Which, at that point, we thought was sort of a nice, small, compact computer. [Laughs.]

We were trying to get things working in real time, but at the same time, there was a lot of interest in research with signal processing. I worked there for about three years, but I was itching for a place that was even more creatively active. In ’87, a position opened up at Mills College for the technical director, so I went for that position.

That was such an exciting time in the Bay Area.

It really was. I was at CCM [Center for Contemporary Music at Mills College]. Anthony Braxton was there, David Rosenboom, and Larry Polansky; Chris Brown had just started working. The Hub was there. Bob Ashley came by for a couple of years and I recorded and played synthesizer on his album, Improvement.

So that’s where I started developing the software that became SoundHack, back in ’89.

What was your philosophy behind it? Or did it just evolve organically?

Well, a lot of the fascinating things at UCSD, it seemed to me, were unavailable to people who couldn’t have a mainframe. So for a couple years, I really tried hard to get a mainframe-type computer at Mills and only got so far. I ended up with sort of a cast-off Hewlett Packard Bobcat computer, and I got a lot of things working on it.

Then suddenly the Mac II came out, which had a floating-point processor, and it was the first computer that could actually run serious signal processing, because it did have floating point built in. The earlier Macs didn’t, so it was completely impractical to do anything on them.

So at that point, I thought, “Well, maybe I should learn how to program the Mac, and bring some of these interesting things to the Macintosh.” And that’s what I went ahead and did. I really love the sounds that you can get out of convolution, and out of the phase vocoder.

It took a couple years. I think the first version of SoundHack came out in ’91. And then I spent maybe five years just continually developing and updating it and adding new processes, always as a standalone application.

Why the decision to make it free?

I didn’t think anyone would want to pay for something that took a whole day to process three minutes of sound. I was excited about the software, and wasn’t really thinking about marketing.

This was before there was any sort of notion of open source software—at least it hadn’t hit me yet. I just wanted to get something out there that would be helpful to experimental musicians, and would help people make a lot of different sounds.

What’s your relationship with Max/MSP?

I’m a teacher, computer music developer, recording engineer and occasional musician. I use Max/MSP and PD [Pure Data] in all of those roles.

At UCSD, I teach the fundamentals of music synthesis. It’s extremely helpful to have a program that is modular, that allows me to show the architecture of an oscillator, or a filter, for instance, and show students how to build these things up from small components and into a hierarchy.

My other relationship with Max/MSP is that I am a plug-in developer. The reason I use Max/MSP is to prototype all my software. For example, three years ago, I designed a bunch of delay effects for VST/RTAS/AU, one of which was just called +delay, but it’s really based on the old multi-head, rotating delay lines that were out in the early ‘60s. I prototyped everything in Max, and I wanted to give this delay analog-like behavior, so I needed to do some sub-sample interpolation. I wanted to put some sort of tape saturation as well as some nice filtering in the feedback path, so I could emulate the high-frequency loss in tape. But also allow people to go farther than that.
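The sub-sample interpolation Erbe mentions can be sketched as a linear-interpolated read from a circular delay buffer, the basic trick behind smooth, analog-like delay-time changes. This is a generic illustration under that assumption, not his +delay code:

```javascript
// Sketch: read a circular delay line at a fractional sample offset
// using linear interpolation between the two nearest samples.
// Generic illustration, not the actual +delay implementation.
function readFractional(buffer, writeIndex, delaySamples) {
  const n = buffer.length;
  const readPos = writeIndex - delaySamples; // may be fractional
  const i0 = Math.floor(readPos);
  const frac = readPos - i0;
  const a = buffer[((i0 % n) + n) % n];       // wrap into the buffer
  const b = buffer[(((i0 + 1) % n) + n) % n];
  return a + frac * (b - a);                  // interpolate between taps
}

// A ramp buffer makes the interpolation visible:
const buf = [0, 1, 2, 3, 4, 5, 6, 7];
readFractional(buf, 7, 2.5); // halfway between buf[4] and buf[5] -> 4.5
```

Tape-style saturation and high-frequency loss would then be applied to this interpolated signal inside the feedback path.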

So I built this all up as a huge patch, before I ever went to the C compiler. Then after I built the patch, I was able to quickly go into C and build a plug-in out of it. Using that process, I built a pitch-shifting delay, which uses the classic, multi-head technique for pitch shifting. I also built a granular delay, where the delay line is being sampled with a grain stream.

So it’s become my process, now, I guess for the past three or four years, just to build everything, in either Max or PD first. Once I’m convinced I have something that sounds good, then I implement it in C, and maybe do some refinements.

You came out with some plug-ins for Max for Live?

I did. I’ve programmed about 15 plug-ins now. I’ve been doing them since, oh, I guess since about ’99, 2000. I found a lot of people were using my plug-ins within Max, using the VST~ object. So they were using my Decimate plug-in, or my Binaural plug-in, and I thought that was possibly a little inefficient for them, because the GUI does take up some CPU.

Also, within Max, you can get to parameters much quicker than you can through the VST~ object—at least more direct access.

So this last year, me and a couple grad students and undergrads ported all of my plug-ins to Max/MSP as externals. So now those are all running under Max/MSP and Max for Live.

Have they been popular?

I don’t know. [Laughs.] It’s really hard to say. I never look at how many people download it. I see a lot of people talking about it, and a lot of people saying, “Hurray!” when it got announced. That’s one thing about developing software, when you give your software away for free, you don’t really get a lot of feedback. So I think they’re popular. I should check how many downloads there are.

But people were definitely asking about them a lot before they came out. Then, when they came out, the announcement got retweeted quite a bit. [Laughs.]

I retweeted it!

There ya go. It seemed to go up on every blog. But I don’t know if that indicates popularity. I do hear from time to time that, oh, I use SoundHack’s externals for doing this or that. So I get feedback here and there, but I don’t spend all of my time looking at other people’s blogs or press releases.

Good for you. But that can be a hard thing to resist.

[Laughs.] I’m usually focused on the new software. So I’ll assume that people like it. If they do send me some notes back, it’ll be encouragement for me to do more. So I guess I actually would like to hear whether people are using them or not.

They are sort of different than other externals for Max in that they’re very complete processes. They’re more like stand-alone effects, studio effects, than they are externals. So I’m really not sure how that would gel to a typical Max user.

At first, I thought there was no reason to turn my VST plug-ins into externals. I thought, well, a Max user could just build them themselves, so I don’t need to do that for them. But then when I found enough people using the plug-ins, I figured, well, maybe it would be nice for them to have some convenience.

Especially the ‘Max for Live’ people. They just want to get going—fast.

Yeah, definitely. As Max gets more popular, and Max for Live gets more popular, there’s a wider variety of users, and some who don’t want to sit there twiddling so much, or programming so much. So I imagine there’s a need, but I haven’t had a lot of feedback yet.

What are you working on right now?

Right now I’m working on a set of plug-ins that are based on the classic phase vocoder algorithm, which I feel has really not been explored enough in commercial plug-ins. I’m working on a real-time time stretcher, which takes a real-time stream of sound and captures multiple windows from the incoming stream, and layers a time-stretched output. I’m still developing it, but it looks like it’s going to be a really nice way to develop a big, ambient, stretched sound out of any incoming sound.
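The layering he describes rests on a simple hop-size relationship: analysis windows captured every Ha samples are written back out every Hs samples, so the output is stretched by Hs/Ha. A sketch of that window scheduling (names and parameters are illustrative, not Erbe's implementation):

```javascript
// Sketch: schedule analysis/synthesis window positions for a phase
// vocoder time stretcher. Windows captured every analysisHop samples
// are laid down every synthesisHop samples, so the output duration
// grows by synthesisHop / analysisHop. Illustrative only.
function windowSchedule(inputLength, windowSize, analysisHop, stretch) {
  const synthesisHop = Math.round(analysisHop * stretch);
  const positions = [];
  for (let read = 0, write = 0;
       read + windowSize <= inputLength;
       read += analysisHop, write += synthesisHop) {
    positions.push({ read, write }); // where to grab and where to overlap-add
  }
  return positions;
}

// 2x stretch: each window is written twice as far apart as it was read.
const sched = windowSchedule(4096, 1024, 256, 2.0);
```

The real-time version he describes would keep pulling windows from the incoming stream and overlap-adding them at the slower synthesis rate, with per-bin phase correction keeping the layers coherent.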

In real-time? How exciting.

I’m also working on some stuff with pitch. Pitch shifting out to the ridiculous. I always like taking algorithms beyond the beautiful, to the point where it gets noisy—from sublime to ridiculous. I’ve also done a pitch-shifting vocoder with it, which is sounding really nice.

Then there is also a phase-vocoder looper that I’m developing. It’s sort of like a conventional looper, but it will be using phase-vocoder style pitch shifting and time stretching on all of the loops.

I’m at the difficult part, to make this thing fun to play with, interactive, able to lock it to beat, and all those kind of good things. That’s what I’m doing right now.

I expect to be finished with these, hopefully some time in the next couple of months.

That’s really exciting. Those should get a lot of attention.

Especially with experimental and fringe electronic music getting bigger and bigger.

Max for Live Wiimote Instrument
https://cycling74.com/project/project69-max-for-live-wiimote-instrument/
Thu, 15 Apr 2010

Made this vid to demo a Max for Live device my evil twin made. He’s not really British. He just thinks it gives him street cred.

]]>https://cycling74.com/project/project69-max-for-live-wiimote-instrument/feed/0Inanitiahttps://cycling74.com/project/project58-inanitia/
https://cycling74.com/project/project58-inanitia/#commentsThu, 01 Apr 2010 17:28:52 +0000http://cycling74.com/?p=4629Number theory are a three piece band fusing elements of electronic, jazz, electro-acoustic, and rock music. Inanitia will be our 3rd e.p. release and will be accompanied by the release of a video for Tabula Rasa.

Number theory are a three piece band fusing elements of electronic, jazz, electro-acoustic, and rock music. Inanitia will be our 3rd e.p. release and will be accompanied by the release of a video for Tabula Rasa.

]]>https://cycling74.com/project/project58-inanitia/feed/0Recomposer: recombine tracks into new mixeshttps://cycling74.com/project/project56-recomposer/
https://cycling74.com/project/project56-recomposer/#commentsWed, 31 Mar 2010 15:44:28 +0000http://cycling74.com/?p=4595Recomposer, a device built with Max for Live, allows the user to generate new pieces by algorithmically hybridizing and recombining tracks from existing pieces in Ableton Live. Melodies and rhythms are automatically reworked note-by-note to create coherent new mixes. The Recomposer device creates a new mix (scene) composed of modified parts (clips) drawn from other mixes in the […]

Recomposer, a device built with Max for Live, allows the user to generate new pieces by algorithmically hybridizing and recombining tracks from existing pieces in Ableton Live. Melodies and rhythms are automatically reworked note-by-note to create coherent new mixes.

The Recomposer device creates a new mix (scene) composed of modified parts (clips) drawn from other mixes in the same Live set. Analysis and generation of note content occurs within an algorithmic music engine which collects note data from the Live set, and injects the new notes back into a new scene within the Live set.

]]>https://cycling74.com/project/project56-recomposer/feed/-1TouchControl – Wireless Control for Ableton Live & M4Lhttps://cycling74.com/project/project44touchcontrol-wireless-control-for-ableton-live-m4l/
https://cycling74.com/project/project44touchcontrol-wireless-control-for-ableton-live-m4l/#commentsWed, 17 Feb 2010 17:12:34 +0000http://cycling74.com/?p=4355Author: Christian Blomert TouchControl combines the advantages of Max 4 Live and TouchOSC to create an automapping control interface for Ableton Live that runs on iPhone / iPod Touch. Including Clip-Launcher with named clips / playing status. Device Controls, Mixer and more.

Author: Christian Blomert

TouchControl combines the advantages of Max for Live and TouchOSC to create an automapping control interface for Ableton Live that runs on iPhone / iPod Touch. It includes a clip launcher with named clips and playing status, device controls, a mixer, and more.

EM Reviews Max for Live
https://cycling74.com/2010/01/25/em-reviews-max-for-live/
Mon, 25 Jan 2010

In this article, Jim Aikin reviews the new add-on product to Live, developed by Ableton and Cycling ’74, with a detailed account of his experience. If you are new to Max for Live, this is a helpful introduction before downloading the demo and trying it out yourself.

A Video Processing Device for Max for Live
https://cycling74.com/2010/01/07/a-video-processing-device-for-max-for-live/
Thu, 07 Jan 2010

While many people are looking at Max for Live as a great way to integrate their favorite hardware controllers, build really unique effects, and add variety to their productions, I was eager to explore what could be done with video inside of Max for Live.

I have collaborated before with musicians who work exclusively inside of Ableton Live, so it struck me as a huge advantage to be able to build a triggered video playback and live processing system that works natively inside of Live. Assuming you could keep the overhead low, it might even be practical to run both audio and video from a single Live set. To test this idea, I went for the most obvious solution, which was to use my Video Processing System patch as a starting point. What follows is a document of the process of getting a Jitter instrument working inside of Max for Live.

The Plan

In Max for Live, we have the option to create an Audio Effect, an Instrument, or a MIDI Effect. For musical or sound-making devices, the choice is usually pretty clear depending on what sort of input and output you need. Since we are making a device that isn’t outputting audio or MIDI information, it isn’t obvious which type of device to use as a template. For this project, I decided to create a Max MIDI Effect, since it allowed me to get MIDI input and pass that MIDI along to other devices if I needed to. If we wanted to take audio input, it might make more sense to create an Audio Effect instead. The plan was to create a basic system where incoming MIDI notes would trigger different videos, and all of the controls would be mappable to MIDI and Live’s internal modulation system. I also knew that several features of the VPS patch weren’t going to be necessary, like the LFOs and the QT movie recording features. The rest would be simple copy and paste, with a little reorganizing.

A Note About Jitter in Max for Live

If you are a Max for Live user, but don’t own a full copy of MaxMSP/Jitter, you may notice that jit.window and jit.pwindow objects have an intermittent overlay while editing your patch. This is the only hindrance to working with Jitter in Max for Live, and your windows will be fully functional inside of Live itself. This means that you can build and create Jitter-based devices, but you won’t have seamless window output until you save and go back to Live. Jitter owners have no such hindrance.

Another important thing to remember when using Jitter inside of Live is that it is very easy to create duplicate devices in your Live set and create naming conflicts for things like Jitter windows, textures, or matrix objects. You may have to take some extra care with this, or use “—” before the names of windows and such to create device-unique names.

Getting out the Shoehorn

The first obstacle in getting a fully functional patch into a Live device is getting the UI to fit inside the fixed-height device view. Without Presentation View, this would be nearly impossible for any reasonably complex patch. The first step is to go into the Patcher Inspector and turn on “Open in Presentation” so that the device shows up in Live with the organized presentation view. Still, given the vertical orientation of the original patch, some serious redesign is going to be necessary.

Before I get to that though, we’ll start by replacing several of the standard Max UI objects with Live UI objects. These objects, in addition to looking at home inside of Live, allow us to use Live’s native modulation, mapping, and preset-saving system. Since many of these objects also have built-in labels, a lot of the comment boxes in the patch could be eliminated. To replace all the sliders we first drop a live.slider object into the patch and open the inspector. Inside the inspector, we can set the Modulation Mode to Absolute, which enables clip modulation for the parameter. There are several different modulation modes, with Absolute being the most straightforward since it maps directly to the value of the modulation curve. This will give our new sliders similar settings to what we originally had, and allow for modulation of the slider value using Live’s automation system. Once we’ve adjusted all the settings in the inspector, we can copy the generic live.slider, select the normal slider objects and then Paste Replace. Now all we have to do is go into each inspector and set the parameter names appropriately. From here it is a simple matter of combing through the user interface, replacing UI objects with their Max for Live equivalent, where appropriate.

Now that everything is looking more Live-like, we can begin reorganizing UI elements, condensing things down and ditching unnecessary labels to fit the limited space in our Presentation View.

Bringing in the MIDI

Since this is a MIDI Effect, we’ll use the MIDI note input to change the movies in the movie player module. Since the VPS ‘mbank’ module was already designed with a simple message interface, a ‘step’ message to jump to the previous, next, or random movie clip, or a number to jump to a specific index, parsing MIDI notes is pretty easy. Using a split object, the first three MIDI notes are routed to ‘step’ messages, while the rest of the note scale is used to jump to specific movies. This makes it really easy to create MIDI clips in Live that drive the movie selection.
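That note-routing logic can be sketched in plain JavaScript. The function name and the index offset (note 3 selecting movie 0) are my own assumptions for illustration; in the actual patch this is done with a split object feeding the mbank module:

```javascript
// Sketch of the MIDI-note routing described above: the first three
// notes become 'step' messages (previous / next / random), and every
// higher note jumps to a specific movie index.
function routeNote(pitch) {
  const steps = ["previous", "next", "random"];
  if (pitch >= 0 && pitch < 3) {
    return { message: "step", argument: steps[pitch] };
  }
  // Assumed offset: note 3 selects movie index 0, note 4 index 1, etc.
  return { message: "index", argument: pitch - 3 };
}
```

A MIDI clip holding a few notes at the bottom of the scale would then step through clips in rhythm, while higher notes recall specific movies.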

The Test Drive

Now that we have a more tightly packed and concise device, it’s time to give it a try inside of a Live set. To do that, we simply drop it onto a MIDI track, click “Load” to set the folder of movies, activate the camera (see previous VPS articles for a thorough explanation of the patch), and turn on rendering. Once we verify that we are able to make it work manually, it’s time to try piping in some cues. For that, we double-click an empty clip slot to create an empty midi clip and draw in some notes at the very bottom of the scale. When we activate the transport and launch the clip, our movies should be changing in time with the rhythm.

To add a little extra excitement, we can modulate the parameters with the MIDI clip. To get started, activate the Envelopes button in the MIDI clip view and select one of the device parameters from the drop down menu (blur is a fun one). Adjust the automation curve by clicking and dragging. Repeat for any other parameters you’d like to automate. Enjoy.

Controller for Ableton Live with Max for Live as interface
https://cycling74.com/project/project6/
Mon, 23 Nov 2009

I needed my own controller for Ableton Live. I built it with a DIY mindset, and I’m using Max for Live as the interface between Ableton Live and the hardware.

The Edit Button Has Been Pressed
https://cycling74.com/2009/11/22/the-edit-button-has-been-pressed/
Sun, 22 Nov 2009

Even before the Max for Live beta was opened up to the public, a community of testers was hard at work putting Max for Live through its paces. The integration of Max into the Ableton Live environment opened up a whole spectrum of possibilities that many users hadn’t even considered until now, and it didn’t take long before our beta testers were begging to show off their new projects to the larger community.

Once the veil of secrecy had been lifted, we saw an unprecedented flurry of blog posts, screencasts, Twitter updates and even whole websites devoted to the software and what people are doing with it. There was already a vibrant and engaged community of users developing around Max for Live before the product was even released.

Alternative Controllers in Live

One of the areas where there seems to be an explosion of development is the integration of various alternative controllers into Live sets. A lot has been said about the new Akai APC40 and Novation Launchpad, but Max for Live can benefit a much wider variety of controllers. Ableton Live has a fairly friendly interface for mapping MIDI controllers, but users have increasingly desired interfaces for things like Open Sound Control, serial data, HID devices, and other non-MIDI protocols. With mature devices like the Monome, JazzMutant’s Lemur, and others using OSC as their native communication protocol, Max for Live presents a welcome set of tools for use with Live. Monome users can expect to see many of their favorite patches ported to Max for Live in the near future, if they aren’t already.

Now, Monome and Lemur aren’t the only controllers that will benefit from Max for Live. Here is a screenshot of a patch being developed by Vlad Spears for the Snyderphonics Manta controller:

Vlad says:

“Honeycomb maps incoming pad presses from the Snyderphonics Manta to a latticed tonal system of midi notes. This lattice turns the Manta into an isomorphic keyboard, allowing consistent chordal shapes to be played anywhere on its surface. A minor chord is always created with an inverted triangle having its root note in the upper left corner, while a major chord is always a triangle with its root at the top of the shape. This isomorphism makes it easy to play the Manta and produces surprising melodies.”

Vlad is also busy working on devices for Monome control, porting his popular Daevl.Plugs, and all sorts of wonderful sounding experiments for Max for Live.

Livid Instruments has also been working on integrating their new controllers Block and Ohm64 with Max for Live devices.

For those individuals who like to create their own unique experimental controllers, here is a video from “Liubo” showing an Arduino connected to Max for Live:

Of course any device communication that natively works inside of Max should work the same inside of Max for Live, so gaming controllers, multitouch clients, serial boards and custom MIDI controllers of all varieties should work just fine. Combine this with the Live API objects and hardware controllers for Live are about to become way more interesting.

Pluggo Plugins Reborn

Included with Max for Live are 40 of the most popular Pluggo plugins ported as editable Max devices. This is the result of a labor of love to clean up the old patches, add comments where needed, and generally spruce up the user interfaces. Admittedly, even after years of Pluggo updates, the plugins themselves were looking pretty dated. Max for Live provided an excuse to weed out the best plugins and modernize them a bit, taking advantage of some of the features that are specific to the Live integration. Also, since all Max for Live devices are editable, users are welcome to take them apart, reuse bits of the patches, or look at them as some really quirky examples of patching.

Peculiar Instruments, Custom Effects, and Email

It is tempting to focus all of our attention on the grand productions that people will build in Max for Live, the fully realized synths and complex sequencers. As Robert Henke noted in a recent interview on this site, “MaxForLive allows people to solve their very individual problems with a high degree of elegance.” A lot of the devices produced in Max for Live are likely to be very simple solutions to very unique problems. To help you on your way, there is an impressive set of Max building blocks and tutorial devices developed by Manuel Poletti included with the Max for Live content. The building block devices show just how effective simple Max patches can be, and also offer less experienced users a set of well-designed demonstrations of common audio and MIDI processing approaches. Also included are the Big 3 devices developed by Darwin Grosse – Buffer Shuffler, Step Sequencer, and Loop Shifter – more complex devices that you can play with right out of the box. There will certainly be plenty of new toys to keep you busy.

During the beta process, long-time Max user Nick Rothwell decided to take off in a completely different direction and came up with some really clever devices. Since Max provides access to all sorts of programming and scripting languages, there is a potential to integrate any sort of scripting API into a Max project. Eager to test this idea, Nick decided to be the first user to create an email-reading device:

He then followed up with a Twitter client as well, using Java and Python libraries:

As Nick’s experiments suggest, we are bound to see all sorts of strange pairings once people start using Max for Live and exploring all of the options. As Robert says, “People will come up with ideas which totally exceed what any one of us would imagine people would do.” So, what will you build?

An Interview with Robert Henke
https://cycling74.com/2009/11/21/an-interview-with-robert-henke/
Sat, 21 Nov 2009

Robert Henke is a brilliant electronic musician who records and performs under his own name and also as Monolake. His music has been described as minimalist yet complex techno with an architectural sound. For me, his music is very spatial and multi-dimensional. I find it takes me on an extraordinary journey through space and time, similar to a great work of fiction. Henke recently said, “The last century was about the creation of electronic music. This century is about performance.”

In recent years Henke’s work has strayed outside the boundaries of clubs and CDs. He performs amazing surround concerts and has begun to create densely layered, immersive multimedia installation works that have shown internationally. Henke is relentless in his drive to design the perfect, unique tools for each project. He is one of those rare artists who is also a brilliant designer.

Henke is also active in creating tools for other musicians: since the founding of the company in 1999, he has been part of the development team of Ableton Live. In 2003, Henke began an evolving design for a performance controller that came to inspire the Akai APC40 control surface. He has also been involved in the development of Max for Live.

Henke has worked in Max/MSP for over 15 years. I caught up with him at Expo ’74 to talk about the interesting journey that he has taken.

What brought you to use Max/MSP?

Well, I’m not a musician, I can’t play an instrument, and I really like the idea of a machine doing things. I like step sequencers. I like statistical and stochastic functions for creating clouds of sounds, and clouds of events. I see myself more as someone who is writing a structure, and the structure then creates the music. And Max is pretty much the ideal tool for this kind of thinking.

How would you describe your patches? Are they messy, are they organized, are they minimal, are they maximal?

This changed over the last 20 years. My first patches were very small, because I worked on a Macintosh Plus, which was extremely limited. Then my patches became very big, because I tried to solve every possible problem with one single, big patch. Then my patches became very small again. What I do these days is that I try to come up with a patch which solves 90 percent of what I want to do, and I try to do this within a very short amount of time, so that I spend more time actually making music than programming Max.

It’s a tricky thing with Max: You can spend weeks in refining a patch, and afterwards, the initial idea is pretty much gone. Now you have the perfect tool, but you’re not interested in using it anymore. So I try to make small patches, which do a lot of good things.

There was a step sequencer, which had been developed by Gerhard Behles and myself, and this step sequencer had a few features which later found their way into Live. For instance, the fact that you can switch patterns for each track individually. And a lot of the effects in Ableton Live I prototyped in Max, like the Grain Delay, which is a classic type of Max patch. The Chorus was a Max patch, as was the Waveshaper/Saturator. The Operator synthesizer started as a Max patch, at least in part. Live itself was written from scratch in C++. But I like the idea that it’s a Max patch, because everyone who is into Max obviously understands that this is impossible. [Laughs] It’s a nice urban myth.

Do you have any favorite object?

Favorite object? Well, in the old days, the recipe was pretty much: Table, Counter, and Random… Metro!, of course! I think Metro, to me, is the Max object of choice, if I have to choose one. It just bangs regularly by itself. [Laughs]

Do you make your own objects?

No. Maybe 15 years ago I dove into the SDK, and I thought, let’s go hard-core. But then I realized pretty quickly that what I really want to do is make music. And there’s already such an abundance of tools. I really don’t see why I need to invent yet another tool at this time.

What do you think’s going to happen in the next few days, at this Expo ’74 conference? What do you hope to get out of it?

Well, I have a kind of official task to do here. I’m looking for interesting people who might at some point contribute content for MaxForLive. Or would be interested in working for Ableton. That’s just the very straightforward answer. I’m also hoping to get inspiration, ideas for my own creative works. In general, to hang out with people who can expand my idea of what’s possible. Because Max is a language and everyone is using that language in a different way. I’m always amazed when communicating with other people who are using Max, because we are all using the same tool, but we achieve totally different results. And that’s very interesting, and very inspiring!

Regarding Max, I think what I really like about Max is that it has the potential to change the way you think about music. It did this for me. It really freed me from this linear idea of something which has a start and an end. And this was a total liberation. I realized more than before that what I’m interested in in music is some kind of constantly changing, endless state, and Max is the ideal tool for that. It’s actually a tool for sculpting music, much more than for recording. A Max patch looks like an abstract painting. The music you make with Max is potentially endless, and it all somehow fits together.

Do you have any tips for somebody new to it?

Yeah: Try simple things. The biggest mistake one can make is trying to come up with the one single patch that does everything. Make a simple patch, and try to work with that simple patch for a period of time. Because making a tool is one thing, but mastering a tool and working with a tool is another thing. And if you’re constantly creating a new tool, you will never master it. And that’s a danger, in Max, that the Edit button is so close. [Laughs] Maybe people should sometimes just use Max Runtime for a month. Cycling ’74 should force people to use Max Runtime for one month a year, so they can’t make new patches, they just have to use what they already did. [Laughs] It’s kind of a responsibility of Cycling ’74 towards the development of art. [Laughs] So it would be the ‘Max Play Days’. You can think about Max Play parties and stuff like that.

Do you still play in clubs a lot?

I do, I’m touring. I play in clubs. And I more and more enjoy extending what you can do in a club. The more ‘famous’ I get, the more I try to find out how far I can push it. So the last thing I started was that I decided I’d like to play 4-channel live sets, even in a club. There are people who say, oh, it’s a club, it’s mono anyway, no one cares if the snare comes from there, from there, or from there. But, as a matter of fact, since my music is so much about atmospheres, and the beats are just one part of it, the atmospheres tremendously benefit from four channels. So this is one way how I try to expand the club idea. Another idea is that I’m really trying to perform ‘live’ as much as possible with a lot of control over my live set. And there Max plays an important role, because Max works as the bridge between Ableton Live and my hardware controller. So yeah, I try to make club music, which is not club music. And that’s what I really like when performing; Presenting people with something they usually do not hear in a club, and still make sure they can dance. That’s a challenge, but it’s also very satisfying.

I really enjoy being a part of this event here, and I strongly believe that the integration of Max and Live will create tremendous possibilities. And it will do so because it simplifies things. The German computer pioneer Konrad Zuse, who I think built the first working computer based on tubes, in Berlin in the 1930s, once said that ‘not the most advanced ideas, but those who create the most immediate and simple results are the ones which succeed.’ And Max for Live will simplify a lot of things and therefore will enable more people to do outstanding things with easier access. Therefore people will be encouraged to explore new ideas, and this will definitely have an impact on what people will do artistically.

So this was the positive side of Max for Live. The negative side of Max for Live is that people might project a lot of expectations. I have the feeling that for a lot of people who are not really deeply into Max, it’s this kind of secret weapon, and they believe it’s the Holy Grail of everything and once you know Max, you can do everything. And of course that’s not true, because there are all kinds of little limitations, and awkward things, like in any environment. I guess what will happen is that at first people are extremely euphoric, then a lot of people will become really upset about the limitations, and afterwards, people will realize that it’s cool nevertheless, and will be extremely happy. That’s the curve I know from every Live release. Total anticipation, and euphoric statements, then total frustration, and afterwards people just use it and love it. MaxForLive will be pretty much the same thing. The one thing I’m really curious about is what people actually are going to do with it. Because the possibilities of this combination are really beyond anything else I could imagine. I believe we will see stunning results, and we will see surprising results. People will come up with ideas, which totally exceed what any one of us would imagine people would do. We can already see people doing stuff with Live itself that we would never have come up with. And the combination of Max and Live is just like a new universe.

I started working more with Max, actually. I had a period where I was fed up with programming, and I just wanted to make music, and I felt that making Max patches is a waste of time. And this phase is over, since maybe a year or something, I enjoy making Max patches again. And it was independent of the release of Max 5. It just happened that I again felt there’s a way of expression possible in Max that I was missing. Now I’m kind of back to Max. And then MaxForLive finally came true, so it all works very well together. I actually can’t wait to use MaxForLive more. It’s fun.

Explain what MaxForLive is.

Well, MaxForLive is a version of Max, which runs inside Live. This combines two very different applications into something new and exciting. The benefit of Max which runs in Live is: if you come from a Max perspective, you have access to features which are difficult to realize in Max. Max is really good for things that have nothing to do with a timeline at all. Live is really good in dealing with timeline-based operations, because we have all this stuff there. So if you want to control a Max patch to create a change over a long period of time, and you need a timeline, MaxForLive is a very good answer for that. It frees the Max user to do something that is hard to do in Max. From a Live perspective, MaxForLive opens up the possibility to create your own effects, and to create your own synthesizers. It also allows you to control Live in a new way, and therefore extend the functionality of Live; it helps to customize Live. We realize that a lot of our customers have very individual ideas about how they would like to use Live. The problem is, if you asked 100 people what they want, how we should continue developing Live, you’d get 100 different answers. And we just cannot fulfill every need for every person. It would just be impossible. MaxForLive allows people to solve their very individual problems with a high degree of elegance.

To give an example: in a live situation, you might want the color of a clip to change once you have played it, so that you actually realize, oh, I played this clip before. That’s a feature that totally makes sense from a live performer’s perspective, but it’s certainly not a feature we would ever implement in Live ourselves, because it’s such a special feature. It’s a special request. It’s not something that you would want to see in a menu of Live. But with MaxForLive, this is a task that is extremely simple to do. So you play on stage, and all your clips are green, and you play a few clips, and afterwards the clips are orange. You know, OK, I played these clips already. This is a very simple patch in MaxForLive, which enhances the functionality of Live in a totally meaningful way. It’s very personal, and it applies only for those few hundred people who like this feature, but for them this feature is extremely important. And that, I think, is a nice example of how a very simple Max patch can solve a problem which is very essential for a few people. Actually this example is something I did already: I color the LEDs on my MIDI controller, so if I played a clip already, it has a different color.

What type of hardware controller do you use?

I built my own hardware controller, because I was not satisfied with any commercial product. And I so underestimated the effort. [Laughs] It took me more than a year to build my controller.

Is Ableton going to be putting it out? Is it going to become a commercial product?

It kind of has. My controller is called the Monodeck II, there’s a lot of information on my web site about it. A lot of ideas from the Monodeck, which had been realized in 2005, 2006, are now, four years later, part of the Akai APC40 controller. So there’s obviously a connection there. The Monodeck will never be a commercial product, but the Akai APC40, which was developed by Akai with Ableton, is the logical consequence out of that. If the Monodeck ever breaks, then I will just grab my APC40, and it’s all good. The first Monodeck was a really self-made, DIY kind of thing. Which gave the security people at the airport a hard time. The Monodeck II just looks like a professional product, so no one cares anymore.

If you have something really DIY they think it’s a bomb or something.

Yeah. The thing with the Monodeck, and with the Monodeck II, is that I have some wood parts in there, as spacers. I realized this because a lady at the x-ray once told me that explosives are organic materials, too. So if you look at my controller on the x-ray, you see blocks of organic substance in there, and that’s a classic ‘Alarm, there’s an explosive in there!!’ trigger. So they always check it.

Time TunnelXL
https://cycling74.com/project/project2/
Sat, 21 Nov 2009

Author: Komika Hackage. This pair of externals decodes timecoded vinyl and plays the decoded stream back through a resample external. It can be used to scratch sounds, movies, or whatever else Max/MSP offers to scratch.

Max for Live Presentation at Expo ’74
https://cycling74.com/2009/08/21/max-for-live-presentation-at-expo-74/
Fri, 21 Aug 2009

One of the best experiences of the last year was attending the Expo ’74 conference. As some have already stated, it was refreshing to be part of a gathering that didn’t require an explanation of what Max did, and I never felt like I needed to be ashamed about my excitement over a patching trick.

In addition to attending, I also gave a presentation about programming with Max for Live. Fortunately, the presentations were recorded; even more fortunately, Andrew Benson was kind enough to edit my talk and remove most of the embarrassing moments. In this presentation, I spoke directly to people that were already familiar with Max, explained some of the details of working within the Live environment, and provided some tips about how to design an effective Live device. Hopefully this will whet your appetite for working with the Max/Live combo!

Part One

Part Two

Max for Live: A Sneak Peek at the Live API Features
https://cycling74.com/2009/07/14/max-for-live-a-sneak-peak-at-the-live-api-features/
Tue, 14 Jul 2009

So far we have talked about how Max for Live will allow you to create your own custom Max devices that run inside of Ableton Live. Most of the examples you’ve seen so far have been pretty similar to your average plugin, with the fundamental difference of being able to edit the device in place. That in itself is pretty spectacular, and probably enough to please a lot of people and keep everyone busy. Now I’d like to talk about a couple of features that really make Max for Live unique and pretty exciting: namely, the Live API objects.

For those of you who aren’t well-versed in geeky acronyms, the Live API provides the ability to access the greater Live user interface from within your own device. This will offer an unprecedented amount of control and interaction, and it will be fully documented.

The Live API made something of a public debut in 2007 when a few Live users exposed a Python-scripting interface to control various aspects of a Live set. This API was originally developed by Ableton for testing and creating hardware controller templates. With Max for Live, we have had the opportunity to work with Ableton to fully integrate the features of this API into Max in the form of four objects – live.path, live.observer, live.object, and live.remote~. We have also convinced Ableton to add a few key features to the API to make Max patching and creating simple utilities more straightforward and robust.

What it can do

The Live API provides access to a Live set so that we can gather information about what is happening or change the behavior or state of the set. This means you could write a Max device that triggers clips, randomly generates parameters for other devices, and behaves differently depending on what else is going on in Live. The Live API also provides access to the same tools Ableton uses to create hardware control surface templates and interfaces, with the addition of all the features Max brings to the table. To give you a better idea of how this works, let’s look at the objects themselves and some really simple examples.

live.path

In order to control something in Live using the API, you have to navigate the object hierarchy to find the specific parameter you want access to. For this purpose, we have live.path. This object takes navigation commands as input (goto …) and outputs an ID number for the specific element you navigate to. This ID is used by the other API objects to point them in the right direction. The live.path object can also be used to gather information, like how many tracks are in a set, or how many parameters are in a device. Using the “goto this_device” command, followed by a “getpath” message, you can also find out where in the Live set your Max device is located.
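The path-then-id pattern is worth internalizing before patching. As a rough standalone JavaScript model (not the live.path object's real interface, and with invented ids): navigating to the same element always yields the same id, which the other API objects then use as a handle.

```javascript
// Toy model of live.path's behavior: resolve a navigation path such
// as "live_set tracks 0" to a stable numeric id. Paths and ids here
// are illustrative only; Live's real object model is richer.
function createPathResolver() {
  const ids = new Map();
  let nextId = 1;
  return function goto(path) {
    // Normalize whitespace so equivalent paths map to the same id.
    const key = path.trim().replace(/\s+/g, " ");
    if (!ids.has(key)) {
      ids.set(key, nextId++);
    }
    return ids.get(key);
  };
}
```

In this model, resolving "live_set tracks 0" returns the same handle no matter how often it is called, which is the property that lets separate objects share a target.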

live.observer

Sometimes you just want to know if a particular clip has been triggered yet, or what the volume settings are for your tracks. For this, we created the live.observer object. This handy object attaches itself to a specific UI element or parameter in Live and reports the state of it. Whenever the value changes, it will output the new value. This allows you to do things like listen for specific clips getting triggered, or modulate values in your device based on the parameters of another device.
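The essential behavior, reporting a value only when it changes rather than continuously, can be modeled in a few lines of standalone JavaScript. The names here are hypothetical; this illustrates the pattern, not the object's actual messages:

```javascript
// Minimal change-detector in the spirit of live.observer: feed it a
// stream of values and it calls onChange only when the value differs
// from the last one seen.
function observe(onChange) {
  let last; // undefined until the first report, so the first value fires
  return function report(value) {
    if (value !== last) {
      last = value;
      onChange(value);
    }
  };
}
```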

live.object

This object allows you to control the state of various values and trigger events in the Live Set. This is the real workhorse of the API objects, since it allows you to do things like making basic clip alterations, changing the values of different parameters, querying information, and significantly altering the behavior and state of a Live set. Most things you can do with a mouse click in the Live interface, you can do with live.object. This includes manipulating MIDI clips, changing clip colors, triggering events, and altering playback. This object also provides the interface for designing custom control surface mappings for Live.

live.remote~

Since live.object is designed to mimic user interactions with the Live Set (and adds to undo history), there are certain things that it probably shouldn’t be used for, like rapidly modulating the parameters of effects. For this purpose, we created an object called live.remote~, which allows you to directly modulate the parameters of any “remoteable” control in Live at signal rate. Those of you familiar with the Pluggo modulator plugins will be astounded at the possibilities opened up when every knob and slider in Live is controllable by a humble Max patch with sample accuracy. For example, one could copy the clever LFO patches Gregory Taylor writes about in his recent articles and use these same processes to modulate the drive on a Saturator device or the transposition of one of the drums in an Impulse device.
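The "scaling the LFO to the range of the parameter" step that these examples rely on reduces to a small piece of arithmetic, sketched here in plain JavaScript; the function name is hypothetical and the range values are placeholders, not any real device's limits:

```javascript
// Map a phase (0..1) through a unipolar sine LFO, then scale the
// result into a parameter's [min, max] range, as one would before
// sending values on to a parameter modulator.
function scaleLfo(phase, min, max) {
  const lfo = 0.5 + 0.5 * Math.sin(2 * Math.PI * phase); // 0..1
  return min + lfo * (max - min);
}
```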

[Embedded video]

Video represents an earlier stage in the Live API development and should not be depended on for programming techniques.

What You Can Do

To see a couple of simple examples of things you can do with the Live API, have a look at the video above. This video shows live.path, live.object, and live.observer in action. In the first instance, we are just using live.path to find out how many tracks we have in our set. The same could be done with clip_slots, devices, parameters, etc. In the next segment, we are using live.observer to monitor the volume of one of our tracks, and then live.object to set the volume. It’s important to note that even though we are just using number boxes as an interface, any number of procedural methods could be employed to set these values. Lastly, live.object is being used to trigger some clips in our track. If we were to query the number of clip_slots in our track, we could easily set up a random number generator or other logic to trigger clips in interesting ways. While we’re at it, we could also alter transposition and scrub the playhead around too.


Video represents an earlier stage in the Live API development and should not be depended on for programming techniques.

The video above shows the live.remote~ object being used as an assignable LFO for the parameters of a Saturator device. While there are a couple of simple things happening behind the scenes (getting the list of device parameters, scaling the LFO to the range of the parameter) you will see the patch itself is pretty straightforward. Extending this little device, we could create all sorts of complex sonic behavior with just a little patching. Since the Integrated Timing features of Max 5 also work with the Live transport, we can also set up perfectly synced waveforms to use as LFOs with live.remote~.
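The scaling step mentioned above is simple linear arithmetic. Here is a minimal sketch, assuming a bipolar LFO sample in [-1, 1] and a parameter range [min, max]; in an actual patch this would typically be done with scale or *~ and +~ objects rather than JavaScript, and the resulting value routed to live.remote~.

```javascript
// Map a bipolar LFO sample in [-1, 1] linearly onto a parameter's
// [min, max] range -- the kind of value you would send to live.remote~.
function scaleLFO(sample, min, max) {
  return min + (sample + 1) * 0.5 * (max - min);
}
```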

Of course, it’s not all about oscillators. Live.remote~ (and live.object) can also be used to map custom non-MIDI controllers to specific device parameters without having to convert the data to 7-bit MIDI messages. Since the Live API objects bypass MIDI, you can take full advantage of the sample-accurate, floating-point precision of the Live device controls.
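To see why bypassing MIDI matters, compare a direct floating-point mapping of a controller reading with one squeezed through a 7-bit MIDI CC. This is an illustrative sketch with function names of our own, not actual Live API code.

```javascript
// A raw sensor reading mapped straight to a 0..1 parameter value keeps
// full resolution; routing it through a 7-bit MIDI CC first quantizes it
// to one of only 128 possible values.
function direct(raw, rawMax) {
  return raw / rawMax;                       // full floating-point precision
}
function viaMIDI(raw, rawMax) {
  var cc = Math.round((raw / rawMax) * 127); // collapsed into 0..127
  return cc / 127;
}
```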

We think the Live API objects for Max in Live will open the door to completely new ways of working with Ableton Live, but they will also allow you to create really practical “utility” devices that help you solve specific problems in your Live project. Ever wished Follow Actions did a little more, or that you could just connect one knob to another knob, or add more chaos to the environment? Want to connect a hardware device to Live that doesn’t use MIDI? Do you just love to set up bizarre control structures and interdependent, complex systems for event sequencing? All of these things will be within reach, and many more that I can’t think of.

Pluggo Technology Moves to Max for Live
https://cycling74.com/2009/05/14/pluggo-technology-moves-to-max-for-live/
Thu, 14 May 2009

Effective immediately, Cycling ’74 will discontinue sales of prebuilt Max-based audio plug-in packages. This includes Pluggo, Mode, Hipno, and UpMix. We will continue to support current users as best we can, but there will be no further development on either the plug-in packages or their supporting technology.

This was not an easy decision to make, and we know it will disappoint some users. We had originally hoped to update our plug-in building technology to work with Max 5. However, we have had to face the fact that it is simply not cost-effective to support three different plug-in specifications on two different platforms, particularly given the increasing absence of standardization of host platforms we have observed over the past several years. Supporting our Max/MSP-based plug-in technology involves trying to make the entire Max environment run inside another host application. This was never a simple matter to begin with, and it has only grown more challenging with time.

We have decided instead to focus our development efforts on a single application, Ableton Live, where we can work directly with the developers and exert some influence over the host environment. Max for Live, announced this past January, offers the key ingredient, interactive software development, that reflects our company’s mission in life. With Max for Live, we’ve used our ten years of experience creating plug-ins to improve every aspect of our development system, adding new user interface tools, sample-accurate automation ramps, and flexible parameter definition and storage options.

As part of our Max for Live development, we have begun to revive the highlights of the Pluggo audio effect and instrument collection, and a couple of screen shots below show examples of our efforts to date. Our Pluggo-inspired devices will be freely available for use with Max for Live, and in recognition of our loyal customer base, we have arranged with Ableton to provide Max for Live discounts to plug-in customers. Details will be announced when the product is released later this year.

My Perspective on Integrating Max and Live
https://cycling74.com/2009/01/15/my-perspective-on-integrating-max-and-live/

Robert was in Anaheim, giving amazing Live 1.0 demos non-stop at Ableton’s first NAMM booth, and before the last day of the show, we were chatting in the topiary-enhanced parking lot of Stovall’s Inn. Having used Max to prototype some of the first effects included with Live, Robert told me he wanted to be able to reprogram his effects on the fly, without stopping the music, just the way everything else worked in Live. At that time, the reality for Robert would have involved translating his revised patch into a C program and rebuilding a new version of Live. This was not exactly the real-time development cycle he was used to as a Max user.

At the time, I told Robert I thought his idea was cool and that we should make it happen. But then I began to immerse myself in the details. Would we have to shoehorn the entire Max environment into Live? If not, how could you edit a patch without hearing it? None of the alternatives seemed terribly attractive. Brooding quickly set in. Fast forward nine years. After a lot of negotiating, specifying, and programming, Robert’s dream is becoming a reality. It seemed appropriate to reflect on this newest evolution of our software, and why it has caused my mood to brighten considerably.

The Max device (Degrader) comes with an edit button, unlike the built-in Live device (Erosion).

From Ableton’s perspective, Max is the meta-feature. Live’s limits are now your imagination’s limits. (Who knows, maybe they won’t need to add any more features!) But what does Max for Live mean for Cycling ’74 and our users? And why did we want to integrate Max into another piece of software?

I’ll answer the second question first.

As someone whose primary career interest has been software user interfaces, I have to say at the outset that Live has been an ongoing inspiration since I first saw it. The important thing for me is Live’s recognition that fluency was a fundamental goal in an interface for creative work. Particularly with version 5, we’ve tried to incorporate lessons from the Live interface into Max.

After Live appeared, it soon became clear that I wasn’t the only one who was impressed. Live has become a preferred tool of many Max users. Live’s performance orientation attracts a similar community to ours. And here’s an insider tip: Ableton’s MIDI, plug-in, and ReWire implementations were always the most stable we dealt with, and we actually knew people who were using the two programs together with success, on both Mac and Windows. That meant we could imagine that an integration project would have a good chance of actually working when we were finished with it!

Ultimately, it came down to this: my Cycling ’74 co-workers and I have come to believe the unique thing we have to offer the world is fundamentally about programming. In other words, we want to make edit buttons, and if we can put them in places where they have never existed before, all the better. It was clear to me that Ableton understood what it meant to have the Max environment work with their software. They weren’t just talking about more plug-ins.

We’ve been working with Ableton for more than two years to bring Max and Live together. From the outset, our goal was to create the concept of a dynamic Live device that would make the application itself seem editable. The result is not just another plug-in specification but an entirely new kind of workflow that manages to combine the interactivity and fluency of both applications without compromising anything.

Working on a complex task with another company separated by over 5000 miles and a nine-hour time difference has been an interesting challenge. Time and distance were not the only issues, however. Even though we respect each other’s software tremendously, the cultures of Ableton and Cycling ’74 are, within the narrow confines of audio software companies, pretty divergent. I suppose I should be careful in making comparisons between the two organizations, but I think it would be safe to say that Cycling ’74 operates in a manner that, by comparison to Ableton, could be characterized as complete and utter chaos. Yet for me at least, the experience of getting to know another company and its people has been intensely rewarding.

Since December 2007, when Max for Live was first demonstrated (and yes of course, it crashed!) at an Ableton company meeting, the Cycling ’74 office began to receive requests for Max authorizations from Ableton employees. That was an encouraging sign for me that maybe we were on to something. For the past several years, we have actually managed to infiltrate the Ableton office in Berlin with one of our developers, Jeremy Bernstein. In retrospect, even with all the other pieces of the puzzle falling into place, it’s hard to imagine how we could have accomplished this task without Jeremy being in the right place at the right time.

Even Max users who never end up with Live have benefited from this project. In addition to some of the Live-influenced changes we made to the UI design, there were features we developed for Max 5 specifically to address challenges of Live integration. For example, given the size constraints of the Live device view, we needed a better method for displaying a compact interface that wouldn’t distort the logical structure of a patch. The result was presentation mode, which turned out to be a dramatic improvement for UI design for any patch. The task of integrating Max into Live has already prompted a number of innovations within the Max environment and I can confidently predict more will be forthcoming.

Finally, I want to leave you with a Max-centric perspective on what this project represents.

The most obvious thing you are probably seeing as a Max user is the ease with which you can get your Max stuff into Live, as well as share it with a new user community. But that is not the whole story. Instead of thinking about what Max is going to do for Live, think about what Live is going to do for Max. With this integration, a programming environment has just gained a set of powerful composing and performing tools. In Music-N terms, Max supplies the orchestra while Live holds the score. The “score” however is not just MIDI notes. It can be audio, triggered and manipulated in all the sophisticated ways Live provides. Or it can be automation, drawn inside Live and fed to Max as sample-accurate audio-rate ramps if you like. It’s equally possible to work the other way, where Max represents the score and Live represents the orchestra.

Those are just the raw capabilities. The real magic happens when you see how it can all work together. Because we started with the requirement to support dynamically changing devices, your “score” and your “orchestra” will evolve together seamlessly. For example, if you edit a device and add a parameter to it, you won’t lose the automation data you’ve already created for the device’s existing parameters. Then there is something we have been calling preview mode. Preview mode pipes audio, MIDI, automation, and timing from Live to Max (and back to Live) while you are editing your device. The result is a sound design process that feels completely integrated from the highest to the lowest level in a way nothing else has before.

Once you experience this integration, I think you will see how it has the potential to change the typical usage patterns of both applications. Max is the ultimate workaround for out-of-the-ordinary things you need to do in Live, while Live supplies the sampling and granular audio triggering Max users often find themselves constructing. Our new Live-inspired Max UI objects, with their effortless parameter management, tie everything together, and then you save it all into a single document, ready for tomorrow’s creative explorations.

Announcing Max for Live
https://cycling74.com/2009/01/15/announcing-max-for-live/
Thu, 15 Jan 2009

NAMM • Anaheim, CA • January 15, 2009–Cycling ’74 and Ableton today announced Max for Live, the integration of Cycling ’74’s Max/MSP environment into Ableton Live. Available as an add-on product to Ableton’s newly announced Live 8, Max for Live permits users to create devices that extend and customize Live by creating instruments, controllers, audio effects, and MIDI processors.

Devices developed with Max for Live utilize the same features as those created by Ableton engineers. This includes UI controls, MIDI mapping, multiple undo, tempo-based effects, sample-accurate automation, and comprehensive file and preset management. Devices created in Max can be shared with Ableton’s new web collaboration features. An innovative “preview mode” feature permits editing in Max while devices continue to process audio and/or MIDI as if they were inside Live. When an edited device is saved, it updates in place inside Live’s device view.

The devices included with Max for Live illustrate the potential of the integration of the two products. Step Sequencer is a MIDI effect that features four 16-note sequences with adjustable step sizes. It includes unique features such as sequence shift buttons, a random mode, and real-time MIDI control. Loop Shifter is a new type of loop playback device that uses MIDI to change the way loops are played back. It includes automated mapping and playback modes that produce surprising and entertaining results. Finally, a Max for Live extension for the newly announced Akai APC40 transforms the buttons on the hardware controller into a step-sequencer-style editor for Live MIDI clips.

Max for Live features a variety of basic building blocks for creating new devices as well as an extensive set of interactive tutorials illustrating the development of instruments, audio effects, and controllers. 15 new Max objects duplicate the UI elements found in Live devices, manage parameter state, and provide an unprecedented level of control over the Live environment itself.

Availability and Pricing

Max for Live will be available from Ableton later this year. Pricing information will be announced when Max for Live is released.

More information about Cycling ’74 and its entire product line is available at http://www.cycling74.com.

About Ableton

Ableton develops technology to inspire creative people. Since the company started in 1999, Ableton has attracted an extensive and highly committed community of musicians, composers and DJs worldwide. Currently, Ableton counts about one hundred employees in its Berlin and New York offices. The company has received outstanding press, awards and customer feedback since the unveiling of Live in October 2001.