Running a Quartz composition in your application

Updated Sept. 12th, 2008: make sure you don’t call initWithOpenGLContext: on a QCRenderer outside of a @try…@catch block.
On a PCI video card with 16 MB of VRAM or less, this call throws an exception instead of just returning a nil object.
These video cards are not Quartz Extreme compatible.

Now that you have created a Quartz composition, you are probably wondering if you can somehow reuse this prototype in your production code (C, C++ or Objective-C).

For the purposes of this demo, it is assumed that you have an application written in C++ using Carbon. Cocoa applications can simplify this code as needed, for instance they may not need a local NSAutoreleasePool.

The Hard Way

Quartz Composer is a GUI on top of the Core Image filter library. Everything you see in QC represents one or more of the basic Core Image filters.

Technically, nothing prevents you from rewriting the composition as procedural code. Given an image img, you can:

Load a CIFilter

Set its parameters, including input image img

Apply the filter

Get the new image img2

Unload the filter (if necessary)

Repeat with img2 and a new filter…
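For illustration, one step of that chain might look like this in Objective-C (the filter name and parameter values here are just an example, not the exact filters this composition uses):

```objc
#import <QuartzCore/QuartzCore.h>

CIImage* ApplyMonochrome(CIImage* img)
{
    // Load a Core Image filter by name and reset it to its defaults.
    CIFilter* filter = [CIFilter filterWithName:@"CIColorMonochrome"];
    [filter setDefaults];

    // Set its parameters, including the input image.
    [filter setValue:img forKey:@"inputImage"];
    [filter setValue:[CIColor colorWithRed:0.5 green:0.5 blue:0.5]
              forKey:@"inputColor"];
    [filter setValue:[NSNumber numberWithFloat:1.0] forKey:@"inputIntensity"];

    // Applying the filter yields a new CIImage; repeat with the next filter.
    return [filter valueForKey:@"outputImage"];
}
```

Multiply this by every patch in the composition and you can see the problem.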

This is tedious, error-prone and hard to maintain.

You already did all the creative work in Quartz Composer; why not let Quartz Composer do the heavy lifting for you?

The Easy Way

When you think about it, our composition requires two pieces of data:

A source image to operate on

A place to store the resulting image

If you were to treat a composition as a black box, the function prototype would probably look something like this:
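The prototype shown in the original post is not reproduced in this copy; based on the ApplyQuartzComposition(const char*, const CGImageRef) declaration referenced later in this article, it presumably looked something like the following (the parameter names and the CGImageRef return type are assumptions, and the extern "C" guard is there because the sample application is C++):

```objc
#ifdef __cplusplus
extern "C" {
#endif

// Runs the composition stored in the .qtz file at compositionPath
// on sourceImage, and returns the processed image (NULL on failure).
CGImageRef ApplyQuartzComposition(const char* compositionPath,
                                  const CGImageRef sourceImage);

#ifdef __cplusplus
}
#endif
```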

Pretty simple so far! Copy this into a header file (QuartzComposer.h, for instance) and add it to your application.

A small addition

As-is, the composition cannot be used from your application. You will make some minor modifications to it so it can be driven externally.

First, you will add an intermediate, “do-nothing” image transformation.

Disconnect your source image from the “Color Monochrome” and “Source Atop” filters by dragging the tail end of the connection away from the little dot labeled “Image”.

The glowing image should disappear from the output window. This is expected.

Drag an “Image Transform” patch from the patch list into your composition.

Do not change the default parameters of this new patch. We simply want a “do-nothing” transformation.

Connect the source image to the “Image” input (on the left) of the new Image Transform patch.

Connect the “Transformed Image” output of the new Image Transform patch to both the Color Monochrome, and Source Atop patches.

The glowing image should re-appear in the output window.

When you are done, you should have something like this:

Published outlets

The last thing to do is to indicate in our composition where the source image is set, and where the resulting image can be copied.

If you control-click on any patch element of a composition, a contextual menu with the text “Published Inputs” and “Published Outputs” will appear. They will be disabled if there are no inputs or outputs. For instance, the source image you dragged in (on the left) has no inputs, and a Billboard has no outputs.

Using this technique, publish the “Image” input of the Image Transform patch under the name “SourceImage”. The input’s name should now read “SourceImage” (with quotes), indicating that it is now a “named” input.

But my picture disappeared!

Right. As soon as you named the input, the image was disconnected, because an input port cannot be driven from two sources at once: it is either a named (published) input or a connection, not both. This is expected.

Finally, publish the output image of the “Source Atop” patch as “OutputImage”. Notice that the link to the Billboard was not severed, because outputs can be split.

Save your composition, with its named input and output, to a file called Glow.qtz.

Adding the composition to your application as a resource

Your application is probably built in Xcode, in which case you have a “Copy Resources” build phase. Simply add the Composition Glow.qtz to your project as a resource, and make sure it is added to this Copy phase. When you build your application, check the Contents/Resources folder in your bundle: the composition should have been copied there.
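Since the sample application is Carbon/C++, you can locate the copied composition at run time with CoreFoundation (the function and buffer names here are illustrative):

```c
#include <CoreFoundation/CoreFoundation.h>
#include <stdbool.h>

// Writes the filesystem path of Glow.qtz inside the main bundle into
// pathBuffer; returns false if the resource is missing.
bool CopyGlowCompositionPath(char* pathBuffer, CFIndex bufferSize)
{
    CFURLRef url = CFBundleCopyResourceURL(CFBundleGetMainBundle(),
                                           CFSTR("Glow"), CFSTR("qtz"), NULL);
    if (url == NULL)
        return false;

    Boolean ok = CFURLGetFileSystemRepresentation(url, true,
                                                  (UInt8*)pathBuffer,
                                                  bufferSize);
    CFRelease(url);
    return ok;
}
```

The resulting C string is exactly what the const char* parameter of ApplyQuartzComposition() expects.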

Actual code

You declared a function called ApplyQuartzComposition(const char*, const CGImageRef) above. Here is the code for this function:
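The original listing did not survive in this copy of the article; the following is a sketch of how such a function could be written with the QCRenderer API, including the @try…@catch guard from the update note at the top (the CGImageRef return type and local names are assumptions):

```objc
#import <Cocoa/Cocoa.h>
#import <Quartz/Quartz.h>

CGImageRef ApplyQuartzComposition(const char* compositionPath,
                                  const CGImageRef sourceImage)
{
    NSAutoreleasePool* pool = [[NSAutoreleasePool alloc] init];
    CGImageRef result = NULL;

    // QCRenderer needs an OpenGL context to render into.
    NSOpenGLPixelFormatAttribute attributes[] = {
        NSOpenGLPFAAccelerated, NSOpenGLPFANoRecovery,
        NSOpenGLPFADepthSize, 24, 0
    };
    NSOpenGLPixelFormat* format =
        [[[NSOpenGLPixelFormat alloc] initWithAttributes:attributes] autorelease];
    NSOpenGLContext* context =
        [[[NSOpenGLContext alloc] initWithFormat:format shareContext:nil] autorelease];

    QCRenderer* renderer = nil;
    @try {
        // On non-Quartz Extreme video cards this throws instead of returning nil.
        renderer = [[QCRenderer alloc]
            initWithOpenGLContext:context
                      pixelFormat:format
                             file:[NSString stringWithUTF8String:compositionPath]];
    }
    @catch (NSException* exception) {
        renderer = nil;
    }

    if (renderer != nil) {
        // Feed the source image to the published "SourceImage" input.
        [renderer setValue:(id)sourceImage forInputKey:@"SourceImage"];

        // Render a single frame of the composition.
        if ([renderer renderAtTime:0.0 arguments:nil]) {
            // Retrieve the published "OutputImage" output as a CGImage
            // (-valueForOutputKey:ofType: requires Mac OS X 10.5 or later).
            result = (CGImageRef)[renderer valueForOutputKey:@"OutputImage"
                                                      ofType:@"CGImage"];
            if (result != NULL)
                CGImageRetain(result);  // keep it alive past the pool
        }
        [renderer release];
    }

    [pool release];
    return result;
}
```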

Do you know of a way to adapt the above into accepting a live video stream from a camera input?
I’ve looked into the QTKit framework and adapting the decompressedVideoOutput, but to no avail as yet.
Any ideas?
Regards,
Adrian

If you can get a Quartz Composition to play in Quartz Composer, you should be able to play it in your application using a modified version of this code. Basically, this code renders one frame of the animation. That was sufficient for my purposes, but for video playback you need to set up a timer and call [renderer renderAtTime:arguments:] repeatedly.
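A minimal sketch of that approach, assuming you keep the QCRenderer and its OpenGL context around in a controller object instead of releasing them after one frame (the class and frame rate are illustrative):

```objc
#import <Cocoa/Cocoa.h>
#import <Quartz/Quartz.h>

@interface CompositionPlayer : NSObject
{
    QCRenderer* renderer;        // created as in ApplyQuartzComposition()
    NSOpenGLContext* glContext;  // the context the renderer draws into
    NSTimer* timer;
    NSTimeInterval startTime;
}
- (void)startPlayback;
@end

@implementation CompositionPlayer

- (void)startPlayback
{
    startTime = [NSDate timeIntervalSinceReferenceDate];
    // Fire roughly 30 times per second; adjust to taste.
    timer = [[NSTimer scheduledTimerWithTimeInterval:(1.0 / 30.0)
                                              target:self
                                            selector:@selector(renderFrame:)
                                            userInfo:nil
                                             repeats:YES] retain];
}

- (void)renderFrame:(NSTimer*)aTimer
{
    // Pass the elapsed time so time-based patches animate correctly.
    NSTimeInterval elapsed =
        [NSDate timeIntervalSinceReferenceDate] - startTime;
    [renderer renderAtTime:elapsed arguments:nil];
    [glContext flushBuffer];  // assumes a double-buffered, on-screen context
}

@end
```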

I’m still trying to nut out a few things with your sample above.
1. Which line is looking at the incoming image being fed to it?
2. How do I declare the image/images in my app to automatically be sent to that input?
3. Can I still use CGImageRef, or do I have to convert to CVImageBuffer to accept a video stream from QTKit?

2. Your images can come in any format, as long as you can transform them into NSImage, CGImageRef, CIImage or CVImageBuffer (see above tech note). In my case, my images start as PNGs on disk and are loaded into CGImageRefs. But in other cases, you could init an NSImage from a bitmap representation and use that as the input source. Really, there are no limits to what you can do.
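For instance, loading a PNG from disk into a CGImageRef can be done with ImageIO, available since Mac OS X 10.4 (the function name is illustrative):

```objc
#import <ApplicationServices/ApplicationServices.h>
#include <string.h>

CGImageRef CreateCGImageFromPNG(const char* path)
{
    CGImageRef image = NULL;
    CFURLRef url = CFURLCreateFromFileSystemRepresentation(NULL,
                                                           (const UInt8*)path,
                                                           strlen(path), false);
    if (url != NULL) {
        CGImageSourceRef source = CGImageSourceCreateWithURL(url, NULL);
        if (source != NULL) {
            // Decode the first (and, for a PNG, only) image in the file.
            image = CGImageSourceCreateImageAtIndex(source, 0, NULL);
            CFRelease(source);
        }
        CFRelease(url);
    }
    return image;  // caller owns the image, per the Create rule
}
```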

3. I believe you can directly use your CVImageBuffer in the setValue:forKey: call above (note that I cast my CGImageRef to (id), a generic object). Quartz Composer figures out what data you’re passing to it and acts accordingly.