I’ve been experimenting to see whether it’s possible to use the FaceTime camera and AppleScript to capture a still image. So far, I haven’t had any success:

use framework "AVFoundation"
use framework "Foundation"
use scripting additions
property this : a reference to current application
property AVCaptureDevice : a reference to AVCaptureDevice of this
property AVCaptureDeviceInput : a reference to AVCaptureDeviceInput of this
property AVCaptureSession : a reference to AVCaptureSession of this
property AVCaptureStillImageOutput : a reference to AVCaptureStillImageOutput of this
property AVMediaTypeVideo : a reference to AVMediaTypeVideo of this
property NSPredicate : a reference to NSPredicate of this
property NSRunLoop : a reference to NSRunLoop of this
set CaptureDevice to the first item of (AVCaptureDevice's devices()'s ¬
	filteredArrayUsingPredicate:(NSPredicate's ¬
		predicateWithFormat:"localizedName==[c]'FaceTime Camera'"))
set {DeviceInput, err} to AVCaptureDeviceInput's alloc()'s ¬
	initWithDevice:CaptureDevice |error|:(reference)
if err ≠ missing value then return err's localizedDescription() as text
set ImageOutput to AVCaptureStillImageOutput's alloc()'s init()
set CaptureSession to AVCaptureSession's alloc()'s init()
CaptureSession's addInput:DeviceInput
CaptureSession's addOutput:ImageOutput
set gotConnection to false
repeat with connection in ImageOutput's connections()
	set connection to the connection's contents
	repeat with port in connection's inputPorts()
		if (the port's mediaType() as text) = ¬
			(AVMediaTypeVideo as text) then
			set gotConnection to true
			exit repeat
		end if
	end repeat
	if gotConnection then exit repeat -- exit the outer loop as well
end repeat
if not gotConnection then return false
CaptureSession's startRunning()
ImageOutput's captureStillImageAsynchronouslyFromConnection:connection ¬
	completionHandler:"imageWasCaptured:"
CaptureSession's stopRunning()
on imageWasCaptured_(imageSampleBuffer, err)
	if err ≠ missing value then return err's localizedDescription() as text
	-- jpegStillImageNSDataRepresentation: is a class method, so it is
	-- sent to the AVCaptureStillImageOutput class, not the instance
	return AVCaptureStillImageOutput's jpegStillImageNSDataRepresentation:imageSampleBuffer
end imageWasCaptured_

I’ve only looked at your code briefly, but there’s one thing to be aware of: you can’t use captureStillImageAsynchronouslyFromConnection:completionHandler: like that. The completion handler needs to be a block (an inline Objective-C function), and blocks aren’t supported from ASObjC. You might be able to pass missing value and then use some other method to tell when the process has finished.
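
One possible way around the block limitation is AVCapturePhotoOutput (macOS 10.15 and later), whose capturePhotoWithSettings:delegate: method takes a delegate object rather than a block, and a script object can act as that delegate. The sketch below is untested and the handler/property names (capturePhoto, capturedData) are my own invention; treat it as a starting point rather than a working solution:

```applescript
use framework "AVFoundation"
use framework "Foundation"
use scripting additions

property capturedData : missing value -- set by the delegate callback

on capturePhoto()
	set my capturedData to missing value
	set theDevice to current application's AVCaptureDevice's ¬
		defaultDeviceWithMediaType:(current application's AVMediaTypeVideo)
	set {theInput, theError} to current application's AVCaptureDeviceInput's ¬
		deviceInputWithDevice:theDevice |error|:(reference)
	if theInput is missing value then return theError's localizedDescription() as text
	set theOutput to current application's AVCapturePhotoOutput's alloc()'s init()
	set theSession to current application's AVCaptureSession's alloc()'s init()
	theSession's addInput:theInput
	theSession's addOutput:theOutput
	theSession's startRunning()
	delay 1 -- give the camera a moment to warm up
	set theSettings to current application's AVCapturePhotoSettings's photoSettings()
	theOutput's capturePhotoWithSettings:theSettings delegate:me
	-- poll for the delegate callback instead of supplying a completion block
	repeat 50 times
		if my capturedData is not missing value then exit repeat
		delay 0.1
	end repeat
	theSession's stopRunning()
	return my capturedData -- NSData containing the image, or missing value
end capturePhoto

-- AVCapturePhotoCaptureDelegate callback
on captureOutput:theOutput didFinishProcessingPhoto:thePhoto |error|:theError
	if theError is missing value then ¬
		set my capturedData to thePhoto's fileDataRepresentation()
end captureOutput:didFinishProcessingPhoto:|error|:
```

Bear in mind that macOS will prompt for camera access the first time this runs, and that AVFoundation delivers the delegate callback on a background queue, which ASObjC may or may not cope with gracefully from Script Editor.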