You can play/control them either as Standalone synths or hosted in DAWs/Hosts as VST/AU "racks", letting you access more than 1 instrument at the same time, either through MIDI Channels (1 per instrument) or through several (virtual) MIDI ports (each carrying 16 MIDI Channels)... whatever your DAW/Host allows.

As an example, Kontakt lets you stack instruments and assign a MIDI Port+Channel per instrument; that's how you're able to play a specific one from Reason, by sending MIDI to the respective MIDI Port+Channel pair.
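To make the Port+Channel idea concrete, here's a tiny illustrative Python sketch (not any Kontakt or Reason API, just the raw MIDI 1.0 byte layout) showing how a Note On message encodes its channel in the status byte, which is why one port can address up to 16 instruments:

```python
# A MIDI channel voice message carries its channel (0-15 on the wire,
# shown to users as 1-16) in the low nibble of the status byte.

def note_on(channel, note, velocity):
    """Build the 3 raw bytes of a MIDI Note On for a given channel (1-16)."""
    assert 1 <= channel <= 16
    status = 0x90 | (channel - 1)   # 0x90-0x9F: Note On, low nibble = channel
    return bytes([status, note & 0x7F, velocity & 0x7F])

# Middle C on channel 3 (e.g. the third instrument in a stack):
print(note_on(3, 60, 100).hex())  # -> '923c64'
```

A second port is simply a second connection carrying its own independent set of 16 channels, so "Port B, Channel 3" and "Port A, Channel 3" reach different instruments.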

Keep in mind that EMI enables you to "convert" CV to MIDI CCs, which is kinda cool... playing iPad's Animoog with a Matrix, RPG-8, sequencer track or through the controller keyboard while controlling its X/Y pad with Pulsar is always a neat thing to do ;)
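For illustration only (this isn't EMI's internal mapping, just the general idea), a CV-to-CC conversion boils down to quantizing a continuous control value into a 7-bit MIDI Control Change value:

```python
# Hypothetical sketch: a control voltage, here normalized to 0.0-1.0,
# gets clamped and quantized to a 7-bit CC value (0-127), then wrapped
# in a raw Control Change message.

def cv_to_cc(cv_value, cc_number, channel=1):
    """Map a normalized CV value to raw Control Change bytes."""
    cc_value = max(0, min(127, round(cv_value * 127)))
    status = 0xB0 | (channel - 1)        # 0xB0-0xBF: Control Change
    return bytes([status, cc_number & 0x7F, cc_value])

# A Pulsar LFO at half swing driving, say, an X/Y pad axis on CC 1:
print(cv_to_cc(0.5, 1).hex())  # -> 'b00140'
```

The 7-bit quantization is also why fast CV modulations can sound "steppy" over plain CCs.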

So, sending MIDI Clock out allows us to sync Reason's sequencer Tempo with any external (hardware or software) gear/instruments (or DAWs) that support MIDI Clock input.

This is how you're able to sync your external synth's Arpeggiator, LFOs or step sequencer with Reason's BPMs :)
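Why that works: MIDI Clock carries no tempo value at all, just a steady stream of 0xF8 ticks at 24 pulses per quarter note (24 PPQN), so the receiver reconstructs the BPM purely from tick timing. A quick back-of-the-envelope check (plain arithmetic, nothing Reason-specific):

```python
# MIDI Clock runs at 24 PPQN: 24 ticks per quarter note.
# The receiver's arpeggiator/LFO/sequencer locks to the tick rate.

def clock_interval_ms(bpm):
    """Milliseconds between consecutive MIDI Clock (0xF8) ticks."""
    return 60_000 / bpm / 24

print(round(clock_interval_ms(120), 3))  # -> 20.833 (ms per tick at 120 BPM)
```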

Here's a quick example of syncing an external (software) instrument through MIDI Clock while also playing it through MIDI (video by sinnerfire):

Besides MIDI Clock, another useful piece of info being sent out is Song Position, which is extremely useful when controlling rhythm/drum machines, sequencers or DAWs that support Song Position (some don't, so you'll have to try it or check the spec sheet).
TE's OP-1 Tape mode receives Song Position, so it's possible to sync its "Tape" position with Reason's Sequencer position :)
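For the curious, here's what a Song Position Pointer message actually looks like on the wire per the MIDI 1.0 spec (a standalone sketch, not tied to Reason or the OP-1):

```python
# Song Position Pointer (status 0xF2) locates the song in "MIDI beats":
# 1 MIDI beat = 6 MIDI clocks = one sixteenth note. The position is a
# 14-bit value split into two 7-bit data bytes, LSB first.

def song_position(sixteenths):
    """Raw SPP message for a position given in sixteenth notes."""
    assert 0 <= sixteenths < 2**14
    lsb = sixteenths & 0x7F
    msb = (sixteenths >> 7) & 0x7F
    return bytes([0xF2, lsb, msb])

# Jump to bar 3 in 4/4 (2 full bars = 32 sixteenths already played):
print(song_position(32).hex())  # -> 'f22000'
```

The 14-bit range (16384 sixteenths, i.e. 1024 bars of 4/4) is why very long songs can wrap on some older gear.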

«Right, so I'm able to play other instruments not on Reason's rack through MIDI but that's not saved into my Reason song!!!»

Well, just like a singer's voice, a guitar or any other acoustic, electric or electronic instrument that you want to add to your Reason song, you need to record that performance as audio for it to become part of your song as audio tracks, right?

So you'll need to do the same with any of those External MIDI Instruments played by Reason. If you want to capture that audio performance and embed it in your song, the definitive way to achieve that is to record that instrument's audio output into a Reason audio track. Kinda obvious, isn't it? ;P

External hardware instruments are usually a no-brainer: you connect their audio output to your computer's audio interface inputs, record that into an audio track and you're done. Then just mute the original MIDI source track and keep it as a backup, in case you need to make changes or tweaks to the MIDI track and re-record that instrument into another audio take.

Windows users can also try, as a last resort, using their internal audio card through ASIO4ALL, connecting the computer's audio output to their external ASIO audio interface through a cable... keeping in mind that you won't get any miracles regarding audio latency this way.
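To see why latency suffers, remember that each audio buffer adds buffer_size/sample_rate of delay, and built-in cards under ASIO4ALL typically only run glitch-free at large buffers. A rough illustration (the buffer sizes are examples, not measurements of any particular card):

```python
# One-way latency contributed by a single audio buffer.
# Real round-trip latency stacks several buffers plus driver overhead,
# so these figures are a lower bound.

def buffer_latency_ms(buffer_samples, sample_rate=44_100):
    """Delay in ms introduced by one buffer at the given sample rate."""
    return 1000 * buffer_samples / sample_rate

print(round(buffer_latency_ms(128), 1))   # -> 2.9  (small buffer, pro interface)
print(round(buffer_latency_ms(2048), 1))  # -> 46.4 (big buffer, built-in card)
```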

«What about all the virtual instruments, soft-synths (VSTs/AUs/Standalones) or DAWs running on the same computer I'm running Reason on?»

Well, this is where it may get a bit complicated (or not), depending on your system, your overall setup, gear, OS, etc...

In this case, the closest to "external hardware" scenario is using 2 audio interfaces.

In my case, I use a Propellerhead Balance with Reason and a Novation X-Station (which is also an audio interface) as my secondary audio interface.
With this setup, I have the X-Station's audio output connected to Balance's Line 1 through a pair of audio cables, and that's how I'm able to record any soft-synth, DAW or the X-Station's own synth engine into Reason's audio tracks. Kinda straightforward, right? :)

The other option, closest to the above, is using your ASIO driver's capability to re-route audio internally, creating virtual connections from the driver's outs into the driver's ins... if you're lucky enough to own an audio interface with such capabilities.

I think the Focusrite Saffire is an example of this type of audio interface (I never used one, so I'm not sure about this).

«Does it work? Can I really use those virtual cabling "hacks"? Is that a reliable way to work properly?»

For Mac users, who usually rely on CoreAudio's built-in audio, Soundflower may be a good solution; but Windows users used to real ASIO hardware audio interfaces will notice the drop in quality quite easily.

I've tried JACK a long time ago (so I won't bother retrying it now) and it wasn't up to the standards I expect from an ASIO-level audio interface. I also tried VAC 4.12 yesterday on my 2nd laptop (just a Core2Duo, unlike my main i7 laptop) and it kinda works but, just like JACK, only at very low quality and high latency settings... not worth the trouble, to be honest.

For me, the best results are achieved with proper hardware (ASIO) audio interfaces and, since I have 2 available at all times, that's exactly what I've been using and I'm happy with it :)

If you need (virtual) multi-channel audio out of a DAW or standalone synth application into Reason, technically VAC 4.12 can deal with up to 3 wires (i.e. IN-to-OUT connections) of 8 channels each but... like I said, you'll need a lot of CPU power and be extremely lucky and/or knowledgeable to make that work bearably. So, not worth the trouble, IMHO.
If you really need inter-app multi-channel audio exchanges, stick with the ReWire technology.

Maybe Reason 8 will strike one of the last surviving ultimate requests for Reason: ReWire Master and then yeah, all we need is a ReWire slave VST Host to tackle this problem properly completely inside-the-box ;)

(*cof*cof* and Video Track(s)!!! *cof*cof*)

If all these "hacks" sound (or act) too troublesome for you, I guess the truth may be that you don't need them anyway. Stick to Reason, stick to Rack Extensions, stick to ReWire and enjoy that easily understandable and stable bliss ;)

Bottom line is... Reason 7 now allows both worlds to be happy: the "the box is my oyster" crowd, and the tweakers, tinkerers, hackers and hardware fans who can finally properly marry Reason with that other aspect of their music-loving interests :)