Audiobus: Use your music apps together.


Orchestral music on iPad


This is also all Notion on the iPad. And @ipadmussic, this is an example where I exported the wav mix and imported just that stereo out into Auria Pro, then did a little work with Pro-MB and Pro-L. I also exported it dry from Notion and added AltiSpace for reverb in Auria Pro instead of using Notion's reverb.

@ipadmussic said: @MusicInclusive, you seem to be an expert in orchestral music here. I checked out your SoundCloud pieces and they are great!
I am a sucker for orchestral stuff and am trying to learn basic orchestration. I never knew Notion gives you orchestral samples, and with articulations too.

Questions:
1. Are the orchestral samples IAP or free?
2. Do all the articulations available in the UI have equivalent samples recorded using that articulation, or is it done internally by scripts?
3. Can you list all the articulations available?
4. Any MIDI / Audiobus capability to record into Cubasis?

Thanks.

1.) Some are free with the app. Many (most) are IAPs - well worth it. You can buy them all together or add instruments as you need them.

2.) I don't know how it's done internally. I suspect it's separate samples derived from the desktop Notion LSO samples as I believe it is on the desktop, but it may be timing / dynamics in some cases as opposed to different bowings. I can see if I can find out from the Notion team.

3.) You can check the articulation list in the Notion manual - available online.

It lists all the articulations across different instruments. Nothing about those has changed in the latest version.

The latest version's manual, however, is embedded in the app itself, and there's no separate URL for it that I know of, although I'll hunt some more, and if not I'll see if we can request that of the Notion folks.

4.) No ability to record MIDI out directly but you can export MIDI and import into Cubasis. And, as I noted in my longer post above, there is no audio out from Notion into IAA or Audiobus either. However, there are a number of options with audio:

a.) You can export the wav and import into something else

(You can't export stems directly from Notion iOS, only the total mix.)

b.) If you have desktop Notion, you can export stems. You can share your iOS Notion session with desktop Notion and then export from the desktop. (And then reimport those stems back into Cubasis or whatever).

c.) If you have desktop Notion, you can also just do a "send to Studio One" and the audio will be sent.

d.) If you have desktop Notion, you can also export a PreSonus Capture session.

MIDI out and audio out to AB / IAA has been requested already - for some time.

Thank you sir! Appreciate your detailed answers, and your bunny waltz is excellent :-)
Hearing your tracks really inspired me... but I still don't know how to orchestrate... have a long way to go.

Glad you are liking them. I've still got a huge way to go to get to where I want to be. I have a number of other (longer) works in progress right now - I need to finish some of those this year if I can.

Thanks for the links, but right now honestly I do not have the time for books... so will try the easy route first - YouTube videos / Google etc.

I wish I somehow magically knew the techniques of orchestration/programming... I get a lot of ideas in my mind but just can't produce... the process takes too much time and the output is crappy :-(

This is my ultimate wish for music software - one that can create orchestration based on a human voice/humming, something that can even generate articulations based on how the voice is articulated/modulated... okay, enough dreaming, back to reality :-)

Ah - well - hmmm. There's no real shortcut to doing proper orchestration. One does have to dig into the learning material somewhat to understand what characteristics instruments have, how to write for those characteristics, how to arrange with those instruments, complementarity, instrument choices, doubling, section sizes, divisi choices, etc.

However, there is a range of alternative training, including videos by Thomas Goss if videos really are your thing. Thomas runs the Orchestration Online FB group and regularly organizes a MOOC for students wanting to learn orchestration.

Adler's a good starting point though. It's a reference work that's used the world over in orchestration classes.

Thanks for the links. I agree with your point about not taking shortcuts, but this is just a hobby for me and I don't have the luxury of time for detailed study. I am trying to leverage the power of software to compensate for my incompetence.
I watched a couple of Thomas's videos - I found them a bit advanced for my level. Will someday go back to them, as they are a complete series.
Btw, I found this video series from Rick Beato, which I liked.

Interesting. He makes some good choices but doesn't explain why he's making them at an instrument-by-instrument level. That's a significant part of the understanding that's missing here: which instruments work well together and which don't; why one would choose to orchestrate various instruments in an octave divisi; why one would double a flute with a clarinet (or not), etc.

Also, the fact that John Williams wrote the theme to Star Wars in Bb is equally likely to do with using Bb transposing instruments as it is to do with the preceding 20th Century Fox title sequence. There are four Bb instruments being played during that opening chord in the original score. Writing in Bb is useful for scoring for those instruments.

So, while he mechanically does some good things, it doesn't actually teach why one would make those choices beyond dividing among instruments. It's a start, however.

I appreciate your find and feedback though.

You are right, I did not understand the "why" part of the orchestration choices made, but at my level I was most interested in "how" that HUGE chord was made.
I used to try chords using string patches (combining them with brass etc.) - and get disappointed at how muddy they sounded.

I did not understand the thing about clarinets (and others) being written an octave below to sound as Bb. Also, what do you mean by Bb transposing instruments? Are you saying a certain set of instruments only sounds good for certain chords like Bb?

Do you think this is some kind of rule - that for huge chords the 3rd and 5th voices should be kept to a minimum?
Thanks again for sharing your knowledge.

Also, I think a Bb transposing instrument means that the individual instrument's music is written differently. For example, when a trombone (I played in high school!) plays a Bb, it is actually a Bb in reality. However, when a trumpet plays a "C", in reality they are playing a Bb. I think French horns are written 4 steps up or something like that...
So some instruments' written notes don't match the actual sounding notes.

I think (not speaking from experience, but from what I've read about it) that the muddy sound is due to all the instruments fighting for "space" in the mix. Real orchestras don't have to be mixed for an audience; they are mixed just by seating position and how loud or soft they play. But when making a "virtual" orchestra with MIDI sample instruments, you have to place the instruments in their own "space" using pan, reverb and dB levels.

Just my 2 cents on the matter...I'm still learning how to do this though-which is why I started the thread!
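The "own space" idea above can be sketched numerically. Below is a minimal, hypothetical example of a constant-power pan law - one common way to place a virtual instrument left-to-right without changing its perceived loudness. The function name and the mapping are illustrative, not from any particular DAW:

```python
import math

# Constant-power panning: pan runs from -1.0 (hard left) to +1.0 (hard right),
# and the two channel gains always sum to equal acoustic power, so an
# instrument doesn't get quieter as it moves off-centre.

def pan_gains(pan: float):
    theta = (pan + 1) * math.pi / 4          # map [-1, 1] to [0, pi/2]
    return math.cos(theta), math.sin(theta)  # (left gain, right gain)

# First violins sit left of the conductor, cellos to the right.
left, right = pan_gains(-0.5)   # somewhat left
print(round(left, 3), round(right, 3))   # -> 0.924 0.383
left, right = pan_gains(0.0)    # centre: both gains ~0.707 (-3 dB)
print(round(left, 3), round(right, 3))   # -> 0.707 0.707
```

Reverb send level and a touch of EQ then push an instrument "back" in the virtual hall, as described above; pan alone only handles left-right.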

Thanks @Jmcmillan for your inputs.
Why aren't these "special instruments" re-calibrated or redesigned to produce the exact sound of the notes being played? Any reason you know of?

@ipadmussic said: I did not understand the thing about clarinets (and others) being written an octave below to sound as Bb. Also, what do you mean by Bb transposing instruments? Are you saying a certain set of instruments only sounds good for certain chords like Bb?

No, not really - although each instrument obviously has a range, and there are certain natural resonances for instruments that sound best with the way they are constructed. You also have to take into account things like equal temperament - or other scale choices (the relative scale divisions, if you will) like just intonation.

But, consider on top of that you can change the tuning of an instrument too. So, e.g., concert A is 440Hz, but often orchestras will tune to A higher or lower depending on the type of music, genre and age of instruments (e.g. Baroque), concert hall, weather, humidity, etc.
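To make the tuning point concrete, here's a small sketch of how a reference A translates into actual pitches under equal temperament. The helper is hypothetical; the formula f = A_ref * 2^(n/12) is the standard 12-TET relationship:

```python
# Equal-temperament pitch: each semitone multiplies frequency by 2^(1/12).
# Changing the orchestra's reference A shifts every note proportionally.

def note_freq(semitones_from_a: int, ref_a: float = 440.0) -> float:
    """Frequency of the note n semitones above (negative = below) the reference A."""
    return ref_a * 2 ** (semitones_from_a / 12)

# Middle C is 9 semitones below A4.
print(round(note_freq(-9), 2))               # -> 261.63  (concert pitch, A=440)
print(round(note_freq(-9, ref_a=415.0), 2))  # -> 246.76  (a common Baroque tuning, A=415)
```

So a Baroque ensemble tuned to A=415 sounds roughly a semitone lower than a modern one, which is one reason period recordings can sound "flat" next to the same score at A=440.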

Rather, some instruments naturally are tuned and played in a particular way - like a Bb clarinet. You can play in any key on them, but naturally to the player they are "in" the key of Bb - so the natural "scale" (think of how you would play a recorder by easy analogy) is Bb, not C.

If you are not used to thinking of this, consider by further analogy a guitar. You can change the tuning to, say, DADGAD so that "naturally" when you strum it, it plays a chord in harmony (instead of the discordant EADGBE). You can still play all the other notes, but you've changed the "default", so to speak, to be a "D" guitar. You now just have to remember that three strings are a whole tone down from where they normally are. (Lots of people do this with guitars.)

This is further compounded by transposition however for something like a clarinet (or trumpet or French horn, or .... ). When a Bb clarinet player "plays" a C ("Concert pitch" or the piano note C if you like - at least as written in notation in a score), it actually sounds as Bb. So, their instrument is "in" Bb if you will. If they want to actually sound a C - in "tune" with a piano, they play a D on the clarinet, which sounds 2 semi-tones below as C, just like C sounded 2 semi-tones down as Bb. And so on.

And, you can score for instruments in their natural key or in concert pitch in your notation - but you need to let the player know which. Notation software such as Notion allows you to choose and to transpose.

So, it's more "natural" for a Bb clarinet player to play pieces in Bb - because the fingering is the natural fingering on the instrument. Hence why some composers write in certain keys for certain instruments.
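The written-versus-sounding relationship described above can be sketched in a few lines. The offsets are standard transpositions, but the table and function are hypothetical helpers, not part of any app discussed here:

```python
# Written pitch vs. sounding pitch for transposing instruments.
# Offsets are in semitones: how far the sounding note is from the written note.

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

TRANSPOSITIONS = {
    "Bb clarinet": -2,   # sounds a whole tone below written
    "Bb trumpet": -2,
    "F horn": -7,        # sounds a fifth below written
    "C flute": 0,        # non-transposing
}

def sounding_note(written: str, instrument: str) -> str:
    idx = NOTE_NAMES.index(written)
    return NOTE_NAMES[(idx + TRANSPOSITIONS[instrument]) % 12]

print(sounding_note("C", "Bb clarinet"))  # -> A#  (enharmonically, Bb)
print(sounding_note("D", "Bb clarinet"))  # -> C   (play written D to sound concert C)
```

This is exactly the bookkeeping notation software does for you when you toggle between transposed and concert-pitch views of a score.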

By contrast, D is a favorite choice for violin-focused pieces for example to make use of the sympathetic resonances of adjacent strings.

@ipadmussic said: Do you think this is some kind of rule - that for huge chords the 3rd and 5th voices should be kept to a minimum?
Thanks again for sharing your knowledge.

Yes - there are "rules" (though of course they're somewhat arbitrary - they are simply rules of thumb that work better aurally).

Here's a good quick tutorial. It's one public sample part of a larger series but it'll help as you see how choosing different intervals among different instruments gives better combinations.

@Jmcmillan said:
I think (not speaking from experience, but from what I've read about it) that the muddy sound is due to all the instruments fighting for "space" in the mix. Real orchestras don't have to be mixed for an audience; they are mixed just by seating position and how loud or soft they play. But when making a "virtual" orchestra with midi sample instruments, you have to place the instruments in their own "space" using pan, reverb and db levels.

Just my 2 cents on the matter...I'm still learning how to do this though-which is why I started the thread!

Thanks @Jmcmillan for the link to Mattias Westlund's site. Yes, I have also understood the issue with the "muddied" sound, and it looks like the solution is to learn orchestration... no easy route. But I will play around with the pan/reverb settings from now on, thanks.

Ah! There's a nice little intro video series on orchestral instruments buried in his site:

And here's the one from that series on the clarinet in particular @ipadmussic

(Check specifically around 1:50 & ff)

@MusicInclusive , I can't thank you enough... the links/videos you provided are really helpful. And your detailed explanation has clarified all my questions on the "tuning of instruments". The Groove3 video was really good.

So I think, along with whatever you were recommending, panning/reverb/dB level is also a crucial aspect (as pointed out by @Jmcmillan ).

Indeed. And EQ - EQ is crucial! I've just spent the better part of a day EQ'ing a solo violin. Not, obviously, as one would do in a mix, but simply to take off some of the rough edges (excess bow noise particularly) before it goes in a mix. (That was on the desktop, but the same principles apply on iOS.)

Further with panning, esp. with solo instruments, one also wants to play with the 3D field width of the instrument - usually to narrow it (but taking careful account of the interaction with reverb, esp. in a hall kind of setting). It's true of sections too, but particularly so with solo instruments. There is a reason an orchestra is arranged the way it is on a stage! It'll help with the stereo field - both in width and depth (with volume and EQ too). I.e. not just where in the 3D soundscape it sits L->R but how wide its sound source is perceived to be. (This is generally true of mixing, of course, but even more so with orchestral instruments, where there are potentially a lot of solos from different instruments over the course of a piece.)

In Auria Pro, Pro-C with some careful M/S settings can be used on iOS to handle field width. It's about the best option in Auria Pro because one can't use Stereo Designer (which is easier to use) as an AU - only one instance, so it's freeze, rinse and repeat.
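For the curious, the underlying mid/side trick that such M/S settings rely on is simple to sketch. This is a toy illustration on raw sample lists - not how any particular plugin implements it:

```python
# Narrowing a stereo source via mid/side: decode L/R into mid (sum) and side
# (difference), scale the side signal, then re-encode. width=1.0 leaves the
# stereo field untouched; width=0.0 collapses the source to mono.

def narrow_stereo(left, right, width=0.5):
    out_l, out_r = [], []
    for l, r in zip(left, right):
        mid = (l + r) / 2
        side = (l - r) / 2 * width   # shrink the stereo difference
        out_l.append(mid + side)
        out_r.append(mid - side)
    return out_l, out_r

l, r = narrow_stereo([1.0, 0.2], [0.0, 0.8], width=0.0)
print(l, r)   # fully mono: both channels carry only the mid signal
```

Scaling the side channel down is what makes a wide-recorded solo instrument sit as a believable point source on the virtual stage.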

The knowledge folks like you share in this forum is incredible. I would prefer this to reading books. Well, we are in sound-engineering territory now. In today's world a musician has to be a sound engineer / orchestrator / programmer... what not.

I have a question. Does Notion allow you to record the output into multiple tracks? Does it give you the output as MIDI or only as an audio file?

You are welcome - It's nice that there is access to the Notion team and that they are responsive. Thankful for that.

You can only get one stereo wav mix out of Notion for iOS. However you can export stems from Notion for the desktop.

You can export and import MIDI to/from both, as well as MusicXML.

You can also print a PDF score.

Okay, here is a scenario that will be useful to me.
Record something in Notion:
- using multiple instruments
- using the different articulations available
Then export the MIDI to Cubasis.

Will this MIDI export have multiple tracks in Cubasis for each of the instruments? How will the articulations appear - as separate tracks or in a single track? Is this even possible?

Yes - the output will have multiple MIDI tracks.

The articulations will totally disappear. The only things that might remain, in terms of anything usable articulation-wise, are shorter note lengths for staccato, since staccato notes are usually notated shorter and may get exported as shorter MIDI notes.

That's because there is no way for Notion to tell Cubasis (or Auria Pro, or even iSymphonic Orchestra) what articulations to play. There is no articulation mapping technology on iOS as we've noted before. The closest is iSymphonic's use of different MIDI channels as we've discussed above. This is not a fault of Notion, it's that there is no way to tell other programs what to do. On the desktop, Notion is able to manage this by using keyswitch technology, but there is no consistent keyswitch technology in iOS for it to work with. SampleTank has some keyswitching, but not in the way that one needs it to be and not in a way that maps, e.g., to the Miroslav 1 patches in the desktop version. (Which are, instead, done using MIDI channels. Sigh... ).

It could be done, but there is no other orchestration library on iOS that comes close to the one in Notion. iSymphonic's is not complete enough yet. Perhaps when it is, the Notion guys could talk to the iSymphonic guys and work out a way to make it happen. OTOH, the Notion folks have not a little interest in getting folks to buy the Notion LSO orchestral sample IAPs - esp. given the lower cost on iOS.

Anyhow, once you have your exported MIDI in separate tracks per instrument in your iOS DAW, what you then have to do is take a single MIDI track - let's say you have a string quartet and have exported it and got four MIDI tracks - for, say, the violin, and break it out manually into multiple further MIDI tracks, note by note, wherever there are articulation changes. So, if you have used, say, legato, sustain, staccato and pizzicato in your score, you would end up breaking your solo violin track out into 4 further MIDI tracks. Each would have some of the notes of the original track. You would then assign the MIDI outputs of those to whichever other instruments you chose - SSO, VSCO2, iSymphonic, etc. - to drive each of the voices on the specific notes they apply to. Multiply that by the other tracks. And consider carefully whether you want to do that for a complete orchestra or not...
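Part of that manual split can in principle be automated with a duration heuristic, since (as noted above) staccato sometimes survives export as shorter MIDI notes. This is a hypothetical sketch, not a feature of Notion or Cubasis - and it can't recover legato vs. sustain vs. pizzicato, which still have to be split by hand:

```python
# Sort one instrument's exported MIDI notes into "short" and "long" sub-tracks,
# on the assumption that notes at or below the threshold were staccato.

from dataclasses import dataclass

@dataclass
class Note:
    start_beats: float
    length_beats: float
    pitch: int   # MIDI note number

def split_by_length(notes, threshold_beats=0.25):
    short = [n for n in notes if n.length_beats <= threshold_beats]
    long_ = [n for n in notes if n.length_beats > threshold_beats]
    return short, long_

violin = [Note(0.0, 1.0, 64), Note(1.0, 0.1, 67), Note(1.5, 0.1, 69)]
short, long_ = split_by_length(violin)
print(len(short), len(long_))   # -> 2 1
```

Each resulting list would then become its own MIDI track, routed to the staccato or sustain patch of whatever sampler you're using.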

The only simple way to preserve articulations without the above work is to use Notion's own instruments.

The other way is to step outside of iOS and use desktop Notion to drive EastWest, VSL or Miroslav 1.

Even after those choices (which are baked in to desktop Notion), you are, even on the desktop, walking in pretty much the same path. E.g., if you have scored something in Notion and have, say, Kontakt instruments like Albion, Chris Hein, Kirk Hunter, etc. etc. , you either have to

a.) work out a Kontakt keyswitch map for the articulations and expressions
b.) export to a DAW and do the same as in iOS above - i.e. split out into separate tracks manually after the fact.

Rinse & repeat for Sibelius, Finale, Dorico, etc.
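For illustration, the keyswitch approach in (a.) boils down to inserting a "switch" note (usually far below the instrument's playing range) ahead of each articulation change, which tells the sampler which articulation to load. The mapping below is entirely hypothetical - every sample library defines its own keyswitch layout:

```python
# Hypothetical articulation -> keyswitch map (MIDI note numbers).
KEYSWITCH = {
    "sustain": 24,    # C1
    "staccato": 25,   # C#1
    "pizzicato": 26,  # D1
}

def with_keyswitches(events):
    """events: list of (articulation, midi_note). Emit a keyswitch event
    whenever the articulation changes, then the note itself."""
    out, current = [], None
    for artic, note in events:
        if artic != current:
            out.append(("keyswitch", KEYSWITCH[artic]))
            current = artic
        out.append(("note", note))
    return out

stream = with_keyswitches([("sustain", 60), ("sustain", 62), ("staccato", 64)])
print(stream)
```

The notation program's rule engine is doing essentially this translation from score markings to switch notes; the pain is that the map has to be rebuilt for every library's own layout.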

Yep. It's not a small task.

Some people skip notation programs altogether and just compose in the DAW using MIDI directly, writing the articulations out right up front in the tracks where they belong, if using separate instrument voices per track. I don't typically write orchestral music that way. I think and write in terms of a notated score; the instrumentation, for me, comes second. Ideally, one should be able to write a score and have it interpreted by the computer to produce realistic articulations and expression.

That's what the keyswitch and other mappings from notation programs attempt to do. They still need after-the-fact tweaking for best realism, and, in fact, desktop Notion gives you some access to CC values via an overlay to assist with that. The latter's a bit kludgy, but usable.

Further, in desktop Notion, let's suppose you have EastWest's normal legato strings voice for a particular string instrument, BUT you want to use Butter Legato, which is available for that instrument, at one point. There's no built-in mapping for that. One could write a rule for it, or, as some do, you can insert another instrument temporarily on a track. So, let's say for a bar and a half you want to use Butter Legato: you would "insert" a new instrument for that bar and a half, and then revert to the original instrument thereafter. Again, a bit kludgy but usable.

I have a question. Does Notion allow to record the o/p into multiple tracks? does it give you the o/p as midi or only audio file?

You are welcome - It's nice that there is access to the Notion team and that they are responsive. Thankful for that.

You can only get one stereo wav mix out of Notion for iOS. However you can export stems from Notion for the desktop.

You can export and import MIDI to/from both, as well as MusicXML.

You can also print a PDF score.

Okay, here is a scenario that will be useful to me.
Record something in Notion -
using multiple instruments.
using the different articulations available.
Export the MIDI to Cubasis.

Will this MIDI export give multiple tracks in Cubasis, one for each of the instruments? How will the articulations appear - as separate tracks or in a single track? Is this even possible?

The output will have multiple MIDI tracks, yes.

The articulations will disappear entirely. The only usable trace that might remain is shorter note lengths for staccato, since staccato notes are usually notated shorter and may get exported as shorter MIDI notes.
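Since a shorter note length is the only trace of staccato that survives export, one could at least guess which notes were staccato by comparing each duration to the beat. A minimal pure-Python sketch of that idea (the tuple layout, tick resolution and threshold are hypothetical illustrations, not any program's actual export format):

```python
def guess_staccato(notes, beat=480, threshold=0.5):
    """Flag notes whose duration falls under `threshold` of a beat.

    `notes` is a list of (start_tick, midi_note, duration_ticks) tuples;
    480 ticks per quarter note is a common MIDI resolution. The threshold
    is a guess -- exported staccato lengths vary by notation program.
    Returns the same tuples with a trailing is_staccato flag.
    """
    return [(t, n, d, d < beat * threshold) for t, n, d in notes]

# A quarter note, a clipped (likely staccato) note, and another quarter.
notes = [(0, 60, 480), (480, 62, 120), (600, 64, 480)]
print(guess_staccato(notes))
```

This only recovers staccato-like shortening; legato, pizzicato and the rest leave no trace in plain MIDI at all, which is the point being made above.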

That's because there is no way for Notion to tell Cubasis (or Auria Pro, or even iSymphonic Orchestra) what articulations to play. There is no articulation mapping technology on iOS, as we've noted before. The closest is iSymphonic's use of different MIDI channels, as we've discussed above. This is not a fault of Notion; there is simply no way to tell other programs what to do. On the desktop, Notion can manage this by using keyswitch technology, but there is no consistent keyswitch technology on iOS for it to work with. SampleTank has some keyswitching, but not in the way one needs it, and not in a way that maps to, e.g., the Miroslav 1 patches in the desktop version. (Which are, instead, done using MIDI channels. Sigh...)
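For readers unfamiliar with the mechanism: keyswitching means sending a short, usually out-of-playing-range MIDI note just before a phrase, which tells the sampler to switch articulations. Here is a minimal Python sketch of the idea; the articulation names and keyswitch note numbers are hypothetical, since every library defines its own map:

```python
# Hypothetical keyswitch map: articulation name -> MIDI note number.
# Real libraries (EastWest, Kontakt instruments, etc.) each define their own.
KEYSWITCHES = {
    "sustain": 24,    # C0
    "staccato": 25,   # C#0
    "pizzicato": 26,  # D0
    "legato": 27,     # D#0
}

def with_keyswitches(events):
    """Prepend a keyswitch note whenever the articulation changes.

    `events` is a list of (time, note, duration, articulation) tuples.
    Returns a flat list of (time, note, duration) tuples with the
    keyswitch notes inserted just ahead of each articulation change,
    which is what a keyswitching sampler expects to receive.
    """
    out = []
    current = None
    for time, note, dur, art in events:
        if art != current:
            # A short keyswitch note, slightly early, selects the articulation.
            out.append((max(0, time - 1), KEYSWITCHES[art], 1))
            current = art
        out.append((time, note, dur))
    return out

phrase = [(0, 60, 4, "legato"), (4, 62, 4, "legato"), (8, 64, 1, "staccato")]
print(with_keyswitches(phrase))
```

The missing piece on iOS is exactly this translation layer: nothing standard exists to turn a score's articulation marks into whatever switching scheme each sampler happens to use.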

It could be done, but no other orchestral library on iOS comes close to the one in Notion. iSymphonic's is not complete enough yet. Perhaps when it is, the Notion guys could talk to the iSymphonic guys and work out a way to make it happen. OTOH, the Notion folks have not a little interest in getting folks to buy the Notion LSO orchestral sample IAPs - esp. given the lower cost on iOS.

Anyhow, once you have your exported MIDI in separate tracks per instrument in your iOS DAW, you then have to take each single MIDI track and break it out manually, note by note, into further MIDI tracks wherever the articulation changes. Say you have a string quartet exported as four MIDI tracks. If the violin part uses legato, sustain, staccato and pizzicato, you would break that one track out into four further MIDI tracks, each holding some of the notes of the original. You would then assign the MIDI outputs of those tracks to whatever instruments you choose - SSO, VSCO2, iSymphonic, etc. - so that each voice plays only the notes that apply to it. Multiply that by the other tracks, and consider carefully whether you want to do that for a complete orchestra or not...
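The manual break-out described above amounts to partitioning one instrument's notes by articulation label. A pure-Python sketch of that step, using simple tuples rather than a real MIDI library (the data layout is hypothetical):

```python
from collections import defaultdict

def split_by_articulation(track):
    """Partition one instrument track into one track per articulation.

    `track` is a list of (time, note, duration, articulation) tuples.
    Returns a dict mapping articulation -> its own note list, each of
    which would then be routed to a separate instrument or patch
    (e.g. the pizzicato notes to a pizzicato voice).
    """
    tracks = defaultdict(list)
    for time, note, dur, art in track:
        tracks[art].append((time, note, dur))
    return dict(tracks)

violin = [
    (0, 60, 4, "legato"),
    (4, 64, 1, "staccato"),
    (5, 55, 2, "pizzicato"),
    (7, 60, 4, "legato"),
]
print(split_by_articulation(violin))
```

The catch, of course, is that exported MIDI carries no articulation column at all - that label is exactly the information lost in export, which is why the split has to be done by hand, from the score, note by note.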

The only simple way to preserve articulations without the above work is to use Notion's own instruments.

The other way is to step outside of iOS and use desktop Notion to drive EastWest, VSL or Miroslav 1.

Even after those choices (which are baked into desktop Notion), you are, even on the desktop, walking pretty much the same path. E.g., if you have scored something in Notion and have Kontakt instruments like Albion, Chris Hein, Kirk Hunter, etc., you either have to

a.) work out a Kontakt keyswitch map for the articulations and expressions
b.) export to a DAW and do the same as in iOS above - i.e. split out into separate tracks manually after the fact.

Rinse & repeat for Sibelius, Finale, Dorico, etc.

Yep. It's not a small task.

Some people skip notation programs altogether and just compose in the DAW using MIDI directly, writing the articulations out up front in the tracks to which they belong, if using separate instrument voices per track. I don't typically write orchestral music that way. I think and write in terms of a notated score; the instrumentation, for me, is secondary. Ideally, one should be able to write a score and have the computer interpret it to produce realistic articulations and expression.

That's what the keyswitch and other mappings from notation programs attempt to do. They still need after-the-fact tweaking for best realism, and, in fact, desktop Notion gives you some access to CC values via an overlay to assist with that. The latter's a bit kludgy, but usable.
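"Tweaking CC values" here means drawing controller curves - commonly CC11 (expression) or CC1 (mod wheel) - over a phrase to shape its dynamics. As a hypothetical illustration of what such a curve is, here is a sketch that generates a linear CC11 crescendo ramp as MIDI-style events:

```python
def cc_ramp(cc=11, start=40, end=110, ticks=1920, step=120):
    """Generate a linear controller ramp, e.g. a CC11 crescendo.

    Returns (tick, cc_number, value) events rising from `start` to
    `end` over `ticks` ticks, one event every `step` ticks. CC11 is
    the standard expression controller; the tick counts are arbitrary
    example values (1920 ticks = one bar at 480 ticks per quarter).
    """
    events = []
    for t in range(0, ticks + 1, step):
        value = start + (end - start) * t // ticks
        events.append((t, cc, value))
    return events

ramp = cc_ramp()
print(ramp[0], ramp[-1])
```

A real crescendo curve is rarely this linear - hand-shaping such ramps, note by note and phrase by phrase, is precisely the tweaking work being described.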

Further, in desktop Notion, suppose you have EastWest's normal legato strings voice for a particular string instrument, but at one point you want to use the Butter Legato that's available for that instrument. There's no built-in mapping for that. You could write a rule for it, or, as some do, temporarily insert another instrument on the track: for the bar and a half where you want Butter Legato, you "insert" a new instrument, then revert to the original thereafter. Again, a bit kludgy but usable.

There you go... you burst my bubble. I was building up my hopes for nothing.
But it's good to know that there is a way in the desktop world to achieve this, even if the final output may need further tweaking. Thanks for your detailed explanations of how to achieve this.

Just wondering: with our current technology, is this so hard to achieve - translating music from a notation s/w and recording it into a DAW using sample libraries, preserving all the articulations?
Maybe there is no market for this. I guess folks either record directly into a DAW using real instruments/sample libraries, or folks like you "write" using notation s/w.

I belong to the irrelevant market here... no music theory knowledge, no instrument skills. Well, for now I will keep trying on the iOS platform, and maybe someday upgrade myself to the above group.

@MusicInclusive said:
This is also all Notion on the iPad, and @ipadmussic, this is an example where I exported the wav mix and imported just that stereo out into Auria Pro and then did a little work with Pro-MB and Pro-ML. I also exported it dry from Notion and added AltiSpace for reverb in AP instead of using Notion's reverb.

Did I miss this one? That sounds REALLY good! Notion through Auria Pro produces one helluva Notion

@ipadmussic said:
Just wondering: with our current technology, is this so hard to achieve - translating music from a notation s/w and recording it into a DAW using sample libraries, preserving all the articulations?
Maybe there is no market for this. I guess folks either record directly into a DAW using real instruments/sample libraries, or folks like you "write" using notation s/w.

I belong to the irrelevant market here... no music theory knowledge, no instrument skills. Well, for now I will keep trying on the iOS platform, and maybe someday upgrade myself to the above group.

Well. Hmm. I think that once Dorico moves from pre-alpha to a product that should actually be on sale, it might help. (IMO it was placed on sale while still at the pre-alpha stage, and it's only now moving into beta with all the feedback from the people who bought it. It's a great product - or it will be when it's finished; kudos to the devs - but it seems to me that Steinberg's project management went a bit odd there.) Its DAW-like facilities might then provide for more in-program tweaking, allowing people to do more of that notation-to-realistic-rendering directly in a notation program. But, really, it isn't finished yet.

Desktop Notion is probably the easiest to use for this directly at the moment, in some fashion that produces an immediately pleasing rendered result, but people have long worked directly in DAWs like Cubase - since the early Cubase days - entering notes by hand into MIDI tracks to instrument. Google "Cubase expression maps" and you'll see what people have worked up.

A lot of people do that, yes - with either the traditional libraries like EastWest and VSL, or with the wonderfully sculpted but very complex newer Kontakt libraries like the Albion range from Spitfire, Chris Hein, Embertone and so on. Those provide a great deal of expression variation, but they are more often played in from a keyboard than scored in, say, Notion and rendered automatically. Or, if the latter is done with custom mappings, further expression work is still done in a DAW afterwards.

I highly recommend watching the latest Chris Hein library videos to get a feel for the depth of realism that can be achieved - especially where they show the library emulating a recorded violin. But know that a significant amount of manual work went into achieving that. Multiply that by a symphony's worth of instruments and bars!!! A lot of work.

Have you noticed that a lot of computer-instrumented film music these days is staccato? And that if there are lush strings, they're usually not long solo passages? (Or if they are solo, they're recorded live with a real violinist.) There's a reason for that: staccato and staccato-like articulations are much easier to handle, and the ear is more forgiving of them than of an emotionally expressive solo instrument. Can it be done? Sure, but it takes a lot of work, and a longer passage takes that much more.

That being said a lot can still be done pretty well - even with things like Notion on iOS.