Comments, problems, and suggestions about Oric emulators (Euphoric, MESS, Amoric, etc.): this is the right place to ask. And don't hesitate to share the tips and tricks that help you get the best out of these emulators on your favorite operating system.

And then it exits the DosBox emulator on Android, leaving ALL the TAP files from my Euphoric folder freshly zipped into a Ztaps.zip file on Android, which I then (manually) tap in Android's file manager and select "Send via Bluetooth".
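For anyone who wants the same "zip all the TAPs" step without going through DOSBox, here is a minimal sketch in Python; only the `.tap` extension and the `Ztaps.zip` name come from the workflow above, the function name and folder layout are my own illustration:

```python
import zipfile
from pathlib import Path

def zip_taps(folder: str, archive: str = "Ztaps.zip") -> list[str]:
    """Collect every .tap file in `folder` into a single zip archive.

    Returns the list of file names that were archived.
    """
    taps = sorted(Path(folder).glob("*.tap"))
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for tap in taps:
            # arcname strips the folder prefix so the zip holds bare file names
            zf.write(tap, arcname=tap.name)
    return [t.name for t in taps]
```

The resulting archive can then be sent over Bluetooth from the file manager exactly as described above.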

IT WORKS!!!
(Almost automatically)

And regarding the battery:
DosBox, emulating Euphoric with the Oric CPU at 2.5 MHz, drains exactly 0.333% of the battery per minute (measured at a room temperature of 21°C).
That means: if the Android device's battery starts at 100%, at room temperature, it can run DosBox emulating Euphoric for up to 5 hours on a single charge.
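The 5-hour figure follows directly from the drain rate; assuming the 0.333% is per minute, the arithmetic is:

```python
def runtime_hours(drain_pct_per_minute: float, charge_pct: float = 100.0) -> float:
    """Minutes until a full charge is drained at a constant rate, in hours."""
    return charge_pct / drain_pct_per_minute / 60.0

# 100% / 0.333%-per-minute ~= 300 minutes ~= 5 hours on a single charge
```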
NOT BAD AT ALL!!!

So... why even bother developing other emulators??
Euphoric can be easily handled, installed, and configured (just copy/paste these example settings and use it!),
and it is NOT a battery drain...

That's not mad.
THIS is mad.
And it is, unfortunately, one of my never-fully-completed programs.

I named it "Midi",
but it isn't an actual MIDI file player/editor.
It can address all of the Oric's sound potential...
but it can never synchronize the playback speed between Composing and Playback mode.
(And THAT is driving me crazy.)

Attachments

Iskra (Euphoric) Midi creator and editor

Last edited by Brana on Fri Feb 03, 2017 3:18 am, edited 2 times in total.

Brana wrote:But it can never synchronize the playback speed between Composing and Playback mode

From the picture I don't fully understand what your program does (for instance, what do the numbers "01,03,06,15" etc. mean?),
but I think it's not a big problem to have synchronized playback using timers and some assembler code...

OMG! REALLY??
Ok, here we go:
It enables DRAWING the PATTERN of the desired music on the screen,
and plays it even "on the fly": every note as I draw it on the screen (those red squares).

(This can be quite irritating for the neighbors, as I usually run it at night!)

It lets me simply imagine a melody in my head, and then just draw that melody on the screen!
It generates and plays the music, and "LPRINT"s the final result into a text file (Printer.out). That file can then be fed to the Txt2Bas.EXE utility to produce a standalone MUSIC program that plays on the Oric Atmos (emulator), or a standalone TAP file containing the composed melody (created in the EASIEST possible way), which can then be used further on from within my other programs...

It runs in BASIC, and its final output file is IN BASIC too. For example, this would be an output file generated by the program (just an example):
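The example file itself didn't survive in the post, but from the description the generated output is just numbered BASIC lines built from the composed notes. A hypothetical generator sketch, where the line numbering, the WAIT delay, and the note tuples are all my assumptions, not Brana's actual format:

```python
def to_basic(notes, start_line=10, step=10, wait=20):
    """Turn (channel, octave, note, volume) tuples into numbered BASIC lines.

    Each note becomes one MUSIC statement followed by a WAIT so the note
    is held before the next one starts.
    """
    lines = []
    n = start_line
    for channel, octave, note, volume in notes:
        lines.append(f"{n} MUSIC {channel},{octave},{note},{volume}")
        n += step
        lines.append(f"{n} WAIT {wait}")
        n += step
    lines.append(f"{n} END")
    return "\n".join(lines)
```

Text like this is exactly what a Txt2Bas-style utility could then convert into a loadable program.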

The Oric has 3 sound channels, and this little baby can compose for all of them...
Start with the first (draw your melody on it), then add the second (it mixes the sounds, and this is where the speed and synchronization problems start to occur), then add the third channel (it creates three-note chords, if that's the right word in English, but plays them out of sync).

It supports sound envelopes (or, again, is that what they are called in English?).
Meaning: any note can have a "fade in" or "fade out" effect, for example... (This part does work, though.)

The main idea behind this concept comes from my favorite non-Oric audio editing software: SoundForge.
(For those who know how SoundForge works, I need say nothing more.)

iss wrote:
From the picture I don't fully understand what your program does (for instance, what do the numbers "01,03,06,15" etc. mean?)

The first number = the sound channel (1, 2, or 3).
The second number is the octave.
The third number: the note within the current octave.
The fourth number: the volume for that note (15 = maximum loudness, for example).
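Those four fields line up with the parameters of Oric BASIC's MUSIC command (channel, octave, note, volume). A quick sketch of the decoding in Python, where the function name and the dict layout are mine and the value ranges are the ones described above:

```python
def decode_note(group: str) -> dict:
    """Parse one '01,03,06,15' group into its four fields."""
    channel, octave, note, volume = (int(x) for x in group.split(","))
    assert 1 <= channel <= 3, "the Oric has 3 sound channels"
    assert 1 <= note <= 12, "note within the current octave"
    assert 0 <= volume <= 15, "15 = maximum loudness"
    return {"channel": channel, "octave": octave, "note": note, "volume": volume}
```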

iss wrote:
but I think it's not big problem to have synchronized playback using timers and some assembler code...

It IS a BIG problem for me (the program runs BASIC code ONLY).
ONE sub-program handles the composing mode.
The OTHER sub-program handles playback.
And there are also other subroutines that kick in if and when needed, and they are all in BASIC...

For example: while composing, the lower part of the screen scrolls from right to left as I draw the newest note at the right side of the screen (and at that moment it also refreshes the music information (the numbers) in the upper (black) section of the screen).
The scrolling subroutine is executed WHILE the current note is playing.
So the playing length of that note is affected by the screen-scrolling subroutine running at that moment (keep in mind, I'm in BASIC).

And in playback mode a completely different subroutine handles the on-screen display of the defined notes for all 3 channels... which affects the playback speed in its own way, and so on...

The concept is "dynamite", but the modes are NEVER actually in sync.
And this has been bugging me for 7-8 years (in the last couple of years I haven't even launched it once).
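For what it's worth, the timer approach iss suggests comes down to scheduling each note against an absolute deadline (start + n * tick) instead of sleeping a fixed amount after the variable screen-drawing work. A small sketch of the difference, with tick length and work times made up purely for illustration:

```python
def deadlines(start: float, tick: float, count: int) -> list[float]:
    """Absolute time at which each note should start: start + n * tick.

    Waiting until these deadlines means a slow screen-scroll frame eats
    into its own tick but never delays the notes that come after it.
    """
    return [start + n * tick for n in range(count)]

def drift_sleep_after_work(work_times: list[float], tick: float) -> float:
    """Accumulated lateness when each note instead waits a full tick
    AFTER its (variable) drawing work: every slow frame pushes every
    later note back, so the error grows without bound.
    """
    elapsed = 0.0
    for w in work_times:
        elapsed += w + tick
    return elapsed - tick * len(work_times)  # equals the total drawing time
```

With absolute deadlines the tempo is pinned to the clock no matter how long the scrolling takes, which is exactly why this is usually done with a hardware timer interrupt and a bit of assembler rather than pure BASIC.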