Docs by the Reaper devs: https://www.reaper.fm/sdk/reapeaks.txt - specs for the reapeaks files. Outdated according to cfillion: "v5.15+ uses floating point, and it doesn't mention spectrograms & spectral peaks"

Within Reaper, all processing is done strictly in terms of sample blocks. Complete sample blocks are transferred to and from VSTs, and the environment constructed for any JSFX instance communicates with the Reaper audio/Midi infrastructure in terms of complete sample blocks.

Any "instance" in Reaper works on a sample block as soon as it is available (e.g. provided by the previous station in the FX chain), as long as the total processing latency is not exceeded; this is relevant for blocks read from a file. Hence a plugin usually sees a block well ahead of the time it is output to an audio device - at most as early as the total processing latency, which is determined by the sampling frequency, block size and block (buffer) count set for the audio interface (the total can amount to several seconds).
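The latency bound mentioned above is simple arithmetic on the audio device settings. A minimal sketch (the sample rate, block size and buffer count are made-up example values, not taken from the text):

```python
# Upper bound on how far ahead of audible output a plugin may see a block.
# The concrete numbers below are hypothetical examples.
def total_processing_latency(sample_rate_hz, block_size, block_count):
    """Latency in seconds = block_size * block_count / sample_rate."""
    return block_size * block_count / sample_rate_hz

latency = total_processing_latency(48000, 256, 8)  # ~0.0427 s for these settings
```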

In fact, within any track (FX chain), Reaper uses a single block buffer at a time and a single OS thread, and calls the plugins one after the other, providing the memory address of this block; each plugin just modifies the content of that block. On return from one plugin, Reaper simply continues with the next one. "Normal" plugins modify the block content when they are called. Complex plugins (such as Kontakt) that create their own OS threads are also only allowed to do exactly that, but they can work in the background to prepare the information they are going to write into the next block, once they are called by Reaper.
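This single-buffer, sequential scheme can be sketched as follows. This is a toy model, not Reaper code; the two "plugins" are invented for illustration:

```python
# Toy model of a track's FX chain: one shared block buffer, one thread,
# each "plugin" mutates the buffer in place before the next one runs.

def gain_plugin(block):      # hypothetical plugin: halves the level in place
    for i in range(len(block)):
        block[i] *= 0.5

def invert_plugin(block):    # hypothetical plugin: flips polarity in place
    for i in range(len(block)):
        block[i] = -block[i]

def process_chain(block, chain):
    for plugin in chain:
        plugin(block)        # every plugin gets the same buffer address
    return block

block = [1.0, -2.0, 4.0]
process_chain(block, [gain_plugin, invert_plugin])
# block is now [-0.5, 1.0, -2.0]
```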

Regarding JSFX, Reaper calls @block as soon as a block is available from the previous station in the FX chain. After @block returns, Reaper performs a loop with "samplesblock" iterations to iteratively call @sample. With each iteration it provides the values spl0 ... spl63 from the appropriate sample block, and after return from @sample it writes the modified values of spl0 ... spl63 back into the sample block.
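The driving scheme just described can be modeled roughly like this (a Python sketch of what Reaper does natively; the callbacks stand in for @block and @sample, and a single spl value per frame is used for brevity):

```python
# Rough model of how Reaper drives a JSFX: @block once, then @sample
# once per frame, with the sample value loaded before and stored after.
def run_jsfx(block, on_block, on_sample):
    samplesblock = len(block)
    on_block(samplesblock)              # @block runs once per block
    for i in range(samplesblock):       # then samplesblock iterations:
        block[i] = on_sample(block[i])  # spl0 is loaded, @sample runs,
                                        # the modified value is written back
    return block

# Example callbacks (invented): @block does nothing, @sample doubles spl0.
run_jsfx([0.1, 0.2], lambda n: None, lambda spl0: spl0 * 2.0)
```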

Hence there is no such thing as "timing" of @block vs. @sample. They need to be considered as called "at the same point in time", and that point in time is not in any way related to other points in time throughout Reaper. That is why it does not make any sense to transfer realtime (block- or even sample-accurate) information between multiple plugin instances via global memory: the "nearby" calls to the communicating plugins can happen with different blocks, and the timing can differ by the total "complete processing latency". If the plugins are in the same track, the "global" communication can overtake the block sequence; if they are located in different tracks, the timing correlation is simply random.

Regarding Midi, Reaper tags each Midi message with the sample block within whose duration the message is located, plus an offset in samples (i.e. multiples of 1 / sample frequency) to exactly denote the "virtual" point in time.
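The (block, offset) pair determines the virtual time of a message. A minimal sketch of that arithmetic (the concrete numbers are made up for illustration):

```python
# The "virtual" time of a Midi message: its block index times the block
# size, plus the in-block offset, divided by the sample rate.
def midi_event_time_seconds(block_index, block_size, offset, sample_rate_hz):
    return (block_index * block_size + offset) / sample_rate_hz

# e.g. a message at offset 100 inside the third 512-sample block at 48 kHz
t = midi_event_time_seconds(2, 512, 100, 48000)
```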

Now when receiving Midi in a JSFX (no matter whether in @block or in @sample), midirecv() will provide the next Midi message associated with the sample block the JSFX infrastructure is currently working on, and provide the correct offset (relative to that sample block) in the "offset" variable.

When sending a Midi message via midisend(), the message is associated with the sample block the JSFX infrastructure is currently working on, and the offset given is stored with that message.

Hence it is not necessary to do midirecv() or midisend() in @sample. In fact, doing midirecv() in @sample creates a lot of unnecessary overhead - you need to do it in a loop anyway, and the first call of @sample after an @block will yield all messages associated with that sample block. Regarding midisend(), I understand that there is not much difference between doing your own loop of "samplesblock" iterations and using @sample. (In fact, a loop of "samplesblock" iterations might be avoidable to decrease CPU overhead, but that would often need a much more complex algorithm to calculate the (virtual) timing, e.g. of a ramp of parameter values.)
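The recommended pattern - draining all of a block's messages in one loop inside @block - can be modeled like this. The queue stands in for midirecv() and the message contents are invented; this mirrors a `while (midirecv(offset, msg1, msg2, msg3)) ( ... );` loop in JSFX:

```python
# Model of draining all Midi messages for the current block in one call,
# instead of calling the receive function once per @sample iteration.
def drain_block_messages(queue):
    """Pop (offset, msg) pairs until the per-block queue is empty."""
    handled = []
    while queue:
        offset, msg = queue.pop(0)
        handled.append((offset, msg))   # a real JSFX would process msg here
    return handled

events = [(0, "note-on"), (37, "cc"), (511, "note-off")]
drain_block_messages(list(events))
```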

This is why for sample-accurate Midi "timing", only the correct management of the "offset" is critical, creating a "virtual timing", while the "physical timing" (when a message is received or sent) does not matter at all.

One might hope that Reaper does special handling for midisend() calls in @sample (e.g. ignoring the given offset and replacing it with the offset of the audio sample handled in that loop cycle, i.e. the loop counter). However, I did a test verifying that the offset value is left as it is, even when sending a Midi message in @sample. Hence it does not make much sense to do midisend() in @sample.

If multiple Midi messages with the same offset are sent for the same sample block, Reaper will handle all of them, but the sequence in which they will occur is presumably undefined.

@nofish
Yes, I should add the info that the int is an integer representation of the first x bytes of the accompanying string. Though I think you can't access them via get_config_var_string(), so even though they are strings, you can access them only as int.

With afxcfg, rendercfg and recordcfg, it's the same case.

Isn't rendercfg what we now can access via GetSetProjectInfo_String(0, "RENDER_FORMAT", ...)?

edit:
Btw. the next SWS version will prevent accessing string config variables via SNM_Get/SetIntConfigVar() and SNM_Get/SetDoubleConfigVar(), to avoid potential crashes as per Justin's advice in the linked thread.

Unfortunately there is an ongoing misconception regarding plugins' DAW parameter modulation / Midi / Reaper Control Path, due to a lack of appropriate documentation and sometimes misleading naming in the Reaper GUI.

I seem to remember that I already wrote a text on this, but presumably my research is more advanced now.

Regarding the Midi routing in Reaper, it's important to see that it always happens in parallel with a (potentially muted) audio stream. The timing of the Midi messages is tied to it; a description of some details is here -> https://forum.cockos.com/showpost.ph...3&postcount=90. The Midi channel (which by the Midi standard is 1..16) is internally enhanced in Reaper by the "Midi Bus" (1..16), and hence any Reaper Midi message features one of 256 virtual Midi channels (denoted by Bus and Channel).
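The 16 buses times 16 channels give the 256 virtual channels mentioned above. One possible flat encoding is sketched below; the exact internal representation in Reaper is an assumption here, only the 16 x 16 = 256 structure comes from the text:

```python
# One way to number the 256 virtual Midi channels (bus and channel are
# both 1..16, as in the Reaper GUI). The flat 0..255 index is invented
# for illustration, not Reaper's documented internal format.
def to_virtual_channel(bus, channel):
    return (bus - 1) * 16 + (channel - 1)   # 0..255

def from_virtual_channel(v):
    return v // 16 + 1, v % 16 + 1          # back to (bus, channel)

assert to_virtual_channel(1, 1) == 0
assert to_virtual_channel(16, 16) == 255
```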

A plugin in a track's FX chain can receive any of these messages (for "standard" plugins, Reaper manages the input and output "Bus" together with the routing instructions, while JSFX plugins can - and are supposed to - do that themselves in their code).

For plugin "DAW" parameter modulation, CC messages can be used by Reaper. These are taken from the Midi message stream in the FX chain at the location of the plugin in question (managed by [Param] -> "Parameter List" -> "Parameter Modulation / Midi link").

Obviously Reaper Scripts don't "live" in an FX chain and hence they can't see Midi messages.

Additionally to (and separate from) the Midi message streams (in the tracks, in parallel with the audio streams), Reaper features the "Reaper Control Path". It does not carry Midi messages, but just parameter change instructions. These can be derived from Midi CC messages when activating the "Control" feature for a Midi input device, or be explicitly sent by the "MidiToReaControlPath" plugin (or of course by other Reaper extensions that use the appropriate Reaper API). They can be constructed to represent Midi CC messages (or e.g. OSC messages). The timing of the messages in the Control Path is not strict; they are handled just "as fast as possible", hence you can't rely on it in any timing-critical situations.

Reaper Scripts can be triggered by such Parameter Change messages in the Reaper Control Path.

For plugin "DAW" parameter modulation, such parameter change messages can be used by Reaper. These are taken from the (single) Reaper Control Path (independent of the track the plugin is located in, and of any Midi routing). This feature is managed by [Param] -> "Parameter List" -> "Learn".

Is it possible to make the Opus encoder use channel mapping family 2? I assume family 255 is used now?

Opus channel mapping families 2 and 3 are now an IETF standard for ambisonic channel mapping.

Bosse

Quote:

Originally Posted by mespotine

I updated the render-code docs. They now include the way to go to generate all render-codes; new additions: WavPack, OGG, GIF, LCF, AudioCD. M4A is still missing, as it is Mac-only and I don't have a Mac to document it.

I also made a more complete ini-file of render-codes that can be put together with the information included:
Video render codes
The rendercfg-codes as an ini-file, for many formats like DDP, AIF, MP3, FLAC, Video, OPUS and OGG.
The sections are the file-formats (like [OGG] or [FLAC]).
In every section you can find a key called Renderstring, in which the renderstring template is stored.
All changeable parts are replaced by [FormatAlterID].
e.g. for OPUS:

Code:

Renderstring=U2dnTwAA[KBPS][MODE][Complexity]AAAA==

Replace [KBPS], [MODE] and [Complexity] with the values of the corresponding keys in the [OPUS] section.
e.g.:
KBPS_xxx - where xxx is the bitrate-number
MODE_xxx - where xxx is the mode
Complexity_xxx - where xxx is the complexity-number.

So a renderstring for OPUS with Mode: VBR, Bitrate: 24 kbps, Complexity: 5 would be built by filling in the corresponding values from those keys.
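The substitution step described above can be sketched like this. The placeholder values ("X1", "Y2", "Z3") are deliberately fake; the real codes would come from the KBPS_xxx / MODE_xxx / Complexity_xxx keys in the ini-file:

```python
# Fill the renderstring template from the ini-file as described above.
# The template is the one quoted for OPUS; the substituted codes here
# are invented placeholders, NOT real render-codes.
template = "U2dnTwAA[KBPS][MODE][Complexity]AAAA=="

def fill_renderstring(template, values):
    for key, code in values.items():
        template = template.replace("[" + key + "]", code)
    return template

s = fill_renderstring(template, {"KBPS": "X1", "MODE": "Y2", "Complexity": "Z3"})
# s == "U2dnTwAAX1Y2Z3AAAA=="
```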

BTW: I have recently added some functions to my API which should make it easy to read the source files I use for my Reaper docs, API and such.
As they include parameters, retvals, tags and the like, would you be interested in attempting a version of your great Reaper API docs using my source files instead of the original ReaScript HTML file?
My more detailed information combined with your search, filter and clipboard features would make it awesome, I think.

Also added tons of new information to the ConfigVars docs, and added the recently added pre-release ones as well.
If a configvar is a ProjectDefault-related one, it is now documented. That way you can now alter ProjectDefaults, at least those that are available as "live-editable" ConfigVars.
Spent two days on that rework.