Hello, I'm currently working on a Pure Data patch so that I can use my Gladiator VST with a more mouse-free approach. I'm programming this patch to work with OSCulator inside Ableton Live.
The first thing I did was make two buttons on my MIDI controller increment and decrement the first waveform in Gladiator (Osc1 Wave). Researching people trying to do the same thing for the tempo control in Ableton, I found a patch that did this using the [moses] object. Briefly, each button sends from OSCulator into Pd, increments or decrements through the [moses] object, and sends the result back to OSCulator as one MIDI message ([ctlout 1 1]). That message is then sent on to Ableton and the Gladiator VST.
This has been working pretty well, and I'm fairly pleased, but there is some fine tuning I need to do to get it perfect. The Osc1 Wave slider goes from 0 to 1000, and oddly, there are only 163 waveforms, so from 163 to 1000 the waveform just stays at waveform 163. When I first map [midi 1 1] in OSCulator to Osc1 Wave in Gladiator, incrementing and decrementing with my MIDI controller moves in intervals of 16, which is bad because I skip 15 or so waveforms with every step. To reduce this interval, I set the min and max of MIDI message 1 in Ableton (which hosts the Gladiator VST) from 0.0 - 1.0 to 0.0 - 0.17. This has made my interval mostly 1, with the occasional 2, skipping a waveform.
Ex: 0, 1, 2, 4, 5, 6, 8, etc.
I have done some research in an effort to solve this problem, which I believe has something to do with translating/scaling the standard 0-127 MIDI range to a different range. I have messed around with object chains such as:

[* ]
|
[/ 1000]

but had little luck, and I'm not sure this is even in the same realm as my problem.
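For what it's worth, the skipping may be unavoidable with a 7-bit CC. A quick sketch of the arithmetic (assumption: Gladiator quantizes the 0.0-1.0 CC range into 163 equal waveform slots):

```python
# A 7-bit CC has only 128 distinct values, so no min/max scaling can
# reach all 163 waveform slots: some indices are always skipped, which
# matches the "mostly 1, occasional 2" steps described above.

def waveform_index(cc, n_waveforms=163):
    """Map CC 0-127 linearly onto waveform slots 0-162."""
    return round(cc * (n_waveforms - 1) / 127)

hit = {waveform_index(cc) for cc in range(128)}
missed = sorted(set(range(163)) - hit)
print(len(hit), len(missed))  # 128 reachable slots, 35 unreachable
print(sorted(hit)[:7])        # [0, 1, 3, 4, 5, 6, 8] - note the gaps
```

The gap pattern (0, 1, 3, 4, 5, 6, 8, ...) is the same shape as the "0, 1, 2, 4, 5, 6, 8" sequence above; to reach every waveform you would need a higher-resolution message (e.g. 14-bit pitch bend or NRPN) rather than a different scaling.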
I'm attaching my osc and pd files, let me know if anything is unclear.
Thanks!
[http://www.pdpatchrepo.info/hurleur/tempo4.pd][0]
[0]: http://www.pdpatchrepo.info/hurleur/tempo4.pd

I was having problems getting MIDI clock / realtime MIDI to work in Pure Data. When I start receiving a MIDI clock signal fresh for the first time after starting Pure Data, everything works as expected: Pure Data receives the realtime MIDI data. (I use this patch for deriving the clock data: http://little-scale.blogspot.de/2013/03/dealing-with-midi-clock-in-pd-sync-pd.html)
But now, when I stop the clock and restart it, I always get the error "warning: MIDI timing FIFO overflowed" in the Pd console. I am using Ableton Live 9 as host; MIDI goes over the IAC driver on Mac OS X. Searching the web only brought up old websites and no solution, so I experimented a little myself and maybe found one:
What helped was the following: by coincidence, I went to the "Preferences..." settings panel and turned off the "Realtime Termination" flag (I have the German version, where it is called "Echtzeit Terminierung"; the English wording may differ).
**Does anyone know what the Realtime Termination flag does?** And why does it solve the realtime MIDI issues that occur when receiving MIDI clock signals? I just want to be sure that this is really solving the problem.

Hi all,
This is a simple patch I created for a user on this forum, so I may as well share it here. It allows you to load 10 MIDI files that can be played via Pd or by pressing keys on a MIDI keyboard.
Download: [midi-player.zip](/uploads/files/upload-a8fa431e-be21-4014-a12e-f5623ab1fe94.zip)
![howto.png](/uploads/files/upload-5d2195b0-5e0c-46b7-804e-c626fbf4fc2a.png)
I hope someone will find this useful.
Best,
Gilberto

I need to populate a list with the MIDI notes that are being pressed, and remove the note from the list when the key is released.
One idea is to use [notein]: if the velocity is > 0, add the note to the list, and if the velocity == 0, remove the note from the list. But I have no idea how to implement this.
I think I need [list-extend], [list-find], and [list-delete], but I just can't figure out an implementation.
How can I do this?
**EDIT: solved the problem and have a working patch. See my reply below for details.**
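The core logic can be sketched like this (Python for clarity; in Pd it maps to [notein] feeding a velocity test, then [list-extend] / [list-find] / [list-delete]):

```python
# Maintain a list of currently held MIDI notes: note-on (velocity > 0)
# appends the pitch, note-off (velocity == 0) removes it.
held = []

def notein(pitch, velocity):
    if velocity > 0:
        if pitch not in held:  # guard against repeated note-ons
            held.append(pitch)
    elif pitch in held:
        held.remove(pitch)

notein(60, 100); notein(64, 90); notein(60, 0)
print(held)  # [64]
```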

i patched up an app for the monome and thought i'd share. included in the zip is a midi looping abstraction i rolled "from scratch" - it (the abstraction) is completely independent from monome requirements and can be used generally in any pd app (provided the cyclone library is installed).
[http://monome.org/docs/doku.php?id=app:helix][0]
\[EDIT: HA!
alright - i JUST realized i wrote 'androidome' in the title.
my mistake, i meant ARDUINOME. jeez.\]
[0]: http://monome.org/docs/doku.php?id=app:helix

Hey, I'm pretty new to Pd and I'm a bit confused. I'm working on my diploma, a light show for a band called Brandt Brauer Frick. You can see a layout graphic here: [http://www.sebastian-selbach.com/wp-content/uploads/2011/08/Stage.jpg][0]. The light show will be composed in Ableton Live, and I need a sort of preview screen for MIDI data. My question is: how do I get the velocity of one single MIDI note?
I know that I can isolate the channel with [notein 1], but how does it work for the MIDI note? Thanks in advance.
[0]: http://www.sebastian-selbach.com/wp-content/uploads/2011/08/Stage.jpg
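A sketch of the idea (in Python for clarity; one Pd idiom is [notein 1], packing pitch and velocity, then [route 60] so only the chosen pitch's velocity comes through - the pitch 60 here is just a hypothetical choice):

```python
# Isolate the velocity of one particular MIDI note and ignore the rest.
NOTE = 60  # hypothetical pitch we care about

def velocity_of(pitch, velocity):
    """Return the velocity if this is our note (0 = note-off), else None."""
    return velocity if pitch == NOTE else None

print(velocity_of(60, 101))  # 101
print(velocity_of(61, 99))   # None
```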

hi everybody
Explanation of my problem:
Config: Ableton + TouchOSC + loopMIDI (I also tried MIDI Yoke) (+ Pd of course).
I use MIDI clips sending CC data (1 CC per clip, only 1 channel used for MIDI output, to avoid wrong data), so I can use the CC values to drive play-position cursors in TouchOSC.
In Pd I set up something like this:

[ctlin cc# ch#]
|  \
|   [number]   <- for monitoring; easier to read than [print]
|
[send /fader $1(
|
[sendOSC]
My play-position faders sometimes work perfectly and sometimes they are glitchy and jittery.
At first I thought it was caused by the sheer number of OSC messages sent by Pd (I planned to filter values after [ctlin] so it doesn't pass redundant values and only sends changed OSC messages, but I got lazy and investigated elsewhere first).
I put a number box after [ctlin] and found that the values are delayed. Sometimes cleanly delayed: everything coming late. Sometimes warped: everything OK, but with some intermittent lower values. And sometimes, for no reason, it works perfectly. The delays are several seconds long.
In Ableton I can see the cursor moving correctly all along the MIDI clip. If I turn off MIDI out from Ableton during a delay, I still have MIDI activity in Pd for a few seconds.
Other data is sent and received on time (OSC, MIDI out).
I only use 1 MIDI in and out in Pd and Ableton, via 2 loopMIDI virtual cables. Audio is off, no delay (1 ms).
What could cause this? It's either between Ableton and Pd or inside Pd.
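For reference, the redundant-value filter mentioned above is what Pd's [change] object does; a sketch of that logic:

```python
# Only forward a CC value when it differs from the previous one,
# cutting the OSC message rate (Pd: [ctlin] -> [change] -> [sendOSC]).
last = None

def on_cc(value):
    global last
    if value != last:
        last = value
        return value  # forward to the OSC sender
    return None       # drop the repeat

out = [v for v in (3, 3, 4, 4, 4, 5) if on_cc(v) is not None]
print(out)  # [3, 4, 5]
```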
thanks in advance

Greetings All,
I have constructed a patch using \[fiddle~\] that will take a signal from my guitar rig and output midi info via \[noteout\] so that I can play other synths.
Right now, if you look at the patch, I have [timer] measuring the time between attacks, which is eventually fed into [noteout]. This process is retrospective: it takes the duration of the last note I played and applies it to the current note I am playing. That is not so great if I am shredding away and want to end the phrase with a sustained note longer than the stream of short notes I just played, because that long note will only get a short duration, unless I use an expression pedal through a sustain subpatch with [ctlin 64 1] or something. That works OK, but I would really rather not have to keep my foot on an expression pedal to ensure the sustain.
Is there another way that I can generate a midi note-on message and have it sustained until I play another note?
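One common approach (assuming monophonic playing) is to defer the note-off entirely: store the last pitch, and only send its note-off when the next attack arrives. A sketch of that logic, which in Pd would be a [float] holding the previous pitch banged into a velocity-0 [noteout] just before the new note:

```python
# Hold each note until the next attack: the previous note's note-off is
# emitted only when a new note-on arrives.
current = None

def attack(pitch, velocity=100):
    global current
    events = []
    if current is not None:
        events.append(("note_off", current))
    events.append(("note_on", pitch, velocity))
    current = pitch
    return events

print(attack(60))  # [('note_on', 60, 100)]
print(attack(64))  # [('note_off', 60), ('note_on', 64, 100)]
```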
Any help is much appreciated!
Thanks,
dkeller
[http://www.pdpatchrepo.info/hurleur/SIGNAL-To-MIDI.pd][0]
[0]: http://www.pdpatchrepo.info/hurleur/SIGNAL-To-MIDI.pd

hello, it's me again with more newbie questions.
i'm studying with Johannes Kreidler's book, and there is a formula for converting MIDI notes to hertz in chapter 3 (audio) that i'm not sure i understand.
i know there are [mtof] and [ftom] objects, but since it's in the book as a formula i'm guessing it's important, and i wouldn't like to skip it just because the computer can do it for me.
according to the book, the MIDI-to-Hz formula using the [expr] object is
[expr 440 * pow(2, ($f1 - 69) / 12)]
if the MIDI note is 39, the frequency is supposed to be 77.78, but when i try to do it with my calculator i get a wrong result.
there is also the inverse formula for Hz to MIDI, which is
m = 69 + 12 * log2(f / 440)
if someone could break these formulas down for me, it would help me understand how these objects work.
(i already know [expr] and variables; i'm sure i'm just not reading the formulas right.)
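For what it's worth, both formulas check out numerically; a quick worked check (Python's `math.log2` standing in for `log(x)/log(2)` in [expr]):

```python
# MIDI note 39 is 2.5 octaves below A440: 440 * 2**(-2.5) = 77.78 Hz.
# Calculators often trip on entering the negative fractional exponent.
import math

def mtof(m):
    return 440 * 2 ** ((m - 69) / 12)

def ftom(f):
    return 69 + 12 * math.log2(f / 440)

print(round(mtof(39), 2))  # 77.78
print(round(ftom(77.78)))  # 39
```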

hi,
i'm trying to get this to work, but frankly i don't understand why this patch does exactly the opposite of what i want ;-)
what i'm trying to achieve here: only when MIDI note #65 comes in do i want to know that note's velocity value. any other MIDI note should just be ignored, or even reset the velocity value to zero.
any help would be much appreciated!
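A sketch of the intended gating logic (in Pd, one way is packing pitch and velocity and using [route 65], or gating the velocity with [spigot] driven by a pitch comparison):

```python
# Pass note #65's velocity through; any other note resets it to zero.
velocity_out = 0

def notein(pitch, velocity):
    global velocity_out
    if pitch == 65:
        velocity_out = velocity  # our note: pass its velocity
    else:
        velocity_out = 0         # any other note: reset to zero
    return velocity_out

print(notein(65, 88))  # 88
print(notein(60, 99))  # 0
```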
![](http://offspacecenter.com/pd-q.png)
[http://www.pdpatchrepo.info/hurleur/velocity\_select.pd][0]
[0]: http://www.pdpatchrepo.info/hurleur/velocity_select.pd

Hi
Thanks for the fast response to my previous question. Now here's another; I've also looked this up and not had much luck, even though it seems like a fairly obvious thing to do.
I want to use a MIDI input (0-127) to control a filter's cutoff frequency (and potentially other things, like volume, that our ears respond to logarithmically), and I'd like that control to work logarithmically, as it does on most synthesisers. It's a bit hard to get my head around the process, so I'm not really sure how to implement it.
Also, is there an equivalent to Max's [zmap] object, which is very useful when scaling a number range?
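A sketch of both ideas, assuming an exponential mapping for the cutoff (equal CC steps give equal frequency ratios, which is how we hear pitch and loudness) and a clip-and-rescale helper in the spirit of Max's [zmap]. In Pd the cutoff line could be something like [expr 20 * pow(1000, $f1/127)]; Pd-extended also ships [maxlib/scale] for the zmap role:

```python
def zmap(x, in_lo, in_hi, out_lo, out_hi):
    """Clip x into [in_lo, in_hi] and rescale it into [out_lo, out_hi]."""
    x = min(max(x, in_lo), in_hi)
    return out_lo + (x - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

def cc_to_cutoff(cc, f_lo=20.0, f_hi=20000.0):
    """Exponential: cc 0 -> f_lo, cc 127 -> f_hi, equal ratios per step."""
    return f_lo * (f_hi / f_lo) ** (cc / 127)

print(round(cc_to_cutoff(0)))          # 20
print(round(cc_to_cutoff(127)))        # 20000
print(zmap(200, 0, 127, 20.0, 20000.0))  # 20000.0 (input clipped to 127)
```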

Howdy,
I'd like to import MIDI files that I make in a sequencer into Pd. I've found a few ways to convert MIDI files to text on Windows, but was wondering if anyone here has a better solution (I work in OS X), or if there's already something in the extended versions that can do it?

When I select my USB keyboard, I get this warning:
'Warning: midi input is dangerous in Microsoft Windows; see Pd manual'
and if I load a patch synth, my keys don't play the synth.
I'm sure Pd can allow MIDI in; is there something I'm doing wrong?

Does anyone know whether Pd's MIDI runs in the same thread as the GUI?
I have managed to get Plogue Bidule synced to a MIDI clock sent from Pd over the IAC bus (OS X), but when, say, a window is moved, there is loads of glitching in the audio and the jitter seems to increase enormously. I have tried whacking up the buffer sizes, but of course the increased latency means that Bidule ends up offset by the order of the buffer size, or worse if there's jitter involved. It's not really practical to avoid touching the computer while it is running!
How can this be remedied? Is some kind of latency compensation required here?
I totally long for the day when Jack OS X gets MIDI; then we can have ReWire-like performance but with all the benefits Jack currently provides! (Does anyone have a clue when this might be?)
cheers

Hello!
I have a hardware sequencer that does not transmit NRPNs, so I'm using Pd to translate CCs coming from the sequencer into NRPNs for external gear.
I also need all the MIDI data coming from the sequencer, such as clock, notes, other CCs, etc. This is why I'm using [midiin] --> [midiout]. My only gripe is that the CCs I'm using from my sequencer to modify the external NRPNs still get dumped into [midiout].
I've thought about skipping [midiin] --> [midiout] and instead routing all notes, [midiclkin], [ctlin], etc. individually, but that feels like major overkill.
Does anyone know a better way to filter CCs out of the incoming MIDI while letting all other MIDI data through?
Something like:
[midi + cc1 + cc2] --in--> PD[-cc1] --out--> [midi + cc2]
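A sketch of that filter (simplified: it assumes complete 3-byte messages rather than the raw byte stream [midiin] gives you, and ignores running status; the blocked controller numbers are hypothetical examples):

```python
# Drop Control Change messages for selected controller numbers; pass
# everything else (notes, other CCs, etc.) through untouched.
BLOCKED = {1, 11}  # hypothetical controller numbers to swallow

def should_pass(msg):
    status, controller = msg[0], msg[1]
    is_cc = (status & 0xF0) == 0xB0  # CC status bytes are 0xB0-0xBF
    return not (is_cc and controller in BLOCKED)

stream = [(0x90, 60, 100), (0xB0, 1, 64), (0xB0, 7, 90)]
out = [m for m in stream if should_pass(m)]
print(out)  # [(144, 60, 100), (176, 7, 90)]
```

Note that single-byte realtime messages such as clock (0xF8) have no controller byte at all, so a byte-level version would pass them through unconditionally.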
Thanks!

Hi there!
I'm a total newbie to Pure Data but absolutely loving the possibilities.
I need a little help with making a patch that will take 16 rotary MIDI CC controllers and then send those as separate messages to an OSC device.
I can get it up and running, kinda, but keep getting various error messages.
Like: packOSC: bad path: 'float'
I know I'm doing something monumentally stupid but can't figure it out with my 8 hours of experience!
Any pointers?
Cheers.
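For reference, that "bad path: 'float'" error usually means [packOSC] received a bare number instead of a message beginning with an OSC address (a string starting with "/"). In Pd the fix is typically a message box like [/rotary1 $1( between each number and [packOSC]. A sketch of the message shape (the "/rotary" naming and the 0-1 normalization are hypothetical choices):

```python
# Build one OSC message per rotary: address first, then the value.
def cc_to_osc(knob, value):
    return ("/rotary%d" % knob, value / 127.0)

addr, val = cc_to_osc(1, 64)
print(addr)  # /rotary1
```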

Hello :-)
My name is Jorge Pinto and I am from Portugal. I am an electronics professional and I like to develop hardware.
I am building a cheap (about 30 euros) and simple homemade MIDI turntable to connect to Pd.
I am looking for information on other similar projects, and for feedback, ideas, etc.
I hope to release the final version very soon under a GPL licence.
Please tell me what you think about it and give me suggestions. Thank you.
Home page of project in English with one video: [http://casainho.net/tiki-index.php?page=Homemade%20midi%20turntable][0]
---
Casainho
[http://www.Casainho.net][1]
[0]: http://casainho.net/tiki-index.php?page=Homemade%20midi%20turntable
[1]: http://www.Casainho.net

Hi All,
I've attached a patch I constructed that converts a MIDI continuous controller stream of 0~127 into much higher-resolution output (currently set to 127^2 steps). It ramps from one MIDI number to the next over a period equal to the time that elapsed since the previous MIDI number arrived. Of course, it still prevents you from stopping on a value in the range between two MIDI numbers, but at least it allows that range to be swept through.
For example, say you want to use a controller to sweep through frequencies from 20 Hz to 20,000 Hz. Used normally, a MIDI controller would pick out only 128 discrete frequencies over that range and step from one to another like stepping stones. But using this patch, if you move the controller from, say, 34 to 35, the result is a smooth sweep through frequencies of about 5300 Hz to 4470 Hz over a period equal to the time taken to change from 34 to 35.
Unfortunately, the major limitation is that the patch only knows the time period at the moment the new number arrives, so the ramp only starts when it should have finished. This makes it impractical for slow sweeps, but you could use a separate, much lower-geared controller for those slower movements.
Any suggestions for improving or expanding the patch would be welcome.
I know that pitch bend already works as a 14-bit controller, but I don't want to be limited to one hi-res control at a time (I only have one pitch bend wheel!).
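The ramping idea can be sketched like this (Python for clarity; in Pd it is a [timer] between messages feeding the ramp-time of [line]):

```python
# On each new CC value, glide from the previous value to the new one
# over the time that elapsed between the two messages.
import time

last = {"value": None, "t": None}

def on_cc(value, now=None):
    now = time.monotonic() if now is None else now
    if last["value"] is None:
        ramp = (value, value, 0.0)  # first message: jump immediately
    else:
        ramp = (last["value"], value, now - last["t"])  # (from, to, seconds)
    last["value"], last["t"] = value, now
    return ramp

print(on_cc(34, now=0.0))   # (34, 34, 0.0)
print(on_cc(35, now=0.25))  # (34, 35, 0.25)
```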
Cheers,
Brett
[http://www.pdpatchrepo.info/hurleur/High\_Resolution\_Controller.pd][0]
[0]: http://www.pdpatchrepo.info/hurleur/High_Resolution_Controller.pd

Hi all,
I'm trying to implement what the topic's title says, but don't seem to nail it yet.
I've done this for now
![bl_duty_cycle.png](/uploads/files/upload-4974d8ee-8afb-4df4-a066-28f4d8f2fa3e.png)
but I get a pitch change as I change the duty cycle. Any suggestions? I've attached the patch so you can play with it (it uses the [bl_squarewave] abstraction I just posted in the abstract~ group of this forum).
[bl_duty_cycle.pd](/uploads/files/upload-bc8161de-46c9-40a9-897a-c99243845dcf.pd)
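One standard construction, sketched below, is to build the square as the difference of two sawtooths read from the *same* phasor, with the second offset in phase by the duty cycle. Pitch is then set only by the shared phasor, so changing the duty no longer detunes it (in Pd: one [phasor~], a [+~ duty] -> [wrap~] branch, and [-~] between the two saw waveshapes; band-limited saws like [bl_squarewave]'s would replace the naive ones here):

```python
# Variable-duty square from two phase-offset copies of one sawtooth.
def square(phase, duty):
    saw = lambda p: 2.0 * (p % 1.0) - 1.0            # naive saw in -1..1
    return saw(phase) - saw(phase + duty) + (2.0 * duty - 1.0)  # DC-centered

samples = [square(i / 8, 0.25) for i in range(8)]
print(samples)  # high for exactly 1/4 of the cycle, low for the rest
```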

Hi all, my first post here...
I'm using Pd 0.40-2 on Mac OS X Tiger 10.4.9, with much fun!
But every time I start the app, I have to set the audio/MIDI preferences again, because Pd has gone back to its default parameters (built-in audio I/O and no MIDI in/out).
Yeah, this is not really a problem, but it is a little annoying...
Do you know if this happens on OS X even with 0.39.2-extended-RC1? Is that version the better one? What are the core differences between the two versions (beyond the extensions)?
Thank you very much
Regards
jk

Hi,
I am not sure if this is a correct forum section to ask.
I have tried using MrPeach's OSC to trigger the virtual MIDI keyboard in Reaper in order to record MIDI on a track.
At first I thought this would give me much more precision in the timing of events than sending plain MIDI from Pd. However, there is a lot of timing imprecision in both the onsets and the lengths of the notes. I am using a loopback connection to send the data from Pd, so it should be very fast and precise in theory. I have a fast CPU.
I suppose the timing in Pd itself is rock solid and it outputs the OSC messages with correct timing. So what else could be causing these very noticeable imprecisions?
Any suggestions would be very welcome.
Thank you

Hello everyone,
I'm constructing a Pd patch for a uni assignment, but in the process of tinkering with my patch I've managed to create some kind of infinite feedback loop; the console says "error: stack overflow". My problem now is that I can't open the patch to get rid of the error; it simply freezes and doesn't open. Other patches work fine. I've got a few [loadbang]s initialising some parameters, so I put "noloadbang" in the startup flags, but on opening Pd it says "noloadbang: can't load library".
I *could* just start again and build it from scratch, but I'd rather not... any suggestions on how I can get at my patch to fix it?
Using Windows 7 on a Dell laptop, "pd version 0.42.5-extended"
Ta

Here is the Audio (track was split into stems and eq/reverb/compression added later)
https://soundcloud.com/refund/refund-mantle
Here is a picture of the patch
https://lh5.googleusercontent.com/-cdwRk-erfwg/VJUOEbyKo1I/AAAAAAAABYc/0wyW8XmKhY8/s1152/mantle.png

Hi.
I'm having a strange problem with MIDI channels in Pd.
I'm working with two different MIDI controllers; one is assigned to MIDI channel 1 and the other to MIDI channel 2.
If I connect them separately they work correctly, sending and receiving on their channels, but if I connect the two at the same time (setting the first one in Pd), everything misbehaves. With [ctlin] I can see that the first controller (channel 1) now sends to MIDI channel 17 and sometimes 33... crazy... MIDI only has 16 channels, right?
Am I missing something?
Ah...I'm working with Pd-extended on Debian wheezy 32-bit.
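If I remember Pd's convention correctly, channels above 16 are how [notein]/[ctlin] report additional MIDI *devices*: the reported channel is (port - 1) * 16 + channel. A sketch of the decoding:

```python
# Decode Pd's consecutive channel numbering across MIDI devices.
def decode(reported_channel):
    port = (reported_channel - 1) // 16 + 1
    channel = (reported_channel - 1) % 16 + 1
    return port, channel

print(decode(17))  # (2, 1): second MIDI device, channel 1
print(decode(33))  # (3, 1): third MIDI device, channel 1
```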
Thanks a lot for the help!

I am fairly new to Pd and am working on an algorithmic composition patch as part of a project. I am attempting to save the MIDI output to a MIDI file using the [seq] object. I am currently able to write data to [seq], but when I create the MIDI file, it plays for the length that I recorded without producing any sound. I have attached the patch; any advice would be extremely appreciated!
Here is the patch: [relaxed.pd](/uploads/files/upload-01d42346-af15-4c68-a27b-72624d373dde.pd)

Hi all,
I need to use single-buffering mode in order to plot circles that stay on the screen.
I also want to display a video at the same time using [pix_film], but it doesn't seem to work.
After sending the "buffer 1" message to the gemwin, I am sending bangs via a [metro] to the [gemhead] that outputs to [pix_film]. But nothing happens...
Does anyone know why?
Thank you so much in advance...

Hi all,
Here is a handy patch if you need a DIY MIDI clock source. It works both as a traditional metronome and as a MIDI clock source. I created it to use with apps like SooperLooper, which syncs to MIDI clock but doesn't have an internal sound generator to keep me in time. Let me know if you need me to explain something, because it gets a little confusing in parts.
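For reference, the arithmetic inside any MIDI clock source: the MIDI spec sends 24 clock ticks (0xF8) per quarter note, so the metro period driving the ticks is 60000 / (bpm * 24) milliseconds:

```python
# Milliseconds between 24-ppqn MIDI clock ticks at a given tempo.
def clock_interval_ms(bpm):
    return 60000.0 / (bpm * 24)

print(clock_interval_ms(125))            # 20.0 ms between ticks
print(round(clock_interval_ms(120), 2))  # 20.83 ms at 120 BPM
```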
To use this just unzip the files to wherever you keep your patches and open **[metromidi~]**
![Screen shot 2014-11-13 at 1.50.17 PM.png](/uploads/files/upload-7e215a62-f4bf-40c6-b221-14772ade1247.png)
[metromidi~.zip](/uploads/files/upload-e6649d14-afa5-4b87-965d-83493796c209.zip)

So I've been using the IAC bus and [noteout] objects to send MIDI values to Logic. The only problem is that even with the output channel set in Pd ([noteout channelnumber]) and the input channel in Logic set to the same value, the values are still sent or received on all channels. I've tested this with a very simple setup sending one note to channel 1 and one note to channel 2, with two different software instruments set to the right channels, but whenever any MIDI value is sent, it still arrives at both instruments. Does anybody have any idea why this is happening? (And yes, I already have auto demix for multitrack checked.) I'm pretty new to this area in general, so it's possible there's something I'm not understanding about the setup.

Hello, I have made an abstraction called *"Key"* to easily manage keyboard shortcuts in Pd, but every time I load a patch containing the abstraction I get this error for every instance that is loaded:
*load_object: Symbol "Key_setup" not found*
What can cause this? My abstraction has two creation arguments, and I get the error even when they aren't used. Any ideas?
Anyway, here is the patch:
[Key.pd](/uploads/files/upload-76b15aa1-23a3-4923-8d4c-a339101c9fdf.pd)

Hello everybody!
I have a question: I have a Brain V1 from Livid Instruments connected to a couple of FSR sensors sending CC input.
I really need to convert that input into note + CC input.
Can I do it with Pd?!? (I'm a truly new user of this software so far... :)
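It can be done; one possible scheme (an assumption on my part, not the only way) is to fire a note-on when the pressure CC crosses a threshold upward and a note-off when it falls back, while still passing the raw CC for expression. In Pd this would be roughly [ctlin] -> [> 10] -> [change] -> [sel 1 0] -> [noteout]. A sketch:

```python
# Turn an FSR's CC stream into note events around a threshold.
THRESHOLD = 10  # hypothetical pressure threshold
NOTE = 60       # hypothetical pitch assigned to this sensor
pressed = False

def on_cc(value):
    global pressed
    if value > THRESHOLD and not pressed:
        pressed = True
        return ("note_on", NOTE, value)
    if value <= THRESHOLD and pressed:
        pressed = False
        return ("note_off", NOTE, 0)
    return None  # no note event; the raw CC can still be forwarded

print(on_cc(5))   # None
print(on_cc(80))  # ('note_on', 60, 80)
print(on_cc(0))   # ('note_off', 60, 0)
```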
I thank you in advance guys!

In a monophonic synthesizer, if we connect [notein] directly to [mtof] and then to an oscillator, the effect is not always as expected. Sometimes a note-off message turns off a different note than we would like.
The system should store the penultimate key pressed and play it again when the last one is released. But in some cases the player can also release that penultimate key (while the last key is still held), and priority should then transfer to the antepenultimate key... and so on.
So we need a patch that can memorize at least the last 10 notes and do this work for us.
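The last-note-priority memory can be sketched like this (Python for clarity): keep a stack of held keys and, on each release, fall back to the most recent key still held:

```python
# Monophonic note priority with memory of released-over keys.
held = []

def notein(pitch, velocity):
    if velocity > 0:
        held.append(pitch)
    elif pitch in held:
        held.remove(pitch)
    return held[-1] if held else None  # pitch the mono synth should play

notein(60, 100); notein(64, 100); notein(67, 100)
print(notein(67, 0))  # 64: fall back to the penultimate held key
print(notein(64, 0))  # 60: then to the antepenultimate
```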
[notein_mono.pd](/uploads/upload-37a7ae14-29c1-4bf4-b585-a55f62000bc0.pd)
An explanation of how it works will be uploaded soon to my YouTube channel:
http://www.youtube.com/user/canalpuredata

Getting started with Pd today, and my initial issue is that the test after installation produces just distortion instead of a sine wave. This must be wrong.
I'm on a Windows machine with an internal sound card, using ASIO. I'm not at all far into sound manipulation yet, but I do have software that involves ASIO somehow: REW (Room EQ Wizard, used for room acoustic and loudspeaker measurements). Please advise? I'm fairly non-technical compared to many of you, I'm sure. Thanks in advance...

So right now I've set up a counter connected to a trigger. The trigger first bangs a number message (which is connected to the cold inlet of [mod]), then sends the float to [mod]. But this setup doesn't change the [mod] output at all.
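The usual culprit here is Pd's hot/cold inlet rule, sketched below: a value sent to [mod]'s right (cold) inlet is only *stored*; output happens only when something arrives at the left (hot) inlet, so the [trigger] must set the cold inlet first (right-to-left) and then send the float to the hot inlet:

```python
# A minimal model of a Pd object with one hot and one cold inlet.
class Mod:
    def __init__(self, n=1):
        self.n = n
    def right(self, n):  # cold inlet: store the divisor, emit nothing
        self.n = n
    def left(self, x):   # hot inlet: compute and output
        return x % self.n

m = Mod()
m.right(7)         # cold: no output yet
print(m.left(10))  # 3
```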

Hi,
I discovered this video on YouTube: https://www.youtube.com/watch?v=wbXT79_35dk. I can't work out how it's done. I'm on OS X and have gone through Audio MIDI Setup. I can get Ableton to pick up MIDI information from the MiniBrute with a plug-in synth, so there is some sort of connection.
I have a patch in Pd with [makenote] and [noteout], but the MiniBrute won't respond to it. How can I control the analogue circuit of the MiniBrute with Pure Data like in the video I've linked above? I'm not sure what to do now.
Thank you very much!

I was using MIDI in Pure Data and it worked well for a while. Then today I suddenly get "notein: no such object" and also "tabread~: no such array". I already have a [notein] and I can see it, but the sound goes away.
How can I fix this problem? Anyone?
I work on Windows 7 64-bit.

Hi, everyone.
I recently came into possession of an original Xbox for free. I did a softmod, and got it working with emulators and that type of stuff. But what I really want is to get Linux working so I can run PD on it.
My broader issue is that resources for using linux on original XBoxes have dried up. Outdated guides, expired domains, and download links 404ing. I've managed to get some stuff working, but issues persist. I got xDSL running, but apt-get install puredata won't work. Getting dependencies is tricky, because the repos are old. Updating the repo locations helps, but doesn't get me all the way there. On and on. I tried switching my efforts to Xebian, but that looks like its going to need a little elbow grease as well.
I was looking for an older version of Dynebolic, which supposedly has Puredata included. But I can't for the life of me find a working link for v1.4 (the last version that supported the original XBox).
I know specific guidance might be a little much to ask. But I wanted to know if anyone had successfully pulled this off in the past, and if there was any general advice that might help me along. Distros that worked for you? Methods for installing/running Pd? Things to look out for?
It would be great to make this thing a little Pd box instead of just donating/re-selling it. Any wisdom from someone who has pulled it off?

Hello guys,
my first post on this forum.
I am making a patch that reads from a table with a [metro] set to 1 ms.
The problem is that it effectively freezes the GUI and user input until the [metro] is stopped by a condition.
At first I thought I had reached the limit of the Pd clock and my CPU, but then I stumbled upon another patch that also used such a fast [metro] and read from a table, and its GUI and user input were fine; that patch's performance was normal.
My patch is a very simple one, nothing fancy, so I expected the performance to be usable.

hey there
I am using MIDI and a microphone, and I am using [fiddle~] to make a pitch follower. The final step is auto-correction: the pitch coming from the microphone should be played back via MIDI, corrected.
Is there an existing patch, or any suggestions?
thanks
this is the patch [Midi_Hammond_like2.pd](/uploads/upload-f1bbdfd4-c71b-47b6-8a74-a76a6ac182bb.pd)
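One simple form of the correction step (an assumption about what "corrected" means here: snapping to the chromatic scale) is to round the fractional MIDI pitch that [fiddle~] reports to the nearest integer before it reaches [noteout] or [mtof]:

```python
# Snap a (possibly fractional) MIDI pitch to the nearest semitone.
def correct(fiddle_midi_pitch):
    return round(fiddle_midi_pitch)

print(correct(60.4))  # 60
print(correct(60.6))  # 61
```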

Hello all,
I created three clickable keyboard objects that may be handy when dealing with MIDI. They can also be used to visualize MIDI data coming in. See:
https://vimeo.com/104180788
* * *
NEWEST VERSIONS:
For Pd-extended: [keyboards-pd-extended.zip](/uploads/files/upload-7ded51df-f90d-49f6-984a-b370e2b7fa5a.zip)
For Pd-l2ork (the current build on their website is a bit old, and you may need to compile it from source in order to use this: as far as I remember, the pre-compiled .deb build available on their website still does not allow images to be clickable, but I may be wrong about that): [keyboards-pd-l2ork.zip](/uploads/files/upload-92623b4d-17f2-4e3b-b36c-4d58838255db.zip)
* * *
I hope you all will enjoy it!
Take care,
Gilberto Agostinho

This is a cross-platform alternative for synchronizing two (or more) computers to one MIDI clock.
In my case this is between Mac OS X running CoGe and Windows running Traktor, which produces the MIDI clock.
On Windows, MIDI Yoke (tested on XP) wraps the MIDI clock into Pure Data extended (tested with v0.41.4); from there the clock is sent as an OSC message to one or all networked computers. On the listening side, another Pure Data patch converts that specific OSC message back into MIDI clock. CoGe can then catch this MIDI clock just by clicking "Midi clk".
This is nothing special, but it took me almost 3 weeks to accomplish, given that I knew nothing about Pure Data at the time. This site was very helpful; all the examples were gathered from here.
Also note that this is not the best way:
There are MIDI cables for this.
There are USB-to-MIDI converters for those who don't have MIDI ports, and quite good external sound cards with several inputs and outputs.
I have experienced a +/- 3 BPM float in the count on CoGe, but always around the correct number; over time this seems to float less.
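Rough arithmetic behind that wobble: the receiver derives BPM from the spacing of the 24-per-quarter-note clock ticks, so each millisecond of network jitter per tick shifts the estimate by several BPM at dance tempos; averaging over many ticks steadies it:

```python
# BPM implied by one 24-ppqn MIDI clock tick every `ms` milliseconds.
def bpm_from_tick_interval(ms):
    return 60000.0 / (ms * 24)

print(bpm_from_tick_interval(20.0))            # 125.0
print(round(bpm_from_tick_interval(21.0), 1))  # 119.0 - 1 ms late = ~6 BPM off
```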
This still lacks testing.
Anyway.
You'll need Pure data extended on every machine.
You'll need Midi Yoke on Windows.
You can use the IAC Driver on Mac OS X for the MIDI wrap.
I don't know about Linux, sorry. (I read somewhere something about JACK.)
In Pure Data you'll need to define the MIDI input or output device, whichever applies.
On Windows it's easy: from any Pure Data window you can set the MIDI devices; choose MIDI Yoke (the channel number should be the same on the receiver input and the sender output).
On OS X it's in the main window, under Preferences.
If you use this with Quartz Composer, you can skip Pure Data on the OSC server side (the one that listens). If you add more functions, keep in mind that Quartz Composer does not understand messages with 0 arguments. (I think this must be a bug.)
You are advised to change the IP in the Pure Data patch to whatever your destination is. You can use 0.0.0.0 to broadcast. Don't use wireless; this is a zero-security communication protocol.
Hope this saves someone some time. GL
[http://www.pdpatchrepo.info/hurleur/OSC\_MIDIclk\_sync.zip][0]
[0]: http://www.pdpatchrepo.info/hurleur/OSC_MIDIclk_sync.zip

\>\>I had been hooking the sampler up to the midi out from the x-station<<
does the x-station have a midi 'thru' port too? or just in and out? because if you want to send midi to the sampler via another device, you'd probably have to have a midi thru port to do that (if i recall correctly)
if you don't have a midi through port, then i think it would be best to use the midi-usb device you have.
set the electribe to receive on midi channel 1 only. and then attach it to the the midi-usb device with a midi cable.
connect the x-station to the electribe's midi 'thru' outlet with a midi cable. set the x-station to recieve on midi channel 2 only.
then open pd and set midi output to the midi-usb device name, and then \[noteout 1\] should talk to the electribe only, and \[noteout 2\] should talk to the x-session
one more thing:
Does the x-station have its own power source? If so, disconnect the USB cable; you won't need it. But if the device needs USB for power, then maybe you can leave it connected via USB as well, just don't select it as a device in Pd, because it will be receiving MIDI messages via the Electribe anyway.
One more thing again: if the x-station really does have a MIDI thru port, then just attach that to the Electribe and all your worries should be gone.

I'm looking for some help about a latency problem with midi processing using pd under Windows XP.
Here is the point: I'm using Pd to process some MIDI data coming from electronic drums. These data are then sent to another application (e.g. Nuendo) through virtual MIDI ports (MIDI Yoke). Assuming '->' is a MIDI connection (virtual or physical), it looks like:
drums module -> sound card (MIDI ports) -> Pure Data -> MIDI Yoke -> Nuendo
The problem is that Pd introduces some latency... and Pd is definitely responsible for it, because the following setup does not introduce latency:
drums module -> sound card (MIDI ports) -> MIDI CC -> MIDI Yoke -> Nuendo
(MIDI CC allows virtual ports to be connected to each other)
One last point: the latency does not seem to depend on the patch. I tried a simple MIDI-thru patch and another patch with complex processing, and nothing changed...
Help !!!

Hello everyone,
I just purchased a Ground Control Pro, a MIDI foot pedal, from DMC. I was hoping to use it to switch between my patches in Pure Data. However, when I plug it in using the factory settings => my Uno MIDI interface => PowerBook G4, I don't get any MIDI signals in the test patch in Pd. I have selected the MIDI interface in the Pd MIDI settings, and I have also created an external device in the OS X Audio MIDI Setup utility. If I swap out the GCP for my MIDI keyboard, Pd gets the signals fine.
I am getting MIDI signals to the OS X Audio MIDI Setup utility, because when I do a sound test in the utility I get a response on the down arrow of the MIDI interface (the one viewable in the patchable section of the utility).
If anyone has had any experience with the GC Pro and setting it up with Pd, I would much appreciate some direction.
Just for your information, I have tried this setup with the OS X MIDI utility open while Pd was open (I heard that I should try that)... no luck.
Ben

I've been getting into writing patches that generate music all by themselves, using mathematical
rules that apply quite nicely to music theory. I've made a few rhythm patches that make nice cross
rhythms using metronome division and delays (with values derived from multiples of the master
metronome), and I'll post these too if anyone is interested.
In this thread I'm showing off my "Maurits Escher-like chord progressions" patch.
Screenshot: ![](http://responsible7.googlepages.com/zenpho_escher.gif)
Mp3: [http://responsible7.googlepages.com/zenpho\_escher\_pd.mp3][0]
Patch: [http://responsible7.googlepages.com/zenpho\_escher.pd][1]
**First some basic music theory:**
(skip this if you're comfortable with chords, 7ths, and inversions)
A major scale is constructed of 8 notes, with the "root" note doubled at the 8th note.
For the key of C major (all the "white" notes on a piano) the names and numbers of the notes in
the scale of C-major are:
Name, Number:
C, 1st (root)
D, 2nd
E, 3rd
F, 4th
G, 5th
A, 6th
B, 7th
C, 8th (remember the root is doubled at the octave)
A triad is constructed of the 1st, the 3rd, and the 5th notes in the scale.
A SEVENTH chord is constructed of a triad (notes 1,3 and 5) PLUS the 7th note in the
scale. So a C major 7th is note 1,3,5,7 or C,E,G,B.
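For readers who like to see this as code, here's a quick sketch of the 1-3-5-7 construction in Python (the MIDI note numbers and the C3 = 48 convention are my assumptions, not from the post):

```python
# Build a seventh chord from a scale by taking degrees 1, 3, 5 and 7.
# Scale given as MIDI note numbers; C3 = 48 is an assumed convention.
C_MAJOR_SCALE = [48, 50, 52, 53, 55, 57, 59, 60]  # C3 D3 E3 F3 G3 A3 B3 C4

def seventh_chord(scale):
    # scale degrees 1st, 3rd, 5th, 7th -> list indices 0, 2, 4, 6
    return [scale[i] for i in (0, 2, 4, 6)]

print(seventh_chord(C_MAJOR_SCALE))  # [48, 52, 55, 59] = C3, E3, G3, B3
```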
Up until now we've been describing "standard" voicings of the chords, in other words, the notes
are played so that the root is the lowest pitched note, the 3rd is higher, the 5th is higher
still, and the 7th is the note just below the octave of the root.
At the risk of sounding redundant, "octave numbers" after the note name help clarify which octave
the note is to be played in. To play a C major 7th on the third octave, we would write:
C3,E3,G3,B3. To play it an octave higher we would write: C4,E4,G4,B4.
"Inversions" of chords re-order the pitches of the notes, but still play notes with the same
"name" as the 3rd, 5th, 7th etc. For example:
C3,E3,G3,B3 is a standard C major 7th...
...and G2,C3,E3,B3 is an inversion. All the notes are there (C,E,G,B) but they are in a different
order to the normal "Root, Third, Fifth, Seventh" arrangement. In this case, we say that "the
fifth is in the bass".
----
Okay so now we know what a major 7th chord is. Lets deal with chord progressions.
Now imagine playing C3,E3,G3,B3 and removing the "root" (the C3) from the notes played,
we have a chord that reads "E3,G3,B3" - we were playing C major 7th and now we're playing E minor.
*THIS IS A VERY IMPORTANT STEP* Moving from C major 7 to E minor sounds "natural" because the
notes that occur in C major 7 ALSO occur in E minor.
Now lets make this E minor chord a 7th...
We've said before that a 7th chord can be constructed by playing the 1st, 3rd, and 5th notes, PLUS
the 7th note in the scale.
The scale of E minor (a flavour of minor) is:
Name, Number
E, 1st (root)
F#, 2nd
G, 3rd
A, 4th
B, 5th
C, 6th
D, 7th
E, 8th (octave)
The 7th note is "D" so we add the D note to our E minor triad to make E minor 7th.
E minor 7th is therefore: "E3,G3,B3,D4".
We can extend this E minor again, removing the root, working out the new scale for G major, adding
the 7th to make G major 7th, and again, and again, and again... but if we do - we keep moving
\*UP IN PITCH\* and spiral off the end of the keyboard.
----
**HOW THE PATCH WORKS**
Okay, so what my patch does is take the idea of generating new 7th chords over and over,
but play inversions of these chords so that the notes stay inside a single octave. If the
"root" note is in the 3rd octave (C3 for example), then when I move to E minor 7th, the D4 is
transposed to a D3, to keep within this octave range.
Because there are 12 semitones in an octave, notes that fall outside the octave range wrap
around to be an octave lower. The maths for generating the new chords basically involves
taking each note in the current major 7th chord and adding two semitones to each note in
turn.
Now our terminology could cause confusion here, because there are "notes in a scale" and "notes in a chord"... so I'm going to define some notation to show when I'm talking about the notes in a
chord.
For example:
A C major 7th has the notes C3,E3,G3,B3.
Note-1-in-the-chord is to be defined as chord_note_1.
Note-2-in-the-chord is defined as chord_note_2.
Note-3-in-the-chord is defined as chord_note_3.
Note-4-in-the-chord is defined as chord_note_4.
chord_note_1 has the pitch C3.
chord_note_2 has the pitch E3.
chord_note_3 has the pitch G3.
chord_note_4 has the pitch B3.
It is important to be clear about the idea of "pitch", "chord_notes" and "scale_notes",
because chord_note_3 has the pitch G3, while scale_note_3 of C major has the pitch E3.
----
Back to the procedure for generating new seventh chords.
We generate a major 7th to begin with.
C3,E3,G3,B3.
We add 2 semitones to chord_note_1 to get "D3", and we leave the other notes alone.
Our chord now reads: D3,E3,G3,B3.
Which is an "inversion" of E minor 7th.
This time we add 2 semitones to chord_note_2 to get "F#3", and we leave the other notes alone as
before.
Our chord now reads: D3,F#3,G3,B3
This is an inversion of G major 7th.
This time we add 2 semitones to chord_note_3 to get "A3", and we leave the other notes alone.
Our chord now reads: D3,F#3,A3,B3
This is an inversion of B minor 7th.
This time we add 2 semitones to chord_note_4 to get C#4...
*BUT C#4 IS OUTSIDE OCTAVE 3! So we TRANSPOSE it down to C#3*
Our chord now reads: D3,F#3,A3,C#3
This is an inversion of D major 7th.
After my patch modifies all 4 chord_notes, it moves back to chord_note_1, and adds another
2 semitones... over and over.
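The whole procedure is compact enough to sketch in Python (a paraphrase of the patch's logic, not the patch itself; C3 = 48 is an assumed MIDI convention):

```python
# "Escher" chord generator: add 2 semitones to one chord note per step,
# cycling through chord_note_1..4, and wrap any note that leaves the octave.
def escher_chords(start, steps):
    lo = min(start)            # keep every note inside [lo, lo + 11]
    chord = list(start)
    for step in range(steps):
        i = step % len(chord)  # which chord_note gets raised this step
        chord[i] += 2
        if chord[i] >= lo + 12:
            chord[i] -= 12     # transpose back down into the octave
        yield list(chord)

cmaj7 = [48, 52, 55, 59]                # C3, E3, G3, B3
seq = list(escher_chords(cmaj7, 24))
print(seq[3])   # [50, 54, 57, 49] - the D major 7th inversion above
print(seq[23])  # [48, 52, 55, 59] - back to C major 7th after 24 steps
```

After every four steps each note has moved up a whole tone, so the cycle closes after 24 steps, which matches "eventually we get back to C major 7th again".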
Eventually we get back to C major 7th again, but on the way we move through a variety of different
chords that evoke very interesting changes of mood.
**Want to try playing with it?**
Mp3: [http://responsible7.googlepages.com/zenpho\_escher\_pd.mp3][0]
Patch: [http://responsible7.googlepages.com/zenpho\_escher.pd][1]
[0]: http://responsible7.googlepages.com/zenpho_escher_pd.mp3
[1]: http://responsible7.googlepages.com/zenpho_escher.pd

should become a kind of groove box, but i guess i tinkered it to f**k. it's a damn cpu hog and it will spit out a whole list of errors to the pd window when loading.........
....BUT BESIDES ALL THAT: it sounds pretty nice (in my opinion!) .... ;)
so, if you give it a chance, tell me what you think about it.
EDIT: Note, when you open the patch (RUMBLE_BOX.pd) there will be no sound until you have assigned midi in and midi out.
the sequencer sends out midi notes to the single sound generator channels, so you have to send midi out to midi in - i did this with midi ox and midi yoke, i connected "midi yoke in 1" to "midi yoke out 1" in midi ox and assigned them as midi in/out in pd.
[http://www.pdpatchrepo.info/hurleur/Rumble\_Box.zip][0]
[0]: http://www.pdpatchrepo.info/hurleur/Rumble_Box.zip

Yes, I already use something like MIDI Yoke now: the standard OS X MIDI routing software called the IAC driver.
I forgot to mention in my question that I use one IAC output port to send signals from the Pd patch to Ableton. In Ableton I route signals from this IAC port to my MIDI outs (connected to the machines).
The other output I use to send signals to my controller, so that when I change the parameter of a knob I can send the current value of that parameter to the knob of the Behringer (when I turn this knob the value doesn't jump but starts from the value it is actually on).
This limits my possibilities: I want to be able to filter out signals for certain machines, but now I can only use one actual MIDI output, the IAC bus.
The nice thing about Pd's MIDI interface is that another MIDI output device is just a different range of MIDI channels. So if I could have 4 MIDI output possibilities, it would enable me to press a button that says to a knob: send to channel 17 (channel one of the second MIDI output device) instead of channel 33 (channel one of the third MIDI output device).
I hope this is possible. I think it actually is, because there are four MIDI inputs available.
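For what it's worth, Pd's convention of folding multiple output devices into one channel number can be sketched like this (my illustration of the arithmetic described above, not Pd source code):

```python
# Pd addresses MIDI output devices in blocks of 16 channels:
# channels 1-16 -> device 1, 17-32 -> device 2, 33-48 -> device 3, ...
def pd_channel_to_device(channel):
    device = (channel - 1) // 16 + 1        # which MIDI output device
    local_channel = (channel - 1) % 16 + 1  # channel on that device
    return device, local_channel

print(pd_channel_to_device(17))  # (2, 1): channel one of the second device
print(pd_channel_to_device(33))  # (3, 1): channel one of the third device
```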

@dangrondang wrote:
> The circles we calculated the orbits to be proved each to be slightly short of a true, pure circle, thus returning the satellites to the ground. Pi more precisely can be evaluated to 3.1446055, and not the perpetuation of imperfection, and the maintenance of ignorance still taught today.
Sorry to be blunt, but what you are stating is awfully incorrect, and the way you proclaim to know this "hidden truth" is really awfully ignorant. So you are stating that there is a mistake in the 3rd decimal digit of pi... It so happens that this decimal place was precisely calculated centuries ago, and the result still (obviously) stands today. There are tons of different approaches to getting correct digits of pi, such as using certain convergent series, as mentioned by @seb-harmonik.ar. One can _manually_ calculate the 3rd digit of pi using this type of series, and this result has been known for more than a millennium (in the year 480, the value known was 3.1415926, which is way more precise than what you state; by the early 18th century, we knew __100__ digits of pi, none of which have been "corrected" later on. See: https://en.wikipedia.org/wiki/Chronology_of_computation_of_%CF%80). There is no way that in the 1960s pi had to be redefined as 3.1446... instead of 3.1415...; that is just pure ignorance and witchery.
But reading about your number on the internet, I found that this 3.1446055 appears in several articles about satellites, pyramids and all that "dark hidden world that nobody tells you about open your eyes sheep people controlled by the illuminati" kind of talk, but not a single time in a serious mathematical or physics website/journal/wiki. Sorry but the world is not a dark place controlled by people with magic powers trying to keep you in the shadows... simply do not use random blogs as source of information.
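As a sanity check anyone can run (my addition, not part of the original thread): Machin's 1706 formula, the same one used to reach 100 digits of pi in the early 18th century, settles the 3rd decimal with a few terms of exact rational arithmetic.

```python
from fractions import Fraction

def arctan_inv(n, terms=30):
    # Maclaurin series for arctan(1/n), evaluated with exact rationals
    x = Fraction(1, n)
    return sum((-1) ** k * x ** (2 * k + 1) / (2 * k + 1) for k in range(terms))

# Machin (1706): pi = 16*arctan(1/5) - 4*arctan(1/239)
pi_approx = 16 * arctan_inv(5) - 4 * arctan_inv(239)
print(float(pi_approx))  # 3.141592653589793, not 3.1446...
```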
> The assumption of equal temperament proves a repetitious vexation, denotes ignorance, apathy, laziness, and/or unfamiliarity with the math of music. Worse, this assumption proves far too common among engineers / mathematicians / scientists / software programmers. Which then deprives us musicians of useable tools, and the stupidity of the need for MIDI tuning standard / scala / etc, is the vacuum such engenders.
As for the "stupidity" of using a tempered system, that is just as stupid as using a non-tempered one: it is simply a convention, upon which we built centuries of music. Western music has been based on it for long time (although contemporary composers, myself included, often choose to use micro-intervals), and the decision to create MIDI around this is as logical as it gets when you think what they were aiming at. Or should we have went through all sorts of trouble to incorporate all crazy stuff in the MIDI protocol (which is an Western creation), such as the possibility of having Indian micro-tonal scales? But wait, then what about gamelan scales? No wait, what about __ scales?
From a practical point of view, you can still use MIDI cents, and you can also directly use frequencies if you want to precisely define the pitch of a sound. You can compose music in 12-tones, 27-tones or 193-tones if you wish to. The tools are here, and Pd allows you to do whatever you want with them (I myself have composed works using Pd __and MIDI__ that deals with microtones and microtonal glissandi in real-time).
I hope you won't get personally offended with my message, but I can't really read this type of statement, which tries to propagate pseudo-scientific kind of stuff, without writing a strong reply.
Best,
Gilberto

Hi,
Does anyone know how I'd intercept the APC40 MIDI so I can start to make changes to it.
I've tried using a virtual MIDI cable, feeding the MIDI into Pd, then into the cable, and receiving MIDI back from the cable and sending it to the APC40, but:
1: This causes a weird feedback issue: when I turn a knob, Ableton gets updated and immediately sends a MIDI response, which Pd sends back to the APC40. This causes slow knob response and flickering: it looks like the APC40 is getting confused by the returning MIDI signals.
The same thing does not happen when the APC40 and Ableton are connected together directly, even though there is two-way MIDI. Obviously the issue is solved somehow, but I don't know how.
2: A lot of the APC functions simply don't work, although I've simply done midiin --> midiout with all the available MIDI interfaces. The handshake between the APC and Ableton supposedly goes through MIDI, and other people have got it working with Bome's MIDI Translator, so it's definitely possible. Any ideas on how to get it working?
Cheers,
Will.

Hello,
I have a problem with using pd and MIDI yoke.
The thing is that I am sending MIDI information from reaktor through MIDI yoke to pd.
For a while everything goes fine, but after a few seconds things go wrong.
I am using different midi-controllers and it seems that after a while information from the different controllers get "messed-up".
I am sending MIDI through port 1 of MIDI Yoke and receiving from the same port in pd. This is the only port that I am using.
Now I found out (by opening pd in DOS) that I get a MIDI feedback, but I don't understand why...
Also, when I set up the MIDI port I get the message that using MIDI in Windows is dangerous:
warning: midi input is dangerous in Microsoft Windows; see Pd manual.
Can somebody help me pleezze.

Hi, tossing in my two cents:
In my experience my issue with Pd is the opposite of LiamG's: the sound design is straightforward and simple to implement (again, depending on what's being implemented), but algorithmic composition is not as well suited to dataflow programming. My main interest in computer music is algorithmic composition, and I believe Pd was designed as a tool for live performance, which makes it unwieldy for algorithmic composition but great for what it was designed for. Control of events over time is severely lacking, in my opinion.
Let me give examples:
1. There is no way, at a given point in the program flow of a patch, to call a subpatch with certain input arguments and get the output in a certain format. Instead, copies of the subpatch must be made, increasing memory usage compared to the classical idea of a function.
2. When composing, I like to think in terms of "do this thing at this point in time". Say I have a melody and at a certain point in that melody I want to choose between 2 different notes. One way of doing this would be to have a qlist that plays the melody and have it call a receive object that takes parameters to choose between 2 notes. But what if I want to call that function to play a different instrument? all of a sudden I need an extra argument to that control structure, and need to re-program the subpatch in order to do it.
3. Say you want to quickly edit a score of MIDI notes or MIDI automation, which maybe don't control MIDI directly but rather control algorithmic parameters that change over time. This is very difficult to do. You could set up a DAW to send the MIDI to Pd, but MIDI is not designed (in its structure or its output format) for sending data to programming languages, but directly to instruments. There aren't many open source OSC editors either (although there are a couple of promising ones).
recently I have been exploring other programs for algorithmic composition purposes, such as Common Music, based on Scheme. What makes Scheme enticing for algorithmic composition is that it supports continuations natively. What this means is that you can control the program flow manually, and the score for an algorithmic composition could be set up to be the program itself. (Common music does not support this but it could be set up to).
So, in the example above, when composing with Scheme set up in this fashion, all that would be required to choose between 2 notes and play it would be 1 line of code, like ```(play myinstrument (choose 49 64 20))``` and to play it on a different instrument all that needs to change is the first argument. Notice that "choose" does not need to be rewritten.
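Python generators can stand in for the continuations described above (a loose sketch of the idea, not Common Music; `choose` and the instrument names are hypothetical):

```python
import random

def choose(a, b, percent):
    # hypothetical helper: note a with the given percent chance, else note b
    return a if random.random() * 100 < percent else b

def score():
    # the generator's own control flow *is* the score: each yield hands
    # one (instrument, midi_note) event to whatever scheduler drives it
    for _ in range(4):
        yield ("myinstrument", 60)                     # a fixed melody note
    yield ("myinstrument", choose(49, 64, 20))         # the branch point
    yield ("otherinstrument", choose(49, 64, 20))      # same logic, new instrument

events = list(score())
```

Because the branch lives in the score itself, switching instruments really is just a change to the first element of the tuple, as the post argues; `choose` never needs to be rewritten.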
Personally, I believe what is needed for the future of algorithmic composition is a dedicated DAW-like environment, (I have been thinking of undertaking such an endeavor). It would have tracks, and each track could have several layers that could output to either OSC, MIDI, or another layer in the project. Layers could be midi note editors (that could be configured to output OSC), automation editors with configurable precision (like if they output ints or floats, and how it is scaled), pieces of code that generate notes, pieces of code that take input data and output other data, containers for more layers, etc. The programming languages used would have to bridge over a common interface somehow, and they would need to support continuations or coroutines (Lua does also). The editors could have functions to, say, bind a specific midi note to a specific function in a specific layer. This would combine the ease of using DAWs to compose temporally with algorithmic composition constructs in my view.
But all that is just based on the way I think about composition... everyone's different.

Hi, my patch v2MSy (video to MIDI synth) is downloadable from [here][0]. I hope you like it. Basically it's similar to MetaSynth, but my patch works with video or a webcam in real time, not with a still image. The patch transforms the central Y axis of the video into 128 MIDI signals (from bottom to top) at 25fps (metro at 40ms). The result is a detailed "notation" of the video's central axis, where white pixels represent velocity=127 and black pixels represent velocity=0 (very similar to a carillon's mechanism, where the video stands in for the cylinder!)
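The mapping it describes amounts to something like this (a toy restatement in Python under my assumptions about the pixel format; the real patch works on live video objects):

```python
# One central video column of 128 grayscale pixels (0-255), bottom to top,
# becomes 128 (midi_note, velocity) pairs: white -> 127, black -> 0.
def column_to_midi(pixels):
    assert len(pixels) == 128
    return [(note, px * 127 // 255) for note, px in enumerate(pixels)]

column = [0] * 128
column[60] = 255                 # one white pixel near the middle
events = column_to_midi(column)
print(events[60])  # (60, 127)
```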
The patch also contains two independent subpatches: an interface for the BCF2000 and a "poor" interface for keyboard & mouse.
You need a piece of receiving software to "play" or "write" the MIDI note-outs (like a sequencer).
To connect the patch to the receiving software I use a virtual MIDI cable. I think you can use MIDI Yoke on Windows, ALSA on Linux, or the IAC driver on Mac. But if you also want feedback for your motorized BCF2000, you should set Pd's MIDI settings to "multiple devices"... which doesn't work! It seems that only the 1st output device works... So I do it this way:
IAC driver (I use Mac) + MIDI Patchbay
and I set:
-BCF2000 as PD MIDI input
-IAC driver as PD MIDI output
-IAC driver as Patchbay input
-BCF2000 as Patchbay output
In this way, Logic receives MIDI signals from v2MSy, and my Behringer has feedback!
Please test the patch on Windows or Linux and write whether ALSA and MIDI Yoke do the job and give feedback. Thanks.
But remember that... it's only an "alpha" patch, and it needs powerful hardware.
[0]: http://adrjork.altervista.org/puredatatutorials.html

Greetings! I have some weird MIDI issues I was wondering if anyone here has any idea about what might be the cause of, and perhaps could offer some pointers towards a potential cure!
I am working on a project that involves getting MIDI messages from an Arduino to my sound card. This project involves doing stuff in water, so we have already gotten a wireless MIDI transmitter that works perfectly. The trouble, however, seems to be in Pure Data as far as I can understand. I am using an Arduino Diecimila, with a regular flex sensor as the control. And the problem is:
When Pd receives MIDI data from the Arduino, it all seems good in the beginning: the response is quick, and all is well. However, after running only a short time, not more than a few minutes max, the MIDI messages seem to get delayed, so when I move the flex sensor, it takes a while before the change registers in Pd. What is even weirder is that in addition to this, there seems to be a double flow of delayed messages, where first the "original" message comes along, and then shortly after, a duplicate message comes along too. So when I bend my flex sensor, at first the MIDI goes from 127 to 60, and then after a short while jumps back to 127 and ramps back down to 60 again. Sometimes they even seem to compete and garble each other at the same time, with values jumping all over the place. I find the irregularity of the whole troublesome affair really frustrating. I am still a noob with Arduino, so even though I do not believe the problem lies in the Arduino, I cannot be sure. The reason I believe it is in Pd is because all I have to do is shut down Pd and start it up again, and it works nicely... for a few minutes. And needless to say, in a live setting you don't want to go up and restart Pd all the time, especially since I will be submerged in a pool during the performance.. :( (I have always used Pduino before the few times I've used it with Pd, but that is not an option this time)
I am working on Windows XP with an Edirol UA-25 sound card, btw.
The arduino code i'm using to produce MIDI was borrowed from this program;
[http://www.arduino.cc/cgi-bin/yabb2/YaBB.pl?num=1165833586/0\#2][0]
Anyone had similar problems with MIDI in PD? I know the warning says "MIDI is dangerous in PD" but... I have used CC messages before with no problem at all. This is so weird...
Any clever suggestions would be hugely appreciated. :)
[0]: http://www.arduino.cc/cgi-bin/yabb2/YaBB.pl?num=1165833586/0#2

Nobody can help me?
'Ndèmo fioj...no ste far 'e pìttime... (in venetian "C'mon guys, don't play hard to get!" ;-)
Let's think freely... Imagine the bone in 2001: A Space Odyssey: a primitive sees a simple object, the bone... but only after a hunch does he realize that you can add a function to the simple object... so object + function = instrument!
Back to Kubrick: in modern times, the bone becomes the spaceship for 2 reasons:
1) the bone and the spaceship have similar shapes
2) all technology is based on the primal act of adding a function to an object, the act of transforming a simple object into an instrument... the bone and the highest technology share the same nature: man is man's wolf!
Ok, so the spaceship is the "modern" bone. This is an answer! Kubrick's answer.
What could be the porno movie's answer? Mmm... 1) similar shape... 2) transforming an object into an instrument... I don't know, imagine it yourselves!
What about Disney's answer? 1) similar shape 2) from object to instrument: perhaps the answer could be a timpani mallet that strikes kettledrums in Fantasia!
So... Chopin's Berceuse...
I'm asking you: which are, in your opinion, the modern electronic/electroacoustic processes that match the variation processes in Chopin's Berceuse?
Let's take it as a facebook test! Do it. I need your help, seriously.
Here is a little analysis of the Berceuse... please try to imagine an answer for every item described below...
BERCEUSE/bars
a) 1-2 loop
b) 3-6 theme (subject)
c) 7-10 counterpoint: 2 thematic lines (countersubject + answer)
d) 11-14 theme "boxed" in a repeating module built like a mirror of intervals from first bar of theme (ascending 4th, descending 3rd); then "bachian" episode
e) 15-18 theme as acciaccatura on 5th deg
f) 19-22 fast quadruplets on intervals from theme's 1st bar (4th and 3rd), then on chromatic scale, then on 3rd only
g) 23-26 3rd now in harmonic distance, moving in triplets on his inversion (6th); then chromatic scale
h) 27-30 harmonic 3rd "enlarged" to 6th (6/3 detached chords with a note transposed by octave up&down)
i) 31-34 harmonic 3rds (chromatic + repeated, then chromatic + A pedal like bars 15-18)
l) 35-38 chords like crystals (moving, inverting, pressing...revealing "reflections" of the melody); then A pedal again but higher, with diatonic intervals
m) 39-42 detached 6ths moving on the inverted intervals of the theme's 1st bar (now desc. 4th, asc. 3rd); then 6ths on the chromatic scale
n) 43-46 descending "sundry" scale (major+minor); then trill; then arpeggio on the tonic chord + repeated 5th deg; then arpeggio on the dominant chord + repeated 5th deg; then detached movement built on the theme's 2nd bar in inversion (B, C, E become B, A, F); then finally detached 3rds-and-6ths on a descending chromatic scale
o) 47-50 "2nd" theme
p) 51-54 triplets with a "memory" of the 1st theme (A, B, D, C, F...the notes in bar 10 first-half)
q) 55-58 "divertimento" on tonic chord
r) 59-62 "divertimento" on dominant chord
s) 63-70 1st theme "interrupted"...almost a "notation in fade-out"

I'm not seeing anything in there dealing with MIDI. You are routing the OSC messages to faders and toggles, correct? MIDI messages are in the form of
(int = integer 0-127):
int (controller value), int (controller number), int (MIDI channel).

Hello,
I've been using Pd over the last few months on an XP machine, but I'm now porting a project onto a Mac.
Using the Test Audio and MIDI menu I get an audio signal (out of a USB audio device) but no MIDI signal. I tested playing back a MIDI file using QuickTime - no problem hearing the Beatles at their best(!)
What could be happening?
the Pd manual directions are a bit esoteric
"To get MIDI working, you have to do the Mac OSX magic to get a USB MIDI interface installed. I've seen this done with Midisport devices and I think you just download the OSX driver and follow directions."
So I did that, and installed the driver for an M-Audio MIDISPORT device. In the MIDI Setup panel, the device shows up with its "online" box checked.
In System Preferences -> QuickTime -> Advanced panel, "QuickTime music synthesizer" is selected.
Running pd 0.39.2 test 5 on Mac 10.4.8
Any help much appreciated! i will keep perusing the net for more info meanwhile.
jean
ps: I don't plan on using any MIDI control, just having Pd play some MIDI files along with some FM synthesis.

Hi all, I'm kind of new to Pd and this is my first post here, so bear with me :)
I'm trying to make a synth-like patch that gets a signal in (from a microphone) and control an oscillator (osc~) so that the notes coming out is always within a certain (selectable) scale and key. I've already made a small patch that process the signal from the microphone, so my incoming signal is auto-calibrated to be between 0 (no signal/activity) and 1 (max signal/activity).
I now want to split my interval up into, say, 8 (I select the major/ionian scale, and for instance the key of C), so for an incoming signal of 0 it plays C3. If the signal becomes larger a D3 will be heard, and so on. For max signal I will hear C4. If I'm for example at an F3 and the signal gets weaker, it should play E3. (hope you get the idea)
I know how to adjust the scale with just adding to a midi-note-number, so I reckon it would be easier to operate with midi-numbers rather than frequency...
I'm unsure how to implement this. Maybe I should define all the different scales I want to use in tables, and pick notes from them by moving back and forth within the table (if this is possible), or maybe there's a smarter solution.
I hope any of you have any ideas. I know my explanation was not the best, but I want the end result to be a patch which let me play it like an instrument with the input from a microphone, always in tune with selected key/scale.
\[adc~\]
|
\[module with noise filtering and auto-calibration (already made this)\]
|
\[magic module I need help with :)\]
| \\
\[osc~\] or: \[mtof\]
| |
\[dac~\] \[osc~\]
|
\[dac~\]
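One way to sketch the "magic module" (my suggestion, not a finished answer; the C3 = 48 MIDI convention and the 8-entry scale table are assumptions): quantize the 0-1 signal against a table of scale offsets, much like the table idea above.

```python
# Map a normalised control signal (0.0-1.0) onto one octave of a scale:
# 0.0 -> C3 (MIDI 48), 1.0 -> C4 (MIDI 60) with the major/ionian table.
MAJOR = [0, 2, 4, 5, 7, 9, 11, 12]  # semitone offsets, one octave

def signal_to_note(x, key=48, scale=MAJOR):
    x = min(max(x, 0.0), 1.0)                      # clamp the input
    degree = min(int(x * len(scale)), len(scale) - 1)
    return key + scale[degree]

print(signal_to_note(0.0))  # 48 -> C3
print(signal_to_note(1.0))  # 60 -> C4
```

In Pd terms this is roughly a [* 8] into a clipped [tabread] on the scale table, then [+ 48] into [mtof] for the [osc~].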
Any ideas or reflections will be greatly appreciated,
best regards,
ZnakeByte

This lil' thing just got released on the great netlabel Control Valve, it is 6 tracks of whacky looping made with PD. Download @:
[http://www.controlvalve.net/][0]
inforz:
"ctrlvlv\#007 IS NOW READY FOR DOWNLOADING!!!!!!!!!!!!!!!!&\#8207; ctrlvlv\#007 IS NOW READY FOR DOWNLOADING!!!!!!!!!!!!!!!!&\#8207; ctrlvlv\#007 IS NOW READY FOR DOWNLOADING!!!!!!!!!!!!!!!!&\#8207;
artist: swamps up nostrils title: tarmtott
artist statement: "Swamps Up Nostrils is a spatiotemporal mishap again and again focusing on both experimental wrongdoings and ancient traditional musical structures like beats and harmonies. What you hear is not what you get but it will however seem pretty close anyway, so why bother? Swirling the eternal wormholes between the familiar and the unknown, we hope to entertain but admit that to most people it must seem like meaningless idiocy, but then again most people seem like meaningless idiots too, including ourselves, so I guess it balances out. Only by admitting to being an idiot yourself will you understand what it means to be one, and understand just how many idiots there are around here. We, the failed abhorritions of monkey-like beings of ancient times, will not let us be controlled by our biological shortcomings, although we admit to them causing us both irritation and confusion. This irritation and confusion is not the source of this music. This music was made by utilizing magick and computers, if you believe in such stuff. If you do not, this music was made by utilizing science and computers, if you believe in such stuff. If you do not, this music was made by utilizing faith and computers, if you believe in such stuff. If you do not, this music was made by utilizing computers and computers, if you believe in such stuff. If you do not, this music was made by utilizing music and music, if you believe in such stuff. If you do not, this mucus was made by utilizing mucus and mucus, if you believe in such stuff. If you do not, this made not was believe and unbelief is by whom was finalized as not more. If you do not, please ignore all above statements as they are irrelevant to the audial experience anyway. There appears not more than what vibrates in your ear, and how your brain interprets that on the basis of your own very personal framework of reference. 
Anyone telling you otherwise is either trying to hijack your brain or may be lying, or may be convinced of otherwise and acts on a compulsion of good faith, although faith can never exist as something good outside someone's subjective definition of the matter so the statement is meaningless. Now stop reading this nonsense and listen to the music instead, because, as implied in this body of textual represented idiocies, the point is not to read about this music it is to listen to it. Get it?"
swamps up nostrils is arnfinn killingtveit from trondheim, norway. no one can ever be sure what will come out of the speakers when playing a swamps up nostrils release. the first time i heard one it was some sort of drum and bass mixed with circuit-bent electronics, with just a tad of field recordings. you might be getting some sort of techno, drone, noise, minimalism, analog, digital, etc. whatever it might be, it is always top-notch sound work, great composition, and a highly enjoyable listen. killingtveit also runs the superb cd-r label Krakilsk
6 tracks of looping, layered sound composition. 320 kbps mp3, with cover image"
[0]: http://www.controlvalve.net/

Hi guys
I'm trying to route MIDI between Ableton Live and PD on Mac OS X 10.4. I've set up a couple of IAC buses and can receive MIDI in PD from Live OK, but when I send MIDI back to Live using a noteout object, it is not received by Live's armed MIDI track. The MIDI is reaching Live, because the MIDI Track In indicator in the top right-hand corner is flashing; it's just not getting to the track.
Looking at the MIDI stream in MIDI monitor Live gives this:
TIME SOURCE MESSAGE CHANNEL DATA
16:24:30.721 From IAC Bus 1 Note On 1 C3 100
16:24:30.846 From IAC Bus 1 Note Off 1 C3 64
whereas PD gives this:
*** ZERO *** From IAC Bus 2 Note On 1 C3 100
*** ZERO *** From IAC Bus 2 Note Off 1 C3 0
So PD does not seem to be outputting any kind of timestamp; could this be the problem? The only other difference seems to be the velocity of the note-off message.
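For what it's worth, the note-off velocity discrepancy is probably harmless: the MIDI 1.0 spec treats a Note On with velocity 0 as equivalent to a Note Off, and receivers are required to honor both forms. A quick sketch of the two encodings (Python, purely illustrative; the note number 48 for C3 is one common octave convention, some monitors call note 60 "C3"):

```python
def note_on(channel, pitch, velocity):
    # Note On: status 0x90 | (channel - 1), then pitch and velocity data bytes
    return bytes([0x90 | (channel - 1), pitch, velocity])

def note_off(channel, pitch, release_velocity=64):
    # "Real" Note Off: status 0x80 | (channel - 1); release velocity 64 is a
    # conventional default when the sender has nothing meaningful to report
    return bytes([0x80 | (channel - 1), pitch, release_velocity])

live_off = note_off(1, 48, 64)  # what Live sends: 0x80 0x30 0x40
pd_off = note_on(1, 48, 0)      # what Pd sends:   0x90 0x30 0x00
# Both mean "stop sounding note 48 on channel 1" to a conforming receiver.
```

So the velocity-0 form Pd uses should not by itself stop notes from reaching the track; the armed track's input routing/monitor setting is a more likely culprit.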
Any help would be much appreciated.
Cheers
Ummo

I'm running Pd 0.40-3-extended on Gentoo. Everything seems to work fine, except that I'm having serious problems getting PD to recognize my Doepfer MIDI interface (analogue knobs to MIDI).
The only MIDI interface it does recognize is my MIDI keyboard, which PD finds as OSS device #1.
However, the Doepfer interface works fine if you look at the raw output with cat:
cat /dev/midi
But PD just refuses to find it.
Perhaps there is some cheap hack to fool PD into caring about my /dev/midi instead of only listening to /dev/midi1?
Both the MIDI keyboard and the Doepfer interface run over USB.
Any one got some special tricks 'n tips perhaps?

@discolemonade said:
> player piano
if the player piano has MIDI input, then you just need a MIDI out device plugged into your computer (via a soundcard, external MIDI device, ...) and pure data with a very simple patch that uses ```[noteout]```. In Media, Test Audio and MIDI..., click on the MIDI out toggle (beforehand you need to configure MIDI via Media, MIDI Settings).
That's it.
Cheers~

Hi Liam,
Really nice job, I really like this object!
> To change the scale, you can use the [list-math] object at the output. [list-math / 128] will scale the slider so that it outputs from 0 to 1.
Can I make a suggestion? Why don't you put a __[route scale]__ after the first __[inlet]__, so the user can set his own scale with a message such as __[scale 0 127(__ or __[scale 5 7(__? That would be a much more elegant solution, in my opinion, than explaining to the user how to add a __[list-math / 128]__ himself. And you could even make the default case scale from 0 to 127, just like a regular __[hsl]__ does. What do you think?
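For readers following along, the mapping that such a hypothetical [scale lo hi( message would perform is just linear interpolation from the raw slider range onto [lo, hi]. A sketch in Python (function and parameter names are mine, for illustration only):

```python
def scale(value, lo=0.0, hi=127.0, in_max=127.0):
    """Map a raw slider value (0..in_max) linearly onto [lo, hi]."""
    return lo + (value / in_max) * (hi - lo)

scale(127)            # default behaves like a plain [hsl]: 127.0
scale(64, 0.0, 1.0)   # roughly 0.504
scale(127, 5, 7)      # 7.0
```

Note that dividing by 127 (the maximum) rather than 128 makes the top of the slider land exactly on the upper bound.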
Best,
Gilberto

[http://eyebeam.org/events/rjdj-skillshare][0]
December 5, 2009
12:00 -- 1:30 PM : Introductory workshop on Pd with Hans-Christoph Steiner
2:00 -- 6:00 PM : SkillShare w/Steiner and members of RjDj programming team
Free, capacity for up to 30 participants
RSVP HERE: [http://tinyurl.com/ykaq3l3][1]
Hans-Christoph Steiner returns to Eyebeam with members of the RjDj programming team from Europe to help turn your iPhone or iPod-Touch into a programmable, generative, and interactive sound-processor! Create a variable echo, whose timing varies according to the phone's tilt-sensor or an audio synthesizer that responds to your gestures, accelerations and touches. Abuse the extensive sound capabilities of the Pure Data programming language to blend generative music, audio analysis, and synthy goodness. If you're familiar with the awesome RjDj, then you already know the possibilities of Pure Data on the iPhone or iPod Touch (2nd and 3rd generation Touch only).
Creating and uploading your own sound-processing and sound-generating patches can be as easy as copying a text file to your device! In this 4-hour hands-on SkillShare, interactive sound whiz and Pure Data developer Hans-Christoph Steiner and several of the original RjDj programmers will lead you through all the steps necessary to turn your phone into a pocket synth.
How Eyebeam SkillShares work
Eyebeam's SkillShares are Peer-to-Peer working/learning sessions that provide an informal context to develop new skills alongside leading developers and artists. They are for all levels and start with an introduction and overview of the topic, after which participants with similar projects or skill levels break off into small groups to work on their project while getting feedback and additional instruction and ideas from their group. It's a great way to level-up your skills and meet like-minded people. This SkillShare is especially well-suited for electronic musicians and other people who have experience programming sound. Some knowledge of sound analysis and synthesis techniques will go a long way.
We'll also take a lunch break in the afternoon including a special informal meeting about how to jailbreak your iPhone!
Your Skill Level
All levels of skill are OK as long as you have done something with Pd or Max/MSP before. If you consider yourself a beginner, it would help a lot to run through the Pd audio tutorials before attending.
NOTE: On the day of the SkillShare we will hold an introductory workshop from 12:00 until 1:30 PM, led by Steiner, for those who want to make sure they're up to speed before the actual SkillShare starts at 2:00. The introductory workshop is for people who have done something in Pd or Max/MSP but are still relative beginners in the area of electronic sound programming.
What You Should Bring
You'll need to bring your iPhone or iPod Touch (2nd or 3rd generation Touch only), your own laptop, a headset with a built-in mic (especially if using an iPod Touch) and the data cable you use to connect your device to your laptop. Owing to a terrific hack, you won't even need an Apple Developer License for your device!
More Information
RjDj is an augmented reality app that uses the power of new-generation personal music players like the iPhone and iPod Touch to create mind-blowing hearing sensations. The RjDj app makes a number of downloadable scenes from different artists available, as well as the opportunity to make your own and share them with other users. RjDj.me
Pd (aka Pure Data) is a real-time graphical programming environment for audio, video, and graphical processing. Pd is free software, and works on multiple platforms, and therefore is quite portable; versions exist for Win32, IRIX, GNU/Linux, BSD, and MacOS X running on anything from a PocketPC to an old Mac to a brand new PC. Recent developments include a system of abstractions for building performance environments, and a library of objects for physical modeling for sound synthesis.
----------------------------------------------------------------------------
kill your television
[0]: http://eyebeam.org/events/rjdj-skillshare
[1]: http://tinyurl.com/ykaq3l3

Hi, this question goes out to those of you who use PD and Ableton live.
I am trying to control an instrument in Live, Live's Sampler, from PD. I have ctlout working just fine: I can control parameters like filter knobs on a Live Sampler from a PureData ctlout with no problem.
I am using the IAC MIDI driver to route MIDI from PD to Live, as I am on a Mac OS X machine.
My problem is that when I send noteout signals, Live is receiving the MIDI (as evidenced by Live's MIDI receive indicator lighting up), but it is not making notes play in the Sampler, even though I have it set to receive MIDI from that IAC bus. As I say, ctlout has no problem controlling a filter knob, etc.
Anyone have any suggestions? I am kind of a noob to MIDI stuff and to interapplication MIDI routing.

This is my live guitar effects rig as of Feb 14, 2009. Please let me know if you find it useful or have any ideas for effects or other improvements. If you make some music with it, I'd love to hear it!
Once I have some more time to program it, my next effect will probably be a Vocoder.
Run effectsrig.pd to load it up. A midi expression pedal is recommended for the best experience - but it's not required.
It contains the following effects:
whammy~
-------
Digitech whammy style pitch shifter. Allows for smooth changes to the pitch shift amount.
Based on the one posted by "kenn" on the puredata.info forums (which in turn is based on the pd example code).
shimmer~
--------
A "shimmer" synth-like effect. This is done with a pitch shift in a feedback loop of a very short delay.
octfuzz~
--------
Octave-up distortion like you can obtain with the classic transformer and two-diode rectifier circuit. Basically it just full-wave rectifies the audio signal. This one really brings out the high frequencies (sometimes a little too much!).
leslie~
-------
A stereo Leslie (rotating speaker) simulator. This is one of my favorites. If modulation is turned all the way down it becomes tremolo. Take one of the outlets for mono use. Try it in stereo for super-swirly bliss! When using an expression pedal to control the rate, heel down will bypass the effect.
Expression pedal control is done by expression.pd. It simply reads in MIDI and scales it to a 0->1 range. You can change the MIDI channel used by editing this file.
The preset system is a little hack-ish, but it works for me. If anyone has better ideas on how to do this, I'd love to hear them. When you load up the main effectsrig.pd file, you will see a bunch of message boxes. These are quick-settings buttons: just click one to apply that effect. They are designed so you can click a couple in a row to quickly apply a few different settings. To start over, click the big "default" one on the left.
It can also load presets based on MIDI messages. I use this with my Eventide TimeFactor pedal. When I change presets on the TimeFactor, PD follows along. This is handled by the box in the top right. The symbol box is for song titles, and the number boxes show the current TimeFactor preset. Open this box to see how I've done a couple of example MIDI-controlled presets. "pd your_love_never_fails" is a more complicated example that changes the expression pedal behavior slightly.
If you want to use a different MIDI channel for listening to program changes, just edit preset.pd and presetnum.pd. preset.pd outputs a bang when the preset number supplied as a parameter is chosen. presetnum.pd just outputs the number of the selected preset.
[http://www.pdpatchrepo.info/hurleur/effectsrig.zip][0]
[0]: http://www.pdpatchrepo.info/hurleur/effectsrig.zip

Hello all,
So I am working with some MIDI patching, and I noticed my version of pd-extended was terribly out of date (0.39.3). I downloaded and installed the current stable release (0.40.3), and none of my MIDI out objects work anymore. I have set my audio device and MIDI port (using MIDI Yoke) and I'm monitoring the port with MIDI-OX. I have uninstalled 0.40.3 and installed the current test release (0.42.0test5), and MIDI works fine, but I can't get the OSC controls working (using the oscx DLLs from the extended package).
Ideally I would like to get 0.40.3 working with MIDI, but another option would be getting the OSC controls working in 0.42.0test5. I know I could downgrade back to 0.39.3, but I would like to do that only as a last resort.
Anyone have any thoughts?
Thanks,
AltReality

what i don't understand is, why you need pd if you want to control traktor? isn't it possible to control traktor directly with nintendo?
however, have a look at the objects [noteout] and [ctlout]; this way you can send midi from pd via the midi out you selected in pd's properties. so, if you choose for example midi yoke 2 as midi out in pd, then you have to set midi yoke 2 as midi in in traktor. then sliders connected to [ctlout] will be sent to traktor.
an example: create a ctlout object with the arguments 60 and 1, which should look like this:
[ctlout 60 1]
connect a slider to the first inlet of the ctlout object.
now the slider values should be sent to pd's midi out as controller number 60 on channel 1.
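on the wire, that [ctlout 60 1] produces a standard three-byte control change message. sketched in python (illustrative only; the function name is mine):

```python
def ctlout(value, controller=60, channel=1):
    """Raw bytes [ctlout 60 1] puts on the wire: a Control Change status byte
    (0xB0 | channel-1), then the controller number, then the value.
    Data bytes are 7-bit, so they are masked to 0..127."""
    return bytes([0xB0 | (channel - 1), controller & 0x7F, value & 0x7F])

ctlout(100)  # 0xB0, 60, 100 -> CC 60, value 100, channel 1
```

this is the same message traktor expects to see arriving on midi yoke 2, so midi-learn on the traktor side should pick it up when you move the slider.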

Although historically used in a slightly different way, in today's modern theory a tetrachord is a four-note scalar pattern within the interval of a fourth. There are fundamentally four of them:
Lydian - C D E F#
Ionian - C D E F
Dorian - C D Eb F
Phrygian - C Db Eb F
All of the fundamental scales that are used in Western Music are made of combining these tetrachords together. For instance:
Locrian is made of a combination of Phrygian and Lydian tetrachords separated from each other by the interval of a minor 2nd:
C Db Eb F <---> Gb Ab Bb C
--C Phrygian--| min 2 |----Gb Lydian----
Now, of course you can come up with more authentic tetrachords other than these four and combine them together in any way you want to create many distinct sounding synthetic scales.
i.e.:
C Db Eb F# -- G Ab B C
in which two altered Phrygian tetrachords are combined, separated by a minor 2nd interval.
I don't know how you've got the impression of pianists only dealing with such theoretical things Maelstorm, but the truth is, this bit of theory has been greatly favored by several composers from Prokofiev, Bartok, Stravinsky to John Zorn, John Adams and many jazz giants who based their music on modal harmony, like Coltrane, Miles, Coleman, Monk, Sun Ra etc.
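For anyone who wants to experiment with this, the combination rule is easy to mechanize, which is handy for generating synthetic scales in a patch. A small sketch in Python (the semitone encoding and names are mine):

```python
# The four fundamental tetrachords as semitone offsets from their root,
# matching the spellings above (e.g. Phrygian: C Db Eb F = 0 1 3 5).
TETRACHORDS = {
    "lydian":   (0, 2, 4, 6),   # C D  E  F#
    "ionian":   (0, 2, 4, 5),   # C D  E  F
    "dorian":   (0, 2, 3, 5),   # C D  Eb F
    "phrygian": (0, 1, 3, 5),   # C Db Eb F
}

def combine(lower, upper, link=1):
    """Join two tetrachords: the upper one starts `link` semitones
    above the top note of the lower one."""
    base = TETRACHORDS[lower]
    shift = base[-1] + link
    return base + tuple(shift + i for i in TETRACHORDS[upper])

# Phrygian + minor 2nd + Lydian = Locrian: (0, 1, 3, 5, 6, 8, 10, 12)
combine("phrygian", "lydian", link=1)
```

As a sanity check, Ionian + major 2nd + Ionian gives the major scale: (0, 2, 4, 5, 7, 9, 11, 12).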

and here is a project patch i'm working on at the moment. it demonstrates, how useful it could be, to be able to change views within subpatches - if it would work a little more smoothly... ;)
open the folder and run Rumble_Box.pd.
note that this patch is not fully working and is only meant to demonstrate the mentioned effect. i guess this gui is too big for a view-change to be computed quickly, but for smaller guis this may be a nice feature!
edit: oh, i forgot to mention that the sequencer works with midi-out, so you have to assign a midi-out that runs into a midi-in. i solved this with midi yoke and midi ox: midi yoke 1 for in and out, connected via midi ox.
but for the visual aspect you will need no sound anyway... *grin*
[http://www.pdpatchrepo.info/hurleur/Rumble_Box.zip][0]
[0]: http://www.pdpatchrepo.info/hurleur/Rumble_Box.zip

This patch was something I first made several years ago in Max/MSP, inspired by Leon Gruenbaum's [Samchillian][0] controller. Here it is in cleaned-up Pd form:
[http://www.lubbertdas.org/rela.pd][1]
And here's an example of how to connect it to an instrument: [http://www.lubbertdas.org/relaphasor.pd][2]
Basically what you do is use the QWERTYUIOP keys (not sure if you need to change the patch for different keyboard layouts) to go up and down a scale by different degrees. The space bar repeats the last note, the - and = keys temporarily transpose the current note up or down a semitone, and the ` key turns sustain on and off. You can also change the musical key and scale.
It's hard to play a melody with this patch, but it's good for wanky prog theatrics, especially if you connect it to a more expressive-sounding instrument.
Modifications you may want to make:
-try adding keyboard shortcuts for certain key/scale changes, or to certain notes of the scale
-assign two keys to take you up/down one degree of the current scale without retriggering the note attack. This adds a great hammer-on/pull-off functionality to guitaristic shredding.
-figure out how to use a MIDI keyboard with it so you can play a note on the MIDI keyboard and go up or down from there on the QWERTY keyboard
EDIT: Forgot to mention that this probably only works right in Pd-extended, as it uses [sort] from Zexy.
[0]: http://samchillian.com/
[1]: http://www.lubbertdas.org/rela.pd
[2]: http://www.lubbertdas.org/relaphasor.pd

So I have been using LoopBe1 on a project. I started a new project that is a little more intensive on the MIDI load, and LoopBe1 bites it constantly.
I'm looking at MIDI Yoke and the Maple Virtual MIDI Driver. Any experience with these or others?
What I am doing is getting a signal from a Behringer BCF2000 into PD. PD then sends the input out on 4 channels through the virtual MIDI driver to Cubase, to use as a generic remote.
There are 6 continuous controllers and 3 switches capable of sending MIDI CCs to PD. There are also 4 other buttons that each send 9 CC messages at once to PD.
Any suggestions for unbreakable (given that I do not have midi loops in the system) virtual MIDI drivers?
Thanks

hi
Matt Black of Coldcut here.
as a step towards achieving multitrack audio visualisers, ie video synths which are controlled by multiple simultaneous audio inputs, i am interested in getting the following bit of software built, and think it could be made in PD. Am posting it here to see if anyone is interested. I can offer a fee to get this built.
This is the initial spec. I call it a Multitrack Analysis Module , MAM.
-MAM works with multi-input ASIO soundcards, e.g. RME Fireface, and supports up to 16 audio ins.
-MAM performs SEPARATE FFT/spectrum analysis on EACH AUDIO INPUT, say 16 frequency bands per input, and delivers an amplitude for each frequency band.
-outputs the results of the FFT analysis as MIDI data: use CC numbers 1-16, with values 1-127, for the 16 frequency bands' amplitudes. Use MIDI channels 1-16 to distinguish the 16 audio inputs. MIDI data can be routed to available MIDI interfaces/ports on the host machine, including virtual MIDI ports such as Maple or MIDI Yoke.
-optionally, output could be via OSC/Ethernet, which would get round possible MIDI data-rate problems. (Do people think MIDI could handle this amount of data? It could be thinned.)
-the MAM should be a self-contained module, a stand-alone patch that doesn't require a PD framework to run. As I don't know anything about PD, I don't know how it works, but you probably know what I mean.
-ideally, MAM would also be able to run with Ableton LIVE, so that one can route the analysis data off to another machine to do the visuals and still manipulate the audio on the machine with the audio ins in LIVE. This is not essential, but I would like to know if people think it would be possible.
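The band-to-CC encoding in the spec is straightforward; here is one possible interpretation as a Python sketch (names mine, not a commitment to any implementation). Note that 16 inputs times 16 bands per analysis frame is a lot of CC traffic, which supports the OSC option:

```python
def bands_to_cc(input_index, amplitudes):
    """Encode one input's band amplitudes (floats 0.0..1.0) as raw CC messages.
    Band n -> CC number n (1..16), value 1..127 per the spec;
    MIDI channel = input_index (1..16) distinguishes the audio inputs."""
    status = 0xB0 | (input_index - 1)          # Control Change on that channel
    msgs = []
    for band, amp in enumerate(amplitudes, start=1):
        value = max(1, min(127, round(amp * 127)))   # clamp to the 1-127 range
        msgs.append(bytes([status, band, value]))
    return msgs

# 16 bands for input 1: sixteen 3-byte messages, 48 bytes per frame per input
frame = bands_to_cc(1, [0.5] * 16)
```

At, say, 30 analysis frames per second and 16 inputs, that is roughly 23 KB/s of MIDI, well beyond classic 31.25 kbaud DIN MIDI, though virtual ports on one machine can usually absorb it.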
Hope this makes sense. I am looking for a good PD coder who is interested in working on a cutting edge project. A payment to build this initial module can be negotiated...not a huge one, but something. Interested parties can post here initially.
Thanks.
Matt Black

The problem with phi as a scale ratio is that it grows too quickly.
Starting at A0 here's the progression
frequency: 55
frequency: 88.9916
frequency: 143.991
frequency: 232.982
frequency: 376.972
frequency: 609.952
frequency: 986.92
frequency: 1596.87
frequency: 2583.78
frequency: 4180.63
frequency: 6764.38
frequency: 10945
Only the first 10 iterations are musically useful. It doesn't matter whether you use the reciprocal (period) as a delay time or shift your scale by a linear offset; the order of growth is still too high. There are many so-called scales derived from other formulas, like sqrt(phi), but to be honest they are not musically useful as scales either; I think they are really just mathematical curiosities. I've listened to "music" made with such methods and it's disappointing, to say the least. In fact, has anybody got links to any music based on phi scales (with proper documentation of its methodology) that actually sounds any good? From my limited and humble observations, people seem to get hung up on "magic" numbers and their quasi-mystical properties without really considering whether there is practical utility there.
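The progression listed above is just powers of phi on a 55 Hz base, which makes the growth problem easy to see. A sketch (tiny differences in the later decimals come from the precision of phi used):

```python
PHI = (1 + 5 ** 0.5) / 2   # the golden ratio, ~1.6180339887

def phi_scale(base=55.0, steps=12):
    """Frequencies obtained by repeatedly multiplying by phi."""
    return [base * PHI ** n for n in range(steps)]

for f in phi_scale():
    print(f"frequency: {f:g}")
```

Twelve steps already span from 55 Hz to roughly 10.9 kHz; compare twelve equal-tempered semitones, which cover only a single octave (a factor of 2 rather than phi^11, about 199).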

woo! i don't have to learn this programming stuff after all...(wipes brow)
the pd list came thru with an answer to midi start/stop/tempo issues:
*****
how about using pd's [midiin] or [ctlin] objects, then filtering the messages according to [http://www.borg.com/~jglatt/tech/midispec.htm][0]?
so 0xFA (250 in decimal) will be midi start, 0xFC (252) is midi stop. 0xFB (251) is midi continue.
> also, what about tempo? any way to send or receive that?
0xF8 (248) is the midi clock signal, which gets emitted 24 times every quarter note by your midi master. catch this signal and use [timer] to calculate the bpm out of it, if you want.
last but not least, there is the song pointer 0xF2 (242), which gives the current song position of your master in 16th notes.
hope that's what you wanted to know
:::::
thanks a lot, list poster christopher charles :)))
[0]: http://www.borg.com/~jglatt/tech/midispec.htm
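To summarize the byte values and the [timer] arithmetic from that answer, a quick sketch (Python, constant and function names mine):

```python
# MIDI system real-time / common status bytes mentioned above
MIDI_START, MIDI_CONTINUE, MIDI_STOP = 0xFA, 0xFB, 0xFC   # 250, 251, 252
MIDI_CLOCK = 0xF8                                          # 248
MIDI_SONG_POSITION = 0xF2                                  # 242

def bpm_from_clock(seconds_between_ticks):
    """MIDI clock arrives 24 times per quarter note,
    so bpm = 60 / (24 * interval between ticks)."""
    return 60.0 / (24.0 * seconds_between_ticks)

bpm_from_clock(60.0 / (120 * 24))  # 120.0
```

In the patch, [timer] would measure the interval between consecutive 248s coming out of [midiin], and an expression like the one above converts it to BPM (averaging over several ticks smooths out jitter).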

Hi
I'm a total noob to PureData. I find it a bit confusing at the moment so bear with me if this sounds a bit stupid..
I would like to use PD in conjunction with a hardware MIDI keyboard, to build my compositions in my usual way (I sort of "multitrack" separate instrument parts over a sequenced beat, and then zoom in on graphically-represented MIDI events and shift them here and there or delete them or change note properties). The MIDI events are then played through my soundcard's internal MIDI port. I would like to do ALL this with PD.
However, there's something NEW I'd like to do too. Namely, I want to use the aforementioned compositional technique, except define my own instruments using FM synthesis, and then to send certain channels to DSP effects units. I then want to be able to control, with my hardware MIDI controller, the wet/dry and other FX parameters relating to individual software effects, and to automate (i.e. record with MIDI) my own actions as I tweak the effects.
So far, no software allows me to have the perfect setup, but it seems that PD might be powerful enough for me to design and implement this environment to my exact requirements. My simple question is: is this a normal use of PD (i.e. can it be done)? Thanks

Nice question!
There's a little explanation in the [pd FAQ] object, which you've probably already read.
![explanation.PNG](/uploads/files/upload-6bd1f823-41fa-449b-8058-4d81a5407c10.PNG)
The last sentence should be replaced by something else, e.g.:
**[square root of 2]······>amplitude compensation factor** (so that a full blast sinusoid input, with a distortion amount of 1 [no distortion], is output as a ±1 signal and not as a ±0.707 signal).
- - -
**INPUT: SINUSOID** (distortion amount: 1, this means no distortion)
When you input a full blast sinusoidal test signal, you get the following readings:
- [env~] ≈ 97
- [dbtorms] ≈ 0.707
- [* 1.4142] ≈ 1
- - -
**INPUT: SQUARE WAVE** (distortion amount: 1, this means no distortion)
That's all great...our input was ±1 and our scaling factor also became 1. But this is only valid for a sinusoid, not for other waveforms. When you input a full blast square wave test signal, you get the following readings:
- [env~] ≈ 100
- [dbtorms] ≈ 1
- [* 1.4142] ≈ 1.4142
Is this 1.4142 scaling factor too big for our needs? Hm...
Our original square wave input becomes louder than it originally was. But this patch assumes that you are playing guitar, and that your guitar's signal will never be a full blast digital square wave (what kind of crazy guitar would that be?). So the 1.4142 scaling factor is right, in general.
- - -
So I imagine this satisfies your curiosity, multiplying by the square root of two is for amplitude compensation, for boosting the signal.
One could delete [* 1.4142] and forget completely about it; that would work too. However, deleting [* 1.4142] and using only the scaling factor produced by [dbtorms] would give a factor of just 0.707 (when the input is a full-blast sinusoid), and 0.707 is too small; I want a factor larger than 0.707. So my solution is multiplying 0.707 by 1.4142. That way (when the input is a full-blast sinusoid) the scaling factor becomes 1, which is more useful than the weaker 0.707.
- - -
Another extra issue is the final clipping function, [clip~ -0.707 0.707]. I guess one should delete it, or set it to [clip~ -1 1]. Hm...
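If you want to double-check the readings above, Pd's dB convention (100 dB corresponds to an RMS of 1) can be reproduced numerically. A sketch in Python (the function names mirror the Pd objects, but this is just an illustration of the arithmetic):

```python
import math

def rmstodb(rms):
    """What [env~] effectively reports: dB with 100 dB == RMS 1."""
    return 20 * math.log10(rms) + 100 if rms > 0 else 0

def dbtorms(db):
    """Pd's [dbtorms]: the inverse mapping back to linear RMS."""
    return 10 ** ((db - 100) / 20)

sine_rms = 1 / math.sqrt(2)              # full-blast +/-1 sinusoid
env = rmstodb(sine_rms)                  # ~97, as in the readings
factor = dbtorms(env) * 1.4142           # ~1, the compensated scaling factor
square_rms = 1.0                         # full-blast +/-1 square wave
factor_sq = dbtorms(rmstodb(square_rms)) * 1.4142   # ~1.4142
```

So the square root of two exactly cancels the 0.707 RMS of a full-scale sinusoid, which is the amplitude-compensation role described above.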

Hi Gilberto!
Yes, that will be useful so thanks for sharing that!
I've not got to try your patch yet and I won't be able to until tonight but from your description it sounds like its missing a couple of key features I require.
I've been thinking about how this app would work best and ideally it would utilize beat detection so you could potentially trigger a MIDI clip at the correct tempo with only one hit, which would best be triggered ~1 beat before the MIDI clip is to start playback.
There should be some visual indicator on-screen of the current tempo detected by the beat detector and I anticipate I'm probably going to want the ability to double/quad or halve/quarter etc the detected tempo before triggering the clips in some cases. After a midi clip has been played or stopped (if it was on loop), the app needs to automatically queue the next clip.
I have been in touch with the author of aubio and he's not aware of any existing Linux software that does what I want using his library, so it looks like PD may be the best route right now. I did a little research into this last night, and it seems I have a choice of at least two beat detectors that can work with PD: [beat] in Pd-extended, or IBT:
http://smc.inescporto.pt/technologies/ibt/
I'll only be able to pull this off with PD/IBT if PD's MIDI file player lets me override the tempo contained within the MIDI file with the one provided by IBT or [beat].

Hi everyone,
I'm not really a programmer and i need help creating some patches that i can keep and use to generate midi for my compositions. Nothing too complex. I can't afford Max, so will have to be in either PD or Bidule i guess?
Idea 1
I need a patch that generates a metronome rhythm: just a simple note-on/note-off metronome for one definable note, but with the ability to control the speed of the rhythm, from slow to really fast, on the fly, and with the note length set as a percentage of the interval between notes, again changeable on the fly. This then goes out to my DAW to trigger a sample.
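For what it's worth, the timing arithmetic behind Idea 1 is simple. In Pd it would be a [metro] for the interval and a [delay] for the note-off; here is the same logic as a Python sketch (names invented for illustration):

```python
def metronome_times(bpm, gate_pct, beats=4):
    """Note-on/note-off times (in seconds) for a metronome where the note
    length is a percentage of the interval between notes."""
    interval = 60.0 / bpm               # seconds per beat, changeable on the fly
    gate = interval * (gate_pct / 100.0)  # note length as % of the interval
    return [(n * interval, n * interval + gate) for n in range(beats)]

metronome_times(120, 50, beats=2)  # [(0.0, 0.25), (0.5, 0.75)]
```

Because both the BPM and the gate percentage are plain parameters, either can be mapped to a controller and changed while the metronome runs.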
Idea 2
Pretty much the same as Idea 1, but with a definable range of note outputs, so as to trigger different samples within a soft sampler in the DAW, choosing each sample randomly.
Idea 3
Same as Idea 2, but output to go to a synth. So would be cool to be able to output chords and for each note on trigger to also output random triggers for lfo/cut off/other cool parameters etc. too.
Idea 4
I'd also like a simple sequence tool which outputs a short repeating predefined sequence of midi note on and note off to send to the DAW, but where I can control and change the interval/time between each note on trigger on the fly to create varied and interesting patterns and again where the note length is changeable on the fly and set as a percentage of the interval between notes.
I want to record the midi output into a DAW where i can then edit it down into the most interesting bits and build compositions.
None of the ideas need a GUI. I just want these useful tools to keep and produce works with and that I can maybe add to and develop in the future when I've learned how to program!
Any ideas, help building or tips would be really appreciated!
Thanks,
Henry

Still on to this...
Is it possible to get fair results with [autotuned~]?
I am not able to control it beyond pitch shift. When clicking the message boxes containing the different scales, I can still play chromatic lines. I made a few test messages with just 12 0's or 1's, which I figured would make a drone note, but the result sounds pretty random. It seems that it always tries to repitch the notes B and C, no matter what scale you have selected. I cannot figure out how you set the root of the scale.
If it's just me doing something wrong and the patch actually is working, can someone please tell me how to use it?
lead's autotune(vocoder).pd I cannot get to work ("error: signal outlet connect to nonsignal inlet (ignored)"), and the in_ADC/out_MIDI window is too complex for me to work out.
I found a few other patches that I can try to stitch together if I cannot find a complete working patch: an audio-to-MIDI patch and a MIDI-to-audio patch.
I'd like to ask for an opinion on a good solution for what I am trying to do, to make sure I am looking for the right solution in the right place:
I want to be able to decide what pitches are affected or bypassed, and all affected notes should be possible to be transposed by the same amount. Normal.
As for tone, realism is not important, but it should not sound "destroyed" by default (ugly artifacts), and I actually want to have the wobbling between notes when the algorithm cannot decide where to go.
So formant "truly correct" is not essential, solid basic functionality is more important.
Is there a better solution for this somewhere?
Thanks a lot!
N.

So, I'm working on an interface that will translate gamepad inputs to certain MIDI controls that will control the routing and properties of an audio effects chain in Logic, including loop plugins within logic.
My main patch receives the input and sends it to one of many different subpatches that each deal with the same input in different ways.
For example, if subpatch 1 is active and button 1 is pressed, I want to send MIDI message 1 1 1 to logic, which converts that message in the logic environment to the correct message to, say, start recording loop one. If I want another button on another subpatch to perform the same function (loop one record-on), I would want the same message to be sent to Logic (1 1 1) from a different point in the chain.
Now, to make passing these MIDI messages within my patches easier, I have the digits shifted and parts summed, so that MIDI message 1 1 1 would be passed within the patch as 0010101 and 127 5 8 would be passed as 1270508.
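The packing scheme described here is plain positional arithmetic; a sketch (function names mine). One caveat worth noting: it assumes the second and third values stay below 100, while MIDI data bytes can reach 127, so three digits per field would be safer:

```python
def pack(a, b, c):
    """1 1 1 -> 10101 (i.e. 0010101); 127 5 8 -> 1270508.
    Only safe while b and c stay in 0..99."""
    return a * 10000 + b * 100 + c

def unpack(n):
    """Recover the three fields from a packed number."""
    return n // 10000, (n // 100) % 100, n % 100

pack(127, 5, 8)   # 1270508
unpack(1270508)   # (127, 5, 8)
```

In Pd itself the same round trip can be done with [* ], [+ ], [div] and [mod] objects, or avoided entirely by passing three-element lists with [pack]/[unpack].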
I expect to be cranking out more subpatches every week for possible input sets, all leading back to the same output to Logic for the same message ("loop one record-on" will be 1 1 1 for every subpatch). To make writing subpatches more intuitive, I'd like something analogous to #DEFINE in C, such as:
#DEFINE lp1RecOn 0010101
and my subpatches could use the name instead of the number. This also would make it easier to change my MIDI message processing scheme later. If I decide that some other function needs to be 1 1 1, and I change the "loop one record-on" function to use 1 5 4, I would only need to change the definition of the constant in one central reference file (an the routing in logic) instead of crawling through all of my patches, changing every instance of it in them.
Is there any way to #DEFINE constants in Pd?
(Or do something functionally equivalent?)

Basically i tested this on Reason and Ableton. It uses external midi control rather than a built in synth sound in pd. I don't really know anything about Pro Tools but if there is a device that plays midi control notes, then assign channels 1 & 2 to the proper midi driver through Pro Tools. Also you're going to need to set your MIDI output device 1 to the midi driver you use for pro tools.
[http://nerds.de/en/loopbe1.html][0]
use this
You'll have to set the channels for Pro Tools as well as the midi driver.
[0]: http://nerds.de/en/loopbe1.html

Thank you, aeo and Maelstorm!
And my apologies for long absence!
You're right, the \[date\] and \[shell\] objects would help me get the current system date, but that's not exactly what I need.
The problem looks like this:
There's a unit of measure, a day.
There's a couple of cycles, say, a 5-day cycle, a 10-day cycle, a 27-day cycle.
All cycles start simultaneously and run continuously, like sine waves with different periods, creating a different combination every day. It is easy to get the state of the system, i.e. the amplitude of each wave, for any given day of its operation.
However, we do not think in terms of "how many days", we think in dates. And our conventional Gregorian calendar doesn't have any simple mathematical formula behind it, so it's difficult to express in Pd. (Isn't it?)
So we have a \[bang( on, say, 01/01/99, the cycles start running, and we want to know the state of the system on, say, 01/01/01, i.e. on the 732nd day of its operation. Up to now I have had to calculate the number of days of operation in my head (and hopefully did it correctly). Can Pd do this job for me, so that all I need to do is input two dates, one indicating the start and one the target date?
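(To show the arithmetic involved, here is a sketch in Python, whose standard library already knows the Gregorian calendar; your example dates are interpreted as 1 Jan 1999 and 1 Jan 2001. In Pd itself the equivalent would likely mean shelling out via \[shell\] or using an external to get a day count for each date, then subtracting.)

```python
from datetime import date

def day_of_operation(start, target):
    """Day number of `target` if `start` counts as day 1."""
    return (target - start).days + 1

def cycle_state(start, target, period):
    """Position (1..period) within a `period`-day cycle on `target`."""
    return (target - start).days % period + 1

start, target = date(1999, 1, 1), date(2001, 1, 1)
print(day_of_operation(start, target))   # 732 (2000 was a leap year)
print(cycle_state(start, target, 5))     # position within the 5-day cycle
```

That confirms your mental arithmetic: 01/01/01 is indeed the 732nd day.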
Sorry if my explanation is awkward, I hope that the task itself is quite comprehensible!
Best,
Vadim

I'm not sure this is your case, because it seems you're using a MIDI software synth and a sensor, but I know that "if you hear double notes when playing your MIDI keyboard (slapback echo or flanging), the chances are that MIDI Thru in your sequencer and Local Control on your keyboard are not set up correctly. This causes the keyboard to sound two notes - the one you play on the keyboard and another being echoed from the computer...
On your keyboard you need to switch Local Control to Off. This disconnects the keys from the sound circuitry so sounds can only be played via messages arriving at its MIDI In" ([http://www.practicalpc.co.uk/computing/sound/miditop5.htm][0]).
But I'm not so sure this helps... What MIDI module are you using? Is it hardware or software? You can try monitoring incoming MIDI messages with MIDI-OX to check that everything is going right.
[0]: http://www.practicalpc.co.uk/computing/sound/miditop5.htm