You would need to have a streaming format in mind. What is it you want to achieve? Do you want the end result to be a file you can play back? Or do you want to somehow livestream to
other people?

NSURL simply contains a URL, a network address. It is basically just a fancy string.

If you just want a file in the end, then you would basically use omz's example, then upload the resulting file to a server. For some sort of livestream, you will need to research formats/protocols and figure out how to transform audio units into the right format.
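As a minimal sketch of the upload half: once you have the recorded file's bytes, you can POST them with the standard library's urllib. The URL, endpoint, and content type here are all hypothetical placeholders; your actual server will dictate them.

```python
import urllib.request

def build_upload_request(data: bytes, url: str) -> urllib.request.Request:
    # Package the raw audio bytes as an HTTP POST.
    # The content type is an assumption; match it to your recording format.
    return urllib.request.Request(
        url,
        data=data,
        headers={"Content-Type": "audio/m4a"},
        method="POST",
    )

req = build_upload_request(b"\x00" * 16, "https://example.com/upload")
# urllib.request.urlopen(req) would actually send it, given a real server.
```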

If I were to do this again, I would experiment with Audio Units, which call a block on small chunks of waveform. As is, this basically writes to a file every second, reads it back in, and displays it. I think I used overlapping recorders to minimize latency.
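To illustrate the write-a-chunk-then-read-it-back idea in plain Python (not the actual recorder code, just the file round-trip, using the stdlib wave module and synthetic silence in place of real microphone data):

```python
import os
import tempfile
import wave

RATE = 44100  # sample rate in Hz (assumption; match your recorder's settings)

def write_chunk(path: str, samples: bytes) -> None:
    # Write one chunk of 16-bit mono PCM to its own WAV file.
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(RATE)
        w.writeframes(samples)

def read_chunk(path: str) -> bytes:
    # Read the chunk back for display/processing.
    with wave.open(path, "rb") as w:
        return w.readframes(w.getnframes())

tmp = tempfile.mkdtemp()
one_second = b"\x00\x00" * RATE  # one second of silence as stand-in audio
path = os.path.join(tmp, "chunk0.wav")
write_chunk(path, one_second)
assert read_chunk(path) == one_second
```

Each one-second recording would land in its own file like this, and you process each file as it completes; overlapping two recorders just means the next chunk starts recording before the current one finishes writing.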