1) Would it make you happy to have standard user interfaces on complicated devices or programs like DAWs (digital audio workstations/recorders like Pro Tools, Nuendo, etc.), graphic editing programs (like Photoshop, Quark, etc.)?

2) Do you think standardizing would jeopardize some of your favorite features/programs if they had to conform to the standards?

3) Do you think standardizing would kill some creative ideas or implementations, or do you think companies could still present those ideas within the framework of standards?

4) In general, do you think attempts at standardization have been a good thing? Do cases come to mind where somehow the standards committees have totally screwed up? (One case comes to mind for me: the AES deciding Pin 2 was hot instead of Pin 3 on XLRs!)

Specific examples of standardizing or of great user interfacing that could/should be standardized are welcome.

I may use feedback here not only for my own interest, but also forward the feedback to friends in the industry who help formulate interfaces and standards on major industry products.

As always, I will be throwing out my own irritating opinions, within this thread.

Standards are potentially a good thing. The problem is that manufacturers will frequently walk all over them if they don't know better or can save a buck or two.

This was how we got Ampex installing XL mike connectors as their balanced and unbalanced line inputs rather than using barrier strips like everybody else did until the early 1960s. They chose pin 3 hot despite the existence of an international electrical interface standard that called for pin 2 to be hot. Because Ampex recorders had the first XL connectors many facilities had ever seen, most wired things up to the Ampex "standard." Meanwhile, microphone manufacturers, by far the largest users of XL mike connectors, began building mikes wired to the international standard in the late 1950s. The result was a lot of US facilities running pin 3 hot while using mikes that were pin 2 hot. Some rewired their mikes; many didn't.

Meanwhile in Europe, XL connectors didn't become common until the '80s, with most large facilities standardizing on pin 2 hot and rewiring anything that was pin 3 hot. The AES got involved when circuit-board XLR connectors showed up that people would not be able to easily rewire. Pin 2 won because it met the long-time international electrical interface standard, and to do otherwise would have meant rewiring all of the world's microphones that used XL connectors.

Many of us would really like to see analog interface standards in terms of both level capability and impedances. There's an insane amount of gear on the market that simply can't drive +4 dBu average line levels.
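To make the "+4" figure concrete, here's a minimal sketch of the standard dBu/voltage conversion (the function names are mine, not from any library): 0 dBu is referenced to 0.775 V RMS, so +4 dBu works out to about 1.23 V RMS, and gear needs considerable voltage swing above that for headroom.

```python
import math

# 0 dBu is referenced to 0.775 V RMS (the voltage that dissipates
# 1 mW into a 600-ohm load).
DBU_REF_VOLTS = 0.775

def dbu_to_vrms(dbu: float) -> float:
    """Convert a level in dBu to RMS volts."""
    return DBU_REF_VOLTS * 10 ** (dbu / 20)

def vrms_to_dbu(vrms: float) -> float:
    """Convert RMS volts back to dBu."""
    return 20 * math.log10(vrms / DBU_REF_VOLTS)

# +4 dBu nominal line level is about 1.23 V RMS; a device with
# 20 dB of headroom above that has to swing +24 dBu (~12.3 V RMS)
# without clipping.
print(round(dbu_to_vrms(4), 3))   # ~1.228
print(round(dbu_to_vrms(24), 2))  # ~12.28
```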

The AES created a wonderful multitrack digital interface called MADI. Unfortunately, Alesis was able to save a bundle by using off-the-shelf Toslink parts as their 8-channel digital multitrack interface, and Tascam followed suit using a variation on Sony's SDIF interface.

We are well on the way to a workstation file standard called AES-31. It provides the same functions as a roll of multitrack tape but nothing additional because that would limit developers and would force engineers to turn in their traditionally proprietary mix information with their files. Once we have this standard, developers will be able to create specialized applications that should be a lot better than the Swiss Army knives we use today.

I totally agree, Bob! I knew about the MADI interface and thought it was great! Also agree on the potential for standards to be a great thing... but they gotta make sure the users are mostly happy with the setting of them. Wasn't MADI actually a creation of Sony??

Well, we can't be too mad, because cable and connector technology for high-speed transmission has come a long way. Even FireWire is an amazing format, although every time I look at that little floppy, chrome-covered connector I go EEK!

What we really need is a cabling and connector system that will stay stable as the actual transmission speed changes. Right now everyone has like 30 connectors on the rear of their digital boxes to deal with the many formats. It's almost as bad as (gasp) the ever-changing IBM PC card slot formats! I give PCI about 2 more years lol. I'm not even going to bring up Bill Gates' pathetic excuse for a huge, unreliable, idiotic "moving target" set of operating systems going under the pretty name of WINDOWS. Oh how I'd love for someone to really start a competing standard that was simple, swift, and beautiful... Mr. Weeny Gates is one fine businessman tho! Apple, on the other hand, continues to make one bad business decision after another...

I had no idea about the mic/Ampex thing, honestly. All I know is you can look at an XLR from 10 feet away and tell which pin is pin 3... THAT SHOULD HAVE BEEN THE HOT ONE! It helps when you have a "turnaround" or for the sound guy working in the dark. Visually it's just so intuitive to have the top of a triangle be the hot pin.

Originally posted by Dave Derr
... Mr. Weeny Gates is one fine businessman tho! Apple, on the other hand, continues to make one bad business decision after another...

Apple only gets away with regularly shooting themselves in the foot because Gates is the competition. I think they're both rock-star wannabe hustlers.

If they can find a cheaper way to make a connector, you can bet they'll do it. Did you know that it's "legal" for a USB port to drop bits? They assume that you'll be applying massive buffering and error correction using your main CPU. It's really just a way to reduce the chip count and connector cost in a computer.

The thing that really bugs me is the amount of time you have to invest in learning a MIDI editor or digital audio workstation. I mean, shouldn't little things like PLAY/STOP always be the space bar? Shouldn't something like ALT-E (the Apple "Splat"-E), or CTRL-E, always bring up an edit box?? It's always been my opinion that you should never have to read the manual to learn basic functions.

You know... BASIC CONTROLS SHOULD BE THE SAME, don't you think? Even with word processors there's nowhere near enough standardization of controls. (The worst example of an idiosyncratic interface was the early WordPerfect interface! Holy HELL, Batman.)

Doesn't it bother anyone as much as me that Nuendo and Pro Tools are like learning two different languages?

I remember quick keys from long, long ago! After you invest the time to set it up, it will save you lots of trouble!

ANOTHER PET PEEVE - CONVERTER SPECS!

Another thing that needs redefining or standardizing is converter specs! Good lord, the converter manufacturers sell lots of snake oil. Does anyone actually think we have 24 bit converters?? <laughing> Well, I'll promise you that not ONE specification measures at 24-bit specs, and there's usually 3-5 useless bits coming out of those things!

The way our digitizing system works out, there is a theoretical 6 dB increase in dynamic range for every bit added to a converter. A 24 bit converter should therefore have a dynamic range of 144 dB! Pshawwww! Someone once told me that once you get over 120 dB of dynamic range, your body functions determine the noise floor. Honestly, I don't think we can achieve an honest dynamic range of 130 dB, unless we elevate our overall operating levels to +30 dBm, which involves 60-volt swings in a non-differential, single-ended system. The laws of thermodynamics are unrelenting. It's my opinion you shouldn't believe any published dynamic-range spec over 130 dB, and not just on converters, but on any audio device!
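The 6 dB-per-bit rule above can be sketched in a couple of lines; the exact figure is 20*log10(2^N), which works out to about 6.02 dB per bit (the function name here is mine, for illustration):

```python
import math

def ideal_dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of an ideal N-bit converter:
    20*log10(2**N), i.e. roughly 6.02 dB per bit."""
    return 20 * math.log10(2 ** bits)

for bits in (16, 20, 24):
    print(bits, round(ideal_dynamic_range_db(bits), 1))
# 16 -> 96.3 dB, 20 -> 120.4 dB, 24 -> 144.5 dB
```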

The thing that is unfair, and cheats everyone, is that people can call a converter 24 bits just because it puts out 24 bits. "Well hey, y'all (in a cheap, friendly Southern accent)... Step right up while I demonstrate this purty little 100-bit analog-to-digital converter. Come one, come all, and count the bits. Yes yessss, and we even have a full 19 useful bits! Take those extra 81 bits and use them for dither."

<Laughing> Really! What will they call it when someone actually makes a 24 bit converter that performs close to 24 bits? 32 bits?? It's like putting a 300 MPH (480 km/h) range on a car's speedometer... like we should believe the car is gonna go that fast.

As users, we should demand that at least one key audio specification is within "one bit" of the published "resolution" of the converters. Here's a little chart of what we should expect ideally from different converters, at the theoretical 6 dB per bit:

16 bit: 96 dB
20 bit: 120 dB
24 bit: 144 dB

To my knowledge, there is not one converter that gets the "unweighted" distortion of the 16-bit converter above, not even the 24-bit converters. Also, I'm reasonably sure that there isn't one converter that gets 20-bit unweighted noise performance of 120 dB. Send me the details if you find one, and I want to see what test setup they made the bench measurements on! (I wanna buy that guy's BENCH.)

I think it's time we started demanding that manufacturers publish resolution in "meaningful bits," and not these fake, wishful-thinking "dither" bits.

"Standard" commands for DAWs would be a good thing. At the moment you don't even get standardisation between platforms with Pro Tools. When I looked at getting a home Pro Tools rig, I was keen to run it on a PC, because the machine would have to do double duty with office software and my partner was sharing the cost. But the Pro Tools rigs at work were all Mac, and the keyboard command set was completely different between Mac and PC. I figured I'd never know where I was, so I had to persuade my partner we had to get a Mac. Luckily she agreed. But if you can't even get parity across platforms, what hope is there?

Further to the XLR business, what about the manufacturers who put the wrong sex connector on stuff? I've occasionally come across a male socket (pins) for an input- what's up with that?

A friend told me that the main advantage of 24 bit converter chips is that you can get in and out differentially and use common mode rejection to kill a lot of RFI both inside the chip and on your circuit board.

Originally posted by Doug Ring
Further to the XLR business, what about the manufacturers who put the wrong sex connector on stuff? I've occasionally come across a male socket (pins) for an input- what's up with that?

Upside-down jacks or not, I think MIDI is one of the great success stories of standardization. The huge explosion of MIDI devices in the '80s and '90s was directly due to the fact that everything MIDI could be used with everything else. Even the lowliest keyboard gained extra usefulness because it could be hooked up with all your other stuff.

MIDI "works" precisely because everybody got on board. You never have to "convert" your MIDI information to some stupid proprietary format just to use "Brand Z" equipment. Imagine if you did.
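That interoperability comes from MIDI fixing the exact byte layout of every message. For instance, a Note On is always a status byte of 0x90 ORed with the channel number, followed by the note number and velocity; any brand of gear parses the same three bytes. A quick sketch (the helper function is mine, not from any library):

```python
def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Build a MIDI Note On message: status byte 0x90 | channel
    (channels 0-15), then note number and velocity (0-127 each)."""
    assert 0 <= channel <= 15
    assert 0 <= note <= 127 and 0 <= velocity <= 127
    return bytes([0x90 | channel, note, velocity])

# Middle C (note 60) at velocity 100 on channel 1 (index 0):
msg = note_on(0, 60, 100)
print(msg.hex())  # 903c64
```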

The growth of MIDI to include computer stuff was a major impetus to get musicians (well, me anyway) using computer gear.

Having some of your jacks 'upside down' is good for your cables- it gives them a chance to flex the other way part of the time. heh

Good point, Joeq. Now if they could just standardize the interface on MIDI editors... Ever notice how people almost always stick with the first MIDI sequencer they learn well, 'cuz the overhead in learning another is just tooooo painful!?

HEY, I'm a day past my month, but I have one more rant related to standards.

CAN WE PLEASE FORM A COALITION TO GET RID OF THE HORRIBLE PLASTIC CASES FOR CDs? <laughing!> Holy cow, I don't think there is one CD case in my car that isn't cracked, hanging open because of a broken hinge, or jammed up from some other type of damage. It's an insult to have to deal with them in every way, and I'm sitting here imagining landfills filling up with CDs and broken CD cases.

Can you believe that someone actually designed those things and thought it was good? Someone describe a redeeming quality about those plastic folding CD cases... PLEASE!? Here, let me itemize the down sides:

1) Hard to open initially, especially now that they have like nine pieces of security tape in various places around them. It's really impossible without some tool now.
2) Hard to open subsequently. I mean, you rarely get the CD out on the first try! What dweeb or little slacky engineer designed it and said, "Ohhh, this is neat to fight with"?
3) Getting the liner notes in and out is a complete joke.
4) They break the first time you have a problem or drop them. I've never seen a more fragile consumer product. They crack if you sit on them or put pressure on them. The hinges are 1/4-inch pieces of brittle plastic, totally unfit for the real world.
5) They're not that cheap to replace. The CD is cheaper than the case these days!

Can anyone come up with a redeeming feature? I dare ya!

I'd rather have the cardboard sleeve ones... the bad thing about them is once you put them in your rack of CDs, you can't read the label from the edge.