From the user's point of view, the most important thing about plugins is a standardized interface and tight integration into the Blender UI. The best example I know is MAX, where plugins simply appear as additional buttons. The thing is, MAX was developed from the very beginning with that "feature" in mind, whereas Blender might need some GUI modifications.

Blender doesn't even have the necessary plugin/scripting API to do everything we want it to do. We'd need to make Blender's architecture allow us to do things like rendering and real mesh editing before we could make a real interface for accessing such functions, right?

Coming up with a good system now means we won't have two independent types of plugins that end up needing to be presented in totally different ways...

Once we have an interface, then any new development on the plugin API should try to fit that interface.

If Blender had NO plugin API, then I'd say you're right. But making an interface that makes it easy to use the plugins is a good tool _even if_ you can only do the things you can do now.

What we should see is:
Python - Access to all object types.
Plugins - Post-Processing Plugin (can do ZBlur or Glow on stills w/o going to the sequence editor)
- Volumetric Texture plugin system (calls a texture plugin at a specified interval along the eye vector when inside the bounding mesh)
- Mesh Plugin (define new 'primitives' - could be done in Python if made easy to use)
(Don't know if this belongs here, but) A better library system, so whole objects (including textures and children) can be imported from a lib AND exported to a lib.
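To make the post-processing idea concrete, here's a rough sketch in Python. None of these names exist in Blender - they're made up purely to illustrate the shape of the interface: the host hands the plugin the rendered pixel buffer, and the plugin returns a modified one.

```python
# Hypothetical post-processing plugin interface (names are invented, not real Blender API).

def glow(pixels, width, height):
    """Example plugin: brighten pixels above a threshold (a crude 'Glow')."""
    out = list(pixels)
    for i, (r, g, b) in enumerate(pixels):
        if r + g + b > 2.0:  # bright pixel, push it toward white
            out[i] = (min(r * 1.5, 1.0),
                      min(g * 1.5, 1.0),
                      min(b * 1.5, 1.0))
    return out

def apply_post_plugins(pixels, width, height, plugins):
    """The host would run each registered plugin over the render result in turn."""
    for plugin in plugins:
        pixels = plugin(pixels, width, height)
    return pixels

image = [(0.9, 0.9, 0.9), (0.1, 0.1, 0.1)]  # a tiny 2x1 'render'
result = apply_post_plugins(image, 2, 1, [glow])
```

The point is that a ZBlur or Glow on a still never needs the sequence editor: the host just chains the registered callbacks over the final buffer.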
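The volumetric item in the list can be sketched the same way: step along the eye ray at a fixed interval while inside the bounding mesh, calling the texture plugin at each sample and accumulating density. Everything below is a hypothetical illustration, not real Blender code:

```python
# Hypothetical ray-marching loop for a volumetric texture plugin.

def march(texture, enter_t, exit_t, step):
    """Sample 'texture' every 'step' units between the eye ray's entry and
    exit points on the bounding mesh, accumulating density."""
    density = 0.0
    t = enter_t
    while t < exit_t:
        density += texture(t) * step  # the plugin supplies density at this sample
        t += step
    return density

# A constant-density 'texture plugin' for demonstration:
fog = lambda t: 0.5

# The eye ray enters the bounding mesh at t=1.0 and leaves at t=3.0:
total = march(fog, 1.0, 3.0, 0.5)
```

The renderer would own the loop; the plugin only answers "what's the density at this point?", which keeps the plugin interface small.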
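And a mesh-primitive plugin could be as small as a function that returns vertices and faces - exactly the kind of thing Python could handle if the hooks existed. Again, a made-up sketch, not an existing API:

```python
# Hypothetical mesh-primitive plugin: build an n x n grid of quads in the XY plane.

def grid_primitive(n, size=1.0):
    """Return (verts, faces) for an n x n quad grid."""
    verts = [(x * size / n, y * size / n, 0.0)
             for y in range(n + 1) for x in range(n + 1)]
    faces = []
    for y in range(n):
        for x in range(n):
            a = y * (n + 1) + x  # index of the quad's lower-left corner
            faces.append((a, a + 1, a + n + 2, a + n + 1))
    return verts, faces

verts, faces = grid_primitive(2)  # a 3x3 vertex grid: 9 verts, 4 quads
```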

Note: most of these can be worked on independently, as long as a basic structure is followed.