This is a heads-up, or sneak preview, of some work in progress both here at CHT and by some collaborating devs.

We're working on some new software ideas that we think will open up exciting new possibilities, both for those in the community who want to move Orbiter forwards, out of the stagnant situation it's been in for some time now, and for UI designers who are constrained by the limitations of today's Orbiter and by the tools we have today for designing new UIs, namely HA Design & Quick Designer. As with our earlier work on Web Orbiter 2.0, and subsequently on Touch Orbiter, we hope this new software will spark the community's imagination and trigger loads of new and exciting ideas from the community.

So what are we up to? Firstly, we are not releasing any software today... but we will be in about 4-5 weeks' time. We're doing a number of things in parallel, some of which are approaching being complete enough to release to the community (all code & docs will be GPL from the get-go, by the way), which is why I am taking the opportunity to mention them in this thread as a precursor to making them available in early May 2011. The first of these is a pretty much ground-up rewrite of Proxy_Orbiter that will re-engineer it so that it does not render screens as it does today, but instead 'renders' XML. That XML will describe screens and the visual elements they contain or are constructed from, ie the canvas or backdrop, the buttons and other UI controls, and the other UI elements that make up any screen. Those UI elements can contain bitmaps of course... but the new Proxy_Orbiter will not hand off pre-rendered screens to Orbiters.
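To make that a little more concrete, a screen description in the new XML might look something like the sketch below. To be clear, this is purely illustrative: the real element and attribute names are not final, and this is just the general shape of the idea.

```xml
<!-- Hypothetical sketch only: the actual schema is still being designed -->
<screen id="main" width="800" height="480">
  <canvas image="backdrop.png"/>
  <button id="lights_on" x="20" y="40" width="120" height="48"
          label="Lights On" image="btn_lights.png"/>
  <list id="media" x="200" y="40" width="560" height="400">
    <item label="Track 1"/>
    <item label="Track 2"/>
  </list>
</screen>
```

The point is that the Orbiter receives a description of the screen's structure, not a finished bitmap of it.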

So outwardly the new Proxy_Orbiter will talk XML, and inwardly it will interface to the rest of the system pretty much as it does now; the objective is to make it a 'drop-in' replacement for today's Proxy_Orbiter in that respect. Of course, this new outward-facing XML means that new Orbiters will need to be created that can interact with the new XML API, so we are also building some example Orbiters: one implemented in Qt (we like Qt, and that's about as deep as it goes :-) ) and, we hope, one for iOS. These will use the new XML API exposed by the new Proxy_Orbiter and will act as implementation examples for those interested in developing new Orbiters, just as we did with Web Orbiter 2.0 & Touch Orbiter. Again, we will release the full C source code for both the new Qt and iOS Orbiters, so that anyone wanting to write another implementation can pore over our code and understand how we implemented it. This, we hope, will make it much easier to build new Orbiters for any OS, using any dev environment, on any device. This is the model we used for Touch Orbiter in particular, and we hope it will trigger an avalanche of new Orbiters for all manner of devices... and potentially this even extends to Orbiters running on MDs too...

So that all sounds great... but you're asking why this is so exciting both for software developers and for UI designers? Firstly, the XML API will free the software developer to implement the UI using whatever UI elements/widgets the OS/platform she/he is developing for supports, ie a new Android Orbiter can look like an Android app and can exploit any UI widgets/controls available; the same goes for iOS, Qt or Clutter. Because the XML API does not deal with pre-rendered screens/bitmaps, the new Orbiters are not constrained and can utilise the full range of UI elements/controls/widgets available on that device/OS/platform, such as dynamic scrolling lists on, say, Android or iOS devices, or some super-animated dynamic equivalent in an Orbiter built in Clutter.
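As a rough illustration of what this frees an Orbiter author to do, here is a minimal Python sketch (the schema and all names are hypothetical, the real XML API is not finalised) that parses a screen description and maps each element onto whatever 'native' widget the platform offers, rather than blitting a pre-rendered bitmap:

```python
# Hypothetical sketch: the real XML schema and element names are not final.
import xml.etree.ElementTree as ET

SCREEN_XML = """
<screen id="main">
  <button id="lights_on" label="Lights On"/>
  <list id="media">
    <item label="Track 1"/>
    <item label="Track 2"/>
  </list>
</screen>
"""

def build_native_widgets(xml_text):
    """Walk the screen description and record one native widget per element.

    A real Orbiter would create Qt/UIKit/Android widgets here; this sketch
    just records what it would create, to show the mapping idea.
    """
    screen = ET.fromstring(xml_text)
    widgets = []
    for el in screen:
        if el.tag == "button":
            widgets.append(("NativeButton", el.get("label")))
        elif el.tag == "list":
            items = [item.get("label") for item in el.findall("item")]
            # On Android/iOS this would become a dynamic scrolling list.
            widgets.append(("NativeScrollingList", items))
    return widgets

widgets = build_native_widgets(SCREEN_XML)
```

The same XML drives every platform; only the mapping layer differs per Orbiter, which is exactly why each one can look and feel native.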

Once this new XML API is in place it will also enable some other exciting things... such as new WYSIWYG visual UI design tools, to make designing/testing new UIs, or updating existing ones, fast and simple. These new tools will, we hope, make UI design/development much more accessible to many more people here in the community, and we hope this new accessibility will attract new design-oriented members to join us here.

So that's about it for now... we're announcing this now so that we can warm people up to the idea, get a little feedback here in this thread, answer some questions and generally get things explained as we move closer to releasing the code/API etc. Right now we're in the final build-up to release, and in the coming weeks we will have some other announcements relating to other areas of the system... DCErouter is one of these areas. So expect more on that soon.

A new way of creating Orbiter designs is one of the things I am looking forward to. Now we can make the kind of great-looking UI that every popular platform has these days and lift it into 2011. Since I am trying to create a fresh new-looking skin, this is great news for me.

To be able to create a new skin for iOS, what is the best program to use (Mac, Snow Leopard)? Or must I download the SDK from Apple? What do you recommend as the best way to learn the new way of creating skins?

And again I have to ask: could you please post a video as a teaser of the possibilities?

Thanks a lot, br,

Raymond



Quote

A new way of creating Orbiter designs is one of the things I am looking forward to. Now we can make the kind of great-looking UI that every popular platform has these days and lift it into 2011. Since I am trying to create a fresh new-looking skin, this is great news for me.

To be able to create a new skin for iOS, what is the best program to use (Mac, Snow Leopard)? Or must I download the SDK from Apple? What do you recommend as the best way to learn the new way of creating skins?

You will only need to work with the new Orbiter source code, and therefore the iOS SDK, if you want to make low-level changes to how an Orbiter functions (similar to today's Orbiters). For most UI/skin development you will just need off-the-shelf graphics tools like Gimp etc and the screen layout editor, to visually arrange your screens/UI elements. So the Qt & iOS Orbiters we will deliver will be fully functional, and for most skin development should be adequate initially.

Quote

And again I have to ask: could you please post a video as a teaser of the possibilities?

We might release some sample video nearer release... but the real point of this is not to ship you the best UI skin and 'Wow' you, but to provide you all with new tools so you can 'Wow' us ;-)

Thanks for all of the positive responses here in the thread...nice to get a sense that we are doing something that people find interesting.

One point that I did not really explain in my initial post at the top of the thread is that, out of the gate, the new Proxy_Orbiter & Touch Orbiters should work just fine with existing skins... and with any new skins built with the current Orbiters in mind. When you load an existing skin you may lose some of the richer new dynamic UI functionality (ie some of that will require new attributes to be added to your UI to exploit it). I can't be precise about how much of the new rich UI will automatically get rendered in the new Orbiters... I may be able to in a few weeks' time as we get close to released code.

If I can be more specific then I will post that info here... maybe with a screenshot or two showing 'before & after' views of a UI running in the old & new Orbiters.

Hope the above clarifies at least to some extent how the transition to the new Orbiters will work.

Sounds interesting. As soon as you have any of the design aspects drafted, please share, particularly the XML interface design and schemas. That will enable investigation into mapping the items onto native UI capabilities. Hopefully then we can have an Android implementation sooner rather than later.

I am interested in how you see this enabling the new UI widgets and elements, though. Using a proxy design, won't we still be tied to HAD Designer and DB-stored screen designs? This seems to be just a new rendering for the existing Orbiter, right? So instead of rendering each designer object with, say, SDL code (which then gets captured to an image by the proxy Orbiter), you will render to data structures, allowing remote systems to do the native rendering (at runtime rather than design time). Does that match what you are thinking? If so, then aren't we still tied to the screens and design objects in HAD Designer and our database, albeit with remote native rendering? Hopefully I am missing something, but how then will this enable WYSIWYG editors and the like?

For example, the XML would still be describing screens and their layouts, rather than the data constructs that need to be displayed:

<screen width="xxx" height="yyy">
  <button xpos="xx" ypos="yy"/>
  <button xpos="xx" ypos="yy"/>
  <list>
    <row><column>DDD</column></row>
  </list>
</screen>

Rather than:

<medialist type="audio">
  <mediaitem name="xxx" artist="yyy" path="nnn"/>
  <mediaitem name="xxx" artist="yyy" path="nnn"/>
</medialist>

So you would still be tied to buttons in certain spots, and lists of data already constrained for a particular screen, etc. So how do you do the screen design with a WYSIWYG editor?

Don't get me wrong, I'm certainly not trying to be negative. I just want to get a better understanding so I can start to design an Android version, and to create some healthy discussion.

I still think that this is moving forward and will create more flexibility in the proxy Orbiters, and somewhat more native-looking ones.

Quote

Sounds interesting. As soon as you have any of the design aspects drafted, please share, particularly the XML interface design and schemas. That will enable investigation into mapping the items onto native UI capabilities. Hopefully then we can have an Android implementation sooner rather than later.

We'll share some of our thinking here in the coming weeks for sure. I am taking some vacation time in the next couple of weeks, but I will still be in touch with developments while I am away, and will post updates and try to answer questions etc.

Quote

I am interested in how you see this enabling the new UI widgets and elements, though. Using a proxy design, won't we still be tied to HAD Designer and DB-stored screen designs? This seems to be just a new rendering for the existing Orbiter, right? So instead of rendering each designer object with, say, SDL code (which then gets captured to an image by the proxy Orbiter), you will render to data structures, allowing remote systems to do the native rendering (at runtime rather than design time). Does that match what you are thinking? If so, then aren't we still tied to the screens and design objects in HAD Designer and our database, albeit with remote native rendering? Hopefully I am missing something, but how then will this enable WYSIWYG editors and the like?

For example, the XML would still be describing screens and their layouts, rather than the data constructs that need to be displayed:

<screen width="xxx" height="yyy">
  <button xpos="xx" ypos="yy"/>
  <button xpos="xx" ypos="yy"/>
  <list>
    <row><column>DDD</column></row>
  </list>
</screen>

Rather than:

<medialist type="audio">
  <mediaitem name="xxx" artist="yyy" path="nnn"/>
  <mediaitem name="xxx" artist="yyy" path="nnn"/>
</medialist>

So you would still be tied to buttons in certain spots, and lists of data already constrained for a particular screen, etc. So how do you do the screen design with a WYSIWYG editor?

Quote

Well, initially we will be doing the former, ie XML that describes screens and their layouts, so that we can bridge the gap between where we are today and where we want to go in the future with UIs and Orbiters. I think that ultimately we want to get to XML that, as you say, represents the data constructs that need to be displayed, but currently we don't think we can jump there in one go. The initial WYSIWYG editor is in fact based on the new Orbiter... so it will not be 100% WYSIWYG in reality. Essentially we are using the guts of the Orbiter to render the UI, and adding some code that will allow you to move/adjust and interrogate the screen you are viewing, and then commit those changes so that they are retained. This basic editor can then be extended in various ways... you might want to dump the XML out to local storage for manipulation in a text editor, for example (do a grep and mass replace/change of some elements across the whole UI, a job that today is very complex and manual).
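To illustrate the 'dump the XML and mass-change it' idea, here is a small Python sketch. The schema is again purely hypothetical, but it shows the sort of whole-UI edit (here, widening every button across every screen at once) that is very complex and manual with today's tools:

```python
# Hypothetical sketch: assumes a dumped UI described by the (not yet final)
# screen XML. Widens every button on every screen in one pass.
import xml.etree.ElementTree as ET

DUMPED_UI = """
<ui>
  <screen id="main">
    <button id="a" width="100"/>
    <button id="b" width="100"/>
  </screen>
  <screen id="media">
    <button id="c" width="100"/>
  </screen>
</ui>
"""

def widen_all_buttons(xml_text, delta):
    """Add `delta` to the width attribute of every button in every screen."""
    root = ET.fromstring(xml_text)
    for button in root.iter("button"):
        button.set("width", str(int(button.get("width")) + delta))
    return ET.tostring(root, encoding="unicode")

result = widen_all_buttons(DUMPED_UI, 20)
```

Once the UI lives as XML on disk, any text-processing tool (a script like this, grep/sed, or a future visual editor) can make sweeping changes and then commit them back.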

Quote

Don't get me wrong, I'm certainly not trying to be negative. I just want to get a better understanding so I can start to design an Android version, and to create some healthy discussion.

I still think that this is moving forward and will create more flexibility in the proxy Orbiters, and somewhat more native-looking ones.

I guess ideally we'd start with a 'clean sheet of paper' and not have to worry about carrying existing parts of the system and legacy data/code with us. But in reality that just is not viable. So I think initially you're right: we will get maybe 40% of the improvements we ultimately want in the first release (I think that's a reasonable estimate of where we'll be), and then we'll move along a path of cleaning things up to the point where we eventually get the rest.