A user’s physical and virtual environments are becoming increasingly interwoven. Mobile and embedded devices are now the rule rather than the exception: many present-day users have little experience with traditional computers, yet are in daily contact with computerized systems. The growing diversity of devices, combined with their ability to run arbitrary software autonomously, opens up countless potential areas of use, mobile or otherwise. The ability to communicate with other devices in the (in)direct environment, and to observe that same environment, allows us to develop ambient intelligent software that has knowledge of both its surroundings and its own usage patterns. Despite existing support for developing this kind of application, gaps remain that make it difficult to create consistent, usable user interfaces. Design methodologies for interactive systems need to support this new situation: an interactive software system is no longer designed for one particular piece of hardware or a single context of use. We turn to Model-Based User Interface Development and investigate what is needed to support the design and creation of interactive systems that can be used in multiple contexts. Model-Based User Interface Development uses a selected set of models to describe different aspects of a user interface, such as the user, domain, task, dialog, and presentation. We select three widely accepted models (the task model, the dialog model, and the presentation model) and introduce the enhancements necessary to serve our purpose. ...