These maturing productions have demonstrated that adaptive programming can put audiences at the heart of flexible, personalised, IP-delivered media experiences in entirely new ways.

However, our work is still handmade. Even our end-to-end productions are bespoke one-offs. We haven't yet cracked scalable, repeatable production of object-based media (OBM) experiences, something which is essential if we're to make them a viable, business-as-usual option for production teams and widen the field of possible OBM experiences.

To address the challenge of moving OBM from bespoke demonstrators to increasingly sophisticated object-based productions, we're exploring four areas under the toolkit umbrella: Tools, Exemplars, Data Models, and Community of Practice.

Tools

Our previous OBM projects have been authored using short-term, unsustainable approaches such as spreadsheets and Python scripts, meaning that hefty engineering was required to build and tweak any experience.

We are currently building robust and integrated web software tools that will become the first steps in a platform for authoring and delivering object-based programmes. Initially we are developing:

StoryFormer - A rapid wireframing tool which lets you sketch your OBM experience and populate it with pre-production content. In this way, an interactive story can be pre-visualised, iterated, refined, and tested before the final media is captured.

StoryShooter - A distributed shooting tool that lets an operator 'drive' the shoot by tracking, tagging, and enriching your story, ensuring you capture all the assets required to make your experience. This will also manage an automated ingest process backed by an IP studio store, surfacing your rushes in precisely the correct place in your StoryFormer wireframe, ready for refined craft editing.

StoryPlayer - A dedicated player component, which will play back everything from your earliest pre-visualisations to your final published experience. This will begin to define requirements for the capabilities of embedded media players in the future.

Alongside recommendations for new workflows, these flexible tools will enable teams to envisage stories that are object-based from the outset, making adaptive narratives and experiences just as easy to make as linear programming. They also have the potential to enable new functionality for 'traditional' linear programmes, allowing production teams to describe their content at a high level of granularity. For example, different parts of a programme could be described by topic, speaker, or quiz-show round, so that player controls could navigate to parts of that show by content boundary rather than time interval.
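To make the content-boundary idea concrete, here is a minimal sketch of how a player control might skip by described segment rather than a fixed time interval. The `Segment` structure and all labels below are our illustrative assumptions, not the toolkit's actual data format:

```python
from dataclasses import dataclass

# Hypothetical segment metadata: each part of a linear programme is
# described by topic, speaker, or round rather than just a time range.
@dataclass
class Segment:
    label: str     # e.g. topic, speaker, or quiz-show round
    start: float   # seconds from the start of the programme
    end: float

segments = [
    Segment("Opening titles", 0.0, 30.0),
    Segment("Round 1: General knowledge", 30.0, 420.0),
    Segment("Round 2: Music", 420.0, 800.0),
]

def next_boundary(position: float) -> float:
    """Return the start time of the next segment after `position`,
    so a 'skip' control jumps by content boundary, not a fixed interval."""
    for seg in segments:
        if seg.start > position:
            return seg.start
    return segments[-1].end  # already in the final segment
```

A skip button at 100 seconds in would jump to the start of Round 2 at 420 seconds, regardless of how long each round happens to run.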

Exemplars

Our prototype experiments have demonstrated some of the opportunities and features possible when considering an object-based approach. Experiences can be dynamically responsive to audiences, driven by explicit or implicit interaction. They can vary in depth, length, and presentation format - video, audio, or text, for example. But we know this is only a tiny slice of the potential creative value offered by embracing OBM methods.

We will collaborate with internal BBC partners and indie production teams, who will begin to use our embryonic toolset to imagine, build, and deploy new types of OBM experience. We hope these experiences will be catalysts for identifying inventive new formats as well as short- and long-term evolutions of production methods, to a point where production teams can envisage those methods becoming a seamless part of their process. Supporting real productions will also test (and more than likely break) the tools, allowing us to iterate resilient workflows that deliver sustainable production.

Data Models

To enable truly scalable methods of making object-based programming, it is crucial to construct data models that semantically describe the relationships between objects in a programme.

All of our one-off prototypes used individual, handcrafted (and sometimes unexpressed) data models. This is problematic because they don't share a common descriptive language and consider only isolated, experience-specific concepts. To address this we are creating a core generalised data model to describe all the experiences we can imagine in the future, informed by insight from producers, ontology conventions, and our earlier work.

The data model expresses OBM narrative structure, presentation structure, and production workflows, and uses IP studio concepts to reference media in the abstract. It allows us to have tangible conversations about OBM, enables interoperability, and allows the wider community to provide new functionality by creating their own compatible tools, extending the OBM toolkit.
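To give a flavour of what such a model might express, here is a heavily simplified sketch of narrative elements that reference media abstractly and link to one another. The class names and fields are our illustrative assumptions, not the actual toolkit schema:

```python
from dataclasses import dataclass, field

# Illustrative only: a minimal graph of narrative elements. Each element
# references its media abstractly (by identifier, in the spirit of IP
# studio concepts) rather than holding the media itself, and its links
# name the elements that may follow it in the experience.
@dataclass
class NarrativeElement:
    element_id: str
    media_ref: str                              # abstract media reference
    links: list = field(default_factory=list)   # ids of possible next elements

@dataclass
class Experience:
    title: str
    start: str                                  # id of the first element
    elements: dict = field(default_factory=dict)

    def add(self, element: NarrativeElement) -> None:
        self.elements[element.element_id] = element

    def choices_after(self, element_id: str) -> list:
        """The elements a player (or an audience choice) may move to next."""
        return [self.elements[i] for i in self.elements[element_id].links]
```

Because the structure is explicit data rather than logic buried in a script, any compatible tool, an authoring interface, a shoot tracker, or a player, can walk the same graph.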

Community of Practice

Finally, we see a lack of sustainable tooling as a barrier to the adoption of OBM by any potential community. We want to transfer the creation of new experiences beyond the realm of R&D engineers and into the world of craftspeople: it's the community of practice that will explore the potential, provide us with real-world use cases, and give us feedback to help develop our tools and workflows.

We need to support a culture of experimentation around the craft and quality of OBM in order to expand our perspective on what this technology can provide, in what ways it is valuable, and how it could be developed. We hope to do this in three ways:

Awareness: talking to people and organisations already interested in or working on adaptive narratives through talks, workshops and conferences.

Advocacy: presenting our work and demonstrating best practice in our methods as we explore, connecting people through networks like the Storytellers United Slack channel, and helping to share perspectives and knowledge.

Access: letting people trial our emerging software tools and generate feedback on their development. We're also encouraging developer communities to try our open-source code, such as the VideoContext library.

Our first series of workshops with internal and external programme makers is under way at the moment; you can read about our latest ones in this blog post.