digital urban – http://www.digitalurban.org
Smart Cities, Data, Modelling and Visualisation from The Bartlett Centre for Advanced Spatial Analysis, University College London
Sun, 02 Nov 2014 12:35:27 +0000

Particles – 3dsMax and Lumion/Unity
http://www.digitalurban.org/2014/09/particles-3dmax-and-lumionunity.html
Mon, 01 Sep 2014 10:43:24 +0000

Particle Flow is a versatile, powerful particle system for Autodesk's 3ds Max. It employs an event-driven model, using a special dialog called Particle View, which allows you to combine individual operators describing particle properties such as shape, speed, direction and rotation over time into groups called events. Each operator provides a set of parameters, many of which you can animate to change particle behaviour during the event. As the event transpires, Particle Flow continually evaluates each operator in the list and updates the particle system accordingly.

pFlow 3ds Max

To achieve more substantial changes in particle properties and behaviour, you can create a flow. The flow sends particles from event to event using tests, which let you wire events together in series. A test can check, for example, whether a particle has passed a certain age, how fast it’s moving, or whether it has collided with a deflector. Particles that pass the test move on to the next event, while those that don’t meet the test criteria remain in the current event, possibly to undergo other tests. The simple example pictured above details a pFlow dialogue determining the birth of particles linked to a target geometry. The particles can subsequently be baked (using pFlow Baker) into an animation timeline for simple output via .fbx, allowing import into external systems such as Unity or Lumion.
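
The event/test flow described above can be sketched in miniature. The Python below is purely illustrative of the event-driven model, not the 3ds Max API (pFlow is configured through the Particle View dialog); the operator, the two-second age threshold and the event names are invented for the example:

```python
# Illustrative sketch of Particle Flow's event/test model in plain Python.
# This is NOT the 3ds Max API; operator, threshold and names are invented.

class Particle:
    def __init__(self):
        self.age = 0.0      # seconds since birth
        self.speed = 1.0

class Event:
    """A group of operators plus an optional test wiring it to the next event."""
    def __init__(self, name, operators=(), test=None, next_event=None):
        self.name = name
        self.operators = list(operators)  # functions that mutate a particle
        self.test = test                  # predicate: particle -> bool
        self.next_event = next_event
        self.particles = []

    def step(self, dt):
        # Apply every operator to every particle, then run the test:
        # particles that pass move on, the rest stay in this event.
        for p in self.particles:
            p.age += dt
            for op in self.operators:
                op(p)
        if self.test and self.next_event is not None:
            passed = [p for p in self.particles if self.test(p)]
            for p in passed:
                self.particles.remove(p)
                self.next_event.particles.append(p)

# Wire two events in series: particles older than 2 seconds move on.
fade = Event("fade")
birth = Event("birth",
              operators=[lambda p: setattr(p, "speed", p.speed * 0.9)],
              test=lambda p: p.age > 2.0,
              next_event=fade)
birth.particles = [Particle() for _ in range(5)]
for _ in range(3):              # three one-second steps
    birth.step(1.0)
print(len(birth.particles), len(fade.particles))   # → 0 5
```

After three one-second steps every particle's age exceeds the threshold, so all five have migrated from "birth" to "fade", mirroring how pFlow's tests hand particles down a chain of events.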

The clip above illustrates the pFlow system imported into Lumion with the addition of a scene created in CityEngine.

The Making of the CASA Oculus Rift Urban Roller Coaster
http://www.digitalurban.org/2014/08/the-making-of-the-casa-oculus-rift-urban-roller-coaster.html
Tue, 26 Aug 2014 14:48:54 +0000

Earlier this year CASA was invited to create a virtual reality exhibit for the Walking on Water exhibition, partnered with Grand Designs Live at London's ExCeL. While CASA has a tendency to spend a lot of time thinking seriously about cities and data, it was quickly decided that a fun and novel way to engage the 100,000 or so expected visitors would be an urban roller coaster ride using the Oculus Rift virtual reality headset. Oliver Dawkins, a student on our MRes in Advanced Spatial Analysis, is a leading light in urban visualisation and the Oculus Rift, and he kindly offered to lead the development. In the following guest post, Oliver (of http://virtualarchitectures.wordpress.com/) talks us through the development process…

The first tool chosen for this project was the Unity game engine because it provides a very simple means of integrating the Oculus Rift virtual reality headset into a real-time 3D experience. Initial tests were made in Unity with a pre-made roller coaster model downloaded from the Unity Asset Store. However, rather than simply place that roller coaster in an urban setting I wanted to create a track that would be unique to this experience and feel like it might have been part of the urban infrastructure. Due to time constraints it was not possible to model the urban scene from scratch. Instead I decided to generate it procedurally in Autodesk 3ds MAX using a great free script called ghostTown Lite.

Although I like to use SketchUp for 3D modelling wherever possible, 3ds MAX was much better suited to this project as it allowed me to quickly generate the city scene, model the roller coaster track and animate the path of the ride, all in one software package. After generating the urban scene I used the car from the Asset Store roller coaster as a guide for modelling my track in the correct proportions.

The path of the ride through the city was modelled using Bezier splines, first in the Top view to get the rough layout and then in the Front and Left views to ensure the path would clear the buildings in my scene. The experience needed to be comfortable for users who may not have experienced virtual reality before, so it was agreed to exclude loop-the-loops on this occasion. It was also important to avoid bends that would be too sharp for a roller coaster to realistically follow. Once I was happy with the path I welded all the vertices in my splines so that it could later be used to animate the movement of the roller coaster car along the track.

Next, sections of track were added to the path I'd created using the 3ds MAX PathDeform (WSM) modifier. As the name suggests, this modifier deforms selected geometry to follow a chosen path. Using it massively simplified the process by allowing my pre-made sections of track to be offset along the length of the path and then stretched, rotated and twisted to fit together as seamlessly as possible. This was the most intricate and time-consuming part of the project.

In order to minimise the potential for motion sickness with the Oculus Rift I was careful to keep the rotation of the track as close to the horizontal plane as possible. Supporting struts were then arrayed along the path of the track and positioned to anchor it to the rest of the scene. When I was satisfied, a 'Snapshot' was made of the geometry in 3ds MAX to create a single mesh ready for export to Unity. At this point the path-deformed sections of track could be deleted, as Unity does not recognise the modifier.

To create the movement of the roller coaster car along the track, a 3ds MAX dummy helper was constrained to the path I'd created earlier. This generated starting and ending key frames on the animation timeline. The roller coaster car model was then placed on the track and linked to the dummy helper. It is possible in 3ds Max to have the velocity and banking of the dummy calculated automatically, but I found that this did not give a realistic feel. Instead I controlled both by editing the animation key frames, using a camera linked to the dummy for reference. This was time intensive but gave a better result. The city scene, roller coaster and animation were exported as a single FBX, which is the preferred import format for 3D geometry in Unity.

Having completed the track and animated the car, it was time to assemble the final scene in Unity. First I generated a terrain using a great plugin called World Composer. This enables you to import satellite imagery and terrain heights from Bing Maps to give your backdrops a high degree of realism.

The urban scene and roller coaster were then imported and a skybox and directional light were added. The scene was completed with various assets from the Unity Asset Store including skyscrapers, roof objects, vehicles, idling characters and a flock of birds.

To prepare the Oculus Rift integration, the OVR camera controller asset from Oculus was placed inside and parented to the roller coaster car. In my initial tests with the Asset Store roller coaster I'd found that the OVR camera would drift from the forward-facing position. This would disorientate the user and contribute to motion sickness. As a quick fix I parented a cube to the front of the roller coaster car, turned off rendering of the cube so it would be invisible, and set the camera controller to follow the cube.

In order to ensure the best possible virtual experience it is really important to keep the rendered frames per second as high as possible. As the Oculus Rift renders two cameras simultaneously, one for each eye, you need to aim for 60 fps in Unity to ensure the user can expect a frame rate of 30 fps.

To achieve this I took advantage of occlusion culling in Unity Pro, which prevents objects from being rendered when they are outside the camera's field of view or obscured by other objects.

I also baked the shadows for all static objects in the scene, saving them as textures so the processor does not have to calculate them dynamically. The only objects casting dynamic shadows are the roller coaster car and the animated characters.

Finally, two simple JavaScript scripts were added. The first starts the roller coaster and plays a roller coaster sound file when the 'S' key is pressed; the second closes the application when the 'Esc' key is pressed.

The reception of the CASA Urban Roller Coaster ride at Grand Designs Live was fantastic and I'm really pleased to have taken part. It was a great project to work on and an excellent opportunity to learn new techniques in 3ds MAX and Unity. With my first VR roller coaster under my belt, I'm looking forward to building a truly terrifying one when I get the time, hopefully for the Oculus Rift DK2, which has just arrived at CASA.

On a last note I'd like to thank Tom Hoffman of Lake Erie Digital, whose excellent YouTube tutorials on creating roller coasters in 3ds MAX provided a great guide through the most difficult part of this challenging project.

Linking a 1940s Radio to the Internet of Things
http://www.digitalurban.org/2014/03/linking-a-1940s-radio-to-the-internet-of-things.html
Sun, 02 Mar 2014 21:58:43 +0000

In the corner of our apartment we have an old 1940s radio. When we picked it up a few years ago the original valves had already been removed and it had been modified with a then-current transistor radio. As such it made the perfect project to re-modify and bring up to date via a mix of an embedded Bluetooth speaker (in our case a Bose SoundLink) and a Philips Hue for the internal lighting.

Radio linked to Philips Hue

Using our current favourite Internet of Things service, If This Then That, the front light in the radio can be linked to any number of data feeds (see our post on IFTTT, Netatmo & Philips Hue: Linking Data to Lighting); at the moment it changes colour according to the outside temperature. The movie below shows the link to the Philips Hue and the iPhone BBC Radio app (ignore the cat, which decided to take part in every example I filmed):

While quite a basic modification, it gives an old radio case a new lease of life. The link to the Philips Hue for the internal lighting opens up a number of possibilities, along with the option to link the audio output to any number of rules via IFTTT.

Introducing the MSc and MRes Smart Cities at the Bartlett Centre for Advanced Spatial Analysis
http://www.digitalurban.org/2014/02/introducing-the-msc-and-mres-smart-cities-at-the-bartlett-centre-for-advanced-spatial-analysis.html
Wed, 05 Feb 2014 12:11:08 +0000

Learn the new science of cities at University College London with the MSc in Smart Cities at The Bartlett's Centre for Advanced Spatial Analysis from September 2014.

APPLY NOW FOR SEPTEMBER ENTRY

As Course Director, I am pleased to announce the new MSc and MRes in Smart Cities, here at The Bartlett Centre for Advanced Spatial Analysis. Both the MSc and MRes offer an innovative and exciting opportunity to study at UCL, with key training for a new career in the emerging Smart Cities market. Smart cities research focuses on how computers, data and analytics (models and predictions) are being embedded into cities. Cities are currently being extensively wired, generating new kinds of control and new kinds of services and producing massive data streams: 'big data'. To this end, we need powerful analytics to make sense of this new world, and a new skill set to understand and lead this new field.

CASA

The Bartlett Centre for Advanced Spatial Analysis (CASA) is one of the leading forces in the science of cities, generating new knowledge and insights for use in city planning, policy and design and drawing on the latest geospatial methods and ideas in computer-based visualisation and modelling.

CASA's focus is to be at the forefront of one of the grand challenges of 21st-century science: to build a science of cities from a multidisciplinary base, drawing on cutting-edge methods and ideas in modelling, complexity, visualisation and computation. Our current mix of architects, geographers, mathematicians, physicists, urban planners and computer scientists makes CASA a unique department within UCL.

Our vision is to be central to this new science, the science of smart cities, and relate it to city planning, policy and architecture in its widest sense.

London – A Living Lab

EDUCATIONAL AIMS OF THE PROGRAMME

The MSc Smart Cities and Urban Analytics comprises 180 credits, which can be taken full-time over 12 months or on a flexible modular basis of up to 5 years' duration (full details of the MRes structure can be found here). If taken full-time over one year, the following structure applies:

TERM ONE

Smart Systems Theory

The module provides a comprehensive introduction to a theory and science of cities. Many different perspectives developed by urban researchers, systems theorists, complexity theorists, urban planners, geographers and transport engineers will be considered, such as spatial interaction and transport models, urban economic theories, scaling laws and central place theory for systems of cities, growth, migration and more. The course will also focus on physical planning and urban policy analysis as developed in western countries during the last 100 years.

The class runs during term one, for two hours per week.
Assessment is by coursework (2,500 – 3,000 words).

Quantitative Methods

This module will equip you with the essential mathematical techniques needed to describe many aspects of a city quantitatively. You will learn various methodologies, from traditional statistical techniques to more novel approaches such as complex networks. These techniques will focus on different scales and hierarchies, from the micro-level, e.g. individual interactions, to the macro-level, e.g. regional properties, taking into account both discrete and continuous variables, and using probabilistic and deterministic approaches. All these tools will be developed within the context of real-world applications.

The class runs during term one, for two hours per week.
Assessment is by a mix of presentations and coursework.

Geographic Information Systems and Science

GI Systems and Science aims to equip students with an understanding of the principles underlying the conception, representation/measurement and analysis of spatial phenomena. It presents an overview of the core organising concepts and techniques of Geographic Information Systems, and the software and analysis systems that are integral to their effective deployment in advanced spatial analysis.

The practical sessions in the course will introduce students to both traditional and emerging technologies in geographical information science through the use of desktop GIS software such as ArcGIS and Quantum GIS, and the powerful statistical software environment R. In developing technical expertise in these software tools, students will be introduced to real-world geographical analysis problems and, by the end of the course, will be able to identify, evaluate and process geographic data from a variety of sources, analyse these data and present the results using different cartographic techniques.

The class runs during term one, for three hours per week (a one-hour lecture followed by a two-hour practical).

Assessment is by coursework (2,500 – 3,000 words) and an exam.

There is also an optional module, selected from any other relevant 15-credit M-level module at UCL.

Urban Data and Simulation

TERM TWO

Smart Cities: Context, Policy and Governance

This module provides a perspective on smart cities from the viewpoint of technology. It will provide context for the development of smart cities through a history of computing, networks and communications, and of applications of smart technologies, ranging from science parks and technopoles to transport based on ICT. The course will cover a wide range of approaches, from concepts of the Universal Machine, to wired cities and sensing techniques, spatio-temporal real-time data applications, smart energy, virtual reality and social media in the smart city, to name a few.

This class runs during term two, for one and a half hours per week.
Assessment is by coursework (2,500 – 3,000 words).

Spatial Data Theory, Storage and Analysis

This module introduces you to the tools needed to manipulate large datasets derived from Smart Cities data, from sensing through storage to analysis. You will be able to capture and build data structures and perform SQL and basic queries in order to extract key metrics. In addition, you will learn how to use external software tools, such as R and Python, in order to visualise and analyse the data. These database and statistical tools will be complemented by artificial intelligence and pattern detection techniques, in addition to new technologies for big data.

The class runs during term two, for two hours per week.
Assessment is by project output (5,000 – 6,000 words).

Urban Simulation

The module provides the key skills required to construct and apply models in order to simulate urban systems. These are key to the development of smart cities technologies. You will learn different approaches, such as land-use transport interaction models, cellular automata and agent-based modelling, and see how these are fashioned into tools applicable in planning support systems, and how they are linked to big data and integrated data systems. These models will be considered at different time scales, from short-term modelling, e.g. diurnal patterns in cities, to long-term models for exploring change through strategic planning.

The module runs during term two, for two hours per week.
Assessment is by coursework (2,500 – 3,000 words).

The Hidden Data City

TERM THREE

Dissertation

The dissertation marks the culmination of your studies and gives you the opportunity to produce an original piece of research making use of the knowledge gathered in the lectures. You will be guided throughout by your supervisor, with the support of the Course Director, and together you will decide the subject of research. This will enable you to create a unique, individual piece of work with an emphasis on data collection, analysis and visualisation linked to policy and social-science-oriented applications.

Assessment is by a 10,000-12,000 word dissertation.

STAFF

The teaching staff are world leaders in the field, from Professor Mike Batty MBE through to Sir Alan Wilson; you can find full details and staff profiles at our sub-site http://mscsmartcities.org/

ENTRY REQUIREMENTS

Ideally, you will already have a Bachelor's degree in an appropriate subject such as Geography, GIS, Urban Planning, Architecture, Computer Science, Civil Engineering, Economics or a field related to the Built Environment, though other subjects will be considered, especially if your personal statement demonstrates a keen interest and convinces us you should be given a place. You'll need to have obtained a 2.2 (or international equivalent) to join the MSc.

If you do not have a Bachelor's degree, we can still consider you if you have a professional qualification and at least three years' relevant experience, so don't be put off applying if you fall into this category: the greater the mix of students on each course, the more interesting the seminars and discussions will be.

IFTTT, Netatmo & Philips Hue: Linking Data to Lighting
http://www.digitalurban.org/2014/02/iftt-netatmo-philips-hue-linking-data-to-lighting.html
Sun, 02 Feb 2014 11:34:57 +0000

Here at The Bartlett Centre for Advanced Spatial Analysis we run a simple dashboard view of the weather in London. The background of the dashboard changes colour across a variety of Pantone shades according to temperature. Via our CEDE project we are starting to experiment with the Philips Hue wifi lighting system. With the ability to connect the bulbs to a network and select colours according to a hex value, the logical step was to link the bulbs to the weather data, allowing the lighting to change according to the external temperature and in sync with the web dashboard.

Philips Hue

Once the bulbs are connected to the network there are a variety of third-party services that can be used to control them, either as a group or individually. One of our current favourites here at CASA is If This Then That (IFTTT), a service that allows the creation of simple 'recipes' linking popular online systems via a series of rules. Among the services supported by IFTTT are the Netatmo Weather Station and the Philips Hue.

Netatmo Weather Station

We are using the Netatmo in CASA as part of our forthcoming office data dashboard; it monitors internal temperature, humidity, sound levels and carbon dioxide, with an external unit providing temperature and humidity readings.

Colour Range

Our Colours Weather Dashboard changes its background every 5 degrees centigrade; via IFTTT it is therefore possible to create a series of recipes using the same colour range, linked to the temperature reading from the external Netatmo unit.
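
The banding logic behind the dashboard can be sketched as a simple lookup: find the 5-degree band a temperature falls in and return that band's colour. The hex values below are placeholders for illustration, not the dashboard's actual Pantone palette:

```python
# Sketch of a 5-degree-band colour lookup. The hex colours are
# placeholders, not the real Colours Weather Dashboard palette.

BANDS = [          # (lower bound in degrees C, hex colour)
    (-10, "#2c3e50"),
    (-5,  "#2980b9"),
    (0,   "#16a085"),
    (5,   "#27ae60"),
    (10,  "#f1c40f"),
    (15,  "#e67e22"),
    (20,  "#d35400"),
    (25,  "#c0392b"),
]

def colour_for(temp_c):
    """Return the hex colour for the 5-degree band containing temp_c."""
    colour = BANDS[0][1]                  # clamp below the lowest band
    for lower, hex_colour in BANDS:
        if temp_c >= lower:
            colour = hex_colour
    return colour

print(colour_for(12.3))   # → #f1c40f (the 10-15 degree band)
```

Each IFTTT recipe then simply reproduces one row of this table: "if the Netatmo temperature crosses this band, set the Hue to this colour".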

If This Then That Recipes

IFTTT is an emerging service and as such has a number of limitations. We had to create a recipe for each temperature change, rather than combining them into one logical statement. IFTTT also only checks the readings from external services every 15 minutes, making it unsuitable for rapidly changing data. Finally, there seem to be a few bugs: the hex colour does not map directly to the Hue light, so a few tweaks are required to obtain the correct colour output.
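
One workaround for the colour mismatch is to do the conversion yourself before talking to the bridge. The sketch below is our own approach rather than anything IFTTT does internally: it converts a hex colour into the hue/sat/bri values the Philips Hue API expects, via HSV:

```python
import colorsys

def hex_to_hue_state(hex_colour):
    """Convert a CSS-style hex colour into a Philips Hue state dict.

    The Hue API expresses colour as hue (0-65535), sat (0-254) and
    bri (1-254), so a hex value has to be converted via HSV rather
    than used as a direct setting.
    """
    hex_colour = hex_colour.lstrip("#")
    r, g, b = (int(hex_colour[i:i + 2], 16) / 255.0 for i in (0, 2, 4))
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return {
        "hue": int(h * 65535),
        "sat": int(s * 254),
        "bri": max(1, int(v * 254)),
        "on": True,
    }

# Pure red sits at the start of the hue wheel, fully saturated.
print(hex_to_hue_state("#ff0000"))
```

The resulting dict can then be sent to the bridge with an HTTP PUT to /api/<username>/lights/<id>/state, where the username and light id are whatever your own bridge has issued.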

Ships Lamp Coloured by Temperature and IFTTT

Despite a few limitations it does open up a number of possibilities, the next steps are to look into how to link in other data feeds. This is arguably where the power of IFTTT comes into play, its ability to link a number of services and then control external hardware makes it a perfect opening into linking data to the Internet of Things.

The whole system could arguably be built using an Arduino board or Raspberry Pi linked to LEDs at a much lower cost. The linking of consumer-grade units to data is, however, perhaps a step towards smart objects for everyone. The whole setup took under two hours to get up and running, including setting up the IFTTT recipes and linking to the Philips Hue. Edit (20/2/2014): the light is now installed inside an old ships lamp, with the colour changing according to the outside temperature.

It is probably a niche market, but the linking of data to lighting is an intriguing development. From making the lights blink when carbon dioxide levels in the office rise, through to changing colours according to a Twitter or RSS feed, we are excited by the possibilities.

The web-based weather dashboard can be viewed at http://www.casa.ucl.ac.uk/weather/colours.html; the data updates every 3 seconds. We will be mounting the 'weather bulb' in a small globe and placing it in the corner of the office. The Philips Hue starter pack comes with three bulbs, and 50 lights can be linked to each bridge. With the ability to link each bulb, or a group of bulbs, to almost any data feed, the office is about to become a colourful place…

Happy Christmas and a Happy New Year (Movie)
http://www.digitalurban.org/2013/12/happy-christmas-and-a-happy-new-year-movie.html
Thu, 19 Dec 2013 10:30:36 +0000

2013 has been quite a year for research – the Smart/Future Cities discussion has moved forward at a notable pace, and new setups such as the Future Cities Catapult and the Smart London Report from the GLA are starting to drive uptake. It has been a year of research around cities, with multiple new grants, awards and PhD funding at The Bartlett Centre for Advanced Spatial Analysis (home of digitalurban).

2014 brings with it teaching of the Oculus Rift virtual reality headset on our Masters courses, new research into the emotions of cities and a continuation of everything cities at CASA. From all @CASA and @digitalurban, have a wonderful Christmas and a Happy New Year…

Screens in the Wild
http://www.digitalurban.org/2013/12/screens-in-the-wild.html
Thu, 19 Dec 2013 09:53:04 +0000

Screens in the Wild is a collaborative project initiated by researchers from the Space Group at University College London and the Mixed Reality Lab at the University of Nottingham. It investigates how media screens located in urban space can be designed to benefit public life, rather than merely transmit commercial content.

Screens in the Wild

The project is building a series of architectural interfaces in East London and Nottingham neighbourhoods which use broadcast media and interactive technologies to enhance real-world connections, add value to the daily experience of the urban environment, and foster community participation in and ownership of urban space. In this project we are seeking to involve local organisations and residents as key partners in the creative development process.

It is one of our favourite projects among the current Research in the Wild themes. The Screens in the Wild team includes makers, architects, HCI designers, computer scientists, anthropologists, developers, artists and curators.

The One Show – Pigeon Sim
http://www.digitalurban.org/2013/11/the-one-show-pigeon-sim.html
Wed, 27 Nov 2013 11:08:28 +0000

Pigeon Sim is a Kinect-powered, Google Earth-linked system for flying around live data. Pulling in feeds from citydashboard.org, it was developed originally as part of an EPSRC-funded exhibition project (ANALOGIES) in April 2012 by CASA researcher George MacKerron (now Lecturer at the University of Sussex). It is now part of the ESRC-funded TALISMAN project at CASA and Leeds, where it is being further developed to link into the DISTANCE Internet of Schools project funded by the TSB.

As part of a wider segment on the BBC's One Show, with a 4-minute documentary about pigeons by Mike Dilger and Adam Rogers, the Pigeon Sim was wired up for a fly-through by comedian Jack Dee. The clip below shows the system in action and our attempt at making sense of the whole thing on live TV:

The One Show was hosted by Chris Evans and Alex Jones and attracts audience ratings in excess of 4 million. The joys of live TV and technology should not be underestimated: it worked, but with a full studio, getting the Kinect to recognise a comedian in a pigeon outfit is challenging to say the least…

Future Cities Finance Initiative Announced in Partnership with Level39
http://www.digitalurban.org/2013/10/future-cities-finance-initiative-announced-in-partnership-with-level39.html
Thu, 31 Oct 2013 14:33:58 +0000

Cities are clearly the topic of the moment, be it Smart/Future/Sustainable/Computable; the concept is moving towards a new understanding of our urban world and, with it, an opening up of new social and economic opportunities. There has never been a better time to look into the research and commercial opportunities around cities and data (as our forthcoming MSc in Smart Cities highlights – more on that in a future post). As such, the new initiative linking Level39 and the Future Cities Catapult is one of real potential. The full press release went out in July, so this post is a little late to the table, but it is the concept of linking cities research/innovation with the financing of start-ups that we feel is notably timely. The press release with details is below:

The UK’s Future Cities movement is set to receive a major boost, thanks to a new initiative by the Future Cities Catapult and Level39, Europe’s largest accelerator space for financial, retail and future city technologies, based at Canary Wharf.

Launched in March this year, the Future Cities Catapult is an independent innovation centre set up by the Technology Strategy Board (TSB). It aims to help UK businesses develop cutting-edge, high value urban solutions and then sell them to the world. The Catapult will unite business, city governments, innovators and academia to run a series of large-scale pilot projects that address how cities can become more economically active, while lowering their environmental footprint and moving towards a low-carbon economy.

The Future Cities Catapult is prioritising the ‘Future Cities Financing’ aspect of its work programme to address the need for large-scale investment in the sector. Growing cities represent a great opportunity to develop innovative technologies that have the potential to provide a major boost to the UK economy. More than £6.5tn will be invested globally in city infrastructure over the next 10 to 15 years. Of this, the accessible market for companies developing future city systems is estimated to be worth £200bn a year by 2030. Critical to the flourishing of future cities is greater access to finance for integrated city systems (not simply the traditional model of investment in specific projects in areas such as transport, energy and health).

The Future Cities Catapult will focus on developing and leveraging finance for cohesive city systems as a key area of future activity. This work has begun by tasking Level39 with bringing together experts from its extensive finance and investor networks to raise awareness of investment opportunities and to draw on their knowledge in debating the merits of current and potential new funding models.

Peter Madden, Chief Executive, Future Cities Catapult, said: “In the future, the majority of people will live in cities. There are huge benefits to people and opportunities for business in making cities better places to live, but at present we lack the financial mechanisms to unlock these opportunities.

“Future Cities Financing will explore the challenges, opportunities and risk profiles for different kinds of investment. By engaging with the right financial stakeholders, we can also amplify the message that investment in smart cities is a new and highly valuable asset class.”

Eric Van Der Kleij, Head of Level39 at Canary Wharf Group plc, says:

“We are pleased to be supporting the Future Cities Catapult by creating the right blend of industry experts to debate the crucial issue of future cities financing. The built environment at Canary Wharf is one of the most technologically advanced in London, and we will continue this journey at Level39 by facilitating new innovation in future cities technology. Our location alongside the capital’s leading financial district, and our extensive contacts, which include finance specialists, fund managers and investors, will enable us to help unlock vital sources of funding.”

The clip below provides a good insight into Level39; having been there a couple of times, we can safely say it’s an exciting place:

Finally, the movie below highlights what the Future Cities Catapult is:

CASA Barbican Cinema One Trailer and Live Stream: Future Cities and Digital Technologies
Wed, 25 Sep 2013

Cities have always been places where new technologies are invented, but as more and more of the world’s population lives in cities, it is ever more urgent to consider their future. Cities are also being reinvented using new digital technologies, and this one-day conference will explore how CASA at University College London is exploring the future city through big data, smart technologies, and new ideas about simulation and prediction. The conference will showcase a range of work on digital futures based on using computers to measure, model and predict the future state of our towns and cities, drawing on many examples from London and beyond.

A free day of everything related to future cities and the built environment at the premier 300-seat Barbican Cinema One:

Held at the 300-seat Barbican Cinema One in London, the conference is open to all. It will deal with new ways of visualising cities in 3D using virtual realities; it will show how we are building an internet of things around which the city is being reinvented; and it will explore how the prosperity of cities relates to their scale and size. A particular feature of the day will be a focus on new ways of exploring movement patterns in cities, using data sets from smart cards, from open data sources and from new methods for crowdsourcing not only data but also ideas for future cities.