Author Archive

The concept of “GeoDesign” was one year old last week when Esri CEO and president Jack Dangermond kicked off the GeoDesign Summit held in Redlands, Calif. His question to the audience: How do you want to interact in the future to make things better?

He spoke about new modalities: where we once used CAD to generate maps, with GIS we can all look at and interact with the map simultaneously.

He said that GIS is going through “another massive shift with real time information, with distributed services and bringing things together dynamically, the whole lifecycle of design and processes is birthing here.” The new paradigm is about creating alternative futures, evaluating them quickly, and seeing their consequences.

Dangermond sees that as the world is becoming digital, GIS is becoming pervasive and in the future we will be able to measure “nearly everything that moves or changes.” On top of those measurements we will be able to sketch design alternatives.

Half of a designer's or engineer's time, he noted, is spent collecting data.

Michael Goodchild of the University of California spoke on GeoDesign accomplishments through 2010:

a. A research agenda for the area and its development.

b. A personal perspective.

c. A definition of the field, which now has a Wikipedia page.

New networks have been created such as the Geodesign Consortium spearheaded by Karen Hanna and the SDS Consortium by Naicong Li.

Online resources –

Participatory geodesign network – defining geodesign as it relates to public participation.

GIS and Science bibliography on Esri GIS & Science website

Selected readings –

Jack’s talk at TED 2010

GeoDesignWorld.org – Jason Lally and Drew Dara-Abrams

Literature – Regional and Urban GIS: A Decision Support Approach (Esri Press)

Goodchild said we need to close what many have perceived as a growing gap between GIS and design.

“Now more than ever we need a technology to distinguish between small-d and Big-D design,” said Goodchild. “Design consists of the formulation of an optimization problem with objectives and constraints, the collection of data, the execution of a search for the optimum solution, and its implementation.”

His definition of the two “d”s was as follows: in the simplistic small-d view, implementation is seen as inevitable; Big-D design sees the process complicated by disagreements among stakeholders.

Lightning Talks

The Lightning Talks presented at this event were 10 minutes long. A couple of the more enlightening ones are outlined below:

Chris Pyke of the U.S. Green Building Council said that “Green building is not about buildings. It is about this curve – a systematic movement devoted to changing the prevalence of practice – by creating best practices. The curve is not spatial, temporal or data driven. The USGBC put in place a collection of people and practices to move the curve.”

One manifestation of green building is buildings, said Pyke. At least 30,000 buildings are in the pipeline, which represent decisions made about water, stormwater, lighting, air space, space, etc.

Over the last decade, people have understood that we have a curve, and we try to move it by adopting best practices, while a building might last 50-200 years. The curve is made up of these decisions over time.

The next 15 years of green building practice is going to be:

Driven by evidence

Informed by place

Powered by information.

USGBC has created a portal to understand these spatial and temporal dimensions. The portal can expose “augmented reality” information about actual projects on the ground. It can capture real information on a real building, so that other projects can be measured against it and brought up to its standards. This technology can also be accessed through the mobile BGIG Analyst.

Nicholas de Monchaux, assistant professor of Architecture and Urban Design at UC Berkeley, talked about “creating a robust nervous system for the cities of today.” The digital tools of today allow us to contemplate this new paradigm.

Constance Bodurow of Lawrence Technological University's Studio [Ci], a design lab in the College of Architecture, presented the topic “Convergence of Intensity: How to Use Geodesign Tools to Shape a City.” She said they are urbanists interested in the future of urban form, and that they believe cities should be the most desirable place for human habitation.

A new urban geography and ecosystem are required that leverage the assets and complex combinations of social, economic and environmental factors.

Studio [Ci] integrates Esri software with Google SketchUp to generate unique outcomes. The Convergence of Intensity (CI) is a value-based approach that builds on value densification and recommends a new geography for the city. It proposes specific criteria for revitalizing the post-industrial city. “We create 3D extrusions, the city can see it better and have thousands of datasets,” said Bodurow.

Idea Labs

The afternoon was devoted to Idea Labs on special topics. The one I attended was entitled BIM/GIS Integration led by Stu Rich of PenBay Solutions, Ihab Hijazi, Danny Kahler and Fred Abler.

The discussion addressed an ongoing debate about Industry Foundation Classes (IFCs), an object-oriented file format for interoperability between CAD and, now, building information modeling (BIM) files. The group is working on a BIM-to-BIM interoperability platform, and wants to apply it to the BIM/GIS conversation.

Participants asked: What are the use cases, what are the problems we are going to solve, and what are we going to pull out of BIM to put into GIS, and vice versa?

The day wrapped up with a talk by Kimon Onuma, architect, evangelist for the integration of BIM and GIS, and president of Onuma, Inc., who has been using BIM since 1993. His clients include the GSA, U.S. Coast Guard and U.S. Army Corps of Engineers, to name a few.

Onuma remarked that the economic slump is the best thing that has happened to the industry – the people who didn't have time to look at BIM are now looking at it. On the downside, BIM models have become very heavy, and users cannot extract valuable information from them.

Onuma’s viewpoint is that technology should be simple: “If we don’t keep it simple, we can’t solve the problem,” he said. A solution should be like an online travel website where you book an airline flight: you ask a question, it gives you an answer.

Onuma has created the BIM Model Server, which brings together cloud computing, BIM and GIS, facilities management and other data in real time. It is fast and simple, and allows large numbers of people to access the information simultaneously.

He took the audience through the virtual design of a building in Hong Kong, where everyone in the room could click on a link on his site and begin adding design elements. This brainstorming way of designing and pulling in information is called a BIMStorm. What the audience did with Onuma in one hour is a quick example of what an organization generally does over a day or several days of working together on a real project.

He said the intersection of GIS and BIM is “where it explodes.” Multiple servers talk to each other, and with cloud computing you can create mashups. The building is in a city, the city is part of the world and that’s how it connects together.

GstarCAD announced the pre-release of GstarCAD 2011 for public evaluation.

Thermafiber, Inc. and ARCAT have developed Autodesk Revit BIM objects for Thermafiber’s mineral wool insulation products. These objects are available for free download on the ARCAT site and also accessible on Thermafiber’s website.

The annual Consumer Electronics Show (CES) 2011 in Las Vegas is expected to usher in “the year of the tablet,” as nearly every PC manufacturer will be unveiling an Android or Windows 7 tablet, with Android the apparent winner.

What will we see in terms of cost for infinite computing after it’s in place?

You have two things going on simultaneously: you have a steep curve in the declining price of computing – computing is the only asset that’s going down in price while everything else goes up. From the commercial perspective we’re shifting some of the costs from customers back to us. Generally the people providing this today, like Salesforce.com, are not as compute intensive.

We’re doing it affordably; you can now try AutoCAD LT running off the cloud.

Right now the spot price for cloud computing is at 3 cents an hour.
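To put that figure in perspective, a back-of-the-envelope sketch (assuming the quoted 3-cents-per-instance-hour rate; the job sizes below are hypothetical) shows why burst workloads become attractive:

```python
# Back-of-the-envelope cloud cost estimate.
# Assumes the quoted spot price of $0.03 per instance-hour;
# the job sizes below are hypothetical illustrations.
SPOT_PRICE_PER_HOUR = 0.03  # dollars per instance-hour

def burst_cost(instances: int, hours: float) -> float:
    """Cost of running `instances` machines for `hours` each."""
    return instances * hours * SPOT_PRICE_PER_HOUR

# A 100-machine burst for 2 hours costs the same as one machine
# for 200 hours (about $6 either way), but finishes far sooner.
print(burst_cost(100, 2))
print(burst_cost(1, 200))
```

At this price, scaling out wide for a short time costs no more than running one machine for a long time, which is the economic point behind "infinite computing."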

If I’ve got infinite computing available, when and where do I make the decision to use it?

We’re going to have a hybrid computing model. Because of the tablet, there is incredible local computing power and you don’t need to be connected. You’ll continue to have local devices, plus the cloud for compute-intensive jobs. We don’t build out our own cloud; for most of these we are trying to use commoditized resources. If you need an answer within a short period of time you pay more; there are some models like this. What if people are able to solve problems they were never able to solve before?

We think the cloud is a choice. Some customers no longer want the local choice, where they need power and resources; they want another choice of deployment. Choice is available to all customers. Pricing models are changing; mobile devices are putting pressure on the market. The way we can use infinite computing is by offering different models for those who only need this software two hours a month.

I’m not sure it puts any fundamental pressure on pricing in general; what pressure it does introduce is offset by greater capability. The price of fundamental resources goes down while capabilities go far up.

What kind of delivery models will you see?

You’ll see electronic software downloads rather than boxes, some people deploying through streaming, etc., and other services that purely exist in the cloud only. You’ll have a variety. We’re looking at our subscription program for people to get information on options.

What about Autodesk’s growth?

Our business without acquisitions is no better or worse than in other years; we had a 12-15% growth rate in 2010, and that can be changed by economic conditions and by acquisitions. We have factored in the idea of infinite computing, but at a low level.

Are you addressing multicore?

We have done a lot of multicore work on our products. It works only when you’re doing a lot of the same thing, like sorting a lot of data items. Our studies show that accounts for only about 15 percent of what engineers do. That’s why the breakthrough is making the cloud available: we can run a larger analysis process across more iterations.

We have done some work at the foundation level; there are some ways to do things in a multithreaded way. It’s a valuable technique, though not quite as valuable in general-purpose computing as you might think. We’re much more interested in what allows you to optimize an answer to a question.
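The "lots of the same thing" workload described above is classic data parallelism. A minimal sketch of that pattern follows; the `analyze` function here is a hypothetical stand-in for a real per-item engineering computation, not anything from Autodesk's products:

```python
# Minimal data-parallel sketch: the same independent computation
# applied to many items, which is the case where multicore helps.
from multiprocessing import Pool

def analyze(item: int) -> int:
    """Hypothetical stand-in for one independent unit of analysis."""
    return item * item

if __name__ == "__main__":
    items = range(10_000)
    # Pool() starts one worker process per CPU core by default,
    # and map() splits the items across those workers.
    with Pool() as pool:
        results = pool.map(analyze, items)
    print(sum(results))
```

The pattern only pays off when the per-item work is independent and uniform; workloads with sequential dependencies, the bulk of what engineers do according to the figures above, see little benefit.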

What about the consumer market?

Our customers are mostly professionals; the top 1 percent of accounts generate 30 percent of our revenue, and the other 70 percent of customers account for the rest. Historically we haven’t done much with consumers. SketchBook Pro has been downloaded by well over 2 million people and has done amazingly well; it’s phenomenal in terms of generating awareness. Selling SketchBook at $8.99 is not a way to build a profitable business, but it has done a great job of raising awareness and of helping us understand what people are looking for. There is a greater influence of the consumer market flowing back into the professional market.

We need to pay attention to the consumer market and see what is going on, such as the community that gets created around Flickr, that social community around professionals. I don’t think our business will change to become a consumer business, although we have more people coming in at the entry stage as new users and students, a feeder population, and we are getting people interested in design and math.

We need tools that everyone can take advantage of.

People are more interested in moving things to mobile devices. Open source marked the end of an era: commoditization. There is still open source software out there successfully deployed in server-based environments, but most of our software doesn’t fall into that category.

A full report of AU will come out on Monday in AECWeekly. In a nutshell, the event really focused on yet-unreleased technologies that are in Autodesk Labs, with the exception of “infinite computing.”

Infinite computing is another word for the cloud. Two products for the cloud shipped in September: Green Building Studio and AutoCAD WS. GBS is a natural fit – it has terabytes of data, and the cloud enables you to download that data as you need it. Other technologies that will benefit from this treatment are rendering and finite element analysis in the cloud, where users will not have to maintain expensive systems to house large amounts of data.

Autodesk’s new subscription program adds value, with new features such as infinite computing and web services added to products. Basically, users will be able to augment their desktop software with point functionality from the cloud. The model is changing in that it offers different services to different tiers of customers based on what they need. The platinum, or enterprise, tier, for example, will receive rapid-response technical support as well as support for older versions of software. The consulting team will also map their processes to see how to better serve them.