I thought I’d provide an update on Data Driven Design and Construction, to be published by John Wiley & Sons, and hope this answers your Frequently Asked Questions (FAQ).

When will the book finally be published?

For the publication of Data Driven Design and Construction: 25 Strategies for Capturing, Analyzing and Applying Building Data, we’re looking at Fall 2015.

The author again?

The book was written by Randy Deutsch AIA, LEED AP.

The number of data driven strategies that appear in the book?

As suggested by the subtitle, the book contains 25 Strategies for Capturing, Analyzing and Applying Building Data.

Who wrote the book’s Foreword?

James Timberlake FAIA, Partner at KieranTimberlake wrote a truly remarkable, inspiring Foreword for the book. We couldn’t be happier with it.

Tell me the gist of the book in a sentence?

That’s not really a question. This is a FAQ. While recognizing the value of BIM, most individuals and firms use BIM tools today as a document creation tool, when instead design and construction professionals need to recognize BIM’s real value as a database, and start treating it like one. (That, and about 1267 other things worth knowing.)

How many trees will die?

None if you read it in iBooks or on your Kindle. The book manuscript is 798 pages long. How many total pages the actual book will contain is up to Wiley – and what both the aided and unaided eye can see in the way of readable font size.

You’re kidding me. 798 pages?

The book contains 141,125 words (not including the index).

How much will that set me back?

No pricing information at this time. As Sherlock Holmes said, “It is a capital mistake to theorize before one has data.”

How is the book organized?

The book is written in three parts.

PART I: Why Data, Why Now?

PART II: Capturing, Analyzing and Applying Building Data

PART III: What Data Means for You, Your Firm, Profession and Industry

And nine chapters plus an epilogue, so 10.

Chapter 1 The Data Turn

Chapter 2 A Data-Driven Design Approach for Buildings

Chapter 3 Learning from Data

Chapter 4 Capturing and Mining Project Data

Chapter 5 Analyzing Data

Chapter 6 Applying Data

Chapter 7 Data in Construction and Operations

Chapter 8 Data for Building Owners and End Users

Chapter 9 Building a Case for Leveraging Data

Epilogue The Future of Data in AEC

How many pieces of software are mentioned in the book?

The book strives to be vendor agnostic. That said, 69 separate software and apps are discussed.

Do you mention Meerkat?

Yes.

Great!!

Thanks.

How about Grasshopper plugins Ladybug and Honeybee?

You’ll just have to wait to find out.

Will I know any of the people or firms mentioned in the book?

A list of the experts, innovators and thought leaders interviewed for the book (in alphabetical order) and their firms, institutions and universities:

Thanks. But that’s not a question. The image featured on the book cover is the Hangzhou stadium by NBBJ; image by Andrew Heumann, Leader of NBBJ’s Design Computation team (with a shoutout to Nathan Miller).

…

More information on the book’s availability will follow as soon as we have it. Leave a question or comment below or contact the author at rdeutsch@illinois.edu.

Flux – a 25-person, two-year-old company, and the first (and so far only) startup to spin off of Google[x], the semi-secret research moonshot lab and incubator at Google dedicated to projects such as the driverless car and Google Glass – has set out to automate the AEC industry.

It’s about time we take notice – and sides.

Google[x] is Google’s main initiative to diversify its sources of income.

Thus, with the global construction market estimated at $5 trillion a year, their foray into our turf.

The Google X engineers initially called the invention Genie (after the genie in “Aladdin” from 1001 Nights). Genie, the development team told Google’s management, was a platform of online planning applications to help architects and engineers in the design process, especially for skyscrapers and large buildings. The platform includes the planning tools of expert architects and engineers, plus advanced analytics and simulation tools. Genie standardizes and automates the design and construction process while offering unlimited design options, enabling an architect to preserve a building’s uniqueness in its urban environment.

In the report, the Google X team estimated that Genie could save 30-50% in prevailing construction costs and shorten the time from the start of planning to market by 30-60%. The Genie team estimated that the platform had the potential of generating $120 billion a year for Google, and so Flux was born.

In short: we’re going to add 3.3 billion people to the world’s urban population in the next 35 years – nearly doubling it – which, depending on the size of the buildings, will require between 6.6 million and 33 million new apartment buildings by 2050 to house them all.

And so the need to see buildings not as one-offs, built from scratch, but from seeds.

Buildings as Mother Nature would want them to be

From a talk, Using Data to Improve the Built Environment, that Jen Carlile, Co-Founder of Flux, gave at KeenCon in October 2014:

Today we build individual buildings as though Mother Nature built each one from scratch, rather than from seeds.

Flux asks: What if we were to build buildings from seeds? Seeds that took on different forms and characteristics depending upon where they were planted?

The thinking goes, if we designed this way, we could leverage data and design and build buildings by the thousands in the time it currently takes to design one.

Tool #1: Coded within the building app are all the rules that the building needs to grow or auto-generate: the structural system, HVAC, façade, etc. It knows, for example, that it needs external sunscreens on the west elevation to reduce late afternoon heat gain. These rules are all encoded into the building seed. (See the video within the video that starts at 8:30.)

They use the analogy of the Monterey Cypress tree, which takes on a different shape based on where it is planted, the prevailing winds and conditions of its location and site.

In the same way that planting three separate Monterey Cypress seeds in three separate locations gives you three different trees, placing three separate building “seeds” on three separate sites gives you three different buildings.

In other words, the building takes on different forms based on the different sites it is placed on.

The software “designs” all of the bathrooms, fire stairs, ducts. Because all of the rules are encoded within the building seed, you can make changes to the building. When you do that, the building regrows.
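The “seed” idea described above can be sketched in a few lines of code. This is a hypothetical illustration only – the class names, rules and thresholds below are invented, not Flux’s actual implementation: a seed bundles its generation rules, and running the same seed against different site parameters “regrows” a different building.

```python
from dataclasses import dataclass

@dataclass
class Site:
    """Site conditions the seed responds to (illustrative only)."""
    latitude: float
    prevailing_wind: str   # e.g. "NW"
    max_height_m: float    # zoning height limit

@dataclass
class BuildingSeed:
    """Encodes the rules a building needs to 'grow' itself."""
    floor_height_m: float = 4.0

    def grow(self, site: Site) -> dict:
        # Rule: fit as many floors as the height limit allows.
        floors = int(site.max_height_m // self.floor_height_m)
        # Rule: west elevations in sunnier latitudes get external
        # sunscreens to reduce late-afternoon heat gain.
        sunscreens = ["west"] if site.latitude < 45 else []
        return {"floors": floors,
                "sunscreens": sunscreens,
                "wind_bracing": site.prevailing_wind}

seed = BuildingSeed()
# The same seed "planted" on two different sites regrows two different buildings.
sf = seed.grow(Site(latitude=37.8, prevailing_wind="W", max_height_m=72))
oslo = seed.grow(Site(latitude=59.9, prevailing_wind="SW", max_height_m=24))
```

Changing a rule or a site parameter and re-running `grow` is the “regrow” step: nothing is remodeled by hand, because the building is just the output of its encoded rules.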

What needs to change

To address the urban population crisis, says Jen Carlile, we need to stop designing individual buildings and start designing building seeds.

The time it takes to design and build needs to dramatically decrease.

Tool #2: Another tool Flux built helps with organizing data, making it more actionable and more universally accessible. Think of it as a feasibility study algorithm that, once you identify a site or sites, instantaneously assesses entitlements, massing, building program, building performance, leasable area and overall project budget.

Simon Rees, Associate Principal and Structural Group Leader at Arup in Los Angeles, calls this wrangling geometry from the data. He described it in a talk he gave in late October 2014 about P12, a data-driven, integrated project that involved input from Arup, Flux, Gensler, Cupertino Electric and Turner Construction, among others.

Embracing the full complexity of the design and construction process, and grounded in real estate data, P12’s goal was to reduce the design and construction of a large-scale building to a 12-month cycle: three months for design, one for permitting, and eight for construction.

They use the example of zoning codes that dictate what can be built on a site.

The tool pulls in data from neighboring lots, buildings, vegetation. It looks at overlay zones, view corridors. Then it looks at the building code, generating the buildable envelope for a site.
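The envelope-generation step might look something like the following minimal sketch. All setback, height and floor-to-floor values here are invented for illustration; a real zoning tool would also consult overlay zones, view corridors and the full building code.

```python
def buildable_envelope(lot_w, lot_d, setbacks, height_limit, far):
    """Return the simple box a zoning code allows on a rectangular lot.

    setbacks: dict with 'front', 'rear', 'side' distances (same units as lot).
    far: floor-area ratio -- caps total buildable floor area on the lot.
    """
    # Shrink the lot by its required setbacks to get the footprint.
    footprint_w = lot_w - 2 * setbacks["side"]
    footprint_d = lot_d - setbacks["front"] - setbacks["rear"]
    footprint_area = max(footprint_w, 0) * max(footprint_d, 0)
    # FAR caps total floor area relative to the whole lot, not the footprint.
    far_cap = far * lot_w * lot_d
    # Assume 4-unit floor-to-floor heights for a floor count (illustrative).
    floors = height_limit // 4
    return {"footprint_area": footprint_area,
            "height_limit": height_limit,
            "max_floor_area": min(far_cap, footprint_area * floors)}

env = buildable_envelope(lot_w=30, lot_d=40,
                         setbacks={"front": 5, "rear": 5, "side": 3},
                         height_limit=24, far=3.0)
```

The design point worth noting is that FAR caps total floor area independently of the footprint and height limits, so the binding constraint varies lot by lot – which is exactly why generating the envelope per site is useful.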

Using downtown Austin, TX as an example, Flux’s software Metro purports to provide a better way to visualize Austin’s development code by:

aggregating multiple data sources in one place – data from cities, tax assessors, and third-party sources – so you quickly understand the parameters for a land parcel;

helping developers and land owners visualize their parcels by situating proposed projects in the surrounding landscape;

showing only the development codes that are applicable, including conditional overlays and uses;

providing a quick assessment of project potential (“If and when you are ready to go deeper,” says the website, they’ll “provide helpful reference links to deeds, entitlement history, and permitting history”);

taking a snapshot of the project and sharing it with anyone, getting stakeholders aligned around a common vision;

and rendering zoning incentive and building usage impacts on the parcel and massing.

They make the process transparent so you can see where all of the data is coming from. Up on your monitor, as part of the tool, side by side with the building massing is the building or zoning code and all of the rules that can be derived from it.

After using the tool for a while, says Carlile, you can develop an intuition as to why the buildings are shaped the way they are: “What we often think of as artistic license is really just the manifestation of a rule set.” This is one of the ways that data can inform, and over time improve, intuition.

In the spirit of the software industry’s open sharing of technology – making the design and construction process not only more transparent but more efficient, and reducing the time it takes to design and build buildings – Flux asks: What if there were a standard library where people could build upon the work of others, as opposed to solving the same problems over and over again?

We already have that technology: it’s called the human mind, imagination and memory.

The founders of Flux need to place their efforts into context by looking at precedents for what they are doing, both in the academic realm and in practice (for example, Aditazz and Cadolto).

I think the population-growth storyline and Mother Nature metaphor don’t mask the underlying opportunity: to best developers at their own game by offering this as software as a service (SaaS).

Remember we’re talking about 33 million buildings by 2050. Yes, population growth, but it also screams revenue growth.

One challenge will be on the construction end: constructors will be needed who know how to componentize, commoditize, and put the buildings together quickly. Flux has very little to say so far about leveraging data for two-thirds of the design/permit/build equation: construction.

The technology raises questions such as: Should humans be performing modeling tasks that a computer can perform?

I did feasibility studies for building developers most of my career and on most days I felt like I had the best, most creative job in the world.

A big misunderstanding is that code searches are drudgery, something that needs to be performed by computers. While the most cursory first looks can be made by computers, any building designer knows that interpreting the code – whether zoning or building – can be every bit as creative a task as designing the building itself. I have myself doubled the size of the allowable square footage of a project without seeking a variance based on nothing more than creative interpretation of the code. A computer can read a code, but it can’t read between the lines of a code book: only humans can.

The tools appear to be sophisticated, but their presentation requires gravitas if they are to come across as something more than design toys. If Flux wants to convince the profession and industry, neither structural nor software engineers should be touting the upside of these services. In the two presentations I have seen on the software, neither presenter looks at the software as a comprehensive, integrated system, but instead from their own somewhat narrow perspective.

To be convincing, Flux needs someone as a spokesperson who sees the big picture. Someone who orchestrates large teams and knows the complete assemblage of building design and construction – not just from their silo, domain or point of view.

One telling sign was something Jen Carlile said in the Q&A after her talk, when she referred to Flux’s genetic engineering of buildings:

I think of it like APIs. You can have an API for a structural system. If you can connect your structural API to your fabrication machine, you no longer have to have humans involved.

Yes, that is what it means to automate. But in the meantime, Flux relies on the experience of humans to help move its vision forward. I wonder how team members Arup and Turner Construction would feel hearing their client say this.

Last summer over early morning coffee in Cambridge, Phil Bernstein – who the past couple summers has joined me on the second day – asked me what I covered on the first day of my two-day Harvard GSD BIM leadership seminar.

So I gave him the rundown: BIM as a database, interoperability, the seven convergences…

Wait. Again?

In all the times I’ve spoken with Phil, he had never before asked me to repeat myself.

So I listed them for him.

Oh those, he said distractedly – and returned to whatever he was doing.

It was then I knew I was onto something.

Background

The convergences in question have both practical and emergent antecedents.

Since the downturn in the economy in 2008, architects and other design professionals have been expected to design and construct in a manner that uses fewer resources, while still innovating, adding value and reducing waste.

Deliverables have to take less time, cost less money to produce, while not compromising on quality: expectations that are unrealistic at best, often resulting in a negative impact on outcomes, working relationships and experiences.

Old paradigms such as “Quality, Speed, Price: pick any two” no longer apply. Owners expect all three – Perfect, Now and Free – on almost every project. Traditional linear thinking no longer works in this converged-upon world.

At the same time, emergent forces and technologies have come together in the second decade of the twenty-first century that have developed to the point where they make real-time integration of all facets of the design and construction process possible.

An unstructured, as yet unnamed group of twenty- and thirty-somethings have met the challenges and opportunities of this moment with skill, verve and aplomb (you know who you are).

Closing Gaps vs. the Eternal Now

The metaphor of closing gaps is based on linear thinking: two events occurring in succession or transition are brought together or bridged.

With increasing demands to make decisions in real time, design professionals are moving beyond the linearity metaphor and thinking in terms of simultaneity, super-integration and convergence.

Emergent tools, work processes and the cloud make real-time convergence today a reality.

Those in academia, the design professions and industry are experiencing a tectonic shift, approaching a real-time/right-time meeting point marked by multidisciplinary integration, intersections, interactions – and, dare I say, collisions.

Because areas of professional expertise are converging on transdisciplinary teams increasingly made up of data scientists, computer scientists, mathematicians, sociologists, statisticians, strategists, scripters, and economists working alongside design professionals – it is no longer adequate for students of architecture, engineering and construction management, or for design and industry professionals, to strive to learn and master individualized skills.

Being proficient in any one domain – whether skill, technology or tool – is no longer sufficient. As I’ve written elsewhere, being proficient is no longer sufficient.

Adapting to the new world of work requires learning to think simultaneously on several fronts, and from several points of view.

This is the world of convergence thinking.

There are seven convergences in contemporary design practice that occur at the meeting of two seemingly opposite forces:

Virtual and Physical World

Data and Intuition

Parametrics and Computation

Visualization and Model

Design and Fabrication

Conception and Construction

The Practical and Ineffable

As this is a blog post, not a dissertation, here they are in capsule form, one at a time:

Virtual and Physical World

Real-time tools enabling man-machine interaction in design and construction are being developed on the UIUC campus. Augmented Reality and Virtual Reality is entering the construction space with implications for how buildings are designed. Software now provides intuitive, real-time answers to wickedly complicated questions, which in turn enables much better decisions.

How can students of design and engineering, and their educators, work together to help leverage these tools throughout the building lifecycle?

Data and Intuition

In lieu of buildings as buildings, or buildings as documents, we’re now seeing a convergence of buildings as data. Design firms are already leveraging data to make better decisions, bring about better insights, and make better buildings.

Where do employees who can do this – understand buildings as databases – come from? How are they learning to effectively and creatively accomplish this? Is this something that can be introduced in school or in practice?

Working with data and analytics not only informs, challenges and validates intuition, but over time changes (i.e. improves) intuition. Convergences such as the data/intuition feedback loop are quickly changing the way design professionals work, the way they think, communicate and interact with one another, and thus have implications for the way they learn, train, and practice.

Parametrics and Computation

Existing BIM tools are in the process of becoming more computationally aware.

Converging parametrics and computational tools. We are now able to compare alternative building designs on the fly – in real time – assuring teams that they are on the right track, meeting the owner’s and building inhabitants’ criteria, comfort and needs from the start.

Converging processes and people. Using Grasshopper plug-ins like Platypus – developed by firm employees, not corporations, disseminated freely on Twitter, not via software resellers – individuals are now able to communicate and collaborate on the design of buildings in real time.

Are architects going to sit side by side with hackers, computer scientists and algorithm builders? Right now, several offices already have architects sitting side by side with hackers, computer scientists and algorithm builders. The future, in other words, is already here. Are we preparing our students for this future? Who will lead this effort? Who, in other words, will be the glue?

Visualization and Model

Tools are currently being developed where visualization reacts directly to analysis, in the 3D model, without the need of producing additional reports.

What impact will this have on the way we currently teach structures, and the daylighting and energy analysis of buildings?

Design and Fabrication

Convergence of design and fabrication. University, not trade school, students today learn to operate robotic arm fabrication equipment directly from their CAD and BIM software.

What are the implications for not only education and practice, but for the trades and industry?

Conception and Construction

After winning the international design competition, it took Jørn Utzon 6 years to figure out how to build the Sydney Opera House shell structures, and by that time he was no longer on speaking terms with the client or the contractor.

Flash forward 50 years: Nathan Miller, as lead computational designer, and later Andrew Heumann, leader of NBBJ’s Design Computation team, designed, fabricated and all but constructed the Hangzhou Stadium from their laptops.

How are the boundaries between two historically separate entities, design and construction, converging? What are the implications for the way we learn? For the way we practice?

The Practical and Ineffable

As David Ross Scheer has discovered, design professionals are increasingly challenged to realize meaning and agency within the constraints of computational tools.

Who are the individuals that are succeeding at this need for the transference and making of meaning brought about by increasing convergence of technology, tools and processes?

Implications for education

The seven convergences have implications for both education and practice.

Convergence requires multidisciplinary participation, merging STEM subjects with those in design and the arts.

We need to educate designers to work compatibly and effectively with those from across different domains and fields.

Like it or not, this is where design practice is heading: we need to understand the implications for education and training, research and development.

Implications for practice

Architecture is a complex undertaking requiring the input of many individuals with varying interests, backgrounds and expertise. This has not – and will not – change.

What is changing is the way these individuals are working, communicating and collaborating. Their individual contributions are converging. In response, they are integrating their efforts – not multitasking. To meet today’s demands for speed, affordability and quality – they are taking and making smart cuts, not shortcuts.

They are currently learning to do this at conferences, at informal meet-ups, in online forums, via gaming, and in social media.

If you aren’t in the Grasshopper forum, or don’t follow them on Twitter, a whole epoch might pass you by.

The linear design process transforms – and increasingly tightens – as a result of the introduction of convergences in contemporary design practice workflows.

A new book

There is a need to clarify and concretize what is happening at this moment in time for those who aren’t following.

Books – whether physical, digital or audio – are seen by some as antiquated technology. But in the middle of the second decade of the third millennium, they are still the best means we have for synthesizing moments and movements, giving them a name, and as importantly, a language that can be understood, shared and discussed by others.

The seven convergences will be explored in a new book I will be writing in 2015.

For those keeping score, convergence is the natural next step in the research from my previous two books on building information modeling (BIM) and data analytics in the AEC industry.

Specifically, the practice-based research for this new book aligns with, grows out of, and builds on my current research on the collaborative leveraging of data in design, which has led to my in-progress book, “Data Driven Design and Construction: Strategies for Capturing, Analyzing and Applying Building Data” (John Wiley & Sons, 2015), a 400-plus-page publication providing practical information, useful strategies and technical guidance to practitioners, educators and students looking to leverage data throughout the building lifecycle.

As a meditation on the impact of technology on the education and making of design professionals, this new book – Convergence – can go a long way to help explain what is happening now in the world of design, as well as to discuss the implications for the future of practice.

I would love to hear from you – anyone for whom this topic strikes a chord, rings true, captures the zeitgeist.

How do we, as a discipline, capitalize on data and metadata to drive innovation in architecture and construction, just as other disciplines and industries have?

One reader wrote: Evidence based design is one of the leading fields of research that will eventually result in data driven design and construction.

In a recent post I discussed the need in the architecture, engineering and construction (AEC) industry for an entity – a place or a site you could visit – to access relevant stats, numbers, figures.

In other words, evidence – to support your design, construction and operations decisions.

Not long after this post ran, I came across a prescient passage by Phil Bernstein FAIA.

In an interview with the Autodesk VP and Yale lecturer in the book Design Informed: Driving Innovation with Evidence-Based Design, Bernstein listed the two primary constraints of access to knowledge in the AEC industry as:

Storage capacity

Processing speed

Both of which, he acknowledges, are today largely unconstrained.

OK, so if capacity and speed are no longer stopping us, what is?

An open-source database of accumulated knowledge

Bernstein suggests two other impediments:

The AEC industry’s need for an open-source database of accumulated knowledge

The lack of a research tradition in the AEC industry

Both of these, according to Bernstein, are major barriers to the creation of data necessary to drive current practice.

Is it a lack of knowledge structures?

Interestingly, Bernstein suggests that we need to address this challenge as a design problem:

To build an infrastructure for architectural research and practice

One that produces and disseminates useful knowledge about outcomes – knowledge that is open, shared and easily accessible.

One might add, reliable data and knowledge.

Despite our saying over and over that architecture is both an art and a science, is it that we’re unpracticed at conducting research comparable to the sciences?

Is it a lack of a systemic infrastructure or, as Renee Cheng has so eloquently suggested, intellectual infrastructure?

Whichever it is, Bernstein is certain that what is lacking is the content – the data – to populate our tools, such as building information models.

As the authors attest in Design Informed:

BIM will fall short of its full potential to predict performance outcomes until evidential data becomes readily available to inform the models.

The closing words of Design Informed are hopeful and provide us with a roadmap, if we will only take up the challenge:

When systems to create, communicate, and apply strong, diverse evidence are in place and embedded within the design process, architecture will be recognized as a valuable, knowledge-based profession – a vision shared by most designers.

Construction’s Big Data Problem (January 14, 2014)

It is clear from reading McKinsey Global Institute’s (MGI) report, Big data: The next frontier for innovation, competition, and productivity, that big data is now recognized as an important factor of production, alongside labor and capital.

But what about productivity?

Most by now recognize that leaders in every sector will have to grapple with the implications of big data.

What does this mean for design and construction?

MGI studied big data in five domains—healthcare in the United States, the public sector in Europe, retail in the United States, and manufacturing and personal-location data globally. Big data can generate value in each domain.

Some sectors are positioned for greater gains from the use of big data than others.

And, as one might expect, the Construction sector has some work to do if it is to see comparable gains brought about by capturing, analyzing and applying big data.

On MGI’s Big data value potential index, the construction sector falls somewhere in the middle of the horizontal continuum, between low and high potential value.

Adding value while reducing waste

On the vertical portion of the chart, measuring productivity growth in the US between 2000 and 2008, it will come as no surprise that the AEC industry is not productive – registering the lowest growth of any industry.

In fact, as stated in the report, several sectors, including construction, educational services, and arts and entertainment, have posted negative productivity growth, which probably indicates that these sectors face strong systemic barriers to increasing productivity.

One of the goals for using big data in the AEC industry is to help – along with the use of BIM, collaborative workflows and integrated delivery methods – improve productivity for owners. Adding value while reducing waste.

A heat map in the MGI report shows the relative ease of capturing the value potential across sectors.

Construction appears on the chart as a good – not a service – and one for which the value potential of big data is not particularly easy to capture.

AEC talent, according to the chart, is hard to come by – not a lot of data scientists working in the architecture, engineering and construction space.

What the chart calls a “data-driven mindset” also registers a medium rating for the AEC industry.

And lastly, data availability registers a medium rating as well – neither easy nor hard.

According to the MGI report, Construction has the least stored data of any industry: 51 Petabytes (as of 2009) compared with 966 Petabytes for Manufacturing and 848 Petabytes for Government. The stored data per firm is among the lowest as well.

In terms of the type of data generated and stored (it varies by sector), Construction – as one would expect – has most of its data in the form of images; next, text and numbers; and the least amount stored as video and audio.

Potential benefits for utilizing big data in construction

The report recognizes that the use of big data in advanced simulations can reduce the number of production- and construction-drawing changes as well as the cost of construction.

Also noted is the fact that construction equipment manufacturers currently embed sensors in their products, providing granular real-time data about utilization and usage patterns, enabling these manufacturers to improve demand forecasts as well as their future product development.

On a larger scale, the report points out that urban planners can benefit significantly from the analysis of personal location data. Decisions that can be improved by analyzing such data include infrastructure, highway design and mass-transit construction.

In conclusion – at least as it pertains to construction – the report indicates that sectors that may have more moderate potential from the use of big data, such as construction, “tend not to have characteristics that could make exploiting big data opportunities more challenging than in other sectors.”

Two caveats about the report. First, it uses data on the construction industry from 2000-2008 only. Had the report included data from 2009-2014, it would have shown that – while work slowed significantly due to the economy – the industry has made significant inroads into adopting and implementing technology in the past half decade.

Second, the report looks at the construction industry independent of architecture and engineering. It is hard to tell from the report whether architecture and engineering would be included in real estate, in arts or in professional services. If the latter, the industry has somewhat better prospects in terms of available talent and IT use.

Productivity in the AEC industry ought to increase in the near term, as new technologies such as building information modeling (BIM), collaborative work processes such as integrated project delivery (IPD), lean construction, and now data are used seamlessly and comprehensively throughout design, construction and the building lifecycle.

Do you agree that embedding data in our processes will increase our productivity?

A Single Source for High-Performance Building Data (December 25, 2013)

I wanted to call this post “Telling AEC Truths with Statistics” because of this quote:

It’s easy to lie with statistics, but it’s hard to tell the truth without them.

– Andrejs Dunkels

Imagine a site you could visit to gather stats, numbers, figures – evidence – to support your design and construction decisions.

In the research for my new book on the use of data in the AEC industry, I was surprised to find that something like this doesn’t already exist.

A place where owners, contractors, architects, engineers, consultants and tradespeople could go to find the statistics and data they need to help make decisions, sell ideas, persuade constituents and justify courses of action.

Compare the amount of energy it takes to produce one ton of cement, glass, steel, or aluminum to one ton of wood:

5 times more energy for one ton of cement

14 times more energy for one ton of glass

24 times more energy for one ton of steel

126 times more energy for one ton of aluminum

Wood products make up 47% of all industrial raw materials manufactured in the United States, yet consume only 4% of the total energy needed to manufacture all industrial raw materials. Source

LEED is referenced in project specifications for 71% of projects valued at $50 million and over Source
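
If such a single source existed, a lookup for figures like the energy multiples above might be as simple as the following sketch. The multipliers are the ones quoted in this post (energy per ton relative to wood); the dictionary-based interface is purely hypothetical:

```python
# Toy sketch of a queryable stats store. The multipliers are those
# quoted above (manufacturing energy per ton, relative to wood);
# the lookup interface is invented for illustration.
ENERGY_VS_WOOD = {"wood": 1, "cement": 5, "glass": 14, "steel": 24, "aluminum": 126}

def energy_ratio(material_a, material_b):
    """How many times more energy material_a takes per ton than material_b."""
    return ENERGY_VS_WOOD[material_a] / ENERGY_VS_WOOD[material_b]

print(energy_ratio("aluminum", "steel"))  # 126 / 24 = 5.25
```

The point is less the code than the idea: once such figures live in one queryable place, comparative questions become one-liners instead of literature searches.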

…

Would such a site be of use to you in your work, research or practice?

What would you suggest to improve this idea?

Do you have a favorite AEC industry stat?

Let us know by leaving a comment. Thanks!

…

Image credits: Top four data images courtesy of Tocci Building Companies

Constructors of Inexactitude, Architects of Imprecision
https://datadrivendesignblog.com/2013/12/20/constructors-on-inexactitude-architects-of-imprecision/
Fri, 20 Dec 2013

Architects and contractors I have worked with over the years like to be precise in what they do.

There’s a level of accuracy involved in delivering a successful project.

Shouldn’t we expect our data to be equally precise?

Design specifications call for building tolerances to be in the fraction of an inch.

Construction tolerances have to follow the letter of the documents or be rebuilt.

So, how are architects and contractors going to fare given the imprecision of big data?

Given the impurities and errors so often found when churning large quantities of information?

At a time when design and construction professionals are wondering whether their future includes sitting side-by-side with AEC-dedicated data scientists, how ironic that it isn’t the laser-like precision of statisticians or number-crunching of big data scientists – but the necessary inexactitude and imprecision of vast quantities of data – that they will be contending with.

This imprecision may prove the more difficult thing for architects and contractors to get their heads around.

Accepting errors as part of the mix may be the bigger part of change for those in the AEC industry.

Are building design architects going to be able to handle big data’s messiness?

Are contractors going to be comfortable dealing with sloppy but vast metadata?

Sure, most projects in the earliest stages are no more than a collection of hunches, assumptions, intuitions and WAGs.

But when designers run a dimension string, or calculate square footages while preparing a permit set, they have to be sure that all their numbers add up.

Constructors, too, are often contractually required to report inconsistencies and errors they find in the construction documents.

Are they going to learn to live with errors in the large quantities of data they’ll be working with?

Will this require a new or different mindset from those who design, model, construct, own and operate buildings?

Will building teams learn to live with and tolerate these quirks of big data?

Or perhaps, as with any contradictions, see them as additional opportunities to gain knowledge and insight?

Proprietary: The Dirtiest Word in Data
https://datadrivendesignblog.com/2013/12/06/proprietary-the-dirtiest-word-in-data/
Fri, 06 Dec 2013

Companies have a proprietary right to their property. Firms own their work and have the right to give away the copyright. Possession is, after all, nine-tenths of the law.

Firms have a right to make a living. But not all firms can invest in R&D.

What I take issue with is the word “proprietary” and the attitude it implies.

At issue: a firm’s proprietary research and the outcomes of their research.

At issue, as well, is their proprietary data.

Big data in the AEC industry, with all of its potentially powerful benefits, won’t catch on and scale until the technology and data are freely shared.

Information wants to be free – so why not data?

What I am questioning here is the proprietary nature of sensors and other technology vs. open source big data and freely available technology.

In a recent article, Blaine E. Brownell, AIA LEED AP, Associate Professor and Co-Director of the Master of Science Program in Architecture—Sustainable Design track at UMN School of Architecture, asks:

What if the architect were to measure the actual rather than simulate the predicted performance of material assemblies?

The article raises some interesting questions that have implications for data acquisition and analysis in the AEC industry:

How is knowledge of the environment acquired?

What is the potential relationship between form generation and real-time feedback?

How might design practices change when real-time feedback is incorporated into the design process?
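
The article’s questions about real-time feedback can be made concrete with a small sketch. Everything here is invented for illustration: the sensor values, the shading parameter and the adjustment rule are hypothetical, not KieranTimberlake’s method:

```python
# Hedged sketch: fold a real-time measurement back into a design parameter.
# Sensor values and the adjustment rule are invented for illustration.
def adjust_shading_depth(current_depth_m, measured_gain_w, target_gain_w, step_m=0.05):
    """Nudge an overhang deeper when measured solar gain exceeds the target,
    shallower when it falls below -- a crude feedback loop, nothing more."""
    if measured_gain_w > target_gain_w:
        return current_depth_m + step_m
    if measured_gain_w < target_gain_w:
        return max(0.0, current_depth_m - step_m)
    return current_depth_m

# Measured gain (950 W) exceeds the target (800 W), so deepen the overhang.
print(round(adjust_shading_depth(0.6, measured_gain_w=950, target_gain_w=800), 2))
```

Crude as it is, the loop captures the shift the article describes: form responding to measured performance rather than to a one-time simulation.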

You can read the rest of the article, Using Real-Time Climate Data to Drive Design, here.

What I take issue with in the article is this:

Brownell explains that KieranTimberlake’s “Faircloth and Welch brought a collection of proprietary sensor and communications technologies from KieranTimberlake, and they set up a website for students to track real-time climate data related to their work.”

There it is: proprietary, as in “proprietary sensor and communications technologies.”

Is it mentioned to ward off would-be parties who might be interested in using the sensors for their own gain?

But isn’t the point of using the sensors and related technology and data to improve the environment and related performance?

The opposite of proprietary isn’t generic, nonproprietary or unpatented, but #opendata

The idea of giving away something in order to gain something isn’t new. It’s the subject of Chris Anderson’s recent book, FREE: How Today’s Smartest Businesses Profit by Giving Something for Nothing.

The irony – perhaps a taste of his own medicine? – is that Anderson’s book is available for free.

In the data space, Jeff Heer, Assistant Professor at Stanford, is the creator of four data visualization toolkits/languages, Prefuse, Flare, Protovis and D3, all freely available for anyone to download and use. Catch a great podcast with him at Data Stories and read about Heer here.

Just as the expansion of open source data is important for reaching scale with such technologies as augmented reality, so too making sensors, communications technology and data available for all to access freely is critical for big data in the AEC industry to scale.

Until technologies for capturing, analyzing and applying data are open, clear and free, the construction industry will continue its 60-plus-year decline in productivity.

Little to Lose and Much to Gain?

As the Brownell article points out, “in architectural education and practice, students and architects spend a lot of time developing the physical form of buildings.”

This is true as well for students, professors and architects who spend a lot of time researching and developing technologies and the resulting data.

If these were to be open source, freely available for any professional to download or acquire and use, what would be lost?

Firms would:

be unable to regain the money they invested in the research.

lose the opportunity to make a profit and fund future research.

potentially lose clients who might feel used, especially where the firm conducted research on the client’s dime.

lose their exclusive right to the data and technology.

lose the right to call them proprietary sensor and communications technologies.

Despite there being much to lose, if otherwise proprietary data and technology were made available to all, so much would be gained – for the profession, industry and especially the environment.

AEC firms’ data- and technology-related research can, of course, be supported by sponsors and grants with much to gain.

How do firms then fund their research? If they do so through the results of their research, then can they afford to give their work away?

If a firm’s research is sponsored or subsidized, the firm can afford to give its data and related technology away, but may not be able to if the sponsor has a claim on the research.

The Brownell article points out that KieranTimberlake’s advanced sensing capabilities made the students’ experience described in the article possible; “without such technology, the studio would only have been speculative in nature.”

And yet, nothing would have changed had the technology been freely available to not only the students, but to others as well.

To gain the advantages offered by data in the AEC industry, firms need to agree to and define an open data and technology strategy. Do you agree?

Why Do We Build the Way That We Do?
https://datadrivendesignblog.com/2013/12/01/why-do-we-build-the-way-that-we-do/
Sun, 01 Dec 2013

The Data-Driven Design and Construction (3D + C) blog will feature and, where possible, follow and review key industry events related to the subject of my book.

Based on the invited list of speakers, the forum does not appear to be focused directly on data per se, or even big data, but rather on the benefits for firms of investing in applied research and the results derived from evidence-based design decisions.

The AIA Seattle Data-Driven Design Forum promises to

show the relationship between applied research and positive quantitative outcomes such as reduced energy use and increased performance of buildings as well as qualitative outcomes such as increasing the likelihood of designing more productive workplaces that contribute to the occupant’s health and well-being

offer suggestions for firms to increase their competitive advantage by leveraging their research insights.

As I begin research for my book, Data-Driven Design and Construction, this blog will raise as many questions as it tries to answer. That seems to be the case for this forum as well.

Questions raised by the event include:

How important is hard, researched data to a client’s decision-making process?

How can you ensure that your design strategies will pay off in terms of increased productivity, higher performance, or better health and well-being?

For me personally, the most intriguing questions potentially raised at such a gathering have to do with the data:

How can firms best access and utilize existing research data?

How can we benefit from analyzing performance data?

Where does the data come from and where can an ordinary design firm access it?

But also more general questions such as:

What are the most interesting industry problems that require the use of data to address?

What kind of firm commitment does it take to really do this?

The keynote speaker will be Billie Faircloth, AIA, LEED AP, director of KieranTimberlake Research Group (KTRG), who asks: “Why Do We Build the Way That We Do?”

Other speakers include:

Daniel Friedman, Professor and Past Dean of the College of Built Environments, University of Washington, will open the event;

I personally am having a hard time clearing my teaching calendar to attend this event. If you plan on attending and take good notes, I would truly appreciate it if you would share them, or tell me about your experience at the forum, either by leaving a comment below or via email: rdeutsch@illinois.edu

The AIA Seattle Data-Driven Design forum will be held from 8AM-5PM on Tuesday, Dec 10, 2013, at the University of Washington, Alder Commons, 1310 NE 40th St, Seattle, WA, USA.

Welcome to Data Driven Design and Construction
https://datadrivendesignblog.com/2013/11/27/welcome-to-data-driven-design-and-construction/
Wed, 27 Nov 2013

Over the next year I will be writing a book, Data-Driven Design and Construction, to be published by John Wiley and Sons in 2015.

The same folks who published my last book, BIM and Integrated Design, back in 2011.

Since that book focused on information (the “I” in BIM), and this one on data, by this time next year I will have worked my way halfway up the DIKW pyramid. Albeit not in order.

I look forward to reaching the pinnacle by addressing the topics of knowledge and wisdom in design and construction in my tenured old age. But that’s for another blog.

This blog is for those who would like to join me in my current data-driven exploration of this still nascent field: mining, applying and analyzing data to improve architectural design and construction. And indirectly, in doing so, the world.

This blog will argue that, to maintain or reclaim their roles as leaders, architects, contractors and others in the AEC industry need to account for data derived from digital models, and be able to gather, navigate and communicate the derived information while working collaboratively on integrated teams throughout the design and construction cycle.

It will also argue that design and construction are the last profession and industry to address these challenges, and that they are only beginning to do so now.

This blog will also ask and explore a lot of questions, some of which are:

How do we, as a discipline, capitalize on data and metadata to drive innovation in architecture and construction, just as other disciplines and industries have?

What forces and technologies are coming together in the second decade of the millennium that make the gathering and use of data possible for industry practitioners at firms both small and large?

Why is the architecture, engineering and construction (AEC) industry the last to discover – and utilize – data, for their benefit?

In what ways can design and construction professionals and owners benefit from capturing, collecting and using data in their building models?

What implications does the DIKW hierarchy have for presenting findings to owners and others who may not be as data savvy?

What is the business case for implementing a data transformation within one’s organization?

How is data currently being used in the AEC industry?

Can building data be crunched into a form that can be analyzed by non-experts? Or will architects and other design and construction professionals need to adapt to working with, even alongside, data scientists and analytics experts?

Is there a precedent for this situation, perhaps in another industry, that architects would do well to learn from, model and emulate?

If you would like to take a stab at any of these, or know someone who might, I’d like to hear from you: leave a comment below, share a link, or reach out via email: rdeutsch@illinois.edu

Thanks for visiting the D3 blog. I look forward to taking this journey with you!