Today we have
reached an important milestone in the data transformation movement with the
naming of members to the Commerce Department’s new Data Advisory Council
(CDAC). The 19 leaders we have selected will help guide the Department in
revolutionizing our approach toward data optimization and usability. They are
bright stars in the private and public sectors: thought leaders on data, respected
and well-equipped to facilitate this transformation. Members’ expertise mirrors
the spectrum of Commerce data: demographic, economic, scientific,
environmental, patent, and geospatial. Their agenda? To help us
foster innovation, create jobs, and drive better decision-making throughout our economy and
society. Their first meeting will take place April 23-24 in Washington, D.C.

Selecting
from an impressive and wide array of experience, innovation, education and
talent was not an easy task. The individuals we have chosen are
extraordinary for a host of reasons evident in their positions and
achievements. But perhaps one of the most compelling traits they share is
keen awareness that success is built upon the ability to listen to a chorus of
voices representing a range of viewpoints.

We are
thrilled to have reached this important marker in our “data revolution” and
look forward to the CDAC’s guidance on such key issues as data management; open
data standards; public-private partnership; and ensuring a user-driven process.

On
February 6-8, over 200 software developers, designers and makers gathered at
the Zillow headquarters in downtown Seattle for “Hack Housing,” a hackathon
co-hosted by Zillow and the University of Washington. Teams of programmers
spent the weekend using open data to build apps that help people find
affordable, accessible places to live – and pitching their products in
competition for a $10,000 top prize. Zillow Co-Founder Rich Barton, former
White House Deputy CTO Nick Sinai, and Lisa Wolters from the Seattle Housing
Authority kicked off the event on Friday. The 72-hour jam session also featured
an inspiring video message from Nani
Coloretti, Deputy Secretary for the U.S. Department of Housing and Urban
Development (HUD).

Zillow
uses open data from multiple federal agencies, including HUD, the U.S.
Department of Education, and the U.S. Census Bureau, to deliver insights and
information on housing, schools, and communities as part of its living
database of more than 110 million homes.

“The
Hack Housing event is a blueprint for how government can use our valuable open
data assets to help bring private sector innovation to tackle key policy
challenges, such as helping seniors age in their homes and connecting low
income renters and first time home buyers to housing opportunities,” according
to Lynn Overmann, Deputy Chief Data Officer of the U.S. Department of Commerce.
“Bringing together thought-leaders from industry, academia and local, state,
and federal government can generate really compelling product ideas to help
solve some of our most difficult housing issues and also drive economic
impact.”

The teams at Hack Housing focused on user-centered design to
address the needs of specific sets of users, including first-time homebuyers,
older Americans and lower-income families. The White House, U.S. Department of Commerce,
HUD, Department of Transportation, and the U.S. Census Bureau supported the
event by providing know-how and open datasets.

Guest blog post by Thomas Beach, Senior Advisor in the
Office of the Under Secretary and Director, U.S. Patent and Trademark Office, and
Scott Beliveau, Open Data Team Lead, U.S. Patent and Trademark Office

Nobody doubts the value of data today, and the Obama Administration
has taken many important steps towards making government data more open and
accessible to the public. As Secretary Pritzker likes to remind us, the
Department of Commerce is “America’s Data Agency,” and has a unique and central
role in that transformation. Although
open data feels like the flavor of the month for every government agency to
tout, this is especially meaningful for the United States Patent and Trademark Office,
or USPTO. The agency houses a treasure trove of data, and has now
crystallized a path forward to better share it with the world.

Disclosing and disseminating data supports our broader mission
of advancing American innovation. After
all, the patent system rests on the trade-off between the disclosure of an
invention and the right to exclude others from using it. From that perspective, the USPTO has been in
the business of open data for a very long time.
If we were going to live up to our mission of disseminating information
about patents and trademarks in this interconnected, digital world, we
knew we needed an agency-wide commitment to improve our data delivery on all
fronts. And that was the spirit in which
we hosted the USPTO Open Data Roundtable with NYU’s GovLab on December 8th.

The roundtable brought together diverse members of our user
community, including industry representatives, prior art searchers, and
academics, with USPTO’s data team.

Among
those actions, launching a department-wide Data Advisory Council was a top
priority and a key commitment. And today, I am pleased to say that we are
making good on our promise: the council has been officially established and we
are now accepting
applications.

We
are looking for the best and brightest data thought leaders in the private and
public sectors to advise our efforts to revolutionize Commerce’s data – to
foster innovation, create jobs, and drive better decision-making throughout our
economy and society. The application process extends through December 3,
2014. If you think you have what it takes, I strongly urge you to apply.

As
we build our Data Advisory Council, we are actively recruiting a Chief Data
Officer (CDO) to drive the transformation of our data, and we are pleased to
announce the hire of an outstanding Deputy CDO, Lynn Overmann, currently a
senior advisor to White House Chief Technology Officer Megan Smith. Lynn
will be responsible for coordinating and guiding the Department’s efforts to
realize the value of our data and to put the vast volumes of our data to better
use each and every day.

Government data helps drive our economy and will increasingly become more important in the future. Thursday, I had the opportunity to speak on this topic at a congressional briefing hosted by U.S. Senator Mark Warner (D-VA), Chairman of the Budget Committee’s Government Performance Task Force, and the Center for Data Innovation. Panelists included Daniel Castro, Director of the Center for Data Innovation, Kathleen Phillips, COO for Zillow, Tom Schenk, Chief Data Officer for the City of Chicago, and Steven Adler, IBM’s Chief Information Strategist.

We explored how government data is the foundation of the ongoing data revolution, fostering innovation, creating jobs and driving better decision-making in both the private and public sectors. The federal government is, and will continue to be, the only provider of credible, comprehensive, and consistent data on our people, economy, and climate. We also pointed to the findings in our recently released report, “Fostering Innovation, Creating Jobs, Driving Better Decisions: The Value of Government Data,” which found that billions in economic output and trillions in resource decisions are driven by federal data.

Daniel Castro, Director of the Center for Data Innovation, urged attendees to make sure Congress continues to invest in our data infrastructure. He highlighted the value of open data, ensuring that data flows more seamlessly between the public and private sectors. Castro also focused on the need to consider new ways to enable cooperation between government and industry to maximize the benefits of big data to the greatest number in society.

Zillow’s Chief Operating Officer Kathleen Phillips discussed how her company uses a wide variety of federal and local data to better connect buyers and sellers in the real estate marketplace. Zillow provides critical information in an easy-to-digest mapping format for over 50 million properties around the country. Their Zillow Home Value Forecast, fed in part by federal datasets, also predicts local home values. Zillow uses data from the Census Bureau, the Bureau of Labor Statistics, the Bureau of Economic Analysis, the Federal Housing Finance Agency and other federal sources to provide a real-time evaluation of local real estate markets.

Today,
U.S. Deputy Secretary of Commerce Bruce Andrews spoke about the software
industry’s role in strengthening the economy at an event hosted by the Software
and Information Industry Association (SIIA), the principal trade association
for the software and digital content industry. During the event, titled “The
Software Century: Analyzing Economic Impact & Job Creation,” Deputy
Secretary Andrews talked with SIIA Vice President of Public Policy Mark
MacCarthy about the Commerce Department’s efforts to support American
businesses in the software and other high-tech sectors.

During
the discussion, Deputy Secretary Andrews highlighted how the Department
supports the software industry at practically every stage of development
through our “Open for Business Agenda.” Those efforts include increasing
broadband access across the country, linking small businesses and their
customers with high-speed Internet, boosting manufacturing to provide the
hardware that software needs, and strengthening U.S. intellectual property protections,
cybersecurity and consumer privacy.

Deputy
Secretary Andrews also talked about data as a key department-wide strategic
priority. Commerce is working to unleash more of its data to strengthen the
nation’s economic growth; make its data easier to access, understand, and use;
and maximize the return on data investments for industries, including the
software industry.

It
was fitting, then, that SIIA today released a first-of-its-kind report
providing detailed analysis and data related to the software industry’s output,
productivity, exports and job creation. MacCarthy, former Under Secretary of
Commerce for Economic Affairs Robert J. Shapiro, and representatives from
Oracle, Intuit and GM discussed the report, titled “The
Impact of the U.S. Software Industry on the American Economy,” at the event.

The
report epitomizes how government data is essential for industries to understand
their contributions to the broader economy and how improvements can be made
accordingly. Further, Deputy Secretary Andrews explained that the
prevalence of the Commerce Department’s Bureau of Economic Analysis data
throughout the report is a testament to the usefulness of the department’s data
to help American businesses grow. The value of government data was recently
highlighted in “Fostering
Innovation, Creating Jobs, Driving Better Decisions: The Value of Government
Data,” a Commerce report by the Economics and Statistics Administration
(ESA).

The U.S. Commerce Department’s National
Oceanic and Atmospheric Administration (NOAA) collects weather and climate
data. As we noted in a recent Commerce Department report on the Value
of Government Data, the return to society on investment in government
meteorological data is large.

For example, one survey found that
the overwhelming majority of people said they used weather forecasts and did so
an average of 3.8 times per day. That equates to 301 billion forecasts consumed
per year!
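The 301 billion figure follows from simple arithmetic. Here is a quick back-of-the-envelope check; the population base of roughly 217 million U.S. adults is an assumption on our part, not a figure stated above:

```python
# Back-of-the-envelope check of the "301 billion forecasts per year" figure.
# Assumption (not stated in the post): the survey covered U.S. adults,
# roughly 217 million people at the time.
forecasts_per_day = 3.8
adults = 217e6

forecasts_per_year = forecasts_per_day * 365 * adults
print(f"{forecasts_per_year / 1e9:.0f} billion forecasts per year")  # prints: 301 billion forecasts per year
```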

The study’s authors note that,
other than current news events, there is probably no other type of information
obtained on such a routine basis from such a variety of sources. Certainly, the
researchers say, no other scientific information is accessed so frequently. And
while the information is being delivered from an array of sources, most of it
directly or indirectly originates from NOAA’s National Weather Service (NWS). Americans
check to learn what is happening in the weather, and we plan our days – and
lives – based on this data.

The researchers found a median
valuation of weather forecasts per household of $286 per year, which suggests
that the aggregate annual valuation of weather forecasts was about $31.5
billion. The sum of all federal spending on meteorological operations and
research was $3.4 billion in the same year, and the private sector spent an
additional $1.7 billion on weather forecasting, for a total of private and
public spending of about $5.1 billion. In other words, the valuation people
placed on the weather forecasts they consumed was 6.2 times as high as the total
expenditure on producing forecasts. NOAA data is re-packaged and analyzed to
produce 15 million weather products, such as air quality alerts, the three-,
five-, and ten-day extended weather forecasts, earthquake reports, and tornado and
flash flood warnings. Many end users do not realize that NOAA provides the data
they see and hear every day on The Weather Channel, AccuWeather, the radio and in
the morning paper.
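The cost-benefit ratio above can be verified directly from the figures given. A minimal sketch; the roughly 110 million U.S. households is an assumption implied by the numbers, not a figure stated in the study excerpt:

```python
# Verify the weather-forecast valuation figures cited above.
value_per_household = 286         # median valuation, dollars per household per year
households = 110e6                # assumption: ~110 million U.S. households

aggregate_value = value_per_household * households    # ~ $31.5 billion

federal_spending = 3.4e9          # federal meteorological operations and research
private_spending = 1.7e9          # private-sector weather forecasting
total_spending = federal_spending + private_spending  # $5.1 billion

ratio = aggregate_value / total_spending
print(f"aggregate value ~ ${aggregate_value / 1e9:.1f}B, ratio ~ {ratio:.1f}x")
```

With these inputs the aggregate valuation works out to about $31.5 billion and the ratio to about 6.2, matching the figures reported above.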

Today, U.S. Secretary of Commerce Penny Pritzker
discussed the Department of Commerce’s expanding role as “America’s Data
Agency” at the 2014 Esri International User Conference in San Diego,
California. The annual conference, hosted by Esri, a geographic
information systems (GIS) software development company, is attended by
16,000 data experts, including those from federal, state, local, and regional
governments; Fortune 1000 companies; small business owners; university
scholars; and K-12 teachers.

During her address, Secretary Pritzker described how
the Department of Commerce’s data collection – which literally reaches from the
depths of the ocean to the surface of the sun – not only informs trillions of
dollars of private and public investments each year and plants the seeds of
economic growth, but also saves lives. Because of Commerce Department data,
Secretary Pritzker explained, communities vulnerable to tornados have seen
warning times triple and tornado warning accuracy double over the past 25
years, giving residents more time to seek shelter in the event of an
emergency. The breadth of the Department’s data collection and dissemination, which
touches the lives of millions of Americans every day, is why many, including
Secretary Pritzker, call the Department of Commerce “America’s Data Agency.”

To develop and implement a vision for the next phase
in the open data revolution, Secretary Pritzker announced that the Department
of Commerce will hire its first-ever Chief Data Officer. This leader, Secretary
Pritzker explained, will oversee improvements to data collection and
dissemination in order to ensure that Commerce’s data programs are coordinated,
comprehensive, and strategic. To bolster the Chief Data Officer’s efforts,
Secretary Pritzker explained that the Department will create a data advisory
council, which will be composed of private sector leaders who will advise the
Department on how to best use and unleash more government data.

Secretary Pritzker also
announced the launch of the International Trade Administration’s “Developer
Portal,” which will centralize data that is vital to exporting businesses
across the country. Finally, Secretary Pritzker invited conference attendees to
participate in a panel discussion later in the week in San Diego on how
businesses can best utilize data from the American Community Survey (ACS), an
annual statistical survey that helps guide $400 billion in federal spending
each year.

And when it comes to data, as the Under Secretary for Economic Affairs, I have a special appreciation for the Commerce Department’s two preeminent statistical agencies, the Census Bureau and the Bureau of Economic Analysis. These agencies tell us how our $17 trillion economy is evolving and how our population (318 million and counting) is changing, data that is critical to our country. Although “Big Data” is all the rage these days, the government has been in this business for a long time: the first Decennial Census was in 1790, gathering information on close to four million people, a huge dataset for its day, and not too shabby even by today’s standards.

Just how valuable is the data we provide? Our report seeks to answer this question by exploring the range of federal statistics and how they are applied in decision-making. Examples of our data include gross domestic product, employment, consumer prices, corporate profits, retail sales, agricultural supply and demand, population, international trade and much more.

“Better
Data for Better Decisions” is my mantra as I crisscross the country talking to
people about making the data we collect easier to find, understand and
use. Making government data more
accessible or “open” to improve government, business and community decisions is
a major initiative in the Commerce Department’s “Open for
Business Agenda.” The open data initiative has the potential to
fuel new businesses, create new jobs and help us make better policy decisions.

One
of our best data sources is the U.S. Census Bureau’s American Community Survey (ACS). The ACS is truly a unique, national treasure,
producing a wealth of data on which our country relies to make important
decisions. The ACS is used to inform
disbursement of over $400 billion a year in Federal funds. State and local decision makers rely on the
ACS information to guide tough choices about competing funding priorities, such
as locating hospitals, funding programs for children, building roads and
transportation systems, targeting first responders, supporting veterans,
locating schools, and promoting economic development. In short, our community
leaders use ACS data to analyze how the needs of our neighborhoods are
evolving. And, our business users rely
on ACS data to make key marketing, location and financial decisions to serve
customers and create jobs.

The
value of the ACS is immense. It makes our businesses more competitive, our
governments smarter, and our citizens more informed.

This
value comes from the fact that the ACS captures so much information so
comprehensively. But, this also means
that the value of the ACS depends critically on the people who respond to
the survey. I met
recently with members of the ACS Data Users
Group,
an organization dedicated to sharing innovations and best practices for ACS
data use, to discuss how to get the best quality data with the least amount of respondent
burden. This is of paramount importance.
A survey seen as too lengthy, burdensome and intrusive will produce
lower response rates and could undermine both the quality of the data and value
of the survey. But reducing the length of the survey could reduce the amount of
information available for decision-making.