When it comes to transforming healthcare, IBM started by looking at what we could do for our own employees. More than a decade ago, thought leaders within the company helped shape one of the most important concepts in healthcare today–patient-centered primary care.

That’s the idea that healthcare should be organized around the individual and that all of the organizations and healthcare providers involved should coordinate to deliver truly personalized services addressing everything from promoting healthy lifestyles to treating diseases.

Since then, we’ve been on a steady march to infuse people-centric, relationship-based thinking into every aspect of healthcare and wellness at IBM–and we’re committed to creating technology-based solutions that give organizations and healthcare providers worldwide the tools for improving the health and well-being of their populations.

Today, we’re taking another step forward with a sweeping partnership with CVS Health aimed at transforming the way individuals and their caregivers engage with essential members of that patient-centered, community-based primary care team—CVS pharmacists and healthcare providers at your local CVS Minute Clinic. These team members are committed not only to addressing your acute healthcare needs, such as a sore throat or a sinus infection, but also to managing key chronic conditions, including heart disease, diabetes, asthma, arthritis, depression, and cancer.

Our two companies will develop new cognitive computing systems for health management through CVS’s extraordinary national network of community-based pharmacies and in-store clinics to predict, prevent and personalize.

• To predict, we will apply Watson’s cognitive computing to a wide range of data from electronic medical records, pharmacy records, wearables, fitness devices, home monitoring devices, consumer-oriented mobile apps, and more.
• To prevent, we plan to help the populations we serve by improving health and reducing bad health outcomes such as hospitalizations and unnecessary emergency room visits.
• To personalize, we will leverage Watson’s extraordinary capabilities to translate scientific, evidence-based guidelines and interventions into real-world practice, empowering CVS pharmacists and healthcare providers to better individualize, customize and “nudge” patients towards their best possible health.

This partnership will never replace the important relationship between patients and their primary care providers. But we think technology that helps manage disease when and where it is most convenient for the individual has the potential to improve health and extend lives even while reducing overall healthcare costs for society.

One of the critical issues here is helping individuals to follow their doctor’s orders–taking medications in ways that will improve their health. Do people take the drugs their doctors have prescribed as often as they are supposed to, at the right time of day and in the right combination with food or other prescriptions? Are the medications effective in reducing symptoms, improving key biometric results, and assuring better health? When people don’t follow instructions, their health suffers–and it could lead to bad health outcomes, including surgeries, hospitalizations, and irreparable organ damage.

Another key element is easy access to healthcare and advice. People value convenience and access, and we need to recognize that retail clinics are an essential component of the US healthcare ecosystem.

People often delay care or treatment due to access issues. How often have you or a family member been in a situation where you need to wait weeks to get an appointment with your doctor? Working with CVS, we’ll leverage Watson to develop technologies and evidence-based techniques that personalize engagement and proactively engage neighborhood CVS Minute Clinic clinicians before a bad health outcome emerges. In addition, our solutions will seamlessly integrate these engagements with the other key members of the patient’s care team, including the primary care clinician.

This is part of a long march of progress at IBM. We believe that to foster better health and wellness, it’s important to take a holistic view–addressing not just a person’s physical health but their mental health, their social relationships, their financial well-being and their overall sense of purpose. We call these the five dimensions of health.

You see this philosophy reflected in many of the things we do. Our CaféWell engagement platform, which we developed with Welltok, offers our employees a one-stop portal for addressing all five dimensions. We were one of the first major corporations to introduce wellness incentives. We offer fitness wearables to our employees.

The way we deal with pharmacies, prescriptions, and retail clinics shows how our thinking has changed over time. In the past, we primarily used online and mail-order pharmacy services. The focus was on improving convenience and reducing costs. Then, in partnership with CVS Health, we shifted to a relationship-based approach. At CVS stores, our people get advice from pharmacists and convenient care for health issues. This approach is working. Already, we have boosted the prescription adherence rate among our employees and their families, and thousands have taken advantage of the convenient clinics.

As a physician who has spent much of my career focused on public health issues, being involved in this initiative and anticipating its potential impact on the rising tide of chronic diseases is tremendously exciting. With this community-based, relationship-centered, data-driven approach, we have the opportunity to predict, personalize, prevent, and transform healthcare on a global scale–fostering a healthier and happier planet.

By Veena Pureswaran – As the Internet of Things continues turning physical assets into participants in new real-time, digital marketplaces, it’s creating what we describe as a new “Economy of Things.” In fact, such digital marketplaces represent huge economic opportunities for growth and advancement.

In a new study from IBM’s Institute for Business Value, The Economy of Things, we explored the macroeconomic impact of this transformation across three dimensions: Asset Marketplaces, Risk Management and Efficiency, as defined here:

Asset Marketplaces: Most physical assets are vastly underutilized. For example, commercial real estate in the United States is only 67% utilized. The Internet of Things can create liquid marketplaces of underutilized assets by enabling real-time discoverability, usability and payment. Our model shows that the emergence of new digital marketplaces with lower cost office space alternatives can lower rental prices and increase productivity for the industry.

Risk Management: Instrumentation and digitization can revolutionize credit and lending by building more accurate pictures of risk. In economies where there is no formal measure of credit – such as the informal SMBs in South Africa that account for only 8% of all bank lending – our model shows that digital usage data and virtual contract reinforcement can lead to an infusion of credit and a reduction in interest rates.

Efficiency: Insights from connected devices in industries that are not technology-intensive could yield substantial gains in efficiency. Our analysis shows that in industries like agriculture, where IT accounts for just 1 percent of all capital spending, real-time integrated sensor data can help achieve greater land productivity.

Veena Pureswaran, IBV Global Electronics Industry Lead, IBM

As new marketplaces and services powered by the Internet of Things emerge and evolve, enterprises must evaluate their role in this transformation. As with any industry transformation, profit pools will get redistributed and real-time use of data and insights will be key to profitability.

While an office desk that trades itself on a marketplace may seem a bit futuristic, even dystopian, technologies are driving the Economy of Things so rapidly that your smart desk may indeed be just around the corner.
________________________________________

World leaders from business, government and the non-profit sector are gathering this week in Nairobi, Kenya, for Global Entrepreneur Summit 2015, the first such summit to be held in sub-Saharan Africa. So it’s a good time to explore the potential for Africa and Africans to take advantage of the power of entrepreneurship and innovation to propel the continent forward.

IBM is committed to helping Africa fulfill its promise by providing information technologies to help address the continent’s challenges, through research collaborations with companies and universities, and by helping to foster innovation ecosystems in a number of cities.

These ecosystems, modeled on Silicon Valley, bring together businesses, universities, entrepreneurs, venture capitalists and government agencies to spark economic growth and dynamism.

The early signs of progress are impressive. With time, these ecosystems could help solve some of the continent’s grandest challenges and foster broad-based economic growth.

Africa’s Innovation Ecosystems

The Braamfontein neighborhood near downtown Johannesburg has long suffered from crime and crumbling infrastructure. But it’s undergoing an amazing metamorphosis. Trendy boutiques, restaurants and coffee shops now dot some of the streets, and, now, an effort is underway to transform Braamfontein into a hub of innovation capable of driving economic growth and creating jobs for young people.

The architect of this initiative is Barry Dwolatzky, a computer science professor at Wits University, who has launched The Tshimologong Precinct, a partnership of the university, large corporations, government, and a nascent entrepreneur community. For Barry, the project represents a shot at transforming South African society and turning Africa into an innovation engine for the world.

This project–and others like it in African cities–is tremendously encouraging. In each case, information technology plays a key role in sparking and sustaining the transformations that are underway. In fact, the World Bank counts nearly 100 tech hubs in 29 African countries–most of them in cities.

There’s an important role for multinational companies to play in the rise of Africa. We must help Africa build skills, organizations and infrastructure capable of supporting long-term economic expansion. Innovation ecosystems are a good place to start.

IBM is committed to supporting these ecosystems through our engagements with African governments, businesses and civic organizations. In fact, IBM Research is to be an anchor tenant in the Tshimologong project. Rather than locate our new laboratory on a university campus or in a corporate office park, we decided to place it in the middle of an urban community–where our scientists are part of the fabric of city life, working alongside students, faculty members and entrepreneurs, taking on projects that are relevant to Africa. Solomon Assefa, a native of Ethiopia who heads up our Johannesburg lab, calls it a “living laboratory for technological and social innovation.”

IBM Research scientist Tierra Bills with Nairobi officials

After we opened our first Africa lab in Nairobi in 2013, the research organization launched Project Lucy, a 10-year, $100 million initiative aimed at bringing IBM Watson and other cognitive technologies to bear on Africa’s problems. In addition, we’re collaborating with South African scientists on elements of the multi-country Square Kilometre Array radio telescope project–whose aim is to unlock the secrets of the origin of the universe.

Johannesburg Rising

Johannesburg has long been the leading commercial capital in Sub-Saharan Africa. Many of the continent’s leading financial services, mining and communications companies are based there. Yet, when it comes to developing a tech economy, it lags some other African cities. Now, Johannesburg is pursuing technology-based economic development aggressively with a number of government- and private-led initiatives. That’s why I chose it as the focal point for this article. What’s going on in this city holds lessons for other cities in Africa and around the globe–and for the multinational companies that seek to expand there.

When Barry Dwolatzky went looking for models for his tech hub, one of his first stops was in Nairobi, where an energetic and diverse tech community has taken root over the past half decade. At the heart of the community is an organization called iHub, a gathering place in downtown Nairobi where tech entrepreneurs and software developers meet, learn, and exchange ideas and business cards. The hub hosts events ranging from hackathons to seminars on how to raise capital. Nairobi now boasts 10 tech incubators and accelerators. More than 150 tech startups have launched there and thousands of tech jobs have been created.

While Barry is impressed by what has happened in Nairobi, he’s not trying to replicate the exact same model in Johannesburg. In the Tshimologong Precinct, in addition to offering meeting and maker spaces, he’s bringing in large tech companies as anchor tenants and partners, starting with IBM and Microsoft. The organization will offer students and startups training in software quality–something Barry believes will be essential if Africans hope to export their innovations to other parts of the world.

Artist’s rendering of Tshimologong Precinct

Government leaders in Johannesburg count Braamfontein as one of a number of new economic development zones that are emerging through the cooperation of entrepreneurs, government, business leaders and universities. Rather than seeing tech entrepreneurship per se as the driving force in each zone, they believe information technology will enable the expansion of a range of industries in other parts of the city–such as sports and tourism.

In fact, Johannesburg is poised to leapfrog in an area that’s critical to economic development of all kinds: broadband networking. The city built its own fiber-optic backbone–which it is now leasing to private companies to run last-mile services on.

The next phase in the expansion of broadband is to roll out a municipal wi-fi network with 1,000 hotspots. Free broadband is essential for enabling innovation ecosystems to pop up throughout the city. Says Zolani Matebese, head of the broadband project: “Our philosophy is, ‘if we build it they will come.’”

Locally-Relevant Innovation

A seismic shift in communications and economic development began after Vodafone and Safaricom, a mobile carrier in Kenya, began offering the now-much-heralded M-Pesa mobile money service in 2007. It quickly grew to become the most widely used mobile money solution in Africa. But, less well known, it also has become both an inspiration and a platform for innovation and entrepreneurship. Dozens of startups have emerged to create applications that run on top of M-Pesa or other mobile-money services.

M-Pesa and its offshoots are examples of locally-relevant innovation. These business models and technology solutions were unlikely to emerge first in the United States and European countries because they answered the particular needs of emerging economies where many people don’t have traditional banking relationships.

African countries and cities face numerous challenges, ranging from healthcare and poverty to water shortages and financial inequities. I expect many of the most effective solutions to these problems to come from the communities that experience them and from the technologists and scientists who interact with those communities.

In South Africa, through discussions with South Africans, we’re setting priorities for our new IBM Research facility. The focus will be on local problems and locally-relevant innovations produced in collaboration with local people and institutions. One example of this approach in action: We’re working with the KwaZulu-Natal Research Institute for Tuberculosis and HIV to discover new diagnostic approaches and treatments for tuberculosis, which is the leading killer in South Africa. We’re combining our expertise in big data analytics with K-RITH’s knowledge about the genetic makeup of the tuberculosis bacteria.

Collaboration

Barry Dwolatzky of the Tshimolongong Precinct, right, with IBM’s Solomon Assefa and Wits University students

When Barry Dwolatzky and his colleagues at Wits University began pulling together plans for the Tshimologong Precinct, they knew that it would of necessity be a group effort. They teamed up with a wide variety of partners, including corporate sponsors and the city and federal government. By inviting in IBM Research, they added a dimension that few technology hubs possess–the ability to bring science to bear directly on real-world problems.

IBM Research has a long track record of cooperating with others. For instance, immediately after the Ebola outbreak in Sierra Leone last year, IBM formed an alliance with the country’s Open Government Initiative, Cambridge University’s Africa Voices project, mobile telecom carrier Airtel and Kenyan startup Echo mobile to provide an SMS crowdsourcing system for gathering up-to-the-minute, on-the-ground information about the path of the disease.

Wits, our university partner in Johannesburg, is one of the leading research universities on the African continent and boasts no fewer than four Nobel Laureates, including two in the sciences, and notably, Nelson Mandela. When I was in South Africa in February, I was invited by Adam Habib, Wits’ visionary principal, to a strategy session outside the city. He spoke passionately to his leadership team about the opportunity for African universities in urban settings to reach out to the communities around them, engage young entrepreneurs, and foster economic development.

While the South African government’s direct role in Tshimologong is relatively modest, it plays an essential role in fostering the ecosystem that everybody hopes will grow up around the hub. The city, in partnership with Wits and a venture capital fund, is staging a city-wide hackathon, the Hack.Jozi Challenge, aimed at fostering start-up activity and problem-solving apps.

South Africa’s government also is playing an increasingly aggressive role in promoting innovation. The legislature has passed a law that makes it easier for angel investors to back small startups. The government recently established a technology innovation promotion agency, it is developing a legal framework for the protection of intellectual property, and it has set up technology-transfer offices at 23 publicly funded universities.

At IBM, we recognize that we won’t succeed long term in Africa unless we and our clients have access to a large pool of people with the skills needed for 21st century knowledge work. So over the next three years we will expand our professional and academic skills-development programs in Africa–to 20 countries. At Wits we have set up a scholarship program that pays tuition, room and board–plus provides mentorships and vacation internships–for underprivileged youngsters from rural areas to study computer science at Wits. The primary focus is on women. We invite other corporations to join with us in this initiative so we can expand its reach.

Passionate Individuals

Even with all the progress in Africa in recent years, many challenges have yet to be overcome. In South Africa, there remains a giant gulf between rich and poor. Nearly 50% of the country’s young people of working age are unemployed. And, meanwhile, there’s a gap between the technology skills that are needed by industry and the expertise of South Africa’s workforce.

A growing tech economy can help address these challenges, but only if the entrepreneurs and organizations in the tech community overcome hurdles of their own. The tech skills gap affects them more than anybody else. There’s a shortage of the right type of venture capital–the kind that backs brand-new companies, and that comes with advisors and mentors attached. Also, according to South African entrepreneurs, local businesses tend to be slow to adopt the latest technologies, making it difficult for startups to get a toehold in the marketplace.

Building large and sustainable tech communities will require a special kind of individual–people with passion and persistence. You see these traits in many of our IBM Research scientists in Kenya and South Africa. The majority of them grew up in Africa, received their university educations and began their careers in the United States or Europe, and have now returned to Africa determined to make a difference.

Abdigani Diriye, IBM Research

One example is Abdigani Diriye, a young scientist at IBM Research – Africa’s Nairobi lab. Amid the chaos of civil war, his family fled his native Somalia when he was just five years old. Three years ago, he received a PhD from the University of London in human-computer interaction, and, last year, he returned to Africa determined to give back. He’s part of a research team focusing on technologies that give Africans access to banking services. Abdigani says: “If you have been fortunate like I have, you have a responsibility to those who don’t have those opportunities–especially those who are just starting out in life.”

I got a taste of the optimism and hopefulness of young Africans during our visit to Egypt in February. I spoke to university students at an innovation park on the outskirts of Cairo. It was truly inspiring. They were so hungry for knowledge, guidance and opportunity.

These are people with the potential to build innovation ecosystems, which could help Africa fulfill its great promise.

With thousands of scientists, engineers, and business leaders focused on cognitive computing across IBM Research and the IBM Watson Group, IBM is pursuing the most comprehensive effort in the tech industry to advance into the new era of computing. Nobody has more people working on it, a broader array of research and development projects, or deeper expertise in so many of the most significant fields of inquiry.

Yet we understand that to accelerate progress in cognitive computing, we can’t do this alone. That’s why IBM has been pursuing a strategy of forming deep collaborative partnerships with academic scientists who are among the leaders in their fields as well as opening Watson as a technology platform for others to build on.

Our newest research collaboration is with Yoshua Bengio and the Montreal Institute for Learning Algorithms (MILA), with its 60 researchers — five professors and a group of elite graduate students. Yoshua, a professor at the University of Montreal and the director of MILA, is recognized globally as one of the leading thinkers in the field of deep learning. Scientists and engineers from both IBM Research and Watson Group will work with him and his colleagues on a number of deep learning projects focusing on language, speech and images.

Deep learning is a branch of A.I. where, increasingly, machines learn from experience rather than requiring extensive manual training by humans. Deep learning is critical to fulfilling the promise of cognitive computing because as such systems acquire knowledge in this way, they’re able to learn more, faster, and with less expense. Those capabilities ultimately help people make better decisions that in turn help businesses, government and society work better.

At IBM, we already exploit deep learning techniques in our language understanding, speech, vision, language translation, dialog and other technologies, which are being put to use in a wide range of industries, from healthcare to retailing. We offer several cognitive computing services to entrepreneurs and developers on our Bluemix cloud development platform. And we also have teams working to scale machine learning applications on server clusters.

IBM scientist John Smith

One of our research projects shows how deep learning can help transform healthcare. Scientists from IBM Research and Memorial Sloan Kettering Cancer Center have developed a system for detecting skin cancer. Rather than requiring scientists to write detailed instructions, we tell the computer what a cancerous lesion looks like and feed it thousands of images to continuously improve its knowledge. Using this technology, physicians or even individuals will be able to photograph spots on skin with smartphones and text them to a cloud-based diagnostics service where computers possessing deep learning capabilities will accurately flag the images that should be examined by a cancer specialist. This is a big deal because the survival rate for melanoma is 98% with early detection.
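To make the contrast between hand-written rules and example-driven learning concrete, here is a minimal Python sketch. It is purely illustrative and is not the melanoma system described above: the two “lesion” features and the synthetic data are invented, and a real deep learning system learns its features directly from raw pixels rather than being handed them.

import numpy as np

rng = np.random.default_rng(0)

# Invented stand-in data: each "lesion" is a 2-number feature vector
# (say, border irregularity and color variance); label 1 = malignant.
n = 200
benign    = rng.normal(loc=[0.3, 0.2], scale=0.1, size=(n, 2))
malignant = rng.normal(loc=[0.7, 0.6], scale=0.1, size=(n, 2))
X = np.vstack([benign, malignant])
y = np.concatenate([np.zeros(n), np.ones(n)])

# A logistic classifier trained by gradient descent: no rules are written;
# the decision boundary is learned from the labeled examples alone.
w, b = np.zeros(2), 0.0
learning_rate = 0.5
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probability of malignancy
    w -= learning_rate * (X.T @ (p - y)) / len(y)
    b -= learning_rate * np.mean(p - y)

# New spots are flagged when their predicted risk crosses a referral threshold.
new_spots = np.array([[0.65, 0.55], [0.25, 0.15]])
risk = 1.0 / (1.0 + np.exp(-(new_spots @ w + b)))
for spot, r in zip(new_spots, risk):
    print(spot, "flag for a specialist" if r > 0.5 else "low risk", round(float(r), 2))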

We already do a lot with deep learning innovation at IBM, but we want to do even more. We believe that the new collaboration with Yoshua Bengio and his group will accelerate progress in this critical area of cognitive computing.

Yoshua and his team have produced advances in the field in the past two years that have expanded the capabilities of deep learning far beyond what scientists would have predicted as recently as five years ago. We will now collaborate with him and his colleagues to push this even further. In speech, for instance, we would like to extend the capabilities of neural networking architectures to model more aspects of speech recognition and generation processes. In the computer vision realm, we’re interested in building systems that better understand what’s happening in videos.

In addition to Yoshua’s expertise and the skills and talents of his colleagues, one of the things that attracted us to them is their commitment to the practice of “open science.” His team contributes all of their deep learning inventions to the Theano project and its derivatives on GitHub. At IBM, we have a proven track record of driving open innovation to accelerate progress, and we’ll continue to leverage these open innovations to design cognitive computing products and services capable of transforming industries and professions. We’ll also add the new capabilities to our IBM Watson open development platform, where students, startups, IBM business partners and clients can build their own cloud-based cognitive computing applications.

I’m convinced that this deep learning collaboration will help us achieve our ultimate goal: combining the strengths of humans and machines to achieve things that neither could do as well on their own.

Humans have long dreamed of creating machines that think. More than 100 years before the first programmable computer was built, inventors wondered whether devices made of rods and gears might become intelligent. And when Alan Turing, one of the pioneers of computing in the 1940s, set a goal for computer science, he described a test, later dubbed the Turing Test, which measured a computer’s performance against the behavior of humans.

In the early days of my academic field, artificial intelligence, scientists tackled problems that were difficult for humans but relatively easy for computers–such as large-scale mathematical calculations. In more recent years, we’re taking on tasks that are easy for people to perform but hard to describe to a machine–tasks humans solve “without thinking,” such as recognizing spoken words or faces in a crowd.

That more difficult quest gave rise to the domain of machine learning, the ability of machines to learn. This is what interests me. It’s not really my goal to make machines that think like humans do. My aim is to understand the fundamental principles that may enable an entity, machine or living being, to be intelligent. I long ago made the bet that this would happen thanks to the ability of such an entity to learn, and my focus is on building machines that can learn and understand the world by themselves, i.e., learn to make sense of it.

The reason I’m laying out this chronology is that I believe we’re at a turning point in the history of artificial intelligence–and, indeed, computing itself. Thanks to more powerful computers, the availability of large and varied datasets, and advances in algorithms, we’re able to cross a threshold that has long held back computer science.

Machine learning is shifting from a highly manual process, in which humans have had to design good representations for each task of interest, to an automated process in which machines learn more like babies do: through experience, building internal representations that help them make sense of the world. This is the field of deep learning.

Deep learning isn’t brand new. Indeed, when I was a student in the 1980s, it was the concept of neural networks, the precursor of deep learning, that got me interested in pursuing an academic career in computer science. What’s new is that the accumulation of many scientific and technical advances has yielded breakthroughs in AI applications such as speech recognition, computer vision, and natural language processing. This has brought into the field a large group of researchers, mostly graduate students, and we’re now making progress in deep learning at a gallop.

We’re able to do that because of advances in creating hierarchies of concepts and representations that computers discover by themselves. The hierarchies allow a computer to learn complicated concepts by building them out of simpler ones. This is also how humans learn and build their understanding of the world; they gradually refine their model of the world to better fit what they observe and discover new ideas from the composition of older ones, new ideas that help them to better fit the evidence, the data.

For example, a deep learning system can represent the concept of an image of a cat by combining simpler concepts, such as corners and contours, which are in turn defined in terms of edges. But we don’t have to teach it explicitly about these intermediate concepts; it learns them on its own. We don’t have to show the system pictures of all the possible cat colors, shapes, and behaviors for it to correctly identify that it is a Siamese cat that’s somersaulting in a photograph. When it “sees” a cat, it “knows” it is one.
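To give a rough, runnable feel for that composition, here is a toy Python sketch. The edge filters are set by hand purely to keep the example short, whereas a real deep network learns them from data; the point is only that a higher-level feature (a corner) is built out of simpler ones (edges).

import numpy as np

# A tiny 5x5 "image" containing a bright square.
img = np.zeros((5, 5))
img[1:4, 1:4] = 1.0

# First level of the hierarchy: edge detectors. In a real deep network these
# filters are learned from data; they are hand-set here for illustration only.
vertical_edge   = np.array([[-1.0,  1.0],
                            [-1.0,  1.0]])   # responds to left/right intensity changes
horizontal_edge = np.array([[-1.0, -1.0],
                            [ 1.0,  1.0]])   # responds to up/down intensity changes

def correlate(image, kernel):
    """Slide the kernel over the image (valid cross-correlation)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

vert = np.abs(correlate(img, vertical_edge))     # where vertical edges appear
horiz = np.abs(correlate(img, horizontal_edge))  # where horizontal edges appear

# Second level: a "corner" is simply a place where a vertical and a horizontal
# edge response coincide, a new concept composed from simpler ones.
corners = vert * horiz
print(corners)   # nonzero entries mark the four corners of the square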

I’m privileged to be part of a troika of computer scientists who are widely credited with spearheading advances in this field–along with Geoffrey Hinton and Yann LeCun. We co-authored a paper, Deep Learning, which was published in the journal Nature in May, where we laid out the promise of our branch of A.I. But this isn’t a field where a few “media stars” are doing all that needs to be done. To produce the advances that are possible and to find applications for them will require thousands of scientists and engineers–in academia and in industry.

That’s why I’ve been dedicated to rallying people to our exciting project. I’m co-authoring a book, Deep Learning, with Ian Goodfellow and Aaron Courville. Our core audiences are university students studying machine learning and software engineers working in a wide variety of industries that are likely to find important uses for it. This book-in-progress is posted on the Web, and we welcome people to read, learn and give us feedback.

Which brings me to another key point: I’m an advocate of open science. Like open source developers, participants in the open science movement believe that we should share knowledge as soon as we gain it to increase the pace at which the boundaries of science are pushed, and for the benefit of all. Many of my research colleagues and I contribute all of our deep learning inventions to the Theano project and its derivatives on GitHub. There, anybody who is building deep learning systems can use the algorithms and programming tools, and we urge them to contribute back to the project: hundreds already do so.

Just as sharing is essential to open science, so is collaboration–the kind that’s done transparently. The whole enterprise of science is a giant brainstorm. The Montreal Institute for Learning Algorithms (MILA), with its 60 researchers, including five professors, contributes to it via numerous collaborative research projects with scientists in universities and industry.

The newest of our collaborative research partners is IBM. We look forward to working with scientists and engineers in IBM Research and the Watson Group on a very ambitious research agenda, including deep learning for language, speech and vision. We believe that, together, we’ll be able to scale up and extend deep learning methods by using powerful computers to take on very large datasets. It will help machines learn more, across broader domains, faster and from a larger set of data sources, including the vast amounts of unlabeled data that have not been curated by humans.

I’m tremendously excited about the future of deep learning. We’ve made rapid progress, and while we’re far from solving the great riddle of what it will take to enable machines to truly understand the world, I’m very hopeful that we’ll crack it.

And then the floodgates will open. Once computers truly understand text, speech, images and sounds, they will become our indispensable assistants. This will revolutionize the way we interact with computers, helping us live more conveniently in our day-to-day lives and perform more effectively at work. It will enable society to take on some of the grand challenges that matter to us–such as curing deadly diseases and spreading knowledge and wealth more broadly. As importantly, it will help us understand who we are and that part of who we are that has always fascinated me, i.e., how intelligence arises. This has been my dream for more than 30 years, and it’s fast becoming our reality.

Working together, we achieved an industry first–producing, at New York’s SUNY NanoTech Complex near Albany, working test chips whose smallest features approach 7 nanometers. As a result, the industry will be able to place more than 20 billion tiny switches on chips the size of a fingernail.

The SUNY NanoTech Complex

The implications of our achievement are huge for the computer industry. By making the chips inside computers more powerful and more efficient, IBM and our partners will be able to produce the next generations of servers and storage systems for cloud computing, big data analytics and cognitive computing.

With this feat by the alliance, we’re extending the life of the silicon semiconductor, one of the most important inventions of the 20th century, which has come to symbolize the seemingly inevitable march of technological progress–the ability to make all sorts of computers and electronic devices faster, smaller and more energy efficient. These advances represent the most significant chip-industry design and manufacturing innovations in nearly a decade.

In recent years, the chip industry has struggled to sustain a torrid pace of semiconductor innovation. Each wave of miniaturization has come only through near-superhuman feats of creativity by scientists and engineers.

IBM has played a critical role in many of these breakthroughs. For example, our scientists led the shift from aluminum wiring to copper to improve processing speeds, the use of silicon-on-insulator technology to reduce power consumption, and the use of high-k materials to reduce leakage of electrical current.

We achieved the latest step improvement, called the “7 nm node” by the chip industry, through a combination of new materials, tools and techniques. In materials, we’re using silicon germanium for the first time in the channels on the chips that conduct electricity. We have employed a new type of lithography in the chip-making process, Extreme Ultraviolet, or EUV, which delivers order-of-magnitude improvements over today’s mainstream optical lithography. All told, we’ve made dozens of design and tooling improvements. It has been a massive effort requiring multiple breakthroughs in science, technology and chip architectures and manufacturing processes.

One of the test chips.

Looking ahead, there’s no clear path to extend the life of the silicon semiconductor further into the future. The next major wave of progress, the 5 nm node, will be even more challenging than the 7 nm node has been.

Society is more than 50 years into the journey of silicon semiconductors, and, thanks to our work on the 7 nm node, the technology still has some runway. Now, we look further into the future and see the opportunity to reinvent computing. Science doesn’t get harder–or more satisfying–than this.

New York’s Lake George is a pristine, 32-mile-long lake in the Adirondack Mountains that is noted for its water quality and clarity. While the lake is very clean, it faces multiple anthropogenic threats, including road salt incursion and several invasive species.

The project involves more than 60 scientists around the world (four IBM Research labs are involved), including biologists, computer scientists, physicists, engineers and chemists. Working as a virtual team, we’re pushing the boundaries in Internet-of-Things sensors, data analytics, and modeling of complex natural systems.

The Jefferson Project, named after Thomas Jefferson, who admired the lake, was launched 1 ½ years ago. But now that we have dozens of sensors deployed and powerful computer systems set up, we are beginning to analyze rich streams of data and to use that data to refine our multiple computer models of the workings of the lake and its watershed.

For starters, we created a state-of-the-art observational system–including sensors deployed on the lake bottom, on floating platforms and in feeder streams. Some of the sensors are “smart.” For instance, our floating platform is capable of detecting when the weather is changing and adjusts the cadence of its monitoring activities so that we can better capture certain events.

The computer modeling system is cutting edge, as well. Typically, scientists create discrete models for different elements of an ecosystem. In this case, we are coupling them–the way nature works. Our weather model feeds into the model that predicts run-off from storms, which feeds into the salt model, which feeds into the lake circulation model, which feeds into the food-chain model. Using data from sensors, we will be able to validate and continuously refine our models over time as we add new sensors and measurement data to the information platform.
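To show the coupling pattern itself, here is a deliberately simple Python sketch. Every function and number in it is an invented placeholder rather than the Jefferson Project’s actual science; the point is only that each model’s output becomes the next model’s input.

# Minimal sketch of the model-coupling pattern described above. All formulas
# and constants are made-up placeholders, not the project's real models.

def weather_model(hours_of_rain):
    """Placeholder: rainfall depth in mm from a storm's duration."""
    return 4.0 * hours_of_rain

def runoff_model(rainfall_mm, impervious_fraction=0.2):
    """Placeholder: storm run-off volume (arbitrary units)."""
    return rainfall_mm * (0.3 + impervious_fraction)

def salt_model(runoff, road_salt_tons=50.0):
    """Placeholder: chloride load washed into the lake."""
    return 0.01 * road_salt_tons * runoff

def circulation_model(chloride_load):
    """Placeholder: near-shore chloride concentration after mixing."""
    return chloride_load / 120.0

def food_chain_model(chloride_concentration):
    """Placeholder: relative stress on sensitive plankton species."""
    return min(1.0, chloride_concentration / 2.5)

def simulate_storm(hours_of_rain):
    rain = weather_model(hours_of_rain)
    runoff = runoff_model(rain)
    chloride = salt_model(runoff)
    concentration = circulation_model(chloride)
    stress = food_chain_model(concentration)
    return {"rain_mm": rain, "chloride_load": chloride, "food_chain_stress": stress}

for hours in (2, 6, 12):
    print(hours, "hour storm ->", simulate_storm(hours))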

The end goal is to be able to run simulations to help predict how events such as heavy storms, road salt run-off, and introduction of new plant or animal species would likely affect the entire natural environment. It’s a holistic approach to ecosystem analysis and management.

What if the state or local department of transportation uses more or less salt on a road within the Lake George watershed, or changes its chemical recipe? What if a storm washes out a new area of land and deposits sediment into the lake? Armed with more and better knowledge from the data and analyses, public officials, business leaders and citizens will be able to make better decisions that could reduce harm to the lake.

It’s an exciting time for our research team. In the coming months, we’ll gather huge amounts of data, deploy new sensors and perform experiments to help us understand how the lake is changing. (Already, we know from past monitoring efforts that chloride inputs from road salts have tripled, algae has increased 33% and five invasive species have been introduced over the past 35 years.)

Eventually, we’ll commercialize the technology–taking on problems in other lakes, rivers, estuaries, coastal areas, ports, and the oceans. Today, large swaths of the globe are facing water crises that already threaten the health of individuals and the economies of nations. Fresh water challenges will continue to spread and disrupt more lives. It may turn out that the research we’re doing now to keep Lake George pristine may be critical to the sustainability of the planet and the survival of people, countless animals and plants living on it. This is the kind of research a scientist wants to work on.

Wendy Hite is a bit of a food snob. She grew up in South West Louisiana, where food and family are all mixed up in the great gumbo of life, and, for the longest time, she couldn’t imagine how she could improve on traditional Cajun-style cooking.

She used the cognitive cooking discovery program to develop a crawfish deviled egg dish that was mighty tasty–familiar, in some ways, but also new to her. “This has been fun,” she says. “It gets you to try new things and to be more creative than you normally would be.”

Wendy is one of the home cooks who participated in the Chef Watson with Bon Appétit beta program. She joined a community of thousands who enjoy experimenting with the Watson app and sharing their adventures in the kitchen with other like-minded people.

The Chef Watson club is about to get bigger. IBM and Bon Appétit have opened up the Web application and Facebook page to anybody who wants to participate. At the same time, they’re unveiling new features and navigation aimed at making it easier for people to interact with Watson to create new dishes and to expand their flavor palates. Here’s a link to Bon Appétit’s story about it.

Wendy’s Cajun Smoked Sausage and Blue Crab Gumbo

Chef Watson, which was invented by scientists at IBM Research, has helped shape the public perception of the potential for cognitive computing. The Chef Watson food truck was a huge hit at the South by Southwest culture festival in Austin last year, and fans have been snapping up the cookbook, Cognitive Cooking with Chef Watson. But the Web application, which pairs Watson’s knowledge of food chemistry and taste preferences with Bon Appétit’s 10,000 recipes and culinary expertise, for the first time enables cooks everywhere to try their hand with Watson.

Most of IBM’s cognitive computing solutions are aimed at transforming industries and professions–everything from healthcare and oncology to financial services and wealth management. Chef Watson demonstrates how smart machines can help people explore the world around them and discover new possibilities and new ways of getting things done, whether it’s finding promising treatment pathways to fight diseases or helping law firms build courtroom strategies by discovering connections between their cases and earlier precedents. It also signals that there will be a wide variety of uses for cognitive technologies designed to help individuals live better and have more fun.

For IBM’s team, the Bon Appétit partnership has provided a sandbox where they can try out new features and get valuable feedback from thousands of regular folks–a rare opportunity for people who normally design software for large enterprises. “We learned a lot,” says Florian Pinel, one of the inventors of Chef Watson who in addition to being a computer scientist has a diploma from the Institute of Culinary Education in New York. “We learned more about how people wanted to interact and the things they wanted to do.”

What’s new in Chef Watson? In the previous experience, users were invited to choose one or two main ingredients, a dish type and a style of cooking. In return, Chef Watson suggested dishes based on more than 10,000 recipes in Bon Appétit’s database combined with its deep knowledge of food chemistry and human taste preferences. In the new experience, users can start by combining a couple of ingredients and, in an “Inspiration Station” on the screen, see suggestions for other ingredients and cooking styles. The new version is more flexible and interactive. Cooks and Chef Watson embark on journeys of discovery together.

Wendy’s Way Down South Wild Shrimp and Grits

“We’re trying to figure out how we can best create an environment for collaboration,” says Jacquelyn Martino, a user experience researcher in the IBM Watson Group who is a Chef Watson power user in her spare time. A couple of weeks ago, she and Watson created a dish that combines salmon, turnips, radishes, Romano cheese, pumpkin seeds and cinnamon. She liked it so much she tweeted about it.

IBM’s team also tinkered with Chef Watson to encourage people to be more adventurous. They found that most people started by choosing tried-and-true pairings such as chicken and garlic or beef and potatoes. So they allotted additional spaces for up to four key ingredients, which, they hope, will entice people to go a little crazy. Now the classic beef and potatoes might be combined with Chef Watson’s suggestions–perhaps pistachios and prunes.

By talking to users and observing their behavior online, the team saw that many cooks wanted to use Chef Watson to help them with common food challenges like developing or adapting recipes to comply with special dietary needs, health preferences and economic factors. So they made it easier to eliminate gluten and sweeteners; to use fruits and vegetables that are available from local producers; and to assemble dishes that make use of ingredients that are already in the refrigerator or pantry–cutting down on waste.

Wendy Hite, who lives in Texas hill country with her husband and three children, has a strong aversion to wasting food. Even before she got her hands on the updated Web application, she used Chef Watson to help her family consume the foodstuffs that most often go to waste in her home–bread and yogurt. One result: she and Chef Watson came up with a bread pudding recipe that included black cherry yogurt. “Normally, I hate yogurt, but this was really good,” she says.

Susan Sink and students

For Susan Sink, of Raleigh, N.C., Chef Watson helps her respond to her calling–which is helping people to eat healthier and more sustainably. As a volunteer for non-profit organizations, she teaches classes in healthy cooking using locally grown food from farmers’ markets.

Susan’s Winter Greens Slaw

The new Chef Watson experience, she says, helps her swap out ingredients that are undesirable for some reason or unavailable from local producers. “One thing I like to do is to take a basic recipe and move it through the four seasons,” she says. For coleslaw, for instance, she might use baby collard greens in the winter, bok choy and turnip greens in the spring, and chard with savoy and purple cabbage in the early summer and fall. She might also rotate through different varieties of carrots, turnips, radishes and kohlrabi to add more color and flavor as they become available. Chef Watson helps her spot new substitutes and uncommon additions, enriching the flavor of common recipes.

Chef Watson can also help people solve that ever-so-common backyard gardener dilemma: figuring out what to do with an overabundance of one vegetable or another. For Wendy Hite last summer, the challenge was dispensing with a bumper crop of ultra-hot habanero peppers. In the Hite household, a few habaneros go a long way. But her husband’s co-workers have proved to be enthusiastic consumers of the dishes she creates with Chef Watson. She made mango-habanero hot sauce. “It passed the co-worker test,” she says.

Now, with Chef Watson available to everybody, you can expect summer gardeners everywhere to find amazing things to do with their excess veggies. Maybe there will be another cookbook in the works.

We live in a dangerous world. You know the threats as well as I do. But we don’t have to live in fear. I’m convinced that technology can help police, corporate security officers, national security agencies and emergency management officials do their jobs better–making people, companies, cities and countries safer.

Situational intelligence is the key to making the world less dangerous. The more we know, the better prepared we are when the worst happens–and the more likely we are to be able to prevent it. To know more, we need to be able to sift through all the evidence to understand what’s happening now, and why, and what’s likely to happen next.

IBM’s Safer Planet portfolio is focused on this challenge–with offerings addressing law enforcement, emergency management, cyber threat intelligence, counter fraud, and defense and security intelligence. At the core of our Safer Planet products and services are technologies that help people gather a tremendous amount of data, make connections between bits of information that don’t at first seem like they’re related, and paint a full picture of a situation or a person.

In fact, I think of our domain as a place where many of the important technology trends of our time converge. The Internet of Things makes it possible to gather detailed real-time information about everything from the people passing through an airport to the physics of the atmosphere above Rio de Janeiro. Big Data analytics makes it easier to ingest and process all of that data. Mobile apps and technologies make it possible for police to gather information and access intelligence wherever they are. Cloud services make it easier to get new apps going and to add capabilities continuously as they become available. And data visualization–through dashboards and other techniques–transforms complex information into actionable insights.

Here’s a taste of some of the things we do:

For law enforcement, we enable organizations of any size to access the world’s largest network of crime data–with more than a billion records–from the cloud. Our technology, which is used by more than 4,000 law enforcement agencies in North America, enables investigators to tap sophisticated analytics and “fuzzy searches” to use partial information such as a portion of a license plate, a tattoo or a nickname to help make connections and accelerate investigations.

For emergency management organizations, we provide technology that features analytics and real-time weather data to help communities predict natural disasters more accurately so they can plan and deploy the right resources. We’re incorporating into our solution weather data and technology made available via our strategic partnership with The Weather Company. Now, organizations can create multiple scenarios of how a major weather event could unfold so they can prepare a set of contingency plans–and be ready to respond quickly no matter what happens.

For corporate information security officers across industries, we provide cyber threat intelligence to help organizations close what we call the cyber security intelligence gap. Because cyber criminals are inventing new exploits faster than organizations can protect against them, organizations must evolve their traditional IT security approach to incorporate an added layer of human insight into the criminals’ strategy, allowing them to more quickly detect threats.

For professionals responsible for fighting fraud and financial crimes, we provide technology that helps commercial organizations and government agencies leverage advanced analytics to detect potential fraud, respond to it in near real time, and incorporate new knowledge about fraud in their defenses. In one project with a client, our technology spotted suspicious activity in 14 percent of their dental insurance claims.

For intelligence agencies, we provide technology that ingests data as it streams from multiple sources and helps them detect non-obvious relationships and patterns. It features a recommendation engine that reconciles seemingly contradictory information–for instance, different names and addresses for the same person.
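Both the “fuzzy searches” used in law enforcement and the reconciliation of seemingly contradictory records described above rest on the same basic idea: scoring imperfect matches instead of demanding exact ones. Here is a deliberately simple Python sketch of that idea; the records are invented and the scoring rule is far cruder than the analytics inside our products.

from difflib import SequenceMatcher

# Invented records for illustration only; these are not real crime data and
# this is not the matching logic inside IBM's products.
records = [
    {"plate": "7ABC123", "tattoo": "rose on left forearm", "alias": "Slim"},
    {"plate": "7XYZ991", "tattoo": "anchor on neck",       "alias": "Tiny"},
    {"plate": "4ABC129", "tattoo": "rose on right hand",   "alias": "Smiley"},
]

def similarity(a, b):
    """Score how closely a partial clue matches a stored field (0..1)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def fuzzy_search(clues, records, threshold=0.5):
    """Rank records by their best field-level match against any clue."""
    hits = []
    for record in records:
        score = max(similarity(clue, value)
                    for clue in clues
                    for value in record.values())
        if score >= threshold:
            hits.append((score, record))
    return sorted(hits, key=lambda hit: hit[0], reverse=True)

# A witness remembers only part of a license plate and a rough tattoo description.
for score, record in fuzzy_search(["ABC12", "rose forearm"], records):
    print(round(score, 2), record["plate"], record["alias"])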

Our technology is put to use in some of the most complex and consequential situations that humans experience–where life-and-death decisions must be made, often on the spot, using incomplete information. We help people find the proverbial needle in the haystack, whether it’s an escaped prisoner on the run or a cyber-criminal network probing the defenses of a company or government agency. We help people address not only the known unknowns, but the unknown unknowns.

Our goal is nothing less than to help transform law enforcement and security. We’ll get better and better at this as we add new cutting-edge capabilities to the mix, including cognitive technologies. And we’ll also expand to new domains and markets. Think about it. In any situation where there’s lots of disparate information, lots of complexity, and a lot of value in understanding what’s really going on, quickly, this kind of intelligence-gathering technology will be extremely valuable. So we’ll be there.

You walk into a room at night and flip the light switch on the wall. The lights come on. You didn’t think twice about that; you were certain it would work. While we’re not at that point everywhere in the world yet, it is true of most industrialized regions that electricity is a highly reliable resource. But the reality behind that simple action of turning on a light switch is a constantly evolving list of uncertainties that utilities deal with 24/7.

Uncertainty takes many forms in the utility industry, from the health of individual devices as they age, to volatility of fuel prices, to the behavior of you, the consumer, and your use of electricity or natural gas. And uncertainty can be equated to risk — the risk of failing to achieve both operational and business objectives. That’s not a risk any business wants to take.

The utility industry does a remarkable job dealing with uncertainty, but there are numerous factors at play that are making it increasingly difficult. If you put uncertainty into mathematical terms, every source of uncertainty represents a new variable in the equations that utilities use to plan and operate their business — from what to charge to how much energy to generate. As the number of variables increases, the techniques used to model and solve those equations have to evolve. Not only does the utility industry have to solve more complex equations with more variables, but it has to do so faster.
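One simple way to see how each added variable compounds the problem is a toy Monte Carlo exercise. The distributions and costs in the Python sketch below are invented placeholders (real utility planning models are far richer); it only shows how introducing one more uncertain quantity, here wind output, widens the range of outcomes a utility must plan for.

import random

random.seed(42)

def simulate_day(uncertain_wind=True):
    """One random day of operations with made-up numbers."""
    demand_mwh = random.gauss(1000, 80)            # uncertain load
    fuel_price = random.gauss(40, 6)               # uncertain $/MWh for gas
    wind_mwh = random.gauss(200, 60) if uncertain_wind else 200.0
    wind_mwh = max(0.0, wind_mwh)
    gas_mwh = max(0.0, demand_mwh - wind_mwh)      # gas generation fills the gap
    return gas_mwh * fuel_price                    # the day's generation cost

def cost_distribution(n=10000, **kwargs):
    costs = sorted(simulate_day(**kwargs) for _ in range(n))
    return costs[n // 2], costs[int(n * 0.95)]     # median and 95th percentile

for label, wind_is_uncertain in [("wind output known", False), ("wind output uncertain", True)]:
    median, p95 = cost_distribution(uncertain_wind=wind_is_uncertain)
    print(f"{label:22s} median ${median:,.0f}   95th percentile ${p95:,.0f}")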

In our discussions with utilities, the IBM Research Smarter Energy team has identified six propositions that represent key factors driving change that contributes to uncertainty in the utility industry:

1. “Distributed” is the keyword for the new utility system: the generation and storage of energy, and intelligence about it, are all becoming more distributed. This can have benefits such as fewer critical points of failure, leading to more resilient infrastructure, but it will also lead to new business models such as those being discussed in New York State as part of the Public Service Commission’s “Reforming the Energy Vision” initiative, a strategic approach to developing a clean, reliable, and affordable energy system for all New Yorkers.

2. The utility system is increasingly instrumented and intelligent: utilities are becoming “data rich,” but we have to put all that data to use in an effective way in order to have a positive impact on operational and business performance.

3. Energy cost is increasingly based on time and place of use: we need to be on the lookout for ways to more rapidly engage and influence when and how customers use energy.

4. Renewable energy sources are becoming cost competitive: the cost of photovoltaic electricity, generated by converting solar energy into direct current, has fallen below 10 U.S. cents per kilowatt-hour, which is lower than the cost of grid-supplied electricity in some regions.

5. Renewable mandates are accelerating investments: environmental policies, in the form of regulatory mandates, are driving further investment in renewable supply technologies. This week for example, the White House hosted a Clean Energy Investment Summit highlighting more than $4 billion in private sector commitments to scale up investment in clean energy innovation.

6. Uncertainty will only continue to increase: weather impacts, demand, consumer behavior, fuel costs, policy changes, and disruptive technologies are all contributing factors to a growing number of risks that utilities are facing.

At the 3rd annual IBM Smarter Energy Research Institute (SERI) Conference taking place this week at IBM’s T.J. Watson Research Center in New York, uncertainty and how to quantify and manage it will be the main topic of discussion, with nearly 30 utilities (including our SERI partners) and government and academic organizations attending. Those discussions and, more importantly, the collaboration with our SERI partners to solve these challenges going forward have important implications for the utility industry.