Visualization is the study of the transformation of data to visual representations. These visual elements are then used to gain insight into and from the data. In the 30 years since the landmark “Visualization in Scientific Computing” report in which the National Science Foundation Panel on Graphics, Image Processing, and…

Universal media access, as proposed and discussed in the late 1990s and early 2000s, is now a reality. It is very easy to generate, distribute, share, and consume any media content, anywhere, anytime, and on any device. Most real-time entertainment services (streaming audio and video) are typically deployed…

One of the major advantages of the Internet of Things (IoT) is its massive data-gathering capability. IoT data, when complemented with relevant contextual information, can support business decisions with accurate, dependable, relevant, and timely information for enabling predictive, prescriptive, and other forms of analytics. Improving IoT systems through context-awareness is one…

Organizations and companies have been using basic data analytics for years to uncover simple insights and trends. The appetite for more data and better analytics has grown over the years, and now most modern organizations track and record nearly all types of data: transactional, clickstream, social media, audio, video, sensor,…

Virtual reality (VR), augmented reality (AR), and mixed reality (MR) systems present users with realistic simulations. The most common application is gaming, but did you know that VR/AR/MR can also create virtual musical instruments (VMIs)? VMIs may or may not mimic traditional physical musical instruments. Physical instruments can produce beautiful…

Microelectronic technology’s ongoing scaling according to Moore’s Law enables increasing microprocessor performance and complexity, paving the way for innovative applications that were unthinkable just a few years ago. Today, we are surrounded by electronic devices that exchange data with one another via the Internet, creating the Internet of Things (IoT).…

Nobel Prize-winning economist Herbert Simon famously described the “poverty of attention” that accompanies an overabundance of information. This concept certainly applies to today’s Big Data era. As our capacity to generate, store, and process vast amounts of data increases, we need new techniques to derive reliable and actionable insights from…

In the emerging Internet of Things (IoT) era, defined by ubiquitous networks of interconnected and Internet-enabled objects, digital technology fluency is more valuable than ever. At the core of IoT are microcontrollers, sensors, actuators, and MEMS. Groundbreaking innovation requires an understanding of the concepts behind these devices. As Albrecht Schmidt…

In December 2016, the IEEE Computer Society released its annual vision of the future of technology. “Self-driving cars” is near the top of the list of trends that will reach adoption in 2017. This probably isn’t surprising to most of us, given the substantial media attention and hype already surrounding self-driving…

With the increasing amount and importance of data that modern information systems maintain, security and privacy breaches pose an increasing threat to users, as well as system designers and operators. A key cybersecurity challenge is human error. According to a 2015 US Department of Defense report, accidental misuse caused by poor…

The Big Data era promises some significant advancements but also introduces many new challenges. The memory and processing power required for storing and analyzing that data will quickly surpass what our current infrastructure can provide. Massive data centers are expensive, and trends indicate that both demand and cost will continue…

Growing components of our global energy ecosystem, renewable energy sources generated approximately 23 percent of total worldwide electricity in 2015, according to the International Energy Agency. After hydro- and wind power, solar power — or photovoltaic (PV) energy — is the third-largest renewable energy source. According to SolarPower Europe, PV…

We asked some experts in the field to share their responses to a few fundamental questions about IoT interoperability. Matus Maar, Oscar Lazaro, Roberto Minerva, Sebastián Pantoja, Laurent Belmon, and Yossi Dan provide varying perspectives on the future of IoT interoperability in industry.

Bios

Matus Maar is a co-founder and partner…

The Internet of Things (IoT) is an omnipresent network of physical and virtual objects and resources that are equipped with sensing, computing, actuating, and communication capabilities. Yet, although billions of these objects have already been deployed, something is slowing IoT solution adoption: device incompatibility. The original IoT vision involves a…

Most of today’s Internet users have at least two personal devices that they use to create, share, and consume multimedia content anywhere and anytime. Both the amount of content and the amount of time people spend viewing it have increased significantly in recent years. It’s estimated that by 2019 it…

Service robotics has long captivated Hollywood, with many robot characters appearing in science-fiction films and television series. In the 1950s, for example, the film Forbidden Planet featured a servant named Robby the Robot. Fifty years later, Robin Williams played a domestic-household android robot in Bicentennial Man. In Disney Pixar’s 2008 WALL·E, the protagonist is…

One of the most valuable aspects of the emerging Internet of Things (IoT) is the data it produces. Businesses use that data to support their decisions, and — as IoT grows — they need better tools for relevant and timely discovery. These days, discovery systems can find the right data…

Massive open online courses (MOOCs) are web-based classes with no limit on student enrollment, allowing thousands of remote students to take the same course simultaneously. Although MOOCs haven’t sparked the revolution in education that many anticipated, they have certainly provided learners from around the world with opportunities that would otherwise…

In the introduction to the September 2015 issue of Computing Now, I discussed how as microelectronics shrink, integrated circuit (IC) performance and complexity increase — leading to vulnerability to several phenomena and thus threatening reliability in the field. That issue explored one phenomenon, in particular: bias temperature instability. Yet, several other challenges are…

The security-and-privacy world is changing rapidly. We've long thought about security and privacy as two equivalent goals, picturing a world in which both were equally achievable and could exist simultaneously. However, we've learned that security and privacy are sometimes mutually exclusive. For example, after the 2015 terrorist attack in San Bernardino, California, the US Federal Bureau of Investigation clashed with Apple over access to the shooter's iPhone.

The IEEE Computer Society’s 2022 Report, which was released through the Computing Now site in 2014, presents insights from tech leaders to explore what our world might look like in 2022. Among its findings, the report predicts an integrated network of smart devices, which it calls “seamless intelligence,” that will be…

We edited a special issue on “Software Engineering for Big Data Systems” for the March/April 2016 issue of IEEE Software magazine. The issue focused on big data’s implications for software engineering and five categories of design requirements for building such systems: pervasive distribution; write-heavy workloads; variable request workloads; computation-intensive analytics; and high…

This issue of Computing Now presents highlights of the 2015 IEEE Visualization (VIS) conference, held from 25 through 30 October in Chicago. As our 2014 and 2015 Computing Now theme articles have shown, the field of visualization continues to grow as data science advances across many arenas. For the first time, VIS 2015 featured sessions…

Electronic systems are increasingly used in safety-critical applications, not only in “traditional” domains (such as aerospace) that have employed fault tolerance for decades but also in new domains in which embedded systems are common, such as the automotive, biomedical, and telecommunications domains. In most cases, it’s important to detect possible…

Augmented reality is a formidable method of presenting information. Its in-situ nature enables the presentation of just-in-time information and data visualization in the context of physical objects and locations. Virtual reality lets users view and explore environments that are literally out of their reach. Both augmented and virtual realities can…

Software, as they say, is eating the world. From personal computers to our cell phones, cars, and vacuum cleaners, software is increasingly growing in terms of both the number of places we find it and the functionality we expect of it. Unfortunately, like most of us, as software matures and…

As what the ITU has called “the foundation of a global infrastructure for the information society,” the Internet of Things (IoT) is now close to the peak of its evolution. By 2020, it’s estimated that there will be 10 connected IoT devices for every person in the world — 40 to 80…

Analytics is generally defined as the discovery and multimodal communication of meaningful patterns in data that can be stored locally or in the cloud. Mobile analytics emphasizes lightweight discovery on mobile devices, computation balancing, and communication of meaningful patterns in sensed, observed, discovered, and computed data while on the move.…

According to Moore’s law, the scaling of microelectronic technology allows for increasing system performance and complexity, thus paving the way for innovative applications that were unthinkable just a few years earlier. However, that same shrinking feature size increases the vulnerability of integrated circuits (ICs) to aging phenomena such as bias…

During the Renaissance era, workshops included a headmaster together with apprentices, wageworkers, assistants, teachers, and guests. They created a broad range of commissioned products ranging from great cycles of frescoes and majestic altarpieces to small, ordinary objects. These workshops often featured multiple threads based on precise divisions of labor to…

The definition of the Internet of Things (IoT) is somewhat elusive because it refers to so many disciplines, technologies, and application domains. Essentially, the IoT envisions systems of networked sensors and smart objects that work together to make an environment intelligent, usable, and programmable. IoT technology encompasses sensors, circuits, embedded…

In the past half century, driven by rapid, phenomenal advances in microelectronics closely following Moore’s law, computers of different kinds, forms, and shapes have evolved, redefined, and transformed almost everything we deal with. However, they still function on the same fundamental computational principles that Charles Babbage and Alan Turing envisaged and…

Industry reports indicate that digitally generated data is doubling roughly every year. This isn’t an evolution but rather a disruptive change that presents a new set of opportunities and problems to the storage industry and researchers. We’re already experiencing the next paradigm in data processing as enterprises and individuals adapt…

For this issue of Computing Now, we gathered a set of articles that exemplifies the latest developments in computer-generated visualization. In this “big data” era, more and more people appreciate the importance of visualization in observing, interpreting, and analyzing data, as well as in communicating and disseminating the discovered findings…

Virtual communities take on many different forms — from the explicit, by-design communities that form discussion groups and forums to the implicit groups that develop organically in software development, online games, and the like. As technologies grow and change, easing some aspects of communication and neglecting others, these communities’ character…

Entertainment is usually associated with the idea of doing something we enjoy — something we can do alone or with others to amuse ourselves, to have fun in our leisure time, or perhaps something relaxing, or that can make us laugh. Yet, ways to entertain and be entertained have changed…

Sharing data on online social networks (OSNs) has become an important part of everyday life for a wide majority of citizens worldwide. OSN users share myriads of volunteered data (such as photos, videos, text messages, Web queries, and likes) and are observed by a variety of Web services (through various means, including browser cookies and ad trackers), which record this activity as vast amounts of behavioral data. From both the volunteered and the observed data, many online service providers can automatically infer new information and build user-profile models that they sell to third parties, which is the core of their current business model.

Nearly 20 years after the 1st workshop on Agent Theories, Architectures, and Languages (ATAL’94) at ECAI’94, which many regard as the starting point of intensive agent systems research, we asked some of the most prominent and active researchers in the field to share their views on a few fundamental questions…

The ability to dynamically reconfigure hardware is an idea with powerful possibilities, but it’s a concept that has been difficult to implement. Reconfigurability can be implemented in various ad hoc ways as well as with field-programmable gate arrays (FPGAs) and coarse-grained reconfigurable arrays (CGRAs). These chips provide substantial computing…

Rising energy demands and the growing negative environmental impact from the increased adoption of IT services are motivating the green movement in IT, which places great importance on the design and implementation of green solutions. Green IT is applicable to a range of high-tech domains, including datacenters, mobile computing, and…

Human-computer interaction (HCI) is a multidisciplinary research area focused on interaction modalities between humans and computers; sometimes, the more general term human-machine interface (HMI) is used to refer to the user interface in a manufacturing or process-control system. In other words, the HCI discipline investigates and tackles all issues related…

Several converging and complementary factors are driving the further ascension of the cloud (cloud computing). The increasing maturity of cloud technologies and offerings coupled with users’ greater awareness of the cloud’s potential benefits (as well as limitations) is accelerating the cloud’s adoption. Better Internet connectivity, intense competition among cloud service providers…

In a mere 25 years, the Web has irrevocably transformed the world. It has become indispensable, impacting nearly every aspect of human activity in practically all fields. It continues to leap ahead, offering new capabilities and extending its reach and utility. The Web has indeed become the most influential technology…

About 540 million years ago, a vast multitude of species suddenly appeared in the fossil record, along with major diversification of organisms. Paleontologists call this event the “Cambrian explosion.” We are currently witnessing the computing world’s version of the Cambrian explosion. In the past decade, advances in sensing capabilities, screen…

For computational simulations, the era of “big data” ended before it began. We’re actually living in the era of infinite data — in which the stream pouring forth from computational models and simulations can be as voluminous as outputting every value at every timestep, drowning disks and researchers alike with high-cadence, arbitrarily…
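
A common response to this flood is in-situ reduction: the running simulation keeps only a coarse temporal sample plus summary statistics, so the full-cadence stream never reaches disk. The sketch below is illustrative only; the stride and the choice of statistics are invented assumptions, not a specific simulation framework's approach.

```python
# Minimal sketch of in-situ data reduction: instead of writing every value at
# every timestep, keep every `stride`-th value plus running summary statistics.
def reduce_stream(values, stride=10):
    """Return a subsampled series and min/max/mean over the full stream."""
    kept = values[::stride]
    stats = {
        "min": min(values),
        "max": max(values),
        "mean": sum(values) / len(values),
    }
    return kept, stats

timesteps = list(range(100))       # stand-in for 100 timesteps of output
kept, stats = reduce_stream(timesteps)
print(len(kept))        # 10 values kept instead of 100
print(stats["mean"])    # 49.5
```

In a real pipeline the reduction runs inside the simulation loop, so the full-resolution data exists only transiently in memory.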

Semantics represents the backbone of several of today’s technological trends and visions, from the Internet of Things (IoT) and Web of Things (WoT) paradigms to the Open Data and Linked Data initiatives. Moreover, it is increasingly advertised as a key enabling technology in a growing number of application domains. In the past ten years, we at Polytechnic University…

The Internet of Things (IoT) is an extension to the current Internet that enables connections and communication among physical objects and devices (see the September 2013 Computing Now theme for more on IoT and its role in ubiquitous sensing). Estimates suggest that there will be 50 billion devices and people connected…

Big data continues to grow exponentially, and surveillance video has become the largest source. Against that backdrop, this issue of Computing Now presents five articles from the IEEE Computer Society Digital Library focused on research activities related to surveillance video. It also includes some related references on how to compress…

For this issue of Computing Now, we gathered a set of articles that exemplifies current trends in computer-generated visualization. The field of visualization was benchmarked in 1987 with a landmark report, entitled “Visualization in Scientific Computing,” which was prepared by the (US) National Science Foundation (NSF) Panel on Graphics, Image Processing, and…

Microelectronic technology’s ongoing scaling according to Moore’s law enables us to keep increasing microprocessor performance and complexity, thus paving the way for innovative applications that were unthinkable just a few years before. However, that same shrinking feature size poses new challenges for testing and reliability of high-performance microprocessors. The December…

Emphasis on autonomous systems has been growing steadily in recent years as users and organizations increasingly require software systems to deliberate on their behalf, to face unpredictable situations without human assistance, and to autonomously find solutions to complex problems. Whatever the sort of autonomy that’s required, one technology has autonomy…

Multimedia has become pervasive in the past decade thanks to easy creation, delivery, and consumption through a vast number of devices and platforms. However, the multimedia ecosystems enabling the services we use in our daily lives can be quite complex and typically involve multiple parties, ranging from various providers (network/content/service)…

A pillar of the Future Internet, the Internet of Things (IoT) will comprise many billions of Internet-connected objects (ICOs) or “things” that can sense, communicate, compute, and potentially actuate, as well as have intelligence, multimodal interfaces, physical/virtual identities, and attributes. The IoT incorporates concepts from pervasive, ubiquitous, and ambient computing,…

The trend for the ’90s and a large portion of the early 2000s was commodity hardware — items that were widely available and relatively inexpensive — with the x86 architecture dominating the market. Since then, the trend has been toward specialized hardware, in which a hardware platform is designed and built for…

Recent advances in technology and a corresponding reduction in costs are driving growth in research into indoor positioning and navigation. In 2011, IT Professional magazine dedicated a very interesting special issue to “Real-Time Location Systems and RFID,” guest edited by J. Morris Chang, Yo-Ping Huang, and Simon Liu. This video by Prof.…

Rather than treat grid, cloud, and high-performance computing (HPC) as separate and distinct approaches, this month’s CN theme focuses more on interoperability among these methodologies — and the issues that arise along the way.

Working in Harmony

The first article in our theme, “From Meta-Computing to Interoperable Infrastructures” by Stelios…

Climate change is a reality, and its main cause is manmade greenhouse gas (GHG) emissions, most notably carbon dioxide (CO2). IT professionals and the IT industry are now called upon not only to make IT systems and their work practices greener but also to harness IT’s power to address the…

The only component of the data center that continues to grow in size and number, storage is a fascinating area of computing. Enterprise IT leaders continue to seek maximum efficiency in organizing and operating their data centers. As IT has become the business technology of modern enterprises, the unavailability of…

I’ve had a multifaceted career in computing — a term that I note is different from the discipline commonly known as computer science. I began as a software developer/engineer, working in the areas of printing/publishing and telecommunications equipment. After completing my doctoral studies and “making the transition” as a postdoctoral scientist,…

“We have met the enemy and he is us.” —Walt Kelly, Pogo

Over the past few decades, security research has garnered increasing attention and funding. Despite much effort, however, current security practice conveys an ad hoc flavor — find a bug; patch it; find the next bug; and so on. This…

In the past, users generally consumed multimedia content in a passive manner without any interaction. Today, universal access to multimedia is technically feasible anywhere, anytime, and with any device thanks to the evolution of and investments in networking infrastructure, which have dramatically increased the available bandwidth. A side effect of…

I consider myself a bit of a language junkie, although I’m more properly termed a languages person trapped in a systems researcher’s body. In the early part of my career, I worked with a colleague at Argonne National Laboratory on compilers (and tools for creating compilers) for experimental object-oriented languages.…

Emerging markets — nations in the process of rapid growth, industrialization, and socioeconomic development — are the world’s new powerhouses. They represent two-thirds of the global population, generate more than 20 percent of its gross domestic product, and are restructuring themselves to foster further growth and development. Although these markets…

High performance computing is no longer limited to those who own supercomputers. HPC’s democratization has been driven particularly by cloud computing, which has given scientists access to supercomputing-like features at the cost of a few dollars per hour. The four articles I’ve selected for this month’s Computing Now theme highlight…

Augmented Reality (AR) is rapidly becoming one of the best-known buzzwords associated with future user interfaces. Its name recognition has accelerated further over recent months, thanks to the announcement of Google’s Project Glass, whose eyewear display prototype the popular press often incorrectly refers to as exemplifying AR. But what does…

Numerous headlines from the past few years have pointed out challenges attracting, motivating, retaining, and graduating STEM majors, thus leading to worrying shortfalls in these professional areas:

- After the tsunami hit Japan on 11 March 2011, in-car navigation systems were the main source of information for those seeking passable roads…

Numerous headlines from the past few years have pointed out challenges attracting, motivating, retaining, and graduating STEM majors, thus leading to worrying shortfalls in these professional areas:

- In “Why Would-Be Engineers End Up as English Majors,” CNN’s Assia Boundaoui commented that “science and math programs are designed and taught to…

Recent reports from Amazon Web Services (AWS) indicate that the company’s S3 storage service will soon have more than a trillion objects in storage and be capable of handling a million requests per second. Clearly, we are living through an era of transformation in storage system architectures designed to deliver continuously scalable…

It’s very likely that you’ve been on camera from the moment you left home today — recorded as you rode in the elevator, walked on the street, bought coffee at the local deli, withdrew money, and as you’ve moved throughout your office building. While you’re at work, cameras might be…

Cloud computing has recently been the focus of much excitement in the IT community. For decades, when organizations needed to increase their data and computation capacity, they had two options: purchase more hardware if the budget permitted, or make the IT operation more efficient and lean (but this limited the…

Where does technology come from? Can you teach people to be entrepreneurial? What can we do as a society to encourage and foster innovation? As software completely transforms the business world, what can we do to harness and channel the creative power that it unleashes? This special issue of Computing…

The smart grid is a profound transformation in how electricity is received, used, and distributed that will play out over the next several decades. This change will affect virtually every aspect of human life and the environment.

The sensing capabilities of the infrastructure and devices surrounding our daily lives are improving and becoming more affordable by the day. Office buildings, transport infrastructure, and homes are increasingly instrumented with smart devices that can detect human presence and environmental conditions. In this month’s theme, we focus on the topic…

Process mining research started in the late ’90s but has only recently appeared on the radar for business intelligence practitioners. Process mining lets users and managers look inside processes, providing valuable insights for process improvement and compliance checking. On 7 October 2011, the IEEE Task Force on Process Mining released…
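
Looking inside a process starts from an event log: one ordered trace of activities per case. As a minimal sketch (the log and activity names here are invented for illustration), the first step of many process-discovery algorithms counts the "directly-follows" relation between activities, from which a process model can then be built:

```python
# Minimal sketch of the first stage of process discovery: counting how often
# activity a is immediately followed by activity b across all traces.
from collections import Counter

def directly_follows(event_log):
    """Build the directly-follows relation from a list of traces."""
    pairs = Counter()
    for trace in event_log:
        for a, b in zip(trace, trace[1:]):
            pairs[(a, b)] += 1
    return pairs

# Hypothetical log: three cases of a simple request-handling process.
log = [
    ["register", "check", "approve", "archive"],
    ["register", "check", "reject", "archive"],
    ["register", "check", "approve", "archive"],
]
graph = directly_follows(log)
print(graph[("register", "check")])  # 3
print(graph[("check", "approve")])   # 2
```

Algorithms such as the alpha miner start from exactly this relation to infer ordering, choice, and concurrency in the underlying process.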

Video streaming over the Internet has become omnipresent. Content providers such as Netflix, Hulu, Apple, and Vudu don’t deploy their own delivery infrastructure, but use existing Internet distribution means to deliver their services. This streaming approach works surprisingly well without any particular support from the underlying network, even in heterogeneous…
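
The reason this works without network support is that adaptation happens entirely in the client: it measures recent download throughput and requests the highest-bitrate rendition that fits. A minimal sketch of that selection step, with an invented bitrate ladder and safety margin rather than any provider's actual values:

```python
# Minimal sketch of client-side adaptive bitrate selection, as used in
# HTTP adaptive streaming. Ladder and margin are illustrative assumptions.
BITRATES_KBPS = [235, 750, 1750, 4300, 5800]  # hypothetical rendition ladder

def select_bitrate(measured_throughput_kbps, margin=0.8):
    """Pick the highest rendition at or below a safety fraction of throughput."""
    budget = measured_throughput_kbps * margin
    candidates = [b for b in BITRATES_KBPS if b <= budget]
    return candidates[-1] if candidates else BITRATES_KBPS[0]

print(select_bitrate(3000))  # budget 2400 kbps -> picks 1750
print(select_bitrate(100))   # below the lowest rung -> picks 235
```

Real players also factor in buffer occupancy and smooth the throughput estimate, but the core idea is this per-segment, client-side decision.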

The development of mobile applications that can run across multiple heterogeneous devices is challenging. Not only do mobile devices differ considerably at the hardware level, but the software development environments are also very different. This month’s theme highlights some of the issues and challenges involved in this active area of…

Service-oriented computing has been the focus of tremendous research and technology development over the last 10 years and is now acknowledged as a central paradigm for Internet computing. In this context, researchers have looked at service composition from two complementary points of view: service orchestration and choreography. Although several software…

The race toward the fastest computer has become more global than ever, as witnessed by the list of the world’s top 500 supercomputers. Only two years ago, the three top supercomputers were from the US Department of Energy (DoE). The list is updated every six months; over the following year and…

When I recently met with Lawrence Miller, author of Barbarians to Bureaucrats, he relayed his premise that the Industrial Revolution basically took humans — who are “community” or “pack” animals by nature — and isolated them in front of machines. As a result, people were facing simple machines with redundant tasks…

The July/August issue of IEEE Software looks at the software business from the perspective of software engineering, true to its mission. But the IEEE Computer Society family of magazines offers the opportunity to look at this topic through the lens of many different perspectives, offering a more complete vision to the reader.…

“Cyber” is all the rage today. Many topics fall under the “cyber-” umbrella: cyberspace, cybersociety, cyberculture, cybersecurity, cyberpunk, cyberterrorism, cyberinfrastructure, cyberart, cyberwar, cyberdefense, cyberoffense, cyberattack, cyberexploitation, cybercrime, and many more. From the perspective of the IEEE Computer Society, one of the most interesting (and challenging) of these is cybersecurity. This…

As the world’s climate heats up and more people become concerned about the environment, a new spotlight appears on information technology. IT affects our environment in many ways, but most people — including many IT professionals — don’t realize this. Each stage of a computer’s life, from production and use…

Nowhere has the impact of entrepreneurship and innovation been more dramatic and significant than in computing, engineering, and information technology. Indeed, high-tech businesses have long relied on innovative change as a key factor to their very survival. Simply put, companies that don’t innovate don’t survive. The entrepreneurial spirit has been…

The computer industry has long treated storage as a peripheral element. Now, data and data storage are attracting the most attention in systems design. Computer engineers invented the storage hierarchy to balance performance with cost per byte processed and stored. Additionally, structuring data helps the computing thread reach the data…

In late 2008, I had the pleasure of working with Steve Gottlieb and Volodymyr Kindratenko on a special issue of Computing in Science and Engineering magazine. Entitled “Novel Architectures,” the issue discussed how to incorporate accelerators into one’s arsenal of programming techniques. Having spent most of my life working on more “conventional”…

Services computing cuts across various disciplines to cover the science and technology of bridging the gap between business services and IT services. The underlying technology suite includes Web services and service-oriented architecture (SOA), cloud computing, business consulting methodology and utilities, and business process modeling, transformation, and integration. In terms of…

It seems like just yesterday that open source was the new craze that was going to sweep change and revolution into the software industry. Like all new tech trends, it was amorphous—it was everything from a methodology to tools to a social movement. It was loved and hated. And as…

MPEG technologies are well established with respect to audio and video coding. New activities built on these basic standards aim to develop applications that handle MPEG multimedia content; some such applications are already a reality. This month’s theme presents some of the latest research results around this idea. These articles discuss…

In commenting on Computer’s September 2009 issue in his article “Really Rethinking Formal Methods,” David Parnas challenges some of the core claims and assumptions underlying mathematical approaches to software modeling, model-checking, and related formal methods. He notes that there is a significant gap between formal methods…

Hardware security and trust issues span a broad spectrum of topics, including the malicious insertion of hardware Trojans designed to act as silicon time bombs by enabling chips upon fabrication and disabling them upon tampering, intellectual property (IP) and integrated circuit (IC) piracy, digital rights management, untrusted third-party IP cores,…

Context is the unstated actor in human communications, actions, and situations. It makes our communications efficient, our commands actionable, and our situations understandable by those—devices, people, or organizations—seeking to provide us with content or services. The advent of context-aware computing is therefore concomitant with the increased embedding of technology into…

Faces are an important communication vector. Through facial expressions, gaze behaviors, and head movements, faces convey information on not only a person’s emotional state and attitudes, but also on discursive, pragmatic, and syntactic elements. Expressions result from subtle muscular contractions and wrinkle formations, and we perceive them through a complex…

The flexibility possible with software-defined radios (SDRs) is key to the future of wireless communication systems. Prior generations of wireless devices relied on highly customized, application-specific hardware with little emphasis on future-proofing or adaptation to new standards. This design approach generally yielded power- and performance-optimized solutions at the expense of…

It isn’t hard to find articles and white papers ranting about the coming shortage of IT talent. Some of these discourses focus on the causes and symptoms of the ever-shrinking IT workforce, and many extol the need to expand the pool of proficient IT talent. Suggestions include increasing the student…

Do you think “agile architecture” is a contradiction in terms? Well, perhaps not. Architecture should be seen not as immutable but as an asset to reevaluate at each iteration, in close collaboration between architects and developers. But in practice, how do you really use agile methods at the level of…

Businesses’ data storage needs are always increasing, and nothing indicates that we will be storing less data in the future. On the contrary, everything points to an era where data management will take center stage in computing. Data loss is a disaster and can impose significant costs on consumers and…

This month, Computing Now explores biometrics—the authentication of persons based on physical or behavioral characteristics. Biometric technology can be found in many aspects of our daily life, from paying for groceries to accessing personal computers and buildings to automatically labeling and organizing digital pictures.

Historical Background

Biometrics has a long…

In Computing Now this month, we take a broad look at prospects for computing in 2010, focusing on technology, policy, and applications. We’ve included several essays from the “Internet Predictions” issue of IEEE Internet Computing (Jan/Feb 2010), in which invited experts share their thoughts about the prospects for the Internet in the…

Multimedia semantics is more than developing ontologies to describe the nature of multimedia content. It’s the key research area for interoperable, intelligent access to and management of multimedia materials. There are many metadata standards. More than 10 organizations vie for leadership in content description, including the Dublin Core Metadata Initiative,…

With the wide adoption of Web and mobile technologies and with the virtualization of many facets of everyday life as a backdrop, social computing takes a computational approach to the study and modeling of social interactions and communications. It also encompasses the development of information and communications technologies (ICTs) supporting…

System virtualization is a method for executing applications in which the applications are installed in and executed by a software representation of a real computer called a system virtual machine. System VMs, in turn, run on top of a software layer called the hypervisor. System virtualization is the underpinning for a number…

Cloud computing is one of today’s most-discussed technical topics. Despite the relative decline of grid computing and unfulfilled promises of utility computing, cloud computing appears to be catching on in both industry and academia. Compared to its predecessors, cloud computing seems to be better positioned in terms of economic viability,…

The boundaries between the real and virtual worlds are breaking down: computer-generated (CG) characters and scenes in movies engage and convince us, and our kids are as comfortable interacting with graphical environments and characters as they are with their own real-world friends and families. This trend is not only true…

Computer science practitioners deal with ethical challenges daily. How they deal with these challenges has ramifications that go beyond personal responsibility; often, the health, safety, and welfare of the public are at stake. This month, Computing Now covers professional ethics, which concerns what computing professionals should and shouldn’t do in…

The Internet has evolved from a dial-up command line childhood into a broadband, visually rich adolescence. However, in many ways it’s still handicapped by its laissez-faire heritage. Video—the most powerful visual multimedia format—requires significantly higher bandwidth, quality of service, reliability, scalability, and security than the Internet’s best-effort legacy might be able…

When asked to put together a special issue for Computing Now’s first anniversary, I must confess that a dozen or so stray thoughts were circling in my head. To be entrusted with this task as a member of a highly capable and visionary board was, indeed, a…

In recent years, trust in computing has been receiving increased attention. With the emphasis on loosely coupled and decentralized systems and the advent of service orientation, trust management has moved beyond the domains of security, multiagent systems, and e-commerce to become a key concern across all aspects of computing. However,…

Serious games use video game technologies to simulate realistic situations, providing valuable experience that can support discovery and exploration while saving money and lives. Serious games have been used for many purposes, including flight and vehicle simulation, scientific simulation and visualization, industrial and military training, medical and health training, education,…