The hyperscalers of the world have to deal with dataset sizes – both streaming and at rest – and real-time processing requirements that put them into an entirely different class of computing. They are constantly inventing and reinventing what they do in compute, storage, and networking not just because they enjoy the intellectual challenge, but because they have swelling customer bases that hammer on their systems so hard they can break them.

Since the ’50s, scientists have chased the promise of clean energy from sun-like fusion reactions between deuterium and tritium, heavy isotopes of hydrogen. This carbon-free energy, achieved at temperatures of 360 million degrees Fahrenheit, would offer a great way to heat water and, in turn, spin turbines to create countless kilowatts of electricity.

While the large majority of business leaders agree that AI capabilities give businesses a distinct advantage over the competition, many smaller businesses are still holding back. After all, AI comes with a big price tag and requires a specific skill set and job training that small businesses simply cannot afford.

The last couple of years have seen cloud computing gradually build some legitimacy within the HPC world, but still the HPC industry lies far behind enterprise IT in its willingness to outsource computational power. The most often touted reason for this is cost – but such a simple description hides a series of more interesting causes for the lukewarm relationship the HPC community has with public cloud providers. Here, we explore how things stand in 2018 – and more importantly, what the cloud vendors need to do if they want to make their services competitive with on-premise HPC.

High-bandwidth memory can improve a computer's performance. On-package memory (OPM) is a popular option in many commercial systems. Before this effort, little was known about OPM's implications for speed and power use. The team experimentally characterized and analyzed modern OPM, and provided guidelines on tuning the memory to speed up high-performance computing (HPC) applications.

In biology, similar graph-clustering algorithms can be used to understand the proteins that perform most of life's functions. It is estimated that the human body alone contains about 100,000 different protein types, and almost all biological tasks -- from digestion to immunity -- occur when these molecules interact with each other.
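As a minimal sketch of the idea, a protein-interaction network can be modeled as a graph whose nodes are proteins and whose edges are observed interactions; the simplest clustering then groups proteins into connected components. The protein names and interactions below are invented for illustration, not taken from any real dataset:

```python
from collections import defaultdict

def clusters(edges):
    """Group nodes into connected components (the simplest graph clustering)."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, result = set(), []
    for node in adj:
        if node in seen:
            continue
        # Depth-first traversal collects everything reachable from this node.
        comp, stack = set(), [node]
        while stack:
            n = stack.pop()
            if n in comp:
                continue
            comp.add(n)
            stack.extend(adj[n] - comp)
        seen |= comp
        result.append(comp)
    return result

# Hypothetical interaction list: two separate functional modules.
interactions = [("ACTB", "MYH9"), ("MYH9", "MYL6"), ("TP53", "MDM2")]
groups = clusters(interactions)
```

Real analyses use denser clustering criteria (e.g. Markov clustering or modularity-based methods) rather than plain connectivity, but the graph representation is the same.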

As AI's role in society continues to expand, J B Brown of the Graduate School of Medicine reports on a new evaluation method for the type of AI that predicts yes/positive/true or no/negative/false answers. Brown's paper, published in Molecular Informatics, deconstructs the utilization of AI and analyzes the nature of the statistics used to report an AI program's ability.
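To make the statistical question concrete, here is a hedged sketch (not Brown's actual method) of the standard metrics reported for a yes/no predictor, computed from a confusion matrix. It illustrates why the choice of statistic matters: on imbalanced data, a degenerate classifier can score high accuracy while the Matthews correlation coefficient exposes it as uninformative. All numbers are illustrative:

```python
def binary_stats(tp, fp, tn, fn):
    """Common metrics for a binary (positive/negative) predictor."""
    total = tp + fp + tn + fn
    accuracy = (tp + tn) / total
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    # Matthews correlation coefficient: robust under class imbalance,
    # unlike raw accuracy. Zero when any row/column of the matrix is empty.
    denom = ((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)) ** 0.5
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "mcc": mcc}

# A classifier that always answers "no" on a 95:5 dataset:
# 95% accuracy, yet zero recall and zero MCC.
stats = binary_stats(tp=0, fp=0, tn=95, fn=5)
```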

Stephen Hawking passed away at his home in Cambridge, England, in the early morning of March 14; he was 76. Born on January 8, 1942, Hawking was an English theoretical physicist, cosmologist, author and director of research at the Centre for Theoretical Cosmology within the University of Cambridge. A brilliant scientist and visionary, Hawking advanced cosmology as a computational science and led the launch of several UK supercomputers dedicated to cosmology and particle physics.

In an upper-level seminar on artificial intelligence, Occidental College professor Justin Li started a discussion outside the realm of a typical computer science class. Should a self-driving car, if unable to brake in time, be programmed to steer into a wall to avoid crashing into pedestrians — perhaps killing a single person in the vehicle in order to save five on the street?

About 20% of all IoT-enabled devices are expected to have basic blockchain services enabled in 2019. Beyond that, even more devices will be able to send data to private ledgers. This is an obvious hint at how the two most futuristic technologies have started converging, and most importantly, heralding a new world where companies are introducing new blockchain-based initiatives left, right and center!

The first step in rolling out a massive supercomputer installed at a government sponsored HPC laboratory is to figure out when you want to get it installed and doing useful work. The second is to consider the different technologies that will be available to reach performance and power envelope goals. And the third is to give it a cool name.

2017 was not necessarily the best year to build a large HPC system for life sciences, say Ari Berman, VP and GM of consulting services, and Aaron Gardner, director of technology, for research computing consultancy BioTeam. Perhaps that’s true more generally as well. The reason is there were enough new technology options entering the market or expected soon – think AMD’s EPYC processor line, Intel’s Skylake, and IBM’s Power9 chip – that committing to any one of them could seem premature.

Richard Childress Racing (RCR) is hoping to improve racing times through a multi-year partnership with ANSYS. RCR will use ANSYS Pervasive Engineering Simulation software to more accurately predict machine performance and enhance vehicle speed on the race track. A fraction of a second on the race track can determine which team takes the trophy, so NASCAR Monster Energy Cup Series teams must constantly improve speeds to stay competitive.

More institutions — including Harvard, Massachusetts Institute of Technology and Stanford, among others — are offering courses targeting ethical use of data and computer science, writes the New York Times. Accrediting group ABET even requires that ethics be a part of its institutions' computer science programs.

As the technology sector works to solve its diversity problem it must grapple with a puzzle: why are fewer women studying computer science? Today, less than 20 per cent of computer science graduates in the US are female, compared with more than a third in the mid-1980s. “You don’t see the same gender disparity in other sciences as you do in computer science,” says Reshma Saujani, founder of Girls Who Code, a non-profit organisation that runs after-school clubs across the US for girls up to 12th grade (age 18).

Joining different kinds of materials can lead to all kinds of breakthroughs. It's an essential skill that allowed humans to make everything from skyscrapers (by reinforcing concrete with steel) to solar cells (by layering materials to herd along electrons).

3-D printing has gained popularity in recent years as a means for creating a variety of functional products, from tools to clothing and medical devices. Now, the concept of multi-dimensional printing has helped a team of researchers at the Advanced Science Research Center (ASRC) at the Graduate Center of the City University of New York develop a new, potentially more efficient and cost-effective method for preparing biochips (also known as microarrays), which are used to screen for and analyze biological changes associated with disease development, bioterrorism agents, and other areas of research that involve biological components.

The number of international students enrolling in American universities is declining for the first time in years, amid volatile shifts in U.S. immigration policy. That’s according to the latest data from the federal government’s National Science Board. The number of international graduate students enrolled in U.S. science and engineering programs dropped 6 percent between 2016 and 2017 and 5 percent in non-science and engineering fields.

The study of computer science has undergone sweeping change in the past couple of decades, right along with the technology itself. And Marty Guenther, who recently retired as undergraduate coordinator in the computer science department, has been both witness to that and a key participant during her 25 years at the University.

Researchers from Google are testing a quantum computer with 72 quantum bits, or qubits, scientists reported March 5 at a meeting of the American Physical Society — a big step up from the company’s previous nine-qubit chip.

While it may be the era of supercomputers and "big data," without smart methods to mine all that data, it's only so much digital detritus. Now researchers at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) and UC Berkeley have come up with a novel machine learning method that enables scientists to derive insights from systems of previously intractable complexity in record time.

The phenomenal complexity of computing is not decreasing. Charts of growth, investment and scale continue to follow a logarithmic curve. But how is computational balance to be maintained with any level of objectivity under such extreme circumstances? How do we plan for this known, and yet highly unknown challenge of building balanced systems to operate at scale? The ever more bewildering set of options (e.g. price lists now have APIs) may, if not managed with utmost care, result in chaos and confusion.

Life sciences is an interesting lens through which to see HPC. It is perhaps not an obvious choice, given life sciences’ relative newness as a heavy user of HPC. Even today, after a decade of steady adoption of advanced computing technologies including a growing portion of traditional HPC simulation and modeling, life sciences is dominated by data analytics – big data sets, petabyte storage requirements, and recently fast networking to handle the near real-time flood of data from experimental instruments and the use of data commons by dispersed collaborators.

For today’s education leaders, one ongoing challenge is to provide a quality learning experience for students while keeping the price of tuition affordable. As institutional expenditures continue to rise, University and College leadership continue to look for ways to be fiscally efficient, while also providing a positive experience for the student, from enrollment through graduation and beyond. Using the right data, in the right way, can help institutions and leaders keep up with this ongoing challenge.