An anonymous reader writes: Following on the fortieth anniversary of Dungeons & Dragons last year, another formative influence on modern gaming is celebrating its fortieth birthday: Games Workshop. Playing at the World covers the story of how the founders, Ian Livingstone and Steve Jackson (not the other Steve Jackson), started out as subscribers to the 1960s British gaming zine Albion playing Diplomacy by mail and (in Ian's case) publishing silly cartoons. When Albion folded at the beginning of 1975, Livingstone and Jackson formed Games Workshop with its own zine Owl & Weasel as a way to bring "progressive games" (as in "progressive rock") to the UK. Shortly thereafter, when they discovered Dungeons & Dragons, fantasy and role-playing games became their focus. After Owl & Weasel grew up into White Dwarf in 1977, its famous "Fiend Factory" column ended up populating the D&D Fiend Folio. And in the 1980s, of course, they brought us Warhammer and their retail stores brought stylish miniatures to many a needful gamer. Happy birthday to Games Workshop!

First three-dimensional reconstruction of the inside of the giant mimivirus particle, using an X-ray free-electron laser (credit: Tomas Ekeberg et al./Physical Review Letters)

By measuring a series of diffraction patterns from virus particles injected into an XFEL beam, researchers at Stanford’s Linac Coherent Light Source (LCLS) have determined the first three-dimensional virus structure captured with an X-ray free-electron laser, using the giant mimivirus.

X-ray crystallography has solved the vast majority of the structures of proteins and other biomolecules. The success of the method relies on growing large crystals of the molecules, which isn’t possible for all molecules.

“Free-electron lasers provide femtosecond X-ray pulses with a peak brilliance ten billion times higher than any previously available X-ray source,” the researchers note in a paper in Physical Review Letters. “Such a large jump in one physical quantity is very rare, and can have far reaching implications for several areas of science. It has been suggested that such pulses could outrun key damage processes and allow structure determination without the need for crystallization.”

The current resolution of the technique (about 100 nanometers) would be sufficient to image important pathogenic viruses like HIV, influenza and herpes, and further improvements may soon allow researchers to tackle the study of single proteins, the scientists say.

Mimivirus is one of the largest known viruses. The viral capsid is about 450 nanometers in diameter and is covered by a layer of thin fibres. A 3D structure of the viral capsid exists, but the 3D structure of the inside was previously unknown.

Abstract for Three-dimensional reconstruction of the giant mimivirus particle with an x-ray free-electron laser

We present a proof-of-concept three-dimensional reconstruction of the giant Mimivirus particle from experimentally measured diffraction patterns from an X-ray free-electron laser. Three-dimensional imaging requires the assembly of many two-dimensional patterns into an internally consistent Fourier volume. Since each particle is randomly oriented when exposed to the X-ray pulse, relative orientations have to be retrieved from the diffraction data alone. We achieve this with a modified version of the expand, maximize and compress (EMC) algorithm and validate our result using new methods.
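
The orientation-recovery step is the heart of the method. Below is a toy two-dimensional analogue of one expand, maximize and compress (EMC) iteration, offered as an illustrative sketch rather than the authors' code: the hidden "model" is a 2D image, each measured "pattern" is its central row at an unknown rotation, and the Gaussian noise model and all parameter choices are our assumptions.

```python
# Toy 2D analogue of one EMC iteration: recover a model when every
# measured pattern comes from an unknown orientation.
import numpy as np
from scipy.ndimage import rotate

def emc_step(model, patterns, angles, noise=0.1):
    mid = model.shape[0] // 2
    # Expand: cut a candidate slice from the model for every test orientation.
    slices = np.array([rotate(model, a, reshape=False, order=1)[mid]
                       for a in angles])
    # Maximize: softly assign each pattern to orientations by Gaussian
    # likelihood, then average the patterns into updated slices.
    logp = -((patterns[:, None, :] - slices[None, :, :]) ** 2).sum(-1) / (2 * noise**2)
    w = np.exp(logp - logp.max(1, keepdims=True))
    w /= w.sum(1, keepdims=True)                  # P(orientation | pattern)
    new_slices = (w.T @ patterns) / (w.sum(0)[:, None] + 1e-12)
    # Compress: write each updated slice back into the model at its angle.
    new_model = np.zeros_like(model)
    for s, a in zip(new_slices, angles):
        canvas = np.zeros_like(model)
        canvas[mid] = s
        new_model += rotate(canvas, -a, reshape=False, order=1)
    return new_model / len(angles)
```

Iterating this step from a random starting model lets the relative orientations and the structure emerge together from the data alone, which is the essence of what the full 3D algorithm does with diffraction volumes.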

“A large part of what we did in the previous project was to place data close to computation. But what we’ve seen is that how you place that computation has a significant effect on how well you can place data nearby.”

Disentanglement

The problem of jointly allocating computations and data is very similar to one of the canonical problems in chip design known as “place and route.” The place-and-route problem begins with the specification of a set of logic circuits, and the goal is to arrange them on the chip so as to minimize the distances between circuit elements that work in concert.

This problem is what’s known as NP-hard, meaning that as far as anyone knows, for even moderately sized chips, all the computers in the world couldn’t find the optimal solution in the lifetime of the universe. Nonetheless, chipmakers have developed a number of algorithms that, while not absolutely optimal, seem to work well in practice.
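
To give a flavor of those heuristics, here is a minimal simulated-annealing placer, one of the classic practical approaches to placement. The toy grid, net list, cost function, and cooling schedule are all illustrative assumptions, not any vendor's tool.

```python
# Minimal simulated-annealing placement sketch: minimize total Manhattan
# wire length between connected elements on a grid.
import math, random

def anneal_placement(n_cells, nets, grid, steps=20000, t0=2.0):
    # nets: list of (cell_a, cell_b) pairs that should sit close together
    pos = {c: (random.randrange(grid), random.randrange(grid)) for c in range(n_cells)}
    cost = lambda p: sum(abs(p[a][0] - p[b][0]) + abs(p[a][1] - p[b][1])
                         for a, b in nets)
    cur = best = cost(pos)
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-6       # linear cooling schedule
        c = random.randrange(n_cells)
        old = pos[c]
        pos[c] = (random.randrange(grid), random.randrange(grid))
        new = cost(pos)
        # Accept improvements always; accept regressions with Boltzmann
        # probability so the search can escape local minima.
        if new <= cur or random.random() < math.exp((cur - new) / t):
            cur = new
            best = min(best, cur)
        else:
            pos[c] = old                          # revert the move
    return pos, best
```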

Adapted to the problem of allocating computations and data in a 64-core chip, these algorithms will arrive at a solution in the space of several hours.

As shown in an open-access paper in the Proceedings of the 21st International Symposium on High Performance Computer Architecture, Daniel Sanchez and students Nathan Beckmann and Po-An Tsai developed their own algorithm, which finds a solution that is more than 99 percent as efficient as that produced by standard place-and-route algorithms. But it does so in milliseconds.

“What we do is we first place the data roughly,” Sanchez says. “You spread the data around in such a way that you don’t have a lot of [memory] banks overcommitted or all the data in a region of the chip. Then you figure out how to place the [computational] threads so that they’re close to the data, and then you refine the placement of the data given the placement of the threads. By doing that three-step solution, you disentangle the problem.”
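
A minimal sketch of that three-step decomposition, under simplifying assumptions of ours (each data item used by one thread, Manhattan distance on a grid, greedy packing, no more threads than cores); the published algorithm is considerably more refined.

```python
# Three-step co-placement sketch: spread data, place threads near data,
# then refine data placement given the threads.
def place(threads, data_of, size, banks, cores, cap):
    # data_of[t]: items used only by thread t; size[d]: footprint of item d;
    # banks, cores: lists of (x, y) grid coordinates; cap: per-bank capacity.
    dist = lambda p, q: abs(p[0] - q[0]) + abs(p[1] - q[1])
    # Step 1: spread data roughly so no bank is overcommitted (greedy).
    load = {b: 0 for b in banks}
    bank_of = {}
    for t in threads:
        for d in data_of[t]:
            b = min(banks, key=lambda b: load[b])
            bank_of[d] = b
            load[b] += size[d]
    # Step 2: put each thread on the free core nearest its data's centroid.
    free = set(cores)
    core_of = {}
    for t in threads:
        cx = sum(bank_of[d][0] for d in data_of[t]) / len(data_of[t])
        cy = sum(bank_of[d][1] for d in data_of[t]) / len(data_of[t])
        core_of[t] = min(free, key=lambda c: dist(c, (cx, cy)))
        free.discard(core_of[t])
    # Step 3: refine data placement given the threads: move each item to
    # the closest bank (to its thread's core) that still has room.
    for t in threads:
        for d in data_of[t]:
            load[bank_of[d]] -= size[d]
            b = min(banks, key=lambda b: dist(b, core_of[t])
                    + (10**9 if load[b] + size[d] > cap else 0))
            bank_of[d] = b
            load[b] += size[d]
    return core_of, bank_of
```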

In principle, Beckmann adds, that process could be repeated, with computations again reallocated to accommodate data placement and vice versa. “But we achieved 1 percent, so we stopped,” he says. “That’s what it came down to, really.”

Keeping tabs

The MIT researchers’ system monitors the chip’s behavior and reallocates data and threads every 25 milliseconds. That sounds fast, but it’s enough time for a computer chip to perform 50 million operations (25 milliseconds at a 2-gigahertz clock rate).

During that span, the monitor randomly samples the requests that different cores are sending to memory, and it stores the requested memory locations, in an abbreviated form, in its own memory circuit.

Every core on a chip has its own cache — a local, high-speed memory bank where it stores frequently used data. On the basis of its samples, the monitor estimates how much cache space each core will require, and it tracks which cores are accessing which data.
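
A sketch of what such a sampling monitor might look like in software, assuming a uniform sampling rate and 64-byte cache lines; the paper describes dedicated hardware, and the crude footprint scale-up here is our simplification.

```python
# Sampling monitor sketch: tally a random subset of memory requests and
# estimate each core's cache appetite and data affinity from the samples.
import random
from collections import Counter, defaultdict

class Monitor:
    def __init__(self, sample_rate=0.01):
        self.rate = sample_rate
        self.lines = defaultdict(Counter)      # core -> cache line -> hits

    def observe(self, core, addr, line_bytes=64):
        if random.random() < self.rate:        # keep only a random sample
            self.lines[core][addr // line_bytes] += 1

    def estimated_footprint(self, core, line_bytes=64):
        # Unique sampled lines, crudely scaled up by the sampling rate.
        return int(len(self.lines[core]) / self.rate) * line_bytes

    def sharers(self, addr, line_bytes=64):
        # Which cores have touched this line (tracks data affinity).
        line = addr // line_bytes
        return [c for c, tally in self.lines.items() if line in tally]
```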

The monitor does take up about 1 percent of the chip’s area, which could otherwise be allocated to additional computational circuits. But Sanchez believes that chipmakers would consider that a small price to pay for significant performance improvements.

“There was a big National Academy study and a DARPA-sponsored [information science and technology] study on the importance of communication dominating computation,” says David Wood, a professor of computer science at the University of Wisconsin at Madison. “What you can see in some of these studies is that there is an order of magnitude more energy consumed moving operands around to the computation than in the actual computation itself. In some cases, it’s two orders of magnitude. What that means is that you need to not do that.”

The MIT researchers “have a proposal that appears to work on practical problems and can get some pretty spectacular results,” Wood says. “It’s an important problem, and the results look very promising.”

Cache hierarchies are increasingly non-uniform, so for systems to scale efficiently, data must be close to the threads that use it. Moreover, cache capacity is limited and contended among threads, introducing complex capacity/latency tradeoffs. Prior NUCA schemes have focused on managing data to reduce access latency, but have ignored thread placement; and applying prior NUMA thread placement schemes to NUCA is inefficient, as capacity, not bandwidth, is the main constraint. We present CDCS, a technique to jointly place threads and data in multicores with distributed shared caches. We develop novel monitoring hardware that enables fine-grained space allocation on large caches, and data movement support to allow frequent full-chip reconfigurations. On a 64-core system, CDCS outperforms an S-NUCA LLC by 46% on average (up to 76%) in weighted speedup and saves 36% of system energy. CDCS also outperforms state-of-the-art NUCA schemes under different thread scheduling policies.

schwit1 tips news that a team of astronomers has studied one of the most distant galaxies ever observed and found puzzling results. The light we're seeing from this galaxy comes from roughly 700 million years after the Big Bang, so on the cosmic scale, it's quite young. But the galaxy appears much older than astronomers expected. Their paper was published today in Nature.
At this age, it would be expected to display a lack of heavier chemical elements — anything heavier than hydrogen and helium, defined in astronomy as metals. These are produced in the bellies of stars and scattered far and wide once the stars explode or otherwise perish. This process needs to be repeated for many stellar generations to produce a significant abundance of the heavier elements such as carbon, oxygen and nitrogen. Surprisingly, the galaxy A1689-zD1 seemed to be emitting a lot of radiation in the far infrared, indicating that it had already produced many of its stars and significant quantities of metals, and revealed that it not only contained dust, but had a dust-to-gas ratio that was similar to that of much more mature galaxies.

Researchers at Queen’s University Belfast, the University of Manchester, and the STFC Daresbury Laboratory are developing new software to increase the ability of supercomputers to process big data faster while minimizing increases in power consumption.

To do that, computer scientists in the Scalable, Energy-Efficient, Resilient and Transparent Software Adaptation (SERT) project are using "approximate computing" (also known as "significance-based computing"), which trades a degree of hardware reliability for reduced energy consumption.

The idea is to operate hardware at a supply voltage only slightly above the transistor threshold (known as near-threshold voltage, NTV), accepting that components will run in an unreliable state, and assuming that software and parallelism can cope with the timing errors that result, for example by using extra iterations to reach convergence.
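
As a toy illustration of that idea (our construction, not SERT code), a Jacobi solver for Ax = b still converges when updates are occasionally perturbed; it simply needs more iterations.

```python
# Jacobi iteration with injected "timing faults": occasional noisy updates
# slow convergence but do not prevent it.
import random

def jacobi(A, b, error_rate=0.0, tol=1e-8, max_iter=100000):
    n = len(b)
    x = [0.0] * n
    for it in range(max_iter):
        nx = []
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            v = (b[i] - s) / A[i][i]
            if random.random() < error_rate:   # model a fault as noise
                v += random.uniform(-0.1, 0.1)
            nx.append(v)
        if max(abs(nx[i] - x[i]) for i in range(n)) < tol:
            return nx, it
        x = nx
    return x, max_iter

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
print(jacobi(A, b, error_rate=0.0)[1])    # baseline iteration count
print(jacobi(A, b, error_rate=0.05)[1])   # noisy run: converges, more slowly
```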

“We also investigate scenarios where we distinguish between significant and insignificant parts [of programs] and execute them selectively on reliable or unreliable hardware, respectively,” according to the authors of a paper in the journal Computer Science – Research and Development. “We consider parts of the algorithm that are more resilient to errors as ‘insignificant,’ whereas parts in which errors increase the execution time substantially are marked as ‘significant.’”
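
In software terms, the selective scheme in that quote amounts to routing each part of a program by its significance tag; the executor interfaces below are hypothetical placeholders of ours.

```python
# Significance-driven dispatch sketch: significant parts go to reliable
# hardware, error-resilient parts to cheap unreliable hardware.
def execute(parts, run_reliable, run_unreliable):
    # parts: list of (fn, args, significant) tuples; returns results in order.
    out = []
    for fn, args, significant in parts:
        runner = run_reliable if significant else run_unreliable
        out.append(runner(fn, *args))
    return out
```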

Software methods for improving error resilience include checkpointing for failed tasks and replication to identify silent data corruption.
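
Hedged sketches of those two mechanisms, with hypothetical task and state types (the state must be picklable, and replicated results hashable); real HPC resilience frameworks are far more elaborate.

```python
# Checkpoint/restart for tasks that fail outright, and duplicate execution
# with a comparison to catch silent data corruption.
import pickle

def run_with_checkpoint(task, state, path="ckpt.bin"):
    with open(path, "wb") as f:
        pickle.dump(state, f)              # checkpoint before the risky step
    try:
        return task(state)
    except Exception:
        with open(path, "rb") as f:
            state = pickle.load(f)         # roll back and retry once
        return task(state)

def run_replicated(task, state, copies=2):
    results = [task(state) for _ in range(copies)]
    if len(set(results)) != 1:             # disagreement => silent corruption
        raise RuntimeError("replicas disagree; recompute")
    return results[0]
```
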
“This new software … [means] complex computing simulations which would take thousands of years on a desktop computer will be completed in a matter of hours,” according to the project’s Principal Investigator, Professor Dimitrios Nikolopoulos from Queen’s University Belfast.

The SERT project, due to start this month, has just been awarded almost £1 million from the U.K. Engineering and Physical Sciences Research Council.

The researchers are simulating detailed models of natural phenomena such as ocean currents, the blood flow of a human body, and global weather patterns to help address some of the big global challenges, including sustainable energy, the rise in global temperatures, and worldwide epidemics.

Abstract of On the potential of significance-driven execution for energy-aware HPC

Dynamic voltage and frequency scaling (DVFS) exhibits fundamental limitations as a method to reduce energy consumption in computing systems. In the HPC domain, where performance is of highest priority and codes are heavily optimized to minimize idle time, DVFS has limited opportunity to achieve substantial energy savings. This paper explores if operating processors near the transistor threshold voltage (NTV) is a better alternative to DVFS for breaking the power wall in HPC. NTV presents challenges, since it compromises both performance and reliability to reduce power consumption. We present a first-of-its-kind study of a significance-driven execution paradigm that selectively uses NTV and algorithmic error tolerance to reduce energy consumption in performance-constrained HPC environments. Using an iterative algorithm as a use case, we present an adaptive execution scheme that switches between near-threshold execution on many cores and above-threshold execution on one core, as the computational significance of iterations in the algorithm evolves over time. Using this scheme on state-of-the-art hardware, we demonstrate energy savings ranging between 35% and 67%, while compromising neither correctness nor performance.
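
Reading the abstract's adaptive scheme at face value, the per-iteration switch might look like the following, where the significance test and the two executors are hypothetical placeholders of ours.

```python
# Adaptive execution sketch: per iteration, use one reliable above-threshold
# core for significant work, or many near-threshold cores otherwise.
def adaptive_solver(iterations, significant, ntv_many_core, nominal_one_core):
    results = []
    for it in iterations:
        step = nominal_one_core if significant(it) else ntv_many_core
        results.append(step(it))
    return results
```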

sarahnaomi sends this report from Motherboard:
Canadian police say they've uncovered a massive online file sharing network for exploitative material that could involve up to 7,500 users in nearly 100 countries worldwide. But unlike past investigations into the distribution of child porn, which typically involve targeting suspects individually, police have instead seized over 1.2 petabytes of data ... from a data center responsible for storing the material, and may even attempt to lay criminal charges against its operators, too. "What we are alleging is occurring is that there are individuals and organizations that are profiting from the storage and the exchange of child sexual exploitation material," Scott Tod, Deputy Commissioner of the Ontario Provincial Police (OPP), told Motherboard at a conference late last month, after speaking to a crowd of defense specialists. "They store it and they provide a secure website that you can log into, much like people do with illegal online gaming sites."
