Google Plans to Add Support for Containerized Linux Apps to Chromebooks

Google is apparently working on Project Crostini for Chrome OS, which would allow Linux VMs on Chrome OS, according to a Reddit thread pointing to a recent Chromium commit. The commit describes a new device policy that, when set to true, will allow Linux virtual machines to run on Chrome OS. This can only mean that the dream of running Linux apps on your Chromebook is finally becoming a reality. It shouldn't be long before the new policy feature hits the Dev, Beta, and then Stable channels of Chrome OS, so you can taste Linux software on Chrome OS yourself.

Nightdive has been at work on Turok since 2015. Nightdive’s specialty has been in rescuing and refreshing beloved video game franchises that their original publishers have forsaken. Turok: Dinosaur Hunter was a fantastically weird title, pitting the player against prehistoric monsters with weapons ranging from a bow and arrows to a gatling gun.

As you can see, this isn’t a high-definition remaster. But Turok’s wildass premise really wouldn’t work without the 1990s goofiness of low-poly characters, infinite backgrounds, and one-size-fits-all textures. Look, you’ve got dinosaurs that not only breathe fire, they also carry firearms, so it’s not like anyone’s going for immersion here.

Saturday, 24 February 2018

With Meltdown and Spectre mitigation techniques now in place on the major operating systems, in the weeks and months ahead we are likely to see more performance optimizations arrive to help offset the penalties incurred by mitigations like kernel page table isolation (KPTI) and Retpolines. This week a new patch series was published that may help with KPTI performance.

Intel developer Dave Hansen discovered that back when KPTI was known as

My overworked memory missed realizing it by a few days, but it's been a pretty miraculous two years for this high-performance graphics and compute API.

In the two years that Vulkan 1.0 has been around, we have seen drivers for all major platforms, including not one but two open-source AMD Vulkan Linux drivers (AMDVLK and RADV). The rapid ascent of RADV and the other open-source Vulkan drivers has shown how much lighter Vulkan driver implementations are than OpenGL drivers, and these open-source drivers have kept up to date with minor specification revisions. OpenGL SPIR-V/Vulkan interoperability was introduced last year, and more game developers continue making use of Vulkan.

Also, we have seen Feral Interactive focus on bringing their new games to Linux using Vulkan (in fact, as Vulkan exclusives now), and the open-source community around Vulkan has been thriving.

There are now around 1,800 projects referencing Vulkan, while just 94 mention D3D12 (or 38 as "Direct3D 12"). On the Windows side there have been a few prominent Vulkan titles like DOOM and Wolfenstein, but sadly not as many Vulkan games as I would have hoped for two years in, especially cross-platform games.

Making things challenging as well, many of the current games utilizing Vulkan aren't seeing significant performance benefits compared to their OpenGL code paths, since the engines/renderers aren't yet fully designed to take advantage of a low-level graphics API. We have been seeing more work in this direction, though, with Vulkan performance improving relative to Linux OpenGL as the Vulkan drivers mature.

What do you think of Vulkan now two years since the public release? What do you want to see added to the API? What do you think they will introduce in their next major update? Has the Vulkan tooling to date been satisfactory? What about macOS support? More convergence between OpenCL and Vulkan for compute? Will 2018 finally be "the year of Vulkan gaming"? Share your thoughts with us this weekend in the forums.

Thursday, 22 February 2018

Intel has announced that it has released production microcode updates to OEM manufacturers for Kaby Lake, Coffee Lake, and Skylake platforms. Along with this announcement, Intel has finally given us a schedule and availability table for the microcode revisions, which can be found here. It's nice to see they haven't just gone quiet and hoped we forgot; I was actually asking yesterday if anyone had heard anything. It's also nice to see in their guidance paper that they plan on going back further than initially thought, though we will have to see whether those plans make it out of "planning" or get pushed out by vendors. There are also some CPUs that seem to be missing from the list. Keep checking your motherboard manufacturers' sites for updates; you can find a list we compiled here.

"Based on these efforts, we have now released production microcode updates to our OEM customers and partners for Kaby Lake- and Coffee Lake-based platforms, plus additional Skylake-based platforms. This represents our 6th, 7th and 8th Generation Intel Core product lines as well as our latest Intel Core X-series processor family. It also includes our recently announced Intel Xeon Scalable and Intel Xeon D processors for data center systems."

The first English translation of Operation Elop, an examination by Finnish journalists into the final years of Nokia phones, has reignited debate about the fate of what was Europe's largest and most admired technology company.

Operation Elop is largely negative about the tenure of the Canadian CEO, the first non-Finn to hold the position at the company, but nevertheless comes to his defense where the authors find the criticism was unfair. For example, the vilification Stephen Elop received over his "$26m payoff" was completely unwarranted, the authors conclude, since the figure (and much of the reporting) was wildly inaccurate. If you want an American CEO, they point out, you need to pay an American CEO's compensation. And Elop's time at Nokia cost him his marriage, don't forget.

But the collapse of Nokia also cost Finnish communities dear: the details of rising alcoholism, and child social services under strain as thousands of employees were laid off, make for grim reading.

Elop's tenure at Nokia and the company's downfall will be studied for decades to come.

There are two major reasons I can think of to hack a game console. The first one is obvious: so you can play cracked copies of games. That's why modern consoles are so difficult to hack, because millions of dollars are on the line. But some people just want to run any software they choose on the hardware they own. And for those people, Linux on the Switch is a huge achievement. I'm surprised it even took this long.

Tuesday, 20 February 2018

PlayerUnknown’s Battlegrounds developer PUBG Corp. has delayed several major upcoming features, including the launch of the game’s new map, as the company focuses on fighting rampant cheating. In a post today on the Battlegrounds Steam page, the developer said that anti-cheat efforts have also set back communication with fans about plans for 2018 and beyond.

“Early this year, development of some of the major features and systems was delayed as our focus shifted towards tightening our anti-cheat effort,” PUBG Corp. said. “Also, due to other reasons, we have not been able to show you the team’s development roadmap for 2018.”

The Steam post, which also serves as the latest set of patch notes, indicated that players would now be segregated by the speed and quality of their internet connections.

“Users with lower pings will be prioritized during matchmaking,” PUBG Corp. said, meaning that players with the fastest internet connections will be allowed to join new games faster.
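Ping-based prioritization like this can be sketched as a simple priority queue. The Python below is only an illustrative sketch under assumed semantics (lowest ping wins, first-come-first-served among equal pings); PUBG Corp. has not published its actual matchmaking algorithm.

```python
import heapq

class Matchmaker:
    """Toy matchmaking queue that prioritizes players with lower ping.

    Purely illustrative: PUBG's real matchmaker is not public, and this
    sketch only demonstrates the 'lower ping first' rule from the post.
    """

    def __init__(self):
        self._queue = []   # min-heap ordered by (ping, arrival order)
        self._counter = 0  # tie-breaker so equal pings stay FIFO

    def enqueue(self, player_id, ping_ms):
        heapq.heappush(self._queue, (ping_ms, self._counter, player_id))
        self._counter += 1

    def fill_match(self, slots):
        """Pop up to `slots` players, lowest ping first."""
        picked = []
        while self._queue and len(picked) < slots:
            ping, _, player = heapq.heappop(self._queue)
            picked.append((player, ping))
        return picked

mm = Matchmaker()
for pid, ping in [("alice", 120), ("bob", 35), ("carol", 80), ("dave", 35)]:
    mm.enqueue(pid, ping)

# The two 35 ms players and the 80 ms player fill the match first;
# the 120 ms player waits for the next one.
print(mm.fill_match(3))
```

A real matchmaker would balance this against wait times and region, of course; strict ping ordering alone would starve high-latency players.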

Other changes include the following:

Some fences on Miramar were replaced with unbreakable versions “in order to optimize the client.”

Players will no longer be able to enter first-person or otherwise see inside the plane during the initial drop “to improve the early game client and server performance.” To compensate, players will have a counter in the lower left corner of the screen indicating how many players are left on the plane.

Reporting players who have cheated through the replay system will now include a one-minute-long file centered on the map location where the report was made. Redundant reports of the same player will no longer be accepted.

Sunday, 18 February 2018

Microsoft has revealed its plans to use blockchain distributed-ledger technologies to securely store and manage digital identities, starting with an experiment using the Microsoft Authenticator app. From a report: Microsoft reckons the technology holds promise as a superior alternative to people granting consent to dozens of apps and services and having their identity data spread across multiple providers. It highlights that with the existing model people don't have control over their identity data and are left exposed to data breaches and identity theft. Instead, people could store, control and access their identity in an encrypted digital hub, Microsoft explained. To achieve this goal, Microsoft has for the past year been incubating ideas for using blockchain and other distributed ledger technologies to create new types of decentralized digital identities.
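The general idea (identity data stays encrypted in a user-controlled hub, while only tamper-evident commitments are anchored to an append-only ledger) can be sketched in a few lines. This is a hypothetical Python illustration, not Microsoft's published design; the commitment scheme and record layout here are assumptions.

```python
import hashlib
import json

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

class IdentityLedger:
    """Append-only hash chain anchoring commitments to identity claims.

    Illustrative only: Microsoft has not published its design. The claim
    data itself would live encrypted in a user-controlled hub; the chain
    stores only a hash commitment, never the claim.
    """

    def __init__(self):
        self.blocks = []

    def anchor(self, claim: dict) -> str:
        commitment = sha256(json.dumps(claim, sort_keys=True).encode())
        prev = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        block_hash = sha256((prev + commitment).encode())
        self.blocks.append({"prev": prev, "commitment": commitment,
                            "hash": block_hash})
        return commitment

    def verify(self, claim: dict) -> bool:
        """A verifier checks a disclosed claim against the chain without
        the ledger ever having stored the claim itself."""
        commitment = sha256(json.dumps(claim, sort_keys=True).encode())
        return any(b["commitment"] == commitment for b in self.blocks)

ledger = IdentityLedger()
claim = {"subject": "did:example:alice", "attribute": "over18", "value": True}
ledger.anchor(claim)
print(ledger.verify(claim))                      # True: matches the anchor
print(ledger.verify({**claim, "value": False}))  # False: tampered claim
```

The point of the indirection is exactly what Microsoft highlights: the user, not dozens of providers, holds the actual data.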

The Engineering and Physical Sciences Research Council awarded a remarkable photograph its overall prize in science photography. The subject of the photograph? A single atom visible to the naked eye. Well, perhaps not exactly the naked eye, but without a microscope. In the picture above (click here to enlarge), the atom is that pale blue dot between the two needle-like structures.

You probably learned in school that you couldn’t see a single atom, and that’s usually true. But [David Nadlinger] from the University of Oxford trapped a positively charged strontium atom in an ion trap and then irradiated it with a blue-violet laser. The atom absorbs and reemits the light, and a camera can pick up that light, creating a one-of-a-kind photograph. The camera was a Canon 5D Mk II with a 50mm f/1.8 lens — a nice camera, but nothing too exotic.

The ion trap keeps the single atom balanced between two small needle points about 2 millimeters apart. [Nadlinger] did some math that convinced him the photograph was possible and made it a reality on a Sunday afternoon. The pale dot isn’t especially spectacular by itself, but when you realize it is the visual effect of a single atom, it is mind-blowing. It turns out the lab has taken similar photographs in the past. They don’t remember who took it, but they have a picture of 9 trapped calcium-43 ions, which you can see below. The ions are 10 microns apart and at an effective temperature of 0.001 kelvin.

Other winning photographs included patterns on a soap bubble, an EEG headset in use, and microbubbles used to deliver drugs. There’s also an underwater robot, a machine for molecular beam epitaxy that looks like a James Bond villain’s torture device, and lattices made with selective laser melting 3D printing.

If you want to look at atoms from the comfort of your own home, maybe you should build an STM. You might even try NIST’s improved atom probe while you are at it. Just remember you can’t trust atoms. They make up everything.

Saturday, 17 February 2018

Google's Project Zero team has published details of an unfixed bypass for an important exploit-mitigation technique in Edge. From a report: The mitigation, Arbitrary Code Guard (ACG), arrived in the Windows 10 Creators Update to help thwart web attacks that attempt to load malicious code into memory. The defense ensures that only properly signed code can be mapped into memory. However, as Microsoft explains, Just-in-Time (JIT) compilers used in modern web browsers create a problem for ACG. JIT compilers transform JavaScript into native code, some of which is unsigned and runs in a content process. To ensure JIT compilers work with ACG enabled, Microsoft put Edge's JIT compiling in a separate process that runs in its own isolated sandbox. Microsoft said this move was "a non-trivial engineering task." "The JIT process is responsible for compiling JavaScript to native code and mapping it into the requesting content process. In this way, the content process itself is never allowed to directly map or modify its own JIT code pages," Microsoft says. Google's Project Zero found that an issue is created by the way the JIT process writes executable data into the content process.
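The division of labor Microsoft describes can be modeled roughly as follows: only the isolated JIT process can produce validly signed code, and the content process may map code only if the signature checks out. This Python sketch is a conceptual analogy only; in real ACG the enforcement happens at the OS level rather than via a shared HMAC key, and the Project Zero issue concerns how the compiled pages are actually transferred between the two processes.

```python
import hashlib
import hmac

# Conceptual model of ACG plus an out-of-process JIT, NOT Edge internals:
# the content process may only map code carrying a valid signature, and
# only the separate JIT process can produce that signature.

SIGNING_KEY = b"jit-process-secret"  # stand-in for the trust boundary

def jit_process_compile(js_source: str):
    """Runs in the isolated JIT process: 'compile' JS and sign the result."""
    native = b"NATIVE:" + js_source.encode()  # stand-in for real codegen
    sig = hmac.new(SIGNING_KEY, native, hashlib.sha256).digest()
    return native, sig

def content_process_map(native: bytes, sig: bytes) -> bool:
    """Runs in the content process: under ACG, pages are mapped executable
    only if the signature verifies; the process can never sign code itself."""
    expected = hmac.new(SIGNING_KEY, native, hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected)

code, sig = jit_process_compile("function f(x){ return x * 2; }")
print(content_process_map(code, sig))          # True: signed by the JIT process
print(content_process_map(b"shellcode", sig))  # False: rejected under ACG
```

The bypass Project Zero describes lives in the gap this sketch glosses over: the window in which the executable data is being written into the content process.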

Logstash 6.2.0 Release Improves Open Source Data Processing Pipeline

Many modern enterprises have adopted the ELK (Elasticsearch, Logstash, Kibana) stack to collect, process, search and visualize data.

At the core of the ELK stack is the open-source Logstash project which defines itself as a server-side data processing pipeline - basically it helps to collect logs and then send them to a users' "stash" for searching, which in many cases is Elasticsearch.
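A minimal Logstash pipeline configuration makes that input/filter/output model concrete. The log path, index name, and host below are placeholders for illustration:

```conf
# Tail an application log, parse each line, and ship it to Elasticsearch.
input {
  file {
    path => "/var/log/myapp/*.log"
    start_position => "beginning"
  }
}

filter {
  # Parse Apache-style access log lines into structured fields.
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  # Use the log's own timestamp instead of ingestion time.
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "myapp-%{+YYYY.MM.dd}"
  }
}
```

Kibana then searches and visualizes whatever lands in that index, which is the "stash" half of the pipeline.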

Why Red Hat Invested $250M in CoreOS to Advance Kubernetes

For the last three years or so, Red Hat has been on a collision course with CoreOS, with both firms aiming to grow their respective Kubernetes platforms. On Jan. 30 the competition between the two firms ended, with CoreOS agreeing to be acquired by Red Hat in a $250 million deal.

CoreOS didn't start out as a Kubernetes platform vendor, but then again neither did Red Hat.

Saturday, 10 February 2018

Daniel Schürmann is Valve's latest hire, joining Timothy Arceri, Andres Rodriguez, Samuel Pitoiset, and others working on the open-source Linux graphics stack while being funded by the company.

Daniel Schürmann is a new name in Linux graphics contributions. He began his Valve work by contributing some

The German Linux developer has contributed to the Mixxx DJ mixing software, Cinnamon, and other open-source projects.

It turns out there are two Daniel Schürmanns in Linux/open-source development. Not to be confused with the desktop developer, this Daniel, from TU Berlin, wrote his master's thesis on OpenMP offloading using OpenCL and SPIR-V. You can check out his thesis

Friday, 9 February 2018

This release is the result of more than three years of volunteer work; it adds numerous new features, fixes more than 1,500 bugs, and represents more than 20,000 commits. All platforms share the same code now. VLC 3.0 is a massive change in the VLC core,

Ever since the inception of the Like button, Facebook users have been asking for a "dislike" button. Today, Facebook is testing a "downvote" button with certain users in the comment section of posts within Facebook groups and on old Facebook memories content. The Daily Beast reports: The feature appears to give users the ability to downrank certain comments. This is the first time Facebook has tested anything similar to a "dislike" button, and it could theoretically allow content that's offensive or irrelevant to be pushed to the bottom of a comment feed. In 2016, citing Facebook executives, Bloomberg said a dislike button "had been rejected on the grounds that it would sow too much negativity" into the platform. It's unclear how widely the downvote button is being tested. Facebook regularly tests features with small subsets of users that never end up rolling out to the broader public. Most users currently are only able to either Like or Reply to comments in a thread. The downvote option could have radical implications for what types of discussions and comments flourish on the platform. While it could theoretically be used to de-rank inflammatory or problematic comments, it could also easily be used as a tool for abuse.

Sunday, 4 February 2018

"Free/libre and 100% community backed version of XenServer," promises a new Kickstarter page, adding that "Our first prototype (and proof of concept) is already functional." Currently, XenServer is a turnkey virtualization platform, distributed as a distribution (based on CentOS). It comes with a feature-rich toolstack, called XAPI. The vast majority of XenServer code is open source, but since XenServer 7.3, Citrix has removed a lot of features from it. The goal of XCP-ng is to make a fully community-backed version of XenServer, without any feature restrictions. The project also aims to create a real ecosystem that doesn't depend on one company only; simple equation: the more participants there are, the healthier the ecosystem. The campaign reached its fundraising goal within a few hours, reports long-time Slashdot reader NoOnesMessiah, and within three days they'd already raised four times the needed amount and begun unlocking their stretch goals.

wiredmikey quotes SecurityWeek: Researchers have discovered more than 130 malware samples designed to exploit the recently disclosed Spectre and Meltdown CPU vulnerabilities. While a majority of the samples appear to be in the testing phase, we could soon start seeing attacks... On Wednesday, antivirus testing firm AV-TEST told SecurityWeek that it has obtained 139 samples from various sources, including researchers, testers and antivirus companies... Fortinet, which also analyzed many of the samples, confirmed that a majority of them were based on available proof of concept code. Andreas Marx, CEO of AV-TEST, believes different groups are working on the PoC exploits to determine if they can be used for some purpose. "Most likely, malicious purposes at some point," he said.

The Head of Service Management & Anti-Cheat at PlayerUnknown's Battlegrounds has announced that they have developed a new anti-cheat solution and will be deploying an early version of it on the servers next week. The solution was developed in house and has been tested on their test servers. The main focus of the new system is blocking unauthorized programs that hook into the game and modify game files. In addition, PUBG is upgrading the in-game report function so reported content can be investigated faster and more accurately; adding file-modification checks, where any modification or deletion of game files may result in a ban; and cracking down on account sharing, no longer allowing Family Sharing on Steam.

Sounds good to me. I have not played many rounds in PUBG, but I will say that in the few I have, several ended suspiciously. The primary focus of this anti-cheat makes me worry for people who use things like RTSS, Afterburner, and the like for FPS reporting; we don't want another unjust ban wave happening. The comments section on the announcement implies that players have another idea to quell cheating.

The internally developed anti-cheat solution is planned to be upgraded steadily after the first implementation next week. As mentioned earlier, the eradication of cheat programs will not end with a few attempts and actions. In addition to the systems currently in development and already implemented, we are looking into a more effective system, and we will actively introduce any solutions that were confirmed to be reliable and accurate. We will continue taking firm measures against the developers, distributors and users of cheats. We promise you that we will do our best every day in our battle for a fair game environment.

Friday, 2 February 2018

schwit1 shares a report from CNN: EBay, one of the world's biggest online marketplaces, announced Wednesday that it's dropping PayPal as its main partner for processing payments in favor of Dutch company Adyen. In 2002, eBay paid $1.5 billion to buy PayPal, an online payments company whose founders include Silicon Valley heavyweights Elon Musk and Peter Thiel. It proved to be a very successful investment. When eBay spun off PayPal in 2015 -- something investors and analysts had urged it to do -- the payments company's market value was close to $50 billion. It's now above $100 billion. Based in Amsterdam, Adyen already works with other big tech companies including Uber and Netflix. It says it handles more than 200 different payment methods and over 150 currencies. The shift will start gradually in North America later this year and eBay expects most marketplace customers around the world to be using the new system in 2021.

Bloomberg: Amazon.com is advertising its Alexa-powered speakers in the big game on Sunday. It's an amusing 90 seconds that features celebrities like Gordon Ramsay, Rebel Wilson, Anthony Hopkins, Cardi B and the world's wealthiest man, Jeff Bezos himself. The word "Alexa" is uttered 10 times during the Super Bowl spot, but thankfully, the Amazon Echo in your living room isn't going to perk up and try to respond. Bezos and company have evidently been thinking about this problem for a long time, before the Echo was even introduced. A September 2014 Amazon patent titled "Audible command filtering" describes techniques to prevent Alexa from waking up "as part of a broadcast watched by a large population (such as during a popular sporting event)," annoying customers and overloading Amazon's servers with millions of simultaneous requests. The patent broadly describes two techniques. The first calls for transmitting a snippet of a commercial to Echo devices before it airs. Then the Echo can compare live commands to the acoustic fingerprint of the snippet to determine whether the commands are authentic. The second tactic describes how a commercial itself could transmit an inaudible acoustic signal to tell Alexa to ignore its wake word.
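The patent's first technique amounts to fingerprint matching: compare the live audio against a fingerprint of the pre-distributed ad snippet, and suppress the wake word on a close match. The sketch below illustrates the idea with a deliberately crude spectral fingerprint; Amazon's actual features, fingerprints, and thresholds are not public, so everything here is a toy assumption.

```python
import math

def fingerprint(samples, bands=8):
    """Crude acoustic fingerprint: normalized energy in a few coarse
    frequency bands, via a naive DFT. Real systems use far more robust
    features; this only illustrates the compare-to-snippet idea."""
    n = len(samples)
    energies = []
    for b in range(1, bands + 1):
        re = sum(s * math.cos(2 * math.pi * b * i / n) for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * b * i / n) for i, s in enumerate(samples))
        energies.append(math.hypot(re, im))
    total = sum(energies) or 1.0
    return [e / total for e in energies]

def matches_broadcast(live, snippet, threshold=0.05):
    """Ignore the wake word when the live audio's fingerprint is close
    to the pre-distributed ad snippet's fingerprint."""
    dist = sum(abs(a - b) for a, b in zip(fingerprint(live), fingerprint(snippet)))
    return dist < threshold

# Toy signals: the 'ad' is one pure tone; the 'user' speaks another.
ad = [math.sin(2 * math.pi * 3 * i / 256) for i in range(256)]
user = [math.sin(2 * math.pi * 7 * i / 256) for i in range(256)]

print(matches_broadcast(ad, ad))    # True: broadcast audio, stay quiet
print(matches_broadcast(user, ad))  # False: genuine command, respond
```

The patent's second technique skips the comparison entirely by embedding an inaudible "ignore me" signal in the commercial itself.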

An anonymous reader shares a report from CNBC, written by Gabriel Weinberg, CEO and founder of DuckDuckGo: You may know that hidden trackers lurk on most websites you visit, soaking up your personal information. What you may not realize, though, is 76 percent of websites now contain hidden Google trackers, and 24 percent have hidden Facebook trackers, according to the Princeton Web Transparency & Accountability Project. The next highest is Twitter with 12 percent. It is likely that Google or Facebook are watching you on many sites you visit, in addition to tracking you when using their products. As a result, these two companies have amassed huge data profiles on each person, which can include your interests, purchases, search, browsing and location history, and much more. They then make your sensitive data profile available for invasive targeted advertising that can follow you around the Internet. [...] So how do we move forward from here? Don't be fooled by claims of self-regulation, as any useful long-term reforms of Google and Facebook's data privacy practices fundamentally oppose their core business models: hyper-targeted advertising based on more and more intrusive personal surveillance. Change must come from the outside. Unfortunately, we've seen relatively little from Washington. Congress and federal agencies need to take a fresh look at what can be done to curb these data monopolies. They first need to demand more algorithmic and privacy policy transparency, so people can truly understand the extent of how their personal information is being collected, processed and used by these companies. Only then can informed consent be possible. They also need to legislate that people own their own data, enabling real opt-outs. Finally, they need to restrict how data can be combined including being more aggressive at blocking acquisitions that further consolidate data power, which will pave the way for more competition in digital advertising. 
Until we see such meaningful changes, consumers should vote with their feet.

Thursday, 1 February 2018

One of the more tricky issues revolving around the GPU shortages of the past several months has been the matter of how to address the problem on the GPU supply side of matters. While the crux of the problem has been a massive shift in demand driven by a spike in cryptocurrency prices, demand has also not tapered off like many of us would have hoped. And while I hesitate to define the current situation as the new normal, if demand isn’t going to wane then bringing video card prices back down to reasonable levels is going to require a supply-side solution.

This of course sounds a lot easier than it actually is. Ignoring for the moment that GPU orders take months to process – there are a lot of steps in making a 16nm/14nm FinFET wafer – the bigger risk is that cryptocurrency-induced GPU demand is not stable. Ramping up GPU production means gambling that demand will stay high enough for long enough to absorb the additional GPUs, and then not immediately contract and leave the market flooded with used video cards. The latter is an important point that AMD got burnt on the last time this happened, when the collapse of cryptocurrency prices, and with it the demand for video cards, left the market flooded with used Hawaii (290/390 series) cards.

Getting to the heart of matters then, in yesterday’s Q&A session for their Q4’2017 earnings call, an analyst asked AMD about the current GPU supply situation and whether AMD would be ramping up GPU production. The answer, much to my surprise, was yes. But with a catch.

Q: I just had a question on crypto, I mean if I look at the amount of hash compute being added to Ethereum in January I mean it's more than the whole of Q4, so we have seen a big start to the Q1. […] And is there any sort of acute shortages here, I mean can your foundry partners do they have the capacity to support you with a ramp of GPUs at the moment and is there enough HBM2 DRAM to source as well?

A: Relative to just where we are in the market today, for sure the GPU channel is lower than we would like it to be, so we are ramping up our production. At this point we are not limited by silicon per se, so our foundry partners are supplying us, there are shortages in memory and I think that is true across the board, whether you are talking about GDDR5, or you’re talking about high bandwidth memory. We continue to work through that, with our memory partners and that will be certainly one of the key factors as we go through 2018.

So yes, AMD is ramping up GPU production. Which is a surprising move since they were burnt the last time they did this. At the same time however, while cryptocurrency demand has hit both major GPU manufacturers, AMD has been uniquely hit as they’re a smaller player less able to absorb rapid changes in demand, and, more importantly, their GPUs are better suited for the task. AMD’s tradition of offering more memory bandwidth and more raw FLOPS than NVIDIA at any competing price point, coupled with some meaningful architectural differences, means that their GPUs are in especially high demand by cryptocurrency miners.
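The reason memory bandwidth matters so much to miners is easy to quantify for Ethereum's Ethash algorithm, which performs roughly 64 random 128-byte DAG reads per hash, making a memory-bound GPU's hashrate ceiling roughly bandwidth divided by 8 KB. A back-of-the-envelope sketch (the card bandwidth figures are illustrative):

```python
# Back-of-the-envelope Ethash hashrate ceiling from memory bandwidth alone.
# Ethash performs about 64 random reads of 128 bytes from the DAG per hash,
# so a memory-bound GPU tops out near bandwidth / (64 * 128 B).

ACCESSES = 64
MIX_BYTES = 128
BYTES_PER_HASH = ACCESSES * MIX_BYTES  # 8192 bytes read per hash

def max_hashrate_mhs(bandwidth_gbs: float) -> float:
    """Upper bound on Ethash MH/s for a given memory bandwidth in GB/s."""
    return bandwidth_gbs * 1e9 / BYTES_PER_HASH / 1e6

# Illustrative bandwidth figures for two AMD cards of the era.
for card, bw in [("RX 580 (GDDR5, ~256 GB/s)", 256),
                 ("Vega 64 (HBM2, ~484 GB/s)", 484)]:
    print(f"{card}: ~{max_hashrate_mhs(bw):.0f} MH/s ceiling")
```

Real cards land below this ceiling due to latency and overhead, but the bound explains why raw memory bandwidth, more than shader FLOPS, is what makes AMD's cards "digital gold" to miners.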

But perhaps the more interesting point here isn’t that AMD is increasing their GPU production, but why they can only increase it by so much. According to the company, they’re actually RAM-limited. They can make more GPUs, but they don’t have enough RAM – be it GDDR5 or HBM2 – to equip all of the cards AMD and board partners would like to make.

This is an interesting revelation, as this is the first time memory shortages have been explicitly identified as an issue in this latest run-up. We’ve known that the memory market is extremely tight due to demand – with multiple manufacturers increasing their RAM prices and diverting GDDR5 production over to DDR4 – but only now is that catching up with video card production to the point that current GDDR5 production levels are no longer “enough”. Of course RAM of all types is still in high demand here at the start of 2018, so while memory manufacturers can reallocate some more production back to GDDR5, GPU and board vendors have to fight with both the server and mobile markets, both of which have their own booms in demand going on, and are willing to pay top dollar for the RAM they need.

GDDR5: The Key To Digital Gold

In a sense the addition of cryptocurrency to the mix of computing workloads has created a perfect storm in an industry that was already dealing with RAM shortages. The RAM market is in the middle of a boom right now – part of its traditional boom/bust cycle – and while it will eventually abate as demand slips and more production gets built, for the moment cryptocurrency mining has just added yet more demand for RAM that isn’t there. Virtually all supply/demand problems can be solved through higher prices – at some point, someone has to give up – but given the trends we’ve seen so far, GPU users are probably the most likely to suffer, as traditionally the GPU market has been built on offering powerful processors paired with plenty of RAM for paltry prices. Put another way, even if the GPU supply situation were resolved tomorrow and there were infinite GPUs for all, RAM prices would be a bottleneck that kept video card prices from coming back down to MSRP.

With all that said, however, AMD’s brief response in their earnings call has been the only statement of substance they’ve made on the matter. So while the company is (thankfully) ramping up GPU production, they haven’t – and are unlikely to ever – disclose just how many more GPUs that is, or for that matter how much RAM they expect they and partners can get for those new GPUs. So while any additional production will at least help the current situation to some extent, I would caution against getting too hopeful about AMD’s ramp-up bringing the video card shortage to an end.

The underpinnings of Chrome OS have found their way into the server room in a very roundabout way. Red Hat has acquired CoreOS, the creators of an operating system for containerized apps (Container Linux) that shares roots with both Google's Chromium OS project and Gentoo Linux. The $250 million deal promises to help Red Hat fulfill its dreams of helping people use open code to deploy apps in any environment they like, whether it's on a local network or multiple cloud services.

CoreOS has played a particularly major role in Kubernetes, the Google-built open platform for deploying those containerized apps. It's the second-largest contributor to the project behind Google itself, Red Hat said. Additional tools like Tectonic and Quay have made it easier for big businesses to move and track apps.
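For readers unfamiliar with what such a platform actually manages, a minimal Kubernetes Deployment manifest shows the declarative style involved; the names and image below are placeholders:

```yaml
# Minimal Kubernetes Deployment: run three replicas of a containerized app.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: hello-app
spec:
  replicas: 3
  selector:
    matchLabels:
      app: hello-app
  template:
    metadata:
      labels:
        app: hello-app
    spec:
      containers:
      - name: hello-app
        image: example/hello-app:1.0
        ports:
        - containerPort: 8080
```

Tools like Tectonic sit on top of exactly this kind of spec, keeping the declared state running across clusters.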

You probably won't notice any of the Chrome OS influence at Red Hat. However, this shows just how far Google's web-centric platform has spread. Elements of an OS originally designed for frugal PCs will soon find their way into products from the open source world's biggest business provider -- Google definitely didn't anticipate that.