
For anyone using Windows 7 by way of Apple's Boot Camp utility, beware: support for Windows via Boot Camp remains, but for the newest Apple laptops, it's only for Windows 8 for now. From Slashgear:
This applies to the 2015 MacBook Air, and the 13-inch model of the 2015 MacBook Pro. Windows 8 will remain compatible, as will the forthcoming Windows 10. The 2013 Mac Pro also dropped Boot Camp support for Windows 7, while 2014 iMacs are still compatible, along with 2014 MacBook Airs and 2014 MacBook Pros.
For those who still prefer to run Windows 7 on their Macs, there are other options. This change to Boot Camp will not affect using the Microsoft operating system through virtualization software, such as Parallels and VMware Fusion. Also at PC Mag.

angry tapir writes: Reacting to the surging popularity of the Docker virtualization technology, Red Hat has customized a version of its Linux distribution to run Docker containers. The Red Hat Enterprise Linux 7 Atomic Host strips away all the utilities residing in the stock distribution of Red Hat Enterprise Linux (RHEL) that aren't needed to run Docker containers. Removing unneeded components saves on storage space and reduces the time needed for updating and booting up. It also provides fewer potential entry points for attackers. (Product page is here.)

jones_supa writes: Phoronix notes how it has been a long time since last hearing of any major innovations or improvements to VirtualBox, the virtual machine software managed by Oracle. This comes while VMware is improving its products on all platforms, and KVM, Xen, Virt-Manager, and related Linux virtualization technologies continue to advance as well. Is there any hope left for a revitalized VirtualBox? It has been said that there are only four paid developers left on the VirtualBox team at the company, which is not enough manpower to significantly advance such a complex piece of software. The v4.3 series has been receiving some maintenance updates during the last two years, but that's about it.

"My mental model of CPUs is stuck in the 1980s: basically boxes that do arithmetic, logic, bit twiddling and shifting, and loading and storing things in memory. I'm vaguely aware of various newer developments like vector instructions (SIMD) and the idea that newer CPUs have support for virtualization (though I have no idea what that means in practice). What cool developments have I been missing?"
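One concrete way to see which of these newer developments your own CPU supports is to look at the feature flags the Linux kernel exposes in /proc/cpuinfo. The flag names below (sse4_2, avx2, vmx, svm, aes) are real x86 feature bits; the helper function and the grouping are purely illustrative, not any standard API — a minimal sketch, assuming a Linux-style flags line:

```python
# Illustrative sketch: map a /proc/cpuinfo "flags" line to a few of the
# CPU feature families mentioned above. The groupings are this example's
# own invention; the flag names are genuine x86 feature bits.

FEATURE_GROUPS = {
    "SIMD / vector math": {"sse2", "sse4_2", "avx", "avx2", "avx512f"},
    "Hardware virtualization": {"vmx", "svm"},  # Intel VT-x / AMD-V
    "Crypto acceleration": {"aes", "sha_ni"},
}

def summarize_flags(flags_line):
    """Return, per group, the sorted list of flags present in flags_line."""
    present = set(flags_line.split())
    return {group: sorted(wanted & present)
            for group, wanted in FEATURE_GROUPS.items()}

# Abridged example flags line from a modern x86-64 CPU:
sample = "fpu sse2 sse4_2 avx avx2 aes vmx"
for group, found in summarize_flags(sample).items():
    print(group, "->", found or "not detected")
```

On a real system you would read the `flags` line from /proc/cpuinfo instead of the hard-coded sample; the `vmx`/`svm` bits are what hypervisors check for hardware virtualization support.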

jones_supa writes: Network World's analysis of publicly listed sponsors of 36 prominent open-source non-profits and foundations reveals that the lion's share of financial support for open-source groups comes from a familiar set of names. Google was the biggest supporter, appearing on the sponsor lists of eight of the 36 groups analyzed. Four companies – Canonical, SUSE, HP and VMware – supported five groups each, and seven others (Nokia, Oracle, Cisco, IBM, Dell, Intel and NEC) supported four. For its part, Red Hat supports three groups (Linux Foundation, Creative Commons and the Open Virtualization Alliance).

It's tough to get more than a general sense of how much money gets contributed to which foundations by which companies – however, the numbers aren't large by the standards of the big contributors. The average annual revenue for the open-source organizations considered in the analysis was $4.36 million, and that number was skewed by the $27 million taken in by the Wikimedia Foundation (whose interests range far beyond OSS development) and the $17 million posted by the Linux Foundation.

New submitter fourbadgers writes: CoreOS, the start-up making the CoreOS Linux distribution, has announced Rocket, a container management system that's an alternative to Docker. CoreOS is derived from Chrome OS and has a focus on lightweight virtualization based on Linux containers. The project has been a long-time supporter of Docker, but saw the need for a simpler container system after what was seen as scope-creep in what Docker provides.

jfruh writes: The ability to cram multiple virtual servers on a single physical computer is tempting — so tempting that many shops overlook the downsides of having so many important systems subject to a single point of physical failure. But how can you isolate your servers physically while still taking up less room? Matthew Mobrea takes a look at the options, including new server platforms that offer what he calls "dense isolation."

Nerval's Lobster writes: Every year, approximately 250,000 military personnel leave the service to return to civilian life. When the home front beckons, many will be looking to become IT professionals, a role that, according to the U.S. Bureau of Labor Statistics, is among the fastest growing jobs in the country. How their field skills will translate to the back office is something to ponder. With the advent of virtualization, mobile, and the cloud, tech undergoes rapid changes, as do the skill sets needed to succeed. That said, the nature of today's military—always on the go, and heavily reliant on virtual solutions—may actually be the perfect training ground for IT. Consider that many war-fighters already are IT technicians: they need to be skilled in data management, mobile solutions, security, fixing problems as they arise onsite, and more. Military personnel used to working with everything from SATCOM terminals to iPads are ideally suited for handling these issues; many have successfully managed wireless endpoints, networks, and security while in the field. Should programs that focus on placing former military personnel in civilian jobs focus even more on getting them into IT roles?

Andy Updegrove writes: The Linux Foundation this morning announced the latest addition to its family of major hosted open source initiatives: the Open Platform for NFV Project (OPNFV). Its mission is to develop and maintain a carrier-grade, integrated, open source reference platform for the telecom industry. Importantly, the thirty-eight founding members include not only cloud and service infrastructure vendors, but telecom service providers, developers and end users as well. The announcement of OPNFV highlights three of the most significant trends in IT: virtualization (the NFV part of the name refers to network function virtualization), moving software and services to the cloud, and collaboratively developing complex open source platforms in order to accelerate deployment of new business models while enabling interoperability across a wide range of products and services. The project is also significant for reflecting a growing recognition that open source projects need to incorporate open standards planning into their work programs from the beginning, rather than as an afterthought.

darthcamaro writes "Docker has become the new hotness in virtualization technology, but it is still a project led and backed by a single vendor: Docker Inc. Is that a problem? Should there be an open-source foundation to manage the governance and operation of the Docker project? In a video interview, Docker founder and Benevolent Dictator for Life Solomon Hykes says no."

An anonymous reader writes: Today, Red Hat unveiled Red Hat Enterprise Linux 7, with new features designed to meet both modern datacenter and next-generation IT requirements for cloud, Linux Containers, and big data. The new version includes Linux containers (LXC), which let Linux users easily create and manage system or application containers; improved MS Active Directory / Identity Management (IdM) integration; XFS as the default file system, scaling to 500 TB (additional file system choices such as btrfs, ext{3,4} and others are available); a new and improved installation experience; managing Linux servers with OpenLMI; enhancements to both NFS and GFS2; optimized network management and bandwidth; the use of KVM virtualization technology; and more. See the complete list of features here (PDF). CentOS 7 shouldn't lag too far behind, thanks to the recent cooperation between Red Hat and the CentOS project.

Mcusanelli (3564469) writes "Rackspace, Cumulus Networks and CoreOS have become members of the Linux Foundation to support open source networking, virtualization and cloud computing. The Linux Foundation said in a statement: 'From the virtualization layer to networking hardware, Linux and open source are critical to modern computing and a new generation of cloud services and applications. Today's new Linux Foundation members are part of this market shift and see open source as the lynchpin for optimal scalability, efficiencies, security and data center savings.'"

darthcamaro (735685) writes "Red Hat's open source oVirt project hit a major milestone this week with the release of version 3.4. It's got improved storage handling so users can mix and match different resource types, though the big new feature is one that seems painfully obvious. For the first time oVirt users can have the oVirt Manager and oVirt VMs on the same physical machine. 'So, typically, customers deployed the oVirt engine on a physical machine or on a virtual machine that wasn't managed or monitored,' Scott Herold, principal product manager for Red Hat Enterprise Virtualization said. 'The oVirt 3.4 release adds the ability for oVirt to self-host its engine, including monitoring and recovery of the virtual machine.'"
(Wikipedia describes oVirt as "a free platform virtualization management web application community project.")

darthcamaro (735685) writes "Docker has become one of the most hyped open-source projects in recent years, making it hard to believe the project only started one year ago. In that one year, Docker has now gained the support of Red Hat and other major Linux vendors. What does the future hold for Docker? Will it overtake other forms of virtualization or will it just be a curiosity?"

benrothke writes "At first glance, The Art of the Data Center: A Look Inside the World's Most Innovative and Compelling Computing Environments appears to be a standard coffee table book with some great visuals and photos of various data centers throughout the world. Once you get a few pages into the book, you see it is indeed not a light-read coffee table book, but rather an insightful book where some of the brightest minds in the industry share their insights on data center design and construction." Read below for the rest of Ben's review.

First time accepted submitter xyourfacekillerx writes "After a long hiatus from developing (ASP.NET), I decided to pick it up again. I need to learn .NET and SQL for my new job (GIS tech using ESRI software). Down the road they need a PHP website, tons of automation tasks, some serious data consolidation, and they want mobile apps in theory. This is not my job description, but I'm sure I can do it. Long story short, I need to set up a development environment on my home desktop so I can do all this in my spare time. Trouble is, I share the machine (Win 8.1, 2.7 dual-core Pentium something or other, with virtualization support). I want to avoid affecting the other users' profiles. I currently use my profile for music production (Reason) and photography (Photoshop, et al.), so it's already resource intensive on RAM, CPU and VMM. I'll be needing to install all of your basic Microsoft developer suites, IIS, SQL Server, Android SDK, Java SDK, device emulators, etc., plus AMP and finally GIS software. There will obviously be a lot of services running, long build times, and so on. To wit, I wouldn't be able to use my desktop for my other purposes like music editing. So I need some advice. Would it help to set up all these tools under a different account on the same Win 8.1 install? Or should I virtualize my development environment (and how?), and run the virtual machine side by side? Or should I add an HDD or secondary partition and boot to that when I intend to develop? I am poor ATM, but is there a cheap very mini PC I can place next to my desktop, run all my development software off it, and remote desktop into it? I've done a lot of googling the last week and haven't turned up anything, so I turn to Slashdot. Please help me get organized so I can start coding again."

Nerval's Lobster writes "Will Xbox One and PS4 emulators hit your favorite download Websites within the next few years? Emulators have long been popular among gamers looking to relive the classic titles they enjoyed in their youth. Instead of playing Super Mario Bros. on a Nintendo console, one can go through the legally questionable yet widespread route of downloading a copy of the game and loading it with PC software that emulates the Nintendo Entertainment System. Emulation is typically limited to older games, as developing an emulator is hard work, and an emulator must usually run on hardware that's more powerful than the original console. Consoles from the NES and Super NES era have working emulators, as do newer systems such as Nintendo 64, GameCube and Wii, and the first two PlayStations. While emulator development hit a dead end with the Xbox 360 and PS3, that may change with the Xbox One and PS4, which developers are already exploring as fertile ground for emulation. The Xbox One and PS4 feature x86 chips, for starters, and hardware-assisted virtualization can help solve some acceleration issues. But several significant obstacles stand in the way of developers already taking a crack at it, including console builders' absolute refusal to see emulation as even remotely legal."

dcblogs writes "IDC expects that anywhere from 25% to 30% of all the servers shipped next year will be delivered to cloud services providers. In three years, 2017, nearly 45% of all the servers leaving manufacturers will be bought by cloud providers. The shift is slowing server sales to enterprise IT. The increased use of SaaS is a major reason for the market shift, but so is virtualization to increase server capacity. Data center consolidations are eliminating servers as well, along with the purchase of denser servers capable of handling larger loads. The increased use of cloud-based providers is roiling the server market, and is expected to help send server revenue down 3.5% this year, according to IDC."

goodminton writes "I'm researching the long-term consistency and reproducibility of math results in the cloud and have questions about floating point calculations. For example, say I create a virtual OS instance on a cloud provider (doesn't matter which one) and install Mathematica to run a precise calculation. Mathematica generates the result based on the combination of software version, operating system, hypervisor, firmware and hardware that are running at that time. In the cloud, hardware, firmware and hypervisors are invisible to the users but could still impact the implementation/operation of floating point math. Say I archive the virtual instance and in 5 or 10 years I fire it up on another cloud provider and run the same calculation. What's the likelihood that the results would be the same? What can be done to adjust for this? Currently, I know people who 'archive' hardware just for the purpose of ensuring reproducibility, and I'm wondering how this translates to the world of cloud and virtualization across multiple hardware types."
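The submitter's worry has a concrete root: IEEE-754 floating-point addition is not associative, so any layer of the stack (compiler, math library, SIMD width, hardware exposed by the hypervisor) that reorders operations can change the low bits of a result. The values below are arbitrary examples chosen to make the effect visible, not anything specific to Mathematica or a particular cloud:

```python
# Floating-point addition is not associative: the same three numbers summed
# in a different order give different answers, because the small term is
# absorbed (rounded away) when added to the large one first.

a, b, c = 1e16, -1e16, 1.0
left_to_right = (a + b) + c   # the large terms cancel first, then +1.0
reordered = a + (b + c)       # the 1.0 is absorbed by -1e16 and lost
print(left_to_right, reordered, left_to_right == reordered)

# Summation strategy matters too: summing the same data sequentially vs
# in two halves (as a vectorized or parallel implementation might) can
# produce different results.
xs = [0.1] * 10
sequential = sum(xs)
pairwise = sum(xs[:5]) + sum(xs[5:])
print(sequential == pairwise)
```

This is why re-running an archived instance on different hardware can, in principle, yield bit-different results even with identical software: anything that changes evaluation order changes the rounding. Mitigations people use include fixed-order (compensated) summation, higher-precision or rational arithmetic, and recording tolerances rather than expecting bit-exact matches.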