The next time your boss complains when you suggest picking up secure USB sticks because of the price, you might want to reference this report from Kingston, which details several horror stories of what happens when policy towards portable storage is lax. We have seen Stuxnet recently, and there is a long list of tricks that can be played via the U3 autorun feature present on many USB devices.

This goes far beyond a complaint about using USB sticks received for free at trade shows or picked up on discount from Costco. The report cites an instance where unmarked USB sticks were left in obvious spots in government parking lots, and over half of them ended up being plugged into the work PC of the person who found them. Maybe now spending a little extra on secure USB sticks will seem a little more attractive to the beancounters.

Fountain Valley, CA -- August 9, 2011 -- Kingston Digital, Inc., the Flash memory affiliate of Kingston Technology Company, Inc., the independent world leader in memory products, today announced the results of a study conducted by the Ponemon Institute looking at USB prevalence and risk in organizations. The study found that inexpensive consumer USB Flash drives are ubiquitous in all manner of enterprise and government environments ― typically with very little oversight or controls, even in the face of frequent and high profile incidents of sensitive data loss. The Ponemon Institute is an independent group that conducts studies on critical issues affecting the management and security of sensitive information about people and organizations.

The study underscores the pressing need for organizations to adopt more secure USB products and policies. A group of 743 IT professionals and IT security practitioners from global companies based in the United States were polled, and all acknowledged the importance of USB drives from a productivity standpoint. They cautioned, however, about the lack of organizational focus regarding security for these tools to meet appropriate data protection and business objectives.

The most recent example of how easily rogue USB drives can enter an organization can be seen in a U.S. Department of Homeland Security test in which USBs were ‘accidentally’ dropped in government parking lots. Without any identifying markings on the USB stick, 60 percent of employees plugged the drives into government computers. With a ‘valid’ government seal, the plug-in rate reached 90 percent.

According to the Ponemon study, more than 40 percent of organizations surveyed report having more than 50,000 USB drives in use in their organizations, with nearly 20 percent having more than 100,000 drives in circulation. The study finds that a whopping 71 percent of respondents do not consider the protection of confidential and sensitive information on USB Flash drives to be a high priority. At the same time, the majority of these same respondents feel that data breaches are caused by missing USB drives.

The Ponemon study concluded that a staggering 12,000 customer, consumer and employee records were believed to be lost on average by these same companies as a result of missing USBs. According to a previously released Ponemon report, the average cost of a data breach is $214 per record, putting the potential average cost of lost records for the organizations surveyed in the Ponemon USB Flash drive study at upwards of $2.5 million (USD). Other key findings in the report include:

The majority of those organizations (67 percent) confirmed that they had multiple loss events – in some cases, more than 10 separate events.

Oversight and control of USBs in enterprises can be better:

Free USB sticks from conferences/trade shows, business meetings and similar events are used by 72 percent of employees ― even in organizations that mandate the use of secure USBs.

In terms of policies and controls, of the hundreds of IT professionals and IT security professionals polled, only 29 percent felt that their organizations had adequate policies to prevent USB misuse.
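The "$2.5 million" figure above follows directly from the study's two numbers. A quick sanity check (the record count and per-record cost are the report's; the multiplication is ours):

```python
# Sanity check of the Ponemon figures quoted above.
records_lost = 12_000      # average records believed lost per organization
cost_per_record = 214      # USD per record, from the earlier Ponemon report

total_cost = records_lost * cost_per_record
print(f"${total_cost:,}")  # $2,568,000 -- the "upwards of $2.5 million" cited
```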

“An unsecured USB drive can open the door for major data loss incidents,” said Larry Ponemon, Chairman and Founder of the Ponemon Institute. “Organizations watch very carefully, and put a plethora of controls around, what enters their businesses from cyberspace. This study drives home the point that they must also take a more aggressive stance on addressing the risks that exist in virtually every employee’s pocket.”

“Kingston believes a lack of oversight, education and corporate confusion are factors that lead to the overwhelming majority of data loss when it comes to USB Flash drives,” said John Terpening, Secure USB business manager, Kingston. “Organizations fear that any attempt to control a device like a USB is likely to be futile and costly, both in terms of budget and loss of productivity. However, a simple analysis of what an organization needs and the knowledge that there is a range of easy-to-use, cost-effective, secure USB Flash drive solutions can go a long way toward enabling organizations and their employees to get a handle on the issue.”

The Razer Hydra bears a passing resemblance to the Wii controller at first glance, but that impression is quickly dispelled when you realize you get two devices to hold. Both have four face buttons, a 'start' button, a clickable analog stick and two bumper triggers, which give you enough input options for PC gaming. The wired base station senses the small magnetic field the controllers emit, which is how the motion sensing works. That field was not enough to disturb any of tbreak's other equipment, which is vital to the success of the controller. As for gaming? With Portal 2 they had a blast, but when it came to other shooters ... not so much.

"The Hydra is not spectacularly different, it uses the same nun-chuck approach of the Wii, however its technology and precision far outclass Nintendo’s toy. According to Razer, the Hydra uses magnetic forces to detect the exact location and orientation of the controllers and delivers an “ultra-low latency”, “fluid and precise” gaming experience."

While the boys were having fun at an event in Texas, TechwareLabs were at a show of a completely different colour. Black Hat 2011, the yearly computer security convention, was also taking place in Las Vegas, bringing to light the discoveries of the past year when it comes to vulnerabilities and how to protect yourself against them. One of the topics for discussion was how Secure Sockets Layer works: it assumes that a Trusted Authority stands behind each security certificate, vouching for the secure connection between you and a site's servers. Over the past year we saw a hack at Comodo, a major Certificate Authority, which led to nefarious people getting their hands on certificates assigned to Microsoft, Yahoo and Google, allowing them to easily fool even a computer using SSL.

Take that as an example of the failure of single, large CAs as the way to implement SSL: if you were to stop trusting Comodo and its certificates, about a quarter of the secure sites on the net would no longer let you connect. Instead, a programmer detailed a Firefox extension called Convergence as an alternative. This distributed approach to certificate authentication would let you switch between trusting and distrusting particular CAs without damaging your ability to connect to secure sites on the web.

"This interesting presentation concerns a security protocol that you probably use every day. It is in your browser, on the server you connect to, and brought together by a “Certificate Authority”. The idea behind SSL is to provide a secure connection between you, the client browser, and the server providing the sensitive data to you. For instance a Bank website is designed to provide the client with convenient access to account details, transactions, etc. But there is a major issue with a pivotal player in this process. The Certificate Authority or CA is charged with certifying the organizations to which it provides certificates. The CA is supposed to be a trustworthy entity working on behalf of us, the end users, to ensure that any organization it issues a certificate to is credible and trustworthy. After all many users depend on the CA’s, SSL protocol, and issued certificates to enforce authentication and integrity in the online space. You have little choice but to trust the CAs and expect them to provide a high quality level of authentication services."

There is a lot of discussion over how expensive the PC is compared to the consoles. I have heard from a number of former PC gamers who switched to the console to escape the large cost of ownership. I have also heard from a number of console gamers who claim that they cannot afford a three-thousand-dollar gaming behemoth just to launch the typical PC game. Suffice it to say, my head has exploded more times than causality allows for.

Is your PC bleeding gushes of money?

Let us clarify something straight out of the gate before tl;dr kicks in: the true cost of a console is not the price you pay for the box itself. For proof, look at Sony: the cost to build the $499 PS3 at launch was $805.85 according to CNET. That means that for each PS3 they sold, they lost $306.85. You may think, “Pfft, that’s fine. They’ll make it up later.” Nope, it was mid-2010 before Sony made any money on each PS3 sold. They were bleeding for three years.

So where do Sony and Microsoft make their money? Firstly, Microsoft has that cash cow Xbox Live, which they have been milking for a substantial time now. You may consider $60 per year to be chump change, but after four years that tallies up to $240. I want you to consider the following: Xbox Live for four years, or a Radeon HD 6950 (bundled with Dirt 3) for four years without paying a cent more? (Actually, okay -- you pay 3 cents more at $59.99 per year.) It is also pretty much a given that not only will your games look better than on a 360 by a long shot, you will also still be able to physically play games in four years’ time. You might be turning the quality settings down to medium or low near the end of your card’s life cycle, but hey: at least you have the option of quality settings. Also, just because a console claims to run a game at a specific resolution does not mean it actually is. For instance, most Call of Duty games on the consoles are actually rendered at approximately 600p and up-scaled to their listed resolutions. To claim an upscaled 600p is 1080p would be like claiming an upscaled DVD is the same thing as a Blu-ray.
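The back-of-the-envelope comparison above works out as follows (the subscription price is the article's; the GPU's ~$240 street price is implied by the "without paying a cent more" framing, so treat it as an assumption):

```python
# Four years of Xbox Live vs. one mid-range GPU purchase.
live_per_year = 59.99      # Xbox Live annual fee quoted in the article
years = 4
gpu_price = 240.00         # assumed street price of the Radeon HD 6950 bundle

live_total = live_per_year * years
print(f"Xbox Live over {years} years: ${live_total:.2f}")   # $239.96
print(f"Difference vs. the GPU: ${gpu_price - live_total:.2f}")
```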

And this leads to our next point: you can buy a three-thousand-dollar computer. You can also buy a Porsche. You do not need a Porsche to drive to work, but there are some distinct advantages to owning one that make it viable for a portion of the market. The rest of us can be perfectly happy driving to work in a Hyundai or a Chevy. Besides, it’s cheaper than paying for a taxi. For good examples of cost-efficient PCs, check out our constantly updated Hardware Leaderboard. Technically a license for Windows is not included, which is the one kink in PC gaming's openness. Ideally we would all be running Linux or a similarly licensed OS, not just for cost but also for longevity. Videogames will struggle as timeless art so long as the platforms they run on are not timeless. Unfortunately, even in the PC gaming sphere there is no guarantee that the platform will not be torn out from under your dependent art. But at least the PC platform is not designed to be disposable like the consoles. It is the lesser of two evils, and baby steps toward an ideal future.

A moment of silence for your wallet.

So how much money are we talking about? I personally summed up how much I spent on the first Xbox in $10-per-game license fees and $60-per-year Xbox Live fees, which came to $520, excluding the cost of the system itself and accessories, which need to be replaced each generation for no sensible reason. Keep in mind, I was not a very extreme gamer, purchasing only five games per year on average. Had I been PC-exclusive, however, that would have been $500-some-odd dollars beyond the price of the system and accessories that I would not have needed to pay. The truth of the matter is that over the long run you pay more to be a console gamer than a PC gamer, unless you deliberately choose to pay more for your PC. Also, do not forget: due to the existence of proprietary platforms, if you own multiple systems because your games are only available on one or another, you are worse off still.
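That $520 breaks down roughly like this (the $10-per-game premium, five games a year, and $60 Live fee are the paragraph's own figures; the number of years is our inference, since the article does not state it):

```python
# Recurring console premium vs. PC, per the author's own numbers.
premium_per_game = 10      # console games cost ~$10 more than PC versions
games_per_year = 5
live_per_year = 60         # Xbox Live annual fee

yearly_premium = premium_per_game * games_per_year + live_per_year
print(f"${yearly_premium} per year")             # $110 per year
print(f"${yearly_premium * 5} over five years")  # $550, in the ballpark of the $520 cited
```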

There will be a follow-up article to this in the near future discussing what you are paying for with consoles – spoiler: it is, in general, not desirable.

The most important difference at Quakecon this year is that they can finally, for the first time in years, promote an upcoming in-house title. RAGE was definitely all the rage this year, as John Carmack spent the majority of his keynote discussing the technical decisions made during the game's development for the 360, the PS3, and the PC. The biggest take-away a lead game programmer could learn from the keynote is that you should never tell the artist team, when approached about design specifications, “Make beautiful stuff and we’ll figure out how to make it work.” However, for the rest of us not in id’s programming department, we get to see what a comment like that looks like in the new RAGE trailer.

Reminded of USENET a decade ago, “Syntax Error: Asking for the best computer, money is no object.”

Something has us believing that this will be a very profitable year for Bethesda’s parent company, ZeniMax. With two large games, RAGE and Elder Scrolls V, coming out this autumn from ZeniMax, we should hopefully see them able to reinvest and grow over the coming years. As for the game itself, I get three distinct vibes from the most recent cinematic trailer: the first is Doom 3, felt most in the intro as the player is loaded into the pod-like device; the second is Fallout 3, from how the characters interact; the third vibe I cannot pin to any given game, and it mostly consists of the vehicular aspect of the trailer. What does it remind our readers of? (Registration not required for commenting.)

Want to rest your eyes from all of the Quakecon coverage? How about another Sony tablet ad? The first three parts of Sony’s S1 and S2 ad campaign are behind us with the conclusion of this five part series occurring in the fourth part. Frankly I do not really understand it either, but apparently the fifth ad will be a collection of the previous four making the fourth one the actual finale of a series of five. I guess that somewhat makes sense: what better way to promote the products’ collective slogan “Open Your Imagination” than blowing your mind? I say nothing.

This Two Will Passed… okay? Go play it.

The title of this video is “Together anywhere” and it features an unsurprising number of tracks for anyone who watched any of the preview videos. Besides the metal rails, be sure to pay close attention to the setup prior to the couch station, because you will drop bricks at the end of the video. This is also the first time that lyrics appear in the ads, which puts a very uplifting feel on the campaign. While not as metaphorical as the first two parts suggested, I believe they got their point across. Now all that is left to do is see if it will translate to sales.

Digitimes reported today that Intel will be meeting with its Original Design Manufacturer (ODM) partners in Taipei next week to discuss the Bill of Materials (BOM) that outlines the components to be used in Intel's Ultrabook notebook class. The goal of the meeting will be to tweak the Bill of Materials such that the initial selling price will be below $1,000 USD.

Intel has further broken up the Ultrabook category into two thickness classes of 18mm and 21mm. The 18mm reference designs, of which Intel has rendered five, have thus far omitted any optical drives. An example of the 18mm design can be seen in the upcoming Asus UX21 and UX31 ultrabooks. The proposed Bill of Materials for the 18mm ultrabooks is between $493 and $710 USD while the 21mm ultrabooks BOM is between $475 and $650 USD.

Beyond the Bill of Materials, the site notes that Intel is further planning to release next generation ultrabooks based on 22nm Ivy Bridge processors in 2012 and 22nm Haswell CPUs in 2013. These ultrabooks will come in sizes ranging from 11" to 17". The 11" to 13" models will have a thickness of 18mm, while the 14" to 17" models will be of the 21mm variety.

Apple Insider notes that Intel's push to keep the cost of materials and initial selling price for its ultrabooks below $1,000 may be due to the $999 entry-level MacBook Air selling so well, and to Intel's desire to provide a competitive product that can match the thinness of the Mac notebooks while being priced to sell. Do you think Intel's ultrabooks will catch on with consumers, or will it be another niche and/or gimmick product?

A lot of news is blowing up about the exciting conference happening right now called Quakercon. For those of us not lucky enough to bask in the PepsiCo subsidiary that popularized Amish oatmeal delight there is another, smaller conference going on right now called “Quakecon”. Frankly, I think they’re ripping off our wonderful breakfast food company. Still, if you cannot check out Quakercon treating us to tonnes of steamed meals – why not check out Quakecon treating us to tonnes of Steam deals!

Yes I realize there is no such thing as Quakercon… … yet.

In the event that you are following Quakecon and for some reason do not own many iD or Bethesda games, there is a bundle that will roll you pretty much entirely up to date for just shy of $70. The Quakecon Pack 2011 contains the following:

Quake III Arena

Quake IV

Wolfenstein 3D

The Ultimate Doom

Final DOOM

DOOM II

QUAKE

QUAKE II

QUAKE II Mission Pack: The Reckoning

QUAKE II Mission Pack: Ground Zero

QUAKE III: Team Arena

HeXen: Beyond Heretic

HeXen: Deathkings of the Dark Citadel

Heretic: Shadow of the Serpent Riders

Spear of Destiny

Return to Castle Wolfenstein

QUAKE Mission Pack 2: Dissolution of Eternity

QUAKE Mission Pack 1: Scourge of Armagon

DOOM 3

HeXen II

DOOM 3 Resurrection of Evil

Master Levels for Doom II

Commander Keen

Rogue Warrior

The Elder Scrolls III: Morrowind® Game of the Year Edition

Call of Cthulhu®: Dark Corners of the Earth

BRINK

Fallout 3: Game of the Year Edition

Fallout: New Vegas

Hunted: The Demon’s Forge™

Fallout New Vegas: Dead Money

Fallout New Vegas: Honest Hearts

Fallout New Vegas: Old World Blues

The Elder Scrolls IV: Oblivion® Game of the Year Edition Deluxe

If $70 is too much for you but you have plans to pre-order Elder Scrolls V: Skyrim and RAGE, doing so will knock the price of this combo pack down to $40. Also, you will likely not install even a third of this pack at best, especially if you plan on putting a dent into RAGE and Skyrim.

"You ever heard the numbers thrown around about how many wafers will be produced by a fab? GlobalFoundries is telling us Fab 8 will build processors using up to 60,000 wafers a month when it is under full production. Have you ever wondered how all those wafers get to where they need to be? We show you how that happens."

TIBURON, CA-August 4, 2011—Jon Peddie Research (JPR), the industry's research and consulting firm for graphics and multimedia, announced estimated graphics chip shipments and suppliers’ market share for Q2’11.

Shipments during the second quarter of 2011 did not behave according to past years with regard to seasonality, and were higher on a year-to-year comparison for the quarter. 2011 is shaping up to be an anomalous year as businesses take their own paths to recovery.

Normally, the second quarter of the year is a slower business quarter in the graphics industry (and in the PC industry as a whole). This year, Q2’11 did not conform to the normal seasonal cycle. Instead, sales were up significantly compared to previous years. The growth in Q2 comes as a welcome change, if not a bit worrying—is it inventory building for back to school and the holiday season, or channel stuffing?

Our forecast for the coming years has been modified since the last report, and is less aggressive on both desktops and notebooks—tablets have changed the nature of the PC market. Our findings include Desktops, Notebooks (and Netbooks), and PC-based commercial (i.e., POS) and industrial/scientific and embedded; and do not include handhelds (i.e., mobile phones), x86 Servers or ARM-based Tablets (i.e. iPad and Android-based Tablets), Smartbooks, or Servers.

The quarter in general

In Q2’11, Intel celebrated its sixth quarter of Embedded Processor Graphics CPU (EPG, a multi-chip design that combined a graphics processor and CPU in the same package) shipments, and enjoyed a 21% average growth in Desktops and Notebooks.

AMD and Nvidia lost in overall market share, while Intel grew compared to last quarter.

Year to year this quarter Intel had tremendous market share growth (14.7%), AMD had a loss of 14.2%, and Nvidia slipped 18.4% in the overall market partially due to the company withdrawing from the integrated segments.

Total shipments in Q2’11 increased 6.3% from the previous quarter, significantly above the ten-year average of 3.5%, raising concerns about an inventory buildup.

Over 84 million PCs shipped worldwide in Q2’11, an increase of 2.4% compared to Q1’11, (based on an average of reports from Dataquest, IDC, and HSI) causing speculation that the 6.3% up-swing in graphics could be an inventory buildup and have a negative impact on Q3 or Q4.

AMD’s HPU quarter-to-quarter growth has been extraordinary at an average of 80% for desktop and notebook, and Intel’s EPG growth was significant at an average of 41%. This is a clear showing of the industry’s affirmation of the value of CPUs with embedded graphics and is in line with our forecasts. The major, and logical, impact is on older IGPs, and some on low-end add-in boards (AIBs).

Graphics chips (GPUs) and chips with graphics (IGPs, HPUs, and EPGs) are a leading indicator for the PC market. At least one, and often two, GPUs are present in every PC shipped, whether as a discrete chip, a GPU integrated in the chipset, or one embedded in the CPU. The average has grown from 115% in 2001 to almost 160%: that is, nearly 1.6 GPUs per PC.
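Putting that attach rate together with the PC shipment figure quoted earlier in this report gives a rough unit count (both inputs are the report's numbers; the multiplication is ours):

```python
# GPUs shipped implied by the PC shipment count and the attach rate.
pcs_shipped_m = 84         # million PCs shipped worldwide in Q2'11
gpus_per_pc = 1.60         # ~160% attach rate (IGP/EPG plus discrete)

gpu_units_m = pcs_shipped_m * gpus_per_pc
print(f"~{gpu_units_m:.0f} million graphics units in the quarter")  # ~134 million
```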

Since the crash of 2009, combined with the introduction and influence of ARM-based Tablets, the PC market has deviated from historical trends. Until the Tablet segment is clearly defined, the fluctuations in the market data are likely to continue. The disruptions probably won’t settle down for a while, as Tablets find their place in the market and agreement is reached on whether or not to include them in PC market analysis.

Market shares shifted for the big three and put pressure on the smaller three, who showed a decrease in shipments, as indicated in Table 1 (units are in millions).

Just recently we looked at a Tom’s Hardware review of CPU architectures since about 2005. The performance of the CPU itself was not covered in the review, as that was not the purpose of the article; the question investigated was whether there has been much innovation in the architectures themselves, or whether companies were just ramping up the clock rate and adding more cores to get their performance. Implied in the article’s findings was the extent to which Intel was relying on a higher clock rate to even be comparable to AMD at the time, and even whether they were comparable is debatable. At some point AMD decided to change tactics and stop ranking their processors by clock rate, due to the huge disparity between Intel’s performance and their own at any given clock. This drew some flak in the forums but ended up sticking, as even Intel eventually dropped the gigahertz moniker.

I owned a Core 2 Duo E6600 MHz! It’s so fast they needed to count in hex!

Scott, not me but another Scott, accused AMD back in 2001 of confusing users about the actual clock rate of their products. That post was crushed by video gaming’s most popular astrophysicist: yes, exactly. That didn’t stop the debate about whether that is an ethical thing to do, whether Intel’s ethics are any better, or whether they’re hypocrites. Regardless, the soapbox was eventually put away and everyone went back to their lives.

Could it be? Is there an actual explanation as to why every single Steam game you ever bought just has to install DirectX, even though you just installed it for that last game you bought and the one before that and the one before ...

Rock, Paper, Shotgun has the explanation of what is going on, though it is up to you to decide if it is reasonable or not. Gone are the days of one DirectX fitting all games; instead, each of the currently used versions (DX9, DX10 and DX11, depending on your software and hardware) has many sub-versions. In DX9's case there are over 40 versions of a D3D helper library called D3DX, that number grows with DX10 and DX11, and that is before you toss in 32-bit versus 64-bit OS versions.

Doesn't it make you happier to know the reason why you are stuck watching that stupid progress bar slowly grow instead of being able to play the game you just bought?

"Oh God, not again – can’t I just play the damned thing? WHY? [Stomp, stomp, stomp.]" This is a sound surely as familiar to the residents of the Brunswick area of Brighton as are the constant squawks of seagulls fighting over the contents of their recycling boxes. This is a sound I make, or at least variations upon it, every single time I first run a game I have downloaded via Steam. This time, I always think. This time it won’t ask me to install DirectX again first. Surely the 1023rd time’s the charm. That dream will likely never come to pass. However, at least we now know why – Valve have explained this particularly modern annoyance."
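The proliferation being described is visible in the DLL naming alone. As a rough sketch (the 24-through-43 version range is our recollection of the DX9-era D3DX releases, not a figure from the article, so treat it as an assumption):

```python
# Each DirectX SDK update shipped a new, separately versioned D3DX helper DLL,
# and 32-bit and 64-bit systems each need their own copy -- which is roughly
# where the "over 40 versions" for DX9 comes from, and why installers keep
# re-running the DirectX redistributable.
d3dx9_names = [f"d3dx9_{n}.dll" for n in range(24, 44)]
print(len(d3dx9_names), "distinct DX9-era D3DX DLL names")      # 20
print(len(d3dx9_names) * 2, "files counting 32- and 64-bit")    # 40
```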

Look at that UX21 there, isn't it gorgeous? Only 1.1kg of brushed aluminium, fully kitted out with a new-style lithium battery, only 1.7cm thick, and featuring ASUS' Instant On technology, which will boot you to the desktop in 5 seconds. It was shown off as the shining example of what Intel's Ultrabook could be at this year's CES, and everyone who saw it loved it. It seemed that Intel was going to go straight for the core of Apple's ultra-light market; not that their processors aren't already in Apple's MacBooks, but it is nice to keep the PC partners happy as well.

DigiTimes has heard from a few manufacturers and is ready to add a large lead weight to the Ultrabook, the same weight that dragged down the CULV: namely, price. When competing with Apple, the number one thing you need to do is beat them on price. You might be able to match their quality of design, or match them on the size or even the weight of the notebook, but the problem is that Apple was there first. Consumers know Apple's ultramobile platforms and have been using them for years, so the one thing a newcomer trying to steal market share from Apple must beat is Apple's price. The manufacturers that DigiTimes talked to placed the cost of the components needed to meet Intel's specifications at roughly $1,000, which is the market price of a lower-end MacBook Air. Since businesses like to make a bit of profit, as does everyone else in the supply chain, the cost of even a low-end Ultrabook will be higher than an equivalent MacBook. Unless Intel is willing to drop prices, the Ultrabook will likely do even worse than the CULV; at least the CULV had a mobile power-user niche to crawl into and hide.

"While Intel is positioning ultrabook as a set of specifications to enable partners to design notebooks imitating MacBook Air, ultrabooks may encounter the same frustrations as CULV notebooks did if prices are not lower than those of the Air, according to sources from Taiwan-based supply chain makers.

The sources pointed out that Intel's ultrabook concept is not a brand new innovation, but a design to allow first-tier notebook players to quickly catch up with Apple's advances in the ultra-thin segment and help the notebook industry recover from the impact of tablet PCs.

Intel has been hosting conferences with the upstream notebook supply chain about its ultrabook since the second quarter and is providing suggestions and assistance in designing related components and methods for reducing costs. Ultrabooks will feature a similar design as MacBook Air and adopt li-polymer batteries, which will completely remove the device's capability of exchanging the battery, to significantly reduce weight, while the machine will adopt metal chassis for heat dissipation and a solid state drive (SSD).

In addition, all the components will be soldered on to the machine's PCB to save space and reduce weight, but the new methods will completely change the existing notebook production process of combining several modules together.

The sources pointed out that the new MacBook Airs are priced at about US$999-1,599 with rather strong demand in the US; however, designing an ultrabook based on Intel's technical suggestions will still be unable to reduce the machine's price level to lower than the MacBook Air's unless Intel is willing to reduce its prices, which already account for one-third of the total cost. If Intel does reduce its prices there is a chance for vendors to provide pricing below US$1,000."
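The sources' "one-third of the total cost" claim implies a concrete number at the quoted component cost (the $1,000 figure is theirs; the division is our illustration):

```python
# Intel's slice of the ultrabook bill of materials, per the quoted sources.
total_cost = 1000          # approximate component cost in USD
intel_share = total_cost / 3

print(f"Intel's estimated share: ~${intel_share:.0f}")       # ~$333
print(f"Everything else: ~${total_cost - intel_share:.0f}")  # ~$667
```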

Mobile gaming has seen a relatively sharp rise in popularity in recent years thanks to powerful smartphones and personal media players like the iPod Touch and its accompanying App Store. Mobile networks, powerful systems-on-a-chip (SoCs) capable of 3D graphics, lighting, and physics, and a large catalog of easy-to-download games have created an environment where people actually want to play games on their mobile devices. Many people now indulge in quick Angry Birds sessions while in long lines, on work breaks, or wherever they have time when out and about.

One area where mobile devices have not caught on, however, is at home, where they face stiff competition from game consoles and the PC. That competition has not stopped numerous manufacturers from trying to build an all-in-one mobile console that is portable and easy to plug into a larger display at home. Everything from cheap controllers with built-in logic for playing old arcade games to smartphones with HDMI outputs costing hundreds of dollars has passed through consumers' hands; however, the mobile console has yet to overcome the sheer mind share of dedicated game consoles and PCs.

According to AnandTech, Qualcomm, a popular manufacturer of ARM SoCs for smartphones, has announced its plans to pursue that vision of an integrated, mobile console. They claim that the increased power provided by next-generation SoC technology will allow tablets and smartphones to deliver graphics better than those of current dedicated game consoles like the PS3 and Xbox 360. With Sony and Microsoft wanting to extend the lives of their consoles well into the future, mobile technology may well surpass them. The company "is committed to delivering both the hardware and the software support needed to bring developers to these mobile platforms," according to AnandTech.

Qualcomm wants to bring portable consoles to the masses powered by their SoCs and backed by their software. The tablets and smartphones would be able to connect to displays using HDMI or wireless technology in addition to supporting controllers (or acting as a controller itself). Further, the games library will be the culmination of software from all platforms and will rival the graphical prowess of the current consoles. Qualcomm hopes that a large library and capable hardware will be enough to entice consumers to the idea of a portable console becoming their all-in-one gaming device.

Portable consoles are similar to tablets and 3D television in that there is a major push for them every few years, a few devices come out, and then the idea dies off only to be reborn a few years later. Whether Qualcomm is able to pull off its plans for a portable console remains to be seen; however, the device is bound to catch on at some point. At the very least, this is certainly not the last time we will hear about the portable console. You can see more of Qualcomm's plans here.

What do you believe is holding back the portable console from catching on with consumers? Is it a good idea in the first place?

As Superman fans well know, Kal-El is faster than a speeding bullet, and NVIDIA’s new Tegra 3 Kal-El chip is no different. We reported on a demonstration of the Kal-El chip running games with dynamic lighting and realistic cloth physics earlier this year, and it is certainly an impressive mobile chip.

Speaking of "impressive," Asus chairman Jonney Shih was recently quoted by Forbes as saying that the upcoming Transformer 2 would be "impressive." While Shih was not able to share any details about the device in question, he did mention that Asus will be unveiling new tablets before the end of this year. With NVIDIA's Kal-El chip set to launch this month, the timing is certainly favorable for a quad-core Transformer 2.

The original Transformer: will the second iteration have even more oomph?

Of all the Android tablets, the Transformer has been one of the best received; therefore, it seems likely that Asus would pursue another iteration of the device. Whether that device will be powered by the Tegra 3 chip is still uncertain, however. Do you think the rumor of a quad-core Transformer is likely, or is this something that is "too good to be true?"

It is common knowledge that computing power consistently improves over time as dies shrink to smaller processes, clock rates increase, and processors do more and more in parallel. One thing people might not consider: how fast is the architecture itself? Think of computing in terms of a factory: you can speed up the conveyor belt and you can add more assembly lines, but just how fast are the workers? There are many ways to increase the efficiency of a CPU, from tweaking the most common instructions or adding new instruction sets that simplify the task itself, to tuning the pipeline length for the proper balance between constantly feeding the CPU upcoming instructions and needing to dump and reload the pipe when you go the wrong way down an IF/ELSE statement. Tom's Hardware wondered about exactly this and tested a variety of processors released since 2005, with their settings modified so that each could use only one core clocked at 3 GHz. Can you guess which architecture failed the most miserably?

Pfft, who says you ONLY need a calculator?

(Image from Intel)

The Netburst architecture was designed to reach very high clock rates at the expense of heat -- and performance. At the time, the race between Intel and its competitors was all about clock rate: the higher the clock, the better for marketers, despite a 1.3 GHz Athlon wrecking a 3.2 GHz Celeron in actual performance. If you are in the mood for a little chuckle, that marketing strategy was destroyed when AMD decided to name its processors "Athlon XP 3200+" and so forth rather than by their actual clock rates. One of the major reasons Netburst was so terrible was branch prediction. Pipelining is a method of loading multiple instructions into a processor to keep it constantly working; branch prediction is a strategy to speed this up further. When the processor reaches a conditional jump, such as "if this is true do that, otherwise do this," it does not know for sure what will come next, so the predictor says "I think I'll go down this branch" and fills the pipeline assuming that is true; if the guess is wrong, the pipeline must be dumped and the mistake corrected. One way Pentium 4's Netburst kept clock rates high was a ridiculously long pipeline, two to four times longer than that of the first-generation Core 2 parts which replaced it; unfortunately, the Pentium 4's branch prediction was terrible, leaving the processor perpetually stuck dumping its pipeline.
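To make the predict-then-maybe-dump cycle concrete, here is a minimal sketch of a classic 2-bit saturating-counter branch predictor, the textbook scheme that real predictors of that era elaborated on. The class and variable names are illustrative only, not taken from any actual CPU design.

```python
class TwoBitPredictor:
    """2-bit saturating counter: states 0-1 predict 'not taken', 2-3 'taken'."""

    def __init__(self):
        self.counter = 2  # start in the weakly-taken state

    def predict(self):
        return self.counter >= 2

    def update(self, taken):
        # Saturate at 0 and 3 so one surprise doesn't flip the prediction.
        if taken:
            self.counter = min(3, self.counter + 1)
        else:
            self.counter = max(0, self.counter - 1)


def misprediction_rate(outcomes):
    """Fraction of branches where the predictor guessed wrong
    (each miss would cost a full pipeline flush)."""
    p = TwoBitPredictor()
    misses = 0
    for taken in outcomes:
        if p.predict() != taken:
            misses += 1
        p.update(taken)
    return misses / len(outcomes)


# A loop branch: taken 9 times, then not taken at loop exit, repeated.
loop_branch = ([True] * 9 + [False]) * 10
# A strictly alternating branch, which this predictor handles badly.
alternating = [True, False] * 50

print(misprediction_rate(loop_branch))   # only the loop exits are missed
print(misprediction_rate(alternating))   # every 'not taken' is missed
```

The loop pattern shows why the scheme works well on ordinary code (one miss per loop), while the alternating pattern shows how a predictor-hostile branch keeps flushing the pipeline -- the effect that, scaled up to Netburst's very long pipeline, made mispredictions so expensive.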

The sum of all tests... at least time-based ones.

(Image from Tom's Hardware)

Now that we have excavated Intel's skeletons and aired them out, it is time to bury them again and look at the more recent results. On the AMD side of things, there has not been much innovation in efficiency; AMD is only now getting within range of the architectural efficiency Intel had back in 2007 with the first introduction of Core 2. Obviously, efficiency per core per clock means little in the real world on its own, as it tells you neither the raw performance of a part nor how power efficient it is. Still, it is interesting to see how big a leap Intel made when it walked away from its turkey of an architecture, Netburst, and modeled the future on the Pentium III and Pentium M architectures. Lastly, despite the lead, it is interesting to note exactly how much work went into the Sandy Bridge architecture. Intel, despite an already large lead and a focus outside the x86 mindset, still tightened up its x86 architecture by a very visible margin. It might not be as dramatic as the abandonment of the Pentium 4, but it is laudable in its own right.
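Tom's Hardware normalized in hardware by locking every chip to one core at 3 GHz; an equivalent way to express "efficiency per core per clock" is simply to divide a throughput-style benchmark score by cores times clock. The sketch below uses made-up scores for two hypothetical chips purely to illustrate the arithmetic.

```python
def per_core_per_clock(score, cores, ghz):
    """Normalize a benchmark score to 'score per core per GHz'.
    Higher means a more efficient architecture, regardless of
    how many cores or how much clock the part ships with."""
    return score / (cores * ghz)


# Hypothetical parts: a quad-core and a dual-core, both at 3.0 GHz.
old_chip = per_core_per_clock(score=600.0, cores=4, ghz=3.0)  # 50.0
new_chip = per_core_per_clock(score=360.0, cores=2, ghz=3.0)  # 60.0

# The chip with the lower absolute score is the more efficient design.
print(old_chip, new_chip)
```

This is exactly why the normalized chart above can rank a slower-selling-point chip above a faster one: it isolates the "workers" from the conveyor belt and the number of assembly lines.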

If you find yourself gaming in a noisy environment and are trying to keep your own contribution to the noise down by using a headset, it can be frustrating when you cannot hear the game you are playing. ASUS has a way to solve that, thanks to the active noise cancellation in its Republic of Gamers Vulcan ANC Pro Gaming Headset. Red & Blackness Mods tried out a pair for review and were impressed by the light weight of the headset as well as the detachable mic for when you don't need to communicate with teammates. They were not overly impressed with the sound quality, but as these are designed specifically for gaming that is not a major concern, and not aiming for high-end audio helped keep the price down.

"Asus mostly known for their high end laptops and motherboards have recently started pumping out various accessories and even touchpads. Today we are taking a look at the Asus Vulcan ANC Pro Gaming headphones that you can pick up for around 50$. What type of quality and sound quality can we expect from these?"

At last year's Intel Developer Forum, the star of the show was Sandy Bridge, as we had not yet seen the chip in action. That seems longer than a year ago, but it was only last September, which means we are drawing close to the 2011 IDF. According to DigiTimes, this year the focus will be on mobility products; as we know, Intel is working on a new(ish) form factor it calls the Ultrabook, which will replace the CULV form factor we have known previously. Not many Ultrabook-branded Sandy Bridge products were released this year, and with the upcoming release of Ivy Bridge it seems there won't be many in the future, as Ivy Bridge is intended to do everything Sandy could while using less power. The other focus DigiTimes expects to see, as do we at PC Perspective, is more information on Haswell, the next generation of Intel chip architecture. We don't expect to see working silicon until 2013, but you can expect a lot more information about the instruction sets it will use, which is only to be expected at a developer conference.

"CPU maker Intel is set to host its Intel Developer Forum (IDF) show in San Francisco from September 13-15 and its plans for ultrabook, upcoming Ivy Bridge- and Haswell-based processors as well, as its strategy for next-generation tablet PC processor, are all expected to become focuses at the show, according to sources from PC players.

Intel, was originally set to enter the Ivy Bridge-based CPU generation in the fourth quarter of 2011, but after considering the yield rate of the 22nm process and the market status worldwide, the company, in the end, decided to postpone the launch of the new-generation CPU to March of 2012, allowing a smooth transition between the two generation of CPU structures.

In addition to the Ivy Bridge structure, Intel is also set to reveal the detail specifications of its Haswell processor, which will appear in 2013, at IDF in September, the sources noted.

As for ultrabook, Intel will display several completed models from the first-tier notebook players including Asustek Computer, Hewlett-Packard (HP) and Lenovo, at the show with Intel is also expected to provide the detail of its ultrabook design concept as well as its three stages of execution plan for the device.

For tablet PCs, Intel has been cooperating with Google to pair up its Atom Z670 processor with Android 3.0, while will showcase its latest progress in MeeGo and AppUP Center at the show."

The Firefox UX development team recently posted a presentation showing off some of the latest design and UI (user interface) improvements for Mozilla's popular Firefox web browser. While not all of the design choices shown in the presentation will make it into Aurora or other beta builds, they do indicate that Mozilla is at least considering mixing up its traditional interface for upcoming releases. The image below is one of the screenshots included in the presentation, and at first glance it may be mistaken for Google's Chrome browser. Upon closer inspection, however, it becomes clear that Mozilla has not simply copied Chrome's minimalist design: it has gone with a similar tab design, continued the transparency already present in certain builds, and sprinkled some Mozilla flair on top to create one possible look for a future Firefox browser.

Other proposed design changes include a new icon-based menu (rather than word lists) located on the right side of the window, as well as an improved full-screen experience that seeks to give web apps the screen real estate they need. A new home tab and add-on manager interface are also proposed. As shown in the screenshot above, tabs that are not in focus have their backgrounds become fully transparent so that only the text is visible. This definitely helps the active tab stand out and may help reduce the distraction users face when multiple tabs are open.

While these are only proposed changes, it is apparent that Mozilla is planning some kind of major UI overhaul if it can get users to accept one, and the next major release may well have a slightly more Chrome-esque appearance with that special Firefox flair. What are your thoughts on the proposed designs; do they seem likely? If you are still using Firefox, what features of other browsers would you like to see it emulate?

Windows XP is almost old enough for revisionist historians to have a crack at it without anyone speaking up in its defense -- or it would be, if not for the large number of users still running the operating system at home and at work. The decade-old operating system has only now fallen below the 50% mark in operating system market share. More specifically, the slip occurred between June and July, when XP fell 0.63% to a total of 49.94%.

The numbers are percentages of MS's total 87.66% market share.

In comparison, Windows Vista holds a much smaller 9.24% market share after dropping 0.28%. Microsoft's most recent operating system, Windows 7, meanwhile, saw a gain of 0.74% to a total of 27.87% market share, which puts the newer operating system well on its way to overtaking the XP juggernaut. Techspot has the full scoop on the market share situation, which you can read about here.
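Since the caption above notes that these figures are slices of Microsoft's 87.66% overall share, it is easy to restate them as shares of Windows installs alone. The snippet below does just that arithmetic, using only the percentages quoted in this post:

```python
# Convert overall-OS-market percentages into shares of Windows alone
# by dividing each slice by Microsoft's 87.66% total.
windows_total = 87.66
overall = {"Windows XP": 49.94, "Windows Vista": 9.24, "Windows 7": 27.87}

within_windows = {name: round(pct / windows_total * 100, 1)
                  for name, pct in overall.items()}

print(within_windows)
# XP is still roughly 57% of Windows installs; Windows 7 about 31.8%,
# Vista about 10.5% (older versions make up the small remainder).
```

Seen this way, XP's dominance within the Windows install base is even starker than the headline "below 50%" figure suggests.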