The Register sat down with Alex Thurber, a BlackBerry senior VP, to discuss the company's plans to license its particular flavour of Android to other phone manufacturers. Thurber has worked at Cisco, at McAfee after Intel's purchase of that company, and at a firewall company called WatchGuard, so he has some experience with locking down kit. We will still see two more BlackBerry devices before the company finally stops selling hardware, but you should expect to see other brands running BlackBerry-licensed versions of Android soon. They will carry NIAP (National Information Assurance Partnership) certification, the same certification that Samsung's KNOX and LG's GATE qualify for. Drop by for a deeper look into what they discussed.

"BlackBerry says it won’t license its brand and security hardened Android “to any Tom Dick and Harry” as it tries to maintain the value of its brand."

The good news about this hack is that an attacker would need good timing and physical proximity to the wireless remote which instructs the pump to administer insulin; the bad news is that this is all that is needed, and it could result in the death or hospitalization of the target. The vulnerability stems from the usual problem: the transmission between the remote and pump is done in the clear, letting anyone who is listening retrieve serial numbers and codes. With that information you can then trigger a dose to be delivered, or quite feasibly change the default dosage the pump delivers, as was done previously with a different model.

IoT security as it applies to fridges and toasters is one thing; medical devices are quite another. News of unauthorized access to pacemakers and other drug delivery systems, access which could result in death, is not uncommon, yet companies continue to produce insecure systems. Adding even simple encryption to transmissions, as well as firmware-based dosage limits, should be trivial after the release of a product and even easier before release. Keep this in mind when you are seeking medical care; choosing devices which are less likely to kill you because of shoddy security makes sense. You can pop by Slashdot for links to some stories or wade into the comments if you so desire.
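To underline how little would be required: a shared key, a message counter, and a keyed MAC — all textbook techniques available in any standard library — already defeat the sniff-and-replay attack described above. The sketch below is a generic illustration, emphatically not the OneTouch Ping's actual protocol; the key provisioning and packet layout are assumptions for the example.

```python
import hmac, hashlib, os

# Hypothetical shared secret provisioned when remote and pump are paired,
# instead of a serial number broadcast in the clear.
KEY = os.urandom(32)

def sign_command(key: bytes, counter: int, dose_units: int) -> bytes:
    """Build a dose command: a monotonic counter (defeats replay) plus
    an HMAC tag (defeats forgery by anyone sniffing the radio link)."""
    payload = counter.to_bytes(4, "big") + dose_units.to_bytes(2, "big")
    tag = hmac.new(key, payload, hashlib.sha256).digest()
    return payload + tag

def verify_command(key: bytes, last_counter: int, message: bytes):
    """Return (counter, dose) if the command is authentic and fresh, else None."""
    payload, tag = message[:6], message[6:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        return None  # forged or corrupted
    counter = int.from_bytes(payload[:4], "big")
    if counter <= last_counter:
        return None  # a replayed capture
    return counter, int.from_bytes(payload[4:6], "big")

msg = sign_command(KEY, counter=1, dose_units=5)
assert verify_command(KEY, 0, msg) == (1, 5)                            # accepted once
assert verify_command(KEY, 1, msg) is None                              # replay rejected
assert verify_command(KEY, 0, msg[:-1] + bytes([msg[-1] ^ 1])) is None  # tamper rejected
```

A real device would also need secure pairing and a safe failure mode, but nothing here is beyond a modest embedded microcontroller.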

"Johnson and Johnson has revealed that its JJ Animas OneTouch Ping insulin pump is vulnerable to hackers, who could potentially force the device to overdose diabetic patients -- however, it declares that the risk of this happening is very low."

Google introduced its own premium smartphone today in the form of the Pixel and Pixel XL. Running Android 7.1 Nougat, the Pixel smartphones will not only run the latest operating system but will offer the new premium Android experience, with features including Google Assistant and Smart Storage with unlimited cloud storage of photos and videos.

Google is definitely taking a greater interest in promoting Pixel than it has with even its Nexus devices. It will be interesting to see how other Android manufacturers react to this news, but I would imagine that they are not all that pleased, and Google will be in a similar position to Microsoft with its Surface products and NVIDIA with its Founders Edition graphics cards.

Google's Pixel lineup includes the Pixel (5.6 x 2.7 x 0.2-0.3") and the Pixel XL (6 x 2.9 x 0.2-0.34"), which wrap their respective 5-inch 1080p (441 PPI) and 5.5-inch 1440p (534 PPI) displays in a full aluminum and glass unibody design that will come in one of three colors: Quite Black, Very Silver, and Really Blue. The smartphones feature curved corners and rounded edges with Corning Gorilla Glass 4 on the front and half of the back. Google has put a fingerprint sensor on the back of the phone, along with power and volume buttons, three microphones, a USB-C port, and, yes, a 3.5mm audio jack.

There are both front and rear cameras, and Google is claiming that the rear camera in particular is the best smartphone camera yet (with a DxOMark score of 89 points). The rear camera (which sits flush with the back of the phone) is rated at 12.3 MP with an f/2.0 aperture and 1.55µm pixels. The camera further features an IMX378 sensor, electronic image stabilization, and both phase detection and laser autofocus. The Pixel can take HDR+ photos and video at up to 4K30, 1080p120, or 720p240. Users can adjust white balance and use automatic exposure or autofocus locking. The front camera is less impressive at 8MP with a fixed-focus lens and f/2.4 aperture.
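Those pixel and megapixel figures imply a fairly large sensor for a phone, which you can check with some back-of-the-envelope arithmetic. The 4056 x 3040 active array below is an assumption based on commonly cited IMX378 figures; Google itself only quotes 12.3 MP and 1.55µm pixels.

```python
# Estimate the rear camera's sensor dimensions from pixel count and pitch.
cols, rows = 4056, 3040          # assumed IMX378 active array (hypothetical)
pixel_pitch_um = 1.55            # Google's quoted pixel size

width_mm = cols * pixel_pitch_um / 1000
height_mm = rows * pixel_pitch_um / 1000
diagonal_mm = (width_mm ** 2 + height_mm ** 2) ** 0.5

print(f"{cols * rows / 1e6:.1f} MP")           # matches the quoted 12.3 MP
print(f"{width_mm:.2f} x {height_mm:.2f} mm")  # ~6.29 x 4.71 mm
print(f"diagonal {diagonal_mm:.2f} mm")        # ~7.86 mm, large by phone standards
```

That diagonal puts it in roughly the 1/2.3" class, comparable to many point-and-shoot sensors.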

Internally, Google has opted for the Qualcomm Snapdragon 821 (MSM8996), a 2+2 design that pairs two Kryo cores at 2.15 GHz with two Kryo cores at 1.6 GHz, along with an Adreno 530 GPU, an impressive 4GB of LPDDR4 memory, and either 32GB or 128GB of internal storage which is regrettably non-expandable. The smartphones can tap into up to Category 11 LTE (Cat 9 in the US), 802.11ac Wi-Fi, Bluetooth 4.2, and NFC. Sensors include GPS, proximity, accelerometer, gyroscope, magnetometer, barometer, and Hall sensors.

The Pixel features a 2,770 mAh battery and the Pixel XL uses a slightly larger 3,450 mAh battery. Google rates the Pixel and Pixel XL at 13 and 14 hours of internet browsing and video playback, respectively. Further, the batteries can be quick-charged enough for up to "seven hours of use" after just 15 minutes of charging using the included 18W USB-C charger.

Pricing works out to $649 for the 32GB Pixel, $749 for the 128GB Pixel, $769 for the 32GB Pixel XL, and $869 for the 128GB Pixel XL. In the US, Google has partnered with Verizon for brick-and-mortar availability, in addition to the phones being available on the Google Store and other online retailers.

Google is banking a lot on these devices and asking a very premium price for the unlocked phones. It is certainly a gamble whether users will find the unique features compelling enough to choose the Pixel over other flagships. What do you think about Google's increased interest in the smartphone space with the launch of its own hardware? How well will Pixel fit into the existing environment – will Pixel lead Android hardware and the OS to success or simply fragment it more?

I do like the look of the Pixel (especially the blue one), and the feature list sounds good enough that maybe I could live without a removable battery and expandable storage (I'll be holding onto my old T-Mobile unlimited plan for as long as possible! heh). Pricing is a bit steep though, and I think that will trip a lot of people up when searching for their next device.

EVGA sent along a newsletter which is worth mentioning as there are a few good deals to be had, even if you have already picked up one of their cards. Anyone who recently bought a Pascal-based EVGA card, or is planning to in the near future, can get up to four EVGA PowerLink cable management ... thingies. The PowerLink plugs into your PCIe power connectors and wraps around your GPU, allowing you to power your card without exposing those wires and connectors; great for modders or those who prefer a clean-looking build. You do need to create an EVGA account and register your card, so keep that in mind.

The PCIe power connectors on the PowerLink are adjustable, so no matter which card you purchased you will be able to use the adapter. There are capacitors inside which are intended to help ensure smooth power delivery, so this is not simply an extension cord. They also have some deals on previous-generation NVIDIA cards as well as their TORQ mouse.

During Google's #madebygoogle event (embedded below), the company introduced a number of new pieces of hardware, including a new Chromecast. The Chromecast Ultra is aimed at owners of 4K televisions and supports both 4K Ultra HD and HDR content from the likes of Netflix, YouTube, and other apps. Like previous models, the Chromecast takes input from Android, iOS, OS X, and Windows devices that "cast" media to the TV. Additionally, it can be paired with Google Home, where users can use voice commands such as "Ok, Google. Play the sneezing panda video on my TV."

The Chromecast Ultra is a small circular puck with a Micro USB port and a short, flexible, flat HDMI cable that is permanently attached to the device. The Micro USB port is used for both power and data. One neat feature of the new Chromecast Ultra is that the power adapter has an Ethernet port on it, so users can hook the streaming device up to their wired network for better performance (important for streaming 4K content). Not to worry if you rely on Wi-Fi though, because it does support dual-band 802.11ac.

Google has not yet revealed what hardware is under the hood of its new 4K-capable Chromecast, unfortunately. They did release pricing information though: the Chromecast Ultra will be $69 and is "coming soon". If you are interested you can sign up to be notified when it becomes available.

Later this month Amazon will be releasing a new Fire TV Stick with upgraded internals and Alexa Voice controls. The refreshed media streamer features a 1.3 GHz MediaTek MT8127 SoC with four ARM Cortex A7 cores and a Mali 450 GPU, 1GB of RAM, 8GB of internal storage (for apps mainly, and not expandable), and support for newer 802.11ac (dual band, dual antenna) Wi-Fi, and Bluetooth 4.1 wireless technologies.

While that particular SoC is ancient by smartphone standards, it is a decent step up from its predecessor's dual 1GHz ARM Cortex-A9 cores and VideoCore 4 GPU. It supports H.265/HEVC decode along with 1080p60 output. The inclusion of 802.11ac Wi-Fi should help the streaming device do its job effectively even in areas littered with Wi-Fi networks (like apartment buildings or townhomes).

The big change from the old Fire TV Stick is the integration of Alexa Voice control and a new remote control with microphone input. Using voice input, users can control media playback, open apps, search for content, and even order pizza. There is no 4K support or expandable storage here (for that you would have to move to the $99 Fire TV) but it is less than half the price.

The refreshed Fire TV Stick will be available on Amazon for $39.99 on October 20th. Pricing along with the additional voice input makes it a competitive option versus Roku's streaming stick and Google's Chromecast.

Ars Technica have put together an overview of the new Windows Server, three pages which broadly cover the new features you will find. As has often been discussed, there will be three ways of installing the new Server OS: the familiar Desktop Experience, as well as Core and Nano. Nano is similar to the Core installation which we saw introduced in Server 2012, but further reduces the interface and attack surface by removing the last remnants of the GUI, along with support for 32-bit apps and the Microsoft Installer; all you get is a basic control console. The Core and Desktop versions remain much the same as in the 2012 version.

If you are curious about the inclusion of Docker features such as the Linux-like containers and changes to Hyper-V or deployment techniques drop by for a read.

"Like a special breed of kaiju, Microsoft's server platform keeps on mutating, incorporating the DNA of its competitors in sometimes strange ways. All the while, Microsoft's offering has constantly grown in its scope, creating variants of itself in the process."

Kingston have updated their line of gaming headsets with the new HyperX Cloud Stinger, available already for ~$50. This makes it attractive for those who do not often use a gaming headset but might want one around just in case. The low price could lead you to underestimate the design: Kingston used 50mm drivers, and the microphone mutes itself the moment you swing it away from your voice hole. That said, Overclockers Club were not in love with the quality of the sound compared to expensive headphones, but at this price point they have no qualms about recommending it for casual use.

"Overall, I'm quite impressed with the HyperX Cloud Stinger Gaming Headset. A mouth full just to say that – but after disliking the HyperX Cloud Revolver as much as I did – I'm actually quite happy with this drop in price and slight redesign. With closed ear cups I would have expected a little more in the bass-land, it wasn't the end of the world. The overall sound is nice and flat, and movies, music, and games are all quite tolerable in the closed environment."

A change of one percent may seem tiny at first glance, but historically it is an incredibly large shift in market share for an operating system. Unfortunately for Microsoft, it is Windows 7 which has gained share, up to 48.27% of the market, with Windows 10 dropping half a point to 22.53% while the various flavours of Windows 8 sit at 9.61%. This makes it almost impossible for Microsoft to reach its goal of one billion machines running Windows 10 in the two years after release, and spells bad news for its income from consumers.

Enterprises have barely touched the new OS for a wide variety of reasons, though companies still provide significant income thanks to corporate licenses for Microsoft products and older operating systems. It will be very interesting to see how Microsoft reacts to this information, especially if the trend continues. The usage data matches many of the comments we have seen here; the changes Microsoft made were not well received by its customer base, and the justifications used in the design of the new OS are not holding water. It shouldn't be long before we hear more out of Redmond; in the meantime you can pop over to The Inquirer to see Net Applications' data if you so desire.

"The latest figures from Net Applications’ Netmarketshare service show Windows 7, now over seven years old, gain a full percentage point to bolster its place as the world’s most popular desktop operating system with 48.27 per cent (+1.02 on last month)."

Blender 2.78 has been a fairly anticipated release. First off, people who have purchased a Pascal-based graphics card will now be able to GPU-accelerate their renders in Cycles. Previously, rendering would outright fail, complaining that it didn't have a compatible CUDA kernel. At the same time, the Blender Foundation fixed a few performance issues, especially with Maxwell-based GM200 parts, such as the GeForce GTX 980 Ti. Pre-release builds have included these fixes for over a month, but 2.78 is the first build for the general public that supports them.

In terms of actual features, Blender 2.78 starts to expand the suite's feature set into the space currently occupied by Adobe Animate CC (formerly Flash Professional). The Blender Foundation noticed that users were doing 2D animations using the Grease Pencil, so they have been evolving the tool in that direction. You can now simulate different types of strokes, parent them to objects, paint geometry along surfaces, and so forth. It also has onion skinning, to see how the current frame matches its neighbors, though I'm pretty sure that is not new to 2.78.

As you would expect, there are still many differences between these two applications. Blender does not output to Flash, and interactivity would need to be done through the Blender Game Engine. On the other hand, Blender allows the camera itself to be animated. In Animate CC, you would need to move, rotate, and scale objects around the stage individually, by some number of pixels each. In Blender, you would just fly the camera around.

This leads in to what the Blender Foundation is planning for Blender 2.8x. This upcoming release focuses on common workflow issues. Asset management is one area, but Viewport Renderer is a particularly interesting one. Blender 2.78 increases the functionality that materials can exhibit in the viewport, but Blender 2.8x is working toward a full physically-based renderer, such as the one seen in Unreal Engine 4. While it cannot handle the complex lighting effects that their full renderer, Cycles, can, some animations don't require this. Restricting yourself to the types of effects seen in current video games could decrease your render time from seconds or minutes per frame to around real-time.

I've been seeing a lot of people discussing how frequently Windows 10 seems to be getting updated. This discussion usually circles back to how many issues have been reported with the latest Anniversary Update, and how Microsoft has been slow in rolling it out. The thing is, while the slow roll-out is interesting, the way Windows 10 1607 is being patched is not too unusual.

The odd part is how Microsoft has been releasing the feature updates, themselves.

In the past, Microsoft has tried to release updates on the second Tuesday of every month. This provides a predictable schedule for administrators to test patches before deploying them to an entire enterprise, in case the update breaks something that is mission-critical. With Windows 10, Microsoft has declared that patches will be cumulative and can occur at any time. This led to discussion about whether or not “Patch Tuesday” is dead. Now, a little over a year has gone by, and we can actually quantify how the OS gets updated.

For instance, Windows 10 version 1507 had seven sub-versions of 10240 prior to general release, and five hotfixes pushed down Windows Update within the first month of release. The following month, September 2015, had an update on Patch Tuesday, as well as an extra one on September 30th. The following month also had two updates, the first of which on October's Patch Tuesday. It was then patched once for every following Patch Tuesday.

The same trend occurred with Build 10586 (Windows 10 version 1511). Microsoft released the update to the public on November 12th, but pushed a patch through Windows Update on November 10th, and five more over Windows Update in the following month-and-a-bit. It mostly settled down to Patch Tuesday after that, although a few months had a second hotfix sometime in the middle.

We are now seeing the same trend happen with Windows 10 version 1607. Immediately after release, Microsoft pushed a bunch of hotfixes. If history repeats itself, we should start to see about two updates per month for the next couple of months, then we will slow down to Patch Tuesday until Redstone 2 arrives sometime in 2017.
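The pattern described above is easy to quantify if you tabulate update dates by month. A minimal sketch, using made-up placeholder dates rather than the actual KB release history:

```python
from collections import Counter
from datetime import date

# Hypothetical update dates for one Windows 10 feature branch; these are
# illustrative placeholders, not real Windows Update entries.
patch_dates = [
    date(2016, 8, 2), date(2016, 8, 9), date(2016, 8, 23), date(2016, 8, 31),
    date(2016, 9, 13), date(2016, 9, 29),
    date(2016, 10, 11), date(2016, 10, 27),
    date(2016, 11, 8),
    date(2016, 12, 13),
]

# Tally how many cumulative updates landed in each calendar month.
per_month = Counter(d.strftime("%Y-%m") for d in patch_dates)
for month, count in sorted(per_month.items()):
    print(month, count)
```

Run against a real patch history, a tally like this makes the trend visible at a glance: a burst in the release month, roughly two updates a month after that, then a settling down to a single Patch Tuesday update.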

So, while this seems to fit a recurring trend, I do wonder why this trend exists.

Part of it makes sense. When Microsoft is developing Windows 10, it is trying to merge additions from a variety of teams into a single branch, and do so once or twice each year. This likely means that Microsoft has a “last call” date for these teams to merge their additions into the public branch, and then QA needs to polish this up for the general public. While they can attempt to have these groups check in mid-way, pushing their work out to Windows Insiders in a pre-release build, you can't really know how the final build will behave until after the cut-off.

At the same time, the massive flood of patches within the first month would suggest that Microsoft is pushing the final build to the public about a month or two too early. If this trend continues, it would make the people who update within the first month basically another ring of the Insider program. The difference is that it is less opt-in, because you get it when Windows Update tells you to.

It will be interesting to see how this continues going forward, too. Microsoft has already delayed Redstone 2 until 2017, as I mentioned earlier. This could be a sign that Microsoft is learning from past releases, and optimizing its release schedule based on these lessons. I wonder how soon before release Microsoft will settle on a "final build" next time. It seems like Microsoft could avoid many stability problems by simply setting an earlier merge date and aggressively performing QA for a longer period until release to the public.

Logitech has announced the successor to the popular C920 with the C922 Pro Stream Webcam, and this new model includes a 720p/60 mode, along with the 1080p/30 capability of its predecessor.

“C922 Pro Stream Webcam offers full HD quality and features for all streaming needs. At either 1080p 30 FPS or 720p 60 FPS, C922 is the perfect solution for streaming to Twitch, YouTube and any other video streaming application imaginable. Advanced 20-step autofocus through a full HD glass lens with F-stop F 2.8 and 78-degree field of view means no matter what action is happening, C922 can capture those crucial moments in perfect HD clarity.”

When Microsoft launched the Surface there were negative reactions from vendors who saw it as new competition from what was previously their partner. Today DigiTimes reports that certain unnamed GPU vendors have similar feelings about NVIDIA's Founders Edition cards. Jen-Hsun responded to these comments today, stating that the Founders Editions were "purely to solve problems in graphics card design".

While he did not say that NVIDIA would not consider continuing the practice with future cards, he did correctly point out that they shared everything about the design and results with the vendors. Those vendors are still somewhat upset about the month in which only Founders Editions were available for sale, as they feel they lost some possible profits by not being able to sell their custom-designed GPUs. Then again, considering the limited supply on the market, the number of sales they could have made in that extra month would certainly have been limited. It will be interesting to see if we hear more about this directly from the vendors in the coming weeks.

"Since Nvidia has restricted its graphics card brand partners from releasing in-house designed graphics cards within a month after the releases of its Founders Edition card, the graphics card vendors are displeased with the decision as it had given Nvidia time to earn early profits without competition."

Update: There has been a little confusion. The web browser, Firefox, is still going strong. In fact, they're focusing their engineering efforts more on it, by cutting back on these secondary projects.

Less than a year after their decision to stop developing and selling smartphones through carriers, Mozilla has decided to end all commercial development of Firefox OS. Releases after Firefox OS 2.6 will be handled by third parties, such as Panasonic, should they wish to continue using it for their smart TV platform. Further, source code for the underlying operating system, Boot-to-Gecko (B2G), will be removed from their repository, mozilla-central, so it doesn't hinder development of their other products.

Obviously, this is quite disappointing from a platform standpoint. Many applications, especially for mobile and similar devices, can be created in Web standards. At this point, we usually get comments about how web browsers shouldn't be app platforms, and that JavaScript is too inefficient. The thing is, the Web is about the best ubiquitous platform we have, and it will only get better with initiatives such as WebAssembly. Also, native applications don't necessarily perform better than Web-based ones, especially if the latter are packaged standalone (versus sharing resources with other tabs in a browser).

Regardless, Mozilla needs to consider their long-term financial stability, and throwing resources at Firefox OS apparently doesn't return enough value for them, both directly and for its impact on society.

Machine translation is quite difficult, especially between certain pairs of languages that vary greatly in how they handle implied context and intonation. At Google, the current translation system picks out known words and phrases, converts them to the target language, and blindly outputs them. This, unfortunately, ignores how the phrases are structured together.

Google has been working toward a newer system, though. Google Neural Machine Translation (GNMT) considers whole sentences, rather than individual words and phrases. It lists all possible translations, and weighs them based on how humans rate their quality. These values are stored and used to better predict following choices, which should be a familiar concept to those who have been reading up on deep learning over the last couple of years.

This new system makes use of Google's “TensorFlow” library, released to the public last year under a permissive, Apache 2.0 license. It will also be compatible with Google's custom Tensor Processing Unit (TPU) ASICs that were announced last May at Google I/O. The advantage of TPUs is that they can reach extremely high parallelism because they operate on extremely low-precision values.
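As a rough illustration of what "extremely low-precision" buys: linear quantization maps 32-bit floats onto 8-bit integers, quartering memory traffic and letting hardware pack far more multiply-accumulate units into the same silicon. This is a minimal sketch of the general idea, not TensorFlow's or the TPU's actual quantization scheme:

```python
def quantize(values, bits=8):
    """Map floats onto signed integers of the given width via a linear scale."""
    qmax = 2 ** (bits - 1) - 1               # 127 for int8
    scale = max(abs(v) for v in values) / qmax
    return [round(v / scale) for v in values], scale

def dequantize(qvalues, scale):
    """Recover approximate floats from the quantized integers."""
    return [q * scale for q in qvalues]

weights = [0.82, -0.41, 0.05, -0.99]
q, scale = quantize(weights)
approx = dequantize(q, scale)

# Each value now fits in one byte instead of four, at the cost of a
# rounding error bounded by one quantization step.
for w, a in zip(weights, approx):
    assert abs(w - a) < scale
```

For inference workloads like translation, networks tend to tolerate this rounding error well, which is exactly the trade the TPU is built around.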

The GNMT announcement showed the new system attempting to translate English to and from Spanish, French, and Chinese. Each pairing, in both directions, showed a definite increase, with French to English almost matching a human translation according to their quality metric. GNMT is currently live to the public when attempting to translate between Chinese and English, and Google will expand this to other languages “over the coming months”.

Recently, HP released a firmware update for some inkjet printers that disabled certain third-party cartridges. The claim is that the customer “is exposed to quality and potential security risks” when using counterfeit cartridges. I'm curious why HP is claiming that users shouldn't trust HP's abilities to secure their devices against attacks from malicious cartridges, but that's probably not an implication that HP considered when publishing this press release.

Also, if the intent was to inform users about counterfeit and potentially malicious cartridges, you would think that they would have provided an override method from the start. Thankfully, they are providing one now. HP is preparing an optional firmware update that does not check cartridges. They claim that it will be available in a couple of weeks, and provide a link to where it will be hosted.

With the amount of VR benchmarks coming out of [H]ard|OCP lately we wonder if they are in danger of becoming the world's first VR addicts. They tested the usual suite of two AMD cards and five NVIDIA cards to determine the number of dropped frames and average render times in this particular game. As it turns out, the game is harder on the player than it is on the GPU; all cards were able to provide decent experiences when swashbuckling. The developer recommends you clear a 2x1.5m area to play this game, and from what [H]ard|OCP experienced while playing, this is no joke; you will get exercise while duelling some of the harder opponents.

"Do you want to fight the Black Knight in a sword fight? There is not exactly a "Black Knight" in Sword Master VR, but you can certainly get that feeling. In fact, you can fight him and a couple of his friends at the same time if you are up to the challenge. Just pull the sword from the stone for $10."

Audio support over USB Type-C connections is set to improve, and its power demands to drop, as the USB Audio Device Class 3.0 specification has just been announced. Compared to the 3.5mm headphone jack, USB audio is a power hog which shortens your battery life on a phone or other mobile device, but it seems the USB-IF have been working to overcome this issue. Product manufacturers are looking forward to this, as USB can be isolated from other internals far more effectively than the 3.5mm jack, which would allow them to waterproof their devices.

Hopefully the new compliance testing regime, brought about after the consequences of using a bad cable to charge your laptop, will ensure we do not have any similar problems with audio devices. The Register does remind us that Bluetooth 5 is yet to be commonly found on mobile devices and could offer yet another nail in the 3.5mm jack's coffin.

"Hear that, children? That's the sound of another set of nails in the coffin of headphone jacks in mobile devices."

China officially began searching the stars around noon local time on Sunday using the newly completed FAST radio telescope, which has surpassed Arecibo to become the world's largest single-aperture telescope. Nestled in the natural Dawodang (limestone) depression in remote and mountainous Pingtang county, Guizhou province, the Five-hundred-meter Aperture Spherical Telescope (FAST) will search the heavens to catalog pulsars; investigate dark matter, gravitational waves, and fast radio bursts; and assist in the search for extraterrestrial life and natural hydrogen in distant galaxies.

The $180 million project has been in development for 14 years, with construction beginning in 2011. The massive scientific endeavor required the relocation of several villages and 10,000 people living in the vicinity. Further, the remote area meant the telescope had to be built without the use of heavy machinery, and the dish had to be assembled manually. FAST is modeled after the Arecibo observatory in Puerto Rico and uses 4,450 triangular reflector panels supported by a steel mesh suspended over the limestone valley using large steel towers anchored to the surrounding hills. FAST deviates from Arecibo when it comes to reflecting and receiving radio signals, however. While Arecibo uses a 900-ton movable receiver with a complex set of mirrors that make up a sub-reflector, FAST uses 2,250 actuators (winches) that pull on up to 300m sections of the dish to create a parabola. That parabola can be reshaped in real time to track signals as the Earth rotates, reflecting them back to a receiver which is reportedly much lighter and can carry more instruments than Arecibo's.

While Arecibo, with its 305 meter dish, can track signals up to 20° from the zenith, FAST can track signals up to 26° from the zenith at 300 meter parabola sizes and up to 40° with smaller parabola sizes making it rather versatile. The massive dish combines the benefits of a large single fixed dish and a smaller dish (or dishes which could be combined to provide higher resolution using interferometry) that can tilt and rotate.

Specifically, Dennis Normile quoted experts in saying:

Single dishes excel at observing point sources like neutron stars and at scanning a multitude of frequencies in the search for extraterrestrial intelligence, says astronomer Li Di, a FAST project scientist, who previously worked at NASA’s Jet Propulsion Laboratory in Pasadena, California. Another advantage is that, compared with the multiple dishes in an array, single dishes are “relatively cheap and relatively straightforward to upgrade,” says George Hobbs, an astronomer at CSIRO. “You just keep building better receivers.” (Dennis Normile at Science Magazine)

FAST is quite the accomplishment and I am interested to see what the scientists are able to discover using the world's largest radio telescope. Hopefully it will continue to receive adequate funding!