Daily Tech Digest - November 20, 2017

There are a few factors driving this delay in dumping the corporate datacenter. First, enterprises have no plans to give up their datacenters. Although some companies have very publicly reduced their own datacenters, most of the companies that have datacenters now will still have them five years from now. They simply don’t seem to believe their increased use of the cloud means they will eventually decrease their private datacenter usage. Second, enterprises have tax and business reasons to hang on to their datacenters. I’ve worked with many enterprises that have datacenter leases that run for another ten years. Moreover, CFOs often find that owning the hardware and software provides tax advantages they are not willing to give up.

Writing for HBR, Andrew Ng concurs: ‘To the majority of companies that have data but lack deep AI knowledge,’ he says, ‘I recommend hiring a chief AI officer or a VP of AI,’ adding that ‘some chief data officers and forward-thinking CIOs are effectively taking on this role.’ This change isn’t by any means certain, and in March this year HBR also ran a piece by Kristian J. Hammond, AI research scientist at the McCormick School of Engineering at Northwestern, entitled ‘Please don’t hire a Chief Artificial Intelligence Officer.’ ‘In much the same way that the rise of Big Data led to the Data Scientist craze,’ argues Hammond, ‘the argument is that every organization now needs to hire a C-Level officer who will drive the company’s AI strategy.’ But simply having an AI strategy isn’t enough, Hammond argues: instead, AI needs to be integrated into the business in the service of business goals, not given its own department.

In terms of gender, men continue to dominate the highest-level digital jobs, including those in computer, engineering, and management fields, as well as lower-digital occupations such as transportation, construction, natural resources, and building and grounds occupations. But interestingly, women had slightly higher digital scores than men did (48 to 45), and represent about three-quarters of the workforce in many of the largest mid-level digital positions. This group includes jobs in healthcare, office administration, and education. In terms of race, white employees remain overrepresented in high-level digital occupation groups (such as engineering and management), as well as mid-level ones (including business and finance, the arts, and legal and education professions).

Forrester predicts that 2018 will be the year when a majority of enterprises start dealing with the hard facts: AI and all other new technologies like big data and cloud computing still require hard work. Our 2017 predictions for data and analytics pointed to AI as the spark to the insights revolution. This came true: Survey respondents who told us their firm was investing in AI rose from 40% in 2016 to 51% in 2017. But success isn’t easy — 55% of firms have not yet achieved any tangible business outcomes from AI, and 43% say it’s too soon to tell. The wrinkle? AI is not a plug-and-play proposition. Unless firms plan, deploy, and govern it correctly, new AI tech will provide meager benefits at best or, at worst, result in unexpected and undesired outcomes. If CIOs and chief data officers (CDOs) are serious about becoming insights driven, 2018 is the year they must realize that simplistic lift-and-shift approaches will only scratch the surface of possibilities that new tech offers.

A recent DNS threat report from EfficientIP revealed that 25% of organizations in the US experienced data exfiltration via DNS, and of those, 25% had customer information or intellectual property stolen. The average time to discover a breach was more than 140 days. Considering that hackers can silently drain about 18,000 credit card numbers per minute via DNS, that's a customer database many times over. In addition, businesses aren't installing the required patches on their DNS servers (86% applied only half of what is necessary, according to our report), as was apparently the case at Equifax, where only one employee was responsible for patches. Sinister DNS data exfiltration will continue to occur unless businesses play a stronger offense. It's a challenge for organizations to win the cybersecurity battle without a proactive strategy that addresses DNS.
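Data exfiltrated over DNS typically rides in the query names themselves, encoded into the leftmost label. One common defensive heuristic, not taken from the report but a hedged sketch of the general technique, is to flag query names whose leftmost label is unusually long or has unusually high character entropy, since encoded payloads tend to look random:

```python
import math
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Bits of entropy per character in s."""
    counts = Counter(s)
    n = len(s)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def looks_like_tunnel(query: str, max_label_len: int = 40,
                      entropy_threshold: float = 3.5) -> bool:
    """Flag a DNS query whose leftmost label is unusually long or
    high-entropy -- both common signs of data encoded for exfiltration.
    Thresholds here are illustrative, not tuned values."""
    label = query.split(".")[0]
    return len(label) > max_label_len or shannon_entropy(label) > entropy_threshold

# An ordinary lookup passes; a hex-encoded payload label is flagged
# because it exceeds the length threshold:
looks_like_tunnel("www.example.com")   # → False
looks_like_tunnel("4a6f686e446f653431313131313131313131313131pqmxz.evil.example.com")  # → True
```

Real detection products combine heuristics like these with query-rate and destination-domain analysis; this sketch only shows the per-query check.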

Smaller businesses may be more nimble in attacking their data governance challenges, especially when getting buy-in from key stakeholders, adopting methodologies, and gaining consensus for metadata definitions. Yet data governance does require guidance, resources, and, perhaps most importantly, discipline. And, as we have been hearing in our briefings with a number of technology vendors whose products are engineered to support data governance programs, some best practices are emerging that can guide organizations of all sizes in tackling their governance needs by organizing their data policies according to business priorities. Externally imposed business policies embed data requirements. Data governance practitioners apply an iterative approach to decompose the data dependencies inherent in those business directives, and can employ technical methods to implement data standards and business rules.
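That last step — turning a business directive into enforceable data standards — can be sketched in a few lines. The policy, rule names, and record fields below are hypothetical, chosen only to illustrate decomposing a directive into checkable rules:

```python
from datetime import date

# Hypothetical externally imposed policy: "customer records must carry a
# contact email, and customers must be adults." Each clause decomposes
# into one named, machine-checkable rule.
RULES = [
    ("email_present", lambda r: bool(r.get("email"))),
    # ~18 years in days; an approximation, fine for a sketch.
    ("adult_customer", lambda r: (date.today() - r["birth_date"]).days >= 18 * 365),
]

def violations(record: dict) -> list[str]:
    """Return the names of the business rules this record breaks."""
    return [name for name, check in RULES if not check(record)]

violations({"email": "a@b.co", "birth_date": date(1990, 1, 1)})   # → []
violations({"email": "", "birth_date": date.today()})             # → ['email_present', 'adult_customer']
```

Keeping rules as named data rather than inline code is what lets governance teams iterate on them as the underlying business directives change.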

"If you have a strong security culture, and not just information security culture, but an overall security culture, there are generally indications of the change of attitudes and things like that, if it's going to be a malicious insider, that you are going to have a chance [to pick it up]," Doyle said. "I guess the threat for the inadvertent one is a lot of cases there may not be any indicators until you find yourself in trouble." It's a view shared across the industry, with Sophos CTO Joe Levy saying an accidental insider is more likely to compromise a company than an outsider. "They are closer to the data, just in terms of the amount of difficulty and the proximity, it's much more likely the latter is going to happen," Levy said. For McAfee CTO Steve Grobman -- who spoke to ZDNet before the company had its own misadventures last week -- the definition of vulnerabilities needs to go beyond software.

Find a project you like and contribute code, only to discover that “your contribution [is] lost in a sea of hundreds of unanswered issues and pull requests that are piling [up].” From the project maintainer’s perspective, “It’s fun at first and then the notifications start piling up so [you] start responding faster and then that leads to even more notifications,” resulting in “an odd productivity paradox.” But this is a good problem, you insist. More contributions equals more good! Well, yes. But as Eghbal highlights, open source was a bit easier to manage when the total user population (measured imperfectly by SourceForge) was 200,000. Two decades later, it’s more like 20 million, resulting in a heck of a lot of notifications to filter.

External devices (such as USB storage drives) are invaluable tools for your home or small business. With them you can expand your storage capacity and back up files. Because of some of the work I do (such as working with numerous virtual machines), I occasionally need to share a USB-connected device over my network. In my search to make this possible and easy, I came across a product called USB Network Gate. With this handy app, I can quickly share out a USB device to make it available on another network-attached machine. This makes it incredibly convenient to save files to that external drive from any machine on my network ... The first thing you must do is download and install the app. USB Network Gate is available for Linux, macOS, Windows, and Android. For my test purposes, I installed the app on Elementary OS and Windows 10.

If you’ve been using agile approaches for a while, I’m sure you’ve heard of relative estimation with planning poker. Teams get together to estimate the work they will do in the next iteration. Each person has a set of cards, with either numbers, such as the Fibonacci series, or t-shirt sizes. As the product owner (PO) explains the story, the team members each hold up a card to show how large they think the story is. Not every team member has to agree on the relative size. The conversation about the sizing is what’s important. The team members discuss the code, the design, the tests (or lack thereof), and other risks they see. The conversation is critical to the team’s understanding of the story. And, when the team decides that the story is larger than a “1,” the team knows there is uncertainty in the estimate.
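The round described above can be sketched in a few lines. The convergence rule used here — all cards within one Fibonacci step of each other — is one common convention, not something the article prescribes:

```python
# Fibonacci-style planning-poker deck, as mentioned above.
FIB_CARDS = [1, 2, 3, 5, 8, 13, 21]

def needs_discussion(estimates: list[int]) -> bool:
    """A round 'converges' when every card is within one Fibonacci
    step of every other card; a wider spread signals the team should
    talk through the risks and re-vote."""
    positions = [FIB_CARDS.index(e) for e in estimates]
    return max(positions) - min(positions) > 1

needs_discussion([3, 5, 5])   # adjacent cards → False, size is settled
needs_discussion([2, 13, 5])  # wide spread → True, talk it through
```

Note that the point of the exercise is the conversation the spread triggers, not the number the team eventually agrees on.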

Quote for the day:

"Data is a precious thing and will last longer than the systems themselves." -- Tim Berners-Lee