
“A deeper emotional attachment is starting to develop,” Mindshare writes. It attributes the growing attachment to the virtual assistant's improving understanding of the user: affinity increases the more the voice assistant understands the user. Artificial intelligence (AI) improvements are behind those recent gains, and user understanding is likely to deepen as AI continues to improve. People like to feel they are conversing with a genuine person when they talk to the devices, Mindshare says; according to its research, 70 percent of those interviewed want that. In addition, the report says, “Over a third (37 percent) of regular voice technology users say that they love their voice assistant so much that they wish it were a real person.”

The Quantum Dawn exercises are one component of Sifma’s comprehensive work with its members on a variety of cyber security initiatives. These exercises create a cross-departmental incident response focus that is tough to achieve in daily business operations. For example, the cyber security team at a given bank may understand their realm extremely well, but may not fully understand how payment processing in their bank works and the impact if payment processing functions are attacked as part of a sophisticated criminal enterprise targeting the bank. But through such collaborative exercises, each department understands its roles and responsibilities. Rapid and accurate communication is key. Indicators of compromise discovered during the early parts of an attack may trigger specific parts of the incident response playbook.

According to the Ponemon study, enterprises' focus on encryption and key management is being spurred on by increased cloud adoption as more data moves into third-party data centers. Approximately 67% of organizations report that they either encrypt data on premises prior to sending it to the cloud or encrypt data in the cloud using keys they generate and manage on premises. An additional 37% report that they encrypt some cloud data using methods that turn over complete control of keys and encryption processes to the cloud provider. This most recent study doesn't pin down how much data goes to the cloud completely unencrypted--but data out in 2016 from HyTrust showed that number to be alarming: according to that study, about 28% of all data across all cloud workloads remains unencrypted.
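As a sketch of the on-premises approach the survey describes -- keys generated and held locally, with only ciphertext sent to the provider -- here is a minimal, standard-library-only Python illustration. The HMAC-counter keystream is a stand-in so the example is self-contained; production code should use a vetted AEAD cipher such as AES-GCM from an established library.

```python
import hmac, hashlib, secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a keystream block by block with HMAC-SHA256 in counter mode.
    out, counter = b"", 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce + ct

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

key = secrets.token_bytes(32)              # generated and held on premises
record = b"customer=4711;balance=1200.50"
ciphertext = encrypt(key, record)          # only this ever leaves the building
assert decrypt(key, ciphertext) == record
```

The key point the survey numbers turn on is simply where `key` lives: in the 67% case it never leaves the organization, while in the 37% case the provider holds it.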

The benefits of DevOps are clear: High-performing organizations deploy 200 times more frequently, with 2,555 times faster lead times, according to a study of more than 25,000 tech professionals from Puppet and DevOps Research and Assessment. High-performers are also twice as likely to succeed with product deployments without service impairments or security breaches. And when something does go wrong, they can fix it 24 times faster. "Tons of evidence showed us that [with DevOps], you can go more quickly and be more reliable at the same time," said Gene Kim, co-author of the report, and co-author of The DevOps Handbook. Here are five tips to help make sure your DevOps implementation reaps the maximum benefits.

Frustration turned to anger, and after trying time and time again to get the company’s attention, Dave took it upon himself to destroy the software just to prove a point. This kind of situation is more common than usually thought: broken promises, the undervaluing of an employee’s opinion, and not heeding sensible advice can often result in those on the frontline of development lashing out against the company. In order to detect situations like Dave’s, the first line of defense is often looking out for the human signs of an unhappy employee. If this fails, then companies need to turn to technology to look for behavior on the network that is out of the ordinary. ... It’s also important to note that your data needs to be monitored for policy violations at all times: at rest, in motion, and in use.

Understandably, companies spend more time trying to prevent crises than preparing for them. However, crisis readiness has become at least as important as risk management, takeover readiness, and vigilance over safety. Underpreparedness has consequences and helps explain why companies engulfed by a large crisis initially underestimate the ultimate cost by five to ten times. Senior executives are frequently shocked by how quickly a problem can turn from a minor nuisance into an event that consumes and defines the company for years to come. ... When a crisis hits (or is about to hit), one of the first actions should be to create a cross-functional team to construct a detailed scenario of the primary and secondary threats, allowing the company to form early judgments about which path the crisis may travel.

"The issue did not have a direct impact on vehicle safety," said Jim Trainor, a spokesman for Hyundai Motor America. "Hyundai is not aware of any customers being impacted by this potential vulnerability." The bug surfaced as the auto industry bolsters efforts to secure vehicles from cyber attacks, following a high-profile recall of Fiat Chrysler vehicles in 2015 and government warnings about the potential for car hacks. Risks have multiplied in recent years as vehicles have grown more complex, adding features like mobile apps that can locate, unlock and start them. "What's changed is not just the presence of all that hackable software, but the volume and variety of remote attack surfaces added to more recent vehicles," said Josh Corman, director of the Atlantic Council's Cyber Statecraft Initiative.

Because it is highly probable that an organization will fall victim to a data breach at some point, it is wise to be as prepared as possible for that attack. Having a cybersecurity program in place can minimize the damage. Similar to insurance, companies without an effective plan in place will pay a premium, facing both financial and reputational repercussions. That said, cyber insurance providers have emerged with nearly 70 carriers on the market now. However, given the evolving nature of technology, an organization’s network, systems and methods for securing these assets change, which means their cyber risk changes. As a result, determining the appropriate policy is challenging. Additionally, the cyber insurance market is brand new, so the offerings are questionable at best. It is much more advisable to focus on implementing and maintaining a strong security program instead.

The key thing to remember about Power BI sharing is that it is domain based. In other words, if my Power BI dashboard is created under the markwkaelin.com domain, it can be shared only with other email addresses in that domain. It is important that the enterprise IT department and Office 365 administrators understand this limitation and plan accordingly. To share a dashboard, first open Power BI. In this example, I am using the Office 365 version. Next, navigate to the dashboard you want to share. Right-click the dashboard name in the navigation panel or click the Share button on the tab bar in the upper-right corner. Either method will take you to a screen where you can list the email addresses of the people you want to share this dashboard with in your enterprise.

Workers already wear smartwatches on the job for quick access to notifications and emails, as well as an array of personal fitness data. Also, some employers are giving workers smartwatches for specific tasks, Ubrani said. Among the workplace uses for smartwatches, enterprise software company SAP has made mobile apps available for Apple Watch and Samsung smartwatches for more than a year, but it isn't clear how widely they have been deployed. In 2015, one ambitious concept design detailed how a medical device service technician could check the status of repairs on an Apple Watch with the SAP Work Manager app. The success of that project isn't known. A more recent example is the Salesforce Wave Analytics app, which works with the iPhone and the Apple Watch to provide sales reps and managers with current data on their customer accounts.

Quote for the day:

"Our minds can be convinced, but our hearts must be won." -- Simon Sinek

In almost every software team there are members titled quality engineers (QA). Their role is mainly to understand the specifications and, based on them, define a set of test cases to validate the product and detect possible flaws. If we look up what QA and QC mean, we see that QC is "an aggregate of activities (such as design analysis and inspection for defects) designed to ensure adequate quality especially in manufactured products", whereas QA is "a program for the systematic monitoring and evaluation of the various aspects of a project, service, or facility to ensure that standards of quality are being met", per merriam-webster.com. Based on these definitions, the people embedded in software development teams in charge of defining test cases and validating the product are really QC engineers. This might cause problems.

You can’t secure what you don’t know about. The only way to know if a breach or vulnerability exists is to employ broad discovery capabilities. A proper discovery service entails a combination of active and passive discovery features and the ability to identify physical, virtual, and on- and off-premises systems that access your network. Developing this current inventory of production systems--everything from IP addresses to OS types, versions, and physical locations--helps keep your patch management efforts up to date, and it’s important to inventory your network on a regular basis. If one computer in the environment misses a patch, it can threaten the stability of them all, even curbing normal functionality.

Achieving synergy between applications and infrastructure is more than just blending disparate management regimes. A functioning application-centric environment requires enterprise executives to make changes to their current ecosystem on both a systems and an operational level. This can be difficult for organizations that maintain substantial legacy infrastructure geared toward conventional data workloads. One of the first things to do is to stop depending on silo-specific tools. When application requirements were fairly predictable, it was common for organizations to provision infrastructure to support the most demanding circumstances, even if that resulted in over-provisioned resources that would sit idle for long periods. This also often led to isolated application and infrastructure environments within the datacenter ecosystem as solutions were crafted to solve unique challenges at particular times.

Business architects (BAs) have to interact with so many different stakeholders that staying out of turf wars can be difficult. Strategy development teams may question why you want to hear about their strategies. Business process teams may push back against capability modeling as being redundant with process optimization efforts. Other architecture teams may be challenged by your very existence. And of course consulting firms will pop up everywhere and claim they can do everything. Avoid turf wars at all costs and stay away from decision rights conversations. (See my previous post on being politically savvy.) In most companies, there is plenty of work to get done, so leveraging the time and talents of other teams is crucial to making progress. Get these teams involved, make them part of what you are doing, and help them to see the business outcome you are striving for.

Life after the smartphone will be wondrous. We’ll be amazingly productive. Our faces won’t be filled with light, our fingers won’t be a chaotic symphony. We won’t be strangled by USB charging cables. We'll never have nomophobia. As you can probably guess if you’ve read this column lately, smartphones will be replaced by artificially intelligent bots. They already live among us. Soon, they won’t run on our phones or laptops. They will just run. They will exist in the cloud, at your office, in your car, and everywhere you happen to need help and stay productive. First, though, they need to get a lot smarter. A companion bot will follow you constantly -- sometimes literally. You’ll talk to the bot, but simple tasks like asking about the weather or the Golden State Warriors playoff schedule will seem trite.

With greater amounts of data come larger challenges in understanding the lineage, quality and relationships between data from multiple sources and of different types. And CIOs arguably struggle more than ever to effectively manage and analyse data to make it actionable. At one company the hype about machine learning had executives excited about using proprietary algorithms to gain competitive advantage. A data scientist was hired, told there were years’ worth of data stored in Amazon S3, and tasked with figuring out how to drive innovation with it. Unfortunately, there was no metadata to show where the data came from or how the data lake integrated with the rest of the company’s data. There was also no infrastructure for data analysis, forcing the data scientist to try to find tools compatible with the technology stack and install them.

Traditional DNS has weaknesses like that. With certain types of DNS attacks an adversary can make you think you are going to a favorite website but can re-direct you to a bad one, perhaps to steal your login info or to download malicious code. This is another very important reason to use a managed DNS service. There are cautions to consider when selecting a DNS provider. Some DNS providers collect information from you in ways that may creep you out. For example, if you select the free DNS service from Google, although there are privacy protections, they will be aggregating even more data on you and your browsing habits. It is free and offers protection and is backed by a company with incredible engineers, but you will give up some info you might want kept private.

Writing documentation can be downright boring sometimes, but great documentation is an excellent precursor to increased adoption of an API. Writing excellent documentation is as exacting as writing the code itself. There are syntax errors and unwanted whitespace you can introduce. Sometimes your ideas simply stop flowing, but you still need to fill in the blanks to make sure your documentation is complete. With the growth of APIs as products, your documentation is more important than ever to creating a successful API. API definitions and documentation go together, and while API specifications today are increasingly managed as code in GitHub, the API docs are not. Let’s change this and make it the standard to write and manage API documentation, including related websites, in GitHub and related code repositories.

We all know personal hygiene habits that we’re supposed to have but probably don’t practice consistently (did you really floss three times yesterday?). And there are social behaviors that we really look out for – and probably even judge people on. But when it comes to IT habits, most organizations don’t seem to be screening consultants for key behaviors and policies. This is not a good state of affairs, because IT habits and internal policies make a material difference to the likelihood of project success. The short list of policy issues below should be part of any screening criteria for cloud consultants in general, and Salesforce consultants in particular. Now, it’s not essential that a consultant comply with every item on the checklist below, but wherever policies diverge from these, it’s an opportunity to engage in a healthy conversation … before you sign.

Victims that fall for the scheme will be redirected to an actual Google page, which can authorize the hacking group's app to view and manage their email. Users that click “allow” will be handing over what’s known as an OAuth token. Although the OAuth protocol doesn't transfer over any password information, it's designed to grant third-party applications access to internet accounts through the use of special tokens. The OAuth protocol may have been designed for convenience, but security experts have warned it can be used for malicious effect. In the case of Fancy Bear, the hacking group has leveraged the protocol to build fake applications that can fool victims into handing over account access, Trend Micro said.
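To see why the consent page looks legitimate, here is a hedged sketch of how any OAuth 2.0 client -- benign or malicious -- constructs an authorization URL. The `client_id` and `redirect_uri` below are made up, but the endpoint and parameter names follow the standard authorization-code flow: the page the victim lands on really is Google's, and only the requesting application is hostile.

```python
from urllib.parse import urlencode

# Hypothetical attacker-registered app requesting full mail access.
params = {
    "client_id": "attacker-app.example.apps.googleusercontent.com",
    "redirect_uri": "https://attacker.example/callback",
    "response_type": "code",
    "scope": "https://mail.google.com/",  # broad scope = full mailbox access
    "access_type": "offline",             # asks for a long-lived refresh token
}
consent_url = "https://accounts.google.com/o/oauth2/v2/auth?" + urlencode(params)
print(consent_url)
```

Nothing in the URL is forged; the defense is scrutinizing *which app* is asking and *what scopes* it requests before clicking "allow", and auditing previously granted third-party app permissions.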

Quote for the day:

"Life is a mystery. You never know which small decision will make the biggest difference." -- @Leadershipfreak

Design succeeds when it finds ideal solutions based on the real needs of real people. In a recent Harvard Business Review article on the evolution of design thinking, Jon Kolko noted, "People need their interactions with technologies and other complex systems to be simple, intuitive, and pleasurable. Design is empathic, and thus implicitly drives a more thoughtful, human approach to business." When done well, human centered design enhances the user experience at every touch point, and fuels the creation of products and services that deeply resonate with customers. Human centered design is foundational to the success of companies like SAP, Warby Parker, and AirBnB. ... To delight employees, Cisco has identified "moments that matter" -- such as joining the organization, changing jobs, and managing family emergencies -- and redesigned its employee services around these moments.

In addition, the majority of software developers and system administrators are not accustomed to working in an environment containing federally regulated information such as ePHI, Copolitco wrote. Security controls may chafe developers as they have to adjust how they do things. “All companies who have a compliance obligation must remember that the point of HIPAA compliance is to impose a certain level of security,” said Reed. “Security is the ultimate goal, not necessarily compliance. Compliance comes as a result of having a good security program. Being compliant does not mean you are secure; it merely means you have 'checked the boxes.'” An HHS Office for Civil Rights official stated at the recent HIMSS and Healthcare IT News Privacy & Security Forum in Boston that the organization will be conducting on-site audits of hospitals in 2017 and that OCR is engaged in over 200 audits at the moment.

No matter what I have done throughout my career "data" has always played a very important part. Over the years, I've recognized how data has transformed the business and engineering part of development. Machine learning is no longer limited to large enterprises, and smaller companies are ready to get involved and take advantage of its benefits. Also, with the proven results from deep neural networks in various fields, it is clear that this is the time when machine learning and deep neural networks will play a very important role in technology going forward. I suppose my interests in data science are very well timed for the rise of machine learning. It is certain that technology changes everything time and time again, and for every programmer, self-transformation is an important step to keep relevant and competent in an ever-changing field.

I have found the ‘N+1 Syndrome’ to be the most common reason, especially among accomplished professionals who are doing well in their current gig. The thinking goes like this: You are earning well, you have a good name at work, your families are comfortable, and most importantly, you get that nice paycheck at the end of the month! Yes, you have this exciting idea, the thought of not reporting to a stupid boss is enticing, the lure of hitting that IPO jackpot, becoming famous, and retiring by the time you are 40 is tantalizing! You are going to do it, yes! No-one is better qualified! You will just get this one little thing out of the way, and then you are set! Most even have excellent ideas for the new business, but somehow they keep moving the start date forward by a year, then another year, and another.

For context, CPUs, or central processing units, are the processors that have been at the heart of most computers since the 1960s. But they are not well-suited to the incredibly high computational requirements of modern machine learning approaches, in particular deep learning. In the late 2000s, researchers discovered that graphics cards were better suited for the highly parallel nature of these tasks, and GPUs, or graphics processing units, became the de facto technology for implementing neural networks. But as Google’s use of machine learning continued to expand, they wanted something custom built for their needs. “The need for TPUs really emerged about six years ago, when we started using computationally expensive deep learning models in more and more places throughout our products.”

To understand why APIs inherently lack security, you must understand that API exploits attempt to compromise the application in one of two ways. The first is through application programming errors that reveal data or impair the operation of the application. These exploits manifest themselves through malicious inputs like SQL injection, cross-site scripting, and other such attempts at exposing data. Generally, applications can be secured against programming errors. This is often an iterative approach that can take months to years of use, testing, patching and retesting, but it can be done. The second avenue is through attempts to exploit the business logic of the application to create unauthorized access or fraudulent transactions. Exploits of business logic are the harder of the two to identify and stop. Applications are increasingly designed to deliver microservices, which expose a large number of interfaces to the Internet.
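A minimal sqlite3 sketch of the first category: the same injection payload defeats a query built by string concatenation but is harmless as a bound parameter, because a bound parameter is always treated as data, never as SQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

user_input = "' OR '1'='1"  # classic injection payload

# Vulnerable: concatenation lets the payload rewrite the WHERE clause.
unsafe = conn.execute(
    "SELECT name FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Safe: the placeholder binds the payload as a literal string value.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()

print(unsafe)  # [('alice',), ('bob',)] -- every row leaks
print(safe)    # [] -- no user is literally named "' OR '1'='1"
```

This is exactly why programming-error exploits are tractable: a mechanical fix (parameterize every query) closes the hole, whereas business-logic flaws have no such mechanical remedy.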

Even if the backup looks promising, there is no easy button. The people creating ransomware know that backups can stand between them and their payday. There are a lot of cases where Microsoft Volume Shadow Copies have been destroyed by ransomware. If you leave your backups online so you can have quick recovery, you may find that ransomware can actually delete or corrupt your backups. This is not uncommon; read the user groups from various backup companies and you’ll see the sad tales of woe. If that isn't concerning enough, there are other potential dangers to your backups. They need to be air-gapped from systems your users have access to. Before you bring your backups online, make sure the affected computers are off the network. You need to be absolutely certain that those systems can’t access the backup.

TMDs are compounds composed of a transition metal such as molybdenum or tungsten and a chalcogen (typically sulfur, selenium or tellurium, although oxygen is also a chalcogen). Like graphene, they form into layers. But unlike graphene, which conducts electricity like a metal, they are semiconductors, which is great news for flexible chip designers. Stefan Wachter, Dmitry Polyushkin and Thomas Mueller of the Institute of Photonics, working with Ole Bethge of the Institute of Solid State Electronics in Vienna, decided to use molybdenum disulfide to build their microprocessor. They deposited two molecule-thick layers of it on a silicon substrate, etched with their circuit design and separated by a layer of aluminium oxide. "The substrate fulfills no other function than acting as a carrier medium and could thus be replaced by glass or any other material, including flexible substrates," they wrote.

DevOps implementations also vary from company to company. At business law firm Benesch, Friedlander, Coplan & Aronoff LLP, "I think the real focus is on agile communication and client outcomes, versus delivery," said CIO Jerry Justice. "[It's about] creating a solid feedback loop so you can adjust targets and timings." However, not all companies are ready to fully jump on board the new workflow. While Simon Johns, IT director at Sheppard Robson Architects LLP, said the firm has yet to implement DevOps, he also said that "there are elements of the 'philosophy' I would like to introduce into our workflows—build fast, fail fast type of situations." David Wilson, director of IT services at VectorCSP, said he doesn't plan to implement the workflow. "After nearly 30 years of IT experience, I doubt any of those large software companies are really investing in this," Wilson said.

While some network ports make good entry points for attackers, others make good escape routes. TCP/UDP port 53 for DNS offers an exit strategy. Once criminal hackers inside the network have their prize, all they need to do to get it out the door is use readily available software that turns data into DNS traffic. “DNS is rarely monitored and even more rarely filtered,” says Norby. Once the attackers safely escort the data beyond the enterprise, they simply send it through their DNS server, which they have uniquely designed to translate it back into its original form. The more commonly used a port is, the easier it can be to sneak attacks in with all the other packets. TCP port 80 for HTTP supports the web traffic that web browsers receive. According to Norby, attacks on web clients that travel over port 80 include SQL injections, cross-site request forgeries, cross-site scripting, and buffer overruns.
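On the defensive side, one simple heuristic against DNS exfiltration is to flag query names with unusually long, high-entropy labels, since encoded data tends to look random. This is a rough sketch; the thresholds are illustrative, not tuned, and real monitoring would also track query volume per client.

```python
import math
from collections import Counter

def entropy(s: str) -> float:
    """Shannon entropy of a string in bits per character."""
    counts = Counter(s)
    n = len(s)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def suspicious(qname: str, max_label: int = 40, min_entropy: float = 3.5) -> bool:
    # Exfiltrated data is typically packed into a long, random-looking
    # leftmost label of a domain the attacker controls.
    first_label = qname.split(".")[0]
    return len(first_label) > max_label and entropy(first_label) > min_entropy

print(suspicious("www.example.com"))  # False
print(suspicious("4a7f0c9e2bd8815f3a6c0d9e7b2f4a1c8e6d0b3f5a7c9e1d.evil.example"))  # True
```

The same "blend in with common traffic" logic applies to port 80: the defense there is inspecting payloads, not just port numbers.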

Quote for the day:

Leadership: "If you are not building for the long term you are doing the wrong thing." --@Bill_George

According to Vice Chancellor J. Travis Laster, the blockchain could help to remove the middleman when it comes to how shares are held and voted, as at present they are operating on an outdated system that is too complex to determine who owns a share and how it’s used in decision making. Delaware is just one state in the U.S. that is showing an increased interest in the distributed ledger. Only recently, the Senate in the state of New Hampshire considered a blockchain bill that would deregulate digital currency transactions such as bitcoin from money transmitter regulations in the state. By doing so, the bill is designed to protect consumers when using digital currencies such as bitcoin instead of making them register with money transmitter regulators.

The business world is abuzz with the benefits of collaboration tools: less reliance on email, more organic collaboration on projects and better communication and relationships between teams. Collaboration tools encompass many solutions, including video conferencing, VoIP, document sharing and instant messaging. However, it is also important to think about the security risks that are inherent in tools such as document collaboration platforms, presentation software, remote support tools and virtual events. Each of these can create potential security threats, and evaluating vulnerabilities – and viable solutions – should be a sustainable part of your tool-selection process.

One tricky part is making changes to all ethereum clients, no matter what programming language they're written in, in lockstep. Ethereum Foundation's Khokhlov has been writing tests using a tool called Hive to ensure not only that the clients implement the changes correctly, but that all clients agree on consensus-level changes. That's because if all clients don’t follow the same rules, there could be an accidental split into different networks (as happened briefly in November). Just like former phase changes Frontier and Homestead, the shift to Metropolis requires a 'hard fork' – meaning nodes or miners that fail to upgrade to the new blockchain will be left behind. Because of the possibility of an inadvertent split, hard forks are controversial and taken very seriously.

“All solar systems are monitored in real time through the cloud,” Fruhen announced at a recent tech event in Nairobi. “Five years [ago] when we were founded nobody was thinking about IoT or Big Data but now we collect over 30 million payment notifications every year.” He added that they take more than one million device readings every day, ranging from battery levels to the temperature of the devices and sensors. Additionally, they have geographical data on where the devices are located, along with 450,000 rooftop sunshine readings every day. They have calculated that they have saved their users US$338 million since they started, five years ago. “Cloud is the enabler for all these,” he reiterated. “We have 680 terabits of data on our platform.” The company has used its data to provide upgraded devices to users who have finished their solar loan.

Today, cloud vendors are designing managed cloud services from the ground up to meet the most advanced data security requirements, giving current and prospective customers the peace of mind that their data is private and secure. They should also deliver across-the-board support for every aspect of cloud security including physical security, network security, data protection, monitoring, and access controls. Data encryption for data in flight and at rest along with tokenisation of sensitive data items are strategies that can help improve Data Security and help to meet the most stringent of data privacy requirements. Cloud vendors understand that any successful cloud security solution requires close collaboration between you and your cloud service provider, knowing that it’s critical that your organisation has a programme that covers everything from data governance and compliance to cloud user access.
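Tokenisation, mentioned above alongside encryption, can be sketched in a few lines: the sensitive value never leaves an on-premises vault, and the record sent to the cloud carries only an opaque token. The in-memory dict here is purely illustrative; a real system persists the vault in hardened, access-controlled storage.

```python
import secrets

class TokenVault:
    """Minimal tokenisation sketch: sensitive values stay in the vault
    (on premises); only random, meaningless tokens travel to the cloud."""

    def __init__(self):
        self._vault = {}

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_hex(8)  # no mathematical link to value
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")       # card number never leaves
record_for_cloud = {"customer": "c-102", "pan": token}
assert vault.detokenize(record_for_cloud["pan"]) == "4111-1111-1111-1111"
```

Unlike encryption, a stolen token reveals nothing even in principle, since it is random rather than derived from the data; the trade-off is that every detokenisation requires a round trip to the vault.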

To deny that tech will be important to students' futures seems unthinkable. But it's not enough to recognize students will need tech to be successful. Your students also need to see you as a willing learner of technology. They need to see you as a learner period. And it's a shame if you aren't leveraging your skills as a teacher because you aren't willing to learn technology. All of your teacher skills are priceless, but they can be even more relevant and powerful if you know how to effectively use technology for learning, too. ... Lots of kids like to use technology. But using tech because it is engaging isn't as important as using it because your students are engaged. If your students are curious and motivated learners, they will have questions that need answers. They will want to create and share new knowledge.

Computational thinking is not new. Seymour Papert, a pioneer in artificial intelligence and an M.I.T. professor, used the term in 1980 to envision how children could use computers to learn. But Jeannette M. Wing, in charge of basic research at Microsoft and former professor at Carnegie Mellon, gets credit for making it fashionable. In 2006, on the heels of the dot-com bust and plunging computer science enrollments, Dr. Wing wrote a trade journal piece, “Computational Thinking.” It was intended as a salve for a struggling field. “Things were so bad that some universities were thinking of closing down computer science departments,” she recalled. Some now consider her article a manifesto for embracing a computing mind-set. Like any big idea, there is disagreement about computational thinking — its broad usefulness as well as what fits in the circle.

For a regression analysis to produce inferences that can be justified and trusted, the estimator must satisfy what statistical methods call BLUE: the best linear unbiased estimator. This imposes certain requirements on the data to be processed, known in statistics as the classical assumption tests. When the classical assumptions are met, the estimated coefficients can be defended as accurate estimators of the underlying parameters. ... Adjusting the data is an attempt to fulfill these requirements (the classical assumptions) in regression analysis, a simplification made in applying modern econometrics as an empirical science.
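A minimal numpy illustration of the point: on simulated data that satisfies the classical assumptions (linear model, homoskedastic, uncorrelated errors), OLS via the normal equations recovers the true parameters, and a Durbin-Watson statistic near 2 is consistent with no first-order autocorrelation. In practice a package such as statsmodels would run these diagnostics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data satisfying the classical assumptions:
# y = 2.0 + 0.5 x + e, with i.i.d. normal errors.
n = 200
x = rng.uniform(0, 10, n)
y = 2.0 + 0.5 * x + rng.normal(0, 1, n)

# OLS via the normal equations: beta = (X'X)^-1 X'y
X = np.column_stack([np.ones(n), x])
beta = np.linalg.solve(X.T @ X, X.T @ y)
residuals = y - X @ beta

# Durbin-Watson statistic; values near 2 suggest uncorrelated residuals.
dw = np.sum(np.diff(residuals) ** 2) / np.sum(residuals ** 2)

print(beta)             # close to [2.0, 0.5]
print(round(dw, 2))     # close to 2
```

When an assumption fails (e.g., heteroskedastic or autocorrelated errors), OLS remains unbiased but is no longer "best", which is precisely why the assumption tests precede inference.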

The six-qubit chip is also a test of a manufacturing method in which the qubits and the conventional wiring that controls them are made on separate chips later “bump bonded” together. That approach, a major focus of Google’s team since it was established just over two years ago, is intended to eliminate the extra control lines needed in a larger chip, which can interfere with how qubits function. “That process is all working,” says Martinis. “Now we’re ready to kind of move fast.” Designs for devices with 30 to 50 qubits are already in progress, he says. He briefly flashed up images of the six-qubit chip at the recent IEEE TechIgnite conference in San Bruno, California, but his group has yet to formally disclose technical details.

What is data worth? In my 2010 predictions I expected to see “datasets increasingly recognized as a serious, balance sheet-worthy asset”. I was a bit early there. Data is clearly still not a well understood or significant investment category – brand “goodwill” is better accounted for – but there is no doubt that markets value companies perceived to be data rich with higher valuations than other companies. Data is a moat. IBM acquired the Weather Company for around $2bn according to the Financial Times, and promptly put CEO David Kenny in charge of a swathe of its Watson and Cloud units. Uber now has a business selling data to companies including Starwood, and is leveraging data to make deals with public sector organisations such as the city of Boston. But taking advantage of data is hard – requiring entirely new skill sets. Valuing it is hard. Cleaning it is hard. Querying it is hard. Managing and maintaining it is hard.

Quote for the day:

"A problem is only a problem when viewed as a problem." -- Robin Sharma

“Trust in online payments and consistent education to accept new ways to pay are the two major challenges that we are currently facing,” explains Doku chief operating officer, Nabilah Alsagoff. “Most Indonesians are still comfortable and pretty much rely on bank transfer and COD as their preferred method of payments.” One of Doku’s main aims is to make e-commerce systems easier to navigate for both customers and merchants, she says. The ultimate goal is to be a part of Indonesians’ daily payment habits via e-money, especially for the unbanked in a country of over 250 million where only 65 million are bank account holders. But not only is access to customers a barrier, so too are laws and regulations. Most fintech players feel that the regulation in Indonesia is still in [a] grey area.

More broadly, uncertainty over the review announced this week has unsettled Grishma and many others like her. She will have to wait until at least around August to learn her fate, but having accepted the US job offer she is not in a position to apply for positions elsewhere, including in Europe. "It's pretty debilitating," Grishma told Reuters. "I'd like to start work to mitigate the financial damage." Trump's decision was not a huge surprise, given his election campaign pledge to put American jobs first. But the executive order he signed, though vague in many areas, has prompted thousands of foreign workers already in the United States or applying for visas to work there to re-think their plans. Companies who send them also face huge uncertainty.

The systems Vegis and her team have built are hosted on Bluemix, IBM's data storage, processing, and analytics cloud. "IBM's tools have enabled us to save both time and money on programming and development," Vegis said. With the initial hurdle of developing machine learning systems and processing data already accomplished, Foris.io has been able to actually gather data instead of just planning for it. According to Vegis, cognitive computing platforms like Watson allow them to "take concept to prototype in a shorter period of time, which we know will improve our chances of securing funding." That doesn't just apply to her and Foris.io—it's a huge benefit for all tech innovators. With a probe installed, data gathering begins. The devices, capable of transmitting data several kilometers, measure moisture, pH level, salinity, temperature, and other factors.

A flexible structure is just as important today: business needs are changing at an accelerating pace, and flexibility allows IT to be responsive in meeting new business requirements, hence the need for an information architecture for ingestion, storage, and consumption of data sources. One of the challenges facing enterprises today is that they have an ERP (like SAP, Oracle, etc.), internal data sources, and external data sources, and what ends up happening is that “spread-marts” (commonly referred to as Excel spreadsheets) start proliferating data. Different resources download data from differing (and sometimes the same) sources, creating dissimilar answers to the same question. This proliferation of data within the enterprise consumes precious storage that is already overflowing, causing duplication and wasted resources without standardized or common business rules.

Human work will become more versatile and creative. Robots and people will work more closely together than ever before. People will use their unique abilities to innovate, collaborate and adapt to new situations. They will handle challenging tasks with knowledge-based reasoning. Machines enabled by the technologies that are now becoming commonplace – virtual assistants like Siri and Alexa, wearable sensors like FitBits and smart watches – will take care of tedious work details. People will still be essential on the factory floors, even as robots become more common. Future operators will have technical support and be super-strong, super-informed, super-safe and constantly connected. We call this new generation of tech-augmented human workers, both on factory floors and in offices, “Operator 4.0.”

Looking at payments through a global (rather than U.S.-based) lens, 2017 is not going to be a year of leap-frog innovations, but rather a year of incremental improvements focused on country-by-country wins. As mobile infrastructure continues to expand and the Internet reaches an additional two billion people in markets where access was previously nonexistent, we’re bound to see a spike in demand for online and mobile purchases. At the same time, the payment methods landscape will only become more fragmented, requiring payment platforms to optimize between multiple payment options, acquirers and processors, handle currency conversions cost-effectively and transparently, and account for numerous legislative nuances across multiple markets. Decades-old payments systems won’t cut it here.

The mess comes in what the older cohort in the business see in the self-organizing abilities and discipline in the personalities of the newcomers. I personally disagree with this 'mess theory' and see it as a normal difference in perspectives between generations that were professionally formed in different ecosystems, with sharp differences in tempo and culture. Actually it’s our role (as veterans in the craft) to extend a hand and get the newcomers professionally in shape seamlessly and gracefully. So what’s the problem, then? Well, it becomes an issue when the resources to coach these hordes of not-yet-matured practitioners are not enough. Especially when we remember the sometimes insane pressures on teams and leaders to meet their schedules, leaving very little space for helping juniors beyond what’s barely needed to get them 'technically productive'.

Regardless of the cause, the threat of data breaches is imminent and can have severe repercussions for organizations, especially if they are found guilty of failing to take sufficient measures to secure their data. Singapore's data protection law has one of the highest fines in Asia, with each breach subject to a potential fine of S$1 million. Similarly, breaching Europe’s new General Data Protection Regulation can result in a fine of the larger of either 20 million euros or 4 per cent of the organization’s global annual turnover. Beyond financial penalties, a data breach can cause irreversible damage to a company’s reputation as well as potentially significant damages payable in civil liability to third parties, not to mention possible personal criminal liability for senior management. Organizations should be well aware of the prevailing legal regulations that govern increasingly popular technology solutions such as cloud storage and the collection, analysis, and offshore storage of customer data.
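The GDPR cap described here is a simple formula: the greater of 20 million euros or 4 per cent of global annual turnover. A tiny illustrative sketch (all figures hypothetical):

```python
def gdpr_max_fine(annual_turnover_eur: float) -> float:
    """Maximum GDPR fine: the greater of EUR 20m or 4% of turnover."""
    return max(20_000_000, 0.04 * annual_turnover_eur)

# A firm with EUR 300m turnover: 4% is EUR 12m, so the flat
# EUR 20m floor applies.
print(gdpr_max_fine(300_000_000))    # → 20000000
# A firm with EUR 1bn turnover: 4% is EUR 40m, exceeding EUR 20m.
print(gdpr_max_fine(1_000_000_000))  # → 40000000.0
```

For any large enterprise the percentage branch dominates, which is why the regulation is felt so keenly by multinationals.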

AI will be everywhere in our products, in our technologies, and in our operations. And I believe AI can bring value in each and every one of those aspects. In the Telco markets, we've been talking about the technology of AI to build what we call a network brain. The whole notion of this network brain is to help telecom operators to be more intelligent as they build, run, and manage their network. Also, we have tried to bring artificial intelligence into smartphones. Last year we launched Huawei Magic; a concept phone with AI capabilities built into it. The idea was to show how the phones would evolve from smartphones to intelligent phones. And then our network and cloud service - no matter whether it's Public Cloud or Private Cloud - we also inject the capabilities of AI into the Cloud platform to better enable enterprises.

“Companies face a terrible choice: either they turn their business into software and they accept the fact that they’re going to have rampant vulnerabilities and breaches or they let their competition win the innovation race. And everyone chooses software,” said Williams. “But as a result, we’re going to have 111 billion new lines of code in 2017. And the problem is that these legacy tools, dynamic analysis tools, static analysis tools and web application firewalls, were invented in the early 2000s. They’re absolutely incapable of scaling to the level of modern software.” This requires an approach that uses automation. Every business that has been around for more than five years will have legacy software integration challenges, which requires developing new code. Companies are constantly integrating new software platforms with older systems and a cybersecurity platform has to be able to protect all of these assets.

Quote for the day:

"Sometimes the questions are complicated and the answers are simple." -- Dr. Seuss

Symantec has found some possible proof. The company noticed that the computer worm has been leaving a message on infected devices since at least March, Grange said. That message has been digitally signed and fetched in a way that leaves little doubt it comes from Hajime's developer. The short message doesn't reveal anything about the Hajime developer's identity. But the vigilante hacker is aware the security community has been studying the Hajime worm. One clue: The mysterious developer refers to himself or herself as the "Hajime author" in the message the worm has been leaving behind. However, it was actually security researchers at Rapidity Networks that came up with the name Hajime, which is Japanese for the term "beginning."

The SCP is intended to "identify the challenges Australian organisations face when competing in local and international cyber security markets". "The SCP provides a roadmap to strengthen Australia's cyber security industry and pave the way for a vibrant and innovative ecosystem. It articulates the steps and actions required to help Australia become a global leader in cyber security solutions, with the aim of generating increased investment and jobs for the Australian economy," it says. The SCP was launched by Senator Arthur Sinodinos, Minister for Industry, Innovation and Science. "The aspiration, and it's set out here in this plan so clearly, is to be a global leader in this space," Sinodinos said.

Chances are, small businesses already have a fairly large amount of data collected, particularly if they have been in business for at least a year. Even if the business is older, began before the digital age, and does not have many electronic records, the paper records still contain data. Sales slips, time cards, order forms: all of these have data worth analyzing. Perhaps the records are a mix of paper and electronic records. Maybe more recent inventory records are recorded in a spreadsheet, while the older information is kept in a hand-written ledger. It would be worth the business owner’s while to digitize the paper records. This will require an initial outlay of resources, but the time spent scanning images or entering data into a database program will be paid back in the time saved by the staff not having to dig through paper files looking for information, in addition to gaining the ability to query these records.

The Microsoft spokesperson added that the new offering will help IoT product manufacturers "that value time to market with technical stack prescribed and managed for them". "It is designed to enable the rapid innovation, design, configuration, and integration of smart products with enterprise-grade systems and applications to reduce product manufacturers' go-to-market cycle and increase the speed at which they can innovate so they can stay ahead of their competition and deliver smart products that delight their customers," the spokesperson told ZDNet. IoT Central is vertically and horizontally agnostic, though the spokesperson said its early adopters happen to operate in the manufacturing and engineering industries such as ThyssenKrupp Elevator, Sandvik Coromant, and Rolls-Royce.

According to Forrester’s Business Technographics survey of over 3,000 global technology and business decision makers from last year, 41 percent of global firms are already investing in AI and another 20 percent are planning to invest in the next year. Most large enterprises’ first foray into AI is with chatbots for customer service, what we call “conversational service solutions.” These run the gamut from hard-coded, rules-based chatbots which aren’t artificially intelligent to very sophisticated engines using a combination of NLP, NLG, and deep learning. From a customer insights perspective, many companies are starting to use some of the “sensory” components of AI such as image and video analytics and speech analytics to unlock insights from unstructured data.

Since OT is technology that was built pre-Internet and is goal-oriented, its security is not always a top priority, Brown said. Others agreed. "I think it's still sort of a nascent field which is ironic because industrial systems, operational systems are from a past era," said Alex Eisen, a security researcher for ForeScout. Eisen later continued, "Think about trains, iron, mechanical engineering, electrical engineering and now we find ourselves in this modern world, information age, where a lot of these hard skills and experience is sort of tucked away." The panel discussed risks to assuming OT and IT systems are not connected. Brown went on to describe multiple attacks that have happened because of unknown entanglement between the two systems. The panelists — which included representatives from SMUD, the Sacramento Regional County Sanitation District, security companies, and others — discussed how OT systems can be protected:

Even if you want to stick with a closed source operating system (or, the case of macOS, partially closed source), your business can still take advantage of a vast amount of open source software. The most attractive benefit of doing so: It's generally available to download and run for nothing. While support usually isn't available for such free software, it's frequently offered at an additional cost by the author or a third party. It may be included in a low-cost commercially licensed version as well. Is it possible, then, to run a business entirely on software that can be downloaded for free? There certainly are many options that make it possible — and many more that aren't included in this guide.

To get a sense of what pressures IT leaders were under and how they were dealing with them, I recently sampled just over 50 of the top practitioners in the space with a focus on what I regarded as leading organizations in their industry -- mostly large enterprise CIOs, as well as a few CTOs, CDOs, and EVPs of IT who I knew were pushing the envelope -- to better understand the IT initiatives they are focusing on to become more agile. By picking cutting-edge leaders at top organizations, the intent was that the data would show what they're facing and how they're dealing with it this year, in a way that gives more typical organizations time to prepare for what they'll likely face next year and beyond. Unsurprisingly, the data clearly shows that top IT leaders are feeling much more pressure for their team to move quicker than they ever have in the past.

While many have high hopes for IoT, few are on their way to full deployment. The survey found 41 percent of respondents expect IoT to have a big impact on their industries within three years, affecting things like efficiency and differentiated products and services. But only 7 percent said they have a clear vision with implementation well under way. Most companies don't have everything they need to succeed in IoT, with many saying they'll need new technical skills, data integration and analytics capabilities, or even a rethinking of their business model. Thirty-one percent of the executives said their organizations face a "major skills gap" in industrial IoT. The annual developer survey co-sponsored by the open-source Eclipse IoT Working Group, IEEE IoT, Agile IoT and the IoT Council, also found growing adoption along with continuing concerns.

EDA is valuable to the data scientist to make certain that the results they produce are valid, correctly interpreted, and applicable to the desired business contexts. Beyond ensuring the delivery of technically sound results, EDA also benefits business stakeholders by confirming they are asking the right questions and not biasing the investigation with their assumptions, as well as by providing the context around the problem to make sure the potential value of the data scientist’s output can be maximized. As a bonus, EDA often leads to insights that the business stakeholder or data scientist wouldn’t even think to investigate but that can be hugely informative about the business. In this post, we will give a high-level overview of what EDA typically entails and then describe three of the major ways EDA is critical to successful modeling and to interpreting the results correctly.
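As a minimal illustration of the kind of first-pass checks EDA entails (synthetic data, not from the post): count missing values, then compare mean against median to spot skew before any modeling begins.

```python
import statistics as stats

# Hypothetical transaction amounts with a missing value (None)
# and one extreme outlier.
amounts = [12.5, 14.0, 13.2, None, 15.1, 12.8, 240.0]

observed = [a for a in amounts if a is not None]
n_missing = len(amounts) - len(observed)

mean = stats.mean(observed)
median = stats.median(observed)

print(f"missing: {n_missing}")
print(f"mean: {mean:.1f}, median: {median:.1f}")
# A mean far above the median hints at a skewing outlier worth
# investigating before trusting any model built on this column.
```

Even this trivial pass surfaces two questions a stakeholder should answer before modeling: why is a value missing, and is the outlier an error or a genuinely large transaction?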

Quote for the day:

"Our leadership style is defined by who we are and what we do, not by what we say." -- Gordon Tredgold

Large-scale cyberattacks with eye-watering statistics, like the breach of a billion Yahoo accounts in 2016, grab most of the headlines. But what often gets lost in the noise is how often small and medium-sized organizations find themselves under attack. In the last year, half of American small businesses have been breached by hackers. That includes Meridian Health in Muncie, Indiana, where 1,200 workers’ W-2 forms were stolen when an employee was duped by an email purporting to come from a top company executive. Many small companies are just one fraudulent wire transfer away from going out of business. There’s lots of advice available about how to fight cybercrime, but it’s hard to tell what’s best. I am a scholar of how businesses can more effectively mitigate cyber risk, and my advice is to know the three “B’s” of cybersecurity: Be aware, be organized and be proactive.

The evidence presented here suggests that the GC threads were not active for the vast majority of the pause. If the pause was due to background I/O then the GC threads, captured by the OS, should have accumulated an inordinate amount of kernel time, but they didn't. This all suggests that the GC threads were swapped out, and incredibly, not rescheduled for more than 22 seconds! If our app wasn't paused by the garbage collector then the only possibility is that the JVM was paused by the operating system, even if that doesn't seem to make any sense. Fact is, operating systems sometimes do need to perform maintenance, and when this happens, just as is the case with GC, the OS may need to pause everything else. Just like GC pauses coming from a well tuned collector, OS pauses are designed to occur infrequently and be very brief to the point of hardly being noticed.
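Pauses of the kind the author describes can be detected from inside a process with a simple jitter meter, sketched here in Python purely as an illustration: sleep in short intervals and flag any gap far larger than the sleep, since such a gap means the process was not running, whether due to GC, the OS scheduler, or anything else.

```python
import time

# Sample the clock in a tight loop with short sleeps; any gap
# far exceeding the sleep interval indicates the process was
# paused -- by GC, the OS scheduler, or anything else.
interval = 0.001     # 1 ms target sleep
threshold = 0.050    # report gaps over 50 ms
pauses = []

last = time.monotonic()
for _ in range(200):
    time.sleep(interval)
    now = time.monotonic()
    gap = now - last
    if gap > threshold:
        pauses.append(gap)
    last = now

print(f"pauses over {threshold * 1000:.0f} ms: {len(pauses)}")
```

Running such a meter alongside a suspect application helps distinguish "the collector paused us" from "the whole process was descheduled", which is exactly the distinction drawn above.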

The key thing to remember is that as you supply machine learning software with more data, it keeps on learning and adapting. Other areas in which a machine learning application can help marketers include: Customer segmentation – Machine learning customer segmentation models are very effective at extracting small, homogeneous groups of customers with similar behaviors and preferences. Customer churn prediction – By discovering patterns in the data generated by many customers who churned in the past, churn prediction machine learning forecasting can accurately predict which current customers are at a high risk of churning. This allows proactive churn prevention, an important way to increase revenues. Customer lifetime value forecasting – CRM machine learning systems are an excellent way to predict the customer lifetime value (LTV) of existing customers, both new and veteran.
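As a rough sketch of the churn-prediction idea (synthetic data and a hand-rolled logistic regression, not any vendor's system), the model learns which behavioral features separate churners from stayers and outputs a risk score per customer:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical features per customer: [logins_per_week, support_tickets],
# standardized. In this simulation, churners log in rarely and
# file more tickets.
n = 400
X = rng.normal(0, 1, (n, 2))
logits = -1.5 * X[:, 0] + 1.0 * X[:, 1]
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logits))).astype(float)

# Logistic regression fit by plain gradient descent.
w = np.zeros(2)
b = 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (p - y) / n)
    b -= 0.5 * np.mean(p - y)

# Risk score per customer: high scores flag churn candidates
# for proactive retention outreach.
risk = 1 / (1 + np.exp(-(X @ w + b)))
accuracy = np.mean((risk > 0.5) == y)
print(f"weights: {w.round(2)}, training accuracy: {accuracy:.2f}")
```

The learned weights recover the direction of the simulated behavior (negative on logins, positive on tickets), which is the "discovering patterns in past churners" step the excerpt describes.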

While people have long been seen as the weakest link in IT security through lack of risk awareness and good security practice, the people problem also includes the skills shortage at a technical level as well as the risk from senior business stakeholders making poor critical decisions around strategy and budgets. Interestingly, the increase in reported skills shortages contrasts with a decrease in those reporting a lack of experience being a market factor. This suggests that as the industry matures the shortage of experienced, senior managerial professionals will reduce and the problem will be felt most acutely in the hands-on technical disciplines. “The survey highlights the continued need for industry, government, academia and professional organisations like the IISP to continue to work hard to attract new entrants and younger people into the industry,” said Piers Wilson, author of the report and director at the IISP.

In this market, what a lot of our customers see is that their biggest challenge is people. There are a lot of people when it comes to setting up MSSPs. The investment that you made is the big differentiator, because it’s not just the technology, it’s the people and process. When I look at the market and the need in this market, there is a lack of talented people. How did you build your process and the people? What did you have to do yourself to build the strength of your bench? Later on we can talk a little bit more about Zayo and how HPE can help put all of this together. ... But within the SOC, our customers require things like customized reporting and even customized instant-response plans that are tailored to meet their unique audits or industry regulations. It’s people, process and tools or technology, as they say. I mean, that is the lifeline of your SOC.

At any recent security conference lately, you probably have heard hundreds of vendors repeating the words "We have the best artificial intelligence (AI) and machine learning." If you happened to be in one of those conversations and asked "What does that mean?," you probably got a blank stare. Many security consumers are frustrated when marketing pitches don't clearly articulate what AI does in a product to help protect an environment better. There are several dilemmas facing security companies that keep them from being more up-front about how they use AI and machine learning. For some, the concepts are a marketing statement only, and what they call AI and machine learning is actually pattern matching. Also, machine learning relies on a tremendous volume of data to be effective, and there are very few vendors that possess enough of it to be successful in its implementation.

While the technology has grown in popularity, mainly because it's the basis for the wildly hyped cryptocurrency and payment platform Bitcoin, many experts are still not sure exactly how it works. Even the founder of Bitcoin, Satoshi Nakamoto, is a shadowy figure, and no one appears to know with certainty who he is or if the name is a pseudonym for a group of developers. Nakamoto, however, holds one million bitcoins, the equivalent of $1.1 billion. Angus Champion de Crespigny, blockchain leader at Ernst & Young, called the technology "overhyped" and said many business applications touted as beneficiaries of its use have regulatory or operational issues that can be difficult to solve via one technology alone. "We're seeing interest in using it to propagate security policies and identity access management, but it's early days.

"ReactXP is designed with cross-platform development in mind," its site says, though it promises it will only let developers "share most of your code" among platforms. "With React and React Native, your Web app can share most of its logic with your iOS and Android apps, but the view layer needs to be implemented separately for each platform. We have taken this a step further and developed a thin cross-platform layer we call ReactXP." Developer Eric Traut provided more information in a blog post. "It builds upon React JS and React Native, allowing you to create apps that span both Web and native with a single code base," he said. Although it's built on both implementations, an FAQ indicates it borrows more heavily from React Native. ... ReactXP is described as a thin abstraction layer built upon and bridging React JS and React Native.

"A lot of the vulnerability is bad configurations which stem from poor consultancy. These things weren't meant for a huge company," Grigg said. He's hardly pointing the finger at anyone to lay blame, as Grigg said that in his earlier years he had likely provided some bad consultancy. "I started to notice buddies of mine who were really good consultants, and watching them do their work, I thought, 'I probably shouldn't be allowed to touch this stuff'. Unfortunately, it's the norm to have bad consultants," Grigg said. Many companies hire a third party to come in as the 'fix it' people. Those that specialized in SIEM platforms, as Grigg eventually did, found themselves "fixing what was super messed up," he said. Because so much of the SIEM industry is legacy software that was the same tool just redesigned and rebranded, Grigg said, "Those back doors still exist on there today."

The simplest way to increase value is to implement a policy that ensures that bugs are reproduced in a test case before any attempt at their resolution, so that they can’t happen again without being detected by running the test suite. Not only is the software better by having the bug removed, but the expected behaviour is now formally documented by an executable test case. But there is no such thing as a single best way to debug software. Each software developer has his/her own preferred tool or process to do so. ... When dealing with a buggy piece of software, I add assertions (available in some form in virtually all languages today) that check for the conditions that represent the expected behaviour of the system. I iteratively reduce the scope of my bug (things are all right when entering it, and faulty when exiting it) by adding more and more precise assertions, until I find the source of the problem, and fix it.
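The reproduce-before-fix policy can be sketched as follows; the function, the bug, and the test are all hypothetical:

```python
# A regression test written before fixing a (hypothetical) bug:
# total_price used to skip the last line item because of an
# off-by-one in the loop bound (it iterated items[:-1]).

def total_price(items):
    """Sum price * quantity over all line items (fixed version)."""
    total = 0.0
    for price, qty in items:
        # Assertion documenting an expected invariant, in the
        # spirit of the narrowing-down technique described above.
        assert qty >= 0, f"negative quantity: {qty}"
        total += price * qty
    return total

def test_last_item_is_counted():
    # This test reproduced the bug: it failed against the old
    # implementation, and now it documents the expected behaviour
    # and guards against regression.
    assert total_price([(2.0, 1), (3.0, 2)]) == 8.0

test_last_item_is_counted()
print("regression test passed")
```

The test earns its keep twice: once by pinning down the bug before the fix, and permanently by failing loudly if the off-by-one ever returns.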

Quote for the day:

"A bird isn't afraid of the branch breaking because its trust is not on the branch, but on its wings." -- Unknown

For humans to be the most productive in their collaboration with machines, they need advanced technology skills that probably exceed their current capabilities. The skills gap must be closed for workers at various levels of competencies and who possess a variety of experiences. Filling such widely disparate skills gaps, bridging the college-to-work gap, and retooling millions of workers into completely new jobs are daunting tasks. Traditional approaches to education have come under pressure due to the costs (student debt in the U.S. is estimated at $1.3 trillion) and questionable efficacy (a late-2016 study showed that nearly half of new college graduates are underemployed). Given the magnitude of the problem, a new approach is necessary. Though not yet widely adopted, adaptive learning is a low-cost, proven, and highly efficient way to equip people from factory workers to physicians with skills — not just in technology, but in other realms as well.

StorageOS also optimises storage, tracking where containers are running and ensuring storage remains as local as possible to keep latency down. It aims to tackle the key weakness of storage for container environments – that container storage is not persistent. That means that when containers cease running, whether for planned or unplanned reasons, storage is lost and not resumed when containers are restarted. Containers are gaining popularity because of their ability to be deployed and scaled rapidly. Organisations can deploy a given number of containers to support a campaign launch, for example, then, if demand spikes, more containers can be added, effectively increasing the parallelised operation of the application. These can also be in different locations, so some containers could be run in-house while additional capacity is run from a public cloud.

The industry should aim to achieve a level of interactive integration and cooperation between analysts and their tools, so that they seamlessly play off of each other’s strengths to be better than their sum. The current places where analyst and automation meet are the SIEM and the threat intelligence platform. The SIEM is the centre of events. The threat intelligence platform (TIP) is where intelligence is managed by the analyst. Your SIEM and TIP should work well enough together that any events that already correlate to threat intelligence can be viewed in the SIEM while the TIP can still be used to research any probable future threats. The experienced analyst is central to the process for the steps that require their intuition, given all of the possible information, to make a decision. Once they make or review decisions they can quickly deploy any changes to the appropriate systems or channels.
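At its simplest, the SIEM/TIP correlation described here amounts to matching event indicators against an intelligence set. A toy sketch, with made-up events and indicators (the IPs are from the reserved TEST-NET documentation ranges):

```python
# Known-bad indicators, as might be exported from a TIP.
threat_intel = {"203.0.113.7", "198.51.100.9"}

# Events as they might appear in a SIEM, reduced to the bare minimum.
events = [
    {"id": 1, "src_ip": "192.0.2.10"},
    {"id": 2, "src_ip": "203.0.113.7"},
    {"id": 3, "src_ip": "192.0.2.11"},
]

# Events correlating with intelligence surface for the analyst;
# the rest remain in the SIEM for routine review.
matches = [e for e in events if e["src_ip"] in threat_intel]
print(f"correlated events: {[e['id'] for e in matches]}")  # → [2]
```

Real integrations add indicator types beyond IPs (hashes, domains), confidence scores, and expiry, but the core hand-off is this set-membership check.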

To be sure, a few others could build a similar service, namely Amazon and Microsoft. But they haven’t yet. With help from TrueTime, Spanner has provided Google with a competitive advantage in so many different markets. It underpins not only AdWords and Gmail but more than 2,000 other Google services, including Google Photos and the Google Play store. Google gained the ability to juggle online transactions at an unprecedented scale, and thanks to Spanner’s extreme form of data replication, it was able to keep its services up and running with unprecedented consistency. Now Google wants a different kind of competitive advantage in the cloud computing market. It hopes to convince customers that Spanner provides an easier way of running a global business, an easier way of replicating their data across multiple regions and thus guarding against outages.

In developing mobile policies, hospitals must address the security of patient information and the need to comply with the privacy and security regulations of the Health Insurance Portability and Accountability Act (HIPAA), notes the Spok report. Some organizations that responded to the survey, in fact, “viewed mobile strategies as primarily a security project concerning HIPAA compliance,” the report points out. However, hospitals’ mobility strategies must extend beyond security to help them reach their organizational goals, Edds says. Kuhnen, similarly, says that hospitals must go beyond mobile security if they don’t want to fall behind. “They need to look at the productive uses of mobile technology—how the technology can make their workflows more efficient and improve user satisfaction.”

To decrease customer churn, you can use predictive modeling to identify the variables that predict churn. While you can find drivers of churn manually when the data set is small, once you integrate all of your data sources you will need to rely on machine learning: integrated data sets can contain so many variables that data analysts and scientists are simply unable to sift through the sheer volume of data by hand. Machine learning is a set of techniques that allow computers to make dynamic, data-driven decisions without explicit human input. In the context of CSM, machine learning helps computers “learn” the differences between users who stay and those who leave.
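As a deliberately tiny illustration (the user records and feature names below are fabricated), the manual “driver” analysis that breaks down at scale can be sketched by comparing churn rates with and without each candidate feature; at realistic data volumes a learned model such as logistic regression or gradient boosting would replace this hand-rolled scoring.

```python
# Toy churn-driver analysis over fabricated user records.
users = [
    {"low_usage": 1, "support_tickets": 1, "churned": 1},
    {"low_usage": 1, "support_tickets": 0, "churned": 1},
    {"low_usage": 0, "support_tickets": 1, "churned": 0},
    {"low_usage": 0, "support_tickets": 0, "churned": 0},
    {"low_usage": 1, "support_tickets": 0, "churned": 0},
    {"low_usage": 0, "support_tickets": 0, "churned": 0},
]

def churn_rate(group):
    return sum(u["churned"] for u in group) / len(group) if group else 0.0

def driver_scores(users, features):
    """For each binary feature, compare churn rate when the flag is set
    versus not; a large gap suggests the feature predicts churn."""
    scores = {}
    for f in features:
        with_flag = [u for u in users if u[f] == 1]
        without = [u for u in users if u[f] == 0]
        scores[f] = churn_rate(with_flag) - churn_rate(without)
    return scores

scores = driver_scores(users, ["low_usage", "support_tickets"])
```

In this toy data, `low_usage` separates churners from retained users more strongly than `support_tickets`; with dozens of integrated variables, this per-feature inspection is exactly what becomes infeasible by hand.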

Wearables may soon no longer rely on a smartphone, as more network-connected smartwatches hit the market. One such smartwatch, launching next month, was developed by a major network to function as an independent device. Verizon’s new Wear24 smartwatch can connect to Verizon’s 4G LTE network without requiring a smartphone. According to Verizon, the smartwatch automatically uses the owner’s existing phone number when sending texts and making calls. The smartwatch is equipped with an eSIM (Embedded Subscriber Identity Module), which provides the network connectivity; it functions like the SIM card in a smartphone but is not removable. Integrating eSIMs into IoT devices enables networks to remotely configure device connectivity settings and to allow or deny access based on the status of a device owner’s subscription.

The talent shortage is real, and it might get worse before it gets better. As the amount of accessible data grows, data crime is becoming more pervasive. Ransomware, sophisticated extended-duration attacks, phishing and whaling attacks are all targeting large enterprises, government organizations, mom-and-pop shops and everyone in between. It doesn't help that the rapid growth of data crime is a relatively new trend, making it hard to find people who are deeply experienced in fighting it and who can be thrown into the fire immediately. This gap hits small-business leaders hardest, as they often can’t compete with larger companies when it comes to offering the salary and benefits that attract today’s top IT talent. At this point, qualified new hires command average salaries of roughly $150,000, and that number most likely has room to grow.

"This is the real scare, to not just a particular industry of a particular size, but to everybody. It is a matter of existence," said Aurora. That's where Darktrace's artificial intelligence system comes in, with the latest technology offering called Antigena. Once a threat is identified, Antigena automatically responds by taking proportionate actions to neutralize it and buy security teams enough time to catch up. In essence, it acts like a digital antibody that can slow down or stop compromised connections or devices within a network without disrupting normal business operations. "Human beings are still going to be fundamental, but right now, the kind of attacks — you find it very difficult to figure out and they're so quick that if you look at traditional means, by the time human beings get to respond, it's too late," Aurora explained.

A common request from network operations: “I don’t want to wait for users to phone us about problems, nor do I have time to sift through mounds of data. Tell us who’s having a problem and how to fix it.” True analytics needs to automatically surface insights and recommend useful actions that IT can take to proactively improve user experience. What’s more, the tools should prioritize the actions that deliver the biggest bang for the buck in improving users’ network experience. ... But what comes out of the machine learning algorithm must be translated back into a plain-English recommendation, such as: “By removing the rogue access points interfering with the 5GHz radio of a certain access point you can effectively mitigate 400 client hours of poor client Wi-Fi performance.”
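That last translation step can be sketched roughly as follows (the findings, field names, and impact numbers are invented for illustration): rank the model’s per-issue impact estimates and render each as a plain-English recommendation, so the highest-value fix appears first.

```python
# Hypothetical model output: estimated impact per remediable issue.
findings = [
    {"action": "remove the rogue APs interfering with the 5GHz radio of AP-17",
     "client_hours_saved": 400},
    {"action": "rebalance channel width on AP-03",
     "client_hours_saved": 120},
]

def recommend(findings):
    """Rank findings by estimated impact and render each as a readable
    sentence, so the biggest-bang-for-the-buck action comes first."""
    ranked = sorted(findings, key=lambda f: f["client_hours_saved"], reverse=True)
    return [
        f"Recommendation: {f['action']} to mitigate "
        f"{f['client_hours_saved']} client hours of poor Wi-Fi performance."
        for f in ranked
    ]

messages = recommend(findings)
```

The ranking is the design point: operations teams act on one or two items, so ordering by estimated client-hours saved puts the proactive fix with the largest payoff at the top of the list.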

Quote for the day:

"Any powerful idea is absolutely fascinating and absolutely useless until we choose to use it." -- Richard Bach