Category Archives: Security

Around The Cloud

Ars Technica has been buzzing this week about how safe the Linux kernel is, calling its current situation an “unprecedented security crisis”. Linux now underpins not only server farms but also the cloud, Android phones, Chromebooks, and everything connected to the Internet of Things (IoT). That ubiquity makes it a single point of failure: like the first domino in a line, one kernel exploit could send all the pieces crashing down. Kees Cook, the head of the new Linux Kernel Self Protection Project, said, “…the Linux kernel needs to deal with attacks in a manner where it actually is expecting them and actually handles gracefully in some fashion the fact that it’s being attacked.”

Google is planning to show off its latest and greatest innovations at an event in San Francisco on October 4th. We’re expecting to see new phones, new pricing and release dates for Google’s Amazon Echo competitor, a new Chromecast, and a new router. But what the internet is really buzzing about are the new Pixel and Pixel XL phones, which will showcase Android 7.1 Nougat and Allo, Google’s new messaging app built around its virtual assistant. The Verge goes into more detail about the Pixel phones, as well as Google Home (the Amazon Echo competitor), Chromecast, Google Wi-Fi, and a mysterious new operating system for laptops and tablets nicknamed “Andromeda”.

In other Google-related news, TIME reported today that Google has started testing its Uber competitor, Waze Rider, in San Francisco. The Waze Rider app connects riders with drivers who are already traveling in the same direction. Uber, by contrast, simply pairs a rider with the nearest driver, regardless of where that driver is headed, and requires the driver to devote the trip entirely to the rider. While Waze Rider’s approach means cheaper fares for riders, it also means lower pay for drivers. On the other hand, drivers don’t have to turn driving into a part-time or full-time job, making Waze Rider an easy way to earn a little money on the way to work or while running errands around town.

An angry French man has won the internet today with his display of passion in an Apple store. While Apple has often received praise for its high standards in customer service, evidently this man did not agree. He entered an Apple store in Dijon yesterday and began calmly smashing Apple devices with a boule, the weighted metal ball used in the French game of pétanque, to make his point clear, perhaps because he felt that no one was taking him seriously before. Video footage of the event can be seen at The Verge.

Yahoo Breach

Latest reports suggest that the recent Yahoo! data breach may exceed 500 million records, with some sources suggesting millions more were compromised, bringing the total number of records stolen across recent hacks to approximately 3.5 billion. CloudTweaks spoke to Kevin O’Brien, CEO of GreatHorn, for expert insight into this latest breach. GreatHorn provides cybersecurity solutions for cloud communication platforms and is the first automated spear phishing prevention platform natively integrated into cloud-based email systems such as Google Apps and Office 365.

Says O’Brien, “It’s concerning that it took two years to uncover the breach and demonstrates how ill-equipped even one of the world’s largest tech companies is to address the gap between a breach and detection. The attackers – who are presumed to be state-sponsored hackers – didn’t just steal your grandmother’s email address. They stole the good stuff: unencrypted security questions and answers as well as full names. It’s troubling that this data was unencrypted at all; security questions are often re-used between sites and provide full account access.”

The Relevant Details

With the theft potentially including a variety of personal data, from names to telephone numbers to security question answers, users of this global service have been put at risk. The incident is possibly the largest attack of its kind in terms of user accounts compromised. The FBI is involved in the investigation but has yet to comment on allegations that the attack may have come from outside the USA, possibly from a foreign government. Though Yahoo! hasn’t revealed the evidence leading it to believe the attack may be state-sponsored, governments have in the past hacked email accounts to keep track of citizens or dissidents, and some experts suggest the 2010 hacking of Gmail accounts used by Chinese human rights activists may have had such a motivation.

Unfortunately, the discovery of the hack is most certainly not the end of the line. Stresses O’Brien, “The Yahoo! breach will likely lead to a long tail of harder to detect phishing attacks. For example, since Q2 2015, we’ve been tracking a resurgence in ‘Display Name’ spoof attacks, aimed especially at enterprise clients where the stakes are millions of dollars’ worth of damages. These attacks involve a criminal using a friendly name, e.g., that of a spouse, co-worker, or friend, but sending messages from an email address that isn’t the one the sender typically uses. This is often an attempt to trick people into divulging sensitive information – ‘I need the W2s for these employees for a wage study, can you send them to me?’ – or authorizing fraudulent invoice payments or wires. With the account credential loss involved here, we can expect these attacks to become more sophisticated, as these faked emails will come from the actual addresses of the spoofed sender, not ‘yourceo@c-level.co.’”

What’s Next?

As we’re constantly reminded, running the latest cybersecurity solutions should be a top priority; beyond that, we all need to follow standard security protocol, stay informed about potential risks, and observe fundamental security principles. The Yahoo! breach may still leave ordinary users at risk, especially if the information makes it onto the black market and is sold on. Many people use the same username, email address, and password across many online services, some of which store financial information such as banking and credit card details, so the circulation of breached data further increases their vulnerability. Resetting passwords for Yahoo! accounts isn’t enough; for those potentially affected, an overhaul of all online and network protection may be in order.

The breach is a wake-up call for many, Yahoo! users or not: review your accounts for suspicious activity, implement two-step authentication where possible, and take the threat of phishing campaigns seriously. Who knows what breaches are happening right now that we won’t learn about for another two years?

Cloud Architecture

These days, multi-tier applications are the norm. From SharePoint’s front-end/back-end configuration to LAMP-based websites using multiple servers for different functions, a multitude of apps require public- and private-facing components to work in tandem. Placing these apps on entirely public-facing platforms and networks simplifies deployment, but at the cost of security vulnerabilities. Locating everything on back-end networks causes headaches for end-users, who must access the systems over VPN and other private links.

Many strategies have been implemented to address this issue in traditional datacenter infrastructures. Independent physical networks with a “DMZ” for public-facing components, complex routers, and firewall configurations have all done the job, although they add multiple layers of complexity and require highly specialized knowledge and skills.

Virtualization has made management much easier, but virtual administrators are still required to create and manage each aspect of the configuration – from start to finish. Using a private cloud configuration can make the process much simpler, and it helps segment control while still enabling application administrators to get their jobs done.

Multi-tenancy in the Private Cloud

Private cloud architecture allows for multi-tenancy, which in turn allows for separation of the networking, back-end and front-end tiers. Cloud administrators can define logical relationships between components and enable the app admins to manage their applications without worrying about how they will connect to each other.

One example is a web-based application using a MySQL back-end data platform. In a traditional datacenter, the app administrators would request connectivity that either isolates just the back-end database or isolates everything, allowing only minimal web traffic to cross the threshold. This requires network administrators to spend hours working with the app team to create and test firewall and other networking rules, ensuring the needed access without opening any security holes that could be exploited.

Applying private cloud methodology changes the game dramatically.

Two individual virtual networks can be created by the cloud administrator. Within each network, traffic flows freely, removing the need to manually create networking links between components in the same virtual network entirely. In addition, a set of security groups can be established that will only allow specified traffic to route between the back-end data network and the front-end web server network – specifically ports and protocols used for the transfer of MySQL data and requests. Security groups utilize per-tenant access control list (ACL) rules, which allow each virtual network to independently define what traffic it will and will not accept and route.
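The per-tenant ACL logic described above can be sketched in a few lines of Python. This is a simplified, hypothetical model, not any particular cloud platform’s API: the network names, rule fields, and default-deny behavior are illustrative assumptions. Each virtual network carries its own allow-list, and a flow routes only if a rule on the destination network matches.

```python
# Simplified sketch of per-tenant security-group evaluation.
# Network names and rule fields are illustrative only.

RULES = {
    # The back-end data network accepts only MySQL traffic (TCP 3306)
    # originating from the front-end web network.
    "backend-net": [("frontend-net", "tcp", 3306)],
    # The front-end web network accepts HTTP/HTTPS from anywhere.
    "frontend-net": [("any", "tcp", 80), ("any", "tcp", 443)],
}

def is_allowed(src_net: str, dst_net: str, proto: str, port: int) -> bool:
    """Return True if the destination network's ACL permits this flow."""
    for allowed_src, allowed_proto, allowed_port in RULES.get(dst_net, []):
        if allowed_src in (src_net, "any") and \
           allowed_proto == proto and allowed_port == port:
            return True
    return False  # default deny

# MySQL from the web tier reaches the data tier...
print(is_allowed("frontend-net", "backend-net", "tcp", 3306))       # True
# ...but the same port probed from another tenant's network is dropped.
print(is_allowed("other-tenant-net", "backend-net", "tcp", 3306))   # False
```

The key property is the default deny at the end: anything not explicitly permitted between networks never routes, which is what keeps one tenant’s mistakes from opening holes in another’s.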

Private cloud networking

Due to the nature of private cloud networking, it becomes much easier to not only ensure that approved data is flowing between the front and back end networks, but to ensure that traffic only flows if it originates from the application networks themselves. This allows for free-flow of required information but blocks anyone outside the network from trying to enter through those same ports.

In the front-end virtual network, all web traffic ports are opened so that users can access the web servers. The front-end network can be configured to reject any other protocol or port, and routing from the outside world is allowed only to the front-end servers, nowhere else. This has the dual effect of enabling the web servers to do their jobs while denying access to other administrators or anyone else in the datacenter, minimizing faults due to human error or malicious intent.

Once the application and database servers are installed and configured by the application administrators, the solution is complete. MySQL data flows between the back-end and front-end networks, but no traffic from other sources reaches the data network. Web traffic from the outside world flows into and out of the front-end network, but it cannot “leapfrog” into the back-end network because external routes are not permitted to any other server in the configuration. As each tenant is handled separately and governed by individual security groups, app administrators from other groups cannot interfere with the web application. Nor can those admins create security vulnerabilities by accidentally opening unnecessary ports across the board simply because their own apps need them.

Streamlined Administration

Finally, the entire process becomes easier when each tenant has access to self-service, only relying on the cloud administrator for configuration of the tenancy as a whole and for the provisioning of the virtual networks. The servers, applications, security groups and other configurations can now be performed by the app administrator, and will not impact other projects, even when they reside on the same equipment. Troubleshooting can be accomplished via the cloud platform, which makes tracking down problems much easier. Of course, the cloud administrator could manage the entire platform, but they no longer have to.

Using a private cloud model allows for greater flexibility, better security, and easier management. While it is possible to accomplish this with a traditional physical and virtual configuration, adding the self-service and highly configurable tools of a private cloud is a great way to take control, and make your systems work the way you want, instead of the other way around.

Ariel brings more than twenty years of technology innovation and entrepreneurship to Stratoscale. After a ten-year career with the IDF, where he was responsible for managing a section of the Technology R&D Department, Ariel founded Passave, now the world leader in FTTH technology. Passave was established in 2001, and acquired in 2006 by PMC-Sierra (PMCS), where Ariel served as VP of Strategy. In 2006 Ariel founded Pudding Media, an early pioneer in speech recognition technology, and Anobit, the leading provider of SSD technology acquired by Apple (AAPL) in 2012. At Apple, he served as a Senior Director in charge of Flash Storage, until he left the company to found Stratoscale. Ariel is a graduate of the prestigious IDF training program Talpiot, and holds a BSc from the Hebrew University of Jerusalem in Physics, Mathematics and Computer Science (Cum Laude) and an MBA from Tel Aviv University. He holds numerous patents in networking, signal processing, storage and flash memory technologies.

Multi-Instance vs. Multi-Tenant

The cloud is part of everything we do. It’s always there backing up our data, pictures, and videos. To many, the cloud is considered to be a newer technology. However, cloud services actually got their start in the late 90s, when large companies used them as a way to centralize computing, storage, and networking. Back then, the architecture was built on database systems originally designed for tracking customer service requests and running financial systems. For many years, companies like Oracle, IBM, EMC and Cisco thrived in this centralized ecosystem as they scaled their hardware to accommodate customer growth.

Unfortunately, what is good for large enterprises does not typically translate to a positive experience for customers. While the cloud providers have the advantage of building and maintaining a centralized system, the customers must share the same software and infrastructure. This is known as a multi-tenant architecture, a legacy design that nearly all clouds still operate on today.

Here are three major drawbacks of the multi-tenant model for customers:

Commingled data – In a multi-tenant environment, the customer relies on the cloud provider to logically isolate their data from everyone else’s. Essentially, customers’ and their competitors’ data could be commingled in a single database. While you cannot see another company’s data, the data is not physically separate and relies on software for separation and isolation. This has major implications for government, healthcare, and financial regulations, not to mention the risk of a security breach that could expose your data along with everyone else’s.

Excessive maintenance and downtime – Multi-tenant architectures rely on large and complex databases that require regular hardware and software maintenance, resulting in availability issues for customers. While some departments, such as sales or marketing, can tolerate downtime in the off hours, applications used across the entire enterprise need to be operational nearly 100 percent of the time. Ideally, enterprise applications should not experience more than 26 seconds of downtime a month on average. They simply cannot suffer the excessive maintenance downtime of a multi-tenant architecture.
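The 26-second figure corresponds to roughly “five nines” (99.999%) availability, as a quick calculation shows (assuming a 30-day month for simplicity):

```python
# Downtime per month implied by a given availability level.
def monthly_downtime_seconds(availability: float, days: int = 30) -> float:
    seconds_in_month = days * 24 * 3600       # 2,592,000 s in a 30-day month
    return seconds_in_month * (1.0 - availability)

print(round(monthly_downtime_seconds(0.99999), 1))  # 25.9 ("five nines")
print(round(monthly_downtime_seconds(0.999), 1))    # 2592.0 (~43 min at "three nines")
```

In other words, the bar being set here is five-nines availability, two orders of magnitude stricter than the three nines many departmental systems get by on.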

All are impacted – In a multi-tenant cloud, any action that affects the shared database – outages, upgrades, or availability issues – affects everyone who shares that tenancy. When software or hardware issues are found on a multi-tenant database, the resulting outage hits all customers, and the same goes for upgrades. The problem is most acute when this model is used to run enterprise-wide business services. Entire organizations cannot tolerate this shared approach for applications critical to their success; instead, they require upgrades done on their own schedule for planning purposes, and software and hardware issues isolated and resolved quickly.

With its lack of true data isolation and its multiple availability issues, multi-tenancy is a legacy cloud computing architecture that will not stand the test of time. To embrace and lead today’s technological innovations, companies need to look at an advanced cloud architecture called multi-instance. A multi-instance architecture provides each customer with their own unique database. Rather than using a large centralized database, instances are deployed on a per-customer basis, allowing the multi-instance cloud to scale horizontally.

With this architecture and deployment model come many benefits, including data isolation, advanced high availability, and customer-driven upgrade schedules.

Here’s a closer look at each of these areas:

True data isolation – In a multi-instance architecture, each customer has its own unique database, ensuring its data is not shared with other customers. Because instances are deployed on a per-customer basis rather than on a large centralized database, hardware and software maintenance is easier to perform, and issues can be resolved on a customer-by-customer basis.

Advanced high availability – Ensuring high availability of data and achieving true redundancy is no longer possible through legacy disaster recovery tactics. Multiple sites, tested infrequently and used only in the most dire of times, are simply not enough. Through multi-instance cloud technology, true redundancy is achieved by replicating the application logic and database for each customer instance between two paired, yet geographically separate, data centers. Each redundant data center is fully operational and active, resulting in near real-time replication of customer instances and databases. By coupling a multi-instance cloud with automation technology, customer instances can be quickly moved between the data centers, resulting in high availability of data.
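A crude way to picture the paired-data-center model described above: each customer instance lives in two active sites, and automation can swap which site is serving at any time. The data center and customer names below are invented for illustration; this is a sketch of the pairing idea, not any vendor’s implementation.

```python
# Toy model of multi-instance active-active pairing: each customer
# instance is replicated to a paired data center and can fail over.

PAIRS = {"dc-east": "dc-west", "dc-west": "dc-east"}

class CustomerInstance:
    def __init__(self, customer: str, primary: str):
        self.customer = customer
        self.primary = primary            # data center currently serving
        self.replica = PAIRS[primary]     # paired, fully active copy

    def fail_over(self):
        """Swap serving roles, e.g. for maintenance or an outage."""
        self.primary, self.replica = self.replica, self.primary

acme = CustomerInstance("acme", "dc-east")
acme.fail_over()                          # maintenance window at dc-east
print(acme.primary, acme.replica)         # dc-west dc-east
```

Because each customer is a separate instance, the failover is per-customer: moving one tenant for maintenance touches nobody else, which is exactly what the shared multi-tenant database cannot offer.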

Customer-driven upgrades – As described above, the multi-instance architecture allows cloud service providers to perform actions on individual customer instances, and this includes upgrades. A multi-instance cloud allows each instance to be upgraded on a schedule that fits compliance requirements and the needs of the individual customer.

When it comes down to it, the multi-instance architecture clearly has significant advantages over the antiquated multi-tenant clouds. With its data isolation and a fully replicated environment that provides high availability and scheduled upgrades, the multi-instance architecture puts customers in control of their cloud.

The Cloud, through bringing vast processing power to bear inexpensively, is enabling artificial intelligence. But, don’t think Skynet and the Terminator. Think cucumbers!

Artificial Intelligence (A.I.) conjures up images of vast, cool intellects bent on our destruction, or at best ignoring us the way we ignore ants. Reality is a lot different and much more prosaic: A.I. learns from your past preferences to recommend products from Amazon or movies and shows from Netflix. Now you can do it yourself, as one farmer in Japan did. He used it to sort his cucumber harvest.

Makoto Koike, inspired by seeing Google’s AlphaGo beat the world’s best Go player, decided to try using Google’s open source TensorFlow offering to address a much less exalted but nonetheless difficult challenge: sorting the cucumber harvest from his parents’ farm.

Now these are not just any cucumbers. They are thorny cucumbers, where straightness, vivid color, and a large number of prickles command premium prices. Each farmer has his own classification, and Makoto’s father had spent a lifetime perfecting his crop and the customer base for his finest offerings. The challenge was to sort the cucumbers quickly during the harvest so the best and freshest could reach buyers as rapidly as possible.

This sorting was previously a “human only” task that required much experience and training – ruling out supplementing the harvest with part-time temporary labor. The result was that Makoto’s poor mother would spend eight hours a day tediously sorting the cucumbers by hand.

Makoto tied together a video inspection system and mechanical sorting machines with his DIY software based on Google TensorFlow – and it works! If you want a deep dive on the technology, check out the details here. Essentially, the machine is trained to recognize a set of images that represent the different classifications of quality. The catch is that using just a standard local computer required keeping the images at a relatively low resolution, so the actual sorting is about 75% accurate. Even achieving that required three days of training the computer on 7,000 images.
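Makoto’s real system trains a TensorFlow neural network on photographs. As a loose, toy illustration of the same idea – learning quality grades from labeled examples, then assigning new items to the nearest learned grade – here is a nearest-centroid classifier over hand-picked cucumber features. All feature names, grades, and numbers are invented for illustration.

```python
# Toy stand-in for image-based sorting: classify cucumbers by a few
# numeric features using nearest-centroid "training". Purely illustrative;
# the real system trains a TensorFlow network on images.

# Training data: (length_cm, straightness 0-1, color 0-1) per quality grade
TRAINING = {
    "premium": [(32.0, 0.95, 0.90), (31.0, 0.92, 0.85)],
    "regular": [(28.0, 0.80, 0.70), (27.0, 0.75, 0.65)],
    "reject":  [(20.0, 0.40, 0.40), (22.0, 0.50, 0.45)],
}

def centroid(samples):
    """Average the feature vectors of one grade's training samples."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(3))

CENTROIDS = {grade: centroid(samples) for grade, samples in TRAINING.items()}

def classify(features):
    """Assign the grade whose centroid is closest (squared distance)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(CENTROIDS, key=lambda g: dist2(features, CENTROIDS[g]))

print(classify((31.5, 0.93, 0.88)))  # premium
print(classify((21.0, 0.45, 0.42)))  # reject
```

The hard part Makoto faced is precisely what this toy skips: a photograph doesn’t come with tidy features like “straightness”, so the neural network must learn them from raw pixels – which is where the training time and compute cost come from.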

Expanding to a server farm (no pun intended) large enough to raise that accuracy to 95% would be cost prohibitive and only needed during harvest. But Makoto is excited because Google offers Cloud Machine Learning (Cloud ML), a low-cost cloud platform for training and prediction that dedicates hundreds of cloud servers to training a network with TensorFlow. With Cloud ML, Google handles building a large-scale cluster for distributed training, and you just pay for what you use, making it easier for developers to try out deep learning without making a significant capital investment.

If you can do this with sorting cucumbers imagine what might be possible as cloud power continues to increase inexpensively and the tools get easier to use. The personal assistant on your phone will really become your personal assistant and not the clunky beasts they are today. In your professional life they’ll be your right-hand minion taking over the tedious aspects of your job. Given what Makoto achieved perhaps you should try your hand at it. Who knows what you might come up with?

By John Pientka

(Originally published Sept 22nd, 2016. You can periodically read John’s syndicated articles here on CloudTweaks. Contact us for more information on these programs)

Ransomware

The vision is chilling. It’s another busy day. An employee arrives and logs on to the network only to be confronted by a locked screen displaying a simple message: “Your files have been captured and encrypted. To release them, you must pay.”

Ransomware has recently grown to become one of the primary threats to companies, governments, and institutions worldwide. The technical nightmare of inaccessible files pairs with the more human nightmare of deciding whether to pay the extortionists or tough it out.

Security experts are used to seeing attacks of all types, and it comes as no surprise that ransomware attacks are becoming more frequent and more sophisticated.

Security Experts Take Note

Chris Sellards, a Certified Cloud Security Professional (CCSP) working in the southwestern U.S. as a senior security architect, points out that cyber threats change by the day and that ransomware is becoming the biggest risk of 2016. Companies might start out with adequate provisions against infiltration, but as they grow, their defenses sometimes do not grow with them. He points to the example of a corporate merger or acquisition. As two companies become one, the focus may be on the day-to-day challenges of the transition. But in the background, the data the new company now owns may be of significantly higher value than it was before. This can make the company a larger potential target, possibly even disproportionate to its new size.

The irony of ransomware as a security threat is that its impact can be significantly reduced through adequate backup and storage protocols. As Michael Lyman, a Boston-area CCSP, states, when companies are diligent about disaster recovery, they can turn ransomware from a crisis into merely a nuisance. Organizations must pay attention to their disaster recovery plans; it’s a classic case of the ounce of prevention being worth more than the pound of cure. However, he points out that such diligence is not happening as frequently as it should.

As an independent consultant, Michael has been called into companies either to implement a plan or to help fix the problem once it has happened. He points out that with many young companies still in their first years of aggressive growth, the obligation to stop and make sure that all the strategic safeguards are in place is often pushed aside. “These companies,” he says, “tend to accept the risk and focus instead on performance.” He is usually called in only after the Board of Directors has asked management for a detailed risk assessment for the second time.

Neutralizing The Danger

Adequate disaster preparations and redundancy can neutralize the danger of having unique files held hostage. It is vital that companies practice a philosophy of “untrust,” meaning that everything on the inside must remain locked up. It is not enough to simply have a strong wall around the company and its data; it must be assumed that attackers will find their way in somehow, which means all the data on the inside must be adequately and constantly encrypted.

It is also essential to bear in mind that ransomware damage does not exist solely inside the organization. There are also costs and damage to the company–client relationship. Worst of all is the specter of leaked confidential files – the data that clients entrusted to a company – and the recrimination and litigation that follow. But even when a ransom event is resolved, meaning files are recovered and no data is stolen, there is still the damage to a company’s reputation when the questions start to fly: “How could this have happened?” and “How do we know it won’t happen again?”

As cloud and IoT technologies continue to connect with each other, businesses and business leaders must understand that they own their risk. It is appropriate for security experts to focus on the fear factor, especially when conversing with members of the executive team, for whom the cost of adequate security often seems to fly in the face of profitability. Eugene Grant, a CCSP based in Ontario, Canada, suggests that the best way to convey the significance of a proactive security plan is to back up your presentation with facts – facts that reveal a quantitative risk assessment rather than a solely qualitative one. In other words, bring it down to cost versus benefit.

No company is too small to be immune or invisible to the black hats. It is up to the security specialists to convey that message.

For more on the CCSP certification from (ISC)2, please visit their website. Sponsored by (ISC)2.

Micro-segmentation

Changing with the times is frequently overlooked when it comes to data center security. The technology powering today’s networks has become increasingly dynamic, but most data center admins still employ archaic security measures to protect their network. These traditional security methods just don’t stand a chance against today’s sophisticated attacks.

That hasn’t stopped organizations from diving head-first into cloud-based technologies. More and more businesses are migrating workloads and application data to virtualized environments at an alarming pace. While the appetite for increased network agility drives massive changes to infrastructure, the tools and techniques used to protect the data center also need to adapt and evolve.

Recent efforts to upgrade these massive security systems are still falling short. Since data centers by design house huge amounts of sensitive data, there shouldn’t be any shortcuts when implementing security to protect it. Yet the focus remains on providing protection only at the perimeter to keep threats outside. Strictly perimeter-centric security – such as a Content Delivery Network (CDN) – leaves the inside of the data center, where the actual data resides, vulnerable.

Cybercriminals understand this all too well. They constantly utilize advanced threats and techniques to breach external protections and move deeper inside the data center. Without strong internal security protections, hackers have visibility into all traffic and the ability to steal data or disrupt business processes before they are even detected.

Security Bottleneck

At the same time, businesses face additional challenges as traffic behavior and patterns shift. There are greater numbers of applications within the data center, and these applications are all integrated with each other. The increasing number of applications has caused the amount of east-west traffic – traffic moving laterally among applications and virtual machines – within the data center to grow drastically as well.

As more traffic stays within the data center and never crosses the north-south perimeter defenses, security controls are blind to it – making lateral threat movement possible. With the rising number of applications, hackers have a broader choice of targets. Compounding this challenge, traditional processes for managing security are manually intensive and very slow. Applications are now created and evolve far more quickly than static security controls can keep pace with.

To address these challenges, a new security approach is needed – one that brings security inside the data center to protect against advanced threats: micro-segmentation.

Micro-Segmentation

Micro-segmentation works by grouping resources within the data center and applying specific security policies to the communication between those groups. The data center is essentially divided into smaller, protected sections (segments) with logical boundaries, which increases the ability to discover and contain intrusions. However, despite the separation, application data still needs to cross micro-segments to communicate with other applications, hosts, or virtual machines. Lateral movement therefore remains possible, which is why threat prevention must inspect traffic crossing the micro-segments in order to detect and block lateral movement in the data center.

For example, a web-based application may use the SQL protocol to interact with database servers and storage devices. The application’s web services are all logically grouped in the same micro-segment, and rules are applied to prevent these services from having direct contact with other services. However, SQL may be used across multiple applications, providing a handy exploit route for advanced malware inserted into the web service for the purpose of spreading itself laterally throughout the data center.
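The inspection requirement described above can be pictured with a small model: traffic inside a segment flows freely, while any flow crossing a segment boundary must match a policy and then pass threat-prevention inspection before delivery. The segment names, policy entries, and the trivially naive `inspect` function below are all invented for illustration; real inspection would be an IPS, anti-bot, or sandboxing engine.

```python
# Toy model of micro-segmentation: intra-segment traffic flows freely;
# cross-segment traffic must pass both policy and threat inspection.

SEGMENT_OF = {
    "web-1": "web-tier", "web-2": "web-tier",
    "db-1": "db-tier",
    "build-1": "ci-tier",
}

# Allowed cross-segment flows: (src segment, dst segment, protocol)
POLICY = {("web-tier", "db-tier", "sql")}

def inspect(payload: bytes) -> bool:
    """Stand-in for IPS/anti-bot/sandbox inspection of the payload."""
    return b"malware" not in payload      # naive, illustrative only

def verdict(src_host, dst_host, proto, payload):
    src_seg, dst_seg = SEGMENT_OF[src_host], SEGMENT_OF[dst_host]
    if src_seg == dst_seg:
        return "allow"                    # same segment: free flow
    if (src_seg, dst_seg, proto) not in POLICY:
        return "drop"                     # boundary has no rule for this
    return "allow" if inspect(payload) else "quarantine"

print(verdict("web-1", "db-1", "sql", b"SELECT 1"))      # allow
print(verdict("build-1", "db-1", "sql", b"SELECT 1"))    # drop (lateral probe)
print(verdict("web-1", "db-1", "sql", b"malware blob"))  # quarantine
```

Note the difference from a plain firewall: even a flow the policy permits (web tier to database over SQL) is still inspected, so malware riding an allowed protocol across the boundary can be caught rather than waved through.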

Micro-segmentation with advanced threat prevention is emerging as the new way to improve data center security. It provides the ability to insert threat prevention security – firewall, Intrusion Prevention System (IPS), antivirus, anti-bot, sandboxing technology and more – to inspect traffic moving into and out of any micro-segment and prevent the lateral spread of threats. However, this presents its own challenges due to the dynamic nature of virtual networks, namely the ability to rapidly adapt the infrastructure to accommodate bursts and lulls in traffic patterns or the rapid provisioning of new applications.

For data center security to keep up with such rapid changes, security in a software-defined data center needs to learn the role, scale, and location of each application. This allows the correct security policies to be enforced, eliminating the need for manual processes. What’s more, dynamic changes to the infrastructure are automatically recognized and absorbed into security policies, keeping security tuned to the actual environment in real time.

Furthermore, by sharing context between security and the software-defined infrastructure, the network becomes better able to adapt to and mitigate risks. For example, if an advanced threat prevention solution protecting a micro-segment identifies an infected VM, that VM can automatically be re-classified as infected. The re-classification can then trigger a predefined remediation workflow to quarantine and clean the infected VM.

Once the threat has been eliminated, the infrastructure can re-classify the VM back to its "cleaned" status and remove the quarantine, allowing the VM to return to service. Firewall rules are adjusted automatically and the entire event is logged – including what remediation steps were taken and when the issue was resolved – without manual intervention and without losing visibility or control.
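The remediation loop just described can be sketched as a small event-driven workflow: a threat-prevention verdict re-tags the VM, the tag change triggers quarantine, and a later clean verdict returns it to service, with every step logged. All names here are illustrative, not a real orchestration API.

```python
# Hypothetical sketch of the tag-driven remediation workflow described above.
import datetime

log = []    # (timestamp, vm, event) audit trail
state = {}  # vm -> current classification tag

def _record(vm: str, event: str) -> None:
    log.append((datetime.datetime.now(datetime.timezone.utc), vm, event))

def set_tag(vm: str, tag: str) -> None:
    """Re-classifying a VM drives firewall behavior: 'infected' maps to a
    quarantine group reachable only by the remediation service."""
    state[vm] = tag
    _record(vm, tag)
    if tag == "infected":
        quarantine(vm)
    elif tag == "cleaned":
        release(vm)

def quarantine(vm: str) -> None:
    _record(vm, "quarantined")      # firewall rules tightened automatically

def release(vm: str) -> None:
    _record(vm, "returned to service")  # quarantine lifted, rules restored

set_tag("app-03", "infected")   # IPS verdict re-classifies the VM
set_tag("app-03", "cleaned")    # remediation done; VM returns to service
for ts, vm, event in log:
    print(ts.isoformat(), vm, event)
```

Because every transition goes through `set_tag`, the audit trail captures both what happened and when, which is exactly the visibility the article says should survive automation.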

Strong perimeter security is still an important element of an effective defense-in-depth strategy, but perimeter security alone offers minimal protection for virtualized assets within the data center; it is difficult to protect data and assets that aren't known or seen. With micro-segmentation, advanced security and threat prevention services can be deployed wherever they are needed in the virtualized data center environment.

InformationWeek Reveals Top 125 Vendors

Five-part series details companies to watch across five essential technology sectors

SAN FRANCISCO, Sept. 27, 2016 /PRNewswire/ — InformationWeek released its list of “125 Vendors to Watch” in 2017. Selected by InformationWeek’s expert editorial team, the companies listed fall into one of five key themes: infrastructure, security, cloud, data management and DevOps.

“The rapid pace of technological change puts more pressure on IT organizations than ever before, but also offers unprecedented opportunities for companies to rethink how they do business,” said Susan Fogarty, Director of Content, InformationWeek & Interop ITX. “We are pleased to recognize technology suppliers that are helping our readers to navigate the possibilities.”

The technology industry is in a state of constant transition and evolution. In turn, new benchmarks are emerging as a fresh class of innovative tools, disruptive technologies and methodologies, and professionals breaks into the space. To meet these expectations, technology vendors are hard at work adapting to provide the enterprise with the most innovative and effective systems and products.

Across the wide spectrum of sectors that the tech industry touches, there has been a surge of innovation within a few key areas: infrastructure, security, cloud, data management and DevOps. To help professionals navigate where they should be looking for the latest and greatest technologies within these growing sectors, InformationWeek has compiled a list of top 25 companies per theme. The InformationWeek editorial team has detailed their selections for 2017 in a five-part blog series.

InformationWeek’s Top 125 Technology Vendors to Watch

Infrastructure: Businesses are rethinking their IT infrastructures and vendors are looking to software and open source solutions to help them reform. Here are the top companies providing those solutions.

A10 Networks

Barefoot Networks

Big Switch Networks

Cambium Networks

Cisco

CloudGenix

CoreOS

Cumulus Networks

Docker

ExtraHop

Mist

Nimble Storage

Nutanix

Pluribus Networks

Pure Storage

SimpliVity

NetApp SolidFire

Rubrik

StorageOS

SwiftStack

Teridion

Veeam Software

VeloCloud

Viptela

VMware

Security: A wave of companies is entering the security field, armed with technologies to help businesses mitigate the next generation of cyberattacks. Here are some of the most innovative security companies to watch.

Accenture/FusionX

Bay Dynamics

CloudFlare

CrowdStrike

Cymmetria

Deep Instinct

FireEye/Mandiant

IBM and Watson

Intel/McAfee

IOActive

Kaspersky

Lookout

Nok Nok Labs

Okta

Onapsis

Optiv

Palo Alto Networks

Rapid7

RSA

Splunk

Symantec/Blue Coat

Vectra Networks

Veracode

White Ops

Zscaler

Cloud: The need to handle big data and real-time events, alongside the ability to respond to business demands, has driven a shift toward cloud computing among companies seeking to remain competitive in the information age. Here are some of the companies staying ahead of the curve.

Alibaba Cloud

Amazon Web Services

Apptio

Bluelock

CloudHealth Technologies

CenturyLink

CGI IaaS

CSC Agility Platform

DigitalOcean

Dimension Data

EMC Virtustream

Fujitsu K5

GoGrid

Google Cloud Platform

IBM Cloud

Joyent Cloud

Kaavo

Microsoft Azure

New Relic

Oracle Cloud

Pantheon

Rackspace

SAP Hana Cloud Platform

Verizon Cloud

VMware

Data Management: Data is critical to get a clear picture of customers, products, and more, but in order to do so, that data must be managed across multiple systems — systems that aren’t necessarily compatible. Here are the companies that can help enterprise organizations wrangle their multiple data sources.

Alation

Ataccama

AtScale

Cloudera

Collibra

Confluent

Databricks

Dell Boomi

Hortonworks

Informatica

Information Builders

Looker

MapR

MarkLogic

MongoDB

Orchestra Networks

Profisee

Reltio

SAP

SAS

SoftwareAG

Talend

Teradata

TIBCO Software

Verato

DevOps: Organizations looking to adopt DevOps need a solid plan, complete with executive buy-in, and the right tools to get all the jobs done. Here are InformationWeek's picks for companies offering products organizations should know about when making the move to DevOps.

Atlassian

Canonical (Ubuntu Juju)

Chef

CFEngine

Electric Cloud

Google (Cloud Platform)

HashiCorp

Inedo

Jenkins

Kony

Loggly

Microsoft (Visual Studio)

Nagios

New Relic

Octopus Deploy

Path Solutions

Puppet

RabbitMQ

Red Hat (Ansible)

SaltStack

Splunk

Tripwire

UpGuard

UrbanCode (IBM)

Xamarin

Interop ITX 2017

The same core industry themes highlighted by InformationWeek will be incorporated into Interop's revamped conference, Interop ITX. In an industry where change outpaces most others, the next phase of tech education is Interop ITX, a conference that anticipates the X factor: anyone or anything that can impact your business, your customers, or your market. The conference incorporates an educational program built by Interop's trusted community of technology professionals. To learn more about Interop ITX and to register, please visit: reg.interop.com/ITX

InformationWeek

For more than 30 years, InformationWeek has provided millions of IT executives worldwide with the insight and perspective they need to leverage the business value of technology. InformationWeek provides CIOs and IT executives with commentary, analysis and research through its thriving online community, digital issues, webcasts, virtual events, proprietary research and live, in-person events. InformationWeek’s award-winning editorial coverage can be found at www.informationweek.com. InformationWeek is organized by UBM Americas, a part of UBM plc (UBM.L), an Events First marketing and communications services business. For more information, visit ubmamericas.com.

UBM Americas

UBM Americas, a part of UBM plc, delivers events and marketing services in the fashion, technology, licensing, advanced manufacturing, automotive and powersports, healthcare, veterinary and pharmaceutical industries, among others. Through a range of aligned interactive environments, both physical and digital, UBM Americas increases business effectiveness for customers and audiences through meaningful experiences, knowledge and connections. The division also includes UBM Brazil's market leading events in construction, cargo transportation, logistics & international trade, and agricultural production; and UBM Mexico's construction, advanced manufacturing and hospitality services shows. For more information, visit: www.ubmamericas.com.

Unusual Clandestine Cloud Data Centre Service Locations Everyone knows what the cloud is, but does everybody know where the cloud is? We try to answer that as we look at some of the most unusual data centre locations in the world. Under the Eyes of a Deity Deep beneath the famous Uspenski Cathedral in the…

Big Data Future Today’s organizations should become more collaborative, virtual, adaptive, and agile in order to be successful in a complex business world. They should be able to respond to changes and market needs. Many organizations have found that the valuable data they possess, and how they use it, can set them apart from others. In fact,…

The Big Data Movement In recent years, Big Data and Cloud relations have been growing steadily. And while there have been many questions raised around how best to use the information being gathered, there is no question that there is a real future between the two. The growing importance of Big Data Scientists and the…

The Financial Services Cloud Fintech investment has been seeing consistent growth in 2015, with some large moves being made this year. The infographic (Courtesy of Venturescanner) below shows the top Fintech investors and the number of companies they’re currently funding: Just this week, a financial data startup known as Orchard Platform raised $30 million in…

Cloud Computing and SMBs Cloud Computing is the hottest issue among IT intellects of Small and Medium Businesses (SMBs). Like any other computer-oriented technology, Cloud Computing has some misconceptions and myths that often kick-start arguments between two opposing groups: Cloud Supporters and Cloud Opponents. Both of these groups have their own ideology and reasons…

Protecting Your Web Applications It’s no secret that organizations are embracing the cloud and all the benefits that it entails. Whether it’s cost savings, increased flexibility or enhanced productivity – businesses around the world are leveraging the cloud to scale their business and better serve their customers. They are using a variety of cloud solutions…

Most Active Internet Of Things Investors A recent BI Intelligence report claimed that the Internet of Things (IoT) is on its way to becoming the largest device market in the world. Quite naturally, such exponential growth of the IoT market has prompted a number of high-profile corporate investors and smart money VCs to bet highly…

Compelling IoT Industries Every year, more and more media organizations race to predict the trends that will come to shape the online landscape over the next twelve months. Many of these are wild and outlandish and should be consumed with a pinch of salt, yet others stand out for their sober and well-researched judgements. Online…

Evolving Internet of Things The Internet of Things, or IoT, a term devised in 1999 by British entrepreneur Kevin Ashton, represents the connection of physical devices, systems and services via the internet, and Gartner and Lucas Blake’s new infographic (below) explores the evolution of the IoT industry, investigating its potential impact across just about every…

5 Essential Cloud Skills Cloud technology has completely changed the infrastructure and internal landscape of both small businesses and large corporations alike. No professionals in any industry understand this better than IT pros. In a cutthroat field like IT, candidates have to be multi-faceted and well-versed in the cloud universe. Employers want to know that…

Cloud Native Trends Once upon a time, only a select few companies like Google and Salesforce possessed the knowledge and expertise to operate efficient cloud infrastructure and applications. Organizations patronizing those companies benefitted with apps that offered new benefits in flexibility, scalability and cost effectiveness. These days, the sharp division between cloud and on-premises infrastructure…

Box.net, Amazon Cloud Drive The online (or cloud) storage business has always been a really interesting industry. When we started Box in 2005, it was a somewhat untouchable category of technology, perceived to be a commodity service with low margins and little consumer willingness to pay. All three of these factors remain today, but with…

The Rise of BI Data Every few years, a new concept or technological development is introduced that drastically improves the business world as a whole. In 1983, the first commercially handheld mobile phone debuted and provided workers with an unprecedented amount of availability, leading to more productivity and profits. More recently, the Cloud has taken…

Secure Third Party Access Still Not An IT Priority Research has revealed that third parties cause 63 percent of all data breaches. From HVAC contractors, to IT consultants, to supply chain analysts and beyond, the threats posed by third parties are real and growing. Deloitte, in its Global Survey 2016 of third party risk, reported…

Cancer Moonshot In his final State of the Union address in January 2016, President Obama announced a new American “moonshot” effort: finding a cure for cancer. The term “moonshot” comes from one of America’s greatest achievements, the moon landing. If the scientific community can achieve that kind of feat, then surely it can rally around…

Data Insecurity In The Cloud Today’s escalating attacks, vulnerabilities, breaches, and losses have cut deeply across organizations and captured the attention of regulators, investors and, most importantly, customers. In many cases such incidents have completely eroded customer trust in a company, its services and its employees. The challenge of ensuring data security is far more…

Success for Today’s CMOs Being a CMO is an exhilarating experience – it’s a lot like running a triathlon and then following it with a base jump. Not only do you play an active role in building a company and brand, but the decisions you make have direct impact on the company’s business outcomes for…

Hyperconverged Infrastructure In this article, we’ll explore three challenges that are associated with network deployment in a hyperconverged private cloud environment, and then we’ll consider several methods to overcome those challenges. The Main Challenge: Bring Your Own (Physical) Network Some of the main challenges of deploying a hyperconverged infrastructure software solution in a data center are the diverse physical…

How To Humanize Your Data The modern enterprise is digital. It relies on accurate and timely data to support the information and process needs of its workforce and its customers. However, data suffers from a likability crisis. It’s as essential to us as oxygen, but because we don’t see it, we take it for granted.…