Looking Up: 4 Technical DevOps Predictions for 2020

By Kit Merker

December 21, 2018


Very soon, we won’t have to look to the heavens to predict our fortunes. By 2020, says a study by IDC, there will be as many bits in the world’s accumulated data and binaries as there are stars in the universe. It will be these 44 zettabytes (44 trillion gigabytes), doubling every two years, that will direct our future.

No question about it, DevOps has won the race. By the end of 2017, a survey by Forrester Research revealed that 90 percent of responding organizations had either already implemented or had plans to implement DevOps.

As a result, the automation and repeatability of CI/CD have helped deliver agility and faster time to market in the process of building and deploying applications.

Now that every company is a software company, these core trends fuel our journey to a world of Liquid Software, where the automation of continuous integration and continuous deployment gives way to true continuous updates.

So how will the stars — or better yet, the bits — align for future DevOps? Here are some of the things we’re not just expecting to see, but in many cases seeing already:

Trend #1: DevOps AI

DevOps automation produces more builds, more frequently, and with them a lot of information. That’s the perfect environment for AI and machine learning to play a growing role. The more data produced, the better these analytical technologies can detect patterns, make predictions and act on them to keep the CI/CD pipeline running smoothly.

Manual testing may someday become a thing of the past as AI takes over to predict how code will behave based on activity and repository logs. From this accumulation of information, AI will be able to produce automated functional, acceptance, and deployment testing, speeding software releases even as it helps assure reliability for continuous delivery.
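The pattern-detection idea above can be sketched in miniature: even a simple statistical model over historical build data can surface anomalies worth investigating. This is only an illustrative toy, assuming build durations mined from CI logs; a real system would use richer features (commit diffs, test history) and an actual machine-learning model.

```python
# Toy sketch: flag anomalous builds from historical CI log data using a
# simple mean/standard-deviation rule. Illustrative only; not a real
# AI-driven testing system.
from statistics import mean, stdev

def flag_anomalous_builds(durations, threshold=2.0):
    """Return indices of builds whose duration deviates more than
    `threshold` standard deviations from the historical mean."""
    mu, sigma = mean(durations), stdev(durations)
    return [i for i, d in enumerate(durations)
            if sigma > 0 and abs(d - mu) > threshold * sigma]

# Nine typical build durations (seconds) and one outlier.
history = [61, 59, 63, 60, 58, 62, 60, 61, 59, 180]
print(flag_anomalous_builds(history))  # [9] -- the 180s build stands out
```

Feed such a model enough pipeline data and it can begin to predict trouble before a build breaks, which is the core of the trend.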

The growing use of Infrastructure as Code means that operations managers have greater flexibility in allocating server environments and clusters, just by changing code descriptions. But people don’t have to write that code; AI can analyze large bodies of log reports to make informed guesses about what will be needed, and generate that code through automated services.
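As a hedged illustration of that idea, the sketch below derives a cluster size from a traffic forecast and emits a small declarative spec. The spec format, the `NodePool` kind, and the capacity and headroom numbers are all hypothetical, stand-ins for whatever an AI-assisted IaC service would actually generate.

```python
# Hypothetical sketch: turn a traffic forecast (e.g. mined from access
# logs) into a declarative infrastructure spec. Numbers and spec format
# are invented for illustration.
def suggest_cluster_spec(requests_per_min, capacity_per_node=1000):
    """Size a node pool from peak forecast traffic with 50% headroom."""
    peak = max(requests_per_min)
    nodes = -(-int(peak * 1.5) // capacity_per_node)  # ceiling division
    return {"kind": "NodePool", "replicas": max(nodes, 2)}

forecast = [1200, 3400, 2800, 5100]  # requests per minute, per period
print(suggest_cluster_spec(forecast))  # {'kind': 'NodePool', 'replicas': 8}
```

In a real deployment, the generated spec would be committed like any other Infrastructure as Code, so the usual review and rollback machinery still applies.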

Most significantly, AI can produce intelligence about your DevOps process itself from the volumes of logs and performance flow data. AI will help drive continuous improvement of CI/CD pipelines, avoiding mistakes that break builds and even predicting faults. AI may someday even be used to improve the AI software itself.

Trend #2: 5G Drives IoT

With the far greater speeds of 5G internet, scheduled to come on the market in 2020, the biggest obstacle to large-scale connected devices will fall. In a recent white paper, Huawei Technologies projects that 5G with 10Gb/s speeds could support a 1,000x gain in capacity, with connections for at least 100 billion devices. In essence, this means network capacity and connectivity may cease to be a practical constraint.

That means that larger binaries supporting more complex operations can be delivered to the connected cars, homes, and cities that will make up the Internet of Things.

As greater network speed encourages more devices to come online, it will also encourage a greater frequency of updates, bringing us ever nearer to the Liquid Software ideal where IoT will thrive.

Trend #3: Low-code DevOps

Even as today’s DevOps automation streamlines the process of CI/CD, it still requires developers to define most of the pipeline through job specifications, YAML files, or other intensive activities that are, in effect, coding by hand.

As the need for speed in creating DevOps builds grows, we expect DevOps platforms to integrate low-code tools that help define pipelines through easier-to-use, point-and-click UIs. The popular IFTTT (If This Then That) platform for connecting apps and devices suggests one model, or graphical programming UIs like node graphs might fulfill the need.
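Stripped to its essentials, the IFTTT-style approach replaces hand-written job scripts with declarative "when X, do Y" rules that a UI could assemble. The event and action names below are invented purely for illustration:

```python
# Toy sketch of IFTTT-style pipeline rules: declarative triggers instead
# of hand-coded job specifications. Event/action names are hypothetical.
rules = [
    {"when": "commit_pushed", "then": "run_build"},
    {"when": "build_passed",  "then": "run_tests"},
    {"when": "tests_passed",  "then": "deploy_staging"},
]

def next_actions(event, rules):
    """Look up which pipeline actions a given event should trigger."""
    return [r["then"] for r in rules if r["when"] == event]

print(next_actions("build_passed", rules))  # ['run_tests']
```

A point-and-click editor would simply build and edit this rule table, letting non-developers wire up a working pipeline.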

With these tools, a much wider range of technologists can contribute to the creation and maintenance of pipelines, policies, and Helm charts. Enterprises will benefit from lower costs of setup and training to build these vital parts of the DevOps process.

Trend #4: DataOps

The rise of data-driven computing and the democratization of analytics have spurred calls for an agile methodology to develop and deploy these data-intensive applications. The critical data that these solutions rely on can come from multiple data sources and data pipelines, so a process of managing them reliably is needed.

The emerging methods of DataOps draw directly from the key principles of DevOps — automation to help distributed teams support frequent and continuous integration and delivery. In the same way that DevOps helps developers, quality assurance, and operations to smoothly and securely collaborate, DataOps provides the same benefits to the joint efforts of developers, data scientists, data engineers, and operations.

Supporting DataOps requires some changes in the infrastructure and platforms, even as many of the core tools remain the same. To start, more languages and frameworks, such as the statistical language R, must be embraced by DataOps, and processes must enforce strict data access and governance policies.
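A governance gate of the kind just described can be sketched very simply: before a pipeline step reads a dataset, check the caller's role against a policy table. The roles, dataset names, and policy structure here are illustrative only; real systems would back this with a proper access-control service.

```python
# Minimal sketch of a DataOps governance check. Roles, datasets, and
# the policy structure are invented for illustration.
policy = {
    "customer_pii": {"data_engineer"},
    "clickstream":  {"data_engineer", "data_scientist"},
}

def can_read(role, dataset, policy):
    """Return True only if the policy grants `role` access to `dataset`."""
    return role in policy.get(dataset, set())

print(can_read("data_scientist", "clickstream", policy))   # True
print(can_read("data_scientist", "customer_pii", policy))  # False
```

Enforcing such a check inside every pipeline step, rather than at the edges, is what makes data governance a first-class part of the DataOps process.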

As data becomes the chief driver of more applications, we can expect the range of data marketplaces to grow, including Data-as-a-Service.

All four of these predicted trends suggest some exciting times are ahead in the DevOps universe. With JFrog’s Liquid Software vision and industry-leading DevOps tools, we look forward to helping you meet your goals in 2019, 2020 and beyond.