Tag Archives: in-memory computing

One of the most frequent topics of conversation among government IT leaders today is how to manage legacy systems at a time when the pressure to modernize is at an all-time high but budgets remain tight. But what exactly is a legacy system, and why do these systems create such problems in the federal government?

Last month, some of the brightest minds in the IT business got together at the Adobe ColdFusion Summit to exchange ideas and best practices on how to successfully deliver web applications to market. One of the defining themes of this year’s summit was how best to address speed and scalability – the two most common complaints of both web application developers and end users.

There are plenty of challenges facing the federal government’s exploitation of Big Data – from the quality and reliability of the data itself to the many disparate systems on which information is stored. But these challenges are common to most federal agencies, not unique to any one of them.

Data is data: at its most basic level, it is merely bits stored on a machine. Yet we grow more connected every day, with 40% of the world’s population online and 1.75 billion of us using smartphones daily. Organizations in the business world and the public sector alike are striving to determine how best to use Big Data – and, by extension, Big Data analytics – to improve efficiency and reduce costs.

While 2015 has been the year of cloud, it has also been the year of High Performance Computing (HPC). As more and more agencies embrace the cloud, they also have to consider how best to leverage and, in some cases, accelerate accompanying compute resources.

When most people hear the term “High-Performance Computing,” or HPC, they immediately think of well-funded laboratories, CERN’s Large Hadron Collider facility, and global institutions routinely processing terabytes, petabytes, or even exabytes of data for specific scientific and research missions. While this kind of specialized supercomputing continues to evolve, many more organizations have demanding and growing requirements for larger-scale data processing – without the benefit of supercomputing capabilities.

In the last days of July 2015, the White House released a new Executive Order (EO) focused on creating a National Strategic Computing Initiative (NSCI). The purpose of this EO is to establish a “cohesive, strategic effort within the Federal Government and a close collaboration between the public and private sectors” that will “maximize the benefits of high-performance computing (HPC) research, development, and deployment.”

One question that should always be top of mind for government IT leaders is this: what happens when the lights go out? While it might not always be a power outage that causes systems to go dark, ensuring continuity of operations during periods of stress is critical for every government agency investing in in-memory technologies.

Big data was big news in 2014, and for good reason – data is at the heart of every organization. But data that’s locked away, or hard for your users to access where and when they need it, doesn’t help anyone. In-memory computing is a transformative technology that keeps frequently used data in RAM rather than on disk, dramatically scaling your current application environment so that users can access what they need, from multiple apps, as quickly as possible. In-memory computing is already becoming a core component of high-performing applications within government, and with great results. Best of all, it delivers this massive scale while supercharging the speed of your enterprise applications, so you can tackle big data quickly.
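The core idea behind in-memory computing – serving hot data from RAM instead of making a round trip to disk or a remote database – can be sketched in a few lines. The toy cache below is illustrative only (the class name, keys, and TTL behavior are assumptions for the example, not any specific product’s API); real in-memory data grids layer partitioning, replication, and durability on top of this same basic pattern.

```python
import time


class InMemoryCache:
    """A toy in-memory key-value store with optional expiry.

    Real in-memory computing platforms distribute this idea across
    many nodes' RAM; this sketch shows only the single-process core.
    """

    def __init__(self):
        # key -> (value, expires_at or None)
        self._store = {}

    def put(self, key, value, ttl_seconds=None):
        expires_at = time.time() + ttl_seconds if ttl_seconds else None
        self._store[key] = (value, expires_at)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if expires_at is not None and time.time() > expires_at:
            del self._store[key]  # lazily evict the expired entry
            return default
        return value


# Hypothetical usage: keep a recently fetched record in RAM for a minute
cache = InMemoryCache()
cache.put("record:42", {"status": "active"}, ttl_seconds=60)
print(cache.get("record:42"))  # served from memory, no disk round trip
```

The design choice worth noting is lazy eviction: expired entries are removed on read rather than by a background sweeper, which keeps the sketch simple at the cost of letting stale entries linger until the next lookup.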