The Software Matters in Open Networking

Open networking offers the ability to break free of vendor lock-in and achieve white box economics. SDN promises automation, reduced OpEx and increased agility. However, the open networking and SDN landscape can be confusing, and migrating away from legacy infrastructure can seem fraught with peril. Is the journey worth it?

Join Storage Switzerland and Pluribus Networks as we answer the following questions:
• What are the benefits of open networking and SDN in the data center?
• How can I safely migrate to a disaggregated white box architecture when I have incumbent vendors deployed throughout my network?
• How do I deploy SDN in my data center and do I need a full hardware refresh to do it?

Register before November 9th and receive an exclusive copy of Storage Switzerland's newest eBook "Virtualizing and Automating the Network - A Guide to White Box Switches, Network Operating Systems and Software-Defined Networking."

Many businesses are eager to initiate a cloud migration strategy and take advantage of the economic benefits that clouds offer. However, concerns regarding vendor lock-in, proprietary technology, and hidden costs have served as a deterrent for a large number of IT managers. The answer for a growing number of businesses is a multi-cloud strategy where they can create a cloud solution that is tailored to their specific needs. Giving enterprises the ability to choose the compute, storage, and SaaS services that best fit their business requirements and budget puts them in the driver's seat and gives them full control.

Join Storage Switzerland and Wasabi as we discuss the challenges of soup-to-nuts clouds and why a best-of-breed multi-cloud solution is a better strategy for most organizations.

For over a decade, Network Attached Storage (NAS) was the go-to file storage device for organizations needing to store large amounts of unstructured data. But unstructured data is changing. While large file use cases are still prevalent, small file use cases are becoming more dominant. Workloads like artificial intelligence, analytics and IoT are typically driven by millions, if not billions, of small files.

Object Storage is often hailed as the heir apparent but is it? Can file systems be redesigned to continue to support traditional NAS workloads while also supporting modern, small file and high velocity workloads? Join Storage Switzerland and Qumulo for our live webinar, “NAS vs. Object — Can NAS Make a Comeback,” to learn the state of unstructured data storage and if NAS file systems can provide a superior alternative to object storage.

Join us for our live event to learn:
• Why traditional NAS solutions fall short
• Why object storage systems haven't replaced NAS
• How to bridge the gap by modernizing NAS file systems
• Live Q&A with file system and NAS experts

To some, tape storage may seem like an outdated technology in the era of NAS and object-based storage. But here's a surprise: tape today is more relevant than ever. Even the most modern data centers can benefit from its low cost of ownership, scalability, reliability and security. In our next live webinar, Storage Switzerland is joined by Spectra Logic, Fujifilm and Iron Mountain to discuss why tape use shouldn't just continue but actually expand, including in hybrid cloud environments.

Many IT professionals are familiar with tape and have leveraged the numerous advantages the technology has to offer. But for those less familiar or new to tape, the webinar will start with:
• Tape Media and Tape Library Basics
o How does tape work?
o How is it different than other storage technologies?

Data analytics and business intelligence (BI) initiatives have become mission-critical, but today, it is simply taking too long to arrive at insights. 90% of the time that it takes to generate insights lies in human labor, and that labor is significantly tied to data preparation. Join subject matter experts from Storage Switzerland and Promethium for an in-depth look at what’s wrong with data analytics today, and how a self-service data analytics and governance strategy can empower organizations to avoid spending months searching for, collecting and preparing data.

Commvault has acquired Hedvig. What are the ramifications for the combined companies and the storage industry? Find out live on Thursday September 5th at 11:30am ET / 8:30am PT when we have Sanjay Mirchandani, CEO of Commvault and Avinash Lakshman, CEO of Hedvig in a special Live Edition of our Meet The CEO Webinar series.

Join us to learn a little bit more about the CEOs behind this acquisition, what motivated the two companies to come together and what role Hedvig's scale-out Software-Defined Storage plays in Commvault's future. Most importantly, ask questions directly of the CEOs.

Hyperscale applications like Elastic, Hadoop, Kafka and Cassandra typically use a shared-nothing design where each node in the compute cluster operates on its own data. To maximize storage IO performance, hyperscale architectures keep data local to the compute node processing the job. The problem is that the organization loses the efficiency of shared storage infrastructure. As the hyperscale architecture scales, overprovisioned and underutilized compute, GPU and storage resources cost the organization money.

Join Storage Switzerland and DriveScale for our 20 minute webinar to learn how composable infrastructure can provide both high performance and high efficiency.

Backup and Recovery is as old as the data center itself, but organizations still struggle with the process. IT professionals have tried many solutions to solve their data protection challenges, and the market is full of options, but none seem to address the core issues. Launched only two years ago, HYCU, Inc. took a different approach by building backup solutions custom-built for each environment they protect.

Join HYCU, Inc.'s CEO, Simon Taylor, as he sits down with Storage Switzerland Lead Analyst, George Crump, to discuss HYCU's unique approach to the fiercely competitive backup market.

Workloads like artificial intelligence (AI), machine learning (ML), big data analytics, the Internet of Things (IoT) and data warehousing need memory-level performance from storage. Charles Fan, CEO of MemVerge, has a vision for making this possible without disrupting the application programming model or the storage software. Join us as Fan discusses his company's invention of "Memory-Converged Infrastructure (MCI)."

Attend for additional discussion, including learning:
• The problems inherent in using dedicated clusters to serve these specific applications
• Why it is so challenging for enterprises to enable modern workloads to run at memory speed with technologies like Intel's Optane App Direct Mode
• How to unlock the raw performance potential of Intel 3D XPoint Persistent Memory with the efficiencies of pooled storage, without changing any code

Vendors are re-writing software and adding custom hardware in an attempt to avoid bottlenecking extremely low-latency NVMe Flash and Storage Class Memory technologies. Eventually, they are all at the mercy of physics. Data has to traverse an internal, and sometimes an external, network so the computing tier can process it. Computational Storage offers an alternative by performing at least some of the processing on the storage device itself, eliminating most of the network activity.

Join Storage Switzerland's Lead Analyst, George Crump, as he leads a panel of computational storage experts, including NGD Systems' Scott Shadley, ScaleFlux's Thad Omura, and Samsung's Pankaj Mehra, as they preview the Computational Storage Workshop at the Flash Memory Summit on Thursday, August 8th.

Join the panel to learn:
• What is Computational Storage?
• Why does it matter?
• What are the use cases and success stories?

Qumulo’s CEO, Bill Richter, will share his perspective on how file data sharing and retention requirements are changing and why a new file storage architecture is required in a live discussion with Krista Macomber, Senior Analyst for Storage Switzerland. Be sure to register so you don’t miss this chance to learn how to revamp your file system architecture to obtain visibility and scalability and meet your performance requirements, while at the same time staying within your budget.

Join us for a unique opportunity to hear ClearSky Data’s CEO, Ellen Rubin, share her perspective on how combining the cloud and the edge can help IT professionals to mitigate – or to eradicate entirely – physical storage management duties. Don’t miss this opportunity to learn how to re-think your production storage strategy to embrace storage-as-a-service in a way that frees yourself from the hassle of managing on-premises equipment, from primary storage to backup and disaster recovery.

Organizations need to make sure their backup infrastructure can meet the recovery objectives of today’s biggest challenges: ransomware, rapid restoration and disasters. Businesses expect their IT teams to recover faster than ever and they need business performance to be the same during the recovery effort as it is during normal production. Meeting these new challenges and expectations requires IT to rethink the backup process.

Ransomware requires rapid, frequent backups of all types of data and that the backup software also protects itself. Rapid recovery is more than just instantiating a VM from backup storage. It requires understanding the impact on performance, as well as planning a path for migration back to production storage. Finally, there is the never-ending threat of site-wide disasters. Thanks to cyber-attacks, geographically safe areas don’t exist. All data centers must prepare for a site failure and for a rapid disaster recovery without impacting application performance.

In our live webinar Storage Switzerland and StorageCraft will discuss why these are the top challenges facing the data protection architecture, why current data protection infrastructure won’t meet these challenges, and how to overcome them.

As businesses migrate more of their data and applications to the cloud, a more comprehensive and mature disaster recovery implementation is required. Cloud service providers have built a base layer of data protection capabilities that focus primarily on enabling recovery from hardware failure, but the reality is that most recovery requests today stem from a cyberattack or from human error. As a result, cloud services need some help in the form of intelligent and granular search and tagging, as well as management capabilities such as the ability to schedule snapshots and apply retention policies.

Cloud Daddy’s CEO, Spencer Kupferman, will join Storage Switzerland for a live webinar discussion regarding how the trifecta of increasing and more sophisticated malware attacks, regulatory oversight of data, and the shift to cloud are changing the face of the disaster recovery market. Register now to be sure you hear his perspective on why capabilities like simplified and centralized oversight of backup and replication jobs and cross-region backup and restore matter in today’s data protection paradigm.

Given the emergence of more sophisticated malware, backups can no longer be assumed to be safe. These new variants are designed to be discreet, often sitting idle and being copied across the backup repository before attacking slowly, all with the ultimate goal of compromising as many systems as possible while avoiding detection for as long as possible. To avoid restoring trigger files, IT must be able to proactively verify and monitor the quality and viability of backups. The problem is that legacy verification methods cannot keep up with the growing volume of backup data that must be verified or the frequency with which verification must occur.

Lynn LeBlanc, CEO of HotLink, will join Storage Switzerland to share her perspective on how IT professionals may restructure their approach to backup verification so that all recovery points may be vetted, and malware attacks contained. Don’t miss this opportunity to hear her perspective on the new pain points related to backup verification that are emerging with modern ransomware, and how IT professionals may apply analytics and artificial intelligence for more efficient and thorough verification.

Organizations, application owners and users all have much higher expectations of IT than ever before. They expect IT to recover real-time data instantaneously and recall aged data very quickly. These expectations mean that backup architectures are getting stretched at both ends: rapid primary storage recovery and fast access to data residing on long-term storage repositories.

Join Storage Switzerland, Veeam and NetApp for our webinar as we discuss the new service level objectives IT needs to prepare for and how the backup software and hardware architecture needs to change to meet these new demands.

Join our panel of data protection experts as we discuss:
• The new service level objectives
• Why current data protection infrastructures fall short
• The three steps you need to take to modernize your data protection architecture

Against the backdrop of Moore's Law and ultra-low latency non-volatile memory express (NVMe) solid state drives, network speed and total capacity are increasingly the levers that make or break an application's performance. Fibre Channel (FC) storage networking may frequently be pronounced dead, but in fact it is alive and growing because of the consistently low latency and high bandwidth that it brings to growing NVMe-oF implementations and workloads such as high-velocity analytics, artificial intelligence and private cloud workload hosting. President of the FC Industry Association (FCIA), Mark Jones, joins Storage Switzerland Senior Analyst Krista Macomber live to share his take on what is driving continued adoption of FC, and where the technology will fit in moving forward.

Kaycee Lai and Promethium have a unique vantage point into the key bottleneck that is greatly slowing down the time it takes to arrive at data-driven business intelligence: data qualification. You won’t want to miss this opportunity to learn more about why getting to the right data is so laborious, how it is not only slowing but also jeopardizing the quality of your business intelligence initiatives, and why stringing together multiple data querying tools is not an effective answer.

Enterprises are dealing with a constant tide of copy data sprawl. More copies are being created to serve secondary business processes like analytics and test and development, frequently with limited oversight or governance from IT. Meanwhile, regulations like the European Union's (EU's) General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) require more data to be stored for longer periods of time. Copy data management (CDM) solutions can help to contain this sprawl, but they must be agile in order to effectively support business operations. Also, in order to truly minimize infrastructure-related costs, they should have the flexibility to be procured through a software-as-a-service (SaaS) model.

Ash Ashutosh, CEO of CDM pioneer Actifio, will join Storage Switzerland to share his perspective on how customer pain points related to managing copy data sprawl are evolving, and how his company is pivoting in line with those requirements. Be sure to join and hear his thoughts on changing business models, and how CDM can not only support more efficient and effective business collaboration, but also protect against the threat of ransomware and support compliance requirements.

Tune into Storage Switzerland's channel to learn from this analyst firm focused on storage, virtualization and the cloud. Storage Switzerland’s goal is to provide unbiased evaluations and interview content on sponsoring and non-sponsoring companies through articles, public events and product reviews.