While IoT as a buzzword has only really come to the forefront over the last two to three years, the term was coined nearly two decades ago by Kevin Ashton, a British technology pioneer and creator of the global standard system for RFID.

Ashton made this statement back in 1999, and I think it’s even more relevant today than it was then:

“Today computers—and, therefore, the internet—are almost wholly dependent on human beings for information. Nearly all of the roughly 50 petabytes (a petabyte is 1,024 terabytes) of data available on the internet were first captured and created by human beings by typing, pressing a record button, taking a digital picture or scanning a bar code.

The problem is, people have limited time, attention and accuracy—all of which means they are not very good at capturing data about things in the real world. If we had computers that knew everything there was to know about things—using data they gathered without any help from us—we would be able to track and count everything and greatly reduce waste, loss, and cost. We would know when things needed replacing, repairing or recalling and whether they were fresh or past their best.”

I like this quote because Ashton clearly articulates why businesses are investigating and deploying IoT solutions—to deliver value back to the business, and ultimately, to customers. IoT helps companies bring together business operations and IT to take unstructured operations data and turn it into analytical insights.

People hear IoT and they immediately think “sensors” and “cloud.” That’s not necessarily wrong, but there is so much more to it. IoT is not only about the sensors. It’s an architecture made up of technology from a multitude of vendors. When designing an IoT solution, it’s important to keep in mind two key questions: what data do you want to collect, and what do you want to do with that data once you’ve collected it?

It’s important to focus on the business outcome, not the technology. Instead of “sensors” and “cloud,” think in terms of words like “architecture” and “solution.” Consider the bigger picture throughout the entire design process.


By focusing on the architecture and not on the technology behind it, you get more choice and flexibility in building an IoT solution that’s right for your business.

So, how do you go about building this architecture?

First (and I know I just said not to think about these, but…), you’ll need a sensor. This could be anything, depending on the environment, the object it’s connected to, or the data being collected. Next, you need to get the data from the sensor or device to either an edge platform or your data platform. There is a wide variety of communications and transport protocol options, but I recommend getting up to speed on LoRaWAN, Sigfox, Zigbee, and ANT to start.
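To make the first of those key questions concrete, here’s a minimal sketch in Python (with entirely hypothetical field names; no standard schema is implied) of the kind of structured reading a sensor might emit before it is handed off to a transport such as LoRaWAN or Zigbee:

```python
import json
import time

def build_reading(sensor_id: str, metric: str, value: float, unit: str) -> str:
    """Package a single sensor measurement as a JSON payload.

    The field names here are illustrative, not a standard: the point is
    to decide up front what you want to collect (the 'what data' question)
    so every device reports it consistently.
    """
    payload = {
        "sensor_id": sensor_id,    # which device captured this
        "metric": metric,          # what was measured
        "value": value,
        "unit": unit,
        "timestamp": time.time(),  # when it was captured
    }
    return json.dumps(payload)

# Example: a temperature reading from a hypothetical warehouse sensor
print(build_reading("warehouse-07", "temperature", 21.4, "C"))
```

Whatever the eventual transport, settling on a consistent payload like this early makes the downstream gateway and platform choices far simpler.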

Once you have decided on a network, you need a gateway. A gateway provides multiple critical functions, including device connectivity, protocol translation, security, and update management, as well as edge data processing and filtering. Some newer gateways can also be set up as a platform for edge analytics, providing your businesses with real-time data streaming onsite before staging the results to your chosen data platform.

Then, you pick your platform. Whether you choose a public, hybrid, or private cloud solution, what’s important is that your data is always available. But it’s not only data availability that you have to take into consideration: there’s also compliance, security, performance, scalability, connectivity, elasticity, and integration.

Again, don’t focus on the technology. Think about what you’re going to do with your data. Are you looking to use the data collected from the IoT devices to provide business insights, predictive analytics, a machine learning platform, or even start venturing into the world of AI? Your platform choice will define your next steps. Take your time and explore your options.

This blog is part one of my new series on IoT. Be sure to check back in for my next piece, where I explore the bigger picture on how IoT can help you drive digital transformation for your business.

Understanding what you want to get out of your data is crucial, but in a world of ever-changing technologies, you need to be thinking about what’s next. What comes after IoT?

Over the past 12 months, I’ve heard the word “compliance” thrown around quite a bit. From ISO to the General Data Protection Regulation (GDPR), compliance is now at the forefront of many organisations’ strategic requirements and is recognised at board level, highlighted in many cases by the consequences (being fined) of failing to maintain compliance.

One thing to remember is that it’s not always an IT problem. I can’t count how many times I have walked into a meeting and been asked by a customer what they need to buy. Take GDPR, for example: out of 107 actions, only eight can be fixed by a purchasable IT solution. The rest is policy-driven, and this is where it gets complicated. Although compliance is not an IT issue, IT is often crucial to staying compliant: you need to make sure you have the technology in place to ensure you adhere to the policies you have set.

For this article, I am going to focus on one of the hot topics of conversation when it comes to compliance: the new European Union General Data Protection Regulation (GDPR). For many, this is a term that causes either confusion or panic. Please don’t panic, and don’t bury your head in the sand. Talk to the experts! I may not be a compliance expert, but over the last twelve months I have learned a lot from listening and talking to partners and customers about their experiences.

One of the big concerns I hear about repeatedly is: how good are my foundations? Where does your business stand today in relation to the new regulation? You must make sure you can clearly define and find the information you need to start: hardware inventory, current security vulnerabilities, firewall policy, and, most importantly, the classification of your data. It is fine to have all these tools to monitor and protect against security threats and data breaches, but if you don’t understand your data and how you use it, you will struggle to understand and meet the GDPR requirements. What I have also learned from industry experts is that it’s not just about knowing what data you have; ask yourself why you have this data, and understand your policies around it.

So, let’s take it back a step for anyone reading about GDPR for the first time.

The EU GDPR goes into effect May 25, 2018. It applies to all organisations processing the personal data of EU residents. The regulation introduces a new, enforced approach to how organisations handle data protection. The penalties for non-compliance with GDPR can be up to 20 million euros or four percent of a company’s annual turnover, and data subjects also gain the right to claim compensation against an organisation. But when talking with the ICO, one thing they suggested could be their biggest penalty isn’t money at all: stopping your ability to process data. How would your company cope with that?
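That penalty structure is worth pausing on: the fine is whichever is higher of 20 million euros or four percent of annual turnover. A quick sketch of the arithmetic:

```python
def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Upper bound of a GDPR fine: the higher of EUR 20 million
    or 4% of annual turnover (the regulation's headline figures)."""
    return max(20_000_000, 0.04 * annual_turnover_eur)

# A company turning over EUR 1bn faces up to EUR 40M, not EUR 20M
print(max_gdpr_fine(1_000_000_000))  # → 40000000.0
```

For smaller businesses the 20-million-euro floor dominates, which is exactly why the risk is hard to ignore at any size.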

It is important to remember that a data breach isn’t necessarily black and white. You could have all the security and encryption layers you want, but you may still be breached, whether by an external or an internal intrusion. What has become clear to me is that you need a clear audit trail of data throughout the business, from tracking user activity to change control activities and everything in between. This matters because GDPR requires you to declare any data breach to the ICO (in the UK) or your equivalent authority within 72 hours. Having an audit trail that proves you adhered to all your policies and procedures may help reduce any penalties imposed on your company.
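As a rough illustration only (real deployments would use a proper logging or SIEM platform, and these field names are my own invention), an audit trail boils down to timestamped records of who did what to which data:

```python
import json
from datetime import datetime, timezone

def audit_event(actor: str, action: str, resource: str) -> str:
    """Record one auditable action as a JSON line for an append-only log.

    Timestamped entries like this are what let you show, within the
    72-hour notification window, who touched what data and when.
    (Field names are illustrative, not a standard.)
    """
    entry = {
        "when": datetime.now(timezone.utc).isoformat(),
        "actor": actor,        # user or system performing the action
        "action": action,      # e.g. "read", "update", "export"
        "resource": resource,  # the data or system touched
    }
    return json.dumps(entry)

# One line per event; in practice these would be appended to tamper-evident storage
print(audit_event("jsmith", "export", "customer-db"))
```

The technology behind it matters less than the discipline: every system that touches personal data should produce records you can stitch into one coherent timeline.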

Let’s stop and think about the IT elements for a moment. It’s all well and good that you can provide the audit trail once you have been breached, but what elements do you need to think about when you’re trying to prevent a breach? It’s not as simple as just encrypting everything. You should make sure you keep your internal system up to date with the latest patches, so make sure you have a good patch manager in place to monitor servers, end-user devices, etc. One of the other elements you need to keep an eye on is your firewall management. Make sure that this is correctly patched, and, more importantly, that all policies are adhered to and implemented.

As I said at the beginning, I am not an expert on compliance, but these are thoughts and ideas I have picked up on over the past year. So, here’s my call to action for anyone reading this: Make sure you understand your data, and remember that the hard part isn’t becoming compliant; it’s the challenge of staying there.

If you’re interested in hearing from a data privacy expert, have a listen to the podcast below:

Yesterday was my last day working for Concorde Technology Group as their Group Technical Services Manager.

I joined Concorde in January from ArrowECS, taking on a new and exciting role with new challenges. As I developed into the role at Concorde and transitioned from working in distribution to the world of the reseller, it became apparent that some goals were actually harder to achieve than first thought. Whilst my time in the role was short, I can honestly say I take away some great skills and experience gained working for and with the team at Concorde. The world of the reseller is different to what I had experienced before: you have to be agile, fast, and very detail-oriented to compete in this space. With the whole IT industry moving at such a fast pace, it was agreed that my skills and talents should focus on a different direction than the one Concorde is on.

Change is good and healthy. There is no set recipe for life, and you should always embrace change: it’s a time to redeploy your skills and create new opportunities along the way. So, as I move into the next chapter of my career, I look back on the positives and negatives I have learned about myself in my time at the company, and on what I want to achieve next. I can honestly say the experience has been good for me, and I will use everything I have learned in my next chapter.

I have met a lot of great colleagues and customers along the way and wish them all the best and success for the future.

The summer is nearly over, and conference season is getting into full swing. Take it from a veteran: one conference you don’t want to miss is NetApp Insight. I’ve been to many conferences that cover lots of different technologies, but for me, Insight is different. It’s not just a technology event—it’s been instrumental in helping me develop my career.

I’ve attended Insight for the past seven years, and I’m not planning on ditching that trend any time soon! Here are my top three reasons for why I make Insight my conference priority every year.

Expand Your Knowledge

Since I started coming to Insight in 2010, technology has changed dramatically. But NetApp has never stood still. Every year I return to Insight to find out about all the new and exciting technologies that NetApp is developing and investing in. From the early days of Data ONTAP 7-Mode to the cloud and beyond, NetApp’s evolving portfolio never disappoints.

This year, I am really looking forward to hearing about developments in the cloud portfolio and the next-generation data centre. NetApp has taken big leaps forward over the last two years in developing their portfolio and taking a strong position in these markets, and I can’t wait to hear what else is on the horizon.

If you are attending Insight, make sure that you make it for the general sessions. I am really excited about this year’s guest speaker. As a bit of a space enthusiast, I can’t wait to hear from Adam Steltzner, an aerospace engineer for NASA’s Jet Propulsion Laboratory. Hearing about the technologies that helped make the Mars rover missions successful is something you won’t want to miss.

While you’re filling out your schedule, you might want to think about attending our NetApp A-Team panel session. It’s a great chance to hear from partners and customers about their experiences working with NetApp. And it’s hosted by yours truly…for what it’s worth.

Get Certified

Education is a huge part of why I attend Insight. NetApp University provides you with all the tools and resources you need to get certified whilst you’re there. You can take all the tests you want—for free—and if you don’t pass, you can retake them the next day. It’s a great opportunity to expand your professional skillset and to demonstrate to your employer that you’re an expert in your field (and that Insight is worth more to them than a week on the town with your industry mates).

Although trying to fit certifications into your already packed Insight agenda isn’t everyone’s priority, I urge you to make the time. Gaining certifications at Insight each year has enabled me to develop my career and increase the value that I provide to my customers and to the company that I work for.

Expand your Network

Everyone who knows me also knows that I am not one to shy away from a good networking event (aka the bar). Insight is a great way to meet people from across the industry, but it’s not all about the parties. These networking events really do make a difference.

You never know who you’re going to meet on the show floor. One minute it’s a guy you attended a training course with five years ago and you’re sneakily trying to read his badge to remember his name. Next, you meet the founder of NetApp strolling around and ask him for a quick selfie.

And, of course, you get to meet the NetApp A-Team at these events. I have been lucky enough to be a member of this group for the past two years, and I can say with confidence that if there’s a party happening, you can bet the A-Team is not far away. If you are in Vegas or Berlin (or both!) this year, I urge you to go grab one of us if you see us walking around the show floor, or attend the A-Team session and ask us why we advocate for NetApp.

So, if you’re asking yourself, “Is attending Insight worthwhile?” I hope I’ve helped you come to a conclusion. In my opinion, not only is it worthwhile, it’s an essential part of developing your IT career, both technically and professionally.

Every day, businesses are changing the way they deliver applications to their users. Applications traditionally delivered on-premises are increasingly consumed as a service from Software-as-a-Service (SaaS) providers such as Microsoft.

It is estimated that the use of SaaS software will grow at roughly five times the rate of on-premises solutions. Now, you’re probably thinking these are just marketing figures, but in my experience over the last year, the conversations I am having with businesses show real interest in, and a push towards, SaaS for both functionality and commercial reasons.

The most prominent of these SaaS solutions is Office 365. Businesses are deploying O365 to provide email and collaboration services to their users.

However, there is one worrying aspect of SaaS deployments that I have noticed: the risks that come with putting your data in the cloud.

We know public cloud providers have robust disaster recovery capabilities, with multiple datacentres and replication, but native backup is something some providers lack, and it is often assumed that it is included as part of the SaaS service you are paying for. It can come as a shock to find out that in some cases it isn’t, and the question that always follows is, “How can I back up my data?”

There are a number of tools on the market that can back up your email from Office 365, but what about your other applications? And what happens if you don’t have anywhere to back the data up to?

That’s where NetApp can help, with their new Cloud Control software.

Cloud Control provides businesses with the ability to back up and protect their cloud-based data. It gives businesses the unique tools to take the data they have within O365 and back it up to a secondary location.

One of the key things for me is that this provides a business with flexibility and choice. Cloud Control offers multiple deployment scenarios today:

Back up your Office 365 data to storage created and managed by Cloud Control as part of your solution. This target is AWS S3, providing cloud-to-cloud backup.

Bring your own license and back up your Office 365 data to your own AWS S3 storage, or use a StorageGRID Webscale solution as the backup target. This provides cloud-to-cloud backup, whether public or private.

Cloud Control is a full SaaS application: there are no agents to deploy, no software to install, and no infrastructure required, making it easy to deploy and manage.

But Cloud Control doesn’t just help you protect your data through backups, it also provides multiple layers of operational and physical security.

Strong encryption:
Cloud Control protects data at rest with 256-bit AES object-level encryption using a unique encryption key. All data in transit is protected with Secure Sockets Layer (SSL) encryption.

Controlled access:
Access to the production environment is granted only to a dedicated operations team with specific operational requirements. Changes to the production environment are tracked and audited.

In summary, to me Cloud Control provides a flexible, secure, efficient, and cost-effective solution for protecting your SaaS applications.

I have been asked in a number of meetings over the past few months, “What is GDPR?” and, in some cases, “What do I have to buy?”

But let’s get one thing straight from the start: GDPR is NOT an IT problem, and you can’t just buy something to make it go away. This is a common misconception, so I thought I would take the time to jot down what I have learned so far and see if it can help you.

The EU General Data Protection Regulation (GDPR) comes into force on 25th May 2018 and applies to all organisations processing the personal data of EU residents. The regulation introduces a new and enforced way for organisations to handle data protection. The penalties for non-compliance with GDPR can be up to 20 million euros or 4% of a company’s annual turnover, and in addition, data subjects gain the right to claim compensation against an organisation under GDPR.

It is important to understand your obligations and to start working towards your compliance requirements. Being ready by 25th May 2018 will be a major undertaking, but the risks of not being prepared for GDPR are too big to ignore.

What are the new requirements?

Privacy by Design – GDPR introduces formal principles of Privacy by Design into the regulation. These include reducing your data collection to what you actually require, limiting how long you retain that data, and gaining clear consent from consumers to process their data.

Right to Erasure – The current EU data protection directive already provides a right for consumers to request data deletion, but GDPR extends this to include data that has been published to the internet. This is where you hear a second term, the “right to be forgotten”, which extends to keeping your data fully out of public view and ensuring it is removed from all systems.

Breach Notification – Within 72 hours of a personal data breach being discovered, you have to inform the appropriate authorities. This also has to be extended to the data subjects if the breach is classified as a “high risk to their rights and freedoms”.

Fines – Now this is where most companies’ ears perk up. GDPR introduces fines of up to 4% of a company’s global revenue or 20 million euros, whichever is higher.

Data Protection Impact Assessments (DPIA) – A DPIA is required in high-risk situations, for example where a new technology is being deployed or where a profiling operation is likely to significantly affect a subject.

Data Protection Officer (DPO) – Not all companies have a DPO, but if you don’t, I would advise assigning this duty to someone in your organisation to take proper responsibility for your data protection compliance. Below are the regulation details identifying whether you need a DPO.

“DPOs must be appointed in the case of: (a) public authorities, (b) organizations that engage in large scale systematic monitoring, or (c) organizations that engage in large scale processing of sensitive personal data (Art. 37). If your organization doesn’t fall into one of these categories, then you do not need to appoint a DPO.”

Consent – GDPR introduces strict new rules around collecting data: you have to be clear and concise when requesting consent from the subject. You have to define what the data is being collected for and make sure that is all it is used for. As a controller of data, you are responsible for maintaining an audit trail of consent for all data collected from a subject. As a business, you may need to review how you’re collecting and recording consent and whether you need to make any changes to your procedures.

Children’s data protection – GDPR will bring in special protection for children’s personal data, focused particularly on commercial internet services such as social networking. To put this into context: if you collect data about children, you will need consent from a parent or guardian to process any personal details lawfully. This may have significant implications for your organisation if your business is aimed at children and collects their personal data. Again, all consent has to be clear and well defined when collecting children’s data, and your privacy notice must be written in language that children will understand.

Does Brexit mean I have to comply?

There are a few misconceptions around Brexit when it comes to GDPR, the main one being that “Brexit means we don’t have to comply”. This is FALSE! Businesses will still have to adhere to this regulation: it is an EU regulation that protects EU citizens’ data, which means that if you hold any details about an EU citizen, you have to make sure you are compliant and have taken the necessary steps, regardless of jurisdiction.

As I said above, GDPR comes into force next year, on 25th May 2018, and we will still be in the EU then, so don’t bury your head in the sand.

There are a number of other requirements you may need to meet to comply with the EU GDPR, but I am not a legal expert. So please take the time to investigate where you stand in relation to GDPR, and understand your risks and what data you hold. Attend an event and discuss it further with legal experts to help you build your foundations for GDPR.

The trouble most people have with understanding Data Fabric is that it’s not a product that you can just go out and buy. It’s NetApp’s answer to the future of IT. It’s a way of using a wide portfolio of products to enable continuous data availability across multiple on-premises and cloud platforms.

While it’s not as simple or easily measurable as just expanding your bottom line, the real value of a Data Fabric is its power to transform your business.

I typically hear four questions about the value of a Data Fabric:

How can it change how I utilise my infrastructure?

How can it help me use my resources better?

How can it help me use my data more efficiently?

How can it help my business make money?

How can Data Fabric change how I utilise my infrastructure?

Whether you’re an existing NetApp customer with a data centre full of NetApp kit or not, the NetApp Data Fabric can help you get more out of your IT infrastructure.

Let’s say your business has a new requirement to provide backup, test and development in the cloud, but you don’t want to have a large admin team to manage all the different tools or equipment required to deliver this solution. So you need to make sure the solution is easy to manage, with full choice and control over your data.

You can build a data fabric to address these challenges and I don’t mean by some “one-size-fits-all” compromise either. I can think of three data fabric components that we can use to meet our needs: FlexArray, ONTAP Cloud, and AltaVault.

FlexArray would provide you with the capability to sweat the assets you already have, so you wouldn’t need to replace all your existing storage. In fact, if you wanted to keep it, you could use FlexArray to repurpose it to run ONTAP, giving your existing storage all the efficiency benefits of ONTAP.

ONTAP Cloud: now think about having on-premises efficiency and control, but in the cloud. With ONTAP Cloud you can replicate data from your onsite ONTAP array out into AWS or Azure. In an instant, it can provide a test and dev environment without you having to pay for hardware, and it enables you to scale operationally.

AltaVault provides you with end-to-end efficiency and security when moving data to the cloud. It supports all leading backup and archive software, giving you flexibility and choice to fit it into your existing infrastructure without compromise. It can be deployed as a physical, virtual, or cloud-based solution. In less than 30 minutes, you can be backing up your data from any of your on-premises environments to the cloud of your choice.

How can Data Fabric help me use my resources better?

The Data Fabric gives you choice without sacrificing control of your data. This is key to a successful IT strategy. Forget about trying to predict what you’re going to do in 3-5 years. Think about how your decisions can change your business today. With NetApp Data Fabric and the technologies that enable it, you can buy for what you need today and scale for what you need tomorrow. Your infrastructure is agile and adaptable to your dynamic business requirements.

How can Data Fabric help me use my data more efficiently?

ONTAP 9 is the pinnacle of NetApp’s quarter century of innovation and is at the very heart of NetApp’s data fabric strategy.

NetApp continue to build capabilities into the platform to ensure that your key data assets are not only stored efficiently, but are highly available, protected and secured.

However, the true power of ONTAP is in its flexibility: the ability to run ONTAP not only on “traditional” physical controllers, but also as a software-defined option with ONTAP Select or in the public cloud with ONTAP Cloud. This means ONTAP can seamlessly move data not only between storage tiers and controllers, but between virtual appliances and cloud providers too, all while maintaining the same capabilities you expect on-premises. Your data management, protection, security, and analytics tools work in exactly the same way, regardless of ONTAP’s location.

Add to that NetApp’s desire to allow the ability to mirror data between any platform in its portfolio via SnapMirror to Anywhere technology, then you can see how your data fabric can take shape.

How can Data Fabric help my business make money?

A good portion of our IT budgets are probably spent just keeping the lights on. How much do you actually spend on development that moves the business forward?

A couple of months ago, a customer approached me to build an infrastructure that could handle their peak workloads during the heavy sales period in the last three months of the year.

They wanted a virtualisation environment with a storage platform to run the required 200 servers during these peak times. The rest of the year, the environment runs at 50% of the peak workload (only having to run 100 servers). If this was a fixed, capex-based infrastructure, they would have unused equipment sitting around for most of the year. Over a three-year contract, that’s 27 months of wasted investment.
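The waste is easy to quantify. Taking the figures above at face value (a 36-month contract with the burst capacity needed for only three months a year), a quick check confirms the 27 idle months:

```python
contract_months = 36        # three-year contract
peak_months_per_year = 3    # heavy sales period at year end
years = 3

# Months the extra 100 servers' worth of kit is actually needed
peak_use = peak_months_per_year * years

# Months that burst capacity sits idle if bought as fixed capex
idle_months = contract_months - peak_use
print(idle_months)  # → 27
```

Three-quarters of the contract term, the extra capacity would do nothing at all, which is the gap a hybrid, pay-as-you-go approach is designed to close.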

With a Data Fabric, we allowed them to achieve the same capabilities at a much lower cost. We started by deploying a virtualized flash platform on premises to account for standard workload and capacity requirements. While that flash platform may be able to cope with some of the burst that’s required as the business ramps up to its busy time, that’s not the only requirement. Compute and possibly additional storage may be needed for the extra 100 VMs.

A Data Fabric allowed us to use a hybrid cloud solution to address this challenge. By using ONTAP Cloud, we could seamlessly move data between the on-premises kit and either AWS or Azure.

Our fabric strategy also had the flexibility, if needed, to use a NetApp Private Storage (NPS) solution, allowing you to keep your data on your own NetApp systems for constant, guaranteed performance, whilst using your choice of public cloud providers for compute. This solution gives you the agility to scale up or down on demand and only pay for what you need when you need it, saving you that capital expenditure.

If you’ve been asking yourself, “What does Data Fabric mean for me and my business?” you’re not alone. Data Fabric is NetApp’s vision for the future of IT, and the benefits to your business both now and in the future are unmatched in the industry. I have spoken to a lot of customers over the past year, and one thing I have learned is that the Data Fabric can help you solve your business challenges both today and in the future.

Tell me if this sounds familiar. The other day I was giving a presentation to a customer on how the latest NetApp technology can help transform their business, and I had an epiphany. I thought to myself, “If I only had some of this kit ten years ago, I could have solved some really huge problems and avoided some big-time heartbreak.”

Then it hit me. The challenges I could have solved years ago aren’t really that different from the challenges my customers are trying to solve today. If you’ve been in the business as long as I have, you start to see the same themes resurface over and over again. For me, those have typically been things like consolidation, budget, agility, flexibility, management, cloud adoption, performance, and so on.

The Data Fabric is designed to help customers address each of those challenges head-on. And because NetApp knows one size does not fit all in today’s IT world, you can mix and match virtually any piece of the portfolio to meet specific needs. Over the next 5 blogs, I’m going to go into detail about how NetApp can help you solve some of these persistent challenges of 21st century IT.

Let’s start with the challenge that I think is still top of mind for many customers in 2016: reining in budgets while consolidating infrastructure.

Doing more with less

I heard a statistic recently that claimed the majority of companies spend 60% of their budgets just keeping the lights on, and only 5% on innovation and growing the business. This has always been a driving force in IT. How can you shrink your footprint and your resource consumption while expanding your offerings and delivering more innovation? Lately, this conversation has been dominated by one word—flash.

NetApp has been at the forefront of the flash market for years, and they’ve got it down to a science. They’ve taken their flagship storage system, FAS, and given it an all-flash upgrade to create (you guessed it) All Flash FAS, which delivers extreme performance and low latency for business-critical operations. In terms of consolidation, that means you can get even more IOPS from fewer disks that take up less space and require less power and cooling to run.

With ONTAP 9 software, that story gets even better. How much better? Try 4:1 capacity efficiency over spinning disk. That means you can store four times as much data within a smaller footprint using faster, more efficient disks. To put that into context for a large environment: if you were running 1PB on spinning disk, that might take up more than two and a half racks in your datacenter. With All Flash FAS running the new 15.3TB SSDs, you’re down to 8U. That’s a dramatic reduction in rack space, shrinking your datacenter footprint and ultimately saving you cash, and that’s just the tip of the iceberg. Add in the rich data management, storage-efficiency features like deduplication and automation, and the non-disruptive operations that ONTAP is known for, along with upcoming features like compaction, and you’ve got a recipe for a bite-sized powerhouse of a data center.
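A rough back-of-the-envelope check on those numbers (treating 1PB as 1,000TB and taking the claimed 4:1 efficiency and 15.3TB drive size at face value; drive count is my own extrapolation, not a NetApp sizing):

```python
import math

effective_tb = 1000   # 1PB of data to store
efficiency = 4        # the claimed 4:1 capacity efficiency
ssd_tb = 15.3         # per-drive capacity of the 15.3TB SSDs

physical_tb = effective_tb / efficiency   # raw flash actually needed
drives = math.ceil(physical_tb / ssd_tb)  # minimum number of SSDs

print(physical_tb, drives)  # → 250.0 17
```

Seventeen drives is comfortably inside a single shelf, which is why the whole petabyte, controllers included, can plausibly collapse into a few rack units.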

And it’s not just the cost savings on what you buy now, it’s about making more intelligent decisions for your IT of the future. This has always been a big problem for IT managers—do you overbuy and hope you’ll use it, or under-purchase to save money while praying you don’t need it?

With NetApp, you don’t need to make that call. Take ONTAP Select. If this product was available 10 years ago, I would have bought it in a heartbeat. ONTAP Select is NetApp’s software-defined version of its core ONTAP software, allowing you to enjoy the rich data management and efficiency features of ONTAP on white box x86 servers.

ONTAP 9 and the Data Fabric give you the freedom to buy what you need for today without burning up tomorrow’s budget. NetApp is changing the way people consume and purchase IT, and it’s right in line with the way the rest of the industry is moving in terms of software-defined. It’s a proactive versus reactive approach to IT that ultimately makes IT more valuable to the business by turning it into a strategic asset, not a cost center.

So with budgets and data center sprawl under control with NetApp, including All Flash FAS, ONTAP 9, ONTAP Select, and more, you can focus on the next big challenges for your business while innovating and delivering more value to customers.

Tune in next time for my take on how the NetApp portfolio can help you improve agility and flexibility for your most demanding workloads.