Zadara Blog

News, information, opinion and commentary on issues affecting enterprise data storage and management.

Multi-cloud Management Best Practices

More and more companies are implementing a multi-cloud strategy. According to the RightScale 2017 State of the Cloud Report, 85 percent of enterprises now run applications in more than one cloud. These companies have discovered the benefits of multi-cloud management: avoiding lock-in to any one cloud provider, gaining the flexibility to take full advantage of what each cloud offering does best, and, perhaps most importantly, staying online even when a single provider suffers an outage.

But managing an environment in which workloads are spread among several different clouds, each with its own feature set, user interface, and access protocols, is not a trivial exercise. As Judith Hurwitz, CEO of Hurwitz & Associates and author of Cloud Computing For Dummies puts it: “Multicloud management is an effort to have a very complicated, hybrid environment act as though it’s one single system that knows how to act and understands all its parts.”

What does it take to manage that kind of environment? Let’s take a look at some best practices for multi-cloud management.

Know Your Clouds

One of the major benefits of a multi-cloud approach is that it allows users to match a particular workload with the cloud platform best suited for that set of applications.

When it comes to public cloud providers, the list of options is a long one. It includes major players such as Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform (GCP), IBM Bluemix, and Oracle Cloud, along with many others. One of your first steps in formulating a multi-cloud strategy must be to understand what each of these clouds has to offer. Although all offer a comprehensive suite of services for general workloads, each has its own particular strengths. Any specific workload may be better suited to one of these platforms than it is to another.

For example, if you need to deploy Windows client apps, Microsoft Azure may be your best option. If your workloads involve big data analytics, GCP or Bluemix may be a good fit.

Make Cyber Security Priority #1


As you begin assessing your workloads for best fit with particular cloud platforms, the security requirements of your various applications should be an overarching concern. In particular, you’ll need to understand how the data protection, backup/restore, and disaster recovery requirements of applications involving sensitive or mission-critical information will be accommodated in each cloud, or across clouds.

This not only concerns the level of access control and data protection each cloud platform can guarantee, but may also involve the issue of data sovereignty – the concept that data stored in any country is subject to the laws of that country. So, you’ll need to know where a particular cloud might store its data. If sensitive information will be stored in, backed up to, or “bursted” to that cloud platform, in what jurisdictions might that data physically reside, and what are the potential legal or regulatory compliance issues that might apply?

Because of such concerns, many companies have reached the conclusion that their multi-cloud strategy should include a private, on-premises cloud as well as a mix of public clouds.

Match Each Workload With the Appropriate Cloud

Each of your workloads has its own requirements regarding issues such as I/O performance, latency, scalability, and security. You’ll want to prioritize these requirements and assess the tradeoffs you’ll need to make in assigning that workload to a particular cloud platform. Doing so will require that you develop a good understanding of exactly what datasets, servers, storage, and other components are used by each workload, and of the interdependencies between them.

Collect and Track Performance Data

Have you chosen the best cloud to host each of your workloads? Are applications that span clouds running efficiently, or are there unseen mismatches that are sapping performance? You can’t answer these questions if you don’t track the performance of your applications as they run on their assigned platforms. A number of sophisticated toolsets are available that allow you to monitor and manage performance within and across clouds. Acquiring and consistently using such tools to track performance and make adjustments as needed should be a fundamental aspect of your multi-cloud management strategy.
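At its core, cross-cloud monitoring means normalizing each provider's metrics into one comparable view. The sketch below is a minimal illustration of that idea; the `poll_*` functions are hypothetical stand-ins for real provider APIs, and the metric names are assumptions, not any particular tool's schema:

```python
# Minimal sketch: normalize per-cloud metrics into one comparable view.
# The poll_* functions are hypothetical stand-ins for real provider APIs.

def poll_aws():
    return {"latency_ms": 12.0, "iops": 9500}

def poll_azure():
    return {"latency_ms": 18.5, "iops": 7200}

def aggregate(pollers):
    """Collect one snapshot per cloud and flag latency outliers
    against the fleet-wide mean."""
    snapshots = {name: poll() for name, poll in pollers.items()}
    mean_latency = sum(s["latency_ms"] for s in snapshots.values()) / len(snapshots)
    for s in snapshots.values():
        s["latency_above_mean"] = s["latency_ms"] > mean_latency
    return snapshots

report = aggregate({"aws": poll_aws, "azure": poll_azure})
```

Real toolsets do far more (alerting, trend analysis, cost correlation), but the principle is the same: one normalized data model across all clouds.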

Centralize Management In A Single Interface


The last thing an IT team needs is to juggle many different cloud interfaces, each with its own access protocols and terminology. To handle a multi-cloud environment efficiently, you must be able to manage it as a single entity.

That’s why almost all multi-cloud implementations make use of software-defined storage (SDS) in some form. With SDS, users are presented with a consistent, standardized “single pane of glass” interface that looks exactly the same no matter on which platform the data may actually reside.
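The "single pane of glass" pattern can be sketched in a few lines: callers use one interface while a thin routing layer dispatches each request to whichever backend actually holds the volume. This is an illustrative sketch only; the class and method names are hypothetical, not the VPSA API:

```python
class UnifiedStorage:
    """One interface over several storage backends (illustrative only)."""
    def __init__(self):
        self.backends = {}   # volume name -> backend object

    def attach(self, volume, backend):
        self.backends[volume] = backend

    def read(self, volume, key):
        # The caller never needs to know which cloud serves the request.
        return self.backends[volume].read(key)

class InMemoryBackend:
    """Stand-in for a real on-premises or public-cloud backend."""
    def __init__(self, data):
        self.data = data

    def read(self, key):
        return self.data[key]

store = UnifiedStorage()
store.attach("vol-onprem", InMemoryBackend({"a": 1}))
store.attach("vol-aws", InMemoryBackend({"a": 2}))
```

From the user's perspective, `store.read()` behaves identically whether the volume lives on site or in a public cloud, which is exactly the abstraction SDS provides.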

A good example of this is the Zadara Storage Cloud. Zadara has its VPSA Storage Array technology installed in the facilities of a number of major cloud providers, including AWS, Azure, and GCP. VPSA Storage Arrays can also be installed at customer sites as part of a private cloud or hybrid cloud solution. As far as the user is concerned, it doesn’t matter where the data actually resides, whether on-site, in the cloud, or spanning multiple clouds. Users interact with a consistent, cloud-independent dashboard through which they can manage their entire storage infrastructure. Storage solutions like that offered by Zadara can immensely simplify the task of managing a multi-cloud environment.

Making Multi-Cloud Management Simpler

The Zadara Storage Cloud offers a number of additional features that can be applied uniformly across different cloud platforms. These include predictable performance, elastic scalability, and top-of-the-line data backup/restore and disaster recovery functionality. Because of such capabilities, the Zadara Storage Cloud can make multi-cloud management much less of a headache than it otherwise could be.

If you’d like to know more about how the multi-cloud capabilities of the Zadara Storage Cloud can work for you, please download our Getting Great Performance in the Cloud white paper.

July 25, 2017

Posted In: Industry Insights


Who Will Support Your Company’s Private Cloud?

For most enterprises, the move into cloud computing is already well under way. The reason is that the cloud offers some compelling benefits over traditional IT, including lower costs, greater flexibility, and enhanced ease of use.

But while most companies make use of the public cloud to a greater or lesser extent, some also have a need to keep their most sensitive or mission-critical information on their own premises, and under their direct control. In order to meet that requirement while still reaping the benefits of the cloud computing model, most of these organizations opt to implement a private cloud. According to the RightScale 2017 State of the Cloud Report, 72 percent of enterprises are now doing so.

Normally, on-premises private clouds are physically housed at a company’s own data center, or perhaps in the facilities of a cloud provider such as VMware. In either case, the important thing is that the organization’s data is kept safe behind a firewall, and not exposed to the public internet.

The Challenges of Running A Private Cloud



The decision to implement a private cloud raises some important issues that should be considered up front. Chief among them is the fundamental question of whether your cloud will be an entirely DIY (do-it-yourself), in-house effort, or if you will partner with a cloud services provider that can take much of the load off your internal support staff.

Supporting a private cloud on your own is not a trivial exercise. In fact, says James Maguire, managing editor at Datamation, “Running a private cloud in-house is notoriously difficult; it takes an entire host of skills and expertise, from network performance analysts to virtualization pros to IT security gurus.”

So, one of the first decisions you’ll need to make is whether your in-house IT team has the technical chops to take full responsibility for the support of your cloud. If you decide to go the DIY route, your staff must be prepared to handle every task, including the initial architecting of the system, HW/SW acquisition and configuration, 24/7/365 monitoring, proactive detection and handling of potential problems, and upgrading hardware and software as failures take place or new technology becomes available.

On the other hand, if you decide to use the services of a cloud provider, you’ll need to carefully assess the level of support the vendor will provide. You should put in place up front a comprehensive support agreement that details the roles and responsibilities of the provider in supporting your cloud. And don’t forget to clearly delineate the level of support to be provided if you decide to move on from that provider. Obviously, vendors have little incentive to extend themselves to provide above-and-beyond support to a soon-to-be former customer.

Lifting the Burden Of Support

The easiest way to implement a private cloud is to work with a provider that can substantially lift the burden of support off the shoulders of your in-house IT team. A good example of such a vendor is Zadara Storage. With its On-Premises Private Cloud offering, based on its VPSA Storage Array technology, Zadara takes full responsibility for supporting the customer’s installation.

The company installs its VPSAs, which combine Zadara’s patented software with standard storage hardware, on site in the customer’s facilities. But the customer is not responsible for supporting the equipment. Instead, Zadara remotely operates, monitors (24/7/365), maintains, and upgrades both hardware and software as needed. Software updates take place in the background, without in any way disrupting production. And when hardware needs to be replaced, Zadara ships a new VPSA to the customer, whose only responsibility is to physically install it. Customers are never required to perform tasks such as configuring the units or migrating data to them.


Could a Private Cloud Be the Answer for Your Company?

One of the most important features of Zadara’s private cloud solution is that it does not involve any capital expenditures (CapEx). Although Zadara installs and replaces VPSAs in the customer’s facilities, there’s never a charge for the hardware itself, no matter how many appliances may be installed. Instead, customers are charged a monthly fee for only the amount of storage they actually use. And, of course, there’s never a separate charge for the support activities Zadara carries out.

If you’d like to explore how Zadara can help your company benefit from having a private cloud, please download our latest analyst paper: Zadara Storage Voted by IT Pros as On-Premise Enterprise Storage-as-a-Service Market Leader.

July 18, 2017

Posted In: Industry Insights


Cloud Adoption: Starting With A Hybrid Cloud Architecture

More and more corporations are moving to the cloud every day, and many are starting slowly with a hybrid cloud architecture.

The advantages of cloud computing and storage, such as infinite scalability, rapid deployment, universal accessibility, and greater flexibility and agility, are compelling. According to the SolarWinds IT Trends Report 2017, 95 percent of businesses have already migrated at least some of their critical applications to the cloud, and the trend continues to gain momentum.

Yet some businesses still refuse to use the public cloud to any great extent. Many are concerned about data security, fearing that information stored in a widely accessible, multi-tenant environment is inherently more vulnerable than if it were kept at home in their own data centers. However, modern cloud implementations offer encryption of data in flight and at rest. Industry-leading cloud storage solutions also offer “resource isolation,” in which instances are allocated their own drives, cores, and network connections, ensuring one user’s data cannot commingle with another’s in any way. Additionally, to fully address these concerns, the cloud deployment can be on-premises, within the user’s own data center.

Performance is also a pressing concern for many companies. Because information stored in the public cloud can be physically housed hundreds or thousands of miles away from the servers that use it, the cloud exhibits latency effects that limit the speed with which applications can access the data on which they depend. However, well-architected cloud solutions place data next to the compute, so that performance is comparable to that of onsite workloads. Alternatively, workloads with demanding real-time performance requirements can be placed in an on-premises cloud, within the user’s own data center.


The Public Cloud Is Probably In Your Company’s Future

Despite these security and performance concerns, and others such as regulatory compliance issues, the momentum is toward more and more workloads eventually ending up in the public cloud. The advantages of the cloud model are simply too great to ignore. Businesses that continue to avoid the public cloud will eventually find themselves at a competitive disadvantage relative to other companies that are more aggressive in taking advantage of the benefits cloud computing provides.

Robert Pinkham, managing director of infrastructure and cloud at Accenture, puts it this way:

“If you think about the agility, the flexibility, and the ability to spin up a new environment in the public cloud, to move workloads, to turn it on or off. Those attributes are starting to be a requirement for the entire compute, the entire data center environment, the entire organization.”

And the Gartner research firm pulls no punches in its assessment:

“By 2020, a corporate ‘No Cloud’ policy will be as rare as a ‘No Internet’ policy is today.”

So, now’s the time for businesses that have hung back from the public cloud to begin assessing how they can make it work for them. For most, moving from an on-premises IT infrastructure to heavy involvement in cloud computing would be too big a step if taken all at once. Implementing a hybrid storage solution can help bridge that gap, and make a company’s transition to the cloud less challenging.


How a Hybrid Cloud Architecture Eases the Transition to the Public Cloud

Hybrid cloud storage allows an organization to commit selected portions of its data to the public cloud while keeping the rest well protected behind its own firewall.

Companies that want to begin moving to the public cloud usually start by migrating only their least sensitive information. Often this will include backup, archival, and infrequently used data. The cloud can also be used as a spill-over target (a process called “cloud bursting”) when temporary surges in the demand for storage exceed the capacity of the on-premises system.
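The spill-over decision behind cloud bursting can be expressed very simply: keep writes on-premises while capacity allows, and redirect them to the public cloud once free space drops below a safety margin. The sketch below is purely illustrative; the threshold value is an assumption, not taken from any real product:

```python
def place_write(size_gb, on_prem_free_gb, burst_threshold_gb=50):
    """Decide where a new write lands: keep it on-premises while capacity
    allows, spill ("burst") to the public cloud once free space would drop
    below the safety margin. Threshold is an illustrative assumption."""
    if on_prem_free_gb - size_gb >= burst_threshold_gb:
        return "on-prem"
    return "cloud-burst"
```

For example, a 10 GB write with 200 GB free stays on-premises, while the same write with only 55 GB free would burst to the cloud, since completing it would leave less than the 50 GB margin.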

Any data that is critical to the operations of the business, that contains personal information, or that is used by performance-intensive applications, is retained in house. But over time, as the organization gains experience and develops confidence in its ability to use the public cloud safely and productively, more and more information can be shifted from on-premises storage to the cloud. Although some companies continue to use a hybrid cloud architecture indefinitely, others eventually find that they can entrust almost all of even their most sensitive information to the public cloud.

Making a Hybrid Cloud Architecture Work For You


The key to a successful hybrid storage implementation is ensuring that applications are able to access needed data in the same manner, through the same software interface, no matter where that information is physically stored. In other words, users should not be required to make adjustments based on whether the files they work with are located on site or in the public cloud.

Achieving that kind of data transparency requires some care in the selection of both on-premises and cloud-based storage solutions. The ideal would be for the same products to be used both on site and in the cloud. Zadara Storage, for example, installs (and remotely supports) its VPSA Storage Array technology both in customers’ data centers, and also in the facilities of major cloud providers such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP).

VPSAs provide transparent access to both local and cloud data through a network file system that is directly accessible and shareable between Windows or Linux clients. And, with features like remote replication, remote mirroring, and powerful snapshot/clone capabilities, the technology also allows applications to failover from on-premises servers to VMs mounted in the cloud.

Zadara offers VPSAs as a storage-as-a-service (STaaS) solution. Whether VPSA hardware is installed on premises or is accessed through the cloud, customers pay a monthly fee for just the amount of storage they actually use.

Zadara’s VPSA technology provides an ideal building block for companies desiring to begin the process of moving data and applications to the public cloud. If you’d like to know more about how hybrid storage can be your company’s bridge to the cloud, please download the ‘STaaS vs Traditional Storage’ infographic.

July 12, 2017

Posted In: Uncategorized


Managing A Multi-Cloud Environment

Today, more and more companies are moving parts or all of their IT infrastructure to the cloud. Many of those businesses, however, are discovering that restricting themselves to just one cloud platform isn’t the best solution for them. What are the benefits of managing a multi-cloud environment?

A survey by Forrester Research reveals that 52 percent of large enterprises are already using more than one cloud provider. In fact, almost a third of those organizations are working with four or more cloud vendors. IDC predicts that by 2018 more than 85 percent of enterprises will have implemented a multi-cloud strategy.

The reason so many of the most tech-savvy enterprises are pursuing a multi-cloud approach is that it offers some compelling advantages over a single cloud strategy.


Benefits of Managing a Multi-Cloud Environment

There are three main objectives most companies have in pursuing a multi-cloud strategy:

  • Avoid vendor lock-in. The fact is, it’s almost always prudent to avoid putting all your eggs in a single vendor’s basket. Having the ability to switch from one cloud provider to another enhances a company’s ability to negotiate for the services, prices, and terms that best meet its needs. It also provides customers with the greatest flexibility to take advantage of new service offerings and technologies as they become available. To Bryson Koehler, CTO for IBM Watson and IBM Cloud, the multi-cloud approach provides companies with the “ultimate agility.”
  • Match workloads with the most suitable platform. Although all the major clouds present themselves as suitable for general workloads, each platform has niches for which it is particularly appropriate. For example, AWS is especially suited to Open Source workloads, while Microsoft Azure is a natural for Windows-centric applications.
  • Avoid downtime even if a cloud suffers an outage. The major cloud providers are quite reliable. However, even the best will sometimes go down, as was illustrated by the major AWS outage that occurred in February of 2017. A sophisticated multi-cloud solution, with automatic failover from one cloud to another, can allow companies to avoid downtime even if a cloud provider goes offline for an extended period.
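The failover objective in the list above reduces to a simple control loop: probe each provider's health and route traffic to the first healthy one in priority order. This sketch is purely illustrative of the idea, not a real high-availability implementation:

```python
def pick_active(providers, healthy):
    """Return the first healthy provider in priority order, or None if
    every cloud is down. `providers` is an ordered preference list;
    `healthy` maps provider name -> bool (e.g. from periodic probes).
    Illustrative only -- real failover also handles data consistency."""
    for name in providers:
        if healthy.get(name, False):
            return name
    return None
```

During the February 2017 AWS outage, a setup like this would have shifted traffic to the next provider in the list automatically, provided the data had already been replicated there.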


Challenges of Managing a Multi-Cloud Environment

A multi-cloud strategy brings great benefits, but also some potential challenges. Most of these arise from the reality that the major clouds differ from one another in significant ways. For example, while Amazon’s AWS has its roots in Linux, the fact that Microsoft Azure was originally called Windows Azure provides a hint as to its favored operating environment. Because each cloud has its own set of APIs, service definitions, user portals, and management interfaces, enabling application portability and unified management between them can be difficult.

Moreover, while the major clouds all have capable management tool suites, these are for the most part unique to each platform. From a purely business perspective, providers have little incentive to make it easy for customers to move from their cloud platform to another.


Requirements for Managing a Multi-Cloud Environment

One requirement for successfully operating in a multi-cloud environment is having effective tools that allow users to manage and automate workloads hosted on different clouds. Such tool sets, of which RightScale Multi-Cloud Platform is perhaps the most well known, provide the ability to monitor and manage the resources of various cloud services through a common interface.

A second necessity is a means of efficiently replicating data between clouds. According to George Crump, President of Storage Switzerland, the best way of accomplishing this is by leveraging on-premises appliances that can seamlessly connect to multiple clouds on the back end. A good example of this approach is the VPSA Storage Array from Zadara Storage. VPSAs are already resident in the facilities of major cloud providers such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP), among others. Zadara also installs VPSAs in customers’ on-site data centers.

Equipped with features such as remote replication, remote mirroring, and snapshot abilities, Zadara’s VPSAs can concurrently connect with and share data between multiple cloud platforms. And they provide a consistent, cloud-independent dashboard through which administrators can manage their entire storage infrastructure.

Zadara offers VPSAs as a storage-as-a-service (STaaS) solution, meaning that whether the hardware is accessed through the cloud or is installed on site, customers simply pay a monthly fee for just the amount of storage they actually use.


Could Multi-Cloud Work For Your Company?

In today’s environment, most businesses are dependent on the cloud to some degree. For companies that use cloud services for critical business functions, and that could suffer substantial losses if their cloud platform were unavailable for a significant period of time, taking a good look at how a multi-cloud solution could work for them should be a priority.

If you’d like to know more about how Zadara’s VPSA technology could help your company implement and manage a multi-cloud IT infrastructure, please download the ‘Zadara Storage Cloud’ whitepaper.

July 5, 2017

Posted In: Tech Corner


Who Needs a Private Cloud?

Corporations, both large and small, are moving into the cloud in great numbers. That’s because the cloud offers some very attractive benefits, including rapid deployment, infinite scalability, automatic provisioning, universal accessibility, and an OpEx pay-as-you-go economic model that yields substantial cost savings.

Actually, what most people mean when they speak of the “the cloud” is just one part of the cloud universe. What they really have in mind is the public cloud, which includes major providers such as Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform (GCP), as well as a number of smaller services.

The public cloud is built on a multi-tenancy model in which a large number of independent users share the same physical resources, such as servers and storage. This allows providers to spread their expenses across many customers, which brings down overall costs for everyone.

But the public cloud isn’t necessarily the best option for every company. The very fact that many different users have access to a common set of resources over the internet can be a source of concern. That’s why many organizations that would like to obtain the benefits of the cloud computing model are still reluctant to make use of the public cloud.


Concerns With the Public Cloud

The number one issue holding many companies back from the cloud is the fear that their data could be more vulnerable if stored there. According to the Cloud Security 2016 Survey, 93 percent of the cyber-security and IT professionals polled characterized themselves as very concerned or moderately concerned about the security of cloud data.

In particular, some enterprises are extremely reluctant to allow sensitive or mission-critical information to be removed from their immediate control. That stance may be dictated not only by data security concerns but in many cases by regulatory compliance obligations as well. For example, in the U.S., companies that deal with health information must conform to HIPAA requirements that impose strict standards regarding where and how such data can be stored.

Other companies are leery of the public cloud because they have performance-intensive workloads. Data stored in the cloud can be physically located hundreds or thousands of miles away from a customer’s data center. Even in the best of circumstances, when data and the servers that use it are separated by long distances, the speed of light, if nothing else, imposes latency delays that limit processing speeds.
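The speed-of-light floor on latency is easy to quantify. Light in optical fiber travels at roughly 200,000 km/s (about two-thirds of c), so distance alone sets a hard lower bound on round-trip time, before any routing or queuing overhead:

```python
def min_rtt_ms(distance_km, fiber_speed_km_s=200_000):
    """Lower bound on round-trip time imposed by signal speed in fiber
    (~2/3 of c). Real-world latency adds routing and queuing on top."""
    return 2 * distance_km / fiber_speed_km_s * 1000

# A data center ~1,500 km (about 1,000 miles) away:
rtt = min_rtt_ms(1500)   # 15 ms round trip, before any other overhead
```

Fifteen milliseconds may sound small, but for a storage workload issuing thousands of serialized I/O requests, it adds up quickly, which is why physical proximity between data and compute matters so much.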

If your company has any of these concerns, the public cloud may not be a viable option, at least for the most sensitive portions of your data. But that doesn’t mean you must necessarily forego all the operational and financial benefits of the cloud. Instead, the best option for you may be to deploy a private cloud.


Advantages of the Private Cloud

The public cloud, by definition, involves multi-tenancy. Private clouds, on the other hand, apply the same principles of cloud computing in a single-tenant environment. Servers and storage remain securely under a company’s own roof (or that of a chosen co-location facility) and behind its firewall. Having complete control over all components of its IT infrastructure makes it much easier for an organization to meet stringent data security, regulatory, and workload performance goals.


How to Have the Best of Both Clouds

Customers that implement the Zadara Storage Cloud in their on-premises data centers do not incur any up-front capital expenditures (CapEx) for storage hardware. Instead, with its On-Premise as-a-Service (OPaaS) solution, Zadara installs its own hardware/software resources on site, yet still only charges the customer a monthly fee for just the amount of storage actually used. Storage capacity can be instantly expanded (or contracted) as required. Moreover, following the STaaS (storage-as-a-service) model, Zadara remotely operates, maintains, and upgrades its installed equipment. In essence, Zadara provides all the benefits of a public cloud storage solution in a private cloud environment.

If you’d like to know more about how deploying a private cloud can benefit your company, please download our latest analyst paper: Zadara Storage Voted by IT Pros as On-Premise Enterprise Storage-as-a-Service Market Leader.

June 28, 2017

Posted In: Tech Corner



How to Provide Big Data Analytics To Clients

It’s hard to overstate the importance of big data analytics in almost every facet of human life today. According to IBM, 2.5 million terabytes of data are produced every day. That deluge of information represents a unique business opportunity that both large enterprises and smaller companies are rushing to take advantage of.

Some are gaining critical marketing insights from customer behavior data. Security companies are analyzing the digital output of surveillance cameras in real time to detect threats as they occur. And doctors are literally saving lives by leveraging the analysis of millions of patient records in order to produce better diagnoses and treatment regimes.

According to IDC, the big data market is expected to grow to $48.6 billion by 2019. And more and more managed IT services providers (MSPs) are positioning themselves to earn a significant portion of that revenue.

What a Big Data Analytics Solution Requires

The Oxford Dictionary defines big data as “extremely large data sets that may be analyzed computationally to reveal patterns, trends, and associations, especially relating to human behavior and interactions.” This definition says a lot about what it takes to implement big data analytics solutions.

The first practical implication of the definition is that big data requires big storage. And because many organizations never discard data once it’s collected (new uses for archived information are discovered frequently), big data storage must be able to scale in capacity quickly, non-disruptively, and almost limitlessly.

Moreover, that storage has to be fast. In order for an analysis engine to access and correlate the relevant portions of huge datasets in a timely fashion, the storage infrastructure must exhibit extremely high levels of IOPS (Input/output Operations Per Second) performance.
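A quick back-of-envelope calculation shows why. To scan a large dataset within a fixed analysis window, the required aggregate throughput must be spread across enough drives; the drive throughput figures below are rough illustrative assumptions, not benchmarks:

```python
import math

def drives_needed(dataset_tb, window_s, drive_mb_s):
    """How many drives must stream in parallel to scan `dataset_tb`
    within `window_s` seconds. Considers sequential throughput only;
    drive speeds are illustrative assumptions."""
    required_mb_s = dataset_tb * 1_000_000 / window_s
    return math.ceil(required_mb_s / drive_mb_s)

# Scanning 100 TB in one hour:
hdd = drives_needed(100, 3600, 200)    # ~200 MB/s per HDD  -> 139 drives
ssd = drives_needed(100, 3600, 2000)   # ~2 GB/s per SSD    -> 14 drives
```

Under these assumptions, the same one-hour scan needs roughly ten times as many HDDs as SSDs, which is the tension between capacity cost and performance that the rest of this post explores.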


What MSPs Bring To The Big Data Analytics Table

Obviously, performing analyses on huge datasets requires more than just putting a few extra servers and hard disk drive (HDD) arrays into an organization’s data center. Many companies, large enterprises as well as small and medium-sized businesses (SMBs), simply don’t have the time, staff, budget, or, frankly, the interest to develop and support their own big data infrastructure. MSPs that can offer such clients a high level of expertise in developing and supporting an effective big data analytics solution can provide an indispensable service.


MSPs Need Partners

Is it realistic to expect MSPs to acquire the comprehensive skill sets necessary for successfully implementing a big data analytics solution? Probably not. The good news, however, is that they don’t have to. The key for the vast majority of MSPs will be partnering with specialists in the technologies that make up a viable big data analytics system. This is especially true of the storage component.

For example, the fact that big data storage must be both extremely big and extremely fast imposes some seemingly incompatible requirements on the storage infrastructure. Because of the sheer amount of storage needed, there’s pressure to employ the least expensive storage technology available, which at this point is low-cost commodity hard disk drives (HDDs). The problem is that in many cases HDDs are simply too slow to deliver the necessary I/O performance.

Determining optimal solutions to such issues might be a real stretch for most MSPs. But by partnering with a world-class STaaS (storage-as-a-service) provider, MSPs gain access to a level of expertise and experience that will allow them to offer their clients state-of-the-art big data storage solutions.


How STaaS Can Solve Big Data Storage Issues

A first class STaaS provider, such as Zadara Storage, can help their MSP partners craft viable solutions to the most challenging big data storage issues.

For example, with Zadara’s VPSA Storage Array technology, MSPs can define different storage performance tiers, using costly but very fast solid state drive (SSD) arrays for the most active data, and slower but less expensive HDDs for information that is required less frequently. In this way, the storage technology mix can be exactly right-sized to fit the required performance level, reducing storage costs to a minimum.
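The economics of tiering can be sketched in a few lines: route hot data to SSD, cold data to HDD, and compute the blended cost per gigabyte. The threshold and prices below are illustrative assumptions, not Zadara figures:

```python
def assign_tier(accesses_per_day, hot_threshold=100):
    """Place frequently accessed data on SSD, the rest on HDD.
    Threshold is an illustrative assumption."""
    return "ssd" if accesses_per_day >= hot_threshold else "hdd"

def blended_cost_per_gb(hot_fraction, ssd_cost=0.20, hdd_cost=0.03):
    """Monthly $/GB when only the hot fraction of data sits on SSD.
    Prices are illustrative assumptions, not vendor pricing."""
    return hot_fraction * ssd_cost + (1 - hot_fraction) * hdd_cost

tier = assign_tier(500)           # heavily accessed data -> "ssd"
cost = blended_cost_per_gb(0.10)  # 10% hot on SSD -> ~$0.047/GB/month
```

With only 10 percent of data on SSD, the blended cost lands far closer to HDD pricing than to SSD pricing, while the hot data still gets SSD performance.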

In addition, because Zadara Storage Clouds are physically connected to leading public cloud providers such as Amazon Web Services (AWS), Microsoft Azure and Google Cloud Platform (GCP), and can also be installed on-premises in customer data centers, VPSA storage can be located in close proximity to the servers that perform the analyses. This minimizes the latency that arises when a long distance separates the data store from the compute engine that accesses it.

Zadara customers are never required to use capital expenses (CapEx) to purchase storage hardware. Instead, they simply pay a monthly fee for just the amount of storage they actually use. This is true whether VPSA units are installed in the cloud or on-site in the customer’s data center. In either case, the storage capacity can be seamlessly scaled up or down in seconds.
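The pay-per-use model described above amounts to metering capacity over time rather than selling hardware. Here is a minimal sketch of that idea; the billing rate, sampling interval, and usage pattern are assumptions for illustration, not Zadara's actual billing mechanics.

```python
# Illustrative metered billing: charge for the capacity actually
# provisioned each hour rather than for pre-purchased hardware.
# The rate and the usage samples below are invented for the example.

def monthly_bill(hourly_gb_samples, rate_per_gb_month):
    """Average the hourly capacity samples and bill at the monthly rate."""
    avg_gb = sum(hourly_gb_samples) / len(hourly_gb_samples)
    return avg_gb * rate_per_gb_month

# A 720-hour month: 1,000 GB, scaled up to 3,000 GB for the middle
# third of the month, then scaled back down in seconds.
samples = [1000] * 240 + [3000] * 240 + [1000] * 240
bill = monthly_bill(samples, rate_per_gb_month=0.05)
```

Because the customer pays for the average capacity actually held, a temporary burst to triple the footprint raises the bill far less than permanently buying triple the hardware would.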


Clients Are Looking To MSPs For Help

Most forward-looking companies are at least intrigued by the possibilities big data analytics could open up for them. But they also understand that implementing such a solution is no trivial task. They know they need help. And that represents a tremendous opportunity for MSPs that are willing to work with expert partners to provide the capabilities their clients need.

If you’d like to know more about how partnering with Zadara Storage could make big data analytics a reality for your clients, please download the ‘STaaS vs Traditional Storage’ infographic.

June 21, 2017

Posted In: Tech Corner


Data Security Best Practices for MSPs

What are data security best practices? With so much news about hacking and ransomware, it can be hard to keep up.

In May of 2017 a ransomware attack shook the cyber world as never before. Called WannaCry, the malware encrypted data on more than 200,000 computers in at least 150 countries. Both large enterprises and smaller businesses were thrown into chaos as perpetrators demanded payment in return for a promise to provide the encryption keys that would allow victims to regain access to their data.

This episode has served as a wake up call to many organizations that have paid insufficient attention to ensuring that their business-critical information is protected from the existential threat posed by modern cyber criminals. In this environment, there’s probably no greater benefit Managed Services Providers (MSPs) can provide for their clients than to guide them in implementing top-flight data protection solutions that can keep their precious data safe.

So, what can MSPs do to ensure the safety of their clients’ data?


Start With The Basics


A good MSP will proactively work with clients to fully understand their particular data protection needs and help them develop an appropriate plan. That assessment will include issues such as identifying business-critical information that requires a high level of protection, determining whether there are regulatory requirements, such as HIPAA compliance, that must be met, and specifying appropriate RTO (Recovery Time Objective) and RPO (Recovery Point Objective) levels to ensure business continuity if a disruption occurs.
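RPO has a concrete operational meaning: the worst-case data loss is the longest gap between successive backups. A hedged sketch of checking a backup schedule against an agreed RPO target, with invented timestamps:

```python
# Sketch of an RPO check: the worst-case data loss window is the
# largest interval between consecutive backups. The schedule below
# is illustrative, not drawn from any real client plan.

from datetime import datetime, timedelta

def worst_case_rpo(backup_times):
    """Largest gap between consecutive backups in the schedule."""
    times = sorted(backup_times)
    gaps = [later - earlier for earlier, later in zip(times, times[1:])]
    return max(gaps)

# Backups taken at midnight, 06:00, 12:00, and 20:00.
backups = [datetime(2017, 6, 1, h) for h in (0, 6, 12, 20)]
rpo = worst_case_rpo(backups)              # the 12:00-to-20:00 gap
meets_4h_target = rpo <= timedelta(hours=4)
```

A schedule that looks frequent on paper can still miss a four-hour RPO target because of one long overnight or afternoon gap, which is exactly what this kind of check surfaces.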

Of course the MSP will see to it that regular data backups are performed, including replication to remote sites to guard against simultaneous loss of both the original data and the backup in a fire or other local disaster. Plus, the backup/recovery process will be regularly tested to ensure that backed-up data can actually be restored.

Continuous 24/7 monitoring of a client’s IT infrastructure is a fundamental element of the services offered by most MSPs. This monitoring should focus not only on potential hardware or software failures, but also on detecting threats and intrusions both from outside and from within the customer’s organization. This will include encouraging the client to maintain and regularly update a comprehensive role-based access management process that strictly limits permissions to those required by each individual’s job responsibilities.

Other basics that must be covered include ensuring that anti-malware software is installed and kept up to date, and that all software upgrades and security patches are promptly applied. And a good MSP will seek to educate the client’s personnel about how to avoid falling victim to “social engineering” threats.


Moving To The Next Level Of Data Security Best Practices


Many MSPs already do a good job of providing the basics of data protection for their clients. But MSPs that go beyond the basics to offer enterprise-grade data security at an affordable price will stand out from the crowd and gain a distinct competitive advantage.

Providing that next level of protection has historically been an expensive proposition that many MSPs were simply not positioned to undertake. But now, with the advent of the storage-as-a-service (STaaS) concept, the ability for MSPs to offer enterprise-class data protection has become a practical reality. By partnering with a first-class STaaS provider, MSPs can offer a range of data security services far superior to what most of their competitors can achieve on their own.

A good example of such a partner is Zadara Storage. Through its VPSA Storage Array technology, Zadara allows MSPs to offer their customers top grade data protection services, including:

  • Automatic, continuous, incremental backups to both on-premises and off-site remote storage, including private clouds and public clouds such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP).
  • Frequent zero-impact snapshots with the retention of multiple versions, allowing customers to keep as long a history of their data as they desire.
  • RAID protection that spreads copies of the data across several disks so that a drive failure won’t result in loss of data.
  • Automatic failover to virtual servers either on-premises or in the cloud, providing the ability for applications to continue running even when a local disruption occurs.
  • Multi-zone and multi-cloud capabilities that can keep a client’s IT services online even if a major cloud provider suffers an outage.
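The “retention of multiple versions” point in the list above is usually implemented as a tiered retention policy. The sketch below is a generic illustration of that idea, not Zadara’s actual snapshot engine: keep every snapshot from the last day, the newest snapshot per day for a week, and the newest per week beyond that.

```python
# Illustrative tiered snapshot retention ("keep as long a history as
# desired"): all snapshots from the last day, one per day for a week,
# one per week after that. A generic sketch, not a Zadara API.

from datetime import datetime, timedelta

def prune(snapshots, now):
    """Return the set of snapshots to keep under the tiered policy."""
    keep = set()
    daily_seen, weekly_seen = set(), set()
    for snap in sorted(snapshots, reverse=True):   # newest first
        age = now - snap
        if age <= timedelta(days=1):
            keep.add(snap)                          # keep everything recent
        elif age <= timedelta(days=7):
            if snap.date() not in daily_seen:       # newest per day
                daily_seen.add(snap.date())
                keep.add(snap)
        else:
            week = snap.isocalendar()[:2]           # newest per ISO week
            if week not in weekly_seen:
                weekly_seen.add(week)
                keep.add(snap)
    return keep

now = datetime(2017, 6, 21, 12)
snapshots = [
    now - timedelta(hours=1),      # recent: kept
    now - timedelta(hours=3),      # recent: kept
    datetime(2017, 6, 18, 18),     # 3 days old: newest that day, kept
    datetime(2017, 6, 18, 6),      # same day, older: pruned
    datetime(2017, 6, 2, 0),       # >7 days: newest that week, kept
    datetime(2017, 6, 1, 0),       # same week, older: pruned
]
kept = prune(snapshots, now)
```

Because older snapshots are thinned rather than deleted outright, customers keep a long history without storing every version forever.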


MSPs Can Now Provide a High Level of Data Protection For All Their Clients

By partnering with an industry-leading STaaS provider like Zadara Storage, not only can MSPs offer a technically superior data protection solution to their clients, but they can do so at a much lower cost than was previously possible. Zadara’s offering is based on a pay-as-you-go model in which the client never needs to spend precious capital funds to acquire storage hardware, but simply pays a monthly fee for just the amount of storage they actually use. And that means enterprise-grade data protection can be a reality not just for large companies, but for smaller ones as well.

If you’d like to know more about how you can provide your customers with the highest levels of data protection at a cost they can afford, please download our data protection tip sheet.

June 7, 2017

Posted In: Industry Insights, Tech Corner


Benefits of Hybrid Cloud Storage (And Why MSPs Should Recommend It)

What are the benefits of hybrid cloud storage? How can you, as an end user or MSP, take advantage of these benefits of hybrid cloud storage?

Enterprises are moving their data to the cloud at a rapid pace. According to a recent survey conducted by IDG, 70 percent of organizations already have at least one application in the cloud, and another 16 percent plan to follow within a year.

The driving force behind this shift is the compelling set of benefits that cloud storage offers to businesses. These include greater agility and flexibility, enhanced data security, reduced IT management complexity, and substantial savings in total cost of ownership.

But at the same time, many IT professionals still view placing their organization’s critical data in the cloud as a risky proposition. In one survey, 62 percent of respondents cited concerns about data security as the biggest factor inhibiting them from more fully embracing the cloud.


Objections to Cloud Storage

In these times when it seems that a corporate data breach is in the news on a daily basis, the security of cloud storage is a universal concern. But it’s not the only one. Many companies face regulatory mandates, such as HIPAA requirements, that impose strict data protection standards and implicitly restrict where sensitive information can be housed. And organizations running applications with high I/O performance demands are concerned that, because of unavoidable latency delays that occur when data is transmitted long distances over the internet, the public cloud is simply not a viable option for their storage needs.

Because of such issues, many companies have a legitimate need to keep portions of their data in-house and under their immediate, direct control. But that doesn’t mean they have to forego all the advantages of the cloud storage model. Many enterprises today are implementing a hybrid solution in which their most sensitive or mission-critical information is kept securely within their own premises, while lower priority data is stored in the public cloud. According to a recent survey conducted by North Bridge Venture Partners, the hybrid model is used by 47 percent of companies, making it by far the most popular approach to enterprise data storage.


Benefits of Hybrid Cloud Storage For Managed IT Services Providers (MSPs)

The hybrid storage model offers MSPs the opportunity to make themselves invaluable to their clients by guiding them toward a solution that accommodates their data security and performance needs while still providing cost-effective cloud storage for the bulk of their data.


The on-premises storage infrastructure should be configured as a private cloud. This allows the use of a software-defined storage (SDS) paradigm that treats all the system’s data store resources, whether local or in the cloud, as a single pool of storage. Customers are presented with the same interfaces and operational procedures for all their data, wherever it may be located.

Because the SDS platform has granular control of the entire storage infrastructure, it can implement characteristic cloud storage features such as quick and easy capacity provisioning, essentially infinite scalability, increased reliability, automatic failover from local storage to the cloud, and top-flight data backup and disaster recovery regimes. In a real sense, by helping customers implement a hybrid storage solution, MSPs can offer the best of both the on-premises and cloud storage worlds.
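The “single pool of storage” idea can be made concrete with a small sketch. The classes below are invented for illustration; the point is that one provisioning call hides whether capacity comes from a local array or a cloud backend, which is exactly the uniform interface the SDS paradigm presents.

```python
# Hedged illustration of the single-pool SDS idea: one provisioning
# interface over local and cloud backends. These classes are invented
# for the sketch; they are not any vendor's actual API.

class Backend:
    """One source of capacity: a local array or a cloud region."""
    def __init__(self, name, free_gb):
        self.name, self.free_gb = name, free_gb

class StoragePool:
    """Present all backends as one pool with one set of operations."""
    def __init__(self, backends):
        self.backends = backends

    def free_gb(self):
        return sum(b.free_gb for b in self.backends)

    def provision(self, size_gb):
        # First-fit placement: the caller sees a volume, not a backend.
        for b in self.backends:
            if b.free_gb >= size_gb:
                b.free_gb -= size_gb
                return b.name
        raise RuntimeError("pool exhausted")

pool = StoragePool([Backend("on-prem", 1000), Backend("aws", 50000)])
```

A request that fits locally lands on the on-premises array; the next one spills transparently into cloud capacity, with the same interface either way.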


When Should You Recommend Hybrid Cloud Storage?

Obviously, you’ll want to recommend a hybrid solution if your clients have high-level data security or I/O performance requirements. In such situations, MSPs can help clients define which data should stay in-house, and which can safely be committed to the public cloud.

But there are also several other use cases in which hybrid storage may well be your client’s best option.

  • Legacy systems: Many companies have legacy applications running in their in-house environment that it would be difficult to move to the public cloud without a costly major overhaul.
  • New app development: Clients can be encouraged to use the public cloud to develop and test new applications before putting them into their in-house production environment.
  • Backups, DR, archiving, and “cloud bursting”: If the reason for keeping data on-premises has to do with performance rather than security, the public cloud can be used for archiving, backup, disaster recovery, and as a reserve pool to accommodate unexpected surges in storage demand (cloud bursting).
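The “cloud bursting” case in the list above reduces to a simple placement rule: once local utilization passes a threshold, new capacity spills over to the public cloud. A minimal sketch, with a made-up threshold and sizes:

```python
# Minimal sketch of cloud bursting: place new volumes on-premises
# while utilization stays below a threshold, otherwise spill over to
# public-cloud capacity. The 80% threshold is an assumed policy value.

def place_volume(size_gb, local_used_gb, local_total_gb,
                 burst_threshold=0.80):
    """Choose a location for a new volume under the bursting policy."""
    if (local_used_gb + size_gb) / local_total_gb <= burst_threshold:
        return "on-premises"
    return "public-cloud"

# With 70 TB of a 100 TB local pool in use, a 5 TB volume still fits
# locally, but a 25 TB volume would burst to the cloud.
```

The reserve pool in the cloud absorbs the surge without the client having to buy hardware for a peak that may never recur.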


MSPs Should Work With a Good STaaS (Storage-as-a-Service) Provider


Implementing a coherent, unified hybrid infrastructure in which on-premises and cloud-based data and applications seamlessly interoperate is not an easy task. The best way for most MSPs to do so is by partnering with a first class STaaS vendor that has a high level of expertise and experience in configuring enterprise-class hybrid storage solutions.

Zadara Storage is a good example of that type of STaaS provider. The company’s VPSA Storage Array technology is designed from the ground up both for the cloud and for in-house data centers. VPSAs are already resident in the facilities of major public cloud providers such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP). They can also be installed in customers’ premises to form the basis of a private or hybrid cloud storage solution.

MSPs that partner with Zadara can provide a cost-effective, technically superior hybrid storage solution with a 100 percent uptime SLA guarantee. There’s no requirement for any capital outlays to purchase equipment. Even when Zadara VPSA units are installed on site, clients simply pay a monthly fee for just the amount of storage they actually use.

If you’d like to know more about how you can provide your customers with a top-flight hybrid storage solution at an affordable cost, please download our latest analyst paper: Zadara Storage Voted by IT Pros as On-Premise Enterprise Storage-as-a-Service Market Leader.

June 1, 2017

Posted In: Tech Corner


Why Managed IT Services Providers Shouldn’t Be Afraid Of The Cloud

“Can MSPs beat the cloud?” asks the MSPAlliance. Service Providers shouldn’t be afraid of the cloud, yet that’s a question many MSPs are asking these days.

The rate at which businesses are moving parts of their IT infrastructure to the cloud is increasing day by day.

A report from 451 Research concludes that in 2017 cloud services will account for 34 percent of IT budgets, up from 28 percent just a year before.

What does that mean for MSPs that have built their business on providing services that cloud vendors now claim they can deliver more efficiently and at lower cost?

According to a 2016 survey by CompTIA, cloud computing “top[s] the list of things that keep MSPs awake at night.” The fear is that the reduced costs, increased flexibility and greater ease of use available with cloud-based products will encourage customers to conclude they don’t really need an MSP middleman when they can directly implement cloud solutions on their own.


To learn more about why service providers love the Zadara Cloud, sign up for our webinar

Why MSPs Can’t Ignore The Cloud

Stephen Orban, Head of Enterprise Strategy at AWS, notes that in a recent survey of MSPs by CompTIA, 44 percent of respondents admitted that they only support cloud services when their customers specifically ask them to do so. That “ignore the cloud and maybe it will go away” strategy is simply not a viable option. The fact is that cloud providers are already aggressively reaching out to customers over the heads of MSPs that don’t proactively offer cloud solutions.

“The biggest threat is the born-in-the-cloud guys. They’re more nimble, they’re not tied into any existing business models or margins,” says Rocco Seyboth, vice president of product and marketing at BitTitan Inc. “A lot of the products the born-in-the-cloud guys are trying to sell the customer directly conflict with the legacy MSP’s revenue streams.”

Rather than ignoring the cloud and continuing with business as usual, forward-looking MSPs will themselves become an integral part of the cloud eco-system.


How MSPs Can Respond To The Competitive Threat Of The Cloud

As more and more of their customers become intent on moving all or parts of their IT infrastructure to the cloud, the MSPs that survive and thrive will be the ones that embrace the cloud rather than avoiding it. They will see it not as a threat but as an opportunity to gain a competitive advantage by offering customers better solutions at lower costs than traditional approaches to IT can provide.

These enterprising MSPs recognize that they must demonstrate to customers the value-added they bring to the table. So, they develop in-depth knowledge of the cloud-based applications and services that could be most useful to their customers, positioning themselves to be able to integrate the offerings of several vendors or cloud providers into a unified solution that meets each customer’s specific needs.

“The successful MSPs of the future are going to be the ones who know how to integrate services, not just hardware and software. That’s the great challenge of cloud,” says former MSP senior executive Howard M. Cohen. “The big advantage of an MSP is that it’s part of their business to keep themselves current. What services are available, how they are best configured, how they are best combined with others.”


Where To Start

In essence, IT is all about data. Applications can’t work without it. So, rather than starting with the complex process of moving applications to the cloud, the best place to begin offering cloud solutions to customers is with data storage. By partnering with a good STaaS (Storage-as-a-Service) provider, MSPs can not only gain invaluable experience in developing and implementing cloud solutions, but can also offer customers storage options that combine superior performance, security, scalability, and reliability with substantially lower costs.

With the STaaS model, customers don’t purchase storage, but storage services. Zadara Storage, for example, works with MSPs to provide customers with comprehensive storage solutions encompassing Zadara-owned hardware and software, 24/7 monitoring and support, as well as real-time capacity scaling, remote backup, automatic transparent failover, and top-of-the-line data security and disaster recovery capabilities. Customers are not required to lay out capital funds to purchase their own storage equipment, but simply pay a monthly fee for only the amount of storage they actually use. This storage can be based either in the public cloud, via cloud providers such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform, in an on-premises private cloud, or in a hybrid combination of the two.


The Cloud Represents Great New Opportunities for MSPs


The cloud is changing the managed IT services business, and in order to survive and grow, MSPs must change as well. But that fact need not be a source of fear. In fact, forward-looking MSPs will see it as an opportunity to grow their business. As Charles Weaver, CEO of the MSPAlliance says, “There’s far more demand for managed services than there are qualified MSPs to deliver them.”

If you’d like to learn more about how MSPs can provide superior cloud-based solutions for their customers, please download the Solving IT Challenges eBook.

May 24, 2017

Posted In: Industry Insights


How MSPs Can Sell Enterprise Backup and Recovery Services

In today’s 24/7/365 commercial environment, information is a crucial resource for almost any business. That makes enterprise backup and recovery services critical.

A company that loses access to its data for even just minutes will almost certainly suffer significant negative repercussions.

For example, every hour of downtime can cost a Fortune 1000 enterprise an average of $500,000 to $1 million. Plus, companies risk long-term loss of both customers and reputation.

Often, businesses don’t realize how vulnerable they are to a potentially catastrophic loss of data. Disruptions such as man-made or natural disasters, hardware or software failures, malware attacks, or ill-advised actions by workers can result in critical information being inadvertently (or deliberately) deleted or corrupted. Without a comprehensive backup solution in place, a company runs the risk of critical and irreplaceable data disappearing in an instant due to some unanticipated circumstance.

Yet many businesses lack a viable strategy for ensuring that their vital information is adequately backed up and recoverable. Far too many companies aren’t consistently backing up their data at all. And many of those that do are actually more vulnerable than they know because their backup process doesn’t cover all the necessary bases.

That’s where a Managed IT Services Provider (MSP) can perform a vital service for its customers. Many businesses don’t have the time, money, or expertise to craft an adequate backup plan on their own. But a good MSP can help identify and implement enterprise-class backup solutions that will keep their clients’ data safe and available in almost any circumstance.

To learn more about Zadara Storage enterprise backup and recovery services, click here to join our webinar, “Why Service Providers Love Zadara Storage” on June 7th. 

Why Many Companies Have Inadequate Data Backup

The biggest reason many businesses skimp on their enterprise backup and recovery services is cost. It’s undeniable that implementing a good backup plan requires an investment of funds that might seem to be more urgently needed elsewhere. That often leads to companies relying on solutions that are less costly, but also less comprehensive and reliable.

Often, IT managers believe they are covered because workers are backing up their data to some type of on-premises storage on a daily or weekly basis. Or the company may maintain an account with a cloud backup service that automatically uploads data from servers or employee computers on a preset schedule.

But such practices can give an organization a false sense of security. Businesses that lose data often discover that, even with a backup solution in place, they must still call on a professional data recovery service to retrieve information their backup system could not fully restore.



What a First-Class Data Backup Solution Looks Like

Sometimes small businesses settle for inadequate backup because they are unaware of the enterprise backup and recovery services features that they need. The backup solutions offered by Zadara Storage provide a good picture of what a modern enterprise-grade backup system looks like.

For example, Zadara’s VPSA Storage Array technology enables backup to both on-premises and remote storage in order to ensure that a local disaster, such as a fire or flood, can’t wipe out both the original data and the backup at the same time. In addition, all data is RAID protected, which ensures that copies of the data are dispersed across several disks so that a drive failure will not cause data loss.

The Zadara technology allows automatic, continuous, incremental backups, with frequent zero-impact snapshots not only of data but of the operating system and running applications as well. This, along with mirroring to local and remote storage, enables automatic failover to virtual servers in the cloud or on site. If a local server should suddenly go down, these already-provisioned backup servers can instantly kick in, allowing a client’s applications to continue running without interruption. Moreover, with Zadara’s multi-zone and multi-cloud capabilities, even if a major cloud provider such as AWS or Microsoft Azure suffers an outage, the backup system can still provide customers with continuous access to their data.

Other features of a top-notch enterprise backup solution include unlimited scaling, in-flight and at-rest encryption, and centralized management. A critical service an MSP can add is continuous verification of backups, to ensure not only that client data is backed up correctly, but that it can, in fact, be recovered.
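The continuous-verification step mentioned above is, at its core, a restore-and-compare loop. Here is a deliberately simplified sketch of that check; the helper and sample data are invented for illustration, and a real service would verify entire backup sets, not single byte strings.

```python
# Sketch of backup verification: a backup only counts if a restored
# copy matches the original, checksum for checksum. The helper and
# sample data are illustrative; real systems verify whole backup sets.

import hashlib

def sha256(data: bytes) -> str:
    """Checksum used to compare source data with a restored copy."""
    return hashlib.sha256(data).hexdigest()

def verify_restore(source: bytes, restored: bytes) -> bool:
    """True only when the restored copy is byte-identical to the source."""
    return sha256(source) == sha256(restored)

original = b"critical business records"
good_restore = verify_restore(original, original)
bad_restore = verify_restore(original, b"corrupted")
```

Running this kind of check continuously is what turns “we take backups” into “we know we can recover.”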


How MSPs Can Encourage Clients To Adopt Enterprise Backup and Recovery Services


MSPs, as trusted partners in helping companies get the best from their IT investments, have the opportunity to educate their clients about the dangers, potentially to the very survival of the business, of making less than adequate provisions for backing up critical data. At the same time, the MSP must be able to provide a solution that not only meets the required operational standards but which also is affordable within the client’s budget constraints.

That’s why partnering with a top-notch STaaS (storage-as-a-service) provider like Zadara Storage is key. Because the STaaS model is inherently more comprehensive and cost-effective than solutions built on traditional storage, MSPs that take the initiative to encourage clients to upgrade their backup capability can offer enterprise-grade backup at a small-business price point.

If you’d like to know more about how you can provide your customers with backup services that are more comprehensive and cost-effective than they can achieve on their own, please download the ‘Zadara Storage Cloud’ whitepaper.

May 17, 2017

Posted In: Tech Corner
