Zadara Blog

News, information, opinion and commentary on issues affecting enterprise data storage and management.

Best practices for migrating data to the cloud

Originally published on InfoWorld

Moving petabytes of production data is a trick best done with mirrors. Follow these steps to minimize risk and cost and maximize flexibility

Enterprises that are embracing a cloud deployment need cost-effective and practical ways to migrate their corporate data into the cloud. This is sometimes referred to as “hydrating the cloud.” Given the challenge of moving massive enterprise data sets anywhere non-disruptively and accurately, the task can be a lengthy, complicated, and risky process.

Not every organization has enough dedicated bandwidth to transfer multiple petabytes without causing performance degradation to the core business, or enough spare hardware to migrate to the cloud. In some cases, those organizations in a physically isolated location, or without cost-effective high-speed Internet connections, face an impediment to getting onto a target cloud. Data must be secured, backed-up, and in the case of production environments, migrated without missing a beat.


AWS made hydration cool, so to speak, with branded offerings such as Snowball, a petabyte-scale data transfer service using one or more AWS-supplied appliances, and, since fall 2016, Snowmobile, an exabyte-scale transport service using an 18-wheeler truck that carries data point to point. These vehicles make it easy to buy and deploy migration services for data destined for the AWS cloud. Migrating 100TB of data over a dedicated 100Mbps connection would take roughly 120 days; the same transfer using multiple Snowballs requires about a week.
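The arithmetic behind that comparison is easy to check. Here is a minimal back-of-the-envelope sketch that assumes the link runs at full line rate with no protocol overhead; real-world overhead is what stretches the ideal ~93 days toward 120:

```python
# Back-of-the-envelope check of the transfer figures quoted above.
# Assumes the link runs at full line rate with zero protocol overhead.

data_tb = 100                    # data set size in terabytes (decimal)
link_mbps = 100                  # dedicated link speed in megabits/s

data_bits = data_tb * 1e12 * 8   # terabytes -> bits
seconds = data_bits / (link_mbps * 1e6)
print(f"{data_tb} TB over {link_mbps} Mbps ≈ {seconds / 86_400:.0f} days")
# -> ~93 days at line rate; protocol and scheduling overhead push a real
#    transfer toward the 120 days cited above
```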

Yet for the remaining 55 percent of the public cloud market that is not using AWS – or those enterprises with private, hybrid, or multi-cloud deployments that want more flexibility – other cloud migration options may be more appealing than AWS’s native offerings. This may be especially true when moving production data, where uploading static data onto appliances leaves the IT team with a partial copy during the transfer. They need a way to resynchronize the data.

The following is a guide to cloud hydration best practices, which differ depending on whether your data is static (and can be taken offline) or in production. I will also offer tips for integrating with new datacenter resources and accommodating hybrid or multicloud architectures.

Static data

Unless data volumes are under 1TB, you’ll want to leverage physical media such as an appliance to expedite the hydration process for file, block, or object storage. This works elegantly in environments where the data does not need to be continuously online, or the transfer requires the use of a slow, unreliable, or expensive Internet connection.

1. Copy the static data to a local hydration appliance. Use a small, portable, easily shipped NAS appliance, configured with RAID for durability while shipping between sites. The appliance should include encryption – either 128-bit AES or, preferably, 256-bit AES – to protect against unauthorized access after the NAS leaves the client facility.
Using a very fast 10G connection, teams can upload 100MB to 200MB of data per second onto a NAS appliance. The appliance should support the target environment (Windows, Linux, etc.) and file access mechanism (NFS, CIFS, Fibre Channel, etc.). One appliance is usually sufficient to transfer up to 30TB of data. For larger data volumes, teams can use multiple appliances or repeat the process several times to move data in logical chunks or segments.

2. Ship the appliance to the cloud environment. The shipping destination could be a co-location facility near the target cloud or the cloud datacenter itself.

3. Copy the data to a storage target in the cloud. The storage target should be connected to the AWS, Azure, Google, or other target cloud infrastructure using VPN access via high-speed fiber.

For example, law firms routinely need to source all emails from a client site for e-discovery purposes during litigation. Typically, the email capture spans a static, defined date range from months or years prior. The law firm will have its cloud hydration vendor ship an appliance to the litigant’s site, direct them to copy all the emails needed, then ship the appliance to the cloud hydration vendor for processing.

While some providers require the purchase of the appliance, others allow one-time use of the appliance during migration, after which it is returned and the IT team is charged on a per-terabyte basis. No capital expenditure or long-term commitment is required.
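Putting the figures above together – roughly 30TB per appliance and 100MB to 200MB per second over a 10G connection – a quick sizing sketch shows how many appliances and how much copy time a given data set needs. The 150MB/s rate and the helper function below are illustrative assumptions, not vendor specifications:

```python
import math

# Rough sizing sketch using the figures above: ~30 TB per appliance and
# an assumed 150 MB/s sustained over a 10G connection. Illustrative only.

def hydration_plan(total_tb, appliance_tb=30, rate_mb_s=150):
    appliances = math.ceil(total_tb / appliance_tb)
    copy_hours = (total_tb * 1e6) / rate_mb_s / 3600  # TB -> MB -> hours
    return appliances, copy_hours

appliances, hours = hydration_plan(100)
print(f"100 TB: {appliances} appliances, ~{hours:.0f} hours of copy time")
# -> 4 appliances, ~185 hours; copies to separate appliances can run in
#    parallel, shortening the wall-clock time
```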

Production data

Migrating production data requires some method of moving the data and then resynchronizing it once it reaches the cloud. Mirroring represents an elegant answer.

Cloud hydration using mirroring requires two local on-premises appliances that have the capability to keep track of incremental changes to the production environment while data is being moved to the new cloud target.

1. Production data is mirrored to the first appliance, creating an online copy of the data set. Then a second mirror is created from the first mirror, creating a second online copy.

2. The second mirror is “broken” and the appliance is shipped to the cloud environment.

3. The mirror is then reconnected between the on-premises copy and the remote copy and data synchronization is re-established.

4. An online copy of the data is now in the cloud and the servers can fail over to the cloud.
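To make steps 2 and 3 concrete, here is a toy, runnable simulation of the delta-resync idea: while the shipped copy is disconnected, changes against it are queued, and reconnection replays only those deltas instead of recopying the full data set. The Mirror class and its methods are invented for illustration; they are not any vendor’s replication API.

```python
# Toy simulation of steps 2-3: while the shipped copy is "broken"
# (disconnected), changes are queued; reconnecting replays only the deltas.

class Mirror:
    def __init__(self, data):
        self.data = dict(data)   # full online copy (step 1)
        self.pending = {}        # changes made while the mirror is broken
        self.connected = True

    def write(self, key, value):
        if self.connected:
            self.data[key] = value
        else:
            self.pending[key] = value   # track incremental changes

    def reconnect(self):                # step 3: re-establish synchronization
        self.data.update(self.pending)  # replay only the deltas
        self.pending.clear()
        self.connected = True

remote = Mirror({"vol1": "v1", "vol2": "v1"})  # second mirror, fully copied
remote.connected = False                       # step 2: broken, in transit
remote.write("vol1", "v2")                     # production keeps changing
remote.reconnect()                             # step 3
assert remote.data["vol1"] == "v2"             # step 4: cloud copy is current
```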

For example, a federal agency had 2PB of on-premises data that it wanted to deploy in a private cloud. The agency’s IT team set up two on-premises storage resources adjacent to each other in one datacenter, moved production data onto one mirror, then set up a second mirror so that everything was copied. Then the team broke the mirror and shipped the entire rack to a second datacenter several thousand miles away, where its cloud hydration vendor (Zadara Storage) re-established the mirrors.

When reconnected, the data was synchronized to represent a full, up-to-date mirror copy. Once the migration was complete, the hardware used during the process was sent to a remote location to serve as a second disaster recovery copy.

In another example, a global management consulting firm used 10G links to move smaller sets of data from its datacenter to the target storage cloud, and hydration appliances to move petabytes of critical data. Once the 10G-link uploads were copied to the storage resource, the cloud hydration provider used an AWS Direct Connect link to AWS. In this way the resources were isolated from the public cloud, yet made readily available to it. Other static data was copied onto NAS appliances and shipped to locations with access to the AWS cloud.

Features for easy integration

Regardless of whether the target is a public cloud or a hybrid or multicloud setting, three other factors distinguish the smooth and easy migrations from the more difficult and protracted ones.

– Format preservation. It’s ideal when the data migration process retains the desired data format, so that IT teams can copy the data into the cloud and instantly make use of it, rather than converting copied data into a native format that is usable locally but not accessible from within the cloud itself. IT managers need to be able to get at the data right away, without the extra step of having to create volumes to access it. With terabytes of data, a few hours of delay may not seem like a big deal, but at petabyte scale, the delay can become insufferable.

– Enterprise format support. Traditional storage formats such as CIFS and NFS are either minimally supported by public cloud providers or not supported at all. Yet the applications these file systems serve often yield the most savings, in terms of management time and expense, when moved to the cloud. Having the ability to copy CIFS, NFS, or other legacy file types and retain the same format for use in the cloud saves time, avoids conversion errors and hassle, and helps keep the hydration timeline on track.

– Efficient export. No vendor wants to see a customer decommission its cloud, but when needs change, bidirectional data migration or exporting of cloud data for use elsewhere needs to proceed just as efficiently – through the same static and production approaches as described above.

Hybrid cloud or multicloud support

A final consideration with any cloud hydration is making sure it’s seeded to last. With 85 percent of enterprises having a strategy to use multiple clouds, and 20 percent of enterprises planning to use multiple public clouds (RightScale State of the Cloud Report 2017), IT teams are revising their architectures with hybrid or multicloud capabilities in mind. No company wants to be locked into any one cloud provider, with no escape from the impact of the inevitable outage or disruption.

Cloud hydration approaches that allow asynchronous replication between cloud platforms make it a no-brainer for IT teams to optimize their cloud infrastructures for both performance and cost. Organizations can migrate specific workloads to one cloud platform or another (e.g., Windows applications on Azure, open source on AWS) or move them to where they can leverage the best negotiated prices and terms for given requirements. A cloud migration approach that enables concurrent access to other clouds also enables ready transfer and almost instant fail-over between clouds, in the event of an outage on one provider.

Experts have called 2017 the year of the “great migration.” Projections by Cisco and 451 Research suggest that by 2020, 83 percent of all datacenter traffic and 60 percent of enterprise workloads will be based in the cloud. New data migration options enable IT teams to “hydrate” their clouds in ways that minimize risk, cost, and hassle, and that maximize agility.

Howard Young is a solutions architect at Zadara Storage, an enterprise Storage-as-a-Service (STaaS) provider for on-premises, public, hybrid, and multicloud settings that performs cloud hydrations as one of its services. Howard has personally assisted in dozens of cloud hydrations covering billions of bits of data.

New Tech Forum provides a venue to explore and discuss emerging enterprise technology in unprecedented depth and breadth. The selection is subjective, based on our pick of the technologies we believe to be important and of greatest interest to InfoWorld readers. InfoWorld does not accept marketing collateral for publication and reserves the right to edit all contributed content. Send all inquiries to newtechforum@infoworld.com.


June 29, 2018

Posted In: Blog, Industry Insights, Tech Corner


How to Decide Whether a Hybrid Cloud Is a Good Option for Your Company

Is your company considering moving some or all of your IT operation to the cloud? If so, you’ll need to decide which approach best fits your needs: the public cloud, a private cloud, or the mixture of the two that’s come to be known as the hybrid cloud.

The most popular cloud model today, used by 75 percent of companies, is the hybrid cloud. In this approach a company puts some, often a majority, of its workloads in the public cloud, while keeping other parts at home in a private cloud. Is this the best option for your company? Here are some questions to consider as you make that decision.

1. Do You Have Data Security and Privacy Concerns?

Is your company concerned that its data may be vulnerable in the multi-tenant environment of the public cloud? Or perhaps it must meet legal or regulatory requirements, such as those imposed by the Health Insurance Portability and Accountability Act (HIPAA), to ensure the security of personal information. Many organizations faced with such mandates feel compelled to include a private cloud along with the public cloud in their solution mix so that they can keep their most sensitive data behind their own firewalls.

2. Do You Have High Performance Workloads That Would Be Impacted by Cloud Latency?

Data stored in the public cloud can sometimes be physically located half a world away from the servers that use it. When such long distances are involved, some level of data access delay, or latency, is unavoidable.

Applications that require high levels of I/O performance, such as big data analytics or online database systems for which users demand almost instantaneous response times, are usually best suited to a private cloud implemented in your own data center where storage and the servers that access it may be separated only by feet or inches.
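Physics sets a floor under that latency. Light in optical fiber travels at roughly 200,000 km/s (about two-thirds of its vacuum speed), so distance alone imposes a minimum round-trip time before any queuing, routing, or protocol overhead. A quick sketch:

```python
# Lower bound on round-trip latency from distance alone. Light in fiber
# travels at ~200,000 km/s, about two-thirds of its vacuum speed.

def min_rtt_ms(distance_km, fiber_km_per_s=200_000):
    return 2 * distance_km / fiber_km_per_s * 1000

for km in (10, 1_000, 10_000):
    print(f"{km:>6} km -> at least {min_rtt_ms(km):.1f} ms round trip")
# 10 km (same metro): ~0.1 ms; 10,000 km ("half a world away"): ~100 ms,
# before any queuing, routing, or protocol overhead is added
```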

3. Do You Have Resources To Implement a Private Cloud?


With the public cloud, you purchase only IT services, not hardware. So there are no direct costs for equipment, data center space, electrical power, cooling, etc. That’s all handled by the cloud provider.

In contrast, with a private cloud you are often responsible for the hardware and software required to run it. If you have an existing data center, those resources may already be available. Or you may choose to house your private cloud in the facilities of a third party, such as a managed IT services provider (MSP). In any case, the private cloud element of your hybrid solution will require additional resources beyond those needed for the public cloud.

4. Do Your Workloads Require a High Degree of Scalability?

In today’s world, your IT system’s “customers” (whether actual paying customers or in-house users) have little tolerance for system slowdowns. Even when demand temporarily expands far beyond the averages your IT infrastructure was designed to accommodate, your system must be able to maintain its responsiveness. In other words, it must be able to scale instantly and without limitation.

In an on-site data center, that can become extremely expensive because you must keep extra storage and server capacity on hand to handle surges in demand. But most of the time that excess capacity will sit idle, eating up capital while performing no service.

The public cloud, on the other hand, is specifically designed for unlimited and instantaneous scalability. By configuring your data center workloads to spill over into the public cloud when necessary (a process called “cloud bursting”), you can achieve a high level of scalability without having to maintain extra hardware in reserve.
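A minimal sketch of the decision at the heart of cloud bursting: keep work on-premises until projected utilization crosses a threshold, then place the overflow in the public cloud. The capacity units, the 85 percent threshold, and the placement function are invented for illustration.

```python
# Toy "cloud bursting" placement: stay on-premises until projected
# utilization crosses a threshold, then spill to the public cloud.
# Capacity units and the 85% threshold are invented for illustration.

ON_PREM_CAPACITY = 100
BURST_THRESHOLD = 0.85

def place_workload(current_load, request_size):
    projected = (current_load + request_size) / ON_PREM_CAPACITY
    return "on-premises" if projected <= BURST_THRESHOLD else "public-cloud"

print(place_workload(current_load=60, request_size=10))  # on-premises
print(place_workload(current_load=80, request_size=10))  # public-cloud
```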

5. Do You Have a “Cloud-Savvy” IT Staff?

Carl Brooks, an analyst at 451 Research, speaks of the “humongous complexity” of the hybrid cloud. That’s perhaps a bit pessimistic, but historically creating a unified, coherent system that encompasses public and private clouds and, in some cases, an on-premises data center, was a technically non-trivial task. That’s especially true if you have legacy applications that may require extensive updating in order to make them cloud-compatible.

A number of companies are now offering relatively easy-to-use tools for managing a complex multi-cloud environment (which is what a hybrid cloud is). Still, any organization seeking to implement a hybrid cloud should be aware of the issues and choose a partner that can simplify the process and eliminate the complexity.

Zadara Storage: A Great Partner for Your Hybrid Cloud Implementation

A good example of a partner that brings to the table a high level of expertise with respect to public, private, and hybrid cloud platforms is Zadara Storage. Zadara already has its VPSA Storage Arrays installed in the facilities of major cloud providers, including Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP). It also installs the same technology in customer data centers as the basis for private or hybrid cloud implementations. In both instances, Zadara remotely operates, monitors, maintains, and upgrades its storage hardware and software as needed, relieving the customer of the necessity of having expertise in such tasks.

If you’d like to explore how Zadara can assist your company in its implementation of a hybrid cloud, please download the ‘Zadara Storage Cloud’ whitepaper.

September 6, 2017

Posted In: Tech Corner


Does Your Company Need a Managed Private Cloud?

Cloud computing offers benefits that more and more companies are moving to take advantage of. These include lower costs, greater flexibility, and enhanced ease of use. But in many instances, corporations also have compelling reasons for keeping at least part of their IT infrastructure in house and behind their own firewalls. For some, it’s a need to ensure maximum security for their sensitive data or to meet regulatory compliance requirements. Others choose to stay at home because of performance constraints that require their data storage and compute resources to be in close physical proximity to each other.

Although these organizations might want to take advantage of the benefits of cloud computing, it may seem that running a private cloud operation under their own roofs is a bigger task than they are prepared to take on. For many, the best option may well be a managed private cloud.

What Is a Managed Private Cloud?

A “cloud,” whether public or private, is really nothing more than a set of services that can be easily accessed by users. The National Institute of Standards and Technology (NIST) defines cloud computing this way:

“Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.”

A key element of this definition is that it applies to any kind of cloud. It may be a multi-tenant public cloud run by one of the behemoths of the cloud computing world, such as Amazon Web Services (AWS). It could be an in-house private cloud operated by a single company exclusively for its own use. Or, as is becoming more and more common today, it could be a private cloud managed by a third-party provider who specializes in offering individual customers a complete suite of cloud services on a single-tenancy basis. This latter model is what’s called a managed private cloud.

Clouds Are Built on Stacks


The set of capabilities available with a particular cloud offering is embodied in its “stack,” a broad range of services layered on top of one another. In setting up a private cloud there are several stack offerings to choose from.

The most popular stack platform is OpenStack, which is an open source package managed by the OpenStack Foundation, and supported by a wide range of cloud providers. There are also several other open source offerings, such as CloudStack, Eucalyptus, and OpenNebula. Microsoft is entering the private cloud marketplace with Azure Stack. According to Network World, “Azure Stack is basically the same APIs, tools and processes that power Azure, but it’s intended to be hosted on-premises in private cloud scenarios.” Oracle, too, has joined the fray with its Oracle Cloud at Customer offering.

Building a private cloud will mean adopting one of the available stack platforms and fully understanding its intricacies. Rather than devoting internal staff resources to that task, many companies are choosing to outsource the responsibility of implementing and managing their private cloud stack to a partner that has specialized expertise in this area.

Benefits of a Managed Private Cloud

A good managed cloud services provider will set up, operate, and, as necessary, upgrade your company’s private cloud infrastructure. This may be on-premises at your site, or in the provider’s facility. The provider will optimize the environment for greatest efficiency with your particular workloads, while also implementing top-flight data security, backup/restore, and disaster recovery capabilities. It will monitor the operations of your cloud on an ongoing basis, and will proactively identify and fix problems as they occur.

Being able to offload such functions to a specialist is what makes the managed private cloud an especially attractive option for many companies today.


A key feature that should be part of your managed private cloud is interoperability with the public cloud. The most popular cloud computing approach today, by far, is the hybrid model in which companies make full use of both private and public clouds. Because of the public cloud’s multi-tenancy model, it can achieve economies of scale that make it more cost effective than a private cloud for some services. A well run managed private cloud will be able to call on the public cloud when it provides the best fit for particular workloads, or when surges in demand temporarily require more resources than the private cloud can provide.

How Zadara Can Make Your Managed Private Cloud an Even Better Solution

The Zadara Storage Cloud can make interoperability between private and public clouds much easier than traditional approaches to data storage. The company has its VPSA Storage Array technology already installed in the facilities of major public cloud providers, including AWS, Azure, and Google Cloud Platform (GCP). VPSA Storage Arrays can also be installed on site as the storage component of your private cloud implementation. Because these devices can be set up to automatically mirror and replicate data between themselves, sharing data (including server images) between clouds can be accomplished transparently, non-disruptively, and in real time.

If you’d like to explore how a managed private cloud can work for your organization, please download the ‘Zadara Storage Cloud’ whitepaper.

August 8, 2017

Posted In: Tech Corner


Managing A Multi-Cloud Environment

Today, more and more companies are moving parts or all of their IT infrastructure to the cloud. Many of those businesses, however, are discovering that restricting themselves to just one cloud platform isn’t the best solution for them. What are the benefits of managing a multi-cloud environment?

A survey by Forrester Research reveals that 52 percent of large enterprises are already using more than one cloud provider. In fact, almost a third of those organizations are working with four or more cloud vendors. IDC predicts that by 2018 more than 85 percent of enterprises will have implemented a multi-cloud strategy.

The reason so many of the most tech-savvy enterprises are pursuing a multi-cloud approach is that it offers some compelling advantages over a single cloud strategy.

 

Benefits of Managing a Multi-Cloud Environment

There are three main objectives most companies have in pursuing a multi-cloud strategy:

  • Avoid vendor lock-in. The fact is, it’s almost always prudent to avoid putting all your eggs in a single vendor’s basket. Having the ability to switch from one cloud provider to another enhances a company’s ability to negotiate for the services, prices, and terms that best meet its needs. It also provides customers with the greatest flexibility to take advantage of new service offerings and technologies as they become available. To Bryson Koehler, CTO for IBM Watson and IBM Cloud, the multi-cloud approach provides companies with the “ultimate agility.”
  • Match workloads with the most suitable platform. Although all the major clouds present themselves as suitable for general workloads, each platform has niches for which it is particularly appropriate. For example, AWS is especially suited to Open Source workloads, while Microsoft Azure is a natural for Windows-centric applications.
  • Avoid downtime even if a cloud suffers an outage. The major cloud providers are quite reliable. However, even the best will sometimes go down, as was illustrated by the major AWS outage that occurred in February of 2017. A sophisticated multi-cloud solution, with automatic failover from one cloud to another, can allow companies to avoid downtime even if a cloud provider goes offline for an extended period.

 

Challenges of Managing a Multi-Cloud Environment

There are great benefits with a multi-cloud strategy, but they come with some potential challenges. Most of these arise from the reality that the major clouds differ from one another in significant ways. For example, while Amazon’s AWS has its roots in Linux, the fact that Microsoft Azure was originally called Windows Azure provides a hint as to its favored operating environment. Because each cloud has its own set of APIs, service definitions, user portals, and management interfaces, enabling application portability and unified management between them can be difficult.

Moreover, while the major clouds all have capable management tool suites, these are for the most part unique to each platform. From a purely business perspective, providers have little incentive to make it easy for customers to move from their cloud platform to another.

 

Requirements for Managing a Multi-Cloud Environment

One requirement for successfully operating in a multi-cloud environment is having effective tools that allow users to manage and automate workloads hosted on different clouds. Such tool sets, of which RightScale Multi-Cloud Platform is perhaps the most well known, provide the ability to monitor and manage the resources of various cloud services through a common interface.

A second necessity is a means of efficiently replicating data between clouds. According to George Crump, President of Storage Switzerland, the best way of accomplishing this is by leveraging on-premises appliances that can seamlessly connect to multiple clouds on the back end. A good example of this approach is the VPSA Storage Array from Zadara Storage. VPSAs are already resident in the facilities of major cloud providers such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP), among others. Zadara also installs VPSAs in customers’ on-site data centers.

Equipped with features such as remote replication, remote mirroring, and snapshot abilities, Zadara’s VPSAs can concurrently connect with and share data between multiple cloud platforms. And they provide a consistent, cloud-independent dashboard through which administrators can manage their entire storage infrastructure.

Zadara offers VPSAs as a storage-as-a-service (STaaS) solution, meaning that whether the hardware is accessed through the cloud or is installed on site, customers simply pay a monthly fee for just the amount of storage they actually use.

 

Could Multi-Cloud Work For Your Company?

In today’s environment, most businesses are dependent on the cloud to some degree. For companies that use cloud services for critical business functions, and that could suffer substantial losses if their cloud platform was unavailable for a significant period of time, taking a good look at how a multi-cloud solution could work for them should be a priority.

If you’d like to know more about how Zadara’s VPSA technology could help your company implement and manage a multi-cloud IT infrastructure, please download the ‘Zadara Storage Cloud’ whitepaper.

July 5, 2017

Posted In: Tech Corner


Who Needs a Private Cloud?

Corporations, both large and small, are moving into the cloud in great numbers. That’s because the cloud offers some very attractive benefits, including rapid deployment, infinite scalability, automatic provisioning, universal accessibility, and an OpEx pay-as-you-go economic model that yields substantial cost savings.

Actually, what most people mean when they speak of “the cloud” is just one part of the cloud universe. What they really have in mind is the public cloud, which includes major providers such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP), as well as a number of smaller services.

The public cloud is built on a multi-tenancy model in which a large number of independent users share the same physical resources, such as servers and storage. This allows providers to spread their expenses across many customers, which brings down overall costs for everyone.

But the public cloud isn’t necessarily the best option for every company. The very fact that many different users have access to a common set of resources over the internet can be a source of concern. That’s why many organizations that would like to obtain the benefits of the cloud computing model are still reluctant to make use of the public cloud.

 

Concerns With the Public Cloud

The number one issue holding many companies back from the cloud is the fear that their data could be more vulnerable if stored there. According to the Cloud Security 2016 Survey, 93 percent of the cyber-security and IT professionals polled characterized themselves as very concerned or moderately concerned about the security of cloud data.

In particular, some enterprises are extremely reluctant to allow sensitive or mission-critical information to be removed from their immediate control. That stance may be dictated not only by data security concerns but in many cases by regulatory compliance obligations as well. For example, in the U.S., companies that deal with health information must conform to HIPAA requirements that impose strict standards regarding where and how such data can be stored.

Other companies are leery of the public cloud because they have performance-intensive workloads. Data stored in the cloud can be physically located hundreds or thousands of miles away from a customer’s data center. Even in the best of circumstances, when data and the servers that use it are separated by long distances, the speed of light, if nothing else, imposes latency delays that limit processing speeds.

If your company has any of these concerns, the public cloud may not be a viable option, at least for the most sensitive portions of your data. But that doesn’t mean you must necessarily forego all the operational and financial benefits of the cloud. Instead, the best option for you may be to deploy a private cloud.

 

Advantages of the Private Cloud

The public cloud, by definition, involves multi-tenancy. Private clouds, on the other hand, apply the same principles of cloud computing in a single-tenant environment. Servers and storage remain securely under a company’s own roof (or that of a chosen co-location facility) and behind its firewall. Having complete control over all components of its IT infrastructure makes it much easier for an organization to meet stringent data security, regulatory, and workload performance goals.

 

How to Have the Best of Both Clouds

Customers that implement the Zadara Storage Cloud in their on-premises data centers do not incur any up-front capital expenditure (CapEx) for storage hardware. Instead, with its On-Premise as-a-Service (OPaaS) solution, Zadara installs its own hardware/software resources on site, yet still charges the customer only a monthly fee for just the amount of storage actually used. Storage capacity can be instantly expanded (or contracted) as required. Moreover, following the STaaS (storage-as-a-service) model, Zadara remotely operates, maintains, and upgrades its installed equipment. In essence, Zadara provides all the benefits of a public cloud storage solution in a private cloud environment.

If you’d like to know more about how deploying a private cloud can benefit your company, please download our latest analyst paper: Zadara Storage Voted by IT Pros as On-Premise Enterprise Storage-as-a-Service Market Leader.

June 28, 2017

Posted In: Tech Corner


How to Provide Big Data Analytics To Clients

It’s hard to overstate the importance of big data analytics in almost every facet of human life today. According to IBM, 2.5 million terabytes of data is being produced on a daily basis. That deluge of information represents a unique business opportunity that both large enterprises and smaller companies are rushing to take advantage of.

Some are gaining critical marketing insights from customer behavior data. Security companies are analyzing the digital output of surveillance cameras in real time to detect threats as they occur. And doctors are literally saving lives by leveraging the analysis of millions of patient records in order to produce better diagnoses and treatment regimes.

According to IDC, the big data market is expected to grow to $48.6 billion by 2019. And more and more managed IT services providers (MSPs) are positioning themselves to earn a significant portion of that revenue.

What a Big Data Analytics Solution Requires

The Oxford Dictionary defines big data as “extremely large data sets that may be analyzed computationally to reveal patterns, trends, and associations, especially relating to human behavior and interactions.” This definition says a lot about what it takes to implement big data analytics solutions.

The first practical implication of the definition is that big data requires big storage. And because many organizations never discard data once it’s collected (new uses for archived information are discovered frequently), big data storage must be able to scale in capacity quickly, non-disruptively, and almost limitlessly.

Moreover, that storage has to be fast. In order for an analysis engine to access and correlate the relevant portions of huge datasets in a timely fashion, the storage infrastructure must exhibit extremely high levels of IOPS (Input/output Operations Per Second) performance.

 

What MSPs Bring To The Big Data Analytics Table

Obviously, performing analyses on huge datasets requires more than just putting a few extra servers and hard disk drive (HDD) arrays into an organization’s data center. Many companies, large enterprises as well as small and medium-sized businesses (SMBs), simply don’t have the time, staff, budget, or, frankly, the interest to develop and support their own big data infrastructure. MSPs that can offer such clients a high level of expertise in developing and supporting an effective big data analytics solution can provide an indispensable service.

 

MSPs Need Partners

Is it realistic to expect MSPs to acquire the comprehensive skill sets necessary for successfully implementing a big data analytics solution? Probably not. The good news, however, is that they don’t have to. The key for the vast majority of MSPs will be partnering with specialists in the technologies that make up a viable big data analytics system. This is especially true of the storage component.

For example, the fact that big data storage must be both extremely big and extremely fast imposes some seemingly incompatible requirements on the storage infrastructure. Because of the sheer amount of storage needed, there’s pressure to employ the least expensive storage technology available, which at this point is low-cost commodity hard disk drives (HDDs). The problem is that in many cases HDDs are simply too slow to deliver the necessary I/O performance.

Determining optimal solutions to such issues might be a real stretch for most MSPs. But by partnering with a world-class STaaS (storage-as-a-service) provider, MSPs gain access to a level of expertise and experience that will allow them to offer their clients state-of-the-art big data storage solutions.

 

How STaaS Can Solve Big Data Storage Issues

A first class STaaS provider, such as Zadara Storage, can help their MSP partners craft viable solutions to the most challenging big data storage issues.

For example, with Zadara’s VPSA Storage Array technology, MSPs can define different storage performance tiers, using costly but very fast solid state drive (SSD) arrays for the most active data, and slower but less expensive HDDs for information that is required less frequently. In this way, the storage media mix can be exactly right-sized to fit the required performance level, reducing storage costs to a minimum.
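As an illustration of the idea (not Zadara’s actual policy engine), a tiering decision can be as simple as routing each volume to SSD or HDD based on how recently it was accessed; a real policy would also weigh IOPS history, capacity, and cost. The seven-day cutoff and the example volumes below are made up.

```python
from datetime import datetime, timedelta

# Illustrative two-tier placement: "hot" volumes (accessed within the
# window) go to SSD, everything else to HDD. Cutoff and volumes are
# invented examples, not a real policy.

HOT_WINDOW = timedelta(days=7)

def choose_tier(last_accessed, now=None):
    now = now or datetime.now()
    return "ssd" if now - last_accessed <= HOT_WINDOW else "hdd"

volumes = {
    "analytics-scratch": datetime.now() - timedelta(hours=4),
    "archive-2015":      datetime.now() - timedelta(days=400),
}
for name, last_accessed in volumes.items():
    print(f"{name}: {choose_tier(last_accessed)}")
# analytics-scratch -> ssd, archive-2015 -> hdd
```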

In addition, because Zadara Storage Clouds are physically connected to the leading public cloud providers like Amazon Web Services (AWS), Microsoft Azure and Google Cloud Platform (GCP), and can be installed on-premises in customer data centers as well, VPSA storage can be located in close proximity to the servers that use the data to perform analyses. This reduces to an absolute minimum the latency issues that arise when there is a long distance between the data store and the compute engine that accesses it.

Zadara customers are never required to use capital expenses (CapEx) to purchase storage hardware. Instead, they simply pay a monthly fee for just the amount of storage they actually use. This is true whether VPSA units are installed in the cloud or on-site in the customer’s data center. In either case, the storage capacity can be seamlessly scaled up or down in seconds.

 

Clients Are Looking To MSPs For Help

Most forward-looking companies are at least intrigued by the possibilities big data analytics could open up for them. But they also understand that implementing such a solution is no trivial task. They know they need help. And that represents a tremendous opportunity for MSPs that are willing to work with expert partners to provide the capabilities their clients need.

If you’d like to know more about how partnering with Zadara Storage could make big data analytics a reality for your clients, please download the ‘STaaS vs Traditional Storage’ infographic.

June 21, 2017

Posted In: Tech Corner


Data Security Best Practices for MSPs

What are data security best practices? With so much news about hacking and ransomware, it can be hard to keep up.

In May of 2017 a ransomware attack shook the cyber world as never before. Called WannaCry, the malware encrypted data on more than 200,000 computers in at least 150 countries. Both large enterprises and smaller businesses were thrown into chaos as perpetrators demanded payment in return for a promise to provide the encryption keys that would allow victims to regain access to their data.

This episode has served as a wake-up call to many organizations that have paid insufficient attention to ensuring that their business-critical information is protected from the existential threat posed by modern cyber criminals. In this environment, there’s probably no greater benefit Managed Services Providers (MSPs) can provide for their clients than to guide them in implementing top-flight data protection solutions that can keep their precious data safe.

So, what can MSPs do to ensure the safety of their clients’ data?

 

Start With The Basics


A good MSP will proactively work with clients to fully understand their particular data protection needs and help them develop an appropriate plan. That assessment will include issues such as identifying business-critical information that requires a high level of protection, determining if there are regulatory requirements, such as HIPAA compliance, that must be met, and specifying appropriate RTO (Recovery Time Objective) and RPO (Recovery Point Objective) levels to ensure business continuity if a disruption occurs.
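Of those targets, the RPO caps how much recent data the business can afford to lose, which translates directly into a maximum interval between backups or snapshots. A toy sketch, with invented numbers:

```python
# Toy translation of an RPO target into a snapshot schedule: to lose at
# most N minutes of data, snapshots must run at least every N minutes,
# with headroom in case one fails or runs late. Example numbers only.

def max_snapshot_interval(rpo_minutes, safety_factor=0.5):
    return rpo_minutes * safety_factor

for rpo in (15, 60, 24 * 60):
    interval = max_snapshot_interval(rpo)
    print(f"RPO {rpo:>4} min -> snapshot at least every {interval:.0f} min")
```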

Of course the MSP will see to it that regular data backups are performed, including replication to remote sites to guard against simultaneous loss of both the original data and the backup in a fire or other local disaster. Plus, the backup/recovery process will be regularly tested to ensure that backed-up data can actually be restored.

Continuous 24/7 monitoring of a client’s IT infrastructure is a fundamental element of the services offered by most MSPs. This monitoring should focus not only on potential hardware or software failures, but also on detecting threats and intrusions both from outside and from within the customer’s organization. This will include encouraging the client to maintain and regularly update a comprehensive role-based access management process that strictly limits permissions to those required by each individual’s job responsibilities.

Other basics that must be covered include ensuring that anti-malware software is installed and kept up to date, and that all software upgrades and security patches are promptly applied. And a good MSP will seek to educate the client’s personnel about how to avoid falling victim to “social engineering” threats.

 

Moving To The Next Level Of Data Security Best Practices


Many MSPs already do a good job of providing the basics of data protection for their clients. But MSPs that go beyond the basics to offer enterprise-grade data security at an affordable price will stand out from the crowd and gain a distinct competitive advantage.

Providing that next level of protection has historically been an expensive proposition that many MSPs were simply not positioned to undertake. But now, with the advent of the storage-as-a-service (STaaS) concept, the ability for MSPs to offer enterprise-class data protection has become a practical reality. By partnering with a first-class STaaS provider, MSPs can offer a range of data security services far superior to what most of their competitors can achieve on their own.

A good example of such a partner is Zadara Storage. Through its VPSA Storage Array technology, Zadara allows MSPs to offer their customers top grade data protection services, including:

  • Automatic, continuous, incremental backups to both on-premises and off-site remote storage, including private clouds and public clouds such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP).
  • Frequent zero-impact snapshots with the retention of multiple versions, allowing customers to keep as long a history of their data as they desire.
  • RAID protection that spreads copies of the data across several disks so that a drive failure won’t result in loss of data.
  • Automatic failover to virtual servers either on-premises or in the cloud, providing the ability for applications to continue running even when a local disruption occurs.
  • Multi-zone and multi-cloud capabilities that can keep a client’s IT services online even if a major cloud provider suffers an outage.

 

MSPs Can Now Provide a High Level of Data Protection For All Their Clients

By partnering with an industry-leading STaaS provider like Zadara Storage, not only can MSPs offer a technically superior data protection solution to their clients, but they can do so at a much lower cost than was previously possible. Zadara’s offering is based on a pay-as-you-go model in which the client never needs to spend precious capital funds to acquire storage hardware, but simply pays a monthly fee for just the amount of storage they actually use. And that means enterprise-grade data protection can be a reality not just for large companies, but for smaller ones as well.

If you’d like to know more about how you can provide your customers with the highest levels of data protection at a cost they can afford, please download our data protection tip sheet.

June 7, 2017

Posted In: Industry Insights, Tech Corner


Benefits of Hybrid Cloud Storage (And Why MSPs Should Recommend It)

What are the benefits of hybrid cloud storage? And how can you, as an end user or MSP, take advantage of them?

Enterprises are moving their data to the cloud at a rapid pace. According to a recent survey conducted by IDG, 70 percent of organizations already have at least one application in the cloud, and another 16 percent plan to do so within a year.

The driving force behind this shift is the compelling set of benefits that cloud storage offers to businesses. These include greater agility and flexibility, enhanced data security, reduced IT management complexity, and substantial savings in total cost of ownership.

But at the same time, many IT professionals still view placing their organization’s critical data in the cloud as a risky proposition. In one survey, 62 percent of respondents cited concerns about data security as the biggest factor inhibiting them from more fully embracing the cloud.

 

Objections to Cloud Storage

In these times when it seems that a corporate data breach is in the news on a daily basis, the security of cloud storage is a universal concern. But it’s not the only one. Many companies face regulatory mandates, such as HIPAA requirements, that impose strict data protection standards and implicitly restrict where sensitive information can be housed. And organizations running applications with high I/O performance demands are concerned that, because of unavoidable latency delays that occur when data is transmitted long distances over the internet, the public cloud is simply not a viable option for their storage needs.

Because of such issues, many companies have a legitimate need to keep portions of their data in-house and under their immediate, direct control. But that doesn’t mean they have to forego all the advantages of the cloud storage model. Many enterprises today are implementing a hybrid solution in which their most sensitive or mission-critical information is kept securely within their own premises, while lower priority data is stored in the public cloud. According to a recent survey conducted by North Bridge Venture Partners, the hybrid model is used by 47 percent of companies, making it by far the most popular approach to enterprise data storage.

 

Benefits of Hybrid Cloud Storage For Managed IT Services Providers (MSPs)

The hybrid storage model offers MSPs the opportunity to make themselves invaluable to their clients by guiding them toward a solution that accommodates their data security and performance needs while still providing cost-effective cloud storage for the bulk of their data.


The on-premises storage infrastructure should be configured as a private cloud. This allows the use of a software-defined storage (SDS) paradigm that treats all the system’s data store resources, whether local or in the cloud, as a single pool of storage. Customers are presented with the same interfaces and operational procedures for all their data, wherever it may be located.

Because the SDS platform has granular control of the entire storage infrastructure, it can implement characteristic cloud storage features such as quick and easy capacity provisioning, essentially infinite scalability, increased reliability, automatic failover from local storage to the cloud, and top-flight data backup and disaster recovery regimes. In a real sense, by helping customers implement a hybrid storage solution, MSPs can offer the best of both the on-premises and cloud storage worlds.

 

When Should You Recommend Hybrid Cloud Storage?

Obviously, you’ll want to recommend a hybrid solution if your clients have high-level data security or I/O performance requirements. In such situations, MSPs can help clients define which data should stay in-house, and which can safely be committed to the public cloud.

But there are also several other use cases in which hybrid storage may well be your client’s best option.

  • Legacy systems: Many companies have legacy applications running in their in-house environment that would be difficult to move to the public cloud without a costly major overhaul.
  • New app development: Clients can be encouraged to use the public cloud to develop and test new applications before putting them into their in-house production environment.
  • Backups, DR, archiving, and “cloud bursting”: If the reason for keeping data on-premises has to do with performance rather than security, the public cloud can be used for archiving, backup, disaster recovery, and as a reserve pool to accommodate unexpected surges in storage demand (cloud bursting).

 

MSPs Should Work With a Good STaaS (Storage-as-a-Service) Provider


Implementing a coherent, unified hybrid infrastructure in which on-premises and cloud-based data and applications seamlessly interoperate is not an easy task. The best way for most MSPs to do so is by partnering with a first class STaaS vendor that has a high level of expertise and experience in configuring enterprise-class hybrid storage solutions.

Zadara Storage is a good example of that type of STaaS provider. The company’s VPSA Storage Array technology is designed from the ground up both for the cloud and for in-house data centers. VPSAs are already resident in the facilities of major public cloud providers such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP). They can also be installed in customers’ premises to form the basis of a private or hybrid cloud storage solution.

MSPs that partner with Zadara can provide a cost-effective, technically superior hybrid storage solution with a 100 percent uptime SLA guarantee. There’s no requirement for any capital outlays to purchase equipment. Even when Zadara VPSA units are installed on site, clients simply pay a monthly fee for just the amount of storage they actually use.

If you’d like to know more about how you can provide your customers with a top-flight hybrid storage solution at an affordable cost, please download our latest analyst paper: Zadara Storage Voted by IT Pros as On-Premise Enterprise Storage-as-a-Service Market Leader.

June 1, 2017

Posted In: Tech Corner


How MSPs Can Sell Enterprise Backup and Recovery Services

In today’s 24/7/365 commercial environment, information is a crucial resource for almost any business. That makes enterprise backup and recovery services critical.

A company that loses access to its data for even just minutes will almost certainly suffer significant negative repercussions.

For example, every hour of downtime can cost a Fortune 1000 enterprise an average of $500,000 to $1 million. Plus, companies risk long-term loss of both customers and reputation.

Many businesses don’t really understand how vulnerable they are to a potentially catastrophic loss of data from disruptions such as man-made or natural disasters, hardware/software failures, malware attacks, or ill-advised worker actions that inadvertently (or deliberately) delete or corrupt critical information. Without a comprehensive backup solution in place, companies run the risk of critical and irreplaceable data disappearing in an instant due to some unanticipated circumstance.

Yet many businesses lack a viable strategy for ensuring that their vital information is adequately backed up and recoverable. Far too many companies aren’t consistently backing up their data at all. And many of those that do are actually more vulnerable than they know because their backup process doesn’t cover all the necessary bases.

That’s where a Managed IT Services Provider (MSP) can perform a vital service for its customers. Many businesses don’t have the time, money, or expertise to craft an adequate backup plan on their own. But a good MSP can help identify and implement enterprise-class backup solutions that will keep their clients’ data safe and available in almost any circumstance.

To learn more about Zadara Storage enterprise backup and recovery services, click here to join our webinar, “Why Service Providers Love Zadara Storage” on June 7th. 

Why Many Companies Have Inadequate Data Backup

The biggest reason many businesses skimp on their enterprise backup and recovery services is cost. It’s undeniable that implementing a good backup plan requires an investment of funds that might seem to be more urgently needed elsewhere. That often leads to companies relying on solutions that are less costly, but also less comprehensive and reliable.

Many times IT managers believe they are covered because workers are backing up their data to some type of on-premises storage on a daily or weekly basis. Or the company may maintain an account with a cloud backup service that automatically uploads data from servers or employee computers on a preset schedule.

But such practices can give an organization a false sense of security. It’s not uncommon for businesses that lose data despite having some type of backup solution in place to end up calling a professional data recovery service to retrieve information their backup system was not able to fully restore.


 

What a First-Class Data Backup Solution Looks Like

Sometimes small businesses settle for inadequate backup because they are unaware of the enterprise backup and recovery features they actually need. The backup solutions offered by Zadara Storage provide a good picture of what a modern enterprise-grade backup system looks like.

For example, Zadara’s VPSA Storage Array technology enables backup to both on-premises and remote storage in order to ensure that a local disaster, such as a fire or flood, can’t wipe out both the original data and the backup at the same time. In addition, all data is RAID protected, which ensures that copies of the data are dispersed across several disks so that a drive failure will not cause data loss.
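The principle that makes RAID resilient fits in a few lines: with an XOR parity block, the contents of any single failed drive can be rebuilt from the surviving drives. This toy example uses three 4-byte “drives”; real arrays stripe data and rotate parity across many disks.

```python
# Toy XOR-parity demo: three data "drives" plus one parity block. Any
# single failed drive can be rebuilt from the survivors and the parity.

drives = [b"\x01\x02\x03\x04", b"\x10\x20\x30\x40", b"\xaa\xbb\xcc\xdd"]
parity = bytes(a ^ b ^ c for a, b, c in zip(*drives))

lost = drives[1]  # simulate drive 1 failing
rebuilt = bytes(a ^ c ^ p for a, c, p in zip(drives[0], drives[2], parity))

assert rebuilt == lost  # the failed drive's data is fully recovered
print("rebuilt drive 1:", rebuilt.hex())
```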

The Zadara technology allows automatic, continuous, incremental backups, with frequent zero-impact snapshots not only of data but of the operating system and running applications as well. This, along with mirroring to local and remote storage, enables automatic failover to virtual servers in the cloud or on site. If a local server should suddenly go down, these already-provisioned backup servers can instantly kick in, allowing a client’s applications to continue running without interruption. Moreover, with Zadara’s multi-zone and multi-cloud capabilities, even if a major cloud provider such as AWS or Microsoft Azure suffers an outage, the backup system can still provide customers with continuous access to their data.

Other features of a top-notch enterprise backup solution include unlimited scaling, in-flight and at-rest encryption, and centralized management. A critical service an MSP can add is continuous verification of backups, to ensure not only that client data is backed up correctly, but that it can, in fact, be recovered.
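That verification step can be as simple as comparing a checksum of a test restore against the original. A minimal sketch (the file paths are hypothetical):

```python
import hashlib

# Sketch of backup verification: restore to a scratch location, then
# compare checksums against the original. The paths are hypothetical.

def sha256_of(path, chunk_size=1 << 20):
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_restore(original_path, restored_path):
    return sha256_of(original_path) == sha256_of(restored_path)

# verify_restore("/data/db.dump", "/restore-test/db.dump") -> True if the
# backup can actually be restored bit-for-bit
```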

 

How MSPs Can Encourage Clients To Adopt Enterprise Backup and Recovery Services


MSPs, as trusted partners in helping companies get the best from their IT investments, have the opportunity to educate their clients about the dangers, potentially to the very survival of the business, of making less than adequate provisions for backing up critical data. At the same time, the MSP must be able to provide a solution that not only meets the required operational standards but which also is affordable within the client’s budget constraints.

That’s why partnering with a top-notch STaaS (storage-as-a-service) provider like Zadara Storage is key. Because the STaaS model is inherently more technically comprehensive and cost effective than solutions involving traditional storage, MSPs that take the initiative to encourage clients to upgrade their backup capability can offer enterprise-grade backup at a small business price point.

If you’d like to know more about how you can provide your customers with backup services that are more comprehensive and cost-effective than they can achieve on their own, please download the ‘Zadara Storage Cloud’ whitepaper.

May 17, 2017

Posted In: Tech Corner


How Can Managed Services Providers Offer Flexibility and Agility To Their Clients

Managed Services Providers (MSPs) are in business to help clients achieve their objectives in a cost-effective manner by taking full advantage of the capabilities of modern IT technology. At least, that’s what clients expect of the MSPs they partner with. In today’s world, with the amounts of information companies use growing at exponential rates, that often means helping customers find ways to meet an ever-increasing need for more data storage capacity.

For many small and medium-sized businesses (SMBs) it’s becoming prohibitively expensive to continue adding capacity by purchasing more and more storage hardware. Perhaps even more importantly, the inherent limitations of traditional on-premises storage solutions are depriving those companies of the flexibility and agility they need if they are to succeed in today’s rapidly changing business environment.

As recent surveys indicate, a majority of IT professionals now look to the cloud to overcome the restrictions their existing storage solutions impose on their organizations. And forward-looking MSPs are taking the lead to ensure that the substantial benefits of the cloud storage model are made available to their clients.




How The Cloud Makes Storage Less Costly

The traditional storage solutions still in use at many SMBs suffer from several limitations that restrict the ability of those companies to quickly react as business conditions change. Foremost among these is the high cost of using capital funds (CapEx) to purchase additional storage hardware to meet ever-growing demands for more capacity.

By offering a storage solution incorporating cloud-based Storage-as-a-Service (STaaS), MSPs can help their clients meet their storage needs, and keep up with growing demand, at a substantially lower cost than with the traditional approach. STaaS customers need not procure any hardware. Instead, they purchase cloud-based storage services for a monthly fee. In other words, they can shift their spending from CapEx to OpEx. With no up-front purchase costs, and monthly charges that may well be lower than the operating costs associated with on-premises installations, companies that turn to the cloud for their storage needs can significantly reduce both their up-front and ongoing expenses.
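A simplified comparison shows the shape of that CapEx-to-OpEx shift. Every figure below is an invented placeholder; real pricing varies widely with capacity, performance, and term.

```python
# Simplified three-year cost comparison: buy hardware up front (CapEx)
# versus pay a monthly service fee (OpEx). All figures are placeholders.

capex_purchase = 120_000      # up-front hardware purchase
capex_annual_opex = 15_000    # power, space, and support per year
staas_monthly_fee = 3_500     # pay-as-you-go storage service fee

years = 3
capex_total = capex_purchase + capex_annual_opex * years
staas_total = staas_monthly_fee * 12 * years

print(f"{years}-year CapEx model: ${capex_total:,}")   # $165,000
print(f"{years}-year STaaS model: ${staas_total:,}")   # $126,000
# STaaS also avoids the forecasting risk of buying capacity in advance.
```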

Customers who wish to retain all or part of their data storage in-house for security or performance reasons can also lower their costs by partnering with the right STaaS vendor. Zadara Storage, for example, offers an On-Premises-as-a-Service (OPaaS) solution that implements a private cloud on site. Zadara installs its own hardware/software resources at the customer’s location, and monitors, maintains, and updates them remotely. With no more than a six-month commitment, the customer receives the same OpEx pay-as-you-go storage services as they would through the public cloud.

 

How Cloud Storage Helps Make Companies More Flexible and Agile

Normally, the storage hardware in corporate data centers is scheduled to be replaced every three to five years. That time frame is based on the rate at which storage units are expected to reach end-of-life, and also on the rigidity of the capital funding process, which requires that spending is planned well in advance. But such long refresh cycles are simply incompatible with the speed of change in both technology and business conditions that characterizes today’s environment. If, for example, a client experiences a sudden surge in the number of online visitors to its website, having the site slow down significantly, or even go down completely because of insufficient storage capacity to handle the demand, would be a major catastrophe.

Storage administrators attempt to ensure that they always have sufficient storage capacity to accommodate unexpected surges in demand by over-provisioning. That is, they purchase and keep on hand extra storage units that normally sit idle but are available to be brought online when needed. Not only is this an expensive approach, tying up capital in equipment that may never be used, it is also unreliable.

Due to the lead time required for allocating capital funds, the amount of storage equipment that will be needed at any particular time must be forecast months in advance. Such predictions have not proved to be notably accurate. The result is that in a business environment characterized by numerous rapid changes that could not be anticipated in advance, companies that depend on the over-provisioning strategy may find themselves unable to respond quickly to the new opportunities or problems they may encounter at any time.

With STaaS there need be no delay in bringing additional capacity online as needed. In fact, extra capacity can be added automatically and almost instantaneously through software. A top-flight STaaS provider, such as Zadara, will ensure that the physical storage resources to meet even sudden and unexpected increases in demand are always available.




One MSP’s Experience Offering STaaS To Its Clients

Netrepid is a service provider offering colocation infrastructure and application hosting services, working side by side with clients in a wide variety of industries to accelerate their technology evolution from the ground to the cloud.

Recognizing that the ever-shifting storage requirements of their customers demanded not only substantially lower costs but also a level of flexibility and agility that only a cloud-based solution could provide, Netrepid decided to partner with Zadara to offer a top-flight STaaS solution. In fact, they incorporated Zadara’s On-Premises-as-a-Service offering.

According to Chris Jones, Infrastructure Architect at Netrepid, their partnership has greatly improved their business model and performance. He states, “Overall, we are seeing 80% better performance with Zadara than with our prior storage solution.”

 

STaaS May Be The Solution Your Clients Need

Many SMBs lack the expertise and the confidence to evaluate on their own whether the cloud is a viable option to meet their storage needs. That’s where a good MSP can step in to help clients understand how a best-of-breed cloud storage solution can help propel their business to the next level.


Partnering with a first-class STaaS provider enables you to provide your customers with a cost-effective, enterprise-grade storage solution. To learn more, join the Zadara Partner Network and download the Zadara Storage Managed Services white paper.


 

April 20, 2017

Posted In: Tech Corner
