Zadara Blog

News, information, opinion and commentary on issues affecting enterprise data storage and management.

Bring Cold Object Storage to Your Private Cloud

In today’s computing environment, more and more companies are beginning to work with massive datasets, ranging into the hundreds of petabytes and beyond. Whether it’s big data analytics, high-definition video, or internet-of-things applications, the necessity for companies to handle large amounts of data in their daily operations continues to grow.

Historically, enterprises have managed their data as a hierarchy of files. But this approach is simply inadequate for efficiently handling the huge datasets that are becoming more and more common today. For example, public cloud platforms such as Amazon Web Services (AWS) and Microsoft Azure, which must service many thousands of users simultaneously, would quickly become intolerably unresponsive if every user data request meant traversing the folders and subfolders of multiple directory trees to find and collect the information needed for a response.

That’s why modern public cloud platforms, and other users of big data, use object storage in place of older file systems. And as the use of private clouds grows, they too are employing object storage to meet the challenges of efficiently handling large amounts of data.

What Is Object Storage?

With object storage, there is no directory tree or folders. Instead, there is a flat global namespace that allows each unit of stored data, called an object, to be directly addressed.

Each object contains not only data, but also metadata that describes the data, and a global ID number that uniquely identifies that object. This allows every object in the storage system, no matter where it might be physically stored, to be quickly retrieved simply by providing its unique identifier.
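
To make the model concrete, here is a minimal sketch of that flat namespace in Python: one lookup table keyed by unique IDs, with metadata stored alongside the data. The class and its fields are illustrative only; a real object store distributes this structure across many nodes.

```python
import uuid

# A minimal sketch of the object model described above: a flat namespace
# where each object carries data, descriptive metadata, and a unique ID.
# (Illustrative only -- real object stores spread this across many nodes.)

class FlatObjectStore:
    def __init__(self):
        self._objects = {}   # one flat namespace: ID -> object, no directory tree

    def put(self, data: bytes, metadata: dict) -> str:
        object_id = str(uuid.uuid4())            # globally unique identifier
        self._objects[object_id] = {"data": data, "metadata": metadata}
        return object_id

    def get(self, object_id: str) -> bytes:
        return self._objects[object_id]["data"]  # direct lookup, no tree traversal

    def search(self, **criteria):
        """Find object IDs by metadata alone, without touching the data."""
        return [oid for oid, obj in self._objects.items()
                if all(obj["metadata"].get(k) == v for k, v in criteria.items())]

store = FlatObjectStore()
oid = store.put(b"...video bytes...", {"type": "video", "codec": "h264"})
print(store.get(oid), store.search(type="video"))
```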

Why Object Storage Is Well Suited to Private Clouds

When it comes to handling massive datasets in a cloud environment, object storage has a number of unique advantages. Let’s take a look at some of these:

  • It’s infinitely scalable. Because of its flat namespace, an object storage system can theoretically be scaled without limitation simply by adding objects, each with its own unique ID.
  • Metadata makes searching easy. The metadata that accompanies each object provides critical information about the object’s data, making it easy to search for and retrieve needed data quickly and efficiently without having to analyze the data itself.
  • It’s highly robust and reliable. Under the hood, VPSA Object Storage replaces traditional RAID redundancy with a distributed “Ring” topology; customers choose a 2-way or 3-way replication policy at creation time. Because data is protected by continuous, efficient replication across multiple nodes rather than by RAID, the system automatically keeps redundant copies and can quickly rebuild data that is destroyed or corrupted. Nodes can be added or removed at will, and Swift’s underlying Ring replication ensures that objects on new nodes are incorporated, and lost ones rebuilt, automatically and transparently (a sketch of this ring placement follows this list).
  • It simplifies storage management. The metadata of an object can contain as much (or as little) information about the data as desired. For example, it could specify where the object is to be stored, which applications will use it, the date when it should be deleted, or what level of data security is required. Having this degree of detail available for every object allows much of the data management task to be automated in software.
  • It lowers costs. Object storage systems don’t require expensive specialized storage appliances, but are designed for use with low-cost commodity disk drives.
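
Below is a minimal sketch of how ring-style placement of the kind mentioned above can work: object IDs are hashed onto a ring of nodes, and replicas land on the next distinct nodes around the ring. The node names, replica count, and use of MD5 are illustrative assumptions, not Zadara’s or Swift’s actual implementation.

```python
import hashlib
from bisect import bisect_right

class Ring:
    """Consistent-hash ring: maps object IDs to the nodes holding replicas."""

    def __init__(self, nodes, replicas=3, vnodes=64):
        self.replicas = replicas
        self._points = []                          # sorted (hash, node) pairs
        for node in nodes:
            for i in range(vnodes):                # virtual nodes smooth the spread
                self._points.append((self._hash(f"{node}-{i}"), node))
        self._points.sort()

    @staticmethod
    def _hash(key):
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def nodes_for(self, object_id):
        """Walk clockwise from the object's hash, collecting distinct nodes."""
        start = bisect_right(self._points, (self._hash(object_id),))
        chosen, i = [], start
        while len(chosen) < self.replicas:
            _, node = self._points[i % len(self._points)]
            if node not in chosen:
                chosen.append(node)
            i += 1
        return chosen

ring = Ring(["node-a", "node-b", "node-c", "node-d"])
print(ring.nodes_for("9f1c-object-uuid"))  # e.g. ['node-c', 'node-a', 'node-d']
```

Adding or removing a node only reshuffles the objects adjacent to it on the ring, which is why nodes can come and go without a system-wide rebuild.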

Zadara VPSA Object Storage

Zadara offers an object storage solution that incorporates all the advantages discussed above, and then some. VPSA Object Storage is specifically designed for use with private as well as public clouds. It is especially suited to storing relatively static data such as big data or multimedia files, or for archiving data of any type. VPSA Object Storage provides anytime, anywhere, any-device remote access (with appropriate access controls) via HTTP.

The VPSA Object Storage solution, which is Amazon S3 and OpenStack Swift compatible, features frequent, incremental, snapshot-based, automatic data backup to object-based storage, eliminating the need to have separate backup software running on the host.
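
Because VPSA Object Storage is Amazon S3 compatible, standard S3 tooling such as boto3 should be able to talk to it. Here is a hedged sketch; the endpoint URL, bucket name, keys, and credentials are placeholders, not real values.

```python
import boto3

# Point a standard S3 client at an S3-compatible endpoint
# (hypothetical URL and credentials shown for illustration).
s3 = boto3.client(
    "s3",
    endpoint_url="https://vpsa-object-storage.example.com",
    aws_access_key_id="YOUR_ACCESS_KEY",
    aws_secret_access_key="YOUR_SECRET_KEY",
)

# Store an object along with descriptive metadata...
s3.put_object(Bucket="archive", Key="logs/2017-10-10.gz",
              Body=b"compressed log data",
              Metadata={"retention": "7y", "source": "web-tier"})

# ...and retrieve it from anywhere over HTTP by bucket and key.
obj = s3.get_object(Bucket="archive", Key="logs/2017-10-10.gz")
print(obj["Metadata"], obj["Body"].read()[:20])
```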

If you would like to explore how Zadara VPSA Object Storage can help boost your company’s private cloud, please contact us.

October 10, 2017

Posted In: Industry Insights

Challenges MSPs Face as Customers Move to the Cloud

The face of the MSP (managed IT services provider) marketplace is changing rapidly. Not so long ago the keys to success for most MSPs revolved around recommending or selling the newest and best hardware and software products to their customers. But as more and more companies migrate to the cloud, that approach is no longer adequate.

The Cloud’s XaaS Model Changes Everything for MSPs

Perhaps the most important feature of the cloud model is that it allows customers to meet many, if not all, of their IT requirements by making use of pay-as-you-go services offered by cloud providers. This “anything as a service” (XaaS) approach reduces, or in some cases totally eliminates, the necessity of purchasing specific hardware/software solutions. For example, many companies no longer meet their document processing needs by installing Microsoft Office on their computers. Instead they simply subscribe to Office 365 and receive the services they need through the cloud.


In today’s IT environment customers aren’t looking for products, but for solutions. That means MSPs must now demonstrate that they provide a unique value proposition for customers who can theoretically go directly to a CSP (cloud service provider) to obtain almost any type of IT service they might need.

Yet the good news for MSPs is that customers aren’t really looking for services – they’re looking for solutions to the business issues they face. As IT business coach Mike Schmidtmann puts it, “Cloud is a business conversation, not a price-and-product conversation.”

So, the MSPs that survive and thrive in the age of the cloud will be those who shift away from simply offering specific products, and move toward providing strategic IT solutions that help their customers realize their business objectives.

A Good MSP Will Help Customers Develop an IT Strategy Based on Business Goals

Most MSP clients are not interested in IT per se. Their focus is on using IT effectively to enhance their business operations. So, the first service a cloud-savvy MSP can provide to their customers is to help them develop a comprehensive IT strategy that is closely aligned with the company’s business objectives. In effect, the MSP will seek to become an extension of the customer’s own IT staff, providing a depth of expertise and operational capability that would be very difficult for the customer to maintain in-house.

Once armed with a good understanding of the customer’s business goals, an MSP can help map out the strategy that will support those objectives. So, the first conversations between MSPs and their customers shouldn’t be about specific solutions, but about the goals the customer is pursuing for both the present and the future of its business.


Service Provider Success Story:

“Overall, we are seeing 80% better performance with Zadara Storage than with our prior storage solution.” — Chris Jones, Infrastructure Architect at Netrepid

Read the Case Study


A Good MSP Will Identify Specific Cloud Solutions That Meet Customer Needs

A recent CompTIA survey reveals that many companies, especially smaller ones, have a great deal of difficulty in aligning their IT infrastructure with their business strategy. They simply don’t have the in-house technological expertise to do so effectively. John Burgess, president of an MSP in Little Rock, AR, says that such companies are “usually fairly ad hoc and reactionary in how they manage and spend technology.”

Here’s where the added value an MSP partner can provide becomes clearly evident. A good MSP can help identify the specific available cloud services that best fit the customer’s business strategy. In doing so, the MSP will be looking not just at individual services and the CSPs that offer them, but at how those services can be integrated into a unified system that can be effectively managed as a single solution.

A Good MSP Will Manage the Customer’s Cloud Infrastructure

Perhaps the most important service a good MSP can offer is to relieve customers of the burden of having to worry about their IT operations. This involves the capability to initially put the system in place, to monitor its operations on a 24/7/365 basis, and to proactively handle problem resolution and upgrades to system components.

A Good MSP Will Establish Relationships With Expert Partners

Few MSPs have the resources to develop and maintain in-house the kind of comprehensive cloud expertise required to fully support their customers. Most will benefit from having specialized expert partners that can support the MSP in the services they offer to customers.

A good example of such a partner is Zadara Storage. As a storage-as-a-service (STaaS) provider, Zadara offers a high level of expertise in all elements of storage, whether in the public cloud, private clouds, or customers’ on-premises data centers. In fact, Zadara’s VPSA Storage Arrays are already installed in the facilities of major public cloud platforms such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP), and are available for installation on customer premises as the basis of a private or hybrid cloud solution.

Whether the VPSA Storage Arrays they use are in the cloud, on-premises, or both, Zadara customers never buy storage hardware. Instead, they purchase storage services, paying a monthly fee for only the amount of storage they actually use during that billing period.



October 4, 2017

Posted In: Industry Insights

How a Multi-Cloud Strategy Can Benefit MSPs

Businesses of all sizes are moving to the cloud in ever-increasing numbers. MSPs (Managed IT Services Providers) are recognizing that if they don’t want to be left behind, they’ve got to lead the way. That’s why most successful MSPs today are committed to providing their customers with a comprehensive array of services delivered through the cloud.

But for an MSP to provide the highest levels of cloud-based services, it’s not enough to develop expertise with any particular cloud platform. All the major cloud service providers (CSPs), such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP), offer a common set of basic features. Yet, they also differ from one another significantly in the types of services each is best suited to provide. That’s why deriving the maximum benefit from the cloud model requires the ability to take advantage of the best that each individual cloud platform has to offer.

In other words, to get the most out of the cloud, you need a multi-cloud strategy.

The multi-cloud approach provides some important advantages to MSPs, both in the services they can offer their customers, and in terms of their own operations. Let’s take a look at some of these benefits.


Benefits to MSP Customers

Match Workloads to the Most Suitable Platforms

All the major clouds provide similar suites of basic services. Yet each is optimized for different types of workloads. For example, if your customer is running Windows client apps, Microsoft Azure is a natural fit. If they are doing big data analytics, GCP might be a better choice. Part of your job as a multi-cloud MSP is to help your customers determine the best cloud platform for each of their workloads.

Avoid Vendor Lock-In

A good MSP will work with clients to ensure that their workloads are portable between platforms. That way, if a client becomes dissatisfied with a particular platform for any reason, their options won’t be limited by the prospect of a costly and time-consuming migration to another cloud.

Reduce Costs

Each CSP provides different service plans, at different price points, for each set of features it offers. Part of what a cloud-savvy MSP can offer clients is the ability to distribute specific workloads among the various cloud platforms, not only to take advantage of what each cloud does best, but also to get the best pricing for exactly the services the client needs.


Service Provider Success Story:

“Zadara Storage is a key reason that our solution can outperform the competition.” — David Benson, Chief Technology Officer and Co-founder, BeBop Technology

Read the Case Study


Enhance Data Security

By replicating data (and even virtual servers) among different clouds, MSPs can offer a high level of backup/recovery and disaster recovery services to clients. A disruption at one location can immediately trigger failover to either another zone or to an entirely different cloud platform.
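
A minimal sketch of that replication idea follows, assuming both clouds expose an S3-compatible API. The endpoint URL and bucket names are hypothetical; a production setup would also handle incremental sync, large objects, and error retries.

```python
import boto3

# Primary cloud (e.g. AWS itself, using the environment's credentials)
primary = boto3.client("s3")
# Replica on a different cloud, via its S3-compatible endpoint (placeholder URL)
replica = boto3.client("s3", endpoint_url="https://backup-cloud.example.com")

def replicate_bucket(bucket: str, target_bucket: str):
    """Copy every object in `bucket` on the primary cloud to the replica cloud."""
    paginator = primary.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket):
        for item in page.get("Contents", []):
            body = primary.get_object(Bucket=bucket, Key=item["Key"])["Body"].read()
            replica.put_object(Bucket=target_bucket, Key=item["Key"], Body=body)

replicate_bucket("prod-data", "prod-data-replica")
```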

Benefits to the MSP Itself

Increase Your Ability to Meet SLA Requirements

MSPs are usually bound by a Service Level Agreement (SLA) that provides a specific up-time guarantee. A multi-cloud strategy helps MSPs meet SLA up-time requirements by allowing operations to be quickly and transparently shifted from a CSP that is experiencing an outage to a different platform.
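
Here is a minimal sketch of that failover logic: probe the active platform’s health endpoint and shift traffic to a standby when it stops responding. The URLs are placeholders, and real failover would typically be driven by DNS or a load balancer rather than application code.

```python
import requests

# Hypothetical health-check endpoints for the same app on two platforms
ENDPOINTS = {
    "aws":   "https://app.aws-region.example.com/health",
    "azure": "https://app.azure-region.example.com/health",
}

def pick_active(preferred="aws"):
    """Return the first healthy platform, preferring the primary."""
    order = [preferred] + [n for n in ENDPOINTS if n != preferred]
    for name in order:
        try:
            if requests.get(ENDPOINTS[name], timeout=2).status_code == 200:
                return name
        except requests.RequestException:
            continue            # platform unreachable; try the next one
    raise RuntimeError("no healthy platform available")

print("serving traffic from:", pick_active())
```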

Keep Up With Technological Advances

The major cloud platforms are quite competitive with one another. Each works hard to introduce new or improved features that are not available through its competitors. But multi-cloud MSPs are able to tap into innovations introduced by any of the CSPs with whom they work.

Extend Your Expertise

A multi-cloud approach can substantially reduce the degree to which an MSP is required to be a technical jack of all trades. Instead of maintaining in-house experts for a wide range of solutions, MSPs can leverage the expertise of the various CSPs to offer platform-specific services to clients.

How Zadara Can Help With a Multi-cloud Strategy

The Zadara Storage Cloud is ideal as part of a multi-cloud solution. With Zadara VPSA Storage Arrays installed both on customer premises and connected to the major cloud providers such as AWS, Azure, and GCP, data can be seamlessly and transparently replicated among various public and private cloud platforms. For more detail, download the white paper Getting Great Performance in the Cloud.



September 21, 2017

Posted In: Industry Insights

Why Companies Adopt Both Public and Private Clouds

More and more companies are basing significant portions of their IT infrastructure in the cloud. According to the RightScale 2017 State of the Cloud Survey of IT professionals, a full 95 percent of respondents said that their companies have adopted the cloud as an integral part of their IT operations. For some of those companies, the focus is on the public cloud; for others it’s on an in-house private cloud. The majority make use of both public and private clouds.

What is it about public and private clouds that causes so many companies to be drawn to them? Let’s take a look at the benefits each of these cloud models offer to businesses today.

The Benefits of the Cloud

It was not that long ago that the standard approach to IT in most companies was to build and maintain their own in-house datacenters. But the cloud computing model has brought about a fundamental shift in the way businesses seek to meet their IT needs. No longer must companies devote scarce capital (CapEx) funds to the purchase of their own servers, storage, and networking hardware. Instead, the cloud model encourages them to purchase IT services on a pay-as-you-go basis for a monthly fee.

Customers pay only for the services that they actually use. The cloud platform provider is responsible for acquiring, supporting, and upgrading the required hardware and software as necessary, and for ensuring that a sufficient amount of these resources is always available to allow on-demand provisioning and scaling. The result is that the cloud model offers companies lower overall costs, greater flexibility and agility, rapid deployment of applications, and a substantial reduction in the amount of expert staff required to manage the organization’s IT infrastructure.

How Public and Private Clouds Differ From One Another

Public cloud platforms, such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP) are, as the name implies, open to everyone. They operate on a multi-tenancy model in which hardware and software resources are shared among a number of different customers. This allows the public cloud to realize economies of scale that drive down costs for all users.

Private clouds, on the other hand, are built on a single-tenancy model. That means they are devoted exclusively to one customer, and there is no sharing of resources. Private clouds can be implemented either in a company’s on-premises datacenter using its own hardware, in an external facility run by a trusted partner such as a managed services provider (MSP), or even, in some cases, with dedicated resources in the facilities of a public cloud provider. The key is that a private cloud is isolated to a single customer, and there is no intermingling of that customer’s hardware/software resources or data with those of other customers.

Advantages of the Public Cloud

Because of its large multi-tenant user base, a public cloud platform can normally provide IT services at a lower cost than a private cloud could achieve. Costs are also reduced by the fact that customers have no responsibility for purchasing, housing, supporting, or managing hardware. The result is that workloads can be deployed on a public cloud platform more quickly and inexpensively than would be the case with a private cloud.

Advantages of a Private Cloud

The main driver in the decision of many companies to make use of a private cloud is the desire to retain maximum control over business-critical data. Although public clouds now provide the highest levels of data protection, the multi-tenant nature of such platforms, and the fact that they are designed to allow access by users around the world, present a level of perceived vulnerability that many companies are not comfortable with. Plus, businesses in certain industries face strict regulatory compliance obligations, such as those imposed by the Health Insurance Portability and Accountability Act (HIPAA). With a private cloud, all of a company’s data can remain safely hidden behind the organization’s own firewall, totally inaccessible to outsiders.

The ability to tailor a private cloud to the exact requirements of a company’s specific workloads may also provide performance advantages over what could be achieved with a public cloud platform.

The Zadara Storage Solution Spans Both Public and Private Cloud Platforms

The Zadara Storage Cloud provides a common storage solution for both public and private clouds. Its VPSA Storage Arrays support each of the major public cloud platforms such as AWS, Azure, and Google Cloud Platform (GCP). They also form the basis of many private cloud implementations. The Zadara Storage architecture also provides resource isolation, so users gain the benefits of multi-tenant public clouds, but with the security and predictable performance of a private cloud. Whether they use the public cloud, a private cloud, or a hybrid combination of the two, Zadara customers receive all the benefits of the cloud model, including paying a monthly fee for just the amount of storage they actually use. And Zadara takes on the responsibility to monitor and support the customer’s storage, whether on-site or in the public cloud.

If you would like to know more about how Zadara can help you develop a comprehensive cloud solution for your company, please download the ‘Zadara Storage Cloud’ whitepaper.

September 13, 2017

Posted In: Industry Insights

Developing Your Multi-Cloud Strategy

If your company is like most, sooner or later you are going to be involved with multi-cloud computing. The RightScale 2017 State of the Cloud Survey reveals that 85 percent of enterprises already have a multi-cloud strategy, and the percentage is increasing every year. According to Bernard Golden, CEO of Navica, the future of corporate IT is clear: “All enterprises will use multiple cloud providers, and they must plan for how they will operate in a multi-cloud environment.”

If your business doesn’t yet have a thoroughly thought out multi-cloud strategy, it’s time to start developing one.

Why You Need a Multi-cloud Strategy

Although all the major public cloud platforms, such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP), offer a full range of cloud services, they still differ from one another in significant ways. As Navica’s Bernard Golden puts it, “Each cloud provider implements core computing functionality quite differently, and each offers quite different services within each category.” For example, Microsoft’s Azure is a natural fit for deploying Windows client apps, while when it comes to data storage, AWS offers a breadth of services the other platforms don’t yet match.

Guiding Principles for a Good Multi-Cloud Strategy

Perhaps the most vital principle for managing a multi-cloud environment is that the entire system must be managed as a single entity. Although each cloud platform has its own native management console, users should never be required to use different procedures for each cloud – and they certainly shouldn’t be expected to log into different clouds depending on the application they may be running. Administrators should be able to manage the functionality of the system through a “single pane of glass” interface that doesn’t change depending on which cloud platform is providing a particular set of services or applications.
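
One common way to achieve that “single pane of glass” is a thin abstraction layer in front of per-platform drivers, so operators invoke one procedure regardless of which cloud serves the request. The sketch below is illustrative only; the driver classes and method names are hypothetical, not a real product API.

```python
# A minimal sketch of a unified management facade over multiple clouds.
# Driver classes and methods are hypothetical placeholders.

class CloudDriver:
    """Interface each platform-specific driver must implement."""
    def list_volumes(self): ...
    def create_volume(self, size_gb): ...

class AwsDriver(CloudDriver): ...      # would wrap the AWS SDK
class AzureDriver(CloudDriver): ...    # would wrap the Azure SDK

class MultiCloudConsole:
    def __init__(self, drivers: dict[str, CloudDriver]):
        self.drivers = drivers

    def create_volume(self, cloud: str, size_gb: int):
        # Identical call for the operator, whichever platform fulfills it
        return self.drivers[cloud].create_volume(size_gb)

console = MultiCloudConsole({"aws": AwsDriver(), "azure": AzureDriver()})
console.create_volume("aws", 100)   # one procedure for every cloud
```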

Assess Your Environment

The core of a good multi-cloud strategy is matching workloads with the platforms best able to run them efficiently and cost effectively. That means having a good understanding of the operational characteristics of the workloads you currently run. So, in developing your multi-cloud strategy, you’ll want to start by doing a comprehensive assessment of your current environment, taking note of factors such as data protection needs, performance demands, and the particular services required by each workload.

Prepare Your Organization

In developing your multi-cloud strategy, you’ll need to assess not just your workloads, but your IT organization as well. Although users may not need to be aware of which cloud is actually servicing their applications, your IT staff will need to know. In fact, they’ll need to have the platform-specific expertise required to develop, install and manage workloads in each cloud. As Mary McCoy, Demand Generation Programs Manager at Continuum, puts it, “IT organizations need to develop deep skills in each of the cloud providers they will use.”

That means your multi-cloud strategy should include provisions for either training your own staff, or for partnering with a third party services provider that can provide the required expertise.

Develop Your Data Protection Strategy

One of the most vital components of your multi-cloud strategy will be defining how data protection will be enforced across clouds and for the system as a whole. While each cloud platform may have its own approach to data protection, it is absolutely critical that your organization’s data access, backup/restore, security, and regulatory compliance policies be uniformly applied across the system. Take extreme care that, in attempting to mesh the security approaches of different clouds, you don’t open unforeseen holes that aggressive intruders can exploit.

Develop Policies

A major foundation for implementing a viable multi-cloud approach is the use of software-defined storage. Only with SDS can the environment be managed through software from a single point of control. The SDS software, in turn, is guided by a set of policy directives developed by administrators that define how the system as a whole, as well as each individual component, should function. Your multi-cloud plan should specify the overall guidelines that will be applied when your policies are developed in detail.
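
As a rough illustration of policy-driven SDS, the sketch below expresses directives as data and applies them in software. The policy fields, workload names, and the `sds` handle with its methods are illustrative assumptions, not an actual SDS interface.

```python
# A minimal sketch of policy directives driving SDS provisioning.
# Field names, values, and the `sds` object are hypothetical.

POLICIES = {
    "finance-db":    {"replicas": 3, "tier": "all-flash", "snapshot_every_min": 15},
    "media-archive": {"replicas": 2, "tier": "sata",      "snapshot_every_min": 1440},
}

def provision(workload: str, sds):
    """Apply the matching policy instead of hand-configuring each platform."""
    policy = POLICIES[workload]
    vol = sds.create_volume(tier=policy["tier"], replicas=policy["replicas"])
    sds.schedule_snapshots(vol, every_minutes=policy["snapshot_every_min"])
    return vol
```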

Simplify Your Multi-Cloud Environment

The key to a successful multi-cloud implementation is being able to manage a complex system encompassing different cloud platforms, each with its own distinct characteristics, as a single entity. That can only be accomplished by instituting a software-defined environment that is guided by well thought out policy directives.

The Zadara Storage Cloud is a good example of an SDS solution that facilitates that kind of unified control. With Zadara VPSA Storage Arrays installed both at customer sites and in the facilities of major cloud providers like AWS, Azure, and GCP, intercommunication and data transfer between clouds can be managed automatically and transparently through software. Functions such as remote replication, mirroring, and application failover between clouds can be applied system-wide without users having to be concerned with the specific protocols of each platform.

If you’d like to know more about how the multi-cloud capabilities of the Zadara Storage Cloud can work for you, please download our Getting Great Performance in the Cloud white paper.

August 16, 2017

Posted In: Industry Insights

How to Meet SLA Uptime Commitments: Managed Service Providers (Hint: It Starts with SDS)

How to meet SLA uptime commitments: it’s something organizations often struggle with, and something any service-based company strives to get right.

Most businesses are operating in an around-the-clock world. Because of that, companies must be able to interact with customers or prospects on a 24/7 basis.

But when that doesn’t happen, the costs can be high. According to a report by consulting firm IDC, small and medium-sized businesses (SMBs) lose between $137 and $427 for every minute of IT downtime.

It’s no surprise, then, that businesses that entrust the support of their IT infrastructure to a Managed Services Provider usually require that the MSP ensure a high level of availability for the customer’s IT operations. Those assurances generally take the form of an uptime guarantee in the Service Level Agreement (SLA).

For example, a 90 percent uptime guarantee would compel the MSP to ensure that the client’s IT operations will be down no more than 10 percent of the time. But MSPs will almost never encounter an uptime requirement that generous. At 90 percent availability, the client’s operations could be offline for up to 36.5 days a year. In today’s 24/7/365 business environment, very few companies would agree to that.

According to Terri McClure, a senior analyst at Enterprise Strategy Group, uptime guarantees of 99.9 percent are common. That’s called a “three nines” level of availability. But even that high level allows for 8.76 hours of downtime per year. For businesses that are dependent on their IT operations, that amount of downtime is still far too much. That’s why many MSPs are attempting to gain a competitive advantage by offering their clients higher availability guarantees. With a “four nines” (99.99 percent) availability commitment, downtime shrinks to 52.56 minutes per year, while a “five nines” guarantee would allow for no more than 5.26 minutes of downtime in a year.
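
The figures above follow from simple arithmetic: a year has 365 × 24 × 60 = 525,600 minutes, and an N% guarantee allows (100 − N)% of that as downtime. A few lines of Python confirm the numbers:

```python
MINUTES_PER_YEAR = 365 * 24 * 60   # 525,600

for availability in (90.0, 99.9, 99.99, 99.999):
    allowed = MINUTES_PER_YEAR * (100 - availability) / 100
    print(f"{availability}% uptime -> {allowed:,.2f} min/year ({allowed / 60:.2f} h)")

# 90%     -> 52,560 min (36.5 days)
# 99.9%   -> 525.60 min (8.76 hours)   "three nines"
# 99.99%  -> 52.56 min                 "four nines"
# 99.999% -> 5.26 min                  "five nines"
```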

How can an MSP offer clients an SLA with such high availability guarantees without exposing itself to severe penalties if the standard isn’t met? Software-Defined Storage (SDS) offers the most viable answer to that question.

Traditional Storage Becomes Less Reliable As It Grows

The SAN and NAS storage solutions typically used in traditional data centers do not always support today’s business environment. Such systems employ dedicated, proprietary, and expensive storage devices that are designed to be highly reliable at the hardware level. Overall system reliability is keyed to the low failure rates that are assumed to characterize each individual storage device.

The fact that traditional storage systems scale in capacity by adding more and more of these high-availability storage units makes those systems less reliable as they grow. The failure of a single storage array or controller won’t necessarily disrupt the system as a whole. But as more and more storage devices are added to accommodate the exponentially growing capacity demands companies are now experiencing, the number of failure points also grows, and the likelihood of multiple simultaneous failures increases. Meeting SLA uptime commitments only gets harder as capacity grows.

In addition, the design of traditional storage solutions often requires that they be deliberately taken offline for maintenance, hardware upgrades, or software updates.

The result of these factors is that as traditional storage systems grow and evolve, the likelihood that they will experience significant downtime rises sharply.

SDS Treats Hardware Failures as Inevitable and Expected

SDS, on the other hand, is designed with failure in mind: rather than struggling to meet SLA uptime commitments in spite of failures, it plans for them. In an SDS implementation, the intelligence of the system resides not in the hardware, but in software. Because SDS is designed to allow the use of inexpensive commodity disk drives in place of the costly dedicated storage hardware that characterizes traditional storage, it assumes that individual devices will fail relatively frequently. The software can be configured to quickly and transparently compensate for such failures by employing sophisticated storage management features such as automatic data replication, mirroring, deduplication, snapshots, and the ability to essentially hot-swap storage devices.

In other words, SDS implementations are designed to be inherently self-healing.
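
The core of that self-healing behavior can be sketched as a background repair loop: detect under-replicated objects and rebuild lost copies from surviving replicas. The cluster object and its methods below are hypothetical, a sketch of the idea rather than any product’s actual code.

```python
import time

TARGET_REPLICAS = 3

def repair_loop(cluster):
    """Continuously re-replicate data whose copies live on failed nodes."""
    while True:
        for obj in cluster.objects():
            live = [r for r in obj.replicas if cluster.node_is_healthy(r.node)]
            missing = TARGET_REPLICAS - len(live)
            if missing > 0:
                # Rebuild lost copies from a surviving replica onto healthy
                # nodes, transparently to the applications using the storage.
                cluster.replicate(obj, from_replica=live[0], count=missing)
        time.sleep(30)   # keep scanning in the background
```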

SDS Provides Greater Reliability Than Traditional Storage

A top-of-the-line SDS offering, such as the Zadara Storage Cloud, allows MSPs to offer more aggressive uptime guarantees than would be prudent with traditional storage. For example, the Zadara Storage VPSA Storage Array solution is designed from the ground up for High Availability (HA). It provides capabilities for both on-premises and remote mirroring of data, and for asynchronous replication of snapshots to geographically remote VPSAs. The Zadara Multi-Zone HA option allows automatic, real-time failover across widely separated locations. And with its multi-cloud capability, Zadara enables automatic, transparent failover between different clouds, such as Amazon Web Services, Microsoft Azure, and Google Cloud Platform.

Because of these unique capabilities, Zadara is able to offer a storage SLA with a 100 percent uptime guarantee. By partnering with Zadara, MSPs can assure clients of the highest levels of data availability at an affordable price.

If you’d like to know more about how Zadara’s SDS solution can help you meet the stringent uptime requirements modern SLAs demand, please download the ‘Zadara Storage Cloud’ whitepaper.

May 9, 2017

Posted In: Industry Insights

Software-Defined Storage Durability: Why You Can’t Get It Wrong

When it comes to woodworking, the adage is “measure twice, cut once.” But when I applied that advice to a project this weekend, I still got it wrong. It came down to the fact that my tape measure was missing the 50-inch mark.

Yet when it comes to Software-Defined Storage, a question customers keep asking is “what if I get it wrong?” Without resorting to the flippant response, “with SDS, you can’t,” I like to explain that unlike appliances, which force you down a rigid storage deployment model and require extreme upfront due diligence, Software-Defined Storage provides the flexibility to change mid-stream to meet your ever-changing application requirements. With Zadara, you can build an all-flash array, an all-SATA data repository, or a hybrid configuration that gives you mid-tier performance at a low entry price.

In this Tech Tip Tuesday, we examine how to create and test different durability levels that are available as part of our SDS toolbox. With three drives, I show you how to build a RAID group that will meet your durability and storage efficiency metrics.
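
As a rough guide to the tradeoff the video walks through, here is the standard durability-versus-efficiency arithmetic for three drives. The layouts shown are generic RAID math for illustration, not output from Zadara’s tooling.

```python
# Durability vs. storage efficiency for a 3-drive group (standard RAID math).

DRIVES = 3

layouts = {
    "RAID-0 (stripe)": {"data_drives": 3, "failures_tolerated": 0},
    "RAID-5 (parity)": {"data_drives": 2, "failures_tolerated": 1},
    "3-way mirror":    {"data_drives": 1, "failures_tolerated": 2},
}

for name, l in layouts.items():
    efficiency = l["data_drives"] / DRIVES
    print(f"{name}: {efficiency:.0%} usable capacity, "
          f"survives {l['failures_tolerated']} drive failure(s)")
```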

Click here to watch the recording on this how to.

Click here to sign up for a future Tech Tip Tuesday.

February 14, 2017

Posted In: Tech Corner

What are the Differences Between Software-Defined Storage and Traditional SAN & NAS?

If you’ve paid any attention to the IT blogs, news magazines, and tech journals, you’ve likely noticed the trend toward everything being ‘software defined’. You’ve probably heard of software-defined data centers, software-defined networking, and more. Software-defined storage marks a revolutionary new way of building and managing storage — one which costs less, yet is more agile, more reliable, and better supports multi-tenancy. So, what are the differences between software-defined storage and traditional SAN and NAS?

Continue reading What are the Differences Between Software-Defined Storage and Traditional SAN & NAS?

April 6, 2016

Posted In: Tech Corner

Managing Disaster in The Public Cloud

Despite the best intentions of the public cloud infrastructure providers, issues do happen from time to time. Some problems are more significant than others, as we saw with the recent Amazon Web Services (AWS) DynamoDB outage (details here). The impact to AWS customers was widespread and although it would be easy to dismiss the affected services as trivial (Netflix, IMDB, Tinder, Buffer), these are still clients running production workloads. Do a quick Google search and it’s easy to find many other similar examples, including this one affecting Microsoft’s Azure Storage Service.

Continue reading Managing Disaster in The Public Cloud

October 15, 2015

Posted In: Tech Corner

How the Cloud Disrupts Traditional Enterprise Storage

Traditionally, enterprise IT had to buy their own expensive hardware. Then, the cloud came along and showed us that it doesn’t have to be that way. In the cloud, you eliminate the CapEx investment and rigidity of physical storage purchases. The cloud provides significant flexibility to grow and shrink as needed. If your business requirements change, you can adjust your IT resources accordingly. Additionally, there is no need to worry about hardware in the cloud because providers take complete responsibility for making sure everything works, including hardware and software deployment and upgrades – all behind the scenes.

Continue reading How the Cloud Disrupts Traditional Enterprise Storage

May 29, 2015

Posted In: Industry Insights
