Among the many clouds in the storage sky, Amazon Web Services is the dominant leader by far. The annual AWS re:Invent trade show is taking place in Las Vegas this week, and given how many enterprises have some data in the AWS cloud, the event is expected to sell out.

It’s no secret that the cloud is one of today’s top storage industry disruptors, significantly reshaping enterprise IT architectures and investments. Cloud object storage has a well-earned reputation for cutting costs, since enterprises can avoid both upfront capital and costly operational expenditures by storing data off-premises. However, the challenges of getting data to the cloud mean many enterprises are only getting started on adoption. There is much more growth ahead, as the cloud’s elastic scaling can also deliver agility, but so far most enterprises relegate cloud storage to archival use.
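As a concrete illustration of that archival pattern, here is a minimal sketch of pushing a cold file into S3 object storage under an archival storage class. It assumes the boto3 SDK with AWS credentials already configured; the bucket name, key, and local path are hypothetical placeholders.

```python
# Minimal sketch: archive a local file to S3 object storage in a cold tier.
# Assumes boto3 is installed and AWS credentials are configured; the bucket
# name and key below are hypothetical placeholders.
import boto3

s3 = boto3.client("s3")

def archive_to_cloud(local_path: str, bucket: str, key: str) -> None:
    """Upload a file and request an archival storage class to cut cost."""
    s3.upload_file(
        local_path,
        bucket,
        key,
        ExtraArgs={"StorageClass": "GLACIER"},  # cold tier: cheap to hold, slow to restore
    )

archive_to_cloud("/data/archive/q3-report.tar.gz", "example-archive-bucket", "cold/q3-report.tar.gz")
```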
For a very long time, IT professionals have made storage investments based on a few key metrics: how fast data can be written to a storage medium, how fast it can be read back when an application needs that information, and, of course, the reliability and cost of the system. The critical importance of storage performance led us all to fixate on latency and how to minimize it through intelligent architectures and new technologies.

Given the popularity of flash memory in storage, the significance of latency is not about to fade away, but a number of other metrics are rapidly rising in importance to IT teams. Yes, cost has always been a factor in choosing a storage investment, but with cloud and object storage gaining popularity, the price of storage per GB is more than a function of speed and capacity; it also includes the opportunity cost of having to power and manage that resource. When evaluating whether to archive data on premises or to send it offsite, IT professionals now look at a much wider definition of overall cost.
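To make that wider cost definition tangible, the back-of-the-envelope sketch below folds power and administration overhead into an effective annual price per usable GB. The dollar figures are invented placeholders, not vendor data.

```python
# Back-of-the-envelope sketch: effective annual cost per usable GB.
# All figures below are hypothetical placeholders for illustration only.

def effective_cost_per_gb_year(
    capex_per_gb: float,             # purchase price per usable GB
    service_years: float,            # expected service life of the system
    power_cost_per_gb_year: float,   # electricity and cooling, per GB per year
    admin_cost_per_gb_year: float,   # staff time for provisioning, patching, migrations
) -> float:
    """Amortize capital cost over the system's life and add operating overhead."""
    amortized_capex = capex_per_gb / service_years
    return amortized_capex + power_cost_per_gb_year + admin_cost_per_gb_year

# Example: $0.25/GB hardware kept 5 years, plus $0.03/GB/yr power and $0.04/GB/yr admin.
print(effective_cost_per_gb_year(0.25, 5, 0.03, 0.04))  # -> 0.12 per GB per year
```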
Storage has long been one of the biggest line items on the IT budget. Rightly so, given that data is a valuable asset and the lifeblood of every business today. Critical applications consume data as quickly as they get it, and many companies are also using their data to find new insights that help them develop novel products and strategies.

Regardless of how hot a file is when it is created, its use cools over time. As a business matures, more and more cool, cold, and even frigid data piles up. With analytics now offering new insights on old data, however, no one wants to delete old files or send them to offline archival storage. That makes buying more and more capacity as inevitable as death and taxes.
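One way to see how much of that capacity is holding cooled-off data is to bucket files by how recently they were accessed. The sketch below walks a directory tree and counts files in hot, cool, and cold tiers based on last-access time; the 30- and 180-day thresholds are arbitrary assumptions, and the approach only works where the filesystem actually records access times (for example, not mounted with noatime).

```python
# Sketch: classify files as hot/cool/cold by last-access time.
# Thresholds are arbitrary; atime must be recorded by the filesystem.
import os
import time
from collections import Counter

HOT_DAYS, COOL_DAYS = 30, 180  # assumed tiering thresholds

def classify_tree(root: str) -> Counter:
    now = time.time()
    tiers = Counter()
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                age_days = (now - os.stat(path).st_atime) / 86400
            except OSError:
                continue  # skip files that vanish or are unreadable
            if age_days <= HOT_DAYS:
                tiers["hot"] += 1
            elif age_days <= COOL_DAYS:
                tiers["cool"] += 1
            else:
                tiers["cold"] += 1
    return tiers

print(classify_tree("/data/projects"))  # hypothetical path
```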
If there’s one problem just about every IT professional can relate to, it is the pain of a storage migration. Aging is part of life, not only for us IT veterans but also for the storage systems we manage. Even though we have been moving data off old storage for decades, the challenge of moving data from one storage resource to another without disrupting the business remains one of the most time-consuming and stressful projects an IT team takes on.

Many of the IT professionals I speak with tell me that their migrations are scheduled over months and can take as long as a year to plan and execute. It’s no surprise, then, that IT professionals named migrations the number-two issue facing their departments in a recent survey. Only performance presents a bigger challenge for today’s IT professionals.
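Part of what makes migrations so slow is that every byte must be copied and then verified before the old system can be retired. The fragment below is a deliberately simplified sketch of that copy-and-verify step for a single file between two mounted paths; a real migration also has to handle permissions, sparse files, open handles, and incremental catch-up passes.

```python
# Simplified sketch of one migration step: copy a file, then verify it by checksum.
# Real migrations must also handle ownership, ACLs, sparse files, and live changes.
import hashlib
import shutil

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def migrate_file(src: str, dst: str) -> None:
    shutil.copy2(src, dst)  # copy data and timestamps
    if sha256_of(src) != sha256_of(dst):
        raise IOError(f"verification failed for {src} -> {dst}")

migrate_file("/mnt/old_array/vol1/report.db", "/mnt/new_array/vol1/report.db")  # hypothetical paths
```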
It’s hardly surprising that IT professionals have their hands full in the age of the Internet of Things (IoT) and big data. Supporting rapidly growing data volumes, new data types, and many more data sources is making it harder than ever for IT to meet service-level agreements (SLAs) while keeping spending in check.

The complexity IT manages is clear in the results of a recent Storage Census of over 300 IT professionals that my company, Primary Data, conducted at VMworld 2017. The survey showcased the conflicting pressures currently faced by IT leaders. Respondents ranked delivering performance, executing data migrations, meeting expectations with existing budgets, and integrating the cloud into their infrastructure among the biggest challenges facing their departments today. Let’s examine the factors that contribute to these challenges and how IT can solve them.
Machine learning is a hot topic across the technology spectrum today. From self-driving cars, to catching nefarious content in the fight against terrorism, to apps that automatically retouch photos before you even take them, it is popping up just about everywhere. Each innovation is creating a new wave of business opportunity while simplifying and automating tasks that involve far more data than we human beings can process at once, or even in a lifetime.
If it seems your IT team has more data to manage than ever before, you’re not mistaken. Just about every enterprise is trying to determine how to manage continued data growth without scaling budget or staff along with it.

451 Research analyst Henry Baltazar emphasized this trend in a recent report, noting, “The increasing relevancy of data management is in parallel with the ongoing growth of the sheer volume of data that enterprises must deal with.” The good news is that there are many approaches IT can take to ease the challenges of data growth. Let's take a look at four steps IT can use to make a big impact.
Whether or not you work in the IT department, you have likely experienced the pain of migrating from one system to another. When you buy a new laptop or a new phone, you face a choice: back up and replicate your old data to the new system, or start from scratch with none of the files you might need on your new device.

Imagine this problem at enterprise scale. Moving terabytes of data is a daunting task that also requires planning and downtime whenever IT has to add a new storage system, perform an upgrade, or replace equipment. Just as with our smartphones, the old system likely still has some value, but because data can’t move easily from one system to the other, the equipment we leave behind often lingers on as a backup to the backup copy.
Storage is a fast-evolving industry. Groundbreaking hardware technologies quickly become commoditized, which is a challenge for vendors but a great benefit to customers. Today’s shiny new array is soon matched by a similarly capable JBOD (Just a Bunch of Disks) product that may not have equally robust vendor support but costs the enterprise far less than a brand-name system. This commoditization extends to flash as well. While flash adoption is still growing in the enterprise, Gartner already sees JBOF (Just a Bunch of Flash) products on the horizon in this segment. Cloud storage is on the rise in tandem with flash, and smart data management software can help enterprises overcome the complexity of cloud adoption and easily integrate JBOC (Just a Bunch of Cloud) with their existing architectures.
The technology industry is buzzing about intelligence, analytics, and other ways to make insight-driven decisions based on real data. What is not talked about so much is where these applications get all of this information.

Metadata is the data about data, and it is the key to significant insights for the enterprise. Knowing anything about the data used in an enterprise starts with metadata, which records important details such as when a file was last opened, how often it has been accessed, who accessed it, its size, its location, and so on. In fact, Stewart Baker, former general counsel of the NSA, has said, “Metadata absolutely tells you everything about somebody’s life. If you have enough metadata, you don’t really need content.”
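As a small illustration of how much of that detail already sits in filesystem metadata, the sketch below pulls a file’s size, owner, and access and modification times from a POSIX stat call. Note that some attributes mentioned above, such as how often a file has been accessed and by whom, require audit logging; stat alone does not record them.

```python
# Sketch: read basic file metadata from a POSIX stat call.
# Access counts and per-user access history need audit logs; stat does not keep them.
import os
import pwd
from datetime import datetime

def describe(path: str) -> dict:
    st = os.stat(path)
    return {
        "path": path,
        "size_bytes": st.st_size,
        "owner": pwd.getpwuid(st.st_uid).pw_name,  # POSIX-only owner lookup
        "last_accessed": datetime.fromtimestamp(st.st_atime).isoformat(),
        "last_modified": datetime.fromtimestamp(st.st_mtime).isoformat(),
    }

print(describe("/etc/hosts"))  # example path
```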
Storage vendors pitch new systems in innumerable ways. Whether they tout performance claims about IOPS and low latency, emphasize protection, reliability, and security features, or sell on convenience, capacity, cost, or even brand reputation, there are many options vendors can offer an IT team looking to fix a problem.

Although these capabilities have been around for many years, they have long been confined to a storage-centric ecosystem. With the advent of advanced data management software, it finally becomes possible to shift to a data-centric architecture that enables IT admins to automatically align data with storage that meets the enterprise’s business objectives.
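To give a flavor of what aligning data with storage by objective can look like, here is a hypothetical policy sketch that chooses the cheapest storage tier satisfying a dataset’s latency objective from a small catalog. The tier names, costs, latency figures, and matching rule are invented for illustration and do not describe any particular product.

```python
# Hypothetical sketch: choose a storage tier whose properties satisfy a dataset's objectives.
# Tier names, costs, and latency figures are invented for illustration only.

TIERS = {
    "all_flash":  {"max_latency_ms": 1,    "cost_per_gb": 0.50},
    "hybrid_nas": {"max_latency_ms": 10,   "cost_per_gb": 0.15},
    "cloud_cold": {"max_latency_ms": 5000, "cost_per_gb": 0.01},
}

def place(objectives: dict) -> str:
    """Return the cheapest tier that meets the required latency objective."""
    candidates = [
        (props["cost_per_gb"], name)
        for name, props in TIERS.items()
        if props["max_latency_ms"] <= objectives["required_latency_ms"]
    ]
    if not candidates:
        raise ValueError("no tier satisfies the objectives")
    return min(candidates)[1]

print(place({"required_latency_ms": 1}))     # -> "all_flash": only flash is fast enough
print(place({"required_latency_ms": 5000}))  # -> "cloud_cold": cold data goes to the cheapest tier
```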
If you’re a storage admin, it might seem like there’s a new flash storage system being pitched at your inbox every week, perhaps even a few times a week. You may also be investigating the cloud and whether your enterprise should go with a hybrid, private, or public cloud implementation. Chances are you already have a lot of storage in your infrastructure from past purchases, and when you add it all up, you could be sitting on quite a diverse collection of resources, many of which may be significantly underutilized today.

This diversity of storage types presents many options, which creates a real challenge for admins, but only because all those different resources could not be seamlessly connected until now. With storage solutions that deliver ultra-fast performance, such as all-flash arrays, others that save money through low cost, cloud capacity for cold (inactive) data, and numerous other shared storage resources, most enterprise IT teams have the right resources to serve a wide variety of data demands. The challenge is knowing which data needs which resource, and then continually aligning data to the right resource as its needs change over time.