This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.
The cloud is the promised land when it comes to storage. A recent 451 Research report said AWS and Azure will be two of the top five enterprise storage vendors by 2017, with AWS as number two overall. But the challenge with using the cloud for primary storage is the latency between that storage and users/applications. To take advantage of the economics, scale, and durability of cloud storage, organizations will need a combination of caching, global deduplication, security, and global file locking to get the performance and features they require.
Although vendor-written, this contributed piece does not promote a product or service and has been edited and approved by Network World editors.
I’m an aerospace engineer by degree and an IT executive by practice. Early in my career, I worked on missile hardware and simulators with some of the smartest minds at Marshall Space Flight Center in Huntsville, AL. An adage from those days still drives me today: “Better is the evil of good enough.”
In rocket science, an astronaut’s life is literally in the balance with every engineering decision. Being perfect is mission critical. But along the way, NASA engineers realized that while perfection is important, it should not be pursued universally, for several key reasons: it is very expensive, it draws out timelines, and it can result in extreme over-engineering.
This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.
Only 1% of companies use software-defined WAN (SD-WAN) solutions today, but Gartner says the promise of cost savings and performance improvements will drive that number to more than 30% by 2019. Why aren’t more businesses deploying now given the sizeable list of vendor tools available? It could be a lack of understanding about the varying approaches to bringing software-defined networking to the branch.
Before exploring those differences, let’s review why SD-WAN is so promising for branch environments. Compared to traditional WANs, SD-WANs reduce the complexity of network hardware at branch offices and centralize and simplify management. SD-WANs also allow businesses to augment or replace MPLS networks by using less expensive Internet links in a logical overlay and intelligently routing traffic over multiple paths directly to the Internet, rather than through a central data center. This improves application performance and makes more efficient use of bandwidth.
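To make that idea of intelligent, multi-path routing concrete, here is a minimal, vendor-neutral sketch in Python. The path names, measured metrics, and per-application thresholds are hypothetical illustrations, not any product's actual logic.

```python
# Hypothetical illustration of SD-WAN path selection: choose the cheapest link
# that still meets an application's latency/loss policy, otherwise fall back.
from dataclasses import dataclass

@dataclass
class WanPath:
    name: str
    latency_ms: float   # measured round-trip latency
    loss_pct: float     # measured packet loss
    cost: int           # relative cost (MPLS typically highest)

PATHS = [
    WanPath("mpls", latency_ms=30, loss_pct=0.1, cost=10),
    WanPath("broadband", latency_ms=45, loss_pct=0.4, cost=1),
    WanPath("lte", latency_ms=80, loss_pct=1.5, cost=5),
]

# Per-application policy: maximum tolerable latency and loss (illustrative numbers).
POLICIES = {
    "voip":   {"max_latency_ms": 50,  "max_loss_pct": 0.5},
    "backup": {"max_latency_ms": 500, "max_loss_pct": 5.0},
}

def select_path(app: str) -> WanPath:
    policy = POLICIES[app]
    eligible = [p for p in PATHS
                if p.latency_ms <= policy["max_latency_ms"]
                and p.loss_pct <= policy["max_loss_pct"]]
    if eligible:
        # Prefer the cheapest link that meets the application's policy.
        return min(eligible, key=lambda p: p.cost)
    # Nothing qualifies: fall back to the best-performing link.
    return min(PATHS, key=lambda p: p.latency_ms)

print(select_path("voip").name)    # broadband: cheapest link that meets the voice policy
print(select_path("backup").name)  # broadband: everything qualifies, so cheapest wins
```

The design choice it illustrates is the core of the SD-WAN pitch: steer each application over the least expensive path that still satisfies its performance policy, instead of sending everything over MPLS.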
This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.
Heralded for driving positive transformations in consumer products, retailing, healthcare, manufacturing and more, the Internet of Things (IoT) promises a “smart” everything, from refrigerators, to cars, to buildings, to oil fields. But there’s a dark side to IoT, and if we don’t overcome the challenges it presents, we will be heading for trouble.
The easiest way to see these challenges in action is to explore a possible IoT deployment. Let’s assume the following. A very large industrial food storage warehouse and distribution center is using Internet-connected devices to ensure the proper temperature of various zones, such as a massive refrigeration area for items requiring constant, non-freezing cooling and a massive freezer area for items requiring constant freezing.
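As a minimal sketch of the kind of rule such a deployment might evaluate against incoming sensor readings, consider the Python snippet below; the zone names and temperature limits are hypothetical, chosen only to match the scenario above.

```python
# Hypothetical per-zone temperature limits for the warehouse scenario (degrees C).
ZONE_LIMITS = {
    "refrigeration": (0.0, 4.0),     # constant, non-freezing cooling
    "freezer":       (-25.0, -15.0), # constant freezing
}

def check_reading(zone: str, temp_c: float) -> str:
    low, high = ZONE_LIMITS[zone]
    if temp_c < low:
        return f"ALERT: {zone} too cold ({temp_c} C, minimum {low} C)"
    if temp_c > high:
        return f"ALERT: {zone} too warm ({temp_c} C, maximum {high} C)"
    return f"OK: {zone} within range ({temp_c} C)"

# Simulated readings from the Internet-connected sensors.
for zone, temp in [("refrigeration", 2.5), ("freezer", -12.0)]:
    print(check_reading(zone, temp))
```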
This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.
FTP turns 45 this year. And, while this original protocol for transferring files over the Internet is still widely used, many companies are looking for a more modern alternative. Initially, concerns about FTP centered on security. But, as IP technology became ubiquitous for global data exchange, FTP’s more fundamental performance limitations also became apparent.
Because FTP was originally designed without security features like data integrity and confidentiality, the first security concerns arose around privacy of control channel data like user IDs and passwords, and then spread to the actual data being transferred. “Secure” FTP (FTPS) was developed in response. FTPS is FTP with Transport Layer Security (TLS), which protects file content, user names, and passwords from eavesdropping and modification while in transit over the Internet.
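For illustration, here is a minimal FTPS client using Python's standard ftplib module; the host name and credentials are placeholders.

```python
# Minimal FTPS (FTP over TLS) client using Python's standard library.
# The host and credentials below are placeholders, not real endpoints.
from ftplib import FTP_TLS

ftps = FTP_TLS("ftp.example.com")
ftps.login("user", "password")   # TLS is negotiated before the credentials are sent
ftps.prot_p()                    # protect the data channel with TLS as well
ftps.retrlines("LIST")           # list the remote directory over the secure connection
ftps.quit()
```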
There is no downside to a licensing model where you pay only for what you’re actually using and can scale licensing up or down as needed. This is what makes the IBM sub-capacity licensing model so attractive.
The advantages of IBM’s sub-capacity licensing model are obvious, but misinterpretation and misunderstanding of how to deploy it are common. In fact, I would say three out of five clients we work with start out saying they are using sub-capacity licensing when in reality they are using full-capacity licenses.
Your enterprise is always at full capacity with IBM unless the appropriate steps are taken to change that status to sub-capacity licensing. With few exceptions, IBM will consider an organization at full capacity unless the IBM License Metric Tool (ILMT) is implemented. What does this mean? If ILMT hasn’t been implemented, IBM doesn’t recognize your right to license at sub-capacity and will, in fact, treat the organization’s license metrics as a full-capacity licensing model. Under full-capacity licensing, you must license all active, physical processors in the server; under sub-capacity licensing, you pay only for the virtual cores allocated.
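As a rough, hypothetical illustration of how much that difference can matter, the arithmetic below uses made-up core counts and a made-up PVU-per-core rating, not IBM's actual tables.

```python
# Hypothetical illustration of full-capacity vs. sub-capacity licensing math.
# Core counts and the PVU-per-core rating are made-up examples, not IBM's tables.
PVU_PER_CORE = 70             # assumed PVU rating for this processor type
PHYSICAL_CORES = 32           # all active, physical cores in the server
VIRTUAL_CORES_ALLOCATED = 8   # cores actually allocated to the VMs running the product

full_capacity_pvus = PHYSICAL_CORES * PVU_PER_CORE
# Sub-capacity is based on allocated virtual cores, never more than full capacity,
# and it only applies if ILMT is deployed and reporting.
sub_capacity_pvus = min(VIRTUAL_CORES_ALLOCATED, PHYSICAL_CORES) * PVU_PER_CORE

print(f"Full capacity: {full_capacity_pvus} PVUs")  # 2240 PVUs
print(f"Sub-capacity:  {sub_capacity_pvus} PVUs")   # 560 PVUs
```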
Although vendor-written, this contributed piece does not promote a product or service and has been edited and approved by Network World editors.
What if your network was more than just a collection of hardware and cables strung together over the years to solve specific problems? What if your network was agile enough to empower your business today and offer deep insight into the flow of information throughout your data center? What if this network could adapt to your changing business needs at the drop of a hat and help ensure no opportunity slips through the cracks?
Sounds like a dream, but in fact it’s very much possible today.
This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.
Hybrid cloud implementations are becoming standard for companies building next-generation cloud applications, but their adoption raises questions about how to run and manage database operations that support both environments.
While hybrid cloud allows IT to expand infrastructure resources only when required (i.e. ‘bursting’), improves disaster prevention, and makes it possible to offload some hardware and operational responsibility and associated costs to others, database issues to consider include:
Although vendor-written, this contributed piece does not advocate a position that is particular to the author’s employer and has been edited and approved by Network World editors.
Cisco estimates 50 billion devices and objects will be connected to the Internet by 2020. And that estimate may be low. If consumers count every device that draws power in their home – lamps, light bulbs, kitchen gadgets – and then factor in objects at work, there may be many more billions of connected devices by then.
But the problem is, many traditional networks are still manual, static and complex, which isn’t ideal for IoT. To realize the promise of a hyper-connected future, three shifts must take place.
This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.
In 2001, a bunch of people got together and wrote a manifesto on Agile software. There were two main factors that made the output suspect. First, the fact that they even called it a manifesto. Second, the manifesto had nothing to do with software. It talked about values.
For those in need of a refresher, here’s the “Manifesto for Agile Software Development”:
We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:
-- Individuals and interactions over processes and tools
-- Working software over comprehensive documentation
-- Customer collaboration over contract negotiation
-- Responding to change over following a plan
That is, while there is value in the items on the right, we value the items on the left more.
Somewhere along the line, we started doing daily standups, two-week sprints, maybe a little pair programming here and there. Since then our software output and productivity have skyrocketed. Remember when we used to have an end-of-project company bug hunt? How about the integration …
This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.
Cloud storage revenue is forecast to grow more than 28% annually to reach $65 billion in 2020. The driving force is the substantial economies of scale that enable cloud-based solutions to deliver more cost-effective primary and backup storage than on-premises systems can ever hope to achieve.
Most IT departments quickly discover, however, that there are significant challenges involved in migrating and synchronizing many thousands or even millions of files from on-premises storage systems to what Gartner characterizes as Enterprise File Synchronization and Sharing (EFSS) services in the cloud. According to Gartner, “by 2019 75% of enterprises will have deployed multiple EFSS capabilities, and over 50% … will struggle with problems of data migration, up from 10% today.”
Although vendor-written, this contributed piece does not promote a product or service and has been edited and approved by Network World editors.
The technology industry operates on micro and mega cycles of innovation. Micro cycles happen every hour, day, week and year. Mega cycles are far rarer, occurring every 20 years or so, like the leap from mainframes to client-server computing.
We are now entering the next mega innovation cycle. As with the previous seismic shifts, the benefits will be massive for those who adapt and potentially catastrophic for those who do not. We all know the compute layer is moving to the cloud – we’ve been watching this shift for years. Big Data, mobility, and the Internet of Things (IoT) are well on their way. Security, which seems to grab all the headlines lately, is still clearly a work in progress.
Although vendor-written, this contributed piece does not advocate a position that is particular to the author’s employer and has been edited and approved by Network World editors.
It’s a cliché, but “change is the only constant.” Every company periodically reviews and makes changes to the applications, processes and solutions it uses to conduct business. And nowhere is this rationalization more important than in the ever-shifting and increasingly perilous arena of cyber security.
Companies often begin the security rationalization process after accumulating a portfolio of tools over the years (e.g., penetration testers, web-application scanners, and code scanners), or through mergers and acquisitions or shifting business strategies.
This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.
More than a third of businesses in the United States currently use the cloud, but by 2020 that number is expected to more than double to a whopping 80%. But even a secure cloud doesn’t guarantee immunity from data breaches. Now that the cloud is rapidly becoming a mainstream part of IT, businesses must think more critically about how to bolster their security beyond cloud providers’ default security infrastructure—which often proves to be inadequate for the changing face of business.
This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.
As organizations work to make big data broadly available in the form of easily consumable analytics, they should consider outsourcing functions to the cloud. By opting for a Big Data as a Service solution that handles the resource-intensive and time-intensive operational aspects of big data technologies such as Hadoop, Spark, Hive and more, enterprises can focus more on the benefits of big data and less on the grunt work.
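To illustrate the kind of work an enterprise keeps once the cluster plumbing is someone else's problem, here is a minimal PySpark sketch; the input path and column names are hypothetical placeholders.

```python
# Minimal PySpark job: the analytics the enterprise keeps, while a managed
# service handles cluster provisioning, tuning, and upkeep.
# The input path and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("order-analytics").getOrCreate()

orders = spark.read.json("s3://example-bucket/orders/")  # placeholder location
daily_revenue = (
    orders
    .groupBy(F.to_date("order_ts").alias("day"))   # assumes an order timestamp column
    .agg(F.sum("amount").alias("revenue"))         # assumes a numeric amount column
    .orderBy("day")
)
daily_revenue.show()
spark.stop()
```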
The advent of big data raises fundamental questions about how organizations can embrace its potential, bring its value to greater parts of the organization and incorporate that data with pre-existing enterprise data stores, such as enterprise data warehouses (EDWs) and data marts.
Although vendor-written, this contributed piece does not advocate a position that is particular to the author’s employer and has been edited and approved by Network World editors.
Big data is the hot buzzword in security analytics today, but buyers are skeptical because many companies have spent years building “data lakes” only to discover it was impossible to “drain the lake” to get something useful.
And unfortunately, today’s solutions often include expensive clusters coupled with static business intelligence reports and “sexy” dashboards that look good but add little to useful and productive security analytics. Focusing instead on the analytics and on how to use that very valuable data to make real-time decisions, discover critical patterns, determine ongoing and changing security policies, and dramatically improve security: that’s useful.
Although vendor-written, this contributed piece does not advocate a position that is particular to the author’s employer and has been edited and approved by Network World editors.
Most companies now use a range of cloud applications, and uptime performance for those applications is measured by Service Level Agreements (SLAs). These agreements acknowledge that glitches, system crashes and downtime have an enormous impact on business continuity and can adversely affect customer loyalty and increase churn. Gartner estimates downtime can cost major corporations as much as $100,000 per hour.
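As a back-of-the-envelope illustration of what those numbers mean together, the Python snippet below converts an uptime target into allowed downtime per month and a rough cost exposure; the SLA levels are arbitrary examples, and the hourly cost is the Gartner estimate cited above.

```python
# Rough arithmetic relating an uptime SLA to allowed downtime and potential cost.
HOURS_PER_MONTH = 30 * 24     # simple 30-day month
COST_PER_HOUR = 100_000       # Gartner's estimated downtime cost per hour

for sla in (0.999, 0.9999):   # illustrative SLA levels
    downtime_hours = HOURS_PER_MONTH * (1 - sla)
    print(f"{sla:.2%} uptime -> {downtime_hours * 60:.1f} min of downtime/month "
          f"(~${downtime_hours * COST_PER_HOUR:,.0f} at risk)")
```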
This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.
The NoSQL industry developed quickly on the promise of schema-free design, infinitely scalable clusters and breakthrough performance. But there are hidden costs, including the added complexity of an endless choice of datastores (now numbering 225), the realization that analytics without SQL is painful, high query latencies that force you to pre-compute results, and inefficient use of hardware that leads to server sprawl.
All of these costs add up to a picture far less rosy than initially presented. However, the data model for NoSQL does make sense for certain workloads, across key-value and document data types. Fortunately, those are now incorporated into multi-mode and multi-model databases representing a simplified and consolidated approach to data management.
The unceasing arms race between cyber attackers and cyber defenders has reached unprecedented levels of sophistication and complexity. As defenders adopt new detection and response tools, attackers develop new techniques and methods to bypass those mechanisms. And deception is one of the most effective weapons on both sides of the game.
Deception techniques have traditionally been among the favorite methods in the attackers’ arsenal. Surprise and uncertainty give the attacker an inherent advantage over the defender, who cannot predict the attacker’s next move. Rather surprisingly, however, this asymmetry can also be turned to the defender’s advantage.
This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.
For most organizations that migrated to a new version of Windows in the past two years, not only were the cost and frustration high, the resources required were crippling. But ready or not, chances are a new migration project will soon be on your to-do list. In fact, almost a quarter of all PCs will be upgraded to Windows 10 within a year. That’s more than 350 million devices. It’s already on more than 100 million devices, and counting.