Although vendor-written, this contributed piece does not advocate a position that is particular to the author’s employer and has been edited and approved by Network World editors.
Serverless computing, a disruptive application development paradigm that reduces the need for programmers to spend time worrying about how their hardware will scale, is rapidly gaining momentum for event-driven programming. Organizations should begin exploring this opportunity now to see whether it will help them dramatically reduce costs while ensuring applications run at peak performance.
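The event-driven model described above boils down to writing small functions that the platform invokes once per event and scales on your behalf. A minimal sketch, modeled loosely on the (event, context) handler signature used by AWS Lambda; the event shape and all names here are illustrative assumptions, not a real service's schema:

```python
# Sketch of an event-driven serverless function. The handler signature is
# modeled loosely on AWS Lambda's (event, context); the event shape and
# names are illustrative assumptions, not a real service's schema.
def handle_order_event(event, context=None):
    """Process one 'order placed' event. The platform decides how many
    concurrent copies of this function to run; the programmer does not
    manage servers or scaling."""
    order = event["order"]
    total = sum(item["price"] * item["qty"] for item in order["items"])
    return {"order_id": order["id"], "total": round(total, 2)}
```

Locally the function can be exercised by passing it a sample event dictionary; in production, the platform would wire the event source to the handler and fan out invocations as traffic grows.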
For the last decade, software teams have been on a march away from the practice of directly managing hardware in data centers toward renting compute capacity from Infrastructure as a Service (IaaS) vendors such as Amazon Web Services (AWS) and Microsoft Azure. It is rare that a software team creates unique value by managing hardware directly, so the opportunity to offload that undifferentiated heavy lifting to IaaS vendors has been welcomed by software teams worldwide.
This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.
Security for containers has evolved substantially over the past year, but a great deal of education remains to be done. The key point is that the biggest difference in this new paradigm is that everything is based on continuously delivered, microservice-based applications. That containers are the technology enabling this paradigm is really a secondary issue.
When it comes to containerized applications, everyone seems to be in agreement: statically analyzing what an application can do inside a container and rejecting non-compliant or vulnerable images is a must. However, no matter how good a job you do with vulnerability scanning and container hardening, there are unknown bugs and vulnerabilities that may manifest at runtime and cause intrusions or compromises. That is why it's so important to outfit your system with real-time threat detection and incident response capabilities.
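To make the static-plus-runtime point concrete, here is a minimal sketch of one common runtime detection rule: processes observed at runtime are checked against an allow-list derived from static image analysis. The allow-list, event format and field names are illustrative assumptions, not any particular product's API.

```python
# Minimal sketch of a runtime threat-detection rule for a container:
# flag any process that was not in the allow-list produced by static
# image analysis. The allow-list and event fields are illustrative.
ALLOWED_PROCESSES = {"nginx", "sh"}  # e.g., derived from image scanning

def detect_runtime_threat(exec_event):
    """Return an alert dict when an observed process falls outside the
    static allow-list; return None for expected activity."""
    name = exec_event["process"]
    if name not in ALLOWED_PROCESSES:
        return {
            "severity": "high",
            "container": exec_event["container_id"],
            "reason": "unexpected process '%s' executed in container" % name,
        }
    return None
```

Real runtime-security tools layer many such rules (file access, network connections, privilege escalation) over a stream of kernel-level events, but the principle is the same: the static analysis defines "normal," and the runtime monitor alerts on deviations.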
This contributed piece has been edited and approved by Network World editors.
Finding the right hire for IT can be a chore, and if you get it wrong, the consequences can be substantial. According to the U.S. Department of Labor, a bad hire costs at least 30% of the initial annual salary, but that number is believed to be much higher. It’s a mistake that companies simply can’t afford to make in today’s increasingly competitive marketplace.
C-level executives are increasingly calling on IT to investigate cloud options. According to Gartner, 90% of organizations are looking into crafting a cloud strategy, and there are indeed significant economic advantages to be found in the cloud. But despite the potential benefits, you need to look carefully before you leap.
Cloud providers have many things working in their favor. For starters, their ability to procure and operate at scale gives them substantial discounts that can be passed along to customers. Cloud customers also benefit by being able to purchase compute for specific applications on a pay-as-you-go basis, without the need for a long-term commitment. The real savings come from elasticity and not having to lay out substantial capital if you only want to ramp up compute for a short period of time.
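The elasticity argument can be made concrete with back-of-the-envelope arithmetic; all prices, instance counts and durations below are illustrative assumptions, not quoted rates:

```python
# Back-of-the-envelope comparison of pay-as-you-go elasticity vs. fixed
# capacity for a short burst. All numbers are illustrative assumptions.
def burst_cost(hourly_rate, instances, hours):
    """Pay-as-you-go: pay only while the instances actually run."""
    return hourly_rate * instances * hours

def fixed_capacity_cost(monthly_cost_per_server, servers):
    """Owned capacity: provisioned for the peak and paid for all month."""
    return monthly_cost_per_server * servers

# A 3-day (72-hour) burst on 50 cloud instances at $0.10/hour,
# vs. owning 50 servers at $300/month each to cover the same peak.
cloud = burst_cost(0.10, 50, 72)
owned = fixed_capacity_cost(300, 50)
```

Under these assumed numbers the burst costs $360, versus $15,000 of capacity provisioned for a peak that lasts only three days, which is the elasticity savings the paragraph describes.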
Cloud adoption is a strategic initiative for nearly every company today, but there is still a fair amount of fear, uncertainty and doubt around cloud security, most of it unfounded. In my experience, coding errors and application vulnerabilities are the root of most security problems, regardless of where the data resides. When it comes to cloud, you need to look past the distractions and focus primarily on securing applications.
The main difference between on-premises and cloud security is that there is no longer a well-defined security perimeter that can be protected by hardware appliances. Security teams need to move away from hardware-defined approaches toward programmatic, software-defined solutions. It's worth noting that cloud is not the only driver of this dissipation; the rapid onset of mobile-first computing is another key contributor.
According to Gartner, a company with a corporate “no-cloud” policy in 2020 would be as rare as a company today operating without Internet. IDG estimates that 70% of enterprises are running at least one application in the cloud today and that number is projected to reach 90% in the next 12 months. In other words, in a couple of years a company not in the cloud will be unfathomable.
Network advances often require cable upgrades, but rewiring takes time and money. Often the existing cable can be leveraged, or the extent of the upgrade minimized, using media converters, one of the least glamorous yet most common and perhaps most versatile tools in a network manager’s toolbox.
New media converters and extenders are available that support Power-over-Ethernet (PoE and PoE+) and legacy cabling types such as coax and 2-wire. These devices enable increased utilization of existing network cabling while upgrading network performance.
Our organization, like most large public bodies, is locked into formal bureaucratic procedures and, by general standards, is highly risk-averse. In addition, like other organizations of the United Nations System, it has a unique attribute which makes moving to the cloud a much greater leap than for most other organizations: UN System organizations enjoy a special status.
In the aftermath of World War II, countries negotiating the Charter for the future United Nations agreed the organization should be in a position to function without interference from any single Member State. For this reason, a regime of privileges and immunities was developed. It is this special legal regime that ensures UN organizations are immune from the jurisdiction of national courts, that their premises cannot be entered by national enforcement agencies without their consent, and that their archives – including their data – cannot be accessed without their agreement.
Cybersecurity experts are excited about big data because it is the “crime scene investigator” of data science. If your organization is hacked and customer information compromised, your use of big data to collect massive amounts of information on your systems, users and customers makes it possible for data analysts to provide insight into what went wrong.
But while big data can help solve the crime after it occurred, it doesn’t help prevent it in the first place. You’re still left cleaning up the mess left behind by the breach: angry customers, possible compliance issues with data privacy standards like HIPAA and PCI DSS, maybe even government fines and class-action lawsuits.
You don’t have to look far to see the amazing things that organizations are doing with big data technology: pulling information from past transactions, social media and other sources to develop 360-degree views of their customers. Analyzing thousands of processes to identify causes of breakdowns and inefficiencies. Bringing together disparate data sources to uncover connections that were never recognized before.
All of these innovations, and many more, are possible when you can collect information from across your organization and apply data science to it. But if you’re ready to make the jump to big data, you face a stark choice: should you use a pre-integrated “out-of-the-box” platform? Or should you download open-source Hadoop software and build your own?
Keeping internal networks safe from the ravages of the Internet is increasingly hard, but virtual container solutions allow users to function normally while preventing the “deplorables” of the Internet (malware, exploits and other negative phenomena) from reaching files and sensitive data.
Keeping suspicious files and connections in a separate container – a virtual space isolated from the rest of the network – is a savvy strategy that can save you a great deal of trouble and expense.
With cybersecurity threats on the rise, companies are increasingly taking advantage of cybersecurity insurance. And while cyber insurance can be worth it, it’ll cost you. Last year, U.S. insurers earned $1B in cyber premiums. You can minimize your premiums by showing your insurance company you’re actively mitigating cyber risks, which is a win-win: lower your risk and secure a more cost-effective insurance plan.
The substantially higher cost of MPLS circuits ($200-$400/Mbps/month) compared with easily deployed, lower-cost broadband Internet (with a price tag of roughly $1/Mbps/month) has triggered a shift in enterprise architectures toward the software-defined WAN (SD-WAN). SD-WAN provides the flexibility to choose the optimal transport and dynamically steer traffic over a mix of MPLS circuits, the public Internet, or even wireless LTE circuits.
The choice of access transport depends on a variety of factors, including the type of application, traffic profile, security requirements, QoS, and network loss and latency. When implemented correctly, SD-WAN has significant advantages: faster service deployment, increased flexibility, unified management and improved application performance, to name a few. But while familiarity with SD-WAN has increased over the last year, a survey by Silver Peak and IDG shows only 27% of small- to mid-sized enterprises have shifted to SD-WAN.
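The transport-selection logic described above can be sketched as a simple policy lookup: steer each application class to the cheapest link whose measured loss and latency satisfy its requirements. The link statistics, thresholds and policy table below are illustrative assumptions, not any vendor's implementation:

```python
# Sketch of per-application path selection in an SD-WAN controller.
# All link stats, costs and policy thresholds are illustrative.
LINKS = {
    "mpls":      {"loss_pct": 0.1, "latency_ms": 20, "cost_per_mbps": 300},
    "broadband": {"loss_pct": 1.5, "latency_ms": 45, "cost_per_mbps": 1},
    "lte":       {"loss_pct": 2.0, "latency_ms": 60, "cost_per_mbps": 10},
}

POLICIES = {
    # app class: (max tolerable loss %, max tolerable latency ms)
    "voice": (0.5, 30),
    "bulk":  (5.0, 200),
}

def select_path(app_class):
    """Pick the cheapest link that meets the app's loss and latency
    requirements; fall back to MPLS if no link qualifies."""
    max_loss, max_latency = POLICIES[app_class]
    candidates = [name for name, l in LINKS.items()
                  if l["loss_pct"] <= max_loss and l["latency_ms"] <= max_latency]
    if not candidates:
        return "mpls"
    return min(candidates, key=lambda name: LINKS[name]["cost_per_mbps"])
```

With these assumed numbers, loss-sensitive voice traffic stays on MPLS while bulk transfers ride the cheap broadband link, which is exactly the mixed-transport economics that motivates SD-WAN.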
Like the threat landscape itself, web gateways have changed over the years. Back in the 1990s, organizations primarily used them to prevent employees from wasting time surfing the web – or worse, from visiting gambling, adult and other unauthorized websites. Today web gateways do much more than enforce regulatory compliance and HR policies. Whether they are implemented on-premises or as cloud-based services, organizations rely on web gateways to thwart Internet-borne threats delivered through users’ browsers.
Amazon was the first company to take a large monolithic system and deconstruct it into microservices. Netflix was next, deconstructing its behemoth software stack, seeking a more agile model that could keep up with 2 million daily API requests from more than 800 different device types. Forward-thinking companies like Google, eBay, Uber and Groupon soon followed. Today, enterprises are abandoning monolithic software architectures to usher in the latest era in systems architecture: microservices.
You want to embed real-time communications features into your website or mobile application for direct peer-to-peer communication and you’ve landed on WebRTC. That’s a great start.
Now you realize that backend services are critical for building a robust solution. You are thinking about hosting your solution in the cloud, using an Infrastructure-as-a-Service (IaaS) environment built on top of Amazon Web Services (AWS). Again, good choice. AWS is an obvious first place to look as they’re a leader in the cloud services space.
Ready or not, unified communications is starting to move to the cloud. A recent survey by BroadSoft predicts that cloud UC market penetration will jump almost sixfold in the next four years, from 7% today to 41% of the overall UC space by 2020.
According to Gartner, “the UC-as-a-Service market as a whole is transitioning from the ‘early adopter phase’ to the ‘early mainstream phase’ for enterprise delivery.” Even those enterprises once reluctant to move to the cloud are gazing upward and putting small groups of users into the cloud.
In the quest for securing the cloud, one key aspect is often left out of the discussion: the security impact of the cloud endpoint – most notably the imperiled browser.
As enterprises and individuals increasingly move computing to the cloud, security at the endpoint has been an escalating concern. Taking matters into their own hands, many enterprise consumers are going “direct to cloud” – avoiding enterprise IT practices that would otherwise protect endpoints, connectivity and data. Meanwhile, IT executives that once viewed cloud-based shared computing and storage infrastructure as their least trustworthy option now see the cloud as the safest choice.
Although vendor-written, this contributed piece does not promote a product or service and has been edited and approved by Network World editors.
The large-scale DDoS attack on Dyn last week interrupted access to many major web sites, and while the specifics of the attack have been widely analyzed, here are the important lessons learned:
* DDoS attacks are alive and well: A few years ago DDoS attacks were hot news, but reports died down as the focus shifted to social engineering attacks, large-scale data breaches and insider trading schemes. DDoS attacks may have seemed like yesterday’s risk, but they are very much alive and well; in fact, they are back and stronger than ever.
The Internet of Things (IoT) promises to produce troves of valuable, fast moving, real-time data, offering insights that can change the way we engage with everyday objects and technologies, amplify our business acumen, and improve the efficiencies of the machines, large and small, wearable and walkable, that run our world.
But without careful, holistic forethought about how to manage a variety of data sources and types, businesses will not only miss out on critical insights but also fall behind. Here’s how to prepare to wrangle and extract meaning from all of the data that’s headed your way.