Archive

Category Archives for "Network World SDN"

BrandPost: Security and the Cloud Go Hand-in-Hand: Are You Prepared?

Just because you’ve tapped into the vast resources of a cloud service provider to replace previously on-premises IT assets doesn’t mean your management or cybersecurity burden is any lighter. In fact, cloud migration creates new issues for network admins to focus on: migrations are inherently risky from a cyber perspective – data on the move is data that can be exploited in transit. Cloud providers are prone to proclaiming that their security is better than any single business can achieve, simply because they have more resources to apply to the issue. By now, we all know that simply throwing money at the challenge is no guarantee of success, so take that claim with a dose of skepticism.

AMD plots its move against Intel in the data center

Smelling blood in the water, a revitalized AMD is preparing for a big push against Intel in the data center, hoping to win back the market share it gained and lost a decade ago. AMD is promoting its Epyc processors, with 16 or 32 cores, as a lower-TCO, higher-performance option than Intel’s Xeon. It argues that a single-socket, 32-core server is cheaper up front and in the long run than a dual-socket setup, which is Intel’s bread and butter. “We’re not saying single socket is for everyone, but at the heart of the market is where 50 percent to 80 percent are 32 cores per server and down, and our top single socket can do it more efficiently with lower costs and licensing. But in some cases some people will want to stay at two-socket,” said Glen Keels, director of product and segment marketing for data center products at AMD.

IDG Contributor Network: Zero Trust Networking (ZTN): don’t trust anything

John Kindervag, a former analyst at Forrester Research, was the first to introduce the Zero-Trust model back in 2010. The focus then was more on the application layer. However, once I heard that Sorell Slaymaker from Techvision Research was pushing the topic at the network level, I couldn’t resist giving him a call to discuss the general principles of Zero Trust Networking (ZTN). During the conversation, he shone a light on numerous known and unknown facts about Zero Trust Networking that could prove useful to anyone. The traditional world of networking started with static domains. The classical network model divided clients and users into two groups – trusted and untrusted. The trusted are those inside the internal network; the untrusted are external to the network, whether mobile users or partner networks. To recast the untrusted as trusted, one would typically use a virtual private network (VPN) to access the internal network.

Scale Computing, APC partner to offer micro data center in a box

Hyperconverged infrastructure (HCI) vendor Scale Computing and power management specialist APC (formerly American Power Conversion, now owned by Schneider Electric) have partnered to offer a range of turnkey micro data centers for the North American market. The platform combines Scale’s hyperconverged software, HC3 HyperCore, running on top of Scale’s own hardware, with APC’s ready-to-deploy micro data center racks, and will be sold as a single SKU. The pre-packaged platform is entirely turnkey, with automated virtualization, power management resources and built-in redundancy. That makes it well suited for remote edge locations, such as cell phone towers, where staff is not immediately available to maintain the equipment.

How to list repositories on Linux

A Linux repository is a storage location from which your system retrieves and installs OS updates and applications. Each repository is a collection of software hosted on a remote server and intended to be used for installing and updating software packages on Linux systems. When you run commands such as “sudo apt update” or “sudo apt upgrade”, you may be pulling package information and package updates from a number of repositories. Repositories contain thousands of programs. Standard repositories provide a high degree of security, since the software they include is thoroughly tested and built to be compatible with a particular distribution and version. So, you can expect the updates to occur with no unexpected "side effects."
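
On most systems you can list the configured repositories straight from the command line. A minimal sketch, assuming a Debian/Ubuntu system (apt) or an RPM-based system (dnf); exact file locations can vary by distribution:

$ grep -rhE '^deb' /etc/apt/sources.list /etc/apt/sources.list.d/   # Debian/Ubuntu: show every active repository line
$ apt-cache policy                                                  # Debian/Ubuntu: list repositories and their priorities
$ dnf repolist                                                      # RHEL/Fedora: list enabled repositories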

Linux tricks that can save you time and trouble

Good Linux command line tricks don’t only save you time and trouble. They also help you remember and reuse complex commands, making it easier for you to focus on what you need to do, not how you should go about doing it. In this post, we’ll look at some handy command line tricks that you might come to appreciate. Editing your commands: When making changes to a command that you're about to run on the command line, you can move your cursor to the beginning or the end of the command line to facilitate your changes using the ^a (control key plus “a”) and ^e (control key plus “e”) sequences. You can also fix and rerun a previously entered command with an easy text substitution by putting your before and after strings between ^ characters – as in ^before^after^.
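
Here’s what the quick-substitution trick looks like in practice – a minimal bash sketch, where the misspelled search term and the log file path are just illustrations:

$ grep -i eror /var/log/syslog   # oops, typo in the search term
$ ^eror^error^                   # swap "eror" for "error" and rerun
grep -i error /var/log/syslog    # bash echoes the corrected command, then runs it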

Georgia Tech research: smart-building and IoT technologies are highly fragmented

Greater cooperation among standards bodies, corporations, city governments and other stakeholders is needed so IoT and existing smart-building technology can work together to deliver the full potential of smart cities, according to a Georgia Tech study. The problem is that standards are lacking for current in-building systems, let alone standards that would let those systems share data with newer IoT devices. One vendor of automation software for, say, elevators might use a much different data format than the manufacturer of a given building’s HVAC systems, making it difficult to integrate these two critical systems into the same framework.

Nvidia revs up AI with GPU-powered data-center platform

Nvidia is raising its game in data centers, extending its reach across different types of AI workloads with the Tesla T4 GPU, which is based on its new Turing architecture and, along with related software, designed for blazing acceleration of applications for images, speech, translation and recommendation systems. The T4, a small-form-factor accelerator card, is the essential component in Nvidia's new TensorRT Hyperscale Inference Platform and is expected to ship in data-center systems from major server makers in the fourth quarter. The T4 features Turing Tensor Cores, which support different levels of compute precision for different AI applications, as well as the major software frameworks – including TensorFlow, PyTorch, MXNet, Chainer and Caffe2 – for so-called deep learning, machine learning involving multi-layered neural networks.

IDG Contributor Network: On the road to your IoT adventure: planning, deployment and measurement (Part 2)

Welcome to the second installment in my series on how organizations can get started on the road to success with their Internet of Things (IoT) projects. In the first part of the series, based on “Building the Internet of Things – a Project Workbook,” I explained how to identify your IoT vision and path to value. The next steps include planning, deployment and measuring your success. Let’s get started. Benchmark your organization against industry peers: The first step is to determine how your organization stacks up against its industry peers. Benchmarking will help establish metrics you can use to validate your project, secure funding, evaluate your team and promote success after the project is complete. It also helps establish a baseline, allowing you to see where you stand at the beginning of the project so you can measure how far you’ve come at the end. You can use the benchmarking method of your choice, but I encourage you to evaluate several key areas.

BrandPost: 10 new terms that will help define the future of 5G

5G is coming, and with it a host of new terms and acronyms. Ciena’s Bo Gowan tries to make sense of it all with definitions for the ones that matter most for the future of 5G network architectures. Trying to digest the vast amount of information related to 5G is like drinking from a fire hydrant, and it doesn’t help that a litany of new terms is being introduced as part of the journey to 5G architectures.

IDG Contributor Network: 5 ways IoT device management differs from MDM

They say necessity is the mother of invention, and the accelerating pace of the mobile revolution is no exception. The proliferation of mobile devices affected countless industries, creating a powerful need for businesses to manage those devices. And so, mobile device management (MDM) was born. While these solutions have evolved considerably over the years to keep pace with new technologies and innovations, their core function remains enabling IT personnel to remotely manage, track, troubleshoot and secure devices – mostly smartphones and tablets these days. It’s easy to see the parallels between the factors driving the development and adoption of MDM – namely device control and security – and those that brought about IoT device management. Similarities in concepts and terminologies further cloud the distinction between the two. But, as any company that has attempted to repurpose an MDM solution for the management of IoT devices can attest, the resemblance is only superficial.

Supermicro unveils an insanely fast, insanely thin storage server

You want fast? Supermicro has introduced a new 1U server filled with Samsung-made NF1 SSDs offering capacity as high as 576TB, throughput of up to 20GB/sec and 10 million IOPS. By comparison, a server with 15,000 RPM SAS hard disks can manage about 175-210 IOPS. There are other devices capable of 10 million IOPS, such as the EMC DSSD D5, but that was a massive 5U unit and has since been discontinued. The server uses Samsung’s NF1 drives (a form factor formerly known as NGSFF), which look like very large M.2 drives and carry two rows of NAND flash chips to maximize capacity. Samsung has already shown off the drives in 8TB and 16TB capacities. The new Supermicro product, the SSG-1029P-NMR36L, packs 36 16TB NF1 drives into its 1U chassis, doubling the capacity of a model introduced in January with 288TB. The server also comes with two 28-core Xeon SP processors and holds up to 3TB of memory in 24 DIMM slots, plus dual 16-lane PCIe network cards.

IDG Contributor Network: How the IoT helps you make money, save money and stay out of jail

The joke goes like this: “There are only three reasons to complete an Internet of Things project – to make money, save money and stay out of jail.” The truth of the first two reasons is easy enough to demonstrate. The third is more complex. First, it’s an exaggeration for effect. The punchline implies that a properly implemented IoT project should help companies avoid violating data protection and compliance rules, but the penalty for that is fines, not jail. In addition, even the idea that the IoT helps companies avoid compliance troubles is unproven. High-profile security breaches that exploited Internet of Things (IoT) vulnerabilities have created major business concerns, and people really aren’t sure how safe IoT is. These breaches have redoubled industry efforts to make the IoT as secure as possible, and the joke implies that these efforts are succeeding. But that’s debatable.

IDG Contributor Network: Data-driven resource management and the future of cloud

Cloud adoption is undoubtedly the cornerstone of digital transformation, and for many, it is the foundation for rapid, scalable application development and delivery. Companies of all sizes and from across all industries are racing to achieve the many benefits afforded by public, private or hybrid cloud infrastructure. According to a recent study, 20 percent of enterprises plan to more than double public cloud spending in 2018, and 71 percent will grow public cloud spending by more than 20 percent. Enterprises moving to the cloud are often seeking to improve employee collaboration, ensure redundancy, boost security and increase agility in application development. One of the top advantages afforded by the cloud is the ability to auto-scale in response to demand – a feature that has transformed what was once capacity planning into a more continuous cycle of capacity and resource management.
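
As one concrete illustration of demand-driven auto-scaling – a minimal sketch assuming a Kubernetes cluster with the metrics server installed, where the "web" deployment name is hypothetical:

$ kubectl autoscale deployment web --min=2 --max=10 --cpu-percent=80   # keep 2-10 replicas, targeting 80% average CPU utilization
$ kubectl get hpa web                                                  # inspect the resulting HorizontalPodAutoscaler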

Software-defined data centers need MANO

Software-defined data-center (SDDC) networks hold the promise of quickly and automatically reallocating resources to best support applications without changing the underlying physical infrastructure, but they require the proper integration of management, automation and network orchestration (MANO).

IDG Contributor Network: Migrating to the cloud is a good start, but what you do next is critical

Today, it seems like every business is migrating to the cloud. And it’s true – nearly three in four businesses are using cloud solutions to augment traditional networking practices, with no signs of slowing down. The cloud’s potential has captured the attention of business leaders across nearly every industry, thanks to its promise of speed, scale and control. In fact, nine out of ten companies rely on the cloud to accelerate digital transformation and drive business growth.

IDG Contributor Network: IoT alphabet soup: when should an enterprise use MQTT versus LWM2M?

There is tremendous interest from industrial enterprises in understanding the nuances of the two most debated IoT data communications protocols: MQTT and LWM2M. MQTT and LWM2M are protocols that create a standard way to get device data to systems, platforms, applications and other devices. Let’s talk a little about each protocol and when it’s best used in an enterprise IoT deployment. MQTT and when to use it: Message queuing telemetry transport (MQTT) is an ISO standard that describes a publish/subscribe (pub/sub) messaging protocol. Nearly all IoT platforms support MQTT communication, making it the de facto standard for device-to-platform IoT communication.
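
As a minimal illustration of MQTT’s pub/sub pattern – a sketch assuming the Eclipse Mosquitto command-line clients are installed, with a hypothetical broker hostname and topic:

$ mosquitto_sub -h broker.example.com -t 'plant/line1/temperature' &           # subscribe to a telemetry topic
$ mosquitto_pub -h broker.example.com -t 'plant/line1/temperature' -m '22.4'   # publish a reading; every subscriber receives it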

IDG Contributor Network: 5 reasons why it’s important to mix and match in your cloud strategy

The rise of the cloud settled the age-old debate about whether IT teams should choose an array of exceptional technologies from various providers or a fully integrated stack of (mostly unexceptional) applications from a single vendor. Thanks to cloud computing, you can have the very best applications – and the very best clouds – for the IT tasks at hand. And you don't have to deal with the headache of investing heavily in infrastructure and building it yourself. Here's a quick look at five ways the cloud and related technologies enable your IT team to launch, integrate, scale and secure the full spectrum of applications. 1. You can pick the best cloud for the job: While businesses may end up running most applications in a single cloud, there are lots of reasons to diversify and adopt a multi-cloud strategy. The major cloud infrastructure providers have individual strengths, and a well-planned multi-cloud strategy enables you to pick the cloud platform that offers the best combination of technical features, pricing and performance for each application. Some workloads that may run better on one hyperscale cloud than another include enterprise business applications, big data and high-performance computing (HPC) workloads.

IDG Contributor Network: Visibility is key for devops and the hybrid cloud

Cloud has undoubtedly become a key component of successful business in recent years, especially when you consider the race to digitally transform. Across the globe, companies are moving their applications and services to the cloud and are consequently reaping the benefits of lower capex and opex. Cloud migration, however, is only the beginning of any organization’s digital transformation (DX) journey. If harnessed correctly, cloud is a pillar of innovation for DX and can be a driving force for new business models and use cases that – even a few years ago – weren’t possible. No one knows this better than devops teams; they hold the line when it comes to continuous delivery and deployment, and it therefore stands to reason that devops plays a crucial role in the digital transformation journey. In practice, however, the decision makers in charge of cloud strategies are rarely those in the bowels of the ship.

BrandPost: Ethernet Adventures: Making Progress with an Old Friend – Good ol’ Ethernet

In the second post of this 3-part series, we unravel one hero’s journey on the road to streamlined enterprise networking operations. Ciena’s Chris Sweetapple follows Our Hero’s adventures as he realizes the opportunities of high-bandwidth business Ethernet. In episode 1, our hero, stuck in a tangle of outdated networking equipment, services and complexity, realizes that with business Ethernet as his WAN technology of choice, he can complement the existing network set-up and connect his branch offices while handling massive volumes of traffic concurrently, using high-speed (up to 100G), high-performance Ethernet connectivity.
