Over the last few years, I have sprawled across so many technologies that I have forgotten where my roots began in the data center world. So I decided to delve deeper into what's prevalent and headed straight to Ivan Pepelnjak's EVPN webinar, hosted by Dinesh Dutt. I have known of the distinguished Dinesh since he was chief scientist at Cumulus Networks, and to me he is a leader in this field. Before reading his book on EVPN, I decided to give Dinesh a call to exchange views about the beginnings of EVPN. We talked about the practicalities and limitations of the data center. Here is an excerpt from our discussion.
Welcome to Agility City! Let me set the scene: In the castle, the Wonderful Wizard orchestrates networks in beautiful and powerful ways. Point-to-point tunnel connections are heralded as "architectural wonders," though decades ago they were called bridges, with disdain. Meanwhile, the Wicked Witch of the West brews a primordial potion of complexity hidden behind curtains of automated provisioning. Packets of information are heavily laden with unnecessary information and double encryption. It almost makes you want Dorothy Gale to appear and click her ruby red slippers: "There's no place like home. There's no place like home." If only we would start talking about true networking, and not the orchestration of bridges.
Every project management tool seeks to do the same thing: keep teams connected, on task, and on deadline to get major initiatives done. But the market is getting crowded, and for good reason: no platform seems to have gotten the right feel for what people need to see and how that information should be displayed so that it is both actionable and contextualized. That's why monday.com is worth a shot. The platform is based on a simple but powerful idea: as humans, we like to feel we're contributing to a greater good, an idea that sometimes gets lost in the shuffle as we focus on the details of getting stuff done. So projects are put onto a task board (think of it as a digital whiteboard) where everyone has the same visibility into everyone else's tasks. That transparency breaks down the silos between teams that cause communication errors and costly project mistakes, and it's a beautiful, simple way to connect people to the processes that drive big business initiatives forward.
One of the most basic things to understand in backup and recovery is the concept of backup levels and what they mean. Without a proper understanding of what they are and how they work, companies can adopt bad practices that range from wasted bandwidth and storage to missing important data in their backups. Understanding these concepts is also crucial when selecting new data-protection products or services.
Full backup
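The distinction between backup levels can be sketched in a few lines of code. This is an illustrative example rather than anything from the article: a full backup copies every file, while an incremental backup copies only files modified since the previous backup's timestamp.

```python
import os

def select_files(root, last_backup_time=None):
    """Return the files a backup job would copy.

    With last_backup_time=None this models a full backup (every file);
    with a timestamp it models an incremental backup (only files whose
    modification time is newer than the last run).
    """
    selected = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            # Full backup takes everything; incremental compares mtimes.
            if last_backup_time is None or os.path.getmtime(path) > last_backup_time:
                selected.append(path)
    return selected
```

Missing or mis-set modification times are one way real incremental schemes silently skip data, which is the kind of bad practice the article warns about.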
Ready or not, the upgrade to an important Internet security operation may soon be launched. Then again, it might not. The Internet Corporation for Assigned Names and Numbers (ICANN) will meet the week of September 17 and will likely decide whether to give the go-ahead on its multi-year project to upgrade the top pair of cryptographic keys used in the Domain Name System Security Extensions (DNSSEC) protocol, commonly known as the root zone key signing key (KSK), which secures the Internet's foundational servers.
Changing these keys and making them stronger is an essential security step, in much the same way that regularly changing passwords is considered a practical habit for any Internet user, ICANN says. The update will help prevent certain nefarious activities, such as attackers taking control of a session and directing users to a site that might, for example, steal their personal information.
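As background, each DNSKEY, including a root KSK, is identified by a short "key tag" (the 2017 root KSK is commonly referenced by tag 20326). The tag is derived from the record's wire-format RDATA using the checksum in RFC 4034, Appendix B. A minimal sketch of that calculation, offered here only as illustration of how keys get their identifiers, not as part of ICANN's rollover procedure:

```python
def dnskey_key_tag(rdata: bytes) -> int:
    """Compute the RFC 4034 (Appendix B) key tag over DNSKEY RDATA.

    The RDATA is treated as a sequence of big-endian 16-bit words:
    even-indexed bytes are high octets, odd-indexed bytes low octets.
    The carry is folded back in and the result truncated to 16 bits.
    """
    ac = 0
    for i, byte in enumerate(rdata):
        ac += byte << 8 if i % 2 == 0 else byte
    ac += (ac >> 16) & 0xFFFF  # fold the carry back in
    return ac & 0xFFFF
```

Because the tag changes whenever the key material changes, resolver operators can watch for the new tag to confirm they have picked up the rolled key.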
Future 5G-based wireless networking equipment and data center equipment will combine antennas and the corresponding radio guts into one microprocessor unit, researchers from the Georgia Institute of Technology say. Integrating all of the wireless elements needed in a radio will reduce waste heat and allow better modulation, according to the group, which has been working on a one-chip, multiple-transmitter-and-receiver package design. Longer transmission times and better data rates will result, they say. “Within the same channel bandwidth, the proposed transmitter can transmit six- to ten-times higher data rate,” says Hua Wang, an assistant professor in Georgia Tech's School of Electrical and Computer Engineering, in a news article about the idea on the university's website.
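A higher data rate in the same channel bandwidth means packing more bits into each symbol, and Shannon's capacity formula puts a ceiling on how far that can go for a given signal quality. The sketch below is a generic illustration of that relationship, not a model of the Georgia Tech design; the bandwidth and SNR figures in the usage are arbitrary examples.

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon channel capacity: C = B * log2(1 + SNR).

    Holding bandwidth fixed, capacity grows only logarithmically with
    SNR, which is why denser modulation demands a much cleaner signal.
    """
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 100 MHz channel at SNR 15 (about 11.8 dB) tops out at
# 400 Mb/s; raising SNR to 1023 (about 30 dB) only reaches 1 Gb/s.
```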
Many businesses today are moving customer-facing websites and applications to the cloud—and rightfully so. Cloud computing allows enterprises to reduce infrastructure costs and spend more time focusing on revenue generation and business growth. But cloud computing requires a shift in thinking about how to ensure high-quality user experiences and repeat business. Simply going live with a cloud deployment isn't enough. You also need to embrace openness and think about what happens outside the walls of your cloud provider's data center. Here are three steps all businesses can take to help make sure customers have a speedy, positive experience when accessing cloud-based websites and applications.
Fully baked products weren't the only technologies on display at the VMworld conference in Las Vegas this week; VMware previewed three in-the-works projects related to edge computing, artificial intelligence, and enterprise blockchain. The first is Project Dimension, which aims to deliver the functionality of VMware's cloud offerings to the edge as a managed service. Project Dimension will combine the elements of VMware Cloud Foundation, including software-defined services for compute, storage, network, and security, along with cloud management capabilities, in a hyperconverged form factor operated by VMware.
Just as VMware Cloud on AWS manages a customer's infrastructure in the Amazon cloud, Project Dimension will manage a customer's on-premises data center and edge locations, such as branch offices and warehouse sites.
The semiconductor world is buzzing over the news that custom semiconductor manufacturer GlobalFoundries, the foundry born when AMD divested itself of its fabrication facilities, has made the sudden decision to drop its 7nm FinFET development program and restructure its R&D teams around “enhanced portfolio initiatives.” For now, GlobalFoundries will stick to 12nm and 14nm manufacturing. All told, approximately 5 percent of its roughly 18,000 employees will lose their jobs. But the move also sets back AMD, a GlobalFoundries customer, in its bid to get ahead of Intel, which has struggled for two years to get to 10nm and won't get there until 2020.
“The vast majority of today’s fabless customers are looking to get more value out of each technology generation to leverage the substantial investments required to design into each technology node. Essentially, these nodes are transitioning to design platforms serving multiple waves of applications, giving each node greater longevity. This industry dynamic has resulted in fewer fabless clients designing into the outer limits of Moore’s Law,” said Thomas Caulfield, who was named CEO of GlobalFoundries last March, in a statement.
The recent release of Linux kernel 4.18, followed closely by the releases of 4.18.1, 4.18.2, 4.18.3, 4.18.4, and 4.18.5, brings some important changes to the Linux landscape along with a boatload of tweaks, fixes, and improvements. While many of the more significant changes might knock the socks off developers who have been working toward these advancements for quite some time, the bulk of them are likely to go unnoticed by the broad expanse of Linux users. Here we take a look at some of the things this new kernel brings to our systems that might just make your something-to-get-a-little-excited-about list.
Code Cleanup
For one thing, the 4.18 kernel has brought about the surprising removal of nearly 100,000 lines of outdated code. That's a lot of code! Does this mean that any of your favorite features may have been ripped out? Not likely. It does mean that a lot of dead wood has been carefully expunged from the kernel, along with one significant chunk. As a result, the new kernel should take up less memory.
Getting wide-area network links up and running securely and quickly, with minimal IT irritation, has always been Cisco Meraki's strong suit. Equipping customers tasked with securely supporting more cloud applications and mobile devices, with ever more throughput and the latest connectivity options, is the chief goal behind a raft of new model additions to Cisco Meraki's MX and Z branch-office security appliances.
Meraki’s MX family supports everything from SD-WAN and Wi-Fi features to next-generation firewall and intrusion prevention in a single package.
Microsoft is offering extended support for Windows Server 2008 and SQL Server 2008 to customers who shift these platforms from on-premises into Microsoft's Azure cloud. Extended support for the 2008 versions of Windows Server and SQL Server is scheduled to end on Jan. 14, 2020, and July 9, 2019, respectively. But if customers move these workloads into the Azure cloud, they get three extra years of support at no cost beyond the price of the Azure service. In the past, when the end-of-life clock started ticking, organizations made a mad dash to upgrade operating systems and SQL servers in order to keep their systems supported. Some organizations chose to continue running their applications completely unsupported, unpatched, and un-updated, a very bad thing to do in this age of viruses, malware, and cyberattacks.
How soon the 4.18 kernel lands on your system or network depends a lot on which Linux distribution you use. It may be heading your way, or you may already be running it. If you have ever wondered whether the same kernel is used in all Linux distributions, the answer is that, more or less, they all do, but several big considerations make that "more or less" quite significant.
Most distributions add or remove code to make the kernel work best for them. Some of these changes might eventually work their way upstream, where they will be merged into the mainline kernel, but until then they make the distribution's kernel unique -- at least for a while.
Some distributions intentionally hold back and don't use the very latest version of the kernel in order to ensure a more predictable and stable environment. This is particularly true of versions targeted for commercial distribution. For example, RHEL (Red Hat Enterprise Linux) will not be nearly as aggressively updated as Fedora.
Some distributions use a fork called Linux-libre, which is Linux without any proprietary drivers built in. It omits proprietary software and binary firmware blobs entirely.
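To see which kernel your distribution is shipping right now, you can pair the kernel release string with the distribution's self-description. A small sketch; it assumes the standard `/etc/os-release` file found on modern Linux systems and degrades gracefully elsewhere:

```python
import platform
from pathlib import Path

def kernel_and_distro():
    """Return (kernel release, distribution name).

    platform.release() gives the running kernel version string
    (e.g. something like '4.18.5-...'); the PRETTY_NAME field of
    /etc/os-release identifies the distribution that shipped it.
    """
    kernel = platform.release()
    distro = "unknown"
    os_release = Path("/etc/os-release")
    if os_release.exists():
        for line in os_release.read_text().splitlines():
            if line.startswith("PRETTY_NAME="):
                distro = line.split("=", 1)[1].strip('"')
    return kernel, distro
```

Comparing the two outputs across machines makes the point above concrete: the same distribution release pins one kernel series, while rolling distributions track the mainline far more closely.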
Every data center admin knows that dealing with excess heat is one of the biggest, most expensive factors in running a modern data center. For decades, engineers have been looking for new ways to mitigate the issue, and now Norway is building a brand-new town designed to turn the problem into an opportunity to lower costs, reduce energy usage, and fight climate change.
Hug your servers ... to stay warm
According to Fast Company, the town of Lyseparken, now under construction near Bergen, Norway, is being built to use the excess heat generated by a new data center at the heart of the community to keep nearly 6.5 million square feet of planned business and office space, and eventually up to 5,000 homes, warm.
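The scale of such a scheme is easy to estimate, because virtually every watt a server draws ends up as heat. The sketch below is a back-of-envelope illustration with made-up assumptions (the IT load, per-home heat demand, and recovery efficiency are mine, not figures from the Lyseparken plans):

```python
def homes_heatable(it_load_mw: float, avg_home_demand_kw: float = 1.5,
                   recovery_efficiency: float = 0.8) -> int:
    """Rough count of homes a data center's waste heat could warm.

    Treats the IT electrical load as the heat output, applies a
    recovery efficiency for the capture/distribution loop, and divides
    by an assumed average per-home heating demand. All parameters here
    are illustrative assumptions.
    """
    usable_heat_kw = it_load_mw * 1000 * recovery_efficiency
    return int(usable_heat_kw / avg_home_demand_kw)
```

Under these assumptions a 10 MW facility could warm on the order of 5,000 homes, the same order of magnitude as the town's eventual target.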
I’ve long felt Japan has been severely overlooked in recent years, due to two “lost decades” and China overshadowing it, and supercomputing is no exception. In 2011, Fujitsu launched the K computer at the RIKEN Advanced Institute for Computational Science campus in Kobe, Japan. Calling it a computer really is a misnomer, though, as is the case with any supercomputer these days. When I think “computer,” I think of the 3-foot-tall black tower a few feet from me making the room warm. In the case of K, it’s rows and rows of cabinets stuffed with rack-mounted servers in a space the size of a basketball court. With its distributed-memory architecture, K had 88,128 eight-core SPARC64 VIIIfx processors in 864 cabinets. Fujitsu was a licensee of Sun Microsystems’ (later Oracle’s) SPARC processor and did some impressive work on the processor on its own. When it launched in 2011, K was ranked the world's fastest supercomputer on the TOP500 list, with a computation speed of over 8 petaflops. It has since been surpassed by supercomputers from the U.S. and China.
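The article's own figures let you work out what that aggregate speed means per core. A quick sketch of the arithmetic, using only the numbers quoted above:

```python
def per_core_gflops(chips: int, cores_per_chip: int, total_pflops: float) -> float:
    """Spread system-wide throughput evenly across all cores.

    88,128 eight-core chips is 705,024 cores; dividing 8 PFLOPS by
    that count converts the headline figure to a per-core rate
    (1 PFLOPS = 1e6 GFLOPS).
    """
    total_cores = chips * cores_per_chip
    return total_pflops * 1e6 / total_cores
```

With the article's numbers this comes out to roughly 11 GFLOPS per core, a reminder that K's speed came from massive parallelism rather than exotic per-core performance.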
Imagine you are in a crowded ER, and doctors are running from room to room. In the waiting area, patients are checking in via an online portal, and hospital staff are quickly capturing their confidential medical and insurance information on a mobile device. You look down the hall, where admitted patients are receiving treatment through Wi-Fi enabled biomedical devices, and some are even streaming their favorite show on Netflix while they wait. In a hospital, there are hundreds of individuals conducting critical tasks at any given moment, and they rely on thousands of connected devices to get the job done. But this raises the question: What happens if that network fails?
Microsoft and Salesforce have separately announced plans to release some key software products as open source for anyone to use in their data centers. Microsoft plans to release its Open Network Emulator (ONE), a simulator of its entire Azure network infrastructure that it uses to find and troubleshoot problems before they cause network outages. The announcement was made by Victor Bahl, a distinguished scientist with Microsoft Research, on a Microsoft podcast.
Welcome to what we’re hoping is the first in a long string of regular updates from the world of IoT; everything from security to platform news will be fair game, and the aim is to help you be better grounded in the rapidly expanding Internet of Things space.
Schneider’s building open things
Schneider Electric, the Andover, Mass.-based building-infrastructure manufacturer, recently rolled out a new open framework for IoT implementations, dubbing the product EcoStruxure Building.
It’s a software platform that makes it easy for sensors and controllers to talk to each other, even in complicated, large-scale building projects where there could be a lot of both types of devices.
What impact is the internet of things having on enterprise networks and the way we use DNS? For many network and security pros today, the answer is “no clue,” thanks to the lack of source-address validation combined with the explosive growth of IoT, which is expected to reach more than 75 billion connected devices by 2025. From embedded sensors laced with unknown code to devices that can exfiltrate data from otherwise secure networks, IoT devices are already leading a new wave of cyberthreats: sensors designed with little or no thought toward security, network connectivity relying on default passwords, even cameras that query SQL databases. This post looks at some of those dangers, with a view to how DNS can help stop them before they do harm.
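One concrete way DNS can intervene is per-device allowlisting: an IoT device only ever needs to reach a handful of known endpoints, so a resolver can refuse (and flag) anything else, such as a camera suddenly looking up a database host. The sketch below is illustrative only; the device classes and domain names are invented examples, not anything from this post.

```python
# Hypothetical per-device-class DNS allowlist. The classes and domains
# are made-up placeholders for whatever a real deployment would learn
# from vendor documentation or observed baseline traffic.
ALLOWED_DOMAINS = {
    "camera": {"firmware.vendor.example", "ntp.example"},
    "thermostat": {"api.hvac.example", "ntp.example"},
}

def should_resolve(device_class: str, queried_name: str) -> bool:
    """Resolve only names on the device class's allowlist.

    Matches the domain itself or any subdomain of it; everything else,
    including queries from unknown device classes, is refused and can
    be logged as a potential compromise indicator.
    """
    allowed = ALLOWED_DOMAINS.get(device_class, set())
    return any(queried_name == d or queried_name.endswith("." + d)
               for d in allowed)
```

The same lookup data doubles as telemetry: a spike in refused queries from one device class is exactly the early-warning signal the post is describing.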
The selection process for this roundup started back in May, at the tail end of a previous, closely related competition: 10 Hot IoT startups to watch. AI wasn't a key selection criterion then. Some startups had it, some did not, but enough of them focused squarely on AI that it made sense to look more closely at this subsector of the overall IoT market.
This roundup considered about 20 companies from the previous contenders that had strong AI components. Notice of the search for candidates was posted on HARO, LinkedIn, Twitter, and elsewhere, and all told, just under 40 startups were considered.