The featured webinar for February 2017 is the Network Automation 101 webinar, and the featured video explains why you should be interested in network automation, covers its basics, and describes the difference between automation and orchestration.
Read more ...
4.9G math: Licensed + unlicensed = 1 Gb/s
The United Nations estimates that one in six people (in Asia and the Pacific) live with disability – that is a total of 650 million people. Persons with Disabilities (PWDs) often face barriers that restrict them from participating in society on an equal basis, including the access to, and use of, information and communication technologies (ICTs).
The BBR algorithm appears to be building critical mass of support in the Internet community which makes reading this research paper even more worthwhile.
When bottleneck buffers are small, loss-based congestion control misinterprets loss as a signal of congestion, leading to low throughput. Fixing these problems requires an alternative to loss-based congestion control. Finding this alternative requires an understanding of where and how network congestion originates.
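The contrast the paper draws can be sketched in a few lines. This is a hedged, heavily simplified illustration (not the actual BBR implementation): a loss-based sender reacts to packet loss by halving its congestion window, while a BBR-style sender sizes its sending window from a model of the path, the estimated bottleneck bandwidth times the minimum observed RTT (the bandwidth-delay product), so an isolated loss in a shallow buffer does not collapse its rate. The function names and parameters here are illustrative, not from the paper.

```python
def loss_based_cwnd(cwnd, loss_seen):
    """Classic loss-based reaction (AIMD): halve the window on loss,
    otherwise grow it additively."""
    return max(1, cwnd // 2) if loss_seen else cwnd + 1

def bbr_style_cwnd(bw_estimate_pkts_per_s, min_rtt_s, gain=2.0):
    """BBR-style sizing: derive the window from the path model
    (bandwidth-delay product), not from loss events."""
    bdp = bw_estimate_pkts_per_s * min_rtt_s
    return max(4, int(gain * bdp))

# A single loss in a shallow buffer collapses the loss-based window...
print(loss_based_cwnd(cwnd=64, loss_seen=True))
# ...while the model-based window stays matched to the path.
print(bbr_style_cwnd(bw_estimate_pkts_per_s=1000, min_rtt_s=0.05))
```

The point of the sketch is only that the two approaches take their control signal from different places: one from loss, one from measured bandwidth and RTT.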
BBR: Congestion-Based Congestion Control – ACM Queue : http://queue.acm.org/detail.cfm?id=3022184
The post Research: BBR: Congestion-Based Congestion Control – ACM Queue appeared first on EtherealMind.
I’m not known for going on rants, but lately I’ve been seeing a lot of stupid tweets from vendors that have really bothered me. So today I’ll give my best Tom Hollingsworth “networkingnerd” impression and tell you what’s on my mind. To give you an example of what the vendor marketing teams are putting out there, I give you this piece of work:
At first it seems a bit cute and funny. Oh look! It’s Star Wars! All nerds love Star Wars! I do too, just to be clear. What this kind of marketing does, though, is dumb down the customers. It insults my intelligence as a Network Architect. Hardware still matters. There still is a physical world. Almost all networking projects involve some kind of existing network, so almost all deployments are going to be brownfield to some extent. Please show me the organization that does not have an existing network and is going to deploy something like NSX or ACI as its first network. Please show me the organization that has no legacy systems or applications. Please show me the organization that develops and owns all of its applications and they are all nicely Continue reading
Like all hardware device makers eager to meet the newest market opportunity, Intel is placing multiple bets on the future of machine learning hardware. The chipmaker has already cast its Xeon Phi and future integrated Nervana Systems chips into the deep learning pool while touting regular Xeons to do the heavy lifting on the inference side.
However, a recent conversation we had with Intel turned up a surprising new addition to the machine learning conversation—an emphasis on neuromorphic devices and what Intel is openly calling “cognitive computing” (a term used primarily—and heavily—for IBM’s Watson-driven AI technologies). This is the first …
Intel Gets Serious About Neuromorphic, Cognitive Computing Future was written by Nicole Hemsoth at The Next Platform.