The digital world is becoming ever more visual. From webcams and drones to closed-circuit television and high-resolution satellites, the number of images created every day keeps growing, and in many cases these images need to be processed in real or near-real time.
This is a demanding task along multiple axes, requiring both computation and memory. Single-machine environments often lack sufficient memory to process large, high-resolution streams in real time, while multi-machine environments add communication and coordination overhead. Essentially, the issue is that hardware configurations are often optimized along a single axis. This could be computation (enhanced with accelerators like GPGPUs or …
Apache Kafka Gives Large-Scale Image Processing a Boost was written by Nicole Hemsoth at The Next Platform.
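The excerpt stops short of the Kafka-specific details, but the general pattern behind the title is streaming encoded frames through a Kafka topic so a pool of consumers can share the processing load across machines. Below is a minimal sketch of that pattern, assuming kafka-python, a hypothetical "frames" topic, and JPEG-encoded frames; none of these specifics come from the article itself.

```python
# Rough sketch: cameras publish encoded frames to a Kafka topic, and
# workers in a consumer group share the processing. Topic name, broker
# address, and serialization are assumptions, not details from the article.
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(bootstrap_servers="localhost:9092")

def publish_frame(camera_id: str, jpeg_bytes: bytes) -> None:
    # Keying by camera keeps each camera's frames ordered within a partition.
    producer.send("frames", key=camera_id.encode(), value=jpeg_bytes)

def process(key: bytes, frame: bytes) -> None:
    # Placeholder for the actual image analysis (decode, detect, ...).
    print(f"camera={key.decode()} bytes={len(frame)}")

# Each worker joins the same consumer group; Kafka spreads partitions
# across the group, so adding consumers scales processing horizontally.
consumer = KafkaConsumer(
    "frames",
    bootstrap_servers="localhost:9092",
    group_id="image-workers",
)
for message in consumer:
    process(message.key, message.value)
```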
It’s uncommon to find an organization that succeeds in building a private OpenStack-based cloud. It’s extremely rare to find one that documented and published the whole process like Paddy Power Betfair did with their OpenStack Reference Architecture whitepaper.
I was delighted to see that they decided to do many of the things I had been preaching about for ages in blog posts, webinars, and most recently in my Next Generation Data Center online course.
Highlights include:
Read more ...

One thing that puts a lot of network engineers off NETCONF and YANG is the complexity of the device configuration process. Even the simplest change involves multiple tools and requires some knowledge of XML. In this post I will show how to use simple, human-readable YAML configuration files to instantiate YANG models and push them down to network devices with a single command.
Continue reading
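As a rough illustration of the workflow that post describes, the sketch below loads a YAML file, renders it into the standard ietf-interfaces YANG model as XML, and pushes it to a device in a single NETCONF edit-config call. It assumes ncclient and PyYAML plus a hypothetical interfaces.yml and device credentials; the post's own tooling may well differ.

```python
# Minimal sketch: turn a YAML description of interfaces into
# ietf-interfaces XML and push it over NETCONF with ncclient.
import yaml
from ncclient import manager

# interfaces.yml (hypothetical):
# interfaces:
#   - name: GigabitEthernet0/0/0/1
#     description: uplink to core
with open("interfaces.yml") as f:
    data = yaml.safe_load(f)

# Render the YAML data into the ietf-interfaces YANG model as XML.
entries = "".join(
    "<interface><name>{name}</name>"
    "<description>{description}</description></interface>".format(**i)
    for i in data["interfaces"]
)
config = (
    '<config xmlns="urn:ietf:params:xml:ns:netconf:base:1.0">'
    '<interfaces xmlns="urn:ietf:params:xml:ns:yang:ietf-interfaces">'
    f"{entries}"
    "</interfaces></config>"
)

# Push the rendered configuration with a single <edit-config> operation.
with manager.connect(
    host="192.0.2.1", port=830, username="admin", password="admin",
    hostkey_verify=False,
) as m:
    m.edit_config(target="running", config=config)
```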