D-Wave Makes Quantum Leap with Reverse Annealing

The art and science of quantum annealing, which seeks a best-of-all-worlds answer to difficult optimization questions, has been well understood for years (even if implementing it in a computational device took time). But that area is now being turned on its head, all for the sake of achieving more nuanced results that balance the best of quantum and classical algorithms.

This new approach to quantum computing is called reverse annealing, something that has been on the research wish-list at Google and elsewhere but is now a reality on the newest D-Wave 2000Q (2048-qubit) hardware. The company described
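Mechanically, reverse annealing starts from a known classical state, anneals backward to an intermediate point to re-introduce quantum fluctuations, pauses there, and then anneals forward again to read out a refined answer. A toy Python sketch of building such a schedule (the function and parameter names are illustrative, not D-Wave's actual API):

```python
def reverse_anneal_schedule(s_target, hold_us, ramp_us=5.0):
    """Build a reverse-annealing schedule as (time_us, s) pairs.

    The anneal fraction s starts at 1.0 (a classical state), is lowered
    to s_target to re-introduce quantum fluctuations, held there, and
    raised back to 1.0 so the refined classical answer can be read out.
    """
    if not 0.0 < s_target < 1.0:
        raise ValueError("s_target must lie strictly between 0 and 1")
    return [
        (0.0, 1.0),                     # begin in a known classical state
        (ramp_us, s_target),            # anneal backward
        (ramp_us + hold_us, s_target),  # pause to explore nearby states
        (2 * ramp_us + hold_us, 1.0),   # anneal forward again
    ]

schedule = reverse_anneal_schedule(s_target=0.45, hold_us=20.0)
```

On real hardware a list like this would be handed to the sampler along with the initial classical state; the specific timing values above are arbitrary.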

D-Wave Makes Quantum Leap with Reverse Annealing was written by Nicole Hemsoth at The Next Platform.

Red Hat Throws Its Full Support Behind Arm Server Chips

The gatekeeper to Arm in the datacenter has finally swung that gate wide open.

Red Hat has always been a vocal supporter of Arm’s efforts to migrate its low-power architecture into the datacenter. The largest distributor of commercial Linux has spent years working with other tech vendors and industry groups like Linaro to build an ecosystem of hardware and software makers to support Arm systems-on-a-chip (SoCs) in servers and to build standards and policies for products that are powered by the chips. The company was a key player in the development of the Arm Server Base System Architecture (SBSA) specification

Red Hat Throws Its Full Support Behind Arm Server Chips was written by Jeffrey Burt at The Next Platform.

Cray picks Cavium processors for ARM-based supercomputers

Cray has picked Cavium’s ThunderX2 processor for its first ARM-based supercomputer, quite a win for the little guy, coming just a week after the 800-pound gorilla that is Qualcomm formally introduced its own ARM-based server processor, the Centriq. The Cavium ThunderX2 processor is based on the 64-bit Armv8-A architecture and will be used in the Cray XC50 supercomputer. Cray customers will get a complete ARM-based supercomputer with all of the company’s software tools, including the Cray Linux Environment, the Cray Programming Environment, and Arm-optimized compilers, libraries, and tools for running today’s supercomputing workloads.

How To Access Devices with Unsupported SSL Ciphers

With the POODLE attack effectively killing off SSLv3 (and the Heartbleed bug shaking confidence in TLS stacks generally), and vulnerabilities in cipher block chaining ruling out another whole swathe of SSL ciphers, network engineers may have found themselves trying to connect to a device and either getting no response (Safari) or getting a response like this (Chrome):

Chrome SSL Error

Or this (Firefox):

Firefox SSL Error

Once upon a time, it was possible to go into settings and enable the old, insecure ciphers again, but in more recent updates, those ciphers no longer exist within the code and are thus inaccessible. So what to do? My answer was to try a proxy.
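Outside the browser, another option is a client that deliberately re-enables the old ciphers. A Python sketch of such a permissive TLS client context (use this only on a trusted management network; the device address below is hypothetical):

```python
import ssl
import urllib.request

# Build a deliberately permissive TLS context for talking to legacy gear.
# "@SECLEVEL=1" relaxes OpenSSL's modern cipher restrictions; certificate
# checks are disabled because old devices typically present self-signed
# certs. Never use a context like this for general web browsing.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.set_ciphers("DEFAULT:@SECLEVEL=1")
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE
ctx.minimum_version = ssl.TLSVersion.TLSv1  # allow old protocol versions too

# Hypothetical legacy device on a management network:
# with urllib.request.urlopen("https://192.0.2.10/", context=ctx) as resp:
#     print(resp.status)
```

Whether the connection ultimately succeeds still depends on which ciphers the local OpenSSL build retains; truly ancient ciphers may be compiled out entirely, which is exactly what pushed me toward a proxy.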

Charles Proxy

The first proxy I looked at seemed promising. Although not free, Charles Proxy offers a 30-day free trial, which seemed like a good thing to try. The trial is additionally limited to running for 30 minutes at a time before the application has to be relaunched, but for my testing purposes that was not a problem.

During installation I declined to give Charles Proxy permission to configure the system proxy settings. Instead, I manually updated just my Firefox browser to use the proxy, which was now listening on 127.0.0.1:8888. Since I was making an SSL connection, I also Continue reading

Forrester predicts what’s next for IoT

What’s in store for the Internet of Things (IoT) in 2018? That’s the question on many people’s minds in the fast-growing IoT industry. One set of answers can be found in a new Forrester report, Predictions 2018: IoT Moves From Experimentation To Business Scale. According to Forrester and published reports last week, that journey means many things, but apart from the usual superheated speculation about IoT’s incredible growth and increasing impact, here’s what I think is most interesting.

Samsung Invests in Cray Supercomputer for Deep Learning Initiatives

One of the reasons this year’s Supercomputing Conference (SC) is nearing attendance records has far less to do with traditional scientific HPC and much more to do with growing interest in deep learning and machine learning.

Since the supercomputing set has pioneered many of the hardware advances required for AI (and some software and programming techniques as well), it is no surprise new interest from outside HPC is filtering in.

On the subject of pioneering HPC efforts, one of the industry’s longest-standing companies, supercomputer maker Cray, is slowly but surely beginning to reap the benefits of the need for this

Samsung Invests in Cray Supercomputer for Deep Learning Initiatives was written by Nicole Hemsoth at The Next Platform.

AMD charges back into the HPC fray with new systems

After years of watching its presence shrink on the Top 500 supercomputer list, AMD is battling back with a new set of EPYC-based server processors and specially tuned GPUs for high-performance computing (HPC) in a complete server system. The company and its partners announced new servers with the EPYC 7601 processor, which it claims is three times more performance-efficient than Intel’s best Xeon server processor, the Xeon Platinum 8180M, as measured by the SPECfp benchmark. The news came at the Supercomputing ’17 show taking place in Denver. Target workloads for AMD solutions include machine learning, weather modeling, computational fluid dynamics, simulation and crash analysis in aviation and automotive manufacturing, and oil and gas exploration, according to the company.

The Super Secret Cloudflare Master Plan, or why we acquired Neumob

We announced today that Cloudflare has acquired Neumob. Neumob’s team built exceptional technology to speed up mobile apps, reduce errors on challenging mobile networks, and increase conversions. Cloudflare will integrate the Neumob technology with our global network to give Neumob truly global reach.

It’s tempting to think of the Neumob acquisition as a point product added to the Cloudflare portfolio. But it actually represents a key part of a long-term “Super Secret Cloudflare Master Plan”.

CC BY 2.0 image by Neil Rickards

Over the last few years Cloudflare has been building a large network of data centers across the world to help fulfill our mission of helping to build a better Internet. These data centers all run an identical software stack that implements Cloudflare’s cache, DNS, DDoS protection, WAF, load balancing, rate limiting, etc.

We’re now at 118 data centers in 58 countries and are continuing to expand with a goal of being as close to end users as possible worldwide.

The data centers are tied together by secure connections which are optimized using our Argo smart routing capability. Our Quicksilver technology enables us to update and modify the settings and software running across this vast network in seconds.

Continue reading

IDG Contributor Network: How patchable software can secure the IoT

As the Internet of Things continues to grow, delving further into every corner of our markets and societies, the ability to secure it from malevolent attackers and massive data breaches will become ever more vital to its survival. Today’s IoT security landscape is a confused mess, with vulnerabilities running rampant and precious little being done to make it more secure. So how can IoT experts and tech enthusiasts alike contribute to a safer IoT? The answer lies in patchable software. By embracing more industry standards and fostering the greater implementation of patchable software, IoT enthusiasts can ensure that this much-beloved connectivity phenomenon lives on to serve us for years to come.
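What does "patchable" require in practice? At minimum, a device must be able to verify that an update really came from its vendor before applying it. A minimal Python sketch of that check, using an HMAC as a dependency-free stand-in for a real asymmetric signature (all keys and payloads here are hypothetical):

```python
import hashlib
import hmac

def verify_update(image: bytes, tag: bytes, key: bytes) -> bool:
    """Return True only when the firmware image carries a valid tag.

    Real devices should use asymmetric signatures (e.g. Ed25519) so the
    signing key never ships on the device; an HMAC with a shared key
    stands in here to keep the sketch self-contained.
    """
    expected = hmac.new(key, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

key = b"device-provisioned-key"        # hypothetical shared secret
image = b"firmware-v2.bin-contents"    # hypothetical update payload
tag = hmac.new(key, image, hashlib.sha256).digest()

assert verify_update(image, tag, key)
assert not verify_update(image + b"tampered", tag, key)
```

A device that rejects unauthenticated images in this way can safely accept over-the-air patches for the lifetime of the product, which is the whole point of the standards push described above.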

Integrating Docker EE Into Société Générale’s Existing Enterprise IT Systems

Société Générale is a 153-year-old French multinational bank that believes technology and innovation are key to enriching the customer experience and advancing economic development. A few years ago, the bank started a project to define its next-generation application platform, one that would help it get 80% of its applications running in the cloud by 2020. Société Générale chose Docker Enterprise Edition (Docker EE) to be the foundation of that platform and began working with it 15 months ago. This year at DockerCon Europe, Stephan Dechoux, DevOps architect, and Thomas Boussardon, Middleware Specialist, shared their journey over those 15 months integrating Docker EE into Société Générale’s IT systems.

You can watch their breakout session here:

A New Platform For Today and Tomorrow

Société Générale has a diverse application portfolio that includes many different types of applications, including legacy monolithic apps, SOA, distributed apps and REST APIs. The bank is also a global organization with teams and data centers around the world. A primary goal was to deliver a new application platform to improve time-to-market and lower costs, while accelerating innovation. Initially Société Générale considered off-the-shelf PaaS solutions, but realized that these were better suited for greenfield applications Continue reading

Deploy360 at IETF 100, Day 3: SIDR, TLS & Crypto

This week is IETF 100 in Singapore, and we’re bringing you daily blog posts highlighting some of the topics that Deploy360 is interested in. After the focus on IPv6 & IoT during the first couple of days, we’re switching tack today with a focus on routing and crypto matters.

We’re having to wait until after lunch, but then there’s a choice of UTA, SIDROPS or ROLL at 13.30 SGT/UTC+8.

UTA will be focusing on resolving the final IESG comments on the draft covering the use of TLS for email submission and access, which outlines current recommendations for using TLS to provide confidentiality of email traffic between a mail user agent and a mail access server. Next up for discussion are the open issues on a draft related to Strict Transport Security (STS) for mail (SMTP) transfer agents and mail user agents, before consideration of a draft on an option to require TLS for SMTP.
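The STS-for-mail approach works by having a domain publish a small text policy declaring that its mail exchangers must be reached over TLS. A Python sketch of what such a policy and a parser for it might look like (the policy values and host names are hypothetical; the authoritative format is whatever the draft specifies):

```python
# A policy in the spirit of the STS-for-mail draft (host names hypothetical):
POLICY = """\
version: STSv1
mode: enforce
mx: mx1.example.com
mx: mx2.example.com
max_age: 604800
"""

def parse_sts_policy(text: str) -> dict:
    """Parse key/value policy lines; repeated 'mx' keys accumulate."""
    policy = {"mx": []}
    for line in text.splitlines():
        key, _, value = line.partition(":")
        key, value = key.strip(), value.strip()
        if key == "mx":
            policy["mx"].append(value)
        elif key:
            policy[key] = value
    return policy

parsed = parse_sts_policy(POLICY)
```

In "enforce" mode a sending server that fetched this policy would refuse to deliver mail to the domain over an unencrypted or unauthenticated connection, closing the downgrade hole left by opportunistic STARTTLS.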


NOTE: If you are unable to attend IETF 100 in person, there are multiple ways to participate remotely.


Over in SIDROPS, there will be a review of the status of BGP Origin Validation deployment in RENATA, the Colombian National Research and Education Network. This represents the first wide-scale deployment Continue reading
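Origin validation itself reduces to checking each BGP announcement against the set of published Route Origin Authorizations (ROAs). A toy Python sketch of the RFC 6811 classification (the ROA entries are made up for illustration, not RENATA's real announcements):

```python
import ipaddress

# Toy table of Route Origin Authorizations: (prefix, max_length, origin ASN).
ROAS = [
    (ipaddress.ip_network("192.0.2.0/24"), 24, 64500),
    (ipaddress.ip_network("198.51.100.0/22"), 24, 64501),
]

def validate_origin(prefix: str, origin_asn: int) -> str:
    """Classify an announcement per RFC 6811: valid / invalid / not-found."""
    net = ipaddress.ip_network(prefix)
    covered = False
    for roa_net, max_len, asn in ROAS:
        if net.subnet_of(roa_net):
            covered = True  # some ROA covers this prefix
            if net.prefixlen <= max_len and asn == origin_asn:
                return "valid"
    # Covered by a ROA but matching none of them means the origin is wrong
    # (or the announcement is too specific); no covering ROA means unknown.
    return "invalid" if covered else "not-found"
```

Routers deploying origin validation typically drop or de-preference "invalid" routes while treating "not-found" as business as usual, which is what makes incremental rollouts like RENATA's possible.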

IDG Contributor Network: The benefits of multi-cloud computing

Its application might be a tough concept to grasp, but the idea of multi-cloud computing is a simple one: it is the choice of a business to distribute its assets, redundancies, software, applications, and anything else it deems worthy not across one cloud-hosting environment but across several. At first glance, this concept might seem to rub against the grain a bit. For security purposes alone, keeping all your company’s proverbial eggs in one basket appears to be the best way to keep your information from leaking. Plus, many cloud-hosting companies offer perks and discounts when your company uses their services exclusively. However, the model of using multiple cloud services to house your business’s functions and features has an impressive list of advantages that can provide security, flexibility, cost-effectiveness, and more to increase your business’s efficiency and ensure it stays up and running 24 hours a day.

Your online freedoms are under threat – 2017 Freedom on the Net Report

As more people get online every day, Internet freedom is facing a global decline for the seventh year in a row.

Today, Freedom House released their 2017 Freedom on the Net report, one of the most comprehensive assessments of countries’ performance regarding online freedoms. The Internet Society is one of the supporters of this report. We think it brings solid and needed evidence-based data in an area that fundamentally impacts user trust.

Looking across 65 countries, the report highlights several worrying trends, including:

  • manipulation of social media in democratic processes
  • restrictions of virtual private networks (VPNs)
  • censoring of mobile connectivity
  • attacks against netizens and online journalists

Elections prove to be particular tension points for online freedoms (see also Freedom House’s new Internet Freedom Election Monitor). Beyond the reported trend towards more sophisticated government attempts to control online discussions, the other side of the coin is an increase in restrictions to Internet access, whether through shutting down networks entirely, or blocking specific communication platforms and services.

These Internet shutdowns are at risk of becoming the new normal. In addition to their impact on freedom of expression and peaceful assembly, shutdowns generate severe economic costs, affecting entire economies [1] and Continue reading

Another Reason to Run Linux on Your Data Center Switches

Arista’s OpenFlow implementation doesn’t support TLS encryption. Usually that’s not a big deal, as there aren’t that many customers using OpenFlow anyway, and those that do hopefully do it over a well-protected management network.

However, lack of OpenFlow TLS encryption might become an RFP showstopper… not because the customer would really need it but because the customer is in CYA mode (we don’t know what this feature is or why we’d use it, but it might be handy in a decade, so we must have it now) or because someone wants to eliminate certain vendors based on some obscure missing feature.

Read more ...