Cloudflare has certified under the new EU-U.S. Privacy Shield framework with the U.S. Department of Commerce.
This summer, the U.S. Department of Commerce began accepting submissions to certify under the EU-U.S. Privacy Shield framework, a new mechanism by which European companies can transfer personal data to their counterparts in the United States. By certifying under Privacy Shield, Cloudflare is taking a strong, proactive stance toward further protecting the security and privacy of our customers.
Since 1998, following the European Union’s implementation of EU Data Protection Directive 95/46/EC, companies in Europe wishing to transfer the personal data of Europeans overseas have had to ensure that the recipient of such data provides an adequate level of protection when handling this information. Until last October, American companies were able to certify under the U.S.-EU Safe Harbor Accord, which provided a legal means to accept European personal data in exchange for assurances of privacy commitments and the enactment of specific internal controls.
However, in October 2015, after Safe Harbor had been in effect for roughly fifteen years, the European Court of Justice overturned it and declared that a new mechanism for transatlantic data transfers would need Continue reading
BGP Peering is an agreement between different Service Providers: an eBGP neighborship established between them so they can exchange traffic directly without paying an upstream Service Provider. To understand BGP peering, first we must understand how networks are connected to each other on the Internet. The Internet is a collection […]
The post BGP Peering – Private, Public, Bilateral and Multilateral Peering appeared first on Cisco Network Design and Architecture | CCDE Bootcamp | orhanergun.net.
This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.
The Internet of Things (IoT) promises to produce troves of valuable, fast-moving, real-time data, offering insights that can change the way we engage with everyday objects and technologies, amplify our business acumen, and improve the efficiencies of the machines, large and small, wearable and walkable, that run our world.
But without careful, holistic forethought about how to manage a variety of data sources and types, businesses will not only miss out on critical insights but also fall behind the competition. Here’s how to get prepared to wrangle and extract meaning from all of the data that’s headed your way:
To read this article in full or to leave a comment, please click here
This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.
Everyone says there is an information security talent gap. In fact, some sources say the demand for security professionals exceeds the supply by a million jobs. Their argument is basically this: attacks are not being detected quickly or often enough, and the tools are generating more alerts than can be investigated, so we need more people to investigate those alerts.
Makes sense, right?
Wrong.
We believe that, even if companies around the world miraculously hired a million qualified InfoSec professionals tomorrow, there would be no change in detection effectiveness and we would still have a “talent gap.” The problem isn’t a people issue so much as it is an InfoSec infrastructure issue.
To read this article in full or to leave a comment, please click here
CC BY 2.0 image by Brian Hefele
Cloudflare helps customers control their own traffic at the edge. One of two products that we introduced to empower customers to do so is Cloudflare Traffic Control.
Traffic Control allows a customer to rate limit, shape, or block traffic based on the rate of requests per client IP address, cookie, authentication token, or other attributes of the request. Traffic can be controlled on a per-URI basis (with wildcards for greater flexibility), giving pinpoint control over a website, application, or API.
Cloudflare has been dogfooding Traffic Control to add more granular controls against Layer 7 DoS and brute-force attacks. For example, we've experienced attacks on cloudflare.com from more than 4,000 IP addresses sending 600,000+ requests in 5 minutes to the same URL but with random parameters. These types of attacks send large volumes of HTTP requests intended to bring down our site or to crack login passwords.
Traffic Control protects websites and APIs from similar types of bad traffic. By leveraging our massive network, we are able to process and enforce rate limiting near the client, shielding the customer's application from unnecessary load.
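As a rough sketch of the kind of per-client, per-URI rate limiting described above (purely illustrative and not Cloudflare's implementation; the window and threshold values are assumptions chosen for the example), a fixed-window counter keyed on client IP and URI might look like this:

```python
import time
from collections import defaultdict

# Illustrative fixed-window rate limiter keyed on (client IP, URI).
# WINDOW_SECONDS and MAX_REQUESTS are assumed values for illustration only.
WINDOW_SECONDS = 300
MAX_REQUESTS = 600

# (client_ip, uri) -> [window_start_time, request_count]
_counters = defaultdict(lambda: [0.0, 0])

def allow_request(client_ip, uri, now=None):
    """Return True if the request stays within the limit, False to block it."""
    now = time.time() if now is None else now
    window_start, count = _counters[(client_ip, uri)]
    if now - window_start >= WINDOW_SECONDS:
        # The previous window has expired; start a fresh one for this pair.
        _counters[(client_ip, uri)] = [now, 1]
        return True
    if count < MAX_REQUESTS:
        _counters[(client_ip, uri)][1] = count + 1
        return True
    # Over the limit: block, challenge, or shape the traffic instead.
    return False
```

A real deployment would also normalize the URI so that random query parameters map to the same rule, expire idle counters, and share state across edge locations so the limit is enforced near each client.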
To make this more concrete, let's look at a Continue reading
The post Worth Reading: What is segment routing? appeared first on 'net work.
Today's free-ranging episode discusses why organizations move to the cloud, what they gain and what they lose. The hosts also look at how network management and monitoring are changing, and at the relentless move to open networking. The post Show 308: Moving To The Cloud (And Back) appeared first on Packet Pushers.
IT officially has a new $2B company.
In the public cloud business, scale is everything – hyper, in fact – and having too many different kinds of compute, storage, or networking makes support more complex and investment in infrastructure more costly. So when a big public cloud like Amazon Web Services invests in a non-standard technology, that means something. In the case of Nvidia’s Tesla accelerators, it means that GPU compute has gone mainstream.
It may not be obvious, but AWS tends to hang back on some of the Intel Xeon compute on its cloud infrastructure, at least compared to the largest supercomputer centers and hyperscalers like …
Amazon Gets Serious About GPU Compute On Clouds was written by Timothy Prickett Morgan at The Next Platform.