SAP’s Data Hub Helps Enterprises Unify and Analyze Data
Data Hub doesn’t move data around, but processes it where it resides.
Data Box also joins the list of ways to transfer information to the cloud.
I’m not a big user of Apple’s Automator tool, but sometimes it’s very useful. For example, A10 Networks load balancers make it pretty easy for administrators to capture packets without having to remember the syntax and appropriate command flags for a tcpdump command in the shell. Downloading the .pcap file is pretty easy too (especially using the web interface), but what gets downloaded is not just a single file; instead, it’s a gzip file containing a tar file which in turn contains (for the hardware I use) sixteen packet capture files. In this post I’ll explain what these files are, why that’s annoying, and how I work around it in macOS.
If you’re wondering how one packet capture turned into sixteen PCAP files, that’s perfectly reasonable and the answer is simple in its own way. The hardware I use has sixteen CPU cores, fifteen of which are used by default to process traffic, and inbound flows are spread across those cores. Thus when taking a packet capture, the system actually requests each core to dump the flows matching the filter specification. Each core effectively has awareness of both the client and server sides of any connection, so both Continue reading
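The unpacking step can be sketched in the shell (and wrapped in an Automator "Run Shell Script" action). The file and directory names below are made up for the demo, not the appliance's real naming; the demo fabricates a stand-in archive so the commands are self-contained:

```shell
# Simulate the nested archive the appliance produces: a gzip-compressed
# tar holding one pcap per CPU core. Names here are illustrative only.
mkdir -p percore captures
for i in $(seq 0 15); do printf 'stub' > "percore/core${i}.pcap"; done
tar -czf capture.tar.gz -C percore .

# A single tar invocation unwraps both the gzip and the tar layers:
tar -xzf capture.tar.gz -C captures
ls captures
```

From there, a tool like Wireshark's mergecap can interleave the per-core files back into a single capture ordered by timestamp.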
MapR Technologies has been busy in recent years building out its capabilities as a data platform company that can support a broad range of open-source technologies, from Hadoop and Spark to Hive, and can reach from the data center through the edge and out into the cloud. At the center of its efforts is its Converged Data Platform, which comes with the MapR-FS POSIX file system and includes enterprise-level database and storage designed to handle emerging big data workloads.
At the Strata Data Conference in New York City Sept. 26, company officials are putting their focus …
MapR Bulks Up Database for Modern Apps was written by Nicole Hemsoth at The Next Platform.
Today we announced Geo Key Manager, a feature that gives customers unprecedented control over where their private keys are stored when uploaded to Cloudflare. This feature builds on a previous Cloudflare innovation called Keyless SSL and a novel cryptographic access control mechanism based on both identity-based encryption and broadcast encryption. In this post we’ll explain the technical details of this feature, the first of its kind in the industry, and how Cloudflare leveraged its existing network and technologies to build it.
Cloudflare launched Keyless SSL three years ago to wide acclaim. With Keyless SSL, customers are able to take advantage of the full benefits of Cloudflare’s network while keeping their HTTPS private keys inside their own infrastructure. Keyless SSL has been popular with customers in industries with regulations around the control of access to private keys, such as the financial industry. Keyless SSL adoption has been slower outside these regulated industries, partly because it requires customers to run custom software (the key server) inside their infrastructure.
One of the motivating use cases for Keyless SSL was the expectation that customers may not trust a third party like Cloudflare with their Continue reading
Cloudflare’s customers recognize that they need to protect the confidentiality and integrity of communications with their web visitors. The widely accepted solution to this problem is to use the SSL/TLS protocol to establish an encrypted HTTPS session, over which secure requests can then be sent. Eavesdropping is prevented because only those with access to the “private key” can legitimately identify themselves to browsers and decrypt encrypted requests.
Today, more than half of all traffic on the web uses HTTPS—but this was not always the case. In the early days of SSL, the protocol was viewed as slow because establishing each encrypted session required two extra round trips between the user’s browser and the web server. Companies like Cloudflare solved this problem by putting web servers close to end users and utilizing session resumption to eliminate those round trips for all but the very first request.
As Internet adoption grew around the world, with companies increasingly serving global and more remote audiences, providers like Cloudflare had to continue expanding their physical footprint to keep up with demand. As of the date this blog post was published, Cloudflare has data centers in over 55 countries, and we continue Continue reading
For operations teams, managing a serverless environment requires a fundamentally new approach.
In The Future of Networking with Fred Baker, Fred mentioned an interesting IPv6 deployment scenario: give a /64 prefix to every server to support container deployment, and run routing protocols between servers and ToR switches to advertise the /64 prefix to the data center fabric, preferably using link-local addresses.
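As a sketch, the server side of such a design could run a routing daemon like FRR and advertise its /64 to the ToR over a link-local ("unnumbered") eBGP session. The ASN, interface name, and prefix below are illustrative assumptions, not details from the podcast:

```
! FRR on the server: advertise the local container /64 to the ToR over
! a BGP unnumbered session (peering uses the interface's link-local
! address, so no numbered point-to-point subnet is needed).
router bgp 65101
 neighbor eth0 interface remote-as external
 !
 address-family ipv6 unicast
  network 2001:db8:42:1::/64
  neighbor eth0 activate
 exit-address-family
```

The appeal of this design is that the fabric only ever sees aggregated /64s per server, while container addresses within the /64 stay a purely local concern.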
Let’s recap:
Look for an AWS version to launch next year.
The Azure and Ansible teams are collaborating on several interesting projects that we want to share. And if you joined us for AnsibleFest San Francisco earlier this month, you met both teams and heard some of the news. More on that below.
If you use Ansible to manage Azure and Windows environments, then hopefully you can join us at Microsoft Ignite this week in Orlando.
Ansible’s Matt Davis will co-present with Microsoft’s Hari Jayaraman to discuss popular DevOps tools customers use to implement infrastructure-as-code processes in Azure. The Ansible team will also be in the Red Hat booth (#527) to demo automating Azure environments and answer any other questions you may have.
Session Info:
Infrastructure as Code
Friday, September 29
10:15 AM - 11:00 AM
Hyatt Regency Windermere W
Among the many announcements at AnsibleFest were 16 new Azure modules contributed by the Azure team, focused on covering the base use cases for Ansible users running workloads at scale in Azure.
New modules were added to manage Azure services:
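As a hedged illustration of what driving the azure_rm_* module family looks like in a playbook (the resource names and location below are placeholders, and these two long-standing modules stand in for the pattern rather than being confirmed members of the sixteen new ones):

```yaml
# Minimal playbook sketch; values are illustrative, not from the announcement.
- hosts: localhost
  connection: local
  tasks:
    - name: Ensure a resource group exists
      azure_rm_resourcegroup:
        name: demo-rg
        location: eastus

    - name: Ensure a virtual network exists in the group
      azure_rm_virtualnetwork:
        resource_group: demo-rg
        name: demo-vnet
        address_prefixes: "10.0.0.0/16"
```

Running this requires Azure credentials (via environment variables or a credentials profile), which is why it is shown here as a sketch rather than a ready-to-run example.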
It’s been a week of jubilation: The Internet Society celebrated 25 years of advocacy for an open, globally-connected, and secure Internet with events that crisscrossed the globe. The festivities kicked off at the University of California Los Angeles campus where in 1969 the first message was sent over ARPANET – the Internet’s predecessor.
On 18 September, the 25 Under 25 award ceremony honored young people around the world for their extraordinary work. Born in the age of the Internet, these everyday heroes are passionate about using it to make a positive impact on their communities. Their projects include connecting people with disabilities to employment opportunities, using AI to identify fake news, and humanizing issues affecting refugees and the LGBT community.
Learn more about the 25 Under 25 awardees
Watch the 25 Under 25 Award Ceremony
Just a few hours later, the 2017 Internet Society Global Internet Report: Paths to Our Digital Future was launched. The interactive report, the result of in-depth interviews, roundtables, and surveys conducted in 160 countries and 21 regions around the world, offers a glimpse into how the future of the Internet might impact humanity. The report encourages you to explore paths to our digital future, asks thought-provoking Continue reading