Here's a simple scenario: you have some Virtual Machines (VMs) in your on-premises environment, likely in VMware vSphere or Microsoft Hyper-V. You want to either fully migrate some or all of those VMs to the AWS Cloud or you want to copy a gold image to the AWS Cloud so you can launch compute instances from that image. Simple enough.
Now, how do you do it?
Can you just export an OVA of the VM, copy it up, and then boot it? Can you somehow import the VMDK files that hold the VM's virtual drive contents? Regardless of the eventual method, how do you do it at scale for dozens or hundreds of VMs? And lastly, how do you orchestrate the process so that VMs belonging to an application stack are brought over together, as a unit?
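One answer to the VMDK question is AWS VM Import/Export, which can turn a disk image uploaded to S3 into an AMI you can launch from. A minimal sketch with the AWS CLI — the bucket name, object key, description, and task ID below are placeholders, and the post may well cover a different route (such as a migration service) for doing this at scale:

$ cat containers.json
[{
  "Description": "app-server-01 root disk",
  "Format": "vmdk",
  "UserBucket": {
    "S3Bucket": "my-import-bucket",
    "S3Key": "exports/app-server-01-disk1.vmdk"
  }
}]
$ aws ec2 import-image --description "app-server-01" --disk-containers file://containers.json
$ aws ec2 describe-import-image-tasks --import-task-ids import-ami-0123456789abcdef0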
Super fast: ISPs run by local communities or run through a partnership with a local community offer some of the fastest broadband in the U.S., a story at Vice.com says. Six of the 10 fastest ISPs in the country are either operated by local communities or are partnerships between the public and private sectors, according to a PCMag review.
Conflicting laws: Australia’s recently passed encryption law, which mandates law enforcement access to encrypted communications, may conflict with the EU’s GDPR and the U.S. CLOUD Act, according to a story at ZDNet. Australian law enforcement agencies may have trouble requiring U.S. and EU companies to decrypt data, the Law Council of Australia has said.
Tweeting is back: The government of Chad has restored access to social media after a 16-month shutdown, QZ.com reports. That’s a lot of missed likes. The government had restricted access to electronic communications for “security reasons” and in “a context of terrorist threats.”
It steals your face: Mobile phone apps that allow you to edit pictures of your face may introduce security vulnerabilities, Forbes notes. One app may upload faces to a database without users’ permission, and another app Continue reading
Today's Tech Byte podcast, sponsored by AppNeta, delves into how AppNeta provides visibility and real-time insight into network performance by monitoring the end-to-end network path, capturing packets and flows, and performing synthetic tests for an accurate measure of user experience.
The post Tech Bytes: AppNeta Blends Network Data, Synthetic Transactions For Performance Visibility (Sponsored) appeared first on Packet Pushers.
Something very interesting is happening in the Indian telecom space these days.
The Indian government is considering a new data localisation law that would require all data about Indian citizens to be stored locally, i.e., within Indian borders. It starts with the fintech companies first, and would then bring social media and other IoT companies that store data within its ambit. The Reserve Bank of India (RBI) has cheerfully given a deadline to all fintech companies to ensure that all the data they handle is stored in data centers located only in India. Ouch.
RBI has so far refused to accept the representations made by the fintech companies to relax the norms. It has ruled out the option of data mirroring while addressing the arguments about technological hurdles raised by the fintech companies, and has instead suggested that companies opt for cloud services or private clouds in order to ensure data localisation.
So, what’s data localisation? Data localisation is the practice of keeping citizens’ data in their home country for processing, storage, and collection before it is transferred internationally. It’s done to ensure the country’s data protection and privacy Continue reading
My most loathed feature of Go was the mandatory use of GOPATH: I do not want to put my own code next to its dependencies. I was not alone, and people devised tools or crafted their own Makefile to avoid organizing their code around GOPATH.
Fortunately, since Go 1.11, it is possible to use Go modules to manage dependencies without relying on GOPATH. First, you need to convert your project to a module:
$ go mod init hellogopher
go: creating new go.mod: module hellogopher
$ cat go.mod
module hellogopher
Then, you can invoke the usual commands, like go build or go test. The go command resolves imports by using the versions listed in go.mod. When it runs into an import of a package not present in go.mod, it automatically looks up the module containing that package, using its latest version, and adds it.
$ go test ./...
go: finding github.com/spf13/cobra v0.0.5
go: downloading github.com/spf13/cobra v0.0.5
?   hellogopher [no test files]
?   hellogopher/cmd [no test files]
ok  hellogopher/hello 0.001s
$ cat go.mod
module hellogopher

require github.com/spf13/cobra v0.0.5
If you want a specific version, you can Continue reading
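For instance, assuming you wanted to pin the cobra dependency to an earlier release, a go get with an explicit version should update go.mod accordingly (the version tag here is only illustrative):

$ go get github.com/spf13/cobra@v0.0.4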
Python is a great language for writing a standalone script. Getting to the result can be a matter of a dozen to a few hundred lines of code and, moments later, you can forget about it and focus on your next task.
Six months later, a co-worker asks you why the script fails and you don’t have a clue: no documentation, hard-coded parameters, nothing logged during the execution and no sensible tests to figure out what may go wrong.
Turning a “quick-and-dirty” Python script into a sustainable version, which will be easy to use, understand and support by your co-workers and your future self, only takes some moderate effort. As an illustration, let’s start from the following script solving the classic Fizz-Buzz test:
import sys

for n in range(int(sys.argv[1]), int(sys.argv[2])):
    if n % 3 == 0 and n % 5 == 0:
        print("fizzbuzz")
    elif n % 3 == 0:
        print("fizz")
    elif n % 5 == 0:
        print("buzz")
    else:
        print(n)
I find it useful to write documentation before coding: Continue reading
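The excerpt is cut off here, but the general direction such a cleanup usually takes is clear: a module docstring, argparse instead of raw sys.argv, and small testable functions. A minimal sketch along those lines — the function names and help strings are mine, not necessarily the author's:

"""Play the Fizz-Buzz game.

Print each number from START (included) to END (excluded), replacing
multiples of 3 with "fizz", multiples of 5 with "buzz", and multiples
of both with "fizzbuzz".
"""

import argparse


def parse_args():
    """Parse command-line arguments."""
    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument("start", type=int, help="first number to play")
    parser.add_argument("end", type=int, help="one past the last number to play")
    return parser.parse_args()


def fizzbuzz(n):
    """Return the Fizz-Buzz answer for a single number."""
    if n % 3 == 0 and n % 5 == 0:
        return "fizzbuzz"
    if n % 3 == 0:
        return "fizz"
    if n % 5 == 0:
        return "buzz"
    return str(n)


if __name__ == "__main__":
    options = parse_args()
    for n in range(options.start, options.end):
        print(fizzbuzz(n))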
I was listening to a nice podcast with Nick Buraglio discussing the recent BGP hijack SNAFU impacting Cloudflare (and their reaction) and while I usually totally agree with Nick, I think that he tried to be way too nice when saying (paraphrasing) “I think Cloudflare was a bit harsh - I would prefer a more community-oriented approach along the lines of how could we help you do your job better”
Read more ...
The public-cloud moves are significant as they continue what has been an ongoing cloud rethink by...
On today's Heavy Networking, sponsored by InterOptic, we explore how to extend the life of your cabling plant as you grow to 100G Ethernet. We get very nerdy on cabling, modules, lasers, and more with guests Robert Coenen, VP of Business Development at InterOptic, and Alex Latzko, lead network architect at Server Central Turing Group.
The post Heavy Networking 460: Extending The Life Of Your Cabling Plant With InterOptic (Sponsored) appeared first on Packet Pushers.
The deal marks an expansion of a long-running partnership between both companies.
The update also introduces a new Performance Tier accelerating backup, which provides double the...
Since its creation in 2011, RightsCon has gathered people from different sectors to discuss human rights in the digital age. It started as an event with a few hundred experts, but has become a major conference, with nearly 3,000 participants in 2019. The 2019 program consisted of 17 tracks focusing on major issues, which totaled more than 450 sessions held over a period of four days.
As the conference started to attract a wider group of people, it adopted a series of measures to increase its diversity. The recent host countries, including Tunisia and Costa Rica, reflect the worldwide nature of the event, which now gathers individuals from all over the globe.
RightsCon has also gathered a considerable number of young people. They’ve had the opportunity to connect not only through regular conference activities, but during a summit on Day Zero. The summit aimed to engage youth and also brief them on the discussions taking place during the rest of RightsCon.
The sessions at RightsCon were designed in different formats, which was reflected in the physical layout of the meeting rooms. They were organized not just in an audience format but also as roundtables, allowing people to feel on an equal footing Continue reading
Weekly Wrap for July 19, 2019: Verizon and Ericsson trial a cloud-native core; AT&T wants to...
Bitfusion’s technology is focused on the virtualization of hardware accelerators like graphic...
Microsoft banked $11 billion in commercial cloud revenue during the quarter and Azure revenue,...
The idea of a 10x engineer is just too good to be true. In this Short Take I look at the recent controversy and share my thoughts about the Twitter thread that sparked it all. Here's a hint: there's no such thing as a 10x engineer, but there are some things to be learned from what the author had to say.
The post 10x Engineers – Don’t Believe The Hype appeared first on Network Collective.