Take these security books with you on vacation

Why spend your beach time this summer reading fictional mystery novels when real-world mysteries are swirling through the cyber sphere? BAE Systems has put together a summer reading list for cyber security professionals. It includes titles that cover the international underworld of money laundering, the greatest criminal minds in hacking, insights into how cyber criminals think, the impact of potential cyber attacks and cyber wars on mission-critical targets, as well as practical advice and business lessons on cyber security.

30 days in a terminal: Days 2-5 — Social media in shell

My adventures, for the first day or so, in using nothing but a Linux terminal proved to be mostly successful. I ended up needing to jump through a few hoops to get my work done, but everything was doable. After spending the full weekend within the confines of the shell, my results are much more of a mixed bag. In this article, I focus on social media: Twitter, Reddit, that sort of thing. For some of them, I have totally awesome solutions. For others, I'm still struggling to find a solution, with very little hope in sight.
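
One way to keep social media inside the terminal is a short script rather than a full-screen client. As a hedged illustration (not necessarily one of the tools the article settles on), the Python sketch below pulls a subreddit's hot posts using the praw library; the client_id, client_secret, and user_agent values are placeholders you would replace with your own Reddit API credentials.

    # Minimal sketch: read Reddit from a plain terminal session using praw.
    # The credentials below are placeholders, not real values.
    import praw

    reddit = praw.Reddit(
        client_id="YOUR_CLIENT_ID",          # placeholder
        client_secret="YOUR_CLIENT_SECRET",  # placeholder
        user_agent="terminal-reader/0.1",
    )

    # Print the current hot posts from r/linux, one per line.
    for post in reddit.subreddit("linux").hot(limit=10):
        print(f"{post.score:>5}  {post.title}")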

DARPA wants to design an army of ultimate automated data scientists

Because of a plethora of data from sensor networks, Internet of Things devices and big data resources, combined with a dearth of data scientists to effectively mold that data, we are leaving many important applications – from intelligence to science and workforce management – on the table. It is a situation the researchers at DARPA want to remedy with a new program called Data-Driven Discovery of Models (D3M). The goal of D3M is to develop algorithms and software that help overcome the data-science expertise gap by enabling non-experts to construct complex empirical models through automation of large parts of the model-creation process. If successful, researchers using D3M tools will effectively have access to an army of “virtual data scientists,” DARPA stated.
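
To make the "automation of large parts of the model-creation process" idea concrete, here is a hypothetical Python sketch of the kind of step D3M wants to automate: trying several scikit-learn models on a dataset and keeping the one with the best cross-validated score. This is only an illustration of the concept, not DARPA's tooling; the candidate models and dataset are arbitrary choices.

    # Hypothetical sketch of automated model selection, in the spirit of D3M.
    # Not DARPA's software; just a toy illustration of automating model choice.
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)

    candidates = {
        "logistic_regression": LogisticRegression(max_iter=1000),
        "random_forest": RandomForestClassifier(n_estimators=100),
        "svm": SVC(),
    }

    # Score every candidate with 5-fold cross-validation and keep the best.
    scores = {name: cross_val_score(model, X, y, cv=5).mean()
              for name, model in candidates.items()}
    best = max(scores, key=scores.get)
    print(f"best model: {best} (accuracy {scores[best]:.3f})")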

E-book buyers will soon get settlement payments from Apple price-fixing case

Some buyers of e-books will begin to receive payments Tuesday as part of a settlement in a price-fixing case against Apple. People who purchased e-books between April 1, 2010, and May 21, 2012, will receive credits from e-book sellers, or will get a check if they opted out of receiving credits, according to Hagens Berman Sobol Shapiro, a consumer-rights class-action law firm involved in the lawsuit against Apple. As part of Apple's settlement of the case, e-book buyers will receive US$6.93 for every purchase that was a New York Times bestseller and $1.57 for every other e-book. The settlement covers e-books purchased from Apple as well as from other retailers, including Amazon.com and Barnes & Noble.
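
The per-title amounts translate directly into a buyer's credit. As a quick hedged illustration, with made-up purchase counts, the arithmetic looks like this:

    # Illustrative only: a hypothetical settlement credit computed from the
    # per-title amounts reported above. The purchase counts are made up.
    BESTSELLER_CREDIT = 6.93   # US$ per NYT-bestseller e-book
    OTHER_CREDIT = 1.57        # US$ per other e-book

    bestsellers_bought = 4
    other_books_bought = 10

    credit = bestsellers_bought * BESTSELLER_CREDIT + other_books_bought * OTHER_CREDIT
    print(f"estimated credit: ${credit:.2f}")   # $43.42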

Getting to the point of dual homing

I wonder how many times I’ve seen this sort of diagram across the many years I’ve been doing network design?

[Figure: dual-homing]

It’s usually held up as an example of how clever the engineer running the network is about resilience. “You see,” the diagram asserts, “I’m smart enough to purchase connectivity from two providers, rather than one.”

Can I point something out? Admittedly it might not be all that obvious from the diagram, but… Reality is just about as likely to squish your network connectivity like a bug on a windshield as it is any other network's. Particularly if both of these connections are in the same regional area. The tricky part is knowing, of course, what a “regional area” might happen to mean for any particular provider.

The problem with this design is very basic, and tied to the concept of shared link risk groups. But let me start someplace a little simpler than that—with the basic, and important, point that putting fiber in the ground, and maintaining fiber that’s in the ground, is expensive. Unless you live in Greenland, fiber can be physically buried pretty easily (fiber in Greenland is generally buried with dynamite by a blasting crew, or Continue reading
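
To see why "two providers" alone is not enough, here is a hedged Python sketch of the shared-risk-link-group idea: two circuits that ride the same conduit or regional facility share a risk group, and one failure of that shared element takes both down. The provider and SRLG names are invented for illustration.

    # Toy sketch: why dual homing fails when both links share a risk group.
    # The provider and SRLG names below are invented for illustration.
    links = {
        "provider_a": {"conduit_17", "metro_ring_east"},
        "provider_b": {"conduit_17", "metro_ring_west"},
    }

    def surviving_links(failed_srlg):
        """Return the links that stay up when a shared element fails."""
        return [name for name, groups in links.items() if failed_srlg not in groups]

    # A backhoe cuts conduit_17, which both providers happen to ride:
    print(surviving_links("conduit_17"))       # [] -- both links go down together
    print(surviving_links("metro_ring_east"))  # ['provider_b'] -- real diversity helps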

Does your smartphone embarrass you?

Modern computer and internet technology is amazing, allowing us to do an incredible number of things that simply weren't possible before, both individually and as part of larger organizations. But anyone who works with computers and mobile devices knows that everything isn't perfect. Too often, computer systems are frustratingly hard to use. And now, the Nielsen Norman Group has identified a new problem stemming from sub-optimal user interfaces: computer-assisted embarrassment. Earlier this month, Susan Farrell described the phenomenon this way: “Smart devices have invaded our world and inserted themselves in almost every context of our existence. Their flaws and faulty interactions are no longer only theirs—they reflect badly on their users and embarrass them in front of others.”

Intel Knights Landing Yields Big Bang For The Buck Jump

The long wait for volume shipments of Intel’s “Knights Landing” parallel X86 processors is over, and at the International Supercomputing Conference in Frankfurt, Germany, the company is unveiling the official lineup of the Xeon Phi chips that are aimed at high performance computing and machine learning workloads alike.

The lineup is uncharacteristically simple for a Xeon product line, which tends to have a lot of different options turned on and off to meet the myriad requirements of features and price points that a diverse customer base usually compels Intel to support. Over time, the Xeon Phi lineup will become more complex, with

Intel Knights Landing Yields Big Bang For The Buck Jump was written by Timothy Prickett Morgan at The Next Platform.

How big data is changing the game for backup and recovery

It's a well-known fact in the IT world: Change one part of the software stack, and there's a good chance you'll have to change another. For a shining example, look no further than big data. First, big data shook up the database arena, ushering in a new class of "scale-out" technologies. That's the model exemplified by products like Hadoop, MongoDB, and Cassandra, where data is distributed across multiple commodity servers rather than packed into one massive one. The beauty there, of course, is the flexibility: To accommodate more petabytes, you just add another inexpensive machine or two rather than "scaling up" and paying big bucks for a bigger mammoth.
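
As a hedged sketch of what "scale out" means in practice, the snippet below spreads records across a set of commodity nodes by hashing their keys; adding a node to the list adds capacity without buying a bigger machine. It is a toy illustration, not how Hadoop, MongoDB, or Cassandra actually place data.

    # Toy sketch of scale-out placement: records are spread across commodity
    # nodes by key hash. Not the real placement logic of Hadoop/MongoDB/Cassandra.
    import hashlib

    nodes = ["node-1", "node-2", "node-3"]   # add "node-4" to grow capacity

    def node_for(key: str) -> str:
        """Pick the node responsible for a key by hashing it."""
        digest = int(hashlib.sha256(key.encode()).hexdigest(), 16)
        return nodes[digest % len(nodes)]

    for key in ["user:42", "order:9001", "sensor:eu-west:7"]:
        print(key, "->", node_for(key))

Real systems use consistent hashing or range partitioning so that adding a node moves only a fraction of the data, but the basic idea of spreading keys across cheap machines is the same.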

Montreal wins Intelligent Community of the Year

COLUMBUS, Ohio -- Montreal, Quebec, was named “Intelligent Community of the Year” this week at the annual Intelligent Community Forum (ICF) Summit. In the face of economic decline and political scandals, Canada’s largest French-speaking city began its turnaround with a Smart City plan starting in 2011. The city, home to a tenth of Canada’s population, had endured trade losses, an eclipse of manufacturing, and years of separatist nostalgia. The new Montreal staked its future on a broader economic base of Information and Communications Technologies (ICT), aerospace, health sciences, and clean technologies. These sectors now field 6,250 companies employing 10% of the region’s workforce.

Docker 1.12: Now with Built-in Orchestration!

Three years ago, Docker made an esoteric Linux kernel technology called containerization simple and accessible to everyone.  Today, we are doing the same for container orchestration. 

Container orchestration is what is needed to transition from deploying containers individually on a single host to deploying complex multi-container apps on many machines. It requires a distributed platform, independent from infrastructure, that stays online through the entire lifetime of your application, surviving hardware failures and software updates. Orchestration is at the same stage today as containerization was three years ago. There are two options: either you need an army of technology experts to cobble together a complex ad hoc system, or you have to rely on a company with a lot of experts to take care of everything for you, as long as you buy all your hardware, services, support, and software from them. There is a word for that: lock-in.
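
At its core, an orchestrator keeps a declared desired state (how many replicas of each service) reconciled against what is actually running across the machines. The Python sketch below is a toy version of that reconciliation loop, not Docker's implementation; the service names and replica counts are invented.

    # Toy reconciliation loop: the heart of container orchestration.
    # Not Docker's code; just an illustration of "desired vs. actual state".
    desired = {"web": 3, "api": 2}   # replicas we declared
    actual = {"web": 1, "api": 2}    # replicas currently running (one web task died)

    def reconcile(desired, actual):
        """Emit the actions needed to converge actual state onto desired state."""
        actions = []
        for service, want in desired.items():
            have = actual.get(service, 0)
            if have < want:
                actions.append(f"start {want - have} task(s) of {service}")
            elif have > want:
                actions.append(f"stop {have - want} task(s) of {service}")
        return actions

    print(reconcile(desired, actual))   # ['start 2 task(s) of web']

In Docker 1.12's swarm mode, a loop of this kind runs on the manager nodes, which reschedule tasks when a machine fails.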

Docker users have been sharing with us that neither option is acceptable. Instead, you need a platform that makes orchestration usable by everyone, without locking you in. Container orchestration would be easier to implement, more portable, secure, resilient, and faster if it were built into the platform.

Starting with Docker 1.12, Continue reading

Announcing the Docker for Mac and Windows Public Beta

Back in March, we launched a private beta for a new ambitious project called Docker for Mac and Docker for Windows. Our major goal was to bring a native Docker experience to Mac and Windows, making it easier for developers to work with Docker in their own environments. And thousands agreed. Over thirty thousand applied in the first 24 hours. And by last week, we had let in over seventy thousand.

And now all you need to get started developing is Docker and a text editor. No more installing dependencies and runtimes just to debug applications.

Continue reading

Introducing the Docker for AWS and Azure Beta

Today, we’re excited to announce Docker for AWS and Docker for Azure: the best ways to install, configure and maintain Docker deployments on AWS and Azure.

Our goals for Docker for AWS and Azure are the same as for Docker for Mac and Windows:

  • Deploy a standard Docker platform to ensure teams can seamlessly move apps from developer laptops to Docker staging and production environments, without risk of incompatibilities or lock-in.
  • Integrate deeply with underlying infrastructure to make sure Docker takes advantage of the host environment’s native capabilities and exposes a familiar interface to administrators.   
  • Deploy the Docker platform to all the places where you want to run containerized apps, simply, efficiently, and at no extra cost.
  • Make sure the latest and greatest Docker versions are available for the hardware, OSs, and infrastructure you love, and provide solid upgrade paths from one Docker version to the next.

Continue reading