Rushing to the Now

Forget the (predictable) predictions for 2016. What’s here and happening right now? Perhaps, hiding behind the cloud (check) of ignorance, the rotting corpse of media disinterest or the red lit distractions of modern life, are things that may soon be obvious to all. Here’s my view of What Lies Beneath the fog, the decomposing bodies and those that […]

A Beginner’s Guide to Scaling to 11 Million+ Users on Amazon’s AWS

How do you scale a system from one user to more than 11 million users? Joel Williams, Amazon Web Services Solutions Architect, gives an excellent talk on just that subject: AWS re:Invent 2015 Scaling Up to Your First 10 Million Users.

If you are an advanced AWS user this talk is not for you, but it’s a great way to get started if you are new to AWS, new to the cloud, or if you haven’t kept up with the constant stream of new features Amazon keeps pumping out.

As you might expect from a talk by Amazon, Amazon services are always front and center as the solution to any problem. Their platform play is impressive and instructive. It's obvious from how the pieces all fit together that Amazon has done a great job of mapping out what users need and then making sure they have a product in that space.

Some of the interesting takeaways:

  • Start with SQL and only move to NoSQL when necessary.
  • A consistent theme is to take components and separate them out. This allows those components to scale and fail independently. It applies to breaking up tiers and creating microservices; see the sketch after this list.
  • Only invest in tasks […]
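As a toy illustration of that decoupling theme (my own sketch, not taken from the talk), here is a minimal Python example, assuming boto3 is installed, AWS credentials are configured, and the queue name and message payload are made up: the web tier drops work onto an SQS queue and a separate worker tier drains it, so each tier can scale and fail on its own.

```python
import json

import boto3

# Hypothetical queue; SQS sits between the web tier (producer) and the worker tier (consumer).
sqs = boto3.client("sqs", region_name="us-east-1")
queue_url = sqs.create_queue(QueueName="thumbnail-jobs")["QueueUrl"]

# Web tier: enqueue the work instead of doing it inline, then return to the user immediately.
sqs.send_message(
    QueueUrl=queue_url,
    MessageBody=json.dumps({"s3_key": "uploads/photo-123.jpg"}),
)

# Worker tier: a separate fleet polls the queue and can be scaled (or restarted) independently.
response = sqs.receive_message(
    QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=20
)
for message in response.get("Messages", []):
    job = json.loads(message["Body"])
    print(f"resizing {job['s_key' if False else 's3_key']}")  # stand-in for the real work
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])
```

If the worker fleet falls over, messages simply wait in the queue; the web tier never notices, which is exactly the independent-failure property the talk keeps coming back to.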

The Incident Response “Fab Five”

I’ve been focused on security analytics for several years and spent a good part of 2015 investigating technologies and methodologies used for incident response.  Based upon lots of discussions with cybersecurity professionals and a review of industry research, I’ve come up with a concept I call the incident response “fab five.”  Enterprise organizations with the most efficient and effective incident detection and response tend to establish best practices and synchronization in five distinct areas:

Host monitoring.  This centers on understanding the state and activities of host computers.  Host monitoring tends to concentrate on Windows PCs, but may also include oversight of Macs, Linux systems, servers, and even cloud-based workloads.  Historically, host monitoring was based upon log collection and analysis, but SOC managers are also embracing open source EDR tools (i.e. GRR, MIG, etc.) as well as commercial forensic offerings (i.e. Carbon Black, CounterTack, Hexis Cyber Solutions, Guidance Software EnCase, RSA ECAT, Tanium, etc.).  The trend is toward collecting, processing, and analyzing more host forensic data in real time.

Network monitoring.  Beyond network logs, I see leading-edge organizations collecting and analyzing a combination of flow and PCAP data.  Think of technologies […]
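To make the network monitoring point concrete, here is a minimal sketch (my own illustration, not any vendor's product), assuming Scapy is installed and a hypothetical capture file named capture.pcap is on disk; it boils a PCAP down to a crude flow summary of the kind those tools produce at scale.

```python
from collections import Counter

from scapy.all import IP, TCP, UDP, rdpcap

# Hypothetical capture file; any PCAP taken from a span port or tap would do.
packets = rdpcap("capture.pcap")

flows = Counter()
for pkt in packets:
    if IP not in pkt:
        continue
    proto = "TCP" if TCP in pkt else "UDP" if UDP in pkt else str(pkt[IP].proto)
    flows[(pkt[IP].src, pkt[IP].dst, proto)] += 1

# Print the ten busiest source/destination pairs as a rough flow-level view of the capture.
for (src, dst, proto), count in flows.most_common(10):
    print(f"{src} -> {dst} [{proto}]: {count} packets")
```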

When Novell tapped David Bowie’s ch-ch-changes for an ad campaign

Networking and computing vendors have a long history of using famous songs to help market their offerings, and also have a tradition of reinventing themselves over and over. So it's no surprise that David Bowie's Changes would wind up in at least one major ad campaign.

The music and fashion icon's death on Sunday at the age of 69 reminded me of that $60 million Novell "The Power to Change" marketing campaign that debuted on Monday Night Football back in the year 2000.

99 Problems and Configuration and Telemetry Ain’t Two

Isn’t SNMP just great? I love monitoring my network using an unreliable transport mechanism and an impenetrable and inconsistent data structure. Configuring my devices using automation is equally fun, where NETCONF has been subverted into something so ridiculously vendor-specific (and again, inconsistent), that each new device type (even from a single vendor) can mean starting again from scratch. Is there any hope for change? OpenConfig says yes.

Monitoring The Network

Love it or hate it (hate it), SNMP remains the de facto standard for alerting and monitoring. I think we cling on to SNMP as an industry because we’re scared that any replacement will end up being just as clunky, and we’d simply be putting expensive lipstick on a particularly ugly pig. If we want to get rid of SNMP, whatever comes next will need to bring significant benefits.
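For readers who have not had the pleasure, this is roughly what that clunky polling looks like in practice. A minimal sketch, assuming the pysnmp library and a hypothetical SNMPv2c device at 192.0.2.1 with the community string public:

```python
from pysnmp.hlapi import (
    SnmpEngine, CommunityData, UdpTransportTarget, ContextData,
    ObjectType, ObjectIdentity, getCmd,
)

# Poll a single OID (sysUpTime) from a hypothetical device over UDP/161.
error_indication, error_status, error_index, var_binds = next(
    getCmd(
        SnmpEngine(),
        CommunityData("public", mpModel=1),      # SNMPv2c
        UdpTransportTarget(("192.0.2.1", 161)),  # documentation address, swap in a real device
        ContextData(),
        ObjectType(ObjectIdentity("SNMPv2-MIB", "sysUpTime", 0)),
    )
)

if error_indication:
    print(f"polling failed: {error_indication}")
else:
    for name, value in var_binds:
        print(f"{name} = {value}")
```

Multiply that by every counter on every device, polled over UDP on a fixed cycle, and you can see why the post is looking for something better.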

Configuring the Network

If you’re dedicated to making changes manually, it’s likely you don’t care much about the mechanisms currently available to automate configuration changes. However, I can assure you that writing scripts to make changes to network device configurations is a frustrating activity, especially in a multi-vendor environment. I should add that I consider automating CLI commands and screen-scraping the […]
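On the configuration side, here is a minimal NETCONF sketch, assuming the ncclient library and the same hypothetical device (the address and credentials are placeholders). Note that the XML that comes back is typically vendor-specific, which is exactly the inconsistency the post complains about and OpenConfig models aim to smooth over.

```python
from ncclient import manager

# Hypothetical device and credentials; NETCONF usually listens on TCP/830 over SSH.
with manager.connect(
    host="192.0.2.1",
    port=830,
    username="admin",
    password="admin",
    hostkey_verify=False,
) as m:
    # Retrieve the running configuration. The payload is defined by the device's
    # own YANG models, so it differs from vendor to vendor (and sometimes from
    # platform to platform within a single vendor).
    reply = m.get_config(source="running")
    print(reply.data_xml)
```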

Take them seriously — you could change the world

We often think that because we’re engineers, squirreled away in the basement suite (we used to have a fireproof suit hanging in the basement elevator as a little joke on the IT world at one job), we can’t have a huge impact on people. Or maybe it’s because you don’t think you’re famous enough — you don’t have a blog, several books published, multiple speaking engagements, and you don’t work for some big vendor. Whatever the reason for thinking you don’t — or shouldn’t — have an impact in someone’s life, let me say this.

You’re wrong.

The impact of one person can hardly be overstated; from a book I read recently, for instance:

I turned and walked out of his office, closing the door with the characteristic rattle of the frosted glass pane. Though I could not have put it into words then, I was a different person from the one who had walked into that office ten minutes earlier. A person for whom I had the highest regard had taken me seriously. If he thought I was worthy of an hour of his time every week, then just maybe I was worth something. -Michael Card, The Walk

The […]

FCC: 10 percent of Americans still lack access to proper broadband

Last week, we reported on the strides Internet service providers in the United States have made to improve broadband connection speeds, but noted how ISPs still have a lot of catching up to do. Case in point: As Engadget reported Friday, a new Federal Communications Commission report shows that as of 2014, roughly 10 percent of Americans still didn’t have access to a broadband Internet connection that meets the FCC’s minimum definition of broadband (25 megabits per second download; 3Mbps upload—a standard that the agency set in early 2015).

The Microsoft Exchange Server settings you must get right

Microsoft has invested millions of dollars into Azure and Office 365, and its competitors are following suit with bona fide public cloud offerings of their own. But public cloud solutions are not for everyone. Organizations of many stripes have legitimate reasons for not wanting their restricted data on systems beyond their total control.

For many of these entities, on-premises Exchange Server is a messaging must. Microsoft continues to update the software with the assurance that any improvements made to its cloud-based stack will eventually trickle down. Increasingly, these features are adding layers of complexity to the already daunting task of running an enterprise-grade messaging system. It's easy to get lost when going through hardware capacity planning, setting up DAGs (database availability groups) and site resiliency, configuring mail routing, and making sure your users can actually connect to the system.

Why Syncsort introduced the mainframe to Hadoop

When you think of leaders in big data and analytics, you’d be forgiven for not listing Syncsort among them. But this nearly 50-year-old company, which began selling software for the decidedly unglamorous job of optimizing mainframe sorting, has refashioned itself into a critical conduit by which core corporate data flows into Hadoop and other key big data platforms. Syncsort labels itself "a freedom fighter" liberating data and dollars -- sometimes millions of dollars -- from the stranglehold of big iron and traditional data warehouse/analytics systems.

In this installment of the IDG CEO Interview Series, Chief Content Officer John Gallant spoke with Josh Rogers, who was named CEO this week, as well as outgoing CEO Lonne Jaffe, who remains as Senior Advisor to Syncsort’s board. Among other topics, the pair talked about why Syncsort was recently acquired by Clearlake Capital Group, and how Syncsort’s close partnership with Splunk is dramatically improving security and application performance management.

Your license plate: Window to your life

Big Brother watching you is bad enough. But Big Brother allowing hackers to watch you as well is worse. And that is increasingly the case, thanks to the indiscriminate, and insecure, collection of vehicle license plate data, according to recent reports from the Electronic Frontier Foundation (EFF) and the alt-weekly DigBoston.

The technology at issue is Automated License Plate Readers (ALPR) – cameras mounted on patrol cars or stationary roadside structures like utility poles that record not just the plate number, but metadata including the date, time and location of the vehicle.

EFF reported late last year that it had found, “more than a hundred ALPR cameras were exposed online, often with totally open Web pages accessible by anyone with a browser.” Those cameras were in several Louisiana communities; in Hialeah, Florida; and at the University of Southern California.

OED tools: Chocolatey

The problem: Installing software on Windows and keeping it updated is a boring, repetitive task. Linux and BSD/OSX users can install software from packages and keep it updated with a simple apt-get update; apt-get upgrade command. Wouldn’t it be great to have the same feature on Windows? The automation: Chocolatey is a package manager for […]

Lenovo Thinkpad T420: Another excellent, inexpensive Linux laptop

For the past three years, I have been using a Lenovo Thinkpad T400 as my main platform for researching open-source network simulators and emulators. The T400 is an excellent, inexpensive computer that still offers great value today.

But I need a computer that supports high-resolution external monitors, so it must have a DisplayPort output. I also want to expand the number of VMs I can run concurrently with adequate performance, so I need a processor that supports HyperThreading. Finally, I want to switch to the Ubuntu Linux distribution, and the Ubuntu Unity desktop environment needs just a bit more processing power to run smoothly.

I recently purchased a used Lenovo Thinkpad T420 laptop, which offers everything I want and more. It is a five-year-old product, but it offers all the ports and performance I need. Because it is well past its depreciation curve, anyone can purchase a used T420 for a very low price. Read on to learn more about the Lenovo Thinkpad T420, another excellent and inexpensive Linux platform.

The Lenovo Thinkpad T420

The Lenovo Thinkpad T420 is a business-class notebook produced in 2011 that was leased in large volumes by companies for use by their employees. Now, […]