The First IoT Culling: My Devices Are Dying

 

Cull: to reduce the population of (a wild animal) by selective slaughter

As an early adopter of technology, I sometimes feel like I get to live in the future. Or, as William Gibson said, "The future is already here, it's just not evenly distributed." There are a lot of benefits to be gained from this, but there are also risks. One of the biggest risks is:

How long is the product you choose going to be around?

 

I was an early adopter in the first wave of IoT devices; from wearables to home convenience gadgets, I dipped my toes in the pool early. Most of these platforms were Kickstarter projects, and I've been generally happy with them, at least the ones that were actually delivered. (That's a story for another time…)

But in the last six months, the market seems to have decided that there are just too many of these small companies.

The Death Bells are Ringing

In the last year, I've noticed a trend: many of the early platforms that I invested in seem to be disappearing. Some have been bought out and killed. Remember the Pebble watches, which …

IDG Contributor Network: What have we learned from WannaCry?

Ransomware has been a growing threat for a couple of years now. More than 4,000 ransomware attacks have occurred every day since the beginning of 2016, according to an FBI report. So it was no surprise to find it in the headlines again recently. The WannaCry ransomware attack proved to be one of the most successful and widespread to date: it took a single day to infect more than 230,000 computers across more than 150 countries.

WannaCry was able to spread so effectively because of a known vulnerability that Microsoft had patched back in March. Organizations that fell victim had failed to patch, and many lacked basic security protections and working backups. Analyzing the aftermath, it's clear that we have a problem: we already know exactly how to guard against ransomware; the problem is that many organizations aren't doing it.

Don’t Build Big Data With Bad Data

I was at Pure Accelerate 2017 this week and I saw some very interesting things around big data and the impact that high speed flash storage is going to have. Storage vendors serving that market are starting to include analytics capabilities on the box in an effort to provide extra value. But what happens when these advances cause issues in the training of algorithms?

Garbage In, Garbage Out

One story that came out of a conversation was about training a system to recognize people. In the process of training the system, the users imported a large number of faces in order to help the system start the process of differentiating individuals. The data set they started with? A collection of male headshots from the Screen Actors Guild. By the time the users caught the mistake, the algorithm had already proven that it had issues telling the difference between test subjects of particular ethnicities. After scrapping the data set and bringing in more diverse data sources, the system started performing much better.

This started me thinking about the quality of the data that we are importing into machine learning and artificial intelligence systems. The old computer adage of "garbage in, garbage out" …

50% off Grand Theft Auto V for PlayStation 4 and XBox One – Deal Alert

When a young street hustler, a retired bank robber and a terrifying psychopath find themselves entangled with some of the most frightening and deranged elements of the criminal underworld, the U.S. government and the entertainment industry, they must pull off a series of dangerous heists to survive in a ruthless city in which they can trust nobody, least of all each other. Explore the stunning world of Los Santos and Blaine County in the ultimate Grand Theft Auto V experience, featuring a range of technical upgrades and enhancements for new and returning players. The list price of this highly rated game has been reduced 50% to just $29.99. See it on Amazon.

IDG Contributor Network: The rise and rise of the IT chief of staff

At two-thirds of companies today, a senior IT leader is working on the next draft of the IT strategy, or perhaps revising a workforce plan in preparation for a transformation. After that, he or she might chase down some red flags on the monthly IT scorecard or meet with a finance leader to discuss a budget update. And before leaving for the day, this leader might draft a monthly IT team newsletter or put the finishing touches on a deck for an upcoming all-hands meeting.

Who is this person, the CIO? No, it's the IT chief of staff.

The CIO role is changing fast as CIOs get involved in product technology, help to shape digital strategy and build digital acumen across the company. The only way CIOs can find time to do these things is to delegate many of the IT management and governance tasks that have traditionally filled much of their calendars.

Improving on history—the Linux history command, that is

The Linux history command allows users to repeat commands without retyping them and to look over a list of commands they've recently used, but that's just the obvious stuff. It is also highly configurable, allows you to pick and choose what you reuse (e.g., complete commands or portions of commands), and controls what commands are recorded. In today's post, we're going to run through the basics and then explore some of the more interesting behaviors of the history command.

The basics of the Linux history command

Typing "history" and getting a list of previously entered commands is the command's most obvious use. Pressing the up arrow until you reach a command that you want to repeat and hitting enter to rerun it is next. And, as you probably know, you can also use the down arrow. In fact, you can scroll up and down your list of previously entered commands to review them or rerun them.
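The basics above can be tried in a short bash script. A minimal sketch, with one caveat: history is recorded by default only in interactive shells, so the script enables it explicitly.

```shell
#!/bin/bash
# History is on by default only in interactive shells,
# so enable it explicitly for this demonstration.
set -o history

echo "first command"  > /dev/null
echo "second command" > /dev/null

history        # prints the numbered list of commands run so far
history 1      # prints only the most recent entry

# A couple of variables that control what gets recorded:
HISTSIZE=2000            # keep up to 2000 commands in memory
HISTCONTROL=ignoredups   # skip consecutive duplicate entries
```

In an interactive shell, `!!` reruns the previous command and `!42` reruns entry 42 from the list.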

Parked electric cars will power buildings, researchers say

Power-thirsty data centers could receive a new kind of electricity supply that uses surplus juice found in idle electric vehicles (EVs). It uses the cars' batteries to augment a building's grid power.

Vehicle-to-grid (V2G) technology provides a way to store, then subsequently use, energy in commuters' cars while employees work during the day. It can also supply power when a delivery fleet is parked at night, researchers say.

Cost savings for building infrastructure could be significant in green-electricity applications, such as solar and wind, which require local electricity storage. Installing the batteries needed for green applications is expensive.

Docker and Booz Allen Hamilton Modernize Traditional Apps in Government IT

Existing applications and infrastructure account for the majority of IT spend in maintenance and support. Docker and Booz Allen Hamilton are partnering to help Federal agencies modernize traditional apps with Docker Enterprise Edition (EE) and deploy them onto modern infrastructure to reduce infrastructure and operational costs, increase security, and gain workload portability.

This program helps accelerate the path to modern microservices and infrastructure with containers:

  1. Containerize the app in place, then use the container architecture to break the app into smaller services over time.
  2. Use the full-stack portability provided by Docker EE to consolidate workloads for greater app density per server and to accelerate hardware refresh cycles and cloud migration.
  3. Apply the new levels of security Docker EE provides to the legacy app: scanning gives binary-level visibility into components and their security profiles for proactive remediation, and configurable isolation properties can greatly reduce the attack surface area.
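Step 1, containerizing the app in place, can be sketched as a minimal Dockerfile. This is an illustration only: the base image, app name, and port are assumptions, not details from the webinar.

```dockerfile
# Hypothetical example: wrapping an existing Java web app unchanged.
# The base image, WAR file name, and port are all assumed.
FROM tomcat:9-jdk11

# Drop the legacy app in as-is; no code changes are required.
COPY legacy-app.war /usr/local/tomcat/webapps/

EXPOSE 8080
# From here, services can be carved out of the monolith incrementally.
```

The point of containerizing in place is that the app keeps running exactly as it did on the VM while the team modernizes it piece by piece.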

View the webinar on demand here.

Here are some of the top Q&A from the session:

Q: What does Image2Docker exactly capture in the VM?

A: Image2Docker captures the application in the VM and strips out what can be provided by the base image or the underlying Linux/Windows kernel.

Q: When it …

What makes for a successful protocol?

What makes for a successful protocol? Which protocols are successful, and why? Have you ever been asked these questions? As an engineer, you cannot say "I believe Protocol X is successful" or "Protocol Y is not." There is no such thing as "I believe"; there should always be a science behind […]

The post What makes for a successful protocol? appeared first on Cisco Network Design and Architecture | CCDE Bootcamp | orhanergun.net.