Deep Instinct’s Artificial Brain Spots Zero-Day Security Threats
Deep learning goes deeper than machine learning.
InformationWeek US IT Salary Survey shows how new technologies are impacting networking and data center pros.
Today's storage market is crowded with AFAs. Here are some shopping guidelines.
First in this series is the subject of algorithms. This topic is very interesting to me because when I first strove to understand what exactly they were, I was expecting something far more complicated than what they turned out to be. I think, shamefully, that Hollywood may have had an influence on this, as “algorithm” is one of many terms abused by “cyber” movies and the like, portrayed as some sort of ultimate cyber weapon in the war against Ellingson Mineral Company.
The reality is much simpler. An “algorithm” is defined as “a set of steps that are followed in order to solve a mathematical problem or to complete a computer process.” It really is that simple. Think of a mathematical problem that you’d need to solve yourself (ignoring for the moment that there’s likely a third-party library that has already done this).
A common example is calculating the Fibonacci sequence. Forget about writing code for a minute, and think about the problem in plain English: given a starting sequence (1, 1), how do you keep calculating and appending numbers to this sequence until you have the first N Fibonacci numbers?
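To make that concrete, here is a minimal sketch (my own illustration in Python, not code from the post) of how those plain-English steps might translate directly into code:

```python
def fibonacci(n):
    """Return the first n Fibonacci numbers, starting from the sequence (1, 1)."""
    sequence = [1, 1]
    while len(sequence) < n:
        # The next number is always the sum of the two most recent numbers.
        sequence.append(sequence[-1] + sequence[-2])
    return sequence[:n]  # trim so n=0 or n=1 also behave sensibly

print(fibonacci(8))  # [1, 1, 2, 3, 5, 8, 13, 21]
```

The algorithm is really just the loop body: “add the two most recent numbers and append the result,” repeated until the sequence is long enough.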
Historically, my background is far closer to the systems side of things, but as I’ve picked up software development experience over the past few years, I’ve come to appreciate the fundamentals of computer science that others in my shoes may not have been exposed to. With that in mind, I have been working on a pseudo-formal blog series on computer science fundamentals.
These fundamentals have a wide variety of applications. Even if you come from a more IT-focused background and don’t use graph theory or optimize algorithms in your day job, many of these concepts sit at the crux of the technologies we use every day. If, like me, you’ve become bored with the endless cycle of IT certifications, learning these concepts could be a great addition to your skill set, as you can leverage them to extrapolate details from some of the “closed” products we use from IT vendors.
Finally, it’s worth remembering that the most important part of any of this is how the knowledge is applied. As you read the posts I’ll release over the next few weeks, keep in mind that understanding how to optimize a piece of code is useful… Continue reading
News emerged today that Open vSwitch (OVS) has formally moved over to the Linux Foundation. This is something that has been discussed within the OVS community for a while, and I for one am glad to see it happen.
Why am I glad to see it happen? The project can finally shed itself of the (unfair) claims that the governance under Nicira (and later VMware) wasn’t “open enough.” These accusations persisted despite numerous indications otherwise. Thomas Graf, an OVS committer—who does not work for VMware, for the record—came to this conclusion in his OVSCon 2015 presentation:
OVS is one of the most effective and well governed open source projects I’ve worked on.
Moving to the Linux Foundation allows OVS to continue to grow and flourish without further accusations of unfair governance. The project intends to keep its existing governance model, in which technical leadership is determined by the committers, and committer status is earned through involvement in the project via code contributions and code reviews.
For more information, refer to the official Linux Foundation press release.
Bringing a new chip to market is neither simple nor cheap, and as a new wave of specialized processors for targeted workloads produces fresh startup tales, we are reminded again how risky such a business can be.
Of course, with high risk comes the potential for great reward, provided a company produces a chip that far outpaces general-purpose processors on workloads numerous enough to justify the cost of design and production. The commonly cited figure is around $50 million, but that assumes a chip requires validation, …
Deep Learning Chip Upstart Takes GPUs to Task was written by Nicole Hemsoth at The Next Platform.