Microsoft lures Win Server 2008 users toward Azure

Microsoft is offering extended support for Windows Server 2008 and SQL Server 2008 to customers who shift these platforms from on-premises into Microsoft’s Azure cloud.

The scheduled ends of extended support for the 2008 versions of Windows Server and SQL Server are Jan. 14, 2020, and July 9, 2019, respectively. But if customers move these workloads into the Azure cloud, they get three extra years of support at no extra cost beyond the price of the Azure service.

In the past, when the end-of-life clock started ticking, organizations made a mad dash to upgrade operating systems and SQL servers in order to keep their systems supported. Some organizations chose to continue running their applications completely unsupported, unpatched and un-updated – a very bad thing to do in this age of viruses, malware and cyberattacks.

Interview: Benefits of Network Automation (Part 2)

As promised, here’s the second part of my Benefits of Network Automation interview with Christoph Jaggi published in German on Inside-IT last Friday (part 1 is here).

What are some of the challenges?

The biggest challenge everyone faces when starting network automation is the snowflake nature of most enterprise networks and the million one-off exceptions we made in the past to cope with badly designed applications or unrealistic user requirements. Remember: you cannot automate what you cannot describe in enough detail.
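The "describe before you automate" point can be made concrete with a minimal sketch: network intent lives in a plain data structure, and device configuration is generated from it. The VLAN names and IDs and the rendering format below are hypothetical, not taken from the interview.

```python
# Intent as data: a hypothetical VLAN model for one switch.
VLANS = {10: "users", 20: "voice", 30: "printers"}

def render_vlan_config(vlans):
    """Render switch-style VLAN configuration lines from the data model."""
    lines = []
    for vlan_id, name in sorted(vlans.items()):
        lines.append(f"vlan {vlan_id}")
        lines.append(f" name {name}")
    return "\n".join(lines)

print(render_vlan_config(VLANS))
```

A snowflake exception that cannot be expressed in the data model is exactly the thing that blocks this kind of generation.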

Read more ...

Learning the structure of generative models without labeled data

Learning the structure of generative models without labeled data Bach et al., ICML’17

For the last couple of posts we’ve been looking at Snorkel and BabbleLabble which both depend on data programming – the ability to intelligently combine the outputs of a set of labelling functions. The core of data programming is developed in two papers, ‘Data programming: creating large training sets, quickly’ (Ratner 2016) and today’s paper choice, ‘Learning the structure of generative models without labeled data’ (Bach 2017).

The original data programming paper works explicitly with input pairs (x, y) (e.g., the chemical and disease word pairs we saw in the disease task in Snorkel), which (for me at least) confuses the presentation a little compared to the later ICML paper, which just assumes inputs x (these could of course have pair structure, but we don’t care about that at this level of detail). Also, in the original paper, dependencies between labelling functions are explicitly specified by end users (as one of four types: similar, fixing, reinforcing, and exclusive) and built into a factor graph. In the ICML paper, dependencies are learned. So I’m going to work mostly from ‘Learning the structure of generative models without labeled data’. Continue reading
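To ground the labelling-function idea, here is a toy sketch with two hypothetical labelling functions voting +1 (positive), -1 (negative), or 0 (abstain), combined by a simple unweighted majority vote. This stands in for, and is much cruder than, the learned generative model the papers actually describe.

```python
def lf_contains_causes(x):
    """Vote +1 if the sentence asserts causation; 0 means abstain."""
    return 1 if "causes" in x else 0

def lf_contains_not(x):
    """Vote -1 if the sentence looks negated; 0 means abstain."""
    return -1 if "not" in x else 0

LFS = [lf_contains_causes, lf_contains_not]

def majority_vote(x, lfs):
    """Combine labelling-function votes; ties and all-abstains yield 0."""
    total = sum(lf(x) for lf in lfs)
    if total > 0:
        return 1
    if total < 0:
        return -1
    return 0

print(majority_vote("smoking causes cancer", LFS))  # → 1
```

Data programming improves on this baseline by learning per-function accuracies (and, in the ICML paper, inter-function dependencies) so that votes are weighted rather than counted.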

Is the Linux 4.18 kernel heading your way?

How soon the 4.18 kernel lands on your system or network depends a lot on which Linux distributions you use. It may be heading your way, or you may already be using it.

If you have ever wondered whether the same kernel is used in all Linux distributions, the answer is that all Linux distributions use more or less the same kernel, but there are several big considerations that make that "more or less" quite significant.

Most distributions add or remove code to make the kernel work best for them. Some of these changes might eventually work their way back to the top of the code heap, where they will be merged into the mainstream, but in the meantime they make the distribution's kernel unique – at least for a while.

Some releases intentionally hold back and don't use the very latest version of the kernel in order to ensure a more predictable and stable environment. This is particularly true of versions targeted for commercial distribution. For example, RHEL (Red Hat Enterprise Linux) will not be updated nearly as aggressively as Fedora.

Some distributions use a fork called Linux-libre, which is Linux without any proprietary drivers built in. It omits software Continue reading
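If you want to check which kernel your system is actually running, and whether it has reached 4.18 yet, a short script can parse the release string. The sample release string in the comment is illustrative; the suffix after the version numbers varies by distribution.

```python
import platform

def kernel_at_least(major, minor):
    """Return True if the running kernel release is at least major.minor."""
    release = platform.release()  # e.g. "4.18.0-25-generic"
    parts = release.split(".")
    try:
        cur = (int(parts[0]), int(parts[1].split("-")[0]))
    except (ValueError, IndexError):
        return False  # unexpected release format
    return cur >= (major, minor)

print(platform.release(), kernel_at_least(4, 18))
```

The same check from a shell is just `uname -r`, which prints the release string this code parses.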

Excess data center heat is no longer a bug — it’s a feature!

Every data center admin knows that dealing with excess heat is one of the biggest, most expensive factors involved in running a modern data center.

For decades, engineers have been looking for new ways to mitigate the issue, and now Norway is building a brand-new town designed to turn the problem into an opportunity to lower costs, reduce energy usage, and fight climate change.

Hug your servers ... to stay warm

According to Fast Company, the town of Lyseparken, now under construction near Bergen, Norway, is being built to use the excess heat generated by a new data center in the heart of the community to keep nearly 6.5 million square feet of planned business and office space – and eventually up to 5,000 homes – warm. It works like this:

Arista BGP EVPN – Overview and Concepts

Introduction

Traditionally, data centers used lots of Layer 2 links that spanned entire racks, rows, cages, and floors, for as far as the eye could see. These large L2 domains were not ideal for a data center due to slow convergence, unnecessary broadcasts, and the difficulty of administering them. To optimize the data center network, we needed […]

The post Arista BGP EVPN – Overview and Concepts appeared first on Overlaid.

Reaction: Centralization Wins

Warning: in this post, I am going to cross a little into philosophy, governance, and other odd subjects. Here there be dragons. Let me begin by setting the stage:

Decentralized systems will continue to lose to centralized systems until there’s a driver requiring decentralization to deliver a clearly superior consumer experience. Unfortunately, that may not happen for quite some time. —Todd Hoff @High Scalability

And the very helpful diagram that accompanies the quote:

The point Todd Hoff, the author, makes is that five years ago he believed the decentralized model would win, in terms of the way the Internet is structured. Today, however, he doesn’t believe this; centralization is winning. Two points are worth considering before jumping into a more general discussion.

First, the decentralized model is almost always the most efficient in almost every respect. It is the model with the lowest signal-to-noise ratio and the model with the highest gain. The simplest way to explain this is to note that the primary cost in a network is the cost of connectivity, and the primary gain is the amount of support those connections provide. The distributed model offers the best balance of these two.

Second, what we are generally talking about here Continue reading

Learn About Social Engineering Techniques with Josue Vargas

Learn in-depth information about social engineering techniques and countermeasures in the course Certified Ethical Hacker: Social Engineering, available as a standalone download or with your INE.com All Access Pass!

Not Like in Hollywood

When we think of hacking, we often picture a grim fellow opening a laptop, typing really fast, and bam! He’s infiltrated the Pentagon. Watching those films when you know a thing or two about system and network security is a hilarious experience! If you have been following the Certified Ethical Hacker series, you know better by now. Hacking is not for the faint-hearted, not just because of the technical difficulty (there are indeed several skills to develop), but because of the resilience it requires: in most scenarios, a successful hack comes only after many failed attempts.

Because hacking into systems as an outsider is so difficult, there’s a key toolkit every hacker needs to master as thoroughly as they master sniffing, session hijacking, application hacking, or any other technical specialty: Social Engineering.

It’s a Dark World

A highly empathetic person might have a hard time with the concept of Social Engineering. It’s pretty Continue reading