Plants, factories, and manufacturers in general are embracing IoT, which in turn is driving the use of artificial intelligence at the edge of corporate networks as a way to streamline industrial processes, improve efficiency and detect maintenance issues before they become problems – perhaps even big problems that could force plant shutdowns.
The cloud continues to grow in popularity as businesses look to take advantage of digital trends. However, the term “cloud” means different things to different types of organizations. In the small-business segment, cloud likely means software as a service (SaaS), as those organizations want turnkey applications offered on a pay-as-you-go model. For larger companies, cloud means public infrastructure as a service, such as Amazon Web Services or Microsoft Azure.
Private clouds are alive and kicking
For large businesses, the cloud likely means hybrid, where private data centers make up most or even all of the cloud infrastructure. The ZK Research 2018 Global Cloud Forecast projected that by 2020, more workloads would run in private clouds than in public clouds or as legacy on-premises deployments. (Note: I am an employee of ZK Research.)
Network administrators, IT managers and security professionals face a never-ending battle, constantly checking on what exactly is running on their networks and the vulnerabilities that lurk within. While there is a wealth of monitoring utilities available for network mapping and security auditing, nothing beats Nmap's combination of versatility and usability, making it the widely acknowledged de facto standard.
What is Nmap?
Nmap, short for Network Mapper, is a free, open-source tool for vulnerability scanning and network discovery. Network administrators use Nmap to identify which devices are running on their networks, discover available hosts and the services they offer, find open ports, and detect security risks.
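As a rough illustration of those tasks, a few common invocations look like this (a sketch; scanme.nmap.org is the Nmap project's sanctioned test host, the subnet is a placeholder, and you should only scan hosts and networks you are authorized to scan):

```shell
# A sketch of common Nmap invocations (assumes nmap is installed).
target="scanme.nmap.org"             # Nmap's own sanctioned test host

discover="nmap -sn 192.168.1.0/24"   # ping scan: list live hosts, no port scan
services="nmap -sV $target"          # probe open ports and identify services
osscan="sudo nmap -O $target"        # OS fingerprinting (requires root)

# Printed rather than executed so the sketch is safe to run as-is:
printf '%s\n' "$discover" "$services" "$osscan"
```

The `-sn`, `-sV` and `-O` flags map directly onto the three jobs named above: host discovery, service detection and risk assessment.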
Constructing a more adaptive network takes more than the industry finally finding the resolve to do it. There are real technology advancements that must be brought to bear. It is these advancements – nurtured in the care of those with the will and experience to disrupt the status quo – that will drive the industry forward to the reality of the Adaptive Network. Whether you are taking your first steps or planning a larger-scale transformation, this webinar series will provide you with the insight and proven experience to help you traverse this complex journey.
Linux systems provide many ways to look at disk partitions. In this post, we'll look at a series of commands, each of which shows useful information but in a different format and with a different focus. Maybe one will make your favorites list.
lsblk
One of the most useful commands is the lsblk (list block devices) command that provides a very nicely formatted display of block devices and disk partitions. In the example below, we can see that the system has two disks (sda and sdb) and that sdb has both a very small (500M) partition and a large one (465.3G). Disks and partitions (part) are clearly labeled, and the relationship between the disks and partitions is quite obvious. We also see that the system has a cdrom (sr0).
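The sample output itself didn't survive into this excerpt, but the invocations are easy to try (a sketch; lsblk comes from util-linux and is Linux-only, and device names such as sda/sdb will differ per system):

```shell
# lsblk variants (util-linux, Linux-only; output differs per machine)
lsblk 2>/dev/null                  # tree of disks and partitions with sizes
lsblk -f 2>/dev/null               # add filesystem type, label and UUID
# -o picks exactly the columns you want, handy for scripting:
out=$(lsblk -o NAME,SIZE,TYPE,MOUNTPOINT 2>/dev/null || echo "lsblk unavailable")
echo "$out"
```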
A survey from the Uptime Institute found that while data centers are getting better at managing power than ever before, the rate of failures has also increased — and there is a causal relationship.

The Global Data Center Survey report from Uptime Institute gathered responses from nearly 900 data center operators and IT practitioners, both from major data center providers and from private, company-owned data centers.

It found that the power usage effectiveness (PUE) of data centers has hit an all-time low of 1.58. By way of contrast, the average PUE in 2007 was 2.5, then dropped to 1.98 in 2011, and to 1.65 in the 2013 survey.
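PUE is simply total facility power divided by the power that reaches IT equipment, so the metric itself is trivial to compute (a sketch with illustrative wattages chosen to reproduce the survey's 1.58 figure; the hard part in practice is the measurement, not the arithmetic):

```shell
# PUE = total facility power / IT equipment power (1.0 would be perfect)
facility_kw=1580   # everything the site draws: IT, cooling, lighting, losses
it_kw=1000         # power actually delivered to servers, storage and network
pue=$(awk -v f="$facility_kw" -v i="$it_kw" 'BEGIN { printf "%.2f", f / i }')
echo "PUE: $pue"
```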
Along with the rise of cloud computing, Agile, and DevOps, the increasing use of microservices has profoundly affected how enterprises develop software. Now, at least one Silicon Valley startup hopes the combination of microservices and edge computing is going to drive a similar re-think of the Internet of Things (IoT) and create a whole new software ecosystem.

Frankly, that seems like a stretch to me, but you can’t argue with the importance of microservices to modern software development. To learn more, I traded emails with Said Ouissal, founder and CEO of ZEDEDA, which is all about “deploying and running real-time edge apps at hyperscale” using IoT devices.
Deduplication is arguably the biggest advancement in backup technology in the last two decades. It is single-handedly responsible for enabling the shift from tape to disk for the bulk of backup data, and its popularity only increases with each passing day. Understanding the different kinds of deduplication, also known as dedupe, is important for any person looking at backup technology.
What is data deduplication?
Dedupe is the identification and elimination of duplicate blocks within a dataset. It is similar to compression, but where compression only identifies redundant blocks within a single file, deduplication can find redundant blocks of data between files in different directories, of different data types, and even on different servers in different locations.
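The core idea can be demonstrated with fixed-size blocks and one hash per block (a toy sketch using GNU coreutils; real dedupe products use variable-size chunking and far more robust hash indexes):

```shell
# Toy fixed-block dedupe: hash every 4KiB block, count the unique hashes.
f=$(mktemp)
dd if=/dev/zero of="$f" bs=4096 count=8 2>/dev/null   # 8 identical zero blocks
total=$(( $(stat -c%s "$f") / 4096 ))
# split feeds each 4KiB chunk to sha256sum; identical blocks hash identically
unique=$(split -b 4096 --filter='sha256sum' "$f" | awk '{print $1}' | sort -u | wc -l)
echo "$total blocks on disk, $unique unique -> store 1 block + 7 references"
rm -f "$f"
```

Because all eight blocks are identical here, a dedupe store would keep one block plus seven small references to it, which is exactly the saving the technology delivers on real backup data.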
Dynamic Host Configuration Protocol (DHCP) is the standard way network administrators assign IP addresses in IPv4 networks, but eventually organizations will have to pick between two protocols created specifically for IPv6 as the use of this newer IP protocol grows.

DHCP, which dates back to 1993, is an automated way to assign IPv4 addresses, but when IPv6 was designed, it was provided with an auto-configuration feature dubbed SLAAC (Stateless Address Autoconfiguration) that could eventually make DHCP irrelevant. To complicate matters, a new DHCP – DHCPv6 – that performs the same function as SLAAC was independently created for IPv6.
Deciding between SLAAC and DHCPv6 isn’t something admins will have to do anytime soon, since the uptake of IPv6 has been slow, but it is on the horizon.
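The difference between the two shows up in how an address gets built. With classic SLAAC, a host can derive its own interface ID from its MAC address (the EUI-64 method): flip the universal/local bit of the first octet and insert ff:fe in the middle. A sketch (the MAC is a made-up example; modern hosts usually prefer randomized privacy addresses per RFC 4941 instead, and DHCPv6 hands out addresses from a server-managed pool):

```shell
# Derive a SLAAC EUI-64 interface ID from a MAC address (example MAC).
mac="00:11:22:33:44:55"
set -- $(echo "$mac" | tr ':' ' ')          # $1..$6 = the six octets
first=$(printf '%02x' $(( 0x$1 ^ 0x02 )))   # flip the universal/local bit
iid="${first}${2}:${3}ff:fe${4}:${5}${6}"   # splice ff:fe into the middle
echo "fe80::${iid}"                         # link-local form of the address
```

No server was involved in that computation, which is SLAAC's whole appeal, and also why admins who want centralized logging of who-has-which-address lean toward DHCPv6.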
You can’t spell IoT without IT, but that doesn’t mean IT departments are big fans of Internet of Things deployments. In fact, we’ve observed that IoT tends to exacerbate some of the challenges that IT departments face within their organizations. Understanding and anticipating IT’s primary areas of concern can help IoT deployments succeed.
Lines are blurring
Traditionally, the IT department has operated in a vacuum. The focus of IT departments has been on internal support, network management and managing enterprise applications. IoT deployments, by contrast, have traditionally been the domain of operations teams, which typically deploy point solutions to solve a business issue.
With increasing use of containers by DevOps, data-center network administrators need to respond to the distinct demands they place on the network, including scalability, predictable performance, multi-tenancy and security.
More than 120 million Microsoft Office accounts have moved from on-premises to the cloud since the launch of Microsoft Office 365. Many of those accounts belong to users in large enterprises that weren’t fully prepared for the transition. In fact, as many as 30 to 40 percent of enterprises struggle with some level of application-performance degradation as they make the shift to the cloud.

Some of the signs of poor performance (and the source of users’ frustration) include Outlook responding slowly when the user tries to open messages, VoIP calls over Skype for Business having rough spots, and documents being slow to open, close and save in Word. Performance problems in the Office applications manifest in many other ways, as well.
Not everyone thinks in binary or wants to mentally insert commas into large numbers to come to grips with the sizes of their files. So, it's not surprising that Linux commands have evolved over several decades to incorporate more human-friendly ways of displaying information to its users. In today’s post, we look at some of the options provided by various commands that make digesting data just a little easier.
Why not default to friendly?
If you’re wondering why human-friendliness isn’t the default – we humans are, after all, the default users of computers – you might be asking yourself, “Why do we have to go out of our way to get command responses that will make sense to everyone?” The answer is primarily that changing the default output of commands would likely interfere with numerous other processes that were built to expect the default responses. Other tools, as well as scripts, that have been developed over the decades might break in some very ugly ways if they were suddenly fed output in a very different format than what they were built to expect.
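In practice, the friendly switch is -h ("human-readable") on most size-reporting commands, and GNU numfmt converts raw numbers after the fact. A sketch (the paths are arbitrary examples; exact rendering depends on your coreutils version):

```shell
# The human-readable flag is -h on the usual size-reporting commands:
df -h / 2>/dev/null       # filesystem sizes as 20G, 512M rather than 1K-blocks
du -sh /etc 2>/dev/null   # one summarized, human-readable directory total
ls -lh /etc/hosts         # file sizes with K/M/G suffixes

# numfmt (GNU coreutils) converts a raw byte count on demand:
size=$(numfmt --to=iec 1048576)
echo "1048576 bytes is $size"
```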
A Cambridge, Massachusetts, research lab is addressing some of modern medicine’s most overlooked issues with cutting-edge IoT technology and an open-source approach, weaving aging devices and deeply siloed data into an accessible web of medical information.

The Medical Device Interoperability Program, or MD PnP, in affiliation with Massachusetts General Hospital and Partners Healthcare, is a hub for research into making medical devices dramatically smarter by making it simpler for them to share the data they gather.
With more and more people being monitored by IoT devices in hospitals and monitoring themselves with Fitbits and Apple Watches, there’s suddenly a lot more digital data than there was before in the world of healthcare.
How much can your Linux system tell you about the kernel it's running, and what commands are available to help you ask? Let's run through some of them.
uname
The simplest and most straightforward command for providing information on your kernel is the uname -r command. It provides a succinct answer to your question, but in a format that includes a number of fields, each of which provides a particular piece of information.
$ uname -r
4.15.0-30-generic
^ ^  ^ ^  ^
| |  | |  |
| |  | |  +-- the distribution-specific string
| |  | +----- the latest bug fix
| |  +------- the minor revision
| +---------- the major revision
+------------ the kernel version
Add an "s" and your output will include the kernel's name.
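The flags compose, so the release and name fields can be captured together for scripting (a sketch; the release string shown above came from one Ubuntu system, and yours will differ):

```shell
# uname flags compose; capture the fields for use in scripts.
rel=$(uname -r)     # kernel release, e.g. 4.15.0-30-generic
both=$(uname -rs)   # kernel name plus release, e.g. "Linux 4.15.0-30-generic"
echo "release:        $rel"
echo "name + release: $both"
```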
Need to move petabytes of data into the cloud? There’s an old-school option that works faster than network transfers: Load a physical appliance with data, then ship the appliance to a cloud provider.
Cisco is moving rapidly toward its ultimate goal of making SD-WAN features ubiquitous across its communication products, promising to boost network performance and reliability of distributed branches and cloud services.

The company this week took a giant step in that direction by adding Viptela SD-WAN technology to the IOS XE software that runs its core ISR/ASR routers. Over a million ISR/ASR edge routers, such as the ISR models 1000 and 4000 and the ASR 5000, are in use by organizations worldwide.
You can scratch the Xeon Phi off your shopping list. And if you deployed it, don’t plan on upgrades. That's because Intel has quietly killed off its high-performance computing co-processor; forthcoming Xeon chips will have all the features of the Phi, no separate chip or add-in card needed.

Intel quietly ended the life of the Xeon Phi on July 23 with a “Product Change Notification” that contained Product Discontinuance/End of Life information for the entire Knights Landing line of Xeon Phis.

The last order date for the Xeon Phi is Aug. 31, 2018, and orders are non-cancelable and non-returnable after that date. The final shipment date is set for July 19, 2019.
For years it has been normal practice for organizations to store as much data as they can. More economical storage options combined with the hype around big data encouraged data hoarding, with the idea that value would be extracted at some point in the future.

With advances in data analysis, many companies are now successfully mining their data for useful business insights, but the sheer volume of data being produced and the need to prepare it for analysis are prime reasons to reconsider your strategy. To balance cost and value, it’s important to look beyond data hoarding and to find ways of processing and reducing the data you’re collecting.