Telepresence has become a smart business strategy, especially for companies that are spread across multiple sites or that have clients in many locations to deal with on a fairly regular basis. Using what is in essence a fairly simple robot, anyone can transport themselves to another location, move around through offices and interact face to face with people they might not otherwise ever meet. Granted, they’re going to look something like a large iPad held up by a couple of metal rods riding on top of a self-propelled vacuum cleaner, but the experience is still surprisingly effective.

I recently had a chance to transport myself using one of the Beam presence systems built by Suitable Technologies. I sat in my office in the mountains of Virginia while being transported to an office suite in Palo Alto, California, where I interacted with two members of the staff. I had previously spoken with one of the company’s customers at yet another location to get a feel for how they were using their Beams.
It’s zombie season again! Not only was The Walking Dead back with new episodes this month, but neighborhoods around the country are about to be crawling with zombies (most can be staved off with a little chocolate).

In business, unfortunately, zombie season has been in full swing for some time. This is an era of digital disruption, and it has completely changed the way business is done, but not everyone has gotten on board. Companies are persisting with outdated business models, investing in outdated products and committing to outdated delivery methods. To me, these companies are zombies, dead without knowing it. They may be moving forward, but don’t let the motion fool you: they’re only moving toward obsolescence.
While the public cloud continues to grow at the expense of on-premises data centers, not everything that moves to the cloud stays there. Some data comes back, for a variety of reasons. And while apps are moving to the cloud at a rapid clip, data is not.

Those are the findings of a recent report by 451 Research commissioned by Schneider Electric, entitled “Customer Insight: Future-Proofing Your Colocation Business.” It finds, for example, that global operational square footage hosting cloud infrastructure will grow at a 16 percent compound annual growth rate (CAGR) from 2017 to 2020, while the share of on-premises enterprise data center capacity will drop four percentage points, from 77 to 73 percent, over the same period.
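As a rough illustration of what the 16 percent CAGR cited above implies, here is a minimal sketch using nothing beyond the standard compound-growth formula (the values are normalized, not figures from the report):

```python
# Compound annual growth rate (CAGR): project a quantity forward
# at a fixed annual rate. Illustrative only; the 16 percent rate
# is the one cited in the 451 Research report.
def grow(initial, rate, years):
    """Return `initial` compounded at `rate` for `years` periods."""
    return initial * (1 + rate) ** years

# At 16 percent CAGR over the three years from 2017 to 2020,
# cloud-hosting square footage grows by roughly 56 percent:
factor = grow(1.0, 0.16, 3)
print(round(factor, 2))  # 1.56
```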
Storage has long been one of the biggest line items on the IT budget. Rightly so, given that data is a valuable asset and the lifeblood of every business today. Critical applications consume data as quickly as they get it, and many companies are also using their data to find new insights that help them develop novel products and strategies.

Regardless of how hot a file is when it is created, in time its use cools. As a business matures, more and more cool, cold and even frigid data piles up. With analytics now offering new insights on old data, however, no one wants to delete old files or send them to offline archival storage. This means buying more and more capacity is a given, as certain as death and taxes.
reimagine /riːɪˈmadʒɪn/
To reinterpret something imaginatively – in other words, in a creative and innovative way
The word “reimagine” is one of those words loved by marketing people and often loathed by engineers. But in the context of this column, I think it is appropriate. “Reimagine” should be close to every engineer’s heart, as it is at the essence of what we all love: solving problems in a creative and innovative way.

Over the last decade or two, we have witnessed a great deal of creativity and innovation in how we build networks and deliver communication services. We have witnessed the rise of Ethernet and IP, and how these two protocols laid the foundation for the common networking paradigm we take for granted today. We have witnessed the rise of the IP-based internet and how it has dramatically affected every imaginable service. We have witnessed the rise of cloud computing and how it has, in a sense, completed the disruption that the introduction of the internet first promised.
No advance in information technology in the past six decades has offered a greater range of quantifiable benefits than virtualization. Many IT professionals think of virtualization in terms of virtual machines (VMs) and their associated hypervisors and operating-system implementations, but that only skims the surface. An increasingly broad set of virtualization technologies, capabilities, strategies and possibilities is redefining major elements of IT in organizations everywhere.

Virtualization definition
Examining virtualization in a broader context, we define it as the art and science of making the function of an object or resource, simulated or emulated in software, identical to that of the corresponding physically realized object. In other words, we use an abstraction to make software look and behave like hardware, with corresponding benefits in flexibility, cost, scalability, reliability, and often overall capability and performance, across a broad range of applications. Virtualization, then, makes “real” that which is not, applying the flexibility and convenience of software-based capabilities and services as a transparent substitute for the same functions realized in hardware.
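To make that abstraction concrete, here is a deliberately minimal sketch (not any vendor’s implementation) of the core idea: software exposing the interface a piece of hardware would, while the realization underneath is something else entirely.

```python
# Minimal illustration of the virtualization idea: a "virtual block
# device" exposes the read/write-by-block interface of a physical
# disk, but its backing store is just an in-memory dict. Callers see
# hardware-like behavior; the realization is pure software.
class VirtualBlockDevice:
    def __init__(self, block_size=512):
        self.block_size = block_size
        self.blocks = {}  # sparse backing store: block number -> bytes

    def write_block(self, n, data):
        if len(data) != self.block_size:
            raise ValueError("data must be exactly one block")
        self.blocks[n] = bytes(data)

    def read_block(self, n):
        # Unwritten blocks read back as zeros, like thin provisioning.
        return self.blocks.get(n, bytes(self.block_size))

disk = VirtualBlockDevice()
disk.write_block(7, b"x" * 512)
print(len(disk.read_block(0)))  # 512
```

The same pattern, applied to CPUs, networks and entire machines rather than a toy disk, is what the rest of this discussion is about.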
Understanding user experience is becoming critically important to the success of all companies. I’ve interviewed dozens of business leaders about their digital transformation initiatives, and I can sort those initiatives into two larger buckets: increasing workforce productivity and improving customer experience. The two may seem unrelated, beyond both using digital technologies, but there is another point of commonality: applications play a key role in each.

By 2020, customer experience will be the #1 brand differentiator, topping price, product or any other metric you can think of. While a web or mobile app experience isn’t the only thing that creates a good or bad experience, it’s often the first touch point for customers, and a bad one could drive them away.
A few weeks back I told you how white box vendors, the makers of unbranded, mostly Chinese-made servers that compete with HP Enterprise and Dell EMC, were taking a sizable chunk of business from the brand-name vendors.

Well, now HPE has made it official and announced it will no longer try to sell commodity hardware, the cheap, low-end servers used in abundance in public-facing data centers, to tier 1 customers like Amazon, Facebook, Google and Microsoft.

HPE president Antonio Neri made the announcement at HPE’s analyst day event last week. He added that HPE would continue to sell higher-end servers to those customers.
There are a number of efforts around artificial intelligence (AI) and neural network-oriented processors from vendors such as IBM, Qualcomm and Google. Now you can add Intel to that list. The company has formally introduced the Nervana Neural Network Processor (NNP) for AI projects and tasks.

This isn’t a new Intel design. The chips come out of Intel’s $400 million acquisition of the deep learning startup Nervana Systems last year. After the acquisition, Nervana CEO Naveen Rao was put in charge of Intel’s AI products group.

“The Intel Nervana NNP is a purpose-built architecture for deep learning,” Rao said in a blog post formally announcing the chip. “The goal of this new architecture is to provide the needed flexibility to support all deep learning primitives while making core hardware components as efficient as possible.”
The drive to digital transformation is causing the world to move faster than ever, and businesses seem to be experiencing a huge case of “fear of missing out” (FOMO), adopting new technologies at a dizzying pace.

A few years ago, only a few companies had invested in the Internet of Things (IoT), software-defined networking (SDN), cloud services and DevOps. Today, they’re rapidly becoming the norm, and it’s difficult, if not impossible, for IT to maintain the current environment manually.

Doing things by hand no longer works. An experienced engineer used to be able to look at router logs, TCP dumps or other data, figure out what was going on and find the source of a problem. But now so much data is being generated from so many sources that even the best engineers can’t keep up and know the network the way they used to.
What does it take to open the world’s first self-powered data center? For Aruba S.p.A., it involved three elements:
Flowing river water
Photovoltaic solar panels
Consistently cold underground water, pumped to the surface, as the principal cooling source
Aruba’s newest data center, named the Global Cloud Data Center (IT3), is located near Milan, Italy, and is claimed to be 100 percent green. The 49-acre (200,000-square-meter) facility, built to the ANSI/TIA-942 Rating 4 standard, opened earlier this month.
The site’s green credentials come largely from the data center’s own dedicated hydroelectric plant. The facility sits on the banks of the River Brembo, an Aruba representative told me, and turbines generate electricity from the running river water. That power is stored and then injected into the national grid infrastructure, and in exchange for the input, the national grid supposedly guarantees electricity for the campus.
Hard disk makers are using capacity as their chief bulwark against the rise of solid-state drives (SSDs), since they certainly can’t argue on performance, and Western Digital, the king of the hard drive vendors, has shown off a new technology that could lead to 40TB drives.

Western Digital already has the largest-capacity drive on the market. It recently introduced a 14TB drive, filled with helium to reduce drag on the spinning platters. But thanks to a new technology called microwave-assisted magnetic recording (MAMR), the company hopes to reach 40TB by 2025, and it has promised engineering samples of the drive by mid-2018.

MAMR is a new method of cramming more data onto the disk. Western Digital’s chief rival, Seagate, is working on a competing approach called HAMR, or heat-assisted magnetic recording. I’ll leave it to propeller heads like AnandTech to explain the electrical engineering of it all. What matters to the end user is that MAMR drives should ship sometime in 2019, after 13 years of research and development.
Virtualization management tools are becoming a must-have for enterprises grappling with increasingly dynamic infrastructure environments. These tools allow for proactive capacity planning, which increases performance efficiency, keeps costs in check and avoids disruption.
Oracle is not the first company that comes to mind when you think of enterprise security, but at its recent OpenWorld conference the company announced new products with artificial intelligence (AI) and machine learning capabilities for quickly identifying security threats.

The company introduced two new integrated suites, Oracle Identity Security Operations Center (SOC) and Oracle Management Cloud. It claims they will help enterprises forecast, reduce, detect and resolve cybersecurity threats in minutes rather than days, and assist in remediating application and infrastructure performance issues.

It makes sense for Oracle to jump into this field, even if it is full of established players like Symantec, Sophos and Tripwire. Since Oracle’s databases are often a target of hacker attacks, who better to secure an Oracle database than Oracle?
Over the past few weeks, Nvidia has been holding a series of regional GPU Technology Conferences (GTC) around the globe. In September, Nvidia showed off its new Tensor3 GPU, made for artificial intelligence (AI) inferencing, in China. This week, the company took its show to Munich for GTC Europe, where it made a couple of announcements on advancements in self-driving vehicles.

The quest for the fully autonomous car has been something of a “holy grail” and one of the best examples of what’s possible when discussing advanced technologies such as machine learning, AI and the Internet of Things (IoT).
Two new spending reports paint a rosy picture for some IT vendors, but things aren’t looking so great for the traditional players.

At Gartner’s Symposium & ITxpo last week, the firm released a report stating that the global IT market is expected to reach $3.7 trillion next year, a 4.3 percent increase over the $3.5 trillion expected for this year.

Spending on traditional data center hardware and systems is expected to stay flat, continuing a trend we’ve seen for a while now as businesses ramp up spending on three on-demand services: infrastructure as a service (IaaS), platform as a service (PaaS) and communication as a service (CaaS).

"The IT buying landscape is changing: Digital business transformation is an effort to create connected platforms and new industry revenue streams," analyst John-David Lovelock wrote in the report. "Organizations that are not creating new digital business models or new ways to engage constituents or customers are falling behind. Those vendors that do not move more quickly than their clients will be left behind."
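As a quick sanity check on the Gartner figures, this year’s $3.5 trillion and the 4.3 percent growth rate are the only inputs needed to reproduce the forecast:

```python
# Check that a 4.3 percent increase over this year's $3.5 trillion
# lands at Gartner's $3.7 trillion forecast once rounded to one
# decimal place. Figures are from the report cited above.
current = 3.5    # trillion USD, expected this year
growth = 0.043   # 4.3 percent year-over-year increase
forecast = current * (1 + growth)
print(round(forecast, 1))  # 3.7
```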
Old habits die hard, especially when it comes to buying network gear and accessories based on long-standing procurement practices. While it may seem easier to sustain the status quo, doing so can expose you to undue costs created by manufacturer price-gouging practices.

Case in point: optical transceivers, which Gartner says account for 10 to 15 percent of enterprise network capital spending. That may not seem like a big budget buster, but the huge markups on optics are the subject of a new Gartner report, entitled “How to Avoid the Biggest Rip-Off in Networking.”
Two years ago, Intel spent $16.7 billion to acquire FPGA chip vendor Altera. So, what’s it going to do with that big purchase? The company is finally ready to say.

A field-programmable gate array, or FPGA, is an integrated circuit that can be customized to perform specific functions. Whereas an x86 processor executes only the x86 instruction set, an FPGA can be reprogrammed on the fly to perform specified tasks. That’s why x86 chips are considered general-purpose processors and FPGAs are viewed as customizable.

The company’s strategy is interesting in that it effectively puts Intel in competition with itself. If you want to do massive floating-point computation, Intel has the Xeon Phi line of add-in cards that compete with Nvidia and AMD GPUs. Now the FPGAs are also targeting those use cases.
The Internet of Things (IoT) sometimes has the feel of a trend that’s forever on the cusp of a huge breakout. Figures fly around about the projected size of the IoT, and they’re always massive (such as the 50 billion devices Cisco predicted by 2020). But the number of things in the IoT is already counted in the 8 billion to 15 billion range, so shouldn’t we be seeing more from the IoT by now? Based on what leaders are saying in a survey commissioned by Verizon, we soon will.