I recently gave a webinar on how best to architect your network for Office 365. It comes on the heels of a number of complaints from customers about their struggles deploying responsive Office 365 implementations. SharePoint doesn’t quite work; Skype calls are unclear. And forget about OneDrive for Business. It’s incredibly slow.

Latency and Office 365
Ensuring a smooth transition to Office 365, or for that matter any cloud deployment, involves a solid understanding of which Office 365 applications are being deployed. Here latency matters. Microsoft recommends that round-trip latency for Office 365 not exceed 275 ms, but those metrics change significantly depending on the Office 365 application: latency should not exceed 50 ms with Exchange Online or 25 ms with SharePoint. (Check out my “ultimate” list of Office 365 networking tools for help with your O365 deployment.)
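As a quick sanity check during planning, the per-application targets above can be captured in a few lines of code. This is just an illustrative sketch using the thresholds cited in this article; the threshold table and function name are my own, not part of any Microsoft tooling.

```python
# Illustrative latency check against the per-application targets cited above.
# The thresholds come from this article; the names here are hypothetical.

LATENCY_TARGETS_MS = {
    "office365": 275,   # overall Office 365 round-trip ceiling
    "exchange": 50,     # Exchange Online
    "sharepoint": 25,   # SharePoint Online
}

def meets_latency_target(app: str, measured_rtt_ms: float) -> bool:
    """Return True if a measured round-trip time is within the app's target."""
    return measured_rtt_ms <= LATENCY_TARGETS_MS[app]

# A 30 ms round trip is fine for Exchange Online but misses the SharePoint target.
print(meets_latency_target("exchange", 30))    # True
print(meets_latency_target("sharepoint", 30))  # False
```

Feeding real RTT measurements (from ping or a synthetic transaction monitor) into a check like this makes it easy to flag sites that will struggle before users do.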
For years, white box PCs have accounted for a significant chunk of desktop sales. It was the same wherever I went: small mom-and-pop shops built their own PCs using components shipped in from Taiwan, and if there was a logo on it, it was for the PC store (affectionately referred to as “screwdriver shops”) that built the thing. On the server side, though, it remained a name-brand business. Data centers were filled with racks of servers that bore the logos of IBM (now Lenovo), Dell and HP.
However, that’s changing. In its latest sales figures for the second quarter of 2017, IDC says ODM sales now account for the largest group of server sales, surpassing HPE. In the second calendar quarter of 2017, worldwide server sales increased 6.3 percent year over year to $15.7 billion thanks in part to new Intel Skylake processors.
The rapid development of the so-called Internet of Things (IoT) has pushed many old industries to the brink, forcing most companies to fundamentally reevaluate how they do business. Few have felt the reverberations of the IoT more than the microchip industry, one of the vital drivers of the IoT that has both enabled it and evolved alongside it.

So how exactly is chip design evolving to keep up with the IoT’s breakneck proliferation? A quick glance at the inner workings of the industry that enables all of our beloved digital devices to work shows just how innovative it must be to keep up with today’s ever-evolving world.

Building a silicon brain
Developing microchips, which are so diverse that they’re used to power coffee makers and fighter jets alike, is no simple task. In order to meet the massive processing demands of today’s digital gadgets, chip design has been forced to take some tips from the most efficient computer known to man: the human brain.
A data center is a physical facility that enterprises use to house their business-critical applications and information, so as they evolve, it’s important to think long-term about how to maintain their reliability and security.

Data center components
Data centers are often referred to as a singular thing, but in actuality they are composed of a number of technical elements such as routers, switches, security devices, storage systems, servers, application delivery controllers and more. These are the components that IT needs to store and manage the most critical systems that are vital to the continuous operations of a company. Because of this, the reliability, efficiency, security and constant evolution of a data center are typically a top priority.
NB-IoT, that must stand for “no big IoT,” like “no big deal”?

Awful. Why don’t you leave the jokes to me?

Fine, fine. So what is NB-IoT, really?

First of all, it’s narrow-band IoT, and it’s a communication standard designed to let IoT devices operate via carrier networks, either within an existing GSM carrier wave, in an unused “guard band” between LTE channels, or independently.
As any network engineer knows, an improperly configured application can cost a whole lot of time and money down the line. The best way to prevent these unfortunate accidents is to conduct thorough and efficient testing on a routine basis. Whether designing a network, migrating to the cloud, or adding a new device to the rack, every step within the application deployment life cycle should be validated with accurate testing.

Regarding network testing, the terms emulation and simulation are often used interchangeably. In most cases, either term will generally get the point across, but there’s a big difference between a network emulator and a network simulator, both practically and semantically.
The Pervasive Network is far more than a collection of technologies and processes; it is a promise to be fulfilled. A promise of delivering constant, reliable, secure, intelligent and scalable bandwidth to power a future of ubiquitous IoT devices, augmented reality experiences, super smart AI systems and innovation in the form of new mobile services and applications yet to be imagined. These technologies will be the building blocks for improving the operational efficiency of every business and providing customer experiences that will make the difference in every company’s competitive position going forward.

I have been on the customer and consultant side of the fence, and one thing has become clear to me: the network is not a commodity component, but a vital and strategic key to unlocking the potential of the innovation that we see in new services, experiences and opportunities.
It’s hardly surprising that IT professionals have their hands full in the age of IoT (Internet of Things) and Big Data. Supporting rapidly growing data volumes, new data types, and many more data sources is making it harder than ever for IT to meet service level agreements (SLAs) while keeping spending in check. The complexity IT manages is clear in the results of a recent Storage Census of over 300 IT professionals my company, Primary Data, conducted at VMworld 2017. The survey showcased the conflicting pressures currently faced by IT leaders: those surveyed cited delivering performance, executing data migrations, meeting expectations within existing budgets, and integrating the cloud into their infrastructure among the biggest challenges facing their departments today. Let’s examine the factors that contribute to these challenges and how IT can solve them.
I don’t think anyone would argue with the premise that data centers have grown significantly in complexity over the past decade. Data centers used to be orderly, as each application had its own dedicated hardware and software. This was highly inefficient, but most data centers could be managed with a handful of people. Then something changed. Businesses were driven to improve the utilization of infrastructure and increase the level of agility, and along came a number of technologies such as virtualization, containers and the cloud. Also, organizations started to embrace the concept of DevOps, which necessitates a level of dynamism and speed never seen before in data centers.
Lately, our clients with an MPLS WAN are starting to ask: “Should we get rid of our MPLS and go to SD-WAN? Is SD-WAN better?”

I don’t mean to make this question sound childish (since it’s a fantastically good question), but it reminds me of a mistake I made when my 9-year-old son asked me a question the other day.

He is obsessed with baseball, so, of course, he asked me something all of us baseball fans have wondered at some point: “Daddy, which is better… a guy who hits .250 with 50 home runs or a guy who hits .300 with 10 home runs?”
Data is at its greatest risk of being compromised while it is in use, when it moves out of a secure database and into server or application memory. So, Microsoft is launching a new technology for Windows Server and Azure that protects data while it’s being processed. Microsoft claims the service, called Azure confidential computing, makes it the first public cloud provider to offer encryption of data while in use. Encrypting data while it is being manipulated is pretty CPU-intensive, and there is no word on the performance impact of this service. “Despite advanced cybersecurity controls and mitigations, some customers are reluctant to move their most sensitive data to the cloud for fear of attacks against their data when it is in use,” Mark Russinovich, Microsoft Azure CTO, wrote in a company blog post. “With confidential computing, they can move the data to Azure knowing that it is safe not only at rest, but also in use from [various] threats.”
Setting up and managing an IT infrastructure isn’t what most small and mid-sized business owners signed up for when they opened their doors. At least not voluntarily. After all, IT is an intimidating field filled with fragmented components, esoteric expertise, and expensive hardware. That may be why the most powerful network solutions have felt out of reach for small businesses, only approachable by larger enterprises with deeper pockets.

Whether they have the resources or not, though, every business is a digital business in today’s economy. They all rely on a functional IT framework on some scale. And for the 83% of small businesses that don’t have any dedicated IT staff, the ultimate responsibility of running the company network often falls to the person with the most at stake: the business owner. So too do the related concerns of cybersecurity, network reliability, malfunctioning equipment, employee access, and so on.
Many organizations deploy new applications for their remote site users without testing them on a WAN. Not testing these applications across a simulated WAN increases the possibility of performance issues during the early stages of usage, because you have no idea how the application will perform once latency or jitter enters the communication path between the client and the server. If the application uses large amounts of bandwidth and causes congestion, it can negatively affect the performance of other applications that share the bandwidth. A WAN emulator can enable you to measure the average bandwidth that an application may use before you deploy it.
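Even before firing up an emulator, a back-of-the-envelope model shows why an application that flies on the LAN can crawl over the WAN: a "chatty" client/server exchange pays the round-trip latency once per application turn, on top of the time to move its data. The sketch below is illustrative only; the function name and all the figures are my own assumptions, not measurements from any particular tool.

```python
# Rough model of a client/server transaction over a link:
# total time = (application turns x round-trip time) + serialization time.
# All names and numbers here are illustrative assumptions.

def transaction_time_s(app_turns: int, payload_bytes: int,
                       rtt_ms: float, bandwidth_mbps: float) -> float:
    """Estimate transaction time: per-turn latency cost plus transfer cost."""
    latency_cost = app_turns * (rtt_ms / 1000.0)                  # seconds
    transfer_cost = (payload_bytes * 8) / (bandwidth_mbps * 1e6)  # seconds
    return latency_cost + transfer_cost

# 200 turns moving 2 MB: invisible at a 1 ms LAN RTT, painful at an 80 ms WAN RTT.
lan = transaction_time_s(200, 2_000_000, 1, 100)    # ~0.36 s
wan = transaction_time_s(200, 2_000_000, 80, 100)   # ~16.2 s
print(f"LAN: {lan:.2f} s, WAN: {wan:.2f} s")
```

Note that the bandwidth term barely changed between the two cases; the latency term dominates, which is exactly the kind of surprise WAN emulation is meant to surface before deployment.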
In real estate, there’s a mantra that most agents use: “location, location, location,” meaning houses that may be equal in many ways will cost more the closer you get to something of value. For example, the San Jose Mercury News recently published a story about a house in Sunnyvale, California, that sold for $782,000 over asking price. Why such a ridiculous amount? Because it’s near Apple’s new campus — location matters.

Does location matter with the cloud? Given how fast data travels, one might not think so, but location does indeed matter. A recent report from EdgeConneX and Cedexis, Cloud, Content, Connectivity and the Evolving Internet Edge, shows just how much it actually does. The study uses Cedexis’ RUM-based internet performance measurement tools to test how cloud applications perform in different locations and with various optimization techniques.
What is a data fabric?
The concept of a "data fabric" is emerging as an approach to help organizations better deal with fast-growing data, ever-changing application requirements and distributed processing needs.

The term references technology that creates a converged platform supporting the storage, processing, analysis and management of disparate data. Data that is currently maintained in files, database tables, data streams, objects, images, sensor data and even container-based applications can all be accessed using a number of different standard interfaces.

A data fabric makes it possible for applications and tools to access data through many interfaces, such as NFS (Network File System), POSIX (portable operating system interface), a REST API (representational state transfer), HDFS (Hadoop Distributed File System), ODBC (open database connectivity), and Apache Kafka for real-time streaming data. A data fabric must also be capable of being enhanced to support other standards as they emerge in importance.
The Internet of Things, which has captured the attention of investors and futurists alike as it continues to reshape our markets and everyday lives, is now generating buzz around a new topic: manufacturing. Business and tech analysts are increasingly coming to realize the boon that the IoT can offer to U.S. manufacturing, a fact the wealthiest companies and savviest innovators are already exploiting.

So how exactly can the IoT drive a productivity revolution in U.S. manufacturing? Is it grand conjecture that’s driving this growing consensus that the IoT will reshape manufacturing, or does this idea have a serious basis in reality? A brief review of the IoT’s impact on U.S. productivity shows just how significantly it can optimize American manufacturing, and how little time we have until all of our production is seamlessly integrated with the internet.
Every day we hear how the network is changing. Virtualization, Cloud, Software-defined Networking, the Internet of Things — it’s clear big transformational change is happening. The technical aspects of these solutions seem to get the most attention, but if you manage IT/Network cost and delivery for a living, your success may depend more on understanding the changing network business models that accompany these new technologies and how to adapt your IT operations.

I’ve assembled below what I think are five of the top operational challenges facing IT managers in the next generation network. Some of these are blocking and tackling fundamentals (excuse the seasonal American football analogy), while others are more strategic in nature. I’ll present a short rationale for why these are my top 5. I’d love to hear your perspective.
The tech industry’s approach to becoming a part of the IoT landscape is reminiscent of a quilting bee – a large number of participants approaching a central problem from a wide array of different angles and taking on different areas of responsibility.

And that’s a good fit for the IoT market – companies have wildly diverse sets of needs, requiring a commensurately diverse set of technological capabilities. A factory might need a sophisticated, integrated system that can both manage complicated manufacturing equipment and track products, while a nearby hospital might need to bring expensive medical equipment onto the network.
When discussing Ethernet data, the terms frame and packet are often used interchangeably. Frames and packets are the electronic containers that carry our data from point to point by navigating LANs and WANs and, as they both serve similar functions, their differences are often misunderstood.

So what’s the difference?
To simplify matters, imagine frames and packets as envelopes of information that are going to be sent from one person to another. The key difference between a frame and a packet is how each encapsulates the information, and that depends on where the information is being sent.

Frames explained
Imagine a company with inter-department mail where a person can send documents to another person within their private/local organization. The contents are placed in an internal envelope and the sender writes their name and department in the “From” field, then writes the recipient’s name and department in the “To” field.
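The envelope-inside-an-envelope idea maps naturally onto code: an Ethernet frame is MAC addressing wrapped around a payload, and on an IP network that payload is a packet carrying its own IP addressing inside. Here is a minimal, illustrative sketch of the encapsulation; the helper function and the placeholder byte values are my own, not from any networking library.

```python
import struct

def ethernet_frame(dst_mac: bytes, src_mac: bytes, payload: bytes) -> bytes:
    """Wrap a payload (e.g. an IP packet) in a minimal Ethernet II header:
    6-byte destination MAC + 6-byte source MAC + 2-byte EtherType."""
    ETHERTYPE_IPV4 = 0x0800  # identifies the payload as an IPv4 packet
    return dst_mac + src_mac + struct.pack("!H", ETHERTYPE_IPV4) + payload

# Stand-in for an IP packet: in reality this has its own header with
# source and destination IP addresses, like the outer mailing envelope.
ip_packet = b"IPv4 header and payload go here"

frame = ethernet_frame(b"\xaa" * 6, b"\xbb" * 6, ip_packet)
print(len(frame))  # 14-byte frame header plus the packet it carries
```

Stripping the first 14 bytes recovers the packet unchanged, which is exactly what a router does before re-wrapping it in a new frame for the next hop.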
The umask setting plays a big role in determining the permissions that are assigned to files that you create. But what’s behind this variable, and how do the numbers relate to settings like rwxr-xr-x?

First, umask is a setting that directly controls the permissions assigned when you create files or directories. Create a new file using a text editor or simply with the touch command, and its permissions will be derived from your umask setting. You can look at your umask setting simply by typing umask on the command line.

$ umask
0022
Where the umask setting comes from
The umask setting for all users is generally set up in a system-wide file like /etc/profile, /etc/bashrc or /etc/login.defs -- a file that's used every time someone logs into the system. The setting can be overridden in user-specific files like ~/.bashrc or ~/.profile, since these files are read later in the login process. It can also be reset on a temporary basis at any time with the umask command.
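To see the arithmetic, note that new files start from a default mode of 666 (rw-rw-rw-) and the umask bits are cleared from that default, so the 022 shown above yields 644 (rw-r--r--). A small illustrative sketch (the file path is my own choice):

```python
import os

# With a umask of 022, the group and other write bits are cleared from the
# default file-creation mode of 666, leaving 644 (rw-r--r--).
os.umask(0o022)

path = "/tmp/umask_demo"      # illustrative path
if os.path.exists(path):
    os.remove(path)           # start fresh so the creation mode applies
with open(path, "w"):
    pass                      # create an empty file

mode = os.stat(path).st_mode & 0o777
print(oct(mode))              # 0o644, i.e. 666 & ~022
```

Directories work the same way but start from 777, so the same 022 umask produces 755 (rwxr-xr-x), the familiar pattern from the question above.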