Archive

Category Archives for "Networking"

Troubleshooting with Docker Swarm + NetQ

Say you are a network engineer, and you were recently told your company will be building applications using a distributed, microservices-based architecture with containers going forward. You know how important this is for the developers: it gives them tremendous flexibility to develop and deploy money-making applications. But what does it mean for the network? Planning, operating, and managing a network with containers can be far more technically challenging than running a traditional network. The containers may need to talk to each other and to the outside world, and you won't even know IF they exist, let alone WHERE they exist! Yet the network engineer is still responsible for the containers' connectivity and high availability.

Because containers are deployed inside a host, on a virtual Ethernet network, they can be invisible to network engineers. Orchestration tools such as Docker Swarm, Apache Mesos, and Kubernetes make it very easy to spin up and take down containers across the hosts on a network, and may even do so without human intervention. Many containers are also ephemeral, so the traffic patterns between the servers hosting them are highly dynamic and constantly shifting throughout the network.


Cumulus Networks understands…
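One way to regain some of that visibility is to ask the Docker engine itself where containers live. The sketch below parses the kind of JSON that `docker network inspect` emits for an overlay network and maps container names to their IPs; the network name, container names, and addresses here are invented for illustration, not taken from a real deployment:

```python
import json

# Hypothetical output of `docker network inspect my-overlay` --
# the network name, container IDs, and addresses are made up.
inspect_output = json.loads("""
[
  {
    "Name": "my-overlay",
    "Driver": "overlay",
    "Containers": {
      "abc123": {"Name": "web.1", "IPv4Address": "10.0.9.3/24"},
      "def456": {"Name": "web.2", "IPv4Address": "10.0.9.4/24"}
    }
  }
]
""")

def container_ips(networks):
    """Map container name -> IP for every container on every network."""
    mapping = {}
    for net in networks:
        for info in net.get("Containers", {}).values():
            # Strip the prefix length to get the bare address.
            mapping[info["Name"]] = info["IPv4Address"].split("/")[0]
    return mapping

print(container_ips(inspect_output))
# {'web.1': '10.0.9.3', 'web.2': '10.0.9.4'}
```

A tool like NetQ automates exactly this kind of discovery and correlates it with the state of the physical network, rather than leaving the engineer to script it by hand.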

Microsoft launches data security technology for Windows Server, Azure

Data is at its greatest risk of being compromised while it is in use: when it moves out of a secure database and into the memory of servers or applications. So Microsoft is launching a new technology for Windows Server and Azure that protects data while it is being processed. Microsoft claims the service, called Azure confidential computing, makes it the first public cloud provider to offer encryption of data while in use. Encrypting data while it is being manipulated is CPU-intensive, and there is no word yet on the performance impact of the service. “Despite advanced cybersecurity controls and mitigations, some customers are reluctant to move their most sensitive data to the cloud for fear of attacks against their data when it is in use,” Mark Russinovich, Microsoft Azure CTO, wrote in a company blog post. “With confidential computing, they can move the data to Azure knowing that it is safe not only at rest, but also in use from [various] threats.”

BrandPost: Finally: Easy, Remote IT For SMBs

Setting up and managing an IT infrastructure isn't what most small and mid-sized business owners signed up for when they opened their doors, at least not voluntarily. After all, IT is an intimidating field filled with fragmented components, esoteric expertise, and expensive hardware. That may be why the most powerful network solutions have felt out of reach for small businesses, approachable only by larger enterprises with deeper pockets.

Whether they have the resources or not, though, every business is a digital business in today's economy, and all of them rely on a functional IT framework at some scale. For the 83% of small businesses that don't have any dedicated IT staff, the ultimate responsibility for running the company network often falls to the person with the most at stake: the business owner. So too do the related concerns of cybersecurity, network reliability, malfunctioning equipment, employee access, and so on.

Always test remote app performance with a WAN emulator

Many organizations deploy new applications for their remote-site users without testing them on a WAN. Skipping that test increases the likelihood of performance problems in the early stages of use, because you have no idea how the application will behave once latency or jitter enters the communication path between client and server. If the application consumes enough bandwidth to cause congestion, it can also degrade the performance of other applications sharing the link. A WAN emulator lets you measure the average bandwidth an application uses before you deploy it.
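A rough back-of-the-envelope model (not a substitute for a real emulator) shows why a chatty application that flies on a LAN can crawl on a WAN: every sequential round trip pays the full link latency. The request counts, sizes, and link figures below are illustrative assumptions:

```python
# Back-of-the-envelope model of sequential request/response traffic.
# All numbers are illustrative assumptions, not measurements.

def transfer_time(requests, bytes_per_request, rtt_s, bandwidth_bps):
    """Total time for `requests` sequential round trips:
    each pays one RTT plus its serialization time on the link."""
    serialize = bytes_per_request * 8 / bandwidth_bps
    return requests * (rtt_s + serialize)

# 200 small requests on a LAN (0.5 ms RTT) vs. a WAN (80 ms RTT),
# both over a 10 Mbit/s link.
lan = transfer_time(200, 2000, 0.0005, 10_000_000)
wan = transfer_time(200, 2000, 0.080, 10_000_000)
print(f"LAN: {lan:.2f} s, WAN: {wan:.2f} s")
# LAN: 0.42 s, WAN: 16.32 s
```

The bandwidth never changed; latency alone turned a sub-second operation into a sixteen-second one, which is exactly the kind of surprise a WAN emulator catches before deployment.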

Location, location, location… it matters to the cloud

In real estate, agents repeat the mantra “location, location, location,” meaning houses that are equal in many other ways will cost more the closer they are to something of value. For example, the San Jose Mercury News recently published a story about a house in Sunnyvale, California, that sold for $782,000 over asking price. Why such a ridiculous premium? Because it's near Apple's new campus. Location matters.

Does location matter with the cloud? Given how fast data travels, one might not think so, but location does indeed matter. A recent report from EdgeConneX and Cedexis, Cloud, Content, Connectivity and the Evolving Internet Edge, shows just how much. The study uses Cedexis' RUM-based internet performance measurement tools to test how cloud applications perform in different locations and with various optimization techniques.
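Physics explains why distance still matters even though "data travels fast": light in fiber propagates at roughly two-thirds the speed of light in vacuum, about 200 km per millisecond, so distance alone puts a hard floor under round-trip time no matter how much bandwidth you buy. A quick sketch (the routes and distances are rough figures chosen for illustration):

```python
# Light propagates through fiber at roughly 2/3 the speed of light
# in vacuum: ~200,000 km/s, i.e. ~200 km per millisecond.
FIBER_KM_PER_MS = 200.0

def min_rtt_ms(distance_km):
    """Lower bound on round-trip time over an idealized straight
    fiber path -- real paths are longer and add queuing delay."""
    return 2 * distance_km / FIBER_KM_PER_MS

# Approximate distances, for illustration only.
for route, km in [("same metro", 50),
                  ("US coast to coast", 4000),
                  ("US to Europe", 6000)]:
    print(f"{route}: >= {min_rtt_ms(km):.1f} ms RTT")
# same metro: >= 0.5 ms RTT
# US coast to coast: >= 40.0 ms RTT
# US to Europe: >= 60.0 ms RTT
```

This is why edge deployments and content placement near users can improve application performance in ways no amount of server tuning can.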

IDG Contributor Network: What is a data fabric and why should you care?

What is a data fabric? The concept of a "data fabric" is emerging as an approach to help organizations better deal with fast-growing data, ever-changing application requirements, and distributed processing needs.

The term refers to technology that creates a converged platform supporting the storage, processing, analysis, and management of disparate data. Data currently maintained in files, database tables, data streams, objects, images, sensor feeds, and even container-based applications can all be accessed through a number of different standard interfaces.

A data fabric makes it possible for applications and tools to access data through many interfaces, such as NFS (Network File System), POSIX (portable operating system interface), a REST API (representational state transfer), HDFS (Hadoop Distributed File System), ODBC (open database connectivity), and Apache Kafka for real-time streaming data. A data fabric must also be extensible to support other standards as they grow in importance.
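To make the "many interfaces over one store" idea concrete, here is a toy sketch in which the same backing records are reachable both as key-value lookups and as file-style text reads. A real data fabric does this at platform scale over NFS, REST, HDFS, and so on; only the shape of the abstraction is the same, and every name in this class is invented:

```python
import json

class ToyFabric:
    """One backing store, several access styles -- a toy stand-in
    for a data fabric's converged platform. Invented API."""

    def __init__(self):
        self._store = {}  # single source of truth for all interfaces

    # Key-value style access (think: object store or a database row)
    def put(self, key, record):
        self._store[key] = record

    def get(self, key):
        return self._store[key]

    # File-style access (think: an NFS/POSIX view of the same data)
    def read_file(self, key):
        return json.dumps(self._store[key])

fabric = ToyFabric()
fabric.put("sensor/42", {"temp_c": 21.5})

print(fabric.get("sensor/42"))        # dict, via the key-value interface
print(fabric.read_file("sensor/42"))  # JSON text, via the file-style view
```

The point is that both callers see the same record without either one copying or converting it first, which is the consistency problem a data fabric is meant to solve.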
