Nmap, short for Network Mapper, is a free and open source tool used for vulnerability checking, port scanning and, of course, network mapping. Despite being created back in 1997, Nmap remains the gold standard against which all other similar tools, whether commercial or open source, are judged. Nmap has maintained its preeminence because of the large community of developers and coders who help maintain and update it. The Nmap community reports that the tool, which anyone can get for free, is downloaded several thousand times every week.
After teasing out details about the technology for a year and a half under the code name Falcon Mesa, Intel has unveiled the Agilex family of FPGAs, aimed at data-center and network applications that are processing increasing amounts of data for AI, financial, database and IoT workloads. The Agilex family, expected to start appearing in devices in the third quarter, is part of a new wave of more easily programmable FPGAs that is beginning to take an increasingly central place in computing as data centers are called on to handle an explosion of data.
FPGAs, or field-programmable gate arrays, are built around a matrix of configurable logic blocks (CLBs) linked via programmable interconnects that can be programmed after manufacturing – and even reprogrammed after being deployed in devices – to run algorithms written for specific workloads. They can thus be more efficient on a performance-per-watt basis than general-purpose CPUs, even while driving higher performance.
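To make the reconfigurability concrete, a logic element inside a CLB can be pictured as a small lookup table whose contents determine which Boolean function it computes. The Python sketch below is purely illustrative – it is not real FPGA tooling and has nothing to do with Intel's Agilex toolchain – but reloading the table is the software analogue of reprogramming the device for a new workload.

# Illustrative model only: a configurable logic element as a lookup table (LUT).
# "Programming" an FPGA amounts to loading new truth tables and interconnect
# routing, which is why the same silicon can be repurposed after deployment.
from typing import List

class LUT:
    """A k-input lookup table, the basic element of a configurable logic block."""
    def __init__(self, truth_table: List[int]):
        self.table = truth_table            # 2**k entries, one output bit each

    def evaluate(self, inputs: List[int]) -> int:
        index = 0
        for bit in inputs:                  # the input bits index into the table
            index = (index << 1) | bit
        return self.table[index]

# "Configure" the same element as an AND gate, then re-program it as XOR.
and_gate = LUT([0, 0, 0, 1])
xor_gate = LUT([0, 1, 1, 0])
print(and_gate.evaluate([1, 1]), xor_gate.evaluate([1, 0]))   # -> 1 1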
HPE is offering new Edgeline Converged Edge System hardware and software designed to let enterprises not only control machines in their facilities, but also manage and analyze the sea of data generated by devices and sensors at the network edge. The new software lets enterprise network managers and data-center administrators merge data from a variety of third-party applications and remotely manage as many as thousands of Edgeline hardware systems, which are capable of running unmodified enterprise applications, HPE said at its Discover conference in Madrid on Tuesday.
HPE is targeting a new class of storage it calls Memory-Driven Flash at enterprise data centers that are increasingly being called on to handle real-time analytics, high-speed transactions, big data and AI workloads that demand more storage performance than ever before.
Cloud-assisted driving, remote surgery, touch-sensitive VR and, most important for the global economy, a massive jump in industrial productivity are all on the way, and they have one thing in common: the continuing evolution of network infrastructure toward 5G.
Lenovo and NetApp's storage alliance, joint venture in China, and new series of all-flash and hybrid-flash products announced at Lenovo's Transform event put them both in a much stronger position in the data center against rivals Dell EMC and HPE. The storage offerings include two families, each subdivided into all-flash and hybrid-flash products, jointly developed by Lenovo and NetApp and available now worldwide. Several of the products support NVMe (non-volatile memory express), the extremely fast communications protocol and controller interface able to move data to and from SSDs over the PCIe bus. NVMe SSDs are designed to deliver a two-orders-of-magnitude speed improvement over prior SSDs.
Nvidia is raising its game in data centers, extending its reach across different types of AI workloads with the Tesla T4 GPU, which is based on its new Turing architecture and, along with related software, designed for blazing acceleration of applications for images, speech, translation and recommendation systems. The T4, a small-form-factor accelerator card, is the essential component in Nvidia's new TensorRT Hyperscale Inference Platform and is expected to ship in data-center systems from major server makers in the fourth quarter. The T4 features Turing Tensor Cores, which support different levels of compute precision for different AI applications, as well as the major software frameworks – including TensorFlow, PyTorch, MXNet, Chainer and Caffe2 – for so-called deep learning, machine learning involving multi-layered neural networks.
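As a rough illustration of what reduced-precision inference looks like from the software side, the PyTorch sketch below casts a placeholder model to FP16, the kind of half-precision math Tensor Cores accelerate. It is not Nvidia's TensorRT workflow, and the model, batch size and layer shapes are invented for the example.

# Hedged sketch: FP16 inference in PyTorch; model and shapes are placeholders.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10))
model.eval()                                   # inference mode, no dropout/batch-norm updates

device = "cuda" if torch.cuda.is_available() else "cpu"
if device == "cuda":
    model = model.half().to(device)            # cast weights to FP16 so Tensor Cores can be used
    x = torch.randn(32, 512, device=device, dtype=torch.float16)
else:
    x = torch.randn(32, 512)                   # CPU fallback stays in FP32

with torch.no_grad():                          # inference only, no gradient tracking
    scores = model(x)
print(scores.shape)                            # torch.Size([32, 10])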
To get up and running on a self-service, big-data analytics platform efficiently, many data-center and network managers these days would likely think about using a cloud service. But not so fast – there is some debate about whether the public cloud is the way to go for certain big-data analytics. For some big-data applications, the public cloud may be more expensive in the long run and, because of latency issues, slower than on-site private-cloud solutions. In addition, keeping data storage on premises often makes sense for regulatory and security reasons.
With all this in mind, Dell EMC has teamed up with BlueData, the provider of a container-based software platform for AI and big-data workloads, to offer Ready Solutions for Big Data, a big-data-as-a-service (BDaaS) package for on-premises data centers. The offering brings together Dell EMC servers, storage, networking and services along with BlueData software, all optimized for big-data analytics.
Network administrators, IT managers and security professionals face a never-ending battle, constantly checking on what exactly is running on their networks and the vulnerabilities that lurk within. While there is a wealth of monitoring utilities available for network mapping and security auditing, nothing beats Nmap's combination of versatility and usability, making it the widely acknowledged de facto standard.

What is Nmap?
Nmap, short for Network Mapper, is a free, open-source tool for vulnerability scanning and network discovery. Network administrators use Nmap to identify the devices running on their systems, discover available hosts and the services they offer, find open ports and detect security risks.
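For a sense of how Nmap is typically driven from scripts, the sketch below shells out to the nmap binary from Python and parses its XML output. It assumes nmap is installed and that you are authorized to scan the target; scanme.nmap.org is the host the Nmap project provides for testing.

# Hedged example: run an Nmap service scan from Python and list open ports.
import subprocess
import xml.etree.ElementTree as ET

target = "scanme.nmap.org"   # only scan hosts you have permission to test

# -sV: probe open ports for service/version info, -p: port range, -oX -: XML to stdout
result = subprocess.run(
    ["nmap", "-sV", "-p", "1-1024", "-oX", "-", target],
    capture_output=True, text=True, check=True,
)

root = ET.fromstring(result.stdout)
for port in root.iter("port"):                     # one <port> element per scanned port
    state = port.find("state").attrib["state"]
    service = port.find("service")
    name = service.attrib.get("name", "?") if service is not None else "?"
    print(port.attrib["portid"], state, name)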