Artificial intelligence (AI) and blockchain are among the new technologies driving a need for increased data center capacity, according to a telco that recently announced an expansion. China Telecom said in a press release that these “rapidly maturing” technologies, such as machine learning and adaptive security, will propel investment in data centers, and that they are one reason for the expansion of its data center business. Interestingly, though, data centers themselves may end up using this new tech as heavily as their customers do.
Treatment will be brought to the patients and patient data will be centralized, “turning hospitals into data centers,” a telco equipment maker says in a recent report. Ericsson, in its 2017 Mobility Report (PDF), published this month, says patient treatment will, in the future, no longer be performed in hospitals located far from patients’ homes but will instead be delivered remotely over new 5G wireless radio.
Wearables will be among the tools used to monitor patients’ health and dispense medication. Diagnosis will be accomplished through online consultations, and robots will remotely execute surgeries at nearby healthcare clinics rather than far-off hospitals.
Future computer systems need to be significantly faster than today’s supercomputers, scientists believe. One reason is that properly analyzing complex problems, such as climate modeling, takes ever more work. Massive quantities of calculations, performed at high speed and delivered as error-free data analysis, are needed for the fresh insights and discoveries expected down the road. Limitations, though, exist in current storage, processing and software, among other components. The U.S. Department of Energy’s four-year, $48 million Exascale Computing Project (ECP), started at the end of last year for science and national security purposes, plans to overcome those challenges. It explains some of the potential hiccups it expects to run into on its Argonne National Laboratory website; part of the project is being studied at the lab.
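To get a feel for the scale involved, a back-of-the-envelope comparison helps: an exascale machine performs on the order of 10^18 operations per second, roughly a thousand times a petascale system. The workload size in the Python sketch below is a made-up illustration, not a figure from the project.

```python
# Illustrative only: a rough sense of what "exascale" buys. The workload
# size below is a made-up example, not a figure from the DOE project.
PETAFLOPS = 1e15  # ops/second, the order of today's large supercomputers
EXAFLOPS = 1e18   # ops/second, the exascale target

workload_ops = 1e22  # hypothetical climate-model run: 10^22 floating-point ops

hours_at_peta = workload_ops / PETAFLOPS / 3600
hours_at_exa = workload_ops / EXAFLOPS / 3600

print(f"At 1 petaflop/s: {hours_at_peta:,.0f} hours (~{hours_at_peta / 24:.0f} days)")
print(f"At 1 exaflop/s:  {hours_at_exa:.1f} hours")
```

Under these assumed numbers, a run that takes nearly four months at petascale finishes in under three hours at exascale, which is why error-free analysis at that speed is treated as a qualitative change rather than a mere speedup.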
Power-thirsty data centers could get a new kind of electricity supply that taps surplus energy stored in idle electric vehicles (EVs), using the cars’ batteries to augment a building’s grid power. Vehicle-to-grid (V2G) technology provides a way to store, and subsequently use, energy in commuters’ cars while employees work during the day. It can also supply power when a delivery fleet is parked at night, researchers say.
Cost savings for building infrastructure could be significant in green-electricity applications, such as solar and wind, which require local electricity storage; installing the batteries those applications need is expensive.
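The arithmetic behind the idea is straightforward. The Python sketch below estimates how much load a parking lot full of EVs could offset; every figure in it (fleet size, battery capacity, the share of each battery owners would let a building draw) is an illustrative assumption, not a number from the research.

```python
# A minimal back-of-the-envelope sketch of vehicle-to-grid (V2G) capacity.
# Every figure here is an illustrative assumption, not research data.
NUM_EVS = 200             # EVs parked at the building during the workday
BATTERY_KWH = 60.0        # usable battery capacity per vehicle
DISCHARGE_FRACTION = 0.3  # share of each battery owners let the building draw
WORKDAY_HOURS = 8.0

available_kwh = NUM_EVS * BATTERY_KWH * DISCHARGE_FRACTION
average_kw = available_kwh / WORKDAY_HOURS

print(f"Energy available to the building: {available_kwh:,.0f} kWh")
print(f"Average power over the workday:   {average_kw:,.0f} kW")
```

Under those assumptions, 200 parked cars yield about 3,600 kWh, or roughly 450 kW of continuous supply across a workday, which hints at why fleets parked overnight are also attractive.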
Space is significantly more data-center-friendly than Earth, reckons a Los Angeles company that recently received a U.S. patent for a proposed orbital server farm and associated network. Cloud Constellation Corp. says it will provide a way for organizations and governments to shift chunks of data around without using the traditional terrestrial infrastructure, which it says is slow, insecure and a legal minefield. “This concept will soon become reality,” president Cliff Beek writes in Nextgov about SpaceBelt. “Data will never pass through the internet or along its leaky and notoriously insecure lines. In-transit espionage, theft and surveillance become impossible.”
The internet isn’t fast enough, nor is its bandwidth capacious enough, for data-intensive emergency traffic during disaster responses such as hurricanes and earthquakes, scientists think. Video streams of flood scenes, say, along with laser mapping, theoretically help responders allocate resources quickly, but that data gets bogged down along with other responder traffic, video chats and social media during an incident. Multi Node Label Routing (MNLR) is a new protocol intended to solve this reliability problem by routing responder data through a “high-speed lane of online traffic,” says an article in Rochester Institute of Technology’s (RIT) University News. Researchers at the school developed the technology.
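The article doesn’t detail how MNLR works internally, but the core idea of giving labeled emergency traffic its own lane can be sketched with a simple priority queue. The labels and queue discipline in the Python sketch below are illustrative assumptions, not the actual protocol.

```python
# A conceptual sketch of priority forwarding for labeled emergency traffic,
# in the spirit of what the article describes. This is NOT the actual MNLR
# protocol; the labels and queue discipline here are assumptions.
import heapq
import itertools

EMERGENCY, NORMAL = 0, 1      # lower value = higher forwarding priority
_counter = itertools.count()  # tie-breaker preserves FIFO order within a class

queue = []

def enqueue(packet: str, label: int) -> None:
    """Queue a packet under its traffic-class label."""
    heapq.heappush(queue, (label, next(_counter), packet))

def forward_next() -> str:
    """Forward the highest-priority packet waiting in the queue."""
    _, _, packet = heapq.heappop(queue)
    return packet

enqueue("video chat frame", NORMAL)
enqueue("flood-scene video stream", EMERGENCY)
enqueue("social media update", NORMAL)
enqueue("laser-mapping data", EMERGENCY)

while queue:
    print(forward_next())
# Emergency-labeled traffic drains first: the flood-scene stream and
# laser-mapping data go out ahead of the chat frame and social update.
```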
Network traffic analysis should be used more in the fight against malware, because indicators show up on the network “weeks and even months” before new malicious software is uncovered, scientists from the Georgia Institute of Technology explain in an article on the school’s website. The researchers, who have been studying historical network traffic patterns, say malware tracking should take advantage of these inherent network-supplied barometers rather than simply trying to identify malware code already on networks and machines. By analyzing the suspicious, already-available network traffic that hackers generate over time, administrators could pounce and render malware harmless before it does damage.
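One simple instance of the general approach is scanning DNS logs for young domains that suddenly attract many lookups, a pattern often associated with freshly stood-up malicious infrastructure. The Python sketch below is a minimal illustration of that idea; the log format, thresholds and domain names are assumptions, not the Georgia Tech researchers’ actual method.

```python
# A minimal sketch of traffic-based early warning: flag domains that are
# both newly seen and suddenly popular in DNS logs. The log format,
# thresholds, and domains are illustrative assumptions only.
from collections import Counter

# (day, domain) pairs as they might appear in simplified DNS logs
dns_log = [
    (1, "cdn.example.com"), (1, "updates.example.org"),
    (2, "x9f2k.example.net"), (2, "x9f2k.example.net"),
    (3, "x9f2k.example.net"), (3, "cdn.example.com"),
]

first_seen = {}
lookups = Counter()
for day, domain in dns_log:
    first_seen.setdefault(domain, day)  # record day the domain first appeared
    lookups[domain] += 1

# Flag young domains that are suddenly queried often.
MAX_AGE_DAYS, MIN_LOOKUPS = 2, 3
today = max(day for day, _ in dns_log)
suspicious = [
    d for d in lookups
    if today - first_seen[d] <= MAX_AGE_DAYS and lookups[d] >= MIN_LOOKUPS
]
print(suspicious)  # ['x9f2k.example.net']
```

The point of the researchers’ argument is that signals like this exist in traffic long before the corresponding binary is ever caught and fingerprinted on a machine.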
Thousand-megabit (gigabit) broadband marks a turning point for internet delivery speeds. Newer tech, such as virtual reality, and incumbents, such as video streaming, will benefit. Right now, though, only about 17 percent of the U.S. population has access to those super-fast speeds, which are primarily delivered by fiber, according to Viavi Solutions’ latest Gigabit Monitor report. Although gigabit is kicking in, it won’t be particularly simple to implement at the networking level, internet metrics company Ookla said earlier this month. Upgraded wired installs will likely handle the throughput better than existing, commonly used Wi-Fi, among other things, the company said.
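For a sense of what thousand-megabit service means in practice, here is a quick ideal-case calculation in Python; the file sizes are illustrative, and real-world throughput, especially over Wi-Fi, typically falls short of the line rate.

```python
# Ideal-case transfer times on a gigabit link. File sizes are illustrative
# examples; actual throughput (especially over Wi-Fi) is usually lower.
LINK_MBPS = 1000  # 1 gigabit per second

def download_seconds(file_gb: float, link_mbps: float = LINK_MBPS) -> float:
    """Ideal transfer time: gigabytes -> megabits, divided by link rate."""
    return file_gb * 8 * 1000 / link_mbps

for name, size_gb in [("HD movie", 5.0), ("4K movie", 50.0), ("VR asset pack", 100.0)]:
    print(f"{name} ({size_gb:g} GB): {download_seconds(size_gb):,.0f} s")
```

At the full line rate, even a 100 GB download finishes in under 15 minutes, which is why bandwidth-hungry applications like VR are expected to be among the first beneficiaries.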