Author Archives: Patrick Nelson

Ambient ‘T-rays’ could help power IoT devices

Most things with a measurable temperature – human beings going about their daily routines, inert objects – generate terahertz waves, radiation that is sandwiched between infrared and microwave on the electromagnetic spectrum. So far, these waves haven’t proved very useful, but now scientists at MIT are trying to harness them with devices that convert them into electricity, which could charge the batteries of cellphones, laptops, even medical implants. If successful, the charging devices would passively gather the waves and generate DC current at room temperature, something that hasn’t been accomplished before. Previous devices that turn terahertz waves – T-rays – into electricity work only in ultracold environments, according to an MIT News article about the project.

A tiny experimental optical chip can support downloads of 1,000 movies in a split second

An optical chip the size of a fingernail has enabled Australian researchers to set new optical data-rate records on the country’s National Broadband Network (NBN). The raw data rate of 44.2Tbps over conventional optical cable is about three times the data rate of the entire NBN and about 100 times the speed of any single device currently used on the network, the researchers say. While the technology is meant for metro-area networks and data centers, those bit rates would support downloads of 1,000 movies in less than a second.
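A quick sanity check of the movies-per-second claim, assuming an average movie size of 5 GB (a figure of our choosing, not from the researchers):

```python
# Back-of-envelope check of the 44.2 Tbps claim.
# Assumed movie size (not from the article): 5 GB per movie.
link_bps = 44.2e12            # raw data rate, bits per second
movie_bytes = 5e9             # assumed average movie size, bytes
movie_bits = movie_bytes * 8  # 8 bits per byte

movies_per_second = link_bps / movie_bits
print(f"{movies_per_second:.0f} movies per second")  # ~1105
```

At that assumed file size, the link indeed moves comfortably more than 1,000 movies per second.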

How IoT will rescue aviation

A biotech company that develops sensors to detect explosives and other chemicals on planes and in airports is teaming up with Airbus to create a sensor that could detect passengers who are positive for COVID-19. California-based Koniku and Airbus, which have been working since 2017 on contactless equipment that sniffs out chemicals, are trying to adapt that technology to sniff out pathogens, says Osh Agabi, founder and CEO of Koniku, in a blog post.

New 6 GHz Wi-Fi could add $153 billion to U.S. economy: report

Opening the 6 GHz band to Wi-Fi could add $153.75 billion to the U.S. economy over the next five years, according to a new study. In late April, the Federal Communications Commission adopted rules that make 1,200 megahertz of spectrum in the 6 GHz band available for unlicensed use. Freeing up that chunk of spectrum for Wi-Fi is the biggest frequency-allocation upgrade for the now-aging wireless protocol in 10 years. Wi-Fi using 5 GHz spectrum – the last major touch-up – was introduced in 2009; the original 2.4 GHz Wi-Fi arrived in 1997.
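For a sense of what 1,200 MHz of new spectrum buys, here is a naive channel count (real 802.11 channel plans differ slightly because of guard bands and band-edge rules, so treat this as a sketch, not the official channelization):

```python
# Naive channel-count estimate for the new 1,200 MHz of 6 GHz spectrum.
# Ignores guard bands and band-edge rules, so actual channel plans
# (e.g. the IEEE 802.11 6 GHz channelization) differ slightly.
band_mhz = 1200
for width_mhz in (20, 40, 80, 160):
    count = band_mhz // width_mhz
    print(f"{width_mhz} MHz channels: {count}")
```

Even at the widest 160 MHz channel width, the new band fits seven non-overlapping channels, which is more 160 MHz room than the 5 GHz band offers.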

How underwater Internet of Things will work

More than two-thirds of the world’s surface is covered by water. It plays an important role in our economic existence, including in major verticals such as oil and gas, shipping and tourism. As the Internet of Things proliferates, questions arise as to how IoT will manifest itself underwater, given that radio waves degrade quickly over distance in seawater and underwater acoustic communication (which does work) is easily eavesdropped on and isn’t stealthy. To make the underwater Internet of Things happen, some say light is the answer. Researchers at King Abdullah University of Science and Technology (KAUST) in Thuwal, Saudi Arabia, are proposing underwater optical communications. They’re investigating simultaneous lightwave information and power transfer (SLIPT) configurations, which they’re using to transmit energy and data to underwater electronic devices. Recently, the researchers announced a breakthrough experiment in which they achieved two-way underwater transmission of data and power over 1.5 yards between a solar-panel-equipped sensor and a receiver.
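One reason light is attractive but challenging underwater: received optical power falls off exponentially with distance. A minimal sketch using a Beer-Lambert-style model, with an assumed attenuation coefficient (the real value depends heavily on wavelength and water clarity, and is not from the KAUST announcement):

```python
import math

# Exponential attenuation of an underwater optical link:
# received_power = transmitted_power * exp(-c * distance).
# The coefficient c below is an illustrative assumption for
# fairly clear water at blue-green wavelengths.
c = 0.15          # attenuation coefficient, 1/m (assumed)
distance_m = 1.4  # roughly the 1.5 yards in the experiment

fraction_received = math.exp(-c * distance_m)
print(f"{fraction_received:.1%} of transmitted optical power arrives")
```

Over a couple of meters most of the light survives, but the same exponential law makes long-range underwater optical links far harder.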

Harvesting ambient energy will power IoT, scientists say

Stray, ambient magnetic fields that are naturally created by electricity usage should be captured, diverted, and converted into power for Internet of Things sensors, researchers say. "Just like sunlight is a free source of energy we try to harvest, so are magnetic fields," said Shashank Priya, professor of materials science and engineering and associate vice president for research at Penn State, in a statement published on the university's web site. "We have this ubiquitous energy present in our homes, office spaces, work spaces and cars. It's everywhere, and we have an opportunity to harvest this background noise and convert it to useable electricity."
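For a rough sense of the energy on offer, Faraday's law gives the peak voltage a small pickup coil induces from a sinusoidal stray field. All parameter values below are illustrative assumptions of ours, not figures from the Penn State team:

```python
import math

# Order-of-magnitude estimate of the voltage a pickup coil could
# harvest from a stray 60 Hz magnetic field. For a sinusoidal field,
# Faraday's law gives: EMF_peak = N * A * B_peak * 2 * pi * f.
N = 1000          # coil turns (assumed)
A = 1e-4          # coil area, m^2 (1 cm x 1 cm, assumed)
B_peak = 1e-6     # stray field amplitude, tesla (~1 microtesla near wiring)
f = 60            # mains frequency, Hz

emf_peak = N * A * B_peak * 2 * math.pi * f
print(f"peak EMF: {emf_peak * 1e6:.1f} microvolts")
```

Tens of microvolts per coil is tiny, which is why practical harvesters rely on magnetoelectric materials and power-conditioning circuits rather than bare coils.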

Neural computing should be based on insect brains, not human ones

The bumblebee brain is a better model than the human brain for neural networks that might be used to run autonomous robots, an academic team believes. “It is pretty impressive that a bee can fly over five miles, then remember its way home, with a brain the size of a pinhead,” says Professor James Marshall of the University of Sheffield, quoted by multiple newspapers reporting on a presentation Marshall made to the American Association for the Advancement of Science conference in February. “It makes sense to me that we should try and mimic a bee brain in [autonomous systems], drones and driverless cars.”

Electronics should sweat to cool down, say researchers

Computing devices should sweat when they get too hot, say scientists at Shanghai Jiao Tong University in China, who have developed a materials application they claim will cool down devices more efficiently and in smaller form factors than existing fans. It’s “a coating for electronics that releases water vapor to dissipate heat from running devices,” the team explains in a news release. “Mammals sweat to regulate body temperature,” and so, they believe, should electronics. The group’s focus has been on porous materials that can absorb moisture from the environment and then release water vapor when warmed. MIL-101(Cr) checks the boxes, they say. The material is a metal-organic framework, or MOF – a sorbent, meaning a material that stores large amounts of water. The higher the water capacity, the greater the heat dissipated when the material is warmed.
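The physics behind the approach is water's large latent heat of vaporization. A back-of-envelope sketch, with assumed figures for the amount of water released and the device's heat output (neither is from the news release):

```python
# Why releasing water vapor cools so effectively: evaporating water
# absorbs a large, fixed amount of heat per kilogram.
# The water mass and chip power below are illustrative assumptions.
L_v = 2.26e6      # latent heat of vaporization of water, J/kg
m = 0.001         # 1 gram of water released by the coating, kg (assumed)
chip_power = 10   # steady heat output of a small device, watts (assumed)

heat_absorbed = m * L_v                    # joules carried away as vapor
seconds_offset = heat_absorbed / chip_power
print(f"{heat_absorbed:.0f} J absorbed ~= {seconds_offset:.0f} s of chip heat")
```

A single gram of evaporated water soaks up a few minutes' worth of heat from a 10 W device, which is why even a thin sorbent coating can matter.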

Seawater, humidity inspire new ways to generate power

The possibility of a future power-availability crunch – spurred in part by a global increase in data usage – is driving researchers to get creative with a slew of new and modified ways to generate and store energy. Ongoing projects include the use of seawater for batteries; grabbing ambient humidity; massive water-storage systems for hydropower; and solar panels that work at night. Here are some details.

Batteries based on seawater: Seawater will provide “super-batteries,” says the University of Southern Denmark. Researchers there have been studying how to use sodium, which is abundant in seawater, as an alternative to lithium in batteries.

A $399 device that translates brain signals into digital commands

Scientists have long envisioned brain-sensing technology that can translate thoughts into digital commands, eliminating the need for computer-input devices like a keyboard and mouse. One company is preparing to ship its latest contribution to the effort: a $399 development package for a noninvasive, AI-based brain-computer interface. The kit will let “users control anything in their digital world by using just their thoughts,” claims NextMind, a commercial spinoff of a cognitive neuroscience lab, in a press release.

Future ‘smart walls’ key to IoT

IoT equipment designers shooting for efficiency should explore the potential of using buildings as antennas, researchers say. Environmental surfaces such as walls can be used to intercept and beam signals, which can increase reliability and data throughput for devices, according to MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL). Researchers at CSAIL have been working on a smart-surface repeating antenna array called RFocus. The antennas, which could be applied in sheets like wallpaper, are designed to be incorporated into office spaces and factories. Radios that broadcast signals could then become smaller and less power-intensive.
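The gain from a phase-aligned surface can be sketched with a toy model: N signal paths that arrive in phase add in amplitude (received power scaling like N squared), while randomly phased paths add only in power (roughly N). A small simulation of our own, not the RFocus design itself:

```python
import cmath
import random

# Toy model of coherent vs. incoherent combining of 100 unit-amplitude
# signal paths. Phase-aligned paths sum in amplitude, so power ~ N^2;
# randomly phased paths sum incoherently, so power is typically ~ N.
random.seed(1)
N = 100

coherent = abs(sum(cmath.exp(1j * 0.0) for _ in range(N))) ** 2
incoherent = abs(sum(cmath.exp(1j * random.uniform(0, 2 * cmath.pi))
                     for _ in range(N))) ** 2

print(f"coherent power: {coherent:.0f}")      # N^2 = 10000
print(f"incoherent power: {incoherent:.0f}")  # typically on the order of N
```

The N-squared advantage of phase alignment is why a large, cheap repeating surface can beat simply adding more transmit power at the radio.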

How bacteria could run the Internet of Things

Biologically created computing devices could one day be as commonplace as today’s microprocessors and microchips, some scientists believe. Consider DNA, the carrier of genetic information and the principal component of chromosomes: it’s showing promise as a data-storage medium. A recent study (PDF) suggests taking matters further and using microbes to network and communicate at nanoscale. The potential is highly attractive for the Internet of Things (IoT), where concealability and unobtrusiveness may be needed for the technology to become completely ubiquitous.
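As a sketch of why DNA is information-dense: each of the four bases can encode two bits. A minimal illustrative codec of our own (real DNA-storage schemes add error correction and avoid long runs of a single base):

```python
# Minimal DNA codec: each nucleotide encodes 2 bits, so 4 bases = 1 byte.
# Illustrative only; practical DNA-storage schemes are more elaborate.
BASE_FOR_BITS = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
BITS_FOR_BASE = {base: bits for bits, base in BASE_FOR_BITS.items()}

def encode(data: bytes) -> str:
    # Emit one base per 2-bit chunk, most significant bits first.
    return "".join(BASE_FOR_BITS[(byte >> shift) & 0b11]
                   for byte in data for shift in (6, 4, 2, 0))

def decode(strand: str) -> bytes:
    # Reassemble each group of 4 bases into one byte.
    out = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | BITS_FOR_BASE[base]
        out.append(byte)
    return bytes(out)

strand = encode(b"IoT")
print(strand)                  # 12 bases for 3 bytes: CAGCCGTTCCCA
assert decode(strand) == b"IoT"
```

At two bits per base, a gram of DNA can in principle hold on the order of hundreds of petabytes, which is the density argument behind the field.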

How Open-RAN could ‘white-box’ 5G

One of Britain’s principal mobile networks, O2, has just announced that it intends to deploy Open Radio Access Network technology (O-RAN) in places. O-RAN is a wireless-industry initiative for designing and building radio network solutions using “a general-purpose, vendor-neutral hardware and software-defined technology,” explains Telecom Infra Project (TIP), the body responsible, on its website. TIP is the trade body that, along with Intel and Vodafone, conceived of the technology alternative – an attempt at toppling the dominance of Ericsson, Huawei and Nokia, which provide almost all mobile telco infrastructure today.

Beyond Moore’s Law: Neuromorphic computing?

With the conceivable exhaustion of Moore’s Law – the observation that the number of transistors on a microchip doubles every two years – the search is on for new paths to reliable, incremental processing gains over time. One possibility is that machines inspired by how the brain works could take over, fundamentally shifting computing to a revolutionary new tier, according to an explainer study published this month in Applied Physics Reviews. “Today’s state-of-the-art computers process roughly as many instructions per second as an insect brain,” say the paper’s authors, Jack Kendall of Rain Neuromorphics and Suhas Kumar of Hewlett Packard Labs. The two write that processor architecture must now be completely rethought if Moore’s Law is to be perpetuated, and that replicating the “natural processing system of a [human] brain” is the way forward.
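The basic unit of most neuromorphic designs is a spiking neuron rather than a logic gate. A minimal leaky integrate-and-fire model, with illustrative parameters of our own (not taken from the paper):

```python
# Leaky integrate-and-fire neuron, the classic neuromorphic primitive:
# the membrane potential leaks toward rest, input current charges it,
# and crossing a threshold emits a spike and resets the potential.
def lif_run(inputs, leak=0.9, threshold=1.0):
    v, spikes = 0.0, []
    for current in inputs:
        v = leak * v + current      # leaky integration of input
        if v >= threshold:          # threshold crossing -> spike
            spikes.append(1)
            v = 0.0                 # reset after firing
        else:
            spikes.append(0)
    return spikes

# A constant drive of 0.3 makes the neuron fire at a regular rate.
print(lif_run([0.3] * 10))  # [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

Information lives in the timing and rate of spikes, not in clocked binary words, which is the core architectural break from conventional processors.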

Instant, secure ‘teleportation’ of data in the works

Sending information instantly between two computer chips using quantum teleportation has been accomplished reliably for the first time, according to scientists from the University of Bristol, in collaboration with the Technical University of Denmark (DTU). Data was exchanged without any electrical or physical connection – a transmission method that may influence the next generation of ultra-secure data networks. Teleportation involves moving information instantaneously and securely. In the “Star Trek” series, fictional people move immediately from one place to another via teleportation. In the University of Bristol experiment, data is passed instantly via a single quantum state across two chips using light particles, or photons. Importantly, each of the two chips knows the characteristics of the other because they’re entangled through quantum physics, meaning they share a single physics-based state.
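For small systems, the textbook teleportation protocol behind such experiments can be simulated exactly on a classical machine. A from-scratch sketch of our own (not the Bristol team's code): the unknown state on qubit q0 reappears on the remote qubit q2 in every measurement branch, once the standard X/Z corrections are applied.

```python
import math

# Simulate standard quantum teleportation on a 3-qubit state vector.
# q0 holds the unknown state; q1 and q2 form an entangled Bell pair,
# with q2 playing the role of the remote chip.
# State-vector index: i = q0*4 + q1*2 + q2.

def apply_cnot(state, control, target):
    c, t = 2 ** (2 - control), 2 ** (2 - target)
    out = state[:]
    for i in range(8):
        if i & c:                   # flip target where control bit is set
            out[i] = state[i ^ t]
    return out

def apply_h(state, qubit):
    q = 2 ** (2 - qubit)
    s = 1 / math.sqrt(2)
    out = [0j] * 8
    for i in range(8):              # Hadamard mixes each |0>/|1> pair
        if i & q:
            out[i] = s * (state[i ^ q] - state[i])
        else:
            out[i] = s * (state[i] + state[i ^ q])
    return out

# Unknown state a|0> + b|1> on q0; Bell pair (|00> + |11>)/sqrt(2) on q1,q2.
a, b = 0.6, 0.8j
s = 1 / math.sqrt(2)
state = [0j] * 8
for q0 in (0, 1):
    amp = (a, b)[q0]
    state[q0 * 4 + 0] = amp * s     # q1 q2 = 00
    state[q0 * 4 + 3] = amp * s     # q1 q2 = 11

state = apply_cnot(state, 0, 1)     # entangle the message with the pair
state = apply_h(state, 0)

# For every measurement outcome (m0, m1) on q0 and q1, the corrected
# amplitudes on q2 match the original unknown state.
for m0 in (0, 1):
    for m1 in (0, 1):
        branch = [state[m0 * 4 + m1 * 2 + q2] for q2 in (0, 1)]
        if m1:
            branch = [branch[1], branch[0]]    # X correction
        if m0:
            branch = [branch[0], -branch[1]]   # Z correction
        norm = math.sqrt(sum(abs(x) ** 2 for x in branch))
        branch = [x / norm for x in branch]
        assert abs(branch[0] - a) < 1e-9 and abs(branch[1] - b) < 1e-9
print("state", (a, b), "teleported in all four measurement branches")
```

Note that the corrections depend on the two classical measurement bits, which must be sent over an ordinary channel, so the protocol cannot carry information faster than light.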

DIY communications networks to trend in 2020, says major telco

Communications networks without a centralized infrastructure will become more popular this year as people become increasingly aware of data collection by governments and tech companies, says telecommunications provider Telenor Group. The company points to fully encrypted mesh and peer-to-peer apps as the technology that will enable these consumer-level “off-the-grid, build-it-yourself” links. Mesh apps will also be useful in disasters where traditional networks fail. “Communicating without a central coordinating network is appealing to people for many reasons, and in 2020, we expect to see more go that route, especially in conflict situations, to mobilize for protests, and simply to stay below the radar,” the company says on its website.

Researchers aim for transistors that compute and store in one component

Researchers at Purdue University have made progress toward an elusive goal: building a transistor that can both process and store information. In the future, a single on-chip component could integrate the processing functions of transistors with the storage capabilities of ferroelectric RAM, potentially creating a process-memory combo that enables faster computing and is just atoms thick. The ability to cram more functions onto a chip, allowing for greater speed and power without increasing the footprint, is a core goal of electronics design. To get where they are today, engineers at Purdue had to overcome incompatibilities between transistors – the switching and amplification mechanisms used in almost all electronics – and ferroelectric RAM, a higher-performing memory technology whose material is non-volatile, meaning it retains information when power is lost, unlike traditional dielectric-layer-constructed DRAM.

Space-data-as-a-service gets going

Upcoming space commercialization will require hardened edge-computing environments in a small footprint, with robust links back to Earth, says vendor OrbitsEdge, which recently announced that it had started collaborating with Hewlett Packard Enterprise on computing-in-orbit solutions. OrbitsEdge says it’s the first to provide a commercial data-center environment for installation in orbit, and it will be using HPE’s Edgeline Converged Edge System in a hardened satellite micro-data-center platform that it sells, called SatFrame.