Helping next-generation 5G cellular technology see beyond the trees
Measurements of the impact of trees on 5G transmissions could prove vital for the use of a new class of signals.
As 5G technology is fully implemented over the next few years, mobile phones and other wireless technologies will become more powerful with increased data throughput and lower latency. But with these benefits comes a question: Will your next-gen cell phone be unable to see the forest for the trees?
This is one way of describing the problem facing cellular network designers, who must weigh the pros and cons of a new class of signals that 5G will use: millimeter waves. Not only can these waves carry more information than conventional transmissions, but they also usefully occupy a portion of the broadcast spectrum that communication technologies rarely use – a major consideration in an age when broadcasters compete for spectrum like prospectors staking claims.
However, millimeter waves also have drawbacks, including their limited ability to penetrate obstacles. These obstacles include buildings, but also the trees that dot the landscape. Until recently, little was known about how trees affect the propagation of millimeter waves. And just as few of us would want to imagine a landscape without greenery, few designers could plan networks around trees without this crucial fundamental detail.
The National Institute of Standards and Technology (NIST) set out to solve this problem by measuring the effect of trees on millimeter waves. The effort could make a profound difference in the ability of our next generation devices to see the 5G antennas that may soon sprout.
The era of 5G will feature wireless communication not only between people, but also between devices connected to the Internet of Things. Cellular customers' demand for larger downloads, and gamers' demand for lag-free network response, have prompted the wireless industry to seek faster and more efficient communication. Not only could our current devices and services run more efficiently, we could build new ones: autonomous vehicles will depend on such fast network response to function.
“We will be able to do new things if our machines can exchange and process information quickly and efficiently,” said Nada Golmie, head of NIST’s wireless networks division at the Communications Technology Lab. “But you need a good communications infrastructure. The idea is to connect, process the data in one place and do things with it elsewhere.”
Millimeter waves, which are new ground for the wireless industry, could be part of the solution. Their wave peaks are only a few millimeters apart, a very short distance compared to radio waves, which can reach several meters in length. And their frequencies are very high, somewhere between 30 and 300 gigahertz, or billions of wave peaks per second. Compared to conventional radio transmissions, which fall in the kilohertz (for AM) and megahertz (for FM) ranges, the new 5G signals will indeed be very high frequency – something like a bird singing at the upper limit of human hearing compared with radio's deep bass.
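The relationship between frequency and wave spacing described above is just the standard free-space formula λ = c / f. A minimal sketch (the frequency labels are illustrative choices, not values from the study) shows why 30–300 GHz signals earn the name "millimeter waves":

```python
# Compute free-space wavelength (lambda = c / f) for a few bands,
# showing that 30-300 GHz corresponds to millimeter-scale waves.
C = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength_m(freq_hz: float) -> float:
    """Free-space wavelength in meters for a frequency in hertz."""
    return C / freq_hz

bands = {
    "AM radio (1 MHz)": 1e6,
    "FM radio (100 MHz)": 100e6,
    "5G mmWave, low end (30 GHz)": 30e9,
    "5G mmWave, high end (300 GHz)": 300e9,
}
for name, f in bands.items():
    lam = wavelength_m(f)
    # Print in millimeters when the wavelength is below a meter.
    if lam < 1:
        print(f"{name}: {lam * 1000:.1f} mm")
    else:
        print(f"{name}: {lam:.0f} m")
```

At 30 GHz the wavelength is about a centimeter, shrinking to about a millimeter at 300 GHz, while AM and FM broadcast wavelengths run from meters to hundreds of meters.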
It is the high frequency of millimeter waves that makes them both attractive as data carriers and difficult to use. On the one hand, more wave peaks per second means the waves can carry more information, and our data-hungry age needs this capacity to deliver those faster downloads and network responses. On the other hand, high-frequency waves have difficulty passing through obstacles. Anyone who has walked past a house or car whose occupants are playing loud dance music knows that the throbbing low frequencies are most of what makes it outside, not the high notes of a singing soprano.
For 5G networks, an obstruction need not be a wall; it can be as slight as an oak leaf. For this reason, NIST scientists embarked on a somewhat unusual task in September 2019: they installed measurement equipment near trees and shrubs of various sizes around the agency’s campus in Gaithersburg, Maryland. The study went on for months, in part because they needed a seasonal perspective.
“The tree study is one of the few that examines the effect of the same tree on a particular signal frequency during different seasons,” Golmie said. “We couldn’t just do the survey in the winter, because things would have changed by summer. It turns out that even the shape of the leaves affects the reflection or transmission of a signal.”
The team worked with the wireless community to develop the mobile equipment needed to take the measurements. The researchers focused on single trees and directed millimeter wave signals at them from a range of angles and positions, to simulate waves coming from different directions. They measured the loss, or attenuation, in decibels. (Each 10 dB of loss represents a tenfold reduction in power; an attenuation of 30 dB means the signal is reduced by a factor of 1,000.)
For one type of deciduous tree, the European nettle tree, the average attenuation in summer was 27.1 dB, but it dropped to 22.2 dB in winter when the tree was bare. Evergreens blocked more of the signal: their average attenuation was 35.3 dB, a number that did not change with the season.
(For comparison, the team also looked at various building materials. Wooden doors, plasterboard walls, and interior glass showed losses of up to 40.5 dB, 31.6 dB, and 18.1 dB, respectively, while exterior building materials exhibited even greater losses, up to 66.5 dB.)
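Converting the measured averages quoted above into linear terms makes the comparison concrete. The dB figures below come from the article; the conversion itself is the standard decibel-to-power-ratio formula:

```python
# Fraction of signal power that gets through each obstacle,
# using the average/worst-case losses quoted in the article.
MEASURED_LOSS_DB = {
    "deciduous tree, summer": 27.1,
    "deciduous tree, winter": 22.2,
    "evergreen tree": 35.3,
    "exterior building material (worst case)": 66.5,
}

for obstacle, loss_db in MEASURED_LOSS_DB.items():
    fraction_through = 10 ** (-loss_db / 10)
    print(f"{obstacle}: {fraction_through:.2e} of power transmitted")
```

Read this way, a leafed-out deciduous tree passes roughly a third as much power as the same tree bare in winter, and an evergreen blocks all but a few ten-thousandths of the signal.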
While NIST’s contributions to 5G network development efforts might end up being as ubiquitous as the trees themselves, for most of us they will be considerably less visible. The measurements made by the team are primarily aimed at companies creating models of how different objects affect millimeter waves. Part of the effort was a collaboration with Ansys Inc., which used measurement data shared by NIST to tune its tree simulation models, which mobile companies in turn use to plan their antenna deployments in detail.
“Most models do not include information based on measurements of trees,” said David Lai of NIST, one of the scientists who conducted the study. “They might just say that for a given tree shape, we should expect some signal loss. We want to improve their models by providing accurate measurement-based propagation data.”
NIST’s collaboration with Ansys contributed to guidelines issued by the International Telecommunication Union (ITU), the organization that develops international telecommunications standards. The results now appear as a new section on trees in ITU Recommendation ITU-R P.833-10, a publication that serves as a reference for the signal propagation models that others will develop.
“Our goal is to get these metrics in front of the entire wireless community,” Golmie said. “We hope this effort will help the entire market.”