Powering radio access networks using renewables, such as wind and solar power, promises dramatic reductions in network operation cost and carbon footprint. However, the spatial variation of the energy field can cause fluctuations in the power supplied to the network and thereby affect its coverage. This warrants research on quantifying this negative effect and on countermeasure techniques, which motivates the current work. First, a novel energy field model is presented, in which a fixed maximum energy intensity $\gamma$ occurs at Poisson-distributed locations, called energy centers. The intensity falls off from each center according to an exponential decay function of the squared distance, and the energy intensity at an arbitrary location is given by the decayed intensity from the nearest energy center. The product of the energy-center density and the exponential decay rate, denoted by $\psi$, is shown to determine the distribution of the energy field. Next, the paper considers a cellular downlink network powered by harvesting energy from the energy field and analyzes its coverage. For the case where harvesters are deployed at the same sites as base stations (BSs), the mobile outage probability is shown to scale as $(c\gamma^{-\pi\psi} + p)$ as $\gamma$ increases, where $p$ is the outage probability corresponding to a flat energy field and $c$ is a constant. Subsequently, a simple scheme is proposed for counteracting the energy randomness by spatial averaging: distributed harvesters are deployed in clusters, and the energy generated within the same cluster is aggregated and then redistributed to the BSs. As the cluster size increases, the power supplied to each BS is shown to converge to a constant proportional to the number of harvesters per BS.
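As a minimal illustration of the energy field model described above, the following Python sketch samples energy centers as a homogeneous Poisson point process on a square window and evaluates the intensity at a point as the maximum intensity $\gamma$ decayed exponentially in the squared distance to the nearest center. The window size, the symbol names, and the sampling procedure are assumptions for illustration, not details taken from the paper.

```python
import math
import random

def sample_energy_field(gamma, center_density, decay_rate, side=10.0, seed=0):
    """Sketch of the energy field model (window and symbol names assumed).

    Energy centers form a homogeneous Poisson point process with the given
    density on a [0, side]^2 window. The intensity at a location is
    gamma * exp(-decay_rate * d^2), where d is the distance to the
    NEAREST energy center. The quantity psi = center_density * decay_rate
    is the parameter shown in the paper to determine the field distribution.
    """
    rng = random.Random(seed)
    # Draw the number of centers ~ Poisson(density * area) by CDF inversion.
    mean = center_density * side * side
    n, p, u = 0, math.exp(-mean), rng.random()
    cdf = p
    while u > cdf:
        n += 1
        p *= mean / n
        cdf += p
    centers = [(rng.uniform(0, side), rng.uniform(0, side)) for _ in range(n)]

    def intensity(x, y):
        # Decayed intensity from the nearest center; zero if no centers fell
        # in the window (possible for a small window or low density).
        if not centers:
            return 0.0
        d2 = min((x - cx) ** 2 + (y - cy) ** 2 for cx, cy in centers)
        return gamma * math.exp(-decay_rate * d2)

    return centers, intensity
```

By construction, the intensity equals $\gamma$ exactly at each energy center and decays toward zero away from all centers, matching the model's description of a spatially varying field with fixed peak intensity.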
