
Connectivity in device-to-device networks in Poisson-Voronoi cities

Here’s a recently uploaded manuscript:

  • 2023 – Keeler, Błaszczyszyn, Cali – Connectivity and interference in device-to-device networks in Poisson-Voronoi cities.

https://arxiv.org/abs/2309.02137

This work presents numerical results complementing our mathematical work. It concerns (continuum) percolation results for a special network model based on Poisson-Voronoi tessellations.

The most relevant works are these two papers (the first being somewhat seminal):

  1. Dousse, Franceschetti, Macris, Meester, Thiran, Percolation in the signal to interference ratio graph, 2006.
  2. Le Gall, Błaszczyszyn, Cali, and En-Najjary, Continuum line-of-sight percolation on Poisson-Voronoi tessellations, 2021.

Our work effectively seeks to combine these two papers. We obtain the equivalent results from the first paper by coupling its connectivity model with the connectivity model and network model (based on a Cox point process) presented in the second paper.

If you want a more detailed version, here’s the abstract:

To study the overall connectivity in device-to-device networks in cities, we incorporate a signal-to-interference-plus-noise connectivity model into a Poisson-Voronoi tessellation model representing the streets of a city. Relays are located at crossroads (or street intersections), whereas (user) devices are scattered along streets. Between any two adjacent relays, we assume data can be transmitted either directly between the relays or through users, given they share a common street. Our simulation results reveal that the network connectivity is ensured when the density of users (on the streets) exceeds a certain critical value. But then the network connectivity disappears when the user density exceeds a second critical value. The intuition is that for longer streets, where direct relay-to-relay communication is not possible, users are needed to transmit data between relays, but with too many users the interference becomes too strong, eventually reducing the overall network connectivity. This observation on the user density evokes previous results based on another wireless network model, where transmitter-receivers were scattered across the plane. This effect disappears when interference is removed from the model, giving a variation of the classic Gilbert model and recalling the lesson that neglecting interference in such network models can give overly optimistic results. For physically reasonable model parameters, we show that crowded streets (with more than six users on a typical street) lead to a sudden drop in connectivity. We also give numerical results outlining a relationship between the user density and the strength of any interference reduction techniques.

In future posts I’ll detail the above work as well as our more mathematical work on this type of percolation model.

The Newman-Ziff algorithm for simulating percolation models

Imagine you’re trying to estimate some statistic of a certain percolation system. For example, estimating the probability that a given bond is open, meaning connected, when a giant component forms. The system or model is too complicated for analytic results, which is true of all percolation models with the exception of a handful.

So you code up the percolation model in your favourite programming language, run a large number of stochastic simulations, collect the statistics, and then look at the results. Percolation systems are, by definition, large, but with fast computers, the simulation method works. You get your statistics.

Easy.

But that’s just for one parameter. What if you want to see what happens with a range of parameter values? Well, you do the same thing, but now just run the large number of simulations for a range of different parameter values, right?

No.

You could do that, and it would work. But it would be slow. How to make it faster?

Newman and Ziff proposed and implemented a fast percolation algorithm based on the simple ideas of keeping old simulations and using conditional probabilities.

Algorithm

The algorithm was presented in this paper:

  • 2000 – Newman, Ziff – Efficient Monte Carlo Algorithm and High-Precision Results for Percolation

There is also a longer companion paper with a slightly different title:

  • 2001 – Newman, Ziff – Fast Monte Carlo algorithm for site or bond percolation.

Overview

The simulation method consists of three components.

Components of the Newman-Ziff algorithm

  1. Re-use parts of previous simulation outcomes by shuffling.
  2. Use fast union-find algorithms.
  3. Find a smooth curve of results by using conditional probabilities.

None of them are entirely revolutionary, but put together they form a fast (and popular) simulation method for studying (monotonic) percolation models.

1. Randomly sampling sequences

The main insight was to sample in an incremental fashion, adding one open bond to the percolation model, such as a lattice, during each simulation run, without throwing away the previous simulation. More specifically, suppose one run samples a bond configuration with k open bonds. The next run then re-uses that configuration and simply adds one more open bond.

This is a nice trick, but there’s no free lunch here. On one hand, they don’t waste results. But on the other hand, they do throw away independence (and ergodicity) by doing this. But perhaps the numerical results don’t care.

To do this step, Newman and Ziff use a random shuffling algorithm called the Fisher-Yates shuffle. But it’s not clear whether Newman and Ziff thought of this shuffling algorithm independently: they never call it by its name, nor do they cite any work on this well-known shuffling method. Both MATLAB and Python (via NumPy) have functions for doing random shuffles.
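For illustration, here’s a minimal Python sketch of the Fisher-Yates shuffle (in practice you’d just call random.shuffle or numpy.random.permutation; the point of the loop is to show how little the method needs):

```python
import random

def fisher_yates_shuffle(items):
    """Shuffle a list in place; each permutation is equally likely."""
    for i in range(len(items) - 1, 0, -1):
        j = random.randint(0, i)  # random index in [0, i], inclusive
        items[i], items[j] = items[j], items[i]  # swap
    return items
```

In the Newman-Ziff setting, the shuffled list is the order in which the bonds are opened, so one shuffle per run fixes the entire incremental sequence of configurations.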

2. Using union-find

Imagine you have a collection of things. Some of these things are connected to other things in your collection. You want to understand how these clusters of connected things behave.

To find the clusters, ultimately looking for the biggest one, Newman and Ziff use a union-find method. There are different types of these algorithms, often hinging upon the use of recursion. They were developed mostly in the 1970s, particularly in work by Robert Tarjan.

These methods are very efficient at finding clusters. The algorithm speed or, rather, amortized complexity is expressed in terms of the inverse of an Ackermann function; the Ackermann function grows famously fast, so its inverse grows extraordinarily slowly. These methods rely upon the monotonicity of growing unions.

(If you have a non-monotonic percolating system, then unfortunately you can’t use these algorithms, which is the case for percolation based on the signal-to-interference-plus-noise ratio (SINR).)
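As a sketch of the idea (not Newman and Ziff’s own code), here is a small Python union-find structure using union by size and path compression; both merging two clusters and locating an element’s cluster run in nearly constant amortized time:

```python
class UnionFind:
    """Disjoint-set forest with path compression and union by size."""

    def __init__(self, n):
        self.parent = list(range(n))  # each element starts as its own root
        self.size = [1] * n           # cluster sizes, valid at roots

    def find(self, i):
        # Iterative two-pass find: locate the root, then compress the path.
        root = i
        while self.parent[root] != root:
            root = self.parent[root]
        while self.parent[i] != root:
            self.parent[i], i = root, self.parent[i]
        return root

    def union(self, i, j):
        ri, rj = self.find(i), self.find(j)
        if ri == rj:
            return ri                 # already in the same cluster
        if self.size[ri] < self.size[rj]:
            ri, rj = rj, ri           # attach the smaller tree to the larger
        self.parent[rj] = ri
        self.size[ri] += self.size[rj]
        return ri
```

The find method is written iteratively, which sidesteps recursion-depth limits on large lattices.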

3. Filling in the parameter gaps

For any random sample of a percolation model, the number of open bonds or sites is a natural number. But the parameter of the percolation model, such as the probability of a site or bond being open, is a real number.

Newman and Ziff use a simple conditioning argument to fill the gaps.

For any statistic S of a given percolation model, the Newman-Ziff algorithm estimates the expected statistic conditioned on the number n of open bonds (or equivalent objects in other models). In other words, the algorithm finds the conditional expectations S_n = E[S|N=n]. Then we just need the probability of N being n as a function of the parameter we’re trying to vary, such as the bond probability p. With this probability P(N=n), we immediately arrive at the expression for the statistic S from introductory probability

E(S)=E_N [E(S|N)].

For a nice discrete model with m bonds in total, we get the expression

E(S)=\sum _{n=0}^{m}[P(N=n)S_n].

But we know the probability (distribution) of N, as it’s simply a binomial variable. Assuming there are m total bonds (or equivalent objects), we arrive at

P(N=n)={m\choose n} p^n(1-p)^{m-n}.

Then the final statistic or result is simply

E(S)=\sum _{n=0}^{m} {m\choose n} p^n(1-p)^{m-n} S_n.
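Assuming the conditional statistics S_0, …, S_m have already been estimated, this smoothing step is a direct sum; a minimal Python version:

```python
from math import comb

def smooth_over_p(S, p):
    """Combine conditional statistics S[n] = E[S | N = n], n = 0..m,
    into E[S] at bond probability p using binomial weights."""
    m = len(S) - 1
    return sum(comb(m, n) * p ** n * (1 - p) ** (m - n) * S[n]
               for n in range(m + 1))
```

For very large m the direct products can under- or overflow, so in practice it’s safer to build the binomial weights iteratively, working outward from their peak near n = pm.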

Code

I wrote the algorithm in MATLAB and Python. The code can be found here. I’ve included a script for plotting the results (for small square lattices).
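As a self-contained illustration of how the first two components fit together (separate from the linked code, and only a sketch), here is a minimal pure-Python version for bond percolation on a small L-by-L square lattice with periodic boundaries. It returns the mean largest-cluster fraction after each number of open bonds:

```python
import random

def newman_ziff_bond(L, runs=100, seed=None):
    """Mean largest-cluster fraction after n open bonds (n = 0..m)
    on an L x L square lattice with periodic boundary conditions."""
    rng = random.Random(seed)
    n_sites = L * L
    # Each site contributes one bond to its right and one to its down neighbour.
    base_bonds = [(s, (s // L) * L + (s + 1) % L) for s in range(n_sites)] \
               + [(s, (s + L) % n_sites) for s in range(n_sites)]
    m = len(base_bonds)
    S = [0.0] * (m + 1)  # S[n]: accumulated largest-cluster size with n bonds
    for _ in range(runs):
        bonds = base_bonds[:]
        rng.shuffle(bonds)             # component 1: shuffle the bond order
        parent = list(range(n_sites))  # component 2: union-find forest
        size = [1] * n_sites
        largest = 1

        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]  # path halving
                i = parent[i]
            return i

        S[0] += largest
        for n, (a, b) in enumerate(bonds, start=1):
            ra, rb = find(a), find(b)
            if ra != rb:               # opening this bond merges two clusters
                if size[ra] < size[rb]:
                    ra, rb = rb, ra
                parent[rb] = ra
                size[ra] += size[rb]
                largest = max(largest, size[ra])
            S[n] += largest
    return [s / (runs * n_sites) for s in S]
```

Feeding the returned list into the binomial smoothing formula from the previous section then gives the statistic as a smooth function of the bond probability p.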

MATLAB

It turns out that MATLAB doesn’t like recursion too much. Well, at least that’s my explanation for the slow results I got when using the union-find method with recursion. So I used a union-find method without recursion.

Python

My Python code is mostly for illustration purposes. Check out this Python library. That website warns of the traps that multithreading in NumPy can pose.

C

In the other original paper, Newman and Ziff included C code for their algorithm. You can find it on Newman’s website, but I also have a copy of it (with added comments from me) located here.

Future work

I plan to implement this algorithm using the NVIDIA CUDA library. This is exactly the type of algorithm that we can implement and run on graphics processing units (GPUs), which is the entire point of the CUDA library.

At first glance, the spatial dependence of the problem suggests using texture memory to speed up memory access on the GPU. But I will explain why that’s probably not a good idea for this specific algorithm.

Further reading

Newman wrote the very readable book:

  • Newman, Networks: An Introduction, Oxford University Press.

Using more words than equations (not a criticism), Newman explains the logic behind the algorithm.

The Newman-Ziff algorithm exists in the Python library PyPercolate:

https://pypercolate.readthedocs.io/en/stable/newman-ziff.html

Union-find algorithms are now standard fare in computer science books. I recommend the book by Tarjan himself:

  • Tarjan, Data structures and network algorithms, SIAM.

Some of these algorithms are covered in standard computer science textbooks on algorithms. For example:

  • Cormen, Leiserson, Rivest, and Stein,  Introduction to Algorithms, The MIT Press;
  • Sedgewick and Wayne,  Algorithms, Addison-Wesley.

This website gives a quick introduction to union-find in Python:

https://www.delftstack.com/howto/python/union-find-in-python/

Who was S.R. Broadbent?

I found myself recently wondering who the first author was of a seminal paper that created a mathematical field. I did some online digging and noted my findings here for others who may ask the same question.

Percolation theory

The 1957 paper by S.R. Broadbent and J.M. Hammersley gave birth to percolation theory. I noticed the author affiliations:

S.R. Broadbent, United Glass Bottle Manufacturers
J.M. Hammersley, United Kingdom Atomic Energy Research Establishment Harwell

I knew John Hammersley’s name already, as he had co-written a classic book on Monte Carlo methods. Indeed, if you look at the end of the paper, you see the paper’s idea was born at an early conference on Monte Carlo methods.

But who was S.R. Broadbent? And what was he doing at a bottle factory?

Probability work

This Broadbent character was a bit mysterious. At first, it seemed as though the writer always published as S.R. Broadbent, so much so that I thought he might be a she. Such things have happened before, as the story behind the influential 1960 paper by A.H. Land and A.G. Doig demonstrates. (Ailsa H. Land died only in 2021. Alison G. Doig is still alive, but she has been known as Alison Harcourt and has worked as a tutor at the University of Melbourne for many, many years.)

From the start I could see S.R. Broadbent was strong in probability. In the 1950s Broadbent wrote probability papers, including a 1953 paper co-written with David G. Kendall, the British father of stochastic geometry. (For those across la Manche, the French father of stochastic geometry is Georges Matheron.) The Broadbent and Kendall paper has the improbable title The Random Walk of Trichostrongylus retortaeformis. (It’s a type of larva.)

And then I was thrown by the title of a paper co-written by Broadbent in the 1960s: A computer assessment of media schedules.

Simon says ley lines are just spatial coincidences

Then I lost track of S.R. Broadbent in terms of academic papers. I had almost given up, assuming Broadbent had left academia. But then I saw Broadbent’s name mentioned in the acknowledgements of a famous spatial statistics paper written by David Kendall in 1989. In the paper Kendall cites a 1980 paper written by S.R. Broadbent. The subject was the statistics of ley lines (yes, it’s a thing), showing they can be explained by mere chance. And in the acknowledgements Kendall thanks a certain Simon Broadbent, presumably the S.R. Broadbent whose paper Kendall cited.

Had I found the S.R. Broadbent? I was a bit skeptical, but then he’s mentioned as Simon Broadbent in this obituary of Hammersley:

Despite the title of the paper, ‘Poor man’s Monte Carlo’, the lasting contributions are the clear statement of the problem of counting self-avoiding walks, the use of subadditivity to prove the existence of the connective constant, and the discussion of random media that culminated in Simon Broadbent’s formulation of the percolation model.

That reassured me.

And then I found two 1950s papers (1954 and 1958) by a Simon Broadbent, not S.R. Broadbent, published in the Journal of the Royal Statistical Society Series C: Applied Statistics. I had actually stumbled upon the 1958 paper earlier in my searches. Initially I didn’t think it was S.R. Broadbent because the journal referred to the author as Dr Simon Broadbent, and I thought only medical doctors do that to each other. But it’s a statistics journal. So it’s probably S.R. Broadbent. Also, for this 1954 paper in the same journal they call him Mr Simon Broadbent.

And then I looked at the paper by Broadbent that David Kendall cited, and in that paper Broadbent acknowledges Kendall. And then I noticed that this paper was actually written by Simon Broadbent, but Kendall had cited the paper and written its author as S.R. Broadbent. The final link. S.R. Broadbent was Simon Broadbent — whoever that was.

I thought I had come to the end. But I hadn’t.

Advertising guru

A non-scientific article had floated up in my search results talking about the death of some big name in advertising. Seeing the title and reading the start, I thought it couldn’t be the same Simon Broadbent.

Simon asked simple questions: How does advertising work? How much advertising is enough? How can we tell whether our ad campaign has succeeded? Is adspend profitable?
When he entered advertising in the 60s, none of these questions had satisfactory answers. Many doubted they could ever be answered. Simon did more than anyone to show that advertising can contribute to profit, and that it is not a cost that can be cut with no effect on sales.

But after reading more, I realized this had to be the very same Broadbent.

He answered advertising questions with the rigour of a mathematician and the clarity of a poet. He read engineering at Cambridge, took an applauded first in mathematics and a diploma in statistics at Magdalen, Oxford, and completed his doctorate in statistics at Imperial, London. His poetry was published by Blackwell’s while he was still an undergraduate.

His lifelong interest was applying statistics to problems that nobody had thought amenable to statistical analysis. His paper to the Royal Statistical Society, In Search of the Ley Hunter, debunked claims for the existence of megalithic ley lines, pointing out that there were fewer alleged lines than would be expected from a random distribution of points between which lines could be drawn. Preparing the paper also allowed him to crawl about in Greenwich Park, testing the likely accuracy of stone-age surveying devices using instruments he had built himself.

Simon R. Broadbent made quite the impression in advertising, even being called a global legend in media research. Some of his contributions to this field are mentioned in this published tribute. It seems this field remembered him better than his original field. I couldn’t find any tributes focusing on his mathematical contributions.

Mystery solved, except for what the R stands for.

Back to percolation

Now knowing the S stood for Simon, I noticed that Simon Broadbent is mentioned in the Wikipedia article on percolation theory. The article cites a tutorial article by John Hammersley and Dominic Welsh, which gives the original motivation of percolation theory:

The study of percolation originated from a question posed to one of us by S. R. Broadbent (see Hammersley and Morton 1954). Broadbent was at that time working at the British Coal Utilization Research Association on the design of gas masks for use in coal mines. These masks contained porous carbon granules (the medium) into which gas (the fluid) could penetrate. The pores in a granule constituted a random network of tiny interconnecting tunnels, into which the gas could move by surface adsorption; and the question was what properties of such a random network would assist or inhibit the uptake of gas. If the pores were large enough and richly enough connected, the gas could permeate the interior of a granule; but if the pores were too small or inadequately connected, the gas could not get beyond the outer surface of a granule. Thus there was a critical point, above which the mask worked well and below which it was ineffective. Critical phenomena of this sort are typical of percolation processes, and do not normally arise in ordinary diffusion.

The original Broadbent and Hammersley paper did mention that Broadbent had previously worked at British Coal Utilisation Research Association (BCURA).

(Another aside: the same Wikipedia article says Rosalind Franklin, whose X-ray work helped reveal the structure of DNA, also worked on porous material at the British Coal Utilisation Research Association before leaving in 1946, which I guess was before Broadbent started working there. You can read about her career, for example, here.)

Summary

Simon R. Broadbent was a trained engineer and mathematician who co-founded the thriving mathematical discipline of percolation theory, before leaving that world to revolutionize the field of advertising with quantitative methods and influential books.

A Fields Medal goes to another percolation researcher

The Fields Medal is a prize in mathematics awarded every four years to two, three, or four outstanding researchers (forty years old or younger). One of this year’s medals was awarded to French mathematician Hugo Duminil-Copin, who has solved problems and obtained new results in percolation theory, which lies at the intersection of probability and statistical physics. Here’s a good Quanta article on Duminil-Copin and some of his work.

(The other winners are June Huh, James Maynard, and Maryna Viazovska.)

The Fields Medal people have been kind to probability researchers in recent years. Previous winners working in probability include Wendelin Werner (2006), Stanislav Smirnov (2010), and Martin Hairer (2014), while other recent winners have also made contributions to probability.

All in all, that’s not too shabby for a discipline that for a long, long time wasn’t considered part of mathematics. (That story deserves a post of its own.)

I work nowhere near Duminil-Copin, but I have used some percolation theory in my work. I will write a couple of posts on percolation theory. Eventually, I may even mention some recent work that my collaborators and I have been working on.