We can’t all talk all the time, while still being heard. That’s true for people and it’s true for wireless networks.
Networks need to schedule when their nodes can transmit and receive data. In other words, one subset of nodes is chosen to transmit, while another subset is chosen to receive. (In telecommunication language, this is part of what is called the medium access control or MAC layer.)
The classic scheduler, (spatial) Aloha, has each node flip a biased coin that lands heads with probability \(p\). If heads comes up, the node transmits. If not, it must stand ready to receive data. Simple.
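To make the coin flipping concrete, here is a minimal Python sketch of an Aloha schedule; the function name and parameter values are my own illustrative choices, not from any of the papers.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def aloha_schedule(num_nodes, p, rng):
    """Aloha: each node independently flips a biased coin with heads probability p.
    Returns a boolean array; True means "transmit", False means "receive"."""
    return rng.random(num_nodes) < p

# Example: schedule 10 nodes with transmit probability p = 0.3.
print(aloha_schedule(10, 0.3, rng))
```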
But can we do better?
Determinantal scheduling
A few years ago, a couple of collaborators and I became interested in this problem. To develop an intelligent (or, at least, not stupid) scheduler, we adapted a machine (or statistical) learning framework and published the results in a conference paper:
- 2020 – Błaszczyszyn, Brochard, and Keeler – Coverage probability in wireless networks with determinantal scheduling.
The hint is in the title. Using a determinantal scheduler, we derived mathematical expressions for the coverage probability. I discussed that paper in an earlier post. I produced the numerical results with code located here and here.
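The paper gives mathematical expressions for the determinantal case, but for intuition, here is a minimal Monte Carlo sketch that estimates the coverage probability (the probability that a typical link's SINR exceeds a threshold) under the simpler Aloha scheduler, assuming a power-law path-loss model with Rayleigh fading. All parameter values are illustrative assumptions of mine, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(seed=2)

# Illustrative parameters (my choices, not taken from the paper).
num_nodes = 20     # candidate transmitters, placed uniformly on a unit square
p = 0.3            # Aloha transmit probability
beta = 4.0         # path-loss exponent
tau = 1.0          # SINR threshold
noise = 1e-3       # noise power
link_dist = 0.05   # each transmitter's intended receiver sits this far away
num_trials = 5_000

covered = 0
total_links = 0
for _ in range(num_trials):
    tx = rng.random((num_nodes, 2))  # transmitter locations
    angles = rng.uniform(0, 2 * np.pi, num_nodes)
    rx = tx + link_dist * np.column_stack((np.cos(angles), np.sin(angles)))
    active = np.flatnonzero(rng.random(num_nodes) < p)  # Aloha coin flips
    for i in active:
        fades = rng.exponential(1.0, active.size)   # Rayleigh fading (powers)
        dists = np.linalg.norm(tx[active] - rx[i], axis=1)
        powers = fades * dists ** (-beta)           # received powers at rx[i]
        k = np.flatnonzero(active == i)[0]          # index of the own signal
        sinr = powers[k] / (noise + powers.sum() - powers[k])
        covered += sinr > tau
        total_links += 1

print("Estimated coverage probability:", covered / total_links)
```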
From fermions to generative models
We used a learning framework based on determinantal point processes, which were originally used to model subatomic particles called fermions, such as electrons. A key property is that determinantal points, like fermions, repel each other, making this a random model for diversity, anti-clustering, or negative correlations.
Defined with a kernel matrix and determinants, these random models have convenient mathematical properties. For example, you can both simulate them (exactly) and train (or fit) them with maximum likelihood methods. Building on this fermion model, Kulesza and Taskar pioneered a machine learning framework.
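As an illustration of exact simulation, here is a minimal Python sketch of the standard spectral sampling algorithm, originally due to Hough and collaborators and described in Kulesza and Taskar's book; the toy kernel at the end is my own choice.

```python
import numpy as np

def sample_dpp(K, rng):
    """Exact sampling from a DPP with (marginal) kernel K,
    whose eigenvalues must lie in [0, 1]."""
    eigvals, eigvecs = np.linalg.eigh(K)
    # Phase 1: keep eigenvector n independently with probability lambda_n.
    V = eigvecs[:, rng.random(len(eigvals)) < eigvals]
    sample = []
    # Phase 2: select one item per kept eigenvector.
    while V.shape[1] > 0:
        probs = np.sum(V**2, axis=1) / V.shape[1]  # valid distribution over items
        i = rng.choice(len(probs), p=probs)
        sample.append(i)
        # Project remaining columns onto the complement of e_i, re-orthonormalize.
        j = np.flatnonzero(np.abs(V[i, :]) > 1e-12)[0]
        Vj = V[:, j]
        V = np.delete(V, j, axis=1)
        V = V - np.outer(Vj, V[i, :] / Vj[i])
        if V.shape[1] > 0:
            V, _ = np.linalg.qr(V)
    return sorted(sample)

rng = np.random.default_rng(seed=3)
# Toy kernel: Gaussian similarities of 10 points on a line, converted to a
# marginal kernel K = L (L + I)^{-1} so its eigenvalues lie in (0, 1).
x = np.linspace(0, 1, 10)
L = np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.05)
K = L @ np.linalg.inv(L + np.eye(10))
print(sample_dpp(K, rng))  # a random, repulsive subset of {0, ..., 9}
```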
When properly trained (or fitted), determinantal models can act as generative artificial intelligence (or Gen AI) models. In some sense, these point processes are special types of Gibbsian point processes, which are defined with a general energy function called a Hamiltonian. In AI and computer science more broadly, there are many energy-based models, some of which serve as generative models, such as Boltzmann machines.
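To make the energy-based connection explicit: under an L-ensemble, a subset \(A\) has probability \(P(A) = \det(L_A)/\det(L+I)\), so up to the normalizing constant the Hamiltonian is \(H(A) = -\log \det(L_A)\). Here is a minimal sketch of the log-likelihood, which is also the quantity one maximizes when fitting a parametrized kernel; the kernel and subsets are my own toy choices.

```python
import numpy as np

def dpp_log_likelihood(L, subset):
    """Log-probability of a subset A under an L-ensemble:
    P(A) = det(L_A) / det(L + I)."""
    L_A = L[np.ix_(subset, subset)]
    _, logdet_A = np.linalg.slogdet(L_A)
    _, logdet_norm = np.linalg.slogdet(L + np.eye(L.shape[0]))
    return logdet_A - logdet_norm

# The same toy Gaussian-similarity kernel as above.
x = np.linspace(0, 1, 10)
L = np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.05)
print(dpp_log_likelihood(L, [0, 4, 9]))  # spread-out subset: likelier
print(dpp_log_likelihood(L, [0, 1, 2]))  # clustered subset: less likely
```

The two print statements illustrate the repulsion: the spread-out subset has a nearly diagonal submatrix \(L_A\) and hence a larger determinant, while the clustered subset's submatrix is nearly singular.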
Recent work
A few days ago I was curious about what was happening in this research area and, by chance, I noticed this recently uploaded preprint:
- 2025 – Tu, Saha, and Dhillon – Determinantal Learning for Subset Selection in Wireless Networks.
I feel more work remains to be done in this area.
Further reading
Wireless networks
Our work was inspired by an earlier paper of ours:
- 2019 – Błaszczyszyn and Keeler – Determinantal thinning of point processes with network learning applications.
I mentioned that paper in an earlier post.
The new work by Tu, Saha, and Dhillon builds on previous work, such as this ambitiously titled paper:
- 2019 – Saha and Dhillon – Machine Learning meets Stochastic Geometry: Determinantal Subset Selection for Wireless Networks.
This work by Saha and Dhillon was independent of our work on scheduling, but it came out around the same time.
Machine learning
I should stress that none of the above work would have been possible without the pioneering efforts of Kulesza and Taskar in a series of papers, much of which appears in their book:
- 2012 – Kulesza and Taskar – Determinantal point processes for machine learning, Now Publishers.
You can find a version of the book here.