Determinantal learning for selecting subsets in wireless networks

We can’t all talk at the same time and still be heard. That’s true for people, and it’s true for wireless networks.

Networks need to schedule when their nodes can transmit and receive data. In other words, a subset of nodes is chosen to transmit, while another subset is chosen to receive. (In telecommunication language, this is part of what is called the medium access control or MAC layer.)

The classic scheduler, (spatial) Aloha, involves each node flipping a biased coin that lands heads with probability \(p\). If heads comes up, the node transmits. If not, it must stand ready to receive data. Simple.
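For concreteness, here is a minimal Python sketch of the Aloha coin flip (the function and variable names are my own, not from any particular implementation):

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def aloha_schedule(num_nodes, p):
    """Spatial Aloha: each node independently flips a biased coin and
    transmits if it lands heads (with probability p)."""
    return rng.random(num_nodes) < p  # True = transmit, False = receive

# Example: 10 nodes, each transmitting with probability 0.3.
flags = aloha_schedule(10, 0.3)
```

Each node decides independently, which is what makes Aloha so simple, and also what stops it from exploiting the geometry of the network.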

But can we do better?

Determinantal scheduling

A few years ago, a couple of collaborators and I became interested in this problem. To develop an intelligent (or, at least, not stupid) scheduler, we adapted a machine (or statistical) learning framework and published the results in a conference paper:

  • 2020 – Błaszczyszyn, Brochard, and Keeler, Coverage probability in wireless networks with determinantal scheduling.

The hint is in the title. Using a determinantal scheduler, we derived mathematical expressions for the coverage probability. I discussed that paper in an earlier post. I produced the numerical results with code located here and here.

From fermions to generative model

We used a learning framework based on determinantal point processes, which were originally used to model subatomic particles called fermions, such as electrons. A key property is that determinantal points, like fermions, repel each other, making this a random model for diversity, anti-clustering, or negative correlations.

Defined with a kernel matrix and determinants, these random models have convenient mathematical properties. For example, you can both simulate them (exactly) and train (or fit) them with maximum likelihood methods. Building on this fermion model, Kulesza and Taskar proposed and pioneered a machine learning framework.
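To make the determinantal definition concrete: for a finite ground set, one common formulation is the L-ensemble, where the probability of drawing exactly the subset \(A\) is \(\det(L_A)/\det(L+I)\), with \(L_A\) the submatrix of the kernel \(L\) indexed by \(A\). A small Python sketch with a toy kernel of my own (not from the paper), checking that these probabilities sum to one:

```python
import itertools
import numpy as np

# A toy positive semidefinite kernel L on a ground set of 4 items.
rng = np.random.default_rng(0)
B = rng.random((4, 4))
L = B @ B.T  # PSD by construction

def subset_probability(L, A):
    """P(X = A) = det(L_A) / det(L + I) for an L-ensemble DPP."""
    L_A = L[np.ix_(A, A)]
    det_A = np.linalg.det(L_A) if len(A) > 0 else 1.0  # empty det = 1
    return det_A / np.linalg.det(L + np.eye(L.shape[0]))

# Probabilities over all 2^4 subsets sum to 1,
# by the identity sum_A det(L_A) = det(L + I).
total = sum(subset_probability(L, list(A))
            for r in range(5)
            for A in itertools.combinations(range(4), r))
```

Having the normalizing constant \(\det(L+I)\) in closed form is one of the convenient properties mentioned above; for Gibbsian models in general, the normalizing constant is intractable.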

When properly trained (or fitted), determinantal models can act as generative artificial intelligence (or Gen AI) models. In some sense, these point processes are special types of Gibbsian point processes, which are defined with a general energy function called a Hamiltonian. In the field of AI and computer science more broadly, there are many energy-based models, some of which are used for generative models such as Boltzmann machines.

Recent work

A few days ago I was curious what was happening in this research area, and by chance, I noticed this recently uploaded preprint:

  • 2025 – Tu, Saha, and Dhillon – Determinantal Learning for Subset Selection in Wireless Networks.

I feel more work remains to be done in this area.

Further reading

Wireless networks

Our work was inspired by an earlier paper of ours:

  • 2019 – Błaszczyszyn and Keeler, Determinantal thinning of point processes with network learning applications.

In that paper, which I mentioned in an earlier post, we discussed the potential of using determinantal point processes for scheduling network transmission (or selecting subsets). We pointed out that such a model could be trained on optimized network configurations.

The new work by Tu, Saha, and Dhillon is based on previous work, such as this ambitiously titled paper:

  • 2019 – Saha and Dhillon – Machine Learning meets Stochastic Geometry: Determinantal Subset Selection for Wireless Networks.

This work by Saha and Dhillon was independent of our work on scheduling, but came out around the same time.

Machine learning

I should stress that none of the above work was possible without the pioneering efforts by Kulesza and Taskar in a series of papers, much of which appears in their book:

  • 2012 – Kulesza and Taskar – Determinantal point processes for machine learning, Now Publishers.

You can find a version of the book here.

Coverage probability in wireless networks with determinantal scheduling

My collaborators and I uploaded a manuscript:

  • Błaszczyszyn, Brochard, and Keeler, Coverage probability in wireless networks with determinantal scheduling.

https://arxiv.org/abs/2006.05038

Details

The paper builds on some previous work of ours that uses a (relatively) new model in machine learning:

  • Błaszczyszyn and Keeler, Determinantal thinning of point processes with network learning applications.

https://arxiv.org/abs/1810.08672

The new machine learning model is based on a special type of point process called a determinantal point process, originally called a fermion point process. These point processes are useful because they exhibit closure properties under certain operations, such as independent thinning.
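As an illustration of the thinning closure (a sketch with a toy kernel of my own, not code from the paper): writing \(K = L(L+I)^{-1}\) for the correlation kernel, where \(\det(K_A)\) gives the probability that the subset \(A\) is contained in the process, independently retaining each point with probability \(q\) yields another determinantal process with kernel \(qK\):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.random((5, 5))
L = B @ B.T                      # toy PSD L-ensemble kernel
n = L.shape[0]

# Correlation kernel: P(A subset of X) = det(K_A).
K = L @ np.linalg.inv(L + np.eye(n))

q = 0.5                          # independent retention probability
K_thinned = q * K                # kernel of the thinned process

# Single-point inclusion probabilities (the kernel diagonal) scale by q.
p_before = np.diag(K)
p_after = np.diag(K_thinned)
```

So the thinned process stays within the determinantal family, which is what makes independent thinning such a convenient operation here.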

Determinantal point processes are random objects and, when properly trained or fitted, can serve as generative artificial intelligence (or Gen AI) models. In fact, you can interpret these point processes as special types of Gibbsian point processes, which have at their core a Hamiltonian function. A Hamiltonian is a generalized type of energy function, which is also the basis for other generative (or stochastic) models such as Boltzmann machines.

Kulesza and Taskar introduced and developed the framework for using determinantal point processes for machine learning models.

Code

The MATLAB code for producing the results in the paper can be found here:

https://github.com/hpaulkeeler/detcov_matlab

I also re-wrote the MATLAB code into Python:

https://github.com/hpaulkeeler/detcov_python

Further reading

Our work started with this related paper:

  • Błaszczyszyn and Keeler, Determinantal thinning of point processes with network learning applications.

I wrote about this work briefly in an earlier post.

Parallel (or orthogonal, perhaps) to our work is this paper:

  • 2019 – Saha and Dhillon – Machine Learning meets Stochastic Geometry: Determinantal Subset Selection for Wireless Networks.

https://arxiv.org/abs/1905.00504

Update: A manuscript extending the above line of research of Saha and Dhillon was uploaded:

  • 2025 – Tu, Saha and Dhillon – Determinantal Learning for Subset Selection in Wireless Networks.

https://arxiv.org/abs/2503.03151

Determinantal thinning of point processes with network learning applications

My colleague and I uploaded a manuscript:

  • Błaszczyszyn and Keeler, Determinantal thinning of point processes with network learning applications.

https://arxiv.org/abs/1810.08672

Details

The paper uses a (relatively) new modeling framework in machine learning. This framework is based on a special type of point process called a determinantal point process, which is also called a fermion point process. (This particle model draws inspiration from the form of the wave function in quantum mechanics.) Kulesza and Taskar introduced and developed the framework for using determinantal point processes for machine learning models.

The paper covers a type of generative machine learning (or Gen AI) model, as it uses randomness to create network outcomes. In fact, you can interpret these point processes as special types of Gibbsian point processes, which have at their core a Hamiltonian function. A Hamiltonian is a generalized type of energy function, which is also the basis for other generative (or stochastic) models such as Boltzmann machines.

The training method for determinantal point processes is based on the maximum likelihood approach, which gives a convex optimization problem, allowing you to tackle it with your favourite gradient method, such as the BFGS or Adam method.
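As a sketch of this training step, here is a toy one-parameter example in Python (the parametrization \(L(\theta) = e^{\theta} S\) and the toy data are my own illustrations, not the paper's model), fitting by minimizing the negative log-likelihood with BFGS via SciPy:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
B = rng.random((6, 6))
S = B @ B.T                        # fixed toy similarity matrix (PSD)
n = S.shape[0]

# Toy "observed" subsets standing in for optimized network configurations.
data = [[0, 2, 4], [1, 3], [0, 5], [2, 3, 5]]

def neg_log_likelihood(theta):
    """Negative log-likelihood of an L-ensemble with L = exp(theta) * S."""
    L = np.exp(theta[0]) * S
    norm = np.linalg.slogdet(L + np.eye(n))[1]   # log det(L + I)
    ll = 0.0
    for A in data:
        ll += np.linalg.slogdet(L[np.ix_(A, A)])[1] - norm
    return -ll

result = minimize(neg_log_likelihood, x0=np.zeros(1), method="BFGS")
theta_hat = result.x[0]
```

In practice the parametrization has many more parameters (for example, a quality term per point), but the structure of the objective is the same: a sum of log-determinants of kernel submatrices minus the log normalizing constant.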

It turns out that point processes are well suited to studying the signal-to-interference-plus-noise ratio (SINR); see this post for details. More specifically, we found these point processes give tractable SINR-based expressions when you use (what I call) the standard (SINR) model, which I covered in a previous post.
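For reference, under the standard model the SINR at a receiver is the received signal power (with power-law path loss and random fading) divided by the noise plus the total interference from the other transmitters. A small Python sketch with placeholder parameter values of my own:

```python
import numpy as np

rng = np.random.default_rng(4)

def sinr(distances, fades, signal_index, noise=1e-3, beta=4.0, power=1.0):
    """SINR of one transmitter: signal over noise plus total interference.

    distances: transmitter-to-receiver distances; fades: fading variables
    (e.g. exponential for Rayleigh fading); beta: path-loss exponent.
    """
    received = power * fades * distances ** (-beta)   # per-transmitter power
    signal = received[signal_index]
    interference = received.sum() - signal
    return signal / (noise + interference)

# Three transmitters; the receiver listens to the nearest one (index 0).
d = np.array([1.0, 2.0, 3.0])
h = rng.exponential(scale=1.0, size=3)   # Rayleigh fading -> exponential power
value = sinr(d, h, signal_index=0)
```

The exponential fading is what makes the Rayleigh case so tractable: the coverage probability, the probability that the SINR exceeds a threshold, then has a closed-form structure.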

Code

The MATLAB code for producing the results in the paper can be found here:

https://github.com/hpaulkeeler/DetPoisson_MATLAB

I also re-wrote (or translated) the MATLAB code into Python:

https://github.com/hpaulkeeler/DetPoisson_Python

Further reading

My collaborators and I later extended this line of work in a separate paper, where we looked at the problem of designing a smart(er) medium access control (MAC) scheduling algorithm:

  • Błaszczyszyn, Brochard, and Keeler, Coverage probability in wireless networks with determinantal scheduling.

https://arxiv.org/abs/2006.05038

I wrote about this work briefly in a later post.

Parallel (or orthogonal, perhaps) to our work is this paper:

  • 2019 – Saha and Dhillon – Machine Learning meets Stochastic Geometry: Determinantal Subset Selection for Wireless Networks.

https://arxiv.org/abs/1905.00504

Update: A manuscript extending the above line of research of Saha and Dhillon was recently uploaded:

  • 2025 – Tu, Saha and Dhillon – Determinantal Learning for Subset Selection in Wireless Networks.

https://arxiv.org/abs/2503.03151