Giacomo Zanella and colleagues find a way to compare how certain algorithms perform

Comparing complex algorithms is a challenging but necessary task: algorithms must run quickly and use minimal resources, and they must handle ever larger data sets and problem sizes without a significant loss of performance. In their recent paper "Optimal design of the Barker proposal and other locally balanced Metropolis–Hastings algorithms", Giacomo Zanella of Bocconi's Department of Decision Sciences, Jure Vogrinc of the University of Warwick and Samuel Livingstone of University College London develop a method for predicting the performance of several algorithms within a class.

Designing an algorithm means devising a method, or mathematical procedure, for solving a problem. There are also mathematical methods for predicting under what conditions an algorithm performs well, that is, when it can produce its result in a reasonably short time. This is the approach used for stochastic algorithms, which involve an element of randomness and can therefore produce different outcomes even with identical inputs. Such algorithms work well not just in two dimensions, like a Cartesian graph, but in very large (and even theoretically unlimited) numbers of dimensions. This is not abstract theory: algorithms used in machine learning and other advanced statistical applications routinely work with several thousand dimensions.
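
As a purely illustrative example (not taken from the paper), the following minimal Python sketch shows one such stochastic algorithm, a random-walk Metropolis–Hastings sampler, running on a target distribution with a thousand dimensions. The target density, step size and run length are arbitrary choices made for the sketch.

```python
import numpy as np

def random_walk_metropolis(log_target, x0, step_size, n_steps, rng):
    """Minimal random-walk Metropolis-Hastings sampler in d dimensions."""
    x = np.asarray(x0, dtype=float)
    log_p = log_target(x)
    samples = [x.copy()]
    for _ in range(n_steps):
        # Propose a random perturbation of the current state
        proposal = x + step_size * rng.standard_normal(x.shape)
        log_p_prop = log_target(proposal)
        # Accept or reject at random: two runs with identical inputs
        # but different random seeds give different outputs
        if np.log(rng.uniform()) < log_p_prop - log_p:
            x, log_p = proposal, log_p_prop
        samples.append(x.copy())
    return np.array(samples)

# Example: a 1000-dimensional standard Gaussian target
d = 1000
log_target = lambda x: -0.5 * np.sum(x**2)
rng = np.random.default_rng(seed=1)
chain = random_walk_metropolis(log_target, np.zeros(d), step_size=2.38 / np.sqrt(d),
                               n_steps=500, rng=rng)
print(chain.shape)  # (501, 1000)
```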

"It can be said that this is a case of mathematics helping algorithmic design," says Giacomo Zanella. The aim of this paper is to study a specific class of algorithms (technically known as first-order locally balanced Metropolis–Hastings algorithms) to determine their efficiency, that is to what extent each one of them responds to what insiders call the "curse of dimensionality". This term is used to address the fact that, in the worst case, the amount of data required to keep the same reliability grows exponentially as the number of dimensions increases. Therefore, algorithms are bound to become less efficient as dimensions grow, and it is important to know which ones are least vulnerable given certain conditions. This is what the "optimal design" in the paper's title refers to. The outcome is what the authors define as "an explicit expression for the asymptotic efficiency of an arbitrary algorithm in the class."

"It is common for practitioners to dedicate a lot of effort into making careful algorithm design choices and adjusting algorithmic tuning parameters to ensure that performance is adequate for a given problem. Failure to do this can be catastrophic. Examples in which a well-designed algorithm performs adequately but a less carefully-chosen alternative does not are ubiquitous," Zanella explains.

Jure Vogrinc, Samuel Livingstone, Giacomo Zanella, "Optimal design of the Barker proposal and other locally balanced Metropolis–Hastings algorithms", Biometrika, Volume 110, Issue 3, September 2023, DOI https://doi.org/10.1093/biomet/asac056