Optimization and Inference
Models can be optimized using pmrf.optimize.minimize, sampled for statistical inference using pmrf.infer.sample, or fit to measured data using the high-level pmrf.fit() function.
Lower-level routines accept an Evaluator directly to define the objective or log-likelihood function; for design tasks, the built-in Goal evaluator covers most common cases. When fitting data, datasets can be passed directly as a skrf.Network or pmrf.NetworkCollection, target features are specified as simple strings (e.g., 's11'), and specific loss or likelihood objects, such as pmrf.losses.RMSELoss or pmrf.likelihoods.GaussianLikelihood, can be applied.
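As an illustrative sketch only (the keyword names and the file name below are assumptions, not the exact ParamRF signature; consult the pmrf.fit() documentation for the real API), a data fit might look like:

```python
import skrf
import pmrf

# Measured data loaded as a scikit-rf Network (file name is hypothetical).
measured = skrf.Network("filter_measured.s2p")

# Fit a previously constructed pmrf model to the measured s11 trace
# using an RMSE loss. Keyword names here are illustrative assumptions.
result = pmrf.fit(
    model,              # a pmrf model defined earlier
    data=measured,
    target="s11",
    loss=pmrf.losses.RMSELoss(),
)
```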
Solvers
All of the above methods take a “solver” argument. ParamRF supports optimization through either scipy.optimize.minimize() or optimistix.minimise(), and Bayesian inference through inferix (which provides wrappers for PolyChord and, experimentally, for BlackJAX).
Scipy: Provides a wrapper around gradient-based and gradient-free optimization algorithms from scipy.optimize in pmrf.optimize.ScipyMinimizer. This includes algorithms such as SLSQP, Nelder-Mead, and L-BFGS. These algorithms are CPU-native and cannot run on the GPU.

Optimistix: Provides JAX-native optimization algorithms, such as optimistix.BFGS and optimistix.NelderMead. These algorithms run their loop directly in JAX, and can therefore be compiled to any architecture (CPU, GPU, TPU).

Inferix: Enables Bayesian inference through nested sampling and MCMC sampling, using e.g. inferix.PolyChord and inferix.NUTS. This approach provides maximum-likelihood parameters, as well as full posterior probability distributions and the Bayesian evidence for model comparison. We recommend this source for a brief introduction to nested sampling and Bayesian inference.
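For reference, the Scipy solver wraps scipy.optimize.minimize(); a minimal standalone call (plain SciPy on a toy objective, not the pmrf wrapper) looks like:

```python
import numpy as np
from scipy.optimize import minimize

# Toy objective: a smooth bowl with its minimum at (1, 2).
def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

# Nelder-Mead is gradient-free; methods such as L-BFGS-B or SLSQP
# use gradients (estimated by finite differences if none are supplied).
res = minimize(objective, x0=np.zeros(2), method="Nelder-Mead")
print(res.x)  # close to [1.0, 2.0]
```

The same objective could be handed to a gradient-based method simply by changing the method argument, which is the kind of switch the solver abstraction is meant to make cheap.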