Core Classes

All of the classes introduced below can be found in pmrf.core, but are exported at the root. These include pmrf.Model, pmrf.Frequency, pmrf.Evaluator, and other helper classes for optimization and inference.

The Model

Model is the base class for all RF models. All built-in models and components, such as Resistor, PhaseLine, and CoaxialLine, inherit from this class.

The Model class itself is a parax.Module, and therefore an equinox.Module, a JAX PyTree and a Python dataclass. If these concepts are completely foreign to you, do not worry. The practical consequences of this are:

  • Models are immutable, and represent pure functions with attached data/parameters. You cannot update a model’s parameters by directly modifying its attributes.

  • To edit models, Parax provides several convenience methods, which all ParamRF models inherit. These include freezing certain parameters with pmrf.with_fixed_params(), inspecting parameters with pmrf.Model.named_params(), and manipulating parameters with pmrf.Model.with_params(). See the Parax documentation for more details.

  • All models are (by default) “JAX compatible”. This allows for just-in-time compilation and computation on platforms such as GPUs, TPUs etc., as well as other advanced JAX/Equinox features (e.g. vectorization via jax.vmap() and differentiation via jax.jacfwd()). See the JAX and Batched Models section for more details.
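
As a sketch of what JAX compatibility buys, the following uses plain jax.numpy, independent of any pmrf class. The s11 function and its parameters are invented for illustration: a pure function of frequency and a parameter, standing in for a model's S-parameter computation, which is then compiled with jax.jit() and batched over parameter values with jax.vmap():

```python
import jax
import jax.numpy as jnp

# Toy pure function of frequency and resistance (illustrative only, not
# the pmrf API): S11 of a series resistance R in a Z0 = 50 ohm system.
def s11(f, r):
    z0 = 50.0
    return r / (r + 2.0 * z0) * jnp.ones_like(f)

f = jnp.linspace(1e9, 10e9, 5)

# Just-in-time compile the evaluation.
s11_jit = jax.jit(s11)

# Vectorize over a batch of resistance values in a single call.
rs = jnp.array([25.0, 50.0, 100.0])
batched = jax.vmap(s11_jit, in_axes=(None, 0))(f, rs)
print(batched.shape)  # (3, 5)
```

Because the function is pure, the same code runs unchanged on CPU, GPU, or TPU, and could equally be differentiated with jax.jacfwd().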

To define custom models, you can inherit directly from Model. Subclasses can override methods such as s(), a(), z() and y() to define the model's S-parameters, ABCD-parameters, etc. as a function of frequency. This is an important difference from other libraries (e.g. scikit-rf): a model does not store its frequency, but instead accepts it as a functional input. Any network properties that have not been manually overridden are automatically made available via RF conversion functions, which can be found under pmrf.rf.
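
As a rough illustration of this functional style, here is a plain Python dataclass (not the real pmrf.Model base class) with a made-up SeriesResistor model: it stores only its parameters and computes S-parameters from whatever frequency it is handed:

```python
import dataclasses
import jax.numpy as jnp

# Schematic sketch of the pattern (hypothetical, not pmrf's actual API):
# the model holds parameters, and frequency is a functional input to s().
@dataclasses.dataclass(frozen=True)
class SeriesResistor:
    R: float  # resistance in ohms

    def s(self, f):
        # S11 of a series impedance Z in a Z0 = 50 ohm system: Z / (Z + 2*Z0).
        z0 = 50.0
        gamma = self.R / (self.R + 2.0 * z0)
        return gamma * jnp.ones_like(f)

model = SeriesResistor(R=100.0)
out = model.s(jnp.linspace(1e9, 2e9, 4))  # frequency passed in, never stored
print(out.shape)  # (4,)
```

Because frequency is an argument rather than state, the same model instance can be evaluated over any axis without modification.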

For more complex models, the __call__() method can also be overridden. Unlike the previous approach, __call__() accepts no arguments; instead, it must return a fully constructed Model instance. This is very useful for declarative, hierarchical model building. For a deeper look at building and defining custom models, see the Model Building section.
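
A minimal sketch of this pattern, with hypothetical class names rather than pmrf's actual models:

```python
import dataclasses

# Toy stand-ins for built-in models (invented for illustration).
@dataclasses.dataclass(frozen=True)
class Attenuator:
    loss_db: float

@dataclasses.dataclass(frozen=True)
class Cascade:
    stages: tuple

# A "builder" model: __call__() takes no arguments and returns a fully
# constructed model, assembled from this model's own parameters.
@dataclasses.dataclass(frozen=True)
class TwoStagePad:
    pad_db: float

    def __call__(self):
        return Cascade(stages=(Attenuator(self.pad_db), Attenuator(self.pad_db)))

built = TwoStagePad(pad_db=3.0)()
print(built)
```

The hierarchy is declared once, and the concrete network is constructed on demand from the current parameter values.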

Frequency, Parameter and jax.Array

The Frequency class defines the axis over which models are evaluated. Ultimately, it is a lightweight wrapper around a JAX array (jax.Array, also exposed as jnp.ndarray). As mentioned, those unfamiliar with JAX can consult either the JAX and Batched Models section of this documentation or JAX's own quickstart guide for a more thorough overview. For those seeking a TLDR: the API is very similar to numpy's np.ndarray, with a few "rough edges".
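
The main "rough edge" newcomers tend to hit is immutability: JAX arrays cannot be updated in place, so item assignment becomes a functional .at[...] update:

```python
import jax.numpy as jnp
import numpy as np

# The jax.numpy API closely mirrors numpy...
a = jnp.arange(4.0)
b = np.arange(4.0)
print(jnp.sum(a), np.sum(b))  # same results

# ...but JAX arrays are immutable: in-place item assignment is replaced
# by the functional .at[...] update syntax.
# a[0] = 10.0           # would raise a TypeError
a2 = a.at[0].set(10.0)  # returns a new array; `a` is untouched
print(a2)  # [10.  1.  2.  3.]
```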

Since ParamRF builds on top of the Parax library, parameters are created using the Parameter class. Like Frequency, parax.Parameter also wraps a JAX array, storing its value alongside additional metadata. This enables parameter bounds and scaling, marking parameters as fixed, and associating a probability distribution with a parameter, e.g. for Bayesian inference. Unlike Frequency, however, parameters are eagerly cast to jnp.ndarray. Ultimately, this means you can conveniently treat parameters as if they were regular arrays in your equations.
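
A toy sketch of the idea, assuming nothing about parax's actual implementation: a parameter is a value bundled with metadata, and in equations only the array value matters (parax performs the cast to jnp.ndarray automatically; here it is written out explicitly):

```python
import dataclasses
import jax.numpy as jnp

# Toy stand-in for parax.Parameter (illustrative only, not the real class).
@dataclasses.dataclass(frozen=True)
class ToyParameter:
    value: float
    bounds: tuple = (0.0, float("inf"))
    fixed: bool = False

R = ToyParameter(value=50.0, bounds=(0.0, 1e3))

# In an equation, the parameter is used as if it were a regular array.
z_in = jnp.asarray(R.value) * 2.0
print(z_in)  # 100.0
```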

The Evaluator and other classes

Evaluator is a lower-level tool that can be used to "evaluate" a model over frequency in a composable manner. Its output is a jnp.ndarray (tensor or scalar). Evaluators can be used to extract model features, encapsulate loss/error/likelihood functions, and more. For example, to create a goal-oriented objective function for an optimization, the Goal evaluator can be used. Note that evaluators derive from Parax Operators, meaning they can be added, subtracted, multiplied, negated, etc. For example, to use a statistical likelihood function for optimization purposes, simply negate it and pass it as if it were a loss function.
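
A toy sketch of this operator algebra (the ToyEvaluator class below is invented for illustration; it is not parax's Operator): callables that compose under +, - and negation, so a log-likelihood can be negated into a loss:

```python
import dataclasses
from typing import Callable

# Minimal callable wrapper supporting operator composition
# (hypothetical; not the real parax Operator machinery).
@dataclasses.dataclass(frozen=True)
class ToyEvaluator:
    fn: Callable

    def __call__(self, x):
        return self.fn(x)

    def __add__(self, other):
        return ToyEvaluator(lambda x: self(x) + other(x))

    def __neg__(self):
        return ToyEvaluator(lambda x: -self(x))

log_like = ToyEvaluator(lambda x: -(x - 3.0) ** 2)  # peaked at x = 3
loss = -log_like                                    # negate into a loss
print(loss(5.0))  # 4.0
```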

Evaluators are also created automatically when fitting routines such as fit() or condition() are called. For example, specifying 's21_db' as a fitting feature creates a Feature evaluator, whilst specifying a loss or likelihood function creates a TargetLoss or MarginalLogLikelihood evaluator, respectively.

Other core classes include the Loss, Likelihood, DiscrepancyModel and NoiseModel classes. These help glue the rest of the library together, and enable more advanced features such as hyperparameter-based optimization and Gaussian process discrepancy modeling. See the Tutorials section for more information.