Low Rank Neural Operator#

class LowRankNeuralOperator(lifting_net, projecting_net, field_indices, coordinates_indices, n_kernel_layers, rank, inner_size=20, n_layers=2, func=<class 'torch.nn.modules.activation.Tanh'>, bias=True)[source]#

Bases: KernelNeuralOperator

Implementation of LowRank Neural Operator.

LowRank Neural Operator is a general architecture for learning operators. Unlike traditional machine learning methods, the LowRankNeuralOperator is designed to map entire functions to other functions. It can be trained with supervised or PINN-based learning strategies. The LowRankNeuralOperator performs the kernel integral operation through a low-rank approximation, see LowRankBlock.
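The low-rank idea above can be sketched in plain PyTorch. This is an illustrative assumption about the mechanism (not PINA's actual implementation): the kernel \(k(x, y)\) is approximated as \(\sum_r \psi_r(x)\,\varphi_r(y)\), so the integral transform \((Kv)(x) = \int k(x, y)\,v(y)\,dy\) reduces to a Monte Carlo sum over the mesh costing \(O(N \cdot \text{rank})\) instead of \(O(N^2)\):

```python
import torch

# Illustrative sizes: batch, mesh points, hidden (embedding) dim, rank.
B, N, hidden, rank = 4, 100, 16, 8

# Basis networks mapping 2D mesh coordinates to rank * hidden coefficients
# (single linear layers here for brevity; real basis nets would be MLPs).
psi = torch.nn.Linear(2, rank * hidden)   # evaluated at the output points x
phi = torch.nn.Linear(2, rank * hidden)   # evaluated at the input points y

coords = torch.rand(B, N, 2)              # mesh coordinates
v = torch.rand(B, N, hidden)              # hidden representation of the field

phi_eval = phi(coords).view(B, N, rank, hidden)
psi_eval = psi(coords).view(B, N, rank, hidden)

# <phi_r, v> approximated by averaging over the N mesh points.
coeff = torch.einsum("bnrh,bnh->br", phi_eval, v) / N
# (K v)(x) reconstructed point-wise from the rank coefficients.
out = torch.einsum("bnrh,br->bnh", psi_eval, coeff)
```

The quadratic pairwise kernel evaluation never materializes: only the per-point basis evaluations and a rank-sized coefficient vector are stored.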

See also

Original reference: Kovachki, N., Li, Z., Liu, B., Azizzadenesheli, K., Bhattacharya, K., Stuart, A., & Anandkumar, A. (2023). Neural operator: Learning maps between function spaces with applications to PDEs. Journal of Machine Learning Research, 24(89), 1-97.

Parameters:
  • lifting_net (torch.nn.Module) – The neural network for lifting the input. It must take as input the input field and the coordinates at which the input field is evaluated. The output dimension of the lifting net is used as the embedding dimension of the problem.

  • projecting_net (torch.nn.Module) – The neural network for projecting the output. Its input dimension must be the embedding dimension (the output dimension of lifting_net) plus the dimension of the coordinates.

  • field_indices (list[str]) – The labels of the fields in the input tensor.

  • coordinates_indices (list[str]) – The labels of the coordinates in the input tensor.

  • n_kernel_layers (int) – Number of hidden kernel layers. Default is 4.

  • rank (int) – The rank of the low-rank approximation performed in each kernel layer, see LowRankBlock.

  • inner_size (int) – Number of neurons in the hidden layer(s) for the basis function network. Default is 20.

  • n_layers (int) – Number of hidden layers for the basis function network. Default is 2.

  • func – The activation function to use for the basis function network. If a single torch.nn.Module is passed, it is used as the activation function after every layer except the last one. If a list of modules is passed, they are used as the activation functions at each layer, in order. Default is torch.nn.Tanh.

  • bias (bool) – If True, bias terms are included in the MLP for the basis function network. Default is True.
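The dimension contract between lifting_net and projecting_net can be checked with a small sketch; the names and sizes below are illustrative assumptions, not taken from the library source:

```python
import torch

# Hypothetical 2D problem: two coordinates and one scalar input field.
coordinates_indices = ["x", "y"]
field_indices = ["u"]
embedding_dim = 16

# lifting_net: (field + coordinates) -> embedding dimension
lifting_net = torch.nn.Linear(
    len(field_indices) + len(coordinates_indices), embedding_dim
)
# projecting_net: (embedding + coordinates) -> output field dimension
projecting_net = torch.nn.Linear(
    embedding_dim + len(coordinates_indices), 1
)

# A batch of B=4 meshes with N=100 points each.
coords = torch.rand(4, 100, len(coordinates_indices))
field = torch.rand(4, 100, len(field_indices))

hidden = lifting_net(torch.cat([field, coords], dim=-1))      # B x N x 16
output = projecting_net(torch.cat([hidden, coords], dim=-1))  # B x N x 1
```

Single linear layers stand in for the lifting and projecting networks here; any torch.nn.Module with matching input/output dimensions would fit the same contract.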

forward(x)[source]#

Forward computation for the LowRank Neural Operator. It first lifts the input through the lifting_net; then several LowRank Neural Operator blocks are applied; finally, the output is projected to the final dimensionality by the projecting_net.

Parameters:

x (torch.Tensor) – The input tensor. It expects a tensor of shape \(B \times N \times D\), where \(B\) is the batch size, \(N\) the number of points in the mesh, and \(D\) the dimension of the problem, i.e. len(coordinates_indices) + len(field_indices).

Returns:

The output tensor obtained from the LowRank Neural Operator.

Return type:

torch.Tensor