diff --git a/dev/.documenter-siteinfo.json b/dev/.documenter-siteinfo.json
index 0a30666..ed8eb39 100644
--- a/dev/.documenter-siteinfo.json
+++ b/dev/.documenter-siteinfo.json
@@ -1 +1 @@
-{"documenter":{"julia_version":"1.10.5","generation_timestamp":"2024-09-29T00:17:28","documenter_version":"1.7.0"}}
\ No newline at end of file
+{"documenter":{"julia_version":"1.10.5","generation_timestamp":"2024-09-29T02:16:16","documenter_version":"1.7.0"}}
\ No newline at end of file
diff --git a/dev/api/index.html b/dev/api/index.html
index 728cbb7..e637bc9 100644
--- a/dev/api/index.html
+++ b/dev/api/index.html
@@ -12,7 +12,7 @@
julia> y = rand(Float32, 2, 5);
julia> size(first(nomad((u, y), ps, st)))
-(8, 5)
NeuralOperators.DeepONet — Type
DeepONet(branch, trunk, additional)
Constructs a DeepONet from branch and trunk architectures. Make sure that both networks output arrays with the same first dimension.
Arguments
branch: Lux network to be used as branch net.
trunk: Lux network to be used as trunk net.
Keyword Arguments
additional: Lux network applied to the output of the DeepONet, to include additional operations such as embeddings; defaults to nothing
References
[1] Lu Lu, Pengzhan Jin, George Em Karniadakis, "DeepONet: Learning nonlinear operators for identifying differential equations based on the universal approximation theorem of operators", arXiv: https://arxiv.org/abs/1910.03193
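As a sketch of the constructor described above (the layer sizes, sensor count, and batch shapes here are illustrative assumptions, not taken from the docstring), a DeepONet can be assembled from two Lux Chains whose final layers share the same output width:

```julia
using Lux, NeuralOperators, Random

# Branch net embeds each sampled input function (64 sensor values -> 16);
# trunk net embeds each query coordinate (1 coordinate -> 16). The shared
# output width of 16 is what "same first dimension" refers to.
branch_net = Chain(Dense(64 => 32), Dense(32 => 32), Dense(32 => 16))
trunk_net = Chain(Dense(1 => 8), Dense(8 => 8), Dense(8 => 16))

deeponet = DeepONet(branch_net, trunk_net)
ps, st = Lux.setup(Random.default_rng(), deeponet)

u = rand(Float32, 64, 5)     # 5 input functions, each sampled at 64 sensors
y = rand(Float32, 1, 10, 5)  # 10 query locations per function
out, _ = deeponet((u, y), ps, st)
```

The branch and trunk embeddings are contracted over the shared dimension, so `out` holds one prediction per query location and sample.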
FourierNeuralOperator(
σ=gelu; chs::Dims{C}=(2, 64, 64, 64, 64, 64, 128, 1), modes::Dims{M}=(16,),
    permuted::Val{perm}=Val(false), kwargs...) where {C, M, perm}
Fourier neural operator is an operator learning model that uses Fourier kernels to perform spectral convolutions. It is a promising approach for surrogate methods, and can be regarded as a physics operator.
The model is comprised of a Dense layer to lift a (d + 1)-dimensional vector field to an n-dimensional vector field, an integral kernel operator consisting of four Fourier kernels, and two Dense layers to project the data back to the scalar field of interest.
Arguments
σ: Activation function for all layers in the model.
Keyword Arguments
chs: A Tuple or Vector of the 8 channel sizes.
modes: The modes to be preserved. A tuple of length d, where d is the dimension of data.
permuted: Whether the dim is permuted. If permuted = Val(false), the layer accepts data in the order of (ch, x_1, ... , x_d , batch). Otherwise the order is (x_1, ... , x_d, ch, batch).
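To make the defaults above concrete, here is a hedged sketch of calling the model on 1-D data; the grid length and batch size are arbitrary illustrative choices:

```julia
using Lux, NeuralOperators, Random

# Default channels: lift 2 input channels through 64-channel Fourier layers,
# widen to 128, then project down to 1 output channel; keep 16 Fourier modes.
fno = FourierNeuralOperator(gelu; chs=(2, 64, 64, 64, 64, 64, 128, 1), modes=(16,))
ps, st = Lux.setup(Random.default_rng(), fno)

# With the default permuted = Val(false) the layout is (ch, x_1, batch):
# 2 channels, 1024 grid points, batch of 5.
x = rand(Float32, 2, 1024, 5)
out, _ = fno(x, ps, st)
```

The spatial and batch dimensions pass through unchanged; only the channel dimension is mapped from the first to the last entry of `chs`.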
ch: A Pair of input and output channel sizes ch_in => ch_out, e.g. 64 => 64.
modes: The modes to be preserved. A tuple of length d, where d is the dimension of data.
::Type{TR}: The transform type used to perform the transformation.
Keyword Arguments
init_weight: Function used to initialize the parameters.
permuted: Whether the dim is permuted. If permuted = Val(false), the layer accepts data in the order of (ch, x_1, ... , x_d, batch). Otherwise the order is (x_1, ... , x_d, ch, batch).
ch: A Pair of input and output channel sizes ch_in => ch_out, e.g. 64 => 64.
modes: The modes to be preserved. A tuple of length d, where d is the dimension of data.
::Type{TR}: The transform type used to perform the transformation.
Keyword Arguments
σ: Activation function.
permuted: Whether the dim is permuted. If permuted = Val(false), the layer accepts data in the order of (ch, x_1, ... , x_d, batch). Otherwise the order is (x_1, ... , x_d, ch, batch).
All the keyword arguments are passed to the OperatorConv constructor.
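A hedged sketch of building a single spectral-convolution layer from the arguments listed above; `FourierTransform{ComplexF32}` is assumed here to be the transform type the package provides for the `::Type{TR}` argument:

```julia
using Lux, NeuralOperators, Random

# One operator convolution: 64 => 64 channels, keeping the 16 lowest
# Fourier modes along the single spatial dimension.
conv = OperatorConv(64 => 64, (16,), FourierTransform{ComplexF32})
ps, st = Lux.setup(Random.default_rng(), conv)

# Default permuted = Val(false): layout (ch, x_1, batch).
x = rand(Float32, 64, 128, 5)
out, _ = conv(x, ps, st)
```

Modes beyond the first 16 are truncated in frequency space and zero-padded back, so the spatial size of the output matches the input.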