Derivative of a Gaussian process


I believe the derivative of a Gaussian process (GP) is again a GP, so I would like to know whether there are closed-form expressions for the prediction equations of the derivative of a GP. In particular, I am using the squared exponential (also called Gaussian) covariance kernel and want to know how to make predictions about the derivative of the Gaussian process.


What do you mean by the derivative of the GP? Do you randomly generate a curve $x(t)$ from the GP and then take its derivative?
Placidia
@Placidia, no, I mean the calculus derivative $\frac{d x(t)}{dt}$, which I believe should be another Gaussian process.
Good question. However, I recall that Brownian motion is both a GP and not differentiable, so I am not sure a generic expression can exist. Of course $x(t)-x(t-h)$ must be Gaussian, so given the covariance function it should be possible to reason about probabilities for it for a given $h$.
conjectures
@conjectures, that is why I specifically said I have a GP whose kernel function is the squared exponential (since I know that one is infinitely differentiable), and I was really only looking for the derivative case in my example. But a good point nonetheless!

Answers:


The short answer: Yes, if your Gaussian Process (GP) is differentiable, its derivative is again a GP. It can be handled like any other GP and you can calculate predictive distributions.

But since a GP $G$ and its derivative $G'$ are closely related, you can infer properties of either one from the other.

  1. Existence of $G'$

A zero-mean GP with covariance function $K$ is differentiable (in mean square) if $K'(x_1,x_2) = \frac{\partial^2 K}{\partial x_1 \partial x_2}(x_1,x_2)$ exists. In that case the covariance function of $G'$ is equal to $K'$. If the process is not zero-mean, then the mean function needs to be differentiable as well. In that case the mean function of $G'$ is the derivative of the mean function of $G$.

(For more details check for example Appendix 10A of A. Papoulis "Probability, random variables and stochastic processes")

Since the squared exponential (Gaussian) kernel is differentiable to any order, this is no problem for you.
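
For concreteness, here is a short derivation (not part of the original answer, but following directly from differentiating $K$) of the relevant covariances for a one-dimensional squared exponential kernel with signal variance $\sigma^2$ and length scale $\ell$:

$$
\begin{aligned}
K(x_1,x_2) &= \sigma^2 \exp\!\left(-\frac{(x_1-x_2)^2}{2\ell^2}\right),\\
\operatorname{Cov}\bigl(G(x_1),G'(x_2)\bigr) = \frac{\partial K}{\partial x_2} &= \sigma^2\,\frac{x_1-x_2}{\ell^2}\,\exp\!\left(-\frac{(x_1-x_2)^2}{2\ell^2}\right),\\
\operatorname{Cov}\bigl(G'(x_1),G'(x_2)\bigr) = \frac{\partial^2 K}{\partial x_1\,\partial x_2} &= \sigma^2\left(\frac{1}{\ell^2}-\frac{(x_1-x_2)^2}{\ell^4}\right)\exp\!\left(-\frac{(x_1-x_2)^2}{2\ell^2}\right).
\end{aligned}
$$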

  2. Predictive distribution for $G'$

This is straightforward if you just want to condition on observations of $G'$: if you can calculate the respective derivatives, you know the mean and covariance function of $G'$, so you can do inference with it in the same way as with any other GP.

But you can also derive a predictive distribution for $G'$ based on observations of $G$. You do this by calculating the posterior of $G$ given your observations in the standard way and then applying point 1 to the covariance and mean function of the posterior process.

This works in the same manner the other way around, i.e. you condition on observations of $G'$ to infer a posterior of $G$. In that case the covariance function of $G$ is given by integrals of $K'$ and might be hard to calculate, but the logic is really the same.
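
To make the first of these cases concrete, here is a minimal numerical sketch (not from the original answer) of predicting $G'$ at test points from noisy observations of $G$, for a zero-mean GP with the squared exponential kernel and the cross-covariances given above. The hyperparameters (`sigma_f`, `ell`, `sigma_n`) and the toy data are purely illustrative assumptions:

```python
import numpy as np

def k(x1, x2, sigma_f=1.0, ell=1.0):
    """Squared exponential covariance Cov(G(x1), G(x2))."""
    r = x1[:, None] - x2[None, :]
    return sigma_f**2 * np.exp(-r**2 / (2 * ell**2))

def k_dx2(x1, x2, sigma_f=1.0, ell=1.0):
    """Cross-covariance Cov(G(x1), G'(x2)) = dK/dx2."""
    r = x1[:, None] - x2[None, :]
    return sigma_f**2 * (r / ell**2) * np.exp(-r**2 / (2 * ell**2))

def k_dx1_dx2(x1, x2, sigma_f=1.0, ell=1.0):
    """Derivative covariance Cov(G'(x1), G'(x2)) = d^2 K / dx1 dx2."""
    r = x1[:, None] - x2[None, :]
    return sigma_f**2 * (1 / ell**2 - r**2 / ell**4) * np.exp(-r**2 / (2 * ell**2))

# Toy training data: noisy observations of G (assumed for illustration).
X = np.linspace(0.0, 5.0, 20)
y = np.sin(X) + 0.05 * np.random.randn(20)

# Test points where we want the derivative G'.
Xs = np.linspace(0.0, 5.0, 100)

sigma_n = 0.05                                   # assumed noise level
Kyy = k(X, X) + sigma_n**2 * np.eye(len(X))      # Cov(y, y)
Kys = k_dx2(X, Xs)                               # Cov(G(X), G'(Xs))
Kss = k_dx1_dx2(Xs, Xs)                          # Cov(G'(Xs), G'(Xs))

# Standard GP conditioning, with the train/test cross-covariance replaced
# by the derivative versions above.
alpha = np.linalg.solve(Kyy, y)
mean_deriv = Kys.T @ alpha                             # posterior mean of G'(Xs)
cov_deriv = Kss - Kys.T @ np.linalg.solve(Kyy, Kys)    # posterior covariance
```

The only change compared to ordinary GP regression is that the train/test and test/test covariance blocks use the derivative kernels; the conditioning formulas themselves are unchanged.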

g g
I do not understand your question. There is an explicit formula for the covariance function and the mean function given above (and in Section 9.4 of Rasmussen/Williams). As this is all you need to know in order to use a GP, what else could you ask for?
g g
A process with this covariance is not differentiable. As stated in point 1 of the answer, the kernel function must be differentiable with respect to both arguments. The delta function is neither differentiable nor continuous, so $G'$ does not even exist.
g g
Is it possible that you are confusing the mean function with the paths of the process? Note that the mean function is smoother than the paths and may be differentiable even though the process is not. But the mean function is a deterministic function, not a process, so there is no variance to calculate.
g g

It is. See Rasmussen and Williams, Section 9.4. Also, some authors argue strongly against the squared exponential kernel - it is too smooth.

Yair Daon
So is there a predictive distribution for the derivative?