28 thoughts on “Machine learning – Introduction to Gaussian processes”

  1. Thanks for the helpful lecture!
    The only thing I want to point out is that if you put labels on the axes of your plots, it would be easier for the viewer to understand from the beginning what you are describing.

  2. I tried to understand GPs via blog articles, papers, and a lot of videos. Best video ever on GPs! Thank you!

  3. https://youtu.be/4vGiHC35j9s?t=3344 — at this timestamp the prof wrote f* = k* K⁻¹ f, but according to the theorem it should be f* = k* K⁻¹ (x − f), where x is the vector of known x values.
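    For reference, the general conditional-Gaussian formula is f* = μ* + k* K⁻¹ (f − μ); under the lecture's zero-mean prior the mean terms vanish, which gives the f* = k* K⁻¹ f written on the board. A minimal NumPy sketch of that zero-mean case (the data and kernel length-scale here are illustrative, not taken from the lecture):

        import numpy as np

        def rbf(a, b, length=1.0):
            # squared-exponential kernel: k(a, b) = exp(-(a - b)^2 / (2 * length^2))
            return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * length ** 2))

        x_train = np.array([-2.0, 0.0, 1.5])  # illustrative observed inputs
        f_train = np.sin(x_train)             # illustrative observed values f
        x_star = np.array([0.5])              # test input

        K = rbf(x_train, x_train)        # covariance among training points
        k_star = rbf(x_star, x_train)    # covariance of test vs. training points

        # posterior mean under a zero-mean prior: f* = k* K^{-1} f
        f_star = k_star @ np.linalg.solve(K, f_train)
        print(f_star)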

  4. I do not understand the concept of the GP prior and the GP posterior. Could anyone help me? Thank you in advance!

  5. This lecture is so amazing! The hand-drawing part is really helpful for building up intuition regarding GPs. This is a life-saving video for my finals. Many thanks!

  6. I think I got the essence of GPs, but what I cannot understand is why we take the mean to be 0 when clearly it is not 0. I mean, if we suppose that f* is distributed as a Gaussian with mean 0, the expected value of f* must be 0. Could anyone explain this fact to me?
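    A short way to see it, assuming the standard Gaussian conditioning identity: the mean of 0 describes the prior, before any data are observed; once we condition on the observed values f, the posterior mean of f* is generally non-zero.

        % zero-mean joint prior over observed values f and test value f*
        \begin{pmatrix} f \\ f_* \end{pmatrix}
          \sim \mathcal{N}\!\left( \mathbf{0},
          \begin{pmatrix} K & k_*^\top \\ k_* & k_{**} \end{pmatrix} \right)

        % conditioning on f: the posterior mean k_* K^{-1} f is non-zero
        % as soon as the observed data f are non-zero
        f_* \mid f \sim \mathcal{N}\!\left( k_* K^{-1} f,\;
          k_{**} - k_* K^{-1} k_*^\top \right)

    So E[f*] = 0 holds only before the data arrive; after conditioning, the posterior mean is pulled toward the observations.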

  7. If I run the example code I get an error stating that my K_ isn't a positive-definite matrix. What am I doing wrong?
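    Without seeing the code this is only a guess, but the usual cause is floating-point round-off (or exactly duplicated x points) making K numerically non-positive-definite; a common workaround is to add a small "jitter" to the diagonal before factorizing. A sketch, where safe_cholesky is a hypothetical helper rather than part of the lecture's example code:

        import numpy as np

        def safe_cholesky(K, jitter=1e-10, max_tries=8):
            # hypothetical helper: retry the factorization with a growing
            # diagonal "jitter" until K + jitter*I is positive-definite
            for _ in range(max_tries):
                try:
                    return np.linalg.cholesky(K + jitter * np.eye(K.shape[0]))
                except np.linalg.LinAlgError:
                    jitter *= 10.0
            raise np.linalg.LinAlgError("K not positive-definite even with jitter")

    If the jitter has to grow large before the factorization succeeds, check for repeated inputs in your training set.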

  8. When estimating 'f', why is each point treated as a separate dimension rather than as different points in the same dimension?
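    One way to picture it, assuming the usual GP construction: the n function values f(x_1), …, f(x_n) are modeled jointly as a single draw from an n-dimensional Gaussian whose n×n covariance matrix comes from the kernel, so each input point contributes one dimension. A small illustrative sketch (RBF kernel and grid chosen arbitrarily):

        import numpy as np

        x = np.linspace(-5.0, 5.0, 50)  # 50 input locations
        K = np.exp(-(x[:, None] - x[None, :]) ** 2 / 2.0)  # 50x50 kernel covariance
        K += 1e-10 * np.eye(x.size)     # tiny jitter against round-off

        # one draw from a 50-dimensional Gaussian: a whole function at once,
        # with each input point supplying one dimension of the vector
        f = np.random.multivariate_normal(np.zeros(x.size), K)
        print(f.shape)  # (50,)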

  9. This is indeed an awesome lecture! I liked the way the complexity is slowly built up over the lecture. Thank you very much!

  10. Wow, you saved my life with this genius lecture! GPs are a pretty abstract idea, and it's nice that you walk one through them from scratch!
