
# Machine learning – Introduction to Gaussian processes

28 thoughts on “Machine learning – Introduction to Gaussian processes”


Copyright © 2019 Mlearn 2008 — Primer WordPress theme by GoDaddy

Thanks a lot, prof. Very clean and easy-to-understand explanation.

Thanks for the helpful lecture!

The only thing I want to point out is that if you put labels on the axes of your plots, it would be more helpful for the listener to understand from the beginning what you are describing.

I tried to understand GP via blog articles, papers, and a lot of videos. Best video ever on GP! Thank you!

Great lesson! Thank you!

https://youtu.be/4vGiHC35j9s?t=3344 – at this timestamp the prof wrote f* = (K*)(K)⁻¹f, but according to the theorem it should be f* = (K*)(K)⁻¹(x − f), where x is the vector of known x values.
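For reference (an editorial sketch, not part of the original comment): with a zero-mean GP prior, the standard noise-free posterior mean is the first identity below; a subtracted-mean form like the one the commenter describes appears only when the prior mean is nonzero. The symbols $\mu$ and $\mu_*$ are assumptions introduced here, denoting the prior mean evaluated at the training and test inputs.

```latex
% Zero-mean GP prior:
\bar{f}_* = K_*\, K^{-1} f
% Nonzero prior mean \mu (subtract it from the observations, add \mu_* back):
\bar{f}_* = \mu_* + K_*\, K^{-1} (f - \mu)
```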

You are the best

Should we calculate each f* separately, or can we use K* as one big matrix and do the calculation in one step?
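On the batching question: the posterior mean at all test points can indeed be computed at once by stacking the test inputs into one cross-covariance matrix, one row per test point. A minimal sketch (the RBF kernel, inputs, and values below are illustrative assumptions, not from the lecture):

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    """Squared-exponential kernel between the row vectors of A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2 / length_scale**2)

# Training inputs/outputs and many test inputs at once.
X = np.array([[-1.0], [0.0], [1.0]])
f = np.array([0.2, 0.8, 0.3])
X_star = np.linspace(-2, 2, 50)[:, None]

K = rbf_kernel(X, X)            # n x n training covariance
K_star = rbf_kernel(X_star, X)  # m x n cross-covariance (one row per test point)

# One linear solve serves every test point simultaneously.
f_star = K_star @ np.linalg.solve(K, f)  # shape (m,)
```

In practice one would factor K once (e.g. with a Cholesky decomposition) and reuse the factorization, rather than re-solving for each test batch.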

THANK YOU SO MUCH!!!

39:40 – Giant paren left open. Please close this, please close this!

Can someone explain why f is distributed with mean 0?

I do not understand the concept of the GP prior and GP posterior. Could anyone help me? Thank you in advance!

This lecture is so amazing! The hand-drawing part is really helpful to build up intuition regarding GP. This is a life-saving video for my finals. Many thanks!

UBC amazing

I think I got the essence of GP, but what I cannot understand is why we take the mean to be 0 when clearly it is not 0. I mean, if we suppose that f* is distributed as a Gaussian with mean 0, the expected value of f* must be 0. Could anyone explain this fact to me?

Hi, great lecture. Can someone explain how the mean is obtained? Are μ and μ* just assumed?
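On the zero-mean questions above: the mean-0 assumption applies to the *prior*, before any data are seen; the posterior mean is generally nonzero. A common practical convention is to center the observations, fit a zero-mean GP, and add the empirical mean back to the predictions. A minimal sketch (all data values below are made-up assumptions for illustration):

```python
import numpy as np

def rbf(A, B):
    """Squared-exponential kernel with unit length scale."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2)

# Hypothetical training data whose values are clearly not centered at 0.
X = np.array([[0.0], [1.0], [2.0]])
y = np.array([3.1, 3.9, 3.2])

# Center the observations, fit with a zero-mean GP prior,
# then add the empirical mean back to the posterior mean.
mu = y.mean()
X_star = np.array([[0.5], [1.5]])
K = rbf(X, X)
K_star = rbf(X_star, X)
f_star = mu + K_star @ np.linalg.solve(K, y - mu)
```

Even with the zero-mean prior, the posterior mean at the (noise-free) training points reproduces the observed y exactly, so the prior choice does not force predictions toward 0 where there is data.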

If I run the example code I get an error stating that my K_ isn't a positive-definite matrix. What am I doing wrong?
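On the positive-definiteness error above: this typically comes from floating-point round-off when training inputs are very close together, which makes the kernel matrix numerically singular. The standard fix is to add a small "jitter" to the diagonal before factorizing. A hedged sketch (the kernel and inputs are illustrative assumptions, not the example code from the lecture):

```python
import numpy as np

def rbf(A, B):
    """Squared-exponential kernel with unit length scale."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2)

# Nearly duplicate inputs make K numerically rank-deficient.
X = np.array([[0.0], [1e-9], [1.0]])
K = rbf(X, X)

# A tiny jitter on the diagonal restores positive definiteness,
# so the Cholesky factorization succeeds.
jitter = 1e-8
L = np.linalg.cholesky(K + jitter * np.eye(len(X)))
```

The same idea appears in GP libraries as a noise or regularization term (e.g. scikit-learn's `GaussianProcessRegressor` exposes it as the `alpha` parameter).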

This becomes very easy to understand with your thorough explanation. Thank you very much!

Wow! Great Lecture!

Awesome lecture, very well explained!

Very good lecture, full of intuitive examples which deepen the understanding. Thanks a lot!

When estimating 'f', why is each point treated as a separate dimension rather than as different points in the same dimension?

Guys, do you know which textbook the professor was talking about for this course?

Great Teacher! Thanks!

This is indeed an awesome lecture! I liked the way the complexity is slowly built up over the lecture. Thank you very much!

amazing intuition

Thank you for sharing this video, it was really helpful.

Great lecture

Wow, you saved my life with this genius lecture! I think GP is a pretty abstract idea, and it's nice that you can walk one through it from scratch!