# Machine learning – Introduction to Gaussian processes

## 28 thoughts on “Machine learning – Introduction to Gaussian processes”

1. sak02010 says:

Thanks a lot, prof. Very clean and easy-to-understand explanation.

Thanks for the helpful lecture!
The only thing I want to point out is that if you put labels on the axes of your plots, it would be more helpful for the listener to understand from the beginning what you describe

3. francesco canonaco says:

I tried to understand GP via blog article, paper and a lot of videos. Best video ever on GP! Thank you !

4. Simone Iovane says:

Great lesson! Thank you!

5. Prakyath Kantharaju says:

https://youtu.be/4vGiHC35j9s?t=3344 — at this timestamp the prof wrote f* = (k*) * (K)^-1 * f, but according to the theorem it should be f* = (k*) * (K)^-1 * (x – f), where x is the vector of known x values.
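For what it's worth, for a zero-mean GP the standard conditioning result gives the posterior mean as f* = k*ᵀ K⁻¹ f directly; a subtraction of the prior mean only appears when the mean is nonzero (f* = μ* + k*ᵀ K⁻¹ (f − μ)). A quick numerical sketch in Python — the kernel, inputs, and names here are made up for illustration, not taken from the lecture:

```python
import numpy as np

def rbf(a, b, ell=1.0):
    # Squared-exponential (RBF) kernel between two 1-D input arrays
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

x = np.array([-1.0, 0.0, 1.0])     # training inputs
f = np.sin(x)                      # noise-free training targets
xs = np.array([0.5])               # test input

K = rbf(x, x) + 1e-10 * np.eye(len(x))  # tiny jitter for numerical stability
k_star = rbf(xs, x)                     # cross-covariance, shape (1, 3)

# Zero-mean GP posterior mean: f* = k* K^{-1} f
f_star = k_star @ np.linalg.solve(K, f)
```

With a zero prior mean there is nothing to subtract from f, which is consistent with what was written in the lecture.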

6. Buoy Rina says:

You are the best

7. Junfan Huang says:

Should we calculate each f* separately, or can we use K* as a big matrix and do the calculation all at once?

8. Junfan Huang says:

THANK YOU SO MUCH!!!

9. Benjamin Crouzier says:

Can someone explain why f is distributed with mean 0?

11. Pedro Maroto says:

I do not understand the concept of the GP prior and GP posterior. Could anyone help me? Thank you in advance!
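On the prior/posterior question: the GP prior is the distribution over functions before seeing any data (samples wiggle freely around mean 0), and the GP posterior is the same distribution after conditioning on observed points (samples are pinned down near the data). A minimal sketch with NumPy, assuming an RBF kernel and noise-free observations (all names are illustrative):

```python
import numpy as np

def rbf(a, b, ell=0.5):
    # Squared-exponential (RBF) kernel
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

rng = np.random.default_rng(0)
xs = np.linspace(-3, 3, 50)          # grid of test inputs

# --- GP prior: mean 0, covariance K(xs, xs) ---
K_ss = rbf(xs, xs) + 1e-8 * np.eye(len(xs))
prior_samples = rng.multivariate_normal(np.zeros(len(xs)), K_ss, size=3)

# --- GP posterior: condition on observations (x, f) ---
x = np.array([-2.0, 0.0, 1.5])
f = np.sin(x)
K = rbf(x, x) + 1e-8 * np.eye(len(x))
K_s = rbf(xs, x)                      # cross-covariance, shape (50, 3)

mu_post = K_s @ np.linalg.solve(K, f)                       # posterior mean
cov_post = K_ss - K_s @ np.linalg.solve(K, K_s.T)           # posterior covariance
cov_post += 1e-8 * np.eye(len(xs))                          # jitter for sampling
post_samples = rng.multivariate_normal(mu_post, cov_post, size=3)
```

Plotting prior_samples and post_samples side by side makes the difference visual: posterior samples pass (nearly) through the observed points, while prior samples ignore them.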

12. Sijin Heung says:

This lecture is so amazing! The hand-drawing part is really helpful to build up intuition regarding GP. This is a life-saving video for my finals. Many thanks!

13. Ming Lee says:

UBC amazing

14. Pedro de la Torre Luque says:

I think I got the essence of GP, but what I cannot understand is why we take the mean to be 0 when clearly it is not 0. I mean, if we suppose that f* is distributed as a Gaussian with mean 0, the expected value of f* must be 0. Could anyone explain this to me?
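On the mean-zero question (also asked above): setting the prior mean to 0 only says that, before seeing any data, functions are equally likely to go up or down at each point. Once we condition on the observed values f, the posterior mean is generally nonzero. In the standard notation (a sketch of the usual noise-free GP regression identities):

```latex
\begin{aligned}
\text{prior:}\quad & f_* \sim \mathcal{N}\big(0,\; k(x_*, x_*)\big) \\
\text{posterior:}\quad & f_* \mid f \sim \mathcal{N}\big(k_*^\top K^{-1} f,\;\; k(x_*, x_*) - k_*^\top K^{-1} k_*\big)
\end{aligned}
```

So E[f* | f] = k*ᵀ K⁻¹ f, which is nonzero whenever the observed f is. In practice one often centers the data (subtracts the empirical mean) so that a zero prior mean is a reasonable modeling choice.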

15. Sidharth Sunil says:

Hi, great lecture. Can someone explain how the mean is obtained? Are u and u* just assumed?

16. Kristofer Älvring says:

If I run the example code I get an error stating that my K_ isn't a positive-definite matrix. What am I doing wrong?
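On the positive-definite error above: this usually comes from floating-point round-off making the kernel matrix numerically singular (for example when two inputs are very close together), not from a bug in the math. The common fix is to add a small "jitter" to the diagonal before factorizing. A sketch, assuming an RBF kernel (the variable names here are illustrative, not from the lecture's example code):

```python
import numpy as np

def rbf(a, b, ell=1.0):
    # Squared-exponential (RBF) kernel
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

x = np.array([0.0, 1e-9, 1.0])       # two nearly identical inputs
K = rbf(x, x)

# Without jitter, Cholesky can fail here: the first two rows of K are
# identical to machine precision, so K is numerically singular.
K_jittered = K + 1e-6 * np.eye(len(x))
L = np.linalg.cholesky(K_jittered)   # succeeds on the stabilized matrix
```

If your model includes observation noise, adding the noise variance σ²·I to K plays the same stabilizing role as the jitter term.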

17. Xingtong Liu says:

This becomes very easy to understand with your thorough explanation. Thank you very much!

18. Dhruv Samant says:

Wow! Great Lecture!

19. Roberto Silveira says:

Awesome lecture, very well explained!

20. MB says:

Very good lecture, full of intuitive examples which deepen the understanding. Thanks a lot

21. deep hazarika says:

When estimating 'f', why is each point treated as a separate dimension, and not as different points in the same dimension?
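On the "each point is a dimension" question: a GP treats the vector of function values (f(x₁), …, f(xₙ)) as a single draw from an n-dimensional Gaussian, so each input point contributes one coordinate, and the kernel supplies the covariance between coordinates. A tiny illustration (kernel and inputs are made up):

```python
import numpy as np

def rbf(a, b, ell=1.0):
    # Squared-exponential (RBF) kernel
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

x = np.array([0.0, 0.1, 3.0])        # three input points -> 3 dimensions
K = rbf(x, x)                        # 3x3 covariance between the coordinates

# Nearby points (0.0 and 0.1) get strongly correlated coordinates;
# the distant point (3.0) is nearly independent of them.
rng = np.random.default_rng(1)
f = rng.multivariate_normal(np.zeros(3), K)  # one "function" draw: a 3-vector
```

Treating the points as separate dimensions is what lets the kernel encode smoothness: correlated coordinates force nearby function values to be similar.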

22. Stanislav Smirnov says:

Guys, do you know which textbook the professor was talking about for this course?

23. Liam Davey says:

Great Teacher! Thanks!

24. Sarnath K says:

This is indeed an Awesome lecture! I liked the way the complexity is slowly built over the lecture. Thank you very much!

amazing intuition

26. Gourv Ghoshal says:

Thank you for sharing this video, it was really helpful

27. Mausam Duggal says:

Great lecture

28. Dennis Doerrich says:

Wow, you saved my life with this genius lecture! GP is a pretty abstract idea, and it's nice that you walk one through it from scratch!