Extra reading
There are many good books on machine learning and neural networks. I have listed a few below that I think are particularly useful (although many go far beyond the scope of this module). As we go through the various topics, I will also point you to some useful resources that will help you to understand the concepts in more depth.
Pattern Recognition and Machine Learning by Christopher M. Bishop. This classic text is renowned for its rigorous mathematical and Bayesian approach to machine learning. It covers topics from probability theory to neural networks.
The Elements of Statistical Learning: Data Mining, Inference, and Prediction by Trevor Hastie, Robert Tibshirani and Jerome Friedman. This book is a cornerstone text that delves into the statistical and algorithmic aspects of machine learning.
An Introduction to Statistical Learning: with Applications in R by Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. A more accessible counterpart to The Elements of Statistical Learning, this book focuses on applying statistical learning methods using the R programming language. It is ideal for bridging the gap between theory and practical application and includes numerous examples and labs.
Machine Learning: A Probabilistic Perspective by Kevin P. Murphy. This is another comprehensive and mathematically rigorous book that treats machine learning from a probabilistic viewpoint.
I would also recommend that you consider how large language models (LLMs) such as ChatGPT and Gemini can be used to support your studies and to tailor resources to your needs. That is, however, a large topic that I will not be covering here.