Introduction to Machine Learning

Bibliographic Details
Main Authors: Srinivasan, Satish Mahadevan, Laplante, Phillip A.
Format: Book chapter
Language: English
Online Access: Full text
Description
Summary: In this chapter, we discuss basic machine learning techniques, including linear and logistic regression. We also address the issues of bias and variance that cause machine learning models to overfit. Overfitting generally occurs when a very complex statistical model fits the observed data closely because it has too many parameters relative to the number of observations. This outcome is risky: an incorrect model can fit the data perfectly simply because it is overly complex for the amount of data available. Consequently, when such a model is used to predict new observations, it fails to generalize and delivers poor predictive performance in the real world. To address overfitting, regularization techniques are used. These methods modify the performance function, normally chosen as the sum of squared regression errors on the training set. Both the Ridge and Lasso techniques are explored here to show how to avoid overfitting and to build models with low bias and variance.
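
As a minimal sketch of the penalized objectives described in the summary, the Python snippet below fits Ridge (L2 penalty) and Lasso (L1 penalty) regressions with scikit-learn on synthetic data; the estimators, alpha values, and dataset are illustrative assumptions, not examples taken from the chapter itself.

    from sklearn.datasets import make_regression
    from sklearn.linear_model import Ridge, Lasso
    from sklearn.model_selection import train_test_split

    # Synthetic over-parameterized setting: more features than samples,
    # which makes an unregularized least-squares fit prone to overfitting.
    X, y = make_regression(n_samples=50, n_features=100, noise=10.0, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Ridge adds an L2 penalty (alpha * sum of squared coefficients) to the
    # sum-of-squared-errors objective; Lasso adds an L1 penalty, which can
    # drive some coefficients exactly to zero.
    ridge = Ridge(alpha=1.0).fit(X_train, y_train)
    lasso = Lasso(alpha=0.1).fit(X_train, y_train)

    print("Ridge R^2 on held-out data:", ridge.score(X_test, y_test))
    print("Lasso R^2 on held-out data:", lasso.score(X_test, y_test))

The alpha parameter controls the strength of the penalty: larger values shrink the coefficients more aggressively, trading a small increase in bias for a reduction in variance.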
DOI:10.1201/9781003278177-4