Tag Collection

#modeling

Explore Books, Authors, and Common Highlights on Modeling

Showing 20 of 20 highlights

Regularization techniques help prevent overfitting.
Our brains are designed to create models of the world.

From A Thousand Brains by Jeff Hawkins

Overfitting occurs when a model learns the training data too well, including its noise.
Regularization is a technique used to prevent overfitting by adding a penalty on the size of coefficients.
Models are essential tools for decision-making.

From The Model Thinker by Scott E. Page

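The highlights above on regularization as a penalty on the size of coefficients can be made concrete with a minimal sketch. It assumes scikit-learn and purely synthetic data: ordinary least squares is free to chase noise across many irrelevant features, while Ridge's L2 penalty shrinks the coefficients and tends to generalize better here.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

# Synthetic data: only the first of 40 features matters; the rest invite the
# model to fit noise. All numbers here are purely illustrative.
rng = np.random.default_rng(0)
n_train, n_test, n_features = 50, 200, 40
X_train = rng.normal(size=(n_train, n_features))
X_test = rng.normal(size=(n_test, n_features))
y_train = X_train[:, 0] + 0.5 * rng.normal(size=n_train)
y_test = X_test[:, 0] + 0.5 * rng.normal(size=n_test)

plain = LinearRegression().fit(X_train, y_train)
# Ridge adds an L2 penalty on the size of the coefficients; alpha sets its strength.
ridge = Ridge(alpha=10.0).fit(X_train, y_train)

print("OLS   R^2 on unseen data:", round(plain.score(X_test, y_test), 3))
print("Ridge R^2 on unseen data:", round(ridge.score(X_test, y_test), 3))
print("coefficient norm, OLS vs Ridge:",
      round(float(np.linalg.norm(plain.coef_)), 2), "vs",
      round(float(np.linalg.norm(ridge.coef_)), 2))
```
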
The only way to understand the world is to build a model of it.
A good model is one that generalizes well to unseen data.
The key to deep learning is representation learning, where the model learns to represent the input data in a way that makes it easier to solve the task at hand.
Every thought and action is a reflection of the models we create.

From A Thousand Brains by Jeff Hawkins

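Two of the highlights above pair naturally: a good model is judged on data it never saw, and a network's hidden layer is a learned representation of the input. A minimal sketch, assuming scikit-learn and a synthetic dataset chosen only for illustration:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic classification data, split so that generalization is measured on
# examples the model never trained on.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print("accuracy on unseen data:", round(clf.score(X_test, y_test), 3))

# The hidden layer is a learned representation: project the raw inputs through
# the first weight matrix and the ReLU activation this network uses.
hidden = np.maximum(0, X_test @ clf.coefs_[0] + clf.intercepts_[0])
print("learned representation shape:", hidden.shape)  # (n_test_samples, 16)
```
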
Data is not enough; we need a model that tells us how the world works.

From The Book of Why by Judea Pearl

Overfitting occurs when a model learns the noise in the training data.
The most important part of building a machine learning model is to understand the problem you are trying to solve.
Feature selection is crucial in building effective machine learning models.
Every time we learn something new, we are building a model of the world.

From The Master Algorithm by Pedro Domingos

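The highlight above on feature selection can be sketched with scikit-learn's univariate SelectKBest; the dataset and the choice of k are assumptions made only for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

# Synthetic data with 20 features, only 5 of which are informative.
X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           random_state=0)

# Keep the 5 features with the strongest univariate relationship to the label.
selector = SelectKBest(score_func=f_classif, k=5)
X_selected = selector.fit_transform(X, y)

print("original shape:", X.shape)            # (300, 20)
print("after selection:", X_selected.shape)  # (300, 5)
print("chosen feature indices:", selector.get_support(indices=True))
```
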
The process of modeling can reveal hidden patterns.

From The Model Thinker by Scott E. Page

Collaborative modeling enhances our understanding of complex systems.

From The Model Thinker by Scott E. Page

Hyperparameter tuning can significantly affect the performance of your model.
Our models of the world should reflect causal relationships.

From The Book of Why by Judea Pearl

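The highlight above on hyperparameter tuning is commonly handled with a cross-validated grid search; a minimal sketch with scikit-learn, where the model and parameter grid are illustrative assumptions rather than recommendations:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Candidate values for two SVM hyperparameters; the grid is illustrative only.
param_grid = {"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]}

# 5-fold cross-validation scores every combination and keeps the best one.
search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)

print("best parameters:", search.best_params_)
print("best cross-validated accuracy:", round(search.best_score_, 3))
```
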
Models help us navigate complex systems.

From The Model Thinker by Scott E. Page

Overfitting occurs when a model learns the noise in the training data instead of the underlying distribution.
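
The distinction in this last highlight, fitting the noise rather than the underlying distribution, shows up directly when model capacity is varied. A minimal sketch, assuming scikit-learn and a noisy sample from a simple function: an unconstrained decision tree memorizes the training set (train R^2 of 1.0) yet typically scores worse on fresh data than a shallower tree.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

# Noisy samples from a simple underlying function (illustrative data only).
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.3 * rng.normal(size=200)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)

# max_depth=None lets the tree grow until it memorizes the training set, noise included.
for depth in (2, 5, None):
    tree = DecisionTreeRegressor(max_depth=depth, random_state=0).fit(X_train, y_train)
    print(f"max_depth={depth}: train R^2={tree.score(X_train, y_train):.2f}, "
          f"test R^2={tree.score(X_test, y_test):.2f}")
```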