unbalancedparentheses/learning_data_science

Open more actions menu

Folders and files

NameName
Last commit message
Last commit date

Latest commit

 

History

9 Commits
9 Commits
 
 
 
 
 
 
 
 

Repository files navigation

Learning Data Science

Data science etudes -- explorations of statistical concepts through code.

Conformal prediction is the standout recent development: it provides distribution-free prediction intervals with guaranteed coverage -- no assumptions about the data-generating process. Chatterjee's Xi coefficient and distance correlation address a fundamental limitation of Pearson correlation: they detect arbitrary nonlinear associations, which matters for feature selection, independence testing, and understanding complex data relationships. The information-theoretic ML foundations connect Shannon entropy to generalization bounds and PAC-Bayes theory -- this is the theoretical frontier of understanding why deep learning works.
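Split conformal prediction is the simplest way to see the "distribution-free, guaranteed coverage" claim in action: calibrate absolute residuals on held-out data, then use their finite-sample-corrected quantile as the interval half-width. A minimal sketch, assuming a toy least-squares model and synthetic data (none of this is from the notebooks):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 2x + noise
x = rng.uniform(0, 1, 1000)
y = 2 * x + rng.normal(0, 0.1, 1000)

# Split into a training half and a calibration half
x_tr, y_tr = x[:500], y[:500]
x_cal, y_cal = x[500:], y[500:]

# "Model": least-squares slope through the origin (any model works)
slope = np.dot(x_tr, y_tr) / np.dot(x_tr, x_tr)
predict = lambda x_: slope * x_

# Conformity scores = absolute residuals on the calibration set
scores = np.abs(y_cal - predict(x_cal))
alpha = 0.1
n = len(scores)
# Finite-sample-corrected quantile level: ceil((n+1)(1-alpha))/n
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Symmetric prediction interval with coverage >= 1 - alpha,
# regardless of the data-generating process (only exchangeability assumed)
x_new = 0.5
lo, hi = predict(x_new) - q, predict(x_new) + q
```

The coverage guarantee is marginal and needs only exchangeability of the calibration and test points, which is what makes the method assumption-free in the sense described above.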

About

This repository contains Jupyter notebooks that explore fundamental statistical concepts, with a focus on understanding how different measures capture relationships in data.

Notebooks

Pearson Correlation vs Mutual Information

pearson_correlation_coefficient_vs_mutual_information.ipynb - Compares how Pearson correlation and mutual information capture dependence in bivariate normal data, inspired by Nassim Taleb

This notebook explores the relationship between Pearson correlation and mutual information:

  • Pearson Correlation: Shows how the correlation coefficient (rho) scales non-linearly in informational terms -- the gap in dependence between rho=0.5 and rho=0.9 is much larger than between rho=0 and rho=0.5
  • Mutual Information: Demonstrates how mutual information scales with the actual information content shared by the variables, providing a more faithful measure of dependence
  • Comparison: Visualizes bivariate normal distributions across both metrics to illustrate their differences
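For a bivariate normal, mutual information has a closed form, I(X; Y) = -0.5 ln(1 - rho^2) nats, which makes the non-linear scaling easy to check numerically. A short sketch under the assumption that this standard Gaussian formula is what the notebook visualizes:

```python
import numpy as np

# Closed-form MI for a bivariate normal with correlation rho (in nats):
#   I(X; Y) = -0.5 * ln(1 - rho^2)
rhos = np.array([0.0, 0.5, 0.9, 0.99])
mi = -0.5 * np.log(1 - rhos**2)
for r, m in zip(rhos, mi):
    print(f"rho = {r:.2f}  ->  MI = {m:.3f} nats")
```

Running this shows the effect described above: moving from rho=0.5 to rho=0.9 adds far more mutual information than moving from rho=0 to rho=0.5, and MI diverges as rho approaches 1.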

The notebook is based on this tweet by Nassim Taleb.

Requirements

  • Python 3.7+
  • Jupyter
  • NumPy
  • Matplotlib

Running

  1. Install dependencies:

    pip install jupyter numpy matplotlib
  2. Start Jupyter:

    jupyter notebook
  3. Open pearson_correlation_coefficient_vs_mutual_information.ipynb and run the cells.

License

See LICENSE file for details.

Not yet reviewed

These resources were recently found and have not been reviewed yet.

Information Theory

Statistical Methods

Notable Notebook Collections

Correlation & Dependence

Books

Dependence Measures Beyond Pearson/MI (New)

Information-Theoretic Approaches (New)

Conformal Prediction (New)

Books with Open Notebooks (New)

Fat Tails Notebooks (New)

Julia & Python Tools (New)
