knitr::knit_exit()
Tech Reading
As of
Algorithms & ML
- Akalin https://compgenomr.github.io/book/ - very good, genomics
- Berkeley, excellent glossary:
- Berkeley CRASH ML https://mlberkeley.substack.com/p/part-1
- Bishop (2006) Ch1 nice intro: https://www.microsoft.com/en-us/research/uploads/prod/2006/01/Bishop-Pattern-Recognition-and-Machine-Learning-2006.pdf
- Breiman, 2 cultures: https://www.jstor.org/stable/2676681
- Chollet, Deep Learning with R (not online, no PDF); Python version: https://docs.google.com/viewer?a=v&pid=sites&srcid=dW10LmVkdS5wa3xzbmxwfGd4Ojc1ODc1ODY2OTZiOTUzOGQ
- Compeau:
  - great ideas book! http://compeau.cbd.cmu.edu/
  - Programming for Lovers (CMU): http://compeau.cbd.cmu.edu/programming-for-lovers/ - ch 8, 9 esp. collinear
- Boehmke Hands on ML | https://bradleyboehmke.github.io/HOML/ Ch 4,5
- Bohm Statistics for Physics https://www-library.desy.de/preparch/books/vstatmp_engl.pdf change of variables (SEE 3.4.1, density as volume)
- Chan, Stanley https://probability4datascience.com/index.html more math and uses likelihood
- Davidson http://qed.econ.queensu.ca/ETM/etm-info.html
- Deisenroth, Faisal et al | Math4ML, some intuition; linear trans, distributions https://mml-book.github.io/book/mml-book.pdf linear alg book, regression (Ch 9), nice intro to (\(\Omega,\mathcal{A}, P\))
- related videos https://www.youtube.com/playlist?list=PLkxomGYFWp67infnvPmEcqyQqk0q6ntrY
- Gagolewski Lightweight ML with R iPad, PDF
- Goodfellow et al: https://www.deeplearningbook.org/ ideas
- Higgins, “Practical R Info?” https://bookdown.org/pdr_higgins/rmrwr/ easy to read, range of topics (~cookbook)
- Kaplan 2023 https://mdsr-book.github.io/mdsr3e/ (tidyverse, parsnip, links to wikip for all stats)
- Kaplan 2017, 2E https://dtkaplan.github.io/SM2-bookdown/
- Kaplan 2021, 2E, Data https://dtkaplan.github.io/DataComputingEbook/
- Kaplan 2020, Calculus with R https://dtkaplan.github.io/RforCalculus/index.html mosaic and mosaicCalc packages (fitting to curves) ML, chapter 18, plus interesting other chapters
- Lindholm, “Machine Learning: A first course for engineers” PDF more math http://smlbook.org/ http://smlbook.org/book/sml-book-draft-latest.pdf
- Matloff (prob book) https://github.com/matloff/probstatbook/blob/master/ProbStatBookW21.pdf SEE 15.2 for logistic; var transform
- McElreath: https://github.com/rmcelreath/stat_rethinking_2022 (videos)
- Molnar https://christophm.github.io/interpretable-ml-book/ Ch 5.2 logistic regression (explanations!)
- Oehlert:
- Random Services https://www.randomservices.org/random/dist/Transformations.html
- Siegrist: https://stats.libretexts.org/Bookshelves/Probability_Theory/Probability_Mathematical_Statistics_and_Stochastic_Processes_(Siegrist)/
- Thomas, Math for ML … good lin alg, but quickly gets advanced. https://gwthomas.github.io/docs/math4ml.pdf
- UCLA: Mixed Models - intro: https://stats.oarc.ucla.edu/other/mult-pkg/introduction-to-linear-mixed-models/
- Vasishth: https://vasishth.github.io/bayescogsci/book/ch-reg.html#sec-logistic
- Venables http://web.mit.edu/r_v3.4.1/R-intro.pdf (SEE models chapter)
- Wiki: https://en.wikipedia.org/wiki/Generalized_linear_model#Binary_data, not easy but thorough
- SciLearn
- Scott, DS with R, Gentle Intro
- Kazdan | normal equations, linear alg https://www2.math.upenn.edu/~kazdan/312S14/
- Kazdan, many good references: https://www2.math.upenn.edu/~kazdan/
Intro to Linear Alg & Models
- Kuiper, Shonda: simple, clear: video: https://www.youtube.com/watch?v=jQkK0XMrAdM
- Race, Shaina, gentle intro to lin alg: https://shainarace.github.io/LinearAlgebra/index.html
- Thomas, Garrett, Math for ML, Berkeley https://gwthomas.github.io/docs/math4ml.pdf
- Bendixcarstensen.com, with R & matrix models (practical; try not to use api pkg) http://www.bendixcarstensen.com/APC/linalg-notes-BxC.pdf
- Rafael genomics - Chapter 4 matrix (his older stuff is better, but not best organization)
- Rafael, Book2: http://rafalab.dfci.harvard.edu/dsbook-part-2/ (2020?), esp Linear Models and ML chapters
- Gallier, advanced linear alg https://www.seas.upenn.edu/~cis5150/linalg-I.pdf
Logistic Regression (binary response)
\[ \Pr(Y=y) = \binom{n}{y}\theta^y(1-\theta)^{n-y} \] \[ \Pr(y=1)=\theta=\text{logit}^{-1}(\beta_0+\beta_1x_1+\beta_2x_2+\cdots+\beta_7x_7) \]
- Prerequisite: Ease going from quantile function to CDF, and back.
- Difference: the binomial variable y = 1 (an event) vs Pr(Y=1) (its probability)
- Difference: p(y=m | x), the conditional class probability, vs p(y | x), where m represents a ‘class’, given x
- Model y vs model log-odds (y)
- Reason for modeling mean
- Transformations of RV
SEE:
- https://en.wikipedia.org/wiki/Quantile_function
- Vasishth
- Interpretable ML, Chapter 5.2
- https://stats.stackexchange.com/questions/374452/family-of-glm-represents-the-distribution-of-the-response-variable-or-residuals/374461#374461
- https://www.theanalysisfactor.com/link-functions-and-errors-in-logistic-regression/
- https://www.randomservices.org/random/dist/Transformations.html
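The points above (quantile function ↔ CDF, modeling the log-odds rather than y) can be sketched in base R. This is a minimal illustration, not from any of the references; the linear-predictor value and the simulated coefficients are made up:

```r
# plogis() is the logistic CDF, i.e. logit^{-1}; qlogis() is its
# quantile function, i.e. the logit. Round-trip between them:
eta   <- 0.5              # a hypothetical linear predictor value
theta <- plogis(eta)      # Pr(y = 1) = logit^{-1}(eta)
qlogis(theta)             # back to eta: quantile <-> CDF

# Simulated binary response with made-up coefficients (-1, 2):
set.seed(1)
x <- rnorm(100)
y <- rbinom(100, size = 1, prob = plogis(-1 + 2 * x))
fit <- glm(y ~ x, family = binomial)  # models the log-odds of Pr(y = 1)
coef(fit)                 # estimates on the logit (log-odds) scale
```

`family = binomial` uses the logit link by default, so `coef(fit)` lives on the log-odds scale and `plogis()` maps fitted linear predictors back to probabilities.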
QUARTO & CSS | SCSS
- READ discussions: https://github.com/quarto-dev/quarto-cli