Suggested further readings
Generic tips on model fitting in neuroscience:
Wilson, R. C., & Collins, A. G. E. (2019). Ten simple rules for the computational modeling of behavioral data. eLife, 8, e49547. https://doi.org/10.7554/eLife.49547
Palminteri, S., Wyart, V., & Koechlin, E. (2017). The importance of falsification in computational cognitive modeling. Trends in Cognitive Sciences, 21(6), 425–433. https://doi.org/10.1016/j.tics.2017.03.011
On linear regression:
Section 3.1 of Christopher Bishop’s textbook Pattern Recognition and Machine Learning, which provides all the mathematical derivations in depth. Freely available at https://www.microsoft.com/en-us/research/people/cmbishop/
Chapter 22 (Log-likelihood maximization) of David MacKay’s very comprehensive textbook Information Theory, Inference, and Learning Algorithms. Freely available at https://www.inference.org.uk/itprnn/book.pdf
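A minimal sketch of the central point of these two readings (not code from the books themselves, and the synthetic data and parameter values are illustrative assumptions): for linear regression with Gaussian noise, maximizing the log-likelihood is equivalent to minimizing the mean squared error, and the estimate has a closed-form least-squares solution.

```python
# Sketch: maximum-likelihood linear regression under Gaussian noise = least squares.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=100)
y = 1.5 * x + 0.5 + rng.normal(scale=0.3, size=100)  # synthetic data: slope 1.5, intercept 0.5

X = np.column_stack([x, np.ones_like(x)])            # design matrix with an intercept column
theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)    # least-squares solution = ML estimate
print("estimated slope and intercept:", theta_hat)
```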
On model selection:
Chapter 28 (Model selection and Occam’s razor) of David MacKay’s very comprehensive textbook Information Theory, Inference, and Learning Algorithms. Freely available at https://www.inference.org.uk/itprnn/book.pdf
Arlot, S., & Celisse, A. (2009). A survey of cross-validation procedures for model selection. Statistics Surveys, 4, 40–79. https://doi.org/10.1214/09-SS054
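As a hedged illustration of the cross-validation idea surveyed above (not taken from either reading; the data, fold count, and polynomial models are assumptions for the example), one can compare models of different complexity by their average held-out error:

```python
# Sketch: k-fold cross-validation to choose among polynomial models of different degree.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 60)
y = 2 * x - x**3 + rng.normal(scale=0.2, size=x.size)  # synthetic data from a cubic trend

def cv_mse(degree, k=5):
    """Average held-out MSE of a degree-`degree` polynomial over k folds."""
    idx = rng.permutation(x.size)
    folds = np.array_split(idx, k)
    errs = []
    for test in folds:
        train = np.setdiff1d(idx, test)              # all indices not in the test fold
        coefs = np.polyfit(x[train], y[train], degree)
        pred = np.polyval(coefs, x[test])
        errs.append(np.mean((y[test] - pred) ** 2))
    return np.mean(errs)

for d in range(1, 8):
    print(f"degree {d}: CV MSE = {cv_mse(d):.4f}")   # the best degree minimizes held-out error
```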
On optimization methods (for LLH maximization or MSE minimization):
Boyd, S., & Vandenberghe, L. Convex Optimization. A textbook and a great resource for convex optimization in general, freely available at https://web.stanford.edu/~boyd/cvxbook/
Acerbi, L., & Ma, W. J. (2017). Practical Bayesian optimization for model fitting with Bayesian Adaptive Direct Search. NeurIPS 2017. Describes the BADS optimization algorithm for model fitting, with a MATLAB toolbox available at https://github.com/lacerbi/bads.
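A minimal sketch of the kind of problem these optimization references address (this is generic numerical minimization with SciPy, not the BADS algorithm; the Gaussian model and data are assumptions for the example): fit parameters by minimizing a negative log-likelihood.

```python
# Sketch: negative log-likelihood minimization with a generic optimizer.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(2)
data = rng.normal(loc=1.0, scale=2.0, size=500)      # synthetic data with known parameters

def neg_log_likelihood(params):
    mu, log_sigma = params                           # optimize log(sigma) so sigma stays positive
    return -np.sum(norm.logpdf(data, loc=mu, scale=np.exp(log_sigma)))

result = minimize(neg_log_likelihood, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])
print(f"fitted mean = {mu_hat:.2f}, fitted sd = {sigma_hat:.2f}")
```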
Research example developed in the outro:
Wei, K., & Körding, K. (2009). Relevance of error: What drives motor adaptation? Journal of Neurophysiology, 101(2), 655–664. https://doi.org/10.1152/jn.90545.2008