Now the Freakonomics guys have added their two cents: "The how of learning is deliberate practice."
This is true in sports, music, math, programming - everything. Technique matters; it's how you do it.
I've found that to be true in my new running venture. The techniques I learned by running barefoot made it possible for me to run the Manchester Road Race without pain or stiffness. I was able to run the next day without any discomfort, although I will confess that my legs were tired. The only after-effect that consistently follows a run is tiredness in my feet. They're finally waking up after years of slumbering in their shoe cocoons. I think I've succeeded in learning how to run injury-free in middle age.
I have to remember that more when I work on my programming skills. I need to identify objectives better and be more aggressive about conquering them and making them mine.
I've studied a fair amount of math during my education. Engineers were required to take four math classes: two semesters of calculus, differential and integral; multivariate calculus; ordinary differential equations. A fifth course in partial differential equations was recommended but optional. The engineering courses reinforced and built on this base. The prevailing wisdom was that the engineering department taught the same material as the math department, but better. I suspect we all preferred it when the engineering department presented the material because it came with a context that fixed the ideas in your head.
I took all those and kept going. I decided to sign up for complex variables and linear algebra, just because I liked math. There were also two grad classes that presented integral transforms, calculus of variations, differential geometry, and generalized tensors. The numerical methods that I studied followed the same track: linear algebra for solving large systems of equations and eigenvalues; numerical integration; evaluation of special functions.
I never took a formal course in statistics or probability. The last two graduate classes that I took were in the statistics department: analysis of variance and design of experiments. I was glad to have taken them, but they certainly didn't turn me into a statistician.
I recount all this because it's finally occurred to me that I've missed out on something important. When faced with quantum mechanics and the loss of determinism, Einstein said that God does not play dice. I can't claim to know the gaming habits of God, but I can say that probability and statistics imbue everything around us. They're stand-ins for ignorance, an expression of what we don't know or are uncertain about.
Classical physicists, like Newton, Laplace and Einstein, viewed the universe as a clockwork. Anything could be predicted, given enough information. This is ironic in light of the great service that Laplace rendered to Bayes' theorem by putting it on such a firm mathematical footing. Quantum mechanics killed this idea in the small; non-linearity did the deed in the large. It's all a roulette wheel. Does that mean the universe is really a big casino?
I became aware of two schools of thought in statistics: frequentists and Bayesians. I read hints about the food fight that has been going on between the camps for two centuries, but I didn't understand exactly what it was about - until I read "The Theory That Would Not Die" by Sharon Bertsch McGrayne. The writing style was a bit repetitive, but the story was wonderful.
There were two bits that I especially liked. The first was a quote from Jerry Cornfield to his two daughters as he lay dying: "You spend your whole life practicing your humor for the times when you really need it."
The second was from a section about Jimmie Savage and Dennis Lindley. As the amount of data increases, subjectivists move into agreement, the way scientists come to a consensus as evidence accumulates: "That's the way science is done."
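The convergence that Savage and Lindley describe is easy to see in a toy example. The sketch below is my own illustration, not anything from the book: two observers start with wildly different Beta priors on a coin's bias, and both update on the same simulated flips. The true bias and the prior parameters are arbitrary choices for the demonstration.

```python
import random

random.seed(1)

# A hypothetical coin; the true bias is an arbitrary choice for illustration.
TRUE_P = 0.7
flips = [1 if random.random() < TRUE_P else 0 for _ in range(1000)]

def posterior_mean(alpha, beta, data):
    """Beta(alpha, beta) prior updated on 0/1 Bernoulli data.

    The Beta prior is conjugate to the Bernoulli likelihood, so the
    posterior is Beta(alpha + heads, beta + tails) and its mean has
    this closed form.
    """
    heads = sum(data)
    tails = len(data) - heads
    return (alpha + heads) / (alpha + beta + heads + tails)

# Two subjectivists with very different opinions about the coin.
for n in (0, 10, 100, 1000):
    skeptic = posterior_mean(1, 9, flips[:n])   # prior mean 0.1
    believer = posterior_mean(9, 1, flips[:n])  # prior mean 0.9
    print(f"n={n:4d}  skeptic={skeptic:.3f}  believer={believer:.3f}")
```

With no data the two disagree by 0.8; after a thousand flips the data swamp both priors and their posterior means nearly coincide, which is the sense in which accumulating evidence forces consensus.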
I recently saw "Doing Bayesian Data Analysis: A Tutorial with R and BUGS" by John Kruschke on Amazon. I was intrigued. I knew Bayes and R; what was this BUGS thing about? But now I know, thanks to "The Theory That Would Not Die": it stands for Bayesian Inference Using Gibbs Sampling. There's even an open source project that implements it as a framework. I hope to check it out in the coming months.
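I haven't tried the BUGS framework yet, but the core idea of Gibbs sampling fits in a few lines. This sketch is my own toy example, not BUGS code: it draws from a bivariate standard normal by alternately sampling each coordinate from its conditional distribution given the other. The correlation value and sample counts are arbitrary choices; BUGS automates this same trick for far richer models.

```python
import math
import random

random.seed(0)

# Target: bivariate standard normal with correlation RHO.
# Each full conditional is itself normal, so we can sample it directly:
#   x | y ~ N(RHO * y, 1 - RHO^2)
#   y | x ~ N(RHO * x, 1 - RHO^2)
RHO = 0.8
COND_SD = math.sqrt(1 - RHO ** 2)

def gibbs(n_samples, burn_in=500):
    """Alternate conditional draws; discard the burn-in period."""
    x, y = 0.0, 0.0
    samples = []
    for i in range(n_samples + burn_in):
        x = random.gauss(RHO * y, COND_SD)
        y = random.gauss(RHO * x, COND_SD)
        if i >= burn_in:
            samples.append((x, y))
    return samples

samples = gibbs(20000)
mean_x = sum(s[0] for s in samples) / len(samples)
# E[x*y] approximates the correlation here, since the target has
# zero means and unit variances.
corr_xy = sum(s[0] * s[1] for s in samples) / len(samples)
print(f"mean(x) ~ {mean_x:.3f}, corr(x, y) ~ {corr_xy:.3f}")
```

The sampled mean should sit near 0 and the estimated correlation near 0.8; the point is that we never sampled the joint distribution directly, only the easy conditionals.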
I'm planning to add a few items to my must-read list. Probability will be on the list. So will Kruschke's book.