Cancellation Or Excommunication?
High in the Swiss Alps, a band of intrepid monks draws the line against a proposed visit by yours truly, thus avoiding the collapse of their monastery roof and possibly a divinely mandated avalanche!
The titular “cancellation or excommunication” part of this post is toward the end. For now, suffice it to say that the good monks of the alpine Benedictine monastery Kloster Disentis know what dissent is when they think they see it, and they hate, hate, hate it! ;-)
Some readers are no doubt familiar with Paul John Werbos, the social scientist, statistician, and machine learning pioneer who discovered backpropagation and published the discovery in his 1974 Harvard dissertation Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences. Although it may have slipped his memory, he and I have conversed on a couple of previous occasions in discussions by the Foundations of Mind Group.
Backpropagation, short for “backward propagation of errors” and central to what is now called “deep learning”, is a method for training neural networks. Working backward from the output layer, it applies the chain rule to compute the gradient of a loss function — which measures the discrepancy between predicted and actual output — with respect to every weight and bias in the network; an optimization algorithm such as gradient descent then uses those gradients to adjust the weights and biases, reducing the loss and improving the model’s accuracy.
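For readers who like to see the machinery, here is a minimal sketch of the idea in NumPy: a tiny one-hidden-layer network learning XOR, with the backward pass and gradient-descent update written out by hand. Everything here (the architecture, the learning rate, the variable names) is illustrative, not anything from Werbos’s dissertation.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: the classic toy problem a single-layer perceptron cannot solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 4 sigmoid units (sizes chosen arbitrarily)
W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
initial_loss = None
for step in range(5000):
    # Forward pass: compute the network's prediction
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Loss: mean squared discrepancy between predicted and actual output
    loss = np.mean((out - y) ** 2)
    if initial_loss is None:
        initial_loss = loss

    # Backward pass: propagate the error gradient layer by layer
    # (chain rule through the sigmoid: s'(z) = s(z) * (1 - s(z)))
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0)
    d_h = d_out @ W2.T * h * (1 - h)
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)

    # Gradient-descent update: the "optimization algorithm" step
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1
```

Note that backpropagation proper is only the backward pass — the efficient computation of all the gradients — while the final two lines are the separate optimizer that spends those gradients.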
In the past, Paul has expressed enthusiasm for stochastic processes and methods, the model-theoretic utility of which relates to our recent installment on randomness in the Modern Synthesis. But we’ll revisit that discussion later.