
Causality And Information
May 23, 2021
In Causality For Physics, I introduced Pearlian interventions for physical systems that evolve over time. In Physical Information, I defined what it means for a system to have information. Here, I will merge these threads and talk about what it means for a system to have causal effect, and the connection with information. …

Physical Information
May 14, 2021
I will apply to abstract physics the same information algebra I introduced in Bayesian information theory and further developed in Information Algebra. Bayesian information is just information from the perspective of an agent that may or may not have particular information. Below, I will introduce the notion of a physical system having or not having information about itself or other systems (regardless of whether or not it has agenty attributes), and the same information algebra will apply. The only difference is a shift from the 1st-person to the 3rd-person perspective. …

Causality For Physics
April 20, 2021
The definition of causality within physics is not a settled matter, perhaps surprisingly. My understanding is that this question is studied more by philosophers than physicists, as the field of physics tends to avoid interpretational problems. That is to say, theories like relativity or quantum mechanics are mathematically well defined and make predictions, so that’s all there is to it, right? I’m not a physicist, so I will proceed to ask such questions. …

Bayesian information theory
April 9, 2021
Shannon’s information theory defines quantity of information (e.g. self-information $-\lg p(x)$) in terms of probabilities. In the context of data compression, these probabilities are given a frequentist interpretation (Shannon makes this interpretation explicit in his 1948 paper). In Deconstructing Bayesian Inference, I introduced the idea of a subjective data distribution. If quantities of information are calculated using a subjective data distribution, what is their meaning? Below I will answer this question by building, from the ground up, a different notion of Bayesian inference. …
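As a quick illustration of the quantity mentioned above (a sketch for this index, not code from the post itself), self-information in bits can be computed directly from a probability:

```python
import math

def self_information(p: float) -> float:
    """Self-information in bits of an outcome with probability p."""
    return -math.log2(p)

# A fair coin flip carries exactly 1 bit.
print(self_information(0.5))    # 1.0
# Rarer outcomes carry more information.
print(self_information(0.125))  # 3.0
```

Whether these bits retain their usual meaning when `p` comes from a subjective rather than frequentist distribution is exactly the question the post takes up.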

Deconstructing Bayesian Inference
March 31, 2021
I pose the question: Why predict probabilities rather than predicting outcomes without probabilities? I first define Bayesian inference, and then I remove the probabilities involved in multiple passes until there is no probability. Then I examine what the result is, and eventually motivate bringing probabilities back into our predictions. …

Classical vs Bayesian Reasoning
February 24, 2021
My goal is to identify the core conceptual difference between someone who accepts “Bayesian reasoning” as a valid way to obtain knowledge about the world, vs someone who does not accept Bayesian reasoning, but does accept “classical reasoning”. By classical reasoning, I am referring to the various forms of logic that have been developed, starting with Aristotelian logic, through propositional and predicate logic (e.g. Frege’s), and culminating in formal mathematics (e.g. higher order type theory). In such logics, the goal is to uniquely determine the truth values of things (such as theorems and propositions) from givens. …

Bayesian Inference On 1st Order Logic
February 21, 2021
David Chapman’s blog post titled Probability theory does not extend logic has stirred up some controversy. In it, Chapman argues that so-called Bayesian logic, as it is currently understood, is limited to propositional logic (0th order logic), but cannot generalize to higher order logics (e.g. predicate logic a.k.a. 1st order logic), and thus cannot be a general foundation for inference from data under uncertainty. Chapman provides a few counterexamples that supposedly demonstrate that doing Bayesian inference on statements in 1st order logic is incoherent. I think there is a lot of confusion surrounding this point because Chapman does not use proper probability notation. In the following article I show how Chapman’s examples can be properly written and made sense of using random variables. Hopefully this clarifies some things. …

Probability Theory and Its Philosophy
June 19, 2020
Probability is a measure defined on events, which are sets of primitive outcomes. Probability theory mostly comes down to constructing events and measuring them. A measure is a generalization of size which corresponds to length, area, and volume (rather than the bijective mapping definition of cardinality). …
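The definition above can be made concrete with a minimal sketch (an illustration for this index, not taken from the post): a finite set of primitive outcomes, events as subsets, and probability as an additive measure over them.

```python
from fractions import Fraction

# Primitive outcomes of one die roll, each carrying a uniform measure.
P = {w: Fraction(1, 6) for w in range(1, 7)}

def measure(event):
    """Probability of an event = the sum of the measures of its outcomes."""
    return sum(P[w] for w in event)

even = {2, 4, 6}
low = {1, 2}
print(measure(even))        # 1/2
# Measures add over disjoint pieces (inclusion-exclusion for overlaps):
print(measure(even | low))  # 2/3
```

This is the "generalization of size" at work: the probability of a union is governed by how the pieces overlap, just as with lengths and areas.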

Shannon's Information Theory
June 9, 2020
Shannon’s theory of information is usually just called information theory, but is it deserving of that title? Does Shannon’s theory completely capture every possible meaning of the word information? In the grand quests of creating AI and understanding the rules of the universe (i.e. a grand unified theory), information may be key. Intelligent agents search for information and manipulate it. Particle interactions in physics may be viewed as information transfer. The physics of information may be key to interpreting quantum mechanics and resolving the measurement problem. If you endeavor to answer these hard questions, it is prudent to understand existing so-called theories of information so you can evaluate whether they are powerful enough and to take inspiration from them. Shannon’s information theory is a hard nut to crack. Hopefully this primer gets you far enough along to be able to read a textbook like Elements of Information Theory. At the end I start to explore the question of whether Shannon’s theory is a complete theory of information, and where it might be lacking. This post is long. That is because Shannon’s information theory is a framework of thought. That framework has a vocabulary which is needed to appreciate the whole. I attempt to gradually build up this vocabulary, stopping along the way to build intuition. With this vocabulary in hand, you will be ready to explore the big questions at the end of this post. …
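A taste of the vocabulary the post builds up (a sketch for this index, not an excerpt): Shannon entropy, the expected self-information of a discrete distribution, computed in a few lines.

```python
import math

def entropy(dist):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

print(entropy([0.5, 0.5]))  # a fair coin: 1.0 bit
print(entropy([0.9, 0.1]))  # a biased coin: roughly 0.47 bits
```

The `p > 0` guard reflects the convention that outcomes of probability zero contribute nothing to the entropy.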

Visualizing Quantum Field States
January 10, 2020