mathematical models in 3d steel. pwetty!
Tag: math
Intelligence means Prediction
The cortex appears wired at its foundation to run Bayesian computations as efficiently as possible.
now i REALLY have to cram bayesian math.
2008-09-27:
s/Intelligence is defined by behavior/Intelligence is defined by prediction/
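(a refresher for myself, not from the book: the “bayesian computation” being claimed here bottoms out in bayes’ rule, with H a hypothesis the brain entertains and D the incoming sensory data)

P(H \mid D) = \frac{P(D \mid H)\, P(H)}{P(D)}, \qquad P(D) = \sum_{H'} P(D \mid H')\, P(H')

posterior = likelihood times prior, normalized over the alternatives. presumably the “efficiently” part is about approximating this update without ever enumerating every alternative hypothesis explicitly.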
Surfing Uncertainty isn’t pop science and isn’t easy reading. Sometimes it’s on the border of possible-at-all reading. Author Andy Clark (a professor of logic and metaphysics, of all things!) is clearly brilliant, but prone to going on long digressions about various boring scholarly debates. In particular, he’s obsessed with showing how “embodied” everything is all the time. This gets kind of awkward, since the predictive processing model isn’t really a natural match for embodiment theory, and describes a brain which is pretty embodied in some ways but not-so-embodied in others. If you want 100 pages of apologia along the lines of “this may not look embodied, but if you squint you’ll see how super-duper embodied it really is!”, this is your book.
2018-11-30: Free energy principle
Friston’s work has 2 primary motivations. Sure, it would be nice to see the free energy principle lead to true artificial consciousness someday, but that’s not one of his top priorities. Rather, his first big desire is to advance schizophrenia research, to help repair the brains of patients like the ones he knew at the old asylum. And his second main motivation is “much more selfish.” It goes back to that evening in his bedroom, as a teenager, looking at the cherry blossoms, wondering, “Can I sort it all out in the simplest way possible?”
and a piece on Friston:
Karl Friston’s free energy principle might be the most all-encompassing idea since Charles Darwin’s theory of natural selection. But to understand it, you need to peer inside the mind of Friston himself.
We have never seen such a concrete example of how the brain uses prior experience to modify the neural dynamics by which it generates sequences of neural activities, to correct for its own imprecision. This is the unique strength of this paper: bringing together perception, neural dynamics, and Bayesian computation into a coherent framework, supported by both theory and measurements of behavior and neural activities.
If that is not mind-bending enough, in his new book, Jeff Hawkins extends the memory framework to the construct of “reference frames”. Everything we perceive is a constructed reality, a cortical consensus from competing internal models resident in many cortical columns, the amalgam of 1000 brains. Those models are updated by data streaming from the senses. But our reality resides in the models. “The brain learns its model of the world by observing how its inputs change over time. There isn’t another way to learn. Every time we take a step, move a limb, move our eyes, tilt our head, or utter a sound, the inputs from our sensors change. For example, our eyes make rapid movements, called saccades, about three times a second. With each saccade, our eyes fixate on a new point in the world and the information from the eyes to the brain changes completely.” We don’t perceive any of this because we are living in the model, which is predicting the next input to come, across all the senses. “Vision is an interactive process, dependent on movement. Only by moving can we learn a model of the object.”
“To avoid hallucinating, the brain needs to keep its predictions separate from reality. We are not aware of most of the predictions made by the brain unless an error occurs.”
“Thoughts and experiences are always the result of a set of neurons that are active at the same time (about 2% of the total). Individual neurons can participate in many different thoughts or experiences. Everything we know is stored in the connections between neurons. Every day, many of the synapses on an individual neuron will disappear and new ones will replace them. Thus, much of learning occurs by forming new connections between neurons that were not previously connected.”
Sequence memory (like predicting the next note in a melody or a common sequence of behaviors): “Sequence memory is also used for language. Recognizing a spoken word is like recognizing a short melody.”
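(to make “sequence memory” concrete for myself — this is just a toy sketch of next-element prediction, not hawkins’ actual HTM algorithm; the context length and the little melody are made up)

// toy sequence memory: learn which element tends to follow a short context,
// then predict the next element and notice when the input violates the prediction.
function makeSequenceMemory(contextLength = 2) {
  const transitions = new Map(); // context string -> Map(next element -> count)
  return {
    learn(sequence) {
      for (let i = contextLength; i < sequence.length; i++) {
        const context = sequence.slice(i - contextLength, i).join(',');
        const next = sequence[i];
        if (!transitions.has(context)) transitions.set(context, new Map());
        const counts = transitions.get(context);
        counts.set(next, (counts.get(next) || 0) + 1);
      }
    },
    predict(context) {
      const counts = transitions.get(context.join(','));
      if (!counts) return null; // unfamiliar context: no prediction
      let best = null;
      let bestCount = -1;
      for (const [next, count] of counts) {
        if (count > bestCount) { best = next; bestCount = count; }
      }
      return best;
    },
  };
}

const memory = makeSequenceMemory(2);
memory.learn(['C', 'E', 'G', 'E', 'C']); // a tiny "melody"
console.log(memory.predict(['C', 'E'])); // -> 'G'
console.log(memory.predict(['G', 'E'])); // -> 'C'
// hearing anything else after ['C', 'E'] would register as a prediction error

recognizing a spoken word then looks much the same: a familiar sequence of sounds that keeps confirming its own predictions.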
Symmetry and the Monster
the classification of finite simple groups is the manhattan project of math. 15k pages of proof, and ongoing attempts to restate the proof so that future generations may understand it.
Lindenmayer Systems in Javascript
rules for fractals, includes drawing with canvas
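(the core of an l-system is just repeated string rewriting; here is a rough sketch of that part — the koch-curve rule is a standard textbook example, not necessarily what the linked page uses, and the canvas drawing is left out)

// minimal L-system rewriter: apply the production rules to every symbol, repeatedly.
// the linked page would then walk the final string with turtle graphics on a canvas
// (F = draw forward, + / - = turn left / right by a fixed angle).
function rewrite(axiom, rules, iterations) {
  let current = axiom;
  for (let i = 0; i < iterations; i++) {
    current = current
      .split('')
      .map((symbol) => rules[symbol] ?? symbol) // symbols without a rule are copied as-is
      .join('');
  }
  return current;
}

// koch curve: F -> F+F--F+F, turning angle 60 degrees
console.log(rewrite('F', { F: 'F+F--F+F' }, 2));
// -> F+F--F+F+F+F--F+F--F+F--F+F+F+F--F+F

the drawing part is just a loop over the final string, moving a virtual pen for every F and rotating for every + or -.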
math skills
i find it hard to believe these are real
Archimedes Palimpsest
arrgh. when you listen to what people did to the only known manuscript from archimedes.. made it into a prayer book, painted on it, etc.. oh the humanity. awesome presentation though, including a “scriptospatial approach” ala mashups / pushpins to annotate it
NUMB3RS
a show where a math “genius” is the star. will presumably do good things for math enrollment ala CSI, but their idea of genius is so sad. simple combinatorics and trigonometry problems are portrayed as “hard”. geez
Non-leaky abstractions in physics
i talked to prof. eberhard hilf yesterday. hilf is a retired professor of theoretical physics who is now working on the problems around long-term storage of electronic scientific data. we talked about the change in semantics as science moves forward, and the need not only to port content from old storage media to new ones, but also to transcribe the content itself to make it accessible to scientists of another age.
hilf demonstrated how equations as jotted down by einstein in 1905 would be almost incomprehensible to scientists today. over the years, verbose notations have been replaced by increasingly succinct ones, and new symbols have been introduced. i was immediately reminded of leaky abstractions. hilf was adamant that physics is not prone to those problems because it is grounded in solid math.
good for them physicists, and too bad computer science cannot currently claim the same.
lean production
Paul Erdos, the most prolific mathematician of all time, defied the conventional wisdom that mathematics was just a young man’s game. For the last 25 years of his life, Erdos raced against the specter of old age to prove as many mathematical theorems as possible. “The first sign of senility is when a man forgets his theorems. The second sign is when he forgets to zip up. The third sign is when he forgets to zip down.” Erdos never experienced the first sign. He managed to think about more problems than any other mathematician in history and could recite the details of all 1475 of the papers he had written or co-authored. Fortified by espresso and amphetamines, Erdos did mathematics 19 hours a day, 7 days a week. “A mathematician is a machine for turning coffee into theorems.” When friends urged him to slow down, he always had the same response: “There’ll be plenty of time to rest in the grave.”
i’m currently reading the man who loved only numbers. i dig erdos’ way of living out of a suitcase, something i aspire to as well. it takes lean production to entirely new levels.
Fusing math and art
A 22-year-old MIT professor whose work fuses art, science, work and play is the recipient of a $500K MacArthur Fellowship, commonly known as the genius grant. Assistant Professor Erik Demaine of electrical engineering and computer science – who last month was called one of the most brilliant scientists in America by Popular Science magazine – is one of the youngest people ever selected for the fellowship and the youngest of the 24 named this year.
interesting combination of approaches. there is so much to explore here.