Tag: science

Phanerozoic Temperature

By combining global temperature estimates from geological data with estimates of tropical temperatures obtained from oxygen isotope studies, it is possible to produce an estimate of the global average temperature for any time in the past. The modern global average temperature is 14.5°C. The average global temperature over the last 540 Ma is 20°C, but the temperature has fluctuated between 25°C (hothouse) and 10°C (icehouse). During the Permo-Triassic Extinction, the global temperature spiked above 28°C. Most of the time, the global temperature gently rises and falls in response to gradual changes in orbital and solar parameters, ocean currents, sea level, atmospheric chemistry (greenhouse gases), and other factors. These changes occur over millions of years. Rarely, a drastic change in one of these factors causes either rapid global warming (Kidder–Worsley events) or rapid global cooling (Stoll–Schrag events). These abrupt climate excursions take place over thousands of years, rather than millions.

the world was colder than today only 2% of the time in the last 540 Ma.

Turing Patterns

Turing’s paper described a theoretical mechanism based on 2 substances — an activator and an inhibitor — that diffuse across an area at different rates. The interaction between these 2 “morphogens,” as Turing called them, allows one to interrupt the effect of the other — creating a pattern of colored lines on a tropical fish, for example, rather than a solid color. Turing’s mechanism was indeed responsible for the stripes in the bismuth. And it demonstrated once again how robust and powerful Turing’s original insight was. Here, the stripe-forming process is driven by the forces at play among the bismuth atoms and the metal below. Bismuth atoms want to fit into particular spots on the molecular lattice of the metal. But these spots are closer together than the bismuth atoms find comfortable. Like a photograph that gets shoved into a frame that’s too small for it, the sheet of bismuth atoms buckles. The strain creates a wavy pattern that leaves some atoms raised, forming the stripes. The vertical shift — movement away from the plane of the crystal — acts as the activator in the Turing equations, while the shift within the plane acts as the inhibitor. The morphogens here are displacements, not molecules. When part of a Turing pattern is wiped out, it grows back. You might not assume that inorganic materials like bismuth crystals would be able to heal as animals do, but indeed, the researchers’ simulated bismuth crystal was able to mend itself.
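Turing’s condition can be stated precisely: a homogeneous steady state that is stable without diffusion becomes unstable once the inhibitor diffuses sufficiently faster than the activator. A minimal sketch of that stability check, using illustrative Jacobian values of my own choosing (not taken from the bismuth paper):

```python
def turing_unstable(J, Du, Dv):
    """Check the classic conditions for a diffusion-driven (Turing)
    instability of a two-morphogen reaction-diffusion system.

    J  : 2x2 Jacobian of the reaction kinetics at the steady state
    Du : diffusion rate of the activator
    Dv : diffusion rate of the inhibitor
    """
    a, b = J[0]
    c, d = J[1]
    tr, det = a + d, a * d - b * c
    # Stable in the absence of diffusion:
    stable_without_diffusion = tr < 0 and det > 0
    # Diffusion destabilizes some wavenumber when both of these hold:
    h = Dv * a + Du * d
    diffusion_driven = h > 0 and h * h > 4 * Du * Dv * det
    return stable_without_diffusion and diffusion_driven

# Activator self-enhances (a > 0); inhibitor suppresses it (b < 0) and
# is produced by it (c > 0). Numbers here are illustrative only.
J = [[1.0, -1.0], [2.0, -1.5]]

print(turing_unstable(J, Du=1.0, Dv=10.0))  # fast inhibitor: stripes form
print(turing_unstable(J, Du=1.0, Dv=1.0))   # equal rates: no pattern
```

The point of the check is the same whether the morphogens are molecules or, as in the bismuth case, atomic displacements: only the Jacobian and the two diffusion rates matter.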

Animal Numerosity

Practically every animal that scientists have studied — insects and cephalopods, amphibians and reptiles, birds and mammals — can distinguish between different numbers of objects in a set or sounds in a sequence. They don’t just have a sense of “greater than” or “less than,” but an approximate sense of quantity: that 2 is distinct from 3, that 15 is distinct from 20. This mental representation of set size, called numerosity, seems to be “a general ability,” and an ancient one. Many species have displayed a capacity for abstraction that extends to performing simple arithmetic, while a select few have even demonstrated a grasp of the quantitative concept of 0 — an idea so paradoxical that very young children sometimes struggle with it. Both monkeys and honeybees know how to treat 0 as a numerosity, placing it on a mental number line much as they would numerosity 1 or 2. And crows can do it, too.

Woke Imperialism

A great example of woke imperialism was a recent foofaraw in which a woke tried to cancel someone for naming a protocol “AAVE”. The idea was that the authors of said protocol were insufficiently diverse because they didn’t know that “AAVE” stood for African American Vernacular English in the US. Now, the thing is that the word Aave is a Finnish word that means ghost, and the authors of the protocol were Finns, and it was a great word for what the protocol actually did (namely flash loans that could disappear in a second).

So, what this actually represented was Woke American chauvinism in the name of tolerance. A citizen of this gigantic global empire, the American empire, was using woke language to assert authority over some poor Finns as insufficiently respectful of the people his fellow Americans had once oppressed. Quite a trick: America’s history of slavery used to justify America’s present of imperialism! And again, this is similar to a Soviet soldier filling the ear of an Estonian civilian with the story of how Russian capitalists had once grievously oppressed Russian workers, a problem which Comrade Lenin solved with their glorious October Revolution…and that’s why they rolled the tanks into Tallinn. A non sequitur logically, but a useful tool ideologically.

and why might organizations adopt self-defeating woke positions?

It seems fair to say that there is a silent constituency, even a majority, within these organizations that does not support the mob and its methods. Why are they allowing the Woke to take over? There’s an increase of generic human capital relative to specific human capital. You don’t need to be trained on your firm’s computer system; you can navigate based on your experience with interfaces that are familiar on the Internet. People are not tied to organizations as closely as they were a couple of decades ago. They are not motivated to put up resistance when a determined minority of Wokesters tries to take over.

2023-07-03: Woke nonsense is destroying biology

Biology faces a grave threat from “progressive” politics that are changing the way our work is done, delimiting areas of biology that are taboo and will not be funded by the government or published in scientific journals, stipulating what words biologists must avoid in their writing, and decreeing how biology is taught to students and communicated to other scientists and the public through the technical and popular press. The science that has brought us so much progress and understanding—from the structure of DNA to the green revolution and the design of COVID-19 vaccines—is endangered by political dogma strangling our essential tradition of open research and scientific communication.
Campaigns were launched to strip scientific jargon of words deemed offensive, to ensure that results that could “harm” people seen as oppressed were removed from research manuscripts, and to tilt the funding of science away from research and toward social reform. The American government even refused to make genetic data—collected with taxpayer dollars—publicly available if analysis of that data could be considered “stigmatizing.” In other words, science—and here we are speaking of all STEM fields (science, technology, engineering, and mathematics)—has become heavily tainted with politics, as “progressive social justice” elbows aside our real job: finding truth.

First Applied Geometry

Researchers have made a discovery that may shake up the history of mathematics, revealing evidence of applied geometry being used for land surveying 3.7 ka BP. Found on a Babylonian clay tablet, the etchings are believed to represent the oldest known example of applied geometry, and feature mathematical techniques linked to the Greek philosopher Pythagoras that were well ahead of their time.
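The techniques “linked to Pythagoras” are Pythagorean triples: integer side lengths satisfying a² + b² = c², which let a surveyor lay out an exact right angle with knotted rope alone. A minimal sketch (the specific triples below are standard examples, not necessarily the ones on the tablet):

```python
def is_right_triangle(a, b, c):
    """True if the 3 side lengths form a right triangle
    (right angle opposite the longest side)."""
    a, b, c = sorted((a, b, c))
    return a * a + b * b == c * c

# A rope marked into 3 + 4 + 5 segments, pulled taut into a triangle,
# yields a perpendicular corner for a field boundary:
print(is_right_triangle(3, 4, 5))    # True
print(is_right_triangle(5, 12, 13))  # True
print(is_right_triangle(4, 5, 6))    # False
```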

Grabby Aliens

Advanced aliens really are out there, and we have enough data to say roughly where they are in space and time, and when we will meet or see them. In our model, grabby civilizations (GCs) are born according to a volume-based power law, and once born they simply expand at a constant speed relative to local materials. We show that for nontrivial powers this power law is a reasonable approximation to a more realistic model. This expansion speed and the 2 parameters of this power law are the only 3 parameters of our model, each of which can be estimated from data to within 4x. The hard-steps literature on Earth history helps to estimate the power, and our current date helps to estimate the power-law timescale. Furthermore, the fact that we do not now see large alien-controlled volumes in our sky, even though they should control much of the universe’s volume by now, gives us our last estimate: that aliens expand at >50% of lightspeed. Given estimates of all 3 parameters, we have in this paper shown many model predictions regarding alien timing, spacing, appearance, and the durations until we see or meet them. And we have shown how optimism regarding humanity’s future is in conflict with optimism regarding SETI efforts. Being especially simple, our model is unlikely to be an exact representation of reality, so future research might explore more realistic variations. For example, one might more precisely account for the recent exponential expansion of the universe, and for deviations between a realistic appearance function and our power-law approximation. Instead of being uniform across space, the GC birth rate might be higher within galaxies, higher within larger galaxies, and follow their typical spatial correlations. A GC expansion might take a duration to bring its full effect to any one location, and the GC expansion speed might vary and depend on local geographies of resources and obstacles.
Finally, GC subvolumes might sometimes stop expanding or die, either spontaneously or in response to local disturbances.
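To see why a steep power implies a sky that flips quickly from empty to full, here is a 1-dimensional toy reduction of the model (my own simplification, not the paper’s 3-D treatment), where the still-empty fraction of space has a closed form:

```python
import math

def uncontrolled_fraction(n, speed, t, rate=1.0):
    """Still-empty fraction of a 1-D toy universe at time t.

    Births are a Poisson process with per-length rate rate*n*t**(n-1)
    (the power law); each civilization expands both ways at `speed`.
    A point is still empty iff no birth lies in its past "grabby cone",
    whose expected count is
        integral_0^t rate*n*s**(n-1) * 2*speed*(t-s) ds
      = 2*rate*speed*t**(n+1)/(n+1),
    so by Poisson statistics the empty fraction is exp(-that).
    """
    return math.exp(-2.0 * rate * speed * t ** (n + 1) / (n + 1))

# With a hard-steps-style power like n = 6, space goes from nearly
# all empty to nearly all controlled over a short stretch of time:
for t in (0.8, 1.2, 1.6):
    print(t, uncontrolled_fraction(6, 1.0, t))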

2023-04-11: A good summary of the argument

SARS-CoV-2 Life Cycle

What has emerged from 19 months of work, backed by decades of coronavirus research, is a blow-by-blow account of how SARS-CoV-2 invades human cells. Scientists have discovered key adaptations that help the virus to grab on to human cells with surprising strength and then hide itself once inside. Later, as it leaves cells, SARS-CoV-2 executes a crucial processing step to prepare its particles for infecting even more human cells. These are some of the tools that have enabled the virus to spread so quickly and claim millions of lives.

50% more Food

Humans have an enzyme called FTO that demethylates N6-methyladenosine through oxidation. It’s part of a large family of enzymes that do this sort of thing using an iron atom in their active sites, along with molecular oxygen. Plants, though, don’t have an FTO homolog – they have some other enzymes that can demethylate this substrate, but not like FTO itself. So the team behind this paper wanted to see what would happen if you engineered the FTO enzyme into plants. In rice and potatoes, the crop yields went up by 50%. Grain size in the rice plants didn’t change, nor did the height of the plants – they just produced a lot more rice grains in general. How does this happen? The plants’ root systems were deeper and more extensive, and photosynthetic efficiency went up by a startling 36%. Transpiration from the leaves was up 78%, but at the same time, the plants of both species showed significantly higher drought tolerance. These are highly desirable traits, and it’s worth noting that a lot of this extra biomass is coming from increased usage of carbon dioxide from the air.

Now we just need to crush the GMO luddites to roll this out.

0.001% Mathlib

When Lean came out, there were plenty of other proof assistants available, including Coq, which is the most similar to Lean — the logical foundations of both programs are based on dependent type theory. But Lean represented a chance to start fresh. Mathematicians created mathlib and eagerly began to fill it with the world’s mathematical knowledge. The work has the air of a barn raising. “If you were to quantify how much of mathematics is formalized, I’d say it’s way less than 0.001%.” It will probably take decades before mathlib has the content of an actual research library, but Lean users have shown that such a comprehensive catalog is at least possible — that getting there is merely a matter of programming in all the math. Last year, the Lean community put aside the accumulation of undergraduate math and skipped ahead to the vanguard of the field. The goal was to define one of the great innovations of 21st-century mathematics — an object called a perfectoid space that was developed over the last 10 years by Peter Scholze. In 2018, the work earned Scholze the Fields Medal.

This project just hit a major milestone, as Peter Scholze explains:

While this challenge has not been completed yet, I am excited to announce that the Experiment has verified the entire part of the argument that I was unsure about. I find it absolutely insane that interactive proof assistants are now at the level that within a very reasonable time span they can formally verify difficult original research. Congratulations to everyone involved in the formalization!!

In this Q&A-style blog post, I want to reflect on my experience watching this experiment.

How did the formalization proceed?
Initially, I imagined that the first step would be that a group of people study the whole proof in detail and write up a heavily digested version, broken up into many, many small lemmas, and only afterwards start the formalization of each individual lemma. This is not what happened. Instead, the formalization followed quite closely the original lecture notes, and directly attacked lemma after lemma there. It did seem that the process was to directly type the proofs into Lean. Lean actually gives the user a very clear summary of what the current goal is, so one always needs to get a very clear sense of what the next few steps really are. Sometimes it then turned out that even on paper it was not clear how to proceed, and the issue was brought up in the chat, where it was usually quickly resolved. Only after a lemma was entirely formalized was the proof, now thoroughly digested, written up again in the Blueprint in human-readable form.

What did you learn about the process of formalization?
I learnt that it can now be possible to take a research paper and just start to explain lemma after lemma to a proof assistant, until you’ve formalized it all! I think this is a landmark achievement.
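At toy scale, “explaining a lemma to a proof assistant” looks like the following Lean 4 snippet (a trivial identity, nothing like the Experiment’s scale, with Mathlib supplying the `ring` tactic that closes the goal automatically):

```lean
import Mathlib

-- A toy lemma: state the goal, then let the `ring` tactic
-- discharge the commutative-ring algebra.
theorem sq_add (a b : ℤ) : (a + b) ^ 2 = a ^ 2 + 2 * a * b + b ^ 2 := by
  ring
```

The loop Scholze describes is this at scale: state the next goal, try tactics, and let Lean’s goal display tell you exactly what remains.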

see also this earlier piece about using Coq:

For 10 years, Voevodsky has been advocating the virtues of computer proof assistants and developing univalent foundations in order to bring the languages of mathematics and computer programming closer together. As he sees it, the move to computer formalization is necessary because some branches of mathematics have become too abstract to be reliably checked by people. “The world of mathematics is becoming very large, the complexity of mathematics is becoming very high, and there is a danger of an accumulation of mistakes”. Proofs rely on other proofs; if one contains a flaw, all others that rely on it will share the error. The real potential of type theory informed by homotopy theory is as a new foundation for mathematics that’s uniquely well-suited both to computerized verification and to studying higher-order relationships.

and even earlier efforts:

A grand objective of the field is to formalize the whole of mathematics, reproving its theorems with a negligible probability of error, all based on a small kernel (in the HOL Light system, the kernel is 500 lines of OCaml code).

and another backgrounder

As the towering mathematical enterprise reaches new heights, with more intricately and esoterically turreted peaks, each practitioner drilling down into minutiae that could never be understood by colleagues even in the same field, computers will be necessary for advancing the enterprise, and authenticating the integrity of the structure—a structure so complex in detail and so massive in scale that adjudication by mere human referee will no longer be feasible.

one person’s perspective

I’m not concerned by the fact that no human mathematician can check this, because we can check it with other computer approaches

2023-06-01: Progress continues, and now LLMs have entered the picture

We set out to tackle serious mathematics with a combination of hope and trepidation. We were able to formalize everything we set out to formalize and were never forced to discard a development part way through. “We have formalized results by 2 Fields medalists (Roth and Gowers), an Abel prize winner (Szemerédi) and of course Erdős too!” We’ve also seen impressive advances in search and language models to assist users in proof development. Although the effort required to formalize mathematical articles remains high, we can confidently predict that formalization will be playing a significant role in mathematical research in the next few years.

2025-09-11: Gauss

Using Gauss we produced ~25K lines of Lean code, comprising 1K theorems and definitions. Formal proofs of this scale have historically been major milestones, often the culmination of multi-year efforts. The largest singular formalization projects in history — career-defining efforts, which can span more than a decade — are only an order of magnitude larger at up to 500K LOC. Lean’s standard mathematical library, Mathlib, is 1 OOM beyond that, at 2M LOC, comprising 350K Lean theorems and definitions, and developed by 600 human contributors over 8 years. With further algorithmic improvements, we aim to increase the sum total of formal code by 2-3 OOM in the coming 12 months. This will serve as the training ground for a new paradigm — verified superintelligence and the machine polymaths that will power it.

Doggerland

500 meters wide, the beach is made of material dredged from the sea bottom 13 kilometers offshore and dumped on the existing beach. It’s an experimental coastal protection measure, its sands designed to spread over time to shield the Dutch coast from sea-level rise. The endeavor has made 21 million cubic meters of Stone Age soil accessible to archaeologists. “The surprising thing is just how much DNA is still down there.”

2022-03-06: A map of Doggerland 11 ka BP:

Doggerland was, by any estimation, the most attractive landscape in northwestern Europe for Mesolithic hunter-gatherers and perhaps the continent’s most densely populated region at the time. Because of the seemingly inexhaustible resources present there, normally mobile Mesolithic societies may have been encouraged to create permanent or semipermanent settlements. While relatively few Mesolithic sites have been located on land, if Doggerland was indeed the heartland for these early Holocene communities, it stands to reason that many archaeological sites may lie beneath the North Sea. Seismic data has now given archaeologists a much better sense of Doggerland’s topography. They know where its rivers, lakes, and coastlines were located, and where its forests were found. They can use this information to speculate about where people may have lived.
This new type of research into Doggerland has the potential to once again dramatically alter the field of European prehistory. “We have a completely intact landscape with a state of preservation that we can often only dream of on land. I think we will have a lot of exciting discoveries to come. We are barely scratching the surface.”