It was about the beginning of September, 1664, that I, among the rest of my neighbors, heard in ordinary discourse that the plague was returned again in Holland; for it had been very violent there, and particularly at Amsterdam and Rotterdam, in the year 1663, whither, they say, it was brought, some said from Italy, others from the Levant, among some goods which were brought home by their Turkey fleet; others said it was brought from Candia; others from Cyprus. It mattered not from whence it came; but all agreed it was come into Holland again.
I just finished Daniel Defoe's A Journal of the Plague Year.
2014-09-30: This is an awesome poster visualizing the plague, walking you through how real science is done, in an entertaining and informative way.
2015-10-06: Plague is one of the most virulent pathogens.
The acquisition of a single gene, named pla, gave Y. pestis the ability to cause pneumonia, producing a form of plague so lethal that it kills essentially all of those infected who don't receive antibiotics. It is also among the most infectious bacteria known. "Yersinia pestis is a pretty kick-ass pathogen. A single bacterium can cause disease in mice. It's hard to get much more virulent than that."
Monica Green published a landmark article, The Four Black Deaths, in the American Historical Review that rewrites our narrative of this brutal and transformative pandemic. In it, she identifies a "big bang" that created 4 distinct genetic lineages that spread separately throughout the world, and finds concrete evidence that the plague was already spreading from China to central Asia in the 1200s. This discovery pushes the origins of the Black Death back by over 100 years, meaning that the first wave of the plague was not a decades-long explosion of horror, but a disease that crept across the continents for over a century until it reached a crisis point.
2022-02-16: Black Death mortality rates varied widely.
“The data is sufficiently widespread and numerous to make it likely that the Black Death swept away 65% of Europe’s population”. But those figures, based on historical documents from the time, greatly overestimate the true toll of the plague. By analyzing ancient deposits of pollen as markers of agricultural activity, researchers found that the Black Death caused a patchwork of destruction. Some regions of Europe did indeed suffer devastating losses, but other regions held stable, and some even boomed. It’s possible that the ecology of rats and fleas that spread the bacteria was different from country to country. The ships that brought Yersinia to Europe may have come to some ports at a bad time of the year for spreading the plague, and to others at a better time.
2022-08-12: The plague may have had a role in the collapse of Egypt’s Old Kingdom and the Akkadian Empire in Mesopotamia
During the late 3rd millennium BCE, the Eastern Mediterranean and Near East witnessed societal changes in many regions, which are usually explained with a combination of social and climatic factors. However, recent archaeogenetic research forces us to rethink models regarding the role of infectious diseases in past societal trajectories. The plague bacterium Yersinia pestis, which was involved in some of the most destructive historical pandemics, circulated across Eurasia at least from the onset of the 3rd millennium BCE but the challenging preservation of ancient DNA in warmer climates has restricted the identification of Y. pestis from this period to temperate climatic regions. As such, evidence from culturally prominent regions such as the Eastern Mediterranean is currently lacking. Here, we present genetic evidence for the presence of Y. pestis and Salmonella enterica, the causative agent of typhoid/enteric fever, from this period of transformation in Crete, detected at the cave site Hagios Charalambos. We reconstructed 1 Y. pestis genome that forms part of a now-extinct lineage of Y. pestis strains from the Late Neolithic and Bronze Age that were likely not yet adapted for transmission via fleas. Furthermore, we reconstructed 2 ancient S. enterica genomes from the Para C lineage, which cluster with contemporary strains that were likely not yet fully host adapted to humans. The occurrence of these 2 virulent pathogens at the end of the Early Minoan period in Crete emphasizes the necessity to re-introduce infectious diseases as an additional factor possibly contributing to the transformation of early complex societies in the Aegean and beyond.
Watching The Mummy Returns reminded me of an article I had read some time ago, arguably one of the scariest I have ever read. It talks about the problem of marking a site as dangerous for 10 ka into the future.
These standing stones mark an area used to bury radioactive wastes. The area is … by … kilometers and the buried waste is … kilometers down. This place was chosen to put this dangerous material far away from people. The rock and water in this area may not look, feel, or smell unusual but may be poisoned by radioactive wastes. When radioactive matter decays, it gives off invisible energy that can destroy or damage people, animals, and plants.
Do not drill here. Do not dig here. Do not do anything that will change the rocks or water in the area.
Do not destroy this marker. This marking system has been designed to last 10 ka. If the marker is difficult to read, add new markers in longer-lasting materials in languages that you speak. For more information go to the building further inside. The site was known as the WIPP (Waste Isolation Pilot Plant) site when it was closed in …
2006-10-16: Well-researched thorium piece, but Michael needs to be more concise: he repeats himself too much.
Sometime between 2020 and 2030, we will invent a practically unlimited energy source that will solve the global energy crisis. This unlimited source of energy will come from thorium. A summary of the benefits, from a recent announcement of the start of construction for a new prototype reactor:
There is no danger of a melt-down like the Chernobyl reactor.
It produces minimal radioactive waste.
It can burn plutonium waste from traditional nuclear reactors.
It is not suitable for the production of weapon grade materials.
Global thorium reserves could cover our energy needs for 1000s of years.
The new reactor, which is only 7m x 2m, could change everything for a group of neighbors who are fed up with the power companies and want more control over their energy needs.
2008-05-22: Why bother with oil-based stuff when you can have distributed nuclear energy with uranium hydride batteries?
2008-07-24: Uranium Deep Burn
It is projected that volumes of high-level waste could be reduced by a factor of 50, while extra electricity is generated.
Besides the low amount of waste and the almost complete burning of all uranium and plutonium, another big advantage of liquid fluoride reactors is fast and safe shutoff and restart capability. This fast stop and restart allows for load-following electricity generation, which means nuclear can serve a different electric-utility niche beyond baseload power. Currently natural gas is the primary load-following power source. Wind and solar are intermittent, generating power at unreliable times. LFTR would provide reliable, on-demand power.
Fuck ethanol. Let's have some 21st century nuclear power.
Thorium is one of the victims of the brainless scare campaign against nuclear that has infected most western nations over the last 30 years. Instead of doing silly stunts like the Germans, whose "exit" from nuclear energy will mean more coal plants being built, an enlightened nation would choose thorium.
Instead, we are stuck with aging reactors (how does that make anyone safer?) and scientific illiteracy both in the general population and elected representatives.
I’m generally dismayed how little discussion about thorium there is in energy circles.
Kirk Sorensen provides an update on the current state of thorium power. The bad news is that it remains mostly a theoretical concept; no operational reactor has been deployed yet, even as a prototype. However, new thorium molten salt experiments were just started in Europe. We have good "line of sight" on the science to build one, so at this point the limiting factor is mostly funding. In a world of privately-funded space travel, such a gating obstacle shouldn't remain for long. 4 specific difficulties have been mentioned:
Salts can be corrosive to materials.
Designing for high-temperature operation is more difficult.
There has been little innovation in the field for several decades.
The differences between LFTRs and the light water reactors in majority use today are vast; the former “is not yet fully understood by regulatory agencies and officials.”
Andrew Yang has proposed a nuclear subsidy of $50B over 5 years.
He is pro-nuclear and has a deep understanding of all the technical issues around energy. Selecting for extreme competence is a real change from the Bush administration. It is not in any way a guarantee of correct energy choices, because there is still political reality.
At stake are the 100s of billions spent on meaningless levels of "safety" around nuclear power plants and waste storage, the projected costs of next-generation nuclear plant designs to reduce greenhouse gases worldwide, and the extremely harmful episodes of public panic that accompany rare radiation-release events like Fukushima and Chernobyl. (No birth defects whatever were caused by Chernobyl, but fear of them led to 100K panic abortions in the Soviet Union and Europe. What people remember about Fukushima is that nuclear opponents predicted that 100s or 1000s would die or become ill from the radiation. In fact nobody died, nobody became ill, and nobody is expected to.)
2014-02-14: You can power the world for 72 years with the nuclear waste that exists today, at a price cheaper than coal. Of course it will likely not happen due to collusion between the coal industry and the fear industrial complex.
China approved 2 reactors this month as it vowed to cut coal use to meet the terms of a CO2-emissions agreement reached in November between President Xi Jinping and US counterpart Barack Obama. About $370b will be spent on atomic power, with plans to triple nuclear capacity by 2020 to as much as 58 gigawatts.
Assuming a 25% conversion efficiency, a Radioisotope Power Source (RPS) would have 400K MJ/kg (electric) compared to 0.72 MJ/kg for Li-ion batteries. The goal is to make a 5 watt "D cell", but with nuclear power that lasts decades.
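The source doesn't say which isotope or mission length the 400K MJ/kg figure assumes. Here is a back-of-envelope sketch, assuming Pu-238 (the usual RPS fuel, roughly 0.57 W/g thermal, 87.7-year half-life) and the 25% conversion stated above, just to check the order of magnitude against Li-ion:

```python
# Back-of-envelope check of the RPS energy-density claim.
# Assumptions (not from the post): Pu-238 fuel, ~570 W/kg initial thermal
# power, 87.7-year half-life, 25% heat-to-electric conversion.
import math

P0 = 570.0                           # W/kg, assumed initial specific thermal power
HALF_LIFE_S = 87.7 * 365.25 * 24 * 3600
TAU = HALF_LIFE_S / math.log(2)      # mean lifetime, seconds
EFFICIENCY = 0.25

def electric_mj_per_kg(mission_years):
    """Thermal energy released over the mission (integrating the decay curve),
    converted to electricity at the assumed efficiency, in MJ per kg of fuel."""
    t = mission_years * 365.25 * 24 * 3600
    thermal_joules = P0 * TAU * (1 - math.exp(-t / TAU))
    return EFFICIENCY * thermal_joules / 1e6

for years in (30, 60, 1000):         # 1000 y is essentially complete decay
    print(f"{years:>4} y mission: ~{electric_mj_per_kg(years):,.0f} MJ/kg electric")
print("Li-ion battery, for comparison: 0.72 MJ/kg")
```

Even the 30-year figure lands about five orders of magnitude above Li-ion, which is the point of the comparison; the quoted 400K MJ/kg sits between the 60-year and full-decay values under these assumptions.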
Bill Gates is funding Nathan Myhrvold's TerraPower, a fast breeder reactor that burns a U-238 "duraflame log" for 60 years, with 99% efficiency vs 1% for today's U-235 reactors. No fuel to reload or waste to ship around. Existing nuclear waste could be used as fuel.
“It is the first time a comprehensive IAEA international meeting on molten salt reactors has ever taken place. Given the interest of Member States, the IAEA could provide a platform for international cooperation and information exchange on the development of these advanced nuclear systems.” Molten salt reactors operate at higher temperatures, making them more efficient in generating electricity. In addition, their low operating pressure can reduce the risk of coolant loss, which could otherwise result in an accident. Molten salt reactors can run on various types of nuclear fuel and use different fuel cycles. This conserves fuel resources and reduces the volume, radiotoxicity and lifetime of high-level radioactive waste.
2016-11-28: Making nuclear energy radically less expensive
“The big thing is that the government is making national lab resources available to private companies in a way that it wasn’t before. If you are a nuclear startup, you can only go so far before you need to do testing, and you are not going to build a nuclear test facility, because that is hard and expensive. But now you could partner with a national lab to use their experimental resources. I’ve been talking about how to set up a pathway from universities for this kind of research.”
2016-12-01: Coal-to-nuclear conversion can rapidly address 30% of CO2 emissions.
The high temperature reactors can replace the coal burners at 100s of supercritical coal plants in China. The lead of the pebble bed project indicates that China plans to replace coal burners with high temperature nuclear pebble bed reactors.
The amount of used nuclear fuel will continue to increase, reaching around 1M tons by 2050. The uranium and plutonium that could be extracted from that used fuel would be sufficient to provide fuel for at least 140 light water reactors of 1 GW capacity for 60 years. “It makes sense to consider how to turn today’s burden into a valuable resource.”
The overall cost of this first-of-a-kind nuclear plant will be in the neighborhood of $5K/kW of capacity. That number is based on signed and mostly executed contracts, not early estimates. It is 2x the initially expected cost. 35% of the increase could be attributed to higher material and component costs than initially budgeted, 31% to increases in labor costs, and the remainder to the increased costs associated with the project delays.
Zhang Zuoyi described the techniques that will be applied to lower the costs; he expects them to soon approach the $2k/kW capacity range. If this can be achieved, then the 210 MW reactor would be $525m and a 630 MW reactor would be $1.5b. It could be less if the 600 MW reactor only had to have the thermal unit and could use the turbine and other parts of an existing coal plant.
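A quick sanity check on the unit economics above; this is only a sketch, and the per-kW figures in the comments are inferred from the quoted totals rather than stated in the source:

```python
# The $/kW arithmetic behind the figures above. The per-kW inputs are inferred
# from the quoted totals (the article says "approach the $2k/kW range"), so
# treat them as illustrative rather than sourced.
def plant_cost_usd(capacity_mw, usd_per_kw):
    return capacity_mw * 1000 * usd_per_kw

print(f"210 MW at $2,500/kW: ${plant_cost_usd(210, 2500)/1e6:.0f}m")   # ~$525m, as quoted
print(f"630 MW at $2,400/kW: ${plant_cost_usd(630, 2400)/1e9:.2f}b")   # ~$1.5b, as quoted
print(f"210 MW at $5,000/kW: ${plant_cost_usd(210, 5000)/1e9:.2f}b")   # at the first-of-a-kind cost above
```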
Terrestrial Energy is leading the way in getting regulatory approvals for its molten salt fission reactor design. Terrestrial Energy aims to build the first walkaway-safe molten salt modular reactor in the late 2020s. The IMSR generates 190 MW of electric energy with a thermal-spectrum, graphite-moderated, molten-fluoride-salt reactor system. It uses standard-assay low-enriched uranium (less than 5% 235U) fuel.
Deep in the bedrock of Olkiluoto Island in southwest Finland a tomb is under construction. The tomb is intended to outlast not only the people who designed it, but also the species that designed it. It is intended to maintain its integrity without future maintenance for 100 ka, able to endure a future ice age. 100 ka ago 3 major river systems flowed across the Sahara. 100 ka ago anatomically modern humans were beginning their journey out of Africa. The oldest pyramid is around 4.6 ka old; the oldest surviving church building is fewer than 2 ka old.
This Finnish tomb has some of the most secure containment protocols ever devised: more secure than the crypts of the Pharaohs, more secure than any supermax prison. It is hoped that what is placed within this tomb will never leave it by means of any agency other than the geological.
The tomb is an experiment in post-human architecture, and its name is Onkalo, which in Finnish means “cave” or “hiding place.” What is to be hidden in Onkalo is high-level nuclear waste, perhaps the darkest matter humans have ever made.
The reams of data generated by 3D-printing parts can speed up the certification process and lower the cost of getting a nuclear reactor online.
2021-04-20: Nuclear power failed. We need to deeply understand the reasons, because there won't be an energy transition without new nuclear.
To avoid global warming, the world needs to massively reduce CO2 emissions. But to end poverty, the world needs massive amounts of energy. In developing economies, every kWh of energy consumed is worth $5 of GDP.
How much energy do we need? Just to give everyone in the world the per-capita energy consumption of Europe (which is only half that of the US), we would need to more than triple world energy production, increasing our current 2.3 TW by over 5 additional TW.
If we account for population growth, and for the decarbonization of the entire economy (building heating, industrial processes, electric vehicles, synthetic fuels, etc.), we need more like 25 TW.
The proximal cause of nuclear's flop is that it is expensive. In most places, it can't compete with fossil fuels. Natural gas can provide electricity at 7–8 c/kWh; coal at 5 c/kWh. Why is nuclear expensive? I'm a little fuzzy on the economic model, but the answer seems to be that it's in design and construction costs for the plants themselves. If you can build a nuclear plant for around $2.50/W, you can sell electricity cheaply, at 3.5–4 c/kWh. But costs in the US are around 2–3x that. (Or they were: costs are so high now that we don't even build plants anymore.)
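As a rough check on the "$2.50/W gets you 3.5–4 c/kWh" claim, here is a minimal levelized-cost sketch; the fixed charge rate, capacity factor, and O&M/fuel adder are my assumptions, not the author's:

```python
# Back-of-envelope: capital cost per watt -> cents per kWh. The fixed charge
# rate, capacity factor, and the O&M + fuel adder are assumptions, not figures
# from the article.
HOURS_PER_YEAR = 8766
FIXED_CHARGE_RATE = 0.08     # assumed annualized cost of capital, per year
CAPACITY_FACTOR = 0.90       # assumed
OM_AND_FUEL_CENTS = 1.5      # assumed adder, c/kWh

def cents_per_kwh(usd_per_watt):
    annual_usd_per_w = usd_per_watt * FIXED_CHARGE_RATE          # $ per W per year
    annual_kwh_per_w = CAPACITY_FACTOR * HOURS_PER_YEAR / 1000   # kWh per W per year
    return 100 * annual_usd_per_w / annual_kwh_per_w + OM_AND_FUEL_CENTS

for cost in (2.50, 5.00, 7.50):
    print(f"${cost:.2f}/W overnight -> ~{cents_per_kwh(cost):.1f} c/kWh")
```

Under these assumptions $2.50/W comes out around 4 c/kWh, while 2–3x that construction cost lands at 7–9 c/kWh, right where gas undercuts it, which is consistent with the argument above.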
2022-09-14: Simple reactor designs that can be iterated quickly may be the future
Much of the future lies with KRUSTY-like kilowatt-scale systems. Nuclear has a power density problem that keeps it from powering our cars and planes. The shielding and heat engines are too heavy. The radiation and particles are harmful because they contain a lot of energy. The answer is to make solid-state technologies that convert heat and radiation into electricity. It is theoretically possible to turn gamma rays into electricity with something similar to a solar cell. Shielding gets lighter and generates electricity! It also brings new life to many isotopes that require too much shielding to be practical in radioisotope generators. In the meantime, kilowatt-scale systems can compete in smaller remote power applications and supplement solar microgrids. Further cost decreases could enable electricity customers to defect from the grid where solar is not feasible. Competing manufacturers promise a much more competitive industry than exists today, where incentives rarely encourage falling prices.
The endgame is a chunk of nuclear material that can regulate itself based on user demand, surrounded by energy-capturing devices that soak up every bit of emitted energy. Power density could exceed today’s liquid fuels and batteries while having extreme energy density. We’d finally get our flying cars! Reactors that look like KRUSTY are on the path to that endgame.
2023-03-25: Nuclear has some near-fatal problems that make it a non-starter on earth. Beyond the well-known overregulation, the biggest problem is that nuclear produces relatively low temperature heat that then has to be converted to electricity, which is very inefficient. A process would have to be found to turn radiation and heat directly into electricity, without the steam turbines.
2023-07-13: How we got the current regulatory regime
In a world where industry and activists fought to a standstill, Probabilistic Risk Assessment provided the only credible guiding light. Rasmussen and team first began to compile and model relevant data in the early 1970s. Over the decades the industry’s database grew, and the NRC developed an opinion on every valve, every pipe, the position of every flashing light in a plant. This angered the utilities, who could not move a button on a control panel without reams of test data and its associated paperwork. This angered activists when the refinement of models predicted safety margins could be relaxed.
But Probabilistic Risk Assessment has no emotions. Probabilistic Risk Assessment estimated, validated, learned. Probabilistic Risk Assessment would form the barrier protecting us from catastrophe.
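For readers who haven't seen one, here is a toy version of what a probabilistic risk assessment actually computes: conditional probabilities are multiplied along each accident sequence of an event tree, and the sequences that end in core damage are summed. Every number below is invented for illustration; real PRAs involve thousands of sequences and plant-specific failure data.

```python
# Toy event tree for a single initiating event (loss of offsite power).
# All probabilities are invented for illustration only.
sequences = {
    "offsite power lost, diesels fail, batteries fail":
        [1e-2, 1e-3, 1e-2],
    "offsite power lost, diesels fail, batteries hold, cooling not restored":
        [1e-2, 1e-3, 0.99, 1e-2],
    "offsite power lost, diesels start, emergency cooling fails anyway":
        [1e-2, 0.999, 1e-4],
}

core_damage_frequency = 0.0
for name, branch_probs in sequences.items():
    freq = 1.0
    for p in branch_probs:           # multiply along the accident sequence
        freq *= p
    print(f"{name}: {freq:.1e} per year")
    core_damage_frequency += freq    # branches are mutually exclusive, so they add

print(f"total core damage frequency: {core_damage_frequency:.1e} per year")
```

The regulatory fights described above were, in effect, fights over which branch probabilities belong in tables like this one and how small the final sum has to be.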
I support the Log Format Roadmap because it has a fighting chance to become the first practical step to achieve CMS content interop. Blogs will drive adoption of the principles stated in against the grain. As a weblog vendor, I support it because it will drive the adoption of better tools, and will increase the market for everyone.
2003-06-27: Sam Ruby has been spearheading a major standardization effort in the blog world recently, and he has this to say about his motivations:
About a month ago, my interest and activity in this space kicked into high gear. I started attending weblogging conferences.
Without claiming to have been the inspiration, it is still very nice to think that OSCOM was able to contribute to the drive towards standardization. This is the stuff we are talking about.
2004-06-08: So that is what Greg Stein has been up to. The sprint was much fun, as were the drinks.
Ever since Atom first popped up, I’ve been interested in it, and even attempted to join a small sprint/discussion at Seybold last year to talk about WebDAV. The bomb threat shut that down, but we simply moved locations for drinks rather than hacking 🙂 So while I’ve been tracking it generally, my specific current interest is through my work at Google. I’m the engineering manager for the Blogger group, so I’ve gotta pay some attention to what we’re signing up for 🙂
2005-09-06: All feeds for this blog now serve Atom 1.0. It will be interesting to watch if anyone notices / cares. Longer term, /atom.xml is the canonical url if you want to subscribe.
2006-10-18: The RSS / Atom / OPML schematron is much easier to work with than the mysterious feed validator code. Plus it works for really huge feeds. This has the README for the RSS validator. Pretty out of date, but a good starting point. For one, you need the latest schematron from Rick Jelliffe, not the old one on this site.
2006-11-21: GData JSON. They also do JSONP, and reuse the Atom serialization.
2006-12-01: GData for Google Spreadsheets. The data web circle gets more complete. This is (one) counterpart to the web formulas in Google Sheets. Now as to how GData can play in the semweb space. Maybe via Queso.
2007-01-31: Tim wonders how to use Atom categories properly. Link to the Wikipedia url of the tag, I'd say.
2007-02-14: If you browse to a page with an RSS or Atom feed, you get the option to immediately add that feed to Google Reader for mobile via Mobile Proxy Feed Discovery.
2007-03-31: Some Atom extensions by Nature to encourage text mining. I don't know… They do not seem to reuse core Atom in their examples. Plus I am not sure how useful a word count really is.
2007-05-16: GData Server
Generally speaking, the Lucene GData Server is an extensible syndication format server providing CRUD actions to alter feed content, authentication, optimistic concurrency and full text search based on Apache Lucene.
2007-05-25: APP frontend to LDAP. This might enable some interesting scenarios.
2007-06-01: OpenSearch / Atom interface for Swiss whitepages. Nice!
We have developed an interface for our phone book which allows our telephone data to be integrated into other applications or websites. The interface is based on the REST concept. Results are delivered as an Atom feed, extended with OpenSearch- and tel.search.ch-specific fields. With the help of a key, the results are also returned in structured form. The number of results is limited to 200 entries per query.
Gregor Rothfuss wondered whether I couldn't influence people at Microsoft to also standardize on GData. The fact is that I've actually tried to do this with different teams on multiple occasions, and each time I've tried, certain limitations…
2007-06-10: Oy. And all this because I asked Dare why Microsoft doesn’t use APP.
There was quite a flurry of blogging about the Atom Publishing Protocol (APP) over the weekend, all kicked off by Dare Obasanjo’s criticisms of the protocol. Some of the posts were critical of Dare and his motives, but I’m thankful he started the conversation.
2007-07-26: The chorus for putting more REST into GIS / mapping gets louder, yay
The only thing needed to bring together this messy new world Atlas is a global agreement about the structure of the data used to annotate the maps, as well as agreement on the format for retrieving it.
2007-07-28: WFS simple was hijacked, as usual, by people who don’t understand why worse is better. This is why I am not in the least interested in WFS and am betting on APP instead.
if the geospatial standards community continues on this path of isolating itself, of looking upstream to the ISO rather than downstream to the distributed neogeo developer community, it will miss out on being connected to amazing things.
Here’s a Feature Demo of a RESTful WFS-T with a call for GE to support posting of features. I would go further and ask for APP support.
Version control for Collaborative Mapping. Calls for diffs and patches. Might be built on top of an APP infrastructure, imho
The next major area of tool improvement I see is expanding the wiki notion of editing to more of a merging revision control model, with branches, versions, patches and eventually expanding in to distributed repositories. The ‘patch‘ is a small piece of code that can be applied to a computer program to fix something. They are widely used in the open source software world, both to get the latest improvements, and to allow those who have commit rights to a source repository to review outside improvements before putting them in. This helps create the meritocracy around projects, as they don’t let just anyone in to the repository as they might break the build. Such a case is less likely with maps, but sometimes core contributors might want to see a couple sample patches before letting a new member in. In the GeoServer versioning WFS work we have a GetDiff operation that returns a WFS Transaction that can then be applied to another WFS. This fits in with the technical part of how a patch works – they’re really easy to apply to one’s dataset. But unfortunately a WFS transaction is not as easy to read as a code patch. The other great thing about patches is that when leaf nodes are updating their data they can just request the change set – the patches – instead of having to do a full check out. So I’m still not sure how to solve this problem, the WFS Transaction is the best I’ve got, but I think we can do better, have a nice little format that just describes what changed.
Better UIs for Collaborative Mapping. More calls for rollback tools, and would like to see GE post to GeoServer, etc.
I think we need more user friendly options for collaborative editing. Not just putting some points on a map, but being able to get a sense of the history of the map, getting logs of changes and diffs of certain actions. Editing should be a breeze, and there should be a number of tools that enable this. Google’s MyMaps starts to get at the ease of editing, but I want it collaborative, able to track the history of edits and give you a visual diff of what’s changed. Rollbacks should also be a breeze – if you have really easy tools to edit it’s also going to be easier for people to vandalize. So you need to make tools that are even easier to rollback.
AtomPub sits in a very strange place, as it has the potential to disrupt 6 or more industry sectors, such as Enterprise Content Management, Blogging, Digital/Desktop Publishing and Archiving, Mobile Web, EAI/WS-* messaging, Social Networks, and Online Productivity tools. As interesting as the adoption rates will be the people and sectors finding reasons not to use it, to protect distribution channels and data lock-ins with more complicated solutions. Any kind of data garden is fair game for AtomPub to rationalize.
Why Digital Signature? This idea was first proposed by James Snell, and it’s a good one. Mind you, the benefits are a little bit theoretical, since no feed-reading clients that I’ve seen actually check a digital signature. The argument for this is similar to that for TLS; a bad guy who could somehow insert a fake press release into the feed could make zillions by gaming the share price. A verifiable digital signature would let someone reading the feed know that the news in it really truly did come from Sun.
2007-07-31: Atom for KML. Nice. I want to do more, but this is a good start. The Atom / KML meme spreads. Perception is reality, and I approve.
2007-08-03: appfs
appfs can mount remote resources exposed via the Atom Publishing Protocol as a local filesystem.
2007-08-07: RESTful partial updates. Maybe useful for APP / KML to supplement update
over the past couple of months, there’s been a lot of discussion about the problem of partial updates in REST-over-HTTP. The problem is harder than it appears at first glance. The canonical scenario is that you’ve just retrieved a complicated resource, like an address book entry, and you decide you want to update just one small part, like a phone number. The canonical way to do this is to update your representation of the resource and then PUT the whole thing back, including all of the parts you didn’t change. If you want to avoid the lost update problem, you send back the ETag you got from the GET with your PUT inside an If-Match: header, so that you know that you’re not overwriting somebody else’s change.
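A minimal sketch of that read-modify-write cycle using Python's requests library; the endpoint and JSON shape are hypothetical, but the ETag / If-Match mechanics are exactly the ones described in the quote:

```python
# Avoiding the lost-update problem with ETag / If-Match, as described above.
# The URL and payload shape are made up for illustration.
import requests

url = "https://example.org/contacts/42"

# GET the full representation and remember its ETag.
resp = requests.get(url)
resp.raise_for_status()
contact = resp.json()
etag = resp.headers["ETag"]

# Change just the part we care about...
contact["phone"] = "+1-555-0100"

# ...but PUT the whole representation back, conditional on the ETag.
update = requests.put(url, json=contact, headers={"If-Match": etag})

if update.status_code == 412:   # Precondition Failed
    print("Somebody else changed the resource first; re-fetch and retry.")
else:
    update.raise_for_status()
    print("Updated without clobbering anyone else's change.")
```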
Zend Google Data Client
The Zend Google Data Client provides a PHP 5 component to execute queries and commands against the Google Data APIs.
2007-08-14: Winer on Atom. Sore loser.
2007-08-19: How to deal with the sliding window problem where feed producers update more often than consumers, and consumers thus might miss entries.
A standardized way to get at previous entries that have scrolled out of a feed, and at the complete archive.
2007-08-28: YouTube GData. Nice to see more media-heavy usages. Now we have pretty much all of them, only KML is missing.
2007-10-29: APP Lock-In. So cute. Microsoft is in a tight spot: admit they have no strategy and use APP, or invent their own. It seems they are trying to build a case to do just that.
It seems that while we weren't looking, Google moved us a step away from a world of simple, protocol-based interoperability on the Web to one based on running the right platform with the right libraries. Usually I wouldn't care about whatever bad decisions the folks at Google are making with their API platform. However, the problem is that it sends out the wrong message to other Web companies that are building Web APIs. The message that it's all about embracing and extending Internet standards, with interoperability based on everyone running sanctioned client libraries instead of on simple, RESTful protocols, is harmful to the Internet. Unfortunately, this harkens back to the bad old days of Microsoft, and I'd hate for us to begin a race to the bottom in this arena.
2007-12-06: FeedSync. The full syncing requirement makes this heavyweight.
Although FeedSync is capable of full-blown multi-master synchronization, there are all kinds of interesting uses, including simple one-way uses. Consider, for example, how RSS typically has no memory. Most blogs publish items into a rolling window. If you subscribe after items have scrolled out of view, you can't syndicate them. A FeedSync implementation could enable you to synchronize a whole feed when you first subscribe, then update items moving forward. It could also enable the feed provider to delete items, which you might not want if the items are blog postings, but would want if they're calendar items representing cancelled events.
I'm still reading A New Kind of Science. One concept that Wolfram advocates is that of Finite Nature.
The basic idea is that of ‘Finite Nature’. There is some discussion of the exact meaning of this phrase; my preferred definition is, the proposition that a finite quantity of space/time, containing a finite amount of matter/energy, is capable (in principle) of being simulated EXACTLY using a finite amount of computing power on a Universal Turing Machine. The implications of the Finite Nature concept are pretty far-reaching. Among other things, any physical principle that is presently theorized to be based on continuous functions of any kind, must be supposed to actually be an approximation of an underlying computational process. This reverses the normal relationship, where digital processes (such as CA’s, or generally any quantized matrix calculations such as those used in computational dynamics) are seen as approximations of an underlying, continuous reality.
Although Wolfram has his detractors, his book is highly inspirational. He certainly has done much to publicize the field of digital philosophy.
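The archetypal exhibit in the book is the elementary cellular automaton: a fully discrete, exactly computable process of the kind the Finite Nature idea takes as fundamental, yet one that produces surprisingly complex behavior. A minimal sketch (Rule 30, printed as ASCII art):

```python
# Elementary cellular automaton (Rule 30): an exactly computable, fully
# discrete process, started from a single seed cell, with wraparound edges.
RULE = 30
WIDTH, STEPS = 63, 30

cells = [0] * WIDTH
cells[WIDTH // 2] = 1   # single seed cell in the middle

for _ in range(STEPS):
    print("".join("#" if c else "." for c in cells))
    # New state of each cell is looked up from the rule number using the
    # 3-bit neighborhood (left, self, right) as an index.
    cells = [
        (RULE >> (cells[(i - 1) % WIDTH] * 4 + cells[i] * 2 + cells[(i + 1) % WIDTH])) & 1
        for i in range(WIDTH)
    ]
```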
It is time the academic review process was made more transparent. Bertrand Meyer on why reviews could learn from open source and weblogs. If most of your thoughts are in the public record, it affects your thought processes. For the better, I think.
It is widely believed that anonymous refereeing helps fairness, by liberating reviewers from the fear that openly stated criticism might hurt their careers. In my experience, the effects of anonymity are worse than this hypothetical damage. Referees too often hide behind anonymity to turn in sloppy reviews; worse, some dismiss contributions unfairly to protect their own competing ideas or products. Even people who are not fundamentally dishonest will produce reviews of unsatisfactory quality out of negligence, laziness or lack of time because they know they can’t be challenged. Putting your name on an assessment forces you to do a decent job.
The day you find a more interesting paper in Citeseer than in any IEEE or ACM e-journal is the day you don't look back. But what would happen next? What about the academic symbiosis with the peer review system? Citeseer uses a Google PageRank-like algorithm for ranking: it analyzes the properties of the citation network topology to understand which papers are more influential than others. Just like Google does with hyperlinks for web pages, Citeseer does it for bibliographic citations: the result is that peer review is not done by a panel of experts, but by every researcher in the field!
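A minimal sketch of that ranking idea: power-iteration PageRank over a tiny, invented citation graph, so that a citation from an influential paper counts for more than one from an obscure paper.

```python
# Toy PageRank over a citation graph (papers and citations are invented).
# A paper's score is fed by the papers that cite it, weighted by how
# influential those citing papers are. Dangling-node corrections from the
# full algorithm are omitted to keep the toy short.
citations = {             # paper -> papers it cites
    "A": ["B", "C"],
    "B": ["C"],
    "C": [],
    "D": ["A", "C"],
}

damping = 0.85
papers = list(citations)
rank = {p: 1.0 / len(papers) for p in papers}

for _ in range(50):       # power iteration
    new_rank = {}
    for p in papers:
        incoming = sum(
            rank[q] / len(citations[q]) for q in papers if p in citations[q]
        )
        new_rank[p] = (1 - damping) / len(papers) + damping * incoming
    rank = new_rank

for paper, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{paper}: {score:.3f}")
```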
what strikes me is how much pomp, circumstance and apparatus academia requires in order to frame even a very small and simple point. References to everything in the literature ever said on any vaguely related topic, detailed comparisons of your work to whatever it is the average journal referee is likely to find important — blah, blah, blah, blah, blah…. A point that I would more naturally get across in 5 pages of clear and simple text winds up being a 30 page paper!
I’m writing some books describing the Novamente AI system — one of them, 600 pages of text, was just submitted to a publisher. The other 2, 300 and 200 pages respectively, should be submitted later this year. Writing these books took a really long time but they are only semi-technical books, and they don’t follow all the rules of academic writing — for instance, the whole 600 page book has a reference list no longer than I’ve seen on many 50-page academic papers, which is because I only referenced the works I actually used in writing the book, rather than every relevant book or paper ever written. I estimate that to turn these books into academic papers would require me to write 60 papers. To sculpt a paper out of text from the book would probably take me 2-7 days of writing work, depending on the particular case. So it would be at least 1 full year of work, probably 2 full years of work, to write publishable academic papers on the material in these books!
The lack of risk-taking is particularly evident in computer science:
Furthermore, if as a computer scientist you develop a new algorithm intended to solve real problems that you have identified as important for some purpose (say, AI), you will probably have trouble publishing this algorithm unless you spend time comparing it to other algorithms in terms of its performance on very easy “toy problems” that other researchers have used in their papers. Never mind if the performance of an algorithm on toy problems bears no resemblance to its performance on real problems. Solving a unique problem that no one has thought of before is much less impressive to academic referees than getting a 2% better solution to some standard “toy problem.” As a result, the whole computer science literature (and the academic AI literature in particular) is full of algorithms that are entirely useless except for their good performance on the simple “toy” test problems that are popular with journal referees.
His first scenario makes me wonder if amateur scientists could again make meaningful contributions to research, combined with a wiki-like process that (hopefully) would identify promising directions better than today’s peer reviews:
And so, those of us who want to advance knowledge rapidly are stuck in a bind. Either generate new knowledge quickly and don't bother to ram it through the publication mill … or, generate new knowledge at the rate that's acceptable in academia, and spend 50% of your time wording things politically and looking up references and doing comparative analyses rather than doing truly productive creative research.
2006-12-31: The trend towards cross-disciplinary research is getting stronger. This is very good news for this dabbler 😉
There is an increasing coalescence of scientific disciplines in many areas. Thus the discovery of the structure of the genome not only required contributions from parts of biology, physics, chemistry, mathematics, and information technology, but in turn it led to further advances in biology, physics, chemistry, technology, medicine, ecology, and even ethics. And all this scientific advance is leading, as it should, to the hopeful betterment of the human condition (as had been also one of the platform promises of the Unity of Science movement, especially in its branch in the Vienna Circle).
Similar developments happen in the physical sciences—a coalescence of particle physics and large-scale astronomy, of physics and biology, and so forth. It is a telling and not merely parochial indicator that ~50% of my 45 colleagues in my Physics Department, owing to their widespread research interests, now have joint appointments with other departments at the University: with Molecular and Cellular Biology, with Mathematics, with Chemistry, with Applied Sciences and Engineering, with History of Science. Just now, a new building is being erected next to our Physics Department. It has the acronym LISE, which stands for the remarkable name, Laboratory of Integrated Science and Engineering. Although in industry, here and there, equivalent labs have existed for years, the most fervent follower of the Unity of Science movement would not have hoped then for such an indicator of the promise of interdisciplinarity. But as the new saying goes, most of the easy problems have been solved, and the hard ones need to be tackled by a consortium of different competences.
Delicious is the Rome, Jerusalem, and Paris of my existence as a researcher these days. It’s where I make my friends, how I get the news, and where I go to trade.
Our goal is to make collaboration and open source come to life in the field of clinical research. With our partners, we will identify specific projects where the sharing of information will lead to better, more accurate research.
2007-10-10: Publication bias: only positive correlations are published. Time for a big expert system to feed the data from all experiments in.
2007-12-14: Creative Commons
Nature Magazine announced that it's going to share all its human genome papers under Creative Commons Attribution-NonCommercial-ShareAlike licenses. The genomes themselves are not copyrightable and go into a public database, but the papers — which are a vital part of the science — may now be freely copied by any non-commercial publisher.
2009-03-11: Strangest college courses. Can’t let Sarah find this list!
15. Arguing with Judge Judy: Popular ‘Logic’ on TV Judge Shows
14. Underwater Basket Weaving
13. Learning From YouTube
12. Philosophy and Star Trek
11. The Art of Walking
10. Daytime Serials: Family and Social Roles
9. Joy of Garbage
8. The Science of Superheroes
7. Zombies in Popular Media
6. The Science of Harry Potter
5. Cyberporn and Society
4. Simpsons and Philosophy
3. Far Side Entomology
2. Myth and Science Fiction: Star Wars, The Matrix, and Lord of the Rings
1. The Strategy of StarCraft
2009-03-12: Subsidizing intellectually challenged people to go to college just exacerbates the glut of useless degrees.
Scholars in their 60s are not producing path-breaking new research, but they are the people that tenure protects. Scholars in their 20s have no academic freedom at all.
2011-10-28: Makes the argument that #ows is people with useless degrees and a lot of student loan debt. The useless degree bubble is definitely one to pop.
But the lower tier of the New Class — the machine by which universities trained young people to become minor regulators and then delivered them into white collar positions on the basis of credentials in history, political science, literature, ethnic and women’s studies — with or without the benefit of law school — has broken down. The supply is uninterrupted, but the demand has dried up. The agony of the students getting dumped at the far end of the supply chain is in large part the OWS. As Above the Law points out, here is “John,” who got out of undergrad, spent a year unemployed and living at home, and is now apparently at University of Vermont law school, with its top ranked environmental law program — John wants to work at a “nonprofit.”
Even more frightening is the young woman who graduated from UC Berkeley, wanting to work in "sustainable conservation." She is now raising chickens at home, dyeing wool and knitting knick-knacks to sell at craft fairs. Her husband has been studying criminal justice and EMT – i.e., preparing to work for government in some of California's hitherto most lucrative positions – but as those work possibilities have dried up, he is hedging with a (sensible) apprenticeship as an electrician. These young people are looking at serious downward mobility, in income as well as status. The prospects of the lower tier New Class semi-professionals are dissolving at an alarming rate. Student loan debt is a large part of its problems, but that's essentially a cost question accompanying a loss of demand for these professionals' services.
2013-04-12: You can observe the useless degrees epidemic every day in Williamsburg.
At present, a high % of high school graduates continue to enroll in college, often taking on punishing levels of debt in the process. As nearly any recent college graduate knows, many of these people end up working in lower wage service jobs (baristas, for example). I think it is entirely possible that future high school graduates will begin to look at these outcomes and begin shying away from college.
2014-02-09: The idea that everyone should go to college is completely wrong.
Student population: now in decline; Admissions: harder & harder to make targets; Finance: banks not willing to take on more student loans; Research: shift from tenure to adjuncts.
I don’t think MOOCs are the solution. The premise that “everyone” should have a university education is wrong, and deprives people of quality apprenticeships.
On top of that, there are massive problems with science itself. Academic literature is a relic of the era of typesetting, modeled on static, irrevocable publication. A dependency graph would tell us, at a click, which of the pillars of scientific theory are truly load-bearing. Revision control (git anyone?) would allow for a much more mature way to incorporate improvements from anyone.
2014-06-12: From Upworthy University:
You NEED to see this hot model (NSFW) of ethnic politics and foreign policy
These squirrels were forced to migrate south because of the Ice Age. What happened next WILL SHOCK YOU
2014-07-01: Amusing given how many overeducated people with useless degrees already work at Starbucks. I don’t think more degrees is the answer.
Starbucks will provide a free online college education to 1000s of its workers, without requiring that they remain with the company
Published p-values cluster suspiciously around the 0.05 level, suggesting that some degree of p-hacking is going on. This is often described as "torturing the data until it confesses".
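A minimal simulation of the mechanism behind that clustering: even with no true effect at all, testing several outcomes per study and reporting only the best p-value inflates the nominal 5% false-positive rate several-fold. The setup below is invented for illustration.

```python
# Simulating p-hacking: run several null outcome measures per "study" and
# keep only the smallest p-value. With no true effect, far more than 5%
# of studies come out "significant".
import random
import statistics
from math import erf, sqrt

def p_value_two_sample(a, b):
    """Two-sided z-test approximation for equal-size samples."""
    n = len(a)
    se = sqrt(statistics.variance(a) / n + statistics.variance(b) / n)
    z = (statistics.mean(a) - statistics.mean(b)) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

def one_study(n=30, outcomes=5):
    """Return the best (smallest) p-value across several null outcomes."""
    best = 1.0
    for _ in range(outcomes):
        a = [random.gauss(0, 1) for _ in range(n)]
        b = [random.gauss(0, 1) for _ in range(n)]
        best = min(best, p_value_two_sample(a, b))
    return best

random.seed(0)
pvals = [one_study() for _ in range(2000)]
print("fraction of 'significant' studies:", sum(p < 0.05 for p in pvals) / len(pvals))
```

With five outcomes per study the expected "hit rate" is roughly 1 - 0.95^5, around 23%, which is exactly the kind of excess mass that piles up just below 0.05 in the published literature.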
Nature has decided to add an option for double-blind peer review – papers would be sent to the referees without author names or institutional affiliations on them. Papers from Big Names won't bother, because the authors have more to lose by being covered up. So the double-blinded papers might end up disproportionately from smaller groups who are trying to even the playing field, and it risks becoming a negative signal. It might be better if Nature were to take the plunge and blind everything.
2015-05-08: Or you could have saved yourself $250K and gotten a useful education right from the start.
In a Boston basement that houses a new kind of vocational training school, Katy Feng is working harder than she ever did at Dartmouth College. The 22-year-old graduated last year with a bachelor's degree in psychology and studio art that cost more than $250K. She sent out 10s of résumés looking for a full-time job in graphic design but wound up working a contract gig for a Boston clothing store. "I thought, they'll see Dartmouth, and they'll hire me. That's not really how it works, I found." She figures programming is the best way to get the job she wants. Hence the basement, where she's paying $11,500 for a 3-month crash course in coding.
Actually, no, let’s not do that, and just let people hold basic jobs even if they don’t cough up $100K from somewhere to get a degree in Medieval History? Sanders’s plan would subsidize the continuation of a useless tradition that has turned into a speculation bubble, prevent the bubble from ever popping, and disincentivize people from figuring out a way to route around the problem
2015-06-08: Scientific publishing is stuck in the 18th century.
to a great extent the internet is used as a PDF delivery device by many publishers, and the PDF is an electronic form of the classic paper journal article, whose basic outlines were established in the 17th and 18th centuries. In other words, in a qualitative sense we’re not that much beyond the Age of Newton and the heyday of the Royal Society. Scientific publishing today is analogous to “steampunk.” An anachronistic mix of elements somehow persisting deep into the 21st century. Its real purpose is to turn the norms of the past into cold hard cash for large corporations.
Our research suggests that open access policies have a tremendous impact on the diffusion of science to the broader general public through an intermediary like Wikipedia
Principal Deputy Manager of the Subcommittee for Athletic Communications
Assistant Vice Dean of the Committee on Neighborhood Compliance
Lead Associate Provost for the Office of Dining Compliance
2016-07-17: This Open Science Framework is excellent; it starts peer review during the experiment design phase when it's still cheap to make corrections, rather than when it's all done.
2017-02-01: Scientific fraud is rampant
Hartgerink is 1 of only a handful of researchers in the world who work full-time on the problem of scientific fraud – and he is perfectly happy to upset his peers. “The scientific system as we know it is pretty screwed up. I’ve known for years that I want to help improve it. Statcheck is a good example of what is now possible”. The top priority is something much more grave than correcting simple statistical miscalculations. He is now proposing to deploy a similar program that will uncover fake or manipulated results – which he believes are far more prevalent than most scientists would like to admit.
2017-04-15: I wonder how many people ended up graduating with their head trauma degrees?
It was a bold new idea: an all-sports college, classes be damned. But for the athletes at Forest Trail Sports University who faced hunger, sickness and worse, it turned into a nightmare.
Scientific revolutions occur on all scales, but here let’s talk about some of the biggies:
1850-1950: Darwinian revolution in biology, changed how we think about human life and its place in the world.
1890-1930: Relativity and quantum revolutions in physics, changed how we think about the universe.
2000-2020: Replication revolution in experimental science, changed our understanding of how we learn about the world.
We are in the middle of a scientific revolution involving statistics and replication in many areas of science, moving from an old paradigm in which important discoveries are a regular, expected product of statistically-significant p-values obtained from routine data collection and analysis, to a new paradigm of . . . weeelll, I’m not quite sure what the new paradigm is.
Progress itself is understudied. By “progress,” we mean the combination of economic, technological, scientific, cultural, and organizational advancement that has transformed our lives and raised standards of living over the past couple of centuries. For a number of reasons, there is no broad-based intellectual movement focused on understanding the dynamics of progress, or targeting the deeper goal of speeding it up. We believe that it deserves a dedicated field of study. We suggest inaugurating the discipline of “Progress Studies.”
But the scientific journal as we know it was actually born because of popular demand for information during a pandemic.
In the early 1820s, a smallpox outbreak struck Paris and other French cities. A new vaccine was in existence at the time, but reports varied about how effective it was. A powerful medical institution in Paris, the Académie de Médecine, gathered its members to discuss what advice it should issue to the nation. Historically, such meetings were held privately, but the French Revolution had ushered in a new era of government accountability, and journalists were allowed to attend. The scientific debate they relayed upset some members of the Académie, which had hoped to make a clear, unified statement. In response, the Académie sought to regain control of its message by publishing its own weekly accounts of its discussions, which evolved into the academic journals we know today.
The massive effort to develop vaccines for COVID-19 will boost and accelerate the development of cancer vaccines. Streamlined regulatory approvals will speed up the approval of all new medical treatments. Elon Musk is successfully executing a transformation to fully reusable rockets, mass-produced satellites, electric cars, electric trucks, and self-driving vehicles, and will continue to execute and win.
In just 3 months, 1 British research team identified the first life-saving drug of the pandemic (and helped cancel hydroxychloroquine). The RECOVERY trial has an adaptive design, built to evaluate 6 different drugs at once, with methods and goals announced in advance.
Science funding mechanisms are too slow in normal times and may be much too slow during the COVID-19 pandemic. Fast Grants are an effort to correct this. If you are a scientist at an academic institution currently working on a COVID-19 related project and in need of funding, we invite you to apply for a Fast Grant. Fast Grants are $10k to $500k and decisions are made in under 14 days. If we approve the grant, you’ll receive payment as quickly as your university can receive it.
2021-06-15: They were a big success, here’s what they learned:
The first round of grants were given out within 48 hours. Later rounds of grants, which often required additional scrutiny of earlier results, were given out within 2 weeks. These timelines were much shorter than the alternative sources of funding available to most scientists. Grant recipients were required to do little more than publish open access preprints and provide monthly 1-paragraph updates. We allowed research teams to repurpose funds in any plausible manner, as long as they were used for research related to COVID-19. Besides the 20 reviewers, from whom perhaps 20-40 hours each was required, the total Fast Grants staff consisted of 4 part-time individuals, each of whom spent a few hours per week on the project after the initial setup.
We found it interesting that relatively few organizations contributed to Fast Grants. The project seemed a bit weird and individuals seemed much more willing to take the “risk”. We were very positively surprised at the quality of the applications. We didn’t expect people at top universities to struggle so much with funding during the pandemic. 32% said that Fast Grants accelerated their work by “a few months”. 64% of respondents told us that the work in question wouldn’t have happened without receiving a Fast Grant. We were disappointed that very few trials actually happened. This was typically because of delays from university institutional review boards (IRBs) and similar internal administrative bodies that were consistently slow to approve trials even during the pandemic. In our survey of the scientists who received Fast Grants, 78% said that they would change their research program “a lot” if their existing funding could be spent in an unconstrained fashion. We find this number to be far too high: the current grant funding apparatus does not allow some of the best scientists in the world to pursue the research agendas that they themselves think are best. Scientists are in the paradoxical position of being deemed the very best people to fund in order to make important discoveries but not so trustworthy that they should be able to decide what work would actually make the most sense!
Another effect of ASU's pragmatic research culture is reducing overspecialization among academic disciplines. Crow and Dabars recognize that "specialization has been the key to scientific success" and that disciplines have historic value, but they worry that "such specialization simultaneously takes us away from any knowledge of the whole," leaving us ill-prepared for the future. Disciplines make it harder to synthesize information, and if a university wants to remain a flexible knowledge enterprise, it needs to be prepared to take a holistic approach. During Crow's tenure, ASU consolidated quite a few academic departments, such as history and political science, or English and modern languages, in order to encourage these fields to solve real problems together – which also had the added benefit of reducing administrative costs. Instead of having to apply to ASU first, a student can start by doing the coursework, then enroll when they're ready, with part of their degree already completed. This feels well-aligned to me with tech's focus on prioritizing output over credentials. It's also worth noting that because this experiment lives in Learning Enterprise, it doesn't detract from the more traditional degree work that lives under Academic Enterprise. ASU also seems to share a lot of values that I cherish about tech, such as optimism, entrepreneurialism, and responsiveness, as well as being results-oriented, and it appears to be a culture that's enforced from the top. While ASU might not run as fast as a startup, their culture of testing, prototyping, and reinvention seems rare for such a large institution.
2021-11-18: A fascinating look at all the research around ivermectin, and what lessons to draw from it about the state of science
This is one of the most carefully-pored-over scientific issues of our time. 10s of teams published studies saying ivermectin definitely worked. Then most scientists concluded it didn’t. What a great opportunity to exercise our study-analyzing muscles! To learn stuff about how science works which we can then apply to less well-traveled terrain! If the lesson of the original replication crisis was “read the methodology” and “read the preregistration document”, this year’s lesson is “read the raw data”. Which is a bit more of an ask. Especially since most studies don’t make it available.
I worked on biomedical literature search, discovery and recommender web applications for many months and concluded that extracting, structuring or synthesizing “insights” from academic publications (papers) or building knowledge bases from a domain corpus of literature has negligible value in industry. Close to nothing of what makes science actually work is published as text on the web. Research questions that can be answered logically through just reading papers and connecting the dots don’t require a biotech corp to be formed around them. There’s much less logic and deduction happening than you’d expect in a scientific discipline.
Now founders and investors—including tech CEOs, crypto billionaires, bloggers, economists, celebrities, and scientists—are coming together to address stasis with experimentation. They’re building a fleet of new scientific labs to speed progress in understanding complex disease, extending healthy lifespans, and uncovering nature’s secrets in long-ignored organisms. In the process, they’re making research funding one of the hottest spaces in Silicon Valley.
Arc Institute
Problem: U.S. science funding attaches too many strings to our best researchers, preventing them from working on the most interesting problems.
Solution: Arc gives scientists no-strings-attached, multiyear funding so that they don’t have to apply for external grants.
Arcadia Science
Problem: Modern science is too siloed—both because researchers are too narrowly focused and because peer-reviewed journals stymie collaboration.
Solution: Expand the menu of species that we deeply research—and embrace an open-science policy.
New Science
Problem: Science is getting old, fast.
Solution: New Science sponsors young scientists.
Bringing all method advances together:
Altogether, these examples are largely restricted to specific disciplines: while research in genomics has pioneered the use of massive open databases, it rarely contains robustness checks or the pre-registration of methods. While the methods of clinical trials are required to be pre-registered, their analysis code is rarely shared publicly.
We believe that good practices from individual fields should serve as models for how science ought to work across the board, and that the scientific process should be radically reassembled from start to finish. How would it work?
To begin with, scientists would spend far more time clarifying the theories they are studying – developing appropriate measures to record data, and testing the assumptions of their research – as the meta-scientist Anne Scheel and others have suggested. Scientists would use programs such as DeclareDesign to simulate data, and test and refine their methods.
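DeclareDesign itself is an R package, but the core loop it supports is easy to sketch. Here is a minimal Python version of the idea – declare a design, simulate data under assumed parameters, and diagnose its power before collecting any real data. The effect size, noise, and sample sizes below are illustrative assumptions, not anything from the original proposal.

```python
# Minimal declare -> simulate -> diagnose sketch: assume an effect size and noise
# level, simulate many two-arm experiments, and estimate statistical power so the
# design can be refined before any real data exists. All numbers are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def simulate_power(n_per_arm=50, effect=0.3, sd=1.0, alpha=0.05, n_sims=2000):
    """Fraction of simulated experiments whose t-test rejects the null."""
    rejections = 0
    for _ in range(n_sims):
        control = rng.normal(0.0, sd, n_per_arm)
        treated = rng.normal(effect, sd, n_per_arm)
        _, p = stats.ttest_ind(treated, control)
        rejections += p < alpha
    return rejections / n_sims

# Refine the design before running it: how big a sample does this effect need?
for n in (25, 50, 100, 200):
    print(n, round(simulate_power(n_per_arm=n), 2))
```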
Instead of writing research in the form of static documents, scientists would record their work in the form of interactive online documents (such as in Markdown format). Past versions of their writing and analysis would be fully available to view through platforms such as Git and OSF. Robustness checks and multiverse analysis would be the norm, showing readers the impact of various methodological decisions interactively.
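A minimal sketch of what a multiverse analysis boils down to: rerun the same estimate under every combination of defensible analytic decisions and show the whole spread of results, rather than one hand-picked specification. The toy dataset and the decision grid here are made-up assumptions for illustration only.

```python
# Multiverse-analysis sketch: the same estimate under every combination of
# plausible analytic choices. Data and choices are illustrative assumptions.
from itertools import product
import numpy as np

rng = np.random.default_rng(1)
age = rng.uniform(18, 70, 500)
outcome = 0.2 * (age > 30) + rng.normal(0, 1, 500)   # toy dataset

def estimate(exclude_under_21, age_cutoff, winsorize):
    keep = age >= 21 if exclude_under_21 else np.ones_like(age, bool)
    y = outcome[keep]
    if winsorize:
        y = np.clip(y, np.quantile(y, 0.05), np.quantile(y, 0.95))
    older = age[keep] > age_cutoff
    return y[older].mean() - y[~older].mean()

results = {
    choices: estimate(*choices)
    for choices in product([False, True], [25, 30, 35], [False, True])
}
for choices, effect in results.items():
    print(choices, round(effect, 3))
```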
Once research is freed from the need to exist in static form, it can be treated as if it were a live software product. Analysis code would be standardized in format and regularly tested by code-checkers, and data would be stored in machine-readable formats, which would enable others to quickly replicate research or apply methods to other contexts. This would also make it easy to apply new methods to old data.
Some types of results would be stored in mass public databases with entries that would be updated if needed, and other researchers would reuse their results in further analysis. Citations would be viewer-friendly, appearing as pop-ups that highlight passages or refer to specific figures or datasets in prior research (each with its own DOI), and these citations would be automatically checked for corrections and retractions.
Peer review would operate openly, where the wider scientific community and professional reviewers would comment on working papers, Red Teams would be contracted to challenge research, and comments and criticisms of studies would be displayed alongside them on platforms such as PubPeer. Journals would largely be limited to aggregating papers and disseminating them in different formats (for researchers, laypeople and policymakers); they would act, perhaps, in parallel with organizers of conferences. They could perform essential functions such as code-checking and copy-editing, working through platforms such as GitHub.
2022-02-07: Why Isn’t There a Replication Crisis in Math?
There’s a lot to say about the mathematics we use in social science research, especially statistically, and how bad math feeds the replication crisis. But I want to approach it from a different angle. Why doesn’t the field of mathematics have a replication crisis? And what does that tell us about other fields, that do? 1 of the distinctive things about math is that our papers aren’t just records of experiments we did elsewhere. In experimental sciences, the experiment is the “real work” and the paper is just a description of it. But in math, the paper, itself, is the “real work”. Our papers don’t describe everything we do, of course. But the paper contains a (hopefully) complete version of the argument that we’ve constructed. And that means that you can replicate a math paper by reading it. Mathematicians have a pretty good idea of what results should be true; but so do psychologists! Mathematicians sometimes make mistakes, but since they’re mostly trying to prove true things, it all works out okay. Social scientists are also (generally) trying to prove true things, but it doesn’t work out nearly so well. Why not? In math, a result that’s too good looks just as troubling as one that isn’t good enough. If a study suggests humans aren’t capable of making reasoned decisions at 11:30, it’s confounded by something, even if we don’t know what.
2022-03-06: There’s much less of a replication crisis in Biology:
In biology, when one research team publishes something useful, then other labs want to use it too. Important work in biology gets replicated all the time—not because people want to prove it’s right, not because people want to shoot it down, not as part of a “replication study,” but just because they want to use the method. So if there’s something that everybody’s talking about, and it doesn’t replicate, word will get out.
To avoid rent dissipation and risk aversion, our state funding of science should be simplified and decentralized into Researcher Guided Funding. Researcher Guided Funding would take the ~$120b spent by the federal government on science each year and distribute it equally to the 250k full-time research and teaching faculty in STEM fields at high research activity universities, who already get 90% of this money. This amounts to about $500k for each researcher every year. You could increase the amount allocated to some researchers while still avoiding dissipating resources on applications by allocating larger grants in a lottery that only some of them win each year. 60% of this money can be spent pursuing any project they want, with no requirements for peer consensus or approval. With no strings attached, Katalin Karikó and Charles Townes could use these funds to pursue their world-changing ideas despite doubt and disapproval from their colleagues. The other 40% would have to be spent funding projects of their peers. This allows important projects to gain a lot of extra funding if a group of researchers is excited about them. With over 5000 authors on the paper chronicling the discovery of the Higgs boson at the Large Hadron Collider, this group of physicists could muster $2.5b a year in funding without consulting any outside sources. This system would avoid the negative effects of long and expensive review processes, because the state hands out the money with very few strings, and risk aversion among funders, because the researchers individually get to decide what to fund and pursue.
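The arithmetic behind the proposal, spelled out. The only inputs are the figures quoted above; the ~$2.5b figure appears to assume the 5000 collaborators pool essentially their full (rounded) allocations.

```python
# Back-of-the-envelope arithmetic for Researcher Guided Funding, using only the
# figures in the passage above.
federal_science_budget = 120e9      # ~$120b per year
faculty = 250_000                   # full-time STEM research/teaching faculty

per_researcher = federal_science_budget / faculty
print(per_researcher)               # 480_000.0, i.e. roughly $500k each

own_projects  = 0.6 * per_researcher   # spent however the researcher wants
peer_projects = 0.4 * per_researcher   # must be directed to peers' projects

# 5000 Higgs collaborators pooling their full allocations:
print(5000 * per_researcher)        # ~$2.4b per year, matching the ~$2.5b quoted above
```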
2022-10-20: Maybe not all hope is lost. It seems there’s improvement in preregistering studies, more data sharing etc.
There are encouraging signs that pre-registered study designs like this are helping address the methodological problems described above. Consider the following five graphs. The graphs show the results from 5 major studies, each of which attempted to replicate many experiments from the social sciences literature. Filled in circles indicate the replication found a statistically significant result, in the same direction as the original study. Open circles indicate this criterion wasn’t met. Circles above the line indicate the replication effect was larger than the original effect size, while circles below the line indicate the effect size was smaller. A high degree of replicability would mean many experiments with filled circles, clustered fairly close to the line. Here’s what these 5 replication studies actually found:
As you can see, the first 4 replication studies show many replications with questionable results – large changes in effect size, or a failure to meet statistical significance. This suggests a need for further investigation, and possibly that the initial result was faulty. The fifth study is different, with statistical significance replicating in all cases, and much smaller changes in effect sizes. This is a 2020 study by John Protzko et al that aims to be a “best practices” study. By this, they mean the original studies were done using pre-registered study design, as well as large samples and open sharing of code, data and other methodological materials, making experiments and analysis easier to replicate… In short, the replications in the fifth graph are based on studies using much higher evidentiary standards than had previously been the norm in psychology. Of course, the results don’t show that the effects are real. But they’re extremely encouraging, and suggest that the spread of ideas like Registered Reports will contribute to substantial progress.
There are also some interesting ideas about funding:
Fund-by-variance: Instead of funding grants that get the highest average score from reviewers, a funder should use the variance (or kurtosis or some similar measurement of disagreement) in reviewer scores as a primary signal: only fund things that are highly polarizing (some people love it, some people hate it). One thesis to support such a program is that you may prefer to fund projects with a modest chance of outlier success over projects with a high chance of modest success. An alternate thesis is that you should aspire to fund things only you would fund, and so should look for signal to that end: projects everyone agrees are good will certainly get funded elsewhere. And if you merely fund what everyone else is funding, then you have little marginal impact.
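A minimal sketch of what fund-by-variance scoring looks like in practice: rank proposals by reviewer disagreement rather than by mean score. The proposals and scores below are made up for illustration.

```python
# Fund-by-variance sketch: rank proposals by the variance of reviewer scores.
import statistics

proposals = {
    "A": [9, 9, 8, 9],    # everyone agrees it's good -> will get funded elsewhere
    "B": [10, 2, 9, 1],   # polarizing -> candidate for outlier success
    "C": [5, 6, 5, 5],    # everyone agrees it's mediocre
}

ranked = sorted(proposals.items(),
                key=lambda kv: statistics.variance(kv[1]),
                reverse=True)
for name, scores in ranked:
    print(name,
          "mean:", statistics.mean(scores),
          "variance:", round(statistics.variance(scores), 1))
```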
2023-02-23: Registered reports for publishing negative results
The fundamental principle underpinning a Registered Report is that a journal commits to publishing a paper if the research question and the methodology chosen to address it pass peer review, with the result itself taking a back seat. For now, Nature is offering Registered Reports in the field of cognitive neuroscience and in the behavioral and social sciences. In the future, we plan to extend this to other fields, as well as to other types of study, such as more exploratory research.
Why are we introducing this format? In part to try to address publication bias, the tendency of the research system — editors, reviewers and authors — to favor the publication of positive over negative results. Registered Reports help to incentivize research regardless of the result. An elegant and robust study should be appreciated as much for its methodology as for its results. More than 300 journals already offer this format, up from around 200 in 2019. But despite having been around for a while, Registered Reports are still not widely known — or widely understood — among researchers. This must change. And, at Nature, we want to play a part in changing it.
2023-08-31: History and other disciplines are even worse than Psychology
Science’s replication crises might pale in comparison to what happens all the time in history, which is not just a replication crisis but a reproducibility crisis. Replication is when you can repeat an experiment with new data or new materials and get the same result. Reproducibility is when you use exactly the same evidence as another person and still get the same result — so it has a much, much lower bar for success, which is what makes the lack of it in history all the more worrying.
2024-04-12: Mandating open access as a condition of funding
The Bill & Melinda Gates Foundation, one of the world’s top biomedical research funders, will from next year require grant holders to make their research publicly available as preprints. The foundation also said it would stop paying for article-processing charges (APCs) — fees imposed by some journal publishers to make scientific articles freely available online for all readers, a system known as open access (OA).
In the next quarter to half century, further advances may lead to large-scale production of hydrogen gas for fuel cells and the ability to store in the ground both bio- and fossil-derived CO2. Microbes could enable the inexpensive production of hydrogen by consuming a hydrogenated feedstock and releasing H2, for example, splitting water with light or splitting hydrogen from biomass or even coal.
A superconducting motor would be very lightweight and far more efficient electrically, generating 3x the torque of a conventional electric motor for the same energy input and weight. In addition, an electric aircraft would be far quieter than a conventional jet as there are no internal combustion processes involved. Liquid hydrogen is cold enough to make the superconducting magnets work but also has 4x as much energy, weight for weight, as aviation fuel.
2008-01-11: Thermoelectric Energy Conversion. Using hydrogen as a heat exchanger, solid state, 60% efficient energy generation. Can be used with solar, internal combustion, turbines and other sources of waste heat.
The JTEC is an all solid-state engine that operates on the Ericsson cycle. Equivalent to Carnot, the Ericsson cycle offers the maximum theoretical efficiency available from an engine operating between 2 temperatures. The JTEC system utilizes the electro-chemical potential of hydrogen pressure applied across a proton conductive membrane (PCM). The membrane and a pair of electrodes form a Membrane Electrode Assembly (MEA) similar to those used in fuel cells. On the high-pressure side of the MEA, hydrogen gas is oxidized resulting in the creation of protons and electrons. The pressure differential forces protons through the membrane causing the electrodes to conduct electrons through an external load. On the low-pressure side, the protons are reduced with the electrons to reform hydrogen gas. This process can also operate in reverse. If current is passed through the MEA a low-pressure gas can be “pumped” to a higher pressure.
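For intuition on where the voltage comes from: the open-circuit potential of a pressure-driven hydrogen MEA can be estimated with the Nernst equation for a hydrogen concentration cell. The temperature and pressure ratio below are illustrative assumptions, not JTEC specifications.

```python
# Illustrative estimate of the voltage available from a hydrogen pressure
# differential across a proton-conductive membrane, using the Nernst equation
# for a hydrogen concentration cell (H2 -> 2H+ + 2e-, so n = 2).
# Temperature and pressures are example values, not JTEC specs.
from math import log

R = 8.314       # J / (mol K)
F = 96485       # C / mol
n = 2           # electrons transferred per H2 molecule

def cell_voltage(p_high, p_low, temperature_k):
    return (R * temperature_k) / (n * F) * log(p_high / p_low)

print(cell_voltage(p_high=10.0, p_low=1.0, temperature_k=400))   # ~0.04 V per MEA
```

A single MEA at these assumed conditions only yields tens of millivolts, so a practical engine would presumably stack many such cells and run large pressure ratios.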
The JTEC could utilize heat from fuel combustion, solar, low-grade industrial waste heat or waste heat from other power generation systems including fuel cells, internal combustion engines and combustion turbines. As a heat pump, the JTEC system could be used as a drop-in replacement for existing HVAC equipment in residential, commercial, or industrial settings.
To complement the Scimitar engine, Reaction Engines has proposed a suitable vehicle configuration (A2) that attains the necessary subsonic and supersonic lift/drag ratio for efficient commercial operation. The airframe is designed to have adequate control authority about all axes to handle engine-out and to achieve pitch trim over the full Mach range. In addition the airframe configuration is an efficient structural shape with circular cross section hydrogen tankage and uninterrupted carry-through wing spars. The vehicle is sized to carry 300 passengers since this is typical of future supersonic transport designs and thought to be the minimum to achieve a competitive seat/km cost.
They also have an unpiloted, reusable spaceplane intended to provide inexpensive and reliable access to space.
skylon space plane will be ~$550 / kg to orbit initially and $145 later, significantly cheaper than spacex, which is around $1500 / kg. the space shuttle was $22k / kg. this is achieved with a lower mass ratio, because skylon is air breathing up to 25 km altitude. this is a very interesting design and i hope they succeed.
Boeing has unveiled the hydrogen-powered Phantom Eye unmanned airborne system for collecting data and communications, a demonstrator that will stay aloft at 20 km for up to 4 days.
Between this and the British unmanned fighter plane, an inflection point. 2014-03-14: Making hydrogen storage practical will make Tesla batteries look like amateur hour, and have huge implications.
DOE had hoped that a research team could pack in 7.5% hydrogen by weight by 2020. Li’s team has already crossed that threshold, with a hydrogen storage density of 9.5%. The team has also demonstrated the potential to reach an even higher density.
2020-06-09: The push to make America run on hydrogen. This seems at least partially a boondoggle like ethanol.
Nikola’s truck cab—a 1000-horsepower system comprising carbon fiber tanks, hydrogen fuel and a fuel-cell stack—will push an 18-wheeler up to 1200 km and weighs 9000 kg. The same juice would demand a lithium-ion battery that would add at least 2000 kg to a truck with the same range.
2020-12-10: Negative emission car. This is extremely silly, but then again I remember how much people enjoyed the display in a Prius that tells you how much energy you’re getting back from braking. Anything can be turned into a game.
The car is equipped with Toyota’s “Minus Emissions” technology, whereby air is taken into the vehicle, purified, sent into the fuel stack in order to generate electricity, then vented back out of the vehicle. By the time it comes out of the exhaust, the air has been scrubbed of any harmful chemicals and particulate matter 2.5 microns or greater in size. Are people around you likely to notice that you’ve left fresher air? Perhaps not, but you will: “The large 31cm center display includes an Air Purification display that shows the amount of air purified when driving through an easy-to-understand graphic of runners and digital display. It also includes an Air Purification meter that shows how much air is purified during acceleration. The meter enables the driver to feel the contribution that the new Mirai is making to the environment.”
Unlike other solid-to-liquid-fuel processes such as cornstarch into ethanol, this one will accept almost any carbon-based feedstock. If an 80 kg man fell into one end, he would come out the other end as 17 kg of oil, 3 kg of gas, and 3 kg of minerals, as well as 56 kg of sterilized water. While no one plans to put people into a thermal depolymerization machine, an intimate human creation could become a prime feedstock. “There is no reason why we can’t turn sewage, including human excrement, into a glorious oil”.
Just as we are hitting the Hubbert peak, we get a technology that may make oil rigs obsolete:
Andreassen and others anticipate that a large chunk of the world’s agricultural, industrial, and municipal waste may someday go into thermal depolymerization machines scattered all over the globe. If the process works as well as its creators claim, not only would most toxic waste problems become history, so would imported oil. Just converting all the US agricultural waste into oil and gas would yield the energy equivalent of 4B barrels of oil annually. In 2001 the United States imported 4.2B barrels of oil. “This technology offers a beginning of a way away from this.”
With their main (only?) source of income in danger, what will the Middle East kleptocracies do?
because “The only thing this process can’t handle is nuclear waste. If it contains carbon, we can do it,” and thermal depolymerization has proved to be 85% energy efficient for complex feedstocks, and even higher for relatively dry raw materials, such as plastics.
Held together by a slowly rotating system of currents northeast of Hawaii, the Eastern Garbage Patch is more than just a few floating plastic bottles washed out to sea; the Patch is a giant mass of trash-laden water 2x the size of Texas.
Declaring war on the “white pollution” choking its cities, farms and waterways, China is banning free plastic shopping bags and calling for a return to the cloth bags of old.
2013-12-05: Depolymerization was hailed as the solution ~10 years ago: turning plastic back into more versatile compounds. I weirdly haven’t heard much about it since. Probably because no one cares about trash?
almost every facility like it in the country is running in the red. More than 2K municipalities are paying to dispose of their recyclables instead of the other way around.
Anything that requires constant vigilance (sorting) combined with subsidies isn’t going to work even medium-term. Looks like recycling needs a big reboot. 2017-04-26: Plastic-eating worms. This sounds like one of those “obvious solutions”, like releasing rabbits in Australia to deal with a forgotten problem. Fear our future where the wax worm is up there with rust as a mortal enemy of civilization.
While other organisms can take weeks or months to break down even the smallest amount of plastic, the wax worm can get through more—in a far shorter period of time. The researchers let 100 wax worms chow down on a plastic grocery bag, and after just 12 hours they’d eaten 4% of the bag. That may not sound like much, but that’s a vast improvement over fungi, which weren’t able to break down a noticeable amount of polyethylene after 6 months.
Hydrothermal liquefaction could change the world’s polyolefin waste, a form of plastic, into useful products, such as clean fuels and other items. Once the plastic is converted into naphtha, it can be used as a feedstock for other chemicals or further separated into specialty solvents or other products. There is 1B tons of polyolefin waste in landfills.
2019-03-13: Plastic recycling never worked, and was a greenwashing effort by the industry, and dum-dums fell for it.
Even before China’s ban, only 9% of discarded plastic was being recycled, while 12% was burned. The rest was buried in landfills or simply dumped and left to wash into rivers and oceans. Without China to process plastic bottles, packaging, and food containers—not to mention industrial and other plastic waste—the already massive waste problem posed by our throwaway culture will be exacerbated, experts say. The planet’s load of nearly indestructible plastics—more than 8B tons have been produced worldwide over the past 60 years—continues to grow.
Companies like ExxonMobil, Shell, and Saudi Aramco are ramping up output of plastic to hedge against the possibility that a serious global response to climate change might reduce demand for their fuels. Petrochemicals now account for 14% of oil use, and are expected to drive 50% of oil demand growth between now and 2050. The World Economic Forum predicts plastic production will double in the next 20 years.
Every human on Earth is ingesting 2000 particles of plastic a week
2020-04-11: 90% breakdown of PET in under 10 hours. Process is still expensive and needs to scale further. 2020-07-08: Apples are the fruit most contaminated with microplastics, while carrots are the most affected vegetable. This is a much bigger problem than the performative efforts to clean up the Great Pacific Garbage Patch.
Throw a polyester sweater in the washing machine and it’ll come out nice and clean, but also not quite its whole self. As it rinses, millions of synthetic fibers will shake loose and wash out with the waste water, which then flows to a treatment plant. Each year, a single facility might pump 21B of these microfibers out to sea, where they swirl in currents, settle in sediments, and end up as fish food, with untold ecological consequences.
The company plans to use what it learns from the demonstration facility to build its first industrial plant, which will house a reactor 20x larger than the demonstration reactor. That full-scale plant will be built near a plastic manufacturer somewhere in Europe or the US, and should be operational by 2025. Manufacturing PET from enzymatic recycling could reduce greenhouse gas emissions between 17% and 43% compared to making virgin PET.
Last month, a group of marine biologists noticed something fishy in a video by a nonprofit called The Ocean Cleanup. “This is likely a staged video. I call bullshit.” In the 25-second clip, a large net appears to dump 4000 kg of plastic waste, including crates, buckets, and fishing gear, onto the deck of a ship. The Ocean Cleanup, which has raised more than $100m on the promise to rid plastic from the seas, said the trash in the video was just pulled from the Great Pacific Garbage Patch. “It’s like mopping up the spill when the spigot is still on. We can’t clean up our way out of plastic pollution.”
It is becoming increasingly clear that, if a useful device for quantum computation will ever be built, it will be embodied by a classical computing machine with control over a truly quantum subsystem, this apparatus performing a mixture of classical and quantum computation. This paper investigates a possible approach to the problem of programming such machines: a template high level quantum language is presented which complements a generic general purpose classical language with a set of quantum primitives.
A very interesting paper, basically stating that any quantum computer will need a classical front end to deal with data pre- and post-processing. Even the very pragmatic distinction between call-by-value and call-by-reference needs to be rethought:
It is well known that the no-cloning theorem excludes the possibility of replicating the state of a generic quantum system. Since the call-by-value paradigm is based on the copy primitive, this means that quantum programming can not use call-by-value; therefore a mechanism for addressing parts of already allocated quantum data must be supplied by the language.
2007-02-12: D-Wave 16 qubit prototype, with hopes for a 1024 qubit system in late 2008. Funny: they are not sure if it is a quantum computer at all; it might be an analog computer. 2007-04-22: The quantum computation version of the stacked turtle
But it was still pretty exciting stuff. Holy Zarquon, they said to one another, an infinitely powerful computer? It was like a 1000 Christmases rolled into 1. Program going to loop forever? You knew for a fact: this thing could execute an infinite loop in less than 10 seconds. Brute force primality testing of every single integer in existence? Easy. Pi to the last digit? Piece of cake. Halting Problem? Sa-holved.
They hadn’t announced it yet. They’d been programming. Obviously they hadn’t built it just to see if they could. They had had plans. In some cases they had even had code ready and waiting to be executed. One such program was Diane’s. It was a universe simulator. She had started out with a simulated Big Bang and run the thing forwards in time by 13.6b years, to just before the present day, watching the universe develop at every stage – taking brief notes, but knowing full well there would be plenty of time to run it again later, and mostly just admiring the miracle of creation.
For “generic” problems of finding a needle in a haystack, most of us believe that quantum computers will give at most a polynomial advantage over classical ones.
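For context, the canonical example of that polynomial advantage is Grover's algorithm: a quadratic speedup for unstructured search. A quick comparison of query counts (the search-space size below is just an example):

```python
# Grover's quadratic speedup for needle-in-a-haystack search:
# ~N/2 expected classical queries vs ~(pi/4)*sqrt(N) quantum iterations.
from math import pi, sqrt

N = 2**40                         # size of the unstructured search space (example)
classical = N / 2                 # expected classical queries
quantum = (pi / 4) * sqrt(N)      # Grover iterations

print(f"{classical:.2e} classical vs {quantum:.2e} quantum queries")
# a big win, but only a square-root improvement, not an exponential one
```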
2011-01-20: 10b qubits is very significant. I am sure there are all sorts of caveats, but still: wow. 2011-10-04: Philosophy and Theoretical Computer Science class by Scott Aaronson.
This new offering will examine the relevance of modern theoretical computer science to traditional questions in philosophy, and conversely, what philosophy can contribute to theoretical computer science. Topics include: the status of the Church-Turing Thesis and its modern polynomial-time variants; quantum computing and the interpretation of quantum mechanics; complexity aspects of the strong-AI and free-will debates; complexity aspects of Darwinian evolution; the claim that “computation is physical”; the analog/digital distinction in computer science and physics; Kolmogorov complexity and the foundations of probability; computational learning theory and the problem of induction; bounded rationality and common knowledge; new notions of proof (probabilistic, interactive, zero-knowledge, quantum) and the nature of mathematical knowledge. Intended for graduate students and advanced undergraduates in computer science, philosophy, mathematics, and physics. Participation and discussion are an essential part of the course.
2013-04-13: Quantum Computing since Democritus. Written in the spirit of the likes of Richard Feynman, Carl Sagan, and Douglas Hofstadter, and touching on some of the most fundamental issues in science, the unification of computation and physics. Kind of like A New Kind of Science was, without the bs. Plus Scott is a funny guy, so even if you only understand 5% (likely, given the deep topics), seems worth it. If you want to get a taste, try this paper: NP-complete Problems and Physical Reality. 2017-07-09: Multi-colored photons
the technology developed is readily extendable to create 2-quDit systems with more than 9000 dimensions (corresponding to 12 qubits and beyond, comparable to the state of the art in significantly more expensive/complex platforms).
2018-10-09: Quantum Verification. How do you know whether a quantum computer has done anything quantum at all?
After 8 years of graduate school, Mahadev has succeeded. She has come up with an interactive protocol by which users with no quantum powers of their own can nevertheless employ cryptography to put a harness on a quantum computer and drive it wherever they want, with the certainty that the quantum computer is following their orders. Mahadev’s approach gives the user “leverage that the computer just can’t shake off.” For a graduate student to achieve such a result as a solo effort is “pretty astounding”. Quantum computation researchers are excited not just about what Mahadev’s protocol achieves, but also about the radically new approach she has brought to bear on the problem. Using classical cryptography in the quantum realm is a “truly novel idea. I expect many more results to continue building on these ideas.”
Quantum error correction may be how space-time achieves its “intrinsic robustness,” despite being woven out of fragile quantum stuff. “We’re not walking on eggshells to make sure we don’t make the geometry fall apart. I think this connection with quantum error correction is the deepest explanation we have for why that’s the case.”
That rapid improvement has led to what’s being called “Neven’s law,” a new kind of rule to describe how quickly quantum computers are gaining on classical ones. Quantum computers are gaining computational power relative to classical ones at a “doubly exponential” rate — a staggeringly fast clip. With double exponential growth, “it looks like nothing is happening, nothing is happening, and then whoops, suddenly you’re in a different world.”
This is certainly the most extreme of the nerd rapture curves I have seen:
the very near future should be the watershed moment, where quantum computers surpass conventional computers and never look back. Moore’s Law cannot catch up. A year later, it outperforms all computers on Earth combined. Double qubits again the following year, and it outperforms the universe.
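For a sense of what “doubly exponential” means, here is ordinary exponential (Moore's-law-style) growth next to doubly exponential growth over a handful of steps. Units and step count are purely illustrative.

```python
# Ordinary exponential growth vs doubly exponential growth, per step.
for step in range(1, 7):
    moore = 2 ** step            # capability doubling every step
    neven = 2 ** (2 ** step)     # doubly exponential growth
    print(step, moore, neven)
# 1 2 4
# 2 4 16
# 3 8 256
# 4 16 65536
# 5 32 4294967296
# 6 64 18446744073709551616
```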
In the mid-2000s there was a small diamond mined from the Ural Mountains. It was called the ‘magic Russian sample’. The diamond was extremely pure—almost all carbon, which isn’t common—but with a few impurities that gave it strange quantum mechanical properties. Now anyone can go online and buy a $500 quantum-grade diamond for an experiment. The diamonds have nitrogen impurities—but what Schloss’s group needs is a hole right next to each one, called a nitrogen vacancy. Russian “magic diamonds” can hold qubits in place and thus act the same way that a trapped-ion rig does. By replacing a single carbon atom in a diamond’s atomic lattice with a nitrogen atom and leaving a neighboring lattice node empty, engineers can create what’s called a nitrogen-vacancy (NV) center. This is generally inexpensive since it’s derived from nature.
With no evaluative judgment attached, this is an unprecedented time for quantum computing as a field. Where once faculty applicants struggled to make a case for quantum computing (physics departments: “but isn’t this really CS?” / CS departments: “isn’t it really physics?” / everyone: “couldn’t this whole QC thing, like, all blow over in a year?”), today departments are vying with each other and with industry players and startups to recruit talented people. In such an environment, we’re fortunate to be doing as well as we are. We hope to continue to expand.
2019-07-26: Quantum hardware should make Monte Carlo methods more powerful & accurate. 2019-08-20: 1 Million Qubits
Fujitsu has a Digital Annealer with 8192 Qubits and a 1M qubit system in the lab. Digital Annealer is a new technology that is used to solve large-scale combinatorial optimization problems instantly. Digital Annealer uses a digital circuit design inspired by quantum phenomena and can solve problems which are difficult and time consuming for classical computers.
Microsoft is developing Majorana-based topological quantum computer qubits which will be higher-quality and lower error rate qubits. A high-quality hybrid system made of InSb nanowires with epitaxial-grown Al shells has revealed ballistic superconductivity and quantized zero-bias conductance peak. This holds great promise for making the long-sought topological quantum qubits.
they have made the simulation of the quantum electrons so fast that it could run for extremely long times without restrictions, and the effect of their motion on the movement of the slow ions would be visible.
Transparent crystals with optical nonlinearities could enable quantum computing at room temperature by 2030
2020-12-03: BosonSampling. A second method achieves quantum supremacy.
Do you have any amusing stories? When I refereed the Science paper, I asked why the authors directly verified the results of their experiment only for up to 26-30 photons, relying on plausible extrapolations beyond that. While directly verifying the results of n-photon BosonSampling takes ~2n time for any known classical algorithm, I said, surely it should be possible with existing computers to go up to n=40 or n=50? A couple weeks later, the authors responded, saying that they’d now verified their results up to n=40, but it burned $400000 worth of supercomputer time so they decided to stop there. This was by far the most expensive referee report I ever wrote!
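The anecdote illustrates the ~2^n verification cost nicely. Here is a rough extrapolation anchored to that single $400k data point, assuming the cost scales purely as 2^n (which is of course a simplification):

```python
# Rough scaling of classical BosonSampling verification cost, anchored
# (illustratively) to the ~$400k reportedly spent at n = 40 photons.
cost_at_40 = 400_000          # dollars, from the anecdote above

def verification_cost(n_photons):
    return cost_at_40 * 2 ** (n_photons - 40)

for n in (30, 40, 50):
    print(n, f"${verification_cost(n):,.0f}")
# 30 -> ~$391, 40 -> $400,000, 50 -> ~$409,600,000
```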
2021-12-06: Quantum Computing Overview. A really good overview of the field of quantum computing with a clear explanation of how they work, why people are excited about quantum algorithms and their value, the potential applications of quantum computers including quantum simulation, artificial intelligence and more, and the different models and physical implementations people are using to build quantum computers like superconducting devices, quantum dots, trapped ions, photons or neutral atoms, and the challenges they face.
Here we report the measurement of logical qubit performance scaling across several code sizes, and demonstrate that our system of superconducting qubits has sufficient performance to overcome the additional errors from increasing qubit number. We find that our distance-5 surface code logical qubit modestly outperforms an ensemble of distance-3 logical qubits on average, in terms of both logical error probability over 25 cycles and logical error per cycle ((2.914 ± 0.016)% compared to (3.028 ± 0.023)%). To investigate damaging, low-probability error sources, we run a distance-25 repetition code and observe a 1.7 × 10−6 logical error per cycle floor set by a single high-energy event (1.6 × 10−7 excluding this event). We accurately model our experiment, extracting error budgets that highlight the biggest challenges for future systems. These results mark an experimental demonstration in which quantum error correction begins to improve performance with increasing qubit number, illuminating the path to reaching the logical error rates required for computation.
2023-06-19: It might be possible to work around noise, making quantum computing practical.
IBM physicist Abhinav Kandala conducted precise measurements of the noise in each of their qubits, which can follow relatively predictable patterns determined by their position inside the device, microscopic imperfections in their fabrication and other factors. Using this knowledge, the researchers extrapolated back to what their measurements — in this case, of the full state of magnetization of a 2D solid — would look like in the absence of noise. They were then able to run calculations involving all of Eagle’s 127 qubits and up to 60 processing steps — more than any other reported quantum-computing experiment. The results validate IBM’s short-term strategy, which aims to provide useful computing by mitigating, as opposed to correcting, errors. Over the longer term, IBM and most other companies hope to shift towards quantum error correction, a technique that will require large numbers of additional qubits for each data qubit.
Broken pipes and rusty fences. If that ain’t scary, few things are.
The main entrances to Los Alamos are only marginally better defended than TA-33’s land. The military-like guards keeping watch at these points certainly look fierce in camouflage paints and black bulletproof vests. But there’s little to back up the image. Their belts have gun holsters, but no guns to fill them. Around facilities like the biology lab, where anthrax and other biotoxins have been handled, no sentries stand guard at all. Nor is there any kind of fence to keep the curious and the malicious away — not even a piece of string.
The United States Geological Survey is now reporting the magnitude of the claimed North Korean nuclear test as 4.2. This seems to be curiously low. Now, estimating explosive yield from the body magnitude of a seismic event is a tricky business, and requires knowledge of details such as the depth of the detonation and the geological properties of the surroundings, but a magnitude around 4.2 is what you’d expect for a detonation of 1 kiloton. The “natural size” of a crude fission bomb is in excess of 10 kilotons, from which you’d expect a magnitude closer to 5. It is very unlikely that a low kiloton yield device would be used in an initial test.
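The reasoning here tracks the standard empirical magnitude-yield relation, mb ≈ a + b·log10(Y). The constants vary with detonation depth and geology; the values below are illustrative assumptions chosen to reproduce the figures in the quote (magnitude ~4.2 for ~1 kiloton, closer to 5 for 10+ kilotons).

```python
# Illustrative body-wave magnitude vs yield relation: mb = a + b*log10(Y_kt).
# The constants a and b are assumptions picked to match the quoted figures,
# not values for any particular test site.
from math import log10

def body_wave_magnitude(yield_kt, a=4.2, b=0.75):
    return a + b * log10(yield_kt)

for y in (1, 10, 20):
    print(y, "kt ->", round(body_wave_magnitude(y), 2))
# 1 kt -> 4.2, 10 kt -> 4.95, 20 kt -> 5.18
```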
2006-12-03: The Agony of Atomic Genius, biographical sketch of J. Robert Oppenheimer
2008-06-28: Man-made nuclear explosions in the 1940s and 1950s released isotopes into the environment that do not occur naturally, allowing the dating of works of art. 2010-09-21: The Atom Bomb on Film. Or you could go to the atomic testing museum in Vegas and see these and much more in person.
2010-11-25: Nuke Detector. Turn a supertanker into an antineutrino detector by kitting it out with the necessary photon detectors and filling it with 10^34 protons. Then station it off the coast of suspicious countries and submerge it. 2013-11-26: Indian nuclear scientists are being assassinated, and the Indian government is mum about it. Nuclear scientists have very high mortality in Iran too, but the government there is making a huge ruckus about it.
Indian nuclear scientists haven’t had an easy time of it over the past 10 years. Not only has the scientific community been plagued by “suicides,” unexplained deaths, and sabotage, but those incidents have gone mostly underreported in the country—diluting public interest and leaving the cases quickly cast off by police.
during the Cold War, the United States did deploy man-portable nuclear destruction. If Warsaw Pact forces ever bolted toward Western Europe, they could resort to nukes to delay the advance long enough for reinforcements to arrive. These “small” weapons, many of them more powerful than the Hiroshima bomb, would have obliterated any battlefield and irradiated much of the surrounding area.
In 1957, a young man named Darrell Robertson enlisted in the US Army and participated in a secret training program in the middle of the Nevada desert. He and his fellow recruits were sworn to secrecy and, for decades, told no one of their experiences. In 1996, the US government declassified the project and Robertson was finally able to tell his story. In X-Ray Man, Robertson recalls training exercises in which the Department of Defense used him and other soldiers in nuclear tests more than 10 years after the horrors of Hiroshima and Nagasaki were already well known. Kerri Yost’s powerful short documentary is an account of how Cold War-era fears allowed for shocking treatment not just of supposed enemies, but also of those enlisted to fight against them. Though cancer has attacked his body, Robertson, supported by his wife, remains stoic and dignified, offering the quiet but forceful observation that ‘any person in the military becomes part of military science’.
2015-09-09: Nuclear wars for SETI. Nuclear explosions might be the first thing we see of other life at interstellar distances. Gamma rays are much easier to detect than radio waves, but would only last a few days at most. You’d have to be extremely lucky to catch that, but then we spot GRBs like that all the time. 2016-07-17: The H-Bombs in Turkey
Among the many questions still unanswered following Friday’s coup attempt in Turkey is one that has national-security implications for the United States and for the rest of the world: How secure are the American hydrogen bombs stored at a Turkish airbase?
2019-03-12: Trinity Test. The first detonation of a nuclear bomb
To avoid being destroyed and rendered useless—their silos provide no real protection against a direct Russian nuclear strike—they would be “launched on warning,” that is, as soon as the Pentagon got wind of an incoming nuclear attack. Because an error could have disastrous consequences, James Mattis testified to the Senate Armed Services Committee in 2015 that getting rid of America’s land-based nuclear missiles “would reduce the false alarm danger.” Whereas a bomber can be turned around even on approach to its target, a nuclear missile launched by mistake can’t be recalled.
It is true that science can be done in the space station. But science can also be done dressed in a clown suit atop a large Ferris wheel. The argument ought to be over where is the best place for it. Performing experiments in microgravity does not require a $100B platform. Moreover, much of the work that can genuinely be done only on the station is justified through another magnificently circular leap of logic. Research into the effects of microgravity on human health and the growth of soybeans, for instance, is useful only in the context of a manned mission to Mars.
It doesn’t pay to lift humans out of Earth’s gravity well. Billions each year could instead be spent on research and development for cheaper transport options, spurring the advent of a commercial space industry. 2006-10-24: Our Non-Expeditions to the Moon and Mars
President Bush is right. The space shuttle and the space station deserve termination. The true heart of his proposal is the elimination of these programs, and the substitution of robotic exploration.
Space exploration is fundamentally about the survival of the species, about ensuring better odds for our survival through the propagation of the human species. But as we do it, we will also ensure the prosperity of our species in the economic sense, in a 1000 ways.
NASA is starting to take extinction events seriously and arguing for space missions as a hedge. About time. 2007-06-06: Future of NASA
Reducing the cost of space access is now being addressed by the private sector. NASA is now acting as a responsible potential customer of commercial launch services (COTS). The government has a poor record competing with the private sector in the arena of cost effectiveness. NASA spent many years and billions of dollars in pursuit of next generation launch technologies, with limited success. NASA has now wisely chosen to provide a market with exploration programs and to permit private enterprise to have a crack at making that affordable. In the meantime NASA is developing the Ares family of launch vehicles to provide the capabilities it requires to initiate the human exploration program until the market is able to offer cheaper alternatives. As for the International Space Station (ISS), it is essential that we learn how to truly live and work in space – not just pay visits. ISS is a vital international laboratory for learning how to build, live aboard, maintain, and operate a complex vehicle in space. The same is true for a lunar base that enables us to use the resources of space and assists our education in how to reach Mars. As for favored contractors, the COTS program and the market established by exploration will open new venues for many companies and communities around the world to participate.
NASA’s reply to allegations of suckiness by Wired et al. 2010-05-15: The New Space Frontier
Today, the President will articulate an ambitious and exciting new plan that will alter our destiny as a species. I believe this address could be as important as President Kennedy’s 1962 speech at Rice University. For the first time since Apollo, our country will have a plan for space exploration that inspires and excites all who look to the stars. Even more important, it will work.
NASA finally creates a real space industry. You know, with competition and stuff. 2012-03-03: Space exploration future: Beyond. On a related note, getting into AMNH early without all the crowds makes it twice as awesome. 2012-05-27: Mars Drive
The reference mission design of the MarsDrive Consortium is discussed, which has been created to facilitate exploration of the red planet through methods that are both realizable and cost-sustainable with existing technology. This mission plan—known as Mars for Less—is predicated on the use of existing medium-lift launch vehicles. In this architecture, 25-ton propulsion stages are placed individually in low-Earth orbit, where they are mated to Mars-bound payloads and ignited at successive perigees to execute trans-Mars injection. Spacecraft follow conjunction-class trajectories to the red planet and utilize aerodynamic methods for orbital capture and descent. Return vehicles are fueled with methane/oxygen bipropellant synthesized primarily from Martian resources. Dispatching expeditions from orbit with individual, high energy stages—rather than directly from the Earth’s surface—allows for the division of mission mass into more manageable components, which can be launched by vehicles that exist today. This plan does not require the development of heavy-lift launch technology: an effective yet costly proposition that may otherwise hinder current space exploration initiatives. Without the need for heavy-lift boosters, piloted missions to Mars may be undertaken presently, and within the capabilities of private initiatives. It is argued that the mission design herein represents a more viable method of conducting early human Mars exploration than proposals which require heavy-lift launch vehicles—an alternative method by which the red planet can be opened to humanity.
Between the safety fetish and cover-your-ass problems with government-funded spaceflight, private companies have a 100x cost advantage. What flag will they plant in the regolith?
Here is one such proposal. 2013-02-27: Going to Mars in 2018? This will be the most awe-inspiring and profound event this quarter-century.
Inspiration Mars Foundation believes in the exploration of space as a catalyst for growth, national prosperity, knowledge and global leadership. History has shown that strong nations reap these benefits when they boldly follow a path rooted in curiosity and guided by technological innovation. In 2018, the planets will literally align, offering a unique orbit opportunity to travel to Mars and back to Earth in only 501 days. Inspiration Mars is committed to sending a 2-person American crew – a man and a woman – on an historic journey to fly within 160 km around the Red Planet and return to Earth safely.
2013-03-12: Mars Research Station. I salute this effort. Very often, things look silly to our eyes that become crucial for humanity later on.
In the vast open spaces of southern Utah, Reuters photographer Jim Urquhart recently paid a visit to the Mars Desert Research Station (MDRS). Built and operated by a space advocacy group called the Mars Society, the research facility is investigating the feasibility of human exploration of Mars, using the Utah desert’s Mars-like terrain to simulate working conditions on the red planet. Since 2000, more than 100 small crews have served 2-week rotations in the MDRS, conducting research in an on-site greenhouse, observatory, engineering area, and living space. Urquhart was able to accompany members of the Crew 125 EuroMoonMars B mission inside the MDRS facility, and on a simulated trip to collect Martian geological samples.
2014-02-18: Indian Mars Exploration. India’s Mars mission is 9x cheaper than a similar NASA one. I hope they succeed and put another nail into the coffin of bloated military-industrial complex projects.
While India’s recent launch of a spacecraft to Mars was a remarkable feat in its own right, it is the $75m mission’s thrifty approach to time, money and materials that is getting attention.
2014-10-31: Exploration is sacrifice, and we’d best re-learn that lesson.
I would like to share my deepest personal thoughts on today’s Virgin Galactic Accident. As you know, this is deeply meaningful to me, my family and friends.
Today, most importantly, my heart goes out to those who have lost loved ones, and the many at Virgin Galactic, Scaled Composites, the Virgin Group and the Mojave Spaceport who this accident deeply affects.
I urge all of us to keep something in mind. We are on the verge of opening the space frontier, one of the greatest endeavors of our species.
Many Americans forget that 500 years ago 1000s of Europeans gave their lives to open the Americas, and 200 years ago the early Americans risked their lives to open the West. This is what exploring is all about. We risk our lives for what we believe in. This is the American way, the explorer’s way.
I for one, am proud to be a Virgin Galactic client. I believe in the company, and know, without a doubt, that they will succeed, and I will fully trust them with my safety when my turn to fly materializes.
A far cry from the fierce Cold War Space Race between the US and the Soviet Union, exploration in the 21st century is likely to be a far more globally collaborative project. This spirit of trans-border ownership and investment seems set to continue. One key part of this is the Global Exploration Roadmap, an effort between space agencies like NASA, France’s Centre National d’Etudes Spatiales, the Canadian Space Agency, and the Japan Aerospace Exploration Agency, among many others, that is intended to aid joint projects from the International Space Station to expeditions to the Moon and near-Earth asteroids—and to reach Mars. On a recent trip to India’s space agency, Stofan recounted to me, she met with many Indian engineers who were just as excited as the Americans to get scientists up there, not only to explore, but also to begin nailing down the question of whether there was ever life on the red planet. It’s also clear that the next stage of space exploration will not only be more global, but will equally involve greater private and public partnerships. Companies like Space X are increasingly involved in NASA’s day-to-day operations
At 61, Dr Stone appreciates the limits of human exploration. 23 of his friends lost their lives on expeditions and he has personally recovered 7 bodies. Now Stone Aerospace is developing a team of robots to hunt for microbial life on Europa. The discovery of Europan life would, Dr Stone reckons, be “a pretty good contender” for one of the most momentous events in human history. That might satisfy most explorers, but not Dr Stone. He has founded the Shackleton Energy Company to process water on the Moon into oxygen and hydrogen for rocket fuel.
The possibility of applying the Space-X Falcon-Heavy booster to human exploration of the inner solar system is discussed. A human-rated Dragon command module and an inflatable habitat module would house and support the 2-4 person crew during a ~1 year interplanetary venture. To minimize effects of galactic cosmic rays, older astronauts should conduct the mission during Solar Maximum. Crew life support is discussed as is application of a ~1-km square solar photon sail. The sail would be applied to rendezvous with the destination Near Earth Object (NEO) and to accelerate the spacecraft on its return to Earth. An on-line NASA trajectory browser has been used to examine optimized trajectories and destinations during 2025-2026. A suitable destination with well established solar-orbital parameters is Asteroid 2009 HC.
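As a sanity check on the ~1-km-square photon sail mentioned in that abstract: the radiation-pressure thrust at 1 AU for an ideal, perfectly reflective sail works out to only a few newtons (all values below are illustrative, not from the paper).

```python
# Order-of-magnitude thrust of a 1 km x 1 km solar photon sail at 1 AU,
# assuming a perfectly reflective sail facing the Sun.
SOLAR_CONSTANT = 1361.0       # W/m^2 at 1 AU
C = 2.998e8                   # speed of light, m/s
AREA = 1_000 * 1_000          # sail area in m^2

thrust = 2 * SOLAR_CONSTANT / C * AREA     # factor of 2 for perfect reflection
print(f"{thrust:.1f} N")                   # ~9 N
```

Tiny, but applied continuously for months, which is the whole point of a sail.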