In this elegant research vessel, Craig Venter set sail around the world to shotgun sequence the millions of viruses and bacteria in every spoonful of seawater. From the first 5 ocean samples, his team grew the number of known genes on the planet by 10x and the number of genes involved in solar energy conversion by 100x. The ocean microorganisms have evolved over a longer period of time and have pathways that are more efficient than photosynthesis. Another discovery: every 300 km across the open ocean, the microbial genes are 85% different. The oceans are not homogenous masses. They consist of myriad uncharted regions of ecological diversity… and the world’s largest genetic database.
Molecular self-assembly is feasible for simple materials; biological systems can be engineered to operate outside a living cell and in alternate configurations; more research is needed to determine feasibility for nanobiotechnology.
I’m not sure that free will, if it exists, requires any immeasurable quantum mechanical mumbo jumbo. The magic is not in any quantum mechanical phenomena inside the neurons, but in the standard-physics arrangement of them.
More likely, the appearance of free will is a result of the inability to perform 100% introspection into one’s own mind. I can no more “understand” the real-time machinations of my own mind than a Pentium processor can run a real-time simulation of its own transistors. Because I can’t perfectly introspect my subconscious, much of its output looks magically non-deterministic (hence the seeming similarity to quantum mechanical systems).
Any bounded-rational being would believe itself to have free will, based on its ability to take independent actions and its inability to introspect all the causal factors underpinning its own actions. In reality, the system that creates intelligence can be 100% deterministic, just too complex for that intelligence to understand itself. Only a much more powerful intelligence could look down and see that these beings that think they have free will are actually operating on “simple” rules.
2007-03-23: Selecting for Gay Pagan Babies. Delicious logical quandaries.
Would the selection rob the child of free will? I don’t think so. What is being selected are components of personality traits, not the thoughts or reactions of the emerging person. They will bias and affect the thoughts, but no more and no less than any other personality traits. That these traits were selected does not give the parents more control over the child or predetermine its destiny. A theological argument against would be that God would make sure to give the right genome, and that parents should trust God to do it right. But if that is true, then God seems to like gays too.
2009-04-16: Strong Free Will Theorem. If indeed we humans have free will, then elementary particles already have their own small share of this valuable commodity. 2014-10-03: The free-will fix
New brain implants can restore autonomy to damaged minds, but can they settle the question of whether free will exists? If free will could be safely enhanced, would those with strengthened capacities be held to a higher standard?
We tend to take it for granted that conscious thoughts precede our actions. Indeed, our systems of morality, justice and moral responsibility are based on the notion that people are free to make thoughtful decisions. However, the US scientist Benjamin Libet’s groundbreaking 1980s experiments on the relationship between brain activity, conscious thoughts and physical actions caused some scientists and philosophers to rethink the concept of ‘free will’ and ask whether our decisions are made subconsciously before we’re even aware of them.
Quantum computing theorist, popular author, blogger, and scientist Scott Aaronson on the #MeaningOfLife, Enlightenment, Schrodinger, Heisenberg, the Matrix, balancing work and family, and why the universe is not just a simulation.
It is time the academic review process is made more transparent. Bertrand Meyer on why reviews could learn from open source and weblogs. If most of your thoughts are in the public record, it affects your thought processes. For the better I think.
It is widely believed that anonymous refereeing helps fairness, by liberating reviewers from the fear that openly stated criticism might hurt their careers. In my experience, the effects of anonymity are worse than this hypothetical damage. Referees too often hide behind anonymity to turn in sloppy reviews; worse, some dismiss contributions unfairly to protect their own competing ideas or products. Even people who are not fundamentally dishonest will produce reviews of unsatisfactory quality out of negligence, laziness or lack of time because they know they can’t be challenged. Putting your name on an assessment forces you to do a decent job.
The day you find a more interesting paper in Citeseer than in any IEEE or ACM e-journal is the day you don’t look back. But what would happen next? What about the academic symbiosis with the peer review system? Citeseer ranks papers with a Google PageRank-like algorithm: it analyzes the topology of the citation network to work out which papers are more influential than others. Just as Google does with hyperlinks between web pages, Citeseer does with bibliographic citations: the result is that peer review is not done by a panel of experts, but by every researcher in the field!!
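To make the ranking idea concrete, here is a minimal sketch (a hypothetical toy graph, not Citeseer’s actual code) of power-iteration PageRank over a citation network, where a paper inherits rank from the papers that cite it:

```python
# Minimal PageRank over a toy citation graph -- a sketch of the idea,
# not Citeseer's actual ranking code. Papers are nodes; an edge A -> B
# means "A cites B", so rank flows from citing papers to cited ones.
def pagerank(citations, damping=0.85, iters=50):
    papers = set(citations) | {p for cited in citations.values() for p in cited}
    rank = {p: 1.0 / len(papers) for p in papers}
    for _ in range(iters):
        new = {p: (1 - damping) / len(papers) for p in papers}
        for citing, cited in citations.items():
            if cited:
                share = damping * rank[citing] / len(cited)
                for p in cited:
                    new[p] += share
            else:  # papers that cite nothing spread their rank evenly
                for p in papers:
                    new[p] += damping * rank[citing] / len(papers)
        rank = new
    return rank

# Hypothetical example: paper C is cited by both A and B.
print(pagerank({"A": ["C"], "B": ["C", "A"], "C": []}))
```

In the toy example, paper C ends up ranked highest simply because the other two papers cite it, which is the whole point: influence is read off the citation network rather than decided by a referee panel.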
what strikes me is how much pomp, circumstance and apparatus academia requires in order to frame even a very small and simple point. References to everything in the literature ever said on any vaguely related topic, detailed comparisons of your work to whatever it is the average journal referee is likely to find important — blah, blah, blah, blah, blah…. A point that I would more naturally get across in 5 pages of clear and simple text winds up being a 30 page paper!
I’m writing some books describing the Novamente AI system — one of them, 600 pages of text, was just submitted to a publisher. The other 2, 300 and 200 pages respectively, should be submitted later this year. Writing these books took a really long time but they are only semi-technical books, and they don’t follow all the rules of academic writing — for instance, the whole 600 page book has a reference list no longer than I’ve seen on many 50-page academic papers, which is because I only referenced the works I actually used in writing the book, rather than every relevant book or paper ever written. I estimate that to turn these books into academic papers would require me to write 60 papers. To sculpt a paper out of text from the book would probably take me 2-7 days of writing work, depending on the particular case. So it would be at least 1 full year of work, probably 2 full years of work, to write publishable academic papers on the material in these books!
The lack of risk-taking is particularly evident in computer science:
Furthermore, if as a computer scientist you develop a new algorithm intended to solve real problems that you have identified as important for some purpose (say, AI), you will probably have trouble publishing this algorithm unless you spend time comparing it to other algorithms in terms of its performance on very easy “toy problems” that other researchers have used in their papers. Never mind if the performance of an algorithm on toy problems bears no resemblance to its performance on real problems. Solving a unique problem that no one has thought of before is much less impressive to academic referees than getting a 2% better solution to some standard “toy problem.” As a result, the whole computer science literature (and the academic AI literature in particular) is full of algorithms that are entirely useless except for their good performance on the simple “toy” test problems that are popular with journal referees.
His first scenario makes me wonder if amateur scientists could again make meaningful contributions to research, combined with a wiki-like process that (hopefully) would identify promising directions better than today’s peer reviews:
And so, those of us who want to advance knowledge rapidly are stuck in a bind. Either generate new knowledge quickly and don’t bother to ram it through the publication mill … or, generate new knowledge at the rate that’s acceptable in academia, and spend 50% of your time wording things politically and looking up references and doing comparative analyses rather than doing truly productive creative research.
2006-12-31: The trend towards cross-disciplinary research is getting stronger. this is very good news for this dabbler 😉
There is an increasing coalescence of scientific disciplines in many areas. Thus the discovery of the structure of the genome not only required contributions from parts of biology, physics, chemistry, mathematics, and information technology, but in turn it led to further advances in biology, physics, chemistry, technology, medicine, ecology, and even ethics. And all this scientific advance is leading, as it should, to the hopeful betterment of the human condition (as had been also one of the platform promises of the Unity of Science movement, especially in its branch in the Vienna Circle).
Similar developments happen in the physical sciences—a coalescence of particle physics and large-scale astronomy, of physics and biology, and so forth. It is a telling and not merely parochial indicator that ~50% of my 45 colleagues in my Physics Department, owing to their widespread research interests, now have joint appointments with other departments at the University: with Molecular and Cellular Biology, with Mathematics, with Chemistry, with Applied Sciences and Engineering, with History of Science. Just now, a new building is being erected next to our Physics Department. It has the acronym LISE, which stands for the remarkable name, Laboratory of Integrated Science and Engineering. Although in industry, here and there, equivalent labs have existed for years, the most fervent follower of the Unity of Science movement would not have hoped then for such an indicator of the promise of interdisciplinarity. But as the new saying goes, most of the easy problems have been solved, and the hard ones need to be tackled by a consortium of different competences.
Delicious is the Rome, Jerusalem, and Paris of my existence as a researcher these days. It’s where I make my friends, how I get the news, and where I go to trade.
Our goal is to make collaboration and open source come to life in the field of clinical research. With our partners, we will identify specific projects where the sharing of information will lead to better, more accurate research.
2007-10-10: Publication bias: Only positive correlations are published. Time for a big expert system to feed the data from all experiments in. 2007-12-14: Creative Commons
Nature Magazine announced that it’s going to share all its human genome papers under Creative Commons Attribution-NonCommercial-ShareAlike licenses. The genomes themselves are not copyrightable and go into a public database, but the papers — which are a vital part of the science — may now be freely copied by any non-commercial publisher.
2009-03-11: Strangest college courses. Can’t let Sarah find this list!
15. Arguing with Judge Judy: Popular ‘Logic’ on TV Judge Shows
14. Underwater Basket Weaving
13. Learning From YouTube
12. Philosophy and Star Trek
11. The Art of Walking
10. Daytime Serials: Family and Social Roles
9. Joy of Garbage
8. The Science of Superheroes
7. Zombies in Popular Media
6. The Science of Harry Potter
5. Cyberporn and Society
4. Simpsons and Philosophy
3. Far Side Entomology
2. Myth and Science Fiction: Star Wars, The Matrix, and Lord of the Rings
1. The Strategy of StarCraft
2009-03-12: Subsidizing intellectually challenged people to go to college just exacerbates the glut of useless degrees.
Scholars in their 60s are not producing path-breaking new research, but they are the people that tenure protects. Scholars in their 20s have no academic freedom at all.
2011-10-28: Makes the argument that #ows is people with useless degrees and a lot of student loan debt. the useless degree bubble is definitely one to pop.
But the lower tier of the New Class — the machine by which universities trained young people to become minor regulators and then delivered them into white collar positions on the basis of credentials in history, political science, literature, ethnic and women’s studies — with or without the benefit of law school — has broken down. The supply is uninterrupted, but the demand has dried up. The agony of the students getting dumped at the far end of the supply chain is in large part the OWS. As Above the Law points out, here is “John,” who got out of undergrad, spent a year unemployed and living at home, and is now apparently at University of Vermont law school, with its top ranked environmental law program — John wants to work at a “nonprofit.”
Even more frightening is the young woman who graduated from UC Berkeley, wanting to work in “sustainable conservation.” She is now raising chickens at home, dyeing wool and knitting knick-knacks to sell at craft fairs. Her husband has been studying criminal justice and EMT — i.e., preparing to work for government in some of California’s hitherto most lucrative positions — but as those work possibilities have dried up, he is hedging with a (sensible) apprenticeship as an electrician. These young people are looking at serious downward mobility, in income as well as status. The prospects of the lower tier New Class semi-professionals are dissolving at an alarming rate. Student loan debt is a large part of its problems, but that’s essentially a cost question accompanying a loss of demand for these professionals’ services.
2013-04-12: You can observe the useless degrees epidemic every day in Williamsburg.
At present, a high % of high school graduates continue to enroll in college, often taking on punishing levels of debt in the process. As nearly any recent college graduate knows, many of these people end up working in lower wage service jobs (baristas, for example). I think it is entirely possible that future high school graduates will look at these outcomes and begin shying away from college.
2014-02-09: Everyone going to college is completely wrong.
Student population: now in decline; Admissions: harder & harder to make targets; Finance: banks not willing to take on more student loans; Research: shift from tenure to adjuncts.
I don’t think MOOCs are the solution. The premise that “everyone” should have a university education is wrong, and deprives people of quality apprenticeships.
On top of that, there are massive problems with science itself. Academic literature is a relic of the era of typesetting, modeled on static, irrevocable publication. A dependency graph would tell us, at a click, which of the pillars of scientific theory are truly load-bearing. Revision control (git anyone?) would allow for a much more mature way to incorporate improvements from anyone. 2014-06-12: From upworthy university:
You NEED to see this hot model (NSFW) of ethnic politics and foreign policy These squirrels were forced to migrate south because of the Ice Age. What happened next WILL SHOCK YOU
2014-07-01: Amusing given how many overeducated people with useless degrees already work at Starbucks. I don’t think more degrees is the answer.
Starbucks will provide a free online college education to 1000s of its workers, without requiring that they remain with the company
Published p-values cluster suspiciously around the 0.05 level, suggesting that some degree of p-hacking is going on. This is often described as torturing the data until it confesses.
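A quick simulation makes the point (a hedged sketch, not any particular study’s data): with no real effect at all, peeking after every batch of subjects and stopping the moment p dips below 0.05 produces far more than 5% “significant” findings, all clustered just under the threshold:

```python
# Sketch of optional stopping ("torturing the data"): there is no true effect,
# but peeking after every batch and stopping at p < 0.05 inflates the false
# positive rate well above the nominal 5%.
import random, statistics, math

def t_test_p(xs, ys):
    # Two-sample test p-value via a normal approximation -- rough, for illustration.
    nx, ny = len(xs), len(ys)
    se = math.sqrt(statistics.variance(xs) / nx + statistics.variance(ys) / ny)
    z = abs(statistics.mean(xs) - statistics.mean(ys)) / se
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

def hacked_experiment(max_batches=10, batch=10):
    xs, ys = [], []
    for _ in range(max_batches):
        xs += [random.gauss(0, 1) for _ in range(batch)]
        ys += [random.gauss(0, 1) for _ in range(batch)]  # same distribution: no true effect
        p = t_test_p(xs, ys)
        if p < 0.05:
            return p  # stop and "publish" as soon as it looks significant
    return p

results = [hacked_experiment() for _ in range(2000)]
print("false positive rate:", sum(p < 0.05 for p in results) / len(results))
```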
Nature has decided to add an option for double-blind peer review – papers would be sent to the referees without author names or institutional affiliations on them. Papers from Big Names won’t bother, because they have more to lose by being covered up. So the double-blinded papers might end up disproportionately from smaller groups who are trying to even the playing field, and it risks becoming a negative signal. It might be better if Nature were to take the plunge and blind everything.
2015-05-08: Or you could have saved yourself 250k and gotten a useful education right from the start.
In a Boston basement that houses a new kind of vocational training school, Katy Feng is working harder than she ever did at Dartmouth College. The 22-year-old graduated last year with a bachelor’s degree in psychology and studio art that cost more than $250K. She sent out 10s of résumés looking for a full-time job in graphic design but wound up working a contract gig for a Boston clothing store. “I thought, they’ll see Dartmouth, and they’ll hire me. That’s not really how it works, I found.” She figures programming is the best way to get the job she wants. Hence the basement, where she’s paying $11500 for a 3-month crash course in coding.
Actually, no, let’s not do that, and just let people hold basic jobs even if they don’t cough up $100K from somewhere to get a degree in Medieval History? Sanders’s plan would subsidize the continuation of a useless tradition that has turned into a speculation bubble, prevent the bubble from ever popping, and disincentivize people from figuring out a way to route around the problem
2015-06-08: Scientific publishing is stuck in the 18th century.
to a great extent the internet is used as a PDF delivery device by many publishers, and the PDF is an electronic form of the classic paper journal article, whose basic outlines were established in the 17th and 18th centuries. In other words, in a qualitative sense we’re not that much beyond the Age of Newton and the heyday of the Royal Society. Scientific publishing today is analogous to “steampunk.” An anachronistic mix of elements somehow persisting deep into the 21st century. Its real purpose is to turn the norms of the past into cold hard cash for large corporations.
Our research suggests that open access policies have a tremendous impact on the diffusion of science to the broader general public through an intermediary like Wikipedia
Principal Deputy Manager of the Subcommittee for Athletic Communications
Assistant Vice Dean of the Committee on Neighborhood Compliance
Lead Associate Provost for the Office of Dining Compliance
2016-07-17: This Open Science Framework is excellent, starts peer review during the experiment design phase when it’s still cheap to make corrections, rather than when it’s all done. 2017-02-01: Scientific fraud is rampant
Hartgerink is 1 of only a handful of researchers in the world who work full-time on the problem of scientific fraud – and he is perfectly happy to upset his peers. “The scientific system as we know it is pretty screwed up. I’ve known for years that I want to help improve it. Statcheck is a good example of what is now possible”. The top priority is something much more grave than correcting simple statistical miscalculations. He is now proposing to deploy a similar program that will uncover fake or manipulated results – which he believes are far more prevalent than most scientists would like to admit.
2017-04-15: I wonder how many people ended up graduating with their head trauma degrees?
It was a bold new idea: an all-sports college, classes be damned. But for the athletes at Forest Trail Sports University who faced hunger, sickness and worse, it turned into a nightmare.
Scientific revolutions occur on all scales, but here let’s talk about some of the biggies:
1850-1950: Darwinian revolution in biology, changed how we think about human life and its place in the world.
1890-1930: Relativity and quantum revolutions in physics, changed how we think about the universe.
2000-2020: Replication revolution in experimental science, changed our understanding of how we learn about the world.
We are in the middle of a scientific revolution involving statistics and replication in many areas of science, moving from an old paradigm in which important discoveries are a regular, expected product of statistically-significant p-values obtained from routine data collection and analysis, to a new paradigm of . . . weeelll, I’m not quite sure what the new paradigm is.
Progress itself is understudied. By “progress,” we mean the combination of economic, technological, scientific, cultural, and organizational advancement that has transformed our lives and raised standards of living over the past couple of centuries. For a number of reasons, there is no broad-based intellectual movement focused on understanding the dynamics of progress, or targeting the deeper goal of speeding it up. We believe that it deserves a dedicated field of study. We suggest inaugurating the discipline of “Progress Studies.”
But the scientific journal as we know it was actually born because of popular demand for information during a pandemic.
In the early 1820s, a smallpox outbreak struck Paris and other French cities. A new vaccine was in existence at the time, but reports varied about how effective it was. A powerful medical institution in Paris, the Académie de Médecine, gathered its members to discuss what advice it should issue to the nation. Historically, such meetings were held privately, but the French Revolution had ushered in a new era of government accountability, and journalists were allowed to attend. The scientific debate they relayed upset some members of the Académie, which had hoped to make a clear, unified statement. In response, the Académie sought to regain control of its message by publishing its own weekly accounts of its discussions, which evolved into the academic journals we know today.
The massive effort to develop vaccines for COVID-19 will boost and accelerate the development of cancer vaccines. Streamlined regulatory approvals will speed up the approval of all new medical treatments. Elon Musk is successfully executing a transformation to fully reusable rockets, mass-produced satellites, electric cars, electric trucks, and self-driving vehicles. Elon Musk will continue to execute and win.
In just 3 months, 1 British research team identified the first life-saving drug of the pandemic (and helped cancel hydroxychloroquine). The Recovery trial has an adaptive design, built to evaluate 6 different drugs at once, with methods and goals announced in advance.
Science funding mechanisms are too slow in normal times and may be much too slow during the COVID-19 pandemic. Fast Grants are an effort to correct this. If you are a scientist at an academic institution currently working on a COVID-19 related project and in need of funding, we invite you to apply for a Fast Grant. Fast Grants are $10k to $500k and decisions are made in under 14 days. If we approve the grant, you’ll receive payment as quickly as your university can receive it.
2021-06-15: They were a big success, here’s what they learned:
The first round of grants were given out within 48 hours. Later rounds of grants, which often required additional scrutiny of earlier results, were given out within 2 weeks. These timelines were much shorter than the alternative sources of funding available to most scientists. Grant recipients were required to do little more than publish open access preprints and provide monthly 1-paragraph updates. We allowed research teams to repurpose funds in any plausible manner, as long as they were used for research related to COVID-19. Besides the 20 reviewers, from whom perhaps 20-40 hours each was required, the total Fast Grants staff consisted of 4 part-time individuals, each of whom spent a few hours per week on the project after the initial setup.
We found it interesting that relatively few organizations contributed to Fast Grants. The project seemed a bit weird and individuals seemed much more willing to take the “risk”. We were very positively surprised at the quality of the applications. We didn’t expect people at top universities to struggle so much with funding during the pandemic. 32% said that Fast Grants accelerated their work by “a few months”. 64% of respondents told us that the work in question wouldn’t have happened without receiving a Fast Grant. We were disappointed that very few trials actually happened. This was typically because of delays from university institutional review boards (IRBs) and similar internal administrative bodies that were consistently slow to approve trials even during the pandemic. In our survey of the scientists who received Fast Grants, 78% said that they would change their research program “a lot” if their existing funding could be spent in an unconstrained fashion. We find this number to be far too high: the current grant funding apparatus does not allow some of the best scientists in the world to pursue the research agendas that they themselves think are best. Scientists are in the paradoxical position of being deemed the very best people to fund in order to make important discoveries but not so trustworthy that they should be able to decide what work would actually make the most sense!
Another effect of ASU’s pragmatic research culture is reducing overspecialization among academic disciplines. Crow and Dabar recognize that “specialization has been the key to scientific success” and that disciplines have historic value, but they worry that “such specialization simultaneously takes us away from any knowledge of the whole,” leaving us ill-prepared for the future. Disciplines make it harder to synthesize information, and if a university wants to remain a flexible knowledge enterprise, they need to be prepared to take a holistic approach. During Crow’s tenure, ASU consolidated quite a few academic departments, such as history and political science, or English and modern languages, in order to encourage these fields to solve real problems together – which also had the added benefit of reducing administrative costs. Instead of having to apply to ASU first, then, a student can start by doing the coursework, then enroll when they’re ready, with part of their degree already completed. This feels well-aligned to me with tech’s focus on prioritizing output over credentials. It’s also worth noting that because this experiment lives in Learning Enterprise, it doesn’t detract from the more traditional degree work that lives under Academic Enterprise. ASU also seems to share a lot of values that I cherish about tech, such as optimism, entrepreneurialism, and responsiveness, as well as being results-oriented, and it appears to be a culture that’s enforced from the top. While ASU might not run as fast as a startup, their culture of testing, prototyping, and reinvention seems rare for such a large institution.
2021-11-18: A fascinating look at all the research around Ivermectin, and what lessons to draw from it about the state of science
This is one of the most carefully-pored-over scientific issues of our time. 10s of teams published studies saying ivermectin definitely worked. Then most scientists concluded it didn’t. What a great opportunity to exercise our study-analyzing muscles! To learn stuff about how science works which we can then apply to less well-traveled terrain! If the lesson of the original replication crisis was “read the methodology” and “read the preregistration document”, this year’s lesson is “read the raw data”. Which is a bit more of an ask. Especially since most studies don’t make it available.
I worked on biomedical literature search, discovery and recommender web applications for many months and concluded that extracting, structuring or synthesizing “insights” from academic publications (papers) or building knowledge bases from a domain corpus of literature has negligible value in industry. Close to nothing of what makes science actually work is published as text on the web. Research questions that can be answered logically through just reading papers and connecting the dots don’t require a biotech corp to be formed around them. There’s much less logic and deduction happening than you’d expect in a scientific discipline.
Now founders and investors—including tech CEOs, crypto billionaires, bloggers, economists, celebrities, and scientists—are coming together to address stasis with experimentation. They’re building a fleet of new scientific labs to speed progress in understanding complex disease, extending healthy lifespans, and uncovering nature’s secrets in long-ignored organisms. In the process, they’re making research funding one of the hottest spaces in Silicon Valley.
Arc Institute
Problem: U.S. science funding attaches too many strings to our best researchers, preventing them from working on the most interesting problems.
Solution: Arc gives scientists no-strings-attached, multiyear funding so that they don’t have to apply for external grants.
Arcadia Science
Problem: Modern science is too siloed—both because researchers are too narrowly focused and because peer-reviewed journals stymie collaboration.
Solution: Expand the menu of species that we deeply research—and embrace an open-science policy.
New Science
Problem: Science is getting old, fast.
Solution: New Science sponsors young scientists.
Bringing all method advances together:
Altogether, these examples are largely restricted to specific disciplines: while research in genomics has pioneered the use of massive open databases, it rarely contains robustness checks or the pre-registration of methods. While the methods of clinical trials are required to be pre-registered, their analysis code is rarely shared publicly.
We believe that good practices from individual fields should serve as models for how science ought to work across the board, and that the scientific process should be radically reassembled from start to finish. How would it work?
To begin with, scientists would spend far more time clarifying the theories they are studying – developing appropriate measures to record data, and testing the assumptions of their research – as the meta-scientist Anne Scheel and others have suggested. Scientists would use programs such as DeclareDesign to simulate data, and test and refine their methods.
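DeclareDesign itself is an R package; the sketch below is just the same spirit in Python, with an assumed effect size and sample sizes chosen purely for illustration — declare the design, simulate it many times, and see whether the planned analysis would actually detect the effect before any data are collected:

```python
# Sketch of the "declare and simulate your design" idea (DeclareDesign is an R
# package; this is the same spirit in Python). The effect size and sample
# sizes are assumptions made up for illustration.
import random, statistics, math

def simulate_power(effect=0.3, n_per_arm=50, alpha=0.05, sims=2000):
    hits = 0
    for _ in range(sims):
        control = [random.gauss(0, 1) for _ in range(n_per_arm)]
        treated = [random.gauss(effect, 1) for _ in range(n_per_arm)]
        se = math.sqrt(statistics.variance(control) / n_per_arm +
                       statistics.variance(treated) / n_per_arm)
        z = abs(statistics.mean(treated) - statistics.mean(control)) / se
        p = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))
        hits += p < alpha
    return hits / sims

# Refine the design before collecting any data: is n = 50 per arm enough?
print(simulate_power(n_per_arm=50), simulate_power(n_per_arm=200))
```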
Instead of writing research in the form of static documents, scientists would record their work in the form of interactive online documents (such as in Markdown format). Past versions of their writing and analysis would be fully available to view through platforms such as Git and OSF. Robustness checks and multiverse analysis would be the norm, showing readers the impact of various methodological decisions interactively.
Once research is freed from the need to exist in static form, it can be treated as if it were a live software product. Analysis code would be standardized in format and regularly tested by code-checkers, and data would be stored in formats that were machine-readable, which would enable others to quickly replicate research or apply methods to other contexts. It would also be used to apply new methods to old data with ease.
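As a toy illustration of what “machine-readable, re-checkable results” could look like (the file fields and numbers below are hypothetical), a published claim stored as data can be re-verified automatically against the archived raw data:

```python
# Sketch of "results as machine-readable, re-checkable artifacts". The fields
# and numbers are hypothetical; the point is that a claim stored as data can be
# re-verified by anyone (or by a bot) against the archived raw data.
import json, statistics

raw_data = {"control": [2.1, 1.9, 2.4, 2.0], "treated": [2.6, 2.8, 2.5, 2.9]}
claimed = {"mean_difference": 0.6, "tolerance": 0.05}

def recheck(raw, claim):
    observed = statistics.mean(raw["treated"]) - statistics.mean(raw["control"])
    return abs(observed - claim["mean_difference"]) <= claim["tolerance"]

print(json.dumps({"claim_reproduced": recheck(raw_data, claimed)}))
```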
Some types of results would be stored in mass public databases with entries that would be updated if needed, and other researchers would reuse their results in further analysis. Citations would be viewer-friendly, appearing as pop-ups that highlight passages or refer to specific figures or datasets in prior research (each with their own doi codes), and these citations would be automatically checked for corrections and retractions.
Peer review would operate openly, where the wider scientific community and professional reviewers would comment on working papers, Red Teams would be contracted to challenge research, and comments and criticisms of studies would be displayed alongside them on platforms such as PubPeer. Journals would largely be limited to aggregating papers and disseminating them in different formats (for researchers, laypeople and policymakers); they would act, perhaps, in parallel with organizers of conferences. They could perform essential functions such as code-checking and copy-editing, working through platforms such as GitHub.
2022-02-07: Why Isn’t There a Replication Crisis in Math?
There’s a lot to say about the mathematics we use in social science research, especially the statistics, and how bad math feeds the replication crisis. But I want to approach it from a different angle. Why doesn’t the field of mathematics have a replication crisis? And what does that tell us about other fields, that do? 1 of the distinctive things about math is that our papers aren’t just records of experiments we did elsewhere. In experimental sciences, the experiment is the “real work” and the paper is just a description of it. But in math, the paper, itself, is the “real work”. Our papers don’t describe everything we do, of course. But the paper contains a (hopefully) complete version of the argument that we’ve constructed. And that means that you can replicate a math paper by reading it. Mathematicians have a pretty good idea of what results should be true; but so do psychologists! Mathematicians sometimes make mistakes, but since they’re mostly trying to prove true things, it all works out okay. Social scientists are also (generally) trying to prove true things, but it doesn’t work out nearly so well. Why not? In math, a result that’s too good looks just as troubling as one that isn’t good enough. If a study suggests humans aren’t capable of making reasoned decisions at 11:30, it’s confounded by something, even if we don’t know what.
2022-03-06: There’s much less of a replication crisis in Biology:
In biology, when one research team publishes something useful, then other labs want to use it too. Important work in biology gets replicated all the time—not because people want to prove it’s right, not because people want to shoot it down, not as part of a “replication study,” but just because they want to use the method. So if there’s something that everybody’s talking about, and it doesn’t replicate, word will get out.
To avoid rent dissipation and risk aversion, our state funding of science should be simplified and decentralized into Researcher Guided Funding. Researcher Guided Funding would take the ~$120b spent by the federal government on science each year and distribute it equally to the 250k full-time research and teaching faculty in STEM fields at high research activity universities, who already get 90% of this money. This amounts to about $500k for each researcher every year. You could increase the amount allocated to some researchers while still avoiding dissipating resources on applications by allocating larger grants in a lottery that only some of them win each year. 60% of this money can be spent pursuing any project they want, with no requirements for peer consensus or approval. With no strings attached, Katalin Karikó and Charles Townes could use these funds to pursue their world-changing ideas despite doubt and disapproval from their colleagues. The other 40% would have to be spent funding projects of their peers. This allows important projects to gain a lot of extra funding if a group of researchers are excited about it. With over 5000 authors on the paper chronicling the discovery of the Higgs Boson particle in the Large Hadron Collider, this group of physicists could muster $2.5b a year in funding without consulting any outside sources. This system would avoid the negative effects of long and expensive review processes, because the state hands out the money with very few strings, and risk aversion among funders, because the researchers individually get to decide what to fund and pursue.
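The headline arithmetic is easy to check from the figures quoted above (a back-of-the-envelope sketch, nothing more):

```python
# Back-of-the-envelope check of the proposal's headline numbers, using only the
# figures quoted in the excerpt: ~$120B/year over ~250k STEM faculty, 60/40 split.
total_budget = 120e9
researchers = 250e3
per_researcher = total_budget / researchers          # ~$480k, i.e. "about $500k"
discretionary = 0.6 * per_researcher                 # spent on their own projects
peer_directed = 0.4 * per_researcher                 # must be given to peers' projects
print(per_researcher, discretionary, peer_directed)  # 480000.0 288000.0 192000.0
```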
2022-10-20: Maybe not all hope is lost. It seems there’s improvement in preregistering studies, more data sharing etc.
There are encouraging signs that pre-registered study designs like this are helping address the methodological problems described above. Consider the following five graphs. The graphs show the results from 5 major studies, each of which attempted to replicate many experiments from the social sciences literature. Filled in circles indicate the replication found a statistically significant result, in the same direction as the original study. Open circles indicate this criterion wasn’t met. Circles above the line indicate the replication effect was larger than the original effect size, while circles below the line indicate the effect size was smaller. A high degree of replicability would mean many experiments with filled circles, clustered fairly close to the line. Here’s what these 5 replication studies actually found:
As you can see, the first 4 replication studies show many replications with questionable results – large changes in effect size, or a failure to meet statistical significance. This suggests a need for further investigation, and possibly that the initial result was faulty. The fifth study is different, with statistical significance replicating in all cases, and much smaller changes in effect sizes. This is a 2020 study by John Protzko et al that aims to be a “best practices” study. By this, they mean the original studies were done using pre-registered study design, as well as: large samples, and open sharing of code, data and other methodological materials, making experiments and analysis easier to replicate…In short, the replications in the fifth graph are based on studies using much higher evidentiary standards than had previously been the norm in psychology. Of course, the results don’t show that the effects are real. But they’re extremely encouraging, and suggest the spread of ideas like Registered Reports contribute to substantial progress.
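For concreteness, here is a hedged sketch of the classification the graphs use — filled versus open circles, above versus below the line — with invented numbers rather than results from any actual replication study:

```python
# Sketch of the classification used in the replication graphs above: a filled
# circle means the replication was statistically significant in the same
# direction as the original; the effect-size comparison says whether it falls
# above or below the diagonal. The example studies and numbers are made up.
def classify(original_effect, replication_effect, replication_p, alpha=0.05):
    same_direction = original_effect * replication_effect > 0
    filled = replication_p < alpha and same_direction
    below_line = abs(replication_effect) < abs(original_effect)
    return {"filled": filled, "below_line": below_line}

studies = [("study A", 0.62, 0.04, 0.32),   # (name, original d, replication d, replication p)
           ("study B", 0.85, 0.81, 0.001)]
for name, d_orig, d_rep, p in studies:
    print(name, classify(d_orig, d_rep, p))
```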
There’s also some interesting ideas about funding:
Fund-by-variance: Instead of funding grants that get the highest average score from reviewers, a funder should use the variance (or kurtosis or some similar measurement of disagreement) in reviewer scores as a primary signal: only fund things that are highly polarizing (some people love it, some people hate it). One thesis to support such a program is that you may prefer to fund projects with a modest chance of outlier success over projects with a high chance of modest success. An alternate thesis is that you should aspire to fund things only you would fund, and so should look for signal to that end: projects everyone agrees are good will certainly get funded elsewhere. And if you merely fund what everyone else is funding, then you have little marginal impact
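A minimal sketch of how fund-by-variance differs from mean-score funding (the reviewer scores below are made up):

```python
# Sketch of "fund-by-variance": rank proposals by reviewer disagreement rather
# than by mean score. Scores are hypothetical 1-10 reviewer ratings.
import statistics

proposals = {
    "consensus-good": [7, 7, 8, 7, 7],    # everyone mildly likes it
    "polarizing":     [10, 2, 9, 1, 10],  # some love it, some hate it
    "consensus-weak": [4, 4, 5, 4, 4],
}

print(sorted(proposals, key=lambda p: statistics.mean(proposals[p]), reverse=True))
print(sorted(proposals, key=lambda p: statistics.variance(proposals[p]), reverse=True))
# A conventional funder picks "consensus-good"; a fund-by-variance funder
# picks "polarizing", betting on outlier upside over modest certainty.
```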
2023-02-23: Registered reports for publishing negative results
The fundamental principle underpinning a Registered Report is that a journal commits to publishing a paper if the research question and the methodology chosen to address it pass peer review, with the result itself taking a back seat. For now, Nature is offering Registered Reports in the field of cognitive neuroscience and in the behavioral and social sciences. In the future, we plan to extend this to other fields, as well as to other types of study, such as more exploratory research.
Why are we introducing this format? In part to try to address publication bias, the tendency of the research system — editors, reviewers and authors — to favor the publication of positive over negative results. Registered Reports help to incentivize research regardless of the result. An elegant and robust study should be appreciated as much for its methodology as for its results. More than 300 journals already offer this format, up from around 200 in 2019. But despite having been around for a while, Registered Reports are still not widely known — or widely understood — among researchers. This must change. And, at Nature, we want to play a part in changing it.
2023-08-31: History and other disciplines are even worse than Psychology
Science’s replication crises might pale in comparison to what happens all the time in history, which is not just a replication crisis but a reproducibility crisis. Replication is when you can repeat an experiment with new data or new materials and get the same result. Reproducibility is when you use exactly the same evidence as another person and still get the same result — so it has a much, much lower bar for success, which is what makes the lack of it in history all the more worrying.
2024-04-12: Mandating open access as a condition of funding
The Bill & Melinda Gates Foundation, one of the world’s top biomedical research funders, will from next year require grant holders to make their research publicly available as preprints. The foundation also said it would stop paying for article-processing charges (APCs) — fees imposed by some journal publishers to make scientific articles freely available online for all readers, a system known as open access (OA).
In the next quarter to half century, further advances may lead to large-scale production of hydrogen gas for fuel cells and the ability to store in the ground both bio- and fossil-derived CO2. Microbes could enable the inexpensive production of hydrogen by consuming a hydrogenated feedstock and releasing H2, for example, splitting water with light or splitting hydrogen from biomass or even coal.
A superconducting motor would be very lightweight and far more efficient electrically, generating 3x the torque of a conventional electric motor for the same energy input and weight. In addition, an electric aircraft would be far quieter than a conventional jet as there are no internal combustion processes involved. Liquid hydrogen is cold enough to make the superconducting magnets work but also has 4x as much energy weight for weight than aviation fuel.
2008-01-11: Thermoelectric Energy Conversion. Using hydrogen as the working fluid: solid state, 60% efficient energy generation. Can be used with solar, internal combustion, turbines and other sources of waste heat.
The JTEC is an all solid-state engine that operates on the Ericsson cycle. Equivalent to Carnot, the Ericsson cycle offers the maximum theoretical efficiency available from an engine operating between 2 temperatures. The JTEC system utilizes the electro-chemical potential of hydrogen pressure applied across a proton conductive membrane (PCM). The membrane and a pair of electrodes form a Membrane Electrode Assembly (MEA) similar to those used in fuel cells. On the high-pressure side of the MEA, hydrogen gas is oxidized resulting in the creation of protons and electrons. The pressure differential forces protons through the membrane causing the electrodes to conduct electrons through an external load. On the low-pressure side, the protons are reduced with the electrons to reform hydrogen gas. This process can also operate in reverse. If current is passed through the MEA a low-pressure gas can be “pumped” to a higher pressure.
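Two bits of physics sit behind that description and are easy to sketch: the Ericsson/Carnot efficiency limit set by the two operating temperatures, and the Nernst voltage generated by the hydrogen pressure ratio across the membrane. The temperatures and pressure ratio below are illustrative, not the JTEC’s actual operating points:

```python
# Two numbers behind the JTEC description above (illustrative values, not the
# engine's actual operating points): the Carnot/Ericsson efficiency limit set
# by the two temperatures, and the Nernst voltage produced by a hydrogen
# pressure ratio across the proton-conductive membrane (H2 -> 2H+ + 2e-, n = 2).
import math

R, F = 8.314, 96485.0          # gas constant (J/mol/K), Faraday constant (C/mol)

def carnot_limit(t_hot_k, t_cold_k):
    return 1 - t_cold_k / t_hot_k

def nernst_voltage(t_k, p_high, p_low, n=2):
    return (R * t_k / (n * F)) * math.log(p_high / p_low)

print(carnot_limit(900, 300))          # ~0.67 between 900 K and 300 K
print(nernst_voltage(900, 30e5, 1e5))  # ~0.13 V per MEA at a 30:1 pressure ratio
```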
The JTEC could utilize heat from fuel combustion, solar, low grade industrial waste heat or waste heat from other power generation systems including fuel cells, internal combustion engines and combustion turbines. As a heat pump, the JTEC system could be used as a drop in replacement for existing HVAC equipment in residential, commercial, or industrial settings.
To complement the Scimitar engine, Reaction Engines has proposed a suitable vehicle configuration (A2) that attains the necessary subsonic and supersonic lift/drag ratio for efficient commercial operation. The airframe is designed to have adequate control authority about all axes to handle engine-out and to achieve pitch trim over the full Mach range. In addition the airframe configuration is an efficient structural shape with circular cross section hydrogen tankage and uninterrupted carry-through wing spars. The vehicle is sized to carry 300 passengers since this is typical of future supersonic transport designs and thought to be the minimum to achieve a competitive seat/km cost.
They also have an unpiloted, reusable spaceplane intended to provide inexpensive and reliable access to space.
The Skylon space plane will be ~$550/kg to orbit initially and $145/kg later, significantly cheaper than SpaceX, which is around $1500/kg; the Space Shuttle was $22k/kg. This is achieved with a lower mass ratio, because Skylon is air-breathing up to 25 km altitude. This is a very interesting design and I hope they succeed.
Boeing has unveiled the hydrogen-powered Phantom Eye unmanned airborne system for collecting data and communications, a demonstrator that will stay aloft at 20 km for up to 4 days.
Between this and the British unmanned fighter plane, an inflection point. 2014-03-14: Making hydrogen storage practical will make Tesla batteries look like amateur hour, and have huge implications.
DOE had hoped that a research team could pack in 7.5% hydrogen by weight by 2020. Li’s team has already crossed that threshold, with a hydrogen storage density of 9.5%. The team has also demonstrated the potential to reach an even higher density
2020-06-09: The push to make America run on hydrogen. This seems at least partially a boondoggle like ethanol.
Nikola’s truck cab—a 1000-horsepower system comprising carbon fiber tanks, hydrogen fuel and a fuel-cell stack—will push an 18-wheeler up to 1200km and weighs 9000kg. The same juice would demand a lithium-ion battery that would add at least 2000kg to a truck with the same range.
2020-12-10: Negative emission car. This is extremely silly, but then again I remember how much people enjoyed the display in a Prius that tells you how much energy you’re getting back from braking. Anything can be turned into a game.
The car is equipped with Toyota’s “Minus Emissions” technology, whereby air is taken into the vehicle, purified, sent into the fuel stack in order to generate electricity, then vented back out of the vehicle. By the time it comes out of the exhaust, the air has been scrubbed of any harmful chemicals and particulate matter 2.5 microns or greater in size. Are people around you likely to notice that you’ve left fresher air? Perhaps not, but you will: “The large 31cm center display includes an Air Purification display that shows the amount of air purified when driving through an easy-to-understand graphic of runners and digital display. It also includes an Air Purification meter that shows how much air is purified during acceleration. The meter enables the driver to feel the contribution that the new Mirai is making to the environment.”
Unlike other solid-to-liquid-fuel processes such as cornstarch into ethanol, this one will accept almost any carbon-based feedstock. If an 80 kg man fell into one end, he would come out the other end as 17 kg of oil, 3 kg of gas, and 3 kg of minerals, as well as 56 kg of sterilized water. While no one plans to put people into a thermal depolymerization machine, an intimate human creation could become a prime feedstock. “There is no reason why we can’t turn sewage, including human excrement, into a glorious oil”.
Just as we are hitting the Hubbert peak, we get a technology that may make oil rigs obsolete:
Andreassen and others anticipate that a large chunk of the world’s agricultural, industrial, and municipal waste may someday go into thermal depolymerization machines scattered all over the globe. If the process works as well as its creators claim, not only would most toxic waste problems become history, so would imported oil. Just converting all the US agricultural waste into oil and gas would yield the energy equivalent of 4B barrels of oil annually. In 2001 the United States imported 4.2B barrels of oil. “This technology offers a beginning of a way away from this.”
With their main (only?) source of income in danger, what will the middle east kleptocracies do?
Because “the only thing this process can’t handle is nuclear waste. If it contains carbon, we can do it,” and thermal depolymerization has proved to be 85% energy efficient for complex feedstocks, and even higher for relatively dry raw materials, such as plastics.
Held together by a slowly rotating system of currents northeast of Hawaii, the Eastern Garbage Patch is more than just a few floating plastic bottles washed out to sea; the Patch is a giant mass of trash-laden water 2x the size of Texas.
Declaring war on the “white pollution” choking its cities, farms and waterways, China is banning free plastic shopping bags and calling for a return to the cloth bags of old
2013-12-05: Depolymerization was hailed as the solution ~10 years ago: turning plastic back into more versatile compounds. I weirdly haven’t heard much about it since. Probably because no one cares about trash?
almost every facility like it in the country is running in the red. More than 2K municipalities are paying to dispose of their recyclables instead of the other way around.
Anything that requires constant vigilance (sorting) combined with subsidies isn’t going to work even medium-term. looks like recycling needs a big reboot. 2017-04-26: Plastic-eating worms. This sounds like one of those “obvious solutions”, like releasing rabbits in Australia to deal with a forgotten problem. Fear our future where the wax worm is up there with rust as a mortal enemy of civilization.
While other organisms can take weeks or months to break down even the smallest amount of plastic, the wax worm can get through more—in a far shorter period of time. The researchers let 100 wax worms chow down on a plastic grocery bag, and after just 12 hours they’d eaten 4% of the bag. That may not sound like much, but that’s a vast improvement over fungi, which weren’t able to break down a noticeable amount of polyethylene after 6 months.
Hydrothermal liquefaction could change the world’s polyolefin waste, a form of plastic, into useful products, such as clean fuels and other items. Once the plastic is converted into naphtha, it can be used as a feedstock for other chemicals or further separated into specialty solvents or other products. There is 1B tons of polyolefin waste in landfills.
2019-03-13: Plastic recycling never worked, and was a greenwashing effort by the industry, and dum-dums fell for it.
Even before China’s ban, only 9% of discarded plastic was being recycled, while 12% was burned. The rest was buried in landfills or simply dumped and left to wash into rivers and oceans. Without China to process plastic bottles, packaging, and food containers—not to mention industrial and other plastic waste—the already massive waste problem posed by our throwaway culture will be exacerbated, experts say. The planet’s load of nearly indestructible plastics—more than 8B tons have been produced worldwide over the past 60 years—continues to grow.
Companies like ExxonMobil, Shell, and Saudi Aramco are ramping up output of plastic to hedge against the possibility that a serious global response to climate change might reduce demand for their fuels. Petrochemicals now account for 14% of oil use, and are expected to drive 50% of oil demand growth between now and 2050. The World Economic Forum predicts plastic production will double in the next 20 years.
Every human on Earth is ingesting 2000 particles of plastic a week
2020-04-11: 90% breakdown of PET in under 10 hours. Process is still expensive and needs to scale further. 2020-07-08: Apples are the most contaminated fruit while carrots are the vegetables most affected. This is a much bigger problem than the performative efforts to clean up the great pacific garbage patch.
Throw a polyester sweater in the washing machine and it’ll come out nice and clean, but also not quite its whole self. As it rinses, millions of synthetic fibers will shake loose and wash out with the waste water, which then flows to a treatment plant. Each year, a single facility might pump 21B of these microfibers out to sea, where they swirl in currents, settle in sediments, and end up as fish food, with untold ecological consequences.
The company plans to use what it learns from the demonstration facility to build its first industrial plant, which will house a reactor 20x larger than the demonstration reactor. That full-scale plant will be built near a plastic manufacturer somewhere in Europe or the US, and should be operational by 2025. Manufacturing PET from enzymatic recycling could reduce greenhouse gas emissions between 17% and 43% compared to making virgin PET.
Last month, a group of marine biologists noticed something fishy in a video by a nonprofit called The Ocean Cleanup. “This is likely a staged video. I call bullshit.” In the 25-second clip, a large net appears to dump 4000 kg of plastic waste, including crates, buckets, and fishing gear, onto the deck of a ship. The Ocean Cleanup, which has raised more than $100m on the promise to rid plastic from the seas, said the trash in the video was just pulled from the Great Pacific Garbage Patch. “It’s like mopping up the spill when the spigot is still on. We can’t clean up our way out of plastic pollution.”
Craig Venter has been given ethical approval and a government grant to build the first artificial bacterium. He plans to create a single-celled organism with the minimum number of genes to sustain life.
This could be the drosophila of proteomics by having an idealized organism that is as simple as it can possibly be. Even then, this organism will still be orders of magnitude more complex than cellular automata. Now that I have more time, I need to delve into A New Kind of Science. 2008-09-12: Syn3.0
J. Craig Venter’s work to build an artificial bacterium with the smallest number of genes necessary to live takes current life forms as a template. Protocell researchers are trying to design a completely novel form of life that may never have existed.
We now know we can create a synthetic organism. It’s not a question of ‘if’, or ‘how’, but ‘when‘, and in this regard, think weeks and months, not years.
Venter’s team painstakingly whittled down the genome of Mycoplasma mycoides to reveal a bare-bones set of genetic instructions capable of making life. Syn3.0 contains just 473 genes or 531k bp, smaller than any independently replicating organism discovered on Earth to date. It is unclear what 149 of these genes do; also, not a single gene is shared across all of life.
With only 160k base pairs of DNA, the genome of Carsonella ruddii is less than 50% the size thought to be the minimum necessary for life. Carsonella lives inside a leaf-munching insect, called a psyllid. They have a symbiotic relationship. The bacteria’s sheltered life has allowed it to pare its genome down to the bare minimum. There are certain genes necessary for life that the bacteria’s genome lacks, but these are compensated for by its insect host.
2022-02-25: And now some new work blends Alife with minimal life:
The minimal cell the team modeled, JCVI-syn3A, is an updated version of one presented in Science in 2016. Its genome is designed after that of the very simple bacterium Mycoplasma mycoides, but stripped of genes that were not essential for life. JCVI-syn3A gets by with 493 genes, but no one knows what 94 of those genes do except that the cell dies without them. To build the new model, the team took an abundance of findings from various fields and wove them together. They used flash-frozen, thin-sliced images of the minimal cell to position its organic machinery precisely. A massive protein analysis helped them sprinkle all the right known proteins inside, and a detailed analysis of the cell membrane’s chemical composition helped them place molecules correctly on the outside. A thorough map of the cell’s biochemistry provided a rulebook for the interactions of the molecules.
As the digital cell grew and divided, 1000s of simulated biochemical reactions occurred, revealing how every molecule behaved and changed over time. The simulations mirrored many measurements of living JCVI-syn3A cells in culture. But they also predicted characteristics of the cells that hadn’t yet been noticed in the lab such as how the cell portions out its energy budget and how quickly its messenger RNA molecules degrade, a fact that critically affects researchers’ understanding of how the cell regulates genes. With a complete enough model, the researchers should be able to get creative: They can see what happens if they prune biochemical pathways, drop in extra molecules or set the simulation in a different environment. The results should give more insights into which processes cells need to survive — and which they don’t. They might even offer glimpses into what the very first cells required billions of years ago.
Some biologists are now combining approaches. Their goal is to create an integrated view of life inside the cell, in the form of a computer simulation that puts the whole system into motion. In grad school, Villa studied under Klaus Schulten, who helped develop the field of whole-cell computational modeling. Klaus worked from the bottom up, favoring “all-atom” simulations, in which virtual atoms follow the laws of quantum mechanics, while Zan worked from the top down, with “kinetic” models that track the cell’s larger traffic patterns. By the 2010s, the state of knowledge had advanced enough for them to try building a hybrid model. Klaus died in 2016. But, last month, Zan’s group published a paper in Cell that outlined a computational model of JCVI-syn3A. The model drew on cryo-EM images from Villa’s lab and on a genetic inventory supplied by J.C.V.I. It included all 452 of JCVI-syn3A’s proteins, plus other cellular bits. In the simulation, these parts interact among themselves as they would in real life.
The software aims to simulate a world that’s very different from ours. If a cell were blown up to the size of a high-school gym, you wouldn’t be able to see across it. It would be filled with 10000s of proteins, most about the size of a basketball. Other biomolecules no bigger than your hand, and water molecules the size of your thumb, would fill the spaces between. (To scale, your whole body would be about the size of a ribosome.) The mixture would have the consistency of hair gel. In such a world, gravity would be virtually meaningless—you would be weightless, as if suspended in a ball pit. And everything would be moving. The mixture would buzz constantly; spend just a few seconds inside it and every medium-sized object around you would have explored every square inch of your body. It would feel like pandemonium, but it wouldn’t be.
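The “kinetic” side of such models typically boils down to stochastic simulation of reaction events — the Gillespie algorithm. Here is a toy version for a single synthesis/decay reaction with made-up rates, nothing like the full JCVI-syn3A model, just to show the machinery:

```python
# A toy Gillespie (stochastic simulation) run -- the standard workhorse behind
# kinetic models like the one described above. This is just mRNA synthesis and
# decay with made-up rates, not the actual JCVI-syn3A model.
import random

def gillespie(t_end=600.0, k_make=0.5, k_decay=0.01, mrna=0):
    t, trajectory = 0.0, [(0.0, mrna)]
    while t < t_end:
        rates = [k_make, k_decay * mrna]       # propensities of each reaction
        total = sum(rates)
        t += random.expovariate(total)         # time to the next reaction event
        if random.random() < rates[0] / total:
            mrna += 1                          # a transcript is synthesized
        else:
            mrna -= 1                          # a transcript degrades
        trajectory.append((t, mrna))
    return trajectory

traj = gillespie()
print("final mRNA count:", traj[-1][1])        # fluctuates around k_make / k_decay = 50
```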
2022-03-13: More on cryo-ET and why it is such a big deal.
cryo-ET has evolved tremendously over the past 20 years. Advancements in the field will continue in the years to come, significantly enhancing our knowledge of prokaryotic cell biology. Those enhancements include but are not limited to improved sample preparation workflows, advances in hardware and software, and the curation of the vast amount of data into publicly available resources. Depending on the collection scheme and chosen magnification, fast tomography could cut collection time per target by 50–75%, vastly multiplying the amount of data that can be collected per sample. Resources such as the Caltech Electron Tomography Database have laid the groundwork for building comprehensive collections of cryo-ET data. In turn, some of this information has been translated into resources such as the Atlas of Bacterial and Archaeal Cell Structure. This open access, digital resource provides detailed information about the prokaryotic ultrastructure of 85 species, which can be used as a source for comparison of structures in different strains, education, and comparison of sample treatments.
There’s also other microscopy techniques that have lower resolution, but can create 3D movies:
By combining 2 imaging technologies into Multimodal Optical System with Adaptive Imaging Correction (MOSAIC), scientists can now watch in unprecedented 3D detail as cancer cells crawl, spinal nerve circuits wire up, and immune cells cruise through a zebrafish’s inner ear.
We went into the study thinking JCVI-syn3B simply wouldn’t be able to contend with the “inevitable mutations [that are] going to hit one of those essential genes”. The team pitted JCVI-syn3B against the first-generation JCVI-syn1.0 from which it was derived. Each strain grew for 2k generations. Although both strains rapidly mutated, JCVI-syn3B could flexibly modify its genes like JCVI-syn1.0, even though the latter had far more genetic letters to tolerate random mutations. Both bacterial strains survived similar types of genetic changes—insertions, deletions, and the switching of genetic letters—without a hitch.
“The initial effects of genome reduction were quite large; they made the cells sick”. Their fitness dropped by 50%. Fast-forward 2k generations, and it was a different picture. The minimal cells bounced back, regaining a fitness rate similar to their non-minimal cousins. Despite harboring a bare-bones genome, they readapted to their surroundings and overcame initial genetic shortfalls. The minimal cells’ main lifeline seemed to be “metabolic innovation.” Rather than adapting themselves to slurp more nutrients from the surrounding broth, the cells instead increased their ability to synthesize molecular pieces of fat into an outer protective layer, without sacrificing the lipid molecules essential for regeneration.