lengthy interview with avie tevanian, very interesting.
Unix at 50
Maybe its pervasiveness has long obscured its origins. But Unix, the operating system that in one derivative or another powers nearly all smartphones sold worldwide, was born 50 years ago from the failure of an ambitious project that involved titans like Bell Labs, GE, and MIT. Largely the brainchild of a few programmers at Bell Labs, the unlikely story of Unix begins with a meeting on the top floor of an otherwise unremarkable annex at the sprawling Bell Labs complex in Murray Hill, New Jersey.
Minimizing pointer privilege
We have adapted a complete C, C++, and assembly-language software stack, including the open source FreeBSD OS (nearly 800 UNIX programs and more than 200 libraries, including OpenSSH, OpenSSL, and bsnmpd) and the PostgreSQL database, to employ ubiquitous capability-based pointer and virtual-address protection.
rm -rf

Leopard, What a mess
Sorry to burst your bubble, but using a Mac makes you a Power User, not a tried and true Unix guru. Go get a real OS, you losers.
+1. like, what is up with Terminal.app not having working scrollback? and where is copy & paste support in X.app?
bash history unleashed
you can have multiple bash instances append to the same bash history. who knew?
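for the record, a minimal ~/.bashrc sketch of the setup that makes it work (drop the history -n part if you do not want other shells' commands pulled into the current one):

# append to the history file on exit instead of overwriting it
shopt -s histappend

# write each command to the file as soon as it runs, and read in
# lines written by other running shells
PROMPT_COMMAND='history -a; history -n'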
Semi-structured data pipelines
the cocooners are musing about a filesystem that is aware of semi-structured data.
Imagine having an XSL processor in the kernel:
You could “execute” .xsl files without having to run a processor manually:
prompt$ page2html.xsl < input.xml > output.html
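incidentally, you can fake this today without touching the kernel: on linux, binfmt_misc (assuming it is mounted) can map the .xsl extension to an interpreter. a rough sketch, with xsltproc standing in for the processor and a made-up wrapper called xsl-run:

#!/bin/sh
# /usr/local/bin/xsl-run: binfmt_misc passes the stylesheet path as $1,
# the xml document arrives on stdin
exec xsltproc "$1" -

# register the extension (as root, one time), then mark stylesheets executable
prompt$ echo ':xsl:E::xsl::/usr/local/bin/xsl-run:' > /proc/sys/fs/binfmt_misc/register
prompt$ chmod +x page2html.xsl
prompt$ ./page2html.xsl < input.xml > output.html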
this jibes well with the notion of making xml the default programming model, something i would like to see (i like xml much more than java 🙂). i wrote about this two years ago. damn, has it been two years already?
Infosets are pipes
xml applications are infoset pipelines.
henry thompson, w3c
one of the most powerful concepts in computer science is arguably abstraction. by building on abstractions, it has become somewhat possible to manage complexity. productivity advances and new approaches are more often than not based on abstracting existing ideas and piling on top of them.
xml is such a story as well. like everyone and their brother, i long thought xml was basically a nice way to do markup, sort of html done right. which led to the question of what all the hype was about. back then i attributed the hype to the groundbreaking insight that simplicity matters. other than that, i failed to see what xml could be useful for.
as it turns out, tags and markup are irrelevant. what matters are infosets, or the information that is contained in xml. with the advent of xml schema it has become possible to add another layer of abstraction to the markup. you no longer have to think of your data in terms of tags and markup, but rather in terms of its types. what does that mean?
it means you can concern yourself with the (simple or complex) types you encounter in your problem space. like the notion of an address. you do not care how an address is encoded, you just care about its type. xml schema allows you to extract that information out of your data. once you have such rich, structured data, you can do a lot with it.
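a minimal sketch of such an address type, assuming libxml2's xmllint for validation (the element names are just illustrative, and some-record.xml is a stand-in for your data):

prompt$ cat > address.xsd <<'EOF'
<?xml version="1.0"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <!-- consumers care about this structure, not about how the
       tags happen to be spelled on the wire -->
  <xs:element name="address">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="street" type="xs:string"/>
        <xs:element name="city"   type="xs:string"/>
        <xs:element name="zip"    type="xs:string"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
EOF
prompt$ xmllint --noout --schema address.xsd some-record.xml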
take, for instance, the concept of pipes, another powerful abstraction and fundamental to the unix way. proposed by doug mcilroy and implemented by ken thompson, the idea of little programs that sequentially work on each other's output has inspired decades of operating system design.
now one of the basic assumptions of the pipes idea was that the data flowing through them is plain character data. enhance this concept with xml, that is, rich, strongly typed data, and you have the foundation for a lot of new, very powerful ideas. this is what i currently understand xml to be about, and what xml protocols and ultimately the xml processing model are building on.
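to make the contrast concrete, a quick sketch; the two stylesheets are hypothetical, and xsltproc merely stands in for any infoset-aware stage:

# the classic pipe: every stage sees an untyped character stream
prompt$ grep error app.log | sort | uniq -c

# the same shape over infosets: each stage consumes and produces
# structured, typed xml instead of raw text
prompt$ xsltproc extract-items.xsl feed.xml | xsltproc render-html.xsl - > out.html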
(heavily inspired by this keynote talk by henry thompson)