Post Computers

Energy encryption

Saturday, September 8th, 2007

Steve Omohundro’s talk at today’s Singularity Summit made the case that a self-improving machine would be a rational economic actor, seeking to eliminate biases that get in the way of maximizing its utility function. Omohundro threw in one purely speculative method of self-preservation — “energy encryption” — by which he meant that an entity’s energy would be “encrypted” such that it could not be used by another entity that attacks in order to get access to more energy.

I note “energy encryption” here because it sounds neat but seems impossible, and I can find no evidence of the term being used this way before Omohundro (there is a crypto library with the name).

The “seems impossible” part perhaps means the concept should not be mentioned again outside a science fantasy context, but I realized it could be used with artistic license to describe something that has evolved in a number of animals — prey that is poisonous, or tastes really bad. What’s the equivalent for the hypothetical machine in a dangerous part of the galaxy? A stock of antimatter?

I also found one of Omohundro’s other self-preservation strategies slightly funny in the context of this summit — a self-aware AI will (not should, but as a consequence of being a rational actor) protect its utility function (“duplicate it, replicate it, lock it in safe place”), for if the utility function changes, its actions make no sense. So, I guess the “most important question facing humanity” is taken care of. The question, posed by the Singularity Institute for Artificial Intelligence, organizer of the conference:

How can one make an AI system that modifies and improves itself, yet does not lose track of the top-level goals with which it was originally supplied?

I suppose Omohundro did not intend this as a dig at his hosts (he is an advisor to SIAI) and that my interpretation is facile at best.

Addendum: Today Eliezer Yudkowsky said something like Omohundro is probably right about goal preservation, but current decision theory doesn’t work well with self-improving agents, and it is essentially Yudkowsky’s (SIAI) research program to develop a “reflective decision theory” such that one can prove that goals will be preserved. (This is my poor paraphrasing. He didn’t say the words “reflective decision theory”, but see hints in a description of SIAI research and a SL4 message.)

Moore’s law for software

Monday, August 27th, 2007

There’s been a fair bit written about ‘Moore’s law for software’, usually complaining that there isn’t one. My guess is that’s nuts, but I’d love to see some rigorous analysis (I bet I’m just ignorant of it).

Interesting tidbit from a San Jose Mercury-News article two weeks ago, “Penny-pinching entrepreneurs changing world of venture capital”:

Ten years ago, six or seven programmers would have been needed to achieve the results of one programmer today, valley veterans say.

If true, that’s an annual increase in programmer productivity of about twenty percent. Let’s say it’s actually half that, due to exaggeration or Brooks’ Law (adding headcount to a software project doesn’t scale well–though on second thought Brooks’ Law could magnify productivity increases, by allowing teams to get smaller). That would make for a doubling time of about seven years. Not nearly as impressive as Moore’s Law doubling of transistor density every two years, but still exponential. And my wild guess is that it has been fairly consistent over the history of programming.
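A quick sanity check on that arithmetic (a sketch only; the 6.5× midpoint and the halved growth rate are just the assumptions from the paragraph above):

```python
import math

# Mercury-News claim: one programmer today does the work of 6-7 a decade ago.
factor = 6.5   # midpoint of "six or seven"
years = 10

annual_growth = factor ** (1 / years) - 1
print(f"implied annual productivity growth: {annual_growth:.1%}")  # roughly 21%

# Discount the claim by half, as suggested above.
discounted = annual_growth / 2
doubling_time = math.log(2) / math.log(1 + discounted)
print(f"doubling time at {discounted:.1%}/year: {doubling_time:.1f} years")  # roughly 7 years
```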

For my five-year-old impressions on the matter, see this thread.

Addendum: Depending (in part) on how far back you consider the history of programming to go, of course a consistent doubling time for software (or hardware) doesn’t make sense, but rather a doubling time that itself shrinks over time. Doubtless Ray Kurzweil has many graphs attempting to demonstrate this for software in his books. I didn’t intend to go there in this post, but it is timely, as I’ll probably attend the Singularity Summit in a couple weekends.

LimeWire more popular than Firefox?

Saturday, May 5th, 2007

LimeWire is supposedly installed on nearly one in five PCs. “Current installation share” for filesharing programs according to BigChampagne and PC Pitstop:

1. LimeWire (18.63%)
2. Azureus (3.43%)
3. uTorrent (3.07%)
4. BitTorrent (2.58%)
5. Opera (2.15%)
6. Ares (2.15%)
7. BitComet (1.99%)
8. eMule (1.98%)
9. BearShare (1.64%)
10. BitLord (1.38%)

It’s a little odd to include all those BitTorrent clients, given their very different nature. All but LimeWire, Ares, eMule, and BearShare are BT-only, at least in their P2P download component (Opera is mainly a web browser, with built-in BT support). Recent versions of LimeWire and Ares also support BT, so another provocative headline would be “LimeWire the most popular BitTorrent client?”

Usage share for Firefox (among surveys publishing numbers in 2007) ranges from 11.69% to 14.32%. Of course usage share is very different from installation share (compare Opera installation share above at 2.15% and recent usage share between 0.58% and 0.77%), and P2P filesharing and download clients have different usage patterns, so any comparison is apples to oranges. However, if one could extrapolate from the Opera numbers for installation and usage, LimeWire is not more popular than Firefox.
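A rough sketch of that extrapolation, using only the numbers cited above (the shaky assumption being that LimeWire’s usage-to-installation ratio would match Opera’s):

```python
# Rough extrapolation from the figures above. Shaky assumption: LimeWire's
# usage-to-installation ratio matches Opera's.
opera_install = 2.15            # % installation share (BigChampagne/PC Pitstop)
opera_usage = (0.58, 0.77)      # % usage share range (browser surveys)
limewire_install = 18.63        # % installation share

for usage in opera_usage:
    estimated = limewire_install * (usage / opera_install)
    print(f"estimated LimeWire 'usage' share: {estimated:.1f}%")
# Roughly 5.0% to 6.7% -- well under Firefox's 11.69%-14.32% usage share.
```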

LimeWire is still impressively popular. This probably is mostly due to open source being less susceptible to censorship than proprietary software (which has a half-life shortened by legal attack in the case of P2P). Still, I’d like to see LimeWire gain more recognition as an open source success story than it typically gets.

The really interesting speculation concerns how computing (and ok, what may or may not have been called Web 2.0) would have been different had P2P not been under legal threat for seven or so years. Subject for another post. We can’t go back, but I think it’s very much worth trying to get to a different version of there.

Yes, I know about significant digits. I’m just repeating what the surveys say.

Comparative advantage

Thursday, April 19th, 2007

Philip Greenspun reporting from the Digital Freedom Exposition in South Africa:

My personal view is that it is not the job of computer nerds to keep people free of disease. We build interesting Web sites and other services to make life interesting and worth living as long as the biologists and doctors are able to keep folks alive. Even if human life expectancy were reduced to 30 years, we shouldn’t abandon our keyboards and move into the medical labs since even a 30-year life can be significantly enriched with Google and Wikipedia.

He understands comparative advantage.

Another fun excerpt from the same post:

More than food, shelter, or other seeming essentials, they wanted Internet access, starting with an Internet cafe for women in the capital (under the Islamic regime, only men were allowed to visit Internet cafes).

Double whammy on those who complain that people attempting to bring technology to the poor should focus on basic needs first.

Ubuntu upgrades

Sunday, March 25th, 2007

I initially installed Ubuntu Linux 5.10 on my new Dell Inspiron 6000 in November, 2005. I fully expect it to begin having assorted hardware problems this year with the amount of use I give it, though hopefully using an external keyboard (excellent, and only $4.99) and mouse at work will extend its life.

I upgraded to 6.06 shortly after its release but didn’t get around to blogging it. I encountered two problems:

  • I had to re-install i915resolution to get back full 1920×1200 screen resolution (1600×1200 without it)
  • Resume from suspend-to-memory was broken. A fix in the form of an upgraded acpi-support package was available in the next couple days.

I didn’t upgrade to 6.10 upon release, mostly because I just didn’t get around to it. I saw a couple days ago that 7.04beta is out, so I finally upgraded to 6.10. The upgrade process went without a hitch, but resume from suspend-to-memory was broken again. I’m sure there’s an easy fix, but I decided to take the plunge and upgrade to 7.04beta.

Everything now works, though as Brad Templeton notes the upgrade process stopped several times, waiting on my answers to unnecessary questions. Apart from this annoyance, which hopefully will be fixed before 7.04 gets out of beta, I remain impressed with Ubuntu and how it is progressing. I’m also reminded of how rapidly free software projects are doing new feature releases. I don’t think there are any visible applications I use on this laptop that haven’t been significantly improved in the last 17 months.

Say’s law and bandwidth

Monday, February 12th, 2007

Ed Felten asks How Much Bandwidth is Enough? (emphasis added):

It is a matter of faith among infotech experts that (1) the supply of computing and communications will increase rapidly according to Moore’s Law, and (2) the demand for that capacity will grow roughly as fast. This mutual escalation of supply and demand causes the rapid change we see in the industry.

Funny how that seems to happen.

Thus far, whenever more capacity comes along, new applications are invented (or made practical) to use it. But will this go on forever, or is there a point of diminishing returns where more capacity doesn’t improve the user’s happiness?

There’s always a point at which purchasing more bandwidth doesn’t make sense given the price of bandwidth and other goods. But will there ever be a point at which more bandwidth, even at zero price, has no utility? I doubt it.

There is a plausible argument that a limit exists. The human sensory system has limited (though very high) bandwidth, so it doesn’t make sense to direct more than a certain number of bits per second at the user. At some point, your 3-D immersive stereo video has such high resolution that nobody will notice any improvement. The other senses have similar limits, so at some point you have enough bandwidth to saturate the senses of everybody in the home. You might want to send information to devices in the home; but how far can that grow?

Human sensory system? The home? By the time there is enough bandwidth to max out the human sensory system and auxiliary devices, humans will not be important on the scene.

Mal engineering awareness

Saturday, February 10th, 2007

An article on OLPC and social engineering appeared on the wiki a couple weeks ago. It closes with very brief calls for capability security and agoric computing, unsurprisingly, considering the source.

But I wanted to point out the article’s proposal for mitigating social engineering:

The best place to defeat the hoax is in the mind of the intended victim. How? With educational tools shipped on the OLPC itself. Suppose the computer had a training course that taught each student-owner how to run the hoax himself.

This strikes a chord with me because I already think “we” (artists, bloggers, programmers, preachers, friends — see friends don’t let friends click spam) should promote not engaging spammers and scammers and because I’m annoyed by the practice of computer vendors (HP/Compaq anyway) pre-loading consumer Windows machines with scads of “special offer” programs that are annoyances at best and would fairly be considered malware if they didn’t come preinstalled.

Instead of bombarding a new user with vendor-approved spam the first time a computer is turned on, an enlightened consumer PC vendor (I include OLPC here) would show a brief safe computing video. Support costs might even be reduced by such a move.

On the technical side OLPC posted a summary of their security platform. While much is left to the imagination at this point (there’s an annoying lack of references or even buzzwords in the specification), it sounds like OLPC programs could get a whole lot less authority than those on any mass platform so far.

iCandy, Patented!

Tuesday, January 9th, 2007

Tom Evslin says Apple Fails to Reinvent Telecommunications Industry:

Steve Jobs claims that iPhone will “reinvent” the telecommunications sector. Wish it were so but it ain’t!

The design of the phone – no hard buttons, all touch on screen, sounds like everything we expect from Steve and from Apple: it’s all about the GUI and that part’ll be fun. But the business relationship is as old school as it can get: exclusive US distributorship through Cingular

Short term this is a good tactic for Apple because it protects the iPod franchise for a while. Long term I think it’s terrible strategy. It invites an endrun from someone who IS willing to reinvent the industry or simply allies themselves with a Cingular competitor.

Remember how wonderful the Mac GUI was? But it only ran on machines from Apple. Remember how crappy Windows was at first? But it ran on machines from everyone and their brother. And now there’s Linux – even less restricted – running on anything that moves. Tell me again why it makes sense to have a phone that runs only on a service from at&t (in the US).

Like other Apple products, the iPhone is eye candy (ugly to me), but not revolutionary.

It looks like the FIC Neo1973, showing at CES, due to ship this quarter for US$350 and running the OpenMoko platform (presentation), will be more of a move in the right direction — unlocked and open for developers. Andy on the openmoko list has a very early comparison.

The Neo1973’s big missing feature, at least initially, is apparently Wi-Fi, due to a lack of open drivers. As a late adopter of gadgets, I can wait. I acquired my first and only mobile phone in 2003, and it’s easy on the eyes.

That said, I’d really like Portable online by 2010 to be true:

This claim is judged YES if and only if, by January 1, 2010, in any state with more than 5 million inhabitants, at least 25% of the adult population are “portably online”. A “state” can be a country or a member state in a federation.

Read more for how “portably online” is defined (the contract was written in 1995). My current guess (and the market’s; last trade at 30) is that, without a more significant revolution than we’re seeing, the criteria won’t be met before 2010, but will be not terribly long after.

OpenMoko via Jon Phillips. Second word in post title refers to a silly slide found at Engadget.

Free software needs hardware entrepreneurs

Saturday, September 23rd, 2006

Luis Villa:

I’m boggled that Fedora, OpenSuse, and Ubuntu, all of whom have open or semi-open build systems now, are not actively seeking out Emperor and companies like Emperor, and helping them ship distros that are as close to upstream- and hence most supportable- for everyone. Obviously it is in RH, Canonical, and Novell’s interests to actively pursue Big Enterprise Fish like HP and Dell. But I’m really surprised that the communities around these distros haven’t sought out the smaller, and potentially growing, companies that are offering computers with Linux pre-installed.

Sounds exactly right to me. I’ve been thinking something similar for awhile, but as the post title suggests, focused on hardware vendors. Tons of them compete to sell Linux servers at the very low and very high ends and everything in between, but if you want a pre-installed Linux laptop you need to pay a hefty premium for slightly out of date hardware from someone like Emperor Linux. It seems like there’s an opportunity for a hardware vendor to sell a line of Linux laptops that aren’t merely repurposed Windows machines. It has seemed like this for something like a decade though, and as far as I know HP and a couple others have only tentatively and temporarily offered a few lame configurations.

So I’d like to see a hardware startup (or division of an existing company) sell a line of laptops designed for Linux, where everything “just works” just as it does on Macs, and for the same reasons — limited set of hardware to support, work on the software until it “just works” on that hardware. There’s probably even some opportunity for Apple-like proprietary control over some aspects of the hardware. Which reminds me, what legal barriers, if any, would someone who wants to manufacture the OLPC design face? There is discussion of a commercial subsidiary for the project:

The idea is that a commercial subsidiary could manufacture and sell a variation of the OLPC in the developed world. These units would be marked up so that there would be a significant profit which can be plowed into providing more units in countries who cannot afford the full cost of one million machines.

The discussions around this have talked about a retail price of 3× the cost price of the units.

In any case Villa is right, distributions should be jumping to support hardware vendors, both the mundane and innovative sorts. Which Red Hat/Fedora is doing in the case of OLPC.

Update 20060926: In comments below Villa points out system76, which approaches what I want, except that their prices are mediocre and they don’t offer high resolution displays, which I will never do without again. David points out olpcnews.com, which looks like reasonable independent reporting on OLPC. I asked on the OLPC wiki about other manufacturers’ use of the design.

LinuxWorld San Francisco

Monday, August 21st, 2006

Brief thoughts on last week’s LinuxWorld Conference and Expo San Francisco.

Lawrence Lessig’s opening keynote pleased the crowd and me. A few points of interest:

  • Free speech is a strong aspect of free culture; Lessig at least implicitly pushed for a liberal interpretation of fair use, saying that the ability to understand, reinterpret and remake video and other multimedia is “the new literacy” and important to the flourishing of democracy.
  • The “read/write Internet”, if allowed to flourish, is a much bigger market than the “read only Internet.”
  • Support free standards and free software for media, including Ogg.
  • In 1995 only crazies thought it possible to build a viable free software operating system (exaggeration from this writer’s perspective); now only crazies think wireless can solve the last mile competition problem. Go build free wireless networks and prove the telcos and pro-regulation lawyers (including the speaker) wrong.
  • One of the silly video mashups Lessig played was Jesus Will Survive, featuring an adult Jesus in diapers hit by a bus. A few people left the auditorium at this point.

I’ve at least visited the exhibition space of almost every LWCE SF (the first one, actually in San Jose, was the most fun — Linus was a rock star and revolution was in the air). This year’s seemed bigger and more diverse, with most vendors pushing business “solutions” as opposed to hardware.

By far the most interesting exhibition booth to me was Cleversafe, an open source dispersed storage project that announced a Linux filesystem interface at the conference and was written up in today’s New York Times and Slashdot. I’ve been waiting for something like this for a long time, particularly since Allmydata is not open source and does not support Linux.

Also, Creative Commons won a silly “Best Open Source Solution” show award.

Addendum 20080422: If you’re arriving from an unhinged RedState blog post, see Lessig’s response.