Archive for February, 2006

Boing Boing promotes supply-side anti-censorship

Monday, February 27th, 2006

Not the context I imagined, but Boing Boing is calling for supply-side anti-censorship:

What happens when the blogosphere uses so much tasteful nudity that the web is unusable for SmartFilter users? What happens when SmartFilter blocks so much content that the web is crippled for its users?

No, of course they aren’t calling it supply-side anti-censorship (bad name anyway, sorry). Here you go:

Identity in, Identity out

Sunday, February 26th, 2006

I’ve briefly mentioned digital identity before, but the launch of ClaimID (more below) prompts me to write down my over-the-top theory concerning digital identity as a great absolute productivity equalizer. The theory in pseudocode:

function work_on_digital_identity(num_workers, avg_worker_skill) { return 0; }

This comes from the observation that hangers-on who can barely string buzzwords together into a semi-coherent sentence, very smart people, single-person projects, and huge organizations composed of either all produce the same level of useful results when applied to the “problem” of digital identity: nil.

No, I’m not naming names and yes, this is an extreme caricature. Thanks for letting me get that off my chest.

So ClaimID is a new site that encourages you to catalog links about yourself and link to your resulting ClaimID page from your blog or home page. Someone who ClaimID says does a good job of explaining ClaimID writes:

Say you’re a college student with a weblog and you post your foolish thoughts under your real name. Or you’re active in some newsgroups or mailing lists.

Time passes, you graduate and decide to look for a job. Of course, the prospective employer will want to do a search on your name, but what will they find? Oops! Too bad you didn’t think of that before!

Enter ClaimID.com. ClaimID will give you a place that you can point people to, to say “Here’s the ‘me’ I’m proud of!”

Salvation! Before ClaimID it was not possible to create a web page with links you are proud of and without links you are not proud of. If you think giving out your clean ClaimID page is going to prevent a prospective employer from finding embarrassing links, I have a bridge to sell you.

On the plus side ClaimID is a well designed web application, even if it does nothing useful yet, and useful features are easy to imagine, e.g., a platform that is not a walled garden (imagine!), tracking, a copyright or other registry, perhaps even more.

Many of these could be built into popular blogging software (for example) but using a future ClaimID or something similar would be far easier and more robust for most people.

For what it’s worth I think the killer app for decentralized authentication (yes, I’m sloppily mixing overlapping concepts, oh well — but on that note this and this look pretty interesting) is private blogging, or more generally selective information sharing. Currently the only way to do this on the web, apart from running your own walled garden, is to use someone else’s, e.g., LiveJournal. I never understood the popularity of LJ until a year or so ago, when someone told me people use it to write entries that only friends may access.

Admit defeat, not error!

Saturday, February 25th, 2006

William F. Buckley admits that the U.S. military adventure in Iraq is a defeat, but willfully fails to learn anything from it.

It is healthier for the disillusioned American to concede that in one theater in the Mideast, the postulates didn’t work. The alternative would be to abandon the postulates.

His two postulates amount to an assumption that wherever the U.S. intervenes people will act in accordance with U.S. politicians’ wishes. Nevermind that this doesn’t even work within the U.S. jurisdiction.

Buckley attributes defeat solely to “Iraqi animosities.” Even if that were the sole cause, blame can be pinned firmly on U.S. politicians, who were very well aware of Shiite/Sunni/Kurd/Christian/etc. “animosities”, as leveraging these was a major component of U.S. policy toward Iraq after the 1991 Gulf War. However, Buckley ignores economic mismanagement and doubtless many other idiocies endemic to political management, nevermind military-political management. To acknowledge these would be to accept blame and teeter on the edge of admitting error.

If Buckley hopes to fence off his “postulates” (and thus U.S. policy) from criticism by admitting defeat in this one instance I hope he fails miserably, but I fully expect he and other advocates of interventionism will succeed in this subversion of truth. The long history of poor outcomes of U.S. intervention in the Middle East, elsewhere, and within the U.S. jurisdiction (domestically) is forgotten completely and is never learned from.

I have probably suggested too many times that prediction markets could help remind voters that the most likely outcomes are not those predicted by politicians.

On a related note: So what if Iraq splits? A jurisdiction is not a sacred entity.

Via Mike Godwin. You must check out Godwin’s awesome site design. (Don’t worry, I still hate Macs.)

Free as in free pollution parking

Saturday, February 25th, 2006

Tyler Cowen cites Donald Shoup’s The High Cost of Free Parking, which claims that “On average [in the U.S.] a new parking space has cost 17 percent more than a new car.” If I were lured by the temptation of urban policy I would certainly read this book.

I gather Shoup’s argument is that if zoning did not require minimum numbers of spaces and if market rates were charged for parking, there would not be wasteful spaces built in uncongested areas and it would be possible to find parking in congested areas.

Shoup probably covers this, but one of the baneful effects of free or underpriced parking (e.g., cheap area parking permits in San Francisco) is opposition to dense development. Additional residents mean more competition for spaces, giving residents all the reason they need to go into NIMBY mode, leaving a stunted cross between a vile place and the wonderful Sanhattan it could be. (Of course there’s much more to the story. I’d point to some Matt Smith columns and a feature published on the 50th anniversary of the founding of the United Nations in San Francisco, if the relevant archive search weren’t so broken.)

Certain control freaks now want to swing from requiring a certain number of parking spaces to prohibiting more than a certain number of spaces. How about letting people build or not build however many spaces they see fit? The problem is not under- or over-provision of private spaces, it is the underpricing of public spaces.

How about auctioning area parking permits — what politician doesn’t love a windfall? Existing permit holders could share in the windfall as power dictates. New residents would pay market prices. I’m sure Shoup has many more and better thought out proposals.

A related urban transport micro-rant: light rail is an atrocity. No faster than buses and far more expensive, dangerous, space-wasting and inflexible, light rail serves only monument-building fantasies. If real rapid transit is infeasible, just add or upgrade buses.

Addenda:

  • A complement or partial alternative to market prices for parking is to charge for road use, as in central London.
  • Anti-light rail articles.
  • Politically-controlled underpricing of water (especially for agricultural use, e.g., in California) and energy (primarily in oil exporting jurisdictions) doubtless causes far greater problems worldwide than underpriced parking.

Google Brin Creator

Thursday, February 23rd, 2006

Now that Google has a product (Page Creator) named* for cofounder and current President of Products Larry Page, it clearly needs a technology named for cofounder and current President of Technology Sergey Brin.

“Brin” doesn’t have an obvious meaning, so perhaps the technology could be something more compelling than a page creator. How about a Basic Reality Interface Neuroplant?

I’ll take two Google Brins for starters — one to replace each eye — better portals to see the portal, including its (soon to be) millions of crappy Google Pages.

* Not really.

Supply-side anti-censorship

Friday, February 17th, 2006

Brad Templeton explains why a censor should want an imperfect filter — it should be good enough to keep verboten information from most users, but easy enough to circumvent to tempt dissidents, so they can be tracked and, when desired, put away.

In the second half of the post, Templeton suggests a couple of anti-censor techniques, which he says are “far off” and “does not scale”, respectively. To say the least, I’d add.

Cyber-activists have long dreamed that strong encryption would thwart censorship. Freenet is an example of a project that uses this as its raison d’être. While I’m a huge fan of ubiquitous encryption and decentralization (please install them, now!), these seem like terribly roundabout means of fighting censorship — the price of obtaining information, which includes the chance of being caught, is lowered. But someone has to seek out or have the information pushed to them in the first place. If information is only available via hidden channels, how many people will encounter it, regardless of lower risk?

An alternative, perhaps less sexy because it involves no technology adoption, is supply-side anti-censorship: make verboten information ubiquitous. Anyone upset about google.cn should publish information the Communist Party wants censored (my example is pathetic, need to work on that). This is of course not mutually exclusive with continuing to carp and dream of techno-liberation.

I guess I’m calling for projects. Or one of those chain letters (e.g., “four things”) that plague the blogosphere.

content.exe is evil

Thursday, February 16th, 2006

I occasionally run into people who think users should download content (e.g., music or video) packaged in an executable file, usually for the purpose of wrapping the content with DRM where the content format does not directly support DRM (or the proponent’s particular DRM scheme). Nevermind the general badness of Digital Restrictions Management, requiring users to run a new executable for each content file is evil.

Most importantly, every executable is a potential malware vector. There is no good excuse for exposing users to this risk. Even if your executable content contains no malware and your servers are absolutely impenetrable such that your content can never be replaced with malware, you are teaching users to download and run executables. Bad, bad, bad!

Another problem is that executables are usually platform-specific and buggy. Users have enough problems getting the correct codec installed. Why take a chance that they might not run Windows (and the specific versions and configurations you have tested, sure not to exist in a decade, or much less)?

I wouldn’t bother to mention this elementary topic at all, but very recently I ran into someone well intentioned who wants users to download content wrapped in a jar file, if I understand correctly for the purposes of ensuring users can obtain content metadata (most media players do a poor job of exposing content metadata and some file formats do a poor job of supporting embedded metadata, though hardly anyone cares — this is tilting at windmills) and so that content publishers can track use (this is highly questionable), all from a pretty cross-platform GUI. A jar file is an executable Java package, so the platform downside is different (Windows is not required, but a Java installation, of some range of versions and configurations, is), but it is still an executable that can do whatever it wants with the computer it is running on. Bad, bad, bad!

The proponent of this scheme said that it was ok, the jar file could be signed. This is no help at all. Anyone can create a certificate and sign jar files. Even if a creator did have to have their certificate signed by an established authority it would be of little help, as malware purveyors have plenty of resources that certificate authorities are happy to take. The downsides are many: users get a security prompt (“this content signed by…”) for content, which is annoying, misleading as described above, and conditions the user to not pay attention when they install things that really do need to be executable; and a barrier is raised for small content producers.

If you really want to package arbitrary file formats with metadata, put everything in a zip file and include your UI in the zip as HTML. This is exactly what one P2P vendor’s Packaged Media File format is. You could also make your program (which users download only once) look for specific files within the zip to build a content-specific (and safe) interface within your program. I believe this describes the Kapsules format, though I can’t find any technical information.
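
To make that concrete, here is a rough sketch in Python of such a package (the file names, titles and URL are made up for illustration): nothing in it is executable, and the index.html works in any browser.

    import zipfile

    # Hypothetical package contents; only plain data files, no executables.
    with zipfile.ZipFile("album.zip", "w") as pkg:
        pkg.write("track01.ogg")  # the actual content, assumed to exist locally
        pkg.writestr("metadata.txt",
                     "Title: Example Album\n"
                     "Artist: Example Artist\n"
                     "More info: http://example.com/album\n")
        pkg.writestr("index.html",
                     "<html><body><h1>Example Album</h1>"
                     "<p><a href='track01.ogg'>Track 1</a></p>"
                     "<p><a href='http://example.com/album'>More from this artist</a></p>"
                     "</body></html>")

A program that knows to look for metadata.txt and index.html can build its own interface from them; a program that doesn’t can still hand the zip to the user intact.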

Better yet, put your content on the web, where users can find and view it (in the web design of your choice), you get reasonable statistics, and the bad guys don’t get fed. You can even push this to 81/19 by including minimal but accurate metadata embedded in your files if they support it — a name users can search for or a URL for your page related to the content.

Most of the pushers of executable content I encounter, when faced with security concerns, say it is an “interesting and hard problem.” No, it is a stupid and impossible problem. In contrast to the web, executable content is a 5/95/-1000 solution — that last number is a .

If you really want an interesting and hard problem, executable content security is the wrong level. Go work on platform security. We can now run sophisticated applications within a web browser with some degree of safety (due to Java applet and Flash sandboxes, and JavaScript security). Something similar could be pushed down to the desktop, so that executables by default have no more rights to tamper with your system than web pages do. There are aggressive approaches to this problem. If that sounds too hard and not interesting enough (you really wanted to distribute “media”), go the web way as above — it is subsuming the desktop anyhow.

CodeCon Extra

Monday, February 13th, 2006

A few things I heard about at CodeCon outside the presentations.

Vesta was presented at CodeCon 2004, the only one I’ve missed. It is an integrated revision control and build system that guarantees build repeatability, in part by ensuring that every file used by the build is under revision control. I can barely keep my head around the few revision control and build systems I occasionally use, but I imagine that if I were starting (or saving) some large mission-critical project that found everyday tools inadequate, it would be well worth considering Vesta. About its commercial equivalents, I’ve mostly heard second-hand complaining.

Allmydata is where Zooko now works. The currently Windows-only service allows data backup to “grid storage”, presumably a distributed store of the peer-to-peer sort. Dedicate 10GB of local storage to the service and you can back up 1GB, free. Soon you’ll be able to pay for better ratios, including $30/month for 1TB of space. I badly want this service. Please make it available, and for Linux! Distributed backup has of course been a dream P2P application forever. Last time I remember the idea getting attention was a Cringely column in 2004.

Some people were debating whether the Petname Tool does anything different from what existing proposals specify and whether either would make phishing substantially harder. The former is debated in comments on Bruce Schneier’s recent post on petnames, inconclusively AFAICT. The Petname Tool works well and simply for what it does (Firefox only), which is to allow a user to assign a name to an https site if it is using strong encryption. If the user visits the site again and it is using the same certificate, the user will see the assigned name in a green box. Any other site, including one that merely looks like the original (in content or URL), or even one that has hijacked DNS and appears to be “secure” but uses a different certificate, will appear as “untrusted” in a yellow box. That’s great as far as it goes (see phollow the phlopping phish for a good description of the attack this would save a reasonable user from), though the naming seems the least important part — a checkbox to begin trusting a site would be nearly as good. I wonder though how many users have any idea that some pages are secure and others are not. The Petname Tool doesn’t do anything for non-https pages, so the user becomes inured to seeing it doing nothing, then does not see it. Perhaps it should be invisible when not on a secure site. Indicators like PageRank, Alexa rank (via the Google and Alexa toolbars) and similar, danger ratings like SiteAdvisor’s, and whether the visitor has previously visited the site in question would all help warn the user that any site may not be what they expect — nearly everyone, including me, confers a huge amount of trust on non-https sites, even if they never engage in a financial transaction on such a site. I imagine a four-part security indicator in a prominent place in the browser, with readings of site popularity (rank), danger as measured by the likes of SiteAdvisor, the user’s relationship with the site (petname) and whether the connection is strongly encrypted.
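
For the curious, the core mechanism is tiny. A toy sketch in Python (not the Petname Tool’s actual code) that keys petnames on certificate fingerprints:

    import hashlib

    petnames = {}  # certificate fingerprint -> user-assigned petname

    def fingerprint(der_cert_bytes):
        # Identify a certificate by a hash of its DER encoding.
        return hashlib.sha256(der_cert_bytes).hexdigest()

    def assign_petname(der_cert_bytes, name):
        petnames[fingerprint(der_cert_bytes)] = name

    def check_site(der_cert_bytes):
        # Green box: a certificate the user has already named.
        # Yellow box: anything else, even a site that looks identical.
        name = petnames.get(fingerprint(der_cert_bytes))
        return ("trusted", name) if name else ("untrusted", None)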

Someone claimed that three letter agencies want to mandate geolocation for every net access device. No doubt some agency types dream of this. Anyway, the person said we should be ready to fight this if there were a real push for such a law, because what would happen to anonymity? No doubt such a mandate should be fought tooth and nail, but preserving anonymity seems like exactly the wrong battle cry. How about privacy, or even mere freedom? On that note, someone briefly showed a tiny computer attached to and powered by what could only be called a solar flap. This could be slapped on the side of a bus and would connect to wifi networks whenever possible and route as much traffic as possible.

CodeCon Sunday

Monday, February 13th, 2006

Dido. I think this provides AGI, or a way to script voice response systems with a voice template system analogous to scripting and HTML templates for web servers, though questioners focused on a controversial feature to reorder menus based on popularity. The demo didn’t really work, except as a demonstration of everyone’s frustration with IVRs, as an audience member pointed out.

Deme. Kitchen sink collaboration web app. They aren’t done putting dishes in the sink. They’re thinking about taking all of the dishes out of the sink, replacing the sink, and putting the dishes back in (PHP to something cooler). Let’s vote on what kind of vote to put this to.

Monotone. Elegant distributed revision control, uses SHA1 hashes to identify files and repository states. Hash of previous repository state included in current repository state, making lineage cryptographically provable. Merkle tries used to quickly determine file-level differences between repositories (for sync). Storage and (especially) merge and diff are loosely coupled. Presentation didn’t cover day-to-day use, probably a good decision in terms of interestingness. The revision control presentations have been some of the best every year at CodeCon. They should consider having two or three next year. Monotone may be the only project presented this year that had a Wikipedia article before the conference.
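
A stripped-down illustration in Python (nothing Monotone-specific) of why chaining state hashes makes lineage provable: each revision id commits to its parent’s id, so tampering with any ancestor changes every descendant id.

    import hashlib

    def revision_id(parent_id, file_hashes):
        # The new state's identity covers the parent's identity and the
        # hashes of every file in the tree.
        payload = (parent_id or "") + "".join(sorted(file_hashes))
        return hashlib.sha1(payload.encode()).hexdigest()

    files_v1 = [hashlib.sha1(b"hello\n").hexdigest()]
    rev1 = revision_id(None, files_v1)
    files_v2 = [hashlib.sha1(b"hello world\n").hexdigest()]
    rev2 = revision_id(rev1, files_v2)  # depends on rev1, hence on all history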

Rhizome. Unlike Gordon (and perhaps most people), hearing “RDF triple” doesn’t make my eyes glaze over, but I’m afraid this presentation did. Some of the underlying code might be interesting, but this was the second to last presentation, and the top-level project, Rhizome, amounts to yet another idiosyncratic wiki, with the idiosyncratic dial turned way up.

Elkhound/Elsa/Oink/Cqual++. Parser generator that handles ambiguous grammars in a straightforward manner, with a C++ parser and tools built on top of it. Can find certain classes of bugs with a reasonable false positive rate. Expressed confidence that future work would lead to the compiler catching far more bugs than usually thought possible (as opposed to catching them only at runtime). Cool and important stuff, too bad I only grok it at a high level. Co-presenter Dan Wilkerson (and sole presenter on Saturday of Delta) is with the Open Source Quality Project at UC Berkeley.

Saturday
Sunday 2005

CodeCon Saturday

Sunday, February 12th, 2006

Delta. Arbitrarily large codebase triggers specific bug. Run Delta, which attempts to provide you with only the code that triggers the bug (usually a page or so, no matter the size of the codebase) via an algorithm whose evaluation function requires triggering the bug and rewards smaller code. Sounds like a big productivity and quality booster where it can be used.
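
I haven’t read Delta’s source, so this is only the general shape of such a minimizer, sketched in Python: keep deleting chunks of the input as long as the caller-supplied check still triggers the bug, preferring smaller inputs.

    def minimize(lines, triggers_bug):
        # triggers_bug(candidate_lines) must build/run the candidate and
        # report whether the original bug still appears.
        chunk = max(1, len(lines) // 2)
        while chunk >= 1:
            i, shrunk = 0, False
            while i < len(lines):
                candidate = lines[:i] + lines[i + chunk:]
                if candidate and triggers_bug(candidate):
                    lines, shrunk = candidate, True  # keep the smaller version
                else:
                    i += chunk
            if not shrunk:
                chunk //= 2
        return lines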

Djinni. Framework for approximating hard problems, supposedly faster and easier to use than more academically oriented approximation frameworks. An improved simulated annealing algorithm is or will be in the mix, including an analog of “pressure”. Super annoying presentation style. Thank you for letting us know that CodeCon is where the rubber meets the road.
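
For reference, a bare-bones simulated annealing loop in Python; whatever Djinni’s “pressure” analog adds is not reconstructed here, this is just the textbook temperature schedule.

    import math, random

    def anneal(initial, cost, neighbor, temp=1.0, cooling=0.995, steps=10000):
        state, best = initial, initial
        for _ in range(steps):
            candidate = neighbor(state)
            delta = cost(candidate) - cost(state)
            # Always accept improvements; accept worse moves with a
            # probability that shrinks as the temperature drops.
            if delta < 0 or random.random() < math.exp(-delta / temp):
                state = candidate
                if cost(state) < cost(best):
                    best = state
            temp *= cooling
        return best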

iGlance. Instant messaging with audio and video, consistent with the IM metaphor (recipient immediately hears and sees initiator) rather than the telephone metaphor (recipient must pick up call). Very low bitrate video buddy lists. Screen and window sharing with single control and dual pointers so that the remote user can effectively point over your shoulder. Impressive for what seems to be a one-person spare-time project. Uses the OVPL and OVLPL licenses, very similar to the GPL and LGPL but apparently making it easier to handle contributor agreements, so the project owner can move code between application and library layers. Why not just make the entire application LGPL?

Overlay Anycast Service InfraStructure. Locality-aware server selection (to be) used by various services, easy to implement for your own service. Network locality correlates highly with geographic locality due to the speed of light bound. Obvious, but the graph was neat. OpenDHT was also mentioned, another hosted service. OpenDHT clients can use OASIS to find a gateway. Super easy to play with a deployment with around 200 nodes. Someone has built a fileshare using OpenDHT; see Octopod. As Wes Felter says, this stuff really needs to be moved to a non-research network.
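
The do-it-yourself approach that OASIS is meant to save every service from reimplementing looks roughly like this (Python, hypothetical gateway names): probe a few candidates and take the lowest round-trip time.

    import socket, time

    def rtt(host, port=80, timeout=2.0):
        # One TCP connect as a crude round-trip-time probe.
        start = time.time()
        try:
            socket.create_connection((host, port), timeout).close()
            return time.time() - start
        except OSError:
            return float("inf")

    def closest(gateways):
        return min(gateways, key=rtt)

    # Gateway names are made up; substitute real candidates.
    print(closest(["gw1.example.org", "gw2.example.org"]))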

Query By Example. Find and rank rows [dis]similar to others in SQL, using an extension to the database which uses a classifier under the hood (the latter is not visible to the user). Sounds great for data mining engagements.
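
Not the project’s actual SQL syntax, which I didn’t note down, but the idea in toy Python form: rank rows by distance to a set of example rows, and flip the sort to get the most dissimilar rows instead.

    def rank_by_example(rows, examples, dissimilar=False):
        # rows and examples are tuples of numeric columns.
        def distance(row):
            return min(sum((a - b) ** 2 for a, b in zip(row, ex)) for ex in examples)
        return sorted(rows, key=distance, reverse=dissimilar)

    people = [(25, 50000), (40, 90000), (31, 62000)]
    print(rank_by_example(people, [(30, 60000)]))        # most similar first
    print(rank_by_example(people, [(30, 60000)], True))  # most dissimilar first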

Friday
Saturday 2005