Post P2P

LimeWire more popular than Firefox?

Saturday, May 5th, 2007

LimeWire is supposedly installed on nearly one in five PCs. “Current installation share” for filesharing programs according to BigChampagne and PC Pitstop:

1. LimeWire (18.63%)
2. Azureus (3.43%)
3. uTorrent (3.07%)
4. BitTorrent (2.58%)
5. Opera (2.15%)
6. Ares (2.15%)
7. BitComet (1.99%)
8. eMule (1.98%)
9. BearShare (1.64%)
10. BitLord (1.38%)

It’s a little odd to include all those BitTorrent clients, given their very different nature. All but LimeWire, Ares, eMule, and BearShare are BT-only in their P2P download component (Opera is mainly a web browser, with built-in BT support). Recent versions of LimeWire and Ares also support BT, so another provocative headline would be “LimeWire the most popular BitTorrent client?”

Reported usage share for Firefox (for surveys publishing numbers in 2007) ranges from 11.69% to 14.32%. Of course usage share is very different from installation share (compare Opera’s installation share above, 2.15%, with its recent usage share of 0.58% to 0.77%), and P2P filesharing and download clients have different usage patterns, so any comparison is apples to oranges. However, if one extrapolates from Opera’s installation and usage numbers, LimeWire is not more popular than Firefox.
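A back-of-envelope sketch of that extrapolation, using only the survey percentages quoted above. The key assumption is mine, not the surveys’: that LimeWire’s usage-to-installation ratio resembles Opera’s.

```python
# Survey figures quoted above (illustrative, not authoritative).
opera_install = 2.15            # % installation share (BigChampagne/PC Pitstop)
opera_usage = (0.58, 0.77)      # % usage share range (browser surveys)
limewire_install = 18.63        # % installation share
firefox_usage = (11.69, 14.32)  # % usage share range

# Assume LimeWire's usage/installation ratio matches Opera's (big assumption).
low = limewire_install * opera_usage[0] / opera_install
high = limewire_install * opera_usage[1] / opera_install
print(f"extrapolated LimeWire usage share: {low:.1f}% to {high:.1f}%")
# Roughly 5% to 6.7% -- well under Firefox's 11.69% floor.
```

Even the high end of the extrapolated range falls short of the lowest Firefox figure, which is all the comparison claims.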

LimeWire is still impressively popular. This is probably mostly due to open source being less susceptible to censorship than proprietary software (whose half-life is shortened by legal attack in the case of P2P). Still, I’d like to see LimeWire get more recognition as an open source success story than it typically does.

The really interesting speculation concerns how computing (and ok, what may or may not have been called Web 2.0) would have been different had P2P not been under legal threat for seven or so years. Subject for another post. We can’t go back, but I think it’s very much worth trying to get to a different version of there.

Yes, I know about significant digits. I’m just repeating what the surveys say.

Jamendo ad revenue share with artists

Tuesday, January 16th, 2007

Jamendo is one of the most interesting music sites on the net (in terms of business, community, and technology — though there’s no competition yet for the vastness and bizarreness to be found on archive.org). They’re trying every Web 2.0 trick and have somehow managed to avoid becoming overwhelmed with crap. I’ve listened to dozens of the 2,100 albums on Jamendo. While only a small fraction have strongly agreed with my taste, just about everything (weighted toward electronica and French rock) sounds professional.

Now Jamendo has introduced an advertising revenue sharing program with participating artists.

jamendo revenue share

Several video sites are attempting variations on this theme (among them Lulu.tv and others), but as far as I know Jamendo’s is the first attempt in the audio space. One might think an audio site would have a harder time making web advertising work than a video site (videos are usually watched within a web page and can have clickable ad areas or bumpers even if not), but I gather that listening via (usually Flash-based) audio players embedded in web pages is increasingly common (and Jamendo upgraded theirs recently), as will be media players that “play” a web page in a browser interface.

One data point: although Jamendo heavily promotes download of high quality copies, primarily via BitTorrent, their statistics indicate that low quality http “streaming” has accounted for more bandwidth. There are many obvious caveats here, but I think all points above indicate that advertising-supported web audio should not be ruled out, even if it is granted that web video has more potential.

Digg Jamendo’s revenue share page.

BlackNet is a wiki?

Sunday, January 7th, 2007

Wikileaks, currently vapor, may be a joke. If Wikileaks is not a joke and if it successfully exposes a large number of secrets, I’d find it hilarious to see this happening on a public website and without financial incentives. P2P, digital cash, information markets, and crypto anarchy? Nope, just a wiki and a community.

Wikileaks FAQ:

WikiLeaks will be the outlet for every government official, every bureaucrat, every corporate worker, who becomes privy to embarrassing information which the institution wants to hide but the public needs to know. What conscience cannot contain, and institutional secrecy unjustly conceals, WikiLeaks can broadcast to the world.

Untraceable Digital Cash, Information Markets, and BlackNet (1997, but these ideas spread widely in the early 1990s):

One of the most interesting applications is that of “information markets,” where information of various kinds is bought and sold. Anonymity offers major protections for both buyers and sellers, in terms of sales which may be illegal or regulated. Some examples: corporate secrets, military secrets, credit data, medical data, banned religious or other material, pornography, etc.

Why is more information not leaked on the net already? The technology exists to do so anonymously and has for a long time. Why is there not (or to what extent is there) a market for secrets? Again, the technology exists.

Perhaps lack of the relevant institutions in each case. One could email secrets or post to a blog anonymously, but what then? Will anyone notice? One could want to sell secrets, but how to find a buyer?

If Wikileaks succeeds it will be because it will provide, or rather its community will be, the relevant institution. Again from the Wikileaks FAQ:

WikiLeaks opens leaked documents up to a much more exacting scrutiny than any media organization or intelligence agency could provide: the scrutiny of a worldwide community of informed wiki editors.

Instead of a couple of academic specialists, WikiLeaks will provide a forum for the entire global community to examine any document relentlessly for credibility, plausibility, veracity and falsifiability. They will be able to interpret documents and explain their relevance to the public. If a document is leaked from the Chinese government, the entire Chinese dissident community can freely scrutinize and discuss it; if a document is leaked from Somalia, the entire Somali refugee community can analyze it and put it in context.

I have not read the Wikileaks email archived at cryptome.

Defeatist dreaming

Sunday, October 22nd, 2006

Jimmy Wales of Wikipedia says to dream a little:

Imagine there existed a budget of $100 million to purchase copyrights to be made available under a free license. What would you like to see purchased and released under a free license?

I was recently asked this question by someone who is potentially in a position to make this happen, and he wanted to know what we need, what we dream of, that we can’t accomplish on our own, or that we would expect to take a long time to accomplish on our own.

One shouldn’t look a gift horse in the mouth, and this could do a great deal of good, particularly if the conditions (“can’t accomplish on our own…”) are stringently adhered to.

However, this is a blog and I’m going to complain.

Don’t fork over money to the copyright industry! This is defeatist and exhibits static world thinking.

$100 million could fund a huge amount of new free content, free software, free infrastructure and supporting institutions, begetting more of the same.

But if I were a donor with $100 million to give I’d try really hard to quantify my goals and predict the most impactful spending toward those goals. I’ll just repeat a paragraph from last December 30, Outsourcing charity … to Wikipedia:

Wikipedia chief considers taking ads (via Boing Boing) says that at current traffic levels, Wikipedia could generate hundreds of millions of dollars a year by running ads. There are strong objections to running ads from the community, but that is a staggering number for a tiny nonprofit, an annual amount that would be surpassed only by the wealthiest foundations. It could fund a staggering Wikimedia Foundation bureaucracy, or it could fund additional free knowledge projects. Wikipedia founder Jimmy Wales has asked what will be free. Would an annual hundred million dollar budget increase the odds of those predictions? One way to find out before actually trying.

Via Boing Boing via /.

Scientology of sharing

Tuesday, October 17th, 2006

Last month I watched The Bridge, a scientology docudrama, after hearing about it on Boing Boing. It is a pretty well done and low key film, considering the nuttiness of scientology.

Copyright is one of the weapons scientology uses to hide the hilarious absurdity of its beliefs, so it is no surprise that The Bridge has been taken down (at least some of the copies) from YouTube, Google, and the Internet Archive.

I remember that it was published to the Archive under a Creative Commons Attribution-NonCommercial-NoDerivs license. Sadly http://www.archive.org/details/BrettHanoverTheBridge is in neither the Wayback Machine nor WebCite, so I can’t demonstrate this. If I am correct, the filmmaker has no cause to stop non-commercial distribution, as CC licenses are irrevocable.

If you can’t find the film on the lightnet, fire up a filesharing client and click on the link below to start your P2P search and download.

Scientology-The_Bridge.mp4

LinuxWorld San Francisco

Monday, August 21st, 2006

Brief thoughts on last week’s Conference and Expo San Francisco.

Lawrence Lessig’s opening keynote pleased the crowd and me. A few points of interest:

  • Free speech is a strong aspect of free culture. Lessig at least implicitly pushed for a liberal interpretation of fair use, saying that the ability to understand, reinterpret, and remake video and other multimedia is “the new literacy” and important to the flourishing of democracy.
  • The “read/write Internet”, if allowed to flourish, is a much bigger market than the “read only Internet.”
  • Support free standards and free software for media, including Ogg and others.
  • In 1995 only crazies thought it possible to build a viable free software operating system (exaggeration from this writer’s perspective), now only crazies think wireless can solve the last mile competition problem. Go build free wireless networks and prove the telcos and pro-regulation lawyers (including the speaker) wrong.
  • One of the silly video mashups Lessig played was Jesus Will Survive, featuring an adult Jesus in diapers hit by a bus. A few people left the auditorium at this point.

I’ve at least visited the exhibition space of almost every LWCE SF (the first one, actually in San Jose, was the most fun — Linus was a rock star and revolution was in the air). This year’s seemed bigger and more diverse, with most vendors pushing business “solutions” as opposed to hardware.

By far the most interesting exhibition booth to me was Cleversafe, an open source dispersed storage project that announced a Linux filesystem interface at the conference and was written up in today’s New York Times and Slashdot. I’ve been waiting for something like this for a long time, particularly since Allmydata is not open source and does not support Linux.

Also, Creative Commons won a silly “Best Open Source Solution” show award.

Addendum 20080422: If you’re arriving from an unhinged RedState blog post, see Lessig’s response.

Free software needs P2P

Friday, July 28th, 2006

Luis Villa on my constitutionally open services post:

It needs a catchier name, but his thinking is dead on- we almost definitely need a server/service-oriented list of freedoms which complement and extend the traditional FSF Four Freedoms and help us think more clearly about what services are and aren’t good to use.

I wasn’t attempting to invent a name, but Villa is right about my aim — I decided not to mention the four freedoms because I felt my thinking too muddled to be dignified with such a mention.

Kragen Sitaker doesn’t bother with catchy names in his just posted draft essay The equivalent of free software for online services. I highly recommend reading the entire essay, which is as incisive as it is historically informed, but I’ve pulled out the problem:

So far, all this echoes the “open standards” and “open formats” discussion from the days when we had to take proprietary software for granted. In those days, we spent enormous amounts of effort trying to make sure our software kept our data in well-documented formats that were supported by other programs, and choosing proprietary software that conformed to well-documented interfaces (POSIX, SQL, SMTP, whatever) rather than the proprietary software that worked best for our purposes.

Ultimately, it was a losing game, because of the inherent conflict of interest between software author and software user.

And the solution:

I think there is only one solution: build these services as decentralized free-software peer-to-peer applications, pieces of which run on the computers of each user. As long as there’s a single point of failure in the system somewhere outside your control, its owner is in a position to deny service to you; such systems are not trustworthy in the way that free software is.

This is what has excited me about decentralized systems since long before P2P filesharing.

Luis Villa also briefly mentioned P2P in relation to the services platforms of Amazon, eBay, Google, Microsoft and Yahoo!:

What is free software’s answer to that? Obviously the ’spend billions on centralized servers’ approach won’t work for us; we likely need something P2P and/or semantic-web based.

Wes Felter commented on the control of pointers to data:

I care not just about my data, but the names (URLs) by which my data is known. The only URLs that I control are those that live under a domain name that I control (for some loose value of control as defined by ICANN).

I hesitated to include this point because I hesitate to recommend that most people host services under a domain name they control. What is the half-life of http://blog.john.smith.name vs. http://johnsmith.blogspot.com or js@john.smith.name vs. johnsmith@gmail.com? Wouldn’t it suck to be John Smith if everything in his life pointed at john.smith.name and the domain was hijacked? I think Wes and I discussed exactly this outside CodeCon earlier this year. Certainly it is preferable for a service to allow hosting under one’s own domain (as Blogger and several others do), but I wish I felt a little more certain of the long-term survivability of my own [domain] names.

This post could be titled “freedom needs P2P” but for the heck of it I wanted to mirror “free culture needs free software.”

Constitutionally open services

Thursday, July 6th, 2006

Luis Villa provokes, in a good way:

Someone who I respect a lot told me at GUADEC ‘open source is doomed’. He believed that the small-ish apps we tend to do pretty well will migrate to the web, increasing the capital costs of delivering good software and giving next-gen proprietary companies like Google even greater advantages than current-gen proprietary companies like MS.

Furthermore:

Seeing so many of us using proprietary software for some of our most treasured possessions (our pictures, in flickr) has bugged me deeply this week.

These things have long bugged me, too.

I think Villa has even understated the advantage of web applications — no mention of security — and overstated the advantage of desktop applications, which amounts to low latency, high bandwidth data transfer. Let’s see: video, including video editing, is the hottest thing on the web. Low quality video, but still. The two things client applications still excel at are very high bandwidth, very low latency data input and output, such as rendering web pages as pixels. :)

There are many things that can be done to make client development and deployment easier, more secure, and more web-like, and client applications more collaboration-enabled. Fortunately they’ve all been tried before (in several prior efforts of varying relevance), so there’s much to learn from, yet the field is wide open. Web applications on the client are also a possibility, though they typically only address ease of development and not manageability at all.

The ascendancy of web applications does not make the desktop unimportant any more than GUIs made filesystems unimportant. Another layer has been added to the stack, but I am still very happy to see any move of lower layers in the direction of freedom.

My ideal application would be available locally and over the network (usually that means on the web), but I’ll prefer the latter if I have to choose, and I can’t think of many applications that don’t require this choice (fortunately a few exist, or close enough).

So what can be done to make the web application dominated future open source in spirit, for lack of a better term?

First, web applications should be super easy to manage (install, upgrade, customize, secure, back up) so that running your own is a real option. Some applications have made large strides, especially in the installation department, but still require a lot of work and knowledge to run effectively.

There are some applications that centralization makes tractable, or at least easier and better, e.g., web scale search and social aggregation — which basically come down to high bandwidth, low latency data transfer. Various P2P technologies (much to learn from, field wide open) can help somewhat, but the pull of centralization is very strong.

In cases where one accepts a centralized web application, should one demand that the application be somehow constitutionally open? Some possible criteria:

  • All source code for the running service should be published under an open source license and developer source control available for public viewing.
  • All private data available for on-demand export in standard formats.
  • All collaboratively created data available under an open license (e.g., one from Creative Commons), again in standard formats.
  • In some cases, I am not sure how rare, the final mission of the organization running the service should be to provide the service rather than to make a financial profit, i.e., beholden to users and volunteers, not investors and employees. Maybe. Would I be less sanguine about the long term prospects of Wikipedia if it were for-profit? I don’t know of evidence for or against this feeling.

Consider all of this ignorant speculation. Yes, I’m just angling for more freedom lunches.

Filesharing a waste of time

Sunday, June 4th, 2006

Well over a year ago Sameer Parekh called out an obvious flaw in my argument:

I find it funny when I read technologists arguing that downloads of movies aren’t a problem because they’re slow. When do technologists talk about how technology sucks and isn’t going to improve? When the improvement of that technology hurts their public relations effort!

I noticed Parekh’s blog again recently, which reminded me to respond. I find it interesting (but somewhat tangential) that in the interim centralized web-based video “sharing” sites have taken off while decentralized P2P filesharing has languished.

Anyhow, I do not argue that P2P filesharing is a waste of time merely because it takes a really long time to download a movie. Even if downloads were instantaneous the experience would be trying. Making it easy and certain for an average user to find a complete copy, and to find and install the video codecs needed to watch it, is not something that improved bandwidth will fix automatically. These are social and software problems, which tend not to improve at the rate that bandwidth and the like do.

In the future when today’s huge downloads are (nearly) instantaneous, they’ll be nearly instantaneous via underground P2P or via centralized download services. The only people who will struggle with the former are the very poor, those who enjoy fighting with their computers, and those who seriously miscalculate the value of their time. Unless the latter are encumbered with DRM so frustrating that there is no convenience advantage to using a centralized service.

By that time I expect most entertainment to be some combination of supercheap, server-mediated and advertising.

CCSSF2 with Gonze & Ostertag

Tuesday, April 11th, 2006

The first Creative Commons Salon San Francisco was good, tomorrow’s should be great. Bob Ostertag and Lucas Gonze (who I’ve cited many times) are presenting. I could hardly ask for a better lineup.

Event details.

Update 20060417: Followup post on the CC blog.

It was a pleasure talking to Ostertag before the presentations got underway. Among other things I learned that Pantychrist vocalist Justin Bond has become extremely successful. During the presentation he said he had wanted to put his recordings in the public domain but Creative Commons seemed like a good thing to support, so he chose a license rather arbitrarily. Argh! (CC does offer a public domain dedication.) Ostertag pushed the idea that thinking in terms of “copies” is completely obsolete and more or less encouraged “piracy” — in response to a naive questioner asking if streaming and DRM together could stop copying (smiles all around). It was evident during Q&A that he had much more to say coming from a number of different angles. I look forward to reading more of his thoughts.

I thoroughly enjoyed Lucas Gonze’s presentation, though it may have been too much too fast for some people. I found the things he left out of a talk about how the net is changing music notable — nothing about DRM, streaming, P2P, music stores, or podcasting. Hear, hear!