Likely, something akin to a robots.txt file would have been invented to provide electronic evidence of permission to link, and it would have been bundled by default into Apache. Sure, some commercial web sites would have refused to allow linking, but that would simply have lowered their profile within the web community, the same way the NYT’s columnists have become less prominent post-paywall.
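For concreteness, the real robots.txt format (a plain text file of User-agent groups with Allow/Disallow rules) is simple enough that a link-permission variant could have reused it wholesale. The "links.txt" file below is purely invented for illustration; no such standard ever existed:

```
# Hypothetical "links.txt" -- invented here for illustration only.
# Syntax modeled on the actual robots.txt format (User-agent, Allow, Disallow).

User-agent: *
# Permission granted to deep-link into the articles section
Allow: /articles/
# No permission to link anywhere else on the site
Disallow: /
```

The point is that, like robots.txt, such a convention would have cost almost nothing to adopt and could have been shipped as a server default rather than negotiated site by site.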
In a fairly bad scenario it doesn’t matter what Apache does: the web stays a backwater, or Apache never happens at all. And in that scenario a lower profile within the web community hardly matters, since all the exciting stuff would be behind AOL and similar subscription network walls. But I agree that workarounds, and eventually a thriving web, would probably have emerged. Perhaps lawyers simply didn’t notice search engines and linking until the web had already reached critical mass. Clearly they’re trying to avoid making that mistake again.
So I stand by the words “relentless” and “inevitable” to describe the triumph of open over closed systems. I’ll add the concession that the process sometimes takes a while (and obviously, this makes my claim non-falsifiable, since I can always say it hasn’t happened yet), but I think legal restrictions just slow down the growth of open platforms; they don’t change the ultimate outcome.
Slowing down progress is pretty important, and in a bad way. Furthermore, I’d make a wild guess that the future is highly dependent on initial conditions: no outcome is inevitable by a long shot, and there is no such thing as an ultimate outcome, only a new set of initial conditions.
That’s my peeve for the day.
Grandiose example: did Communism just delay the relentless march of Russian society toward freedom and wealth?