Friday, August 29, 2003

How Digital Is Bringing Back Ephemerality

Because digital media can be copied perfectly, in one sense digital is fighting the very idea that ephemerality has value. But in a more important way, it's bringing ephemerality back.

As has often been noted, especially in law enforcement and journalism, you can't really rely on photographs anymore. Photoshop and its relatives have made it possible for anyone to manipulate a photograph so that it shows something that never happened. Every once in a while this becomes a story, when some newspaper or magazine digitally edits an image. And while experts are quite good at detecting such changes, the software keeps improving, and eventually the cost of analyzing an image for trickery will be high enough that it's cheaper, and safer, simply to assume it's false. A few years behind, but on the same curve, is the same phenomenon for video clips. (Funny, I almost wrote "footage," a word that no longer has any relationship to a physical object.)

This sort of thing sends people into a tizzy. "We won't be able to trust film! We won't have a perfect record!" But the funny thing is that this was true for 99% of the history of the human race; only in the last hundred years or so have we had the idea that technology could serve as a perfect, impartial observer. All digital editing has done is return us to that earlier state, where instead of relying on the perfection of technology, we have to relearn how to judge the motives and honesty of the people holding the evidence; no different from any other kind of testimony or story.

Wednesday, August 27, 2003

Ephemerality

I'm away from home on a slow connection, so short entry tonight.

Ephemerality is undervalued. I heard a beautiful chamber music concert tonight, and I think we appreciate live music, and plays, at least in part because we know that a recording will never do it justice, that we've just seen something that will never be exactly repeated. Even with a play that might have eight performances a week, that particular performance is unique, and that uniqueness is something we treasure.

In software, we cling tightly to the idea -- already a fiction -- that results are predictable. We think of our computers as deterministic machines, but that stopped being true years ago. If we turned away from that idea and treated software more like chamber music -- each performance slightly different, in ways that make us value it both as an aesthetic object and as a functional one -- we might live in better harmony with our technology.

In a funny way, digital media are swimming against this trend. The "perfect copies forever" quality of digital music that so scares the hell out of the content industry actually cheapens these things; not because they become common, but because they become banal.

Monday, August 25, 2003

The Map Certainly Isn't the Territory

Here's what I want, and I expect someone out there in web land to do it for me. It doesn't seem like too much to ask.

For decades, if not centuries, people have been complaining or arguing about the accuracy of various projections of global maps. One projection makes Greenland too big, one projection splits the ocean in half, one projection distorts the shapes of continents, one projection makes my butt look big. Whatever. These all miss the point that our maps are blemished with an enormous set of inaccuracies: the names of countries are wrong.

I want a map of the world with the actual names of countries on it. If you look at a typical American-produced map, it's full of countries with names like "Germany" and "India" and "Greece" and "China" and "Japan" and "Hungary" and "Egypt," etc. etc. etc. You might not think that's strange, but the fact is that there are no such countries. Sure, we in the English-speaking world may have been calling certain countries by those names, but it's not what the people who live there call them. This is ridiculous. It's time to get rid of at least one vestige of colonialism and produce an accurate map.

I talked to someone at National Geographic about this once, reasoning that if anyone could make a change like that, it's them, but they declined. They did say that they had begun labeling cities with their actual names ("Kolkata" for Calcutta, or "Mumbai" for Bombay, for instance), but they were not going to change country names. Bah! I say. They're supposed to be an educational institution. Well, educate us Westerners on the single most basic fact about these countries: their names.

P.S. The real names of those countries are: Deutschland, Bharat, Hellas, Zhongguo, Nihon, Magyarország, and Misr. There are many others, not even counting subtle misspellings like Türkiye, Rossiya, and España.
Bonus! Instant Political Analysis

Okay, I said no politics, because there are already too many political blogs. And Josh Marshall and I went to school together so I don't want to give him too much competition. But as a special bonus, here goes.

Who does Clark hurt? Retired four-star General Wesley K. Clark is likely to declare his candidacy for the Democratic nomination for President of the United States shortly. If he does, he's likely to seriously upset the current dynamics. Of the current viable candidates, who does he hurt the most? Well, as someone with a gold-plated resume, including service in Vietnam, he's likely to gain support among the kind of people who are impressed by biography. That probably hurts Kerry some, as he has the best military record among the current crop. Although Clark's positions on social issues are a little unclear, other than being pro-choice and pro-affirmative action, he's likely to position himself near the political center, and being associated with the military will help with this. This will eat into Lieberman's base as a DLC candidate. Finally, as a military man and a southerner (he's from Arkansas), he will probably rob supporters from North Carolinian John Edwards. While Kerry and Lieberman have enough support to absorb some losses, the race is close enough that any loss will hurt. Edwards will likely be out of contention.

As a late entrant, Clark's strategy may be to put minimal effort into Iowa and New Hampshire and go for a big win in South Carolina (in fact, staying out this late will probably help him, since he's lowered expectations for the first two states). So who benefits? In a word, Dean. Clark's entry, in one swipe, clears out all of Dean's major opponents except Gephardt. Perhaps this is related to the fact that Clark and Dean have had a series of private meetings...

A final observation: a Clark/Dean ticket would be extremely effective.

Friday, August 22, 2003

The Ultimatum Game

This is the ultimatum game. I have $10. I offer to share that money with you, split however I decide. Once I offer you your share (say, $4), you can decide to accept the offer or reject it. If you accept it, then you get your share and I keep mine. The game ends. If you reject it, however, we both lose all the money and get nothing.

There are variants for things like multiple rounds, but let's keep it simple. Rational actor theory from economics would tell us that the best strategy is to accept whatever you're offered. After all, it's better than nothing. If I offer you $1 and keep $9, your choice is a) $1 or b) $0. Obvious. What happens when people are actually tested, however (even in circumstances where they are assured there will be no second round with the same player, and thus no chance for retribution or learning), is that people disproportionately reject "unfair" offers. That this is a shock to economists tells us something about economists. What's interesting about this study is that MRI images of the brain taken during play showed that the portion of the brain responsible for emotion was activated during rejection of unfair offers. This did not happen, and rejections were less frequent, when the test subject was playing against a computer.
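The payoff logic is simple enough to sketch in a few lines of Python. The two responder strategies below are illustrations, and the 30% rejection threshold is a number I made up, not one from the study:

    # A minimal sketch of the ultimatum game. The 0.3 fairness
    # threshold is an illustrative assumption, not data from the study.
    POT = 10

    def rational_responder(offer):
        # Rational actor theory: anything beats nothing.
        return offer > 0

    def fairness_responder(offer, threshold=0.3):
        # What experiments actually find: "unfair" offers get rejected.
        return offer >= threshold * POT

    def play(offer, responder):
        if responder(offer):
            return POT - offer, offer   # (proposer's share, responder's share)
        return 0, 0                     # both walk away with nothing

    print(play(1, rational_responder))  # (9, 1): takes the dollar
    print(play(1, fairness_responder))  # (0, 0): rejects the insult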

Thursday, August 21, 2003

Stigmergy

I learned a new word today: stigmergy. Coined by the French biologist (or as we say these days, "freedom biologist") Pierre-Paul Grassé, it refers to the strategy some animals follow of communicating through the environment rather than directly. The canonical example is how ants communicate: not by waving their antennae at each other, but by leaving a trail of pheromones in the environment. Other ants, finding those molecules, gain information about what the other ants are doing.

Judging from the hits on Google, other people have already discovered this term and applied it to the Web, blogs, Google, trackback, and whatnot. I'd like to try to examine it in a broader context, though, in a much closer analogy (or perhaps literal interpretation) of what ants do.

Stigmergy is different from the communications we're used to. For the vast majority of human existence, communication was highly localized in time and space: a live conversation with another person. The invention of writing, at first associated with megalithic construction, broke the time axis, but was restricted to a very few individuals. In the last century, technologies have appeared that have broken the space axis as well; we can have time-local but non-space-local conversations via the phone or IM. Added to this confusion are the questions of whether a communication was addressed to a specific recipient, was persistent, or was perceivable by others.

Given the coming smart environment, true stigmergic communication becomes possible. We can leave messages in a location with our location-sensitive palmtops; where the message is physically stored is irrelevant. What's important is that later visitors could read our ant trail. Here we'd see communication that is highly space-local -- tied to, and perhaps only visible in, a specific location -- but very non-time-local: visible at any time after its initial creation. In some sense, what it's most similar to in human experience is the ancient stone monoliths, carved with hieroglyphs: "Here lies Sun-born Thutmosis, Beloved of Re, Lord of the Two Lands." But this new communications pattern (which must, I admit, get a different name first) would be available to anyone, filterable by software, and attachable to any kind of location, even if there isn't a great tomb there.
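The data structure for that ant trail is almost embarrassingly simple. Here's a hypothetical sketch in Python; the class name and the crude grid-snapping scheme are my own inventions, not anyone's actual protocol:

    import time
    from collections import defaultdict

    # Sketch of a stigmergic message board: messages are keyed by
    # location, not addressed to a recipient. Snapping lat/lon to a
    # grid is a crude stand-in for real proximity search.
    class PlaceMessages:
        def __init__(self, grid=0.001):          # roughly 100m cells
            self.grid = grid
            self.cells = defaultdict(list)

        def _cell(self, lat, lon):
            return (round(lat / self.grid), round(lon / self.grid))

        def leave(self, lat, lon, text):
            # Space-local and time-persistent: the pheromone trail.
            self.cells[self._cell(lat, lon)].append((time.time(), text))

        def read(self, lat, lon):
            # Any later visitor to this spot can read the trail.
            return [t for _, t in sorted(self.cells[self._cell(lat, lon)])]

    board = PlaceMessages()
    board.leave(40.7484, -73.9857, "Great view from here. Worth the climb.")
    print(board.read(40.7484, -73.9857))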

The invention of writing, the printing press, the telephone, email; all presaged fundamental changes in social patterns. When we are as ants, what manner of anthills will we build?

Wednesday, August 20, 2003

Smart Environments

One of the great conceptual breakthroughs in evolutionary biology is the idea of coevolution: that species evolve in a complex feedback relationship with a constantly evolving environment, including other species. This can be an arms race between predator and prey, or a more positive coevolution between, say, bee preference and flower color. When you look at a piece of fruit, consider that its color, smell, and taste evolved in a dance with the preferences of a specific animal. For example, fruits that want to be eaten by bats tend to be colorless (because bats, although not blind, hunt at night) and fairly odorless, and they hang below the branch so bats can swoop them up on the fly.

That lesson seems to have been lost on poor Homo sapiens, smart as we in theory are. We assume that the environment is largely static, other than the changes we intentionally inflict on it. By some estimates there's more plastic in the ocean than plankton, but we assume the ocean is static and infinite, so we can dump as much of our crap into it as we want. In a much more prosaic way, when individuals walk through an environment, we assume that it's static. We can observe it, and make changes to it ourselves, but other than that, even if it does change, it does so beyond our notice.

In a profound and fundamental way, that basic assumption will change over the next five years. A whole host of technologies, from smart sensors to software that can interpret human posture and movement, will not only provide a lot of important and useful data and services, but change the way we conceptualize space. Space will change from an empty void to a fully inhabited, dynamic, responsive computational fabric. No changes will go unobserved -- we'll hear all those trees falling in the forest -- but more importantly, the environment will change in response to us: adjusting light, heat, and information displays, eventually even reconfiguring furniture and walls. But the point isn't any specific application; certainly there will be unexpected and innovative uses. The point is that our basic ideas of space, action, and change will be shaken and rebuilt in a way that's still only vaguely clear.

Friday, August 15, 2003

Some Fair and Balanced Commentary

Fox News is not, metaphor aside, run by or about foxes. "Fox" is what lawyers call an arbitrary trademark: the word "fox" has no special connection to news, and thus is a fairly strong trademark. If I were to create a news company, getting even close to "fox" (Foxy News?) would likely trigger a trademark suit which I would rightly lose. Their trademark is now strong enough that it would even likely beat a news company that was in fact about foxes ("Geoff's Fox News"). I'll leave the possibility of a news company run by foxes as an exercise for the reader.

Fair. Balanced. Those are words that describe a news service (although not necessarily Fox's). It is extraordinarily difficult -- or at least by statute and case law it ought to be -- to defend such descriptive trademarks; they verge on being entirely generic. (Of course, Fox News is owned by News Corp, an exceedingly generic name itself. It's a wonder they haven't sued the other news companies.) When Fox News says they are being "Fair and Balanced," they are describing attributes of their news broadcasts, and in terms that are central to the idea of a news broadcast. To assert proprietary ownership over "fair" and "balanced" is to give Fox News an illegitimate control via trademark law over key competitive aspects of the field. The ideas of "fair" and "balanced" are too central and important to the market for news for any company to have ownership. (They would likely counter that it is the combined phrase "fair and balanced." Possibly; would they have gone after Franken if he had said "balanced and fair"?) Compare this to a notional news company whose motto was "Garbled and Misleading." That would be a much stronger trademark, since those terms are not usually associated with news broadcasts (except, of course, Fox).

Trademark law really needs to be changed in two important ways. First, the fact that you can lose your trademark if you allow any other uses of it creates an absolutist regime; why not water this down? Second, mottos and descriptive phrases deserve far less protection than brands. It's not the Fair and Balanced News Show, it's Fox News. "Fair and Balanced" is a secondary phrase, descriptive by intention, and shouldn't be treated as equivalent to trademarks that do deserve protection.

Thursday, August 14, 2003

Vocabulary of the Future

Cingularity: The point in time when exponentially increasing churn rates among mobile telephony subscribers reach infinity.

Wednesday, August 13, 2003

Annotation Projects

The thing about the web is that every imaginable project seems to be out there already.

Annotea at W3C
Annozilla at Mozdev

Haven't had time to actually play with them yet, but they're on the right track.
Web Sharing

What I'm looking for in annotation is just one example of increasing the number of ways to make web browsing a shared experience. The simplest things in the world -- mailing a cool link to your friends, sharing your bookmarks, sharing your opinion of a web page, having a discussion, using web links as references for written material -- are clunky, poorly integrated, or held captive by specific web sites. These should all be organic parts of the experience, natively provided by the web browsing platform (I'll be momentarily agnostic about whether that means a browser or something else).

Other things are simply impossible with the basic loadset; for example, say I'm visiting an interesting dynamic web site, and I want you to be able to follow along on my trip. Short of wholesale slaving my display to yours, I can't do it. It's also irritatingly difficult to point into content: an unanchored spot in a web page, a PDF file, a Shockwave animation, etc. These internal links exist at the whim of the media creators, which violates a fundamental idea of the web: that no coordination should be necessary to create links.

Finally, it's becoming difficult to wrap up the state of a web session in a shareable way. I was having a problem with cookies on a webmail site I use, but it was extremely difficult for me to inform Mozilla or the web site's owners of my problem, since the problem only arose once I had logged in. Unless I wanted to file my user ID and password on Bugzilla, I couldn't share enough information to make the bug reproducible.

These are all small things individually, but they are all part of an isolating trend on the web: the user experience is increasingly tied to an individual rather than a social group. And that's not even to mention the increasing use of customization. Of course customization is a good thing, but it does come at a cost. I remember being sort of excited when Amazon would feature a favorite book of mine on its home page, but now I know that I'm probably the only person seeing that book. Or even if I'm not, and it really is displayed for all visitors, I have lost the sense that Amazon's home page is a shared space. Maybe not a big deal to Jeff Bezos '86, but with enough of these privatizations of space, we lose an important sense of being interconnected.

Tuesday, August 12, 2003

More on Annotation

The problem with annotation is that it involves a many-way dance: the web-site content owner, the annotator, the annotation host, and the viewer. It's easy to change any one of these, and with a little bit of effort to coordinate two, but coordinating all of them (with "annotation host" not even a real thing yet) is quite difficult. As I probably mentioned before, I strongly believe that the annotation host and the content host must be different, or there will be too much political pressure to only accept friendly annotations.

The idea is that anyone could annotate a given site and place the annotation in some trusted third-party registry. When a user (using an annotation-aware browser) visited any site, the browser would simultaneously fetch the actual content of the web site and check with (possibly many?) annotation registries to see if any annotations were available. If there were, it would display them overlaid on the original content. Of course, each one of these steps has many complications and roadblocks, and I'll dig into them over the next couple of postings.
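In rough Python, the browser-side flow might look something like this; the in-memory dictionary stands in for a hypothetical annotation registry, since no such service or protocol actually exists yet:

    # Sketch of annotation-aware browsing. The registry is a dict
    # standing in for a hypothetical third-party annotation host,
    # deliberately separate from the content host.
    REGISTRY = {
        "http://example.com/article": [
            {"author": "geoff", "type": "correction",
             "text": "NaCl dissociates in plain water; no 1600 degrees needed."},
        ],
    }

    def fetch_content(url):
        return "<html>contents of %s</html>" % url   # stand-in for HTTP GET

    def fetch_annotations(url):
        # In reality: a parallel request to one or more annotation
        # registries, made independently of the content host.
        return REGISTRY.get(url, [])

    def browse(url):
        page = fetch_content(url)
        notes = fetch_annotations(url)
        return {"content": page, "annotations": notes}  # overlaid in the UI

    print(browse("http://example.com/article"))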

What exactly do I mean by annotation? The classical usage is in literature studies, where an annotation is some sort of expanded detail of a fragment of text. In the broader web context, it could be additional data, a comment, a potential revision, a supporting or opposing fact, or even just an AOL-like "me too!" It would be nice if there was a mix of structured and unstructed format to these annotations, so that while we wouldn't be constraining any innovative uses of the system, we could more easily extract meaning from or categorize them.

A giant missing piece of this puzzle is a reputation system; if a web site like Rush Limbaugh's has fifty thousand annotations, who can possibly read them all? Which should the browser display by default? Ideally, there would be some sort of system that allowed the user to customize who they trusted and liked, or even what sort of annotations they were looking for at that time. Again, we must be extremely careful that we aren't blocking out contrary views; there's no point in building such a system just to increase the reflectivity of the echo chamber.
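One crude way to cut fifty thousand annotations down to a readable handful, assuming each annotation carries an author ID (the trust scores below are invented, and set by the user rather than by any central authority):

    # Sketch of reputation-based filtering. Note the floor for unknown
    # authors: contrary voices shouldn't be filtered down to zero.
    trust = {"alice": 0.9, "bob": 0.4}
    DEFAULT_TRUST = 0.2   # unknown authors still get a hearing

    def rank(annotations, limit=10):
        scored = sorted(annotations,
                        key=lambda a: trust.get(a["author"], DEFAULT_TRUST),
                        reverse=True)
        return scored[:limit]

    notes = [{"author": "alice", "text": "Source checks out."},
             {"author": "spammer", "text": "me too!"},
             {"author": "bob", "text": "Counterpoint: see this study."}]
    print([n["author"] for n in rank(notes)])   # ['alice', 'bob', 'spammer']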

Friday, August 08, 2003

Crying Wolf

A short digression from annotation. I recently finished Hubbert's Peak: The Impending World Oil Shortage by my old geology professor (and character in John McPhee's excellent Basin and Range, now available as part of Annals of the Former World) Kenneth Deffeyes. His premise, quite briefly, is that within the next few years the world will reach a peak of oil production, and it will decline from then on. Not that we'll run out of oil; we have decades of reserves left, even at extrapolated SUV-oil-guzzling rates, but for inescapable reasons of geology, we will not be able to extract the oil as quickly as we did in the past. Basically, we've already hit all the good oil fields, and the rest is inaccessible or in inefficient forms such as oil shale. (The same methodology, applied by Hubbert in 1956, predicted a US oil peak around 1970. Of course, it was met with great skepticism. US oil production peaked in 1970 and has declined every year since.)
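For the curious: Hubbert's method fits a logistic curve to cumulative production, and the production rate is that curve's bell-shaped derivative, peaking when half the ultimately recoverable oil is gone. A sketch with made-up parameters (the hard part in real life is estimating the recoverable total, not the arithmetic):

    import math

    # Logistic (Hubbert) model with invented parameters. Q_TOTAL is
    # ultimately recoverable oil; production peaks when cumulative
    # production hits Q_TOTAL / 2.
    Q_TOTAL = 2000.0   # billion barrels, illustrative
    K = 0.06           # growth rate, illustrative
    T_PEAK = 2005      # peak year, illustrative

    def production(year):
        # Derivative of the logistic: a bell curve in time.
        x = math.exp(-K * (year - T_PEAK))
        return Q_TOTAL * K * x / (1 + x) ** 2

    for year in (1970, 1990, 2005, 2020, 2040):
        print(year, round(production(year), 1))   # billion barrels/year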

The first reaction most people have on hearing that someone has predicted an oil shortage is "They've always predicted that, and it never comes true." Well, leaving aside that it was usually political activists or economists saying it, and now it's actual geologists, there's an important point I think people are missing.

In the fable "The Boy Who Cried Wolf," the obvious moral is that you shouldn't cry wolf because then people won't believe you. But there's a moral for the villagers, too, although not explicitly stated: eventually, the wolf really is there.

Thursday, August 07, 2003

Annotation

Reader Joseph Kondel was kind enough to point out some earlier startups that offered "post-it" note functionality for web pages, but sadly they have either disappeared or morphed into IM companies. It's not surprising that there's no particularly good business opportunity in annotation, but that doesn't mean it's not a vitally important technology.

The idea of annotation is related to the idea of the parallel document, something that Ted Nelson, the coiner of the term "hypertext" and one of the original thinkers in the field, is deeply interested in. His project, Xanadu, is a kind of alternate-history web, built around much deeper structure and designed to support applications like two-way links and parallel texts. Unfortunately, it seems unlikely that his project will displace the web -- another instance of Gabriel's worse-is-better at work, most likely.

The failure of a business model and the failure of a technology model don't mean that we can't graft good annotation onto the web, though, and I think we need to. Annotation, simply put, is the ability of a user to record commentary around a web site (or page, or fragment of a page) in a way that makes it accessible or visible to later visitors of that site. The extremely coarse-grained way to do that now is the public comment fields some web sites offer, but of course those can be edited by the site owner, and they are not always offered. Annotation, in my sense, has to be completely independent of the hosting web site.

Why is all this important? Because the vitality of our democracy rests on the ability to trade in the marketplace of ideas, and to participate in vigorous debate. With the web emerging as such an important way for people to get information, it becomes more important for ideas to exist in a context where they can be challenged and possibly refuted. This is not censorship -- just the opposite. Everyone deserves the right to speak (in this age where "freedom of speech" and "freedom of the press" are merging into one), but for that right to translate into democracy, we need to be able to weigh and judge that speech collectively, that is, as a group activity.

Especially in this time of political polarization, it becomes easier and easier to just read information that reinforces your own worldview, giving you the mistaken impression that everyone agrees with you. Nothing could be more dangerous. And if we can make a small difference in that by just hacking Mozilla, then why not?

Wednesday, August 06, 2003

Rant, Part Two

Okay, there's another web site that's got me frothing at the mouth in anger. I won't link to it, because I don't want to give it any Googlejuice, but it's already on the first search page.

I was searching the web for safety information on sucralose, one of those sugar substitutes. Sucralose is basically sucrose (ordinary sugar) with three hydroxyl groups replaced by chlorine atoms. "Chlorine?!" you might think, but then you'll probably remember that table salt is sodium chloride, and your stomach contains hydrochloric acid. Yes, chlorinated compounds might very well be dangerous, but that's not the point here. The point is that this one web site, in discussing the safety of sucralose, admitted that salt is sodium chloride, but insisted that that's not relevant, because NaCl doesn't become a gas until 1600 degrees, and so the chlorine atom is never separate, not like in sucralose.

Hello? Instead of heating it to the point of dissociation, you could drop the salt in a glass of water, you moron! Freeing the Cl- ion from salt is far easier than freeing the chlorine from the carbon in sucralose. To quote one of my favorite professors, "You couldn't be more wrong."
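For the record, the relevant comparison is ionic versus covalent bonding, not boiling points. Roughly:

    NaCl(s)         --(water)-->  Na+(aq) + Cl-(aq)   ionic; dissociates in your soup
    sucralose C-Cl  --(water)-->  no reaction          covalent; the chlorine stays put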

Now the second part of the rant is that there is no way I can usefully indicate that this guy is an idiot. He's already a top ten Google site for searching on sucralose, and maybe people looking for information won't know that he's an idiot. If I linked to him in this story, all that would happen is that his Google rank would rise. There are proposals out there for a way to link negatively, indicating that I don't like the destination (sort of like modding down on Slashdot), but that's probably not quite enough information (although it would be a useful first step). What the web really needs is a way for third parties to annotate web sites without the permission of the original web site owner. I should be able to attach this message (or more preferably a clearer one) to that page, pointing out the many ways in which he is ignorant of high school chemistry, and that page should be visible to casual visitors, or at least ones who have somehow indicated that they trust me or want to see annotations.

Next, I'll talk about ways that we could move to an annotated society.

Tuesday, August 05, 2003

What are you linked to?

Well, the Episcopalian bishop story is over (for now, at least), but this coverage from the normally with-it Boston Globe nearly drove me into a frothing rage:
Yesterday, a Globe reporter was able to get from the Outright home page to a site containing images of bisexuality, including explicit pictures of sexual acts, in six links.


Six links! Six links! Has this "Globe reporter" never seen Six Degrees of Separation? Or actually used the web? To slander someone because you can get -- in six links -- from the web site of an organization they once worked with, years ago, to a site with images of bisexuality (gasp!) is not only criminal, it's incredibly dumb. (The original story is here, but in another stupid web trick, that link will only be good for a couple of days.)

Monday, August 04, 2003

Simulation and Its Discontents, Part IV

I bet you thought I had forgotten about my series of posts on simulation; in fact, chances are you weren't reading my blog back then (except you, hi Dad!). I had one more point to make, but needed to figure out how to explain it.

A brief refresher for those who haven't followed the link yet: I'm a big fan of the uses of simulations, but people often overlook the problems with them. I talked about a couple of the issues, particularly the lack of certificates and the ease of being fooled by superficial structural similarity. I should have pointed out then the most obvious drawback: designers can easily influence the outcomes of their simulations by subtly (and possibly invisibly) inserting their own biases. For some reason, Will Wright's preference for mass transit in SimCity is the canonical example here, despite the fact that he's quite open about it (or perhaps because of it).

Today's problem is pretty abstract, and comes via Caltech professor John Doyle. Here goes. Real-life systems change when their environment changes; some perturbations produce larger changes than others, depending on the system. For example, the temperature of a house with a functioning thermostat is fairly resistant to changes in outside temperature. If the temperature rises, the thermostat kicks in and turns on the air conditioning; if it falls too low, the heat comes on. However, the temperature is extremely sensitive to a loss of electricity: the thermostat stops running and the inside temperature quickly moves into equilibrium with the outside. In short, some systems show a great deal of fragility in the face of some external perturbations, and stability in the face of others.

What Doyle has shown is that, in a model of such a system, the complexity required to model it accurately is correlated with the system's fragility. That is, for perturbations against which the system is stable, the model can be simple; for perturbations against which the system is extremely fragile, the model must be extremely complex. This is unfortunate, since it's generally the fragile regimes that interest us most.
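To make that concrete, here's a toy thermostat model in Python (all the constants are invented). In the stable regime -- outside temperature swings with the power on -- a one-line model ("inside = setpoint") would do; model the power failure and you suddenly have to drag in the house's thermal physics:

    # Toy thermostat with invented constants, illustrating Doyle's
    # point: the stable regime needs almost no model, the fragile
    # regime needs the physics.
    SETPOINT, LEAK, HVAC = 20.0, 0.1, 1.5   # target C, wall leak rate, heater/AC step

    def step(inside, outside, power_on):
        inside += LEAK * (outside - inside)    # heat leaking through the walls
        if power_on:                           # thermostat fights back
            if inside < SETPOINT:
                inside += HVAC
            elif inside > SETPOINT:
                inside -= HVAC
        return inside

    def run(outside, power_on, hours=48, inside=20.0):
        for _ in range(hours):
            inside = step(inside, outside, power_on)
        return round(inside, 1)

    print(run(outside=35, power_on=True))    # ~20.0: stable, trivial to model
    print(run(outside=35, power_on=False))   # ~35.0: fragile regime, physics matters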

Friday, August 01, 2003

What Good is a Robot?

It's clear to me that robots, long relegated to the factory floor, are at an inflection point. Advances in miniaturization, construction, and the conceptual design of robots have created all of the ingredients for a generation of extremely capable, interactive, mobile, and (relatively) affordable robots. iRobot Corporation's products, like the Roomba, and their cousins in the MIT AI Lab are the leading examples of this, but so are Japanese products like ASIMO and Aibo (the robot toy dog). Each holds a piece of the puzzle, and shortly, I think, products combining these pieces will be on the market: easily mobile, aware of humans, relatively intelligent and adaptive.

But... so what? Despite my belief that "we're almost there," I struggle to come up with any good examples of robots that would change our lives. Okay, so there's a robotic vacuum cleaner. And a bunch of (cool) toys. What else? Rodney Brooks' book Flesh and Machines painted a picture of a future where robotic technology was invisible, subsumed into other products, but I can't think of any examples that aren't ridiculous. Typical examples include robots for home cleaning -- other than vacuuming, this is hard to imagine. Are we really less than ten years away from a robot that can pick up and sort laundry, or clear a table? Robots for telepresence, perhaps, especially in dangerous situations (disaster response, military, environmental danger). Plenty of entertainment and leisure activities; "Teddy" from AI is probably not far away (although not quite as socially sophisticated, at least for a while).

I feel like this is a failure of imagination on my part: I'm supposed to get paid for predicting the future of technology, and this is clearly a huge area of investment, but for the life of me I can't think of any broadly useful robots. And what about those little toaster-shaped things on wheels that zipped along the hallways of the Death Star? It's easier to transmit information by wire or wireless, so they weren't carrying disks or messages. Sandwiches and juice-boxes for the Stormtroopers, maybe.