Category Archives: standards

Rendezvous: lower life forms

My father liked to order things off the menu that he had never eaten and then cheerfully attempt to get his children to try them. This is a cheerful kind of cruelty that I have inherited. We often told the story in later years of the time we ordered sea urchins in a dingy Chinese restaurant in New York. The consensus was that the sea urchins weren’t actually dead when they got to the table. They would go in your mouth, and when they discovered you were attempting to chew on them they would quickly flee to the other side of your mouth. Being a very low life form, once you had succeeded in biting one in two your reward was two panicked sea urchins in your mouth.

I’m reminded of this story by the fun people are having with neologisms these days. For example, blog, or folksonomy. A neologism is rarely a very highly evolved creature, which makes it hard to pin down. But therein lies the fun. You can have entire conferences about a single word because collectively nobody really knows what the word means. These littoral zones are full of odd creatures. The tide of Moore’s law and his friends keeps rising. The cheerfully cruel keep finding things to order off the menu.

But, before I get taken prisoner by that bit of nostalgia, what I want to say something about is this definition of ontology that Clay posted this morning.

The definition of ontology I’m referring to is derived more from AI than philosophy: a formal, explicit specification of a shared conceptualization. (Other glosses of this AI-flavored view can be found using Google for define:ontology.) It is that view that I am objecting to.

Now I don’t want to get drawn into the fun that Clay’s having – bear-baiting the information sciences.

What I do want to do is point out that the function of these “explicit specifications of a shared conceptualization” is not just to provide a solid outcome to the fun-for-all neologism game.

The purpose of these labors is to create shared conceptualizations, explicit specifications, that enable more casual acts of exchange between parties. The labor to create an ontology isn’t navel gazing. It isn’t about ordering books on the library shelves. It isn’t about stuffing your book’s worth of knowledge, a tasty chicken salad, between two dry crusts – the table of contents and the index.

It’s about enabling a commerce of transactions that take place upon that ontology. Thus a system of weights and measures is an ontology over the problem of measurement, one that enables exchanges to take place without having to negotiate from scratch, probably with the help of a lawyer, the meaning of a cord each time you order firewood. And weights and measures are only the tip of the iceberg of standards that provide the foundation for efficient commerce.

But it’s not just commerce; ontologies provide the vocabulary that enables one to describe the weather all over the planet and in turn predict tomorrow’s snowstorm. They provide the opportunity to notice the planet is getting warmer and to decide that – holy crap – it’s true!

To rail against ontology is to rail against both the scientific method and modern capitalism. That’s not a little sand castle on the beach soon to be rolled over by the rising tide. Unlike, say, journalism vs. blogs, it’s not an institution whose distribution channel is being disintermediated by Mr. Moore and his friends. Those two are very big sand castles. They will be, they are being, reshaped by these processes, but they will still stand when it’s over.


What really caught my attention in Clay’s quote was “shared conceptualizations.” Why? Because sharing is, to me, an arc in a graph; and that means network, which means network effect, which means we can start to talk about Reed’s law, power laws, etc. It implies that each ontology forms a group in the network of actors. To worry about how big, durable, or well founded these groups are is to miss the point of what’s happening. It’s the quantity, once again, that counts.

What we are seeing around what is currently labeled as folksonomy is a bloom of tiny groups that have rendezvoused around some primitive ontology. For example, consider this group at flickr. A small group of people stitching together a quilt. Each one creating a square. They are going to auction it on eBay to raise funds for tsunami victims. For a while they have taken ownership of the word quilt.

What’s not to like? The kind of ontology that is emerging in examples like that is smaller than your classic ontologies. These are not likely to predict global warming, but they are certainly heartwarming.

This is the long tail story from another angle. The huge numbers, the excruciating, mind-boggling diversity, billions and billions of tiny effects that sum up to something huge.

Professional Programmer – huh?

Patrick Logan writes:

Agreed…

We’re not so much building on the programming state of the art as continually have each generation of programmers rediscover it.
Bill de hOra

This old fart agrees too. A far more interesting question: why. Why hasn’t a core of professional knowledge emerged in this industry? Isn’t it normal, even natural, for a craft to transition into a profession? Why am I not a member of the Association for Computing Machinery? My friends aren’t either. Why, when hiring, do we have a strong preference for people who have built things rather than people who are well certified? Why do project managers see little, if any, value in having a few doctors of Computer Science on their teams?

I see three reasons for the absence of a professional class: fast growth, a culture of anti-professionalism, and competing institutions. I’m sure there are other reasons. I’m sure that at this point I wouldn’t pick one of these as dominant.

This outcome is not necessarily a bad thing. The craft is much more egalitarian than most highly technical crafts. It’s easier to get into this field. The training barrier is lower. The tools tend to be simple. They have to be. I see forces in play which keep it that way.

Fast growth has meant the demand for skilled craftsmen, tools, and knowledge has continually outstripped the supply. The rapid expansion of the industry continually creates fresh frontiers where amateurs can achieve huge success. In these situations it’s much more important to get there and build something than it is to build it well. In new markets the quantity of your customer relationships always dominates the quality of your technical execution. Fresh frontiers plus scarce labor create a demand for simpler tools.

Anti-professionalism – man, you could write a whole book about this! The mythology of the hacker, open source, the American cowboy, libertarianism, the ’60s youth culture, etc. But possibly I can say something a bit new. The scarcity of skill results in loose social networks. On the frontier everybody is new in town. So the fabric, the social networks, that interconnect the craftsmen are thin. But new technology – network-based social interaction tools – has done much to compensate for that. One of the theories about the function of a profession is that it acts to create knowledge pools. The network’s social tools have allowed knowledge pooling in spite of thin social networks. This is new and might well cause other professional networks to erode as they become less necessary.

Another story people tell about professions is that they are a form of union, which naturally leads to realizing that any profession competes against its complementary institutions. Other institutions in high-tech would like to be the source of legitimization in the computing industry. This is a pattern I first noticed in the medical profession. Medical doctors managed over the course of the 20th century to gain hegemony over their industry. Today that control is falling apart as other players – insurance companies, drug companies, etc. – are competing to take control of the huge amounts of money in flux. Today my HMO sees to it that a person who’s not even a nurse does any minor surgery. In high-tech, large vendors play a similar game; and they don’t have to bother to compete with an existing strong profession. Microsoft, Oracle, Sun, etc. all provide certification programs that substitute for the legitimacy of the professional society working in tandem with universities.

Some of this I think is bad; but other aspects of it are great. It’s very bad for the respect and income that highly skilled practitioners can command. While it certainly holds back the median level of skill, it appears to entrain a larger pool of practitioners. We get a longer tail. And, as open source projects demonstrate, we are getting better at aggregating knowledge from an extended tail.

Mostly I think it is great that we remain a craft that sports a reasonably low barrier to entry. It makes my coworkers a more interesting, diverse lot. I think it’s healthy to keep the problem solving closer to the problems. Down in the mud, not up in the ivory tower.

It is healthy that the righteous, prideful, status-riddled behaviors of most professions are somewhat rarer in this line of work.

Scarecrow: I haven’t got a brain… only straw.

Dorothy: How can you talk if you haven’t got a brain?

Scarecrow: I don’t know… But some people without brains do an awful lot of talking… don’t they?

… much later …

Wizard of Oz: Why, anybody can have a brain. That’s a very mediocre commodity. Every pusillanimous creature that crawls on the Earth or slinks through slimy seas has a brain. Back where I come from, we have universities, seats of great learning, where men go to become great thinkers. And when they come out, they think deep thoughts and with no more brains than you have. But they have one thing you haven’t got: a diploma.

No Follow

No Follow is a darn fine example of standards making. Ten. That’s all, ten. They only had to get ten organizations on board. And it’s very simple too; so they didn’t have to spend months in oddly illuminated rooms discussing character sets and industry politics. The nice thing about small numbers, like ten, is that the participants can execute the entire project in private. They don’t have to polish the idea in the face of millions of bloggers. They can just do it.

I wonder if it took this long to happen because we needed to wait for the blogging and search engine industries to condense to the point that the number fell to ten or fewer?

All this reminds me of a posting I made a long time ago – “Feeding the link parasites is a sin“. (For about a year after I wrote that posting it was a magnet for spammers experimenting with innovative new attacks.) When I wrote that, my concern was how the open posting policy of blogs was creating a plate of agar for various lower life forms. A substrate just like Outlook, Internet Explorer, and Windows 99.

At the time it did occur to me that the right answer to the problem was to mark up the outbound links; I suggested an author=”unknown” attribute in the link. But I really thought that the right thing to do was to mark up all the added content. Then the search engine would be able to distinguish which portions of the site the site’s owner wished to have accrue to his identity.
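A minimal sketch of the idea, in Python and purely illustrative (a real site would run posted content through a proper HTML sanitizer rather than a regex), shows how cheaply a site can tag the links it doesn’t vouch for:

    import re

    # Tag every anchor in a blob of user-contributed HTML (a comment, a
    # trackback) with rel="nofollow" so search engines won't credit the
    # link to the site owner's identity. The regex skips anchors that
    # already declare a rel attribute.
    ANCHOR = re.compile(r'<a\b(?![^>]*\brel=)', re.IGNORECASE)

    def mark_untrusted_links(html: str) -> str:
        return ANCHOR.sub('<a rel="nofollow"', html)

    comment = '<p>Buy stuff at <a href="http://example.com/">my site</a>!</p>'
    print(mark_untrusted_links(comment))
    # <p>Buy stuff at <a rel="nofollow" href="http://example.com/">my site</a>!</p>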

My hope was that site maintainers would strive to find a solution to the open posting problem because their identity was at risk; and to a large extent that’s been true. But the problem remains really hard – casually usable open systems are hard.

The rel=”nofollow” solution adds weight to the model that the search engines don’t particularly care about the content of your pages, that in the end it’s who you know, not what you know, that counts. Probably so.

the calculus of distributed tag space

Running a bit further along the ideas in this posting about revealing a small programming language over your web site’s data.

Today I’m thinking that this is an interesting variation on the idea of a data exchange standard; something much richer.

Consider this URL that assembles a page from the union of URIs tagged with a single token at both del.icio.us and at flickr.


     http://oddiophile.com/taggregator/index.php?tag=market (try it)

Or the same idea as executed by Technorati.


     http://www.technorati.com/tag/market (try it)

In both cases the ceiling on what’s possible isn’t very high because the query language available is limited to “What URIs does site FOO have with tag BAR?” Clearly a richer query language is desirable. One that lets you make queries like: “What URIs do sites S1 and S2 have with tags T1 and T2, but not T3, that site S4 hasn’t tagged with T7, excluding any tags made by the group of untrusted actors or commercial actors?” Rich queries demand more consensus about the data model and the operations upon it.
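To make that a bit more concrete, something like this little Python sketch suggests the flavor of the algebra; fetch_tagged is a hypothetical adapter, since neither del.icio.us nor flickr actually exposes such an interface:

    from typing import Iterable, Set

    def fetch_tagged(site: str, tag: str) -> Set[str]:
        # Hypothetical adapter: the set of URIs that `site` has filed under `tag`.
        raise NotImplementedError("wire this up to each site's real interface")

    def query(sites: Iterable[str], with_tags, without_tags=()) -> Set[str]:
        # URIs carrying every tag in with_tags and none in without_tags,
        # unioned across the given sites.
        result: Set[str] = set()
        for site in sites:
            hits = set.intersection(*(fetch_tagged(site, t) for t in with_tags))
            for t in without_tags:
                hits -= fetch_tagged(site, t)
            result |= hits
        return result

    # The union the two URLs above compute by hand:
    #   query(["del.icio.us", "flickr"], with_tags=["market"])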

The current bloom of fun illustrated by the two examples above stands on a very small consensus, i.e. “Yeah, let’s mark up URIs with single-token tags!” A slightly larger consensus would enable a larger bloom.

A small consensus, like the tags, is a lot easier to achieve. It is less likely to fall victim to the IP tar pit. It’s maximally likely to be easy for the N sites to adopt and rendezvous around. It allows one site to set an example (as I’d argue del.icio.us did with tagging) so that the other sites can mimic the behavior.

I wonder if the idea of a simple query language along the lines of the one in my earlier posting could enable that. Interesting design problem.

A Folksonomy Rant

Folksonomies – nobody in control? What? That’s silly; the site designers are in control. If the ontologists wish to remain in control they had best start building sites, and fast. “The essential difference between emotion and reason is that emotion leads to action while reason leads to conclusions.” Ontologists are in a particularly extreme case of that quandary, aren’t they? More so than the typical professional faced with displacement by Moore’s Law and his friends.

Folksonomies – not scalable? My, that’s rich. What is this about if not leveraging the vast talent available on the planet? Ontology design by committee, now that doesn’t scale. The seats at the table become scarce, the coordination problems explode. Wake up, people.

Folksonomies – too flat? Just maybe the reason for hierarchy in the classical ontology has little to do with the reality of the problem undergoing classification and everything to do with governance of the seats at the committee table?

Displacement and Common Lisp

Displacement is an economic or cultural process whereby a community of practice wakes up one morning to discover that the tide of history has left it high and dry. The displaced community is not, necessarily, at fault in these stories. The archetypical example of displacement was the introduction of a new technology into northern England that displaced the tenant farmers from their landholdings. In that case the displacement unfolded pretty quickly because their legal claim to their land was based on leases; so when the leases came due the landlords (who wanted to convert to the new paradigm) displaced the residents. The technical innovation that displaced the farmers was a more robust breed of sheep coupled with a business fad for sheep.

In the software industry displacement happens when an existing language community – Cobol, Fortran, whatever – wakes up one morning to discover that the industry’s current fast-growing network effect is taking place outside their community. Some communities manage to catch up by quickly piling on the tools, design patterns, etc. required to play in the new world.

The Common Lisp community made one mistake back in the 1980s that helped with its displacement. This mistake was around graphical user interfaces. When the Mac came out it redefined how graphical user interface interaction would take place. It set a standard for the interaction. This standard featured the idea of a current selection. First the user would use his mouse to accumulate the selection. He’d select the window/document to work on. He’d adjust the view to bring the thing he wished to modify into view. He’d then select that object. Only then would he browse a selection of commands to affect the object. Who knows if that’s the best design; but it certainly became the standard approach.

Meanwhile, over in the older graphical user interface communities the command loop worked quite differently. Selection, for example, was often entangled with mouse location: if you moved the mouse over a window or an object, that object was automatically selected – but that’s only one example.

In the Common Lisp community a really unbelievably elegant user interface toolkit emerged, known as CLIM (the Common Lisp Interface Manager). But this beautiful, elegant thing had no concept of “the selection.” As a result it was totally irrelevant to the building of the kinds of user interfaces that were demanded by those working where the action was. Great ideas displaced through no particular fault of their own.

Microsoft had an analogous brush with displacement when the Internet broke out and the desktop suddenly became marginalized. Microsoft has, historically, been better at mustering the sense of fear and panic necessary to respond to displacement events. So when it became clear they were at risk they reacted.

The Common Lisp community endures. I still use and prefer it for all kinds of tasks.

But recently I’ve been concerned by what looks to me like another displacement threat – character sets.

Emacs is a key complement to the Lisp community, and quite a few others. Emacs has amazing character set support, in both the major variants (GNU Emacs and XEmacs). The support is uniquely powerful. The approach emerged in a branch of emacs known as Mule and more recently has been getting folded back into Emacs. All the input/output streams can be configured to declare their encoding schemes.

The internal strings and buffers are especially clever. The usual trick for systems having to tackle this problem is to normalize all the characters into one standard format, typically unicode. Mule’s approach is different; buffers in a Mule-enhanced emacs retain their character encoding. Load a file of Big5 characters and point at an arbitrary character in the string and Mule emacs knows that’s a Big5 character; paste a string of unicode characters into that buffer and now you have a buffer whose characters are in assorted character-set encodings, in a way analogous (but with a different implementation) to the way that you can have a buffer with characters in assorted fonts. That I find the Mule design so cool reminds me a bit of how cool I found the CLIM design; but at this point I’m feeling a bit paranoid.
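As a toy illustration of that buffer design – and nothing like Mule’s actual implementation – imagine each character in the buffer carrying the coding system it arrived in, here sketched in Python:

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Char:
        codepoint: int   # the character itself
        coding: str      # the coding system it arrived in, e.g. "big5"

    class Buffer:
        def __init__(self) -> None:
            self.chars: List[Char] = []

        def insert_bytes(self, data: bytes, coding: str) -> None:
            # Decode the bytes, but remember where each character came from
            # instead of normalizing everything into one internal encoding.
            for ch in data.decode(coding):
                self.chars.append(Char(ord(ch), coding))

        def coding_at(self, index: int) -> str:
            # Point at an arbitrary character and ask how it was encoded.
            return self.chars[index].coding

    buf = Buffer()
    buf.insert_bytes("中文".encode("big5"), "big5")
    buf.insert_bytes("marché".encode("utf-8"), "utf-8")
    print(buf.coding_at(0), buf.coding_at(3))   # big5 utf-8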

So on the emacs front things are in pretty good, even very good, shape. It’s all a bit rough around the edges though. The emacs communities are still holding Mule emacs at arm’s length, so you often have to build the Mule variant by hand. You often need to get a version of emacs that’s ahead of the stable release curve to capture the features you need. Font support is both amazing and frustrating. If you’re running under X then you can get a large set of international fonts, and after a mess of suffering you can get your GNU emacs or xemacs to use them. One curiosity of the Mule buffer design is that a character encoded in one character set may have no font to render it, only because the only font you have installed that can render that cute character happens to be laid out using a different character set. That’s a big pain on the Mac, which has beautiful fonts, but I can’t see how to get to them from unicode character sets in emacs.

Over on the Lisp side of things the story is slowly resolving itself. There are a _lot_ of really fine commercial and open Common Lisp implementations. Each one has a slightly different story about how and when the unicode problem is getting addressed. The best unicode support in an open implementation happens to be in the slowest implementation. The implementation I’m using today (Steel Bank Common Lisp, or SBCL) has very fresh unicode support.

It’s taken me almost two weeks to get a working toolchain for this stuff. I have tried a lot of combinations and experienced a lot of crashes where both Lisp and Emacs die horrible recursive deaths, choking as they tried to display or transport characters down pipelines. Currently I’m running the latest released beta version of XEmacs, built from scratch to get Mule support. I’m running that under X on my Mac, so I’m using the open international X fonts. I’m running the bleeding edge versions (CVS head) of both SBCL and Slime (the emacs<->lisp interaction mode). [Hint: (setf slime-net-coding-system 'utf-8-unix)]

I’m happy to report that I can now stream unpredictable UTF-8 streams thru reasonably long chains of tools and it all works. Everything in my tool chain except the fonts is beta or bleeding. I’ll be really happy when I’ve got the database linkages working.

If this was 1995 I’d be less concerned about displacement; but it’s 2004. The good news is that the problem is getting solved.

Efficiency of Exclusion

I had breakfast the other day with a friend who was involved in a groupware company for many years. He introduced the very amusing idea that a good platform strives to be like Seinfeld: about nothing. That’s got delightful synergy with both the idea that a platform vendor strives to create a huge space of options for his developers and the idea that the long tail is impossible to describe.

Then the discussion turned to groupware. We got to talking about the power of software to shape corporate culture.

The first time I observed that was with bug databases. The arrival of a bug database into a project always has a startling effect. It creates a way of organizing the work, call it BD, and it often rapidly displaces other means of organizing the work, call them OM. BD wasn’t just the bug database; it was an entire culture of how to work. Little things happen: for example, dialogs about the issues emerge in the bug comments, and if the tool supports it these dialogs become central to the work. Big things happen, like the emergence of entire roles such as bug finder vs. bug fixer. Anxiety managers quickly learned how to leverage the BD culture. It’s a machine whose crank they can turn. On the one hand, this was all just great.

But BD had a strong tendency to displace other methods, or OM. OM always lacked a name. It wasn’t a single thing you could name; it wasn’t even a small integer of things. BD doesn’t cotton to redesign or analysis. BD fails at solving any issue above a certain size. BD had a strong tendency to glue problems to individuals. All these preferences and tendencies of BD were a good thing, except when they weren’t effective.

I can recall occasions when I would get a large problem solved only by changing the person a bug or collection of bugs was assigned to into a team. Conversely, there were situations where the right answer was to assign a bug to a nonexistent person – i.e. the person not yet hired, or the person who understood the mystery but didn’t know it yet.

What was unarguable about the BD culture was its efficiency and efficacy. Sadly, in the end it excluded the other methods required to solve key problems. “Looks like a nail because I’ve got a hammer” thinking would begin to dominate. Product hard to use? Open a bug. Product starts up too slow? Open a bug. Customer learning curve too steep? Open a bug.

The BD vs. OM syndrome has a second kind of displacing syndrome. Individual contributors quickly realize that if they want to be part of the work culture they need to get hooked up and pay attention to the evolving BD status. This creates a network effect that strengthens ties between the BD culture and the labor. Which is good for the BD culture; but it’s a train wreck if you hired a given person for an exceptional skill that happens to belong to the set of other methods. For example, you’ll observe the graphic designer wasting a few hours a day reviewing the bug database status and individual bug updates so he can be a clueful participant in the workflow – but since you didn’t hire him for that, it’s not cost efficient.

In chatting with my friend the example we got to talking about was group calendaring. We had both experienced a syndrome where the firms we were working in had installed a group calendaring tool and almost immediately the firm’s entire problem solving culture had been transformed. In this case a GC culture would displace OM.

In the GC culture it’s easier to schedule meetings, so you get more meetings. The software has strong opinions about meeting granularity, so you get mostly one-hour meetings. Meetings tend to be family sized. The software makes it easy to invite more people, so more people get invited. Meetings, by their synchronous nature, are exclusionary. Not wanting to appear exclusive toward members of the extended family of coworkers, people tend to invite additional people. People, concerned about what might happen behind the closed doors of those meetings, tend to accept the invitations.

That feedback loop that tends to push meetings toward family size is the GC culture’s equivalent of the graphic designer wasting his time reading all the dialogs on bugs. You get brilliant team members attending three hours of meetings a day because they happen to know that once or twice during those three hours they will say “Ah, I’m not sure that works.” I’ve seen corporate cultures where that’s considered a day’s work for a brilliant guy. “I’m so happy you came to the planning meeting! You really saved our butt.”

This is, of course, part of the problem of the long tail. The organizational culture that adopts one of these highly efficient methods, call it EM so that BD and GC above are examples of EM, develops a power law among its members. Those members who adopt EM enthusiastically gain a place higher up the curve than those who stick to the core competency. They become the elite and all the usual polarizing forces come into play.

None of this requires the introduction of Machiavellian agendas by the players. People are just “atoms in a jar” as the forces play out: synchronization, efficiency, displacement, cultural network effects, and the emergence of an elite with its consequent polarization.

I don’t think it’s overgeneralizing to say that whenever you introduce a synchronizing device you gain some degree of measurable efficiency at the cost of displacing nameless, uncountable other methods that don’t synchronize well with the method you’ve adopted. If you can’t name them then it’s going to be even harder to measure them. Why bother to even try if they are uncountable?

Generous

[Image: a needlepoint bookmark reading “Be strict in what you produce and generous in what you consume.”]

Eve Maler does needlepoint during standards meetings!

Samplers have a long tradition of teaching moral lessons to the young while presumably guarding them from the dangers posed by idle hands.

Open source has taught me that a person with a talent is an opportunity. Done right, it’s fun for everybody. I suggested that what the world needed was some of the design principles of the Internet rendered in cross point.

Might be a best practice if standards authors were required to execute some samplers first.

Eve made that and gave it to Lauren Wood, who hopefully can display it regularly to those in need.

Curiously, my wife just acquired some very nice antique lace. Lucky her!

Paying for Features

This posting is about two things. I can’t, or am uncomfortable with, teasing them apart. One part is about how a firm blocks out the choice to add a feature versus how an open source project makes the same choice. For example, suppose we have the bundle of software used to run a handheld computer and are trying to decide about adding a Postscript viewer to the bundle.

The second aspect is about network effects. In our example if the viewer is added to the base system it will create a much stronger network. That results in an interesting tension in the community around our example handheld. One camp would love to see the addition of a viewer while another camp is ambivalent. Call these camp A and camp B. The tension arises because it’s a lot more fun for camp A if the entire community gets the new feature. If camp B doesn’t get the viewer then camp A won’t be able to casually exchange their Postscript files with the entire community.

Rereading Oz Shy’s wonderful little book on network effects triggered this posting. He outlines a market failure where a firm fails to reach customers with a cool feature because the firm can’t figure out how to structure their pricing so that camp A pays for the viewer while camp B doesn’t.

I’m often asked by clients if open source can help them create a network they desire. So I got to wondering about the analogy between Shy’s example and open source scenarios.

So let’s try to map his example onto the analogous situation in an open source project.

The commercial vendor does a cost/benefit calculation. It costs so much to add the feature, and so they need to raise prices or grow the market by a certain amount to justify the effort. The analogy in open source is that you need to find a team, call it team X, who desire the feature sufficiently to cooperate in adding it to the project.

In Shy’s example the two camps have different willingness to pay for the feature; i.e. we have a demand curve that is a step function. One camp has high willingness to pay and the other has low or zero. Team X, in the open source case, is analogous to camp A; but different.

When I read Shy’s point about how the firm has a pricing problem my first thought was: “OK, what does the demand curve really look like? Are there some extremely desperate folks in camp A?” Because if there are just a few desperate folks they may be willing to pay for the firm to do all the work and then give it away to the entire installed base. For example, it is possible to imagine a handheld firm finding a single Fortune 500 company that would be willing to write them a check to do the work. That depends on the shape of the demand curve.

Folks unfamiliar with open source often assume that Team X is an extreme form of camp A. They assume team X must be really desperate since they are willing to build it! That’s not entirely true; in my experience it’s rarely true. In many cases Team X is unique not in their willingness to pay but rather in how easy it is for them to build it. They have extreme talent (of a very narrow kind) as a substitute for extreme desire.

That’s important. It means that you’re drawing the members of team X out of a population distribution that is surprisingly independent of the one that was simplified into camp A and camp B.

Either high desire or low cost increases the probability that a member of the project community will join a collaborative attempt to add the feature.

The standard way to visualize willingness to pay is a demand curve, and we can draw a similar picture for talent to implement (and other skills that increase the chance that team X succeeds). Unlike your typical economics textbook, these curves are not straight lines. The Fortune 500 company above illustrates that. I suspect these curves are typically power-law in form.
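A toy Monte Carlo sketch – the thresholds are entirely made up, and this is my illustration, not anything from Shy’s book – makes the intuition concrete: draw desire and talent independently from heavy-tailed distributions and count how often a handful of community members clears either bar.

    import random

    def team_forms(community_size=10_000, desire_bar=50.0, cost_bar=0.05,
                   min_team=3, seed=None):
        # Does a team X of at least min_team members form?
        rng = random.Random(seed)
        joiners = 0
        for _ in range(community_size):
            desire = rng.paretovariate(2.0)   # heavy-tailed want for the feature
            talent = rng.paretovariate(2.0)   # heavy-tailed, narrow skill
            cost_to_them = 1.0 / talent       # more talent, lower personal cost
            if desire > desire_bar or cost_to_them < cost_bar:
                joiners += 1
        return joiners >= min_team

    trials = 200
    formed = sum(team_forms(seed=i) for i in range(trials))
    print(f"team X formed in {formed} of {trials} trials")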

You still get failures analogous to Shy’s pricing problem. First you get failures because getting Team X to form is hard – i.e. finding elite members of the desire population and the talent population isn’t trivial. The open source project provides a point of rendezvous, but it doesn’t necessarily engage in a directed search for opportunities with the same focus that a firm is assumed to.

The second stage of the analogy comes from returning to the issue of the network effect. Members of team X desire a large network, as do members of camp A. They are seeking a distribution channel that can give them a means to that end. The open source project offers the option of getting that. Gaining a chance to try to exercise that option pulls team X together. This substitutes for the commercial firm’s directed search for features that will make them money.

For the commercial firm a feature with network effect is a pricing problem; for an open source project a feature with network effect is a team builder.

Expensive Number Registries

I think I’m currently in the lead; so let the games begin. What numbers are the most expensive? The UCC firm prefix is around $150/year and about $750 to get into the game. Martin Geddes takes a stab at the value of a phone number; treating it as a license to bill customers he gets a valuation of around $140 a year.

I’ve become increasingly interested in the number business, i.e. selling minted, registered, official numbers. The raw material is free and plentiful. It is almost the canonical example of the commons. But as the examples above make clear, registered numbers can be quite expensive. The act of converting a common number into private property is the business of a registry. The domain over which the numbers are used forms a club, and the registry plays a role in the regulation or governance of that club. That registry can be a private rent-seeking entity, like Verisign, or a nonprofit that seeks less naively quantifiable goals, like the UCC or ICANN.

Frame it however you like, there is always an owner (or owners) of the registry function. A bit of paranoia about these registries is in order. On the one hand they are well positioned to become abusive monopolies (or oligarchies), while on the other they can suffer all the classic breakdowns that trouble a commons.

The rent-seeking registry owner strives to make the numbers scarce. To convince customers that registered numbers are worth something, that they are better than common numbers, is a bit of a magic trick. The trick is one of illusion, faith, and fact.

Consider the prime numbers needed to implement https connections, i.e. SSL keys. How much of the value of an SSL key from a leading vendor is illusion, faith, or fact?

What concerns me is the lack of awareness among system designers of the choices they are making when they block out the design of these registries. Some design choices favor the emergence of a highly concentrated registrar market while others favor the emergence of a dysfunctional, diffuse set of registrars. While either outcome has unfortunate side effects, it appears to me that most designers aren’t even aware they are making these choices.

Consider an example – domain name registries. Rent seeking is common. Long, tedious efforts by members of the club to reduce the degree of market concentration continue. I have been amused to notice a nice example of how making a market more diffuse creates challenges in the market’s social contract. Consider this example: the scarcity of domain names provides the foundation for a lot of spam filtering, and as domain names become less scarce those techniques are breaking down. Notice that these registry numbers provide the hook on which reputation hangs. A registrar that retains ownership of the number, and only licenses its use, can engage in bait-and-switch pricing, so I wonder about the recent offer of free domain names from a number of top-level registrars. I don’t recall anybody being aware that this kind of game was being set up when the original designs for DNS were blocked out. Certainly some people were aware of the risk of having a single point of failure; but that’s just one example of something to be paranoid about.

Actually I don’t think it’s necessary to have the contest. I think I know what the most expensive registered numbers are: citizen ID numbers. Those in wealthy nations with strong, well-functioning social contracts and deep pools of public services. A Manhattan phone number is just a proxy for a set of those. Which brings us full circle back to Martin’s posting about phone number values. Should the citizens of Manhattan demand that their phone numbers not be handed out to others, since clearly handing them out erodes the value of their numbers?

[[ My current favorite example of a lightweight registrar: Linux User Numbers. ]]