Monthly Archives: October 2005

Weird Economic Constant Discovered

I find this graph weird: it reveals a correlation between the miles driven in the US and our GDP. It is a constant! $3.37 of GDP per mile driven. How can this number have remained a constant? The fleet gets more efficient, no effect. The fleet gets less efficient, no effect. The internet, no effect. Productivity explodes, no effect. Price of gas? No effect. It’s just plain weird.

I got a flat tire once while traveling, and the guy from AAA who came to fix it was a wealth of information about the health of the economy. He felt his business was a great indicator. Apparently so. It implies that such a huge portion of the American economy is intimately tied up in complementary relationships with the automobile that everything that isn’t, health care for example, is just noise in the GDP.

Electronics Condensing on the Factory Floor

My paper today includes one of those typical business-section PR-placed pieces about a local company, in this case a robotics company.

I think their product is just too delightfully amusing.

It’s a warehouse management system. You store all your junk on shelving units, pretty much regular ones. These shelving units are then scattered around the warehouse. Want something? Send a robot to get it. This is when the silly magic happens. The robot doesn’t bring you the part you’re looking for. Instead it brings you the whole shelving unit.

The robot runs out into the warehouse, slips under the shelving unit, lifts it up, and runs back to your desk with everything on it. In their spare time the robots can rearrange the shelving so popular items move toward the front of the warehouse and unpopular items are packed densely toward the back.
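
Here’s a minimal sketch of the kind of re-slotting rule I have in mind. This is my own toy illustration, not Kiva’s actual algorithm, and the names are made up:

```python
# Count recent picks per shelving pod and order the pods so the
# most popular ones get the slots nearest the pick stations.
from collections import Counter

def reslot(pods, pick_log):
    """Order pods most-picked-first, i.e. front-of-warehouse-first."""
    hits = Counter(pick_log)
    return sorted(pods, key=lambda p: hits[p], reverse=True)

print(reslot(["A", "B", "C", "D"], ["C", "C", "A", "C", "B", "A"]))
# ['C', 'A', 'B', 'D'] -- C earns the slot closest to the stations
```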

While I think this idea is very amusing it has the added cuteness of being sufficiently counterintuitive that they certainly got some strong patents out of it. It’s a nice idea because all you need is a flat floor and a slight upgrade in your shelf units. The robots are much simpler than most because they work only in two dimensions.

This is typical of a general trend in automation. In olde factories humans wandered the factory floors, listening, gazing, pulling levers, turning valves, etc. In 20th century automated factories sensors and actuators were wrapped around every component in the factory. Which made everything a lot more expensive, because electronics was sprayed all over everything. We are now seeing some condensation of that electronics – or, to use the overblown terminology of the industry, “the intelligence.” It’s becoming possible to build factories where the majority of the components are reasonably dumb, like those shelving units, with slightly clever robots that run around the factory like the workers of old. Instead of valves that fit the hands of labor, today’s valves fit the robots’ needs; down at floor level, for example.

Boy, are there some powerful network effects and platform business models that will play out in this industry!

The company’s named Kiva Systems and here’s the article.

Your Child Recliner Has Grape Jam on It!. Seller Financing.

Jealous of all the press bird flu’s been getting, the blog-o-sphere ecology has evolved splogs. They are causing congestion of my pubsub queries. Today’s title: Your Child Recliner Has Grape Jam on It!. Seller Financing.. Many of my pubsub queries are coughing up a steady phlegm of splogs, but it could easily get worse. Just remember the great spam plague of 1998; millions of email addresses died!

Let’s toss two terms into the pot: “dominant design” and “convergent evolution.” In part because I want to drop them into ye olde blog. But mostly because they provide an entry point to something I want to get off my chest about bird flu.

Dominant design is a term of art among the folks who think about innovation. Dominant designs emerge in design spaces as innovators progressively “mine out” the options in the design space. Once the dominant designs emerge it becomes possible for complementary activities to gather around them – i.e. they create new design spaces. I like the term because it avoids calling these the best designs. The emergence of dominant designs is extremely contextual and path dependent. Owning the dominant design, being early into that part of the design space, and encouraging the emergence of the complementary stuff is all part and parcel of the gold rush in and around one of these design spaces. Careful though. It is rare that a single dominant design emerges from a design space; more typically a bloom of designs emerges. How skewed the users’ adoption of these designs turns out varies. There are, for example, a handful of dominant designs for operating systems. Typical power-law stuff.

Convergent evolution is the name given to a pattern observed in nature where two very similar species (or organs) evolve in widely separate environments. The designs of the two species converge, presumably, because the two niches place analogous pressures on their evolution. Examples of convergently evolved species can be quite disconcerting. When traveling in Ireland many years ago I found it bizarre how some of the birds along the forest edge would behave almost identically to those in New England. If they had behaved exactly like the birds at home that wouldn’t have bothered me; but in Ireland they would dart in slightly different patterns, which would trigger a startle reaction in me. These days I often experience something analogous when talking to people from differing developer communities. For example both Apache and Wikipedia appear to have convergently evolved some community norms that might be filed under “convivial.” Or, for example, most commercial developer networks begin with a focus on developers (aka hackers) but then a parallel network emerges that nurtures the relationship with the folks who market the products built on the platform.

Dominant design is a way to think about the shape of systems over time. You can expect to find dominant designs in any reasonably mature system (market, institution, ecology, whatever). You can expect to see conflict about which designs will be dominant in systems that haven’t stabilized. Users attempt to time their adoption choices depending on the signals about how stable things are; and the appearance of dominant designs is just such a signal.

Convergent evolution is a way to think about where to go to get ideas. Similar ecologies can be expected to be rich sources for patterns that are likely to work in your home community. While gene-splicing design patterns is easier than teaching Irish birds to fly right, it’s still real work. It often fails because it’s easy to oversimplify and presume that the other system is like your own, when on closer examination that presumption would fall apart.

Ok, so about that bird flu. The standard story template in the media about bird flu is to tell the story of the 1918 flu pandemic. The fear is that nature’s random number generator will mutate the current flu into a form that is virulent in humans. This fear plays on our intuitions about both dominant design and convergent evolution. Does either of these make sense?

We know that systems tend to settle down, with one or more dominant designs settling out. But I don’t see how that is a reasonable expectation in this case. Two forces drive things toward a dominant design: fitness and complements. While an operating system has complements, a flu doesn’t. So is a more virulent flu more fit?

Of course a flu virus doesn’t engage in long-term planning, but the 1918 flu was obviously too virulent for its own good. That’s why it went extinct; it burned bright, fast, and out.

Our intuition that this flu will converge to the same design pattern as the 1918 flu is all well and good, but it only leads naturally to the next question. What niche did the 1918 flu arise in? What niche encouraged so virulent a beast? There is a good discussion of just that question in Paul Ewald’s book.

He argues there that it is entirely possible that the 1918 flu evolved in a very, very unusual niche; i.e. the one created by the Allied armies in support of the trench warfare of the First World War. It wasn’t that the trenches were a squalid venue; those are common. What was unique about that niche was the tremendously vigorous mixing of flu victims.

To survive, a flu virus needs to strike a balance; it must be mild enough to assure it gets passed on to lots of other victims. So the good news is that this tends to temper how virulent a virus of this kind (air infection) becomes. Ewald’s guess about the 1918 virus is that when a young soldier fell ill with a virus in the trenches they would toss him in a truck and then bucket-brigade him through a series of hospitals. This bucket brigade assured that the virus got to infect a lot of other people. This environment encouraged the virus to evolve toward a more and more virulent strain. First off because it didn’t suffer the negative consequences of high virulence, i.e. losing the opportunity to infect a large follow-on population. Second because high virulence increased the chance the victim would be pulled out of the trenches, giving the virus an improved chance of infecting a large follow-on population.
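
You can capture the flavor of Ewald’s argument in a toy model. This is just my back-of-the-envelope sketch, with made-up functional forms and numbers, not anything from his book:

```python
# Toy model: expected secondary infections as a function of virulence.
# Invented assumptions: sicker hosts shed more virus, but a host too
# sick to walk around normally meets fewer people -- unless the army
# bucket-brigades him through a chain of hospitals.

def secondary_infections(virulence, bucket_brigade=False):
    shedding = virulence                      # more virulent -> more shedding
    if bucket_brigade:
        contacts = 50.0                       # truck, field hospital, rear hospital...
    else:
        contacts = 20.0 * (1.0 - virulence)   # bedridden hosts circulate less
    return shedding * contacts

for v in (0.1, 0.5, 0.9):
    print(f"virulence {v}: normal life {secondary_infections(v):5.1f},"
          f" trench war {secondary_infections(v, True):5.1f}")

# In normal life the payoff peaks at intermediate virulence, so selection
# tempers the virus; with the bucket brigade it rises monotonically, so
# selection keeps pushing toward nastier strains.
```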

It’s not hard to see how the hyper-dense bird farms of modern agriculture offer an analogous niche for evolving virulent strains of bird flu. Sidebar: the word selfish doesn’t quite seem to be the optimal label for the practice of feeding generic anti-viral drugs to farm animals.

The important question about bird flu in humans is twofold: how far in the flu design space does the existing bird flu need to travel to be virulent in humans, and are we providing a niche that rewards traveling in that direction? We are not running an early 20th-century European trench war, so that’s good news. Not that I really know anything about the evolution of infectious diseases, but it appears to me that both of the intuitions that drive the fear about bird flu fail to stand up to a casual closer examination. Of course, casual isn’t a very good strategy for risk management.

Bogus

Ok, just to be clear. Jakob Nielsen is a smart guy and I’d be glad to pay him big bucks to critique any significant real world web site I was responsible for.

But … he totally does not get blogs, and his list of mistakes a blog designer can make is totally, absolutely, and completely bogus. I don’t agree with even one of them!

I get the sense that Jakob doesn’t actually read any real blogs; he probably just visits a few of the big-name blogs – which are more like zines anyway. Do people actually call them weblogs anymore?

Curiously, he doesn’t have an RSS feed.

Barndoor standards

Javascript is a perfect example of a syndrome in standardization that keeps CIOs up at night. Let’s call it the barndoor syndrome, after the folksy saying: “Shutting the barndoor after the horse escapes.”

Javascript escaped into the wild before it was standardized; it then underwent very rapid mutation in the installed base. Three forces drove this rapid emergence of new species: the security nightmare, the fun everybody was having, and high-stakes competition. The last is particularly corrosive to the collaboration necessary for standards making. This family of species is now all over the installed base, and as we all know installed bases are very hard to move.

The poor web site designer is stuck with a miserable choice. He can antagonize large numbers of users, or he can make himself miserable. It’s a kind of quantity/quality trade-off.
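
A crude way to see the trade-off. The browser shares and feature tables below are invented for illustration, not real measurements:

```python
# Hypothetical installed base and feature support -- all numbers made up.
installed_base = {"ie5": 0.40, "ie6": 0.35, "netscape4": 0.10,
                  "mozilla": 0.10, "opera": 0.05}
supports = {
    "dom_level1": {"ie5", "ie6", "mozilla", "opera"},
    "xmlhttp":    {"ie6", "mozilla"},
}

def reach(features):
    """Fraction of users whose browser handles every feature the page needs."""
    ok = set(installed_base)
    for f in features:
        ok &= supports[f]
    return sum(installed_base[b] for b in ok)

print(reach({"dom_level1"}))             # 0.90 -- plain page, wide audience
print(reach({"dom_level1", "xmlhttp"}))  # 0.45 -- fancy page, or lots of workarounds
```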

The standard(s) for Javascript aren’t a foundation for innovation; they are more like a beacon in the night toward which their authors hope the installed base slowly migrates.

When learning the language, the standard is only a point against which you can measure the distance – in units of exceptions, possibly – that you must travel to reach this or that subpopulation of the installed base.

Driving the horse out of the barn is very tempting, since it builds momentum and helps you search for the best design informed by actual use rather than ivory tower mumbling (i.e. security architectures). So we could rename this syndrome entrepreneurial standards making rather than barndoor standards making.

When small entrepreneurial firms do this it’s reasonably ethical; how else are they going to get traction in the market? When large monopolist firms do it the ethics are much muddier. Which is something to think about when reading people’s critiques of Microsoft’s InfoCard. It is of course irrelevant to Microsoft whether their designs go thru a legitimate standards process, just as long as they win in the marketplace. Microsoft has cleverly attempted to substitute a conversation among bloggers for a real standards process. The technorati are one audience you need to convince before a standard will gain great momentum, but they are not a substitute for real legitimate standards making. Assuming, that is, that you lack sufficient market power to just command its success.

Bypassing the Toll Booth

The business model I find most fascinating is the two-sided network effect, where the business acts as a middleman or a distribution channel between two distinct groups. eBay is my standard example, bridging between buyers and sellers. Singles bars where men and women rendezvous are another fun example.

I often use the example of a bridge with a toll booth as a way to help visualize this model. I often emphasize the role of the two cities that grow up on either side of the bridge; the complementary industries. Other times I emphasize the role of discriminatory or value pricing as a means for the bridge owner to encourage a larger network. For example bridge owners will agree to lower prices for travelers with an alternative, i.e. those who aren’t locked in. Bridge owners will cheerfully lower prices for travelers who, like the poor, don’t derive significant value from their travels.

The media business is a member of this class of business models, bridging between entertainment producers and consumers. Technology has been lowering the cost to create substitute bridges, and they have been desperately striving to avoid displacement.

In their thrashing around, one technique they have been trying is digital rights management. This tactic has lots of problems; in particular it misses the point entirely: their bridge is now irrelevant. DRM is a strange attempt to use technology to create a chasm which the industry can then offer to bridge; it’s all backwards.

But there is another problem with DRM. It doesn’t play well with pricing. Nobody in the bridge business wants to charge his entire universe of users the exact same price. Such a strategy would be fatal, at least in the presence of competition.

Pricing in network businesses is a balance between extracting taxes from the network and assuring that your bridge is carrying the most and best traffic. When two of these bridge businesses compete, the one that strikes the balance best wins. Most naive approaches to DRM tie one arm behind your back.
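
A toy example of why the uniform toll loses; the segments and numbers are invented:

```python
# Two traveler segments -- one locked in, one with an alternative route.
segments = {
    "commuters": {"wtp": 5.0, "count": 1000},  # willingness to pay, headcount
    "tourists":  {"wtp": 1.0, "count": 4000},
}

def revenue(price_for):
    """Total toll revenue; a segment crosses only if the price is at or
    below its willingness to pay."""
    return sum(s["count"] * price_for(name)
               for name, s in segments.items()
               if price_for(name) <= s["wtp"])

print(revenue(lambda _: 5.0))                 # 5000.0 -- high toll, empty bridge
print(revenue(lambda _: 1.0))                 # 5000.0 -- full bridge, cheap toll
print(revenue(lambda n: segments[n]["wtp"]))  # 9000.0 -- discriminate and win
```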

So this story is no surprise: Musicians tell how to beat system (Bruce Schneier).

First off, it’s obvious that musicians would be happy to tell their listeners how to get around the bridge, because any information about alternative routes over the river has two positive effects for them. It helps them reach more of their customers and grow their own networks. More interestingly it improves their ability to negotiate with the bridge owner – lowering his tolls and increasing their cut.

The second part is more interesting though.

Sony BMG says it is not trying to prevent consumers from getting music onto iPods. Fans who complain to Sony BMG about iPod incompatibility are directed to a Web site (http://cp.sonybmg.com/xcp) that provides information on how to work around the technology.

So Sony cuts the iPod owners a break, because antagonizing them would be very bad for Sony’s network.

Catch the Wave, Control the Supply

Apple’s classic market share, small but significant, has always allowed it to catch technology waves before other vendors. The AirPort is a perfect example of that. The PC reference platform is a much more immovable object than Apple’s platform. This is one reason why it has been to Apple’s advantage to keep the hardware platform closed.

When a handful of vendors are struggling to get their industry standard to catch fire, that can give Apple an opportunity to build momentum. While I don’t know if Apple managed to control the early supply of 802.11b components (though I suspect so), I have read that Apple managed to lock up the early supply of the tiny disks found in the iPod. I personally think that’s 80% of why Apple was able to gain such a huge lead and dominance in that market.

This tactic often backfires for Apple. All thru the 1980s and 90s Apple would regularly have a product success followed by an inability to supply the demand they had created. That risk is built into the tactic. It can also backfire when the emerging standard fails to catch fire; the portable-friendly PowerPC is an example of that.

Managing your supply chain is a contracting problem, among other things. So Apple can be seen to occasionally announce that they have signed long-term supply contracts with vendors for key bleeding-edge components, for example flat screens. Such announcements should be viewed through the lens of inter-firm signaling; particularly the signals about where momentum is building for this or that standard or component.

Of course if this tactic works perfectly then other firms wake up to discover that they can’t get the components needed to compete. They get testy. Here’s a story about how other firms are looking for regulatory relief after Apple locked up most of the supply of the leading-edge flash memory used in the iPod nano.

There is a difference, an important one, between this tactic’s use by a company with a small but significant market share and its use by a company that totally dominates a market. In the first – small but significant – case the firm is playing a high-risk disruptive role that can reshape the market. In the second – large and dominant market share – the firm is consolidating and leveraging its control over the market. The first is healthy; the second just reinforces the existing monopoly.

Gini For Various Nations

This table shows recent entries from the data reported here.

Nation Gini Year
Austria 23.7 2001
Sweden 25.7 2002
Netherlands 25.8 2001
Bosnia and Herzegovina 26.1 2001
Luxembourg 26.6 2001
Hungary 26.7 2002
Slovak Republic 26.7 2002
France 27.0 2002
Czech Republic 27.3 2002
Germany 28.0 2003
Albania 28.1 2002
Ireland 28.9 2001
Belgium 29.3 2001
Ethiopia 29.7 2000
Finland 30.3 2003
Slovenia 30.7 2002
Australia 30.9 2002
Croatia 31.0 2001
Switzerland 31.1 2002
Kazakhstan 31.3 2001
Bangladesh 31.7 2000
Greece 32.3 2001
Macedonia, FYR 33.2 2002
Taiwan 33.9 2003
Indonesia 34.1 2002
Belarus 34.2 2002
Spain 34.6 2002
United Kingdom 35.0 2003
Poland 35.3 2002
Estonia 35.5 2003
Latvia 35.8 2002
Armenia 35.9 2002
Italy 36.4 2002
Canada 36.5 2000
Tanzania 36.7 2001
Bulgaria 37.0 2002
Norway 37.0 2002
Portugal 37.1 2001
Egypt 37.8 2000
Serbia and Montenegro 37.8 2001
Jamaica 38.6 2000
Israel 38.9 2001
Denmark 39.0 2002
Lithuania 39.0 2002
Mauritania 39.0 2000
Romania 39.1 2002
Turkey 39.8 2000
Tunisia 40.6 2000
Ukraine 41.8 2002
Thailand 42.7 2001
Moldova 43.6 2002
Cameroon 44.2 2001
Uruguay 44.5 2000
China 44.9 2003
Georgia 45.4 2002
Venezuela 45.8 2000
United States 46.4 2003
Sri Lanka 46.9 2002
Madagascar 47.4 2001
Singapore 48.1 2000
Uzbekistan 48.1 2001
Kyrgyz Republic 49.0 2002
Russian Federation 49.1 2002
Peru 49.3 2000
Philippines 49.5 2000
Costa Rica 50.1 2000
Azerbaijan 50.8 2002
Mexico 51.1 2002
Argentina 52.3 2001
El Salvador 53.8 2000
Nicaragua 54.2 2001
Uganda 54.6 2000
Ecuador 56.0 2000
Colombia 57.4 2000
Panama 57.8 2000
Chile 59.5 2000
Guatemala 59.8 2000
Brazil 61.2 2001
Bolivia 63.3 2000

Gini is a metric of income inequality. Wikipedia has a description, though the curves it shows are more symmetric than the real world distribution. Gini is a percentage. In a nation with a perfectly equitable distribution of income it is zero, and in a nation where one household controls all the wealth it is 100. You can think of it, loosely, as the percentage of the income dollars (or whatever) shifted from lower to higher incomes.
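
For the curious, computing it from raw incomes is only a few lines. A minimal sketch using the standard mean-difference formula (nothing to do with the sampling machinery behind the table above):

```python
def gini(incomes):
    """Gini coefficient as a percentage, via the mean absolute difference.
    Note: for a finite sample the one-household-has-everything case tops
    out at 100 * (n - 1) / n, approaching 100 as the population grows."""
    xs = sorted(incomes)
    n, total = len(xs), sum(xs)
    # G = sum_i (2i - n - 1) * x_i / (n * sum(x)), i counted 1..n over sorted x
    weighted = sum((2 * i - n - 1) * x for i, x in enumerate(xs, start=1))
    return 100.0 * weighted / (n * total)

print(gini([1, 1, 1, 1]))    # 0.0  -- perfect equality
print(gini([0, 0, 0, 100]))  # 75.0 -- one household of four holds it all
```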

Good practice demands that this table be taken with a great deal of care. The methods used to sample the populations of the various countries are quite diverse, and the quality of the samples taken varies widely. There is a very good overview of how hard it is to get good data like this in the discussion materials that come with the data here.

It amazes me that a statistic so central to economic analysis can be so hard to find.

Prophecy AI

“If only AI had worked out better” is one of the favorite sayings of a contemporary and fellow traveler thru the 80s AI boom. There are so many problems in life that a bit of effective AI ought to be able to solve. Coordinating the time of a meeting, or calling up the cable company and squeezing the next six-month discount out of them.

I’m currently reading “When Time Shall Be No More: Prophecy Belief in Modern American Culture.” I haven’t gotten to the modern part yet; I’m still working thru the history of apocalyptic prophecy; three meaty chapters worth! The pattern the author sees in the historical record is that whenever western societies undergo trauma a bloom of apocalyptic prophecy sweeps thru a segment of the population. He reports that each time things settle down again the bloom dries up, quickly. Each time this happens the various characters and events in the Book of Revelation are methodically mapped onto the recent history. This allows the narrators to show their enemies are the agents of Satan. To hear him tell it these blooms seem more organic than planned. The foreshadowing (teehee) is that later in the book modern media empires and PR-based political movements will discover and exploit this market. A clear sign of the end times!

The laundry list of historical figures whose names have been proven to sum to 666 is amazing! Every pope, of course, and most historical figures.

Mapping names into a calculation that reaches a particular value is a simple search problem. The kind of thing that early AI tech is very good at. So it’s not surprising that you can get software to find these calculations. What I found amusing though was that you can find message boards that outline how to hack that software, so you can use it for free. Surely we need an open source version.
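
The core trick fits in a dozen lines. A sketch of one popular scheme – the old A=6, B=12, C=18 letter values these pamphlets favor; pick the scheme and the “hits” follow:

```python
# Assign A=6, B=12, ..., Z=156 and sum the letters; one of several
# schemes in circulation, chosen precisely because it yields fun hits.
def name_value(name, step=6):
    return sum(step * (ord(c) - ord('a') + 1)
               for c in name.lower() if c.isalpha())

for name in ["computer", "new york", "santa claus"]:
    print(name, name_value(name))   # all three come out to 666

# A real search just grinds through a list of names and reports any that
# hit the target; varying the scheme until your enemy scores is the
# time-honored refinement.
```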

Now if only AI had worked out we would have software that could read any conflict narrative and map its participants and events into the Book of Revelation, allowing us to demonize and polarize with much greater efficiency.

Firefox’s impotent user.js

According to this

… but here’s a tip: prefs or other JavaScript that you don’t want overwritten (e.g. comments) can be put in a file called user.js …

Mozilla has what I’d call a user init script, aka a .rc file. But Firefox’s doc doesn’t say that (see here), and my experiments suggest that the only thing you can put in a Firefox user.js file is calls on user_pref. Everything else is ignored? Bummer.
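
For example, a user.js along these lines works; the pref names here are just common ones, check about:config for what your build actually has:

```
// user.js -- lives in the profile directory; Firefox folds these into
// prefs.js at startup, stomping whatever you set in the UI last session.
user_pref("browser.startup.homepage", "about:blank");
user_pref("network.http.pipelining", true);

// But in my testing, real script -- variables, functions, anything
// beyond bare user_pref() calls -- appears to be silently ignored.
```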