Monthly Archives: May 2007

Turf Maintenance

The L-Curve model would have us pack the entire US population onto a football field, 44 people per square inch.  In the center of the field the 2 inch high turf represents the median income.  Society cares for the lawn: prisons to the left, schools to the right. California for example.
Prison Funding vs. Higher Ed

Less than 1% of the American population is in prison, while 6% of those over the age of 18 are enrolled in school.

I Spy Class War

This image is lifted from an article at First Monday (A Practical Model for Analyzing Long Tails).

football-pl.gif

I’d seen this trick of using a sports field to help inform your intuition about power-law curves before. In that case the topic was the distribution of wealth. These folks talk about the L-Curve; shown here (video):

l-curve.png

theusual.jpg

Both of these do a nice job of helping to visualize the actual shape of these curves. They help to clarify why the politics and business models that serve the two legs are very different, and why appeals that emphasize middle-class values should be treated with some suspicion. The more typical illustration, shown to the right, is preferable if you want to deemphasize the polarization and highlight the uniformity of the underlying generative processes.

Sharing as exercise

I hate most explanations for why people participate in Open Source. I care about this question.  I enjoy the game of puzzling out the answer.  In a reversal of the usual cliche I love the game and hate the players; the casual players who think they know the answer. After two decades of thinking about this question I love that I still stumble upon new answers.

Owners of capital goods often have excess capacity that they might share.

This morning I was attempting to read, yet again, Benkler’s essay on sharing nicely, wherein he argues that the usual dialectic framing of how to coordinate activities (hierarchy vs. markets) has blinded us to a third scheme; i.e. sharing. He points out some huge coordination problems that are solved via sharing, and he does the good and necessary work of constructing an economic model for why some problems are well solved by sharing.

Part of his model explains why owners have excess capacity lying around that is suitable for sharing.  In that explanation I was excited to notice a new motivation for sharing.

Benkler draws our attention to excess capacity that owners cannot consume: idle cycles on your PC, or empty seats in your car as you drive hither and yon. His model for this is analogous to that seen in value pricing models; i.e. if you own a hotel full of rooms, then as the hour grows late you should consider selling those rooms for less, since otherwise they will go idle and you will get nothing.

I found myself thinking at that point about the emotions an owner has about this excess capacity; for example the sense of lost opportunity, leading to emotions of frustration, grief, guilt.  The hostess pressing leftovers on her guests as the party wraps up is motivated by a horror at the waste; that shows how motivating this kind of sharing might be.

But the resource that drives open source is talent, so the question naturally arises at this point: does this model have something to say about sharing around the creation of these knowledge pools? This is the delightful bit. If we think of skill as a capital good, then talented people own buildings full of skills, which they lease out to earn their living. Of course most of the time they can’t find a buyer for all their skills.  The rooms are empty.  It’s not surprising that they are willing to share some of this capacity freely.

Skill, unlike capital equipment, can improve with use; creating an incentive for sharing.

I had already noted many of the motivations outlined above for sharing one’s talents: countering the guilt of letting them go to waste, the positive emotions of generosity, the low cost of giving away the excess capacity. But I had not noticed something else: skills that are not exercised decay. While the hotel room left idle depreciates only slightly, a skill unused decays quickly. The skill demands that I exercise it; its survival depends on that exercise. If I hoard it, it evaporates.

OpenSSH authorized key commands

I didn’t know about this and it’s quite useful. In OpenSSH you can specify the command to run when a given key connects. In effect this allows you to treat ssh keys as capability tokens. You can gin up a key pair, and then configure things so that key is useful for one and only one operation; say retrieving a log file or polling a remote system.  If you don’t bother giving the key a pass phrase (you could also configure your ssh-agent just right), the client machine can use the capability in scripts as it pleases. The details for how to set up the authorized_keys file to do this are in the sshd manual.

For example, here is how you might create a way to ask a machine what its uptime is.  First we create an ssh key pair to use:

ssh-keygen -t dsa -f /tmp/uptime_id -C 'for uptime queries' -N ''

The -N '' results in an empty pass phrase.  I like to add a comment stating what this key’s purpose is.  No doubt you’d want to save it someplace more useful than /tmp.

That results in two files, the private and public parts.  We want to add the contents of /tmp/uptime_id.pub into the ~/.ssh/authorized_keys of our target machine (call it target.example.com) like so:

command=”/usr/bin/uptime” ssh-dss AA … ucP0NNHm+w== for uptime queries

except of course the ” … ” in that example should be the entire public key.  Take care not to remove the other entries in your user’s .ssh/authorized_keys on target.example.com.
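You can tighten the entry further. The restriction options below are documented in the sshd manual; they disable forwarding and pty allocation, so even if the private key leaks it can do nothing but run uptime. The key material is elided here just as in the example above.

```
no-pty,no-port-forwarding,no-X11-forwarding,no-agent-forwarding,command="/usr/bin/uptime" ssh-dss AA … ucP0NNHm+w== for uptime queries
```

All the options go on the one line, comma separated, before the key type.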

Now you’re all set.  Just do:

$ ssh -q -o 'ControlMaster no' -i /tmp/uptime_id target.example.com
11:00  up 6 days, 18:37, 3 users, load averages: 0.66 0.41 0.39
$

The “-q” quiets ssh’s verbosity, and the “-i /tmp/uptime_id” selects our single-purpose key.  The -o to disable ControlMaster avoids the risk that you may already have a shared connection open which ssh would try to reuse.

Meanwhile on the client side you might create a pseudo-host in your .ssh/config file.  Wired up just right, you can say ssh fetch-log-from-foo.  The ssh_config manual has additional useful documentation here.  IdentitiesOnly helps keep a stray ssh-agent from blessing the connection with a more capable identity.  You should also disable ControlMaster there.
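A sketch of such a stanza for the uptime example (the alias name uptime-target is made up; the options are standard ssh_config ones):

```
Host uptime-target
    HostName target.example.com
    IdentityFile /tmp/uptime_id
    IdentitiesOnly yes
    ControlMaster no
```

With that in ~/.ssh/config, a bare ssh uptime-target runs the remote uptime.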

Appalachian for OpenID

The group I’m involved with, as my job, has released some software: Appalachian, an OpenID add-on for Firefox 2.x. It’s under a BSD-style license. It helps you manage multiple OpenIDs, and smooths the process of logging into sites using OpenID.

OpenID is a good example of a solution that has lots of benefits and lots of risks. It sits in that part of the risk/benefit plane where our brains don’t like to sit.

It’s great because it has strong adoption drivers: not too hard for web sites to use, not too hard for users to understand, and easy for lots of identity providers to add to their offerings (and, more importantly, it’s very good for them). So it’s very likely that OpenID will get a slew of adoption.

It’s risky because it encourages people to adopt a single global identifier; and that’s bad because it makes it easy for arbitrary 3rd parties to aggregate data about them.

The global identifier problem in internet identity systems is a puzzle. If you hang out with Semantic Web people for a while you begin to see a picture of a future where, by giving every entity its own URI, we can casually make statements about those entities. How many URIs denote the same entity quickly becomes a problem to which all the social sciences have something to contribute.

I have a strong opinion about this. I think entities should have more URIs, not fewer; but it is in the nature of our technology that we are likely to head rapidly in exactly the opposite direction. I.e. what I want is entirely in conflict with what I believe very strong forces are going to deliver.

So I’m curious about how to fight back. For example, how could we enable users to have billions of OpenIDs, rather than one (or a handful)? Appalachian is, among other things, a step in that direction.

Jane

Dick Hardt from Sxip Identity draws our attention to a new lightweight identity solution from, you’re not going to believe this, 3M! I agree with Dave Winer’s point that the emerging Internet generation treats identity in fundamentally new ways; so while this solution is not conformant with standards, on many levels, it is long tail, user centric, and quite sticky. Update: you may need to widen to get the big picture.

Lotus fought the Powerlaw and the Powerlaw Won

There is an encomium to Lotus in the Boston paper today. I worked at Lotus back in the 1980s; during the era when Microsoft killed them or, if you prefer, they committed suicide. Lotus made bad choices about where to make their home. They died because they failed to pick the right answer to the multihoming problem. Of course there are plenty of other aspects to the story, but that’s its core and everything else is noise.

During that era I recall chatting with what we would now call the CTO of a company we went on to acquire. I asked him why he had decided to expend vast resources on keeping his product platform-independent rather than on features for his users. This question was a lead-up to a question about how he viewed what I’d now call the ‘plausible premise’ of each of the platforms he was supporting. His three platforms were, if I recall correctly, in order of plausibility: Mac, Motif, and Windows 1.0. As a glimpse into my point about how Lotus was getting these questions wrong, at the time it was a bone of contention that he didn’t have an OS/2 port.

His answer was “I have no idea which one will survive.” It’s a glimpse into how naive we were as an industry back then that this answer surprised and delighted me. It became a bit of a cliche for me. To ask “Which of these platforms is going to survive?” in planning meetings was surprisingly provocative; the kind of thing that gets people asking if you’re a team player.

Multihoming is costly. (Ben recalls at this point the misery of failing to learn a foreign language in high school.) These days, for example, it’s damn expensive to support both SOAP and REST APIs for your web services. You need to support both for the prosaic reason I thought the CTO would raise, i.e. to get access to the maximum number of users. But you also need to support both because you don’t know which will survive.

Most of my technically informed readers cannot imagine that one or the other of those could possibly die off. Holding that thought in mind, you are recreating a bit of what the world looked like when somebody would float the idea that X11/Motif, or OS/2, might die off.

Friction

Today I noticed this ad offering to reimburse you for getting a passport: $157 per adult.  I felt some sympathy for the advertiser, an island in the Caribbean.  A place people go for the weekend; well, they used to.  The island tourism folks woke up recently to discover that their numbers were down, and that the newly increased tedium of getting a passport has caused huge numbers of idle travelers to decide to, well, just go someplace else.

When my 1st son got his learner’s permit it took us three trips to the registry before we managed to accumulate enough documentation to convince them to let him have it.  My 2nd son submitted his first paycheck’s stub rather than the check, and the bank called to correct the error.  A bit got set on his account that didn’t get cleared.  So the ATM ate his bank card.  It took months to get a replacement card, since his school had yet to issue the ID card the bank required.  All N of my financial institutions have recently insisted that I add four security questions, including one involving a photograph; which is a pain since I share access to these accounts with my spouse, so all 30-odd questions and their answers have to be kept in some shared location.  We recently got new passports, a project that was at least a dozen times more expensive and tedious than doing my taxes.

I once had a web product that failed big-time.  A major contributor to that failure was the tedium of getting new users through the sign-up process.  At each screen they had to step through we lost 10 to 20% of our customers.  Reducing the friction of that process was key to our survival.  We failed.  It is a thousand times easier to get a cell phone or a credit card than it is to get a passport or a learner’s permit.  That wasn’t the case two decades ago.
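The drop-off compounds quickly. A sketch of the arithmetic, assuming a hypothetical five-screen sign-up flow losing 15% of users at each screen:

```shell
# fraction of customers surviving a 5-screen funnel, losing 15% per screen
awk 'BEGIN { r = 1; for (i = 1; i <= 5; i++) r *= 0.85; printf "%.0f%% survive\n", r * 100 }'
```

Barely more than half the screens, and already over half the customers are gone; that’s the friction at work.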

The Republicans have done a lot of work over the last decade to make it harder to vote, creating additional friction in the process of getting to the polling booth.  The increased barriers to getting a driver’s license, passport, etc. are all part of that.  This makes sense because now, unlike 30 years ago, there is a significant difference in the wealth of Democratic vs. Republican voters.

Public health experts have done a lot of work over the decades to create barriers between the public and dangerous items, and to lower barriers to constructive ones.  So we make it harder to get liquor, and easier to get condoms.  Traffic calming techniques are another example of engineering that makes a system run more slowly.

I find these attempts to shift the temperature of entire systems fascinating.  This is at the heart of what you’re doing when you write standards, but it’s entirely scale free.  Ideas like this are behind the intuition of some managers who insist on getting everybody on the team working in the same room with no walls between them.

In the sphere of internet identity it is particularly puzzling how two countervailing forces are at work: one trying to raise the friction and one trying to lower it.  Privacy and security advocates are attempting to lower the temperature and increase the friction.  Thus you get the mess around the passport, Real ID, and the banks.  Wearing that hat, it seems perfectly reasonable that one should present photo ID when voting, or have one’s biometrics captured when crossing a border.  On the other hand there are those who seek in the solution to the internet identity problem a way to raise the temperature and lower the friction; that more rather than fewer transactions would take place, that more blog postings would garner good comments, that more wiki pages would be touched up, that more account relationships would emerge rather than fewer.

Of course the experts in the internet identity space are trying to strike a balance.  It’s clearly one of those high-risk, high-benefit cases that people have trouble holding in their heads.