JMESPath is sweet

Back in the 60s, card images were the canonical format for data.  Unix inherited that, and tools like sed, awk, and even Perl all carry forward that legacy, though they call ’em lines.

There have been many attempts to get something meatier to pump thru our pipe fittings. For example: Erlang’s got something, Lisp’s got something, XML, Protocol Buffers, JSON, and many, many others.  JSON is quite popular this morning.

JMESPath is analogous to XML’s XPath, but for JSON.  I first encountered JMESPath in the AWS CLI tool’s --query argument.  But it is totally a standalone thing.  There are implementations in many popular languages and the licenses are quite generous.  I like using it to pluck data out of API endpoints.

One aspect of network effects is that when you capture one, your reward (or possibly your curse) is an immovable installed base.  JSON (and XML) have succeeded where better schemes designed in the past have failed because they both now have an installed base, i.e. the vast number of HTTP-based API endpoints.  Getting those to switch to something better would be a PIA.  Worse is better and all that.

While JMESPath is targeted at JSON, you can often use it to dig data out of the native data structures in whatever language you’re using today.

It was a nerd prank when I was a freshman in college to blow large quantities of punch card chad under another’s dorm room door.

Like any good DSL, JMESPath takes a bit of getting used to.  The interactive tutorial is sweet.

Google’s Cloud Source Repository … WTF :)

So Google has released a beta for “Cloud Source Repositories” as part of their Cloud Platform.  At first blush it looks analogous to GitHub, Bitbucket, SourceForge, etc. etc.  My reaction to this amuses me.

  • Does Google’s left hand know what its right hand is doing?
  • Is this more proof of how a modern high tech company balances trust vs. velocity?
  • Is this a sign that all the cloud OS vendors will discover that source control is just a feature?


It’s painfully ironic that this new thing comes out only shortly after Google murdered Google Code.  It’s really hard to trust a vendor that kills product offerings (e.g. Google Reader, Google Code, etc.).  I suspect most developers are bewildered that they didn’t “just” migrate the Google Code users to this new thing.  It really suggests that the team that shut down Google Code was entirely independent of the one that built this thing.

There is a B-School narrative about how firms need to balance providing backward compatibility against the need to force their installed base to adopt new technology.  The first is necessary because customers need to trust that their investment in your offerings won’t suddenly depreciate as you create new versions.   The second is because old tech can quickly become a legacy tarpit that enables upstart competitors to leave you in the dust.

Microsoft was good at the first and very slow about the second.  And this dialectic is the primary cause of angst at Microsoft – and has been for about a decade plus.  When you are a monopoly you can spend a very long time resolving these tensions.    The unbelievably long road they took to get to an OS with a reasonable GUI is an example of that, though most people have forgotten that era.

So the B-School narrative around this suggests that a monopoly, i.e. a firm whose customers are locked in (aka very loyal), can be extremely slow to offer the cool new thing.  That story is changing though.  As the velocity of high tech change increases, firms are forced to be more aggressive about chasing its tail lights.

Google is willing to abandon customers more casually than I’d expect.  Well, until I recall the last paragraph.  Maybe.

In Google’s case I tend to think it’s more about the culture.  They cough up these products like Google Reader and Google Plus and Google Code, and then the team and management lose interest or switch jobs.  The shiny new thing catches their eye.  Maybe they reward the new to excess.

Anyhow, whatever.

My third take on this is more interesting to me.  I remain fascinated by this new species of operating systems.

The set of things that make up an OS is much larger than they teach you in school.  It’s not just device drivers, and resource management.  These days it’s package management, app stores, developer tools and networks, evangelism, partner management, etc. etc.

Back in the 80s VCs would sometimes utter the damning phrase “Oh, that’s just a feature.”  For example, somebody would author an amazingly complex package that would check grammar in documents.  “Just a feature.”  What that meant was that some other product, Microsoft Word for example, would swallow that feature.  And sure enough, that is exactly what would happen.  Painfully – because each and every time the feature implementation inside the product that swallowed it would suck compared to the standalone product’s.

So one thing I’m thinking is that cloud based source control (bug tracking, wiki, build tools, etc. etc.) is just a feature.  That it will be swallowed by the cloud platforms.  This morning that seems pretty damn inevitable.

The term platform has always been intended to suggest a stable surface you can build on.  But really these folks come from California.  Earthquakes!

The Poor? The pathologist’s report: it’s chronic, hopeless, and their own damn fault. Might be genetic too.

Steve Randy Waldman has another awesome post; in this case he tackles the mystery of how you can have a reasonably well functioning wealthy liberal democracy at the same time as a huge segment of the population is shockingly poor.  Wealth inequality is a simple answer, but then why doesn’t the democratic process work to fix that?  So you get a “trilemma.”  I love triangles.

His names for the three sides of this triangle are: Liberal, Equality, and Nonpathology.  Clearly this idea is going to have trouble getting traction if only because that last one is so odd.  And that’s the key idea.  You can have a functioning liberal democracy along with extreme inequality if you can get everybody to flesh out the Bible’s “For you always have the poor with you” sufficiently.  It works if the majority of the population accepts that the root cause of both is that the poor are afflicted with some pathological flaw – genetic, say, or maybe bad fashion sense.  This is amusingly covered in West Side Story’s “Gee, Officer Krupke.”

This is a technique for suppressing the natural feedback loop you’d expect in a democracy.  It isn’t just the usual reactionary move of saying it would be futile to try and fix a problem they don’t care much about.

Once you decide that the problem is that the poor are suffering from a disease state – which is only true to the extent that they are poor – you can call in various quacks to prescribe their favorite remedies.  Interview training, say.  Or better impulse control.  Or more entrepreneurial risk taking.  Or scolding that they should study harder.  You know: the things that the well off struggle to improve in their own lives.  This is totally a win for the elites because the prescriptions just happen to serve their goals.  Tax cuts!

It’s a very good essay, particularly the tail end where he addresses some of the stories elites tell, and the poor often accept, about the pathological behaviors of the poor.


Is Bitcoin an Affinity Fraud?

Somebody must have written something about the pedestrian idea that Bitcoin is entirely a case of affinity fraud targeted at Libertarians.

In fact I’d enjoy reading something about the data on affinity frauds.  For example, what attributes of a group make it more or less likely to attract these kinds of frauds?  Were wealthy Jewish people likely to attract a Bernie Madoff?

I enjoy reading PonziTracker, and I don’t know that I can see any obvious patterns there.

The above was triggered by an overheard comment that the Republican Party appears to be an affinity fraud targeted at its elderly white voters.

Happy Birthday HTTPD

I gather that the Apache HTTPD server project was born in March of 1995, i.e. it is 20 years old today.  Noting that April 15th is when one’s taxes are due in the US.

My first contribution was in 1997, and I was deeply involved in that, and then in other questions of open source, standards, etc. etc. for about a decade.

Very interesting years, yup.

I’m surprised that my second posting to the dev list mentions typing injuries.  I thought that happened after I got involved, but apparently it was before.  That changed the arc of my life a lot more than HTTPD did.

As one of my Internet friends has been known to point out, nostalgia is a very dangerous emotion, so I’ll stop there.


Things I’m Liking

  • 200 years ago Tambora blew up –> worldwide climate emergency.  You can worry about that too.
  • Forcing the unemployed to take jobs as fast as possible has long term negative consequences on GDP, because they suck at the jobs they end up taking.
  • Interestingly cheerful take on how capitalism’s long term and intimate relationship with criminals is really a wonderful thing.  Extra points to Bloomberg for having Mr. Cook pen that.
  • Tis ironic that a good place to read up on modern trolling technology is the US propaganda organization Radio Free Europe’s series on the Russian Troll Army.  A serious sociologist should write a book on these techniques – he’d get a lot of buzz and highly paid consulting work!

Crime Registry

I’ve occasionally wondered if the sex offender registry might lead to some sort of flocking behavior, where those on the registry tend to gather in particular locations.  I’ve even looked for, but not found, a heat map showing where they flock to.  Yeah Google, I thought you were all-seeing?

Similarly, I’ve wondered if we will see other scarlet letter offender registries.

So it is with much delight that I learn that Utah is close to creating a registry for white collar criminals.  Apparently Utah has a lot of affinity fraud.

Gosh, if they set one of these up in New York State I can visualize what the heat map of Manhattan would look like.

High-Involvement Style

I’m currently enjoying Deborah Tannen’s book on conversational style.  Here is her summary of what might be called New Yorker style.  Sometimes it’s called fast talking.  People unpracticed in this style often find it exhausting or obnoxious.  She calls it “High-Involvement Style.”

1. Topic

  • (a) prefer personal topics,
  • (b) shift topics abruptly,
  • (c) introduce topics without hesitance,
  • (d) persistence (if a new topic is not picked up by others, reintroduce it. Data show persistence up to a maximum of seven tries).

2. Genre

  • (a) tell more stories,
  • (b) tell stories in rounds, in which (i) internal evaluation (Labov, 1972) is preferred over external (i.e., demonstrate the point of the story rather than lexicalizing it), (ii) the abstract (Labov, 1972) is omitted (i.e. plunge right in without introduction; cohesion is established by juxtaposition and theme);
  • (c) preferred point of a story is the emotional experience of the teller.

3. Pace

  • (a) faster rate of speech
  • (b) pauses avoided (silence has a negative value; it is taken as evidence of lack of rapport-Tannen, 1984);
  • (c) faster rate of turn taking,
  • (d) cooperative overlap (the notion of back-channel responses [Duncan 1974] is extended to include lengthy questions and echoes, resulting from a process of participatory listenership).

4. Expressive paralinguistics

  • (a) expressive phonology,
  • (b) pitch and amplitude shifts,
  • (c) marked voice quality,
  • (d) strategic pauses

The ancient global eunuch fad!

So here’s a chart from an illustration by Sabine Deviche; it’s taken from here.  It covers deep time; the axis on the left is in units of a thousand years.  The interesting bit is the 2nd chart in the middle.  It shows “effective” men (green) and women (purple), where by effective we mean that they managed to pass their genes on to the next generation.

Something happened to the men.

At some point the number of men who managed to pass their genes along declined to a vanishingly small percentage of the total population.

But this didn’t happen to the women.

Most of the men disappeared, at least they didn’t show up at the prom.

“4-8,000 years ago there was an extreme reduction in the number of males who reproduced, but not in the number of females.”

What the hell?

You can see the whole illustration here, and read more here.  The paper is inside a walled garden.  Or maybe you’ll enjoy this headline:

“8,000 Years Ago, 17 Women Reproduced for Every One Man.”

This blog post offers this view, showing the event for various regions.  The women, in rose, always reproduce more dependably.  But 15 thousand years ago their effectiveness increased substantially.  And then around six thousand years ago we have this event that reduced the reproductive success of men.

When it happened, how radical it was, and how long it lasted varies from region to region.  It’s hard to see how to explain this without blaming social forces: culture, technology, economics.  Fun to make up insta-theories.

Premature Standardization

Back in the day I was quite interested in Industrial Standardization.  It’s a fascinating complement to the more widely discussed business models intended to capture and own a given market.

This morning I’m aroused by word that the ISO is working on standardizing how we test software.  My reaction is “Argh!  Surely you jest!”

A few more reactions.

In all my reading about standards I don’t recall a good check list to help guide when to transition a body of practice into a standard.  There is an excellent list of what drives standardization.  But that’s more about the intensity of the demand, not the quality of the supply of professional knowledge.

There are a few good discussions of failure syndromes around standardization.  James Gosling wrote up a nice short one about how often the demand for quality runs ahead of the supply of skills, which I mention here.

There is an excellent model of what goes wrong when you have intense demand for skills, low professional knowledge, and low barriers to entry.   I’ll quote from my post about that:

“The lack of clear quality measures leads to the substitution of alternate sources of legitimacy: pomp, pompous attitude, parasitizing on other sources of authority, advertising, character defamation. (A point which deserves a blog posting of its own, but since that’s unlikely I’ll toss in this marvelous line. When this happens you see a pattern: consumers hold the trade in very low esteem but hold their personal practitioner in the highest regard. Where have I heard that before?)”

The effort to standardize software testing came to my attention via Laurent Bossavit’s twitter stream.  Laurent has spent a lot of calories on the puzzle of good software development practices.  You should read his book “The Leprechauns of Software Engineering: How folklore turns into fact and what to do about it.”

And maybe you should sign the petition that attempts to slow down this attempt to prematurely standardize software testing.  Just because we want to have high quality testing practices, skills, and standards does not mean we are ready to write down standards for how to fulfill that desire.  We aren’t ready.