Decentralisation and Generative Systems

21 Jan 2014 — Mike Sandford

I went to a Redecentralize meet-up the other day (3rd December, as it happens) and thoroughly enjoyed myself meeting a bunch of people and finding out what they did and how they saw the internet playing out in both the short and long term. For the long term, I heard how the Catholic Church has managed overall organisation, communication and data storage over a very long time, rolling with the technology and using what works, including keeping the older methods where appropriate, or simply where it just doesn’t matter and there is no need to change. For more immediate consideration, I found a mechanism that uses Tor to provide anonymised email. It fits into the mail transport layer and uses Tor or the open internet quite transparently, depending on the capability of the mail recipient.

I could have spent all day listening to the people there and discussing ideas, and that is an essential part of what Redecentralize can offer: the facility to talk to people and discover what we have in common. Having said that, the Redecentralize team are aware that other things might be needed. My own approach to decentralisation was, initially, to think about what I would want to do; what are the use cases and user stories. That sort of works, but it turns out that most of the cases I could think of are already handled, either by old technology or by current projects. Besides which, I’m aware that my use cases are themselves technology based, and that is not a good place to start from. So what we might be looking for are ways of enabling experiments to happen and ways for ideas to gain traction. Clearly I’m not the only one to be thinking this, and words such as ‘standards’ and ‘governance’ were floating in the air. I’m not sure that we need to go that far, and it is almost certainly too early. I like standards, though, and in the right place they give developers free rein to develop tools and provide features without having to worry about how their software works with others. So maybe we’re not going to go as far as managing standards, it depends, but we can at least provide a framework for discussion, and I’m proposing an idea that might help here.

Actually, it’s not my idea, it comes from The Future of the Internet (and how to stop it) by Jonathan Zittrain. The idea is that systems that allow for experiment and appropriate uptake are what he calls Generative Systems, and they have five characteristics: leverage, adaptability, ease of mastery, accessibility and transferability.

Leverage means that some task is made easier - something that was not possible, or just impractical, becomes doable.

Adaptability describes the range of tasks the tool can be applied to. The more adaptable the tool is, the more uses people will put it to.

Ease of mastery refers to the amount of effort required to learn how to use it and adapt it.

Accessibility refers to the hoops the user has to go through to get hold of the tool. In our context this might range from automatic inclusion in the operating system at one end, to having to configure and compile a set of libraries at the other.

Transferability is a measure of the ease with which users’ ideas, and related changes to the tool, can be promulgated to other users.

These are very short descriptions to whet your appetite. You can read more about them here, but I recommend you buy the book.

I’m not suggesting that authors of tools should necessarily aim to maximise any or all of these characteristics. I want to use these ideas to inform our reporting of projects. To be fair, any reporting that brings out these factors may well have an influence, but, equally, there may be projects, the Serval Project for example, where almost anything may be better than nothing.