Happy 40th Birthday, Internet!
Annoyed by this piece.
There's something about articles that are half true that annoys me more than those that are completely wrong-headed.
Yes; the most important thing about a new technology is the change in practice that it engenders. My favourite writing about this is Douglas Engelbart's. In Augmenting Human Intellect: A Conceptual Framework he situates the knowledge worker in the midst of language/tools/technology -- and they evolve together. Engelbart was particularly interested in the deliberate evolution of these forms, and his early bootstrapping work was just this: a deliberate attempt to change technology and practice at the same time; dialectically. The only way to discover the road is to walk it.
So yes, bleeding obvious.
Crap technology, on the other hand, is technology that doesn't change practice, that has no effect on the organisation. There's a lot of this -- it's easy to implement and poses no challenges. Often, it outsources work that was once done by one department (say HR or Finance) to those they are supposed to serve, freeing them for cocktails, or whatever it is they're supposed to be doing.
The '20th century paradigm' framing is false; perhaps the 12th century chain of being is more apt?
The metaphor for modernism is the idealised notion of science: perfectibility through rational investigation and collaboration. This is what the internet came out of -- in a very real sense. TBL developed the web in order to share results with a wider group of people, more quickly. Greater access to information leads to better analysis of that information.
The early history of computers was indeed disruptive -- but in a small way, because, by definition, they weren't yet widespread.
- Community computing in the 70s
- Bulletin boards for interest groups
- In business, the spreadsheet enabled managers to get information and analyse it without being dependent on the power bases within the organisation. I wasn't there, but I bet it was disruptive.
The issue there is that the early adopters learned how to use the tools -- the early state of the technology meant they had to, and they were early adopters because they were interested. So computing then meant the ability to manipulate data in whatever way they wanted.
This has changed; because of the tendency to inertia, and because power bases don't like to give up power, what passes for computing is using the computer in a very limited sense -- i.e. as a particular machine: an email machine, a word processing machine, an internet browsing machine.
Each of these uses is an application of the general purpose computer -- that is, the machine that can be programmed. Computer literacy is programming -- anything else is just office work.
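To make the contrast concrete, here is a minimal sketch -- the file name and its columns (a hypothetical expenses.csv with department and amount fields) are assumptions for illustration. The point is that a few lines of a general-purpose language let you ask your own questions of the data, instead of waiting for a dedicated application to exist:

```python
# A minimal sketch: summarise a (hypothetical) expenses.csv by department.
# The file name and its columns ("department", "amount") are assumed here.
import csv
from collections import defaultdict

totals = defaultdict(float)
with open("expenses.csv", newline="") as f:
    for row in csv.DictReader(f):
        totals[row["department"]] += float(row["amount"])

for department, total in sorted(totals.items()):
    print(f"{department}: {total:.2f}")
```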
Computing has been debased over time -- but that doesn't prevent people learning how to use the machines flexibly; it just obscures the possibility.
The new web stuff is just another attempt at proper computing -- using web services to mash up data in new ways, and using massive input from many people to provide that data.
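A rough sketch of the mash-up idea -- with the caveat that the endpoints and field names below are invented placeholders, not real services: pull structured data from two web services and join it into a single view.

```python
# Sketch of a simple mash-up: fetch JSON from two (placeholder) web services
# and combine them. The URLs and fields are invented for illustration.
import json
from urllib.request import urlopen

def fetch_json(url):
    with urlopen(url) as response:
        return json.load(response)

venues = fetch_json("https://example.org/api/venues")    # e.g. [{"id": 1, "name": "...", "postcode": "..."}]
ratings = fetch_json("https://example.org/api/ratings")  # e.g. [{"venue_id": 1, "stars": 4}]

stars_by_venue = {r["venue_id"]: r["stars"] for r in ratings}
for venue in venues:
    print(venue["name"], venue["postcode"], stars_by_venue.get(venue["id"], "no rating"))
```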
Writing

"In fact, the term 'hardwiring' that became a dominant late twentieth-century metaphor for the human brain comes from the permanent and unchangeable electrical circuitry within an early computer that dictates its capabilities."

helps to obscure what's really going on. The central insight of computation is that it's possible to have a definite system (hardwired) that can become anything you want it to. I.e. programming is possible. In the early days, programs themselves were physically wired onto plugboards or punched onto cards; and the computer was the system that could take these -- the context for the programs.
This is important. The computer is exactly that machine that can become any machine; that is what computing is.
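One toy way to see the point: the little interpreter below is itself fixed -- "hardwired", if you like -- yet it becomes a different machine for every program you hand it. The instruction set is invented for the example.

```python
# A fixed interpreter (the "hardwired" part) whose behaviour is determined
# entirely by the program it is given: the machine that becomes other machines.
def run(program):
    stack = []
    for op, *args in program:
        if op == "push":
            stack.append(args[0])
        elif op == "add":
            stack.append(stack.pop() + stack.pop())
        elif op == "mul":
            stack.append(stack.pop() * stack.pop())
        elif op == "print":
            print(stack[-1])
    return stack

# The same fixed system behaving as two different machines:
run([("push", 2), ("push", 3), ("add",), ("print",)])  # an adding machine: prints 5
run([("push", 2), ("push", 3), ("mul",), ("print",)])  # a multiplying machine: prints 6
```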