The News needs an Anti-Virus (and it’s us)

Passing this one along from Dave Winer: the sooner we start thinking of the things we dislike on the Web – things that can only exist with our attention – as viruses we spread by passing them along, the sooner we remove their influence.

So you *hate* the Rebecca Black video? Well, by linking to it you gave her a record contract.

The link economy is an attention-based economy. Links, whether they signify hate or love for the thing being linked to, give it attention and influence.

Think, before you link.

(and yeah, that’s no fun, and yeah, I’m not gonna follow my own advice, so there!)

Oh, and… YouTube: “Death Metal Friday”:

Recent SpaceX story on NPR highlights how little we read

A story is posted about a private company working to build the highest-capacity launch vehicle in service, and what do a significant number of commenters fret about?

That the government is wasting its money on building it!

It is pretty clear that most of those commenting that way have not read (or worse, have not understood) the story, even though the ‘private company’ part was highlighted right in the summary.

Check it out – NPR.org (on Facebook): Plans For World’s Most Powerful Rocket Unveiled

Sad, huh?

But don’t you think there are multiple failures exemplified here? And where do you feel they stem from?

How poor information design led to Waterfall

I just took part in a great three-day training session with Uncle Bob Martin on TDD and healthy software design. One of the tidbits Bob shared was the history behind the Waterfall methodology, which some of us older folks strained against until agile and lean methodologies started to become well known. Waterfall originated in a paper by Winston W. Royce, in which he describes the process… as a straw man to tear down! Unfortunately, the paper’s poor information design (it puts the summary tearing the methodology down on later pages instead of right up front) led those who only looked at the nice diagrams on the leading pages to come away thinking they had found the solution to their software engineering process needs.

Watch Glenn Vanderburg’s “Real Software Engineering” talk on Vimeo for more on this.

Real Software Engineering – Glenn Vanderburg from Engine Yard on Vimeo.

Is the universe just bits of information?

It is a fascinating question that led to some awesome lunchtime conversation the other day at work. Freeman Dyson recently offered some more food for thought in The New York Review of Books, reviewing James Gleick’s newest, “The Information: A History, a Theory, a Flood”, which I’m just gonna have to read. Metafilter, as usual, had an interesting discussion to follow.

Other related links:

Nicholas Carr at The Daily Beast

Wired: Why the Basis of the Universe Isn’t Matter or Energy—It’s Data

Philadelphia Inquirer: Tirdad Derakhshani: Information Please

Ever see the “Mother of all Demos”?

In 1968, Douglas C. Engelbart, along with a team of 17 researchers at the Stanford Research Institute, gave a 90-minute taped demonstration that showed us what was then the future – and is now the present (and soon to be the past?): hypertext, GUI-based interaction, online collaboration including email, and more.

Stanford has a terrific page on the demo, including video clips of it broken down by time and topic, as well as a single clip of the whole thing. If you’ve never seen this before, take the time: scroll to the bottom of that page and watch it from beginning to end. It’s not called “The Mother of all Demos” for nothing.

I’ve watched this a few times over the years and I keep coming back to it and being blown away. How far have we gone? How far have we not? There has been much added to the mix these past ten years, but it was a long way from there to here.

Related:

PhillyCHI

Does the Internet “Change Nothing”?

Read this thought-provoking essay by Marshall Poe.

Then read Scott Berkun’s thoughts.

Then make up your own mind, because this story is still being written, isn’t it?

It used to cost $1 trillion, now it costs $60

Computer World: “Today’s $60 1TB drive would have cost $1 trillion in the ’50s”
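Taken at face value, the headline figures put 1950s storage at roughly a dollar per byte. Here’s a quick back-of-the-envelope sketch of the arithmetic, using only the two numbers from the headline itself:

```python
# Back-of-the-envelope check using only the headline's two figures:
# $1 trillion per terabyte in the '50s vs. $60 per terabyte today.
BYTES_PER_TB = 10**12
COST_1950S = 10**12   # dollars for 1 TB then, per the headline
COST_TODAY = 60       # dollars for 1 TB now

print(f"1950s price per byte: ${COST_1950S / BYTES_PER_TB:.2f}")    # $1.00
print(f"Today's price per byte: ${COST_TODAY / BYTES_PER_TB:.1e}")  # 6.0e-11
print(f"Price drop: {COST_1950S / COST_TODAY:.1e}x")                # ~1.7e10
```

That’s roughly a seventeen-billion-fold drop in price per byte, which is the real point of the TEDx talk below.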

YouTube: “TEDxPhilly – Robert J. Moore – The data explosion”:

Related:

Hal R. Varian, University of California, Berkeley: Economics of Information Technology

MIT Technology Review: “The 70 Online Databases that Define Our Planet”

guardian.co.uk: Data Store

TechCrunch: Devin Coldewey: “The Dangers Of Externalizing Knowledge”

ScraperWiki

O’Reilly: Mike Loukides: “What is Data Science?”

On the Big Ball of Mud

Brian Foote and Joseph Yoder, Department of Computer Science, University of Illinois at Urbana-Champaign: Big Ball of Mud:

In the end, software architecture is about how we distill experience into wisdom, and disseminate it. We think the patterns herein stand alongside other work regarding software architecture and evolution that we cited as we went along. Still, we do not consider these patterns to be anti-patterns. There are good reasons that good programmers build BIG BALLS OF MUD. It may well be that the economics of the software world are such that the market moves so fast that long term architectural ambitions are foolhardy, and that expedient, slash-and-burn, disposable programming is, in fact, a state-of-the-art strategy. The success of these approaches, in any case, is undeniable, and seals their pattern-hood.

People build BIG BALLS OF MUD because they work. In many domains, they are the only things that have been shown to work. Indeed, they work where loftier approaches have yet to demonstrate that they can compete.

It is not our purpose to condemn BIG BALLS OF MUD. Casual architecture is natural during the early stages of a system’s evolution. The reader must surely suspect, however, that our hope is that we can aspire to do better. By recognizing the forces and pressures that lead to architectural malaise, and how and when they might be confronted, we hope to set the stage for the emergence of truly durable artifacts that can put architects in dominant positions for years to come. The key is to ensure that the system, its programmers, and, indeed the entire organization, learn about the domain, and the architectural opportunities looming within it, as the system grows and matures.

Periods of moderate disorder are a part of the ebb and flow of software evolution. As a master chef tolerates a messy kitchen, developers must not be afraid to get a little mud on their shoes as they explore new territory for the first time. Architectural insight is not the product of master plans, but of hard won experience. The software architects of yesteryear had little choice other than to apply the lessons they learned in successive drafts of their systems, since RECONSTRUCTION was often the only practical means they had of supplanting a mediocre system with a better one. Objects, frameworks, components, and refactoring tools provide us with another alternative. Objects present a medium for expressing our architectural ideas at a level between coarse-grained applications and components and low level code. Refactoring tools and techniques finally give us the means to cultivate these artifacts as they evolve, and capture these insights.
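To make “refactoring” concrete for readers who haven’t lived it: it means small, behavior-preserving rewrites that tease structure out of the mud. A toy sketch of my own (not from the paper):

```python
# Before: a small "ball of mud" -- one function doing parsing,
# validation, and formatting all at once.
def report(raw):
    parts = raw.split(",")
    total = 0
    for p in parts:
        if p.strip().isdigit():
            total += int(p)
    return "total=" + str(total)

# After: the same behavior, refactored into named steps that can
# evolve independently as the system grows.
def parse_values(raw):
    """Extract the integer values from a comma-separated string."""
    return [int(p) for p in raw.split(",") if p.strip().isdigit()]

def report_refactored(raw):
    return f"total={sum(parse_values(raw))}"

# The refactoring preserved behavior, which is the whole point.
assert report("1, 2, x, 3") == report_refactored("1, 2, x, 3")
```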

Read the whole thing.

Worth a re-read, still relevant: Neal Stephenson’s “In the Beginning was the Command Line”

Pre-OS X, pre-iPod, pre-George W. Bush presidency, way back when.

First, a quote about Apple and Boomerdom:

Apple has always insisted on having a hardware monopoly, except for a brief period in the mid-1990s when they allowed clone-makers to compete with them, before subsequently putting them out of business. Macintosh hardware was, consequently, expensive. You didn’t open it up and fool around with it because doing so would void the warranty. In fact the first Mac was specifically designed to be difficult to open–you needed a kit of exotic tools, which you could buy through little ads that began to appear in the back pages of magazines a few months after the Mac came out on the market. These ads always had a certain disreputable air about them, like pitches for lock-picking tools in the backs of lurid detective magazines.

This monopolistic policy can be explained in at least three different ways.

THE CHARITABLE EXPLANATION is that the hardware monopoly policy reflected a drive on Apple’s part to provide a seamless, unified blending of hardware, operating system, and software. There is something to this. It is hard enough to make an OS that works well on one specific piece of hardware, designed and tested by engineers who work down the hallway from you, in the same company. Making an OS to work on arbitrary pieces of hardware, cranked out by rabidly entrepreneurial clonemakers on the other side of the International Date Line, is very difficult, and accounts for much of the troubles people have using Windows.

THE FINANCIAL EXPLANATION is that Apple, unlike Microsoft, is and always has been a hardware company. It simply depends on revenue from selling hardware, and cannot exist without it.

THE NOT-SO-CHARITABLE EXPLANATION has to do with Apple’s corporate culture, which is rooted in Bay Area Baby Boomdom.

Now, since I’m going to talk for a moment about culture, full disclosure is probably in order, to protect myself against allegations of conflict of interest and ethical turpitude: (1) Geographically I am a Seattleite, of a Saturnine temperament, and inclined to take a sour view of the Dionysian Bay Area, just as they tend to be annoyed and appalled by us. (2) Chronologically I am a post-Baby Boomer. I feel that way, at least, because I never experienced the fun and exciting parts of the whole Boomer scene–just spent a lot of time dutifully chuckling at Boomers’ maddeningly pointless anecdotes about just how stoned they got on various occasions, and politely fielding their assertions about how great their music was. But even from this remove it was possible to glean certain patterns, and one that recurred as regularly as an urban legend was the one about how someone would move into a commune populated by sandal-wearing, peace-sign flashing flower children, and eventually discover that, underneath this facade, the guys who ran it were actually control freaks; and that, as living in a commune, where much lip service was paid to ideals of peace, love and harmony, had deprived them of normal, socially approved outlets for their control-freakdom, it tended to come out in other, invariably more sinister, ways.

Applying this to the case of Apple Computer will be left as an exercise for the reader, and not a very difficult exercise.

On Disney, Apple/Microsoft, and mediated reality:

If I can risk a broad generalization, most of the people who go to Disney World have zero interest in absorbing new ideas from books. Which sounds snide, but listen: they have no qualms about being presented with ideas in other forms. Disney World is stuffed with environmental messages now, and the guides at Animal Kingdom can talk your ear off about biology.

If you followed those tourists home, you might find art, but it would be the sort of unsigned folk art that’s for sale in Disney World’s African- and Asian-themed stores. In general they only seem comfortable with media that have been ratified by great age, massive popular acceptance, or both.

In this world, artists are like the anonymous, illiterate stone carvers who built the great cathedrals of Europe and then faded away into unmarked graves in the churchyard. The cathedral as a whole is awesome and stirring in spite, and possibly because, of the fact that we have no idea who built it. When we walk through it we are communing not with individual stone carvers but with an entire culture.

Disney World works the same way. If you are an intellectual type, a reader or writer of books, the nicest thing you can say about this is that the execution is superb. But it’s easy to find the whole environment a little creepy, because something is missing: the translation of all its content into clear explicit written words, the attribution of the ideas to specific people. You can’t argue with it. It seems as if a hell of a lot might be being glossed over, as if Disney World might be putting one over on us, and possibly getting away with all kinds of buried assumptions and muddled thinking.

But this is precisely the same as what is lost in the transition from the command-line interface to the GUI.

Disney and Apple/Microsoft are in the same business: short-circuiting laborious, explicit verbal communication with expensively designed interfaces. Disney is a sort of user interface unto itself–and more than just graphical. Let’s call it a Sensorial Interface. It can be applied to anything in the world, real or imagined, albeit at staggering expense.

Why are we rejecting explicit word-based interfaces, and embracing graphical or sensorial ones–a trend that accounts for the success of both Microsoft and Disney?

Part of it is simply that the world is very complicated now–much more complicated than the hunter-gatherer world that our brains evolved to cope with–and we simply can’t handle all of the details. We have to delegate. We have no choice but to trust some nameless artist at Disney or programmer at Apple or Microsoft to make a few choices for us, close off some options, and give us a conveniently packaged executive summary.

But more importantly, it comes out of the fact that, during this century, intellectualism failed, and everyone knows it. In places like Russia and Germany, the common people agreed to loosen their grip on traditional folkways, mores, and religion, and let the intellectuals run with the ball, and they screwed everything up and turned the century into an abattoir. Those wordy intellectuals used to be merely tedious; now they seem kind of dangerous as well.

We Americans are the only ones who didn’t get creamed at some point during all of this. We are free and prosperous because we have inherited political and values systems fabricated by a particular set of eighteenth-century intellectuals who happened to get it right. But we have lost touch with those intellectuals, and with anything like intellectualism, even to the point of not reading books any more, though we are literate. We seem much more comfortable with propagating those values to future generations nonverbally, through a process of being steeped in media.

On Linux, Writing Software, and Emacs:

The triad of editor, compiler, and linker, taken together, form the core of a software development system. Now, it is possible to spend a lot of money on shrink-wrapped development systems with lovely graphical user interfaces and various ergonomic enhancements. In some cases it might even be a good and reasonable way to spend money. But on this side of the road, as it were, the very best software is usually the free stuff. Editor, compiler and linker are to hackers what ponies, stirrups, and archery sets were to the Mongols. Hackers live in the saddle, and hack on their own tools even while they are using them to create new applications. It is quite inconceivable that superior hacking tools could have been created from a blank sheet of paper by product engineers. Even if they are the brightest engineers in the world they are simply outnumbered.

In the GNU/Linux world there are two major text editing programs: the minimalist vi (known in some implementations as elvis) and the maximalist emacs. I use emacs, which might be thought of as a thermonuclear word processor. It was created by Richard Stallman; enough said. It is written in Lisp, which is the only computer language that is beautiful. It is colossal, and yet it only edits straight ASCII text files, which is to say, no fonts, no boldface, no underlining. In other words, the engineer-hours that, in the case of Microsoft Word, were devoted to features like mail merge, and the ability to embed feature-length motion pictures in corporate memoranda, were, in the case of emacs, focused with maniacal intensity on the deceptively simple-seeming problem of editing text. If you are a professional writer–i.e., if someone else is getting paid to worry about how your words are formatted and printed–emacs outshines all other editing software in approximately the same way that the noonday sun does the stars. It is not just bigger and brighter; it simply makes everything else vanish. For page layout and printing you can use TeX: a vast corpus of typesetting lore written in C and also available on the Net for free.
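The editor/compiler/linker triad Stephenson describes is easy to see in miniature. Here’s a sketch of my own (not his) that drives one edit-compile-link-run cycle from Python; it assumes a POSIX system with the cc compiler driver on the PATH, and the file names are just for illustration:

```python
# One pass through the triad: "edit" a source file, compile it to an
# object file, link the object file into an executable, then run it.
import subprocess
from pathlib import Path

# The editor's job: produce the source text.
Path("hello.c").write_text(
    '#include <stdio.h>\n'
    'int main(void) { puts("hello, world"); return 0; }\n'
)

# The compiler's job: translate source into an object file.
subprocess.run(["cc", "-c", "hello.c", "-o", "hello.o"], check=True)

# The linker's job: combine object files into an executable
# (here cc acts as a front end to the system linker).
subprocess.run(["cc", "hello.o", "-o", "hello"], check=True)

# Run the result of the cycle.
subprocess.run(["./hello"], check=True)
```

Hackers, as Stephenson says, live in that loop, and hack on the tools themselves while riding them.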

Read the whole thing.