
December 05, 2006

Lego Programming

Joel reviewed the book Beyond Java, and in his review he enthusiastically recommended an essay by Fred Brooks called "No Silver Bullet: Essence and Accidents of Software Engineering." He recently mentioned it again in his post Lego Programming. Brooks wrote The Mythical Man-Month, which was really the first software engineering text. It was remarkable in identifying surprising asymmetries in software development. For example, Brooks identified the communication costs of adding more developers to a project ("adding manpower to a late software project makes it later") and the dramatic disparities in individual developer productivity. But he also made a number of forgotten poor predictions in the same book, some of which he confesses to in "The Mythical Man-Month After 20 Years," such as "I am convinced that interactive systems will never displace batch systems for many applications."

I have written about silver bullets in the past, and I emphatically believe that Brooks, however widely regarded, was irresponsible and premature in asserting that there are no silver bullets. That assessment has led many developers, not the least of whom is Joel, to be unimaginative and pessimistic about advances in software development.

For example, Brooks asserts at the start of his essay this bit of false wisdom:

There is no single development, in either technology or in management technique, that by itself promises even one order-of-magnitude improvement in productivity, in reliability, in simplicity.

That assertion turns out to be pure nonsense, amply disproven by numerous advances in IDEs, languages, frameworks, and componentization over the past few decades. Our expectations of software, and our abilities, have risen. A year of work takes a month, or a month of work takes a day. An order-of-magnitude improvement usually results in major qualitative changes, often turning a lengthy existing project into a short task item, or suddenly making a new project feasible, such as when end users start writing applications (using scripting and RAD tools) that were once exclusively the domain of IT.

The net effect is that we often don't consciously recognize tenfold improvements in productivity. We forget how hard it was to program decades ago. Consider developing a game for the simple Atari 2600 gaming system back in the early 1980s.

I see the driving force toward tenfold productivity as the move to more declarative and compositional approaches, whether through functional, object-oriented, or component-based programming. Kinda like Lego.
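To make the contrast concrete, here is a small illustrative sketch (not from the original post; the function names are made up) of the same computation written procedurally, step by step, and then compositionally, by snapping together prebuilt pieces:

```python
# Procedural style: every step spelled out by hand.
def total_even_squares_loop(numbers):
    total = 0
    for n in numbers:
        if n % 2 == 0:
            total += n * n
    return total

# Declarative/compositional style: the same result from
# composed building blocks (filter, a generator, sum).
def total_even_squares(numbers):
    evens = filter(lambda n: n % 2 == 0, numbers)
    return sum(n * n for n in evens)

print(total_even_squares_loop(range(10)))  # 120
print(total_even_squares(range(10)))       # 120
```

The second version says *what* is wanted rather than *how* to iterate, which is the Lego-like quality: small, trusted components composed instead of re-implemented.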



Listed below are links to weblogs that reference Lego Programming:

» Parkinson's Law Eats Silver Bullets from you've been HAACKED
Parkinson's Law Eats Silver Bullets [Read More]

» Is programming really like playing with Legos? from Crazy But Able
My main man Joel wrote a post yesterday about Lego Programming, the myth that you can make programming as easy as snapping together Legos: Frequently, the mainstream media, reporting on computer programming tools, gets the story horribly wrong. What... [Read More]


I too have always had trouble with the no Silver Bullet theory. Sure, there may not be One Big Thing that shoots productivity upward, but over the years there have been a lot of small changes that together amount to major gains.

My analogy is health care. People complain about the rising costs, but the technology isn't static. I'm sure if you wanted a 1970s level of care you could get it at a reasonable cost.

While we have had order-of-magnitude improvements, I think Parkinson's Law has eaten them up. I wrote about it here:


You seem to be ignoring both the "single development" and "order of magnitude" parts of his statement, which he has pointed out when challenged by some development or another. Also, the things you mentioned are things he said to watch as contenders in the future, but so far none of them alone has reached that order-of-magnitude (times 10) improvement. Combining a number of things, as in the case of Ruby on Rails, has come close (5 to 10 times, by many claims), but only in its area of strength - its productivity gain falls to about double when it's used for less suitable tasks. Other "silver bullets" and combinations have been similar.

Regardless of whether Brooks said "a single development," his statement is not a law.

The whole comment is subject to interpretation; people will selectively seek confirmation of that belief.

There's no quantitative, scientific basis for his remarks. It's become a religious belief. It's bogus, but it's being repeated as true.

Interesting post.

I agree that *programmers* have become productive.

But the full version of the Lego Idea, as also mentioned in your essay, is that programming will become so simple that users will do it.

I just don't see that happening in the general case (there are of course exceptions in some specific areas and for some specific users). My belief is that users are usually terrible at creating programs, and struggle to create even the simplest ones, because they generally lack the skills to decompose problems into logical units and to assemble solutions out of them - even when those logical units are premade components or available from a menu of options (as in The Last One and ObjectVision).

If the users-could-program-if-only-they-had-the-tools theory were true, then most programs today would be created by users with something like Borland ObjectVision. In fact, what really happened is that, to the extent these kinds of tools have been popular at all, they have been popular with programmers, not users.

The real reason that programming is hard for users is because this breaking down of problems is a hard skill, not because the tools aren't good enough.

(I posted about this part on my blog, of course).

I would imagine Brooks, over his career, must have seen many improvements in programming tools, and in programming productivity as a result. My view is that what he was probably striving to express is that there won't be a silver bullet for this difficulty of decomposing problems into logical units.

I am referring to the past rise of application programmability like VBA and Notes, when I mentioned end-users building applications.

I also don't want to forget those applications on the web that enable various forms of group collaboration and scheduling, which don't require users to "program."

It is good to see someone articulating a counterpoint to "No Silver Bullet". I have been in the industry long enough to see advances, and like most didn't believe we could, or were, gaining orders of magnitude.

The thing of it is... we are.

Noted computer scientist Ray Kurzweil sums it up best when he talks about the "intuitive linear" view versus the "historic exponential" view. People fall into the trap of extrapolating the current rate of change as the long-term rate of change, when in reality the rate of change itself is increasing. This is why your statement that we don't consciously recognize order-of-magnitude increases is dead on.

This is happening to software and software development as well, albeit with a slower doubling rate than that of computing in general. But it is happening.

Beyond "numerous advances in IDEs, languages, frameworks, componentization," we see things like Google APIs, Amazon EC2, S3, salesforce.com, and much of what falls under Web 2.0 yielding further large gains.

> I am referring to the past rise of application programmability like VBA and Notes, when I mentioned end-users building applications.

But only a small percentage of users are willing or able to do this kind of thing.

My experience is that if you ask a typical end user to do the kind of task that programmers routinely do (i.e., break a big task into small procedural steps), even with all the jargon removed, they are mostly very bad at it. Ask a dozen people to draw a flowchart for a fairly complex business procedure involving multiple people, and about 10 or 11 of them will struggle. (Of course there is a small percentage of users with a natural aptitude - probably the same ones who do a bit of VBA or whatever.)

It's as if most people somehow are missing the mental tools to do this kind of task. I don't know whether it's lack of aptitude or lack of practice in thinking this way - but I do think it's a closely related phenomenon to that of some programmers being vastly more productive than others.

It is all about application and knowledge domains.

Just think of what SQL has done for productivity since "No Silver Bullet" appeared.

Then we have Excel and various report generators which have slashed development time and costs significantly.
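As an illustrative sketch of that SQL point (not part of the original comment; the table and values here are invented, using Python's built-in sqlite3 module): one declarative statement replaces what would once have been a hand-written read, filter, and aggregate loop over a record file.

```python
import sqlite3

# In-memory database with a tiny invented "orders" table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("acme", 100.0), ("acme", 50.0), ("globex", 75.0)])

# Total per customer, sorted: one statement of intent,
# no explicit iteration, accumulation, or sorting code.
rows = conn.execute(
    "SELECT customer, SUM(amount) FROM orders "
    "GROUP BY customer ORDER BY customer").fetchall()
print(rows)  # [('acme', 150.0), ('globex', 75.0)]
```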

Anyway, I think it is time to look hard at the economics of programming. Most programmers have no idea how long it takes after the first compile until the software "works".

Why is there such resistance to using the most effective programming languages? Why does a programming language have to be cool before programmers want to use it? Why isn't development time a valid argument?


Great post, Wes.

I agree with Joel and others that there is no one, single thing that will solve all programming problems.

Instead, it's apparent that the advances will come in increments from many different areas: languages like C# are improving, frameworks and libraries are improving with better abstractions, and so are the tools that help us write better software: better IDEs, object-relational mappers, bug-finding tools, unit-testing tools and frameworks, code analyzers, and performance and memory profilers. So much is improving, and new ideas are being tried out. It's a good time to be a software developer.

'No silver bullet' will probably hold true until computers can think for themselves. As soon as we have artificial intelligence capable of creative design, computers will be able to translate fuzzy requirements into coherent designs on their own (the hard bit!). In that case, though, the solution (creating a machine capable of replacing human analysts and programmers) is harder than the problem. AI is the real silver bullet!

Sunil writes: "But the full version of the Lego Idea, as also mentioned in your essay, is that programming will become so simple that users will do it."

Well, no, that's *a* full version. Yes, there are people who want to bring programming to the masses. I'm pretty sure that's *not* what was being addressed by the people in the Businessweek article Joel quoted.

What that was talking about was just making *programming* easier for *programmers* so we can get more done and make the users happy.

In *this* version the legos will handle the programming scut work, allowing the programmers to focus on the interesting, value-adding coding, implementing the features the user needs.

For instance, one of the companies quoted in the article, Object Technologies, sold some 'Legos': a set of SmartFields, drag-and-drop text fields for NeXTStep's IB GUI builder. As the name implies, these added smarts such as formatting, validation, and minimum and maximum values, which were not handled by the standard widgets at that time.

These SmartFields freed the developer from the scut work of programmatically handling formatting, validation, and other logic for standard text fields. Instead, the programmer could set the fields up in IB and be done with them, apart from any programmatic adjustments that might be required at runtime (probably none would be).

Ok, I have a question that no one has been able to answer yet. How can the standardization of JavaScript be viewed as a bad thing? I think it would simplify things a lot for the people trying to use it around the world. I am actually surprised this has not been done before. Anyone working on it would run into a wall with just one character being different. And now there is debate about whether it should be standardized? How can there be any doubt about this? I just do not see the point of not wanting it to be uniform. What are the advantages of leaving it as it is right now? People make it seem that there are some that I have not been able to see yet. I was wondering if you could help. Thanks for your time.


