Classic Book Review: The Sciences of the Artificial

Herbert A. Simon
MIT Press, 3rd edition 1996 (first edition 1969)

ISBN 0-262-69191-4 (paperback)

The Essence of Systems Thinking

This oddly-titled book might, if it had first been published today, have been called something like Foundations of the Science of Complexity, or On the Edge of Chaos. But it came out in 1969, when no-one had heard of such a thing. This new edition takes advantage of the convergence of academic thought with what, back in the '60s, was Simon's outlandish worldview. All sections have been brought up to date, and the glaring omission of a chapter on the structure of complexity - hierarchies of many kinds - has finally been remedied.

Given the nature of its subject, the book is simple enough. There are few equations and no pages full of algebra or formal logic. The topics covered, stated baldly, may seem almost ordinary nowadays: the natural and artificial worlds (i.e. simulation, modelling, virtual reality); complexity in economic models; thought (and computation); memory; design; social planning; complexity itself; and the hierarchic nature of complexity. What these disparate things have in common is that they are very large, and too complicated to understand as a whole. They are also (fashionable word!) chaotic, so they don't lend themselves to naively deterministic approaches.

The artificial-intelligence examples and discussions seem, if anything, the most dated parts of the book. Programs that can understand and resolve limited natural-language formulations of naïve physics problems? It sounds like the early '80s - except that Simon was writing long before then, his claims were soundly based on working examples, and crucially he admits up front that

"of course the .. program actually implemented is only a prototype for the engine that would be needed to accomplish this in all generality."

The examples do not claim to achieve robotic intelligence, unlike the hype of the '80s, but lead naturally into the discussion of how size increases complexity.

That new chapter, "The Architecture of Complexity", is the heart of the book, which must have read oddly without it. A delightful tale of two watchmakers presents the central argument. One of them, Tempus, has to start again from the beginning when interrupted. The other, Hora, makes subassemblies which he can make into assemblies which he can (at any later time) make into a watch. Given any reasonable number of interruptions, how long do they each take to finish a watch? The results are startling (there is an exponential involved). All of traditional Systems Engineering follows simply and logically from this basis; if you have something very complex to design, like an aircraft, there's simply no other way -- and even with careful subdivision into assemblies, you have to work hard to keep them internally coherent but loosely-coupled to each other, or complexity will overwhelm you.
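
To see where the exponential comes from, here is a minimal back-of-the-envelope sketch (in Python); the specific figures -- a 1000-part watch, subassemblies of ten parts, a one-in-a-hundred chance of interruption per part added -- are illustrative assumptions of mine, not numbers quoted from the book.

    # Rough arithmetic behind the watchmaker parable.
    # Illustrative numbers only: a 1000-part watch, a 1% chance of
    # interruption per part added, and (for Hora) stable subassemblies
    # of ten parts each.
    p = 0.01          # probability of an interruption while adding one part
    parts = 1000      # parts in a finished watch
    sub = 10          # parts per stable subassembly in Hora's method

    # Tempus must place all 1000 parts in one uninterrupted run, or start over.
    tempus_success = (1 - p) ** parts          # chance a given attempt succeeds
    tempus_attempts = 1 / tempus_success       # expected attempts (geometric)

    # Hora only ever risks the ten or so parts of the current subassembly.
    hora_success = (1 - p) ** sub
    hora_attempts_per_unit = 1 / hora_success
    hora_units = 100 + 10 + 1                  # subassemblies, assemblies, final watch

    print(f"Tempus: P(finish per attempt) = {tempus_success:.1e}, "
          f"expected attempts = {tempus_attempts:,.0f}")
    print(f"Hora:   P(finish per unit)    = {hora_success:.2f}, "
          f"expected total units built = {hora_units * hora_attempts_per_unit:,.1f}")

On those assumptions Tempus needs tens of thousands of attempts to get one uninterrupted run, while Hora completes roughly 123 small, stable units in total -- which is precisely why decomposition into loosely-coupled subassemblies pays off.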


The main problem with Simon's approach is that not everything can be reduced to simplicity by analysis and clear thinking: some problems are irreducibly social, political, and emotional -- in other words, human. Peter Checkland's Soft Systems Methodology addresses this messier sort of complexity in systems, where whatever you do constitutes an intervention that some people may welcome and others may not -- or, to put it briefly, where whatever is tried can easily make matters worse. In contrast to Checkland's, Simon's approach is "hard", by which people mean an intellectualising, reductionist, analytic attempt to impose scientific order on a chaotic world. Checkland is quite critical of Simon's approach to systems thinking, and indeed to some extent defines himself by his distance from Simon's way of thinking.

The truth, I suspect, is that both Simon and Checkland are right; we need to address tough system problems analytically and with all the mental organisation we can muster -- and we also need to address all the other issues that can swiftly derail a project. After all, it doesn't matter how well thought out something is if someone more powerful than you is determined it shall never happen; but if it isn't properly thought out in the first place, it's doomed anyway.

Any engineer who has not thought out why we need to address complexity, and how we can do so successfully, should try reading this concise masterpiece.

© Ian Alexander 1996, 2002, 2004

