I just finished reading Understanding Computers and Cognition by Terry Winograd and Fernando Flores and have to jot down some quick notes & quotes before I jump in and start reading it again ... yeah, it's that good.
Having gone on Rorty and Wittgenstein kicks recently, I was really happy to find this book while browsing the Neats and Scruffies Wikipedia article a few months ago. It seems to combine this somewhat odd interest I have in pragmatism and writing software. While it was first published in 1986, it's still very relevant today, especially in light of what the more Semantic-Web-heavy Linked Data crowd are trying to do with ontologies and rules. Plus it's written in clear and accessible language, which is perfect for the armchair compsci/philosopher type ... so it's ideal for a dilettante like me.
While the book is only 207 pages long, its breadth is kind of astounding. The philosophy of Heidegger figures prominently...in particular his ideas about thrownness, breakdowns and readiness-to-hand, which emphasize the importance of concernful activity over rationalist representations of knowledge.
Heidegger insists that it is meaningless to talk about the existence of objects and their properties in the absence of concernful activity, with its potential for breaking down.
The work of the biologist Humberto Maturana forms the second part of the theoretical foundation of the book. The authors draw upon Maturana's ideas about structural coupling to emphasize the point that:
The most successful designs are not those that try to fully model the domain in which they operate, but those that are 'in alignment' with the fundamental structure of that domain, and that allow for modification and evolution to generate new structural coupling.
And the third leg of the stool is John Searle's notion of speech acts, which emphasizes the role of commitment and action, or the social role of language, in dealing with meaning.
Words correspond to our intuition about "reality" because our purposes in using them are closely aligned with our physical existence in a world and our actions within it. But the coincidence is the result of our use of language within a tradition ... our structural coupling within a consensual domain. Language and cognition are fundamentally social ... our ability to think and to give meaning to language is rooted in our participation in a society and a tradition.
So the really wonderful thing that this book does here is take this theoretical framework (Heidegger, Maturana & Searle) and apply it to the design of computer software. As the preface makes clear, the authors largely wrote this book to dismantle popular (at the time) notions that computers would "think" like humans. While much of this seems anachronistic today, we still see similar thinking in some of the ways that the Semantic Web is described, where machines will understand the semantics of data, using ontologies that model the "real world".
There is still a fair bit of talk about getting the ontologies just right so that they model the world properly, and then running rule-driven inference engines over the instance data to "learn" more things. But what is often missing is a firm idea of what actual tools will use this new data. How will these tools be used by people acting in a particular domain? Like The Modeler, practitioners in the Linked Data and Semantic Web community often jump to modeling a domain, and trying to get it to match "reality", before understanding the field of activity we want to support...what we are trying to have the computer help us do ... what new conversations we want the computer to enable with other people.
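Just to make concrete what "running rule-driven inference engines over the instance data" means, here's a toy sketch in plain Python (the facts, rule names and class hierarchy are all made up for illustration, not drawn from the book or any real ontology): facts are (subject, predicate, object) triples, and a forward-chaining loop applies two RDFS-style rules until no new triples appear.

```python
# Toy instance data as (subject, predicate, object) triples.
facts = {
    ("lion", "subClassOf", "mammal"),
    ("mammal", "subClassOf", "animal"),
    ("elsa", "type", "lion"),
}

def infer(facts):
    """Forward-chain two RDFS-style rules to a fixed point."""
    facts = set(facts)
    while True:
        new = set()
        for (a, p1, b) in facts:
            for (c, p2, d) in facts:
                if b == c:
                    # Rule 1: subClassOf is transitive.
                    if p1 == p2 == "subClassOf":
                        new.add((a, "subClassOf", d))
                    # Rule 2: instances belong to all superclasses.
                    if p1 == "type" and p2 == "subClassOf":
                        new.add((a, "type", d))
        if new <= facts:       # nothing genuinely new: we're done
            return facts
        facts |= new

derived = infer(facts)
```

After running, the engine has "learned" that elsa is an animal, even though no one ever said so directly. The point of the book's critique is not that this mechanism is broken, it's that the interesting design questions start after it: who uses these derived facts, and in what conversation?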
In creating tools we are designing new conversations and connections. When a change is made, the most significant innovation is the modification of the conversation structure, not the mechanical means by which the conversation is carried out. In making such changes we alter the overall pattern of conversation, introducing new possibilities or better anticipating breakdowns in the previously existing ones ... When we are aware of the real impact of design we can more consciously design conversation structures that work.
It's important to note here that these are conversations between people, who are acting in some domain, and using computers as tools. It's the social activity that grounds the computer software, and not some correspondence that the software shares with reality or truth. I guess this is a subtle point, and I'm not doing a terrific job of elucidating it here, but if your interest is piqued definitely pick up a copy of the book. Over the past 5 years I've been lucky to work with several people who intuitively understand how important the social setting and alignment are to successful software development--but it's nice to have the theoretical tools as ballast when the weather gets rough.
Another really surprising part of the book (given that it was written in 1986) is the foreshadowing of the agile school of programming:
... the development of any computer-based system will have to proceed in a cycle from design to experience and back again. It is impossible to anticipate all of the relevant breakdowns and their domains. They emerge gradually in practice. System development methodologies need to take this as a fundamental condition of generating the relevant domains, and to facilitate it through techniques such as building prototypes early in the design process and applying them in situations as close as possible to those in which they will eventually be used.
Compare that with the notion of iterative development that's now prevalent in software development circles. I guess it shouldn't be that surprising, since the roots of iterative development extend back quite a ways. But still, it was pretty eerie seeing how on target Winograd and Flores still are, particularly in the field of computing, which has changed so rapidly in the last 25 years.