On the one hand, all the models available for download on Hugging Face seem pretty much like programming language compilers and interpreters: things we download and use to write software. You don’t try to open /usr/bin/python3 in your text editor and read it. You trust that it works. Simon Willison says these models are like an “opaque blob that can do weird and interesting things”, and the same analogy seems to hold for the binary executables we run too.

But the big difference is that, once you get the dependencies assembled correctly, you can build the Python binary yourself. The build depends on other opaque blobs being in place, like gcc, which in turn can be bootstrapped from a lower-level compiler. There are layers of abstraction at work that can be tested and reasoned about, which gives us some confidence that things are working as intended. It can get complicated, but when something breaks we can debug it.
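As one small illustration of those inspectable layers, a sketch: CPython will show you the bytecode it compiles your source into, one testable level below what you wrote.

```python
# The Python "blob" exposes its intermediate layers: the dis module
# disassembles the bytecode the compiler produced for a function.
import dis

def add(a, b):
    return a + b

# Prints the bytecode instructions (e.g. LOAD_FAST, BINARY_OP, RETURN_VALUE);
# the exact opcodes differ between Python versions.
dis.dis(add)
```

Nothing like this exists for a model’s weights: there is no lower layer whose behavior we can read off and check.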

This is not true of the opaque blob that is a Large Language Model (LLM). We don’t have access to the source (the training data and training code) that was used to create it, and “compiling” one, that is, training it from scratch, can require a huge investment of time and resources. There’s no way to debug its logic. If you’re lucky there may be a paper about how it was built, but some aren’t published because the details are deemed too dangerous to release.

So while it feels the same, it really isn’t. I just don’t understand why people want to integrate LLMs into applications for generating database queries or API calls. It seems to me that we want to be able to reason about these things, and that we lose that ability when an LLM is in the loop. Why does anyone think this is a good idea?
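To make the worry concrete, here is a minimal sketch of the pattern in question, assuming the openai Python client (v1 style); the model name, prompt, and table schema are illustrative, not anything specific. Contrast the LLM path, where the query is a function of opaque weights, with a conventional parameterized query we can reason about line by line:

```python
# A sketch of LLM-generated SQL, assuming the openai v1 client.
# Model name, prompt, and schema are hypothetical.
from openai import OpenAI

client = OpenAI()

def nl_to_sql(question: str) -> str:
    # The returned SQL is produced by opaque model weights: there is no
    # lower layer to step into, disassemble, or unit-test deterministically.
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "Write one SQL query over orders(id, customer, total) "
                        "answering the user's question. Reply with SQL only."},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

def orders_over(minimum: float) -> tuple[str, tuple]:
    # The conventional alternative: every behavior here can be read,
    # tested, and debugged.
    return "SELECT id, customer FROM orders WHERE total > ?", (minimum,)
```

The second function can be exercised with a unit test that will pass forever; the first can only be sampled and statistically evaluated.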

And if this style of programming really were to catch on with a new generation of programmers, would we lose our ability to understand SQL or REST? Are they really abstractions we can afford to forget, the way most of us have forgotten assembler? Won’t our ability to reason about our applications atrophy? The state of software is already kind of bad, and it seems like some people are dreaming up ways of making it even worse.