In my previous post on Fabricated Complexity, I wrote about a quote that I found myself repeatedly agreeing with:
The solution to a hard problem, when solved, is simple.
A commenter remarked:
I disagree, though - certainly many things are overcomplicated due to reasons you state, but this doesn't mean every hard problem has a seemingly simple solution.
Two things I want to point out: (1) Notice that I didn’t say every hard problem has a simple solution; not every hard problem is even solvable. (2) This, by the way, is not a firmly held belief of mine.
I think the basis of my “belief” comes from experience with functional-style programming, where the solution to a problem often turns out to be its specification. I’ve found that the newer functional features in C#, iterators (essentially, an efficient way to write functions that lazily return sequences) and anonymous functions, let me uncover more simple solutions.
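To make the iterator idea concrete, here is a small sketch in Python, whose generators closely mirror C# iterator blocks: a lazily produced sequence that is only computed as far as the caller consumes it.

```python
from itertools import islice

def naturals():
    """Yield 0, 1, 2, ... lazily, much like a C# iterator block with yield return."""
    n = 0
    while True:
        yield n
        n += 1

def take(n, seq):
    """Consume only the first n elements of a (possibly infinite) sequence."""
    return list(islice(seq, n))

# Anonymous functions compose naturally with iterators:
evens = (x for x in naturals() if x % 2 == 0)
first_five_evens = take(5, evens)  # [0, 2, 4, 6, 8]
```

The infinite sequence costs nothing until it is consumed, which is exactly the property that makes iterator-based solutions read like specifications.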
What’s a moderately hard problem to use as an example? Building a generalized regular expression matcher, perhaps. Microsoft’s implementation of regular expression matching over strings spans 24 files and 14,455 lines of code, including comments and whitespace.
Let’s see what happens if we switch to a functional style. We could take the lazy-list implementation, LazyChain, from the blog of Wes Dyer (a developer on the C# team), and port the 14-line Python implementation of a regular expression engine to C# using iterators and anonymous functions. The regular expression would be composed functionally rather than as a string:
re = Sequence( term1, Plus( term2, Alternate( term3, term4 )), term5 )
and the results would be returned as a collection by applying the regular expression to the list to be searched:
results = re( list_to_search )
I contend that, with this approach, we can get a reasonable implementation of regular expressions over lists of arbitrary types in fewer than 200 lines of code, roughly two orders of magnitude more concise than Microsoft’s string-specific implementation. I’ll produce a port of this one day and post it on my blog.
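To suggest what such a combinator-based matcher might look like, here is a minimal sketch in Python (standing in for the eventual C# port). Each combinator returns a matcher that, given a list and a starting position, yields every position at which it can finish matching. The names Literal, Sequence, Alternate, and Plus are illustrative, not an existing API, and Plus here takes a single term.

```python
def Literal(x):
    """Match exactly one element equal to x."""
    def match(lst, i):
        if i < len(lst) and lst[i] == x:
            yield i + 1
    return match

def Sequence(*terms):
    """Match each term in order, threading end positions along."""
    def match(lst, i):
        if not terms:
            yield i
        else:
            head, rest = terms[0], Sequence(*terms[1:])
            for j in head(lst, i):
                yield from rest(lst, j)
    return match

def Alternate(a, b):
    """Match either alternative."""
    def match(lst, i):
        yield from a(lst, i)
        yield from b(lst, i)
    return match

def Plus(term):
    """Match term one or more times."""
    def match(lst, i):
        for j in term(lst, i):
            yield j
            if j > i:  # guard against looping on zero-width matches
                yield from match(lst, j)
    return match

# Usage: one 'a', one or more 'b', then 'c' or 'd' -- over a list,
# not a string, so the elements could be values of any type.
re = Sequence(Literal('a'), Plus(Literal('b')),
              Alternate(Literal('c'), Literal('d')))
matches = list(re(['a', 'b', 'b', 'd'], 0))  # [4]
```

Because every matcher is a generator, backtracking falls out for free from lazy enumeration of alternatives, which is a large part of why the implementation stays so small.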
On another note, when a human outperforms a computer on a given programming task, I take it as a cue that the computer’s algorithm may be poorly written. In such cases, I look for an alternative functional implementation, which tends to model more accurately how we actually think.
Let’s look at resolution-based theorem proving, for example. For a large problem, I wonder whether it’s faster for the prover to extract from the clauses in the knowledge base a functional expression equivalent in truth value to the user’s query, rather than actually performing the proof. Quantified expressions become lambda expressions, and boolean operations become conditionals. This extraction might be possible in polynomial time, after which we could reduce the returned expression to normal form, which could be far more efficient than a lengthy nondeterministic search.
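To illustrate the flavor of this idea, here is a toy sketch in Python over a tiny finite domain. Quantifiers become lambdas over the domain, connectives become ordinary boolean expressions, and a rule in the knowledge base becomes an executable definition. All names here are hypothetical; this is an illustration of the translation, not a proposed prover.

```python
def forall(domain, pred):
    """Universal quantifier over a finite domain becomes a loop over lambdas."""
    return all(pred(x) for x in domain)

def exists(domain, pred):
    """Existential quantifier, likewise."""
    return any(pred(x) for x in domain)

def implies(p, q):
    """Material implication as a conditional."""
    return (not p) or q

# Toy knowledge base: socrates and plato are human, and
# forall x. human(x) -> mortal(x).
domain = ['socrates', 'plato', 'zeus']
human = lambda x: x in ('socrates', 'plato')
mortal = lambda x: human(x)  # the functional expression "extracted" from the rule

# Query: is socrates mortal? Evaluate instead of searching for a proof.
query = exists(domain, lambda x: x == 'socrates' and mortal(x))  # True
```

Of course, evaluation only substitutes for proof search when the domain is finite and enumerable; the interesting question is how far the extraction step generalizes beyond cases like this.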