So I struggled with some Python this weekend. I'm taking over a piece of code that is written in Python, but the original is very imperative, with a lot of for loops. It was giving me a headache. It just looked way too long and tedious. I'm too old to spend my life debugging off-by-one errors in for loops.
After staying up half the night struggling with a malfunctioning IDLE on Mac OS X, I finally discovered that the cure for the headache was -- more Python!
```python
# This gives us a "curry" equivalent
from functools import partial
# In Python 3, reduce lives in functools as well
from functools import reduce

# Reduce a list of strings to a single string by concatenation
reduce_stringlist = partial( reduce, str.__add__ )

# Given a list of XML elements, return a list of only the data from only the text nodes
# (ignores any non-text nodes)
def get_text_data( nodelist ):
    return map( lambda node: node.data,
                filter( lambda node: node.nodeType == node.TEXT_NODE, nodelist ) )

# Now, combined: reduce a list of nodes to the concatenated text extracted out of only
# the text node data
def extract_text( nodelist ):
    return reduce_stringlist( get_text_data( nodelist ) )

# Given a portion of the parsed XML tree and a name, extract a single string. Expect
# only one element.
def get_one_text_element( node, name ):
    elt_list = node.getElementsByTagName( name )
    assert elt_list.length == 1
    return extract_text( elt_list[0].childNodes )

# Given a starting node and a list of tag names, retrieve one text string each and
# put them in a newly created dictionary using the tag name as a key
def make_text_element_dict( node, namelist ):
    return dict( zip( namelist, map( partial( get_one_text_element, node ), namelist ) ) )
```
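To see it all work end to end, here is a self-contained run on a toy document (the `<book>` markup and the variable names below are just my illustration, repeating the definitions above so the snippet stands alone):

```python
from xml.dom.minidom import parseString
from functools import reduce, partial

# Concatenate a sequence of strings into one
reduce_stringlist = partial( reduce, str.__add__ )

# Keep only text nodes, then pull out their character data
def get_text_data( nodelist ):
    return map( lambda node: node.data,
                filter( lambda node: node.nodeType == node.TEXT_NODE, nodelist ) )

def extract_text( nodelist ):
    return reduce_stringlist( get_text_data( nodelist ) )

# Expect exactly one element with the given tag name; return its text
def get_one_text_element( node, name ):
    elt_list = node.getElementsByTagName( name )
    assert elt_list.length == 1
    return extract_text( elt_list[0].childNodes )

# Build { tag name : text } for each name in the list
def make_text_element_dict( node, namelist ):
    return dict( zip( namelist, map( partial( get_one_text_element, node ), namelist ) ) )

doc = parseString( "<book><title>Dune</title><author>Frank Herbert</author></book>" )
info = make_text_element_dict( doc.documentElement, [ "title", "author" ] )
print( info )
```

Four one-line function definitions, and the whole XML-to-dictionary pipeline falls out of composing them.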
Ahhh, I feel much better! All that excess hairy boilerplate seems to be just melting away!
The only thing I did not like is that there did not seem to be a nice way to specify keyword arguments to "reduce" so that I could create a partial (curried) application that supplied the first and optional third parameters, leaving the second as the one to be supplied at runtime.
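Since reduce takes its arguments positionally and partial() can only pin them from the left, the closest workaround I know is a plain lambda (the name below is my own, and supplying the "" initializer has the side benefit of handling an empty list):

```python
from functools import reduce

# partial() cannot reach past the second argument to pin the third, so a
# lambda pins the first (function) and third (initializer), leaving the
# second (the sequence) open:
reduce_stringlist = lambda seq: reduce( str.__add__, seq, "" )

print( reduce_stringlist( [ "foo", "bar" ] ) )  # foobar
print( reduce_stringlist( [] ) )                # "" rather than a TypeError
```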
I experimented with writing the above with Python's list comprehension idiom, generators, and various permutations on join(). The above just seems to make more sense to me. Haskell has apparently ruined me for other languages.
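For the record, the join() spelling of extract_text that I experimented with reads something like this (a sketch; extract_text_join is a made-up name):

```python
from xml.dom.minidom import parseString

# A generator expression filters for text nodes and yields their data;
# "".join concatenates the results
def extract_text_join( nodelist ):
    return "".join( node.data for node in nodelist
                    if node.nodeType == node.TEXT_NODE )

doc = parseString( "<p>Hello<b>!</b> world</p>" )
print( extract_text_join( doc.documentElement.childNodes ) )
```

It is compact, but it buries the filter/map/reduce structure inside one idiom instead of naming each step.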
Apparently Guido would like to ban reduce() from Python. I say -- prefer the standard idiom to the offbeat, and avoid the Not Invented Here syndrome. Python's comprehensions and generators are nice, but apparently I've been ruined by seeking more and more expressive languages, seeking truth and beauty on my wandering but inexorable path from Dylan to NewtonScript to Scheme to Haskell. Lambda, map, curry, reduce, and zip now seem to me to be fundamental primitives that any reasonable dynamic language ought to provide, and it seems to me that they ought to be preferred to an obscure language-specific trick. As long as I have to use Python, Guido will have to pry the curried functions from my cold, dead hands!