
“Your hands are like dogs, going to the same places they’ve been. You have to be careful when playing is no longer in the mind but in the fingers, going to happy places. You have to break them of their habits or you don’t explore; you only play what is confident and pleasing. I’m learning to break those habits by playing instruments I know absolutely nothing about, like a bassoon or a waterphone.”—Tom Waits

We were dining with friends when Tom Waits walked in.

It had been a day for surprises at Summer Jo’s, the Oregon restaurant that my partner Nancy and I own and she manages. Old friends from out of state had been popping in all day long—a cousin from Phoenix, a friend from Wisconsin, former colleagues from the Bay Area. But I didn’t do a double-take until the little guy in the porkpie hat sauntered past our table.

I don’t usually get all giddy over celebrities, but Tom Waits is one of my heroes. Whether or not you dig his whiskey voice and dive-patron persona, you have to admire an established artist who still pushes himself to pick up instruments he knows nothing about in order to break his fingers of their old habits. Admire, and maybe emulate.

While googling Waits lyrics, I came across the quote you’ll see at the top of this column. It started me thinking about that experience of picking up a truly unfamiliar (programming) instrument, and how it can stretch your mind.

Yesterday Is Here

The first time I had that experience as a programmer, it was forced on me.

In grad school, as a teaching assistant to Dan Friedman, I was tasked with running an undergraduate lab class in Basic. When Dan handed me a peculiar little book he’d written and told me to “learn it,” I was unimpressed. The Little LISPer didn’t look like it had anything to do with Basic.

In fact, the book didn’t have much of anything to do with anything I’d ever encountered—in subject matter, pedagogy, or purpose. It consisted of nothing but questions and answers, and most of the questions were unanswerable on the basis of the knowledge you could be expected to have when encountering them. Years later, when I was trying to develop software for Apple’s newly released Macintosh computer and struggling through Apple’s developer documentation, Inside Macintosh, the joke was that the prerequisite for reading Inside Macintosh was a thorough knowledge of Inside Macintosh. But it seemed literally true that you couldn’t begin to understand The Little LISPer until you had mastered The Little LISPer.

Somehow it worked its magic, though: by the time I’d puzzled through that strange little book I really got LISP. More than that, more than adding to my then-meager store of programming knowledge, I’d learned a new way of thinking about programming. My old strategies for decomposing a problem into parts no longer fit this new way. Wherever I’d employed the hammer of iteration I now tried the lever of recursion. My ideas about how big a program should be and in what order I should tackle the subproblems of a programming task changed drastically. I’d put down the guitar and picked up the bassoon.
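
The flavor of that shift is easy to sketch, even outside LISP. Here is my own illustration (not the book’s, and in C++ rather than LISP, since the shape of the idea survives translation): the habit is a loop with an accumulator; the lever is defining the answer for a list in terms of the answer for the rest of the list.

    #include <cstddef>
    #include <iostream>
    #include <vector>

    // The old habit: iterate, accumulating as you go.
    int sum_iterative(const std::vector<int>& xs) {
        int total = 0;
        for (int x : xs) total += x;
        return total;
    }

    // The Little LISPer's lever: the sum of a list is its first
    // element plus the sum of the rest. No loop, no accumulator.
    int sum_recursive(const std::vector<int>& xs, std::size_t i = 0) {
        if (i == xs.size()) return 0;             // an empty list sums to 0
        return xs[i] + sum_recursive(xs, i + 1);  // (car xs) + sum of (cdr xs)
    }

    int main() {
        std::vector<int> xs{1, 2, 3, 4};
        std::cout << sum_iterative(xs) << " " << sum_recursive(xs) << "\n";  // 10 10
    }

Trivial at this size, but on problems with real structure (trees, expressions, grammars), the recursive decomposition is the one that scales.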

I never did figure out how to use it in teaching Basic to undergraduates, but I don’t think Dan expected me to.

Although I moved on to more sophisticated LISP books and projects, for practical reasons, I’ve done no serious programming in LISP since grad school. Nevertheless, I think that learning LISP did me more good as a programmer than learning C or Java or Ruby. It’s like the Latin I took in high school: I don’t read or write it today, but the Latin I learned lets me see inside words in a way that I couldn’t without that experience. And knowing LISP lets me see inside problems in different and helpful ways.

Or so it seems to me. Of this I’m sure: Learning this new instrument of LISP was excellent mental finger exercise.

Trouble’s Braids

Musing on that Waits quote, I flashed forward to 2009 and asked myself what a really challenging and useful finger exercise for programmers would be today. Hmm, how about parallel programming?

It’s odd that parallel programming would feel unnatural. The universe is parallel. Your brain is parallel. If you play the piano, your hands play in parallel, and not SIMD parallel, either. The Web is parallel.

And now, belatedly, the computer on your desk is parallel. Problems in heat dissipation, power consumption, current leakage, and the speed of light have spelled the end of single-core growth. “Like it or not, folks,” Ted Neward sums it up, “the path forward isn’t one that you get to choose.” Intel and the other chip makers have already chosen it for you. Multicore processors are here and waiting to be exploited.

It’s not just multicore, of course. There are many kinds of parallel programming. And the problems are not really new. We’ve been dealing with concurrency in operating systems for decades. But when it comes to developing software for richly parallel hardware, we’re in the era of rabbit ears, says James Reinders, who evangelizes all things parallel for Intel and is the author of a new book called Intel Threading Building Blocks.

In the pioneering days of television, the early adopters had to be willing to diagnose and replace vacuum tubes, fuss with vertical and horizontal hold, and experiment with the positioning of the rabbit-ears antenna on top of the set to get the best picture. In Reinders’ analogy, you are the poor sap balancing the rabbit ears on his head and twiddling the knobs, trying to get a decent picture. The tools to help you do the job are, Intel implicitly admits, somewhat primitive or lacking.

Which may explain what Intel identifies as the top challenge for parallel computing: er, you. Or as they put it, “Finding concurrency in a program—how to help programmers ‘think parallel.’” See, if only programmers could think parallel, we’d be rocketing ahead in exploiting the power that Intel is so generously building into modern processors. It’s your fault.

OK, maybe they don’t mean it in quite that way. But the fact is, thinking in parallel is challenging. Despite all the parallelism going on around us and inside us all the time, thinking of problems in parallel terms doesn’t seem to come naturally to us.
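
The mechanics, at least, can be sampled cheaply. Here is a minimal sketch using Reinders’ own library (assuming you have Intel Threading Building Blocks installed and a C++ compiler with lambda support; it is one way in, not the only one), squaring every element of an array across however many cores you happen to have:

    #include <tbb/parallel_for.h>
    #include <tbb/blocked_range.h>
    #include <cstddef>
    #include <iostream>
    #include <vector>

    // Square every element of v in parallel. TBB splits the index
    // range into chunks and schedules the chunks across available
    // cores; you say what is parallel, not how to thread it.
    void square_all(std::vector<double>& v) {
        tbb::parallel_for(
            tbb::blocked_range<std::size_t>(0, v.size()),
            [&](const tbb::blocked_range<std::size_t>& r) {
                for (std::size_t i = r.begin(); i != r.end(); ++i)
                    v[i] *= v[i];
            });
    }

    int main() {
        std::vector<double> v(1000000, 3.0);
        square_all(v);
        std::cout << v[0] << "\n";  // 9, computed across cores
    }

Note what the sketch does not say: how many threads to spawn, or which core gets which chunk. The library decides. The hard part, as Intel keeps telling us, is not the syntax; it is noticing which parts of your problem are independent in the first place.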

Reinders would have parallel programming taught as a fundamental component of any computer science curriculum, starting in high school. “We’ve progressed from the introduction of multicore processors in some computers to nearly all new systems having multicore processors,” he says. Yet “parallel programming remains an advanced topic in graduate school studies and notably absent from too many university undergraduate courses.... [W]e can do more.”

“We can do more” is a good summary of the state of parallel programming today. To master it, you need to learn new tools (if you can find them), new rules, and new languages (there are starting to be some good ones); discover or adopt new algorithms, patterns, and design strategies; rethink things you take for granted; and come to programming with fresh eyes, willing to see anew. Learning a new language is only part of it; we’ve really entered a new generation of programming, and those who actually learn to think in parallel will have an edge. The positive spin on this daunting state of affairs is that there is much to engage the willing learner.

In other words, good finger exercises. Challenging in the best sense. In parallel programming, there are fundamental algorithms yet to be discovered, breakthroughs to be made by someone. Might as well be you.

We are in the rabbit-ears era of parallel programming. But that’s a good thing.

Innocent When You Dream

But what about something that really takes your coding fingers out of their happy places?

Sometimes the best mental exercises are the ones that have no immediate chance of being applied to any practical problem. Pure mind play. That’s what Jack Woehr has been up to recently, teaching himself quantum computing in public in his Dr. Dobb’s Journal blog. He’s been interviewing experts, chatting with readers, thinking out loud.

“Over the years,” he writes, “I have slowly imbibed a tiny sip of quantum mechanics in pursuit of quantum computing. It’s cumulative and it’s starting to make sense. I’m now going back and reading early classic papers on the subject by the originators of QM.” Jack entreats the experts he interviews to “talk to me like an engineer.” Researching quantum computing (QC) is definitely an academic pursuit, but what Jack is after is a practical, engineer’s understanding of the subject.

In 1992, David Deutsch and Richard Jozsa exhibited a problem that a quantum computer could solve exponentially faster than any deterministic classical computer: a proof of principle that the intractable could become tractable. A quantum computer, if such a thing existed, could solve problems that were heretofore not merely unsolved, but practically unsolvable. The payoff for creating a quantum computer would be enormous.

And there has been progress, some of it quite recent. On August 6, the National Institute of Standards and Technology announced that NIST researchers had “demonstrated on a small scale all the generally recognized requirements for a large-scale ion-based quantum processor. Previously they could perform all of the... processes a few at a time, but now they can perform all of them together and repeatedly....”

The processes include initializing qubits (the quantum analog to bits) to the desired binary value, storing qubit data in ions, performing logic operations on qubits, transferring information between different locations in the processor, and reading out qubit results. Obviously, this is a long way from having a working quantum computer; equally obviously, if you can do these things you should be able to build one.

The computer you can build, though, will be very different from any computer we’ve ever encountered. Quantum mechanics, as one of the experts Jack interviews points out, isn’t binary logic, and qubits are not bits. Which is why QC will lead to entirely new ways of computing things, and why QC will require—and inspire—new algorithms. Jack interviewed Jonathan Home, one of the NIST researchers, who said, “[I]t’s an open research problem, what problems will be better-solvable by quantum computing.” This is the wild west of computing, an untamed frontier.

And out on the frontier you don’t have all the conveniences of home. As Jack says, “QC today is not what ‘us programmers’ usually call programming. This is gate design and basic logic.” Languages? It’s a little early for such frills. “When we see what are the basic operations supported by the machine we’ll weave languages.”

That doesn’t mean people aren’t developing QC algorithms, though. Shor’s algorithm for factoring large numbers rapidly is all by itself reason to try to develop a quantum computer—and to worry about how the world will change when affordable quantum computers running Shor’s algorithm make today’s encryption solutions obsolete. Robert Tucci, who has weighed in on Jack’s blog and who writes about quantum computing on his own blog, has developed a Java application that generates the qubit circuit necessary for doing simulated annealing. You can do quantum computing today; it’s just that you’ll find it a strange exercise.
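
Even without the hardware, you can get a whiff of that strangeness on the machine you already own. Here is a toy sketch that simulates a single qubit in a few lines of C++ (real quantum computers do not work by storing amplitudes in memory like this; that is exactly what makes them interesting):

    #include <cmath>
    #include <complex>
    #include <iostream>

    int main() {
        // A qubit is not a bit: its state is two complex amplitudes.
        std::complex<double> a0 = 1.0, a1 = 0.0;  // the state |0>

        // Apply a Hadamard gate: it sends |0> to an equal
        // superposition of |0> and |1>.
        const double h = 1.0 / std::sqrt(2.0);
        std::complex<double> b0 = h * (a0 + a1);
        std::complex<double> b1 = h * (a0 - a1);

        // Measurement yields 0 or 1 with probability |amplitude|^2.
        std::cout << "P(0) = " << std::norm(b0)
                  << ", P(1) = " << std::norm(b1) << "\n";  // 0.5 and 0.5
    }

One qubit is a toy. The point is that n of them take 2^n amplitudes to describe, which is precisely the resource a classical simulation runs out of, and a quantum computer does not.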

Viewed as an instrument, quantum computing would be an odd one—maybe a kazoo. Viewed as a mental exercise... “Quantum computing,” Jack says, “depends on quantum mechanics and quantum mechanics makes my head hurt.” Indeed. As mental exercise, studying quantum algorithms is extreme sport. But Jack’s glad to be doing the exercise. He says, “It’s revitalizing my technical chops.”

Thinking about all these mental exercises tired me out. Over dinner, I told Nancy that I knew just what I needed to take me out of my comfort zone and get me thinking differently. That’s right, I said, I need to get me a porkpie hat.

Michael Swaine is the editor of PragPub. He’s been looking for a good hat since he first felt a draft on the top of his head.