The Perils of Programming as Puzzle

Michael Correll
Apr 23, 2021
[Image: a section of a jigsaw puzzle with one piece missing]

A while back I was interviewing for a research scientist position at [company redacted]. I had already picked up kind of iffy vibes during the day, but there was a point where it crystallized into “ah, I don’t want to be here, and they probably don’t want me here either.” It was when, after an interview loop that was mostly focused on my research and places where I could intervene, I was asked by a new interviewer to take a marker, go up to the whiteboard, and, in my “preferred language,” implement a stack data structure using queue data structures as a basis. To me, being asked an (in my opinion) introductory question like that, at this stage in the interview, for a position that required a Ph.D. in Computer Science or a related field to even get into the running, was indicative of a dysfunction somewhere (with me, with the company, or some combination of the two) that was likely to be fatal. Namely, some combination of:

  1. The interviewer wasn’t convinced I was a “real” computer scientist, or suspected that I had otherwise slipped through the cracks of the systems that would normally weed out obvious imposters, and this was a last-ditch attempt to reveal me for the sham I was.
  2. The interviewer found my work or my background so boring or irrelevant that the only thing they could think to do during our 1:1 time was to default to asking basic programming questions “to see how I think” as a sort of interviewer autopilot. Or, somewhat relatedly, viewed such autopilot questions as a standard hazing practice that somebody had to do, just to get out of the way.
  3. As a hybrid option, the interviewer wasn’t sure what a research scientist was or how one might help them in their work, but suspected that some (or many) of them were useless talkers or dilettantes who couldn’t do things that “really count” (like write code), and so was trying to identify such useless parasites at the jump.

While I wish I had the genius of some of Joel Grus’ humorous responses to being given stupid programming “puzzles,” instead I internally seethed, said my “preferred language” was English, and gave a thoroughly half-assed answer aloud. I didn’t get the job (and they were right not to offer it to me: if someone is unwilling to jump through hoops like that for ego reasons, or to work on things they think are boring in service of making things run smoothly, then you should probably hire somebody who will). Yet, for some reason, that experience ranks high in my personal pantheon of annoying interview experiences, alongside the interview where my 1:1 time was spent being forced to watch a YouTube video of positive news coverage of the faculty member’s work, and then being shuffled out the door with a flyer advertising their newest book (n.b. that my bad interviewing experiences are both usually self-imposed and also trivial, hangnail-level inconveniences compared to some horror stories I’ve heard, especially from women).
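For the record, the standard answer to that particular puzzle is small enough to fit on half a whiteboard. Here is a minimal Python sketch (the class and method names are mine, not anything from the interview) that builds a LIFO stack out of two FIFO queues, paying for each pop by draining all but the newest element from one queue into the other:

```python
from collections import deque

class StackFromQueues:
    """A LIFO stack built from two FIFO queues (the classic whiteboard answer).

    Only queue operations are used: append (enqueue) and popleft (dequeue).
    push is O(1); pop is O(n), since it drains one queue into the other.
    """

    def __init__(self):
        self._main = deque()  # holds the elements, oldest at the front
        self._aux = deque()   # scratch queue used during pop

    def push(self, x):
        self._main.append(x)  # enqueue: the newest element ends up at the back

    def pop(self):
        if not self._main:
            raise IndexError("pop from empty stack")
        # Dequeue everything except the newest element into the scratch queue.
        while len(self._main) > 1:
            self._aux.append(self._main.popleft())
        top = self._main.popleft()
        # Swap roles so _main again holds all remaining elements.
        self._main, self._aux = self._aux, self._main
        return top

    def empty(self):
        return not self._main
```

(The one-queue variant, which rotates the queue on every push instead, trades O(n) pops for O(n) pushes; either is an acceptable answer, which is part of why the question tells you so little about a candidate.)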

I’ll admit to being bristly about the subject of coding puzzles in professional contexts. One of my very first Medium essays (before I settled on primarily using this platform for trolling) is me complaining about how computer scientists try to bundle up so much of their self-identity and in-group/out-group politics with some monolithic notion of what coding means, to the disservice of actually teaching or learning or diversifying or even just enjoying computer science. But my personal experiences and gripes aside, it’s all at least a little weird, right? Computer science feels “puzzle-focused” in a way that other technical or professional fields aren’t (for a glimpse of how weird this looks in other contexts, see Emily Krager’s thought experiment imagining medical doctors being interviewed like software engineers). And this hangup on puzzles and puzzle-solving, I think, ultimately results in a number of concrete harms.

What I mean by “puzzle” and “puzzle-focused” is a bit hard to pin down, but the general gestalt I am trying to pick out is the idea of an abstract, self-contained unit of stimulating intellectual effort; one that requires a certain (perhaps rare or unconventional) sort of intellectual skill or expertise to “solve,” after which it is “solved” forever (both in its current form, and as re-applied to any isomorphic puzzles). Finishing a sudoku, gathering everyone together in the drawing room to reveal who really did the murder, and answering a riddle are all puzzles in the sense I mean. Puzzles in this sense are deep in the bedrock of computer science — dealing with problems as discrete chunks to tackle abstractly is just sort of how things like algorithms and architecture and software engineering work. But puzzle solving has also shaped other bits of computer science (its practice, its self-conception, its applications) in ways that are neither inherent to the discipline nor ultimately beneficial. In particular:

Abstraction and Alienation

The technology was the most interesting thing I had built to that point. It may still be the most interesting thing I’d built. But don’t let that distract you: it was designed to kill people.

— Caleb Thompson, “Don’t Get Distracted” (RubyConf 2017)

The first trick that the puzzle-solving mindset plays on us is to create a distance between us and the consequences of our actions. There are lots of ways that this distance can be abused. As Caleb Thompson points out in the talk I link to above, our intrinsic motivation to tackle intellectually stimulating problems can distract us from what people will actually do with all of our labor once we’re done. Even if we’re not distracted, the distance from abstract problem solving to concrete actions in the world is large enough to dilute our feelings of personal responsibility. It tempts us with the easy out of “I’m just an engineer” or “I’m just a researcher” or “it’s my job to build the tech, and other people’s job to consider the implications.”

All sorts of horrors in history were fueled by people who compartmentalized their jobs into just the tasks in front of them (either directly through the inhumanity that this alienation produces, or more nefariously at the guidance of people who know just how to take advantage of people like that). Focusing on just intellectual interest (how fun the puzzle is to solve) rather than actual motivations and consequences of what we’re doing is why the computer science research community seems to spend so much time on “racist nonsense” like using facial recognition to infer personality traits, and why it gets caught flatfooted again and again when its nice and shiny abstract algorithms create “absurd outcomes.”

Lots of the ways we do our work in modern society are alienating, but computer science is one of those fields where this alienation from the actual impact of our problems is somehow seen as virtuous. Computer science programs pump out “apolitical subjects,” and computer science research is often positioned as a “politically neutral game.” Other types of work at least have the good sense to be embarrassed when they are snookered into doing evil; it is computer science where we have the naïveté to be shocked and surprised each time.

Braggadocio and Blandness

Shared among the first four terms — unicorns, wizards, ninjas, rock stars — is a focus on the individual’s extraordinary technical expertise and their ability to prevail when others cannot. We might have more accurately said “his ability to prevail” because these ideas about individual mastery and prevailing against steep odds are, of course, also associated with men.

— Catherine D’Ignazio and Lauren Klein, Data Feminism

I talked above about how the attitude of computer science as puzzle solving can lead to poor outcomes, but there is also harm in reducing computer scientists down to puzzle solvers. Doing so values individual genius over collective effort, novelty over replicability, creation over maintenance. Lots of people dunked on the “How do you spot a 10x Engineer” tweet thread, which suggested so-called virtues like the inability to communicate, mentor, or delegate as hallmarks of superlative software engineers, but all the more frightening are the people who would silently agree with those kinds of statements, and who are (or would be) willing to accept all sorts of abhorrent behavior in order to retain their own personal talent pool of unicorns, wizards, and rock stars. Not willing to put up with assholes, sacrifice your time or health or your family for your job, or do unpaid extra work? Sorry, not rock star enough. The result is a field that will continue to struggle with diversity, with burnout, and with retention.

Beyond just what it causes us to value in others, the focus on this very narrow idea of a puzzle-solver impacts how we view and value ourselves. It makes us lonely and atomized and nervous. Not understanding something or being able to solve a problem can all too often be recast as a personal intellectual failing rather than an opportunity for growth or collaboration. We are pitted against each other rather than encouraged to build solidarity and organizational strength. We walk around with inflated egos since we “made it” and others didn’t. And since no one is really the “unicorn” they are asked to be, these egos are often poor shields for a host of internal self-doubts and perceived inadequacies.

Capture and Collaboration

The best minds of my generation are thinking about how to make people click ads. That sucks.

— Jeff Hammerbacher, “This Tech Bubble is Different”

I would be willing to accept all sorts of negative externalities from the puzzle solving attitude if, at the very least, all of this puzzle-solving did something useful. But does it, really? I suspect in many cases it does not. We march to the beat of research problems set for us by big companies trying to get people to click on more ads, oppressive governments trying to keep their citizens in line, or startups trying to bamboozle the current round of venture capitalists before they are distracted by new buzzwords. In short, even well-meaning efforts are “captured” by external entities and wasted in nefarious or pointless endeavors.

Machine learning is of course a particularly prominent example: many of the biggest, most intellectually demanding data sets are owned by a select few companies that largely use them for the (either mostly fake or at the very least massively overvalued) domain of internet advertising. But the same is true for other parts of the computing ecosystem as well. Mark Hurst describes this process for the field of User Experience, which he says began with the good intentions of centering a human being in the design process, and which he now sees as focused on selecting which set of dark patterns will get people to agree to give away their data or click on your ads.

So What Should We Do About It?

Here’s the part where I wildly extrapolate in an attempt to get you mad, because if you are mad at least there are higher odds you’ll do something about all this. In particular, my takeaways are:

Treat computer science work as labor

I mean lots of things by this, but I think I can summarize them all with “don’t be a sucker, or at the very least don’t ask other people to be suckers with you, just because you like hacking.” People will take advantage of you (for instance through unpaid work like “hackathons” or asking for evidence of “side projects” to get your foot in the door) because they’ve convinced you that you are “intrinsically motivated” to solve computer science puzzles. That’s sucker talk. It’s the same sucker talk as when we tell teachers to accept student debt and low wages because they are supposed to have a “love of teaching,” and you shouldn’t fall for it (although even teachers aren’t suckers enough to have talked themselves out of needing things like unions or other channels for advocacy, like tech workers seem to do).

Treat computer science work as impactful

You are doing things that shift balances of power, allocate time and resources, and just generally impact people other than yourself. That means that you have duties and responsibilities. As per the Casey Fiesler article I linked way up above, “if you work in tech and you’re not thinking about ethics, you’re bad at your job.” Thinking about other people and how you are impacting them isn’t like sprinkles you add on at the very end after you’ve baked your computer science cake; it’s your duty as a human being in a society. Don’t get distracted, bought off, or otherwise waylaid from this inescapable fact. If you practice computer science, make sure you can stop, slow, or otherwise raise a ruckus about work that you think is causing harm. If you teach computer science, integrate ethics throughout your curriculum rather than treating it as a “special interest” topic for a single course (or a single lesson!). If you are a user of computer technology, stop mystifying it as either the magic creation of heroic, infallible geniuses or as inscrutable “algorithms” beyond criticism, and instead treat every technology as the product of flesh-and-blood human beings with flaws, biases, and home addresses.

Thanks to Lane Harrison, Sean Andrist, and Arvind Satyanarayan for feedback on this post.

