John Cook links to a blog by Ben Deaton arguing that people often waste time trying to set up ideal working conditions, even though (a) your working conditions will never be ideal, and (b) the sorts of constraints and distractions that one tries to avoid can often stimulate new ideas.
Deaton seems like my kind of guy–for one thing, he works on nonlinear finite element analysis, which is one of my longstanding interests–and in many ways his points are reasonable and commonsensical (I have little doubt, for example, that Feynman made a good choice in staying clear of the Institute for Advanced Study!), but I have a couple of points of disagreement.
1. In my experience, working conditions can make a difference. And once you accept this, it could very well make sense to put some effort into improving your work environment. I like to say that I spent twenty years reconstructing what it felt like to be in grad school. My ideal working environment has lots of people coming in and out, lots of opportunities for discussion, planned and otherwise. It’s nothing like I imagine the Institute for Advanced Study (not that I’ve ever been there) but it makes me happy. So I think Deaton is wrong to generalize from “don’t spend time trying to keep a very clean work environment” to “don’t spend time trying to get a setup that works for you.”
2. Also consider effects on others. I like to feel that the efforts I put into my work environment have positive spillovers on others–the people I work with, the other people they work with, etc.–as well as setting an example for others in the department. In contrast, people who want super-clean work conditions (the sort of thing that Deaton, rightly, is suspicious of) can impose negative externalities on others. For example, one of the faculty in my department once removed my course listings from the department webpage. I never got a straight answer on why this happened, but I assumed it was because he didn’t like what I taught, and it offended his sensibilities to see these courses listed. Removing the listing had the advantage from his perspective of cleanliness (I assume) but negatively impacted potential students and others who might have been interested in our course offerings. That is an extreme case, but I think many of us have experienced work environments in which intellectual interactions are discouraged in some way. This is clear from Deaton’s stories as well.
3. Deaton concludes by asking his readers, “How ideal is ideal enough for you to do something great?” I agree with his point that there are diminishing returns to optimization and that you shouldn’t let difficulties with your workplace stop you from doing good work (unless, of course, you’re working somewhere where your employer gets possession of everything you do). But I am wary of his implicit statement that “you” (whoever you are) can “do something great.” I think we should all try to do our best, and I’m sure that almost all of us are capable of doing good work. But is everyone out there really situated in a place where he or she can “do something great”? I doubt it. Doing something “great” is a fine aspiration, but I wonder if some of this go-for-it advice can backfire for the people out there who really aren’t in a position to achieve greatness.
I have a saying that "good is almost always better than provably optimal". I think this falls under that category.
The problem with provably optimal is that it's extremely hard to achieve, and the real-world objective is rarely that well linked to the objective function used in the optimization. So in general, using "optimization"-type techniques to get a good solution in a reasonable time is a good idea, but trying to figure out THE optimum is a waste of time.
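A standard textbook illustration of this "good beats provably optimal" point is the 0/1 knapsack problem: a greedy heuristic runs fast and usually gets close, while the exact answer requires exponential search. The item values and weights below are made up for illustration; nothing here comes from the original post.

```python
# Illustrative sketch: "good" (fast greedy heuristic) vs "provably optimal"
# (exhaustive search) on a tiny 0/1 knapsack. Data is invented for the example.
from itertools import combinations

items = [(60, 10), (100, 20), (120, 30)]  # (value, weight) pairs
capacity = 50

def greedy_value(items, capacity):
    """'Good': take items in order of value/weight ratio; fast, usually close."""
    remaining = capacity
    total = 0
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if weight <= remaining:
            total += value
            remaining -= weight
    return total

def optimal_value(items, capacity):
    """'Provably optimal': brute force over all 2^n subsets; exact but exponential."""
    best = 0
    for r in range(len(items) + 1):
        for subset in combinations(items, r):
            if sum(w for _, w in subset) <= capacity:
                best = max(best, sum(v for v, _ in subset))
    return best

print(greedy_value(items, capacity))   # 160 (greedy fills up on high-ratio items)
print(optimal_value(items, capacity))  # 220 (the true optimum skips the "best ratio" item)
```

The greedy answer is off by about 27% here, but it examines each item once, while the exact search doubles in cost with every added item; on realistic problem sizes the heuristic is the only practical option, which is the commenter's point.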
When I worked on the Hubble, I learned an important engineering maxim: "Fast, Good, Cheap: Pick two."
The biggest improvement in my work conditions was adding a 24-inch monitor to sit next to my laptop, allowing me to use my laptop screen for my final draft while having my notes page and a web page simultaneously open on the big screen.
I see that Al Gore has three 30" screens around his MacBook. That seems ideal.
That sounds like Voltaire's maxim “Le mieux est l'ennemi du bien” – in English, “The best is the enemy of the good.”
Joel Spolsky is a New York software entrepreneur who makes a point of creating ideal working conditions (see the description of the office he set up in 2008) as part of his business strategy to attract the best programmers.
Thanks for taking the time to read the post and share your thoughtful response. Actually, I agree there is massive potential to positively impact one's productivity and happiness as well as those around you (as you rightly point out) by discovering and implementing what works for you.
Here, though, I was more addressing people I encounter who use "if I only had a better office/tools/colleagues/etc"-type thinking to justify a lack of productivity, obviously a deeper problem than can be solved by upgrading to a newer Mac or better office view.
I do think you're right about the term "great," which I used more generally. Someone with a Dunning-Kruger mentality, for example, is in for a discouraging time if they just decide to go all in on a technical venture. On the other hand, that type of proactivity might provide some valuable (and needed) feedback.
I suppose I may as well state the obvious, so that others won't feel that they have to: different people have radically different levels of tolerance for non-ideal working conditions. Some people are so distracted by a messy desk that they can't function well until it's cleaned up; others of us don't care at all. Some people are very distracted by a brief phone call or by someone who sticks their head in the door for a quick chat; others may function better with an occasional break of this sort (and of course the chat or phone call often constitutes or facilitates work). Some people are finicky about things like lighting, temperature, and noise, others are not.
I'm on the tolerant side of just about all of these. But I wouldn't hold myself up as an example of anything: I do less actual work per hour of "work" than just about anyone. It's just that if distractions aren't provided for me, I make them up myself. Sit me in front of an empty desk with nothing but a pencil and a piece of paper, and I'll give you a paper full of doodles. Put me in front of a computer that has no applications other than R, and I'll show you dozens of Lissajous figures, or simulations from a made-up predator-prey model, or images of the Mandelbrot set with different color schemes. Anything but what I'm supposed to be doing. But that doesn't seem to be affected much by "working conditions". The only things that keep me on-task are (1) frequent interactions with people who are interested in my work, and (2) deadlines. With those (both are necessary), I can be effective in any environment I've ever experienced. Without them, in none.
There has certainly BEEN some really great work done under the most miserable conditions. I remember, as a kid (pre-calculator days), reading about the Trachtenberg system for rapid arithmetic calculation. It was developed while its author was in a concentration camp.
And there has been some undeniably great work done in very normal conditions; both Euler and Erdos could, apparently, work anywhere at all.
OTOH, it's likely that most of us do better work when we are in better conditions.
Advantageously trading off now-versus-later efforts? And confusion!
There is something about making the right trade-off between what one can do in one's current situation versus investing time and effort in trying to improve that situation.
One of the silver linings of a less-than-ideal situation might be the confusion it honestly generates, which can increase creativity and/or later understanding.
For instance, I tried doing some analysis of that noisy scatter-plot news video while also doing a barbecue on Saturday evening, and I made a couple of rather silly mistakes. The confusion that resulted provided some encouragement early the next morning to think a little more carefully about things that I sort of knew, but maybe not well enough.
Anyway, C.S. Peirce thought confusion (real, not fake) was a necessary ingredient to being creative and making a real contribution to knowledge.
K?