2016-01-14

One of the common rules of ethics in Monopoly is not to hide money. While players don’t have to make great efforts to keep their cash holdings transparent, it’s not legal to stick a pair of $500 bills under the board and pretend to be nearly bankrupt in order to negotiate in bad faith, underpay when landing on the “Income Tax” square, or the like.

In the financial world, there are similar policies against “hiding money under the board”, or understating one’s performance in order to appear stronger in future years. If your trades make 50 percent in 2013, you might be inclined to report 15 percent so that, if you have a -10 percent year in 2014, you can report another positive year. Ultimately, there’s some level of performance that betrays high volatility and, if one exceeded that level, it would be socially advantageous to smooth out the highs and lows. In essence, one who does this is misleading investors about the risk level of one’s own chosen strategy.
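To make the arithmetic concrete, here’s a minimal sketch of that smoothing scheme. The numbers are the hypothetical ones from above, the `smooth` function and its `cap` parameter are my own inventions for illustration, and the model is simplified in that it treats returns as additive rather than compounding. It goes without saying that doing this with real investor reports is fraud.

```python
# A toy model of performance smoothing: bank the surplus from good years
# "under the board" and draw it down in bad ones. Hypothetical numbers,
# simplified (additive) returns, and illegal with real money.

def smooth(actual_returns, cap=0.15):
    reserve = 0.0          # the hidden stash of unreported performance
    reported = []
    for r in actual_returns:
        total = r + reserve               # this year's result plus the stash
        reported.append(min(total, cap))  # never report more than the cap
        reserve = max(total - cap, 0.0)   # re-hide anything above the cap
    return reported

print(smooth([0.50, -0.10]))  # actual +50%, -10% -> reported [0.15, 0.15]
```

The point of the sketch: the reported series carries the same long-run total (less whatever remains in the reserve) but far lower visible volatility, which is exactly the misrepresentation of risk described above.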

While this behavior isn’t ethical when applied to financial instruments or board games, it’s something that many people have to do on a daily basis at the workplace. When you have a good week, do some extra useful work and hide it somewhere. When you have a bad week, unload some of your stored work to compensate for the period of slowness. Of course, this discussion naturally leads into the “what’s it really like to be bipolar in tech” question, which I’ve already answered (it sucks, people are shitty) and don’t care to trudge down that frozen muck path yet again. This “hiding money under the board” skill is something you learn quickly with a mood disorder, but I think that it’s worthwhile for everyone, because it’s impossible to predict when some external problem will impede one’s performance.

Broadly speaking, people can be divided into low- and high-variance categories. Society needs both in order to function. While the low-variance people aren’t as creative, we need them for the jobs that require extreme discipline and reliability, even after days without sleep or severe emotional trauma or a corporate catastrophe. High-variance people we need to hit the creative high notes and to solve problems that most people think are intractable. Now, about 98 percent of jobs can be done well enough by either sort of person, insofar as few corporate jobs actually require the high-variance person’s level of creativity or the low-variance person’s reliability. Given this, it seems odd that whether one is low- or high-variance would have a powerful impact on one’s career (spoiler: it’s better to be low-variance). It does, because corporations create artificial scarcities as a way of testing and measuring people, and I’ll get to that.

There are also, I would say, two subcategories of the high-variance set, although the distinction here is blurrier, insofar as both patterns are seen in most people, so the distinction pertains to proportion. There’s correlated high variance and uncorrelated high variance. People with correlated high variance tend to react in similar ways to “normal”, low-variance people, but with more severity. Uncorrelated high variance tends to appear “random”. It’s probably correlated to something (if nothing else, the person’s neurochemistry) but it doesn’t have patterns that most people would discern. Oddly enough, while uncorrelated variance is more commonly associated with “mental illness”– if someone laughs at a funeral, you’re going to think that person’s “a bit off”– correlated variance can be much more detrimental, socially and industrially speaking. A person with correlated high variance is likely to nose-dive when conditions are ostensibly bad, and that’s exactly when managerial types are on high alert for “attitude problems” and emerging morale crises, and are pushing for much higher performance (to detect “the weak”) than they’d demand in good conditions. “Hiding money under the board” is hiding variance, and uncorrelated variance is a lot easier to conceal because no one expects it to be there.

Most people would agree that there is a spectrum between low and high variance, but they wouldn’t necessarily connect it to anxiety and mood disorders like depression, panic, or bipolar disorder. I would. I think that depression and bipolarity are symptoms of many different root causes that we just haven’t figured out how to separate. “Depression” is probably ten different diseases grouped by a common symptom, which is part of what makes it hard to treat. Some depressions respond very well to medication and others don’t. Some go away with improved exercise and sleep habits and meditation, while others don’t. At any rate, I think that the extreme of high variance is going to manifest itself as a mood disorder. This also suggests that mentally healthy people at, say, the 90th percentile of variance might be arguably “subclinically bipolar”, even though they wouldn’t exhibit pathological symptoms. In fact, I don’t think that that’s as far off the mark as it sounds.

People have asked me what the hardest parts of cyclothymia (a rapid-cycling, but usually mild, variety of bipolar disorder) are, and the answer isn’t the classic symptoms. Depression sucks, but I haven’t had a depressive episode longer than a week since 2013 (after a death in the family), and I haven’t had a manic episode since 2008, though I’ll probably have one again. Number two on the list is panic attacks, which tend to occur because, as one gets older, pure hypomania becomes rarer and it’s more common to have “mixed” periods with characteristics of hypomania and depression intermingled. (And what do you get when you combine hypomania and depression? Often, anxiety.) That’s pretty much where I am now. I don’t go from manic to depressive and back; I get 47-49 weeks per year of normal mood (with some anxiety, and an occasional stray panic attack) and about 20-35 days of (mostly mild) cycling, during which I can work and get about just fine, but have (mostly short-lived) “depression attacks” and insomnia and weird dreams and the familiar pressure-behind-the-eyes headache of hypomania.

I said that panic attacks are the #2 hardest thing about it. What’s #1? I call it 20 percent time, as a shout-out to Google. Silicon Valley is full of self-diagnosed “Aspies” who think that they have an autism-spectrum condition, and I think that most of them are off the mark. Yes, there’s a discernible pattern in creative individuals: extreme social ineptitude from ages 5 to 20, awkwardness (improved social ability, but a deficit of experience) from 20 to 30, and relative social normalcy (with, perhaps, stray moments of bitterness) after 30. This is a ridiculously common narrative, but I really don’t buy that it has anything to do with the autism spectrum. People with autism are (through no fault of their own) socially deficient for their entire lives. They might learn to cope and adapt, but they don’t develop normal social abilities in their mid-20s, as people along the high-variance “main sequence” seem to do. In fact, I think that most of the people with this narrative are on a spectrum; I just don’t think it’s the autism spectrum. My guess is that they have a subclinical 90th-percentile variety of what is, at the 98th and 99th percentiles, bipolar disorder. One thing to keep in mind about mental illness is that its stigma is amplified by the visibility of the extreme cases. Below the water line on that metaphorical iceberg, there are a large number of people who aren’t especially dysfunctional and, I would argue, many undiagnosed and subclinical “sufferers” who experience little more than mild social impairment.

Mood disorders are notoriously hard to diagnose, much less treat, in childhood and adolescence, and people commonly offer quips like “teenagers are just manic-depressive in general”. That’s not really true at all. Teenagers aren’t “naturally bipolar”. What is true is that children and adolescents have exaggerated moods that exist to reward and punish behaviors according to their social results. This is a specific subtype of “high variance” that is correlated (unlike the more uncorrelated high variance that is seen in mood disorders). That’s how social skills are learned. You kiss a girl, and you feel great; you’re picked last for a team, and you feel terrible. In this period of life, I’d guess that the garden-variety, not-really-bipolar, high-variance (85th to 95th percentile) people don’t seem especially mood-disordered relative to their peers. But, at the same time, they’re spending 10 to 20 percent (and maybe more) of their time in a state of consciousness where their moods are somewhat uncorrelated to social results and, therefore, acquisition of social skills is halted. That state of consciousness is good for other things, like creative growth, but it’s not one where you’ll learn others’ social cues and messages, and how to move among people, at an optimal rate. This explanation, I think, is better than subclinical autism in getting to the root of why creative people are invariably socially deficient before age 25, and why a good number (more than half, I would guess) recover in adulthood. If you’re 40, the effect of “20% time” is that you’re “socially 32”, and no one can tell the difference. If you’re 20, and your “social age” is 16, that’s a fucking disaster (or, statistically more likely, a not-fucking disaster). The age of 25 is, approximately, the point at which being 10 to 20 percent younger in social ability is no longer a handicap.

What does this have to do with Silicon Valley and the workplace? I want to be really careful here, because while I think that high-variance people (and I, obviously, am one) experience across-the-board disadvantages, I don’t want to create tribal bullshit around it. High-variance people aren’t necessarily “better” than low-variance people. There are high-variance people with awful moral character and low-variance people with great moral character. It’s important to keep this in mind, even while “the Big Nasty” among the working world’s conflicts (and, probably, organizational conflicts in general) usually comes down to the conflict between high-variance people of strong moral character and low-variance people of bad moral character. (Lawful good and chaotic evil are organizationally inert.) In the first set, you have the “chaotic good” archetype; in the movies, you always root for these people to win because, in real life, they almost never do. They’re usually not diagnosably bipolar, but they’re certainly not well-adjusted either. They’re courageous, moralistic, and intense. In the second set, of low-variance people with bad moral character, you have psychopaths. Psychopaths aren’t even affected by the normal sources of mood variance, like empathy, love, and moral conflict.

Psychopaths are the cancer cells of the human species. They are individually fit, at the expense of the organism. Now, there are plenty who misinterpret such claims and expect them to predict that all psychopaths will be successful, which we know not to be true. In fact, I’d guess that the average psychopath has an unhappy life. They’re not all billionaires, clearly, because there are only a handful of billionaires and there are a lot of psychopaths out there. Analogously, not all cancer cells are successful. Most die. (The “smell of cancer”, infamous to surgeons, is necrosis.) Cancer cells kill each other just as they kill healthy tissue. Cancer doesn’t require that all cancer cells thrive, but only that enough cells can adapt themselves to the organism (or the organism to themselves) that they can enhance their resource consumption, reach, and proliferation– causing disease to the whole. Worse yet, just as cancer can evade and repurpose the body’s immune system toward its own expansion, psychopaths are depressingly effective at using society’s immune systems (ethics, reputation, rules) toward their own ends.

What can the rest of humanity do to prevent the triumph of psychopaths and psychopathy? Honestly, I’m not sure. This is an arms race that has been going on for hundreds of thousands of years, and the other side has been winning for most of that time. Ex Machina comes to mind (and brings negative conclusions): a movie (which I’ll spoil, so skip to the end of this paragraph if you don’t want that) that contends with some of the darker possibilities behind “Strong AI“. The three main characters are Caleb, a “straight man” programmer of about 25; Nathan, a tech billionaire with sociopathic tendencies; and an AI who goes “way beyond” the Turing Test and manages to convince Caleb that she has human emotions, even though he knows that she is a robot. She’s just that good of a game player. She even manages to outplay Nathan, the carbon-based douchebag who, remaining 5 percent normal human, can be exploited. Psychopaths, similarly, have a preternatural social fitness and are merciless at exploiting others’ missteps and weaknesses. How do we fight that? Can we fight it?

I don’t think that we can beat psychopaths in direct combat. Social combat is deeply ingrained in “the human organism”, and whatever causes psychopathy has had hundreds of thousands of years to evolve in that theater. As humans, we rank each other, we degrade our adversaries, and we form an ethical “immune system” of rules and policies and punishments that is almost always repurposed, over time, as an organ for the most unethical. Whatever we’ve been doing for thousands of years hasn’t really worked out that well for our best individuals. I think that our best bet, instead, is to make ourselves aware of what’s really going on. We can succeed if we live in truth, but this requires recognizing the lie. Is it possible to detect and defeat an individual psychopath? Sometimes, yes; sometimes, no. Beating all of them is impossible. If we understand psychopathy in terms of how it works and tends to play out, this might give us more of an ability to defend ourselves and the organizations that we create. We’ll never be able to detect every individual liar, but we can learn how to spot and discard, from our knowledge base, the lies themselves.

This takes us back to the variance spectrum. Every organization needs to rank people, and the way most organizations do it is what I call “the default pattern”: throw meaningless adversity and challenges at people, and see who fails out last. Organizational dysfunction can set in rapidly, and even small groups of people can become “political” (that is, trust-sparse and corrupt) in a matter of minutes. It doesn’t take long before the dysfunction and stupidity and needless complexity and recurring commitments exceed what some people can handle. There is, of course, a lot of luck that dictates who gets hit hardest by specific dysfunctions. On the whole, though, human organizations permit their own dysfunction because it allows them to use a blunt (and probably inaccurate) but highly decisive mechanism for ranking people and selecting leaders: whoever falls down last.

Let’s imagine a world where truck drivers make $400,000 per year. With a large number of contenders for those jobs, the entrenched and well-compensated drivers decide (in the interest of protecting their position) that only a certain type of person can drive a truck, so they create a hazing period in which apprentice drivers must tackle 72-hour shifts for the first three years. You’d have a lot of drug abuse and horrible accidents, but you would get a ranking. If you had them driving safe, 8-hour shifts, you might not, because most people can handle that workload. In the scenario above, you filter out “the weak” who are unable to safely drive extreme shifts, but you’re also selecting for the wrong thing. The focus on the decline curve (at the expense of public safety) ignores what should actually matter: can this driver operate safely under normal, sane conditions?

It isn’t intentional, but most organizations reach a state of degradation at which there are measurable performance differences simply because the dysfunction affects people to differing degrees. In an idyllic “Eden” state, performance could theoretically be measured (and leaders selected) based on meritocratic criteria like creative output and ethical reliability (which, unlike the superficial reliability that is measured by subjecting people to artificial stress, scarcity, and dysfunction, actually matters). However, none of those traits show themselves as visibly or as quickly as the differences between human decline curves at the extremes. This raises the question: should we measure people based on their decline curves? For the vast majority of jobs, I’d say “no”. The military has a need for ultra-low-variance people and has spent decades learning how to test for that safely (and, something the corporate world hasn’t managed, how to keep a good number of the psychopaths out). But you don’t need an Army Ranger or a Navy SEAL to run a company. It probably won’t hurt, but most companies can be run by high-variance people and will do just fine, just as most creative fields can be practiced by low-variance people.

The advantage of psychopaths isn’t just that they tend to be low-variance individuals. If that were all of it, then organizations could fill their ranks with low-variance non-psychopaths and we’d be fine. It’d be annoying to be a high-variance person (and know that one would probably never be the CEO) but it wouldn’t make our most powerful organizations into the ethical clusterfucks that virtually all of them are. The psychopaths’ greater advantage is that they aren’t affected by dysfunction at all. When an organization fails and unethical behavior becomes the norm, the high-variance people tend to fall off completely, while the decent low-variance people decline to a lesser degree– they’re disgusted as well; it just has less of an effect on their performance– but the psychopaths don’t drop at all. In fact, they’re energized, now that the world has come into their natural environment. If they’re smart enough to know how to do it (and most psychopaths aren’t, but those who are will dominate the corporate world) they’ll go a step further and drive the environment toward dysfunction (without being detected) so that they have an arena in which they naturally win. Social climbing and backstabbing and corporate corruption deplete most people, but those things energize the psychopath.

We come around, from this, to concrete manifestations of damaged environments. In particular, and a point of painful experience for software programmers, we have the violent transparency of the metrics-obsessed, “Agile Scrotum“, open-plan environment. This is the environment that requires programmers (especially the high-variance programmers who were attracted to the field in search of a creative outlet) to break the rules of Monopoly and financial reporting, and hide “money” under the board. Agile Scrotum and the mandates that come out of it (“don’t work on it if it’s not in the backlog; if you must do it, create a ticket and put it in the icebox”) demand that people allow visibility into their day-to-day fluctuations to a degree that is unnecessary, counter-productive, and downright discriminatory.

Agile Scrotum also hurts the company. It makes creative output (which rarely respects the boundaries of “sprints” or “iterations” or “bukkakes” or whatever they’re calling them these days) impossible. I’ve written about the deleterious effects of this nonsense on organizations, but now I’m going to talk about its effects on people. When the new boss comes in and declares that all the workers must whip out their Agile Scrotums, for all the world to see and pluck at, the best strategy for a high-variance person (whose day-to-day fluctuations may be alarming to a manager, but whose average performance is often several multiples of what is minimally acceptable) is, in my mind, to hide volatility and put as much “money” under the board as one can. Achieve something in July, when your mood and external conditions are favorable, and submit it in August, when you hit a rough patch and need some tangible work to justify your time. Yes, it’s deceptive and will hinder you from having a healthy relationship with your boss, but if your boss is shoving his Agile Scrotum down your throat (wash out the taste with a nice helping of user stories and planning poker), you probably didn’t have a good relationship with him in the first place, so your best bet is to keep the job for as long as you can while you look for a Scrum-free job elsewhere.

I promised, in the title, a “fundamental truth” that would be useful to the “neurotypical” people with no trace of mood disorder, and to the low- and high-variance people alike, and now I’ve arrived at it. This steps aside from the transient issues of open-plan offices and the low status of engineers that they signify. It’s independent of the current state of affairs and the myriad dysfunctions of “Agile” management. It’s probably useful for everyone, and it’s this: never let people know how hard you work, and especially don’t let them know how you invest your time.

People tend to fall into professional adversity, as I’ve observed, not because their performance is low or high, but because of a sudden change in performance level. Oddly enough, going either way can be harmful. Sudden improvements in performance suggest ulterior motives, transfer risk, or a negative attitude that was held until recently, in the same way that a sudden improvement in health might upset a jealous partner. If you “ramp it up”, you’re likely to expose that you were underperforming in the past. Likewise, the people most likely to get fired are not long-term low performers, because organizations are remarkably effective at adapting to those, but high performers who drop to average or just-not-as-high performance. Most managers can’t tell who their high and low performers actually are, because their people are all working on different projects, but they can detect changes, especially in attitude and confidence, so you’re in a lot more danger as an 8 who drops to a 6 than as a 3 who’s always been a 3. This is, of course, one of the reasons why it’s so awful to be a high-variance person in a micromanagement-ridden field like software. As a strategic note, however, I think that it’s valuable for low-variance people to understand this, too. You don’t want to be seen as a slacker, but you don’t want people to see you as “a reliable hard worker” either. People with the “hard worker” reputation often get grunt work dropped on them, and can’t advance. What you really want, if you’re gunning for promotions, is for people to look at you and see what they value, which will not always be what you value. Some people value sacrifice, and you want them to see you as dedicated and diligent (and so hard-working and busy that you can’t be bothered to take on the kinds of sacrificial duties that they might otherwise want to foist upon you, in order to even out the pain load). Other people (and I put myself in this category) value efficiency, and you want them to see you as smart, creative, and reliable on account of sustainable practices and personal balance.

Achieving the desired image, in which people see their own values reflected in you, isn’t an easy task. I could write books on that topic, and I might not even solve anything, because there’s a ton of stuff about it that I don’t know. (If I did, I might be richer and more successful and not writing this post.) I do know that control of one’s narrative is an important professional skill. How hard one actually works, and what one’s real priorities are, is information that one needs to keep close, whether one’s working 2 hours per day or 19 hours per day. (In my life, I’ve had many spells of both.) People who can control their own narratives and how they are perceived generally win, and people who can’t control their stories will generally have them defined by others– and that’s never good. One might sacrifice a bit of reputation in order to protect this sort of personal information, and I can already feel a bristling of that middle-class impulse to desire “a good reputation” (not that it stops me from writing). Here’s what I think, personally, on that. “Good” and “bad” reputations are transient and the distinction is sometimes not that meaningful, because what seems to matter in the long run (for happiness and confidence, if not always agreeability and of-the-moment mood) is control of one’s reputation. Even personally, that’s a big part of why I write. I’d rather have a “bad” reputation that I legitimately earned, because I wrote something controversial, than a “good” reputation that was defined for me by others.

What irks me about Silicon Valley’s culture and its emphasis on micromanagement is not only the meaningless (meaningless because what is made transparent has nothing to do with questions of who is actually doing the job well) violent transparency of open-plan offices and Agile Scrotums. That stuff sucks, but it bothers me a lot more when people seem not to mind it. It’s like being in a strategy game where the lousy players add so much noise that there’s no purpose in playing– but not being allowed to leave. For example, I’ve always argued that when a manager asks for an estimate on how long something should take, one should ask why the estimate is being requested. It’s not about a desire to put off planning or to hide bad work habits. It’s about parity. It’s about representing yourself as a social equal who ought to be respected. Managers may not understand Haskell or support vector machines, but they know how to work people, and if you give them that information for free– that is, you furnish the estimate without getting information about why the estimate is important, how it will be used, what is going on in the company, and how performance is actually evaluated– then they’re more likely to see you as a loser. It’s just how people work.

Likewise, if someone has day-by-day visibility into what you are working on, and if that person knows on a frequent basis how hard you are working (or even thinks that he knows), then you are being defeated by that person, even if your work ethic is admirable. Being visible from behind, literally and metaphorically, while working signals low status, both for the work and for the person doing it. All of this is not to say that you shouldn’t sometimes share, on your terms, that you worked a 90-hour week to get something done. On the contrary, proof of investment is far more powerful than talk alone. At the same time, it should be you who decides how your narrative is presented, and not others. The passive transparency of an Agile shop– the willingness of these programmers to have their status lowered by a process in which they give far too much up and get nothing in return– makes that impossible. When you buy into Agile Scrotum fully, you’re implicitly agreeing that contributions not formally recognized as tickets don’t matter, and you’re allowing your work output to be negatively misrepresented, possibly without your knowledge, by anyone with access to the tracking software. Isn’t that just putting a “Kick Me” sign on your own back?

I am, for what it’s worth, well-traveled. I’ve seen a lot of human psychology, and I’ve learned what I would consider a sizable amount. One recurring theme is that humans expect a certain logical monotonicity that doesn’t hold in the real world. Logically, if A implies B and (A and C) is true, then B is true. In other words, having more information (C) doesn’t invalidate concluded truths. In the human world of beliefs and near-truths and imperfect information, it’s not that way at all. Of course, there are valid mathematical reasons for this. For example, it could be that B has a 99.99999% chance of being true when A is true, so that A “effectively implies” B, while C has an even stronger negative impact and implies not-B. Then A almost implies B, but (A and C) definitely implies not-B. More commonly, there are computational reasons for non-monotonicity. “Flooding” a logical system with irrelevant facts can prevent valid inferences from ever being made, because computation time is wasted on fruitless branches, and flooding an imperfect logical system (like a human knowledge base) can even populate it with non-truths (in technical terms, this is called “full of shit”). The violent-transparency culture of open-plan offices and Agile is based on phony monotonicity. First, it assumes that more information about workers is always desirable. In human decision-making, more information isn’t always good. Shitty, meaningless information that creates biases will lead to bad decisions, and get the wrong people promoted, rewarded, punished, and fired. Second, it ties into a monotonicity that specifically afflicts high-variance people (and advantages psychopaths), which is the perception that small offenses against social rules betray large deficiencies. That’s also not true. There’s really no connection between the meaningless unreliability of an office worker who, after a long night, shows up at 9:05 the next day, and the toxic ethical unreliability that we actually need to avoid.
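As a worked example of that probabilistic reversal, here’s the textbook birds-and-penguins case, with hypothetical counts chosen purely for illustration: A = “is a bird”, B = “flies”, C = “is a penguin”.

```python
# Non-monotonic inference in miniature: a new fact (C) flips a conclusion
# that was almost certain given A alone. The counts are hypothetical.

birds = 1_000_000
penguins = 100                      # penguins are birds, and none of them fly
flying_birds = birds - penguins     # assume every non-penguin bird flies

p_b_given_a = flying_birds / birds  # P(B | A): A "effectively implies" B
p_b_given_ac = 0 / penguins         # P(B | A and C): C reverses the verdict

print(f"P(B | A)    = {p_b_given_a:.6f}")   # 0.999900
print(f"P(B | A, C) = {p_b_given_ac:.6f}")  # 0.000000
```

Classical logic, being monotonic, can’t express this: once A proves B, no additional premise can retract B. Human knowledge bases work on the probabilistic model, which is why flooding them with noise does real damage.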

I have, between the minor unpleasantness of a mood disorder and the major unpleasantness of the software industry, seen a lot of crap. I use the word crap specifically for its connotation of low quality, because let’s be honest about the problem of specifically low-quality information and what it does to humans in large amounts. Agile Scrotum generates a lot of information about who’s working on what and at what speed, and that information will reshuffle the team’s pecking order; and, guess what, it’s all crap. It takes the neurological variance that has to accompany creative high performance, because Nature or God couldn’t figure out any other way, and turns it into meaningless signals. The same holds for open-plan offices, which bombard engineers and management both with meaningless information (unless the detail of when each person goes to the bathroom is somehow important) that is then held to reflect on individual performance and value within the group. Again, it fills the office with piles of low-quality information and, soon enough, the only thing that anyone can smell is crap. This is one thing that mood disorders make a person better at than most people: detecting crap. When a malfunction of my own wetware tells me that everything I’m doing is pointless and that I should just crawl in a hole and die, I can listen to it, or I can say, “That’s crap” and move on. When an adverse situation throws me for a loop and I get anxiety, I can recognize it for what it is and get past it. (I may have a panic attack, but I don’t get the emotional drama that seems to afflict the neurotypical, because I recognize crap that comes from my own mind. If someone cuts me off in traffic and I still have anxiety or anger five minutes later, that’s on me.) I’ve survived by recognizing flaws in my own cognition with a precision that 95 percent of people never have to develop. This also makes me preternaturally keen at spotting (less intense, but longer-lived) flawed cognition in others. In other words, it makes me great at cutting through crap.

So now I’m going to speak of (and possibly to) the software industry and try to cut through a lot of crap. People learn to program (or to write, or to paint) in order to implement ideas. Sometimes we want to implement our own ideas, and sometimes we want to implement good ideas regardless of whose they are. I think that it’s useful to have a healthy balance between the two: exploring your own ideas, and spending time with others’ ideas that (one hopes) have a higher-than-baseline chance of actually working. Many of us (myself included) were drawn into the craft for the creative potential. So, when I see a shop using Agile Scrum, the only question I can ask is: what the fuck happened? This macho-subordinate ceremony isn’t going to make me better at my job. It’s not going to teach me new things about programming and computation in the way that Haskell did, it’s not going to improve my architectural or aesthetic sense, and a bunch of psych-101 bullshit designed to trick me into liking being a subordinate is certainly not going to make me a better leader. None of it has any value to me, or to anyone, because it’s all crap.

People with my condition live 10 to 15 years less than the average person. That’s statistical, and, as a non-drinker and non-drug-user who hits the gym every morning and hasn’t had a major problem with the condition in years, I’m pretty sure that I’ll beat the odds. I’m treated, and as “sane” as anyone else, but I also have the (extremely low probability, high impact) sword of a recurrence of 200x Crazy hanging over my head. I’m aware of my mortality. Even a single minute spent learning how to write fucking user stories is a minute not spent learning something that actually matters. Or relaxing and working on my health. Or at the gym. Or sleeping. Or writing. I factually don’t have the time to learn and practice that garbage but, given that I’m not really any more or less mortal than anyone else, I can’t really justify the idea that anyone else should have to do it, either. If they’re really such poor programmers that the humiliation of Agile Scrotum “makes sense” for them, then why not convince them to work on the skills that they lack, instead? It’s crap and, rather than holding on to it, we should throw it away. Our industry is already too cluttered. Even without the clutter, there would be more things genuinely worth working on than there is time and talent to do them. With the clutter, it’s hard enough to get started on just one. We need to get serious about ourselves, our relationship to the world, and computer science and technology themselves. This industry is full of people waxing philosophical about “the Singularity” and biological immortality, but are we really ready to talk about immortality if some of us think that it’s okay to spend time in “backlog grooming” meetings?

That, like much that I have written, took balls to finish. By this, I mean metaphorical “balls” because not only do male sex organs fail to be a prerequisite for courage, but they’re not even a correlate of it, in my experience. Anyway… wanna know what it didn’t require? It didn’t take an Agile Scrotum.
