I started noticing something odd in my Google Search Console statistics for this site about two weeks ago:

The prior drop, in mid March, is most likely due to the onset of the COVID-19 pandemic.

Around May 4th of this year, my results started tanking, anywhere from 600 fewer clicks to a massive 1,400 fewer clicks per day! That's a lot of traffic that was suddenly missing, traffic that I didn't know how to get back.

This was, obviously, worrying. I'd had drops in site traffic before, but none were this pronounced or this sudden. I decided to take a closer look at my stats from Analytics and Search Console, and I did not like what I found.

Every single post on Exception Not Found, from the earliest one to the most recently published series, was now getting less search traffic. Markedly less. And if every single page was showing less organic search traffic, then there must be something site-wide going on.

I started searching for answers. Many articles I read pointed out that I should check for any "manual actions" or "security issues" on my site and resolve them. I found none. They suggested that I look at my keywords to see if they had changed, check the design to see if it was making the search crawler's job harder, see if inbound links to my site had disappeared, and even consider whether the content itself was actually as useful as I thought it was. I found no outstanding issues, nothing that could explain the large drop in organic search traffic.

And then I stumbled on this announcement:

Later today, we are releasing a broad core algorithm update, as we do several times per year. It is called the May 2020 Core Update. Our guidance about such updates remains as we’ve covered before. Please see this blog post for more about that: https://t.co/e5ZQUAlt0G

There’s nothing wrong with pages that may perform less well in a core update. They haven’t violated our webmaster guidelines nor been subjected to a manual or algorithmic action, as can happen to pages that do violate those guidelines. In fact, there’s nothing in a core update that targets specific pages or sites. Instead, the changes are about improving how our systems assess content overall. These changes may cause some pages that were previously under-rewarded to do better.

This is supposed to make me, a small site owner heavily dependent on Google search results for traffic, feel better. But that's hard to do when 20%-30% of my traffic is gone almost overnight, and it was most likely because of Google's own changes. If I sound bitter, well, frankly I am; I worked hard on those posts and they were useful to someone, and now because of an algorithm change they're not as useful anymore?

From 5015 pageviews on Monday May 4th to 3850 pageviews on Monday May 11.

There's no two ways about it: Google wields a LOT of power when it comes to the results of their search algorithm. I'd argue they wield more power than any other single company on the Internet, merely by having the best search engine available (and make no mistake, it is the best).

It does seem, if not quite unfair, at least a little unbalanced that Google wields so much influence over who sees what on the internet. If you run a website, no matter what it is, you are at the mercy of your Google overlords. Google is the 2000-pound gorilla, the target against which all metrics are compared, the source and arbiter of eyes on most websites.

They hold the power, and it cuts both ways; they can both raise your profile to unheard-of heights and sink you so low that no one will ever find you, namely, on the second page of results. That's a lot of power for a single organization, not to mention a business that ultimately wants to make money, to wield.

I am not sure how I will recover from this drop yet. Or, frankly, if I need to worry about it at all. I'm clearly not gonna stop blogging anytime soon. But Google has not stopped being, as Jeff Atwood put it 11 years ago, the elephant in the room. If anything, the elephant has only gotten larger, and hungrier. Here's hoping that, should we need to, we can get out of its way in time.

Origins

I have a new project I am working on for my company. This project is nothing terribly complicated, but part of our process is to build a CI/CD system for all of our projects using Azure DevOps. It's part of my job description to handle such things.

Lucky for me, an extraordinarily similar project already had a build and release definition in our CI/CD system, so my job was going to be super easy. If you've read my blog before, you know that merely thinking this is a bad idea.

The structure of this project was pretty straightforward. We had the following projects:

This looked, to me, like a perfectly valid MSBuild parameter set. Some of the parameters I recognized (MvcBuildViews), some had a meaning easily understood from their name (DeleteExistingFiles), and some were just there and I had to hope they worked as I thought they did.

All of that seemed fine and dandy, except when I actually ran the build in DevOps I kept getting the following errors:

Our company-specific items have been blacked out.

The errors read, "Error CS0006: Metadata file 'path/to/file' could not be found". Which is not an error I'd ever seen before.

See, the UnitTests project relies upon the Enums and Impl projects to build, and it seemed as though when the UnitTests project was building, it could not find its dependencies. Which was especially strange because it worked just fine on my local machine.

I spent two days googling this error, trying tiny fixes in my solution, messing with the DevOps build pipeline. No matter what I tried, no matter what configurations I put into the project, nothing seemed to work.

Desperation being the mother of inspiration, at a certain point I just started removing the MSBuild parameters, one by one, each time trying the build again. An hour later, I ended up with a configuration that works! See if you can spot the difference:

Yup. All I did was remove the OutDir argument, and magically the build started working!

Turns out the OutDir parameter is, well, old. And I am not the only one to run into issues with OutDir; Peter Seale noticed one seven years ago. As far as I can tell, MSBuild has changed enough in the past seven years that the OutDir parameter is no longer required, and is indeed a hindrance. Or, possibly, something else changed; I don't have enough background knowledge to know for certain.
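To make the fix concrete, here's a hedged sketch of the kind of invocation change involved. The solution name and most of the arguments are hypothetical, since our actual build definition isn't reproduced here; only MvcBuildViews, DeleteExistingFiles, and OutDir are the parameters named above, and $(Build.BinariesDirectory) is a standard Azure DevOps predefined variable.

```shell
# Before (copied definition): OutDir redirects every project's output
# into one shared folder, which can break inter-project references in
# the CI build and surface as "error CS0006: Metadata file not found".
msbuild OurSolution.sln /p:Configuration=Release /p:MvcBuildViews=true /p:DeleteExistingFiles=true /p:OutDir="$(Build.BinariesDirectory)"

# After (working definition): with OutDir removed, each project builds
# into its own bin folder, so UnitTests can locate Enums and Impl.
msbuild OurSolution.sln /p:Configuration=Release /p:MvcBuildViews=true /p:DeleteExistingFiles=true
```

The same change applies whether the arguments live in a classic build definition's "MSBuild Arguments" field or in a YAML pipeline's MSBuild task.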

But, really, the OutDir parameter isn't the true problem, is it?

Return of the Cargo Cult

You might have missed this tidbit from earlier: in order to make this build definition, I copied an existing definition. More specifically, I copied an existing definition without fully understanding what it did, why it did it, or really anything about it. I assumed it would just work, and this assumption ended up kicking my butt.

Yep, you guessed it: in that moment, I fell victim to that old standby of software development anti-patterns: cargo cult programming. I saw the man directing the airplanes with sticks, and figured the sticks themselves must be summoning the airplanes (the sticks, in this case, being MSBuild parameters).

Obviously, cargo cult programming is bad. Obviously, it leads to problems down the road, in the form of "we don't know why this works but we can't remove it because then it stops working". Obviously, I should have avoided cargo cult thinking at all costs.

Shouldn't I have?

The problem is that the real world is more complex than any pithily-named "anti-pattern". We developers are under deadlines, obligations, commitments, and various other requirements that sometimes force us to cut corners. We have a directive to deliver working software, not perfect software. So of course I was going to take a perfectly-good build definition from another project and reuse it; what else could I reasonably do?

This particular time, it bit me. I lost two days because of this decision. Most of the time, though, cannibalizing other working projects does the job just fine.

All Things are Modifiable

The problem with listening to dogmatic ideas ("cargo cult programming is bad and you should feel bad for doing it") is that it breeds fanatics, people who slavishly adhere to ideals that might not even be achievable in the first place. These people are the anti-pragmatic programmers, the ones who favor architecture over usability, cleverness over readability, and perfection over shipping a working product.

I wrote an entire damn series on anti-patterns, and you'd be wise to take my words with a grain of salt here. But the point I'm making is this: in software development, there are rarely rules that cannot be broken. Everything is negotiable, everything can be modified, and the only constant is that nothing is. Nothing is ever unchangeable. In this case, I broke the rule of "cargo cult programming should be avoided" and it turned out terribly for me.

I'd do it again, though. Sometimes we as developers have to make concessions to the real world. Concessions like copying a working build definition from another project without fully understanding what it did because it means I can get a CI/CD pipeline in place more quickly. It bit me, I lost two days, and all around it was a very frustrating experience. But I would absolutely do it again, in these circumstances, because most of the time it works just fine.

Don't be afraid of the things people say are "wrong". Don't be afraid of "rules" because they are common knowledge in our field. Do your own research, reach your own conclusions, solve your own problems. In our world, nothing is wrong or right; rather, everything is just a tool. The trick is to know when to use it.

But you might still want to be afraid of OutDir.

Happy Coding!


After a great many years of resisting, it has finally happened. I have joined the dark side.

I find your lack of faith disturbing...

I had heard for years (mostly from my teammates) that dark mode helped reduce "eye strain" and that sounded pretty good to me. But partially due to stubbornness and partially to laziness I refused to budge, refused to try out this cool new thing that all my teammates were raving about. That lasted until earlier this year.

For four months now, I've done everything in dark mode if it is available. Visual Studio, Chrome and Firefox, various web sites, my email client, the Ghost editor pages; if it has a dark or night mode, I'm turning that puppy on and relishing the relief it gives my poor eyes. I'm looking forward to the new Dark Mode in iOS 13; my phone is the other device I'm staring at most often and I'm sure Dark Mode will help reduce my headaches even further.

Oh, did I not mention that? I didn't switch because it was the cool trendy thing to do; no, I switched because I was getting headaches so frequently I thought I must be bursting out of my skull. At one point I would develop a splitting headache almost daily, whenever I had been staring at my screen for most of the day. I tried several things (both external and internal, ergonomic and medicinal) to get my headaches under control, and while many of those helped, one thing seemed to do the most good: switching all of my applications to dark mode.

I can literally feel the difference. Reading dark text on a bright background now feels like burning my retinas out. Sites which don't offer a dark mode are sites that I leave as quickly as possible. It shouldn't make this much difference, but it does, and I'm surprised at how much I can feel it.

In short, dark mode on my devices has helped me get my headaches under control for several months now. I'm a total believer in how dark mode helps reduce eye strain, as it certainly has for me. I'm living proof that dark mode provides tangible benefits. Problem is: tangible benefits for whom?

What I found surprising while doing some Googling about this topic is that there's no scientific consensus that dark mode is better for your eyes. Optics is a finicky science, and what's good for one person isn't necessarily good for another. The only real consensus I could find is that it is contrast, not color, that makes the biggest difference in readability and strain. Black-on-white and white-on-black provide very high contrast, hence why they are the default.

I am no longer surprised as to why my teammates were raving about dark mode years ago. It works for me, and it's definitely helped ease my headaches. I appear to be one of the people for whom dark mode is a benefit, and I am grateful for it. But as with any new fad, it isn't some universal panacea for all people who suffer from eye strain, as much as I (and the companies issuing their own dark modes) want you to believe it is.

But, and hear me out, it is worth trying. I didn't because I figured my headaches had some other root cause, like my posture, and so I missed out on years of less pain and distraction. Don't be like me. Try out dark mode, and if it works for you, keep using it. If it doesn't, go right back to light mode. There's no downside.

I am eagerly awaiting the time when "dark mode vs light mode" becomes the next silly flame war topic, right after "tabs vs spaces", "command line vs GUI", and my perennial favorite, "real programmers use X".

(And as soon as I can, I'm going to have to figure out a dark mode for this blog, one that doesn't completely undo the design.)

I am quite possibly the laziest programmer you know. At least, I think so; I can't be bothered to check.

"Oh honey, you're so lazy!" "Why thank you dear. I love you too."

Several years ago, at a different job, my boss walked by my office and asked me "why don't I hear you typing?" At the time, fresh from college and not wanting to rock the boat, I immediately started typing out some C# code I'd been thinking about. This lasted only as long as it took for him to leave, whereupon I stopped banging out these BS "solutions" and started thinking and Googling and generally determining what way was best to solve this problem, rather than blindly charging ahead.

I couldn't help but think, "what's wrong with lazy?" Now, several years and a better job later, I am convinced that laziness is one of the principal defining characteristics of a good programmer.

Lazy Is Good

Lazy programmers are good programmers. They don't do more work than absolutely necessary.

Lazy programmers will not deal with problems more than once. Consequently, they often think through potential issues thoroughly so that they won't have to solve them a second time. That quick, off-the-cuff solution that they can bang out in an hour? Lazy programmers will realize that it will come back around to give them more work in the future, and they are not gonna stand by and let more work happen to them. Anything which causes them work is a problem that needs to be dealt with, correctly, the first time it arises.

Lazy programmers abhor redundancy. They go out of their way to reduce the amount of code they write, because they know that the more code that exists, the more likely it is to cause them future work. In fact, in their opinion the best code is no code at all, because it cannot break.

It... is... PERFECT!!

Lazy programmers explain their decisions. They write thorough comments, expand on existing documentation, and generally try to ensure that they are being clear, because God help them they are going to explain this one time and one time ONLY!

Lazy programmers automate everything that can possibly be automated, even the coffee pot. If there's work to be done, and it needs to be done regularly, it can and will be automated.

See how I automated my lunch with this one simple trick!

Lazy programmers teach people more junior than them, partially so those junior people can do the work instead. They spend time and energy now to show what the proper way to do things is, so that the juniors can do it that way or better and stop bothering the lazy programmers.

Lazy programmers expect to be replaced. They know that obsolescence is coming, and they accept this fate because, hey, that means less work for them. They even help it along; they write the tools that will make their own job obsolete. Lazy programmers know that if you can't be replaced, you can't be promoted.

In short, lazy programmers want to do as little work as possible right now. But they also want to do as little future work as possible, and so they ensure that the work they do is as efficient, concise, easy to comprehend, and replaceable as possible.

Is it literally on fire? No? It can wait. [1]

What's wrong with that?

Be lazy. It's OK. Don't do the work more than once. Write the docs so you don't have to write them again. Explain yourself clearly and concisely the first time. Teach people more junior than you so they can do the work. Think about things so you don't have to think about them again. Laziness has saved me more hours of fixing stupid things than I can possibly imagine. It takes practice, but everyone, yes EVERYONE, can do it and get good at it.

After all, if you can be lazy and still do a better job, why wouldn't you?

In the early going this seemed to be quite the nasty breakup, with HackerNoon apparently asking for a license to all of its articles and Medium retorting that they weren't allowed to do so. It appears, according to later tweets, that the less savory aspects of this breakup are now "resolved" and both parties are at an understanding.

But as I was watching this from a distance (having no affiliation with HackerNoon and only a few articles on Medium), I couldn't help but feel very glad that I own my own content.

I cannot stress enough how important it is to me that I own my own domain, my own site, and my own articles. Every article on this site is something I've written. All of Exception Not Found is mine; my words, my thoughts, my projects, my code. This site is me, just on the Internet. I feel so strongly about this that, IMO, every developer should have a blog, and their own domain to host it on.

Mwahahahahahaha!!!

After four years of blogging, I truly believe that every developer should have a blog. It doesn't even need to be a "good" blog, it just needs to exist. It is your footprint on the web, a combination resume and portfolio that you can use to assist your colleagues and impress your (potential and actual) bosses. It is you, but everywhere.

Just as important, though, is that you need to write posts, publish them, and have your readers review them. Everything you know is something someone else doesn't. Publish your thoughts, tips, experiences, opinions, anything that makes you you. Who cares if they suck? They'll get better the more you do it! The mere act of writing causes us developers to examine what we're saying and how we're saying it. This, in turn, helps us become better communicators, and modern software development is all about being able to communicate effectively.

It is critical, though, that you own your words. Have your own domain, and publish everything there. Own what you write, and when your community points out helpful tips or places where you got something wrong, own that too. Just yesterday I had commenters point out issues with some examples in my FizzBuzz test post and we, together, arrived at a satisfactory conclusion. If they hadn't said anything, I'd just continue to look like an idiot.

Scott Hanselman said years ago that "Your Blog is the Engine of Community" and he's exactly right. But it needs to be your blog, not a "publication" hosted on some massive aggregator site and lost in the swarm of nameless, faceless, featureless posts. It needs to be your words, on your site, with your style. That's how people find you.

Own your words. They are you, when you aren't there. Own them, publish them, refine them, show them off. Write what you know, and be proud! Because one day, your words will help someone, whether it's to get a new job or solve a difficult problem. That person might even be you.

If any of my readers would like someone to look at their blog and make suggestions, I am happy to do so. I urge all of you to start a blog, pick up one that's been dormant, just get writing somehow. You could even host with DigitalOcean and Ghost like I do. Grab a domain and get writing. Yes, now. Seriously.

Why are you still staring at me? Get blogging! Future you will thank you for it.

I have an adage I like to trot out from time to time: "You cannot solve sociological problems with technological solutions." It's a wonderful, useful phrase, one that causes my team to want to pun by the time I say "solve" because they have heard it so much. Yet it appears that my adage has never been more true than it is now.

There's an ongoing debate about maintaining civility in online communities. StackOverflow has discussed their version of this problem several times in their short history, and now we are starting to see this issue spread to other sites that want to encourage community, such as GitHub. It's always the same problem: people are being rude to one another, and other people are pissed off at the rudeness (whether or not it's directed at them personally).

I've been thinking quite a lot about this problem. It seems, to my outsider's view, that all online communities have this issue, or will eventually. On first glance it's easy to say that certain online communities are not inviting new people, even actively discouraging them from joining. I think this view is shortsighted. The issue is not that people are generally mean or rude. The issue is that the very few people who are mean or rude have their comments exist in perpetuity, and given enough time, those comments come to represent (fairly or not) the community as a whole if nothing is done about them.

Lifecycle of a Community

Communities are a fragile thing. When they are new, they are inviting, willing to take on all peoples and all problems. The people who "get in on the ground floor", as it were, become respected members of their community, whom other people look up to and respect. In the beginning, the community is on equal footing, so everyone shares everything equally. The problems arise later, and they have to do with the very nature of the Internet and people: namely, the Internet "remembers" everything, but people need to forget.

IT'S THE CIIIIIIRCLE OF LIIIIIFE!

People tend to remember bad experiences more readily than good ones. The problem now, in online communities, is that because the nature of the internet is to "remember" everything, nothing ever gets forgotten.

As a community grows and adds new members, so too does that community's knowledge pool. This includes not only what people seek in asking questions, but also tangential ideas, unwritten rules, and other etiquette which is rarely written down but will still be adhered to. Given enough time, the respected members of the community, the "elders" as it were, start to notice when their rules of etiquette aren't being adhered to.

In the best case, these elders issue gentle reminders that these rules need to be followed in order to be a member of the community, but in the worst, the elders lambast, make fun of, ridicule the knowledge-seeker as being someone who "didn't do the research." The people who are targets of these slights tend to come to the conclusion that because this one individual thinks their question is stupid, the entire community must think so as well.

In short, the elders unwittingly become the oppressors. This happens entirely without their consent, and sometimes without their knowledge. But it happens.

And once you have oppressors, you start to have oppressed. The people who were rejected or treated roughly by the community they once sought to join are now stunned, shocked, heartbroken. They merely wished to be a part of the group, and now the group doesn't want them. They might channel this frustration into something useful, like starting their own community, or finding a better way to ask a question. They might simply give up their endeavor altogether. One thing they are guaranteed to do, though, is talk; given enough people, the odds of someone talking and venting their anger rapidly approach 1.

The Internet is, in theory, egalitarian. It gives everyone an equal platform for their voice, their opinions. But the Internet remembers. As the shunned people become more numerous, they start to voice their opinions, their issues with the communities that shunned them. They write diatribes about how their once-longed-for community drove them away, and other people start to take notice. They gain allies in the perceived fight to come, the fight for the soul of these communities that started as welcoming but evolved to forbidding.

Those who still believe in the mission of said communities also take up arms. The extreme fringes of this group begin painting the shunned as people who didn't do enough research, or should have been in the group from the beginning if they really wanted the knowledge they supposedly sought. Of course they didn't get a good answer, they didn't ask the right question! They paint their perceived enemies as oppressors themselves, bad actors wishing to steal the soul of their beloved community by polluting it with poorly-researched questions and demands on the members' time.

We started with communities, and ended up with armies. And then we get a war, a war in which no winner is possible and everyone ends up pissed off. A war with no good outcome.

We will fight in the forums and in the tweets! We shall never surrender!

Be Nice!

It seems, from the outside, that this kind of war is inevitable, a tragic and unavoidable path down which all online communities will eventually travel. This is because avoiding the war requires both sides to assume good faith in the other. (To be fair, comparing this kind of virtual mud-slinging with actual war is a bit exaggerated, but what is the Internet if not collective exaggeration?) The members of the besieged community must assume in good faith that the askers have done some research and at least attempted to answer their own question; and the learners must take it on faith that someone will deign to help them. It seems to me that good faith is in short supply.

The solution to this conundrum is known, and not technological: civility. Patience with our fellow man, understanding for where s/he is coming from, kindness when dealing with them. We were all knowledge seekers once, and the good ones among us still are. Civility, however, requires time and effort. I will argue that people have a limited supply of these things, and it can easily become a fool's errand to use them on someone who is no more than a series of words on a screen. It becomes easy to disregard the importance of civility when one's time or effort runs low.

But we must keep on. We, the "elders", who are the de facto gatekeepers for our profession, who have the power to decide who feels welcomed in programming and who doesn't, must be civil, and to all parties, real or not. That is our responsibility as leaders.

Civility online, despite our haphazard first impression, isn't dead; it's just more noticeable when it isn't there. But the lack of civility is a sociological problem, and it requires a sociological solution. At the risk of being cliché, said solution is simple: be nice to people! Yes, even them! It is on both sides, the elders and the knowledge-seekers, to remember that we are all in this together.

It was almost imperceptible. A slight shift in her gaze, a miniscule shake of her head, a flash behind her eyes. She knew, and then I understood. I had done it again.

I didn't mean for this to keep happening. Really! Interruptions like these just slipped out, before I had a chance to rein them in, to hold them back. I might have read an interesting article, saw a cool TV clip, listened to an engaging podcast. I learned things, and enjoyed it. What was the problem in sharing my knowledge? They'd want to know too, after all. Doesn't everyone?

I said it, and as I did she gave me that look, the one I'd seen so many times before. The look that says "come on, not this again." The mingled disappointment and resignation and half-buried hope were clear as day, just for a split second.

She rallied admirably; she kept the conversation going with barely a hiccup. Our friends that had come over for dinner didn't even notice. You'd have to know my wife like I do to be able to realize that something had happened. But I do know her, so I did realize. For being weightless, thoughts sure do hit you hard.

I was that guy. I was the guy that told you all about the things you thought that were wrong, the things that you believed that were incorrect. I was the guy that spouted opinions as facts, that proclaimed my beliefs and understandings as the true way of the world. The ones who try to make you believe you've been in the wrong this whole time. I did none of this intentionally, often not even consciously, but it happened all the same.

I was the "well, actually..." guy. And it took me thirty years to realize it.

I don't think of myself as an opinionated person. I doubt any of us do. That doesn't mean I don't have opinions, just that I don't think of myself as a person who needs to shout those opinions from the rafters. I like what I like, and so can you, provided you admit you're incorrect.

I am also a programmer. In my mind, this contributes significantly to my "well, actually" syndrome. Computers are inherently stupid machines. At heart, they understand two possibilities: on, and off. Everything ever produced by computers has been made by manipulating groups and collections of these states to compose something far more interesting. Consequently, programmers tend to think of problems in binary: either it works or it doesn't. Either it's right, or it's not. It's strikingly simple, even elegant, in the way that life isn't.

I want to know, and I can't stand not knowing. I want to understand, to learn, to grow, so I can put it all together. I want to be seen as smart. I want to comprehend my world as fully as can be imagined. I want people to know that I know things, that I am intelligent, that I am worth knowing. Doesn't everyone?

And in her look, I saw the problem. Was I just trying to help, or was I trying to be smarter than them? Three times in four minutes, she told me later, I had corrected someone who was talking. Was I really trying to be helpful? Or had I mistaken my insecurity, my need to be seen as smart, for magnanimity?

I looked. I saw. And I understood.

That was a year ago. Now, in the present day, I'm making efforts to not be that "well, actually" guy. Efforts like trying to not interrupt people, picking my verbal battles, or just saying "cool, man" and letting it go. Efforts that remind me that even though I know things, that doesn't mean everyone else needs to. Efforts which I think, I hope, I pray will start paying off soon.

In the meantime, I keep going. Life doesn't stop because a realization thirty years in the making knocked you on your ass. The difference is, now I occasionally remind myself that well, actually, I might have been in the wrong this whole time.

So now, it's finally time to discuss the biggest mistake I made during this project. This mistake ground the project to a halt. It was a mistake that could have easily been avoided had I taken the appropriate time to research other options. I'm hoping that, by sharing this mistake, you dear readers will learn from it and be given enough warning to prevent this mistake from taking down your projects.

Swing and a Miss

At the beginning of this massive rewrite, it was my job as lead developer on the project to determine everything that needed to be done. We were given an order saying "take this 18-year-old project and rewrite it using modern technology and methodology." True to that order, I immediately dove into the existing codebase and attempted to determine everything it comprised. There were sections for a library of documents, sections for an employee lookup, sections for links to other tools in our ecosystem. I found all the sections, wrote up an estimate, got it approved by our business unit, and my team was off to the races.

But I missed one. I missed a section, and the panicked decisions that resulted from that miss nearly killed the project.

The section that I missed was not very visible. You could liken it to a land mine: concealed and explosive. It was only accessible to a few people, a few very important people, a few people who were responsible for millions of dollars in revenue to our company. You had to have very specific permissions to even see this part of the tool. I didn't, but I also didn't look in the code for this section. I have no explanation as to why I found all the other sections we needed and not this one; I just missed it.

And yet, even that was not the true mistake.

The Dreaded Rewrite-In-Place

Once I learned that a section was missing, we were already 6 months into this supposed 8-month project. We were then forced to extend the deadline. We didn't extend it nearly enough. So in short order we were running up against the second deadline, the already extended one, and by this time the business had rightly had enough of our crap. At this point, I made the true mistake. I knew the end users and the business wouldn't be pleased with how I'd handled this, and so I chose to do something that has haunted the project ever since, something which to this day we are still dealing with the consequences of.

I directed my team to do a rewrite-in-place.

"What's that?" you ask. A rewrite-in-place is a well-known term I just made up that says we should copy the code from the old application to the rewritten one, fix it as little as possible to work in the new architecture, and then call it good enough.

If your head just exploded from the sheer short-sighted stupidity of this, well, I can't really blame you. Here's a paper towel.

In my defense, given the impending deadlines, it was one of the few options that made sense. We didn't have time to do a true rewrite, because that would require understanding all of the code enough to make it better. The rewrite-in-place removed this requirement, changing what would have needed to be a deep dive into code none of us understood into a shallow copy-and-paste job. In one sense, I took something that would have taken ages and made it very quick to do. But there's always a cost.

"Technical debt (also known as design debt or code debt) is a concept in software development that reflects the implied cost of additional rework caused by choosing an easy solution now instead of using a better approach that would take longer."

I chose to do the quick and easy solution, the rewrite-in-place. In the process, I knowingly chose not to do the correct, more-time-consuming solution (which would have been to understand the missing section's code and rewrite it properly). It is that second decision, intentionally choosing not to do the correct solution, that created the technical debt. Merely choosing a quick-and-easy fix does not incur technical debt if you aren't aware of a better option.

Technical debts are like taxes: you WILL have to pay them, it's only a matter of time. At some point in the near future, my team will have to start paying down the tech debt that my direction to do the rewrite-in-place incurred. They will be writing code to fix this bad decision, my bad decision, for quite a while to come.

Don't R.I.P.!

(Yes, the acronym for rewrite-in-place is very much intentional.)

Rewrites-in-place are NEVER a good option. They are sometimes a viable one, but only as a last resort. If you find yourself in the middle of a rewrite-in-place, stop and ask why. Then go to your boss (or lead, or coworker) and check to make sure that's what you should be doing. Point out to them that this kind of work is guaranteed, GUARANTEED, to incur technical debt. Make them defend their decision! It is through this kind of self-reflection and open communication that good projects survive bad ideas.

Don't do what I did, if you can help it. Don't choose the quick-and-easy route when a better one exists. Avoid the rewrite-in-place, and the technical debt that inevitably comes with it. If you don't, you'll regret it: not now, but soon, and your team will be the one that suffers through the rewrite of the rewrite, as mine will be doing shortly.


My recent post I Am a 9 to 5 Developer (And So Can You!) is getting WAY more traffic than I expected, and I am both humbled and excited that so many people in our profession seem to feel the same way that I do: namely, that you CAN leave work at work and still be a good developer. So, for everyone that read that post and stuck around to read this one, thank you so much for reading my blog!

As the Internet in its best form is prone to doing, sometimes someone will mosey along and bring up an obvious point that never occurred to me. That previous post was submitted to Reddit's /r/webdev community, and despite the common wisdom to not read comments about your own work, I found myself perusing these conversations. The point that stuck with me has to do with the nature of software itself, and why it's never a good idea to spend all your time writing it. Here's the comment:

That last line in particular stuck in my craw. "Ultimately disposable" is EXACTLY what software is, much as we might wish it weren't. There is a terrible truth that many developers haven't quite accepted yet: software is disposable.

Time and the Tracker

A long time ago I wrote a post called The Solo Programmer and the Insidious Promise of the Ivory Tower which told the story of my first job out of college. During that time I was a lonely web developer on a team of non-programmers, and I built an ASP.NET WebForms web site that was used to help plan and estimate effort required for this team's projects. We called it the Tracker. The Tracker is still the longest-living project I have ever worked on and AFAIK it remains in use at that company to this day.

But the Tracker is not the norm. Other than the projects I am currently working on, all of my software projects have either been replaced, made obsolete, or set on fire with a blowtorch (read: made unmaintainable by time and change).

The truth, the ugly truth, is that nothing we ever write will be permanent.

To be sure, the good, quality code that we developers write will feel like it is permanent. It will feel like "I solved this problem with this code and therefore this problem will forever remain solved" until inevitably the next variable is introduced and the whole house of cards comes tumbling down. No solution stands the test of time, for time is constantly introducing new problems. Even the Tracker will eventually succumb to the unending flow of the river of time. Time always wins.

We developers spend a lot of time writing code (duh). Like most people who enjoy their job, we take pride in doing ours correctly, efficiently (also duh). Part of that pride in our profession is ensuring that our code and programs are the best they can be, given all the other constraints (e.g. time, money, availability, etc.). We want our work to be worthwhile. If something is worthwhile, then it should last a while, right? That's how tangible real-world products work; the longer they last in their intended function, the more worthwhile they are.

But they never last forever.

Destined to be Discarded

Despite what we may want, software is like any other tangible good. It is incepted, designed, tested, sold, used, and eventually thrown away. It is disposable, constructed with a limited shelf life whether or not that was done intentionally. It will not last forever, and it is hubris to expect that it should do so.

That's scary! This idea that what we do is impermanent, disposable, is downright terrifying to some people, including me. The thing that I do, that I love to do, that I designed so carefully and tested so thoroughly, it isn't going to be around in 10 or 5 or even 2 years? Why the hell not? Did I do it wrong? Did I not solve the problem?

No, we didn't do it wrong. Yes, we solved that problem. The problem merely changed.

Code doesn't change. Once it is written, it stays written in the exact same manner unless we change it. But the problems do change, time keeps changing them, and so we must keep coming up with new software to solve new problems. If our software were permanent, then truly static code would be just fine; there'd be no new problems to solve, nothing we'd need to learn. Which raises the question: do we want to live in a world where there's nothing to learn? I sure don't.

Precious few things I have ever worked on still exist today. The Tracker is one of the few, and it too will eventually fade away like all my other code. All my software, my "life's work", is destined to be thrown away like the trash it will become. This is not scary; this is merely a fact of programming life. Everything we write is destined to be discarded. What we need to do is be mindful of that fact, because then we can plan for it, work toward it, bring it about. The end of our software should be planned for, expected, even desired, at the least so it doesn't catch us by surprise.

This particular position is near and dear to my heart, though I also recognize the irony. One of the things I've told potential employers ever since I was fired for refusing to work more is that I don't do overtime. At the very least, not on a regular basis. I will do occasional "hell weeks" where it is absolutely critical we get a particular change deployed or a bug fixed, but barring those I will show up, kick ass, and go home.

Ms. Abdalla's tweet goes a step further than that, and because of it I'm suddenly seeing my own thought process in a new light. See, I opted in to this mindset (show up, kick ass, go home) largely in order to protect my own sanity. If I don't set clear boundaries as to when it's OK to think about work problems, I'll think about them all the time, and that's an efficient way to become overwhelmed with all the things I haven't done yet, all those problems I haven't yet solved. I'm a distracted developer and the act of establishing these boundaries helps me rein in my impulsive thoughts.

The problem is that we never run out of problems. Programming is so full of not-yet-solved problems that a puzzle-hungry brain like mine can find entertainment and solutions to its heart's content and still not feel like it has truly accomplished anything. I solved a problem! Cool! But so what? It can be maddening, frustrating, disheartening. It can be burnout-causing.

Due to my runaway brain, I've had to set boundaries. I've had to leave work at work. I've had to become a 9 to 5 developer.

I know there are a lot of developers out there afflicted by Impostor Syndrome. You probably have had this at some point; it's the feeling that you're a phony, a person who truly doesn't know what s/he is doing and is just faking it to get by. The issue for me is that if I don't set these boundaries, if I allow myself to continue working and exploring and solving problems at all hours, my own Impostor Syndrome only gets worse. I want to know everything and I cannot, and because I cannot I start to think less of myself.

So, I've had to set boundaries. Boundaries like "work stays at work" and "take breaks every so often". Boundaries that help keep my sanity intact and my distraction at bay. For the longest time I thought that made me less of a developer. I wasn't able to do 16-hour coding sessions, because by the end of it I'd be incapable of doing anything else and I have three kids and a wife at home who need my attention and love. Face it, our code will never love us back. But still, I wanted to do better, to be a better coder, like those I looked up to, and I hated myself for not being able to do it.

Twitter and other social media are the worst things when it comes to having this problem. People who are clearly brilliant programmers, people who I admire for their work, would proudly declare that they'd just spent all day coding and were feeling really accomplished and I could only sit back and despair about why I couldn't do that. Why can't I put that kind of effort in? Stupid brain! Why won't you let me be as productive as they are?! I can be just as good as they are, if only you'd get outta my way!

It took me a long time to come to grips with the fact that my brain simply does not work that way. I will never be able to do 16-hour coding sessions. I will never be able to be up all night and still get things done. I will never be able to be as "productive" as those people on Twitter say they are. I just cannot do it. And, as I've come to realize after nearly 11 years as a professional software developer, that's OK!

It's OK, because for eight hours a day I can still kick ass. I can still get my work done and do a good job on it. I can still lead my team efficiently. I can still tackle interesting problems with fun solutions. I just have to do it within boundaries. The trick is realizing that this doesn't make me any less than those Twitter people; it merely makes me different.

I will bet a lot of money that there is a "silent majority" of programmers out there who merely want to do their work and go home. These people do not sit up at night trying to mentally puzzle out a problem that's been bugging them for weeks. These people do not write blogs, contribute to open source projects, or give technical talks; indeed, they are generally not interested in programming outside the capacity of their job. These people are sometimes derogatorily termed "9 to 5 developers". I am here to tell these people that even though we are 9 to 5 developers, that does not make us worse developers.

If you are a 9 to 5 developer, it doesn't make you less of a programmer, it makes you better at time management.

I'm not here to tell you which way is better. I can only tell you what works for me, and that is that I am a 9 to 5 developer. I want to do my work, do it well, and then go home and do other things (like write this blog). I need to do it this way to keep my sanity. And you can too, even if your brain doesn't work like mine! You do not have to spend eons programming to be a good developer. You just have to think deeply about problems, and want to do a good job.

Guess what? If you are here reading this, you already are doing a good job. Now friggin go home already! The code will be there tomorrow, I promise.

There's this new guy on our team. I helped interview him, recommended we hire him, and ever since we did he's been absolutely killing it on every task we assign to him. Tasks that I thought would take an equivalently-skilled programmer 6 hours take him 30 minutes. He has consistently, repeatedly, beaten my estimates and done a damn fine job of writing testable, readable code. He is a living, breathing 10x programmer. And I'm not.

He's really smart. He's insightful, constantly coming up with suggestions that make our codebase better. He solves problems we didn't even assign to him. The only knock against him is that English is not his first language, but even then I have no problem understanding him. Our team is better because he is on it.

And he scares the living daylights out of me.

He could do my job. I'm sure of it. He could do these code reviews, these architecture plans, these proofs-of-concept, these structural tests, just as easily as I can. Probably more easily, in fact. His code would be just as good as mine. It is just as good, and I know because I review said code.

He's... better than me. At everything.

Crap! Now what do I do?! This guy is so damn smart that he'll knock me out of a job! He can do everything I can. He can do it better. And now he's coming for my job. I have to do something! I have to do something that ensures I still have a job here. I can't let this newcomer, this usurper, steal my job. My family needs this job. We have a mortgage! This is my position, and he can't have it! He'll take it from me when I'm done, and NOT BEFORE!

In some alternate timeline, that is what the other me is thinking. That is me worrying that I'm not good enough, that this new employee will take my job and leave me and my family shivering in the cold. That is me allowing my Impostor Syndrome to take full control, allowing my fear of being replaced to dictate my every action. That is me allowing my worst fears to rise up and control my very being, which would inevitably make both me and the new programmer suffer.

There is a new guy on my team; we'll call him Robin. Robin probably is smarter than me. He's definitely quicker, and his code passes all tests and reviews with flying colors. English is not his first language, but it doesn't matter in our day-to-day work. In short, Robin is absolutely a better coder than me, despite my 10+ years of experience and his mere 3. Boy am I ever glad he's on my team.

Yet, I am worried. My worries don't directly come from Robin, but he embodies them. I worry that one day I won't have enough time to keep learning, and my dreaded obsolescence date will arrive sooner than I'd hoped. I worry that my coding skills will atrophy to the point where they're no longer needed, or worse, become pigeonholed so far down as to be unemployable. I worry that I am replaceable. It's scary to be reminded that no, you're still not the best ever at what you do. These worries bring out our worst selves.

But allowing that worst me to surface, letting that selfish conniving little brat come out even once, will not solve anything. To the worst me, it's all about control; control of the job, control of the process, control of everything. He wants to control his surroundings, because he feels like nothing is in his control. And he'll be damned before anyone takes his control from him.

The problem is this: that kind of obsession helps precisely no one. By fixating so heartily on the perceived loss of control, what do I achieve? Possibly a job for the foreseeable future, but probably also a reputation of being a controlling jerk; an important jackass is still a jackass. Plus, that type of need, the need to control which stems from a real and primal fear, hinders people like Robin from getting the mentoring they need ("I'm not gonna help you, you'll just take my job"). That need to control is counterproductive to a positive development environment.

Here's the rub: the game is not zero-sum, despite what my Impostor Syndrome might tell me. I do not have to lose anything for Robin to gain. He is a better raw coder than me, that is true, but there's more to software development than coding. Architecture, testing, communication, these are all things he could use some guidance on. It is my job as a lead developer to build him up, to help him learn, to make him better than he was, better than me. My job, in short, is to put myself out of a job. After all, if you can't be fired, you can't be promoted.

Robin is now a critical member of my team, and I wouldn't have it any other way. He's a better coder than me, and that's wonderful. The trick is to remember how wonderful having such a productive member of my team is, and that he isn't out for my job, he's out to improve himself. That's something I can help with.

Do you have a Robin on your team? Someone who is markedly better than you? How does that make you feel? Let me know in the comments!

Happy Coding!


1. It doesn't get easier; you get smarter.

Programming is not a simple endeavor. Requirements change in the middle of projects; technology advances quickly and ruthlessly; customers are quick to judge and slow to explain. Programming does not get easier. Rather, we programmers get smarter.

The first time you encounter a particular bug, you're a failure until you stumble upon the solution, at which point you promptly become a god.

The trick is this: the next time you find that bug, you already know how to fix it, or at least the correct path to start on. The bug didn't get easier, you got smarter.

2. Learn every day.

Everyone will eventually become obsolete. If you are not learning, you will become obsolete much more quickly. You must learn or die! Technology moves too fast and too recklessly for any of us programmers to afford being left behind.

The trickiest part to this is learning how to learn. It takes years to be proficient at coding, and years more to be good at understanding what you need to know and letting go of what you don't need. You will get there, just give it time.

3. Sometimes, programming sucks

Programming is hard. We work in a world without universal rules, a world in which a customer who asks for a car could then demand that it also be an airplane, and we will be expected to deliver that in a week! Plus, programmers can be difficult to work with. Our field is one where the ground shifts nearly every day, and you will be expected to find your footing and keep moving. You're going to have to get used to feeling unbalanced.

But take heart: we all feel this way! None of us can possibly keep up with everything. We all feel lost, left behind, sometimes. It's OK! Just keep moving forward and you will be fine.

4. No one feels like they know what they are doing.

I've been writing code professionally for ten years. (Holy crap!) In that time I can think of maybe four weeks in which I absolutely, positively, totally felt like I knew what I was doing. Those four weeks just happened to coincide with the first four weeks of my first programming job.

Eventually, you start to realize that you feel like this all the time.

When you first have this epiphany you might feel pretty down. How can someone enjoy their job if they feel stupid all the time? It gets old fast.

Don't get down; instead, change how you look at it. Don't think of not knowing the solution as being stupid, think of it as having not learned the solution yet. Many programmers are puzzle-solvers at heart. Learn to love the puzzle, and the skill will follow.

5. You will be overwhelmed at some point. Don't burn out.

Stress comes for all of us. You can try to ward it off, delay it, but it will eventually get to you, and caffeine only helps so much. How you react to the pressures of your job will determine how much you like this field.

Don't double down on your work! Don't see the mountain of things to do in front of you and say "If I just work a bit harder, it'll all get done." That way leads to burnout. I personally got fired for refusing to repeatedly come in on Saturdays when I was told to. I'd get fired again in a heartbeat, because I value my time more than anything else, including money. I reject the glorification of work, and I invite you to do the same.

Enjoy your job, but also live your life. You only get one.

6. Communication skills are more important than coding skills.

You can learn all the code, all the design patterns, all the frameworks you want, but if you can't explain them to others then it doesn't count. Nobody cares what you can do, they care what you can describe.

Customers cannot read our minds. Part of our job is to take very technical things and explain them to very non-technical people. This is not a natural skill for many of us, but just because we're not good at it yet doesn't mean we never will be. It takes practice and experience to become an effective communicator. Give yourself time to get that experience, and you will get there. The best programmers are often the best communicators, because they help people understand what is happening.

7. Be the stupidest person in the room, or change rooms.

You cannot learn anything if you already know everything. Lots of programmers get comfortable in a job where, since they already know everything, there's no driving need to learn. These programmers become ticks, so dug in at their current jobs that removing them would cause harm to their company. They can't be fired, sure, but they also can't be promoted.

Don't become a tick. Your job is to write yourself out of a job, so that you can get a better job. If you can't learn anything from the people in the room, you're in the wrong room.

8. A good environment is worth far more than a high salary.

You cannot put a cost on the stress caused by working in an unwelcoming environment. Whatever the reason is, if you find yourself working a job that no longer makes you happy, do whatever you can to move away from it or to change it.

I know there are real-world pressures for people whose jobs no longer are (or never were) fulfilling. I know that, for some people, the money is too good to just up and leave. Maybe you and your family cannot take another job change right now. Whatever the reason is, future you will thank you ten times over if you can find a way to be satisfied with your job.

You don't have to change jobs if you're not content with your current one! Talk to your boss; perhaps there is something s/he would be willing to do to help lower your stress. Talk to your teammates; maybe they can assist with finding out why you might be so stressed. There is often a solution that can be found within your own team, even if it's something as simple as allowing the use of noise-canceling headphones because the conversations down the hall are loud and distracting. Many managers are willing to do little things to keep their employees happy.

The only thing I would caution is this: if you are expected to work overtime just to pull your weight, it doesn't matter how good the salary is, run. Run far away. Don't spend your whole life trading time for money, it isn't worth it.

And if you do find a welcoming environment, one where the programmers are valued and you're not expected to pull insane hours, stay a while. Not forever; no one stays forever. But as long as you can. You may not find such an inviting environment often.

You CAN Do This

Here's the most important thing you need to know, dear juniors: you can do this. Programming looks like magic, and some professionals make it look easy. Don't fall for that. It's not easy, it never has been, it probably never will be. But it can be done, and you can do it. And we need you, newly-minted junior programmers. Our industry needs new minds and new experiences like yours.

Do you have some tips or ideas you think incoming junior devs should know about? Or are you a junior yourself, and want to share your experiences? Sound off in the comments!

Happy Coding!


You may be familiar with the term rubber duck debugging. This is the idea that in order to help a programmer solve a problem, s/he should explain it to some kind of inanimate object (most commonly a rubber duck), because in the process of explaining the problem they will often solve the problem.

I'm finding that the best way to help people with their problems is to become that rubber duck. Though, perhaps, a slightly smaller one.

Becoming a rubber duck is not terribly difficult, though it does take some practice. Here's how it works: you listen to your coworker's problem. That's it. Don't offer opinions, don't espouse your favorite JavaScript framework, just shut your mouth and pay attention. Much of the time, they will solve the problem through the mere act of explaining it to someone else.

The key is this: if you do this for enough people, eventually you get recognized as "the guy who helped me solve my problems". Which is wrong, of course; you didn't actually help, they just talked through their problem enough to work it out for themselves. You weren't any help to them at all.

Or were you?

Listening to other people describe their problem is the fastest way to solve that problem. The simple fact is that the other person already knows more about this particular issue than you do, and so they're infinitely more likely to solve it before you even understand what precisely is going on. You simply aren't going to grasp all the nuances of that precise situation as quickly as they will, because they have spent time in the trenches and have seen more than you have. If you're the kind of person that needs to know everything (like me), part of becoming an effective rubber duck is letting go of the idea that you know everything you need to know, because you don't and you never will.

Even if you actually do know the answer, sometimes you need to let your teammate solve it for themselves. Particularly when you are in a mentor/learner situation (as I am with my junior devs), it becomes essential to allow the junior team members to work out issues for themselves and not just solve their problems for them. Doing so merely teaches them that you are the solution, when what you actually want is to show them that they can find the solution.

Becoming a rubber duck also helps to craft the image I mentioned earlier, of "the guy that helped me solve my code problems." People are very often not looking for help, they are looking for a solution. How they reach said solution is irrelevant. If you help them reach the solution they are looking for, even by saying nothing, they will remember and let other people know.

Listening is a skill. It takes practice, particularly for individuals like me who are easily distracted and would much rather be solving problems of our own. But it's now a core part of my job as a lead developer. I cannot do my job correctly without listening effectively.

Work on your listening skills. Software development is a team sport. We're all in this together, and we all succeed or fail together. You aren't going to succeed if you can't listen well. Offer opinions or suggestions if they're truly relevant, but otherwise if a person comes to you and asks to explain their problem, don't talk. It's counter-intuitive, but the person who listens the most often becomes recognized as the person who solves other people's problems, and you want to be that person.

Become your team's rubber duck, and just listen for a bit. You might find that you solve more problems by listening than by speaking.

Happy Coding Listening!


Our Continuous Integration and Continuous Deployment (CI/CD) build system (that we recently implemented) is truly a joy to behold, but it's also basically magic from my perspective. Check some code in, wait a few minutes, something happens, and then BOOM it's on the dev server. I know that it works, but I don't know exactly how. Should I take the time and effort to find out?

I'm a completionist, which means I'm also a naturally inquisitive person. I want to understand how things work, how they interact. I recognize that I cannot possibly understand everything, but I want to. It's a little internal conflict that rears up whenever I am presented with what seems, to me, to be magic.

My family and I love going to Disneyland (seeing as how we're really not that far from it), and at that resort there is a particular ride that's now known as Soarin' Around the World. This ride lifts passengers up into a hang-gliding type adventure, using a giant video screen and huge mechanical rows of seats to simulate flight. The first time I went on it, I was so distracted by trying to figure out how it operated that I didn't even watch the screen; I couldn't even tell you what we saw. I missed the whole ride because I was trying to figure out how it worked. Was it worth it?

Another piece of magic slightly closer to home is my desk phone.

This phone has no power cable, just an Ethernet jack. When I plug in the jack, the phone turns on. To me, this makes no sense, as I was under the impression that Ethernet jacks couldn't provide power. But I must be wrong, since the evidence is clear: it works. I can make calls on it, and people can call me (even though I may not want them to). How and why are Ethernet jacks capable of providing power? Or is something else going on and I'm just not seeing it? Is it worth the effort to figure out why it works rather than just accepting that it does?

I've slowly but surely come up with a litmus test that helps me determine if something that is "magic" is worth the time to figure out, to take apart and put back together. I do so if and only if:

The "magic" is directly related to a problem I'm trying to solve AND

The knowledge gained from investigating the "magic" is directly useful to solving other problems.
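For the programmers in the audience, the two conditions above boil down to a simple AND. Here's a tiny (and admittedly tongue-in-cheek) Python sketch of the test; the function name and parameters are made up for illustration:

```python
def worth_investigating(related_to_current_problem: bool,
                        knowledge_transfers: bool) -> bool:
    """Dig into a piece of "magic" only if BOTH conditions hold:
    it relates to a problem I'm solving now, AND the knowledge
    gained will help me solve other problems later."""
    return related_to_current_problem and knowledge_transfers

# Scoring the three pieces of magic from this post:
print(worth_investigating(False, False))  # Disneyland ride
print(worth_investigating(False, False))  # desk phone
print(worth_investigating(True, True))    # CI/CD pipeline
```

Note that it's an AND, not an OR: a merely interesting mystery fails the test just as surely as a merely useful one.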

This test helps me determine whether or not further investigation would be useful for me. Let's see how our three examples of magic fare against this test.

First off, the Disneyland ride utterly fails our litmus test. Spending time to try to figure out how the ride worked at the expense of just enjoying it was absolutely not worth it. What was I going to do with the knowledge gained, impress my friends at some hypothetical party? I don't even go to regular parties, much less hypothetical ones. It didn't help me solve a problem; in fact it detracted from my enjoyment of my vacation and left me with less happiness than I might have had otherwise.

Taking apart the desk phone is also out. I mean, it's a phone and it works. I don't really need to know how it works, unless I suddenly need to be able to write code for it, which is unlikely. The test suggests that we should leave it alone.

But the CI/CD process is something I should know, even if only to be able to diagnose problems when they inevitably occur. The knowledge gained from learning about this procedure will absolutely be useful for later projects, since presumably they will be using the same or similar system. Plus, then I can be a teacher and help other people in my company set up their own CI/CD processes. The knowledge gained from investigating the magic would be useful in other areas.

We completionists will never have enough time to learn everything we want to learn, and so we have to learn to let go of that hope. This is not a sad thing, nor should it be; rather, it's just part of learning to live a full, well-rounded life.

The point is, you don't have to spend the time to learn something that isn't going to be useful to you. Sometimes it's just not worth the effort. You have to actively manage your time, and since you'll never have enough of it, it's best to spend it as wisely as possible. Only you can determine exactly what that means, but for me, it means spending my time enjoying things I want to do, and learning about things that help me solve problems. Everything else is a waste of my precious time.

The problems started small, as they often do. But as we've seen many times before, lots of small problems in quick succession tend to make one big problem.

In this case, the problem got big fast. It started off easy enough: read the bug report, find the bug, fix it, the usual. Our bug-tracking team located the source of the issue right away, and my team set about trying to work out the fix. We found the data source that was causing the issue, and it happened to be a web service owned by another team. We couldn't see into it, only its inputs and outputs; it was essentially a black box to us.

Here's where the bad part starts. Due to a lack of efficient communication on all sides, impending deadlines, frustrated coders, and a general hesitancy to deal with this particular, ahem, time-intensive project, the actual bug fix took about four days to nail down. Yes, four days; we were just as annoyed as you are, and probably more so.

To make a long story short, the project we were attempting to deal with was:

- old,
- slow,
- in desperate need of a rewrite,
- using a data source which we had no visibility into (the aforementioned service),
- not written by anyone currently on the team,
- still our responsibility to fix, AND
- needed to be fixed right friggin now.

You should read that list and cringe a little for each bullet point. I know I did.

All of those problems put together (plus the fact that it took us four days to figure it out) prompted my manager, normally a well-reasoned, thoughtful individual, to say during our bug post-mortem:

"I'm starting to really loathe this project. It's getting to the point where I don't trust anything that we didn't build."

I have to say, it's hard to blame him for wondering if we shouldn't be using things that were not invented here.

It's incredibly easy for a software development team, even an experienced one like mine, to fall into the comfortable trap of believing that everybody else's code is terrible and their own is awesome. We developers often forget (or, quite possibly, want to forget) that most of the time the bug is in our code, no matter how much we wish that it wasn't.

Do this enough and the untamed wild of other people's code starts to look like an intimidating place. It's safer, easier, to believe that your code is correct and everyone else's is wrong, because that means you don't have to spend time trying to understand what the other person was thinking, or, just as often, figuring out how you are wrong, something nobody enjoys doing.

I've written before that I believe code should exist for a reason. The difficulty in working with other people's code is that not only are you trying to understand what the code does, you're trying to comprehend the reason why it does that. That's a difficult thing to accomplish in the best of times (efficient communication being a feat that usually fails, except by accident), and when you're approaching a deadline and trying to have a meaningful conversation with the original developer who has his own deadlines and responsibilities to deal with, it can be nigh impossible.

Let me be perfectly honest: there are times I completely understand my manager's frustration. It would be SO much easier if the only code I had to deal with was my own, because then the only stupid person in the equation is me and I can fix that. Dealing with other stupid people is infinitely more frustrating than dealing with your own stupidity.

To be clear, I am not calling my coworkers stupid; they are in fact quite the opposite. But it's tempting to fall back to lazy thinking and believe they are stupid merely because they were dealing with requirements and scenarios that I didn't have time to thoroughly understand. That temptation, to believe that things are stupid because I don't understand them, is something I find myself fighting against on a daily basis. It's an innate human quality, and not unique to programmers or other technical people.

Here is a basic fact of life: people, on the whole, are not stupid. Programmers do not write code for no reason; the best code is no code at all, and if we could have our way there would be no code, ever. But because code needs a reason to exist, it almost certainly had a set of requirements, or scenarios, or something which shaped its current form. Even if those requirements were merely thoughts in the original developer's head, they existed. It is not the fault of that developer that some idiot who saunters up to a laptop trying to break her code doesn't understand what said code is meant to do.

But it's easy to think that, isn't it? It's easy, it's simple, it's lazy. When we don't have time or energy to think, really think, the lazy thoughts are what we are left with. Given that programming is an almost-entirely-mental task, accepting the lazy thoughts as fact could even be seen as a reprieve from needing to think critically all day, every day.

Resist the lazy thoughts. Resist the idea that your fellow programmers are stupid, or wrong, or only doing a half-done job. Resist Not Invented Here syndrome. Resist the idea that because someone didn't understand you, they're dumb. Resist all these little thoughts that end up with a conclusion of "those other people are stupid," and instead try to answer "what were they trying to accomplish?" There's nothing wrong with digging a little deeper for a better understanding.

That's what I say to you: resist the lazy thoughts, and dig a little deeper. You will eventually have to trust something you didn't build. If you keep digging, you'll find what you are looking for.