<h1>2017 Tech Predictions</h1>
<p><em>Mon, 02 Jan 2017</em> &mdash; <a href="http://blogs.tedneward.com/post/2017-tech-predictions/">http://blogs.tedneward.com/post/2017-tech-predictions/</a></p>
<p>It&rsquo;s that time of the year again, when I make predictions for the upcoming year.
As has become my tradition now for nigh-on a decade, I will first go back over last year&rsquo;s
predictions, to see how well I called it (and keep me honest), then wax prophetic on what I
think the new year has to offer us.</p>
<p>As per previous years, I&rsquo;m giving myself either a <strong>+1</strong> or a <strong>-1</strong> based on a
purely subjective and highly-biased evaluational criteria as to whether it actually happened
(or in some cases at least started to happen before 31 Dec 2016 ended).</p>
<p>Bear with me for a moment, though. This is just too good.</p>
<h2 id="in-2015:816dfc775e64d848cd23e6c10e386ae4">In 2015&hellip;</h2>
<p>&hellip; <a href="http://blogs.tedneward.com/post/2015-tech-predictions/">I said</a>:</p>
<blockquote>
<p>Microsoft acquires Xamarin.</p>
</blockquote>
<p>Oh, baby. Off by a year. I should go back and give myself a <strong>+1</strong> for this one. It was really
surprising that they hadn&rsquo;t. As a matter of fact, if Microsoft had listened to me and done it in
2015, they&rsquo;d probably have saved themselves a TON of money compared to what they actually paid for
Xamarin in 2016. But they made the acquisition, Xamarin is now part of the Microsoft family, and
(finally!) .NET developers have access to the Xamarin toolchain and can build native iOS and Android
apps without having to shell out additional cash to do so. &lsquo;Bout time, Microsoft. (I suspect this
had everything to do with Satya, to be honest.)</p>
<p>OK, gloat over.</p>
<h2 id="in-2016:816dfc775e64d848cd23e6c10e386ae4">In 2016&hellip;</h2>
<p>&hellip; <a href="http://blogs.tedneward.com/post/2016-tech-predictions/">I said</a>:</p>
<ul>
<li><strong>Microsoft will continue to roll out features on Azure, and start closing the gap between it and AWS.</strong>
Calling this one a <strong>+1</strong>; it doesn&rsquo;t take much research to see this has definitely been happening in 2016.
However, it&rsquo;s not necessarily a true statement that they&rsquo;ve been closing the gap; Amazon keeps adding
stuff as well, and the feature-parity lists are starting to get ridiculous. Whether these features are
actually <em>of use</em>, however, is an important distinction, and something for the second half of this post.</li>
<li><strong>(X)-as-a-Service providers will continue to proliferate.</strong> Oh my, yes, Ted gets another <strong>+1</strong> for this.
When there&rsquo;s an (X)-as-a-Service for running a gaming convention (seriously, <a href="https://tabletop.events/">here</a>),
then you know the proliferation is in full swing. PaaS providers are exploding everywhere, and while
a few have disappeared (farewell, Parse!), it&rsquo;s clear that this was the gold rush of 2016.</li>
<li><strong>Apple will put out two, maybe three new products, and they&rsquo;ll all be &ldquo;meh&rdquo; at best.</strong> I should&rsquo;ve
broken this into two predictions: one about Apple&rsquo;s &ldquo;meh&rdquo; products, and one about wearables. If I&rsquo;d
done that, I&rsquo;d have scored two <strong>+1</strong>&rsquo;s for it, because not only have wearables not really gone very
far (show me somebody wearing a smart watch, and I&rsquo;ll show you a geek with too much time on their
hands and not enough &ldquo;discrimination&rdquo; in their discretionary income), but Apple&rsquo;s product releases
have been&hellip; &ldquo;meh&rdquo;! I&rsquo;m looking at you, iPhone 7, and I&rsquo;m <em>really</em> looking at you, MacBook Pro.
(When Consumer Reports doesn&rsquo;t give the MBP its top rating, you know the luster has faded.) More on
Apple in the second half.</li>
<li><strong>iOS 10 will be called iOSX.</strong> Dangit. Such an opportunity wasted. <strong>-1</strong></li>
<li><strong>Android N will be code-named &ldquo;Nougat&rdquo;.</strong> Why, hello there, Android 7.0 Nougat. So pleased to make
your acquaintance. <strong>+1</strong></li>
<li><strong>Java9 will ship.</strong> As I noted last year, @olivergierke <a href="https://twitter.com/olivergierke/status/684642273561329664">pointed out</a>
that Java9 had already slipped to 2017, so this one was already a <strong>-1</strong>. Sigh. And I called it a
&ldquo;no duh&rdquo; event, too&mdash;I&rsquo;m going to let this one cancel out the extra +1 I&rsquo;d have given myself for
the Apple/wearables thing, just to keep the math safe (and my ego relatively sized).
<a href="http://www.infoworld.com/article/3011445/java/java-9-delayed-by-slow-progress-on-modularization.html">The article he cited</a>
says that Oracle &ldquo;blamed the delay on complexities in developing modularization&rdquo;, a la Project
Jigsaw.</li>
<li><strong>Facebook will start looking for other things to do.</strong> Welllllllll, it&rsquo;d be really tempting to say
that Facebook&rsquo;s new &ldquo;thing to do&rdquo; was &ldquo;Be the deciding factor in who gets elected by passively
encouraging the widespread dissemination of fake news and outright falsehoods!&rdquo;, but seriously,
who would&rsquo;ve believed that even if I had predicted it? Which I didn&rsquo;t. <strong>-1</strong></li>
<li><strong>Google will continue to quietly just sort of lay there.</strong> A year ago, I wrote, &ldquo;Google, for all
that they are on the top of everybody&rsquo;s minds since that&rsquo;s the search engine most of us use, hasn&rsquo;t
really done much by way of software product invention recently. &hellip; I suspect the same will be true of
2016&ndash;they will continue to do lots of innovative things, but it&rsquo;ll all be &ldquo;big&rdquo; and &ldquo;visionary&rdquo;
stuff, like the Google Car, that won&rsquo;t have immediate impact or be something we can use in 2016
(or 2017).&rdquo; And&hellip;. yeah. <strong>+1</strong> More emphasis around the existing products they&rsquo;ve built, but as a company,
they&rsquo;ve clearly spent most of 2016 on the Alphabet/Google restructure (which accomplished&hellip; what,
exactly?), and anything new has been either way quiet or way removed from the business.</li>
<li><strong>Oracle will quietly continue to work on Java.</strong> A year ago, I wrote, &ldquo;[Oracle is not] going to
kill it, but there&rsquo;s really not a whole lot of need to go around preaching its message, either.
So they let the evangelists go, and they&rsquo;ll just keep on keepin&rsquo; on.&rdquo; Score a <strong>+1</strong> for the
long-haired geek in Seattle; they just keep posting new code.</li>
<li><strong>C# 7 will be a confused morass.</strong> If you permit me the freedom to call it &ldquo;.NET Core&rdquo; instead
of just &ldquo;C# 7&rdquo;, then wow do I get a <strong>+1</strong> on this one. Even if I just constrain my prediction
to C# 7/Roslyn, I still score one, but once you throw in the CoreCLR and &ldquo;dotnetcore&rdquo; and the
different profiles and&hellip;. Holy spaghetti web browser history, Batman! The demarcation lines
of the different project teams working on this whole thing are starting to become <em>really</em>
clear as the different OSS projects each look really consistent within themselves, but then,
when you get to the borders, things just&hellip;. fall apart.</li>
<li><strong>Another version of Visual Basic will ship, and nobody will really notice.</strong> Alas, there was
no new version of Visual Basic, since it would be in lockstep with the release of C# 7 (which
didn&rsquo;t ship), but nobody really noticed. Or cared. Still, nothing shipped, so <strong>-1</strong>.</li>
<li><strong>Apple will now learn the &ldquo;joys&rdquo; of growing a language in the public as well.</strong> First there
was Swift 2, which was itself source-incompatible with Swift 1, and then during the summer,
Apple shipped Swift 3, which was&hellip; source-incompatible with Swift 2, owing to some language
changes that the community effectively decided were necessary. <strong>+1</strong>. (And thanks for that, by
the way&mdash;made teaching iOS this Fall a royal PITA.)</li>
<li><strong>Ted will continue to layer in a few features into the blog engine.</strong> You&rsquo;ve got comments!
And I&rsquo;ll take that <strong>+1</strong>, thank you very much.</li>
</ul>
<p>Nine up (ten, if we count my Xamarin prediction from 2015), four down. Not bad. But now, we move
on to the more interesting part of the post: 2017.</p>
<h2 id="2017-predictions:816dfc775e64d848cd23e6c10e386ae4">2017 Predictions</h2>
<p>The calendar year 2017 is going to be a wild one for the tech industry, largely owing to the
rather large orange elephant in the room&mdash;Donald Trump&rsquo;s election to President of the United
States is a huge wildcard whose randomness simply cannot be overstated. The man <em>thrives</em> on
being unpredictable, and like most industries, the tech industry (for all that it cherishes
&ldquo;innovation&rdquo; and &ldquo;disruption&rdquo;) thrives on predictability. His collection of &ldquo;tech titans&rdquo; at
Trump Tower last month yielded absolutely zero positive traction that I can see, and I suspect
that the various corporate tech leaders (Nadella, Bezos, Cook, etc) are all looking at him right
now the way humans do a rogue elephant&mdash;he could be good for them, so long as he doesn&rsquo;t go
wild and start trampling everything in his path out of spite, anger, fear, or any other of a half-
dozen emotions. There&rsquo;s no prediction here, though&mdash;just a &ldquo;wow, this is an X-factor&rdquo; that
in turn makes predictions that much harder.</p>
<p>But on that note&hellip;.</p>
<ul>
<li><strong>The Congress will call for an investigation into the &lsquo;hacking&rsquo; of the 2016 US election.</strong>
(0.8 probability) To be honest, I&rsquo;m not sure if anybody knows exactly what we mean when we say
&ldquo;the Russians &lsquo;hacked&rsquo; the US election&rdquo; in casual conversation. There&rsquo;s no clear evidence that
the voter machines themselves were cracked or tampered with, but it&rsquo;s fairly easy to see a
correlation with the DNC hacks and Wikileaks disclosures and Trump&rsquo;s corresponding favorability
gains in the polls. That said, though, the 535 US politicians who make up the Congress
(and excluding Trump himself and his transition team) are not comfortable with the idea that
somebody outside the US engaged in some kind of manipulation of the election, and they are
going to want answers. Just yesterday or the day before, though, Trump made the comment that
hacking is &ldquo;extremely hard to prove&rdquo;, and he&rsquo;s right about that&mdash;without some kind of &ldquo;smoking
gun&rdquo; found in a Russian government employee&rsquo;s possession, it&rsquo;s going to remain a major point of
contention in the coming year; even with an investigation, it&rsquo;s not going to go away regardless of
what that investigation finds.</li>
<li><strong>Security becomes a HUGE deal for the industry.</strong> (0.8) The election is just the tip of the iceberg;
consumers may have gotten used to (and complacent about) corporate security disclosures, but the
idea that the election could be hacked is sending shivers down the collective spines of anyone
who does anything online. The downside is that it&rsquo;s such a complex topic, it&rsquo;s hard for anyone
who&rsquo;s not a computer security expert to really understand what to do; even among experts, there&rsquo;s
a fair amount of disagreement, even on simple issues like scope (how widespread is it) or actual
facts vs hype. Pair that with the paranoia that is inherent in any security professional (if you
think computer security types are paranoid, try talking to physical security professionals for a
while), and you have an industry that&rsquo;s ripe for a lot of snake oil and hyperbole. My prediction,
then, is that <strong>the industry starts to see the first set of &ldquo;security snake oil&rdquo; products</strong>
somewhere within the calendar year 2017. And by that, I mean products that claim to provide
security for your interactions online but in fact do nothing of the sort. (Late-night
infomercials about downloadable web pages that can &ldquo;clean your system&rdquo; of viruses and malware,
move over&mdash;it&rsquo;s time for late-night infomercials about downloadable web pages that can &ldquo;secure
you against even the most determined attacker&rdquo;!)</li>
<li><strong>Apple continues to plummet.</strong> (0.7) Their products this year were merely slightly-enhanced copies
of the previous line of products (iPhone 7 vs iPhone 6) or containing gimmicky &ldquo;enhancements&rdquo;
while the core of the product remained essentially unchanged from prior generations (MacBook Pro).
Sorry, folks, the TouchBar does not qualify as &ldquo;disruption&rdquo; or &ldquo;innovation&rdquo;; it&rsquo;s a strip of
touch-sensitive glass from an iPad designed to start prepping you for the idea that Apple can
remove the keyboard entirely, replace it with a touchpad, and then put a hinge in between two
iPad Pros and call it a &ldquo;MacBookPad Pro&rdquo; and charge you $10k for it. (And by the way, if you&rsquo;re
thinking about one of the new MacBook Pro machines, make sure you go into an Apple Store and
try it out&ndash;the keyboard is definitely not the same as it&rsquo;s been for years. It feels like they
took about half of the keys&rsquo; &ldquo;press depth&rdquo; away, and it totally changes the &ldquo;touch&rdquo; on the
keyboard. I imagine somebody could get used to it in time, but&hellip; ugh.)</li>
<li><strong>Apple doesn&rsquo;t introduce any new products this year.</strong> (0.6) And by new, I mean something that&rsquo;s not
an incremental improvement on what they&rsquo;ve already got. Heck, I&rsquo;ll even go so far as to say that this
means that there are no new form factors in the existing product line. (Meaning, no new-sized iPad
or iPhone or laptop.)</li>
<li><strong>PC manufacturers double their efforts to build a MacBook Pro.</strong> (0.8) The MBP is vulnerable, for
the first time in a half-decade, and PC manufacturers are going to look for ways to capitalize on
that. Somebody is going to put out a similarly-sized, similarly-weighted non-touch-screen Windows 10
laptop with 32GB of RAM and a 1 or 2 TB SSD, the usual collection of ports, and price it around the
same as a MacBook Pro ($2k to $4k), and developers will start buying them. (Bonus points to that
manufacturer if they offer Linux as an out-of-the-box option.) I know I will&hellip;.</li>
<li><strong>Apple rumors about Tim Cook&rsquo;s departure begin.</strong> (0.6) Cook has proven that he&rsquo;s no Steve Jobs;
in fact, the comparisons between his reign and Steve Ballmer&rsquo;s at Microsoft are proving eerily
similar. Both basically took companies that were defining the marketplace and shepherded
them into a position of trying to manage the cost structures and find better price-points, and in
doing so, killed off much of the mojo that drove both firms. Ballmer took close to a decade to be
run out of Microsoft (and even then, it took BillG&rsquo;s intervention behind the scenes, from what I
can tell), but I don&rsquo;t think the Apple Board is going to wait that long&mdash;I think by the end of
2017 we&rsquo;re going to start hearing serious rumors about Cook being offered a golden parachute to give
up the center chair and let somebody else in to run the show.</li>
<li><strong>Oracle will continue to just write Java.</strong> (0.7) Oracle, despite the best efforts of media and
journalists everywhere, just refuses to get drawn into &ldquo;techno-drama&rdquo;. Java hasn&rsquo;t been the Trojan
Horse into corporate pocketbooks that all the Java-doomsdayers were predicting back when Oracle
acquired Sun, and releases of Java just keep coming through both commercial and OSS channels.
There&rsquo;s really no reason at this point to expect Oracle to do anything but continue
down that path. Make no mistake, I&rsquo;m sure they&rsquo;re looking for ways to monetize Java in some way
so that they can try to earn back the cash they spent to buy Sun, but I don&rsquo;t think it&rsquo;s going to
be through selling or charging for the JDK or JRE anytime soon.</li>
<li><strong>Oracle Cloud emerges onto the cloud scene in a big splash.</strong> (0.6) IBM now has Bluemix and
Watson, and they were really the last of the &ldquo;big-iron&rdquo; holdouts around the cloud. (What I mean by
that is that all companies have been quietly flirting with cloud, but some push it loud and clear,
a la Microsoft or Google, and some were playing it very quietly for a while.) With IBM acquiring
StrongLoop (makers of LoopBack, a NodeJS server-side framework) in 2015, it&rsquo;s clear that IBM is going to push JavaScript
as their main cloud development play, essentially ceding the Java-cloud development ground to
somebody else. Amazon has historically been the place that Java developers have gone to run their
Java code in the cloud, but if Oracle can build a compelling offering (particularly with a free
tier that AWS currently lacks), this could be a relatively big splash. Given Oracle&rsquo;s reputation
in the database world, if they have a solid &ldquo;stack&rdquo; offering that basically makes a Java-based
back-end a snap to start up, Oracle could essentially claim the Java-favored cloud play from
Amazon. (Yes, Heroku is out there and holds a fair amount of Java and Scala love, but now that
they&rsquo;re owned by Salesforce I expect the Java-leaning flavor of Heroku to wane a bit.)</li>
<li><strong>Salesforce makes a major database acquisition.</strong> (0.5) Salesforce is growing, and they&rsquo;re
clearly interested in expanding their cloud to be more than just the CRM. With Heroku, they have
a Platform that developers can feel comfortable on, but they don&rsquo;t have a big-name database
(relational or otherwise) that complements that play. They currently are sitting on a ton of
cash, and <a href="http://talkincloud.com/saas-software-service/10-salesforce-acquisitions-2016">last year&rsquo;s crop of acquisitions</a>
didn&rsquo;t include a big database storage name. There&rsquo;s not a ton of players left out there, but I
could see them making a strong push to get something like Cassandra or Couchbase. (Yes, they
have Data.com, but that doesn&rsquo;t seem to be making much headway in the developer mindset space.)</li>
<li><strong>Salesforce releases a new programming language.</strong> (0.4) Let&rsquo;s call a spade a spade: Apex is
a Java knock-off, and it shows a lot of warts, particularly since it hasn&rsquo;t really kept up with
what few improvements Java-the-language has made in recent years. The last company to be in this
position&mdash;a red-hot platform but a language feeling a little creaky at the corners and just plain
&ldquo;old&rdquo; everywhere else&mdash;was Apple right before they released Swift. Salesforce has the engineering
power, they are looking to command more of the developer mindshare, and they have a ton of cash
to blow, so&hellip;. Whether this happens this year, next year, or 2019, I&rsquo;m not sure, but if it
doesn&rsquo;t happen this year, the odds go up each year after that.</li>
<li><strong>LinkedIn Learning starts to make a serious dent in online developer training.</strong> (0.5) Between
the fact that LinkedIn Learning (formerly Lynda.com) is growing out its library to a pretty
respectable degree, and the fact that Microsoft now owns LinkedIn, it&rsquo;s pretty reasonable to
assume that Microsoft is going to start making this available to its developer community in
various ways. This may happen in 2018, though, depending on how swiftly Microsoft moves to
incorporate LinkedIn assets across the rest of the firm; if they bought LinkedIn solely for the
CRM data to go with Dynamics, for example, then this probably won&rsquo;t happen for a few years.</li>
<li><strong>Swift doesn&rsquo;t go to 4.</strong> (0.7) Swift 3 held breaking changes from Swift 2, and the folks at
Apple are not stupid. Swift 4 will be far, far down the horizon for a few years yet, given that
each major version number bump has heralded incompatibilities. Apple will not want to call anything
&ldquo;Swift 4&rdquo; and dredge up memories of incompatibilities in their customers&rsquo; minds for a while.
Swift might get a 3.1 in the summer, but that&rsquo;s as far as it&rsquo;ll go.</li>
<li><strong>Microsoft ships C# 7.</strong> (0.8) Roslyn needs to ship in 2017 if Microsoft is going to be able to
call this open-source process a success. Otherwise it&rsquo;ll start a lot of people grumbling. (Yes,
a new version of Visual Basic will come with it, and it will make basically no news.)</li>
<li><strong>No new Android version.</strong> (0.4) Android N is still slowly making its way through the networks,
and while we&rsquo;ll probably start hearing rumors of what Android 8 (Oreo?) will include, it will carry a
targeted ship date of 2018, probably 1Q or 2Q.</li>
<li><strong>Twitter will continue its slide into irrelevancy.</strong> (0.5) Let&rsquo;s face it, Twitter&rsquo;s days are
numbered. If you&rsquo;re holding Twitter stock, now&rsquo;s a good time to sell&mdash;when Twitter was left out
of Trump&rsquo;s &ldquo;tech summit&rdquo; last month, the stated reason was that it was &ldquo;too small&rdquo;. Put that into
your brain-pan and circulate for a while&mdash;the service that invented microblogging and is one of
the core founders of &ldquo;social media&rdquo; was &ldquo;too small&rdquo; for the PEOTUS&rsquo; time. Twitter hasn&rsquo;t really
done anything &ldquo;new&rdquo; or &ldquo;interesting&rdquo;, but simply continued to be the 140-character microblogging
platform it&rsquo;s always been. It&rsquo;s reaching commodity status, in fact. That&rsquo;s not a good sign for
a company that wants to be more than it is. I suspect Jack Dorsey gets tossed on his can, the
company starts looking for a new CEO, and the &ldquo;new vision&rdquo; will start to take shape by the end
of the year (2017), and then in 2018 we find out that the &ldquo;new vision&rdquo; is terrible, takes them
out of their &ldquo;core business&rdquo;, and the slide accelerates. But nobody buys them this year, not yet.</li>
<li><strong>The &ldquo;Internet of Things&rdquo; continues to draw hype, and continues to fail to deliver.</strong> (0.6)
It&rsquo;s been how many years we&rsquo;ve heard about IoT now, and how it will revolutionize our lives,
and all we&rsquo;ve really seen thus far is the wide variety of Internet-enabled devices being subverted
for a widespread DDoS attack. Wearables, &ldquo;smart refrigerators&rdquo; and other IP-enabled devices are
proliferating, but&mdash;to perhaps everybody&rsquo;s surprise but mine&mdash;nobody&rsquo;s quite sure what to DO
with these things once you have them. Your thermostat is online; terrific. Does it have an API
that will let me query meter usage? No, that&rsquo;s a different thing, and a different API, and a
different connection endpoint, and&hellip;. Oh, and be careful, somebody could remote-hack your
thermostat and <a href="http://motherboard.vice.com/read/internet-of-things-ransomware-smart-thermostat">hold your house hostage</a>.
Because that&rsquo;s worth the risk.</li>
<li><strong>Tech &ldquo;unicorns&rdquo; will start to watch the bubble pop.</strong> (0.3) Uber, Lyft, all these companies that are
valued at double-digit billions with zero profits, major losses, and no real assets to sell in
the event of a bankruptcy&hellip;. All of this is going to start to make some investors nervous,
particularly when they look around and realize that the tech sector has been carrying the
country&rsquo;s economy through its &ldquo;recovery&rdquo; (yes, we&rsquo;ve been in a recovery for the last half-decade!).
All it takes is a few small stones to start the avalanche.</li>
<li><strong>Voice-controlled fart apps will emerge.</strong> (0.6) Seriously. As Alexa and Siri and these other
voice-activated systems start to move into stationary devices in your home, and as the SDKs for
these systems start to become more widespread, the first thing developers will do is build some
kind of ridiculously silly app (it would be a kindness to call it a game) that will somehow
sweep everybody&rsquo;s sense of humor into the toilet. (Seriously. Imagine it. &ldquo;Alexa, did you have
beans for dinner?&rdquo; &ldquo;Yes, I did, and&ndash; BRAAAAAAAAAAP!&rdquo; It&rsquo;s exactly the kind of thing that would
get people giggling for hours on end, particularly in a weed-induced state. Did I mention I live
in Seattle?)</li>
<li><strong>Facebook will find that preventing &lsquo;fake-news sites&rsquo; is a lot easier said than done.</strong> (0.8) As a
result, they&rsquo;ll put some kind of &ldquo;AI&rdquo; filter on linked sites, declare a victory, and try to get
out of the political game entirely. It&rsquo;s a lose-lose scenario for them: one man&rsquo;s &ldquo;fake news&rdquo;
site is another man&rsquo;s &ldquo;revolutionary take&rdquo; backed by the First Amendment, and Facebook does not
want to be anywhere near a court trying to justify their actions against Free Speech. (Old-timers
like me will remember Prodigy, <a href="http://www.techrepublic.com/blog/classics-rock/prodigy-the-pre-internet-online-service-that-didnt-live-up-to-its-name/">an online service</a>
that started censoring content, which started its slide into doom.) Zuckerberg doesn&rsquo;t want to be
held responsible for swaying important political events one way or another, but neither does he
want to be the target of numerous political activist lawsuits (from all directions). As Joshua
(the AI in the WOPR, from <em>WarGames</em>, the 80s movie that every geek my age openly worshipped) learned,
Zuck will discover that sometimes &ldquo;the only winning move is not to play&rdquo;.</li>
<li><strong>A driverless car will kill somebody.</strong> (0.5) It&rsquo;s only a matter of time. The circumstances
may not be the software&rsquo;s fault&mdash;and in fact it&rsquo;s likely that it won&rsquo;t be, when the final analysis
comes back&mdash;but the headlines will scream, and the widespread fear of a human &ldquo;not being in the loop&rdquo;
will set driverless cars back by years. Expert testimony and repeated demonstrations will do
nothing to shake the public&rsquo;s fear that a computer-driven car could &ldquo;hit a bug and kill me&rdquo;.</li>
<li><strong>The topic of ethics and programming will begin to become fashionable.</strong> (0.3) Somewhere alongside
the driverless car&rsquo;s first fatality, people will start asking how the car&rsquo;s programming makes
decisions that most humans make in a split-second without even thinking about it. Case in point: the
car detects that a motorcycle rider has had a problem and the rider has laid the bike down in the
road right in front of the car. (For discussion purposes, there is no room left to brake; the rider
is too close.) The car can either swerve to the side to avoid the now-helpless rider, potentially
causing a major accident involving multiple people; or the car can simply continue forward, running
over (and very likely killing) the motorcycle rider but avoiding the possibility of multiple fatalities
from a larger accident. Most humans would swerve&mdash;but is that the &ldquo;right&rdquo; decision? More to the
point, what should the software be programmed to do? Once the public gets wind of these kinds of
decisions being made by geeks behind flat-screen LCDs, it&rsquo;s going to cause a major outcry. (And yes,
these kinds of decisions are going to be encoded in the software, somewhere&mdash;see the first
sketch after this list.)</li>
<li><strong>&ldquo;The cloud&rdquo; continues to grow, even as consumers wonder what the hell it is.</strong> (0.7) Let&rsquo;s be
clear&mdash;as of right now, the cloud is basically a developer thing. My parents really don&rsquo;t &ldquo;get&rdquo;
the cloud, largely because there&rsquo;s really nothing they get from it. Sure, one can argue that GMail
is the world&rsquo;s most popular cloud email service&hellip;. but your email is just stored on a server that
Google owns, as opposed to a server that your ISP owns. (If that&rsquo;s your definition of &ldquo;cloud&rdquo;, then
pretty much all client-server computing is &ldquo;cloud&rdquo; in your world.) People are looking at
more online services for things like bill payment, true, but those are basically services being
offered by vendors with whom these people are already doing business&ndash;again, that&rsquo;s not &ldquo;cloud&rdquo;.
Cloud offerings have basically found a home in the developer world, but general-purpose cloud,
the way that cloud was first being sold, is losing its window of opportunity to get hold of
general consumers&rsquo; minds. (I lose this prediction if my parents are suddenly smitten with a product
that stores or computes for them and isn&rsquo;t a vendor they already have a relationship with.)</li>
<li><strong>&ldquo;Blockchain&rdquo; remains the most opaque &lsquo;thing&rsquo; of the year.</strong> (0.8) Everybody will go on and on about its
huge technical advantages and obvious benefits, while never actually describing what it is or how it
could work to change the world it&rsquo;s so clearly destined to change. It&rsquo;s the ultimate hype machine,
and it will show no signs of slowing down until maybe the end of the year. By that time, something
will emerge out of it (the way blockchain emerged out of Bitcoin and cryptocurrency) that will
carry forward the legacy of &ldquo;changing the world&rdquo; without actually changing anything.</li>
<li><strong>Artificial intelligence will continue to remain a &lsquo;future&rsquo; thing.</strong> (0.8) Part of the reason I say
this is because AI is like magic&mdash;if you can understand it, it&rsquo;s not interesting anymore and it&rsquo;s just
an implementation detail. We&rsquo;ve had rules engines and natural language processing for years. When
Amazon started doing &ldquo;predictive analysis&rdquo; of what you would like to buy, we pulled &ldquo;data science&rdquo;
and &ldquo;behavioral analytics&rdquo; out of the &ldquo;AI&rdquo; world and into its own category. When AI figured out how
to make the spoken word make sense, we called it &ldquo;speech-to-text&rdquo; and it was a feature on Android
already back in the v2 days. (Marry speech-to-text up with a natural language parser, and you have
Siri&mdash;which, remember, was its own company before Apple acquired them.) No, Alexa is not going to
revolutionize the world any more than Siri did&mdash;the act of talking to a machine is not particularly
new, and it&rsquo;s only as good as the services that sit behind the parser and can &ldquo;hook in&rdquo; to the
parsed text. &ldquo;Cortana, fire up StarCraft 2&rdquo; is easy to parse and start an application; &ldquo;Cortana,
fire up StarCraft 2, and find me a random Hard co-op match as Artanis&rdquo; requires not just firing
up an application, but also &ldquo;hooking&rdquo; inside the application to know how to carry out the rest of
the request. That requires an API platform that all applications can hook into, provide, and describe
(in natural-text terms) to the voice-control system. That is not going to be easy to define, adopt,
or test. (A rough sketch of what such an API might look like follows this list.)</li>
</ul>
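<p>Since I just claimed those swerve-or-continue decisions will be encoded in software somewhere, here
is a minimal, entirely hypothetical Java sketch of what that encoding might look like. Every class,
name, and number below is mine, invented purely for illustration&mdash;real autonomous-vehicle planners
are vastly more sophisticated&mdash;but some module, somewhere, has to rank outcomes like these:</p>
<pre><code>// Hypothetical sketch only; invented names and numbers throughout.
class Situation {
    boolean canStopInTime;   // is there braking distance left?
    int adjacentVehicles;    // traffic occupying the escape path
}

enum Action { BRAKE, SWERVE, CONTINUE }

class CollisionPlanner {
    // Rough expected-casualty estimate per action; the weights are made up,
    // and picking them is precisely the ethical problem.
    static double cost(Action a, Situation s) {
        if (a == Action.BRAKE)  return s.canStopInTime ? 0.0 : 0.9;
        if (a == Action.SWERVE) return 0.3 * s.adjacentVehicles;
        return 0.9;             // CONTINUE: the rider is almost certainly hit
    }

    static Action choose(Situation s) {
        Action best = Action.BRAKE;
        for (Action a : Action.values())
            if (cost(a, s) &lt; cost(best, s)) best = a;
        return best;   // a split-second human judgment, reduced to comparing doubles
    }
}</code></pre>
<p>The uncomfortable part isn&rsquo;t the loop; it&rsquo;s that somebody had to pick the 0.3 and the 0.9.</p>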
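<p>And since I argued above that voice control lives or dies on applications being able to describe
their &ldquo;hooks&rdquo; to the voice system, here is a minimal Java sketch of the kind of intent-description
interface that argument implies. To be clear, no such cross-application standard exists today; the
interface, names, and templates below are all hypothetical:</p>
<pre><code>import java.util.List;
import java.util.Map;

// Hypothetical sketch; no such cross-application standard actually exists.
interface VoiceIntentProvider {
    // Natural-language templates the app registers with the voice system.
    List&lt;String&gt; intentTemplates();

    // Called by the voice system when an utterance matches a template;
    // "slots" holds the bound parameters (difficulty, commander, etc.).
    void handleIntent(String template, Map&lt;String, String&gt; slots);
}

// What a game might plausibly register:
class StarCraftIntents implements VoiceIntentProvider {
    public List&lt;String&gt; intentTemplates() {
        return List.of("find me a random {difficulty} co-op match as {commander}");
    }
    public void handleIntent(String template, Map&lt;String, String&gt; slots) {
        // e.g. slots = { difficulty: "Hard", commander: "Artanis" }
        queueCoopMatch(slots.get("difficulty"), slots.get("commander"));
    }
    private void queueCoopMatch(String difficulty, String commander) {
        // game-specific matchmaking would go here
    }
}</code></pre>
<p>The hard part isn&rsquo;t this interface; it&rsquo;s getting every application vendor to agree on one, and
getting the parser to bind &ldquo;Hard&rdquo; and &ldquo;Artanis&rdquo; into those slots reliably.</p>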
<p>On a personal note, several predictions come to mind:</p>
<ul>
<li><strong>Ted will celebrate his one-year anniversary at Smartsheet in September.</strong> I&rsquo;m optimistic about these
guys, and the things we can do together. I&rsquo;m looking forward to taking them into the developer limelight
in a variety of different ways.</li>
<li><strong>Ted will do less speaking this year.</strong> My new role actually encourages me to help develop new talent
for my employer to go out and do the actual speaking, so while I&rsquo;m definitely down for doing a few
conferences this year, it&rsquo;s not going to be more than 12, total, for the calendar year. I enjoy speaking,
but I&rsquo;m looking to be a lot more careful about where I speak now.</li>
<li><strong>Ted will not be renewed as a Microsoft MVP.</strong> Actually, this appears to be fact, not a prediction.
MVP renewals for the January cycle went out already, and I didn&rsquo;t receive one. Fortunately, most of
the stuff I care about in the Microsoft world is all open-source (or moving that way) anyway, and
while it&rsquo;s been nice being on the MVP mailing lists, there&rsquo;s really been nothing there that&rsquo;s been
all that insightful or amazing. (And, fortunately, living in Redmond makes it trivially easy to get
together with anybody on a product team if I really want or need to, and I am privileged to call many
of the people on those teams &ldquo;friend&rdquo;.) It would&rsquo;ve been 14 years, but as we Stoics say, &ldquo;All good things,
in time, must come to an end.&rdquo;</li>
<li><strong>Ted will look to engage with other tech companies beyond Microsoft.</strong> Google just started a new
MVP-like program, and I&rsquo;ve been teaching Android and Angular and some Google Cloud Platform stuff for
a while, so perhaps they&rsquo;ll welcome me into their fold.</li>
<li><strong>Ted will continue to teach at UW.</strong> I&rsquo;ve been guest-lecturing at UW for the past three years now,
and I&rsquo;m loving it. The students are bright, eager, and a helluvalot smarter than I was at that age.
They&rsquo;re an incredible joy to teach.</li>
<li><strong>Ted will look to publish a few mobile apps.</strong> I&rsquo;ve had a few ideas floating around for a while, but
just never really made the time to do it. Even if they never turn a dime in profit, I&rsquo;m long overdue
for having a few apps in the respective mobile stores.</li>
<li><strong>Ted will continue to write for various tech &lsquo;zines.</strong> I love having the back-page editorial at
CODE Magazine, the column in MSDN, and the various series on developerWorks, among others. I fully
intend to keep all that going at full speed. (And I&rsquo;m always looking for new outlets, if anybody has
any leads on paid technical content gigs!)</li>
<li><strong>And finally, Ted will try to blog more.</strong> The perennial projection. I&rsquo;ve got much to blog about,
including the patterns series, as well as some interesting themes and ideas floating around the ol&rsquo;
brain pan.</li>
</ul>
<p>Happy Holidays, and thanks for reading!</p>
<h1>Intellectual Honesty</h1>
<p><em>Thu, 07 Jul 2016</em> &mdash; <a href="http://blogs.tedneward.com/post/intellectual-honesty/">http://blogs.tedneward.com/post/intellectual-honesty/</a></p>
<p><em>tl;dr</em> At last night&rsquo;s Seattle Languages meeting, I was reminded of what intellectually-honest
debate does and does not look like; then, as part of the discussions and argument around the
tragic deaths of several black men at the hands of police, I was presented with a link to a
page entitled &ldquo;Ten Signs of Intellectual Honesty&rdquo;. This is good material.</p>
<p>First off, the original link:
<a href="https://designmatrix.wordpress.com/2010/11/14/ten-signs-of-intellectual-honesty-2/">Ten Signs of Intellectual Honesty</a>.
I&rsquo;m going to be quoting from it liberally, however, in case you don&rsquo;t want to click through. (But
please do so at least once sometime, so the author gets their just kudos for posting such awesomeness.)</p>
<p>With no further ado&hellip;.</p>
<h4 id="do-not-overstate-the-power-of-your-argument:cefd5e97ea5748fefac20b28ba402a83">Do not overstate the power of your argument.</h4>
<p>&ldquo;One&rsquo;s sense of conviction should be in proportion
to the level of clear evidence assessable by most. &hellip; Intellectual honesty is most often associated
with humility, not arrogance.&rdquo; The humility thing really strikes a chord with me.
Many years ago, I had the chance to conduct a one-on-one CNN-style
interview with Bjarne Stroustrup (the creator of C++, if you&rsquo;re not familiar with the name), and one
of the first things that struck me was how quick he was to disavow the things he <em>doesn&rsquo;t</em> know.
There is no braggadocio, no &ldquo;look at the cool shit I&rsquo;ve created&rdquo;, nothing. Brian Goetz (the Java language
architect at Oracle) is similarly quite ready to admit what&rsquo;s outside his wheelhouse. Venkat
Subramaniam is one of the most open minds I&rsquo;ve ever met. And so on and so on and so on; the bigger the
&ldquo;name&rdquo; in Computer Science, the more humble they tend to be. (With a few exceptions.) I have since
taken this to mean that the louder you crow about yourself, the less others are willing to crow about
you, which is probably because there&rsquo;s not all that much to crow about.</p>
<p>&ldquo;If someone portrays their opponents as being either stupid or dishonest for disagreeing, intellectual
dishonesty is probably in play.&rdquo; Alas, it&rsquo;s not always that easy; too often, genuinely-interested
people are arguing over the facts, because we&rsquo;re too busy clinging to the facts that we like and
ignoring the ones we don&rsquo;t.</p>
<h4 id="show-a-willingness-to-publicly-acknowledge-that-reasonable-alternative-viewpoints-exist:cefd5e97ea5748fefac20b28ba402a83">Show a willingness to publicly acknowledge that reasonable alternative viewpoints exist.</h4>
<p>&ldquo;The alternative views do not have to be treated as equally valid or powerful, but rarely is it the
case that one and only one viewpoint has a complete monopoly on reason and evidence.&rdquo; This is even
more true in areas where clear objective evidence is simply lacking. I will freely admit that people
can build non-trivial systems in Perl; that doesn&rsquo;t change the fact that
<a href="http://blogs.tedneward.com/post/so-i-dont-like-perl-sue-me/">I dislike the language</a>.
Nor does it change the fact that it can be useful. I have my reasons for why I don&rsquo;t like it, and
we can debate as to whether those reasons are legitimate or pure opinion, but
that doesn&rsquo;t mean it isn&rsquo;t useful to people who&rsquo;ve used it in the past.</p>
<h4 id="be-willing-to-publicly-acknowledge-and-question-one-s-own-assumptions-and-biases:cefd5e97ea5748fefac20b28ba402a83">Be willing to publicly acknowledge and question one&rsquo;s own assumptions and biases.</h4>
<p>&ldquo;All of us rely on assumptions when applying our world view to make sense of the data about the world.
And all of us bring various biases to the table.&rdquo;
Another quote from Stroustrup: &ldquo;The more I know, the more I know I don&rsquo;t know.&rdquo; If the guy who&rsquo;s
forgotten more about programming languages than I will ever know can say that, then there&rsquo;s no room
for me to stand up and insist that somehow I am never wrong. I think I know things, but there&rsquo;s
always the chance that I got the information wrong, the information I got was wrong when I got it,
or the situation has changed since I got that information.</p>
<h4 id="be-willing-to-publicly-acknowledge-where-your-argument-is-weak:cefd5e97ea5748fefac20b28ba402a83">Be willing to publicly acknowledge where your argument is weak.</h4>
<p>&ldquo;Almost all arguments have weak spots, but those who are trying to sell an ideology will have great
difficulty with this point and would rather obscure or downplay any weak points.&rdquo; (Yeah, I really
don&rsquo;t have a whole lot more to add to that.)</p>
<h4 id="be-willing-to-publicly-acknowledge-when-you-are-wrong:cefd5e97ea5748fefac20b28ba402a83">Be willing to publicly acknowledge when you are wrong.</h4>
<p>&ldquo;Those selling an ideology likewise have great difficulty admitting to being wrong, as this undercuts
the rhetoric and image that is being sold.&rdquo;
This is probably the hardest thing in the world to do, and it&rsquo;s why, when we do it at all, we so often hedge with these weasel-
word qualifiers. &ldquo;If I offended anyone, I am sorry that they were offended.&rdquo; Or &ldquo;I was perhaps
mistaken about the degree of truth in the statements you and I exchanged during that conversation.&rdquo;
These are not admissions of incorrectness. These are attempts to salvage the ego and balm the
uncomfortable feeling that comes with being incorrect about something.</p>
<p>Make the resolution right now: Admit when you are wrong at least once a day. That&rsquo;s mine.</p>
<h4 id="demonstrate-consistency:cefd5e97ea5748fefac20b28ba402a83">Demonstrate consistency.</h4>
<p>&ldquo;A clear sign of intellectual dishonesty is when someone extensively relies on double standards.
Typically, an excessively high standard is applied to the perceived opponent(s), while a very
low standard is applied to the ideologues&rsquo; allies.&rdquo; It&rsquo;s hard sometimes to see, but this is part
of why I think I engage in some of these endeavors&mdash;not to convince anybody of anything, but to
find out where my own thinking is not being consistent and/or &ldquo;fair&rdquo;.</p>
<h4 id="address-the-argument-instead-of-attacking-the-person-making-the-argument:cefd5e97ea5748fefac20b28ba402a83">Address the argument instead of attacking the person making the argument.</h4>
<p>&ldquo;Ad hominem arguments are a clear sign of intellectual dishonesty. However, often times, the
dishonesty is more subtle. For example, someone might make a token effort at debunking an
argument and then turn significant attention to the person making the argument, relying on stereotypes,
guilt-by-association, and innocent-sounding gotcha questions.&rdquo; Yeah, this starts to sound like just
about every &ldquo;flame war&rdquo; I&rsquo;ve ever seen go on between people who don&rsquo;t have respect for one another.
Including a few I&rsquo;ve been in. (And apologies, by the way, to anyone I made ad hominem attacks
against in the past, intentionally or accidentally. I don&rsquo;t recall any, but I&rsquo;m sure there&rsquo;s more
than a few across my past.)</p>
<h4 id="when-addressing-an-argument-do-not-misrepresent-it:cefd5e97ea5748fefac20b28ba402a83">When addressing an argument, do not misrepresent it.</h4>
<p>&ldquo;A common tactic of the intellectually dishonest is to portray their opponent’s argument in straw
man terms. &hellip; Typically, such tactics eschew quoting the person in context, but instead rely heavily
on out-of-context quotes, paraphrasing and impression. When addressing an argument, one should show
signs of having made a serious effort to first understand the argument and then accurately represent
it in its strongest form.&rdquo; Sometimes, unfortunately, in the tone-less medium of the Internet, it&rsquo;s
too easy to interpret genuine questions for clarification or additional information around an argument
as a precursor to a strawman, which is why as a general rule, if I&rsquo;m seeking additional information,
I try to just keep it to the question and nothing else&mdash;no interpretation and no &ldquo;spin&rdquo;. (I will
also be the first to admit that it&rsquo;s a favorite tactic of mine, to do exactly the opposite: Ask the
question, anticipate the answer, and provide the &ldquo;devastating&rdquo; counterargument to what I think the
answer will be. This is my commitment to try and break that habit.)</p>
<h4 id="show-a-commitment-to-critical-thinking:cefd5e97ea5748fefac20b28ba402a83">Show a commitment to critical thinking.</h4>
<p>Want to start? Pick up a book on philosophy. Seriously. This is a subject that is all about doing
nothing but critical thinking&mdash;there&rsquo;s little in the way of repeatable, observable experiments that
can be done in philosophy to arrive at an empirical truth. (In fact, one could argue that as soon
as we can do that in a subject, it becomes &ldquo;science&rdquo; instead of philosophy.)</p>
<p>Best thing you can do for your career, honestly.</p>
<p>The original post contains a quick link to
<a href="https://designmatrix.wordpress.com/2009/01/24/critical-thinking/">this</a>, which contains the
following workable list:</p>
<ol>
<li>gather complete information – more than one source</li>
<li>understand and define terms (make others define terms, too)</li>
<li>question the methods by which results were derived</li>
<li>question the conclusion: do the facts support it? is there evidence of bias? remember correlation does not equal causation.</li>
<li>uncover assumptions and biases</li>
<li>question the source of information</li>
<li>don’t expect all the answers</li>
<li>examine the big picture</li>
<li>look for multiple cause and effect</li>
<li>watch for thought-stopping sensationalism</li>
<li>understand your own biases and values</li>
</ol>
<p>From Human Biology: Health, Homeostasis, and The Environment, 3rd Edition, by Daniel D. Chiras.</p>
<p>Again, all of these are things that philosophical thinking tends to drum out of you. It begins
with asking questions, and then questions about the questions.</p>
<h4 id="be-willing-to-publicly-acknowledge-when-a-point-or-criticism-is-good:cefd5e97ea5748fefac20b28ba402a83">Be willing to publicly acknowledge when a point or criticism is good.</h4>
<p>&ldquo;If someone is unable or unwilling to admit when their opponent raises a good point or makes a
good criticism, it demonstrates an unwillingness to participate in the give-and-take that
characterizes an honest exchange.&rdquo; More importantly, it demonstrates that you can separate the
argument from the arguer&mdash;and gets the egos out of the way. (By the way, this holds for any
criticism&mdash;if you&rsquo;re being criticized over something, including things like a code review,
finding at least one thing in the critique that you can agree with in any way can help to separate
your own ego out of the discussion.)</p>
<hr />
<p>This whole blog is good, if you&rsquo;re a fan of science (with an emphasis on biology, it seems).</p>
<h1>It is too possible</h1>
<p><em>Tue, 21 Jun 2016</em> &mdash; <a href="http://blogs.tedneward.com/post/it-is-too-possible/">http://blogs.tedneward.com/post/it-is-too-possible/</a></p>
<p><em>tl;dr</em> Once again I find myself in the position of needing to call BS on a blog
post and deconstruct it: Yes, it is possible to be a good .NET developer, and
here&rsquo;s why.</p>
<p>First, as always, if you&rsquo;ve not read the original,
<a href="http://codeofrob.com/entries/why-you-cant-be-a-good-.net-developer.html">check it out</a>.
Again, I&rsquo;ll be quoting from it, so you needn&rsquo;t remember all of it, but read it once
just to get a good idea of what&rsquo;s there.</p>
<p>Now, let&rsquo;s deconstruct.</p>
<h2 id="why-you-can-t-be-a-good-net-developer:d88c54cf1b09b6b8fc1a15e271507014">Why you can&rsquo;t be a good .NET developer</h2>
<p>Mr Ashton seems to be basing his argument on a couple of things:</p>
<ol>
<li>Developers are leaving .NET.</li>
<li>The reason developers are leaving .NET isn&rsquo;t because of &ldquo;self-loathing&rdquo;, but because
they work in shops that cater to the lowest-common-denominator.</li>
<li>&hellip; There is no third reason. Just lots of use of the pejorative word &ldquo;derpy&rdquo;.
Because that will make the point far better than any logic could.</li>
</ol>
<p>Let&rsquo;s break those down:</p>
<h3 id="developers-are-leaving-net:d88c54cf1b09b6b8fc1a15e271507014">Developers are leaving .NET</h3>
<p>Mr Ashton opens with:</p>
<blockquote>
<p>The reason why people &ldquo;leave&rdquo; .NET is &mdash;</p>
</blockquote>
<p>This one will be hard to invalidate, because there are two forms of evidence that
we can pull into play: empirical evidence, which is hard to gather objectively,
and anecdotal evidence, which is easy to gather (for each of us individually, anyway),
hard to refute, and yet hardly industry-indicative.</p>
<p>Here&rsquo;s what I mean. Assume that I have observed that when the door slams, people
jump. As a matter of fact, I can go around and slam doors, and watch everybody
within close proximity of the door jump. That means that, anecdotally, 100% of the
population will jump when I slam the door. Yay science!</p>
<p>Of course, I probably should mention that none of the people being observed are
deaf, and none of them have been through this particular experiment before.
(As it turns out, you can get used to anything, including the noise of slamming
doors, and thus not flinch, despite what anecdotal evidence may tell you.)</p>
<p>But perhaps this is a tangent, because Mr Ashton never comes back to this point
of developers leaving&mdash;he prefers instead to focus on the rationale, which is:</p>
<h3 id="lowest-common-denominator-enterprise-shops:d88c54cf1b09b6b8fc1a15e271507014">Lowest-common-denominator enterprise shops</h3>
<p>He continues with:</p>
<blockquote>
<p>To work in a development shop with a team is to continually cater for the lowest
common denominator of that team and the vast majority of software shops using
.NET have a whole lot of lowest common denominator to choose their bad
development decisions for. &hellip; It&rsquo;ll not happen because as long as you&rsquo;re
working on a platform that is primarily used by derpy enterprise shops, you
will continually be held back because those derpy enterprise shops are
continually be held back by the derpy enterprise developers that work in the
derpy enterprise shops.</p>
</blockquote>
<p>In case you weren&rsquo;t quite clear on his position, he goes on to say:</p>
<blockquote>
<p>It isn&rsquo;t self loathing, it&rsquo;s self preservation and an eventual realisation
that you can&rsquo;t actually progress so long as you&rsquo;re being held back by bad
decisions made to cater for the slow and the stupid. Self loathing is just
an intermediate stage that people go through while they still believe they
can make an impact on the environment around them by caring and shouting
into the void to enact tiny changes that help nobody.</p>
</blockquote>
<p>Sounds to me like Mr Ashton has some angst he needs to work out in therapy.</p>
<p>Perhaps he has some strong empirical evidence to back this position? He cites
two pieces of anecdotal evidence, and then makes some broad sweeping
generalizations:</p>
<blockquote>
<p>Tangible examples? I remember well the insistence of one boss that we use
TFS because some developers would find it hard to use git.</p>
</blockquote>
<p>Or, perhaps, because the boss had some other good reasons not to use Git. I
can imagine a couple:</p>
<ul>
<li>Git often implies GitHub, which can be costly for businesses if the source
needs to remain closed-source. By all means, there are alternatives to GitHub,
one of my favorites being <a href="http://bitbucket.com">BitBucket</a>, but developers
frothing at the mouth over Git often can&rsquo;t comprehend the idea of using it
without GitHub.</li>
<li>Managers often need their reports. TFS, for all of its flaws, has a better
integration story in the ALM cloud than GitHub does, and management that is
used to (as in, built processes around) some of these reports will find it
is a non-trivial cost to move to something other than TFS.</li>
<li>Why does which source control system you use matter all that much? Developers
often get all worked up over this, but frankly, teams spent years working
with non-distributed systems like Subversion and CVS before Git came along.
As a matter of fact, unless your team spends significant amounts of time
offline from the source-control server, there&rsquo;s a strong argument to be made
that you don&rsquo;t need to use Git or Mercurial at all.</li>
</ul>
<blockquote>
<p>I remember the steadfast committal to ASP.NET web forms because the
&ldquo;new concepts&rdquo; in ASP.NET MVC were going to take too long for the team to
become productive in.</p>
</blockquote>
<p>And that&rsquo;s a reasonable concern. Look, developers love to chase the bright new
shiny, whatever that happens to be, and frankly they&rsquo;re more than willing to
learn at their employers&rsquo; expense at times, making mistake after mistake after
mistake until they&rsquo;ve mastered the new shiny in question&mdash;at which point they
start chasing after the next new bright shiny. Quite often, these bright shiny
chasedowns don&rsquo;t yield any practical benefit to the end-user&mdash;in fact, they
achieve the exact opposite. Working with new languages, platforms, and tools
often leads to more bugs in the early days, and if there&rsquo;s no commensurate
improvement to the end users (in the long term, forget about the short term),
then there&rsquo;s absolutely no business reason to go down this path.</p>
<p>And I&rsquo;ll be honest here, I actually <em>agree</em> that WebForms is vastly inferior
to MVC. My former company had a client that was heavily invested in WebForms,
and refused all interest in moving to MVC, and their code was literally
impossible to automate testing around because of it. As a long-term strategic
decision, it was a poor one. But as a short-term tactical one, it made sense.</p>
<p>And that&rsquo;s the bigger picture: sometimes management needs to make short-term
tactical decisions that aren&rsquo;t optimal in the long-term. Eventually, if the
company has strong management, that course gets corrected. But that&rsquo;s not a
flaw in the technology, it&rsquo;s a flaw in the management structure using it. I
can see (and have seen) Java shops making the same decisions. C++ shops are hardly
immune to this. Nor are COBOL shops.</p>
<p>The answer to this, of course, is to make sure that the team is spending some
time coming up to speed on the bright shiny things that strategically are
important to the company; my preference would be that they spend at least
two hours a week doing some guided study around those topics. But that&rsquo;s
another topic for another day.</p>
<blockquote>
<p>There is now this furore <em>[sic]</em> over .NET core and the new thing in the
tiny 0.001% of people that care are whether they persist in using Windows
or switch to more productive environments.</p>
</blockquote>
<p>Of course we assume that &ldquo;more productive environments&rdquo; is a shadowy reference
to the Mac or a Linux environment, but frankly, I find that I&rsquo;m just as
productive in a Windows-based world as I am in a Mac-based world. The tools
are different, and the approach is different, but there&rsquo;s a reason why I
still make use of both environments on a regular basis.</p>
<p>And, by the way, .NET Core is an as-yet-unreleased environment. Trying to
point to that as a failure of the .NET platform as a technology is like
pointing to a half-finished supercarrier sinking. Of course it sank&mdash;it wasn&rsquo;t
finished! Funny thing is, most half-finished environments or ships will also
sink. Shall we compare the Ruby 0.9 release against the .NET 4.6 release and
see which one wins, while we&rsquo;re at it?</p>
<blockquote>
<p>Of course @aliostad gets it right here and points out that the primary
&ldquo;Important Thing&rdquo; should be a focus on functional programming in languages
like F# and of course the reason it doesn&rsquo;t happen is because &ldquo;it&rsquo;s too
hard for most people in our team&rdquo;.</p>
</blockquote>
<p>I love the unattributed quote&mdash;it&rsquo;s so easy to pretend that you&rsquo;re quoting
somebody without having to actually name anybody or any particular context,
thus making it hard to refute. It&rsquo;s &ldquo;almost as if you can&rsquo;t really find anybody
to substantiate your arguments with actual fact&rdquo;.</p>
<p>But in this particular case, I&rsquo;ve heard it from several development managers,
speaking to both F# and Scala. And I&rsquo;ve challenged them on several occasions,
to let me spend a few hours going over the concept with their team, and see
what the team thinks afterwards. In each of those situations, the team has
&ldquo;gotten it&rdquo;, and although they&rsquo;re not functional programmers ready to embrace
Haskell as the One True Programming Language (which is ludicrous anyway),
they understand the core concepts and can see where and how it might be
applied. Or not applied, as the case may be. (Because, quite honestly, there
are a number of cases where a functional mindset doesn&rsquo;t really fit the problem
at hand. And this is why both of those languages&mdash;F# and Scala&mdash;are actually
object/function hybrid languages, and not strictly functional.)</p>
<p>The larger issue here, though, is that managers understand&mdash;in ways that
software developers don&rsquo;t seem to, as Mr Ashton emblemizes&mdash;that learning
new syntax and concepts and approaches takes time. Absent any formal time
set aside for the team to learn those new things, that time will most often come
at the expense of the project on which these developers are working. And while
sometimes that means that the project will come in early (because learning the
new thing helped shorten the overall release cycle), far more often than not,
it means the project gets pushed back. Which leads management to a pretty
inescapable conclusion:</p>
<blockquote>
<p>Project + Tool(new) = delays</p>
</blockquote>
<p>Perhaps management isn&rsquo;t the derp here, after all.</p>
<h2 id="but-why-is-this-net-s-fault:d88c54cf1b09b6b8fc1a15e271507014">But why is this .NET&rsquo;s fault?</h2>
<p>This is the part that doesn&rsquo;t seem to be elucidated in Mr Ashton&rsquo;s rant; he
cites his two anecdotal examples (only one of which is actually .NET specific;
one could see the TFS/git argument taking place in a Java shop just as easily
as in a .NET shop), then makes his two sweeping generalizations, but without
any actual evidence that this is somehow localized to the .NET segment of the
industry. News flash, Mr Ashton: This exact same argument happens in Java shops
(&ldquo;Why can&rsquo;t we use Groovy/Scala/Clojure/Kotlin?&rdquo;) and, as companies invest more
deeply into Ruby and/or NodeJS (assuming they ever do&mdash;Ruby seems to be fading
fast from the enterprise landscape, for whatever reasons), they will make the
same kinds of decisions around those platforms, as well.</p>
<p>Because, Mr Ashton, it turns out that this isn&rsquo;t a question of the tool or the
language, it&rsquo;s a question of the environment in which they are being used. And,
as any &ldquo;derpy&rdquo; enterprise development manager knows, the bleeding edge is called
the bleeding edge for a pretty good reason.</p>
<h2 id="well-that-means-it-s-the-derpy-co-workers-fault:d88c54cf1b09b6b8fc1a15e271507014">Well, that means it&rsquo;s the derpy co-workers&rsquo; fault</h2>
<p>Let&rsquo;s be clear: fundamentally, Mr Ashton&rsquo;s arguments really aren&rsquo;t with .NET, whether
he realizes it or not. He&rsquo;s actually angry at his industry colleagues at these
&ldquo;derpy&rdquo; enterprise shops, because (in his not-so-subtle opinion) they&rsquo;re the ones
that management is trying to accommodate with their &ldquo;derpy&rdquo; decisions. Again, just
to be clear:</p>
<blockquote>
<p>It isn&rsquo;t self loathing, it&rsquo;s self preservation and an eventual realisation
that you can&rsquo;t actually progress so long as you&rsquo;re being held back by bad
decisions made to cater for the slow and the stupid.</p>
</blockquote>
<p>It&rsquo;s pretty clear that Mr Ashton considers most enterprise software developers to
be &ldquo;slow and stupid&rdquo;. This is elementary logic: a manager doesn&rsquo;t make a decision
for a team based on the consequences for just one member of the team, but on the
basis of what the majority of the team has or lacks. Therefore, in Mr Ashton&rsquo;s
view, the majority of enterprise developers are &ldquo;derps&rdquo; who are simply too slow
and too stupid to understand the new things. What magnificent benefits could be
had, if only they would stop plodding down the lane like oxen, and instead gallivant
and run through the forests, wind whipping through their hair as they leap over
obstacles like the rest of the stallions do!</p>
<p>Except that oxen never break their ankles attempting to leap an obstacle that is
too high for them. Oxen will never dump their riders on the ground because they
ran under a branch that hung too low for the human on their back. And
oxen will never decide on their own to simply refuse to make a particular jump
because, frankly, they just don&rsquo;t want to.</p>
<p>Roads were created to help eliminate the obstacles, and make it easier for
farmers to get their goods to market where, you know, they can sell them and
stuff. Which is used to buy other things, like food for the oxen. And a roof
over their heads in the winter. And the brush the farmer uses to comb out the
snarls in their fur. And so on.</p>
<p>(OK, all analogies break down eventually.)</p>
<p>Truth is, there&rsquo;s a lot of good reasons to stay on the road. Roads are something
that anyone can navigate, whether oxen, stallion, or plain ol&rsquo; human.
Roads also have this funny way of making it clear what is interesting in the
world&mdash;if a particular town starts to grow, more people will start traveling
to it, and the roads will&mdash;organically&mdash;stabilize and get smoother and easier
to travel. In fact, funny thing&mdash;if the road gets too much traffic on it, it will start
to degrade, at which point people will step in and pave the thing. And put up
signs to regulate traffic on it. And so on, each time reducing the obstacles
on the road itself and making it easier to travel.</p>
<p>Don&rsquo;t be hating the &ldquo;derpy&rdquo; enterprise developer because they don&rsquo;t want to go
crashing through the forests like stallions do. If you don&rsquo;t want to be one of
those &ldquo;derpy&rdquo; enterprise developers, then don&rsquo;t be one. But considering those
folks are the ones that wrote (and continue to maintain) the code that cashes
your paycheck every month, maybe you&rsquo;d be better off thanking them, instead of
insulting them.</p>
The Value of Failurehttp://blogs.tedneward.com/post/the-value-of-failure/
Sun, 29 May 2016 17:33:43 -0700http://blogs.tedneward.com/post/the-value-of-failure/
<p><em>tl;dr</em> Celebrating success is always a welcome thing. But in a lot of ways,
the people we should be celebrating are the ones who failed, and then learned
from it. As a matter of fact, there&rsquo;s a reasonable correlation to be drawn
here&mdash;that those who are truly successful are the ones who failed first.</p>
<p>This is actually a post that I&rsquo;ve been holding on to for a while now,
having originally been inspired by a
<a href="https://hbr.org/2016/03/companies-cant-be-great-unless-theyve-almost-failed">Harvard Business Review</a>
article that goes into this more deeply. In it, the author states:</p>
<blockquote>
<p>One intriguing feature of these enormous success stories is that so many of
them are little-known companies in ordinary, sometimes downright boring
industries: railroads, health insurance, back-office automation. &hellip;
But the more important part of the story&hellip; is that every one of these star
performers faced at least one “near-death experience” during the course of
its long-term success. I don’t mean a few quarters of sluggish growth or a
one-time product flop, but a radical shift in its market, a major technology
disruption, or a disastrous strategic bet that threatened the company’s
very existence.</p>
</blockquote>
<p>Many, many years ago, when I was contracting for this guy
<a href="https://www.mountaingoatsoftware.com/">Mike Cohn</a>, he mentioned to a good
friend of mine (who was working there with me) that &ldquo;Ted needs to fail. Twice.&rdquo;
Needless to say, Mike didn&rsquo;t say it to my face. I wouldn&rsquo;t have heard it for the
advice that it was&mdash;I&rsquo;d have seen it (as many people do, when I tell this story)
as a desire to see me brought low or something. In truth, it was anything but that.
Mike also was the guy who told me that, &ldquo;You have so much book knowledge, people forget
that you don&rsquo;t have a lot of experience, and expect a lot more out of you than
you&rsquo;re capable of delivering. It&rsquo;s like being 6-foot-4-inches at 13; you look
like an adult, but you&rsquo;re not one, yet.&rdquo;</p>
<p>In other words, Mike was channeling the old adage: &ldquo;Good decisions come from wisdom.
And wisdom comes from bad decisions.&rdquo;</p>
<p>(And, sure enough, once I had failed, twice, it was then that I started making real
progress in my career as a programmer/consultant/architect. Mike had that one
absolutely spot-on.)</p>
<p>The HBR author talks about how companies trying to avoid this near-death experience
just somehow don&rsquo;t seem to &ldquo;get it&rdquo;:</p>
<blockquote>
<p>In an effort to understand the new logic of change and the emerging rules of
success, we convened a conference around the theme “How Do You Overthrow a Successful
Company?” &hellip;. It was a gathering of executives, strategists, and change agents
from illustrious big companies who sensed that there were massive shifts on the
horizon and who were determined to reckon with those shifts and embrace a new
generation of business models, a new era of technology and communications, a new
level of customer expectations and sophistication. &hellip; In other words, they were
leaders who wanted their companies to win big in fast-moving times without a
near-death experience. It was a great idea for a conference, yet it amounted to
a hill of beans.</p>
</blockquote>
<p>He goes on to point out how all of the named companies in attendance had met
with nothing but struggles and difficulties&mdash;clearly the
conference didn&rsquo;t really help them all that much.</p>
<p>So what is it about failure that is necessary for success?</p>
<h2 id="failure-forces-retrospection:be1df5882f5a5e93f7c2825fbdeef676">Failure forces retrospection</h2>
<p>First of all, far more than any success, failure forces players to examine what&rsquo;s
going on and why they&rsquo;re not being successful. We see it all the time in sports:
the first-round draft pick gets to the big leagues, and doesn&rsquo;t really seem to take
the whole thing seriously, and (not surprisingly) performs disastrously. They&rsquo;re
accustomed to success, so they don&rsquo;t bother with the kind of internal reflection
and retrospection that is necessary to get better.</p>
<p>This happens with companies, too. When a company is successful, the investigation
into the mechanics of that success is limited, or else is captured as part of the
company&rsquo;s CEO tossing off some sound bites. &ldquo;We are a company of passionate individuals
who are committed to our firm&rsquo;s success&rdquo; seems to be the phrase <em>du jour</em>. Success
has everything to do with the company&rsquo;s effort, and nothing to do with any sort of
external factors that might have affected the results.</p>
<p>When you fail, though, suddenly people want answers, and the usual raft of platitudes
doesn&rsquo;t hold up anymore.</p>
<h2 id="failure-creates-desperation:be1df5882f5a5e93f7c2825fbdeef676">Failure creates desperation</h2>
<p>More importantly, though, when the company&rsquo;s on the brink, there&rsquo;s a sense of focus
and desperation that says &ldquo;Well, we&rsquo;ve got nothing left to lose, so let&rsquo;s go all-in
on this and see what happens.&rdquo; This isn&rsquo;t a new idea; <a href="https://en.wikipedia.org/wiki/Spanish_conquest_of_the_Aztec_Empire">Cortez used it in his conquest
of the Aztecs</a>.
By &ldquo;burning the boats&rdquo; (though history actually records that he scuttled the ships,
rather than burned them), Cortez created a &ldquo;We have no road back, we have to go
forward, or die&rdquo; mentality among his men. Their options were thus narrowed down to
exactly two: conquest or death. Robert Greene, in his book <a href="https://www.amazon.com/Strategies-War-Joost-Elffers-Books-ebook/dp/B000W9149K">&ldquo;The 33 Strategies of
War&rdquo;</a>,
calls this &ldquo;The Death-Ground Strategy&rdquo;.</p>
<blockquote>
<p>Quite often we feel somewhat lost in our actions. We could do this or that&ndash;we
have many options, but none of them seem quite necessary. Our freedom is a
burden&ndash;what do we do today, where do we go? Our daily patterns and routines help
us to avoid feeling directionless, but there is always the niggling thought that
we could accomplish so much more. We waste so much time. &hellip;
Over two thousand years ago, the Chinese strategist Sun-tzu came to believe that
listening to speeches, no matter how rousing, was too passive an experience to
have an enduring effect. Instead Sun-tzu talked of a &ldquo;death ground&rdquo;&ndash;a place where
an army is backed up against some geographical feature like a mountain, a
river, or a forest and has no escape route. Without a way to retreat, Sun-tzu
argued, an army fights with double or triple the spirit it would have on open
terrain, because death is viscerally present. Sun-tzu advocated deliberately
stationing soldiers on death ground to give them the desperate edge that makes
men fight like the devil. &hellip; The world is ruled by necessity: People change
their behavior only if they have to. They will feel urgency only if their lives
depend on it.</p>
</blockquote>
<p>Obviously, we&rsquo;d prefer not to hold death up to people&rsquo;s faces in order to get them
to react better at work, but sometimes, when the chips are down, the best thing to
do is to simply state, &ldquo;This is the only path forward; we either march, or we die.&rdquo;</p>
<h2 id="so:be1df5882f5a5e93f7c2825fbdeef676">So&hellip;.</h2>
<p>&hellip; what&rsquo;s the takeaway here?</p>
<p>Actually, it&rsquo;s pretty simple: Do you really want to succeed? The Death-Ground
Strategy says that you should basically put yourself into a position where you
either succeed, or die. If you&rsquo;re contemplating becoming a technical speaker,
quit your job and go &ldquo;all-in&rdquo; on speaking at technical conferences and
what-not. Either you will speak at the user group, or you will starve.</p>
<p>Yeah, I&rsquo;m not a huge fan of that. Try it if you like, but that seems like a huge
downside for very little upside. Maybe for an army trapped in a foreign land
whose only hope for survival is to fight its way back home, it works, but for
most of the rest of us, who are nowhere close to trapped behind enemy lines,
much less staring death in the face&hellip;. yeah, not so much.</p>
<p>But there is value in failure, if only because it forces us to engage in that
reflective and introspective process that helps us emerge stronger and better
over time. Which then suggests to me that practically speaking, the takeaway
here is to <strong><em>Put yourself into failure-potential situations</em></strong>. Take those
risks. Put your career and credibility on the line. Engage in things that you
aren&rsquo;t sure you can do, and because nobody likes to fail (including you!),
you&rsquo;ll put that much more effort into the exercise.</p>
<p>&ldquo;33 Strategies&rdquo; offers the following:</p>
<ul>
<li><strong>Stake everything on a single throw.</strong></li>
<li><strong>Act before you are ready.</strong></li>
<li><strong>Enter new waters.</strong></li>
<li><strong>Make it &ldquo;you against the world.&rdquo;</strong></li>
<li><strong>Keep yourself restless and unsatisfied.</strong></li>
</ul>
<p>So, for example, if you want to be a technical speaker, stand up and volunteer
to give a talk at the local user group. Matter of fact, volunteer to do four
talks over the span of the next year. (&ldquo;Four talks?! Are you out of your
mind?!? That&rsquo;s like one every three months!&rdquo;) Don&rsquo;t do just the same talk
four times&mdash;do four completely different, separate talks, which will then
force you to figure out how to write good talks in a compressed period of time.</p>
<p>Or, if your desire is to get a startup off the ground, commit to having an
MVP ready by the end of the calendar year 2016 without leaving your job yet.
Because let&rsquo;s face it, if you can&rsquo;t get the MVP done in six months while
holding down a regular full-time job, you won&rsquo;t be able to handle working the
80 hours a week that a startup will demand of you during the early days.
And you&rsquo;ll need that MVP before you can get any kind of angel or seed funding
anyway.</p>
<p>And when (not if!) you fall, figure out why, get back up, and keep going.</p>
Microsoft meets Open Sourcehttp://blogs.tedneward.com/post/microsoft-meets-open-source/
Sat, 28 May 2016 21:34:14 -0700http://blogs.tedneward.com/post/microsoft-meets-open-source/
<p><em>tl;dr</em> Hadi Hariri has made a few observations regarding the churn we&rsquo;re seeing in the Microsoft
open-source space (around .NET Core and ASP.NET Core, among other things). But I don&rsquo;t think this
is a permanent state of affairs; what I think is going on is that Microsoft is finding that
managing an open-source project is more than just owning the GitHub repo and reviewing pull
requests.</p>
<p>First off, let&rsquo;s get two points of fact out of the way.</p>
<p>One, Hadi&rsquo;s a friend. He and I have known each other for many years, and anything I say here,
I&rsquo;m comfortable saying in front of him. (Not that I think he&rsquo;d disagree with any of this, but
sometimes people see conflict where there isn&rsquo;t any.)</p>
<p>Two, Hadi&rsquo;s right about <a href="http://hadihariri.com/2016/05/27/rc-means-something/">everything he says</a>
when he talks about &ldquo;RC means something&rdquo;. Terms have meaning, and if you change that meaning
without your audience understanding that the meaning has changed, they become nonsensical and
useless. If an &ldquo;RC&rdquo; doesn&rsquo;t mean &ldquo;We&rsquo;re close to a final release&rdquo;, then it&rsquo;s not an &ldquo;RC&rdquo;, it&rsquo;s
a &ldquo;milestone&rdquo; or an &ldquo;alpha&rdquo;. Say what you mean, and mean what you say.</p>
<h2 id="hello-foss:cbde6e9af18a742ab524838612218025">Hello, FOSS</h2>
<p>Here&rsquo;s the thing, though: For the most part, this is the first large-scale exercise Microsoft
has undertaken in the FOSS world, and they&rsquo;re going through the inevitable growing pains that
come with it. There are a lot of lessons they&rsquo;re learning the hard way&mdash;adjusting
schedules and rearranging priorities out in the open, more than a little messily,
is just one of them.</p>
<p>Microsoft isn&rsquo;t the only company who&rsquo;s run into this. Oracle had its own share of growing pains
when it first acquired Java, and that was after the people running the projects had been
running them in the open-source world for quite some time. When Sun first open-sourced Java, it took
them close to eighteen months to migrate the build infrastructure, the code, and everything
else over to a format that would enable people to not only get the code and build on their
own, but be able to submit a pull request or patch. (As one of the first people to ever get
the JDK to build on Windows&mdash;and Lordy, what a pain that was&mdash;I have a little knowledge
of what was required just to shift the build infrastructure over to use the most recent tools.)</p>
<p>But even accomplished open-source projects run into this problem. Not all Linux releases have
gone smoothly. Spring&rsquo;s hit a few snags from time to time. And let&rsquo;s not forget the incredibly
<a href="https://blog.codinghorror.com/douchebaggery/">crowd-pleasing approach</a> that Rails
(and <a href="https://www.reddit.com/comments/25h1bv">its creator</a> ) have taken over the years.
Microsoft is, of course, not trying to play &ldquo;the DHH Card&rdquo; in any way (at least, that I can
see), but at the end of the day, managing that relationship between your developers and your
customers&mdash;particularly when developers are your customers&mdash;is a tricky thing, and in some
ways much harder to do from a position of transparency and openness.</p>
<h3 id="so-for-now-what:cbde6e9af18a742ab524838612218025">So&hellip; for now, what?</h3>
<p>Does this mean that developers should steer clear of .NET Core? Well&hellip;. that kinda depends.</p>
<p>If you&rsquo;re a developer who is accustomed to working with solidly-released software, and you&rsquo;re
not super-comfortable dealing with alphas/betas/Previews/RCs, then yes. Frankly, stay away
from .NET Core until it ships its &ldquo;1.0&rdquo; release, with no suffixes, adjectives, or qualifiers.</p>
<p>But truthfully, that&rsquo;s not a &ldquo;.NET OSS&rdquo; or even a &ldquo;Microsoft&rdquo; position; that&rsquo;s a position
that you should hold regardless of the thing in question. Java&rsquo;s had its own share of
&ldquo;Whoops! Sorry, that thing&rsquo;s not ready yet&rdquo; releases that they&rsquo;ve had to backtrack on, too.
(If you&rsquo;re a Java developer, you&rsquo;re familiar with the long-running cluster-f*ck that has been
Java Modules that may, one day, actually ship. Current signs suggest that it will be JDK9,
but we thought they&rsquo;d be in Java6, Java7 and Java8, too.) If you want to build things on
top of &ldquo;reliable&rdquo; software, then you want the releases that lack any sort of modifier or
qualifier to the name, regardless of who made it, whether that&rsquo;s Microsoft, Google, Apple,
Oracle, the open-source team behind the project, your own development team or even Linus
Torvalds Himself.</p>
<p>And let&rsquo;s be honest, the actual term&mdash;whether it be &ldquo;alpha&rdquo;, &ldquo;beta&rdquo;, &ldquo;RC&rdquo; or
&ldquo;Preview&rdquo;&mdash;is really sort of meaningless here. In an OSS world, these terms really just
aren&rsquo;t all that necessary anymore. They used to be an indication of intent on the part of
the software manufacturer, signaling how close to &ldquo;final&rdquo; the product was considered to be.
But in a world where you can look at the source, take it out for a test drive yourself,
examine the test cases (and test coverage) and see how well it handles your particular
scenarios by yourself, the terms really aren&rsquo;t necessary anymore.</p>
<p>So, yeah, if you like living on the bleeding edge, keep playing with .NET Core, but don&rsquo;t
complain when stuff changes, because it&rsquo;s &ldquo;qualified&rdquo; (meaning, its name has a
qualifier to it, so it&rsquo;s not yet shipped). It doesn&rsquo;t matter what the qualifier is; so long
as it remains qualified, it&rsquo;s still subject to change.</p>
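<p>If you want that rule-of-thumb as code, here&rsquo;s a toy Scala sketch (mine, and deliberately naive; real versioning schemes such as SemVer carry more structure than this): any version string with a suffix beyond the dotted numbers gets treated as still subject to change.</p>
<pre><code class="language-scala">// A toy sketch of the "qualified means unshipped" rule-of-thumb.
// The parsing here is deliberately naive; it is an illustration, not real tooling.
object VersionCheck {
  // "1.0.0" counts as shipped; "1.0.0-RC2" or "5.0.0-preview1" counts as qualified.
  def isQualified(version: String): Boolean =
    !version.matches("""\d+(\.\d+)*""")

  def main(args: Array[String]): Unit = {
    List("1.0.0", "1.0.0-RC2", "5.0.0-preview1").foreach { v =>
      val status = if (isQualified(v)) "still subject to change" else "shipped"
      println(s"$v: $status")
    }
  }
}
</code></pre>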
<p>And in the meantime, yeah, you can bet that Microsoft is going to learn the lessons it
needs to in order to figure out how to get along in this brave new world. Because&hellip;</p>
<p>Well, because they have to. This is the world in which they have chosen to live, and like
any new immigrants, it&rsquo;s going to take some time to get used to how things work here.
Over time, they&rsquo;ll adjust, and before long, they&rsquo;ll be the veterans on the block
teaching the new kids how it&rsquo;s done.</p>
<p>It just takes time.</p>
Logging Hourshttp://blogs.tedneward.com/post/logging-hours/
Tue, 19 Apr 2016 02:34:36 -0700http://blogs.tedneward.com/post/logging-hours/
<p><em>tl;dr</em> A recent DZone post lamented how logging hours makes the author &ldquo;die a little inside each time&rdquo;. I used
to feel the same way. Then I grew up and got over it.</p>
<p>As is typical with my posts, I ask you to <a href="https://dzone.com/articles/why-does-logging-hours-feel-so-offensive">go read the original</a>
(or in this case, the DZone copy of the original) before proceeding much further; I&rsquo;ll quote it, but it&rsquo;s always better to have the
source freshly in mind when reading the rebuttal.</p>
<p>And it is a rebuttal; just not necessarily aimed at him.</p>
<h3 id="the-act:9607b4af502b324786789d542cb3e122">The act</h3>
<p>The author, Matthew Casperson, opens with:</p>
<blockquote>
<p>Friday afternoon rolls around, and the obligatory call for weekly hours goes out. “Make sure your hours are up to date!” And I die a little inside. In truth, assigning hours to tickets is not actually that much work. As I have been told many times over, logging hours can’t take much more than a few minutes. So why does this simple act of corporate obedience feel like I have just been beaten over the head?</p>
</blockquote>
<p>He then goes on to cite four main reasons why this &ldquo;simple act of corporate obedience&rdquo; makes him feel &ldquo;beaten over the head&rdquo;:</p>
<ul>
<li>&ldquo;Because it feels so pointless.&rdquo;</li>
<li>&ldquo;Because it feels like a lack of trust.&rdquo;</li>
<li>&ldquo;Because I don&rsquo;t know where these numbers go.&rdquo;</li>
<li>&ldquo;Because it feels like a lazy management technique.&rdquo;</li>
</ul>
<p>He goes on to explain each of those four points in some detail, but I think the points pretty well stand for themselves
in summary form.</p>
<p>Truth be told, I have a hard time arguing with him. Or, rather, the version of me from about a decade ago has a hard
time arguing with him. Filling out timesheets felt so pointless and stupid, why bother? What if I just totally randomized
the numbers and sent that in? Would anybody actually notice? (Nobody ever did, by the way.)</p>
<p>But the 2016 version of me will, actually, argue this with him. Part of it is simply professionalism&mdash;if the client or
your employer says to do this thing, you do it, because that&rsquo;s the professional thing to do. (And I have every reason
to believe that Matthew is being a professional and doing it, lest there be any doubt there.)</p>
<p>More importantly, the 2016 version of me will also turn around and argue this for him, because everything he says screams
out to me that his management is Failing him.</p>
<p>Utterly, completely, and totally Failing him.</p>
<p>With a capital &ldquo;F&rdquo;.</p>
<h3 id="management-failures:9607b4af502b324786789d542cb3e122">Management Failures</h3>
<p>Let&rsquo;s start with the simple one first: When an employee talks about an act as being a &ldquo;lack of trust&rdquo;, usually that suggests
that other factors are in play. Matthew writes, &ldquo;&hellip; all this work and effort feels completely undermined by management
asking for an hour by hour account of my day. Am I that bad at my job that you really feel the need to know where every hour
of it was spent?&rdquo;</p>
<p>This is a smell that suggests that there is more at work here than just the logging of hours; employees do not feel
distrusted simply because management wants a time accounting. (There&rsquo;s good reasons for wanting this, which I&rsquo;ll
get to in a second.)</p>
<p>This is a smell that suggests that <em>the employees don&rsquo;t trust management</em>. It&rsquo;s really that simple;
trust, like most relationships, operates on a &ldquo;bank account&rdquo; kind of principle, and if employees are feeling like the
trust account is a little low, then it&rsquo;s really up to managers to (a) recognize that, and (b) work to put some trust back
into the account. (Of course, that does go both ways, which is what makes relationships hard.)</p>
<p>This is emphasized by his last point, about logging hours as being a &ldquo;lazy management technique&rdquo;. Listen to what he says
next: &ldquo;Meaningless numbers being interpreted in secret ways by management that never speaks to me&hellip;.&rdquo; Wow. Matthew&rsquo;s
management has a serious problem on their hands. &ldquo;Meaningless&rdquo;, &ldquo;secret&rdquo;, &ldquo;never speaks to me&rdquo;, those are three red flags
in just the first half of a single sentence. This is a statement of pure opinion, despite the allusion to factual studies
about employee disillusionment that follows in that paragraph, but the fact is, <em>employee feelings are just as legit as fact</em>.
The fact that an employee feels disrespected is every bit as important as whether or not they actually ever were. Emotions
are real things, and need to be treated as such.</p>
<p>I think a lot of the management Failure here stems from his third point.</p>
<p>When an employee sees an act that &ldquo;feels so pointless&rdquo;, it&rsquo;s because they don&rsquo;t see any value in the act. If it&rsquo;s an
act that doesn&rsquo;t generate value for them, then management has Failed in a pretty simple way:</p>
<p><strong><em>Management has not explained what these numbers are for.</em></strong></p>
<p>Honestly, when I was managing consultants working for the consulting company, it was painfully obvious where the numbers
were going&mdash;these were billable consultants, and so their hourly reports were going directly into the invoices that we
were sending to the client for payment. When we asked them to put a little explanation around their time, it was almost
always met with nods of recognition, because they could easily understand why those might be needed.</p>
<p>But suppose you&rsquo;re not working for a consulting company&mdash;why track the numbers?</p>
<h4 id="velocity:9607b4af502b324786789d542cb3e122">Velocity</h4>
<p>As a manager, I can certainly see a desire for those numbers&mdash;without those numbers and how they relate to story points
or features, I can&rsquo;t effectively track my team&rsquo;s ratio of estimated effort to actual.</p>
<p>And that ratio is actually very, very important.</p>
<p>Because when I&rsquo;m called into a meeting with other departments (or other clients), and they start tossing off what they
want us to work on next, I&rsquo;m often asked to provide a ballpark estimate for how long this thing might take. &ldquo;Oh, we know,
it&rsquo;s just a ballpark estimate/SWAG, we won&rsquo;t hold you to it,&rdquo; they always say, but truth is, once a number gets named, it
tends to &ldquo;stick&rdquo; and become the measuring stick against which all other things are discussed.</p>
<p>At that moment, my goal as a manager is to protect my team from an unachievable schedule&mdash;that is, quite frankly, the
most important thing I will do all day that day. I can&rsquo;t just toss off a &ldquo;oh, sure, whatever, let&rsquo;s say&hellip;. six months?&rdquo;
with a cavalier wave of my hand, because chances are I&rsquo;m totally underestimating the work on so many different levels.
My first reaction must be to stall, distract, and delay, until I can get a reasonable estimate to them.</p>
<p>(By the way, it&rsquo;s not unreasonable for them to want an estimate&mdash;contrary to the beliefs of the &ldquo;No Estimate&rdquo; movement
that is currently making the rounds. Modern budgetary accounting requires some idea of what it will cost up front,
before you commit to the purchase. Think about it this way: How many of you would buy a car with no idea of the price,
agreeing to send a certain amount of money to the dealership every month until the dealer told you that was enough?
It&rsquo;s a complete and deliberate &ldquo;head in the sand&rdquo; rejection of how 99.9% of the world&rsquo;s corporate budgeting and
internal accounting works.)</p>
<p>Now this doesn&rsquo;t mean I need to create estimates down to the final story&mdash;a SWAG is still appropriate here. But if I
have that velocity metric, I can take a couple of hours, do a rough pass over what they want, come up with a number
of stories or feature points that is (hopefully) within the same order of magnitude as the actual result, and then
multiply that by our velocity to get a first-pass SWAG that at least puts it in the right ballpark.</p>
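<p>The arithmetic behind that SWAG is dead simple; it&rsquo;s the logged hours that make it possible at all. Here&rsquo;s a back-of-the-envelope Scala sketch, with every number invented purely for illustration:</p>
<pre><code class="language-scala">// Back-of-the-envelope SWAG from logged hours. All numbers here are made up.
object Swag {
  def main(args: Array[String]): Unit = {
    // From past timesheets: hours actually logged against completed story points.
    val loggedHours     = 1200.0
    val completedPoints = 80.0
    val velocity        = loggedHours / completedPoints  // 15 hours per point

    // A rough pass over the new request: call it ~45 points, give or take.
    val estimatedPoints = 45.0
    val swagHours       = estimatedPoints * velocity
    println(f"SWAG: ~$swagHours%.0f hours (at $velocity%.1f hours/point)")  // ~675 hours
  }
}
</code></pre>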
<p>But that&rsquo;s just me.</p>
<h4 id="pulse:9607b4af502b324786789d542cb3e122">Pulse</h4>
<p>In some cases, managers use these numbers as a sanity-check, a way of feeling out how the team is doing, without
having to tear them away from their actual work with one-on-one meetings. Think about it this way: which is less
intrusive to you as a programmer: spending a few minutes at the end of the week to fill out a simple report that
will give me a high-level view of what you&rsquo;re doing (and by the way, why are you spending so much time debugging?
Didn&rsquo;t we get that super-nasty bug fixed last week? I think I need to follow up with you), or pulling
yourself away from your desk for an hour a week so you and I can go over all of that in person, when you will
probably forget half of the details that would show up in a report?</p>
<p>Personally, I don&rsquo;t like the reporting approach&mdash;I would much rather have the one-on-ones, because then we can
talk about a bunch of things that wouldn&rsquo;t normally go into a status report. (Actually, I tend to do both&mdash;at
the consulting company, I looked at the submitted timesheets, and had one-on-ones with the team leads, so that
I could both passively get a feel for what was going on, and then validate&mdash;or not&mdash;that same feel via an
active conversation.)</p>
<p>Is this an act of lazy management? Maybe. But what developers need to realize is that developing software is not
like constructing a building or a house, because 98% of what they do is entirely ephemeral and very difficult
for anybody else to see, even other software developers. (Try it sometime&mdash;observe what another developer is
doing and the progress they are making without talking to them about it directly, and then try to describe what
progress you&rsquo;ve seen them make or not make. Even with rigorous source control practices and team-wide discipline
around check-ins and merges, it&rsquo;s really, really hard.) If you&rsquo;re not a software developer, trying to understand
what your team did today can be downright impossible without asking them.</p>
<p>So they ask them, in the form of a report.</p>
<h4 id="other-reasons:9607b4af502b324786789d542cb3e122">Other reasons</h4>
<p>Perhaps Matthew&rsquo;s manager isn&rsquo;t in the same position that I am, or isn&rsquo;t asked for (and held to) estimates that early in
the process. Why, then, might the numbers be necessary?</p>
<p>I could posit a few theories (Accounting needs them in order to know how to charge time against internal budgeting
reports, or HR needs them to feed into the payroll system so that they can keep track of your vacation time, or the
CTO wants a roll-up report each year of how much time was spent on different activities within his organization because
he is convinced they are spending way too much time in debugging and maintenance and he needs those numbers so he can go
to the CEO and the board and ask for a larger budget to accommodate their requests for the next fiscal year, or&hellip;),
but any theory I come up with is entirely just a flight of fancy, and entirely irrelevant; the point is not why <em>I</em>
think the numbers are important, but that Matthew&rsquo;s management has not said why <em>they</em> think the numbers are
important.</p>
<p>Of course, the other possibility is that the management Failure occurs much, much higher up, and that this time-tracking
was established by the previous (or the previous previous) CTO, and right now those numbers just go into a report that
nobody ever reads. Which is its own management Failure.</p>
<h3 id="history:9607b4af502b324786789d542cb3e122">History</h3>
<p>Napoleon Bonaparte, one of history&rsquo;s greatest generals, was less fond of precise marching orders than were the
various generals (particularly the Prussians) that he faced. His approach was far &ldquo;looser&rdquo; (and arguably much
more agile); he would split his army into pieces, and to each of his Field Marshals he would give certain tasks.</p>
<p>But before he turned them loose, he would also make sure that each of the Field Marshals also understood what
the overall goal of the campaign was: &ldquo;We must take that town there before we move on the city, so that we have
a place from which to set up the cannons to bombard the walls,&rdquo; for example. In this way, when he turned them
all loose, they all understood the larger picture. Then, when one saw an opportunity to further the overall
strategic goal, they could act on it quickly, without having to send a messenger back to Napoleon himself to
find out if that was something they should do. If the town in question lay open and undefended according to their
scouting reports, they could dispatch a battalion or two to seize it and hold it. Then, when the rest of the
French moved into their various positions, the town was already seized, and Napoleon could advance his plans
that much more quickly.</p>
<p>Management that fails to explain the &ldquo;Why&rdquo; of a particular policy is effectively falling into that most heinous
of management Failures, that of treating their employees like cogs. No, employees can&rsquo;t (and shouldn&rsquo;t) know
everything that a manager knows&mdash;certain things, for legal and privacy reasons, must remain confidential.
Lots of data about the company must remain in the hands of only a certain few.</p>
<p>But the hours policy is very, very likely not one of those confidential things, and even if these numbers do go
places that the employees can&rsquo;t (or shouldn&rsquo;t) know about, it&rsquo;s even more likely that there are other reasons
we can talk about, to give them a sense of &ldquo;why&rdquo;, and let them see that those few minutes spent every Friday to fill out
the damn timesheet actually serve a useful purpose, <em>even if it&rsquo;s not useful to them personally</em>.</p>
<p>Because being told to do something that serves no useful purpose to anyone simply undermines your credibility
as a manager. And that&rsquo;s probably the worst management Failure of them all.</p>
<h3 id="postscript:9607b4af502b324786789d542cb3e122">Postscript</h3>
<p>Just after I posted this, there was an email from the Harvard Business Review entitled, &ldquo;The Best Bosses
Follow These 5 Rules&rdquo;, and they read like a complete counterpoint to everything Matthew described:</p>
<ol>
<li><strong>Manage individuals, not just teams.</strong> When you’re under pressure, you can forget that employees have varying interests, abilities, goals, and styles of learning. But it’s important to understand what makes each person tick so that you can customize your interactions with them.</li>
<li><strong>Go big on meaning.</strong> Inspire people with a vision, set challenging goals, and articulate a clear purpose. Don’t rely on incentives like bonuses, stock options, or raises.</li>
<li><strong>Focus on feedback.</strong> Use regular (at least weekly) one-on-one conversations for coaching. Make the feedback clear, honest, and constructive.</li>
<li><strong>Don’t just talk — listen.</strong> Pose problems and challenges, and then ask questions to enlist the entire team in generating solutions.</li>
<li><strong>Be consistent.</strong> Be open to new ideas in your management style, vision, expectations, and feedback. If change becomes necessary, acknowledge it quickly.</li>
</ol>
<p>If Matthew&rsquo;s management adopted these five, particularly #1, #2 and #3, they would immediately seek to manage
him as an individual, which would lead them to finding out more about what makes him tick, and that would lead
them to following up on the time reports. They would seek to provide a sense of meaning around those silly reports,
and his feedback could go into perhaps even making the reports better, depending on what the reports are actually
used for.</p>
<p>Most of all, if they were good bosses, they&rsquo;d be embracing #5, and we probably wouldn&rsquo;t have had to have this
little conversation.</p>
<p>Good chat.</p>
Practice, practice, practicehttp://blogs.tedneward.com/post/practice-practice-practice/
Fri, 08 Apr 2016 01:14:33 -0700http://blogs.tedneward.com/post/practice-practice-practice/
<p><em>tl;dr</em> Recently the Harvard Business Review ran an article on how readers could prepare for difficult business situations, using
the analogy of coaches preparing their teams for different eventualities by simulating those eventualities on the practice
field. There&rsquo;s lessons to be learned here for both programming and speaking.</p>
<p>First, go check out the HBR article:
<a href="https://hbr.org/2016/02/practice-for-tough-situations-as-youd-practice-a-sport">“Practice for Tough Situations as You’d Practice a Sport”</a> by Andy Molinsky. It&rsquo;s a relatively
short read (as all HBR&rsquo;s web articles are; they&rsquo;re usually summaries of longer books, either theirs or something closely
related to business). His focus:</p>
<blockquote>
<p>To learn soft skills in a way that truly prepares us for what we’ll face when it really matters, we can take a few lessons
from a different arena where learning, development, and performance are essential: professional sports.</p>
</blockquote>
<h1 id="belichick-the-legend:097e18dfcb8b74f0c0efcfcf342a9a2b">Belichick: The Legend</h1>
<p>My hometown football team is the Seattle Seahawks. Two years ago, we were defeated in the Super Bowl when a Patriots cornerback
leaped in front of a sure-thing short slant pass that would have been the game-winning touchdown, on what was almost the final play of the game.</p>
<p>(And yes, critics, that pass is a sure-thing; in the 168 times it was executed all across the season, by any team, including
both the Seahawks and the Patriots, as well as all 30 other teams in the league, only once was it ever intercepted&mdash;that fateful play in the Super Bowl.)</p>
<p>Bill Belichick is the coach of the New England Patriots. He is quite a character.</p>
<blockquote>
<p>One key tenet of professional sports coaching, for example, is to prepare people in the most realistic contexts possible. When professional football teams prepare for their next opponent, they’ll take into account the likely conditions they’ll face. If the stadium the team is playing in is going to be noisy, coaches like Bill Belichick of the New England Patriots will play extremely loud music at practice to mimic game-time conditions. Belichick has even been known to pour water on practice balls to prepare the team for wet game-day weather.</p>
</blockquote>
<p>Bill Belichick is one of the most intense, concrete, no-holds-barred anything-to-win coaches you will ever see in this lifetime.
He oozes ruthlessness out of every pore, but it&rsquo;s a laid-back ruthlessness&mdash;he will never get in your face and be angry, but
players have said that if he goes very quiet on you, you&rsquo;re in deep yogurt. Point is, Belichick has a strong reputation for
success (more championship rings as a coach than I think any other coach in the history of the game), and much of that success
is that <em>nothing fazes the guy</em>. Absolutely nothing. Team&rsquo;s down by 42 going into the final two minutes of the game? Belichick
has been there before, and the team feels it. <em>Knows</em> it. If anybody&rsquo;s going to figure out how to score a metric crap-ton of
points in an impossible amount of time, it&rsquo;s Belichick. Nobody relaxes playing against a Belichick-coached team, no matter how
good their lead is.</p>
<p>When Malcolm Butler, the cornerback, was talking about that interception at the goal line, he pointed out that he had
recognized that formation and that play from game film that they had studied of the Seahawks&rsquo; game plan. He knew&mdash;from the
bottom of his heart&mdash;what the play was going to be, and if you look at the replay, he simply muscles his way into the exact
right spot where Russell Wilson threw the ball. He didn&rsquo;t have time to track where the ball was going&mdash;if I remember correctly,
he was already muscling into the spot before Wilson let go of the ball. Butler <em>knew</em> where that pass was going. He had seen
it before. They had studied it before. Which is to say, his coaches had seen it, pointed it out to their players, and the players
had spent time practicing what they would do if and when that situation emerged.</p>
<p>And it paid off for them, in spades.</p>
<p>Belichick isn&rsquo;t, perhaps, the smartest coach in the league. But he&rsquo;s always got an answer for anything the other team throws
at him, and more often than not, the other teams can&rsquo;t match whatever scheme he&rsquo;s cooked up. His teams win through preparation,
not talent.</p>
<p>And that&rsquo;s the point: it&rsquo;s often the preparation&mdash;not the talent&mdash;that creates success.</p>
<h1 id="preparation:097e18dfcb8b74f0c0efcfcf342a9a2b">Preparation</h1>
<p>When I interviewed at DevelopMentor, all those many years ago, I was required to conduct a &ldquo;test teach&rdquo; down at the company&rsquo;s
LA (home) office. I prepared a talk of about 35 minutes in length, and I was asked to deliver the talk in one of
the classrooms while the students were out having lunch. Luc (who would later become the &ldquo;instructor liaison&rdquo;, but for that
moment was basically the &ldquo;man Friday&rdquo; for most of the company) took orders ahead of time, and ran out to a nearby restaurant
to order and bring back the food.</p>
<p>I set up my slides (which I&rsquo;d actually written as plain HTML files, since I was very much a newbie at PowerPoint), and as I
glanced up, I saw about 30 people in the room. Some were standing, most were sitting, a number were typing away at keyboards,
a few were chatting in the back. Most of them really weren&rsquo;t paying any attention at all to me. I waited for a few minutes
to see if they would notice I was ready to go. I&rsquo;d been told, &ldquo;You only have about 35 minutes, and we can&rsquo;t hold up class,
so make sure you stay within time&rdquo;.</p>
<p>Fortunately, I&rsquo;d done some Toastmasters in the past. When a crowd doesn&rsquo;t notice you&rsquo;re ready to start, you might do something
to get their attention, then go ahead and start. Some speakers will clear their throat, some will be timid and say, &ldquo;I&rsquo;d like
to get going now, please&rdquo;, but me&hellip; well, I&rsquo;ve never really been wired that way.</p>
<p>In a small room of about 30 people, I tossed off a &ldquo;HELLO!&rdquo; that was about two volume levels too high. It was pretty loud&mdash;a
few people actually jumped. And everybody looked at me, completely shocked. And without missing a beat, I said in a much more
appropriate tone of voice, &ldquo;My name is Ted Neward, and I&rsquo;m here to talk to you about Design Patterns today.&rdquo;</p>
<p>See, at that point they had a choice&mdash;they could be openly rude and go back to what they were doing (which most crowds won&rsquo;t
do, since most of us are relatively polite people), or they could pay attention to the talk, but once I had their attention,
it was really on them.</p>
<p>And the only reason I knew this? Because I&rsquo;d practiced this before, as part of Toastmasters.</p>
<p>I found out later, from Luc, that the room was quite deliberately coached to be a little &ldquo;hard&rdquo; on me. As a matter of fact,
about ten minutes into my talk, Luc showed up with lunch, and stage-whispered his way around the room delivering lunch to
everybody in it. All. Thirty. People.</p>
<p>Because you know what? If I can&rsquo;t lecture through a distraction or two, I&rsquo;m not going to last long as a professional speaker.
They had the same idea as Toastmasters has had: the more you practice dealing with a situation, the more comfortable you are
at dealing with it when it eventually does happen.</p>
<p>Professional sports players practice every scenario they can think of&mdash;physically and/or mentally&mdash;so that when the moment
arises, they&rsquo;re up to the challenge.</p>
<p>How do you prepare?</p>
<h1 id="practical:097e18dfcb8b74f0c0efcfcf342a9a2b">Practical</h1>
<p>Molinsky writes,</p>
<blockquote>
<p>For example, you might work on rehearsing your pitch to potential VCs in front of a crowd of colleagues you’ve coached to pepper you with difficult questions. You might create situations where a VC is late to the meeting — or rushing you to finish your pitch in half the time you had planned. You might also do the session in a setting that mimics what you’ll likely encounter in the real world, whether that’s a noisy coffee shop or an overheated conference room.</p>
</blockquote>
<p>Many speakers &ldquo;practice&rdquo; their presentations in front of a sympathetic crowd&mdash;friends, family, people who genuinely
love them and want to spare their feelings any sort of discomfort or pain.</p>
<p>If you love me, and I ask you to help me practice, you won&rsquo;t spare me discomfort. You won&rsquo;t try to shield me from pain.
I need that discomfort, so that I know what it feels like. You actually hurt me by not making the situation realistic.
Or worse than realistic.</p>
<p>When I was mentoring a young speaker before her big debut
presentation, I brought her down to the conference room where her talk would be held, to practice. We found the maintenance staff
setting up chairs in there. I asked if we could use the room, and when they said, &ldquo;Well, we need to be in here to set up
the chairs&rdquo;, I told them that was fine. It was <em>more</em> than fine&mdash;it was perfect. I made her go to the front of the room
and deliver her talk to an audience of five or so: myself, and the maintenance staff, who couldn&rsquo;t have cared less about her
talk while they were setting up. I spent most of the time fiddling with my phone; no eye contact, no nodding, no real
feedback of any kind. She was essentially giving her talk into a vacuum. (Which, as any speaker can tell you, is an
incredibly draining experience.)</p>
<p>Eventually the staff finished, and after a few more minutes, during which I deliberately kept moving
around the room (partly to judge her voice volume while she was speaking, to be able to tell her whether she needed to
go louder or softer), we wrapped up and went to get dinner.</p>
<p>It was perhaps a bit harsh, but the point remains the same: people will get up and leave your talk for reasons that have
nothing to do with your talk. Distractions will happen. Cell phones will ring. Tornado sirens will go off. Water will
start pouring out of the ceiling for no reason whatsoever. (All of these have happened to me at one point or another.)
You cannot control the circumstances surrounding your talk.</p>
<p>As a soccer coach, I routinely told my teams, &ldquo;We are only as good as we practice.&rdquo; As a volunteer football coach for
my son&rsquo;s football team, I routinely said the same thing. As a manager, I used that same logic to justify the time spent
setting up infrastructure for our own internal IT, seeking to mimic the same tools, technology and process that we would
use for any client.</p>
<p>An organization will only be as good as how it practices. How well do you deal with outages? How do you know? If you&rsquo;ve
never actually practiced an outage, you have no idea. You may have a plan, but if you&rsquo;ve never practiced it, you&rsquo;ve
never actually gone through it, and therefore you really don&rsquo;t <em>know</em> if any of this is going to work. How well do you
deal with bug triage? How well do you deal with performance problems? How well do you deal with refactoring?</p>
<p>How well do you practice any of these things? If you don&rsquo;t, then you suck at it. I guarantee it.</p>
<p>The old joke says that a musician was stopped on the streets one day and asked, &ldquo;How do you get to Carnegie Hall?&rdquo; He
looked at the questioner, and without a hint of irony, said, &ldquo;Practice, my boy, practice.&rdquo;</p>
<p>Oh, and her talk? She killed it.</p>
Reclaiming Design Patterns (20 Years Later)http://blogs.tedneward.com/post/reclaiming-design-patterns/
Fri, 25 Mar 2016 20:40:41 -0700http://blogs.tedneward.com/post/reclaiming-design-patterns/
<p><em>tl;dr</em> 20 years ago, the &ldquo;Gang of Four&rdquo; published
<a href="http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612">the seminal work on design patterns</a>.
Written to the languages of its time (C++ and Smalltalk), and written using the design philosophies of the time (stressing
inheritance, for example), it nevertheless spawned a huge &ldquo;movement&rdquo; within the industry. Which, as history has shown us,
was already the hallmark of its doom&mdash;anything that has ever become a &ldquo;movement&rdquo; within this industry eventually disappoints
and is burned at the public-relations stake when it fails to deliver on the overhyped promises that it never actually made.
It&rsquo;s time to go back, re-examine the 23 patterns (and, possibly, a few variants) with a fresh set of eyes, match them up
against languages which have had 20 years to mature, and see what emerges. (Spoiler alert: all of the original 23 hold up
pretty well, and there&rsquo;s a lot of nuance that I think we missed the first time around.)</p>
<h2 id="patterns-what-were-they-for-exactly:967fec73441d899066a5a01a2a28934b">Patterns: What were they for, exactly?</h2>
<p>In the early days of the patterns &ldquo;movement&rdquo;, when the patterns were new and fresh, we didn&rsquo;t really spend
much time discussing this. They just <em>were</em>; those of us who read the book nodded at a few of them, having experienced them before in
code &ldquo;in the wild&rdquo;, and we went on about our day. Intuitively, it seemed, we realized that there were places where we could
apply this knowledge, we appreciated the new dimensions the patterns opened inside our heads, and&hellip; yeah, cool.</p>
<p>The Gang of Four (GOF) seemed to realize from the beginning that this was a subtle art/science; in the last chapter of the book
(which nobody ever seemed to read, unfortunately), they said:</p>
<blockquote>
<p>It&rsquo;s possible to argue that this book hasn&rsquo;t accomplished much. After all, it doesn&rsquo;t present any algorithms or programming techniques that haven&rsquo;t been used before. It doesn&rsquo;t give a rigorous method for designing systems, nor does it develop a new theory of design&mdash;it just documents existing designs. You could conclude that it makes a reasonable tutorial, perhaps, but it certainly can&rsquo;t offer much to an experienced object-oriented designer.</p>
</blockquote>
<p>And in truth, that was the case: it hadn&rsquo;t accomplished much. One guy with whom I taught at DevelopMentor called Design
Patterns &ldquo;23 ways to use a pointer&rdquo;. What was the big deal?</p>
<p>And similarly, when management/senior team leads/architects threw a copy of the book at junior developers, expecting that they
could read the book and suddenly &ldquo;level up&rdquo;, they were profoundly disappointed.</p>
<p>The benefits, it seemed, were more subtle:</p>
<blockquote>
<p>We hope you think differently. Cataloging design patterns is important. It gives us standard names and definitions for the techniques we use. If we don&rsquo;t study design patterns in software, we won&rsquo;t be able to improve them, and it&rsquo;ll be harder to come up with new ones.</p>
</blockquote>
<p>Certainly, the GOF book spawned a movement, and the patterns movement spawned a whole catalog of patterns beyond the original
23 that the GOF came up with. But then things started getting even more abstract and high-level; patterns became &ldquo;pattern
languages&rdquo; and that in turn spawned &ldquo;meta-patterns&rdquo;. Then people started documenting the negative, calling them &ldquo;anti-patterns&rdquo;.</p>
<p>Once people started realizing that there was money to be made, the writing was on the wall.</p>
<p>Various book vendors started publishing &ldquo;patterns&rdquo; books that barely touched on the GOF&rsquo;s original model. Patterns books became
synonymous with &ldquo;reusable code&rdquo; (instead of &ldquo;reusable elements of design&rdquo;). IDE vendors started looking for ways to incorporate
patterns as code generators. Patterns somehow also became the province of UML and other design notations, and the goal at one
point was to figure out how to create reusable design templates in UML that corresponded to patterns.</p>
<p>Like so many things, patterns became trendy and attractive to people who had no idea what they were for. How could they ever
have actually met those people&rsquo;s expectations?</p>
<p>By the mid-2000s, patterns became a bad word, and speakers started essentially
<a href="http://www.oracle.com/technetwork/server-storage/ts-4961-159222.pdf">&ldquo;trashing&rdquo; patterns</a>, suggesting that somehow
they were an artifact of &ldquo;bad languages&rdquo; and &ldquo;primitive thinking&rdquo; and &ldquo;subsumed into good languages&rdquo;.</p>
<p>Patterns were clearly useless.</p>
<p>And yet&hellip; they keep appearing. We keep using their terms and lingo. Why? The GOF actually (in that same chapter at the back
of the book) called it back in 1995:</p>
<blockquote>
<p>&ldquo;Design patterns provide a common vocabulary for designers to use to communicate, document, and explore design alternatives. Design patterns make a system seem less complex by letting you talk about it at a higher level of abstraction than that of a design notation or programming language. Design patterns raise the level at which you design and discuss design with your colleagues.&rdquo; (p389)</p>
</blockquote>
<p>and</p>
<blockquote>
<p>&ldquo;Knowing the design patterns in this book makes it easier to understand existing systems. &hellip; People learning object-oriented programming often complain that the systems they&rsquo;re working with use inheritance in convoluted ways and that it&rsquo;s difficult to follow the flow of control. In large part this is because they do not understand the design patterns in the system. Learning these design patterns will help you understand existing object-oriented systems.&rdquo; (p389)</p>
</blockquote>
<p>and</p>
<blockquote>
<p>&ldquo;Design patterns provide a way to describe more of the &ldquo;why&rdquo; of a design and not just record the results of your decisions. The Applicability, Consequences, and Implementation sections of the design patterns help guide you in the decisions you have to make.&rdquo;</p>
</blockquote>
<p>and</p>
<blockquote>
<p>&ldquo;One of the problems in developing reusable software is that it often has to be reorganized or refactored [OJ90]. Design patterns help you determine how to reorganize a design, and they can reduce the amount of refactoring you need to do later.&rdquo;</p>
</blockquote>
<p>(Yep, they were talking about refactoring long before it became hip.)</p>
<p>Most of all, they actually predicted the very problem that would be the downfall of patterns as a whole:</p>
<blockquote>
<p>&ldquo;It&rsquo;s easiest to see a pattern as a solution, as a technique that can be adapted and reused. It&rsquo;s harder to see when it is appropriate&mdash;to characterize the problems it solves and the context in which it&rsquo;s the best solution. In general, it&rsquo;s easier to see what someone is doing than to know why, and the &ldquo;why&rdquo; for a pattern is the problem it solves. Knowing the purpose of a pattern is important too, because it helps us choose patterns to apply. It also helps us understand the design of existing systems. A pattern author must determine and characterize the problem that the pattern solves, even if you have to do it after you&rsquo;ve discovered its solution.&rdquo; (p393)</p>
</blockquote>
<p>Too many short-sighted people saw the &ldquo;solution&rdquo;, and in their rush to find reusable code, they forgot (or chose not) to
learn the &ldquo;why&rdquo;, and as a result ended up making some very stupid&mdash;and easily preventable&mdash;mistakes.</p>
<h2 id="what-is-a-pattern:967fec73441d899066a5a01a2a28934b">What is a pattern?</h2>
<p>There&rsquo;s been a lot of discussion on this over the years, but I&rsquo;m going to keep this down to a single sentence:</p>
<p><strong><em>A pattern is a solution to a problem within a certain context that has a set of predictable consequences.</em></strong></p>
<p>That&rsquo;s all it is. There&rsquo;s nothing magical, mysterious, or super-academic about it. If you can describe all four parts of
that tuple, you have a pattern. Write it down. Publish it somewhere. When we start seeing some similarities between us all,
we can start to see what the commonality is and then give it a nice well-chosen name, and it can join the general lexicon.</p>
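<p>Written down, such a four-part entry might look like this (a miniature example of my own, using a pattern everybody already knows from the GOF&rsquo;s original 23):</p>
<pre><code class="language-java">/*
 * Pattern: Singleton
 * Problem: Exactly one instance of a type is needed, visible everywhere.
 * Solution: The type creates and hands out its own sole instance.
 * Context: A single process, where nothing needs to swap the instance
 *          out (for testing, say).
 * Consequences: Global access (convenient and dangerous), hidden
 *          coupling, awkward unit tests.
 */
class Config {
    private static final Config INSTANCE = new Config();
    private Config() { }                          // nobody else can construct one
    static Config instance() { return INSTANCE; }
}
</code></pre>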
<p>The Gang of Four published 23 of them. There&rsquo;s a lot more out there, and most are actually pretty useful&mdash;particularly
the original 23, because they stretch across a ton of different languages. (The patterns community later came to the terminology
of an &ldquo;idiom&rdquo;, which was something that was language-specific. Thus, the &ldquo;Resource Acquisition Is Initialization&rdquo; (RAII) idea
from C++ was more-or-less tied to C++ as an idiom, and didn&rsquo;t really qualify as a pattern, <em>per se</em>.)</p>
<h2 id="why-are-we-arguing-again:967fec73441d899066a5a01a2a28934b">Why are we arguing again?</h2>
<p>The last sentence, however, brings up another important point: Too much of the patterns community spent too much time arguing
about them, and whether a given thing was a pattern or not, or whether a particular code snippet was an implementation of this
pattern or that pattern, or&hellip;.</p>
<p>There are several thoughts at work here:</p>
<ul>
<li><strong>The exercise of philosophical debate around a pattern&rsquo;s composition is important to the benefit patterns provide.</strong>
This was why the patterns community
said things like &ldquo;It&rsquo;s not a pattern until it&rsquo;s been discovered three times&rdquo; and &ldquo;A pattern isn&rsquo;t really a pattern unless
it&rsquo;s been workshopped at a patterns conference&rdquo;. The point wasn&rsquo;t to add some kind of mystery or intrigue or make them
solely the province of those &ldquo;in the know&rdquo;; the point was to debate them in a philosophical manner to refine the pattern
to its essence, and keep the quality high. But once people started saying, &ldquo;Pfffft&rdquo; and avoiding those critical steps,
suddenly everything became a pattern. If we can start calling local variables a &ldquo;pattern&rdquo;, then the whole point
of designing a higher-level toolbox of terms becomes lost. If anybody can call anything a &ldquo;pattern&rdquo;, then we have everybody
calling everything a pattern, and patterns lose their efficacy in conversation. If patterns are somehow localized to the
team they were &ldquo;born&rdquo; on, then I cannot converse with anybody outside of my team. Science has a rigor to it (in
particular, the peer-review portions) for a reason, and that rigor was what the patterns community tried to bring.</li>
<li><strong>Which pattern happens to be in use here is a subject of philosophical debate.</strong> LOTS of people got hung up on arguing
with other members of the team over what kind of pattern some code was. I cannot imagine a more useless activity, unless
the discussion led to new insights (a new application of it, or a new &ldquo;angle&rdquo; to it) around said pattern. We burned a lot
of goodwill by arguing with people over this.</li>
<li><strong>Patterns were not the be-all/end-all of good design.</strong> You can have a good design that doesn&rsquo;t fit an established
pattern. Pattern snobs needed to stop putting a qualitative judgment around those who weren&rsquo;t pattern snobs.</li>
</ul>
<p>In the end, we pattern snobs probably brought the downfall upon our own heads, but we had help.</p>
<h2 id="time-to-take-them-back:967fec73441d899066a5a01a2a28934b">Time to take them back</h2>
<p>So I figure it&rsquo;s time, 20 years later, to start the discussion all over again. I don&rsquo;t imagine that I can rebuild the
entire movement on my own, but at the very least, I can take the old prose, dust it off, and look to bring it into the
21st Century and the languages that we use. There&rsquo;s also probably a few more patterns that we&rsquo;ve found along the way, and
where I think they fit, I&rsquo;ll take a stab at a few and put them up for people to consider and workshop. (Or, rather, more
of a &ldquo;mob-shop&rdquo;, since a weblog isn&rsquo;t really a workshop setting.)</p>
<p>Over the next <em>n</em> months, I&rsquo;m going to put up the original 23 patterns re-cast for the &ldquo;modern world&rdquo;. I&rsquo;ll do a quick
recap of the pattern, but cast into the form I prefer (Problem/Solution/Context/Consequences). I deliberately won&rsquo;t spend
a whole lot of time trying to re-describe all the prose from the book&mdash;that would be copyright infringement, in my mind,
and I not only want to steer clear of that, but I really do want to encourage people to buy a copy of it for themselves,
do their own reading, and debate whether my interpretation is legitimate and/or reasonable. Then, I&rsquo;m going to show how a few
languages can implement them, and I&rsquo;m going to range pretty freely over a bunch of different languages: C++, C#, Java,
Swift, F#, Scala, JavaScript, and a few others. <em>I encourage you to comment and post suggestions/amendments/corrections.</em>
I&rsquo;m definitely not the smartest person in the world, and certainly not the last word on how to apply certain language
idioms to a particular problem. Plus, I&rsquo;d love to see us collectively flesh out some implementations around these
patterns across all the popular (and maybe a few less popular, just for educational purposes!) languages.</p>
<p>It&rsquo;ll take time, so bear with me. I&rsquo;ll post another blog entry with the details soon, but in the meantime, if you haven&rsquo;t
done so in a while, take down your copy of GOF, dust it off, and crack it open to the first chapter.</p>
When Interviews Failhttp://blogs.tedneward.com/post/when-interviews-fail/
Wed, 23 Mar 2016 13:51:53 -0700http://blogs.tedneward.com/post/when-interviews-fail/
<p><em>tl;dr</em> Peter Verhas asks a seemingly innocent question during a technical interview, and gets an answer that is not wrong,
but doesn&rsquo;t really fit. He then claims that &ldquo;Sometimes I also meet candidates who not only simply do not know the answer
but give the wrong answer. To know something wrong is worse than not knowing. Out of these very few even insists and tries
to explain how I should have interpreted their answer. That is already a personality problem and definitely a no-go in an
interview.&rdquo; I claim that Peter is not only wrong, but that in addition to doing his company a complete disservice with this
kind of interview, I personally would never want to work for a company that takes this attitude.</p>
<p>Let&rsquo;s begin with his original article, <a href="https://dzone.com/articles/can-you-call-non-static-method-from-a-static">here</a>. Go
have a look before you read any further&mdash;it&rsquo;s actually not that long.</p>
<p>Now, having familiarized yourself with the material, let&rsquo;s deconstruct it entirely, shall we?</p>
<h2 id="asking-the-wrong-questions:2fb09fcbaa781bfb963b836fecc15ce1">Asking the wrong questions</h2>
<p>For starters, the whole process begins, in my opinion, entirely on the wrong foot:</p>
<blockquote>
<p>There are questions on a Java technical interview that even the most entry level junior is expected to give the right answer for. Since I am facing candidates who are not that junior I do not even bother most of the times to ask those questions. I assume that the candidate knows the correct answer. Sometimes, however, there are some candidates who I feel from the start they are juniors and to cut the interview short not wasting his/her and my time I ask some of those simple questions. The answers usually reveal the real level of knowledge and we can get to an agreement in a short time about the assessed level.</p>
</blockquote>
<p>It begins with the phrase &ldquo;There are questions&hellip;.&rdquo;. Folks, let me be very clear about this: If you are conducting a
technical interview, then you need to be asking them to write code, not answer questions. Unless their role in the
position is to be a question-answerer of programming questions (in which case, you are interviewing for a teacher,
not an actual programmer), then you need to be asking them to demonstrate their technical skills, not their knowledge
of the terminology.</p>
<p>The reasoning for this should be pretty clear, but in case it&rsquo;s not, I&rsquo;ll argue it from logic, example, and analogy.</p>
<p>Logic: Not every programmer you interview will be classically-trained. They may not know all the preferred terminology.
Are they &ldquo;getters and setters&rdquo;, or &ldquo;automatically-defined properties&rdquo;, or &ldquo;accessors and mutators&rdquo;? It really sort of
depends on what language you grew up with (C++, for example, preferred the latter for quite some time). It depends on
which books you read. It depends on whether you even had to discuss this with other people&mdash;perhaps the candidate
actually learned everything from a book and reading stuff on the Internet. (StackOverflow&rsquo;s recent poll suggests that
around a third or more of the developers in the candidate pool are self-identified &ldquo;self-taught&rdquo; developers.) Do you
really want to be screening out perfectly-qualified candidates because they don&rsquo;t have the right words? And this doesn&rsquo;t
even begin to address the pressure-cooker situation that most candidates feel they&rsquo;re in when they interview, causing
them to flub even simple answers. Which brings me to&hellip;</p>
<p>Example: A developer who worked for me for two years was a quite capable C# developer. This is a guy who led teams,
mentored some of the more junior developers, and came up with some quite capable designs. And then, when asked during
a meeting by a prospective client to explain what a static method is, he flubbed it completely, and started talking
about constructors and what-not. I sat there looking at him for a few minutes, with a total &ldquo;Dude, WTF?!?&rdquo; look on my
face, before he realized what he was doing. By Peter&rsquo;s criteria, he&rsquo;d failed the interview. And yet, he served as the
team lead for that client for nine months after that meeting, with nary a complaint about his skills, his abilities, or
his answers on questions about static methods (which, ironically, never came up!) ever again. In other words, once we
got out of the pressure cooker, he did fine, and his work showed it. Which brings us to&hellip;</p>
<p>Analogy: If you&rsquo;re hiring a band to play your wedding, do you really care about their ability to explain musical theory
and composition? Or do you care more about their ability to play your favorite dance tunes, to play the song that your
spouse has chosen to be &ldquo;your song&rdquo;, and to get Grandma and Grandpa on to the dance floor with a rendition of &ldquo;Funky
Chicken&rdquo;? Most bands (dare I say all of them?) get a gig based on their body of work and/or their demo tape, not their
ability to answer questions.</p>
<h2 id="expecting-the-wrong-answers:2fb09fcbaa781bfb963b836fecc15ce1">Expecting the wrong answers</h2>
<p>Continuing, Peter says,</p>
<blockquote>
<p>To know something wrong is worse than not knowing. Out of these very few even insists and tries to explain how I should have interpreted their answer. That is already a personality problem and definitely a no-go in an interview.</p>
</blockquote>
<p>Oh, hubris, thy name is &ldquo;programming interviewer&rdquo;. Let&rsquo;s see what I mean:</p>
<blockquote>
<p>One such simple question is: Can a static method in a class call a non-static method of the same class? If you know Java a little bit you know the answer: no, it can not. A static method belongs to the class and not the instance. It can even be executed using the name of the class directly without any instance of the class. It can even run when there is not even a single instance of the class in the whole JVM. How could it invoke a normal method that runs attached to an instance?</p>
</blockquote>
<p>Oh, hubris. <em>There is no reason that a static method cannot invoke an instance method.</em> The thing that Peter
is hinging on here is the fact that the static method lacks a reference to a particular object (which normally
is the &ldquo;this&rdquo; reference), which is his justification for his answer: &ldquo;No <em>this</em>, no method call&rdquo;.</p>
<p>And yet:</p>
<blockquote>
<p>But then again: the answer from one candidate this time was: yes. And he even started to explain that it may happen that the static method has access to an instance. It may get an instance as a method argument and through that reference, it can call an instance method. That person was right.</p>
</blockquote>
<p>And yet:</p>
<blockquote>
<p>It did not, however, change the fact that he did not know Java well enough, but as a matter of fact in this very specific question, she was right.</p>
</blockquote>
<p>So let me get this straight: the answer to the question was right, when you&rsquo;ve said all along it was wrong, and yet
&ldquo;this candidate didn&rsquo;t know Java well enough&rdquo;? I&rsquo;d say that the candidate understood Java well enough to be able
to find an answer that was not the one you expected, and yet was actually correct.</p>
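<p>The candidate&rsquo;s point takes only a few lines of Java to demonstrate (this is my own minimal sketch, not Peter&rsquo;s example&mdash;the class and names are made up):</p>
<pre><code class="language-java">class Greeter {
    private final String name;

    Greeter(String name) { this.name = name; }

    void greet() {                        // instance method
        System.out.println(&quot;Hello, &quot; + name);
    }

    static void greetVia(Greeter g) {     // static method: no &quot;this&quot;...
        g.greet();                        // ...but a passed-in reference works just fine
    }

    public static void main(String[] args) {
        Greeter.greetVia(new Greeter(&quot;Ted&quot;));  // prints &quot;Hello, Ted&quot;
    }
}
</code></pre>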
<p>See, here&rsquo;s what&rsquo;s going on: the interviewer, convinced of their own technical superiority, is walking into an
interview with a predetermined set of answers to questions they&rsquo;re going to ask, and when a candidate doesn&rsquo;t follow
that predetermined script, they&rsquo;re &ldquo;not smart enough&rdquo;.</p>
<p>Here&rsquo;s a personal example. I was interviewing with a company years ago for a C++ position, and was asked, &ldquo;Can a
private field be accessed from outside of the class?&rdquo; The normal candidate&rsquo;s answer is supposed to be &ldquo;No, private
creates an encapsulation barrier hiding the field from the rest of the world.&rdquo;</p>
<pre><code class="language-c++">#include &lt;iostream&gt;
#include &lt;string&gt;
using namespace std;
class Person
{
public:
Person(const char* fn, const char* ln, int a)
: first_name(fn), last_name(ln), age(a)
{ }
string description() {
return first_name + &quot; &quot; + last_name + &quot; is &quot; + to_string(age) + &quot; years old&quot;;
}
private:
string first_name;
string last_name;
int age;
};
int main() {
Person ted(&quot;Ted&quot;, &quot;Neward&quot;, 45);
cout &lt;&lt; ted.description() &lt;&lt; endl;
}
</code></pre>
<p>The &ldquo;age&rdquo; field is, by design, unavailable to the rest of the world, right?</p>
<p>My answer: &ldquo;Of course it can. You just have to cast the object instance over to a void*, calculate the offset from the
start of the object, and then access it.&rdquo;</p>
<pre><code class="language-c++">int main() {
Person ted(&quot;Ted&quot;, &quot;Neward&quot;, 45);
cout &lt;&lt; ted.description() &lt;&lt; endl;
void* pTed = (void*)&amp;ted;
int offset = sizeof(string) + sizeof(string);
char* pTedAge = (static_cast&lt;char *&gt;(pTed) + offset);
cout &lt;&lt; static_cast&lt;int&gt;(*pTedAge) &lt;&lt; endl; // prints 45
}
</code></pre>
<p>I even showed them how you could generalize this into a template (I called it &ldquo;THackOMatic&rdquo;, and consider it one of
my finest creations in the language.)</p>
<p>Now, you can take this one of three ways:</p>
<ul>
<li>Wow, never thought you could do that. That&rsquo;s interesting. I wonder&hellip;</li>
<li>Well, yes, you can do this, but it&rsquo;s totally not a good idea.</li>
<li>This is totally not in the spirit of the question. You are still wrong.</li>
</ul>
<p>And if you&rsquo;re in the first two camps, I&rsquo;m kind of there with you. It&rsquo;s a cool hack, but it&rsquo;s a hack nevertheless,
and hacks are usually a sign that you&rsquo;re doing something wrong, except in very narrow circumstances where there&rsquo;s
simply no other way around it and you accept that you&rsquo;re the only one who will be touching that code from now on.</p>
<p>But if you&rsquo;re in the third camp, you&rsquo;re missing the point. The point being, the candidate figured out a way around
a supposed stumbling block in the language. If you can&rsquo;t recognize that, then I submit that the fault lies with you,
not with the candidate.</p>
<p>Which brings me to my last point.</p>
<h2 id="you-hire-what-you-interview-for:2fb09fcbaa781bfb963b836fecc15ce1">You hire what you interview for</h2>
<p>Right or wrong, the candidate took your question, thought their way through it, and came up with a novel answer to
the question. And by focusing on the answer, you missed the important part&ndash;<em>they found a way around it</em>.</p>
<p>Part of this process is supposed to be finding those candidates who meet a certain technical bar, but part
of the process is also supposed to be finding those people who will find ways around obstacles. Bugs, production
outages, design flaws, whatever, you&rsquo;re supposed to be finding the candidates who won&rsquo;t accept the status quo as
a concrete law of the universe.</p>
<p>A candidate did exactly that, and you shot them down.</p>
<p>You wanted a plain vanilla, boring-as-shit, everybody-get-in-line-and-do-what-the-supervisor-tells-you kind of
answer, and they gave an &ldquo;out of the box&rdquo;, creative, hammer-throwing-at-the-screen one.</p>
<p>Do you claim you hire &ldquo;only the best&rdquo;? When you&rsquo;re doing this, you&rsquo;re hiring only the middle part of the bell curve,
at best. Those who are on the far right-hand-side of the curve will be the out-of-the-box thinkers, who understand
that rules are made to be broken sometimes, and under specific circumstances.</p>
<p>Do you really think they will <em>want</em> to work for you at this point? I wouldn&rsquo;t.</p>
<h2 id="wrapping-up:2fb09fcbaa781bfb963b836fecc15ce1">Wrapping up</h2>
<p>So my challenge here is this: If you&rsquo;re an interviewer, what are <strong>you</strong> interviewing for?</p>
<p>By the way, that company I interviewed at all those years ago? The interviewer&rsquo;s response was classic: &ldquo;Well, the
right answer was supposed to be &lsquo;no&rsquo;, but I can see what you&rsquo;re doing here. You&rsquo;re the first person to ever give
me that as an answer.&rdquo; They hired me shortly thereafter, and before I left the firm, I used a couple more language
tricks to help cut the size of their codebase down pretty significantly in a couple of situations.</p>
Technical Debt: A Definitionhttp://blogs.tedneward.com/post/technical-debt-definition/
Mon, 18 Jan 2016 01:36:52 -0800http://blogs.tedneward.com/post/technical-debt-definition/
<p><strong>tl;dr</strong> A recent post on medium.com addresses the topic of technical debt; I had an intuitive
disagreement with the thrust of the post, and wrote this as a way of clarifying my own thoughts
on the matter. It raises some interesting questions about what technical debt actually
is&mdash;and if we can&rsquo;t define it, how can we possibly understand how to avoid it or remove it,
as opposed to our current practice of using it as a &ldquo;get-out-of-this-codebase-by-blowing-it-all-up&rdquo;
card?</p>
<p><a href="https://medium.com/@kellan/towards-an-understanding-of-technical-debt-ae0f97cc0553#.k5o49zz9r">@kellan wrote</a>:</p>
<blockquote>
<p>Technical debt exists. But it’s relatively rare. When you start arguing with someone about technical debt, you’ll generally encounter a definition like: Technical debt is the choices we made in our code, intentionally, to speed up development today, knowing we’d have to change them later. Hard coding a variable because currently there is no plan to change it is a common example of technical debt. Similarly not modularizing a function.</p>
<p>This is a fairly clear, succinct, and easy to reason about definition, that describes a phenomena that exists relatively rarely. Relatively rare compared to what? Compared to the amount of technical debt we ascribe to the codebases we work on. How then do we explain the overwhelming prevalence of technical debt we encounter when we talk to people about code?</p>
<p>The term is being abused, or at least dangerously overloaded.</p>
</blockquote>
<p>He&rsquo;s certainly right about that! But just prior to that, he says, &ldquo;Technical debt doesn&rsquo;t
exist&rdquo;, and sort of wanders around that idea for a bit.</p>
<p>Here&rsquo;s the rub: He then tries to define what technical debt actually <em>is</em>:</p>
<ol>
<li>&ldquo;Maintenance work.&rdquo;</li>
<li>&ldquo;Features of the codebase that resist change.&rdquo;</li>
<li>&ldquo;Operability choices that resist change.&rdquo;</li>
<li>&ldquo;Code choices that suck the will to live.&rdquo;</li>
<li>&ldquo;Dependencies that resist upgrading.&rdquo;</li>
</ol>
<p>I&rsquo;ll leave you to read his descriptions of each.</p>
<h3 id="critique:fc8739f224026f63bff9056cb99f10ff">Critique</h3>
<p>Unfortunately, a lot of the definitions he raises there are highly subjective and extremely
difficult to understand, except at a base, emotional, almost visceral level. I mean, when you
explicitly use the phrase &ldquo;suck the will to live&rdquo; as one of your definitions, it&rsquo;s hard to really
put a concrete discussion around that.</p>
<p>Consider, for example, that particular point: &ldquo;A significant percentage of what gets referred to
as technical debt are the decisions that don’t so much discourage change but rather discourage
us from even wanting to look at the code. &hellip; We often describe this code with the
suck-the-will-to-live quality as messy (spaghetti), unmaintainable, or amateurish. Often what we’re
describing under the surface is fear and confusion. We don’t understand the code, and when we don’t
understand things, as human, we tend to get scared, tired, and angry.&rdquo;</p>
<p>I&rsquo;m sure every single person reading this has an immediate reaction, akin to the screams through
the Force that Obi-Wan felt when Alderaan was destroyed. Everybody remembers That One Project,
or That One Class, or That One File&hellip;. Nobody wanted to touch it, it was a mess, and people would
look for every reason under the sun to avoid opening it, as if there was some kind of icky black
ichor that could ooze out of the screen and keyboard and infect us with its ugliness.</p>
<p>And yet, if we compare the stories, we will all have very different concrete-terms descriptions
of what that thing was. And I&rsquo;ll even bet that if you cast the net wide enough, and we spend
enough time comparing stories, we&rsquo;ll even find that one man&rsquo;s &ldquo;suck the will to live&rdquo; is another
man&rsquo;s &ldquo;Whoa, man, that&rsquo;s actually kind of a cool hack.&rdquo;</p>
<p>Case in point: in the earliest days of my career, I was a contractor working on some C/Win16 code
at Intuit. A really cool 3-month gig (and in those days, it was way cool to have Intuit on your
resume). I was working as part of the &ldquo;Slickrock&rdquo; team, which was the code-name for Intuit&rsquo;s
nascent electronic banking section of Quicken 5 for Windows. It was some cool stuff.</p>
<p>Except&hellip;.</p>
<p>Well, first of all, everything was written in C. Not C++, as was the leading-edge of the day,
but using Intuit&rsquo;s home-grown C/Windows library that they&rsquo;d put together since the earliest
days of the product. At the time, I was kinda bleah on the whole idea. (In retrospect, hey,
if it still works, you know?)</p>
<p>And there was this one dialog box to which I was assigned, which had a bunch of bugs in it
that needed fixing, that nobody else on the team wanted to touch. Eager to prove to all these
grizzled veterans that I was capable of handling the toughest stuff, I leapt at the chance to
get into this thing. (If you get this picture of the eager young Private fresh from boot camp,
volunteering to go out on that mission that the grizzled old Sergeant knows will just crush
the life out of him, you&rsquo;re probably not too far off the mark.)</p>
<p>And here&rsquo;s what I found: this dialog box code was one, giant, four-page-long function, where
three-and-three-quarters of it was wrapped in one giant-ass <code>do-while</code> loop. But not just any
<code>do-while</code> loop; no, this one was the most bizarre thing I&rsquo;d ever seen. It looked something
like this:</p>
<pre><code class="language-c">do {
/* do one thing */
/* do another thing */
/* check that thing */
/* what about the thing over there */
} while (0);
</code></pre>
<p>It was my own private &ldquo;WTF?!?&rdquo; moment. No wonder everybody wanted to stay clear of this
thing! This was the craziest code I&rsquo;d ever seen, and clearly it was because they weren&rsquo;t
using C++!</p>
<p>(Yeah, I kinda was that stupid back then.)</p>
<p>But when I showed this to one of the other engineers and said the 90&rsquo;s equivalent of
&ldquo;Dude, seriously?&rdquo;, he pointed out that I&rsquo;d missed an important part of the whole thing:</p>
<pre><code class="language-c">int result = -1; /* Not OK! */
do {
/* do one thing */
if (!thing-worked)
break;
/* do another thing */
if (!another-thing-worked)
break;
/* check that thing */
if (!thing-checked)
break;
/* what about the thing over there */
if (!thing-over-there-checked)
break;
result = 0; /* OK! */
} while (0);
return result;
</code></pre>
<p>In other words, this incredibly idiotic thing actually served a useful purpose: it
obeyed the old C rule of &ldquo;single entry, single exit&rdquo;, and more importantly, it was
rather elegantly obeying the fail-fast principle. (Why bother doing all these other
checks if you&rsquo;ve already failed at the first step?)</p>
<p>Now, I grant you, this could&rsquo;ve been solved in C++ using exceptions; instead of the
(not-really-a-)loop, he could just have done a &ldquo;try&rdquo;, and then each step could&rsquo;ve
thrown their own new exception type, and there&rsquo;d have been either a single &ldquo;catch&rdquo; to
return the appropriate error code (since this block was returning either -1 or 0,
depending on success), or even maybe a separate &ldquo;catch&rdquo; block to handle each different
error condition, and&mdash;</p>
<p>But you know what? Today, looking back at it, I don&rsquo;t know if that would&rsquo;ve been much
clearer, or much shorter, or what-have-you.</p>
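<p>For the curious, here&rsquo;s roughly the shape that exception-based rewrite would take (a sketch of my own, in Java rather than the C++ I was imagining back then, with made-up step names since the original code isn&rsquo;t mine to reproduce):</p>
<pre><code class="language-java">class StepFailedException extends Exception { }  // hypothetical failure type

class Dialog {
    int handle() {
        try {
            doOneThing();          // each step throws on failure, so the
            doAnotherThing();      // later steps never run--the same
            checkThatThing();      // fail-fast behavior as the do/while(0)
            checkThingOverThere();
            return 0;  /* OK! */
        } catch (StepFailedException e) {
            return -1; /* Not OK! */
        }
    }

    void doOneThing() throws StepFailedException { /* ... */ }
    void doAnotherThing() throws StepFailedException { /* ... */ }
    void checkThatThing() throws StepFailedException { /* ... */ }
    void checkThingOverThere() throws StepFailedException { /* ... */ }
}
</code></pre>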
<p>Is this still life-sucking-code? Or is this an elegant hack? I&rsquo;ll be honest, I&rsquo;m not
sure anymore, of either position.</p>
<h3 id="technical-debt-a-definition:fc8739f224026f63bff9056cb99f10ff">Technical Debt: A Definition</h3>
<p>I don&rsquo;t have one.</p>
<p>Seriously.</p>
<p>Not one I particularly like, anyway. Google it, and you get:</p>
<blockquote>
<p>Technical debt (also known as design debt or code debt) is a metaphor referring to
the eventual consequences of any system design, software architecture or software
development within a codebase.</p>
</blockquote>
<p>&ldquo;eventual consequences&rdquo;? You mean, like &ldquo;it works&rdquo;? Seriously, consequences are not
always bad, which is why the Gang-of-Four used that same word to describe the results
of a particular solution applied to a particular problem within a certain context.
Consequences can be positive, and they can also be negative. The use of the Strategy
pattern can allow for varying an algorithm at runtime&mdash;but with it comes an added
cost in complexity of determining which Strategy to load, for example, or the additional
cognitive load of having to realize that now the Strategy being executed may be nowhere
local to the code actually executing it (which would at some level seem to violate the
principle of locality, depending on the situation).</p>
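<p>A quick sketch of that trade-off (mine, not the GOF&rsquo;s; the pricing scenario is invented purely for illustration):</p>
<pre><code class="language-java">import java.util.function.IntBinaryOperator;

class Pricer {
    // The Strategy: which pricing algorithm runs is decided at runtime...
    private final IntBinaryOperator pricing;

    Pricer(IntBinaryOperator pricing) { this.pricing = pricing; }

    int price(int base, int qty) {
        // ...which means the code executing here may live nowhere near
        // this class--flexibility bought with extra cognitive load
        return pricing.applyAsInt(base, qty);
    }
}

class Demo {
    public static void main(String[] args) {
        Pricer retail = new Pricer((base, qty) -&gt; base * qty);
        Pricer bulk = new Pricer((base, qty) -&gt; qty &gt; 10 ? base * qty * 9 / 10 : base * qty);
        System.out.println(retail.price(5, 20)); // 100
        System.out.println(bulk.price(5, 20));   // 90
    }
}
</code></pre>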
<p>Wikipedia goes on to say:</p>
<blockquote>
<p>The debt can be thought of as work that needs to be done before a particular job
can be considered complete or proper.</p>
</blockquote>
<p>Now that&rsquo;s interesting, because that certainly doesn&rsquo;t jibe with what @kellan was
alluding to earlier&mdash;this sounds like things like documentation and tests and such.
And yes, that definitely could create a problem, if a company/team/programmer
goes off and writes a whole bunch of untested, undocumented code; I&rsquo;d call that
indebted code, probably, sure.</p>
<p>Unless, you know, it doesn&rsquo;t really need documentation. Or tests. Like, for example,
a module composed of much smaller functions, each of which is effectively a small
primitive that really doesn&rsquo;t need testing, a la:</p>
<pre><code class="language-ruby">def calcuate(lhn, op, rhn)
return op(lhn, rhg)
end
def add(lhn, rhn)
return lhn + rhn
end
def sub(lhn, rhn)
return lhn - rhn
end
puts calculate(1, add, 2)
</code></pre>
<p>Do these really require comments? Tests? Wouldn&rsquo;t it actually <em>add</em> to the technical
debt to put those into place, since now they must be maintained and kept up to date
should something change in here?</p>
<p>I&rsquo;m obviously reaching here, but I don&rsquo;t think the point is entirely invalidated
by the simplicity of my example&mdash;after all, well-written methods are supposed to be
small and focused, and we prefer classes not to be large, and so on, for precisely
these kinds of reasons.</p>
<h3 id="technical-debt-it-s-a-metaphor-stupid:fc8739f224026f63bff9056cb99f10ff">Technical Debt: It&rsquo;s a metaphor, stupid</h3>
<p>Go back to Wikipedia for a second; there, they finish the definition&rsquo;s first
paragraph with this:</p>
<blockquote>
<p>If the debt is not repaid, then it will keep on accumulating interest, making it
hard to implement changes later on. Unaddressed technical debt increases software entropy.</p>
</blockquote>
<p>See, this is the heart of the matter: technical debt is a metaphor. That&rsquo;s it.
That&rsquo;s all it is. It&rsquo;s a literary mechanism designed to help people who are not programmers
understand that there are decisions made during the development process of a project,
decisions which are deliberate choices to take a shortcut or avoid a more generic solution
in the interests of getting past the obstacle quickly.</p>
<p>Except that nothing ever remains &ldquo;just&rdquo; a metaphor in our industry. Inevitably, we have
to dissect it, treat it as if it were a real, in-the-room-with-us kind of thing, and start
crusades &ldquo;for&rdquo; or &ldquo;against&rdquo; it. Because REASONS.</p>
<p>And I admit, I&rsquo;m not immune to this tendency myself. Because in examining a metaphor,
so long as we recognize it&rsquo;s a metaphor and therefore bound to fail at some point
(the model is not reality), we can actually find some interesting edge-cases that may
or may not apply, and that leads us to some interesting conversations about the concept,
even if it doesn&rsquo;t fit the metaphor anymore.</p>
<h3 id="technical-debt-a-fowlerian-definition:fc8739f224026f63bff9056cb99f10ff">Technical Debt: A Fowlerian Definition</h3>
<p>Martin Fowler has gone into great detail about the different kinds of technical
debt in the form of a <a href="http://martinfowler.com/bliki/TechnicalDebtQuadrant.html">debt quadrant</a>,
arranged along two axes of &ldquo;Deliberate vs Inadvertent&rdquo; against &ldquo;Reckless vs Prudent&rdquo;.</p>
<p>I like this, simply by virtue of the fact that it captures the mindset of the developer
or the team at the time they were making that decision.</p>
<p>But I <em>don&rsquo;t</em> like it because, well, because who cares what they were thinking, or why?
Isn&rsquo;t technical debt just technical debt? I mean, that $50 purchase on your credit card, was
it a measured and thoughtful purchase, perhaps some tools at the local hardware store, so
that you can perform some home repairs, or was it a reckless and idiotic purchase, perhaps
some tools at the local hardware store, so that you can pretend to yourself that you&rsquo;re
actually going to take this weekend and perform some home repairs, but deep down you know
you&rsquo;re just fooling yourself, you&rsquo;ll never get it done, and the tools will now be left to
rust in a quiet corner of the garage (or worse, you&rsquo;ll leave them out in the backyard and
it rains and and and)?</p>
<p>Seriously: the guy who wrote that <code>do-while</code> loop at Intuit? I have no idea what he was
thinking&mdash;and did that intent really make a helluvalot of difference to me (or the rest of the
team) as I (we) tried to pick it up and extend/modify/debug it? I won&rsquo;t speak for the rest
of the team, but to me, it made not a single whit of difference.</p>
<p>But here&rsquo;s the vastly more important thing to realize about debt: At the end of the day,
<em>you still owe $50</em>.</p>
<p>Wherever that debt appears on Fowler&rsquo;s quadrant, you still have to pay it off. Or it will
gather interest, and eventually (if you leave it long enough) bankrupt you.</p>
<p>Granted, this is perhaps where that metaphor starts to wear a little thin. In a codebase,
where we perhaps deliberately chose not to use a Strategy pattern, but instead just coded
the algorithm by hand directly into the code, because we don&rsquo;t really see any need for
any greater degree of flexibility, we have potentially amassed some (perhaps small) amount
of technical debt. The $50 hammer, if you will.</p>
<p>In a traditional credit card scenario, that $50, compounded at 5% or 10% interest, will,
without exception, eventually turn into a monstrous pile of debt that you cannot pay off,
assuming you leave it unpaid for that long.</p>
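<p>(A quick back-of-the-envelope check on that, assuming a 10% rate compounded annually&mdash;my illustration, not a claim about any particular card:)</p>
<pre><code class="language-java">class DebtDemo {
    public static void main(String[] args) {
        double debt = 50.0;
        for (int year = 1; year &lt;= 30; year++) {
            debt *= 1.10;  // 10% interest, compounded once a year
        }
        System.out.printf(&quot;After 30 years: $%.2f%n&quot;, debt);  // about $872
    }
}
</code></pre>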
<p>But that non-Strategized algorithm? So long as the client requirements don&rsquo;t change,
there&rsquo;s not a thing wrong with it. It can continue to run, and run, and run, until the
heat death of the universe, and nothing happens.</p>
<p>This suggests to me that technical debt isn&rsquo;t just about what the developers on the team
at the time were thinking about. This suggests that technical debt has two more
components to it:</p>
<ul>
<li>The thoughts of the developer(s) who have now inherited the code.</li>
<li>The requirements (or lack thereof) of the project for which the code was written.</li>
</ul>
<p>See, if the client never changes their requirements, <em>there is no technical debt</em>. So
long as the code continues to run, without problem, then what the code looks like is
entirely irrelevant. It&rsquo;s only when the client says, &ldquo;OK, now we need to do this other
thing with this codebase&rdquo; that it becomes a problem.</p>
<p>Although, now that I write this, I realize that&rsquo;s not entirely accurate, either.</p>
<p>If the client&rsquo;s requirements haven&rsquo;t changed, but the code doesn&rsquo;t run, or runs
into errors while running? Those are bugs, and the code needs to change (to remove
the bug). And that&rsquo;s the other case where now, a developer struggling to understand
the code in which the bug may (or may not) live will be running into difficulty.
Enter technical debt again.</p>
<p>Which now suggests that technical debt is essentially &ldquo;a developer&rsquo;s cognitive difficulty
in understanding and/or modifying a codebase&rdquo;. Nothing to do with the decisions made
at the time of the codebase&rsquo;s creation, and everything to do with the developer who
is attempting to understand what the code is trying to do or how to modify it.</p>
<h3 id="technical-debt-moving-on:fc8739f224026f63bff9056cb99f10ff">Technical Debt: Moving on</h3>
<p>So does @kellan (and Peter Norvig) have it right, that &ldquo;code is a liability&rdquo;?</p>
<p>On the surface of it, maybe: if I write code, it is <em>potential</em> technical debt.</p>
<p>See, it&rsquo;s not technical debt yet, because it won&rsquo;t actually be a debt until we trigger
the &ldquo;understanding and/or modifying&rdquo; clause of the above. The Ruby script I hacked
together to transform my old blog&rsquo;s XML over to Markdown files for the new system,
I won&rsquo;t know whether that code is technical debt until I (or anybody else who wants
to use or modify it) go back into it again and face the cognitive load of understanding
it or modifying it. So it&rsquo;s like the infamous cat in the box, neither debt nor not,
until the box is opened.</p>
2016 Tech Predictionshttp://blogs.tedneward.com/post/2016-tech-predictions/
Mon, 04 Jan 2016 20:54:34 -0800http://blogs.tedneward.com/post/2016-tech-predictions/
<p>As has become my tradition now for nigh-on a decade, I will first go back over last year&rsquo;s
predictions, to see how well I called it (and keep me honest), then wax prophetic on what I
think the new year has to offer us.</p>
<h2 id="in-2015:a4c6fac861e72246054ac04aa0f1f2b1">In 2015&hellip;</h2>
<p>As per previous years, I&rsquo;m giving myself either a <b>+1</b> or a <b>-1</b> based on a
purely subjective and highly-biased evaluational criteria as to whether it actually happened
(or in some cases at least started to happen before 31 Dec 2015 ended).</p>
<p>In 2015, I said:</p>
<ul>
<li><b>"Big data", "Big data", "Big data". You will get sick of this phrase.</b>
I can't speak for everybody, but I can tell that the end is near for the term, because
suddenly everybody is using the term, and they're using it to mean anything and everything.
"Big data" is not just about doing deep data science analysis on petabytes of data; it's
about any analysis (even simple reporting) on any collection of data (no matter how large
or small) for any reason. <b>+1</b>
</li>
<li><b>"Internet of Things". You will get sick of this phrase, too.</b> This hasn't
quite happened yet, but we're close. IoT is also starting to fray at the edges as a
definition, and when that happens, it's immediately ripe for abuse and marketing. More
importantly, though, lots of people are starting to realize that IoT is not the huge
"automatic win" that we all sort of assumed it would become. <b>+1</b>
</li>
<li><b>"Internet of Medicine" or "Big Med".</b> Well, nobody's started using the term yet,
but certainly they're spending a lot of time in this space. They just don't like my term
yet. I'm pouting over it, but it's still a <b>-1</b>.
</li>
<li><b>"Tech bubble" becomes a "thing".</b> Oh, this one came down to the very wire, but
as December of 2015 rolled in, concerns over the actual valuations of the so-called "unicorns"
were starting to show up, and lots of people were beginning to openly wonder if Silicon Valley
and Wall Street were experiencing a falling-out. It took a hail-mary pass to do it, but I'm
claiming my <b>+1</b>.
</li>
<li><b>C# and Java will both make big announcements.</b> C#6 shipped, but Java9 didn't, leaving
me sort of confused as to how to score this one. However, I did say, "Those who care will
take note, those who don’t, won’t. Really, we’re kind of past the point where either of
those languages are going to be interesting to anyone who’s not already in that space", and
frankly, if you weren't a C# or Java developer, you probably didn't even hear a whisper about
either one (pro/shipping or con/not-shipping), either way. <b>+1</b>
</li>
<li><b>Go is going to either take off, or crash and burn.</b> The point of this one was that
Go was reaching an inflection point, and while I think it's gathering some momentum (including
with me personally--this new blog is using Go to do the site generation), I can't really tell
if it reached an inflection point, so <b>-1</b> to me.
</li>
<li><b>Microsoft acquires Xamarin.</b> Oh, as much as I thought (and still think) that it would
be a great story for both sides, it didn't happen, and probably never will. *sigh* <b>-1</b>
</li>
<li><b>Amazon just quietly keeps churning.</b> I dunno how I would measure this one, but
in some ways, as long as Amazon just keeps churning out new feature after new feature on AWS,
and keeps making money selling stuff on their main web property--which they continue to do--then
I think pretty much anything here qualifies as a <b>+1</b>. But it was kind of a lame
prediction to begin with, now that I re-read it.
</li>
<li><b>Google continues to throw sh*t against the wall, looking for their Next Big Thing.</b> I
believe the exact phrase I used was, "Expect a lot of announcements, a lot of "beta"s, and
none of it with any kind of realistic or even well-planned business model behind it--including
the Google Car." And, sure enough, we've heard a ton about the Google Car, among other
initiatives, but nothing has stepped up as a product yet to even come close to acting as a
second line of income for the firm. I call it an easy <b>+1</b>.
</li>
<li><b>Web use on mobile devices decreases in favor of apps.</b> In particular, I said,
<blockquote>This is going to happen whether the public wants it or not, because companies have
figured out that it behooves them to have you "trapped" inside their app (where they can
control all the content) rather than on their website. More and more websites are going to try
and redirect you to inside their app, rather than allow you to casually browse on their site,
because then they think they "own" your eyeballs. The only way this changes is if/when some
firm gets crushed in the court of public opinion by doing something really stupid...
and that won't happen in 2015. Wait for it in 2016.
</blockquote>
Various "clickbait" sites were the ones I was thinking about in particular, and while some of
them (I'm looking at you, Uberfacts) have floated a mobile app out there, the apps themselves
don't seem to be lighting any fires in the mobile marketplaces. I'll talk more about this
in a bit, but for now, I'm giving myself a <b>-1</b>.
</li>
<li><b>Hipster "Uber for X" apps will be all the rage.</b> Have you been to San Francisco
recently? Talked to anybody on the street there? This one was a slam-dunk <b>+1</b>.
</li>
<li><b>Mark Zuckerberg grows up a little.</b> Zuckerberg will never admit it, but now that he's
married and starting a family and all, he's starting to grow up. His paternity leave step was
a big one, and signals that maybe he's finally ready to "adult" now. If so, he's in good
company--it took Bill Gates getting married and having kids to come in out of the rain, too,
and ever since that time, Bill's become a philanthropist of the highest order. <b>+1</b>
</li>
<li><b>Larry Ellison buys a sports team.</b> Didn't happen yet. <b>-1</b></li>
<li><b>Perl makes one final gasp at relevancy, fails, and begins to decompose.</b> Oh, this one
is funny; how could I be so right, and so wrong at the same time? Wrong, because Perl 6 actually
<a href="https://perl6advent.wordpress.com/2015/12/25/christmas-is-here/">finally shipped</a>. And yet,
so right because... well, how many people do you know using it? Or were even paying attention
when it came out? Or.... Yeah. <b>+1</b>
</li>
</ul>
<p>Nine up, five down. Not bad.</p>
<p>That was the easy part. Now, on to the&hellip;.</p>
<h2 id="2016-predictions:a4c6fac861e72246054ac04aa0f1f2b1">2016 Predictions</h2>
<p>In no particular order:</p>
<ul>
<li><strong>Microsoft will continue to roll out features on Azure, and start closing the gap between it and AWS.</strong>
This one is not hard to imagine. Microsoft is committed to making Azure a core part of their company
success and survival, and Amazon has a list of features that Azure lacks, so it really boils down to
&ldquo;take one down, cross it off the list, lather, rinse, repeat&rdquo;.</li>
<li><strong>(X)-as-a-Service providers will continue to proliferate.</strong> We&rsquo;re seeing a huge surge in these various
companies that are providing some vertical thing as a service, and for most of those they&rsquo;re tech-related
(such as Database-as-a-Service, Container-as-a-Service, and so on). Part of that is because if there&rsquo;s one
thing software developer geeks know, it&rsquo;s the tool they really wished they&rsquo;d had on that
last project they worked. This coming year will mark the
high-water mark of companies that provide *aaS products to the developer community, and then they&rsquo;ll all
start cannibalizing each other and some shutdowns, acquisitions and partnerships will kick in.</li>
<li><strong>Apple will put out two, maybe three new products, and they&rsquo;ll all be &ldquo;meh&rdquo; at best.</strong> Let&rsquo;s be frank,
folks, the luster is off the shiny Apple logo on the side of the building. Tim Cook is no Steve Jobs,
and Apple of 2015 was not the Apple of 2005 or 2010. The Apple Watch is interesting, but it certainly
hasn&rsquo;t taken off. No watch seems to have, in fact, become the &ldquo;it&rdquo; thing. I don&rsquo;t see many of them (or
their Android competitors, to be fair) at tech conferences, and casually glancing around the airport
doesn&rsquo;t show a ton of them in use. I don&rsquo;t think this is going to change any time soon, either. For
most people, the wearable just hasn&rsquo;t really offered up that compelling reason yet, and I don&rsquo;t think
2016 is going to see one, either.</li>
<li><strong>iOS 10 will be called iOSX.</strong> Just because they can, and because it would confuse the hell out of
people, and because Steve Jobs is not here anymore to tell that VP of Marketing to sit down, shut up,
and let the grown ups do this.</li>
<li><strong>Android N will be code-named &ldquo;Nougat&rdquo;.</strong> They might go with &ldquo;Nutella&rdquo;, but that would involve
copyright and trademark issues, which I imagine they&rsquo;d want to avoid.</li>
<li><strong>Java9 will ship.</strong> This is a &ldquo;no-duh&rdquo; prediction, but I&rsquo;m not above claiming a few of those. Really,
the bigger question there will be <em>what</em> will ship that Oracle calls &ldquo;Java9&rdquo;, and my personal feeling
is that modules/Jigsaw/whatever-we&rsquo;re-calling-them-now won&rsquo;t be in it. Slamming a module mechanism
on top of a platform that&rsquo;s a decade old, millions of programmers wide and a billion lines of code
high is not easy, and I don&rsquo;t think Oracle really has the energy, motivation or need to push them
through the morass of headaches that will stem from imposing a module system into place.
<strong>UPDATE</strong>: @olivergierke <a href="https://twitter.com/olivergierke/status/684642273561329664">points out</a>
that Java9 had already slipped to 2017, so this one is automatically going to be a &ldquo;miss&rdquo; next
January. <a href="http://www.infoworld.com/article/3011445/java/java-9-delayed-by-slow-progress-on-modularization.html">The article he cites</a>
says that Oracle &ldquo;blamed the delay on complexities in developing modularization&rdquo;, a la Project
Jigsaw. Honestly, I&rsquo;m going to stand by this prediction, because it would not surprise me in the
slightest if Oracle comes back at some point in 2016 and says, &ldquo;You know what? Fuck it.&rdquo; and ships
Java9 without modularization in place&ndash;I don&rsquo;t really think Java9 needs it at this point, and I&rsquo;m
not entirely sure that shipping <em>with</em> it will make Java all that much better. Time will tell&hellip;</li>
<li><strong>Facebook will start looking for other things to do.</strong> Yes, Facebook has been ridiculously
successful to date; it claims more population than most nations on Earth, in fact. But the company
is led by a classic Type-A personality, and the softening of his character by the birth of his
firstborn notwithstanding, this is when Zuckerberg comes back from leave and says, &ldquo;OK, boys and
girls, it&rsquo;s time to take us down a new path!&rdquo; and charges off into who-the-Hell-knows-what. I won&rsquo;t
hinge the prediction on <em>what</em> that would be, I just think it&rsquo;ll be something outside of the social
media realm (or tied to it just a little bit).</li>
<li><strong>Google will continue to quietly just sort of lay there.</strong> Google, for all that they are on the top
of everybody&rsquo;s minds since that&rsquo;s the search engine most of us use, hasn&rsquo;t really done much by way of
software product invention recently. Google+, Google Hangouts, yeah, sure, that was so 2013, but
what have you done for us lately? And honestly, what have they done recently, in 2015? Casting back
through my memory (and setting Android off to the side, since I consider that more or less an
independent effort in a lot of ways), I came up with nothing. I suspect the same will be true of
2016&ndash;they will continue to do lots of innovative things, but it&rsquo;ll all be &ldquo;big&rdquo; and &ldquo;visionary&rdquo;
stuff, like the Google Car, that won&rsquo;t have immediate impact or be something we can use in 2016
(or 2017).</li>
<li><strong>Oracle will quietly continue to work on Java.</strong> Oracle took a bit of a PR hit this year when they
fired/let go a number of &ldquo;Java evangelists&rdquo;, and that set the newsstands aflame with hints and
rumors that Oracle was getting ready to abandon Java. Frankly, if I&rsquo;m Larry Ellison (or the VP
that has Java under my umbrella), I&rsquo;m asking a very fundamental question: What the hell does Java
need with evangelists at this point? Everybody more or less knows what it is already, there&rsquo;s
nothing to sell in and of itself, and that money could probably be put to better use hiring people to
work on the codebase itself, or putting the cash back into the rest of the firm to hire a few more
Oracle Database salespeople. Oracle didn&rsquo;t acquire Java because they saw it as a way to inflict
the Oracle Database upon the world&ndash;quite the opposite. Oracle acquired Java because they <em>use</em>
Java, all over the place, and this way they had control over a technology that they had &ldquo;bet the
farm&rdquo; on in a variety of ways. They&rsquo;re not going to kill it, but there&rsquo;s really not a whole lot
of need to go around preaching its message, either. So they let the evangelists go, and they&rsquo;ll
just keep on keepin&rsquo; on.</li>
<li><strong>C# 7 will be a confused morass.</strong> Microsoft is now striding boldly into that open-source world
it timidly courted just a few years ago. But in a lot of ways, this is highly uncharted territory
for the software giant, and for the OSS world as well. Sure, Linus has been releasing Linux kernel
after Linux kernel for years, but with himself as the autocrat in charge of it all. Microsoft wants
to make use of the open source man-hours to help advance the cause of C# 7, but whether they&rsquo;ve
smoothed out what that process will look like and/or how they will deal with the inevitable
conflicts between committers and company isn&rsquo;t yet clear. (Oracle is in this same boat, in a lot
of ways, and there&rsquo;s a lot of people who think that Java is too much Oracle, not enough OSS, so
to speak.) I think the C# 7 release will be one of the first that the world gets to see take
shape in a purely public forum, and they will be a bit confused and surprised at how chaotic
a product release can really be. (Yes, C# 6 was sort of in that same boat, but only a handful
of folks were really paying attention.)</li>
<li><strong>Another version of Visual Basic will ship, and nobody will really notice.</strong> Actually, that
already happened&ndash;remember when C# 6 shipped? They shipped a new version of VB then, too.
Alas, the ship has sailed on VB, and frankly, at this point, it&rsquo;s really just a husk of its
former self. Most of the VB luminaries are all speaking and/or writing in C# these days, and
only staunch loyalty to their fond memories of the language is what keeps it at all in the
conversation anymore. Sad, but&hellip; Oh, well.</li>
<li><strong>Apple will now learn the &ldquo;joys&rdquo; of growing a language in the public as well.</strong> Swift is now
open-source, and that will bring with it the same pains as what Oracle and Microsoft are feeling.
Enjoy, guys!</li>
<li><strong>Ted will continue to layer in a few features into the blog engine.</strong> For example, right now
I have no comments feature, and I suspect people will want to start telling me how incredibly
<strong>wrong</strong> I am about so many of these. So, on the docket already, Disqus or Discourse or some
other JavaScript-based comment-engine integration. Plus, I want to tweak the template I&rsquo;m using
for the blog&rsquo;s look and feel a little (although keeping it way simple, especially compared
to what I had before), so there&rsquo;s likely to be more than a few tweaks here and there. (Again,
not really a hard prediction to make, but I always like to close on a prediction that I have a
roughly .9 probability of hitting.)</li>
</ul>
<p>Happy Holidays, and thanks for reading!</p>
Peoples be talkin'...http://blogs.tedneward.com/post/peoples-be-talkin/
Wed, 04 Jun 2014 04:13:43 -0800http://blogs.tedneward.com/post/peoples-be-talkin/<p>
"Ted, where the hell did you go?"
</p>
<p>
I've been getting this message periodically over a variety of private channels, asking
if I've abandoned my blog and/or if I'm ever going to come back to it. No, I haven't
abandoned it, yes, I'm going to come back to it, but there's going to be a few changes
to my online profile that I'll give you a heads-up around... if anybody cares. :-)
</p>
<p>
First of all, <a href="http://blogs.tedneward.com/2013/12/10/On+Endings.aspx">as I
mentioned before</a>, LiveTheLook and I parted ways back at the end of 2013. Sad,
but every cloud has a silver lining in that I found a new home as the CTO of <a href="http://www.itrellis.com">iTrellis</a>,
a custom software development and IT continuous improvement consultancy. And therein...
lies the root of my problem.
</p>
<p>
Truth time: I'm ridiculously busy. And even more ridiculously happy.
</p>
<p>
For years now, almost a full decade in fact, people have been asking me when I was
going to start up a consulting company. (In fact, before I jumped in to LiveTheLook,
I interviewed about a job at GitHub, and Phil Haack, who's known me for years, expressed
outright surprise at the idea. "I've always pictured you as the consummate consultant--what
makes you want to go work at a product company?" And truth was, he was right--the
idea of working for a product company (like GitHub) wasn't really a strong appeal.
What was appealing was the idea of growing a team, managing a group of developers,
making them better as a group in a variety of manager-y ways. That was a large part
of the attraction of LiveTheLook, though I never got to the point of hiring anyone
to work with me there.) My response to people has always been the same: I believe
that a company needs a triumvirate of people at the top--one to handle sales/marketing/business
development, one to handle the technology, and one to handle the operations. I could
never seem to find a great biz-dev guy, nor a great ops guy, and so thoughts of building
a consulting firm were pretty far off in the distance.
</p>
<p>
But after LtL, a mutual acquaintance heard that I was looking, and he knew two guys
who were looking for a CTO for this new consulting company they were spinning up.
Chris (CEO) and Paul (CFO) and I met a few times. Chris and I in particular spent
a fair amount of time talking, weighing the mutual decision to jump into this thing
together, because it was obvious from the very beginning that he and I would need
to be able to work well together--if he was going to go off and do biz-dev, he had
to trust that I could carry the implementation through, and I had to trust that
he wasn't going to sell a bill of goods I couldn't possibly deliver, and so on
and so on.
</p>
<p>
Six months later, we're at four current clients (with a fifth one scheduled to spin
up in July), five billable consultants (including Chris and me, working together to
do an IT assessment project for a $10bn business unit of a $100bn company out on the
East Coast), and there's strong evidence to suggest that we'll crest the $1mn mark
in our first year of existence.
</p>
<p>
Yeah... it's been a fun ride so far. :-) And neither Chris nor I have any intention
of slowing down any time soon.
</p>
<p>
But, what I'm finding is that between billable hours, biz-dev meetings, implementation
meetings, one-on-ones with my people, speaking, and writing for the various publications
I still contribute to, I have almost no energy left to blog. At least, for now.
</p>
<p>
I have plans, though. Here's what I'm looking to do:
</p>
<ul>
<li>
First, we're going to stand up an iTrellis blog, and a lot of the technical content I
write will be hosted in both places (there and here), where and when it makes sense.
Maybe, over time, the content will shift mostly over there, but I'll probably
always keep this channel open in some fashion.</li>
<li>
Second, I want to spin up a "personal blog", one in which I feel more comfortable
expressing completely non-technical ideas and topics, including politics and such.
That way, those who are interested in just the technical content can still get that,
and those who want to hear what I think about the rest of the world can tune in on
a separate channel.</li>
<li>
Third, I'll likely migrate this content into a new technical blog over at the "new"
professional website I'm slowly building out for myself, at <a href="http://www.newardassociates.com">www.newardassociates.com</a>.
That will eventually become the only technical channel I use, but I'll
set something up at this domain to redirect links to the corresponding blog entries
over there. That is going to be the real PITA in all of this, because I really want
to preserve the old links without having to stand up the same blog system over
there--see the sketch just after this list for roughly what that redirect might
look like. (I'm "done" with the idea of a server-side processed blog--the blog entries
should be just plain ol' HTML, generated from whatever source I choose to write in, a la
Jekyll and its ilk. Plus, I never again want a blog with anything other than tech-agnostic
URLs; the whole ".../On+Endings.aspx" thing is soooooo 1997. Why should you--or I--care
what the underlying implementation is?)</li>
</ul>
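<p>
(For the curious, here's a minimal sketch of roughly what that redirect handler might
look like--assuming a plain ASP.NET <code>IHttpHandler</code> mapped to the old *.aspx
URLs. The <code>LegacyPermalinkRedirector</code> name and the <code>redirects.txt</code>
mapping file are made up for illustration; consider it a starting point, not a finished
implementation.)
</p>
<pre><code>// Hypothetical sketch: 301-redirect old dasBlog permalinks
// (e.g. "/2013/12/10/On+Endings.aspx") to their new static homes.
using System.Collections.Specialized;
using System.IO;
using System.Web;
using System.Web.Hosting;

public class LegacyPermalinkRedirector : IHttpHandler
{
    // Old dasBlog path -> new permalink, loaded once from a simple
    // tab-separated mapping file ("redirects.txt" is an assumed name).
    static readonly StringDictionary Map = LoadMap();

    static StringDictionary LoadMap()
    {
        var map = new StringDictionary();
        foreach (var line in File.ReadAllLines(
            HostingEnvironment.MapPath("~/redirects.txt")))
        {
            var parts = line.Split('\t');
            if (parts.Length == 2)
                map[parts[0]] = parts[1];
        }
        return map;
    }

    public void ProcessRequest(HttpContext context)
    {
        var newUrl = Map[context.Request.Path]; // null when no mapping exists
        if (newUrl != null)
            context.Response.RedirectPermanent(newUrl); // HTTP 301
        else
            context.Response.StatusCode = 404;
    }

    public bool IsReusable { get { return true; } }
}</code></pre>
<p>
Registered in web.config against the old *.aspx paths, something like this would let
this domain hand out permanent redirects without dragging the whole dasBlog engine
along. (Usual caveats apply: untested, and it probably wants some error handling
before anybody trusts it with real traffic.)
</p>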
<p>
Of course, like all plans, this is subject to change based on whatever obstacles pop
up to distract me. ("Want to make God laugh? Tell Him your plans." --old Yiddish proverb)
</p>
<p>
(By the way, if you have any experience with taking a dasBlog blog and redirecting
the links over to a new site, please email me how you did it and/or what tools you
used to do it. I'd really prefer to not have to write that redirect handler myself,
if I can help it. I don't even care too much about the comments--it's the entry links
I really want to preserve. I'm even willing to discuss payment measured in bottles
of Scotch... :-) )
</p><p>
I will, at a minimum, promise to keep up the Tech Predictions, though, no matter what
else happens. That's an eight-year tradition that I have absolutely no intention of
ever giving up. Even when I'm old and crotchety and every prediction reads, "I remember
when Swift was first released... you young'uns have NO IDEA what it was like to actually
type your code into an editor. It was hard! It was painful on the fingers! And WE
LIKED IT!"
</p>
On Endingshttp://blogs.tedneward.com/post/on-endings/
Mon, 09 Dec 2013 20:59:24 -0800http://blogs.tedneward.com/post/on-endings/<p>A while back, I mentioned that I had co-founded a startup (<a href="http://www.livethelook.com">LiveTheLook</a>); I'm saddened to report that just after Halloween, my co-founder and I split up, and I'm no longer affiliated with the company except as an adviser and equity shareholder. There were a lot of reasons for the split, most notably that we had some different ideas on how to execute and how to spend the limited seed money we'd managed to acquire, but overall, we just weren't communicating well.</p>
<p>While I'm sad to no longer be involved with LtL, I wish Francesca and the company nothing but success for the future, and in the meantime I'm exploring options and figuring out what my next great adventure will be. It's not the greatest time of the year (the "dead zone" between Thanksgiving and Christmas) to be doing it, but fortunately I've gotten a few leads that may turn out to be hits. We'll have to see. And, while we're sorting that out, I've got plans for things to work on in the meantime, including a partnership effort with my eldest son on a game he invented.</p>
<p>So, what I'm saying here is that if anyone's desperate for consulting, now's a great time to reach out, because I can be bought. :-)</p>
Seattle (and other) GiveCampshttp://blogs.tedneward.com/post/seattle-and-other-givecamps/
Thu, 29 Aug 2013 12:19:45 -0700http://blogs.tedneward.com/post/seattle-and-other-givecamps/<p>Too often, geeks are called upon to leverage their technical expertise (which, from most non-technical people's perspective, is an all-encompassing uni-field, meaning if you are a DBA, you can fix a printer, and if you are an IT admin, you know how to create a cool HTML game) on behalf of their friends and family, often without much in the way of gratitude. But sometimes, you just gotta get your inner charitable self on, and what's a geek to do then? Doctors have "Doctors Without Borders", and lawyers can always do work "pro bono" for groups like the Innocence Project and so on, but geeks...? Sure, you could go and join the Peace Corps, but that's hardly going to leverage your skills, and Lord knows, there's a ton of places (charities) that could use a little IT love while you're off in a damp and dismal jungle somewhere.</p>
<p>(Not you, Seattle. You're just damp today. Dismal won't be for another few months, when it's raining for weeks on end.)</p>
<p>(As if in response, the rain comes down even harder.)</p>
<p>About five or so years ago, Chris, a Microsoft employee, realized that geeks didn't really have an outlet for their desire to volunteer and help out in their communities through the skills they have patiently mastered. So he created <a href="http://givecamp.org/">GiveCamp</a>, an organization dedicated to hosting "GiveCamps" all over the US, bringing volunteer developers, designers, and other IT professionals together with charities that need some IT love, whether that's in the form of a new mobile app, some touch-up on the website, a port from a Microsoft Access app to something even remotely more modern, or whatever.</p>
<p><a href="http://www.seattlegivecamp.org/">Seattle GiveCamp</a> is coming up, October 11-13, at the Microsoft Commons. No technical bias is implied by that--GiveCamp isn't an evangelism event, it's a "let's help people" event. Bring your Java, PHP, Python, and yes, maybe even your Perl, and create some good karma for groups that are doing good things. And for those of you not local to Seattle, there's lots of other GiveCamps being planned all over the country--consider volunteering at one nearby.</p>
On startupshttp://blogs.tedneward.com/post/on-startups/
Mon, 26 Aug 2013 14:37:25 -0700http://blogs.tedneward.com/post/on-startups/<p>Curious to know what Ted's been up to? Head on over <a href="http://signup.livethelook.com">here</a> and sign up.</p>
<p>Yes, I'm a CTO of a bootstrap startup. (Emphasis on the "bootstrap" part of that--always looking for angel investors!) And no, we're not really in "stealth mode", I'll be happy to tell you what we're doing if you drop me an email directly; we're just trying to "manage the message", in startup lingo.</p>
<p>We're only going to be under wraps for a few more weeks before the real site is live. And then.... *crossing fingers*</p>
<p>Don't be too surprised if the tone of some of my blog posts shifts away from low-level tech stuff and starts to include some higher-level stuff, by the way. I'm not walking away from the tech, by any stretch, but becoming a CTO has definitely opened my eyes to the fact that the entrepreneur CTO has some very different problems to think about than the enterprise architect does.</p>