<h1>2017 Tech Predictions</h1>
<p>Ted Neward&rsquo;s Blog, Mon, 02 Jan 2017 &ndash; <a href="http://blogs.tedneward.com/post/2017-tech-predictions/">http://blogs.tedneward.com/post/2017-tech-predictions/</a></p>
<p>It&rsquo;s that time of the year again, when I make predictions for the upcoming year.
As has become my tradition now for nigh-on a decade, I will first go back over last years&rsquo;
predictions, to see how well I called it (and keep me honest), then wax prophetic on what I
think the new year has to offer us.</p>
<p>As per previous years, I&rsquo;m giving myself either a <strong>+1</strong> or a <strong>-1</strong> based on a
purely subjective and highly-biased evaluational criteria as to whether it actually happened
(or in some cases at least started to happen before 31 Dec 2016 ended).</p>
<p>Bear with me for a moment, though. This is just too good.</p>
<h2 id="in-2015:816dfc775e64d848cd23e6c10e386ae4">In 2015&hellip;</h2>
<p>&hellip; <a href="http://blogs.tedneward.com/post/2015-tech-predictions/">I said</a>:</p>
<blockquote>
<p>Microsoft acquires Xamarin.</p>
</blockquote>
<p>Oh, baby. Off by a year. I should go back and give myself a <strong>+1</strong> for this one. It was really
surprising that they hadn&rsquo;t. As a matter of fact, if Microsoft had listened to me and done it in
2015, they&rsquo;d probably have saved themselves a TON of money compared to what they actually paid for
Xamarin in 2016. But they made the acquisition, Xamarin is now part of the Microsoft family, and
(finally!) .NET developers have access to the Xamarin toolchain and can build native iOS and Android
apps without having to shell out additional cash to do so. &lsquo;Bout time, Microsoft. (I suspect this
had everything to do with Satya, to be honest.)</p>
<p>OK, gloat over.</p>
<h2 id="in-2016:816dfc775e64d848cd23e6c10e386ae4">In 2016&hellip;</h2>
<p>&hellip; <a href="http://blogs.tedneward.com/post/2016-tech-predictions/">I said</a>:</p>
<ul>
<li><strong>Microsoft will continue to roll out features on Azure, and start closing the gap between it and AWS.</strong>
Calling this one a <strong>+1</strong>; it doesn&rsquo;t take much research to see this has definitely been happening in 2016.
However, it&rsquo;s not necessarily a true statement that they&rsquo;ve been closing the gap; Amazon keeps adding
stuff as well, and the feature-parity lists are starting to get ridiculous. Whether these features are
actually <em>of use</em>, however, is an important distinction, and something for the second half of this post.</li>
<li><strong>(X)-as-a-Service providers will continue to proliferate.</strong> Oh my, yes, Ted gets another <strong>+1</strong> for this.
When there&rsquo;s an (X)-as-a-Service for running a gaming convention (seriously, <a href="https://tabletop.events/">here</a>),
then you know the proliferation is in full swing. PaaS providers are exploding everywhere, and while
a few have disappeared (farewell, Parse!), it&rsquo;s clear that this was the gold rush of 2016.</li>
<li><strong>Apple will put out two, maybe three new products, and they&rsquo;ll all be &ldquo;meh&rdquo; at best.</strong> I should&rsquo;ve
broken this into two predictions: one about Apple&rsquo;s &ldquo;meh&rdquo; products, and one about wearables. If I&rsquo;d
done that, I&rsquo;d have scored two <strong>+1</strong>&rsquo;s for it, because not only have wearables not really gone very
far (show me somebody wearing a smart watch, and I&rsquo;ll show you a geek with too much time on their
hands and not enough &ldquo;discrimination&rdquo; in their discretionary income), but Apple&rsquo;s product releases
have been&hellip; &ldquo;meh&rdquo;! I&rsquo;m looking at you, iPhone 7, and I&rsquo;m <em>really</em> looking at you, MacBook Pro.
(When Consumer Reports doesn&rsquo;t give the MBP its top rating, you know the luster has faded.) More on
Apple in the second half.</li>
<li><strong>iOS 10 will be called iOSX.</strong> Dangit. Such an opportunity wasted. <strong>-1</strong></li>
<li><strong>Android N will be code-named &ldquo;Nougat&rdquo;.</strong> Why, hello there, Android 7.0 Nougat. So pleased to make
your acquaintance. <strong>+1</strong></li>
<li><strong>Java9 will ship.</strong> As I noted last year, @olivergierke <a href="https://twitter.com/olivergierke/status/684642273561329664">pointed out</a>
that Java9 had already slipped to 2017, so this one was already a <strong>-1</strong>. Sigh. And I called it a
&ldquo;no duh&rdquo; event, too&mdash;I&rsquo;m going to let this one cancel out the extra +1 I&rsquo;d have given myself for
the Apple/wearables thing, just to keep the math safe (and my ego relatively sized).
<a href="http://www.infoworld.com/article/3011445/java/java-9-delayed-by-slow-progress-on-modularization.html">The article he cited</a>
says that Oracle &ldquo;blamed the delay on complexities in developing modularization&rdquo;, a la Project
Jigsaw.</li>
<li><strong>Facebook will start looking for other things to do.</strong> Welllllllll, it&rsquo;d be really tempting to say
that Facebook&rsquo;s new &ldquo;thing to do&rdquo; was &ldquo;Be the deciding factor in who gets elected by passively
encouraging the widespread dissemination of fake news and outright falsehoods!&ldquo;, but seriously,
who would&rsquo;ve believed that even if I had predicted it? Which I didn&rsquo;t. <strong>-1</strong></li>
<li><strong>Google will continue to quietly just sort of lay there.</strong> A year ago, I wrote, &ldquo;Google, for all
that they are on the top of everybody&rsquo;s minds since that&rsquo;s the search engine most of us use, hasn&rsquo;t
really done much by way of software product invention recently. &hellip; I suspect the same will be true of
2016&ndash;they will continue to do lots of innovative things, but it&rsquo;ll all be &ldquo;big&rdquo; and &ldquo;visionary&rdquo;
stuff, like the Google Car, that won&rsquo;t have immediate impact or be something we can use in 2016
(or 2017).&rdquo; And&hellip;. yeah. <strong>+1</strong>. More emphasis around the existing products they&rsquo;ve built, but as a company,
they&rsquo;ve clearly spent most of 2016 on the Alphabet/Google restructure (which accomplished&hellip; what,
exactly?), and anything new has been either way quiet or way removed from the business.</li>
<li><strong>Oracle will quietly continue to work on Java.</strong> A year ago, I wrote, &ldquo;[Oracle is not] going to
kill it, but there&rsquo;s really not a whole lot of need to go around preaching its message, either.
So they let the evangelists go, and they&rsquo;ll just keep on keepin&rsquo; on.&rdquo; Score a <strong>+1</strong> for the
long-haired geek in Seattle; they just keep posting new code.</li>
<li><strong>C# 7 will be a confused morass.</strong> If you permit me the freedom to call it &ldquo;.NET Core&rdquo; instead
of just &ldquo;C# 7&rdquo;, then wow do I get a <strong>+1</strong> on this one. Even if I just constrain my prediction
to C# 7/Roslyn, I still score one, but once you throw in the CoreCLR and &ldquo;dotnetcore&rdquo; and the
different profiles and&hellip;. Holy spaghetti web browser history, Batman! The demarcation lines
of the different project teams working on this whole thing are starting to become <em>really</em>
clear as the different OSS projects each look really consistent within themselves, but then,
when you get to the borders, things just&hellip;. fall apart.</li>
<li><strong>Another version of Visual Basic will ship, and nobody will really notice.</strong> Alas, there was
no new version of Visual Basic, since it would be in lockstep with the release of C# 7 (which
didn&rsquo;t ship), but nobody really noticed. Or cared. Still, nothing shipped, so <strong>-1</strong>.</li>
<li><strong>Apple will now learn the &ldquo;joys&rdquo; of growing a language in the public as well.</strong> First there
was Swift 2, which was itself source-incompatible with Swift 1, and then during the summer,
Apple shipped Swift 3, which was&hellip; source-incompatible with Swift 2, owing to some language
changes that the community effectively decided was necessary. <strong>+1</strong>. (And thanks for that, by
the way&mdash;made teaching iOS this Fall a royal PITA.)</li>
<li><strong>Ted will continue to layer in a few features into the blog engine.</strong> You&rsquo;ve got comments!
And I&rsquo;ll take that <strong>+1</strong>, thank you very much.</li>
</ul>
<p>Nine up (ten, if we count my Xamarin prediction from 2015), four down. Not bad. But now, we move
on to the more interesting part of the post: 2017.</p>
<h2 id="2017-predictions:816dfc775e64d848cd23e6c10e386ae4">2017 Predictions</h2>
<p>The calendar year 2017 is going to be a wild one for the tech industry, largely owing to the
rather large orange elephant in the room&mdash;Donald Trump&rsquo;s election to President of the United
States is a huge wildcard whose randomness simply cannot be overstated. The man <em>thrives</em> on
being unpredictable, and like most industries, the tech industry (for all that it cherishes
&ldquo;innovation&rdquo; and &ldquo;disruption&rdquo;) thrives on predictability. His collection of &ldquo;tech titans&rdquo; at
Trump Tower last month yielded absolutely zero positive traction that I can see, and I suspect
that the various corporate tech leaders (Nadella, Bezos, Cook, etc) are all looking at him right
now the way humans do a rogue elephant&mdash;he could be good for them, so long as he doesn&rsquo;t go
wild and start trampling everything in his path out of spite, anger, fear, or any other of a
half-dozen emotions. There&rsquo;s no prediction here, though&mdash;just a &ldquo;wow, this is an X-factor&rdquo; that
in turn makes predictions that much harder.</p>
<p>But on that note&hellip;.</p>
<ul>
<li><strong>The Congress will call for an investigation into the &lsquo;hacking&rsquo; of the 2016 US election.</strong>
(0.8 probability) To be honest, I&rsquo;m not sure if anybody knows exactly what we mean when we say
&ldquo;the Russians &lsquo;hacked&rsquo; the US election&rdquo; in casual conversation. There&rsquo;s no clear evidence that
the voting machines themselves were cracked or tampered with, but it&rsquo;s fairly easy to see a
correlation between the DNC hacks and Wikileaks disclosures and Trump&rsquo;s corresponding favorability
gains in the polls. That said, though, the five hundred or so US politicians who make up the Congress
(and excluding Trump himself and his transition team) are not comfortable with the idea that
somebody outside the US engaged in some kind of manipulation of the election, and they are
going to want answers. Just yesterday or the day before, though, Trump made the comment that
hacking is &ldquo;extremely hard to prove&rdquo;, and he&rsquo;s right about that&mdash;without some kind of &ldquo;smoking
gun&rdquo; found in a Russian government employee&rsquo;s possession, it&rsquo;s going to remain a major point of
contention in the coming year, and investigation or not, it&rsquo;s not going to go away regardless of
what the investigation finds.</li>
<li><strong>Security becomes a HUGE deal for the industry.</strong> (0.8) The election is just the tip of the iceberg;
consumers may have gotten used to (and complacent about) corporate security disclosures, but the
idea that the election could be hacked is sending shivers down the collective spines of anyone
who does anything online. The downside is that it&rsquo;s such a complex topic, it&rsquo;s hard for anyone
who&rsquo;s not a computer security expert to really understand what to do; even among experts, there&rsquo;s
a fair amount of disagreement, even on simple issues like scope (how widespread is it) or actual
facts vs hype. Pair that with the paranoia that is inherent in any security professional (if you
think computer security types are paranoid, try talking to physical security professionals for a
while), and you have an industry that&rsquo;s ripe for a lot of snake oil and hyperbole. My prediction,
then, is that <strong>the industry starts to see the first set of &ldquo;security snake oil&rdquo; products</strong>
somewhere within the calendar year 2017. And by that, I mean products that claim to provide
security for your interactions online but in fact do nothing of the sort. (Late-night
infomercials about downloadable web pages that can &ldquo;clean your system&rdquo; of viruses and malware,
move over&mdash;it&rsquo;s time for late-night infomercials about downloadable web pages that can &ldquo;secure
you against even the most determined attacker&rdquo;!)</li>
<li><strong>Apple continues to plummet.</strong> (0.7) Their products this year were merely slightly-enhanced copies
of the previous line of products (iPhone 7 vs iPhone 6) or carried gimmicky &ldquo;enhancements&rdquo;
while the core of the product remained essentially unchanged from prior generations (MacBook Pro).
Sorry, folks, the TouchBar does not qualify as &ldquo;disruption&rdquo; or &ldquo;innovation&rdquo;; it&rsquo;s a strip of
touch-sensitive glass from an iPad designed to start prepping you for the idea that Apple can
remove the keyboard entirely, replace it with a touchpad, and then put a hinge in between two
iPad Pros and call it a &ldquo;MacBookPad Pro&rdquo; and charge you $10k for it. (And by the way, if you&rsquo;re
thinking about one of the new MacBook Pro machines, make sure you go into an Apple Store and
try it out&ndash;the keyboard is definitely not the same as it&rsquo;s been for years. It feels like they
took about half of the keys&rsquo; &ldquo;press depth&rdquo; away, and it totally changes the &ldquo;touch&rdquo; on the
keyboard. I imagine somebody could get used to it in time, but&hellip; ugh.)</li>
<li><strong>Apple doesn&rsquo;t introduce any new products this year.</strong> (0.6) And by new, I mean something that&rsquo;s not
an incremental improvement on what they&rsquo;ve already got. Heck, I&rsquo;ll even go so far as to say that this
means that there&rsquo;s no new form factors to the existing product line. (Meaning, no new-sized iPad
or iPhone or laptop.)</li>
<li><strong>PC manufacturers double their efforts to build a MacBook Pro.</strong> (0.8) The MBP is vulnerable, for
the first time in a half-decade, and PC manufacturers are going to look for ways to capitalize on
that. Somebody is going to put out a similarly-sized, similarly-weighted non-touch-screen Windows 10
laptop with 32GB of RAM and a 1 or 2 TB SSD, the usual collection of ports, and price it around the
same as MacBook Pro ($2k to $4k), and developers will start buying them. (Bonus points to that
manufacturer if they offer Linux as an out-of-the-box option.) I know I will&hellip;.</li>
<li><strong>Apple rumors about Tim Cook&rsquo;s departure begin.</strong> (0.6) Cook has proven that he&rsquo;s no Steve Jobs;
in fact, the comparisons between his reign and Steve Ballmer&rsquo;s at Microsoft are proving eerily
similar. Both basically took companies that were defining the marketplace and shepherded
them into a position of trying to manage the cost structures and find better price-points, and in
doing so, killed off much of the mojo that drove both firms. Ballmer took close to a decade to be
run out of Microsoft (and even then, it took BillG&rsquo;s intervention behind the scenes, from what I
can tell), but I don&rsquo;t think the Apple Board is going to wait that long&mdash;I think by the end of
2017 we&rsquo;re going to start hearing serious rumors about Cook being offered a golden parachute to give
up the center chair and let somebody else in to run the show.</li>
<li><strong>Oracle will continue to just write Java.</strong> (0.7) Oracle, despite the best efforts of media and
journalists everywhere, just refuses to get drawn into &ldquo;techno-drama&rdquo;. Java hasn&rsquo;t been the Trojan
Horse into corporate pocketbooks that all the Java-doomsdayers were predicting back when Oracle
acquired Sun, and releases of Java just keep coming through both commercial and OSS channels.
There&rsquo;s really no reason at this point to doubt that Oracle is going to do anything but continue
down that path. Make no mistake, I&rsquo;m sure they&rsquo;re looking for ways to monetize Java in some way
so that they can try to earn back the cash they spent to buy Sun, but I don&rsquo;t think it&rsquo;s going to
be through selling or charging for the JDK or JRE anytime soon.</li>
<li><strong>Oracle Cloud emerges onto the cloud scene in a big splash.</strong> (0.6) IBM now has Bluemix and
Watson, and they were really the last of the &ldquo;big-iron&rdquo; holdouts around the cloud. (What I mean by
that is that all companies have been quietly flirting with cloud, but some push it loud and clear,
a la Microsoft or Google, and some were playing it very quietly for a while.) With IBM&rsquo;s acquisition
of StrongLoop (the company behind the LoopBack Node.js server-side framework), it&rsquo;s clear that IBM is going to push JavaScript
as their main cloud development play, essentially ceding the Java-cloud development ground to
somebody else. Amazon has historically been the place that Java developers have gone to run their
Java code in the cloud, but if Oracle can build a compelling offering (particularly with a free
tier that AWS currently lacks), this could be a relatively big splash. Given Oracle&rsquo;s reputation
in the database world, if they have a solid &ldquo;stack&rdquo; offering that basically makes a Java-based
back-end a snap to start up, Oracle could essentially claim the Java-favored cloud play from
Amazon. (Yes, Heroku is out there and holds a fair amount of Java and Scala love, but now that
they&rsquo;re owned by Salesforce I suspect the Java-leaning flavor of Heroku to wane a bit.)</li>
<li><strong>Salesforce makes a major database acquisition.</strong> (0.5) Salesforce is growing, and they&rsquo;re
clearly interested in expanding their cloud to be more than just the CRM. With Heroku, they have
a Platform that developers can feel comfortable on, but they don&rsquo;t have a big-name database
(relational or otherwise) that complements that play. They currently are sitting on a ton of
cash, and <a href="http://talkincloud.com/saas-software-service/10-salesforce-acquisitions-2016">last year&rsquo;s crop of acquisitions</a>
didn&rsquo;t include a big database storage name. There&rsquo;s not a ton of players left out there, but I
could see them making a strong push to get something like Cassandra or Couchbase. (Yes, they
have Data.com, but that doesn&rsquo;t seem to be making much headway in the developer mindset space.)</li>
<li><strong>Salesforce releases a new programming language.</strong> (0.4) Let&rsquo;s call the spade a spade: Apex is
a Java knock-off, and it shows a lot of warts, particularly since it hasn&rsquo;t really kept up with
what few improvements Java-the-language has made in recent years. The last company to be in this
position&mdash;a red-hot platform but a language feeling a little creaky at the corners and just plain
&ldquo;old&rdquo; everywhere else&mdash;was Apple right before they released Swift. Salesforce has the engineering
power, they are looking to command more of the developer mindshare, and they have a ton of cash
to blow, so&hellip;. Whether this happens this year, next year, or 2019, I&rsquo;m not sure, but if it
doesn&rsquo;t happen this year, the odds go up each year after that.</li>
<li><strong>LinkedIn Learning starts to make a serious dent in online developer training.</strong> (0.5) Between
the fact that LinkedIn Learning (formerly Lynda.com) is growing out its library to a pretty
respectable degree, and the fact that Microsoft now owns LinkedIn, it&rsquo;s pretty reasonable to
assume that Microsoft is going to start making this available to its developer community in
various ways. This may happen in 2018, though, depending on how swiftly Microsoft moves to
incorporate LinkedIn assets across the rest of the firm; if they bought LinkedIn solely for the
CRM data to go with Dynamics, for example, then this probably won&rsquo;t happen for a few years.</li>
<li><strong>Swift doesn&rsquo;t go to 4.</strong> (0.7) Swift 3 held breaking changes from Swift 2, and the folks at
Apple are not stupid. Swift 4 will be far, far down the horizon for a few years yet, given that
each major version number bump has heralded incompatibilities. Apple will not want to call anything
&ldquo;Swift 4&rdquo; and dredge up memories of incompatibilities in their customers&rsquo; minds for a while.
Swift might get a 3.1 in the summer, but that&rsquo;s as far as it&rsquo;ll go.</li>
<li><strong>Microsoft ships C# 7.</strong> (0.8) Roslyn needs to ship in 2017 if Microsoft is going to be able to
call this open-source process a success. Otherwise it&rsquo;ll start a lot of people grumbling. (Yes,
a new version of Visual Basic will come with it, and it will make basically no news.)</li>
<li><strong>No new Android version.</strong> (0.4) Android N is still slowly making its way through the networks,
and while we&rsquo;ll probably start hearing rumors of what Android 8 (Oreo?) will include, it&rsquo;ll carry a
targeted ship date of 2018, probably 1Q or 2Q.</li>
<li><strong>Twitter will continue its slide into irrelevancy.</strong> (0.5) Let&rsquo;s face it, Twitter&rsquo;s days are
numbered. If you&rsquo;re holding Twitter stock, now&rsquo;s a good time to sell&mdash;when Twitter was left out
of Trump&rsquo;s &ldquo;tech summit&rdquo; last month, the stated reason was that it was &ldquo;too small&rdquo;. Put that into
your brain-pan and circulate for a while&mdash;the service that invented microblogging and is one of
the core founders of &ldquo;social media&rdquo; was &ldquo;too small&rdquo; for the PEOTUS&rsquo; time. Twitter hasn&rsquo;t really
done anything &ldquo;new&rdquo; or &ldquo;interesting&rdquo;, but simply continued to be the 140-character microblogging
platform it&rsquo;s always been. It&rsquo;s reaching commodity status, in fact. That&rsquo;s not a good sign for
a company that wants to be more than it is. I suspect Jack Dorsey gets tossed on his can, the
company starts looking for a new CEO, and the &ldquo;new vision&rdquo; will start to take shape by the end
of the year (2017), and then in 2018 we find out that the &ldquo;new vision&rdquo; is terrible, takes them
out of their &ldquo;core business&rdquo;, and the slide accelerates. But nobody buys them this year, not yet.</li>
<li><strong>The &ldquo;Internet of Things&rdquo; continues to draw hype, and continues to fail to deliver.</strong> (0.6)
It&rsquo;s been how many years we&rsquo;ve heard about IoT now, and how it will revolutionize our lives,
and all we&rsquo;ve really seen thus far is the wide variety of Internet-enabled devices being subverted
for a widespread DDoS attack. Wearables, &ldquo;smart refrigerators&rdquo; and other IP-enabled devices are
proliferating, but&mdash;to perhaps everybody&rsquo;s surprise but mine&mdash;nobody&rsquo;s quite sure what to DO
with these things once you have them. Your thermostat is online; terrific. Does it have an API
that will let me query meter usage? No, that&rsquo;s a different thing, and a different API, and a
different connection endpoint, and&hellip;. Oh, and be careful, somebody could remote-hack your
thermostat and <a href="http://motherboard.vice.com/read/internet-of-things-ransomware-smart-thermostat">hold your house hostage</a>.
Because that&rsquo;s worth the risk.</li>
<li><strong>Tech &ldquo;unicorns&rdquo; will start to watch the bubble pop.</strong> (0.3) Uber, Lyft, all these companies that are
valued at double-digit billions with zero profits, major losses, and no real assets to sell in
the event of a bankruptcy&hellip;. All of this is going to start to make some investors nervous,
particularly when they look around and realize that the tech sector has been carrying the
country&rsquo;s economy through its &ldquo;recovery&rdquo; (yes, we&rsquo;ve been in a recovery for the last half-decade!).
All it takes is a few small stones to start the avalanche.</li>
<li><strong>Voice-controlled fart apps will emerge.</strong> (0.6) Seriously. As Alexa and Siri and these other
voice-activated systems start to move into stationary devices in your home, and as the SDKs for
these systems start to become more widespread, the first thing developers will do is build some
kind of ridiculously silly app (it would be a kindness to call it a game) that will somehow
sweep everybody&rsquo;s sense of humor into the toilet. (Seriously. Imagine it. &ldquo;Alexa, did you have
beans for dinner?&rdquo; &ldquo;Yes, I did, and&ndash; BRAAAAAAAAAAP!&rdquo; It&rsquo;s exactly the kind of thing that would
get people giggling for hours on end, particularly in a weed-induced state. Did I mention I live
in Seattle?)</li>
<li><strong>Facebook will find that preventing &lsquo;fake-news sites&rsquo; is a lot easier said than done.</strong> (0.8) As a
result, they&rsquo;ll put some kind of &ldquo;AI&rdquo; filter on linked sites, declare a victory, and try to get
out of the political game entirely. It&rsquo;s a lose-lose scenario for them: one man&rsquo;s &ldquo;fake news&rdquo;
site is another man&rsquo;s &ldquo;revolutionary take&rdquo; backed by the First Amendment, and Facebook does not
want to be anywhere near a court trying to justify their actions against Free Speech. (Old-timers
like me will remember Prodigy, <a href="http://www.techrepublic.com/blog/classics-rock/prodigy-the-pre-internet-online-service-that-didnt-live-up-to-its-name/">an online service</a>
that started censoring content, which started its slide into doom.) Zuckerberg doesn&rsquo;t want to be
held responsible for swaying important political events one way or another, but neither does he
want to be the target of numerous political activist lawsuits (from all directions). As Joshua
(the AI in the WOPR, from the &rsquo;80s movie <em>WarGames</em> that every geek my age openly worshipped) learned,
Zuck will discover that sometimes &ldquo;the only winning move is not to play&rdquo;.</li>
<li><strong>A driverless car will kill somebody.</strong> (0.5) It&rsquo;s only a matter of time. The circumstances
may not be the software&rsquo;s fault&mdash;and in fact it&rsquo;s likely that it won&rsquo;t be, when the final analysis
comes back&mdash;but the headlines will scream, and the widespread fear of a human &ldquo;not being in the loop&rdquo;
will set driverless cars back by years. Expert testimony and repeated demonstrations will do
nothing to shake the public&rsquo;s fear that a computer-driven car could &ldquo;hit a bug and kill me&rdquo;.</li>
<li><strong>The topic of ethics and programming will begin to become fashionable.</strong> (0.3) Somewhere alongside
the driverless car&rsquo;s first fatality, people will start asking how the car&rsquo;s programming makes
decisions that most humans make in a split-second without even thinking about it. Case in point: the
car detects that a motorcycle rider has had a problem and the rider has laid the bike down in the
road right in front of the car. (For discussion purposes, there is no room left to brake; the rider
is too close.) The car can either swerve to the side to avoid the now-helpless rider, potentially
causing a major accident involving multiple people; or the car can simply continue forward, running
over (and very likely killing) the motorcycle rider but avoiding the possibility of multiple fatalities
from a larger accident. Most humans would swerve&mdash;but is that the &ldquo;right&rdquo; decision? More to the
point, what should the software be programmed to do? Once the public gets wind of these kinds of
decisions being made by geeks behind flat-screen LCDs, it&rsquo;s going to cause a major outcry. (And yes,
these kinds of decisions are going to be encoded in the software, somewhere.)</li>
<li><strong>&ldquo;The cloud&rdquo; continues to grow, even as consumers wonder what the hell it is.</strong> (0.7) Let&rsquo;s be
clear&mdash;as of right now, the cloud is basically a developer thing. My parents really don&rsquo;t &ldquo;get&rdquo;
the cloud, largely because there&rsquo;s really nothing they get from it. Sure, one can argue that GMail
is the world&rsquo;s most popular cloud email service&hellip;. but your email is just stored on a server that
Google owns, as opposed to a server that your ISP owns. (If that&rsquo;s your definition of &ldquo;cloud&rdquo;, then
pretty much all client-server computing is &ldquo;cloud&rdquo; in your world.) People are looking at
more online services for things like bill payment, true, but those are basically services being
offered by vendors with whom these people are already doing business&ndash;again, that&rsquo;s not &ldquo;cloud&rdquo;.
Cloud offerings have basically found a home in the developer world, but general-purpose cloud,
the way that cloud was first being sold, is losing its window of opportunity to get hold of
general consumers&rsquo; minds. (I lose this prediction if my parents are suddenly smitten with a product
that stores or computes for them and isn&rsquo;t a vendor they already have a relationship with.)</li>
<li><strong>&ldquo;Blockchain&rdquo; remains the most opaque &lsquo;thing&rsquo; of the year.</strong> (0.8) Everybody will go on and on about its
huge technical advantages and obvious benefits, while never actually describing what it is or how it
could work to change the world it&rsquo;s so clearly destined to change. It&rsquo;s the ultimate hype machine,
and it will show no signs of slowing down until maybe the end of the year. By that time, something
will emerge out of it (the way blockchain emerged out of bitcoins and cryptocurrency) that will
carry forward the legacy of &ldquo;changing the world&rdquo; without actually changing anything.</li>
<li><strong>Artificial intelligence will continue to remain a &lsquo;future&rsquo; thing.</strong> (0.8) Part of the reason I say
this is because AI is like magic&mdash;if you can understand it, it&rsquo;s not interesting anymore and it&rsquo;s just
an implementation detail. We&rsquo;ve had rules engines and natural language processing for years. When
Amazon started doing &ldquo;predictive analysis&rdquo; of what you would like to buy, we pulled &ldquo;data science&rdquo;
and &ldquo;behavioral analytics&rdquo; out of the &ldquo;AI&rdquo; world and into its own category. When AI figured out how
to make the spoken word make sense, we called it &ldquo;speech-to-text&rdquo; and it was a feature on Android
already back in the v2 days. (Marry speech-to-text up with a natural language parser, and you have
Siri&mdash;which, remember, was its own company before Apple acquired them.) No, Alexa is not going to
revolutionize the world any more than Siri did&mdash;the act of talking to a machine is not particularly
new, and it&rsquo;s only as good as the services that sit behind the parser and can &ldquo;hook in&rdquo; to the
parsed text. &ldquo;Cortana, fire up StarCraft 2&rdquo; is easy to parse and start an application; &ldquo;Cortana,
fire up StarCraft 2, and find me a random Hard co-op match as Artanis&rdquo; requires not just firing
up an application, but also &ldquo;hooking&rdquo; inside the application to know how to carry out the rest of
the request. That requires an API platform that all applications can hook into, provide, and describe
(in natural-text terms) to the voice-control system. That is not going to be easy to define, adopt,
or test.</li>
</ul>
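<p>To make the driverless-car point above concrete, here&rsquo;s a deliberately oversimplified sketch
(every function name and probability here is hypothetical, not anything shipping in a real vehicle)
of what &ldquo;encoding the decision in the software&rdquo; might look like&ndash;a bare expected-fatalities
comparison between swerving and continuing:</p>

```python
def choose_maneuver(p_rider_killed_if_continue: float,
                    p_pileup_if_swerve: float,
                    expected_fatalities_in_pileup: float) -> str:
    """Pick whichever maneuver minimizes the expected number of fatalities."""
    cost_continue = p_rider_killed_if_continue * 1.0   # one rider at risk
    cost_swerve = p_pileup_if_swerve * expected_fatalities_in_pileup
    return "swerve" if cost_swerve < cost_continue else "continue"

# A near-certain rider fatality on "continue" vs a 20% chance of a
# three-fatality pileup on "swerve": the arithmetic says swerve.
print(choose_maneuver(0.95, 0.2, 3.0))
```

<p>The unsettling part is not the arithmetic; it&rsquo;s that somebody has to pick those weights, and
that choice is exactly the kind of thing the public outcry will be about.</p>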
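<p>And the &ldquo;hook-in&rdquo; platform the AI prediction calls for can be sketched too. This is a toy
stand-in (all class and method names are made up for illustration; none of this comes from the
Cortana, Alexa, or Siri SDKs): applications describe the requests they can handle, and the
voice system routes parsed text to the right one:</p>

```python
from typing import Callable, Dict

class IntentRegistry:
    """A toy stand-in for a platform applications can hook into and describe."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[str], str]] = {}

    def register(self, phrase_prefix: str, handler: Callable[[str], str]) -> None:
        # An application declares: "requests starting with this phrase are mine."
        self._handlers[phrase_prefix.lower()] = handler

    def dispatch(self, utterance: str) -> str:
        text = utterance.lower()
        # Longest matching prefix wins, so a specific hook beats a generic one.
        for prefix in sorted(self._handlers, key=len, reverse=True):
            if text.startswith(prefix):
                remainder = text[len(prefix):].strip(" ,")
                return self._handlers[prefix](remainder)
        return "Sorry, no application handles that."

registry = IntentRegistry()
registry.register("fire up starcraft 2",
                  lambda rest: f"Launching StarCraft 2 {rest}".strip())

print(registry.dispatch("Fire up StarCraft 2"))
```

<p>Even this toy shows the hard part: &ldquo;fire up StarCraft 2&rdquo; is a trivial prefix match, but the
<code>rest</code> of the request (&ldquo;find me a random Hard co-op match as Artanis&rdquo;) still has to be
understood <em>by the application</em>, which is the definition-and-adoption problem described above.</p>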
<p>On a personal note, several predictions come to mind:</p>
<ul>
<li><strong>Ted will celebrate his one-year anniversary at Smartsheet in September.</strong> I&rsquo;m optimistic about these
guys, and the things we can do together. I&rsquo;m looking forward to taking them into the developer limelight
in a variety of different ways.</li>
<li><strong>Ted will do less speaking this year.</strong> My new role actually encourages me to help develop new talent
for my employer to go out and do the actual speaking, so while I&rsquo;m definitely down for doing a few
conferences this year, it&rsquo;s not going to be more than 12, total, for the calendar year. I enjoy speaking,
but I&rsquo;m looking to be a lot more careful about where I speak now.</li>
<li><strong>Ted will not be renewed as a Microsoft MVP.</strong> Actually, this appears to be fact, not a prediction.
MVP renewals for the January cycle went out already, and I didn&rsquo;t receive one. Fortunately, most of
the stuff I care about in the Microsoft world is all open-source (or moving that way) anyway, and
while it&rsquo;s been nice being on the MVP mailing lists, there&rsquo;s really been nothing there that&rsquo;s been
all that insightful or amazing. (And, fortunately, living in Redmond makes it trivially easy to get
together with anybody on a product team if I really want or need to, and I am privileged to call many
of the people on those teams &ldquo;friend&rdquo;.) It would&rsquo;ve been 14 years, but as we Stoics say, &ldquo;All good things,
in time, must come to an end.&rdquo;</li>
<li><strong>Ted will look to engage with other tech companies beyond Microsoft.</strong> Google just started a new
MVP-like program, and I&rsquo;ve been teaching Android and Angular and some Google Cloud Platform stuff for
a while, so perhaps they&rsquo;ll welcome me into their fold.</li>
<li><strong>Ted will continue to teach at UW.</strong> I&rsquo;ve been guest-lecturing at UW for the past three years now,
and I&rsquo;m loving it. The students are bright, eager, and a helluvalot smarter than I was at that age.
They&rsquo;re an incredible joy to teach.</li>
<li><strong>Ted will look to publish a few mobile apps.</strong> I&rsquo;ve had a few ideas floating around for a while, but
just never really made the time to do it. Even if they never turn a dime in profit, I&rsquo;m long overdue
for having a few apps in the respective mobile stores.</li>
<li><strong>Ted will continue to write for various tech &lsquo;zines.</strong> I love having the back-page editorial at
CODE Magazine, the column in MSDN, and the various series on developerWorks, among others. I fully
intend to keep all that going at full speed. (And I&rsquo;m always looking for new outlets, if anybody has
any leads on paid technical content gigs!)</li>
<li><strong>And finally, Ted will try to blog more.</strong> The perennial projection. I&rsquo;ve got much to blog about,
including the patterns series, as well as some interesting themes and ideas floating around the ol&rsquo;
brain pan.</li>
</ul>
<p>Happy Holidays, and thanks for reading!</p>
<h1>Farewell, IE</h1>
<p>http://blogs.tedneward.com/post/farewell-ie/<br/>
Fri, 08 Jan 2016 17:55:45 -0800</p>
<p>For those of you who missed the announcement, Microsoft has officially
<a href="http://thenextweb.com/microsoft/2016/01/05/web-developers-rejoice-internet-explorer-8-9-and-10-die-on-tuesday/">end-of-lifed Internet Explorer</a>. Microsoft <a href="https://www.microsoft.com/en-us/WindowsForBusiness/End-of-IE-support">explained</a> why
the move was necessary, but let&rsquo;s be honest, we all knew this was coming, and why:
Because IE had long since fallen behind its competitors in terms of its implementation.
First Chrome came out, then Firefox got better, and when even Safari (which is not the world&rsquo;s most
standards-friendly browser, let&rsquo;s be honest) surpassed IE in terms of speed, it was pretty clear
that Microsoft was going to have to take some serious action to bring their browser back up
to speed.</p>
<p>So they did what any good software engineering group wants to do:
<a href="https://www.microsoft.com/en-us/windows/microsoft-edge">Blow it all up and start over</a>.</p>
<p>Personally, I will miss IE as a browser. So, if you will indulge me&hellip;</p>
<h1 id="eulogy-to-a-browser:d56630e92d1558eb11b63de05860871d">Eulogy to a browser</h1>
<p>Farewell, IE. Life has taken you from us, and it is meet that we should remember you in your
glory days and your prime.</p>
<p>You were a halfway-decent browser when I first got to
know you in 1997, and while I didn&rsquo;t like everything you did, you had some pretty decent features
nestled in among the various HTML 3.2 and 4.0 elements I came to love and loathe.</p>
<p>You had ActiveX
support, which was cool until it wasn&rsquo;t cool anymore, you had DynamicHTML, which was cool until
it was called AJAX, and even then you had one of the better XHR libraries/features, and you were
one of the first to put XSL into the browser directly, which was cool until people accused you
of &ldquo;breaking&rdquo; or &ldquo;trying to embrace-extend-extinguish a standard&rdquo; because the version of XSL
you shipped wasn&rsquo;t the final standard (which wasn&rsquo;t yet done when you shipped).</p>
<p>You were at the
centerpiece of Microsoft&rsquo;s &ldquo;Browsing your Desktop&rdquo; strategy, before it was determined that
having a Web component as a part of the operating system was a critical piece of monopolistic
hegemony, and you had this crazy relationship with JavaScript&ndash;ahem, sorry, JScript&ndash;that
left programmers never quite able to figure out if JScript was a &ldquo;server&rdquo; thing or a &ldquo;client&rdquo;
thing or&hellip;.</p>
<p>You served as the proving ground for Java applets, then when the Sun/Microsoft case hit its
zenith, you were the first browser to lead the way in disabling support for applets. At first,
all the anti-Microsoft folks hated you for having it (&ldquo;embrace! extend! extinguish!&rdquo;), then
they hated you for NOT having it (&ldquo;Not standards-compliant!&rdquo;), before they just finally came
to realize that they just hated you, period. Because, you know, &ldquo;MICRO$OFT!&rdquo;</p>
<p>Best of all, you spawned an entire generation of Web &ldquo;bling&rdquo;, in the form of the &ldquo;This site
best viewed in Internet Explorer&rdquo; buttons from which we have yet to completely recover.
(The button now reads &ldquo;Chrome&rdquo; or &ldquo;Safari&rdquo;, depending on which mobile device you&rsquo;re using
to read this page, by the way.)</p>
<p>Yes, IE, I will miss you.</p>
<p>But not very much.</p>
<h1>Blog Details</h1>
<p>http://blogs.tedneward.com/post/blog-details/<br/>
Tue, 05 Jan 2016 00:55:07 -0800</p>
<p>Here&rsquo;s a little ditty about Ted&rsquo;s blog, new and old. In case you were wondering.</p>
<h2 id="2005-dasblog:71dfa3f4592489d1d3836278310038ee">2005: dasBlog</h2>
<p>Back in 2005, having wanted to move away from my own home-grown Java Servlet/JSP-based
blogging system (and that was a fun little implementation, let me tell you about it sometime),
I decided to go with something out-of-the-box. Basic decision-making criteria anybody has around
whether to &ldquo;build vs buy&rdquo; were in play here: what I wanted wasn&rsquo;t really all that different
from what anybody else needed, and I couldn&rsquo;t justify the time required to build and maintain
my own bespoke system, particularly since I just wanted to blog more, not write blogging
software more.</p>
<p>So after asking around for a bit, I ended up installing dasBlog, a popular .NET-based blogging
system. It was nice, and since I was having a new website commissioned for me (again, buy vs
build here), I asked the designer to &ldquo;skin&rdquo; the blog in the same style as the website. That&rsquo;s
how <a href="http://www.tedneward.com">TedNeward.com</a> came into being, and the corresponding dasBlog
blog system&ndash;which many of you afterwards told me you sort of hated, from a colors and fonts
and other aesthetic criteria perspective&ndash;came into existence with it.</p>
<p>Fast forward ten years.</p>
<p>Frankly, the system had grown creaky and kind of painful. Yeah, OK, so it was nifty and cool
back in 2005&ndash;heck, it had support for a feature I <em>really</em> thought I wanted: the ability to
email a blog post in to the engine and have it posted! How cool! I could write blog posts
offline (while on an airplane, perhaps!), and then poof, upon landing and reconnecting to the
Internet, the email would whisk off, and without even my paying attention, blog post!</p>
<p>That offline-editing capability that I thought I really wanted&hellip; yeah, I never really used it.
I set it up, sure, but then promptly forgot the particular &ldquo;rules&rdquo; the email had to follow in
order to work correctly (Subject has to be prefixed with blah, To must be written like blah,
and so on). So that never really panned out.</p>
<p>More importantly, though, it was always in the back of my mind that dasBlog&rsquo;s architecture was
a teensy bit suspect.</p>
<h2 id="effective-enterprise-architecture:71dfa3f4592489d1d3836278310038ee">Effective Enterprise Architecture</h2>
<p>See, here&rsquo;s the funny thing about a blog: it&rsquo;s a read-mostly system. For every single web hit
that contains some kind of data that changes the system (a la a new blog post), there&rsquo;s hundreds
if not thousands of hits that are entirely read-only (a la you, dear reader). That means that
the decision to re-render the page to be sent back is really a ton of wasted work. Ideally,
a system in this situation would cache off the work spent doing the rendering, and only re-render
on demand.</p>
<p>This cache can also be known as &ldquo;saved HTML files&rdquo; to the lay person. In many respects, that&rsquo;s
the best kind of cache, because the system can simply pipe the HTML directly down the pipe, with
no additional processing required. Should somebody edit the blog post, the system can re-render
the HTML, write it to disk, and we&rsquo;re back to piping what is essentially static HTML right down
the pipe again on a new (read) request.</p>
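The write-rarely, read-often flow described above fits in a few lines. This is a hedged sketch, not dasBlog or any real engine: the function names and cache location are hypothetical, and a format string stands in for a real template engine.

```python
import os

CACHE_DIR = "cache"  # hypothetical location for pre-rendered pages

def render(post):
    # Stand-in for a real template engine.
    return "<html><body><h1>%s</h1>%s</body></html>" % (post["title"], post["body"])

def save_post(post):
    # The rare "write" path: re-render once, when the post is created or edited.
    os.makedirs(CACHE_DIR, exist_ok=True)
    with open(os.path.join(CACHE_DIR, post["slug"] + ".html"), "w") as f:
        f.write(render(post))

def serve_post(slug):
    # The hot "read" path: pipe the saved HTML straight back, no rendering.
    with open(os.path.join(CACHE_DIR, slug + ".html")) as f:
        return f.read()
```

Every read is a file read; the rendering cost is paid exactly once per edit instead of once per hit.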
<p>This was actually (sort of) what my first blog system did: each time I inserted a new blog post,
it wrote a new JSP file to the disk. If I edited the post, it loaded the JSP into a giant text
box for editing, and then when I submitted it, it wrote the contents of the text box back into
the JSP file, and lo and behold, we were back to (almost) straight rendering.</p>
<p>(The first blog actually was an attempt to see if that &ldquo;rewrite JSP files&rdquo; could and would
actually work in a production environment, particularly if I used servlet filters&ndash;instead of
servlets themselves&ndash;to do the various &ldquo;controller&rdquo; kinds of things. That way, the URL to a
given blog post was a straight JSP page URL, which seemed the hip and cool thing at the time.)</p>
<p>And the thing is, dasBlog didn&rsquo;t do any of this &ldquo;caching&rdquo; or &ldquo;pre-rendering&rdquo;; every time you
hit the engine, it was re-rendering the HTML from scratch, which always struck me as a waste
of CPU cycles and memory. But it worked, and I was happy, for about a decade.</p>
<p>Until it didn&rsquo;t.</p>
<h2 id="2015-hugo:71dfa3f4592489d1d3836278310038ee">2015: Hugo</h2>
<p>You&rsquo;ve all noticed it&ndash;the system as of late had gotten kinda slow. Wasn&rsquo;t sure why (was it the
blog engine, was it the amount of stuff it was hosting, was it the hosting provider, &hellip;?), and
honestly, didn&rsquo;t really care why. It was time to move on.</p>
<p>But&hellip; I have all these cool incoming links to my blog, particularly around the Vietnam of
Computer Science post, and Lord knows I don&rsquo;t want to lose those. (For crying out loud, how
many times do you get linked from Wikipedia?!?) So I needed something that would preserve the
old links (or at least redirect correctly), but I really wanted to get back to just straight
static HTML if I could. And it would be nice if somehow I could get &ldquo;pretty&rdquo; URLs, instead of
these (now-considered) ugly URLs that contain file extension information.</p>
<p>So I put the call out onto Twitter, and lo and behold, the answer came back: Jekyll (the gold
standard for most folks), and <a href="http://gohugo.io">Hugo</a>, which my buddy Matt Stine suggested,
and which happens to be written in Go, not that I really cared too much about that. (One wise
guy suggested Perl; I promised him coal in his stocking in return. That was the extent of my
concern over implementation.)</p>
<p>Hugo has worked out pretty well, so far&ndash;being a static site system, I write content as Markdown
files, and then ask Hugo (a command-line tool) to re-gen the entire site. (I have mixed feelings
about that&ndash;on the one hand, I understand why, but on the other, it feels wasteful most of the
time.) Then, since I&rsquo;m now trying to do most of my static site hosting at <a href="http://www.site44.com">site44</a>,
I just copy the files over to the appropriate folder inside my Dropbox (to which site44 is attached),
and lo and behold, I have deployed.</p>
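That regen-and-copy loop is easy to script. A sketch under stated assumptions: the paths are hypothetical, and the generator command is a parameter purely for illustration (in practice it would just be <code>hugo</code> on the PATH, with Dropbox syncing the copied output up to site44).

```python
import pathlib
import shutil
import subprocess

def regen_and_deploy(src, dropbox_dir, generator=("hugo",)):
    # Hugo's model: re-generate the entire site from the Markdown sources...
    subprocess.run(list(generator), cwd=src, check=True)
    # ...then copy the generated output into the Dropbox folder that site44
    # serves; Dropbox syncing the files up IS the deployment.
    out = pathlib.Path(src) / "public"
    shutil.copytree(out, dropbox_dir, dirs_exist_ok=True)
```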
<p>I had to write a transformation script to take the old dasBlog XML files and convert
them over to Hugo&rsquo;s Markdown format, but that was pretty easy; getting the old aliases and
categories into the front matter was the trickier part, and I gave up on a perfect 100%
port&ndash;as long as the script got 90% of the aliases right, I was happy, and hand-corrected the
remainder to match what the old dasBlog engine had generated.</p>
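A sketch of what such a conversion might look like. To be clear, dasBlog&rsquo;s real XML schema isn&rsquo;t reproduced here, so every element name below is a stand-in; the output side uses Hugo&rsquo;s TOML front matter, including the <code>aliases</code> key that preserves the old incoming links.

```python
import xml.etree.ElementTree as ET

def entry_to_hugo(entry_xml):
    # Element names (title, created, categories, oldurl, content) are
    # stand-ins, not dasBlog's actual schema.
    e = ET.fromstring(entry_xml)
    cats = [c for c in e.findtext("categories", "").split(";") if c]
    front_matter = [
        "+++",
        'title = "%s"' % e.findtext("title", ""),
        'date = "%s"' % e.findtext("created", ""),
        "categories = [%s]" % ", ".join('"%s"' % c for c in cats),
        'aliases = ["%s"]' % e.findtext("oldurl", ""),  # keep old links alive
        "+++",
    ]
    return "\n".join(front_matter) + "\n\n" + e.findtext("content", "")
```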
<h2 id="what-didn-t-come-over:71dfa3f4592489d1d3836278310038ee">What didn&rsquo;t come over</h2>
<p>I didn&rsquo;t want obstacles to stand in my way, so I deliberately chose to leave comments behind.
(They&rsquo;re still preserved in the old dasBlog files, if I ever want to try and pull them out and
attach them, though.) A static site system can have comments, but not the same way that blogs
like dasBlog stored them; systems like Disqus or Discourse tend to fill in the blanks here,
and frankly they do a better job with comments than dasBlog ever did, so I&rsquo;m OK with losing the
(often contentious) comments of the last decade in exchange for a more powerful, consistent
and SEOable system now. And hooking Disqus up took literally five minutes: sign up for Disqus,
put my website in, generate a UID for it, plug that in to the Hugo theme I&rsquo;m using, and voila!
I have comments.</p>
<p>Can&rsquo;t really beat that.</p>
<h2 id="next-steps:71dfa3f4592489d1d3836278310038ee">Next steps</h2>
<p>There&rsquo;s still a few things I need: analytics, for example. (Hello, Google Analytics!)</p>
<p>And, at some point, I want to actually move all of this over to <a href="http://www.newardassociates.com">my professional site</a>,
in order to allow me to have two blogs, one &ldquo;professional&rdquo; and the other &ldquo;personal&rdquo;, but that
may be a pipe dream at this point; in theory, I&rsquo;d like to have a blog attached to my named
domain (tedneward.com) that isn&rsquo;t tied to my professional life, but I don&rsquo;t want to lose whatever
branding I&rsquo;ve created for myself if I can help it. <em>shrug</em> Something to worry about later.</p>
<p>In the meantime, the new system means it&rsquo;s now much easier to blog again (yay), and my next
step after this (very shortly after this) is to hook up the blog content directory (from which
Hugo generates the site) in my source-control system to a CI system, so that now I can write a blog
post, commit it, walk away, and the CI system will see the checkin, generate the site, and deploy
the resulting static site to the proper Dropbox folder.</p>
<p>Wouldn&rsquo;t that be COOL?</p>
<h1>2016 Tech Predictions</h1>
<p>http://blogs.tedneward.com/post/2016-tech-predictions/<br/>
Mon, 04 Jan 2016 20:54:34 -0800</p>
<p>As has become my tradition now for nigh-on a decade, I will first go back over last year&rsquo;s
predictions, to see how well I called it (and keep me honest), then wax prophetic on what I
think the new year has to offer us.</p>
<h2 id="in-2015:a4c6fac861e72246054ac04aa0f1f2b1">In 2015&hellip;</h2>
<p>As per previous years, I&rsquo;m giving myself either a <b>+1</b> or a <b>-1</b> based on a
purely subjective and highly-biased evaluational criteria as to whether it actually happened
(or in some cases at least started to happen before 31 Dec 2015 ended).</p>
<p>In 2015, I said:</p>
<ul>
<li><b>"Big data", "Big data", "Big data". You will get sick of this phrase.</b>
I can't speak for everybody, but I can tell that the end is near for the term, because
suddenly everybody is using the term, and they're using it to mean anything and everything.
"Big data" is not just about doing deep data science analysis on petabytes of data; it's
about any analysis (even simple reporting) on any collection of data (no matter how large
or small) for any reason. <b>+1</b>
</li>
<li><b>"Internet of Things". You will get sick of this phrase, too.</b> This hasn't
quite happened yet, but we're close. IoT is also starting to fray at the edges as a
definition, and when that happens, it's immediately ripe for abuse and marketing. More
importantly, though, lots of people are starting to realize that IoT is not the huge
"automatic win" that we all sort of assumed it would become. <b>+1</b>
</li>
<li><b>"Internet of Medicine" or "Big Med".</b> Well, nobody's started using the term yet,
but certainly they're spending a lot of time in this space. They just don't like my term
yet. I'm pouting over it, but it's still a <b>-1</b>.
</li>
<li><b>"Tech bubble" becomes a "thing".</b> Oh, this one came down to the very wire, but
as December of 2015 rolled in, concerns over the actual valuations of the so-called "unicorns"
were starting to show up, and lots of people were beginning to openly wonder if Silicon Valley
and Wall Street were experiencing a falling-out. It took a hail-mary pass to do it, but I'm
claiming my <b>+1</b>.
</li>
<li><b>C# and Java will both make big announcements.</b> C#6 shipped, but Java9 didn't, leaving
me sort of confused as to how to score this one. However, I did say, "Those who care will
take note, those who don’t, won’t. Really, we’re kind of past the point where either of
those languages are going to be interesting to anyone who’s not already in that space", and
frankly, if you weren't a C# or Java developer, you probably didn't even hear a whisper about
either one (pro/shipping or con/not-shipping), either way. <b>+1</b>
</li>
<li><b>Go is going to either take off, or crash and burn.</b> The point of this one was that
Go was reaching an inflection point, and while I think it's gathering some momentum (including
with me personally--this new blog is using Go to do the site generation), I can't really tell
if it reached an inflection point, so <b>-1</b> to me.
</li>
<li><b>Microsoft acquires Xamarin.</b> Oh, as much as I thought (and still think) that it would
be a great story for both sides, it didn't happen, and probably never will. *sigh* <b>-1</b>
</li>
<li><b>Amazon just quietly keeps churning.</b> I dunno how I would measure this one, but
in some ways, as long as Amazon just keeps churning out new feature after new feature on AWS,
and keeps making money selling stuff on their main web property--which they continue to do--then
I think pretty much anything here qualifies as a <b>+1</b>. But it was kind of a lame
prediction to begin with, now that I re-read it.
</li>
<li><b>Google continues to throw sh*t against the wall, looking for their Next Big Thing.</b> I
believe the exact phrase I used was, "Expect a lot of announcements, a lot of "beta"s, and
none of it with any kind of realistic or even well-planned business model behind it--including
the Google Car." And, sure enough, we've heard a ton about the Google Car, among other
initiatives, but nothing has stepped up as a product yet to even come close to acting as a
second line of income for the firm. I call it an easy <b>+1</b>.
</li>
<li><b>Web use on mobile devices decreases in favor of apps.</b> In particular, I said,
<blockquote>This is going to happen whether the public wants it or not, because companies have
figured out that it behooves them to have you "trapped" inside their app (where they can
control all the content) rather than on their website. More and more websites are going to try
and redirect you to inside their app, rather than allow you to casually browse on their site,
because then they think they "own" your eyeballs. The only way this changes is if/when some
firm gets crushed in the court of public opinion by doing something really stupid...
and that won't happen in 2015. Wait for it in 2016.
</blockquote>
Various "clickbait" sites were the ones I was thinking about in particular, and while some of
them (I'm looking at you, Uberfacts) have floated a mobile app out there, the apps themselves
don't seem to be lighting any fires in the mobile marketplaces. I'll talk more about this
in a bit, but for now, I'm giving myself a <b>-1</b>.
</li>
<li><b>Hipster "Uber for X" apps will be all the rage.</b> Have you been to San Francisco
recently? Talked to anybody on the street there? This one was a slam-dunk <b>+1</b>.
</li>
<li><b>Mark Zuckerberg grows up a little.</b> Zuckerberg will never admit it, but now that he's
married and starting a family and all, he's starting to grow up. His paternity leave step was
a big one, and signals that maybe he's finally ready to "adult" now. If so, he's in good
company--it took Bill Gates getting married and having kids to come in out of the rain, too,
and ever since that time, Bill's become a philanthropist of the highest order. <b>+1</b>
</li>
<li><b>Larry Ellison buys a sports team.</b> Didn't happen yet. <b>-1</b></li>
<li><b>Perl makes one final gasp at relevancy, fails, and begins to decompose.</b> Oh, this one
is funny; how could I be so right, and so wrong at the same time? Wrong, because Perl 6 actually
<a href="https://perl6advent.wordpress.com/2015/12/25/christmas-is-here/">finally shipped</a>. And yet,
so right because... well, how many people do you know using it? Or were even paying attention
when it came out? Or.... Yeah. <b>+1</b>
</li>
</ul>
<p>Nine up, five down. Not bad.</p>
<p>That was the easy part. Now, on to the&hellip;.</p>
<h2 id="2016-predictions:a4c6fac861e72246054ac04aa0f1f2b1">2016 Predictions</h2>
<p>In no particular order:</p>
<ul>
<li><strong>Microsoft will continue to roll out features on Azure, and start closing the gap between it and AWS.</strong>
This one is not hard to imagine. Microsoft is committed to making Azure a core part of their company
success and survival, and Amazon has a list of features that Azure lacks, so it really boils down to
&ldquo;take one down, cross it off the list, lather, rinse, repeat&rdquo;.</li>
<li><strong>(X)-as-a-Service providers will continue to proliferate.</strong> We&rsquo;re seeing a huge surge in these various
companies that are providing some vertical thing as a service, and for most of those they&rsquo;re tech-related
(such as Database-as-a-Service, Container-as-a-Service, and so on). Part of that is because if there&rsquo;s one
thing software developer geeks know, it&rsquo;s what they wished they had on that last project. This coming year will mark the
high-water mark of companies that provide *aaS products to the developer community, and then they&rsquo;ll all
start cannibalizing each other and some shutdowns, acquisitions and partnerships will kick in.</li>
<li><strong>Apple will put out two, maybe three new products, and they&rsquo;ll all be &ldquo;meh&rdquo; at best.</strong> Let&rsquo;s be frank,
folks, the luster is off the shiny Apple logo on the side of the building. Tim Cook is no Steve Jobs,
and Apple of 2015 was not the Apple of 2005 or 2010. The Apple Watch is interesting, but it certainly
hasn&rsquo;t taken off. No watch seems to have, in fact, become the &ldquo;it&rdquo; thing. I don&rsquo;t see many of them (or
their Android competitors, to be fair) at tech conferences, and casually glancing around the airport
doesn&rsquo;t show a ton of them in use. I don&rsquo;t think this is going to change any time soon, either. For
most people, the wearable just hasn&rsquo;t really offered up that compelling reason yet, and I don&rsquo;t think
2016 is going to see one, either.</li>
<li><strong>iOS 10 will be called iOSX.</strong> Just because they can, and because it would confuse the hell out of
people, and because Steve Jobs is not here anymore to tell that VP of Marketing to sit down, shut up,
and let the grown ups do this.</li>
<li><strong>Android N will be code-named &ldquo;Nougat&rdquo;.</strong> They might go with &ldquo;Nutella&rdquo;, but that would involve
copyright and trademark issues, which I imagine they&rsquo;d want to avoid.</li>
<li><strong>Java9 will ship.</strong> This is a &ldquo;no-duh&rdquo; prediction, but I&rsquo;m not above claiming a few of those. Really,
the bigger question there will be <em>what</em> will ship that Oracle calls &ldquo;Java9&rdquo;, and my personal feeling
is that modules/Jigsaw/whatever-we&rsquo;re-calling-them-now won&rsquo;t be in it. Slamming a module mechanism
on top of a platform that&rsquo;s a decade old, millions of programmers wide and a billion lines of code
high is not easy, and I don&rsquo;t think Oracle really has the energy, motivation or need to push them
through the morass of headaches that will stem from imposing a module system into place.
<strong>UPDATE</strong>: @olivergierke <a href="https://twitter.com/olivergierke/status/684642273561329664">points out</a>
that Java9 had already slipped to 2017, so this one is automatically going to be a &ldquo;miss&rdquo; next
January. <a href="http://www.infoworld.com/article/3011445/java/java-9-delayed-by-slow-progress-on-modularization.html">The article he cites</a>
says that Oracle &ldquo;blamed the delay on complexities in developing modularization&rdquo;, a la Project
Jigsaw. Honestly, I&rsquo;m going to stand by this prediction, because it would not surprise me in the
slightest if Oracle comes back at some point in 2016 and says, &ldquo;You know what? Fuck it.&rdquo; and ships
Java9 without modularization in place&ndash;I don&rsquo;t really think Java9 needs it at this point, and I&rsquo;m
not entirely sure that shipping <em>with</em> it will make Java all that much better. Time will tell&hellip;</li>
<li><strong>Facebook will start looking for other things to do.</strong> Yes, Facebook has been ridiculously
successful to date; it claims more population than most nations on Earth, in fact. But the company
is led by a classic Type-A personality, and the softening of his character by the birth of his
firstborn notwithstanding, this is when Zuckerberg comes back from leave and says, &ldquo;OK, boys and
girls, it&rsquo;s time to take us down a new path!&rdquo; and charges off into who-the-Hell-knows-what. I won&rsquo;t
hinge the prediction on <em>what</em> that would be, I just think it&rsquo;ll be something outside of the social
media realm (or tied to it just a little bit).</li>
<li><strong>Google will continue to quietly just sort of lie there.</strong> Google, for all that they are on the top
of everybody&rsquo;s minds since that&rsquo;s the search engine most of us use, hasn&rsquo;t really done much by way of
software product invention recently. Google+, Google Hangouts, yeah, sure, that was so 2013, but
what have you done for us lately? And honestly, what have they done recently, in 2015? Casting back
through my memory (and setting Android off to the side, since I consider that more or less an
independent effort in a lot of ways), I came up with nothing. I suspect the same will be true of
2016&ndash;they will continue to do lots of innovative things, but it&rsquo;ll all be &ldquo;big&rdquo; and &ldquo;visionary&rdquo;
stuff, like the Google Car, that won&rsquo;t have immediate impact or be something we can use in 2016
(or 2017).</li>
<li><strong>Oracle will quietly continue to work on Java.</strong> Oracle took a bit of a PR hit this year when they
fired/let go a number of &ldquo;Java evangelists&rdquo;, and that set the newsstands aflame with hints and
rumors that Oracle was getting ready to abandon Java. Frankly, if I&rsquo;m Larry Ellison (or the VP
that has Java under my umbrella), I&rsquo;m asking a very fundamental question: What the hell does Java
need with evangelists at this point? Everybody more or less knows what it is already, there&rsquo;s
nothing to sell in and of itself, and that money could probably be put to better use hiring people to
work on the codebase itself, or putting the cash back into the rest of the firm to hire a few more
Oracle Database salespeople. Oracle didn&rsquo;t acquire Java because they saw it as a way to inflict
the Oracle Database upon the world&ndash;quite the opposite. Oracle acquired Java because they <em>use</em>
Java, all over the place, and this way they had control over a technology that they had &ldquo;bet the
farm&rdquo; on in a variety of ways. They&rsquo;re not going to kill it, but there&rsquo;s really not a whole lot
of need to go around preaching its message, either. So they let the evangelists go, and they&rsquo;ll
just keep on keepin&rsquo; on.</li>
<li><strong>C# 7 will be a confused morass.</strong> Microsoft is now striding boldly into that open-source world
it timidly courted just a few years ago. But in a lot of ways, this is highly uncharted territory
for the software giant, and for the OSS world as well. Sure, Linus has been releasing Linux kernel
after Linux kernel for years, but with himself as the autocrat in charge of it all. Microsoft wants
to make use of the open source man-hours to help advance the cause of C# 7, but whether they&rsquo;ve
smoothed out what that process will look like and/or how they will deal with the inevitable
conflicts between committers and company isn&rsquo;t yet clear. (Oracle is in this same boat, in a lot
of ways, and there&rsquo;s a lot of people who think that Java is too much Oracle, not enough OSS, so
to speak.) I think the C# 7 release will be one of the first that the world gets to see take
shape in a purely public forum, and they will be a bit confused and surprised at how chaotic
a product release can really be. (Yes, C# 6 was sort of in that same boat, but only a handful
of folks were really paying attention.)</li>
<li><strong>Another version of Visual Basic will ship, and nobody will really notice.</strong> Actually, that
already happened&ndash;remember when C# 6 shipped? They shipped a new version of VB then, too.
Alas, the ship has sailed on VB, and frankly, at this point, it&rsquo;s really just a husk of its
former self. Most of the VB luminaries are all speaking and/or writing in C# these days, and
only staunch loyalty to their fond memories of the language is what keeps it at all in the
conversation anymore. Sad, but&hellip; Oh, well.</li>
<li><strong>Apple will now learn the &ldquo;joys&rdquo; of growing a language in the public as well.</strong> Swift is now
open-source, and that will bring with it the same pains as what Oracle and Microsoft are feeling.
Enjoy, guys!</li>
<li><strong>Ted will continue to layer in a few features into the blog engine.</strong> For example, right now
I have no comments feature, and I suspect people will want to start telling me how incredibly
<strong>wrong</strong> I am about so many of these. So, on the docket already, Disqus or Discourse or some
other JavaScript-based comment-engine integration. Plus, I want to tweak the template I&rsquo;m using
for the blog&rsquo;s look and feel a little (although keeping it way simple, especially compared
to what I had before), so there&rsquo;s likely to be more than a few tweaks here and there. (Again,
not really a hard prediction to make, but I always like to close on a prediction that I have a
relatively .9 probability of hitting.)</li>
</ul>
<p>Happy Holidays, and thanks for reading!</p>
<h1>Peoples be talkin'...</h1>
<p>http://blogs.tedneward.com/post/peoples-be-talkin/<br/>
Wed, 04 Jun 2014 04:13:43 -0800</p><p>
"Ted, where the hell did you go?"
</p>
<p>
I've been getting this message periodically over a variety of private channels, asking
if I've abandoned my blog and/or if I'm ever going to come back to it. No, I haven't
abandoned it, yes, I'm going to come back to it, but there are going to be a few changes
to my online profile that I'll give you a heads-up about... if anybody cares. :-)
</p>
<p>
First of all, <a href="http://blogs.tedneward.com/2013/12/10/On+Endings.aspx">as I
mentioned before</a>, LiveTheLook and I parted ways back at the end of 2013. Sad,
but every cloud has a silver lining in that I found a new home as the CTO of <a href="http://www.itrellis.com">iTrellis</a>,
a custom software development and IT continuous improvement consultancy. And therein...
lies the root of my problem.
</p>
<p>
Truth time: I'm ridiculously busy. And even more ridiculously happy.
</p>
<p>
For years now, almost a full decade in fact, people have been asking me when I was
going to start up a consulting company. (In fact, before I jumped into LiveTheLook,
I interviewed for a job at GitHub, and Phil Haack, who's known me for years, expressed
outright surprise at the idea. "I've always pictured you as the consummate consultant--what
makes you want to go work at a product company?" And truth was, he was right--the
idea of working for a product company (like GitHub) didn't really hold much appeal.
What was appealing was the idea of growing a team, managing a group of developers,
making them better as a group in a variety of manager-y ways. That was a large part
of the attraction of LiveTheLook, though I never got to the point of hiring anyone
to work with me there.) My response to people has always been the same: I believe
that a company needs a triumvirate of people at the top--one to handle sales/marketing/business
development, one to handle the technology, and one to handle the operations. I could
never seem to find a great biz-dev guy, nor a great ops guy, and so thoughts of building
a consulting firm were pretty far off in the distance.
</p>
<p>
But after LtL, a mutual acquaintance heard that I was looking, and he knew two guys
who were looking for a CTO for this new consulting company they were spinning up.
Chris (CEO) and Paul (CFO) and I met a few times. Chris and I in particular spent
a fair amount of time talking, weighing the mutual decision to jump into this thing
together, because it was obvious from the very beginning that he and I would need
to be able to work well together--if he was going to go off and do biz-dev, he had
to trust that I could carry the implementation through, and I needed to trust that
he wasn't going to sell a bill of goods that was impossible for me to deliver while
he did it, and so on and so on and so on.
</p>
<p>
Six months later, we're at four current clients (with a fifth one scheduled to spin
up in July), five billable consultants (including Chris and me, working together to
do an IT assessment project for a $10bn business unit of a $100bn company out on the
East Coast), and there's strong evidence to suggest that we'll crest the $1mn mark
in our first year of existence.
</p>
<p>
Yeah... it's been a fun ride so far. :-) And neither Chris nor I have any intention
of slowing down any time soon.
</p>
<p>
But, what I'm finding is that between billable hours, biz-dev meetings, implementation
meetings, one-on-ones with my people, speaking, and writing for the various publications
I still write for, I have almost no energy left to blog. At least, for now.
</p>
<p>
I have plans, though. Here's what I'm looking to do:
</p>
<ul>
<li>
First, we're going to stand up an iTrellis blog, and a lot of technical content I
write will be hosted in both places (there and here), where and when it makes sense.
Maybe, over time, the content will shift in quantity to over there, but I'll probably
always keep this channel open in some fashion.</li>
<li>
Second, I want to spin up a "personal blog", one in which I feel more comfortable
expressing completely non-technical ideas and topics, including politics and such.
That way, those who are interested in just the technical content can still get that,
and those who want to hear what I think about the rest of the world can tune in on
a separate channel.</li>
<li>
Third, I'll likely migrate this content into a new technical blog over at the "new"
professional website I'm slowly building out for myself, at <a href="http://www.newardassociates.com">www.newardassociates.com</a>.
That will eventually, over time, become the only technical channel I use, but I'll
set something up at this domain to redirect links to the corresponding blog entries
over there. That is going to be the real PITA in all of this, because I really want
to preserve the old links without having to stand up the same blog system over there.
(I'm "done" with the idea of a server-side processed blog--the blog entries should
be just plain ol' HTML, generated from whatever source I choose to write in, a la
Jekyll and its ilk. Plus, I never again want a blog with anything other than tech-agnostic
URLs; the whole ".../On+Endings.aspx" thing is soooooo 1997. Why should you--or I--care
what the underlying implementation is?)</li>
</ul>
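<p>The link-preservation chore in that third bullet boils down to a mapping from old tech-specific URLs to new tech-agnostic ones. A minimal sketch of the idea, in Python; the slug rules here (strip the ".aspx", turn "+" into hyphens, lowercase) are my assumptions about how dasBlog-style entry links line up with the new permalinks, not a description of any actual tool:</p>

```python
# Hypothetical sketch: derive a new tech-agnostic permalink from an
# old dasBlog-style entry link, so a redirect table can be generated
# from a list of legacy URLs. The slugging rules are assumptions.
from urllib.parse import unquote

def old_link_to_permalink(old_path: str) -> str:
    """Turn e.g. '/2013/12/10/On+Endings.aspx' into '/post/on-endings/'."""
    slug = old_path.rsplit("/", 1)[-1]        # last path segment: 'On+Endings.aspx'
    slug = slug.removesuffix(".aspx")         # drop the tech-specific extension
    slug = unquote(slug.replace("+", " "))    # dasBlog-era '+' stood in for spaces
    slug = "-".join(slug.lower().split())     # normalize to 'on-endings'
    return f"/post/{slug}/"

print(old_link_to_permalink("/2013/12/10/On+Endings.aspx"))
```

<p>Run over a dump of the old entry URLs, something like this could emit a static redirect map (Apache rewrite rules, nginx config, or meta-refresh stubs) rather than a server-side handler, which keeps the "plain ol' HTML" goal intact.</p>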
<p>
Of course, like all plans, this is subject to change based on whatever obstacles pop
up to distract me. ("Want to make God laugh? Tell Him your plans." --old Yiddish proverb)
</p>
<p>
(By the way, if you have any experience with taking a dasBlog blog and redirecting
the links over to a new site, please email me how you did it and/or what tools you
used to do it. I'd really prefer to not have to write that redirect handler myself,
if I can help it. I don't even care too much about the comments--it's the entry links
I really want to preserve. I'm even willing to discuss payment measured in bottles
of Scotch... :-) )
</p><p>
I will, at a minimum, promise to keep up the Tech Predictions, though, no matter what
else happens. That's an eight-year tradition that I have absolutely no intention of
ever giving up. Even when I'm old and crotchety and every prediction reads, "I remember
when Swift was first released... you young'uns have NO IDEA what it was like to actually
type your code into an editor. It was hard! It was painful on the fingers! And WE
LIKED IT!"
</p>
<h2><a href="http://blogs.tedneward.com/post/on-endings/">On Endings</a></h2>
<p><em>Mon, 09 Dec 2013 20:59:24 -0800</em></p>
<p>A while back, I mentioned that I had co-founded a startup (<a href="http://www.livethelook.com">LiveTheLook</a>); I'm saddened to report that just after Halloween, my co-founder and I split up, and I'm no longer affiliated with the company except as an adviser and equity shareholder. There were a lot of reasons for the split, most notably that we had some different ideas on how to execute and how to spend the limited seed money we'd managed to acquire, but overall, we just weren't communicating well.</p>
<p>While I'm sad to no longer be involved with LtL, I wish Francesca and the company nothing but success for the future, and in the meantime I'm exploring options and figuring out what my next great adventure will be. It's not the greatest time of the year (the "dead zone" between Thanksgiving and Christmas) to be doing it, but fortunately I've gotten a few leads that may turn out to be hits. We'll have to see. And, while we're sorting that out, I've got plans for things to work on in the meantime, including a partnership effort with my eldest son on a game he invented.</p>
<p>So, what I'm saying here is that if anyone's desperate for consulting, now's a great time to reach out, because I can be bought. :-)</p>
<h2><a href="http://blogs.tedneward.com/post/seattle-and-other-givecamps/">Seattle (and other) GiveCamps</a></h2>
<p><em>Thu, 29 Aug 2013 12:19:45 -0700</em></p>
<p>Too often, geeks are called upon to leverage their technical expertise (which, from most non-technical people's perspective, is an all-encompassing uni-field, meaning if you are a DBA, you can fix a printer, and if you are an IT admin, you know how to create a cool HTML game) on behalf of their friends and family, often without much in the way of gratitude. But sometimes, you just gotta get your inner charitable self on, and what's a geek to do then? Doctors have "Doctors Without Borders", and lawyers can always do work "pro bono" for groups like the Innocence Project and so on, but geeks...? Sure, you could go and join the Peace Corps, but that's hardly going to leverage your skills, and Lord knows, there's a ton of places (charities) that could use a little IT love while you're off in a damp and dismal jungle somewhere.</p>
<p>(Not you, Seattle. You're just damp today. Dismal won't be for another few months, when it's raining for weeks on end.)</p>
<p>(As if in response, the rain comes down even harder.)</p>
<p>About five or so years ago, a Microsoft employee realized that geeks didn't really have an outlet for their desires to volunteer and help out in their communities through the skills they have patiently mastered. So Chris created <a href="http://givecamp.org/">GiveCamp</a>, an organization dedicated to hosting "GiveCamps" all over the US, bringing volunteer developers, designers, and other IT professionals together with charities that need some IT love, whether that's in the form of a new mobile app, some touch-up on the website, a port from a Microsoft Access app to something even remotely more modern, or whatever.</p>
<p><a href="http://www.seattlegivecamp.org/">Seattle GiveCamp</a> is coming up, October 11-13, at the Microsoft Commons. No technical bias is implied by that--GiveCamp isn't an evangelism event, it's a "let's help people" event. Bring your Java, PHP, Python, and yes, maybe even your Perl, and create some good karma for groups that are doing good things. And for those of you not local to Seattle, there's lots of other GiveCamps being planned all over the country--consider volunteering at one nearby.</p>
<h2><a href="http://blogs.tedneward.com/post/on-speakers-expenses-and-stipends/">On speakers, expenses, and stipends</a></h2>
<p><em>Mon, 26 Aug 2013 20:09:01 -0700</em></p>
<p>In the past, I've been asked about <a href="http://channel9.msdn.com/Shows/HanselminutesOn9/Hanselminutes-on-9-The-Death-of-the-Professional-Conference-Speaker">my thoughts on conferences and the potential "death" of conferences</a>, and the question came up again more recently in a social setting. It's been a while since I commented on it, and if anything, my thoughts have only gotten sharper and clearer.</p>
<h3>On speaking professionally</h3>
<p>When you go to the dentist's office, who do you want holding the drill--the "enthused, excited amateur", or the "practiced professional"?</p>
<p>The use of the term "professional" here, by the way, is not in its technical use of the term, meaning "one who gets paid to perform a particular task", but more in a follow-on to that, meaning, "one who takes their commitment very seriously, and holds themselves to the same morals and ethics as one who would be acting in a professional capacity, particularly with an eye towards actually being paid to perform said task at some point". There is an implicit separation between someone who plays football because they love it, for example, going out on Sunday afternoons and body-slamming other like-minded individuals just because of the adrenaline rush and the male bonding, and those who go out on Sunday afternoons and command a rather decently-sized salary ($300k at a minimum, I think?) to do so. Being a professional means that not only is there a paycheck associated with the activity, but a number of responsibilities--this means not engaging in stupid activity that prevents you from being able to perform your paid activity. In the aforementioned professional athlete's case, this means not going out and doing backflips on a dance floor (*ahem*, Gronkowski) or playing some other sport at a dangerous level of activity. (In the professional speaker's case, it means arranging travel plans to arrive at the conference at least a day before your session--never the day of--and so on.)</p>
<p>For a lot of people, speaking at an event is an opportunity for them to share their passion and excitement about a given topic--and I never want to take that opportunity away from them. By all means, go out and speak--and maybe in so doing, you will find that you enjoy it, and will be willing to put the kind of time and energy required into doing it well.</p>
<p>Because, really, at the end of the day, the speakers you see in the industry that are very, very good at what they do, they weren't just "born" that way. They got that way the same way professional athletes got that way, by doing a lot of preparation and work behind the scenes. They got that way because they got a lot of "first team reps", speaking at a variety of events. And they continue to get better because they continue to speak, which means continuously putting effort and energy into new talks, into revising old talks, and so on.</p>
<p>But all of that time can't be for free, or else people won't do it.</p>
<p>Go back to the amateur athlete scenario: the more time said athlete has to work at a different job to pay the bills, the less time they have to prep and master their athletic skills. This is no different for speakers--if someone is already spending 8 hours a day working, and another 6 to 8 hours a day sleeping, then that's 8 to 10 hours in the day for everything else, including time spent with the family, eating, personal hygiene, and so on, including whatever relaxation time they can carve out. (And yes, we all need some degree of relaxation time.) When, exactly, is this individual, excited, passionate, enthused (or not), supposed to get those "first team reps" in? By sacrificing something else: time with the family, sleep, a hobby, whatever.</p>
<p>Don't you think that they deserve some kind of compensation for that time?</p>
<p>I know, I know, the usual response is, "But they're giving back to the community!" Yes, I know, you never really figured anything out on your own, you just ran off to StackOverflow or Google and found all the code you needed in order to learn the new technology--it was never any more effort on your own part than that. You OWE the community this engagement. And, by the way, you should also owe them all the code you ever write, for the same reason, because it's not like your employer ever gave you anything for that code, and it's not like you did all that research and study for the code you work on for them.</p>
<p>See, the tangled threads of "why" we do something are often way too hard to unravel. So let's instead focus on the "what" you did. You submitted an abstract, you created an outline, you concocted some slides, you built some demos, you practiced your talk, you delivered it to the audience, and you submitted yourself to "life's slings and arrows" in the form of evaluations. And for all that, the conference organizers owe you nothing? In fact, you're required to pay for the privilege of doing all that?</p>
<h3>On "professional" conferences</h3>
<p>One dangerous trend I see in conferences, and it's not the same one I saw in 2009, is that the main focus of a conference is shifting; no longer is it a gathering of like-minded professionals who want to improve their technical skills by learning from others. Instead, it's turning into a gathering of people who want to party, play board games, gorge themselves on bacon, drink themselves into a stupor, play in a waterpark or go catch a Vegas show with naked women in it. Somehow, "professional developer conference" has taken on all the overtones of a Bacchanalian orgy, all in the name of "community".</p>
<p>Don't get me wrong--I think it can be useful to blow off some steam during a show, particularly because for most people, absorbing all this new information is mentally exhausting, and you need time to process it, both socially (in the form of hallway conversations) and physically (meaning, go give your body something to do while your mind is churning away). But when the focus of the conference shifts from "speakers" to "bacon bar", that's a dangerous, dangerous sign.</p>
<p>And you know what the first sign is that the conference doesn't think its principal offering is the technical content? When they won't even cover the speakers' costs to be at that event.</p>
<p>Seriously, think about it for a moment: if the principal focus of this event is the exchange of intellectual and industrial information, through the medium of a lecture given by an individual, then where should your money go? The bacon bar? Or towards making sure that you have the best damn lecturers your budget can afford?</p>
<p>When a conference doesn't offer to pick up airfare and hotel, then in my mind that conference is automatically telling the world, "We're willing to bring in the best speakers that are willing to do this all for free!" And how many of you would be willing to eat at a restaurant that said, "We're willing to bring in the best chefs that are willing to cook for free!"? Or go to a hospital that brings in "the best doctors that are willing to operate for free!"?</p>
<p>And how many of you are willing to part with your own money to go to it?</p>
<p>For community events like CodeCamps, it's an understood proposition that this is more about the networking and community-building than it is about the quality of the information you're going to get, and frankly, given that the CodeCamp is a free event, there's also an implicit "everybody here is a volunteer" that goes with it that explains--and, to my mind, encourages--people who've never spoken before to get up and speak.</p>
<p>But when you're a CodeMash, a devLink, or some of these other shows that are charging you, the attendee, a non-trivial amount of money to attend, and they're not covering speakers' expenses at a minimum, then they're telling you that your money is going towards bacon bars and waterparks, not the quality of the information you're receiving.</p>
<p>Yes, there are some great speakers who will continue to do those events, and Gods' honest truth, if I had somebody to cover my mortgage and/or paid me to be there, I'd love to do that, too. But many of those people who are paid by a company to be speaking at events are called "evangelists" and "salespeople", and developers have already voted with their feet often enough to make it easy to say that we don't want a conference filled with "evangelists" and "salespeople". You want an unbiased technical view of something? You want people who talk about a technology without an implicit desire to sell it to you, so that they can tell you both what it's good for and where it sucks? Then you want speakers who aren't being paid by a company to be there; instead, you want speakers who can give you the "harsh truth" about a technology without fear of reprisal from their management. (And yes, there are a lot of evangelists who are very straight-shooting speakers, and I love 'em, every one. But there's a lot more of them out there who aren't.)</p>
<p>In many cases, for the conference to deliver both the bacon bar and the speakers' T&E, it would require your attendance fee to go up some. By rough back-of-the-napkin calculations, probably about $50 for each of you, depending on the venue, the length of the conference, the number of speakers (and the number of talks they each do), and the total number of attendees. Is it worth it?</p>
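<p>If you want to sanity-check that back-of-the-napkin figure yourself, here's a minimal sketch; every number in it is an illustrative assumption (speaker count, airfare, hotel rates, attendance), not data from any actual conference:</p>

```python
# Back-of-the-napkin estimate of speaker T&E spread across attendees.
# All figures below are hypothetical assumptions for illustration only.
speakers = 30          # assumed number of out-of-town speakers
avg_airfare = 400      # assumed round-trip airfare per speaker, in dollars
hotel_nights = 3       # assumed nights per speaker
hotel_rate = 150       # assumed hotel rate per night, in dollars
attendees = 600        # assumed paid attendance

# Total travel-and-expenses cost the conference would absorb
total_te = speakers * (avg_airfare + hotel_nights * hotel_rate)

# What that works out to per attendee if folded into the ticket price
per_attendee = total_te / attendees

print(f"Total speaker T&E: ${total_te:,}")
print(f"Added cost per attendee: ${per_attendee:.2f}")
```

<p>With these made-up inputs the answer lands in the same ballpark as the $50 quoted above; nudge the speaker count or hotel rate up a bit and you hit it exactly, which is the point of a napkin calculation.</p>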
<p>When you go to the dentist's office, do you want the "excited, enthused amateur", or the "practiced professional"?</p>
<h2><a href="http://blogs.tedneward.com/post/on-startups/">On startups</a></h2>
<p><em>Mon, 26 Aug 2013 14:37:25 -0700</em></p>
<p>Curious to know what Ted's been up to? Head on over to <a href="http://signup.livethelook.com">here</a> and sign up.</p>
<p>Yes, I'm a CTO of a bootstrap startup. (Emphasis on the "bootstrap" part of that--always looking for angel investors!) And no, we're not really in "stealth mode", I'll be happy to tell you what we're doing if you drop me an email directly; we're just trying to "manage the message", in startup lingo.</p>
<p>We're only going to be under wraps for a few more weeks before the real site is live. And then.... *crossing fingers*</p>
<p>Don't be too surprised if the tone of some of my blog posts shifts away from low-level tech stuff and starts to include some higher-level stuff, by the way. I'm not walking away from the tech, by any stretch, but becoming a CTO definitely has opened my eyes, so to speak, that the entrepreneur CTO has some very different problems to think about than the enterprise architect does.</p>
<h2><a href="http://blogs.tedneward.com/post/farewell-mr-ballmer/">Farewell, Mr. Ballmer</a></h2>
<p><em>Fri, 23 Aug 2013 23:25:54 -0700</em></p>
<p>By this point, everybody who's even within shouting distance of a device connected to the Internet has heard the news: Steve Ballmer, CEO of Microsoft, is on his way out, retiring somewhere in the next twelve months and stepping aside to allow someone else to run the firm. And, rumor has it, this was not his choice, but a decision enforced upon the firm by the Microsoft Board.</p>
<p>You know, as much as I've disagreed with some of the decisions that've come out of the company in the last five years or so, I can't help but feel a twinge of sadness for how this ended. Ballmer, by all accounts, is a nice guy. I say that not as someone who's ever had to deal with him in person, but based on hearsay reports and two incidents where I've been in his general proximity, <a href="http://blogs.tedneward.com/2008/09/17/quotIm+Sorry+Sir+Those+Cookies+Are+Not+For+Youquot.aspx">one of which was absolutely hilarious</a>. Truth: when the cookie guard in that story told him that, he didn't, as some might imagine, immediately pull rank and start yelling "Do you know who I am?!?" In fact, he looked entirely like he was going to put the cookies back when another staff member rushed up, whispered in the first's ear, and when the first one apologized profusely, he just grinned--not meanly, but seemingly in the humor of the situation--and took a bite. He didn't have to play it so nicely, but how you treat the "little people" that touch on your life is a great indicator of the kind of person you are, deep down.</p>
<p>And count me a Ballmer-apologist, perhaps, but I have to wonder how much of his decision-making was made on faulty analysis and data from his underlings. Some of that is his own problem--a CEO should always be looking for ways to independently verify the information his people are reporting to him, and people who tell him only what he wants to hear should be immediately fired--but I genuinely think he was a guy just trying to do the best he could.</p>
<p>And maybe, in truth, that was never really enough.</p>
<p>Regardless, should the man suddenly appear at my doorstep, I would invite him in for dinner, offer him a beer, and talk about our kids' football teams. (They play in the same pre-high school football league.) He may not have been the great leader that Microsoft needed in the post-Gates years, but I wouldn't be surprised if Microsoft has to go a few iterations before they find that leader, if they ever can. A lot has to happen exactly right for them to find that person, and unless Bill has suddenly decided he's ready to take up the mantle again, it's something of a long shot.</p>
<p>Good luck, Microsoft.</p>
<h2><a href="http://blogs.tedneward.com/post/programming-interviews/">Programming Interviews</a></h2>
<p><em>Mon, 19 Aug 2013 21:30:55 -0700</em></p>
<p>Apparently I have become something of a resource on programming interviews: I've had three people tell me they read the last two blog posts, one because his company is hiring and he wants his people to be doing interviews right, and two more expressing shock that I still get interviewed--which I don't really think is all that fair, more on that in a moment--and relief that it's not just them getting grilled on areas that they don't believe to be relevant to the job--and more on that in a moment, too.</p>
<p>A couple of things have emerged in the last few weeks since the saga described earlier, so I thought I'd wrap the thing up with a final post. Besides, I like things that come in threes.</p>
<p><b>First, go see <a href="http://channel9.msdn.com/Events/ALM-Summit/ALM-Summit-3/Technical-Interviewing-You-re-Doing-it-Wrong">this video</a>.</b> Jonathan pinged me about it shortly after the second blog post came out, and damn if he and Mitch don't nail a bunch of things directly on the head. Specifically, I want to call out two lists they put into their slides (which I can't find online, or I'd include a link, sorry).</p>
<p>One, what are the things you're trying to answer in an interview? They call it out as three questions an interviewer or interview team is seeking to answer:
<ol>
<li>Can they do the job?</li>
<li>Will they be motivated?</li>
<li>Would they get along with the team?</li>
</ol>
Personally, #2 is a red herring--frankly, I expect that if you, the candidate, take a job with my company, then either you have determined that you will be motivated to work here, or else you can force yourself to be. I don't really expect you to be the company cheerleader (unless, of course, I'm hiring you for that role), but I do expect professionalism: that you will be at work when you are scheduled or expected to be, that you will do quality work while you are there, and that you will look to make the best decisions possible given the information you have at the time. Motivation is not something I should be interviewing for; it's something you should be bringing.</p>
<p>But the other two? Spot-on.</p>
<p>And this brings me to my interim point: <b>I'm not opposed to a programming test.</b> I think I gave the impression to a number of readers that I think that I'm too good or too famous or whatever to be tested on my skills; that's the furthest thing from the truth. I think you most certainly should be verifying that I have the technical chops to do the job you want me to do; what I do want to suggest, however, is that for a number of candidates (myself included), there are ways to determine my technical chops without forcing me to stand at a whiteboard and code with a pen. For some candidates, you can examine their GitHub profile and see how many repos they have that're public (and have a look through some of the code they wrote). In fact, what I think would be a <i>great</i> interview question would be to look at a repo they haven't touched in a year, find some element of the code inside there, and ask them to explain what they were thinking when they wrote it. If it's well-documented, or if it's simple code, they'll be able to do that fairly quickly (once they context-swap to the codebase--got to give them time to remember, after all). If it's a complex or tricky bit, and they can't explain it...</p>
<p>... well, you just learned something about the code they write, now didn't you?</p>
<p>In my case, I have no public GitHub profile to choose from, but I'm an edge case, in that you can also watch my videos, and/or read my books and articles. Granted, there's a chance that I have amazing editors who save me from incredible stupidity and make me look good... but what are the chances that somebody is doing that for over a decade, across several technology platforms, and all without any credit? Probably pretty close to nil, IMHO. I'm not unique in this case--there's others whose work more or less speaks for itself, and I think you're disrespecting the candidate if you don't do your homework on the interview first.</p>
<p>Which, by the way, brings up another point: As an interviewer, you have a responsibility to do your homework on the candidate before they walk in the door, particularly if you're expecting them to have done their homework on your firm. Don't waste my time (and yours, particularly since yours is probably a LOT more expensive than mine, considering that a lot of companies are doing "interview loops" these days with a team of people, and all of their time adds up). If you're not going to take my candidacy seriously, why should I take your job or job offer or interview seriously?</p>
<p>The second list Jon and Mitch call out is their "interviewing antipatterns" list:
<ul>
<li>The Riddler</li>
<li>The Disorienter</li>
<li>The Stone Tablet</li>
<li>The Knuth Fanatic</li>
<li>The Cram Session</li>
<li>Groundhog Day</li>
<li>The Gladiator</li>
<li>Hear No Evil</li>
</ul>
I want you to watch the video, so I'm not going to summarize each here; go watch it. If you're in a position of doing hiring, ask yourself how many of those you yourself are perpetrating.</p>
<p><b>Second, go read <a href="http://firstround.com/article/The-anatomy-of-the-perfect-technical-interview-from-a-former-Amazon-VP">this article</a>.</b> I don't like that he has "Dig into algorithms, data structures, code organization, simplicity" as one of his takeaways, because I think most interviewers are going to see "algorithms" and "data structures" and stop there, but the rest seems pretty spot-on.</p>
<p><b>Third, ask yourself the critical question: What, exactly, are we doing wrong?</b> You think you're an agile organization? Then ask yourself how much feedback you get on your interviewing process, and how you would know if you screwed it up. Yes, you will know if you hire a bad candidate, but how will you know if you're letting good candidates go? Maybe you're the hot company that everybody wants to work at, and you can afford to throw some wheat out with the chaff a few times, but you're not going to be in that position for long if you do, and more importantly, you're not going to be in that position for long, period. If you don't start trying to improve your hiring process now, by the time you need to, it'll be too late.</p>
<p><b>Fourth, practice!</b> When unit-testing came out, many programmers said, "I don't need to test my code, my code is great!", and then everybody had a good laugh at their expense. Yet I see a lot of companies say essentially the same thing about their hiring and interview practices. How do you test an interview process? Easy--interview yourselves. Work with known-good conditions (people you know, people who work with you already, and so on), and run them through the process, but with the critical stipulation that <i>you must treat them exactly as you would a candidate</i>. If you look at your tech lead and say, "Yeah, this is where I'd ask you a technical question, but I already know...", then unless you're prepared to do that for your candidates, you're cheating yourself on the feedback. It's exactly like saying, "Yeah, this is where I'd write a test checking to see how we handle a null in that second parameter, but I already know...". If you're not prepared to do the latter, don't do the former. (And if you are prepared to do the latter, then I probably don't want to work with you anyway.)</p>
<p><b>Fifth, remember: Interviewing is not easy!</b> It's not easy on the candidates, and it shouldn't be on you. It would be great if you could just test somebody on one dimension of themselves and call it good, but as much as people want to pretend that a programmer is just a code-spewing cog in a machine, they're not. If you want well-rounded candidates, then you must interview all aspects of that well-roundedness to determine if they are or not.</p>
<p>Whatever you interview for, that's what you will get.</p>
<h2><a href="http://blogs.tedneward.com/post/on-exclusive-content/">On "Exclusive content"</a></h2>
<p><em>Mon, 19 Aug 2013 19:17:56 -0700</em></p>
<p>Although it seems to have dipped somewhat in recent years, periodically I get requests from conferences or webinars or other presentation-oriented organizations/events that demand that the material I present be "exclusive", usually meaning that I've never delivered said content at any other organized event (conference or what-have-you). And, almost without exception, I refuse to speak at those events, or else refuse to abide by the "exclusive" tag (and let them decide whether they still want me to speak for them).</p>
<p>People (by which I mean "organizers"--most speakers seem to get it intuitively if they've spoken at more than five or so conferences in their life) have expressed some surprise and shock at my attitude. So, I decided to answer some of the more frequently-asked questions that I get in response to this, partly so that I don't have to keep repeating myself (yeah, right, as if said organizers are going to read my blog) and partly because putting something into a blog is a curious form of sanity-check, in that if I'm way off, commenters will let me know posthaste.</p>
<p>Thus...:
<ul>
<li><b>"Nobody will come to our conference/listen to our webinar if the content is the same as elsewhere."</b> This is, by far, the first and most-used reaction I get, and let me be honest: if people came to your conference or fired up your webinar solely because of the information contained, they would never come to your conference or listen to your webinar. The Internet is huge. Mind-staggeringly huge. Anything you could possibly ever want to know, about any topic you could ever possibly imagine, the Internet has captured somewhere. (There's a corollary to that, too; I call it "Whittington's Law", which states, "Anything you can possibly imagine, the Internet not only has it, but a porn site version of it, as well".) You will never have exclusive content, because unless I invented the damn thing, and I've never shown it to anybody or ever used it before, somebody will likely have used it, written a blog post or a video tutorial or what-have-you, and posted it to the Internet. Therefore, by definition, it can't be exclusive.</li>
<li>But even on top of that first point, no presentation given by the same guy using the same slides is ever exactly the same. Anybody who's ever seen me give a talk twice knows that a lot of how I give my presentations is extremely ad-hoc; I like to write code on the fly, incorporate audience feedback and participation, and sometimes I even get caught up in a tangent that we explore along the way. None of my presentations are ever scripted, such that if you filmed two of them and played them side-by-side, you'd see marked and stark differences between them. And frankly, if you're a conference organizer, you should be quite happy about this, because one of the first rules of presenting is to "Know thy audience", but if you can't know your audience ahead of time, what course is left to you but to poll the audience when you first get started, and adjust your presentation based on that?</li>
<li><b>"Sure, the experience won't be as great as if they were in the room at the time, but if they can get the content elsewhere, why should they come to our conference?"</b> Well.... Honestly, that question really needs to be rephrased: "Given all the vast amounts of information out there on the Internet, why should someone come to your conference, period?" If you and your fellow organizers can't answer that question, then my content isn't going to help you in the slightest. TechEd and other big conferences that stream all of their content to the Web seem to be coming to the realization that there is something about the in-person experience that still creates value for attendees, so maybe you should be thinking about that, instead. Yes, you will likely lose a few ticket sales from people watching the content online, but if those numbers are staggeringly large, it means that your conference offered nothing but content in the first place, and you were going to see those numbers drop off significantly anyway once the majority of your audience figured out that the content is available elsewhere. And for free, no less.</li>
<li><b>"But why is this so important to you?"</b> Because, my friends, everything gets better with practice, and that includes presentations. When I taught for <a href="http://www.develop.com">DevelopMentor</a> lo those many years ago, one of the fundamental rules was that "You don't really know a deck until you've delivered it five times". (I call it "Sumida's Law", after the guy who trained me there.) What's more, the more often you've presented on a subject, the more easily you see the "right" order to the topics, and better ways of explaining and analogizing those topics occur to you over time. ("Halloway's Corollary to Sumida's Law": "Once you've delivered a deck five times, you immediately want to rewrite it all".) To be quite honest with you all, the first time I give a talk is much like the beta release of any software product: it takes user interaction and feedback before you start to see the non-obvious bugs.</li>
</ul>
</p>
<p>I still respect the conference or webinar host that insists on exclusive content, and I wish you well finding your next speaker.</p>
More on the Programming Tests Saga
http://blogs.tedneward.com/post/more-on-the-programming-tests-saga/
Thu, 25 Jul 2013 14:19:48 -0700
http://blogs.tedneward.com/post/more-on-the-programming-tests-saga/
<p>A couple of people had asked how the story with the company that triggered the "I Hate Programming Tests" post ended, so I figured I'd follow up with the rest of that story, and some thoughts.</p>
<p>After handing in the disjoint-set solution I'd come up with, the VP pondered things for a bit, then decided to bring me in for an in-person interview loop with a half-dozen of the others that work there. I said I'd be happy to, and came in, did a brief meet-and-greet with the group of folks I'd be interviewing with (plus, I think, a few others), and then we got to the first interview, mano a mano, and after a brief "Are you familiar with MVC?", we get into...</p>
<p>... another algorithm challenge. A walk-up-to-the-whiteboard-and-code-this challenge.</p>
<p>OK, whatever. I already said I'm not great with algorithmic challenges like this, but maybe this guy didn't get the memo or he's just trying to see how I reason things through. So, sure, let's attack this, even though I haven't done this kind of problem in like twenty years. (One of the challenges was "How do you sort a file of integer numbers when you can't store the entire collection of numbers in memory?", which wasn't an unfair challenge, just not something that I generally have to mess with. Honestly, in the working world, I'll start by going through the file number by number--or do chunks of the file in parallel using actors, if the file is large enough--and shove them into a database that's indexed on that number. But, of course, as with all of these kinds of challenges, the interviewer continues to throw constraints at the problem until we either get to the solution he wants or Ted runs out of imagination; in this case, I think it was the latter.) End result: not a positive win.</p>
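<p>For what it's worth, the textbook answer to that first challenge is an external merge sort: sort chunks that do fit in memory, spill each sorted run to disk, then do a k-way merge of the runs. A minimal sketch in Python (the challenge itself was language-agnostic, and the chunk size and file handling here are illustrative, not tuned):</p>

```python
import heapq
import os
import tempfile

def _spill(sorted_chunk):
    """Write one sorted run to a temp file and return its path."""
    f = tempfile.NamedTemporaryFile("w", delete=False, suffix=".run")
    f.writelines(f"{n}\n" for n in sorted_chunk)
    f.close()
    return f.name

def external_sort(numbers, chunk_size):
    """Sort ints that won't all fit in memory: sort chunk_size-sized
    pieces, spill each run to disk, then k-way merge the runs."""
    runs, chunk = [], []
    for n in numbers:
        chunk.append(n)
        if len(chunk) == chunk_size:
            runs.append(_spill(sorted(chunk)))
            chunk = []
    if chunk:
        runs.append(_spill(sorted(chunk)))
    files = [open(r) for r in runs]
    try:
        # heapq.merge is lazy: it holds only one value per run in memory.
        yield from heapq.merge(*((int(line) for line in f) for f in files))
    finally:
        for f in files:
            f.close()
        for r in runs:
            os.unlink(r)
```

<p>A real answer would also discuss run length, replacement selection, and so on, but this is the core shape of the idea.</p>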
<p>The next interviewer walks in; he wasn't there for the meet-and-greet, which means he has even less context about me than the guy before him, and he immediately asks... another algorithmic challenge. "If you have a tree of nodes, and you want to get a list of the nodes in rank order" (meaning, a breadth-first search, where each node now gets a "sibling" pointer pointing to the sibling on its right in the tree, or null if it's the rightmost node at that depth level) "how would you do it?" Again, a fail, and now I'm getting annoyed. I admitted, from the outset, that this is not the kind of stuff I'm good at. We've already made that point. I accept the "F" on that part of my report card. What's even more annoying, the interviewer keeps sighing and drumming his fingers in an obvious state of "Why is this bozo wasting my time like this, I could be doing something vastly more important" and so on, which, gotta say, was kind of distracting. End result: total fail.</p>
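<p>For the record, that problem usually falls out of a level-order walk: visit the tree one level at a time, wiring each node to the one on its right as you go. A rough sketch in Python (the node layout and names are my own, not the interviewer's):</p>

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right
        self.sibling = None  # next node to the right at this depth, or None

def link_siblings(root):
    """Level-order walk: wire sibling pointers within each level and
    return the nodes in rank (breadth-first) order."""
    order = []
    level = [root] if root else []
    while level:
        for a, b in zip(level, level[1:]):
            a.sibling = b  # rightmost node on the level keeps sibling = None
        order.extend(level)
        level = [kid for n in level for kid in (n.left, n.right) if kid]
    return order
```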
<p>By this point, I'm really annoyed. The VP comes to meet me, asks how it's going, and I tell him, flatly, "Sucks." He nods, says, yeah, we're going to kill the interview loop early, but I want to talk to you over lunch (with another employee along for company) and then have you meet with one more person before we end the exercise.</p>
<p>Lunch goes quite well, actually, and the last interview of the day is with their Product Manager, who then presents me with a challenge: "Suppose I want to build an online system for ordering pizzas. Customers can order pizzas, in other words. Build for me either the UI or the data model for this system." OK, this is different. I choose the data model, and build a ridiculously simple one-to-many relationship of customers to orders, and a similar one-to-many for orders to pizzas. She then proceeds to complicate the model step by step, sometimes in response to my questions, sometimes out of the blue, until we have a fairly complex roughly-sketched data model on the whiteboard. Result: win.</p>
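<p>That starting point, before she began complicating it, looked roughly like this; sketched here as Python dataclasses rather than the whiteboard diagram, with names of my own choosing:</p>

```python
from dataclasses import dataclass, field

@dataclass
class Pizza:
    size: str                    # e.g. "large"
    toppings: list[str] = field(default_factory=list)

@dataclass
class Order:
    order_id: int
    pizzas: list[Pizza] = field(default_factory=list)   # order -> pizzas, one-to-many

@dataclass
class Customer:
    name: str
    orders: list[Order] = field(default_factory=list)   # customer -> orders, one-to-many
```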
<p>The VP at this point is on the horns of a dilemma: two of the engineers in the interview loop are convinced I'm an idiot. They're clearly voting no on this. But he's read my articles, he's seen some of my presentations, he knows I'm not the idiot the others assume me to be, and he's now trying to figure out what his next steps are. He takes a week to think about it, then emails me yesterday to say that it's not going to work.</p>
<p>Here's my thoughts, and folks, if you interview people or are part of an interview process, I'm trying to generalize this beyond this one experience to take it into a larger context:
<ul>
<li><b>Know what you want to prove with your interview.</b> I get the feeling that this interview loop was essentially a repeat of every interview loop they've ever done before, with no consideration to the candidate himself. An interview is a chance for the company to get to know the candidate better, in order to make a well-informed decision. In this particular case, trying to suss out my skills around algorithms was a wasted effort--I'd already conceded that point. Therefore, find new questions! Find new areas in which to challenge the candidate to see what their skills are. (If you can't think of something else to ask, then you're not really thinking about the interview all that hard, and you're just going through the motions.)</li>
<li><b>Look for the proof you seek in other areas.</b> With the growth of things like Github and open source projects in general, it's becoming easier and easier to prove to yourself as a company that a candidate does or does not have the coding skills you're looking for. Did this guy submit some pull requests to a project? Did this guy post some blogs about interesting technical tidbits? (Or, Lord help us, write articles for major publications?) Did this guy author an open-source project, or work on a project that other people know about? Look at it this way: If Anders Hejlsberg, Bjarne Stroustrup or James Gosling walk through the door, are you going to put them through the same interview questions you put the random recruiter-found candidate through? Or are you willing to consider their established body of work and call it covered? As an interviewer, it behooves you to look for that established body of work, so that you can spend the interview loop looking at other things.</li>
<li><b>Be clear in what you want.</b> One of the things the VP said to me was that he was looking for somebody who had a similar skillset to what he had; that is, had an architectural view of things and an interest in managing the people involved. By then submitting my candidacy to a series of tests that didn't really test for those things, he essentially torpedoed whatever chances it might have had.</li>
<li><b>Be willing to assert your authority.</b> If you're the VP of the company, and the people who work for you disagree with your decisions, sometimes the Right Thing To Do is to simply overrule them. Yes, I know, it's not all politically correct to do that, and if you do it too often you'll ruin whatever sense of empowerment that you want your employees to have within the company, but there are times when you just need to assert that authority and say, "You know what? I appreciate y'all's input, but this is one of those cases where I think I have a different enough perspective that I am going to just overrule and do it anyway." Sometimes you'll be right, yay, and sometimes you'll be wrong, boo, but there is a reason you're the VP or the Director or the Team Lead, and not the others. Leadership means making hard decisions sometimes.</li>
<li><b>Be willing to change up the process.</b> So your candidate comes in, and they're a junior programmer who's just graduated college, with zero experience. Do you then start asking them questions about their experience? That would be a waste of their time and yours. So you'll have to come up with new questions and a new approach. Not all interviews have to be carbon copies of each other, because certainly your candidates aren't carbon copies of each other. (At least, you'd better hope not, or else you're going to end up with a pretty single-dimensional staff.) If they've proven their strength in some category, or admitted a lack in another, then drop your standard set of questions, and go to something different. There is no honor in asking the exact same questions of every candidate.</li>
<li><b>Be willing to hire somebody that offers complementary skills.</b> If your company already has a couple of engineers who know algorithms really well, then hire somebody for a different skillset. Likewise, if your company already has a couple of people who are really good with customers, you don't need another one. Look for people that have skills that fall outside the realm of what you currently have, and trust that when that individual is presented with a problem that attacks their weakness, they'll turn to somebody else in the firm to help them with it. When presented with an algorithmic challenge, you're damn well sure that I'm going to turn to somebody next to me and say, "Hey, dude? Help me walk through this for a bit, would you?" And, in turn, if that engineer has to give a presentation to a customer, and they turn to me and say, "Hey, dude? Help me work on this presentation, would you?", I'm absolutely ready to chip in. That's how teams are built. That's why we have teams in the first place.</li>
</ul>
In the end, this is probably the best of all possible scenarios, not working for them, particularly since I have some other things brewing that will likely consume all of my attention in the coming months, but there's that part of me that hates the fact that I failed at this. That same part of me is now going back through a few of the "interview challenges" books that I picked up, ironically, for my eldest son when he goes out and does his programming interviews, just to work through a few of the problems because I HATE feeling inadequate to a challenge.</p>
<p>And that, in turn, raises my next challenge: I want to create a website, just a static thing, that has a series of questions that, I think, are far better coding challenges than the ones I was given. I don't know when or if I'm going to get to this, but I gotta believe that any of the problems out of the book "Programming Challenges" (by Skiena and Revilla, Springer-Verlag, 2003) or the website from which those challenges were drawn, would be a much better test of the candidate's ability, particularly if you look at the ancillary parts of the challenge: do they write tests, how do they write their tests, do they pair well with somebody, and so on. THOSE are the things you really care about, not how well they remember their college lessons, which are easily accessible over Google or StackOverflow.</p>
<p>Bottom line: Your time is precious, people. Interview well, or just don't bother.</p>
Programming Tests
http://blogs.tedneward.com/post/programming-tests/
Tue, 09 Jul 2013 00:02:11 -0700
http://blogs.tedneward.com/post/programming-tests/
<p>It's official: I hate them.</p>
<p>Don't get me wrong, I understand their use and the reasons why potential employers give them out. There's enough programmers in the world who aren't really skilled enough for the job (whatever that job may be) that it becomes necessary to offer some kind of litmus test that a potential job-seeker must pass. I get that.</p>
<p>And it's not like all the programming tests in the world are created equal: some are pretty useful ways to demonstrate basic programming facilities, a la the FizzBuzz problem. Or some of the projects I've seen done, a la the "Robot on Mars" problem that ThoughtWorks handed out to candidates (a robot lands on Mars, which happens to be a cartesian grid; assuming that we hand the robot these instructions, such as LFFFRFFFRRFFF, where "L" is a "turn 90 degrees left", "R" is a "turn 90 degrees right", and "F" is "go forward one space, please write control code for the robot such that it ends up at the appropriate-and-correct destination, and include unit tests), are good indicators of how a candidate could/would handle a small project entirely on his/her own.</p>
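<p>For the curious, a straightforward interpretation of that robot problem might look like this in Python; the starting position and heading are my assumptions, since the handout as I recall it left those to the candidate (the unit tests the candidate would be expected to write are sketched separately):</p>

```python
def run_robot(instructions, start=(0, 0), heading="N"):
    """Interpret L/R/F on a cartesian grid; return final (x, y) and heading."""
    headings = ["N", "E", "S", "W"]  # clockwise order, so +1 turns right
    moves = {"N": (0, 1), "E": (1, 0), "S": (0, -1), "W": (-1, 0)}
    x, y = start
    h = headings.index(heading)
    for c in instructions:
        if c == "L":
            h = (h - 1) % 4        # turn 90 degrees left
        elif c == "R":
            h = (h + 1) % 4        # turn 90 degrees right
        elif c == "F":
            dx, dy = moves[headings[h]]
            x, y = x + dx, y + dy  # go forward one space
        else:
            raise ValueError(f"unknown instruction {c!r}")
    return (x, y), headings[h]
```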
<p>But the ones where the challenge is to implement some algorithmic doodad or other? *shudder*.</p>
<p>For example, one I just took recently asks candidates to calculate the "disjoint sets" of a collection of sets; in other words, given sets of { 1, 2, 3 }, { 1, 2, 4 } and { 1, 2, 5 }, the result should be sets of {1,2},{3},{4}, and {5}. Do this and calculate the big-O notation for your solution in terms of time and of space/memory.</p>
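<p>(For readers who want one way to attack it, not necessarily the way the test graders had in mind: bucket each element by the signature of which input sets contain it; two elements land in the same output set exactly when their signatures match. That's linear in the total number of elements, for both time and space, assuming O(1) hashing. A sketch in Python:)</p>

```python
from collections import defaultdict

def disjoint_sets(sets):
    """Partition the union of `sets` so two elements end up together
    iff they appear in exactly the same input sets."""
    signature = defaultdict(set)   # element -> indices of the sets containing it
    for i, s in enumerate(sets):
        for elem in s:
            signature[elem].add(i)
    groups = defaultdict(set)      # signature -> elements sharing that signature
    for elem, sig in signature.items():
        groups[frozenset(sig)].add(elem)
    return list(groups.values())
```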
<p>I hate to say this, but in twenty years of programming, I've never had to do this. Granted, I see the usefulness of it, and granted, it's something that, given large enough sets and large enough numbers of sets, will make a significant enough difference that it bears examination, but honestly, in times past when I've been confronted with this problem, I'm usually the first to ask somebody next to me how best to think about this, and start sounding out some ideas with them before writing any bit of code. Unit tests to test input and its expected responses are next. Then I start looking for the easy cases to verify before I start attacking the algorithm in its entirety, usually with liberal help from Google and StackOverflow.</p>
<p>But in a programming test, you're doing this alone (which already takes away a significant part of my approach, because being an "external processor", I think by talking out loud), and if it's timed (such as this one was), you're tempted to take a shortcut and forgo some of the setup (which I did) in order to maximize the time spent hacking, and when you end up down a wrong path (such as I did), you have nothing to fall back on.</p>
<p>Granted, I screwed up, in that I should've stuck to my process and simply said, "Here's how far I got in the hour". But when you've been writing code for twenty years, across three major platforms, for dozens of Fortune 500 companies and architected platforms that others will use to build software and services for thousands of users and customers, you feel like you should be able to hack something like this out fairly quickly.</p>
<p>And when you can't, you feel like a failure.</p>
<p>I hate programming tests.</p>
<p><b>Update:</b> By the way, as always, I would love some suggestions on how to accomplish the disjoint-set problem. I kept thinking I was close, but was missing one key element. I particularly would LOVE a nudge in doing it in a object-functional language, like F# or Scala (I've only attempted it in C# so far). Just a nudge, though--I want to work through it myself, so I learn.</p>
<p><b>Postscript</b> An analogy hit me shortly after posting this: it's almost as if, in order to test a master carpenter's skill at carpentry, you ask him to build a hammer. After all, if he's all that good, he should be able to do something as simple as affix a metal head to a wooden shaft and have the result be a superior device to anything he could buy off the shelf, right?</p>
<p><b>Further update:</b> After writing this, I took a break, had some dinner, played a game of Magic: The Gathering with my wife and kids (I won, but I can't be certain they didn't let me win, since they knew I was grumpy about not getting this test done in time), and then came back to it. I built up a series of little steps, backed by unit tests to make sure I was stepping through my attempts at reasoning out the algorithm correctly, backed up once or twice with a new approach, and finally solved it in about three hours, emailing it to the company at 6am (0600 for those of you reading this across the Atlantic or from a keyboard marked "Property of US Armed Forces"), just for grins. I wasn't expecting to get a response, since I was grossly beyond the time allotted, but apparently it was good enough to merit a follow-up interview, so yay for me. :-) Upshot is, though, I have an implementation that works, though now I find myself wondering if there's a way to do it in a functional/no-side-effect/persistent-data-structure kind of way....</p>
<p>I still hate them, though, at least the algorithm-based ones, and in a fleeting moment of transparent honesty, I will admit it's probably because I'm not very good at them, but if you repeat that to anyone I'll deny it as outrageous slander and demand satisfaction, Nerf guns at ten paces.</p>
More on Types
http://blogs.tedneward.com/post/more-on-types/
Wed, 01 May 2013 02:54:21 -0700
http://blogs.tedneward.com/post/more-on-types/
<p>With my most recent blog post, some of you were a little less than impressed with the idea of using types. One reader, in particular, suggested that:</p> <blockquote> <p>Your encapsulating type aliases don't... encapsulate :|</p> </blockquote> <p>Actually, it kinda does. But not in the way you described.</p> <blockquote> <p>using X = qualified.type;</p> <p>merely introduces an alias, and will consequently (a) not prevent assignment of <br />a FirstName to a LastName (b) not even be detectible as such from CLI metadata <br />(i.e. using reflection).</p> </blockquote> <p>This is true—the using statement only introduces an alias, in much the same way that C++’s “typedef” does. It’s not perfect, by any real means.</p> <blockquote> <p>Also, the alias is lexically scoped, and doesn't actually _declare a public name_ (so, it would need to be redeclared in all 'client' compilation units).</p> <p>(This won't be done, of course, because the clients would have no clue about <br />this and happily be passing `System.String` as ever).</p> <p>The same goes for C++ typedefs, or, indeed C++11 template aliases:</p> <p>using FirstName = std::string; <br />using LastName = std::string;</p> <p>You'd be better off using BOOST_STRONG_TYPEDEF (or a roll-your-own version of this thing that is basically a CRTP pattern with some inherited constructors. When your compiler has the latter feature, you could probably do without an evil MACRO).</p> </blockquote> <p>All of which is also true. 
Frankly, the “using” statement is a temporary stopgap, simply a placeholder designed to say, “In time, this will be replaced with a full-fledged type.”</p> <p>And even more to the point, he fails to point out that my “Age” class from my example doesn’t really encapsulate the fact that Age is, fundamentally, an “int” under the covers—because Age possesses type conversion operators to convert it into an int on demand (hence the “implicit” in that operator declaration), it’s pretty easy to get it back to straight “int”-land. Were I not so concerned with brevity, I’d have created a type that allowed for addition on it, though frankly I probably would forbid subtraction, and most certainly multiplication and division. (What does multiplying an Age mean, really?)</p> <p>See, in truth, I cheated, because I know that the first reaction most O-O developers will have is, “Are you crazy? That’s tons more work—just use the int!” Which is both fair and an old argument—the C guys said the same thing about these “object” things, and how much work it was compared to just declaring a data structure and writing a few procedures to manipulate them. Creating a full-fledged type for each domain—or each fraction of a domain—seems… heavy.</p> <p>Truthfully, this is <strong>much</strong> easier to do in F#. And in Scala. And in a number of different languages. Unfortunately, it’s much harder in C#, Java, and even C++ (and frankly, I don’t think the use of an “evil MACRO” is unwarranted, if it doesn’t promote bad things). The fact that “doing it right” in those languages means “doing a ton of work to get it right” is exactly why nobody does it—and suffers the commensurate loss of encapsulation and integrity in their domain model.</p> <p>Another poster pointed out that there is a <em>much</em> better series on this at <a href="http://www.fsharpforfunandprofit.com">http://www.fsharpforfunandprofit.com</a>. 
In particular, check out the series on <a href="http://fsharpforfunandprofit.com/series/designing-with-types.html">&quot;Designing with Types&quot;</a>—it expresses everything I wanted to say, albeit in F# (where I was trying, somewhat unsuccessfully, to example-code it in C#). By the way, I suspect that almost every linguistic feature he uses would translate pretty easily/smoothly over to Scala (or possibly Clojure) as well.</p> <p>Another poster pointed out that doing this type-driven design (TDD, anyone?) would create some serious havoc with your persistence. Cry me a river, and then go use a persistence model that fits an object-oriented and type-oriented paradigm. Like, I dunno, an <a href="http://www.db4o.com">object database</a>. Particularly considering that you shouldn’t want to expose your database schema to anyone outside the project anyway, if you’re concerned about code being tightly coupled. (As in, any other code outside this project—like a reporting engine or an ETL process—that accesses your database directly now is tied to that schema, and is therefore a tight-coupling restriction on evolving your schema.)</p> <p>Achieving good encapsulation isn’t a matter of trying to hide the methods being used—it’s (partly) a matter of allowing the type system to carry a significant percentage of the cognitive load, so that you don’t have to. Which, when you think on it, is kinda what objects and strongly-typed type systems are supposed to do, isn’t it?</p>
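<p>To make the "full-fledged type" idea concrete outside C#: here is a small sketch in Python (the name and the operator choices are mine, not from the original post) of an Age that permits addition and comparison but deliberately defines no multiplication, so misuse fails loudly at the type's boundary instead of silently in int-land:</p>

```python
from functools import total_ordering

@total_ordering
class Age:
    """A domain type wrapping an int: addition is meaningful,
    multiplication deliberately is not (what would it mean?)."""
    def __init__(self, years):
        if years < 0:
            raise ValueError("an Age cannot be negative")
        self._years = int(years)

    def __add__(self, years):
        return Age(self._years + int(years))

    def __eq__(self, other):
        return isinstance(other, Age) and self._years == other._years

    def __lt__(self, other):
        return self._years < other._years

    def __repr__(self):
        return f"Age({self._years})"
```

<p>Because no <code>__mul__</code> is defined, <code>Age(40) * 2</code> raises a TypeError, which is exactly the kind of load-bearing refusal the post is arguing the type system should provide.</p>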