Monday, January 30, 2017

I often joke that my company is "the engineers running the asylum" because there's not much "product management" as a separate discipline; engineering figures out what to do, and does it. But it's not as bad as all that; we are highly data-driven, and have a solid A/B test system in place.

So one engineer had a thought that "Subscribe" with a pin wasn't the clearest UI, and that we might get better results with alternate wording. So we took the top bar of this:

And changed it to that:

A/B tests (called "view versions" here) are enumerated in a Java file, and then we have a page that lists them all, lets the developer set what % of traffic should be sent to each one, and has an "assign to me" button that ensures the developer ALWAYS gets put in that group:

For each test, within its assigned percentage (say 30%), half of that traffic gets sent to GROUPNAME as the test group, and the other half gets sent to GROUPNAME_CONTROL. Then you can pull up logs and look at the results.
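I won't show the actual internal code, but the bucketing logic described above can be sketched in a few lines of Java. Everything here (the class name, the method names, the hash-based assignment) is invented for illustration; the real system presumably tracks assignments more carefully than a raw hash:

```java
// Hypothetical sketch of view-version bucketing: map each user to a stable
// 0-99 percentile, then split the test's traffic slice into test and control.
public class ViewVersionBucketer {

    // Deterministically map a user id to 0-99 so assignment is stable
    // across page loads. floorMod avoids negative results from hashCode.
    static int percentileFor(String userId) {
        return Math.floorMod(userId.hashCode(), 100);
    }

    // If the user falls inside the test's traffic percentage, split that
    // slice in half: even percentiles get GROUPNAME, odd percentiles get
    // GROUPNAME_CONTROL. Users outside the slice aren't in the experiment.
    static String groupFor(String userId, String groupName, int trafficPercent) {
        int p = percentileFor(userId);
        if (p >= trafficPercent) {
            return null; // not in this experiment at all
        }
        return (p % 2 == 0) ? groupName : groupName + "_CONTROL";
    }
}
```

The nice property of hashing rather than random assignment is that a given user always lands in the same group, so their experience stays consistent and the logs stay clean.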

The results are often counter-intuitive. Stuff that seems like a solid UI win gets poorer results. (As happened here - switching to a bell and "email alerts" from a weird pushpin and "subscribe" got fewer clicks, and so did trying an email envelope icon while keeping the word "subscribe".) Or you might be looking at the wrong result. Like, maybe you don't want more sales leads if they're all crappy leads bumping down the overall conversion rate.

Still, it's good that not everything is just developer or product manager guessing...

Thursday, January 26, 2017

At work, I was seeing odd results with little blue underlines under some of the stars in our dealer ratings:

It was a bit mysterious - sometimes there, sometimes not. And as in that example, sometimes both states appeared on the same page.

Through trial and error I realized messing with the browser width could make the underlines appear and disappear. Or even more reliably, changing the zoom level - that could get it to happen even on Safari, not just Chrome where I was mostly working.

I guess the takeaway is spritesheets need more breathing room - a bit more transparent space between elements - because browsers and screens are doing all kinds of tricks where a screen pixel is not a browser pixel. 95 times out of 100, that magic is transparent to the web developer, but sometimes the abstraction leaks a bit.

Wednesday, January 25, 2017

A while back a person named Esteban Hufstedler made a really cool Processing program that made awesome, lava-lamp-ish moving patterns, much like the mask Rorschach wore in the movie "Watchmen". It included a mode to see how the effect was done, and some other settings to play with to get different results. I took the defunct-ish original and made the tiny changes to let it work in the browser (albeit a bit more slowly than the original Java version):

Wednesday, January 18, 2017

Arguing with my sparring partner, we got to thinking about what innovation is, like the iPhone launch – if it was more the thunderbolt of a new idea, or incremental progress suddenly revealed. I think it’s both – someone high up gets a vision of “hey maybe we could do this if we have the tech” and then a team has to put in the hard, slow trudge of all the steps to make that happen. (Or maybe, someone in the middle-to-high level gets the idea and pitches it to the very highest, even ruffling some peers’ feathers – the process that this article says might be breaking at Apple)

I think usually that vision takes the form of a new interaction, something that wasn’t possible with the current configuration of stuff.

I think some of the current state of Apple is the lack of a big idea. Look at Jobs’ last big 3: iMac was a matter of presentation and wrapping – actually a freshening of the very original information appliance concept, but redone beautifully. iPod’s innovation was the clickwheel – and it was a great one. iPhone’s interaction innovation was taking the new type of touch sensitivity (already used in, say, laptop trackpads) and putting it behind glass. (And visual voicemail.)

In this view, iPad really didn’t represent interaction innovation. (To be fair, it represents the innovation that then got diverted into the iPhone, so by the time it came around it was kind of ho-hum, just larger.) And the same for the Apple Watch. The interaction of “smaller and on your wrist instead of your pocket” doesn’t involve all that many new forms of interaction. What will the next interaction innovation be? Well, if I could say for sure I might be rich. It might be in voice assist, where Apple seems to be lagging on execution a bit (some argue it’s because they’re more privacy conscious than their rivals?). Random pipe dream: what if clear touchscreens could remold themselves slightly to provide tactile bumps? Like tell your thumb where the virtual cross pad was, or have faux physical slider points… no idea if the supporting tech for that is even on the horizon, but it certainly sounds cooler than edge to edge curved screens, doesn’t it?

So we turn to Microsoft. They made a bet that the future of laptops and tablets might be doable with one OS. And they paid the price for that; some of their earlier attempts were really painful to use, and even now the legacy aspect they lug along is off-putting for some. But now there is some exciting interaction innovation; giant, desktop workspace touchscreens and intriguing tactile physical dials are making a hard press for “creatives” – it’s a historical shame Apple is falling behind supporting that group. (Compare to the iPad Pro message, where Apple is saying “you can do all your pro work without a real filesystem”, which honestly I’m not sure I believe.)

If I thought Windows was anywhere near as acceptable as MacOS I might be tempted to swap back, but I’m not willing to gamble 800 bucks to find out it’s not. (And that’s another way Windows might suffer with people like me who could potentially be persuaded to “switch back” – I tend to compare the hardware on my mom’s $250 cheapie Windows box to the $1000 hardware of my Macbook Air, and that’s clearly not fair.)

Tuesday, January 17, 2017

Loewy had an uncanny sense of how to make things fashionable. He believed that consumers are torn between two opposing forces: neophilia, a curiosity about new things; and neophobia, a fear of anything too new. As a result, they gravitate to products that are bold, but instantly comprehensible. Loewy called his grand theory “Most Advanced Yet Acceptable”—MAYA. He said to sell something surprising, make it familiar; and to sell something familiar, make it surprising.

Interesting stuff. It reminds me of "Zen and the Art of Motorcycle Maintenance" and its description of how we know "Quality", the Tao, how something is good at being whatever it is, in a circular way: we learn to define quality as we recognize it in instances of the thing we're defining.

Monday, January 16, 2017

I'm bummed I don't make many microgames any more, but when I did, they followed that idea of "make a new mechanic", have some fun with it. (I used to do Glorious Trainwrecks' two hour game jams, where you'd make the best worst game you could in 2 hours, and then hop online and play what everyone else made as well - a new fun mechanic was about all I could hope for. You can see some of the best of results on my game page (currently my 2015 advent calendar) and then a pile of other stuff here)

There should never be Game Police saying what games can and can't or should and shouldn't be, or are or aren't, but I do think you can make an argument that video games as a medium are especially interesting when they're playing to their unique strengths - things you can't easily do in other media, like for example making "physically" interactive microworlds. Lots of formats can tell stories, many of them can even bring the reader/viewer into the story.... (and video games always have to deal with "ludonarrative dissonance", where what the player wants to do may or may not make sense with what the character wants to do). And many, many types of games let you address strategic fun, and even model their own little "worlds" in the process. But making an entertaining interactive/reactive new reality... that came first for games, and is where my focus tends to be brought.

The other video presented another Nintendo-ish view, and reminded me what my attempts at gamemaking sometimes lack:

For me the most important quote was Miyamoto saying this:

I think that first is that a game needs a sense of accomplishment. And you have to have a sense that you've done something.

Challenge and accomplishment do bring a lot to a game. (Of course games have gotten a bit more friendly and forgiving over the years - sometimes I worry that they reward time spent rather than skill built...)

I have mixed but mostly sad feelings about not making or playing games much these days. I have some friends who argue I've spent too much of my life with them already, and that they serve as a distraction from the important things, and certainly some of the things I've been doing more of (especially playing in some street bands) have given me great rewards as well.

Thursday, January 12, 2017

In Cat's Cradle, Kurt Vonnegut introduces the concept of a "wrang-wrang": a person who steers people away from a line of thinking by reducing that line, with the example of the wrang-wrang's own life, to an absurdity.

A sudden irrational and disproportionate fury at somewhat trivial things that are out of my control. In some circumstances I'm almost too controlled, many of my potential feelings of desire have to be vetted by my inner judge before they're allowed... but the feeling of "this is just wrong" rises up in a sudden furious tantrum, and I don't like that about myself. (It's gotten me into trouble in previous jobs; it's not that I rant and rave endlessly, it's just that one moment of exposed anger, even if directed at a system and not an object, can make people very uncomfortable.)

The issue has been on my mind for a while. In 2008 I wrote
"C'est la Vie!" / accepting that / "this should not be!" / but coping / more stoically; / philosophically-- / "C'est la vie..."

A few years later I read about William Irvine's modern application of classical Stoicism in "A Guide to the Good Life": protecting one's equanimity and contentment at all costs, in part by triaging the world into things one has complete control over, no control over, and somewhere in between, and attending only to the first and last categories; along with "negative visualization" - a meditative technique of thinking about how bad things could get, then being happy when they're better than that, and realizing that you'd be able to cope even if they were that bad. So that was helpful, but just recognizing that a situation was out of my control didn't actually help my equanimity all that much.

Other approaches suggested themselves. I wrote this in 2015:

Recently a conversation with Derek gave me the idea of approaching the world with a kind of cheerful pessimism- assume that "a bit screwed up and annoying" is kind of the natural state of the universe, that things WILL be messed up, but generally not irretrievably so, and then be extra cheerful when the dice roll your way. "Lousy minor setbacks" that could otherwise be absolutely and inappropriately infuriating become almost soothing reminders that Murphy's in His Heaven and all's right, or wrong in the right way, with the world.

Again, that sounded better on paper than in real life, in terms of not being upset. I don't really want to be all that dour all the time.

In early 2016, I stumbled on "Amor Fati" - still a concept that resonates for me, a call for the cultivation of love of one's fate, even the parts that are unpleasant, that you wouldn't have it any other way. As Nietzsche put it:

"My formula for greatness in a human being is amor fati: that one wants nothing to be different, not forward, not backward, not in all eternity. Not merely bear what is necessary, still less conceal it--all idealism is mendacity in the face of what is necessary--but love it."

I felt - still feel - that much of the problem is that our monkey brains are so good at daydreaming up these alternate realities that are just like this one, but better - this same roadway, this same car, not all these other cars - but those realities don't exist in our world, except for the power we give them to make us unhappy.

Later in the fall I also stumbled on the idea of using empathy to make situations more palatable. In its more extreme form, this is a kind of hippy-dippy "we are all one thing", but even without going to that extreme, if you see yourself on a common team of humanity, someone cutting you off might be a win you can share in. Of course, this doesn't apply to traffic jams so much, at least when everyone is equally stuck. (Remember- you're not 'in' a traffic jam, you 'are' the traffic jam)

But now I've found what seems the strongest counter-formula yet... the recognition of this weird animism humans tend to have, that we look for intent and purpose even in things that are just accidental and emergent. The first stage of this realization was that "it is absurd to take traffic personally". And yet I do. Later, in the movie "Mistress America" I found the even wider application: "The path isn't against you. It's just the path." I've been finding that a very useful mantra lately.

The other nice thing is that these various viewpoints are complementary; they don't really undercut each other that much. (I've been told that's characteristic of Eastern religions - in general they are less combative, and less defensive of their "unique path to truth", than many Western outlooks.)