It's one thing to sign me up for spam. It's another for a company to do it and claim it's some kind of benefit.

Someone (who shall not be named) sent me a link to a property that was on Zillow.com yesterday. This was fine, as I've been thinking about such things lately, but what happened next was surprising. I got an email from Zillow:

Since a home was recently shared with you, we created a free Zillow account for you to browse millions of homes, save and share your favorites, connect with professionals and shop mortgages.

Seriously?! Not only did this signal that the spam was about to start, but it also created an account on a site I'd never used nor wanted. I had to go delete it and unsubscribe. Now yes, it's one thing for someone to sign you up for spam. It happens. But that's typically someone pretending to be you, and one would hope that in 2017, a double opt-in confirmation would be required. In this case, however, Zillow knew that the account they were creating was for someone who hadn't visited their site, and, as you can see from the message I got, I wasn't asked if I wanted the account. They just created it as if they were doing me some kind of favor.

I'll come right out and say it - password security questions are not just insecure, they're a blatant security hole. They're worse than having no questions at all, for a number of reasons.

First, they're all the same. How many times have you been asked your mother's maiden name, the make or model of your first car, what city you were born in, or the name of your first pet? These answers, if given truthfully, are easy to find out. You've likely blogged the answer at some time in the past.

If I know your uncle's last name, odds are I also know your mother's maiden name (a 50/50 shot there, and if I know he's your maternal uncle, I've got it for certain).

At this point, these security questions are no better than a second, easy-to-guess password. And in cases where they're used to recover a password, they become more of a risk than anything else.

If these questions are mandated, the only thing to do is make up a unique, incorrect answer - yet another password. Yet another password to remember, too, because many password managers don't recognize these question fields as password fields to store and protect.
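One way to make up those unique, incorrect answers is to generate a random passphrase per question, the same way you'd generate a password. A minimal sketch (the tiny word list here is purely illustrative; a real tool would use a large wordlist such as a diceware list):

```python
import secrets

# Illustrative sample only - a real implementation would load a large wordlist.
WORDS = ["copper", "lantern", "orbit", "maple", "quartz",
         "harbor", "violet", "summit", "ember", "falcon"]

def random_answer(length: int = 4) -> str:
    """Generate a random passphrase-style 'answer' for a security question.

    The result has no connection to your real history, so it can't be
    looked up or guessed - you just store it alongside your password.
    """
    return "-".join(secrets.choice(WORDS) for _ in range(length))
```

The point is that "What city were you born in?" gets an answer like `quartz-falcon-maple-ember`, which you save in your password manager as if it were a second password.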

The immediate solution is two-factor authentication. When you log in to a site, the site sends a one-time code to your phone and you must enter that code. The password is simply there to keep people from causing the code to be spammed to your phone and interrupting you while you're in the bathroom. Since everyone has a smartphone these days (a generalization I'm prepared to make), this requires anyone who wishes to hack you to have access to your phone. Sure, if they get your phone they get everything, but they still need to know your password to cause the two-factor to fire. It's not perfect, but it's close.
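The one-time codes behind most two-factor setups are time-based: the server and your phone share a secret, and both derive a short-lived code from it and the current time. A sketch of the standard TOTP construction (RFC 6238 style, SHA-1, 30-second steps - parameters here are the common defaults, not anything Zillow-specific):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, at=None, step: int = 30, digits: int = 6) -> str:
    """Compute a time-based one-time password (RFC 6238 sketch).

    secret_b32: base32-encoded shared secret (as in authenticator apps)
    at:         Unix timestamp to compute the code for (default: now)
    """
    key = base64.b32decode(secret_b32)
    counter = int((at if at is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)
```

Because the counter changes every 30 seconds, a code sniffed in transit is useless almost immediately - which is exactly why this beats a static "security answer."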

The real solution is an un-replayable biometric solution. A fingerprint reader on every keyboard, implemented in such a way as to make storing and replaying of biometric data impossible. That's a tough nut, and it might also have to include physical two-party verification, but I suspect it would work.

If you want into a site, you don't need to give it a name or password. You simply place your finger on the scanner and then wait for your phone to give you the access code which you then type in. The code expires the moment it's used (or in 60 seconds if it is unused). Thus, storing the biometric data isn't really all that useful. And if the biometric data is somehow hashed with an expiring timestamp, storing it won't do much good after a few minutes anyway.
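The "hashed with an expiring timestamp" idea above can be sketched simply: bind the sample to the current time bucket, so a captured token stops verifying once the window passes. Everything here is an assumption for illustration - the `biometric_bytes` input stands in for whatever template a real scanner would produce:

```python
import hashlib
import hmac
import time

STEP = 60  # seconds a token stays valid, matching the 60-second expiry above

def expiring_token(biometric_bytes: bytes, at=None) -> str:
    """Hash a (hypothetical) biometric sample together with the current
    time bucket. Replaying the token in a later bucket fails verification."""
    bucket = int((at if at is not None else time.time()) // STEP)
    return hashlib.sha256(biometric_bytes + str(bucket).encode()).hexdigest()

def verify(token: str, biometric_bytes: bytes, at=None) -> bool:
    """Recompute the token for the current bucket and compare in constant time."""
    return hmac.compare_digest(token, expiring_token(biometric_bytes, at))
```

A real scheme would need more than this (fresh challenges from the server, clock-skew tolerance, a template that can't be stolen wholesale), but it shows why a stored token goes stale on its own.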

Either way, passwords are dead and password security questions are worse than dead.

I'm an Apple guy, and while I'm not religious about it, I like that all my devices work together now and are generally portable. The UX works for me and I like having a development platform that is, under the hood, UNIX-based.

That said, Cortana kicks Siri's ass. It's not even a fair fight. What Lisa's Windows phone can do, in terms of an intelligent AI assistant, is incredibly compelling.

Apple, your user experience is second to none. Now it's time to kick up the actual heft behind it. Microsoft is eating your lunch in this one, specific area. Step it up.

I spent last evening at an Irish pub (yes, I know, this blog entry can just stop here) watching the Seahawks game. Remember, though I live in Silicon Valley, I'm a Seattle transplant. Go Hawks.

As I and 50+ other fans were enjoying a convincing victory, a commercial came on. It was titled (and captioned) "The Call," and depicted a woman getting a phone call. She says hello, and her face drops as she listens, clearly shocked by what she's hearing. I, the viewer, know only her shock - there is no indication of what's actually said.

And then the commercial ends with the call to action to go to a URL to find out what happens next.

No. Just no. Clickbait online is one thing. Doing it in a broadcast television commercial? Sorry, that's a step further past a line that's already been crossed.

I encourage everyone to refuse to go to any URL presented in this manner. Please help send a message to advertisers that this simply won't work.