Archive for June 2015

It’s hard to get insight into why Apple is behaving this way. They never send anyone to web conferences, their Surfin’ Safari blog is a shadow of its former self, and nobody knows what the next version of Safari will contain until that year’s WWDC. In a sense, Apple is like Santa Claus, descending yearly to give us some much-anticipated presents, with no forewarning about which of our wishes he’ll grant this year. And frankly, the presents have been getting smaller and smaller lately.

In recent years, Apple’s strategy towards the web can most charitably be described as “benevolent neglect.” Although performance has been improving significantly with JSCore and the new WKWebView, the emerging features of the web platform – offline storage, push notifications, and “installable” webapps – have been notably absent on Safari. It’s tempting to interpret this as a deliberate effort by Apple to sabotage any threats to their App Store business model, but a conspiracy seems unlikely, since that part of the business mostly breaks even. Another possibility is that they’re just responding to the demands of iOS developers, which largely amount to 1) more native APIs and 2) Swift, Swift, Swift. But since Apple is pretty good at keeping a lid on their internal process, it’s anyone’s guess.

The tragedy here is that Apple hasn’t always been a web skeptic. As recently as 2010, back when Steve Jobs famously skewered Flash while declaring that HTML5 was the future, Apple was a fierce web partisan. Many of the early features that helped webapps catch up to native apps – ApplicationCache, WebSQL, touch events, touch icons – were enthusiastically implemented by WebKit developers, and many even originated at Apple.

Don’t pit responsive and adaptive solutions against each other, as if the need for adaptive solutions in some (limited) situations means the failure of responsive web design. They can and should work together, as Livia Labate covered in Responsive versus Adaptive is Not A Thing.

“We were finding Android in general to be a slower platform to move on. There’s more time spent dealing with fragmentation bugs. There’s more time spent dealing with testing and debugging, and we would rather spend that time building new functionality.”

On one hand, of course, none of this matters a damn. A good idea transcends the basic logistical technology that originally delivered it, which is why we’re still reading newspapers even though we don’t use hot metal type, and why we still listen to “radio” stations even though they’re delivered over the internet rather than using radio waves. (Pleasing aside: Of course, if you listen to an internet radio station wirelessly over Wi-Fi, you are using radio waves. And a second: My parents’ generation would have called a radio set “a wireless.”)

But on the other hand, man, Apple has to be careful here. Technology companies are killed not because they make bad products but because they fail to recognize and embrace wider social changes sweeping everything along. Famously, Bill Gates acknowledged that Microsoft never saw the internet coming, and you could argue the company never quite recovered from that misstep.

Don’t hire a junior developer if you don’t plan on investing in their future. What happened to the idea of apprenticeship, of training an inexperienced worker and helping them build their career? When you hire a developer, imagine yourself working with them for the long haul and helping to shape their future at the company.

Don’t hire a junior developer if you don’t have the resources to mentor them. I cannot emphasize this enough: do NOT hire a junior developer if you do not have the resources to do so. If members of your team cannot take 5–10 minutes a few times a day to sit down and explain a problem or concept, you do not have the resources.

Don’t hire a junior developer if you can’t afford the time expense. People still in the process of learning new skills work slowly. If you’re in a time-critical situation, make sure to account for the extra time it might take for a new developer to understand new concepts.

I feel your pain every time I encounter a shop or company that brags about its full-stack expertise but writes markup like it’s 1999. Or, just as bad, like it’s 2003: all meaningless spans and divs, half of them there purely as visual hooks, and the other half there because the developers didn’t know what to cut out of the framework they used (and didn’t consider that a problem, figuring that gzipping everything would take care of the performance aspects).
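As a hypothetical sketch of the contrast being described – class names and structure invented here purely for illustration – the same page region written as presentational div soup versus semantic markup:

```html
<!-- Div soup: every element is a visual hook with no meaning -->
<div class="hdr">
  <div class="big-text">June Roundup</div>
  <div class="links"><span class="lnk">Archive</span></div>
</div>
<div class="box"><div class="inner">Article text…</div></div>

<!-- Semantic markup: structure describes what the content is -->
<header>
  <h1>June Roundup</h1>
  <nav><a href="/archive">Archive</a></nav>
</header>
<main>
  <article><p>Article text…</p></article>
</main>
```

The second version carries the same appearance hooks (every element is still styleable), but screen readers, search engines, and future maintainers can tell the header, navigation, and article apart without reverse-engineering class names.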

It’s particularly troubling when the code schools turning out tomorrow’s coders by the tens of thousands neglect to teach their students the vital importance of separating structure from appearance and behavior.

I’ve seen agencies where an HR person who doesn’t understand the web hires designers and developers based on the candidates’ meeting a checklist of skill areas. The more tools and software the candidate knows, the likelier they are to get hired. Understanding of accessible, standards-based design doesn’t even enter the picture.

Above all, the kind of “pretty design but bad code” you’re stuck with… comes from stone age companies that heavily silo their employees. In such places, the developer’s job is to comply with a list of specifications as quickly and cheaply as possible. The designer’s job is to make it pretty. If there is a user experience person, her job is to create wireframes in isolation, and slip them under the designer’s door.

Remember: the future will come whether you design for it or not. If your company charges $300,000 for a website that won’t work on next week’s most popular device, your company won’t be able to stay competitive in this business. It might not even be able to stay in the business, period.