This has nothing to do with my iPhone battery replacement, but here’s a miniature horse in the University Village Apple Store in Seattle back in 2009 — before it got too crowded for people, much less horses.

I got a new $29 battery for my iPhone yesterday, not just because I could (funny story about replacements) but because I had to. My iPhone had stopped holding a charge for longer than about six hours. I’m going to walk you through my replacement experience because you might find it informative. Also, I’m long overdue in writing an annual blog post, which means there will be a long <digression> for those of you who do not own an iPhone.

You have until the end of the year to take advantage of Apple's offer, but it might be a good idea to act well before that, as you will see. It's not quite as simple as showing up at the Genius Bar.

My iPhone 7 was purchased in November 2016, so I’d had it for 17 months — the warranty had expired. A couple of weeks ago, the battery started depleting rapidly and I was getting 20 percent warnings that I had never seen before.

So I went to the Apple website. Among the options for battery replacement, I chose to make a Genius Bar appointment. But when I did, it showed me a small number of available times a week hence, and none of them worked for me. So I called the general support number. Turns out you need to order the battery first, and they let you know when it arrives at the nearest store. Then you leave it with them for 90 minutes while they make the change. But the woman on the phone told me it could take two to five weeks for the battery to arrive at the store. OK, I figured, I guess I’ll limp along until it arrives and be thankful I didn’t wait until the end of the year when everyone else will be trying to get their $29 batteries.

As it happened, the Apple Store called the next day and said they had the battery and I could make an appointment to get it installed. Talk about under-promising and over-delivering. I picked Saturday at 3 p.m.

Election night for newspapers is crazy. It’s the news equivalent of the Super Bowl — multiplied by however many races need coverage. For the print edition, you write and edit a bunch of stories, compile lists of results, lay out a bunch of pages and do all this within the span of a couple of hours. The print edition must roll off the press during the night for delivery in the morning. For the web, people expect instantaneous news, and they expect that report to be updated frequently. The deadline for a news website is now and forever.

That expectation of immediacy is true for broadcasting, too, of course, but textual journalism must be both timely and timeless. I am not going to watch a recording of CNN’s coverage of the presidential race the next day to find out what happened. I am going to seek an account that is smartly organized and can be consumed more efficiently. That is almost certainly going to be a story, or a graphical representation of the outcome. Digital or inked textual journalism must be produced with urgency similar to broadcasting, but it must be more polished and thoughtful. The only way to do that well is to plan well. Portions of stories are written in advance. If it’s a really important race that will require nuanced analysis on deadline, two entire versions might be written before the results are known. Plug in the numbers and away you go.

Sometimes the most basic information is the hardest to prepare for publication. This year The Daily Herald of Everett, Washington, where I am the digital news editor, identified 63 noteworthy races or ballot measures. We needed to make a table of results for national, state and local races, and there would be multiple sources for those data — national media, the state of Washington and several counties.

Sure, The Associated Press would feed us those data in an efficient stream if we were willing to pay several thousand dollars more than our AP membership costs, but that expenditure is tough to justify at a small publication. We needed to be self-sufficient, using free sources of government results online and bringing them together in our own custom list.

I thought about taking the state’s RSS feed of results and parsing them with a text macro, but the structure of the XML made this too challenging — I didn’t have the coding skill and extra time to do this, or the budget to hire someone who did. Besides, our county’s results were disseminated in a different format — flat-file HTML. We needed to accommodate both sets of data on deadline and regurgitate them in separate formats for print and the web.

I came up with a simple solution. The downside was that it required human data entry, either by typing the numbers into a spreadsheet, or by cutting and pasting them from government-generated results found online (my preferred method). The upside was that we would have our own custom compilation of results which could be exported for presentation however we wished.

If you want to do this yourself, you need only a spreadsheet. We are a Google Docs shop, so we used Sheets, the cloud-based Google spreadsheet app. (Sheets is inferior to Excel, IMO, but it’s the tool available to everyone in the room, which is important.) Since Sheets resides in the cloud, multiple people can enter data into the same document simultaneously. The advantage of that when you’re on a tight deadline should be obvious.

Here’s how I did it, step by step.

We needed to output results in tabular form for import into InDesign for the printed page, and as HTML tables for a web page. That meant we needed the same data exported from Sheets in two distinct formats — tabbed plain text for print and carefully tagged HTML for the web. We also needed a clean interface within the spreadsheet so humans could accurately input the results.

First, I compartmentalized the results, bearing in mind how I wanted the list organized and how people in the newsroom would be finding and compiling the results. I created a separate sheet, or tab, for each silo — one each for president, Congress, statewide races and measures, local races, etc. For example, I created a tab within the sheet for congressional races and designed it to be human-friendly for data entry.

I made the cells for data entry green and locked all the others so that mortals couldn’t break anything. I also made the tabs for the data-entry sheets green to indicate it was OK to edit them.

Formulas in this sheet automatically calculated percentages from the vote totals. I did this for two reasons: It's one less number to enter (you don't have to enter both the vote count and the percentage), and it provides a factual check — you can compare the calculated percentage in the spreadsheet to the percentage in the government's published results to make sure you input the raw numbers correctly. This requires, however, that you also enter the total number of votes cast in a given race. But that's still fewer numbers to enter — in a two-way race, three numbers instead of two numbers and two percentages — and you get the added bonus of quality control during data entry.
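The arithmetic behind that cross-check is simple. Here is a minimal sketch in Python (hypothetical candidate names and vote counts, not our actual spreadsheet formulas) of calculating each candidate's share from the raw counts and flagging any entry that disagrees with the published percentage:

```python
# Hypothetical sketch of the spreadsheet's quality-control check:
# compute each candidate's percentage from the raw vote counts and
# compare it to the percentage published by the elections office.

def check_race(votes, total_ballots, published_pcts, tolerance=0.1):
    """Return (candidate, calculated, published) tuples that disagree."""
    mismatches = []
    for candidate, count in votes.items():
        calculated = round(100.0 * count / total_ballots, 2)
        published = published_pcts[candidate]
        if abs(calculated - published) > tolerance:
            mismatches.append((candidate, calculated, published))
    return mismatches

votes = {"Smith": 5321, "Jones": 4679}

# Entered numbers agree with the published percentages: no mismatches.
print(check_race(votes, 10000, {"Smith": 53.21, "Jones": 46.79}))  # []

# A typo in the raw count would surface here as a mismatch.
print(check_race(votes, 10000, {"Smith": 55.21, "Jones": 46.79}))
```

A mismatch usually means a digit was mistyped during data entry, which is exactly the error this design was meant to catch.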

Then I created two more sheet-tabs for the same data. These would be used for exporting the data that was entered into the first version of the sheet. I used formulas to grab the data from the first sheet and paste it into the others. But I designed the other sheets differently, bearing in mind how I wanted the data to be exported for print and the web. I formatted the text in the cells of these duplicate sheets differently to make it easy to troubleshoot any problems.

The sheet for export to print was simple, because an exported tab-separated file (.tsv) was all that was needed. InDesign knows what a tab is, and that file could be imported into an InDesign page pretty much as-is. I made the tabs for print-export sheets blue.

The sheet for export to the web was more complicated, because there would need to be HTML code co-mingled with constantly changing data. (The HTML tags in the cells beside the data would not change.) I could still export this as a .tsv file, because web browsers know to ignore tabs and just read the HTML tags. But the spreadsheet needed to be organized carefully so all the HTML appeared in the right order in relation to the data. I made HTML-export tabs yellow.
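To make that concrete, here is a sketch (hypothetical candidate and numbers, and an assumed cell layout rather than our exact one) of a single web-export row: the static HTML tags occupy their own cells, the changing vote data sits in between, and joining the cells with tabs yields a line a browser renders correctly, because whitespace between table tags is ignored:

```python
# Assumed layout of one row in the yellow (web-export) sheet:
# static HTML tag fragments in their own cells, vote data between them.
row_cells = ["<tr><td>", "Smith", "</td><td>", "5,321", "</td><td>", "53.21%", "</td></tr>"]

# Exporting as .tsv joins the cells with tab characters. The tabs are
# harmless inter-tag whitespace as far as a browser is concerned.
tsv_line = "\t".join(row_cells)
print(tsv_line)
```

Only the data cells ever change between updates; the tag cells are typed once and reused all night.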

So here’s how the whole process worked:

1. Election returns were entered into the green cells in the green sheets. I prefer to cut and paste the data from the government website, one number at a time, using the data-only paste option in Google Sheets to prevent foreign HTML or formatting from contaminating the results.

2. The print (blue) version of that tab was exported as a .tsv file.

3. The web (yellow) version of that tab was exported as a .tsv file.

4. The print .tsv files were converted to .txt, then dragged into a shared folder that is visible to InDesign. A colleague, news-desk chief Mark Carlson, figured this part out. He created a macro in InDesign that imported the tabular data from those files into a pre-formatted table on the InDesign page. Before import into InDesign, the file contents were plain tab-separated text.

5. I opened the web .tsv files and cut and pasted their contents into a page on our website. (We happen to use WordPress for hosting. We bypassed our content-management system, NewsEngin GPS, with this process.)
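The print side of step 4 relies on a simple property of the exported file, sketched below with hypothetical numbers (the actual import macro was Mark Carlson's InDesign work): a .tsv is just rows of tab-separated fields, and a table import treats each tab as a column break and each newline as a row break, so the file maps directly onto a pre-formatted table.

```python
# Hypothetical contents of one exported print .tsv file.
tsv_contents = "Smith\t5,321\t53.21%\nJones\t4,679\t46.79%\n"

# The parsing a table import effectively performs:
# newlines delimit rows, tabs delimit columns.
rows = [line.split("\t") for line in tsv_contents.strip().split("\n")]
for row in rows:
    print(row)
```

Because the mapping is this mechanical, the only thing that has to be right on deadline is the column order in the blue sheets.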

Once the data were entered into the green sheets, this whole process took just minutes. Of course, the more tabs you have in your spreadsheet, the more files there are to export. The end result was a clean, consistent table of returns in print and on the web.

This setup is ready for next year, obviously. Because odd-numbered years have more races (lots and lots of city councils), I’m glad I tested this method in 2016. But I have a feeling I will come up with something even smarter and more awesome next year. Stay tuned.

Richard Perez-Pena, a journeyman reporter for The New York Times national desk, has written an interesting explanation of the revival of the role of rewrite editors in news. Things have kind of come full circle. The rewrite method of preparing news for publication evolved to overcome a lack of tools. Today it is in use again to manage an overabundance of tools and sources of information, and to cope with the fact that things keep changing and the digital platform needs to be constantly fed.

As recently as the early 1990s, reporters had a notepad and a pocketful of coins, and that was about it in terms of tools in the field. If there was urgency, you found a landline telephone — usually a pay phone, but sometimes you knocked on a stranger's door to ask if you could use theirs — to call in your story or dictate your notes. Sometimes you dictated the first draft of a story to whoever was on the other end of the line, but usually what you dictated was a combination of usable newswriting and notes, because you were flipping through the scribbled-on pages of the notepad. ("Oh, here's a quote for you to insert higher up in the story …") The person in the newsroom then rewrote this stuff into a polished narrative.

When I had to dictate like this, I used to close my eyes and pretend I was in front of a typewriter. Writing that was generated with typewriters was not so easily polished on the fly; you let your consciousness stream onto the page and refined it later. Writing was a more linear process than it is today. So dictating over the phone was simply a matter of switching from the keyboard to your voice. The result was going to be kind of rough either way. In the field, sometimes you took a few minutes to hand-write a story and then read it over the phone. But more often it was a mix of publishable paragraphs and raw facts or quotations, recited into the receiver. In any event, it was quick and dirty.

For those of you not in the business, Perez-Pena’s description of what happens at The New York Times is fairly typical of how newspapers handle breaking news today. Few of us have a dedicated rewrite desk waiting for stuff to happen, but when big news breaks and you have multiple reporters involved or developments in multiple places, one person pulls it together in the newsroom and posts iterations to the Web as events unfold. What Mr. Perez-Pena did not get into is that everything else gets refined, too, as the story develops. We add images, improve the headlines, add links, change the home page layout. At some point late in the day, you settle on a version of a story that will go to print. And then you might update the online version again that night or in the morning.

Throughout this process, we practice a discipline that distinguishes professional journalism from all the crap online: verification. When we want to include something in a story, the operative question is always, “How do we know that?” Good news reporting is very labor intensive, even in modest situations, and high technology has made that no less the case than in the days of pay phones. That’s why it’s so important for you to be willing to pay for news — even a news product you know is occasionally flawed. I know everyone complains about their local paper, but try to imagine it not there.

Am I an optimist or a pessimist? It depends. The optimist in me thinks I have a lot to say. The pessimist counters that no one wants to hear so much negativity. Hence, one blog post per year.

The optimist in me thinks the Internet has been an enormously good thing for human knowledge and expression. The pessimist notes that uploading humanity’s collective wisdom to the cloud has revealed what foggy, uncritical thinkers we are. For example, Quora.

The fact Quora exists makes my heart soar. How cool is it that anyone anywhere can throw a vexing question into the wind and have experts on that very topic answer it?

On the other hand, who thought it was a good idea to let the stupid and the lazy reveal how ignorant they are by enabling the dissemination of unanswerable questions?

Whenever I read a story about temperature changes since the 19th century, I wonder about the source of those decades of data. Did the thermometers of 1900 have the fidelity of those of today? Were they calibrated the same? Because we’re talking about a 1-degree difference, sometimes less — and that’s somewhere in the neighborhood of 1 percent in the range of possible temperature readings. Add to that changing placement of instruments over the years and increasing ambient heat from built environments. In the end, can we trust the data?

I am not a climate-change skeptic, but I do wonder whether the data of the past are truly comparable to those of the present.

TV stations used to be all about visibility and ostentation — the tower with Christmas lights, the unique building, the talent, the helicopter, the branding, the logo-laden vehicles, the live shot, the reporter or anchor sent to a faraway disaster for no good reason but to preen. (Remember the KIRO news jet?)

As with newspapers, the first thing to go is the valuable real estate. In five or 10 years, we'll probably have one station doing news and perhaps sharing that. KING and KOMO already are sharing a helicopter. KOMO's former parent sold Fisher Plaza. KCPQ and KIRO will be next to sell their dirt in the former media gulch.

Remember the one challenge currently making everyone doubt that MH370 actually flew to Turkmenistan, Iran, China or Kyrgyzstan? It's the thought that MH370 couldn't have made it through several key airspaces, such as India's or Afghanistan's, without being detected by the military.

It is my belief that MH370 likely flew in the shadow of SIA68 through Indian and Afghan airspace. Because MH370 was flying "dark," with no transponder or ADS-B output, SIA68 would have had no knowledge that MH370 was anywhere around, and as the pair entered Indian airspace, they would have shown up as a single blip on the radar, with only SIA68's transponder information lighting up ATC and military radar screens.