In the weeks since the latest deep round of Postmedia cuts, some journalists who still have jobs have written thoughtfully about the state of Canadian media. Amid it all, an interesting idea popped up. In the Globe and Mail, via Lawrence Martin, it went like this: “If traditional print journalism cannot be sustained, what fills the void? Is there a larger role for the public sector? In Nordic countries, subsidies extend not only to journalism but to print as well – and with apparently good results.”

In the Toronto Star: “The federal and subnational governments have a role to play in funding non-profit trusts… should any of the country’s 100-plus daily newspapers hit the wall.”

At iPolitics: “Ottawa could follow Europe’s lead and even-handedly subsidize newspapers. It’s a modern version of the long-standing federal policy of subsidizing of postal rates for Canadian magazines, intended to ensure Canadians have access to diverse media voices.”

Back in December, former Bachelorette star Kaitlyn Bristowe sat down on a stage in Vancouver and gave a short interview as part of a TEDx event. Host Riaz Meghji asked her, referring to her time on the ABC reality show: “True intention: did you do this to find love, or did you do this to build a brand?” Bristowe paused before she answered. “To build a brand,” she replied. “I’m going to be honest. I’m not going to sit here and lie.”

Nor should she have been expected to. Reality TV viewers, and Bachelor watchers in particular, have surely noticed their favourite characters quickly begin to hawk various wares upon exiting the mansion and setting up camp on Instagram. There, they leverage their (often brief) fame into a few easy bucks. Equally, we have long assumed these plans were forged before the contestants ever arrived on the show – it’s part of why the phrase “not here for the right reasons” has such staying power. Everyone knows only some of the people involved in these contests are really looking to get married. In fact, if that happens, it might be by accident, as it apparently was for Bristowe.

Amid the hubbub of this week’s mass layoffs throughout the Postmedia network, the company’s finances were noted mostly for revealing how much each of its executives earns every year. Less noted (though mentioned) was the part of its quarterly report, released a few days prior, that touched on digital ad revenue.

“Digital revenue was $30.2 million for the three months ended November 30, 2015, representing 12.0% of total revenue,” the report said. Compared with the same period a year earlier, digital revenues had increased $5.9 million, “as a result of the Sun Acquisition”. However, the report went on, if Postmedia’s acquisition of the Sun newspaper chain were excluded from the tally, digital revenues actually decreased by $1.4 million, or 5.7%, over those three months. The reason given for the decline is by now familiar to those who work in the media, or who simply pay attention to it: decreases in local digital advertising and digital classified revenue.
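For the curious, the report’s figures can be sanity-checked with some back-of-envelope arithmetic – a minimal sketch assuming only the numbers quoted above; the derived totals are approximations, not figures Postmedia published:

```python
# Back-of-envelope check of the quarterly figures quoted above.
digital_revenue = 30.2   # $M, three months ended November 30, 2015
digital_share = 0.12     # digital as a share of total revenue

# Implied total revenue for the quarter (approximate).
total_revenue = digital_revenue / digital_share
print(f"Implied total revenue: ~${total_revenue:.1f}M")  # ~ $251.7M

# Excluding the Sun Acquisition, digital fell $1.4M, or 5.7%,
# implying a comparable prior-year base of roughly:
prior_year_base = 1.4 / 0.057
print(f"Implied prior-year digital base: ~${prior_year_base:.1f}M")  # ~ $24.6M
```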

We could, at this juncture, have an equally familiar, somewhat historical, discussion about all that – about how Craigslist and its ilk cut the knees out from under newspapers just as the shift to free online platforms was beginning, undercutting a key revenue generator; or about how giving news away for free from the get-go was the media industry’s original digital sin, for which there will be no fiduciary forgiveness.

Rather than talk again about how we got here, it might be time to accept the current state of affairs and look ahead. We could ask: Where will we all work next? But that might not be the correct question in this case – or at least not the one we should ask immediately. Before we can figure out where we’ll work, we need to think about what sort of work we might do, and for whom, and which models might work best. So, maybe in this journalism future-casting, we might better ask: What’s happening to advertising?

Taken to their logical extreme, services like Uber or Lyft or Airbnb envision a particular kind of world – not inherently anti-social, but one in which sociability is based almost exclusively on a particular economic model. In other words, informal social interactions – the things that make a city, in a very simple sense, a city – will, under these models, increasingly become transactions. Which might make us wonder: what will this shift mean for us? What kinds of people will we become, and what, then, is a city?

Make what you will of President Obama, his record, and his accomplishments to this point. There are arguments to be made about each, of course, but let’s leave those for another time and talk now about the modern ear, listening for clues in a politician’s voice. Clues, that is, to what kind of reality they live in – is it ours, or one only loosely based on our surroundings?

In the United States currently, the basic details of these conflicting realities are roughly the same: jobs are scarce, wages are stagnant, and the external threats are more nihilistic and complicated than perhaps ever before. A subjective ear will probably hear what it wants in each version of reality. So how do we really tell them apart? Is there a point on which we can agree there is a dividing line – where the fork splits one definitively from the other?

It’s worth wondering what Twitter is planning with its pending shift to a 10,000-character limit – a decision CEO Jack Dorsey announced Tuesday was in the offing. Perhaps, as John Herrman explored at The Awl, it will mean that Twitter takes one step closer, as Facebook has done, to being not merely a delivery system for journalists to preview or link to stories – or even live-tweet them – but more of a full-on media delivery vehicle. “If readers never leave Twitter, what does a publication mean to them?” Herrman asks.

Maybe it’s because the weather is unseasonably warm in some parts of this country that we have the energy to spend on existential national queries. The leftover Christmas calories we might otherwise burn shovelling the walk, or keeping our balance on the skating-rink sidewalks that arrive overnight – those dangerous deliveries care of the winter winds from Colorado giving a final gasp before heading off to die somewhere to the north – are begging to be used. Maybe it’s because the snowdrifts have been smaller and the sidewalks less icy in the nation’s capital that we have time for the bigger questions. Questions like: what is democracy, anyway?

It was just over a year ago that Taylor Swift dropped what is likely to be the biggest pop album of a generation. 1989 is still in the top 10 on the charts – one of only five albums ever to achieve that level of sustained success (the others: Born in the U.S.A., Falling Into You, Rumours, and 21). What are we to make of this chart dominance? If it is a high-water mark for modern pop music, what might come next? The answer might be more of the same.

As if to underscore the whole point of the affair, sometime Wednesday, a guy showed up in the smallest theatre at the Angelika Film Center dressed as Kurt Cobain did when Jesse Frohman captured him for Rolling Stone in November 1993. The Kurt wannabe sat one row back and to the right of Shia LaBeouf, the man everyone was there to see, and yet not actually see much of at all.

Last week, news out of Seattle: someone uncovered pictures of the very first Nirvana show – if one can really call it that. The performance happened in 1987, in the basement of a house in Raymond, Washington. It was spring. On lead vocals that night was Kurt Cobain. On bass, Krist Novoselic. On drums, Aaron Burckhard. And joining on guitar was Tony Poukkula, Cobain’s friend since childhood. The group played two Led Zeppelin songs. Someone took pictures.

As anyone might have guessed from the never-ending promotional stunts and advertising in the weeks prior to its release on July 10, Minions, the prequel to 2010’s Despicable Me, has been very popular. It was the first film this summer to knock Jurassic World from the top spot, pulling in $115.2 million in its opening weekend – the second-highest-grossing opening weekend ever for an animated feature.

If Minions continues in the same way for a few more weekends, it will soon be in very familiar company. Nine of the top 20 highest-grossing G- and PG-rated movies of all time are animated, including six of the top 10. (Pixar’s latest, Inside Out, is, as of writing, 21st on the list.) And, apart from Beauty and the Beast (1991), they’ve all been released since the turn of the century. A full 66 of the top 100 highest-grossing G- and PG-rated movies share that distinction, and of those, 48 are animated.

Combine that with a seemingly endless parade of movies based on comic books and young adult novels, and one might argue that we’re experiencing a golden age of children’s filmmaking.

What makes a good greatest-hits album? Is it good when each song is a known, quantifiable hit? When it gives you a fresh perspective on a band or group? When it sparks nostalgia? Maybe it’s all of the above. Conversely, can a greatest-hits compilation ever be good? Or is it too devoid of artistic composition, too obviously a ploy to keep milking profits? Each of these questions is like a zen koan – worth pondering, but perhaps with no ultimate answer.

Can we at least come up with a list of the best of the best? Yes. That we can do.

“No one’s impressed by a dinosaur anymore,” says Claire Dearing (Bryce Dallas Howard), the operations manager of the Jurassic World theme park, somewhere within the first hour of Jurassic World, the movie. She’s explaining why the park decided to breed a new hybrid dinosaur, Indominus Rex: the only way to increase profits and garner the necessary corporate sponsorship is to make something bigger, badder, and louder.

Get it?

Obviously, the line was not meant to escape unnoticed. Jurassic World is happy to remind you, a few times over, that you’re watching a bloated Hollywood remake designed to wow your eyeballs and empty your wallet. You’re just like the faceless, nameless rubes in the movie who spill their savings in an endless search for the next level of wonder, and who gawk and cheer at the spectacle. Some of them get eaten alive.

It was an odd thing to see the promotional material some months ago for the final half-season of Mad Men, nostalgic as it was. The promo was a nod to the famous, oft-quoted speech Don gave years ago (both for us and for him) to sell the Kodak executives his idea for the slide carousel. AMC even called the season trailer “Nostalgia”. The network doubled down on the technique prior to the final episode, cooking up a montage of clips set to the tune of Paul Anka’s “Times of Your Life”. Perhaps it’s no surprise. The Kodak guys bought that carousel pitch, after all.

And yet, there was a time when this show was about the future. Back in Season 2, Don had one of his moments of clarity and burst from his office to tell his team to ditch all the work they’d done on an American Airlines proposal. “American Airlines is not about the past any more than America is,” he declared. “Ask not about Cuba, ask not about the bomb. We’re going to the moon. Throw everything out.” When Paul Kinsey was brave enough to ask, “Everything?” Don replied, “There is no such thing as American history, only a frontier.”

Is the TV singing competition dead? American Idol’s demise earlier this week seems to offer a kind of finality. While “not the end of television as we know it,” John Doyle wrote at the Globe and Mail, it’s “a big step toward that.” Colleague Sonya Bell seemed to agree: Idol’s cancellation “signaled the end of an era.” Why? Because the Internet, probably. As Sonya rightly points out: “Talented teens don’t need American Idol anymore. Justin Bieber was discovered on YouTube before he was old enough to audition.” Others since, like Shawn Mendes, have ridden 6-second Vine clips to stardom.

This is all surely correct. Television has changed a lot since Idol premiered in 2002, and the Internet star machine is bigger than most people realize, with YouTube the driving force in that regard (though Vine has worked its way into the celebrity-creating field lately). Last year, the New York Times published a revealing chart showing that as American Idol went on and its audience skewed older (from a median age of around 30 in 2002 to around 50 in 2013), its ratings generally declined. Its peak came in its fifth season, when total viewership hovered around 15 million and the median viewer age was around 35. It so happens that American Idol Season 5 aired in early 2006, not long after YouTube arrived on the scene. Michael McDonald-esque Taylor Hicks won that season – somehow.

And yet, if this is the rule – that we’ve all moved on from these musical talent shows – there must be an exception. It’s perhaps not a terribly compelling one, but we might find it in the program whose success is charted just to the right of American Idol’s in the New York Times: NBC’s The Voice.

Some weeks ago, while at the pub, a friend told me about his lunchtime habits at a job he used to have. He worked in downtown Toronto, not far from a movie theatre. When he took his lunch break, he’d go to a movie. Or, rather, he’d go to half of a movie, as that was all the time he was afforded. Later in the week, he’d schedule his lunch hour so that he could return to the theatre and see the rest of the film. He paid full price both times. It got expensive. This, he said, is how he got the idea for theatres showing half-movies. Patrons would buy a full-priced ticket, but it would be valid for two halves of the same film – each to be watched whenever suitable.

Of course, the point wasn’t just to create a new business model for movie theatres (although that might happen, too). Nor was it simply to give people something different to do at lunch. It was to make the lunch hour feel longer – to extend time, or at least to alter the perception of it.

If Jon Ronson could have chosen a better time to release his new book, I’m not sure when that might have been. The idea behind So You’ve Been Publicly Shamed has never been more relevant: we’ve entered a period online increasingly dominated by mobs of users shouting down – shaming – those with whom they disagree, or who have committed some perceived error. Case du jour: Trevor Noah, the new host of The Daily Show.

It turns out Noah has made some very bad jokes on Twitter in the past, including some about Jewish people and women. The jokes weren’t just stupid; they weren’t remotely funny, which, some have pointed out, might be the most pressing worry for fans of a comedy program. But the comedy worries were quickly overshadowed by how offended everyone was once the tweets were uncovered. For his comedic misfires, Noah was taken to the proverbial Internet stocks and pelted with fruit. It remains unclear when the shaming will end, if ever.

On Tuesday, only a day before Ashley Judd said she would start pressing charges against abusive Twitter users, James Poulos at the Daily Beast reminded us all of something called Block Bot. The application is designed to “automatically block the people added to its lists… discreetly blocking them on your Twitter account.” That is, once installed, you won’t hear from anyone who’s been nominated to one of the Block Bot’s lists.

There are three of those. The first, Level 1, is the “‘worst of the worst’ (as determined by the blockers), plus impersonators, stalkers and spammers.” Level 2 is all of those folks, plus additional people who are bad, but perhaps not the worst of the worst. And Level 3 includes both the first two lists, and adds “those who can be tedious and obnoxious.” (By the time you’ve applied Level 3 blocking, one might wonder whether there remains much point in being on Twitter at all.)
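If it helps to picture how those tiers nest, here is a minimal sketch of the cumulative structure described above – the handles and helper function are invented for illustration, not Block Bot’s actual code:

```python
# A sketch of Block Bot's three cumulative lists, as described above.
# All names here are hypothetical; only the nesting mirrors the description.
LEVEL_1 = {"worst_of_the_worst", "impersonator", "stalker", "spammer"}
LEVEL_2 = LEVEL_1 | {"bad_but_not_the_worst"}
LEVEL_3 = LEVEL_2 | {"tedious_and_obnoxious"}

def visible_timeline(tweets, blocked):
    """Return only the tweets whose authors are not on the chosen list."""
    return [t for t in tweets if t["author"] not in blocked]

timeline = [
    {"author": "friendly_user", "text": "Nice conversation!"},
    {"author": "spammer", "text": "Buy followers now!!!"},
    {"author": "tedious_and_obnoxious", "text": "Well, actually..."},
]

# At Level 1 the merely tedious still get through; at Level 3 they don't.
print(len(visible_timeline(timeline, LEVEL_1)))  # 2
print(len(visible_timeline(timeline, LEVEL_3)))  # 1
```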

As Poulos contends, the fact that something like this exists highlights the fallacy at the root of many online social networks: rather than bringing us closer, things like Twitter – sometimes purely by their instantaneousness – have a tendency to push us farther apart. Wouldn’t it be nice to get past all the trolls and have a nice conversation (again)?

In the case of “Blurred Lines”, the price of copying is almost $7.4 million. That is the sum in damages awarded this week to the estate of Marvin Gaye, the result of a lawsuit the family filed against Pharrell Williams and Robin Thicke charging that their 2013 hit “Blurred Lines” was more than merely a song of questionable quality – it was lifted from Gaye’s 1977 hit “Got to Give It Up”.

The verdict has been widely panned as rubbish (though it has its supporters). Many argue the songs are not alike enough – or at least not alike in the ways that actually matter – for the decision to hold water, and certainly not similar enough to set such a troubling precedent. That precedent being, perhaps, that no more copying is allowed. (On Friday, word came that Gaye’s family also considers Pharrell’s “Happy” a copy of Gaye’s “Ain’t That Peculiar,” but isn’t considering legal action at this time.)

One wonders how different this week – full as it was of reproductions and copies – might have looked had that decision come some time ago. Granted, some were entirely authorized, or just creators recreating their creations.