from the clunky,-inadequate-lie dept

Anyone who's used the US federal court system's PACER service has complained about it. Some of those complaints have formed the basis of lawsuits. The multitude of complaints has moved legislators to make periodic runs at eliminating PACER's paywall. So far, PACER -- which looks and feels like it's still 2001 -- has managed to outlast these efforts. The only change over the last nineteen years has been an increase in access fees.

Many have complained, but few have complained as eloquently as Seamus Hughes, the deputy director of George Washington University's Program on Extremism. His op-ed for Politico is definitely worth reading. It highlights everything wrong with the PACER system, including its amazing profitability.

The U.S. federal court system rakes in about $145 million annually to grant access to records that, by all rights, belong to the public. For such an exorbitant price—it can cost hundreds of dollars a year to keep up with an ongoing criminal case—you might think the courts would at least make it easy to access basic documents. But you’d be wrong. The millions of dollars the courts have reaped in user fees have produced a website unworthy of the least talented of Silicon Valley garage programmers; 18 years since its online birth, PACER remains a byzantine and antiquated online repository of legal information.

This money is supposed to be used to improve PACER and fund other US Courts' efforts. A visit to PACER makes it clear none of that money is being routed towards making PACER less awful. At least one federal court has ruled the way the US Courts spend this money is illegal. But that determination hasn't stopped the court system from collecting fees and spending them on things like new TV screens in courthouses.

While it is a definite improvement over traveling to the pertinent court and using a kiosk to access electronic documents -- the way it was done from 1988 to 2001 -- the entire system is user-unfriendly and stupidly expensive. The $0.10/page fee applies not only to documents, but to search results. Given the lack of standardization of case titles, searches are both expensive and frustrating. The fees apply whether or not the search turns up anything users are actually looking for.

And the per-page fees for PDFs are simply ridiculous. These pages aren't being run off a Xerox by a court clerk. They're being served up from an infinite supply of 1s and 0s. Given their digital state, how does it possibly make sense to charge more for longer documents? The answer doesn't matter, because PACER is the only game in town.
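To make the pricing model concrete, here's a back-of-the-envelope sketch of how those per-page fees compound for anyone following a single case. Every page count below is invented for illustration; the only figure taken from the article is the $0.10/page fee itself.

```python
# Back-of-the-envelope sketch of how PACER's $0.10/page fee adds up for
# anyone following a single case. All page counts below are invented for
# illustration; the only figure taken from the article is the fee itself.

FEE_PER_PAGE = 0.10

def pacer_cost(search_result_pages, docket_pages, document_pages):
    # Every page is billed the same, whether it's a search result,
    # a docket sheet, or an actual filing.
    total_pages = search_result_pages + docket_pages + document_pages
    return total_pages * FEE_PER_PAGE

# A hypothetical month spent tracking one active criminal case:
# 40 pages of search results (many of them misses), 20 docket pages,
# and 600 pages of motions, orders, and exhibits.
monthly = pacer_cost(40, 20, 600)
print(f"${monthly:.2f}/month, or ${monthly * 12:.2f} over a year")
```

Multiply that across a handful of cases, or a journalist's entire beat, and the "hundreds of dollars a year" figure from the Politico piece stops looking like hyperbole.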

Adding to the problem is the court system's housecleaning efforts and the way it reacts to publication of public documents. In 2014, multiple appeals courts deleted "old" cases from their databases, memory-holing thousands of documents and decisions -- some only a couple of years old at that point. But what's more worrying is the way courts have responded to journalists publishing documents.

In January, I found a search warrant related to a wide-ranging investigation into public corruption in the Los Angeles City Council. When I made my discovery public, the Central District of California essentially locked down all search warrants filed on PACER. Most, if not all, search warrants recently filed in the district are no longer accessible online.

Presumably the DOJ and other law enforcement agencies did some loud complaining about the public court system's publicly-accessible documents ending up in the hands of the public, resulting in a presumption of secrecy when it comes to affidavits and warrants.

More ends up hidden from public view -- not due to malice, but due to bureaucratic indifference. Hughes points out he has come across several terrorism prosecutions by the DOJ that have never been publicly announced by the Justice Department. The average member of the American public is not going to spend hundreds of dollars a month trying to track down documents from terrorism cases, so it's up to the DOJ to provide timely notice of its anti-terrorism efforts. The DOJ is failing to do so, and we only know this because dedicated private parties are willing to subject themselves to PACER's UI and inadequate search system to publicize the stuff the government can't be bothered to announce.

This all adds up to the worst system money can buy. It's broken. It's a joke. And it's the only access we the people have to documents the government has declared we have a First Amendment right to access. It's ugly, it's counterintuitive, and it somehow manages to personify the begrudging spirit of the most jaded bureaucrat, despite it being entirely composed of barely-functional code.

from the no-kidding dept

With rates of copyright infringement fluctuating year by year and country by country, the debate goes on as to how best to keep those rates trending downward. One side of this argument urges a never-ending ratcheting up of enforcement efforts, with penalties and repercussions for infringement becoming more and more severe. The other side suggests that when content is made available in a way that is both convenient and reasonably priced, piracy rates will drop. A decent number of studies show the latter is the actual answer, including a study done last summer, which showed innovative business models fare far better than enforcement efforts.

Yet it seems it's going to take a compounding series of these studies to get the point across, so it's worth highlighting yet another one, this time out of New Zealand, which concludes that piracy rates are a function of pricing and ease of access to content.

According to a new study commissioned by New Zealand telecoms group Vocus Group NZ and conducted in December 2018, this enhanced availability is having a positive effect.

“Legitimate streaming content providers are achieving what was impossible for Hollywood to get right: they are stamping out piracy by making available the shows people want to enjoy at reasonable cost and with maximum convenience,” Vocus announced this morning.

The company believes that “piracy is dying a natural death” as more locals choose to access content legitimately, via legal services that are both accessible and easier to use than pirate options.

“In short, the reason people are moving away from piracy is that it’s simply more hassle than it’s worth,” says Taryn Hamilton, Consumer General Manager at Vocus Group. “The research confirms something many internet pundits have long instinctively believed to be true: piracy isn’t driven by law-breakers, it’s driven by people who can’t easily or affordably get the content they want.”

We internet pundits have also speculated in past discussions that piracy rates probably have some sort of natural floor to them. In other words, rates aren't going to be 0%, and it would be unreasonable either to expect them to be or to attempt to conjure such fantasy rates into existence through legislative efforts. Instead, content providers need to figure out the sweet spot in pricing and ease of access that reaches or approaches that natural floor. Once they have done so, the job is complete. And, rather than having to worry about which enforcement effort to attempt next, content makers can spend their time instead both creating more content and counting all of their money.

And, as Vocus points out, this is already beginning to occur organically.

“The big findings are that whilst about half of people have pirated some content in their lives, the vast majority no longer do so because of the amount of paid streaming sites that they have access to,” Hamilton added in a video interview with NZHerald.

“Generally the survey has said that the vast minority of people are undertaking piracy – it’s just too hard. People prefer to pay for good quality, cheap, legal content, so we think that’s the best way forward,” Hamilton said.

That convenience is the "RtB" portion of the CwF+RtB equation. Convenience is worth paying for, as demonstrated by the thousands of people who are demonized as just wanting something for free, but who nevertheless subscribe to all kinds of content services and otherwise buy all kinds of content. It's a contradiction worth noticing, assuming that creators want payment above control.

Meanwhile, Hollywood's New Zealand representatives instead want to pretend that none of this data even exists.

In January 2018, the Motion Picture Distributors’ Association, which represents the major Hollywood studios in New Zealand, said that “nothing” can be done to tackle piracy in the country other than site-blocking. Vocus, however, is opposed to this type of action.

That's the kind of lazy attitude only government lobbying could allow. In the real world, there is a great deal that Hollywood could do to tackle piracy, if only they were willing to try.

from the anti-competitive,-anti-consumer-issues dept

As expected, UK Parliament Member Damian Collins released a bunch of documents that he had previously seized under questionable circumstances. While he had revealed some details in a blatantly misleading way during the public hearing he held, he's now released a bunch more. Collins tees up the 250-page release with a few of his own notes, which also tend to exaggerate and misrepresent what's in the docs, and many people are running with a few of those misrepresentations.

However, that doesn't mean that all of these documents have been misrepresented. Indeed, there are multiple things in here that look pretty bad for Facebook, and could be very damaging for it on questions around the privacy protections it had promised the FTC it would put in place, as well as in any potential antitrust fight. It's not that surprising to understand how Facebook got to the various decisions it made, but the "move fast and break things" attitude also seems to involve the potential of breaking both the law and the company's own promises to its users. And that's bad.

First, the things that really aren't that big a deal: a lot of the reporting has focused on the idea that Facebook would give greater access to data to partners who signed up to give Facebook money via its advertising or other platforms. There doesn't seem to be much of a bombshell there. Lots of companies with APIs charge for access. This is kind of a standard business model question, and some of the emails in the data dump show what actually appears to be a pretty thoughtful discussion of various business model options and their tradeoffs. This was a company that recognized it had valuable information and was trying to figure out the best way to monetize it. There isn't much of a scandal there, though some people seem to think there is. Perhaps you could argue that allowing some third parties to have greater access shows Facebook has a cavalier attitude towards that data, since it's willing to trade access to it for money, but there's no evidence presented that this data was used in an abusive way (indeed, by putting a "price" on the access, Facebook likely limited the access to companies who had every reason not to abuse the data).

Similarly, there is a lot of discussion about the API change, which Facebook implemented to actually start to limit how much data app developers had access to. And the documentation here shows that part of the motivation to do this was to (rightfully) improve user trust of Facebook. It's difficult to see how that's a scandal. In addition, some of the discussions involve moving specific whitelisted partners to a special version of the API that gives them access to more data... but in a form where the data is hashed, providing better privacy and security for that data while still making it useful. Again, this approach seems to actually be beneficial to end users, rather than harmful, so the attempts to attack it seem misplaced -- and yet they take up the vast majority of the 250 pages.
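For readers unfamiliar with the technique being described, hashing shared identifiers is a common way to let two parties match records without exposing the raw values. The sketch below is a generic illustration of that general idea, not Facebook's actual scheme; the salt, email addresses, and function names are all made up.

```python
import hashlib

def hash_id(identifier: str, salt: str) -> str:
    # Hash an identifier so a partner can match records against its own
    # hashed list without ever seeing the raw value. Generic illustration
    # only -- not Facebook's actual scheme.
    return hashlib.sha256((salt + identifier).encode("utf-8")).hexdigest()

# Both sides hash their lists with the same agreed-upon salt,
# then compare digests to find the overlap:
ours = {hash_id(e, "shared-salt") for e in ["alice@example.com", "bob@example.com"]}
theirs = {hash_id(e, "shared-salt") for e in ["bob@example.com", "carol@example.com"]}
overlap = ours & theirs  # matches on bob@example.com without exchanging raw emails
print(len(overlap))  # 1
```

One design caveat worth noting: salted hashes of low-entropy identifiers like email addresses can still be brute-forced, so "hashed" doesn't automatically mean "private" -- but it is a meaningful improvement over handing partners raw data.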

The bigger issues involve specific actions that certainly appear to at least raise antitrust questions. That includes cutting off apps that recreate Facebook's own features, or that are suddenly getting a lot of traction (and using the access they had to users' phones to figure out which apps were getting lots of traction). While not definitively violating antitrust laws, that's certainly the kind of evidence that any antitrust investigator would likely explore -- looking to see if Facebook held a dominant position at the time of those actions, and if those actions were designed to deliberately harm competitors, rather than for any useful purpose for end-users. At least from the partial details released in the documents, the focus on competitors does seem to be a driving force. That could create a pretty big antitrust headache for Facebook.

Of course, the details on this... are still a bit vague from the released documents. There are a number of charts from Onavo included, showing the popularity of various apps, such as this one:

Onavo was a data analytics company that Facebook bought in 2013 for over $100 million. Last year, the Wall Street Journal broke the story that Facebook was using Onavo to understand how well competing apps were doing, and potentially using that data to target acquisitions... or potentially to try to diminish those competing apps' access. The potential "smoking gun" evidence is buried in these files, but there's a short email on the day that Twitter launched Vine, its app for 6-second videos, where Facebook decides to cut off Twitter's access to its friend API in response to this move, and Zuckerberg himself says "Yup, go for it."

Now... it's entirely possible that there's more to this than is shown in the documents. But at least on its face, it seems like the kind of thing that deserves more scrutiny. If Facebook truly shut down access to the API because it feared competition from Vine... that is certainly the kind of thing that will raise eyebrows from antitrust folks. If there were more reasons for cutting off Vine, that should come out. But if the only reason was "ooh, that's a potential competitor to our own service," and if Facebook was seen as the dominant way of distribution or access at the time, it could be a real issue.

Separately, if the name Onavo sounds familiar to you, that might be because earlier this year, Facebook launched what it called a VPN under the brand name Onavo... and there was reasonable anger over it because people realized (as per the above discussion) that Onavo was really a form of analytics spyware that charted what applications you were using and for what. It was so bad that Apple pulled it from its App Store.

The other big thing that comes out in the released documents is all the way at the end, when Facebook is getting ready to roll out a Facebook app update on Android that will snoop on your SMS and call logs and use that information for trying to get you to add more friends and for determining what kinds of content it promotes to you. Facebook clearly recognized that this could be a PR nightmare if it got out, and they were worried that Android would seek permission from users, which would alert them to this kind of snooping:

That is bad. That's Facebook knowing that its latest snooping move will look bad and trying to figure out a way to sneak it through. Later on, the team is relieved when they realize, after testing, that they can roll this out without alerting users with a permission dialog screen:

As reporter Kashmir Hill points out, it's notable that this "phew, we don't really have to alert users to our sketchy plan to get access to their logs" came from Yul Kwon, who was designated as Facebook's "privacy sherpa" and put in charge of making sure that Facebook didn't do anything creepy with user data. From an article that Hill wrote back in 2015:

The face of the new, privacy-conscious Facebook is Yul Kwon, a Yale Law grad who heads the team responsible for ensuring that every new product, feature, proposed study and code change gets scrutinized for privacy problems. His job is to try to make sure that Facebook’s 9,199 employees and the people they partner with don’t set off any privacy dynamite. Facebook employees refer to his group as the XFN team, which stands for “cross-functional,” because its job is to ensure that anyone at Facebook who might spot a problem with a new app — from the PR team to the lawyers to the security guys — has a chance to raise their concerns before that app gets on your phone. “We refer to ourselves as the privacy sherpas,” says Kwon. Instead of helping Facebook employees scale Everest safely, Kwon’s team tries to guide them safely past the potential peril of pissing off users.

And yet, here, he seems to be guiding them past those perils by helping the team hide what's really going on.

This is also doubly notable given that Kashmir Hill has been perhaps the most dogged reporter covering the creepy lengths to which Facebook's "People You May Know" feature will go. Facebook has a history of giving Hill totally conflicting information about how that feature worked, and these documents reveal, at least, the desire to secretly slurp up your call and SMS records in order to find more "people you might know" (shown as PYMK in the documents).

One final note on all of this. I recently pointed out that Silicon Valley really should stop treating fundamental structural issues as political issues, in which they just focus on what's best for the short-term bottom line, and should focus on the larger goals of doing what's right overall. In a long email included in the documents from Mark Zuckerberg, musing thoughtfully on various business model ideas for the platform, one line stands out. Honestly, the entire email (starting on page 49 of the document) is worth reading, because it really does carefully weigh the various options in front of them. But there's also this line:

If you can't read that, it's a discussion of how it's important to enable people to share what they want, and how enabling other apps to help users do that is a good thing, but then he says:

The answer I came to is that we’re trying to enable people to share everything they want, and to do it on Facebook. Sometimes the best way to enable people to share something is to have a developer build a special purpose app or network for that type of content and to make that app social by having Facebook plug into it. However, that may be good for the world but it’s not good for us unless people also share back to Facebook and that content increases the value of our network. So ultimately, I think the purpose of platform – even the read side – is to increase sharing back into Facebook.

I should note that in Damian Collins' summary of this, he carefully cuts out some of the text of that email to frame it in a manner that makes it look worse, but the "that may be good for the world, but it's not good for us" line really stands out to me. That's exactly the kind of political decision I was talking about in that earlier post. Taking the short term view of "do what's good for us, rather than what's good for the world" may be typical, and even understandable, in business, but it's the root of many, many long term and structural problems for not just Facebook, but tons of other companies as well.

I wish that we could move to a world where companies finally understood that "doing good for the world" leads to a situation in which the long term result is also "good for us," rather than focusing on the "good for us" at the expense of "good for the world."

from the who-needs-competition? dept

We've talked for a while about how, despite all the hype placed upon the nation's scattered but modest deployment of gigabit networks, broadband in countless parts of the country is actually getting significantly less competitive. That's thanks in large part to the nation's phone companies, which have increasingly refused to pony up the necessary costs to upgrade their aging DSL networks at any scale. Instead, many have shifted their focus either to enterprise services or, as in the case of Verizon, to trying to peddle ads to Millennials after gobbling up AOL and Yahoo.

A new study by several consultants for the broadband industry offers a little more insight into the real-world result of the sector's ongoing competition problem. According to the report by Economists Incorporated and CMA Strategy Consulting, there's a fairly staggering number of broadband consumers who don't see any real competition whatsoever, especially at the FCC's standard definition of broadband (25 Mbps down, 3 Mbps up):

"More than 10.6 million US households have no access to wired Internet service with download speeds of at least 25Mbps, and an additional 46.1 million households live in areas with just one provider offering those speeds, a new analysis has found. That adds up to more than 56 million households lacking any high-speed broadband choice over wired connections. Even when counting access to fixed wireless connections, there are still nearly 50 million households with one 25Mbps provider or none at all."

So it should be noted here that these estimates are likely optimistic. FCC data has previously suggested that this number is even higher, with former FCC boss Tom Wheeler stating that around 80% of homes can't get access to the agency's standard definition of broadband. It's notably worse in rural or tribal areas. But even this week's new, toned-down report by industry consultants doesn't paint a particularly pretty picture. Even at slower broadband speeds, you'd be hard pressed to identify anything close to reasonable competition:

"There were 31.1 million households with exactly one wireline provider offering speeds of at least 10Mbps, and another 6.9 million households with zero providers offering such speeds over wired connections. At the paltry level of 3Mbps download speeds, 19.3 million households had access to one wireline ISP and 4.9 million households had no access at all."

It should be noted that one of the co-authors of the report, Hal Singer, has a bit of a history of creatively massaging data at the industry's behest -- especially when it comes to trying to vilify net neutrality (which the 45-page report seems to avoid talking about). So while Singer's ability to candidly acknowledge a lack of competition is a little surprising (even though the report does try to scale back previous FCC estimates on this front), less surprising is the authors' proposed solution to the broadband industry's broadband deployment and competition shortcomings: the magical wand that is telecom sector deregulation.

So again, the report is quick to avoid the debate over the current administration's decision to kill consumer privacy protections and gut net neutrality, despite Singer being a major player in trying to make the latter happen. And while it pays some lip service to competition, it fails to acknowledge how cable's growing monopoly and outright telco apathy are making competition problems worse. The report however does try to claim that several, less talked about FCC initiatives are going to expand fiber and competition to an additional 26.7 million homes:

"In two recent Notices of Proposed Rulemakings (“NPRMs”), the FCC has outlined a range of potential actions to make it faster and less costly to deploy next-generation networks. It is expected that these proposals will lower pole-attachment costs, reduce the time and cost of make-ready, reduce barriers to copper retirement, accelerate legacy time-division multiplexing (“TDM”) product discontinuance, and reduce barriers to locating and deploying wireless infrastructure."

The telecom industry has insisted for decades that if you remove all regulatory oversight, competition and connectivity will magically spring forth from the sidewalks, bathing us uniformly in dirt-cheap, ultra-fast connectivity. Of course that never happens because reality is notably more complicated, and each piece of regulation (especially in an industry where incumbent legacy giants are usually quite-literally writing the laws) needs to be weighed on its actual merits. When you just blindly "deregulate" a sector that suffers from both regulatory capture and limited competition, history tells us you don't get a miracle -- you get Comcast.

What most people also don't seem to understand is that when the telecom industry pushes for "deregulation," what it actually means is passing regulation it writes. And, historically, that regulation unsurprisingly makes life easier for wealthy, entrenched duopolists, but makes life substantially harder on the smaller competitive upstarts that lack the same lobbying and campaign-contribution firepower. It's generally how they get partisans who adore the concept of killing burdensome regulations (because yes, there is plenty of that) to cheer against their own best interests. And it has been a smashing success for decades. Your Comcast bill surely agrees.

So while the report is correct that things like utility pole attachment reform are important for fiber deployment, it fails to mention that cities that have attempted such reforms have been sued by Comcast, Charter and AT&T to try and slow competitive threats. Similarly, while the report is quick to emphasize the importance of "reducing barriers to copper retirement," it fails to mention that AT&T and Verizon's version of this involves severing the taxpayer-subsidized DSL connections of millions of users (many elderly), and just shoving them toward notably more expensive wireless (assuming it's even available).

So yes, some of these efforts -- in an ideal world -- could speed up deployment. But because we've let industry giants quite literally infect government (including surveillance) on a bone-marrow level, actually implementing any regulatory or deregulatory policies that improve competition simply doesn't happen -- because it would reduce sector revenues. Consultants predominantly paid by the industry aren't likely to admit this, but as somebody having spent the better part of a lifetime tracking this sector I can assure you: none of the competition, coverage and service problems in telecom are going to be fixed until we somehow lessen Comcast, AT&T, Charter and Verizon's influence over state and federal politics.

from the nothing-to-see-here dept

Late last year Google Fiber announced it would be pausing expansion into several new markets, axing its CEO, and shuffling a number of employees around. Reports subsequently emerged suggesting that Alphabet higher ups were growing frustrated with the high cost and slow pace of fiber deployment, and were contemplating an overall larger shift to wireless. While the company continues to insist that there's nothing to see here and that everything is continuing as normal, signs continue to emerge that the ground Google Fiber is built on may not be particularly sturdy.

This week, numerous Kansas City residents said they were told that the company was cancelling their installations after they had waited eighteen months for service. Users there are frustrated by Google's complete lack of explanation for the rash of cancellations:

"About April, May, I saw sometimes as many as four to five Fiber trucks in the neighborhood. I kept watching my email but never got anything in the mail to schedule my appointment or anything,” Meurer told 41 Action News.

That was back in October 2015.

Eighteen months later, Meurer still doesn’t have Google Fiber. He recently received an email saying the company had canceled his installation.

"I’m left wondering what is going on,” said Meurer.

Kansas City residents aren't alone. Portland was one of the cities Google Fiber was supposed to launch in, but locals there are similarly frustrated by Google's about-face. Especially since the city had shuffled around city ordinances, laid the groundwork for the placement of Google Fiber "huts," and convinced the state legislature to pass a new law providing notable tax incentives for Google Fiber. Chicago, Jacksonville, Los Angeles, Oklahoma City, Phoenix, San Diego, San Jose, and Tampa were also in various states of contact with Google Fiber about potential builds that apparently will no longer be happening.

And while Google Fiber still exists, Google/Alphabet isn't helping restore confidence in the disruptive potential of the service. By and large the company continues to insist that everything is fine and there's nothing to see here despite ongoing evidence of cold feet at the executive level. Whenever press outlets inquire about last fall's decision, reporters are given a calorie-free rosy statement that tells people absolutely nothing substantive about what's going on. This statement, for example, is what I was given when I asked the company specifically why it was cancelling fiber installations in Kansas City:

"Google Fiber loves Kansas City and is here to stay. We’ve been grateful to be part of your community since 2011, and for the opportunity to provide superfast Internet to residents. We recently announced our expansion into Raymore, we are continuing to build in Overland Park, and we can’t wait for even more customers in Kansas City to experience what’s possible with Google Fiber."

Granted, Google's pivot to wireless could certainly work. The company is conducting wireless trials in the 71-76 GHz and 81-86 GHz millimeter wave bands, as well as the 3.5 GHz band, the 5.8 GHz band and the 24 GHz band. It seems fairly clear that Alphabet executives really don't know what they want to do just yet, but don't want to admit that to anybody. But confidence that Google Fiber would be the answer to solving the broadband mono/duopoly logjam is quickly wavering, something unaided by Google's bizarre refusal to be clear about the direction the project is headed.

from the yeah,-that'll-work dept

Here's a story that starts out well. One of the UK's top police officers, Chief Superintendent Gavin Thomas, has said that putting people in prison for offenses like hacking into computers makes no sense. He points out that it costs around $50,000 a year to keep someone in a traditional prison, and that education programs are likely to be a far more cost-effective solution, especially in terms of reducing recidivism. This is absolutely right, and it's great to hear a senior officer admit it. Unfortunately, things go downhill from here. He told the Telegraph:

If you have got a 16-year-old who has hacked into your account and stolen your identity, this is a 21st century crime, so we ought to have a 21st century methodology to address it.

His solution is as follows:

He said convicted criminals could be fitted with electronic jammers around their wrists or ankles which blocked wifi signals and prevented them from going online.

Leaving aside the human rights implications, which to his credit Thomas acknowledges, there is another big problem with the proposal, as Techdirt readers have doubtless already spotted. The people wearing these WiFi jammers would be those who have been found guilty of some computer-related crime. By definition, then, they are likely to be tech-savvy. So they probably have other computers that can use Ethernet connections to access the Internet. In addition, they are unlikely to have any problems using Bluetooth or a USB cable to reverse-tether their mobiles to a system with wired access. The more adventurous might even try to rig up some kind of Faraday shielding to jam the jammer. In other words, this isn't going to work, but would probably cause havoc with everyone else's WiFi connections.

Thomas is also quoted as saying:

It is utterly essential for detectives and criminal investigators to use data held on smartphones and other devices when they are investigating serious crimes.

Given his belief that jamming bracelets would stop convicted computer criminals from using the Internet, the worry has to be that he shares the mistaken view that tech companies can create a safe system of crypto backdoors or "golden keys" that only the authorities can use. Let's hope he takes some expert advice before offering an opinion on that one.

from the flying-saucer-shit dept

Back in October Google Fiber confirmed that all was not entirely well at the disruptive broadband provider. The company announced that not only would it be shaking up its executive leadership, it would be laying off some Google Fiber employees and putting a hold on fiber deployments in around nine cities (existing builds will continue, however). There were several reasons for the shift, the biggest being that the company wasn't happy with the time it was taking to build networks from scratch, and was considering a notable pivot to next-generation wireless to save both time and money.

But subsequent reports have suggested there's a notable split among Google/Alphabet executives as to the future direction of Google Fiber. Bloomberg recently revealed some additional details, noting that part of the underlying issue is that Alphabet CFO Ruth Porat has been tightening the purse strings at the Mountain View giant. But the report also touches on the fact that Larry Page apparently grew tired of the slow pace of disruption in the telecom space because digging ditches isn't "flying saucer shit":

"But seeking permits to lay fiber is time-consuming and digging holes expensive. Former employees say Page became frustrated with Fiber’s lack of progress. “Larry just thought it wasn’t game-changing enough,” says a former Page adviser. “There’s no flying-saucer shit in laying fiber.” In October the company announced that it was dismissing around 130 staffers and halting the expansion of the fiber network in eight cities. Barratt resigned that same day."

In addition to navigating a labyrinthine maze of antiquated underground urban infrastructure, Google Fiber has faced all manner of delays caused by incumbent broadband providers like AT&T and Comcast, who fight tooth and nail to hamstring Google Fiber and other competitors. From protectionist state laws intended to prevent cities from striking public/private partnerships, to attempts to keep Google Fiber from quick access to utility poles, these companies have decades of experience using cash-compromised state legislators and regional regulatory capture against would-be competitors.

But this is all stuff Google Fiber knew full well before throwing its hat into the telecom arena. And while Page may not think that providing a desperately needed alternative to the existing broadband duopoly is "flying saucer shit," Google Fiber's impact on the market has been transformative all the same. Even with Google Fiber's admittedly sparse footprint, the mere presence of the service results in ISPs dramatically dropping prices and boosting their own deployments of gigabit service. Google Fiber's existence also created a necessary national dialogue on the sorry state of U.S. broadband competition.

Previous reports have suggested that executives at Google were split over Google Fiber, with some wanting the company to stay the course with fiber, and many others believing that wireless will be good enough. But the Bloomberg report is quick to highlight how many also worry this is just the latest example of Google's inevitable shift from risk-taking disruptor to notably blander legacy-turf-protection machine:

"These changes have prompted many in Silicon Valley to accuse Page of bowing to investor pressure—in other words, of acting like a CEO of a normal, publicly traded company. “It definitely looks like a more conventional company,” says Randy Komisar, a partner at Kleiner Perkins Caufield & Byers. “It’s the classic GE conglomerate model,” he says, comparing Page to Jack Welch, famous for turning General Electric around by shedding research divisions and slashing costs."

With the incoming Trump administration making it very clear the goal is to defang and defund the FCC, Google Fiber's path could get even more complicated, with fewer regulatory allies in the fight against incumbents. While the Google Fiber shift to wireless could still pay notable competitive dividends, it's still entirely within the realm of possibility that Page and friends get bored with Google Fiber entirely in a few years, leaving the effort as just another footnote in the never-ending quest to bring something vaguely resembling real price competition to bear on Verizon, AT&T, Comcast and Charter.

Taplin kicks it off by jumping on Mark Cuban's ridiculous comments from last week saying that AT&T should be able to buy Time Warner so that it can "compete" against Google and Facebook, and then takes it to an even more ridiculous level. The crux of Taplin's argument: Google and Facebook are big, and thus bad, and need antitrust treatment.

Look at the numbers. Alphabet (the parent company of Google) and Facebook are among the 10 largest companies in the world. Alphabet alone has a market capitalization of around $550 billion. AT&T and Time Warner combined would be about $300 billion.

Yup. They're big companies -- and certainly, as with all big companies, we should be wary about how they might abuse their power. But big, by itself, isn't automatically bad. And the nature of antitrust is not that big is bad, but that abusing monopoly power is bad. And Taplin has no way to show either (1) monopoly power or (2) abusive behavior, so he just starts throwing numbers around.

Alphabet has an 83 percent share of the mobile search market in the United States and just under 63 percent of the US mobile phone operating systems market. AT&T has a 32 percent market share in mobile phones and 26 percent in pay TV. The combined AT&T-Time Warner will have $8 billion in cash but $171 billion of net debt, according to the research company MoffettNathanson. Compare that to Alphabet’s balance sheet, with total cash of $76 billion and total debt of about $3.94 billion.

Nice cherry-picking, Jonathan! The real scam in bogus antitrust complaints is defining the markets in a way that makes things look much worse than they really are. Notice that Taplin focuses on "mobile search" (random?) as the market for Google and "mobile phones" as the market for AT&T. But he leaves out the simple facts: if you need an internet connection, in many cases AT&T is either your only option or one of two options. And if you use AT&T, it gets to see everything you do. Switching broadband providers or mobile phone providers is a complicated and often expensive process. Switching search engines... is not.

Then, to get to the question of "bad behavior," Taplin falls back on the silly line that because Google and Facebook have made a lot of money, and his buddies in legacy entertainment companies have been making a lot less money, that somehow Google and Facebook have unfairly taken money from his industry. That's just silly.

In the past decade, an enormous reallocation of revenue of perhaps $50 billion a year has taken place, with economic value moving from creators of content to owners of monopoly platforms.

I reached this conclusion from the following statistics: Since 2000, recorded music revenues in the United States have fallen to $7.2 billion per year from $19.8 billion. Home entertainment video revenue fell to $18 billion in 2014 from $24.2 billion in 2006. United States newspaper ad revenue fell to $23.6 billion in 2013 from $65.8 billion in 2000.

And yet, by every available metric, people are consuming more music, video, news and books. During that same period, Google’s revenue grew to $74.5 billion from $400 million.

Sing it with me, folks: correlation is not causation. After all, the number of works of visual art copyrighted in the US similarly has an inverse correlation to the number of females in NY who slipped or tripped to their death (really!). It doesn't mean it's a causal relationship where more of one means less of the other.

The reason that Google and Facebook are making lots of money is because they're offering products that people want, they're doing it for free, and they've come up with business models that work. The reason legacy entertainment companies are flailing (and, realistically, only some of them are) is because they stuck with their old business models, which amounted to ignoring, mocking, or attacking competition from new sources.

The problem, in short: Taplin's whole world revolves around elitism and gatekeepers. The business models he celebrates are gatekeeper business models -- the ones that keep out the riffraff and the people that Taplin likes to insult because he thinks their "art" isn't good enough to be seen by the world. The world of the internet is the opposite. It's about enabling anyone to be a creator, and to open up new avenues to create, to share, to promote, to distribute, to build a fan base and to monetize. Those were all functions that Taplin and his friends used to control, with a strict lock on the gate, allowing them to artificially inflate prices. When the new platforms came on the market and democratized every bit of the process of creating and distributing content, suddenly the "deal" offered by the gatekeepers didn't look so good. And that's why those businesses have struggled.

And it's why, comparatively speaking, most of the public likes companies like Google and Facebook, while they hate AT&T. Find me a list of consumer satisfaction or most admired companies where AT&T outranks either of the other ones. Antitrust should be about protecting consumers -- and the public is pretty happy with the services it gets from Google and Facebook... but not so much with AT&T.

But, of course, to Taplin, it all comes back to piracy, because he's absolutely sure that's why everyone uses Google and Facebook, even though he's wrong.

Every pirated music video or song posted on YouTube or Facebook robs the creators of income, and YouTube in particular is dominated by unlicensed content. Google’s YouTube has an over 55 percent market share in the streaming audio business and yet provides less than 11 percent of the streaming audio revenues to the content owners and creators. But Facebook, which refuses to enter into any licensing agreement on music or video, is challenging YouTube in the free online video and music world.

As we discussed a few months ago, when you look at the actual data, only 2% of music video views on YouTube are unauthorized. 2%. So, no, YouTube is not "dominated by unlicensed content." That's simply and utterly false. And, no, even those unauthorized videos are not "robbing creators of income." Many smart creators these days are using YouTube as a platform to get more fans and build a bigger support base, which they can take to platforms like Patreon or Kickstarter, rather than having to give up everything to sign with a major label run by one of Taplin's friends.

I recognize that Taplin's friends have struggled to understand and adapt to this new world. And I understand that they want to lash out at the big companies like Google and Facebook that have helped make this world a reality. But why does the NY Times keep letting him publish blatantly factually false information? Oh, and the kicker? After a long rant that is full of misleading bullshit... he asks for "an honest national conversation."

Perhaps in January we can have an honest national conversation on monopoly and our future.

If we were to have an honest conversation, it would have to leave out Taplin's lies.

from the build-it-and-they-won't-come dept

Back in August a report emerged claiming that Google Fiber executives were having some second thoughts about this whole "building a nationwide fiber network from the ground up" thing. More specifically, the report suggested that some executives were disappointed with the slow pace of digging fiber trenches, and were becoming bullish on the idea of using next-gen wireless to supplement fiber after acquiring fixed wireless provider Webpass. As such, the report said the company was pondering some staff reductions, some executive changes, and a bit of a pivot.

Fast forward to this week, when Access CEO Craig Barratt published a cheery but ambiguous blog post not only formally announcing most of these changes, but his own resignation as CEO. According to Barratt, Google will continue to serve and expand Google Fiber's existing markets (Austin, Atlanta, Charlotte, Kansas City, Nashville, Provo, Salt Lake City, and The Triangle in North Carolina), and will also build out previously-announced but not yet started efforts in Huntsville, Alabama; San Antonio, Texas; Louisville, Kentucky; and Irvine, California.

From there, the direction Google Fiber will be headed gets murky. According to Barratt, Google has paused (read: killed) potential deployments in cities where Google Fiber had been having conversations, but hadn't yet given the green light for full deployment (Portland, Chicago, Jacksonville, Los Angeles, Oklahoma City, Phoenix, San Diego, San Jose, and Tampa). Most of the layoffs will be in these cities, notes Barratt:

"For most of our “potential Fiber cities” — those where we’ve been in exploratory discussions — we’re going to pause our operations and offices while we refine our approaches. We’re ever grateful to these cities for their ongoing partnership and patience, and we’re confident we’ll have an opportunity to resume our partnership discussions once we’ve advanced our technologies and solutions. In this handful of cities that are still in an exploratory stage, and in certain related areas of our supporting operations, we’ll be reducing our employee base."

A report over at Bloomberg notes that about 9% of employees at Access (which covers multiple projects, not just Google Fiber) will be let go, which is notably fewer staff reductions than last summer's report had suggested. Bloomberg's insiders also claim that there have been some rifts among executives at Google/Alphabet/Access over whether to remain dedicated to the laborious process of fiber installations, or to pivot more completely to wireless:

"Moving into big cities was a contentious point inside Google Fiber, according to one former executive. Leaders like Barratt and Dennis Kish, who runs Google Fiber day-to-day, pushed for the big expansion. Others pushed back because of the prohibitive cost of digging up streets to lay fiber-optic cables across some of America’s busiest cities."

That there's some hesitation isn't surprising. Not only is building a fiber network from the ground up incredibly hard, expensive, and time-consuming, the telecom industry is awash with deep-pocketed incumbents intent on making things as difficult as possible for competitors like Google Fiber (and downright impossible for smaller ISPs). From AT&T suing cities to thwart attempts to streamline utility pole attachments, to incumbent ISPs writing awful state laws prohibiting public/private partnerships, telecom can certainly be a cesspool of protectionism of the worst sort.

While these incumbent ISPs (and their armies of paid policy mouthpieces) will likely spend the next few weeks celebrating the "death of Google Fiber," there's nothing stopping the company from pivoting to next-generation wireless. Google has filed applications with the FCC to conduct trials in the 71-76 GHz and 81-86 GHz millimeter wave bands, and is also conducting a variety of different tests in the 3.5 GHz band, the 5.8 GHz band and the 24 GHz band. That said, it certainly remains possible that at some point Google gets tired of ramming its head against VerizoCasT&T and sells the project off in a few years, leaving us with another sad historical footnote in the often pitiful national quest for something vaguely resembling broadband competition.

from the innovative,-but-controlling dept

Undeniably, Prince’s death last week marked the loss of a true musical genius and maverick. In his life, he was known for being a talented musical innovator with flamboyant clothes and a contrarian streak. He was adept with a range of instruments and across multiple genres of music, including funk, jazz, pop, rock, and R&B.

As broadly gifted an artist as he was, Prince never quite found the right approach when it came to licensing his music for distribution -- in spite of the fact that he sold over 100 million records, placing him among the best-selling artists of all time. He won an Oscar, a Golden Globe, and seven Grammys, among other accolades. His massive discography includes 50 albums, 104 singles, and 136 music videos, among other creative works. And yet his fans were left in the odd position, on the news of his death, of being frequently unable to provide links to Prince’s massive oeuvre.

Like David Bowie, who died only a few months earlier this year, Prince was constantly reinventing himself throughout his career. But one key reason for his reinvention -- at different times, he was known by “Prince,” “Jamie Starr,” an unpronounceable glyph, and perhaps most notoriously, as “The Artist Formerly Known as Prince” -- was his unhappiness with his record labels, and later with digital/Internet distribution.

And even now, if you’re looking to listen to your favorite Prince tracks on popular digital music services like Spotify or Apple Music, you’re out of luck. While you can find some live performances on YouTube, and a couple of exceptions like his single “Stare” on Spotify, the streaming rights to his music are licensed exclusively through Tidal -- a niche subscription-only service owned by Jay Z.

You can see why Prince may have been attracted to Tidal as a service. Since its launch in late 2014, a number of major artists have embraced it, offering exclusive releases and touting the service’s better deal for artists. Indeed, Tidal purports to “pay the highest percentage of royalties to artists, songwriters and producers of any music streaming service.”

But it’s hard to see how it would make business sense to license exclusively with them, as Prince did. For one thing, it’s not entirely clear that Tidal’s rates are that much better than Spotify’s: they claim to pay out 75% and 70% of their revenues to rights holders, respectively. Yet Tidal has also claimed that it pays out four times Spotify’s royalty rate.

Vania Schlogel, then an executive at Tidal, clarified the rates in an interview with the Hollywood Reporter:

There was some confusion on the Internet about whether “royalty rate” was a percentage of Tidal’s total revenue. According to Schlogel, it is. The industry standard royalty rate, she says, is 70% (roughly 60% to record labels, roughly 10% to artists via publishers). Tidal pays 62.5% and 12.5% (which equals the 75% Jay Z is referring to).

This makes Tidal’s base royalty rate going to artists 25% higher than Spotify’s (12.5% versus roughly 10%). But Tidal also has about 45% of its subscribers on a $19.99 per month premium tier, which would make the revenue per subscriber going to artists around 80% higher.
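A quick back-of-the-envelope check bears those figures out. This sketch assumes the standard tier on both services costs $9.99/month and Tidal's premium tier $19.99/month, and that listening behavior is identical across services -- simplifications, not reported facts:

```python
# Rough per-subscriber artist payout, using the figures cited above.

spotify_price = 9.99
spotify_artist_share = 0.10   # ~10% of revenue reaches artists via publishers

tidal_artist_share = 0.125    # Tidal's 12.5% direct-to-artist slice
tidal_premium_frac = 0.45     # ~45% of Tidal subscribers on the $19.99 tier

# Blended monthly revenue per Tidal subscriber
tidal_arpu = (1 - tidal_premium_frac) * 9.99 + tidal_premium_frac * 19.99

spotify_artist_payout = spotify_price * spotify_artist_share  # ~ $1.00/sub
tidal_artist_payout = tidal_arpu * tidal_artist_share         # ~ $1.81/sub

print(f"Rate advantage:   {tidal_artist_share / spotify_artist_share - 1:.0%}")
print(f"Payout advantage: {tidal_artist_payout / spotify_artist_payout - 1:.0%}")
# Rate advantage:   25%
# Payout advantage: 81%
```

The premium tier is what pushes the per-subscriber gap from 25% to roughly 80%.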

That’s a lot more! Artists should all be switching to exclusive deals with them, right? Well...not so fast. Spotify alone has 30 million paying subscribers. 100 million if you include ad-supported free tier listeners. Apple Music has another 11 million paid subscribers. Compare that with Tidal’s relatively paltry 3 million. Not to mention commercial distribution to YouTube’s 1 billion active users, or the dozen other streaming services out there.

Assuming those subscribers have comparable listening habits, an exclusive wouldn’t make business sense even if Tidal paid ten times Spotify’s royalty rate -- at which point its payouts would exceed its total revenue. Although, artists can do whatever they want. It’s a free market (sort of).
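The scale argument can be sketched the same way. Again a rough illustration only: paid subscribers only, free and ad-supported listeners ignored, and the same tier prices and shares assumed as above:

```python
# Total monthly artist pool ~ subscribers x revenue per subscriber x artist share.

tidal_arpu = 0.55 * 9.99 + 0.45 * 19.99          # blended price, ~ $14.49

spotify_pool = 30_000_000 * 9.99 * 0.10          # ~ $30.0M/month to artists
tidal_pool = 3_000_000 * tidal_arpu * 0.125      # ~ $5.4M/month to artists

print(f"Spotify's artist pool is {spotify_pool / tidal_pool:.1f}x larger")
# Spotify's artist pool is 5.5x larger

# At ten times the royalty rate, Tidal would owe 125% of its revenue:
tidal_pool_10x = 3_000_000 * tidal_arpu * 1.25
print(tidal_pool_10x > 3_000_000 * tidal_arpu)   # True -- payouts exceed revenue
```

Even with Tidal's per-subscriber advantage, Spotify's sheer subscriber count means a much larger total pool -- which is the point about exclusives not making business sense.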

But for Prince, his embrace of Tidal may not have been just about royalty rates. Rather, it may have been a reflection of his proclivity to assert tight control of his brand. As Vox’s Constance Grady writes:

It's classic Prince: Tidal is the best program not only because it pays better, but because it gives him the most control over his music and his persona. And Prince never let someone else control his persona if he could help it.

This was fully consistent with the character of a man who preferred to play small, intimate venues even when he could have been selling out stadiums.

But making music less accessible poses serious challenges for artists and consumers alike. For one thing, as English singer/songwriter Lily Allen explains, it will reinvigorate incentives for piracy (notably, she has also had an interesting relationship with Techdirt):

I love Jay Z so much, but Tidal is (so) expensive compared to other perfectly good streaming services, he's taken the biggest artists ... Made them exclusive to Tidal (am I right in thinking this?), people are going to swarm back to pirate sites in droves ... Sending traffic to torrent sites.

Perhaps unsurprisingly, when Kanye West decided to release his album The Life of Pablo exclusively on Tidal, it was pirated over 500,000 times in its first day alone -- drawing fire for reinvigorating online music piracy.

A recent study by Columbia University (among other research including the Copia Institute’s “The Carrot Or The Stick?”) confirms that providing access to good legal alternatives is effective at reducing online piracy -- particularly among young people. To take another example, the rise of Spotify in Sweden was followed by a major decline in music sharing on the Pirate Bay. According to Copia’s study, “a similar move was not seen in the file sharing of TV shows and movies...until Netflix opened its doors in Sweden.”

During his career, Prince also flirted with various album release strategies, and explored ways to cut out the middleman by going fully independent.

Prince’s strategy was visionary, but ahead of its time. A solution that’s just now coming of age is blockchain-driven smart contracts for digital music consumption. If they catch on, they could cut out the middleman and transparently distribute revenues directly to artists behind a given work, according to pre-arranged terms. Prototype service Ujo is already doing it with artist Imogen Heap’s single “Tiny Human.” So, in actuality, perhaps Jay Z should be more worried about blockchain than Spotify.
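The core idea behind such a contract is simple: the revenue shares are fixed up front, and every incoming payment is split transparently according to those terms. Here's a minimal, hypothetical sketch in Python -- not any real platform's API, and the names and shares are invented for illustration:

```python
# A toy revenue-split "contract": pre-arranged shares, transparent payouts.
from decimal import Decimal

# Pre-arranged terms (hypothetical roles and fractions; must sum to 1).
SPLITS = {
    "artist": Decimal("0.70"),
    "producer": Decimal("0.20"),
    "songwriter": Decimal("0.10"),
}
assert sum(SPLITS.values()) == 1

def distribute(payment: Decimal) -> dict:
    """Split a payment among rights holders per the pre-arranged shares."""
    return {who: payment * share for who, share in SPLITS.items()}

payouts = distribute(Decimal("100.00"))
# payouts["artist"] == Decimal("70.0000"), and so on for each rights holder
```

On an actual blockchain the split logic would live on-chain, so no intermediary could withhold or reallocate the payments after the fact -- that, rather than the arithmetic, is the pitch.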

Indeed, as streaming becomes the dominant revenue source in the music market, and consumers continue to shift away from physical media and digital downloads, the pressure from artists will only increase as they seek more transparency, and a stronger ability to renegotiate their share of revenues from all sides (but particularly from labels).

On Twitter, Allen echoed this sentiment, writing that rather than demonizing streaming services, artists should look towards the hefty cut of revenue taken by labels:

For Prince, online streaming services were just the latest challenge in his complex relationship with evolving digital markets. Like Bowie, Prince was a digital pioneer -- among the first to embrace the Internet’s potential to create a direct relationship with his fans. In 2001, he launched one of the first music subscription services, NPG Music Club, which ran for five years. In 2009, it was succeeded by lotusflow3r.com. As the Wall Street Journal describes it:

LotusFlow3r.com resembled a galactic aquarium, featuring doodads like a rotating orb that played videos. The promise: fans who ponied up $77 for a year-long membership would receive the three new albums, plus an ensuing flow of exclusive content, like unreleased tracks and archival videos.

It met with a mixed reception, and a year after its launch, it went dark.

Ultimately, as the Internet came of age, Prince met it with increasing resistance. Likely, he saw his ability to assert control slipping away. He wasn’t a fan of people repurposing his work in the analog era, so why should we expect him to embrace a digital one -- where it’s far easier to remix, edit, dub and repurpose? As Mike Masnick explains, Prince became a militant enforcer of his intellectual property, who played fast and loose with the law in his litigiousness:

At one point, he even declared that the Internet is a fad, rebelling against a model that wouldn’t work on his terms:

The internet's completely over. I don't see why I should give my new music to iTunes or anyone else. They won't pay me an advance for it and then they get angry when they can't get it.

(At this point he could have styled himself “The Prince of Denial.” He even deleted his Facebook and Twitter accounts.)

Famously, Prince, via Universal Music, was responsible for the infamous “dancing baby” DMCA takedown over a video featuring Prince’s “Let’s Go Crazy” playing faintly in the background of a short clip as a toddler danced*. Ultimately our friends at EFF, who were representing Stephanie Lenz, prevailed on their fair use claim. In 2013, EFF awarded him their “Raspberry Beret Lifetime Aggrievement Award” for “extraordinary abuses of the takedown process in the name of silencing speech.”

Despite all the digital-copyright agitation Prince generated in expressing his unhappiness with Internet distribution channels -- and despite his insistence, the Internet doesn’t seem to be “over” quite yet -- he will of course be remembered primarily for his genius as a songwriter, performer, and producer. And also as a visionary. Although he passed away just before the rise of virtual reality and mixed reality technologies, one can only imagine him as someone who would have embraced them. Even if imperfectly.

Ironically, given his virtuosity and lasting impact on pop music, his limits on digital distribution -- and on his fans’ ability to find new creative uses for his work -- make it orders of magnitude harder to bring his music to new generations of listeners, who may never know what all the fuss about Prince was about. And that’s a shame.

* Post updated to reflect that while Prince/Universal sent the initial DMCA takedown, it was Lenz and EFF who brought the lawsuit for that takedown.