from the would-be-a-start dept

As we've discussed for many years, Homeland Security and the Justice Department have convinced too many courts that there is some sort of 4th Amendment "exception" at the border, whereby Customs and Border Protection (CBP) agents are somehow allowed to search through your laptops, phones, tablets and more just because, fuck it, they can. Now bipartisan pairs in both the Senate and the House have introduced a new bill that would require that CBP get a warrant to search the devices of Americans at the border. On the Senate side, the bill is sponsored by Senators Ron Wyden and Rand Paul, and in the House, it's Reps. Blake Farenthold and Jared Polis. Honestly, it's absolutely ridiculous that this kind of bill is even needed in the first place, because the 4th Amendment should just take care of it. But with DHS and the courts not properly appreciating the 4th Amendment's requirement of a warrant to do a search, here we are. Here's a short summary of the bill, which notes:

The government has asserted broad authority to search or seize digital devices at the border
without any level of suspicion due to legal precedent referred to as the “border search
exception” to the Fourth Amendment’s requirement for probable cause or a warrant.

Until 2014, the government claimed it did not need a warrant to search a device if a person had
been arrested. In a landmark unanimous decision, the Supreme Court (in Riley v. California)
ruled that digital data is different and that law enforcement needed a warrant to search an
electronic device when a person has been arrested.

This bill recognizes the principles from that decision extend to searches of digital devices at the
border. In addition, this bill requires that U.S. persons are aware of their rights before they
consent to giving up online account information (like social media account names or passwords)
or before they consent to give law enforcement access to their devices.

That last part is especially important, given how eager Homeland Security has been to start demanding social media passwords as you deplane. Unfortunately, the bill as written only applies to "US Persons" as defined here, meaning that it may not be of much help for a new DHS proposal, also revealed this week, to more aggressively pursue phone and social media searches of foreigners. This is a bad idea for a whole host of reasons we've already discussed, but the short version is that it's bad for security, it's bad for tourism, it's bad for Americans' safety (because other countries will reciprocate). It's just a bad, bad idea.

At the very least, this new bill would block this from happening for American citizens and lawful permanent residents, but it should go much further. And, of course, who knows if this bill will get any traction, or get signed by the President.

from the we'd-just-like-to-know-wtf-is-going-on dept

Columbia University's Knight First Amendment Institute wants to know why device searches at the border have skyrocketed since the beginning of this year. As was reported earlier this month, the number of devices searched in February 2017 equals the total searched in all of 2015. Even last year's jump from 5,000 to 25,000 searches looks minuscule in comparison. Border device searches are on track to more than double last year's numbers. (h/t The Intercept)

The Knight First Amendment Institute filed FOIA requests with the DHS, ICE, and CBP for "statistical, policy, and assessment records" related to the steep increase in device searches. It's also looking for any legal interpretations the agencies might have on hand that explain their take on the Supreme Court's Riley decision, which instituted a warrant requirement for cell phone searches.

It asked for expedited handling given the significant public interest in all things immigration and border-related, which has climbed along with the device searches thanks to several presidential directives, some of which are being challenged in court.

As the lawsuit [PDF] notes, the public definitely should be apprised of the policies and procedures governing border device searches. If there's been an increase in searches, the public should be made aware of why this is happening, as well as their rights and remedies when it comes to entering or leaving the United States. The suit also points out that several recent reports suggest devices have been taken by government agents by force, or "consent" obtained through threats of further detention and/or violence.

Naturally, the FOIA requests have been greeted with non-responses and indifference by these agencies, which has prompted the Institute's FOIA lawsuit. The FOIA requesters seek the court's assistance in pushing the agencies into quicker responses. To date, it's received nothing but acknowledgements. There have been no estimates of time needed to fulfill the requests or any indication the agencies have even begun searching for responsive documents.

Of course, this immediate lawsuit strategy could backfire. The government has been pushing back against FOIA requesters' lawsuits filed shortly after the statutory response period has expired. It claims these immediate lawsuits are nothing more than certain requesters hoping to push their requests to the front of the line, rather than allowing their requests to be ignored/mishandled/stonewalled in the order they were received. Of course, the government's arguments would be more sympathetic if multiple federal agencies didn't repeatedly engage in these tactics and do whatever they can to keep requested documents out of requesters' hands.

from the that's-just-wrong dept

The news site Mashable has apparently decided that you, the general public, are simply too dumb to actually own the stuff you thought you bought because you might just injure yourself. We've written about so-called "right to repair" laws and why they're so important. There are a variety of issues, but the most basic one here is about property rights. If you buy something, it's supposed to be yours. It doesn't remain the property of whoever first made it. And they shouldn't then be able to deny you the ability to tinker with, modify, or repair what you bought. However, Mashable's Lance Ulanoff (last seen here being completely clueless about the importance of anonymity online because he, personally, never could see a reason why someone might want to speak truth to power without revealing who they are), has decided that because you might be too dumb to properly repair stuff, the entire "right to repair" concept "is a dumb idea."

The article can basically be summed up as "I have a friend, and her iPhone wasn't repaired properly, so no one should be able to repair your iPhone but Apple." Really.

My co-worker, Tracey, held her iPhone like a baby bird with a bent wing.

I stared at the dark screen. The device was still on, but stuck somewhere between living technology and a dead iPhone. Tracey said that the device made a popping sound and got really hot in one corner while she was making a phone call. Then, her screen cracked, and burnt her ear. She wanted to know what to do. She explained the incident happened shortly after having third-party iPhone screen repair company iCracked replace her shattered iPhone 6 screen. iCracked was ready to let the original technician repair her phone again. I warned her against it. The phone was obviously dangerous—and letting them touch it again probably wouldn’t help. In fact, I thought it might hurt.

Getting past the "using a single anecdote to generalize to absolutely everyone," this still makes no sense. If we want to rule that any kind of repair/modification/tinkering shouldn't be allowed if it might not work right, well... there goes the entire DIY space. This weekend I repaired a broken toilet. It's entirely possible that I could have messed it up (in fact, I did at first, but after a couple of helpful YouTube videos, I got it figured out). Should I not have been allowed to do that? Should I have had to call a plumber who would have charged me $150 just to walk in the door? Lance Ulanoff apparently thinks that's the case. I'm not exactly a handyman, but over the years, I've repaired a ton of stuff in my house from broken dishwashers, computers, garage door openers and more. And sure, if I did it wrong, it could have been dangerous (that garage door opener, in particular, was pretty tricky). But do we really want to live in such a paternalistic society that we shouldn't even be allowed to do that? That seems to be the crux of Ulanoff's article.

Right-to-Repair? What a ridiculous thing to say. No one has the right to repair anything. You might have the skill to repair something (something that iCracked tech might've lacked). And you can hand people all the schematics, instructions, and parts you want and they still won’t be able to replace an iPhone battery or screen.

This is an even sillier argument. His complaint here is with the semantics. It's called a "right to repair," but of course no one's saying everyone will have the ability. The question is whether or not you can even try to repair something that you bought. It's really a question of property rights and whether or not you are breaking the law just trying to tinker with something.

The wonderful world of innovation we live in is built off of people tinkering. The computer industry that makes Mashable possible only exists because a bunch of people were tinkering with different devices and built multiple massive industries out of it. But, Ulanoff is effectively saying that all needs to stop now. Only approved sources can tinker.

Later, Ulanoff tries to clarify that he's fine with people being able to repair stuff... if it has moving parts.

It’s not that I don’t believe in better-built products and repairability. We need tightening against planned obsolescence cycles—TV sets that once lasted 25 years now fail after five. I’m also a tinkerer. I’ve taken apart everything from VCRs to BlackBerry Curve phones and their classic scroll buttons. When I see moving parts, I think: repairability. Today’s phones have almost no moving parts. At least the iPhone 6 had a moveable home button. The iPhone 7 and 7 Plus don’t even have that.

Why the distinction? Who the hell knows? Ulanoff never explains it beyond "when I see moving parts, I think: repairability." Well, good for you Lance. Have a cookie. Not everyone sees the world the way you do. Some people like -- for example -- replacing the significantly weakened battery on their phones so that they can make it last a lot longer. Some people like to replace their cracked screens rather than having to buy a new phone.

But, in the end, Ulanoff is really just concerned that you're too dumb and you're going to hurt yourself:

I think it’s a fair concern that Right-to-Repair laws could lead to an explosion of Radio Shack-like iPhone and Samsung electronics parts shops. Consumers will wander in with broken iPhone and Samsung Galaxy screens, and walk out with all the parts and tools they need to repair them. And they will fail, miserably.

Plus, what if a consumer's injured during a failed repair attempt? They slice open a finger on the cracked glass, or put it back together incorrectly, so the battery fails (and maybe even explodes). It’s the consumer’s fault, obviously, but they could also try to sue Apple or Samsung.

Try to sue? Sure. Succeed at suing? No. And, really, is that the big concern here? No one should be allowed to tinker with their own devices because they might fuck it up and sue Apple. What?

In the end, once again, this is a question of whether or not people actually own what they buy. Ulanoff, by default, seems to be saying they shouldn't be able to do so. Because they might hurt themselves. Because they're too dumb to know that glass might cut them if not handled properly. But is that really the job of our laws (including copyright law, which is a key component in blocking people from repairing their own phones...) to say "you can't fix or modify something you bought because you're an idiot"? Ulanoff, like in his silly article about not seeing any need for anonymity, apparently doesn't see any need for people to fix their own stuff. Even worse, because he doesn't see such a need for himself, or his friend Tracey, he's decided it's a-ok for the law to clamp down on people who actually are competent and actually are able to modify, tinker or repair products such as phones.

This seems like a very odd way for Mashable to create its opinion pieces: having some random dude extrapolate his own experiences to apply across everyone. Folks at Mashable responded to lots of people mocking Ulanoff's silly piece by pointing out that it also published a counterpoint. But, as I've noted, this is why I find point/counterpoint arguments so useless. It puts the two arguments on an equal footing and suggests that "welp, you decide." That's silly. Ulanoff's argument makes no sense and is based on nothing more than his own confusion about how the world works. Mashable should feel bad about publishing such an article and pretending it's legit journalism.

from the my-smart-toaster-just-destroyed-the-internet dept

For much of the last year, we've noted how the rush to connect everything from toasters to refrigerators to the internet -- without adequate (ok, any) security safeguards -- has resulted in a security, privacy and public safety crisis. At first, the fact that everything from Barbies to tea kettles were now hackable was kind of funny. But in the wake of the realization that these hacked devices are contributing to massive new DDoS botnet attacks (on top of just leaking your data or exposing you to hacks) the conversation has quickly turned serious.

Security researchers have been noting for a while that it's only a matter of time before the internet-of-not-so-smart-things contributes to human fatalities, potentially on a significant scale if critical infrastructure is attacked. As such, the Department of Homeland Security recently released what it called "strategic principles" for securing the Internet of Things, an apparent attempt to get the conversation started with industry on how best to avoid a dumb device cyber apocalypse.

Most of the principles are simple common sense, such as recommending that companies, oh, actually think about security a little bit during the product design phase. Other principles are a bit ironic given the government's behavior on other fronts, including the recommendation that companies implement encryption at the processor level for devices like the iPhone:

"Use hardware that incorporates security features to strengthen the protection and integrity of the device. For example, use computer chips that integrate security at the transistor level, embedded in the processor, and provide encryption and anonymity."

Again though, most of the recommendations are painfully basic, including actually "understanding what consequences could flow from the failure of a device," ensuring devices are more quickly and automatically updated, and engaging in "red teaming exercises" where employees probe devices for vulnerabilities before launch. Still, just getting some of this stuff in writing isn't a bad idea, given that most of the new IoT DDoS malware relies on something as stupid as not changing default login credentials. So there is value in just establishing some kind of core best practices (apparently incompetent) companies can look to.
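That default-credential failure mode is worth spelling out, because it really is that basic. As a rough illustration (the credential pairs and device names below are made up for the example, not a real database of factory defaults), the kind of audit a vendor or network admin could run is almost trivially simple:

```python
# A minimal sketch of the "don't ship/keep default logins" principle:
# flag any device whose configured login still matches a known
# factory-default credential pair -- exactly the pairs Mirai-style
# malware cycles through when it scans for new bots.

# Illustrative sample only; real default-credential lists run to dozens
# of entries per vendor.
DEFAULT_CREDENTIALS = {
    ("admin", "admin"),
    ("root", "root"),
    ("admin", "1234"),
    ("root", "default"),
}

def find_default_creds(inventory):
    """Return names of devices still using a factory-default login.

    `inventory` maps a device name to its (username, password) pair.
    """
    return [name for name, creds in inventory.items()
            if creds in DEFAULT_CREDENTIALS]

if __name__ == "__main__":
    devices = {
        "front-door-camera": ("admin", "admin"),    # never changed: flagged
        "thermostat":        ("homeowner", "x7!kQ"),
        "dvr":               ("root", "default"),   # never changed: flagged
    }
    for name in find_default_creds(devices):
        print(f"{name}: still using factory-default credentials")
```

The point isn't the code, which is deliberately simplistic; it's that a check this cheap was apparently too much to ask of the vendors whose devices ended up in the Mirai botnet.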

As such, the DHS is clear that this is just a "first step":

"These non-binding strategic principles are designed to enhance security of the IoT across a range of design, manufacturing, and deployment activities, and include relevant suggested practices for implementation. It is a first step to motivate and frame conversations about positive measures for IoT security among IoT developers, manufacturers, service providers, and the users who purchase and deploy the devices, services and systems. "

The problem of course is that voluntary guidelines are no guarantee that the companies involved will actually adhere to them. After all, these are companies (and IoT evangelists) that were so keen on selling hardware that they couldn't be bothered to do the bare minimum to secure their products or acknowledge this rising, obvious problem. As a result, you have hardware like the Jidetech 720p WiFi-enabled security camera, which security researcher Rob Graham noted this week can be hijacked by malware and conscripted into a botnet within five minutes of being unboxed. As Bruce Schneier explains, the market itself offers no incentive to fix any of this:

"An additional market failure illustrated by the Dyn attack is that neither the seller nor the buyer of those devices cares about fixing the vulnerability. The owners of those devices don't care. They wanted a webcam —­ or thermostat, or refrigerator ­— with nice features at a good price. Even after they were recruited into this botnet, they still work fine ­— you can't even tell they were used in the attack. The sellers of those devices don't care: They've already moved on to selling newer and better models. There is no market solution because the insecurity primarily affects other people. It's a form of invisible pollution."

That's certainly not going to be good news for the regulation-phobic, but Schneier argues the alternative is, quite literally, chaos:

"Regardless of what you think about regulation vs. market solutions, I believe there is no choice. Governments will get involved in the IoT, because the risks are too great and the stakes are too high. Computers are now able to affect our world in a direct and physical manner."

One problem of course is that U.S. regulation certainly won't help deter the rest of the world from creating internet-connected devices that can wreak havoc on vital infrastructure. There's also the very real concern that federal regulations would be crafted poorly, restricting sector innovation or consumers' freedom to tinker with their own devices. In fact, many of these devices have such abysmal interfaces and control systems that hacking and modifying them is in some instances the only path to actually securing them and controlling what traffic is being sent over the network.

As such, IoT regulation is going to be a debate that rages for several years, when it's not entirely clear we have several years to waste. In the interim, the only recourse left to consumers is to establish smart security practices in their own homes and businesses, and to continue naming and shaming IoT vendors that clearly prioritized profits over human lives and the health of the internet at large.

from the hello-dystopia dept

Forbes found a court filing, dated May 9, 2016, in which the Department of Justice sought to search a Lancaster, California, property. But there was a more remarkable aspect of the search, as pointed out in the memorandum: “authorization to depress the fingerprints and thumbprints of every person who is located at the SUBJECT PREMISES during the execution of the search and who is reasonably believed by law enforcement to be the user of a fingerprint sensor-enabled device that is located at the SUBJECT PREMISES and falls within the scope of the warrant.” The warrant was not available to the public, nor were other documents related to the case.

The memorandum goes on to point out that simply demanding fingerprints implicates neither the Fourth nor Fifth Amendments. But the additional permissions sought certainly do.

“While the government does not know ahead of time the identity of every digital device or fingerprint (or indeed, every other piece of evidence) that it will find in the search, it has demonstrated probable cause that evidence may exist at the search location, and needs the ability to gain access to those devices and maintain that access to search them. For that reason, the warrant authorizes the seizure of ‘passwords, encryption keys, and other access devices that may be necessary to access the device,’” the document read.

Not only are the devices being seized, but so are any passwords, which does carry some implications, but not necessarily at the point of seizure. It's the refusal to turn over passwords or encryption keys in the face of a court order that can result in contempt charges, and it's still less-than-settled that access information has no testimonial value.

But even the seizure of these devices in hopes of searching them later (but securing fingerprints to unlock them first) is a Fourth Amendment problem if they're accessed in nearly any way during the unlocking process. One court found, post-Riley, that simply opening a flip phone constituted a search. In that context, forcing a finger onto the phone and viewing the screen's contents could be considered a search -- and a warrantless one at that.

Of course, the government cited plenty of cases to back up its seizure, detention of residents, and its taking of fingerprints -- most of them at least 30 years old.

It also cited Holt v. United States, a 1910 case, and United States v. Dionisio, a 1973 case, though it did point to more recent cases, including Virginia v. Baust, where the defendant was compelled to provide his fingerprint to unlock a device (though Baust did provide his fingerprint, it failed to unlock the iPhone: after 48 hours without use, or after a reboot, Touch ID requires the passcode to be re-entered).

As for the Fourth, the feds said protections against unreasonable searches did not stand up when “the taking of fingerprints is supported by reasonable suspicion,” citing 1985′s Hayes v. Florida. Other cases, dated well before the advent of smartphones, were used to justify any brief detention that would arise from forcing someone to open their device with a fingerprint.

This is the reality of what the government is seeking: law enforcement officers detaining suspects and non-suspects alike and forcing them to apply their fingers to all locked devices on the premises. If this is the new normal for warrant service, it's time for the courts to step up and be a bit more aggressive in holding the government to particularity requirements.

from the unconfrontable-witnesses dept

Defendant Martell Chubbs currently faces murder charges for a 1977 cold case in which the only evidence against him is a DNA match by a proprietary computer program. Chubbs, who ran a small home-repair business at the time of his arrest, asked to inspect the software’s source code in order to challenge the accuracy of its results. Chubbs sought to determine whether the code properly implements established scientific procedures for DNA matching and if it operates the way its manufacturer claims. But the manufacturer argued that the defense attorney might steal or duplicate the code and cause the company to lose money. The court denied Chubbs’ request, leaving him free to examine the state’s expert witness but not the tool that the witness relied on.

That's a starkly mercenary stance to take. The "trade secret privilege" invoked here basically states that the company's potential loss of income outweighs a person's potential loss of freedom. It also asks for a level of trust it hasn't earned: that the software is as close to infallible as it needs to be. Cross-examination is next to useless when the software itself can't be examined.

Worse, this closed-off software operates in a field where nearly every previous form of "indisputable" evidence has proven to be severely flawed.

Studies have disputed the scientific validity of pattern matching in bite marks, arson, hair and fiber, shaken baby syndrome diagnoses, ballistics, dog-scent lineups, blood spatter evidence, and fingerprint matching. Massachusetts is struggling to handle the fallout from a crime laboratory technician’s forgery of results that tainted evidence in tens of thousands of criminal cases. And the Innocence Project reports that bad forensic science contributed to the wrongful convictions of 47 percent of exonerees.

Everything tied to securing convictions seems to suffer from pervasive flaws compounded by confirmation bias. For four decades, the DOJ presented hair analysis as a unique identifier on par with fingerprints or DNA when it wasn't. A 2014 Inspector General's report found the FBI still hadn't gotten around to correcting forensic lab issues it had pointed out nearly 20 years earlier. This contributed to two decades of "experts" providing testimony that greatly overstated the results of hair analysis. All of this happened in the FBI's closed system, a place outsiders aren't allowed to examine firsthand.

That's the IRL version. The software version is just as suspect. Computers aren't infallible and the people running them definitely aren't. If the software cannot be inspected, the statements of expert witnesses should be considered highly dubious. After all, most expert witnesses representing the government have a vested interest in portraying forensic evidence as bulletproof. Without access to forensic software code, no one will ever be able to prove them wrong.

If a piece of software has the ability to deprive a member of the public of their freedom, its code should be open for inspection by the defense. "Trade secrets" should not take precedence over the public's right to defend themselves in court. Even in the highly unlikely event that Chubbs' defense team copied the code and destroyed the company's future profits, the company would still have the ability to seek redress through the court system. After all, that's the line the government uses when it argues for expanded "good faith exceptions" or warrantless searches and seizures: "Hey, if we screw up, you can always sue."

The judicial system is a remedy for wrongs, both criminal and civil. What it shouldn't be is a protective haven where ridiculous assertions like those made here are used to prevent an accused person from learning more about the evidence being used to convict them.

from the heaping-piles-of-bullshit dept

Amazon has come up with a rather ingenious way to give its own Fire TV streaming video devices a leg up in the marketplace: stop selling major competing products. In a letter sent this week to the company's marketplace sellers, Amazon announced that it would no longer be allowing new listings for either the Chromecast or Apple TV starting today, and that existing retail stock of both products would be discontinued at the end of the month. Google of course just unveiled two new versions of the Chromecast, which has historically been outselling Amazon's own streaming devices.

Amusingly, Amazon unloads what has to be one of the larger piles of ambiguous bullshit in defense of an anti-competitive position seen in some time:

"Over the last three years, Prime Video has become an important part of Prime," Amazon said in the e-mail. "It’s important that the streaming media players we sell interact well with Prime Video in order to avoid customer confusion."

Hilarious. Except it's up to developers to embed Chromecast support into their services and apps, and both Google and Apple publish open software development kits that allow any application to work on both devices. In other words, it's Amazon's choice that Chromecast and Apple TV won't play nicely with Amazon Prime Instant Streaming. It has nothing to do with the devices not "interacting well" with Amazon's services. Bloomberg even helps prop up this nonsensical explanation by repeating the idea that Prime Video "doesn’t run easily on rival’s devices."

Meanwhile, on what planet exactly are you "avoiding customer confusion" by suddenly removing access to hugely popular, competing products?

Apparently other streaming competitors like the Roku and game consoles have yet to draw Amazon's ire and will remain on sale, for now. Obviously none of this is the end of the world, since Amazon controls just 1% of the overall retail market and these devices can be bought from a long list of retail alternatives, including Apple and Google themselves. Still, it's an utterly idiotic decision that's sure to invite antitrust scrutiny at worst, and an absolute shit storm of negative PR at best.

No government entity shall place, locate, or install an electronic device on the person or property of another, or obtain location information from such an electronic device, without a warrant issued by a judge based on probable cause and on a case-by-case basis.

As Watchdog.org points out, the spirit of the law is somewhat undermined by the letter of the law.

There are noteworthy exceptions, many of which appeared in previous iterations.

Tracking is permitted without a warrant with the informed consent of a device owner, unless the owner knowingly loaned it to a third party. You can track calls for 911 emergencies. A parent or legal guardian can provide informed consent to locate a missing child. The government can track its own property or employees in possession of that property. And alcohol ignition interlock control devices placed by court order would also be traceable without a warrant.

The other problem with the bill is a problem with all bills introduced by state legislators: it can't lock out federal intrusion, at least not in its present form. The bill states that it does not apply to "federal government agencies." So, if local law enforcement wants to engage in warrantless tracking of cellphones, all it has to do is partner up with a federal agency.

On top of that, there are the loopholes that have always been exploited. Stingray use -- one method of tracking location -- has routinely been hidden under more innocuous paperwork, like pen register orders. Obtaining cellphone records -- including location data -- is primarily done with subpoenas, considering most laws still treat these as third-party business records. While the law would force some of the latter requests to take the form of a search warrant, it doesn't make a clear distinction between real-time tracking and historical data.

What it does appear to outlaw is the warrantless, real-time tracking of GPS location, meaning tracking devices can only be deployed after obtaining a warrant. This is certainly a step forward, one perhaps partially prompted by the Supreme Court's US v. Jones decision. However, this would go against precedent in the First Circuit Court (which covers New Hampshire), which has found that warrantless GPS tracking devices may constitute a "search," but not to the extent that a lack of a warrant should automatically result in suppression of evidence. (Also somewhat aligned with the Supreme Court's reluctance to declare all GPS tracking worthy of a warrant.)

The court then held that it was reasonable for the agents to use the GPS device in Sparks’ case based upon reliance on clear precedent.

However, the court noted that they did not decide the issue of whether any exceptions to the warrant requirement exist for future installation use of the GPS device to monitor suspect’s movements. Therefore, future use of such GPS monitoring is governed under the United States v. Jones.

As such, the court of appeals affirmed the denial of the motion to suppress.

Although this case appeared before the judges after the Supreme Court's US v. Jones decision, the events of the case preceded that finding. This may change rulings in the future, but for now, the First Circuit has not made it expressly clear that tracking devices require warrants.

As the proposed law pertains to physical tracking devices, it's much more closely aligned with the Supreme Court's decision. Left unclear is its application to Stingray devices and obtaining historical cell site location information from telcos -- both forms of "tracking" that don't involve attaching a monitoring device to a "person or property."

from the awwwwww dept

Perhaps, like me, you've never really understood the curious ban some airlines have had on mobile and electronic devices during flights, take-offs, and landings. Perhaps, like our Jefe, Mike Masnick, you've dismissed the requests from flight attendants that those devices be fully powered down out of hand, because you too are a rebel the likes of which this world is wholly unprepared for. And maybe you too cheered when the FAA summarily dismissed these silly rules way back in 2013, thinking that the madness of a few moments without our favorite devices had finally come to an end.

But then, as you may know, the Association of Flight Attendants sued the FAA in order to retain the ability to lord over your smart-phones, tablets, and computers on flights. Notably, the AFA's filing made essentially zero claims having anything to do with the safety of electronic devices on the flights. Instead, their argument centered on whether the power to decide whether flight attendants could treat passengers like children who hadn't finished their vegetables resided with the FAA, or if the AFA should have some input.

In this case, it really does not matter whether Notice N8900.240 is viewed as a policy statement or an interpretive rule. The main point here is that the Notice is not a legislative rule carrying “the force and effect of law.” Perez, 135 S. Ct. at 1204. A legislative rule “modifies or adds to a legal norm based on the agency’s own authority” flowing from a congressional delegation to engage in supplementary lawmaking. Syncor, 127 F.3d at 95.

That's court-speak for "nice try, now go away." Of course the FAA can make changes to flight rules as it pleases and, when it comes to the use of devices whose ban has always been cast as a matter of flight safety, an association for flight attendants ought to have about as much input as a doctor's receptionist should have on medical policy. This tantrum of a suit, which is all it ever was, has been dismissed and we are finally free to play Angry Birds during takeoff. Free at last, free at last.

More seriously, it's somewhat nice to see some aspect of security theater being done away with regarding anything to do with airplanes and flights. If we could just take this same tack with the rest of airport security, we'd be making a world of improvements.

from the a-good-start dept

Last year, we wrote about Rep. Blake Farenthold introducing a small, but important piece of copyright legislation, the You Own Devices Act (YODA), which just says that if you buy some piece of computerized equipment, you can sell it with any included software, without having to get permission from the software provider. As we noted, the reality is that this is just making it clear that the first sale doctrine applies to computer equipment too -- which shouldn't need a new law, but some tech companies (especially in the networking space) feel otherwise.

Farenthold has now reintroduced YODA, this time with Rep. Jared Polis as a sponsor as well (giving the bill that necessary "bipartisan" shine). It's unfortunate that these kinds of bills are even necessary, but such is the state of copyright laws today that they often mean you don't even really own the devices you buy.

Also, kudos to Farenthold for playing on the YODA name in his tweet announcing the new version of the bill.