Kristoffer’s father, Robert Davies, started noticing that Kristoffer was logging into his Xbox Live account and playing video games that were off-limits. When prompted to enter a password, Kristoffer would enter a series of spaces and hit enter, gaining access to his father’s account.

"I was like yea!" Kristoffer told KGTV-10, a CNN affiliate, after breaking into his dad’s account.

Glee quickly turned to panic as it dawned on Kristoffer that his father would find out what he had done. Instead of being angry, though, Davies was intrigued; he works in online security himself.

"How awesome is that?" Davies said. "Just being five years old and being able to find a vulnerability and latch on to that. I thought that was pretty cool."

After Kristoffer showed his father what he did, Davies reported the issue to Microsoft.
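Microsoft never published the technical root cause, but a common class of bug fits the symptom: input that is normalized (trimmed) before validation can collapse to an empty string and fall into a branch that skips the check entirely. The sketch below is purely hypothetical; the `verify` function and its logic are invented for illustration and are not Microsoft's code.

```python
# Hypothetical sketch of how an all-spaces "password" can slip past
# a naive check. This is illustrative only, not the actual Xbox bug.

def verify(supplied: str, stored: str) -> bool:
    # BUG: stripping whitespace first means a run of spaces collapses
    # to "", which then takes the branch intended for "user pressed
    # enter without typing anything" and is waved through instead of
    # being rejected.
    supplied = supplied.strip()
    if supplied == "":
        return True          # should be: return False
    return supplied == stored

print(verify("        ", "hunter2"))  # → True: spaces bypass the check
```

The fix is to reject empty input outright and compare the raw string, never a normalized copy, against the stored credential.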

"We’re always listening to our customers and thank them for bringing issues to our attention," Microsoft said in a statement to KGTV-10. "We take security seriously at Xbox and fixed the issue as soon as we learned about it."

According to KGTV-10, Microsoft will give Kristoffer four games, a $50 gift card and a year-long subscription to Xbox Live.

Target might have been a tad negligent when it came to monitoring its security systems last year, according to a Thursday Businessweek report.

Months before hackers stole 40 million payment cards, among heaps of other information, at the end of 2013, the retail giant installed a $1.6 million malware detection system from security company FireEye that later picked up on the attackers’ suspicious activity – on multiple occasions.

Interviewing more than 10 former Target employees familiar with the company’s security and eight people with knowledge of the attack, Businessweek learned of an alert system that worked like a charm – the trouble was that nobody acted on what it found.

When asked to explain Target’s lack of response to those alerts, the company emailed Businessweek a variation of a media statement from CEO Gregg Steinhafel. It begins by noting that Target had been certified as meeting payment card industry (PCI) standards, then explains what the company has done since the breach.

In a Thursday email, Eric Chiu, president and co-founder of HyTrust, told SCMagazine.com that failing to respond to these types of alarms is shocking, but not so surprising for a company that, at the time, had security fairly low on its list of priorities.

“We often see organizations ignoring alarms like this because they’ve become numb to them, receiving too many false positives, or because they’re understaffed,” Chiu said. “You can have all the alarms you want, but unless you put security in a prominent position in the company and have enough staff to review them, those alarms don’t mean anything.”

Joe Schumacher, security consultant for Neohapsis, offered other reasons to SCMagazine.com in a Thursday email.

“I don’t think it is about not paying attention to the technologies as much as fine tuning for actionable, relevant information from the technology,” Schumacher said. “Many security systems (e.g. Web application firewall, log monitoring, Intrusion Detection/Prevention Systems, etc.) correlate large amounts of data into a single repository. Unfortunately, a lot of companies and professional services stop here.”

Security company RSA was paid $10 million to use the flawed Dual_EC_DRBG pseudorandom number generating algorithm as the default algorithm in its BSafe crypto library, according to sources speaking to Reuters.


The Dual_EC_DRBG algorithm is included in the NIST-approved crypto standard SP 800-90 and has been viewed with suspicion since shortly after its inclusion in the 2006 specification. In 2007, researchers from Microsoft showed that the algorithm could be backdoored: if certain relationships between numbers included within the algorithm were known to an attacker, then that attacker could predict all the numbers generated by the algorithm. These suspicions of backdooring seemed to be confirmed this September with the news that the National Security Agency had worked to undermine crypto standards.
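The 2007 observation can be illustrated with a toy analogue. The sketch below replaces elliptic-curve point multiplication with ordinary modular exponentiation, uses made-up constants, and omits the real generator's output truncation; it is not the actual NIST construction, only a demonstration of how a hidden relationship between the two public values lets an observer recover the internal state from a single output.

```python
# Toy analogue of the Dual_EC_DRBG backdoor. Exponentiation mod p
# stands in for elliptic-curve scalar multiplication; p, Q, and d
# are arbitrary demo values, not the real NIST constants.

p = 104729          # small prime modulus (demo only)
Q = 2               # public value Q
d = 12345           # the secret backdoor relation: P = Q^d mod p
P = pow(Q, d, p)

def dual_ec_step(state):
    """One round: emit an output derived from Q, advance the state with P."""
    output = pow(Q, state, p)      # r_i = Q^{s_i}   (published to users)
    next_state = pow(P, state, p)  # s_{i+1} = P^{s_i}  (kept secret)
    return output, next_state

# An honest user generates two "random" outputs.
s0 = 31337
r0, s1 = dual_ec_step(s0)
r1, s2 = dual_ec_step(s1)

# An attacker who knows d recovers the next internal state from a
# single observed output: r0^d = Q^(s0*d) = (Q^d)^s0 = P^s0 = s1.
recovered_s1 = pow(r0, d, p)
assert recovered_s1 == s1

# With the state in hand, every future output is predictable.
predicted_r1 = pow(Q, recovered_s1, p)
assert predicted_r1 == r1
print("attacker predicted the next output:", predicted_r1 == r1)
```

The key property is that `d` never appears in the public specification; anyone without it faces a discrete-logarithm problem, while whoever chose the constants can read the stream.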

The impact of this backdooring seemed low. The 2007 research, combined with Dual_EC_DRBG’s poor performance, meant that the algorithm was largely ignored. Most software didn’t implement it, and the software that did generally didn’t use it.

One exception to this was RSA’s BSafe library of cryptographic functions. With so much suspicion about Dual_EC_DRBG, RSA quickly recommended that BSafe users switch away from the use of Dual_EC_DRBG in favor of other pseudorandom number generation algorithms that its software supported. This raised the question of why RSA had taken the unusual decision to use the algorithm in the first place given the already widespread distrust surrounding it.

RSA said that it didn’t enable backdoors in its software and that the choice of Dual_EC_DRBG was essentially down to fashion: at the time that the algorithm was picked in 2004 (predating the NIST specification), RSA says that elliptic curves (the underlying mathematics on which Dual_EC_DRBG is built) had become “the rage” and were felt to “have advantages over other algorithms.”

Reuters’ report suggests that RSA wasn’t merely following trends when it picked the algorithm and that, contrary to its previous claims, the company had inserted the presumed backdoor at the behest of the spy agency. The $10 million that RSA is said to have been paid was more than a third of the annual revenue earned from the crypto library.

Other sources speaking to Reuters said that the government did not let on that it had backdoored the algorithm, presenting it instead as a technical advance.

“What exactly did you say you were trying to do again?” the ranger asked me as we stood on a seawall at Fort McHenry, taking turns winding in hundreds of feet of kite string attached to a nine-foot kite.

The kite, a nine-foot delta wing, had landed near a channel marker buoy and was now a nine-foot delta wing sea anchor. Tethered to it was a modified plastic food container encasing a very wet Android phone that was never intended to be a submersible. As we pulled the kite in, I asked myself the same thing—what the hell was I doing?

What I was trying to do was replicate what the military, government agencies, and private companies typically do with satellites, aircraft, and drones: get a bird’s-eye view of the Earth’s surface and create a photographic map.

Instead, my attempt at do-it-yourself aerial mapping quickly turned into a fiasco involving a squadron of US Park Service rangers, a few dozen puzzled tourists, and the flagpole that stood where the Star Spangled Banner once flew. I only mapped the limits of my own sense of humor, the patience of the National Park Service, and the contours of the bottom of the Patapsco River. An effort I had been planning since August was starting to look like a complete failure.

Officials from RSA Security are advising customers of the company’s BSAFE toolkit and Data Protection Manager to stop using a crucial cryptography component in the products that were recently revealed to contain a backdoor engineered by the National Security Agency (NSA).

An advisory sent to select RSA customers on Thursday confirms that both products by default use something known as Dual_EC_DRBG when creating cryptographic keys. The specification, which was approved in 2006 by the National Institute of Standards and Technology (NIST) and later by the International Organization for Standardization, contains a backdoor that was inserted by the NSA, The New York Times reported last week. RSA’s advisory came 24 hours after Ars asked the company if it intended to warn BSAFE customers about the deliberately crippled pseudorandom number generator (PRNG), which is so weak that it undermines the security of most or all cryptography systems that use it.

"To ensure a high level of assurance in their application, RSA strongly recommends that customers discontinue use of Dual EC DRBG and move to a different PRNG," the RSA advisory stated. "Technical guidance, including how to change the default PRNG in most libraries, is available in the most current product documentation" on RSA’s websites.

The BSAFE library is used to implement cryptographic functions in products, including at least some versions of the McAfee Firewall Enterprise Control Center, according to NIST certifications. The RSA Data Protection Manager is used to manage cryptographic keys. Confirmation that both use the backdoored RNG means that the protections of an untold number of third-party products may be bypassed not only by advanced intelligence agencies, but possibly by other adversaries with the resources to carry out attacks using specially designed hardware to quickly cycle through possible keys until the correct one is found.

McAfee representatives issued a statement confirming that the McAfee Firewall Enterprise Control Center 5.3.1 supported Dual_EC_DRBG, but only when deployed in federal government or government contractor environments, where its FIPS certification recommended it. The product uses a newer SHA-1-based PRNG in all other settings.

The NIST certification page lists dozens of other products that also use the weak RNG. Most of those appear to be one-off products. More significant is BSAFE’s embrace of the algorithm as its default RNG, because the toolkit can spawn a large number of derivative crypto systems that are then highly susceptible to being broken.

In the beginning …

From the beginning, Dual_EC_DRBG—short for Dual Elliptic Curve Deterministic Random Bit Generator—struck some cryptographers as an odd choice for one of NIST’s officially sanctioned RNGs. It was literally hundreds of times slower than typical RNGs, and its basis in "discrete logarithm" mathematics was highly unusual in production environments.

"I personally believed that it was some theoretical cryptographer’s pet project," one cryptographer who asked not to be named told Ars. "I envisioned a mathematician, annoyed at the lack of theoretical foundation in random number generation, badgering his way into an NIST standard."

A year after NIST approved the RNG as a standard, two Microsoft researchers devised an attack that allowed adversaries to guess any key created with the RNG with relatively little work.

Johns Hopkins professor Matt Green recounts that failing and a wealth of other peculiarities surrounding the embrace of Dual_EC_DRBG in an exhaustive technical analysis published Wednesday. Among them: when Dual_EC_DRBG was adopted, it had no security proof.

"In the course of proposing this complex and slow new PRNG where the only frigging reason you’d ever use the thing is for its security reduction, NIST forgot to provide one," Green wrote. "This is like selling someone a Mercedes and forgetting to include the hood ornament."

In an e-mail, RSA Chief of Technology Sam Curry defended the decision-making process that went into making the RNG the default way for BSAFE and Data Protection Manager to generate keys.

"The length of time that Dual_EC_DRBG takes can be seen as a virtue: it also slows down an attacker trying to guess the seed," he wrote. He continued:

Plenty of other crypto functions (PBKDF2, bcrypt, scrypt) will iterate a hash 1000 times specifically to make it slower. At the time, elliptic curves were in vogue and hash-based RNG was under scrutiny. The hope was that elliptic curve techniques—based as they are on number theory—would not suffer many of the same weaknesses as other techniques (like the FIPS 186 SHA-1 generator) that were seen as negative, and Dual_EC_DRBG was an accepted and publicly scrutinized standard. SP800-90 (which defines Dual EC DRBG) requires new features like continuous testing of the output, mandatory re-seeding, optional prediction resistance, and the ability to configure for different strengths.
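Curry's point about deliberate slowness can be seen with a standard key-derivation call. The snippet below is a generic Python illustration, not RSA BSAFE code: a single hash is nearly free, while PBKDF2's iterated hashing makes each password guess proportionally more expensive. The iteration count of 200,000 is an arbitrary demo value.

```python
import hashlib
import os
import time

# Illustration of key stretching: iterating a hash many times to
# slow down brute-force guessing, the design goal Curry invokes.
password = b"correct horse battery staple"
salt = os.urandom(16)

# A single SHA-256 pass is nearly instant...
fast = hashlib.sha256(salt + password).digest()

# ...while 200,000 PBKDF2 iterations cost an attacker the same
# factor of work per guess that they cost the defender once.
start = time.perf_counter()
slow = hashlib.pbkdf2_hmac("sha256", password, salt, 200_000)
elapsed = time.perf_counter() - start

print(f"derived {len(slow)}-byte key in {elapsed:.3f}s")
```

The crucial difference from Dual_EC_DRBG is that PBKDF2's slowness buys a provable increase in guessing cost, whereas the RNG's slowness bought nothing the 2007 attack didn't take away.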

It will take time for people to ferret out all the products that use Dual_EC_DRBG, particularly as the sole or default RNG. Readers who know of others are invited to leave that information in a comment to this post.

This McClatchy piece (written by some of the same people who got the Iraq war run-up story so right while everyone else got it wrong) is as chilling to me as anything we’ve heard over the past few weeks about the NSA spying. In fact, it may be worse:

Even before a former U.S. intelligence contractor exposed the secret collection of Americans’ phone records, the Obama administration was pressing a government-wide crackdown on security threats that requires federal employees to keep closer tabs on their co-workers and exhorts managers to punish those who fail to report their suspicions.

President Barack Obama’s unprecedented initiative, known as the Insider Threat Program, is sweeping in its reach. It has received scant public attention even though it extends beyond the U.S. national security bureaucracies to most federal departments and agencies nationwide, including the Peace Corps, the Social Security Administration and the Education and Agriculture departments. It emphasizes leaks of classified material, but catchall definitions of “insider threat” give agencies latitude to pursue and penalize a range of other conduct.

Government documents reviewed by McClatchy illustrate how some agencies are using that latitude to pursue unauthorized disclosures of any information, not just classified material. They also show how millions of federal employees and contractors must watch for “high-risk persons or behaviors” among co-workers and could face penalties, including criminal charges, for failing to report them. Leaks to the media are equated with espionage.

“Hammer this fact home . . . leaking is tantamount to aiding the enemies of the United States,” says a June 1, 2012, Defense Department strategy for the program that was obtained by McClatchy.

When the free press, explicitly protected in the Bill of Rights, becomes equivalent to an "enemy of the United States," something very, very bad is happening. The administration says it’s doing this to protect national security and that it is willing to protect those who blow the whistle on waste, fraud and abuse. But that is not how the effects of this sort of program will be felt. After all, it’s being implemented across the federal government, not just in national security agencies:

The program could make it easier for the government to stifle the flow of unclassified and potentially vital information to the public, while creating toxic work environments poisoned by unfounded suspicions and spurious investigations of loyal Americans, according to these current and former officials and experts. Some non-intelligence agencies already are urging employees to watch their co-workers for “indicators” that include stress, divorce and financial problems.

“It was just a matter of time before the Department of Agriculture or the FDA (Food and Drug Administration) started implementing, ‘Hey, let’s get people to snitch on their friends.’ The only thing they haven’t done here is reward it,” said Kel McClanahan, a Washington lawyer who specializes in national security law. “I’m waiting for the time when you turn in a friend and you get a $50 reward.”

The Defense Department anti-leak strategy obtained by McClatchy spells out a zero-tolerance policy. Security managers, it says, “must” reprimand or revoke the security clearances – a career-killing penalty – of workers who commit a single severe infraction or multiple lesser breaches “as an unavoidable negative personnel action.” Employees must turn themselves and others in for failing to report breaches. “Penalize clearly identifiable failures to report security infractions and violations, including any lack of self-reporting,” the strategic plan says.

The Obama administration already was pursuing an unprecedented number of leak prosecutions, and some in Congress – long one of the most prolific spillers of secrets – favor tightening restrictions on reporters’ access to federal agencies, making many U.S. officials reluctant to even disclose unclassified matters to the public.

The policy, which partly relies on behavior profiles, also could discourage creative thinking and fuel conformist “group think” of the kind that was blamed for the CIA’s erroneous assessment that Iraq was hiding weapons of mass destruction, a judgment that underpinned the 2003 U.S. invasion.

I don’t know about you, but that does not sound like freedom. In fact, it sounds like something else entirely to me. This government paranoia and informant culture is about as corrosive to the idea of freedom as it gets. The workplace is already rife with petty jealousies and singular ambition; it’s a human organization, after all. Adding in this sort of incentive structure is pretty much setting up a system for intimidation and abuse. And, as with all informant systems, especially ones that "profile" for certain behaviors deemed to be a threat to the state, only the most conformist will thrive. It’s a recipe for disaster if one is looking for any kind of dynamic, creative thinking. Clearly, that is the last thing these creepy bureaucrats want. This is the direct result of a culture of secrecy that seems to be pervading the federal government under President Obama. He is not the first president to expand the national security state, nor is he responsible for the bipartisan consensus on national security or the ongoing influence of the Military Industrial Complex. This, however, is different. And he should be held individually to account for this policy:

Administration officials say the program could help ensure that agencies catch a wide array of threats, especially if employees are properly trained in recognizing behavior that identifies potential security risks.

“If this is done correctly, an organization can get to a person who is having personal issues or problems that if not addressed by a variety of social means may lead that individual to violence, theft or espionage before it even gets to that point,” said a senior Pentagon official, who requested anonymity because he wasn’t authorized to discuss the issue publicly. […]

“If the folks who are watching within an organization for that insider threat – the lawyers, security officials and psychologists – can figure out that an individual is having money problems or decreased work performance and that person may be starting to come into the window of being an insider threat, superiors can then approach them and try to remove that stress before they become a threat to the organization,” the Pentagon official said.

The program, however, gives agencies such wide latitude in crafting their responses to insider threats that someone deemed a risk in one agency could be characterized as harmless in another. Even inside an agency, one manager’s disgruntled employee might become another’s threat to national security. Obama in November approved “minimum standards” giving departments and agencies considerable leeway in developing their insider threat programs, leading to a potential hodgepodge of interpretations. He instructed them to not only root out leakers but people who might be prone to “violent acts against the government or the nation” and “potential espionage.”

The Pentagon established its own sweeping definition of an insider threat as an employee with a clearance who “wittingly or unwittingly” harms “national security interests” through “unauthorized disclosure, data modification, espionage, terrorism, or kinetic actions resulting in loss or degradation of resources or capabilities.”

“An argument can be made that the rape of military personnel represents an insider threat. Nobody has a model of what this insider threat stuff is supposed to look like,” said the senior Pentagon official, explaining that inside the Defense Department “there are a lot of chiefs with their own agendas but no leadership.”

The Department of Education, meanwhile, informs employees that co-workers going through “certain life experiences . . . might turn a trusted user into an insider threat.” Those experiences, the department says in a computer training manual, include “stress, divorce, financial problems” or “frustrations with co-workers or the organization.” An online tutorial titled “Treason 101” teaches Department of Agriculture and National Oceanic and Atmospheric Administration employees to recognize the psychological profile of spies.

A Defense Security Service online pamphlet lists a wide range of “reportable” suspicious behaviors, including working outside of normal duty hours. While conceding that not every behavior “represents a spy in our midst,” the pamphlet adds that “every situation needs to be examined to determine whether our nation’s secrets are at risk.”

The Defense Department, traditionally a leading source of media leaks, is still setting up its program, but it has taken numerous steps. They include creating a unit that reviews news reports every day for leaks of classified defense information and implementing new training courses to teach employees how to recognize security risks, including “high-risk” and “disruptive” behaviors among co-workers, according to Defense Department documents reviewed by McClatchy.

“It’s about people’s profiles, their approach to work, how they interact with management. Are they cheery? Are they looking at Salon.com or The Onion during their lunch break? This is about ‘The Stepford Wives,’” said a second senior Pentagon official, referring to online publications and a 1975 movie about robotically docile housewives. The official said he wanted to remain anonymous to avoid being punished for criticizing the program.

The emphasis on certain behaviors reminded Greenstein of her employee orientation with the CIA, when she was told to be suspicious of unhappy co-workers. “If someone was having a bad day, the message was watch out for them,” she said.

Some federal agencies also are using the effort to protect a broader range of information. The Army orders its personnel to report unauthorized disclosures of unclassified information, including details concerning military facilities, activities and personnel. The Peace Corps, which is in the midst of implementing its program, “takes very seriously the obligation to protect sensitive information,” said an email from a Peace Corps official who insisted on anonymity but gave no reason for doing so.

Granting wide discretion is dangerous, some experts and officials warned, when federal agencies are already prone to overreach in their efforts to control information flow. The Bush administration allegedly tried to silence two former government climate change experts from speaking publicly on the dangers of global warming. More recently, the FDA justified the monitoring of the personal email of its scientists and doctors as a way to detect leaks of unclassified information.

Maybe this is just another way of reducing the federal workforce. Nobody normal should want to work there. When the Department of Education is searching for "insider threats" something’s gone very wrong.

The Guardian released an interview today with the man who has been the paper’s source for a few now-infamous leaked documents that revealed a vast dragnet maintained by the NSA for gathering information on communications in America. That source is Edward Snowden, 29, an employee of American defense contractor Booz Allen Hamilton and a former technical assistant for the CIA.

When The Guardian published a leaked document on Wednesday of last week showing a FISA court order granting the NSA power to collect metadata on the phone calls of all of Verizon’s customers over a period of three months, it became one of the biggest exposures of privacy-invading action taken by the government without the public’s knowledge.

That is, until the next day, when The Guardian and The Washington Post revealed slides pertaining to another NSA project called PRISM, which apparently gathered vast swaths of information on users of Google services, Facebook, Apple, and more. While the companies named in the PRISM slides have all denied participation in such a program, President Obama and a number of senators confirmed the collection of phone call metadata on Friday.

Snowden, it seems, was prepared to have his leaked documents blow up in the news and chose to expose himself. "I have no intention of hiding who I am because I know I have done nothing wrong," he told The Guardian’s Glenn Greenwald. Still, Snowden knows that he will probably be made to suffer for leaking the documents he did. As The Guardian writes:

Having watched the Obama administration prosecute whistleblowers at a historically unprecedented rate, he fully expects the US government to attempt to use all its weight to punish him. "I am not afraid," he said calmly, "because this is the choice I’ve made."

The 29-year-old hails from Elizabeth City, North Carolina, attended community college in Maryland (where he studied computing but never completed the coursework), and enlisted in the Army in 2003. After he broke both his legs in a training accident, he worked as a security guard at an NSA facility in Maryland and then moved to the CIA, working on IT security. He rose through the ranks quickly after showing considerable talent for the work.

As Snowden, who sports Electronic Frontier Foundation and Tor Project stickers on his laptop, tells it, he started identifying abuses of privacy early on, but he remained quiet as the Obama Administration came into office, believing that the abuses would be checked. But, Snowden told the Guardian, he "watched as Obama advanced the very policies that I thought would be reined in," and decided to act.

On May 20, he told his NSA supervisor that he needed to take a few weeks off to treat his epilepsy and went to Hong Kong where he has been living in a hotel ever since. As The Guardian reports:

He has had "a very comfortable life" that included a salary of roughly $200,000, a girlfriend with whom he shared a home in Hawaii, a stable career, and a family he loves. "I’m willing to sacrifice all of that because I can’t in good conscience allow the US government to destroy privacy, internet freedom and basic liberties for people around the world with this massive surveillance machine they’re secretly building."

According to Greenwald’s account, Snowden showed no remorse or sadness about his actions, except for when reflecting on the fate of his family, “The only thing I fear is the harmful effects on my family, who I won’t be able to help any more,” Snowden told The Guardian. But, he said, what the NSA is doing poses "an existential threat to democracy" because “the government has granted itself power it is not entitled to. There is no public oversight. The result is people like myself have the latitude to go further than they are allowed to.”

When asked about his future, Snowden acknowledged that he might be extradited, or captured by the CIA, or maligned as aiding China because he chose to ensconce himself in Hong Kong after leaving the US. To the latter concern, Snowden is quoted as saying, "There are more important things than money. If I were motivated by money, I could have sold these documents to any number of countries and gotten very rich." To the former fears, The Guardian reports that Snowden hopes to find asylum in a privacy-friendly country like Iceland, but is prepared for the consequences if that does not happen.

On whether he sees himself as akin to well-known leaker of documents Bradley Manning, Snowden draws something of a distinction: “I carefully evaluated every single document I disclosed to ensure that each was legitimately in the public interest," he said. "There are all sorts of documents that would have made a big impact that I didn’t turn over, because harming people isn’t my goal. Transparency is."

Read the whole article and watch a video interview with Snowden on The Guardian’s website.