Studying April Fools hoax news stories could offer clues to spotting ‘fake news’ articles, new research reveals. Researchers interested in deception have compared the language used in written April Fools hoaxes with that used in fake news stories.

A new study shows that when it comes to sharing emergency information during natural disasters, timing is everything. The study on Twitter use during hurricanes, floods and tornadoes offers potentially life-saving data about how information is disseminated in emergency situations, and by whom. Unlikely heroes often emerge in disasters, and the same is true on social media.

A British oversight board has slammed the Chinese telecom giant Huawei for software security flaws. The report, however, stopped short of blaming Chinese intelligence agencies for the engineering defects. The United States is concerned that Huawei is a front for the Chinese intelligence services, and that rolling out Huawei’s 5G system in Europe would open the door for Chinese spying or sabotage.

Just before his shooting spree at two Christchurch, New Zealand mosques, the alleged mass murderer posted a hate-filled manifesto on several file-sharing sites. Soon, the widespread adoption of artificial intelligence on platforms and decentralized tools like IPFS will mean that the online hate landscape will change. Combating online extremism in the future may be less about “meme wars” and user-banning, or “de-platforming,” and could instead look like the attack-and-defend, cat-and-mouse technical one-upmanship that has defined the cybersecurity industry since the 1980s. No matter what technical challenges come up, one fact never changes: The world will always need more good, smart people working to counter hate than there are promoting it.

Growing drone use in populated areas poses significant risks: without additional safeguards, drones could be exploited by malicious entities for use in cyberattacks, terrorism, crime and invasion of privacy.

On Saturday afternoon, Attorney General William Barr sent Congress his “principal conclusions” of the Mueller report. Barr quotes the Mueller report to say that “[T]he investigation did not establish that members of the Trump Campaign conspired or coordinated with the Russian government in its election interference activities.” The Mueller report does not take a position on whether or not Trump engaged in obstruction of justice. Barr writes: “The Special Counsel… did not draw a conclusion — one way or the other — as to whether the examined conduct constituted obstruction.” The AG quotes the report to say that “while this report does not conclude that the President committed a crime, it also does not exonerate him.”

The shocking mass shooting in Christchurch last Friday is notable for using livestreaming video technology to broadcast horrific first-person footage of the shooting on social media. The use of social media technology and livestreaming marks the attack as different from many other terrorist incidents. It is a form of violent “performance crime.” That is, the video streaming is a central component of the violence itself; it is not somehow incidental to the crime, or a disgusting trophy for the perpetrator to re-watch later. In an era of social media, which is driven in large part by spectacle, we all have a role to play in ensuring that terrorists aren’t rewarded for their crimes with our clicks.

A study found that Russian trolls and bots have been spreading false information about vaccination, in support of the anti-vaccination movement. The false information was generated by propaganda and disinformation specialists at the Kremlin-affiliated, St. Petersburg-based Internet Research Agency (IRA). The Kremlin employed the IRA to conduct a broad social media disinformation campaign to sow discord and deepen divisions in the United States, and to help Donald Trump win the 2016 presidential election.

Cybercrimes reached a six-year high in 2017, when more than 300,000 people in the United States fell victim to such crimes. Losses topped $1.2 billion. Cybercriminals can run, but they cannot hide from their digital fingerprints.

A thriving marketplace for SSL and TLS certificates—small data files used to facilitate confidential communication between organizations’ servers and their clients’ computers—exists on a hidden part of the internet.
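The certificates in question are the files a browser checks when it opens an HTTPS connection. As an illustration not drawn from the article, the sketch below uses Python's standard `ssl` and `socket` modules to fetch and read a server's certificate; the host name is a placeholder.

```python
import socket
import ssl

def get_certificate_info(host: str, port: int = 443) -> dict:
    """Open a TLS connection to the host and return the peer certificate
    as a dict (subject, issuer, validity dates, etc.)."""
    context = ssl.create_default_context()  # verifies against trusted CAs
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()

# Example usage (requires network access):
# cert = get_certificate_info("example.com")
# cert["subject"]  -> who the certificate identifies
# cert["issuer"]   -> which certificate authority vouches for it
# cert["notAfter"] -> when the certificate expires
```

A certificate stolen or fraudulently issued through such a marketplace would pass exactly this kind of check, which is what makes the trade valuable to attackers.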

The ADL and the Network Contagion Research Institute will partner to produce a series of reports that take an in-depth look into how extremism and hate spread on social media – and provide recommendations on how to combat both.

Monitoring hateful content is always difficult, and even the most advanced systems accidentally miss some. But during terrorist attacks the big platforms face particularly significant challenges. As research has shown, terrorist attacks precipitate huge spikes in online hate, overrunning platforms’ reporting systems. Many of the people who upload and share this content also know how to deceive the platforms and get round their existing checks. So what can platforms do to take down extremist and hateful content immediately after terrorist attacks? I propose four special measures needed to specifically target the short-term influx of hate.

It is a phenomenon familiar to almost all of us: you browse the web and suddenly your computer slows down and runs loudly. This could be due to so-called crypto mining, the covert use of a computer’s processing power to generate cryptocurrencies without the user’s knowledge. New software, called “CoinEater,” blocks crypto mining.

On the final page of his 35-page dossier, former British intelligence officer Christopher Steele refers to a company, whose name is redacted, that allegedly was used to hack the Democratic party. Today, the New York Times identifies the company and its owner, Aleksej Gubarev, and says that according to a newly revealed report, the allegations against the Russian technology entrepreneur’s operations check out.

The long view

Russia’s attack on American elections in 2016, described in Special Counsel Robert Mueller’s recent report as “sweeping and systematic,” came as a shock to many. It shouldn’t have. Experts had been warning of the danger of foreign meddling in U.S. elections for years. Already by 2016, the wholesale adoption of computerized voting had weakened safeguards against interference and left the United States vulnerable to an attack. So, too, the shift to digital media and communications had opened new gaps in security and the law that could be used for manipulation and blackmail.

When former U.S. Special Counsel Robert Mueller testified before the House Intelligence Committee last week about his investigation into Russian interference in the 2016 presidential election, some saw his comments about Moscow’s ongoing meddling attempts as the most important statement of the day. “It wasn’t a single attempt,” he said when asked about the spread of disinformation and whether Moscow would replicate the efforts again. “They’re doing it as we sit here and they expect to do it during the next campaign.” It’s not clear, however, who can or will lead the charge in this “war on disinformation.” Even as experts say the problem is worsening, it is unlikely that the current divided government could produce anything close to a solution.

A little-known science fiction book penned by the late father of U.S. Attorney General William Barr is being sold online at astronomical prices by sellers eager to attract Jeffrey Epstein conspiracy theorists. Space Relations: A Slightly Gothic Interplanetary Tale by Donald Barr has been thrust into the spotlight in the wake of the convicted pedophile’s apparent suicide, and eBay sellers — quick to link the two men — are now hawking it for as much as $4,999.

The preliminary results of Facebook’s long-awaited “bias” audit are out. The key takeaway? Everyone is still unhappy. The report is little more than a formalized catalog of six categories of grievances aired in Republican-led congressional hearings over the past two years. It doesn’t include any real quantitative assessment of bias. There are no statistics assessing the millions of moderation decisions that Facebook and Instagram make each day. The results are all the more remarkable because the audit was an exhaustive affair, the fruit of about a year of research led by former Republican Sen. Jon Kyl, encompassing interviews with scores of conservative lawmakers and organizations. “Despite the time and energy invested, the conspicuous absence of evidence within the audit suggests what many media researchers already knew: Allegations of political bias are political theater,” Renee DiResta writes.

The Israeli-Palestinian conflict has long been a global battle, fought by hundreds of proxies in dozens of national capitals by way of political, economic, and cultural pressure. As the internet has evolved, so have the tools used to wage this information struggle. The latest innovation — a pro-Israel smartphone app that seeds and amplifies pro-Israel messages across social media — saw its first major test in May 2019. It offered a glimpse of the novel methods by which future influence campaigns will be conducted and information wars won.

Caution and restraint are not known as the hallmarks of the digital revolution. Especially when there’s the admirable possibility of increasing participation by going digital, the temptation to do so is strong—and rarely resisted. But a decision reportedly taken by the Democratic National Committee presents a significant display of caution that deserves both attention and praise. “Showing restraint usually isn’t exciting or flashy,” Joshua Geltzer writes. “But it can be admirable. And, here, organizations like the DNC that take these steps deserve our collective applause for erring on the side of caution, especially in a world replete with cybersecurity and election interference threats.”