Posted by BeauHD on Wednesday December 13, 2017 @09:10PM
from the partly-cloudy-with-a-chance-of-rain dept.

According to Reuters, the White House said Wednesday the U.S. government needs a major overhaul of information technology systems and should take steps to better protect data and accelerate efforts to use cloud-based technology. The report outlined a timeline over the next year for IT reforms and a detailed implementation plan. From the report: The report said the federal government must eliminate barriers to using commercial cloud-based technology. "Federal agencies must consolidate their IT investments and place more trust in services and infrastructure operated by others," the report found. Government agencies often pay dramatically different prices for the same IT item, the report said, sometimes three or four times as much. A 2016 U.S. Government Accountability Office report estimated the U.S. government spends more than $80 billion on IT annually but said spending has fallen by $7.3 billion since 2010. In 2015, there were at least 7,000 separate IT investments by the U.S. government. The $80 billion figure does not include Defense Department classified IT systems and 58 independent executive branch agencies, including the Central Intelligence Agency. The GAO report found some agencies are using systems that have components that are at least 50 years old.

Posted by BeauHD on Wednesday December 13, 2017 @08:30PM
from the reverse-engineered dept.

Greg Synek reports via TechSpot: To help with the reverse engineering of malware, Avast has released an open-source version of its machine-code decompiler, RetDec, that has been under development for over seven years. RetDec supports a variety of architectures aside from those used on traditional desktops including ARM, PIC32, PowerPC and MIPS. As Internet of Things devices proliferate throughout our homes and inside private businesses, being able to effectively analyze the code running on all of these new devices becomes a necessity to ensure security. In addition to the open-source version found on GitHub, RetDec is also being provided as a web service.

Simply upload a supported executable or machine code and get a reasonably rebuilt version of the source code. It is not possible to retrieve the exact original code of any executable compiled to machine code, but obtaining a working or almost-working copy of equivalent code can greatly expedite the reverse engineering of software. For any curious developers out there, a REST API is also provided to allow third-party applications to use the decompilation service. A plugin for the IDA disassembler is also available for those experienced with decompiling software.
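As a rough illustration of what driving such a decompilation web service from a third-party application might look like, here is a minimal sketch using only Python's standard library. The endpoint URL, path, and authorization header below are hypothetical placeholders for illustration, not RetDec's documented API.

```python
import urllib.request

# Hypothetical endpoint for a RetDec-like decompilation service (assumed, not real).
API_ROOT = "https://retdec.example.com/api"

def build_decompile_request(binary: bytes, api_key: str) -> urllib.request.Request:
    """Build the upload request that submits a binary for decompilation.

    A real client would send this request, poll the returned job ID until the
    decompilation finishes, then fetch the generated high-level source output.
    """
    return urllib.request.Request(
        f"{API_ROOT}/decompilations",
        data=binary,                       # raw machine code or executable bytes
        method="POST",
        headers={
            "Authorization": f"Bearer {api_key}",   # assumed auth scheme
            "Content-Type": "application/octet-stream",
        },
    )

# Example: prepare an upload of a (truncated) ELF header.
req = build_decompile_request(b"\x7fELF", "demo-key")
```

The request-building step is separated from the network call so the submission logic can be inspected or tested without contacting a live service.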

Posted by BeauHD on Wednesday December 13, 2017 @07:50PM
from the cord-cutting dept.

T-Mobile has revealed that it's launching a TV service in 2018, and that it has acquired Layer3 TV (a company that integrates TV, streaming and social networking) to make this happen. The company thinks people are ditching cable due to the providers, not TV itself. Engadget reports: It claims that it can "uncarrier" TV the way it did with wireless service, and has already targeted a few areas it thinks it can fix: it doesn't like the years-long contracts, bloated bundles, outdated tech and poor customer service that are staples of TV service in the U.S. T-Mobile hasn't gone into detail about the functionality of the service yet. How will it be delivered? How much will it cost? Where will it be available? And will this affect the company's free Netflix offer? This is more a declaration of intent than a concrete roadmap, so it's far from certain that the company will live up to its promises. Ultimately, the move represents a big bet on T-Mobile's part: that people like TV and are cutting the cord based on a disdain for the companies, not the service. There's a degree of truth to that when many Americans are all too familiar with paying ever-increasing rates to get hundreds of channels they don't watch. However, there's no guarantee that it'll work in an era when many people (particularly younger people) are more likely to use Netflix, YouTube or a streaming TV service like Sling TV.

In a particularly dystopian move, it seems that the San Francisco SPCA adorned the robot it was renting with stickers of cute kittens and puppies, according to Business Insider, as it was used to shoo away the homeless from near its office. San Francisco recently voted to cut down on the number of robots that roam the streets of the city, which has seen an influx of small delivery robots in recent years. The city said it would issue the SPCA a fine of $1,000 per day for illegally operating on a public right-of-way if it continued to use the security robot outside its premises, the San Francisco Business Times said.

Posted by BeauHD on Wednesday December 13, 2017 @06:30PM
from the new-methodology dept.

AT&T has started trials to deliver high-speed internet over power lines. The company announced the news on Wednesday and said that trials have started in Georgia state and a non-U.S. location. Reuters reports: AT&T aims to eventually deliver speeds faster than the 1 gigabit per second consumers can currently get through fiber internet service using high-frequency airwaves that travel along power lines. While the Georgia trial is in a rural area, the service could potentially be deployed in suburbs and cities, the company said in a statement. AT&T said it had no timeline for commercial deployment and that it would look to expand trials as it develops the technology.

"We think this product is eventually one that could actually serve anywhere near a power line," said Marachel Knight, AT&T's senior vice president of wireless network architecture and design, in an interview. She added that AT&T chose an international trial location in part because the market opportunity extends beyond the United States.

Posted by BeauHD on Wednesday December 13, 2017 @05:50PM
from the sorry-not-sorry dept.

Patreon has decided to halt its plans to add a service fee to patrons' pledges, a proposed update that angered many users. "We're going to press pause," CEO Jack Conte tells The Verge. "Folks have been adamant about the problems with the new system, and so basically, we have to solve those problems first." The company plans to work with creators on a plan that will solve issues with the current payment system, but won't create major new problems in their stead. From the report: Conte published a blog post laying out the core problems, alongside an apology. "Many of you lost patrons, and you lost income. No apology will make up for that, but nevertheless, I'm sorry," it reads. "We recognize that we need to be better at involving you more deeply and earlier in these kinds of decisions and product changes. Additionally, we need to give you a more flexible product and platform to allow you to own the way you run your memberships. I know it will take a long time for us to earn back your trust. But we are utterly devoted to your success and to getting you sustainable, reliable income for being a creator."

Conte says that any new system will need to take the popularity of small pledges into account, and preserve the benefits of aggregation. It will also need to give artists more autonomy, rather than announcing sweeping changes directly to users. "The overwhelming sentiment was that we overstepped our bounds" with the non-negotiable fee, he says. "I agree, we messed that up. We put ourselves between the creator and their fans and we basically told them how to run their business, and that's not okay." Webcomic creator Jeph Jacques previously quoted Conte as saying Patreon "absolutely fucked up that rollout."

Posted by BeauHD on Wednesday December 13, 2017 @05:10PM
from the no-hard-feelings dept.

An anonymous reader quotes a report from BBC: Google is deepening its push into artificial intelligence (AI) by opening a research center in China, even though its search services remain blocked in the country. Google said the facility would be the first of its kind in Asia and would aim to employ local talent. In a blog post on the company's website, Google said the new research center was an important part of its mission as an "AI first company." "Whether a breakthrough occurs in Silicon Valley, Beijing or anywhere else, [AI] has the potential to make everyone's life better for the entire world," said Fei-Fei Li, chief scientist at Google Cloud AI and Machine Learning. The research center, which joins similar facilities in London, New York, Toronto and Zurich, will be run by a small team from its existing office in Beijing. The tech giant operates two offices in China, with roughly half of its 600 employees working on global products, company spokesperson Taj Meadows told the AFP news agency. But Google's search engine and a number of other services are banned in China. The country has imposed increasingly strict rules on foreign companies over the past year, including new censorship restrictions.

Posted by msmash on Wednesday December 13, 2017 @04:29PM
from the interesting-comparisons dept.

An anonymous reader shares a report (condensed for space): Online streaming is a win for the environment. Streaming music eliminates all that physical material -- CDs, jewel cases, cellophane, shipping boxes, fuel -- and can reduce carbon-dioxide emissions by 40 percent or more. Scientists who analyze the environmental impact of the internet tout the benefits of this "dematerialization," observing that energy use and carbon-dioxide emissions will drop as media increasingly can be delivered over the internet. But this theory might have a major exception: porn. Since the turn of the century, the pornography industry has experienced two intense hikes in popularity. In the early 2000s, broadband enabled higher download speeds. Then, in 2008, the advent of so-called tube sites allowed users to watch clips for free, like people watch videos on YouTube. Adam Grayson, the chief financial officer of the adult company Evil Angel, calls the latter hike "the great mushroom-cloud porn explosion of 2008." Precise numbers don't exist to quantify specifics, but the impression across the industry is that viewership is way, way up. Pornhub, the world's most popular porn site, provides some of the only accessible data on its yearly web-traffic report. The first Year In Review post in 2013 tabulated 14.7 billion visits to the site. By 2016, the number of visits had almost doubled, to 23 billion, and those visitors watched more than 4.59 billion hours of porn. And Pornhub is just one site. Using a formula that Netflix published on its blog in 2015, Nathan Ensmenger, a professor at Indiana University who is writing a book about the environmental history of the computer, calculates that if Pornhub streams video as efficiently as Netflix (0.0013 kWh per streaming hour), it used 5.967 million kWh in 2016. For comparison, that's about the same amount of energy 11,000 light bulbs would use if left on for a year.
And operating with Netflix's efficiency would be a best-case scenario for the porn site, Ensmenger believes.
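Ensmenger's estimate is straightforward arithmetic on the two figures quoted above, and can be checked directly: Netflix's published efficiency of 0.0013 kWh per streaming hour times Pornhub's reported 4.59 billion hours streamed in 2016.

```python
# Reproducing the back-of-envelope estimate from the figures in the story.
KWH_PER_STREAMING_HOUR = 0.0013  # Netflix's 2015 efficiency figure
HOURS_STREAMED_2016 = 4.59e9     # hours watched on Pornhub in 2016

energy_kwh = KWH_PER_STREAMING_HOUR * HOURS_STREAMED_2016
print(f"{energy_kwh / 1e6:.3f} million kWh")  # → 5.967 million kWh

# Sanity check on the light-bulb comparison: spread across 11,000 bulbs
# left on for a full year, that energy implies roughly a 60 W bulb each.
watts_per_bulb = energy_kwh * 1000 / (11_000 * 365 * 24)
```

The implied ~60 W per bulb is consistent with an ordinary incandescent, so the comparison in the story holds up.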

Posted by msmash on Wednesday December 13, 2017 @03:49PM
from the challenge-accepted dept.

Zack Whittaker, writing for ZDNet: The maker of a sneaky adware that hijacks a user's browser to serve ads is back with a new, more advanced version -- one that can gain root privileges and spy on the user's activities. News of the updated adware dropped Tuesday in a lengthy write-up by Amit Serper, principal security researcher at Cybereason. The adware, dubbed OSX.Pirrit, is still highly active, infecting tens of thousands of Macs, according to Serper, who has tracked the malware and its different versions for over a year. Serper's detailed write-up is well worth the read. [...] TargetingEdge sent cease-and-desist letters to try to prevent Serper from publishing his research. "We've received several letters over the past two weeks," Serper told ZDNet. "We decided to publish anyway because we're sick of shady 'adware' companies and their threats."

Posted by msmash on Wednesday December 13, 2017 @03:09PM
from the stranger-things dept.

Mark C. Wilson, a senior lecturer in the Department of Computer Science at the University of Auckland, writing for The Conversation: University research is generally funded from the public purse. The results, however, are published in peer-reviewed academic journals, many of which charge subscription fees. I had to use freedom of information laws to determine how much universities in New Zealand spend on journal subscriptions to give researchers and students access to the latest research -- and I found they paid almost US$15 million last year to just four publishers. There are additional costs, too. Paywalls on research hold up scientific progress and limit the public's access to the latest information.

Posted by msmash on Wednesday December 13, 2017 @02:29PM
from the clever dept.

dmoberhaus shares a Motherboard report: A UK techie with a sense of humor may have found an alternative to expensive corporate broadband cables: some wet string. It's an old joke among network technicians that it's possible to get a broadband connection with anything, even if it's just two cans connected with some wet string. As detailed in a blog post by Adrian Kennard, who runs an ISP called Andrews & Arnold in the UK, one of his colleagues took the joke literally and actually established a broadband connection using some wet string. Broadband is a catch-all term for high speed internet access, but there are many different kinds of broadband internet connections. For example, there are fiber optic connections that route data using light, as well as satellite connections, but one of the most common types is called an asymmetric digital subscriber line (ADSL), which connects your computer to the internet using a phone line. Usually, broadband connections rely on wires made of a conductive substance like copper. In the case of the Andrews & Arnold technician, however, they used about 6 feet of twine soaked in salt water (better conductivity than fresh water) that was connected to alligator clips to establish the connection. According to the BBC, this worked because the connection "is not really about the flow of current." Instead, the string is acting as a guide for an electromagnetic wave -- the broadband signal carrying the data -- and the medium for a waveguide isn't so important.

Posted by msmash on Wednesday December 13, 2017 @01:45PM
from the justice-served dept.

Three hackers responsible for creating the massive Mirai botnet that knocked large swathes of the internet offline last year have pleaded guilty. Brian Krebs reports: The U.S. Justice Department on Tuesday unsealed the guilty pleas of two men (Editor's note: three men) first identified in January 2017 by KrebsOnSecurity as the likely co-authors of Mirai, a malware strain that remotely enslaves so-called "Internet of Things" devices such as security cameras, routers, and digital video recorders for use in large scale attacks designed to knock Web sites and entire networks offline (including multiple major attacks against this site). Entering guilty pleas for their roles in developing and using Mirai are 21-year-old Paras Jha from Fanwood, N.J. and Josiah White, 20, from Washington, Pennsylvania. Jha and White were co-founders of Protraf Solutions LLC, a company that specialized in mitigating large-scale DDoS attacks. Like firemen getting paid to put out the fires they started, Jha and White would target organizations with DDoS attacks and then either extort them for money to call off the attacks, or try to sell those companies services they claimed could uniquely help fend off the attacks. Editor's note: The story has been updated to note that three men have pleaded guilty, not two as described in some reports.

Posted by msmash on Wednesday December 13, 2017 @01:06PM
from the hitting-the-bottom dept.

Kate Conger, reporting for Gizmodo: For years, Uber systematically scraped data from competing ride-hailing companies all over the world, harvesting information about their technology, drivers, and executives. Uber gathered information from these firms using automated collection systems that ran constantly, amassing millions of records, and sometimes conducted physical surveillance to complement its data collection. Uber's scraping efforts were spearheaded by the company's Marketplace Analytics team, while the Strategic Services Group gathered information for security purposes, Gizmodo learned from three people familiar with the operations of these teams, from court testimony, and from internal Uber documents. Until Uber's data scraping was discontinued this September in the face of mounting litigation and multiple federal investigations, Marketplace Analytics gathered information on Uber's overseas competitors in an attempt to advance Uber's position in those markets. SSG's mission was to protect employees, executives, and drivers from violence, which sometimes involved tracking protesters and other groups that were considered threatening to Uber. An Uber spokesperson declined to comment for this story.

Posted by msmash on Wednesday December 13, 2017 @12:20PM
from the security-woes dept.

wiredmikey writes: A team of researchers has revived an old crypto vulnerability and determined that it affects the products of several major vendors and a significant number of the world's top websites. The attack/exploit method against a Transport Layer Security (TLS) vulnerability now has a name, a logo and a website. It has been dubbed ROBOT (Return Of Bleichenbacher's Oracle Threat) and, as the name suggests, it's related to an attack method discovered by Daniel Bleichenbacher back in 1998. ROBOT allows an attacker to obtain the RSA key necessary to decrypt TLS traffic under certain conditions. While proof-of-concept (PoC) code will only be made available after affected organizations have had a chance to patch their systems, the researchers have published some additional details. Researchers have made available an online tool that can be used to test public HTTPS servers. An analysis showed that at least 27 of the top 100 Alexa websites, including Facebook and PayPal, were affected.
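Bleichenbacher's 1998 attack, which ROBOT revives, depends on a server acting as a "padding oracle": the server responds differently depending on whether an RSA-decrypted message has valid PKCS#1 v1.5 padding, and those differing responses let an attacker recover the plaintext through many adaptive queries. The sketch below shows only the padding check such an oracle leaks, i.e. the structure a v1.5 encryption block must have; it is an illustrative toy, not the attack itself and not code from the ROBOT research.

```python
# A PKCS#1 v1.5 encryption block is 0x00 0x02, then at least 8 nonzero
# padding bytes, then a 0x00 separator, then the message. A TLS server
# that answers differently for "padding ok" vs "padding bad" hands the
# attacker exactly the oracle that Bleichenbacher-style attacks exploit.

def pkcs1_v15_padding_ok(block: bytes) -> bool:
    """Return True if `block` has well-formed PKCS#1 v1.5 encryption padding."""
    if len(block) < 11 or block[0] != 0x00 or block[1] != 0x02:
        return False
    try:
        sep = block.index(0x00, 2)   # first 0x00 after the two-byte header
    except ValueError:
        return False                 # no separator: message boundary missing
    return sep >= 10                 # at least 8 nonzero padding bytes

# A conforming block: header, 8 bytes of nonzero padding, separator, message.
good = b"\x00\x02" + b"\xff" * 8 + b"\x00" + b"secret"
# A malformed block: wrong second header byte.
bad = b"\x00\x01" + b"\xff" * 8 + b"\x00" + b"secret"
```

The fix deployed by affected vendors is not to change this check but to stop leaking its outcome: the server must behave identically for valid and invalid padding.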

Posted by msmash on Wednesday December 13, 2017 @11:40AM
from the concerning-patterns dept.

A new study claims 44.7 million metric tons (49.3 million tons) of TV sets, refrigerators, cellphones and other electrical goods were discarded last year, with only a fifth recycled to recover the valuable raw materials inside. From a report: The U.N.-backed study published Wednesday calculates that the amount of e-waste thrown away in 2016 included a million tons of chargers alone. The U.S. accounted for 6.3 million metric tons, partly due to the fact that the American market for heavy goods is saturated. The original study can be found here (PDF; Google Drive link).