Posted
by
msmash
on Friday December 09, 2016 @11:40AM
from the math-is-hard dept.

An anonymous reader shares an article on MarketingLand: For the third time since September, Facebook is disclosing new measurement errors. The two new errors affected the reaction counts Facebook reports on Pages' Live videos, as well as the engagement figures Facebook reports for off-Facebook links; the latter link engagement metrics were recently used in investigations by BuzzFeed and The New York Times into fake news articles' performance on Facebook. In addition to acknowledging the two new errors -- of which one has been corrected and one is still being inspected -- Facebook has refined a measurement marketers may reference when buying ads through the social network. None of the aforementioned metrics had any impact on how much money Facebook charges advertisers for their campaigns. But they may have informed brands' Facebook ad-buying strategies as well as brands', publishers' and others' Facebook-related content-publishing strategies.

Posted
by
EditorDavid
on Sunday December 04, 2016 @11:39PM
from the upvoting-with-money dept.

Tonight at NASA's Ames Research Center in Silicon Valley, Morgan Freeman emceed a glamorous, Oscars-style celebration that recognizes scientific achievements with money from tech billionaires.
An anonymous reader writes:
Donors for the Breakthrough Prize included Google's Sergey Brin, Facebook's Mark Zuckerberg and his wife Priscilla Chan, Alibaba founder Jack Ma and his wife Cathy Zhang, and billionaire venture capitalist Yuri Milner, according to an article in Fortune. TechCrunch has a list of the winners, which included Princeton math professor Jean Bourgain, who won a $3 million prize "for his many contributions to high-dimensional geometry, number theory, and many other theoretical contributions."

Three more physics researchers -- two from Harvard, and one from U.C. Santa Barbara -- will share a $3 million prize recognizing "meaningful advances in string theory, quantum field theory, and quantum gravity." And another $1 million prize honored the leaders of three teams responsible for "collaborative research on gravitational waves and its implications for physics and astronomy," with another $2 million to be shared among the 1,012 members of their research groups. 17-year-old Deanna See from Singapore also won the $250,000 "Breakthrough Junior Challenge" prize -- and more money for her teachers and school -- for her video about antibiotic-resistant superbugs. Google has created a special page where you can read more about some of the other winners.

Posted
by
EditorDavid
on Saturday December 03, 2016 @05:39PM
from the or-they-don't dept.

"The brain's basic computational algorithm is organized by power-of-two-based logic," reports Sci-News, citing a neuroscientist at Augusta University's Medical College.
hackingbear writes:
He and his colleagues from the U.S. and China have documented the algorithm at work in seven different brain regions involved with basics like food and fear in mice and hamsters. "Intelligence is really about dealing with uncertainty and infinite possibilities," he said. "It appears to be enabled when a group of similar neurons form a variety of cliques to handle each basic like recognizing food, shelter, friends and foes. Groups of cliques then cluster into functional connectivity motifs to handle every possibility in each of these basics. The more complex the thought, the more cliques join in."
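One reading of the "power-of-two-based logic" claim is combinatorial: given i basic inputs, there are 2**i - 1 distinct nonempty groupings of them, one per clique. This is an illustrative sketch of that counting argument only, not the paper's actual neural model:

```python
# Illustrative: i basic inputs admit 2**i - 1 nonempty combinations,
# each of which could in principle be handled by a distinct clique.
from itertools import combinations

def cliques_for(inputs):
    """Enumerate every nonempty subset of the given basic inputs."""
    subsets = []
    for k in range(1, len(inputs) + 1):
        subsets.extend(combinations(inputs, k))
    return subsets

basics = ["food", "shelter", "friend", "foe"]
print(len(cliques_for(basics)))  # 15 == 2**4 - 1
```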

Posted
by
EditorDavid
on Sunday November 20, 2016 @07:34AM
from the jobs-no-one-wanted dept.

"There was never a job opening for a drone pilot until there was something to fly," writes the founder of market research firm Beagle Research Group, arguing that automation won't inevitably lead society to a universal basic income "free lunch" because new jobs arise when "new capabilities, technical and otherwise, innovate them into existence."
Heck, computer programmers had no existence until computers. At one point a computer was just someone who was very good at math performing calculations all day...it took a year to check all of the calculations needed to produce the atomic bomb and that work was all done by humans. Imagine how history might be different if even one of them had a pocket calculator. You get the idea. New technology inspires new jobs.
He also argues that historically automation eliminates jobs that were "dull, dirty, and dangerous," and that automation also ends up performing previously-nonexistent jobs -- or work that was forced onto customers in self-service scenarios.

Posted
by
msmash
on Wednesday November 16, 2016 @01:00PM
from the quest-for-perfect-coffee dept.

One coffee drinker's perfect brew may be another drinker's battery acid. For this reason, and presumably others, mathematicians are zeroing in on the equations behind the taste of drip coffee. From a report on BBC: Composed of over 1,800 chemical components, coffee is one of the most widely consumed drinks in the world. The work by Kevin Moroney at the University of Limerick, William Lee at the University of Portsmouth and others offers a better understanding of the parameters that influence the final product. It had previously been known that grinding beans too finely could result in coffee that is over-extracted and very bitter. On the other hand, not grinding them enough can make the end result too watery. "What our work has done is take that [observation] and made it quantitative," said Dr Lee. "So now, rather than just saying: 'I need to make [the grains] a bit bigger,' I can say: 'I want this much coffee coming out of the beans, this is exactly the size [of grain] I should aim for.'" Dr Lee says he sets his grinder to the largest setting. By doing so, he says: "The grains are a bit larger than you get in the standard grind, which makes the coffee less bitter, partly because it's adjusting that trade-off between the stuff coming out of the surface and stuff coming out of the interior. When things are larger, you're decreasing the overall surface area of the system. Also, the water flows more quickly through a coffee bed of large grains, because the water's spending less time in contact with the coffee, helping reduce the amount of extraction too. If it's bitter, it's because you're increasing the amount of surface area in the grains. Also, when the grains are very small, it's hard for the water to slide between them, so the water is spending a lot more time moving through the grains -- giving it more time for the coffee to go out of solution."
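The surface-area trade-off Dr Lee describes is easy to sketch: for a fixed mass of coffee ground into spherical grains of radius r, the total surface area scales as 1/r, so doubling the grain size halves the area exposed to the water. A toy calculation (the density figure and masses here are hypothetical, purely for illustration, and not from the researchers' model):

```python
# Toy model: fixed mass of coffee, spherical grains of radius r (mm).
# Total surface area = n_grains * 4*pi*r^2, and n_grains ~ 1/r^3,
# so total area scales as 1/r.
import math

def total_surface_area(mass_g, radius_mm, density_g_per_mm3=1.2e-3):
    grain_volume = (4 / 3) * math.pi * radius_mm ** 3       # mm^3 per grain
    n_grains = mass_g / (density_g_per_mm3 * grain_volume)  # grains in the dose
    return n_grains * 4 * math.pi * radius_mm ** 2          # total mm^2

fine = total_surface_area(20, 0.25)   # finer grind
coarse = total_surface_area(20, 0.5)  # Dr Lee's coarser grind
print(coarse / fine)                  # 0.5 -- doubling radius halves the area
```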

Posted
by
msmash
on Saturday October 15, 2016 @05:31PM
from the changing-patterns dept.

In the middle of a discussion about the pros and cons of statins, Sir Rory Collins, the head of clinical trials at Oxford University, noted that if you want a career in medicine these days you're better off studying mathematics or computing than biology. A report on BBC adds: It is a nice one-liner, but I didn't think much more about it until a few days later, when I found myself sitting in a press conference to mark the launch of a new initiative on cancer. Rubbing shoulders on the panel with the director of the Institute of Cancer Research, Professor Paul Workman, was a scientist I didn't recognise, but it soon became clear this was exactly what Sir Rory had had in mind. Dr Andrea Sottoriva is an astrophysicist. He has spent much of his career searching for neutrinos -- the elusive sub-atomic particles created by the fusion of elements in stars like our sun -- at the bottom of the ocean, and analysing the results of atom-smashing experiments with the Large Hadron Collider at Cern in Geneva. "My background is in computer science, particularly as it applies to particle physics," he told me when we met at the ICR's laboratories in Sutton. So why cancer? The answer can be summed up in two words: big data. What Dr Sottoriva brings to the fight against cancer is the expertise in mathematical modelling needed to mine the vast treasure trove of data the information revolution has brought to medicine. "The exciting thing is that we can apply all the new analytical techniques we've developed in physics to biology," he says. "So we have all these new quantitative technologies that allow us to process an enormous amount of data, and all of a sudden we can start to apply that to implement the paradigm of physics in biology."

Posted
by
EditorDavid
on Saturday October 08, 2016 @11:34AM
from the Ich-bin-ein-Berliner-Konferenz dept.

An anonymous Slashdot reader writes:
The Linux Foundation held its "LinuxCon Europe" this week, "where developers, sys admins, architects and all types and levels of technical talent gather together under one roof for education, collaboration and problem-solving to further the Linux platform." They've now updated their web site with photos and slide presentations.

Posted
by
BeauHD
on Monday September 26, 2016 @06:30PM
from the ones-and-zeros dept.

grcumb writes: Peruvian mathematician Harald Helfgott made his mark on the history of mathematics by solving Goldbach's weak conjecture, which states that every odd number greater than 7 can be expressed as the sum of three prime numbers. Now, according to Scientific American, he's found a better solution to the sieve of Eratosthenes: "In order to determine with this sieve all primes between 1 and 100, for example, one has to write down the list of numbers in numerical order and start crossing them out in a certain order: first, the multiples of 2 (except the 2); then, the multiples of 3, except the 3; and so on, starting by the next number that had not been crossed out. The numbers that survive this procedure will be the primes. The method can be formulated as an algorithm." But now, Helfgott has found a method to drastically reduce the amount of RAM required to run the algorithm: "Now, inspired by combined approaches to the analytical 100-year-old technique called the circle method, Helfgott was able to modify the sieve of Eratosthenes to work with less physical memory space. In mathematical terms: instead of needing a space N, now it is enough to have the cube root of N." So what will be the impact of this? Will we see cheaper, lower-power encryption devices? Or maybe quicker cracking times in brute force attacks? Mathematician Jean Carlos Cortissoz Iriarte of Cornell University and Los Andes University offers an analogy: "Let's pretend that you are a computer and that to store data in your memory you use sheets of paper. If to calculate the primes between 1 and 1,000,000, you need 200 reams of paper (10,000 sheets), and with the algorithm proposed by Helfgott you will only need one fifth of a ream (about 100 sheets)," he says.
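The memory saving can be illustrated with the standard segmented sieve, which keeps only the primes up to sqrt(N) plus one segment of marks in memory at a time, rather than one flag per number up to N. This is a sketch of that textbook technique, not Helfgott's cube-root construction:

```python
# Classic sieve of Eratosthenes: needs O(N) memory (one flag per number).
import math

def sieve_classic(n):
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, math.isqrt(n) + 1):
        if is_prime[p]:
            for m in range(p * p, n + 1, p):
                is_prime[m] = False
    return [i for i, b in enumerate(is_prime) if b]

# Segmented sieve: only the base primes up to sqrt(N) and one
# segment of flags are resident at any time -- O(sqrt(N)) memory.
def sieve_segmented(n):
    limit = math.isqrt(n)
    base = sieve_classic(limit)          # small primes up to sqrt(N)
    segment_size = max(limit, 2)
    primes = list(base)
    low = limit + 1
    while low <= n:
        high = min(low + segment_size - 1, n)
        mark = [True] * (high - low + 1)
        for p in base:
            # First multiple of p inside [low, high], skipping p itself.
            start = max(p * p, ((low + p - 1) // p) * p)
            for m in range(start, high + 1, p):
                mark[m - low] = False
        primes.extend(low + i for i, b in enumerate(mark) if b)
        low = high + 1
    return primes

print(sieve_segmented(100))  # primes up to 100, 25 of them
```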

Posted
by
BeauHD
on Tuesday September 20, 2016 @07:05PM
from the ambitious-goals dept.

Microsoft is serious about finding a cure for cancer. In June, Microsoft researchers published a paper that shows how analyzing online activities can provide clues as to a person's chances of having cancer. They were able to identify internet users who had pancreatic cancer even before they'd been diagnosed, all from analyzing web query logs. Several months later, researchers on behalf of the company now say they will "solve" cancer within the next 10 years by treating it like a computer virus that invades and corrupts the body's cells. The goal is to monitor the bad cells and potentially reprogram them to be healthy again. The Independent reports: The company has built a "biological computation" unit that says its ultimate aim is to make cells into living computers. As such, they could be programmed and reprogrammed to treat any diseases, such as cancer. In the nearer term, the unit is using advanced computing research to try and set computers to work learning about drugs and diseases and suggesting new treatments to help cancer patients. The team hopes to be able to use machine learning technologies -- computers that can think and learn like humans -- to read through the huge amounts of cancer research and come to understand the disease and the drugs that treat it. At the moment, so much cancer research is published that it is impossible for any doctor to read it all. But since computers can read and understand so much more quickly, the systems will be able to read through all of the research and then put that to work on specific people's situations. It does that by bringing together biology, math and computing. Microsoft says the solution could be with us within the next five or ten years.

Posted
by
msmash
on Tuesday September 20, 2016 @09:40AM
from the fascinating-studies dept.

People born without sight appear to solve math problems using visual areas of the brain. NPR has a fascinating report on this: A functional MRI study of 17 people blind since birth found that areas of visual cortex became active when the participants were asked to solve algebra problems, a team from Johns Hopkins reports in the Proceedings of the National Academy of Sciences. "And as the equations get harder and harder, activity in these areas goes up in a blind person," says Marina Bedny, an author of the study and an assistant professor in the department of psychological and brain sciences at Johns Hopkins University. In 19 sighted people doing the same problems, visual areas of the brain showed no increase in activity. "That really suggests that yes, blind individuals appear to be doing math with their visual cortex," Bedny says. The findings, published online Friday, challenge the idea that brain tissue intended for one function is limited to tasks that are closely related.

Posted
by
EditorDavid
on Saturday September 03, 2016 @02:34PM
from the coding-competitions dept.

After analyzing 1.4 million scores on HackerRank's tests for coding accuracy and speed, Chinese programmers "outscored all other countries in mathematics, functional programming, and data structures challenges". Long-time Slashdot reader DirkDaring quotes a report from InfoWorld:
While the United States and India may have lots of programmers, China and Russia have the most talented developers according to a study by HackerRank... "If we held a hacking Olympics today, our data suggests that China would win the gold, Russia would take home a silver, and Poland would nab the bronze. Though they certainly deserve credit for making a showing, the United States and India have some work ahead of them before they make it into the top 25."
While the majority of scores came from America and India, the two countries ranked 28th and 31st, respectively. "Poland was tops in Java testing, France led in C++, Hong Kong in Python, Japan in artificial intelligence, and Switzerland in databases," reports InfoWorld. Ukrainian programmers had the top scores in security, while Finland showed the highest scores for Ruby.

Posted
by
BeauHD
on Monday August 08, 2016 @09:30PM
from the foreign-labor dept.

geek writes: According to Caroline May from Breitbart News, "The tech industry is seeking to bolster its argument for more white-collar foreign tech workers with the insulting claim that the education system is insufficiently preparing Americans for tech fields, according to pro-American worker attorneys with the Immigration Reform Law Institute (IRLI). [In an op-ed published at The Daily Caller, IRLI attorneys John Miano and Ian Smith take the tech industry to task for its strategy to promote the H-1B visa program -- alleging a labor shortage of apt American tech workers while importing thousands of foreign workers on H-1B visas from countries with lower educational results than the U.S.]" John Miano and Ian Smith write via The Daily Caller: "But if the H-1B program really is meant to correct the failings of our education system, as BigTech's new messaging-push implies, why is it importing so many people from India? According to results from the Programme for International Student Assessment (PISA), a global standardized math and science assessment sponsored by the OECD, India scored almost dead last among the 74 countries tested. The results were apparently so embarrassing, the country pulled out of the program altogether. Not surprisingly then, there isn't a single Indian university that appears within the top 250 spots of the World University Rankings Survey. And unlike American bachelor's degrees, obtaining a bachelor's in India takes only three years of study."

Posted
by
BeauHD
on Thursday August 04, 2016 @07:10PM
from the don't-blink dept.

An anonymous reader writes from a report via Android Authority: The Galaxy Note 7 was just announced and one of the most intriguing features is its iris scanner. Android Authority has a report explaining how it works: "According to the company, the device stores your registered iris information as an encrypted code safely in its hardware using its KNOX security platform. Whenever you want to access content, such as a protected app, the device first captures your iris pattern for recognition, extracts and digitizes it, and then proceeds to match it with the encrypted code to provide access. You can be sure that no one else apart from you can access your device in case it is stolen or lost because the Note 7 registers the iris information of only one person. Samsung has made all this possible by including a dedicated iris camera for recognizing the composition of the user's eyeballs. The dedicated iris camera uses a special image filter to receive and recognize the reflected images of the irises through an infrared light on the other end of that panel. The light emitted from the Galaxy Note 7's display allows the scanner to receive data even in low light environments." The iris scanner can be used to access private information via Samsung's Secure Folder feature. Samsung also plans to partner with major financial institutions to incorporate its iris scanner into mobile banking applications.

Posted
by
BeauHD
on Tuesday August 02, 2016 @02:00AM
from the market-share dept.

An anonymous reader writes: On June 29, Microsoft announced that Windows 10 was running on 350 million devices -- 50 million more devices than the previous milestone announced by Microsoft on May 5. While the company is expected to update the number of devices running the latest OS when it releases the Windows 10 Anniversary Update on August 2nd, NetMarketShare has decided to conduct some research on its own. According to its report, Windows 10 currently holds a 21.13% share of the desktop OS market. Meanwhile, Windows 7 continues to dominate the market with a 47.01% share, with Windows 8 and Windows 8.1 representing less than 10% of the PC market, and Windows XP representing 10.34%. While the market share of Windows 10 is all but certain to rise, it likely won't rise as fast as it did between May and June or June and July, because Windows 10 is no longer offered as a free upgrade for PCs running Windows 7 or Windows 8. Microsoft has even backtracked on its original statement that Windows 10 will hit one billion devices by mid-2018, saying last month that Windows 10 likely won't make that deadline after all.

Posted
by
BeauHD
on Wednesday July 27, 2016 @02:00AM
from the this-or-that dept.

An anonymous reader writes from a math-heavy report via AllFlicks: The folks at AllFlicks decided to crunch some numbers to determine just how much more expensive cable is than Netflix. They answered the question: how much does Netflix cost per hour of content viewed, and how does that compare with cable's figures? AllFlicks reports: "We know from Netflix's own numbers that Netflix's more than 75 million users stream 125 million hours of content every day. So that's (roughly) 100 minutes per user, per day. Using the price of Netflix's most popular plan ($9.99) and a 30-day month, we can say that the average user is paying about 0.33 cents per minute of content, or 20 cents an hour. Not bad! But what about cable? Well, Nielsen tells us that the average American adult cable subscriber watches 2,260 minutes of TV per week (including timeshifted TV). That's equivalent to 5.38 hours per day, or 161.43 hours per 30-day month. Thanks to Leichtman Research, we know that the average American pays $99.10 per month for cable TV. That means that subscribers are paying a whopping 61.4 cents per hour to watch cable TV -- more than three times as much as users pay per hour of Netflix!"
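The article's arithmetic checks out and can be reproduced directly (all figures taken from the report itself):

```python
# Reproducing AllFlicks' cost-per-hour comparison.
netflix_price = 9.99                              # most popular plan, $/month
netflix_minutes_per_day = 125e6 / 75e6 * 60       # 125M hours over 75M users ~= 100 min
netflix_cost_per_hour = netflix_price / (netflix_minutes_per_day / 60 * 30)

cable_price = 99.10                               # Leichtman Research, $/month
cable_hours_per_month = 2260 / 7 * 30 / 60        # Nielsen: 2,260 min/week ~= 161.43 h
cable_cost_per_hour = cable_price / cable_hours_per_month

print(f"Netflix: {netflix_cost_per_hour * 100:.1f} cents/hour")   # 20.0
print(f"Cable:   {cable_cost_per_hour * 100:.1f} cents/hour")     # 61.4
print(f"Ratio:   {cable_cost_per_hour / netflix_cost_per_hour:.1f}x")  # 3.1
```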

Posted
by
msmash
on Tuesday July 05, 2016 @04:25PM
from the fighting-odds dept.

An anonymous reader writes: In the wake of a deadly crash involving a Model S that was driving with its Autopilot software turned on, Tesla CEO Elon Musk issued a few interesting remarks on the technology to Fortune. Notably, the publication recently ran a piece attempting to portray Tesla in a bad light by noting that Musk sold more than $2 billion worth of Tesla stock just 11 days after the aforementioned May 2016 accident. And all the while, shareholders were kept in the dark up until recently. Musk responded: "Indeed, if anyone bothered to do the math (obviously, you did not) they would realize that of the over 1M auto deaths per year worldwide, approximately half a million people would have been saved if the Tesla autopilot was universally available. Please, take 5 mins and do the bloody math before you write an article that misleads the public."
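For what it's worth, the "math" in the quote is a one-liner; the 50% fatality reduction is the assumption implied by Musk's half-a-million figure, not a verified number:

```python
# Back-of-envelope version of Musk's claim, using his own figures.
worldwide_auto_deaths_per_year = 1_000_000  # "over 1M auto deaths per year"
assumed_fatality_reduction = 0.5            # implied by "half a million ... saved"

lives_saved = worldwide_auto_deaths_per_year * assumed_fatality_reduction
print(int(lives_saved))  # 500000
```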

Posted
by
BeauHD
on Friday July 01, 2016 @05:00AM
from the computer-science-race dept.

theodp writes: Q. How is K-12 computer science like the Cold War? A. It could use a Sputnik moment, at least that's the gist of an op-ed penned by Senator Jerry Moran (R., KS) and Microsoft President Brad Smith. From the article: "In the wake of the Soviet Union's 1957 Sputnik launch, President Eisenhower confronted the reality that America's educational standards were holding back the country's opportunity to compete on a global technological scale. He responded and called for support of math and science, which resulted in the National Defense Education Act of 1958 and helped send the country to the moon by the end of the next decade. It also created the educational foundation for a new generation of technology, leadership and prosperity. Today we face a similar challenge as the United States competes with nations across the globe in the indispensable field of computer science. To be up to the task, we must do a better job preparing our students for tomorrow's jobs." Smith is also a Board member of tech-bankrolled Code.org, which invoked Sputnik in its 2014 Senate testimony ("learning computer science is this generation's Sputnik moment") as it called for "comprehensive immigration reform efforts that tie H-1B visa fees to a new STEM education fund [...] to support the teaching and learning of more computer science," nicely echoing Microsoft's National Talent Strategy. Tying the lack of K-12 CS education to the need for tech visas is a time-honored tradition of sorts for Microsoft and politicians. As early as 2004, Bill Gates argued that CS education needed its own Sputnik moment, a sentiment shared by Senator Hillary Clinton in 2007 as she and fellow Senators listened to Gates make the case for more H-1B visas as he lamented the lack of CS-savvy U.S. high school students.

Posted
by
BeauHD
on Wednesday June 29, 2016 @08:00AM
from the silicon-valley dept.

theodp writes from a report via USA Today: "If there was any lingering doubt as to tech's favored presidential candidate," writes USA Today's Jon Swartz, "Hillary Clinton put an end to that Tuesday with a tech plan that reads like a Silicon Valley wish list. It calls for connecting every U.S. household to high-speed internet by 2020, reducing regulatory barriers and supporting Net neutrality rules, [which ban internet providers from blocking or slowing content.] It proposes investments in computer science and engineering education ("engage the private sector and nonprofits to train up to 50,000 computer science teachers in the next decade"), expansion of 5G mobile data, making inexpensive Wi-Fi available at more airports and train stations, and attaching a green card to the diplomas of foreign-born students earning STEM degrees."

dcblogs shares with us a report from Computerworld that specifically discusses Clinton's support of green cards for foreign students who earn STEM degrees: As president, Hillary Clinton will support automatic green cards, or permanent residency, for foreign students who earn advanced STEM degrees. Clinton, the presumptive Democratic presidential candidate, wants the U.S. to "staple" green cards on the diplomas of STEM (science, technology, engineering, math) masters and PhD graduates "from accredited institutions." Clinton outlined her plan in a broader tech policy agenda released today. Clinton's "staple" idea isn't new. It's what Mitt Romney, the GOP presidential candidate in 2012, supported. It has had bipartisan support in Congress. But the staple idea is controversial. Critics will say this provision will be hard to control, will foster age discrimination, and put pressure on IT wages.
