
malachiorion (1205130) writes "DARPA knows that people are afraid of robots. Even Steve Wozniak has joined the growing chorus of household names (Musk, Hawking, Gates) who are terrified of bots and AI. And the agency's response -- a video contest for kids -- is equal parts silly and insightful. It's called Robots4Us, and it asks high schoolers to describe their hopes for a robot-assisted future. Five winners will be flown to the DARPA Robotics Challenge Finals this June, where they'll participate in a day-after discussion with experts in the field. But this isn't quite as useless as it sounds. As DRC program manager Gill Pratt points out, it's kids who will be impacted by the major changes to come, more so than people his age (or mine, for that matter). Here's my post on the contest for Popular Science."

seven of five (578993) writes "One man is dead and another severely injured after a shootout at one of the main gates of the National Security Agency at Fort Meade, Maryland.

Two men dressed as women attempted to “penetrate” the entry point with their vehicle when a shootout occurred, officials said. The FBI said they do not believe the incident is related to terrorism."

An anonymous reader writes "It will come as no surprise that Apple's CEO Tim Cook doesn't agree with so-called religious freedom laws. Cook says, "[they] rationalize injustice by pretending to defend something many of us hold dear," and has penned an op-ed piece for The Washington Post which reads in part: "A wave of legislation, introduced in more than two dozen states, would allow people to discriminate against their neighbors. Some, such as the bill enacted in Indiana last week that drew a national outcry and one passed in Arkansas, say individuals can cite their personal religious beliefs to refuse service to a customer or resist a state nondiscrimination law. Others are more transparent in their effort to discriminate. Legislation being considered in Texas would strip the salaries and pensions of clerks who issue marriage licenses to same-sex couples — even if the Supreme Court strikes down Texas’ marriage ban later this year. In total, there are nearly 100 bills designed to enshrine discrimination in state law. These bills rationalize injustice by pretending to defend something many of us hold dear. They go against the very principles our nation was founded on, and they have the potential to undo decades of progress toward greater equality.""

It's definitely not much more complex than that. The only technical innovation is the use of computer vision to recognize that a door has opened. But this thing is designed to operate almost continuously, and probably costs far less to build. Those experimental bots are almost always insanely expensive, and constantly breaking down.

Do you encounter any robots that do that? iRobot's AVA, or even those bots in hospitals, will stop and wait for a clear path, but it would take some really bad engineering, and a deep urge to be sued, to build a robot that plays chicken with pedestrians. This bot specifically weaves around obstacles, instead of coming to a dead stop like a big dummy.

One of the things I didn't include in the story, since it was more of a hunch on the part of the robot's makers than anything based on the pilot, is the notion that people might be more likely to make service item requests if they don't have to deal with a human. That could be because they don't want to worry about tipping someone for bringing an item, like a toothbrush, that's ostensibly free, or because they simply don't want to deal with someone after a long trip. Or, and this is where some data would be valuable, there's the anecdotal "evidence" of female hotel guests just not being comfortable with opening the door to a stranger, if it's not absolutely essential. Again, not exactly a concrete benefit, but I think it makes a good deal of sense.

An anonymous reader writes "The Open Humans network [http://thestack.com/open-humans-network-open-source-dna-240315] is a new online platform which lets participants share their medical data and genomes for a variety of open source research projects. The project currently has three research partners, including one researching stomach bacteria, and is expecting interest from a number of potential collaborators. 'It's like open-sourcing your body,' said Open Humans project director Jason Bobe [http://blog.openhumans.org/]. Instead of the standard scrollable disclaimers that usually herald the dismissal of users' privacy, participants must pass a test to prove that they understand the consequences of sharing their most intimate medical information and their DNA with a third party."

malachiorion (1205130) writes "After a successful 6-month pilot, Savioke's "butler bots" are heading to hotels around the country. These are not sexy, scary, or even technically impressive machines. But they were useful enough, over the course of their 2,000 or so deliveries, to warrant a redesign, and a larger deployment starting in April. Savioke's CEO had some interesting things to say about the pilot, including the fact that some 95 percent of guests gave the robot a 5-star review, and only the drunks seemed to take issue with it. Plus, as you might expect, everyone seemed to want to take a damn selfie with it. But as small as the stakes might appear, highly specialized bots like this one, which can only do one thing (in this case, bring up to 10 pounds of stuff from the lobby to someone's door), are a better glimpse of our future than any talk of hyper-competent humanoids or similarly versatile machines. This is my post for Popular Science about why the rise of the boring robot is good news for robotics."

An anonymous reader writes "Philips, the people who brought you the Hue lights, targeted me with an ad on Facebook. The product looked pretty neat, so I pulled my credit card out to buy one. After entering my name, address, and phone number into the checkout page, I was notified that the product was in beta, and I could either press a button asking not to be contacted, or another to be notified when it is released. I feel as if they tricked me into giving up my personal information. I then noticed that the web form used the GET method, so all the info I entered became part of the URL of the next page that loaded, which means it is now in the web server logs, and was sent in clear text to the partners they work with (Inspectlet, Eloqua, etc). Am I overreacting, or is tricking people in this way fair game?"
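The submitter's complaint about the GET method is easy to reproduce. When a form uses GET, the browser serializes every field into the query string of the request URL, so the values show up in server access logs, browser history, and the Referer header sent to any third-party scripts on the next page. Here is a minimal Python sketch of that serialization; the field names and URL are hypothetical stand-ins for whatever the checkout form actually used:

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical fields a GET-method form would serialize into the URL.
fields = {
    "name": "Jane Doe",
    "address": "123 Main St",
    "phone": "555-0100",
}

# A GET submission appends the fields to the action URL as a query string.
url = "https://example.com/checkout?" + urlencode(fields)
print(url)
# This full URL -- personal data included -- is what lands in access logs
# and can leak to third parties via the Referer header. A POST request
# would carry the same fields in the request body instead.

# Anyone with the logged URL can trivially recover the submitted values:
recovered = {k: v[0] for k, v in parse_qs(urlparse(url).query).items()}
print(recovered["phone"])  # 555-0100
```

This is why checkout and login forms conventionally use POST: the body of a POST request is not normally written to access logs or propagated in the Referer header.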

Cornell CIS (4023141) writes "Can computers, like humans, be fooled by optical illusions? Yes, say Cornell researchers, who are opening new avenues for research in computer vision using Deep Neural Networks."

An anonymous reader writes "Recently, Techdirt noted that the FBI may soon have permission to break into computers anywhere on the planet. It will come as no surprise to learn that the US's partner in crime, the UK, granted similar powers to its own intelligence services some time back. What's more unexpected is that it has now publicly said as much, as Privacy International explains:

The British Government has admitted its intelligence services have the broad power to hack into personal phones, computers, and communications networks, and claims they are legally justified to hack anyone, anywhere in the world, even if the target is neither a threat to national security nor suspected of any crime.

That important admission was made in what the UK government calls its "Open Response" to court cases started last year against GCHQ."