Here's a link with a bit more information about the study. They do claim to have asked about previous gun use. I might have preferred a larger sample (151 people split into five groups), since the bigger the sample the better, but oh well. The gun-game players made on average 7 headshots out of 16 with an airsoft gun (the other groups dropped to around 2). I'd be curious how these results rank against other things, like watching movies.

Blablahb:That study is very likely to be worthless for a number of reasons. His sample among students isn't random, and that sinks the entire experiment.

For a start, because psychology as a whole is a 'soft science'. Even its rock-solid conclusions are prone to change and revision because they turn out to be inaccurate.

And to sink the study: what about students who have experience with actual guns? It was done in the US, after all. He asked about firearms training, but what about other sources of knowledge, like military service, knowledge transferred from relatives, or self-practice? Because I'd 'speculate' that shooting guns makes you better at shooting guns. He also didn't control for other things that train hand-eye coordination. Archery, for instance: different weapon, basically the same activity. And what about people accustomed to other forms of violence? Did he, for instance, check for ring experience in fighting sports? No.

So he used an extremely unreliable sample: one socio-economic group, the same education level, only one country, and likely heavily skewed in ethnic background too. No matter what else you do after that, no conclusions can possibly be drawn from a sample group that biased.

His method is also flawed. He used a gun-shaped controller. Wait, hold on a moment: so he didn't use the normal input device for a game, but a gun analogue instead? That means the increase in accuracy could be caused by using something shaped like a firearm, and not by it being a game.

Also, it says he let students fire actual guns. Wouldn't the increase in willingness to aim at human-shaped objects be caused by handling actual firearms? It's pretty common knowledge that weapons in such a context incite violence by themselves, so again he's got himself an interfering variable that sends his research down the drain.

Unless he's done some serious maths to rule out those interfering variables (something he pretty much can't do, given the several other variables and the weakness of his test), all his conclusions are already sunk.

To make it even worse, he put a life-sized human target at 6 metres distance. Let me tell you, even if you had Parkinson's disease, you could make a headshot on a stationary target at only 6 metres. It's basically point-blank range, at which nobody can miss, and accuracy results count for nothing at all.

Then he made yet another mistake in the number of shots. Six shots. But wait, he's counting hit or miss. That means he's conducting a binomial chance experiment: hit or miss. With only 6 shots, it's impossible to draw any conclusions. The absolute lowest limit for drawing conclusions in binomial chance experiments is 30, so the study's conclusions are invalidated because the outcome can be explained by randomness.
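For what it's worth, the six-shot binomial point is easy to put numbers on. A minimal sketch in Python (the 50% per-shot hit chance here is an assumed illustration, not a figure from the study):

```python
from math import comb

def binom_pmf(k, n, p):
    """Exact probability of k hits in n shots, with per-shot hit chance p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# With only 6 shots and an assumed true hit chance of 50%, individual
# scores scatter widely, so extreme-looking results happen by luck alone.
n, p = 6, 0.5
for k in range(n + 1):
    print(f"{k}/6 hits: probability {binom_pmf(k, n, p):.3f}")

# Chance that pure coin-flipping produces a 'crack shot' score of 5 or 6:
print(sum(binom_pmf(k, n, p) for k in (5, 6)))  # 7/64, about 11%
```

So roughly one shooter in nine looks like a marksman on six coin flips, which is the commenter's randomness worry in concrete terms.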

Basically, this 'professor' wrote a setup so crappy that if you used it for a bachelor's thesis, your tutor would come down on you like a ton of bricks and give it a heavily insufficient mark.

So okay, professor does test, proves that randomness exists. Good for him. When is his university going to sack him for disgracing them?

Lancer873:The statistics sound made up (really? 99 percent and 33 percent? That's way too rounded off and way too extreme a difference)

He only let them take 6 shots, meaning a two-shot difference is already a 33% gap. If one non-gamer hits twice and a gamer doesn't miss (and missing is quite bloody hard at such a tiny range), he can already write in a sensationalist style that 'gamers make three times as many headshots', and his conclusions reek of such unsound assumptions.

But you're right. If he let them take 3 shots, he must have been making up the data, because a hit percentage of 99% is an impossible fraction of those numbers: people can only shoot a whole bullet, not 0.05 of a bullet.

Unless there's a different explanation, that professor has committed fraud.

I have been a member of a shooting club for 9 years, and I can clearly say that playing a shooter game does not make you better at shooting.

First of all: a controller is nowhere near as heavy as a gun.

Second: a gun produces recoil when you pull the trigger; a controller just vibrates when you click a button. Big difference.

And third: I have seen people at training who joined because they thought they were star shooters, just because they had played CoD, BF, etc. It turned out they were all horrible shooters.

Playing games does not make you a better shooter. It bothers me that some people think like this. Before I became a member of the shooting club, I thought it was easy; at least, that is what the video games and movies made it look like. But when I tried using an actual gun, I was horrible. I had problems hitting the target, had problems reloading properly, and the recoil surprised me. After some years of training I became better and learned both how to use a gun and weapon safety. But a gamer who has spent nine years playing a video game would have no chance. Sure, he could have beginner's luck, but he or she would not have had the training required to use a gun properly.

So people who write stuff like this in the newspaper annoy me; they bother me. I can understand that they believe there is a connection, but in reality there is little to none.

Is this even really an anti-game study? I've played plenty of games and I've fired a gun. You wanna know what I fired it at? Clay pigeons. Just because you're good at something doesn't mean you're gonna use it in a violent way. I'm pretty sure that more guns are fired at targets than at people (not taking wars and the like into account), and that's a perfectly legitimate way to pass time. I actually found clay pigeon shooting to be quite relaxing and un-murderous.

Well put. Though, you did forget one vital part of the information. At no point would a sample of 151 people be large enough to indicate a single thing. The anecdotal "rule of small numbers" dictates that you are far more likely to get an extreme sampling of data from such a small group. This study should have been conducted with thousands of participants in much more controlled settings (as you so eloquently outlined for everyone).
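The "rule of small numbers" point can be illustrated with a toy simulation (all parameters here are invented for illustration, not taken from the study): even when every shooter has identical true skill, small groups produce much more extreme group averages than large ones.

```python
import random
import statistics

random.seed(1)

def group_average_spread(group_size, trials=500, p=0.4, shots=16):
    """Simulate many groups of equally skilled shooters and return how
    much the group-average hit count varies from group to group."""
    averages = []
    for _ in range(trials):
        group = [sum(random.random() < p for _ in range(shots))
                 for _ in range(group_size)]
        averages.append(statistics.mean(group))
    return statistics.stdev(averages)

# Small groups wander much further from the true mean than large ones,
# so a lucky small group can look like a real effect.
print(group_average_spread(group_size=30))
print(group_average_spread(group_size=300))
```

The spread for 30-person groups comes out roughly three times that of 300-person groups, which is the standard square-root-of-n behaviour.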

Breaking news: activities that make you practice hand-eye coordination enhance your hand-eye coordination on other tasks.

I really don't see what he's trying to prove with this, anyway. I mean, if you really want to get good at shooting guns accurately, you could, I don't know, go to a shooting range and shoot real guns. And either way (video games or shooting range) you're still not likely to go out and start shooting actual people.

So... what's his point?

That's where I was going as well. It's going to improve hand-eye coordination, and the twitch play would probably help with accuracy as well.

good sample size, but a few things missing for this:
- Group is large, but what about the sample size: how many shots did each individual fire?
- peer review (to make sure it's all above board)
- double blind (yes, even in an objective test, the score counters cannot know who played the games)
- control group (it may be that people who are better at aiming are more likely to play games)
- repeatability (can someone repeat the outcome?)

Also some controls. How about comparing those who "train" on a lightgun game against another group that trains with an actual gun? How do they compare then?

If it is a light gun like this:

Then they will have had SOME training with a "gun". Versus the control group who likely have never aimed to hit something in their life.

What if for the "testing" they used a very light-recoiling weapon, like the most common self-loading rifle you can find, the Ruger 10/22? It has minuscule recoil, about equivalent to tapping the end of the barrel with a ball-peen hammer.

PROPOSAL FOR FURTHER INVESTIGATION:

Repeat this experiment with another control group:

Group 1: spends 20 minutes sitting reading a magazine before the test
Group 2: spends 20 minutes playing a lightgun game before the test
Group 3: spends 20 minutes shooting at cans with the same rifle they are about to use in the test

If Group 2 performs best, that might be cause for concern. But if group 3 performs best, that just indicates the banal trend that the more you repeat a task the better you get at it.

After all, who gets a gun and doesn't spend any time shooting it? Breivik (the Norway killer) stated in his trial that he became a good shot with his rifle by training with it on the range, and that the games he played were just time-wasting hobbies.

I played some laser tag a few weeks ago, and I was surprised at just how accurate I was, since I'd never shot a gun before. Of course, I was ironsighting the whole time, and laser guns have no recoil, but still. Those kids were dead before they could even cry about it.

Well put. Though, you did forget one vital part of the information. At no point would a sample of 151 people be large enough to indicate a single thing. [...]

Uh, the sample size is large enough; how could you get many more than 150 people for a test like this? Even divided into two groups, 75 data points is enough to demonstrate a significant trend.

The issue with sample size is samples per individual. If they took only 3-4 shots each, that doesn't say much, especially if, playing Resident Evil 4, they thought they'd get more points for aiming at the head. That wasn't a matter of ability, simply a matter of suggestion: they thought they SHOULD go for the head, not that they COULD.

Were they ALL told to aim for the head for more points in the testing stage? If not (and Resident Evil 4 did say so implicitly), then it's nothing but an indication of the power of suggestion.

Blablahb:That study is very likely to be worthless for a number of reasons. [...] With only 6 shots, it's impossible to draw any conclusions, and the outcome can be explained by randomness. [...]

Go reread the link I posted, or read it in the first place. He let them take 16 shots, not 6, with a sample of about 30 people in each of 5 groups. That's 480 shots per group, so it's not too hard to imagine 99 and 33 percent. And it sounds like at least some of the factors were controlled for in the study. I don't like it either, but we have to at least try to represent the study properly.
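Taking those thread figures at face value, you can sketch why the aggregate gap would not just be noise. This is a back-of-the-envelope two-proportion z-test, naively treating all 480 shots per group as independent (which they aren't, since shots cluster within shooters, so the real evidence is weaker than this suggests):

```python
from math import sqrt

# The thread's rough figures: ~30 people per group, 16 shots each,
# averaging 7/16 headshots (gun-game group) vs 2/16 (other groups).
hits_a, n_a = 7 * 30, 16 * 30   # 210 hits out of 480 shots
hits_b, n_b = 2 * 30, 16 * 30   # 60 hits out of 480 shots

p_a, p_b = hits_a / n_a, hits_b / n_b
pooled = (hits_a + hits_b) / (n_a + n_b)
se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
z = (p_a - p_b) / se
print(f"z = {z:.1f}")  # roughly 10.8, far beyond the ~2 usually required
```

Under this (over-generous) independence assumption the gap is overwhelmingly significant; the honest version would cluster by shooter and come out smaller, but not anywhere near zero.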

Treblaine:good sample size, but a few things missing for this: [...]

If you read the link I posted earlier they talk about the gun they used. It was an airsoft pistol that they say had similar recoil to a normal handgun.

Right. An AIRSOFT pistol!?!??

I can see why he hasn't touted peer review of his study: he has clearly weighted this test to denigrate video games by contriving a comparison between a game controller and a "weapon" that are unrealistically similar. Then all he has done is try to paint all games as training you for ALL weapons.

This is not (good) science, given his misrepresentation of the experiment to reach an unsafe conclusion.

Ah, this study looked so promising: large sample size and practical testing. But clearly this is not science; he has gone out to prove his prejudice. Science is about investigating a topic you are interested in and following where the evidence leads you; he is clearly trying to contrive evidence to support his preconceived conclusion.

Science, I love you and all, but please stay the hell away from video games. Or at the very least, get a scientist who plays a lot of video games to do these studies involving video games.

There are just way too many factors in this study that Professor Bushman hasn't accounted for, and would not know to factor in, for me to even try to take it seriously. Now if you'll excuse me, I am going to play some Search and Destroy to make my bomb-defusing skills IRL go up.

Treblaine:Uh, the sample size is large enough, how could you get much more than 150 people for a test like this? [...] The issue with sample size is samples per individual. [...]

That is arguable, but you can't really have both. If they were to let people take 100 shots, for example, the data from that sample size would be much better. But then you are also giving people more time to become proficient with a firearm, which could skew the data, and you still aren't putting it to enough individuals. People are different. You cannot sample such a small portion of the population and make a broad-spectrum analysis. There are all kinds of things wrong with this study, and the chief one (in my opinion) is that the sample should be larger, especially if people only take a few shots. More shots would be nice, but there is an issue with that as well.

When push comes to shove, a day's testing would never be enough. The exact same experiment would have to be repeated with more people to get enough data to even start to draw a conclusion. 75 data points is not enough when making a broad-spectrum statement like "playing shooters makes people better at shooting firearms". Scientific studies that hold any merit are usually done over the course of months or even years, with every possible sample group they can include. This is just bad science for many reasons.

Rant time:

That is the one flaw of the information age, it seems: everyone does studies and publishes them like they mean anything. Daniel Kahneman, Michael S. Gazzaniga, Leonard Mlodinow... these people spent years coming up with their conclusions. David Eagleman did a study on a single subject, but for two long years before he even considered coming up with a conclusion. The study only ended because the student graduated, and they did extensive background testing on the subject before it even began. This guy spent a day, maybe a week at most, and tells us conclusive data from a sampling that is far too small, from an experiment so sloppy it shouldn't warrant page time anywhere. I once heard a guy tell me he read a study saying too much vitamin C can increase your chances of heart disease, and the reason he trusted it was the source. Now we get college professors who say one thing that is counterintuitive to many, many studies, and people actually believe it because of the source. It literally makes me rage.

End rant, sorry.

Agreed that his conclusion is wrong (that seems to be the main problem with this study), but you can conclude SOMETHING from only 150 data points (75 per group, or whatever), even if that conclusion only indicates that more studies should be done. You can't dismiss a 150-individual study.

150 individuals is certainly a good place to start. But my problem is comparing a "light gun" type controller, which is rarely used in games, with a test using a very similar airsoft pistol... and I don't think anyone has ever been killed by an airsoft pistol, except possibly by being beaten to death with one. So the study is meaningless just from that. It doesn't compare general computer game play (aiming with a mouse or thumbstick) against most firearms use, such as a 9mm pistol or at the very least a .22LR-calibre rifle.

I can agree with that. Sometimes I'm outspoken and use the wrong language; that is my bad. I agree completely that it certainly lends itself to more studying. I said that you can draw no conclusion from it, and that was stupid of me to say.

Torrasque:Science, I love you and all, but please stay the hell away from video games. [...]

Don't fear science, and this IS NOT science by the way. Science is FAR more rigorous than this.

He doesn't need to play video games, but he does need to study them. The great issue here is his conceit about guns; his worst deception is performing the test with a low/zero-recoil airsoft pistol as the "firearm", which is ridiculous, as I don't think anyone has ever been killed with an airsoft pistol. Though it IS most similar to a lightgun.

Its science light!Half the facts, and twice the "what the fuck is this shit" !

Really? I thought the things about shooting guns were adapting for range, dealing with recoil, holding it properly, sighting properly. I wouldn't have expected it to help much at all.

99% is huge. I'm really interested in this and its legitimacy, because if it holds up it has some big implications. I mean, he's suggesting, I don't know, that if someone were to do 20 minutes on a shooter before going out they'd become twice as accurate. If you went out hunting you'd hit twice as many birds.

It doesn't take too long for someone who's played a ton of FPSes to get used to those things, in my opinion. At least in my case: the first time I ever fired a real gun, I was getting bull's-eyes at 300 feet by the end of the first box of shells...

It takes a bit more setup and patience to do it for real, but it's still point and click.

Well, the article above cleared some things up: they were shooting with airsoft guns, so recoil wasn't a factor, and what's more, their accuracy was still awful compared to what you'd need.

The gamers or people who played the more realistic shooters were *more concerned* about getting good impressive hits like a headshot. The others were probably more interested in simply hitting the target.

Not because they were inherently better or "trained" in a virtual environment. Anything else can easily be attributed to basic learning and skill transfer, which would occur the same for a docile person as it would for someone aggressive.

Treblaine:Uh, the sample size is large enough, how could you get much more than 150 people for a test like this? Even divided into two groups 75 data points is enough to demonstrate a significant trend.

No, you must realise that the number of students used is basically unimportant, because he is testing something else. He is not testing 'do my students show an average that can be compared to population X'. He was testing whether gamers with a gun-shaped controller are more accurate than Joe Average.

And even if he was comparing an average, 150 wouldn't be enough, because the sample group was extremely biased, not random at all. If you're working with a biased group, not even thousands and thousands of people make a large enough sample (because no number of students in his class can form an accurate representation of all gamers everywhere in the world).

But an even more damning criticism I noted was how he failed to properly set up what he was actually testing: whether gamers hit more often.

The number of shots he let them take is the actual test. Are the hits a result of coincidence, or of an underlying trend? And as I noted, for that, the sample was way too small and the method of testing extremely flawed.

Harker067:Go reread the link i posted or read it. He let them take 16 shots not 6 with a sample size of about 30 people in 5 groups.

I can't see that anywhere, but even if so, that's still barely half of the *absolute minimum* required to draw any conclusions. Any outcome would still fall completely within randomness.
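Whether 16 hit-or-miss shots really "fall completely under randomness" can be checked directly. A sketch using the thread's figures as assumptions (2/16 as the baseline hit rate, which is the thread's number, not verified against the study):

```python
from math import comb

def prob_at_least(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): chance of k or more hits."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# If a shooter's true per-shot hit chance were 2/16 (the non-gamer
# average), the odds of scoring 7 or more hits out of 16 by luck alone
# come out well under 1 percent.
print(prob_at_least(7, 16, 2 / 16))
```

So for a single shooter a 7/16 score is already hard to explain as luck under that baseline, and a whole group averaging it even more so; the 30-observation rule of thumb is about normal approximations, not an absolute bar on inference.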

Also, with 16 shots, a fraction of either 33 or 99 percent is still impossible, so the likelihood of fraud in the form of fictional data is still present.

Remember: the thing he's testing is whether a connection exists between variable 1 (being a gamer with a gun-shaped controller) and variable 2 (accuracy). That is a straightforward test of whether a sample holds up against a population; he can't just run a Z-test and expect to be done with it.

And in addition, like I wrote, the setup was also fatally flawed and interfering variables are at play.

Blablahb:And even if he was comparing an average, 150 wouldn't be enough because the sample group was extremely biased, not random at all. If you're working with a biased group, not even thousands and thousands of people are a large enough sample.

Could you explain where this bias comes from?

Is it safe to assume it comes from the "screening" stage, where they were interviewed about their gaming habits and their prior experience with firearms? Are you suggesting he put all the people who had pistol experience into the group being tested for a link between playing light-gun games and "real world" accuracy with an airsoft pistol? Or is it something else?

My problem is that the finding that playing 20 minutes on a toy (a Wii video game) made them better with another toy (airsoft) is utterly pointless. The relevant test would be playing ANY video game involving aiming and shooting (like Call of Duty on a gamepad), then shooting the kind of firearm most often used in murders (a S&W revolver seems most common). I don't have access to firearms, nor to the several hundred people needed to test this, but I genuinely would like to see that test; I think the result would give no advantage to the CoD gamer.

Yet instead of testing with an actual pistol that is heavy (almost 3 pounds fully loaded), hard to load and complicated to ready for firing, and that has significant recoil, a heavy moving slide and hot brass spitting out, he gives them a light gun to play with and then tests them with a light plastic airsoft pistol with no significant recoil.

I understand he wants to take reasonable safety precautions, but how about using Simunition rounds like the military uses? These are real guns loaded with reduced-power paint rounds instead of lead bullets in the cartridges; they are very unlikely to be lethal but still have significant recoil and don't particularly interfere with the pistol's functioning. Or, if you can't do that, just have the test performed with a real gun loaded one round at a time by a Range Officer who constantly stands by the test subject to make sure they don't point the gun somewhere dangerous.

As to the "hit more often" point: do you mean that in the test the subjects were simply allowed to take as many shots as they liked, with no particular instruction? It could simply be that the game players inferred from the game they had just played that they should aim for the head. In that case, all this test would indicate is the power of suggestion.

However, if that's the case, all the result would indicate is a greater tendency to fire more shots. Did his witness plate (I presume it was a paper target with a human silhouette) also record missed shots? The more I look at this "study", the more sloppy and contrived it seems.

Does anyone have an actual link to where his study is published? If it is even published AT ALL! I hope he didn't just put it on his website without any kind of review or assessment process otherwise we have all wasted a lot of our time.

Blablahb:Can't see that anywhere, but even so, that's still barely half of the *absolute minimum* required to draw any conclusions. Also, with 16 shots, a fraction of exactly 33 or 99 percent is still impossible, so the likelihood of fraud in the form of fictional data is still present.

While it's not the paper itself, it does go into some of the details of the study. It clearly says each participant fired 16 shots.

Ok, so 151 people in 5 groups is ~30 people per group, with 16 shots per person (not more or less); that's about 480 shots per group, a decent sample size. The link cites an average of 7/16 headshots in the group that played the shooting game with the gun controller. While we don't have the raw data or the standard deviation, that's still ~210 headshots.
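Just to put rough numbers on the "coincidence or trend" question: here's a back-of-the-envelope two-proportion check. To be clear, the shot totals below are my own reconstruction from the reported averages, and it naively treats every shot as independent (shots by the same person are correlated, so this overstates certainty; a proper analysis would compare per-person counts):

```python
import math

# Hypothetical totals reconstructed from the reported averages:
# ~30 people x 16 shots = ~480 shots per group; an average of 7/16 gives
# ~210 headshots, and an average of ~2/16 gives ~60 headshots.
n1, hits1 = 480, 210   # gun-controller shooting-game group
n2, hits2 = 480, 60    # non-violent, non-shooting game group

p1, p2 = hits1 / n1, hits2 / n2
pooled = (hits1 + hits2) / (n1 + n2)

# Two-proportion z-test, treating each shot as an independent trial
# (a simplification, as noted above).
se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se
print(round(z, 1))  # far beyond the ~1.96 threshold for p < 0.05
```

Even if per-person correlation shrinks the effective sample a lot, a z-statistic that large suggests the gap between groups is very unlikely to be pure chance. Whether the sample generalises to anyone outside that lab is a separate question.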

The Escapist story says the 99% figure is a comparison against the several groups that did not use the gun controller in the shooting game: "99 percent more headshots on mannequin targets using a real gun than those who played the other games".

The link doesn't mention all the numbers but says "Participants who played the nonviolent, non-shooting game had the fewest head shots, an average of about 2. Those who played the other games, including those who played the violent shooting game with a standard controller, fell in between those extremes."

So what this means is that the average across those other trials was a little under 3.5 headshots. They are using the combined data sets in these comparisons, which is perfectly legitimate and can easily produce figures like 99% and 33%.
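To show the arithmetic: assuming the gun-controller group averaged 7 headshots and the other groups combined averaged a little under 3.5 (I'm using 3.52 here, a hypothetical figure chosen only for illustration), "99 percent more" falls straight out of a relative difference:

```python
# Hypothetical averages reconstructed from the article's description.
gun_controller_avg = 7.0   # headshots out of 16
other_groups_avg = 3.52    # "a little under 3.5", combined (assumed value)

# Relative difference: how many percent more headshots the gun-controller
# group scored compared to the other groups' combined average.
pct_more = (gun_controller_avg - other_groups_avg) / other_groups_avg * 100
print(round(pct_more))  # -> 99
```

So a "99%" or "33%" figure doesn't require any individual to have hit an impossible fraction of 16 shots; it's a ratio of group averages.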

While you may not agree with the comparison being made or with the research methods, I see absolutely no reason why this looks like manipulated data. While I may not like the results, it seems probable that the data really does show this. If you're going to criticize it, please do so a bit more responsibly, as you come off as tribalistic and reactionary. Such reactions and sloppy arguments don't help us; they just make us look bad.

While I agree that the study is small and preliminary (I have some issues with the idea that it improves accuracy), I don't actually see a huge reason why the aiming-for-the-head part need be wrong. I don't see why a player primed by a game that rewards headshots might not then be more likely to shoot for a target's head on, say, a firing range. My link mentions the shooting was done immediately after playing the game. I'd personally be more interested in a number of follow-up studies, looking at things like moving targets, larger sample sizes, how long any preference for headshots lasts after playing, and whether watching a racing movie affects one's likelihood of speeding immediately afterwards.
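On the sample-size point specifically: a quick textbook power calculation suggests that if the effect really were as large as 7 vs 2 headshots, even tiny groups would detect it. The per-person standard deviation of 3 below is my assumption (the article doesn't report one), so treat this as a sketch, not an analysis of the actual study:

```python
import math

# Standard two-sample sample-size formula for comparing group means:
# n per group = 2 * (z_alpha + z_beta)^2 * sd^2 / delta^2
delta = 7 - 2        # assumed difference in mean headshots between groups
sd = 3.0             # assumed per-person standard deviation (not reported)
z_alpha = 1.96       # two-sided significance level of 0.05
z_beta = 0.8416      # 80% power

n_per_group = math.ceil(2 * (z_alpha + z_beta) ** 2 * sd**2 / delta**2)
print(n_per_group)  # -> 6
```

Under those assumptions, ~30 people per group is comfortably enough to detect an effect that size; the more interesting criticisms are about sampling bias and confounds, not raw group size.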

s_h_a_d_o:"Playing games with gun-shaped controllers apparently makes you a better real-life killer."

Has anyone introduced Professor Bushman to the industrial design adage "form follows function", and explained to him that practising with any tool specifically designed and built to accomplish a particular task will increase the user's chances of success at performing said task?

First of all, this. Even if it's not the exact same thing, it's experience with something akin to it. If you've never shot a gun but have used something like one in the exact same way, it stands to reason you have an edge over your opponent.

Now for my thoughts on the study. It's nice that they had the gun mimic the recoil of a 9mm, but I doubt they put the person under pressure, e.g. "you have 15 seconds to put 10 bullets into the dummy". Yes, the people who played with the gun controller may have better aim and more tenacity to go for a headshot, but I hope no one tries to link this to games making people violent. There's a huge difference between standing in a hallway shooting a dummy and actually going on a rampage.

I didn't read through all the previous comments, so if this has already been said, I apologise. But his research proves nothing important. Just because these people know HOW to shoot does not mean that they WILL shoot a human being. You can take the greatest marksman on the globe, put him in a room with a gun and another person for a few hours, and nothing will likely happen. But replace that marksman with a psychopath who enjoys pulling the wings off flies, and we'll have a different (and arguably more "interesting") story. It's like claiming that a gleaming career in demolition means you're going to commit acts of terrorism: it's bullshit, plain and simple. Anyone who uses this research against the video game community shouldn't have a hand in the debate, also plain and simple.