research volunteering

I’ve written before about getting involved with research volunteering; I do it because I enjoy:

Getting paid for my time, brainpower and medical samples;

Being part of a bigger research project that will advance technology for all of us.

And I’ve started contributing to the Folding@home project run by Stanford University. It’s a distributed computing effort similar to GIMPS, except they’re not searching for particular numbers; they use project members’ spare computing power to carry out calculations that model protein folding. The results of these calculations are used to study biological processes and to develop treatments for numerous conditions.

Well, that fulfils my desire to save humanity, one calculation at a time. But where’s the financial incentive, eh? Well, I’ve signed up with the 1337Foundation, who convert the points earned from the folding calculations into cryptocurrency. Unsurprisingly, the 1337Foundation pays me in 1337coin (their site plays music, which you may or may not enjoy – just a friendly warning so that you don’t annoy your companions / coworkers!). If you want to get involved, you’ll need to download a 1337 wallet, set your F@h identity to your wallet address, and join team 233050 (more information here).

This currency was set up as a Gamers Currency, with “1337” symbolising “leet”, as in “leetspeak” (OMG this pushes all my geek buttons). Right now it’s not doing so great against the dollar, but most cryptocurrencies seem to be on a slight downward trend at the moment. I feel less concerned because I’m earning these coins as compensation for taking part in an experiment (people have calculated the typical energy use per day for continuous folding and it’s not huge, plus I have a large data allowance – although you only need about 1 GB/month).
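For what it’s worth, here’s a rough sketch of that energy sum in Python; the wattage and electricity price are my own illustrative guesses, not official project figures:

```python
# Back-of-envelope running cost of folding 24/7.
# The wattage and unit price below are illustrative assumptions,
# not figures from the Folding@home project.
extra_draw_watts = 100        # assumed extra CPU/GPU draw while folding
price_per_kwh = 0.15          # assumed electricity price in GBP per kWh

kwh_per_day = extra_draw_watts * 24 / 1000     # watt-hours -> kWh
cost_per_month = kwh_per_day * price_per_kwh * 30

print(f"{kwh_per_day:.1f} kWh/day, roughly £{cost_per_month:.2f}/month")
```

So at those assumed numbers it’s a few pounds a month in electricity: noticeable, but hardly ruinous.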

I also secretly hope that if I hoard my 1337coins for long enough, they’ll see a similar explosion in growth to Bitcoin, and I can retire young and live in a tropical paradise somewhere. Although this is probably what all the folders are hoping. Never mind, belief in the value of the market actually keeps the market buoyant (for a time, at least). Have a little faith, people!

This was a super-cool adventure in the biomechanics lab at Manchester University, which is located in, uh, the Optometry building. There was nowhere else for it, ok?

The research is designed to understand the nuances of human movement, with the goal of applying the findings to humanoid robots. The aim is to make robots move in a more natural, less clunky way. I’m sure that somewhere along the way there will be some Uncanny Valley-like results, but it’s all a step (ha) on the path to more humanlike robots. On a more serious note, the work has applications in medicine, for rehabilitation and developing prosthetics. Often university research sounds rather arcane and peculiar (Ig Nobel awards, anyone?), but it has real and relevant applications.

Before the testing day, I was informed that I needed to be of average fitness, as I would be performing walking and jogging movements in a gym for three hours. I do consider myself pretty healthy and active, but I was worried that they were expecting some sort of endurance hero to turn up. Fortunately they weren’t, and it was just repeated brisk walking over short distances.

The Biomechanics Laboratory

When I turned up, the lab was set up with tracks on the floor and motion capture cameras on the ceiling. At this point I realised this was going to be pretty futuristic and fun, and I was so glad I’d taken an afternoon off to help with this work. As is the case with all research involving human participants, the session began with the completion of consent forms. This is to comply with the requirements of the University Ethics Committee, and also gives a chance for questions to be answered, and a little more detail about the work to be revealed to the volunteer. Unlike any other study I’ve taken part in, this activity was accompanied by coffee and cake. Researchers, take note! It was quite a long afternoon, so we also had breaks and chatter throughout the session, to break up the repetition.

Following the initial registration process, I changed into a vest and shorts provided by the lab (this was necessary because my limbs needed to be exposed and I had to have various items of electronics fixed to my body). I was then weighed and my height was measured. It seems that I have got heavier (by 1kg) and shorter (by 1cm), which is the worst combination. Damn you, properly calibrated scales! This was necessary to calculate my BMI (it was a requirement of the study that participants were of normal weight – BMI 18.5 to 24.9).

Then I had a multitude of sensors fixed about my person. It took about half an hour to get all of the kit on, and the placing of the devices was meticulous. There were three different types of sensor used:

EMG (electromyography) sensors, measuring the activity of muscles involved in walking. I had a number of these placed on my legs and lower back, to identify which muscles are active during which movement activities.

Pressure-sensitive insoles to measure my gait. I also had to wear shoes provided by the lab, and due to my totally non-ergonomic feet, I required a 7 on my left foot and a 6 on my right.

You can see the marker pen still on my wrists a few hours after the experiment

Reflective motion-capture markers, which were placed all over: my shoulders, elbows, knees, wrists, ankles, hips, chest and toes, plus a snazzy felt hat with markers for my head. In order for these to be placed correctly, one of the researchers had to feel for the exact part of the joint to place the marker, and draw on me with a felt pen ready to glue the silvery reflective marker balls on to my skin. They also precisely measured the length of my hands and feet in order to relate the position of some of these markers to the rest of my body – my left hand is 187mm long and my left foot 236mm long. Not sure what I will ever do with this information, but it’s kinda cool.

In addition to the above, I needed to wear transmitters for the EMGs in a snazzy tool belt around my waist, plus transmitters for the insoles were strapped to my calves.

When all of the gear was on, my height and weight were remeasured, and I had gained a centimetre in height and two kilos in weight (not surprising, given how much of the damn stuff there was).

I then stood in front of the display screen in the middle of the lab, and I could see virtual me moving around in the form of a number of white dots (silly dance obligatory at this point). The researchers also had two screens in which they could see the pressure maps of my feet and the electrical output of my muscles. The motion capture data was collected by a number of cameras on the ceiling of the lab, and the pressure and movement signals were transmitted over the wi-fi to the recording equipment.

My first task was to stand on one leg and then switch to the other, for both feet. This is surprisingly difficult, and I didn’t realise how crap my balance was until this point. This was needed for a baseline reading. Then the movement tasks began: seven different actions, each involving walking a short distance, and none of them more than slightly energetic. The researchers recorded the data output for each of the seven actions, repeated ten times each.

Walking around with all the gear on felt a bit strange at first, but none of it got in the way of my ordinary walking style, which was really important as they wanted to record people’s normal, unimpeded movements. The movements the researchers are most interested in involve changes in speed and direction: the kind for which we can easily tell the characteristics of human from machine.

The experiment took about three hours to complete; one of the longest studies I’ve taken part in. I got paid £20 for my participation, which isn’t loads, but given that I got to feel like I was in Avatar for an afternoon, it was definitely worth it. The team are still looking for more participants, aged 18 – 40 with a BMI between 18.5 and 24.9, to participate in trials starting in September. Details will soon be published on the University’s Research Volunteering page.

This piece of research took me to the Psychological Sciences department. They have lots of studies that require human volunteers, and they are all really interesting. This study involved a number of hearing and visual tests, with the aim of looking at how ageing affects the way we process visual and auditory information.

I can’t give away too much information, as the study is ongoing and still recruiting (they are especially keen to find volunteers aged 65+), but I had to don a pair of headphones and respond using a clicker when I heard certain sounds. There were other things going on during the test, too, like noises that I was not to respond to, but like I said, I can’t give too much away. There were a number of these tests and also some tests combining a visual stimulus on a screen and a noise. I was being tested on how I perceived these in relation to each other.

Things I learnt during the test: I have very sensitive hearing (yay!). I’m more sensitive to sounds at the ends of the spectrum (at the limits of human auditory perception) than those in the mid range. Yes, this is odd, and the researcher couldn’t explain it. But that’s the result we measured and it stands.

Following my most adventurous research volunteering experience (in which I underwent minor surgery in the name of science), I decided to take part in one a little more tame. It involved sitting at a desk and carrying out a computer task. Nice and safe.

The experiment asked the participant to walk through a virtual town (it was set up on a PC), with keyboard controls to move, and mouse control to determine the direction in which you were looking. This part of the task involved carrying a package from one building to another, and memorising the layout. After each fictional environment (there were 20 in total), I was asked to draw a map of the landscape in which I had walked, and answer some questions in which I had to recall the names of the buildings I had seen.

It was pretty repetitive, but kinda cool. I received a £10 Amazon voucher in exchange for my participation. The whole thing took about an hour.

After the experiment, I spoke with the researcher about what sort of things they were looking for. Different people tend to use different memorisation strategies, and look for patterns in different ways. I found the first few tasks a bit more difficult, probably because the controls were unfamiliar, and it was a task I’d not done before.

I took part in another study for someone else’s project just before the Christmas break. They were looking at the ways in which different users utilise the different areas of a mock online collaboration tool for researchers to share datasets.

In this exercise I was instructed to find certain sections of the site, upload a fake dataset, and explore other areas of the site as I wished (the researcher also wanted to see which functions people might want to use in addition to the simple task we had to perform).

It was kind of interesting, but I only got entered into a prize draw to win a £100 Amazon voucher for my participation. I would have preferred a smaller, more certain reward, but I guess that the researcher was only able to obtain a small amount of funding for their work.

Today was the fourth time I’ve taken part in a study for someone’s doctoral research at the University. But this is also a bit of a weird post in that I can’t really give you any information about it. Some of the other studies I’ve been in were designed so that it was possible to give away a little information about the exercise, but for this one… you need to enter the study completely green. Here’s a link to the University’s Research Volunteering page. It’s the one to do with suspicious thoughts and task performance.

My next foray into research volunteering involved a visit to the clinical psychology department. I was being tested for my ability to perceive randomness. As I had expected, humans are rubbish at this, but I did get to talk a little about why this is and what people typically perceive as random.

I was first shown a number of 20-character long sequences, made up of ‘H’s and ‘T’s, to represent the results of a fair coin being tossed. I was asked to rank these in order of how random they appeared to me. After evaluating the sequences, I then moved on to a computer task. During this test, I was firstly asked to generate a sequence that I thought looked random. I did this by typing ‘H’ or ‘T’ in a 200-letter long sequence.

Then things changed, and I viewed a 200-character sequence broken down into blocks of five letters.

I was then asked to repeat the computer exercise, but by creating forty 5-letter long sequences instead of one huge 200-character long chain. My behaviour in relation to the representation of randomness changed between the two tests. One of the measures that the researcher was considering was the number of times I changed between ‘H’ and ‘T’ – the more changes, the less random overall.

As is typical for many of the research volunteering opportunities at Manchester University, I was rewarded with a £10 Amazon voucher for my participation.

My first time as a test subject took place sometime in November 2014. As mentioned in a previous post, I sign up for research volunteering opportunities for the incentive offered, but also because it’s kinda fun and interesting. And I assist others in their research, which is a noble aim and all that.

The first study was to do with the perception of risk in healthcare choices. The person whose study I was involved with recently gave a talk at an event I co-organised (more about this later). Anyway, the researcher was interested in the way that patients perceive the benefit and risk associated with participating in breast cancer screening programmes.

I was presented with information in various formats, including the cost to me to attend (this was assuming a publicly-funded healthcare system, and so only included things like travel cost and inconvenience), the risk of getting a false positive and attending for unnecessary treatment, and the chance of the test detecting an actual cancer that needed treatment.

I was shown a number of screens for different options, and asked to evaluate my preferred testing choices. After I had completed the exercise, the researcher told me a little about what they were using the data for: when healthcare choices are made in the UK, the reliance tends to be on the clinician, and the effect of decisions made by the patient has not really been examined. While patients do not prescribe their own medication or select appropriate treatments, they do make choices about whether to attend screening, whether to continue with a form of treatment, or whether to follow the doctor’s advice.

After the study we spoke about this, and other related topics, for about an hour. This was a really cool experience because not only did I get to participate in, and find out about, exciting research happening at the University, I got to meet somebody really interesting and have a properly intellectually stimulating chat.

Below is a link to the Research Volunteering page on the University of Manchester website. Here, researchers post surveys and opportunities to take part in studies undertaken by staff and postgraduate researchers at the university.

I’ve taken part in one study (for which I was compensated with a lovely £10 Amazon voucher), and I’ve applied for lots more. I find it very exciting to be a part of new research. My participation will contribute to someone’s Ph.D, and it might influence future technology.

All of the projects first have to be run past the University’s ethics committee, and any medical research also has to comply with NHS regulations. I’m going to be submitting my own request for a research survey soon, and I’ll let you know when it’s live.

I Am A Science Lady is a participant in the Amazon EU Associates Programme, an affiliate advertising programme designed to provide a means for sites to earn advertising fees by advertising and linking to Amazon.co.uk.