Sunday, 20 December 2009

Here you go, as promised a rewritten project proposal written for beginners to neuroscience. Not quite a plain English rewording, but simplified and with key terms defined. This is what I will be doing up until September...

The way in which the brain processes our main senses, such as sight and hearing (and by the way, don’t go thinking you only have the five senses we all know about, there are far more!) is fascinating. These sensory areas are what we call ‘topographically organised’, which basically means that the surface of the brain, the cortex, acts as a little map of what we perceive. For example, if you could look down on the main visual area of your brain, called V1, and see what each brain cell is processing, it would look more or less the same as the picture you see out of your eyes (a hugely simplified explanation, but you get the general idea).

The hearing process is slightly different. In another simplified explanation, sound is processed in terms of what we perceive as the ‘pitch’ of a sound, or how ‘high’ or ‘low’ it sounds; its frequency. Each frequency (or note, if that makes it simpler) is processed by a separate part of the primary auditory cortex, or A1. These parts of A1 are also organised by frequency, a bit like the keys of a piano going from low notes up to high. So if you were to scan the brain while running your hand up a piano, you would see a ripple of activation move along A1. The range of frequencies that each auditory brain cell responds to is called its ‘receptive field’.
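As a toy illustration of that piano-key layout, the little sketch below maps frequencies to positions along an imaginary tonotopic axis using a logarithmic scale, so each octave occupies an equal stretch of the map. The 20 Hz to 20 kHz limits are simply the nominal range of human hearing; nothing here is measured from a real A1.

```python
import math

def tonotopic_position(freq_hz, low=20.0, high=20000.0):
    """Map a frequency to a relative position (0..1) along an
    idealised tonotopic axis, using a logarithmic scale like
    the keys of a piano. Purely illustrative numbers."""
    return math.log(freq_hz / low) / math.log(high / low)

# Octave steps land at evenly spaced positions on a log axis:
for f in [110, 220, 440, 880]:
    print(f"{f} Hz -> {tonotopic_position(f):.2f}")
# roughly 0.25, 0.35, 0.45, 0.55: each octave is one equal step
```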

If you read the entry I wrote recently about neuroplasticity you will know that the organisation of the brain, or its ‘wiring’, can be changed. When this occurs in our sensory areas we can think of it as ‘remapping’. Research in animals has shown that the receptive fields of neurons in A1 can undergo these ‘plastic’ changes very rapidly, as a result of what the animal learns to associate particular sounds with. Amazingly, these changes occur within minutes.

If something happens to make the animal associate a particular frequency with something external, the tone acquires behavioural relevance and a large number of these A1 cells shift their preferred frequency and begin to respond more to the new frequency. This effect has been shown to depend on the animal paying direct attention to the stimulus. In one experiment, two groups of rats were trained to respond to musical tones. One group responded to the frequency and the second group responded to the volume of the tone. So, each group was played a series of tones at different frequencies or different volumes, and had to respond only to the frequency or volume to which they were trained to respond. The frequency rats demonstrated changes to the frequency map in A1, with more cells firing in response to the trained tone. This didn’t happen in the rats trained to respond to loudness, but they did have an increase in the type of cells that respond to volume rather than frequency. This supports the idea that brain cells can be changed depending on what an animal needs them for, and what sounds hold a particular relevance for the animal. Other studies have achieved the same result without training the animal, instead electrically stimulating parts of the brain, such as the nucleus basalis, when the animal heard a particular frequency.

However, very little work has been done in humans on this subject. So our study aims to determine whether conditioning of a particular frequency can lead to improved performance in detection and/or discrimination of that frequency amid others, as would be predicted if human receptive fields show similar plasticity to that documented in animals.

What we are planning to do is to compare the ability of people to detect a particular frequency as well as to discriminate between the frequency and others close by. In the detection task subjects will have to decide which of two successive bursts of white noise contained a ‘hidden’ embedded auditory tone. In the discrimination experiment participants will be required to decide whether the second of two successively presented pure tones was higher or lower than the first. After this initial detection / discrimination task, subjects will undergo a training method known as ‘classical conditioning’, repeatedly pairing one distinct target frequency tone with an electric shock to the forearm so the participant comes to associate that frequency with receiving a physical shock. After this association is established, the detection / discrimination tasks are repeated, with an occasional “topping up” of the shock conditioning. After 40 minutes, the detection / discrimination task will continue without further conditioning. The absence of reinforcement of the target frequency will then lead to what is called “extinction” of the association between frequency and shock. We will then compare the ability of participants to spot the tone to which they were conditioned before and after the task.

If the animal studies are applicable to humans there is likely to be a greater number of brain cells in A1 detecting the target frequency, because it has become associated with the electric shock. More cells responding to that frequency should make people better at spotting it. If we find a significant effect we may then go on to repeat the experiment while monitoring brain activity using a method known as MEG (magnetoencephalography, which measures the tiny magnetic fields produced by brain activity), to see what is going on in the brain during the experiment.

This study seeks to determine whether detection and/or discrimination of a pure auditory tone can be improved by classical conditioning, pairing a target frequency with an electric shock.

Literature Review

Work by Merzenich, Weinberger, Irvine and others has shown that receptive field properties of neurons in primary auditory cortex (A1) can undergo rapid plastic changes in response to behavioural learning in animals (reviewed in Weinberger 2004, Weinberger 2007, Irvine 2007). Remarkably, these changes occur within minutes. During learning, when a target frequency acquires behavioural relevance, a large number of A1 pyramidal cells shift their best frequency towards this distinct frequency. This effect has been shown to depend on attention, i.e. behavioural relevance (Polley et al., 2006). Two groups of rats underwent operant conditioning with identical stimulus sets. One group responded to a target frequency and demonstrated tonotopic changes resulting in an increased representation of the target frequency, while the second group performed the task (with exactly the same stimuli) in response to a target loudness, which led to changes in the topographic organisation of neurons’ preferred loudness. In non-human primates, Blake et al. (2006) demonstrated that active cognitive control and involvement are needed for tonotopic re-mapping to occur.

Later mechanistic assessment has revealed that the neurotransmitter acetylcholine (ACh) is crucially involved in these plastic processes. Pairing brief ACh infusions with the purely passive presentation of tones induced changes in the AI tonotopic maps similar to the ones observed in the experiments described above. In addition, stimulation of the nucleus basalis, the main source of corticopetal cholinergic projections, led to identical remapping. These findings are intriguing because they strongly argue against the long-held view that primary sensory cortices are merely passive input structures in which plastic changes of receptive fields occur only during early ontogeny. Instead, the studies summarised indicate that the sensitivity and perhaps even local network resonance patterns can be dynamically adapted to current behavioural requirements.

Very little work has been done in humans on this subject. Thus, we aim to determine behaviourally whether conditioning of one or another frequency can lead to improved performance in detection and/or discrimination of pure tones, as would be predicted if human receptive fields show similar plasticity to that documented in animals.

Materials and Method

In a within-subject design (with conditioned frequencies counterbalanced over subjects), we will compare the detection (experiment A) as well as the discrimination (experiment B) of pure tones. The detection task will employ a two alternative forced choice (2AFC) scheme in which subjects have to decide which of two successively presented white noise stimuli actually contained a pure tone. In the discrimination experiment, participants will be required to decide whether the second of two successively presented pure tones was higher or lower than the first one. In both experiments tones of a range of frequencies will be used and this part of the experiment will last about 15 minutes.
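As a rough illustration of what a single 2AFC detection trial involves, the sketch below builds its two stimuli: two bursts of white noise, one of which (chosen at random) also contains the pure tone. All parameters here (tone frequency, duration, signal-to-noise ratio) are illustrative placeholders, not the values the study will actually use.

```python
import numpy as np

def make_intervals(tone_hz=1000.0, fs=44100, dur=0.5, snr_db=-6.0, rng=None):
    """Build the two stimuli for one 2AFC detection trial: both
    intervals are white noise, and one (chosen at random) also
    contains a pure tone at the requested signal-to-noise ratio."""
    rng = rng or np.random.default_rng()
    t = np.arange(int(fs * dur)) / fs
    noise_a = rng.standard_normal(t.size)
    noise_b = rng.standard_normal(t.size)
    # Scale the tone relative to the noise to hit the requested SNR.
    tone = np.sin(2 * np.pi * tone_hz * t)
    tone *= 10 ** (snr_db / 20) * noise_a.std() / tone.std()
    target = int(rng.integers(2))      # 0 or 1: which interval hides the tone
    intervals = [noise_a, noise_b]
    intervals[target] = intervals[target] + tone
    return intervals, target

intervals, target = make_intervals(rng=np.random.default_rng(0))
print("tone hidden in interval", target + 1)
```

The subject would hear the two intervals in succession and report which one contained the tone; the trial is scored by comparing their answer with `target`.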

After this initial detection / discrimination block, subjects will undergo classical conditioning, pairing one distinct target frequency tone with an electric shock to the forearm. After this association is established, the detection / discrimination 2AFC routines are repeated, interleaved with further conditioning blocks (“topping up”). After 40 minutes, the detection / discrimination task will cease to be interrupted by further conditioning. The absence of reinforcement of the target frequency will then lead to extinction of the association between frequency and shock.

A number of potential follow-up studies are conceivable. First, the work by Blake and colleagues (2006) suggests that operant conditioning might be more effective in inducing tonotopic changes. Thus, modifications of the paradigm employing reward or punishment depending on performance are possible. Also, there are potential MEG versions of all experiments described which would, through analysis of early latency auditory components of the evoked magnetic fields, allow for a direct assessment of the underlying neurophysiological principles.

Predicted Outcomes

This series of experiments allows for three possible outcomes:

1. Conditioning may improve tone detection performance but not tone discrimination. This situation would allow for the conclusion that a greater number of neurons are responding to the target frequency after conditioning, but that this improvement does not involve a sharpening of best frequency tuning curves.

2. Conditioning may improve tone discrimination performance but not tone detection. This outcome could be interpreted as a potential increase in local signal-to-noise ratio. This situation seems somewhat unlikely, however, since previous studies reported best frequency shifts in large numbers of cells rather than sharpening of existing tuning curves.

3. Finally, if conditioning leads to performance improvements in both detection and discrimination, our interpretation would be that although there was an increase in the number of neurons responding to the target frequency, this change does not come at the expense of frequencies around it. In this situation it would be interesting to study the underlying compensatory mechanisms in a later MEG experiment.

Analysis

The data will be analysed with ANOVA (random effects) using the SPSS statistics software package. Further analysis may be required depending on the results.
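The actual analysis will be a repeated-measures design in SPSS, but the flavour of the test can be sketched in Python. The accuracy figures below are made up purely for illustration, and a simple one-way ANOVA stands in for the full random-effects model (it ignores the within-subject pairing that the real analysis would exploit).

```python
import numpy as np
from scipy import stats

# Hypothetical per-subject accuracy (proportion correct) in three
# phases: baseline, post-conditioning, post-extinction. Made-up data.
baseline    = np.array([0.62, 0.58, 0.65, 0.60, 0.63, 0.59])
conditioned = np.array([0.71, 0.69, 0.74, 0.68, 0.72, 0.70])
extinction  = np.array([0.64, 0.61, 0.66, 0.62, 0.65, 0.60])

# One-way ANOVA across the three phases.
f_stat, p_value = stats.f_oneway(baseline, conditioned, extinction)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```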

Timetable

Preparatory work: January - March 2010 (generation of stimuli, programming of the actual experiment, pilot measurements)
Data collection: March - June 2010 (16 to 20 subjects each group)
Analysis and write-up: June - July 2010

Budget

Participants will be reimbursed for their time and effort using existing research grants of the ICN attention group. No investment in equipment or software will be necessary.

Ethics

Full ethical approval will be sought from the Graduate School Research Ethics Committee prior to pilot data collection. The ethics application will be submitted in early January 2010.

Thursday, 10 December 2009

My apologies, but this may be a rather lengthy, self-indulgent entry.

There are times on this course when I really need to take a deep breath and swallow down the surge of inadequacy that I feel building up inside me. It is something akin to those moments in life when one almost throws up but somehow at the last second manages to swallow down the noxious brew. Both avoid leaving oneself in an unpleasant position, but equally leave you with a revolting taste in your mouth.

A tad overdramatic? Perhaps.

It is fair to say, however, that I often feel out of my depth on this course. Those of you who have read my past entries in this blog will know that my school record was far from exemplary. You will also know that I have felt more than a sliver of trepidation at being accepted to study at such a prestigious institution, a world leader no less.

It has not been the finest of weeks. Monday started off optimistically enough, with a meeting at the Institute of Cognitive Neuroscience (ICN) to discuss my upcoming research project with my soon-to-be collaborator, Dr Christian Kluge. I left the ICN buzzing at having drawn up an exciting and original piece of research with Dr Kluge, and feeling a lot more confident about what would in the new year become the embryonic stage of my thesis.

How crushed I was then, on returning home to see in my inbox the following words from Dr Kluge:

“There is an ongoing project with quite similar designs that i did not know of. Therefore, we will probably have to re-think what we want to do”.

Anyway, there was little time to waste, as in seven days’ time I would be required to make a brief presentation to my peers and the course administrators outlining my research project. What Dr Kluge proposed was that we meet with Professor Jon Driver, one of the directors of the ICN and the man who would be supervising the project that Dr Kluge and I will spend the next nine months on.

It didn’t go well.

I came across as a bumbling, poorly read amateur. Prof Driver was clearly unimpressed, and Dr Kluge was visibly embarrassed at having brought me into his office. Nevertheless, between us we managed to thrash out a viable research project which certainly has potential to be an exciting piece of work.

The one advantage of making such a poor first impression on such an important figure within the ICN is that there is now only one direction in which his opinion of me can go. The last thing Professor Driver asked of me, as he was on his way into a two-hour meeting, was to draw up the slides for my presentation on Monday, and to do it before Friday so he could take a look at them.

It was in his inbox by the time he left the meeting.

The best thing about fighting down those feelings of inadequacy is that it resets one’s perspective, so that we may replace them with feelings of pride and self-congratulation; quite rare for me to feel and even rarer to voice. But if I am honest I have done bloody well to get here. It can be off-putting at times to hear some of my fellow students list their accomplishments, to hear them reel off terminology that leaves me perplexed, and to have them take the time to explain things to me in simple terms.

I should not lose sight of the fact that I came into this course without a single science A-level, and ten years after a decidedly average performance at GCSE science. In addition, contrary to what some may think, a psychology degree is far from the ideal prerequisite for a cognitive neuroscience MSc, let alone one with as little scientific content as my bachelors degree had. But then again, I had no experience of psychology before undertaking my degree, and emerged with first class honours.

This may all be new to me now, and I may have to endure some snobbery, condescending comments and pangs of self-doubt, but I will learn fast, improve exponentially and come out the other side with one heck of a valuable qualification.

Thursday, 3 December 2009

Some people are just ahead of their time.

"Men ought to know that from nothing else but the brain come joys, delights, laughter and sports, and sorrows, griefs, despondency, and lamentations. And by this, in an especial manner, we acquire wisdom and knowledge, and see and hear, and know what are foul and what are fair, what are bad and what are good, what are sweet, and what unsavoury... And by the same organ we become mad and delirious, and fears and terrors assail us... All these things we endure from the brain"

Wednesday, 25 November 2009

Not much to report over the last couple of weeks, I had a reading week recently and actually managed to get rather a lot done, having spent several hours in the library making notes on this and that. It was a much needed 'break' of sorts, if only from the time-tabled activities.

I am currently trying to think of a research project which will form my thesis. I have a pretty good idea who will be supervising me, a guy over at the Institute of Cognitive Neuroscience who is researching plasticity in the auditory cortex.

Plasticity is something that fascinates me greatly, and one book in particular, 'The Brain That Changes Itself' by Norman Doidge, was more or less the main reason why I applied for this particular course.

Essentially, plasticity (or neuroplasticity) is the ability of the brain to 'rewire' itself, to make new connections. For the century or so since the 'neuron doctrine' came to the forefront of thinking about the brain, it was believed that the structure of the brain was more or less fixed from adolescence onward. Scientists thought that no new brain cells could be formed, and that if a part of the brain was damaged then it was lost for life.

Of course, there has to be some degree of plasticity in the brain, otherwise we couldn't form new memories, but we now know that the brain is far, far more plastic than previously thought.

The most exciting research being done in this field that I am aware of is in the area of sensory substitution, most notably the work carried out by a guy by the name of Paul Bach-y-Rita.

Bach-y-Rita developed a device which could allow the brain to recover one lost sense through another. If I wanted to sensationalise this, I would say he made blind people see again. But that wouldn't quite do him justice; he made them see out of their tongue.

Sounds insane, but it is actually possible. The key concept here is that we don't see with our eyes, just as we don't hear with our ears. All of our senses are essentially electrical information carried to and processed in the brain. For example, the actual physical image of what you see doesn't get any further than the back of the retina, before it becomes a complex series of electrical pulses that are then carried to the visual areas at the back of the brain.

When somebody is blind, it is generally because of a problem with their eyes, rather than the visual areas of the brain. Therefore if an alternative pathway could be found to get these electrical pulses to the visual cortex, you have vision.

What Bach-y-Rita invented was a small device that sits on your tongue and converts a picture from a video camera into electrical information. Imagine a strip of plastic with hundreds of tiny electrical points covering its surface. Each one of these points is like a pixel in a digital image: the brighter the pixel, the stronger the electrical pulse. Multiply this over the entire surface of the tongue and you can make up a crude image of your visual field. Apparently, when this device is activated it feels like those old 'popping candy' sweets you can get that fizz in your mouth.
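The pixel-to-pulse idea can be sketched in a few lines: coarsen a camera image down to the resolution of the electrode grid, then let brightness set the pulse strength. The grid size and intensity scale below are made-up illustrative numbers, not the specifications of Bach-y-Rita's actual device.

```python
import numpy as np

def image_to_pulses(gray, grid=(20, 20), max_intensity=1.0):
    """Downsample a grayscale image (values 0-255) to the resolution
    of a hypothetical tongue electrode grid and map brightness to
    pulse strength: the brighter the pixel, the stronger the pulse."""
    h, w = gray.shape
    gh, gw = grid
    # Average the image over coarse blocks, one block per electrode.
    blocks = gray[:h - h % gh, :w - w % gw].reshape(
        gh, h // gh, gw, w // gw).mean(axis=(1, 3))
    return max_intensity * blocks / 255.0

# A random stand-in for one camera frame.
image = np.random.default_rng(0).integers(0, 256, size=(200, 200))
pulses = image_to_pulses(image.astype(float))
print(pulses.shape)  # (20, 20)
```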

The amazing thing is, over time your brain learns how to process the signals as images, and slowly the signal becomes a valid, if slightly basic, black and white image. Brain scans using MRI machines have even confirmed that this information is being processed in the visual parts of the brain.

So this now becomes a philosophical question - if visual information is being processed in the visual cortex, but it just so happens it is relayed via the tongue by a video camera - is this still eyesight?

Monday, 9 November 2009

I realise that my last entry was pretty dull, so I found out an interesting little fact to keep your interest.

There are 100-150 billion neurons in the human brain. Each neuron may connect with around 10,000 other neurons. If each neuron connected with every other single neuron, our brain would be 12.5 miles in diameter (Nelson & Bower, 1990). This is the size of Greater London.

Taken from Jamie Ward's book 'The Student's Guide to Cognitive Neuroscience', chapter 2.
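Those first two figures can be sanity-checked with a quick back-of-envelope multiplication (taking the lower estimate of 100 billion neurons):

```python
# Back-of-envelope check of the connection counts quoted above.
neurons = 100e9              # lower estimate: 100 billion neurons
synapses_per_neuron = 10e3   # ~10,000 connections each

total_connections = neurons * synapses_per_neuron
print(f"{total_connections:.0e} connections")  # 1e+15, a quadrillion
```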

Sunday, 8 November 2009

Over a month into the course now, and my predominant feeling is exhaustion. I can't particularly explain why, but I am bloody tired. How handy it is then, that next week is a reading week, and it couldn't have been timed better.

There isn't a great deal to report about the last fortnight. We had some interesting lectures and some achingly dull ones. We learnt a little about how computational modelling can help us to understand the brain, had another case study at the hospital, and a whole load more stats.

I got my marks back for the first statistics test as I was handing in the second assignment. I did reasonably well, considering the conditions under which I wrote the last one (see my previous entry on my late-night rewrite). The marks I dropped were, I think, due mostly to the length of time it has been since I had to do any statistical analysis, and a couple of silly mistakes. I feel a lot more confident about the second test, and it would seem that I am back into the swing of things.

My main worry now is the exam in January (the only exam on this course), which is on the neurophysiological side of the course. Of course, this stuff is bloody hard, and Marty has just put a few example questions on his website, which really put the fear into me. As I was rereading my notes on how brain cells communicate with one another, I was struck by just how easy this would be if the brain were intrinsically self-aware. My neurons are firing right now, in many different parts of my brain, as are yours, as are all of ours, all the time, many many times over. We should be experts in this. Given the frequency with which our neurons fire you could argue that it is the single most practised act any human has ever performed.

Friday, 23 October 2009

The third week in, and things are getting tougher. On the plus side, though, certain things are getting more interesting.

For example, we had a lecture explaining how an MRI (magnetic resonance imaging) scanner is able to capture such detailed images of the brain. Would you like to know how it works?

Essentially, an MRI scanner is a tube containing a very strong magnetic field, tens of thousands of times stronger than the Earth’s magnetic field. This field is kept constant in time and absolutely uniform at every point within the scanner.

The protons inside the head are all busy jiggling around all over the place, randomly veering around at different angles, but once inside the scanner they all begin to align, with every proton facing the exact same direction, due to the power of the huge magnetic force generated by the machine.

The operator can then send a strong sudden magnetic pulse through the scanner, which flips every single proton in the scanner chamber to one side. To help you imagine this, picture all the protons facing north, and then suddenly being knocked to face east (of course this is not how it happens, but it can be easily visualised).

Now, this is the clever bit. The protons will then flip back to align with ‘north’, but in each different type of tissue this process takes a slightly different amount of time. So bone, grey matter, white matter, even more and less oxygenated blood, will each take a slightly different time for their protons to recover from the knock and face ‘north’ again.

This different timing for each type of tissue is crucial, as the scanner can keep a record of how long the protons in each spot took to recover, and this will affect whether a light or dark patch appears on the screen, giving us a detailed image of the skull, brain and surrounding tissue.
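The tissue-dependent recovery described above is usually written as the T1 relaxation equation: the signal recovered a time t after the pulse is 1 - exp(-t/T1), with a different time constant T1 for each tissue. The sketch below uses rough textbook T1 values for a 1.5 tesla scanner; they are illustrative ballpark figures, not numbers from this course.

```python
import math

def t1_signal(t_ms, t1_ms):
    """Fraction of longitudinal magnetisation recovered a time t
    after the pulse: 1 - exp(-t / T1)."""
    return 1.0 - math.exp(-t_ms / t1_ms)

# Illustrative T1 values (ms) at 1.5 T: white matter recovers
# fastest, grey matter more slowly, cerebrospinal fluid slowest.
tissues = {"white matter": 600, "grey matter": 950, "CSF": 4000}
for name, t1 in tissues.items():
    print(f"{name}: {t1_signal(500, t1):.2f} recovered at 500 ms")
```

Because each tissue has recovered a different amount when the scanner reads the signal, each shows up with a different brightness, which is exactly the contrast described above.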

Amazingly, the machine can do this in 3D, so you can then work through sections of the brain as if you were travelling through the body.

It is quite an amazing procedure, and if you are interested you can see more here.

On an unrelated note, I had my first bit of coursework due in yesterday, a particularly nasty statistics worksheet. The night before it was to be handed in I had it all finished, printed off, done and dusted. My first early night in ages: I was actually in bed by 11, which is unheard of.

12:30 a.m., I get a phone call. It is Melissa, a girl on my course, who I collaborated with on the work. She tells me there was an error in our data, meaning all our calculations were wrong. So, at one in the morning I had no choice but to rewrite the bloody thing from scratch, redoing all my calculations, graphs and tables. I got to bed some time after 4 a.m., grabbed a bit of sleep, and handed it in at ten the next morning.

Sunday, 18 October 2009

To my right-hand side sits a pile of my old undergraduate study aids, module guides, notes and help sheets.

Directly in front of me sits my laptop and, next to that, three textbooks, one intimidatingly large yet deceptively simple, the other two smaller and more advanced.

What do these myriad documents have in common? If I were to hazard a guess I should suggest that they were faxed to me direct from Hades by Satan himself, written in his own blood and printed on the compacted bones of history's greatest tyrants. But in reality the truth is far more sinister. For what links these formidable documents is a word, one single word, that strikes fear into the hearts of countless generations of poor psychology students.

Statistics.

Or maybe it's just me. See, I have always struggled with numbers. Words, on the other hand, have always come naturally to me, flowing from my lips (or as is increasingly the case, fingertips) without any kind of difficulty, but numbers, with their logical structure and stubborn insistence on being 'right or wrong' make me want to curl up into a little ball and cease to function.

Anyway, this was not supposed to be a moaning entry about the stats coursework I am currently writing. All you really need to know is that I looked at the questions, started to panic, stopped panicking and all of a sudden it all started to come back to me, albeit in a trickle rather than a flood.

No, my real intention for this entry was not to moan about statistics, but to rejoice in how wonderful Thursday was. Every other Thursday, as part of our 'lesion approaches' module, we are invited over to the National Hospital for Neurology and Neurosurgery for a real life case presentation.

What this entailed (although obviously I can't be specific, in the interest of patient confidentiality) was a short talk from some of the senior doctors about a specific patient, how their brain has been injured and how the injury has affected them. The patient was then brought in and took part in a short mock examination whereby the doctor tested various aspects of cognitive function with a series of standardised tests. We were then given the opportunity to ask the patient questions about their injury and subsequent experiences. The patient then left, and we had a discussion about the case and the chance to ask the doctor any remaining questions.

It was absolutely fascinating, and really brought this MSc course crashing into the real world. I was overwhelmed by the generosity of the patient at coming in to discuss such a personal and traumatic experience, and I feel that these case presentations will be an incredibly valuable part of the course.

Oh well, that is quite enough enthusiasm for one entry. I must get back to the utter joy of these statistical calculations.

Tuesday, 13 October 2009

Week two began once again with Structure and Measurement of the Brain, delivered by Marty Sereno, who, I hadn’t quite appreciated until now, is one of the world’s leading researchers in his field. Anyone interested can see him at work below.

The topic this week focused on the development of the brain in the womb, with a detailed explanation of how a small cluster of cells slowly divides, forms the beginnings of the spine and the eyes, and gradually folds over on itself to take shape as the infant brain. This was fascinating, and I was really impressed by Marty’s knowledge, with frequent deviations to discuss the brains of other mammals and assorted creatures of all different sizes.

He then went on to discuss the visual system, and how surprisingly complex it is. We have no idea quite how much our brain needs to do in order to make sense of the world around us. For example, what we perceive as a single field of vision is actually processed in several different parts of the brain, such as an area for the periphery, an area for our immediate focus, etc. These different maps are then seamlessly patched together so that they are perceived as one single image. This is on top of flipping the image (which arrives upside-down because of the optics of the eye’s lens), and creating a 3D image of the world via depth perception, all happening simultaneously without us knowing it. He also discussed the limitations of our knowledge of just how this is done, with particular reference to objects moving across our field of vision.

We had an extended lecture with Marty, 3 hours rather than the usual 2. He actually managed to maintain my full and undivided attention for almost the full 3 hours, until, that is, he started talking about areas of the visual cortex known as (I kid you not) ‘blobs’. This was about the point when both my interest and understanding trailed off, the final straw being the revelation that the areas between the ‘blobs’ are known as ‘interblobs’. Frankly, if scientists can’t be bothered to name these things properly, I can’t be bothered to understand them.

The reading Marty set us this week is much more accessible too. One bit I found particularly interesting is how the brain of a tennis player will come to perceive his or her racquet as an extension of their arm, expanding its mental map of their immediate surroundings to accommodate this enlarged ‘body shape’.

This follows research by Iriki et al (1996) on monkeys using sticks as primitive tools, and the science is explained thusly:

“The visual receptive fields expand when the monkey uses the rake as an extension of its hand, while the somatosensory receptive fields are unchanged. This is interpreted as a change in the body image: The enlargement of the visual receptive field reflects the neural correlate of a hand representation that now incorporates the tool. The visual receptive fields return to their original size within a few minutes after tool use is discontinued. They do not expand at all if the monkey simply holds the rake without intending to use it. These rapid changes in visual receptive field size indicate that the neural connections that allow for the expansion must be in place all along.”

Anyway, as challenging as this module is I feel it may end up being a very rewarding one, and perhaps even a personal favourite, as long as I can keep up with the reading material, and maintain the pretence of understanding it.

The afternoon talk was delivered by a guest lecturer who spoke on autism and Williams syndrome, both examples of what can happen when certain parts of the brain don’t function as they should. Frustratingly, all she did was read out her lecture slides without any elaboration, meaning she may as well have just emailed us all and we could have read the bloody things in 20 minutes. It didn’t help that she pitched the thing at GCSE to A-level standard, not far above the quality of science you would expect on ‘Loose Women’. And yes, I am fully aware that last week I was complaining about things being too hard, only to complain this week that it is too easy.

Friday, 9 October 2009

As the first week of fifty passes, it seems appropriate to stop and reflect on the last five days. Granted, it wasn't technically the first week (we had a week of introductory talks, induction sessions and general 'getting to know people' type events last week) but it was the first week of teaching, and my first class on Monday morning left me with a heavy heart.

Now, much had been made by the lecture staff about the wide variety of backgrounds and experiences we, the students, had prior to the course. For some this is their second or third masters, but for the majority it is their first postgraduate qualification. Many, like me, graduated with a BSc or BA in Psychology, but there is also a wealth of other talent on board, from computer science students to those who studied linguistics. Some studied neuroscience, and others had a more biomedical background.

Therefore one would hope that the lecture staff would really start with the basics, and build up to more complex and specialist information. Sadly not. My main complaint about Tuesday morning is that Marty Sereno, who teaches 'Structure and Measurement of the Brain' (which encompasses Neuroanatomy, Neurophysiology, and Neuroimaging Physics), seemed to assume we were all already familiar with pretty advanced maths, physics and chemistry concepts, and spent no time explaining much of the terminology he employed. For poor old me, who hasn't done any maths or real science in the last decade (not since the last century, in fact), this was a bit of a problem.

This was exacerbated by the fact that, rather than starting at the beginning, Marty seemed to deliver the points of his lecture in a bizarre, mixed-up order of concepts, often explaining how a process works well before explaining what the process actually is and does. Passionate he certainly is, and his knowledge seemed superb. But when I asked a question (basically 'what on earth are you talking about?') he merely stated he would come back to my question, and then promptly forgot.

Luckily, when I went home and began to plough through the reading, things started to make sense. I now have a basic understanding of the signalling processes employed by neurons (or brain cells), which is far too dull to even bother explaining here, but in very simple terms it has to do with flipping the balance of positively and negatively charged ions inside and outside the cell membrane to create a sudden electrical impulse, which then fires along to the next brain cell to convey the signal. Simple, and it only took a few hours of reading.
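For the programmers among you, that build-up-charge-then-fire idea is often captured in a standard toy model called a 'leaky integrate-and-fire' neuron. This is my own little sketch of the textbook model, not anything from the lecture, and the numbers are purely illustrative rather than physiological:

```python
# Toy leaky integrate-and-fire neuron: inputs nudge the "voltage" up,
# it leaks back toward rest, and crossing a threshold produces a spike
# (the sudden electrical impulse) followed by a reset.

def simulate_lif(inputs, threshold=1.0, leak=0.9, v_rest=0.0):
    """Return the time steps at which the model neuron spikes."""
    v = v_rest
    spikes = []
    for t, i_in in enumerate(inputs):
        v = v_rest + leak * (v - v_rest) + i_in  # leak toward rest, add input
        if v >= threshold:                       # threshold crossed: spike...
            spikes.append(t)
            v = v_rest                           # ...then reset the voltage
    return spikes

# A weak steady input never reaches threshold, so the cell stays silent;
# a stronger input makes it fire at regular intervals.
print(simulate_lif([0.05] * 20))  # []
print(simulate_lif([0.3] * 20))
```

The nice thing about this simplification is that it keeps the all-or-nothing character of the real process: below threshold, nothing happens at all.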

The rest of Tuesday was pretty much administrative stuff, learning about the departmental intranet and IT services and a little information about our major project for the course. We also discussed applying for PhD funding, which needs to be done very soon indeed. This presents me with a bit of a problem, as a large part of my reason for doing this course is that I don't yet know what field I want to go into for my PhD, and so was hoping that the masters would give me extra knowledge of all aspects of neuroscience. So it came as a bit of a shock to learn that I would need to apply for funding within the first month of the course. Arse.

By the time my second day of lectures came around, I was still worrying about my experience of Monday's class. Surely it couldn't all be this hard? If week one was going to be that complex, then what the hell would week ten, twenty or fifty be like?

Luckily, I was pleasantly surprised by Thursday, as much of it was very basic revision of the statistical concepts I had learnt at undergraduate level. I would say I was already familiar with about 90% of what was discussed in our first 'Advanced Quantitative Methods' class, and looking over the timetable for the coming months, most of this module does seem to build on concepts with which I am already familiar. I had been dreading statistics, but I left the class feeling very, very relieved.

Which brings me on to my final class of the week, 'Lesion approaches'. This class covers what we can learn about the brain by examining the impairments caused when a specific area of the brain is damaged. I have a feeling this may become a personal favourite of mine, and although week one was very basic and introductory in tone, it did not disappoint.

So there you have it, week one. Sorry there wasn't much to report, but due to the nature of these things this first week was mostly spent getting to know the staff, finding out where we needed to be for each class and discussing what we will be learning over the next 12 months.

I would imagine that as of Monday, the exciting stuff starts! But don't worry, I will be here to keep you posted!

Thursday, 8 October 2009

Good lord, it appears I am going to spend 23 hours a day for the next 12 months solidly reading.

I may have hit a snag in my plan, which at first seemed simple. I would take what I have learnt on the course and post it in an informative, witty and simple format whereby anybody could understand and appreciate the wonders of the brain.

As I said, that was the plan. Cast your eyes over this little lot:

Now, I appreciate that those are all real words, and many of them may even appear in the English language. Just, not all at once and certainly not in that order.

That is one page out of sixty that make up this article.

For this particular module, we have been given three articles to read this week. Three core articles, and then a series of other 'suggested readings'.

I have four modules running simultaneously at any one time, each with their own prescribed reading list.

Wednesday, 7 October 2009

After what has felt at times like the longest of all summers, October has arrived, and with it my opportunity to embark upon a Master of Science degree.

This has been more than a little daunting, and I must admit that to many it is probably also pretty surprising (with good reason).

Here is, then, a little background information about me, and I promise that after this entry I will stick to detailing my experiences of the course, the field of cognitive neuroscience, and try and express what I am learning in easily digestible chunks. But for now, the autobiography commences:

When I was at school I was always what could be considered an 'under-performer'. Although many teachers over the years noted that I was pretty bright, this observation was almost universally followed by an immediate 'but'. Typically my school reports would highlight the fact that if a subject did not capture my imagination I just wouldn't bother. Conversely, if it did capture my over-active imagination then I would invariably achieve very highly. In an ideal world this should not present a problem, but god knows a good teacher is hard to find, let alone one who will infuse each and every lesson with a spark of excitement and passion. More often than not, then, my work failed to live up to my potential.

Not that I should attempt to absolve myself of blame entirely; more often than not I was more than happy to let myself coast, expending only the bare minimum of effort needed to scrape by with a moderately acceptable grade. Consequently I did just fine. Nothing great, but enough to ensure that the only way my memory will live on at my old school is in the tales passed down the years about my childish and troublesome, albeit somewhat amusing, behaviour (details of which are probably best left for another forum).

Anyway, I left school as the nineties turned into the noughties, and after a very uneventful gap year I enrolled in, hated, and subsequently dropped out of a drama degree, all within the space of a term. While away I had become quite depressed, which put me off academia for quite some time. This left a substantial gulf in my life, which I filled with a job working in a bar, and I was soon climbing the rungs of a career ladder I had no real desire to be on, with promotion to bar supervisor and the offer of a place on a management training scheme. Four years passed, stuck serving the same old drinks behind the same old bar to the same old idiots, and it was becoming depressingly clear that I had no idea what to do next.

The trouble with bar work is that it does not afford you much time to research career prospects - particularly when you have none. The combination of long hours (15-hour shifts, anyone?), late finishes and working all weekend meant I had little time to consider my options, let alone draw up a strategy for how to act on them. So I took some time out. I saved up as much money as I could muster and jetted off to New York City. For three months.

Finally, a chance to clear my mind of distractions, and to give some real thought as to what I would enjoy doing, and to what I would be particularly skilled at doing (oh, and to have some much needed fun in one of the best cities in the world!).

The result (drum roll please...): I decided I wanted to become a hypnotherapist. Only I didn't. I just didn't yet know that I didn't want it.

Anyway, to cut what was supposed to have been a short story at least relatively short, logic dictated that a psychology degree would be a good springboard for such an aspiration, and so I picked a London university at random, ending up at St Mary's University College in Twickenham.

To be fair to St Mary's, it was a very small campus, and therefore had very limited facilities. The psychology department was great, with some wonderful academic staff, but looking back the university itself wasn't great, and I didn't really fit in to university life on a campus that was primarily sports-based. But, as you know by now, I didn't have the best set of A-levels, so I just had to work bloody hard to make up for all those years of slacking, and I emerged in the summer of 2008 with a first-class honours degree.

As you will know, the summer of 2008 also coincided with the start of a global recession, which rather buggered my employment prospects, but after six months on unemployment benefit I took a job in a medium-secure psychiatric hospital, giving me some great entry-level clinical experience. Some good news, finally! Good news which was followed by more good news: Goldsmiths College wanted me for their Cognitive and Clinical Neuroscience MSc! As great as this sounded, I had my sights set on UCL, and the amazing course they had to offer.

The months rolled by and UCL remained silent, while I remained at the psychiatric hospital (still as staff, I might add) until, eventually, I was accepted onto my dream course.

And we arrive back at the never-ending summer, which has now ended. I am here. Studying at one of the top 5 universities in the country, and one of the top 10 in the world. On one of the top 3 neuroscience courses on the planet. Perhaps now you can see why I started off by saying I was a little daunted by the whole experience.

Now the history is out of the way, I can begin the exciting stuff, the reason I am here in the first place. Cognitive Neuroscience. The biological underpinnings of mental activity. Or something like that, at any rate. I will endeavour to keep this blog updated with my experiences over the next 12 months, and try to translate what I have learnt (well, the interesting bits at any rate) into plain English.