Monday, March 31, 2014

Brian Martin has a helpful article, Countering supervisor exploitation, that considers the problem of Ph.D. students and junior researchers being expected to include inappropriate co-authors on their publications. In some cases their work is completely stolen by superiors. Martin gives practical and realistic advice, including the options of acquiescing, leaving, complaining, or resisting.
He also gives the wise advice of clarifying the criteria for co-authorship, and the senior person's track record on the issue, before joining a research group.

Based on my limited and anecdotal experience, I fear these problems are more prevalent than is acknowledged, and they are getting worse.

I add a few other dimensions to this problem of inappropriate co-authorship.

First, senior people are not always the problem. Sometimes it is junior people who want to include honorary senior co-authors because they think that will "curry favour" with them or increase the chance of the paper being published in a better journal or of being cited.

Second, I have encountered the following two scenarios where junior people are included when they should not be.

"Let's put Susan's name on the paper. She did an undergraduate research project with us over the summer. It will look good for us if we can say we had an undergraduate involved in a paper."

"Peter has been a postdoc in the group for two years but has not produced a paper because he is working on a long-term project, so let's put his name on this paper to help him out."

Third, the relative amount of time spent on the project leading to the paper is not the relevant criterion; rather, it is the level of intellectual contribution. For example, suppose an advisor suggests a project to a student and then meets with them for half an hour every week for a year, often making crucial suggestions as to the next steps. The student spends forty hours a week on the project and writes the first draft of the paper, which then goes through many revisions involving "red ink" from the advisor. In the end the relative time contribution of the advisor is something like 2 per cent. Yet I think co-authorship is fully justified.

Aside: Martin has articles Plagiarism struggles and When ghosts plagiarise that put the issue of student plagiarism in a broader social context, e.g., politicians and university administrators reading speeches and publishing articles that they did not write.

Friday, March 28, 2014

I have heard it claimed that some string theorists say that the theory does make experimentally testable predictions because it predicts gravity! I think this is silly because it is a postdiction or retrodiction, i.e., one is explaining an already known phenomenon, not suggesting new experiments.

However, we should not just make fun of string theorists, because most of us actually do it too! For example,

In my latest paper I counted 11 places where I wrote things like "our model predicts….". Yet in every case it concerns quantities that have already been observed.

Why do we use "predict" like this?
My guess is that it is because once we write down some model Hamiltonian and make some sort of approximation to calculate some observable beforehand we don't know exactly what the outcome of the calculation is going to be.
I welcome suggestions of other justifications.

Why should we stop this practice?
I think using "postdiction" would highlight just how feeble some of our theoretical efforts are. It is very rare in condensed matter and chemistry that theorists predict new phenomena or can make accurate quantitative predictions about unmeasured quantities. This is the challenge of emergence. Furthermore, the choice of model Hamiltonians and approximations is usually based on an intuition as to what is going to give us the answer [both qualitatively and sometimes even quantitatively] that we want.

Low-barrier hydrogen bonds are characterised by an energy barrier to proton transfer that is comparable to the vibrational zero-point energy. As a consequence the proton is delocalised between the donor and the acceptor atoms.
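This delocalisation can be seen by solving the one-dimensional Schrödinger equation for a model double well whose barrier is comparable to the zero-point energy. The sketch below is a minimal illustration in dimensionless units [hbar = m = 1]; the quartic form and all parameter values are illustrative, not a fit to any actual hydrogen bond.

```python
import numpy as np

# Dimensionless units: hbar = m = 1.
# Quartic double well V(x) = V0 * ((x/a)^2 - 1)^2 with minima at x = +/- a.
# Choosing V0 = 2, a = 1 makes the barrier comparable to the zero-point
# energy (~ omega/2, with omega = sqrt(8*V0)/a = 4 at each minimum).
N, L = 800, 5.0
x = np.linspace(-L, L, N)
dx = x[1] - x[0]
V0, a = 2.0, 1.0
V = V0 * ((x / a) ** 2 - 1.0) ** 2

# Finite-difference Hamiltonian H = -(1/2) d^2/dx^2 + V(x)
H = (np.diag(np.full(N, 1.0 / dx**2) + V)
     + np.diag(np.full(N - 1, -0.5 / dx**2), 1)
     + np.diag(np.full(N - 1, -0.5 / dx**2), -1))
E, psi = np.linalg.eigh(H)

# Ground state: substantial probability density in the barrier region,
# i.e. the "proton" is delocalised between the two wells.
rho = psi[:, 0] ** 2
centre_weight = rho[N // 2] / rho.max()
print(E[0], E[1] - E[0], centre_weight)
```

Making the barrier several times larger than the zero-point energy in the same code gives a tiny tunnel splitting and a ground state density with two well-separated peaks, i.e. a localised proton.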

Previously I posted about the general issue of whether these bonds exist in proteins, and more importantly whether they have a functional role. Before I started working on H-bonds I wrote a post about new experimental studies claiming that the photoactive yellow protein has a low barrier H-bond (LBHB). The relevant geometry and the two relevant H-bonds [2.52 and 2.56 Angstroms] are shown below.

The key issue this JACS paper addresses is

in several very recent papers, Saito and Ishikita (33-35) have claimed that … the chemical properties of the pCA···Glu46 bond can be simply explained as a conventional hydrogen bond, without invoking the LBHB concept. In particular,(33) they have carried out quantum mechanical/molecular mechanical (QM/MM) calculations to reproduce the two short hydrogen bond distances of the crystal structure, obtaining 2.57 and 2.50 Å for pCA···Glu46 and pCA···Tyr42, respectively, but they have not found any minimum energy structure with the proton near the central region of the hydrogen bonds. In both cases, the electronic structure calculations lead to energy minima with the two protons clearly belonging to the Glu46 or Tyr42 moieties, respectively.

However, as the authors stress, this earlier work treats the proton classically. The JACS paper takes into account the quantum effects of the proton motion.

They find a low-dimensional ground state potential energy surface using several QM/MM methods [mostly DFT with the CAM-B3LYP functional] and then find the low-lying vibrational eigenstates, for both protons and deuterium. They conclude

our work supports the dual result that, in the solid (crystal) phase, PYP presents an LBHB in the pCA···Glu46 hydrogen bond, whereas in solution this strong interaction is gone and shows characteristics of a “normal” hydrogen bond, much in line with what was found for many simpler systems by Perrin et al. (26, 27) Then our results support the first direct experimental demonstration of the formation of an LBHB in a protein.

I am wondering how robust these results are. There are some extreme subtleties.

Second, following some of the results in my recent preprint, when the donor-acceptor distance R is about 2.5 Angstroms (A) quantum effects become very significant leading to

an increase in the average value of R from the minimum of the classical potential by about 0.1 A due to zero-point energy of the proton.

a difference of about 0.05 A between hydrogen and deuterium. This is significant because it means the distances determined from neutron scattering on deuterated crystals will be different from the native protein with protons.

a significant change in the proton transfer potential [particularly the size of the energy barrier] as R changes by amounts as small as 0.05 A.

Due to all of the above I think that it is going to be difficult to draw definitive conclusions about this fascinating and important issue.

Tuesday, March 25, 2014

In the conservative Australian magazine Quadrant there is a provocative piece, Why Australian universities are just not good enough, by James Allan, a Professor in the law school at the University of Queensland. It pains me to admit that some of his criticisms and concerns are largely accurate. Allan is to be affirmed for bringing some of these issues into public debate. Whether some of his provocative language will help is debatable.

Some of the problems I have highlighted previously on this blog. But he also makes a good case that some of the problems are particularly bad in Australia compared to other countries. The problems [many of which are inter-related] include:

lack of mobility of undergraduates [students just go to the best university in their home city] leading to a lack of real competition between universities within major cities

large first year classes

an acceptance of students doing part-time paid work that distracts from studies

centralised decision making leading to obscure and inflexible bureaucratic policies that are applied indiscriminately

Monday, March 24, 2014

A previous post considered the large orbital magnetoresistance observed in PdCoO2 and reported in a recent PRL. The magnetoresistance depends significantly on the direction of the intralayer field. Below I show the results of a very simple model calculation that explains several key features of the observations. The essential physics of the model was mentioned in the PRL, but the authors reported a full numerical calculation of the magnetoresistance. The outer hexagon below is the first Brillouin zone within the layers. The inner green hexagon is the Fermi surface.

The observations explained by the model include:

the magnetoresistance for intralayer fields can be orders of magnitude larger than that for fields perpendicular to the layers,

for fields in the [110] direction the magnetoresistance saturates at high fields [but, it looks like the experimental value is larger than the predicted value of 200%] and the resistance will decrease monotonically with decreasing temperature, like the zero-field resistance,

for fields in the [1-10] direction the magnetoresistance increases monotonically with field and the resistance will decrease non-monotonically with temperature, as highlighted in the previous post,

for low fields the magnetoresistance is isotropic with respect to intralayer field direction.

The formula for interlayer conductivity comes from a simple solution of the Boltzmann equation. It can be found here and here.

Wednesday, March 19, 2014

UNESCO has declared 2014 the International Year of Crystallography. This is to mark 100 years since Max von Laue got the Nobel Prize for discovering diffraction of x-rays by crystals.
The International Union of Crystallography has produced some nice educational resources for the occasion.
But, I actually prefer this video from the Royal Institution for the Bragg centenary last year.

Due to the involvement of UNESCO there are initiatives to promote crystallography in the developing world. The map below highlights the problem, and shows how science is so under-resourced in Africa.

They are also highlighting how crystallography can aid the development of new materials relevant to pressing issues of clean water, food security, renewable energy, health, and green industry.
Although this is true I have mixed feelings about this. Scientific and technological breakthroughs could potentially help the Majority World in this way. However, the main obstacles to addressing many of these issues are not technical but rather a lack of political will [both on the part of the Western world and corrupt post-colonial national leaders]. There are plenty of feasible, viable, and affordable technical solutions to problems of clean water, disease, and food security. We don't really need more research. I thank Vinoth Ramachandra for teaching me this.

At higher temperatures one does not see a splitting and there is a temperature-activated conversion between the two tautomers.

The ground state can be written as a superposition of two Born-Oppenheimer states [products of nuclear and electronic wave functions]

Psi = |L>|A> + |R>|B>

where |L> and |R> are the two nuclear states and |A> and |B> the two electronic states.

These are approximately orthogonal to each other.

What is impressive about this?

Each electronic state involves about 28 valence electrons. Roughly, each double bond can be described by a valence bond state consisting of a pair of electrons in a maximally entangled singlet state.

This is not quite a Schrodinger cat state. But it is a nanoscale kitten!

How is such a state possible? The key is that the two electronic states are very strongly coupled. They have a Hamiltonian matrix element of the order of electron volts [~10,000 cm^-1]. Yet the tunnel splittings are only a few cm^-1 due to the small overlap of the nuclear states.
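The scale separation here can be checked with a back-of-the-envelope estimate. In the simplest picture the splitting is roughly twice the electronic matrix element times the Franck-Condon overlap of the two nuclear states; modelling |L> and |R> as ground states of displaced harmonic oscillators gives a Gaussian overlap. The numbers below [displacement, vibrational frequency] are illustrative, not taken from any specific molecule.

```python
import numpy as np

hbar = 1.054571817e-34  # J s
m_p = 1.67262192e-27    # proton mass, kg
c = 2.99792458e10       # speed of light, cm/s

# Illustrative parameters:
nu = 3000.0             # proton vibrational frequency, cm^-1
d = 0.6e-10             # displacement between the two proton positions, m
t_el_cm = 8065.5        # electronic matrix element ~1 eV, expressed in cm^-1

omega = 2.0 * np.pi * c * nu
# Overlap of two displaced harmonic-oscillator ground states
S = np.exp(-m_p * omega * d**2 / (4.0 * hbar))
splitting = 2.0 * t_el_cm * S   # tunnel splitting, cm^-1
print(S, splitting)
```

An electronic coupling of order an eV is thus reduced to a splitting of a few cm^-1. Increasing d by only 0.1 Angstroms suppresses the splitting by another order of magnitude, illustrating the exponential sensitivity to the nuclear geometry.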

Hence, these superposition states are very fragile and will be easily destroyed at a few kelvin and/or in any sort of polar solvent. So don't even start thinking about quantum biology!

Monday, March 17, 2014

Over the years I have seen some very impressive people not get permanent jobs and some less able scientists who did. What is the difference?
I feel that sometimes it is just a matter of being at the right place at the right time.

There are many "random" factors that affect both the global and local availability of positions: the economy, changes in government policies, new funding initiatives, new discoveries, new fashions, expanding or shrinking budgets, geopolitical events, changes in Deans and Department Chairs, the whims of influential individuals, retirement or resignation of individuals.... All of these factors are beyond your control.

If you accept this strong random element to the process there are some important corollaries I discuss below.

But, first a qualification. I am not claiming it is totally random and that everyone has an equal chance. For the top few per cent of postdocs [world wide] it is almost a certainty they will get a job. For the bottom 50 per cent it is highly unlikely. Doing quality work, giving nice talks, networking, and having influential supporters certainly helps. But, they far from guarantee a permanent job.

So here are the corollaries to my claim.

Relax and enjoy what you are doing. Looking for a permanent job can be extremely stressful and produce a lot of anxiety. It is easy to start making comparisons with one's peers. Don't. It is also easy to start putting a lot of pressure on yourself, believing "only if" lies such as "if I get one more PRL", "if I get a grant", "if I get a Nature paper" .... "then I will get a permanent job". It isn't that simple. I have been on many search committees and I don't remember the decisions ever being so close that such marginal additions to a CV had any impact on the outcome. All this anxiety achieves nothing and robs you of the joy of doing science.

Humility. If you are one of the fortunate few who get a permanent job, don't let it go to your head and look down on your peers. It probably had more to do with luck than your abilities. The flip side is that if you don't get a permanent job, don't let it affect your self-image or think that it somehow shows your inferiority. You were probably just unlucky.

Given the randomness you can significantly increase your chances by staying in the game longer. But you also have to consider the value of quitting. Furthermore, it depends on the country. I suspect that in the USA once you are onto your third postdoc it is highly unlikely you will land an interview. In contrast, in Australia and the UK it seems hanging in there longer does increase your chances, particularly if you can get a non-tenured research fellowship [like a Research Assistant Professor in the USA].

Given the randomness geographic and institutional flexibility will increase your chances. Sometimes because of family commitments this may not be possible. But once you limit yourself to one country, one city, or even one institution your chances dramatically diminish. Also, I have been impressed to see a few cases of individuals who were willing to take tenure-track jobs at mediocre institutions, keep publishing good work, and then eventually move to a better institution.

Friday, March 14, 2014

Hydrogen bonding represents a particular challenge to computational quantum chemistry. The figures below show the energy of two different "proton sponge" molecules as a function of the position of a proton as it moves between the donor and acceptor. They are taken from a nice paper I blogged about before.

The different curves correspond to different "levels" of theory and methods.

The other four methods involve density functional theory [DFT] with different functionals.

The different methods give significantly different values for the energy barrier [or whether it exists at all] and the positions of the minima.

Which method is "correct", i.e. the most reliable? How does one decide?
What can one benchmark against?
Higher-level quantum chemistry methods [e.g. multi-reference methods] are not feasible for large molecules.

Do these differences matter?
After all, isn't 1 kcal/mol [40 meV] just room temperature and so defines "chemical accuracy"?
Does 1/30 of an Angstrom really matter?
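For reference, the conversions behind this rule of thumb can be checked in a few lines: 1 kcal/mol is about 43 meV, which corresponds to a temperature of about 500 K, while the thermal energy k_B T at room temperature is about 0.6 kcal/mol.

```python
# Energy-scale conversions relevant to "chemical accuracy".
N_A = 6.02214076e23   # Avogadro's number, 1/mol
k_B = 1.380649e-23    # Boltzmann constant, J/K
e = 1.602176634e-19   # elementary charge (J per eV)

kcal_per_mol = 4184.0 / N_A            # 1 kcal/mol per molecule, in J
in_meV = 1000.0 * kcal_per_mol / e     # ~43 meV
as_temperature = kcal_per_mol / k_B    # ~500 K

room_kT_kcal = k_B * 300.0 * N_A / 4184.0  # k_B T at 300 K, in kcal/mol
print(in_meV, as_temperature, room_kT_kcal)
```

So "chemical accuracy" is not quite room temperature, but somewhat less than a factor of two above it.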

There are important situations where the differences in results between methods, especially the size of the barrier, will have a large effect on the results of simulations. For example, path integral molecular dynamics simulations of proton transfer in biomolecules, and of quantum nuclear effects in water. Now there are many simulations using DFT based methods. Some just pick a functional and go from there.

I would like to suggest that one way to benchmark different methods is to compare the above one dimensional potential with the simple parametrisation used in my recent preprint on quantum nuclear effects in hydrogen bonded complexes, and shown in the Figure below. This potential seems to have properties that are consistent with a wide range of experiments [bond lengths, vibrational frequencies, isotope effects] for a diverse set of chemical complexes.
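For readers who want to experiment, one common way to build such a one-dimensional proton-transfer potential [similar in spirit, I believe, to the two-diabatic-state approach, though the preprint should be consulted for the actual parametrisation] is to diagonalise a 2x2 model of Morse diabats for the donor-H and acceptor-H bonds with a constant coupling. All numbers below are illustrative, not fitted values.

```python
import numpy as np

def morse(r, D=120.0, a=2.2, r0=0.96):
    """Morse potential in kcal/mol; r in Angstroms (illustrative parameters)."""
    return D * (1.0 - np.exp(-a * (r - r0))) ** 2

def ground_surface(x, R, coupling=25.0):
    """Lower eigenvalue of the 2x2 diabatic model for a proton at position x,
    with the donor at 0 and the acceptor at R (symmetric D-H...A case)."""
    VL, VR = morse(x), morse(R - x)
    return 0.5 * (VL + VR) - np.sqrt(0.25 * (VL - VR) ** 2 + coupling**2)

def barrier(R):
    """Proton-transfer barrier at donor-acceptor distance R (symmetric case)."""
    x = np.linspace(0.8, R - 0.8, 2001)
    E = ground_surface(x, R)
    return max(E[len(E) // 2] - E.min(), 0.0)  # midpoint energy minus well minimum

for R in (2.4, 2.6, 2.8):
    print(R, barrier(R))
```

The barrier shrinks rapidly as the donor-acceptor distance decreases, which is the key trend any candidate potential [or DFT functional] should reproduce; feeding the resulting surface into a one-dimensional vibrational calculation then shows how the zero-point level compares to the barrier at each R.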

My proposal is in contrast to two DFT benchmarking studies [one in JPC, the other in JCTC] that don't actually compare their results to experiment.

Thursday, March 13, 2014

About twenty years ago I applied for my first grant. The science section was about 10 pages and took me a couple of months to write since I was a novice. The admin stuff and CV were probably an extra 10 pages. Before I submitted it, the Director of the university "Research Office", a career administrator, looked it over and suggested a couple of cosmetic changes. In today's dollars, the grant was about $120K per year for 5 years. That did not include overhead, which seemed to be a secret between the university and the government.

A couple of years later I applied for another grant to hire a postdoc for 3 years. A month after I submitted the application I realised that in the rush I had forgotten to include my publication list in the application. I wandered over to the Research Office, which was located in a house just off campus. I apologised for my mistake and asked them if they could do anything. They contacted the funding agency who said "no problem. just send the missing pages and we will add them to the application." The grant got funded. I was very thankful I survived my mistake.

How things have changed! It is striking to me that the above experiences were in the 90's not the 60's!

Today "Research offices" or "Grant administration units" occupy whole floors of buildings on campus and employ multitudes of people to go over applications with a fine tooth comb. This seems to be a painful necessity.

My last application was more than 60 pages. The science is now reduced to about 6 pages when you subtract out "timelines, national relevance, management of data, role of personnel, publication strategy..." from the Project Description. You have to submit the application to the Research office one month before the final deadline. They then sent me an email with all their suggested changes. Many dealt with compliance issues, internal consistency, font sizes, section headings, ....
I printed out the email. It was 8 pages of 10 point font!

Wednesday, March 12, 2014

Chandra Varma wrote a cursory piece, Mott Physics, Mixed Valence, Oxygen Valence, in Lithium ion batteries, at the Journal Club for Condensed Matter Physics. He pointed out that these batteries use some materials involving cobaltate [CoO2] layers similar to those in superconducting sodium cobaltate. Furthermore, insights are being gained from ARPES.

But, I struggled to see the way forward for any significant involvement of quantum many-body theorists. The most useful connection was the figure below in the article that Chandra reviews.

I would think that LDA+DMFT calculations would be ideal for addressing whether these cartoon pictures are correct. One question will be the role of Hund's rule coupling.

The Economist had an article about Tesla Motors, which makes electric cars and is really trying to develop lithium battery technology further. The company's stock market capitalisation is now half that of General Motors! But GM sells as many cars in one day as Tesla sells in six months!
Tesla's CEO, Elon Musk, has a physics background [he dropped out of an applied physics Ph.D. at Stanford] and there is an interesting interview with him in APS News.

Tuesday, March 11, 2014

I love the Solid State Simulations software for teaching. Unfortunately, it is getting increasingly difficult to install on the latest operating systems. I have it running o.k. on my Mac. But the students in my class can't get it to install on Windows 8. If anyone has overcome this obstacle, please let us know how.

Monday, March 10, 2014

Each time I teach a course I realise there is some particular intellectual challenge for students that I take for granted because the issue has become so second nature to me.

As an undergraduate I don't think I really learnt, or was taught, to make order-of-magnitude estimates and then consider their consequences. I never took a course in solid state physics. I learnt every subject in a precise manner, more like applied mathematics. Perhaps physics was not taught that way. But that is certainly how I learnt it. It was only when I went to graduate school in the US that I had to learn to deal with order-of-magnitude estimates. Indeed, in the General Exam [the qualifying Ph.D. exam after 2 years] at Princeton there was a whole section called General Physics that did this kind of stuff. You can see some of the questions in this book. I actually think that learning to solve these types of problems was one of the most useful things I learnt during my whole Ph.D. This is the first step in theoretical model building.

Students need to get a feel for all these scales and remember them.
But this is not just an exercise in mindless memorisation, such as random historical dates or the Latin names of different flora and fauna. Rather, they need to learn and understand the significance and implications of the relative magnitudes of these numbers. Here are a few examples of increasing profundity.

1. For the electrons in an elemental metal the Fermi temperature is orders of magnitude larger than room temperature.
Consequently, the electrons can be treated as a degenerate gas of fermions and most of their thermodynamic and transport properties are determined by the properties of the Fermi surface.

2. At low temperatures the electronic mean free path can be orders of magnitude larger than the spacing of atoms in a crystal.
This is completely inconsistent with the Drude and Sommerfeld models where the electrons scatter off the ions in the crystal. How can they "miss" thousands of atoms? This problem is resolved by the Bloch model: Bloch states do not scatter off the periodic potential of the crystal. The Bloch wavevector is a "good quantum number."

3. In elemental metals the average electronic kinetic energy is comparable to the Coulomb repulsion energy between electrons.

Yet, the Drude, Sommerfeld, and Bloch models all ignore interactions between the electrons. So, why do they work so well? This turns out to be because of Landau's Fermi liquid theory.

4. The thermal energy at the superconducting transition temperature is orders of magnitude smaller than other energy scales in the problem [phonon energies, Fermi energy, ...].

This turns out to be because superconductivity is an emergent phenomenon that leads to a new emergent energy scale, also reflecting the non-perturbative nature of the problem.
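As a concrete illustration of item 1, the free-electron Fermi temperature of copper can be estimated in a few lines from the conduction-electron density.

```python
import numpy as np

hbar = 1.054571817e-34  # J s
m_e = 9.1093837015e-31  # electron mass, kg
k_B = 1.380649e-23      # J/K

# Conduction electron density of copper (one electron per atom), m^-3
n = 8.47e28
k_F = (3.0 * np.pi**2 * n) ** (1.0 / 3.0)   # Fermi wavevector
E_F = (hbar * k_F) ** 2 / (2.0 * m_e)       # Fermi energy
T_F = E_F / k_B                             # Fermi temperature
E_F_eV = E_F / 1.602176634e-19
print(E_F_eV, T_F, T_F / 300.0)  # eV, K, ratio to room temperature
```

The Fermi temperature comes out around 80,000 K, more than two orders of magnitude above room temperature, which is why treating the electrons as a degenerate Fermi gas is such a good approximation.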

Besides emphasising the above issues in lectures and assigning relevant homework problems, are there particular ways to help students learn this important skill and concept?

Unfortunately, I think that over the past few years the show has degenerated into the typical Hollywood sitcom, focusing on "who is dating who now", titillation, and inane crude humour.

However, I saw a great recent episode where Sheldon proposes the existence of a new superheavy element which is subsequently "discovered" by a Chinese research group. It turns out he made a simple error in the units he used in his calculations and the Chinese group fabricated their results...

This is actually reminiscent of a real fraud committed at Berkeley and Darmstadt by Viktor Ninov, who fabricated data and claimed the discovery of new elements.

A recent case of scientific fraud at UQ made it onto the local TV news. Unfortunately, the video link has expired. I thought it was pretty interesting when I first saw it. It is also interesting that the university seems to have wiped out the researchers' electronic history at the university; one was a Head of School for a decade.

This is a highly anisotropic metal with a layered crystal structure.
The authors measure the interlayer resistance as a function of the magnitude and direction of a magnetic field parallel to the layers.

In the abstract of the paper I have highlighted statements for which I give a different perspective, below.

Extremely large magnetoresistance is realized in the nonmagnetic layered metal PdCoO2. In spite of a highly conducting metallic behavior with a simple quasi-two-dimensional hexagonal Fermi surface, the interlayer resistance reaches up to 35 000% for the field along the [1-10] direction. Furthermore, the temperature dependence of the resistance becomes nonmetallic for this field direction, while it remains metallic for fields along the [110] direction. Such severe and anisotropic destruction of the interlayer coherence by a magnetic field on a simple Fermi surface is ascribable to orbital motion of carriers on the Fermi surface driven by the Lorentz force, but seems to have been largely overlooked until now.

1. The material is always metallic.

The figure below shows how the magnetoresistance varies with temperature for fields of fixed direction and magnitude. The key observation is the different temperature dependences in the top and bottom panels. In the top panel [field in the [1-10] direction] the resistance increases with decreasing temperature and in the bottom panel it decreases.

The authors refer to the top as "non-metallic" and the bottom as "metallic". I think this is confusing terminology because it hints at the idea that there is metal-insulator transition as a function of field direction.
However, the material is always in a metallic phase, i.e. it is a Fermi liquid with a well-defined Fermi surface. It is just that the magnitude of the orbital magnetoresistance [which only depends on omega_c tau, as stated by the authors, since Kohler's rule is obeyed] varies with temperature because the scattering time tau varies with temperature.

2. The approximate H^1.5 field dependence does not establish incoherent interlayer transport.

As another aspect of the field dependence, it is known that such a H^1.5 dependence can originate from the out-of-plane incoherent transport [29], in which a large number of in-plane scatterings of conduction electrons occur before electrons hop or tunnel to a neighboring plane.

I don't think Ref. 29 establishes that claim. Rather, it just speculates about the effects of incoherent transport. Very little is known about what really happens in that regime. A PRB by Moses and me discusses definitive signatures of coherent transport. One is that as the field is slightly tilted away from the layer direction, the magnetoresistance should drop significantly. The authors do see such an effect. So I would say the material is actually in a coherent regime (at least at low temperatures). Also, the interlayer hopping integral t_perp is estimated to be about 40 meV, which should be compared to a scattering rate of about 4 meV (corresponding to the scattering time of 1 psec used in their simulations). The approximate H^1.5 dependence can have a more mundane explanation: it reflects a crossover from H^2 at low fields to H at high fields. This is actually what happens for a circular Fermi surface, as discussed in the PRB above and the Schofield-Cooper paper mentioned next.
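This mundane crossover can be made explicit with a few lines of code. The sketch below assumes a circular Fermi surface and an isotropic scattering time: the Boltzmann-equation result for an in-plane field then amounts to averaging 1/(1 + (omega_c tau cos phi)^2) around the Fermi circle, which gives rho_zz(B)/rho_zz(0) = sqrt(1 + (omega_c tau)^2). That is quadratic in B at low fields and linear at high fields, so the apparent power-law exponent passes through roughly 1.5 in the crossover regime.

```python
import numpy as np

def sigma_ratio(gamma, n_phi=20000):
    """sigma_zz(B)/sigma_zz(0) for a circular Fermi surface with an in-plane
    field: angular average of 1/(1 + (gamma*cos(phi))^2) over the Fermi circle,
    where gamma = omega_c * tau is proportional to the field B."""
    phi = np.linspace(0.0, 2.0 * np.pi, n_phi, endpoint=False)
    return np.mean(1.0 / (1.0 + (gamma * np.cos(phi)) ** 2))

gammas = np.array([0.1, 1.0, 10.0, 100.0])
rho = 1.0 / np.array([sigma_ratio(g) for g in gammas])  # resistance ratio

# Closed form: rho_zz(B)/rho_zz(0) = sqrt(1 + gamma^2)
closed = np.sqrt(1.0 + gammas**2)
print(rho)
print(closed)

# Apparent power-law exponent d ln(Delta rho)/d ln(B) in the crossover regime
g1, g2 = 1.3, 2.0
dr1 = 1.0 / sigma_ratio(g1) - 1.0
dr2 = 1.0 / sigma_ratio(g2) - 1.0
exponent = np.log(dr2 / dr1) / np.log(g2 / g1)
print(exponent)
```

Over a limited field range near omega_c tau ~ 1-2 the magnetoresistance thus looks like an approximate H^1.5 power law, without invoking incoherent transport.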

3. Large orbital magnetoresistance in layered metals has been observed before and discussed theoretically.

There are many measurements in clean organic charge transfer salts which show a large magnetoresistance. For example, Figure 5 of this paper by Kartsovnik and Laukhin shows a magnetoresistance of order 100.
A nice theoretical paper is Quasilinear magnetoresistance in an almost two-dimensional band structure, by A. J. Schofield and J. R. Cooper.
Indeed Ref. 31, by two of the co-authors of the PRL, calculates magnetoresistances as large as 60, motivated by experiments on quasi-one-dimensional organic metals.
What is new is the suggestion that one could actually do something technologically useful with this magnetoresistance. I always assumed that the problem was the requirement of low temperatures to keep omega_c tau large.

In a later post I will discuss the essential physics of the orbital magnetoresistance, including the extreme anisotropy. As discussed by the authors the key issue is that the Fermi surface is approximately hexagonal and so has large "flat" sections that can be exactly parallel or perpendicular to an intralayer magnetic field.

makes an important point: Even the top practitioners in the field do not completely agree on its main principles or where to find them.

He then describes some different perspectives, including those enunciated in leading texts, such as Phil Nelson's wonderful book [I have used it in a course and blogged about it before].

He then goes on to describe two of Bialek's principles.

sensory and regulatory systems have mechanisms for managing thermal noise, particle number fluctuation, and other types of background, and that those mechanisms in many cases reduce noise almost to its fundamental physical limits. Examples include reaction dynamics in photosynthetic enzymes, bacterial chemotaxis, embryonic development, and bat echolocation.

the principle of efficient representation, the view that sensory systems represent and transmit the information that they gather in a way that is optimal, subject to physical limits. This idea is made quantitative through information theory, which Bialek presents in tutorial fashion before exploring examples from such areas as embryogenesis, neural spike train encoding, bacterial growth, and animal learning.

Asides:
Steve Hagen has an impressive career history. After a Ph.D. and postdoc working on experiments on strongly correlated electron materials he made a transition to biophysics.
The webpage for the undergraduate biophysics course he teaches at U. Florida has some frank rules for students about classroom behaviour and plagiarism.

The relevant papers were not just in some unheard-of "spam" journal or conference, but ones that were published by Springer and IEEE.

The fraud was exposed by a computer scientist, Cyril Labbé.

Labbé is no stranger to fake studies. In April 2010, he used SCIgen to generate 102 fake papers by a fictional author called Ike Antkare. Labbé showed how easy it was to add these fake papers to the Google Scholar database, boosting Ike Antkare’s h-index, a measure of published output, to 94 — at the time, making Antkare the world's 21st most highly cited scientist. Last year, researchers at the University of Granada, Spain, added to Labbé’s work, boosting their own citation scores in Google Scholar by uploading six fake papers with long lists to their own previous work.

My thanks to Anthony Jacko and Ben Powell for bringing this to my attention.

About Me

I have fun at work trying to use quantum many-body theory to understand electronic properties of complex materials.
I am married to the lovely Robin and have two adult children and a dog, Priya (in the photo). I also write an even more personal blog Soli Deo Gloria [thoughts on theology, science, and culture]

Disclaimer

Although I am employed by the University of Queensland and funded by the Australian Research Council all views expressed on this blog are solely my own. They do not reflect the views of any present or past employers, funding agencies, colleagues, organisations, family members, churches, insurance companies, or lawyers I currently have or in the past have had some affiliation with.

I make no money from this blog. Any book or product endorsements will be based solely on my enthusiasm for the product. If I am reviewing a copy of a book and I have received a complimentary copy from the publisher I will state that in the review.