Posts

Showing posts from April, 2014

Draft of Colloquium on Emergent States of Quantum Matter

Next week I am giving the Physics Department Colloquium at UQ. I am working hard at trying to follow David Mermin's advice, and make it appropriately basic and interesting. I am tired of listening to too many colloquia that are more like specialist research seminars. I would welcome any feedback on what I have done so far. Here is the draft of the abstract. Emergent states of quantum matter. When a system is composed of many interacting components, new properties can emerge that are qualitatively different from the properties of the individual components. Such emergent phenomena lead to a stratification of reality and of scientific disciplines. Emergence can be particularly striking and challenging to understand for quantum matter, which is composed of macroscopic numbers of particles that obey quantum statistics. Examples include superfluidity, superconductivity, and the fractional quantum Hall effect. I will introduce some of the organising principles for describing such p…

What are the ten most remarkable scientific ideas?

Feynman said the most important idea is that all things are made from atoms. On the weekend I listened to a short and fascinating talk by Bill Bryson, The four most remarkable things I know. So I wondered: what do I think? What are the ten most remarkable scientific ideas? I have used the following rough criteria. The idea
- is far from obvious
- is often not thought about, because we have become so used to it that we take it for granted
- may evoke not just an intellectual response but also a somewhat emotional one of wonder and awe
- is profound but can be simply stated
- is a specific law, principle, or property, rather than a general scientific idea such as that laws can be encoded mathematically, experiments must be repeated, or the same laws apply everywhere in the universe.
Here is my first rough attempt at a list of the top ten, in no particular order. I hope it will generate some discussion.
1. The universe had a beginning.
2. Time has a direction.
3. The fundamental const…

Slow spin dynamics in the bad Hund's metal

I have the following picture of a bad metal. It is halfway between a Fermi liquid and a Mott insulator. This means that although there is no energy gap, the electrons are almost localised. The imaginary part of their self-energy is comparable to the electron bandwidth. Since they are almost localised they have slowly fluctuating local moments. Hence, the dynamical spin correlation function, chi_s(omega), should be narrow, although I am not sure on what energy scale. Somehow I expect a qualitative change in chi_s(omega) as the temperature increases above the coherence temperature, as the system crosses over from the Fermi liquid to the bad metal. But I am not sure whether this picture is correct, because as far as I am aware there are very few calculations of chi_s(omega), and particularly not of its temperature dependence. There are a few Dynamical Mean-Field Theory [DMFT] calculations at zero temperature, i.e., in the Fermi liquid regime, such as described here. In the Mott insulating phase…
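For concreteness, here is one common convention for the quantity chi_s(omega) discussed above; this is a standard definition I am assuming (with hbar = k_B = 1), not something spelled out in the post.

```latex
% One standard convention (an assumption, not from the post), with hbar = k_B = 1:
% the local spin autocorrelation function and its relation to Im chi_s(omega)
% via the fluctuation-dissipation theorem.
S_{zz}(\omega) = \int_{-\infty}^{\infty} \mathrm{d}t \, e^{i\omega t}
\langle S_z(t)\, S_z(0) \rangle ,
\qquad
\mathrm{Im}\,\chi_s(\omega) = \tfrac{1}{2}\left(1 - e^{-\omega/T}\right) S_{zz}(\omega) .
```

In the bad metal picture sketched above, slowly fluctuating local moments would show up as most of the weight of S_zz(omega) sitting at low frequencies.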

Is publishing debatable conclusions now encouraged?

I am increasingly concerned about how many papers, particularly in luxury journals, publish claims and conclusions that appear (at least to me) to simply not follow from the data or calculations presented. Is this problem getting worse? Or am I just getting more sensitive about it? Last year Nature published Bounding the pseudogap with a line of phase transitions in YBa2Cu3O6+δ. The abstract states: Here we report that the pseudogap in YBa2Cu3O6+δ is a distinct phase, bounded by a line of phase transitions. The doping dependence of this line is such that it terminates at zero temperature inside the superconducting dome. From this we conclude that quantum criticality drives the strange metallic behaviour and therefore superconductivity in the copper oxide superconductors. Let me examine separately the three claims I have highlighted in bold. 1. The claim that the line terminates at zero temperature is based on two data points! (the red dots in the figure below). To be c…

A survival and sanity guide for new faculty

Occasionally I have conversations with young faculty who are starting out, which often move to how stressful and frustrating their jobs are. I find it pretty disturbing how the system is drifting and some of the pressures put on young faculty. So here is my advice to tenure-track faculty, aimed at helping them preserve their sanity and survive. The post is not directed towards non-tenure-track people [adjunct faculty, research assistant professors, fixed-term lectureships, …]. Their case is a whole different can of worms, although some of the advice below is still relevant. But an underlying assumption is that your institution wants to keep you, and so provided that you publish some papers, don't completely mess up your teaching, have some grad students, and get some funding, then you will probably get tenure. So how do you stay sane? 1. Tune out the noise. You will hear countless voices shouting and whispering from inside and outside the university about a host of issues that can easily…

Four reasons why the USA has the best universities

Why does the USA have the best universities? It is not just that they have more money, as is claimed, for example, here. Hunter Rawlings is a former President of Cornell and currently the President of the Association of American Universities, a consortium of 60 of the leading North American universities. He recently gave a fascinating talk, Universities on the Defensive. He states:
Our colleges and universities became the best in the world for four essential reasons:
1) They have consistently been uncompromising bastions of academic freedom and autonomy;
2) they are a crazily unplanned mix of public and private, religious and secular, small and large, low-cost and expensive institutions, all competing with each other for students and faculty, and for philanthropic and research support;
3) our major universities combined research and teaching to produce superior graduate programs, and with the substantial help of the federal government, built great research programs…

Even Sheldon Cooper has given up on string theory

More and more I look at Peter Woit's blog to get his take on string theory, cosmology, and high-energy physics. I think he is doing a great job challenging some of the lame arguments that are presented for the validity and importance of string theory. Even worse is the multiverse… I was shocked to read the claim made by Arkani-Hamed that if there are no supersymmetric partners found at the LHC then the multiverse must exist! Then there is the Cambridge University Press book claiming that string theory represents a new paradigm for doing science: you don't need empirical support for a theory to be accepted as true! I find all of this rather bizarre and scary… Recently, Woit pointed out that even Sheldon Cooper has given up on string theory.

A definitive experimental signature of short hydrogen bonds in proteins: isotopic fractionation

I have written several posts about the controversial issue of low-barrier hydrogen bonds in proteins and whether they play any functional role, particularly in enzyme catalysis. A basic issue is first identifying short hydrogen bonds, i.e., finding a reliable method to measure bond lengths. I recently worked through a nice article, NMR studies of strong hydrogen bonds in enzymes and in a model compound, by T.K. Harris, Q. Zhao, and A.S. Mildvan. Surely these bond lengths can just be determined with X-ray crystallography? No: the standard errors in distances determined by protein X-ray crystallography are 0.1–0.3 times the resolution. For a typical 2.0 Å X-ray structure of a protein, the standard errors in the distances are ±0.2–0.6 Å, precluding the distinction between short, strong and normal, weak hydrogen bonds. [Aside: I also wonder whether the fact that X-ray crystal structures are refined with classical molecular dynamics using force fields that are parametrised for weak bond…

Roaming: a distinctly new dynamic mechanism for chemical reactions

Until recently, it was thought that the dynamics of breaking a chemical bond could occur via one of two mechanisms. The first is simply that one stretches a single bond until the relevant atoms are a long way apart. The second mechanism is via a transition state [a saddle point on a potential energy surface], where the geometry of the molecule is rearranged so that it is "half way" to the products of the chemical reaction. The energy of the transition state relative to the reactants determines the activation energy of the reaction. Transition state theory establishes this connection. Catalysts work by lowering the energy of the transition state. Enzymes work brilliantly because they are particularly good at lowering this energy barrier. An earlier post considered the controversial issue of whether it is necessary to go beyond transition state theory to explain some enzyme dynamics. I have been struggling through an interesting Physics Today article, Roaming reactions: the…
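As a reminder of the connection mentioned above, the standard transition-state-theory (Eyring) rate expression is sketched below; this is the textbook form, not something taken from the Physics Today article.

```latex
% Standard Eyring (transition state theory) rate expression.
% kappa is a transmission coefficient of order one and \Delta G^{\ddagger} is the
% free energy of the transition state relative to the reactants.
k = \kappa \, \frac{k_B T}{h} \,
    \exp\!\left( -\frac{\Delta G^{\ddagger}}{k_B T} \right)
```

Lowering \Delta G^{\ddagger}, which is what catalysts and enzymes do, increases the rate exponentially. Roaming is interesting precisely because, as I understand it, the trajectories avoid the conventional tight transition state that this expression assumes.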

How 5 years of blogging has changed me

Last month marked the five-year anniversary of this blog. My first post was a tribute to Walter Kauzmann. In hindsight, after almost 1500 posts, I think that was a fitting beginning. Kauzmann represented many of the themes of the blog: careful and thorough scholarship, theory closely connected to experiment, simple understanding before computation, hydrogen bonding, fruitful interaction between chemistry and physics, …. Reflecting on this anniversary I realised that writing the blog has had a significant influence on me. Writing posts forces one to be more reflective. I think I have a greater appreciation of
- good science: solid and reproducible, influential, …
- how important it is to do good science, rather than just publishing papers
- how hard it is to do good science today; the practice of science is increasingly broken
- the bleak long-term job prospects of most young people in science
- the danger and limitations of metrics for measuring research productivity and impact
- the import…

What role does reasoning by analogy have in science?

Two weeks ago I went to an interesting history seminar by Dalia Nassar that considered a debate between the philosopher Immanuel Kant and his former student Johann Gottfried von Herder. Kant considered that thinking by analogy had no role in science, whereas Herder considered it did. Apparently, for this reason Kant thought that biology [natural history] could never be a real science. Thinking objects were fundamentally different from non-thinking objects. One of the reasons I like going to these seminars is that they stimulate my thinking in new directions. For example, a seminar last year helped me understand that one of my "problems" is that I view science as a vocation rather than a career, perhaps in the tradition of Robert Boyle and the Christian virtuoso. After the seminar I had a brief discussion with some of my history colleagues about what scientists today think about analogy. I think it plays a very important role, because it can help us understand new sy…

Giant polarisability of low-barrier hydrogen bonds

An outstanding puzzle concerning simple acids and bases is their very broad infrared absorption, as highlighted in this earlier post. The first to highlight this problem was Georg Zundel. His solution involved two important new ideas:
- the stability of H5O2+ [the Zundel cation] in liquid water,
- that such complexes, involving protons shared via hydrogen bonding, have a giant electric polarisability, several orders of magnitude larger than that of typical molecules.
Both ideas remain controversial. A consequence of the second is that the coupling of the electric field fluctuations associated with the solvent to the complex will result in a large range of vibrational energies, leading to the continuous absorption. Later I will discuss the relative merits of Zundel's explanation. Here I just want to focus on understanding the essential physics behind the claimed giant polarisability. The key paper appears to be a 1972 JACS article, Extremely high polarizability of hydrogen bonds, by R. Janosche…
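To get a feel for the essential physics, here is a minimal two-state sketch of my own, not taken from Zundel's work or the 1972 paper: treat the proton as tunnelling between two wells a distance d apart, with tunnelling matrix element Delta, in a uniform electric field E.

```latex
% Minimal two-state sketch (my assumption, not the treatment in the 1972 paper):
% a proton of charge e tunnelling between two wells separated by d, with
% tunnelling matrix element \Delta, coupled to a uniform field E.
H = -\Delta\,\sigma_x - \tfrac{1}{2}\, e E d \,\sigma_z ,
\qquad
E_0(E) = -\sqrt{\Delta^2 + \left(\tfrac{1}{2} e E d\right)^2} ,
\qquad
\alpha \equiv -\left.\frac{\partial^2 E_0}{\partial E^2}\right|_{E=0}
       = \frac{(e d)^2}{4\,\Delta} .
```

Because the tunnel splitting in a low-barrier hydrogen bond can be orders of magnitude smaller than the electronic excitation energies (of order eV) that set ordinary molecular polarisabilities, while d is a sizeable fraction of an angstrom, this simple estimate already gives an alpha orders of magnitude larger than a typical electronic polarisability.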

Did Wigner actually say this?

It is folklore that Eugene Wigner said "It is nice to know that the computer understands the problem. But I would like to understand it too." But did he actually say it? Where and when? I have been trying to track it down. The earliest reference I can find is at the beginning of Chapter 5 of a 1992 book by Nussenzweig, which just says it is attributed to Wigner. It is a great quote, so it would be nice to know that Wigner actually said it. I welcome any more information.

The grand challenge of wood stoves and the rural poor

Today I went to a very interesting talk, presented by the UQ Energy Initiative, by Gautam Yadama. He described the "wicked problem" of the use of wood stoves by the rural poor in the Majority World. This causes a multitude of problems including deforestation, climate change, household pollution, disability due to respiratory problems, …. Yet solutions are elusive, particularly because of poverty, cultural obstacles, gender inequality, technical problems, …. In particular, previous "top down" "solutions", such as the wide-scale free distribution of 35 million gas stoves by the Indian government in the 70s and 80s [largely funded by the World Bank], have been complete failures. He described his multi-disciplinary research involving social scientists, engineers, and medical experts. Yadama emphasized the importance of community involvement and of programs that are "evidence based", using randomised trials [similar to those featured in Poor Economics].

A basic but important research skill, 3: talking, asking, and listening

One of the quickest ways to learn about a research field is to talk to others working in the area. Trying to learn the fundamentals [key questions, techniques, background, …] by only reading can be a slow and inefficient process. Furthermore, key pieces of information can be buried or not even there. So reading needs to be complemented by talking to others. They don't have to be the world's leading expert. Yet this is a very hard process and many students give up too easily. First, there is the problem of finding someone who both knows enough and will take the time to talk to you. Second, you will probably feel dumb. It requires courage and confidence to do this. You may not even know what questions to ask. Much of the jargon/language they use may be unfamiliar or meaningless. Third, it is just plain hard work and requires a sustained effort. Theorists and experimentalists talking to each other presents a special set of challenges. So does talking across disciplines (chemists…

Competing phases are endemic to strongly correlated electron materials

At the Journal Club for Condensed Matter, Steve Kivelson has a nice commentary on a recent preprint, Competing states in the t-J model: uniform d-wave state versus stripe state, by P. Corboz, T.M. Rice, and Matthias Troyer. This paper highlights an important property [see, e.g., here and here] of strongly correlated electron systems. A characteristic and challenging feature is subtle competition between distinctly different ground states. For example, for the t-J model the authors find a broad range of parameters [t/J and doping x] over which three phases are almost degenerate. The phases are
- a spatially uniform d-wave superconducting state (USC) [which sometimes also co-exists with antiferromagnetism]
- a co-existing charge density wave and d-wave superconducting state (CDW+SC)
- a pair density wave (PDW), which involves superconducting pairing that spatially averages to zero and is closely related to the Larkin-Ovchinnikov-Fulde-Ferrell state.
The authors find that the energy differ…
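For readers who want the model written out, the t-J Hamiltonian referred to above is conventionally written as follows (the standard form, not copied from the preprint):

```latex
% Standard form of the t-J model. P projects out doubly occupied sites,
% S_i are spin-1/2 operators, n_i are number operators, and the sums run
% over nearest-neighbour bonds.
H = -t \sum_{\langle i,j \rangle, \sigma}
      P \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right) P
  + J \sum_{\langle i,j \rangle}
      \left( \mathbf{S}_i \cdot \mathbf{S}_j - \tfrac{1}{4}\, n_i n_j \right)
```

The competition described above plays out as a function of the ratio t/J and the hole doping x away from half filling.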

The challenge of colossal thermoelectric power in FeSb2

There is an interesting paper, Highly dispersive electron relaxation and colossal thermoelectricity in the correlated semiconductor FeSb2, by Peijie Sun, Wenhu Xu, Jan M. Tomczak, Gabriel Kotliar, Martin Søndergaard, Bo B. Iversen, and Frank Steglich. The main results that are a struggle to explain are in the figure below. The top panel shows the temperature dependence of the thermopower [Seebeck coefficient] of FeSb2 [red] and the isoelectronic FeAs2. First, notice the vertical scale is tens of mV/K. In an elemental metal the thermopower is less than a microV/K. In a strongly correlated metal it can be tens of microV/K [see, for example, this earlier post]. Why is it so large? Why is it so much larger in the Sb compound than in the As compound? In a simple model of a band semiconductor, S ~ (k_B/e) * (gap/k_B T). But here the Sb compound has the smaller gap. Also, why is there a maximum in the temperature dependence, with S(T) going to zero with decreasing temperature? In an attempt to elucid…
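To see how striking the numbers are, here is a rough back-of-the-envelope estimate from the simple band-semiconductor formula quoted above. The gap of ~30 meV and the temperature of ~10 K are my illustrative assumptions, not figures taken from the paper.

```latex
% Rough estimate from S ~ (k_B/e)(\Delta / k_B T).
% \Delta ~ 30 meV and T ~ 10 K are illustrative assumptions only.
\frac{k_B}{e} \approx 86\ \mu\mathrm{V/K}, \qquad
S \sim \frac{k_B}{e}\,\frac{\Delta}{k_B T}
  \approx 86\ \mu\mathrm{V/K} \times \frac{30\ \mathrm{meV}}{0.86\ \mathrm{meV}}
  \approx 3\ \mathrm{mV/K} .
```

That is an order of magnitude smaller than the tens of mV/K on the vertical scale of the figure, which is part of why the result is such a challenge to explain.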