Thursday, June 23, 2005

Sailing on Radiation

Due to the summer temperatures, I am not quite able to do proper work, so I am wasting my time in the blogosphere. Alasdair Allan is an astronomer whom I know from diving in the English Channel. In his blog, he reports on the recent fuss about Cosmos-1 not reaching its orbit.

Cosmos-1 was a satellite that was supposed to use huge mirror sails to catch solar radiation for propulsion. Al also mentions a paper claiming that this whole principle cannot work, together with a rebuttal.

In physics 101, we have all seen the light mill, which demonstrates that the photons bouncing off the reflecting sides of the panels transfer their momentum to the wheel. So this shows that you can use radiation to move things.
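The momentum bookkeeping behind this picture is simple (a sketch, assuming a perfect mirror at normal incidence):

```latex
% A photon of energy E carries momentum
p = \frac{E}{c} ,
% and reverses direction upon reflection, so it transfers
\Delta p = \frac{2E}{c}
% to the mirror. Radiation of intensity I hitting a perfect
% mirror head-on therefore exerts the radiation pressure
P_{\mathrm{rad}} = \frac{2I}{c} .
```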

Well, does it?

Gold argues that the second law of thermodynamics stands in the way of using this effectively. So what's going on? His point is that once the radiation field and the mirrors are in thermal equilibrium, the mirror emits photons to both sides and there is no net flux of momentum. On general grounds, you should not be able to extract mechanical energy from heat in a world where everything has the same temperature.

The reason the light mill works is really that the mill is much colder than the radiation. So it seems to me that the real question (if Gold is right, which I tend to think, though as I said above, it's hot and I cannot really convince myself that at equilibrium the emission and absorption of photons on both sides balance) is how long it takes for the sails to heat up. If you want to achieve a significant amount of acceleration, they should be very light, which on the other hand means their absolute heat capacity is small.
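A back-of-the-envelope version of that question (my own estimate, not from either paper): a thin sail of areal mass density σ and specific heat capacity c_p, absorbing a fraction ε of the incident flux I, heats up at the rate

```latex
\frac{dT}{dt} = \frac{\epsilon I}{\sigma c_p} ,
\qquad\text{so}\qquad
t_{\mathrm{heat}} \sim \frac{\sigma c_p \,\Delta T}{\epsilon I} .
% A light sail (small areal density \sigma) approaches its
% equilibrium temperature quickly --- exactly the tension
% with wanting a large acceleration noted above.
```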

At least the rebuttal (it is written by an engineer of the project) is so vague that I don't think he really understood Gold's argument. But it seems that some physics in the earlier stages of the flight was ill understood, as Cosmos-1 did not make it to orbit...

Monday, June 20, 2005

The future of publishing

Back in Bremen, and after finishing my tax declaration for 2004, great wakering provided me with an essay by Michael Peskin on the future of scientific publication. Most of it contains the widely accepted arguments about how the arXiv has revolutionized high energy physics, but one aspect was new to me: he proposes that the refereeing process has to be organized by professionals and that roughly 30 percent of the cost of an article in PRL comes from this stage of the publishing process. He foresees that this service will always need to be paid for, but his business model sounds interesting: as page charges don't work, libraries should pay a sum (depending on the size of the institution but not on the number of papers) to these publishers, which then accept papers from authors affiliated with those institutions for refereeing.

It would still require a brave move to get this going, and that move would have to come from the libraries. And libraries are well aware of the current crisis in the business (PRL is incredibly cheap (2950 US$ per year) compared to NPB, which costs 15460 US$ per year for institutions).

Once we are in the process of reforming publishing, I think we should also adopt an idea that I learnt from Vijay Balasubramanian: if a paper gets accepted, the name of the referee should also be made public. This would still protect a referee who rejects a paper, but it would make referees accountable and more responsible when accepting any nonsense.

Friday, June 17, 2005

Still more phenomenology

Internet connectivity is worse than ever, so Jan Plefka and I had to resort to an internet cafe at lunch to get online. So I will just give a brief report of what has happened since my last report.

First there was Gordon Kane, who urged everybody to think about how to extract physical data from the numbers that are going to come out of the LHC. He claimed that one should not expect (easily obtained) useful numbers on susy beyond the fact that it exists. In particular, it will be nearly impossible to deduce Lagrangian parameters (masses etc.) for the susy particles, as there are not enough independent observables at the LHC to completely determine them.

Still, he points out that it will be important to be trained to understand the message that our experimental friends tell us. To this end, there will be the LHC Olympics, where Monte Carlo data of the type that will come out of the experiment will be provided, with some interesting physics beyond the standard model hidden in it, and there will be a competition to figure out what's going on.

Today, Dvali was the first speaker. He presented his model, which amounts to an IR modification of gravity (of mass term type) that is just beyond current limits from solar system observations and would allow for a fit of the cosmological data without dark energy. One realization of that modification is a 5D brane world scenario with a 5D Einstein-Hilbert action plus a 4D EH action for the pull-back of the metric.
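For reference (this is the standard DGP form of such a brane world action, not a formula taken from the talk):

```latex
S = M_5^3 \int d^5x \,\sqrt{-g}\; R_{(5)}
  \;+\; M_4^2 \int d^4x \,\sqrt{-\hat g}\; R_{(4)} ,
% with \hat g the pull-back of the bulk metric to the brane.
% Gravity changes character at the crossover scale
r_c \sim \frac{M_4^2}{M_5^3} ,
% below which it looks four-dimensional and beyond which it
% leaks into the bulk, mimicking a modification in the IR.
```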

Finally, there was Paul Langacker, who explained why it is hard to get seesaw type neutrinos from heterotic Z_3 orbifolds. As everybody knows, in the usual scenario neutrino masses arise from physics around some high energy (GUT, Planck?) scale. Therefore neutrino physics might be the most direct source of information about ultra high energy physics, and one should seriously try to obtain it from string constructions. According to Langacker, this has so far not been possible (intersecting brane worlds typically preserve lepton number and are thus incompatible with Majorana masses, and he showed that none of the models in the class he studied had a usable neutrino sector).
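The seesaw mechanism referred to here produces light neutrinos in the standard textbook way:

```latex
m_\nu \sim \frac{m_D^2}{M} ,
% with m_D a Dirac mass of electroweak size and M a Majorana
% mass near the GUT or Planck scale, so m_\nu comes out
% naturally tiny --- and sensitive to the high scale.
```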

Thursday, June 16, 2005

More Phenomenology

Now we are at day three of the string phenomenology conference and it gets better by the day: yesterday, the overall theme was flux vacua and brane constructions. These classes of models have the great advantage over heterotic constructions, for example, that they are much more concrete (fewer spectral sequences involved), and thus a simple mind like myself has fewer problems understanding them.

Unfortunately, at the same rate as the talks become more interesting (at least to me; I have to admit that I do not get too excited when people present the 100th semi-realistic model that might even have fewer phenomenological shortcomings than the 99th, presented at last year's conference), the internet connectivity gets worse and worse: in principle, there is a WLAN in the lecture hall and the lobby, and it is protected by a VPN. However, the signal strength is so low that the connection gets lost every other minute, resulting in the VPN client also losing its authentication. As a result, I now type this into my emacs and hope to later cut and paste it into the forms at

Today's session started with two presentations that, while I am sure many people are not completely convinced by them, at least had great entertainment value: Mike Douglas reviewed his (and collaborators') counting of vacua, and Dimopoulos presented Split Supersymmetry.

Split Supersymmetry is the idea that the scale of susy breaking is much higher than the weak scale (with the hierarchy to be explained by some other mechanism), but the fermionic superpartners still have masses around (or slightly above) 100 GeV. This preserves the MSSM's good properties for gauge unification and provides dark matter candidates, but removes all the problems that come with relatively light scalars (CP violation, FCNCs, proton decay). However, it might also lack good motivation. (Update: I was told that keeping this amount of susy prevents the Higgs mass from becoming too large. This is consistent with upper bounds coming from loop corrections etc.) But as I learned, at the weak scale there are only four coupling constants, all determined by tan(beta), so they should run and unify at the susy scale.

But the most spectacular prediction would be that the LHC would produce gluinos at a rate of about one per second, and as they decay through the heavy scalars, they might well have a lifetime of several seconds. As they are colour octets, they bind either to q q-bar or to qqq and thus form R-mesons and R-baryons. These (at least the charged ones, which would be a significant fraction) would get stuck inside the detector (for example in the muon chambers) and decay later into jets that would be easy to observe and that do not come from the interaction region of the detector. So, stay tuned for a few more years.
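The parametric reason for the long lifetime (the standard split-susy estimate, not a number from the talk): the gluino can only decay through virtual scalars of mass m_S, so by dimensional analysis

```latex
\Gamma_{\tilde g} \sim \frac{m_{\tilde g}^{\,5}}{m_S^{\,4}} ,
\qquad\text{i.e.}\qquad
\tau_{\tilde g} \sim \frac{m_S^{\,4}}{m_{\tilde g}^{\,5}} .
% With m_S many orders of magnitude above the weak scale,
% this suppression easily yields macroscopic lifetimes.
```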

Talking of interesting accelerator physics beyond the standard model, Gianguido Dall'Agata urged me to spread a rumour that some US accelerator (he doesn't remember which) sees evidence for a Z' that would be a sign of another SU(2) group (coupling to right-handed fermions?) broken at a much higher scale than the usual SU(2)-left. He doesn't remember any more details, but he promised to dig them up. So again, stay tuned.

Finally, I come to at least one reader at Columbia's favourite topic, The Landscape(tm). Mike gave a review talk that evolved from a talk he has already given a number of times, so there was not much news. I haven't really followed this topic over the last couple of months, so I was updated on a number of aspects, and one of them I find worth discussing. I have to admit it is not really new, but at least to me it got a new twist.

It is the question of which a priori assumptions you are willing to make. Obviously, you want to exclude vacua with N=2 susy, as they come with exact moduli spaces. That is, there is a continuum of such vacua, and these would dominate any finite number, however large it (or better: its exponent) might be. Once you accept that you have to make some assumption to exclude some "unphysical" vacua, you are free to exclude further: it is common in this business to assume four non-compact dimensions and to put an upper bound on the size of the compact ones (or a lower bound on KK masses) for empirical reasons. Furthermore, one could immediately exclude models that, for example, have unwanted ("exotic") chiral matter. To me (being no expert in these counting matters), intuition from intersecting branes and their T-duals, magnetized branes, suggests that this restriction would help to get rid of really many vacua, and in the end you might end up with a relatively small number of remaining ones.

Philosophically speaking, by accepting a priori assumptions (aka empirical observations) one gives up the idea of a theory of everything, a theory that predicts every observation you make: be it the amount of susy, the number of generations, the mass of the electron (in Planck units), the spectrum of the CMB, the number of planets in the solar system, or the colour of my car. But (as I have argued earlier) hoping for such a TOE would have been very optimistic anyway. This would be a theory that has only one single solution to its equations of motion (if that classical concept applies). Obviously, this is a much stricter requirement than asking for a theory without parameters (a property I would expect from a more realistic TOE). All numerical parameters would actually be vevs of some scalar fields that are determined by the dynamics and might even be changing, or at least varying between different solutions.

So, we will have to make a priori assumptions. Does this render the theory unpredictive? Of course not! At least not if we can make more observations than the data we had to assume. For example, we could ask for all string vacua with standard model gauge group, four large dimensions, susy breaking at around 1 TeV, and maybe an electron mass of 511 keV and some weak coupling constant. Then maybe we end up with an ensemble of N vacua (hopefully a small number). Then we could go ahead (if we were really good calculators) and check which of these is realized, and from that moment on we would make predictions. So it would be a predictive theory, even if the number of vacua would be infinite had we dropped any of our a priori assumptions.

Still, for the obvious reasons, we would never be able to prove that
we have the correct theory and that there could not be any other, but
this is just because physics is an empirical science and not math.

I think it is hard to disagree with what I have said so far (although you might not share some of my hopes/assumptions). It becomes really controversial when one starts to draw statistical conclusions from the distribution of vacua, as in the end we only live in a single one. This becomes especially dangerous when combined with the a priori assumptions: these are of course most effective when they go against the statistics, as then they rule out a larger fraction of vacua. It is tempting to promote any statement which goes against the statistics into an a priori assumption, and to celebrate any statement that is in line with the weight of the distribution. Try it for yourself with the statement "SUSY is broken at a low scale". This all leaves aside the problem that so far nobody has had a divine message about the probability distribution between the 10^300 vacua and why it should be

Monday, June 13, 2005

String Phenomenology

Having decided not to go to Toronto this summer, I am attending a more local conference, namely "String Phenomenology" in Munich. If you had asked me only a short time ago whether I would ever attend such a conference, I would probably have denied it, as I consider myself to be more on the fundamental side of things rather than trying to figure out how to build the best semi-(or even more) realistic models. But I will try to find out what these people are up to (and keep you, my dear readers, informed). And I hope to learn a few things about my latest hobby horse, generalized geometry.

Now I am here after spending a night on a sleeper train from Hamburg (admittedly, those have much improved since I last used one roughly ten years ago; there even was laptop power next to my bed, so I could watch a DVD before (trying to) fall asleep). Still, I was not really at 100% battery power when I arrived at 8 am, one hour late.

Today I even survived most of the talks without sleeping for a significant amount of time (and I always woke up well before the applause, good). Unfortunately, most of the talks were not exactly in my own area of interest (heterotic (motivated) orbifolds were a common theme), but I would like to mention two presentations: the first speaker this morning was Andrei Linde. This was pretty much standard cosmology and KKLT stuff, but he revealed one new feature he found in the WMAP picture: Stephen Hawking has managed to imprint his initials (although the W that he usually uses is lacking) in the cosmic microwave background.

Later in the afternoon, Stefan Groot Nibbelink talked about his classification of orbifolds of the heterotic SO(32) string. You know that this is supposed to be S-dual to type I, but it seems there are many more Het SO(32) orbifold models than there are type I models that are free of RR tadpoles. So there is an obvious puzzle that Stefan asked the audience to explain. However, nobody could.

Tuesday, June 07, 2005

Gamma rays from dark matter made of neutralinos?

Spiegel Online has an article about a bump in the spectrum of extragalactic gamma rays observed by the Compton satellite. A group from Würzburg claims that this can be explained by assuming that wimps are neutralinos with a mass of 515 GeV that annihilate in halos. See the original Physical Review Letter.

Sounds interesting. Let's wait a bit. Could anybody please confirm that the LHC would see these particles? I do not really understand the statement in the discussion section ("The rather high scale for the mass ladder of superpartners might render detection of all but the lightest of these particles by the CERN LHC difficult") regarding what the LHC would actually see.

This brings me to a question that I have been asking many people: what do you expect the LHC to see? There seems to be a consensus that just the Higgs would be the most boring (conservative) outcome, and just a bit of susy would not be much better. What do you think?

Monday, June 06, 2005

Travelling with light luggage

Unfortunately, my laptop's hard drive doesn't have enough room to contain all of hep-th, so I need to make a selection when travelling.

To this end, I wrote a little perl script that takes a Spires query on the command line and then downloads all pdf's that come up for this query (actually those on the first page of the web results) to the current directory. To use it, chmod a+x getspires, move it to some directory in your path, and then run, for example,
getspires a witten

to get Ed's latest or
getspires FIND C QJMAA,54,281

for a bibliography on generalized geometry.
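The core of such a tool is just two steps: build the query URL from the command-line words and scrape the pdf links out of the result page. Here is a minimal Python sketch of that idea (Python rather than perl, and the SPIRES base URL and result-page format are my assumptions for illustration, not the actual script):

```python
import re
import sys
import urllib.parse

# Illustrative base URL for the SPIRES web interface (an assumption,
# not necessarily the one the real getspires uses).
SPIRES_BASE = "http://www.slac.stanford.edu/spires/find/hep/www"

def build_query_url(args):
    """Join the command-line words into one raw SPIRES query string."""
    raw = " ".join(args)
    return SPIRES_BASE + "?" + urllib.parse.urlencode({"rawcmd": raw})

def extract_pdf_links(html):
    """Pull every link ending in .pdf out of a result page."""
    return re.findall(r'href="([^"]+\.pdf)"', html)

if __name__ == "__main__":
    # The actual fetching is left out: one would urlopen the query
    # URL, extract the pdf links, and download each in turn.
    print(build_query_url(sys.argv[1:]))
```

The two helpers are pure functions, so the messy part (network access, saving files) stays isolated in the main block.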