Now we are at day three of the string phenomenology conference, and it gets
better by the day: yesterday, the overall theme was flux vacua and
brane constructions. These classes of models have the great advantage
over, for example, heterotic constructions that they are much more
concrete (fewer spectral sequences involved), so a simple mind
like myself has fewer problems understanding them.
Unfortunately, at the same rate as the talks become more interesting (at
least to me; I have to admit that I do not get too excited when people
present the 100th semi-realistic model that might even have fewer
phenomenological shortcomings than the 99th, presented at last year's
conference), the internet connectivity gets worse and worse:
in principle, there is a WLAN in the lecture hall and the lobby,
protected by a VPN. However, the signal strength is so low that the
connection gets lost every other minute, and the VPN client then
also loses its authentication. As a result, I am now typing this into my
emacs and hope to later cut and paste it into the forms at
blogger.com.
Today's session started with two presentations that, even if I am sure many
people are not completely convinced by them, at least had great
entertainment value: Mike Douglas reviewed his (and his collaborators')
counting of vacua, and Dimopoulos presented Split Supersymmetry.
Split Supersymmetry is the idea that the scale of susy breaking is
much higher than the weak scale (with the hierarchy to be explained
by some other mechanism), but the fermionic superpartners still have
masses around (or slightly above) 100 GeV. This preserves the MSSM's good
properties for gauge coupling unification and provides dark matter candidates,
but removes all the potential problems that come with relatively light scalars
(CP violation, FCNCs, proton decay). However, it might also lack good motivation.
(Update: I was told that keeping this amount of susy prevents the Higgs
mass from becoming too large, consistent with the upper bounds
coming from loop corrections etc.) But as I learned, at the weak scale
there are only four new coupling constants, all determined by tan(beta),
so one can run them up and check that they unify at the susy scale.
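If I remember the scenario correctly (this is my own reconstruction, not something spelled out in the talk), the four couplings in question are the higgsino-gaugino-Higgs Yukawas, which at the susy-breaking scale should match onto the gauge couplings via tan(beta), roughly as

$$
\tilde g_u = g \sin\beta, \qquad \tilde g_d = g \cos\beta, \qquad
\tilde g'_u = g' \sin\beta, \qquad \tilde g'_d = g' \cos\beta .
$$

Measuring the four couplings at the weak scale and running them up to see whether these relations hold at a common scale would then be the consistency check.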
But the most spectacular prediction would be that the LHC would produce
gluinos at a rate of about one per second, and as they decay through
the heavy scalars they might well have a lifetime of several
seconds. As they are colour octets, they bind with q q-bar or with
qqq and thus form R-mesons and R-baryons. These (at least the charged
ones, which would be a significant fraction) would get stuck inside the
detector (for example in the muon chambers) and decay later into jets
that would be easy to observe and would not come from the interaction
point of the detector. So, stay tuned for a few more years.
Talking of interesting accelerator physics beyond the standard model,
Gianguido Dall'Agata urges me to spread a rumour that some US
accelerator (he doesn't remember which) sees evidence for a Z' that
would be the sign of another SU(2) group (coupling to right-handed fermions?)
broken at a much higher scale than the usual SU(2)-left. He
doesn't remember any more details, but he promised to dig them up.
So again, stay tuned.
Finally, I come to the favourite topic of at least one reader at Columbia,
The Landscape(tm). Mike gave a review talk that evolved from a talk
he has already given a number of times, so there was not much
news. I haven't really followed this topic over the last couple of
months, so I was updated on a number of aspects, and one of them I find
worth discussing. I have to admit it is not really new, but at least to
me it got a new twist.
It is the question of which a priori assumptions you are willing to
make. Obviously, you want to exclude vacua with N=2 susy, as they come
with exact moduli spaces: there is a continuum of such vacua,
and these would dominate any finite number, however large it (or
better: its exponent) might be. Once you accept that you have to
make some assumption to exclude "unphysical" vacua, you are free
to exclude further: it is common in this business to assume four
non-compact dimensions and to put an upper bound on the size of the
compact ones (or a lower bound on KK masses) for empirical
reasons. Furthermore, one could immediately exclude models that, for
example, contain unwanted ("exotic") chiral matter. To me (being no expert in
these counting matters), intuition from intersecting branes and their
T-duals, magnetized branes, suggests that this restriction would get
rid of really many vacua, and in the end you might end up with a
relatively small number of remaining ones.
Philosophically speaking, by accepting a priori assumptions (aka
empirical observations) one gives up on the idea of a theory of
everything, a theory that predicts every observation you make: be it
the amount of susy, the number of generations, the mass of the
electron (in Planck units), the spectrum of the CMB, the number of
planets in the solar system, or the colour of my car. But (as I have
argued earlier) hoping for such a TOE would have been very optimistic
anyway. This would be a theory that has only one single solution to
its equations of motion (if that classical concept
applies). Obviously, this is a much stricter requirement than asking
for a theory without parameters (a property I would expect from a more
realistic TOE). All numerical parameters would actually be vevs of
some scalar fields that are determined by the dynamics and might even
be changing in time, or at least varying between different solutions.
So, we will have to make a priori assumptions. Does this render the
theory unpredictive? Of course not! At least not if we can make more
observations than the data we had to assume. For example, we could ask for
all string vacua with standard model gauge group, four large
dimensions, susy breaking at around 1 TeV, and maybe an electron mass of
511 keV and some weak coupling constant. Then maybe we end up with an
ensemble of N vacua (hopefully a small number). We could then go ahead
(if we were really good calculators) and check which of these is
realized, and from that moment on we would make predictions. So it
would be a predictive theory, even if the number of vacua were
infinite once we dropped any of our a priori assumptions.
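To make the logic concrete, here is a toy sketch (in Python, with an entirely made-up ensemble and made-up observables; it has nothing to do with how vacua are actually counted): impose the a priori observations as filters on the ensemble, and whatever is shared by the survivors becomes a prediction.

```python
# Toy illustration (entirely made up, not Douglas's actual procedure):
# represent each "vacuum" as a dict of observables, impose the a priori
# observations as filters, and read off predictions from whatever survives.

import random

random.seed(0)

# Fake ensemble: each vacuum gets a random gauge group, number of large
# dimensions, susy-breaking scale (GeV) and electron mass (keV).
ensemble = [
    {
        "gauge_group": random.choice(["SM", "SM+U(1)'", "other"]),
        "large_dims": random.choice([3, 4, 5]),
        "susy_scale": 10 ** random.uniform(2, 16),
        "m_electron": random.uniform(100, 1000),
    }
    for _ in range(100_000)
]

# A priori assumptions / empirical inputs (the "priors" of the text).
priors = [
    lambda v: v["gauge_group"] == "SM",
    lambda v: v["large_dims"] == 4,
    lambda v: v["susy_scale"] < 1e4,           # susy breaking around a TeV
    lambda v: abs(v["m_electron"] - 511) < 5,  # electron mass ~ 511 keV
]

surviving = [v for v in ensemble if all(p(v) for p in priors)]
print(f"{len(surviving)} vacua survive the a priori assumptions")

# If the surviving ensemble is small, every further observable becomes a
# prediction once we pin down which of the N vacua we actually live in.
```

The only point of the toy is that the prediction at the end is conditional on the priors at the top, which is exactly the trade-off discussed above.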
Still, for the obvious reasons, we would never be able to prove that
we have the correct theory and that there could not be any other; but
this is just because physics is an empirical science and not math.
I think that so far it is hard to disagree with what I have said (although
you might not share some of my hopes/assumptions). It becomes really
controversial once one starts to draw statistical conclusions from the
distribution of vacua, since in the end we only live in a single one. This
becomes especially dangerous when combined with the a priori
assumptions: these are of course most effective when they go against
the statistics, as then they rule out a larger fraction of vacua. It is
tempting to promote any statement that goes against the statistics
into an a priori assumption and to celebrate any statement that is in
line with the weight of the distribution. Try it for yourself with the
statement "SUSY is broken at a low scale". All of this leaves aside the
problem that so far nobody has had a divine message about the
probability distribution among the 10^300 vacua and why it should be
flat.
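A toy numerical illustration of that last point (again with invented numbers, purely to show the dependence on the assumed measure): the "statistical verdict" on low-scale susy changes drastically depending on how the vacua are weighted, and we have no idea what the right weighting is.

```python
# Toy illustration of the measure problem (all numbers invented):
# the weighted fraction of "low-scale susy" vacua depends entirely on the
# (unknown) probability distribution over the ensemble.

import math
import random

random.seed(1)

# Fake ensemble: susy-breaking scales drawn log-uniformly between 10^2 and 10^16 GeV.
scales = [10 ** random.uniform(2, 16) for _ in range(100_000)]

def low_scale_fraction(weight):
    """Weighted fraction of vacua with susy broken below ~10 TeV."""
    total = sum(weight(s) for s in scales)
    low = sum(weight(s) for s in scales if s < 1e4)
    return low / total

flat = lambda s: 1.0                    # every vacuum counted equally
tilted = lambda s: math.log10(s) ** 4   # some made-up non-flat measure

print(f"flat measure:   {low_scale_fraction(flat):.1%} of vacua have low-scale susy")
print(f"tilted measure: {low_scale_fraction(tilted):.1%} of vacua have low-scale susy")
```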
Thursday, June 16, 2005
5 comments:
If Douglas is claiming that he can do statistics by making the number of vacua finite using a size cutoff, what does he say to the argument that his results are going to depend on the cutoff (probably strongly, since the whole point of the cutoff is to get rid of an infinite number of possibilities)?
I am not Douglas, so I can only speak for myself. It seems to me that, as long as the finite number is not one, it does not really mean anything. It just means that you have to make a couple (i.e. a finite number) more empirical observations to narrow things down far enough to obtain predictions from that point on. The dependence on the cut-off would enter only insofar as this reasoning cannot predict the maximal size of the extra dimensions (since you put that in as a prior).
You seem to be talking about an older idea, that by imposing enough conditions, maybe one can get a more or less unique vacuum state and then use it to make predictions. But Douglas's program was something different. My understanding was that his idea was to assume that there were a large number of vacua consistent with some prior conditions, then by counting vacua and their properties find the probability distribution for other observables, and hope to make some kind of prediction this way. Has he given up on this?
Again, I can only speak for myself, but any statistics with more than one point left after imposing priors is at least dangerous. As you point out, even if you accept anthropic-type arguments, it is not at all clear that they commute with imposing the priors, so the results would depend strongly on the details of the priors.