Physicists on infinities and paradoxes in QFT

Dirac (1938), Classical theory of radiating electrons, Proc. Roy. Soc. London A167, p. 148:

One may think that this difficulty [the divergence of the self-energy of a point mass] will be solved only by a better understanding of the structure of the electron according to the quantum laws. However, it seems more reasonable to suppose that the electron is too simple a thing for the questions of the laws governing its structure to arise, and thus quantum mechanics should not be needed for the solution of the difficulty — our easiest path of approach is to keep within the confines of the classical theory.


Dyson, letter to Oppenheimer, 1948, as quoted in Schweber's QED and the Men Who Made It, p. 523:

I believe it to be probable that the Feynman theory will provide a complete fulfillment of Heisenberg's S-matrix program. The Feynman theory is essentially nothing more than a method of calculating the S-matrix for any physical system from the usual equations of electrodynamics. It appears as an experimental fact (not yet known for certain) that the S-matrix so calculated is always finite; the divergencies only appear in the part of the theory which Heisenberg would in any case reject as meaningless. This seems to me a strong indication that Heisenberg is really right, that the localisation of physical processes is the only cause of inconsistency in present physics, and that so long as all experiments are interpreted by means of the S-matrix the theory is correct.


Rosenfeld, On Quantum Electrodynamics, from Niels Bohr and the Development of Physics (Pergamon Press, 1955), pp. 70-95:

the last remarks… do not at all imply that this discussion leads to the disclosure of the insufficiency of quantum electrodynamics. The limits of validity of the theory can only be derived from physical arguments, quite distinct from the logical analysis of its internal consistency. This analysis shows in fact that, whatever shape any more comprehensive theory will take in future, the present theory does give a coherent description of electromagnetic fields and electrons within a domain amply covering all practical applications.

The steady progress of the last few years in the unravelling of the formal structure of quantum electrodynamics leads to the same conclusions. In particular, it is now widely realized that the mathematical divergencies, which were such a disturbing feature so long as they could not be isolated and identified in a covariant way, are not due to any inherent inconsistency of the formalism, but only arise from a failure to take account of the non-electromagnetic elements which come into play in domains of nuclear dimensions.

In this respect, the situation was in no way different in the classical electron theory. Critical conditions under which the classical picture of the electron as a moving point charge breaks down, are reached when radiative reactions become of the same order as static interactions; the size of the critical domains, given by the 'classical electron radius' $e^2/mc^2$, coincides with nuclear dimensions — owing, no doubt, to some unknown, deep-lying law of nature…

One often speaks in this connection of "open" theories, as opposed, I imagine, to some sort of ideal theory which would suffer no restriction of the range of variation of the physical quantities involved, and would be completely self-contained. It seems to me, however, that this opposition is quite artificial; so far as I can see, all physical theories have always been of the so-called 'open' type, and a 'closed' theory is just as much a will o' the wisp as universal determinism.
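The numerical coincidence Rosenfeld points to, that the classical electron radius $e^2/mc^2$ lands at nuclear dimensions, is easy to check; a minimal sketch in SI units (rounded CODATA values), where the Gaussian $e^2$ becomes $e^2/4\pi\epsilon_0$:

```python
import math

# Classical electron radius r_e = e^2 / (4*pi*eps0 * m_e * c^2) in SI units.
e = 1.602176634e-19      # elementary charge, C
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
m_e = 9.1093837015e-31   # electron mass, kg
c = 2.99792458e8         # speed of light, m/s

r_e = e**2 / (4 * math.pi * eps0 * m_e * c**2)
print(f"r_e = {r_e:.3e} m")  # about 2.818e-15 m
```

The result, roughly 2.8 fm, is indeed the same order as nuclear radii.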


Kramers, Non-relativistic Quantum-Electrodynamics and Correspondence Principle, Solvay Congress 1948, p. 265:

It may of course be that point-model and fluctuation-divergencies can and must ultimately be traced to the same source from the point of view of a future complete theory and that one therefore should not think to [sic] hard of the device: first quantizing a wrong Hamiltonian and trying to make amends later on. Still something might be learned from a theory of the sort to which the present paper aims and in which it is tried to analyse the structure-independent features of classical theory first and to see whether and to what extent quantization can learn us something new.


Schwinger, On Quantum-Electrodynamics and the Magnetic Moment of the Electron, Physical Review 73, p. 416 (1948):

Electrodynamics unquestionably requires revision at ultra-relativistic energies, but is presumably accurate at moderate relativistic energies. It would be desirable, therefore, to isolate those aspects of the current theory that essentially involve high energies, and are subject to modification by a more satisfactory theory, from aspects that involve only moderate energies and are thus relatively trustworthy.


Schwinger, interview with S. Schweber, in QED and the Men Who Made It, p. 366:

QED was necessary in restoring faith in field theory with the understanding that field theory was to be trusted in one place but not trusted in another. One simply had to separate the untrustworthy area. That was a fundamental difference with the Dyson approach. He ends with a mathematical scheme which is electrodynamics to all orders and energies.

[…]

[renormalization] is the clear separation of what we don't know — but which affects our experiments in a very limited way — from what we do know and where we can calculate in detail. In fact, I insist that all theories are like this. People may not want to face up to it, [but] there is always an area beyond which the theory either breaks down or where other phenomena come into play that you don't know about. They do not upset everything in the area you can control, and you isolate that from it: That's what renormalization is really about. Not sweeping infinities away but isolating the unknown part and recognizing its limited influence.

I am not sure that I was at all interested in the mathematical question of convergence to all orders. I don't think that is a physical question. [Similarly] I have a feeling even then that I did not take renormalizability too seriously. If in fact the theory had been not renormalizable at the 27th stage or whatever have you, I would have said "O.K. That's good" because here is a place where what we don't know, namely what happens at very high energies, enters the theory and [we] will learn something. It wasn't essential to me that the theory be renormalizable to all orders. That was nice to get the theory going to lowest order. What would be even more interesting is if it didn't work. I wasn't very caught up in all these all order questions. To that extent I consider myself a physicist.

While a happy thing to have, I don't regard renormalization as fundamental — even in terms of selection of theories. It may be right, but one has to recognize that there is an element of hubris involved in saying that we are so close to the final answer that this criterion becomes relevant — that there isn't a whole world unknown to us about to be discovered in higher energies.


Schwinger, Renormalization theory of quantum electrodynamics: an individual view, in The Birth of Particle Physics, ed. Brown and Hoddeson, Cambridge University Press, 1983, p. 331:

I must give Kramers very high marks for his recognition that the theory should have a structure-independent character. The relativistic counterpart of that was to be my guiding principle, and over the years it has become generalized to this commandment: Thou shalt not entangle that which is known, and reliable, with that which is unknown, and speculative. The effective-range treatment of nuclear forces, which evolved just after the war, also abides by this philosophy.


Schwinger, Renormalization theory of quantum electrodynamics: an individual view, in The Birth of Particle Physics, ed. Brown and Hoddeson, Cambridge University Press, 1983, p. 331:

[On Bethe's derivation of the Lamb shift] The agreement of that number with the observed level shift ended any doubt, if doubt there was, concerning the electrodynamic nature of the phenomenon. Yet the relativistic problem, of extracting from the theory a finite and unique prediction, remained.


Schwinger, Renormalization theory of quantum electrodynamics: an individual view, in The Birth of Particle Physics, ed. Brown and Hoddeson, Cambridge University Press, 1983, p. 348:

It is probably a fairly widespread opinion that renormalized quantum electrodynamics is just the old quantized version of the combined Maxwell and Dirac equations, with some rules for hiding divergences. That is simply not true. A theory has two aspects. One is a set of equations relating various symbols. The other is, at some level, the physical interpretation to be associated with the symbols. In the course of the development here being described, the equations did not change, but the interpretation did. In the late 1930s, most people would not have challenged these statements: e and m, as they enter the Dirac and Maxwell equations, are the charge and the mass of the electron; an electromagnetic field operator creates or annihilates a photon; a Dirac field operator creates an electron or annihilates a positron, and its adjoint field does the inverse. And all this would be true if the two fields were uncoupled. But, in the real world, the localized excitation represented by an electromagnetic field, for example, does not just create a photon; it transfers energy, momentum, and angular momentum… Renormalization, properly understood, is an aspect of the transfer of attention from the initial hypothetical world of localized excitations and interactions to the observable world of the physical particles. As such, it is logically independent of divergences. Could we construct a convergent theory of coupled fields, it would still need to be renormalized.


Feynman, Mathematical Formulation of the Quantum Theory of Electromagnetic Interaction, Physical Review 80, p. 440 (1950):

The advantage of a Lagrangian form of quantum mechanics is that in a system with interacting parts it permits a separation of the problem such that the motion of any part can be analyzed or solved first, and the results of this solution may then be used in the solution of the motion of the other parts. This separation is especially useful in quantum electrodynamics which represents the interaction of matter with the electromagnetic field. The electromagnetic field is an especially simple system and its behavior can be analyzed completely. What we shall show is that the net effect of the field is a delayed interaction of the particles. It is possible to do this easily only if it is not necessary at the same time to analyze completely the motion of the particles. The only advantage in our problems of the form of quantum mechanics in C is to permit one to separate these aspects of the problem. There are a number of disadvantages, however…


Dyson, The S Matrix in Quantum Electrodynamics, Physical Review 75, p. 1736 (1949):

The paradox is the fact that it is necessary in the present paper to start from the infinite expressions in order to deduce the finite ones. Accordingly, what is to be looked for in a future theory is not so much a modification of the present theory which will make all infinite quantities finite, but rather a turning-around of the theory so that the finite quantities shall become primary and the infinite quantities secondary.


Heitler (1944), The Quantum Theory of Radiation, 2nd ed., Oxford University Press, p. x:

The present theory can be correctly applied to the interaction of light with elementary particles to the first approximation. The difficulties which occur, especially in the higher approximations to this interaction, seem to be rather like the 'ultra-violet catastrophe' (divergence of the radiation formula for high frequencies) in the classical theory: they show the limits within which the theory is valid. It may be hoped that the examination of these difficulties will lead in the future to a further development of the theory to a stage when it will probably comprise nuclear physics and give some information about the nature of the elementary particles.

If the application of the theory is confined within these limits, it will be seen in this book that the theory gives — qualitatively and quantitatively — a full account of the experimental facts within a large field, including even phenomena connected with the creation and annihilation of positive electrons. In applying the theory to such processes, however, it is even more necessary to consider carefully the limits of the theory.


Heitler (1944), The Quantum Theory of Radiation, 2nd ed., Oxford University Press, p. 191:

The polarizability of the vacuum turns out to have an infinite but constant and field-independent contribution. This contribution can, however, be subtracted, like the infinite energy and charge, and the number of pairs created by an adiabatic switching on of the field is finite. The polarization of the vacuum due to a constant field has then no physical meaning.


Heitler (1944), The Quantum Theory of Radiation, 2nd ed., Oxford University Press, p. 185:

The degree of divergence [of the transverse self-energy of the electron], and therefore the upper limit [of the integral for self-energy], seems to depend on the assumptions which one makes in developing a theory for the positive electron and has therefore hardly any physical meaning.


Landau, On the Quantum Theory of Fields, in Collected Papers of L. D. Landau, Gordon and Breach (1967), New York, pp. 634-5:

Since the consideration of a point interaction leads at once to infinities, it appears reasonable to regard it as the limit of some "smoothed-out" interaction with a finite radius, as this radius decreases to zero. In doing so, we have no reason to suppose that the constant $e_1$, which appears as a coefficient in the interaction and is the "intrinsic" charge of the electron in quantum electrodynamics, is independent of the radius of interaction. Furthermore, the dependence of this constant on the radius of interaction should be so defined that the final result of the theory (the expressions for physical effects) is independent of the radius of interaction, since this was introduced as an auxiliary quantity and is therefore devoid of physical significance.

An approach of this kind means, essentially, the rejection of any unjustified consideration of point interactions by means of a $\delta$-function.

All the divergences in electrodynamics are logarithmic, as is well known. The only exception is the quadratic divergence in the intrinsic mass of the photon. This latter fact, however, is not a serious objection, for it is easy to see that the appearance of a mass of the photon, under the influence of its interaction with charged particles, contradicts the law of conservation of charge. If the "smoothing-out" is effected in such a way that the law of conservation of charge is not thereby violated the corresponding expressions should reduce to zero identically.


Landau, On the Quantum Theory of Fields, in Collected Papers of L. D. Landau, Gordon and Breach (1967), New York, p. 641:

Of course, no unambiguous physical conclusions can be drawn from the result obtained, that the point interaction is zero in the case of electrodynamics. The energies $\Lambda$ for which $e^2 \sim 3\pi / \ln(\Lambda^2/m^2)$ are in every case very large. At these energies, the effects of gravitational interaction may exceed the electromagnetic effects, so that a discussion of electrodynamics as a closed system becomes physically incorrect.
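Landau's asymptotic form can be reconstructed from the standard one-loop relation between the observed charge $e$ and the "intrinsic" charge $e_1$ of the preceding quote (a sketch in units with $\hbar = c = 1$; the $3\pi$ coefficient is the one-loop vacuum-polarization result):

$$ e^2 = \frac{e_1^2}{1 + \dfrac{e_1^2}{3\pi} \ln\dfrac{\Lambda^2}{m^2}} \;\longrightarrow\; \frac{3\pi}{\ln(\Lambda^2/m^2)} \qquad (\Lambda \to \infty), $$

so for any fixed intrinsic charge the observed coupling is driven to zero as the cutoff is removed; this is the "zero charge" result Landau refers to.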


Landau, On the Quantum Theory of Fields, in Collected Papers of L. D. Landau, Gordon and Breach (1967), New York, p. 646:

Let us see how the problem of the scattering of one meson by another appears from the standpoint of the present article. In applying perturbation theory, the amplitude of scattering of one meson by another at small energies, apart from a numerical factor, is taken as $g_1^4 \ln(\Lambda^2/M^2)$. This result is obviously not re-normalisable, that is, for a given value of $g_1$ it depends also on the cutoff limit $\Lambda$. This non-renormalisability, however, must not be regarded as a defect of the theory, but has a definite physical significance. The introduction of the cut-off means that virtual particles with energies greater than $\Lambda$ are excluded from consideration. The re-normalisability, for instance, of the scattering of mesons by nucleons should really be regarded only as signifying that only virtual particles with energies of the order of those of the colliding particles take part in this phenomenon. In the scattering of mesons by mesons, however, virtual particles of larger energies also take part.


Landau, On the Quantum Theory of Fields, in Collected Papers of L. D. Landau, Gordon and Breach (1967), New York, p. 647:

We have seen in the case of electrodynamics that a point interaction can lead to the absence of any interaction, even if its intensity increases without limit. The possibility cannot be excluded that this is a general property of point interactions. In this case, the construction of meson theories is possible only by abandoning the point interaction, that is, by renouncing essentially all the methods at present existing. The great difficulties which arise in a physical "smoothing-out" of particles, as opposed to a purely formal "smoothing-out" such as was discussed in the present article, are well-known. In this case, therefore, the theory of meson interactions would draw a blank.


Landau, Fundamental Problems, in Collected Papers of L. D. Landau, Gordon and Breach (1967), New York, pp. 801-2:

Almost 30 years ago Peierls and I noticed that in the region of relativistic quantum theory no quantities concerning interacting particles can be measured and the only observable quantities are the momenta and polarisations of freely moving particles. Therefore if we do not want to introduce unobservables we may introduce in the theory as fundamental quantities only the scattering amplitudes.

The $\psi$ operators which contain unobservable information must disappear from the theory and, since a Hamiltonian can be built only from $\psi$ operators, we are driven to the conclusion that the Hamiltonian method for strong interaction is dead and must be buried, although of course with deserved honour.

The foundation of the new theory must consist of a new diagrammatic technique which must deal only with diagrams with "free" ends i.e. with scattering amplitudes and their analytic continuation. The physical basis of this technique is the unitarity conditions and the principle of locality of interaction which expresses itself in the analytic properties of the fundamental quantities of the theory, such as the different kinds of dispersion relations.

As such a new diagrammatic theory is not as yet constructed, we are obliged to derive the analytical properties of the vertex parts from a Hamiltonian formalism, but it requires much naivety to try to make such derivations "rigorous", forgetting that we derive existing equations from Hamiltonians which do not really exist.


Goldberger and Watson, Collision Theory, pp. 621-2:

The methods of dispersion theory are used almost exclusively at the present time in attempts at carrying out quantitative calculations of processes involving elementary particles. To a very large extent these techniques have become a branch of theoretical physics independent of the quantum field theory on which they were originally based. By this we mean that we can give a set of reasonably logical computational rules by which physical reaction amplitudes may be obtained without explicit reference to the field concept. On the other hand, the experience and intuition which are required for engaging in this sport seem difficult to acquire without spending some time with field theory. There is a distinct possibility that formal field theory as we now know it is meaningless. Nevertheless we shall find it possible to draw certain conclusions from this formalism that appear to be based on those aspects of the theory that we would imagine might well survive even very drastic modifications of the details.


Heitler, Physical Aspects of Quantum Field Theory, in The Quantum Theory of Fields, Proceedings of the Twelfth Solvay Conference on Physics, 1961, p. 41:

The difficulty of achieving invariance in any finite field theory may suggest that we take a critical attitude towards the a priori postulate we usually make, namely that of exact Lorentz-invariance in the present local sense. We may well imagine that eventually a generalization of the Lorentz-transformation will be required that takes account of something like a "smallest space and time region". It is perhaps not out of the question that this will exhibit itself at present in a small departure from results derived by the postulate of exact invariance.


Heitler, Physical Aspects of Quantum Field Theory, p. 55:

We let ourselves be guided by the one convergent theory we have, namely that of 2. Non-local field theories which start from exact Lorentz-invariance have already been studied in detail, and in all cases it has turned out, that the divergences reappear in higher approximations. They are, therefore, useless (gauge invariance is not fulfilled either). The theory to be discussed is a generalization of the non-relativistic extended source model into the relativistic region. As we shall see, convergence can be achieved and we shall state the general conditions for exact Lorentz- and gauge invariance. If we discuss here a particular non-local theory, the form factor is, of course, merely thought to replace an unknown region. It is not the idea that such a theory could be final. The theory which follows turns out to be convergent, but it is certainly not the only way to achieve convergence. Our purpose is merely to show that convergent quantum field theories do exist, and on the basis of such a theory, to try to delimit the region of validity of the present theory.


Heitler, referring to above remarks, in The Quantum Theory of Fields, p. 98:

Personally, of course, I do not believe that any departure of this sort from invariance exists, this was merely meant to show what kind of results arise when one insists on the finiteness of the theory, and to suggest that such fundamental relation as Einstein's mass-velocity relation should be checked as accurately as possible by experiments.


Feynman, The Present Status of Quantum Electrodynamics, in The Quantum Theory of Fields, Proceedings of the Twelfth Solvay Conference on Physics, 1961, p. 74:

Evidently one cannot analyze the bound state by starting a perturbation series from non-interacting particles. But one very effective way is to use, as a starting point, the system held together by instantaneous Coulomb potentials. This system can be analyzed by an ordinary differential equation in time, like the Schrodinger equation, because of the instantaneous nature of the interaction. The perturbation then consists of adding the effect of virtual transverse photons in various orders. Of course, any other unperturbed system, held together by some approximation to the true interaction, will serve as well; the perturbation being the difference between the true interaction of Q.E.D. and the approximate interaction assumed. The instantaneous Coulomb potential is a good starting point because its initial approximation is so good.


Feynman, The Present Status of Quantum Electrodynamics, p. 78:

The final question is whether the renormalized theory is, in fact, a logically consistent theory. With any finite cut-off the theory is not consistent, slight deviations from unitarity (the principle that the sum of probabilities for all alternatives should be unity) occur. These get smaller as the cut-off energy $\Lambda$ goes to infinity. But the mass $m_0$ that may be needed to get a finite m for very large $\Lambda$ may be negative. That is, the theory may contain hidden difficulties if we computed processes for energies E such that $\alpha \ln (E/m) \gtrsim 1$. This is such an extreme energy that such matters are of no apparent concern to calculation of lower energy phenomena. However, from a strictly theoretical point of view it would be nice to know whether renormalized Q.E.D. is a consistent theory, or whether difficulties may not arise of relative magnitude $e^{-137}$.
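The energy scale Feynman alludes to can be estimated directly: setting $\alpha \ln(E/m) = 1$ with $\alpha \approx 1/137$ gives $E = m\, e^{1/\alpha}$, a quick sketch whose only inputs are the fine-structure constant and the electron rest energy:

```python
import math

alpha = 1 / 137.035999  # fine-structure constant
m_e = 0.511e-3          # electron rest energy, GeV

# alpha * ln(E/m) = 1  =>  E = m * exp(1/alpha)
E = m_e * math.exp(1 / alpha)
print(f"E ~ 10^{math.log10(E):.0f} GeV")  # prints E ~ 10^56 GeV
```

The scale comes out around $10^{56}$ GeV, which makes concrete why such matters are, as Feynman says, "of no apparent concern" to lower-energy calculations.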


Feynman, The Present Status of Quantum Electrodynamics, p. 89:

I have been converted from a long-held strong prejudice that [quantum electrodynamics] must fail significantly (other than by simply being incomplete) at around 1 GeV virtual energy. The origin of this feeling was the belief that the mass of the electron (relative to the nucleon, say) and its charge, must be ultimately computable and that Q.E.D. must play some part in this future analysis. I still hold this belief, and do not subscribe to the philosophy of renormalization. But I now realize that there is much to be said for considering theoretically the possibility that Q.E.D. is exact, although incomplete. This assumption may be wrong, but it is precise and definite, and suggests many things to study theoretically, while the other negative assumption (that it fails somehow) is not enough to suggest definite theoretical research. This is Wheeler's principle of "radical conservatism".


Exchange between Wightman and Källén, in The Quantum Theory of Fields, pp. 94-95:
Wightman:

In Heitler's theory… the theory appears Euclidean invariant. This gives rise to phenomena which, I believe, could strongly affect Heitler's conclusions, whatever the purpose for which the theory is studied. What I have in mind is the Haag theorem which says that in a Euclidean invariant theory in which there are canonical variables, the no particle state is necessarily Euclidean invariant. Now the only Euclidean invariant state which admits a reasonable physical interpretation is the physical vacuum. Since in any theory where there is non-trivial pair creation (as in Heitler's) the no particle state is not stationary, there is no reasonable vacuum state.

Källén:

Perhaps I may be allowed to formulate what Wightman has just said in a slightly different way… The so-called "Haag theorem" says that the transformation $C_{n,n'}$ is, indeed, very singular from the mathematical point of view. I agree that that is certainly so also for the theory Heitler discussed yesterday. However, I also believe that this fact is not really very serious. If one computes more physical quantities like scattering amplitudes or even self-masses, they can very well exist even if $C_{n,n'}$ is singular. Therefore, I believe that this particular argument against the Heitler model is not very relevant.


Wightman, The Quantum Theory of Fields, pp. 95-6:

I believe that the problem of finding [an observable $\phi$] with these properties is a well posed one mathematically and the solutions would give a natural expression for the basic ideas of field theory put forward thirty some years ago by Dirac, Heisenberg and Pauli. Unfortunately, it is as yet unsolved. Until we understand its solutions or lack of them I feel there will be no physical paradox in the foundations of field theory, just a muddle.


Heisenberg, The Quantum Theory of Fields, p. 96:

With regard to Wightman's remark, I would like to emphasize, that in my opinion well known difficulties of divergences, etc. are not primarily mathematical problems…

It is a problem of physics and not only of mathematics to see how such a profound change as the replacement of non local interactions by local ones will affect the mathematical representation. We cannot expect that such a change could be represented without radical changes in the formalism of quantum theory.


Exchange between Chew and Gell-Mann, The Quantum Theory of Fields, pp. 143-4:
Chew:

The development of physics has been characterized thus far by the feature that no theory has ever been exact.

Perhaps we are destined indefinitely to this fate; theory will gradually become more and more comprehensive and accurate — perhaps more and more beautiful — but a final stage may never be reached. If one accepts this view, it is perhaps important at any given time not to try to make too large a step before the situation is ripe. Think of the development of atomic theory; might that not have been hindered if one insisted from the beginning on generating a comprehensive framework that included nuclear structure?

Perhaps we have an analogous situation with regard to strong and weak interactions. What do you think of the argument that it is only a fluke we know anything about weak interactions at the present time — that historically an understanding of them may have to wait until a theory of strong interactions alone has been constructed?

Gell-Mann:

Assuming we want to work on a part of the theory, we are faced with a choice. The strong interactions present some advantages, but so do the weak and electromagnetic ones…. one might argue that everyone should concentrate on studying the weak interactions and not the very complicated strong ones. But I do not say that at all; instead, I would say that it is a good thing to have people working on many different approaches to particle physics.


Källén, Some Aspects of the Formalism of Field Theory, in The Quantum Theory of Fields, p. 152:

[The Haag theorem] is not critically dependent on the model but appears in practically every realistic theory.


Källén, Some Aspects of the Formalism of Field Theory, in The Quantum Theory of Fields, p. 152:

there have been a few attempts during the last ten years or so to try to discuss the structure of quantized field theory without any resource to perturbation theory or other approximation schemes… The general philosophy here is that one tries to extract as much information as possible from general symmetry properties and other physical assumptions about the theory but refrains from any attempt to solve explicit equations of motion. The motivation behind these efforts varies somewhat from author to author. Many people, among them myself, always have an uneasy feeling here that we are really only scratching the surface of things and are very far from anything which deserves the name of a physical theory. Other people seem to like to make a virtue out of a vice, and there has been much talk recently about "the axiomatic approach to field theory". In its most extreme form an axiomatic paper starts by three or four "axioms" which are supposed to give essentially the whole physical content of a theory. Then one tries to work out as many consequences of these assumptions as possible. It cannot be denied that it is possible to get quite amusing and useful results in this way but I think one must the whole time keep in mind that there are many questions which have a reasonable answer, say, in the perturbation theory approach to electrodynamics but which one simply cannot discuss in the most extreme versions of the axiomatic approach.


Källén, Some Aspects of the Formalism of Field Theory, in The Quantum Theory of Fields, p. 164:

This is one very simple example on how one is able to find relations which in principle can be checked experimentally without knowing very much about the detailed properties of electrodynamics. As we shall hear in other lectures during this conference one can do similar arguments also for other and perhaps more interesting physical systems and obtain "dispersion relations" for various scattering processes without really knowing anything about the interaction responsible for the scattering. Every relation one obtains in this way gives one measurable function expressed as an integral transform of another measurable function. If both functions are measured one can check whether or not the integral relation is fulfilled. As such a statement is independent of perturbation theory or, even, of any assumption about a classical Lagrangian, it is of rather general validity. In the very few cases where one has been able to make a check on these relations they seem to be fulfilled. However, it must be emphasized that these relations even in their most sophisticated forms are not equivalent to a detailed theory of interacting fields. They are at most consistency conditions.


Heisenberg, in The Quantum Theory of Fields, p. 169:

In connection with the so-called "theorem of Haag" I would like to point out, that its content should be well known already from conventional quantum mechanics. If one compares e.g. the state of a ferromagnet where the total magnetic moment has the direction of the Z-axis and another state with a slightly different direction of the total magnetic moment, these two states will always be completely orthogonal to each other, if we have to do with an infinite ferromagnet. If we excite an electron from one of these states, again the resulting state will be exactly orthogonal to the other. Therefore, in writing down matrix equations for such systems one must be careful not to write down relations between matrix elements in which both sides of the equation are trivially zero. Such an error might occur in perturbation theory or in the old Tamm-Dancoff method, where one starts with the "bare" vacuum; it ought however not to occur in the new Tamm-Dancoff method, where one starts from the real vacuum. Taking the theory of superconductivity (Bardeen-Bogoliubov) as an example, the new Tamm-Dancoff method gives correct results (the energy gap), while the theory of perturbation does not. This whole problem therefore has nothing to do with the real difficulties of quantum field theory, nor will it give rise to any criticism concerning the use of the new Tamm-Dancoff method in field-theory.


Kallen, in The Quantum Theory of Fields, p. 170:

the theorem discovered by Van Hove and Friedrichs and usually referred to as the "Haag theorem" is really of a very trivial nature and it does not mean that the eigenvalues of a Hamiltonian never exist or anything that fundamental.


Wightman, in The Quantum Theory of Fields, p. 171:

The significance which one attaches to Haag's theorem depends on one's attitude towards models such as Heitler's. On the one hand, one can regard this model as a shorthand for the investigation of the numerical effect of cut-offs in the perturbation series of a relativistic theory. Then mathematical questions about the exact spectrum of the model are quite irrelevant. On the other hand, one can take the Hamiltonian of the model really seriously, and try to find out what spectrum it predicts and what properties its exact eigenfunctions have. In this case, it seems to me that Haag's theorem is distinctly non-trivial. It says that to make physical sense of the Hamiltonian one must insert not the familiar representation of the annihilation and creation operators but one of the strange representations.


Van Hove, in The Quantum Theory of Fields, p. 171-2:

Haag's theorem is… of little direct relevance to the basic difficulties of field theory.


Feynman, in The Quantum Theory of Fields, p. 176:

I think that someone said once that the problem in theoretical physics is to prove yourself wrong as quickly as possible. The difficulty we have had for 27 years is that we haven't been able to prove Yukawa was wrong. I would like to discuss the history of attempts. The central problem at the beginning was to solve equations, figure out the consequences (that is what we used to do in physics once), make experiments and then think of another idea. The best progress is made when this can be done.

In the case of field theories, other than electrodynamics where there was essentially no difficulty in making the calculations other than infinities, no one has figured out how to make the calculations. So there was an original history of the Tamm-Dancoff method, various damping approximations, Salpeter equations and other tricks… One tried to solve these things and people became discouraged. A group of mathematically minded people who were not able to solve the equations tried to prove they had no solution and made no sense. This has not succeeded in absolutely demonstrating that this is essentially or nearly a blind alley.
[…]
One of the reasons it is and has always been difficult to work hard on these problems is that nature keeps telling us that it has the quality of being much more elaborate than we thought and that any minute another resonance may come in and give another clue. There has always been a feeling that something is incomplete.


Dyson, in The Quantum Theory of Fields, p. 177:

I believe that the central problem of field theory is to define a precise notion of convergence which makes the solution of an infinite set of equations a meaningful and feasible mathematical operation.


Goldberger, in The Quantum Theory of Fields, p. 180:

in terms of what is called axiomatic field theory, one can produce an infinite set of equations based on only the finest postulates, which with a suitable notation may be written very compactly. This simplicity, which is only superficial, is contrasted with what appear to be rather arbitrary and intuitive procedures on the part of the dispersion theorists who do not seem to express things very neatly.

On the other hand, the axiomatic field theoreticians are very hard pressed to compute the Klein-Nishina formula. The Lagrangian people can do this, but if we allow the discussion at the conference to degenerate to the physics of the strong interactions, they, to say nothing of the axiomaticians, are absolutely helpless. At this level, the challenge of field theory presented to dispersion theory is non existent. Although the axioms if you like, of the dispersion approach have not been stated in what we imagine to be their final form, no one is really ever in much doubt as to how to proceed. The reason is that the unashamed dispersion theorist has even been willing to resort to experiment to get ideas and to push a little further his understanding of the strong interactions. Thus as Gell-Mann has put it, he is understanding some physics while developing the theory.


Goldberger, The Quantum Theory of Fields, p. 186-7:

[on the mathematically derived limitation on the value of momentum transfer when dispersion theory is applied to non-forward scattering] there is no physical basis for this limitation; most people, including me, believe that it is mathematical in origin and that there is no restriction on momentum transfer.


Goldberger, The Quantum Theory of Fields, p. 195:

at present we do not really know what we have theoretically, whether the axiomatic scheme is rich enough to contain all the embarrassingly large number of particles or even more modestly anything at all relevant to experience.


Wentzel, The Quantum Theory of Fields, p. 204:

Abandoning field theory altogether in favor of a dispersion relations scheme seems to me similar in spirit to abandoning statistical mechanics in favor of phenomenological thermodynamics. We believe in statistical mechanics as a comprehensive theory although only in simple cases can one actually calculate a partition function. If this calculation is technically too difficult, we may content ourselves with applying thermodynamics at the cost of feeding in more experimental data.

Still, nobody would want to do without statistical mechanics as a higher ranking discipline.

Similarly, I would hope that also field theory in one form or another, will retain its place as a superior discipline.


Dyson, The Quantum Theory of Fields, p. 206:

[on Dirac's question about what happens if the commutation relations fail by way of a non-local interaction behaving like exp(-r/a) at distance r] If one adopts the point of view that dispersion relations should hold strictly, it is a disaster. If one adopts the Chew philosophy that observable quantities are sensitive only to nearby singularities in the complex energy-plane, the appearance of such a very distant pole creates no difficulty.


Mandelstam, Two-Dimensional Representations of Scattering Amplitudes and their Applications, in The Quantum Theory of Fields, p. 215:

the possibility of analytically continuing a function into a certain region is a very mathematical notion, and to adopt it as a fundamental postulate rather than a derived theorem appears to us to be rather artificial. The concept of a local field operator, though it may well have to be modified or abandoned in the future, seems more physical.


Yukawa, Extension and Modifications of quantum Field Theory, in The Quantum Theory of Fields, p. 235-6:

Since unexpected discoveries of strange particles around 1950, we have begun to complain of the inability of the theory to predict new particles and new phenomena and of the lack of unifying principles. Thus, we have long been awaiting the emergence of a theory which is not only free from the logical and structural defects, but is also able to give cogent reasons for the existence and interaction of various types of particles. What seems to me most troublesome is the lack of an undeniable contradiction between experiment and theory. On the one hand, real discrepancies between quantum electrodynamics and relevant experiments are not yet detected in spite of the fact that quantum electrodynamics cannot be a completely isolated theory. On the other hand, a comparison of a theory of strong interaction with experiments can not yet be sufficiently accurate. Fifty years ago Planck's quantum theory was there and was clearly contradictory to the whole body of classical theory. This very contradiction, however, proved to be a great motive power for further developments in theoretical physics. In contradistinction to it, we cannot easily locate a contradiction today.


Bohr, in The Quantum Theory of Fields, p. 254:

Even when speaking of phenomena at very high energies, we must somehow maintain at any rate an asymptotic connexion with the classical model of description. It is the question of how far the technical mathematical devices used in the account of such phenomena take into consideration all aspects of the situation.


Bjorken and Drell, Relativistic Quantum Mechanics, McGraw-Hill (1964), viii:

Some modification of the Feynman rules of calculation may well outlive the elaborate mathematical structure of local canonical quantum field theory, based as it is on such idealizations as fields defined at points in space-time. Therefore, let us develop these rules first, independently of the field theory formalism which in time may come to be viewed more as a superstructure than as a foundation.

3. Such a development, more direct and less formal — if less compelling — than a deductive field theoretic approach, should bring quantitative calculation, analysis, and understanding of Feynman graphs into the bag of tricks of a much larger community of physicists than the specialized narrow one of second quantized theorists. In particular, we have in mind our experimental colleagues and students interested in particle physics. We believe this would be a healthy development.


Bjorken and Drell, Relativistic Quantum Mechanics, 65-6:

the Klein-Gordon equation was discarded and the development of the Dirac equation was motivated by the desire to establish a one-particle theory. Therefore, we may ask, why not abandon the Dirac equation too? We are reluctant to discard it for the simple reason that by now we have uncovered an impressive body of "truth" in the Dirac equation — it predicts the correct hydrogen-atom spectrum and g value of the electron to high accuracy. Moreover, positrons as first predicted by the theory have been observed.

Thus the historical path of reasoning mapped out originally by Dirac has led to the desired equation for an electron, though we have now reinterpreted the theory and thereby renounced the motivation that started the development. The history of physics has numerous other examples of this pattern of progress. Therefore, we shall retain the Dirac equation and the hole-theory interpretation and reject instead the one-particle probability interpretation which we originally set out to achieve.


Bjorken and Drell, Relativistic Quantum Fields, McGraw-Hill (1965), 3-5:

there is no evidence of an ether which underlies the electron wave $\psi (x, t)$. However, it is a gross and profound extrapolation of present experimental knowledge to assume that a wave description successful at "large" distances (that is atomic lengths $\approx 10^{-8}$ cm) may be extended to distances an indefinite number of orders of magnitude smaller (for example, to less than nuclear lengths $\approx 10^{-13}$ cm).

In the relativistic theory, we have seen that the assumption that the field description is correct in arbitrarily small space-time intervals has led — in perturbation theory — to divergent expressions for the electron self-energy and the "bare charge". Renormalization theory has sidestepped these divergence difficulties, which may be indicative of the failure of the perturbation expansion. However, it is widely felt that the divergences are symptomatic of a chronic disorder in the small-distance behavior of the theory.
[…]
At the present time one knows how to develop only approximate solutions to this problem, and therefore the predictions of any such theory are incomplete and at best somewhat ambiguous.

Faced with this situation, the most reasonable course to steer in constructing theories is to retain the general principles which have worked before in a more restricted domain. In this case, this includes the prescription for quantization which strongly involves the existence of a hamiltonian H.
[…]
Since there exists no alternative theory which is any more convincing, we shall hereafter restrict ourselves to the formalism of local, causal fields. It is undoubtedly true that a modified theory must have local field theory as an appropriate large-distance approximation or correspondence. However, we again emphasize that the formalism we develop may well describe only the large-distance limit (that is, distances $> 10^{-13}$ cm) of a physical world of considerably different submicroscopic properties.


Bjorken and Drell, Relativistic Quantum Fields, 36:

We may make this troublesome result — the divergence of [the vacuum fluctuation amplitude, (12.46) in text] — less unpleasant with the observation that one cannot in fact measure the square of a field amplitude at a point. In order to probe a single isolated point of space-time, one needs infinitely large frequencies and infinitesimally short wavelengths — and these are not to be achieved at less than infinite energies. Also, in practical calculations the fact that (12.46) diverges will cause no serious difficulties. However it is disturbing to find our formalism full of expressions such as $\mathcal{L}$ and $P^\mu$ which, like (12.46), involve products of field operators evaluated at the same space-time point. It should be kept in mind that only products of fields averaged over finite regions of space-time may exist mathematically and have physically observable meaning. We interpret results like (12.46) as indicative of the limitations of a continuous-field description — it is an idealization which provides an adequate description of the physical world only in the sense of the correspondence principle for large space-time intervals. It remains for experiment to show how small are the space-time lengths before quantitative revisions in the theories are required.


Bjorken and Drell, Relativistic Quantum Fields, 69:

In setting up the formalism, we shall abandon manifest Lorentz covariance by making a particular choice of photon polarizations. However, we start with the Lorentz-covariant Maxwell field equations, and when the smoke clears, we shall finally be led back to the same covariant rules of calculation derived in Chaps. 7 and 8 on the basis of more intuitive arguments. These rules lead to the same results in all Lorentz frames.


Bjorken and Drell, Relativistic Quantum Fields, 96:

the method of introducing for each particle in nature a separate field and a separate interaction term is dubious and at best phenomenological in character. It is not clear which of the particles in nature might be "bound states" or "excitations" of more fundamental fields. In fact, the present trend of thinking is to attempt to reformulate the theory without the use of the lagrangian, retaining only the fundamental axioms of field theory. Instead of introducing, via a lagrangian, "bare" particles which are then "clothed", that is, gain structure owing to their interactions, one accepts the physical particles which exist and successively "undresses" them by investigating the structures which have the greatest spatial extent. There is considerable optimism that such a program, which is also manifestly phenomenological in approach, is essentially equivalent in physical consequences to the lagrangian formalism.


Bjorken and Drell, Relativistic Quantum Fields, 130:

It is the goal of quantum field theory as a physical theory to describe the dynamics of interacting particles which are observed in nature.


Bjorken and Drell, Relativistic Quantum Fields, 270:

Progress in the area of dynamical calculations has been made at the expense of drastic approximations, and the methods of calculation are still undergoing change and criticism. Validity of the approximations can rarely, if ever, be defended on purely theoretical grounds. Instead, one turns to salient experimental features in order to motivate a string of approximations at the end of which comes a prediction whose merit is judged only by comparison once more with observation.


Bjorken and Drell, Relativistic Quantum Fields, 281:

Although we cannot defend the validity of this assumption, it is still useful because of its simplicity and vulnerability to experimental test.


Bjorken and Drell, Relativistic Quantum Fields, 375-6:

Fortified by the above arguments, we may feel justified in applying renormalization group arguments to improve perturbation theory estimates at high momenta. However, it is implicit in our arguments that an order-by-order sum of asymptotic parts in a perturbation series yields the asymptotic behavior of their sum. It may very well be that terms dropped along the way as relatively small at high momenta in each individual order of calculation actually become dominant relative to those retained when the perturbation series is summed. Therefore, conclusions based on the renormalization group arguments concerning the behavior of the theory summed to all orders are dangerous and must be viewed with due caution.

So is it with all conclusions from local relativistic field theories.


Eden, High Energy Collisions of Elementary Particles, Cambridge University Press (1967), 2:

(3) Methods that have been developed from assumptions not yet fully proved from fundamental axioms but which involve techniques whose importance will surely remain even if there is some change in their motivation. These include dispersion theory and the Mandelstam representation, and Regge theory using complex angular momentum.

(4) Theoretical models which can be expected to apply to a limited range of experimental results and to evolve as one begins to understand more clearly the essential approximations involved. These include the peripheral model and simple approximations to Regge theory.


Eden, High Energy Collisions of Elementary Particles, Cambridge University Press (1967), x:

Although some of the special assumptions of Regge theory may not survive future critical experimental and theoretical tests, it is certain that the basic methods of complex angular momentum will remain amongst the most useful techniques for studying high energy collisions.


Eden, High Energy Collisions of Elementary Particles, Cambridge University Press (1967), 61-3 (too much to quote)


Eden, High Energy Collisions of Elementary Particles, Cambridge University Press (1967), 3:

Although [the Mandelstam representation] has not been proved from fundamental axioms, it provides a useful illustration of analytic properties and crossing symmetry relations, many of which would remain correct even if some aspect of the Mandelstam representation (such as the subtraction assumption) turns out to be incorrect. The same comment applies to the Regge theory using complex angular momentum which is described in Chapter 5. Although some of the special assumptions may not survive critical analysis, it is certain that the methods of complex angular momentum will remain amongst the useful techniques for parametrising and studying high energy collisions.
[…]
it seems very likely that simple approximations to the full theory will be useful for many years in the analysis of high energy experiments.


Chew, The Analytic S Matrix, W. A. Benjamin Inc., 1966, 3:

Historically the recognition of analyticity in the theory of subatomic particles developed as much from studies of field theory as from experimental observations. In fact, one often hears it said that dispersion relations for scattering amplitudes were "derived" from field theory. There is some truth in such a statement, but it can be seriously misleading. It is entirely possible that hadron scattering amplitudes may have the analyticity properties commonly ascribed to them, while at the same time the association of fields with these particles is meaningless. In other words, one does not know how to construct fields purely in terms of analytic scattering amplitudes. The field concept involves a larger set of functions than is achieved by the analytic continuation of the S matrix, and the existence of this larger domain does not necessarily follow from that of the smaller. If in the future it should develop that a field theory of hadrons is impossible, the conjectured analyticity properties of the nuclear scattering matrix still may survive. There is ample precedent in the history of physics for special aspects of an unsuccessful theory to appear again in a quite different theory that does succeed.

The simple framework of S-matrix theory and the restricted set of questions that it presumes to answer constitute a major advantage over quantum field theory. The latter is burdened by a superstructure, inherited from classical electromagnetic theory, that seems designed to answer a host of unanswerable questions. S-matrix theory goes too far in the other direction, however, because it is not designed to describe experiments in which interparticle forces continue to act while momentum measurements are being performed. Behaving in this fashion are the forces that we best understand: the long-range interactions of electromagnetism and gravity. In its current form, S-matrix theory can at most describe short-range interactions.


Georgi, H., 1993. Effective Field Theory. Annual Review of Nuclear and Particle Science 43, pp. 209–252.:

The philosophical question underlying old-fashioned renormalizability is this: How does this process end? It is possible, I suppose, that at some very large energy scale, all nonrenormalizable interactions disappear, and the theory is simply renormalizable in the old sense. This seems unlikely, given the difficulty with gravity. It is possible that the rules change dramatically, as in string theory. It may even be possible that there is no end, simply more and more scales as one goes to higher and higher energy. Who knows? Who cares? In addition to being a great convenience, effective field theory allows us to ask all the really scientific questions that we want to ask without committing ourselves to a picture of what happens at arbitrarily high energy


David Gross (2004). The triumph and limitations of quantum field theory, in Conceptual Foundations of Quantum Field Theory, ed. Cao, Cambridge University Press. p. 56:

I am not sure it is necessary to formulate the foundations of QFT, or even to define precisely what it is. QFT is what quantum field theorists do. For a practising high energy physicist, nature is a surer guide as to what quantum field theory is, as well as to what might supersede it, than is the consistency of its axioms.


Sam Treiman (2004). Comments, in Conceptual Foundations of Quantum Field Theory, ed. Cao, Cambridge University Press. p. 69:

Wightman's outlook was, in the main, generic. His talk was devoted, so to speak, to the theory of quantum field theories. There are many interesting findings and many open issues that are internal to that subject. But on a broad enough level, if you want to look for features that are common to all field theories, you of course cannot hope to extract many phenomenological predictions. But there do exist a few such predictions, as Wightman reminded us: the connection between spin and statistics and the CPT theorem. I would add the forward pi-nucleon dispersion relations, and various other bits and pieces of analyticity information.


Sheldon Glashow, Does Quantum Field Theory Need a Foundation?, in Conceptual Foundations of Quantum Field Theory, ed. Cao, Cambridge University Press. p. 77:

Like its predecessors, quantum field theory offers — and will always offer — a valid description of particle phenomena at energies lying within its own domain of applicability. This domain cannot extend all the way to the Planck scale, but its actual limits of applicability have not yet been probed. From this point of view, we are discussing the foundations of a theory that, whatever its successes, cannot be accepted as true.

In fact, QFT is just wrong! Quantum mechanics is all-encompassing. A correct theory must include quantum gravity and QFT is not up to the task. Furthermore, it is beset by divergences.


Michael Fisher, in discussion on Glashow's Does Quantum Field Theory Need a Foundation?, in Conceptual Foundations of Quantum Field Theory, ed. Cao, Cambridge University Press. p. 86:

In condensed matter physics, in one way or another, we do have action at a distance. Of course, we use (effective) field theories with gradient terms, usually just gradient-squared terms but more if we need them: however, we remain conscious that this is an approximation. But in quantum field theory we say that operators commute at spatially separated points and this represents locality; however, we would not get any further if we did not also have gradient terms of various sorts present. So it has always struck me that introducing a field to get rid of action at a distance is something of a swindle that physicists now accept. But to be honest to the philosophers, what we are really saying is that there is action at a distance, but it's a mighty small distance! We accept the Newtonian idealization of a derivative or gradient.

That perspective has always seemed to me to offer an argument against point particles.


R. Jackiw, The unreasonable effectiveness of quantum field theory, in Conceptual Foundations of Quantum Field Theory, ed. Cao, Cambridge University Press. p. 149:

The… infrared infinity afflicts theories with massless fields and is a consequence of various idealizations for the physical situation: taking the region of space-time which one is studying to be infinite, and supposing that massless particles can be detected with infinitely precise energy-momentum resolution, are physically unattainable goals and lead in consequent calculations to the aforementioned infrared divergences. In quantum electrodynamics one can show that physically realizable experimental situations are described within the theory by infrared-finite quantities… So the consensus is that infrared divergences do not arise from any intrinsic defect of the theory, but rather from illegitimate attempts at forcing the theory to address unphysical questions.
[…]
The ultraviolet infinities appear as a consequence of space-time localization of interactions, which occur at a point, rather than spread over a region… Therefore choosing models with non-local interactions provides a way to avoid ultraviolet infinities.
[…]
the divergences of quantum field theory must not be viewed as unmitigated defects; on the contrary, they convey crucially important information about the physical situation, without which most of our theories would not be physically acceptable.


R. Jackiw, The unreasonable effectiveness of quantum field theory, in Conceptual Foundations of Quantum Field Theory, ed. Cao, Cambridge University Press. p. 154:

in local quantum field theory these phenomenologically desirable results are facilitated by ultraviolet divergences, which give rise to symmetry anomalies.


Steven Weinberg, What is QFT and what did we think it was?, in Conceptual Foundations of Quantum Field Theory, ed. Cao, Cambridge University Press. p. 250:

On the plane coming here I read a comment by Michael Redhead, in a paper submitted to this conference: 'To subscribe to the new effective field theory programme is to give up on this endeavour' [the endeavor of finding really fundamental laws of nature], 'and retreat to a position that is somehow less intellectually exciting'. It seems to me that this is analogous to saying that to balance your checkbook is to give up dreams of wealth and have a life that is intrinsically less exciting. In a sense that's true, but nevertheless it's still something that you had better do every once in a while. I think that in regarding the standard model and general relativity as effective field theories we're simply balancing our checkbook and realizing that we perhaps didn't know as much as we thought we did, but this is the way the world is and now we're going to go on to the next step and try to find an ultraviolet fixed point, or (much more likely) find entirely new physics.


John Stachel, Michael Fisher, Howard Schnitzer, and Roman Jackiw, discussion on Cao's "Renormalization group: a puzzling idea", in Conceptual Foundations of Quantum Field Theory, ed. Cao, Cambridge University Press. pp. 281-2:
Stachel:

In developing a physical theory, a premature demand for consistency can be very dangerous. Obviously, at a certain point, it's very important to demand logical coherence. But a physicist will generally have a feeling about when such a consistency requirement is important and when it's not important at some point in time.

Of course, a completely consistent theory is a dead theory. It's only when a theory has been superseded, and you already know its limitations, that you can give a complete and consistent account of it, axiomatize it, and so forth. But while a theory is still developing, to call for closure and demand logical consistency is extremely dangerous and risks damaging its development. A creative physicist should not be simply paralyzed. It's extremely dangerous for a philosopher who does not have this 'Einfühlung' for the physics to try prematurely to demand consistency.

Fisher:

The advantage of having people set down axioms is that you can see which one you want to get rid of, when you want some different answer. And that's a method which one has to use time and time again in physics. If there are certain no-go theorems, then you have to find some way around or you have to say, you know, this particular fundamental principle I'm going to do without.

Schnitzer:

I think looking at axiomatic structures and finding axioms one wants to get rid of involve an a posteriori vision of what goes on. Let's look at the attack on locality that string theory presents as just a paradigm of what we're talking about. One did not sit down and say let's get rid of locality. Rather, there were physical issues on the agenda, and the issue of locality arose as a result of analyzing those issues. Then a posteriori, one said we can eliminate locality and still keep consistency. If one is looking at the historical development, one should not adopt an a posteriori reinterpretation.

Jackiw:

Another example was the development of supersymmetry. It was a longstanding desire in particle physics to combine internal symmetries with space-time symmetries in a nontrivial fashion. It never worked. There was the Coleman-Mandula theorem, which showed you couldn't do it. But then, remarkably, it was done. And it was done by people who weren't paying attention to the previous development at all, but came at the problem from something like string theory, and they were forced, in order to get a result, to use anticommutators rather than commutators in setting up a Lie algebra, and thereby completely circumvented the no-go theorem. In retrospect you can say, the hypothesis in the theorem was that you should use commutators and we'll do away with that. But actually people's brains don't work that way. You don't look at a set of hypotheses in a theorem, and say I'll cross out this one hypothesis, and now I'll make progress.


Fritz Rohrlich, "On the ontology of QFT", in Conceptual Foundations of Quantum Field Theory, ed. Cao, Cambridge University Press. p. 361:

the mathematical questions are not the only reasons for my hesitation at a philosophical interpretation of QFT. Perhaps even more important for considering QFT as not yet ready is the fact that QFT is not yet an established theory (in the sense of Rohrlich and Hardin 1983). This means that QFT has so far not been 'overthrown by a scientific revolution' — the next level theory is not yet here. Only then can one know the exact validity limits of QFT; and only then can one make the corrections that the development of the next level theory necessitates.


Sidney Coleman, panel discussion, in Conceptual Foundations of Quantum Field Theory, ed. Cao, Cambridge University Press. p. 371:

The appearance of divergences or Landau ghosts or other pathologies is a clue as to where the theory is inapplicable. The fact that quantum electrodynamics has problems at $e^{137}$ or maybe it's even $e^{63.5}$, I don't know what the constants are. But at such a huge energy —
[interruption]
The fact is useful only if you believe QED presents you with a contradiction, only if you believe QED describes the world on that distance scale. If you think QED is just good on a much smaller energy scale, that fact is irrelevant.


Weisskopf, Growing up with field theory: the development of quantum electrodynamics, in The Birth of Particle Physics, ed. Brown and Hoddeson, Cambridge University Press, 1983, p. 73:

A strong increase [in the effective charge of the electron] occurs only at the very small distance $r \sim \lambda_c \exp ( -\hbar c / e^2)$, the same distance we discussed in connection with the self-energy, where the theory is most likely inapplicable. We then get a dependence of $Q_{eff}$ on the distance… This must be regarded as an interesting result of quantum electrodynamics in spite of the unnatural assumption of an infinite 'true' charge $Q_0$.


Weisskopf, V. F. (1939, July). On the Self-Energy and the electromagnetic field of the electron. Physical Review 56 (1), 72-85:

The situation, however, is entirely different for a particle with Bose statistics. Even the Coulombian part of the self-energy diverges to a first approximation as $W_{st} \sim e^2 \hbar / (mca^2)$ and requires a much larger critical length… to keep it of the order of magnitude of $mc^2$. This may indicate that a theory of particles obeying Bose statistics must involve new features at this critical length; whereas a theory of particles obeying the exclusion principle is probably consistent down to much smaller lengths or up to much higher energies.