Ontology and Physics
I recently had some offline discussion about issues
in physics, which is the most basic of all the sciences
and the one in which the theories have been developed
with the highest degree of precision.
Yet engineers who have a practical problem to solve never
use anything that remotely resembles a unified theory of
everything in physics. There are some very, very good
approximations, but all of those approximations are highly
context dependent. Each of them is based on a different
set of simplifying assumptions, and nobody ever goes back
to some fundamental theory, such as general relativity or
quantum field theory, in designing an airplane or predicting the weather.
Even for a subfield, such as electromagnetism, for which
Maxwell's equations were formulated in the mid 19th century,
an enormous number of practical problems are still solved
by trial and error. A famous example is antenna design
for radio, TV, radar, etc. Practitioners in the field
say that it's more art than science, and when they find
something that works, they copy it without completely
understanding how and why it works.
Given this mess of unrelated and mutually inconsistent
"microtheories" in physics, which is the most precise of
the hard sciences, I am amazed that people think it might
be possible to have a unified set of general axioms that
would cover all or even some significant number of business
applications on which computer systems must interoperate.
I'm not saying that it's impossible. I'm just asking people
to look at history. Following are two notes I sent recently.
The first is in answer to a couple of questions, and the
second is a recommendation for a book on physics.
-------- Original Message --------
To start with the last question,
> And from where is it known that the effect of a force
> is proportional to the 2nd derivative of the position,
> rather than (e.g.) the third derivative? Pure reasoning?
> Or empirical observation generalized?
There's an enormous amount of guesswork in the search for
fundamental principles. And it is very hard to distinguish
the principles from the concepts they relate. For example,
consider Newton's basic F=ma. The only thing that is
independently measurable is the acceleration. The force
and the mass cannot be measured independently.
There is also the fundamental question about the difference
between the mass as measured by weight (i.e., gravity) and
the mass as measured by inertia (i.e., F=ma). Newton assumed
they were the same, but that is a pure leap of faith, which
for reasons that were mysterious to many people (including
Einstein) turned out to be justified by measurements.
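The assumption Newton made can be stated concretely: if gravitational mass equals inertial mass, the mass cancels out of F = ma, and every body falls with the same acceleration regardless of how heavy it is. A minimal sketch of that cancellation (my illustration, not from the original note; the value of g is the standard surface figure):

```python
import math

G_SURFACE = 9.81  # m/s^2, standard surface gravity (assumed value)

def free_fall_acceleration(m_grav, m_inert, g=G_SURFACE):
    """Acceleration from Newton's 2nd law, with weight as the force."""
    force = m_grav * g       # weight: F = m_g * g
    return force / m_inert   # Newton: a = F / m_i

for mass in (0.1, 1.0, 1000.0):
    # With m_g == m_i, the mass cancels: every body accelerates at g.
    assert math.isclose(free_fall_acceleration(mass, mass), G_SURFACE)
```

If the two masses were even slightly different, the ratio m_g/m_i would show up as a mass-dependent fall rate -- which is exactly what the measurements failed to find.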
Newton based his work on Galileo and Kepler, among others,
but both of them had a lot of guesswork intermixed with
metaphysical discussions. And Kepler used the results of
his mentor, Tycho Brahe, who did the most careful possible
measurements in the hope of proving that Ptolemy was right
and Copernicus was wrong.
For essentially every concept in physics, there have been
many years of guessing, debate, observation, and mathematical
formulation, with repeated iterations back to the guessing stage.
> Do you know where the laws of conservation of momentum &
> energy come from? Mr. Newton, perhaps?
Leibniz and Newton were debating which, if either, was conserved
-- and what to call those mysterious properties that were or
were not conserved. Descartes and Newton believed that the
product mv (what we now call momentum) was conserved, and
Leibniz argued that mv**2 (what we now call kinetic energy)
was conserved. The notion of potential energy and the idea
that mechanical energy could be converted to heat energy and
back was not sorted out until the middle of the 19th century.
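As it turned out, both sides of that debate had hold of a real conserved quantity: in an elastic collision, the total mv and the total mv**2 are each unchanged. A quick numerical check (my illustration, not from the original note, using the standard one-dimensional elastic-collision formulas):

```python
import math

def elastic_collision(m1, v1, m2, v2):
    """Final velocities of two masses after a 1-D elastic collision."""
    v1f = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
    v2f = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
    return v1f, v2f

m1, v1 = 2.0, 3.0    # 2 kg moving at +3 m/s
m2, v2 = 1.0, -1.0   # 1 kg moving at -1 m/s
v1f, v2f = elastic_collision(m1, v1, m2, v2)

# Descartes/Newton's quantity: total mv (momentum) is unchanged.
assert math.isclose(m1 * v1 + m2 * v2, m1 * v1f + m2 * v2f)
# Leibniz's quantity: total mv**2 (kinetic energy, up to a
# factor of 1/2) is also unchanged.
assert math.isclose(m1 * v1**2 + m2 * v2**2, m1 * v1f**2 + m2 * v2f**2)
```

In an inelastic collision only the momentum survives; the "missing" mv**2 reappears as heat, which is precisely the bookkeeping that was not sorted out until the 19th century.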
There are two very good volumes about the historical development
of the concepts of electromagnetism, heat, and light, which go
into all the debates about what those mysterious things could
possibly be and the very many different hypotheses that were
proposed, debated, measured, and revised, for centuries:
_A History of the Theories of Aether & Electricity_ by
Sir Edmund Whittaker. Vol. 1, the Classical Theories to 1900,
Vol. 2, the Modern Theories to 1926. Third vol. never finished.
I bought the paperback versions in 1960 ($1.95 for vol. 1 and
$1.85 for vol. 2). They were reprinted by Dover in 1990, but are
now out of print. Second-hand prices range from $120 to $400.
The short answer is that it's very hard to sort out what is
mathematics, what is physics, what is metaphysics, and what is
wishful thinking. The amount of confusion over the centuries
has been enormous. But that seems to be par for the course.
I realize that you already have an ambitious reading list,
but I thought that I'd add one more thing to consider:
_The Road to Reality: A Complete Guide to the Laws
of the Universe_ by Roger Penrose, Knopf, New York.
This is an 1100-page tome that covers all of modern
physics without skimping on the math. However, it's
beautifully written, and the first 3 or 4 pages of
each of the 34 chapters give a good intuitive overview
of the topic of that chapter.
It is possible to skip around and just start reading
at the beginning of any chapter, even the concluding
Chapter 34, which predicts that there will be a major
revolution sometime during the 21st century that will
create upheavals as great as or greater than those caused
by relativity and quantum mechanics at the beginning
of the 20th.
It's the book I would have loved to have on my shelf
when I was an undergraduate studying math and physics.
For example, Chapter 6 is an 18-page summary of calculus,
which is an excellent review for people who forgot
everything they had learned and a lovely summary for
people who still remember it.
Chapter 16 is a good 25-page summary of Cantor's
set theory, the hierarchy of infinities, Turing
machines, and Goedel's theorem with good intuitive
discussion of what they mean.
Following are two reviews. The first, "Theory of
everything," is by Martin Gardner; the second is from
_The American Scientist_: