
NetKernel News Volume 4 Issue 21
August 30th 2013
Repository Updates
No updates this week.
Tom's Blog - The PoiNK Spreadsheet Resource Model
Tom has announced an update to the PoiNK spreadsheet resource model first developed by Tom Hicks...
http://practicalnetkernel.blogspot.be/2013/08/poinkrevisited.html
Hmmmm, spreadsheets. I can only assume that Tom G must be having to automate his money counting to keep up with it.
On Empiricism - Part 3
Last time, in the second part of this series, I gave a very brief history of the Triumph of Empiricism in the period from the 17th to the turn of the 20th centuries. Earlier, in the first part, I told the Parable of the Clockwork Mayor.
Which means we're nearly ready to start thinking about information technology, but first we need to smash up the world...
Turn of the Century - The cracks start to grow
We have seen that by the end of the 19th century the bizarre consequences of Set Theory were already starting to place strain on the rigid structure of Empirical mathematics. But the general faith in Empiricism was still firmly rooted.
As we have heard, David Hilbert set forth on an ambitious plan to construct a "formal system": a rigid empirical framework in which every mathematical statement could be expressed and, by a simple algorithmic process, be mechanistically proven to be either true or false.
It wasn't just in Mathematics that Empiricism was riding high. In Physics there was complacency and a general belief that "we are close to knowing everything": a sense that the big things, like thermodynamics, electricity and magnetism, had been solved, and that the rest would follow merely by empirical handle-turning.
Boy were they in for a shock...
Just after the turn of the century, almost simultaneously, two related but separate phenomena of light were observed. The most dramatic for many was the so-called ultraviolet catastrophe, in which the observed spectrum of black-body radiation diverged from the expected thermodynamic model at high frequencies. The second was the discovery that under certain conditions light could stimulate the emission of "electricity" from metals - the photoelectric effect.
It fell ultimately to Planck and Einstein respectively to resolve these inexplicable phenomena. In both cases the explanation demanded that the energy and emissions were "lumpy". That is, the only explanation that worked was to recognise that the systems were not smooth and continuous but quantized into discrete states.
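Planck's resolution can be summarised in a single, now-standard relation: energy is exchanged with radiation only in discrete quanta, each proportional to the frequency of the light:

```latex
E = h\nu
```

where $h \approx 6.626 \times 10^{-34}\,\mathrm{J\,s}$ is Planck's constant. Einstein applied the same relation to light itself to explain the photoelectric effect: an electron escapes the metal only when $h\nu$ exceeds the work function $\phi$, leaving kinetic energy $E_k = h\nu - \phi$.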
The Empirical Dam Bursts
The shocking aspect of these new theories was that they hinted that the world of continuous smooth empirical clockwork seemed to have discrete limits. As with set theory before them, the classical world was starting to discover that things were a lot weirder than was comfortable.
Motivated by the mounting experimental evidence that, no matter where you measured it, the speed of light always seemed to be exactly the same - even for observers at different points on the spinning Earth, or on opposite sides of its orbit around the sun, moving at thousands of miles per hour relative to one another - Einstein stuck it to the Empirical world big-style by asking a simple question:
"What would you experience if you travelled on a beam of light?"
The answer turned out to be almost incredible. For if the speed of light is universally constant, then the only possible answer is that space and time must bend.
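The "bending" has a precise, now-standard form. In special relativity, the time interval measured for a clock moving at speed $v$ is stretched by the Lorentz factor:

```latex
\Delta t' = \frac{\Delta t}{\sqrt{1 - v^{2}/c^{2}}}
```

As $v$ approaches $c$ the denominator approaches zero and the dilation diverges, which is the intuition-breaking limit behind Einstein's thought experiment.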
One hundred years later, through repeated exposure to it and repeated experimental validation, we're kind of getting familiar with this idea. But I don't think the deeper significance is common currency. That space and time bend is one thing, but relativity theory states something even more profound:
Every observer experiences a different reality. Reality is relative to context.
That's still incredibly shocking. But the shocks kept on coming...
Collectively, the community of researchers exploring the microscopic phenomena of quantization began to establish the experimental and mathematical tools for describing these effects - and waved goodbye to Empiricism forever.
Quantum Mechanics showed the world to be inherently discrete, but incredibly, any given state could never be measured with precision. Precision could only be achieved either collectively (macroscopic ensembles) or by compromising the precision of another property of a system.
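That trade-off between properties has an exact, now-standard expression: Heisenberg's uncertainty principle, here for position $x$ and momentum $p$:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

Sharpening the measurement of one quantity necessarily blurs the other; the bound $\hbar/2$ is a property of nature, not of the instruments.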
Worse still for Empiricism, the only way to explain the experimental evidence was to accept that the very nature of physical reality was itself intrinsically probabilistic. That is, stuff is not concrete at all, everything is a smeared out set of possibilities!
Put this together and you have the disconcerting truth:
Reality is contextual to the observer, it is nondeterministic and the precision with which we can interact with it is necessarily finite.
An empiricist would be weeping into their beer at this point. But in fact, the lifting of the rigid hard boundaries at the base of our scientific description of the world has led to an explosion in technological opportunities which, even after half a century of mining its treasures, we are still only beginning to exploit. Your smartphone is just the tiniest tip of the iceberg.
But we're still some way off the information technology I want to address, so let's return to mathematics...
Hilbert's Cogs Slip
While the earth was shifting beneath the feet of the scientific community, the Mathematical world could still hold on to Hilbert's manifesto.
Indeed the first decade of the 20th century saw a great triumph as Bertrand Russell and Alfred North Whitehead published the Principia Mathematica, using set theory to provide a formal description of number and, as a crowning achievement after two huge volumes, a proof that one and one makes two.
Very soon afterwards (well, it was twenty years, but in Mathematics that's like the next day), up pops a precocious genius - the kid in the clockwork mayor story. His name is Kurt Gödel. Twenty-five years old, from Vienna.
He wheels his clockwork creations into the center of the mathematical community and shoots the mayor right between the eyes.
On his wheelbarrow he had not one but two theorems: Gödel's Incompleteness Theorems. The summary of which is that Hilbert's dream of a formal system can never be realised: given any consistent set of mathematical axioms powerful enough to express arithmetic, there exist statements in the system that can neither be proved nor disproved. Mathematics is incomplete.
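For the record, the first of the two theorems can be stated compactly: for any consistent, effectively axiomatized theory $T$ strong enough to express arithmetic, there is a sentence $G_T$ such that

```latex
T \nvdash G_T \qquad\text{and}\qquad T \nvdash \neg G_T
```

that is, neither $G_T$ nor its negation is provable in $T$. (The second theorem adds that such a $T$ cannot even prove its own consistency.)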
Now let's get a sense of the impact of this.
Mathematics, like software, is a great place for the megalomaniac since, unlike physical reality, you are constrained only by your imagination and your ability. If you run out of road with one set of axioms you can always choose new ones. But the implication of Gödel is that no matter which formal system you choose, there are statements you will not be able to prove.
Mathematics can never be empirical. As with the relativistic and probabilistic "unknowable" underpinnings of the physical sciences, Mathematics is also ultimately "unknowable", or rather more correctly, "incomplete".
Shocked? You should be. Our last tether to rational empirical reasoning just snapped. It's not elephants all the way down. We can't even know if there are elephants; we can't even know if there's a "down".
On that bombshell we shall pause. Next time I promise we will get to the IT and I hope synthesize some valuable perspective from this mess...
Have a great weekend. Enjoy what remains of the summer  nearly time to get back to work...
Comments
Please feel free to comment on the NetKernel Forum
Follow on Twitter:
@pjr1060 for day-to-day NK/ROC updates
@netkernel for announcements
@tab1060 for the hardcore stuff
To subscribe for news and alerts
Join the NetKernel Portal to get news, announcements and extra features.