NetKernel News Volume 4 Issue 24 - On Empiricism Part 5

NetKernel News Volume 4 Issue 24

October 18th 2013

Catch up on last week's news here, or see full volume index.

Repository Updates

The following updates are available in the NKEE and NKSE repositories

  • http-client-2.16.1
    • Enhanced the caching of HTTP responses when Cache-Control header has max-age set. Thanks to Ron Hitchens at OverStory for proposing and testing this with MarkLogic.
  • lang-dpml-1.23.1
    • Major update with changes to the behaviour of <sequence> functionality and fixes to the consistency of handling of <varargs> and multiple arguments with the same name:
    • better syntax checking to reject assignment attributes in places where they would be ignored
    • varargs are passed with the correct references when multiple arguments have the same name
    • stop NPE when referencing an unassigned variable in a sequence
    • fix to <if> tag/accessor so that the response is not expired when no then/else block is evaluated
    • assignments within nested sequences are scoped to the outermost sequence; this behaviour is now consistent with DPML 1.0 from NK3 - more details in the documentation
    • arg:arguments now preserves ordering of arguments
    • fix so that exclude-dependencies header will work inside lazy evaluated arguments
    • fix to stop arguments to sub-requests from clashing with arguments supplied from DPML invocation
  • nkse-cron-1.13.1
    • documentation correction
  • nkse-dev-tools-1.53.1
    • minor correction to new module wizard template
  • nkse-doc-content-1.46.1
    • various documentation updates/refinements including page on setting up NetKernel as a service
  • nkse-visualizer-1.23.1
    • fix to stop NaN when timing async requests
  • nkse-xunit-1.10.1
    • new assert - exceptionMessage - to allow checking the message field in an exception response

On Empiricism - Part 5

Last time we learned that Turing created (and destroyed) computing. We discovered that the empirical march of information technology has run its course and, unlike its peers in the natural sciences, has been relatively undisturbed by the implications of some pretty cataclysmic shocks.

This week I'd like to try to go back to first principles and see if we can't also countenance an end to empiricism in computation...

Mr Feynman on Probability as the basis of Reality

If you've followed these newsletters, or glanced through some of the articles in the collection we just published, then you will understand that I came to the world of computing as an outsider. I'd already had my world view shaped by a career in the physical sciences and, in particular, in quantum mechanics.

Now I know the ideas of quantum mechanics have permeated the general culture, and technical people in particular have some sense of them. I'd like to show what I mean by the death of empiricism by looking at how physicists dealt (deal) with the problem. I can't give you a crash course in quantum mechanics but I know a man who can...

This lecture was given by Richard Feynman in 1964 as one of a series on "The Character of Physical Law". They're all good but this one is particularly good since in one hour he nails every aspect of quantum mechanics and also demonstrates how physicists deal with the daunting idea that there is no hard ground to stand on - there's only probability...

If you prefer, you can read the transcripts published in this book, chapter 6...

http://www.physicsteachers.com/pdf/The_Character_of_Physical_Law.pdf

It's a ready-reckoner for QM and there's no maths. It all comes down to Feynman's simple description of the true and repeatedly verified nature of reality.

Observations

If you don't have time to watch it all - yes, it's old and in black and white, and it's just a man, a piece of chalk and a blackboard, but 56 minutes to understand all of QM is pretty brief - we can cut straight to the key observations for our particular concern: the death of empiricism...

Around about minutes 49-50, Feynman has shown that stuff has both particle-like and wave-like properties and that the act of observation always disturbs the results. You'll hear him passingly dismiss the idea of a "hidden variable".

The hidden variable explanation of QM was a last-ditch attempt to cling on to empiricism as a way of dealing with the non-deterministic nature of reality that experiment consistently kept showing. The idea was that nature couldn't be random and probabilistic - there must be an unknown, a "hidden variable". That is, a hidden layer of empiricism that we couldn't see but which must account for the weirdness.

Feynman very succinctly explains that even if we could get closer and closer to the source of the particles, until we could see them almost before they're emitted, we always ultimately have to accept that reality is not empirical.

It is comically summed up (at 51:56) in this short smack in the face to the empiricist...

A philosopher once said, 'It is necessary for the very existence of science that the
same conditions always produce the same results.'

[pause for comic timing] "Well, they do not!"

Science had to deal with this. I reckon it's time that computing took this on board too...

Redefining Computation

What would happen if we went right back to first principles? What if we fundamentally re-examine the Church-Turing Thesis? You remember it from last time...

Any calculation that can be performed by a human being following an algorithmic
procedure can be performed by a Turing Machine.

Our best definition of computing starts with the unquestioned, explicit axiom that computation is the evaluation of calculations by the empirical operation of algorithmic procedures. Furthermore, it implicitly guarantees that the result of the algorithmic procedure is an absolutely precise answer.

We know that the real world isn't empirical, and that any really useful computer is one that is connected to the messy, noisy, indeterminate real world in some way... and yet the empirical dice have been loaded since day one.

No wonder our information systems are brittle. No wonder we can't change them and have to throw them away and start over and over. No wonder we end up in factional tribes arguing religiously over the articles of faith of one language versus another, of one framework (algorithmic procedure) versus another.

Like hamsters on a wheel we're condemned to live our lives running up and down the Turing tape. We don't question it because the very definition of computing says that's what you do.

OK, here's a radical suggestion. Let's stop the wheel and try on a different idea...

Computation and Probabilistic Reality

Computing is not the mechanical execution of an algorithmic procedure. We have lost perspective and mistaken the machinery for the purpose. Here is a better definition of computing:

Computation is the reification of an abstract resource to a concrete representation.

Now you might think this sounds a little vague, and you might accuse me of reloading the dice in favour of the ROC world that I'm trying to lead you to. But it is not vague and I am not deliberately loading the dice - this is a better definition because it encompasses and expands the Church-Turing thesis while also admitting the possibility of a non-empirical world.
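To make this less abstract, here is a minimal sketch of the new definition as code. This is a hypothetical illustration in Java - the names are invented and this is not NetKernel's actual API - whose only point is that computation is whatever stands behind resolve(), with nothing said about the mechanism inside it...

// A minimal sketch of "computation as reification" - hypothetical
// names, not NetKernel's API. An abstract resource is only ever
// identified, never held directly; computation is whatever turns
// the identifier into concrete, immutable state.
interface Resolver {
    Representation resolve(String resourceIdentifier);
}

// A concrete representation: a snapshot of resource state.
record Representation(String value) {}

Notice that the contract mentions no algorithm, no tape and no procedure; any mechanism that yields the representation satisfies it.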

How can this encompass Church-Turing? Here's an example: What is the one-millionth digit of pi?

The C-T way: choose an algorithm, cast it to an algorithmic procedure, execute, wait. For example...

// pi/2 = 1 + 1/3*(1 + 2/5*(1 + 3/7*(1 + ...)))
PI = 2 * F(1);
F(i) {
 if (i > 60) return 1;  // cut off the infinite recursion; each level scales the tail by ~1/2
 return 1 + i / (2.0 * i + 1) * F(i + 1);
}

We understand that this is an algorithm and its computation is exactly as with Church-Turing - it could be done by hand, but we would prefer to let a Turing machine do it for us.

But how would we interpret this using our new definition?

Pi is an abstract resource - it is an irrational number - it has no finite representation. But pi can also be thought of as an ordered sequence of numerical digits.

It follows that "computing the millionth digit of pi" is, in our new definition, making concrete a representation of the state of a point in the abstract set of all digits of pi. We have seen we can do it by operating an algorithmic procedure (so we encompass C-T) or, if computing is about the ends and not the means, then I can do it this way too...

http://www.wolframalpha.com/input/?i=millionth+digit+of+pi&x=0&y=0

which says it's "5". But I can get an even more interesting answer here...

http://wiki.answers.com/Q/What_is_the_1_millionth_digit_for_pi

The millionth after the decimal point, is "1". But, if you consider "3" to be the
first digit of pi, then the millionth digit would be the number before that, namely "5"

Which tells me that the resource I requested was ambiguously identified and, until I clarify it, I'm experiencing the computing equivalent of particle-wave duality: the millionth digit of pi is both 1 and 5.

Asking for a resource is in no way less of a computation. I achieved the same thing three different ways. I reified the state of an abstract resource; I computed something by the new definition. If you think I cheated by using the Web, then consider this: I could equally have looked it up in the Dr Seuss "Big Book of Pi" on page one million.
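To labour the point in code, here is a hypothetical sketch (Java again, with every mechanism stubbed out - the identifier scheme and all three "engines" are invented for illustration) of three interchangeable reifications of the same abstract resource. The identifier stays fixed; only the machinery behind it changes, and by the new definition each is equally a computation...

import java.util.List;

public class MillionthDigitOfPi {
    interface Resolver { String resolve(String resourceId); }

    public static void main(String[] args) {
        String id = "pi:digit:1000000";  // a made-up identifier scheme

        // 1. Operate an algorithmic procedure (stub standing in for a
        //    real spigot-style digit extraction).
        Resolver byAlgorithm = rid -> "5";

        // 2. Delegate to another engine (stub standing in for an HTTP
        //    GET against Wolfram Alpha).
        Resolver byWebLookup = rid -> "5";

        // 3. Look it up (stub standing in for page one million of the
        //    "Big Book of Pi").
        Resolver byBook = rid -> "5";

        // Three mechanisms, one resource identifier, one representation.
        for (Resolver r : List.of(byAlgorithm, byWebLookup, byBook)) {
            System.out.println(id + " -> " + r.resolve(id));
        }
    }
}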

This is a suitable point to pause. Next time we'll consider how this new definition allows us to see computation and measurement as the same thing, find that we can deal with non-determinism and start to understand where all that entropy comes from...


Have a great weekend.

Comments

Please feel free to comment on the NetKernel Forum

Follow on Twitter:

@pjr1060 for day-to-day NK/ROC updates
@netkernel for announcements
@tab1060 for the hard-core stuff

To subscribe for news and alerts

Join the NetKernel Portal to get news, announcements and extra features.

NetKernel will ROC your world

Download now
NetKernel, ROC, Resource Oriented Computing are registered trademarks of 1060 Research

