NetKernel News Volume 3 Issue 18
April 6th 2012
What's new this week?
Catch up on last week's news here
Repository Updates
The following updates are available in the NKEE repository...
- layer0 1.79.1
- Fix to grammar parsing to stop NullPointer when parsing optional choice groups that didn't match
- Enhancement to allow expanded modules without trailing slash in modules.xml
- module-standard 1.52.1
- Enhancement to configured endpoint to support incremental configuration resource changes
- layer1 1.35.1
- Enhancement to support EXISTS verb on golden threads
- Enhancement number of golden threads reported in space explorer
- nkee-encrypted-module-factory 1.12.1
- Enhancement to support metered licensing
- nkee-license-manager 1.10.1
- Enhancement to support metered licensing
- database-relational 1.12.1
- Fix to SQLTransactionOverlay to be declared threadsafe
- nkee-apposite 1.35.1
- Enhancement to support metered licensing
- nkee-dev-tools 0.23.1
- Enhancement in space integrity healthcheck to detect problems in module itself not just contained spaces
- xml-core 2.4.1
- Fix to HTML serialization
- nkp 1.11.1
- Enhancements to support load balancer
- Enhancement to support dynamic NKP Server configuration
- Enhancement to NKP Client to support exposing connection status resource
- nkse-control-panel 1.23.1
- Fix to control panel templating to serialize as HTML
- nkse-dev-tools 1.44.1
- Fix to catch exceptions and behave gracefully in enterprise edition if modules become unlicensed
- json-core 1.6.1
- Enhancement to JSONToXML to support adding a wrapping root element around JSON without a single root
Utility Meter Client
At last we can report something that isn't another "tale from the darkside"...
You will now find in the NKEE repository the nkee-mclient package. This provides, in a simple "install and forget" module, the capability to have an NKEE instance automatically obtain a license from one (or more) utility license servers.
The period of the license is configurable locally on the client - from one hour upwards. The client also supports automatic failover across a redundant array of license servers, so one very simple config can be deployed across every instance in a cluster.
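For illustration only - the element names below are invented for this sketch and are not the actual nkee-mclient schema - a client configuration of this shape would capture the two locally configurable concerns described above: the lease period and the redundant array of license servers.

```xml
<!-- Hypothetical sketch: illustrative element names, not the real schema -->
<meterClient>
  <!-- Lease period, from one hour upwards (ISO 8601 duration) -->
  <leasePeriod>PT1H</leasePeriod>
  <!-- Redundant array: the client fails over down the list -->
  <servers>
    <server>https://license1.example.com/meter</server>
    <server>https://license2.example.com/meter</server>
  </servers>
</meterClient>
```

Because nothing in it is instance-specific, the same file could be deployed verbatim to every node in the cluster.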
As we discussed before, the primary objective is to make possible a sub-thirty-second boot-deploy-operational cycle, allowing cloud NK instances to be automatically provisioned on demand.
The secondary effect is that, for both you and us, it massively simplifies the administration and management of licensing.
You can install the nkee-mclient now on any NKEE instance. Obviously it also needs a server to talk to. This comes in the form of a single module that can be installed on an NKEE instance to act as the license server. This server has to be specially baked for a customer account, but it's as easy to install as the client, and it provides the ability to monitor cluster activity and to generate and manage usage reports etc.
If metered licensing is something that would help make life simpler for you then please get in touch with us and we can quickly get you set up.
On Information Engineering
Computer Science has a noble and notable history. And yet, in the context of its peers, it is a very young science. It can also, for reasons that are fairly obvious, be somewhat regarded as a branch of applied mathematics.
Ever since I started to work in the field of Information Technology (IT note, not the rather sickening phrase ICT which has crept into use over the last decade. Surely the "C", representing communication, is a tautology of the grossest kind - what the bloody hell point is there to any information if it is not communicated!?) I have been nagged by a concern about the "historical closeness of Computer Science", and a sense that (sacrilege!) it may have an undue and often debilitating bearing on our ability to solve problems.
What is he going on about this time? I hear you mutter...
Well, as an analogy, let's take a short history lesson. Let's consider the science of electromagnetism. No doubt you know the general background: from the Leyden jar and the Voltaic pile, to Faraday and electromotive force, to the pinnacle of the science in Maxwell's equations. A journey of some sixty or seventy years of fundamental scientific discovery.
So what happened next? Well, starting with equally great men like Nikola Tesla, the science of electromagnetism was transitioned into the discipline of Electrical Engineering (and more recently Electronic Engineering) - a journey of some one hundred and twenty more years.
This is the contract that science and engineering have agreed between themselves... Whilst science discovers the foundations, engineering works out the equally challenging problems of how to exploit and take advantage of the scientific knowledge.
This is not to say that Engineering knowledge is lesser in value - no, if anything, once it is in place the engineering knowledge is the more valuable. For example, how many electrical engineers apply Kirchhoff's current laws on a regular (daily) basis? How many of them have ever used Maxwell's equations? (Answer: most of them; almost none of them.)
My iconoclastic perspective suggests to me that Computer Science has only just completed its "Maxwell equation phase". What I see around me are information system engineering problems. And I see a population of developers trained to work with software's Maxwell's equations (you name your favourite language).
And here's the problem: whilst you can develop an electrical circuit from first principles, it will be a terrible circuit.
The reason for this is that Engineering encompasses the transition from Science to Art - it embodies and requires an insight, an ability to grasp that the reality of a real problem imposes constraints, compromises, balance and tolerances. All of these are facets that are not even comprehensible at the level of first principles.
In short, engineers need an innate sense of beauty and balance. They also have to have a very, very strong sense of safety and quality - i.e. they have to allow for margins of error between the ideal specification and reality.
So now let me give you a real world example. I have recently been working closely to help design and deliver a very large scale information engineering system (commonly known as a web-site, but no matter that's just the top surface). The gist of the problem is this. There are some empty slots and they need to be filled with stuff. The rate at which the slots must be filled is determined by external pressures including externalities like market promotions causing an extreme peak in demand for one particular set of slots.
To me this is a problem of engineering, not of computer science. This is about finding a balance between rates of inputs and rates of outputs, about understanding the acceptable longevity of information, and about coping with loss of information or loss of the ability to fill slots. Most importantly it's about working out where the engineering safety margins should go so that, no matter what happens in the outside world, the system remains operating within acceptable limits - i.e. it cannot explode.
So then I look around, and up the road from my hotel I see a regional Coca Cola bottling plant. I wonder what they are doing inside that grey factory. And I think, you know what? All they are doing all day long is filling slots too! They too have externalities that determine how many slots (bottles) they need to fill and what capacity and form those should take (big, small, bottles, cans, etc.), they too have constraints that must be met, and they too have an imperative that the plant must not explode if the inputs are not bounded.
And I think, inside that building are a bunch of people who are called "industrial plant engineers". They're trained to look at the information problem of industrial production and to find an engineering balance that satisfies the economics of the problem.
So my question goes out to the IT industry: where are our "Information Engineers"? Why are we training Computer Science graduates in software's "Maxwell's equations"? Undoubtedly we will always need a little code written - but can't we compose our information engineering solutions together at a larger granularity? Can't we work at the scale of Kirchhoff's "current laws"?
The answer, naturally - because I have an axe to grind - is that of course we can. When you explain ROC to a mechanical or a chemical or an industrial process engineer they feel right at home. Throttles, gatekeepers, overlaid boundary constraints, freshness of resources and concurrency: these are the levers which they are trained to use. I know this for a fact, since I recently had the good fortune to (literally) stumble into a young mechanical engineering graduate newly hired into his first job - I told him about ROC and he was at home straight away.
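Throttles and gatekeepers are ROC constructs, but the engineering principle underneath them - bounded admission, so the system cannot explode when input outpaces output - can be sketched in a few lines of plain Python (the `Throttle` class here is illustrative, not NetKernel API):

```python
import threading

class Throttle:
    """Bounded admission: at most `capacity` units of work are in flight.
    Excess requests are shed rather than queued without limit, which is
    the engineering safety margin that keeps the system within bounds."""

    def __init__(self, capacity):
        self._slots = threading.Semaphore(capacity)

    def try_admit(self):
        # Non-blocking acquire: False means the request is shed.
        return self._slots.acquire(blocking=False)

    def release(self):
        # Called when a unit of work completes, freeing a slot.
        self._slots.release()


# Five requests arrive while only two slots exist: three are shed.
throttle = Throttle(capacity=2)
admitted = [r for r in range(5) if throttle.try_admit()]
shed = [r for r in range(5) if r not in admitted]
```

The point is not the code but the posture: the constraint is designed in as part of the system, not bolted on outside it.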
Now here's the punchline. Randy recently told me he's reading the leading book in the field of Applied Enterprise Software Architecture. He said the following made him laugh out loud; I'll paraphrase to spare blushes: "throttles are external system constraints and cannot be applied in software but are applied at the network level"...
Can you hear that whirring sound? It's Nikola Tesla spinning in his grave (and obeying the left-hand rule!)...
Career Move
We're in some danger of becoming a ten-year overnight sensation. We are definitely projecting an increase in demand for services and ROC skills. We want to share the opportunity with you the ROC community since you are the natural self-selecting constituency from which to draw. Get in touch with us - the game is afoot.
Have a great holiday weekend, I plan to start from Maxwell's equations to figure out why my Sennheiser earphones have stopped working on the right hand side...
Comments
Please feel free to comment on the NetKernel Forum
Follow on Twitter:
@pjr1060 for day-to-day NK/ROC updates
@netkernel for announcements
@tab1060 for the hard-core stuff
To subscribe for news and alerts
Join the NetKernel Portal to get news, announcements and extra features.