By George Staw, Corporate Metadata Architect, Alchemy Data Solutions Ltd

“To understand a world that changes we must revisit our myths, we must subject them to criticism.” Umberto Galimberti – “The Myths of Our Time”

There is a giant mystery at the heart of corporate IT: how is it possible – in spite of the expenditure of
so much time and effort – that the IT industry still hasn’t worked out how to provide the one technology
which its corporate clients need more than any other (1), i.e. a straightforward, comprehensive and
reliable way of organising and making use of the huge quantities of data which are piling up in almost
every corner of every organisation? Something, somewhere is going seriously wrong and in this article
we are going to discover the real reasons for this massive failure at the heart of the IT industry.


The realisation that there is something fundamentally wrong with corporate IT is not new of course, and
in an interview in 1999 Larry Ellison, the head of Oracle, was very critical of his own industry’s failings:
“We blew it in the 1990s. By running applications on the client, client/server was meant to put
information at your fingertips. But all we did was to create distributed complexity and fragmented data.
CEOs have come to hate IT because they can’t get what they want from it. Burger King put an SQL Server
database in every hamburger store, but they still couldn’t answer the question, ‘how many Whoppers
are we selling each day?’. ERP as an industry missed the boat. It focused on automating processes, not
on getting information to key decision makers.” (2).


More than twenty years later however it would seem that very little has changed: “distributed
complexity and fragmented data” are still the bane of every corporate IT department, vendors still seem
to think only in terms of “automating processes”, and CEOs still can’t get what they want from IT. But
why should anything have changed, when IT vendors are still designing and building systems on the basis
of the same process-driven methodology which was first used in the early days of computing? This is
the fundamental reason for the problems which affect computing: the IT industry has not yet
learned how to do the one thing which any successful organisation must be able to do – it does not
know how to evolve. It turns out that the IT industry is totally and utterly stuck in the past.


It is not easy to accept that an industry which likes to see itself as being at the forefront of intellectual
and technological innovation is in fact incapable of change, but the problem for the IT industry is that it
believes in only one thing, which is to create processes and technologies which will be disruptive, and
the more disruption the better. For corporate IT the result of this approach has been problematic, to
say the least, and in the final section of this white paper we will look at how the resulting tech-driven
mindset has prevented the IT industry from learning how to create the advanced, data-centric platforms
which its clients so urgently need.
It is our conviction that IT is just too important to be left to technologists, and we hope that this analysis
will provide users with the information and insights they need to demand more and better of their IT
suppliers.

The power of the process


“To a man with a hammer, everything looks like a nail.” Mark Twain


As with any community or organisation, IT has been shaped by the world it grew up in, and this was a
world in which the primary purpose of computing was indeed process automation – creating computer
systems to carry out routine commercial or technical tasks (e.g. producing the company payroll) which
previously had been carried out manually. But in the 1980s, as digital technologies started becoming
more widely available, that world began to change and people started looking for ways to not just
automate but to transform the way businesses worked – it was the start of a digital revolution.


But for those working in IT at the time, what did this change actually mean? There had been a flood of
new technologies but, as usual, very little time to work out how to make best use of them, so when it
came to implementing these new systems it is hardly surprising that IT providers continued relying on
the only development model they knew, i.e. harnessing the power of the computer to manipulate and
transform data – in effect the logic of the factory production line. It was a straightforward, tried and
tested approach, and there seemed no reason to think that it could not be used to create the next
generation of IT systems: after all, generating the data for a data warehouse surely couldn’t be very
different from generating the data for the monthly credit card statements.


But what looked like a practical solution turned out to be a catastrophic miscalculation, because what
developers failed to realise (and for the most part still haven’t understood) was that a methodology
which was perfectly adequate for creating simple physical artefacts would be no good at all for dealing
with the much more complex challenges of enterprise-level IT – to put it simply, you cannot build a
skyscraper using the same methods and materials you use to build a house. It is no wonder that
client/server and so many other IT innovations never worked as intended: they were all based on a
design paradigm that was fundamentally unfit for purpose. The question is, why did no one realise this?
And it is the answer to this question which is going to help us discover how the IT industry really thinks
– starting with its attitude to what is arguably the cornerstone of computing science, the algorithm.


It was two pioneers of the computing age, Alan Turing and Claude Shannon, who showed that any
well-defined task, no matter how complex, can be expressed as a precisely defined sequence
of logical operations, i.e. as an algorithm – the magic spell which releases the genie of digital
transformation. It is therefore hardly surprising that the algorithm has come to dominate how we think
about and use IT; whatever the task, it is just taken for granted that the best, the surest way to deliver
any organisational or technical innovation is by following the right process, program or methodology.
But this faith in the power of the algorithm has given rise to a mindset which assumes (a) that
technological perfectibility is not only possible but certain, and (b) that getting the algorithm right is all
that matters. The result is a technology sector which always underestimates the practical challenges
which are an inevitable part of using any new technology, as can be seen in the current obsession with
driverless cars: Chris Urmson, until 2016 the head of Google’s self-driving car programme (now Waymo),
once thought that his young son would never need a driving licence but now talks of driverless cars appearing
“gradually […] over the next 30 to 50 years” (3), and Raj Rajkumar, an autonomous driving expert at
Carnegie Mellon University, also anticipates a transitional period which would involve “letting the car
drive itself in easier conditions, while humans take over at more challenging moments.” (4)

This idea of a gradual transition to fully automated control may sound reasonable, but could it actually
work in practice? To put it simply, would any sane person allow themselves to be driven by a system
which is liable to abruptly hand back control whenever a “more challenging moment” occurs, especially
if they had not driven a car for many weeks?


Perfecting a specific product or methodology in itself is not enough to ensure that it can work in the real
world; IT vendors need to take much greater account of the overall organisational, cultural and
economic context within which any technology is to be used. But thinking “outside the process box”
does not come naturally to most IT professionals, because they operate on the basis that computer
systems should “do the whole thing”, replacing rather than augmenting human effort – and the following
incident shows how ill-conceived this approach can be.


On 1 June 2009 Air France flight 447, en route from Rio de Janeiro to Paris, crashed into the Atlantic
Ocean with the loss of all 228 passengers and crew. The cause of the crash was frighteningly simple: the
plane’s autopilot and fly-by-wire systems had been temporarily switched off when an airspeed sensor
iced over – a minor difficulty which could have been dealt with by one of the plane’s three pilots taking
manual control of the plane. The trouble was that all three pilots had become so dependent on the
plane’s computer systems that they had forgotten how to fly it themselves. By the time the most
experienced of the three pilots realised what was happening and what had to be done to save the plane,
it was too late. (4)


Surely it would be better to design autopilot systems which just ran in the background, monitoring the
flight, ready to report any problems but leaving the pilots to do the actual flying. But the idea of creating
systems which take a back seat holds little appeal for IT professionals, who want technology to transform
the world, in Mark Zuckerberg’s famous phrase, to “move fast and break things” (5); a commendable
ambition, perhaps, but one driven by more than just the wish to make the world a better place.
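
Before moving on, it is worth making this “back seat” idea concrete. The following is a minimal sketch in Python – all names and thresholds are invented for illustration – of a system designed to monitor and advise rather than to act: it surfaces problems but leaves the flying to the pilot.

    from dataclasses import dataclass

    @dataclass
    class SensorReading:
        airspeed_knots: float  # indicated airspeed
        airspeed_valid: bool   # False if the sensor (e.g. an iced-over pitot tube) is suspect

    class FlightMonitor:
        """Watches the flight and advises; it never takes control away from the pilot."""

        MIN_SAFE_AIRSPEED = 150.0  # invented threshold, purely for illustration

        def advise(self, reading: SensorReading) -> list:
            advisories = []
            if not reading.airspeed_valid:
                advisories.append("Airspeed data unreliable - fly pitch and power manually.")
            elif reading.airspeed_knots < self.MIN_SAFE_AIRSPEED:
                advisories.append("Airspeed low - check thrust and attitude.")
            return advisories  # the system only reports; the pilot remains the actor

    # The pilot keeps flying; the monitor merely surfaces problems as they arise.
    monitor = FlightMonitor()
    for message in monitor.advise(SensorReading(airspeed_knots=120.0, airspeed_valid=True)):
        print(message)

The design choice matters: because the monitor never acts, there is no sudden hand-back of control at a “more challenging moment” – the human skill it depends on is exercised continuously rather than allowed to atrophy.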

Uncovering IT’s hidden agendas


“It is an old maxim of mine that when you have excluded the impossible, whatever
remains, however improbable, must be the truth.” Sir Arthur Conan Doyle – Sherlock Holmes, “The Adventure of the Beryl Coronet”


So far we have been focusing on the historical and cultural influences which seem to keep the IT industry
trapped in its process-centric mindset, and this might give rise to the expectation that IT suppliers could
be persuaded to reform the way they work. But is this realistic? The more we examine how the IT
industry thinks and operates, the clearer it becomes – and there is no easy way to say this – that IT has
become a technology junkie, addicted to the excitement and the endless highs which come from being
constantly on the verge of unleashing yet another revolutionary technology, and determined to keep
on getting that fix, no matter what the cost might be to itself or to those who depend on it. In other
words the IT industry has no interest at all in reforming itself – it is having too much fun, and making too
much money, with things as they are; IT is never going to grow up and start taking what it does seriously
until its customers, who control the purse strings, compel it to do so.

It is not easy to accept that an entire industry can be characterised as an addict, but this theory does
explain patterns of behaviour which are otherwise hard to make sense of. Why, for example, is IT
constantly working itself up into a state of almost febrile excitement, even when there is nothing in
particular to be excited about? Why does the IT industry keep on abandoning and then rediscovering
the same technologies again and again? Worst of all, why has the IT industry systematically undermined
or ignored all those ideas and technologies which aimed to make corporate computing more genuinely
data-centric, such as the relational database and the Semantic Web? The trail of evidence created by
IT’s compelling need for constant technological revolution includes a number of recurring techniques
which are worth looking at in more detail:


· Technology recycling


Genuine innovations, such as the iPhone or the World Wide Web, do not occur very often, but
over the years the IT industry has become very adept at relaunching existing technologies as
though they were something new; for example what is now called “analytics” has had multiple
reincarnations since its inception as “management information” several decades ago – with each
relaunch bringing with it a new set of products, methods and specialist terms for users to acquire.
But why does each iteration require users to start from scratch? Technologies may come and go
but the fundamental nature of any business remains very much the same; so why not create
platforms which allow business-related elements to be passed on from one generation to the
next? That would certainly make the task of introducing new products a lot easier, but it would
also detract from the impression that a given product represents something totally new and
amazing – something which IT vendors are determined to avoid.


· Technology relabelling


Easier than recycling is relabelling, where the IT industry takes an existing task or activity and
simply renames it in order to create an impression of continual progress and innovation: for
example the work that has traditionally been carried out by a software developer suddenly
requires the skills of a “data scientist”. The work itself hasn’t changed, and it is the same person
doing it, but for all newly fledged data scientists, and for the consultancies which hire them out,
this sudden change in the IT skills market brings nothing but benefits.


· Technology’s law of the jungle


It sometimes seems that IT vendors are never happier than when they are warning their customers
of the oblivion which faces them unless they keep up with the latest technological trends; after
all, we all know that even the most successful business can be wiped out overnight if it fails to see
which way the technological wind is blowing – and as a way of compelling users to be constantly
innovating, this kind of existential threat is hard to resist.


But what kind of innovations are IT vendors actually offering? Andy Jassy, the CEO of AWS, tells us
that “Invention requires two things: 1. The ability to try a lot of experiments, and 2. not having to
live with the collateral damage of failed experiments.” (6) The obvious implication of this
statement is that the more businesses use AWS, the more inventive and therefore successful they
will be, but does this idea actually make sense? Making it very easy to do anything is not
necessarily a good idea, and often it is only by dealing with the consequences of our failures that
we learn anything at all. Invention does not need unlimited resources but it does require ideas,
good communication, an effective organisation and, just to mention it again, the ability to learn
from experience. IT vendors are always keen to create the impression that they are selflessly
pursuing the lofty goal of technological progress but perhaps that isn’t always entirely true.

In fact what these three examples demonstrate is the extent to which the corporate IT agenda is
controlled by IT vendors, who define it purely in terms of their own products and services rather than
with reference to what might actually benefit their customers. Nowhere is this gulf between what
users need and what vendors are able to provide wider (and more damaging) than when it comes to the
issue of corporate data, because – and once again there is no easy way to say this – almost everything
the IT industry does for its customers with regard to data is fundamentally wrong: not (of course) out
of any malicious intent, but simply because the IT industry has not yet taken the time to learn what data
really is, and is therefore unable to conceive and build the data-centric solutions which its clients need.


How the IT industry ignores data


“The world as we have created it is a process of our thinking. It cannot be changed without changing our thinking.” Albert Einstein


To understand the limitations of IT’s usual approach to data, we need only consider the fate of the
Semantic Web, a vision for the future development of the World Wide Web outlined in the May 2001
edition of Scientific American by Tim Berners-Lee and two fellow academics. In this article the authors
described how “A new form of Web content that is meaningful to computers” would “unleash a
revolution of new possibilities” (7), but the revolution – a revolution which would have made data-centric IT a reality and brought enormous benefits for all IT users – never took place, and in 2006 Tim Berners-Lee was forced to concede that “this simple idea” had remained “largely unrealised” (8).


But the Semantic Web was just the latest in a long line of failed attempts to put data architecture at the
heart of corporate IT, and proof once again that IT vendors and their teams of software developers are
determined to resist any ideas which threaten their traditional, technology-driven approach to IT. After
all, as far as IT vendors were concerned, they were just doing what any other professional community
would do, and responding to a specific challenge on the basis of their own interests and experience;
never having taken the time and trouble to understand data, IT vendors had no way of grasping the
potential of this extraordinary new idea, whereas a concept such as “data is the new oil” made perfect
sense because it fitted in with their process-centric view of the world.


A more recent example of IT’s inability to use data can be found in the ongoing saga of BCBS 239. In
January 2013 the Basel Committee on Banking Supervision (BCBS), keen to prevent another banking
crisis, issued BCBS 239, a set of guidelines outlining how banks could improve their risk reporting
capabilities (9). But in a progress report published in April 2020, the BCBS reported that: “As of the end
of 2018, none of the banks are fully compliant with the BCBS 239 principles, as attaining the necessary
data architecture and IT infrastructure remains a challenge for many. In general, banks require more
time to ensure that the Principles are effectively implemented.” (10)

The fact that after nearly six years not one of the world’s thirty G-SIBs (global systemically important
banks) has been able to implement these fourteen principles should have rung a few alarm bells – not
least because implementing BCBS 239 is not actually very difficult, and simply requires banks to underpin
their existing risk IT systems with a flexible and transparent metadata architecture. The real problem
here is that hardly anyone in IT knows what this task would actually involve, let alone how to carry it
out. So nothing changes, and the IT industry continues trying to meet the challenges of the data
revolution by producing more products and services, more tasks and methodologies. These creations
may all include the word “data” somewhere in their name (the creative abilities of the IT industry in this
regard are remarkable), but if the industry could only break free of its process-centric mindset it would
realise that they are completely unnecessary, because everything that is needed for managing every
type of data already exists – not only the concepts, but also the means to deliver them. When it comes
to data governance, the wheel has already been invented.


Conclusion


“Never waste a crisis. It can be turned to joyful transformation.”
Rahm Israel Emanuel – The New York Times (March 17th 2009)

We have now looked at some of the factors that prevent the IT industry from evolving, but beyond such
practical considerations, there is one final observation we would like to make.


It is easy to forget that the process paradigm dominates not only what the IT industry produces but how
it is produced – to work in IT means learning to follow the process, but as IT systems have grown in size
and complexity, the resulting processes have become so complicated that it is very difficult for anyone
to judge the impact of what they are doing: developers, testers, analysts, managers and users spend
much of their time working in the dark, striving to meet their various targets, and hoping that they won’t
be blamed if anything goes wrong. But does the pressure to follow the process allow anyone the time
to think, to ask questions, to communicate with their colleagues? It comes as no surprise that in a recent
book Juval Löwy, one of the world’s leading software experts, refers to the “dark, depressing
reality of software development” (11) when describing the problems faced by the software industry.


The irony is that, contrary to what many IT professionals think, a data-centric system architecture would
provide much greater scope for individual creativity, and in our next white paper we plan to show what
this would look like in practice. Readers might be surprised how easy it would be to finally get off the IT
process treadmill – they might even find themselves starting to actually enjoy IT, and what a change
that would be!

Bibliography


[1] Fitzgerald, Michael, Nina Kruschwitz, Didier Bonnet and Michael Welch. Embracing Digital Technology: A New Strategic Imperative. MIT Sloan Management Review, 2013.
[2] The Economist. ERP RIP? The Economist, 24 June 1999.
[3] The Economist. Driverless Cars are Stuck in a Jam. The Economist, 2019.
[4] Harford, Tim. Crash: how computers are setting us up for disaster. The Guardian (Long Read), 11 October 2016.
[5] Blodget, Henry. An Interview with Mark Zuckerberg. Business Insider, 1 October 2009.
[6] Jassy, Andy. Connecting the Dots. AWS Summit SG 2017. AWS, 2017.
[7] Berners-Lee, Tim, James Hendler and Ora Lassila. The Semantic Web. Scientific American, May 2001.
[8] Shadbolt, Nigel, Wendy Hall and Tim Berners-Lee. The Semantic Web Revisited. IEEE Intelligent Systems, May/June 2006.
[9] Basel Committee on Banking Supervision. Principles for effective risk data aggregation and risk reporting. Bank for International Settlements, 2013. ISBN 92-9131-913-9.
[10] Basel Committee on Banking Supervision (BCBS). Progress in adopting the Principles for effective risk data aggregation and risk reporting. Bank for International Settlements, 2020. ISBN 978-92-9259-377-3.
[11] Löwy, Juval. Righting Software. Addison-Wesley, 2019. ISBN 978-0136524038.

