Most deliveries seem to mandate substantial software engineer involvement before any consideration of why this is necessary, or of alternative approaches.
This is the third blog in the Data Topic series based on the book datagility. The previous blog is datagility 2 – data centricity.
Well here we are in the 21st century and people are still writing copious amounts of code!
This astonishes me.
Why is it that almost every delivery project I have ever been involved with has required significant code development, and almost all of them have demanded legions of software engineers?
Almost 30 years ago, I used system generators that could produce basic online transaction processing (OLTP) systems without a single software engineer. In the intervening years it seems that rather than gain universal acceptance, the dream of software engineer-free system delivery has all but disappeared.
Data Driven Delivery vs. Software Engineers
Requiring human intervention whenever changes need to be made to the system landscape slows our ability to adapt to a snail’s pace. It is self-evidently an agility anti-pattern. Adopting a data driven delivery approach in its place removes this negative impact and provides many other advantages besides. Let’s consider a few of the key impacts of software engineer-led approaches.
Despite their best intentions and endeavours, software engineers inevitably skew deliveries through their interpretation of requirements. When we consider how requirements are mediated through their communication chain, it is obvious how this can happen.
Burying Understanding
Another aspect of software engineer-led development is that any data rules and business rules become implemented purely as part of the codebase. This results in them becoming well and truly buried. Thus, critical organisational IP can often simply disappear into a black hole of code!
Increased Maintenance Costs
A core theme of the Agile Manifesto is the continuous delivery of working software. But we need to remember that, typically, the more we deliver, the higher our maintenance costs become, simply due to the increased legacy from the additional moving parts.
The mantra of constant delivery can lead to an increasing system landscape inertia that constantly erodes organisational agility.
With more moving parts, even small changes to functionality create significant hurdles, requiring impact analysis and possibly widespread resultant changes. We must avoid this at all costs. If we cannot, the tangled web of landscape complexity will frustrate our organisation’s ability to free itself towards an agile future.
Data Driving Delivery
I first witnessed the data driven delivery approach when I worked for Oracle many years ago. At the time, it struck me as wondrous how thoughts could be turned into systems that work – without any software engineers being involved at all!
The approach at Oracle was then called ‘Computer Aided System Engineering’ (CASE), and they provided a suite of tools branded as Oracle CASE tools. The approach was based upon the simple premise that, if it is possible to accurately capture a systematised definition of the process and data requirements, then application code can be delivered deterministically, directly from these.
A simple example that illustrates this idea is the ability of many contemporary modelling tools to produce data definition language (DDL) statements directly from their stored data structure definitions. These DDL statements allow physical database structures to be created based directly upon the model definitions.
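To make this concrete, here is a minimal sketch of the idea, assuming a hypothetical in-memory model definition; the entity, attribute and type names are illustrative only and do not come from any particular modelling tool:

```python
# Minimal sketch: deriving CREATE TABLE DDL from a stored model definition.
# The model below is a hypothetical example, not output from a real tool.
model = {
    "entity": "CUSTOMER",
    "attributes": [
        {"name": "CUSTOMER_ID", "type": "NUMBER(10)", "mandatory": True},
        {"name": "FAMILY_NAME", "type": "VARCHAR2(100)", "mandatory": True},
        {"name": "DATE_OF_BIRTH", "type": "DATE", "mandatory": False},
    ],
    "primary_key": ["CUSTOMER_ID"],
}

def generate_ddl(model: dict) -> str:
    """Deterministically derive a DDL statement from the model definition."""
    columns = [
        f'    {a["name"]} {a["type"]}{" NOT NULL" if a["mandatory"] else ""}'
        for a in model["attributes"]
    ]
    columns.append(f'    PRIMARY KEY ({", ".join(model["primary_key"])})')
    return f'CREATE TABLE {model["entity"]} (\n' + ",\n".join(columns) + "\n);"

print(generate_ddl(model))
```

Because the DDL is derived purely from the stored definitions, changing the model and regenerating is all that is needed to change the database structure.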
However, Oracle took this idea further by developing generators that could also deliver UI screens and reports. These were driven by function to entity and attribute mappings that defined the create, retrieve, update and delete (CRUD) usages of the data.
This can be characterised as a declarative approach: it generates system deliverables based upon a declaration of what is required, without the need to define how to achieve it. As a result, it delivers a radical step-change in agility, since infrastructure components are generated directly from stakeholder-defined metadata.
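As an illustration of the kind of mapping involved, the following sketch derives UI components from a declared function-to-entity CRUD matrix; the function names, entities and derivation rules are hypothetical assumptions, not Oracle’s actual generator logic:

```python
# Hypothetical function-to-entity CRUD matrix of the kind described above.
crud_matrix = {
    ("Maintain Customers", "CUSTOMER"): "CRUD",
    ("Review Orders", "ORDER"): "R",
}

# Illustrative mapping from each CRUD usage letter to a UI component.
ACTIONS = {
    "C": "create form",
    "R": "list/detail view",
    "U": "edit form",
    "D": "delete action",
}

def screens_to_generate(matrix: dict):
    """Derive the UI components implied by each declared CRUD usage."""
    for (function, entity), usage in matrix.items():
        components = ", ".join(ACTIONS[letter] for letter in usage)
        yield f"{function} -> {entity}: {components}"

for line in screens_to_generate(crud_matrix):
    print(line)
```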
Importantly, by reducing software engineer effort, the declarative approach also hands back more direct control of delivery to the SMEs of our organisations.
Notice how the generator replaces the software engineers in the development framework zone.
Delivering Models into the System Landscape
Using data driven delivery has some important consequences, including the stitching of the organisation’s metadata to the system landscape. Let’s illustrate this with the specification of interfaces from our Business Data Models.
In the preceding diagram, we can see how models are used to define the physically implemented interface component. Each of the elements in it can therefore be traced back to the logical Business Data Model. This in turn can act as a reference lynchpin linking all of our metadata definitions. For example, it is easy for us to map Data Privacy constraints to our Business Data Model.
This principle is covered more fully in the datagility 2 – data centricity blog.
Data Driven Behaviour
There is, however, a second declarative approach that uses rule-based metadata to replace software engineer-led delivery.
By contrast with the previous approach, no system components are produced from the declarative definitions. Instead, the software is developed to read the metadata to drive its behaviours.
In this approach, data processing behaviours are not hard-wired into the codebase, but instead are prescribed in the metadata that is read and then implemented by the codebase.
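A minimal sketch of this pattern follows, assuming a hypothetical rule schema held as data; the field names, operators and actions are illustrative only:

```python
# The rules are metadata: when requirements change, only these rows change.
rules = [
    {"field": "amount", "operator": ">", "value": 10_000,
     "action": "flag_for_review"},
    {"field": "country", "operator": "in", "value": ["XX", "YY"],
     "action": "block"},
]

# The codebase implements a small, stable set of operators.
OPERATORS = {
    ">": lambda a, b: a > b,
    "in": lambda a, b: a in b,
}

def evaluate(record: dict, rules: list) -> list:
    """Return the actions triggered for a record by reading the metadata."""
    return [
        rule["action"]
        for rule in rules
        if OPERATORS[rule["operator"]](record[rule["field"]], rule["value"])
    ]

print(evaluate({"amount": 25_000, "country": "GB"}, rules))
# -> ['flag_for_review']
```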
This approach is extremely effective in areas where volatility of prescriptive requirements is anticipated. The following examples illustrate the kind of functional requirements that benefit from this approach:
- Anti-Money Laundering (AML)
- Workflow Engine
The approach is illustrated in the following image.
From my experience of using this technique over many decades, I have noticed the following, possibly counter-intuitive, principle: the effort to build a rule-based engine that can support many use cases is strangely less than it would take for software engineers to implement just one equivalent use case!
How can it take less time to develop a configurable solution than a single point solution?
In the datagility 2 – data centricity blog, we discovered that rule-based modelling is all about enshrining data rules in the data structures. This has the potentially surprising consequence of significantly reducing the amount and complexity of code required!
In other words, the metadata definitions replace explicit code definitions.
If we think about the AML example above, there may be many thousands of possible combinations of values that would constitute a client becoming compliant – even for just one specific compliance regime. Add to this the temporal and other impacts of different compliance types and levels, and it becomes obvious that a code-based approach will be problematic and increasingly difficult to maintain over time.
By contrast, a metadata rule-based approach allows individual rule elements to be easily entered into the metadata structures directly from the governed outputs of the regulatory framework analysis.
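As a hedged sketch of what such metadata structures might look like, the following holds individual rule rows bounded by effective dates, so that new regulatory analysis is entered as data rather than code; the regime name, checks and dates are hypothetical:

```python
from datetime import date

# Each row is one rule element, keyed by compliance regime and bounded by
# effective dates, so temporal impacts are handled as data.
aml_rules = [
    {"regime": "REGIME_A", "check": "identity_verified",
     "effective_from": date(2020, 1, 1), "effective_to": None},
    {"regime": "REGIME_A", "check": "risk_score_below_70",
     "effective_from": date(2023, 6, 1), "effective_to": None},
]

def rules_in_force(regime: str, on: date) -> list:
    """Select the rule rows applicable to a regime on a given date."""
    return [
        rule for rule in aml_rules
        if rule["regime"] == regime
        and rule["effective_from"] <= on
        and (rule["effective_to"] is None or on < rule["effective_to"])
    ]

for rule in rules_in_force("REGIME_A", date(2024, 1, 15)):
    print(rule["check"])
```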
With a metadata-driven approach the behavioural rules are easily made visible outside of the codebase and can be presented to stakeholders through an intuitive interface. As with the previous generator approach, this allows easy confirmation of what has been deployed. For example, the following image is a representation of the state transitions for a request workflow defined using a behavioural metadata workflow engine.
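To illustrate this in code, here is a minimal sketch of a workflow engine whose permitted state transitions are read from metadata; the states and events are hypothetical, not the actual workflow shown in the image:

```python
# The transition table is the metadata; the engine itself never changes
# when the workflow is redesigned.
transitions = {
    ("DRAFT", "submit"): "SUBMITTED",
    ("SUBMITTED", "approve"): "APPROVED",
    ("SUBMITTED", "reject"): "REJECTED",
    ("REJECTED", "resubmit"): "SUBMITTED",
}

def apply_event(state: str, event: str) -> str:
    """Move a request to its next state, as prescribed by the metadata."""
    if (state, event) not in transitions:
        raise ValueError(f"Event '{event}' is not valid in state '{state}'")
    return transitions[(state, event)]

state = "DRAFT"
for event in ("submit", "approve"):
    state = apply_event(state, event)
print(state)  # APPROVED
```

Because the transition table can be rendered directly to stakeholders, what they review is exactly what the engine executes.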
Summary
What we have learned in this blog is that we need to
- ensure we fully consider using data driven delivery to replace much of the manual delivery by software engineers
- use data driven techniques to deliver
  - system infrastructure
  - system behaviours dynamically driven by metadata
- use data driven delivery techniques to stitch our metadata universe to the system landscape
- provide visualisation of the metadata used in our data driven delivery, giving stakeholders an understanding of what is deployed
The next blog in this Data Topic is datagility 4 – data agility, where we will learn how to ensure that data drives fundamental organisational adaptability and agility.