
A Model-Based Approach to CD: From "Everything as Code" to "Everything Is a Model"


The "Everything as Code" approach has its drawbacks. There is a better way: adopt the "Everything-as-Model" approach to build better apps, faster.
Techniques such as "Infrastructure as Code" (IaC) have proven to be a popular paradigm for codifying and managing infrastructure as versioned software to drive automated deployments in Continuous Delivery pipelines.
The concept of IaC has been extended to "Everything as Code" (EAC), which applies the same paradigm to other aspects of DevOps such as testing, security, databases, and operations.
While treating everything as code provides many benefits, it has its drawbacks: code sprawl and complexity create their own quality and maintenance challenges.
In this blog, I will discuss the "Everything-as-Model" (EAM) approach to Continuous Testing and Delivery, an evolution of the EAC concept that addresses its drawbacks and provides significant benefits.
A model, in our context, is a form of abstraction for the different types of entities in a continuous delivery system: code, tests, data, infrastructure, and so on. Model-based testing, for instance, is an emerging discipline that lets us represent tests as a model from which the actual tests are generated. Similarly, model-based software development lets us design the software as a model from which code is generated.
A model-based approach offers many advantages, such as abstraction, automated generation of downstream artifacts, traceability, and automated change impact analysis.
All of these benefits are pertinent to continuous delivery, where the emphasis is on agile delivery of small packages of continuous change to application systems. In other words, the ability to manage change (understanding its impact, then implementing, testing, and deploying it in an automated manner) is of paramount importance.
Let's look at the key processes involved in Continuous Delivery and how model-based approaches can be applied to each. These are requirements, development, testing, environments and deployment, and release management.
Representing requirements as a model addresses much of the ambiguity associated with requirements by explicitly capturing the intended behavior of the application (including complex logic) as a visual flow. Here's an example of a model developed in CA Agile Requirements Designer.
Such models can be expressed at many different levels (business process, feature, story, and so on), and we can establish relationships between models at different levels far more easily than with traditional text-based techniques. This gives us much more detailed traceability between requirements components and lets us perform automated impact analysis when a requirement changes, as sketched below.
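To make the idea concrete, here is a minimal Python sketch of a requirement captured as a flow of steps, with links from a story-level model up to feature- and business-process-level models so that a change can be traced across levels. The class names, fields, and example flow are hypothetical illustrations, not the internal format of CA Agile Requirements Designer.

```python
# Illustrative sketch only: a requirement as a directed flow of steps, with
# parent links between model levels enabling simple change impact analysis.
from dataclasses import dataclass, field

@dataclass
class FlowNode:
    node_id: str
    description: str
    next_ids: list = field(default_factory=list)   # outgoing edges (decisions have several)

@dataclass
class RequirementModel:
    name: str
    level: str                                      # "business process", "feature", or "story"
    nodes: dict = field(default_factory=dict)       # node_id -> FlowNode
    parents: list = field(default_factory=list)     # higher-level models this one refines

    def impacted_models(self):
        """Walk parent links to find every higher-level model touched by a change here."""
        seen, stack = [], list(self.parents)
        while stack:
            model = stack.pop()
            if model not in seen:
                seen.append(model)
                stack.extend(model.parents)
        return seen

# Usage: a story refines a feature, which refines a business process.
process = RequirementModel("Order fulfilment", "business process")
feature = RequirementModel("Checkout", "feature", parents=[process])
story   = RequirementModel("Apply discount code", "story", parents=[feature])
story.nodes["n1"] = FlowNode("n1", "Enter code", ["n2"])
story.nodes["n2"] = FlowNode("n2", "Code valid?", ["n3", "n4"])

print([m.name for m in story.impacted_models()])    # ['Checkout', 'Order fulfilment']
```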
As described above, detailed feature- and story-level requirements (which encapsulate the behavioral logic of the application) can be defined using a model-based approach. This enables developers to translate that logical behavior into code either manually or automatically. Model-based development has been practiced in domains such as embedded systems, GUI design, and database development (where the "code" is, for example, database schemas and tables). Regardless of whether the code is auto-generated, building the application from a model helps developers produce software that closely matches the stated requirements. Some tools maintain traceability between the model and the code, making automated change impact analysis (and often automated code updates) possible when the model changes.
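As a rough illustration of generating code from a model while preserving traceability, here is a small sketch that turns story-level model steps into code skeletons tagged with their originating node IDs. The step names and the template are invented for illustration; real model-driven tools use far richer generators.

```python
# Hypothetical sketch: generate code skeletons from story-level model steps,
# embedding node IDs as traceability tags so a model change can be mapped back
# to the affected code.
story_steps = {                      # node_id -> step description (assumed model export)
    "n1": "Enter discount code",
    "n2": "Validate discount code",
    "n3": "Apply discount to order",
}

def generate_stubs(model_name, steps):
    lines = [f"# Generated from model: {model_name}"]
    for node_id, description in steps.items():
        func = description.lower().replace(" ", "_")
        lines += [
            f"def {func}():  # trace: {model_name}/{node_id}",
            f'    """TODO: implement \'{description}\'."""',
            "    raise NotImplementedError",
            "",
        ]
    return "\n".join(lines)

print(generate_stubs("Apply discount code", story_steps))
```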
No process area benefits more from a model-based approach than testing. One of the biggest challenges in effective testing (and a major time sink in an agile context) is manual test design and maintenance. Understanding frequently changing requirements and turning them into optimal tests requires significant time and skill. As a result, most application systems are either under-tested (risking defect leakage or failure) or, sometimes, over-tested (incurring higher cost or elapsed time); they are rarely tested optimally.
Most of these challenges are circumvented by generating tests automatically from the requirements model rather than creating them by hand. This not only obviates the labor-intensive work of designing tests, but also generates the most exhaustive (or an optimal) set of tests needed to validate the requirements at the touch of a button. Tools such as CA Agile Requirements Designer (which you can take for a test run by starting your Free Trial) allow generation of tests to a variety of coverage targets depending on customer needs: most exhaustive (thorough testing of all possible scenarios), optimal (the smallest number of tests for maximum coverage), based on risk/complexity, based on past defect history, and so on (for example, see the Figure below and the sketch that follows).
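The sketch below shows the basic idea behind exhaustive coverage: enumerate every path through a requirement flow graph, where each path becomes a test case. The flow and its node names are invented, and this is not the algorithm used by CA Agile Requirements Designer; optimizers then prune such a set to reach "optimal" coverage.

```python
# Illustrative sketch: exhaustive path enumeration over a requirement flow.
flow = {                        # node -> list of successor nodes (assumed model export)
    "start":          ["enter_code"],
    "enter_code":     ["code_valid?"],
    "code_valid?":    ["apply_discount", "show_error"],
    "apply_discount": ["end"],
    "show_error":     ["end"],
    "end":            [],
}

def all_paths(graph, node="start", path=None):
    """Depth-first enumeration of every start-to-end path (one test case each)."""
    path = (path or []) + [node]
    if not graph[node]:
        return [path]
    paths = []
    for nxt in graph[node]:
        paths.extend(all_paths(graph, nxt, path))
    return paths

for i, p in enumerate(all_paths(flow), 1):
    print(f"Test {i}: " + " -> ".join(p))
# Test 1: start -> enter_code -> code_valid? -> apply_discount -> end
# Test 2: start -> enter_code -> code_valid? -> show_error -> end
```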
In addition to automated test optimization, model-based testing offers further benefits, such as automated maintenance of tests as requirements change and traceability from requirements to tests.
One of the biggest impediments to Continuous Delivery is environment configuration, provisioning, and application deployment. "Infrastructure as Code" approaches automate configuration and deployment, but often require detailed scripting to define topologies, provision them, and then deploy applications into environments. Moreover, such scripts proliferate with the number and types of environments (containerized, cloud, on-prem, hybrid, and so on). By comparison, model-based approaches abstract the elements of the environment, the application, and the deployment logic into an overall topology from which lower-level artifacts can be generated via code generation (see the Figure below and the sketch that follows).
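Here is a minimal sketch of that generation step: a small topology model from which a docker-compose style artifact is rendered. The model shape, component names, and image versions are assumptions; real model-based tooling targets many formats (container manifests, cloud templates, on-prem scripts) from one topology.

```python
# Illustrative sketch: generate a docker-compose style artifact from a topology model.
topology = {
    "environment": "staging",
    "components": [
        {"name": "web", "image": "nginx:1.25",   "ports": ["80:80"]},
        {"name": "api", "image": "acme/api:2.3", "ports": ["8080:8080"]},
        {"name": "db",  "image": "postgres:16",  "ports": []},
    ],
}

def to_compose(model):
    """Render the topology model as docker-compose style YAML text."""
    out = ["services:"]
    for c in model["components"]:
        out.append(f"  {c['name']}:")
        out.append(f"    image: {c['image']}")
        if c["ports"]:
            out.append("    ports:")
            out.extend(f'      - "{p}"' for p in c["ports"])
    return "\n".join(out)

print(to_compose(topology))
```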
In addition, we can establish traceability between different environment topologies. This allows us to perform change impact testing every time an environment's topology is changed (by a developer or an operations engineer, for example), flagging topology components (such as an OS patch version) that have become non-conformant. The same approach can be used for regulatory compliance verification in regulated environments, as illustrated below.
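A simple sketch of that conformance check: compare the declared topology model against what an environment actually reports and flag the drift. The inventory source is an assumption; in practice the observed values would come from configuration management or monitoring tooling.

```python
# Illustrative sketch: flag topology components that drift from the declared model,
# e.g. an out-of-date OS patch level.
declared = {"web": {"os_patch": "5.15.0-105"}, "api": {"os_patch": "5.15.0-105"}}
observed = {"web": {"os_patch": "5.15.0-105"}, "api": {"os_patch": "5.15.0-91"}}

def non_conformant(declared, observed):
    """Return components whose observed attributes differ from the model."""
    drift = {}
    for name, attrs in declared.items():
        actual = observed.get(name, {})
        diffs = {k: (v, actual.get(k)) for k, v in attrs.items() if actual.get(k) != v}
        if diffs:
            drift[name] = diffs
    return drift

print(non_conformant(declared, observed))
# {'api': {'os_patch': ('5.15.0-105', '5.15.0-91')}}
```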
Release modeling also helps us map requirements (features/stories) to release payloads. Large enterprises typically have multiple release trains active at the same time, tied to different teams working from different backlogs (see the Figure below).
As the payload of these trains changes, tracking and adapting to that change manually can be a challenge. Modeling release trains together with their application payload allows deployment packages to be reconfigured automatically, without manual intervention, as in the sketch below.
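The sketch below shows the idea with a toy release-train model: stories map to application artifacts, and when a story slips from one train to another, both deployment packages are simply regenerated from the model. Train IDs, story keys, and artifact names are all invented for illustration.

```python
# Illustrative sketch: derive deployment packages from a release-train model.
release_trains = {
    "2024.06": {"team": "payments", "stories": ["PAY-101", "PAY-117"]},
    "2024.07": {"team": "checkout", "stories": ["CHK-310"]},
}

story_to_artifact = {"PAY-101": "payments-svc:2.4", "PAY-117": "payments-svc:2.4",
                     "CHK-310": "checkout-ui:5.1"}

def build_payload(train_id, story_to_artifact):
    """Derive the deployment package for a train from its current story payload."""
    stories = release_trains[train_id]["stories"]
    return sorted({story_to_artifact[s] for s in stories})

# A story slips from the June train to the July train; both payloads are rebuilt.
release_trains["2024.07"]["stories"].append(release_trains["2024.06"]["stories"].pop())
print(build_payload("2024.06", story_to_artifact))   # ['payments-svc:2.4']
print(build_payload("2024.07", story_to_artifact))   # ['checkout-ui:5.1', 'payments-svc:2.4']
```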
Now that we have considered how different CD processes can take advantage of model-based approaches, let's consolidate them into an overall approach to model-based CD. Essentially, we propose a versioned model repository (analogous to a code repository) for the various processes and artifacts, which are inter-related and together drive the end-to-end CD pipeline (see the Figure below).
The model repository serves as the source of truth for all artifacts, and nothing can be changed without the check-out/check-in/verification process that is a core software engineering discipline. All changes (for example, a requirement change) originate in the models, which triggers change-based impact analysis across multiple models, generation of artifacts (for example, tests are the "code" generated from the model), and execution of pipeline processes (for example, a Continuous Integration tool like Jenkins is triggered by a committed change to a code model). A minimal sketch of such a repository follows.
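This is a minimal sketch of a versioned model repository in which every check-in records a new version and notifies downstream handlers. The handlers here are placeholder lambdas standing in for impact analysis, artifact generation, and CI triggering (for example, kicking off a Jenkins job); none of this is the API of any particular tool.

```python
# Illustrative sketch: a versioned model repository whose check-ins drive the pipeline.
class ModelRepository:
    def __init__(self):
        self.versions = {}        # model name -> list of committed versions
        self.handlers = []        # callables run on every check-in

    def subscribe(self, handler):
        self.handlers.append(handler)

    def check_in(self, name, model):
        """Record a new model version and notify all downstream handlers."""
        self.versions.setdefault(name, []).append(model)
        for handler in self.handlers:
            handler(name, model, version=len(self.versions[name]))

repo = ModelRepository()
repo.subscribe(lambda name, model, version:
               print(f"impact analysis for {name} v{version}"))
repo.subscribe(lambda name, model, version:
               print(f"regenerate tests/code from {name} v{version}"))
repo.subscribe(lambda name, model, version:
               print(f"trigger CI pipeline for {name} v{version}"))

repo.check_in("checkout-requirements", {"feature": "Apply discount code"})
```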
A typical model-based CD process could be as follows:
The requirements model is checked out from the repository and updated to reflect changes in requirements (e.g. a new feature or an enhancement to an existing feature). The appropriate release model is updated to reflect the new payload.
Updates to the model trigger updates to source code, test cases, test data, and virtual services, which are also updated in the repository.
