Model-Driven Software Development: As Relevant As Ever

Summary:

Decades before web apps or smartphones existed, the concept of the separation of logic, presentation, and data layers in software made a lot of sense. That vision evolved into what we now call model-driven development, where rules, workflows, and dependencies are built once, as models in a centralized repository. It's the same basic idea, and it's just as useful, if not more so.

There was a time when I could quote embarrassingly long passages from C.J. Date’s An Introduction to Database Systems by heart.

Now, it’s not that Date’s writing is so engaging that you can’t put it down. The truth is, Introduction is a bit of a slog. It’s long (more than eight hundred pages, in my sixth-edition copy), it’s dense, and it’s full of tables, charts, and code listings. It’s not vacation reading. But it’s important and incredibly informative.

One of the core concepts Date described was clear separation between the data layer, the business logic layer, and the presentation layer of a piece of software (though he used different terminology in his book). Not coincidentally, that separation was one of the core concepts in force at the place I worked. It was a very different way of approaching development and deployment, and it made a ton of sense to mid-20s me.

Why am I telling you about ancient history? I happened to open an eighth-edition copy of Introduction the other day, and I found its contents as relevant today, if not more so, as when I first picked the thing up a couple of decades ago.

Before the table of contents, Date includes a few short quotes, including one from Maurice V. Wilkes, a computing pioneer, a knight, a Cambridge academic, and an all-around smart guy:

I would like to see computer science teaching set deliberately in a historical framework … Students need to understand how the present situation has come about, what was tried, what worked and what did not, and how improvements in hardware made progress possible. The absence of this element in their training causes people to approach every problem from first principles. They are apt to propose solutions that have been found wanting in the past. Instead of standing on the shoulders of their precursors, they try to go it alone.

It’s a fair point, particularly in a field that changes so fast. The past has relevance today, both in theoretical and applied senses. Date’s book was first published more than forty years ago, and the ideas in it aren’t just still sound; they’re actually more important than ever.

Think about the applications businesses use these days.

It’s increasingly rare for those applications to be deployed on just one platform. For the office itself, traditional client/server architecture can still make a lot of sense, from both economic and performance standpoints. Remote workers or those using different operating systems (the graphic artists using Macs in a predominantly PC shop, for example) probably need to get to the application by hitting a URL in a browser. And a streamlined version of the app for smartphones and tablets can increase accessibility by leaps and bounds, driving adoption and ROI.

The application exists for the same reason, regardless of the device. Maybe it’s a leave system for a payroll company. Employees use it to request time off, check their leave balances, and so on; managers use it to track employee requests and make sure critical job functions are covered; and payroll accesses the data for reporting and verification. Use cases vary slightly depending on the device (managers likely won’t need to run large reports from their phones, for example), but the underlying logic and purpose are the same.
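
To make that concrete, here is a minimal sketch of what the shared piece might look like. It’s written in TypeScript, and all the names (`LeaveRequest`, `validateLeaveRequest`, and so on) are invented for illustration; nothing here comes from a particular product. The point is simply that every front end, whether fat client, browser, or phone, calls the same function instead of carrying its own copy of the rules:

```typescript
// Shared business-logic module: every front end (desktop, web, mobile)
// imports this instead of re-implementing the rules locally.
// All names here are hypothetical.

interface LeaveRequest {
  employeeId: string;
  startDate: Date;
  endDate: Date;
  balanceDays: number; // days of leave the employee has accrued
}

interface ValidationResult {
  ok: boolean;
  errors: string[];
}

// Inclusive day count between two dates.
function daysBetween(start: Date, end: Date): number {
  const msPerDay = 24 * 60 * 60 * 1000;
  return Math.round((end.getTime() - start.getTime()) / msPerDay) + 1;
}

// One rule set, written once. The desktop client, the browser app,
// and the phone app all call this same function.
export function validateLeaveRequest(req: LeaveRequest): ValidationResult {
  const errors: string[] = [];

  if (req.endDate.getTime() < req.startDate.getTime()) {
    errors.push("End date must not precede start date.");
  }
  const requested = daysBetween(req.startDate, req.endDate);
  if (requested > req.balanceDays) {
    errors.push(`Requested ${requested} day(s) but only ${req.balanceDays} accrued.`);
  }
  return { ok: errors.length === 0, errors };
}
```

The phone app might never surface the large reports, but when it validates a request, it runs exactly the same rule the desktop client does.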

Date actually addressed this scenario decades before web apps or smartphones even existed. That separation of logic, presentation, and data layers he described in Introduction is the same vision that has evolved into what we now call model-driven development, or MDD. Terminology has changed—we rarely talk about “conceptual schema” anymore; we talk about the application model, and where I work, it holds (a lot) more than the representation of the database definitions—but the concept is basically the same as it was. And it still has massive ramifications for the world we live in.

There’s a lot to like about MDD. (Note: Yes, I’m biased, but no, I’m not lying.) One of the most important characteristics is how it facilitates technology independence by segregating business logic. Rules, workflows, and dependencies are conceptualized and built once, as models in a centralized repository. The versions of an application deployed for client/server, web, and mobile architectures don’t have different sets of logic baked in; they simply call back to the single source of truth. App behavior is consistent across devices, even if the entire business logic set isn’t used. (And even if the device is proprietary.)
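
By way of illustration only (any real MDD tool has its own model format, and these names are invented), the core idea is that a rule lives as a declarative model in the shared repository, and each deployment ships a small generic interpreter rather than its own hard-coded copy of the logic:

```typescript
// A rule expressed as data in a central model repository,
// not as code compiled into each deployment. Hypothetical sketch.
interface ThresholdRule {
  id: string;
  field: string;            // which attribute of the record the rule inspects
  operator: "<=" | ">=";
  limit: number;
  message: string;
}

// In a real system this would be fetched from the shared repository;
// it is inlined here to keep the sketch self-contained.
const leaveModel: ThresholdRule[] = [
  {
    id: "max-consecutive",
    field: "requestedDays",
    operator: "<=",
    limit: 20,
    message: "No more than 20 consecutive days of leave.",
  },
];

// A generic interpreter: every platform ships this small engine and
// pulls the models, so behavior stays consistent everywhere.
function evaluate(record: Record<string, number>, rules: ThresholdRule[]): string[] {
  return rules
    .filter(r =>
      r.operator === "<=" ? record[r.field] > r.limit : record[r.field] < r.limit)
    .map(r => r.message);
}

console.log(evaluate({ requestedDays: 25 }, leaveModel));
// -> ["No more than 20 consecutive days of leave."]
```

Because every platform evaluates the same model, there is no drift between the desktop, web, and mobile versions of a rule.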

What’s particularly nice is that this “code once, deploy anywhere” methodology isn’t just for current versions of applications. The same logic that applies to deployments across multiple platforms applies to updates and new operating systems, too. If business logic evolves, the structure can simply be tweaked within the requisite models, not torn down entirely.
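
Continuing the hypothetical sketch above: when the leave policy changes, the change is made once, in the model data, and the interpreters already deployed on every platform simply pick up the revised rule the next time they read the repository. No application code is rebuilt or redeployed:

```typescript
// A policy change touches only the model, never the deployed interpreters.
// Each platform picks up the revised rule on its next fetch of the repository.
const revisedModel: ThresholdRule[] = [
  {
    id: "max-consecutive",
    field: "requestedDays",
    operator: "<=",
    limit: 15, // tightened from 20 in the earlier sketch
    message: "No more than 15 consecutive days of leave.",
  },
];

console.log(evaluate({ requestedDays: 18 }, revisedModel));
// -> ["No more than 15 consecutive days of leave."]
```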

Should every application developer in the world turn to MDD methodologies tomorrow? Absolutely not. There’s always going to be a case for object-oriented design. But when device agnosticism and long-term stability are key pieces of a project puzzle, it’s worth taking the advice Date left us from Wilkes and standing on the shoulders of a decades-old methodology that is becoming more relevant than ever.

User Comments

2 comments
Martin Ivison

Having taught and practiced model-driven testing for a number of years, I found that the real differentiator of the method is the rigor of analysis required. To build a complete and correct model, you cannot allow ambiguity in the rules and behaviors. Since this rigor comes at a cost, I found the technique most useful where we are working in a high-risk environment (think life and limb, or finance). I experienced quite a significant uptick in test coverage and test correctness because of it, and a resulting lowering of defect load and scrap and rework. Over the years, however, I also learned not to use it when we can eschew rigor to gain time-to-market or save cost. Because whatever the advantages of model-based testing are, particularly fast or cheap it ain't ;)

PS: I love your comments about the historical approach to teaching. You're right, this perspective helps you understand what you do and why you do it.

May 13, 2016 - 2:20pm
Adrian Gosbell

Thanks for some interesting feedback. I'm actually planning on writing a blog on the whole quality side of things, and your observations are in line with what I've experienced with regard to quality and where things can potentially go wrong.

May 18, 2016 - 5:10am
