Are you doing MDA (Model Driven Architecture) right now? If so, what tools do you use, and how is it working out?

Model Driven Architecture is the idea that you create models which express the problem you need to solve in a way that is free of any (or at least most) implementation technologies, and then you generate an implementation for one or more specific platforms. The claim is that working at a higher level of abstraction is far more powerful and productive. In addition, your models outlive technologies, so when your first language or platform becomes obsolete you still have something to carry into your next-generation solution. Another key claimed benefit is that much of the boilerplate and "grunt work" can be generated: once the computer understands the semantics of your situation, it can help you more.

Some claim this approach is 10 times more productive, and that it is the way we will all be building software in 10 years.

However, this is all just theory. I am wondering what the outcomes are when the rubber meets the road. Also, the "official" version of MDA is from the OMG, and seems very heavy. It is heavily based on UML, which might be considered good or bad depending on who you ask (I'm leaning towards "bad").

But, in spite of those concerns, it is hard to argue with the idea of working at a higher level of abstraction and "teaching" the computer to understand the semantics of your problem and solution. Imagine a series of ER models which simply express truth, and then imagine using those to generate a significant portion of your solution, first in one set of technologies and then again in another set of technologies.
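To make that concrete, here is a minimal, purely illustrative sketch (plain Java, no MDA tooling; the Entity/Field model and the two targets are invented for this question): one technology-free description of the data, and two tiny generators emitting it first as SQL DDL and then as a Java class.

    import java.util.List;

    // A tiny, technology-free "model": just entities and abstractly typed fields.
    record Field(String name, String type) {}          // type is abstract: "string", "int", ...
    record Entity(String name, List<Field> fields) {}

    public class TwoTargetGenerator {

        // Target 1: relational DDL.
        static String toSql(Entity e) {
            StringBuilder sb = new StringBuilder("CREATE TABLE " + e.name() + " (\n");
            for (Field f : e.fields()) {
                String sqlType = f.type().equals("int") ? "INTEGER" : "VARCHAR(255)";
                sb.append("  ").append(f.name()).append(" ").append(sqlType).append(",\n");
            }
            sb.setLength(sb.length() - 2);              // drop the trailing ",\n"
            return sb.append("\n);").toString();
        }

        // Target 2: a plain Java class.
        static String toJava(Entity e) {
            StringBuilder sb = new StringBuilder("public class " + e.name() + " {\n");
            for (Field f : e.fields()) {
                String javaType = f.type().equals("int") ? "int" : "String";
                sb.append("  private ").append(javaType).append(" ").append(f.name()).append(";\n");
            }
            return sb.append("}\n").toString();
        }

        public static void main(String[] args) {
            Entity customer = new Entity("Customer",
                    List.of(new Field("id", "int"), new Field("name", "string")));
            System.out.println(toSql(customer));        // same model, platform #1
            System.out.println(toJava(customer));       // same model, platform #2
        }
    }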

So, I'd love to hear from people who really are doing MDA right now ("official" or not). What tools are you using? How is it working out? How much of the theoretical promise have you been able to capture? Do you see a true 10X effectiveness increase?

Falter answered 30/3, 2009 at 4:46 Comment(0)

I tried it once. Roughly halfway through the project I realized that my models were hopelessly out of date with my code and were so complex that keeping them up to date was prohibitive and was slowing me down.

The problem is that software is full of edge cases. Models are great at capturing the larger picture, but once you start actually coding the implementation you keep finding all those edge cases, and before too long the model becomes far too granular and you have to choose between maintaining the model and getting some code written. Maybe the boilerplate generation is a benefit when starting up, but after that the benefits quickly vanish, and I saw a drastic drop in productivity. The models eventually disappeared from that project.

Corridor answered 1/4, 2009 at 2:31 Comment(4)
Thanks. Interesting that the devil is in the details. Models are by definition over-simplifications, and that's what caused you the most pain. +1Falter
Model driven software development is about generating the code from the model. You modify the meta-model, the model, and the generators to modify or add behavior. It is not about creating and maintaining an independent model that is manually updated when you update the code.Calista
That's exactly my point. At some point the generated code is no longer useful. As soon as you have to start modifying the code by hand the process breaks down.Corridor
How could that ever be the "right" answer to this (actually quite interesting) question? And btw: if you need to add "manual" code to generated classes there are several strategies available. One of them is to define protected regions in your code that are recognized by the generator, so that manual code insertions are protected from being deleted at generation time. Most major frameworks (oAW, Acceleo) support this.Tetrastichous
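As an illustration of the protected-regions strategy mentioned in the last comment (a sketch only; the marker text is invented and is not the actual oAW/Acceleo syntax), the regeneration step can copy whatever the developer wrote between the markers from the previous version of the file:

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class ProtectedRegions {

        // Matches everything between the BEGIN/END markers, markers included.
        private static final Pattern REGION = Pattern.compile(
                "// PROTECTED BEGIN\\R(.*?)// PROTECTED END", Pattern.DOTALL);

        /** Re-generate a file, keeping whatever the developer wrote inside the protected region. */
        static String regenerate(String freshlyGenerated, String previousVersion) {
            Matcher old = REGION.matcher(previousVersion);
            if (!old.find()) return freshlyGenerated;             // nothing to preserve
            String preserved = old.group(1);
            return REGION.matcher(freshlyGenerated)
                         .replaceFirst(Matcher.quoteReplacement(
                                 "// PROTECTED BEGIN\n" + preserved + "// PROTECTED END"));
        }

        public static void main(String[] args) {
            String previous = """
                    class Invoice {
                        // PROTECTED BEGIN
                        int handWrittenHelper() { return 42; }
                        // PROTECTED END
                    }
                    """;
            String generated = """
                    class Invoice {
                        String newGeneratedField;
                        // PROTECTED BEGIN
                        // PROTECTED END
                    }
                    """;
            System.out.println(regenerate(generated, previous));  // keeps handWrittenHelper()
        }
    }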

The lack of response to this question is somewhat ominous... maybe I'll let Dijkstra field it.

... Because computers appeared in a decade when faith in the progress and wholesomeness of science and technology was virtually unlimited, it might be wise to recall that, in view of its original objectives, mankind's scientific endeavours over, say, the last five centuries have been a spectacular failure.

As you all remember, the first and foremost objective was the development of the Elixir that would give the one that drank it Eternal Youth. But since there is not much point in eternal poverty, the world of science quickly embarked on its second project, viz. the Philosopher's Stone that would enable you to make as much Gold as you needed.

...

The quest for the ideal programming language and the ideal man-machine interface that would make the software crisis melt like snow in the sun had —and still has!— all the characteristics of the search for the Elixir and the Stone. This search receives strong support from two sides, firstly from the fact that the working of miracles is the very least that you can expect from computers, and secondly from the financial and political backing from a society that had always asked for the Elixir and the Stone in the first place.

Two major streams can be distinguished, the quest for the Stone and the quest for the Elixir.

The quest for the Stone is based on the assumption that our "programming tools" are too weak. One example is the belief that current programming languages lack the "features" we need. PL/I was one of the more spectacular would-be stones produced. I still remember the advertisement in Datamation, 1968, in which a smiling Susie Mayer announces in full colour that she has solved all her programming problems by switching to PL/I. It was only too foreseeable that, a few years later, poor Susie Mayer would smile no longer. Needless to say, the quest went on and in due time a next would-be stone was produced in the form of Ada (behind the Iron Curtain perceptively referred to as PL/II). Even the most elementary astrology for beginners suffices to predict that Ada will not be the last stone of this type.

...

Another series of stones in the form of "programming tools" is produced under the banner of "software engineering", which, as time went by, has sought to replace intellectual discipline by management discipline to the extent that it has now accepted as its charter "How to program if you cannot."

Pals answered 1/4, 2009 at 1:11 Comment(4)
Yes, definitely relevant. I doubt any developer really believes that some approach will make developers obsolete. But here's what I could believe in: a whole ecosystem of tools that takes a top-notch developer and amplifies his/her effectiveness significantly. Maybe OMG MDA is not that though.Falter
"a whole ecosystem of tools that takes a top-notch developer and amplifies his/her effectiveness significantly" I think that's called Emacs. :-DPals
Really? Maybe I should give it a second (no, make that third) chance then :)Falter
I'm only half-joking, too. If you take it as given that your tool vendor cannot anticipate your every need, you must necessarily select the most flexible tools in order to remain effective. Even more modern "tool platforms" like Eclipse have a much higher barrier to entry for customization.Pals

I have been doing my own independent research in the Model-Driven Software Development area since 1999. In 2006 I finally developed a generic modeling methodology that I labeled ABSE (Atom-Based Software Engineering).

So, ABSE builds on two fundamental ideas:

  • Programming is about problem decomposition
  • Everything can be represented as a tree

Some ABSE features:

  • It can support all other forms of software engineering, from the traditional file-oriented methods up to Component-Based Development, Aspect-Oriented Programming, Domain-Specific Modeling, Software Product Lines and Software Factories.

  • It is generic enough to be applied to enterprise software, embedded, games, avionics, internet, any domain in fact.

  • You don't need to be a rocket scientist to use it effectively. ABSE is accessible to the "mere developer mortal". There's none of the complexity found in oAW/MDA/XMI/GMF/etc. tool chains.

  • Its meta-metamodel is designed to support 100% code generation from the model. No round-trip necessary. The custom/generated code mix is directly supported by the metamodel.

  • The model can be concurrently manipulated. Workflows and version control can be applied (tool support needed).

It may sound utopian, but I have actually left the research phase and am now in the implementation phase of an IDE that puts all of the above into practice. I think I'll have a basic prototype ready in a few weeks (around the end of April). The IDE (named AtomWeaver) is being built through ABSE, so AtomWeaver will be the first proof-of-concept of the ABSE methodology.

So, this is not MDA (thankfully!), but at least it is a very manageable approach. As the inventor of ABSE, I am understandably excited about it, but I am sure Model-Driven Software Development will get a boost in 2009!

Stay tuned...

Koheleth answered 2/4, 2009 at 9:13 Comment(3)
That sounds very interesting. If you have some blog posts about it, or if you want others to look at it and give feedback, let me know.Falter
I know this is a very old answer, but if possible, I'm interested in knowing why you say it is 'thankfully!' not MDA?Scutum
@Yohsoog I was talking about the well-known complexity of MDA and its subsystems. Comparatively, ABSE might not be as "powerful", but the same results can be achieved with simpler (albeit longer) solutions.Koheleth

Model-Driven Software Development is still a niche area but there are published case studies and a growing body of other literature showing success over hand-coded methods.

The OMG's MDA is just one approach, other people are showing success using Domain-Specific Languages (that don't use UML for modelling).

The key is to generate code from the models and to update your generator if it's not generating what you want - not to modify your code. Specialist tooling to help you do this has been around for years now but interest in this approach has grown over the last five years or so due to Microsoft's move into this area and through open-source projects like openArchitectureWare in the Eclipse world.
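As a toy illustration of that workflow (not taken from any of the tools mentioned): the model stays the same; when the generated output is missing something, you change the generator and regenerate everything, instead of editing the generated files by hand.

    import java.util.Map;

    public class FixTheGenerator {

        // Version 1 of the generator: emits only a getter.
        // (Kept here only to show the before/after of the generator itself.)
        static String generateV1(String className, String field) {
            return "class " + className + " {\n"
                 + "  private String " + field + ";\n"
                 + "  public String get" + capitalize(field) + "() { return " + field + "; }\n"
                 + "}\n";
        }

        // The wrong move: hand-edit the generated class to add a setter (lost on the next run).
        // The right move: add the setter to the generator, then regenerate.
        static String generateV2(String className, String field) {
            return "class " + className + " {\n"
                 + "  private String " + field + ";\n"
                 + "  public String get" + capitalize(field) + "() { return " + field + "; }\n"
                 + "  public void set" + capitalize(field) + "(String v) { this." + field + " = v; }\n"
                 + "}\n";
        }

        static String capitalize(String s) {
            return Character.toUpperCase(s.charAt(0)) + s.substring(1);
        }

        public static void main(String[] args) {
            Map<String, String> model = Map.of("Customer", "name", "Order", "status");
            // Once the generator is fixed, regenerating the whole code base from the same model is cheap.
            model.forEach((cls, field) -> System.out.println(generateV2(cls, field)));
        }
    }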

I run a couple of sites: www.modeldrivensoftware.net and www.codegeneration.net where you can get more discussion, interviews, articles and tooling options on these topics.

Eberhart answered 4/5, 2009 at 12:48 Comment(1)
Cool, I'll check them out. I've seen codegeneration.net, but not modeldrivensoftware.net yet.Falter

I started working with model-driven technologies and DSLs in 1997, and I am more and more enthusiastic about MDE.

I can testify that a 10-times productivity increase (and perhaps even more ;-) is possible under certain circumstances. I have implemented many model-driven software factories that were able to generate executable software from very simple models, from the persistence layer to the UI layer, together with the generated technical documentation.

But I don't follow the MDA standard, for several reasons. The MDA promise is to express your software as a PIM (platform-independent model) and to be able to transform it automatically into one or several technical stacks (PSMs, platform-specific models).

But:

  • who needs to target several technical stacks in real life, rather than one single, well-defined architecture?
  • the magic of MDA lies in the PIM->PSM transformation, but doing model-to-model transformation in an iterative and incremental way is tough:
    • model-to-model transformations are much more complicated to implement, debug, and maintain than model-to-text.
    • as it is rarely possible to generate 100% of a piece of software, details have to be added to the resulting PSM model and preserved from one transformation to the next. That means a merge operation (a three-way merge, to remember the added details). And when dealing with models, merging graphs of objects is far more complicated than textual merging (which works pretty well).
    • you have to deal with a PSM model, that is to say a model that looks very close to your final generated source code. This is interesting mostly for the tool vendor, since ready-to-use PSM profiles and associated code generators can be sold and shipped with the MDA tool.

I advocate MDE strategies where the PIM is a DSL that describes your logical architecture (independently of any technical stack), and where the code is generated from this PIM with a custom, specific code generator.

Pros:

  • you don't have to deal with a complex and technical PSM model; you have your code instead.
  • using DSL techniques, the PIM is more efficient, sustainable, expressive, and easy for code and document generators to interpret. Models stay simple and precise.
  • it forces you to define your architectural requirements and concepts very early (since they form your PIM metamodel), independently of any technical stack. Usually this means identifying the various types of data, services, and UI components, with their definitions, capabilities, and features (attributes, links to other concepts, ...).
  • the generated code fits your needs, since it is custom. And you can keep it even simpler if the generated code extends some manually maintained framework classes (see the sketch after this list).
  • you capture knowledge in several orthogonal ways:
    • models capture the functionality / the business
    • code generators capture the technical mapping decisions from your logical architecture components to a particular technical stack
    • the PIM DSL captures the definition of your logical architecture
  • with the logical-architecture-oriented PIM, it is possible to generate all the technical code and other non-code files (configs, properties, ...). Developers can focus on implementing the business functionality that could not be fully expressed in the model, and usually don't have to deal with the technical stack anymore.
  • merge operations are all about flat source code files, and this works pretty well.
  • you can still define several code generators if you target several technical stacks.
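A minimal sketch of the point above about extending hand-maintained framework classes (all names are invented): the generator emits only the thin, model-specific part, while the stable plumbing lives in a hand-written base class that regeneration never touches.

    // Hand-written and maintained by developers; never touched by the generator.
    abstract class AbstractRepository<T> {
        public void save(T entity) {
            // shared persistence plumbing (connections, transactions, logging, ...)
            System.out.println("saving " + entity);
        }
        protected abstract String tableName();
    }

    // Generated from the model; trivially re-generated whenever the model changes.
    class CustomerRepository extends AbstractRepository<String> {
        @Override
        protected String tableName() {
            return "CUSTOMER";   // value comes straight from the model
        }
    }

    class GenerationGapDemo {
        public static void main(String[] args) {
            new CustomerRepository().save("customer #1");
        }
    }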

Cons:

  • you have to implement and maintain your own specific code and document generators
  • generally speaking, to get the best out of the DSL approach, you have to invest in specific tooling (model validation, specific wizards, dialogs, menus, import/export, ...).
  • when updating/improving your DSL, you sometimes need to migrate your models. Usually this can be done with some disposable migration code, or manually (depending on the impact).
  • all of these require a dedicated developer team with model-driven skills

This particular approach can be implemented on top of an extensible UML modeler with UML profiles, or with specific model editors (textual or graphical).

The big difference between MDA and MDE could be summarized as:

  • MDA is a set of general-purpose tooling and languages, providing off-the-shelf model-driven profiles and tooling for everyone's needs. This is perfect for tool vendors, but I suspect that everyone's needs and contexts are different.
  • With MDE plus a specific DSL and tooling, you need some additional skilled model-driven developers to maintain your custom software factory (modeler, modeler extensions, generators, ...), but you build up reusable assets everywhere and manage very simple, precise, sustainable models.

There is a kind of conflict of interest between the two approaches. One advocates reusing off-the-shelf, prebuilt model-driven components; in the other, you build up your own assets by defining DSLs and the associated tooling.

Crowell answered 24/4, 2018 at 11:38 Comment(0)

We do use MDA, with EMF as the tooling. It saves us a lot of man-hours through code generation instead of manual coding. It does require highly qualified analysts, but that is what IT is about. So we mainly concentrate on the problems themselves, and on the tools/frameworks that do the code generation and provide run-time support for the generated code. Finally, I can confirm that we do see a 10x productivity increase with MDA.

Broadfaced answered 19/6, 2019 at 5:49 Comment(2)
Can you provide an example of its usage in your company? How you use it? What problems does it solve? What skills have you seen in the people working with it?Eliason
We are using MDA to develop an ERP platform with a very rich set of models. For object model (PIM) development we use business analysts. For the implementation we use the Eclipse Modeling Framework (EMF). BPMN2 is used for business processes, with Activiti as the runtime engine. For the UI model we developed our own notation, with Vaadin as the implementation base. All parts are based on open source and Java.Broadfaced
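For readers who haven't used EMF, here is a minimal sketch of its dynamic Ecore API (generic EMF usage assuming the org.eclipse.emf.ecore dependency; it is not code from the ERP platform described above): a tiny metamodel is defined programmatically and instantiated reflectively, with no generated code at all.

    import org.eclipse.emf.ecore.EAttribute;
    import org.eclipse.emf.ecore.EClass;
    import org.eclipse.emf.ecore.EObject;
    import org.eclipse.emf.ecore.EPackage;
    import org.eclipse.emf.ecore.EcoreFactory;
    import org.eclipse.emf.ecore.EcorePackage;

    public class EmfSketch {
        public static void main(String[] args) {
            EcoreFactory ecore = EcoreFactory.eINSTANCE;

            // Define a tiny metamodel: a package with one class "Order" and one attribute "number".
            EPackage pkg = ecore.createEPackage();
            pkg.setName("shop");
            pkg.setNsPrefix("shop");
            pkg.setNsURI("http://example.com/shop");    // URI is illustrative

            EClass order = ecore.createEClass();
            order.setName("Order");
            pkg.getEClassifiers().add(order);

            EAttribute number = ecore.createEAttribute();
            number.setName("number");
            number.setEType(EcorePackage.Literals.ESTRING);
            order.getEStructuralFeatures().add(number);

            // Create and populate an instance reflectively, without any generated code.
            EObject anOrder = pkg.getEFactoryInstance().create(order);
            anOrder.eSet(number, "ORD-0001");
            System.out.println(anOrder.eGet(number));   // prints ORD-0001
        }
    }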
