Reason and meaning of (top) models

Photo of a top model girl

After a semester teaching model-driven software engineering in my Advanced Software Engineering course at Politecnico, I feel the urge to say once again a few words about the rationale of models, inspired also by some content in our latest book on Model-Driven Software Engineering, or MDSE (see more here or on

By the way, don’t mind the picture of the girl for now; we will come back to her later. For the moment, you (and my students) can just consider her an attention-catching trick.

My main point is that you cannot avoid modeling.
The human mind inadvertently and continuously re-works reality by applying cognitive processes that alter the subjective perception of it. Humans generate a mental representation of reality that is able, at the same time, to:

  • generalize specific features of real objects (generalization);
  • classify objects into coherent clusters (classification);
  • aggregate objects into more complex ones (aggregation).

These are natural behaviors that the human mind is natively able to perform (babies start performing these processes when they are only a few months old) and that people carry out in their everyday life. This process is known as abstraction; it is also widely applied in science and technology, where it is often referred to as modeling.
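As a toy illustration (the classes and names below are mine, purely for exposition), the three mechanisms map naturally onto constructs that any object-oriented language offers:

```python
# Toy sketch of the three abstraction mechanisms; all names are
# illustrative, not taken from any modeling standard.

class Vehicle:                       # generalization: common features
    def __init__(self, wheels):      # shared by many concrete objects
        self.wheels = wheels

class Car(Vehicle):                  # classification: every Car instance
    def __init__(self):              # belongs to one coherent cluster
        super().__init__(wheels=4)

class Engine:
    pass

class CarWithEngine(Car):            # aggregation: a complex object
    def __init__(self):              # built out of simpler parts
        super().__init__()
        self.engine = Engine()

c = CarWithEngine()
print(isinstance(c, Vehicle), c.wheels)  # True 4
```

The point is not the code itself, but that the same three cognitive moves the text describes are so fundamental that programming languages bake them in as inheritance, instantiation, and composition.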

So, we can informally define a model as a simplified or partial representation of reality, defined in order to accomplish a task or to reach an agreement on a topic. Therefore, by definition, a model will never describe reality in its entirety.

And here we come back to the nice girl in the picture. She actually is a “model” (a top model, in fact). She is not reality. She is an idealized representation of reality, embodying beauty, grace, desire or whatever feature you want to name, which is instrumental to some purpose (in this case, showing and selling clothes). You will not see all the aspects of her life. You only see her as a partial abstraction of the concept of “desirable girl”.

If we want to go back to more “serious” usages, models have been and are of central importance in many scientific contexts. Just think about physics or chemistry: the billiard ball model of a gas or the Bohr model of the atom are probably unacceptable simplifications of reality from many points of view, but at the same time have been paramount for understanding the basics of the field; the uniform motion model in physics is something that will never be accomplished in the real world, but is extremely useful for teaching purposes and as a basis for subsequent, more complex theories. Mathematics and other formal descriptions have been extremely useful in all fields for modeling and building upon models.
Modeling has been proven very effective at description and powerful at prediction.

A huge branch of the philosophy of science itself is based on models. Thinking about models at the abstract and philosophical level raises questions in semantics (i.e., the representational function performed by models), ontology (i.e., the kind of things that models are), and epistemology (i.e., how to learn through or from models).

In many senses, also considering that observer and observation are recognized to alter reality itself, at a philosophical level one can agree that “everything is a model”, since nothing can be processed by the human mind without being “modeled”.

Therefore, it’s not surprising that models have become crucial also in technical fields such as mechanics, civil engineering, and ultimately in computer science and computer engineering.
Within production processes, modeling allows us to investigate, verify, document, and discuss properties of products before they are actually produced. In many cases, models are even used for directly automating the production of goods.

That’s why discussing whether modeling is good or bad is not really appropriate. As I said at the beginning, we all, and always, create a mental model of reality. This is even more evident when dealing with objects or systems that need to be developed: in this case, the developer must have in mind a model of the objective.

The model always exists; the only option designers have concerns its form: it may be mental (existing only in the designers’ heads) or explicit. In other words, the designer can decide whether to dedicate effort to producing an explicit representation of the model or to keep it within her/his own mind.

Model-driven Software Engineering in Practice (MDSE)

I find this discussion intriguing and profoundly motivating for neophytes. You can read more on this and on the purpose of modeling in the book Model-Driven Software Engineering in Practice, written by Jordi Cabot, Manuel Wimmer and myself (see more here and on

To keep updated on my activities you can subscribe to the RSS feed of my blog or follow my twitter account (@MarcoBrambi).

My new book on Model-Driven Software Engineering

Model-Driven Software Engineering in Practice. Book cover

I’m really proud to announce that a huge joint effort with Jordi Cabot and Manuel Wimmer has finally come to an end. Our new book on model-driven software engineering, on which we have been working for almost one year, is finally published!

First of all, I wish to extend my thanks to my coauthors for the wonderful teamwork and to Richard Soley, OMG Chairman, who agreed to write the foreword of the book. I actually find it one of the most inspiring outlooks on the field I’ve read.

The book discusses how model-based approaches can improve the daily practice of software professionals. Model-Driven Software Engineering (MDSE) or, simply, Model-Driven Engineering (MDE) practices have proved to increase efficiency and effectiveness in software development, as demonstrated by various quantitative and qualitative studies. MDSE adoption in the software industry is foreseen to grow exponentially in the near future, e.g., due to the convergence of software development and business analysis.

The aim of this book is to provide you with an agile and flexible tool to introduce you to the MDSE world.

This allows you to quickly understand its basic principles and techniques and to choose the right set of MDSE instruments for your needs so that you can start to benefit from MDSE right away.
As such, the book does not address only hard-core software modelers, but also BPM practitioners, enterprise system consultants and analysts, and so on. Indeed, the book targets a diverse set of readers:

  • professionals, CTOs, CIOs, and IT team managers who need a bird’s-eye view of the matter, so as to take the appropriate decisions when choosing the best development techniques for their company or team;
  • software and business analysts, developers, and designers who expect to use MDSE to improve everyday productivity, either by applying the basic modeling techniques and notations or by defining new domain-specific modeling languages and applying end-to-end MDSE practices in the software factory;
  • academic teachers and students, for undergraduate and postgraduate courses on MDSE.

The book is organized into two main parts.

The first part discusses the foundations of MDSE in terms of basic concepts (i.e., models and transformations), driving principles, application scenarios, and current standards, like the well-known MDA initiative proposed by the OMG (Object Management Group), as well as practices for integrating MDSE into existing development processes.

The second part deals with the technical aspects of MDSE, spanning from the basics on when and how to build a domain-specific modeling language, to the description of Model-to-Text and Model-to-Model transformations, and the tools that support the management of MDSE projects.

In addition to the contents of the book, more resources are provided on the book’s website, which we are currently setting up. There you will find the detailed TOC, the sources of the examples presented in the book, and the teaching materials we will build to support training activities based on the book.

If you want to buy the Model-Driven Software Engineering in Practice book, you can find it on Amazon or on the Morgan&Claypool website (printed and electronic versions available).

If you read and like the book, we will be glad if you post a review on Amazon!

To keep updated on my activities you can subscribe to the RSS feed of my blog or follow my twitter account (@MarcoBrambi).

Model-driven development on and for the Cloud: CloudMDE Workshop at ECMFA 2012

I’m glad to say that the CloudMDE workshop we organized at ECMFA sparked interesting presentations and discussions.

I’ve summarized the online comments in a storified list of tweets. CloudMDE aims to identify opportunities for using MDE to support the development of cloud-based applications (MDE for the cloud), as well as opportunities for using cloud infrastructure to enable MDE in new and novel ways (MDE in the cloud). The workshop was held on July 2, 2012; here is a digest of the Twitter discussion that took place during the day.

The keynote speech by Ali Babar:

  • Ali Babar set the context of global software engineering: testers and developers cost the same in Scandinavia!
  • He alarmed his audience about the lack of testing done by some of the small companies he has interviewed.
  • In Denmark, around 92% of (presumably ICT) companies have 10 people or fewer. How to provide affordable tools and infrastructure?
  • Discussion of how to support multi-tenancy in a “Tools-as-a-Service” cloud infrastructure.
  • A case study where a software metrics system was migrated to the cloud: it might be a scenario for MDE.
  • An opportunity for MDE in the cloud may be to make it easier to port applications across different cloud infrastructures.
  • Final remarks: cloud computing matters; tools-as-a-service has huge potential; MDE can help with migration or construction.

Technical sessions, morning:

  • Massimo Tisi presented a research roadmap for model transformation in the cloud, including the distribution of parts of the transformation computation to computational nodes; interesting discussion followed on strategies to distribute a transformation (no local loading, partitioning based on properties of model elements, etc.).
  • Alek Radjenovic presented a roadmap for using MDE to migrate applications to the cloud.
  • WebRatio was mentioned as one of the first commercial full-fledged cloud-enabled MDE tools.
  • Ekkart Kindler jumped right in with the first controversial question of the workshop, and interesting questions followed, such as Antonio Vallecillo’s “What would be lost in migrating [data] to the cloud?”

Technical session, afternoon:

  • Sebastien Mosser talked about CloudML and the migration of a COBOL application to IaaS, with a plug for the REMICS project.
  • Laszlo Deak presented work on NoSQL and performance analysis.

To keep updated on my activities you can subscribe to the RSS feed of my blog or follow my twitter account (@MarcoBrambi). And also remember that WebRatio itself is fully cloud-enabled by now!

Code Generation 2012 – from geeky (programming) interfaces to user interfaces

Code Generation 2012 logo

For the second year, I attended Code Generation, held in Cambridge, UK and organized by Software Acumen. And for the second year, the event was quite a good mix of practitioners, vendors and experienced evangelists.

During the event, Emanuele Molteni (product manager of WebRatio) and I gave a talk on User Interaction modeling problems, requirements and initiatives, with special attention to the IFML (Interaction Flow Modeling Language) standardization activity within the OMG. I think the problem is perceived as definitely relevant by the community, as demonstrated by the lively discussion that followed the presentation.

I would say that the main theme of this year’s edition was really User Interaction modeling. A lot of sessions and speeches (including ours) addressed this problem from different perspectives, spanning from multi-platform mobile apps, to interaction modeling, to interpreted approaches for enterprise UIs, and so on. Even during our presentation, attendees admitted that they have built tens of domain-specific languages (DSLs) for modeling some aspects of UIs. If you are interested in the topic, you can have a look at our presentation here:

WebRatio sponsored the event and, as a sponsor, also got a booth at the conference (see picture). We interacted with a lot of visitors and got enthusiastic feedback.

WebRatio booth at Code Generation 2012.

The event also featured several interesting sub-events, sessions and keynotes, notably:

.. and several other interesting talks. 
In summary, I was happy to attend and I look forward to the next edition.

To keep updated on my activities you can subscribe to the RSS feed of my blog or follow my twitter account (@MarcoBrambi).

RDF Data Management. Talk by Tamer Ozsu at Politecnico di Milano

Tamer Ozsu gave a one-hour seminar on RDF Data Management today at Politecnico di Milano.

Tamer Ozsu,
University of Waterloo

What I found intriguing is the database and modeling perspective he has on RDF.

We all know that RDF is a machine-readable format for describing resources on the web, based on triples. A triple is a simple construct composed of three terms: subject, predicate, object.

Every resource is identified by a URI. Terms can be either URIs or literals.
Terms can have types (rdf:type), which basically correspond to classes in the modeling sense.
Triples define a graph whose vertices are URIs and literals.

RDF can be queried through SPARQL.
In database terms, one could think of a naive solution: a global schema with a single table of three columns (Subject, Predicate, Object), upon which one could run SQL queries.
However, this solution is problematic because answering queries over it requires a huge number of self-joins.
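To see why the naive single-table design runs into trouble, here is a minimal sketch (the data and resource names are made up for illustration): even a two-hop path query already needs one self-join per extra triple pattern.

```python
import sqlite3

# Naive triple store: one global table with three columns.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE triples (subject TEXT, predicate TEXT, object TEXT)")
con.executemany("INSERT INTO triples VALUES (?, ?, ?)", [
    ("ex:alice",  "rdf:type",     "ex:Person"),
    ("ex:alice",  "ex:worksAt",   "ex:polimi"),
    ("ex:polimi", "ex:locatedIn", "ex:Milano"),
])

# "Who works at a place located in Milano?" — two triple patterns,
# hence one self-join; each additional pattern adds another join.
rows = con.execute("""
    SELECT t1.subject
    FROM triples t1
    JOIN triples t2 ON t1.object = t2.subject
    WHERE t1.predicate = 'ex:worksAt'
      AND t2.predicate = 'ex:locatedIn'
      AND t2.object    = 'ex:Milano'
""").fetchall()
print(rows)  # [('ex:alice',)]
```

A realistic SPARQL query can easily contain a dozen triple patterns, which on this layout becomes a dozen-way self-join over one enormous table.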

The real relational solutions that one can apply are:

  1. Property table: each class of objects goes to a different table (as in normalized relations).
  2. Vertically partitioned tables: build one two-column table per predicate, containing subject and object, ordered by subject.
  3. Exhaustive indexing: keep the single triple table and build all the possible indexes on it.
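A minimal sketch of layout 2, vertically partitioned tables, using plain Python structures (the predicate and resource names are invented for illustration): one (subject, object) table per predicate, kept sorted by subject so that merge-joins between predicate tables are cheap.

```python
# Sketch of the vertically partitioned layout: one two-column table
# per predicate, ordered by subject. Data is illustrative only.
triples = [
    ("ex:alice",  "ex:worksAt",   "ex:polimi"),
    ("ex:bob",    "ex:worksAt",   "ex:polimi"),
    ("ex:polimi", "ex:locatedIn", "ex:Milano"),
]

tables = {}
for s, p, o in triples:
    tables.setdefault(p, []).append((s, o))
for rows in tables.values():
    rows.sort()  # ordered by subject

print(tables["ex:worksAt"])
# [('ex:alice', 'ex:polimi'), ('ex:bob', 'ex:polimi')]
```

A query touching only one predicate now scans one small sorted table instead of the whole triple store.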

However, typical solutions do not go towards relational schemas. They work directly on the RDF graphs, applying subgraph matching via graph isomorphism. Unfortunately, off-the-shelf algorithms do not scale to the size of RDF graphs.

Typically, though, RDF graphs are not generic graphs. They often exhibit “star-shaped patterns”, where one core object is connected to several properties.
Ozsu proposes an encoding technique for optimizing querying.
The idea is to build an adjacency table: for every vertex, you take all its adjacent nodes and encode them with an n-gram technique (n=3), turning each into a binary signature and OR-ing the signatures together. If you encode the query in the same way, this lets you find a set of vertices that possibly match it. A two-step process is needed, though. The first step runs an inclusion query, which extracts all the vertices whose signature includes that of a given query node; to make it efficient, you build an S-tree over the signatures and run the inclusion query on it. You get a set of candidate vertices, and then you join them based on the predicate labels that connect the query nodes; to reduce the space of the join, one can apply pruning via VS-tree techniques.
This first step is complete but not sound, in the sense that you can still get a lot of false positives. In the second step, you verify that the extracted vertices actually match.
Aggregation obviously adds up complexity.
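A rough, hypothetical sketch of the signature-based filtering described above (the bit width, the hashing, and all names are my own simplifications for illustration, not Ozsu's actual implementation): adjacent labels are hashed as character trigrams into a fixed-size bit vector, and candidates are the vertices whose signature includes the query's signature.

```python
# Hypothetical sketch of trigram-based signature filtering for RDF
# subgraph matching. Complete (no false negatives) but not sound
# (bit collisions can produce false positives).
BITS = 64

def trigrams(label):
    # all 3-grams of a label, e.g. "worksAt" -> "wor", "ork", ...
    return [label[i:i + 3] for i in range(len(label) - 2)]

def signature(labels):
    sig = 0
    for lab in labels:
        for g in trigrams(lab):
            sig |= 1 << (hash(g) % BITS)  # OR all trigram bits together
    return sig

def includes(vertex_sig, query_sig):
    # inclusion query: every bit set in the query is set in the vertex
    return vertex_sig & query_sig == query_sig

v = signature(["worksAt", "locatedIn"])  # a data vertex's adjacent labels
q = signature(["worksAt"])               # the query vertex's labels
print(includes(v, q))  # the vertex is a candidate match
```

The candidate set returned by `includes` must then be verified by actual subgraph matching, which is exactly the second step described above.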

RDF graph or neuron connections?

To keep updated on my activities you can subscribe to the RSS feed of my blog or follow my twitter account (@MarcoBrambi).

Touch base on Model Driven Software Engineering: MDE, MDD, MDA and (other) stuff. What about a new book?

I think it’s time for Model-Driven Software Engineering practitioners (and lurkers) to touch base, see where we are and where we are heading, and finally spread the word.

That’s why I felt the need for a comprehensive and agile reference on the topic and, not having found one that fits the needs of both developers and designers, and of both enterprise and academia, I decided, together with Jordi Cabot and Manuel Wimmer, to start working on a new Model-Driven Software Engineering book, to be published next spring by Morgan&Claypool.

The choice of the publisher and the series is very much in line with our philosophy of providing an agile, easy-to-fetch reference book, at a reasonable price, available all over the world. Several students will actually be able to get it for free, if their institution is affiliated with the M&C subscription program.

Our book will approach the topic of MDE from a high-level perspective and will proceed by describing the various techniques, methods, languages and technologies in this field in a pragmatic and hands-on style. The book will target people with no previous knowledge of MDE, with the goal of giving them a clear idea of what MDE is, what it is good for, and how to apply it.
We believe the book will be interesting for professionals (software developers, project managers, CTOs,..), university students in academic courses, and consultants on MDE topics.
Some of the topics we will cover in the book are:

  • an introduction to MDE and the plethora of acronyms that surround it (MDD, MDA, MOF, GML, DSL);
  • an overview of the General Modeling Languages (GML) approach;
  • the Domain-Specific (Modeling) Languages (DSL/DSML) approach, with all its variants and application issues;
  • model transformation concepts and languages, including model-to-model and model-to-text transformations and code-generation techniques;
  • the basics of the supporting infrastructure for MDE.
All the topics will be illustrated through a running example in a very pragmatic way, so as to help readers easily grasp the complexity of MDE. We will describe the full MDE-based development process (from high-level models to the final running applications, covering also maintenance, reengineering, and so on). Readers will be able to easily test and execute the examples. The reference platform will be the Eclipse Modeling Framework (EMF), built on top of the well-known Eclipse open-source platform, but references to other tools and platforms will be provided too.
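To give a flavor of what a model-to-text transformation looks like, here is a deliberately tiny sketch in Python, not the EMF-based tooling used in the book; the miniature "metamodel" (class names mapped to attribute lists) and the generated Java-like text are entirely hypothetical.

```python
# Minimal model-to-text transformation sketch: a tiny "model" of
# classes with attributes is turned into source-code text.
# The metamodel and template are illustrative only.
model = {
    "Book":   ["title", "year"],
    "Author": ["name"],
}

def to_text(model):
    out = []
    for cls, attrs in model.items():
        fields = "\n".join(f"    private String {a};" for a in attrs)
        out.append(f"public class {cls} {{\n{fields}\n}}")
    return "\n\n".join(out)

print(to_text(model))
```

Real M2T languages (template-based engines in the EMF ecosystem, for instance) follow the same principle, separating a navigable model from templates that emit text.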

We would be glad to get your opinion on this initiative: tell us what you would like to see in the book, or any other comment that can be useful while we are writing it. The TOC has already been defined (based on the topics above), but we can think of extending/reducing some parts if we see you are interested in specific topics!
Feel free to comment and propose ideas here or on Jordi’s blog.

To keep updated on my activities you can subscribe to the RSS feed of my blog or follow my twitter account (@MarcoBrambi).

Models and reality: upon the verdicts on MDA/MDD

I recently happened to read this abstract of Friedrich Steimann’s verdict on MDA/MDD:

Models have their greatest value for people who cannot program. Unfortunately, this is not because models are more expressive than programs (in the sense that they are capable of expressing complex things in a simple manner), but because they oversimplify. To make a useful program (i.e., one that meets its users’ expectations) from a model, the model must be so complicated that people who cannot program cannot understand it, and people who can program would rather write a program than draw the corresponding model. Even though (over)simplification can be useful sometimes, it is certainly not a sufficient basis for creating a satisfactory end product.

While I agree on some statements, I strongly disagree on the conclusions.
It’s true that models are appealing because they are a simplified representation of reality. That is exactly their definition. Not a weakness, but a strength, I think. Just think of what has been possible in scientific fields such as physics and chemistry. All we studied in class are models, not the actual reality. One can then discuss how much these models simplify that reality. But the final outcome is that several real solutions have been devised based on studies of these simplified models.

Back to software engineering: models are crucial for describing highly complex systems, and I strongly believe they are a powerful tool for getting to the end product. The question, then, is how to reach that target.
One could argue that models can help in the initial phases, while manual implementation is needed at the end. However, I think that MDD is now comprehensive enough to provide a good set of tools for modeling applications all the way down to final product delivery. We should not forget that we can apply hierarchies of models, model transformations, and aggregations of complementary models to describe a system. In the end, simplification is just another word for abstraction.

Therefore, I think that simplification is actually sufficient for getting to the end products, provided that the MDD approach allows for models at multiple levels of detail, transformations among them, and appropriate inferences starting from the simplified views of reality. That is, one must assume that, starting from a simplification of reality (or from a combination thereof), there are logical deductions (more or less automatic) that fill the gap between the simplification and the final result.

I base my opinion on very concrete experience with the MDD approach in the Web field, in which the combination of appropriate models (BPM, ER, WebML, presentation layer [see also the RUX-tool]), proper transformations, and design tools (WebRatio) actually allows us to “magically” get to final product delivery with hundreds of customers.

Posts on BPM and UML interaction

Here are the responses I gave on Jordi Cabot’s blog on the issue of business process modeling and of the new CFP for a UML profile for BPMN.

Mixed feelings – but clear understanding Submitted on Thu, 09/16/2010.

I have mixed feelings about this issue: first, about the objective of the move; second, about the advantages it will bring; and third, about the relevance of the discussion.
1) Objective:
If the target is to flatten all modeling to just one notation and one role (the software engineer), then this is definitely the wrong direction.
BPMN and UML have two different focuses (business and software) and are used by different roles (analysts and software engineers). We should not forget this. Even more: BPMN itself is now perceived as not fully understandable by its target users.
2) Advantages:
Supposing we keep the two focuses in mind, the advantage of the proposal above could be to allow the orthogonal design of different aspects of the applications by different roles, while also granting an integrated design of the different orthogonal aspects in a unified design vision (the old story of separation of concerns).
3) Relevance:
Well, to be honest, I don’t see the discussion as that relevant. In the BPM field, notation issues are more and more seen as marginal aspects (someone is already wondering what the fate of BPMN 2.0 will be). I don’t see why we shouldn’t start doing the same in the software engineering community.
In terms of acceptance and utility of modeling notations, you can see some lessons learned from our on-the-field activities here:

Rules for BPM(N) modeling Submitted on Mon, 11/08/2010

For methodological guidelines, see also the online decalogue by Bruce Silver Associates:
And also this paper by Michael Zur Muehlen with an empirical analysis of the usage of BPM modeling languages (basically, more or less a Pareto rule for modeling-language concepts: less than 20% of the notation and patterns covers more than 80% of the cases, or even more):
This is in line with our experience too.

5th MDA and Agile Modeling Forum, Milano, Sept 30, 2010 – Morning Session

I attended the 5th MDA and Agile Modeling Forum in Milano on September 30th.

Here are a few take-away messages I got:

Richard Soley (Chairman and CEO of OMG): 

  1. If IT departments of large enterprises don’t change, they are doomed to end up cleaning the floor and changing light bulbs. The role of entire IT divisions needs to be reinvented. Basically, the CIO’s role should become automating the business processes throughout the company, and even better, optimizing the processes rather than just automating them (otherwise: risk of commoditization).
  2. Business Ecology Initiative: Green economy also means no redundant or inefficient processes. This and other communities of practice like: BPM/SOA, Green CIO, CyberSecurity, … are part of the current OMG strategy for sharing and exploiting experiences between companies.
  3. Models are going mainstream in the near future: “By 2013 graphical models in software will be used in more than 80% of new compositions” [Gartner]. And the purpose of standards is not to drive industry to a unique notation, but to make adaptation easier. 

Stephen White (the main editor of BPMN): 
BPMN is not able to bridge the gap between IT and business per se. However, it can be combined with other languages, such as SoaML, to solve the issue. By the way, this is in line with what we are doing now with WebRatio BPM: we integrate BPMN with WebML (and all its design dimensions) to address the gap and enable quick design and implementation.

Allen Brown (with The Open Group):  
TOGAF and MDA integration is crucial for making the former work and getting the latter actually implemented in the enterprise. I see TOGAF as a rather heavyweight methodology (à la ITIL).

Stephen Mellor (one of the fathers of MDA): 
You can be agile while developing with model-driven methods, despite the agile critiques of MDA, i.e., that models don’t run, can’t be tested, are used just for documentation, and require extra work and alignment. The criticism comes from a different understanding of the concept of model:

  • Models as sketches: you draw them and then throw them away.
  • Models as blueprints: they aim at directing and planning the implementation, under the assumption that construction is more costly than design.
  • Executable models: they are not just models; they are intended as part of the implementation and for verification. They are built under the assumption that construction is less costly than design.

Now with executable UML models (xtUML) you can describe your actions and perform them on the models.
Thus MDA and Agile can merge, thanks to model compilers and alternative techniques. Why does it happen now and not 20 years ago?

Claus Torp Jensen (with IBM):
The question is: what are we modeling? software systems or business solutions?
We need strategic synergies between business strategies, SOA, BPM, and EA. Each of them in isolation can produce only incremental results.
Architectural models and requirements must be contextual, collaborative, consumable (i.e., understandable) and connected in nature for being useful and integrated in the business strategy of the enterprise. How to get there?
The semantics of the EA plan is not the same as that in the BPM tool. You need to understand where your work is located: at the business level (BPM), at the information-system level (business-dependent IT), or at the technology level (business-independent IT). Each of them has its own “tribe” of people and will have its own tools and models.
But remember that copying is evil, even at the enterprise planning level. You shall not copy, but only define and preserve links between the levels and the models.

Morning panel
The biggest difficulty for companies adopting BPM+SOA+EA is:

  • dealing with people’s habits and resistance to change (Soley)
  • accepting standards (White)
  • whom to ask for guidance and training, and tool availability; communication among users, business lines, and other stakeholders (Brown)
  • definitely people (Mellor)
  • impatience of getting to the results (Jensen)

Software integrators will not disappear, but they will basically need to turn into BPM integrators.
UML 3.0 will be out in 1.5 years or so. Now the working group is building the first draft.
If you don’t have a success measure for your BPM/SOA project, you are at risk of failure. Some KPIs must be defined and obviously must benchmark the processes before and after the project. BPMM (Business Process Maturity Model) can be used for that too.
About SOA, people tend to focus on reuse as the main advantage. But this is not the only aspect.
About standards: as everybody knows, they are not complete enough to guarantee interoperability of models or diagrams. You may choose to buy all the best-of-breed tools and do the integration yourself, or you can buy an integrated tool suite and have the vendor integrate it in your business. Standards are only the common denominator of all the producers; they cannot cover all cases and scenarios.

Attendees to the event mainly included people from banking, software integration, (BPM and MDD) software producers, and consulting companies.