The ORM Foundation

Get the facts!

What you can do with ORM

Agree semantics before writing code
Generate a class model
Generate a fully normalized schema
Generate DDL
Analyse database semantics

Events & News 

25 August 2014: Stanford University - Free Logic Course. ORM is based on first-order logic, so it is good news that Professor John Etchemendy and his colleagues at Stanford University are offering a free course in first-order logic. The course is online and runs from 2 September to 19 December 2014. You can see the course summary and register for this important course by clicking here.
18 July 2014: ORM 2014 Workshop re-scheduled. The ORM 2014 workshop will now be held at the Free University of Bozen-Bolzano, Italy on September 22 & 23, 2014. The meeting will be followed by a meeting of the Fact Based Modeling working group, which will be chaired by Serge Valera of the European Space Agency.
31 Dec 2013: New Release of NORMA
A new release of NORMA is available for download from the Library. This version includes a Value Constraint feature.

ORM Books: click an image

The "How To" book for Microsoft's ORM tool.

The ORM book. 


Ken Evans

In the early 1990s, whilst looking for a data modeling tool, I heard about object-role modeling (ORM). Intrigued by the productivity claims, I wanted to know more. So I flew to Seattle to meet Terry Halpin and the team who were developing InfoModeler for Paul Allen. I was so impressed that I agreed to promote and distribute InfoModeler in Europe.

Some years later, after I had taught many business analysts, software developers and university professors how to design databases using InfoModeler and VisioModeler, Terry asked me to co-author a book about Microsoft's new ORM tool, "Visio for Enterprise Architects" (VEA). (It's the blue book in the sidebar.) The book was published in 2003 to coincide with Microsoft's product launch.

Terry, together with Robert Meersman, co-organised the first international ORM Workshop which took place in 1994 on Magnetic Island, Australia. Since 2005, the ORM Workshop has been run in conjunction with Robert Meersman's popular annual On The Move conference. At the 2005 Workshop, those present agreed by unanimous vote that an ORM Foundation was needed to promote ORM. This led me to set up this website which I have maintained ever since.


ORM 2014 Workshop: This year, the ORM 2014 Workshop will be held in the beautiful town of Bolzano, Italy instead of being part of the October OTM Conferences as originally intended.

The workshop will start on Monday September 22 and run for one or two days depending on the number of submissions. The workshop will be followed by a meeting of the Fact Based Modeling Working Group.

The venue is the KRDB Research Centre in the Faculty of Computer Science at the Free University of Bozen-Bolzano. 

If you are interested in presenting at this workshop, please e-mail an abstract of at most 150 words for your presentation to Enrico Franconi. Presenters will need to provide slides for their presentation at the time of the workshop, but no paper is required.

For information about location, travel and accommodation click here.

The rest of this page summarises ORM's context, how it relates to other modeling methods, how ORM is used, its history and tools, and what you can find on this website.

Ken Evans


The context of object-role modeling

All too often, software projects greatly exceed budget and timescale or are cancelled at great cost. People who study and write about failed IT projects such as journalist Tony Collins often cite "vague requirements" as a main cause of project failure.

So you might conclude that there would be fewer failed projects if each one started with a clear, agreed set of requirements defining "what" the project had to deliver before any investment was made in implementing the "how".

In 1975, Fred Brooks said "Because our ideas are faulty, we have bugs." (The Mythical Man Month). Twenty-five years later, after extensive research into the causes of IT project failures, Tony Collins suggested that every project should be split into two separate and independent contracts: the first contract should define requirements and the second should develop software and systems that conform to those requirements.

But projects still keep failing, which suggests that the "requirements lesson" learned in the 1950s has been forgotten, disparaged or discarded. So let's consider three possible problem sources: sponsoring managers, documentation and development methods.

Some sponsors just don't have time to get involved in the development process. Others, having agreed a project's concepts, benefits and budget, choose not to get involved and just want the developers to "get on with it". But whatever the style of management, busy sponsors would benefit from ensuring that their needs are understood by developers. So what's the most efficient way to do that?

Developers need a clear understanding of what the sponsors want. But even when requirements documents are prepared, they often contain dozens if not hundreds of pages of verbose and ambiguous prose.  Can this problem be solved?

Over the last 60 years, several methods have been used to bridge the knowledge gap between sponsors and developers.

In the 1950s, the waterfall method was used for the SAGE computer project. Waterfall was also used by NASA's Apollo project, which operated with IMS, IBM's hierarchical database management system. A hierarchical database is very fast when you find data by navigating the physical pointer structure specified by the database designer, but it is much harder to extract data that is spread across several branches of the hierarchy.

In 1969, Edgar Codd proposed the relational model as a solution to this problem. The first sentence of his paper says "Future users of large data banks must be protected from having to know how the data is organized in the machine".  His paper introduced the concept of normal form and the related process of normalization which is used to avoid logical inconsistencies, anomalies and data duplication.
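As a small illustration of the kind of anomaly normalization avoids, the sketch below (table and column names invented for the example; SQLite is used only because it ships with Python) stores a customer's city on every order row, updates one row but not the other, and ends up with contradictory data. The normalized version stores each fact once, so the contradiction cannot arise.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Denormalized: the customer's city is repeated on every order row,
# so changing a city means updating many rows (an update anomaly).
cur.execute("CREATE TABLE order_flat (order_id INTEGER, customer TEXT, city TEXT)")
cur.executemany("INSERT INTO order_flat VALUES (?, ?, ?)",
                [(1, "Fred", "Leeds"), (2, "Fred", "Leeds"), (3, "Mary", "York")])

# Update one row and miss the other: the data now contradicts itself.
cur.execute("UPDATE order_flat SET city = 'Hull' WHERE order_id = 1")
cities = {row[0] for row in
          cur.execute("SELECT city FROM order_flat WHERE customer = 'Fred'")}
# 'cities' now holds two different cities for the same customer.

# Normalized: the "customer lives in city" fact is stored exactly once.
cur.execute("CREATE TABLE customer (name TEXT PRIMARY KEY, city TEXT)")
cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, "
            "customer TEXT REFERENCES customer(name))")
cur.execute("INSERT INTO customer VALUES ('Fred', 'Leeds')")
cur.executemany("INSERT INTO orders VALUES (?, ?)", [(1, "Fred"), (2, "Fred")])

# One row, one fact: the update cannot leave the database inconsistent.
cur.execute("UPDATE customer SET city = 'Hull' WHERE name = 'Fred'")
```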

In 1976, Peter Chen proposed the Entity-Relationship (ER) model for database design. ER remains popular, but there are several different dialects. One problem with ER is that to get rid of anomalies and redundancies, you have to normalise your ER model by using complex, time-consuming and error-prone manual methods such as functional dependency analysis. So my "elephant in the room" question for ER modelers is: "If it is so difficult to get rid of the anomalies and redundancies in your ER model, why did you put them there in the first place?" I wonder if Edsger Dijkstra was thinking about ER when he said "All unmastered complexity is of our own making!"

In the 1990s, Grady Booch, Ivar Jacobson and James Rumbaugh developed UML, a language for modeling object-oriented systems. With UML, you define a data structure by using a class diagram. UML is complex and has been criticised as ambiguous and inconsistent, which makes it hard to design a class structure that is free of problems such as data duplication.

Why ORM? ORM gives you an efficient way to define requirements and avoids the ambiguity of natural language prose. An ORM tool can generate a good class model and provide automatic normalization.

How do you make an object-role model? An ORM analyst guides sponsors to express their ideas using simple sentences such as "The person called Fred was born on 15th July 1985." and "The person called Mary was born on 10th July 1990". In ORM we call such sentences "facts".  The analyst then looks for patterns in the facts. In this case the facts have the pattern "Person was born on BirthDate". We call such patterns "fact types".  

Sometimes restrictions must be placed on the facts. For example, in the year 2014 you can't have someone with a birth date in the year 2015 or later. So you use ORM to put a constraint on the allowable values of "BirthDate". For example: "The value of BirthDate cannot be greater than today's date."
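A value constraint like this maps naturally onto a runtime check. The sketch below (function name and dates invented for the example) expresses the BirthDate constraint as a simple validation, using Fred's birth date from the fact above:

```python
from datetime import date

def check_birth_date(birth_date: date, today: date) -> None:
    # The ORM value constraint: BirthDate cannot be greater than today's date.
    if birth_date > today:
        raise ValueError(f"BirthDate {birth_date} is after {today}")

# Fred's fact satisfies the constraint in 2014.
check_birth_date(date(1985, 7, 15), date(2014, 1, 1))

# A 2015 birth date violates the constraint in 2014.
try:
    check_birth_date(date(2015, 1, 1), date(2014, 1, 1))
except ValueError as err:
    rejected = str(err)
```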

The analyst creates the object-role model by adding each new fact type and constraint into the object-role model. The analyst uses an ORM tool to generate easy-to-read sentences that show the fact types and constraints that have been defined within the object-role model.

This makes it easy for sponsors to check that the model accurately represents their ideas. The sponsor just agrees or disagrees with the output generated by the ORM tool. The cycle of input and validation continues until the model is considered complete.

The analyst then uses the ORM tool to generate a data structure against which developers can write software. The data structure can be a class model or a relational database schema. When generating a relational schema, the ORM tool automatically produces a fully normalised schema, which avoids the considerable manual normalisation effort needed in the ER method. Thus, ORM helps to reduce development effort and the risk of making costly design errors.
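To make the generation step concrete, here is a hand-written sketch of the kind of DDL a tool might derive from the fact type "Person was born on BirthDate" plus the BirthDate constraint. This is not actual NORMA output; the table, column names, and the fixed cutoff date standing in for "today" are all illustrative, and SQLite is used only because it ships with Python.

```python
import sqlite3

# Illustrative DDL of the kind an ORM tool generates from a fact type
# plus a value constraint. A literal cutoff stands in for "today's date"
# to keep the sketch portable.
ddl = """
CREATE TABLE Person (
    personName TEXT PRIMARY KEY,
    birthDate  TEXT NOT NULL CHECK (birthDate <= '2014-12-31')
);
"""

con = sqlite3.connect(":memory:")
con.execute(ddl)
con.execute("INSERT INTO Person VALUES ('Fred', '1985-07-15')")  # satisfies the CHECK

# A birth date after the cutoff violates the CHECK constraint,
# so the database itself enforces the modeled rule.
try:
    con.execute("INSERT INTO Person VALUES ('Alice', '2999-01-01')")
    rejected = False
except sqlite3.IntegrityError:
    rejected = True
```

The point of the sketch is that the constraint agreed with the sponsor in plain language survives all the way into the schema, where the database engine enforces it automatically.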

What is the history of ORM? ORM evolved from European research into semantic modeling during the 1970s. There were many contributors and this paragraph mentions only a few. In 1973, the IBM Systems Journal published a paper by Michael Senko about "Data Structuring". In 1974, Jean-Raymond Abrial contributed an article about "Data Semantics", and in June 1975, Eckhard Falkenberg published his doctoral thesis. In 1976, Falkenberg used the term "object-role model" in one of his papers. Later, Sjir Nijssen introduced the "circle-box" notation together with an early version of the conceptual schema design procedure, and Robert Meersman added subtyping and a conceptual query language.

In 1989, Terry Halpin formalised ORM in his PhD thesis. In the same year, Terry and Sjir Nijssen co-authored the book "Conceptual Schema and Relational Database Design".

ORM Tools: Early ORM tools such as IAST and RIDL* (Control Data) were followed by InfoDesigner (ServerWare), InfoModeler (Asymetrix) and VisioModeler (Visio Corporation). In 2000, Microsoft bought the Visio Corporation and improved VisioModeler.  In 2003, Microsoft published its first ORM implementation as a component of Visual Studio for Enterprise Architects called "Microsoft Visio for Enterprise Architects" (VEA).  

Microsoft retained VEA in the high-end version of Visual Studio 2005 but then discontinued its ORM project. The book "Database Modeling with Microsoft Visio for Enterprise Architects" (see sidebar) contains a comprehensive guide to VEA.

Recently, Terry Halpin and Matt Curland have been developing an ORM tool called "Natural ORM Architect for Visual Studio" (NORMA). You can download the latest release of NORMA from the Library.  

How can I learn more? The definitive book on ORM is "Information Modeling and Relational Databases - Second Edition". You can order the book by clicking on the image in the sidebar.

The research page describes the scientific experiment that I used to support my MSc dissertation in 2008. I designed the experiment to test the hypothesis that "ORM based methods require at least 25% less effort than alternative methods such as UML and ER."  

The Forum contains over 3,000 posts of ORM related discussions. The Library contains ORM software, ORM tutorials and over one hundred ORM presentations of peer reviewed scientific papers given by ORM researchers at the annual ORM Workshops.

You can browse the Library and Forum and registered members can download documents and participate in the forum discussions. 


© 2008-2014 The ORM Foundation: A UK not-for-profit organisation. Terms of Service