
The ORM Foundation

Get the facts!

Verbalizations in Constraint Validation Report don't match ORM Verbalization Browser

Last post Fri, Mar 2 2018 3:17 by Matthew Curland. 10 replies.
  • Tue, Feb 27 2018 13:59

    • Steve Miller
    • Top 50 Contributor
      Male
    • Joined on Thu, Jan 1 2009
    • Portland, Oregon USA
    • Posts 18

    Verbalizations in Constraint Validation Report don't match ORM Verbalization Browser

    Greetings,

    I have a database definition that I have created and wish to document formally. The report generator does a nice job, but the verbalizations in the constraint report do not come close to matching what is displayed in the Verbalization Browser. In fact, if one did not know what the browser window showed and relied only on the report to understand a constraint, the interpretation could be massively wrong. So: 1) Why is there a difference? 2) Is there a workaround, or some other way to export the actual/correct verbalizations to a report?

     Thanks, Steve 

  • Tue, Feb 27 2018 18:04 In reply to

    • Ken Evans
    • Top 10 Contributor
      Male
    • Joined on Sun, Nov 18 2007
    • Stickford, UK
    • Posts 805

    Re: Verbalizations in Constraint Validation Report don't match ORM Verbalization Browser

     Hi Steve,

    Please will you post an example of the differences that you are observing between the verbalizer and the output from the report generator?

    Thanks

    Ken  

  • Tue, Feb 27 2018 18:51 In reply to

    • Ken Evans
    • Top 10 Contributor
      Male
    • Joined on Sun, Nov 18 2007
    • Stickford, UK
    • Posts 805

    Re: Verbalizations in Constraint Validation Report don't match ORM Verbalization Browser

     Hi Steve,

    Your registered email address is not working.
    The system is sending messages to you but it seems that your email server is not accepting the messages.

    Please will you update your email address in your profile.

    Thanks

    Ken 

  • Tue, Feb 27 2018 22:31 In reply to

    • Steve Miller
    • Top 50 Contributor
      Male
    • Joined on Thu, Jan 1 2009
    • Portland, Oregon USA
    • Posts 18

    Re: Verbalizations in Constraint Validation Report don't match ORM Verbalization Browser

     Hi Ken,

    Thanks for the reply. Email address updated.

    For a very simple case where the relationship is mandatory and there is a uniqueness constraint on the left role, or on both the left and right roles, of a binary fact type, the reading in the Verbalization Browser is like this:

    Tool has CreationDate.

    Each Tool has exactly one CreationDate.

    It is possible that more than one Tool has the same CreationDate. 

    Tool has ToolName.

    Each Tool has exactly one ToolName. (this is correct)

    For each ToolName, at most one Tool has that ToolName.

    ...but the constraint report lists:

    Each Tool has at most one CreationDate.  (fishy)

    Each Tool has some CreationDate. 

    -and-

    Each Tool has at most one ToolName.

    Each Tool has some ToolName. 

    This implies to me that it can be zero or one and that is NOT the case.

     

    Now, for a more complex relationship that is mandatory and where the uniqueness constraint bar stretches over both roles of the binary fact type, the readings are the same:

    Tool has ToolAlias.

    It is possible that some Tool has more than one ToolAlias

    and that for some ToolAlias, more than one Tool has that ToolAlias.

    In each population of Tool has ToolAlias, each Tool, ToolAlias combination occurs at most once.

    This association with Tool, ToolAlias provides the preferred identification scheme for ToolHasToolAlias.


    It is possible that some Tool has more than one ToolAlias

    and that for some ToolAlias, more than one Tool has that ToolAlias.

    In each population of Tool has ToolAlias, each Tool, ToolAlias combination occurs at most once.

    This association with Tool, ToolAlias provides the preferred identification scheme for ToolHasToolAlias. 

     

    Why would the report be different for the first two cases? The browser is correct; the report is unreliable.

     

    Thanks, Steve 

  • Wed, Feb 28 2018 2:55 In reply to

    • Ken Evans
    • Top 10 Contributor
      Male
    • Joined on Sun, Nov 18 2007
    • Stickford, UK
    • Posts 805

    Re: Verbalizations in Constraint Validation Report don't match ORM Verbalization Browser

    Hi Steve, 
    Please will you post the ORM diagram as well?
    You can do this by copying the image, scrolling to the top of the thread and using Options>Add/Update.

    Thanks

    Ken 

  • Wed, Feb 28 2018 8:03 In reply to

    • Ken Evans
    • Top 10 Contributor
      Male
    • Joined on Sun, Nov 18 2007
    • Stickford, UK
    • Posts 805

    Re: Verbalizations in Constraint Validation Report don't match ORM Verbalization Browser

     Hi Steve,

    The email server at scayl.com is still rejecting emails.

    Ken 

     

     

  • Wed, Feb 28 2018 18:17 In reply to

    • Steve Miller
    • Top 50 Contributor
      Male
    • Joined on Thu, Jan 1 2009
    • Portland, Oregon USA
    • Posts 18

    Re: Verbalizations in Constraint Validation Report don't match ORM Verbalization Browser

    [ORM diagram posted as an image attachment]

  • Wed, Feb 28 2018 19:18 In reply to

    • Ken Evans
    • Top 10 Contributor
      Male
    • Joined on Sun, Nov 18 2007
    • Stickford, UK
    • Posts 805

    Re: Verbalizations in Constraint Validation Report don't match ORM Verbalization Browser

    Hi Steve,

    Thanks for the ORM diagram - it is always best to start from the diagram. It is also useful to be able to see the data types.

    So: Problem 1: Inconsistency between the verbalizer and the generated HTML report. 
    You claim that:
    Each Tool has exactly one ToolName.
    is not the same as: 
    Each Tool has at most one ToolName.
    Each Tool has some ToolName. 

    And so you conclude that:  

    This implies to me that it can be zero or one and that is NOT the case.

    Well, firstly, you are correct that the single-sentence verbalization ("...exactly one...") does not use the same text as the two-sentence verbalization that is generated in the HTML report.

    However, your conclusion is false because the logical combination of "at most one" and "some" (= not zero) is the same as "exactly one".

    In other words, when there is more than one constraint, you have to read all predicates as applying together.
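
    To make that concrete, here is a small illustrative check in plain Python (not anything NORMA generates; the fact population and entity names are made up) showing that a population satisfying both "at most one" and "some" is exactly a population satisfying "exactly one":

        # Illustrative only: "at most one" + "some" reads the same as
        # "exactly one" when both constraints apply to the same population.
        from collections import Counter

        # Hypothetical population of the fact type "Tool has ToolName".
        population = [("T1", "Hammer"), ("T2", "Wrench"), ("T3", "Saw")]
        tools = {"T1", "T2", "T3"}

        counts = Counter(tool for tool, _ in population)

        at_most_one = all(counts[t] <= 1 for t in tools)  # uniqueness constraint
        some        = all(counts[t] >= 1 for t in tools)  # mandatory constraint
        exactly_one = all(counts[t] == 1 for t in tools)  # combined reading

        # The two separate constraints hold together if and only if the
        # combined "exactly one" reading holds.
        assert (at_most_one and some) == exactly_one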


    Some other comments:
    I never use the (.id) option because it sometimes leads to inconsistencies.
    When I want to create an auto-counter primary key column in SQL Server, I use the reference mode (.nr) and set the data type to "Numeric: Auto Counter".

    Other suggestions:

    Tool(.nr) was created on Date()
    Tool has Alias(.name)
    Tool is active. (a unary) 

     This model generates only two tables.
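
    Purely for illustration, the two tables would look roughly like the sketch below (expressed as SQLite DDL run from Python). The table and column names are my guesses from the fact types above, not NORMA's exact generated schema.

        # Rough sketch of the two-table mapping for the suggested model.
        # Names and types are illustrative; NORMA's generated DDL will differ.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE Tool (
            toolNr        INTEGER PRIMARY KEY,  -- Tool(.nr), the auto-counter key
            createdOnDate TEXT NOT NULL,        -- "Tool was created on Date": mandatory, exactly one
            isActive      INTEGER NOT NULL      -- the unary "Tool is active" as a boolean column
        );

        CREATE TABLE ToolHasAlias (
            toolNr    INTEGER NOT NULL REFERENCES Tool (toolNr),
            aliasName TEXT    NOT NULL,         -- Alias(.name)
            PRIMARY KEY (toolNr, aliasName)     -- spanning uniqueness: each pair occurs at most once
        );
        """)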

    Hope this helps.

    Ken

      


  • Wed, Feb 28 2018 21:23 In reply to

    • Steve Miller
    • Top 50 Contributor
      Male
    • Joined on Thu, Jan 1 2009
    • Portland, Oregon USA
    • Posts 18

    Re: Verbalizations in Constraint Validation Report don't match ORM Verbalization Browser

    Hi Ken,

    Thank you for your constructive feedback. I love your suggestions for improvement.

    As for the readings, I see what you are saying, but I am still perplexed as to why the same constraints would be verbalized differently in the report and the browser. If I had to choose between the two readings I would opt for the one shown in the browser... but I'll try to get over it! :-)

    Cheers, Steve 

  • Wed, Feb 28 2018 22:14 In reply to

    • Ken Evans
    • Top 10 Contributor
      Male
    • Joined on Sun, Nov 18 2007
    • Stickford, UK
    • Posts 805

    Re: Verbalizations in Constraint Validation Report don't match ORM Verbalization Browser

     Hi Steve,

    I agree that it would be better if the report generator and the verbalizer showed exactly the same output. The mismatch probably came about because Matt updated the verbalizer (probably to meet a request from Terry) but did not get around to implementing the updates in the report generator.

    As you may know, in September 2016, I implemented this JIRA system to manage NORMA's known problems and feature requests. 
    Over the last sixteen months, the JIRA system has helped to reduce the number of unresolved bug-type issues in NORMA from 43 to the 21 that you can see here.

    If you are working with databases, you will probably find it helpful to read the report that I posted as part of issue NOR-2.  

    The NORMA code was moved from SourceForge to GitHub, and you can access it from within the JIRA project.

    NORMA is being worked on by volunteers so if you know of anyone who wants to help please let me know. 

    Thanks for your contribution.

    Ken  

  • Fri, Mar 2 2018 3:17 In reply to

    • Matthew Curland

    Re: Verbalizations in Constraint Validation Report don't match ORM Verbalization Browser

    Hi Steve,

    Please let me solve the mystery for you. There are not two verbalization engines; there is only one. The report generator simply uses a different set of 'verbalization snippets' to produce different styling for the report. Verbalization snippets are simply format strings with replacement fields, and all of the NORMA verbalization is built from snippets contained in snippets (and so on), with some user-provided data from the actual model thrown in.

    If you look at the Options page (on the Tools menu) you'll see an 'ORM Designer' page. Look at the Verbalization section on that page and uncheck 'Combine Mandatory and Uniqueness Constraints'. This option defaults to true and means that when you verbalize the fact type (not the individual constraint), we pick up the common pattern of a simple mandatory constraint and a single-role uniqueness constraint on the same role. This gives you the 'exactly one' reading you're seeing.

    However, if you right-click the fact type and choose Select in Model Browser, then expand the browser node, you'll see Internal Constraints, which lets you select the constraints individually (you can't select simple mandatory constraints directly in the UI). Here you'll see the non-combined constraints, which give you exactly what you see in the constraint report.
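
    Purely as an illustration (this is not the actual NORMA snippet code), the mechanism is roughly this: the same constraint data is poured into different format-string snippets, and the 'Combine Mandatory and Uniqueness Constraints' option decides whether one combined snippet or two separate snippets are used:

        # Illustrative only: one "engine", different snippet choices.
        COMBINED   = "Each {subject} has exactly one {object}."
        UNIQUENESS = "Each {subject} has at most one {object}."
        MANDATORY  = "Each {subject} has some {object}."

        def verbalize(subject: str, obj: str, combine: bool) -> list[str]:
            """Fill the format-string snippets with terms taken from the model."""
            if combine:
                return [COMBINED.format(subject=subject, object=obj)]
            return [UNIQUENESS.format(subject=subject, object=obj),
                    MANDATORY.format(subject=subject, object=obj)]

        print(verbalize("Tool", "ToolName", combine=True))
        # ['Each Tool has exactly one ToolName.']  <- the browser's combined reading
        print(verbalize("Tool", "ToolName", combine=False))
        # ['Each Tool has at most one ToolName.', 'Each Tool has some ToolName.']  <- the report's reading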

    Hopefully that explains the difference. The constraint report lists the constraints individually, whereas the fact type verbalization in the Verbalization Browser combines them as child verbalizations of the fact type when this option is on.

    -Matt 

    P.S. There are lots of fun options. For example, if you're showing a model to someone who can read an ER diagram, you can turn on one of the Entity Relationship Learning Mode options to show different ER notations overlaid on binary fact types. The only option I routinely change is the Delete Key Behavior: I prefer Delete to remove the backing element, and Ctrl-Delete to remove the shape (but not the element).
