
The ORM Foundation


Considerations for large .orm files?

Last post Fri, Jan 15 2010 8:26 by sbattisti. 3 replies.
  • Thu, Jan 14 2010 7:58

    Considerations for large .orm files?

    I'm considering using the NORMA tool for a sizable project that would involve 30+ diagrams of considerable complexity. Are you aware of any limits on how many pages of diagrams the NORMA tool can handle? Will the tool still perform adequately as the .orm file grows? Is there anything I should consider when tackling this?

     Thanks!

     Steve

  • Thu, Jan 14 2010 12:12 In reply to

    • Ken Evans

    Re: Considerations for large .orm files?

    Hi Steve,

    NORMA models are held in memory, so model size is limited by factors such as the amount of RAM, the size of the page file, and the speed of your disk drives.

    Ken 

  • Thu, Jan 14 2010 12:43 In reply to

    Re: Considerations for large .orm files?

    Steve,

    It is hard to tell the true complexity of a model from a diagram count. We fixed some issues last year with very large files (5-10MB on disk) where repeated identification patterns were causing major slowdowns. The cause was overly aggressive attempts to reduce the number of generated tables while analyzing the ORM model.
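    As an aside, .orm files are plain XML, so a rough element tally gives a better complexity gauge than a diagram count. The Python sketch below assumes nothing NORMA-specific beyond well-formed XML, and "Model.orm" is a placeholder path:

        # Rough gauge of .orm model complexity beyond diagram count.
        # Works on any well-formed XML; tag names are whatever the file
        # actually contains, so nothing NORMA-specific is assumed.
        from collections import Counter
        import xml.etree.ElementTree as ET

        def element_histogram(orm_path):
            """Count occurrences of each element tag in an .orm file."""
            counts = Counter()
            for _, elem in ET.iterparse(orm_path, events=("end",)):
                # Strip the XML namespace to get a readable local name.
                counts[elem.tag.rsplit("}", 1)[-1]] += 1
                elem.clear()  # keep memory flat on multi-megabyte files
            return counts

        for tag, n in element_histogram("Model.orm").most_common(20):
            print(f"{n:8d}  {tag}")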

    The core of the tool itself scales well beyond the 5-10MB range. However, there are a couple of areas in our use of the framework that cause lag in large files, especially on model load:

    1. Sizes for all text-based shapes are verified on load, and all lines are recalculated on load. The line routing in particular is excessively slow because the framework ignores the parameter that says 'no jumps or bends' and calculates them anyway. I don't have an exact number, but this accounts for the majority of the load time.
    2. The relational view rebuilds itself on every major change, including the expensive rerouting of all lines. The relational model itself carries all of the information shown in the relational view, so you can improve incremental performance by turning off this extension (see the namespace-listing sketch after this list).
    3. We currently do a full regeneration of the relationally mapped elements whenever any significant ORM change is made. On large models, this can easily produce transactions of tens of thousands of individual items. Work to make this incremental is scheduled for the first half of this year and will alleviate both the relational view issues and the sheer size of these transactions. Currently, even with models of the size you're describing, this adds at most a 0.5 second delay on an .orm model change, but we can definitely do better.
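    On item 2: as I understand it, the extensions loaded for a model show up as XML namespace declarations in the .orm file itself, so you can check whether the relational view extension is active without opening the model in NORMA. The exact namespace URIs vary by NORMA version, so treat this sketch as an illustration of the idea rather than documented behavior:

        # List the namespace declarations an .orm file carries; loaded
        # extensions should appear among them (assumption: extensions are
        # serialized as xmlns declarations; URIs vary by NORMA version).
        import xml.etree.ElementTree as ET

        def declared_namespaces(orm_path):
            return [ns for _, ns in ET.iterparse(orm_path, events=("start-ns",))]

        for prefix, uri in declared_namespaces("Model.orm"):
            print(f"{prefix or '(default)'}: {uri}")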

    Bottom line: I don't think models at the scale you're describing will have problems. Note that we're also using some internal partitioning tools that let us break up and selectively merge .orm files, generally without the shapes, which are what cause the load slowdowns. If you hit major issues, you can contact me offline; the tools here still require a fair amount of hand-holding, which is why they haven't been broadcast at this point.
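    To illustrate the "without the shapes" idea, here is a hypothetical sketch that copies an .orm file minus its top-level diagram elements. DIAGRAM_TAG is an assumed name, not the documented element name, and ElementTree rewrites namespace prefixes on output, so treat the result as a scratch copy for load-time experiments rather than a production file:

        # Hypothetical sketch: copy an .orm file minus its diagram shapes
        # so it loads faster. DIAGRAM_TAG is an assumption -- inspect your
        # own file for the actual local name of the diagram elements.
        import xml.etree.ElementTree as ET

        DIAGRAM_TAG = "ORMDiagram"  # assumed local name; verify first

        def strip_diagrams(src, dst):
            tree = ET.parse(src)
            root = tree.getroot()
            for child in list(root):  # copy the list: we mutate while iterating
                if child.tag.rsplit("}", 1)[-1] == DIAGRAM_TAG:
                    root.remove(child)
            # Note: ElementTree rewrites namespace prefixes (ns0, ns1, ...),
            # so the output is for measurement, not for round-tripping.
            tree.write(dst, encoding="utf-8", xml_declaration=True)

        strip_diagrams("Model.orm", "Model.noshapes.orm")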

    -Matt

  • Fri, Jan 15 2010 8:26 In reply to

    Re: Considerations for large .orm files?

    Excellent, thanks for the feedback!

    Steve
