Analysing Map Files

Forums for specific tips, techniques and example code
Stokes
Posts: 66
Joined: Wed Oct 13, 2010 2:06 pm
Location: QLD, Australia

Analysing Map Files

Postby Stokes » Wed Dec 07, 2016 3:36 pm

We have an issue with some large data (.dat) files that we want to remap using JCF by splitting out the large classes within each file.

I have been trying to work out how to analyse a data file. I want to see, for each class attached to that map, how many instances it has and possibly how much space the class takes up within the file.

Does anyone know how this can be done? I did this manually for one of the files by copying all the class names out of the maps browser, then going to a workspace and putting in "write <class>.instances.size". But I'm hoping there is a better way.

Also, as a side note, we were discussing the idea of giving every class that will create a large number of instances its own map file. I would be interested to know what the downsides are to creating a map file for a single class, apart from the huge number of data files that would then sit in the system folder. It seems to me that performance would improve with smaller data files, but I'm not sure it would make any difference. I suppose if you had a lot of users all wanting to get data from the one data file, then splitting it up would improve things a bit?

allistar
Posts: 156
Joined: Fri Aug 14, 2009 11:02 am
Location: Mount Maunganui, Tauranga

Re: Analysing Map Files

Postby allistar » Thu Dec 08, 2016 7:06 am

Hi Stokes,
Having a dedicated map file for a high-volume class makes sense, particularly when taking into account database operations like reorgs and compacts. Jade can reorg files in parallel, but that's only useful if there are multiple map files involved in the reorg. In the past I have gone with a few simple rules: high-instance classes get a dedicated map file; singletons get their own map file; transient classes all go in one map file; other medium- to low-instance classes get a map file per module/area of the product. I don't see the value in having a dedicated map file for each low- or medium-instance class.

I'm not aware of an easy way in code of working out how much disk space an object takes up. It should be possible, though, as the sizes of the data types are documented.

If you find the DbFile instance you can iterate all the classes on that map, and from there you can write the number of persistent instances of each class.

Regards,
Allistar.

Stokes
Posts: 66
Joined: Wed Oct 13, 2010 2:06 pm
Location: QLD, Australia

Re: Analysing Map Files

Postby Stokes » Thu Dec 08, 2016 9:01 am

Thanks for that, Allistar. I was thinking along the same lines regarding map files but had never really given it much thought before. That changed when we had a 3-hour reorg on a 33GB data file (the disk speed on the server was not great).

We could possibly get a rough idea of the data size of each class by adding up the byte sizes of all the properties and references on that class, but I thought there would be an easier way.

I couldn't see how to get all the classes from the DbFile, so I just went through all the classes and extracted the map file name from the display string, as shown below:

Code: Select all

vars
    class : Class;
    pos : Integer;
    display : String;
begin
    foreach class in Class.instances do
        display := class.display;
        pos := display.pos("Maps:", 1);
        if pos > 0 then
            // <map> is the map file name in lowercase, without the .dat extension
            if display[pos + 5 : end].trimBlanks().toLower() = "<map>" then
                write class.name & Tab & class.countPersistentInstances().String;
            endif;
        endif;
    endforeach;
end;
Last edited by Stokes on Wed Jan 18, 2017 12:23 pm, edited 1 time in total.

JohnP
Posts: 73
Joined: Mon Sep 28, 2009 8:41 am
Location: Christchurch

Re: Analysing Map Files

Postby JohnP » Thu Dec 08, 2016 9:08 am

A rule of thumb I use is to aim for around 50 dat files, combining the smaller ones and giving the larger ones their own file. This is for operational logistics: reorgs (as Allistar has mentioned), backups, transfers, etc. A larger database may justify more files, e.g. 200, and a smaller one fewer.

There is no runtime performance difference - it is the same regardless of how the classes are split across dbfiles.

It is much faster to use Class::countPersistentInstances than Class::instances::size.
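As a minimal workspace sketch of that approach (using only the methods mentioned in this thread; the tab-separated output format is just illustrative):

Code: Select all

vars
    cls : Class;
begin
    foreach cls in Class.instances do
        // countPersistentInstances() is much faster than instances.size
        // for classes with large numbers of instances
        write cls.name & Tab & cls.countPersistentInstances().String;
    endforeach;
end;

This lists every class rather than just those on one map; filtering by map file would still need something like the display-string check shown earlier in the thread.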

BeeJay
Posts: 311
Joined: Tue Jun 30, 2009 2:42 pm
Location: Christchurch, NZ

Re: Analysing Map Files

Postby BeeJay » Fri Dec 09, 2016 12:59 pm

Another option is to run a Jade database utility certify on the map file in question, or on a restored copy of the system on another machine if you can't afford a production outage to run the certify. This will give you a summary of the instances of the classes in that map file, which may be useful for your purpose.

Cheers,
BeeJay.

