Posted: Sun Oct 16, 2005 11:28 pm Post subject: Processes with high I/O
Hi,
Please let me know the exact cause of the following problem.
I executed a job that has around 12 steps, most of them sort steps. In one
step, a temporary dataset (15000 trks) received from the previous step is
processed and then passed to the subsequent step. At that point the job ran
for around 2 hours against the usual 15 mins. Once I removed all the
temporary datasets and used cataloged datasets, the problem was solved. Yet
other jobs that use temporary datasets of similarly huge volume run without any problem.
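In simplified form, the passing arrangement looks roughly like this; the step names, DD names and the secondary space figure are made up, only the 15000-track primary comes from the real job, and only the relevant DD statements are shown:
Code:
//* Earlier step writes the large temporary dataset and passes it on
//STEP05   EXEC PGM=SORT
//SORTOUT  DD DSN=&&BIGTEMP,DISP=(NEW,PASS),
//            UNIT=SYSDA,SPACE=(TRK,(15000,1500),RLSE)
//*
//* Next step reads the passed dataset, processes it and deletes it
//STEP06   EXEC PGM=SORT
//SORTIN   DD DSN=&&BIGTEMP,DISP=(OLD,DELETE)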
When this process was evaluated with a performance evaluation tool, it
showed references to the .COBLIB modules IGZCULE and IGZCXFR, with CPU execution time attributed to the I/O Logic Error Handler and I/O Declarative Transfer.
I could not work out what these modules (.COBLIB IGZCULE and IGZCXFR) are for, or the exact cause of this problem. _________________ Regards,
SMS
Posted: Mon Oct 17, 2005 12:54 am Post subject:
SMS,
Quote:
Once I removed all the temporary datasets and
used cataloged datasets, the problem was solved.
It could be for one of two reasons.
1. If the temporary datasets are huge, the job will definitely take some time to find volumes that can accommodate the amount of space required. On the other hand, when you provide already cataloged datasets, it doesn't spend time hunting for space.
2. It also depends on the type of volume where your temporary datasets are stored. Note: the job takes significantly longer to allocate a dataset on a compressed volume than on a normal volume. If you are forcing the JCL to allocate on a compressed volume using the DATACLAS parameter, or if your shop default is to allocate all temporary datasets on a compressed volume, that might be the reason for the delay (see the sketch below).
Find out where the temporary datasets are being stored.
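To make the difference concrete, here is a rough sketch of the two allocation styles; the dataset name and the DCTEMP data class are only examples, and it assumes the cataloged dataset has been allocated in advance:
Code:
//* Pre-allocated cataloged dataset - no space hunt at run time
//OUT1     DD DSN=PROD.WORK.FILE1,DISP=OLD
//*
//* Temporary dataset with an explicit data class; the shop's ACS
//* routines may still override whatever is coded here
//OUT2     DD DSN=&&WORK1,DISP=(NEW,PASS),
//            UNIT=SYSDA,SPACE=(TRK,(15000,1500),RLSE),
//            DATACLAS=DCTEMP
The IEF285I disposition messages in the job log list each dataset with its VOL SER NOS, which is the quickest way to see which volumes the temporary datasets actually landed on.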
DATACLAS PARAMETER:
1) The DATACLAS parameter is not used explicitly. But could you please let me know how to check whether a default is being assigned?
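For what it is worth, on an SMS-managed system the job log should answer this: an IGD101I message is written for each new allocation and it names the storage, management and data classes that the ACS routines assigned. Roughly like the sample below (the exact layout varies by release, and the class and dataset names here are invented); if the DATACLAS field is filled in even though the JCL does not code one, a default is being applied.
Code:
IGD101I SMS ALLOCATED TO DDNAME (SORTOUT )
        DSN (SYS05290.T112800.RA000.JOBNAME.R0100001)
        STORCLAS (SCWORK) MGMTCLAS (MCWORK) DATACLAS (DCTEMP)
        VOL SER NOS= WRK001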
EXTERNAL SORT:
2) I am performing an external sort. What actually happens is: step1 passes a
dataset to step2 after truncating the record length to the required size; step2 processes this temporary dataset, adding a key field in the existing filler area, and the result is passed on to another dataset. This is where the issue starts. _________________ Regards,
SMS
Posted: Tue Oct 18, 2005 4:42 am Post subject:
Quote:
2) I am performing an external sort.
SMS,
I don't think so. If you are performing an external sort, how did the COBOL modules come into the picture? Post your JCL along with your sort SYSIN cards.
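If those two steps really were stand-alone sort steps, the SYSIN cards would look something like the sketch below, and no COBOL modules would appear; the record positions and lengths (a 90-byte record, a 10-byte key at 1-10, filler at 81-90) are invented for illustration:
Code:
* Step1 SYSIN - copy, truncating each record to the required length
  SORT FIELDS=COPY
  OUTREC FIELDS=(1,90)
* Step2 SYSIN - copy, placing a copy of the key (positions 1-10)
* into the unused FILLER area at positions 81-90
  SORT FIELDS=COPY
  OUTREC FIELDS=(1,80,1,10)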
The external sort I mentioned is happening in Step7.
But the problem occurs only in Step8; in this step both the
input and the output have the same file attributes.
Once the temporary datasets are replaced with cataloged ones, the job runs fine. That is what puzzles me, since other jobs that use huge temporary datasets run fine. _________________ Regards,
SMS