jajularamesh Beginner
Joined: 14 Apr 2006 Posts: 87 Topics: 33
Posted: Wed Aug 22, 2007 10:53 am    Post subject: Handling data on tapes
Hi,
We created jobs that build the initial loads. Because the data is huge, each day's data is kept on tape. We need to omit duplicate account IDs based on a specific condition, and we have a program for that.
The inputs to the program are the daily file and the consolidated backup file, which is the merged file to date; both files are on tape.
The output is the current day's consolidated backup file, i.e. a new version of the consolidated backup file is created.
We have 25 versions of daily files for which we need to run the job (25 days of processing is kept in the same job).
My job has been running for 3 days and is only 30% complete.
How should this be handled when the data is huge and resides on tape?
Any help that improves performance will be a learning for me.
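[Editor's note: the de-duplication described above (keep the daily record, drop the matching backup record) can often be done in one DFSORT pass instead of a record-at-a-time program. This is only a sketch: the dataset names, the key position, and the assumption that key-sequenced output is acceptable are all hypothetical. Note that for VB records DFSORT column positions include the 4-byte RDW, so data starts at column 5.]

```jcl
//DEDUP    EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//* Daily file FIRST, then the previous consolidated backup:
//* with EQUALS, the first record of each key in input order
//* is the one SUM FIELDS=NONE keeps, so daily wins.
//SORTIN   DD DSN=MY.DAILY.FILE,DISP=SHR
//         DD DSN=MY.CONSOL.BACKUP(0),DISP=SHR
//SORTOUT  DD DSN=MY.CONSOL.BACKUP(+1),
//            DISP=(NEW,CATLG,DELETE),UNIT=TAPE,
//            DCB=(RECFM=VB,LRECL=200,BLKSIZE=32760)
//SYSIN    DD *
  SORT FIELDS=(5,10,CH,A),EQUALS
  SUM FIELDS=NONE
/*
```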
dbzTHEdinosauer Supermod
Joined: 20 Oct 2006 Posts: 1411 Topics: 26 Location: germany
Posted: Wed Aug 22, 2007 11:03 am    Post subject:
what is your tape output block size?
what does your program do?
_________________
Dick Brenholtz
American living in Varel, Germany
jajularamesh Beginner
Joined: 14 Apr 2006 Posts: 87 Topics: 33
Posted: Wed Aug 22, 2007 11:13 am    Post subject:
Hi dbzTHEdinosauer,
The block size is 10 times the LRECL of the tape file.
The file is variable blocked (VB).
The program copies the current day's data as-is, then reads each record from the consolidated backup file and checks whether that particular account ID is present in the daily input file: if it is not present, the record is added to the new consolidated backup file; if it is present, the record is omitted.
I have a VSAM file that holds the current day's account IDs. Each time, DELETE, DEFINE, and REPRO steps are run and the current day's account IDs are copied into it.
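[Editor's note: the DELETE/DEFINE/REPRO cycle mentioned above is commonly coded as a single IDCAMS step. A minimal sketch, with hypothetical dataset names and key attributes; SET MAXCC = 0 keeps the first run from failing when there is nothing yet to delete.]

```jcl
//LOADVSAM EXEC PGM=IDCAMS
//ACCTIN   DD DSN=MY.DAILY.ACCTIDS,DISP=SHR
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  DELETE MY.ACCTID.KSDS CLUSTER PURGE
  SET MAXCC = 0
  DEFINE CLUSTER (NAME(MY.ACCTID.KSDS) -
         INDEXED -
         KEYS(10 0) -
         RECORDSIZE(80 80) -
         CYLINDERS(50 10))
  REPRO INFILE(ACCTIN) OUTDATASET(MY.ACCTID.KSDS)
/*
```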
expat Intermediate

Joined: 01 Mar 2007 Posts: 475 Topics: 9 Location: Welsh Wales
Posted: Wed Aug 22, 2007 11:22 am    Post subject:
Well, the first thing I would do is use optimum blocking on the tapes:
BLKSIZE=32760 for VB tapes
BLKSIZE = INT(32760/LRECL) * LRECL for FB tapes
This may considerably lower your I/O overheads.
_________________
If it's true that we are here to help others, then what exactly are the others here for?
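[Editor's note: the formulas above translate into the DCB parameter of the tape DD statements. A sketch with made-up dataset names and assumed record lengths; for FB with LRECL=100, INT(32760/100)*100 = 32700.]

```jcl
//* VB tape file: 32760 lets each block hold as many
//* variable-length records as will fit.
//OUTVB    DD DSN=MY.CONSOL.BACKUP(+1),
//            DISP=(NEW,CATLG,DELETE),UNIT=TAPE,
//            DCB=(RECFM=VB,LRECL=200,BLKSIZE=32760)
//* FB tape file with LRECL=100: largest multiple of the
//* LRECL that fits under 32760 is 32700.
//OUTFB    DD DSN=MY.DAILY.EXTRACT,
//            DISP=(NEW,CATLG,DELETE),UNIT=TAPE,
//            DCB=(RECFM=FB,LRECL=100,BLKSIZE=32700)
```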
dbzTHEdinosauer Supermod
Joined: 20 Oct 2006 Posts: 1411 Topics: 26 Location: germany
Posted: Wed Aug 22, 2007 11:31 am    Post subject:
jajularamesh,
Since you are the only one who knows the LRECL, you should see whether your block size approximates the maximum length allowed for your tape devices. If not, I suggest you increase the block size.
How many buffers are you using on your tape files? If you have not specified BUFNO in your DD statement, it has defaulted to your shop's default BUFNO value. Find out what it is.
How many records are in the VSAM file? Actually, how many records are there, and how much data from each VSAM record does your program need?
Expand upon what you use the VSAM file for and how it is used.
Give us more info and we can attack the problem both from a non-program-change perspective and also by possibly changing your program.
Nothing should take three days except snail-mail.
_________________
Dick Brenholtz
American living in Varel, Germany
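[Editor's note: BUFNO can be raised directly on the DD statement, as suggested above. A sketch with a hypothetical dataset name and buffer count; the useful value depends on the shop's defaults, region size, and access method.]

```jcl
//* More buffers let QSAM read ahead on tape and overlap
//* I/O with processing; the shop default is often only 5.
//DAILYIN  DD DSN=MY.DAILY.FILE,DISP=SHR,
//            DCB=BUFNO=30
```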