jajularamesh (Beginner)
Joined: 14 Apr 2006   Posts: 87   Topics: 33
Posted: Wed Jun 20, 2007 3:52 am   Post subject: Requirement to take backup of huge files
I have a requirement to take backups of very large files. The requirement is to append each day's data to the backup file. I plan to use:
DISP=(MOD,CATLG,DELETE)
Is there any possibility that the sort step fails while taking the backup? If it fails, what might be the reason? When it fails, the existing backup dataset will be deleted. The client is not interested in using GDGs.
So can I use the disposition given below instead?
DISP=(MOD,CATLG,KEEP)
What will be the status of the backup dataset when the job fails? Will it contain the backup up to the last run or not?
Phantom (Data Mgmt Moderator)
Joined: 07 Jan 2003   Posts: 1056   Topics: 91   Location: The Blue Planet
Posted: Wed Jun 20, 2007 9:17 am
Jajularamesh,
Why don't you just try playing around with a small PS (both ways: MOD,CATLG,DELETE and MOD,CATLG,KEEP)? I'm sure you will know even before someone replies to your note on this board.
If you face some problem or don't understand the behaviour of the DISP parameter after trying it out, please feel free to come back again.
Thanks,
Phantom
Bill Dennis (Advanced)
Joined: 03 Dec 2002   Posts: 579   Topics: 1   Location: Iowa, USA
Posted: Wed Jun 20, 2007 9:24 am
Using DISP=MOD is risky because the file is changing and cannot be "reset" after an abend (bad media, accidental cancel, etc.). On a restart there will likely be duplicate records in your output.
Why are you using SORT? Is it just a COPY? You might be able to use SKIPREC to restart if you're sure how many records were already copied. Sounds like trouble. I hope you'll consider making each day's backup a separate dataset.
_________________
Regards,
Bill Dennis
Disclaimer: My comments on this forum are my own and do not represent the opinions or suggestions of any other person or business entity.
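Bill's SKIPREC suggestion could be sketched roughly as below. The dataset names and the skip count are hypothetical; SKIPREC tells DFSORT how many input records to bypass before the copy begins:

```jcl
//RESTART  EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DSN=PROD.INPUT.FILE,DISP=SHR
//SORTOUT  DD DSN=PROD.BACKUP.FILE,DISP=(MOD,CATLG,KEEP)
//SYSIN    DD *
* Copy only, skipping the records already written to the backup
* in the failed run (100000 here is an assumed example count)
  OPTION COPY,SKIPREC=100000
/*
```

The hard part, as Bill notes, is knowing the correct SKIPREC value; if the count from the failed run is uncertain, duplicates or gaps are still possible.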
jajularamesh (Beginner)
Joined: 14 Apr 2006   Posts: 87   Topics: 33
Posted: Wed Jun 20, 2007 9:52 am
Phantom,
I already tried that, and I am not satisfied with either option. I feel it is better to use a GDG with the disposition
DISP=(NEW,CATLG,DELETE)
so that if the job abends I can restart it. But the problem is that the client is insisting on doing it with a simple PS file.
Phantom, I took huge files and tried copying them to another dataset. To make the job abend, I set the TIME parameter to one second and ran the job a number of times. In one of the runs the job abended, and the record count was not a multiple of the number of records in the input file. What I actually wanted to know is: whenever a job abends, can we retain the same number of records that were present in the file before execution of the job?
Bill,
By Sort I mean the SORT utility.
Regards,
Venkata Apparao Jajula
Phantom (Data Mgmt Moderator)
Joined: 07 Jan 2003   Posts: 1056   Topics: 91   Location: The Blue Planet
Posted: Wed Jun 20, 2007 10:10 am
Jajularamesh,
I apologize if I sounded harsh in my earlier note. We have faced a lot of people who expect "spoon feeding" without trying anything out. We do not encourage such behaviour; they would never learn anything that way.
As Bill mentioned, it is risky to use a single file with DISP=MOD. You never know how much you have processed when a job abends. Of course there are ways to get a count of records processed and use it while restarting the job, but that would complicate things.
A better way is to use one PS for each day and, at the end of the month, have a month-end job concatenate all the daily files into a monthly PS, and so on for quarterly/annual backups.
That way you can use DISP=(NEW,CATLG,DELETE) for the daily and monthly PSes, and if the job abends, everything will be erased and the job can be restarted safely.
Hope this helps,
Thanks,
Phantom
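Phantom's daily-file approach might look something like the following sketch. All dataset names, dates, and space figures are made up for illustration:

```jcl
//* Daily backup: a fresh PS each day; on an abend the partial
//* output is deleted and the job can simply be resubmitted
//DAILY    EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DSN=PROD.MASTER.FILE,DISP=SHR
//SORTOUT  DD DSN=BKUP.DAILY.D070620,
//            DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(100,50),RLSE)
//SYSIN    DD *
  OPTION COPY
/*
//* Month-end step: concatenate the daily files into one monthly PS
//MONTHEND EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DSN=BKUP.DAILY.D070601,DISP=SHR
//         DD DSN=BKUP.DAILY.D070602,DISP=SHR
//*           ... one DD per day ...
//         DD DSN=BKUP.DAILY.D070630,DISP=SHR
//SORTOUT  DD DSN=BKUP.MONTHLY.M0706,
//            DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(500,100),RLSE)
//SYSIN    DD *
  OPTION COPY
/*
```

Because both outputs use DISP=(NEW,CATLG,DELETE), an abend never corrupts an existing backup; the worst case is rerunning one day's copy.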
dbzTHEdinosauer (Supermod)
Joined: 20 Oct 2006   Posts: 1411   Topics: 26   Location: germany
Posted: Wed Jun 20, 2007 11:30 am
Phantom wrote: | Why don't you just try playing around with a small PS |
Could you help out an old dummy: what is a PS?
jajularamesh wrote: | But problem Client is insisting to do using a simple PS file. |
Has the client given a reason for not using GDGs?
_________________
Dick Brenholtz
American living in Varel, Germany
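For reference, the GDG route the client rejected would normally mean defining a base once with IDCAMS and then writing a new generation each day. Names and the generation limit below are illustrative (62 generations would roughly cover the two months discussed later in the thread):

```jcl
//* One-time setup: define the GDG base, keeping 62 generations
//DEFGDG   EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  DEFINE GDG (NAME(BKUP.DAILY.MASTER) LIMIT(62) NOEMPTY SCRATCH)
/*
//* Daily job: (+1) creates a brand-new generation each run,
//* so an abend never touches the previous day's backup
//BKUP     EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DSN=PROD.MASTER.FILE,DISP=SHR
//SORTOUT  DD DSN=BKUP.DAILY.MASTER(+1),
//            DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(100,50),RLSE)
//SYSIN    DD *
  OPTION COPY
/*
```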
Phantom (Data Mgmt Moderator)
Joined: 07 Jan 2003   Posts: 1056   Topics: 91   Location: The Blue Planet
Posted: Wed Jun 20, 2007 11:47 am
PS: Physical Sequential, i.e. a flat file/dataset.
Thanks,
Phantom
dbzTHEdinosauer (Supermod)
Joined: 20 Oct 2006   Posts: 1411   Topics: 26   Location: germany
Posted: Wed Jun 20, 2007 12:16 pm
Thanks, Phantom
_________________
Dick Brenholtz
American living in Varel, Germany
jajularamesh (Beginner)
Joined: 14 Apr 2006   Posts: 87   Topics: 33
Posted: Wed Jun 20, 2007 11:54 pm
Thanks, Phantom
dbzTHEdinosauer (Supermod)
Joined: 20 Oct 2006   Posts: 1411   Topics: 26   Location: germany
Posted: Thu Jun 21, 2007 1:56 am
jajularamesh,
I have been thinking about this for a few days.
I can understand daily backups, but why does your client want each day's backup appended to the previous one?
At what point do you start a new file? Monthly? Yearly?
Is this backup a purged history (selectively extracted records), a point-in-time restore, or a daily transaction file?
Will this file be on tape or DASD?
Of what use is this continually growing file? Why, or for what, would you ever need it? How would you process it if you had a requirement to access something, if that requirement could even be defined?
_________________
Dick Brenholtz
American living in Varel, Germany
jajularamesh (Beginner)
Joined: 14 Apr 2006   Posts: 87   Topics: 33
Posted: Mon Jun 25, 2007 4:14 am
Actually, ours is a migration project. A backup of two months' data is required, and finally this backed-up file will be loaded into the database.
Regards,
Ramesh