arvibala (Beginner)
Posted: Tue Jul 20, 2010 6:28 am
Post subject: Help needed for better scheduling
I have a job, JOBA, which is dataset (GDG) triggered. We receive multiple files within seconds of each other, and because of that some generations are missed while another generation is processed twice.

Example:
USERID.DATASET.DATA.G0001V00 - created 12.30.02
USERID.DATASET.DATA.G0002V00 - created 12.40.01
USERID.DATASET.DATA.G0003V00 - created 12.40.01

Here there is a chance that JOBA is triggered three times, with generation 3 going in as input twice and generation 2 being missed.

In this case, how do I set up proper scheduling so that JOBA runs separately for each file? Also, while one file is being processed, another file should not be processed at the same time.
Thanks,
Arvind
"You can make a difference with your smile. Have that with you always"
superk (Advanced)
Posted: Tue Jul 20, 2010 7:01 am
This is how we used to handle this scenario.

The scheduler would still load JOBA based on the dataset-trigger event firing, but JOBA would never specify a particular dataset. Instead, it would run an IDCAMS LISTCAT to find all of the currently cataloged datasets, and that list would be compared against a log of all previously processed datasets. The new dataset would then be processed using dynamic allocation routines, and at the end we'd log the new dataset name. JOBA could then run again for the next dataset, and the next, until they had all been processed and logged, one at a time. A sketch of this design is shown below.
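Here is a minimal JCL sketch of that design. The PICKNEXT REXX exec, its library, and the processed-log dataset name are all hypothetical stand-ins for whatever does the compare-and-allocate work at your shop:

Code:

//JOBA     JOB (ACCT),'NEXT GDS',CLASS=A,MSGCLASS=X
//*----------------------------------------------------------------
//* STEP 1: LIST ALL CURRENTLY CATALOGED GENERATIONS OF THE GDG
//*----------------------------------------------------------------
//LISTCAT  EXEC PGM=IDCAMS
//SYSPRINT DD DSN=&&LCAT,DISP=(NEW,PASS),
//            UNIT=SYSDA,SPACE=(TRK,(5,5))
//SYSIN    DD *
  LISTCAT LEVEL(USERID.DATASET.DATA) NAME
/*
//*----------------------------------------------------------------
//* STEP 2: HYPOTHETICAL REXX EXEC THAT COMPARES THE LISTCAT
//* OUTPUT AGAINST THE PROCESSED-DATASET LOG, DYNAMICALLY
//* ALLOCATES THE OLDEST UNPROCESSED GENERATION, PROCESSES IT,
//* AND APPENDS ITS NAME TO THE LOG
//*----------------------------------------------------------------
//PICKNEXT EXEC PGM=IKJEFT01
//SYSEXEC  DD DSN=USERID.REXX.EXEC,DISP=SHR
//LCAT     DD DSN=&&LCAT,DISP=(OLD,DELETE)
//PROCLOG  DD DSN=USERID.PROCESSED.LOG,DISP=MOD
//SYSTSPRT DD SYSOUT=*
//SYSTSIN  DD *
  %PICKNEXT
/*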
kolusu (Site Admin)
Posted: Tue Jul 20, 2010 10:06 am
arvibala,
Change the file creation to a sequential file instead of a GDG, and trigger the job on its creation. The last step of the job would copy the sequential file to a GDG (for backup purposes, if you need it) - see the sketch below.

By doing it that way, even if you receive a file while the current file is still being processed, it waits until the earlier job has completed. This approach works like a FIFO process.
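A minimal sketch of that last step, assuming SORT for the copy and a hypothetical backup GDG base name. DISP=(OLD,DELETE,KEEP) deletes the sequential file when the copy succeeds, so the next incoming file can be cataloged under the same name:

Code:

//*----------------------------------------------------------------
//* LAST STEP: COPY THE PROCESSED SEQUENTIAL FILE TO A BACKUP GDG,
//* THEN DELETE IT SO THE NEXT FILE CAN BE CATALOGED
//*----------------------------------------------------------------
//BACKUP   EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DSN=USERID.DATASET.DATA,DISP=(OLD,DELETE,KEEP)
//SORTOUT  DD DSN=USERID.DATASET.BKUP(+1),
//            DISP=(NEW,CATLG,DELETE),
//            SPACE=(CYL,(10,5),RLSE),
//            DCB=*.SORTIN
//SYSIN    DD *
  OPTION COPY
/*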
Alternatively, you can accumulate all the files received as GDG generations and process all of them at once at the end of the day, as shown below.
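If you take that route, coding the GDG base name with no relative generation number reads every cataloged generation as one concatenated input. PGM=YOURPGM is a placeholder for whatever program processes the data:

Code:

//*----------------------------------------------------------------
//* REFERENCING THE GDG BASE WITHOUT A GENERATION NUMBER
//* CONCATENATES ALL CATALOGED GENERATIONS AS INPUT
//*----------------------------------------------------------------
//PROCESS  EXEC PGM=YOURPGM
//INPUT    DD DSN=USERID.DATASET.DATA,DISP=SHR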
Kolusu
www.linkedin.com/in/kolusu
arvibala (Beginner)
Posted: Wed Jul 21, 2010 12:49 am
Thanks Kolusu and Superk,

"Accumulate all the files received as GDG generations and process all of them at once at the end of the day" is the option we were thinking of, but since this is a response file, it needs to be processed as soon as we receive it.

If we process it as a sequential file, what will happen if we receive two files within microseconds of each other? Won't the second file overwrite the first before the job processes it?

Arvind
"You can make a difference with your smile. Have that with you always"
Anuj Dhawan (Intermediate)
Posted: Wed Jul 21, 2010 4:07 am
arvibala wrote:
  If we process it as a sequential file, what will happen if we receive two files within microseconds of each other? Won't the second file overwrite the first before the job processes it?

No - while the file is being processed, another dataset with the same name can't be created/cataloged. Also, remember that you need to delete the file once it has been processed, so that when your next file comes in you don't run into a "data set already cataloged" situation. Before deleting the file you might also take a backup, as Kolusu has said, and of course under a different name. A sketch of the cleanup step is below.
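A minimal sketch of that cleanup, assuming the backup copy has already been taken in an earlier step (see Kolusu's sketch above):

Code:

//*----------------------------------------------------------------
//* DELETE THE PROCESSED FILE SO THE NEXT ARRIVAL CAN BE CATALOGED
//* UNDER THE SAME NAME
//*----------------------------------------------------------------
//DELETE   EXEC PGM=IEFBR14
//DD1      DD DSN=USERID.DATASET.DATA,DISP=(OLD,DELETE,DELETE)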
Regards,
Anuj
Anuj Dhawan (Intermediate)
Posted: Wed Jul 21, 2010 4:15 am
And what scheduler are you using?

Regards,
Anuj