satya_mn_99 Beginner
Joined: 26 Sep 2005 Posts: 17 Topics: 5
Posted: Tue Dec 06, 2005 2:36 pm Post subject: CA-7 schedule problem
Hi,
We have a scenario where we need to run a group of jobs four times a day in production through CA-7. Our plan looks like this:

Set 1 (8AM)   Set 2 (10AM)   Set 3 (1PM)   Set 4 (5PM)
Job A         Job A          Job A         Job A
Job B         Job B          Job B         Job B
Job C         Job C          Job C         Job C
Job D         Job D          Job D         Job D

Here, Set 2 Job A should start only after successful completion of Set 1 Job D, and each subsequent set should do the same.
Please help with this.
kolusu Site Admin
Joined: 26 Nov 2002 Posts: 12370 Topics: 75 Location: San Jose
Posted: Tue Dec 06, 2005 3:14 pm
satya_mn_99,
You can use the Virtual Resource Management (VRM) facility to put a hold on the jobs.
Pick a dataset from each of Job B, Job C, and Job D and add it in VRM for Job A. If any of those jobs is executing, it will hold its dataset, so Job A will wait until they are all complete.
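The gist of this approach can be illustrated outside CA-7 with a small Python sketch (job names are hypothetical, and a `threading.Lock` stands in for a VRM resource held while a job runs): Job A must acquire the resource of each of B, C, and D, so it cannot start until all three have finished.

```python
import threading
import time

# One lock per dataset/resource that jobs B, C, D hold while executing.
resources = {name: threading.Lock() for name in ("B", "C", "D")}
all_started = threading.Barrier(4)     # 3 worker jobs + the main thread
order = []

def run_job(name):
    with resources[name]:              # hold "my" resource while running
        all_started.wait()             # make sure B, C, D all hold theirs
        time.sleep(0.05)               # simulated work
        order.append(name)

def run_job_a():
    for lock in resources.values():    # blocks while B, C, or D still runs
        lock.acquire()
    order.append("A")
    for lock in resources.values():
        lock.release()

workers = [threading.Thread(target=run_job, args=(n,)) for n in "BCD"]
for t in workers:
    t.start()
all_started.wait()                     # every resource is now held
a = threading.Thread(target=run_job_a)
a.start()
for t in workers + [a]:
    t.join()
print(order)                           # Job A finishes last
```

This is only the concept; the actual resource definitions would be done through your site's VRM setup.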
Hope this helps...
Cheers
Kolusu _________________ Kolusu
www.linkedin.com/in/kolusu
satya_mn_99 Beginner
Joined: 26 Sep 2005 Posts: 17 Topics: 5
Posted: Tue Dec 06, 2005 3:28 pm
Thanks for the reply, but this doesn't solve our problem. We need some kind of solution within CA-7. We have around 60 jobs in each set; we can't put holds on datasets like that.
Does anyone have another idea?
kolusu Site Admin
Joined: 26 Nov 2002 Posts: 12370 Topics: 75 Location: San Jose
Posted: Tue Dec 06, 2005 3:32 pm
satya_mn_99,
The VRM is *indeed* a CA-7 solution. Talk to your schedulers, and they will let you know how to set it up.
Kolusu _________________ Kolusu
www.linkedin.com/in/kolusu
satya_mn_99 Beginner
Joined: 26 Sep 2005 Posts: 17 Topics: 5
Posted: Tue Dec 06, 2005 4:15 pm
Our scheduling team is not comfortable keeping a hold on datasets. Is there any other solution in CA-7?
SureshKumar Intermediate
Joined: 23 Jan 2003 Posts: 211 Topics: 21
Posted: Tue Dec 06, 2005 4:21 pm
satya_mn_99,
If you don't want to use datasets as Kolusu suggested, take a VRM on the job itself; talk to your schedulers.
Add a resource (whatever name) to Job A.
satya_mn_99 Beginner
Joined: 26 Sep 2005 Posts: 17 Topics: 5
Posted: Tue Dec 06, 2005 5:30 pm
Hi all,
Thanks a lot for your suggestions. We have spoken with the scheduling group, and we were told there is a facility by which the second set of jobs can be triggered after successful completion of the first set, even though the job names are the same in both sets.
Thanks again.
kolusu Site Admin
Joined: 26 Nov 2002 Posts: 12370 Topics: 75 Location: San Jose
Posted: Wed Dec 07, 2005 6:20 am
Quote:
We have spoken with the scheduling group, and we were told there is a facility by which the second set of jobs can be triggered after successful completion of the first set, even though the job names are the same in both sets.
Satya_mn_99,
Can you share with us what the facility was?
Kolusu _________________ Kolusu
www.linkedin.com/in/kolusu
Xevious Beginner
Joined: 29 Dec 2005 Posts: 5 Topics: 2
Posted: Fri Jan 13, 2006 9:32 am
If you are single-threading the jobs, Rav's suggestion is perfect. If you aren't, the amount of predecessors/triggering could add up (assuming you have 60 jobs per set). VRMs let you bypass a lot of coding and make the cycle more streamlined. CA-7 gives you multiple options; your Production Control/Scheduling group should know the most efficient approach.
The "facility" you are speaking of is the DB option on the primary menu. All manipulation of CA-7 jobs/components is done through it (with the exception of BTIs).
Good luck.
arunkantony Beginner
Joined: 28 Feb 2006 Posts: 6 Topics: 2 Location: India
Posted: Tue Feb 28, 2006 11:54 pm Post subject: Complex CA-7 Scheduling
Hi,
I have a similar issue, but the difference is that any number of instances of Job A could get triggered simultaneously, since it is demand-dataset triggered. What I require is that after the completion of Job D of the first set, Job A of the next set starts running.
Please help me in resolving this.
Jaya Beginner
Joined: 02 Sep 2005 Posts: 77 Topics: 10 Location: Cincinnati
Posted: Wed Mar 01, 2006 6:20 am
Arunkantony,
Quote:
Hi, I have a similar issue but the difference is that any number of Job A instances could get triggered simultaneously since it is demand-dataset triggered. So what I require is that after the completion of Job D of the first set, Job A of the next set starts running. Please help me in resolving this.
_________________ "Great spirits have always encountered violent opposition from mediocre minds." -Albert Einstein
CaptObvious Beginner
Joined: 01 Feb 2006 Posts: 19 Topics: 1
Posted: Wed Mar 01, 2006 1:16 pm
It seems to me that the problem here is that the job is externally triggered by the dataset, which takes some of the control away from CA-7. Rather than triggering the job, I would schedule it under SCHID=0 with a submit time of 8:00 AM and make the dataset a dependency: if the dataset doesn't arrive, the job doesn't submit.
Once Job D completed, it could then trigger Job A under SCHID=1; that schedule ID specifies a submit time of 10:00 AM. At the end of that jobstream, Job A is triggered again under SCHID=2, and so on.
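Setting CA-7 panel details aside, the chaining described above can be sketched in plain Python (job names and schedule-ID values are illustrative): the last job of each set hands control to Job A of the next set under the next schedule ID.

```python
# Conceptual sketch of SCHID chaining, not CA-7 syntax.
run_log = []

def run_set(schid, jobs=("JobA", "JobB", "JobC", "JobD")):
    """Run one set; return the schedule ID the last job triggers next."""
    for job in jobs:
        run_log.append((schid, job))   # each job waits on its predecessor
    return schid + 1                   # JobD triggers JobA of the next set

schid = 0                              # first set is time-scheduled (8 AM)
for _ in range(4):                     # four sets per day
    schid = run_set(schid)

print(run_log)                         # 16 (schid, job) entries in order
```

The point is simply that the "trigger" is the return value of the previous set, so a set can never start before its predecessor has finished.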
arunkantony Beginner
Joined: 28 Feb 2006 Posts: 6 Topics: 2 Location: India
Posted: Thu Mar 02, 2006 1:51 am
Thanks Jaya/CaptObvious for your suggestions!
Sorry, I guess I have confused you. Let me explain the scenario again in a different way.
We have two jobs, J1 and J2.
Job J1 is triggered by a demand dataset (which we have no control over, so any number of instances could get triggered). This job pulls a file from a third-party FTP site (CE Mailbox) to an MVS dataset.
Job J2 is triggered by Job J1. This job transfers the file created by J1 to Teradata (through PM4DATA).
So the next instance of J1 should start only after the successful completion of J2 of the previous set; otherwise the dataset will be overwritten before the file transfer happens.
Also, if we make J2 a requirement of J1 in CA-7, then after the successful completion of the first set's J2, the requirement for the second and third instances of J1 gets satisfied, and this again creates issues.
If there were only two files coming in simultaneously we could have solved this, but in this case we have more than three files coming in.
Please let me know how we could solve this. Your help will be highly appreciated.
CA-7 experts: is there any possible way we could treat jobs J1 and J2 as a single group, so that only the successful completion of the previous group starts the next group of jobs (J1, J2) running?
Suggestions please...
--Arun
Jaya Beginner
Joined: 02 Sep 2005 Posts: 77 Topics: 10 Location: Cincinnati
Posted: Thu Mar 02, 2006 4:21 am
Arun,
Answer the following questions...
Quote:
CA-7 experts: is there any possible way we could treat jobs J1 and J2 as a single group, so that only the successful completion of the previous group starts the next group of jobs (J1, J2) running?
Even if your job J1 starts after the successful completion of J2 of the previous batch, how do you ensure that J1 will always pick up a fresh copy of the file from the third-party FTP site (CE Mailbox) to the MVS dataset without missing any?
Also, I see no reason for J1 to pick up the same file from the FTP site and process it again and again.
Quote:
If there were only two files coming in simultaneously we could have solved this, but in this case we have more than three files coming in.
I assume that your job J1 gets triggered by a single DSN=AAA.BBB.CCC. There can't be two posts simultaneously from the same DSN. As I mentioned in my earlier post, this DSN could be corrupted since you have no control over it; it is even worse if your job J1 itself uses this DSN.
Also, what happens if your job J1 aborts due to some outage? Will the third-party file be refreshed with the new version, or will it wait until you pick up the file?
Hence I had suggested that your job J2 send a tag file to the third-party FTP site saying that your job chain J1 -> J2 is complete. Your third party can then proceed to create the next version of the file, and afterwards they can send the post again to trigger your next J1 -> J2 chain.
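Stripping away the FTP and CA-7 specifics, the tag-file handshake can be sketched in plain Python (file names and behaviour are hypothetical): the producer writes a new version only after the previous one has been acknowledged, so nothing is overwritten before J2 has transferred it.

```python
import os
import tempfile

mailbox = tempfile.mkdtemp()                    # stands in for the FTP mailbox
data = os.path.join(mailbox, "DATA.FILE")
tag = os.path.join(mailbox, "DATA.TAG")

def producer_send(version):
    """Third party: only create a new version if the previous one was
    acknowledged via the tag file (or nothing has been sent yet)."""
    if os.path.exists(data) and not os.path.exists(tag):
        return False                            # previous version not consumed
    if os.path.exists(tag):
        os.remove(tag)                          # consume the acknowledgement
    with open(data, "w") as f:
        f.write(version)
    return True

def consumer_j1_j2():
    """J1 pulls the file; J2 transfers it, then drops the tag file."""
    with open(data) as f:
        content = f.read()                      # J1 pull + J2 transfer
    open(tag, "w").close()                      # J2: acknowledge completion
    return content

assert producer_send("v1")                      # first version goes out
assert not producer_send("v2")                  # blocked: v1 not yet consumed
print(consumer_j1_j2())                         # prints "v1"
assert producer_send("v2")                      # tag seen, v2 can be sent
print(consumer_j1_j2())                         # prints "v2"
```

No queued J1 instance can ever see a half-consumed file, because a new version only appears after the tag acknowledges the old one.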
Hope I am right! I am also eagerly waiting to know whether this can be achieved in CA-7 without communication with the third party.
Thanks,
Jaya. _________________ "Great spirits have always encountered violent opposition from mediocre minds." -Albert Einstein
CaptObvious Beginner
Joined: 01 Feb 2006 Posts: 19 Topics: 1
Posted: Thu Mar 02, 2006 12:58 pm
Have you considered combining the steps of J1 and J2 into a single job?