jim haire Beginner
Joined: 30 Dec 2002 Posts: 140 Topics: 40
Posted: Mon Feb 26, 2018 2:00 pm Post subject: Determining if/when a file exists on the mainframe
I am building a process in which third-party software FTPs a file from an external customer to our mainframe. I am trying to determine the best method for identifying when the dataset arrives on the mainframe so the file can be processed.
The problem is that approximately 1200 files can be sent by different customers, with more to come in the future. A customer will notify us that they will begin sending a file on a regular basis, and we will record that in a DB2 table. Other than that, I would like to do no additional setup for each file being sent. My thought is to build JCL and write it to the Internal Reader whenever one of the files recorded in the DB2 table arrives on our mainframe.
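The Internal Reader part itself is straightforward; an IEBGENER step like this would submit the tailored JCL (just a sketch: HLQ.JCLLIB(PROCJOB) is a made-up member holding the generated JCL):

Code:
//SUBMIT   EXEC PGM=IEBGENER
//SYSPRINT DD SYSOUT=*
//SYSIN    DD DUMMY
//* input: the tailored JCL built for the incoming file
//SYSUT1   DD DISP=SHR,DSN=HLQ.JCLLIB(PROCJOB)
//* output: the JES internal reader - JES runs whatever lands here
//SYSUT2   DD SYSOUT=(A,INTRDR)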
I would like to have the FTP software write a record containing the name of the incoming file to an MQ queue, which would then cause an MQ listener to trigger the Internal Reader job to do its processing. I don't think the FTP software can write to MQ, however.
Another thought would be a job that runs every 15 minutes to see whether certain files exist. However, some additional setup would be needed to add each file to the list of files to search for. (I could probably pull the list from the DB2 table, but that means checking for the existence of every file every time the job runs.) The MQ queue carrying the name of the file sent would be much cleaner.
Does anyone else have ideas on how to identify files that have arrived on the mainframe?
kolusu Site Admin
Joined: 26 Nov 2002 Posts: 12376 Topics: 75 Location: San Jose
jim haire Beginner
Joined: 30 Dec 2002 Posts: 140 Topics: 40
Posted: Tue Feb 27, 2018 9:48 am Post subject:
The company I am at uses Zeke. But even with a scheduling package, you would need to perform some sort of setup in the package to have it look for that specific file.
In the MQ scenario I mentioned in my original post, a message containing the name of the incoming file would be posted to the queue, so I would not have to know the file name ahead of time and would not need to set up anything for that file. (We expect the files coming from our various customers to have the same record length and follow the same layout.)
In my COBOL program that writes to the Internal Reader, I would grab the MQ message containing the file name and build JCL using that name.
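The JCL to run that trigger program could be as simple as this (just a sketch: FILETRIG and HLQ.BATCH.LOADLIB are made-up names, and the MQ libraries our shop uses would be concatenated to the STEPLIB):

Code:
//MQTRIG   EXEC PGM=FILETRIG
//* made-up load library; concatenate the MQ libraries here too
//STEPLIB  DD DISP=SHR,DSN=HLQ.BATCH.LOADLIB
//* the COBOL program writes the JCL it builds to this DD;
//* JES treats anything written here as a submitted job
//JCLOUT   DD SYSOUT=(A,INTRDR)
//SYSOUT   DD SYSOUT=*
//SYSPRINT DD SYSOUT=*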
I am wondering if it would be possible to set up these "triggers" (similar to your TWS example), but do it dynamically based on the file name I read.
I am new to Zeke, so I may have to find out from someone here whether this is possible.
Thanks for your response!
kolusu Site Admin
Joined: 26 Nov 2002 Posts: 12376 Topics: 75 Location: San Jose
Posted: Tue Feb 27, 2018 12:13 pm Post subject:
jim haire,
You basically set up a rule for each vendor sending a file. In your scheduling package, you would invoke one single job which takes the vendor name and creates a GDG generation from that file (this will also be useful for your data-verify process).
So if the vendor sends a file named something like this:
VNDRNAME.PROCESS1.XMLDATA.TEXT ... or
VNDRNAME.A.B.C.D ... or
VNDRNAME.BLAH.BLAH.BLAH
The scheduling package would then strip off the vendor name (the first eight characters) and create a copy of the sent file as
HLQ.RECEIVE.VNDRNAME.FILE.G0001V00
and so on, one generation per file received. It doesn't matter how many files you receive; you will have them all in one place, where you can simply search for them with
HLQ.RECEIVE.*
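A rough sketch of that single job, written as a proc the scheduler would call with the vendor name and the sent dataset name as symbolics (all names here are only examples):

Code:
//RECEIVE  PROC VENDOR=,INDSN=
//*------------------------------------------------------------
//* Copy the file sent by &VENDOR into the next generation of
//* HLQ.RECEIVE.&VENDOR..FILE (GDG base assumed to exist)
//*------------------------------------------------------------
//COPY     EXEC PGM=IEBGENER
//SYSPRINT DD SYSOUT=*
//SYSIN    DD DUMMY
//SYSUT1   DD DISP=SHR,DSN=&INDSN
//SYSUT2   DD DSN=HLQ.RECEIVE.&VENDOR..FILE(+1),
//            DISP=(NEW,CATLG,DELETE),
//            LIKE=&INDSN
//         PEND
//*
//* called, for example, as:
//*STEP01   EXEC RECEIVE,VENDOR=VNDRNAME,
//*            INDSN='VNDRNAME.PROCESS1.XMLDATA.TEXT'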
I don't have access to the ZEKE manuals, but maybe you can look them up for "dataset triggering"; that should give you a starting point to work with. Alternatively, you can ask the vendor to help you out with dataset triggering since you are a licensed user. _________________ Kolusu
www.linkedin.com/in/kolusu
jim haire Beginner
Joined: 30 Dec 2002 Posts: 140 Topics: 40
Posted: Wed Feb 28, 2018 2:41 pm Post subject:
Thanks Kolusu! I think you and I have the same idea as far as the naming goes.
Once I determine the GDG name from the customer's file name, I run a step to see whether the GDG base already exists. If it does not, I create the base using IDCAMS; if it does, I skip that step.
A following step copies the customer's file to the (+1) generation of the GDG. I then delete the customer file so that the next transfer doesn't fail with a "file already exists" error.
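Roughly like this (a sketch only; the dataset names are placeholders and the GDG limits/options would follow our shop standards):

Code:
//*------------------------------------------------------------
//* Define the GDG base only if it is not already cataloged
//*------------------------------------------------------------
//GDGCHK   EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  LISTCAT ENTRIES('HLQ.RECEIVE.VNDRNAME.FILE')
  IF LASTCC > 0 THEN -
    DEFINE GDG(NAME('HLQ.RECEIVE.VNDRNAME.FILE') -
               LIMIT(30) NOEMPTY SCRATCH)
  SET MAXCC = 0
/*
//*------------------------------------------------------------
//* Copy the customer's file to the next generation
//*------------------------------------------------------------
//COPY     EXEC PGM=IEBGENER
//SYSPRINT DD SYSOUT=*
//SYSIN    DD DUMMY
//SYSUT1   DD DISP=SHR,DSN=VNDRNAME.PROCESS1.XMLDATA.TEXT
//SYSUT2   DD DSN=HLQ.RECEIVE.VNDRNAME.FILE(+1),
//            DISP=(NEW,CATLG,DELETE),
//            LIKE=VNDRNAME.PROCESS1.XMLDATA.TEXT
//*------------------------------------------------------------
//* Drop the customer's file so the next FTP does not fail;
//* COND=(0,NE) runs this only if the earlier steps got RC 0
//*------------------------------------------------------------
//DELFILE  EXEC PGM=IDCAMS,COND=(0,NE)
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  DELETE 'VNDRNAME.PROCESS1.XMLDATA.TEXT'
/*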
Thanks for all of your input!
bauer Intermediate
Joined: 10 Oct 2003 Posts: 315 Topics: 49 Location: Germany
Posted: Tue Mar 20, 2018 4:23 am Post subject:
jim haire,
Just in case you prefer COBOL and/or PL/I code for checking whether a dataset exists, you can use the IBM-provided catalog search interface routine named IGGCSI00.
Documentation is available in DFSMS Managing Catalogs, SC26-7409-05.
Kind regards,
bauer
kolusu Site Admin
Joined: 26 Nov 2002 Posts: 12376 Topics: 75 Location: San Jose
Posted: Tue Mar 20, 2018 11:03 am Post subject:
bauer,
The OP's intention is to kick off a process as soon as the vendor sends the file. With IGGCSI00, you would need the catalog search job always running (more like a started task), checking for the existence of the file. _________________ Kolusu
www.linkedin.com/in/kolusu