Posted: Tue Sep 27, 2005 3:23 am Post subject: FTP: Do's And Don'ts ?
Gents:
Are there any industry-wide (i.e., generally accepted) dos and don'ts when using FTP to send flat files from external firms to a receiving firm's MVS mainframe?
Assume these are mutually agreed on, industrial strength "push jobs" (fully automated at the sender site) daily interfaces from a large financial company to its commercial bank for example, using leased-lines.
Not that there are any "typical" MVS sites, but:
1: Do receiving MVS sites frown on the sender using FTP in general, for some reason (data integrity, perhaps)?
2: Do receiving MVS sites dislike incoming FTPs specifying their own JCL to run on their mainframe, perhaps preferring the incoming FTP to simply name a JCL member already residing on the receiver's host?
3: Do such MVS sites decline to provide their mainframe's IP address to the other party, providing the address of the receiver's intermediate file-server box instead?
4: Any other issues you can think of? There must be more!
Posted: Tue Sep 27, 2005 6:57 am Post subject:
Our site does not expose any internal IP addresses to the external world. Any incoming FTPs are sent to an intermediate server located within the external firewall (DMZ). All sensitive data is encrypted using an industry-standard protocol.
Posted: Tue Sep 27, 2005 3:58 pm Post subject:
I don't work in the real world, so take these comments with a grain of salt.

Rarely is a mainframe exposed directly to the outside network for services such as FTP, or even HTTP, that are sent in the clear. There are several methods of doing secure FTP (FTP with SSL through Communications Server, sftp a la ssh, and others), and those can be opened to the outside, but even then, traffic is often sent through a traffic cop like HOD's (Host On-Demand's) redirector, which forwards traffic and hides the back-end mainframe. Some sites will allow FTP, but only on a write-only basis: you can put data to a directory, but you can't list the directory or read from it.

I have no idea how sites deal with incoming JCL, but my guess is that it would be considered a gaping security hole waiting to happen. A much better approach would be a receiving task on the mainframe that validates incoming files and does with them whatever is necessary.

MVS is a very secure system in that you cannot get at data for which you are not authorized, and you cannot run programs for which you are not authorized. But much can be gleaned by looking at shared data such as which jobs are active, who is logged in, what file names exist, etc. So while that will not expose, say, customer account data, one may be able to determine things like suppliers' names or upcoming versions of products if data sets or IDs are named sloppily.

If you are developing a system that requires sending data to clients, I would definitely not depend on having FTP access to their system, and depend even less (less than zero) on being able to submit jobs. Some may let you do it, but it would be a lousy business plan to count on it.
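To make the "receiving task that validates incoming files" idea concrete, here's a minimal Python sketch. The record layout (`D,amount` / `C,amount`) and the debits-equals-credits rule are made-up assumptions for illustration; a real site would validate against its own agreed file spec.

```python
def validate(lines):
    """Validate an incoming flat file before batch uptake.

    Returns (ok, reason). Rejects empty files outright, rejects
    unrecognized records, and checks that debits balance credits.
    Record layout assumed here: 'D,<amount>' or 'C,<amount>'.
    """
    if not lines:
        return False, "empty file"
    debits = credits = 0
    for ln in lines:
        kind, _, amount = ln.partition(",")
        if kind == "D":
            debits += int(amount)
        elif kind == "C":
            credits += int(amount)
        else:
            return False, f"bad record: {ln}"
    if debits != credits:
        return False, "debits != credits"
    return True, "ok"
```

A task like this sits between the landing point and the internal systems, so nothing the sender pushes is trusted until it passes.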
I subscribe to the "hee-hoo" principle, as in "he who creates the data is he who sends the data", with the added caveat that no data file ever be allowed to be empty (a simple record stating "no data available" is fine). By forcing the partner that is generating the data to "push" their data through, this prevents the nightmare of having to initiate searches and directory lists and of relying on data always being ready when it is needed.
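The "never send an empty file" caveat can be sketched in a few lines of Python. This is only an illustration of the rule, using the standard `ftplib` module; the dataset name, the placeholder record text, and the already-connected `ftp` session are assumptions, not anyone's actual interface.

```python
import io

def build_payload(records):
    """Return the bytes to push: real data, or a one-record placeholder
    so the partner never receives an empty (or missing) file."""
    if records:
        body = "\n".join(records) + "\n"
    else:
        body = "NO DATA AVAILABLE\n"
    return body.encode("ascii")

def push_file(ftp, remote_name, records):
    """Push the daily file over an already-connected ftplib.FTP session.
    STOR overwrites an existing file of the same name on most servers."""
    ftp.storbinary(f"STOR {remote_name}", io.BytesIO(build_payload(records)))
```

The point is that the receiver's automation always has a file to act on, even on a no-business day.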
As has been previously mentioned, many shops do not allow unregulated outside access to their major corporate systems. Period. Many shops completely disable the FTP daemon on the mainframe servers. However, the TCP/IP addresses can be hidden with NAT tables.
I don't believe in employing a file server. First, it means the data is just sitting there, where it can be exploited. Second, it means double-passing all the data, which is inefficient and, of course, must be tracked and audited. I like the idea of a front-end interpreter: one that controls access to the internal systems from the outside world but acts as a pass-through device, never requiring the data to be stored anywhere but on the intended back-end system. I like Sterling Commerce's CONNECT:Enterprise Gateway for this, especially when you can use it to present an FTP front end to the outside world and then convert the data to an SNA 3770 data stream for passing back to the mainframe.
I also think that it is better to employ job schedulers, file directory watchers, or other automation tools, rather than allow remote users to submit jobs. However, if they were allowed to submit a job, then I would think that it would have to be a "pull" of a pre-defined local job, rather than one of their own design.
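A file-directory watcher of the kind mentioned above can be sketched very simply. This is a generic polling loop, not any particular product; the landing directory, the `*.dat` pattern, and the handler (which would kick off the pre-defined local job) are all illustrative assumptions.

```python
import time
from pathlib import Path

def new_arrivals(landing_dir, seen):
    """Return files in landing_dir that have not been processed yet,
    tracking processed names in the `seen` set."""
    arrivals = []
    for p in sorted(Path(landing_dir).glob("*.dat")):
        if p.name not in seen:
            seen.add(p.name)
            arrivals.append(p)
    return arrivals

def watch(landing_dir, handler, interval=60):
    """Poll the landing directory and hand each new file to `handler`,
    e.g. a routine that triggers the pre-defined local job."""
    seen = set()
    while True:
        for f in new_arrivals(landing_dir, seen):
            handler(f)
        time.sleep(interval)
```

The remote party never submits anything; the receiving site's own automation decides what runs and when.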
My goal is to have my external businesses be able to push files all the way through to my host, and, more importantly, be able to re-send a particular file if we find it non-compliant (debits not equal to credits, and the like). I like your idea of not leaving copies on any server, but automatically removing them after the host upload.
Since we require all our incoming file names to include the sender name and business date, I would imagine a send (and re-send with replace-if-file-already-exists) method should be doable. I just want a robust "file is new, or file is being replaced on host, so here it is" approach.
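A sender-name-plus-business-date naming convention like that might look as follows in Python. The exact layout (`SENDER.Dyyyymmdd.DAT`) is a made-up example, not your site's actual standard; the point is that the name alone identifies the file, so a re-send with the same name naturally replaces the earlier copy.

```python
import re
from datetime import date

# Hypothetical layout: sender, then 'D' plus the business date, then '.DAT'
NAME_RE = re.compile(r"^(?P<sender>[A-Z0-9]+)\.D(?P<busdate>\d{8})\.DAT$")

def build_name(sender, business_date):
    """Compose the agreed file name from sender and business date."""
    return f"{sender}.D{business_date:%Y%m%d}.DAT"

def parse_name(filename):
    """Recover (sender, business date) from an incoming file name,
    or None if the name does not follow the convention."""
    m = NAME_RE.match(filename)
    return (m.group("sender"), m.group("busdate")) if m else None
```

Because the name is deterministic, an FTP STOR of the corrected file lands on top of the faulty one, which gives you the "new or replaced, here it is" behavior without any directory searching.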
When we run our batch uptake of those files is none of our sender's business, and I don't want him doing any job submission or other crazy things. It's an internal end-of-day job-scheduler thing, transparent to the sender.
Could Conn-Direct Gway be the solution we're looking for?
By the way, hats off to you guys at MVSForums. This lively and thought-provoking discussion about industry best practices is what separates MVSF from all the other help boards out there!
PS: We simply send a "file non-compliant" email to the sender, who responds by re-sending the file after fixing it. We then re-edit the file on our host, expecting it to be in "repaired" condition before our next scheduled batch uptake.