We have a source system that will send files via MFT to a destination system.

Is there a best practice in terms of which part of the overall solution (source, MFT or destination) should handle archiving of the files?

E.g. if the source and destination can both archive the files, does MFT not need to?

Or does it depend on the criticality of the data? i.e. more critical files have archiving at every touch point during the transfer process, for clear auditing.


I think we’re going to need more info on your use case. In the end, when and where files are archived, saved, backed up, whatever is the decision of the stakeholder. What you use to complete the task is arbitrary.

(also, welcome to the club!)


Thanks for the reply and also the welcome!

It’s more of a general architectural question, but I can provide one use case that comes to mind.

We have an ODS with a table that contains sales data for each of our suppliers (supplier rebates). At the end of the month, a process runs in the ODS that pulls all of that month's sales data, performs some calculations, and produces a CSV file that sits on a shared drive. That CSV is picked up by our MFT solution (GoAnywhere) and then put onto a shared directory on the server that hosts our document storage solution (ColumbusDW).

So the flow is this: ODS → MFT → Columbus server

That document storage solution then loads the CSV file into its system, produces a document for each supplier containing all of their transactions for the month in question, and then emails it to them.

For me, the key part of this case is that the ODS has to do some heavy data processing to produce the CSV file, which makes me lean towards MFT archiving it. Also, the ODS currently does no archiving; it simply produces the CSV from the data and puts it in a location for MFT to access. However, some colleagues take the view that the source and destination are solely responsible for archiving and that MFT should not get involved.
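Just to make the "MFT archives on pickup" idea concrete: GoAnywhere has its own built-in save-to-archive options, so you wouldn't write this yourself, but the logic I have in mind is essentially "copy to a timestamped archive location first, then hand off". A minimal sketch (all paths and function names here are hypothetical, not GoAnywhere's API):

```python
import shutil
from datetime import datetime, timezone
from pathlib import Path

def archive_then_forward(src: Path, archive_dir: Path, dest_dir: Path) -> Path:
    """Copy src to a timestamped archive location, then move it on to dest_dir.

    Archiving happens *before* the hand-off, so the archived copy is exactly
    what the source produced, regardless of what the destination does later.
    """
    archive_dir.mkdir(parents=True, exist_ok=True)
    dest_dir.mkdir(parents=True, exist_ok=True)

    # Timestamped name so each monthly run keeps a distinct archive copy.
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    archived = archive_dir / f"{src.stem}_{stamp}{src.suffix}"

    shutil.copy2(src, archived)                  # archive copy first (keeps mtime)...
    shutil.move(str(src), dest_dir / src.name)   # ...then deliver to the destination
    return archived
```

The ordering is the whole point: if the copy fails, the file never leaves the pickup directory, so you can't end up with a delivered-but-unarchived transfer.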


While it’s possible the MFT could handle the archive, it seems more reasonable that Columbus handle it. I say this because somewhere between MFT and Columbus, changes can take place… but if the end-point (Columbus) handles the final stages of file use and THEN archives it as well, there’s little or no chance of modification before it’s archived.


Yes that seems reasonable, thank you for your help!


I manage a LOT of MFT/EDI file transfers. Simple cold hard truth: IF you want file backups available to yourself for whatever reason (audits, forensics, re-send requests), you’d better be storing them yourself.
