DataFlux: Output a Job Log When Executed from SAS® DI Studio
This SAS tip describes the steps required to output a DataFlux job log when a DataFlux job has been called and executed from SAS DI Studio.
Although this tip centres on DataFlux jobs executed from DI Studio, the code and principles can be applied to any SAS application that can connect to a DataFlux Integration Server or Data Management Server.
The SAS functions used in this document apply to SAS 9.3 and later releases. The same capabilities are available in SAS 9.2, with a slight change to each function name: the SAS 9.2 equivalents carry the DQSRV prefix in place of DMSRV (for example, DQSRVCOPYLOG rather than DMSRVCOPYLOG).
DI Studio: Basic DataFlux Job Transformation Structure
Below is an example of a typical DI Studio job transformation configured to run a DataFlux job named “DF_Job”:
Navigating to the Code tab will display the SAS DATA step generated from the selections made in the Job tab:
N.B. the above is a reformatted version of the SAS generated code.
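The generated code itself is not reproduced here, but a minimal sketch of the kind of DATA step DI Studio produces is shown below. The server host name, port number, and job file name are assumptions, and the exact argument lists should be verified against the SAS Data Quality Server function reference:

```sas
data _null_;
   length jobid $ 52;

   /* Submit the DataFlux job to the server.
      Host name and port (21036 is the common default) are assumptions. */
   jobid = dmsrvbatchjob('DF_Job.ddf', 'dfserver.example.com', 21036);

   /* Poll the server for the job's status.
      The polling interval and timeout arguments are assumptions. */
   status = dmsrvjobstatus(jobid, 'dfserver.example.com', 21036, 5, 0);

   put 'Job ID:     ' jobid;
   put 'Job status: ' status;
run;
```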
Adding DataFlux Logging
The last section showed the SAS code that is generated to execute a DataFlux job from DI Studio. Taking this a step further and outputting a job log, we can manually add the DMSRVCOPYLOG function to the DATA step:
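As a sketch of this addition (the host name, port, and log path are assumptions, and the argument order should be checked against the SAS Data Quality Server reference), the DATA step with the logging call might look like this:

```sas
data _null_;
   length jobid $ 52;

   /* Submit the DataFlux job (host and port are assumptions) */
   jobid = dmsrvbatchjob('DF_Job.ddf', 'dfserver.example.com', 21036);

   /* Copy the job's log from the server to a fixed location on the C: drive.
      Because the file name never changes, the log is overwritten on each run. */
   rc = dmsrvcopylog(jobid, 'dfserver.example.com', 21036, 'C:\DF_Job.log');

   if rc ne 0 then put 'WARNING: log copy failed, rc=' rc;
run;
```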
The above addition outputs a log for the “DF_Job” DataFlux job to the C: drive, on the server where the DataFlux Integration or Data Management Server is deployed.
Output a New Log for Each Run
In the last section we successfully added a function that outputs a log for our DataFlux job. However, that log is overwritten each time the job is executed. What if we needed a separate log for every run?
Taking the code from the last section a step further, we can easily achieve this with the addition of a SAS macro variable:
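A sketch of this approach is shown below, reusing the assumed server details from earlier. The B8601DT15. format renders the current datetime as yyyymmddThhmmss, which is safe to embed in a Windows file name because it contains no colons:

```sas
/* Build a unique log file name from the current datetime,
   e.g. C:\DF_Job_20240101T120000.log */
%let log_file = C:\DF_Job_%sysfunc(datetime(), b8601dt15.).log;

data _null_;
   length jobid $ 52;

   /* Submit the DataFlux job (host and port are assumptions) */
   jobid = dmsrvbatchjob('DF_Job.ddf', 'dfserver.example.com', 21036);

   /* Double quotes are required so that &log_file resolves */
   rc = dmsrvcopylog(jobid, 'dfserver.example.com', 21036, "&log_file");
run;
```

Because the macro variable is resolved once, when the %LET statement executes, each submission of the job stamps a fresh file name and no previous log is overwritten.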
The macro variable log_file will generate a filename based on the current datetime. For example:
| DateTime | Log File Generated |