NOTE / UPDATE: The data flow referenced in this article is no longer available, but the article is retained for legacy purposes. For operations such as exporting, it is recommended that you use the AnalyzeCli tool; see the AnalyzeCli article for information on how to obtain and use it: AnalyzeCli
The automatic export data flow provides functionality to export all data flows, or to export only those modified since a given date. The data flow was created in Data360 Analyze 3.6.3 and requires this version or newer to run. It leverages internal product APIs, and the functionality or syntax of these APIs may change over time.
The data flow has properties to control the frequency of exports. It is recommended to run this data flow on a scheduler and to match the export frequency to the execution frequency, for example both daily or both weekly.
Setup
To set up the data flow, import the LNA file and specify the Data Flow properties:
The AnalyzeUsername and AnalyzePassword properties must correspond to an admin account in order for the export node to work.
Setting the Modified since X days property to -1 triggers an export of all data flows. This is recommended for the first execution of the data flow, and again after a product upgrade, so that you receive a new backup of each data flow from the newer product version. If a positive value is provided, only data flows that have been modified within that many days are exported.
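The "Modified since X days" semantics described above can be sketched as follows. This is an illustrative helper, not the data flow's internal implementation; the function name and signature are assumptions.

```python
from datetime import datetime, timedelta

def should_export(last_modified: datetime, modified_since_days: int,
                  now: datetime) -> bool:
    """Decide whether a data flow qualifies for export.

    -1 means export everything; a positive value exports only flows
    modified within that many days of 'now'.
    """
    if modified_since_days == -1:
        return True
    cutoff = now - timedelta(days=modified_since_days)
    return last_modified >= cutoff

# With the property set to 7, a flow touched 3 days ago is exported,
# one touched 30 days ago is not; -1 exports both.
now = datetime(2023, 6, 15)
print(should_export(now - timedelta(days=3), 7, now))    # True
print(should_export(now - timedelta(days=30), 7, now))   # False
print(should_export(now - timedelta(days=30), -1, now))  # True
```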
The Output Directory property sets the directory in which to place the exported data flows. The directory structure within this location mimics your directory structure within Data360 Analyze. In addition, a directory is created for each data flow, containing a history of its exports.
Each individual export contains its prerequisites, such as library nodes, so that each export can be imported in full. Importing an older export will overwrite newer versions of shared items. Depending on your data flow or library node setup, this may have an impact on other data flows that rely on the shared item.
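Because importing an older export overwrites newer versions of shared items, it is safest to import the most recent file from a flow's history folder. A minimal sketch, assuming the timestamped naming shown earlier so that lexicographic order matches chronological order:

```python
import os
import tempfile
from typing import Optional

def latest_export(history_dir: str) -> Optional[str]:
    """Return the newest .lna export in a flow's history directory,
    relying on timestamped names sorting chronologically."""
    exports = sorted(f for f in os.listdir(history_dir)
                     if f.endswith(".lna"))
    return os.path.join(history_dir, exports[-1]) if exports else None

# Demo with a throwaway history directory:
with tempfile.TemporaryDirectory() as d:
    for name in ["Revenue_20230101_000000.lna",
                 "Revenue_20230615_093000.lna"]:
        open(os.path.join(d, name), "w").close()
    print(os.path.basename(latest_export(d)))  # Revenue_20230615_093000.lna
```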
Comments
Thanks for the quick response.
I had an error with this flow.
In the "Export data Flow" part:
Can you help me?
I will check it out with the admin.
Thanks for the help Gerard.
Hi Gerard,
An admin user had the same problem as I did.
Can you tell me why it didn't work?
That error occurs when the username/password combination provided in the data flow properties is correct but the account isn't an admin and so can't export all data flows. Switching the credentials to an admin account will correct the error. The flow doesn't have a built-in check of the user's privileges, which is why the error surfaces this way. I have just updated the article to note that the user properties must correspond to an admin account.
Hi Gerard,
Is this usable for D360 server version?
The error is a permission issue on one of the data flows being exported, although an admin account should have access to every data flow. Were all nodes cleared and rerun after switching the username and password? The first API login node stores the login details, so if it wasn't rerun after switching users, it would still hold the previous credentials.
Yes, it was created and tested using the Server edition.
Yes, some data flows from other users weren't accessible to the admin, and that caused the problem. It's working now.
Thanks again for your help Gerard.