About looping and aggregating across rows
1. Looping:
Is there a way to read in files one by one and manipulate each of them? I want this to be dynamic because the number of files is unknown.
2. Aggregating across rows:
Is there a way to perform this special kind of summation?
Example (Sum = previous record's ColB + current record's ColA):

Type    ColA  ColB  Sum
-----------------------
Apple     1     2    /
Orange    3     4    5
Banana    5     8    9
Many thanks.
-
You can perform looping using the Do While and Do While Conditional nodes, which are in the Logistics section of the node library. See the Help documentation for your system and search for "loop" to find the relevant topics.
The Help is also available online for the latest version of the product:
https://d3sa-preview.infogixsaas.com/docs/dist/help/Default.htm
There is an example data flow shipped with the installed software that you can import into your system. It is located in the following directory:
<\Data3Sixty Analyze Install Dir>/docs/samples/nodes
e.g. for a default Windows Desktop installation:
C:\Program Files\Data3SixtyAnalyze\docs\samples\nodes
Alternatively, if you have sufficient Python programming skills, you could leverage Python scripting within a Transform node or the Python node.
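As a rough illustration of the looping idea in plain Python (this is not the node scripting API, just a sketch assuming the files sit in a single directory and the path pattern is hypothetical):

```python
import csv
import glob

# Hypothetical input location; adjust the pattern to match your files.
INPUT_PATTERN = "C:/data/incoming/*.csv"

def process_file(path):
    """Placeholder per-file manipulation: here we simply count the rows."""
    with open(path, newline="") as f:
        return sum(1 for _ in csv.reader(f))

# The number of files is unknown at design time;
# glob discovers whatever is present at run time.
for path in glob.glob(INPUT_PATTERN):
    row_count = process_file(path)
    print(f"{path}: {row_count} rows")
```

The same discover-then-iterate pattern is what the Do While Conditional node gives you at the data flow level.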
Re. your second question, I'm not sure I understand what you are trying to achieve. Maybe you could consider providing an equivalent set of calculations in an Excel spreadsheet that would produce the same result.
-
Hi Ron,
For aggregating across rows, the attached data flow contains an example of this in the first pair of nodes. The example uses the Transform node to initialize a value in the ConfigureFields section, which is then used in the ProcessRecords section. The last thing it does when processing each record is save the value that is needed for the next record.

For reading in files one by one, the second pair of nodes in the data flow demonstrates how to do that, using the Do While Conditional node to process one file at a time.
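For anyone without access to the attachment, here is a minimal plain-Python sketch of the same carry-a-value-forward pattern (not the actual Transform node syntax; the field names come from the question, the rest is illustrative):

```python
# Rows as they would arrive, one record at a time.
rows = [
    {"Type": "Apple",  "ColA": 1, "ColB": 2},
    {"Type": "Orange", "ColA": 3, "ColB": 4},
    {"Type": "Banana", "ColA": 5, "ColB": 8},
]

prev_colb = None  # initialized once, like the ConfigureFields step

for row in rows:  # handled per record, like the ProcessRecords step
    # Sum = previous record's ColB + current record's ColA; first row has no previous.
    row["Sum"] = None if prev_colb is None else prev_colb + row["ColA"]
    prev_colb = row["ColB"]  # save the value needed by the next record
    print(row)

# Produces Sum = None, 5, 9 — matching the example table in the question.
```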