Tableau + Data360 Analyze: Better Together
Data360 Analyze allows Tableau users to:
- Blend more complex data sources
- Perform more advanced analytical routines
This article walks through one example where Data360 Analyze is used to assess and manage complex data-preparation rules, complementing the power of a Tableau dashboard.
Download the Example Files
The attachment at the bottom of this article contains the examples referenced including the data.
Download the file and extract to the location of your choosing. Once the files have been extracted, continue with the instructions below.
Importing the Example File
Import the example data flow (*.lna) by following the instructions in the section titled "Importing data flows and library nodes" within the "Sharing Data Flows and Library Nodes" help section.
To run the examples:
- You will need to update the "InputDirectory" and "OutputDirectory" properties to point to the folders containing the extracted input and output files; the paths will depend on where you saved the attachment.
- If you have access to the Market Basket Analysis node used in this example, first run the R Library Package Download node, found in the Logistics category, to ensure the relevant R libraries are downloaded (you will need to accept the license agreement).
- The example Tableau dashboard was developed in Tableau 10.1. You will need a compatible version and valid license key.
Example: Transaction Analysis
In this example, we want to display our sales transactions in a Tableau dashboard:
The data required for this dashboard involves some of the following complexities:
- There are four (4) data sources of different file types that we will need to read in: CSV, tab-delimited, XML and JSON.
Data360 Analyze has native connectors for these file types.
- Sales transactions are multi-currency: USD, GBP and EUR. The purchased amounts in EUR use a comma to denote decimals whereas amounts in GBP and USD use a period. In order to have consistency in the data, we will need to replace the comma with a period.
This is easily done by applying the replace() function.
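The decimal-separator rule can be sketched in plain Python as follows. The field values and function name are illustrative; inside Data360 Analyze the equivalent logic is a replace() call in a node's script.

```python
def normalize_decimal(amount, currency):
    """Replace the comma decimal separator in EUR amounts with a period.

    GBP and USD amounts already use a period, so they pass through unchanged.
    """
    if currency == "EUR":
        return amount.replace(",", ".")
    return amount

# EUR amount "1299,50" becomes "1299.50"; GBP "49.99" is unchanged
print(normalize_decimal("1299,50", "EUR"))
print(normalize_decimal("49.99", "GBP"))
```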
- Further, to ensure that total purchase amounts can be summed appropriately, we will need to convert non-USD transactions into USD using the relevant currency exchange rate on the transaction date.
We can use the Lookup node to reference the Historical Currency Rates data and then apply a simple multiplication formula to apply the currency exchange rate to convert non-USD transactions to USD.
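The lookup-and-multiply step can be sketched in Python like this. The rates and the (date, currency) key structure are hypothetical stand-ins for the Historical Currency Rates data used in the data flow.

```python
# Hypothetical USD exchange rates keyed by (transaction date, currency)
usd_rates = {
    ("2016-11-01", "GBP"): 1.22,
    ("2016-11-01", "EUR"): 1.10,
}

def to_usd(amount, currency, date):
    """Convert a transaction amount to USD using the rate for that date."""
    if currency == "USD":
        return amount
    return round(amount * usd_rates[(date, currency)], 2)

# A GBP 100.00 transaction on 2016-11-01 converts to USD 122.00
print(to_usd(100.0, "GBP", "2016-11-01"))
```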
- Transaction data need to be joined to the customer data on a customer identifier - we will need to output any transactions that don't have a customer identifier for further investigation.
Using the X-Ref node to join the Transactions data to the Customer data will automatically give us the transactions where no customer record was matched. These records can be easily output to Excel using the Output Excel node. There are many other output nodes in Data360 Analyze that can be used.
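In plain Python, the X-Ref behaviour of separating matched and unmatched records looks roughly like the sketch below. The record layouts and identifiers are invented for illustration only.

```python
# Hypothetical customer reference data keyed by customer identifier
customers = {"C001": {"name": "Acme Ltd"}, "C002": {"name": "Globex"}}

transactions = [
    {"txn_id": 1, "customer_id": "C001", "amount": 120.0},
    {"txn_id": 2, "customer_id": None, "amount": 35.5},
    {"txn_id": 3, "customer_id": "C999", "amount": 80.0},
]

matched, unmatched = [], []
for txn in transactions:
    cust = customers.get(txn["customer_id"])
    if cust:
        # Enrich the transaction with the customer record
        matched.append({**txn, **cust})
    else:
        # Route for further investigation (the article writes these to Excel)
        unmatched.append(txn)
```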
- The customer data includes the postal code/zip code, which will be needed for mapping the sales transactions in Tableau. For customers in the UK the full postal code is provided; however, Tableau only recognizes the first segment of the five- to seven-character UK postcode, so we will need to extract that segment. (Tableau: Prepare Your Geographic Fields).
We can use the split() function to divide the UK postal codes by the space character and then use the getItem() function to extract the first segment.
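The postcode-trimming rule, sketched in Python; the split on the space character and the index access mirror Data360 Analyze's split() and getItem() functions. The function name is our own.

```python
def uk_postcode_area(postcode):
    """Return the first (outward) segment of a UK postcode.

    For example, "SW1A 1AA" yields "SW1A", which Tableau can map.
    """
    return postcode.split(" ")[0]

print(uk_postcode_area("SW1A 1AA"))
```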
- Transaction data will need to be enriched with the Product data as well so that we have more descriptive product references instead of just an ID number.
This is easily done using a Lookup node.
- Market basket analysis is to be performed on the purchased products to identify products that are frequently purchased together.
For those with the Advanced Analytics nodes enabled, there is a Market Basket Analysis node that can be used to perform this analysis.
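For readers without the Advanced Analytics nodes, the core idea behind a market basket analysis can be sketched in Python: count how often each pair of products appears in the same transaction. The baskets below are invented examples, and a full analysis would also compute support, confidence and lift for each pair.

```python
from itertools import combinations
from collections import Counter

# Hypothetical baskets: each set holds the products in one transaction
baskets = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"bread", "milk"},
]

# Count co-occurrences of every product pair across all baskets
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most common pairs are the "purchased together" candidates
for pair, count in pair_counts.most_common():
    print(pair, count)
```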
The explanatory notes beneath each point above describe how these complexities can be managed within Data360 Analyze - step through the data flow and follow both the logic and the data at each step. Comments within the logic provide additional guidance.
Please comment below if you have any questions!