There are a number of ways to clear disk space using the attached tools and the instructions provided in this article. The tools were developed by members of our Professional Services team and are not part of the Analyze product itself.
This article is divided into four sections:
A) HOW TO CLEAR SPACE VIA THE USER INTERFACE
B) HOW TO CLEAR SPACE VIA THE EXECUTIONS FOLDER
C) HOW TO MONITOR DISK SPACE AND USAGE
D) HOW TO DELETE OLD SERVER & TOMCAT LOG FILES
A) HOW TO CLEAR SPACE VIA THE USER INTERFACE:
There are a couple of ways to clear space from "/opt/Data3SixtyAnalyze/data-7731/executions" via the user interface:
1. In Data360 Analyze, select a data flow from the directory and then choose "Delete Run Data" from the right-hand menu.
Note that this must be done on a per-data flow basis.
2. If you are scheduling data flows, you should also configure the settings on this screen, as scheduled runs can fill up the disk rather quickly.
If you change those settings (particularly from retaining everything to something more reasonable), the system will automatically purge run data down to the limits you set. This does not happen immediately; expect it to take an hour or two.
Choose retention limits that suit your environment.
Note: The temp files for a currently running data flow may still be being used by that data flow, so please ensure the data flow has completed before archiving or deleting them.
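Before clearing run data, it can be useful to see how much space it is actually consuming. A minimal shell sketch, assuming the default Linux data directory path shown above (adjust `EXEC_DIR` for your installation):

```shell
#!/bin/sh
# Sketch: report how much disk space the execution run data consumes.
# EXEC_DIR is an assumption based on the default Linux install path;
# adjust it to match your own data directory.
EXEC_DIR="/opt/Data3SixtyAnalyze/data-7731/executions"

# Total size of all execution data
du -sh "$EXEC_DIR"

# Per-user breakdown, largest first (sizes in KB)
du -sk "$EXEC_DIR"/* 2>/dev/null | sort -rn
```

This only reports usage; it deletes nothing, so it is safe to run while data flows are executing.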
B) HOW TO CLEAR SPACE VIA THE EXECUTIONS FOLDER:
You can also delete files directly from <dataDirectory>/data-7731/executions/<username>, for example: C:\Users\<username>\Data3SixtyAnalyze\data-7731\executions\<AnalyzeUsername>
It is safe to delete these files once the data flow that created them has finished running, provided you no longer need them to troubleshoot the run.
Note: Please DO NOT DELETE any items in the folder named "cache"!
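The cleanup above can be sketched from a shell on a Linux server. This is an illustration only, not a supported script: the data directory path, the username, and the 30-day retention period are all assumptions you should adjust, and it must only be run after the relevant data flows have completed. Note how the "cache" folder is explicitly excluded.

```shell
#!/bin/sh
# Sketch: delete execution run folders older than a retention period for one
# Analyze user, skipping the "cache" folder. EXEC_DIR, the username, and DAYS
# are assumptions; adjust them for your installation.
EXEC_DIR="/opt/Data3SixtyAnalyze/data-7731/executions"
USER_DIR="$EXEC_DIR/someAnalyzeUser"   # hypothetical Analyze username
DAYS=30

# Only top-level run folders, never "cache", older than $DAYS days
find "$USER_DIR" -mindepth 1 -maxdepth 1 -type d \
     ! -name cache -mtime +"$DAYS" \
     -exec rm -rf {} +
```

Consider running the `find` without the `-exec rm -rf {} +` part first, to review what would be deleted.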
C) HOW TO MONITOR DISK SPACE AND USAGE:
You can also import the attached example data flow called Disk Usage to monitor disk space. Feel free to modify it per your own use and system configuration (you can then schedule it to run frequently, e.g. weekly or daily). It will tell you how much space is remaining on your disk, and how space is being used.
Note, the example data flow expects the following folder structure to be used for the temporary data files:
If your execution temporary data folder structure does not contain the 'servername' element, you should modify the 'Filter By Type, Add Months' Transform node ProcessRecords script as follows:
Note you should also review the set of exclusions in the 'Ignore System Folders' Transform node's ProcessRecords script to match the folder structure used by your system.
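If you want a lightweight check alongside the attached data flow, remaining disk space can also be monitored from a shell. This is a sketch under assumptions: the data directory path and the 80% warning threshold are placeholders to adjust for your system.

```shell
#!/bin/sh
# Sketch: report remaining space on the filesystem holding the data directory
# and warn when usage passes a threshold. DATA_DIR and THRESHOLD are
# assumptions; adjust them for your system.
DATA_DIR="/opt/Data3SixtyAnalyze"
THRESHOLD=80   # warn when the filesystem is more than 80% full

# Percentage used on the filesystem holding the data directory
USED=$(df -P "$DATA_DIR" | awk 'NR==2 {gsub("%","",$5); print $5}')

echo "Filesystem holding $DATA_DIR is ${USED}% full"
if [ "$USED" -gt "$THRESHOLD" ]; then
    echo "WARNING: disk usage above ${THRESHOLD}%" >&2
fi
```

Like the attached data flow, this could be scheduled (e.g. via cron) to run daily or weekly.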
D) HOW TO DELETE OLD SERVER & TOMCAT LOG FILES:
From time to time, you may wish to delete old Server and Tomcat log files to save disk space.
Attached is a script called `deleteTomcatLogs.sh` that can be used to clean up the Tomcat log files. The application admin can schedule this script.
Alternatively, admins can schedule the attached data flow, called D3SA_Delete_Old_Server_Logs. This data flow is functionally similar to `deleteTomcatLogs.sh`, but it deletes both Server and Tomcat log files.
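The attached script and data flow are the supported options. As an illustration of the kind of cleanup they perform, a minimal sketch might look like the following; the log directory and the 14-day retention period are assumptions, so check where your installation actually writes its logs before adapting it.

```shell
#!/bin/sh
# Sketch of the kind of cleanup deleteTomcatLogs.sh performs: remove rotated
# log files older than a retention period. LOG_DIR and DAYS are assumptions;
# verify them against your installation before use.
LOG_DIR="/opt/Data3SixtyAnalyze/logs"
DAYS=14

# Delete only plain files matching the log naming pattern, never directories
find "$LOG_DIR" -type f -name "*.log*" -mtime +"$DAYS" -delete
```
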