We wanted to put some useful tools and training together in one place to help new and existing customers manage their systems and, hopefully, learn some new things. We recommend running some or all of the dataflows regularly in the scheduler; they are designed to give you various details about your system so that you can ensure it stays in good health.
THE DATAFLOWS
By clicking on the links below, you'll be able to access the dataflows and learn more about what they do. Each one will need to be edited to set up the required credentials – usually admin/welcome – and any email addresses you want to use. Some of them don't need schedules, but we recommend scheduling the first three:
D3SA_Check_Backup_Status – schedule this to run around 3:00am daily. It will email you if it finds a failed backup in the logs. For further reading, see Check Backup Status
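To give a rough idea of what this kind of check involves, here is a minimal Python sketch that scans a log file for failure entries and emails an alert. It is not the dataflow itself: the log path, the failure marker, and the SMTP details are illustrative assumptions you would replace with your own.

```python
# Illustrative sketch only -- the real check is done by the D3SA_Check_Backup_Status
# dataflow. The log path, failure marker, and SMTP settings below are assumptions.
import smtplib
from email.message import EmailMessage
from pathlib import Path

LOG_FILE = Path("/opt/analyze/logs/backup.log")  # hypothetical log location
FAILURE_MARKER = "BACKUP FAILED"                 # hypothetical failure text
SMTP_HOST = "smtp.example.com"                   # your mail relay
ALERT_ADDRESS = "admin@example.com"              # who should be notified

def check_backup_log() -> None:
    """Scan the backup log and email an alert if any failures are found."""
    failures = [line.strip()
                for line in LOG_FILE.read_text().splitlines()
                if FAILURE_MARKER in line]
    if not failures:
        return  # nothing to report

    msg = EmailMessage()
    msg["Subject"] = "Analyze backup failure detected"
    msg["From"] = ALERT_ADDRESS
    msg["To"] = ALERT_ADDRESS
    msg.set_content("The backup log contains failures:\n\n" + "\n".join(failures))

    with smtplib.SMTP(SMTP_HOST) as smtp:
        smtp.send_message(msg)

if __name__ == "__main__":
    check_backup_log()
```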
D3SA_Export_All_Dataflows – schedule this to run around 1:00am daily. On success, it exports every dataflow, placing each one individually into a folder of your choosing on the server. This gives you a known good backup (in case the system backup fails), and you can also grab one of the LNA files and copy it to another server. That gives you a degree of 'source code control' and the ability to promote a dataflow from DEV to PROD. For further reading, see Export All Dataflows
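If you want dated copies of those exports for retention or promotion between environments, a small script run after the export finishes can do it. The sketch below is only an illustration: the export and archive folder paths are assumptions, not the dataflow's actual configuration.

```python
# Illustrative sketch only -- D3SA_Export_All_Dataflows handles the export itself.
# This shows one way to keep dated copies of the exported LNA files afterwards.
import shutil
from datetime import date
from pathlib import Path

EXPORT_DIR = Path("/opt/analyze/exports")  # hypothetical folder the dataflow exports into
ARCHIVE_DIR = Path("/backups/analyze")     # hypothetical folder for dated copies

def archive_exports() -> Path:
    """Copy today's exported .lna files into a dated archive folder."""
    dated_dir = ARCHIVE_DIR / date.today().isoformat()
    dated_dir.mkdir(parents=True, exist_ok=True)
    for lna in EXPORT_DIR.glob("*.lna"):
        shutil.copy2(lna, dated_dir / lna.name)  # copy2 preserves timestamps
    return dated_dir

if __name__ == "__main__":
    print(f"Exports archived to {archive_exports()}")
```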
D3SA_Check_Disk_Usage – run this at least once per week to keep an eye on your temp disk space usage. For further reading, see Disk Usage
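As a rough illustration of the idea behind this check, the Python sketch below reports how full the disk holding a temp directory is and warns above a threshold. The temp path and the 80% threshold are assumptions for illustration only.

```python
# Illustrative sketch only -- the D3SA_Check_Disk_Usage dataflow does the real check.
import shutil

TEMP_DIR = "/opt/analyze/tmp"  # hypothetical Analyze temp location
THRESHOLD = 0.80               # warn when more than 80% of the disk is used

def check_temp_usage() -> None:
    """Report how full the disk holding the temp directory is."""
    usage = shutil.disk_usage(TEMP_DIR)
    used_fraction = usage.used / usage.total
    print(f"{TEMP_DIR}: {used_fraction:.0%} used "
          f"({usage.free / 2**30:.1f} GiB free of {usage.total / 2**30:.1f} GiB)")
    if used_fraction > THRESHOLD:
        print("WARNING: temp disk usage is above the threshold – consider cleaning up")

if __name__ == "__main__":
    check_temp_usage()
```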
Other system tools include:
D3SA List Users – For further reading, see List Users
LDAP Tools – For further reading, see LDAP Tools
Check_Schedule_Status – you won't normally need this, but it contains some code you can enable and run if a schedule ever gets stuck. For further reading, and for a dataflow that stops a schedule, see Schedule Status and Stop
Disk Free node – this node reports information on your system's disk resources (a rough sketch of the kind of report it produces follows the steps below).
To use this node, you'll need to:
- download it from attachments in this article and import it as you would any other dataflow or node
- create a new dataflow
- click Edit Search Paths and navigate to the folder the node was imported into
- run the node
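For a sense of the kind of per-disk report this produces, here is a minimal Python sketch that prints total, used, and free space for a few mount points. The node itself gathers this inside Analyze; the mount points listed here are assumptions you would adjust for your own server.

```python
# Illustrative sketch only -- the Disk Free node gathers this information inside Analyze.
import shutil

MOUNT_POINTS = ["/", "/opt/analyze", "/tmp"]  # hypothetical paths – adjust for your server

def report_disk_free() -> None:
    """Print total, used, and free space for each mount point of interest."""
    print(f"{'Mount':<15}{'Total GiB':>12}{'Used GiB':>12}{'Free GiB':>12}")
    for mount in MOUNT_POINTS:
        usage = shutil.disk_usage(mount)
        print(f"{mount:<15}"
              f"{usage.total / 2**30:>12.1f}"
              f"{usage.used / 2**30:>12.1f}"
              f"{usage.free / 2**30:>12.1f}")

if __name__ == "__main__":
    report_disk_free()
```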
SELF-SERVICE TRAINING
We recommend you try out our articles and videos on the following topics:
Beginners Guide (for brand new Analyze users)
Run Properties and Run Property Sets
FURTHER READING
Help section on System Administration
Help section on SSL Configuration
Help section on automatically purging old data (Cleanup Configuration)
Technical & Architectural FAQs
We'd love to hear your thoughts on this article and its contents. If you have a use case that you'd like to see demonstrated in a video or article, please let us know by posting in the comments below, or email me directly at ccostello@infogix.com