
DQ+ Analysis custom sizing settings

Comments (1)

  • Stephen Collins

    What would be useful is to actually work through an example and put that in the article.

    Reading the following from the article is just a headache:

    "A good rule of thumb is to take the uncompressed data size (compressed size multiplied by about 10) in MB and divide that up 1.25GB if caching is on and 2GB if caching is off to get the number of cores needed to process. You would then count up the number of complex nodes. Assuming about 5 complex operations per core, you would divide the number of complex operations by 5. If the value is greater than 1, multiply the number of cores calculated earlier by the value here to calculate a new values for number of cores. Take the memory per core, add the overhead % and multiple that by number of cores for total memory needed."

     

    Whilst I'm sure the information is accurate, it is not user friendly. A worked example like the one sketched below would help.

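For what it's worth, here is one way the rule of thumb quoted above could be worked through in code. This is a minimal sketch based on my reading of the article's wording: the function name, the 1024 MB-per-GB conversion, the 20% overhead figure, and the sample inputs (500 MB of compressed data, 12 complex nodes) are all illustrative assumptions, not values from the article.

```python
import math

def estimate_sizing(compressed_mb, complex_nodes, caching_on=True, overhead_pct=20):
    """Rough core/memory estimate per the article's rule of thumb (assumed reading)."""
    # Step 1: uncompressed size is roughly the compressed size x 10.
    uncompressed_mb = compressed_mb * 10

    # Step 2: divide by memory per core: 1.25 GB with caching on, 2 GB with it off.
    mem_per_core_mb = 1.25 * 1024 if caching_on else 2 * 1024
    cores = math.ceil(uncompressed_mb / mem_per_core_mb)

    # Step 3: assume ~5 complex operations per core; if the ratio of
    # complex nodes to 5 exceeds 1, scale the core count up by that ratio.
    complexity_factor = complex_nodes / 5
    if complexity_factor > 1:
        cores = math.ceil(cores * complexity_factor)

    # Step 4: memory per core plus the overhead %, times the number of cores.
    total_mem_mb = mem_per_core_mb * (1 + overhead_pct / 100) * cores
    return cores, total_mem_mb

# Example (all numbers assumed): 500 MB compressed, 12 complex nodes, caching on.
cores, mem_mb = estimate_sizing(500, 12, caching_on=True, overhead_pct=20)
print(f"{cores} cores, ~{mem_mb / 1024:.1f} GB total memory")
```

Walking through those assumed inputs: 500 MB compressed becomes roughly 5,000 MB uncompressed; 5,000 / 1,280 ≈ 3.9, rounded up to 4 cores; 12 complex nodes / 5 = 2.4, which is greater than 1, so 4 × 2.4 rounds up to 10 cores; total memory is then 1,280 MB × 1.2 × 10 ≈ 15 GB.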
