Performance concerns center on the initial population of the statistical data, regular updates of the data during polling, and recalculation after any database maintenance (data deletion and summarization).
This could affect how historical data is written to the history tables. If continual updates become a performance issue, updates should instead run only on an interval basis (for example, as part of database maintenance), as sketched below.
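As a rough illustration of the interval-based approach, the sketch below recomputes per-node baseline statistics in one batched pass rather than on every poll. The table and column names (CPULoadHistory, BaselineStats, NodeID) are hypothetical and do not reflect the actual NPM schema.

```python
import sqlite3
from statistics import mean, pstdev

def recalculate_baselines(conn: sqlite3.Connection) -> None:
    """Batched recalculation of per-node baseline statistics.

    Intended to run on a fixed interval (e.g. during nightly database
    maintenance) instead of updating running statistics on every poll.
    """
    # Hypothetical history table: one raw sample per poll.
    rows = conn.execute("SELECT NodeID, CPULoad FROM CPULoadHistory").fetchall()

    # Group samples by node.
    samples: dict[int, list[float]] = {}
    for node_id, cpu_load in rows:
        samples.setdefault(node_id, []).append(cpu_load)

    # Write one summary row per node for this statistic.
    for node_id, values in samples.items():
        conn.execute(
            "INSERT OR REPLACE INTO BaselineStats (NodeID, Statistic, Mean, StdDev) "
            "VALUES (?, 'CPULoad', ?, ?)",
            (node_id, mean(values), pstdev(values)),
        )
    conn.commit()
```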
Data storage should be minimal, as there will be one entry per net object, per statistic, per time frame. For example, 1,000 nodes × 1 statistic (CPU Load) × 2 time frames = 2,000 rows.
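The row count follows directly from those three factors; a minimal sketch of the estimate (function name is illustrative):

```python
def baseline_row_count(net_objects: int, statistics: int, time_frames: int) -> int:
    """One summary row per (net object, statistic, time frame) combination."""
    return net_objects * statistics * time_frames

# Example from the text: 1,000 nodes, CPU Load only, 2 time frames.
assert baseline_row_count(1000, 1, 2) == 2000
```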
Is the statistical data accurate? Is my data "normal"? These questions are difficult to answer because each environment differs per net object. Instead, we can provide a "normality" test for the data, given the available number of records, the valid range of values for the given statistic, and the current standard deviation (http://en.wikipedia.org/wiki/Normality_test).
The normality test will continually evaluate whether the data points follow the 68-95-99.7 rule and report to NPM how many points in the given data set would have triggered a 3 standard deviation or a 4 standard deviation alert or threshold.
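A minimal sketch of that 3 and 4 standard deviation counting, assuming the data points for one net object and statistic are already loaded into a list (function and key names are illustrative):

```python
from statistics import mean, pstdev

def sigma_trigger_counts(data: list[float]) -> dict[str, int]:
    """Count points outside 3 and 4 standard deviations of the mean,
    i.e. the tails implied by the 68-95-99.7 rule."""
    mu = mean(data)
    sigma = pstdev(data)
    counts = {"3_sigma": 0, "4_sigma": 0}
    for x in data:
        deviation = abs(x - mu)
        if sigma and deviation > 3 * sigma:
            counts["3_sigma"] += 1
        if sigma and deviation > 4 * sigma:
            counts["4_sigma"] += 1
    return counts

# Example: CPU load samples with one obvious outlier.
print(sigma_trigger_counts([12.0, 14.5, 13.2, 11.8, 12.9, 95.0]))
```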