
Interface baseline calculation frequency implications on system performance

Performance Constraints

Performance issues and concerns center on three operations: the initial population of the statistical data, regular updates of the data during polling, and recalculation after any database maintenance (data deletion and summarization).

This can affect the storage of historical data in the history tables. If continual updates become a performance issue, adopt a method of running updates only on an interval basis (for example, as part of database maintenance).

Data storage should be minimal, as there will be one entry per net object, per statistic, per time frame. For example, 1000 nodes, one statistic (CPU Load), and 2 time frames yields 2000 rows.
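The row-count arithmetic above can be sketched as a simple product; the function name and parameters here are illustrative, not part of the NPM schema:

```python
# Estimate baseline-table row counts: one row per net object,
# per statistic, per time frame. (Hypothetical helper for sizing
# estimates only -- not an actual NPM API.)
def baseline_rows(net_objects: int, statistics: int, time_frames: int) -> int:
    return net_objects * statistics * time_frames

# The example from the text: 1000 nodes, one statistic (CPU Load),
# two time frames.
print(baseline_rows(1000, 1, 2))  # 2000
```

Scaling any one factor scales storage linearly, so adding a second statistic for the same 1000 nodes would double the row count.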

Is the statistical data accurate? Is my data "normal"? These questions are difficult to answer because each environment differs per net object. We can, however, provide a normality test (http://en.wikipedia.org/wiki/Normality_test) for the data, given the available number of records, the valid range for the given value, and the current standard deviation.
The normality test continually checks whether the data points follow the 68-95-99.7 rule and reports to NPM how many times a point in the given data set would have triggered a 3-standard-deviation or 4-standard-deviation alert or threshold.

Last modified
20:30, 6 Oct 2015
