Hi all,
I am currently analyzing CPU usage for two applications and comparing how linearly their CPU usage increases as load increases. I plot the per-minute average on a chart, and I thought it would be good to add a linear trendline to assess linearity.
Now at 0 load the CPU usage is 0 for both applications.
At a load of 1 user/sec, the CPU usage of the first application is 0 and that of the other is 3.6.
At 40 users/sec ....
I set the y-intercept of the trendline to 0, but for some reason the line still does not pass through 0. Can someone help me with this, please? Attached is an image of the actual graph.
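For what it's worth, here is a rough sketch of how I would compute a zero-intercept fit outside the charting tool, just to check what slope the constrained trendline should have. The load and CPU values below are placeholders, not my actual measurements:

```python
import numpy as np

# Placeholder load (users/sec) and average CPU usage (%) values,
# standing in for the measurements described above.
load = np.array([0.0, 1.0, 10.0, 20.0, 40.0])
cpu = np.array([0.0, 3.6, 9.0, 18.5, 36.0])

# Fit y = m * x (no intercept term), i.e. a trendline forced through the origin.
# lstsq solves for the slope m in the least-squares sense.
m, *_ = np.linalg.lstsq(load.reshape(-1, 1), cpu, rcond=None)
slope = m[0]

print(f"slope = {slope:.3f}")
print("fitted value at load 0:", slope * 0.0)  # exactly 0 by construction
```

A fit constrained this way always evaluates to 0 at zero load, so if the plotted trendline still misses the origin, I suspect the charting tool is not using the fit I expect.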
[Attachment: dddd.JPG]