Accounting for Variance Between Regression Line and Actual Data Points

Hey all,

It’s been a while since I’ve taken statistics, and I’ve tried to help myself out as much as I could; however, I don’t know the key terms to Google at the moment, so I’m coming up short. I have some scatter-plot data and I’ve managed to fit a nice linear regression line to it, which gives me a rough prediction of what the data might look like in the future. The problem is that some of the data points land pretty far from the regression line, on both sides of it. I need to account for the min/max distance from the regression line, or at least an approximation of it, so I can figure that into my numbers.
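
To make that concrete, here’s a minimal sketch of what I mean, with made-up numbers standing in for my actual series (the `x`/`y` arrays and the noise level are just placeholders, not my real data):

    import numpy as np

    # Made-up example data standing in for my real series:
    # x = day index, y = storage used (GB).
    rng = np.random.default_rng(0)
    x = np.arange(360)
    y = 50 + 0.4 * x + rng.normal(0, 8, size=x.size)

    # Fit the regression line: y ~ slope * x + intercept
    slope, intercept = np.polyfit(x, y, deg=1)
    fitted = slope * x + intercept

    # Residuals = vertical distance of each point from the line
    residuals = y - fitted
    print("worst miss below the line:", residuals.min())
    print("worst miss above the line:", residuals.max())
    print("typical miss (std dev):", residuals.std(ddof=2))  # ddof=2: two fitted params

The min/max residuals are the “distance jumped away” I’m talking about, but I suspect there’s a more principled way to use them than just the extremes.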

For reference, I’m attempting to predict the storage requirements of an IT system at some day n in the future, based on the last 360-some-odd data points already collected (sorry if that doesn’t make sense). The current issue is that I have a “guess” at day n, but there is definitely some variance I need to take into account. I think I need to do something with the R² value, which I have, but I’m not sure.
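
From what I’ve pieced together so far, the thing I’m describing might be called a “prediction interval” rather than anything to do with R², but I could be wrong. If that’s the right term, something like this sketch (my assumption, using what I believe is the standard simple-linear-regression formula) would put a band around the guess at day n:

    import numpy as np
    from scipy import stats

    def prediction_interval(x, y, x_new, level=0.95):
        """Point prediction and prediction interval at x_new for a simple linear fit."""
        n = x.size
        slope, intercept = np.polyfit(x, y, deg=1)
        resid = y - (slope * x + intercept)
        s = np.sqrt(np.sum(resid**2) / (n - 2))  # residual standard error
        xm = x.mean()
        # Interval widens the further x_new is from the center of the data
        se = s * np.sqrt(1 + 1/n + (x_new - xm)**2 / np.sum((x - xm)**2))
        t = stats.t.ppf((1 + level) / 2, df=n - 2)
        y_hat = slope * x_new + intercept
        return y_hat - t * se, y_hat, y_hat + t * se

    # e.g. with the x, y arrays from the sketch above, predicting day 420:
    # lo, mid, hi = prediction_interval(x, y, x_new=420)

Is that the right idea, or am I off base?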

Any and all help is appreciated. Thank you!

submitted by /u/Khue