This is a problem I run into every once in a while, and it annoys me tremendously, because I always have to recheck every trendline I get.
An example:
r (mm)      L
 30,00     97,0
 60,00    103,2
 90,00    106,0
110,00    101,0
125,00     88,0
140,00     62,0
148,00     36,7
152,50     17,0
Upon drawing a trendline (using a 3rd-order polynomial regression) with r on the x axis and L on the y axis, Excel gives the formula
y = -0,0002x³ + 0,0341x² - 1,8979x + 128,73
with R² = 0,994. If I then evaluate that formula at the same values of r it was derived from, I get
r (mm)      y
 30,00     97,083
 60,00     94,416
 90,00     88,329
110,00     66,371
125,00     33,68
140,00    -17,416
148,00    -53,5912
152,50    -76,97725
which are quite different from the original values.
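For reference, these values come from evaluating the displayed formula directly; a minimal Python sketch (Python used only for illustration, not part of the spreadsheet workflow) reproduces the table:

import numpy as np

# Coefficients exactly as Excel displays them, highest power first.
coeffs = [-0.0002, 0.0341, -1.8979, 128.73]

for x in [30.0, 60.0, 90.0, 110.0, 125.0, 140.0, 148.0, 152.5]:
    # np.polyval evaluates the polynomial at x.
    print(f"{x:6.2f}  {np.polyval(coeffs, x):10.5f}")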
Why does this happen? What is the reason for it?
Answer
It looks like the formula Excel displayed has rounded coefficients. Using an OpenOffice Calc regression routine, I get this formula, which reproduces the data much more closely:
y = -0.00017257x³ + 0.034107417x² - 1.89794239x + 128.7325785
Since x³ is so large over this range (about 3.5 million at x = 152.5), a small difference in its coefficient has a large influence on the predicted result: rounding -0.00017257 to -0.0002 changes the cubic term's contribution at x = 152.5 by about 0.00002743 × 3 546 578 ≈ 97, which accounts for most of the discrepancy in the last row.
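To verify, you can refit the polynomial at full precision outside the spreadsheet; a minimal sketch, assuming NumPy is available:

import numpy as np

# The original data: r in mm, L.
r = np.array([30.0, 60.0, 90.0, 110.0, 125.0, 140.0, 148.0, 152.5])
L = np.array([97.0, 103.2, 106.0, 101.0, 88.0, 62.0, 36.7, 17.0])

# Least-squares fit of a 3rd-order polynomial; np.polyfit returns
# the coefficients highest power first, at full double precision.
coeffs = np.polyfit(r, L, 3)
print(coeffs)                 # agrees with the OpenOffice coefficients above
print(np.polyval(coeffs, r))  # predictions stay close to the measured L

In Excel itself the extra digits can be recovered without refitting: format the trendline label to show more decimal places, or compute the coefficients directly with the LINEST worksheet function.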