I'm a bit confused about how to determine which interval the root lies in (hope that makes sense).
Say for example,
Show that x^3 + x^2 + x - 8 = 0 has a root between x = 1 and x = 2, and use the bisection method twice to determine the root.
Skipping the "show that a root exists" part...
x = 1 and x = 2
xm = 0.5(1 + 2) = 1.5
f(1.5) = -0.875 < 0
Since f(1.5) < 0, the root lies between 1.5 and 2. <-- This is the bit that confuses me. Why does the root lie between 1.5 and 2 when f(1.5) < 0?
Continuing on,
f(1.75) ≈ 2.17 > 0
Since f(1.75) > 0, the root lies between 1.5 and 1.75. <-- Same reason ^
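In case it helps show how I'm reading the method, here's a rough Python sketch of what I think each bisection step is doing (bisect_step is just my own made-up name, and I'm assuming the rule is "keep the half-interval whose endpoints have opposite signs", which is exactly the part I'm unsure about):

    def bisect_step(f, lo, hi):
        # One bisection step: take the midpoint, then keep whichever
        # half-interval has a sign change between its endpoints.
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) < 0:
            # f changes sign between lo and mid, so the root is in [lo, mid]
            return lo, mid
        # otherwise the sign change (and so the root) is in [mid, hi]
        return mid, hi

    def f(x):
        return x**3 + x**2 + x - 8   # the polynomial from the example

    lo, hi = 1.0, 2.0                # starting interval
    for _ in range(2):               # "use the bisection method twice"
        lo, hi = bisect_step(f, lo, hi)
        print(lo, hi)                # prints 1.5 2.0, then 1.5 1.75

Running this does give the same intervals as the worked example, (1.5, 2) and then (1.5, 1.75), so I think I've copied the steps correctly; I just don't see why the sign test picks those halves.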
Thanks in advance.