For the first one, let the roots be \alpha, 1/\alpha and \beta.
So if we multiply them, \alpha \cdot \frac{1}{\alpha} \cdot \beta = \beta, which means the product of the roots gives \beta = -4.
Then if we add the roots we should get -b/a. However, since there is no x^2 term, -b/a = 0, so adding them gives 0. You should be able to solve it from there.
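In case it helps, here are the Vieta relations being used, written for a general cubic ax^3 + bx^2 + cx + d = 0 (the actual coefficients from your question aren't quoted here, so treat this as a sketch):

\[
\alpha + \frac{1}{\alpha} + \beta = -\frac{b}{a} = 0, \qquad
\alpha \cdot \frac{1}{\alpha} \cdot \beta = \beta = -\frac{d}{a},
\]

and the remaining relation \alpha \cdot \frac{1}{\alpha} + \alpha\beta + \frac{1}{\alpha}\beta = \frac{c}{a} gives you the extra equation needed to finish solving for \alpha.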
The same idea works for the second one. If you let...