
The next question to be tackled was how to predict the number of positive,
negative, and complex roots. Cardan
noted that the complex roots of an equation occur in pairs. This is the
conjugate pair theorem used today. Newton went on to prove this in his
*Arithmetica Universalis*. Descartes
stated his rule on signs in *La Géométrie*, but did not
give a proof. He stated that the maximum number of positive roots of
*f(x)* = 0, where *f* is a polynomial, is the number of sign changes in
the sequence of coefficients of *f(x)*, and that the maximum number of
negative roots is the number of sign changes in the coefficients of
*f(-x)*. This became known as Descartes' rule.
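The rule as stated above can be sketched in a few lines of Python; the helper names and the sample polynomial below are illustrative choices, not part of the historical sources:

```python
def sign_changes(coeffs):
    """Count sign variations in a coefficient sequence, skipping zeros."""
    signs = [c > 0 for c in coeffs if c != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a != b)

def descartes_bounds(coeffs):
    """Return (max positive roots, max negative roots) for a polynomial
    given by its coefficients, highest-degree term first."""
    n = len(coeffs) - 1
    pos = sign_changes(coeffs)
    # Coefficients of f(-x): negate the odd-degree terms.
    neg = sign_changes([c * (-1) ** (n - i) for i, c in enumerate(coeffs)])
    return pos, neg

# f(x) = x^3 - 3x^2 - x + 3 = (x - 1)(x + 1)(x - 3)
print(descartes_bounds([1, -3, -1, 3]))  # (2, 1)
```

For this polynomial the bounds are exact: the coefficients +, -, -, + change sign twice, matching the two positive roots 1 and 3, and *f(-x)* changes sign once, matching the single negative root -1.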

There is disagreement over whether the rule of signs was known before
Descartes' *La Géométrie*. Some say Thomas
Harriot gave it in
his *Artis analyticae praxis*, published in 1631; Cantor denied this,
since Harriot did not admit negative roots. Cardan also stated a relation
between one or two variations in sign and the occurrence of positive roots.
Newton
produced a procedure for finding the number of imaginary roots, and Leibniz
suggested an outline of a proof but did not give a detailed one.
Several mathematicians went on to prove and refine Descartes' rule
between 1745 and 1828.

In 1828 Gauss added a refinement to the rule: if the number of positive roots falls short of the number of sign variations, it does so by an even integer. The same holds for negative roots.
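Gauss's parity statement can be checked on a small example; the polynomial below is an illustrative choice, not one from the historical sources:

```python
def sign_changes(coeffs):
    """Count sign variations in a coefficient sequence, skipping zeros."""
    signs = [c > 0 for c in coeffs if c != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a != b)

# f(x) = x^2 - x + 1 shows two sign variations (+, -, +), yet its
# discriminant b^2 - 4ac = 1 - 4 = -3 is negative, so its roots are a
# complex conjugate pair and there are no positive real roots.
variations = sign_changes([1, -1, 1])
positive_roots = 0  # none, since the discriminant is negative
print(variations - positive_roots)  # shortfall of 2, an even integer
```

The shortfall of 2 corresponds to one conjugate pair of complex roots, consistent with the pairing noted by Cardan.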

In summary, Descartes' rule of signs is a tool for determining how many positive, negative, and imaginary roots a polynomial equation can have.