Only one such value of
a need be found for the test to deterministically confirm primality, provided that
p is a Proth number. This is a practical test because, if
p is prime, then any chosen
a has about a 50 percent chance of working, and if
p is not prime, then no chosen
a will work. Furthermore, since the calculation is modulo
p, only values of
a smaller than
p have to be considered.
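The core check can be sketched in Python as follows (a minimal illustration; the function names are ours, not a standard API):

```python
# Minimal sketch of the core Proth check (illustrative names, not a library API).

def is_proth_number(p: int) -> bool:
    """True if p = k*2^n + 1 with k odd and 0 < k < 2^n."""
    if p < 3 or p % 2 == 0:
        return False
    m = p - 1                        # m = k * 2^n
    n = (m & -m).bit_length() - 1    # count of trailing zero bits: the n in 2^n
    k = m >> n                       # the odd cofactor k
    return n >= 1 and k < (1 << n)

def bears_witness(a: int, p: int) -> bool:
    """True iff a^((p-1)/2) ≡ -1 (mod p); one such a proves a Proth number prime."""
    return pow(a, (p - 1) // 2, p) == p - 1
```

For example, p = 13 = 3·2² + 1 is a Proth number, and a = 5 bears witness, since 5⁶ = 15625 ≡ −1 (mod 13).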
==Systematic naïve variant==
If
p is a composite Proth number, then no base
a will bear witness to its primality. If any one base
a bears witness, then primality is confirmed. If none do, then compositeness is confirmed. This is because the
inverse of Proth's theorem is also true:
: If no
a exists such that a^{\frac{p-1}{2}}\equiv -1 \pmod{p}, and
p is a Proth number, then
p is composite. The contrapositive of this statement is that if
p is a Proth prime, such an
a value is guaranteed to exist. Indeed, if
p is a Proth prime then we expect roughly half of all
a-values to satisfy the formula, in the general case. On the other hand, if the second condition is not met, that is, if
p is not a Proth number, then compositeness cannot be guaranteed (the inverse does not generally hold for non-Proth numbers), even if no a-value satisfies the congruence. As such, we may systematically check all base values in the interval [2,
p − 1] to verify compositeness (note that
a = 0 and
a = 1 will never work), unless and until one is found to confirm primality. This process, though the most straightforward, can be made somewhat more efficient. If
p is prime, then exactly half of the bases in [1,
p − 1] bear witness, so it suffices to check just over half of all possible
a-values smaller than
p. Once more than
p/2 distinct values of
a have been tested without finding a witness, compositeness is certain: if
p were prime, then by the
pigeonhole principle at least one of the tested values would have been a witness, since more values have been tested than there are non-witnesses. This variation of the test is similar to the deterministic variant of the
Fermat primality test. Both of these naïve variants are grossly inefficient and never employed in practice. Note that both of these approaches represent significantly more computational work than simple brute force
trial division in the worst case scenario.
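The systematic variant may be sketched as follows (an illustrative Python sketch, assuming p has already been verified to be a Proth number):

```python
# Naive deterministic variant: test bases until a witness appears or more
# than p/2 distinct bases have failed (the pigeonhole cutoff). Assumes p is
# already known to be a Proth number; the function name is illustrative.

def proth_naive(p: int) -> bool:
    exponent = (p - 1) // 2
    # bases 2 .. p//2 + 2 give just over p/2 distinct values (0 and 1 never work)
    for a in range(2, p // 2 + 3):
        if pow(a, exponent, p) == p - 1:
            return True          # one witness deterministically proves primality
    return False                 # no witness among > p/2 bases: p is composite
```

As the text notes, this is grossly inefficient: the loop alone is linear in p, far worse than trial division's roughly √p divisions.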
==Probabilistic Monte Carlo Variant==
As 50% of bases
a are expected to bear witness to primality, if
p is indeed prime, then we may form a
Monte Carlo probabilistic test thus: if the test is repeatedly performed
m times, each iteration with a random
a, each time failing to confirm primality, then we may infer that
p is
probably composite; this is in contrast to the
probably prime results typical of other Monte Carlo algorithms such as the
Miller-Rabin test. An approximate upper bound on the probability of a prime being falsely identified as composite (roughly 2^{-m} after m failed iterations) can also be inferred. A composite will, however, never be falsely identified as prime. This probabilistic implementation is not typically performed: even though it is far more efficient than the deterministic naïve test, with computational efficiency on par with the Miller-Rabin test, it can still be improved both in runtime performance and in accuracy (or definitiveness).
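This Monte Carlo formulation might be sketched as follows (illustrative Python; the function name and return strings are ours, and p is assumed to be a Proth number):

```python
import random

# Monte Carlo variant sketch: try m random bases; a single witness proves
# primality, while m failures yield only "probably composite".

def proth_monte_carlo(p: int, m: int = 64, rng=random) -> str:
    exponent = (p - 1) // 2
    for _ in range(m):
        a = rng.randrange(2, p - 1)
        if pow(a, exponent, p) == p - 1:
            return "prime"                   # witnesses are conclusive
    # a prime slips through all m rounds with probability roughly 2**-m
    return "probably composite"
```

Note the asymmetry: "prime" is always correct, whereas "probably composite" carries the error probability described above.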
==Las Vegas Variant==
The Las Vegas formulation of Proth's test is by far the most efficient of the variants, and as definitive as the deterministic variant. This is the variant typically employed, though some nuances of implementation remain. In practice, a
quadratic nonresidue of
p is found and taken as the value of
a. If
a is a quadratic nonresidue modulo
p, then the
converse of Proth's theorem also holds (if Euler's criterion does not yield –1, then
p is composite), and the test becomes
conclusive (bidirectional). The theorem may be restated:
: For all Proth numbers
p, and for any quadratic nonresidue
a of
p,
p is prime if and only if a^{\frac{p-1}{2}}\equiv -1 \pmod{p}. A quadratic nonresidue
a of
p may be identified when the
Legendre symbol is –1; thus for such an
a-value:
:: \left(\frac{a}{p}\right)=-1 .
For such a value of
a, the test is deterministic for both primality and compositeness; thus, for such an
a the check against Proth's / Euler's criterion requires only one iteration: congruence or non-congruence to −1 fully determines primality. The difficulty lies in finding such an
a-value. The value of
a may be found either by systematically checking values in the interval [2,
p − 1] against the Legendre symbol, through random selection and checking, or by a more direct computation (the most efficient option). In any case, once a value of
a is verified against the Legendre symbol as a valid candidate, it may be applied in Proth's / Euler's criterion to determine primality or compositeness conclusively. A systematic check is not grossly inefficient; candidates are fairly common in the general case, and one will likely be found within a handful of attempts, regardless of whether
p is prime or composite. Random selection is also not grossly inefficient; the probability of finding a candidate is about 50% per iteration. For a variety of reasons, however, a more direct calculation is typically employed via a
modified Euclidean algorithm. It is possible, though only when
p is composite, for no quadratic nonresidue candidate
a to exist at all. In such a case, compositeness can be deduced with certainty, though establishing this case programmatically is more complicated without exhaustive verification. Both a systematic search and random selection carry some probability of failing to find a candidate, even when one exists, within a reasonably small number of attempts, potentially resulting in an early-exit condition with a misleading result of presumed compositeness. We may infer probabilistic compositeness (Monte Carlo) with a threshold of confidence if, through random selection, no nonresidue satisfying the Legendre symbol can be found in a reasonable number of attempts. If a candidate does not exist and no early-exit condition is defined, the search may cost significant computational resources. As such, and to guarantee a deterministic (and efficient) test, direct computation of the nonresidue is preferred. If direct computation of a quadratic nonresidue fails, then compositeness can be deduced with absolute certainty. Thus, in contrast to many Monte Carlo primality tests (randomized algorithms that can return a
false positive or
false negative), this deterministic variant of the primality testing algorithm is a
Las Vegas algorithm, always returning the correct answer but with a randomly varying
runtime. The check against Proth's criterion, a simple
modular exponentiation operation, once
a has been determined, has a runtime on the order of the bit-length of
p; the random variability in the overall test runtime, therefore, is primarily a result of the search for an appropriate
a value, however that may be done.
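The Las Vegas variant might be sketched as follows; here the nonresidue is found by random search against the Jacobi symbol rather than by the direct computation the text prefers, and a bounded search with a composite fallback stands in for a proper early-exit strategy (illustrative Python, assuming p is a Proth number greater than 3):

```python
import random

def jacobi(a: int, n: int) -> int:
    """Jacobi symbol (a/n) for odd n > 0, via quadratic reciprocity."""
    a %= n
    t = 1
    while a:
        while a % 2 == 0:                    # factor out 2s, applying (2/n)
            a //= 2
            if n % 8 in (3, 5):
                t = -t
        a, n = n, a                          # reciprocity step
        if a % 4 == 3 and n % 4 == 3:
            t = -t
        a %= n
    return t if n == 1 else 0

def proth_las_vegas(p: int, max_tries: int = 1000, rng=random) -> bool:
    """Las Vegas sketch; assumes p is a Proth number greater than 3."""
    for _ in range(max_tries):
        a = rng.randrange(2, p - 1)
        if jacobi(a, p) == -1:               # a is a nonresidue candidate
            # for such an a, the congruence is conclusive in both directions
            return pow(a, (p - 1) // 2, p) == p - 1
    # no candidate found in max_tries draws: presume composite (this is the
    # early-exit caveat above; direct computation avoids this branch entirely)
    return False
```

When a candidate is found (which, for prime p, happens with probability about 1/2 per draw), the answer is always correct; only the runtime varies.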
==Simplified forms==
Given a Proth number
p =
k2
n + 1, particular forms of
p,
k, and
n have been identified that correspond to predetermined quadratic nonresidue values that are appropriate for use. It has been shown that:
• If
p>3 and 3 ∤
k, then
a = 3 is always a quadratic nonresidue (candidate) and therefore a valid base to check, and so:
:: 3^{\frac{p-1}{2}} \equiv -1 \pmod{p}
if and only if
p is prime.
: This is the basis of
Pépin's test for
Fermat numbers and their corresponding primes, wherein
k = 1 is indivisible by 3.
• If
p>5 and
p is 2 or 3 modulo 5, then
a = 5 is always a quadratic nonresidue (candidate) and therefore a valid base to check, and so:
:: 5^{\frac{p-1}{2}} \equiv -1 \pmod{p}
if and only if
p is prime.
: Though no such simple rule exists for the case where
p is 1 or 4 modulo 5, in the latter 4 mod 5 case, one can increase the chances of finding a quadratic nonresidue candidate systematically by checking values in the narrow vicinities of the two distinct square roots of 5 modulo
p. If there are not precisely two distinct square roots (see the
Tonelli-Shanks algorithm) then
p is not prime. Direct computation of the nonresidue is still likely more efficient. ==Numerical examples==