The complexity of a process is one of the determinants of its overall error rate. This can be compounded by other factors such as stress, unfamiliarity, and exhaustion. Simple processes with few steps have the greatest chance of being completed without error.
According to Nolan, the error rates for processes with multiple steps can be estimated as follows (assuming a constant error rate for all steps):

Cumulative Error Rate According to the Error Rate for Each Step

Number of Steps   0.05          0.01    0.001    0.0001
1                 0.05          0.01    0.001    0.0001
5                 0.33 (0.23)   0.05    0.005    0.002 (0.0005)
25                0.72          0.22    0.02     0.003
50                0.92          0.39    0.05     0.005
100               0.99          0.63    0.10     0.01

Table 2, page 773.
where:
• The given value of 0.33 for 5 steps at a 0.05 error rate should be 0.23.
• The given value of 0.002 for 5 steps at a 0.0001 error rate should be 0.0005.
If this is analyzed, it can be seen that the table is stating:
cumulative rate for a successful process with no errors =
= (1 − (error rate for each step as a decimal fraction)) ^ (number of steps)
cumulative error rate, i.e. the rate of having 1 or more errors in the entire process =
= 1 − (cumulative rate for a successful process)
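As a minimal sketch, the formula above can be checked directly against the table; the function name `cumulative_error_rate` is illustrative, not from the source:

```python
# Cumulative error rate for an n-step process with a constant per-step
# error rate p: the process succeeds only if every step succeeds,
# so P(success) = (1 - p)^n and P(one or more errors) = 1 - (1 - p)^n.
def cumulative_error_rate(p: float, n: int) -> float:
    success = (1 - p) ** n
    return 1 - success

# Reproduce a few entries from the table (rounded as printed there):
print(round(cumulative_error_rate(0.05, 5), 2))    # 0.23 (the corrected value)
print(round(cumulative_error_rate(0.05, 25), 2))   # 0.72
print(round(cumulative_error_rate(0.01, 100), 2))  # 0.63
```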
To make this more realistic, you could count the number of steps at each risk level, then multiply the separate success rates together to get the overall success rate.
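A sketch of that mixed-risk calculation, using a hypothetical breakdown of steps by risk level (the numbers are illustrative, not from the source):

```python
# Hypothetical process: (per-step error rate, number of steps at that rate).
# Each group contributes (1 - p)^n to the overall success probability.
steps_by_risk = [(0.05, 2), (0.01, 10), (0.001, 30)]

success = 1.0
for p, n in steps_by_risk:
    success *= (1 - p) ** n

cumulative_error = 1 - success
print(f"cumulative error rate: {cumulative_error:.3f}")  # 0.208
```

Note that the two high-risk steps alone contribute almost half of the total error, which is why reducing the rate of the riskiest steps pays off most.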
To reduce the cumulative error rate you need to:
(1) reduce the number of steps
(2) reduce the error rate for each step
If you want to achieve a given cumulative error rate with a certain number of steps, then
error rate for each step =
= 1 − ((1 − (target cumulative error rate)) ^ (1 / (number of steps)))
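This can be sketched as a small helper (the function name and example numbers are illustrative; the result agrees with the table's row for 50 steps at a 0.001 per-step rate):

```python
# Per-step error rate needed to stay at or below a target cumulative
# error rate over n steps: invert 1 - (1 - p)^n = target for p.
def required_step_rate(target: float, n: int) -> float:
    return 1 - (1 - target) ** (1 / n)

# To keep a 50-step process at a 5% cumulative error rate,
# each step must have an error rate of about 0.001:
print(required_step_rate(0.05, 50))  # ~0.00103
```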
If you want to achieve a given cumulative error rate at a given per-step error rate, then
number of steps allowed at the given error rate =
= LN(1 − (cumulative error rate)) / LN(1 − (error rate for each step))
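A sketch of the same calculation in code (the helper name is illustrative; in practice you would round the result down to a whole number of steps):

```python
import math

# Maximum number of steps allowed at a constant per-step error rate p
# without exceeding a target cumulative error rate:
# solve 1 - (1 - p)^n = target for n.
def max_steps(target: float, p: float) -> float:
    return math.log(1 - target) / math.log(1 - p)

# At a per-step error rate of 0.001, a 5% cumulative error rate
# allows roughly 51 steps:
print(math.floor(max_steps(0.05, 0.001)))
```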