Description

The entropy of a range of test results can provide insight into a test's usefulness for diagnosing a binary outcome.


 

If a given finding is associated with a probability of a binary outcome (the presence or absence of a disease), then

 

entropy of the result = S =

= ((-1) * (probability) * LN(probability)) - ((1 - (probability)) * LN(1 - (probability)))
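As a sketch, this formula can be computed directly in Python, with `LN` taken as the natural logarithm:

```python
import math

def entropy(probability):
    """S = (-1) * p * LN(p) - (1 - p) * LN(1 - p) for a binary outcome with probability p."""
    if probability <= 0.0 or probability >= 1.0:
        return 0.0  # limit value: a certain or excluded outcome has zero entropy
    return (-probability * math.log(probability)
            - (1.0 - probability) * math.log(1.0 - probability))
```

For example, entropy(0.5) returns ln 2 ≈ 0.693, the maximum possible value for a binary outcome.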

 

If this is rearranged:

 

(-1) * S =

= LN(((1 - prob)^(1 - (prob))) * ((prob)^(prob)))
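A quick numerical check of this rearrangement (a sketch; the probability 0.3 is an arbitrary choice for illustration):

```python
import math

p = 0.3  # arbitrary probability for the check
# Original form: S = (-1) * p * LN(p) - (1 - p) * LN(1 - p)
S = -p * math.log(p) - (1.0 - p) * math.log(1.0 - p)
# Rearranged form: (-1) * S = LN((1 - p)^(1 - p) * p^p)
rhs = math.log(((1.0 - p) ** (1.0 - p)) * (p ** p))
print(abs(-S - rhs))  # agrees to floating-point precision
```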

 

The entropy for a result is maximal when the probability is 0.5 and minimal (zero) when the disease has either been excluded (probability 0) or is certain (probability 1).

 

Vollmer introduced the quantity (1 - S), which can be plotted against a laboratory result. The lower (1 - S) is (i.e., the higher the entropy), the less informative the result. A graphical plot of (1 - S) against a range of test values can identify ranges with limited or maximal diagnostic utility.
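Vollmer's (1 - S) can be tabulated over a range of probabilities. In this sketch the disease probabilities are hypothetical and not tied to any particular assay:

```python
import math

def entropy(p):
    """S = (-1) * p * LN(p) - (1 - p) * LN(1 - p), with entropy 0 at p = 0 or p = 1."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log(p) - (1.0 - p) * math.log(1.0 - p)

# Hypothetical disease probabilities associated with a range of test results.
for p in (0.01, 0.25, 0.50, 0.75, 0.99):
    print(f"p = {p:4.2f}   1 - S = {1.0 - entropy(p):.3f}")
```

(1 - S) is lowest at p = 0.5, where it equals 1 - ln 2 ≈ 0.307, and approaches 1 as the probability nears 0 or 1.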

 

