14.6 FAILURE TOLERANCE
The quality attributes of software products, such as reliability, safety, security, and availability, depend not only on the products themselves but also on their specifications. Hence, if we want to define metrics that reflect software quality attributes, our metrics need to take into account specifications as well as the software products per se. In this section, we consider a metric that captures relevant attributes of specifications.
Functional redundancy, which we discussed in the previous section, reflects the ability of a software product to avoid failure and compute its intended behavior despite the presence and sensitization of faults and the emergence of errors. But if a specification is sufficiently non-deterministic, a program may fail to compute its exact intended function and still satisfy the specification. The question that we raise in this regard is: how do we measure the extent to which a program may deviate from its intended behavior without violating its specification? The measure of specification non-determinacy is an attempt to answer this question.
We consider a relation R on space S, and we let X and Y be random variables that take their values in the domain and range of R, respectively. The non-determinacy of R is the conditional entropy

ν(R) = H(Y|X).
The conditional entropy of Y with respect to X represents the uncertainty we have about the value of Y if we know the value of X. The bigger the non-determinacy of R, the bigger this conditional entropy. As an illustrative example, we consider the following relation on a space S defined by three variables i, j, and k.
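As a rough sketch of this idea, the conditional entropy of a finite relation can be computed under a uniform distribution over its pairs; the relations below are hypothetical toy examples, not taken from the text.

```python
import math
from collections import Counter

def conditional_entropy(relation):
    """H(Y|X) = H(X,Y) - H(X) for a finite relation, given as a set of
    (x, y) pairs, assuming a uniform distribution over the pairs."""
    n = len(relation)
    h_xy = math.log2(n)  # joint entropy: uniform over n distinct pairs
    marginal = Counter(x for x, _ in relation)
    h_x = -sum((c / n) * math.log2(c / n) for c in marginal.values())
    return h_xy - h_x

# A function (deterministic relation) leaves no uncertainty about Y given X:
f = {(0, 1), (1, 2), (2, 3), (3, 4)}
print(conditional_entropy(f))   # 0.0
# A relation mapping each input to two possible outputs leaves one bit:
r = {(0, 1), (0, 2), (1, 3), (1, 4)}
print(conditional_entropy(r))   # 1.0
```

The more outputs the relation allows for each input, the larger H(Y|X), matching the intuition that non-determinacy measures the latitude a candidate program enjoys.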
R = {(s, s′) | k = 2i + j ∧ i′ = i + j ∧ j′ = i − j}
The non-determinacy of this relation is

ν(R) = H(Y|X) = H(X,Y) − H(X).

The reader may notice that (in this particular case) the inverse of relation R is a function; in other words, X is a function of Y, hence the joint entropy H(X,Y) is the same as the entropy of Y. Hence the non-determinacy of this specification can be written as:

ν(R) = H(Y) − H(X).
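The identity H(X,Y) = H(Y), which holds whenever X is a function of Y, is easy to check numerically. The sketch below uses a hypothetical toy relation (each y value determines its x value) and uniform probabilities over the relation's pairs:

```python
import math
from collections import Counter

def entropy(counts, n):
    """Shannon entropy, in bits, of a distribution given as occurrence counts."""
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Toy relation whose inverse is a function: each y appears with exactly one x.
R = [(0, 0), (0, 1), (1, 2), (1, 3)]
n = len(R)

h_xy = entropy(Counter(R), n)                 # joint entropy H(X,Y)
h_y  = entropy(Counter(y for _, y in R), n)   # H(Y)
h_x  = entropy(Counter(x for x, _ in R), n)   # H(X)

# X is a function of Y here, so H(X,Y) = H(Y), and therefore
# H(Y|X) = H(X,Y) - H(X) = H(Y) - H(X).
print(h_xy, h_y, h_xy - h_x)   # → 2.0 2.0 1.0
```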
In order to derive the entropies of X and Y, we must first compute the domain and range of relation R. We find:
X = dom(R)
  = {s | ∃s′ : k = 2i + j ∧ i′ = i + j ∧ j′ = i − j}
  = {s | k = 2i + j ∧ (∃s′ : i′ = i + j ∧ j′ = i − j)}
  = {s | k = 2i + j},

Y = rng(R)
  = {s′ | ∃s : k = 2i + j ∧ i′ = i + j ∧ j′ = i − j}
  = S.

Hence, under the assumption of a uniform probability distribution, we find
H(X) = 2w, since the domain has only two independent variables (i and j; the value of k is determined by them), and H(Y) = 3w, since the range has three independent variables, where w denotes the width, in bits, of a program variable. We find:

ν(R) = H(Y) − H(X) = w,
which we interpret as follows: A program may lose as much as w bits of information (i.e., the width of an integer value) and still not violate specification R. Indeed, specification R mandates final values for i and j, but no final value for k, which means that a candidate program may lose track of k and still be correct; this is the meaning of non-determinacy.
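This calculation can be checked by brute force on a small finite version of the example. As a sketch, the code below interprets the arithmetic modulo a small odd number M (an assumption made here so that division by 2 stays invertible on the toy space; the derivation above assumes plain integer variables of width w), with w = log2(M) playing the role of the variable width:

```python
import math

M = 5  # small odd modulus (sketch assumption; keeps the arithmetic invertible)

# Enumerate R = {(s, s') | k = 2i + j  and  i' = i + j, j' = i - j}, taken
# modulo M, over the toy space S = {0..M-1}^3 for variables (i, j, k).
space = [(i, j, k) for i in range(M) for j in range(M) for k in range(M)]
R = [(s, ((s[0] + s[1]) % M, (s[0] - s[1]) % M, kp))
     for s in space
     if s[2] == (2 * s[0] + s[1]) % M
     for kp in range(M)]          # the final value of k is unconstrained

dom = {s for s, _ in R}
rng = {sp for _, sp in R}

w = math.log2(M)                  # "width" of one variable, in bits
h_x = math.log2(len(dom))         # H(X) = 2w: k is determined by i and j
h_y = math.log2(len(rng))         # H(Y) = 3w: the range is all of S
print(h_y - h_x, w)               # nu(R) = H(Y) - H(X) = w
```

The brute-force enumeration confirms that dom(R) has M² states (two free variables), that rng(R) is all of S (M³ states), and hence that the non-determinacy equals the width of one variable.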