Definition: a process in which the continuous range of values that a quantity may assume is divided into a number of predetermined adjacent intervals, and in which any value falling within one of these intervals is represented by a single predetermined value within that interval
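The process described above can be sketched as a uniform quantizer. This is a minimal illustrative example, not taken from the source: the interval width (`step`) and the choice of the interval midpoint as the predetermined representative value are assumptions made for illustration.

```python
import math

def quantize(x: float, step: float = 0.25) -> float:
    """Map a continuous value to the predetermined representative
    value of the interval it falls in.

    The real line is divided into adjacent intervals of width `step`;
    every value in an interval is represented by that interval's
    midpoint (an illustrative choice of representative value).
    """
    index = math.floor(x / step)       # which interval x falls in
    return index * step + step / 2     # midpoint of that interval

# Any value in [0.25, 0.5) maps to the same representative value:
quantize(0.3)   # -> 0.375
quantize(0.45)  # -> 0.375
```

Note that quantization is many-to-one: distinct inputs within the same interval become indistinguishable after quantization, which is the source of quantization error.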