x_{1} (li) is the (signed) difference between the average of all elements/data of x_{2} (completely specified ordered multiset/list) and the single number x_{3} (li; default: the infimum of x_{2} under ordering x_{6}, if it is finite), using averaging function x_{4} (default: arithmetic average), weights x_{5} (completely specified ordered multiset/list with the same cardinality/length as x_{2}; default: according to notes), and inherited ordering x_{6} (default: standard ordering on the set of reals).

Default for x_{5} is the same as the default for the weights of the given averaging function x_{4}; for the arithmetic mean of a finite set, this is the ordered set of |x_{2}| terms, each identically equal to 1/|x_{2}|, where "| |" denotes the cardinality of its (circumfixed) input. x_{6} is defined on a superset of x_{2} united with the singleton of x_{3}. The default for x_{6} is mrenspoi (the standard ordering on the reals). For example: if x_{2} = (1, 7) and all of the defaults hold, then x_{1} = avg(1, 7) - min(1, 7) = 4 - 1 = 3; when all defaults hold and x_{2} is a bounded set, x_{1} is the difference between the arithmetic average of x_{2} (equivalently, the mean of the uniform distribution on x_{2}) and the infimum of x_{2}. See also: "sigma".
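The default case of the relation above can be sketched in code. This is a minimal illustrative sketch, not part of the definition; the function name `signed_excess` and its signature are hypothetical, and the infimum is taken as the minimum, which is valid only for the finite lists shown here.

```python
def signed_excess(x2, x3=None, avg=None, weights=None):
    """Hypothetical sketch: x1 = avg(x2 under weights) - x3, with defaults
    as in the definition (uniform weights, arithmetic mean, x3 = infimum)."""
    if weights is None:
        # default x5: |x2| terms, each identically 1/|x2|
        weights = [1 / len(x2)] * len(x2)
    if avg is None:
        # default x4: weighted arithmetic mean
        avg = lambda vals, ws: sum(v * w for v, w in zip(vals, ws))
    if x3 is None:
        # default x3: infimum under the standard ordering
        # (the minimum, for a finite list)
        x3 = min(x2)
    return avg(x2, weights) - x3

print(signed_excess([1, 7]))  # the worked example: avg(1,7) - min(1,7) = 3.0
```

With x_{2} = (1, 7) and all defaults, this reproduces the example in the notes: 4 - 1 = 3.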