
Information Sciences

Volume 481, May 2019, Pages 520-549

Mixture functions and their monotonicity

https://doi.org/10.1016/j.ins.2018.12.090

Abstract

We consider mixture functions, which are a type of weighted average for which the corresponding weights are calculated by means of appropriate continuous functions of their inputs. In general, these mixture functions need not be monotone increasing. For this reason we study sufficient conditions to ensure standard, weak and directional monotonicity for specific types of weighting functions. We also analyze directional monotonicity when differentiability is assumed.

Introduction

A mixture function is a particular type of weighted averaging operator. To build it, the weights are defined by means of a monotone continuous weighting function and depend on the considered inputs. In this way, it is possible to use the weighting function to give more or less importance to some specific inputs, so mixture functions provide a higher degree of flexibility than, for instance, usual weighted means. In this sense, mixture functions can be considered as related to the well-known ordered weighted averaging (OWA) functions [36], but contrary to the latter case, in the former the weights are not assigned a priori but calculated in an input-dependent way. There also exists a close relation between mixture functions and other aggregation functions, such as overlap functions [10], as well as with well-known concepts such as the ROC index [8]. Furthermore, mixture functions can be used in a broad number of applied problems, in fields such as multicriteria decision making, fuzzy systems or data analytics, among others, see [12], [23], [37], [40]. Note that, since mixture functions extend particular instances of aggregation functions, such as weighted means, they can be successfully applied to those problems where the latter are useful. This is especially the case in problems where a reduction of data is required (see, for instance, [30] for an application in image processing), and, in general, in any application in machine learning where data fusion plays a relevant role, see [22].
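As a minimal illustration (a sketch, not code from the paper), a mixture function of the form $M_g(\mathbf{x})=\sum_i x_i\,g(x_i)/\sum_i g(x_i)$ can be computed as follows; the function name `mixture` is our own:

```python
def mixture(xs, g):
    """Mixture function M_g(x) = sum(x_i * g(x_i)) / sum(g(x_i)).

    Unlike a weighted mean with fixed weights, the weights g(x_i)
    are computed from the inputs themselves."""
    ws = [g(x) for x in xs]
    return sum(x * w for x, w in zip(xs, ws)) / sum(ws)

# With the constant weighting g(x) = 1, M_g reduces to the arithmetic mean.
print(mixture([0.2, 0.4, 0.9], lambda x: 1.0))       # 0.5
# With g(x) = x + 1, larger inputs receive larger weights,
# so the result is pulled above the arithmetic mean.
print(mixture([0.2, 0.4, 0.9], lambda x: x + 1.0))
```

This input dependence of the weights is exactly what makes the monotonicity of $M_g$ non-trivial, as discussed below.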

Recall that a key property in order to define aggregation functions is that of monotonicity [6]. For this reason, different authors have analyzed the problem of whether monotonicity is fulfilled by mixture functions, see [6], [26], [27], [31]. In particular, in [28], [29], [32], sufficient conditions to ensure that a mixture function is monotone increasing have been provided.

But usual monotonicity can be a very restrictive condition for applications, and, in fact, some functions which are widely used for data processing, such as the mode function or some kinds of means [6], are not monotone. Some authors have considered the problem of relaxing the monotonicity condition, leading, in particular, to the notion of weak monotonicity [2], [5], [16], [38], [39]. Basically, a function is weakly monotone if it is monotone along the ray defined by the vector (1, …, 1). Weak monotonicity suffices, for example, to calculate representative values of clusters of data when outliers exist, see [37]. If monotonicity is required along a ray defined by an arbitrary non-null vector, we get the notion of directional monotonicity [11]. These notions have been further extended, considering concepts such as cone monotonicity and monotonicity with respect to coalitions of inputs [4], as well as those of pre-aggregation function [24] or ordered directional monotonicity, see [9], [13], [15] for more details.
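The defining inequality of weak monotonicity, f(x + k·(1, …, 1)) ≥ f(x) for k > 0, can be checked numerically. The sketch below (our own illustration; the helper name `is_weakly_monotone_at` is an assumption) uses the mode, which is shift-invariant and therefore weakly monotone even though it is not monotone in each variable:

```python
from statistics import mode

def is_weakly_monotone_at(f, xs, k=0.1):
    """Check the weak-monotonicity inequality f(x + k*(1,...,1)) >= f(x)
    at a single point x, along the ray (1,...,1)."""
    shifted = [x + k for x in xs]
    return f(shifted) >= f(xs)

# The mode is shift-invariant: mode(x + k*(1,...,1)) = mode(x) + k,
# so the inequality holds at every point, even though the mode
# is not monotone increasing in each variable separately.
print(is_weakly_monotone_at(mode, [0.2, 0.2, 0.9]))   # True
```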

Equally important are the papers related to the so-called generalized mixture functions, which generalize mixture functions and extend, along with mixture functions, under certain conditions, an important class of aggregation functions [20]. The authors in [19] also studied directional and ordered directional monotonicity of generalized mixture functions, and determined some criteria for obtaining generalized mixture functions and so-called bounded generalized mixture functions. Applications of the mentioned generalized mixture functions in machine learning and classification can be found, for example, in [17], [21].

In this work, we study sufficient conditions to guarantee standard, directional and weak monotonicity of mixture functions with some specific weighting functions. In particular, we consider weighting functions which are given in terms of linear and exponential functions, as well as by means of linear splines. Furthermore, we also analyze the problem of directional monotonicity for differentiable mixture functions.

The paper consists of seven sections and the Appendix. Section 1 presents an overview of the latest results on monotonicity of mixture functions. Section 2 presents the main definitions; because the paper also introduces sufficient conditions of standard and weak monotonicity of mixture functions with linear spline weighting functions, this section also presents concepts related to linear spline functions. Section 3 provides sufficient conditions of standard and weak monotonicity of mixture functions with linear and exponential weighting functions. Section 4 introduces sufficient conditions of standard and weak monotonicity of mixture functions with linear spline weighting functions. Section 5 gives sufficient conditions of directional monotonicity of mixture functions with linear and exponential weighting functions; moreover, it also gives sufficient conditions of ordered directional monotonicity. Section 6 introduces sufficient conditions of directional monotonicity of mixture functions with differentiable weighting functions. The Conclusion summarizes the results and provides some ideas for future research. The Appendix contains proofs of selected theorems. All calculations were made using the R software [35] and Mathematica 8.0.

Preliminaries

Throughout the paper, the following notations are used.

We denote by $I=[a,b]\subseteq \overline{\mathbb{R}}=[-\infty,\infty]$ a closed interval. In this way, $I^n=\{\mathbf{x}=(x_1,\dots,x_n)\mid x_i\in I,\ i=1,\dots,n\}$ is the set of all vectors $\mathbf{x}$ whose components lie in the interval $I$. Considering $\mathbf{x},\mathbf{y}\in I^n$, $\mathbf{x}=(x_1,\dots,x_n)$, $\mathbf{y}=(y_1,\dots,y_n)$, we say that $\mathbf{x}\le \mathbf{y}$ if and only if $x_i\le y_i$ for each $i=1,\dots,n$. By increasing we do not necessarily mean strictly increasing.

Definition 2.1

A function $A\colon I^n\to I$ is an aggregation function if it is monotone increasing in each variable and satisfies the boundary conditions $A(a,\dots,a)=a$ and $A(b,\dots,b)=b$.

Monotonicity and weak monotonicity of mixture function with affine and exponential weighting function

We discuss sufficient conditions of standard and weak monotonicity of mixture functions with specific types of weighting functions. We start by considering mixture functions defined in terms of affine weighting functions $g(x)=x+l$, $l>0$.

Monotonicity and weak monotonicity of the mixture function with T-spline weighting function

Now we introduce properties of the mixture function with a piecewise linear weighting function, especially with the T-spline weighting function which we described in Section 2.3. The main reason for using T-splines is the following. It is known that any continuous function can be approximated arbitrarily well by a piecewise linear function, i.e., by a linear spline. Hence, by using monotone linear splines we can model different weighting functions g, e.g., study the impact of the shape of the graph
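The paper's T-spline construction is specific to Section 2.3; as a generic stand-in, a monotone linear spline can be sketched by interpolating between knots with increasing values. The helper `linear_spline` below is our own illustrative assumption, not the paper's definition:

```python
def linear_spline(knots):
    """Build a piecewise linear function from (t, value) knot pairs.

    With increasing knot values this yields a monotone linear spline,
    usable as a weighting function g in a mixture function."""
    ts = sorted(knots)
    def g(x):
        if x <= ts[0][0]:
            return ts[0][1]              # constant before the first knot
        for (t0, v0), (t1, v1) in zip(ts, ts[1:]):
            if x <= t1:                  # linear on the segment [t0, t1]
                return v0 + (v1 - v0) * (x - t0) / (t1 - t0)
        return ts[-1][1]                 # constant after the last knot
    return g

# Monotone spline with knots at 0, 0.5 and 1.
g = linear_spline([(0.0, 1.0), (0.5, 1.5), (1.0, 3.0)])
print(g(0.25))  # midway on the first segment: 1.25
```

Varying the knots changes the shape of the graph of g, which is what the sufficient conditions in this section constrain.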

Directional monotonicity of the mixture function with affine and exponential weighting function

We now discuss directional monotonicity for mixture functions which are defined by means of affine weighting functions.

Theorem 5.1

Let $M_g\colon [0,1]^2 \to [0,1]$ be the mixture function defined by (1) with the affine weighting function $g(x)=x+l$, $l>0$. Then $M_g$ is $\mathbf{r}$-increasing for vectors $\mathbf{r}=(r_1,r_2)$, $\mathbf{r}\neq\mathbf{0}$, $r_1+r_2>0$, which satisfy the condition
$$l>\max\left\{\frac{r_1}{r_1+r_2},\frac{r_2}{r_1+r_2}\right\}+\frac{r_1^2+r_2^2}{2(r_1+r_2)^2}.$$

Proof

Let $\mathbf{r}=(r_1,r_2)\neq\mathbf{0}$. Let $\mathbf{x}=(x,y)\in I^2$ and $k>0$ be such that $\mathbf{x}+k\mathbf{r}\in I^2$.

From Definition 2.6 we get
$$\frac{(x+kr_1)(x+kr_1+l)+(y+kr_2)(y+kr_2+l)}{x+y+2l+k(r_1+r_2)} \;\ge\; \frac{x(x+l)+y(y+l)}{x+y+2l}.$$
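Theorem 5.1 can be checked numerically (our own sanity-check sketch, not part of the paper). For $\mathbf{r}=(1,1)$ the bound reduces to $l>\tfrac12+\tfrac{2}{2\cdot 4}=0.75$, so with $l=1$ the inequality $M_g(\mathbf{x}+k\mathbf{r})\ge M_g(\mathbf{x})$ should hold everywhere:

```python
def mg_affine(x, y, l):
    """Mixture function (1) on [0,1]^2 with affine weighting g(t) = t + l."""
    return (x * (x + l) + y * (y + l)) / (x + y + 2 * l)

# Theorem 5.1 bound for r = (1, 1): l > 1/2 + 2/(2*4) = 0.75; take l = 1.
l, k = 1.0, 0.05
ok = all(
    mg_affine(x + k, y + k, l) >= mg_affine(x, y, l) - 1e-12
    for x in [i / 20 for i in range(19)]   # grid keeps x + k inside [0, 1]
    for y in [j / 20 for j in range(19)]
)
print(ok)  # the inequality holds on the sampled grid
```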

Directional monotonicity of the mixture function with differentiable weighting function

Now, we introduce sufficient conditions of directional monotonicity of the mixture function (1), which are based on the directional derivative of the mixture function.

Proposition 6.1

Let $M_g\colon [0,1]^n \to [0,1]$ be the mixture function defined by (1) with a differentiable weighting function $g\colon [0,1]\to\,]0,\infty[$, and let $\mathbf{r}=(r_1,r_2,\dots,r_n)$ be an $n$-dimensional vector, $r_j\ge 0$, $j=1,2,\dots,n$, $\mathbf{r}\neq\mathbf{0}$. Then $M_g$ is $\mathbf{r}$-increasing if the following condition holds:
$$\left(\sum_{i=1}^{n} g(x_i)\right)\cdot\sum_{j=1}^{n} r_j\bigl(g(x_j)+x_j\,g'(x_j)\bigr) \;\ge\; \left(\sum_{i=1}^{n} g(x_i)\,x_i\right)\cdot\sum_{j=1}^{n} r_j\,g'(x_j).$$

Proof

This follows directly
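The condition of Proposition 6.1 is directly computable. The sketch below (an illustration of ours; the helper name `prop61_holds` is an assumption) evaluates both sides for the exponential weighting $g(t)=e^t$, for which $g'(t)=e^t$:

```python
from math import exp

def prop61_holds(xs, r, g, gprime):
    """Evaluate the sufficient condition of Proposition 6.1:
    (sum g(x_i)) * sum r_j*(g(x_j) + x_j*g'(x_j))
        >= (sum g(x_i)*x_i) * sum r_j*g'(x_j)."""
    G = sum(g(x) for x in xs)
    Gx = sum(g(x) * x for x in xs)
    lhs = G * sum(rj * (g(xj) + xj * gprime(xj)) for rj, xj in zip(r, xs))
    rhs = Gx * sum(rj * gprime(xj) for rj, xj in zip(r, xs))
    return lhs >= rhs

# Exponential weighting g(t) = e^t, with derivative g'(t) = e^t.
print(prop61_holds([0.2, 0.8, 0.5], (1, 1, 1), exp, exp))   # True
```

For $g=g'=\exp$ and $\mathbf{r}=(1,\dots,1)$ the left side exceeds the right side by $\bigl(\sum_i e^{x_i}\bigr)^2>0$, so the condition always holds in this case.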

Conclusion

In this paper, we introduced sufficient conditions for three types of monotonicity of mixture functions with selected weighting functions. Our attention has been given to linear, exponential and piecewise linear weighting functions. A significant part of the paper was devoted to the T-spline weighting functions.

We want to remark on the relation between mixture functions and some other types of functions, such as overlap functions. Taking this fact into account, the analysis done in the present work can

Acknowledgement

Jana Špirková has been supported by the Slovak Scientific Grant Agency VEGA no. 1/0093/17, "Identification of risk factors and their impact on products of the insurance and savings schemes".

Humberto Bustince and Javier Fernandez have been supported by Spanish Research Project TIN2016-77356-P (AEI/FEDER, UE).

References (40)

  • J. Špirková

    Weighted Aggregation Operators and their Applications

    (2008)
  • J. Špirková

    Induced weighted operators based on dissimilarity functions

    Inf. Sci.

    (2015)
  • R Core Team

    R: A language and environment for statistical computing

R Foundation for Statistical Computing, Vienna, Austria [Online]. Accessed 11-05-2017

    (2016)
  • G. Beliakov

    Shape preserving approximation using least squares splines

    Approx. Theory Appl.

    (2000)
  • G. Beliakov et al.

    Fuzzy connectives for efficient image reduction and speeding up image analysis

    IEEE Access

    (2018)
  • G. Beliakov

    Monotone approximation of aggregation operators using least squares splines

    Int. J. Uncertain. Fuzziness Knowl.-Based Syst.

    (2002)
  • G. Beliakov et al.

    Three types of monotonicity of averaging functions

    Knowl. Based Syst.

    (2014)
  • G. Beliakov et al.

    Weak monotonicity of Lehmer and Gini means

    Fuzzy Sets Syst.

    (2016)
  • H. Bustince et al.

Ordered directionally monotone functions. Justification and application

    IEEE Trans. Fuzzy Syst.

    (2018)
  • H. Bustince et al.

    Overlap functions

    Nonlinear Anal.

    (2010)