Discontinuous convex contractions and their applications in neural networks
Abstract
In this paper, we show that the class of convex contractions of order m ∈ N is strong enough
to guarantee a fixed point but does not force the mapping to be continuous at the fixed point.
As a by-product, we provide a new setting to answer an open question posed by Rhoades (Contemp Math 72:233–245, 1988). In recent years, neural network systems with discontinuous
activation functions have received intensive research interest, and several fixed point
results (Brouwer's fixed point theorem, the Banach fixed point theorem, Kakutani's fixed point
theorem, Krasnoselskii's fixed point theorem, etc.) have been used in theoretical studies
of such networks. Therefore, possible applications of our theoretical results can contribute
to the study of neural networks in terms of both fixed point theory and discontinuity at the
fixed point.
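For orientation, a convex contraction of order 2, in the sense introduced by Istrățescu, is commonly defined by the condition below; the order-m class considered in this paper generalizes this inequality, and the notation (X, d), T, a, b used here is the standard one from the literature rather than necessarily the exact formulation adopted in the body of the paper.

\[
d(T^{2}x, T^{2}y) \le a\, d(Tx, Ty) + b\, d(x, y) \qquad \text{for all } x, y \in X,
\]
where T : X → X is a self-map of a complete metric space (X, d) and a, b ≥ 0 satisfy a + b < 1. Note that, unlike the Banach contraction condition, this inequality constrains only the second iterate of T, which is what leaves room for T to be discontinuous at its fixed point.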