This paper defines a family of neural network interpolation operators. The first derivative of a generalized logistic-type function is taken as a density function. Using a first-order uniform approximation theorem for continuous functions defined on finite intervals, the interpolation properties of these operators are presented. A Kantorovich-type variant of the operators F_n^{a,ε} is also introduced, and the approximation of these Kantorovich-type operators in L^p spaces with 1 ≤ p ≤ ∞ is studied. Further, different combinations of the parameters of the generalized logistic-type activation function θ_{s,a} are examined to identify parameter values that yield a more efficient activation function. Finally, by choosing suitable parameters for the operator F_n^{a,ε} and its Kantorovich-type variant, the approximation of several example functions is illustrated.
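The central construction above, taking the first derivative of a logistic-type function as a density, can be illustrated with a minimal sketch. The specific form of θ_{s,a} below (a two-parameter logistic, θ_{s,a}(x) = 1 / (1 + s·a^{-x})) is an assumption chosen for illustration only; the paper's actual definition may differ. The sketch checks that the derivative is nonnegative and integrates to 1, i.e. that it behaves as a density:

```python
import numpy as np

def theta(x, s=1.0, a=np.e):
    # Hypothetical generalized logistic-type function with shape
    # parameters s > 0 and base a > 1 (illustrative assumption).
    return 1.0 / (1.0 + s * a ** (-x))

def theta_prime(x, s=1.0, a=np.e):
    # Analytic first derivative of theta: a bell-shaped function
    # that can serve as the density generating the operators.
    t = s * a ** (-x)
    return np.log(a) * t / (1.0 + t) ** 2

# The derivative is nonnegative and integrates to
# theta(+inf) - theta(-inf) = 1, as a density should.
xs = np.linspace(-30.0, 30.0, 6001)
density = theta_prime(xs)
mass = np.trapz(density, xs)
```

Varying s shifts the peak of the bump while varying a controls its width, which is one concrete sense in which different parameter choices can make the activation more or less efficient for approximation.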