The Naive Bayes classifier is widely used due to its simplicity and accuracy. However, it fails when a test sample contains an attribute value that never occurs in the training samples, so the corresponding conditional probability is estimated as zero. This is known as the zero-frequency problem and is typically addressed with Laplace smoothing. Laplace smoothing, however, does not take into account the statistical characteristics of the neighborhood of the test sample's attribute values. Gaussian Naive Bayes addresses this, but the resulting Gaussian model is formed from global information. We propose an approach that estimates conditional probabilities using information in the neighborhood of the test sample. We no longer make the assumption of independence of attributes, and instead consider the joint probability distribution conditioned on the given class. We illustrate the performance on datasets taken from the University of California at Irvine (UCI) Machine Learning Repository and demonstrate that the proposed approach is simple, robust, and outperforms standard approaches.
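As a minimal sketch of the zero-frequency problem and Laplace smoothing mentioned above (this is illustrative only, not the proposed method; the attribute values and counts are hypothetical):

```python
from collections import Counter

def cond_prob(value, samples, alpha=0.0, n_values=3):
    """Estimate P(attribute = value | class) from the class's training
    samples, with optional Laplace smoothing controlled by alpha."""
    counts = Counter(samples)
    return (counts[value] + alpha) / (len(samples) + alpha * n_values)

# Hypothetical attribute values observed for one class; "overcast" never occurs.
train = ["sunny", "sunny", "rain", "rain", "rain"]

# Unsmoothed: the zero estimate wipes out the entire product of likelihoods.
p0 = cond_prob("overcast", train)             # 0.0
# Laplace smoothing (alpha = 1): a small nonzero probability, (0 + 1) / (5 + 3).
p1 = cond_prob("overcast", train, alpha=1.0)  # 0.125
```

Because Naive Bayes multiplies the per-attribute likelihoods, a single zero estimate forces the posterior for that class to zero regardless of the other attributes; smoothing avoids this, but with a constant correction that ignores the local structure of the data.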