Accurate evaluation of the percent severity of visible symptoms or injuries caused by fungal diseases on plant organs is not only essential in plant health and epidemics research but also critical for breeding, germplasm assessment, and genetic studies of resistance. In this paper, we introduce a series of effective pixel-level semantic segmentation models, based on convolutional neural networks (CNNs), for the automatic detection, quantification, and classification of plant foliar diseases. The proposed scoring and grading system comprises four steps: dataset creation, image segmentation, statistical calculation, and table-query operation. We train four state-of-the-art CNN-based semantic segmentation models, PSPNet-ResNet50, UNet-ResNet50, BiSeNet-ResNet50, and DeepLabV3Plus-ResNet50, to produce pixel-level predictions on images of individual leaves exhibiting pathological lesions caused by fungal disease; potato late blight and grape black measles serve as case studies. Experimental results show that DeepLabV3Plus-ResNet50 is the most efficient model for evaluating foliar disease severity, achieving on the hold-out test dataset an overall classification accuracy of 92.00% for the potato disease (graded into 5 severity levels) and 94.67% for the grape disease (graded into 4 levels).
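The statistical-calculation and table-query steps described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the class codes (0 = background, 1 = healthy leaf, 2 = lesion) and the grade thresholds in `SEVERITY_GRADES` are assumed for demonstration, and a real pipeline would take the mask from the trained segmentation model.

```python
import numpy as np

# Assumed lookup table: (upper bound of percent severity, grade label).
# The actual thresholds and number of levels come from the paper's grading
# standard and are not reproduced here.
SEVERITY_GRADES = [
    (5.0, 1),
    (25.0, 2),
    (50.0, 3),
    (100.0, 4),
]

def percent_severity(mask: np.ndarray) -> float:
    """Percent of leaf area covered by lesions in a segmentation mask
    (0 = background, 1 = healthy leaf, 2 = lesion; assumed encoding)."""
    leaf = np.count_nonzero(mask == 1) + np.count_nonzero(mask == 2)
    lesion = np.count_nonzero(mask == 2)
    return 100.0 * lesion / leaf if leaf else 0.0

def grade(severity: float) -> int:
    """Table-query step: map percent severity to a discrete disease grade."""
    for upper, label in SEVERITY_GRADES:
        if severity <= upper:
            return label
    return SEVERITY_GRADES[-1][1]

# Toy 4x4 mask standing in for a model prediction: 8 leaf pixels,
# 2 of which are lesion pixels, giving 25% severity.
mask = np.array([
    [0, 0, 1, 1],
    [0, 1, 1, 2],
    [0, 1, 1, 2],
    [0, 0, 0, 0],
])
sev = percent_severity(mask)
print(sev, grade(sev))  # 25.0 2
```

In practice the mask would be the argmax over the per-pixel class scores emitted by the segmentation network, and the severity would be averaged or reported per leaf before grading.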