Handwriting has been a natural way to communicate since ancient times. Humans often prefer to write and draw to facilitate the rapid exchange of ideas. When handwritten diagrams must be converted to application-specific digital formats for later use, the conversion requires skill and time. Automatic handwritten document conversion can save time, but diagrams and text require different recognition engines. We propose to solve the problem of segmenting the text and non-text elements of handwritten diagrams using deep semantic segmentation. Our model, DeepDP, can be tuned in complexity to a level appropriate for a particular dataset and diagram type. Experiments on a public flowchart dataset and a business process flow diagram dataset show excellent performance, with more than 95% accuracy, despite relatively small training datasets.