Anthropoid images encode abundant and reliable biometric information. Recent advances in image-based screening motivate this effort to investigate the feasibility of inferring nutritional status from multidimensional human body images. At the same time, anthropometric databases are becoming increasingly essential and must grow in parallel to support efficient system designs. This paper presents a novel imaging-based strategy, operating in an augmented environment, that quantifies human anthropometric features with blockchain-based learning and generates a diagnosis report. It evaluates attributes such as height, weight, forearm, wrist, waist, Mid-Upper Arm Circumference (MUAC), knee, feet, and head length from an image, using Augmented Reality and blockchain-based transfer learning for diagnostic accuracy. We developed a novel framework, FETTLE, to determine the role of body measurements in assessing nutritional status and body weight from human body images. It offers a readily applicable technique for evaluating children's growth patterns throughout their early years. The FETTLE app can also be used with bedridden patients as a screening mechanism to identify their risk of pressure ulcers and undernutrition, to be followed by a more structured examination. Our approach achieves superior accuracy in a consortium blockchain-based learning context, with privacy-preserving medical data sharing and a high-quality user experience and interaction. The framework attains about 97.26% validation accuracy on anthropoid images.
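To make the image-to-measurement step concrete, the following is a minimal, hypothetical sketch of how body lengths could be derived from detected pose landmarks. It is not the paper's FETTLE pipeline (the AR capture, blockchain-based transfer learning, and diagnosis report are omitted); MediaPipe Pose is used only as a stand-in landmark detector, and the centimeter calibration factor is assumed to come from an AR measurement of the scene scale.

```python
# Hypothetical sketch: convert 2D pose landmarks into rough body-length
# estimates. Assumes a cm-per-normalized-unit scale supplied externally
# (e.g., by an AR depth/scale measurement); not the authors' method.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose


def landmark_distance_cm(lm_a, lm_b, cm_per_unit):
    """Euclidean distance between two normalized landmarks, scaled to cm."""
    return ((lm_a.x - lm_b.x) ** 2 + (lm_a.y - lm_b.y) ** 2) ** 0.5 * cm_per_unit


def estimate_measures(image_path, cm_per_unit):
    """Return rough forearm-length and height proxies (in cm) for one image."""
    image = cv2.imread(image_path)
    with mp_pose.Pose(static_image_mode=True) as pose:
        results = pose.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))
    if not results.pose_landmarks:
        return None  # no person detected
    lm = results.pose_landmarks.landmark
    P = mp_pose.PoseLandmark
    return {
        # Forearm proxy: left elbow-to-wrist segment.
        "forearm_cm": landmark_distance_cm(lm[P.LEFT_ELBOW], lm[P.LEFT_WRIST], cm_per_unit),
        # Height proxy: nose-to-ankle span (ignores head crown and posture).
        "height_cm": landmark_distance_cm(lm[P.NOSE], lm[P.LEFT_ANKLE], cm_per_unit),
    }


# Usage (hypothetical): scale_from_ar is the calibrated cm-per-unit factor.
# measures = estimate_measures("subject.jpg", cm_per_unit=scale_from_ar)
```

Circumferential attributes such as MUAC, waist, and wrist would need additional modeling (for example, a regression from multiple views or depth data) beyond the simple landmark distances shown here.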