The School Food Sweden tool
The development and validation of the tool is described in more detail elsewhere . Briefly, a stakeholder group helped identify six important domains of school meal quality, and a web-based tool was developed to measure provision/choice, nutritional quality, food safety, service and pedagogy, and environmental and organisational aspects. Following pilot-testing, validation studies and further improvements, including automatic feedback reports, the tool was made freely available to all primary schools in March 2012. The tool consists of two parts: questionnaires – one per domain – plus feedback in the form of a tailored report. The questionnaires are free-standing and can be answered in any order; all are optional, but the domain “provision/choice” must be completed in order to generate the report. The report covers all domains, even those not yet completed. It is a pedagogically designed PDF document, about 20 pages long, that includes summary statistics and clear explanations of why each domain and sub-domain is important. The score for each question is shown using a traffic-light colour system to indicate what is currently good and what could be improved. Schools can contact the administrator of the tool by email or phone if they have questions, but no support, follow-up or other feedback is offered as standard. Schools can use any part of the tool as often as they wish, without limitations. Any member of staff can complete any questionnaire, but the nutritional quality domain is most commonly completed by the school kitchen manager. When the school is ready, they click a button to create and download their tailored feedback.
Setting, recruitment and study design
Guidelines for school meals are produced by the Swedish Food Agency, a government agency. Guidelines were first issued in 2007; a major revision was published in 2013 . The guidelines state that meals are expected to meet nutritional recommendations over a four-week period and include general information about foods to promote, but no standards or rules. In fact, in the most recent revision in 2018 , suggestions of food servings were toned down even further, to emphasise the importance of schools taking a common-sense and holistic approach. This non-prescriptive approach is possible in part because of the long tradition of school meals – even today a school lunch consists of a cooked meal, a salad buffet, crispbread with spread, and milk/water. Fried foods have never been a feature, nor have desserts or soft drinks. In the majority of schools, a choice of two or more warm meals is offered. Food is prepared freshly, either on-site (by municipal or private catering), at a nearby school or at a central municipal kitchen. School cafeterias are common; although their offering is generally less healthy, it is not free of charge, and vending machines are very rare. The guidelines also emphasise the need to consider other aspects of meal quality, such as the importance of a pleasant meal environment, adequate time to eat, and the potential for school meals to be well integrated in the school’s educational activities. Pupils generally serve themselves and eat in a canteen; teachers are usually present and are encouraged to use the meal as an opportunity to interact with pupils, the “pedagogical lunch” .
The study population was all primary schools that used the tool between the launch date, 29 March 2012, and 31 July 2019. Schools self-select to use the tool, although some public schools may be directed to do so by their municipality. They are not invited, and any contact with them prior to sign-up is usually indirect – e.g., the tool is mentioned in guidelines, we have had contact with their municipalities and regions or with organisations and government agencies working with public meals, or we have participated in relevant meetings/conferences. A municipality-level account function, which can create municipality-level reports and give an overview of school accounts, was added in 2016.
To examine changes over time (Study A), we used a repeated cross-sectional design. If a school performed more than one audit of nutritional quality during a school year (defined here as 1 August–31 July), only the most recent was included when reporting that school year’s results at group level. To compare the results of repeated audits (Study B), we used an open cohort design. Due to pilot-testing and the pre/post study, some schools had used the tool before the launch date, when automatic feedback was not yet in place. We restricted this analysis to schools that had only ever been exposed to the complete version of the tool: all schools whose first audit of nutritional quality was completed after the launch date were included, together with all of their audits.
The nutritional quality questionnaire assesses the adequacy of a school’s four-week lunch menu in terms of four nutritional aspects: iron, fibre, vitamin D and fat quality. These four were chosen to focus on nutrients of importance for children whose recommended intakes are not easily met , including in school lunches [22, 23], while keeping the questionnaire as brief as possible. The questionnaire includes questions about the serving frequency of rich and/or common food sources of these nutrients over a four-week period. All data are self-reported by schools. The answers are scored and compared to validated criteria for the four nutrients . If the criteria for all four are met, the school menu is classified as “likely to meet nutritional recommendations”, in this study referred to as “meeting nutritional criteria”, the primary outcome. All other results are combined as “not meeting nutritional criteria”. Where two audits had been conducted very close together (within 28 days), the later was excluded, on the assumption that it was unlikely to reflect meaningful changes and could signal that the school was testing the effect of alternative answers.
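The classification and de-duplication rules above can be sketched as follows. This is an illustrative sketch only, not the validated scoring algorithm: the criterion names, the threshold structure and the exact handling of the 28-day window are assumptions.

```python
from datetime import date

NUTRIENTS = ("iron", "fibre", "vitamin_d", "fat_quality")

def meets_nutritional_criteria(scores: dict, criteria: dict) -> str:
    """Classify a menu: all four nutrient criteria must be met.
    (Illustrative thresholds; the real tool uses validated criteria.)"""
    if all(scores[n] >= criteria[n] for n in NUTRIENTS):
        return "meeting nutritional criteria"
    return "not meeting nutritional criteria"

def drop_near_duplicates(audit_dates: list) -> list:
    """Exclude any audit within 28 days of the previously kept audit
    (an assumed reading of the 28-day rule)."""
    kept = []
    for d in sorted(audit_dates):
        if not kept or (d - kept[-1]).days > 28:
            kept.append(d)
    return kept
```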
We extracted data on when and how often the school had performed the audit(s) of nutritional quality, the number of days that had elapsed since any previous audit, and whether feedback (a report with results) had been generated. Some reports are never generated, due to lack of awareness, lack of interest or perhaps technical difficulties, and we cannot see whether reports have been opened or read. We calculated the proportion of occasions on which a school had generated reports and categorised this as sometimes (0-50% of occasions), mostly (51-89%) and almost always (90-100%). For public schools we also noted if and when the municipality had created an account. This variable was included as a proxy for how engaged the municipality was with the tool, which could either signal that schools had extra support or that they were under external pressure to use it.
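The banding of report generation described above could be implemented as below; the handling of non-integer percentages at the band edges is an assumption, since the text gives integer bands.

```python
def feedback_category(reports_generated: int, audits: int) -> str:
    """Categorise how often a school generated its feedback report.
    Bands follow the text: 0-50% 'sometimes', 51-89% 'mostly',
    90-100% 'almost always'. Edge behaviour for fractional
    percentages is assumed."""
    pct = 100 * reports_generated / audits
    if pct <= 50:
        return "sometimes"
    if pct < 90:
        return "mostly"
    return "almost always"
```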
Data on schools were extracted from a national database , namely: the number of pupils, the owner of the school (municipality or private), and the location of the school. As measures of the school’s socioeconomic position, we used the proportion of pupils whose parents have longer education (>12 years), as well as the proportion with a foreign background (pupil or both parents born outside Sweden). Occasionally data were missing or, where a category contained fewer than 10 pupils, not published; in the latter case we imputed the value as 5. School size was categorised into three categories (≤200 pupils, 201-400 pupils, >400 pupils). Geographical location in Sweden was coded as east, south or north, according to one of the definitions used by Statistics Sweden.
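The imputation and size-banding rules could be sketched as follows; representing suppressed counts as `None` and the helper names are assumptions for illustration, and the top band is read as more than 400 pupils.

```python
def impute_pupils(count):
    """Counts suppressed for <10 pupils are assumed to arrive as None;
    per the text, these are imputed as 5."""
    return 5 if count is None else count

def size_category(pupils: int) -> str:
    """Three size bands as described in the text."""
    if pupils <= 200:
        return "<=200"
    if pupils <= 400:
        return "201-400"
    return ">400"
```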
For the cross-sectional study (Study A), the proportion of schools meeting the criteria for nutritional quality each school year was compared, and a binary logistic regression was performed to test whether school year was a significant predictor. For Study B, to investigate whether schools with more audits were more likely to meet the nutritional criteria than those with fewer, three analyses were performed. Firstly, we grouped audits from all schools by audit order (i.e., all 1st audits, all 2nd audits, etc.) and compared the proportion of schools meeting the criteria across the groups, calculating the average results and the average change from the preceding audit. As selection bias was a potential concern – schools that went on to use the tool many times might differ from those that used it only once – we repeated this analysis, stratifying schools according to the total number of audits each performed, to see if the pattern held. Schools with more than nine audits were excluded due to very small numbers (N=13, 1%).
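The first grouping step can be sketched as below: audits are pooled by their order (1st, 2nd, …) and the proportion meeting the criteria is computed per group. The input data structure is hypothetical.

```python
from collections import defaultdict

def proportion_by_audit_order(school_audits: dict) -> dict:
    """school_audits maps a school id to its chronological list of outcomes
    (True = met nutritional criteria). Returns {audit order: proportion met}."""
    met = defaultdict(int)
    total = defaultdict(int)
    for outcomes in school_audits.values():
        for order, ok in enumerate(outcomes, start=1):
            total[order] += 1
            met[order] += ok
    return {order: met[order] / total[order] for order in total}
```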
Secondly, we extended this subgroup analysis in a way that allowed us to control for potential confounders, using variance-weighted least squares (VWLS) regression. This model, sometimes referred to as meta-regression, extends simple linear regression by treating the outcome as an estimated quantity rather than a single observation. For each subgroup (schools grouped by audit order), the variance of the outcome variable is estimated and assumed independent of the other subgroups. The model then treats each subgroup as one observation, weighted by the inverse of its estimated variance. In general, the outcome variable can be seen as an estimate and the explanatory variables as confounders observed at subgroup level that might influence the average “intervention” effect. Here, we estimated the effect of audit order with and without the potential confounders included in the models. The confounders controlled for were the distribution of region, the proportion of private schools and the average size of the schools.
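For a single predictor (audit order) and no confounders, the VWLS idea reduces to inverse-variance-weighted simple linear regression, which can be written in closed form. This is a minimal sketch of that weighting principle, not the Stata `vwls` implementation used in the study.

```python
def vwls_slope_intercept(x, y, var):
    """Variance-weighted least squares with one predictor: each subgroup i
    contributes the point (x[i], y[i]) with weight 1 / var[i],
    i.e., the inverse of its estimated outcome variance."""
    w = [1.0 / v for v in var]
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    sxy = sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y))
    sxx = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x))
    slope = sxy / sxx
    intercept = ybar - slope * xbar
    return slope, intercept
```

Subgroups estimated with less precision (larger variance, e.g., few schools at a high audit order) therefore pull the fitted line less.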
Thirdly, as the tool consists of two components – an audit component and a feedback component – we wanted to consider both as independent predictors. Two logistic regressions were performed to test whether the number of occasions a school had evaluated its nutritional quality, or the proportion of occasions on which it had generated its previous feedback, was associated with the odds of meeting the nutritional criteria. (We first checked that there was no evidence of a correlation between the number of audits performed and the percentage of all feedback generated; Spearman’s rho = 0.019.) Potential confounding factors in both regression models were audit date (expressed as months since March 2012), school characteristics and, for public schools, whether the municipality had an account by the time of the school’s final audit. Statistical significance was set at the 0.05 level. Analyses were performed in IBM SPSS Statistics for Windows (version 26), except for the VWLS, which was performed in Stata Statistical Software (version 16.1).
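The preliminary correlation check mentioned above is a Spearman rank correlation, i.e., the Pearson correlation of the ranks of the two variables. The actual analysis was run in SPSS; the plain-Python sketch below (with midranks for ties) is only illustrative.

```python
def spearman_rho(a, b):
    """Spearman rank correlation: Pearson correlation of the ranks,
    using average (mid) ranks for tied values."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # 1-based midrank of the tied block
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    ra, rb = ranks(a), ranks(b)
    n = len(a)
    ma, mb = sum(ra) / n, sum(rb) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(ra, rb))
    den = (sum((x - ma) ** 2 for x in ra) * sum((y - mb) ** 2 for y in rb)) ** 0.5
    return num / den
```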