1. Créquit, P., et al., Wasted research when systematic reviews fail to provide a complete and up-to-date evidence synthesis: the example of lung cancer. BMC medicine, 2016. 14(1): p. 8.
2. Gøtzsche, P.C., Why we need a broad perspective on meta-analysis: it may be crucially important for patients. 2000, British Medical Journal Publishing Group.
3. Ioannidis, J.P., Integration of evidence from multiple meta-analyses: a primer on umbrella reviews, treatment networks and multiple treatments meta-analyses. CMAJ, 2009. 181(8): p. 488-493.
4. Leucht, S., et al., Network meta-analyses should be the highest level of evidence in treatment guidelines. European Archives of Psychiatry & Clinical Neuroscience, 2016. 266(6): p. 477-80.
5. Li, T., et al., Comparative Effectiveness of First-Line Medications for Primary Open-Angle Glaucoma: A Systematic Review and Network Meta-analysis. Ophthalmology, 2016. 123(1): p. 129-40.
6. Nikolakopoulou, A., et al., Living network meta-analysis compared with pairwise meta-analysis in comparative effectiveness research: empirical study. BMJ, 2018. 360: p. k585.
7. Naudet, F., E. Schuit, and J. Ioannidis, Overlapping network meta-analyses on the same topic: survey of published studies. International journal of epidemiology, 2017. 46(6): p. 1999-2008.
8. Patel, C.J., B. Burford, and J.P. Ioannidis, Assessment of vibration of effects due to model specification can demonstrate the instability of observational associations. Journal of clinical epidemiology, 2015. 68(9): p. 1046-1058.
9. Whiting, P., et al., ROBIS: A new tool to assess risk of bias in systematic reviews was developed. J Clin Epidemiol, 2016. 69: p. 225-34.
10. Greco, T., et al., The attractiveness of network meta-analysis: a comprehensive systematic and narrative review. Heart, Lung and Vessels, 2015. 7(2): p. 133.
11. Jansen, J.P. and H. Naci, Is network meta-analysis as valid as standard pairwise meta-analysis? It all depends on the distribution of effect modifiers. BMC Med, 2013. 11: p. 159.
12. Li, T., et al., Network meta-analysis-highly attractive but more methodological research is needed. BMC medicine, 2011. 9(1): p. 79.
13. Hutton, B., et al., The PRISMA extension statement for reporting of systematic reviews incorporating network meta-analyses of health care interventions: checklist and explanations. Annals of Internal Medicine, 2015. 162(11): p. 777-84.
14. Higgins, J.P., et al., The Cochrane Collaboration’s tool for assessing risk of bias in randomised trials. BMJ, 2011. 343: p. d5928.
15. Chandler, J., et al., Methodological standards for the conduct of new Cochrane Intervention Reviews. S.l.: Cochrane Collaboration, 2013.
16. Shea, B.J., et al., AMSTAR 2: a critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. BMJ, 2017. 358: p. j4008.
17. Shea, B.J., et al., Development of AMSTAR: a measurement tool to assess the methodological quality of systematic reviews. BMC medical research methodology, 2007. 7(1): p. 10.
18. Oxman, A.D. and G.H. Guyatt, Validation of an index of the quality of review articles. J Clin Epidemiol, 1991. 44(11): p. 1271-8.
19. Guyatt, G., et al., GRADE guidelines: 1. Introduction—GRADE evidence profiles and summary of findings tables. Journal of clinical epidemiology, 2011. 64(4): p. 383-394.
20. Puhan, M.A., et al., A GRADE Working Group approach for rating the quality of treatment effect estimates from network meta-analysis. BMJ, 2014. 349: p. g5630.
21. Brignardello-Petersen, R., et al., Advances in the GRADE approach to rate the certainty in estimates from a network meta-analysis. Journal of clinical epidemiology, 2018. 93: p. 36-44.
22. Nikolakopoulou, A., et al., CINeMA: An approach for assessing confidence in the results of a network meta-analysis. PLoS Medicine, 2020. 17(4): p. e1003082.
23. Phillippo, D.M., et al., Threshold analysis as an alternative to GRADE for assessing confidence in guideline recommendations based on network meta-analyses. Annals of internal medicine, 2019. 170(8): p. 538-546.
24. Page, M.J., et al., The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ, 2021. 372: p. n71.
25. Moher, D., et al., Guidance for developers of health research reporting guidelines. PLoS medicine, 2010. 7(2): p. e1000217.
26. Whiting, P., et al., A proposed framework for developing quality assessment tools. Syst Rev, 2017. 6(1): p. 204.
27. Page, M.J., et al., Mapping of reporting guidance for systematic reviews and meta-analyses generated a comprehensive item bank for future reporting guidelines. J Clin Epidemiol, 2020. 118: p. 60-68.
28. Page, M.J., et al., Updating guidance for reporting systematic reviews: development of the PRISMA 2020 statement. 2020.
29. Song, F., et al., Methodological problems in the use of indirect comparisons for evaluating healthcare interventions: survey of published systematic reviews. BMJ, 2009. 338: p. b1147.
30. Sanderson, S., I.D. Tatt, and J. Higgins, Tools for assessing quality and susceptibility to bias in observational studies in epidemiology: a systematic review and annotated bibliography. International journal of epidemiology, 2007. 36(3): p. 666-676.
31. Lunny, C., et al., Methodological review to develop a list of bias items used to assess reviews incorporating network meta-analysis: protocol and rationale. BMJ Open, 2021. 11(6): p. e045987.
32. Page, M.J., J.E. McKenzie, and J.P.T. Higgins, Tools for assessing risk of reporting biases in studies and syntheses of studies: a systematic review. BMJ Open, 2018. 8(3): p. e019703.
33. Moher, D., et al., Assessing the quality of randomized controlled trials: an annotated bibliography of scales and checklists. Control Clin Trials, 1995. 16(1): p. 62-73.
34. Chambers, J.D., et al., An assessment of the methodological quality of published network meta-analyses: a systematic review. PLoS ONE, 2015. 10(4).
35. Chiocchia, V., et al., ROB-MEN: a tool to assess risk of bias due to missing evidence in network meta-analysis. BMC Medicine, 2021. 19(1): p. 304.
36. Bujkiewicz, S., et al., NICE DSU Technical Support Document 20: multivariate meta-analysis of summary data for combining treatment effects on correlated outcomes and evaluating surrogate endpoints. 2019.
37. Dotson, S., et al., Rising placebo response rates threaten the validity of antipsychotic meta-analyses. Annals of Clinical Psychiatry, 2019. 31(4): p. 249-259.
38. Ban, J.K., et al., History and publication trends in the diffusion and early uptake of indirect comparison meta-analytic methods to study drugs: animated coauthorship networks over time. BMJ open, 2018. 8(6): p. e019110.
39. Jansen, J.P., et al., Indirect treatment comparison/network meta-analysis study questionnaire to assess relevance and credibility to inform health care decision making: an ISPOR-AMCP-NPC Good Practice Task Force report. Value in Health, 2014. 17(2): p. 157-173.
40. Laws, A., et al., A Comparison of National Guidelines for Network Meta-Analysis. Value in Health, 2019. 22(10): p. 1178-1186.
41. Donegan, S., et al., Assessing key assumptions of network meta-analysis: a review of methods. Res Synth Methods, 2013. 4(4): p. 291-323.
42. Efthimiou, O., et al., GetReal in network meta-analysis: a review of the methodology. Research Synthesis Methods, 2016. 7(3): p. 236-63.
43. Stevens, J.W., et al., A review of methods for comparing treatments evaluated in studies that form disconnected networks of evidence. Research Synthesis Methods, 2018. 9(2): p. 148-162.
44. Welton, N.J., et al., CHTE2020 sources and synthesis of evidence; update to evidence synthesis methods. 2020, National Institute for Health and Care Excellence (NICE) Decision Support Unit (DSU): Sheffield, UK. Available at: http://rees-france.com/wp-content/uploads/2020/12/CHTE-2020_synthesis-of-evidence.pdf.
45. Ortega, A., et al., A checklist for critical appraisal of indirect comparisons. International journal of clinical practice, 2014. 68(10): p. 1181-1189.
46. Ades, A., et al., NICE DSU Technical Support Document 7: Evidence synthesis of treatment efficacy in decision making: a reviewer’s checklist. 2012, National Institute for Health and Clinical Excellence. Available at: https://research-information.bris.ac.uk/en/publications/nice-dsu-technical-support-document-7-evidence-synthesis-of-treatment-efficacy-in-decision-making-a-reviewers-checklist(3831c37d-b492-446f-8882-d94cabf7b95d).html.
47. Al Khalifah, R., et al., Network meta-analysis: users' guide for pediatricians. BMC Pediatrics, 2018. 18(1): p. 180.
48. Dias, S., et al., Chapter 8: Validity of network meta-analyses, in Network meta-analysis for decision-making. 2018, John Wiley & Sons.
49. Hutton, B., F. Catala-Lopez, and D. Moher, The PRISMA statement extension for systematic reviews incorporating network meta-analysis: PRISMA-NMA. Med Clin (Barc), 2016. 147(6): p. 262-266.
50. Jansen, J.P., et al., Interpreting indirect treatment comparisons and network meta-analysis for health-care decision making: report of the ISPOR Task Force on Indirect Treatment Comparisons Good Research Practices: part 1. Value in Health, 2011. 14(4): p. 417-428.
51. Kiefer, C., S. Sturtz, and R. Bender, Indirect Comparisons and Network Meta-Analyses. Deutsches Ärzteblatt International, 2015. 112(47): p. 803-808.
52. Papakonstantinou, T., et al., Estimating the contribution of studies in network meta-analysis: paths, flows and streams. F1000Research, 2018. 7: p. 610.
53. Salanti, G., et al., Evaluating the quality of evidence from a network meta-analysis. PLoS ONE, 2014. 9(7).
54. Richter, T., K.A. Lee, and CADTH Working Group Contributors, Guidance document on reporting indirect comparisons. 2015, CADTH: Ottawa.
55. Chaimani, A., et al., Undertaking network meta‐analyses. Cochrane handbook for systematic reviews of interventions, 2019: p. 285-320.
56. Chaimani, A., et al., Additional considerations are required when preparing a protocol for a systematic review with multiple interventions. J Clin Epidemiol, 2017. 83: p. 65-74.
57. Chaimani, A., et al., Common pitfalls and mistakes in the set-up, analysis and interpretation of results in network meta-analysis: what clinicians should look for in a published article. Evid Based Ment Health, 2017. 20(3): p. 88-94.
58. Coleman, C.I., et al., Use of Mixed Treatment Comparisons in Systematic Reviews, in AHRQ Methods for Effective Health Care. 2012, Agency for Healthcare Research and Quality (US): Rockville (MD).
59. Cope, S., et al., A process for assessing the feasibility of a network meta-analysis: a case study of everolimus in combination with hormonal therapy versus chemotherapy for advanced breast cancer. BMC Medicine, 2014. 12: p. 93.
60. Dwan, K., L. Bickerdike, and N. Livingstone, Editorial decisions in reviews with network meta-analysis. 2020, Cochrane Editorial and Methods Department. Available at: https://training.cochrane.org/resource/editorial-considerations-reviews-network-meta-analysis.
61. Foote, C.J., et al., Network Meta-analysis: Users' Guide for Surgeons: Part I - Credibility. Clinical Orthopaedics & Related Research, 2015. 473(7): p. 2166-71.
62. Haute Autorité de Santé, Summary Report. Indirect comparisons, methods and validity. 2009.
63. Hummel, N., et al., Work Package 4: Methodological guidance, recommendations and illustrative case studies for (network) meta-analysis and modelling to predict real-world effectiveness using. 2017.
64. Tonin, F.S., et al., Mapping the characteristics of network meta-analyses on drug therapy: A systematic review. PLoS ONE, 2018. 13(4): p. e0196644.
65. Fleetwood, K., et al., A Review of the use of network meta-analysis In NICE Single Technology Appraisals. Value in Health, 2016. 19(7): p. A348.
66. Bafeta, A., et al., Reporting of results from network meta-analyses: methodological systematic review. BMJ, 2014. 348: p. g1741.
67. Thieffry, S., et al., Understanding the challenge of comparative effectiveness research in focal epilepsy: A review of network meta‐analyses and real‐world evidence on antiepileptic drugs. Epilepsia, 2020. 61(4): p. 595-609.
68. Donegan, S., et al., Indirect comparisons: a review of reporting and methodological quality. PLoS One, 2010. 5(11): p. e11054.
69. Cameron, C., et al., The importance of considering differences in study design in network meta-analysis: an application using anti-tumor necrosis factor drugs for ulcerative colitis. Medical Decision Making, 2017. 37(8): p. 894-904.
70. Cameron, C., et al., Importance of assessing and adjusting for cross-study heterogeneity in network meta-analysis: a case study of psoriasis. Journal of comparative effectiveness research, 2018. 7(11): p. 1037-1051.
71. Davies, A.L. and T. Galla, Degree irregularity and rank probability bias in network meta‐analysis. Research Synthesis Methods, 2021. 12(3): p. 316-332.
72. Donegan, S., et al., Assessing key assumptions of network meta‐analysis: a review of methods. Research synthesis methods, 2013. 4(4): p. 291-323.
73. Efthimiou, O. and I.R. White, The dark side of the force: multiplicity issues in network meta‐analysis and how to address them. Research Synthesis Methods, 2020. 11(1): p. 105-122.
74. Efthimiou, O., Multivariate extension of meta-analysis. 2017, University of Ioannina, School of Health Sciences, Department of Medicine, Division ….
75. Goring, S., et al., Disconnected by design: analytic approach in treatment networks having no common comparator. Research synthesis methods, 2016. 7(4): p. 420-432.
76. Jackson, D., et al., Paule‐Mandel estimators for network meta‐analysis with random inconsistency effects. Research synthesis methods, 2017. 8(4): p. 416-434.
77. Kibret, T., D. Richer, and J. Beyene, Bias in identification of the best treatment in a Bayesian network meta-analysis for binary outcome: a simulation study. Clinical Epidemiology, 2014. 6: p. 451-60.
78. Krahn, U., H. Binder, and J. König, A graphical tool for locating inconsistency in network meta-analyses. BMC medical research methodology, 2013. 13(1): p. 1-18.
79. Lin, L., H. Chu, and J.S. Hodges, Sensitivity to excluding treatments in network meta-analysis. Epidemiology, 2016. 27(4): p. 562.
80. Linde, K., et al., Questionable assumptions hampered interpretation of a network meta-analysis of primary care depression treatments. J Clin Epidemiol, 2016. 71: p. 86-96.
81. Marks‐Anglin, A. and Y. Chen, A historical review of publication bias. Research Synthesis Methods, 2020. 11(6): p. 725-742.
82. Naci, H., S. Dias, and A.E. Ades, Industry sponsorship bias in research findings: a network meta-analysis of LDL cholesterol reduction in randomised trials of statins. BMJ, 2014. 349: p. g5741.
83. Owen, R.K., et al., Multivariate network meta-analysis incorporating class effects. BMC medical research methodology, 2020. 20(1): p. 1-21.
84. Papakonstantinou, T., et al., In network meta-analysis, most of the information comes from indirect evidence: empirical study. Journal of clinical epidemiology, 2020. 124: p. 42-49.
85. Salanti, G., V. Marinho, and J.P. Higgins, A case study of multiple-treatments meta-analysis demonstrates that covariates should be considered. Journal of clinical epidemiology, 2009. 62(8): p. 857-864.
86. Shi, C., et al., Node-making processes in network meta-analysis of nonpharmacological interventions should be well planned and reported. Journal of Clinical Epidemiology, 2018. 101: p. 124-125.
87. Song, F., et al., Validity of indirect comparison for estimating efficacy of competing interventions: empirical evidence from published meta-analyses. BMJ, 2003. 326(7387): p. 472.
88. Tan, S.H., et al., Presentational approaches used in the UK for reporting evidence synthesis using indirect and mixed treatment comparisons. Journal of health services research & policy, 2013. 18(4): p. 224-232.
89. Thorlund, K., et al., Why the findings of published multiple treatment comparison meta-analyses of biologic treatments for rheumatoid arthritis are different: an overview of recurrent methodological shortcomings. Annals of the Rheumatic Diseases, 2013. 72(9): p. 1524-35.
90. Tonin, F.S., et al., Description of network meta-analysis geometry: A metrics design study. PLoS ONE, 2019. 14(2): p. e0212650.
91. Lunny, C., et al., Knowledge user survey and Delphi process to inform development of a new risk of bias tool to assess systematic reviews with network meta-analysis (RoB NMA tool). BMJ Evidence-Based Medicine, 2023. 28(1): p. 58-67.
92. Whiting, P., et al., ROBIS: Tool to assess risk of bias in systematic reviews - guidance on how to use ROBIS. 2016. Available at: http://www.bristol.ac.uk/media-library/sites/social-community-medicine/robis/robisguidancedocument.pdf (accessed March 26, 2018).
93. Cochrane Methods Group, About the Cochrane Methodology Register. 2012, Cochrane: http://www.cochranelibrary.com/help/the-cochrane-methodology-register-july-issue-2012.html.