In this paper, we modify Yayambo, the performance-agnostic method of [21] for fusing probabilistic classifier outputs. Yayambo is an iterative method proposed for combining the probabilistic outputs of black-box classifiers trained for the same task into a single consensus class prediction. It accounts for the diversity among the classifiers' outputs, iteratively updating each prediction according to its correspondence with the other predictions until the predictions converge to a consensus decision. Whereas [21] treated the classifiers as black boxes, this paper considers cases where prior information, namely the accuracy of each individual classifier, is taken into account when fusing the classifier outputs. We then discuss the relevance of such prior information to the combination of probabilistic classifier outputs. Although informed classifier fusion methods generally outperform uninformed ones, our experiments demonstrate that the performance of a fusion method, whether informed or uninformed, depends on the nature of the data and the types of individual classifiers used. One therefore needs to select a classifier fusion method with a good understanding of the characteristics of the data sets and classification models involved; in some situations, uninformed fusion methods may outperform informed ones. Nevertheless, the modified version of Yayambo (i.e. informed Yayambo) outperforms the uninformed one in most of the cases considered in this study.
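To make the iterative-fusion idea concrete, the following is a minimal Python sketch, not the exact Yayambo update rule from [21]: each classifier's probability vector is repeatedly pulled toward the outputs it corresponds with (dot-product similarity is assumed here as the correspondence measure), and passing per-classifier accuracies as weights yields the "informed" variant while omitting them yields the uninformed, performance-agnostic one. The function name, the similarity measure, and the 0.5 mixing coefficient are illustrative assumptions.

```python
import numpy as np

def yayambo_like_fusion(probs, accuracies=None, max_iter=100, tol=1e-8):
    """Illustrative iterative consensus fusion (NOT the exact Yayambo rule).

    probs: (n_classifiers, n_classes) rows of class-probability vectors.
    accuracies: optional per-classifier accuracies; supplying them gives the
        "informed" variant, None gives the uninformed one.
    """
    p = np.asarray(probs, dtype=float).copy()
    n = p.shape[0]
    # Uninformed: uniform weights. Informed: weight each classifier by accuracy.
    w = np.ones(n) if accuracies is None else np.asarray(accuracies, dtype=float)
    for _ in range(max_iter):
        new_p = np.empty_like(p)
        for i in range(n):
            sim = p @ p[i]        # correspondence of every output with output i
            sim[i] = 0.0          # exclude self-similarity
            coef = w * sim        # informed variant scales agreement by accuracy
            # Weighted view of the other outputs (coef.sum() > 0 whenever the
            # probability vectors share support, as they do for soft outputs).
            pull = coef @ p / coef.sum()
            new_p[i] = 0.5 * p[i] + 0.5 * pull   # move toward agreeing outputs
            new_p[i] /= new_p[i].sum()           # keep a valid distribution
        if np.abs(new_p - p).max() < tol:        # stop once predictions converge
            p = new_p
            break
        p = new_p
    fused = p.mean(axis=0)                       # consensus probability vector
    return fused, int(fused.argmax())
```

For example, with three classifiers where two favor class 0 and one favors class 1, the dissenting output is drawn toward the agreeing pair over the iterations and the consensus prediction is class 0.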