In his recent paper (https://doi.org/10.1016/j.measen.2022.100416), Willink derived two contradictory theoretical solutions to the problem of combining and transforming two sets of information about the same quantity. Each set of information is represented by a probability density function (PDF), and the transformation function is nonlinear. We refer to Willink’s contradictory result as the “Willink paradox”. The same potential contradiction can arise in any measurement uncertainty analysis that involves combining and propagating information. Both operations, (a) information combination and (b) information transformation (or propagation), are mathematically valid under probability theory, so the two contradictory solutions are both theoretically correct. In practice, however, such as in measurement uncertainty analysis, we cannot use both solutions; we must choose one or the other. We propose an entropy metric for choosing the preferred solution, thereby resolving the Willink paradox. Four examples are presented to illustrate the Willink paradox and its resolution.
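The order-of-operations conflict behind the paradox can be reproduced numerically. The sketch below is illustrative only and does not use Willink’s actual examples: it assumes two Gaussian states of knowledge about the same quantity, N(0, 1) and N(1, 0.5²), a nonlinear monotonic transformation Y = exp(X), and a normalized-product (“conflation”) combination rule. Combining first and then transforming (route a) yields a different PDF for Y, and hence a different mean, than transforming each PDF first and then combining (route b), even though both routes are valid applications of probability theory.

```python
import numpy as np

# Illustrative sketch (assumed inputs, not from Willink's paper):
# two Gaussian PDFs for the same quantity X, a nonlinear map Y = exp(X),
# and normalized-product combination of PDFs.

x = np.linspace(-5.0, 5.0, 20001)                      # grid over X
p1 = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)          # N(0, 1)
p2 = np.exp(-0.5 * ((x - 1.0) / 0.5)**2) / (0.5 * np.sqrt(2 * np.pi))  # N(1, 0.25)

y = np.exp(x)                                          # image grid under Y = exp(X)

def integrate(f, grid):
    """Trapezoidal rule on a (possibly non-uniform) grid."""
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(grid)))

def combine(pa, pb, grid):
    """Normalized-product combination of two PDFs tabulated on `grid`."""
    prod = pa * pb
    return prod / integrate(prod, grid)

# Route (a): combine in X, then propagate through Y = exp(X).
# For a monotonic map, the change of variables gives q(y) = p(ln y) / y.
p_comb = combine(p1, p2, x)
q_a = p_comb / y

# Route (b): propagate each PDF to Y first, then combine in Y.
q1, q2 = p1 / y, p2 / y
q_b = combine(q1, q2, y)

mean_a = integrate(y * q_a, y)   # analytic value: exp(0.9) ≈ 2.460
mean_b = integrate(y * q_b, y)   # analytic value: exp(0.7) ≈ 2.014
print(mean_a, mean_b)
```

Under these assumptions the two routes correspond to lognormal(0.8, 0.2) and lognormal(0.6, 0.2) results respectively, so the disagreement is not a numerical artifact; it is exactly the kind of contradiction the proposed entropy metric is meant to adjudicate.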