Using the concavity of the logarithmic map and the weighted arithmetic mean-geometric mean inequality, we establish an analytic inequality for the logarithm and apply it to the Kullback-Leibler distance in information theory. Some applications to Shannon's entropy are given as well.
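For reference, the two standard quantities named in the abstract are defined as follows (for discrete probability distributions $p = (p_1, \ldots, p_n)$ and $q = (q_1, \ldots, q_n)$; the notation here is generic, not taken from the paper):
\[
D(p \,\|\, q) \;=\; \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i} \;\geq\; 0,
\qquad
H(p) \;=\; -\sum_{i=1}^{n} p_i \log p_i .
\]
The nonnegativity of $D(p \,\|\, q)$ is the classical consequence of the concavity of the logarithm (via $\log x \leq x - 1$) to which results of the type described above apply.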