Using an inequality for convex functions due to Andrica and Rașa [1] (inequality (2.1)), we point out a new inequality for the logarithmic mapping and apply it in information theory to the Shannon entropy and the mutual information.
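For completeness, we recall the standard definitions of the two information-theoretic quantities mentioned above; the notation used here is ours and may differ from that adopted in the body of the paper. The Shannon entropy of a discrete probability distribution $p=(p_1,\dots,p_n)$ is
\[
H(p) := \sum_{i=1}^{n} p_i \log \frac{1}{p_i},
\]
and the mutual information of two discrete random variables $X$ and $Y$ with joint distribution $r(x,y)$ and marginal distributions $p(x)$ and $q(y)$ is
\[
I(X;Y) := \sum_{x,y} r(x,y) \log \frac{r(x,y)}{p(x)\,q(y)}.
\]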