In this paper we derive some upper bounds for the relative entropy D(p || q) of two probability distributions and apply them to mutual information and the entropy mapping. To this end we use an inequality for the logarithm function, (2.3) below, together with some classical inequalities such as the Kantorovič inequality and the Diaz-Metcalf inequality.
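For concreteness, a minimal numerical sketch of the relative entropy (Kullback-Leibler divergence) that the paper bounds; the function name and the sample distributions below are illustrative, not taken from the paper:

```python
import math

def relative_entropy(p, q):
    """Relative entropy D(p || q) = sum_i p_i * log(p_i / q_i) in nats,
    with the usual convention 0 * log(0 / q_i) = 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative distributions on a three-point alphabet.
p = [0.5, 0.3, 0.2]
q = [0.25, 0.25, 0.5]

print(relative_entropy(p, p))       # 0.0 -- D(p || p) = 0
print(relative_entropy(p, q) >= 0)  # True -- nonnegativity (Gibbs' inequality)
```

The upper bounds derived in the paper complement the classical lower bound D(p || q) >= 0.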