Perturbed generalised Taylor-like series are utilised to obtain approximations and bounds for divergence measures arising in Information Theory. In particular, the results are demonstrated for the estimation of the Kullback-Leibler distance, the Shannon entropy and the mutual information. An application to the Jeffreys divergence measure is also examined.