In complex dynamical systems, the precision and accuracy of parameter estimates and the quality of predictions depend on the information contained in the experimental data. Choosing experimental schemes that maximize the information contained in the data is known as Optimal Experimental Design (OED). The Fisher information matrix and the variance-covariance matrix are the central quantities in classical OED. However, applying OED to a class of models known as sloppy models renders the model less predictive, even though the parameters are estimated with high precision. This work introduces a new information gain index as an experimental design criterion in the Bayesian framework. The proposed criterion is based on the Bhattacharyya coefficient. Our previous studies showed that the information gain index signals a loss of practical identifiability and also distinguishes sloppy from stiff parameters. Here, we extend the information gain index and its interpretation to joint Gaussian distributions and then demonstrate, using simulations, that the new design criterion selects experiments that minimize both prediction and parameter uncertainty in sloppy models.
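For reference, the Bhattacharyya coefficient on which the criterion is based has a standard definition and a well-known closed form for Gaussian distributions; the expressions below state that textbook background only, not necessarily the exact form of the criterion developed in this work, and the symbols $\mu_i$, $\Sigma_i$ are introduced here to denote the means and covariances of the two distributions being compared.

% Standard Bhattacharyya coefficient between densities p and q, and its
% closed form for two multivariate Gaussians (textbook background, not the
% specific criterion proposed in this work).
\begin{align}
  \mathrm{BC}(p, q) &= \int \sqrt{p(x)\, q(x)}\, \mathrm{d}x, \\
  \mathrm{BC}\big(\mathcal{N}(\mu_1,\Sigma_1),\, \mathcal{N}(\mu_2,\Sigma_2)\big)
    &= \exp\!\left[ -\tfrac{1}{8}\, (\mu_1-\mu_2)^{\top} \bar{\Sigma}^{-1} (\mu_1-\mu_2)
       \;-\; \tfrac{1}{2} \ln \frac{\det \bar{\Sigma}}{\sqrt{\det \Sigma_1 \det \Sigma_2}} \right],
  \qquad \bar{\Sigma} = \tfrac{1}{2}\left(\Sigma_1 + \Sigma_2\right).
\end{align}

The coefficient ranges from 0 (non-overlapping distributions) to 1 (identical distributions), which is what makes it a natural building block for measuring how much an experiment changes the prior into the posterior.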