Cloud computing has been widely used by computational scientists and
engineers as a means to run large-scale simulations while avoiding
capital investment in hardware. However, a challenging problem is to
accurately estimate how many cloud resources a specific computation
requires in order to execute it cost-effectively. In
this paper, we use a real-world molecular dynamics (MD) simulation as a
motivating scenario and present our work in modeling parallel execution
of such a computation on the cloud. Our model estimates the workload of
an MD simulation in fine-grained detail and, based on that estimate,
calculates the time required to run the simulation. The accuracy of the
model has been evaluated using various types of MD simulations on
different scales.