We use 32 age measurements of passively evolving galaxies as a function of redshift to test and compare the standard model (ΛCDM) with the R_h = ct universe. We show that the latter fits the data with a reduced χ²_dof = 0.435 for a Hubble constant H_0 = 67.2^{+4.5}_{-4.0} km s^{-1} Mpc^{-1}. By comparison, the optimal flat ΛCDM model, with two free parameters (including Ω_m = 0.12^{+0.54}_{-0.11} and H_0 = 94.3^{+32.7}_{-35.8} km s^{-1} Mpc^{-1}), fits the age-z data with a reduced χ²_dof = 0.428. Based solely on their χ²_dof values, both models appear to account for the data very well, though the optimized ΛCDM parameters are only marginally consistent with those of the concordance model (Ω_m = 0.27 and H_0 = 70 km s^{-1} Mpc^{-1}). Fitting the age-z data with the latter results in a reduced χ²_dof = 0.523. However, because of the different number of free parameters in these models, selection tools, such as the Akaike, Kullback and Bayes Information Criteria, favor R_h = ct over ΛCDM with a likelihood of ~66.5%-80.5% versus ~19.5%-33.5%. These results are suggestive, though not yet compelling, given the current limited galaxy age-z sample. We carry out Monte Carlo simulations based on these current age measurements to estimate how large the sample would have to be in order to rule out either model at a ~99.7% confidence level. We find that if the real cosmology is ΛCDM, a sample of ~45 galaxy ages would be sufficient to rule out R_h = ct at this level of accuracy, while ~350 galaxy ages would be required to rule out ΛCDM if the real universe were instead R_h = ct. This difference in required sample size reflects the greater number of free parameters available to fit the data with ΛCDM.
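As a rough illustration of how information-criterion likelihoods of this kind arise, the sketch below converts the two quoted reduced χ² values into Akaike weights. The degrees-of-freedom bookkeeping (32 data points; one free parameter, H_0, for R_h = ct versus two, Ω_m and H_0, for ΛCDM) is our assumption for the sake of the example, not a detail taken from the abstract, and we use only the plain AIC of the three criteria mentioned.

```python
import math

def akaike_weights(chi2_list, k_list):
    """Relative model likelihoods from Akaike weights: w_i ∝ exp(-AIC_i / 2),
    with AIC_i = chi2_i + 2 k_i for Gaussian errors."""
    aic = [chi2 + 2 * k for chi2, k in zip(chi2_list, k_list)]
    a_min = min(aic)                       # subtract the minimum for numerical stability
    rel = [math.exp(-(a - a_min) / 2) for a in aic]
    total = sum(rel)
    return [r / total for r in rel]

# Assumed bookkeeping: n = 32 age measurements, dof = n - k for each model.
n = 32
k_rhct, k_lcdm = 1, 2                      # H_0 only vs. (Omega_m, H_0)
chi2_rhct = 0.435 * (n - k_rhct)           # reduced chi^2 -> total chi^2
chi2_lcdm = 0.428 * (n - k_lcdm)

w_rhct, w_lcdm = akaike_weights([chi2_rhct, chi2_lcdm], [k_rhct, k_lcdm])
print(f"R_h = ct: {w_rhct:.1%}, LCDM: {w_lcdm:.1%}")   # ~66.3% vs ~33.7%
```

Under these assumptions the AIC weight for R_h = ct comes out near 66%, consistent with the lower end of the ~66.5%-80.5% range quoted above; the Kullback and Bayes criteria penalize the extra ΛCDM parameter more heavily, which is why the quoted range extends higher.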