The numbers between Kissmetrics and Optimizely have always differed somewhat, but the conversion rates would always average out to be roughly equivalent. In this case, however, I'm seeing completely different results: Optimizely is showing Variation #1 as the winner, while Kissmetrics is showing the Original as the winner.

It almost seems like Kissmetrics has flipped the labels on the splits?

I'm still fairly new to these tools. Has anyone seen anything like this, or does anyone have ideas about what could be causing it?

Re: Vastly different results in Kissmetrics versus Optimizely; don't know who to trust?

This seems to be a rare case isolated to a particular experiment. I would like to take a closer look at the experiment and will need some more information. I am going to create a ticket on your behalf with this information, and someone in support should reach out to you soon.

Re: Vastly different results in Kissmetrics versus Optimizely; don't know who to trust?

I have gotten in touch with Kissmetrics support, and they offered some suggestions as to what could cause discrepancies between different analytics platforms, but nothing really explains why conversions are being attributed to the splits differently.
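One way to test the "flipped labels" theory directly is to record, in Kissmetrics itself, the variation name that Optimizely actually bucketed each visitor into, and then segment conversions by that property. Below is a minimal sketch of that idea. It assumes Optimizely Classic's `variationNamesMap` (a map from experiment ID to the served variation name) and the Kissmetrics `_kmq` command queue; the experiment ID `'1234567'` and the property name are placeholders, not values from this thread.

```javascript
// Pure helper so the mapping logic can be sanity-checked outside a browser.
// Given Optimizely's experiment-id -> variation-name map and one experiment
// id, build the property object we would hand to Kissmetrics.
function buildVariationProperty(variationNamesMap, experimentId) {
  var name = variationNamesMap[experimentId];
  if (!name) {
    return null; // visitor was not bucketed into this experiment
  }
  return { 'Optimizely Variation': name }; // property name is an assumption
}

// In the page, after the Optimizely snippet has run (hypothetical wiring):
//
//   var prop = buildVariationProperty(
//     window.optimizely.variationNamesMap, // Optimizely Classic API
//     '1234567'                            // placeholder experiment id
//   );
//   if (prop) {
//     _kmq.push(['set', prop]); // Kissmetrics: attach property to the person
//   }
```

If Kissmetrics then shows visitors tagged "Variation #1" converting under its "Original" split, that would confirm the labels are being swapped somewhere in the attribution, rather than the two tools simply counting visitors differently.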