CPU utilization by a query

Hi experts,

I understand that the CPU time consumed by a query can be determined by adding AMPCPUTime and ParserCPUTime. If I get a value of 100, does this mean the CPU was processing the query for 100 seconds? Or is the number expressed in some other unit?

Also, if the TD system has 50 AMPs, can we say that each AMP was processing for 2 seconds, assuming even data distribution (no skew)?
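To make the arithmetic I have in mind concrete, here is a small sketch (the values are made up, and the split between AMP-side and parser-side time is my assumption, since ParserCPUTime is accrued on the parsing engine rather than on the AMPs):

```python
# Illustrative numbers only, not from a real DBQL row.
amp_cpu_time = 98.0      # AMPCPUTime: CPU seconds spent on the AMPs (assumed value)
parser_cpu_time = 2.0    # ParserCPUTime: CPU seconds on the parsing engine (assumed value)
num_amps = 50

# Total CPU attributed to the query, as described in the question.
total_cpu_seconds = amp_cpu_time + parser_cpu_time

# If only the AMP portion is spread across AMPs and there is no skew,
# each AMP's share is the AMP time divided by the number of AMPs.
per_amp_seconds = amp_cpu_time / num_amps

print(total_cpu_seconds)  # 100.0
print(per_amp_seconds)    # 1.96
```

Is this per-AMP division the right way to think about it, or does the parser portion need to be treated differently?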

Finally, is there a direct relation between the CPU time mentioned above and CPU cycles?