Hey, I'm having a problem implementing quicksort in Java over an array of 10,000 random numbers. The numbers are read from a text file into an array, which is then passed to the sorting algorithm. My aim is to time how long the sort takes, increasing the number of elements sorted on each pass of my timing loop. I'm not sure whether each timed pass is accidentally being run on the already-sorted array. For the best case (meaning the text file is already sorted) I keep getting this graph: http://i.imgur.com/jIXxj.jpg rather than the O(n log n) curve I expected.
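For reference, here's roughly the kind of timing harness I mean (class and method names are just placeholders, and I generate random numbers here instead of reading my text file). It copies the unsorted data before each timed run so the sort never re-sorts its own output, and it picks the middle element as the pivot, since I've read that a first/last-element pivot degrades to O(n^2) on already-sorted input:

```java
import java.util.Arrays;
import java.util.Random;

public class SortTimer {
    // In-place quicksort (Hoare-style partition) with a middle-element
    // pivot, so already-sorted input doesn't hit the O(n^2) worst case
    // that a first- or last-element pivot would cause.
    static void quickSort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        int pivot = a[lo + (hi - lo) / 2];
        int i = lo, j = hi;
        while (i <= j) {
            while (a[i] < pivot) i++;
            while (a[j] > pivot) j--;
            if (i <= j) {
                int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
                i++; j--;
            }
        }
        quickSort(a, lo, j);
        quickSort(a, i, hi);
    }

    public static void main(String[] args) {
        // Stand-in for the numbers loaded from the text file.
        int[] data = new Random(42).ints(10_000, 0, 100_000).toArray();

        // Time increasing prefix sizes. Copying each time means every
        // run sorts the ORIGINAL data, not the previous sorted result.
        for (int n = 1_000; n <= data.length; n += 1_000) {
            int[] copy = Arrays.copyOfRange(data, 0, n);
            long start = System.nanoTime();
            quickSort(copy, 0, copy.length - 1);
            long elapsed = System.nanoTime() - start;
            System.out.println(n + "\t" + elapsed + " ns");
        }
    }
}
```

The copy step is the part I suspect my code is missing; without it, the first iteration sorts the array and every later iteration is timed against sorted data.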