Applications that operate on streams of data are becoming more prevalent in the computing industry as hardware capabilities grow. From packet-sniffing intrusion-detection systems in security to financial applications that model the stock market to project future trends, algorithms for processing these unbounded streams are increasingly necessary. Traditional database management systems fall short because they are limited to bounded data; stream management systems, along with efficient algorithms, are therefore required. Furthermore, these algorithms must be agile and adaptive: no processing time can be wasted on overhead, or valuable data may be missed. In this paper, we explore the tradeoff between performance and overhead in an effort to fine-tune the adaptivity parameters. Our experimental results show that a re-optimization interval of 1 second and a re-optimization threshold of 90% work best in general. We also find that the Fodp variants perform significantly better than XGreedyJoin under high-stress scenarios, a result not observed in our previous work.