What you're seeing is that `SparkContext`'s first positional parameter is `master` (a string), not a configuration object, so Spark tries to interpret your `SparkConf` as the master URL and fails. If you instead call `sc = SparkContext(conf=conf)`, it will use your configuration. That said, you may be better off launching a regular Python program rather than stopping the default Spark context and restarting it, but either way you'll need to pass the `conf` object as a named parameter.
