Python 2.7 - Boolean check fails while multiprocessing

I have a script that retrieves a list of active 'jobs' from a MySQL table and then uses the multiprocessing library to instantiate my main script once per active job. The multiprocessing script has a function that checks whether a given job has already been claimed by another worker. It does this by checking whether a particular column in the DB table is NULL. The DB query returns a single-item tuple:

When I run this function without the multiprocessing portion, the claim check works just fine. But when I run the jobs in parallel, the claim check treats every (None,) tuple as truthy, so the function concludes the job has already been claimed.

I have tried adjusting the number of concurrent processes, but the claim check still fails, even with the process count set to 1. I have also tried reworking the if statement to see if I could make it work that way:

if job_claimed == True:
if job_claimed == (None,):
# etc.

No luck though.

Is anybody aware of something in the multiprocessing library that would prevent my claim-checking function from properly interpreting the job_claimed tuple? Or is there something wrong with my code?

EDIT

I ran some truthiness tests on the job_claimed variable in debug mode. Here are the results of those tests:

Any non-empty tuple (or list, string, or other iterable) evaluates to True, regardless of whether its contents are themselves truthy. To test the contents instead, use any(iterable) or all(iterable), which check whether any or all of the items evaluate to True.
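A quick illustration of the point above, using a (None,) row like the one your query returns:

```python
# A one-element tuple is truthy even when its only element is None,
# which is why `if job_claimed:` succeeds for (None,).
row = (None,)

print(bool(row))       # True  -- non-empty tuple, regardless of contents
print(any(row))        # False -- no element is truthy
print(row[0] is None)  # True  -- check the column value itself
```

Checking `row[0] is None` (or `any(row)`) tests the column value rather than the container.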

However, based on your edits, your problem is likely caused by using a global connection object across multiple processes.

You could also try connection pooling, but I'm not sure that works across processes; it would probably require you to switch to threads instead.

Also, I would move all the code under if __name__ == '__main__': into a function. You generally want to avoid polluting the global namespace when using multiprocessing: when Python creates a new process, it tries to copy the global state into it. That can lead to odd bugs, because global variables no longer share state (they live in separate processes), and an object may fail to serialize, or lose information during serialization, when it is reconstructed in the new process.