Python Sharing Objects across Namespaces

I have an application that runs on Postgres and MySQL. Each program checks the database type and then imports either postgres_db or mysql_db as db_util. This works without a problem as long as all code referencing db_util lives in the __main__ module. When I started putting code in classes and importing those classes, the class would throw an error: “global name ‘db_util’ is not defined”. What follows is a solution to this problem. I don’t know whether it is Pythonic and would like to hear any comments.

The Problem:

When a Python script starts, the main script is loaded as a module with the __name__ of ‘__main__’. Every file you import becomes a module with its own namespace, and the two namespaces do not share names unless explicitly told to.
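A minimal sketch of this isolation, using types.ModuleType to stand in for two separately imported files (the module names mod_a and mod_b are made up for illustration):

```python
import types

# Each module object carries its own namespace (its __dict__).
mod_a = types.ModuleType("mod_a")
mod_b = types.ModuleType("mod_b")

# Define x inside mod_a's namespace only.
exec("x = 42", mod_a.__dict__)

print(hasattr(mod_a, "x"))  # True
print(hasattr(mod_b, "x"))  # False: mod_b never saw the assignment
```

Nothing defined in mod_a leaks into mod_b; sharing has to be explicit, which is exactly the problem the pattern below solves.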

What must be done is to develop a pattern that forces the modules to share. As the application developer, you must decide whether the class or an instance is to be shared, and whether the shared object lives as a global within the module (referenced by name) or within the class (referenced by self.name).
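The two reference styles can be sketched side by side. SharedStub here is a hypothetical stand-in for the db_util class:

```python
class SharedStub:
    def query(self, sql):
        return "ok: " + sql

du = SharedStub()  # module-level global, shared by name


class UsesGlobal:
    def run(self):
        # Looked up by name in this module's globals at call time.
        return du.query("select 1")


class UsesSelf:
    def __init__(self, db):
        self.du = db  # the dependency is handed in and stored on the instance

    def run(self):
        # Referenced by self.du, so the class carries its own reference.
        return self.du.query("select 1")


print(UsesGlobal().run())   # ok: select 1
print(UsesSelf(du).run())   # ok: select 1
```

The self.name form makes the dependency explicit in the constructor, which fits the stand-alone goal described next; the global form is lighter but ties the class to its module's namespace.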

Another concern is that modules and classes should be stand-alone; if there are dependencies, these need to be explicitly declared in the module. That way, everything the code needs to run is defined in the code itself. In the long run, this should make debugging much easier.

The code samples that follow pass the instance; the commented code passes the class instead.

The main program imports the dbi module, which determines the database type, imports the appropriate db_util file, and then creates an instance of db_util named du. In any module that imports dbi, the db_util instance defined in dbi is bound into that module’s globals with the statement du = dbi.du.
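A runnable sketch of that layout, under some assumptions: the backend modules are reduced to a db_util class with a toy query method, the database-type check is replaced by a hard-coded DB_TYPE constant, and a hypothetical worker.py plays the part of a module that imports dbi. The files are written to a temporary directory so the whole multi-module arrangement runs as one script:

```python
import os
import sys
import tempfile
import textwrap

files = {
    "postgres_db.py": """
        class db_util:
            def query(self, sql):
                return "postgres: " + sql
    """,
    "mysql_db.py": """
        class db_util:
            def query(self, sql):
                return "mysql: " + sql
    """,
    "dbi.py": """
        DB_TYPE = "postgres"  # stand-in for the real database-type check

        if DB_TYPE == "postgres":
            import postgres_db as db_util_mod
        else:
            import mysql_db as db_util_mod

        # The shared instance every other module will reference.
        du = db_util_mod.db_util()
        # To share the class instead, expose db_util_mod.db_util here.
    """,
    "worker.py": """
        import dbi
        du = dbi.du  # bind the shared instance into this module's globals

        def run():
            return du.query("select 1")
    """,
}

tmp = tempfile.mkdtemp()
for name, body in files.items():
    with open(os.path.join(tmp, name), "w") as f:
        f.write(textwrap.dedent(body))

sys.path.insert(0, tmp)
import worker
import dbi

print(worker.run())         # postgres: select 1
print(worker.du is dbi.du)  # True: both modules see the one instance
```

Because Python caches modules in sys.modules, every importer of dbi gets the same module object, so du is created exactly once and genuinely shared.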

Once you accept that each module is a namespace and that there is no single ‘global’ scope as in C/C++, this approach makes sense.