I was impressed by the enormous and seemingly seamless open worlds in Entropia Universe, and I became curious about how they manage to keep such huge game worlds nearly lag-free despite having plenty of active players and only one visible server per planet.

I know that they do a lot of movement client-side -- when they launched, I tried playing it, and it took many minutes for the client to realize that the servers had gone down; meanwhile, I was running around the world just fine :-) That was a long time ago, but I doubt they've changed much since then.

A "visible server" is just a place (IP + port) where you send your packets. That could easily be a load balancer, a gateway, or simply decent hardware that doesn't have to work hard because most real-time information is client-only.

Would you have any suggestions on how I could accomplish a similar system without purchasing thousand-dollar servers and hardware?
Note: I'm not planning on making an MMO, just a multiplayer game capped at 32 players at a time, while keeping a persistent, large open world.

For 32 players, you need to be able to load and unload parts of the world on demand in a streaming fashion, but otherwise you can do it with brute force. An EC2 micro instance can handle that, and you get one free for a year when you sign up at aws.amazon.com. Forward mutual updates only between players that are within a kilometer of each other, or something similar. The main problem will be building a large-scale world -- that's a lot of assets! That, and asynchronously loading bits of the world as they become "important" to the simulation.
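A minimal sketch of the two ideas above -- forwarding updates only between nearby players, and deciding which world chunks are "important" enough to keep loaded. This is an illustration, not the poster's actual code: the grid-cell size, the `pairs_within` / `chunks_to_keep` names, and the use of one-kilometer cells are all my assumptions, chosen so a cell matches the interest radius mentioned in the answer.

```python
import math
from collections import defaultdict

CELL = 1000.0  # cell size in meters; one km, matching the radius in the answer


def cell_of(pos):
    """Map a 2D world position to a grid cell index."""
    return (int(pos[0] // CELL), int(pos[1] // CELL))


def pairs_within(players, radius=1000.0):
    """Return pairs of player ids closer than `radius`. A spatial hash
    means we only compare players in neighboring cells, not all N^2 pairs
    (overkill for 32 players, but it scales if the world grows)."""
    grid = defaultdict(list)
    for pid, pos in players.items():
        grid[cell_of(pos)].append(pid)
    pairs = set()
    for (cx, cy), ids in grid.items():
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for other in grid.get((cx + dx, cy + dy), []):
                    for pid in ids:
                        if pid < other and math.dist(players[pid], players[other]) <= radius:
                            pairs.add((pid, other))
    return pairs


def chunks_to_keep(players, view_cells=1):
    """Chunks that are 'important' to the simulation: every cell within
    `view_cells` of some player. Anything else can be unloaded."""
    keep = set()
    for pos in players.values():
        cx, cy = cell_of(pos)
        for dx in range(-view_cells, view_cells + 1):
            for dy in range(-view_cells, view_cells + 1):
                keep.add((cx + dx, cy + dy))
    return keep
```

Each server tick you would stream position updates only between the pairs returned by `pairs_within`, and asynchronously load any chunk in `chunks_to_keep` that isn't resident yet (and evict ones that drop out of the set).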