In this paper we present an integrative approach that extends our actuated Tangible Active Objects (TAOs) with distributed collaboration support, turning them into a versatile and comprehensive dynamic user interface with multi-modal feedback. We incorporate physical actuation, visual projection in 2D and 3D, and vibro-tactile feedback. We demonstrate this approach in a furniture-placement scenario in which users can interactively change the furniture model represented by each TAO using a dial-based, actuated tangible menu. Virtual constraints between TAOs automatically maintain their spatial relations.