Abstract: Rapid retrieval of specified items from large volumes of multi-dimensional data has become an urgent and challenging problem for improving the effectiveness of data searching and analysis. To help users quickly re-find desired items, we present a novel multi-perspective visual retrieval method that differs fundamentally from traditional visualizations of search results, such as list-based techniques. In our prototype, customized dimensions are generated adaptively, enabling users to quickly approach the desired targets from multiple perspectives. We also design several display modes that lay out the items according to a degree-of-interest (DOI) ranking, which is implicitly defined by the geometric relationships among the different dimensions. The display modes include the Cluster Mode, which uses importance-driven dimensionality reduction; the Radial Mode, which diffuses items in separate directions; the Uniform Mode, which is based on an Archimedean spiral and keeps items non-overlapping; and the Custom Mode for in-depth analysis. Moreover, we combine Focus+Context, visual cues that present item properties, collaborative filtering, and animations to recommend results according to the features users are interested in. Finally, we evaluate our method in three scenarios: text message retrospection, desktop search, and web browsing history retrieval. The experiments show that our method is flexible, extensible, and efficient.