Outlier detection is an important step in many data-mining applications: it finds patterns in the data that do not conform to expected behavior, and these nonconforming patterns can carry potentially useful information. A weakness of many existing methods is that when the outliers themselves form a small cluster, the technique fails to label them correctly. In this paper, we propose a new method for outlier detection. The essential idea behind the technique is that two neighboring data points should be either both normal points or both outliers. We first build a graph model of the data points to be examined. By assigning energies to the vertices and edges of this graph, the outlier detection problem is cast as a Markov model; solving the associated linear programming problem yields the optimal solution of the model, from which the outlier labels are obtained. Finally, we test the proposed algorithm on one synthetic data set and three real data sets. The experimental results show that the algorithm detects outliers correctly on both ordinary data sets and data sets containing small clusters of outliers, and achieves high detection accuracy.
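The sketch below illustrates the kind of pipeline the abstract outlines: build a neighborhood graph, attach energies to vertices and edges so that neighbors are encouraged to share a label, and solve the resulting labeling problem as a linear program. The specific choices here (a kNN graph, a kNN-distance unary score `s`, a Potts-style edge penalty `lam`, and a threshold `theta`) are illustrative assumptions, not the paper's exact energy model.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.spatial import distance_matrix

def mrf_outlier_lp(X, k=5, lam=0.5, theta=0.5):
    """LP relaxation of a pairwise-MRF outlier labeling (illustrative sketch).

    x_i in [0, 1] is the relaxed outlier label of point i. The energy is
        sum_i (theta - s_i) * x_i  +  lam * sum_{(i,j) in E} |x_i - x_j|,
    where s_i is a kNN-distance outlierness score in [0, 1] and E is the
    kNN graph; the edge term pushes neighbors toward the same label.
    """
    n = len(X)
    D = distance_matrix(X, X)
    # k nearest neighbors of each point (column 0 is the point itself)
    nn = np.argsort(D, axis=1)[:, 1:k + 1]
    # unary term: normalized mean kNN distance as an outlierness score
    s = D[np.arange(n)[:, None], nn].mean(axis=1)
    s = (s - s.min()) / (s.max() - s.min() + 1e-12)
    # undirected edge set of the kNN graph
    edges = sorted({(min(i, j), max(i, j)) for i in range(n) for j in nn[i]})
    m = len(edges)
    # variables: x_1..x_n, then one slack z_e >= |x_i - x_j| per edge
    c = np.concatenate([theta - s, lam * np.ones(m)])
    # constraints: x_i - x_j - z_e <= 0  and  x_j - x_i - z_e <= 0
    A = np.zeros((2 * m, n + m))
    for e, (i, j) in enumerate(edges):
        A[2 * e, [i, j, n + e]] = [1, -1, -1]
        A[2 * e + 1, [i, j, n + e]] = [-1, 1, -1]
    res = linprog(c, A_ub=A, b_ub=np.zeros(2 * m),
                  bounds=[(0, 1)] * (n + m), method="highs")
    return res.x[:n] > 0.5  # round the relaxed labels to outlier flags
```

Because the edge penalty is paid whenever neighboring labels differ, a small, tightly connected cluster of high-score points can flip to the outlier label together, which is exactly the behavior the abstract claims for small outlier clusters.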