Researchers at the Institute of Acoustics (IOA) of the Chinese Academy of Sciences proposed a novel combination of two machine learning methods that can predict network states in dynamic networks, with applications such as recommending people you may be interested in, revealing hidden contacts in human connection networks, and predicting traffic flow in urban transportation networks.

Dynamic networks are widespread in the real world; examples include social media networks such as Weibo and human contact networks such as cellphone grids. They describe groups of entities whose connection structure evolves over time. Link prediction on such networks estimates the connection state between nodes from their historical information.

Most previous approaches to link prediction, based on member similarity and supervised learning, compute similarity features for each node pair and train machine learning algorithms on those features. However, little research has investigated the hidden patterns of dynamic social networks.
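The similarity-based approach described above can be sketched as follows. This is a minimal illustration using two standard pairwise features (common-neighbour count and Jaccard coefficient), not the specific feature set used in the paper:

```python
# Illustrative similarity features for a node pair in one network snapshot.
# The graph is a plain adjacency dict mapping each node to its neighbour set;
# the feature choices (common neighbours, Jaccard) are textbook examples,
# not the paper's exact features.

def pair_features(adj, u, v):
    """Return (common-neighbour count, Jaccard coefficient) for pair (u, v)."""
    nu, nv = adj.get(u, set()), adj.get(v, set())
    common = len(nu & nv)
    union = len(nu | nv)
    jaccard = common / union if union else 0.0
    return common, jaccard

# Toy snapshot with edges 1-2, 1-3, 2-3, 2-4
adj = {1: {2, 3}, 2: {1, 3, 4}, 3: {1, 2}, 4: {2}}
print(pair_features(adj, 1, 4))  # (1, 0.5): one common neighbour (node 2)
```

In a supervised pipeline, such features for each candidate pair would be fed to a classifier trained on whether the edge appears in a later snapshot.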

Link prediction in dynamic networks aims to predict edges from historical linkage status. It is inherently difficult because the underlying structures undergo linear and non-linear transformations over time. Furthermore, due to the scale of networks and their differing evolving patterns, it is extremely challenging to perform dynamic link inference both efficiently and effectively.

To solve this problem, the IOA research team combined the advantages of the Gradient Boosting Decision Tree (GBDT) and the Temporal Restricted Boltzmann Machine (TRBM), proposing a new generative method: the GBDT Temporal Restricted Boltzmann Machine (GTRBM). The GBDT serves as a supervised method to capture topological features of the network, while the TRBM has sufficient hidden layers to model nonlinear transformations in dynamic datasets.
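To make the TRBM side concrete, the following sketch shows one conditional inference step in a common TRBM formulation, where the hidden units at time t depend on both the current and the previous visible layers, h_t ~ sigmoid(W v_t + A v_{t-1} + b). All shapes, parameter names, and random values here are illustrative assumptions, not the paper's model:

```python
import numpy as np

# One TRBM conditional step (illustrative). Visible units encode edge
# states of the network at a given time step; hidden units depend on the
# current snapshot v_t AND the previous snapshot v_{t-1} via the temporal
# weight matrix A, which is what lets the model capture dynamics.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_visible, n_hidden = 6, 4                              # e.g. 6 flattened edge states
W = rng.normal(scale=0.1, size=(n_hidden, n_visible))   # visible-to-hidden weights
A = rng.normal(scale=0.1, size=(n_hidden, n_visible))   # temporal (t-1 to t) weights
b = np.zeros(n_hidden)                                  # hidden bias

v_prev = rng.integers(0, 2, n_visible)  # edge states at time t-1
v_curr = rng.integers(0, 2, n_visible)  # edge states at time t

h_prob = sigmoid(W @ v_curr + A @ v_prev + b)  # P(h = 1 | v_t, v_{t-1})
print(h_prob.shape)  # (4,)
```

In GTRBM, the supervised GBDT signal and this temporal hidden representation are combined; the exact coupling is described in the paper rather than here.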

To further reduce computation time, the IOA research team proposed an efficient dimensionality reduction method that significantly decreases computational cost without degrading accuracy. The prediction process of GTRBM is presented in Figure 1, and the dimensionality reduction method is shown in Figure 2.
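The article does not spell out the team's reduction technique, so the following is a generic stand-in for the idea: compressing an n-by-n adjacency matrix into an n-by-k embedding (here via truncated SVD) shrinks the input fed to downstream models while preserving its dominant structure:

```python
import numpy as np

# Generic matrix dimensionality reduction sketch (NOT the paper's method):
# truncated SVD keeps only the k leading singular components of a
# symmetric adjacency matrix, giving each node a k-dimensional embedding.

rng = np.random.default_rng(1)
n, k = 8, 2
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.triu(A, 1)
A = A + A.T                               # symmetric adjacency, no self-loops

U, s, Vt = np.linalg.svd(A)
embedding = U[:, :k] * s[:k]              # n x k node embedding
print(embedding.shape)  # (8, 2)
```

Any downstream model now operates on k columns instead of n, which is the source of the computational savings such a reduction provides.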

Figure 1. The prediction process of GTRBM. (Image by IOA)

Figure 2. Matrix dimensionality reduction process. (Image by IOA)

Experiments demonstrated that the proposed method outperformed state-of-the-art algorithms on real-world dynamic networks, including a human contact network, a bibliographic network and two email networks.

Funding for this research came from the National Natural Science Foundation of China (Nos. 11590770-4, 61650202, 11722437, U1536117, 61671442, 11674352, 11504406, 61601453), the National Key Research and Development Program (Nos. 2016YFB0801203, 2016YFC0800503, 2017YFB1002803), and the Key Science and Technology Project of the Xinjiang Uygur Autonomous Region (No. 2016A03007-1).