In mathematics, an embedding (or imbedding) is one instance of a mathematical structure contained within another instance. In graph theory, graph embedding transforms a graph into a vector space. The benefit of this transformation is that vector operations are much simpler and faster than operations on the graph itself. In essence, graph embedding compresses high-dimensional data into a lower-dimensional representation while still preserving properties such as graph topology and node/edge relationships. Graph embedding algorithms often serve downstream tasks such as classification and link prediction.
Graph embedding has become very popular in the field of graph computing in recent years. On the one hand, many AI researchers have shifted their focus from deep learning and neural networks to graph embedding and graph neural networks. On the other hand, practitioners increasingly find that incorporating graph embeddings improves results in applications such as anti-fraud and targeted marketing.
Graph embedding is an ever-expanding family of algorithms. One of the main approaches is the random walk. A random walk is a process in which a node moves along edges for several steps, at each step randomly picking one of its adjacent edges; the process is repeated multiple times to generate a collection of paths; these paths are then fed to a model as training samples, yielding an embedding vector for each node of the graph.
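The walk-generation step described above can be sketched as follows. This is a minimal illustration, not a specific library's implementation: the graph, the walk length, and the number of walks per node are all hypothetical choices made for the example.

```python
import random

# A small hypothetical undirected graph, stored as an adjacency list.
graph = {
    "A": ["B", "C"],
    "B": ["A", "C", "D"],
    "C": ["A", "B"],
    "D": ["B"],
}

def random_walk(graph, start, walk_length):
    """Walk from `start`, at each step moving to a uniformly random neighbor."""
    walk = [start]
    for _ in range(walk_length - 1):
        walk.append(random.choice(graph[walk[-1]]))
    return walk

def generate_walks(graph, num_walks, walk_length):
    """Repeat the walk `num_walks` times from every node to build a corpus of paths."""
    walks = []
    for _ in range(num_walks):
        for node in graph:
            walks.append(random_walk(graph, node, walk_length))
    return walks

walks = generate_walks(graph, num_walks=2, walk_length=5)
```

The resulting list of paths plays the role of "sentences": in approaches like DeepWalk, such sequences are passed to a skip-gram model (for example, gensim's Word2Vec) to train one embedding vector per node.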