A Truss-Based Framework for Graph Similarity Computation

Yanwei Zheng, Zichun Zhang, Qi Luo, Zhenzhen Xie, Dongxiao Yu
Copyright © 2023 | Pages: 18
DOI: 10.4018/JDM.322087

Abstract

The study of graph kernels has been an important area of graph analysis and is widely used to solve similarity problems between graphs. Most existing graph kernels consider either local or global properties of a graph, and there are few studies on multiscale graph kernels. In this article, the authors propose a framework for graph kernels based on truss decomposition, which allows multiple graph kernels, and indeed any graph comparison algorithm, to compare graphs at different scales. The authors use this framework to derive variants of five graph kernels and compare them with the corresponding basic graph kernels on graph classification tasks. Experiments on a large number of benchmark datasets demonstrate the effectiveness and efficiency of the proposed framework.
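The framework rests on truss decomposition, which assigns each edge a trussness: the largest k such that the edge belongs to the k-truss, the maximal subgraph in which every edge participates in at least k-2 triangles. As a minimal sketch of how trussness can be computed by standard iterative edge peeling (this is the textbook algorithm, not necessarily the authors' implementation; the function name and plain-dict graph representation are illustrative):

```python
def truss_decomposition(edges):
    """Assign each undirected edge its trussness: the largest k such that
    the edge lies in the k-truss (every edge in >= k-2 triangles)."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    # Support of an edge = number of triangles that contain it.
    support = {frozenset((u, v)): len(adj[u] & adj[v]) for u, v in edges}
    trussness = {}
    k = 2
    while support:
        # Peel every edge whose support is too low to survive in the (k+1)-truss.
        while True:
            removable = [e for e, s in support.items() if s <= k - 2]
            if not removable:
                break
            for e in removable:
                u, v = tuple(e)
                trussness[e] = k
                # Removing (u, v) destroys one triangle per common neighbor w,
                # lowering the support of (u, w) and (v, w).
                for w in adj[u] & adj[v]:
                    for f in (frozenset((u, w)), frozenset((v, w))):
                        if f in support:
                            support[f] -= 1
                adj[u].discard(v)
                adj[v].discard(u)
                del support[e]
        k += 1
    return trussness
```

For example, every edge of a triangle has trussness 3, while the edges of a simple path, which close no triangles, have trussness 2. Restricting a graph to its k-trusses for increasing k yields the nested, progressively denser subgraphs that allow a base kernel to compare graphs at multiple scales.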

Introduction

A graph is a form of structured data that allows the modeling of complex relationships among objects. In recent years, increasing amounts of data have been modeled as graphs, and such data are ubiquitous in social networks, biology, chemistry, computer vision, and other domains (Cai et al., 2018, 2021; L. Li et al., 2019; Yanardag & Vishwanathan, 2015b). For example, graphs can be derived from interactions between groups of people, such as friendships on a social website, or collaborations in a network of actors or scientists. In biochemistry, molecular compounds can be modeled as graphs with vertices corresponding to atoms and edges corresponding to chemical bonds (Vishwanathan et al., 2010).

The problem of accurately measuring the similarity between graphs is at the core of many applications (Peng et al., 2022). As the demand for structured data analysis grows, the problem of graph classification has been widely studied in these domains. For example, in protein function prediction, a common task is to determine whether a protein is an enzyme or a non-enzyme (Borgwardt & Kriegel, 2005). A similar task on a collaboration dataset is to predict which genre an ego-network of an actor/actress belongs to (Yanardag & Vishwanathan, 2015b). Since there are many ways to represent images as graphs, many computer vision tasks, such as image classification (Tian & Han, 2022), can also be cast as graph classification problems.

To date, kernel methods are among the most popular approaches for comparing similarities between structured objects, especially in the domain of graph classification. Graph kernels have achieved state-of-the-art results on many graph datasets. In general, kernel methods utilize a positive semidefinite kernel function to measure the similarity between two objects, which can be expressed as an inner product in a reproducing kernel Hilbert space ℋ (Schölkopf & Smola, 2001). Some graph kernels are computed from local properties, such as the neighborhood features of vertices (Hido & Kashima, 2009) or small substructures of graphs (i.e., trees, cycles, graphlets) (Horváth et al., 2004; Orsini et al., 2015; Shervashidze et al., 2009, 2011). Other graph kernels are built from a global perspective and focus on the global properties of graphs (Kang et al., 2012; Nikolentzos et al., 2017; Wu et al., 2019). Most existing graph kernels cannot capture graph similarity at multiple levels of scale. The Multiscale Laplacian Graph Kernel (Kondor & Pan, 2016) is the first graph kernel that can compare graphs at multiple different scales, by building a hierarchy of nested subgraphs. However, it requires Laplacian matrix operations on the graph and thus incurs a very high runtime.
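To make the inner-product view concrete, the simplest graph kernels map each graph to an explicit feature vector and take a dot product; positive semidefiniteness then follows immediately. A minimal sketch, using a vertex-label histogram as the feature map (an illustrative toy kernel, not one of the kernels studied in this article):

```python
from collections import Counter

def vertex_histogram_kernel(labels_g1, labels_g2):
    """Toy graph kernel: inner product of vertex-label histograms.
    Each graph is represented here only by the multiset of its vertex
    labels; k(G1, G2) = <phi(G1), phi(G2)>, where phi counts each label.
    The kernel is positive semidefinite because it is an explicit
    feature-map dot product."""
    h1, h2 = Counter(labels_g1), Counter(labels_g2)
    # Only labels present in both histograms contribute to the product.
    return sum(h1[lbl] * h2[lbl] for lbl in h1.keys() & h2.keys())
```

For two molecular graphs with atom labels ['C', 'C', 'O'] and ['C', 'O', 'O'], the histograms are {C: 2, O: 1} and {C: 1, O: 2}, so the kernel value is 2·1 + 1·2 = 4. Such a kernel is purely local and label-based; the kernels surveyed above enrich the feature map with substructures (trees, cycles, graphlets) or global graph properties, and the framework proposed in this article additionally varies the scale at which the comparison is made.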
