Talk 1

Title: Adaptive networks: coevolution of disease and topology

Speaker: Zhicheng Guo


Adaptive networks have recently been introduced in the context of disease propagation on complex networks. They account for the mutual interaction between the network topology and the states of the nodes. Until now, existing models have been analyzed with low-complexity analytic formalisms, which nevertheless revealed some novel dynamical features. However, current methods fail to reproduce accurately the simultaneous time evolution of the disease and the underlying network topology. In the framework of the adaptive SIS model of Gross et al. [Phys. Rev. Lett. 96, 208701 (2006)], we introduce an improved compartmental formalism able to handle this coevolutionary task successfully. With this approach, we analyze the interplay and outcomes of both dynamical elements, process and structure, on adaptive networks featuring different degree distributions at the initial stage.
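To make the coevolution concrete: in the adaptive SIS model referenced above, infection spreads along susceptible-infected (S-I) links while susceptible nodes may rewire such links away from infected neighbors, so the epidemic and the topology evolve together. The following is a minimal simulation sketch of one update sweep under that rule set; the function name, parameter names, and the synchronous-sweep simplification are illustrative assumptions, not the speaker's formalism.

```python
import random

def adaptive_sis_step(edges, infected, p, r, w, rng):
    """One update sweep of an adaptive SIS sketch: along each S-I link,
    the susceptible end rewires to another susceptible with prob. w;
    otherwise infection passes with prob. p. Infected nodes then
    recover with prob. r. Returns the new edge list and infected set."""
    nodes = set()
    for a, b in edges:
        nodes.update((a, b))
    susceptible = nodes - infected
    new_edges, new_infected = [], set(infected)
    for a, b in edges:
        a_inf, b_inf = a in infected, b in infected
        if a_inf != b_inf:  # an S-I link: rewiring competes with infection
            s = b if a_inf else a
            if rng.random() < w and len(susceptible) > 1:
                target = rng.choice(sorted(susceptible - {s}))
                new_edges.append((s, target))  # link rewired away from I
                continue
            if rng.random() < p:
                new_infected.update((a, b))
        new_edges.append((a, b))
    for n in list(new_infected):  # recovery
        if rng.random() < r:
            new_infected.discard(n)
    return new_edges, new_infected
```

Iterating this step and tracking both the infected fraction and the degree distribution exhibits the disease-topology coevolution the abstract describes.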

Supervisor: Kexian Zheng

Talk 2

Title: Graph Attention Network with Memory Fusion for Aspect-level Sentiment Analysis

Speaker: Li Li


Aspect-level sentiment analysis (ASC) predicts the sentiment polarity of each specific aspect term in a given text or review. Recent studies used attention-based methods that effectively improve the performance of aspect-level sentiment analysis. However, these methods ignore the syntactic relationship between the aspect and its corresponding context words, leading the model to mistakenly focus on syntactically unrelated words. One proposed solution, the graph convolutional network (GCN), cannot completely avoid the problem. While it does incorporate useful information about syntax, it assigns equal weight to all the edges between connected words, so it may still incorrectly associate unrelated words with the target aspect through the iterations of graph convolutional propagation. In this study, a graph attention network with memory fusion is proposed to extend GCN's idea by assigning different weights to edges. Syntactic constraints can be imposed to block the graph convolutional propagation of unrelated words. A convolutional layer and a memory fusion layer are applied to learn and exploit multiword relations and assign different weights to words, further improving performance. Experimental results on five datasets show that the proposed method yields better performance than existing methods. The code of this paper is available at
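The key contrast with a GCN is that each syntactic edge receives its own learned weight rather than a uniform one. Below is a minimal NumPy sketch of one such attention-weighted aggregation over a dependency adjacency matrix; it illustrates only the edge-weighting idea, and the function and parameter names are assumptions for illustration, not the paper's implementation (which also includes the convolutional and memory-fusion components).

```python
import numpy as np

def graph_attention_layer(H, adj, Wq, Wk, Wv):
    """Attention-weighted aggregation over a syntactic graph: each node
    attends only to its neighbors in `adj`, and edge weights come from
    softmax-normalised attention scores instead of GCN's equal weights.
    H: (n, d) node features; adj: (n, n) 0/1 adjacency with self-loops.
    Returns the updated features and the attention matrix."""
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    scores = (Q @ K.T) / np.sqrt(K.shape[1])      # scaled dot-product
    scores = np.where(adj > 0, scores, -1e9)      # mask non-edges
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    alpha = np.exp(scores)
    alpha /= alpha.sum(axis=1, keepdims=True)     # per-node softmax
    return alpha @ V, alpha
```

Because non-edges are masked before the softmax, syntactically unrelated words get near-zero influence, while connected words share weight unevenly according to their attention scores.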

Supervisor: Guogen Tang


Time: 16:00, November 4, 2021

Address: MingLi Building C1102

Chair: Ze Kang