Sitemap

A list of all the posts and pages found on the site. For you robots out there, an XML version is available for digesting as well.

Pages

Posts

Review of Graph Transformer

less than 1 minute read

Published:

In the last few years, Graph Transformer models have become a promising direction for graph learning due to the success of Transformers in other domains such as NLP and CV. This note reviews recent works that adapt the Transformer to graph data as a more scalable and expressive tool, including improvements to positional encodings, attention mechanisms, etc.

Note in Probability basis

less than 1 minute read

Published:

Some probability concepts that I think we will use frequently.

Snails climb up the well

less than 1 minute read

Published:

A snail climbs a well $x$ feet during the day, and during the night it slips back $y$ feet. If the well is $h$ feet deep, how long will it take the snail to climb out?
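A minimal sketch of the standard closed-form answer, assuming $x > y$ and $h > x$ (the function name and interface are illustrative, not from the post itself):

```python
import math

def snail_days(x, y, h):
    """Days for a snail that climbs x feet each day and slips back
    y feet each night to escape a well h feet deep.
    Assumes x > y and h > x."""
    # On the final day the snail reaches the top and never slips back,
    # so it only needs to start that day within x feet of the rim.
    # Each full day-night cycle before that gains (x - y) feet.
    return math.ceil((h - x) / (x - y)) + 1

# Classic instance: climbs 3 ft by day, slips 2 ft by night, 10 ft well.
print(snail_days(3, 2, 10))  # → 8
```

The key observation is that the slip does not apply on the escape day, which is why the naive $h / (x - y)$ overestimates the answer.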

portfolio

publications

Improving Graph Convolutional Networks with Transformer Layer in social-based items recommendation

Published in 13th International Conference on Knowledge and Systems Engineering (KSE), 2021

In this paper, we improve the embedding output of the graph-based convolution layer by adding a number of transformer layers. The transformer layers, with their attention architecture, help discover frequent patterns in the embedding space, which increases the predictive power of the model on downstream tasks.

Recommended citation: T. L. Hoang, T. D. Pham and V. C. Ta, "Improving Graph Convolutional Networks with Transformer Layer in social-based items recommendation", KSE 2021.

Effect of Cluster-based Sampling on the Over-smoothing Issue in Graph Neural Network

Published in 14th International Conference on Knowledge and Systems Engineering (KSE), 2022

In this paper, we propose the use of cluster-based sampling to reduce the smoothing effect caused by a high number of layers in GNNs. Given that each node is assigned to a specific region of the embedding space, cluster-based sampling is expected to propagate this information to the node’s neighbours, thus improving the nodes’ expressivity.

Recommended citation: T. L. Hoang, and V. C. Ta, "Effect of Cluster-based Sampling on the Over-smoothing Issue in Graph Neural Network", KSE 2022.

Dynamic-GTN: Learning an Node Efficient Embedding in Dynamic Graph with Transformer

Published in Pacific Rim International Conference on Artificial Intelligence (PRICAI), 2022

In this paper, we propose the Dynamic-GTN model, which is designed to learn node embeddings in a continuous-time dynamic graph. The Dynamic-GTN extends the attention mechanism of a standard GTN to include temporal information from recent node interactions. Based on the temporal interaction patterns between nodes, the Dynamic-GTN employs a node sampling step to reduce the number of attention operations in the dynamic graph. We evaluate our model on three benchmark datasets for learning node embeddings in dynamic graphs.

Recommended citation: T. L. Hoang, and V. C. Ta, "Dynamic-GTN: Learning an Node Efficient Embedding in Dynamic Graph with Transformer", PRICAI 2022.

talks

teaching

Teaching experience 1

Undergraduate course, University 1, Department, 2014

This is a description of a teaching experience. You can use markdown like any other post.

Teaching experience 2

Workshop, University 1, Department, 2015

This is a description of a teaching experience. You can use markdown like any other post.