Sequential recommender systems learn from the historical relationships within each user's item sequence.
This lets them capture item-to-item transitions and sequential patterns well, but they do not reflect global context.
In particular, since recommendation is fundamentally a link-prediction problem on a user-item bipartite graph, looking only at item sequences is a major limitation of sequential RecSys.
To address this, we aim to improve recommendation performance by using GNN techniques to integrate user-item interactions into a sequential RecSys.
Case Study
Research Question
How to make the model focus on similar sequences using collaborative signals?
Follow-up Questions
1. How to capture similar users?
2. 2nd stage
   a. How to give more weight to similar-sequence users than to other sequences during training
   b. How to incorporate similar users' sequences into the target user's sequence
How to capture similar users?
• Construct user embeddings from item sequences
  ◦ The concrete items can differ, but their latent categories are the same
  ◦ The ordering may also differ
  → So consider the items' categories (a minimal sketch follows this list)
• Two follow-up questions
  ◦ How to find latent item categories? → a deterministic way or a learnable way
  ◦ How to construct user features from item embeddings?
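To make the deterministic option concrete, here is a minimal sketch of comparing two users by their category sequences instead of their raw item sequences. The item_to_cat mapping is a placeholder for whatever produces latent categories (e.g., DGI clusters), and normalized edit distance is just one possible choice of sequence distance.

```python
from typing import Dict, List, Sequence


def to_category_sequence(item_seq: List[int], item_to_cat: Dict[int, int]) -> List[int]:
    """Replace each item ID with its (latent or given) category ID."""
    return [item_to_cat[i] for i in item_seq]


def edit_distance(a: Sequence[int], b: Sequence[int]) -> int:
    """Levenshtein distance between two category sequences (single-row DP)."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, start=1):
            prev, dp[j] = dp[j], min(
                dp[j] + 1,          # delete ca
                dp[j - 1] + 1,      # insert cb
                prev + (ca != cb),  # substitute
            )
    return dp[-1]


def category_similarity(seq_a: List[int], seq_b: List[int],
                        item_to_cat: Dict[int, int]) -> float:
    """1.0 when the category sequences match exactly, 0.0 when completely different."""
    ca = to_category_sequence(seq_a, item_to_cat)
    cb = to_category_sequence(seq_b, item_to_cat)
    return 1.0 - edit_distance(ca, cb) / max(len(ca), len(cb), 1)
```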
Proposed Method
1. How can we use categories to find similarities? (an encoder sketch follows this list)
   a. Det: Ignore items, treat each sequence as a sequence of categories, then compute a deterministic sequence distance.
   b. Lrn: Ignore items, treat each sequence as a sequence of categories, then apply a Transformer to get user embeddings. > Introduce category embeddings.
   c. Lrn: Item encodings + category encodings + positional encodings -> Transformer. Introduce category embeddings (learnable parameters vs. from DGI).
2. How can we find categories? (a DGI sketch follows this list)
   a. DGI (…)
3. How to utilize similar user information (option a1 is sketched after this list)
   a1) For training a sequential recommender system.
       • > L_rec + L_similarity_aware_contrastive_learning_loss
   a2) For training a sequential recommender system.
       • > Transformer(x) <- Similarity-aware attention
   b1) For designing a similarity-aware RecSys.
       • > GNN with user-user edges > RR^T
   b2) For creating better input user encodings for GNNs.
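A minimal sketch of option 1c, assuming a standard PyTorch Transformer encoder: item, category, and positional embeddings are summed, encoded, and mean-pooled into one user embedding. Module names and dimensions are illustrative; the category table could be learned end-to-end or initialized from DGI outputs.

```python
import torch
import torch.nn as nn


class CategoryAwareUserEncoder(nn.Module):
    """Option 1c: item + category + positional encodings -> Transformer -> one user embedding."""

    def __init__(self, num_items: int, num_cats: int, max_len: int, dim: int = 64):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, dim, padding_idx=0)
        self.cat_emb = nn.Embedding(num_cats, dim, padding_idx=0)  # learnable, or frozen DGI outputs
        self.pos_emb = nn.Embedding(max_len, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, item_seq: torch.Tensor, cat_seq: torch.Tensor) -> torch.Tensor:
        # item_seq, cat_seq: (batch, seq_len) integer IDs with 0 as padding, seq_len <= max_len
        positions = torch.arange(item_seq.size(1), device=item_seq.device)
        x = self.item_emb(item_seq) + self.cat_emb(cat_seq) + self.pos_emb(positions)
        pad_mask = item_seq.eq(0)
        h = self.encoder(x, src_key_padding_mask=pad_mask)
        # Mean-pool over non-padded positions to get a single embedding per user.
        valid = (~pad_mask).unsqueeze(-1).float()
        return (h * valid).sum(dim=1) / valid.sum(dim=1).clamp(min=1.0)
```

Cosine similarity between these pooled embeddings is then what identifies similar users.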
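For question 2 the note only says "DGI (…)", so the following sketch fills the gap under stated assumptions: PyTorch Geometric's DeepGraphInfomax is trained on an item graph (e.g., co-occurrence edges), and the resulting item embeddings are clustered with k-means to get discrete latent categories. The clustering step and all hyperparameters are my assumptions.

```python
import torch
from sklearn.cluster import KMeans
from torch_geometric.nn import DeepGraphInfomax, GCNConv


class ItemEncoder(torch.nn.Module):
    """One-layer GCN encoder used inside DGI."""

    def __init__(self, in_dim: int, hid_dim: int):
        super().__init__()
        self.conv = GCNConv(in_dim, hid_dim)

    def forward(self, x, edge_index):
        return self.conv(x, edge_index).relu()


def latent_item_categories(x, edge_index, num_categories=20, hid_dim=64, epochs=100):
    """Train DGI on the item graph, then cluster item embeddings into latent categories."""
    dgi = DeepGraphInfomax(
        hidden_channels=hid_dim,
        encoder=ItemEncoder(x.size(1), hid_dim),
        summary=lambda z, *args, **kwargs: torch.sigmoid(z.mean(dim=0)),
        corruption=lambda x, edge_index: (x[torch.randperm(x.size(0))], edge_index),
    )
    opt = torch.optim.Adam(dgi.parameters(), lr=1e-3)
    for _ in range(epochs):
        opt.zero_grad()
        pos_z, neg_z, summary = dgi(x, edge_index)
        loss = dgi.loss(pos_z, neg_z, summary)
        loss.backward()
        opt.step()
    with torch.no_grad():
        z, _, _ = dgi(x, edge_index)
    # The cluster ID of each item serves as its latent category.
    return KMeans(n_clusters=num_categories, n_init=10).fit_predict(z.cpu().numpy())
```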
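For option a1, a sketch of the extra loss term: an InfoNCE-style objective in which a precomputed user-user similarity matrix (for example, from the category-sequence comparison above) weights the positive pairs. The exact weighting scheme and the temperature are assumptions, not something specified in the note.

```python
import torch
import torch.nn.functional as F


def similarity_aware_contrastive_loss(user_emb: torch.Tensor,
                                      sim: torch.Tensor,
                                      temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style term where user-user similarity weights the positives.

    user_emb: (B, d) user embeddings from the sequential encoder.
    sim:      (B, B) precomputed similarity scores in [0, 1].
    """
    B = user_emb.size(0)
    eye = torch.eye(B, dtype=torch.bool, device=user_emb.device)
    z = F.normalize(user_emb, dim=-1)
    logits = (z @ z.t() / temperature).masked_fill(eye, float("-inf"))  # drop self-pairs
    log_prob = F.log_softmax(logits, dim=-1).masked_fill(eye, 0.0)      # diagonal unused below
    weights = sim.masked_fill(eye, 0.0)
    weights = weights / weights.sum(dim=-1, keepdim=True).clamp(min=1e-8)
    # Pull each anchor toward users with similar sequences, in proportion to sim.
    return -(weights * log_prob).sum(dim=-1).mean()


# During training (lambda_cl is a hypothetical trade-off weight):
# loss = rec_loss + lambda_cl * similarity_aware_contrastive_loss(user_emb, sim)
```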
Current Method
1. Find the latent category of each item using a GNN
   a. e.g., DGI
2. Initialize each user's representation as the sequence of their item embeddings
   a. Apply a Transformer to this sequence to get a single embedding
      i. This embedding carries each user's sequential information
   b. We can now identify similar users by comparing these embeddings
3. Two ways exist to use them (both sketched after this list)
   a. Creating better input for sequential learning
      i. Augment the target user's item sequence with similar users' item sequences, then apply a sequential RecSys
   b. Creating better input user encodings for the GNN
      i. Add user-user edges to the graph and apply a GNN
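A sketch of steps 2b and 3a, assuming the user embeddings have already been pooled by the Transformer above: neighbours are found by cosine similarity, and a neighbour's recent items are prepended to the target user's sequence before it goes into the sequential RecSys. The truncation policy (max_len, keeping the target user's own items at the end) is an assumption.

```python
import torch
import torch.nn.functional as F


def top_k_similar_users(user_emb: torch.Tensor, k: int = 5) -> torch.Tensor:
    """Cosine-similarity nearest neighbours over the pooled user embeddings."""
    z = F.normalize(user_emb, dim=-1)
    sim = z @ z.t()
    sim.fill_diagonal_(float("-inf"))   # a user is not its own neighbour
    return sim.topk(k, dim=-1).indices  # (num_users, k)


def augment_sequence(target_seq: list, neighbour_seq: list, max_len: int = 50) -> list:
    """Option 3a: prepend a similar user's recent items, keeping the target
    user's own (more relevant) items at the end of the sequence."""
    budget = max(max_len - len(target_seq), 0)
    borrowed = neighbour_seq[-budget:] if budget > 0 else []
    return borrowed + target_seq[-max_len:]
```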
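A sketch of option 3b: user-user edges are derived from R R^T (co-interaction counts over the user-item interaction matrix R), and a small GCN runs over the resulting user graph with the sequence-aware user encodings as input features. The min_shared threshold and the two-layer GCN are assumptions.

```python
import torch
from torch_geometric.nn import GCNConv


def user_user_edges(R: torch.Tensor, min_shared: int = 3) -> torch.Tensor:
    """Build user-user edges from the interaction matrix R (num_users x num_items):
    R @ R.T counts co-interacted items; keep pairs above a threshold."""
    co = R @ R.t()
    co.fill_diagonal_(0)
    src, dst = (co >= min_shared).nonzero(as_tuple=True)
    return torch.stack([src, dst], dim=0)  # (2, num_edges), symmetric by construction


class UserGNN(torch.nn.Module):
    """Two-layer GCN over the user-user graph; inputs are the sequence-aware user encodings."""

    def __init__(self, in_dim: int, hid_dim: int = 64):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hid_dim)
        self.conv2 = GCNConv(hid_dim, hid_dim)

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        return self.conv2(self.conv1(x, edge_index).relu(), edge_index)
```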



