Dan: A Dependency-based Attention Network for ABSA

PyTorch implementations for Aspect-Based Sentiment Analysis (ABSA).


The aim of this research is to extend deep learning models with information from domain ontologies, improving their performance especially when only small amounts of annotated data are available. Ontologies can contribute to the feature formation of deep learning solutions or inject knowledge into the appropriate components of the network architecture.

The work here uses the framework and some of the implementations developed by songyouwei as a starting point to implement and extend the Cabasc model.

  • Baseline models
  • Cabasc
  • LCRS
  • Parser
  • Experiments

Extensions

  1. Dependency graph to encode word location information (see the sketch below)
  2. Dependency parsing to generate larger aspect terms
  3. Implementation of Separated models
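
As an illustration of extension 1, here is a minimal sketch of computing dependency-tree distances from each word to the aspect term, which can replace linear word positions as location information. spaCy is used here purely for illustration (the repo's own Parser component may differ), and dep_distances is a hypothetical helper:

```python
from collections import deque

import spacy

nlp = spacy.load("en_core_web_sm")

def dep_distances(sentence, aspect):
    """Dependency-tree distance from every token to the aspect token."""
    doc = nlp(sentence)
    # Locate the aspect token (first exact match; multi-word aspects need more care).
    aspect_idx = next(i for i, tok in enumerate(doc) if tok.text.lower() == aspect.lower())
    # Build an undirected adjacency list over the dependency tree.
    adj = {i: set() for i in range(len(doc))}
    for tok in doc:
        if tok.i != tok.head.i:  # the root token is its own head
            adj[tok.i].add(tok.head.i)
            adj[tok.head.i].add(tok.i)
    # Breadth-first search from the aspect token yields tree distances.
    dist = {aspect_idx: 0}
    queue = deque([aspect_idx])
    while queue:
        cur = queue.popleft()
        for nxt in adj[cur]:
            if nxt not in dist:
                dist[nxt] = dist[cur] + 1
                queue.append(nxt)
    return [dist[i] for i in range(len(doc))]

print(dep_distances("The battery life is great but the screen is dim", "battery"))
```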

New models (State of the Art)

GCN

Aspect Based Sentiment Analysis with Gated Convolutional Networks [pdf] [code]
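
For orientation, a minimal sketch of the gated Tanh-ReLU convolution unit described in the paper; the class name and dimensions are illustrative assumptions, not the authors' released code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedConv(nn.Module):
    """Sketch of GCAE's gated (Tanh-ReLU) convolution over word embeddings."""

    def __init__(self, embed_dim=300, num_filters=100, kernel_size=3):
        super().__init__()
        self.conv_s = nn.Conv1d(embed_dim, num_filters, kernel_size, padding=1)  # sentiment path
        self.conv_a = nn.Conv1d(embed_dim, num_filters, kernel_size, padding=1)  # gate path
        self.aspect_proj = nn.Linear(embed_dim, num_filters)

    def forward(self, x, aspect):
        # x: (batch, seq_len, embed_dim); aspect: (batch, embed_dim)
        x = x.transpose(1, 2)             # Conv1d expects (batch, channels, seq_len)
        s = torch.tanh(self.conv_s(x))    # candidate sentiment features
        gate = F.relu(self.conv_a(x) + self.aspect_proj(aspect).unsqueeze(2))  # aspect gate
        return (s * gate).max(dim=2)[0]   # max-over-time pooling
```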

LCRS (lcrs.py)

Left-Center-Right Separated Neural Network for Aspect-based Sentiment Analysis with Rotatory Attention [pdf]


BaseA (base_a.py)

Content attention module

  • Attention weights over memory slices are calculated using a FwNN with 2 inputs; this differs from MemNet, which concatenates its inputs (a minimal sketch follows)
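
A minimal sketch of such a two-input scorer (class and layer names are illustrative, not the exact code in base_a.py): the memory and the aspect are projected separately and summed, rather than concatenated as in MemNet.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContentAttention(nn.Module):
    def __init__(self, hidden_dim=300):
        super().__init__()
        # Separate projections for the two inputs, summed before scoring.
        self.w_m = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.w_a = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, memory, aspect):
        # memory: (batch, seq_len, hidden); aspect: (batch, hidden)
        scores = self.v(torch.tanh(self.w_m(memory) + self.w_a(aspect).unsqueeze(1)))
        weights = F.softmax(scores.squeeze(-1), dim=1)              # (batch, seq_len)
        return torch.bmm(weights.unsqueeze(1), memory).squeeze(1)   # weighted memory
```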


BaseB (base_b.py)

Sentence-level content attention module

  • Adds the sentence representation to the calculation of the attention weights
  • Embeds the entire sentence into the output vector resulting from attn_applied (see the sketch below)
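
A sketch of the sentence-level variant, extending the scorer above with a sentence vector (e.g., an average of the word embeddings) and adding the sentence back into the output; names and the exact combination are illustrative assumptions:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SentenceContentAttention(nn.Module):
    def __init__(self, hidden_dim=300):
        super().__init__()
        self.w_m = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.w_a = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.w_s = nn.Linear(hidden_dim, hidden_dim, bias=False)  # sentence input
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, memory, aspect, sentence):
        # memory: (batch, seq_len, hidden); aspect, sentence: (batch, hidden)
        scores = self.v(torch.tanh(self.w_m(memory)
                                   + self.w_a(aspect).unsqueeze(1)
                                   + self.w_s(sentence).unsqueeze(1)))
        weights = F.softmax(scores.squeeze(-1), dim=1)
        attn_applied = torch.bmm(weights.unsqueeze(1), memory).squeeze(1)
        return attn_applied + sentence   # embed the sentence into the output vector
```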


BaseC (base_c.py)

Position-attention-based memory module: words around the aspect have a greater impact on the sentiment polarity

  • The memory is weighted by the position attention weights
  • The weighted memory is fed into the sentence-level content attention module (see the sketch below)
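
A sketch of the position weighting under a simple linear-decay assumption (the exact scheme in base_c.py may differ):

```python
import torch

def position_weighted_memory(memory, aspect_pos, seq_len):
    # memory: (batch, seq_len, hidden); aspect_pos: (batch,) index of the aspect term
    positions = torch.arange(seq_len, dtype=torch.float).unsqueeze(0)  # (1, seq_len)
    dist = (positions - aspect_pos.float().unsqueeze(1)).abs()         # distance to the aspect
    weights = 1.0 - dist / seq_len                                     # closer words weigh more
    return memory * weights.unsqueeze(2)                               # scale each memory slice

# Example: batch of 2 sentences, 5 words each, hidden size 4
memory = torch.randn(2, 5, 4)
weighted = position_weighted_memory(memory, torch.tensor([1, 3]), 5)
```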


Implemented models

RAM (ram.py)

Chen, Peng, et al. "Recurrent Attention Network on Memory for Aspect Sentiment Analysis." Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. 2017. [pdf]


MemNet (memnet.py)

Tang, Duyu, Bing Qin, and Ting Liu. "Aspect Level Sentiment Classification with Deep Memory Network." Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing. 2016. 214-224. [pdf]


IAN (ian.py)

Ma, Dehong, et al. "Interactive Attention Networks for Aspect-Level Sentiment Classification." arXiv preprint arXiv:1709.00893 (2017). [pdf]


TD-LSTM (td_lstm.py)

Tang, Duyu, et al. "Effective LSTMs for Target-Dependent Sentiment Classification." Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers. 2016. [pdf]


LSTM (lstm.py)


Requirements

  • PyTorch 0.4.0
  • NumPy 1.13.3
  • tensorboardX 1.2
  • Python 3.6
  • GloVe pre-trained word vectors (see data_utils.py for more detail; a generic loading sketch follows)
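
For reference, a generic sketch of turning GloVe vectors into an embedding matrix; the file path and the word2idx vocabulary interface are assumptions for illustration, and the repo's actual loading logic lives in data_utils.py:

```python
import numpy as np

def build_embedding_matrix(word2idx, embed_dim=300, glove_path="glove.840B.300d.txt"):
    # One row per vocabulary index; out-of-vocabulary rows stay zero.
    matrix = np.zeros((len(word2idx) + 1, embed_dim), dtype=np.float32)
    with open(glove_path, encoding="utf-8") as f:
        for line in f:
            tokens = line.rstrip().split(" ")
            word, vec = tokens[0], tokens[1:]
            if word in word2idx:
                matrix[word2idx[word]] = np.asarray(vec, dtype=np.float32)
    return matrix
```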

Reviews / Surveys

Zhang, Lei, Shuai Wang, and Bing Liu. "Deep Learning for Sentiment Analysis: A Survey." arXiv preprint arXiv:1801.07883 (2018). [pdf]

Young, Tom, et al. "Recent Trends in Deep Learning Based Natural Language Processing." arXiv preprint arXiv:1708.02709 (2017). [pdf]

Contributions

Feel free to contribute!

You can raise an issue or submit a pull request, whichever is more convenient for you.

License

MIT License