Code and Configuration for Bringing self-attention architectures into real world scenarios

Chatbot_s2s

Code and configuration for "Teaching your bot to talk: How hard can it be?".

Introduction

Under the umbrella of self-attentional networks, many papers claiming to have advanced the state of the art (SotA) in NLP-related tasks have recently been presented. In this paper we validate the performance of this architecture when applied to a real enterprise scenario: a customer service chatbot. The rationale is to test whether the differences between research and enterprise scenarios, their specific challenges, and their impact on performance may limit the application scope of these novel proposals.

Dataset

For the experimentation in a real enterprise scenario, the Customer Support on Twitter dataset has been used. The dataset contains 794,299 question-answer pairs from Twitter customer service interactions with different companies.

(Extracted from https://www.kaggle.com/thoughtvector/customer-support-on-twitter/home)
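For reference, pairing inbound customer tweets with the company replies that answer them can be sketched as follows. This is only an illustration, not the repository's actual logic: the column names (`tweet_id`, `inbound`, `text`, `in_reply_to_status_id`) follow the schema published with the Kaggle dataset, and any thread-flattening or deduplication done by the real scripts is omitted.

```python
import pandas as pd

def build_qa_pairs(tweets: pd.DataFrame) -> pd.DataFrame:
    """Join inbound customer tweets to the company replies that answer them.

    Assumes the Kaggle schema: tweet_id, inbound, text, in_reply_to_status_id.
    A company reply's in_reply_to_status_id points at the customer tweet it answers.
    """
    questions = tweets[tweets["inbound"]]
    answers = tweets[~tweets["inbound"]]
    pairs = answers.merge(
        questions,
        left_on="in_reply_to_status_id",
        right_on="tweet_id",
        suffixes=("_answer", "_question"),  # left frame is the reply, right the request
    )
    return pairs[["text_question", "text_answer"]]
```

In the real CSV the `inbound` flag is stored as a string, so a cast would be needed first; it is kept boolean here for clarity.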

The preprocess.py script has been applied to process and clean the dataset and to create the training, validation, and test files. Additionally, the prepare_data.sh script creates the files needed by Sockeye to train the models.
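The exact cleaning rules live in preprocess.py; as a minimal sketch of the kind of tweet normalization involved (the literal "url" placeholder visible in the example outputs suggests links are masked), one might write:

```python
import re

HANDLE = re.compile(r"@\w+")          # Twitter @mentions
URL = re.compile(r"https?://\S+")     # links, replaced by a placeholder token
SPACES = re.compile(r"\s+")

def clean_tweet(text: str) -> str:
    """Normalize a tweet: drop @mentions, mask links as 'url',
    collapse whitespace, and lowercase."""
    text = HANDLE.sub("", text)
    text = URL.sub("url", text)
    text = SPACES.sub(" ", text).strip()
    return text.lower()
```

The regexes here are illustrative assumptions; the repository's script may apply different or additional rules (emoji handling, tokenization, etc.).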

Models

The Sockeye toolkit has been used to build and train the systems, as well as to evaluate the experimental results. In particular, the following models, as implemented in Sockeye, have been tested.

  • CNN: CNNs are expected to better capture semantic relationships between different parts of the text, such as long range dependencies. This model can be trained by executing the cnn_training script.

  • RNN: RNNs rely on additional information from previous steps when generating an output, which makes them a highly suitable architecture for sequential problems such as textual representation. This model can be trained by executing the rnn_training script.

  • Transformer: This model implements the idea of self-attentional networks, replacing the recurrent or convolutional layers of the encoder-decoder with self-attention layers that connect all the tokens in the instance. Additionally, each attention layer is divided into several heads, each one computing a (learned) linear projection of the inputs into a smaller dimension. This mechanism allows the model to attend to information from different sub-spaces of the input representation. This model can be trained by executing the transformer script.
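The self-attention mechanism the Transformer relies on can be illustrated with a single-head, NumPy-only sketch. This is not Sockeye's implementation (which is multi-headed and framework-backed); the weight matrices are stand-ins for the learned projections:

```python
import numpy as np

def self_attention(x: np.ndarray, wq: np.ndarray, wk: np.ndarray,
                   wv: np.ndarray) -> np.ndarray:
    """Single-head scaled dot-product self-attention.

    x has shape (seq_len, d_model); wq/wk/wv are learned projection matrices.
    """
    q, k, v = x @ wq, x @ wk, x @ wv
    # Every token scores against every other token: (seq_len, seq_len).
    scores = q @ k.T / np.sqrt(k.shape[-1])
    # Softmax over positions (numerically stabilized).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

Multi-head attention simply runs several such projections in parallel on smaller dimensions and concatenates the results, which is what lets the model attend to different sub-spaces at once.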

Evaluation

To evaluate the models, the translation folder contains the scripts that apply the different models to generate answers for the test set. The output generated by these scripts can then be evaluated by means of the evaluation_script.
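The evaluation_script relies on standard metric implementations; purely to illustrate what the ROUGE-1 column in the table below measures, a minimal unigram-overlap F1 can be computed as:

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """ROUGE-1 F1: harmonic mean of unigram precision and recall,
    with counts clipped so repeated words are not over-credited."""
    cand, ref = candidate.split(), reference.split()
    overlap = sum((Counter(cand) & Counter(ref)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(cand)
    recall = overlap / len(ref)
    return 2 * precision * recall / (precision + recall)
```

BLEU and chrF follow the same overlap idea at the n-gram and character-n-gram level, respectively, with additional brevity handling.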

The following table summarizes the results of the different models.

System       BLEU    chrF    ROUGE-1   ROUGE-2   ROUGE-L
RNN_S2S      0.064   0.250   0.223     0.074     0.179
CNN_S2S      0.073   0.255   0.229     0.081     0.187
Transformer  0.071   0.258   0.226     0.079     0.184

Examples

In the following, some examples of the answers generated by the different models are shown. The complete responses for the test set of each model can be found in the output folder.

User Request: You guys suck. Have been without internet for almost 24 hours in Ohio. Get your stuff together!

Answer: I do apologize for the inconvenience and I will be happy to help. Can you DM me your account number? -JP
CNN: Hello, I apologize for the service issues. Could you please Follow/DM your full service address or account number so I can assist you further.-RK
RNN: I would be happy to look further into your service issues if you would provide your full address and account associated phone # in a DM. -JW
Transformer: I apologize for the inconvenience. I will be glad to assist you. Can you DM me your Name and acct # or phone #? -JB

User Request: @VerizonSupport Company is no help regarding a billing issue. Phone department they referred me to is close today and they claim to not be able to help. They treat customers who pay a lot more than all other phone companies like crap. You figure pay more better services not

Answer: Hello Demetris. We understand you are frustrated. Can you please elaborate on your issue? ^CAR
CNN: Is this for your Verizon residential or Verizon wireless mobile?^ADP
RNN: We can look into this for you. Please send us a Direct Message, so that we can further assist you. -LC url
Transformer: Is this for your Verizon residential or Verizon wireless mobile?^ADP

User Request: @hulu_support Hi Hulu, I've cancelled my account twice after the free trial and I STILL keep getting billed. Can you please elaborate. My email is email and would be more than happy to chat on a call

Answer: Apologies for the difficulty canceling, Alan! We're very sad to see you go, but it's now canceled effective immediately.
CNN: Sorry to hear about the trouble! Please reach out to us here: url so we can look into this for you. url
RNN: Sorry to hear about the trouble! Please reach out to us here: url so we can take a closer look.
Transformer: Sorry to hear about the unexpected charge! If you still need help, please contact us here: url so we can take a look.
