Is this with a trained model? Can you reproduce this without LIME? i.e. Use the |
I am trying to run LIME on a BERT model from simpletransformers. For those who don't know how LIME works: LIME perturbs the text to be explained by building an array of samples, each containing a different combination of the words in the sentence. When I call the model's predict on the text 'Hi,my name is Ege', the prediction is 1. However, when I run LIME with the number of samples set to 20, the perturbed array is ['Hi,my name is Ege', 'Hi, ', 'Hi, name is Ege', 'Hi,my name is ', ',my name is Ege', ', ', ', ', 'Hi, is Ege', ',my ', 'Hi,my is ', 'Hi, is ', ', ', 'Hi,my is Ege', ', Ege', ', ', ', Ege', ', name is ', ',my Ege', 'Hi,my ', 'Hi, name is Ege'] and the predictions are [0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 1 0 1 0]. As you can see, the first entry of the perturbed array is the original text, yet its prediction changed even though the text being predicted is the same. Why can this happen, and how can I fix it? If I set the number of samples to 1, in which case the data is not perturbed at all, the prediction is still 1, so I don't think the problem is in LIME itself.
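For context, the perturbation scheme described above can be sketched in plain Python. This is a minimal stdlib illustration of the idea, not LIME's actual implementation; `lime_style_perturb` and its parameters are hypothetical names chosen for this example. The key property it reproduces is that the first sample is always the unmodified input text, so a classifier should score it identically to a direct predict call:

```python
import random

def lime_style_perturb(text, num_samples, keep_prob=0.5, seed=0):
    """Generate LIME-style text perturbations: each sample keeps a
    random subset of the words of `text`, joined with single spaces.
    Sample 0 is always the original, unperturbed text."""
    rng = random.Random(seed)
    words = text.split()
    samples = [text]  # first sample: the original text itself
    for _ in range(num_samples - 1):
        mask = [rng.random() < keep_prob for _ in words]
        if not any(mask):
            # keep at least one word so no sample is empty
            mask[rng.randrange(len(words))] = True
        samples.append(" ".join(w for w, keep in zip(words, mask) if keep))
    return samples

samples = lime_style_perturb("Hi,my name is Ege", num_samples=20)
```

Since `samples[0]` is byte-for-byte the original sentence, any difference between its prediction and a direct `model.predict` call must come from how the batch of samples is fed to the model (e.g. batching, padding, or a non-deterministic forward pass), not from the perturbation itself.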