GloVe and Positional Embeddings: We verify the significance of GloVe embeddings by replacing them with one-hot embeddings. We also validate the usefulness of positional embeddings (P.E.) by training a model without them. Both variants show a drop in performance (Table 2), with the drop being more pronounced for the variant without GloVe embeddings. These ablations underscore the importance of capturing both word-level semantics and position-aware features.
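As an illustrative sketch only (the paper's exact configuration is not reproduced here), positional information can be injected by adding sinusoidal positional embeddings to pretrained word vectors; the "GloVe" lookup table below is a random placeholder, and the sequence length and dimensions are arbitrary:

```python
import numpy as np

def sinusoidal_pe(seq_len, dim):
    """Standard sinusoidal positional embeddings:
    PE[pos, 2i] = sin(pos / 10000^(2i/dim)), PE[pos, 2i+1] = cos(...)."""
    pos = np.arange(seq_len)[:, None]              # (seq_len, 1)
    i = np.arange(dim // 2)[None, :]               # (1, dim/2)
    angles = pos / np.power(10000.0, 2 * i / dim)  # (seq_len, dim/2)
    pe = np.zeros((seq_len, dim))
    pe[:, 0::2] = np.sin(angles)                   # even dims get sine
    pe[:, 1::2] = np.cos(angles)                   # odd dims get cosine
    return pe

seq_len, dim, vocab = 12, 300, 1000
glove = np.random.randn(vocab, dim)                # stand-in for a GloVe table
token_ids = np.random.randint(0, vocab, size=seq_len)
x = glove[token_ids] + sinusoidal_pe(seq_len, dim) # word semantics + position
```

Replacing `glove` with an identity-style one-hot matrix, or dropping the `sinusoidal_pe` term, corresponds to the two ablated variants.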