- Deep Learning for NLP: book by Yoav Goldberg. There is also a Primer version (without the NLP bits, and without some of the advanced bits).
- Manning and Schütze, Foundations of Statistical Natural Language Processing. Buy at Amazon.
  - Classic book, a bit outdated by now, but some chapters are still worth reading today.
- Jurafsky and Martin, Speech and Language Processing (3rd Edition).
  - Another classic book. The third edition is up-to-date with the latest techniques.
- Jacob Eisenstein, Natural Language Processing.
  - Up-to-date book covering a wide array of NLP topics.
- Class notes by Michael Collins (see under "See Also") have great technical exposition of the more algorithmic aspects, focusing on the pre-neural, classic, beautiful and useful techniques.
- In general, the Morgan and Claypool book series has some good topic-oriented books.
I like the NLP and related courses from Michael Collins, Jacob Eisenstein, Graham Neubig, Greg Durrett, and Jason Eisner. (There are many others as well, but this is a good initial list to follow.)
The main conferences in NLP are ACL, NAACL, EMNLP, and EACL. The main journal is TACL. To stay up-to-date, you can look up the conference websites and browse the lists of accepted papers for a given year. For older work, the Computational Linguistics journal (CL) used to be very good, until 2012 or so.
The ACL Anthology has all the papers from all major NLP conferences (including the ones mentioned above, and others). It also allows searching within this collection, which is great if you want to dig into a topic.
Most papers in NLP are free. If you occasionally come across a locked one (where a publisher charges money to read the paper) which you really want to read, try pasting the URL into sci-hub.
Newsletters come and go. Currently (as of 2018-2019), Sebastian Ruder writes the leading one. He also maintains a page for tracking NLP Progress.