Top products from r/LanguageTechnology

We found 8 product mentions on r/LanguageTechnology and ranked the 6 resulting products by the number of redditors who mentioned them.


Top comments that mention products on r/LanguageTechnology:

u/hobo_law · 1 pointr/LanguageTechnology

Ah, that makes sense. Yup, using any sort of large corpus like that to create a more general document space should help.

I don't know what the best way to visualize the data is. That's actually one of the big challenges with high-dimensional vector spaces like this: once you've got more than three dimensions you can't really draw it directly. One thing I have played around with is using D3.js to create a force-directed graph where the distance between nodes corresponds to the distance between vectors. It wasn't super helpful though. However, I just went to look at some D3.js examples and it looks like there's an example of an adjacency matrix here: https://bost.ocks.org/mike/miserables/. I've never used one, but it seems like it could be helpful.
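The force-directed-graph idea above can be sketched by exporting pairwise cosine distances in the node/link JSON shape that D3's force layout consumes. This is a minimal sketch with made-up toy vectors; the document names (`doc0`, `doc1`, …) and the exact JSON shape are illustrative assumptions, not anything from the comment:

```python
import json

import numpy as np

# Hypothetical toy document vectors; in practice these would come from
# an embedding model or a term-document matrix.
vectors = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [1.0, 1.0, 0.0],
])

def cosine_distance(u, v):
    """1 - cosine similarity, so vectors pointing the same way get distance 0."""
    return 1.0 - float(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Nodes and links in the shape D3's forceSimulation/forceLink expects.
nodes = [{"id": f"doc{i}"} for i in range(len(vectors))]
links = [
    {"source": f"doc{i}", "target": f"doc{j}",
     "distance": cosine_distance(vectors[i], vectors[j])}
    for i in range(len(vectors))
    for j in range(i + 1, len(vectors))
]

graph_json = json.dumps({"nodes": nodes, "links": links}, indent=2)
```

On the D3 side you would feed `graph_json` to a simulation and set the link force's distance accessor to `d => d.distance * someScale`, so closer vectors pull their nodes closer together.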

The link seems to be working for me now, but if it stops working again, here's the book it was taken from: https://www.amazon.com/Speech-Language-Processing-Daniel-Jurafsky/dp/0131873210. Googling the title should help you find some relevant PDFs.

u/tpederse · 2 pointsr/LanguageTechnology

I always thought this was a pretty good introduction to UIMA.

http://www.morganclaypool.com/doi/abs/10.2200/S00194ED1V01Y200905HLT003

It presumes you know a bit about NLP already, and for that Jurafsky and Martin is a great place to start.

http://www.amazon.com/Speech-Language-Processing-2nd-Edition/dp/0131873210

There are some very nice video lectures from Chris Manning and Dan Jurafsky as well:

https://www.youtube.com/playlist?list=PLSdqH1MKcUS7_bdyrIl626KsJoVWmS7Fk

u/nihalnayak · 2 pointsr/LanguageTechnology

Not much of linguistics knowledge is required to get started with NLP. However, if you wish to learn about linguistics before starting with NLP, I'd recommend this book - Linguistic Fundamentals for Natural Language Processing: 100 Essentials from Morphology and Syntax by Emily M. Bender.


u/bengaliguy · 1 pointr/LanguageTechnology

Not the best book, but if you are looking for one covering some of the neural network methods used in NLP, this recent book by Yoav Goldberg seems very promising.

u/Axioplase · 2 pointsr/LanguageTechnology

Arguably, everyone is doing neural MT nowadays, but it wouldn't hurt to read [Koehn's Statistical Machine Translation](https://www.amazon.com/dp/B00AKE1W9O/ref=cm_sw_em_r_mt_dp_U_7NXuDb4X8VNRX)

u/mycl · 2 pointsr/LanguageTechnology

I don't know them very well and I haven't worked with either of them, but I can give a non-expert answer until an expert comes along.

The paper by the PSL group, Hinge-Loss Markov Random Fields and Probabilistic Soft Logic, has a nice section 7 on related work, which mentions ProbLog. I've also looked at the recent book Foundations of Probabilistic Logic Programming, which deals extensively with ProbLog but not PSL.

In the book, the author, Fabrizio Riguzzi, distinguishes broadly between two classes of semantics for probabilistic logic programs, distribution semantics (DS) and knowledge base model construction (KBMC). From the preface:

>Under the DS, a probabilistic logic program defines a probability distribution over normal logic programs and the probability of a ground query is then obtained from the joint distribution of the query and the programs. Some of the languages following the DS are: Probabilistic Logic Programs [Dantsin, 1991], Probabilistic Horn Abduction [Poole, 1993b], PRISM [Sato, 1995], Independent Choice Logic [Poole, 1997], pD [Fuhr, 2000], Logic Programs with Annotated Disjunctions [Vennekens et al., 2004], ProbLog [De Raedt et al., 2007], P-log [Baral et al., 2009], and CP-logic [Vennekens et al., 2009].
>
>Instead, in KBMC languages, a program is seen as a template for generating a ground graphical model, be it a Bayesian network or a Markov network. KBMC languages include Relational Bayesian Network [Jaeger, 1998], CLP(BN) [Costa et al., 2003], Bayesian Logic Programs [Kersting and De Raedt, 2001], and the Prolog Factor Language [Gomes and Costa, 2012]. The distinction among DS and KBMC languages is actually non-sharp as programs in languages following the DS can also be translated into graphical models.

So ProbLog is a DS language and my understanding from the PSL paper is that PSL is a KBMC language.
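The distribution semantics described above can be illustrated with a brute-force sketch: enumerate every total choice of the independent probabilistic facts (each choice yields one normal logic program, i.e. one "world") and sum the probabilities of the worlds where the query holds. The burglary/earthquake program below is a textbook-style toy example I'm supplying for illustration, not something from the comment or the book:

```python
from itertools import product

# Hypothetical ProbLog-style program under the distribution semantics:
#   0.6::burglary.  0.2::earthquake.
#   alarm :- burglary.  alarm :- earthquake.
facts = {"burglary": 0.6, "earthquake": 0.2}

def query_alarm(world):
    # alarm holds in a world iff burglary or earthquake holds there
    return world["burglary"] or world["earthquake"]

def query_probability(facts, query):
    """Sum the probability of every total choice (world) where the query holds."""
    names = list(facts)
    total = 0.0
    for bits in product([True, False], repeat=len(names)):
        world = dict(zip(names, bits))
        p = 1.0
        for name, truth in world.items():
            p *= facts[name] if truth else 1.0 - facts[name]
        if query(world):
            total += p
    return total

p_alarm = query_probability(facts, query_alarm)  # 1 - 0.4 * 0.8 = 0.68
```

Enumerating worlds is exponential in the number of probabilistic facts, which is exactly why real DS systems like ProbLog compile queries to more compact representations instead.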

It's also worth noting that there are two academic traditions that have converged on the same field. Those who come from probabilistic graphical models and upgrade them to first order call it statistical relational learning (SRL). Those who come from inductive logic programming and generalize by adding probabilities call it probabilistic inductive logic programming (PILP). But SRL and PILP are really the same field. ProbLog comes from the PILP tradition and PSL comes from the SRL tradition.