    1. Helpful Google

      The marvels of modern natural language processing: Michael Glazer, who sent in the example, wonders whether Google Translate has overdosed on old Boris and Natasha segments from Rocky and Bullwinkle: But it seems that the Google speech synthesis systems are not in on the fun, because if I accept Helpful Google's suggestion that I might mean "I vud be grateful if jou vould čonfirm rečeipt of this email so that I čan be sure that is has reačed jou", and then use the synthesize button, what comes out sounds less like Boris Badenov and more like a bad reconstruction ...

    2. Do STT systems have "intriguing properties"?

      In "Intriguing properties of neural networks" (2013), Christian Szegedy et al. point out that … deep neural networks learn input-output mappings that are fairly discontinuous to a significant extent. We can cause the network to misclassify an image by applying a certain imperceptible perturbation… For example: There has been quite a bit of discussion of the topic since then.

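      The perturbation Szegedy et al. describe can be illustrated with a minimal sketch. The classifier below is a made-up linear score rather than a real deep network, and the weights, input, and budget are invented numbers; it only shows the gradient-sign idea, where a shift of at most eps per coordinate flips the predicted label.

```python
# Toy linear classifier standing in for a deep network's decision rule;
# the weights and the input are made-up numbers for illustration only.
w = [2.0, -1.0]

def sign(v):
    return 1.0 if v > 0 else (-1.0 if v < 0 else 0.0)

def predict(x):
    """Class 1 if the score w.x is positive, else class 0."""
    return int(sum(wi * xi for wi, xi in zip(w, x)) > 0)

x = [0.2, 0.1]    # "clean" input, classified as 1
eps = 0.15        # small perturbation budget per coordinate

# Gradient-sign step: for a linear score the gradient w.r.t. the input
# is just w, so move each coordinate eps against sign(w).
x_adv = [xi - eps * sign(wi) for wi, xi in zip(w, x)]

print(predict(x), predict(x_adv))  # prints: 1 0
```

      Even though no coordinate of x_adv differs from x by more than eps, the predicted class flips, which is the "discontinuous mapping" behavior the quoted passage is about.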
    3. The study is also termed into many conclusion

      Charles Belov was surprised by the featured story in the Health section of his Google News index. It was Chhavi Goel, "Surprising Theory About The Cats Which Make The Scientist Stunned", The News Recorder 12/26/2016: A theory which make the scientists and major medical team shocked came in front that your cat can also become the reason of bird flu to you.

    4. Ex-physicist takes on Heavy Metal NLP

      "Heavy Metal and Natural Language Processing – Part 1", Degenerate State 4/20/2016: Natural language is ubiquitous. It is all around us, and the rate at which it is produced in written, stored form is only increasing. It is also quite unlike any sort of data I have worked with before. Natural language is made up of sequences of discrete characters arranged into hierarchical groupings: words, sentences and documents, each with both syntactic structure and semantic meaning.

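      The hierarchy the excerpt describes can be sketched in a few lines. The naive splitting rules below (periods end sentences, whitespace separates words) are assumptions for illustration, not a real tokenizer.

```python
# Characters grouped into words, words into sentences, sentences into a
# document -- using deliberately naive splitting rules as a sketch.
doc = "Natural language is ubiquitous. It is all around us."

sentences = [s.strip() for s in doc.split(".") if s.strip()]
words = [sentence.split() for sentence in sentences]

print(sentences)  # two sentence strings
print(words[0])   # prints: ['Natural', 'language', 'is', 'ubiquitous']
```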
  1. Categories

    1. Default:

      Discourse, Entailment, Machine Translation, NER, Parsing, Segmentation, Semantic, Sentiment, Summarization, WSD