1. Articles in category: Parsing

    Articles 25-48 of 598
    1. Unsupervised Dependency Parsing without Gold Part-of-Speech Tags

      Valentin I. Spitkovsky, Hiyan Alshawi, Angel X. Chang, and Daniel Jurafsky. 2011. Unsupervised Dependency Parsing without Gold Part-of-Speech Tags. In Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing (EMNLP 2011).

    2. Method for searching a text (or alphanumeric string) database, restructuring and parsing text data (or alphanumeric string), creation/application of a natural language processing engine, and the creation/application of an automated analyzer …

      A sequence of methods is presented for optimized searching within a text (or alphanumeric string) database to retrieve specific, relevant results, followed by optimized restructuring and parsing of the text data, followed by creation and application of a natural language processing engine, and finally creation and application of an automated analyzer.

    3. Natural language processing utilizing transaction based knowledge representation

      Mechanisms are provided for processing logical relationships in natural language content. A logical parse of a first parse of the natural language content is generated by identifying latent logical terms within the first parse indicative of logical relationships between elements of the natural language content. The logical parse comprises nodes and edges linking nodes. At least one knowledge value is associated with each node in the logical parse. The at least one knowledge value associated with at least a subset of the nodes in the logical parse is propagated to one or more other nodes in the logical parse based ...

    4. Natural language processing utilizing logical tree structures

      Mechanisms are provided for processing logical relationships in natural language content. Natural language content is received, upon which a reasoning operation is to be performed. A first parse representation of the natural language content is generated, by a parser, by performing natural language processing on the natural language content. A logical parse of the first parse is generated by identifying latent logical operators within the first parse indicative of logical relationships between elements of the natural language content. A reasoning operation on the logical parse is executed to generate a knowledge output indicative of knowledge associated with one or more ...

    5. Natural language processing utilizing propagation of knowledge through logical parse tree structures

      Mechanisms are provided for processing logical relationships in natural language content. A logical parse of a first parse of a natural language content is generated by identifying latent logical operators within the first parse indicative of logical relationships between elements of the natural language content. The logical parse comprises nodes and edges linking nodes. At least one knowledge value is associated with each node in the logical parse. The at least one knowledge value of at least a subset of the nodes in the logical parse is propagated to one or more other nodes in the logical parse based on ...

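      A minimal sketch of the knowledge-propagation idea described in the three entries above, under assumed details: the logical parse is modeled as a toy tree whose internal nodes are logical operators and whose leaves carry initial knowledge values; the three-valued scheme (True / False / None for unknown) is an assumption, not something the patents specify.

```python
# Minimal sketch, not the patented mechanism: propagate knowledge values
# through a toy "logical parse" whose internal nodes are latent logical
# operators and whose leaves carry initial knowledge values.

class Node:
    def __init__(self, label, children=None, value=None):
        self.label = label           # "AND", "OR", "NOT", or a text span
        self.children = children or []
        self.value = value           # knowledge value; None means unknown

def propagate(node):
    """Return the node's knowledge value, computing operator nodes from
    the values propagated up from their children (Kleene-style logic)."""
    if not node.children:
        return node.value
    child_values = [propagate(c) for c in node.children]
    if node.label == "AND":
        node.value = None if None in child_values else all(child_values)
        if False in child_values:
            node.value = False       # a single False decides a conjunction
    elif node.label == "OR":
        node.value = None if None in child_values else any(child_values)
        if True in child_values:
            node.value = True        # a single True decides a disjunction
    elif node.label == "NOT":
        node.value = None if child_values[0] is None else not child_values[0]
    return node.value

# "the patient has a fever AND (a cough OR a headache)"
parse = Node("AND", [
    Node("has fever", value=True),
    Node("OR", [Node("has cough", value=None), Node("has headache", value=True)]),
])
print(propagate(parse))              # True: the unknown leaf cannot change the outcome
```
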
    6. Extracting complex entities and relationships from unstructured data

      To extract relationships between complex entities from unstructured data, a parser parses, using an existing language model, the unstructured data to generate a parse tree. From the parse tree, a set of tokens is created. A token in the set of tokens includes a set of words found in the unstructured data. The set of tokens is inserted in the existing language model to form an enhanced language model. The unstructured data is re-parsed using the enhanced language model to create a knowledge graph. From the knowledge graph, a relationship between a subset of the set of tokens is extracted.

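      A minimal sketch of the two-pass idea in the entry above, with assumed stand-ins: capitalised bigrams play the role of parser-derived multi-word tokens, string replacement plays the role of the enhanced language model, and sentence-level co-occurrence plays the role of knowledge-graph relation extraction.

```python
# Pass 1 proposes multi-word tokens, pass 2 re-parses with those tokens merged,
# then (token, verb, token) triples are read off as a tiny "knowledge graph".
import re
from itertools import combinations

text = "Acme Corp acquired Beta Labs. Beta Labs develops parsing software."

# Pass 1: propose multi-word tokens (here: naive capitalised bigrams).
candidates = set(re.findall(r"[A-Z][a-z]+ [A-Z][a-z]+", text))

# "Enhanced model": treat each candidate as a single token from now on.
enhanced = text
for c in candidates:
    enhanced = enhanced.replace(c, c.replace(" ", "_"))

# Pass 2: re-parse (here: a crude sentence/word split) and extract relations
# between merged tokens that co-occur in a sentence.
graph = []
for sentence in enhanced.split("."):
    words = sentence.split()
    tokens = [w for w in words if "_" in w]
    verbs = [w for w in words if w.islower() and w not in ("the", "a")]
    for left, right in combinations(tokens, 2):
        if verbs:
            graph.append((left, verbs[0], right))

print(graph)   # [('Acme_Corp', 'acquired', 'Beta_Labs')]
```
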
    7. Information, Vol. 8, Pages 13: Dependency Parsing with Transformed Feature

      Information 2017, 8(1), 13; doi:10.3390/info8010013. School of Astronautics, Beihang University, Beijing 100191, China. Academic Editor: Günter Neumann. Received: 2 November 2016 / Revised: 15 December 2016 / Accepted: 16 January 2017 / Published: 21 January 2017. (This article belongs to the Section Artificial Intelligence.) Dependency parsing is an important subtask of natural language processing.

    8. Multiple rule development support for text analytics

      Methods, computer program products, and systems are provided for applying text analytics rules to a corpus of documents. The embodiments facilitate selection of a document from the corpus within a graphical user interface (GUI), where the GUI opens the selected document to display its text along with a token parse tree listing the tokens associated with the document's text components; facilitate construction of a text analytics rule via the GUI through user selection of one or more tokens from the token parse tree; and, in response to a user selecting one or more tokens from the ...

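      A minimal sketch of turning user-selected parse-tree tokens into a text analytics rule, as in the entry above; the flat token list, the index-based "selection", and the dictionary pattern format are all assumptions for illustration, not the patented GUI.

```python
# A flat "token parse tree" as (token, part_of_speech) entries; clicking tokens
# in a GUI is simulated by passing their indices, and a simple sequence-pattern
# rule is generated from the selection.

tokens = [("Acme", "PROPN"), ("acquired", "VERB"), ("Beta", "PROPN"), ("Labs", "PROPN")]

def build_rule(selected_indices, generalize_pos=True):
    """Turn the selected tokens into a toy text-analytics rule: a sequence
    pattern over either the tokens' POS tags or their literal text."""
    pattern = []
    for i in selected_indices:
        word, pos = tokens[i]
        pattern.append({"pos": pos} if generalize_pos else {"text": word})
    return {"name": "rule_from_selection", "pattern": pattern}

rule = build_rule([0, 1, 2, 3])      # the user selects all four tokens
print(rule)
# {'name': 'rule_from_selection',
#  'pattern': [{'pos': 'PROPN'}, {'pos': 'VERB'}, {'pos': 'PROPN'}, {'pos': 'PROPN'}]}
```
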
    9. Language processing method and integrated circuit

      A parse unit parses an input sequence of token elements for an input string, wherein each token element contains a token and/or at least one corresponding token classifier. In a first mode the parse unit applies regular production rules on the token elements and on multi-token classifiers for phrases obtained from the token classifiers. If the first mode parsing does not result in a multi-token classifier encompassing all tokens of the input string, a control unit controls the parse unit to parse the input sequence in a second mode that applies both the regular and artificial production rules. A ...

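      A minimal sketch of the two-mode control flow in the entry above, with an assumed toy grammar: mode 1 uses only the regular production rules, and mode 2 is triggered when no single classifier encompasses the whole input.

```python
# Assumed grammar and reduction strategy, not the patented circuit.
REGULAR = {
    ("DET", "NOUN"): "NP",
    ("VERB", "NP"): "VP",
    ("NP", "VP"): "S",
}
ARTIFICIAL = {
    ("NOUN", "VERB"): "S",           # fallback: tolerate a missing determiner
}

def parse(classifiers, rules):
    """Greedy bottom-up reduction; returns the classifiers left on the stack."""
    stack = []
    for c in classifiers:
        stack.append(c)
        changed = True
        while changed and len(stack) >= 2:
            changed = False
            head = tuple(stack[-2:])
            if head in rules:
                stack[-2:] = [rules[head]]
                changed = True
    return stack

tokens = ["NOUN", "VERB"]                             # e.g. "John sleeps"
result = parse(tokens, REGULAR)                       # mode 1: regular rules only
if len(result) != 1:                                  # nothing spans all tokens
    result = parse(tokens, {**REGULAR, **ARTIFICIAL})  # mode 2: add artificial rules
print(result)                                         # ['S']
```
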
    10. Automated messaging response

      Apparatuses, systems, methods, and computer program products are disclosed for an automated messaging response. A message parsing module may parse a textual message to determine whether the message includes a question. A question determination module may determine a question type for the question. A response presentation module may present a response interface that includes a plurality of selectable responses to the question.

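      A minimal sketch of the module chain in the entry above, using assumed keyword heuristics in place of the disclosed parsing and classification modules.

```python
# Parse the message, decide whether it contains a question, classify the
# question type, and pick a canned set of selectable responses for that type.
import re

RESPONSES = {
    "yes_no": ["Yes", "No", "Let me check"],
    "when":   ["Today", "Tomorrow", "Next week"],
    "open":   ["Sounds good", "Tell me more", "I'll get back to you"],
}

def question_type(message):
    if "?" not in message:
        return None                                   # no question detected
    if re.match(r"\s*(when|what time)\b", message, re.I):
        return "when"
    if re.match(r"\s*(is|are|do|does|can|will|should)\b", message, re.I):
        return "yes_no"
    return "open"

qtype = question_type("Can we meet tomorrow?")
print(qtype, RESPONSES.get(qtype, []))                # yes_no ['Yes', 'No', 'Let me check']
```
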
    11. System for identifying textual relationships

      A computer-implemented method identifies textual statement relationships. Textual statement pairs including a first and second textual statement are identified, and parsed word group pairs are extracted from first and second textual statements. The parsed word groups are compared, and a parsed word score for each statement pair is calculated. Word vectors for the first and second textual statements are created and compared. A word vector score is calculated based on the comparison of the word vectors for the first and second textual statements. A match score is determined for the textual statement pair, with the match score being representative of ...

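      A minimal sketch of the scoring scheme in the entry above; the Jaccard overlap for parsed word groups, the bag-of-words cosine for word vectors, and the equal weighting are assumptions, not the patented formulas.

```python
from collections import Counter
from math import sqrt

def word_group_score(groups_a, groups_b):
    """Jaccard overlap between the two statements' parsed word groups."""
    a, b = set(groups_a), set(groups_b)
    return len(a & b) / len(a | b) if a | b else 0.0

def vector_score(text_a, text_b):
    """Cosine similarity between simple bag-of-words count vectors."""
    va, vb = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = sqrt(sum(v * v for v in va.values())) * sqrt(sum(v * v for v in vb.values()))
    return dot / norm if norm else 0.0

def match_score(text_a, groups_a, text_b, groups_b, alpha=0.5):
    """Combine the two comparisons into a single match score for the pair."""
    return alpha * word_group_score(groups_a, groups_b) + (1 - alpha) * vector_score(text_a, text_b)

s1 = "The supplier shall deliver the goods within 30 days"
s2 = "Goods must be delivered by the supplier within 30 days"
print(round(match_score(s1, [("supplier", "deliver"), ("within", "30 days")],
                        s2, [("supplier", "deliver"), ("within", "30 days")]), 3))
# prints the combined match score for this near-paraphrase pair
```
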
    12. Data Recombination for Neural Semantic Parsing. (arXiv:1606.03622v1 [cs.CL])

      Modeling crisp logical regularities is crucial in semantic parsing, making it difficult for neural models with no task-specific prior knowledge to achieve good results. In this paper, we introduce data recombination, a novel framework for injecting such prior knowledge into a model. From the training data, we induce a high-precision synchronous context-free grammar, which captures important conditional independence properties commonly found in semantic parsing.

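      A minimal sketch of the recombination idea in the entry above, using a hand-written entity rule rather than an induced synchronous grammar: entity mentions in paired (utterance, logical form) examples are abstracted into a shared nonterminal and re-expanded with other entities to synthesize new training pairs.

```python
# Toy grammar, not the paper's induced SCFG; the entity list is assumed given.
from itertools import product

examples = [
    ("what states border texas", "answer(border(texas))"),
    ("what states border ohio", "answer(border(ohio))"),
]
entities = ["texas", "ohio", "utah"]

# Induce templates by replacing an entity on both sides with a nonterminal.
templates = set()
for utt, lf in examples:
    for e in entities:
        if e in utt and e in lf:
            templates.add((utt.replace(e, "<ENT>"), lf.replace(e, "<ENT>")))

# Recombine: expand each template with every known entity.
recombined = [(u.replace("<ENT>", e), l.replace("<ENT>", e))
              for (u, l), e in product(templates, entities)]
for pair in sorted(recombined):
    print(pair)
# includes ('what states border utah', 'answer(border(utah))'),
# a pair that never appeared in the original training data
```
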
    13. Neural Network Models for Implicit Discourse Relation Classification in English and Chinese without Surface Features. (arXiv:1606.01990v1 [cs.CL])

      Inferring implicit discourse relations in natural language text is the most difficult subtask in discourse parsing. Surface features achieve good performance, but they are not readily applicable to other languages without semantic lexicons. Previous neural models require parses, surface features, or a small label set to work well. Here, we propose neural network models that are based on feedforward and long short-term memory architectures without any surface features.

    14. Dependency Parsing as Head Selection. (arXiv:1606.01280v1 [cs.CL])

      Conventional dependency parsers rely on a statistical model and a transition system or graph algorithm to enforce tree-structured outputs during training and inference. In this work we formalize dependency parsing as the problem of selecting the head (a.k.a. parent) of each word in a sentence. Our model, which we call DeNSe (shorthand for Dependency Neural Selection), employs bidirectional recurrent neural networks for the head selection task.

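      A minimal sketch of head selection as described in the entry above, with toy scores standing in for the bidirectional-RNN representations: each word independently picks its highest-scoring candidate head, including the artificial ROOT.

```python
words = ["ROOT", "She", "reads", "books"]

# score[i][j]: plausibility that word j is the head of word i (toy numbers).
score = {
    1: {0: 0.1, 2: 0.8, 3: 0.1},    # "She"   -> head "reads"
    2: {0: 0.9, 1: 0.05, 3: 0.05},  # "reads" -> head ROOT
    3: {0: 0.1, 1: 0.1, 2: 0.8},    # "books" -> head "reads"
}

# Head selection: per-word argmax over candidate heads.
heads = {i: max(candidates, key=candidates.get) for i, candidates in score.items()}
for dependent, head in heads.items():
    print(f"{words[dependent]:>6} <- {words[head]}")

# Greedy per-word argmax does not by itself guarantee a tree; a
# maximum-spanning-tree pass can repair any cycles in the output.
```
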
    15. Parsing Argumentation Structures in Persuasive Essays. (arXiv:1604.07370v1 [cs.CL])

      In this article, we present the first end-to-end approach for parsing argumentation structures in persuasive essays. We model the argumentation structure as a tree including several types of argument components connected with argumentative support and attack relations. We consider the identification of argumentation structures in several consecutive steps. First, we segment a persuasive essay in order to identify relevant argument components.
