1. Articles in category: Parsing

    1-24 of 562
    1. Natural language processing utilizing propagation of knowledge through logical parse tree structures

      Mechanisms are provided for processing logical relationships in natural language content. A logical parse is generated from a first parse of natural language content by identifying latent logical operators within the first parse that indicate logical relationships between elements of the content. The logical parse comprises nodes and edges linking the nodes. At least one knowledge value is associated with each node in the logical parse, and the knowledge values of at least a subset of the nodes are propagated to one or more other nodes in the logical parse based on ...
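
A minimal sketch of what such propagation might look like, assuming AND/OR logical operators and three-valued knowledge values (true, false, unknown); the node structure and combination rules below are illustrative guesses, not the patent's actual mechanism:

```python
# Hypothetical sketch: propagate knowledge values (True / False / None)
# upward through a logical parse of AND/OR nodes and leaves.

def propagate(node):
    """Return a node's knowledge value by combining its children's values."""
    if node["op"] == "leaf":
        return node.get("value")            # True, False, or None (unknown)
    child_values = [propagate(c) for c in node["children"]]
    if node["op"] == "and":
        if False in child_values:
            return False                    # one false conjunct falsifies AND
        if None in child_values:
            return None                     # an unknown conjunct leaves it open
        return True
    if node["op"] == "or":
        if True in child_values:
            return True                     # one true disjunct satisfies OR
        if None in child_values:
            return None
        return False
    raise ValueError(f"unknown operator: {node['op']}")

tree = {"op": "and", "children": [
    {"op": "leaf", "value": True},
    {"op": "or", "children": [
        {"op": "leaf", "value": False},
        {"op": "leaf", "value": True},
    ]},
]}
print(propagate(tree))  # True
```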

    2. Extracting complex entities and relationships from unstructured data

      To extract relationships between complex entities from unstructured data, a parser parses, using an existing language model, the unstructured data to generate a parse tree. From the parse tree, a set of tokens is created. A token in the set of tokens includes a set of words found in the unstructured data. The set of tokens is inserted in the existing language model to form an enhanced language model. The unstructured data is re-parsed using the enhanced language model to create a knowledge graph. From the knowledge graph, a relationship between a subset of the set of tokens is extracted.
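
The parse-extract-reparse loop described above can be sketched with a toy greedy tokenizer standing in for the language model; the `parse` function and lexicon update are illustrative assumptions, not the patent's components:

```python
# Toy stand-in for the described loop: parse, discover a multi-word
# token, insert it into the "language model" (here, a phrase lexicon),
# then re-parse so the complex entity comes out as a single token.

def parse(text, lexicon):
    """Greedy longest-match tokenization against a phrase lexicon."""
    words, tokens, i = text.split(), [], 0
    while i < len(words):
        # try the longest phrase starting at position i, back off to one word
        for j in range(len(words), i, -1):
            phrase = " ".join(words[i:j])
            if phrase in lexicon or j == i + 1:
                tokens.append(phrase)
                i = j
                break
    return tokens

lexicon = {"machine", "learning", "improves", "parsing"}
text = "machine learning improves parsing"
first_pass = parse(text, lexicon)        # single-word tokens only

# pretend analysis of the first parse found a complex entity worth keeping
lexicon.add("machine learning")          # enhance the "language model"
second_pass = parse(text, lexicon)
print(second_pass)  # ['machine learning', 'improves', 'parsing']
```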

    3. Information, Vol. 8, Pages 13: Dependency Parsing with Transformed Feature

      Information 2017, 8(1), 13; doi:10.3390/info8010013. School of Astronautics, Beihang University, Beijing 100191, China. Academic Editor: Günter Neumann. Received: 2 November 2016 / Revised: 15 December 2016 / Accepted: 16 January 2017 / Published: 21 January 2017. (This article belongs to the Section Artificial Intelligence.) Dependency parsing is an important subtask of natural language processing.

    4. Multiple rule development support for text analytics

      Methods, computer program products and systems are provided for applying text analytics rules to a corpus of documents. The embodiments facilitate selection of a document from the corpus within a graphical user interface (GUI), where the GUI opens the selected document to display text of the selected document and also a token parse tree that lists tokens associated with text components of the document, facilitate construction of a text analytics rule, via the GUI, by user selection of one or more tokens from the token parse tree, and, in response to a user selecting one or more tokens from the ...

    5. Language processing method and integrated circuit

      A parse unit parses an input sequence of token elements for an input string, wherein each token element contains a token and/or at least one corresponding token classifier. In a first mode the parse unit applies regular production rules on the token elements and on multi-token classifiers for phrases obtained from the token classifiers. If the first mode parsing does not result in a multi-token classifier encompassing all tokens of the input string, a control unit controls the parse unit to parse the input sequence in a second mode that applies both the regular and artificial production rules. A ...

    6. Automated messaging response

      Apparatuses, systems, methods, and computer program products are disclosed for an automated messaging response. A message parsing module may parse a textual message to determine whether the message includes a question. A question determination module may determine a question type for the question. A response presentation module may present a response interface that includes a plurality of selectable responses to the question.
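
A rough sketch of the question-detection and question-typing steps, assuming simple keyword heuristics; the patent's actual modules are not specified, so every rule below is an illustrative guess:

```python
# Toy question parser: detect whether a message is a question and
# assign a coarse question type that could select a response interface.

WH_WORDS = {"who", "what", "when", "where", "why", "how", "which"}
AUX_WORDS = {"is", "are", "do", "does", "can", "will"}

def is_question(message):
    """Heuristic: trailing '?' or a leading interrogative word."""
    text = message.strip().lower()
    return text.endswith("?") or (text.split() or [""])[0] in WH_WORDS

def question_type(message):
    """Coarse question type used to pick selectable responses."""
    first = (message.strip().lower().split() or [""])[0]
    if first in AUX_WORDS:
        return "yes/no"
    if first in WH_WORDS:
        return "wh"
    return "other"

print(is_question("When is the meeting"))  # True
print(question_type("Can you attend?"))    # yes/no
```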

    7. System for identifying textual relationships

      A computer-implemented method identifies textual statement relationships. Textual statement pairs including a first and second textual statement are identified, and parsed word group pairs are extracted from first and second textual statements. The parsed word groups are compared, and a parsed word score for each statement pair is calculated. Word vectors for the first and second textual statements are created and compared. A word vector score is calculated based on the comparison of the word vectors for the first and second textual statements. A match score is determined for the textual statement pair, with the match score being representative of ...

    8. Data Recombination for Neural Semantic Parsing. (arXiv:1606.03622v1 [cs.CL])

      Modeling crisp logical regularities is crucial in semantic parsing, making it difficult for neural models with no task-specific prior knowledge to achieve good results. In this paper, we introduce data recombination, a novel framework for injecting such prior knowledge into a model. From the training data, we induce a high-precision synchronous context-free grammar, which captures important conditional independence properties commonly found in semantic parsing.
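
A toy flavor of one recombination rule (swapping abstracted entities across examples) can be sketched as follows; the induced synchronous grammar itself is not reproduced, and the `ENT` slot convention is an assumption for illustration:

```python
# Recombine abstracted training pairs with entities drawn from other
# examples, keeping the utterance and its logical form in sync.
import itertools

def recombine(templates, entities):
    """templates: (utterance, logical_form) pairs sharing an ENT slot."""
    return [(utt.replace("ENT", e), lf.replace("ENT", e))
            for (utt, lf), e in itertools.product(templates, entities)]

templates = [("what states border ENT", "answer(borders(ENT))")]
entities = ["texas", "ohio"]
augmented = recombine(templates, entities)
print(augmented[0])  # ('what states border texas', 'answer(borders(texas))')
```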

    9. Neural Network Models for Implicit Discourse Relation Classification in English and Chinese without Surface Features. (arXiv:1606.01990v1 [cs.CL])

      Inferring implicit discourse relations in natural language text is the most difficult subtask in discourse parsing. Surface features achieve good performance, but they are not readily applicable to other languages without semantic lexicons. Previous neural models require parses, surface features, or a small label set to work well. Here, we propose neural network models that are based on feedforward and long short-term memory architectures without any surface features.

    10. Dependency Parsing as Head Selection. (arXiv:1606.01280v1 [cs.CL])

      Conventional dependency parsers rely on a statistical model and a transition system or graph algorithm to enforce tree-structured outputs during training and inference. In this work we formalize dependency parsing as the problem of selecting the head (a.k.a. parent) of each word in a sentence. Our model, which we call DeNSe (shorthand for Dependency Neural Selection), employs bidirectional recurrent neural networks for the head selection task.
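
Head selection in miniature: each word independently picks its highest-scoring head. The lookup-table scorer below is a stand-in for the paper's BiLSTM-based scores:

```python
# For each word (1..n), choose the best head among all other words
# plus an artificial ROOT node at index 0.

def select_heads(n_words, score):
    """Return {dependent: head} by independent argmax per word."""
    heads = {}
    for dep in range(1, n_words + 1):
        candidates = [h for h in range(n_words + 1) if h != dep]
        heads[dep] = max(candidates, key=lambda h: score(h, dep))
    return heads

# scores for "She(1) reads(2) books(3)": reads <- ROOT, She/books <- reads
SCORES = {(0, 2): 5.0, (2, 1): 4.0, (2, 3): 4.5}
heads = select_heads(3, lambda h, d: SCORES.get((h, d), 0.0))
print(heads)  # {1: 2, 2: 0, 3: 2}
```

Note that independent argmax choices need not form a tree; handling such cases is part of what distinguishes the full model from this sketch.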

    11. Parsing Argumentation Structures in Persuasive Essays. (arXiv:1604.07370v1 [cs.CL])

      In this article, we present the first end-to-end approach for parsing argumentation structures in persuasive essays. We model the argumentation structure as a tree including several types of argument components connected with argumentative support and attack relations. We consider the identification of argumentation structures in several consecutive steps. First, we segment a persuasive essay in order to identify relevant argument components.

    12. Shallow Parsing Pipeline for Hindi-English Code-Mixed Social Media Text. (arXiv:1604.03136v1 [cs.CL])

      In this study, the problem of shallow parsing of Hindi-English code-mixed social media text (CSMT) has been addressed. We have annotated the data and developed a language identifier, a normalizer, a part-of-speech tagger, and a shallow parser. To the best of our knowledge, we are the first to attempt shallow parsing on CSMT. The pipeline developed has been made available to the research community with the goal of enabling better text analysis of Hindi-English CSMT.
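
The multi-stage pipeline can be pictured as plain function composition; every stage below is a toy stand-in for the real language identifier, tagger, and chunker, with a made-up word list:

```python
# Chain pipeline stages so each stage's output feeds the next one.

def pipeline(*stages):
    def run(data):
        for stage in stages:
            data = stage(data)
        return data
    return run

def identify_language(text):
    # toy language identifier using a tiny Hindi word list (assumption)
    hindi = {"kya", "hai", "nahi", "achhi"}
    return [(w, "hi" if w in hindi else "en") for w in text.lower().split()]

def pos_tag(pairs):
    # toy tagger: mark the Hindi question word, leave the rest untagged
    return [(w, lang, "PRON" if w == "kya" else "X") for w, lang in pairs]

shallow = pipeline(identify_language, pos_tag)
print(shallow("Kya movie achhi hai"))
# [('kya', 'hi', 'PRON'), ('movie', 'en', 'X'),
#  ('achhi', 'hi', 'X'), ('hai', 'hi', 'X')]
```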

    13. A Fast Unified Model for Parsing and Sentence Understanding. (arXiv:1603.06021v1 [cs.CL])

      Tree-structured neural networks exploit valuable syntactic parse information as they interpret the meanings of sentences. However, they suffer from two key technical problems that make them slow and unwieldy for large-scale NLP tasks: they can only operate on parsed sentences and they do not directly support batched computation.

    14. Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations. (arXiv:1603.04351v1 [cs.CL])

      We present a simple and effective scheme for dependency parsing which is based on bidirectional-LSTMs (BiLSTMs). Each sentence token is associated with a BiLSTM vector representing the token in its sentential context, and feature vectors are constructed by concatenating a few BiLSTM vectors. The BiLSTM is trained jointly with the parser objective, resulting in very effective feature extractors for parsing.

    15. Easy-First Dependency Parsing with Hierarchical Tree LSTMs. (arXiv:1603.00375v1 [cs.CL])

      We suggest a compositional vector representation of parse trees that relies on a recursive combination of recurrent-neural network encoders. To demonstrate its effectiveness, we use the representation as the backbone of a greedy, bottom-up dependency parser, achieving state-of-the-art accuracies for English and Chinese, without relying on external word embeddings. The parser's implementation is available for download at the first author's webpage.

    16. Caching of deep structures for efficient parsing

      A parsing method and system. The method includes generating an n-gram model of a domain and computing a tf-idf frequency associated with n-grams of the n-gram model. A list including a frequently occurring group of n-grams based on the tf-idf frequency is generated. The frequently occurring group of n-grams is transmitted to a deep parser component and a deep parse output from the deep parser component is generated. The deep parse output is stored within a cache and a processor verifies if a specified text word sequence of the deep parse output is available in the cache.
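
The caching idea might be sketched like this: rank bigrams by a tf-idf score, push the top-ranked ones through a deep parser, and serve later requests from the cache. The scoring details and the parser are placeholders, not the patented components:

```python
# Rank corpus bigrams by tf-idf, pre-warm a cache of deep-parse
# outputs for the top-ranked ones, then answer lookups from the cache.
import math
from collections import Counter

def bigrams(doc):
    words = doc.split()
    return [" ".join(words[i:i + 2]) for i in range(len(words) - 1)]

def tfidf_ranked(docs):
    """Rank bigrams by term frequency times smoothed inverse doc frequency."""
    tf, df = Counter(), Counter()
    for doc in docs:
        grams = bigrams(doc)
        tf.update(grams)
        df.update(set(grams))
    n = len(docs)
    score = {g: tf[g] * math.log((1 + n) / (1 + df[g])) for g in tf}
    return sorted(score, key=score.get, reverse=True)

cache = {}

def deep_parse(text):
    """Stub deep parser whose outputs are cached by input text."""
    if text not in cache:
        cache[text] = f"<deep parse of {text!r}>"
    return cache[text]

docs = ["the parse tree", "the parse forest", "a deep parse"]
for gram in tfidf_ranked(docs)[:3]:
    deep_parse(gram)          # pre-warm the cache with high-scoring n-grams
print(len(cache))  # 3
```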

    17. Recurrent Neural Network Grammars. (arXiv:1602.07776v1 [cs.CL])

      We introduce recurrent neural network grammars, probabilistic models of sentences with explicit phrase structure. We explain efficient inference procedures that allow application to both parsing and language modeling. Experiments show that they provide better parsing in English than any single previously published supervised generative model and better language modeling than state-of-the-art sequential RNNs in English and Chinese.

    18. Generation of a semantic model from textual listings

      A corpus of textual listings is received and main concept words and attribute words therein are identified via an iterative process of parsing listings and expanding a semantic model. During the parsing phase, the corpus of textual listings is parsed to tag one or more head noun words and/or one or more identifier words in each listing based on previously identified main concept words or using a head noun identification rule. Once substantially each listing in the corpus has been parsed in this manner, the expansion phase assigns head noun words as main concept words and modifier words as ...

    19. System and method for extracting ontological information from a body of text

      A system for extracting ontological information from a body of text is disclosed. The system parses one or more sentences from the body of text into parse tree format to generate a set of parsed sentences. The system further performs named-entity-recognition by identifying a subset of parsed sentences from the set of parsed sentences. A subset of noun phrases from the subset of parsed sentences are identified and the noun phrases are examined to classify the noun phrases as an entity or as a property. The system also identifies and outputs a conceptual relationship between the entity and the property ...

    20. Paraphrase Generation from Latent-Variable PCFGs for Semantic Parsing. (arXiv:1601.06068v1 [cs.CL])

      One of the limitations of semantic parsing approaches to open-domain question answering is the lexicosyntactic gap between natural language questions and knowledge base entries -- there are many ways to ask a question, all with the same answer. In this paper we propose to bridge this gap by generating paraphrases of the input question with the goal that at least one of them will be correctly mapped to a knowledge-base query.

    21. Syntagma. A Linguistic Approach to Parsing. (arXiv:1303.5960v3 [cs.CL] UPDATED)

      SYNTAGMA is a rule-based parsing system structured on two levels: a general parsing engine and a language-specific grammar. The parsing engine is a language-independent program, while the grammar and language-specific rules and resources are given as text files, consisting of a list of constituent structures and a lexical database with word-sense-related features and constraints.

    22. Query parser derivation computing device and method for making a query parser for parsing unstructured search queries

      A system and method are provided which may comprise parsing an unstructured geographic web-search query into a field-based format, by utilizing conditional random fields, learned by semi-supervised automated learning, to parse structured information from the unstructured geographic web-search query. The system and method may also comprise establishing semi-supervised conditional random fields utilizing one of a rule-based finite state machine model and a statistics-based conditional random field model. Systematic geographic parsing may be used with the one of the rule-based finite state machine model and the statistics-based conditional random field model. Parsing an unstructured local geographical web-based query in local domain ...

    23. Edge-Linear First-Order Dependency Parsing with Undirected Minimum Spanning Tree Inference. (arXiv:1510.07482v1 [cs.CL])

      The run time complexity of state-of-the-art inference algorithms in graph-based dependency parsing is super-linear in the number of input words (n). Recently, pruning algorithms for these models have been shown to cut a large portion of the graph edges, with minimal damage to the resulting parse trees. Solving the inference problem in run time complexity determined solely by the number of edges (m) is hence of obvious importance.
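
Undirected MST inference can be illustrated with a standard Kruskal maximum-spanning-tree sketch over scored word-pair edges; this shows the edge-driven flavor of the inference, not the paper's exact algorithm:

```python
# Kruskal's algorithm with union-find: take edges in descending score
# order, keeping an edge only if it connects two separate components.

def mst(n, edges):
    """edges: (score, u, v) over nodes 0..n-1; returns a max-score tree."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x

    tree = []
    for score, u, v in sorted(edges, reverse=True):   # best edges first
        ru, rv = find(u), find(v)
        if ru != rv:                 # joining two components: keep it
            parent[ru] = rv
            tree.append((u, v))
    return tree

# nodes: ROOT(0) She(1) reads(2) books(3); edge scores are illustrative
edges = [(9, 0, 2), (7, 1, 2), (8, 2, 3), (1, 0, 1), (2, 1, 3)]
print(sorted(mst(4, edges)))  # [(0, 2), (1, 2), (2, 3)]
```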
