1. & Schapire


    1. Mentioned In 6 Articles

    1. Parsing with a Single Neuron: Convolution Kernels for Natural Language Problems

      Parsing with a Single Neuron: Convolution Kernels for Natural Language Problems. Michael Collins† and Nigel Duffy‡. †AT&T Labs-Research, Florham Park, New Jersey. mcollins@research.att.com. ‡Department of Computer Science, University of California at Santa Cruz. nigeduff@cs.ucsc.edu. Abstract: This paper introduces new training criteria and algorithms for NLP problems, based on the Support Vector Machine (SVM) approach to classification problems. SVMs can ...
    2. The Role of Occam's Razor in Knowledge Discovery

      Kluwer Academic Publishers, Boston. Manufactured in The Netherlands. The Role of Occam’s Razor in Knowledge Discovery. Pedro Domingos, pedrod@cs.washington.edu, Department of Computer Science and Engineering, University of Washington, Seattle, WA 98195. Abstract: Many KDD systems incorporate an implicit or explicit preference for simpler models, but this use of “Occam’s razor” has been strongly criticized by several authors (e.g., Schaffer, 1993; Webb ...
    3. Machine Learning

      Pedro Domingos, Department of Computer Science & Engineering, University of Washington, Box 352350, Seattle, WA 98195-2350. pedrod@cs.washington.edu. Tel.: 206-543-4229 / Fax: 206-543-2969. Abstract: Machine learning’s focus on ill-defined problems and highly flexible methods makes it ideally suited for KDD applications. Among the ideas machine learning contributes to KDD are the importance of empirical validation, the impossibility of learning without a priori assumptions, and the ...
    4. Ranking Algorithms for Named-Entity Extraction: Boosting and the Voted Perceptron

      Ranking Algorithms for Named-Entity Extraction: Boosting and the Voted Perceptron. Michael Collins, AT&T Labs-Research, Florham Park, New Jersey. mcollins@research.att.com. Abstract: This paper describes algorithms which rerank the top N hypotheses from a maximum-entropy tagger, the application being the recovery of named-entity boundaries in a corpus of web data. The first approach uses a boosting algorithm for ranking problems. The second approach uses the voted perceptron algorithm ...
    5. New Ranking Algorithms for Parsing and Tagging: Kernels over Discrete Structures, and the Voted Perceptron

      New Ranking Algorithms for Parsing and Tagging: Kernels over Discrete Structures, and the Voted Perceptron. Michael Collins, AT&T Labs-Research, Florham Park, New Jersey. mcollins@research.att.com. Nigel Duffy, iKuni Inc., 3400 Hillview Ave., Building 5, Palo Alto, CA 94304. nigeduff@cs.ucsc.edu. Abstract: This paper introduces new learning algorithms for natural language processing based on the perceptron algorithm. We show how the algorithms can be efficiently applied ...
    6. Discriminative Training Methods for Hidden Markov Models: Theory and Experiments with Perceptron Algorithms

      Discriminative Training Methods for Hidden Markov Models: Theory and Experiments with Perceptron Algorithms. Michael Collins, AT&T Labs-Research, Florham Park, New Jersey. mcollins@research.att.com. Abstract: We describe new algorithms for training tagging models, as an alternative to maximum-entropy models or conditional random fields (CRFs). The algorithms rely on Viterbi decoding of training examples, combined with simple additive updates. We describe theory justifying the algorithms through a mod ...
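      The last abstract sketches the core training loop of the structured perceptron: Viterbi-decode each training sentence under the current weights, and if the prediction differs from the gold tag sequence, apply a simple additive update (add the gold features, subtract the predicted ones). A minimal illustrative sketch in Python, assuming toy emission/transition indicator features chosen for this example rather than taken from the paper:

```python
from collections import defaultdict

def viterbi(words, tags, weights):
    """Find the highest-scoring tag sequence under the current weights."""
    n = len(words)
    # score[i][t]: best score of any tag sequence ending with tag t at position i
    score = [{t: float("-inf") for t in tags} for _ in range(n)]
    back = [{t: None for t in tags} for _ in range(n)]
    for t in tags:
        score[0][t] = weights[("emit", words[0], t)]
    for i in range(1, n):
        for t in tags:
            for p in tags:
                s = (score[i - 1][p]
                     + weights[("trans", p, t)]
                     + weights[("emit", words[i], t)])
                if s > score[i][t]:
                    score[i][t], back[i][t] = s, p
    best = max(tags, key=lambda t: score[n - 1][t])
    seq = [best]
    for i in range(n - 1, 0, -1):
        seq.append(back[i][seq[-1]])
    return seq[::-1]

def features(words, tag_seq):
    """Count emission and transition indicator features for a tagged sentence."""
    f = defaultdict(int)
    prev = None
    for w, t in zip(words, tag_seq):
        f[("emit", w, t)] += 1
        if prev is not None:
            f[("trans", prev, t)] += 1
        prev = t
    return f

def train(data, tags, epochs=5):
    """Structured-perceptron training: decode, then additively update on errors."""
    weights = defaultdict(float)
    for _ in range(epochs):
        for words, gold in data:
            pred = viterbi(words, tags, weights)
            if pred != gold:
                # reward features of the gold sequence, penalize the prediction
                for k, v in features(words, gold).items():
                    weights[k] += v
                for k, v in features(words, pred).items():
                    weights[k] -= v
    return weights
```

      (The paper's actual method also averages or votes over the weight vectors seen during training; this sketch shows only the basic update.)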
  1. Categories

    1. Default:

      Discourse, Entailment, Machine Translation, NER, Parsing, Segmentation, Semantic, Sentiment, Summarization, WSD