Jbg reading

From CSWiki


WordNet, a ubiquitous tool for natural language processing, suffers from sparsity of connections between its component concepts (synsets). If new connections between synsets can be learned and weighted, WordNet can serve as a tool for word sense disambiguation: determining, for an arbitrary phrase of text, which meaning of each potentially ambiguous word was intended.
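
The disambiguation task can be illustrated with a toy, self-contained sketch: a simplified Lesk-style overlap count picks the sense whose gloss best matches the context. The synset names and glosses below are illustrative stand-ins, not actual WordNet data.

```python
# Toy sketch of WordNet-style sense disambiguation via gloss overlap
# (a simplified Lesk heuristic). The entries below are illustrative,
# not real WordNet synsets.

SYNSETS = {
    "bank.n.01": "sloping land beside a body of water such as a river",
    "bank.n.02": "a financial institution that accepts deposits and lends money",
}

def disambiguate(word_senses, context):
    """Return the sense whose gloss shares the most words with the context."""
    ctx = set(context.lower().split())
    def overlap(sense):
        return len(ctx & set(word_senses[sense].split()))
    return max(word_senses, key=overlap)

print(disambiguate(SYNSETS, "I deposited money at the bank"))
# picks bank.n.02 ("money" overlaps the financial gloss)
```

A real system would consult the actual WordNet database and the weighted connections described below rather than raw gloss overlap; the sketch only shows the shape of the task.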

Through the use of human annotators, a subset of the connections between 1000 hand-chosen synsets was assigned a value of “evocation” representing how much the first concept brings to mind the second. These ratings, together with existing similarity measures, form the basis of a method for predicting evocation between previously unrated pairs: boosting combines statistical measures of similarity derived from corpora and linguistic resources to predict new evocation values. The final, augmented network can then be used as a resource in matching tasks with information retrieval applications.
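
As a rough illustration of the prediction step, the sketch below fits a toy gradient-boosted ensemble of one-feature threshold stumps under squared loss. It is a stand-in for the boosting method above: the feature vectors and targets are synthetic, not real similarity measures or human evocation ratings.

```python
# Toy sketch: predict "evocation" scores from similarity features with
# gradient boosting on single-feature threshold stumps (squared loss).
# Data below are synthetic stand-ins for corpus-derived similarity
# measures (features) and human evocation ratings (targets).

def fit_stump(X, residuals):
    """Best single-feature threshold stump minimizing squared error."""
    best = None
    for j in range(len(X[0])):
        for t in sorted({x[j] for x in X}):
            left = [r for x, r in zip(X, residuals) if x[j] <= t]
            right = [r for x, r in zip(X, residuals) if x[j] > t]
            if not left or not right:
                continue
            lmean, rmean = sum(left) / len(left), sum(right) / len(right)
            err = (sum((r - lmean) ** 2 for r in left)
                   + sum((r - rmean) ** 2 for r in right))
            if best is None or err < best[0]:
                best = (err, j, t, lmean, rmean)
    _, j, t, lmean, rmean = best
    return lambda x: lmean if x[j] <= t else rmean

def boost(X, y, rounds=20, lr=0.3):
    """Fit stumps to residuals; predictions are the shrunken sum of stumps."""
    pred = [0.0] * len(y)
    stumps = []
    for _ in range(rounds):
        resid = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(X, resid)
        stumps.append(stump)
        pred = [p + lr * stump(x) for p, x in zip(pred, X)]
    return lambda x: sum(lr * s(x) for s in stumps)

# Two similarity features per synset pair; higher features -> higher evocation.
X = [[0.1, 0.2], [0.9, 0.8], [0.4, 0.5], [0.7, 0.3], [0.2, 0.9], [0.8, 0.6]]
y = [0.1, 0.9, 0.45, 0.5, 0.55, 0.7]
model = boost(X, y)
print(model([0.85, 0.7]))  # prediction for a previously unrated pair
```

The published approach uses a real boosting framework over many similarity features; this pure-Python version only demonstrates the mechanism of fitting successive weak learners to residuals.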

A second approach learns the weights between a subset of the synsets in a completely unsupervised manner. Under the assumption that similar senses appear in related documents, a generative model based on the structure of nouns within WordNet was learned and adapted to the framework of latent Dirichlet allocation. Because words are generated directly from synsets within WordNet, a given text can be disambiguated with respect to its most likely senses.
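
The generative story can be sketched as a random walk from a root synset down hyponym links that emits a word at a leaf, so each token is tagged with the sense that produced it. The tiny tree and transition probabilities below are illustrative, not learned from data or taken from WordNet.

```python
import random

# Toy sketch of the generative story: a word token is produced by walking
# from the root synset down hyponym links until a leaf sense is reached.
# The tree and probabilities are illustrative, not real WordNet structure.

random.seed(0)

# synset -> list of (child, transition probability); leaves are senses.
TREE = {
    "entity":   [("animal", 0.5), ("artifact", 0.5)],
    "animal":   [("bass.n.fish", 1.0)],
    "artifact": [("bass.n.instrument", 1.0)],
}
WORDS = {"bass.n.fish": "bass", "bass.n.instrument": "bass"}

def walk(node="entity"):
    """Walk from the root until a leaf sense is reached; emit its word."""
    while node in TREE:
        children, probs = zip(*TREE[node])
        node = random.choices(children, weights=probs)[0]
    return node, WORDS[node]

sense, word = walk()
print(sense, word)  # the surface word "bass" plus the sense that generated it
```

Inference runs in the opposite direction: given the word "bass" in a document, the model asks which path (and hence which sense) most plausibly generated it, which is what ties disambiguation into the LDA framework.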

Reading List

  • Jordan, M. Introduction to Graphical Models (Chapters 1-5, 9-12)
  • Russell, S., & Norvig, P. (2003). Artificial Intelligence: A Modern Approach (2nd ed.). Upper Saddle River, NJ: Prentice Hall. (Chapters 3-7, 18-21)


  • Manning, C., & Schütze, H. (1999). Foundations of Statistical Natural Language Processing. Cambridge, MA: MIT Press. (Chapters 3-8)
  • Schütze, H. (1998). Automatic word sense discrimination. Computational Linguistics, 24(1):97-124.


  • Yarowsky, D. (1995). Unsupervised Word Sense Disambiguation Rivaling Supervised Methods. In Proceedings of the 33rd Annual Meeting of the Association for Computational Linguistics.


  • Schapire, R. (2001). The Boosting Approach to Machine Learning: An Overview. MSRI Workshop on Nonlinear Estimation and Classification, Berkeley, CA.


  • Stone, P., Schapire, R. E., Littman, M. L., Csirik, J. A., & McAllester, D. (2003). Decision-theoretic bidding based on learned density models in simultaneous, interacting auctions. Journal of Artificial Intelligence Research, 19:209-242.


  • Miller, G. (1995). Nouns in WordNet. In Introduction to WordNet: An On-line Lexical Database. Cambridge: MIT Press.
  • Fellbaum, C. (1995). Verbs in WordNet. In Introduction to WordNet: An On-line Lexical Database. Cambridge: MIT Press.


  • Snow, R., Jurafsky, D., & Ng, A. (2005). Learning syntactic patterns for automatic hypernym discovery. In NIPS 17.


  • Blei, D., Ng, A., & Jordan, M. (2003). Latent Dirichlet allocation. Journal of Machine Learning Research, 3:993-1022.


  • Winn, J., & Bishop, C.M. (2005). Variational message passing. Journal of Machine Learning Research 6, 661–694.


  • McCarthy, D., Koeling, R., Weeds, J., & Carroll, J. (2004). Finding predominant senses in untagged text. In Proceedings of the 42nd Annual Meeting of the Association for Computational Linguistics, Barcelona, Spain, pp. 280-287.


To Do List

  • Compute statistics and summary for Dong, Wikipedia, etc.
  • Update presentation
  • Do practice presentation
  • Integrate walk into LDA code
  • Study
    • Boosting
    • LDA Smoothing
    • R&N
      • MDP
      • Constraint Heuristics
        • Fail-first (node)
        • Degree (node)
        • Least Constraining (value selection)
      • Binary Constraints (5.11)
      • Minimax w/ suboptimal opponents