
Learning Analogies and Semantic Relations



Document pages: 28 pages

Abstract: We present an algorithm for learning from unlabeled text, based on the Vector Space Model (VSM) of information retrieval, that can solve verbal analogy questions of the kind found in the Scholastic Aptitude Test (SAT). A verbal analogy has the form A:B::C:D, meaning "A is to B as C is to D"; for example, mason:stone::carpenter:wood. SAT analogy questions provide a word pair, A:B, and the problem is to select the most analogous word pair, C:D, from a set of five choices. The VSM algorithm correctly answers 47% of a collection of 374 college-level analogy questions (random guessing would yield 20% correct). We motivate this research by relating it to work in cognitive science and linguistics, and by applying it to a difficult problem in natural language processing: determining semantic relations in noun-modifier pairs. The problem is to classify a noun-modifier pair, such as "laser printer", according to the semantic relation between the noun (printer) and the modifier (laser). We use a supervised nearest-neighbour algorithm that assigns a class to a given noun-modifier pair by finding the most analogous noun-modifier pair in the training data. With 30 classes of semantic relations, on a collection of 600 labeled noun-modifier pairs, the learning algorithm attains an F value of 26.5% (random guessing: 3.3%). With 5 classes of semantic relations, the F value is 43.2% (random: 20%). The performance is state-of-the-art for these challenging problems.
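The selection step described in the abstract can be sketched in a few lines: each word pair is represented as a vector (in the paper, frequencies of joining patterns gathered from a corpus), and the stem pair A:B is compared to each candidate C:D by cosine similarity, with the highest-cosine candidate chosen. The vectors and pair labels below are purely hypothetical placeholders, not data from the paper; only the cosine-and-argmax mechanism is illustrated.

```python
import math

def cosine(u, v):
    """Cosine of the angle between two frequency vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def best_choice(stem_vec, choice_vecs):
    """Index of the choice pair whose vector is most similar to the stem pair's."""
    sims = [cosine(stem_vec, v) for v in choice_vecs]
    return max(range(len(sims)), key=sims.__getitem__)

# Hypothetical pattern-frequency vectors for the stem pair mason:stone
# and five candidate pairs (values invented for illustration).
stem = [12, 0, 5, 3]
choices = [
    [1, 9, 0, 0],   # e.g. teacher:chalk
    [10, 1, 6, 2],  # e.g. carpenter:wood -- closest in direction to the stem
    [0, 0, 1, 8],
    [2, 3, 2, 2],
    [0, 5, 0, 1],
]
print(best_choice(stem, choices))  # prints 1
```

The same comparison drives the supervised noun-modifier classifier: the nearest-neighbour step assigns to a test pair the semantic-relation label of the training pair with the highest cosine similarity.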
