DOI: 10.1145/1458082.1458318
Poster

A novel statistical Chinese language model and its application in pinyin-to-character conversion

Published: 26 October 2008

ABSTRACT

In this paper, we present a novel Chinese language model and study its applications, in particular Chinese pinyin-to-character conversion. In the new model, each word is associated with a supporting context constructed by mining the frequent sets of nearby phrases and their distances to the word. Such information is usually overlooked in previous n-gram models and their variants. We apply the model to Chinese pinyin-to-character conversion and find that it offers a better solution to Chinese input. In our evaluation, the model has lower perplexity and higher prediction accuracy than the state-of-the-art n-gram Markov model for Chinese.
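The abstract only summarizes the model, so the following Python sketch is an illustrative approximation rather than the paper's method: it uses simple co-occurrence counting with a support threshold as a stand-in for frequent-set mining, and interpolates the mined supporting-context evidence with a baseline n-gram score when ranking pinyin-to-character candidates. All names and parameters (WINDOW, MIN_SUPPORT, ngram_logprob, lam) are assumptions introduced here for illustration.

```python
from collections import Counter, defaultdict

# Hypothetical sketch only -- not the authors' implementation. It mines, for each
# word, the nearby words that frequently co-occur with it, together with their
# signed distances, and treats those (neighbor, distance) pairs as the word's
# "supporting context" when ranking pinyin-to-character candidates.

WINDOW = 4        # context radius around the target word (assumed value)
MIN_SUPPORT = 2   # minimum co-occurrence count to keep a feature (assumed value)

def mine_supporting_context(corpus):
    """corpus: iterable of tokenized sentences (lists of words)."""
    counts = defaultdict(Counter)
    for sent in corpus:
        for i, word in enumerate(sent):
            lo, hi = max(0, i - WINDOW), min(len(sent), i + WINDOW + 1)
            for j in range(lo, hi):
                if j != i:
                    counts[word][(sent[j], j - i)] += 1  # neighbor word and signed distance
    # keep only the frequent (neighbor, distance) pairs for each word
    return {w: {feat: c for feat, c in ctr.items() if c >= MIN_SUPPORT}
            for w, ctr in counts.items()}

def context_score(sentence, supports):
    """Number of mined (neighbor, distance) features matched by a candidate sentence."""
    score = 0
    for i, word in enumerate(sentence):
        feats = supports.get(word, {})
        score += sum(1 for j, other in enumerate(sentence)
                     if j != i and (other, j - i) in feats)
    return score

def rank_candidates(candidates, ngram_logprob, supports, lam=0.5):
    """Pick the candidate with the best interpolated score.

    ngram_logprob is an externally supplied baseline scorer (assumed to exist);
    lam is an arbitrary interpolation weight chosen for this illustration.
    """
    return max(candidates,
               key=lambda s: ngram_logprob(s) + lam * context_score(s, supports))
```

For example, given two character-sequence candidates for the same pinyin input, the one whose words appear alongside their mined supporting contexts at the expected distances receives the higher context score, nudging the ranking with evidence that a fixed n-gram window alone would miss.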


Published in

CIKM '08: Proceedings of the 17th ACM Conference on Information and Knowledge Management
October 2008 · 1562 pages
ISBN: 9781595939913
DOI: 10.1145/1458082

        Copyright © 2008 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

        Publisher

        Association for Computing Machinery

        New York, NY, United States


        Acceptance Rates

Overall Acceptance Rate: 1,861 of 8,427 submissions, 22%
