MorphoGen: Full Inflection Generation Using Recurrent Neural Networks

  • Conference paper
  • First Online:
Computational Linguistics and Intelligent Text Processing (CICLing 2019)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13452)

Abstract

Sub-word level alternations during inflection (apophonies) are a common linguistic phenomenon in morphologically rich languages such as Romanian. Inflection learning, i.e., predicting the inflection class of a partially regular or fully irregular verb or noun in such a language, has been a widely studied task in NLP, but generative models have been limited to capturing the most common ending patterns and apophonies. In this paper, we show how to train a character-level Recurrent Neural Network language model to accurately generate the full inflection of verbs in Romanian, Finnish, and Spanish, and to model stem-level phonological alternations triggered by inflection in an unsupervised way. We also introduce a method to evaluate the accuracy of the generated inflections.
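
The core idea of the abstract, a character-level recurrent language model trained on whole inflection entries, can be sketched roughly as below. This is an illustrative PyTorch reconstruction, not the authors' implementation; the entry format "lemma>form1|form2|...", the layer sizes, and the greedy decoder are assumptions made for the example.

    import torch
    import torch.nn as nn

    class CharLM(nn.Module):
        """Character-level LSTM language model over full inflection entries."""

        def __init__(self, vocab_size, emb_dim=64, hidden_dim=256, num_layers=2):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.lstm = nn.LSTM(emb_dim, hidden_dim, num_layers, batch_first=True)
            self.out = nn.Linear(hidden_dim, vocab_size)

        def forward(self, x, state=None):
            emb = self.embed(x)                # (batch, seq, emb_dim)
            h, state = self.lstm(emb, state)   # (batch, seq, hidden_dim)
            return self.out(h), state          # next-character logits per position

    def generate(model, prefix_ids, itos, end_id, max_len=80):
        """Greedily complete an assumed 'lemma>' prefix until an end-of-entry symbol."""
        model.eval()
        ids = list(prefix_ids)
        with torch.no_grad():
            logits, state = model(torch.tensor([ids]))
            for _ in range(max_len):
                next_id = int(logits[0, -1].argmax())
                if next_id == end_id:
                    break
                ids.append(next_id)
                logits, state = model(torch.tensor([[next_id]]), state)
        return "".join(itos[i] for i in ids)

Under these assumptions, training reduces to standard next-character cross-entropy over such entries, and generating a full inflection table amounts to feeding the model a lemma plus separator and letting it complete the rest.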

Notes

  1. aiweirdness.com has used it extensively.

  2. Note that this skip-connects (1) the embedding layer and (2) the first LSTM layer to the attention layer, which helps alleviate vanishing gradients when the model is learning the weights of these first two layers; a rough sketch of this wiring is given below.
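
As an illustration of the wiring described in note 2, the following PyTorch sketch concatenates the embedding output and both LSTM outputs before a simple attention-weighted average. The layer sizes and this particular attention pooling are assumptions for the example, not the paper's exact architecture.

    import torch
    import torch.nn as nn

    class SkipAttentionEncoder(nn.Module):
        """Two stacked LSTMs whose inputs and outputs all feed a shared attention layer."""

        def __init__(self, vocab_size, emb_dim=64, hidden_dim=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.lstm1 = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
            self.lstm2 = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
            # One scalar attention score per time step over the concatenated features.
            self.att_score = nn.Linear(emb_dim + 2 * hidden_dim, 1)

        def forward(self, x):
            emb = self.embed(x)       # (batch, seq, emb_dim)
            h1, _ = self.lstm1(emb)   # (batch, seq, hidden_dim)
            h2, _ = self.lstm2(h1)    # (batch, seq, hidden_dim)
            # Skip connections: the attention layer sees the embeddings and the
            # first LSTM's outputs directly, so gradients can reach those lower
            # layers without passing through the full LSTM stack.
            feats = torch.cat([emb, h1, h2], dim=-1)
            weights = torch.softmax(self.att_score(feats).squeeze(-1), dim=1)
            return (feats * weights.unsqueeze(-1)).sum(dim=1)  # attention-weighted average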

Acknowledgement

SY would like to thank Noisebridge hackerspace in San Francisco for use of their computing facilities.

Author information

Corresponding author

Correspondence to Octavia-Maria Şulea.

Copyright information

© 2023 Springer Nature Switzerland AG

About this paper

Cite this paper

Şulea, OM., Young, S., Dinu, L.P. (2023). MorphoGen: Full Inflection Generation Using Recurrent Neural Networks. In: Gelbukh, A. (eds) Computational Linguistics and Intelligent Text Processing. CICLing 2019. Lecture Notes in Computer Science, vol 13452. Springer, Cham. https://doi.org/10.1007/978-3-031-24340-0_39

  • DOI: https://doi.org/10.1007/978-3-031-24340-0_39

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-24339-4

  • Online ISBN: 978-3-031-24340-0

  • eBook Packages: Computer Science, Computer Science (R0)
