Citations
Ashton, N. (2013, October 21). Machine readability. Know Your Data Formats, School of Data.
Xiang, W., et al. (2024). Parsing and encoding interactive phrase structure for implicit discourse relation recognition. Neural Computing and Applications, 36, 13783-13797.
Holtzman, A., Buys, J., Du, L., Forbes, M., & Choi, Y. (2020). The curious case of neural text degeneration. In International Conference on Learning Representations.
Fan, A., Lewis, M., & Dauphin, Y. (2018). Hierarchical neural story generation. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (pp. 889-898). Association for Computational Linguistics.
Welleck, S., Kulikov, I., Kim, J., Pang, R. Y., & Cho, K. (2020). Consistency of a recurrent language model with respect to incomplete decoding. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (pp. 5553-5568).
Meister, C., & Cotterell, R. (2021). Language model evaluation beyond perplexity. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics (pp. 5328-5339).
Zhang, H., Xu, J., & Wang, J. (2019). Pretraining-based natural language generation for text summarization. In Proceedings of the 23rd Conference on Computational Natural Language Learning (pp. 789-797).
Vijayakumar, A. K., Cogswell, M., Selvaraju, R. R., Sun, Q., Lee, S., Crandall, D., & Batra, D. (2018). Diverse beam search: Decoding diverse solutions from neural sequence models. In Proceedings of the AAAI Conference on Artificial Intelligence.
Keskar, N. S., McCann, B., Varshney, L. R., Xiong, C., & Socher, R. (2019). CTRL: A conditional transformer language model for controllable generation. arXiv preprint arXiv:1909.05858.
Merity, S. (2019). Single headed attention RNN: Stop thinking with your head. arXiv preprint arXiv:1911.11423.
Dathathri, S., Madotto, A., Lan, J., Hung, J., Frank, E., Molino, P., Yosinski, J., & Liu, R. (2020). Plug and play language models: A simple approach to controlled text generation. In International Conference on Learning Representations.
Hugging Face. (2024). Text generation strategies. Hugging Face Documentation.
Wolf, T., Chaumond, J., Debut, L., Sanh, V., Delangue, C., Moi, A., & Rush, A. M. (2020). Transformers: State-of-the-art natural language processing. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations (pp. 38-45).