SopakcoSauce Docs

Citations


Last updated 3 months ago

  1. .trappu's PList guide

  2. absolutetrash's bot creation guide

  3. Technical Analysis of Trait Expression Patterns in Large Language Models

  4. Ashton, N. (2013, October 21). Machine readability. School of Data. https://schoolofdata.org/2013/10/21/know-your-data-formats/

  5. Xiang, W., et al. (2024). Parsing and encoding interactive phrase structure for implicit discourse relation recognition. Neural Computing and Applications, 36, 13783-13797. https://doi.org/10.1007/s00521-024-09709-8

  6. Holtzman, A., Buys, J., Du, L., Forbes, M., & Choi, Y. (2020). The curious case of neural text degeneration. In International Conference on Learning Representations. https://doi.org/10.48550/arXiv.1904.09751

  7. Fan, A., Lewis, M., & Dauphin, Y. (2018). Hierarchical neural story generation. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (pp. 889-898). Association for Computational Linguistics. https://aclanthology.org/P18-1082/

  8. Welleck, S., Kulikov, I., Kim, J., Pang, R. Y., & Cho, K. (2020). Consistency of a recurrent language model with respect to incomplete decoding. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (pp. 5553-5568). https://doi.org/10.18653/v1/2020.emnlp-main.448

  9. Meister, C., & Cotterell, R. (2021). Language model evaluation beyond perplexity. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics (pp. 5328-5339). https://doi.org/10.18653/v1/2021.acl-long.414

  10. Zhang, H., Xu, J., & Wang, J. (2019). Pretraining-based natural language generation for text summarization. In Proceedings of the 23rd Conference on Computational Natural Language Learning (pp. 789-797). https://doi.org/10.48550/arXiv.1902.09243

  11. Vijayakumar, A. K., Cogswell, M., Selvaraju, R. R., Sun, Q., Lee, S., Crandall, D., & Batra, D. (2018). Diverse beam search: Decoding diverse solutions from neural sequence models. In Proceedings of the AAAI Conference on Artificial Intelligence. https://doi.org/10.48550/arXiv.1610.02424

  12. Keskar, N. S., McCann, B., Varshney, L. R., Xiong, C., & Socher, R. (2019). CTRL: A conditional transformer language model for controllable generation. arXiv preprint arXiv:1909.05858. https://doi.org/10.48550/arXiv.1909.05858

  13. Merity, S. (2019). Single headed attention RNN: Stop thinking with your head. arXiv preprint arXiv:1911.11423.

  14. Dathathri, S., Madotto, A., Lan, J., Hung, J., Frank, E., Molino, P., Yosinski, J., & Liu, R. (2020). Plug and play language models: A simple approach to controlled text generation. In International Conference on Learning Representations. https://doi.org/10.48550/arXiv.1912.02164

  15. Hugging Face. (2024). Text generation strategies. Hugging Face Documentation. https://huggingface.co/docs/transformers/main/en/generation_strategies

  16. Wolf, T., Chaumond, J., Debut, L., Sanh, V., Delangue, C., Moi, A., & Rush, A. M. (2020). Transformers: State-of-the-art natural language processing. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations (pp. 38-45). https://doi.org/10.18653/v1/2020.emnlp-demos.6