We show that soft-prompt based conditional text generation can be improved with simple and efficient methods that simulate modeling the discourse structure of human-written text.
The paper proposes improving soft-prompt based text generation with discourse-aware methods, in particular by applying hierarchical blocking to the prefix parameters of prefix-tuning.
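To make the idea of hierarchical blocking concrete, here is a minimal sketch of one plausible reading: the learned prefix is partitioned into per-segment blocks, and a mask restricts each input token to the prefix block of its own discourse segment. The class name `BlockedPrefix`, the mask construction, and the segment-id interface are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class BlockedPrefix(nn.Module):
    """Prefix-tuning parameters split into per-discourse-block groups (a sketch)."""
    def __init__(self, n_blocks: int, block_len: int, d_model: int):
        super().__init__()
        self.n_blocks, self.block_len = n_blocks, block_len
        # One learned key/value prefix per block (e.g., per input section).
        self.prefix_k = nn.Parameter(0.02 * torch.randn(n_blocks, block_len, d_model))
        self.prefix_v = nn.Parameter(0.02 * torch.randn(n_blocks, block_len, d_model))

    def flat_prefix(self):
        # Concatenate blocks into the usual (n_prefix, d_model) prefix shape.
        return self.prefix_k.flatten(0, 1), self.prefix_v.flatten(0, 1)

    def block_mask(self, seg_ids: torch.Tensor) -> torch.Tensor:
        """seg_ids: (seq_len,) discourse-segment id of each input token.
        Returns a (seq_len, n_blocks * block_len) bool mask: token i may
        attend to prefix position j only if j lies in token i's block."""
        block_of_prefix = torch.arange(
            self.n_blocks, device=seg_ids.device
        ).repeat_interleave(self.block_len)
        return seg_ids.unsqueeze(1) == block_of_prefix.unsqueeze(0)

# Toy usage: 3 discourse segments, 5 prefix vectors each, 8 input tokens.
prefix = BlockedPrefix(n_blocks=3, block_len=5, d_model=16)
seg_ids = torch.tensor([0, 0, 0, 1, 1, 1, 2, 2])  # e.g., sentence ids
mask = prefix.block_mask(seg_ids)                  # shape (8, 15)
```

The mask would then be combined with the model's attention mask so that tokens from each input segment see only their segment's slice of the prefix, which is one way to simulate discourse structure in the prompt.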
For models built on the Transformer architecture, attention mechanisms are the main focus of research interest, which makes them the natural place to impose such structure.
This work shows that a structured design of prefix parameters can yield more coherent, faithful, and relevant generations than baseline prefix-tuning on all evaluated generation tasks.
Results compare soft attention (SoftSA) with hierarchical truncated (HTruncSA) and hierarchical soft (HSoftSA) attention variants for prefix-tuning; each model is repeated 3 times.
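As a hedged illustration of the truncated-attention variants named above, the sketch below shows what truncation over prefix positions could look like, assuming "truncated" means zeroing out attention probabilities below a threshold and renormalizing. The function name, the threshold `tau`, and where the operation is applied are assumptions, not the paper's exact recipe; the snippet does not specify the precise form of the soft (SoftSA) variant.

```python
import torch

def truncated_attention(scores: torch.Tensor, tau: float = 0.01) -> torch.Tensor:
    """scores: (..., n_queries, n_keys) raw attention logits.
    Drops attention probabilities below tau and renormalizes the rest,
    concentrating each token's attention on a few prefix positions."""
    probs = torch.softmax(scores, dim=-1)
    probs = torch.where(probs >= tau, probs, torch.zeros_like(probs))
    return probs / probs.sum(dim=-1, keepdim=True).clamp_min(1e-9)
```

The hierarchical variants would combine such sparsification with the per-block prefix masking sketched earlier.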
Ghazvininejad, M., Karpukhin, V., Gor, V., and Celikyilmaz, A. Discourse-aware soft prompting for text generation. arXiv preprint arXiv:2112.05717, 2021.