Bibliographic details
Language | English |
---|---|
Authors | David Selby, Kai Spriestersbach, Yuichiro Iwashita, Dennis Bappert, Archana Warrier, Sumantrak Mukherjee, Muhammad Nabeel Asim, Koichi Kise, Sebastian Vollmer |
Title | Quantitative knowledge retrieval from large language models |
Venue | arXiv (preprint) |
arXiv ID | arXiv:2402.07770v1 [cs.IR] |
Pages | 21 |
Peer reviewed | No |
Date | February 2024 |
Abstract | Large language models (LLMs) have been extensively studied for their abilities to generate convincing natural language sequences, however their utility for quantitative information retrieval is less well understood. In this paper we explore the feasibility of LLMs as a mechanism for quantitative knowledge retrieval to aid data analysis tasks such as elicitation of prior distributions for Bayesian models and imputation of missing data. We present a prompt engineering framework, treating an LLM as an interface to a latent space of scientific literature, comparing responses in different contexts and domains against more established approaches. Implications and challenges of using LLMs as 'experts' are discussed. |
URL | https://arxiv.org/abs/2402.07770 |
- BibTeX entry
@Article{Selby2024,
  author        = {David Selby and Kai Spriestersbach and Yuichiro Iwashita and Dennis Bappert and Archana Warrier and Sumantrak Mukherjee and Muhammad Nabeel Asim and Koichi Kise and Sebastian Vollmer},
  title         = {Quantitative knowledge retrieval from large language models},
  journal       = {arXiv preprint},
  year          = {2024},
  month         = feb,
  eprint        = {2402.07770v1},
  archiveprefix = {arXiv},
  primaryclass  = {cs.IR},
  numpages      = {21},
  url           = {https://arxiv.org/abs/2402.07770}
}