Critical Understanding of LLM-Generated Statements
Abstract
Now that much of the text we encounter in online interactions appears to be generated by LLMs, it becomes critical to understand the nature of the statements these models produce. Technology has long been sold to humans on the promise that it is foolproof and will make life easier. LLMs produce text by predicting the next token or sequence based on probabilities derived from their training data. A question then arises: do they generate a ‘probability statement’ or the ‘probability of a statement’? The difference between the two may seem elusive, but it becomes clear on closer examination. This paper aims to bring that difference forward to its audience, who, in turn, can better understand the capabilities of the machine they are using and adopt a sounder framework for judging and using the responses generated by LLM models in their applications.
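The abstract's claim that LLMs predict the next token from probabilities can be illustrated with a minimal sketch. The vocabulary, prompt, and logit values below are hypothetical stand-ins for a real model's output, chosen only to show the mechanism: raw scores are turned into a probability distribution by a softmax, and the next token is sampled from that distribution.

```python
import math
import random

def softmax(logits):
    # Convert raw model scores into a probability distribution.
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical toy vocabulary and scores a model might assign
# as continuations of the prompt "The sky is".
vocab = ["blue", "green", "falling", "high"]
logits = [4.0, 1.0, 0.5, 2.0]

probs = softmax(logits)
random.seed(0)  # fixed seed so the sample is reproducible
next_token = random.choices(vocab, weights=probs, k=1)[0]
print(dict(zip(vocab, [round(p, 3) for p in probs])), next_token)
```

The key point for the paper's distinction: the model outputs a distribution over tokens (a probability *of* each statement fragment), and sampling from it yields fluent text that is not itself a claim the model has verified.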

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.