One of the most interesting features of LLMs is their ability to “expand” an idea. More advanced users of ChatGPT might have explored this concept by using the “canvas” mode, which not only allows for expansion but also lets you manipulate the reading difficulty of the text.

This particular ability of the models, while seemingly inconsequential, can dilute the content of the idea. To understand this better, we need to dive a bit deeper into the meaning of language itself.

Language, in general, contains units of meaning. Here, we’re not referring to grammatical units but rather to the fact that language itself conveys meaning. Interestingly, different languages have varying “densities,” essentially requiring fewer or more words to express the same idea.

You may notice this if you speak several languages: sometimes you read “slower” or feel more tired when reading in one language compared to another. For instance, Norwegian Bokmål is more compressed than my native Spanish, and I often notice that reading literature in Spanish feels faster. While my years of experience with each language might also be a factor, a simple conclusion is that Norwegian is “denser,” which demands more focused attention.
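The idea of density can be made concrete with a toy sketch. The snippet below simply counts the words needed to express the same sentence in two languages; the example sentences are my own illustrative picks (roughly "the dog sleeps in the garden"), not data from any corpus, and whitespace word-counting is of course a very crude proxy for semantic density.

```python
# Toy illustration: the same idea can need a different number of words
# in different languages. Sentences are illustrative examples only.
def word_count(sentence: str) -> int:
    """Count whitespace-separated words."""
    return len(sentence.split())

sentences = {
    "Spanish": "El perro duerme en el jardín.",
    "Norwegian": "Hunden sover i hagen.",
}

for language, sentence in sentences.items():
    print(f"{language}: {word_count(sentence)} words")
```

Real measures of linguistic density (e.g., information rate per syllable) are far subtler than this, but the point stands: word count alone is not meaning.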

Returning to LLMs: generative tools that aim to expand what you’re trying to say often do so by adding the associations typically used around an idea at a given perceived education level (e.g., elementary or university). This approach, while not necessarily wrong, is akin to filling a loaf of bread with air.

The more the model expands a paragraph, the more “air” is introduced into the idea. However, the core content remains the same. At the extreme, this can lead to a loss of meaning—either by adding so many words that they become meaningless or by condensing too much and leaving critical ideas out of the paragraph altogether.

This is why, as an educator, I tend to oppose “word limits.” While some students may struggle with formulating their thoughts and need a little push to either expand or condense their answers, ideas should conclude when the point has been made. Forcing a word limit on individuals can shift their focus to the “air content” of the text rather than on what they truly want to express.

Tools that alter the fundamental units of language risk undermining the value of what you intend to say. You may end up adding a lot of words without really adding anything of substance—like a conversation that circles endlessly without offering any benefit to anyone.

As always, it ultimately comes down to the users. They must ensure that the content they produce contains real substance. The fact that an idea is short has little to do with its actual value, and in some cases, it may be best left concise.

Share what you think!
