The question of whether videogames should attempt to tell stories was all the rage in game studies in the late 90s and early 2000s. You’re less likely to encounter the issue in academia today (unless said academics are writing think pieces at The Atlantic). But it is still very much an ongoing debate in game development: it still isn’t difficult to find opinionated developers launching screeds against linearity, against the single-player campaign, and against games’ subservience to the logic of cinematic storytelling.
As is so often the case in such conversations, there is a temptation to jump directly to a categorical verdict, leaping over qualitative assessment entirely. The categorical question “should games tell stories?” is a good way to start a rousing bar fight of a debate. Alternatively, the qualitative question “do games, as we know them, have a history of telling stories well?” will most likely lead to the reasoned response, “no.” This, in turn, may lead to further avenues of polite and potentially incisive inquiry, such as “why do you suppose that is?” and “are there any ways that we could chart new types of storytelling that might be more compatible with games’ basic features?”
I’m going to take the polite and careful qualitative route, not really because I prefer it (I enjoy a rousing debate as much as anyone else), but because I actually think it’s necessary to set the groundwork before making any larger claims.