Of course it can't. That's why there's inherent video latency in the technology. The "future frame" is actually the current frame, the "current" frame is actually the previous frame, and so on. Not that that matters to anything but input, anyway.
It does matter, apparently. When the FOV is moving quickly (like when controlling an FPS game with a mouse), motion blur causes a "streaking" effect due to that latency and the fact that the game can't actually predict what the next set of user input is going to look like.

I could see this effect being lessened quite a bit when the differences in the rendered scene from one frame to the next aren't as large, though.
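To make the latency point concrete, here's a minimal sketch (made-up names, not any particular engine's API) of blur done by blending the current frame with the one before it. Since the newest data available is already a frame old, a fast mouse flick shows up as a streak trailing behind where the camera actually points, while two nearly identical consecutive frames make the same blend almost unnoticeable.

```cpp
// Illustrative sketch only: blur built from previously rendered data.
#include <cstddef>
#include <cstdint>
#include <vector>

struct Frame {
    std::vector<uint8_t> pixels; // RGBA, width * height * 4
};

Frame blendFrames(const Frame& current, const Frame& previous, float blurAmount) {
    Frame out;
    out.pixels.resize(current.pixels.size());
    for (std::size_t i = 0; i < current.pixels.size(); ++i) {
        // Weighted average of the current and previous frame: the blur is made
        // entirely from data that already happened, never from the next frame,
        // so it lags behind fast camera motion.
        out.pixels[i] = static_cast<uint8_t>(
            current.pixels[i] * (1.0f - blurAmount) + previous.pixels[i] * blurAmount);
    }
    return out;
}
```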
For video memory, maybe a bit. It also causes a few annoying side effects for graphics programmers to deal with. But other than that, I wouldn't call it "expensive" at all.
Run a game on a rig that's on the fringe of the game's system requirements with triple buffering on and off and then tell me it's not expensive. It's not going to halve your framerate or anything like that, but it does cost you a good chunk of resources. Yes, it's more of a video memory (and memory bandwidth) hit than anything, but that matters too. If it can potentially make a noticeable difference in performance, then I consider it "expensive." Of course, depending on the situation it can be less expensive than traditional vsync methods. That's not terribly surprising, though, since older methods of achieving vertical sync with a display usually amounted to dropping frames fairly arbitrarily.
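Just to put rough numbers on the memory side (back-of-the-envelope only, assuming 32-bit color and no AA, depth, or HDR buffers):

```cpp
#include <cstdio>

int main() {
    // Illustrative numbers: 1920x1080 at 4 bytes per pixel.
    const int width = 1920, height = 1080;
    const int bytesPerPixel = 4;
    const double mib = 1024.0 * 1024.0;

    double perBuffer = width * height * bytesPerPixel / mib;    // ~7.9 MiB
    printf("One color buffer:  %.1f MiB\n", perBuffer);
    printf("Double buffering:  %.1f MiB\n", perBuffer * 2.0);   // front + back
    printf("Triple buffering:  %.1f MiB\n", perBuffer * 3.0);   // one extra back buffer
    return 0;
}
```

An extra ~8 MiB of color buffer sounds trivial on paper, but on a card that's already tight on VRAM and bandwidth, writing and swapping that extra buffer every frame isn't free, which is the point.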
Don't get me wrong...triple buffering is awesome. If you're struggling to get framerates that seem smooth, however, it's probably not going to improve the situation, and neither is piling motion blur calculations on top of it.
I do. No need to get defensive...I'm not arguing against motion blur or telling you that you're wrong for liking it. I'm just saying that it doesn't compensate for lower framerates in the way that natural film motion blur does.
And there are other reasons that do apply. Computer graphics and film technology are more interrelated than they have ever been before. It's not valid to say that you can't make any comparisons between them, especially in something as fundamental as framerate and the impression of motion.
I disagree. Film has the advantage of not needing to be rendered in real-time. This makes a huge difference in considerations regarding framerate and impression of motion.

The OP is referring to the "conventional wisdom" often repeated around gaming forums that higher framerates are better. I was just pointing out that there are legitimate reasons why a direct comparison of framerates in games and film is full of caveats.
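One of those caveats, with rough numbers (standard 24 fps film and the common 180-degree shutter are assumptions here): a film frame already contains real accumulated motion from its exposure, while a game frame is a point sample of the scene unless blur is synthesized after the fact.

```cpp
#include <cstdio>

int main() {
    // Back-of-the-envelope: film at 24 fps with a 180-degree shutter exposes
    // each frame for half the frame interval, so every frame captures roughly
    // 1/48 s of real motion. A rendered game frame is an instantaneous sample.
    const double filmFps = 24.0;
    const double shutterAngleDeg = 180.0;

    double frameInterval = 1.0 / filmFps;                        // ~41.7 ms
    double exposure = frameInterval * (shutterAngleDeg / 360.0); // ~20.8 ms

    printf("Film frame interval: %.1f ms\n", frameInterval * 1000.0);
    printf("Film exposure time:  %.1f ms of real motion per frame\n", exposure * 1000.0);
    printf("Game frame:          ~0 ms (point sample of the scene state)\n");
    return 0;
}
```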