When connections were slower, it made sense to load HTML one page at a time. But today, all the links on a page could be loaded in the background as hidden iframes, so that when the user follows a link, the browser only has to display the data already contained in the corresponding iframe.
I don't know why links aren't loaded recursively this way, up to a certain depth. Loading the next page would then be as quick as pressing the history back button.
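As a rough sketch of the idea (this is not an existing browser feature, and the selectors and handling here are only illustrative):

```html
<!-- Sketch: prefetch every linked page into a hidden iframe.
     The browser fetches each page in the background, so the
     content is already local when the user clicks the link. -->
<script>
  window.addEventListener("load", function () {
    document.querySelectorAll("a[href]").forEach(function (link) {
      var frame = document.createElement("iframe");
      frame.src = link.href;          // background fetch of the linked page
      frame.style.display = "none";   // keep the iframe invisible
      document.body.appendChild(frame);
    });
  });
</script>
```

One obvious cost of this approach is bandwidth: every linked page is downloaded whether or not the user ever visits it, which is exactly why some indication of page weight would matter.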
I also don't know why the W3C consortium does not develop tags to specify how heavy a page's traffic will be, and which links will load more quickly.
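For what it's worth, HTML does already define a related hint: `<link rel="prefetch">` lets a page tell the browser which resource is likely to be needed next, so the browser can fetch it during idle time (the URL below is just a placeholder):

```html
<link rel="prefetch" href="/next-page.html">
```

This only hints at a single likely navigation rather than declaring the weight of every link, so it covers part of the suggestion but not the traffic-weight tags.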

Regards, David.