You are correct that, in theory, you could get away with rebuilding only part of the parity in such a case. The problem is identifying programmatically, in a foolproof way, when these use cases occur, and changing the parity build process to take them into account. I think the strategy is therefore simply to play safe.
A 'hack' might be to use the New Config tool, tick the 'parity is valid' checkbox, and then, immediately after starting the array, start a correcting parity check. That will correct parity errors in the first 1TB; once you get past that point and no more errors are being corrected, you can abandon the check. If you try this it is at your own risk, and I am not sure whether it would actually be faster.
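If you did want to script the "abandon once past the old parity region" part rather than watch the GUI, something like the sketch below could decide when it is safe to cancel. It assumes the check position can be read from Unraid's `mdcmd status` output as an `mdResyncPos=` line, and that the value is in 1024-byte blocks; treat both the variable name and the units as assumptions to verify on your own system before relying on this.

```python
# Sketch: decide when a correcting parity check is safely past the old
# 1TB parity region. The mdResyncPos variable name and its assumed
# units (1024-byte blocks) come from Unraid's `mdcmd status` output --
# verify both against your own system before trusting this.

OLD_PARITY_BYTES = 1 * 1000**4  # 1 TB, decimal, as drive sizes are marketed


def resync_pos_bytes(status_text: str) -> int:
    """Extract the current check position in bytes from `mdcmd status` text."""
    for line in status_text.splitlines():
        if line.startswith("mdResyncPos="):
            return int(line.split("=", 1)[1]) * 1024  # assumed 1 KiB units
    raise ValueError("mdResyncPos not found in status output")


def safe_to_cancel(status_text: str, margin_bytes: int = 10 * 1024**3) -> bool:
    """True once the check is comfortably (10 GiB margin) past the old region."""
    return resync_pos_bytes(status_text) > OLD_PARITY_BYTES + margin_bytes


# Example status snippet, roughly 1.1 TB into the check:
sample = "mdState=STARTED\nmdResyncPos=1074790400\n"
print(safe_to_cancel(sample))  # True: past 1 TB plus margin, safe to abandon
```

The margin is there because you would want to see a stretch with no further corrections before cancelling, not stop the instant the counter crosses 1TB.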