Suppose that, for TCP's TimeOut interval estimation, EstimatedRTT is 4.0 at some point and all subsequent measured RTTs are 1.0. If the initial value of Deviation is 0.75, how long does it take before the TimeOut value falls below 4.0? Use α = 0.875 and β = 0.125.
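
One way to work this out is to simulate the update rules sample by sample. The sketch below assumes the standard Jacobson/Karels form of the updates, EstimatedRTT ← α·EstimatedRTT + β·SampleRTT and Deviation ← α·Deviation + β·|SampleRTT − EstimatedRTT|, together with TimeOut = EstimatedRTT + 4·Deviation; the factor of 4 on Deviation is the usual textbook choice and is an assumption here, since the problem statement does not give the TimeOut formula explicitly.

```python
# Sketch: iterate the Jacobson/Karels-style estimators until TimeOut < 4.0.
# Assumptions (not stated in the problem): TimeOut = EstimatedRTT + 4 * Deviation,
# and Deviation is updated using the *old* EstimatedRTT when forming the difference.

alpha = 0.875          # weight on the previous estimate
beta = 0.125           # weight on the new sample (1 - alpha)

estimated_rtt = 4.0    # starting EstimatedRTT given in the problem
deviation = 0.75       # starting Deviation given in the problem
sample_rtt = 1.0       # every subsequent measured RTT

samples = 0
timeout = estimated_rtt + 4 * deviation
while timeout >= 4.0:
    samples += 1
    diff = sample_rtt - estimated_rtt                       # use old EstimatedRTT
    estimated_rtt = alpha * estimated_rtt + beta * sample_rtt
    deviation = alpha * deviation + beta * abs(diff)
    timeout = estimated_rtt + 4 * deviation                 # assumed TimeOut formula
    print(f"sample {samples:2d}: EstimatedRTT={estimated_rtt:.4f}  "
          f"Deviation={deviation:.4f}  TimeOut={timeout:.4f}")

print(f"TimeOut first drops below 4.0 after {samples} samples")
```

Note that TimeOut first rises above its initial value of 7.0, because the large |SampleRTT − EstimatedRTT| differences inflate Deviation faster than EstimatedRTT decays, before both terms eventually shrink toward 1.0 and 0.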