1. A fiber-optic line 4500 km long runs from New York City to San Francisco. Assuming a propagation velocity of 3×10^8 m/s and a bit rate of 10^9 b/s (i.e., 1 Gbps), what window size would be needed to achieve a line utilization above 98% (assume 1500-bit packets)? If a fraction P of frames were received in error (0 < P < 1), how would you estimate the change in throughput?
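The calculation behind this question can be sketched as follows. This is a minimal sketch, assuming the standard sliding-window utilization formula U = N·Tf / (Tf + 2·Tp), where Tf is the frame transmission time, Tp the one-way propagation delay, and N the window size in frames; the (1 − P) throughput scaling and the example value of P are illustrative assumptions, not part of the problem statement.

```python
import math

# Given quantities from the problem statement.
DISTANCE_M = 4_500e3   # 4500 km, NYC -> SF
PROP_SPEED = 3e8       # propagation velocity, m/s (as given)
BIT_RATE   = 1e9       # 1 Gbps
FRAME_BITS = 1500      # bits per packet

Tf = FRAME_BITS / BIT_RATE    # frame transmission time (s)
Tp = DISTANCE_M / PROP_SPEED  # one-way propagation delay (s)

# Smallest window N such that N * Tf / (Tf + 2 * Tp) exceeds 0.98.
N = math.ceil(0.98 * (Tf + 2 * Tp) / Tf)
print(f"Tf = {Tf:.2e} s, Tp = {Tp:.3f} s, window N = {N} frames")

# Rough first-order estimate for the error case: each errored frame
# must be retransmitted, so useful throughput scales by about (1 - P).
P = 0.01  # example error probability (assumption, not from the problem)
print(f"approx. throughput at P = {P}: {(1 - P) * 100:.1f}% of the error-free rate")
```

With the given numbers, Tf = 1.5 µs and the round-trip delay 2·Tp = 30 ms dominate, so the window must cover roughly 0.98 × (0.0300015 / 0.0000015) ≈ 19601 outstanding frames.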