Klueless
Very Senior Member
Came up under comments elsewhere and got me to wondering ...
At one of my part time jobs we have 15 employees, 11 PCs and another dozen smart phones, tablets, etc. We also view security camera footage through the Internet. All in all pretty busy for a 15 x 1.5 Mbps Internet Service. Daily utilization was 15 to 20 GB.
Last month we upgraded the Internet to 400 x 20 Mbps. Daily utilization dropped. It is now 10 to 15 GB per day. (And here I was thinking better performance would motivate more utilization.)
I always figured there was some overhead (errors, dropped packets, retransmits, etc.) from users competing for a limited resource, but I never dreamed it would be anything like 25%.
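A quick back-of-envelope check of the figures above (this is just arithmetic on the posted ranges, not a measurement of actual protocol overhead):

```python
# Daily utilization ranges from the post, in GB/day.
before_low, before_high = 15, 20  # on the 15 x 1.5 Mbps service
after_low, after_high = 10, 15    # on the 400 x 20 Mbps service

# Compare the midpoints of the two ranges.
before_mid = (before_low + before_high) / 2  # 17.5 GB
after_mid = (after_low + after_high) / 2     # 12.5 GB

drop_pct = 100 * (before_mid - after_mid) / before_mid
print(f"Apparent drop in daily utilization: {drop_pct:.1f}%")
```

Comparing range midpoints gives roughly a 29% drop; comparing the endpoints (20 → 15, or 15 → 10) gives 25% to 33%, so "anything like 25%" is consistent with the numbers, whatever the cause turns out to be.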
I'm left wondering whether my observations are anywhere near correct (and how we ever worked as well as we did all these years).
And, I'm also left wondering how some of this stuff really works.