Probably a very basic question, but it confused the hell out of me. Say I have 100 Mbit internet at home. Scenario one: a router with 100 Mbit port speed, and I connect two PCs to it, each with a 100 Mbit NIC. Is it true that, ignoring other factors, I should be able to get close to (if not exactly) a 100 Mbit connection on each of the PCs? On the other hand, scenario two: if I have an (unmanaged) switch and connect the PCs to the switch, would I only end up getting 50 Mbit on each of the PCs (i.e., the switch essentially “halves” my internet speed if I connect 2 PCs to it, divides it by 3 if I connect 3 PCs, etc.)?
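To make the arithmetic I'm asking about concrete, here's a rough sketch (the even split and all the numbers are just my assumptions, and I realise the split would only matter while the PCs are actually transferring at the same time):

```python
# Hypothetical sketch: per-PC throughput when PCs share one 100 Mbit internet
# uplink. All numbers are illustrative assumptions, not measurements.

def per_pc_throughput(internet_mbps: float, port_mbps: float,
                      nic_mbps: float, active_pcs: int) -> float:
    """Rough ceiling for one PC, ignoring protocol overhead and other factors."""
    # No single link in the path can go faster than its own rating.
    link_limit = min(internet_mbps, port_mbps, nic_mbps)
    # The internet uplink is divided only among PCs transferring simultaneously.
    shared = internet_mbps / max(active_pcs, 1)
    return min(link_limit, shared)

# Two PCs connected, but only one downloading: it can still reach ~100 Mbit/s.
print(per_pc_throughput(100, 100, 100, active_pcs=1))  # 100.0
# Both downloading flat out at the same time: roughly 50 Mbit/s each.
print(per_pc_throughput(100, 100, 100, active_pcs=2))  # 50.0
```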
Thanks for the detailed explanation - so if I understand this correctly, there is a port speed and there is an internal (switching/backplane) bandwidth: a port speed could be 100 Mbit, 1 Gbit, or 10 Gbit, for example, but the internal bandwidth should be much larger than that.
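If I've got that right, the internal figure is sized so every port can run flat out at once. A tiny sketch of that arithmetic, where the port counts and speeds are just example assumptions:

```python
# Rough sketch of why the internal (backplane/switching) bandwidth is much
# larger than any single port: a "non-blocking" switch fabric must carry
# every port at full line rate in both directions at the same time.

def non_blocking_fabric_gbps(num_ports: int, port_speed_gbps: float) -> float:
    # Full duplex: each port can send and receive at line rate simultaneously.
    return num_ports * port_speed_gbps * 2

print(non_blocking_fabric_gbps(8, 1))    # 16.0  -> e.g. an 8-port gigabit switch
print(non_blocking_fabric_gbps(24, 10))  # 480.0 -> e.g. a 24-port 10 Gbit switch
```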
My follow-up question is then: if I set up ISP modem -> router A and ISP modem -> switch -> router B (both connected to the same ISP modem but on different ports), with all my PCs/game consoles/smart TVs connecting to router B and all my IoT devices connecting to router A, then in terms of speed the devices connected to router B should, at least in theory, enjoy whatever bandwidth is not being used by the IoT devices on router A (which I assume would be minimal). And if I only have one PC turned on, and that is the only device connected to router B, then that PC should get close to the minimum of all the port speeds along the path and my internet speed? Is that correct?
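Here's a sketch of what I expect for the PC behind router B: its ceiling is the slowest link on the path, minus whatever the IoT devices on router A happen to be using at that moment. The device names and all numbers below are my assumptions:

```python
# Hypothetical sketch of the effective speed for one PC behind router B.
# Path: ISP modem port -> switch -> router B -> PC NIC.

def pc_ceiling_mbps(internet_mbps: float, path_port_speeds_mbps: list[float],
                    iot_usage_mbps: float) -> float:
    # The slowest hop along the path caps the PC's speed.
    slowest_link = min([internet_mbps] + path_port_speeds_mbps)
    # Internet bandwidth currently used by router A's IoT devices is unavailable.
    return min(slowest_link, internet_mbps - iot_usage_mbps)

# All gigabit ports, 100 Mbit internet, IoT devices idling at ~2 Mbit/s:
print(pc_ceiling_mbps(100, [1000, 1000, 1000, 1000], iot_usage_mbps=2))  # 98
```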