Please excuse my ignorance, but I would like to know:
- What is the relationship between the network cable's frequency and its bandwidth capacity? Is there a specific mathematical formula that relates these two variables?
- How does the cable's frequency affect the cable's capacity to deliver a certain level of bandwidth?
- What is the physics behind the relationship between frequency and bandwidth in network cable functionality?
For example, it is stated that a Category 5 UTP cable rated at 100 MHz can deliver up to 100 Mbit/s of bandwidth, while a Cat5e at 350 MHz can deliver up to a gigabit. (I am not sure if it can. Can it?)
There are several formulas that relate frequency to capacity. The cable categories themselves are defined in the TIA-568 and ISO/IEC 11801 cabling standards, while the Ethernet signaling that runs over them is defined in IEEE 802.3. In short, each cable can carry a certain amount of traffic based on its MHz rating. For instance, 10/100/1000 Mbit/s Ethernet can be carried on 100 MHz cable (Category 5e) or 250 MHz cable (Category 6), and 10 Gbit/s can be carried on 500 MHz cable (Category 6a), 600 MHz cable (Category 7), or 1000 MHz/1 GHz cable (Category 7a). As the signaling rate increases, the bandwidth of the cable must also increase.

Bear in mind that the rating of a cable does not guarantee that these signals will actually travel cleanly over the installed channel. If a channel is improperly installed and will not test to its stated category rating, it will cause faults in the network.
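On the physics side, the theoretical link between a channel's analog bandwidth (MHz) and its achievable bit rate (Mbit/s) is the Shannon-Hartley theorem: C = B · log2(1 + SNR). This is why a 100 MHz cable can carry far more than 100 Mbit/s: modulation schemes encode multiple bits per hertz when the signal-to-noise ratio is good enough. A minimal sketch (the 30 dB SNR figure below is purely illustrative, and real Ethernet additionally splits traffic across four wire pairs with specific line codes such as PAM-5):

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley upper bound on channel capacity in bits/second.

    C = B * log2(1 + SNR), with SNR converted from dB to a linear ratio.
    """
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 100 MHz channel with an (assumed) 30 dB signal-to-noise ratio:
capacity_mbps = shannon_capacity_bps(100e6, 30) / 1e6
print(f"{capacity_mbps:.0f} Mbit/s")  # ~997 Mbit/s
```

So roughly a gigabit fits, in theory, inside 100 MHz of analog bandwidth at 30 dB SNR; the higher MHz ratings of Cat6/6a/7 buy headroom for more symbols per second and better noise margins at 10 Gbit/s.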