A gigabit is 10⁹ or 1,000,000,000 bits.
One gigabit (abbreviated Gb) is equal to 1,000 megabits or 1,000,000 kilobits. It is one-eighth the size of a gigabyte (GB).
Gigabits are most often used to measure the data transfer rates of local networks and I/O connections. For example, Gigabit Ethernet is a common Ethernet standard that supports data transfer rates of one gigabit per second (Gbps) over a wired Ethernet network. Modern I/O technologies, such as USB 3.0 and Thunderbolt, are also measured in gigabits per second. USB 3.0 can transfer data at up to 5 Gbps, while the original Thunderbolt can transfer data bidirectionally at 10 Gbps.
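Because transfer rates are quoted in bits but file sizes in bytes, estimating how long a transfer takes requires a factor-of-eight conversion. Here is a minimal Python sketch (the function name and the ideal, overhead-free assumption are ours, not part of any standard):

```python
def transfer_time_seconds(size_gigabytes: float, rate_gbps: float) -> float:
    """Ideal time to move a file, ignoring protocol overhead.

    size_gigabytes: file size in gigabytes (GB)
    rate_gbps: link speed in gigabits per second (Gbps)
    """
    size_gigabits = size_gigabytes * 8  # 1 byte = 8 bits
    return size_gigabits / rate_gbps

# A 10 GB file over USB 3.0 (5 Gbps): 10 * 8 / 5 = 16 seconds
print(transfer_time_seconds(10, 5))  # → 16.0
```

Real-world transfers are slower than this ideal figure because of protocol overhead and hardware limits.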
While gigabits and gigabytes sound similar, it is important not to confuse the two terms. Since there are eight bits in one byte, there are also eight gigabits in one gigabyte. Gigabits are most often used to describe data transfer speeds, while gigabytes are used to measure data storage.
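The eight-to-one relationship described above can be captured in two tiny helper functions (a sketch; the function names are illustrative):

```python
BITS_PER_BYTE = 8

def gigabits_to_gigabytes(gigabits: float) -> float:
    """Convert gigabits (Gb) to gigabytes (GB)."""
    return gigabits / BITS_PER_BYTE

def gigabytes_to_gigabits(gigabytes: float) -> float:
    """Convert gigabytes (GB) to gigabits (Gb)."""
    return gigabytes * BITS_PER_BYTE

print(gigabits_to_gigabytes(8))   # one gigabyte → 1.0
print(gigabytes_to_gigabits(1))   # eight gigabits → 8.0
```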