Welcome back to Acing the A+, FSET’s guide to CompTIA A+ Certification, the industry’s baseline standard for a foundational understanding of information technology. Today, we’re going to be talking about cellular standards – you’ve probably already seen “4G” or “5G” in the corner of your smartphone’s screen, and now you can learn what they really mean. 

It’s no secret that here in the 21st century, smartphones and mobile devices go hand-in-hand with our daily lives; whether it’s work or play, in most cases, we can’t help but use this kind of technology. Because devices like iPhones, Galaxies and tablets connect to cellular networks, it’s safe to say that cellular standards are a part of our daily lives, too – but what are they, exactly? 

Before smartphones rose to popularity, the device of choice was referred to as a cell phone, largely because it required a cellular network to work properly. Basically, telecommunications companies divided geographical regions into sections known as cells, each served by its own antennas. By breaking their service areas into cells, these companies could make sure their antennas reached everywhere, so that customers would always have adequate signal strength and coverage. 

The origins of the cellular standards we know today began with 2G networks – short for second-generation cellular networks – of which there were two main kinds: GSM and CDMA. Unlike the analog signals of older 0G and 1G networks, 2G networks used entirely digital signals, which allowed more users to make calls, better protected those calls, and eventually supported text messages. 

GSM stood for the Global System for Mobile Communications, and it quickly became the 2G standard for most of the world. In North America, AT&T and T-Mobile pioneered the technology, creating cells and setting up antennas to serve phones that used subscriber identity module (SIM) cards. Thanks to SIM cards, customers of these companies could move their card into a newer, better phone while keeping their existing number.  

GSM technology used a process known as time-division multiplexing to let many people communicate within the same cell (and therefore on the same frequencies) at the same time. You can think of it as a circuit-switched, turn-taking system, where everyone waited in line for a small slice of time to send information back and forth. Eventually, GSM technology evolved into Enhanced Data Rates for GSM Evolution (EDGE), which allowed packets of data – things like email and early mobile web pages – to be sent over 2G networks.  
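To make that turn-taking idea concrete, here’s a minimal Python sketch – purely an analogy, not real GSM, with made-up subscriber names – of phones in one cell each getting a repeating time slot on a shared channel:

# Toy illustration of time-division multiplexing: each phone in a cell
# gets its own repeating slot on the shared channel and only "talks"
# during that slot. (Conceptual only; real GSM framing is far more complex.)

phones = ["Alice", "Bob", "Carol"]                          # hypothetical subscribers in one cell
messages = {"Alice": "Hi!", "Bob": "Hello", "Carol": "Hey"}

# One frame = one slot per phone; frames repeat over and over in time.
frame = [(slot, phone, messages[phone]) for slot, phone in enumerate(phones)]

for slot, phone, data in frame:
    print(f"Slot {slot}: {phone} transmits {data!r}")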

CDMA stood for Code-Division Multiple Access, and during the 2G era, it was the main competitor to GSM technology. Instead of ‘waiting in line’ like GSM users, each CDMA user’s transmissions were spread with a unique code, and their devices used that code to pick their own conversation out of the combined signal. In North America, CDMA technology was captained by Verizon and Sprint, but it never caught on in Europe and elsewhere the way GSM did.  
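Here’s a similarly hedged sketch of the code-division idea: each user’s data bit is spread with a unique orthogonal code, all the spread signals add together on the air, and a receiver that knows a user’s code can still pull that user’s bit back out of the mix. (The names and short codes here are invented for illustration; real CDMA uses much longer spreading codes.)

# Toy code-division example: two users transmit simultaneously, and a
# receiver separates them using orthogonal "chip" codes.

codes = {"Alice": [1, 1, 1, 1], "Bob": [1, -1, 1, -1]}   # hypothetical orthogonal codes
bits = {"Alice": 1, "Bob": -1}                            # one data bit per user (+1 or -1)

# Everyone transmits at once: the channel carries the sum of the spread signals.
channel = [sum(bits[user] * codes[user][i] for user in codes) for i in range(4)]

# Despreading: correlate the combined signal with one user's code to recover their bit.
for user, code in codes.items():
    recovered = sum(c * x for c, x in zip(code, channel)) / len(code)
    print(user, "sent", int(recovered))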

As the world approached the new millennium, more and more people began pushing for enhanced technology with additional mobile capabilities. This eventually brought about the third generation of cellular standards, simplified as 3G – specified in the late 1990s and launched commercially in the early 2000s – which allowed more data to be transferred faster over cellular networks, making things like streaming audio and video a reality, along with global positioning system (GPS) capabilities.  

Around that time, companies were beginning to realize that the separation of GSM and CDMA technologies was more of a hindrance than a help. To bring the two camps together, the industry shifted toward what’s known as Long-Term Evolution (LTE), commonly marketed as 4G. LTE drew heavily on GSM EDGE technology, as well as the Universal Mobile Telecommunications System (UMTS), largely leaving CDMA tech behind. 

Work on LTE began around 2004, and the first commercial networks arrived around 2009. With LTE in place, telecommunications companies were able to move data over their networks much more easily, with some telcos able to offer download speeds of 150 Mbit/s on their best connections. Eventually, some companies upgraded their LTE networks into what was known as LTE-Advanced or LTE-A, which doubled the maximum speed to 300 Mbit/s.  
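As a rough back-of-the-envelope comparison – assuming a hypothetical 1.5 GB movie and ideal, best-case speeds that real networks rarely sustain – those headline rates work out like this:

# How long would a 1.5 GB download take at each headline LTE rate?
file_size_bits = 1.5 * 8 * 10**9          # 1.5 gigabytes expressed in bits

for label, mbit_per_s in [("LTE (150 Mbit/s)", 150), ("LTE-Advanced (300 Mbit/s)", 300)]:
    seconds = file_size_bits / (mbit_per_s * 10**6)
    print(f"{label}: about {seconds:.0f} seconds")

That works out to roughly 80 seconds at 150 Mbit/s and 40 seconds at 300 Mbit/s, under those idealized assumptions.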

Starting around 2019, the industry began shifting toward the fifth generation of cellular standards, or 5G. While this rollout is still underway, 5G connections are known for their immensely fast rates of data transfer, with peak speeds as high as 10 gigabits per second. 5G towers are being installed all around the world, with most major cities now featuring 5G coverage capable of offering customers previously unthinkable mobile connections.  

It’s expected that the power of 5G will change how developers approach new smartphones and mobile devices. New functionalities might include more responsive location tracking and notifications, better-performing cloud-based applications – such as streaming games with graphics too intensive for a regular phone’s hardware – and much more.   

Notably, the operating systems of our phones include code that, in most cases, automatically updates how our phones connect to cellular networks. For example, our smartphones must occasionally update their preferred roaming list (PRL) so they know which nearby towers and partner networks to connect to for the best possible signal and speed. These kinds of background processes are often updated over the air (OTA), meaning they happen quietly in the background without users even being aware that anything has taken place.  

Finally, as has been discussed in previous entries of Acing the A+, when it comes to cellular standards, it’s important to think about tethering. Most, if not all, modern smartphones are capable of sharing their cellular connection as an 802.11 Wi-Fi signal. In other words, you can rebroadcast your phone’s 4G or 5G connection as if your phone were a router that other devices can connect to (so long as they also support 802.11 Wi-Fi).  

Service providers sometimes limit how their networks can be shared, meaning your phone might not be able to perform (or share) all of the regular functions of a cellular connection, or you could be looking at extra fees depending on your data plan. For these reasons, it’s always a good idea to check with your mobile provider about what you can and can’t do – or how much it will cost – when tethering from your phone.  
