I've been reading a bit about 4G communications, and it's fascinating, but I lack the fundamental understanding. When explaining, consider me an electronics-inept person with a basic high-school-level understanding of physics.
I used to work with 4G, so I can probably answer some of your questions, especially about the signaling/coding side, though not so much the protocols. Still, your question is very broad and vague, and I have to admit I'm having a hard time thinking of things I could tell you that you'd be able to understand but that would still be interesting. The fact is that 4G, and for that matter 3G or 2G, are complex frameworks that entail much more than a single concept which can be simplified for the layman.

AM and FM radio are simple concepts, dating from an era when people didn't carry a bunch of microprocessors in their pockets. There was pretty much no concept of "state", apart from the tuned frequency. When people transmit at the same time on the same frequency, they talk over one another, like two people in the same room.

With digital communications, 2G and later, your phone and the base stations (the antennas which receive your signal and transmit signals back to you) no longer simply send a signal on a single frequency representing the audio of your voice. Thanks to digital processors, the signals can be coded and decoded in such a way that they are spread over a small range of frequencies and over time slots. Something like this (the illustration is actually of 4G, but the concept is similar for 2G):

In 4G, a user can receive data on any of these squares; they don't have to stay on a single frequency. The base station tells you which blocks are yours. Because of this, several users can be on the same base station, using the same frequency band, and still not hear each other. Your phone takes the blocks assigned to it and reconstructs the caller's voice from them. In 4G, everything is simply data, from text messages to voice calls; there are no special modes for voice, as I believe there are in 2G and 3G (again, I'm no expert on those).
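To make the grid idea concrete, here's a toy sketch in Python. The grid sizes, user names, and block assignments are all made up for illustration; real LTE scheduling works on resource blocks and is far more involved.

```python
# Toy sketch of the 4G time/frequency grid idea (invented numbers, not
# real LTE scheduling): the base station owns a grid of blocks and
# tells each phone which blocks are its own.

def build_grid(n_freq=6, n_time=4):
    """An empty grid: rows are frequency blocks, columns are time slots."""
    return [[None] * n_time for _ in range(n_freq)]

def assign(grid, user, blocks):
    """Base station marks which (freq, time) blocks belong to a user."""
    for f, t in blocks:
        grid[f][t] = user

grid = build_grid()
assign(grid, "phone_A", [(0, 0), (1, 0), (0, 1)])
assign(grid, "phone_B", [(4, 0), (5, 1), (3, 2)])

# Each phone receives the whole grid but keeps only its own blocks,
# so both phones share the band without hearing each other.
mine = [(f, t) for f in range(6) for t in range(4)
        if grid[f][t] == "phone_A"]
print(mine)  # phone_A's blocks: [(0, 0), (0, 1), (1, 0)]
```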
Every one of the blocks contains a signal which represents some binary data (1s and 0s), which in turn represents anything from a website to a voice call. The protocols tell you what the different blocks mean. There is more to know. Much, much more. However, I don't know where to start, or whether you even want to know more. I hope this post isn't too vague or confusing, but it's legitimately difficult to explain. Does this answer your question?
Alright. I should have been more precise about what I was interested in, so I'm going to list questions, and if you mention something else interesting while replying, I'll ask follow-ups, if that's okay with you. Here goes:

- Is speed/quality of connection the only difference between the two generations of communication?
- Are there any particularly interesting phenomena connected to the transmission of the signal with 4G (or in general)? I remember seeing something about radio echo while skimming the Wikipedia article, and it sounded fascinating.
- Are there differences in physical structure between the 4G and 3G signal transmitters/receivers (the proper term escapes me) - as in, are they physically different designs? I remember reading that 4G is not backwards-compatible, but does that apply to the protocols only, or to the transmitters as well? Can one encode a 4G signal with a 3G transmitter (if there even is such a distinction)? Do the cell towers have to have their equipment upgraded to match the new generation?
- How is it possible to remain within the same cell of the station and not tangle your signal up with others'?
- Does a different encoding mean that it's simpler/easier/quicker for those of ill intent to decode the signal, enabling easier eavesdropping? Alongside that, do the 4G transmission protocols mean that one would have to catch the required frequency first? Is that as easy to do as with 3G communications?

That should do for now. Thanks for answering.
I can't give a great answer to all of these, especially since my knowledge of 2G, and even more so 3G, is pretty basic, but I'll give it my best.

On whether speed/quality is the only difference: that's also somewhat vague, but the short answer is no, there are many differences. What most of those differences are trying to achieve, however, is better speed and quality. As my experience lies in the physical layer, that's what I can comment on most. The modulation schemes (how the physical signal is transmitted) differ between all three generations. I started writing about the differences, but it got too long and complicated; I don't think it would interest you much anyway, and I wouldn't want to state anything wrong about 2G/3G.

On the radio echo: what I think you're getting at is reflective channels (to put it in layman's terms). When you send a signal over the air, it bounces off all sorts of things: buildings, mountains, and so on. The receiver therefore receives the same signal several times, with different amplitudes and delays, in very rapid succession. These overlap, and that of course causes problems. What was really sent, and how can you determine that when what you receive is basically a jumbled mess of reflected versions of the original? Think of it as standing in a room with a long echo: somebody speaks to you, but the echo is so long that the words blend together into mush.

The answer, and this is true for 2G, 3G and 4G, is that you periodically (in 4G, roughly 4000 times per second) send a reference signal, a signal whose contents are known in advance, and compare it with what you actually receive. That's what the purple blocks in the image in my last post are. You take the differences and use them to make a guess about what happened to the signal between the transmitter and the receiver. This is called channel estimation. Once you have a channel estimate, you use it to correct the received signal.
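The compare-and-correct idea can be sketched in a few lines. Everything here is invented for the demo (one complex "tap" per subcarrier, hand-picked numbers); real LTE channel estimators interpolate over many pilots and are far more sophisticated.

```python
# Minimal sketch of pilot-based channel estimation. The channel value
# is made up; the receiver never sees it directly.

pilot_tx = 1 + 0j                 # reference symbol both sides agreed on
channel  = 0.6 - 0.8j             # unknown distortion the air applies

# Receiver sees the pilot after the channel has mangled it:
pilot_rx = channel * pilot_tx

# Channel estimate: compare what arrived with what was supposed to arrive.
h_est = pilot_rx / pilot_tx

# Use the estimate to undo the channel on an actual data symbol:
data_tx = -1 + 1j
data_rx = channel * data_tx
data_eq = data_rx / h_est         # equalized: back to roughly -1 + 1j
print(data_eq)
```

In reality the estimate is only approximate (the pilots are also hit by noise, and the channel changes between pilots), which is why they are re-sent thousands of times per second.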
Another technique: if you're sending, say, 8 bits of data, you actually send quite a few more bits over the channel, something like 24 (the exact amount varies with channel quality). By adding those 16 "redundant" bits, you can recover the data even if some received bits are wrong. This is called channel coding.

On the transmitters: short answer, yes, they are different, and they must be upgraded. Here my knowledge is very limited, but I know that these standards are designed so that network providers shouldn't have to completely replace all of their equipment to make them work. For a bit of context, one of my more experienced colleagues at the company told me that when they originally implemented 4G in the modem I was working on, they did it with a 3G receiver. So it's possible, but he made a point of how impressive an achievement it was. I consider this guy a straight-up genius, so the fact that he found it difficult makes me think it was far from an optimal setup.

On how users share a cell: modern digital processing. It's what I was talking about in the previous post, with that time/frequency grid. The base station tells you which of the squares are for you, and which squares you may send on. In 4G you actually receive all of the squares, but you only decode the ones meant for you. This is made possible by the Fourier transform, a mathematical operation that takes a time-domain signal (in this case, a radio wave) and decomposes it into its frequency components. In the picture there are 7 symbols in each time slot, which means you take the Fourier transform at 7 different times to get the frequency components of each time instant. That's how you get the grid, which in effect becomes just a grid of numbers. Each of those numbers represents a set of bits. When you know which ones are for you, you can go ahead and decode your data.
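As a toy illustration of that decomposition (invented numbers, plain Python rather than anything LTE-specific), a small discrete Fourier transform picks out exactly which frequencies a time-domain signal contains. Real systems use the much faster FFT, but the result is the same.

```python
import cmath

# A tiny discrete Fourier transform: turns n time samples into n
# frequency components.

def dft(samples):
    n = len(samples)
    return [sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
            for k in range(n)]

n = 8
# Time-domain signal: a mix of frequency bins 1 (full strength) and
# 3 (half strength).
signal = [cmath.exp(2j * cmath.pi * 1 * t / n)
          + 0.5 * cmath.exp(2j * cmath.pi * 3 * t / n)
          for t in range(n)]

spectrum = dft(signal)
# Only bins 1 and 3 carry energy; every other bin is essentially zero.
print([round(abs(x), 3) for x in spectrum])
```

In 4G each occupied frequency bin carries one of those complex numbers from the grid, which is why taking the transform once per symbol turns the received waveform back into the grid of numbers described above.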
On eavesdropping: you're confusing channel coding with encryption. Channel coding (which is what I was referring to) is about how you represent bits on a physical signal; it's not about obfuscating the data, and it isn't meant to make eavesdropping hard. (For what it's worth, 4G traffic is normally ciphered between the phone and the network, though not end to end.) Eavesdropping is not trivial, but I think that if you implemented your own base station clone (a very non-trivial task), you'd pretty much be able to listen in on what's going on around you. You'd have to know which frequencies the network operators are using, but that's not exactly secret information.

This turned out fairly long, but I didn't want to be too superficial. I hope this helps.
It took me a while to read through this and reply. I must thank you for such a detailed exposition of the technology. You've obviously put effort into making it as accessible to a layman as possible, and I appreciate that. If you have time, or a convenient link at hand, I'd love to hear more about how the data is processed in 4G communication - what you referred to as "modern digital processing".
Hmm, well, I went over some of it in my answer about the radio echo, so to add to this answer, have another look at that. To elaborate, and perhaps clarify a bit, I'd say the most important difference between analogue and digital processing is that a small digital chip can do what a room full of analogue electronics used to. The reason is that digital processors are general-purpose and programmable: you can make them do almost any calculation you can fathom (given enough time, of course). With analogue, by contrast, you basically need a special-purpose circuit for every operation. If you want to estimate frequency errors, you need a circuit for that; if you want to estimate how fast you're moving, to correct for frequency shift, you need a circuit for that; and so on. With a programmable processor, all of those things can be done by a single circuit (the processor itself). That's what I mean by modern digital processing.

The rest is complicated. I've typed out most of it now, so I might as well finish, but frankly this is getting technical to the point where I simply cannot make it easy to understand, and even now I'm simplifying a lot. The book that introduces these concepts is several hundred pages long. From memory, here's what happens to the data in 4G (from a base station to a phone, for example):

- You start with a set of bits you want to send. These are just 1s and 0s, as that's all computers understand. 4G knows nothing about what these bits are for; they're just information that some application (WhatsApp, for example) is sending.
- From this point on, everything we do to the data is to protect it from the channel. The channel (the air around us) is noisy, full of surfaces that reflect the signal all over the place, and it can even absorb part of it. By the time it reaches your device, it's a total mess, so you need ways to recover the original signal reliably.
- You send the bits through a turbo coder. This essentially appends a bunch of extra bits derived from the bits you're sending; typically (it varies with channel quality) you'd add about twice as many bits as the original data. The point is that by representing each bit with three bits, you can make a safer guess about what it was supposed to be. To illustrate, imagine that 101 represents 0 and 010 represents 1; these are the only combinations that will ever be sent. What if you receive 110? Clearly the channel introduced an error, but which is more likely: that 101 was sent and both the second and third bits are wrong, or that 010 was sent and only the first bit is wrong? A single bit error is more likely, so we "guess" that we should have received 010.
- We then append a few more bits, a "CRC". This is a value, like a hash, calculated from the preceding bits. On reception you calculate your own CRC; if it matches the received one, every bit you guessed was correct. Otherwise you accept defeat and ask for the data to be retransmitted; the signal was too poor.
- We then scramble the bits, i.e. "randomly" mix them up. That way, if one part of the bandwidth you're transceiving on has a weak signal, it affects scattered single bits instead of a chunk of related bits. Combined with the turbo coder, this increases robustness.
- Once all of that is done, the bits are converted to complex numbers using quadrature amplitude modulation. Each of these complex numbers will be sent on an individual frequency; now you see that, thanks to the scrambling, the bits end up distributed across frequencies.
- Finally, you take an inverse Fourier transform to turn it all into a waveform. At this point you leave the digital domain: the waveform goes to a digital-to-analogue converter and out over an antenna.

When receiving, you basically do the same thing in the opposite order. I don't know how interesting this is to you, as I simply can't give a full explanation of every concept; it would be too difficult to follow and too time-consuming to write (especially as there are numerous books on the subject). Still, this gives a reasonably full overview of the signal chain.
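The "safer guess" step can be sketched with the 3-bit example from above. This is a toy repetition-style code, not a turbo code (which is vastly more sophisticated), but the decoding principle of picking the most plausible sent codeword is the same.

```python
# Sketch of the "safer guess": 0 is sent as 101 and 1 as 010; the
# receiver picks whichever codeword is closest (fewest differing bits)
# to what actually arrived.

CODEWORDS = {"101": 0, "010": 1}

def hamming(a, b):
    """Number of bit positions where a and b differ."""
    return sum(x != y for x, y in zip(a, b))

def decode(received):
    """Guess the codeword that needed the fewest channel errors."""
    return min(CODEWORDS, key=lambda cw: hamming(cw, received))

rx = "110"  # the channel flipped something
best = decode(rx)
print(best, "->", CODEWORDS[best])  # 010 -> 1: only one bit flip needed
```

Receiving 110 is one flip away from 010 but two flips away from 101, so the decoder settles on 010, i.e. a 1, exactly as argued above.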