• henfredemars@infosec.pub · 3 months ago

    Ensuring that the system complies with industry standards and integrating security measures for cross-technology communication are also necessary steps, Gao adds.

    This is absolutely a huge factor that could make or break the technology if they don’t do this perfectly. This could be the single most important part of the tech.

    2.4 GHz is super saturated. The last thing we need is long-range (i.e. large-footprint) signals in already saturated spectrum. This technology should be deployed either not at all or very carefully, to prevent widespread interference with existing WiFi devices. This spectrum is already on the verge of being complete trash. Please, please do not deploy more stuff on 2.4 GHz spanning an entire “smart city.”

    • shortwavesurfer@lemmy.zip · 3 months ago

      I actually ditched 2.4 gigahertz Wi-Fi on my home network entirely for this exact reason. If a device is not compatible with 5 gigahertz Wi-Fi, it doesn’t get purchased.

      • henfredemars@infosec.pub · 3 months ago

        It doesn’t just benefit you. You’re also benefiting the current users of that spectrum who, for one reason or another, might not be able to switch.

        I suspect most users, though, couldn’t tell you what frequency their network uses, let alone their devices.

        • cmnybo@discuss.tchncs.de · 3 months ago

          Anyone with a NAS will immediately notice that they are on 2.4GHz because it will take several times longer to transfer files.

        • sugar_in_your_tea@sh.itjust.works · 3 months ago

          Yup, I have one device that’s stuck on 2.4GHz: my Brother laser printer. It works fantastically otherwise, and it has an Ethernet port, but I haven’t bothered to run cable to it yet. I suspect a lot of people have that one device they’d rather not replace, which is still on an old WiFi standard.

          So I just make sure to have a simultaneous dual-band setup. Everything else uses 5GHz, and the 2.4GHz band exists for that one device, or for when I’m on the opposite side of the house or something. I use fancy networking gear though (Ubiquiti APs); your average person would just be confused about why the internet is sometimes slow (i.e. when the printer wakes up).

          • AA5B@lemmy.world · 3 months ago

            While my printer only supports 2.4GHz, it’s always been on Ethernet.

            But I still have too many smart home devices and media streamers on 2.4GHz, even after making an effort to stick with local IoT meshes.

      • circuscritic@lemmy.ca · 3 months ago

        Do you live in a high density urban environment?

        Because if so, that totally makes sense, and the other benefit, that 5GHz/6GHz doesn’t travel far outside your apartment or condo walls, is pretty nifty as well.

        But if you live in a house in the suburbs, man, that is commitment well beyond necessity or convenience. Not saying it’s a bad choice per se, it just seems unnecessarily burdensome IMO.

        • shortwavesurfer@lemmy.zip · 3 months ago

          I live in a single family house, but the area has quite a few single family houses packed pretty close together. So there’s still a lot of traffic on 2.4 GHz.

        • sugar_in_your_tea@sh.itjust.works · 3 months ago

          I’m not OP, but I also live in a single family house in the suburbs and actively avoid 2.4-only gear. I do have one stubborn device on 2.4GHz though, my laser printer, so I have to keep buying simultaneous dual-band gear until I get around to running Ethernet cable to it.

    • rottingleaf@lemmy.world · 3 months ago

      This spectrum is already on the verge of being complete trash.

      Radio shouldn’t be used when avoidable. It’s for emergencies, aviation, hiking, short-range communication for convenience maybe. Phones - yes.

      But providing internet connectivity via radio when you can lay cable is just stupid.

      • henfredemars@infosec.pub · 3 months ago

        I mostly agree with you. I find it really weird that I live in a world where all my Internet runs over 5G cellular for political and social reasons, not technical ones. Due to the monopoly on the cables, it’s actually much cheaper here to buy 5G home internet. It seems unnecessarily complicated, choosing a shared medium for no reason. It’s just the politics.

        In case you’re not from the States, we have a monopoly pretty much everywhere for Internet services.

        With my 5G I have unlimited data, and it’s 300 Mbps down, 44 up on a good day. It’s perfectly serviceable if you can live with the increased latency.

        • sugar_in_your_tea@sh.itjust.works · 3 months ago

          we have a monopoly pretty much everywhere for Internet services

          Fortunately, that’s not true everywhere, and municipal fiber is becoming more and more common.

          5G home internet

          The problem here is latency. It’s entirely sufficient for most web browsing and video streaming use-cases, but it sucks for multiplayer gaming and other interactive use-cases (e.g. video calls). So while it’s probably a solution for a lot of people, it’s not really a replacement for physical cables.

    • Windex007@lemmy.world · 3 months ago

      Sounds like they basically crafted some special messages such that the transmission is nonsense at 2.4 GHz but smooths out into a LoRa message on a much, much lower (sub-GHz) frequency band.

      • towerful@programming.dev · 3 months ago

        It’s LoRa on 2.4 GHz.
        It’s just that chirp signals are easy to decode out of a lot of noise.
        And they don’t really affect most other modulation techniques. I think you can even have multiple CSS-coded signals on the same frequency, as long as they are configured slightly differently.

        LoRa is incredibly resilient.
        It’s just really really slow
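The “easy to decode from a lot of noise” point can be sketched numerically. Below is a minimal, illustrative baseband CSS demo in Python; the spreading factor and noise level are arbitrary sketch values, not any real LoRa radio profile. Dechirping collapses the chirp into a single FFT bin, so it stands out even when the per-sample SNR is around -3 dB:

```python
import numpy as np

# Illustrative chirp-spread-spectrum (CSS) demo: a chirp buried in heavy
# noise is still detectable after dechirping. Parameters are arbitrary
# sketch values, not any real LoRa profile.
SF = 7                      # spreading factor: 2**SF chips per symbol
N = 2 ** SF                 # samples per symbol (critically sampled)
n = np.arange(N)

# Base up-chirp: instantaneous frequency sweeps linearly across the
# whole band over one symbol; the phase is the integral of that sweep.
upchirp = np.exp(2j * np.pi * (n ** 2 / (2 * N) - n / 2))

# Bury the chirp in complex Gaussian noise at roughly -3 dB per-sample SNR.
rng = np.random.default_rng(0)
noise = rng.standard_normal(N) + 1j * rng.standard_normal(N)
rx = upchirp + noise

# Dechirp (multiply by the conjugate chirp): the chirp collapses to a
# single tone, so its FFT energy piles into one bin while the noise
# stays spread across all N bins.
spectrum = np.abs(np.fft.fft(rx * np.conj(upchirp)))
detected_bin = int(np.argmax(spectrum))
print(detected_bin)  # the un-shifted chirp lands in bin 0
```

The processing gain grows with the spreading factor (roughly 10·log10(2^SF) dB), which is the same trade that makes higher spreading factors slower.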

        • Windex007@lemmy.world · 3 months ago

          I don’t think it’s “just” LoRa on 2.4 GHz, because if it were, existing LoRa devices wouldn’t be able to decode the signals off the shelf, as the article claims. From the perspective of the receiver, the messages must “appear” to be in a LoRa band, right?

          How do you make a device whose hardware operates in one frequency band emulate messages in a different band? I think that’s the nature of this research.

          And like, we already know how to do that in the general sense. For all intents and purposes, that’s what AM radio does. Just hacking a specific piece of consumer hardware to do it entirely software-side is what makes it a research paper.

          • towerful@programming.dev · 3 months ago

            WiFi uses BPSK/QPSK/OFDM/OFDMA modulation.
            LoRa uses CSS modulation.

            This is about hacking WiFi hardware to make a WiFi-modulated signal intelligible to a receiver expecting CSS modulation, and to have the WiFi hardware demodulate a CSS signal.
            Thus making WiFi chips work with LoRa chips.

            LoRa doesn’t care about the carrier frequency.
            So the fact that it’s LoRa at 2.4 GHz doesn’t matter. It’s still LoRa.

            I’m sure there will be a use for this at some point.
            Certainly useful for directly interfacing with LoRa devices from a laptop.
            I feel that anyone actually deploying LoRa IoT would be working at a lower level than the “throw a laptop at it” kind of thing.
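To make the “LoRa doesn’t care about the carrier” point concrete, here’s an illustrative baseband sketch (not the paper’s actual method): CSS data symbols are cyclic shifts of a base up-chirp, and the receiver recovers them with a dechirp plus FFT. None of this math depends on whether the samples are mixed up to 868 MHz or 2.4 GHz:

```python
import numpy as np

# Baseband sketch of CSS symbol coding: each data symbol s in [0, 2**SF)
# is the base up-chirp cyclically shifted by s samples. Demodulation is
# dechirp-then-FFT; the peak bin index is the symbol. Parameters are
# illustrative only, not a real LoRa configuration.
SF = 7
N = 2 ** SF
n = np.arange(N)
upchirp = np.exp(2j * np.pi * (n ** 2 / (2 * N) - n / 2))

def modulate(symbol: int) -> np.ndarray:
    """One CSS symbol: the base chirp cyclically shifted by `symbol`."""
    return np.roll(upchirp, -symbol)

def demodulate(samples: np.ndarray) -> int:
    """Dechirp, then take the FFT peak: its bin index is the symbol."""
    return int(np.argmax(np.abs(np.fft.fft(samples * np.conj(upchirp)))))

symbols = [0, 42, 100]
decoded = [demodulate(modulate(s)) for s in symbols]
print(decoded)  # → [0, 42, 100]
```

The carrier frequency only enters when these baseband samples are mixed up for transmission, which is why the same scheme works wherever the radio can put it.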

            • Windex007@lemmy.world · 3 months ago

              I didn’t realize that LoRa didn’t care about carrier frequency, that’s for sure the root of my faulty assumption! Thanks for taking the time to explain

              • towerful@programming.dev · 3 months ago

                It’s pretty serendipitous, actually.
                The past month I’ve done a somewhat deep dive into LoRa for a project.
                I ultimately dismissed it due to the data rates, but for simple remote controls or for sensors, things that report a couple of bytes, it seems awesome.
                I’m sure you can squeeze higher data rates out of it, but when I evaluated it I decided to go with a hardwired network link (I had to have stability, and dropped data wasn’t an option, though the client had a strong preference for wireless).
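For a sense of how slow “really really slow” is: LoRa’s nominal bit rate follows from the spreading factor, bandwidth, and coding rate via the standard formula Rb = SF × (BW / 2^SF) × CR. A quick back-of-the-envelope sketch (the 125 kHz channel and 4/5 coding rate are just common example settings, not a recommendation):

```python
# Back-of-the-envelope LoRa bit rates, using the standard formula
# Rb = SF * (BW / 2**SF) * CR. The 125 kHz bandwidth and 4/5 coding
# rate are common example settings, shown for illustration only.
def lora_bit_rate(sf: int, bw_hz: float, cr: float = 4 / 5) -> float:
    """Useful bit rate in bits/second for one LoRa configuration."""
    return sf * (bw_hz / 2 ** sf) * cr

for sf in (7, 12):
    print(f"SF{sf} @ 125 kHz: {lora_bit_rate(sf, 125e3):.0f} bit/s")
# SF7 comes out around 5469 bit/s; SF12 around 293 bit/s
```

Each step up in spreading factor doubles the symbol duration while adding only one bit per symbol, which is why the long-range settings end up in the hundreds of bits per second.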