Dear God,

I hope they sack this “journalist” quickly.

  • Domi@lemmy.secnd.me

    It’s nitpicking; whether it runs at 3840x2160 or 4096x2160 does not matter. The same goes for calling it 4K or UHD, even when one is technically incorrect.

    If even Sony calls their 3840x2160 Blu-rays “4K UHD”, I’m fine with the average person using the terms interchangeably.

    • billwashere@lemmy.world

      I had to go digging, but 3840x2160 is both 2160p AND 4K UHD. 4096x2160 is something called 4K DCI, which is more of a camera or film industry thing and is rarely used for things like TVs or video games.

      • towerful@programming.dev

        1080p, 1080i, and 720p (i.e. the i/p suffix) denote a SMPTE resolution and timing.
        HD/FHD/UHD (720, 1080, and 2160 respectively) also denote SMPTE resolutions and timings.
        These are SMPTE ST 2036-1 standards, which are 16:9 and have defined (but not arbitrary) frame rates up to 120fps.

        4K DCI is still a SMPTE timing, but it’s used for cinema and is generally 24fps (though it can be 48fps for 2K DCI).
        It’s SMPTE ST 428-1.

        There are other “4K” standards, but they’re not nearly as common.

        Arbitrary resolutions or timings outside of the SMPTE standards generally fall into VESA standard resolutions/timings or custom EDID resolutions/timings.
        Chances are your computer is actually running 1920x1080@60 CVT-RB rather than 1080p60.

        Whilst 1080p60 and 1920x1080@60 seem like they should be the same, some displays (and devices) might only support SMPTE timings or only VESA timings.
        So, although a display is 1920x1080, it might expect SMPTE timings while the device can only output VESA (see the quick sketch below).
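
        Here’s a quick Python sketch of why those two differ in practice. The totals and pixel clocks below are the commonly published values for CEA-861 1080p60 and VESA CVT-RB 1920x1080, quoted from memory rather than from anything in this thread, so double-check them before relying on them:

        # Refresh rate = pixel clock / total pixels per frame (incl. blanking).
        def refresh_hz(pixel_clock_hz: float, h_total: int, v_total: int) -> float:
            return pixel_clock_hz / (h_total * v_total)

        # SMPTE/CEA-861 "1080p60": 1920x1080 active, 2200x1125 total, 148.5 MHz
        print(refresh_hz(148_500_000, 2200, 1125))  # 60.0 Hz exactly

        # VESA CVT-RB 1920x1080: same active pixels, 2080x1111 total, 138.5 MHz
        print(refresh_hz(138_500_000, 2080, 1111))  # ~59.93 Hz: a different signal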

          • towerful@programming.dev

            No problem.
            Displays, resolutions, frame rates, and EDIDs are all very complex. And marketing muddies the water!

            I’ve encountered this issue before when using Blackmagic equipment.
            What I was plugging into was described to me as “1080p”.
            A laptop plugged directly into it would work, and it looked like 1080p in Windows display management.
            Going through Blackmagic SDI converters (SDI is a SMPTE-standard protocol, so these boxes went HDMI->SDI, SDI cable, SDI->HDMI, and would only support SMPTE resolutions/timings), the display wouldn’t work, because the display was VESA only.

            I then read a lot about SMPTE, VESA, and EDIDs!
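
            If anyone wants to poke at EDIDs themselves, here’s a rough Python sketch of decoding the first detailed timing descriptor from a 128-byte EDID base block. The byte offsets follow the standard EDID 1.3/1.4 layout, but the /sys path is just an example from a Linux box, and the snippet skips interlace and sync-flag handling:

            # Decode the first 18-byte Detailed Timing Descriptor (DTD) of an
            # EDID base block. Offsets are the standard EDID 1.3/1.4 layout.
            def parse_first_dtd(edid: bytes) -> dict:
                d = edid[54:72]  # first of four descriptors starts at byte 54
                pclk_hz = int.from_bytes(d[0:2], "little") * 10_000  # 10 kHz units
                h_active = d[2] | ((d[4] & 0xF0) << 4)  # low 8 bits + upper nibble
                h_blank = d[3] | ((d[4] & 0x0F) << 8)
                v_active = d[5] | ((d[7] & 0xF0) << 4)
                v_blank = d[6] | ((d[7] & 0x0F) << 8)
                return {
                    "mode": f"{h_active}x{v_active}",
                    "refresh_hz": pclk_hz / ((h_active + h_blank) * (v_active + v_blank)),
                }

            # Example (Linux; the connector name will differ per machine):
            with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
                print(parse_first_dtd(f.read(128)))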

      • Domi@lemmy.secnd.me

        Correct, but both can be called 2160p just because of their vertical resolution. Overall, both terms don’t matter much in gaming because the aspect ratio can be changed on the fly (on PC) depending on the output device. I haven’t touched a console in years, but I assume they are stuck with a 16:9 aspect ratio no matter what they are playing on?
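
        For reference, here’s a quick Python check of the two aspect ratios (plain width:height reduction, nothing standard-specific):

        from math import gcd

        # Reduce width:height by the greatest common divisor.
        for w, h in [(3840, 2160), (4096, 2160)]:
            g = gcd(w, h)
            print(f"{w}x{h} -> {w // g}:{h // g} ({w / h:.3f}:1)")

        # 3840x2160 -> 16:9 (1.778:1)
        # 4096x2160 -> 256:135 (1.896:1), i.e. DCI's ~1.90:1, not 16:9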

    • Kalash@feddit.ch

      The same goes for calling it 4K or UHD, even when one is technically incorrect.

      Why is it incorrect? 4K isn’t a formal standard. It just means you have approximately 4,000 horizontal pixels.

      • CybranM@kbin.social

        Calling 3840x2160 “4K” makes sense since 3840 is so close.
        On a different note, I’ve sometimes heard people call 2560x1440 “2K”, but neither 2560 nor 1440 is close to 2000, so that makes little sense to me.
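
        Here’s a toy Python version of that colloquial rule (my own formalization of “round the horizontal pixel count to the nearest thousand”, not any standard), which actually lands 2560x1440 closer to “3K” than “2K”:

        # Label a mode by its horizontal pixels, rounded to the nearest 1000.
        def colloquial_k(width: int) -> str:
            return f"{round(width / 1000)}K"

        for w, h in [(3840, 2160), (4096, 2160), (2560, 1440), (1920, 1080)]:
            print(f"{w}x{h} -> {colloquial_k(w)}")
        # 3840x2160 -> 4K, 4096x2160 -> 4K, 2560x1440 -> 3K, 1920x1080 -> 2K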

      • Domi@lemmy.secnd.me

        The logic of some people is that anything under 4000 horizontal pixels is not “real” 4K. But as mentioned, I don’t care and also call 3840x2160 “4K”, simply because it’s shorter than “2160p”.

        • Kalash@feddit.ch

          Ok, can you formally define it or link me to it?

          And I don’t want a definition for “4K DCI” or “4K UHD” … just a formally accepted definition of “4K” (in the context of a display resolution). We can all agree that it colloquially means the number 4000, I hope.

          • jtmetcalfe@lemmy.sdf.org

            There is not one definition. If you hear “4K”, you can use the context of the conversation to determine whether they’re talking about the consumer 4K UHD format or cinematic 4K, neither of which has a horizontal resolution of exactly 4000px. UHD standards are maintained by the ITU; DCI standards were developed by the DCI group and are now maintained by SMPTE.

            • Kalash@feddit.ch

              There is not one definition. If you hear “4K”, you can use the context of the conversation to determine whether they’re talking about the consumer 4K UHD format or cinematic 4K

              I agree. But then it’s not a formal standard.

    • jtmetcalfe@lemmy.sdf.org

      4K UHD is the consumer format standard for 3840x2160 (the UHD family also includes 8K UHD and 16K UHD), and it’s what Blu-ray uses. Full 4K, True 4K, or DCI 4K is the cinematic 4K standard shown at 4096x2160, which many TVs support via slight letterboxing.
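
      A quick Python sketch of what that letterboxing works out to on a 3840x2160 panel (straightforward aspect-fit math, nothing standard-specific):

      # Fit 4096x2160 DCI content onto a 3840x2160 (16:9 UHD) panel while
      # preserving the content's aspect ratio; the leftover rows become bars.
      def letterbox(content_w, content_h, panel_w, panel_h):
          scale = min(panel_w / content_w, panel_h / content_h)
          w, h = round(content_w * scale), round(content_h * scale)
          return w, h, (panel_h - h) // 2  # scaled size + bar height per edge

      w, h, bar = letterbox(4096, 2160, 3840, 2160)
      print(f"{w}x{h} with ~{bar}px bars top and bottom")  # 3840x2025, ~67px bars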