• sudo42@lemmy.world · 3 months ago

    Sure, Microsoft is happy to let their AIs scan everyone else’s code, but is anyone aware of any software houses letting AIs scan their in-house code?

    Any lawyer worth their salt won’t let AIs anywhere near their company’s proprietary code until they are positive that the AI isn’t going to be blabbing the code out to every one of their competitors.

    But of course, IANAL.

    • hessenjunge@discuss.tchncs.de · 3 months ago

      The LLMs they train on their code will only be accessible internally. They won’t leak their own intellectual property that way.

        • hessenjunge@discuss.tchncs.de · 3 months ago

          Possibly. It’s hard to know without seeing the numbers and assessing output quality and volume.

          Also, it’s not unheard of for some bigwig to waste millions of company €€ on some project they fancy. (Billions, if they happen to be Elon.)

      • prole@lemmy.blahaj.zone · 3 months ago (edited)

        If only we had an overarching structure that everyone in society agrees exists for the purpose of enforcing laws and regulating things. Something that governs the people living in a region… Maybe then these companies could be compelled to disclose exactly what they’re using, and what those models are being trained on.

        Oh well.