Tech behemoth OpenAI has touted its artificial intelligence-powered transcription tool Whisper as having near “human level robustness and accuracy.”

But Whisper has a major flaw: It is prone to making up chunks of text or even entire sentences, according to interviews with more than a dozen software engineers, developers and academic researchers. Those experts said some of the invented text — known in the industry as hallucinations — can include racial commentary, violent rhetoric and even imagined medical treatments.

Experts said that such fabrications are problematic because Whisper is being used in a slew of industries worldwide to translate and transcribe interviews, generate text in popular consumer technologies and create subtitles for videos.

More concerning, they said, is a rush by medical centers to utilize Whisper-based tools to transcribe patients’ consultations with doctors, despite OpenAI’s warnings that the tool should not be used in “high-risk domains.”

  • Sludgehammer@lemmy.world · 2 months ago

    Wow, that’s bad. I thought it would be more of a “confusing a sentence for a similar-sounding one” type of thing, but from the above and the article, it’s just generating semi-believable text and sticking it into the transcriptions.

    • TheBlackLounge@lemm.ee · 2 months ago

      It’s actually extremely good at figuring out confusing speech. It gets weird when the audio quality is bad.

      I use it for generating subs for obscure movies.
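
      For anyone curious, a minimal sketch of that workflow using the open-source openai-whisper package (the model size, file names, and hand-rolled SRT formatting are my own illustrative choices, not from this thread):

      ```python
      # Sketch: transcribe a movie's audio with openai-whisper and write
      # the segments out as an .srt subtitle file.
      # Assumes `pip install openai-whisper` and ffmpeg on PATH;
      # "movie.mkv" and the "small" model size are placeholder choices.
      import whisper

      def srt_timestamp(seconds: float) -> str:
          """Format seconds as an SRT timestamp, e.g. 00:01:02,345."""
          ms = int(round(seconds * 1000))
          h, ms = divmod(ms, 3_600_000)
          m, ms = divmod(ms, 60_000)
          s, ms = divmod(ms, 1_000)
          return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

      model = whisper.load_model("small")
      result = model.transcribe("movie.mkv")

      with open("movie.srt", "w", encoding="utf-8") as f:
          for i, seg in enumerate(result["segments"], start=1):
              f.write(f"{i}\n")
              f.write(f"{srt_timestamp(seg['start'])} --> {srt_timestamp(seg['end'])}\n")
              f.write(seg["text"].strip() + "\n\n")
      ```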

      • tehn00bi@lemmy.world · 2 months ago

        No one is good with bad audio. My wife did some transcription work for a little while; it can be pretty painful, especially with doctors and all the medical terms.