
    Digital Ghosts: OpenAI’s Sora Sparks a Storm Over AI Resurrections


    By Edna Martin

    Oct 17, 2025

    It started innocently enough—people experimenting with OpenAI’s new video model, Sora, to create short cinematic clips from text prompts.

    But soon, something eerie began circulating: ultra-realistic videos of long-dead public figures appearing to speak, act, and even plead on camera.

    It wasn’t nostalgia—it was resurrection. Legal experts now warn that Sora may have opened a Pandora’s box of digital afterlives, as seen in the report on how AI videos of the dead have raised alarms among ethicists and families.

    Within days of release, clips of Martin Luther King Jr., Amy Winehouse, and Robin Williams began trending across social media.

    One viral video even showed Stephen Hawking being “powerslapped” in a simulated wrestling skit—something his family reportedly found deeply distressing.

    The boundary between tribute and exploitation blurred, and for many, that line was crossed the moment grief became entertainment.

    I watched a few of those clips, and honestly, there’s something gut-wrenching about seeing familiar faces move and speak in ways they never did. It feels like memory theft wrapped in pixels.

    Under public pressure, OpenAI has started walking things back. The company quietly paused all clips featuring Martin Luther King Jr. after his estate complained, and introduced a system where families can request the removal of any likeness.

    That “opt-out” approach, described in the coverage about how OpenAI suspended Sora videos and allowed estates to block future appearances, still leaves much of the burden on the victims’ relatives.

    Imagine having to file a ticket just to stop strangers from animating your late father. It’s maddening.

    The issue isn’t just moral—it’s legal chaos. In the U.S., likeness rights for the deceased vary wildly from state to state.

    Some jurisdictions treat them like property, others barely acknowledge them. That means a deepfake of a dead actor could be illegal in California but perfectly fine in Texas.

    And because Sora’s clips spread globally in minutes, the law feels like it’s chasing a phantom.

    One tech analyst put it bluntly: “We’ve built time machines for the internet, but no one knows who owns the ghosts.”

    Adding fuel to the fire, creators are now using Sora for everything from satirical content to political “what if” scenarios.

    Some users even made videos parodying OpenAI’s CEO, showing him “caught stealing GPUs” in an AI-generated security camera feed—a clip that’s been circulating widely since reports surfaced about fake Sora footage depicting Sam Altman caught on tape.

    What started as harmless parody now highlights just how powerful, and uncontrollable, these tools have become.

    The irony? While OpenAI scrambles to set up guidelines, tech outlets are already teaching people how to spot fakes—pointing to things like mismatched lighting or subtle eye flickers.

    But the deepfakes are evolving faster than detection methods. Even experts quoted in guides on how to spot a Sora fake while you still can admit that the line between real and generated video is fading by the week.
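    To make that “mismatched lighting” tip a little more concrete, here is a minimal Python sketch—my own illustration, not anything published by OpenAI or the detection guides—that flags abrupt brightness jumps between consecutive frames of a clip. The function name, file path, and threshold are assumptions chosen purely for demonstration; real forensic tools are far more sophisticated, and a flagged frame proves nothing on its own.

```python
# A deliberately naive heuristic: sudden frame-to-frame brightness jumps can
# hint at the kind of lighting inconsistency detection guides mention.
# Assumes opencv-python and numpy are installed; the threshold is arbitrary.
import cv2
import numpy as np

def flag_lighting_jumps(video_path, jump_threshold=12.0):
    """Return indices of frames whose mean brightness shifts sharply."""
    cap = cv2.VideoCapture(video_path)
    flagged, prev_mean, index = [], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:  # end of video or unreadable frame
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        mean_brightness = float(np.mean(gray))
        if prev_mean is not None and abs(mean_brightness - prev_mean) > jump_threshold:
            flagged.append(index)
        prev_mean = mean_brightness
        index += 1
    cap.release()
    return flagged

# Example with a hypothetical file: print(flag_lighting_jumps("suspect_clip.mp4"))
```

    Even a toy check like this shows why the cat-and-mouse framing rings true: the moment generators learn to keep lighting consistent, the heuristic stops working.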

    It’s not like we didn’t see this coming. A few months back, when OpenAI unveiled the Sora beta, some journalists joked it would soon let people “deepfake their friends” for fun.

    And sure enough, Sora’s companion app—mentioned in early coverage about how the AI model allows users to deepfake each other through short social videos—has already become a playground for both creativity and chaos.

    From a personal standpoint, I get the fascination. Watching a digital ghost speak again tugs at something primal—a mix of curiosity and longing.

    But when I see Winehouse’s digital self crying or King delivering a speech written by an algorithm, it’s hard not to feel like we’re trampling on graves with code.

    The tech is extraordinary, sure, but maybe that’s the problem: it’s too powerful, too easy, too soon.

    And so we circle back to the uncomfortable question—who gets to own a face when the body’s long gone?

    If Sora teaches us anything, it’s that the dead may never really rest again, not when algorithms can keep them talking forever.
