TL;DR: even if your delete script confirms a full wipe and your Reddit profile page shows zero comments, there may still be comments left over (which you can find through a search engine and delete manually on Reddit).

Weeks ago, I used redact.dev to delete all my Reddit comments (thousands of them over 10+ years). Redact.dev confirmed a full wipe, and my Profile > Comments page on Reddit confirmed I had no comments left.

Yet, as of today, Google still returns dozens of results for “$myredditusername site:reddit.com”. It’s not just Google’s crawler lagging: when I follow those links, the comments are still visible on the Reddit website, under my username, and I can still delete them manually.
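
If you want to run the same check on your own account, here’s a minimal sketch, assuming you’ve copied the leftover permalinks by hand from the search results (the username and URLs below are placeholders). It relies on the fact that appending “.json” to a Reddit permalink returns the thread as JSON, so you can confirm which comments still carry your name without clicking through each one:

```python
# Check which "deleted" comments are still publicly attributed to you.
# Appending ".json" to a Reddit permalink returns the thread as JSON;
# unauthenticated requests are rate-limited, so pace yourself.
import time

import requests

USERNAME = "myredditusername"  # placeholder: your own username
LEFTOVER_LINKS = [
    # permalinks copied by hand from the search results, e.g.:
    # "https://www.reddit.com/r/somesub/comments/abc123/title/def456/",
]

for url in LEFTOVER_LINKS:
    resp = requests.get(
        url.rstrip("/") + ".json",
        headers={"User-Agent": "leftover-comment-check/0.1"},
        timeout=30,
    )
    if resp.ok and USERNAME in resp.text:
        print("still attributed to you:", url)
    time.sleep(10)  # be gentle; logged-out requests get throttled fast
```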

Thankfully, I hadn’t yet nuked my account, because I knew of other users whose deleted comments got reinstated (though that was attributed to the deletion script exceeding the API rate limit; supposedly a different situation, since those missed comments still showed up on the Profile page).

spez: edited for clarity.

  • cassetti · 10 points · 1 year ago

    I don’t trust those snakes. I’m working on code to use Reddit’s website and edit comments one at a time (one per minute, so they don’t think it’s bot activity). I’m going to deploy the code a month or two from now, after the API is gone, because I want them to think they’ve “won” before I overwrite and then erase a decade’s worth of content.
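
    The pacing is the important part; here’s a minimal sketch of it in Python (not my actual tooling - edit_comment and delete_comment are hypothetical callbacks standing in for whatever ends up driving the browser):

    ```python
    # Sketch of the overwrite-then-delete loop, paced at roughly one
    # action per minute with jitter so it looks less like a bot.
    import random
    import time

    PAUSE_SECONDS = 60  # one edit per minute, as described above

    def scrub(comments, edit_comment, delete_comment):
        """edit_comment and delete_comment are hypothetical callbacks
        that drive the browser; only the pacing is shown here."""
        for comment in comments:
            edit_comment(comment, "[overwritten before deletion]")
            time.sleep(PAUSE_SECONDS + random.uniform(0, 15))
            delete_comment(comment)
            time.sleep(PAUSE_SECONDS + random.uniform(0, 15))
    ```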

    • @elboyoloco@lemmy.world · 4 points · 1 year ago

      Is this something you would be able to share with others when it’s finished? Or put it on GitHub so people can make suggestions or changes?

      • cassetti · 4 points · 1 year ago

        So I’m not a traditional programmer - I don’t use a lot of the common software and such. I have a lot of prior experience with AutoItScript automation software, so I’ll probably use that to mimic keystrokes and clicks on my computer screen once I’ve programmed exact positions for things - it’ll likely be a very specific set of code for my computer.

        But I may create an account on GitHub and share it if there’s enough interest lol

    • TimeSquirrel · 1 point · 1 year ago

      So it scrapes the page manually? I was thinking of writing a small Python program myself to do that.

      • cassetti · 3 points · 1 year ago

        Simpler than that - I’ll likely use AutoItScript for Windows to literally automate clicking links or simulating keystrokes (like the tab key) until it reaches the desired link, then clicking the edit function, revising the text, tabbing to the save button, saving the change, and repeating over and over.

        It’s crude and inefficient, but I have over twenty years’ experience using the code for various small tasks, so I’m sure I’ll get the job done.
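
        For anyone who’d rather do the same thing in Python, pyautogui works roughly like AutoIt here - blind clicks and keystrokes at pre-recorded screen positions, no page scraping. A rough sketch (the coordinates and tab count are placeholders you’d record for your own screen):

        ```python
        # Rough Python equivalent of the AutoIt approach: blind clicks and
        # keystrokes at pre-recorded screen positions, no page scraping.
        import time

        import pyautogui

        EDIT_BUTTON = (740, 520)  # placeholder coordinates; record your own
        TABS_TO_SAVE = 3          # placeholder: tabs from text box to save button

        def overwrite_visible_comment(new_text="[overwritten]"):
            pyautogui.click(*EDIT_BUTTON)      # open the comment's edit box
            time.sleep(2)                      # let the page catch up
            pyautogui.hotkey("ctrl", "a")      # select the existing text
            pyautogui.write(new_text, interval=0.05)
            for _ in range(TABS_TO_SAVE):      # tab over to the save button
                pyautogui.press("tab")
            pyautogui.press("enter")           # save the change
            time.sleep(60)                     # one edit per minute
        ```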

        Just not sure when I want to start - I feel like they’re still playing tricks, un-deleting content and such, for people using automated API code. So for now I’ve simply blocked Reddit at the router level; in another month or two I’ll go back and start writing my code to automate the deletion of 10+ years’ worth of content.