TL;DR: even if your delete script confirms a full wipe and your Reddit profile page shows zero comments, there may still be leftover comments that you can find through a search engine and delete manually on Reddit.

Weeks ago, I used redact.dev to delete all my Reddit comments (thousands of them, accumulated over 10+ years). Redact.dev confirmed a full wipe, and my Profile > Comments page on Reddit confirmed I had no comments left.

Yet, as of today, Google still returns dozens of results for “$myredditusername site:reddit.com”. It’s not just Google’s crawler lagging behind; when I follow those links, the comments are still visible on the Reddit website, under my username, and I can still delete them manually from there.
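One way to double-check for leftovers is to walk each search-engine-indexed permalink and test whether the page still attributes a comment to your username. A minimal sketch in Python, assuming Reddit author links still contain `/user/<name>` somewhere in the page HTML (an assumption about the markup, not a documented contract):

```python
# Sketch: detect whether a comment page still credits content to a user.
# Assumption: Reddit author links contain "/user/<name>" in the HTML.

def still_visible(html: str, username: str) -> bool:
    """Return True if the page appears to attribute a comment to `username`."""
    needle = f"/user/{username}".lower()
    return needle in html.lower()

# Example with a stand-in HTML snippet (a real check would fetch each permalink,
# e.g. with urllib.request, and collect the URLs that still need manual deletion):
sample = '<a href="/user/myredditusername/">myredditusername</a> wrote: hello'
print(still_visible(sample, "myredditusername"))              # True
print(still_visible("<p>[deleted]</p>", "myredditusername"))  # False
```

Anything that comes back `True` goes on the list of comments to delete by hand.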

Thankfully, I hadn’t yet nuked my account, because I knew of other users whose deleted comments got reinstated (although that was thought to be caused by the deletion script exceeding the API rate limit; supposedly a different case, since those missed comments would still show on the Profile page).

spez: edited for clarity.

  • TimeSquirrel
    1 year ago

    So it scrapes the page manually? I was thinking of writing a small Python program myself to do that.

    • cassetti
      1 year ago

      Simpler than that: I’ll likely use AutoItScript for Windows to literally automate clicking links or simulating keystrokes (like the Tab key) until it reaches the desired link, then click the edit function, revise the text, tab to the save button, save the change, and repeat over and over.

      It’s crude and inefficient, but I have over twenty years’ experience using it for various small tasks, so I’m sure I’ll get the job done.

      Just not sure when I want to start. I feel like they are still playing tricks, un-deleting content and such, for people using automated API scripts. So for now I’ve simply blocked Reddit at the router level; in another month or two I’ll go back and start writing my code to automate the deletion of 10+ years’ worth of content.
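The click-tab-save-repeat loop described in the comment above is just plain control flow, whatever tool drives the UI. A Python mock-up of that structure, where `FakeDriver` is a hypothetical stand-in for the actual automation layer (AutoItScript, Selenium, etc.); none of its method names come from a real API:

```python
# Sketch of the repeat loop described above: for each remaining comment,
# open it, overwrite the text, save, then delete. Overwriting before
# deleting hedges against deletions being quietly reverted.
# FakeDriver is a hypothetical placeholder, not a real automation library.

class FakeDriver:
    """Test double that records actions instead of driving a real UI."""
    def __init__(self, comment_urls):
        self.comment_urls = list(comment_urls)
        self.log = []

    def open(self, url):
        self.log.append(("open", url))

    def edit(self, text):
        self.log.append(("edit", text))

    def save(self):
        self.log.append(("save",))

    def delete(self):
        self.log.append(("delete",))

def wipe_comments(driver, filler="[removed by author]"):
    """Overwrite each comment, save the edit, then delete it, one at a time."""
    for url in driver.comment_urls:
        driver.open(url)
        driver.edit(filler)
        driver.save()
        driver.delete()

driver = FakeDriver(["https://reddit.com/comment/1", "https://reddit.com/comment/2"])
wipe_comments(driver)
print(len(driver.log))  # 8 recorded actions: 4 per comment
```

Swapping `FakeDriver` for real keystroke/click automation is where the fragile part lives; the loop itself stays the same.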