Reddit has a form where you can request a copy of your data. The process can take up to 30 days, after which you will get a private message on your Reddit account with a download link. The data comes in the form of CSV files that you can open using Microsoft Excel or any text editor.
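If you want to do more than skim the export in a spreadsheet, the CSVs are easy to load programmatically. Here's a minimal sketch in Python; the filename comments.csv and its column names are assumptions, so check what your actual export contains:

```python
import csv

# Assumed filename and columns ("date", "body") -- verify against the
# files Reddit actually includes in your export.
with open("comments.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # Print the date and a short preview of each comment body.
        print(row.get("date", "?"), row.get("body", "")[:80])
```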

If you’d rather not wait for Reddit to deliver your data, or would prefer to keep your data in a searchable archive, you can use Brownman’s tool, reddit-user-to-sqlite. This command-line application can download the complete public archive of any Reddit user and compile it into a SQLite database file. Just keep in mind that this method will stop working on July 1, 2023, when the API change takes effect (because you don’t actually own the content you created on Reddit?).
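Once the tool has built the database, you can search it with Python's built-in sqlite3 module. A minimal sketch, assuming the file is named reddit.db and contains a comments table with permalink and text columns (table and column names here are assumptions; confirm the real schema first, e.g. with `sqlite3 reddit.db ".schema"`):

```python
import sqlite3

# Hypothetical filename and schema: verify the actual table and column
# names that reddit-user-to-sqlite produced before running this.
conn = sqlite3.connect("reddit.db")
conn.row_factory = sqlite3.Row

# Search your own comment text for a keyword.
rows = conn.execute(
    "SELECT permalink, text FROM comments WHERE text LIKE ?",
    ("%sqlite%",),
)
for row in rows:
    # Print a link and a short preview of each matching comment.
    print(row["permalink"], (row["text"] or "")[:80])

conn.close()
```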

See https://www.wired.com/story/how-to-download-your-reddit-data/

#technology #deletereddit #Reddit

  • jherazob@beehaw.org · 1 year ago

    I have 17 years of posts and comments and have been active. I don’t have much trust in those kinds of tools, especially given that the API has hard limits on what it can reach, but I’ll still check it out.

    Edit: Welp, right from the start we have issues: it requires the absolute latest Python version. As a sysadmin, I hate when things demand the latest and greatest just to be installed; I’m on a version that still has full support…

    Oh well, we’ll see…

    • deadcade@lemmy.deadca.de · 1 year ago

      The official GDPR data request form returns “everything” (assuming you can trust Reddit to provide everything), and it has no hard limit on the number of comments/posts.

      • jherazob@beehaw.org · 1 year ago

        Yeah, that one they’re legally bound to comply with; I was wondering about the mentioned tool or other similar ones.