Sorry, but I can’t think of another word for it right now. This is mostly just venting, but if anyone has a better way to do it, I wouldn’t hate to hear it.

I’m trying to set up a home server for all of our family photos. We’re on our way to de-googling, and part of the impetus for the change is that our Google Drive is almost full. We have a few hundred gigs of photos between us. The problem with trying to download your data from Google is that the only reasonable way to do it is through Google Takeout. First you have to order it. Then you have to wait anywhere from a few hours to a day or two for Google to “prepare” the download. Then you have one week before the takeout “expires.” That’s one week to the minute from the time of the initial request.

I don’t have some kind of fancy California internet, just normal home internet, and there is just no way to download a 50-gig (or even 2-gig) file in one go: there are always interruptions that force me to restart the download. But if you try to download the files too many times, Google throws another error and you have to start over and request a new takeout. Google doesn’t let you download the entire archive at once either; you have to select each file part individually.

I can’t tell you how many weeks I’ve spent trying to download all of the files before they expire or Google hits me with another error.

  • weker01@sh.itjust.works · 4 months ago

    Google Takeout is the most GDPR-compliant platform of all the big tech giants. Amazon, for example, makes you wait until the very last day they legally can.

    They also do minimal processing, like with the metadata (as others commented): it’s probably just how they store it internally, and that’s what they’re required to deliver. The simple fact that you can select what you want to request, rather than having to download everything about you, makes it good in my eyes.

    I actually see good-faith compliance with the GDPR in the platform.

  • BodilessGaze@sh.itjust.works · 4 months ago

    There’s no financial incentive for them to make it easy to leave Google. Takeout only exists to comply with regulations (e.g. the Digital Markets Act), and as usual, they’re doing the bare minimum to not get sued.

  • stepan@lemmy.cafe · 3 months ago

    There was an option to split the download into archives of a customizable size, IIRC.

    • gedaliyah@lemmy.world (OP) · 3 months ago

      Yeah, that introduces an issue of queuing and monitoring dozens of downloads rather than just a few. I had similar results.

      As my family is continuing to add photos over the week, I see no way to verify that previously downloaded parts are identical to the same parts in another takeout. If that makes sense.
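For what it’s worth, checksums are one way to spot-check this: if Google happens to produce byte-identical parts across takeouts, the hashes will match; if the library changed in between, they won’t. A sketch, with illustrative filenames:

```shell
# Hash the parts you already have (filenames are illustrative)
sha256sum takeout-*.zip > checksums-week1.txt

# After re-downloading parts from a new Takeout request, check
# which ones are byte-identical to last week's; matching parts
# print "OK", changed parts print "FAILED"
sha256sum -c checksums-week1.txt
```

There is no guarantee Google builds the archive parts deterministically, so a mismatch doesn’t necessarily mean corruption, only that the part can’t be skipped.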

      • Willdrick@lemmy.world · 3 months ago

        You could try a download manager like DownThemAll on Firefox, set a queue with all the links and a depth of 1 download at a time.

        DtA was a godsend when I had shitty ADSL. It splits a download into multiple parts and manages to survive micro-interruptions in the service.

        • gedaliyah@lemmy.world (OP) · 3 months ago

          DownloadThemAll seems to be helping. I’ll update the original post with the details once I have success. In this case, I was able to first download them internally in the browser, then copy the download link and add them to DtA using the link. Someone smarter than me will be able to explain why the extra step was necessary, or how to avoid it.

        • gedaliyah@lemmy.world (OP) · 3 months ago

          I couldn’t get it working, but I didn’t try too hard. I may give it another shot. I’m trying a different approach right now.

  • Railcar8095@lemm.ee · 4 months ago

    Not sure if somebody mentioned it, but you can export to OneDrive. So you can get a 1TB account on a free trial (or pay for a single month) and export everything there as plain files, no large zips. Then download to your computer with the OneDrive app, and cancel the subscription.

    Pretend to be in California or the EU, then request full removal of all your data from both Microsoft and Google.
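If you go this route, rclone (rather than the OneDrive desktop app) can pull the files down with automatic retries. A sketch, assuming you’ve already configured an rclone remote named `onedrive` via `rclone config` and that the export landed in a `Takeout` folder:

```shell
# Copy everything down from OneDrive with retries and progress output.
# --transfers 4 runs four files in parallel; --retries 10 re-attempts
# failed transfers; -P prints live progress.
rclone copy onedrive:Takeout /mnt/photos/takeout \
    --transfers 4 --retries 10 -P
```

Unlike a browser download, a dropped connection here only costs you the files that were mid-transfer, not the whole batch.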

    • gedaliyah@lemmy.world (OP) · 4 months ago

      This route may be the answer. I haven’t had success so far in setting up a download manager that offers any real improvement over the browser. I wanted to avoid my photos landing on two corporate services, but as you say, in theory everything is deletable.

  • redxef@scribe.disroot.org · 4 months ago

    Honestly I thought you were going to bitch about them separating your metadata from the photos and you then having to remerge them with a special tool to get them to work with any other program.

  • butitsnotme@lemmy.world · 4 months ago

    I know it’s not ideal, but if you can afford it, you could rent a VPS from a cloud provider for a week or two, do the Google Takeout download there, and then use a sync tool to copy the files to your own server.
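Once the archives are sitting on the VPS, something like rsync handles the flaky home connection well, because a re-run resumes instead of restarting. A sketch with an illustrative hostname and paths:

```shell
# -a preserves file attributes; -P shows progress AND keeps partial
# files, so re-running the same command after a dropped connection
# continues where it left off instead of re-downloading everything.
rsync -aP user@my-vps:/srv/takeout/ /mnt/photos/takeout/
```

The VPS-to-you leg can be retried as many times as needed with no one-week deadline, which is the whole point of the detour.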

    • gedaliyah@lemmy.world (OP) · 4 months ago

      I don’t know how to do any of that but I know it will help to know anyway. I’ll look into it. Thanks

      • Avid Amoeba@lemmy.ca · 4 months ago

        Go completely dumb and install a desktop OS like Ubuntu Desktop. Then remote into it and use the browser just as normal to download the stuff onto it. We’ll help you with moving the data off to your local server afterwards. Critically, the machine has to have enough storage for your entire download.

  • Flax@feddit.uk · 4 months ago

    Try this, then do them one at a time. You have to start the download in your browser first, but you can click “pause” and leave the browser open as it downloads to your server.

  • Resol van Lemmy@lemmy.world · 4 months ago

    It’s bad because they don’t want you to use it; it only exists so that they don’t get sued by the European Union.

  • smeeps@lemmy.mtate.me.uk · 4 months ago

    I think this is a bit unfair. Most Google Takeout requests are fulfilled in seconds or minutes. Obviously collating 100GB of photos into a zip takes time.

    And it’s not Google’s fault you have internet issues: even a fairly modest 20 Mbps connection can pull 50 GB in under 6 hours. If you have outages, that’s on your ISP, not Google. As others have said, have it download to a VPS or Dropbox etc., then sync it from there. Or call your ISP and tell them to sort your line out; I’ve had 100% uptime on my VDSL copper line for over 2 years.

    I was able to use Google Takeout and my relatively modest 50Mbps connection to successfully Takeout 200GB of data in a couple of days.

    • gedaliyah@lemmy.world (OP) · 4 months ago

      What download manager did you use? I’ve tried with whatever’s built into Firefox on two different networks, with similar results. The downloads freeze every so often and I have to restart them (they pick up where they left off). Sometimes a download just won’t reconnect, which I’m guessing is a timeout on Google’s end, although I really have no idea.

      I don’t ever have to manage downloads of this size, so sorry if it’s an obvious question

  • Dave@lemmy.nz · 4 months ago

    Can you do a few albums at a time? Select the albums you want, download that file, then do the next few albums. That way you have direct control over the data you’re getting in each batch, and you’ll have a full week to get each batch instead of having to start again because the whole thing didn’t finish in a week.

    • gedaliyah@lemmy.world (OP) · 4 months ago

      That may be a thought. I could organize the photos first and then do multiple takeouts. Thanks

  • helenslunch@feddit.nl · 4 months ago

    I just have normal home internet and there is just no way to download a 50gig (or 2 gig) file in one go

    “Normal” home internet shouldn’t have any problem downloading 50 GB files. I download games larger than this multiple times a week.

  • Eager Eagle@lemmy.world · 4 months ago

    A 50GB download takes less than 12 hours on a 10 Mbps connection, and I had a 10 Mbps link 10 years ago in a third-world country, so maybe check your options with your ISP. 50GB really shouldn’t be a problem nowadays.
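The figure checks out, using decimal gigabytes and megabits:

```shell
# 50 GB at 10 Mbps: bytes * 8 bits-per-byte / link speed, in hours
python3 -c "print(round(50e9 * 8 / 10e6 / 3600, 1))"   # prints 11.1
```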

    • gedaliyah@lemmy.world (OP) · 4 months ago

      It’s not the speed - it’s the interruptions. If I could guarantee an uninterrupted download for 12 hours, then I could do it over the course of 3-4 days. I’m looking into some of the download management tools that people here have suggested.

      • aulin@lemmy.world · 4 months ago

        I would recommend aria2. It can download several chunks of a file in parallel, resume downloads automatically with a configurable number of retries, it supports mirrors (maybe not an option for Google Takeout, but useful in other cases), and it can download over many different protocols.
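For a single Takeout part, the invocation might look like this (the URL is a placeholder; the flags are from the aria2 manual):

```shell
# -c resumes a partially downloaded file instead of restarting;
# -x 8 / -s 8 split the file and open up to 8 connections;
# --max-tries=0 retries indefinitely, waiting 30s between attempts
aria2c -c -x 8 -s 8 --retry-wait=30 --max-tries=0 \
    "https://example.com/takeout-part-001.zip"
```

Whether the retries outlast Google’s link expiry is another matter, but at least micro-interruptions no longer throw away completed chunks.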

  • Darohan@lemmy.zip · 4 months ago

    Just gone through this whole process myself. My god does it suck. Another thing to be aware of with Takeout and Google Photos: the photo metadata isn’t embedded as EXIF like it would be with a normal service; instead it’s given as an accompanying JSON file for each image. I’m using Memories for Nextcloud, which has a tool that can restore the EXIF metadata from those files, but it’s not exact, and now I have about 1.5k images tagged as being from this year when they’re really from 2018 or earlier. I’m looking at writing my own tool to restore some of this metadata, but it’s going to be a right pain in the ass.
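Until a proper tool exists, a rough stopgap is to at least restore file modification times from the JSON sidecars. This sketch assumes the sidecar layout `IMG_1234.jpg.json` with a `photoTakenTime.timestamp` field (sidecar naming and fields vary between Takeout exports, so verify against your archive) and GNU `touch`:

```shell
# For every Takeout JSON sidecar (IMG_1234.jpg.json next to IMG_1234.jpg),
# read photoTakenTime.timestamp (a Unix epoch) and stamp it onto the
# matching image's mtime. Field names assumed -- check your own export.
for j in *.jpg.json; do
  ts=$(python3 -c "import json,sys; print(json.load(open(sys.argv[1]))['photoTakenTime']['timestamp'])" "$j")
  touch -d "@$ts" "${j%.json}"
done
```

This doesn’t touch the EXIF data itself, but most gallery software falls back to mtime, which fixes the “everything is from this year” sorting problem.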

  • YurkshireLad@lemmy.ca · 4 months ago

    Because Google doesn’t want you to export your photos. They want you to depend on them 100%.

    • gedaliyah@lemmy.world (OP) · 4 months ago

      Looked promising until

      When Images are downloaded this strips EXIF location (according to the docs and my tests). This is a limitation of the Google Photos API and is covered by bug #112096115.

      The current google API does not allow photos to be downloaded at original resolution. This is very important if you are, for example, relying on “Google Photos” as a backup of your photos. You will not be able to use rclone to redownload original images. You could use ‘google takeout’ to recover the original photos as a last resort