Use Gsuite, plexdrive, rclone and unionfs for backups

— This is for plexdrive4; although plexdrive5 is available, it lacks the disk-cache option —

We just got a G Suite account with 5 users, so we have unlimited storage. A great opportunity to store backups (encrypted, of course) and free up local space for more useful data.

Before I found plexdrive, no solution had enough performance, or it pushed the Google API limits too far. Too many API calls mean a 24-hour ban, and that's not funny at all.

Thanks to plexdrive (which stores the metadata in MongoDB), this is no problem at all. The downside is that it's read-only, because of its origin as a streaming-only solution.

This problem can be solved with rclone and unionfs.

What we need:

  1. An Ubuntu 16.04 server
  2. A G Suite account
  3. Your own G Suite API credentials
  4. Local folders
    1. /gsuite-cache – for caching downloaded data (plexdrive handles this)
    2. /gsuite-local – for the locally saved data
    3. /gsuite-remote – the plexdrive mount point
    4. /backup – the backup folder for unionfs
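The folders from the list above can be created in one go; a minimal sketch (run as root, or prefix with sudo, since they live directly under the filesystem root):

```shell
# Create the four directories from the list above.
mkdir -p /gsuite-cache /gsuite-local /gsuite-remote /backup
```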

Step One – Install Plexdrive:

cd ~
apt install mongodb
chmod +x plexdrive-linux-amd64
mv plexdrive-linux-amd64 /usr/bin/plexdrive
plexdrive -m localhost -t /gsuite-cache/ /gsuite-remote/  -o allow_other &

Follow the instructions, pretty easy. Then your G Suite is mounted, even if you can't see it with "df". You can see it with "mount".

If you screwed up, wrong G Suite account or something like that, drop the DB (mongo plexdrive --eval "db.dropDatabase()") and start again.

Now that plexdrive is up and running, we install rclone.

Step Two – Install rclone:

Stole this in parts from 🙂

curl -O
unzip rclone-current-linux-amd64.zip
cd rclone-current-linux-amd64
sudo cp rclone /usr/bin/
sudo chown root:root /usr/bin/rclone
sudo chmod 755 /usr/bin/rclone
sudo mkdir -p /usr/local/share/man/man1
sudo cp rclone.1 /usr/local/share/man/man1/
sudo mandb
rclone config
Insert your API credentials.
Paste the verification code.

Now you can upload existing data (if you don't start from scratch),
but ONLY upload encrypted files; see Step Five.

rclone copy [existing Folder] [gsuite-name]:[folder-to-upload] --verbose --transfers 8 --checkers 20 --stats 10s

This can take time, depending on the amount of data and your bandwidth.

Step Three – Install unionfs:

apt install unionfs-fuse
unionfs-fuse -o cow /gsuite-local/=RW:/gsuite-remote/=RO /backup -o allow_other

Now we have all the needed folders: /backup/ contains all the files and folders you can find in /gsuite-local/ and /gsuite-remote/.
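Note that neither mount survives a reboot. One option (a sketch, assuming you still use /etc/rc.local on Ubuntu 16.04; the sleep duration is a guess) is to re-establish them at boot, plexdrive first, then unionfs on top:

```shell
# /etc/rc.local fragment (sketch): restore both mounts after a reboot.
# The sleep is a crude way to let plexdrive settle before unionfs layers on top.
plexdrive -m localhost -t /gsuite-cache/ /gsuite-remote/ -o allow_other &
sleep 15
unionfs-fuse -o cow /gsuite-local/=RW:/gsuite-remote/=RO /backup -o allow_other
```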

Step Four – upload changes:

If you write new backups now, they don't get uploaded to your G Suite but are saved locally in /gsuite-local/. That gives much better performance, which you'll need unless you want your nightly backups to run for several days instead of hours.
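Before a backup job writes into /backup, it's worth checking that both layers are actually mounted; a FUSE mount that died leaves an ordinary empty directory behind. A small guard sketch (the helper name check_mount is my own; mountpoint ships with util-linux):

```shell
# Refuse to run a backup when a layer has silently unmounted.
check_mount() {
    if mountpoint -q "$1"; then
        echo "$1 is mounted"
    else
        echo "$1 is NOT mounted" >&2
        return 1
    fi
}

# Run the backup only if both the remote layer and the union are up.
check_mount /gsuite-remote && check_mount /backup || echo "skipping backup run"
```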

So what you have to do is to upload them with rclone.

rclone move /gsuite-local/ [gsuite-name]:/ --verbose --transfers 4 --checkers 20 --stats 10s

All new files get written to the cloud and deleted locally. Thanks to unionfs, nothing changes in /backup.
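To automate this step, the move can run from cron at night. A sketch (the remote name gsuite and the log path are assumptions; use whatever you named the remote in rclone config):

```shell
# /etc/cron.d/gsuite-upload (sketch): push local backups to the cloud at 03:00.
0 3 * * * root rclone move /gsuite-local/ gsuite:/ --transfers 4 --checkers 20 --stats 10s >> /var/log/gsuite-upload.log 2>&1
```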

Step Five – encrypt!:

There is no cloud, just other people's computers. So all data you upload should be encrypted; you can use encfs, for example. Be aware of its security problems: there is a way to decrypt the files if someone really, really wants to. But I don't think that the Google bot scanning your drive will do that.

I prefer gocryptfs; it has almost the same features as encfs, but without the vulnerability.

Rclone has built-in encryption.

The decision is yours; better slightly unsafe encryption than no encryption at all.
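If you go the rclone route, a crypt remote layers encryption over the Google Drive remote from Step Two. A sketch of what the relevant part of rclone.conf might look like (the remote and folder names are assumptions; generate the real obscured passwords with rclone config, never write them in plain):

```shell
# ~/.config/rclone/rclone.conf (sketch)
[gsuite]
type = drive
# ... credentials from Step Two ...

[gsuite-crypt]
type = crypt
remote = gsuite:/encrypted
filename_encryption = standard
password = ***    # obscured value generated by "rclone config"
password2 = ***
```

Uploading to gsuite-crypt: instead of gsuite: then stores only ciphertext in the cloud.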

Step Six – donate!:

If you like the software, donate!


I have had this running for several months now, and it works fine. Maybe problems will occur later, so no guarantee at all.
