Automatically backing up your files on a web server and uploading them to Google Drive using a Python script
Uploading our archive file
When we upload a file using our service account's token, the file is uploaded into the Drive storage associated with that service account, not into our personal Google Drive. As far as I know, there is no graphical way to access it. Since we would ideally want to download our backups whenever we need them without having to write more scripts, we need a way to access these files through a web interface.
An easy hack is to create a folder in your personal Google Drive and share it with the service account using the service account’s email address (the email address you were asked to note down earlier).
Here, I have created a folder called backup and shared it with my service account's email address. After you share the folder, you need to obtain the folder's id. To do that, navigate into the folder and look at the URL: the long string that follows the folders/ segment is the id.
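For example, if the folder's URL looks something like the made-up one below, the id is the part after folders/; a quick way to pull it out in Python:

# Hypothetical example URL; your folder's id will be a different string.
url = "https://drive.google.com/drive/folders/1AbCdEfGhIjKlMnOpQrStUvWx"
folder_id = url.rstrip("/").rsplit("/", 1)[-1].split("?")[0]
print(folder_id)  # -> 1AbCdEfGhIjKlMnOpQrStUvWx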
Now, we need to upload our backup file into this folder so that we can access it through our own Google Drive. A nice side benefit is that your personal Drive will not lose any storage space, since the uploaded file is owned by the service account.
def upload(fileName, service):
    print("Beginning backup upload...")
    # Wrap the archive in a resumable media upload object.
    media = MediaFileUpload(fileName, mimetype="application/gzip", resumable=True)
    # Create the file inside the shared folder ('parentID' is the folder's id).
    file = service.files().create(
        body={'name': fileName, 'parents': ['parentID']},
        media_body=media,
        fields='id'
    ).execute()
    print("Backup uploaded. Online backup file ID is %s." % file.get('id'))
    print("Setting backup permissions...")

    def callback(request_id, response, exception):
        if exception:
            # Handle error
            print(exception)
        else:
            print("Permission Id: %s" % response.get('id'))

    # Grant our personal account write access to the uploaded file.
    batch = service.new_batch_http_request(callback=callback)
    user_permission = {
        'type': 'user',
        'role': 'writer',
        'emailAddress': '[email protected]'
    }
    batch.add(service.permissions().create(
        fileId=file.get('id'),
        body=user_permission,
        fields='id',
    ))
    batch.execute()
The upload function accepts the path of the archive file and the service object as its arguments.
Import the following into your script.
from googleapiclient.http import MediaFileUpload
We will create a media object using this class. Here, we pass the path of the archive file, its mimeType, and set resumable to True. Since our backup file is likely to be large, setting resumable to True ensures that if the upload is interrupted, it can be resumed from where it left off.
Then, we pass the name of the file and the media object to the create method to upload the file. To upload the file into the folder we created, replace 'parentID' in the parents array with the id of the folder you obtained earlier.
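If you prefer to drive the resumable upload yourself, for example to print progress, the create request can also be executed chunk by chunk with next_chunk instead of a single execute call. This is only a sketch of that variant: the file name, chunk size, and 'parentID' placeholder are illustrative, and service is the Drive service object from earlier.

# Sketch: chunk-by-chunk resumable upload instead of one execute() call.
# "backup.tar.gz" and the 10 MB chunk size are illustrative values.
media = MediaFileUpload("backup.tar.gz", mimetype="application/gzip",
                        resumable=True, chunksize=10 * 1024 * 1024)
request = service.files().create(
    body={'name': 'backup.tar.gz', 'parents': ['parentID']},
    media_body=media,
    fields='id')
response = None
while response is None:
    # next_chunk() uploads one chunk and reports progress until done.
    status, response = request.next_chunk()
    if status:
        print("Uploaded %d%%..." % int(status.progress() * 100))
print("Online backup file ID is %s." % response.get('id'))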
To be able to view and edit the file from our personal Google Drive, we need to grant ourselves permission on it. To do so, create a user permission object and set its emailAddress to the address of your personal Google account. Then, pass the file id and the user permission object to the add method of our batch object and call execute to grant the permission.
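The batch request is handy if you want to grant several permissions in one round trip; for a single user, you could also call the permissions endpoint directly. A minimal sketch of that alternative, reusing the file and service objects from the upload function (the email address is a placeholder for your own account):

# Sketch: granting the same permission without a batch request.
user_permission = {
    'type': 'user',
    'role': 'writer',
    'emailAddress': '[email protected]'  # placeholder: your own address
}
permission = service.permissions().create(
    fileId=file.get('id'),
    body=user_permission,
    fields='id'
).execute()
print("Permission Id: %s" % permission.get('id'))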
Deleting our archive file
Now it's time to mop up our workspace. You wouldn't want the archive files created on your web server to accumulate over time and hog your storage space, so let's delete the file we created. To delete a file, we need to import the os module.
import os
Then, create a function that accepts the file path as an argument and deletes the file.
def clean(fileName):
    print("Deleting temporary files...")
    os.remove(fileName)
    print("Temporary files deleted. Backup complete!")
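To tie everything together, here is a rough sketch of how the pieces might be wired up at the bottom of the script. The credentials file name and archive path are assumptions for illustration; in practice they would be whatever the earlier parts of your script use.

# A minimal end-to-end sketch (file names are illustrative, not prescriptive).
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/drive']
creds = service_account.Credentials.from_service_account_file(
    'credentials.json', scopes=SCOPES)  # hypothetical key file name
service = build('drive', 'v3', credentials=creds)

fileName = 'backup.tar.gz'              # hypothetical archive produced earlier
upload(fileName, service)
clean(fileName)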