
One command to backup and patch your git changes

July 21, 2019


In this short tutorial, I’ll show you how to easily back up and patch all of your latest git changes with just one command.

The Problem

Imagine this: after a whole day of intense coding, you are all set to commit. Just before you push, you hit a series of merge conflicts, and while resolving them you corrupt your original changes. Now you have no way of getting back to those local changes.

Git is a lifesaver, but sometimes it can be tricky to handle, and if you don’t know what you are doing, you can even lose hours of uncommitted work. So it’s always a good idea to keep incremental backups of your local changes.

The Solution

In this tutorial we will write a small PowerShell script which automatically creates patches of your staged and unstaged changes, and also copies all your changed files into a folder. This way you’ll be able to restore your changes in more than one way.

Backup and Patch all git changes: The Script

Here is the script:

$localGitWorkspacePath = 'F:\MyProject\codebase'
$BackupDirectory = 'E:\Backup\'

# switch to the local git repository
cd $localGitWorkspacePath

Write-Host "Creating new directory..."

# current date/time, formatted without characters that are invalid in folder names (such as ":")
$date = Get-Date -Format "yyyy-MM-dd_HH-mm-ss"

# creating a folder in the backup directory named with the current date/time
New-Item $BackupDirectory$date -ItemType Directory

Write-Host "Creating patches..."

# create a patch of unstaged changes (git writes the file itself, so it stays plain text)
git diff --output="$BackupDirectory$date\UnstagedChanges.patch"
# create a patch of staged changes
git diff --cached --output="$BackupDirectory$date\StagedChanges.patch"

Write-Host "Copying files..."

# creating a folder called 'files' to copy all the changed files into
New-Item $BackupDirectory$date\files -ItemType Directory

# getting an array of all the files that have changes
$changedFiles = git diff --name-only HEAD^ --diff-filter=ACMRTUXB

# looping through all changed files and copying them to the 'files' folder
foreach ($file in $changedFiles) {
    Copy-Item $file $BackupDirectory$date\files
}

Now let’s try to understand what the script does. In the first two lines, the variable $localGitWorkspacePath stores the path of your local git repository, and $BackupDirectory stores the path of the folder in which the backups will be created, so change these variables accordingly. Next, we change the current working directory to point to the git repository.

In the backup folder, we’ll create a sub-folder for each new backup, named with the date and time at which the backup was taken. The variable $date stores the current date and time for this purpose. Since Windows does not allow characters such as “:” in directory names, the date is formatted using only “-” and “_” as separators.

$date = Get-Date -Format "yyyy-MM-dd_HH-mm-ss"

Next, we create a new folder in the backup directory, named with the current date and time. For example, a backup taken on 21 July 2019 at 18:30 would end up in E:\Backup\2019-07-21_18-30-00.

New-Item $BackupDirectory$date -ItemType Directory

The next command creates a patch file called UnstagedChanges.patch, which stores all the unstaged changes in your repository. The --output option makes git write the patch file directly, instead of going through PowerShell’s output redirection, which can change the file’s encoding.

# create a patch of unstaged changes
git diff --output="$BackupDirectory$date\UnstagedChanges.patch"

Similarly, StagedChanges.patch will contain all the staged changes.

# create a patch of staged changes
git diff --cached --output="$BackupDirectory$date\StagedChanges.patch"
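
Should you ever need them, these patch files can be applied back onto a clean checkout with git apply. Here is a minimal sketch, assuming a backup folder created by the script (the folder name below is just an example):

# re-apply the backed-up unstaged changes to the working tree
git apply E:\Backup\2019-07-21_18-30-00\UnstagedChanges.patch

# re-apply the backed-up staged changes, then stage them again if needed
git apply E:\Backup\2019-07-21_18-30-00\StagedChanges.patch
git add .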

Next, we create a folder called files inside the backup sub-folder, ask git for the list of changed files, and loop through that list to copy each file.

# getting an array of all the files that have changes
$changedFiles = git diff --name-only HEAD^ --diff-filter=ACMRTUXB

# looping through all changed files and copying them to the 'files' folder
foreach ($file in $changedFiles) {
    Copy-Item $file $BackupDirectory$date\files
}

That’s it. You can save this script anywhere with the extension .ps1. To execute it, open PowerShell (with administrator access if needed) in the folder where the script is kept, and run it by typing its name and pressing Enter.
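
For example, assuming you saved the script as backup-git.ps1 (the file name here is just an illustration), you can run it like this from a PowerShell prompt opened in that folder:

# run the backup script
.\backup-git.ps1

# if your execution policy blocks local scripts, this runs it for a single invocation
powershell -ExecutionPolicy Bypass -File .\backup-git.ps1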

Limiting number of backups

With regular backups, the backup directory will quickly get crowded. The next few commands limit the number of backup revisions by deleting older revisions and keeping only the latest few.

$numberOfBackupsToPreserve=4

Get-ChildItem $BackupDirectory -Directory |
Sort-Object CreationTime -Descending |
Select-Object -Skip $numberOfBackupsToPreserve |
Remove-Item -Recurse -Force -WhatIf  

These lines can be appended to the end of the script, so that after every execution the backup folder keeps only as many backup revisions as specified in $numberOfBackupsToPreserve.

Because the last line includes the -WhatIf parameter, it will not actually delete the previous revisions; it only lists the folders and files that would be removed. To really delete the older revisions and keep only the latest few, simply remove the -WhatIf parameter. The parameter is kept as a safety measure: since this step deletes folders recursively, you should first make sure the right folders are being targeted, because if you accidentally provide a wrong path in $BackupDirectory, everything inside that folder could be deleted.
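
Once you have verified the output, the same pipeline without -WhatIf performs the actual cleanup:

# keep only the latest $numberOfBackupsToPreserve backup folders, delete the rest
Get-ChildItem $BackupDirectory -Directory |
Sort-Object CreationTime -Descending |
Select-Object -Skip $numberOfBackupsToPreserve |
Remove-Item -Recurse -Force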

Conclusion

I have made it a habit to back up all my local changes with this script before merging or committing. It is always a relief to have a backup of all your changes in case of any mishap.

Thanks for reading, and I hope this helps improve your workflow. Setting it up only takes a few minutes, but it can save you hours of lost work. Please share this with your friends if you found it helpful!
