I contribute to a software development project that has a particular characteristic:
multiple versions coexist, and there is a way to run any of those versions locally (to test a new feature or a bug fix).
A distribution of a single version takes about 2 GB of disk space, spread across a large number of files.
Currently, the workflow looks like this. If, say, I need to test my bug fix on versions A and B, I would do the following (a rough script equivalent is sketched after the list):

- Copy the official distribution for version A from C:\references\A to C:\current.
- Compile my changes, overwriting some of the binaries in C:\current.
- Test the bug fix.
- Delete everything in C:\current.
- Copy C:\references\B to C:\current.
- Compile my changes, overwriting some of the binaries in C:\current.
- Test the bug fix.
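For concreteness, here is a minimal Python sketch of that loop. The paths come from the description above, but the build and test commands (build.cmd, run-tests.cmd) are placeholders standing in for the project's actual tooling:

```python
import shutil
import subprocess
from pathlib import Path

REFERENCES = Path(r"C:\references")
CURRENT = Path(r"C:\current")

def test_fix_on(version: str) -> None:
    """Copy a reference distribution, build on top of it, then run the tests."""
    # Wipe whatever is left over from the previous version.
    if CURRENT.exists():
        shutil.rmtree(CURRENT)

    # Copy the ~2 GB reference distribution; this is the slow step.
    shutil.copytree(REFERENCES / version, CURRENT)

    # Placeholder build and test commands, not the project's real ones.
    subprocess.run(["build.cmd", "--output", str(CURRENT)], check=True)
    subprocess.run(["run-tests.cmd", str(CURRENT)], check=True)

for version in ("A", "B"):
    test_fix_on(version)
```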
Copying that many files takes time, and the delay breaks my flow. I thought about another approach, but I don't know whether it is feasible.
In SQL Server, there is a feature called database snapshots. In essence, at a given moment, you ask the database engine to create a snapshot of the current data. Once it is created, any change can be made to the database (data can be added or deleted, entire tables dropped, and so on), but at any moment I can go back to the snapshot, and all the changes made since it was taken disappear in a matter of milliseconds.
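To illustrate what I mean, creating and reverting to a database snapshot looks roughly like this. This is only a sketch using pyodbc; the server, database, logical file name, and snapshot path are placeholders:

```python
import pyodbc

# Placeholder connection string; CREATE DATABASE / RESTORE must run outside a transaction,
# hence autocommit=True.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;Trusted_Connection=yes",
    autocommit=True,
)
cursor = conn.cursor()

# Take the snapshot: a sparse file that keeps the original pages as they are changed.
# MyDb, MyDb_Data and the .ss path are placeholder names.
cursor.execute("""
    CREATE DATABASE MyDb_Snap ON
        (NAME = MyDb_Data, FILENAME = 'C:\\snapshots\\MyDb_Data.ss')
    AS SNAPSHOT OF MyDb
""")

# ... arbitrary changes to MyDb happen here ...

# Revert: the database is rolled back to the state captured by the snapshot.
cursor.execute("RESTORE DATABASE MyDb FROM DATABASE_SNAPSHOT = 'MyDb_Snap'")
```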
If there were a similar feature for NTFS, I would be able to do something like this (again sketched after the list):

- Create a snapshot of C:\references.
- Compile my change so that it overwrites some of the binaries in C:\references\A.
- Compile my change so that it overwrites some of the binaries in C:\references\B.
- Once I'm happy with the changes, restore the state of C:\references from the original snapshot.
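In script form, the workflow I am hoping for would look roughly like this. The snapshot_create and snapshot_restore functions are purely hypothetical (they are the capability I am asking about, not a real API), and build.cmd / run-tests.cmd are the same placeholders as before:

```python
import subprocess
from pathlib import Path

REFERENCES = Path(r"C:\references")

def snapshot_create(root: Path):
    """Hypothetical: record the state of 'root' so it can be rolled back later."""
    raise NotImplementedError("This is the capability I am asking about.")

def snapshot_restore(token) -> None:
    """Hypothetical: discard every change made under the snapshotted tree."""
    raise NotImplementedError("This is the capability I am asking about.")

# Take one snapshot of the whole reference tree.
token = snapshot_create(REFERENCES)

# Build my change directly on top of each reference distribution and test it.
for version in ("A", "B"):
    subprocess.run(["build.cmd", "--output", str(REFERENCES / version)], check=True)
    subprocess.run(["run-tests.cmd", str(REFERENCES / version)], check=True)

# Once I'm done, roll C:\references back to the snapshot, ideally in milliseconds.
snapshot_restore(token)
```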
Is there a way to do it?




