Background
I have a list of paths to configuration files that are stored in various Bitbucket repositories, and I want to download only those files for processing via an automated script.
I have already achieved that with a Python script that calls the Bitbucket 1.0 REST API with OAuth, and I have no trouble retrieving or processing the files.
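For context, the retrieval step looks roughly like the sketch below. It is only a minimal illustration, assuming the `requests` and `requests_oauthlib` libraries; the credentials, owner/repo names, and file path are all placeholders, not my real values.

    # Sketch of the download step (assumptions: requests + requests_oauthlib,
    # placeholder OAuth credentials and repository/path values).
    import requests
    from requests_oauthlib import OAuth1

    # Hypothetical OAuth 1.0a credentials registered with Bitbucket.
    auth = OAuth1("CONSUMER_KEY", "CONSUMER_SECRET",
                  "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")

    def fetch_raw_file(owner, repo, revision, path):
        """Download a single file's raw contents via the 1.0 REST API."""
        url = ("https://bitbucket.org/api/1.0/repositories/"
               f"{owner}/{repo}/raw/{revision}/{path}")
        response = requests.get(url, auth=auth)
        response.raise_for_status()
        return response.text

    config_text = fetch_raw_file("my-team", "my-repo", "master", "conf/app.cfg")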
The actual problem
So now that I've processed the files, my problem is: how do I commit the changes (via an automated script) and push them back up to the remote repositories? As far as I know, Git doesn't support committing a single-file modification without a local clone.
Things I have considered:
- Cloning every single required repository and using terminal calls to git commit, etc., to commit and push back up to the remotes (see the sketch after this list). (Disadvantage: the repos are quite large, so cloning them all would be cumbersome.)
- Using git submodules. I am aware that git submodules were suggested as a replacement for svn:externals, which sounds like it would fit my problem fairly well. (Problem: I read the git submodules docs and couldn't see how they would help in my situation.)
- Simulating the POST calls behind Bitbucket's recently-implemented online editor. (Problem: Bitbucket has said it should only be used for testing purposes and is liable to change. The POST requests are also hard to decipher.)
- Using SSH for Git? (I don't think that would help either.)
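For the first option, this is roughly what I imagine the automation would look like: a rough sketch only, assuming git is on the PATH, that a shallow clone (`--depth 1`) is acceptable to keep the checkout small, and that the repo URL, file path, and commit message shown are placeholders.

    # Sketch of option 1 (assumptions: git on PATH, placeholder repo URL and
    # file path, shallow clone used to reduce the size of each checkout).
    import subprocess
    import tempfile

    def commit_single_file(repo_url, file_path, new_contents, message):
        """Shallow-clone the repo, overwrite one file, then commit and push it."""
        with tempfile.TemporaryDirectory() as workdir:
            subprocess.check_call(["git", "clone", "--depth", "1", repo_url, workdir])
            with open(f"{workdir}/{file_path}", "w") as f:
                f.write(new_contents)
            subprocess.check_call(["git", "-C", workdir, "add", file_path])
            subprocess.check_call(["git", "-C", workdir, "commit", "-m", message])
            subprocess.check_call(["git", "-C", workdir, "push", "origin", "HEAD"])

    commit_single_file("git@bitbucket.org:my-team/my-repo.git",
                       "conf/app.cfg", "updated contents\n",
                       "Automated config update")

Even with shallow clones, though, this still means fetching every repository just to touch one file, which is what I was hoping to avoid.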