As has been said, we need to know much more about your system to give a good answer. 
But I'll point out that in most environments it's worthwhile to set up the production machine as a git host and use git push to manage synchronization from the development environment. (If you've ever used Heroku, this will seem familiar.)
The reasons:
- It relieves you of remembering which files have been updated. Just push.
- It provides a quick and simple rollback mechanism. If you've pushed a change and it turns out to be broken, just roll back as shown here; a minimal sketch follows this list.
- You should have a local repo anyway for change management. 
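For example, a minimal rollback from the development machine might look like this, assuming site.git has already been added as a remote named production (as described further down); the remote and branch names are illustrative:

```sh
# Create a new commit that undoes the broken one, then push it;
# the post-receive hook on the server re-checks out the work tree.
git revert HEAD
git push production master
```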
git makes updating the production machine easy with a post-receive hook. Set up a bare repository in your login home on the production server:
```sh
mkdir site.git
cd site.git
git init --bare
```
Set up the site root, e.g.:
```sh
mkdir /var/www/www.mysite.com
```
Then create a shell script hooks/post-receive (don't forget chmod +x) containing something like:
```sh
#!/bin/sh
GIT_WORK_TREE=/var/www/www.mysite.com git checkout --force
```
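The hook only runs if it is executable, so from inside site.git:

```sh
chmod +x hooks/post-receive
```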
Now add site.git as a remote of the development machine repository.  Pushing to that target will update the production machine.
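For example, on the development machine (the remote name, the SSH user/host, the path to site.git, and the branch name are all placeholders; substitute your own):

```sh
# Register the bare repository on the production server as a remote.
git remote add production ssh://user@www.mysite.com/home/user/site.git

# Deploy: the post-receive hook checks the pushed HEAD out into /var/www/www.mysite.com.
git push production master
```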
NB
I was not advocating git as a complete solution. As @Will I Am said in the comments, you do need to think through how sensitive data is handled in light of where your repos are stored. However, even for this, git can be a useful tool if set up the right way, e.g. as explained here. The general idea for sensitive data is to use git with a separate repo or submodule as a smart form of secure FTP. Of course, if the amount and/or complexity of the sensitive data is small, a simple copy or a remote shell script will do just as well.
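For the "simple copy" case, a one-line sketch (the paths and file name are purely illustrative) might be:

```sh
# Copy a credentials file that is kept out of the repo (e.g. via .gitignore)
# directly into the site root on the production server.
scp config/secrets.ini user@www.mysite.com:/var/www/www.mysite.com/config/secrets.ini
```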