I've got a reverse ssh tunnel set up so that I can access my work computer from home. However, for the past few days the connection has been getting stuffed up on a regular basis (it's not that an open session gets dropped, it's that new connections get refused), and it frustrates me a little bit.
While nothing beats a proper ssh connection, I could at least copy files back and forth via Dropbox if only I had a way of sending commands to my work computer.
And an obvious way of doing that would be to use a cronjob and a tiny bit of bash scripting. So here we go:
While we don't have to (we could just keep an empty script file in place instead), I like the idea of testing for the presence of a specific file in the Dropbox folder and, if it exists, executing it.
Let's call the file that does the testing runremote.sh, and put it in our home folder (~/). I suspect that making sure execution output and error messages get properly logged is a good thing if you're going to fly blind like this, hence the 1>> and 2>> redirections below.
runremote.sh
# if runme.sh has appeared in the Dropbox folder, run it in the background, appending stdout and stderr to log files
if [ -e ~/Dropbox/runme.sh ]; then sh ~/Dropbox/runme.sh 1>> ~/Dropbox/runme.log 2>> ~/Dropbox/runme.error & fi
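Before handing this to cron, the way I'd sanity-check it is to drop a throwaway one-command runme.sh in place and run the watcher by hand (the sleep is only there to give the backgrounded script a moment to finish):

echo date > ~/Dropbox/runme.sh    # throwaway test script: just print the date
sh ~/runremote.sh                 # run the watcher once, manually
sleep 1                           # let the backgrounded runme.sh finish
cat ~/Dropbox/runme.log           # the date should now appear in the log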
Then when you want something executed, put a file called runme.sh in ~/Dropbox:
runme.sh
pwd
echo 'Is it working?'
cp ~/testfile.text ~/Dropbox
date
Note that any command in runme.sh is going to be run in the ~/ folder -- not in ~/Dropbox.
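So if your commands need to run somewhere other than your home folder, cd explicitly inside runme.sh. A minimal sketch, with ~/projects/somerepo standing in for whatever directory you actually mean:

cd ~/projects/somerepo || exit 1   # hypothetical directory; bail out if it doesn't exist
pwd                                # logs where the remaining commands actually run
ls -la                             # ...and what's in there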
And set runremote.sh to be executed through cron, e.g. every five minutes:
crontab -e
*/5 * * * * sh ~/runremote.sh
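After saving, you can list the installed entries to confirm the line actually took:

crontab -l   # should show the */5 entry pointing at ~/runremote.sh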
Again, you don't need to have it test for the presence of a file, but I just instinctively like the idea.
Anyway, any command you put in ~/Dropbox/runme.sh should be executed and logged within five minutes of being synced.
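And since the log files live in the Dropbox folder too, they sync back, so from the home machine you can check what happened with something like:

tail ~/Dropbox/runme.log     # stdout of the last runs
tail ~/Dropbox/runme.error   # anything that went to stderr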
You CAN use sudo (echo mypassword | sudo -S ls /root) as well by providing your password in the script file, but this is obviously not terribly safe.
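In runme.sh that would look something like the following (mypassword is obviously a placeholder, and it sits in plain text in a synced folder, so treat it accordingly):

# -S makes sudo read the password from stdin instead of prompting
echo mypassword | sudo -S ls /root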