15 February 2012

68. Reverse ssh tunnel to access a server behind a firewall

This isn't so much a post to show how to do it as it is a post describing a specific example of it in use. I basically spent two minutes looking at http://www.howtoforge.com/reverse-ssh-tunneling and was up and running in no time. It's that simple.

The situation:
I have a computer at work. Opening up a port to allow remote access is a headache and a half, since it involves collecting signatures from a range of people, drawing up an IT security plan, and so on. As an academic during grant season I don't have that kind of time, nor do I want to put up with all that BS. I also understand that opening up ports willy-nilly can lead to security threats.

Anyway, I have iinet at home and they leave port 22 open by default. I have a Linksys WRT54 running Tomato and I allow key-based ssh external access.

My IP address is not static, but it changes perhaps once a month at most.
On my main desktop at home I run this script as a cron job:


#!/bin/bash
# Look up the current external IP address
ipaddr=`wget http://automation.whatismyip.com/n09230945.asp -O - -o /dev/null`
# Timestamp, e.g. "Wed 15 Feb 2012 18:55"
when=`date +%a' '%d' '%b' '%Y' '%H':'%M`
# Append timestamp and IP to a file that Dropbox syncs
echo $when $ipaddr >>/home/me/Dropbox/currentip.dat
exit 0

That way I can easily look up the latest IP address in my Dropbox folder.
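
For completeness, the matching crontab entry could look something like this (the path and hourly schedule are my own assumptions; the script is assumed to be saved as /home/me/bin/logip.sh and marked executable):

# m h dom mon dow  command
0 * * * *  /home/me/bin/logip.sh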

I run Debian testing on all boxes.
Connecting via ssh to my home router works flawlessly; the other direction doesn't work at all.

The solution:
We'll pretend that my home IP is 124.54.34.23 and my work IP is 169.23.54.6.

At work
While at work, I connect to my home router using
ssh -R 19999:localhost:22 root@124.54.34.23

This logs me in to my Tomato router; the -R 19999:localhost:22 part asks the router's sshd to listen on port 19999 and forward anything that connects there back to port 22 on my work machine. Once logged in, start
top -d 600

This keeps top running, updating every ten minutes, and generates just enough activity to stop the connection from being dropped as idle.
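
If you end up doing this regularly, the same forwarding can also live in ~/.ssh/config on the work machine, so that a plain "ssh hometunnel" sets up the tunnel (the host alias hometunnel is hypothetical; addresses and port as above):

Host hometunnel
    HostName 124.54.34.23
    User root
    RemoteForward 19999 localhost:22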

The alternative is of course to use autossh -- the basic usage is simply to replace ssh with autossh in the command above.
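
A rough sketch of what that might look like (the monitoring port 20000 is my own arbitrary choice; -f backgrounds autossh and -N skips running a remote command):

autossh -M 20000 -f -N -R 19999:localhost:22 root@124.54.34.23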

Now, go home

At home
Log in to your router from the local network, then connect to port 19999 on localhost:
me@niobium:~$ ssh root@192.168.2.1

Tomato v1.28.1816

BusyBox v1.14.4 (2010-06-27 20:11:16 PDT) built-in shell (ash)
Enter 'help' for a list of built-in commands.

# ssh me@localhost -p 19999
me@localhost's password:

Linux beryllium 3.2.0-1-amd64 #1 SMP Sun Feb 5 15:17:15 UTC 2012 x86_64

The programs included with the Debian GNU/Linux system are free software;
the exact distribution terms for each program are described in the
individual files in /usr/share/doc/*/copyright.

Debian GNU/Linux comes with ABSOLUTELY NO WARRANTY, to the extent
permitted by applicable law.
Last login: Wed Feb 15 18:55:45 2012 from localhost

me@beryllium:~$ 
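
If you want to skip the interactive hop through the router, the two steps can in principle be collapsed into one command from the home desktop (a sketch assuming the same hosts as above; -t forces a pty so the nested session stays interactive):

ssh -t root@192.168.2.1 ssh -p 19999 me@localhost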

Simple as that.

67. Neat trick using reverse proxy -- several http hosts behind a firewall with only one port open

The situation: I was running two wireless webcams (Airlink101 AIC 250W) to monitor my laboratory. Both were connected to a Linksys router. The university had only ports 22 and 80 open, and we were forwarding port 80 to a Debian box running Apache.

The goal: We wanted a single page, e.g. www.externalhostname.com/image.html, to serve up images from both webcams, using Apache.

The solution:
A friend came up with this neat solution.

The following is assumed:
  • The external dns name is www.externalhostname.com
  • The cameras have the LAN ips 192.168.1.121 and 192.168.1.122


First the html file -- image.html:

<html>
<head>
<title>Lab Webcams</title>
<META HTTP-EQUIV="REFRESH" CONTENT="5">
</head>
<body bgcolor="#00007A" text="white">
<table border="1">
<tr>
<td>
Cam 1480
</td>
<td>
Cam 1485
</td>
</tr>
<tr>
<td>
<img src="http://www.externalhostname.com/cam1/image.jpg" width="320" height="240"/>
</td>
<td>
<img src="http://www.externalhostname.com/cam2/image.jpg" width="320" height="240"/>
</td>
</tr>
</table>
</body>
</html>
Next, configure Apache in /etc/apache2/httpd.conf:
# Load the proxy modules
LoadModule proxy_module /usr/lib/apache2/modules/mod_proxy.so
LoadModule proxy_http_module /usr/lib/apache2/modules/mod_proxy_http.so
LoadModule proxy_connect_module /usr/lib/apache2/modules/mod_proxy_connect.so

# Reverse proxy only -- never act as an open forward proxy
ProxyRequests Off
<Proxy *>
Order deny,allow
Allow from all
</Proxy>

# Map /cam1 and /cam2 on the external host to the two cameras on the LAN
ProxyPass /cam1 http://192.168.1.121
ProxyPassReverse /cam1 http://192.168.1.121
ProxyPass /cam2 http://192.168.1.122
ProxyPassReverse /cam2 http://192.168.1.122
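
With the config in place and Apache reloaded, a quick sanity check from outside is to fetch one of the proxied images directly (same path as in the page above; the output filename cam1.jpg is arbitrary):

curl -o cam1.jpg http://www.externalhostname.com/cam1/image.jpg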

Finally, copy the following files from /etc/apache2/mods-available to /etc/apache2/mods-enabled:

proxy.conf:
<IfModule mod_proxy.c>
</IfModule>

proxy_http.load:
# Depends: proxy
LoadModule proxy_http_module /usr/lib/apache2/modules/mod_proxy_http.so

proxy.load:
LoadModule proxy_module /usr/lib/apache2/modules/mod_proxy.so
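
Equivalently, on Debian the same thing can be done with a2enmod, which sets up those links for you (a sketch assuming the stock apache2 layout):

a2enmod proxy proxy_http
/etc/init.d/apache2 restart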


That's it.

66. Minor bug: evolution 3.2.2 crashes, google chrome sync going crazy on debian testing

Symptom:

  • Evolution 3.2.2 crashes every few minutes -- presumably while retrieving new mail every ten minutes
  • Evolution chews up 150% CPU and becomes unresponsive if you try to recover messages lost in the crash
  • If you sync your Chrome/Chromium browser with Google, you get a warning triangle saying that sync failed, and when you try to log in again the log-in window keeps disappearing

Logging Evolution using CAMEL_DEBUG=all evolution >&evo.log gives:
(evolution:8112): evolution-mail-CRITICAL **: e_mail_folder_uri_from_folder: assertion `CAMEL_IS_FOLDER (folder)' failed
**
GLib-GIO:ERROR:/tmp/buildd/glib2.0-2.30.2/./gio/gdbusmessage.c:1986:append_value_to_blob: assertion failed: (g_utf8_validate (v, -1, &end) && (end == v + len))
[imapx:F] adding command, fmt = 'IDLE'
[imapx:F] completing command buffer is [4] 'IDLE'
[imapx:F] Starting command (active=1, literal) F00104 IDLE
[imapx:F] camel_imapx_write: 'F00104 IDLE
'
[imapx:F] camel_imapx_read: buffer is '+ idling
'
[imapx:F] token '+'
[imapx:F] token TOKEN 'idling'
[imapx:F] token '
'
[imapx:F] Got continuation response for IDLE
[imapx:F] ** Starting next command
[imapx:F] * no, no jobs
Here's the log from another crash:

(evolution:19158): evolution-mail-CRITICAL **: e_mail_folder_uri_from_folder: assertion `CAMEL_IS_FOLDER (folder)' failed
(evolution:19158): GLib-CRITICAL **: g_hash_table_lookup: assertion `hash_table != NULL' failed
(evolution:19158): evolution-mail-CRITICAL **: e_mail_folder_uri_from_folder: assertion `CAMEL_IS_FOLDER (folder)' failed
**

GLib-GIO:ERROR:/tmp/buildd/glib2.0-2.30.2/./gio/gdbusmessage.c:1986:append_value_to_blob: assertion failed: (g_utf8_validate (v, -1, &end) && (end == v + len))
Followed by instant Evolution disappearance

If I start Evolution and disable all Gmail-related accounts, it stays stable. If I enable our university's Gmail-hosted account, it crashes.

Icedove/thunderbird is not crashing.

The whole thing seems to be a combination of weird Google stuff and Evolution behaviour.


Solution -- sort of:
I solved the Google Chrome issue by signing out/disabling sync, then re-enabling it. I also removed my Online Accounts in GNOME, then added them again.
I also installed libnss3-tools because of some errors Chrome was throwing up, but that's probably unrelated.
Evolution was still unhappy, though. I did the 'windows' thing and rebooted -- Evolution crashed after about twenty minutes. After starting Evolution again it ran without a hiccup for eight hours before I shut my system down.