How do I use the browser through an ssh tunnel?

(Henry) #1

I set up an ssh tunnel to access the cluster inside an AWS VPC. I point my browser at localhost:7474/browser and log in with the correct creds, but I get a service unavailable error. I can curl the cluster with the same creds with no problem. What could be the issue?

ssh -N -L 7474:clusternodeip:7474 user@myec2instance -i mypemfile

(Tom) #2

Hello Henry,
You need to tunnel not only the http port but also the bolt port (7687). Add that and it should work.

Let me know if it does (or still doesn't).


(Henry) #3

Thanks Tom. I was able to connect using the browser; however, I had to use the public ip of the cluster node. Why does the public ip work and not the private ip? The cluster nodes have no access from outside the vpc, so I'm a little confused by this. How can I make the browser connect to the host using its private ip? In my ssh tunnel command, I'm using the private ip of the cluster node.

ssh -N -L 7474:clusternodePrivateip:7474 -L 7687:clusternodePrivateip:7687 user@myec2instance -i mypemfile

Below is the conf of the cluster node.

dbms.connectors.default_advertised_address=[private ip]
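
For reference, here is a minimal sketch of how that setting sits alongside the per-connector variants in neo4j.conf (assuming Neo4j 3.x-style key names; they may differ in other versions). The advertised address is what the server hands back to clients, including the browser, as the address to connect to:

```
# Address the server advertises to clients for all connectors,
# unless overridden per connector:
dbms.connectors.default_advertised_address=[private ip]

# Optional per-connector overrides (illustrative, not from this thread):
# dbms.connector.bolt.advertised_address=[private ip]:7687
# dbms.connector.http.advertised_address=[private ip]:7474
```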




(Tom) #4

Greetings Henry,

Let's establish the facts here ... you say you can connect to the public ip with a browser (Firefox, Chrome, whatever), correct? Well, if that works, I guarantee you it's because there are two ports exposed on the public ip, 7474 and 7687. That's easily confirmed too, just

nmap -p 7474 <public ip>
nmap -p 7687 <public ip>

and both ports will show they are open.

Next, you say that when you tunnel port 7474 from the private ip, you can browse to the instance but you cannot connect. Is that the correct situation? Again, easily confirmed: after running your ssh command, check

nmap -p 7474 localhost
nmap -p 7687 localhost

and you'll see that the second port is not open.
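
If nmap isn't installed on the laptop, the same check can be sketched in a few lines of Python (a minimal sketch; the hosts and ports are just the ones from this thread):

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# With the tunnel up, both forwarded ports should be open locally:
for port in (7474, 7687):
    print(port, "open" if port_open("localhost", port) else "closed")
```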

If the above is the situation (and that's what I understand from your initial request), this will solve your problem:
ssh -N -L 7474:clusternodeip:7474 -L 7687:clusternodeip:7687 user@myec2instance -i mypemfile

You can actually see that the browser needs both ports. The browser url obviously points to 7474; the connection itself, however, requires you to specify the bolt url.

Does that make sense ?


(Henry) #5

Hi Tom,

I did include ports 7474 & 7687 in the ssh tunnel command; I just forgot to list them in my previous post (I've updated it now). nmap shows:

Cluster Node (inside a vpc, no public access)

nmap -p 7474 <public ip>  
nmap -p 7687 <public ip>

Both show the host as down, which makes sense since the server shouldn't have public access

nmap -p 7474 <private ip>
nmap -p 7687 <private ip>

Both show the port as open

My laptop

nmap -p 7474 localhost
nmap -p 7687 localhost

Both show the port as open

When connecting to http://localhost:7474/browser on my laptop, I can connect successfully if I use the public ip of the cluster node, but I get an error if I use the private ip. How can I connect using the private ip?


(Tom) #6

It's a bit strange ... but shouldn't that bolt connection read bolt://localhost:7687 instead of the private ip? Obviously the private ip is not reachable as such from your laptop ...

You're also not browsing to the private ip but to http://localhost:7474, right?


(Henry) #7

@tom.geudens Correct. I'm browsing to http://localhost:7474. But if I set Host to bolt://localhost:7687, I get a websocket connection error.


(Henry) #8

Just an update. After bouncing the cluster core nodes, I was able to use bolt://localhost:7687 in the browser.