Docker can't resolve names when Apache ProxyPass set to localhost

I have managed to find answers for everything else so far, but have failed to find the answer to this one.

I have been installing 1.8.9 in Docker on a VPS (from Eukhost if it matters), to take over from a 1.7.9 installation which is now getting on a bit. It’s just doing a welcome screen in my wife’s primary school, so not a big system.

So far with the help of the instructions and many many forum pages I have:

  • Installed Docker, with the CMS listening on port 8060
  • Migrated the 1.7.9 installation to 1.8.9
  • Got 1.8.9 running fine over http://www.theaddress.co.uk:8060
  • Got Apache ProxyPass running under a subdomain (administered by Plesk on the VPS), so that it’s responding to http://xibonew.theaddress.co.uk
  • Blocked that port in the firewall so it’s only accessible through the subdomain.
  • Discovered LetsEncrypt, and (again via Plesk) got a certificate installed for the subdomain.
  • Got https working so that it’s all nice and secure (and a lot more secure than it was before), so it’s responding to https://xibonew.theaddress.co.uk
  • That was OK, but the ‘RequiredFiles.xml’ had the wrong paths in it (they contained the internal port). In order for requiredfiles.xml to be created properly, I had to point the internal part of the ProxyPass at localhost, so the Apache & nginx settings (again via Plesk) are:

Additional directives for HTTP:

ProxyPreserveHost On
ProxyPass / http://localhost:8060/
ProxyPassReverse / http://localhost:8060/

Additional directives for HTTPS:

ProxyPreserveHost On
RequestHeader set X-Forwarded-Proto "https"
ProxyPass / http://localhost:8060/
ProxyPassReverse / http://localhost:8060/

and then

Additional nginx directives:

if ($scheme != "https") {
	rewrite ^ https://$host$uri permanent;
}

That is all responding properly to the CMS in the browser, and the player is now downloading the correct files (apart from emoji.css & emoji.png but I think that’s unrelated)

However, the Twitter module is a big part of what this screen is used for, and now it (more specifically cURL) can’t resolve the Twitter API hostname, so I’m getting this error:

Twitter API returned cURL error 6: Could not resolve host: api.twitter.com (see http://curl.haxx.se/libcurl/c/libcurl-errors.html) status. Unable to proceed.

I have logged into the Docker container, and it is indeed unable to resolve the name.

Having read a bit about it (I’ve spent the last two days reading lots of things I don’t really understand), it seems that Docker can get a bit odd about DNS when the host’s resolver points at localhost, although the Docker people think that’s fixed. Even when I told nslookup to use 8.8.4.4 directly, it still didn’t come back with an IP for www.bbc.co.uk.
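In case it helps anyone diagnose the same thing, these are roughly the checks I ran from the host. The container name is just an example (`docker ps` will show yours), and nslookup may not be present in every image:

```shell
# See which resolvers the container inherited from the host
docker exec -it xibo_cms_1 cat /etc/resolv.conf

# Try normal resolution, then force Google's public DNS directly
docker exec -it xibo_cms_1 nslookup www.bbc.co.uk
docker exec -it xibo_cms_1 nslookup www.bbc.co.uk 8.8.4.4
```

If the first nslookup fails but the second works, the container just has bad resolvers; if both fail, something (the VPS firewall or the provider) is blocking outbound DNS.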

It is fair to say that it might be something else that has broken the Twitter retrieval, but while I was still trying to get requiredfiles.xml onto the player I couldn’t tell.

It seems to me that if I can get the Twitter module to grab info from the api then it will then all be working fine. Fingers crossed anyway…

All suggestions gratefully received. If there is anything useful I can provide in terms of diagnostic info please let me know.

Many thanks, Dave

Thank you for all of the information regarding your setup. Your Apache and nginx settings look fine. Docker has its own DNS settings, which you can configure to tell the containers which DNS server they should use to resolve hostnames.

If your provider is blocking Google public DNS, then they should be able to advise you of the correct resolvers to use. Once you have those, you need to configure Docker to use those resolvers:

https://development.robinwinslow.uk/2016/06/23/fix-docker-networking-dns/
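As a sketch of that change, assuming Google’s resolvers are reachable from your VPS (substitute your provider’s resolvers if they are not, and merge rather than overwrite if you already have a daemon.json):

```shell
# Tell the Docker daemon which DNS servers its containers should use.
# The IPs below are Google's public resolvers - replace them with the
# ones your provider advises if Google DNS is blocked.
sudo tee /etc/docker/daemon.json <<'EOF'
{
    "dns": ["8.8.8.8", "8.8.4.4"]
}
EOF

# Restart Docker (and therefore the containers) so the setting takes effect
sudo systemctl restart docker
```

After the restart, `cat /etc/resolv.conf` inside the container should show the new resolvers.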

For the issue you are having with the emoji files, if you open the Modules page in your CMS, you will see a Verify all files option. Selecting this may also help resolve that problem.