Advent of Cyber 2022
This post will look through some of the different Advent of Cyber 2022 challenges on TryHackMe. You can find the challenges here: https://tryhackme.com/room/adventofcyber4
DAY 2:
Day 1 is not too difficult, so we will not be looking at it. Day 2 is all about logs. Let's look at how to complete this challenge.
First we need to start the machine; the credentials are lower down the page:
Starting off with an ls we can see the following:
Looks like we have 2 different files; let's use an ls -la to see if we can read those files:
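For completeness, the commands here are just the following, run from the directory we land in after logging in:

    ls
    ls -la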
We can. Seeing that there are 2 different logs, and that the webserver log is most likely for the web server, we next need to figure out when the logs were taken:
Looks like November 18th was a Friday. Next up: what is the IP address of the attacker?
We can use the information we already gathered above to find the attacker's IP address. After that: what is the name of the important list that was downloaded by the attacker? If the attacker was using Linux, they may have used wget to download the list. We can grep for santa (which finds the list easily), grep for list (which shows far too much information), or just grep for wget:
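A sketch of the grep, assuming the web server log is named webserver.log (match the name to whatever the ls above actually showed):

    grep wget webserver.log   # webserver.log is an assumed filename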
Lastly, what is the flag? This can be found in SSHD.log; again we can use grep to make our lives much easier:
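A sketch, assuming the flag follows the usual THM{...} format:

    grep -i thm SSHD.log   # search term assumes a THM{...} flag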
This room was an easy one that starts you off with looking at different files and learning how to use grep. Thanks for reading the writeup for day 2 of the Advent of Cyber 2022.
DAY 3:
After a couple of days we have some new challenges. Day 3 starts to look at beginner OSINT. The first question asks about a registrar, which we can find by utilizing whois on Kali:
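Since the site in question is santagift.shop (we will see it again in the GitHub repository below), the lookup is simply:

    whois santagift.shop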
Next we need to look at a GitHub repository that stored some of Santa's information. We could do some Google hacking; however, we can also make our lives very easy and head over to GitHub itself, which has a very good search function built in:
As you can see above, we searched for SantaGift and the first repository shows the source code of the santagift.shop website. Heading in there, we find that they are utilizing WordPress, which (I saw on the forums) did trip some people up. We are not looking for wp-config.php, however, but rather just config.php:
That file contains the rest of the answers for day 3.
DAY 4:
Day 4 utilizes Nmap and smbclient to find the information that is needed throughout the machine. To make the scan faster I utilize RustScan.
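A sketch of the scan; RustScan passes everything after the double dash straight through to Nmap:

    rustscan -a <victim IP> -- -sC -sV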
We can see that the machine has the following services open: SSH, HTTP and SMB. Let's take a look at SMB and see if we can find anything within it. We will try to list the different shares with smbclient -L \\\\<victim IP>\\
We can see some shares. However, when trying to get into them we are denied. At the bottom of the TryHackMe box we see that there is a username and password; let's try to log in to the admins share with those:
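Something like the following, substituting the credentials from the bottom of the task page:

    smbclient \\\\<victim IP>\\admins -U <username>   # username/password come from the task page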
Awesome, we can log in, and we can utilize dir to search within the directory. We can see that there are two different text files. Let's grab those files by utilizing the get command, as shown in the session sketch below:
Now we can cat those files to see what is listed inside them, and complete the day 4 challenge:
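A sketch of the whole session; the two filenames are placeholders for whatever dir actually lists:

    # placeholder filenames below; use what dir shows
    smb: \> dir
    smb: \> get <file1>.txt
    smb: \> get <file2>.txt
    smb: \> exit
    cat <file1>.txt <file2>.txt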
I almost forgot question 1: what is the web server that is running? We can utilize Nmap to see this:
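A version scan of the web port does the trick; -sV grabs the service banner:

    nmap -sV -p 80 <victim IP>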
There we go, now day 4 is done.
DAY 5:
Day 5 is all about brute forcing; we will utilize hydra to find a password for the VNC server. Since we are logging into a VNC server there may not be a username, and even the challenge itself does not list one. We will start off with an Nmap scan to make sure that VNC is at its default port of 5900:
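A quick sketch of the port check:

    nmap -p 5900 <victim IP>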
We found the VNC server; now let's try to log in to it with no username:
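A sketch of the hydra command, assuming rockyou.txt sits in the usual Kali path; VNC takes no username, so -P alone is enough:

    hydra -P /usr/share/wordlists/rockyou.txt -s 5900 vnc://<victim IP>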
As shown above, we are utilizing the rockyou.txt wordlist and attacking VNC at the victim's IP on port 5900.
After a minute or so we get a hit!
Now we can utilize vncviewer <victim IP>, put in the password we found, and we should have a connection back to the target machine:
DAY 6:
Day 6 is all about phishing and checking email attachments. For this machine you do need to utilize the split-view function located at the top of the page after starting your machine:
Now it is time to dive into the challenges. First we need to open a terminal, change directory into the Desktop, search for files within it, and lastly utilize Sublime Text to read the email:
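A sketch of those steps; the email filename is a placeholder for whatever sits on the Desktop, and subl is Sublime Text's command-line launcher:

    cd ~/Desktop
    ls -la
    subl <email file>.eml   # placeholder filename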
As shown above, we have a lot of the information that is needed for the tasks at hand. First we have who sent the email, who the email was to, the attachment of that email, the message ID (which is in base64), the spam score and the return path.
Now that we have answered quite a few questions, let's decode the message ID:
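A sketch; paste in the base64 string pulled from the message ID header above:

    echo '<base64 message ID>' | base64 -d   # placeholder for the actual header value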
Next we need to look at the email reputation, which we can do by going to https://emailrep.io:
Notice the blocked-out part is your next answer for the email reputation; also notice I utilized the chief.elf email address that we found earlier.
Now we need to download the attachment and get its sha256 hash. To do this we can utilize emlAnalyzer with the -i <file> flag and also --extract-all, as shown below:
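A sketch, with the email filename as a placeholder:

    emlAnalyzer -i <email file>.eml --extract-all   # placeholder filename
    ls -la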
Notice above, when we do an ls -la we see a new directory, eml_attachments. Going in there, we can see a file, and from there we can get the sha256 hash of that file:
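A sketch of pulling the hash, with the attachment name as a placeholder:

    cd eml_attachments
    sha256sum <attachment file>   # placeholder name; use what ls shows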
Now, putting that hash into VirusTotal, we get the following:
Lastly, we go to InQuest and see what the subcategory is:
DAY 7:
The walkthrough is within the task itself.
DAY 8:
Day 8 is all about blockchains and .sol files. This one is pretty quick and easy, so let's hop right in:
First things first, we need to download the task files, which gives us a zip file; when unzipped, we see two .sol files.
Now we need to head over to the following website:
From here we can upload the .sol files, add Ether to them and then withdraw Ether from them, thus giving us the flag:
Now we need to compile the EtherStore.sol file:
Now deploy:
After we have deposited 1 Ether we can then click on withdraw and get the flag:
Keep coming back for more, I plan on putting quite a few Advent of Cyber walkthroughs within this post.
DAY 14:
We are now looking at day 14. Where did the other days go, you ask? Well, the walkthroughs on the page do a very good job of walking you through everything, and I did not want to just rinse and repeat exactly what is happening in the walkthrough. For this one we are going to take a different approach: we will be using ffuf to fuzz the website and find other users/pictures on the site. Let's hop right in.
As usual, we will start off with an Nmap/RustScan scan:
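The same sketch as on the earlier days:

    rustscan -a <victim IP> -- -sC -sV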
Going to port 8080, we see a login page:
The task does give us a login; let's get in and see what we are working with:
We can see that it is calling for users/101.html. If we change this number to, say, 102, we should see a different page with different information on it. Depending on the application, we should not be able to do this, because then we would see other people's information that we should not be able to see. Instead of playing around with manual enumeration, let's use a quick bash one-liner that will generate the numbers for us, and then utilize ffuf:
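A minimal sketch of such a one-liner; seq is the simplest way to do it:

    seq 100 999 > numbers.txt   # one number per line, 100 through 999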
The above command makes a list of the numbers 100 through 999 and saves it to numbers.txt.
Now let's fuzz the web application and see if we get any other 200 responses for any of the other numbers:
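A sketch of the ffuf command, assuming the pages live under /users/ on port 8080 as seen in the URL above; -mc 200 keeps only the 200 responses:

    ffuf -u http://<victim IP>:8080/users/FUZZ.html -w numbers.txt -mc 200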
We can see above that users 101 through 107 exist. Now we can start to manually enumerate these; we could also do a wget to grab all of those pages at once. If we right-click on one of the images and go to open in new tab, we can see that close to the same URL exists, except this time we have an /images.png. Now we can fuzz the application again and look for the different numbers with different images.
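A sketch of the second fuzz; the /images/ path here is an assumption, so match it to whatever the open-in-new-tab URL actually shows:

    ffuf -u http://<victim IP>:8080/images/FUZZ.png -w numbers.txt -mc 200   # /images/ path is assumed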
Now let's say we wanted to put this all together and do it all at once… let's try that. Notice I made 3 files: pages, extensions and webpages. Also, make sure you pay attention to what is in each one:
Now we will fuzz different areas of the webpage at the same time:
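A sketch using ffuf's multi-wordlist keywords (-w wordlist:KEYWORD); the EXT keyword name is my own choice:

    # WEBPAGES, PAGES and EXT are keyword names tied to each wordlist
    ffuf -u http://<victim IP>:8080/WEBPAGES/PAGES.EXT -w webpages.txt:WEBPAGES -w pages.txt:PAGES -w extensions.txt:EXT -mc 200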
Notice above, we have called webpages.txt to fuzz WEBPAGES, pages.txt to fuzz PAGES, and lastly extensions.txt to fuzz the different extensions.
It took about 5 seconds to get the above. Now we know what exists for both the png and the html files.
Thanks for reading.