The first thing we noticed was the actual flag size, which is 33 bytes. The second thing was another mess in the decompilation, but that is probably not IDA's fault. GnuCOBOL seems to handle integers in a very odd way, probably for better compatibility with COBOL types(?).
So b_15_7199 = 13618 = 0x3532, which read little-endian is chr(0x32) + chr(0x35) = "25", and then byte_560AF9A5F2A2 = 48 = chr(48) = "0", which gives us the first entry of the flag_bytes array: 250.
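The arithmetic above can be checked with a few lines of Python. The variable names are taken from the decompilation, but the decoding logic is our reconstruction of what GnuCOBOL does, not its actual code:

```python
# Reconstruction (our assumption) of how the first flag byte is assembled:
# a 16-bit value holds two ASCII digits in little-endian order, and a
# separate byte supplies the third digit.
b_15_7199 = 13618            # 0x3532, from the decompilation
extra_byte = 48              # byte_560AF9A5F2A2, chr(48) == '0'

lo, hi = b_15_7199 & 0xFF, b_15_7199 >> 8
digits = chr(lo) + chr(hi)   # little-endian: chr(0x32) + chr(0x35) -> "25"
first_flag_byte = int(digits + chr(extra_byte))  # "25" + "0" -> 250
print(first_flag_byte)       # -> 250
```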

Rather than working this way to find the encrypted flag_bytes, we reviewed the test.cbl module's initialization dynamically :-

As our pseudocode is much more readable than the GnuCOBOL one... we can finally see that it simply reverses our input, multiplies each character by 2 and then compares the result with flag_bytes.
So we just need to divide the numbers we found in memory by 2 and then reverse them to get the flag.
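A minimal sketch of that recovery step. The real flag_bytes values were read from memory; the example below uses made-up placeholder values just to show the transform round-tripping:

```python
def recover(flag_bytes):
    # each input char was doubled and the input was reversed,
    # so halve every value and reverse the result
    return ''.join(chr(b // 2) for b in reversed(flag_bytes))

# fabricated example values encoding "abc" the same way the binary does:
example = [ord(c) * 2 for c in reversed("abc")]
print(recover(example))  # -> "abc"
```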

root

November 29th

Similarly to Gr8 Pictures 1, to find out how we can decrypt the hidden message from flag.png we will have to mess around with the service on gr8pics2.tuctf.com:5555. However, this one looks trickier! The received pictures were different from the flag.png picture. So, to better understand the encryption service, we decided to receive two pictures for further analysis (one with 50 A's and one with 50 B's) :-

Again we ran a binary diff, but this time between our two pictures (not flag.png), and came up with the following findings :-

We quickly notice that the first CRC of the PNG IDAT sections is where our hidden message lives. At first we thought we could again recover the XOR key by XORing 0x41 ('A') ^ 0x67, but that didn't work. So the next try was to run pngcheck on the image, since it was broken because of the CRCs :-

Let's focus on the first CRC error: “CRC error in chunk IDAT (computed 26649350, expected 67649350)”. The encryption service changed the first CRC byte to 0x67 when it should be 0x26. XORing those, chr(0x67 ^ 0x26), gives us the letter ‘A’! So we can recover the XOR key by correcting the CRC values. Let's build the key and cipher with the help of pngcheck and then write some python code to XOR them :-
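That correction step can be sketched in Python. The (computed, expected) pair below is the one from the pngcheck error quoted above; the actual script would feed in every CRC error that pngcheck reports the same way:

```python
# Hidden-message bytes live in the difference between the CRC the service
# wrote into the file and the CRC pngcheck computes for the chunk data.
pairs = [(0x26649350, 0x67649350)]   # (computed, expected) per IDAT chunk

message = ''
for computed, expected in pairs:
    # only the CRC bytes that differ carry a message character
    for shift in (24, 16, 8, 0):
        c = (computed >> shift) & 0xFF
        e = (expected >> shift) & 0xFF
        if c != e:
            message += chr(c ^ e)
print(message)  # -> "A" for the first chunk
```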

root

November 29th

To decrypt the hidden message from the flag.png picture we will need to find out how the service running on gr8pics.tuctf.com:4444 encrypts the provided text message within the picture. By messing around with the service we find out that we need to provide exactly 50 bytes to receive back a base64 string which contains the picture with our message encrypted. So let's provide 50 A's as the message and see how it goes :-

The received flag50As.png picture has the same size and looks the same as flag.png. So let's run a binary diff on these images :-

The results of the binary diff are clear enough: starting at offset 0x335371, with a step of 8 bytes, for 50 bytes, we have all the differences. But how can we decrypt the hidden message? This requires some guessing… We provided 50 A's (0x41) and at the first position we got 0x08; what if XORing those gives us the key to decrypt the encrypted message of any picture? Trying this, chr( (0x41 ^ 0x08) ^ 0x1d ) gives us the letter ‘T’ -the first letter of the flag- so let's write some python code to get the flag!
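That guess can be turned into a short sketch. The offsets come from the binary diff above; file loading is omitted and the helper names are our own:

```python
# Layout observed in the diff: one message byte every 8 bytes,
# starting at offset 0x335371, for 50 bytes.
START, STEP, LENGTH = 0x335371, 8, 50

def extract(data):
    # pull the 50 differing bytes out of a decoded PNG byte string
    return [data[START + i * STEP] for i in range(LENGTH)]

def decrypt(flag_png, probe_png):
    # per-position key recovered from the 50-A probe picture
    key = [b ^ 0x41 for b in extract(probe_png)]
    return ''.join(chr(c ^ k) for c, k in zip(extract(flag_png), key))

# sanity check with the first bytes from the write-up:
assert chr(0x1d ^ (0x08 ^ 0x41)) == 'T'
```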

root

November 29th

WebSurgery is a suite of tools for security testing of web applications. It was designed for security auditors to help them with web application planning and exploitation.

It currently contains a spectrum of efficient, fast and stable tools, such as a Web Crawler with an embedded File/Dir Brute Forcer, a Fuzzer (for advanced exploitation of known and unusual vulnerabilities such as SQL injection and cross-site scripting (XSS)), a Brute Forcer (for login forms, identification of firewall-filtered rules and DoS attacks) and a Web Proxy (to analyze, intercept and manipulate the traffic between your browser and the target web application).

Web Crawler

The Web Crawler is designed to be fast, accurate, stable and completely parameterized, using advanced techniques to extract links from JavaScript and HTML tags. It works with parameterized timing settings (Timeout, Threading, Max Data Size, Retries) and a number of rule parameters to prevent infinite loops and pointless scanning (Case Sensitive, Dir Depth, Process Above/Below, Submit Forms, Fetch Indexes/Sitemaps, Max Requests per File/Script Parameters).

It is also possible to apply custom headers (user agent, cookies etc.) and Include/Exclude filters. For example, by default the crawler will scan only the initial web service (the URL at the specific port); however, you could change the initial filter “^($protocol)://($hostport)/” to “^(http|https)://[^/]*\.test.com” to cover the whole site of a specific domain using regular expressions (.NET flavour), e.g. http://test.com, https://test.com, http://www.test.com, https://something.test.com:9443 etc.
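As a quick illustration (plain Python `re` here, not WebSurgery itself, though the pattern behaves the same in .NET regex), the widened filter matches sub-domain URLs like so:

```python
import re

# the broadened include filter from the example above
domain_filter = re.compile(r"^(http|https)://[^/]*\.test\.com")

urls = [
    "http://www.test.com/index.html",
    "https://something.test.com:9443/login",
    "http://othersite.com/",          # out of scope, should not match
]
in_scope = [u for u in urls if domain_filter.match(u)]
print(in_scope)  # -> the two test.com sub-domain URLs
```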

The Web Crawler also includes an embedded File/Dir Brute Forcer, which helps to brute-force files/dirs directly in the directories found from crawling.

By default, it will brute-force recursively from the root/base dir for both files and directories. It sends both HEAD and GET requests when needed (HEAD to identify whether the file/dir exists, then GET to retrieve the full response).

Web Fuzzer

The Web Fuzzer is a highly advanced tool for creating large numbers of requests based on one initial request. The Fuzzer has no limits: it can be used to exploit known vulnerabilities such as (blind) SQL injection, as well as in more uncommon ways, such as identifying improper input handling and firewall/filtering rules.

Web Editor

The Web Editor is used to send individual requests. It also contains a HEX editor for more advanced requests.

Web Proxy

The Web Proxy is a proxy server running locally that will allow you to analyze, intercept and manipulate HTTP/HTTPS requests coming from your browser or any other application that supports proxies.

root

September 8th

One of my old tools, which helps with the initial steps of information gathering. Basically, it works with dig, whois and nmap scan results. Unfortunately, it's not really user-friendly and not documented. I've already coded the basic structure of a new information-gathering tool, but it still needs a lot of work.

The following security solutions will most probably work across different service versions and different Linux distributions. You will just need to find some more details for your system, such as different paths for configuration files etc.

root

July 7th

This paper will try to cover the most important steps to properly secure and harden your Linux web server.

Part I – Theory

Part II – Practice

In the first part, we will discuss what we need to do in theory to secure our Linux web server, without explaining exactly how to do it in practice. In this way, we avoid the confusion of different configurations for different systems, Linux distributions and services.

That's a very basic configuration for a Linux web server; however, your case may be slightly different. For example, you will not need to run SSH if you have local access, or maybe you need to run more services, such as a mail server.

- What kind of attacks do we need to prevent for a server like this one?