Tuesday, November 5, 2013

Patching an Android Application to Bypass Custom Certificate Validation

One of the important tasks while performing mobile application security assessments is intercepting the traffic (Man in The Middle, MiTM) between the mobile application and the server with a web proxy like Fiddler or Burp. This allows the penetration tester to observe application behavior, modify the traffic, and overcome the input restrictions enforced by the application’s user interface to perform a holistic penetration test.
Mobile applications exchanging sensitive data typically use the HTTPS protocol, as it allows them to authenticate the server and establish a secure communication channel. The client authenticates the server by verifying the server’s certificate against its trusted root certificate authority (CA) store and by checking the certificate’s common name against the domain name of the server presenting it. To perform MiTM on a mobile application’s HTTPS traffic, the web proxy’s certificate is imported into the device’s trusted root CA store; otherwise the application may refuse to connect due to certificate errors.
On a recent Android application assessment, I set up a web proxy to intercept the mobile application’s SSL traffic by importing its certificate into the device’s trusted root CA store. To ensure that the imported CA certificate worked fine, I used Android’s browser to visit a couple of SSL-based websites, and the browser accepted the MiTM’ed traffic without complaint. Typically, native Android applications also use the common trusted root CA store to validate server certificates, so no extra work is required to intercept their traffic. However, the application I was testing was different, as we will see below.


Analyzing the Unsuccessful MiTM
When I launched the application and attempted to pass its traffic through the web proxy, it displayed an error screen indicating that it could not connect to the remote server, either because there was no internet connection or because the connection failed for unknown reasons. Things were not adding up, as this configuration had worked in the past, so I turned to analyzing system logs and SSL cipher suite support.
Logcat
Logcat is Android’s logging mechanism and is used to view application debug messages and logs. I ran adb logcat to check whether the application under test produced any stack trace indicating the cause of the error, but there was none. The application also did not leave any debug logs, indicating that the developers did a good job with error handling and did not write debug messages that could expose the application’s internal workings to prying eyes.
Common SSL Cipher suites
When a web proxy acts as a MiTM between the client and the server, it establishes two SSL communication channels. One channel is with the client, to receive requests and return responses; the second channel is with the server, to forward application requests and receive responses. To establish these channels, the web proxy has to agree on common SSL cipher suites with both the client and the server, and the two sets of cipher suites may not be the same, as shown in the image below.


In the past, I have observed SSL proxying errors in one or both of the following scenarios:
  1. Android application and the web proxy do not share any common SSL cipher suite.
  2. The web proxy and the server do not share any common SSL cipher suite.
In either scenario, the communication channel cannot be established and the application does not work. To investigate, I fired up Wireshark to analyze the SSL handshake between the application and the web proxy, and discovered that they did share common SSL cipher suites.
With the first scenario ruled out, I issued an HTTPS request to the server through the web proxy, and it completed without errors, indicating that the web proxy and the server also shared common SSL cipher suites.
So the web proxy was capable of performing MiTM for the test application, and something else was going on under the hood.
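As an aside, one quick way to cross-check the client side of the first scenario without Wireshark is to ask the default SSL socket factory which cipher suites it enables. The minimal sketch below assumes a stock Java runtime; the defaults on the Android device itself may differ.

import javax.net.ssl.HttpsURLConnection;
import javax.net.ssl.SSLSocketFactory;

// Prints the cipher suites enabled by default on this runtime's SSL stack.
// Comparing this list against the proxy's supported suites helps rule out
// a cipher suite mismatch as the cause of a failed handshake.
public class ListClientCipherSuites {
    public static void main(String[] args) {
        SSLSocketFactory factory = HttpsURLConnection.getDefaultSSLSocketFactory();
        for (String suite : factory.getDefaultCipherSuites()) {
            System.out.println(suite);
        }
    }
}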


Custom Certificate Validation
It was at this point that I started to look into the possibility of the application performing custom certificate validation to prevent MiTM monitoring or modification of its traffic. HTTPS clients can perform custom certificate validation by implementing the X509TrustManager interface and then using it for their HTTPS connections. The process of creating HTTPS connections with custom certificate validation is summarized below, with a minimal sketch after the list:
  1. Implement methods of the X509TrustManager interface as required. The server certificate validation code will live inside the checkServerTrusted method. This method will throw an exception if the certificate validation fails or will return void otherwise.
  2. Obtain an SSLContext instance.
  3. Create an instance of the X509TrustManager implementation and use it to initialize the SSLContext.
  4. Obtain an SSLSocketFactory from the SSLContext instance.
  5. Provide the SSLSocketFactory instance to the setSSLSocketFactory method of the HttpsURLConnection.
  6. The HttpsURLConnection instance will then communicate with the server and invoke the checkServerTrusted method to perform custom server certificate validation.
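A minimal sketch of this wiring is shown below. The URL and the body of checkServerTrusted are placeholders; the actual validation logic (certificate pinning, custom chain checks and so on) is application specific.

import java.net.URL;
import java.security.cert.CertificateException;
import java.security.cert.X509Certificate;
import javax.net.ssl.HttpsURLConnection;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManager;
import javax.net.ssl.X509TrustManager;

public class CustomTrustExample {

    // Step 1: implement X509TrustManager; server validation lives in checkServerTrusted.
    static class CustomTrustManager implements X509TrustManager {
        @Override
        public void checkServerTrusted(X509Certificate[] chain, String authType)
                throws CertificateException {
            // Application-specific checks go here; throw CertificateException on
            // failure, return normally on success.
        }

        @Override
        public void checkClientTrusted(X509Certificate[] chain, String authType)
                throws CertificateException {
            // Not used by client-side code.
        }

        @Override
        public X509Certificate[] getAcceptedIssuers() {
            return new X509Certificate[0];
        }
    }

    public static void main(String[] args) throws Exception {
        // Steps 2-3: obtain an SSLContext and initialize it with the custom trust manager.
        SSLContext sslContext = SSLContext.getInstance("TLS");
        sslContext.init(null, new TrustManager[] { new CustomTrustManager() }, null);

        // Steps 4-5: hand the resulting SSLSocketFactory to the HttpsURLConnection.
        HttpsURLConnection conn =
                (HttpsURLConnection) new URL("https://example.com/").openConnection();
        conn.setSSLSocketFactory(sslContext.getSocketFactory());

        // Step 6: the connection now calls checkServerTrusted during the handshake.
        System.out.println("HTTP status: " + conn.getResponseCode());
    }
}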
Searching the decompiled code revealed an X509TrustManager implementation in one of the core security classes of the application. The next step was to patch the code preventing the MiTM and deploy the modified application for testing. The image below shows the two methods implemented for X509TrustManager.


Patching the checkServerTrusted Implementation
The image above shows the implementation of two X509TrustManager methods, checkServerTrusted and checkClientTrusted. It is important to point out that both methods behave in a similar way, except that the former is used by client-side code and the latter by server-side code. If certificate validation fails, they throw an exception; otherwise they return void.
The checkClientTrusted implementation allows server-side code to validate a client certificate. Since this functionality is not required inside the mobile application, this method was empty and simply returned void in the test application, which is equivalent to successful validation. However, checkServerTrusted contained a significant chunk of code performing the custom certificate validation that I needed to bypass.
To bypass the certificate validation code inside the checkServerTrusted method, I replaced its Dalvik code with the code from the checkClientTrusted method so that it simply returns void, effectively removing the custom certificate check, as shown in the image below; a Java-level equivalent of the patched method is sketched after the image.




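For clarity, the patched Dalvik method is functionally equivalent to the following Java; the class name here is illustrative rather than the application's obfuscated one.

import java.security.cert.X509Certificate;
import javax.net.ssl.X509TrustManager;

// Java-level equivalent of the patched Dalvik code: checkServerTrusted no longer
// performs any validation, so every server certificate chain is accepted.
public class PatchedTrustManager implements X509TrustManager {
    @Override
    public void checkServerTrusted(X509Certificate[] chain, String authType) {
        // Patched: returns void unconditionally, i.e. validation always "succeeds".
    }

    @Override
    public void checkClientTrusted(X509Certificate[] chain, String authType) {
        // Unchanged: already empty in the original application.
    }

    @Override
    public X509Certificate[] getAcceptedIssuers() {
        return new X509Certificate[0];
    }
}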
Recompiling and Deploying the Modified Application
Confident that all checkServerTrusted invocations from this point onwards were going to succeed, I recompiled the application with ApkTool, signed it with SignApk, and deployed it on the device. The web proxy MiTM worked like a charm, and I was able to view, modify and fuzz the application’s traffic.

Tuesday, October 29, 2013

Debugging Out a Client Certificate from an Android Process

I had set up my web proxy to intercept the Android application’s traffic and tested the proxy configuration with other HTTPS-based Android applications, and the traffic interception worked like a charm. However, for the application under test, things were different. Connections to the application’s server returned an HTTP 403 error code because SSL mutual authentication was enforced and I did not have the client certificate. The image below shows the error message.


I was in a situation where no meaningful communication could be established with the remote server. The resource files obtained by decompiling the application did not contain the client certificate, and it was clear that it was stored somewhere in the obfuscated code.
The RSAPrivateCrtKey and two certificates had already been extracted from the application’s memory, as discussed in the previous blog post. As it turned out, those were not sufficient: I still needed the client certificate and the corresponding password to be able to connect to the server and test the server-side code. This blog post describes how they were retrieved by debugging the application.
Identifying the Code Using the Client Certificate
Knowledge of how Java clients use SSL certificates to support client authentication proved critical during this assessment and helped me identify the function calls to look for during the debugging process. The typical steps followed to load a client certificate for an HttpsURLConnection are summarized below, with a minimal sketch following the list:
  1. Create instances of following classes:
    1. HttpsURLConnection – to communicate with the remote server
    2. KeyStore – to hold client certificate
    3. KeyManagerFactory – to hold KeyStore
    4. SSLContext – to hold KeyManager
  2. Create a File instance for the client certificate and wrap it in an InputStream
  3. Invoke the KeyStore instance’s load method with the InputStream from step 2 and the certificate password as a char[], so that the KeyStore contains the client certificate
  4. Feed the KeyManagerFactory instance with the KeyStore from step 3 and the certificate password by invoking its init method
  5. Obtain the KeyManager[] array from the KeyManagerFactory created above
  6. Invoke the SSLContext instance’s init method and feed it the KeyManager[] from step 5
  7. Obtain an SSLSocketFactory from the created SSLContext and set up the HttpsURLConnection instance to use it for all SSL communication.
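A minimal sketch of these steps is shown below. The certificate path, password and the PKCS12/TLS algorithm names are placeholder assumptions; the application under test obtained the equivalent bytes and password from obfuscated code rather than from a file, as described later.

import java.io.FileInputStream;
import java.io.InputStream;
import java.net.URL;
import java.security.KeyStore;
import javax.net.ssl.HttpsURLConnection;
import javax.net.ssl.KeyManagerFactory;
import javax.net.ssl.SSLContext;

public class ClientCertExample {
    public static void main(String[] args) throws Exception {
        char[] password = "changeit".toCharArray();   // placeholder password

        // Steps 2-3: wrap the certificate file in an InputStream and load it into a KeyStore.
        KeyStore keyStore = KeyStore.getInstance("PKCS12");
        try (InputStream in = new FileInputStream("client-cert.pfx")) {
            keyStore.load(in, password);
        }

        // Step 4: initialize the KeyManagerFactory with the KeyStore and password.
        KeyManagerFactory kmf =
                KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm());
        kmf.init(keyStore, password);

        // Steps 5-6: feed the KeyManager[] to the SSLContext.
        SSLContext sslContext = SSLContext.getInstance("TLS");
        sslContext.init(kmf.getKeyManagers(), null, null);

        // Step 7: use the resulting SSLSocketFactory for the HttpsURLConnection.
        HttpsURLConnection conn =
                (HttpsURLConnection) new URL("https://example.com/").openConnection();
        conn.setSSLSocketFactory(sslContext.getSocketFactory());
        System.out.println("HTTP status: " + conn.getResponseCode());
    }
}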
The following image depicts the steps discussed:



Instantiating a KeyStore and loading the client certificate from an InputStream are central to SSL client authentication support. So I searched the decompiled code for usages of the KeyStore class and its corresponding instance variables, and identified the classes and methods that were potentially configuring the client-side SSL certificate for HttpsURLConnection.
Locating the Debug Points
I continued to eliminate KeyStore usages until I identified the class and method I was interested in. The identified method did not refer to any resource files to get the client certificate and its password; instead it relied on a couple of function calls to get the byte[] representation of the client certificate and the String representation of the password before feeding them to the load method of the KeyStore instance. Following the code paths led me to the two magic strings I was looking for. They appeared to be Base64 encoded values of the client certificate and the corresponding password.
Base64 decoding them returned gibberish that could not be put to any practical use, as there was more to the encoded values than plain Base64 encoding. Further analysis revealed that they had been run through standard crypto algorithms whose Initialization Vectors and Encryption Keys were supplied by other Java classes. Additionally, the application used some custom data manipulation tricks to further obfuscate them.
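To give a feel for what this layering typically looks like, a decode routine of this kind would amount to something like the sketch below. The algorithm, key and IV handling shown here are purely illustrative assumptions; the target application's actual scheme and its custom manipulation steps were different.

import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

// Hypothetical illustration only: Base64-decode an obfuscated value, then decrypt it
// with a standard algorithm (AES/CBC here) using a key and IV pulled from elsewhere
// in the code. The real application added further custom transformations on top.
public class DeobfuscateExample {
    static byte[] deobfuscate(String encoded, byte[] key, byte[] iv) throws Exception {
        byte[] cipherText = Base64.getDecoder().decode(encoded);
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.DECRYPT_MODE,
                new SecretKeySpec(key, "AES"), new IvParameterSpec(iv));
        return cipher.doFinal(cipherText);
    }
}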
With limited time at hand I decided to briefly shelve the code analysis and move to application debugging to inspect the exact code points of interest for data extraction. To help with the debugging process, I noted down the class name, method name, and instance variable of interest where the potential client certificate and password were fed to the KeyStore instance.
Setting up the Application for Debugging
Reviewing the AndroidManifest.xml of the decompiled application indicated that the application was not compiled with the debug flag and hence could not be debugged on a device. So I added the debug flag, recompiled the application, signed it, and installed it on the device. The following steps summarize the process of creating a debuggable version of an existing Android application if you plan to debug it on an actual device.
  1. Decompile the application with apktool
  2. Add android:debuggable="true" attribute to the application element in the AndroidManifest.xml
  3. Recompile the application with apktool
  4. Sign the application with SignApk
  5. Install the application
The image below shows the debuggable attribute added to the AndroidManifest.xml file of the target application.


If you are using an emulator, you can extract the application from the device, install it on the emulator and attach a debugger without decompiling or adding the debuggable attribute to the AndroidManifest.xml file.
Let us now look at some of the important pieces of the debugging setup that was used.
Java Debug Wire Protocol (JDWP)
The Java Debug Wire Protocol is the protocol used for communication between a JDWP-compliant debugger and the Java Virtual Machine. The Dalvik Virtual Machine, which is responsible for running applications on Android devices, supports JDWP as its debugging protocol. Each application running in a Dalvik VM exposes a unique port to which JDWP-compliant debuggers can attach in order to debug the application.
Once the application was installed on the device in debug mode, the next step was to attach a JDWP-compliant debugger, such as jdb, and get going.
jdb – The Java Debugger
jdb is a JDWP-compatible command-line debugger that ships with the Java JDK, and I use it for its command-line goodness. The typical process of attaching jdb to an Android application is summarized below:
  1. Launch the application that you want to debug
  2. Obtain its process ID
  3. Use adb to forward a local port to the application’s JDWP port
  4. Attach jdb to the application
  5. Set breakpoints and debug the application
The following resources can get you started on jdb debugging with Android.


Debugging for the Client Certificate
At this point, I knew the exact locations where breakpoints were needed to obtain the client certificate and the corresponding password. I set the breakpoints in the functions that invoked the load method of the KeyStore instance storing the client certificate, launched the application, and then browsed to the functionality that would exercise the code paths leading to the breakpoints.
After hitting the breakpoint, I used jdb’s dump command to inspect the instance variable and invoked its methods to retrieve the important information. The instance variable of interest was of class g. The Java class under analysis retrieved the client certificate and its password with the following calls before feeding them to the load method (a reconstruction of the resulting call follows the list):
  1. It called the method b() on its instance variable “g” to obtain the certificate password and converted it to a char[]
  2. It called the method a() on its instance variable “g” to obtain the byte[] representation of the client certificate and wrapped it in a ByteArrayInputStream.
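In Java terms, the call that feeds the KeyStore therefore looked roughly like the reconstruction below; the CertificateProvider interface is hypothetical scaffolding standing in for the obfuscated class of the instance variable “g”, and the PKCS12 keystore type is an assumption.

import java.io.ByteArrayInputStream;
import java.security.KeyStore;

// Reconstruction of the call observed in the decompiled code: "g" exposes a() for
// the raw client certificate bytes and b() for the certificate password.
public class KeyStoreLoadReconstruction {

    interface CertificateProvider {
        byte[] a();   // returns the raw client certificate bytes
        String b();   // returns the certificate password
    }

    static KeyStore loadClientKeyStore(CertificateProvider g) throws Exception {
        KeyStore keyStore = KeyStore.getInstance("PKCS12");
        keyStore.load(new ByteArrayInputStream(g.a()), g.b().toCharArray());
        return keyStore;
    }
}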
The following screenshot shows the rundown leading up to the client certificate and the password.


After obtaining the byte[] dump of the client certificate, I created the pfx file with the following Java code and then imported it into my browser’s certificate store and into the web proxy.
import java.io.FileOutputStream;
import java.io.IOException;

public class PfxCreatorFromByteArray {
    public static void main(String... args) throws IOException {
        // Contains the byte[] for the client certificate dumped via jdb
        byte[] pfx = {48, -126, <more bytes here>};
        FileOutputStream fos = new FileOutputStream("client-cert.pfx");
        fos.write(pfx);
        fos.close();
    }
}



The following image shows successful client certificate import.

The imported client certificate then allowed me to successfully engage and assess the server-side portion of the application. In addition to the client certificate, combining static and dynamic analysis techniques also allowed me to retrieve other sensitive information from the application, such as Initialization Vectors and Encryption Keys.