When a target doesn’t have its APIs documented, it can be hard to approach. Rogue API documentation is a valuable tool for security researchers and web penetration testers, and it can help get around this challenge.
In this article, I’ll discuss how you can craft your own rogue API docs when none exist. I’ll demonstrate how to automate this process and show you how to weaponize the rogue docs directly in Postman in just a few minutes.
Why care about API documentation?
API documentation is important for a few reasons. First, it tells you what an API can do and how to use it. Second, it provides a great way to explore a new API without having to read the source code, which you may not have access to in the first place.
API documentation can include a properly defined structure for everything from endpoint capabilities to data types and actual object schemas. Technical content can include the request methods allowed, the parameters needed for a request, and any additional data that might be required in the request body.
And lastly, in some cases, the API documentation may include examples that you can use to create a client that can communicate with the API. In fact, some API documentation capabilities like those in the Swagger specification include implementation logic that will allow you to send requests directly to the APIs from the documentation.
We’ll talk about the Swagger specification later. But first, I want to explain what rogue API documentation is all about.
What is a rogue API doc?
A rogue API doc is simply documentation, created by you the attacker rather than the API publisher, for an API that has no official docs. It’s used to enumerate and discover what an API can do by generating its specification documentation from the HTTP traffic you save directly from a proxy or from the network tab in your browser.
In some cases, it can also be used to attack an API directly. We’ll get to that. For now, let’s talk about some of the benefits of rogue API docs.
Benefits of rogue API documentation
In a perfect world, we would always have a description of the APIs we are asked to attack. Developers would offer full API documentation using the OpenAPI specification (don’t worry, I’ll cover what that is in a bit), and include a description of the variables, objects, and endpoints needed to support code that integrates with the remote service.
But this is the real world. We’d be lucky if a developer even knew what OpenAPI documents are. Even though it takes only minimal effort to produce specification documentation from the code, in practice we need to take control ourselves and do the work of crafting the API documentation as we explore the server.
And that’s OK. There are benefits to this approach.
You get clear instructions for API operation
Let’s be honest. Keeping documentation up to date can be hard. As APIs change, a developer can easily forget to edit the details about an existing endpoint; they may forget to write new documentation or save it back to source control like GitHub.
It happens. Developers are human too.
So by generating your own rogue API docs, you always have the latest documentation based on what the API is actually DOING. You capture the actual URL in the requests, along with all its data, usually in XML or JSON formats. So even if you had specification documents, you are able to understand how the API REALLY works and can take note of any discrepancies.
Tracking the API lifecycle of a target
In the modern world of web applications, agile development makes it hard to keep up with the constant change in the code base. But as an attacker, it’s to your advantage to understand what endpoints are available in a target application, and how they change between releases. Rogue API docs give you this visibility so that you can quickly adapt your attack vectors on the fly as the service changes.
In many cases, with just a little extra work, you can also build out your own instructions on how the requests work. This gives you the ability to support “living” rogue documents, allowing you and your team to focus on where the platform needs the most testing.
So let’s get into what rogue API docs can look like. The best place to start is by discussing Swagger and the OpenAPI specification.
Swagger, and the OpenAPI specification
The OpenAPI specification, also known as the Swagger specification, is a document format that allows you to describe the capabilities of an API. It provides a way to define the resources that an API exposes, including the methods (or endpoints) and parameters that are available for each. Additionally, it can be used to build interactive documentation and client libraries.
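To give you a sense of the format, here is a minimal sketch of an OpenAPI 3.0 document (the API name, path, and parameter are made up for illustration):

```yaml
openapi: 3.0.0
info:
  title: Example API        # hypothetical API name
  version: 1.0.0
paths:
  /users/{id}:              # hypothetical endpoint
    get:
      summary: Fetch a single user
      parameters:
        - name: id
          in: path
          required: true
          schema:
            type: integer
      responses:
        "200":
          description: The requested user
```

Everything in this article builds toward producing a document shaped like this, only describing your target instead.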
The OpenAPI specification is a popular choice for describing modern APIs and is supported by a number of tools. For our purposes, we’ll be using the open-source tool Swagger Editor to display, search and edit our rogue API documentation once it’s created.
Swagger vs OpenAPI
Lately, there has been a lot of confusion between Swagger and OpenAPI. The easiest way to understand the difference is this:
- OpenAPI = Specification
- Swagger = Tools for implementing the OpenAPI specification
OpenAPI is the official name of the specification. It is backed by industry juggernauts like Microsoft, Google, and IBM.
Swagger is the name of the most well-known toolset used to generate OpenAPI documents. It’s currently owned by Smartbear Software, which is also a member of the OpenAPI initiative.
You will often find Swagger and OpenAPI used interchangeably. But that’s incorrect. You can use any tooling you want to create OpenAPI documents.
In fact, I am going to show you a tool called mitmproxy2swagger that can do just that.
How to generate rogue API docs
The best way to craft rogue API docs is by parsing live traffic to and from a web service. Usually, this can be done with a proxy. Great attack proxies like Burp Suite and ZAP can do this quite well. As can mitmproxy, better known as the man-in-the-middle proxy.
However, I am going to show you a different way. You can leverage the browser devtools to collect all this traffic and then save it to files for processing later.
This is an awesome way to do it when you are forced to collect traffic on a foreign target where you may not have the privileges to install additional tools like a proxy.
A good example of when to use this is during a penetration testing engagement where you have gotten a foothold on a user’s desktop in an internal corporate network.
You can use that user’s desktop browser to pivot and attack an internal web application, then exfiltrate the browser’s recorded API traffic back to you.
Then you can create your rogue API documentation offline.
Let me show you how.
Set up and prep
Besides using the browser’s devtools, you will need to install mitmproxy2swagger. You can download it directly from GitHub if you like. An easier way is to use Python’s package manager (aka pip) to install it:
pip install mitmproxy2swagger
If you are on Windows, I recommend you install the Windows Subsystem for Linux (WSL) and install Ubuntu or Kali. It’s just a much better experience than trying to get Python running natively. I’ll be demonstrating this using WSL on Windows 11.
But YMMV. Use what works for you.
Once you have it installed, we can start capturing some traffic!
Step 1: Set up your browser to capture traffic
Your browser’s devtools can record all the requests and responses your browser makes. This is shown within the Network tab of devtools.
To get to devtools (at least on Microsoft Edge on Windows 11), you can hit the shortcut
CTRL+SHIFT+I, or by going to
Menu → More Tools → Developer Tools.
One tip I recommend is you always start a new tab before loading devtools. This way, you have a clean slate with no traffic in the Network tab.
Once in devtools, make sure you select “Preserve log” and “Disable cache”. This will ensure you record all requests across page loads. Finally, ensure recording is on. It will all look something like this:
Keep devtools open, and go back to the tab. It’s time to collect some traffic!
Step 2: Use the application normally to collect traffic
Time to happy path the application. In other words, use the application like a normal user would. Try to execute every feature and function the application supports. From signup to sign out, password reset to profile update, you want to run as much application logic as you can. Don’t forget those little things, like downloading a report or updating an avatar.
If you can sign up for an account with admin privileges, all the better. It will allow you to explore more of the application’s capabilities and expose more potential API endpoints.
Once you are confident that you have figured out everything the application can do, we can start moving towards weaponizing all this data.
Step 3: Generate an HTTP Archive (HAR)
Once you have completed your happy path execution, go back into devtools. You now want to export all this traffic into an HTTP Archive file, more commonly called a HAR capture. You can do this using the export tool, which typically looks like a down arrow in devtools:
This will create a file with a .har extension. It’s basically everything devtools recorded, structured as JSON. And with that, we can start having some fun.
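Since a HAR file is just JSON, you can poke at it yourself before handing it to mitmproxy2swagger. Here is a minimal sketch in Python; the HAR content is stripped down and the crAPI-style URLs are made up for illustration (a real capture has the same log → entries → request shape, plus headers, cookies, timings, and response bodies):

```python
import json  # only needed when loading a real .har file

# Hypothetical, stripped-down HAR data for illustration
har = {
    "log": {
        "entries": [
            {"request": {"method": "GET",
                         "url": "http://crapi.apisec.ai/identity/api/v2/user/dashboard"}},
            {"request": {"method": "POST",
                         "url": "http://crapi.apisec.ai/identity/api/auth/login"}},
        ]
    }
}

# For a real capture: har = json.load(open("capture.har"))

# Pull out every method + URL the browser recorded
endpoints = [(e["request"]["method"], e["request"]["url"])
             for e in har["log"]["entries"]]

for method, url in endpoints:
    print(method, url)
```

A quick skim of this list is a nice sanity check that your happy path execution actually touched the endpoints you think it did.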
Step 4: Import HAR file to create OpenAPI definition
Now it’s time to leverage mitmproxy2swagger to convert the HAR into a rough OpenAPI definition file. We call this the first pass.
It’s not an actual API document yet; instead, the tool takes all the traffic and scaffolds a YAML file that we can edit. Here is an example command line you can use:
mitmproxy2swagger -i crapi.apisec.ai.har -o crapi.yaml -p http://crapi.apisec.ai -e -f har
Let me break the command line down for you:
- -i <filename>.har : The source input HAR file.
- -o <filename>.yaml : The output YAML file you want to generate. This will be the scaffolding you will be editing.
- -p http://target.domain : This is the API prefix. Typically this will be something like target.domain or target.domain/v1, etc.
- -e : This tells the tool you want to also provide example data in the documentation. You want to be careful here, as it will use the actual data you provided during your happy path execution. You can ignore this option if you don’t want to include that data in the documentation.
- -f har : This tells the tool the input file is a HAR capture, and not the typical mitmproxy flow file.
With the first pass complete, we can take a look at what is produced.
Step 5: Edit the definition files to describe API endpoints
Open the YAML definition file in your favorite editor and take a look.
Notice you have a ton of lines that start with “ignore:” under x-path-templates. That’s by design: the first pass marks every detected path as ignored. To include an endpoint in the documentation, remove the “ignore:” prefix from its line.
When you’re done, save the file. It might look something like this:
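As a sketch of that edit (the paths here are illustrative, not from a real capture), the idea is to turn entries like these:

```yaml
# First pass output: every detected path is marked as ignored
x-path-templates:
  - ignore:/identity/api/auth/login
  - ignore:/identity/api/v2/user/dashboard

# After your edit: drop the "ignore:" prefix for the endpoints
# you want included in the generated documentation
x-path-templates:
  - /identity/api/auth/login
  - /identity/api/v2/user/dashboard
```

Anything you leave prefixed with “ignore:” stays out of the final document, which is handy for filtering out static assets and third-party traffic.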
Now you’re ready to actually generate your rogue API documentation.
Step 6: Generate OpenAPI documentation from the definition
Now run the same command line to execute a second pass against the definition file:
mitmproxy2swagger -i crapi.apisec.ai.har -o crapi.yaml -p http://crapi.apisec.ai -e -f har
Open up the YAML file again. Look how much it has changed.
Notice the structure. It includes the endpoints, the methods used, the parameters and properties of objects, and even includes full examples if you included the -e option.
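For instance, a generated path entry might look roughly like this (the endpoint, field names, and example values are all hypothetical; your output will reflect your own captured traffic):

```yaml
paths:
  /identity/api/auth/login:          # hypothetical endpoint
    post:
      requestBody:
        content:
          application/json:
            schema:
              type: object
              properties:
                email:
                  type: string
                password:
                  type: string
            example:                 # only present if you passed -e
              email: test@example.com
              password: Password1!
      responses:
        '200':
          description: OK
```

Note how the request body schema and example were inferred purely from the traffic you captured.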
That is your API documentation that is following the OpenAPI specification!
But how compliant is it with the specs? Let’s find out.
Step 7: Test the OpenAPI document in Swagger Editor
Head over to https://editor.swagger.io. Either copy and paste the contents from your YAML file, or hit
File → Import File. The Swagger Editor will load your API doc and try to parse it. Depending on how thorough your happy path execution was, you may or may not see a few errors.
That’s ok. We can fix them in the editor later if we need to. But for now, look how you see a fully described API based on all the traffic you collected:
Expand one of the endpoints and take a look at how well defined it is:
You can even click the Try it out button and execute a web service request directly from the API docs.
As you can see, at this point we have a fully working rogue API document covering all the endpoints we have seen during our happy path execution. If you truly have explored every feature and function of the application you are testing, you will have everything you need to start attacking the API.
How to use your rogue API docs in Postman
So if you have read my beginner’s guide, you already know how to hook up Postman and Burp Suite together. With those tools wired together, you can now leverage your rogue API doc to start breaking the API.
You just need to import the doc into Postman as a new collection. Let’s do that.
Configure a new workspace in Postman
Start up Postman. Click on
Workspaces → Create Workspace and name it something unique that represents your engagement. I’ll call mine crAPI since I am demonstrating that here.
Import OpenAPI definition file
On the top left side of the workspace, beside the name you gave it, is an Import button. Click that. When prompted, use the explorer to find the YAML file you created. Then hit the Import button.
It might look something like this just before the import:
You’ve now imported your collection into Postman and can start firing off requests! Don’t forget to start up Burp if you followed my guidance on wiring Postman and Burp together.
Attack your target API
Now flip over to Burp, and send your request to Burp Repeater or Intruder.
Have at ‘er. Your rogue API docs are now weaponized and you can start searching for vulnerabilities!
With a little effort, you should now have your own custom API documentation that follows the OpenAPI specification. This allows you to use Postman and Burp Suite together to attack the target API with more precision, and lets you test endpoints that may or may not be known from any existing or stale documentation.
Remember, the key is to explore every feature and function of the application to record all the traffic needed in order to craft a complete picture of the API. Hope this helps. Good luck!
Want to learn more ways to hack APIs? Make sure you grab my Ultimate Guide to API Hacking Resources. It includes tons of resources you can find online that can help you improve your tradecraft.