HAR Parser MCP Server

A Model Context Protocol (MCP) server for parsing and analyzing HTTP Archive (HAR) files. Extract insights from network traffic captures, debug API issues, and reproduce requests with ease.
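
A HAR file is a JSON document: a top-level log object holds an array of entries, each pairing a request with its response and timing data. A heavily trimmed, illustrative example of the structure these tools operate on (all values are placeholders, not from a real capture):

{
  "log": {
    "version": "1.2",
    "creator": { "name": "WebInspector", "version": "537.36" },
    "entries": [
      {
        "startedDateTime": "2024-01-01T12:00:00.000Z",
        "time": 123.4,
        "request": {
          "method": "GET",
          "url": "https://example.com/api/items",
          "headers": [{ "name": "Accept", "value": "application/json" }]
        },
        "response": {
          "status": 200,
          "statusText": "OK",
          "content": { "mimeType": "application/json", "size": 512 }
        },
        "timings": { "send": 0.1, "wait": 100.2, "receive": 23.1 }
      }
    ]
  }
}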

Features

  • Metadata Analysis - Get a health dashboard with request counts, status distributions, and error detection
  • Smart Search - Filter entries by URL patterns, resource types, status codes, or error conditions
  • Request Details - Deep-dive into individual requests with formatted bodies and timing breakdowns
  • cURL Generation - Convert any captured request to a cURL command for reproduction
  • Binary Extraction - Decode and save base64-encoded binary responses (images, PDFs, etc.)
  • Intelligent Caching - Parsed HAR files are cached with automatic invalidation on file changes

Installation

Using uvx (Recommended)

uvx --from git+https://github.com/c0dn/har-parser-mcp.git har-parser-mcp

Using pip

pip install git+https://github.com/c0dn/har-parser-mcp.git
har-parser-mcp

From Source

git clone https://github.com/c0dn/har-parser-mcp.git
cd har-parser-mcp
uv sync
uv run har-parser-mcp

Configuration

Claude Desktop

Add to your Claude Desktop configuration (~/.config/claude/claude_desktop_config.json on Linux, ~/Library/Application Support/Claude/claude_desktop_config.json on macOS):

{
  "mcpServers": {
    "har-parser": {
      "command": "uvx",
      "args": ["--from", "git+https://github.com/c0dn/har-parser-mcp.git", "har-parser-mcp"]
    }
  }
}

Claude Code

Add to your Claude Code MCP settings:

{
  "mcpServers": {
    "har-parser": {
      "command": "uvx",
      "args": ["--from", "git+https://github.com/c0dn/har-parser-mcp.git", "har-parser-mcp"]
    }
  }
}

Tools

read_har_metadata

Parse a HAR file and return a summary dashboard.

Parameters:
  file_path: string - Path to the .har file

Returns:
  - Total entries, pages, and duration
  - Response size statistics
  - Status code distribution (2xx, 3xx, 4xx, 5xx)
  - Network error and incomplete request counts

search_har_entries

Search and filter HAR entries with flexible criteria.

Parameters:
  file_path: string - Path to the .har file
  query: string (optional) - Regex pattern to match URLs or response bodies
  resource_type: string (optional) - Filter by type: xhr, document, stylesheet, image, font, script, media, websocket
  has_error: boolean (optional) - Filter to only 4xx/5xx/network errors
  status_code: integer (optional) - Filter by exact status code

Returns:
  - List of matching entries with index, method, URL, status, and timing
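
For example, to find failing XHR requests whose URLs match a pattern, a client would pass arguments like these (the path and pattern are placeholders):

{
  "file_path": "/path/to/session.har",
  "query": "api/v1/orders",
  "resource_type": "xhr",
  "has_error": true
}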

get_entry_details

Get complete details for a specific request by index.

Parameters:
  file_path: string - Path to the .har file
  entry_index: integer - Zero-based index of the entry

Returns:
  - Full request/response headers
  - Request/response bodies (auto-formatted for JSON)
  - Cookies sent and received
  - Detailed timing breakdown (DNS, connect, SSL, TTFB, download)
  - Server IP address
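
A typical follow-up to a search, reusing an index it reported (placeholder values):

{
  "file_path": "/path/to/session.har",
  "entry_index": 12
}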

generate_curl_command

Convert a HAR entry to a reproducible cURL command.

Parameters:
  file_path: string - Path to the .har file
  entry_index: integer - Zero-based index of the entry
  unsafe: boolean (optional) - Include sensitive headers (Authorization, Cookie, etc.)

Returns:
  - Ready-to-use cURL command
  - Warning if sensitive headers are exposed
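
An illustrative call that opts in to sensitive headers via the unsafe flag (placeholder values):

{
  "file_path": "/path/to/session.har",
  "entry_index": 12,
  "unsafe": true
}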

extract_binary_content

Decode base64-encoded binary responses and save to a temporary file.

Parameters:
  file_path: string - Path to the .har file
  entry_index: integer - Zero-based index of the entry

Returns:
  - Path to extracted file
  - MIME type and file size
  - Original URL

Use Cases

  • API Debugging - Analyze failed requests, inspect headers, and reproduce issues
  • Performance Analysis - Identify slow requests and analyze timing breakdowns
  • Security Audits - Review sensitive data in requests/responses
  • Documentation - Extract API patterns from recorded traffic
  • Testing - Generate cURL commands to reproduce specific scenarios

How to Capture HAR Files

Chrome/Edge

  1. Open DevTools (F12)
  2. Go to Network tab
  3. Perform the actions you want to capture
  4. Right-click in the network list → "Save all as HAR with content"

Firefox

  1. Open DevTools (F12)
  2. Go to Network tab
  3. Perform the actions you want to capture
  4. Click the gear icon → "Save All As HAR"

Safari

  1. Enable Developer menu (Preferences → Advanced)
  2. Open Web Inspector (Develop → Show Web Inspector)
  3. Go to Network tab
  4. Perform the actions you want to capture
  5. Export (File → Export HAR)

Development

# Install dependencies
uv sync

# Run tests
uv run pytest

# Type checking
uv run mypy src

# Linting
uv run ruff check src tests

License

MIT
