Sometimes the best solution is the one that doesn’t reinvent the wheel
I just released `rq`, an HTTP client that might be different from what you’re used to. Instead of building another GUI tool or inventing new syntax, I went the opposite direction: what if we just used HTTP syntax as-is?
## The Problem: Too Many Abstractions
Let’s be honest about existing HTTP clients:
**Postman** – Great features, but it’s a heavy GUI that wants to own your entire workflow. Plus, good luck using it over SSH or in a CI pipeline.
**Bruno** – File-based approach (love it!), but why the custom syntax?
```
meta {
  name: Login
  type: http
}

post {
  url: {{host}}/login
  body: json
}
```
**httpYac** – Uses standard `.http` files (perfect!), but no workspace organization.
**HTTPie** – Beautiful CLI syntax, but no persistence or project structure.
Each tool solves some problems while creating new ones. But here’s the thing: HTTP syntax is already perfect. It’s been around for decades, it’s human-readable, and every developer knows it.
So why are we reinventing it?
## My Solution: Raw HTTP + Smart Organization
`rq` uses actual HTTP syntax with workspace organization:
```http
POST {{BASE_URL}}/api/login HTTP/1.1
Content-Type: application/json
Authorization: Bearer {{API_TOKEN}}

{
  "username": "{{USERNAME}}",
  "password": "{{PASSWORD}}"
}
```
That’s it. No custom syntax to learn. No abstractions. Just HTTP.
## How It Works

### 1. Docks (Workspaces)
Think of docks like project folders for your API requests:
```sh
rq dock init my-api   # Creates workspace
cd my-api/
```
### 2. Raw HTTP Files

Write requests in standard HTTP syntax:

```sh
rq new login   # Creates login.http
```
Edit `login.http`:
```http
POST {{BASE_URL}}/auth/login HTTP/1.1
Content-Type: application/json

{"email": "{{EMAIL}}", "password": "{{PASSWORD}}"}
```
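To see how little machinery the format needs, here is an illustrative sketch of a minimal `.http` reader (this is not rq’s actual parser, just the idea): a request line, headers, a blank line, then the body.

```python
# Illustrative sketch only -- not rq's real parser.
def parse_http_file(text):
    head, _, body = text.partition("\n\n")       # blank line splits headers/body
    lines = head.splitlines()
    method, url, version = lines[0].split()      # request line
    headers = dict(line.split(": ", 1) for line in lines[1:])
    return {"method": method, "url": url, "version": version,
            "headers": headers, "body": body.strip()}

request = parse_http_file(
    "POST {{BASE_URL}}/auth/login HTTP/1.1\n"
    "Content-Type: application/json\n"
    "\n"
    '{"email": "{{EMAIL}}", "password": "{{PASSWORD}}"}\n'
)
print(request["method"], request["headers"]["Content-Type"])  # POST application/json
```

Because the format is just HTTP, the whole parser fits in a dozen lines.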
### 3. Environment Configuration

Simple key-value files (like `.env`):
```sh
echo "BASE_URL=https://api.example.com" > env
echo "EMAIL=user@example.com" >> env
echo "PASSWORD=secret123" >> env
```
### 4. Run Requests

```sh
rq run login   # Executes the request
```
## The Magic: Hierarchical Configuration
Here’s where it gets interesting. You can organize requests in folders with inherited config:
```
my-api/
├── env                 # Global config
├── login.http
├── auth/
│   ├── env             # Auth-specific config (inherits + overrides)
│   ├── signup.http
│   └── oauth/
│       ├── env         # OAuth config (inherits from auth + global)
│       └── token.http
└── users/
    ├── list.http
    └── create.http
```
Each level inherits configuration from above and can override specific values. Just like CSS cascading, but for API requests.
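The cascade itself is simple. A sketch of the idea, assuming each directory’s `env` file has already been parsed into a dict (rq’s internals may differ):

```python
# Sketch of cascading config: apply levels from global to most
# specific, so deeper keys override shallower ones.
def resolve_env(*levels):
    merged = {}
    for level in levels:   # global first, most specific last
        merged.update(level)
    return merged

global_env = {"BASE_URL": "http://localhost:3000", "DEBUG": "true"}
auth_env   = {"DEBUG": "false"}          # auth/env overrides DEBUG
oauth_env  = {"CLIENT_ID": "abc123"}     # auth/oauth/env adds a key

print(resolve_env(global_env, auth_env, oauth_env))
# {'BASE_URL': 'http://localhost:3000', 'DEBUG': 'false', 'CLIENT_ID': 'abc123'}
```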
## Smart Variables
Variables aren’t just simple substitution:
```http
POST {{BASE_URL}}/upload HTTP/1.1
Content-Type: multipart/form-data

{
  "file": "{{file(document.pdf)}}",
  "checksum": "{{sha256(document.pdf)}}",
  "timestamp": "{{timestamp()}}",
  "id": "{{uuid()}}"
}
```
Functions handle common tasks:
- `{{file(path)}}` – reads and base64-encodes files
- `{{sha256(data)}}` – computes hashes
- `{{uuid()}}` – generates UUIDs
- `{{timestamp()}}` – current timestamp
## Environment Management
Multiple environments are first-class:
```sh
# env (default)
BASE_URL=http://localhost:3000
DEBUG=true

# env.staging
BASE_URL=https://staging-api.example.com
DEBUG=false

# env.prod
BASE_URL=https://api.example.com
DEBUG=false
LOG_LEVEL=error
```
Run with specific environments:
```sh
rq run login --env staging
rq run users/create --env prod
```
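A sketch of how `--env staging` could resolve, assuming the default `env` always loads first and `env.staging` overrides matching keys (the exact precedence is an assumption here):

```python
# Sketch: parse .env-style text, then overlay the named environment
# on top of the defaults.
def parse_env(text):
    pairs = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            key, _, value = line.partition("=")
            pairs[key.strip()] = value.strip()
    return pairs

default = parse_env("BASE_URL=http://localhost:3000\nDEBUG=true")
staging = parse_env("BASE_URL=https://staging-api.example.com\nDEBUG=false")

merged = {**default, **staging}   # staging keys win
print(merged["BASE_URL"])         # https://staging-api.example.com
```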
## Auto-Discovery
Just specify the request name:
```sh
rq run login   # Finds login.http automatically
```
If multiple protocols exist:
```
auth/
├── login.http   # HTTP request
├── login.ws     # WebSocket (future)
└── login.grpc   # gRPC (planned)
```
`rq` will prompt you to choose, or you can specify explicitly:

```sh
rq run login.ws   # Force WebSocket
```
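The lookup rule can be sketched like this (illustrative only, operating on a file listing; rq’s real discovery may walk the dock differently): a bare name matches any extension, an explicit extension narrows it to one.

```python
# Sketch of auto-discovery over a dock's file listing.
def discover(files, name):
    if "." in name:   # explicit extension, e.g. "login.ws"
        return [f for f in files if f.split("/")[-1] == name]
    return [f for f in files
            if f.split("/")[-1].rsplit(".", 1)[0] == name]

files = ["auth/login.http", "auth/login.ws", "users/list.http"]
print(discover(files, "login"))     # two matches -> rq would prompt
print(discover(files, "login.ws"))  # exactly one match
```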
## Why This Approach Works

### Version Control Native

```sh
git add api-requests/
git commit -m "Add user authentication flow"
```
Your API tests live with your code, not in some external tool.
### Project Integration

```
my-project/
├── src/
├── tests/
└── api-dock/    # API requests as part of the project
    ├── env
    └── *.http
```
### CI/CD Ready

```yaml
# GitHub Actions
- name: Test API endpoints
  run: |
    rq dock use api-tests
    rq run health --env staging
    rq run auth/login --env staging
```
### Zero Learning Curve

If you know HTTP, you know `rq`. No custom syntax, no abstractions.
### Fast
Terminal tools start instantly. No waiting for Electron apps to load.
## Protocol-Agnostic Future

File extensions determine protocols:
- `.http` – HTTP requests
- `.ws` – WebSocket connections (planned)
- `.grpc` – gRPC calls (planned)
- `.graphql` – GraphQL queries (planned)
Same workspace, same variable system, different protocols.
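Extension-based dispatch is easy to picture. A hypothetical dispatch table (only `.http` exists today; the other entries just mirror the plans above):

```python
from pathlib import Path

# Hypothetical extension -> runner mapping, for illustration.
RUNNERS = {".http": "http", ".ws": "websocket",
           ".grpc": "grpc", ".graphql": "graphql"}

def pick_runner(request_path):
    ext = Path(request_path).suffix
    if ext not in RUNNERS:
        raise ValueError(f"no runner registered for {ext!r}")
    return RUNNERS[ext]

print(pick_runner("auth/login.http"))   # http
```

New protocols slot in by registering an extension; the workspace and variable layers stay untouched.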
## Getting Started

```sh
# Install (replace with your platform)
curl -L https://github.com/marcomit/rq/releases/latest/download/rq-linux -o rq
chmod +x rq

# Try it out
./rq dock init test-api
cd test-api/
./rq new hello
echo "GET https://httpbin.org/get HTTP/1.1" > hello.http
./rq run hello
```
## Code Examples

### Basic API Testing

```http
# users.http
GET {{BASE_URL}}/api/users?page={{PAGE}} HTTP/1.1
Authorization: Bearer {{JWT_TOKEN}}
Accept: application/json
```
### File Upload

```http
# upload.http
POST {{BASE_URL}}/api/files HTTP/1.1
Authorization: Bearer {{JWT_TOKEN}}
Content-Type: multipart/form-data

{
  "name": "{{FILENAME}}",
  "data": "{{file(./uploads/document.pdf)}}",
  "hash": "{{sha256(./uploads/document.pdf)}}"
}
```
### Authentication Flow

```http
# auth/login.http
POST {{AUTH_URL}}/login HTTP/1.1
Content-Type: application/json

{
  "email": "{{EMAIL}}",
  "password": "{{PASSWORD}}"
}
```
With `auth/env`:
```sh
AUTH_URL={{BASE_URL}}/auth
EMAIL=admin@example.com
PASSWORD=admin123
```
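Note that `AUTH_URL` is itself built from `{{BASE_URL}}`, so env values can reference other variables. A sketch of that expansion (simple recursion, no cycle detection; rq’s actual resolver may differ):

```python
import re

# Sketch: expand {{NAME}} references inside env values recursively.
def expand(value, env):
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: expand(env[m.group(1)], env),
                  value)

env = {"BASE_URL": "https://api.example.com",
       "AUTH_URL": "{{BASE_URL}}/auth"}
print(expand(env["AUTH_URL"], env))   # https://api.example.com/auth
```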
## What’s Next

- WebSocket support (`.ws` files)
- gRPC integration (`.grpc` files)
- Test assertions and validation
- Response templating and extraction
- Plugin system for custom functions
## Try It Out

GitHub: https://github.com/marcomit/rq

```sh
# Quick start
curl -L https://github.com/marcomit/rq/releases/latest/download/rq-linux -o rq
chmod +x rq
./rq dock init my-test
```
The tool is built with Go, works on all platforms, and has zero dependencies.
## Discussion
What do you think? Is this approach too simple, or is simplicity exactly what we need?
I’d love to hear your thoughts:
- What HTTP client do you currently use, and why?
- What frustrates you most about API testing tools?
- Would you try a tool like this in your workflow?