One pattern I really like is opening the network tab of the browser, finding the request I'm interested in, and "copying as curl" - the browser generates the equivalent curl command.
Then I'd use something like https://curlconverter.com/ to convert it into request code in the language I'm using.
In a way curl is like the "intermediate representation" that we can use to translate into anything.
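For example, a copied request usually comes out looking roughly like this (the endpoint, headers, and token below are made up for illustration), and that whole blob is what you paste into the converter:

```sh
# Roughly what "Copy as cURL" hands you for an authenticated JSON request
# (api.example.com and the header values are placeholders, not a real API)
curl 'https://api.example.com/v1/orders?limit=10' \
  -H 'Accept: application/json' \
  -H 'Accept-Language: en-US,en;q=0.9' \
  -H 'Authorization: Bearer <redacted>' \
  -H 'User-Agent: Mozilla/5.0'
```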
curlconverter.com looks amazing, instant bookmark – thanks!
I also use the browser's 'copy as curl' function quite frequently, as it's so convenient to have all the auth and encoding headers set so the request _definitely_ works (instead of messing around with handwritten, multi-line curl commands).
Be aware that an online service like this one might log your request, which could contain sensitive data. I'm not saying it does, but those websites give me the creeps.
I agree that it's possible, and that the majority of utility websites do use a backend to provide their utility, but it seems curlconverter.com doesn't send your request anywhere to convert it and instead does the conversion in JavaScript in the browser.
It would be nice if more sites offered themselves as PWAs that worked when you set "throttling" to "offline" in the dev menu, so that you could ensure that no data is leaving the browser when working with sensitive info.
Maybe that would be a nice browser plugin: something that blocks any further requests. I guess it would work similarly to an ad blocker, except that once enabled it blocks everything.
True. Luckily it's open source. You can do `npx curlconverter --language go example.com` behind a firewall after downloading the npm module.
I kinda wish the address bar in any browser had an "advanced" popout menu that's basically a curl frontend with all of its bells and whistles. Basically, move that functionality out of the dev tools.
This is pretty interesting. It's not like HTTP needs an intermediate representation, but since cURL is so ubiquitous, it ends up functioning as one. cURL is popular so people write tools that can export requests as cURL, and it's popular so people write tools that can import it.
The benefit of curl over raw HTTP is the ability to strip away things that don't matter.
E.g. an HTTP request should have a client header like User-Agent, but it's typically not relevant to what you're trying to do.
curl is like an HTTP request that specifies only the parts you care about. It's brief, and having access to bash makes it easy to express something like "set this field to a timestamp".
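For instance, a quick sketch of the timestamp case (httpbin.org/post is just a convenient echo endpoint, nothing specific):

```sh
# Shell substitution fills in the current UTC timestamp at run time
curl -s https://httpbin.org/post \
  -H 'Content-Type: application/json' \
  --data "{\"created_at\": \"$(date -u +%Y-%m-%dT%H:%M:%SZ)\"}"
```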
HTTP requests don't need to have any specific headers, and, if anything, curl will only add ones for you.
But specific HTTP requests might need them: cookies, an Accept header, or anything else the endpoint expects.
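Something like this, say, where the session cookie and Accept header are the parts that actually matter (the URL and cookie value are made up):

```sh
# A request where the "specific" headers are the whole point
curl 'https://example.com/api/profile' \
  -H 'Accept: application/json' \
  -b 'session=<value from the browser>'
```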
I've used a similar tool as part of API logging, filtering out the signature on the bearer token... It's useful together with a request ID for dealing with error reporting.
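Not the same tool, but a minimal sketch of that kind of filtering, assuming the bearer tokens are JWTs (header.payload.signature) and with a hypothetical log file name:

```sh
# Drop the signature (third dot-separated part) of any JWT bearer token in the log
sed -E 's/(Bearer [A-Za-z0-9_-]+\.[A-Za-z0-9_-]+)\.[A-Za-z0-9_-]+/\1.REDACTED/g' api.log
```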
For R users, the httr2 package (a recent rewrite of httr) has a function that translates a 'copy as curl' command into a request:
https://httr2.r-lib.org/reference/curl_translate.html
oh WOW! this is super useful. Thanks for the pointer!
Great Chrome feature! For those who haven't seen it, it also includes copy as PowerShell, fetch, and Node.js commands.
Yeah, that has made my life so much easier when troubleshooting an API endpoint. I can tweak the request params to run against a local instance, as well as pipe the output through jq for formatting, etc.
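Roughly that workflow, with the copied command pointed at a local instance (the port, path, and token are placeholders):

```sh
# Swap the host for the local instance, tweak a query param, pretty-print with jq
curl -s 'http://localhost:8080/api/v1/items?limit=5' \
  -H 'Accept: application/json' \
  -H 'Authorization: Bearer <token from the copied request>' | jq .
```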
On curlconverter.com, clicking on "C" redirects you to the documentation page for the --libcurl option instead of generating a C snippet. Wouldn't it be more user-friendly to still generate a C snippet, and just mention that it can also be done with the --libcurl option?
I've used 'copy as curl' a bunch. I often find I have to append --compressed so the command gives me the uncompressed output.
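I.e. something like this, since the copied command keeps the browser's Accept-Encoding header but curl won't decode the gzipped body unless told to (example.com stands in for the real endpoint):

```sh
# Without --compressed the response body is printed as raw gzip bytes;
# with it, curl decodes the compressed response
curl 'https://example.com/api/data' \
  -H 'Accept-Encoding: gzip, deflate, br' \
  --compressed
```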
Postman can also import in "curl format", so yeah, the representation works.
This is such an interesting and true observation. Anytime something isn’t working with an endpoint, first step is “can you get it to work with curl”.