curl-library

Re: [Survey] What people want us to do next

From: Daniel Stenberg <daniel_at_haxx.se>
Date: Mon, 16 Jun 2014 10:11:16 +0200 (CEST)

On Sun, 15 Jun 2014, Alessandro Ghedini wrote:

(let me just remind readers that these are things people asked for; we don't
necessarily think they are good ideas to do...)

>> colored output
>
> We can take HTTPie [0] as model here (ignoring the evident bias towards
> JSON-based HTTP APIs). The coloring of the HTTP headers is pretty neat [1],
> but if I understand things correctly it would need to be implemented
> directly in libcurl (I haven't looked into how to do it, but I could if
> there's interest).

Why would it have to be done in libcurl? I really can't see how coloring of
text output is any business of libcurl's.

>> HTTPS: Live CRL checking and OCSP.
>
> There's a wishlist bug about this in the Debian bts, and I have briefly
> looked into how to implement OCSP (RFC2560) a while back.

We all know OCSP is completely broken and barely a tad more than useless.
Browsers don't even implement it much or care about the responses, so I don't
think we'll get much use out of implementing it now.

I think there's much more to gain by instead implementing the newer methods
being developed, like certificate pinning, OCSP stapling, etc.

> I have no idea how to make an HTTP request from inside libcurl during the
> certificate verification phase though. Is this actually possible?

Everything is possible, the question is only how much work it is!

Starting a new independent request would probably be done most easily by
simply creating a new easy handle that's hidden from the outside and then
using that for a normal transfer. It would require a fair amount of new logic.

>> wget/httrack -like (and js/css/font/etc -aware) addon in the bundle for
>> recursive fetching
>
> I guess this would mean having curl understand HTML?

If anyone would like to do this, I would suggest building a separate binary
(or even a separate project) for that purpose, since it'd require a busload of
new command line arguments, a significant amount of additional code, and would
be a deviation from what we've been doing so far. I'm not personally that
interested in working on that.

I believe wget and httrack already exist and work fine and I don't see why we
would need to do their job.

>> Curious about latest http/1.1 update in relation to libcurl.
>
> As discussed in a previous thread... haven't looked at this yet.

I'm credited in those RFCs as a contributor for a reason :-). I've been
participating in the IETF httpbis work for years, and the RFC7230 series
basically documents how HTTP 1.1 works and how it is used, with unused stuff
removed and ambiguities clarified. An almost seven-year-long project!

We already talk HTTP 1.1, and this new set of RFCs doesn't really change
anything for us short-term. Servers didn't change anything overnight either.

There might be some things that we can simplify/remove now at some point, but
we're also somewhat conservative and we often have users working with very
old, stupid and legacy servers so I don't expect any such work to happen fast.

>> Fix the problem where header files became platform/architecture dependent
>> (a few years ago), making cross-builds cumbersome even in such a simple
>> scenario as building 32-bit libcurl on a 64-bit Linux system. It would seem
>> to be a better solution to detect bitness and other platform dependent
>> stuff dynamically inside the headers at compile-time.

I too would like to see this improved, as I agree the current setup is
unfortunate. But the reality is of course slightly more complicated than just
32 or 64 bit.

-- 
  / daniel.haxx.se
-------------------------------------------------------------------
List admin: http://cool.haxx.se/list/listinfo/curl-library
Etiquette:  http://curl.haxx.se/mail/etiquette.html
Received on 2014-06-16