
curl-library

Re: how libcurl receives chunked data from http server?

From: Loren Kirkby <loren_at_how2share.com>
Date: Tue, 23 Sep 2003 11:16:08 -0700

> > And before you scream "but disk I/O is too slow!" think about this:
> >
> > A) How long it takes to write/read a file from disk.
> > B) How long it takes to transfer the same file across the net.
> > C) How long it takes to change curl to make it do what you *think* you want.
>
> I don't have A),
> but I do have B), and B) is the bottleneck in most cases.
> For C), I just want to avoid unnecessary copies. It seems I have to change
> the code on my side to work well with libcurl,
> or else patch libcurl to fit my needs.
>

Jerry,

1. I can assure you that modifying libcurl to do what you're talking about
will not buy you any noticeable performance improvement.

2. Allocating huge chunks of memory is usually not a good idea. Depending
on the OS you're using, if the allocation is too large it will simply fail,
and then you're screwed. If you're using an OS that has memory-mapped
files, consider downloading the content to a file and then memory-mapping
the file.
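
To make point 2 concrete, here is a rough sketch of that approach: let
libcurl's default write behaviour fwrite() the body to a file, then mmap()
the file and treat it as one big read-only buffer. The URL and scratch-file
name are made up for illustration, and error handling is trimmed.

#include <curl/curl.h>
#include <stdio.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>

int main(void)
{
    const char *path = "body.tmp";   /* hypothetical scratch file */
    FILE *fp = fopen(path, "wb");
    if (!fp)
        return 1;

    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/big-file");
    /* no custom write callback set, so libcurl fwrite()s into WRITEDATA */
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, fp);
    CURLcode res = curl_easy_perform(curl);
    curl_easy_cleanup(curl);
    fclose(fp);
    if (res != CURLE_OK)
        return 1;

    /* map the downloaded file read-only and use it like one big buffer */
    int fd = open(path, O_RDONLY);
    struct stat st;
    fstat(fd, &st);
    char *data = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    if (data != MAP_FAILED) {
        /* ... parse/process 'data' here ... */
        munmap(data, st.st_size);
    }
    close(fd);
    curl_global_cleanup();
    return 0;
}

The OS pages the file in on demand, so you never have to allocate one
giant buffer up front, and there is no extra copy on the libcurl side.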

Cheers,
Loren Kirkby

Received on 2003-09-23