Are there any suggestions about how to download large files in Haskell? I figure http-conduit is a good library for this. However, how does it solve the problem? There is an example in its documentation, but it is not aimed at downloading large files; it just downloads a file:
import Data.Conduit.Binary (sinkFile)
import Network.HTTP.Conduit
import qualified Data.Conduit as C

main :: IO ()
main = do
    request <- parseUrl "http://google.com/"
    withManager $ \manager -> do
        response <- http request manager
        responseBody response C.$$+- sinkFile "google.html"
What I want is to be able to download large files without running out of RAM, i.e., to do it efficiently in terms of performance, etc. Preferably, I'd also like to be able to continue downloading them "later", meaning "some part now, another part later".
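For the "continue later" part, one common approach (assuming the server supports HTTP Range requests) is to check how much of the file is already on disk, then request only the remaining bytes and append them. The sketch below uses the same old-style http-conduit API as the example above; `resumeDownload` and `rangeHeader` are hypothetical helper names, not part of any library:

```haskell
{-# LANGUAGE OverloadedStrings #-}
-- Sketch: resume a download with an HTTP Range header, assuming the
-- server honors Range requests. Not a complete, production-ready tool.
import qualified Data.ByteString.Char8 as S8
import qualified Data.Conduit as C
import Data.Conduit.Binary (sinkHandle)
import Network.HTTP.Conduit
import Network.HTTP.Types (Header)
import System.IO

-- Build a "Range: bytes=N-" header asking for everything from byte N on.
rangeHeader :: Integer -> Header
rangeHeader done = ("Range", S8.pack ("bytes=" ++ show done ++ "-"))

resumeDownload :: String -> FilePath -> IO ()
resumeDownload url path =
    withFile path AppendMode $ \h -> do
        done <- hFileSize h                 -- bytes already downloaded
        request <- parseUrl url
        let request' = request
                { requestHeaders = rangeHeader done : requestHeaders request }
        withManager $ \manager -> do
            response <- http request' manager
            -- Stream the remaining chunks straight to the file handle,
            -- so memory use stays constant regardless of file size.
            responseBody response C.$$+- sinkHandle h
```

The body is consumed chunk by chunk through the conduit, so only a small buffer is ever held in memory; the Range header is what makes the "some part now, another part later" workflow possible.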
I also found the download-curl package on Hackage, but I'm not positive it's a good fit, or even that it downloads files chunk by chunk the way I need.
http-client is one of the dependencies of http-conduit. – Losel

http-conduit is built on http-client? But what makes you positive that http-conduit streams the data? What proves that? There is no such proof in the documentation, is there? – Dobb