2021 answer using modern async libraries
The 2016 answer is good, but I figured I'd throw in another answer using httpx instead of aiohttp, since httpx is a client-only library and supports multiple async environments. I'm leaving out the OP's for loop that builds URLs by concatenating a number to a string, in favor of what I feel is a more generic answer.
import asyncio

import httpx

# you can have synchronous code here

async def getURL(url):
    async with httpx.AsyncClient() as client:
        response = await client.get(url)
        # we could have some synchronous code here too,
        # e.g. to do CPU-bound work on what we just fetched
        return response

# more synchronous code can go here

async def main():
    # url1 and url2 are assumed to be defined elsewhere
    response1, response2 = await asyncio.gather(getURL(url1), getURL(url2))
    # do things with the responses

# you can also have synchronous code here

asyncio.run(main())
Code after any await within the async with block will run as soon as that particular awaited task is done. It is a good spot to parse each response without waiting for all your requests to have completed.
Code after the asyncio.gather will run once all the tasks have completed. It is a good place to do operations requiring information from all the requests, possibly pre-processed in the async function called by gather.