I load my proxies into the proxies variable and try to make async requests through each one to fetch the external IP. It's simple:
import asyncio
import time

import aiohttp

async def get_ip(proxy):
    # connect timeout so a dead proxy cannot hang the task forever
    timeout = aiohttp.ClientTimeout(connect=5)
    async with aiohttp.ClientSession(timeout=timeout) as session:
        try:
            async with session.get('https://api.ipify.org?format=json', proxy=proxy, timeout=timeout) as response:
                json_response = await response.json()
                print(json_response)
        except Exception:
            # ignore proxies that fail or time out
            pass

if __name__ == "__main__":
    proxies = []  # proxy URLs, e.g. 'http://host:port'
    start_time = time.time()
    loop = asyncio.get_event_loop()
    tasks = [asyncio.ensure_future(get_ip(proxy)) for proxy in proxies]
    loop.run_until_complete(asyncio.wait(tasks))
    print('time spent to work: {} sec --------------'.format(time.time() - start_time))
This code works fine when I make 100, 200, 300, or 400 requests, but as soon as the count goes above 500 I always get this error:
Traceback (most recent call last):
  File "async_get_ip.py", line 60, in <module>
    loop.run_until_complete(asyncio.wait(tasks))
  File "C:\Python37\lib\asyncio\base_events.py", line 571, in run_until_complete
    self.run_forever()
  File "C:\Python37\lib\asyncio\base_events.py", line 539, in run_forever
    self._run_once()
  File "C:\Python37\lib\asyncio\base_events.py", line 1739, in _run_once
    event_list = self._selector.select(timeout)
  File "C:\Python37\lib\selectors.py", line 323, in select
    r, w, _ = self._select(self._readers, self._writers, [], timeout)
  File "C:\Python37\lib\selectors.py", line 314, in _select
    r, w, x = select.select(r, w, w, timeout)
ValueError: too many file descriptors in select()
I looked for a solution, but all I found was that this is an OS-level limit on select(). Can I somehow get around this problem without using additional libraries?
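The only workaround I can think of so far is to cap how many sockets are open at the same time with an asyncio.Semaphore, something like the sketch below. The limit of 100 is just a number I picked to stay well below whatever select() allows, and main/get_ip are my own names, not anything required by aiohttp. Is this the right direction, or is there a cleaner way?

import asyncio
import time

import aiohttp

async def get_ip(semaphore, proxy):
    timeout = aiohttp.ClientTimeout(connect=5)
    async with semaphore:  # only N coroutines hold an open socket at once
        async with aiohttp.ClientSession(timeout=timeout) as session:
            try:
                async with session.get('https://api.ipify.org?format=json',
                                       proxy=proxy, timeout=timeout) as response:
                    print(await response.json())
            except Exception:
                pass

async def main(proxies):
    # 100 is an arbitrary cap; it just needs to stay below the
    # file-descriptor limit that the traceback complains about
    semaphore = asyncio.Semaphore(100)
    await asyncio.gather(*(get_ip(semaphore, proxy) for proxy in proxies))

if __name__ == "__main__":
    proxies = []
    start_time = time.time()
    asyncio.get_event_loop().run_until_complete(main(proxies))
    print('time spent to work: {} sec'.format(time.time() - start_time))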