How do you handle OSError: [Errno 24] Too many open files?

Stack Overflow Asked by Pavan Ajit on December 5, 2021

I'm trying to download data using the HTTP request below. This request will be made sequentially a few thousand times.

import urllib.request

with urllib.request.urlopen(url, timeout=120) as resp:
    with open(save_loc + '.part', 'wb') as fh:
        while True:
            chunk = resp.read(1024 * 1024)
            if not chunk:
                break
            fh.write(chunk)

This is being called by:

from multiprocessing import Pool

if __name__ == '__main__':
    x = [str(x) for x in range(1, 100)]
    with Pool(initializer=init_worker, processes=1) as pool:
        result = pool.map(downloadData, x, chunksize=1)
        pool.close()
        pool.join()
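The question does not show `init_worker`, so its contents here are an assumption. A common pattern for a pool initializer (sketched below, not necessarily the asker's actual code) is to have workers ignore SIGINT so that Ctrl-C is handled only by the parent process:

```python
import signal

def init_worker():
    # Hypothetical sketch: ignore SIGINT in pool workers so a
    # KeyboardInterrupt in the parent can shut the pool down cleanly
    # instead of every worker raising at once.
    signal.signal(signal.SIGINT, signal.SIG_IGN)
```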

The download functionality will be scaled up in the future, which is why I have structured the code around multiprocessing.

The workarounds I've come across for this error are either to increase the open-files limit with

ulimit -n [limit]
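The shell command above only changes the limit for that shell session. On POSIX systems the same soft limit can also be raised from inside Python via the standard-library `resource` module (a sketch, Linux/macOS only; the target value 4096 is arbitrary):

```python
import resource

# Read the current soft/hard limits on open file descriptors.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)

# Raise the soft limit, capped at the hard limit: an unprivileged
# process may adjust its soft limit freely up to the hard limit,
# but cannot exceed the hard limit.
if hard == resource.RLIM_INFINITY:
    target = 4096
else:
    target = min(4096, hard)
resource.setrlimit(resource.RLIMIT_NOFILE, (target, hard))
```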

Or to open files with a "with" statement.

I'm trying to understand why file descriptors are accumulating (as the error suggests) when I am using the "with" statement, which automatically closes the file handle.
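One way to check whether the `with` blocks really are releasing descriptors is to count the process's open file descriptors before, during, and after a file open/close. A Linux-only sketch using `/proc` (the temporary file stands in for the download target):

```python
import os
import tempfile

def open_fd_count():
    # Number of file descriptors currently open in this process.
    # Linux-specific: each entry in /proc/self/fd is one descriptor.
    return len(os.listdir('/proc/self/fd'))

base = open_fd_count()
with tempfile.TemporaryFile() as fh:
    during = open_fd_count()   # one higher than base while the file is open
after = open_fd_count()        # back to base once the with-block exits
```

If the count stays flat across many iterations of the real download loop, the leaked descriptors are coming from somewhere other than these two `with` blocks.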
