Trying to download a large number of files from ENA programmatically

Bioinformatics · Asked on August 22, 2021

I am downloading a large number of files from the ENA in Python using multithreading. To be concrete, I tried downloading 12 files simultaneously with 12 threads. I noticed that 10 of the files download relatively quickly, while 2 lag behind. Even after the 10 files have finished, the remaining 2 still download very slowly. If I then download those 2 files again, the speed is much better. So what exactly is happening, and how can I fix it?

PS: I tried decreasing the thread count to 8, so 8 files were downloading simultaneously, and again 1 file lagged behind the rest.

PPS: I then tried decreasing to 6 threads, and the same thing happened: one file lagged. I don't have this issue when I download directly with Firefox, and I get a significant speed-up there, but I'm trying to automate the process, so downloading through Firefox isn't an option.
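
For context, a per-file timing wrapper along these lines would make the lag measurable rather than anecdotal (this is only a sketch; timed_download and the MB/s reporting are illustrative and not part of the script below):

import os
import shutil
import time
from contextlib import closing
from urllib import request

def timed_download(url):
    # Fetch one URL and report its effective throughput, so it is obvious
    # which transfers are the slow ones.
    filename = url.split('/')[-1]
    start = time.perf_counter()
    with closing(request.urlopen(url)) as r, open(filename, 'wb') as f:
        shutil.copyfileobj(r, f)
    elapsed = time.perf_counter() - start
    rate = os.path.getsize(filename) / 1e6 / elapsed
    print(f'{filename}: {elapsed:.1f} s, {rate:.2f} MB/s')
    return rate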

For reference, here's the full script:

import queue
import shutil
import threading
import time
from contextlib import closing
from urllib import request


start = time.perf_counter()

    
class MyThread(threading.Thread):
    # Worker thread: pulls URLs off the shared queue and downloads them.
    def __init__(self, name):
        threading.Thread.__init__(self)
        self.name = name

    def run(self):
        print('Starting thread %s.' % self.name)
        process_queue()
        print('Exiting thread %s.' % self.name)
        
def process_queue():
    # Keep taking URLs from the queue until it is empty, downloading each
    # one into the current directory.
    while True:
        try:
            url = my_queue.get(block=False)
        except queue.Empty:
            return
        filename = url.split('/')[-1]
        with closing(request.urlopen(url)) as r:
            with open(filename, 'wb') as f:
                shutil.copyfileobj(r, f)
        
# setting up variables
urls = [
        'ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR824/000/SRR8240860/SRR8240860_1.fastq.gz',
        'ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR824/000/SRR8240860/SRR8240860_2.fastq.gz',
        'ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR824/001/SRR8240861/SRR8240861_1.fastq.gz',
        'ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR824/001/SRR8240861/SRR8240861_2.fastq.gz',
        'ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR824/002/SRR8240862/SRR8240862_1.fastq.gz',
        'ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR824/002/SRR8240862/SRR8240862_2.fastq.gz',
        'ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR824/003/SRR8240863/SRR8240863_1.fastq.gz',
        'ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR824/003/SRR8240863/SRR8240863_2.fastq.gz',
        'ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR824/004/SRR8240864/SRR8240864_1.fastq.gz',
        'ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR824/004/SRR8240864/SRR8240864_2.fastq.gz',
        'ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR824/005/SRR8240865/SRR8240865_1.fastq.gz',
        'ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR824/005/SRR8240865/SRR8240865_2.fastq.gz',
        ]

# filling the queue
my_queue = queue.Queue()

for url in urls:
    my_queue.put(url)

# initializing and starting num_threads threads
num_threads = 8
threads = []

for i in range(num_threads):
    thread = MyThread(i)
    threads.append(thread)

for thread in threads:
    thread.start()

for thread in threads:
    thread.join()
    
finish = time.perf_counter()

print(f'Finished in {round(finish-start, 2)} second(s)')
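
Since manually re-downloading a slow file is fast, one way to automate that step is a second pass that re-fetches anything whose throughput fell below a cut-off. The sketch below is only a rough idea along those lines, using concurrent.futures.ThreadPoolExecutor in place of the hand-rolled queue; the download() helper, the 1 MB/s threshold, and the pass limit are illustrative assumptions.

import os
import shutil
import time
from concurrent.futures import ThreadPoolExecutor
from contextlib import closing
from urllib import request

def download(url):
    # Fetch one URL and return (url, effective rate in MB/s).
    filename = url.split('/')[-1]
    start = time.perf_counter()
    with closing(request.urlopen(url)) as r, open(filename, 'wb') as f:
        shutil.copyfileobj(r, f)
    elapsed = time.perf_counter() - start
    return url, os.path.getsize(filename) / 1e6 / elapsed

def download_all(urls, num_threads=6, min_mb_per_s=1.0, max_passes=3):
    # Download everything, then re-download whatever came in below the
    # (assumed) throughput cut-off; give up after a few passes.
    pending = list(urls)
    for _ in range(max_passes):
        with ThreadPoolExecutor(max_workers=num_threads) as pool:
            results = list(pool.map(download, pending))
        pending = [u for u, rate in results if rate < min_mb_per_s]
        if not pending:
            break
    return pending  # whatever is still below the cut-off after all passes

This only automates the "download the slow ones again" step that already seems to work by hand; it does not explain why some FTP connections stall in the first place.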
