I was wondering how I could limit something like this to use only 10 threads at a time:
from threading import Thread

with open("data.txt") as f:
    for line in f:
        lines = line.rstrip("\n\r")
        t1 = Thread(target=Checker, args=("company",))  # args must be a tuple
        t1.start()
 
    
Use Python's ThreadPoolExecutor with the max_workers argument set to 10.
Something like this:
from concurrent.futures import ThreadPoolExecutor

pool = ThreadPoolExecutor(max_workers=10)
with open("data.txt") as f:
    for line in f:
        lines = line.rstrip("\n\r")
        pool.submit(Checker, "company")
pool.shutdown(wait=True)
The pool allocates threads as needed, capping the number of concurrent threads at 10. The first argument to pool.submit() is the function; its arguments are simply passed as comma-separated values.
pool.shutdown(wait=True) waits for all submitted tasks to complete.
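If you also need return values, pool.submit() hands back a Future object whose result() method blocks until that task finishes. A small self-contained sketch (checker here is just a stand-in for the Checker function above):

```python
from concurrent.futures import ThreadPoolExecutor

def checker(name):
    # stand-in for the Checker function above
    return name.upper()

with ThreadPoolExecutor(max_workers=10) as pool:
    futures = [pool.submit(checker, w) for w in ["foo", "bar", "baz"]]
    results = [f.result() for f in futures]

print(results)  # → ['FOO', 'BAR', 'BAZ']
```

Using the executor as a context manager also makes the explicit pool.shutdown(wait=True) call unnecessary.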
 
    
Use the ThreadPoolExecutor and tell it that you want 10 threads.
import concurrent.futures

def your_function_processing_one_line(line):
    pass  # your computations

with open("data.txt") as f, concurrent.futures.ThreadPoolExecutor(10) as executor:
    result = executor.map(your_function_processing_one_line, f)
...and you will have all the results in result.
 
    
I wrote this nested loop to cap the number of threads to a variable. The code relies on a preset array of commands to process; I have borrowed some elements from other answers for the thread launch.
import logging, os, threading, time

# set the maximum number of concurrent threads
threadcount = 20

def worker(command):
    """thread worker function"""
    os.system(command)

# alltests is a preset array of shell commands to run
for command in alltests:
    # launch a worker thread for this command
    t = threading.Thread(target=worker, args=(command,))
    t.start()
    # cap the threads if over the limit
    while threading.active_count() >= threadcount:
        active = threading.active_count()
        string = "Excessive threads, pausing 5 secs - " + str(active)
        print(string)
        logging.info(string)
        time.sleep(5)

# monitor for threads winding down
while threading.active_count() != 1:
    active = threading.active_count()
    string = "Active threads running - " + str(active)
    print(string)
    logging.info(string)
    time.sleep(5)
 
    
(for both Python 2.6+ and Python 3)
Use the ThreadPool from the multiprocessing module:
from multiprocessing.pool import ThreadPool
The only downside is that it is not well documented...
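Its interface mirrors multiprocessing.Pool, just backed by threads instead of processes; a minimal sketch:

```python
from multiprocessing.pool import ThreadPool

def square(x):
    return x * x

pool = ThreadPool(10)                  # 10 worker threads
results = pool.map(square, range(5))   # blocks until all results are in
pool.close()
pool.join()

print(results)  # → [0, 1, 4, 9, 16]
```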
