I have the following setup: a management command 'collect' (collect_positions.py) -> a Celery task (tasks.py) -> a Scrapy spider (MySpider).
collect_positions.py:
from django.core.management.base import BaseCommand
from tracker.models import Keyword
from tracker.tasks import positions
class Command(BaseCommand):
    help = 'collect_positions'
    def handle(self, *args, **options):
        def chunks(l, n):
            """Yield successive n-sized chunks from l."""
            for i in range(0, len(l), n):
                yield l[i:i + n]
        chunk_size = 1
        # `product` is resolved earlier in the real command (omitted from this snippet)
        keywords = Keyword.objects.filter(product=product).values_list('id', flat=True)
        chunks_list = list(chunks(keywords, chunk_size))
        positions.chunks(chunks_list, 1).apply_async(queue='collect_positions')
        return 0
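For illustration (this expansion is my reading of Celery's chunks primitive, not code from the project): with chunk_size = 1 the helper turns [1, 2, 3] into [[1], [2], [3]], and positions.chunks(chunks_list, 1).apply_async(...) then enqueues roughly one positions() call per one-element chunk, similar to:

from celery import group

# one positions(*chunk) call per chunk -- here each chunk holds a single keyword id
group(
    positions.s(*chunk) for chunk in chunks_list
).apply_async(queue='collect_positions')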
tasks.py:
from app_name.celery import app
from scrapy.settings import Settings
from scrapy_app import settings as scrapy_settings
from scrapy_app.spiders.my_spider import MySpider
from tracker.models import Keyword
from scrapy.crawler import CrawlerProcess
@app.task
def positions(*args):
    s = Settings()
    s.setmodule(scrapy_settings)
    keywords = Keyword.objects.filter(id__in=list(args))
    process = CrawlerProcess(s)
    process.crawl(MySpider, keywords_chunk=keywords)
    process.start()  # starts the Twisted reactor, which cannot be restarted in the same process
    return 1
I run the command from the command line, and it creates the parsing tasks. The first task completes successfully, but the subsequent ones fail with:
twisted.internet.error.ReactorNotRestartable
How can I fix this error? I can provide more details if needed.
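For reference, a common workaround for ReactorNotRestartable (a sketch under my assumptions, not the project's actual code) is to start each crawl in a fresh child process, so every task gets its own Twisted reactor instead of reusing the worker's already-stopped one:

from multiprocessing import Process

from scrapy.crawler import CrawlerProcess
from scrapy.settings import Settings

from app_name.celery import app
from scrapy_app import settings as scrapy_settings
from scrapy_app.spiders.my_spider import MySpider


def _run_spider(keyword_ids):
    # runs in the child process, where the reactor has never been started
    from tracker.models import Keyword
    s = Settings()
    s.setmodule(scrapy_settings)
    keywords = Keyword.objects.filter(id__in=keyword_ids)
    process = CrawlerProcess(s)
    process.crawl(MySpider, keywords_chunk=keywords)
    process.start()  # blocks until the crawl finishes; the child then exits


@app.task
def positions(*args):
    # NOTE: prefork pool workers are daemonic and cannot spawn children;
    # you may need to run the Celery worker with --pool=solo or --pool=threads
    p = Process(target=_run_spider, args=(list(args),))
    p.start()
    p.join()
    return 1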
UPDATE 1
Thanks for the answer, @Chiefir! I managed to get all the tasks running, but only start_requests() executes; parse() is never called.
The relevant methods of the Scrapy spider:
import scrapy
from scrapy.spidermiddlewares.httperror import HttpError
from twisted.internet.error import DNSLookupError, TimeoutError

def start_requests(self):
    print('STEP1')
    yield scrapy.Request(
        url='https://example.com',
        callback=self.parse,
        errback=self.error_callback,
        dont_filter=True
    )
def error_callback(self, failure):
    print(failure)
    # log all errback failures;
    # in case you want to do something special for some errors,
    # you may need the failure's type
    print(repr(failure))
    if failure.check(HttpError):
        # you can get the response
        response = failure.value.response
        print('HttpError on %s' % response.url)
    elif failure.check(DNSLookupError):
        # this is the original request
        request = failure.request
        print('DNSLookupError on %s' % request.url)
    elif failure.check(TimeoutError):
        request = failure.request
        print('TimeoutError on %s' % request.url)
def parse(self, response):
    print('STEP2', response)
In the console I get:
STEP1
What could be the reason?
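One way to diagnose this (a suggestion, not from the original post) is to raise Scrapy's log level inside the task, since the crawl log usually says why a request never reached its callback (request dropped, download error raised before the errback ran, reactor stopped early, and so on):

# in tasks.py, before creating CrawlerProcess
s = Settings()
s.setmodule(scrapy_settings)
s.set('LOG_LEVEL', 'DEBUG')  # surface scheduler/downloader messages in the worker log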