I am using scrapyd (the project is deployed on an AWS EC2 instance), which accepts a seed URL to start. I want to run the spider with a different name each time, so that I can manage items and logs easily on the EC2 instance.
Locally I can do this:
scrapy crawl spider_name -a name=new_name_of_spider
and it works fine.
When I try to launch the spider on the EC2 instance from the command line:
curl http://hostname:port/schedule.json -d project=project_name -d spider=spider_default_name -d name=spider_new_name
it throws an error:
{"status": "error", "message": "add() got multiple values for keyword argument 'name'"}
Is there any way to run a spider while overriding its name?
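For context, a workaround I have been considering (this is my own sketch, not something confirmed to work on scrapyd): since scrapyd already uses the `name` keyword internally when scheduling, the custom name could be passed under a different argument (here the made-up name `job_name`) and assigned to `self.name` in the spider's `__init__`:

```python
# Minimal sketch of the idea; a plain class stands in for scrapy.Spider
# so the snippet has no dependencies. 'job_name' is a hypothetical
# argument name chosen to avoid clashing with scrapyd's 'name' keyword.
class MySpider:
    name = "spider_default_name"

    def __init__(self, job_name=None, **kwargs):
        # Override the class-level name only when a custom one is supplied.
        if job_name:
            self.name = job_name
```

It would then be scheduled with `-d job_name=spider_new_name` instead of `-d name=spider_new_name`, but I am not sure whether this is the intended approach.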