I am using scrapyd (the project is deployed on an AWS EC2 instance), and my spider accepts a seed URL as an argument. I want to run the spider under a different name each time, so that I can manage items and logs easily on the EC2 instance.

Locally I can do this:

scrapy crawl spider_name -a name=new_name_of_spider

and it works fine.
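For context, this works because Scrapy passes every -a option to the spider constructor as a keyword argument, and the base Spider.__init__ accepts a name parameter that overrides the class-level name. A minimal sketch (SeedSpider and the seed argument are made-up names):

    import scrapy

    class SeedSpider(scrapy.Spider):
        # Default name, used when no -a name=... override is given.
        name = "spider_default_name"

        def __init__(self, name=None, seed=None, **kwargs):
            # The base Spider.__init__ sets self.name when a name
            # argument is passed in.
            super(SeedSpider, self).__init__(name, **kwargs)
            # The seed URL mentioned above, passed as -a seed=...
            self.start_urls = [seed] if seed else []

        def parse(self, response):
            yield {"url": response.url}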

But when I launch the spider on the EC2 instance from the command line:

curl http://hostname:port/schedule.json -d project=project_name -d spider=spider_default_name -d name=spider_new_name

it throws this error:

{"status": "error", "message": "add() got multiple values for keyword argument 'name'"}

Is there any way to run a spider while overriding its name?

Tasawer Nawaz
  • Scrapyd sucks, but there isn't an alternative (yet). `name` seems to be used internally by Scrapyd, so try renaming your keyword argument to `spider_name` or something. – Blender Dec 21 '13 at 07:40
  • it works fine (but only in the logs); it still shows the same name in the jobs list, and stores the log and items in the same folder :( – Tasawer Nawaz Dec 21 '13 at 07:57
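Building on Blender's suggestion above, a sketch of that workaround: pass the override under a non-colliding argument name (spider_name here is a made-up parameter) and apply it manually in __init__:

    import scrapy

    class SeedSpider(scrapy.Spider):
        name = "spider_default_name"

        def __init__(self, spider_name=None, **kwargs):
            super(SeedSpider, self).__init__(**kwargs)
            # Use a keyword that scrapyd does not claim for itself,
            # then rename the spider by hand.
            if spider_name is not None:
                self.name = spider_name

which would be scheduled with:

    curl http://hostname:port/schedule.json -d project=project_name -d spider=spider_default_name -d spider_name=spider_new_name

As the follow-up comment notes, this only changes the name that appears in the crawl logs; scrapyd still keys the job listing, log files, and item files by the spider parameter from the request, presumably because those paths are built before the spider object exists.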

0 Answers