How can we get a list of URLs after crawling a website with Scrapy, from a custom Python script? -


I am working on a script that needs to crawl websites: given a base_url, it should crawl that site. Does anyone have an idea how I can launch Scrapy from a custom Python script and get the crawled URLs back as a list?

You can add Scrapy commands from an external library by adding a scrapy.commands section to entry_points in setup.py:

from setuptools import setup, find_packages

setup(
    name='scrapy-mymodule',
    entry_points={
        'scrapy.commands': [
            'my_command=my_scrapy_module.commands:mycommand',
        ],
    },
)

http://doc.scrapy.org/en/latest/experimental/index.html?highlight=library#add-commands-using-external-libraries

Also see the basic examples in the Scrapy documentation.

