
Commit

updated documentation and code to use -s instead of --set option
pablohoffman committed Sep 1, 2011
1 parent 46edfd4 commit 76af0cd
Showing 11 changed files with 13 additions and 88 deletions.
75 changes: 0 additions & 75 deletions debian/scrapy-files/scrapy.1

This file was deleted.

2 changes: 1 addition & 1 deletion debian/scrapy.manpages
@@ -1 +1 @@
-debian/scrapy-files/scrapy.1
+extras/scrapy.1
6 changes: 3 additions & 3 deletions docs/faq.rst
@@ -194,15 +194,15 @@ Simplest way to dump all my scraped items into a JSON/CSV/XML file?

To dump into a JSON file::

-scrapy crawl myspider --set FEED_URI=items.json --set FEED_FORMAT=json
+scrapy crawl myspider -s FEED_URI=items.json -s FEED_FORMAT=json

To dump into a CSV file::

-scrapy crawl myspider --set FEED_URI=items.csv --set FEED_FORMAT=csv
+scrapy crawl myspider -s FEED_URI=items.csv -s FEED_FORMAT=csv

To dump into an XML file::

-scrapy crawl myspider --set FEED_URI=items.xml --set FEED_FORMAT=xml
+scrapy crawl myspider -s FEED_URI=items.xml -s FEED_FORMAT=xml

For more information see :ref:`topics-feed-exports`.
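
Note that ``FEED_URI`` and ``FEED_FORMAT`` are ordinary settings, so the same defaults can live in the project's ``settings.py`` instead of being passed on every run (a sketch; equivalent to the ``-s`` overrides above)::

    # settings.py -- same effect as the -s flags on the command line
    FEED_URI = 'items.json'
    FEED_FORMAT = 'json'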

2 changes: 1 addition & 1 deletion docs/intro/overview.rst
@@ -161,7 +161,7 @@ Run the spider to extract the data
Finally, we'll run the spider to crawl the site and output a file
``scraped_data.json`` with the scraped data in JSON format::

-scrapy crawl mininova.org --set FEED_URI=scraped_data.json --set FEED_FORMAT=json
+scrapy crawl mininova.org -s FEED_URI=scraped_data.json -s FEED_FORMAT=json

This uses :ref:`feed exports <topics-feed-exports>` to generate the JSON file.
You can easily change the export format (XML or CSV, for example) or the
2 changes: 1 addition & 1 deletion docs/intro/tutorial.rst
@@ -420,7 +420,7 @@ Storing the scraped data
The simplest way to store the scraped data is by using the :ref:`Feed exports
<topics-feed-exports>`, with the following command::

-scrapy crawl dmoz --set FEED_URI=items.json --set FEED_FORMAT=json
+scrapy crawl dmoz -s FEED_URI=items.json -s FEED_FORMAT=json

That will generate an ``items.json`` file containing all scraped items,
serialized in `JSON`_.
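
To sanity-check the export, the file can be loaded back with the standard library (a sketch; field names follow the tutorial's item definition, and the values depend on the crawl)::

    import json

    # items.json holds a JSON list, one object per scraped item
    items = json.load(open('items.json'))
    print len(items), 'items scraped'
    print items[0]  # fields such as title, link, desc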
4 changes: 2 additions & 2 deletions docs/topics/settings.rst
@@ -56,13 +56,13 @@ Example::
>>> settings.overrides['LOG_ENABLED'] = True

You can also override one (or more) settings from command line using the
-``--set`` command line argument.
+``-s`` (or ``--set``) command line option.

.. highlight:: sh

Example::

-scrapy crawl domain.com --set LOG_FILE=scrapy.log
+scrapy crawl domain.com -s LOG_FILE=scrapy.log
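
The option may be repeated, one ``-s`` per setting. For instance, driving the CLI from Python (a sketch; the second setting name is illustrative)::

    import subprocess

    # Each override travels as its own '-s NAME=VALUE' pair
    subprocess.check_call([
        'scrapy', 'crawl', 'domain.com',
        '-s', 'LOG_FILE=scrapy.log',
        '-s', 'LOG_LEVEL=DEBUG',  # illustrative second override
    ])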

2. Project settings module
--------------------------
2 changes: 1 addition & 1 deletion extras/scrapy.1
@@ -64,7 +64,7 @@ Write python cProfile stats to FILE
Write lsprof profiling stats to FILE
.SS --pidfile=FILE
Write process ID to FILE
-.SS --set=SET
+.SS --set=NAME=VALUE, -s NAME=VALUE
Set/override setting (may be repeated)

.SH AUTHOR
2 changes: 1 addition & 1 deletion scrapy/command.py
@@ -85,7 +85,7 @@ def process_options(self, args, opts):
try:
settings.overrides.update(arglist_to_dict(opts.set))
except ValueError:
raise UsageError("Invalid --set value, use --set NAME=VALUE", print_help=False)
raise UsageError("Invalid -s value, use -s NAME=VALUE", print_help=False)

if opts.logfile:
settings.overrides['LOG_ENABLED'] = True
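
For reference, the helper used above boils down to something like this minimal sketch (the real ``arglist_to_dict`` lives elsewhere in the code base)::

    def arglist_to_dict(arglist):
        """Turn ['NAME=VALUE', ...] into {'NAME': 'VALUE', ...}.

        An entry without '=' splits into a 1-item sequence, making
        dict() raise the ValueError reported as a UsageError above.
        """
        return dict(x.split('=', 1) for x in arglist)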
2 changes: 1 addition & 1 deletion scrapy/tests/test_cmdline/__init__.py
@@ -22,7 +22,7 @@ def test_default_settings(self):
'default + loaded + started')

def test_override_settings_using_set_arg(self):
-self.assertEqual(self._execute('settings', '--get', 'TEST1', '--set', 'TEST1=override'), \
+self.assertEqual(self._execute('settings', '--get', 'TEST1', '-s', 'TEST1=override'), \
'override + loaded + started')

def test_override_settings_using_envvar(self):
2 changes: 1 addition & 1 deletion scrapyd/tests/test_utils.py
@@ -25,7 +25,7 @@ def test_get_crawl_args(self):
def test_get_crawl_args_with_settings(self):
msg = {'_project': 'lolo', '_spider': 'lala', 'arg1': u'val1', 'settings': {'ONE': 'two'}}
cargs = get_crawl_args(msg)
-self.assertEqual(cargs, ['lala', '-a', 'arg1=val1', '--set', 'ONE=two'])
+self.assertEqual(cargs, ['lala', '-a', 'arg1=val1', '-s', 'ONE=two'])
assert all(isinstance(x, str) for x in cargs), cargs

class GetSpiderListTest(unittest.TestCase):
2 changes: 1 addition & 1 deletion scrapyd/utils.py
@@ -45,7 +45,7 @@ def get_crawl_args(message):
args += ['-a']
args += ['%s=%s' % (k, v)]
for k, v in stringify_dict(settings, keys_only=False).items():
-args += ['--set']
+args += ['-s']
args += ['%s=%s' % (k, v)]
return args
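
Put together, a message like the one in the test above round-trips as follows (values taken from that test)::

    msg = {'_project': 'lolo', '_spider': 'lala',
           'arg1': u'val1', 'settings': {'ONE': 'two'}}
    get_crawl_args(msg)
    # -> ['lala', '-a', 'arg1=val1', '-s', 'ONE=two']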

