Python3 support, Scrapy 1.0+ dependencies #67
Conversation
|
Well, that is awkward. I wasn't planning on fixing tests.
|
Rebased on top of #71
```diff
-    assert mw.db.items() == [('test_key_1', 'test_v_1'),
-                             ('test_key_2', 'test_v_2')]
+    assert mw.db.items() == [(b'test_key_1', b'test_v_1'),
+                             (b'test_key_2', b'test_v_2')]
```
This is strange: the test_v_1 values are returned as bytes, but they were inserted as strings (db.put(b'test_key_1', 'test_v_1')). Not sure whether this is bsddb3 not casting back to int/str on fetch, or something else.
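A minimal sketch of that round-trip, using the stdlib dbm.dumb module as a stand-in for bsddb3 (assumption: bsddb3 behaves the same way, i.e. everything is stored as raw bytes, so str values are encoded on put and always come back as bytes):

```python
import dbm.dumb
import os
import tempfile

# dbm-style stores keep keys and values as raw bytes; a str value
# is encoded on insert and fetched back as bytes, not str.
path = os.path.join(tempfile.mkdtemp(), "testdb")
db = dbm.dumb.open(path, "c")
db[b"test_key_1"] = "test_v_1"            # inserted as str...
assert db[b"test_key_1"] == b"test_v_1"   # ...but returned as bytes
db.close()
```

If bsddb3 follows the same model, the updated assertions with b'' literals are the expected behavior rather than a bug.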
requirements.txt
Outdated
```diff
@@ -1,5 +1,6 @@
 six
 boto
+hubstorage
```
Let's require hubstorage>=0.23.
|
@nyov, I re-triggered the builds on Travis.
```diff
         'Programming Language :: Python'
     ],
-    install_requires=['Scrapy>=0.22.0']
+    install_requires=['Scrapy>=1.0.0']
```
Should it require the same Scrapy version as in requirements.txt?
... maybe not, so as to support Scrapy 1.0 users
I'm not sure. Perhaps the line can be dropped completely?
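The usual trade-off here: a library keeps a loose lower bound in setup.py so older-but-compatible users can still install it, while requirements.txt can pin exact versions for CI. A small illustration using the third-party packaging library (an assumption for demonstration; it is not a dependency of this project):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# A loose bound like install_requires=['Scrapy>=1.0.0'] keeps
# Scrapy 1.0 users installable while allowing newer releases.
loose = SpecifierSet(">=1.0.0")

assert Version("1.0.0") in loose       # Scrapy 1.0 users still supported
assert Version("1.8.0") in loose       # newer releases allowed
assert Version("0.22.0") not in loose  # the old minimum is dropped
```

Dropping the line entirely would mean pip cannot pull Scrapy in automatically, so keeping a loose bound is the more common choice.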
|
Updated and rebased on master.
Including Python 3 porting by @nyov from scrapinghub/scrapylib#67
|
So I guess this commit won't make it...? I've just been staring at a new deployment error:

```
File "/usr/local/lib/python3.5/dist-packages/scrapylib/processors/__init__.py", line 5, in <module>
    from urlparse import urljoin
ImportError: No module named 'urlparse'
```

And I had this déjà vu (didn't I fix this already?)... It's really great to see the new packages coming, but I would think a transitional package version would be great as well. Pull in those new depends and strip the package piecemeal.
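That ImportError comes from a Python 2-only module name: urlparse was renamed to urllib.parse in Python 3. A sketch of the usual compatibility shim (not necessarily the exact fix in the port, which may use six instead):

```python
# Python 2/3 compatible import: the urlparse module became
# urllib.parse in Python 3, so try the new location first.
try:
    from urllib.parse import urljoin  # Python 3
except ImportError:
    from urlparse import urljoin      # Python 2

assert urljoin("http://example.com/a/", "b") == "http://example.com/a/b"
```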
|
@nyov, you're right, a new release of scrapylib with new depends makes sense.
|
Thanks for the info, @redapple. Should (¹ IIRC ItemLoader was kept in contrib rather than packaged itself, after several(?) discussions.)
Since f347493, because scrapy.loader does not exist in releases prior to that.