Commit
Add Python 3.13, drop Python 3.8, update tool versions (#211)
wRAR authored Oct 16, 2024
1 parent 147d24b commit 21c6c9c
Showing 28 changed files with 51 additions and 95 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/publish.yml
@@ -17,7 +17,7 @@ jobs:
       - name: Set up Python
         uses: actions/setup-python@v5
         with:
-          python-version: '3.12'
+          python-version: '3.13'
       - name: Install dependencies
         run: |
           python -m pip install --upgrade pip
14 changes: 7 additions & 7 deletions .github/workflows/test.yml
@@ -16,20 +16,20 @@ jobs:
       fail-fast: false
       matrix:
         include:
-        - python-version: "3.8"
+        - python-version: "3.9"
          toxenv: "min"
-        - python-version: "3.8"
+        - python-version: "3.9"
          toxenv: "pinned-scrapy-2x7"
-        - python-version: "3.8"
+        - python-version: "3.9"
          toxenv: "pinned-scrapy-2x8"
-        - python-version: "3.8"
+        - python-version: "3.9"
          toxenv: "asyncio-min"
-        - python-version: "3.8"
         - python-version: "3.9"
         - python-version: "3.10"
         - python-version: "3.11"
         - python-version: "3.12"
-        - python-version: "3.12"
+        - python-version: "3.13"
+        - python-version: "3.13"
          toxenv: "asyncio"
 
     steps:
@@ -54,7 +54,7 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        python-version: ['3.12']
+        python-version: ['3.12'] # Keep in sync with .readthedocs.yml
         tox-job: ["mypy", "docs", "linters", "twinecheck"]
 
     steps:
6 changes: 3 additions & 3 deletions .pre-commit-config.yaml
@@ -3,12 +3,12 @@ repos:
   - id: black
     language_version: python3
   repo: https://github.com/ambv/black
-  rev: 22.12.0
+  rev: 24.10.0
 - hooks:
   - id: isort
     language_version: python3
   repo: https://github.com/PyCQA/isort
-  rev: 5.11.5
+  rev: 5.13.2
 - hooks:
   - id: flake8
     language_version: python3
@@ -19,4 +19,4 @@ repos:
     - flake8-docstrings
     - flake8-string-format
   repo: https://github.com/pycqa/flake8
-  rev: 6.1.0
+  rev: 7.1.1
2 changes: 1 addition & 1 deletion .readthedocs.yml
@@ -6,7 +6,7 @@ sphinx:
 build:
   os: ubuntu-22.04
   tools:
-    python: "3.12"  # Keep in sync with .github/workflows/tests.yml
+    python: "3.12"  # Keep in sync with .github/workflows/test.yml
 
 python:
   install:
2 changes: 1 addition & 1 deletion MANIFEST.in
@@ -1,4 +1,4 @@
-include CHANGES.rst
+include CHANGELOG.rst
 include LICENSE
 include README.rst
 
2 changes: 1 addition & 1 deletion README.rst
@@ -48,7 +48,7 @@ Installation
     pip install scrapy-poet
 
-Requires **Python 3.8+** and **Scrapy >= 2.6.0**.
+Requires **Python 3.9+** and **Scrapy >= 2.6.0**.
 
 Usage in a Scrapy Project
 =========================
2 changes: 1 addition & 1 deletion docs/intro/install.rst
@@ -7,7 +7,7 @@ Installation
 Installing scrapy-poet
 ======================
 
-``scrapy-poet`` is a Scrapy extension that runs on Python 3.8 and above.
+``scrapy-poet`` is a Scrapy extension that runs on Python 3.9 and above.
 
 If you’re already familiar with installation of Python packages, you can install
 ``scrapy-poet`` and its dependencies from PyPI with:
2 changes: 0 additions & 2 deletions docs/providers.rst
@@ -327,8 +327,6 @@ you could implement those limits in the library itself.
 Attaching metadata to dependencies
 ==================================
 
-.. note:: This feature requires Python 3.9+.
-
 Providers can support dependencies with arbitrary metadata attached and use
 that metadata when creating them. Attaching the metadata is done by wrapping
 the dependency class in :data:`typing.Annotated`:
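The ``typing.Annotated`` mechanism referenced in this hunk (now safe to use unconditionally, since the minimum is Python 3.9) can be sketched with the standard library alone. The ``Product`` class and ``"from-api"`` metadata below are made-up stand-ins for illustration, not scrapy-poet API:

```python
from typing import Annotated, get_args, get_origin


class Product:
    """Stand-in dependency class (hypothetical, not from scrapy-poet)."""


# Attach arbitrary metadata by wrapping the dependency class in Annotated.
AnnotatedProduct = Annotated[Product, "from-api"]

# A provider can detect the wrapper and read the metadata back out:
assert get_origin(AnnotatedProduct) is Annotated
cls, *metadata = get_args(AnnotatedProduct)
assert cls is Product
assert metadata == ["from-api"]
```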
1 change: 1 addition & 0 deletions example/example/autoextract.py
@@ -2,6 +2,7 @@
 Example of how to create a PageObject with a very different input data,
 which even requires an API request.
 """
+
 from typing import Any, Dict
 
 import attr
1 change: 1 addition & 0 deletions example/example/spiders/books_01.py
@@ -1,6 +1,7 @@
 """
 Baseline: regular Scrapy spider, sweet & easy.
 """
+
 import scrapy
 
 
1 change: 1 addition & 0 deletions example/example/spiders/books_02.py
@@ -2,6 +2,7 @@
 Scrapy spider which uses Page Objects to make extraction code more reusable.
 BookPage is now independent of Scrapy.
 """
+
 import scrapy
 from web_poet import WebPage
 
1 change: 1 addition & 0 deletions example/example/spiders/books_02_1.py
@@ -3,6 +3,7 @@
 BookPage is now independent of Scrapy. callback_for is used to reduce
 boilerplate.
 """
+
 import scrapy
 from web_poet import WebPage
 
1 change: 1 addition & 0 deletions example/example/spiders/books_02_2.py
@@ -10,6 +10,7 @@
 has problems now, it is used in the latter examples, because as an API
 it is better than defining callback explicitly.
 """
+
 import scrapy
 from web_poet import WebPage
 
1 change: 1 addition & 0 deletions example/example/spiders/books_02_3.py
@@ -7,6 +7,7 @@
 Page object is used instead of callback below. It doesn't work now,
 but it can be implemented, with Scrapy support.
 """
+
 import scrapy
 from web_poet import WebPage
 
1 change: 1 addition & 0 deletions example/example/spiders/books_03.py
@@ -1,6 +1,7 @@
 """
 Scrapy spider which uses AutoExtract API, to extract books as products.
 """
+
 import scrapy
 from example.autoextract import ProductPage
 
1 change: 1 addition & 0 deletions example/example/spiders/books_04.py
@@ -1,6 +1,7 @@
 """
 Scrapy spider which uses Page Objects both for crawling and extraction.
 """
+
 import scrapy
 from web_poet import WebPage
 
1 change: 1 addition & 0 deletions example/example/spiders/books_04_overrides_01.py
@@ -5,6 +5,7 @@
 The default configured PO logic contains the logic for books.toscrape.com
 """
+
 import scrapy
 from web_poet import ApplyRule, WebPage
 
1 change: 1 addition & 0 deletions example/example/spiders/books_04_overrides_02.py
@@ -6,6 +6,7 @@
 No configured default logic: if used for an unregistered domain, no logic
 at all is applied.
 """
+
 import scrapy
 from web_poet import WebPage
 from web_poet.rules import ApplyRule
1 change: 1 addition & 0 deletions example/example/spiders/books_04_overrides_03.py
@@ -10,6 +10,7 @@
 difference is that this example is using the ``@handle_urls`` decorator to
 store the rules in web-poet's registry.
 """
+
 import scrapy
 from web_poet import WebPage, default_registry, handle_urls
 
1 change: 1 addition & 0 deletions example/example/spiders/books_05.py
@@ -2,6 +2,7 @@
 Scrapy spider which uses Page Objects both for crawling and extraction.
 You can mix various page types freely.
 """
+
 import scrapy
 from example.autoextract import ProductPage
 from web_poet import WebPage
6 changes: 1 addition & 5 deletions scrapy_poet/_request_fingerprinter.py
@@ -10,7 +10,7 @@
 import json
 from functools import cached_property
 from logging import getLogger
-from typing import Callable, Dict, List, Optional, get_args, get_origin
+from typing import Annotated, Callable, Dict, List, Optional, get_args, get_origin
 from weakref import WeakKeyDictionary
 
 from andi import CustomBuilder
@@ -37,10 +37,6 @@
 def _serialize_dep(cls):
     if isinstance(cls, CustomBuilder):
         cls = cls.result_class_or_fn
-    try:
-        from typing import Annotated
-    except ImportError:
-        pass
     else:
         if get_origin(cls) is Annotated:
             annotated, *annotations = get_args(cls)
1 change: 1 addition & 0 deletions scrapy_poet/downloadermiddlewares.py
@@ -2,6 +2,7 @@
 responsible for injecting Page Input dependencies before the request callbacks
 are executed.
 """
+
 import inspect
 import logging
 import warnings
1 change: 1 addition & 0 deletions scrapy_poet/page_input_providers.py
@@ -8,6 +8,7 @@
 different providers in order to acquire data from multiple external sources,
 for example, from scrapy-playwright or from an API for automatic extraction.
 """
+
 from typing import Any, Callable, ClassVar, FrozenSet, List, Set, Union
 from warnings import warn
 
4 changes: 2 additions & 2 deletions setup.py
@@ -19,7 +19,7 @@
         "scrapy.commands": ["savefixture = scrapy_poet.commands:SaveFixtureCommand"]
     },
     package_data={"scrapy_poet": ["VERSION"]},
-    python_requires=">=3.8",
+    python_requires=">=3.9",
     install_requires=[
         "andi >= 0.6.0",
         "attrs >= 21.3.0",
@@ -39,10 +39,10 @@
         "Operating System :: OS Independent",
         "Framework :: Scrapy",
         "Programming Language :: Python :: 3",
-        "Programming Language :: Python :: 3.8",
         "Programming Language :: Python :: 3.9",
         "Programming Language :: Python :: 3.10",
         "Programming Language :: Python :: 3.11",
         "Programming Language :: Python :: 3.12",
+        "Programming Language :: Python :: 3.13",
     ],
 )
4 changes: 0 additions & 4 deletions tests/test_commands.py
@@ -6,7 +6,6 @@
 import tempfile
 from pathlib import Path
 
-import pytest
 from twisted.web.resource import Resource
 from web_poet.testing import Fixture
 
@@ -246,9 +245,6 @@ class CustomItemAdapter(ItemAdapter):
     result.assert_outcomes(passed=3)
 
 
-@pytest.mark.skipif(
-    sys.version_info < (3, 9), reason="No Annotated support in Python < 3.9"
-)
 def test_savefixture_annotated(pytester) -> None:
     project_name = "foo"
     cwd = Path(pytester.path)