
Commit

Merge branch 'master' into issue259_allow-epsg-code-as-string
jdries authored Aug 8, 2023
2 parents 85a5fde + 94e848d commit e3ef5b9
Showing 16 changed files with 628 additions and 138 deletions.
8 changes: 8 additions & 0 deletions CHANGELOG.md
Original file line number Diff line number Diff line change
@@ -9,14 +9,22 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

### Added


- Processes that take a CRS as argument now try harder to convert your input into a proper EPSG code, to avoid unexpected results when an invalid argument gets sent to the backend.
- Initial `load_geojson` support with `Connection.load_geojson()` ([#424](https://github.com/Open-EO/openeo-python-client/issues/424))
- Initial `load_url` (for vector cubes) support with `Connection.load_url()` ([#424](https://github.com/Open-EO/openeo-python-client/issues/424))
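The CRS-normalization behavior described in the first bullet can be sketched roughly as follows. This is a simplified, hypothetical illustration (function name and accepted input shapes are assumptions, not the client's actual implementation, which also handles things like pyproj objects):

```python
def normalize_crs(crs):
    """Best-effort conversion of user CRS input to an integer EPSG code.

    Hypothetical sketch: accepts an int (4326), a digit string ("4326"),
    or an "EPSG:4326"-style string; anything else raises ValueError
    instead of being passed through to the backend unchecked.
    """
    if isinstance(crs, int):
        return crs
    if isinstance(crs, str):
        code = crs.strip().upper()
        if code.startswith("EPSG:"):
            code = code[len("EPSG:"):]
        if code.isdigit():
            return int(code)
    raise ValueError(f"Could not normalize CRS input: {crs!r}")
```

The point of failing early on unrecognized input is that the error surfaces client-side, rather than as an obscure backend error.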


### Changed

- `Connection` based requests: always use finite timeouts by default (20 minutes in general, 30 minutes for synchronous execute requests)
([#454](https://github.com/Open-EO/openeo-python-client/issues/454))

### Removed

### Fixed

- Fix: `MultiBackendJobManager` should stop when finished, also when job finishes with error ([#452](https://github.com/Open-EO/openeo-python-client/issues/452))

## [0.21.1] - 2023-07-19

7 changes: 7 additions & 0 deletions docs/api.rst
@@ -47,6 +47,13 @@ openeo.rest.mlmodel
:inherited-members:


openeo.metadata
----------------

.. automodule:: openeo.metadata
:members: CollectionMetadata, BandDimension, SpatialDimension, TemporalDimension


openeo.api.process
--------------------

12 changes: 8 additions & 4 deletions docs/process_mapping.rst
@@ -27,7 +27,7 @@ method or function in the openEO Python Client Library.
* - `aggregate_spatial <https://processes.openeo.org/#aggregate_spatial>`_
- :py:meth:`ProcessBuilder.aggregate_spatial() <openeo.processes.ProcessBuilder.aggregate_spatial>`, :py:meth:`aggregate_spatial() <openeo.processes.aggregate_spatial>`, :py:meth:`DataCube.aggregate_spatial() <openeo.rest.datacube.DataCube.aggregate_spatial>`
* - `aggregate_spatial_window <https://processes.openeo.org/#aggregate_spatial_window>`_
- :py:meth:`ProcessBuilder.aggregate_spatial_window() <openeo.processes.ProcessBuilder.aggregate_spatial_window>`, :py:meth:`aggregate_spatial_window() <openeo.processes.aggregate_spatial_window>`
- :py:meth:`ProcessBuilder.aggregate_spatial_window() <openeo.processes.ProcessBuilder.aggregate_spatial_window>`, :py:meth:`aggregate_spatial_window() <openeo.processes.aggregate_spatial_window>`, :py:meth:`DataCube.aggregate_spatial_window() <openeo.rest.datacube.DataCube.aggregate_spatial_window>`
* - `aggregate_temporal <https://processes.openeo.org/#aggregate_temporal>`_
- :py:meth:`ProcessBuilder.aggregate_temporal() <openeo.processes.ProcessBuilder.aggregate_temporal>`, :py:meth:`aggregate_temporal() <openeo.processes.aggregate_temporal>`, :py:meth:`DataCube.aggregate_temporal() <openeo.rest.datacube.DataCube.aggregate_temporal>`
* - `aggregate_temporal_period <https://processes.openeo.org/#aggregate_temporal_period>`_
@@ -191,11 +191,15 @@ method or function in the openEO Python Client Library.
* - `ln <https://processes.openeo.org/#ln>`_
- :py:meth:`ProcessBuilder.ln() <openeo.processes.ProcessBuilder.ln>`, :py:meth:`ln() <openeo.processes.ln>`, :py:meth:`DataCube.ln() <openeo.rest.datacube.DataCube.ln>`
* - `load_collection <https://processes.openeo.org/#load_collection>`_
- :py:meth:`ProcessBuilder.load_collection() <openeo.processes.ProcessBuilder.load_collection>`, :py:meth:`load_collection() <openeo.processes.load_collection>`, :py:meth:`DataCube.load_collection() <openeo.rest.datacube.DataCube.load_collection>`
- :py:meth:`ProcessBuilder.load_collection() <openeo.processes.ProcessBuilder.load_collection>`, :py:meth:`load_collection() <openeo.processes.load_collection>`, :py:meth:`DataCube.load_collection() <openeo.rest.datacube.DataCube.load_collection>`, :py:meth:`Connection.load_collection() <openeo.rest.connection.Connection.load_collection>`
* - `load_geojson <https://processes.openeo.org/#load_geojson>`_
- :py:meth:`VectorCube.load_geojson() <openeo.rest.vectorcube.VectorCube.load_geojson>`, :py:meth:`Connection.load_geojson() <openeo.rest.connection.Connection.load_geojson>`
* - `load_ml_model <https://processes.openeo.org/#load_ml_model>`_
- :py:meth:`ProcessBuilder.load_ml_model() <openeo.processes.ProcessBuilder.load_ml_model>`, :py:meth:`load_ml_model() <openeo.processes.load_ml_model>`, :py:meth:`MlModel.load_ml_model() <openeo.rest.mlmodel.MlModel.load_ml_model>`
* - `load_result <https://processes.openeo.org/#load_result>`_
- :py:meth:`ProcessBuilder.load_result() <openeo.processes.ProcessBuilder.load_result>`, :py:meth:`load_result() <openeo.processes.load_result>`
- :py:meth:`ProcessBuilder.load_result() <openeo.processes.ProcessBuilder.load_result>`, :py:meth:`load_result() <openeo.processes.load_result>`, :py:meth:`Connection.load_result() <openeo.rest.connection.Connection.load_result>`
* - `load_stac <https://processes.openeo.org/#load_stac>`_
- :py:meth:`Connection.load_stac() <openeo.rest.connection.Connection.load_stac>`
* - `load_uploaded_files <https://processes.openeo.org/#load_uploaded_files>`_
- :py:meth:`ProcessBuilder.load_uploaded_files() <openeo.processes.ProcessBuilder.load_uploaded_files>`, :py:meth:`load_uploaded_files() <openeo.processes.load_uploaded_files>`
* - `log <https://processes.openeo.org/#log>`_
@@ -325,4 +329,4 @@ method or function in the openEO Python Client Library.
* - `xor <https://processes.openeo.org/#xor>`_
- :py:meth:`ProcessBuilder.xor() <openeo.processes.ProcessBuilder.xor>`, :py:meth:`xor() <openeo.processes.xor>`

:subscript:`(Table autogenerated on 2023-03-15)`
:subscript:`(Table autogenerated on 2023-08-07)`
13 changes: 13 additions & 0 deletions openeo/api/process.py
@@ -45,6 +45,19 @@ def raster_cube(cls, name: str = "data", description: str = "A data cube.") -> '
"""
return cls(name=name, description=description, schema={"type": "object", "subtype": "raster-cube"})

@classmethod
def datacube(cls, name: str = "data", description: str = "A data cube.") -> "Parameter":
"""
Helper to easily create a 'datacube' parameter.
:param name: name of the parameter.
:param description: description of the parameter
:return: Parameter
.. versionadded:: 0.22.0
"""
return cls(name=name, description=description, schema={"type": "object", "subtype": "datacube"})

@classmethod
def string(cls, name: str, description: str = None, default=_DEFAULT_UNDEFINED, values=None) -> 'Parameter':
"""Helper to create a 'string' type parameter."""
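For illustration, the parameter object the new `Parameter.datacube()` helper builds boils down to a name, a description, and a schema dict marking the value as an openEO data cube. A standalone sketch (not using the `openeo` package; the helper name `datacube_param` is hypothetical):

```python
def datacube_param(name="data", description="A data cube."):
    # Mirrors what Parameter.datacube() encodes: a named parameter whose
    # schema identifies it as an openEO "datacube" object (the newer
    # generalization of the "raster-cube" subtype).
    return {
        "name": name,
        "description": description,
        "schema": {"type": "object", "subtype": "datacube"},
    }
```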
1 change: 1 addition & 0 deletions openeo/extra/job_management.py
@@ -256,6 +256,7 @@ def run_jobs(
(df.status != "finished")
& (df.status != "skipped")
& (df.status != "start_failed")
& (df.status != "error")
].size
> 0
):
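The added `& (df.status != "error")` clause makes errored jobs count as terminal, so the polling loop in `run_jobs` can exit instead of spinning forever. The stop condition boils down to this standalone sketch (plain Python instead of the pandas filter used in the actual code):

```python
# Terminal statuses: jobs in these states no longer need polling.
# "error" is the state newly added to this set by the fix.
TERMINAL_STATUSES = {"finished", "skipped", "start_failed", "error"}

def jobs_still_active(statuses):
    # run_jobs keeps looping while at least one tracked job
    # has not yet reached a terminal status.
    return any(status not in TERMINAL_STATUSES for status in statuses)
```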
8 changes: 5 additions & 3 deletions openeo/metadata.py
@@ -38,9 +38,9 @@ def rename_labels(self, target, source) -> 'Dimension':
"""
Rename labels, if the type of dimension allows it.
@param target: List of target labels
@param source: Source labels, or empty list
@return: A new dimension with modified labels, or the same if no change is applied.
:param target: List of target labels
:param source: Source labels, or empty list
:return: A new dimension with modified labels, or the same if no change is applied.
"""
# In general, we don't have/manage label info here, so do nothing.
return Dimension(type=self.type, name=self.name)
@@ -104,6 +104,7 @@ def common_names(self) -> List[str]:
def band_index(self, band: Union[int, str]) -> int:
"""
Resolve a given band (common) name/index to band index
:param band: band name, common name or index
:return int: band index
"""
@@ -446,4 +447,5 @@ def _repr_html_(self):
return render_component('collection', data=self._orig_metadata)

def __str__(self) -> str:
        bands = self.band_names if self.has_band_dimension() else "no bands dimension"
        return f"CollectionMetadata({self.extent} - {bands} - {self.dimension_names()})"
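The `band_index` resolution touched above (accepting an integer index, a band name, or a "common name") can be sketched standalone. This is an assumption-laden simplification of the real method, which handles more aliasing edge cases:

```python
def band_index(band, names, common_names):
    # Resolve an integer index directly; otherwise look the band up
    # by its name, then by its "common name" (e.g. "red" for "B04").
    if isinstance(band, int):
        if 0 <= band < len(names):
            return band
        raise ValueError(f"Invalid band index {band}")
    if band in names:
        return names.index(band)
    if band in common_names:
        return common_names.index(band)
    raise ValueError(f"Invalid band name/index {band!r}")
```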
98 changes: 98 additions & 0 deletions openeo/rest/_testing.py
@@ -0,0 +1,98 @@
import re

from openeo import Connection


class DummyBackend:
"""
Dummy backend that handles sync/batch execution requests
and allows inspection of posted process graphs
"""

# Default result (can serve both as JSON or binary data)
DEFAULT_RESULT = b'{"what?": "Result data"}'

def __init__(self, requests_mock, connection: Connection):
self.connection = connection
self.sync_requests = []
self.batch_jobs = {}
self.next_result = self.DEFAULT_RESULT
requests_mock.post(connection.build_url("/result"), content=self._handle_post_result)
requests_mock.post(connection.build_url("/jobs"), content=self._handle_post_jobs)
requests_mock.post(
re.compile(connection.build_url(r"/jobs/(job-\d+)/results$")), content=self._handle_post_job_results
)
requests_mock.get(re.compile(connection.build_url(r"/jobs/(job-\d+)$")), json=self._handle_get_job)
requests_mock.get(
re.compile(connection.build_url(r"/jobs/(job-\d+)/results$")), json=self._handle_get_job_results
)
requests_mock.get(
re.compile(connection.build_url("/jobs/(.*?)/results/result.data$")),
content=self._handle_get_job_result_asset,
)

def _handle_post_result(self, request, context):
"""handler of `POST /result` (synchronous execute)"""
pg = request.json()["process"]["process_graph"]
self.sync_requests.append(pg)
return self.next_result

def _handle_post_jobs(self, request, context):
"""handler of `POST /jobs` (create batch job)"""
pg = request.json()["process"]["process_graph"]
job_id = f"job-{len(self.batch_jobs):03d}"
self.batch_jobs[job_id] = {"job_id": job_id, "pg": pg, "status": "created"}
context.status_code = 201
context.headers["openeo-identifier"] = job_id

def _get_job_id(self, request) -> str:
match = re.match(r"^/jobs/(job-\d+)(/|$)", request.path)
if not match:
raise ValueError(f"Failed to extract job_id from {request.path}")
job_id = match.group(1)
assert job_id in self.batch_jobs
return job_id

def _handle_post_job_results(self, request, context):
"""Handler of `POST /job/{job_id}/results` (start batch job)."""
job_id = self._get_job_id(request)
assert self.batch_jobs[job_id]["status"] == "created"
# TODO: support custom status sequence (instead of directly going to status "finished")?
self.batch_jobs[job_id]["status"] = "finished"
context.status_code = 202

def _handle_get_job(self, request, context):
"""Handler of `GET /job/{job_id}` (get batch job status and metadata)."""
job_id = self._get_job_id(request)
return {"id": job_id, "status": self.batch_jobs[job_id]["status"]}

def _handle_get_job_results(self, request, context):
"""Handler of `GET /job/{job_id}/results` (list batch job results)."""
job_id = self._get_job_id(request)
assert self.batch_jobs[job_id]["status"] == "finished"
return {
"id": job_id,
"assets": {"result.data": {"href": self.connection.build_url(f"/jobs/{job_id}/results/result.data")}},
}

def _handle_get_job_result_asset(self, request, context):
"""Handler of `GET /job/{job_id}/results/result.data` (get batch job result asset)."""
job_id = self._get_job_id(request)
assert self.batch_jobs[job_id]["status"] == "finished"
return self.next_result

def get_sync_pg(self) -> dict:
"""Get one and only synchronous process graph"""
assert len(self.sync_requests) == 1
return self.sync_requests[0]

def get_batch_pg(self) -> dict:
"""Get one and only batch process graph"""
assert len(self.batch_jobs) == 1
return self.batch_jobs[max(self.batch_jobs.keys())]["pg"]

def get_pg(self) -> dict:
"""Get one and only batch process graph (sync or batch)"""
pgs = self.sync_requests + [b["pg"] for b in self.batch_jobs.values()]
assert len(pgs) == 1
return pgs[0]
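`DummyBackend`'s `_get_job_id` relies on a small path regex; extracted standalone it behaves like this:

```python
import re

def extract_job_id(path):
    # Matches request paths like "/jobs/job-000" or "/jobs/job-000/results"
    # and returns the "job-000" part; anything else is rejected.
    match = re.match(r"^/jobs/(job-\d+)(/|$)", path)
    if not match:
        raise ValueError(f"Failed to extract job_id from {path}")
    return match.group(1)
```

The trailing `(/|$)` group ensures the job id is followed by either a path separator or the end of the path, so a path like `/jobs/job-0extra` would not partially match.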
