
Add support for a default_radius flag #6575

Merged

Conversation

@whytro (Contributor) commented Mar 19, 2023

Issue

What issue is this PR targeting? If there is no issue that addresses the problem, please open a corresponding issue and link it here.

Relevant to:

#2983 Require a radius parameter other than unlimited when using bearings

Please read our documentation on release and version management.
If your PR is still a work in progress, please attach the relevant label.

Tasklist

Requirements / Relations

Link any requirements here. Other pull requests this PR is based on?

Relevant to PR #6572: Require a radius parameter when using bearings (GSoC)

@whytro whytro marked this pull request as ready for review March 20, 2023 10:17
@@ -112,6 +112,18 @@ test('constructor: throws if dataset_name is not a string', function(assert) {
assert.throws(function() { new OSRM({dataset_name: "unsued_name___", shared_memory: true}); }, /Could not find shared memory region/, 'Does not accept wrong name');
});

test('constructor: takes a default_bearing_radius argument', function(assert) {
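For context on what the new constructor option has to guard against: a self-contained sketch (not the actual binding code, which lives in C++; the function name and error message are illustrative) of the kind of validation a `default_bearing_radius`-style option needs, accepting a non-negative number or the string `'unlimited'`:

```javascript
// Illustrative sketch only: validate a default_bearing_radius-style
// constructor option. Accepts a non-negative finite number, or the
// string 'unlimited' (mapped to Infinity); rejects anything else.
function parseDefaultRadius(value) {
    if (value === 'unlimited') return Infinity;
    if (typeof value === 'number' && isFinite(value) && value >= 0) {
        return value;
    }
    throw new Error("default_radius must be a non-negative number or 'unlimited'");
}
```

In the real bindings the check happens inside the C++ constructor; the test in the diff above only asserts that the option is accepted.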
Member
There is some weird "Too many open files" error in these tests https://github.com/Project-OSRM/osrm-backend/actions/runs/4467036487/jobs/7850020794?pr=6575

I assume it's probably something we already had but didn't know about: each test creates a new OSRM object, which opens files (e.g. to load the dataset) but doesn't close them, or closes them only when the Node.js GC collects the object (to be honest, my knowledge is a bit lacking here). So you added a couple more tests and now we exceed this system threshold. May I ask you to check whether this is the case? You could just comment out these tests and push, and we'll see if the problem disappears. After that we can come up with what to do about it; we can probably fix it separately.

Member
https://github.com/nodejs/node-addon-api/blob/main/doc/object_wrap.md

Caution: When the JavaScript object is garbage collected, the call to the C++ destructor may be deferred until a later time. Within that period, Value() will return an empty value.

Contributor Author
Okay, I've commented those tests out for now.

@whytro whytro marked this pull request as draft March 20, 2023 16:51
@whytro whytro marked this pull request as ready for review March 20, 2023 16:55
@SiarheiFedartsou (Member)

Hey, I now think we are good with this PR in general, but can we try to understand what is wrong with those Node.js tests failing on macOS?

I guess the reason is simple: we close dataset files in the destructors of C++ objects, and the time at which Node.js calls them is nondeterministic. I think that's okay for production use cases, since people don't create multiple OSRM instances. So I would propose trying to increase the upper limit on the number of open file descriptors on CI: https://superuser.com/questions/302754/increase-the-maximum-number-of-open-file-descriptors-in-snow-leopard
It can probably be done somewhere around this step:

- name: Prepare environment

We can check if it works; if it doesn't, we can just create a ticket to resolve it later and merge this PR. WDYT @whytro? I think you can uncomment those tests, and I can play with the fix I proposed (feel free to do it on your own, though it won't be very convenient: as you can see, GitHub asks for manual approval for each of your CI runs).
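The proposed workaround could be sketched like this inside the "Prepare environment" step (illustrative only: the value 4096 is an assumption, and it must not exceed the hard limit reported by `ulimit -Hn`):

```shell
# Raise the soft limit on open file descriptors for this job's shell,
# so each test's OSRM instance can keep its dataset files open until
# the GC eventually runs the C++ destructors.
ulimit -S -n 4096

# Print the new soft limit to confirm it took effect.
ulimit -S -n
```

Soft-limit changes made this way only apply to the current shell and its children, so the command has to run in the same workflow step (or shell) as the tests.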

@whytro (Contributor Author) commented Mar 22, 2023

Admittedly, I'm not too familiar with configuring CI workflows, but given the resources you've linked, I've made a minor adjustment in the section you mentioned that should hopefully fix it. I've also re-enabled the tests.

@SiarheiFedartsou (Member) commented Mar 22, 2023

> Admittedly, I'm not too familiar with configuring CI workflows,

I didn't mean it would be hard for you to do; I meant it wouldn't be very convenient, since each time you'd have to wait until I push this button :) But thanks for the try 👍

[Screenshot: GitHub's manual approval prompt for CI runs, 2023-03-22 14:19]

@whytro (Contributor Author) commented Mar 22, 2023

Oh, no worries, I understood you there. I just meant that I'd given it a try, but I'm not too sure whether the change I made was correct, given my unfamiliarity with it!

@SiarheiFedartsou (Member)

@whytro many thanks for this contribution 👍 The problem with the tests is still there, so I want to understand why it happens (I'm playing with it in #6578) and will update this PR afterwards with the fix I come up with.

@whytro (Contributor Author) commented Mar 22, 2023

@SiarheiFedartsou Thank you for the guidance! Should I close out the other PR (#6572)?

@SiarheiFedartsou (Member)

> @SiarheiFedartsou Thank you for the guidance! Should I close out the other PR (#6572)?

I think we can keep it for the future and merge it with the next major version update.

@whytro (Contributor Author) commented Mar 22, 2023

Okay! I'll also work on refining it, since while working on this one I noticed some things in that PR that I didn't like or had missed. That way, it'll be in better shape for when the next major version update is being prepped.

@nilsnolde (Contributor)

Thanks for all the work and for hanging in there, @whytro! I'd just like to remind you not to forget about the application document for GSoC :) There's still time, of course; I just wanted to mention it.

@whytro (Contributor Author) commented Mar 23, 2023

Thanks! I've been reading through the guidelines quite a bit regarding the application. I'll probably drop any remaining questions about the scale/expectations of #6558 after a day or two of additional preliminary research!

@nilsnolde (Contributor)

Feel free to get in touch beforehand; @SiarheiFedartsou and I both have public email addresses on GitHub. We could schedule a getting-to-know-each-other meeting if you want; I'd personally prefer that. Then we can also let you know what we'd expect in terms of the application, so you don't have to spend too much time on it.

@SiarheiFedartsou (Member)

Hey @whytro,
I fixed those Node tests: I'm not sure why we had this "Too many open files" error, but it went away as soon as I added the dataset path when creating the OSRM instance. So now I think we are good to merge.

Thanks for this contribution, and looking forward to working with you on our GSoC project 🚀

@SiarheiFedartsou SiarheiFedartsou merged commit d516314 into Project-OSRM:master Mar 24, 2023
@whytro whytro deleted the default_bearing_radius-flag branch March 28, 2023 07:22
@whytro whytro changed the title Add support for a default_bearing_radius flag Add support for a default_radius flag Apr 7, 2023
whytro added a commit to whytro/osrm-backend that referenced this pull request Apr 27, 2023
mattwigway pushed a commit to mattwigway/osrm-backend that referenced this pull request Jul 20, 2023