Celery 4 support #67

Open: wants to merge 4 commits into base branch develop

Conversation

@ddemid ddemid commented Oct 3, 2017

The logic in the task has mostly been moved to a dedicated handler class, and the task itself is now a minimal @shared_task.

Added proper exception handling.
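
A minimal sketch of that structure, for orientation. The task name matches the one that appears later in this thread (celery_haystack.tasks.haystack_signal_handler); the settings-based lookup and the handler's handle() method are assumptions for illustration, not code taken from this PR.

    # sketch only; the handler lookup and the handle() method name are assumptions
    from celery import shared_task
    from django.conf import settings
    from django.utils.module_loading import import_string

    @shared_task
    def haystack_signal_handler(action, identifier, **kwargs):
        # the task stays thin: all index update/delete logic lives in the handler class
        handler_class = import_string(settings.CELERY_HAYSTACK_HANDLER)
        handler_class().handle(action, identifier, **kwargs)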

@ddemid ddemid closed this Oct 3, 2017
@ddemid ddemid reopened this Oct 3, 2017
@ddemid ddemid mentioned this pull request Oct 3, 2017
grjones commented Feb 6, 2018

Hello. Any idea when this might be merged in?


def get_update_task(task_path=None):
    import_path = task_path or settings.CELERY_HAYSTACK_DEFAULT_TASK
    return get_class(import_path)

Are you sure this is working? The former get_update_task returns an instantiated task, but this returns the class.


It seems like he left that part alone in the latest commit of his fork:
https://github.com/dwintergruen/celery-haystack/blame/develop/celery_haystack/utils.py

I am going to use his library to see if it works, but it seems like it should work.
@ddemid Do you plan to make a PR with your latest commit?

ddemid (Author)

@grjones Yes, I'm sure. This version is meant to work with @shared_task, which is a function, as recommended in the Celery documentation.

I'm not a big fan of huge class-based tasks which are harder to test.
Custom functionality should live in the handler in my opinion.

pasevin commented Aug 1, 2018

Still no merge?

HugoArts commented Aug 8, 2018

I'm testing this branch right now and it appears to work with Celery 4.x. There is one thing to be aware of, though: if you are upgrading from a previous version of celery-haystack and you were setting a custom CELERY_HAYSTACK_DEFAULT_TASK, that task must now be a function instead of a class, as mentioned above by @ddemid.

You can still use your custom default task class, but you must now register it as the handler via the CELERY_HAYSTACK_HANDLER setting. In other words, in your settings just change DEFAULT_TASK to HANDLER and you should be mostly fine.
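
To make that concrete, a sketch of the settings change (the dotted path is a hypothetical example, not taken from this PR):

    # settings.py sketch; "myapp.search.MySignalHandler" is a hypothetical path

    # before, with the old class-based task:
    # CELERY_HAYSTACK_DEFAULT_TASK = "myapp.search.MySignalHandler"

    # after, on this branch: the same class is registered as the handler instead
    CELERY_HAYSTACK_HANDLER = "myapp.search.MySignalHandler"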

I would really like this pull request to be officially merged. I'm relying on other packages which require Celery 4.x, so using them in combination with celery-haystack is very difficult right now. The change of settings variable names does make this a backwards-incompatible change, though; if that hadn't changed, it would seem like an obvious merge.

@barseghyanartur

Any progress on this?

pasevin commented Mar 3, 2019

Looks like this repo is dead :/

@micahscopes

Any word on this?

@johnyoonh

I use this fork, though I haven't checked all of its features. It's a fork of a fork of this library:
https://github.com/dwintergruen/celery-haystack/tree/celery4

micahscopes commented Apr 16, 2019

@johnyoonh ooh, thanks for suggesting that fork!

I must admit that I'm personally fairly new to celery. That said, my team has been using celery successfully in our Django project, so I know at a basic level we have it set up correctly.

Unfortunately, I've tried the main fork and now the fork you've suggested, and I'm still having the same issue with each of them.

Specifically, celery-haystack tasks are being sent to the worker but then they just sit there.

For example, if I create a new item for a haystack indexed model, the worker logs show:

studio-indexing-worker_1  | [2019-04-16 02:54:54,574: INFO/MainProcess] Received task: celery_haystack.tasks.haystack_signal_handler[870898d5-85b1-4bbf-ae84-206f4371250d]  ETA:[2019-04-16 02:54:59.572244-07:53] 
studio-indexing-worker_1  | [2019-04-16 02:54:54,574: DEBUG/MainProcess] basic.qos: prefetch_count->98

But then nothing happens!

When I do celery -A contentcuration inspect scheduled I get loads and loads of accumulated tasks that look like this:

{u'eta': u'2019-04-16T02:54:59.574667-07:53',
   u'priority': 6,
   u'request': {u'acknowledged': False,
    u'args': u"('update', 'contentcuration.contentnode.cee283f55ffc4322b4d21ad81b4567f3')",
    u'delivery_info': {u'exchange': u'',
     u'priority': 0,
     u'redelivered': None,
     u'routing_key': u'indexing'},
    u'hostname': u'indexing-worker@1eb5d32b3752',
    u'id': u'953cf225-9143-44b9-bf03-4cd61f8f7e0b',
    u'kwargs': u'{}',
    u'name': u'celery_haystack.tasks.haystack_signal_handler',
    u'time_start': None,
    u'type': u'celery_haystack.tasks.haystack_signal_handler',
    u'worker_pid': None}
}

The worker is started like this: cd contentcuration/ && celery -A contentcuration worker -Q indexing -l debug -n indexing-worker@%h

I've tried using the default celery worker settings, as well as the extra options --without-gossip --without-mingle --without-heartbeat -Ofair.

Any thoughts? Am I missing something really basic about celery? I'm struggling to get a grip on how I can even further diagnose this.

@johnyoonh

@micahscopes Please don't hijack the issue; your question is more appropriate for Stack Overflow.
I'd check whether your broker is set up properly. I can't tell whether you are using RabbitMQ or Redis, so I can't comment. Also, remove the queue option and start with the default queue.
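
For reference, a minimal sketch of that suggestion; the broker URL is a placeholder, and this assumes the usual Celery 4 Django setup where settings are read with the CELERY_ namespace.

    # settings.py sketch; the broker URL is a placeholder
    CELERY_BROKER_URL = "redis://localhost:6379/0"

    # start the worker without -Q so it consumes the default "celery" queue:
    #   celery -A contentcuration worker -l debug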

micahscopes commented Apr 17, 2019

@johnyoonh Apologies, I didn't intend to hijack the issue... it's unclear to me whether the issue I'm facing is related to something that changed with Celery 4. As you suggested, I can try Stack Overflow.

edit: I realize that this is a PR. My bad! I'll open a separate issue.

micahscopes commented Jun 25, 2019

Hey again. I wanted to share back that my issue with Celery 4 support had to do with a bug in Celery 4.1.x where scheduled task times weren't correctly calculated for a given countdown and timezone. The bug was fixed in Celery 4.2.

If this PR ever gets merged, perhaps it'd be a good idea to drop support for Celery 4.1.x and only support Celery >= 4.2.
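
A sketch of how that constraint could be expressed in the package metadata; the bounds are illustrative and not part of this PR.

    # setup.py sketch; version bounds are illustrative only
    install_requires = [
        "celery>=4.2",      # avoid 4.1.x because of the countdown/timezone scheduling bug
        "django-haystack",
    ]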

nikolas commented Mar 11, 2021

@acdha can you take a look at this PR so we can get this moving forward?

Contributor

acdha commented Mar 12, 2021

@acdha can you take a look at this PR so we can get this moving forward?

I don't use this project myself so I'd like to get some feedback from anyone actually running it. Has anyone used it much?

9 participants