Hi! We cleaned up your code for you! #17

Open
GunioRobot wants to merge 49 commits into master

Conversation

GunioRobot

Hi there!

This is WhitespaceBot from Gun.io. I'm an open-source robot that removes trailing white space in your code, and gives you a gitignore file if you didn't have one! I've only cleaned your most popular project, and I've added you to a list of users not to contact again, so you won't get any more pull requests from me unless you ask. If I'm misbehaving, please email my owner and tell him to turn me off!

== About Gun.io ==
Gun.io is a place for hackers to hire each other for small tasks. We offer no-hassle, winner-take-all freelance gigs, by hackers, for hackers. Got a bug you can't fix or a feature you want for your project? Post a gig and have somebody else sort it out for you. Oh, and it's free for open source! Sign up and get notified about new gigs you can work on!

== About WhitespaceBot ==
WhitespaceBot is a simple open source robot which uses GitHub's API as a way of cleaning up open source projects! We've put up a paid bounty for whoever can add security fixing features to it.

Morgan Goose and others added 30 commits March 18, 2010 15:57
This is managed by decorators, or command line flags, or both.

It uses the multiprocessing module's Process class to fork a process for
each host a task runs on. It will also check whether a task is set to be run
sequentially instead of in parallel, like functions decorated with
@runs_once.
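
A minimal sketch of that fork-per-host dispatch, assuming a hypothetical
`run_task_on_host` helper and a simplified `runs_once` marker (the real
decorator also caches the task's return value):

```python
from multiprocessing import Process


def runs_once(func):
    # Simplified stand-in for Fabric's @runs_once: just flag the task as
    # sequential so the dispatcher below keeps it out of the parallel path.
    func.sequential = True
    return func


def run_task_on_host(task, host):
    # Hypothetical placeholder for "connect to `host` and execute `task`".
    print("running %s on %s" % (task.__name__, host))


def execute(task, hosts):
    if getattr(task, "sequential", False):
        # Sequential model: run the task host by host in the current process.
        for host in hosts:
            run_task_on_host(task, host)
        return
    # Parallel model: fork one Process per host, then wait for them all.
    procs = [Process(target=run_task_on_host, args=(task, h)) for h in hosts]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```

Calling `execute(deploy, ["web1", "web2"])` would then fork two processes,
while a `@runs_once`-style task would stay on the sequential path.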
Falls back to the sequential execution model if it can't load
multiprocessing. It also adds a bit more checking to make sure a
@runs_parallel task won't raise an exception on Python versions without the
multiprocessing module.
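
The import guard for that might look roughly like this sketch (the
`parallel` attribute is an assumption, not the patch's exact API):

```python
try:
    import multiprocessing
except ImportError:
    # e.g. older Pythons without the multiprocessing module or its backport
    multiprocessing = None


def runs_parallel(func):
    # If multiprocessing is missing, leave the task marked sequential
    # instead of raising at decoration time.
    func.parallel = multiprocessing is not None
    return func
```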
This makes it simpler to read what is going on, and describes in detail
which cases will make a function run in parallel, when it is able to do so.
I'll wait to try implementing my own Pool, as the lack of shared state is
killing the Fabric tasks. They lose all knowledge of their env and the
specific host they are on.

I need to make a pool-type class/function that keeps a bubble of x
Process()es running, and holds it at that size until everything completes.
So I have added in the Job_Queue class, and incorporated it into the main fab
section to manage the multiprocessing Processes.

I have hooked into state.output to toggle the _debug flag for the job_queue.

The job queue uses the host_string var, which gets set as the Process().name,
to re-set env.host_string inside the running queue.
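
Put together, a stripped-down version of that Job_Queue might look like the
sketch below: a bounded pool of Processes, a _debug flag (toggled from
state.output in the real patch), and the host string recovered from
Process().name inside the child. The names here are paraphrased from these
commit messages rather than copied from the branch.

```python
import time
from multiprocessing import Process, current_process


class Job_Queue(object):
    """Keep at most `pool_size` Processes running at any one time."""

    _debug = False  # the real patch flips this from state.output

    def __init__(self, pool_size):
        self._pool_size = pool_size
        self._queued = []
        self._running = []

    def append(self, process):
        self._queued.append(process)

    def start(self):
        while self._queued or self._running:
            # Drop children that have finished.
            self._running = [p for p in self._running if p.is_alive()]
            # Top the pool back up to pool_size.
            while self._queued and len(self._running) < self._pool_size:
                p = self._queued.pop(0)
                if self._debug:
                    print("starting job for %s" % p.name)
                p.start()
                self._running.append(p)
            time.sleep(0.1)


def _run_task(task):
    # Inside the child the host travels in the Process name, so the
    # equivalent of env.host_string can be re-set before the task runs.
    host_string = current_process().name
    task(host_string)


def greet(host):
    print("hello from %s" % host)


if __name__ == "__main__":
    jobs = Job_Queue(pool_size=2)
    for host in ["web1", "web2", "db1"]:
        # name=host is what lets _run_task recover the host string.
        jobs.append(Process(name=host, target=_run_task, args=(greet,)))
    jobs.start()
```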
I was getting a non-fatal error when the specified pool was larger than the
number of hosts. So in that event I just set the pool size to the number of
hosts.
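
That fix amounts to a clamp along these lines (a sketch with a hypothetical
helper name, not the exact diff):

```python
def effective_pool_size(pool_size, hosts):
    # Never ask for more worker Processes than there are hosts to run on.
    return min(pool_size, len(hosts))
```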
This will make the Job_Queue have a len throughout its life, instead of just
at the close().
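
In the Job_Queue sketch above, that would mean defining __len__ on the class
itself, so len() is valid from construction onward rather than only after
close() has run. A possible version:

```python
    def __len__(self):
        # Everything handed to the queue counts, whether still waiting or
        # already running, so len(queue) works for the queue's whole life.
        return len(self._queued) + len(self._running)
```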
Conflicts:
	fabric/decorators.py
	fabric/main.py
	fabric/state.py
	tests/test_decorators.py
…halves the pool_size if larger than len(hosts)