Big file issue #634
Closed
Hello,
I am trying to download and upload files between S3 and the local filesystem using Knp Gaufrette in a Symfony project.
My code runs in a container on OpenShift and works fine for small files, but with large files (1 GB) I run into memory issues; my container is limited to 1.5 GB of memory. I don't understand why memory usage grows so high. Maybe I misunderstand how streams work in PHP, but I thought a streamed file was not loaded into memory all at once, only read chunk by chunk.
With my code I can see that as soon as I do

```php
$srcStream = $this->fs_s3->createStream($filename);
$srcStream->open(new StreamMode('rb+'));
```

my memory grows with the size of the file. I also tried

```php
copy('gaufrette://s3/' . $filename, 'gaufrette://nfs/' . $filename);
```

but the result is the same. Am I using streams the wrong way? Any advice?
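For reference, a chunked copy through the Gaufrette stream wrapper might look like the sketch below. It assumes the `gaufrette://` protocol has been registered for filesystems named `s3` and `nfs`, and that `$filename` is defined; note that whether memory actually stays flat still depends on how the S3 adapter buffers reads internally, which may be the real cause of the growth observed above.

```php
<?php
// Sketch: copy a file between two Gaufrette filesystems in fixed-size chunks,
// instead of relying on copy()'s internal buffering.
// Assumes a FilesystemMap with "s3" and "nfs" entries has already been built.
use Gaufrette\StreamWrapper;

// Register the gaufrette:// protocol so fopen() can address the filesystems.
StreamWrapper::register();

$src = fopen('gaufrette://s3/' . $filename, 'rb');
$dst = fopen('gaufrette://nfs/' . $filename, 'wb');

// Read and write 1 MiB at a time; only one chunk is held in memory at once
// (provided the underlying adapter streams rather than pre-loading the object).
while (!feof($src)) {
    fwrite($dst, fread($src, 1024 * 1024));
}

fclose($src);
fclose($dst);
```

If memory still grows with this approach, the adapter itself is likely reading the whole object into memory before serving any chunk, in which case no amount of chunking on the caller's side will help.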
Thank you in advance for your help.
Regards