Hi,
We are offering downloadable audio files of up to 2 GB.
When I order a file of 443 MB or bigger and perform a post-order update and send the Internal Mail Confirmation or Customer Mail Confirmation, I get the following error message:
Fatal error: Out of memory (allocated 448266240) (tried to allocate 443063863 bytes) in H:\Websites\DWTC_com_Shop\khxc-private-com\core\KHXC_File\KHXC_File.php on line 351
It may be that copying the audio file from the private folder to the public folder (both on the same partition) takes too long, so the script finishes before the file is copied.
It's not a disk-space problem; the drive has about 150 GB free.
I don't know exactly where the limit is; a 250 MB file is still processed.
How can I solve this problem?
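For what it's worth, the attempted allocation in the error (443,063,863 bytes) matches the 443 MB file almost exactly, which suggests the copy routine buffers the entire file in script memory. A hypothetical reconstruction of that failing pattern (this is a guess at the behavior, not the actual KHXC_File.php code, and the function name is my own):

```php
<?php
// Hypothetical reconstruction of the failing pattern (not the actual
// KHXC_File.php code): the whole file is buffered in script memory,
// so memory use grows linearly with file size.
function copy_in_memory(string $source, string $dest): bool
{
    $data = file_get_contents($source); // allocates >= the file size in RAM
    if ($data === false) {
        return false;
    }
    // A fatal "Out of memory" occurs here once the allocation above
    // pushes the script past PHP's memory_limit.
    return file_put_contents($dest, $data) !== false;
}
```

If this is what the core file does, raising memory_limit only moves the ceiling; it does not remove it.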
Thanks for any help.
Offline
PHP would have to be configured with enough memory to cover your largest file (plus some extra), which I seriously doubt any provider will do (shared or dedicated).
Offline
Thank you for the answer.
How would I configure PHP to use more memory, and what are the consequences?
php.ini contains the following line:
memory_limit = 16M ; Maximum amount of memory a script may consume
Is this the right place?
The server is our own server and we can change whatever is good and necessary.
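To answer my own question partly: memory_limit in php.ini does appear to be the right place (the web server must be restarted for the change to take effect), and the main consequence is that every concurrently running script may use up to that limit, so a high value on a busy server can exhaust physical RAM. The limit can also be inspected, and sometimes raised, from inside a script, as in this sketch:

```php
<?php
// Inspect and (where the configuration allows it) override memory_limit
// for the current script only; this does not change php.ini itself.
echo 'current limit: ', ini_get('memory_limit'), "\n"; // e.g. "16M"
ini_set('memory_limit', '512M');  // per-script override
echo 'new limit:     ', ini_get('memory_limit'), "\n";
```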
Offline
On a Windows server, the memory settings in php.ini look like this:
;;;;;;;;;;;;;;;;;;;
; Resource Limits ;
;;;;;;;;;;;;;;;;;;;
max_execution_time = 30 ; Maximum execution time of each script, in seconds
max_input_time = 60 ; Maximum amount of time each script may spend parsing request data
;max_input_nesting_level = 64 ; Maximum input variable nesting level
memory_limit = 512M ; Maximum amount of memory a script may consume (128MB)
I changed it to the following settings:
;;;;;;;;;;;;;;;;;;;
; Resource Limits ;
;;;;;;;;;;;;;;;;;;;
max_execution_time = 30 ; Maximum execution time of each script, in seconds
;max_input_time = 60 ; Maximum amount of time each script may spend parsing request data
max_input_time = 120 ; Maximum amount of time each script may spend parsing request data
;max_input_nesting_level = 64 ; Maximum input variable nesting level
;memory_limit = 512M ; Maximum amount of memory a script may consume (128MB)
memory_limit = 2048M ; Maximum amount of memory a script may consume (128MB)
But this did not make a difference.
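Since even 2048M made no difference, one thing I tried to rule out is that the running PHP never saw the edit: on Windows the web server has to be restarted after changing php.ini, and PHP can load a different ini file than the one edited. A quick diagnostic sketch (php_ini_loaded_file() requires PHP 5.2.4 or later):

```php
<?php
// Print which php.ini the running interpreter actually loaded and the
// memory_limit it is enforcing; compare these against the edited file.
$loaded = php_ini_loaded_file();
echo 'loaded php.ini: ', ($loaded !== false ? $loaded : '(none)'), "\n";
echo 'memory_limit:   ', ini_get('memory_limit'), "\n";
```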
If the file size is more than about 380 MB, the file is not copied to the public folder, and it seems the script does not even try to copy it.
When I make a post-order update and resend the confirmation email, I immediately get a blank page.
When the file size is less than 380 MB, the post-order update takes about 20 seconds and I get the post-order update page again.
The server has 4 GB of RAM.
Any advice on this?
Offline
The method currently used for copying downloadable files is simply not going to work for files that large. There MAY be a new method available in a core file after the next update is released that MAY solve your problem, but it will require a core file change (which can't happen until an update is released).
Offline
Thanks Dave, I am trying to solve this in another way.
Offline