
How to configure php.ini to maximize performance when importing products


Recommended Posts

Hi,

 

I've installed PrestaShop 1.5.3.1 on a Bluehost.com shared hosting account.

I'm using a CSV file to import my products (more than 30,000), but I can't import more than 200 products at a time due to a timeout error.

In the best case, each set of 150~200 products takes about 10 minutes to import; then it stops and either gives me an Error 500 or just freezes without importing any more products.

 

Each product has 4 images (600x600px jpeg) that are uploaded on the same server, and I'm using absolute URLs in my CSV file to import them.

 

It's tedious to keep importing just 150~200 products at a time. I've heard that some people have managed to import products in batches of 1,000 or 2,000, which would be so much better.

 

I believe it has something to do with my php.ini file; here is its exact content:

 

 

max_input_vars = 5000;

suhosin.post.max_vars = 3000;

suhosin.request.max_vars = 3000;

max_execution_time = 10000;

memory_limit = 512M;

 

I just have the above lines in my php.ini file.

 

Any piece of info or tip is highly appreciated.

 

thanks

sia


max_execution_time = 0 ; no limit on script execution time

memory_limit = -1 ; no memory limit

 

Don't forget to change the values back after the import.

 

Thanks for the info. I've made those changes in the php.ini file, but unfortunately without success; I can still import only 150~200 products in each try.

 

I've made a phpinfo file and confirmed that the above changes in the php.ini file were active, so I really don't know what the issue is now!


hey

 

maybe you aren't allowed to change the configuration?

 

Create a phpinfo.php script with this content:

 

<?php
phpinfo();
?>

 

Open it and you've got the PHP settings there. Are those values (from several posts above) configured properly?


Most shared hosting packages will not allow you to execute long-running processes; it defeats the purpose of shared hosting.

 

You can create a cron job that executes every 5 minutes and imports 100 products at a time. You'll likely need to create a script that manages which products to import and marks them complete so they are not imported twice.
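
To give a rough idea, here is a minimal sketch of one way such a script could look (my own assumption, not PrestaShop's import code). It assumes a made-up staging table product_import_queue with an imported flag, placeholder database credentials, and a placeholder import function; adapt all of those names to however you stage your CSV data.

<?php
// Hypothetical crontab entry (the path is made up):
// */5 * * * * php /home/youraccount/scripts/import_batch.php

// Connect to the database that holds the staged CSV rows (credentials are placeholders).
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'db_user', 'db_pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Fetch the next batch of rows that have not been imported yet.
$rows = $pdo->query(
    'SELECT id, csv_line FROM product_import_queue WHERE imported = 0 LIMIT 100'
)->fetchAll(PDO::FETCH_ASSOC);

$markDone = $pdo->prepare('UPDATE product_import_queue SET imported = 1 WHERE id = :id');

foreach ($rows as $row) {
    // Your real import logic for one CSV line goes here (create the product,
    // attach its images, etc.).
    import_single_product($row['csv_line']);

    // Flag the row as done so the next cron run skips it.
    $markDone->execute(array('id' => $row['id']));
}

// Placeholder so the sketch is self-contained; replace with real import code.
function import_single_product($csvLine)
{
    // ...
}
?>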


Yes, I have confirmed all those changes in the php.ini file are actually active.

Making a cron job makes sense, but it needs some scripting knowledge that I don't have.

 

I found a way to make the import faster, though: I've disabled automatic search keyword indexation in "Preferences > Search". As a result, the overall load on the server decreased because it doesn't have to both import the products and index them at the same time; I can manually index the products later. This way I managed to import twice as fast, 300~330 products with images in each try.


<?php
// Quick test of whether this server lets you raise max_execution_time.
$start = (float) array_sum(explode(' ', microtime()));

// Read the current limit and report it.
$timelimit = ini_get('max_execution_time');
echo 'your time limit: ' . $timelimit . ' seconds<br />';

// Try to raise the limit by 10 seconds, then sleep for almost that long.
$newtimelimit = $timelimit + 10;
echo 'new timelimit: ' . $newtimelimit . ' seconds<br />';
set_time_limit($newtimelimit);

$scriptpause = $newtimelimit - 5;
sleep($scriptpause);

// Report how long the script actually ran.
$end = (float) array_sum(explode(' ', microtime()));
echo 'script working time: ' . sprintf('%.4f', ($end - $start)) . ' seconds';
?>

 

Can you upload this script to your server, run it, and copy the output here?

 

The script performs an execution time test.


Yes, of course, here are the results:

 

your time limit: 0 seconds

new timelimit: 10 seconds

script working time: 5.0003 seconds

 

So everything works fine with max_execution_time; you've got unlimited execution time. The problem is probably somewhere else.

 

What is your memory_limit in phpinfo()?

 

What is the size of the file you tried to import?


Yes, you're right, but the script shows us that he is allowed to increase/decrease the value of the max_execution_time param.

 

Yes, but my point is... it does not matter if the host has a separate process that kills long-running processes.

This is Bluehost on a shared hosting package; you do realize they kill long-running processes, right?

https://my.bluehost.com/cgi/help/478

 

@danx88: Here is a script that will run indefinitely and occasionally echo output. See how long this runs before it gets killed; it should be 5 minutes, unless you have a dedicated IP address, in which case it will get killed if you consume too much memory.

 

<?php
// Run forever (or until the host kills the process) and echo periodically
// so there is visible output while it runs.
set_time_limit(0);

$count = 0;
while (true) {
    $count++;
    if ($count > 10000) {
        echo 'Still running<br />';
        $count = 0;
    }
}
?>

 

Then contact your host and ask them why the process is being killed.


Memory limit is set to -1 for both the local and master values.

And my CSV file size is about 3 MB for 1,000 lines, and about 1 MB for each batch of ~300 products.

 

upload_max_filesize is 2M and post_max_size is 8M

 

Increase the upload_max_filesize value, for example to 200M :P


And how will increasing the upload size help with the process being killed? If the file could not be uploaded, it would fail immediately.

 

You guys are overlooking the most basic concept here; please review my previous response.


I think Bellini is correct, because I don't have a dedicated IP and they clearly say that they'll kill processes after 5 minutes.

But I'm going to try increasing upload_max_filesize as well to see what difference it makes. My own experience is that they kill the process after 10 minutes rather than 5.


The max upload size will only affect the size of the file upload. If the file were too large, it would fail immediately.

 

If you increase the max upload size, you also need to increase the max post size, as they go hand in hand. It is generally considered a security issue if you make this size too large, so I would not suggest increasing it where it would have a global or public effect.
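
For reference, a minimal php.ini sketch with both directives raised together might look like the lines below; the 64M figures are arbitrary example values of mine, not a recommendation, so keep them only as large as your import actually needs.

; example values only: raise both limits together;
; post_max_size should be at least as large as upload_max_filesize
upload_max_filesize = 64M
post_max_size = 64M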


And how will increasing the upload size help with the process being killed?

 

I didn't say that increasing the upload size value will help with the process being killed. We still don't know where the problem is.


  • 3 weeks later...

Any update from the original poster on this? I'm hitting the same issue, but even with all the changes to php.ini (and making sure they were effective via phpinfo) I'm only able to push through about 77-100 at once, and turning off the search indexing didn't seem to have any impact.

 

I'd really like to see some sort of import queue so we can just give it a file and watch it populate as the server finds time, rather than just dropping it.


  • 1 year later...

We had this problem and it was solved by setting several things. We are using PHP-FPM and nginx.

 

php.ini:

max_execution_time = 0 ; sets script execution time to unlimited

max_input_time = -1 ; sets input request processing time to unlimited

 

php-fpm (www.conf):

request_terminate_timeout = 0 ; keeps php-fpm from overriding max_execution_time

 

Make sure you restart PHP-FPM.

