Tamas Szegedi Posted February 2, 2009

Hey, my host told me that my PrestaShop site causes very high CPU usage, mainly in product.php and category.php, though only temporarily. I think Googlebot (and other spiders/bots) cause this, so I added those two files to robots.txt. I don't know, maybe that helps?! What do you think, could that be the problem? I have more than 1800 products, so maybe Googlebot hits every product very fast?

EXAMPLE:
Googlebot: 7725+56 hits, 144.65 MB (2009 Jan 31, 23:59)
Yahoo Slurp: 5745+239 hits, 113.07 MB (2009 Jan 31, 23:58)

So could that cause the high load? Please help me, or my host will delete my site. (THANK YOU!)
Guest Posted February 2, 2009

You're growing... it's time to upgrade your hosting plan...
Tamas Szegedi Posted February 2, 2009 (Author)

It's not that easy. The hosting isn't underpowered; PrestaShop (product.php) is what causes the high load...
Guest Posted February 2, 2009

Quote (Tamas): "not the hosting low, the presta (product.php) makes too high process..."

Could you provide your URL?
Tamas Szegedi Posted February 2, 2009 (Author)

www.bamby.hu
giralatina Posted February 2, 2009

Can you run these commands: uptime and top?
Tamas Szegedi Posted February 2, 2009 (Author)

It's a shared server, not mine; where can I run those? :$
Guest Posted February 2, 2009

Without SSH access, you won't be able to run these commands... Ask your host for more details...
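Even without SSH, anything that can execute a script on the host can read the same 1/5/15-minute load averages that uptime prints. A minimal sketch in Python (just an illustration of the idea; the site in this thread runs PHP, which has a similar sys_getloadavg() function):

```python
import os

# 1-, 5- and 15-minute system load averages, the same numbers `uptime`
# reports. os.getloadavg() works on Linux/macOS; it raises OSError on
# platforms that don't expose load averages (e.g. Windows).
load1, load5, load15 = os.getloadavg()
print(f"load average: {load1:.2f}, {load5:.2f}, {load15:.2f}")
```

On Linux the raw values also live in /proc/loadavg, so even a plain file read works if the function is unavailable.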
Tamas Szegedi Posted February 2, 2009 (Author)

And what do you think about the robots.txt, will Google respect it? I really hope so. My robots.txt:

User-agent: *
# Directories
Disallow: /classes/
Disallow: /config/
Disallow: /download/
Disallow: /mails/
Disallow: /modules/
Disallow: /themes/
Disallow: /translations/
Disallow: /tools/
# Files
Disallow: /cart.php
Disallow: /category.php
Disallow: /order.php
Disallow: /product.php
Disallow: /my-account.php
Disallow: /history.php
Disallow: /addresses.php
Disallow: /identity.php
Disallow: /discount.php
Disallow: /authentication.php
Disallow: /pdf-invoice.php
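As a sanity check on rules like these, Python's standard library can parse a robots.txt and report whether a given crawler is allowed to fetch a URL. A minimal sketch using just the two rules that matter for the crawl load here (the bamby.hu URLs and the product id are illustrative examples, not real pages):

```python
import urllib.robotparser

# The two Disallow rules from the robots.txt above that target the
# pages the bots were hammering.
ROBOTS_TXT = """\
User-agent: *
Disallow: /product.php
Disallow: /category.php
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Disallow is a path-prefix match, so query strings are covered too.
blocked = rp.can_fetch("Googlebot", "http://www.bamby.hu/product.php?id_product=42")
# URLs that no Disallow rule matches stay fetchable.
allowed = rp.can_fetch("Googlebot", "http://www.bamby.hu/index.php")
print(blocked, allowed)  # False True
```

Note that well-behaved bots re-fetch robots.txt only periodically, which would explain the change not taking effect immediately.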
Guest Posted February 2, 2009

Who is your host?
Tamas Szegedi Posted February 2, 2009 (Author)

It's a Hungarian host: http://www.xhu.hu. But the problem isn't with the host, because it's a really big host in Hungary...
jhnstcks Posted February 2, 2009

Google used over 700 MB of my bandwidth in January (for one of my sites), so your 144 MB isn't very much in comparison. I think you need to find a better host; if they are a big host, then they have a lot of sites sharing their server space, which is why they threaten to delete your site.
Tamas Szegedi Posted February 2, 2009 (Author)

I don't know why, but I put the new rules into robots.txt, and the host sent me an e-mail saying it's okay now:

bambyhu bamby.hu, load: 0.36 / 1.51 / 0.0
Top process %CPU: 26.0
Top process %CPU: 21.0, /usr/bin/php /home/bambyhu/public_html/category.php
CPU: 0.36

So it's good now... I hope it stays that way. I don't really believe the robots were the cause, so I don't know; if somebody has an idea, I'm waiting for it. category.php and product.php sometimes do that... Here is what it looked like when it wasn't good: [attachment not preserved]
Guest Posted February 2, 2009

I too wasn't thinking of an issue with your host, but with PS's DB queries. I read this testimonial at http://www.journaldunet.com/developpeur/temoignage/temoignage/216931/phpshop-mitige/ — here is the English version, translated with Google:

"Have you encountered any difficulties? I was appalled by how these solutions structure and use the underlying data (MySQL). It was obviously written by excellent PHP programmers, but the database design is poor: no use of views (instead the SQL is hard-coded in PHP), no database roles (for basic but strong security), no stored procedures, no foreign key relationships..."

By the way, I'm not a DB expert, but it sounds like a direction worth exploring...
Tamas Szegedi Posted February 2, 2009 (Author)

I don't really understand it. Is it a problem that I create new rows in the ps_feature_value table whenever I create a new product? Every product has 8-10 features, each different, so I have more than 15,000 rows in the feature table... Could that be a problem?
Guest Posted February 2, 2009

From what I understand, PS is not DB-optimized...
Tamas Szegedi Posted February 2, 2009 (Author)

If you look at the picture I sent, you can see my SQL load: 0.0! The problem is with the PHP files, not with SQL...
Guest Posted February 2, 2009

If so, it's really a big problem... my 2 cents... most bottlenecks come from the DB side...
Tamas Szegedi Posted February 8, 2009 (Author)

Hmm... THE GOOGLEBOT WAS THE CULPRIT!!! I solved the problem. Maybe this is also important for you, so: I put some meta (HTML) lines into product.php and category.php, and it's working!! I already tested it, and it works. Before I added them, Googlebot used maybe 30-35 MB/day; now it uses 0.5 to at most 1 MB/day. So it's really good. The meta lines:

<meta name="robots" content="noindex, nofollow">
<meta name="distribution" content="global">
<meta name="language" content="HU"> ---- OR: "ENG" or "IT"
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">

And if you want, you can add this as well:

<meta name="revisit-after" content="3 Months"> ---- OR: "1 day" or "7 days" etc... You can choose the revisit time
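A caveat worth flagging: "noindex, nofollow" doesn't just reduce crawling, it tells search engines to drop those pages from their index entirely, so the products will stop appearing in search results. If you use it anyway, a quick way to verify the tag actually made it into the rendered page is to parse the HTML. A minimal sketch with Python's standard library (the sample page string and the class name are illustrative, not part of PrestaShop):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the directives of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)  # attrs arrives as a list of (name, value) pairs
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.directives += [d.strip() for d in (a.get("content") or "").split(",")]

# In practice this would be the HTML returned by product.php.
page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
finder = RobotsMetaFinder()
finder.feed(page)
print(finder.directives)  # ['noindex', 'nofollow']
```

Fetching the live page and feeding its body through the same parser confirms the tag survived templating and caching.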
Damien Metzger Posted February 8, 2009

The revisit-after meta is a legend; it was only ever used by one search engine, something like 10 years ago.
Guest Posted February 8, 2009

I don't really think the CPU load is due to Google; if it were, every website in the industry would have huge load... I guess it really has something to do with PS itself...
nathanielleee Posted February 29, 2012

I'm still having major spikes and still can't figure it out. I have close to 500 products on my site, but it's using close to 60 GB/month.
Shoperis Posted January 4, 2013

I've noticed that a lot of people have this problem: CPU usage hits the maximum without big traffic on the PrestaShop site. I have this problem too, but I haven't found anyone anywhere advising how to solve it. Could you advise how to solve this problem? Thank you.

PrestaShop version: 1.4.0.17
About 250 categories
About 25,000 products
There is no difference whether 70 or 300 people visit daily.

Hosting:
Bandwidth: 2,000 GB/month + 720 GB/month loyalty bonus (used 7.80 GB)
Disk space: 20,000 MB + 7,200 MB loyalty bonus (used 3,100 MB; inodes at 100%)
MySQL 5.5: unlimited
PHP 5.2 / 5.3 / 5.4: yes
PHP processes: 8 (limit exceeded 1,316 times a day)
wlongdonstudio Posted January 6, 2013

(Replying to Shoperis above.) What's your memory (RAM)?
wlongdonstudio Posted January 7, 2013

(Replying to Tamas's meta-tag post above.) Where do I put this? How do I solve this spike?
Shoperis Posted January 7, 2013

Quote (wlongdonstudio): "what's your memory?"

I have no problem with RAM. My shop uses too many PHP processes (CPU). I don't have separately allocated RAM; my host said the shared server has 16 GB of RAM, and as long as I don't exceed the PHP process limit, RAM will always be sufficient.