
Magento Forum

Page 16 of 17
Poll
Do you think the product import in Magento is too slow?
Yes, it should be improved! (613 votes)
No, it’s fine. There are more important things to work on. (19 votes)
What are you talking about? (1 vote)
Total Votes: 633
Import Speed / Performance optimization
 
Roberts Distributors
Jr. Member
 
Total Posts:  2
Joined:  2010-09-16
 

OK, so, after using the earlier example file for/from maggie, which outputs each import as a table row, I can determine that the following function just isn't doing anything for me:

foreach ($contents_array as $line) {

    $sku = $line['sku'];
    $qty = $line['qty'];

    // check whether a product with this SKU exists
    $exists = $db_magento->query("SELECT COUNT(sku) cnt FROM catalog_product_entity WHERE sku = '$sku' LIMIT 1");
    $find_product = (($exists->fetchObject()->cnt) > 0) ? true : false;

    if ($find_product == true) {
        // look up the entity_id for the SKU and update its stock quantity
        $entity_id = getEntityID_bySKU($db_magento, $sku);
        updateQTY($db_magento, $entity_id, $qty);
        echo "<tr><td>$sku</td><td>$qty</td></tr>";
    }
}

Now, as far as I can tell the DBs are exactly the same as in Community Edition, so, is anyone aware of a difference in Mage.php that might account for this? Anything I should look for?
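For reference, the two helpers the loop calls are defined in that earlier example file, not on this page; bodies along these lines would match how they are called (a sketch only, assuming $db_magento is a PDO connection and the default Magento 1.x table layout, not the original poster's code):

function getEntityID_bySKU($db_magento, $sku)
{
    // look up the product's entity_id by SKU
    $stmt = $db_magento->prepare("SELECT entity_id FROM catalog_product_entity WHERE sku = ? LIMIT 1");
    $stmt->execute(array($sku));
    $row = $stmt->fetchObject();
    return $row ? (int) $row->entity_id : false;
}

function updateQTY($db_magento, $entity_id, $qty)
{
    // write the new quantity straight into the stock item row
    $stmt = $db_magento->prepare("UPDATE cataloginventory_stock_item SET qty = ?, is_in_stock = ? WHERE product_id = ?");
    $stmt->execute(array($qty, ($qty > 0 ? 1 : 0), $entity_id));
}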

 
 
shinesoftware
Jr. Member
 
Total Posts:  21
Joined:  2009-01-22
Italy
 

Is there a way to use LOAD DATA INFILE directly in order to create the categories and products?

Thanks
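LOAD DATA INFILE only gets the raw rows into MySQL quickly; products and categories still have to be mapped into Magento's EAV tables afterwards, so the usual approach is to bulk-load into a flat staging table first. A minimal sketch of that staging step, assuming a PDO connection with local infile enabled and a hypothetical import_staging table (not something Magento ships):

// bulk-load the raw CSV into a flat staging table; mapping the rows into
// Magento's catalog/EAV tables is a separate step
$db->exec("
    LOAD DATA LOCAL INFILE '/tmp/products.csv'
    INTO TABLE import_staging
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
    (sku, name, price, qty)
");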

 
 
johjoh
Member
 
Total Posts:  39
Joined:  2010-02-25
 

hey all,

I just read coldlamper's post:

coldlamper - 13 October 2010 07:54 AM

I have spent weeks discovering a huge memory leak in magento.  It duplicates an array of products over and over.  So for example you tried to import 1000 products you would have an array containing 1000 products 1000 times.  It eats up memory but also a lot of cpu parsing the info. I have cut down my script import time dramatically. There are other memory leaks that deal with object clean up that haven’t been able to pin-point but have a method to fix that too.  The fix is only 1 line of code.  If you are willing to compensate me I will give you the fix.  I usually would just post it but I really had to spend a lot of time and compile php into debug mode and run it thru valgrind and then use a php debugger.

If you import more than 100 products this fix really helps and it speeds up regular magento operation as well.  I contacted magento about this and they did not reply.
(...)

It's a bit confusing: this issue is discussed in several threads in this forum and elsewhere, and there are several solutions with comments like "doesn't work with xy" or "helps a bit, but doesn't solve the problem". And now coldlamper has found a one-line solution that Magento didn't find? The bad import speed is a big issue, no? This sounds a bit unbelievable all in all. But if it's true, we should find a way to convince coldlamper to publish his solution and to compensate him in some way.

what do you think?
johannes

 
 
shinesoftware
Jr. Member
 
Total Posts:  21
Joined:  2009-01-22
Italy
 

We have solved this problem with a new import project. The Magento classes, DataFlow, and the API are all very slow for huge files.
We have created a shell batch import tool and it is very fast. We have imported about 50,000 records and 800 categories in a few hours. The problem is the reindexing of the data, in particular the Catalog URL Rewrites. Anyway, if the import starts in the middle of the night, the database is updated by morning.
If you are interested in our import batch software, don't hesitate to contact us.
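If the Catalog URL Rewrites reindex is the slow part, on 1.4.x it can be kicked off from the same overnight cron instead of from the admin. A minimal sketch, assuming the script sits in the Magento root and that the index process code on your install is catalog_url (verify against the process list of the Mage_Index module):

require_once 'app/Mage.php';
Mage::app('admin');

// rebuild only the Catalog URL Rewrites index once the batch import has finished
$process = Mage::getModel('index/indexer')->getProcessByCode('catalog_url');
if ($process) {
    $process->reindexEverything();
}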

 
 
johjoh
Member
 
Total Posts:  39
Joined:  2010-02-25
 
Shine Software - 25 October 2010 09:34 PM

The problem is the reindexing of the data, in particular, the Catalog URL Rewrites.

No, not for me, as it seems. I wrote a script that mirrors the imported data in a DB table, so the Magento classes are only used on insert and update of products, and stock updates are done in the database directly. But I'm facing the problem that the time to import just one product climbs to far over 100 sec. If I stop the script and start it again right away, it starts at 1 sec per product again, and I can't find any missing data in the last products imported before I stopped.
So this suggests that it's a memory leak... and yes, the memory consumption increases by over 200 KB for every product imported. If there is an easy fix for this, it would be the holy grail of mass import.

I can't see what this could have to do with reindexing rewrites.

An alternative would be to use dweeves' script. But as it deals with the DB tables directly, it's not robust against changes in the DB architecture. Who knows if this stays untouched in the next updates?

johannes

 
 
chiefair
Mentor
 
Total Posts:  1848
Joined:  2009-06-04
 

So, anyone want to experiment with making 1.4.x.x DataFlow instead of being DataSlow?

Curing a Magento Memory Leak in 1.4.1.1

While he mentions it being used for reducing memory bloat during import, it probably affects export as well.

Addition of one line of code:

if (!in_array(self::STRAIGHT_JOIN_ON, self::$_joinTypes))

seems to eliminate the ever-increasing array of death that's probably what makes you assign ever-increasing memory_limit amounts in the hope that imports and exports will work.

Aaand, it’s official, the Array of Death(tm) has been officially noted and patched in 1.4.2.0rc1, see code comparison here.

And since this is a patch for /lib/Varien/Db/Select.php, it has far-reaching effects on Magento's database performance beyond just the DataFlow problems it's been causing us.
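For anyone patching by hand: the line goes in the Varien_Db_Select constructor, which until 1.4.2 appended the straight-join type to a static array on every instantiation. The patched constructor looks roughly like this (paraphrased from the 1.4.2.0rc1 change, so verify against your own /lib/Varien/Db/Select.php before editing):

public function __construct(Zend_Db_Adapter_Abstract $adapter)
{
    parent::__construct($adapter);
    // guard the static arrays so they are only extended once,
    // not on every Varien_Db_Select instantiation
    if (!in_array(self::STRAIGHT_JOIN_ON, self::$_joinTypes)) {
        self::$_joinTypes[] = self::STRAIGHT_JOIN_ON;
        self::$_partsInit = array(self::STRAIGHT_JOIN => false) + self::$_partsInit;
    }
}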

 
 
dweeves
Enthusiast
 
Total Posts:  877
Joined:  2010-06-26
FRANCE
 

Just a reminder, for fast import you may use magmi

it still does not handle all dataflow use cases, but if you’re dealing with simple products, it has many interesting features & provides lightspeed import compared to magento dataflow.

 
 
mhenze
Jr. Member
 
Total Posts:  7
Joined:  2009-11-06
 

Can somebody please help me out with this one?

I am using the attached script and the attached csv.
I have added the temp table to the database.

When executed, the page remains blank and it looks like nothing is happening.

The system log says:
2010-11-26T13:49:03+00:00 DEBUG (7): Mage_Eav_Model_Entity_Attribute_Source_Table

Thanks in advance.

File Attachments
script.zip  (File Size: 16KB - Downloads: 95)
 
 
james Boomer
Jr. Member
 
Total Posts:  28
Joined:  2010-01-07
USA
 

Data flow issue. I have been beating my head against the wall on this one. I have tried 6 different import methods now with no luck. My situation is this: I am trying to update pricing on over 6000 products - sounds easy enough. (The import script here would work, and I've even written some of my own, all with bad results.) The problem is I have 2 websites, each with separate pricing (still an easy problem to solve).

In our store we have some custom modules that create a baseline for pricing, and this pricing is fed into the admin store view. We then have 2 websites, "base" (local) and "national" (National). The national site always uses the base admin level, whereas the local or "base" site uses different prices for some products and the baseline for others.

So here's the issue: in Magento's DataFlow you can specify a store without specifying a website, and it goes through and changes the product from using the admin values to custom values at the store view level. Whereas every other import method relies on the website, and "base" contains the admin view as well. So when I run any import to change the base pricing for that store view, it also changes the admin pricing, which affects the national store. Any help would be appreciated. Otherwise I think I'm stuck with the old Magento DataFlow (yuck).
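For context on why the scope matters: product prices live in catalog_product_entity_decimal, keyed by store_id, and store_id 0 is the admin/default value that the "national" site falls back to. A direct updater that writes only to the local store view's ID leaves the admin row (and so the national site) untouched. A minimal sketch, assuming a PDO connection in $db; the store view ID, $entityId and $newPrice are placeholders, not values from this thread:

// find the attribute_id (and entity_type_id) of 'price' for products
$attr = $db->query("
    SELECT a.attribute_id, a.entity_type_id
    FROM eav_attribute a
    JOIN eav_entity_type t ON t.entity_type_id = a.entity_type_id
    WHERE a.attribute_code = 'price' AND t.entity_type_code = 'catalog_product'
")->fetch(PDO::FETCH_ASSOC);

$localStoreId = 2; // hypothetical ID of the local/"base" store view

// insert or overwrite the store-view-level price; store_id 0 (admin) is never touched
$stmt = $db->prepare("
    INSERT INTO catalog_product_entity_decimal
        (entity_type_id, attribute_id, store_id, entity_id, value)
    VALUES (?, ?, ?, ?, ?)
    ON DUPLICATE KEY UPDATE value = VALUES(value)
");
$stmt->execute(array($attr['entity_type_id'], $attr['attribute_id'], $localStoreId, $entityId, $newPrice));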

 
 
dweeves
Enthusiast
 
Total Posts:  877
Joined:  2010-06-26
FRANCE
 

Don't worry James, I'm working on your problem and think I'll be able to solve it in an upcoming magmi version :)

 
 
Boatmagic
Sr. Member
 
Total Posts:  91
Joined:  2010-08-14
 

Can I get some help with this?
I'm getting errors:

[Fri Dec 24 14:50:14 2010] [error] [client xx.xx.xxx.xxx] #0 /chroot/home/anniston/xx.xx.xxx.xxx/html/lib/Zend/Db/Adapter/Pdo/Mysql.php(96): Zend_Db_Adapter_Pdo_Abstract->_connect()

[Fri Dec 24 14:50:14 2010] [error] [client xx.xx.xxx.xxx] #1 /chroot/home/anniston/xx.xx.xxx.xxx/html/lib/Zend/Db/Adapter/Abstract.php(448): Zend_Db_Adapter_Pdo_Mysql->_connect()

[Fri Dec 24 14:50:14 2010] [error] [client xx.xx.xxx.xxx] #2 /chroot/home/anniston/xx.xx.xxx.xxx/html/lib/Zend/Db/Adapter/Pdo/Abstract.php(238): Zend_Db_Adapter_Abstract->query('TRUNCATE TABLE ...', Array)

[Fri Dec 24 14:50:14 2010] [error] [client xx.xx.xxx.xxx] #3 /chroot/home/anniston/xx.xx.xxx.xxx/html/script.php(62): Zend_Db_Adapter_Pdo_Abstract->query('TRUNCATE TABLE ...')

[Fri Dec 24 14:50:14 2010] [error] [client xx.xx.xxx.xxx] #4 /chroot/home/anniston/xx.xx.xxx.xxx/html/script.php(23): updateTempTableFromFile(Object(Zend_Db_Adapter_Pdo_Mysql), '/chroot/home/an...')

[Fri Dec 24 14:50:14 2010] [error] [client xx.xx.xxx.xxx] #5 {main}

[Fri Dec 24 14:50:14 2010] [error] [client xx.xx.xxx.xxx] thrown in /chroot/home/anniston/xx.xx.xxx.xxx/html/lib/Zend/Db/Adapter/Pdo/Abstract.php on line 144

 
 
Boatmagic
Sr. Member
 
Total Posts:  91
Joined:  2010-08-14
 

Ok, I have it working, and I must say this is quite possibly one of the best solutions someone has worked out here.
Thanks to all of you who put in the time to make this feasible.

The issues I had were file path errors and a few minor code changes... missing brackets and such.

This updates my stock/qty of 11974 items in about 5 seconds at most… That is awesome... lol

I had tried doing this with PHP and the API but could not get it working. I still have to do a few things manually. My supplier has an XML feed but I don't know how to automatically turn it into a .csv file. I have a cron job that can automatically download the XML feed, but taking the XML and automatically making it a .csv is beyond me right now.
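The XML-to-CSV step can live in the same cron run as the download; a minimal sketch using SimpleXML and fputcsv, where the file paths and the <item>/<sku>/<qty> element names are placeholders for whatever the supplier's feed actually contains:

// convert the downloaded supplier feed into the sku,qty CSV the import script reads
$xml = simplexml_load_file('/path/to/feed.xml');
$out = fopen('/path/to/WebStockImport.csv', 'w');

fputcsv($out, array('sku', 'qty')); // header row

foreach ($xml->item as $item) { // hypothetical element names
    fputcsv($out, array((string) $item->sku, (int) $item->qty));
}

fclose($out);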

Well, thanks again for getting this issue resolved for many people here on the forum.

 
 
Boatmagic
Sr. Member
 
Total Posts:  91
Joined:  2010-08-14
 

Is it possible to take the original code from the Wiki article and make it work without having to use the is_in_stock portion of the code?

Let me explain why I'm asking.

I can get a data feed from my distributor for sku and qty, and save it as a .csv file with a cron job. Original article here:
http://www.bwilhelm.com/2010/01/05/get-xml-data-into-magento-via-magento-api/comment-page-1/#comment-3777

My feed in .csv format, however, does not have row headings, and it also does not use the is_in_stock field.

The script.php file here updates the database directly, reading from WebStockImport.csv, but that file has to have the is_in_stock column filled out with 0's and 1's.

I want to automate all of this, and am currently stuck on this part.

I get the .csv file, need to automatically add the is_in_stock column and of course fill it in with 0's and 1's based on a qty of 0 or >1, then have this file read by the script.php as posted in this thread (that part I can do).
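One way around this (and the spreadsheet-formula problem mentioned a couple of posts down) is to derive is_in_stock in PHP while rewriting the file; a minimal sketch that reads the headerless distributor CSV and writes the file script.php expects (the paths and exact header names are assumptions):

$in  = fopen('/path/to/distributor.csv', 'r');    // headerless rows: sku,qty
$out = fopen('/path/to/WebStockImport.csv', 'w'); // file that script.php reads

fputcsv($out, array('sku', 'qty', 'is_in_stock'));

while (($row = fgetcsv($in)) !== false) {
    list($sku, $qty) = $row;
    // any positive quantity counts as in stock
    fputcsv($out, array($sku, $qty, ($qty > 0 ? 1 : 0)));
}

fclose($in);
fclose($out);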

 
 
Boatmagic
Sr. Member
 
Total Posts:  91
Joined:  2010-08-14
 

Well, I almost had it worked out.
Pulled the XML and saved it as .csv, all automatically.
The issue now: saving files as .csv removes all the formulas…

I had the is_in_stock column calculated with a formula. If you changed the qty to 1 or more, for example, that column would change to 1. Change it to 0, and it would change to 0.

So what I had hoped to accomplish was that when the new .csv file was saved over the old one (with certain columns locked and hidden), the is_in_stock column would auto-calculate for each qty row and put in a 1 or 0.

Then the next cron job would run the script.php as posted in this thread…

 
 
Boatmagic
Sr. Member
 
Total Posts:  91
Joined:  2010-08-14
 

Is it possible to change the script.php to make it work without using the code referring to ‘store’ and ‘is_in_stock’?

To me it looks like the script doesn’t need these fields to function.

 