Posting in the Magento forums has been disabled pending the implementation of a new and improved forum solution which should better serve the community.

For new questions please post at magento.stackexchange.com, the community-run support site for the Magento community. We will be providing updates on the new forum solution soon. For questions or concerns please email community@magento.com.

Magento Forum

More than 200 stores with multi store and same products? 
 
magemoss
Jr. Member
 
Total Posts:  1
Joined:  2012-05-19
 

Hi, we have 250k products and we want to create more than 200 different sites (multi-site). All stores will have the same products, but with different domains and different designs. Magento's multi-site function would work for us, but you need to associate every product with every site, which means 50M+ records in tables like "catalog_product_website" (and some other tables), and flat catalog and indexing will be a real problem.
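The scale described here can be sketched with quick back-of-the-envelope arithmetic (a simplification: it assumes one `catalog_product_website` row per product per website, which matches how that association table works, though real totals vary):

```python
# Rough row-count estimate for catalog_product_website when every
# product is assigned to every website (one row per pair).
products = 250_000
websites = 200

association_rows = products * websites
print(association_rows)  # 50000000 -- the 50M rows the question estimates
```

The same multiplication applies to other per-store tables (prices, flat catalog, URL rewrites), which is why the row counts explode so quickly.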

If we use stores and store views instead, the user base will be shared, and flat catalog will still be a problem.

So my question: is there any way to make all these stores work like a single store? Or is there any way to make this work with decent performance?

 
 
Extendware
Sr. Member
 
Total Posts:  231
Joined:  2011-08-04
 

You can greatly increase performance with a full page cache solution, which will also reduce load on the database. Other than that, you will need a lot of hardware and RAM for the DB.

 
 
inwebservices
Jr. Member
 
Total Posts:  29
Joined:  2012-01-18
 

You can go with a full page cache; however, we find multiple optimisations such as APC, memcache, PHP opcode caching, optimised index management, etc. to be more efficient. We have worked with terabyte databases; as long as you set up the DB correctly with optimal settings, millions of records are not a problem.

As for Magento itself, it is slow, especially with thousands of products and above. We normally use AWS with ElastiCache, RDS, Route 53 and load balancing, as you can scale up instantly and leave the infrastructure headache to Amazon. For high-performance sites it is best to allocate a batch server to take the load off the web servers for index management, admin, etc.

 
 
Ryan Grow
Jr. Member
 
Total Posts:  21
Joined:  2009-12-04
 

I have a site running with 200+ websites/store views and 10,000+ products, so it can be done. The issue with Magento is not the product count but the website and store count. The more websites and store views you add, the exponentially slower it becomes. This is because of the way it processes and saves the "Configuration" settings and their indexes (the system configuration cache stores the settings for every website and loads it all into memory).

Here is what we have done to work around this:

First, do not use the file system for cache or sessions; you still need a cache backend, or your page load times will be around 10 minutes. Memcached somewhat works, but it was not fast enough for our traffic (10-30 hits per second, 200+ stores remember). Use Redis with the Redis cache backend by Colin Mollenhour:
https://github.com/colinmollenhour/Cm_Cache_Backend_Redis
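For reference, that backend is enabled in app/etc/local.xml; the snippet below is a minimal sketch based on the extension's README, with the server, port and database values as placeholder assumptions:

```xml
<!-- app/etc/local.xml, inside <config><global> (sketch) -->
<cache>
  <backend>Cm_Cache_Backend_Redis</backend>
  <backend_options>
    <server>127.0.0.1</server> <!-- placeholder: your Redis host -->
    <port>6379</port>
    <database>0</database>
    <compress_data>1</compress_data> <!-- compress cache data to save RAM -->
  </backend_options>
</cache>
```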

Next, you will have to ditch Apache; it will not be able to handle the site. Install Nginx and PHP-FPM, and make sure that the web server and PHP both have enough memory: 1 GB minimum per thread should be enough (e.g., if Nginx has 8 worker processes you need 8 GB of RAM), and make sure PHP's memory limit is set to at least 1 GB (this is because the whole system config has to be loaded and held in memory to serve a page).
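As a sketch of that sizing (the numbers below simply mirror the 8-worker, 1 GB-per-thread example above; they are illustrative assumptions, not recommendations):

```ini
# nginx.conf (sketch) -- 8 workers => plan for ~8 GB of RAM at 1 GB each
worker_processes 8;

# php.ini / php-fpm pool (sketch)
memory_limit = 1024M       # whole system config is loaded per PHP process
pm.max_children = 8        # cap PHP-FPM workers to what RAM allows
```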

You will also want multiple web nodes with a SAN for shared storage (at least the media, var, and includes folders). An NFS share or an rsync script will not be fast enough to handle the kind of load that 200+ sites will generate.

A big hurdle will be the database. We had to switch to the latest version of Percona's MySQL since we kept killing stock MySQL. We have it on a dual 6-core server with 48 GB of RAM (make sure your InnoDB buffer settings are high enough), and we can still max out all the cores if 2 or 3 people are saving products in the admin at the same time; it takes about 5 minutes to save a product. Also, you will have to set the indexes to update on save, because a full reindex takes hours and hours.
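As a rough illustration of the InnoDB buffer settings mentioned, a my.cnf sketch for a dedicated 48 GB database box; every value here is an assumption to be tuned for the actual workload, not a recommendation:

```ini
# /etc/my.cnf (sketch for a dedicated 48 GB database server)
[mysqld]
innodb_buffer_pool_size = 32G       # commonly ~60-70% of RAM on a dedicated box
innodb_log_file_size    = 1G        # larger logs smooth heavy write bursts
innodb_flush_method     = O_DIRECT  # avoid double-buffering through the OS cache
max_connections         = 500
```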

Installing a full page cache module will help keep the load on the server down, but if you make a lot of changes or product updates your cache will have to be flushed and you may not see much benefit. I still recommend full page caching; Varnish works nicely, or check out Colin Mollenhour's Cm_Diehard extension:
https://github.com/colinmollenhour/Cm_Diehard

On a side note, the built-in product importer will take way too long to be usable (1 day to import 1,000 products with 200+ sites), so you will want to check out the Magmi (Magento Mass Importer) project:
http://sourceforge.net/projects/magmi/

Good luck, and let me know if you have any questions.

 
 
inwebservices
Jr. Member
 
Total Posts:  29
Joined:  2012-01-18
 

Further to Ryan's comment, we have built a streamlined custom solution ("Pseudo Store") to handle this so that only one store is required under normal circumstances. The solution has one website/store/store view and dynamically changes the currency, country and language/locale based on request parameters, thereby eliminating the performance issues of multiple stores with large product sets. The features are as follows:

- Changes the country, currency and language/locale for a single store via a country selector
- Uses cookies to remember the user's country preference across return visits
- Single store, thereby increasing performance without requiring extra server software installations (Nginx, SAN, full page cache, etc.)
- Faster product load times due to a reduced number of database records
- Shortened indexing time, reducing the load on the database server
- Filters products so that users in a given country only see the products available to them, with redirection to an information page if a product is not available in their region
- Single RSS data feed
- Single sitemap
- Avoids duplicate content penalties in Google

Restrictions
- If any products require language-based URL keys or descriptions then a separate store is still required; this is beneficial for SEO, as the language-specific store will be indexed separately

Next release
- Custom URL per Pseudo Store: domain, subdomain or subdirectory

You can see it in action at Quintessential Fashion (http://www.quintfashion.com); in this instance it uses country codes as the basis, although this is not fixed.

The solution needs to be configured for each installation; it is $995. We understand this is a high price, but the return on investment will recoup that amount within a few months through reduced server requirements and by avoiding the problem of millions of records overloading the database.

 
 
inwebservices
Jr. Member
 
Total Posts:  29
Joined:  2012-01-18
 

We wanted to provide an update with some figures. We have moved a site with 4 websites/stores and 65,000 products to the new code with one website/store and the same number of products. We have a custom load script using product->save which previously loaded approximately 30 records per minute. With the single-store method we are now seeing 3-4x faster loading, and we have noticed increased frontend performance using only file and PHP caching (no memcache or read database).

 
 
Extendware
Sr. Member
 
Total Posts:  231
Joined:  2011-08-04
 

Ryan Grow,

Full page cache does not need to be flushed after each product update if the store owner can tolerate data that is a few hours stale. For example, the full page cache data can be set to expire after 3 days, and something like a full-page crawler can be used to refresh the most critical pages every few hours.

 
 
Ryan Grow
Jr. Member
 
Total Posts:  21
Joined:  2009-12-04
 

Inwebservices: I really like your inventive solution; I just wish our needs would let us implement something like that, but our websites and store views are not for different countries or currencies, they are for different co-op members and locations. I will say that the best thing to do is to never run more than one website and store view, because page load times increase exponentially with each extra site and store added.

Extendware: Yes, you don't need to flush the cache after every admin change, but with 200+ sites we have lots of people making changes throughout the day, so our cache only gets flushed a few times a day. I am not trying to rip on full page cache (our site cannot run without it), but simply flushing it less often is not going to let someone run multiple sites. We don't flush often and we always have cache built for our sites (which can be really good if something breaks, or if an index gets reindexed or corrupted), but with 200+ stores on our single install we also always have pages that aren't cached. Here are a few of the shortcomings we have run into with caching (full page or otherwise).

1st: The page has to be loaded before it can be saved to cache.

With lots of stores and lots of products, page load times can be incredibly slow (hence all the massive hardware requirements). It is also very difficult to hit every page on every site with a full page cache crawler (even with Google and Bing helping) before the next time the cache gets flushed, so we always have some uncached pages being delivered. This is why the hosting environment needs to be strong enough to handle uncached pages as well as the full page crawler and all the web crawlers hitting the 200+ sites.

2nd: The page has to be saved somewhere.

This is still a problem for us. On weekends, when no one is flushing the cache, we almost hit the limit of our Redis cache storage (10 GB), and that is only storing the first level of page depth (i.e., no page parameters). Since Redis stores data in RAM, we need to add more RAM or set up another node for the Redis cluster.
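That storage ceiling is set in redis.conf; a minimal sketch (the 10 GB figure mirrors the setup described above; note that with a tag-based cache backend an aggressive eviction policy can orphan tag metadata, so the policy choice matters):

```ini
# redis.conf (sketch)
maxmemory 10gb
maxmemory-policy volatile-lru   # evict only keys with a TTL; 'noeviction'
                                # is the conservative choice for tagged caches
```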

3rd: The page has to be retrieved quickly.

I tested a few different cache storage engines and Redis was the only one able to handle our server's load (for us it was faster when used with Enterprise's full page cache than Varnish was). The cache server also has to handle multiple web nodes connecting to it, and be able to be clustered and backed up (Redis does all this).

4th: The page has to be refreshed when it's stale.

I just don't like the idea of dumping all the cache because one change has been made to one product on one site. I still haven't solved this; maybe your module can do it, I don't know. On the same note, cache should not expire simply because it's old; it should only expire if something on the page has changed. I do see that your module lets you change the cache lifetime setting.

Magemoss: On a side note, you had asked a question about indexing and I don't think I fully answered it. The more stores and sites you add, the longer it will take to reindex, and since this is done in the database by default, the bigger the database server requirements will get. One thing we have been testing is moving the indexes out of the database and into Solr, Lucene or Sphinx. I was able to get Solr working, but the performance is not where I want it yet. I won't go into detail since there are already some forum postings on the subject, but it's just one more thing to think about when trying to fit 200+ sites into Magento.

 
 
MagenX
Enthusiast
 
Total Posts:  791
Joined:  2008-05-26
Dublin
 

The one and only question in my head: WHY???
OK, if you have 200 partners with Magento shops and they really want to sell your 250k products, here are your keywords:
MySQL proxy, MySQL replication, periodic sync and reload.

But if you want to create 200 shops yourself, this is just a waste of resources, time and money. Better to buy ads and build 3-4 small load-balanced servers with MySQL proxy in front of a replicated master-slave backend.
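A minimal my.cnf sketch of the master-slave replication being suggested (server IDs, database name and log file names are placeholder assumptions; the slave is then pointed at the master with CHANGE MASTER TO and START SLAVE):

```ini
# master my.cnf (sketch)
[mysqld]
server-id    = 1
log_bin      = mysql-bin     # binary log feeds the slaves
binlog_do_db = magento       # placeholder database name

# slave my.cnf (sketch)
[mysqld]
server-id = 2
relay_log = mysql-relay-bin
read_only = 1                # slaves serve reads only
```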

Or maybe I do not see the full picture.

Here are more keywords:
preorder, reserved, price valid at the time of purchase.

 
 
Ryan Grow
Jr. Member
 
Total Posts:  21
Joined:  2009-12-04
 

I will state this again: if you can avoid having multiple sites/stores, then yes, by all means do so, but this is not always possible.
Here are a few reasons to have multiple sites:

1. Your company has multiple locations which sell the same and different brands, each with its own inventory and/or stock amounts

2. You have a Co-op with multiple stores and locations each with the same situation as above

3. You have a central warehouse for partners to drop ship from but each partner can choose which products to carry

4. You have a multiple-language store view setup and need product descriptions and names in the other languages

5. You need to manage promotions, static blocks, and/or pages across multiple websites/stores

Those are only a few, but I hope it helps you realize why so many people have to run multi-site setups.

On a side note, the master/slave write/read setup has its issues as well (like search not working sometimes). And if you think we could just set up one database and replicate only the products to the other Magento databases, you would need a lot of hard drive space and RAM, because with a large number of products the databases get rather large (6+ GB), and replicating that over and over to 200+ databases would not be practical. In contrast, our entire multi-site Magento database is only 20+ GB. I guess you could shard out the databases and run one for products and then multiple ones for the sites, but you are still going to have management issues and lose the ability for individual sites to make product changes.

So, in conclusion, there is no single solution to making Magento multi-site work (at least not that I have seen, and if someone has one, please share it). The closest I have seen is using a virtual image in a cloud server environment, but even then there is a limit to how many stores you can add before it slows down too much, and the cost of that many resources can be high. We had already passed that point before I started looking into it, so I don't know where the limit is; I just know it was a lot slower than our current setup.

But maybe if I were to build a private cloud or a supercomputer...

 
 
EverestCloud
Jr. Member
 
Total Posts:  1
Joined:  2012-06-08
 

In response to Ryan's points (we were passed the details by InWeb): Nginx does the core functionality of a web server very well with low overhead, whereas Apache does everything; if you only need a subset of the core, Nginx can provide a roughly 5x performance boost for your site. Couple this with APC, a single store, etc., and you are in league with the best websites out there. We are running 50,000-product single-store Magento installations on an AWS micro-instance web farm with Nginx, and the results are impressive. Adding websites/store views degrades performance substantially.

We have dealt with very large corporate applications. The key to Magento with potentially high store numbers is to minimise the stores to the website level or to regions, and then use software such as dynamic filters or other business process changes. Think of a pyramid separated into three to four vertical levels: if you have 50 stores, that is the bottom of the pyramid and the most granular; the top is one store.

When dealing with high store counts and SKUs, you want to position your overall website at the highest possible point of the pyramid, taking into account cost, stability, performance and return on investment: a cross between business needs and technical possibilities. These are business decisions, not just technical ones. Fixing it only technically will always cause problems, as nothing stays stable for long (upgrades, new functionality, maintenance, etc.). Fixing it only with business process changes will cost opportunities and profitability.

In the end, once you get into multi-store with thousands of products there are three key options:
a) solve it technically, with the best hardware, software, services and time
b) solve it with business process changes or reduced requirements, potentially reducing profitability although with reduced IT costs
c) solve it with a hybrid of the two; this falls somewhere in the middle of the pyramid and is equal parts art and science

You can see this in the attached image: the further you are down the business or IT side, the less flexibility you have. The goal is to move as close to the hybrid as your business circumstances, requirements, timescales, costs and business/technical knowledge allow. This process is the same from the smallest one-person website to the largest multinational companies; only the scale changes.

Image Attachments
Magento.png
 
 
Ryan Grow
Jr. Member
 
Total Posts:  21
Joined:  2009-12-04
 

Update on our multi-site Magento: we have switched to MariaDB 5.5.25 for our database, reduced the CPU load (only 50% during the day) and increased admin performance. This is another must-do for multi-site. Unfortunately, reindex time remains the same.

yashgt: Sorry, but 10,000 websites in Magento is not going to work; let me explain why. I think the theoretical upper limit of Magento 1.7 is maybe 1,000 websites, maybe. Currently, with 200 websites running on major hardware, the admin is almost unusable: 3 minutes to create a product, 2 hours to process indexes, 5 minutes to generate the system cache, and so on. The problem is that there are three major architectural flaws preventing Magento from breaking the 1,000-site limit.

First, every time Magento loads a web page it loads the system configuration settings for all websites; this is why adding sites slows it down so much. As you add more sites, the PHP data object gets bigger and bigger and PHP requires more and more memory per thread. We currently require 1.5 GB of RAM per thread, and that is for 235 sites; someone running 500 sites may need 3 GB per thread, and 1,000 sites could be up to 6 GB per thread. On a 16-core server running 16 PHP threads, that is 96 GB of RAM on one box just for PHP. The solution would be to make Magento load only the settings it needs for the current site, but you would also have to change how it creates, stores and retrieves the system configuration cache. That would be a major code and architectural change; a co-worker of mine tried to fix this but could not get it working correctly.
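The per-thread memory figures above extrapolate roughly linearly; as a sketch (the linear model is an assumption built from the single 235-site data point, not a measurement):

```python
def ram_per_thread_gb(sites, measured_sites=235, measured_gb=1.5):
    """Naive linear extrapolation of PHP per-thread memory from one data point."""
    return measured_gb * sites / measured_sites

for n in (235, 500, 1000):
    print(n, round(ram_per_thread_gb(n), 1))
# 235 sites ~= 1.5 GB, 500 ~= 3.2 GB, 1000 ~= 6.4 GB per thread
```

Multiply by the number of PHP workers per box and the hardware bill grows as fast as the site count does.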

Second, indexes, or more specifically the catalog URL rewrite index. Here is the thing: you can turn off the catalog and product flat indexes, and you can move search to Solr or Sphinx, but the catalog URL rewrite index stays, and it grows with every site you add, so reindexing gets slower accordingly. With our 235 sites and 10,000 products, a 2-hour reindex is not that big a deal, but with 500 sites it could take 4 hours, and 1,000 sites could take 8 hours. If you instead set the index to update on save, then product save times get progressively slower. I don't know what an acceptable time to save a new product should be: 3 minutes, 5 minutes, 10 minutes, 15 minutes? It's hard to say. In the end, storing all that data in one table just does not make sense, because it creates the potential for billions and billions of records, which makes it slow.
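The table-growth argument can be sketched numerically, assuming (as a simplifying lower bound) at least one URL rewrite row per product per store:

```python
def url_rewrite_rows(products, stores, rewrites_per_product=1):
    """Lower-bound estimate of catalog URL rewrite index rows."""
    return products * stores * rewrites_per_product

print(url_rewrite_rows(10_000, 235))     # 2350000 rows at the scale above
print(url_rewrite_rows(10_000, 10_000))  # 100000000 rows for a 10,000-site setup
```

Category paths and history rows multiply these numbers further, which is why one shared table becomes the bottleneck.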

Third, Magento only works with one master MySQL (or compatible) server; this is the real hard limit. Having only one database server means that if it needs more resources you have to get a bigger server, and servers only get so big. Switching to a virtual machine in a cloud, or running on a cluster, may gain some performance, but even then there is a limit to how many resources a VM can have. The real solution would be to put it on Oracle or MSSQL and run a cluster; this would require creating the connection scripts for those servers (which is the easy part) and then rewriting all of the MySQL install and upgrade scripts for the new server (and don't forget the modules' MySQL scripts). I have gotten Magento to connect to an MSSQL server, but rewriting the scripts was a huge task that I don't have the skill for.

If those three things could be resolved, then I think Magento could one day do 10,000 websites, unless there is another bottleneck that I don't know about.

 
 
Ryan Grow
Jr. Member
 
Total Posts:  21
Joined:  2009-12-04
 

yashgt: The time it takes to create a product is directly related to the number of websites, or more specifically the size of the catalog URL rewrite index; whether you create a product and add it to 1 site or 1,000 sites, it takes the same amount of time. So the problem is, if you have 10,000 sites it will take far too many hours to save a product, even if it is only for one site.

So, I know you're optimistic that master/slave replication will somehow resolve your database issue, but in practice it will not. The reason is that you can still only have one master, and since the main load is due to writes, you still need a massive DB server, and hardware can only do so much. The other side of the issue is that Magento also has high read loads (joins without indexes, full table scans), so the read DB server has to have the same high specs as well. To be clear: scaling out the read load (in a shared-nothing architecture) does not speed things up; it only allows more read connections. You still have high write loads, and once the server is maxed out, that's it. The real solution is either clustering or sharding the database so that the giant tables can be processed in a timely manner. So, as I was saying earlier, only being able to have one master (write) server is a massive bottleneck.
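For what it's worth, Magento 1 can at least split reads from writes at the configuration level; a local.xml sketch (host name and credentials are placeholders, and as the post explains, this only spreads the read load and does not remove the single-master write bottleneck):

```xml
<!-- app/etc/local.xml, inside <config><global><resources> (sketch) -->
<default_read> <!-- read connections go to a replication slave -->
  <connection>
    <use/>
    <host>slave-db.example.internal</host> <!-- placeholder host -->
    <username>magento</username>
    <password>secret</password>
    <dbname>magento</dbname>
    <model>mysql4</model>
    <type>pdo_mysql</type>
    <active>1</active>
  </connection>
</default_read>
```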

There are other, smaller issues, especially in the admin. I can't imagine how big a dropdown menu with 10,000 websites would be, or whether it is even possible to save an attribute across 10,000 sites, since with only 200 sites I already get slow-script errors. Of course this can be fixed by editing the admin theme; it's just one more thing to think about.

 
 
Ryan Grow
Jr. Member
 
Total Posts:  21
Joined:  2009-12-04
 

yashgt: You have been thoroughly warned of the underlying limitations of Magento and what needs to be fixed in order to run multiple sites. Good luck with your adventure, and let me know how it goes.

P.S. Your DB server will be inadequate; even Amazon's "Cluster Compute Eight Extra Large" instance will be too small, but I would at least start from there.

 
 
Vanquish
Jr. Member
 
Total Posts:  24
Joined:  2012-07-01
 

We have to agree with Ryan's points completely; Magento will have a major problem with this. There are three underlying options:

1) Use Magento as is already chosen
2) Develop a custom solution
3) Use an alternative application

We can say from experience that Magento is best in class for eCommerce unless you are prepared to spend tens to hundreds of thousands of dollars on an enterprise-level application. With Magento already chosen, it then breaks down into the following three options for your specific installation, with the distributed approach carrying the least risk in your scenario:

a) Business - change current core Magento functionality to behave as you require
b) Distributed - have a master Magento installation that then distributes to satellite installations
c) Technical - throw as much hardware as possible at it and hope it all works

With Magento you really want to change as little as possible; otherwise you will spend your life developing code, never be able to upgrade, fix problems on a continual basis and be at the mercy of the system 24x7. Therefore, let's rule out a).

With b) you would have a master with the 100,000-product catalog and limited stores, propagating to slave Magento installations each with, say, 500-2,000 websites (depending on your risk tolerance); however, you will need fully automated propagation. This makes it scalable, keeps the instances independent if one system crashes, and minimizes code changes, although you would have to manage multiple instances.

With c), as Ryan states, you will hit limits in the dashboard and the database server (we presume you are using RDS); you can only have one write database, so your product saves will become an issue. Adding stores/websites beyond a single store degrades performance almost exponentially. On simple servers we can batch-process 50 records per minute with a single store, 20 records per minute with 2-3 stores, and 2-3 records per minute with 8-10 stores (50,000-product database), with products active in only 2-3 of the websites. These figures are with layered catalog indexing off, as it produces the heaviest load. You can switch indexes to manual update, but you will need to run the indexer regularly, and if you are using RDS you will be charged per million I/Os; you will be surprised how quickly this mounts up.

We have tested clustering; you can replace foreign keys with stored procedures, but we found it unreliable and maintenance-intensive when used with Magento. Planning for a distributed (b) solution is your best approach, as you can test a real live scenario with two instances (master and slave) and then launch new instances as necessary. If the venture becomes successful you can always join them together again, should a solution be found to the various issues that will arise along the way. With c) you are putting all your eggs in one basket; Magento and AWS can become unresponsive for no particular reason, and you will be in an all-or-nothing scenario with 24x7 maintenance.

With option a) you have an 80% chance of failure; with option b) an 80% chance of success; with option c) a 50/50 chance of success or failure (80% of the time being mildly unsuccessful; we hope you fall in the other 20% who become wildly successful). In your scenario we always go with option b), even if our clients do not like it; we only deal with successful projects, using risk mitigation.

 
 
Kammalou
Sr. Member
 
Total Posts:  285
Joined:  2009-01-20
Denmark
 

Ryan Grow wrote: "I tested a few different cache storage engines and Redis was the only one able to handle our server's load (it was faster when used with Enterprise's full page cache than Varnish was for us). But also the cache server has to be able to handle multiple web nodes connecting to it and also be able to be clustered and backed up (Redis does all this)."

We found, for a high-traffic client, that by completely disabling Magento EE FPC and implementing a Varnish extension we got a quite effective setup using Varnish, memcached, Nginx and PHP-FPM.

We are currently looking into Redis features for multiple-instance setups, in contrast to running multiple memcached instances. Any experience with that?

On the example with one master Magento and multiple Magento "slaves": we solved this as a consequence of how we address integration with back-office ERP systems. We needed to scale integrations efficiently and to flexibly integrate multiple applications using the same data, so we ended up implementing a "BizTalk"-style application server for Magento, where we have a PIM connector for product distribution (a publishing connector) and dynamically load one-to-many subscriber connectors.
Scaling MySQL in the cloud is somewhat of a paradox; I mean, large-scale eCommerce solutions without any control over IOPS...

 