What you will learn after reading this article:  

  • What work is needed on the Adobe Commerce side to make sure it can handle the load;  
  • What work is needed on the hosting/cloud side to handle so many orders;  
  • What questions you need to ask the dev team to make sure they are prepared;  
  • What questions the business team must answer to make sure your business is prepared;  

What is the difference between Magento and Adobe Commerce (and Adobe Commerce on Cloud)?

First, let us look at some key differences between these – at first glance – identical platforms. Adobe Commerce is the licensed version of Magento 2 and has all the basic bells and whistles. On top of that, it adds a few extra features like full B2B support, a loyalty program, content scheduling, customer segmentation and so on – but those are only features. There are also a few more, not-so-obvious changes if you look under the hood.

The first one is a database change driven by the content staging (scheduling) system. Since there can now be multiple versions of, for example, a page or an attribute, a simple entity ID is no longer enough to serve as a primary key; in many places a row ID is added as well. This creates the need for new sequence tables that keep the new entity/row IDs consistent across all the tables that use them.

The second one is the order archive – this allows us to tidy up the order table, so it is not only easier to read and scan for the customer service team, but each database query is also faster, as there are fewer data entries to scan. This does not seem like much at first glance – but considering that after just ten weeks we would have 15 thousand orders to scan, we could easily shave a few milliseconds off each page request.

On top of all that, there is also Adobe Commerce in the Cloud. It's basically just Adobe Commerce in a predefined cloud solution that Adobe upsells to its customers. Whether it's effective in terms of cost and performance is a whole different story.

Let’s start from the beginning. 

Configuration 

The basic configuration that can be done on the Magento side does not affect performance. However, once you start going deeper, some minor changes to one or two options can, in fact, make the site twice as slow.

The configuration to be most cautious about is Catalog Configuration. For example, how many products are shown on the product listing page (PLP for short)? In this case, "less is better" is a solid strategy. It is also important to make sure that customers do not have too many options for changing the number of visible products (are infinite scroll and ajax loading a good strategy? That is a discussion for another time). An optimal number of products from a dev point of view is 3 rows multiplied by the number of products you want to show per row on desktop – usually it ends up being 6, 9 or 12.
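If it helps, here is a minimal CLI sketch for pinning those values down – the config paths follow the stock Catalog > Storefront settings, and the numbers are illustrative, not recommendations:

    bin/magento config:set catalog/frontend/grid_per_page 12
    bin/magento config:set catalog/frontend/grid_per_page_values "12,24,36"
    bin/magento cache:clean config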

There are also a few settings – like the flat catalogue, already legendary among people who worked with Magento 1 – that are no longer treated as performance upgrades. It is good to keep this in mind, as some long-time developers might still want to switch them to what is now the unpreferred option.

During configuration, it is also good to check that the indexers are set up correctly. Currently, the recommendation is to set almost all indexers to "Update on Schedule" (the Customer Grid indexer being the exception), and I cannot think of any use case (stocks, prices) that should change that, as "Update on Save" heavily impacts performance.
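A minimal CLI sketch of that setup (assuming the Customer Grid is indeed the one exception on your version):

    # set every indexer to "Update on Schedule"
    bin/magento indexer:set-mode schedule
    # move the customer grid back to "Update on Save", as it does not support scheduling
    bin/magento indexer:set-mode realtime customer_grid
    # verify the result
    bin/magento indexer:status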

It is also particularly useful to know the size of the catalogue in terms of effective SKUs – what we call the total number of rows and versions of a product. This is simple to calculate: the number of SKUs x websites x customer groups.

If you have, for example, only 10 000 products but 5 websites and 3 different customer groups, this is actually 150 000 effective SKUs, which is a lot. With that number in hand, some minor configuration changes could be made to shrink the catalogue – is the fifth website necessary? Could we remove one customer group?

Order handling 

One of the elements that determines whether a web shop can handle a larger volume of orders is how order handling itself is set up. Although it seems obvious, it is often overlooked.

The first thing to look into is the gathering of orders. Adobe Commerce allows asynchronous order gathering: orders are placed in temporary storage and moved in bulk to the Order Management grid without any collisions (this is a configuration change and can be made at any time). It slows down the processing of orders in the backend but allows the frontend to take in bigger bursts of orders at the same time.
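As a hedged sketch, there are two related switches here; exact availability depends on your Adobe Commerce version, so verify both against your release. The first moves order grid updates off the checkout critical path:

    bin/magento config:set dev/grid/async_indexing 1

The second, asynchronous order placement itself (AsyncOrder), is toggled in app/etc/env.php:

    'checkout' => [
        'async' => 1
    ]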

The second option is the processing of orders – it is always recommended to do that externally (in an ERP) and only update the status in Magento via the API. The frequency of such updates also matters – if they arrive in big batches, for example once a day with more than 5 thousand rows, then it is worth pushing them through a queue system.
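For illustration, a single status update pushed from the ERP side could go through the standard sales REST endpoint, which appends a status-history entry to an order – the host, order ID 123 and token below are placeholders:

    curl -X POST "https://example.com/rest/V1/orders/123/comments" \
      -H "Authorization: Bearer <token>" \
      -H "Content-Type: application/json" \
      -d '{"statusHistory": {"comment": "Shipped from ERP",
           "status": "complete", "is_customer_notified": 0,
           "is_visible_on_front": 0}}'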

3rd party extensions 

Magento is known for its substantial number of extensions – free and paid. One of the selling points of building an online shop on Magento is the ability to use those 3rd party solutions to quickly deliver a surprisingly good MVP. This is where it gets tricky, though – there is a substantial risk of performance problems when relying on external vendors.

Firstly, we cannot be sure how good their code is. Sure, an internal dev team can audit an extension and say whether it is OK or not. However, that is an additional cost, and some bugs are not obvious to find.

Secondly, even if extensions are well designed and coded, having two or three of them working in the same place and handling the same data may add unnecessary points of failure. For example, if there are two extensions handling the product listing – one that lets us create a custom sorting order and another that adds a sorting order based on sales – each could add 3 or 4 database queries and slow time to first byte by 100ms.
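A quick way to quantify that kind of impact is to measure time to first byte before and after enabling an extension – for example with curl (the URL is a placeholder):

    curl -o /dev/null -s -w "TTFB: %{time_starttransfer}s\n" https://example.com/women/dresses.html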

The rule of thumb is to avoid 3rd party extensions, if possible, and stick to the core implementation of functions available in Magento. 

Custom business logic 

This is rather simple – each time you add custom business logic to Adobe Commerce, you are by default slowing it down. If your main goal is performance, then each time a new customisation must be developed, you need to ask the ecommerce team behind the shop if it is really needed. Perhaps there is some way to avoid it and use Magento core functionalities instead. 

But if there is no way to avoid it, there are a few important things to look out for: 

  • Firstly, make sure that the database structure is well architected and that tables have correct primary keys, unique keys, and indexes for each type of query that will be run (see the sketch after this list). 
  • Secondly, keep the number of preferences and plugins to a minimum. Each preference and plugin risks losing valuable milliseconds from time to first byte, and in the case of an ajax request this can be very problematic. 
  • Do as few data transformations as possible. 
  • Always keep scalability in mind. Make sure the architecture and code can run on many instances and will not compete for the same database resources. 
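For the first point, here is a minimal declarative-schema sketch; the table and column names are hypothetical, but the structure follows Magento's db_schema.xml conventions – an explicit primary key plus a btree index matching the most common lookup:

    <?xml version="1.0"?>
    <schema xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xsi:noNamespaceSchemaLocation="urn:magento:framework:Setup/Declaration/Schema/etc/schema.xsd">
        <table name="vendor_custom_feed" resource="default" engine="innodb" comment="Custom Feed">
            <column xsi:type="int" name="entity_id" unsigned="true" nullable="false" identity="true" comment="Entity ID"/>
            <column xsi:type="varchar" name="sku" length="64" nullable="false" comment="Product SKU"/>
            <column xsi:type="timestamp" name="updated_at" nullable="false" default="CURRENT_TIMESTAMP" comment="Updated At"/>
            <!-- primary key on the identity column -->
            <constraint xsi:type="primary" referenceId="PRIMARY">
                <column name="entity_id"/>
            </constraint>
            <!-- btree index for the most frequent query: lookup by SKU -->
            <index referenceId="VENDOR_CUSTOM_FEED_SKU" indexType="btree">
                <column name="sku"/>
            </index>
        </table>
    </schema>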

Performance-orientated actions in the code 

Optimising code for performance will always be one of the most impactful ways of making a web shop faster, but also one of the most difficult and costly. It always ends up taking a great amount of developers' time, and it is not easy to perform as it often means a lot of refactoring. If all new customisation code is done correctly from the start and covered by unit, functional, and integration tests, then there is a high chance that a few senior engineers can shave every last millisecond off the running code. Otherwise, it becomes an ongoing struggle.

Some effective ways to tackle code optimisation are:

  • Avoid loops – figure out whether there is a way to fetch more precise data, so you do not have to iterate through a complete collection of products or orders (see the sketch after this list). 
  • Defer external calls such as curl requests – push them through RabbitMQ or schedule them, and serve cached data in the meantime. 
  • Avoid excessive data transformations – try to operate on values that are as close to the original as possible. 
  • Try not to use multiple "if-else" statements, and do not nest them. 
  • Check whether Magento already has the function – many backend actions are already defined in Magento core (like serialisation, tax calculations and so on). 
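To make the first point concrete, here is a hedged PHP sketch: instead of iterating over a full order collection, ask the repository for exactly the rows you need. The injected $searchCriteriaBuilder and $orderRepository are assumed to be Magento\Framework\Api\SearchCriteriaBuilder and Magento\Sales\Api\OrderRepositoryInterface:

    // fetch only recent "processing" orders, one page at a time,
    // rather than looping over the complete order collection in PHP
    $searchCriteria = $this->searchCriteriaBuilder
        ->addFilter('status', 'processing')
        ->addFilter('created_at', '2023-01-01 00:00:00', 'gteq') // hypothetical cut-off
        ->setPageSize(100)
        ->create();

    foreach ($this->orderRepository->getList($searchCriteria)->getItems() as $order) {
        // handle a single order
    }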

Performance-orientated actions on the server 

As an overall recommendation for optimal resource usage, we would suggest using a cloud solution (Azure, AWS or any other). The main points of interest here are automatic scalability and, where possible, separating different instances for different kinds of usage.

A good example of this is having separate servers for the frontend and the backend, but we could go much further than that and have additional instances dedicated only to the administrative backend (admin panel) or just to checkout routing. This way, a CPU and RAM performance drop would be contained to one part instead of affecting the entire site. Of course, there is still the database, but that can also be handled with a typical primary/replica configuration.
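A hedged env.php sketch of that primary/replica split – the 'slave_connection' node is the stock mechanism for routing reads to a replica, but the hosts and credentials below are placeholders, and the exact set of required fields should be checked against your Magento version:

    'db' => [
        'connection' => [
            'default' => [
                'host' => 'db-primary.internal',  // writes always hit the primary
                // dbname, username, password as usual
            ],
        ],
        'slave_connection' => [
            'default' => [
                'host' => 'db-replica.internal',  // reads can be served from here
                'dbname' => 'magento',
                'username' => 'magento_ro',
                'password' => '...',
                'active' => '1',
            ],
        ],
    ],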

It becomes problematic when a large number of orders keeps inserting into and updating the order tables. In such a case, apart from adding more power to the primary database instance, all we can do is configure it better at the MySQL/MariaDB/Aurora level.

A list of good practices would be: 

  • Change all tables in the database to InnoDB, as some external vendors might still use MyISAM. 
  • Verify that max_connections is set correctly. 
  • A good configuration change is raising innodb_buffer_pool_size to around 1GB (well above the 128MB default) if you have more than 8GB of RAM on the database instance. 
  • Another value worth verifying is innodb_io_capacity – it often sits at the default of 200, and in a cloud environment with SSD disks it is worth increasing (all four values are pulled together in the sketch below). 
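A consolidated my.cnf sketch of the above; the numbers are illustrative starting points for a dedicated database instance, not tuned recommendations:

    [mysqld]
    # InnoDB everywhere; convert any vendor MyISAM tables
    default-storage-engine = InnoDB
    # well above the 128MB default, per the guideline above
    innodb_buffer_pool_size = 1G
    # the default is 151; size this to your total php-fpm worker count
    max_connections = 500
    # the default of 200 targets spinning disks; SSDs can take more
    innodb_io_capacity = 2000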

A final thing to look at is the php-fpm configuration. This part can be tricky because, in a cloud environment for example, there is often no straightforward way to configure it.

The default sample file in the Magento git repository is quite good to start with, but pretty quickly there will be a need to change it – beginning with the "pm" value, which should be set to ondemand if it is not already, and pm.max_children, which should account for the available RAM (RAM divided by the memory footprint of a single PHP worker, typically around 100MB for Magento).
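Under those assumptions, a short pool-configuration sketch (8GB of RAM reserved for PHP workers at roughly 100MB each; every value here is illustrative):

    ; www.conf – hypothetical pool for a Magento application server
    pm = ondemand
    ; ~8GB available / ~100MB per Magento worker
    pm.max_children = 80
    ; recycle idle workers so their RAM returns to the OS
    pm.process_idle_timeout = 10s
    ; restart workers periodically to contain slow memory leaks
    pm.max_requests = 500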

Cache and CDN 

The last layer of performance tuning, and the first layer clients actually see, is the cache. There are several layers of cache that can be used with Magento, starting with Redis as a backend cache for blocks, going through a full page cache such as Varnish, and ending with a CDN that delivers files like JS scripts and images from the closest possible server.
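Wiring up the first two layers can be sketched from the CLI – the flags and config path below follow the stock setup:config:set and config:set commands, with a local Redis assumed at 127.0.0.1 (verify against your version):

    # Redis as the default cache backend for blocks and config
    bin/magento setup:config:set --cache-backend=redis \
      --cache-backend-redis-server=127.0.0.1 --cache-backend-redis-db=0
    # Varnish as the full page cache application ("2" selects Varnish)
    bin/magento config:set system/full_page_cache/caching_application 2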

The more you cache, the more you should gain (in theory) – but there is a downside to caching: the information we send to clients may be outdated. So, as always, there is a long journey to find the sweet spot between what we can cache and what cannot be cached.

For a CDN, however, it is always a thumbs-up. If possible, it is better to defer image loading (and optimisation) to external services. Fastly and Cloudflare, for example, can deliver images quicker and send better-optimised web images directly to clients.

Summary 

To sum up all our findings: 

  • Performance optimisation starts even before the actual development of the web shop. 
  • As early as the discovery phase, we need to answer two questions about each requirement: will it impact performance? If yes, is it really needed? 
  • Use only well-known 3rd party extensions that have been validated by experience. 
  • If possible, avoid using more than a few 3rd party extensions. 
  • Do not save on server/cloud configuration – it's better to spend more on the initial configuration than to pay for every reconfiguration later on. 
  • Invest in caching mechanisms and a CDN – this will help the site scale up quickly. 

Good questions for the dev team: 

  • Are indexing and caching enabled and configured correctly? 
  • How often will imports/exports be processed, and how will they impact the front end? 
  • Are unit tests / functional tests present? 

Good questions for the business team: 

  • Where will orders be handled and how often? 
  • If there is a new feature request, who will use this feature? How often? Is it worth it (not only from a business perspective, but also in terms of how it will affect overall page speed)? 

About the author


Lukasz Gawronski

Chief Technology Officer