How To Optimize A Mess Of A Database And Make E-Commerce Run Like Clockwork
by QArea Team on July 5, 2017
Imagine having a store locator site that’s colossal in terms of both volume and scaling! Oh, who are our Drupal developers kidding? You either already have one of these bad boys or are planning to make one work for your business in the near future, given that you are reading these very words.
And guess what? As examples of Big Data projects show, these kinds of sites are a major pain to build. We are not saying you shouldn’t consider monetizing one; we are just humbly warning you – the road you’ve chosen won’t be easy.
How do we know? Because we are a team of experts specializing in crafting these sites, among other services. And here’s how we make them as tame and fuzzy as warm little kittens. That noted, the site will be just as fast.
Is there a better way to show you how to enhance performance through decent caching than with a real-deal example? Probably not, so tune in for an adventure we recently had with one of our recurring clients and his fresh new idea of a store locator startup.
Our client had a website and requested an insanely large store locator – one with a lot of data to process as is, plus the ability to add, modify and update entries on a regular basis.
What did we have on our hands? A mediocre Drupal e-Commerce website, a really tight schedule and literal mountains of work. The site was nowhere near industry standards, so we were forced to redevelop a lot from scratch. Our internal goal was to make the site meet top quality standards and solid industry benchmarks.
We were to enhance the store-locating capabilities as well. The site was to show the locations of actual, brick-and-mortar retailers, and we were to make the data inputs more flexible by supporting descriptions, ratings, etc.
We also needed to allow the owner to re-upload the entire database of more than 400,000 entries (shops, stores and/or manufacturers) at will. The database was to be re-uploaded on a monthly basis, and the operator needed to be able to choose which entries should be kept and which erased.
But the biggest challenge of them all was the initial site itself. What was wrong with it, you ask?
It had serious trouble with both the Data Storage and the Architecture the storage was built upon.
Oh, yeah, there’s still mobile responsiveness, but that’s barely a challenge, especially for our team of savvy Drupal developers.
How’d our Drupal developers do it?
Let’s kick off with the easiest part. Drupal is like Lego by nature, with modules serving as the bricks. You can do practically anything with them, responsiveness included. We chose not to reinvent the wheel here and simply used mobile Drupal themes to ensure a friendly UI and UX on mobile devices.
Let’s proceed to the fun part – the Data Storage – shall we?
The one true best way to deal with product-specific information – store locations, in our case – is to use custom entities. Your database tables have to be designed around them.
Drupal allows you to use standard Drupal fields through the module’s API for this, but we would still advise you to develop a custom controller for these purposes. That’s exactly what we did.
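Drupal specifics aside, the underlying idea – a dedicated table per entity type, fronted by a small storage controller – can be sketched generically. The snippet below is purely illustrative (Python with SQLite standing in for Drupal/MySQL; every class, table and column name is made up, not taken from the actual project):

```python
import sqlite3

class StoreLocatorStorage:
    """Minimal storage controller for a hypothetical 'store locator' entity."""

    def __init__(self, conn):
        self.conn = conn
        # One dedicated table, shaped around the entity's own fields
        # (descriptions, ratings, etc.) instead of generic field tables.
        self.conn.execute(
            """CREATE TABLE IF NOT EXISTS store_locator (
                   id INTEGER PRIMARY KEY,
                   name TEXT NOT NULL,
                   lat REAL, lon REAL,
                   description TEXT,
                   rating REAL
               )"""
        )

    def save(self, name, lat, lon, description="", rating=None):
        cur = self.conn.execute(
            "INSERT INTO store_locator (name, lat, lon, description, rating) "
            "VALUES (?, ?, ?, ?, ?)",
            (name, lat, lon, description, rating),
        )
        return cur.lastrowid

    def load(self, store_id):
        return self.conn.execute(
            "SELECT name, lat, lon, description, rating "
            "FROM store_locator WHERE id = ?",
            (store_id,),
        ).fetchone()

storage = StoreLocatorStorage(sqlite3.connect(":memory:"))
sid = storage.save("Corner Shop", 50.45, 30.52, rating=4.5)
print(storage.load(sid)[0])  # -> Corner Shop
```

The payoff of the custom controller is that all reads and writes for the entity go through one narrow interface, so the table layout can be tuned for the locator’s queries without touching the rest of the site.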
That noted, the database needed more flexibility. You see, it simply deleted all import-specific data from the cache on every re-upload. We couldn’t allow that to happen, as the client needed those entry fields and was going to use them.
At the same time, we could not afford to toy around with re-imagining the entire database, let alone the site’s functionality. We simply didn’t have the time, because the client needed the results delivered, live and operational, in the shortest terms possible.
What do developers do when in trouble? That’s right – they write some code. So we did. A script allowed us to preserve the sensitive data without actually tampering with the DB’s structure or the site’s functionality.
What about speeding Drupal 8 deployment up a notch?
Now we are finally getting to the fun part – how we’ve made the site run like clockwork while maintaining the quickness of the Scarlet Speedster.
Batch upload time and fetching speed were two of our primary concerns. The former was to be cut down, while the latter was to be pumped up a bit. Max, our senior developer, had a fantastic idea we all agreed upon and never regretted for a single moment.
We used temporary database tables paired with MySQL commands. This way we could import data straight from the CSV into a dedicated table.
How’d we win time by doing so? We could make sure the system works fine with the database without seeking out the last imported row in colossal, overweight files.
We prevented the system from slowing down under load by ensuring every import-specific database table is removed right after being used for the direct import from the CSV.
That noted, we chose Redis as our internal Drupal 8 cache storage and Varnish as the reverse proxy. We bulked the combo up with Drupal 8’s own caching system, thus winning priceless milliseconds of load time and investing them directly into speeding up all requests.
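What Redis buys at the application layer is the classic cache-aside pattern: check the cache first, hit the database only on a miss, and store the result with a TTL. A minimal sketch, with a plain dict standing in for Redis and every name invented for illustration:

```python
import time

cache = {}  # {key: (expires_at, value)} -- stand-in for Redis with TTLs

calls = 0
def query_database(store_id):
    """Pretend slow path: in the real stack this would hit MySQL."""
    global calls
    calls += 1
    return {"id": store_id, "name": "Corner Shop"}

def fetch_store(store_id, ttl=300):
    key = f"store:{store_id}"
    hit = cache.get(key)
    if hit and hit[0] > time.time():
        return hit[1]                       # cache hit: no DB round-trip
    value = query_database(store_id)        # cache miss: slow path
    cache[key] = (time.time() + ttl, value)
    return value

fetch_store(1)
fetch_store(1)
print(calls)  # -> 1: the second request never touched the database
```

Varnish then does the same thing one layer up, caching entire HTTP responses so that repeat requests for popular pages never even reach Drupal.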
Drupal Redis + Varnish: The Endgame
Our client was happy with a fast, responsive and flexible store locator platform – one that no longer required occult rituals for proper maintenance. All was done without a single drop of chicken blood.
Feel free to contact us if you can no longer rely on magic to keep your systems up-to-date. We can help!