Showing posts from 2019

Ruby on Rails optimization techniques

Ruby on Rails apps have a reputation for being slow. However, thanks to the framework’s simplicity, readability, and the many good techniques and tools available out of the box, it’s easy to make a ROR app perform competitively with frameworks in other languages. So unless your app’s business model depends heavily on the cost of each transaction (e.g. WhatsApp, Twitter, and other social networks), ROR can be the right tool for you. Let’s begin!

No Optimization

No optimization article would be complete without mentioning premature optimization. The first rule of optimization is: don’t optimize unless you actually have a problem. Unless you know for sure there is going to be a bottleneck, focus on readability and business value instead of solving a problem that doesn’t exist.

Measure

The second rule is: measure! How do you know you are making improvements unless you have numbers to back you up? There are different tools available depending on what you want to measure. For optimizin…
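As a minimal sketch of the "measure first" rule, Ruby ships with a Benchmark module in its standard library that can compare candidate implementations before you commit to an "optimization". The two string-building variants below are illustrative examples, not from the post:

```ruby
require "benchmark"

# Compare two ways of producing strings; the labels and the
# workload here are made up purely for illustration.
n = 50_000

Benchmark.bm(14) do |x|
  x.report("concat:")      { s = +""; n.times { s << "a" } }
  x.report("interpolate:") { n.times { |i| "row #{i}" } }
end
```

Only after numbers like these show a real difference in a hot path is a rewrite worth its readability cost.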

Running ActiveJob after transaction

Transactions! If you know how to cook them, they can be a real life saver when it comes to dealing with inconsistent data. But... there's always a "but", am I right? Unfortunately, at some point you add code to your system that has side effects outside of your database, and obviously transactions won't help you with that. One such thing is Active Job (unless it's backed by Delayed Job, which stores your jobs in the same database).

Imagine the following scenario: What is wrong here? A lot, actually! First of all, the SendSmsJob . The SMS is sent regardless of whether the charge via ChargeCustomer succeeds or not. If the charge fails, the customer is still going to get the SMS, even though the whole transaction was rolled back. Confusing experience.

Second, PrepareForShippingJob (it could be any other job referencing the order). What happens if the job is executed before the transaction commits? Payment gateways are not the fastest kids on the block, and…
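The usual cure for both problems is to defer enqueueing until the transaction has committed. The pattern can be sketched in plain Ruby, without Rails; the class and method names below (FakeTransaction, after_commit, enqueued) are illustrative stand-ins, not the post's actual code:

```ruby
# Sketch of the "enqueue after commit" idea: jobs registered during the
# transaction are only dispatched once the commit succeeds, and are
# dropped entirely on rollback.
class FakeTransaction
  attr_reader :enqueued

  def initialize
    @after_commit = []
    @enqueued = []
  end

  # Defer work instead of performing it mid-transaction.
  def after_commit(&block)
    @after_commit << block
  end

  def run
    yield self
    # Commit succeeded: now it is safe to dispatch the deferred jobs.
    @after_commit.each(&:call)
  rescue => e
    # Rolled back: deferred jobs are discarded, so no SMS goes out
    # for a charge that never happened.
    @after_commit.clear
    raise e
  end
end

tx = FakeTransaction.new
tx.run do |t|
  t.after_commit { t.enqueued << :send_sms_job }
end
tx.enqueued # => [:send_sms_job]
```

In a real Rails app the same effect is typically achieved with ActiveRecord's after_commit callbacks, or (assuming you are on a recent enough Rails) by configuring Active Job to enqueue after the surrounding transaction commits.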

Importing Compose.IO MongoDB dump into local docker instance

Sometimes, in order to easily test performance on your local machine or to test data migrations on real data, you need to import the production database. To make things a bit more complicated, we are going to use Docker. In general I avoid writing such tutorials, but I was surprised to find out there are no tutorials for this particular operation.

If you inspect what the contents of the dump that Compose provides look like, it's something like that: What this folder contains is actually raw database data, so there is no need for mongorestore and similar tools. Also keep in mind that even if your local Mongo uses the WiredTiger format, this old format is still going to work.

Just show me the code

Fine!

# tab 1
# extract the backup
tar -xvf backup.tar -C backup
# stop the running mongo instance
docker-compose stop mongo
# spin up a mongo container and run a bash command in it
docker-compose run mongo /bin/bash
# executing code inside…