3 minute read

Have you ever encountered a website with a very slow response? How do you feel when you have to wait five or more seconds for the page to even start responding? How do you feel when the app you have been waiting on for ages is something you are paying for on a monthly basis? "Fuck this, let's try something different"? Yep, that is exactly what your users (customers) feel when your app is waiting on something. And users have a tendency to vote with their wallets, which in this case can make yours much thinner than it is at the moment.

The first and most common issue is sending emails, when a user registers or asks for a password reset. Those situations can even stand sending email synchronously, but you are still blocking the web process from serving new requests until the email is sent. Now consider a situation where a larger data-massaging operation hides behind a single form submit. Take, for example, large business reports that have to collect data from a number of different sources, both internal and external. That process can take anywhere from a couple of seconds to hours if your data is not well optimized, and you are basically wasting that person's time while they wait for the report to finish. Multiply the average time a person waits by their hourly rate, by the average number of reports they wait for, and by the number of people running such reports in a given company. That is a serious waste of money right there; if you could somehow optimize this process, you could easily stand to save $10,000+ in lost time, and more over the lifespan of the application.
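To make that multiplication concrete, here is a quick back-of-the-envelope calculation. The numbers are purely illustrative assumptions, not measurements; plug in your own:

```ruby
# Illustrative numbers only -- replace with your own figures.
average_wait_minutes   = 10   # time one person spends waiting per report
hourly_rate            = 60.0 # USD
reports_per_month      = 20   # reports each person waits on per month
people_running_reports = 5

monthly_cost = (average_wait_minutes / 60.0) * hourly_rate *
               reports_per_month * people_running_reports
yearly_cost  = monthly_cost * 12

puts "Wasted per month: $#{monthly_cost.round}"  # $1000
puts "Wasted per year:  $#{yearly_cost.round}"   # $12000
```

Even with these modest assumptions, the yearly figure clears the $10,000 mark.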

Imagine users abandoning your shopping cart while it was processing something that could have run in the background. Now you are losing real sales, and your client is on their way to either firing you or going broke.

OK, enough melodrama for now. Let us consider an alternative to doing everything in one process, in sync. You have the option of threading, but that won't get you very far, because you are still blocking the user's flow while they wait for the response. Enter background workers, which are best run on a separate machine and have a queue of tasks that need to be performed: a set number of processes take those tasks off the queue and report back (if needed) when each task is finished. Here are a few options for background jobs in Ruby on Rails:

  • Sidekiq - Multi-threaded background processing built on Celluloid::IO and the actor model
  • Delayed::Job - One of the first, if not the first, background workers for Rails, extracted from Shopify
  • Resque - Background worker that uses Redis instead of the database for queuing jobs
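The queue-plus-workers model all three share can be sketched in a few lines of plain Ruby. This is a toy illustration of the idea, not how any of these libraries are actually implemented: producers push jobs onto a shared queue, and a fixed pool of workers pops and performs them.

```ruby
# A shared, thread-safe queue of jobs, and a queue to collect results.
queue   = Queue.new
results = Queue.new

# A fixed pool of workers: each pops jobs until it receives nil.
workers = 3.times.map do
  Thread.new do
    while (job = queue.pop)
      job.call  # perform the task; report back (via results) if needed
    end
  end
end

# The "web process" enqueues work and returns immediately.
5.times { |i| queue << -> { results << i } }

# Shut the pool down: one nil per worker tells it to exit.
workers.size.times { queue << nil }
workers.each(&:join)

puts "processed #{results.size} jobs"
```

Real background workers add the important parts this sketch skips: the queue lives in Redis or the database so it survives restarts, and the workers run in separate processes on separate machines.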

I’ve used all of them over the years, but for the past year I have been using only Sidekiq on new applications, and have migrated some old applications to it from other workers. The benefit of Sidekiq is that it uses the actor model and threads to process jobs concurrently; it has a low memory footprint and can noticeably boost your application’s performance and responsiveness. If you are on Heroku, this will cost you one worker instance per month, which should be enough to process a very large number of background jobs. If you are running your own stack, one server with minimal hardware will suffice. No one cares if their email is delivered in 3 minutes instead of right now. But they do care if they can’t do anything for those 3 minutes while waiting for “something to finish”. Here is a basic example of how it’s done with Sidekiq:

class UserGeoLocator
  include Sidekiq::Worker

  # Runs in a Sidekiq worker process, outside the web request cycle.
  # Pass the id rather than the record: job arguments are serialized
  # to Redis, so they should be simple JSON-friendly values.
  def perform(user_id)
    user = User.find(user_id)
    user.perform_long_geolocation_api_call!
  end
end
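To actually enqueue the job, you call the worker class rather than instantiate it. This fragment assumes a Rails app with Sidekiq and Redis configured:

```ruby
# In a controller or model: push the job onto Redis and return immediately.
# A Sidekiq worker process will pick it up and run perform(user_id).
UserGeoLocator.perform_async(user.id)

# Or schedule it to run later instead of as soon as possible:
UserGeoLocator.perform_in(5.minutes, user.id)
```

The web request finishes right away; the slow geolocation call happens in the worker.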

You can find out all the things it can be used for in the Sidekiq Wiki on GitHub.
