Best optimization concepts for a Rails application
Optimization is a must for a Rails application, especially for high-traffic Ruby on Rails applications and for good UX. Here are some optimization tips that I have applied in my recent Rails projects.
Optimization of the Rails application and application server
- Apply caching (object caching, fragment caching, action caching, page caching) and use a memcached server instead of disk to store cached data. You can get an overview of Rails caching here: http://guides.rubyonrails.org/caching_with_rails.html
- If your project has complex search functionality, use a search engine like Sphinx or Solr instead of SQL LIKE queries.
- If you use Passenger as your application server, try setting the Passenger pool size equal to your server's number of CPU cores; for more, see http://www.modrails.com/documentation/Users%20guide%20Apache.html#_passengermaxpoolsize_lt_integer_gt
- You could use Amazon S3 to host your assets (including images and compressed JavaScript and CSS).
- There is a great Ruby gem for Rails, asset_sync (https://github.com/AssetSync/asset_sync), to sync your assets immediately after deploy.
- You could also use a CDN (Content Delivery Network) for static resources (images, videos, PDFs, CSS, JS, etc.) so they are delivered from locations close to your users.
- If a CDN looks too costly, use Rails asset hosts (not an alternative to a CDN) to allow concurrent downloads of static resources; for more about asset hosts, see http://api.rubyonrails.org/classes/ActionView/Helpers/AssetTagHelper.html
- Use eager loading to avoid the N+1 query problem; see http://guides.rubyonrails.org/active_record_querying.html#eager-loading-associations
- You could use background jobs (e.g., Resque, Sidekiq) to process things asynchronously. For instance, after a user registers, you may want to send an email via SendGrid/Mailchimp/AWS SES; simply execute the email-sending code inside a worker process to give the user a faster UX.
- To process big data sets, check this link for parallel processing with fork:
http://abdul-barek-rails.blogspot.com/2012/02/parallel-processing-in-ruby-on-rails.html
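The caching tip above can be sketched as a configuration fragment plus a view fragment. The memcached host/port and the `@products` fragment are assumptions for illustration, not code from a real project:

```ruby
# config/environments/production.rb -- switch the cache store from
# disk to memcached (host and port are assumptions):
config.cache_store = :mem_cache_store, "localhost:11211"

# app/views/products/index.html.erb -- fragment caching around an
# expensive partial:
# <% cache @products do %>
#   <%= render @products %>
# <% end %>
```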
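The eager-loading tip above can be sketched with ActiveRecord. The `User`/`Post` models are assumptions for the example; this fragment needs a Rails app around it to run:

```ruby
# N+1 problem: one query for the users, then one query per user for posts.
User.limit(50).each { |user| user.posts.to_a }

# Eager loading: two queries in total, regardless of how many users there are.
User.includes(:posts).limit(50).each { |user| user.posts.to_a }
```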
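The parallel-processing-with-fork idea from the link above can be sketched in plain Ruby: split a data set into chunks, process each chunk in a forked child process, and collect results through pipes. The chunking strategy and the worker count are assumptions for the example:

```ruby
require "etc"

# Process `items` in parallel across forked child processes,
# yielding each item to the given block. Results come back through
# pipes, serialized with Marshal, in chunk order.
def parallel_map(items, workers: Etc.nprocessors)
  chunks = items.each_slice((items.size / workers.to_f).ceil).to_a
  pipes = chunks.map do |chunk|
    reader, writer = IO.pipe
    fork do
      reader.close
      result = chunk.map { |item| yield(item) }
      writer.write(Marshal.dump(result))
      writer.close
    end
    writer.close   # the parent only reads
    reader
  end
  Process.waitall  # wait for every child to finish
  pipes.flat_map do |reader|
    data = reader.read
    reader.close
    Marshal.load(data)
  end
end

squares = parallel_map((1..8).to_a, workers: 4) { |n| n * n }
# squares => [1, 4, 9, 16, 25, 36, 49, 64]
```

Note that `fork` is not available on all platforms (e.g., Windows), and for very large per-chunk results you would read the pipes before `Process.waitall` to avoid filling the pipe buffers.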
Optimization of the web server
- Use nginx instead of Apache for faster serving of static resources (images, videos, CSS, JavaScript, text, HTML, etc.).
- To have static resources cached on the client side for a specific time (for example, 30 days), use this configuration:
http://abdul-barek-rails.blogspot.com/2012/07/integrate-http-reverse-proxy-cache.html
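The client-side caching tip above can be sketched as an nginx configuration fragment; the file extensions and the 30-day lifetime are assumptions for illustration:

```nginx
# Tell browsers to cache matching static files for 30 days.
location ~* \.(css|js|png|jpg|jpeg|gif|ico)$ {
  expires 30d;
  add_header Cache-Control "public";
}
```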
Client Side Optimization
- Use asynchronous (AJAX) requests where you can for better responsiveness and user experience.
- Try to reduce the number of HTTP requests, which is the key client-side optimization.
- In the production environment, use the Rails asset pipeline so that your JavaScript and CSS files are combined and minified.
- Use CSS sprites to avoid loading many images of the same type separately, reducing the number of HTTP requests for images.
- Try to reduce the weight of static resources (their size in KB or bytes).
- You can use the jQuery Lazy Load plugin to load images lazily, or the jQuery Appear plugin to load images/content only when they scroll into view.
- To measure your application's load time and performance, use this Google tool:
https://developers.google.com/speed/pagespeed/insights
- For better inspection of your server's resources in the browser, use the Firebug add-on in Firefox: see how long each resource takes to load, how much each one weighs, and which HTTP requests are unnecessary, then reduce resource weight, load time, and unnecessary requests where you can.
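The asset pipeline tip above can be sketched as a production configuration fragment. The option names are Rails 3.1-era asset pipeline settings; treat this as a sketch, not a complete configuration:

```ruby
# config/environments/production.rb
config.assets.compress = true   # minify combined JavaScript and CSS
config.assets.digest   = true   # fingerprint filenames so clients can cache aggressively
config.assets.compile  = false  # serve only precompiled assets in production
```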
Database Optimization
- Add database indexes on the columns that are used in joins, as foreign keys, for ordering, or in WHERE clauses.
- Prepare reports ahead of time: instead of pulling a big result set on the fly, prepare the data before it is needed. You can use a scheduler gem like rufus-scheduler or delayed_job to make this data ready in advance.
- Inspect slow queries in your database server's log file and optimize them.
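The indexing tip above can be sketched as an ActiveRecord migration; the table and column names (`orders.user_id`, `orders.created_at`) are assumptions for the example:

```ruby
class AddIndexesToOrders < ActiveRecord::Migration
  def change
    add_index :orders, :user_id      # foreign key used in joins and WHERE clauses
    add_index :orders, :created_at   # column used for ordering
  end
end
```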
You can use New Relic for detailed server logs, graphs, database logs, and much more to analyze further.