@Carpk
Forked from ndelage/web_performance.md
Created February 24, 2014 16:09

# Web App Performance

## Back End

#### N+1

Accessing an association inside a loop over a collection fires one query per record (the N+1 problem). Eager loading the association uses two queries total instead:

```ruby
# SELECT * FROM users; then SELECT * FROM profiles WHERE user_id IN (...)
@users = User.all.includes(:profile)
```

#### Excessive joins, subselects, or extra queries

Counts that depend on complicated queries or computations are usually good candidates for denormalization.
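A common Rails form of this is a counter cache, which denormalizes a child-row count onto the parent. A sketch, assuming a `posts_count` integer column on `users` (the model names are illustrative):

```ruby
class Post < ActiveRecord::Base
  # On create/destroy, Rails increments/decrements users.posts_count,
  # so displaying a user's post count never needs a COUNT(*) query.
  belongs_to :user, :counter_cache => true
end

class User < ActiveRecord::Base
  has_many :posts
end

# user.posts_count reads the denormalized column directly
```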

#### Are you using Ruby Enumerable or SQL?

```ruby
# Loads every user into memory, then filters in Ruby
User.all.select { |u| u.active }.each do |u|
  # ...
end
```

vs

```ruby
# Lets the database do the filtering
User.where(:active => true)
```

#### Do you need all the data from a table?

Consider using `#select` or `#pluck` to load only the fields you need. Measure to prove this is actually helping before you commit to the strategy.

```ruby
# Returns an array of Person objects populated with only id & name
Person.select(:id, :name)
# Returns an array of ids for the first 100 active people: [1, 2, 4, 6, ...]
Person.active.limit(100).pluck(:id)
```

#### Begin with the end in mind

Given this model:

```ruby
class User < ActiveRecord::Base
  has_many :friends
  has_many :posts
end
```

How can we find the posts written by a user's friends?

__Slow, roundabout approach__

```ruby
u = User.find(params[:id])
posts = []
u.friends.includes(:posts).each do |f|
  posts << f.posts
end
# posts is now a nested array: [[p, p, p], [p]]
posts.flatten.sort_by { |p| p.created_at }
```


__Getting to the point, with speed__

```ruby
Post.order("created_at DESC").where(:poster_id => u.friends.pluck(:id))
```


#### Indexing
If the same queries are getting slower as you add more data to a table, you'd likely benefit from an index.

Indexes are most commonly added for foreign keys (e.g. `company_id`), but they're useful for any field you search against frequently. For example, an index on `users.email` would speed up finding a user by email, which might matter once you reach a few thousand users.

Without an index, the database must check every single row in a table (called a table scan).
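In Rails you'd add that index in a migration. A sketch (the migration class name is illustrative, and `:unique => true` assumes emails must be unique in your app):

```ruby
class AddIndexToUsersEmail < ActiveRecord::Migration
  def change
    # Turns the find-by-email table scan into an index lookup
    add_index :users, :email, :unique => true
  end
end
```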

### Caching

#### Simple per-request caching
Memoize with `@foo ||= ...` so an expensive value is computed at most once per request.
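A minimal plain-Ruby sketch of the pattern (class and method names are illustrative): `||=` stores the first result in an instance variable, so later calls on the same object reuse it. Note that `||=` recomputes whenever the cached value is `nil` or `false`.

```ruby
class ReportPresenter
  attr_reader :computations

  def initialize
    @computations = 0
  end

  # First call computes and caches; later calls return the cached hash.
  def totals
    @totals ||= begin
      @computations += 1
      { :orders => 10, :revenue => 1_000 }
    end
  end
end

report = ReportPresenter.new
report.totals
report.totals
report.computations # => 1
```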

#### Caching across requests
Use `Rails.cache` (backed by e.g. memcached) to share cached values across requests and processes.
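`Rails.cache.fetch` follows a read-through pattern: return the cached value on a hit; on a miss, run the block, store its result, and return it. A minimal plain-Ruby sketch of that behavior (class and key names are illustrative; the real store also handles expiry and serialization):

```ruby
class TinyCache
  def initialize
    @store = {}
  end

  # Hit: return the stored value. Miss: run the block, store and return it.
  def fetch(key)
    return @store[key] if @store.key?(key)
    @store[key] = yield
  end
end

cache = TinyCache.new
queries = 0
first  = cache.fetch("user/1/post_count") { queries += 1; 7 }
second = cache.fetch("user/1/post_count") { queries += 1; 7 }
[first, second, queries] # => [7, 7, 1]
```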

#### Shorter term data storage
Redis or similar. Supports data structures like lists, hashes & sets, with operations built into the store itself (e.g. membership checks, the server-side equivalent of Ruby's `Array#include?`).

### Long running tasks
* Move them to background jobs
* This includes sending email & making API requests
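The idea, sketched with a toy in-process queue in plain Ruby (real apps use a job library such as Resque or Sidekiq so work survives restarts): the request handler only enqueues the slow work and returns immediately, while a separate worker performs it.

```ruby
require "thread"

jobs = Queue.new
results = []

# Worker: pops jobs and runs them until told to stop
worker = Thread.new do
  while (job = jobs.pop) != :stop
    job.call
  end
end

# "Request handler": enqueue the slow work (e.g. sending an email)
# and return without waiting for it
jobs << lambda { results << "welcome email sent" }

jobs << :stop
worker.join
results # => ["welcome email sent"]
```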

### Links
* [query_analyzer for pg](https://github.com/trevorturk/pg_query_analyzer)
* [Google Pagespeed Insights](http://developers.google.com/speed/pagespeed/insights/)


## Front End

#### Fewer requests
This is especially important for slower (mobile) clients, where latency is an issue. Imagine the cost of 80ms of latency on every request when a page depends on 12 CSS files and 8 JS files...ugh!

The asset pipeline is a great example to follow.

#### Load less data
* Lazy load via JS
* Image resizing
* Image sprites