Created February 26, 2012 17:39
Export ActiveRecord Tables to CSV
require 'csv'

class Exporter
  DEFAULT_EXPORT_TABLES = [ Invoice, InvoiceItem, Item, Merchant, Transaction, User ]
  DESTINATION_FOLDER = "tmp/"

  def self.export_tables_to_csv(tables = DEFAULT_EXPORT_TABLES)
    tables.each { |klass| export_table_to_csv(klass) }
  end

  def self.export_table_to_csv(klass)
    output_file = CSV.open(filename_for_class(klass), "w")
    output_file << klass.headers
    klass.data.each { |row| output_file << row }
    output_file.close
  end

  def self.filename_for_class(klass)
    DESTINATION_FOLDER + klass.to_s.underscore + '.csv'
  end
end

class ActiveRecord::Base
  def self.data
    find(:all).map { |i| i.data }
  end

  def self.headers
    columns.map(&:name)
  end

  def data
    self.class.headers.map { |column| self.send(column) }
  end
end
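The headers/data interface the monkey patch adds can be sketched without a database. Here is a minimal runnable version of the same pattern, with a plain Ruby class (`Fruit`, hypothetical, as are its rows) standing in for an ActiveRecord model:

```ruby
require 'csv'
require 'fileutils'

# Fruit is a stand-in for an ActiveRecord model: it exposes the same
# class-level headers/data interface the monkey patch above provides.
class Fruit
  def self.headers
    %w[id name]                      # mirrors columns.map(&:name)
  end

  def self.data
    [[1, "apple"], [2, "pear"]]      # mirrors find(:all).map { |i| i.data }
  end
end

FileUtils.mkdir_p("tmp")
CSV.open("tmp/fruit.csv", "w") do |csv|
  csv << Fruit.headers
  Fruit.data.each { |row| csv << row }
end

puts File.read("tmp/fruit.csv")
# id,name
# 1,apple
# 2,pear
```

Using `CSV.open` with a block (rather than a manual `close`) guarantees the file is closed even if writing a row raises.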
On line 25, I recommend you don't call find(:all) but instead use find_in_batches or find_each(:batch_size => 500), since you can totally blow the process memory if you have too many records.
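The batching idea can be sketched with plain Ruby (the data here is made up; on a real ActiveRecord model `find_each(:batch_size => 500)` would do the slicing for you): rather than materializing every record at once the way `find(:all)` does, stream rows through in fixed-size batches.

```ruby
require 'csv'
require 'fileutils'

FileUtils.mkdir_p("tmp")
CSV.open("tmp/users.csv", "w") do |csv|
  csv << %w[id name]
  # Process 500 ids at a time, so only one batch is held in memory,
  # mirroring what find_each(:batch_size => 500) does against the database.
  (1..1000).each_slice(500) do |ids|
    ids.each { |i| csv << [i, "user_#{i}"] }
  end
end
```

The output file is identical to the all-at-once version; only the peak memory footprint changes.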
SELECT INTO OUTFILE BRO
I note that two of the three comments assume that the database is MySQL.
@ccalvert my comment includes the command for Postgres, so no, I didn't assume that.
@norman, wow, I read your code too fast, didn't I? Sorry about that. Comment retracted.
If you have a reasonable amount of data it's much faster and easier to drop down a level and just use the command-line switches provided by mysqldump and friends:
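As a sketch of that command-line route (the database name, user, and output path here are hypothetical): mysqldump's --tab mode writes one delimiter-separated text file per table straight from the server, skipping Ruby entirely.

```ruby
# Build the mysqldump invocation; --tab=DIR emits a .txt data file per
# table into DIR, and --fields-terminated-by controls the delimiter.
tables = %w[invoices invoice_items items merchants transactions users]
cmd = ["mysqldump", "--user=app", "--tab=/tmp", "--fields-terminated-by=,",
       "my_database", *tables].join(" ")
puts cmd  # inspect the command first; run it with system(cmd) when ready
```

Note that --tab requires the MySQL server to have file-writing privileges on the target directory, since the server process (not the client) writes the data files.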