1. Dump the data-only SQL to a file
$ pg_dump --data-only --inserts YOUR_DB_NAME > dump.sql
2. scp the dump to your local machine
3. Remove the SET statements at the top (steps 3-6 can be scripted; see the sed sketch after step 7)
such as: | |
SET statement_timeout = 0; | |
SET client_encoding = 'SQL_ASCII'; | |
4. Remove the setval sequence queries | |
such as: | |
SELECT pg_catalog.setval('MY_OBJECT_id_seq', 10, true);
5. Replace true => 't' and false => 'f'
-- These: | |
INSERT INTO table_name VALUES (1, true, false); | |
-- Should be replaced with:
INSERT INTO table_name VALUES (1, 't', 'f'); | |
6. Add BEGIN; and END; to wrap the whole thing in a transaction
7. Import | |
$ rm db/development.sqlite3 | |
$ rake db:migrate | |
$ sqlite3 db/development.sqlite3 | |
> delete from schema_migrations; | |
> .read dump.sql | |
> .exit
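The edits in steps 3-6 can be scripted. A rough sketch with GNU sed (BSD/macOS sed needs -i ''); the boolean rewrite is naive and assumes no text column contains the bare words true or false:
$ sed -i '/^SET /d' dump.sql                              # step 3: drop the SET statements
$ sed -i '/setval/d' dump.sql                             # step 4: drop the setval sequence queries
$ sed -i "s/\btrue\b/'t'/g; s/\bfalse\b/'f'/g" dump.sql   # step 5: rewrite booleans (naive!)
$ sed -i '1i BEGIN;' dump.sql && echo 'END;' >> dump.sql  # step 6: wrap in a transaction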
This was super helpful. Thank you!
I'm migrating a Rails database, and thought it worth adding that the INSERTs for both the ar_internal_metadata table and the schema_migrations table should be removed, as these are created when running rails db:migrate. They will raise UNIQUE constraint failed errors otherwise.
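If it helps, a rough GNU sed one-liner for that (untested sketch; drops any INSERT mentioning either table):
$ sed -i -e '/INSERT INTO.*schema_migrations/d' -e '/INSERT INTO.*ar_internal_metadata/d' dump.sql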
Thanks @equivalent and @aaronmoodie! I added your suggestions to my script -- you can see them here:
hirefrank/pg-to-sqlite#1
Also @aaronmoodie 👋 it's been a long time since Etsy!
Oh! hey @hirefrank! 👋 I didn't realise that was you! Thanks for updating the script and for adding my suggestions.
@hirefrank you forgot to remove pg_catalog occurrences
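Something like this might cover it (GNU sed; assumes the remaining pg_catalog. occurrences are just schema-qualified prefixes):
$ sed -i 's/pg_catalog\.//g' dump.sql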
Thanks @jvitoroc! Pls review and test: hirefrank/pg-to-sqlite#3
If this works, I'll merge!
Merged!
Such a clusterfuck...
It's not straightforward if you use enums, lots of timestamps, etc.; the db dump is just not compatible.
Solution: dump to raw data, and load from the raw data.
Add this to your ApplicationRecord (or modify it however you want):
require 'csv' # stdlib, but not auto-required by Rails

# Dump every record of this model to db/seeds/<table_name>.csv.
def self.to_csv
  file_path = Rails.root.join('db', 'seeds', "#{self.name.underscore.pluralize}.csv")
  FileUtils.mkdir_p(File.dirname(file_path)) # idempotent; creates db/seeds if missing
  CSV.open(file_path, 'wb') do |csv|
    csv << column_names
    self.unscoped.each do |record|
      csv << record.attributes.values
    end
  end
  true
end

# Load db/seeds/<table_name>.csv back in, parsing datetime columns.
def self.from_csv
  file_path = Rails.root.join('db', 'seeds', "#{self.name.underscore.pluralize}.csv")
  raise "File does not exist: #{file_path}" unless File.exist?(file_path)

  column_types = self.columns_hash
  rows = []
  CSV.foreach(file_path, headers: true) do |row|
    attributes = row.to_hash
    column_types.each do |column_name, column_info|
      # CSV gives us strings; convert timestamps back to Time objects
      if column_info.type == :datetime || column_info.type == :timestamp
        attributes[column_name] = Time.parse(attributes[column_name]) if attributes[column_name]
      end
    end
    rows << attributes
  end

  if rows.any?
    self.insert_all(rows) # bulk insert; skips validations and callbacks
    puts "#{rows.size} records imported successfully!"
  end
end

# Export every leaf model (models with no subclasses) to CSV.
def self.dump_all_models_to_csv
  descendants.each do |model|
    next unless model.descendants.empty?
    model.to_csv
    puts "Exported #{model.name} to CSV successfully."
  end
end

# Wipe and re-import every model that has a CSV file in db/seeds.
def self.import_all_models_from_csv
  seeds_path = Rails.root.join('db', 'seeds')
  csv_files = Dir.glob(seeds_path.join('*.csv'))
  csv_files.each do |file_path|
    file_name = File.basename(file_path, '.csv')
    model_name = file_name.singularize.camelize
    if Object.const_defined?(model_name)
      model = model_name.constantize
      model.unscoped.delete_all
      model.from_csv
      puts "Imported data from #{file_name}.csv successfully."
    else
      puts "No model found for #{file_name}.csv; skipping import."
    end
  end
end
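A possible way to drive the round trip from the shell (Rails.application.eager_load! is needed in development so descendants actually sees every model):
$ rails runner 'Rails.application.eager_load!; ApplicationRecord.dump_all_models_to_csv'
# point config/database.yml at sqlite3, then:
$ rails db:migrate
$ rails runner 'Rails.application.eager_load!; ApplicationRecord.import_all_models_from_csv'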
I'm on PostgreSQL 16 and the default dump file's INSERT INTO format is not compatible with the one SQLite uses. I found out that if you dump the DB with --attribute-inserts it will work:
$ pg_dump --attribute-inserts --data-only -d my_psql_db > /tmp/dump.sql
Reference: the pg_dump docs
https://hhh.how/notes/2067-convert-postgresql-to-sqlite