
@viezel
Last active August 29, 2024 08:11
Import test using Laravel-Excel
<?php

public function import(Request $request)
{
    // allow long-running imports
    ini_set('max_execution_time', 300);

    info("\n\n------------------ NEW ---------------------- ");
    info("Start import: " . (memory_get_usage() / 1024 / 1024) . " MB");

    $before = microtime(true);

    // use chunks to import the products from the CSV (Laravel-Excel 2.x API)
    \Excel::filter('chunk')->load(storage_path('products.csv'))->chunk(1500, function ($results) {
        $data = [];
        info("Chunk start: " . (memory_get_usage() / 1024 / 1024) . " MB");

        foreach ($results as $row) {
            $data[] = [
                "name" => $row->name,
                "description" => $row->description_short,
                "width" => $row->width,
                "ingredient_description" => $row->ingredient_description,
            ];
        }

        // insert into the db
        //\DB::table('testimport')->insert($data);
        info("Chunk end: " . (memory_get_usage() / 1024 / 1024) . " MB");
    });

    $after = microtime(true);
    info("Total duration: " . ($after - $before) . " sec");
    info("Memory peak: " . (memory_get_peak_usage() / 1024 / 1024) . " MB");
    info("------------------ END ---------------------- \n\n");
}
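The chunked-read pattern used above can be sketched outside Laravel as well. A minimal Python equivalent is shown below for illustration; it is not part of the gist, and the generic row handling stands in for the product columns mapped in the PHP snippet:

```python
import csv

def read_in_chunks(path, chunk_size=1500):
    """Yield lists of row dicts, mirroring Laravel-Excel's chunk() callback.

    Only chunk_size rows are held in memory at a time, which is the point
    of chunking: memory use stays roughly flat regardless of file size.
    """
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        chunk = []
        for row in reader:
            chunk.append(row)
            if len(chunk) == chunk_size:
                yield chunk
                chunk = []
        if chunk:  # trailing partial chunk
            yield chunk
```

Each yielded chunk can then be bulk-inserted, analogous to the commented-out `\DB::table('testimport')->insert($data)` call above.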

Mass CSV Import

Using Laravel 5.2, and Laravel-Excel 2.1.3

Import v1 results

Importing 3,500 products from a CSV file (file size: 2.2 MB).

Test 1

Chunk: 1500. Duration: 43 sec. Memory peak: 56 MB.

Test 2

Chunk: 700. Duration: 48 sec. Memory peak: 54 MB.

Test 3

Chunk: 300. Duration: 80 sec. Memory peak: 53 MB.

Test 4

No chunk. Duration: 18 sec. Memory peak: 59 MB.

Import v2 results (big file)

Importing 50,000 products from a CSV file (file size: 57.1 MB).

Test 1

Chunk: 1500. Duration: 2019 sec (33.7 min). Memory peak: 352 MB.

Test 2

Chunk: 5000. Duration: 694 sec (11.6 min). Memory peak: 353 MB.
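Converting the two big-file runs into throughput makes the effect of the chunk size easier to compare. The numbers below are taken directly from the results above; the calculation itself is just rows divided by seconds:

```python
# Throughput of the 50,000-row import at the two chunk sizes tested above
tests = {
    "chunk 1500": (50_000, 2019),  # (rows, duration in seconds)
    "chunk 5000": (50_000, 694),
}
for name, (rows, secs) in tests.items():
    print(f"{name}: {rows / secs:.0f} rows/sec")
# chunk 1500 -> ~25 rows/sec, chunk 5000 -> ~72 rows/sec
```

So tripling the chunk size roughly tripled throughput here, at essentially the same memory peak.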
