const flattenTco = ([first, ...rest], accumulator) =>
  (first === undefined)
    ? accumulator
    : (Array.isArray(first))
      ? flattenTco([...first, ...rest])
      : flattenTco(rest, accumulator.concat(first))
const flatten = (n) => flattenTco(n, []);
console.log(flatten([[1,[2,[[3]]]],4,[5,[[[6]]]]]))
There is a bug on line 5:
? flattenTco([...first, ...rest])
should be:
? flattenTco([...first, ...rest], accumulator)
Otherwise the accumulator built up so far is dropped whenever the function recurses into a nested array.
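For reference, here is a corrected sketch with the accumulator threaded through every recursive call (my own rewrite, not the original gist):
const flattenTco = ([first, ...rest], accumulator) =>
  (first === undefined)
    ? accumulator
    : (Array.isArray(first))
      // pass the accumulator along so the results gathered so far are kept
      ? flattenTco([...first, ...rest], accumulator)
      : flattenTco(rest, accumulator.concat(first))
const flatten = (n) => flattenTco(n, []);
console.log(flatten([[1,[2,[[3]]]],4,[5,[[[6]]]]])) // => [ 1, 2, 3, 4, 5, 6 ]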
I imagine that once JS runtimes implement tail-call optimisation themselves, the reduce version may work with deeply nested arrays; at present, however, the reduce-based flatten fails on them. Here is a quick test case: the code creates an array with a depth of 3000 and fills each level with a random number of integers ranging from 0 to 10. If you run this code after compiling it to ES5, you will see that flatten2 executes correctly whereas flatten errors with a "Maximum call stack size exceeded" error.
Here is a link to the code within the BabelJS online REPL
const randomArray = (arr = []) =>
Math.random() * 10 < 9
? randomArray([...arr, Math.floor(Math.random() * 10)])
: arr
const nestedRandomArray = (depth = 0, arr = [], fillWith) =>
depth == 0
? arr
: nestedRandomArray(depth - 1, [...randomArray(), arr])
const array = nestedRandomArray(3000)
const flatten = list => list.reduce(
(a, b) => a.concat(Array.isArray(b) ? flatten(b) : b), []
)
const flatten2 = ([first, ...rest], arr = []) =>
(first === undefined)
? arr
: (Array.isArray(first))
? flatten2([...first, ...rest], arr)
: flatten2(rest, [...arr, first])
const flatarray2 = flatten2(array)
console.log(flatarray2)
const flatarray = flatten(array)
console.log(flatarray)
If we write a left fold function in user-land that is tail-recursive it will work:
const foldl = (fn, terminalValue, [first, ...rest]) =>
first === void 0
? terminalValue
: foldl(fn, fn(terminalValue, first), rest)
const flatten2 = list => foldl(
(a, b) => a.concat(Array.isArray(b) ? flatten2(b) : b)
, []
, list
);
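For example (my own usage sketch of the fold-based flatten2 above):
console.log(flatten2([[1, [2]], 3, [[4]]])) // => [ 1, 2, 3, 4 ]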
With loops only
const flatten = arr => {
let index
while ( (index = arr.findIndex(el => Array.isArray(el))) > -1 ) {
arr.splice(index, 1, ...arr[index])
}
return arr
}
flatten([1, [2, 'a', { b: 1, c: [2, 3] } ], [3, 4, [5, 6]]])
// => [ 1, 2, 'a', { b: 1, c: [ 2, 3 ] }, 3, 4, 5, 6 ]
Flattening one level deep is really simple:
function flatten(arr) {
return Array.prototype.concat(...arr);
}
Usage:
> flatten([1,2,3])
[ 1, 2, 3 ]
> flatten([1,2,3,[4,5]])
[ 1, 2, 3, 4, 5 ]
> flatten([1,2,3,[4,5,[6,7]]])
[ 1, 2, 3, 4, 5, [ 6, 7 ] ]
@mnpenner such a great solution.
I imagine you could also do Array.prototype.concat(...Array.prototype.concat(...arr))
and so on to flatten further levels.
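A sketch of that idea (my own, not from the thread): each application of the one-level flatten removes one level of nesting.
const flattenOnce = arr => Array.prototype.concat(...arr);
console.log(flattenOnce(flattenOnce([1, [2, [3, [4]]]])));
// => [ 1, 2, 3, [ 4 ] ]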
Awesome solution by @mnpenner. I needed this for Object.entries, so a single-level flatten is all I needed. Thanks!
Array.prototype.flatten = function() {
return this.toString().replace('[', '').replace(']', '').split(',')
}
Nested arrays too
Deep version of @mnpenner
const flatten = arr => {
while (arr.find(el => Array.isArray(el))) arr = Array.prototype.concat(...arr)
return arr
}
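For example (my own usage of the deep version above):
console.log(flatten([[1, 2], [3, [4, [5]]]])) // => [ 1, 2, 3, 4, 5 ]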
function flatten(arr) {
return arr.reduce( (acc,arr) => [...acc, ...arr], []);
}
@christophemarois : how performant is that? 👍
@gkatsanos not really, you are right. This is the most performant version I can think of, though it's not as succinct.
const flatten = arr => {
let i = 0
while (i < arr.length) {
if (Array.isArray(arr[i])) {
arr.splice(i, 1, ...arr[i])
} else {
i++
}
}
return arr
}
It iterates over the array from left to right, staying on the same index as long as the element at that index is an array. It uses splice to remove the array at the current index and insert its elements in its place. Because arr.length is evaluated on every loop iteration, the condition always reflects the array's current length.
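For example (my own usage sketch of the splice-based flatten above):
console.log(flatten([1, [2, [3, [4, 5]]], 6])) // => [ 1, 2, 3, 4, 5, 6 ]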
@christophemarois a slightly shorter version:
const flatten = arr => {
let i = 0
while (i < arr.length) {
Array.isArray(arr[i]) && arr.splice(i, 1, ...arr[i]) || i++
}
return arr
}
OR
const flatten = arr => {
for (let i=0; i < arr.length;Array.isArray(arr[i]) && arr.splice(i, 1, ...arr[i]) || i++){}
return arr
}
You can even omit the return statement, since the array is passed by reference and mutated in place.
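To illustrate the mutation (my own example, using the loop version above):
const nested = [1, [2, [3]]]
flatten(nested)     // return value deliberately ignored
console.log(nested) // => [ 1, 2, 3 ] -- the original array was flattened in place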
@Phederic, this doesn't work without Babel, and with Babel it works, but not as expected:
function flatten(arr) {
  return arr.reduce((acc, arr) => [...acc, ...arr], []);
}
let res = flatten([1, [2], 3, 4]); // res equals [2], not [1, 2, 3, 4]
(Natively, spreading a non-iterable number throws a TypeError; Babel's spread helper appears to convert non-array values with Array.from, which turns numbers into empty arrays, so only the nested [2] survives.)
Hi guys.
I've created 5 test cases with different approaches. Can someone please explain why the while+shift+concat+push technique is the fastest one?
Perhaps concat is the cheapest operation.
const flatten_loop = arr => {
let stack = [];
let item;
while (item = arr.shift()) {
if (Array.isArray(item)) {
arr = item.concat(arr);
} else {
stack.push(item);
}
}
return stack;
}
Test cases
https://jsperf.com/flatten-variants
/* flatten depth 1 */
let flatten = arr => [].concat(...arr);

/**
 * flatten depth d < 4
 * not working properly
 */
let flattend = (arr, d) => {
  return d === 1 ? flatten(arr) : flatten(flatten(arr, d-1))
}
Is there any alternative that works regardless of the flatten depth?
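One possible fix for the depth-limited version above (my own sketch, renamed flattenDepth to avoid clashing with the snippet above): the recursion should go through the depth-aware function itself, so each call removes exactly one level.
const flattenDepth = (arr, d) =>
  // assumes d >= 1: flatten (d - 1) levels first, then one more
  d === 1 ? flatten(arr) : flatten(flattenDepth(arr, d - 1))
console.log(flattenDepth([1, [2, [3, [4]]]], 2)) // => [ 1, 2, 3, [ 4 ] ]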
I'm using the following, and it works like a charm:
export function flattenArray( array: Array<any> ): Array<any> {
const flattenedArray: Array<any> = [].concat( ...array );
return flattenedArray.some( Array.isArray )
? flattenArray( flattenedArray )
: flattenedArray;
}
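For instance (my own usage example):
console.log(flattenArray([1, [2, [3]], 4])) // => [ 1, 2, 3, 4 ]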
You can perhaps use Array.prototype.flat()
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/flat
@bt4R9 your code has a bug. You can't just check whether a value is truthy; you must check whether it's undefined, otherwise the loop stops processing early. For example, flatten_loop([1,null,2]) will equal [1], not [1,null,2].
This version should fix that:
const flatten_loop = arr => {
let stack = [];
let item;
while ((item = arr.shift()) !== undefined) {
if (Array.isArray(item)) {
arr = item.concat(arr);
} else {
stack.push(item);
}
}
return stack;
}
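With the undefined check, falsy elements now survive (my own example; note that an element that is literally undefined would still end the loop early):
console.log(flatten_loop([1, null, 2, [0, [false, '']]]))
// => [ 1, null, 2, 0, false, '' ]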
var a = [1,2,3,[1,2,3],4,[5,6,7], 9,10] ;
var result = [];
a.forEach(data=>{
result = [...result].concat(data)
})
console.log(result);
Output
[1, 2, 3, 1, 2, 3, 4, 5, 6, 7, 9, 10]
Not efficient, but interesting!
arr = [1,2,3,[4,5,[1,2,3,4,56,67],6,7,[7.5,7.6,7.9],8],9,10,11]
const flat = (arr, depth = Infinity, arr2 = []) => {
  arr.forEach(e => {
    typeof e === 'object' && depth ? flat(e, depth - 1, arr2) : arr2.push(e)
  });
  return arr2
}
console.log(flat(arr, 1)) // [ 1, 2, 3, 4, 5, [ 1, 2, 3, 4, 56, 67 ], 6, 7, [ 7.5, 7.6, 7.9 ], 8, 9, 10, 11 ] // depth-1 flat
console.log(flat(arr)) // [ 1, 2, 3, 4, 5, 1, 2, 3, 4, 56, 67, 6, 7, 7.5, 7.6, 7.9, 8, 9, 10, 11 ] // full flat
const deepFlat = (arr) => arr.flat(Infinity)
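For example (usage of the built-in flat-based one-liner above):
console.log(deepFlat([1, [2, [3, [4]]]])) // => [ 1, 2, 3, 4 ]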
One hack solution (deeply nested arrays also work):
JSON.stringify(arr).split(',').map(each => each.match(/[+-]?\d+(?:\.\d+)?/g)[0])
This works like a charm for numbers, but not with strings, e.g. ['un', 2, 'trois', 4, 'cinq'].
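A variant of the same stringify hack that also copes with strings (my own sketch; it still breaks if a string itself contains square brackets, and it only handles JSON-serialisable values):
const flattenHack = arr =>
  // strip all brackets from the JSON text, then re-wrap as a single array
  JSON.parse('[' + JSON.stringify(arr).replace(/[\[\]]/g, '') + ']')
console.log(flattenHack(['un', 2, ['trois', [4, 'cinq']]]))
// => [ 'un', 2, 'trois', 4, 'cinq' ]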
For an immutable solution (one that does not mutate the input), this should be the most performant option, because it only uses pop() and push():
const flatten = (list) => {
const stack = [list];
const result = [];
while(stack.length > 0) {
const elem = stack.pop();
if (Array.isArray(elem)) {
for (let i = elem.length - 1; i >= 0; i--) {
stack.push(elem[i]);
}
} else {
result.push(elem);
}
}
return result;
};
Could probably have used Array#reduce as well...