// ⚠ IMPORTANT: this is old and doesn't work for many different edge cases but I'll keep it as-is for any of you who want it
// ⚠ IMPORTANT: you can find more robust versions in the comments or use a library implementation such as lodash's `merge`
// Merge a `source` object into a `target` object recursively
const merge = (target, source) => {
// Iterate through `source` properties and, if an `Object`, set the property to the merge of the `target` and `source` properties
for (const key of Object.keys(source)) {
if (source[key] instanceof Object) Object.assign(source[key], merge(target[key], source[key]))
}
// Join `target` and modified `source`
Object.assign(target || {}, source)
return target
}
const merge=(t,s)=>{const o=Object,a=o.assign;for(const k of o.keys(s))s[k]instanceof o&&a(s[k],merge(t[k],s[k]));return a(t||{},s),t}
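For anyone skimming, a quick usage sketch of the gist's merge (the function is reproduced verbatim so the snippet is self-contained; it mutates the target in place):

```javascript
// The gist's merge, reproduced so this example runs on its own
const merge = (target, source) => {
  for (const key of Object.keys(source)) {
    if (source[key] instanceof Object) Object.assign(source[key], merge(target[key], source[key]))
  }
  Object.assign(target || {}, source)
  return target
}

const target = { a: 1, nested: { x: 1 } };
merge(target, { nested: { y: 2 }, b: 3 });
// target is now { a: 1, nested: { x: 1, y: 2 }, b: 3 } - mutated in place
```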
A small modification of @foreverbule2003's version, with two differences:
- A "mergeArrays" option which, if set to true, will have any source array's elements overwrite those of the target array at the same index.
- It no longer mutates the original target --- all objects passed as parameters are unchanged; the merged object is contained in the return value only
export const mergeDeep = (target, source, isMergingArrays = false) => {
target = ((obj) => {
let cloneObj;
try {
cloneObj = JSON.parse(JSON.stringify(obj));
} catch(err) {
// If the stringify fails due to circular reference, the merge defaults
// to a less-safe assignment that may still mutate elements in the target.
// You can change this part to throw an error for a truly safe deep merge.
cloneObj = Object.assign({}, obj);
}
return cloneObj;
})(target);
const isObject = (obj) => obj && typeof obj === "object";
if (!isObject(target) || !isObject(source))
return source;
Object.keys(source).forEach(key => {
const targetValue = target[key];
const sourceValue = source[key];
if (Array.isArray(targetValue) && Array.isArray(sourceValue))
if (isMergingArrays) {
target[key] = targetValue.map((x, i) => sourceValue.length <= i
? x
: mergeDeep(x, sourceValue[i], isMergingArrays));
if (sourceValue.length > targetValue.length)
target[key] = target[key].concat(sourceValue.slice(targetValue.length));
} else {
target[key] = targetValue.concat(sourceValue);
}
else if (isObject(targetValue) && isObject(sourceValue))
target[key] = mergeDeep(Object.assign({}, targetValue), sourceValue, isMergingArrays);
else
target[key] = sourceValue;
});
return target;
};
Some test cases:
const testTarget = {
one: "one",
two: ["one", "two"],
three: {
one: "one",
two: ["one", "two"],
three: {
one: "one",
two: "two",
three: "three"
}
}
};
const testSource = {
one: 1,
two: [1, 1.5, 2],
three: {
one: 1,
two: [1],
three: {
one: 1,
two: 2,
three: 3
},
four: 4
},
four: 4,
five: [1, 2, 3, 4, 5]
};
[
mergeDeep(testTarget, testSource), // Case #1 - Standard Usage: Concatenates Arrays
mergeDeep(testTarget, testSource, true), // Case #2 - MergedArrays = true: Merges Arrays
testTarget // Confirming Unchanged testTarget
].forEach((testCase) => console.log(testCase));
/* RESULTS:
Case #1 - Standard Usage: Concatenates Arrays
{ one: 1,
two: [ 'one', 'two', 1, 1.5, 2 ],
three:
{ one: 1,
two: [ 'one', 'two', 1 ],
three: { one: 1, two: 2, three: 3 },
four: 4 },
four: 4,
five: [ 1, 2, 3, 4, 5 ] }
Case #2 - MergedArrays = true: Merges Arrays
{ one: 1,
two: [ 1, 1.5, 2 ],
three:
{ one: 1,
two: [ 1, 'two' ],
three: { one: 1, two: 2, three: 3 },
four: 4 },
four: 4,
five: [ 1, 2, 3, 4, 5 ] }
Confirming Unchanged testTarget
{ one: 'one',
two: [ 'one', 'two' ],
three:
{ one: 'one',
two: [ 'one', 'two' ],
three: { one: 'one', two: 'two', three: 'three' } } }
*/
I have modified the TypeScript code from the comment by adrian-marcelo-gallardo.
This solution iteratively merges any number of objects (it's typed with TypeScript; you might need to remove the typings before using it in regular JS projects).
I made changes such as adding a type guard to remove the TypeScript errors, and also separated out deepMergeInner
to be passed in as a function, because deepMergeInner
will likely change depending upon the application.
For example, right now I need to remove duplicates in the arrays because they are enums being or'd together.
I also considered the object/array and array/object cases; remember that an array is itself an object.
Some other application might require a visited table to check for circular references, etc.
type objectType = Record<string, any>
// notice the return type "obj is objectType" - it is saying, believe me, this is an objectType.
// Inside the guard you check it at run time, so it's not an empty promise.
export const isObject = (obj: unknown): obj is objectType => {
return !!obj && typeof obj === 'object'
};
export function deepMerge(
deepMergeInner:(target: objectType, source: objectType)=>objectType,
...objects: objectType[]):objectType {
if (objects.length===0)
return {};
if (objects.length===1)
return objects[0];
if (objects.some(object => !isObject(object))) {
throw new Error('deepMerge: all values should be of type "object"');
}
const target = objects.shift() as objectType;
let source: objectType;
while (source = objects.shift() as objectType) {
deepMergeInner(target, source);
}
return target;
}
Here is the deepMergeInnerDedupeArrays, with arrays deduped after concatenation. This part is likely to need modification per job, but deepMerge is not.
import * as dm from 'deepMerge'
export function deepMergeInnerDedupeArrays(target: objectType, source: objectType): objectType {
function uniquify(a: any[]): any[] {
return a.filter((v, i) => a.indexOf(v) === i);
}
Object.keys(source).forEach((key: string) => {
const targetValue = target[key];
const sourceValue = source[key];
if (Array.isArray(targetValue) && Array.isArray(sourceValue)) {
target[key] = uniquify(targetValue.concat(sourceValue));
} else if (isObject(targetValue) && Array.isArray(sourceValue)) {
target[key] = sourceValue;
} else if (Array.isArray(targetValue) && isObject(sourceValue)) {
target[key] = sourceValue;
} else if (isObject(targetValue) && isObject(sourceValue)) {
target[key] = deepMergeInnerDedupeArrays(Object.assign({}, targetValue), sourceValue);
} else {
target[key] = sourceValue;
}
});
return target;
}
const r = deepMerge(deepMergeInnerDedupeArrays, {a: [1, 2]}, {a: [2, 3]})
// r is {a: [1, 2, 3]}
function deepMerge(target, source){
const result = {...target,...source};
const keys = Object.keys(result);
for(const key of keys){
const tprop = target[key];
const sprop = source[key];
//if two objects are in conflict (note: typeof null is also 'object', so exclude null)
if(tprop !== null && sprop !== null && typeof(tprop) == 'object' && typeof(sprop) == 'object'){
result[key] = deepMerge(tprop, sprop);
}
}
return result;
}
Hey,
Here's my implementation, with source object overwrite:
function objectMerge(target, source) {
for (const key of Object.keys(source)) {
const currenttarget = target[key];
const currentsource = source[key];
if (currenttarget) {
const objectsource = typeof currentsource === 'object';
const objecttarget = typeof currenttarget === 'object';
if (objectsource && objecttarget) {
void (Array.isArray(currenttarget) && Array.isArray(currentsource)
? void (target[key] = currenttarget.concat(currentsource))
: void objectMerge(currenttarget, currentsource));
continue;
}
}
target[key] = currentsource;
}
return target;
}
Edit: my less verbose version, with unique-value array merge:
const deepMerge = (source, target) => {
return void Object.keys(target).forEach(key => {
source[key] instanceof Object && target[key] instanceof Object
? source[key] instanceof Array && target[key] instanceof Array
? void (source[key] = Array.from(new Set(source[key].concat(target[key]))))
: !(source[key] instanceof Array) && !(target[key] instanceof Array)
? void deepMerge(source[key], target[key])
: void (source[key] = target[key])
: void (source[key] = target[key]);
}) || source;
}
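As a quick check of the unique-value array behavior (the function above is reproduced so this runs standalone; note its swapped source, target parameter order - it merges the second argument into the first):

```javascript
const deepMerge = (source, target) => {
  return void Object.keys(target).forEach(key => {
    source[key] instanceof Object && target[key] instanceof Object
      ? source[key] instanceof Array && target[key] instanceof Array
        ? void (source[key] = Array.from(new Set(source[key].concat(target[key]))))
        : !(source[key] instanceof Array) && !(target[key] instanceof Array)
          ? void deepMerge(source[key], target[key])
          : void (source[key] = target[key])
      : void (source[key] = target[key]);
  }) || source;
}

const result = deepMerge({ a: [1, 2], b: { c: 1 } }, { a: [2, 3], b: { d: 2 } });
// result is { a: [1, 2, 3], b: { c: 1, d: 2 } } - arrays unioned, objects merged
```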
You'll probably want to update the last two lines to this:
return Object.assign(target || {}, source);
otherwise, if the target was null, the object you return will not have received the source props.
@craigphicks @arnotes @c0d3r111 @cxe
I still consider myself fairly new to Javascript, and wanted to ask if the implementations you have posted since my version are improvements or corrections to any errors/inefficiencies in how I coded things, and, if so, what the shortcomings in my code are? (I realize this request goes a bit beyond the scope of this thread, but I would very much appreciate your insights all the same, if you are so inclined! Please and thanks in advance :) )
In my case I was not improving, evaluating or criticizing your code, which looks fine to me.
My interpretation of the code by @arnotes, @c0d3r111, @cxe is that they are aiming for the shortest code expression of the single most likely use case.
I think the most "mind expanding" pointer I could give would be to check out what the lodash library has to offer for deep merging. The basic deep merge (https://lodash.com/docs/#merge) merges arrays index by index, rather than appending and deduping, as shown in the example.
How about a custom deep merge with array append and dedupe?
I think it should be possible using lodash mergeWith
and union
functions. If you can figure it out and post it here I will read it for sure, and I think it will be relevant and valuable information for anyone else who visits this page.
Happy coding!
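Pending that lodash version, here's a dependency-free sketch of the append-and-dedupe idea (mergeWithUnion is a hypothetical helper name; if you do have lodash, its docs suggest _.mergeWith(target, source, (a, b) => Array.isArray(a) ? _.union(a, b) : undefined) behaves similarly):

```javascript
// Hypothetical helper: deep merge where colliding arrays are appended then deduped
function mergeWithUnion(target, source) {
  for (const [key, val] of Object.entries(source)) {
    if (Array.isArray(target[key]) && Array.isArray(val)) {
      target[key] = [...new Set(target[key].concat(val))]; // append + dedupe
    } else if (target[key] !== null && val !== null &&
               typeof target[key] === 'object' && typeof val === 'object') {
      mergeWithUnion(target[key], val); // recurse into plain objects
    } else {
      target[key] = val; // primitives and mismatched shapes: source wins
    }
  }
  return target;
}

const merged = mergeWithUnion({ a: [1, 2], b: { c: [1] } }, { a: [2, 3], b: { c: [1, 4] } });
// merged is { a: [1, 2, 3], b: { c: [1, 4] } }
```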
Hi there, thanks for such an elegant and short solution!
I've just faced some trouble after trying to merge two objects (one of them is deep).
So I'd just like to share my solution, in case someone faces it too.
Consider the following piece of code:
const firstObject = { test: 1 };
const secondObject = {
firstLevel: {
secondLevel: {
thirdLevel: 'value'
}
}
};
const mergedObject = merge(firstObject, secondObject);
console.log(mergedObject);
If you try to reproduce it, you'll get an error: Uncaught TypeError: Cannot read property 'secondLevel' of undefined.
This is because the merge script (5th line) tries to get target[key], while there is no 'secondLevel' key in the target object.
Therefore, you can add an additional check for key existence by slightly modifying the if condition, and the problem will be fixed:
if (source[key] instanceof Object && target.hasOwnProperty(key))
Hope this helps someone.
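Putting it together, a self-contained sketch of the gist's merge with that modified condition, run against the failing example from above:

```javascript
const merge = (target, source) => {
  for (const key of Object.keys(source)) {
    // additional check: only recurse when the key already exists on target
    if (source[key] instanceof Object && target.hasOwnProperty(key))
      Object.assign(source[key], merge(target[key], source[key]))
  }
  Object.assign(target || {}, source)
  return target
}

const mergedObject = merge({ test: 1 }, { firstLevel: { secondLevel: { thirdLevel: 'value' } } });
// no TypeError this time; mergedObject is { test: 1, firstLevel: { secondLevel: { thirdLevel: 'value' } } }
```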
We can harden this a little by taking advantage of 2021 JS. First, weed out the "primitive" null, because there's only one null and it's a straight assignment, not a property copy:
function merge(source, target) {
for (const [key, val] of Object.entries(source)) {
if (val !== null && typeof val === `object`) {
target[key] ??= new val.__proto__.constructor();
merge(val, target[key]);
} else {
target[key] = val;
}
}
return target; // we're replacing in-situ, so this is more for chaining than anything else
}
With the improvements relying on iterating over key/value pairs using for (const [k, v] of Object.entries(...)), and constructing "whatever this object was, if it's missing in the target" based on the __proto__ constructor.
(And with swapped args, because you look for needles in haystacks, and you merge sources into targets.)
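One nice consequence of the __proto__-constructor trick is that arrays missing from the target come out as real arrays, not plain objects. A small check (the merge function above is repeated so the snippet runs on its own):

```javascript
function merge(source, target) {
  for (const [key, val] of Object.entries(source)) {
    if (val !== null && typeof val === `object`) {
      target[key] ??= new val.__proto__.constructor();
      merge(val, target[key]);
    } else {
      target[key] = val;
    }
  }
  return target;
}

const source = { list: [1, 2], nested: { a: 1 } };
const target = merge(source, {});
// Array.isArray(target.list) is true, and target.nested !== source.nested (fresh object, not a shared reference)
```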
@Pomax This ^^ is the best solution to date and solves some multi-level nested object issues with the original gist. Good work!
For those who want a deep merge function that only mutates the original target if given permission, I've updated (and simplified) my earlier solution to make use of @Pomax's great use of __proto__ constructors (something I know less than nothing about). I did decide to retain the original target, source ordering of the parameters, as it aligns with Object.assign() and I intuit merge() to be more similar to that than to needle/haystack search functions --- totally a matter of personal opinion, of course!
(I also decided to separate the clone() and merge() functions, as the former is quite useful on its own and doesn't need to clutter up the latter's function body.)
function clone(obj, isStrictlySafe = false) {
/* Clones an object. First attempt is safe. If it errors (e.g. from a circular reference),
'isStrictlySafe' determines if error is thrown or an unsafe clone is returned. */
try {
return JSON.parse(JSON.stringify(obj));
} catch(err) {
if (isStrictlySafe) { throw new Error(err) }
console.warn(`Unsafe clone of object`, obj);
return {...obj};
}
}
function merge(target, source, {isMutatingOk = false, isStrictlySafe = false} = {}) {
/* Returns a deep merge of source into target.
Does not mutate target unless isMutatingOk = true. */
target = isMutatingOk ? target : clone(target, isStrictlySafe);
for (const [key, val] of Object.entries(source)) {
if (val !== null && typeof val === `object`) {
if (target[key] === undefined) {
target[key] = new val.__proto__.constructor();
}
/* even where isMutatingOk = false, recursive calls only work on clones, so they can always
safely mutate --- saves unnecessary cloning */
target[key] = merge(target[key], val, {isMutatingOk: true, isStrictlySafe});
} else {
target[key] = val;
}
}
return target;
}
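A quick sanity check of the non-mutating default (both functions above are reproduced so this runs standalone):

```javascript
function clone(obj, isStrictlySafe = false) {
  try {
    return JSON.parse(JSON.stringify(obj));
  } catch (err) {
    if (isStrictlySafe) { throw new Error(err) }
    console.warn(`Unsafe clone of object`, obj);
    return { ...obj };
  }
}

function merge(target, source, { isMutatingOk = false, isStrictlySafe = false } = {}) {
  target = isMutatingOk ? target : clone(target, isStrictlySafe);
  for (const [key, val] of Object.entries(source)) {
    if (val !== null && typeof val === `object`) {
      if (target[key] === undefined) {
        target[key] = new val.__proto__.constructor();
      }
      // recursive calls only ever see clones, so they may mutate freely
      target[key] = merge(target[key], val, { isMutatingOk: true, isStrictlySafe });
    } else {
      target[key] = val;
    }
  }
  return target;
}

const original = { nested: { kept: true } };
const merged = merge(original, { nested: { added: true } });
// merged.nested has both keys, while original.nested is untouched
```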
@Pomax If you happen by here somewhere along the line and are in a sharing mood, I'd love to hear your insights on the above --- I'm still learning JavaScript and am rabid for any pearls of wisdom I can find! :) (Oh, and that obviously goes for anyone else who happens by and has pearls to cast before... uh, me!)
@Eunomiac if you're making the behaviour contingent on an explicit argument, there's no need for a console warn, but I would make that an options object for clone (for a single property) to align it with your merge. A bigger issue is that you're using the JSON mechanism for cloning, but JSON cannot represent arbitrary JS objects because it's intended for data transport only, so non-data like symbols and functions end up getting ignored by JSON.stringify. While you can use the JSON.parse(JSON.stringify(...)) trick as a one-liner to clone a pure data object, it is not suitable for deep cloning JS objects.
Finally, note that if you have merge, clone is basically a fallthrough function:
function merge(target = {}, source={}) {
// this does not need to rely on clone
// ...code goes here...
return target;
}
function clone(obj) {
// this doesn't need its own code: cloning is the same as merging into an empty object
return merge({}, obj);
}
@Pomax Thanks a ton for taking the time to explain, I really appreciate it!
I was aware of the limitations of the parse/stringify trick in terms of losing anything that wasn't pure data, I just kind of accepted that as a necessary limitation of cloning (I actually stole the trick out of Underscore.js's library -- it's how their _.clone() method works -- and so I assumed that was the "best way" to do it).
But your way is definitely superior, as I'd love to convert my un-mutating merge function into one that performs an actual full deep clone, including of non-data properties.
I do have a few hopefully-quick follow-up questions, if you'd be so kind:
-
Is there anything your method won't accurately clone? I'm thinking of things like getters/setters, class definitions, or more exotic function definitions like generators and whatnot?
-
The "merge into an empty object" trick is so elegant and obvious in hindsight, I can't believe it never occurred to me. Am I right in concluding that, to take your original function and make it return a merged object without mutating the target, all I need to do is merge the target into an empty object at the top of the function (...with {isMutatingOk: true}, to avoid an infinite loop)?
-
Would it be better to use new target.__proto__.constructor() instead of {} as the first parameter in the clone() function, to allow for merging array objects as well?
- If you want to deep clone classed objects, you need to set the correct prototype on the resulting cloned object.
- Not sure why you'd get an infinite loop at all? copy(source) falls through to merge({}, source), but in a regular merge you want to update the target; you don't want a new object at all. However, for the times that you really do, merge(copy(target), source) is always an option since we have that copy function =)
- Always tricky, as you have no guarantee that the constructor will even run without any arguments. Copying as a plain object first, and then forcing the original prototype on, is generally more likely to succeed, but you do miss out on whatever side effects the constructor might have. There is no universal solution here, unfortunately.
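That "plain copy, then force the prototype back on" approach can be sketched like this (Point is just an illustrative class, not from the thread):

```javascript
class Point {
  constructor(x, y) { this.x = x; this.y = y; }
  norm() { return Math.hypot(this.x, this.y); }
}

const p = new Point(3, 4);

// copy own enumerable properties as a plain object, then force the original prototype on
const copy = Object.setPrototypeOf({ ...p }, Object.getPrototypeOf(p));

// copy instanceof Point → true
// copy.norm() → 5 (methods are back) - but note Point's constructor never ran for the copy
```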
let target = {...existing,...newdata};
This code will be more than enough to merge the data by using javascript.
let target = {...existing,...newdata}; This code will be more than enough to merge the data by using javascript.
As explained by @ahtcx, this gist is old. But its purpose is to merge objects deeply.
The spread {...existing, ...newdata} performs a shallow, non-deep merge: it's not the same purpose.
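A short illustration of the difference:

```javascript
const existing = { cfg: { a: 1, b: 2 }, keep: true };
const newdata = { cfg: { b: 3 } };

// shallow: the whole nested cfg object is replaced wholesale, so 'a' is lost
const shallow = { ...existing, ...newdata };
// shallow.cfg is { b: 3 } - shallow.cfg.a is undefined

// a deep merge would instead produce cfg: { a: 1, b: 3 }
```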
@rmp0101
let target = {...existing,...newdata}; This code will be more than enough to merge the data by using javascript.
What part of the word deep don't you understand?
Very useful! If someone wants to merge more than two objects, you can combine this function with Array.reduce() and an array of objects.
[{}, {}, {}].reduce((ci, ni) => merge(ci, ni), {})
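For example, with the gist's merge (reproduced here so the snippet runs on its own):

```javascript
const merge = (target, source) => {
  for (const key of Object.keys(source)) {
    if (source[key] instanceof Object) Object.assign(source[key], merge(target[key], source[key]))
  }
  Object.assign(target || {}, source)
  return target
}

// fold an array of objects into one deeply merged result
const merged = [{ a: 1 }, { b: { x: 1 } }, { b: { y: 2 } }].reduce((ci, ni) => merge(ci, ni), {});
// merged is { a: 1, b: { x: 1, y: 2 } }
```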
Non-mutating deep merge and copy, making use of the newish structuredClone function for the copying:
function deepMerge(target, source) {
const result = { ...target, ...source };
for (const key of Object.keys(result)) {
result[key] =
// guard against null: typeof null is also 'object'
target[key] !== null && source[key] !== null &&
typeof target[key] == 'object' && typeof source[key] == 'object'
? deepMerge(target[key], source[key])
: structuredClone(result[key]);
}
return result;
}
(some more care would be needed if you need to handle Arrays)
Note that structuredClone still requires you to do prototype assignment for classed/prototyped objects, though. That doesn't come free.
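A quick check that nested values come back merged and decoupled from their sources (sketch: structuredClone needs Node 17+ or a recent browser, and the function is reproduced with a small null guard added, since typeof null is also 'object'):

```javascript
function deepMerge(target, source) {
  const result = { ...target, ...source };
  for (const key of Object.keys(result)) {
    result[key] =
      // null guard added: typeof null is also 'object'
      target[key] !== null && source[key] !== null &&
      typeof target[key] == 'object' && typeof source[key] == 'object'
        ? deepMerge(target[key], source[key])
        : structuredClone(result[key]);
  }
  return result;
}

const source = { a: { y: 2 } };
const result = deepMerge({ a: { x: 1 } }, source);
// result.a is { x: 1, y: 2 }, and result.a !== source.a - the copy is decoupled
```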
If you want to merge to a specific depth, a possible solution