1677

I have a very simple JavaScript array that may or may not contain duplicates.

var names = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"];

I need to remove the duplicates and put the unique values in a new array.

I could show all the code I've tried, but I think it's pointless, because none of it works. I accept jQuery solutions too.


John Slegers
kramden88
  • 72
    `_.uniq(peoplenames)` solves this http://lodash.com/docs#uniq – Connor Leech Jul 29 '14 at 19:29
  • 8
    @ConnorLeech its easy with lodash but not optimized way – Suhail Mumtaz Awan May 16 '17 at 10:35
  • Best solution: simply convert the array to an object, with the array elements as keys and each key's value set to, say, true. Then just convert back to an array with Object.keys(obj_name) – nirbhaygp Dec 30 '17 at 13:46
  • 24
    The simplest approach (in my opinion) is to use the Set object which lets you store unique values of any type. In other words, Set will automatically remove duplicates for us. `const names = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"]; let unique = [...new Set(names)]; console.log(unique); // 'Mike', 'Matt', 'Nancy', 'Adam', 'Jenny', 'Carl'` – Asif vora Dec 10 '18 at 10:43
  • 3
    There are too many Mikes in the world - why not remove them? Nancy got owned on this. – toad Feb 09 '19 at 02:47
  • This is just another way to remove duplicate var uniquearr= []; names.forEach(function(x){ if(!uniquearr.find(y => y === x)) { uniquearr.push(x); } }); console.log(uniquearr.toString()); – vitts Jun 21 '19 at 10:37
  • There're couple of ways to achieve it. A beautiful ref: https://medium.com/dailyjs/how-to-remove-array-duplicates-in-es6-5daa8789641c – Sunil Kumar Singh Jan 12 '20 at 02:14
  • 1
    in my solution, I sort data before filtering: `const result = data.sort().filter((v, idx, t) => idx==0 || v != t[idx-1]);` – Didier68 Jul 02 '20 at 09:06
  • The simple code `$.unique(arr)` worked for me – angela.mirjalili Sep 08 '20 at 08:11
  • https://www.codegrepper.com/search.php?q=array%20unique%20js – mars-o Nov 24 '20 at 15:35
  • using ramda R.uniq(names) solves this. https://ramdajs.com/docs/#uniq – varad11 Feb 22 '21 at 08:09

54 Answers

4447

TL;DR

Using the Set constructor and the spread syntax:

uniq = [...new Set(array)];

"Smart" but naïve way

uniqueArray = a.filter(function(item, pos) {
    return a.indexOf(item) == pos;
})

Basically, we iterate over the array and, for each element, check if the first position of this element in the array is equal to the current position. Obviously, these two positions are different for duplicate elements.

Using the third ("this array") parameter of the filter callback, we can avoid closing over the array variable:

uniqueArray = a.filter(function(item, pos, self) {
    return self.indexOf(item) == pos;
})

Although concise, this algorithm is not particularly efficient for large arrays (quadratic time).

Hashtables to the rescue

function uniq(a) {
    var seen = {};
    return a.filter(function(item) {
        return seen.hasOwnProperty(item) ? false : (seen[item] = true);
    });
}

This is how it's usually done. The idea is to place each element in a hashtable and then check for its presence instantly. This gives us linear time, but has at least two drawbacks:

  • since hash keys can only be strings or symbols in JavaScript, this code doesn't distinguish between numbers and "numeric strings". That is, uniq([1,"1"]) will return just [1]
  • for the same reason, all objects will be considered equal: uniq([{foo:1},{foo:2}]) will return just [{foo:1}].
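Both pitfalls are easy to demonstrate with the hashtable-based uniq (repeated here so the snippet runs on its own):

```javascript
// The hashtable-based uniq from above, repeated for a self-contained demo
function uniq(a) {
    var seen = {};
    return a.filter(function(item) {
        return seen.hasOwnProperty(item) ? false : (seen[item] = true);
    });
}

// A number and a "numeric string" both become the property key "1":
console.log(uniq([1, "1"]));             // [1]

// Every object becomes the key "[object Object]", so only the first survives:
console.log(uniq([{foo: 1}, {foo: 2}])); // [{foo: 1}]
```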

That said, if your arrays contain only primitives and you don't care about types (e.g. it's always numbers), this solution is optimal.

The best of both worlds

A universal solution combines both approaches: it uses hash lookups for primitives and linear search for objects.

function uniq(a) {
    var prims = {"boolean":{}, "number":{}, "string":{}}, objs = [];

    return a.filter(function(item) {
        var type = typeof item;
        if(type in prims)
            return prims[type].hasOwnProperty(item) ? false : (prims[type][item] = true);
        else
            return objs.indexOf(item) >= 0 ? false : objs.push(item);
    });
}

sort | uniq

Another option is to sort the array first, and then remove each element equal to the preceding one:

function uniq(a) {
    return a.sort().filter(function(item, pos, ary) {
        return !pos || item != ary[pos - 1];
    });
}

Again, this doesn't work with objects (because to the default sort, all objects compare equal). Additionally, we silently change the original array as a side effect - not good! However, if your input is already sorted, this is the way to go (just remove sort from the above).
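If mutating the input is a concern, a copy-first variant is a one-line change (a sketch; `uniqSorted` is just an illustrative name):

```javascript
function uniqSorted(a) {
    // a.slice() makes a shallow copy, so the caller's array is left untouched
    return a.slice().sort().filter(function(item, pos, ary) {
        return !pos || item != ary[pos - 1];
    });
}

var input = [3, 1, 2, 3, 1];
console.log(uniqSorted(input)); // [1, 2, 3]
console.log(input);             // [3, 1, 2, 3, 1] - the original is unchanged
```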

Unique by...

Sometimes we want to uniquify a list based on some criteria other than just equality, for example, to filter out objects that are different, but share some property. This can be done elegantly by passing a callback. This "key" callback is applied to each element, and elements with equal "keys" are removed. Since key is expected to return a primitive, a hash table will work fine here:

function uniqBy(a, key) {
    var seen = {};
    return a.filter(function(item) {
        var k = key(item);
        return seen.hasOwnProperty(k) ? false : (seen[k] = true);
    })
}

A particularly useful key() is JSON.stringify which will remove objects that are physically different, but "look" the same:

a = [[1,2,3], [4,5,6], [1,2,3]]
b = uniqBy(a, JSON.stringify)
console.log(b) // [[1,2,3], [4,5,6]]

If the key is not primitive, you have to resort to a linear search:

function uniqBy(a, key) {
    var index = [];
    return a.filter(function (item) {
        var k = key(item);
        return index.indexOf(k) >= 0 ? false : index.push(k);
    });
}

In ES6 you can use a Set:

function uniqBy(a, key) {
    let seen = new Set();
    return a.filter(item => {
        let k = key(item);
        return seen.has(k) ? false : seen.add(k);
    });
}

or a Map:

function uniqBy(a, key) {
    return [
        ...new Map(
            a.map(x => [key(x), x])
        ).values()
    ]
}

which both also work with non-primitive keys.

First or last?

When removing objects by a key, you might want to keep the first of "equal" objects or the last one.

Use the Set variant above to keep the first, and the Map to keep the last:

function uniqByKeepFirst(a, key) {
    let seen = new Set();
    return a.filter(item => {
        let k = key(item);
        return seen.has(k) ? false : seen.add(k);
    });
}


function uniqByKeepLast(a, key) {
    return [
        ...new Map(
            a.map(x => [key(x), x])
        ).values()
    ]
}

//

data = [
    {a:1, u:1},
    {a:2, u:2},
    {a:3, u:3},
    {a:4, u:1},
    {a:5, u:2},
    {a:6, u:3},
];

console.log(uniqByKeepFirst(data, it => it.u))
console.log(uniqByKeepLast(data, it => it.u))

Libraries

Both underscore and Lo-Dash provide uniq methods. Their algorithms are basically similar to the first snippet above and boil down to this:

var result = [];
a.forEach(function(item) {
     if(result.indexOf(item) < 0) {
         result.push(item);
     }
});

This is quadratic, but there are nice additional goodies, like wrapping native indexOf, ability to uniqify by a key (iteratee in their parlance), and optimizations for already sorted arrays.

If you're using jQuery and can't stand anything without a dollar before it, it goes like this:

$.uniqArray = function(a) {
    return $.grep(a, function(item, pos) {
        return $.inArray(item, a) === pos;
    });
};

which is, again, a variation of the first snippet.

Performance

Function calls are expensive in JavaScript, therefore the above solutions, as concise as they are, are not particularly efficient. For maximal performance, replace filter with a loop and get rid of other function calls:

function uniq_fast(a) {
    var seen = {};
    var out = [];
    var len = a.length;
    var j = 0;
    for(var i = 0; i < len; i++) {
         var item = a[i];
         if(seen[item] !== 1) {
               seen[item] = 1;
               out[j++] = item;
         }
    }
    return out;
}

This chunk of ugly code does the same as snippet #3 above, but runs an order of magnitude faster (as of 2017 it's only twice as fast - JS core folks are doing a great job!)

function uniq(a) {
    var seen = {};
    return a.filter(function(item) {
        return seen.hasOwnProperty(item) ? false : (seen[item] = true);
    });
}

function uniq_fast(a) {
    var seen = {};
    var out = [];
    var len = a.length;
    var j = 0;
    for(var i = 0; i < len; i++) {
         var item = a[i];
         if(seen[item] !== 1) {
               seen[item] = 1;
               out[j++] = item;
         }
    }
    return out;
}

/////

var r = [0,1,2,3,4,5,6,7,8,9],
    a = [],
    LEN = 1000,
    LOOPS = 1000;

while(LEN--)
    a = a.concat(r);

var d = new Date();
for(var i = 0; i < LOOPS; i++)
    uniq(a);
document.write('<br>uniq, ms/loop: ' + (new Date() - d)/LOOPS)

var d = new Date();
for(var i = 0; i < LOOPS; i++)
    uniq_fast(a);
document.write('<br>uniq_fast, ms/loop: ' + (new Date() - d)/LOOPS)

ES6

ES6 provides the Set object, which makes things a whole lot easier:

function uniq(a) {
   return Array.from(new Set(a));
}

or

let uniq = a => [...new Set(a)];

Note that, unlike in Python, ES6 sets are iterated in insertion order, so this code preserves the order of the original array.

However, if you need an array with unique elements, why not use sets right from the beginning?
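The order-preservation point is easy to check (a quick sketch):

```javascript
let a = [3, 1, 3, 2, 1];

// A Set iterates in insertion order, so the spread keeps first occurrences:
let u = [...new Set(a)];
console.log(u); // [3, 1, 2]
```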

Generators

A "lazy", generator-based version of uniq can be built on the same basis:

  • take the next value from the argument
  • if it's been seen already, skip it
  • otherwise, yield it and add it to the set of already seen values

function* uniqIter(a) {
    let seen = new Set();

    for (let x of a) {
        if (!seen.has(x)) {
            seen.add(x);
            yield x;
        }
    }
}

// example:

function* randomsBelow(limit) {
    while (1)
        yield Math.floor(Math.random() * limit);
}

// note that randomsBelow is endless

count = 20;
limit = 30;

for (let r of uniqIter(randomsBelow(limit))) {
    console.log(r);
    if (--count === 0)
        break
}

// exercise for the reader: what happens if we set `limit` less than `count` and why
meager
georg
  • 15
    filter and indexOf have been introduced in ECMAScript 5, so this will not work in old IE versions (<9). If you care about those browsers, you will have to use libraries with similar functions (jQuery, underscore.js etc.) – Roman Bataev Feb 10 '12 at 15:26
  • 14
    @RoderickObrist you might if you want your page to work in older browsers – Michael Robinson Dec 17 '12 at 02:25
  • Why not make use of the array reference provided by the filter method to make the answer more general? – qw3n Jan 07 '13 at 16:31
  • 1
    @qw3n: readability. The 3rd parameter of `filter()` is rarely used, one has to look up the docs to figure out what it does. That said, your suggestion still deserves to be included in the answer, see edits. – georg Jan 07 '13 at 20:07
  • @thg435 well the reason I wondered is I was looking for this answer which was what I wanted, but in copying it over I didn't like the fact that it was so specific. So, I checked the docs myself to see if there was another parameter. – qw3n Jan 07 '13 at 23:44
  • 13
    This is `O(n^2)` solution, which can run very slow in large arrays... – seriyPS Feb 03 '13 at 00:47
  • Your first two functions will return `[]` for the array `[ NaN, NaN ]`. This is because `NaN === NaN // false` so as far as indexOf is concerned it fails to find it in the array. Although unlikely this is possible if the array element is the result of calculations that might return NaN – Bruno Jun 06 '13 at 12:55
  • Does any body have an idea how we can apply this solution to a use case like `[[1,2,3],[4,5,6],[1,2,3]]` to give `[[1,2,3],[4,5,6]]`? – gmajivu Aug 01 '13 at 13:14
  • 9
    Try this array: `["toString", "valueOf", "failed"]`. `toString` and `valueOf` are stripped completely. Use `Object.create(null)` instead of `{}`. – Charles Beattie Jun 16 '14 at 14:02
  • 2
    I was facing the same issue bit instead I figured maybe it was best to avoid duplicates in the first place. Since there is nothing like a unique List in JS, I decided to just add properties to an object; If you add the same twice, it will just be overwritten and problem of duplicates is solved. No need to filter anymore and all I need is just 1 for loop. – Lawrence Nov 04 '14 at 09:04
  • @Lawrence: sure, but see the part 2 (Hashtables) above. You can only add strings this way, no objects. – georg Nov 04 '14 at 09:06
  • Not enough jQuery! [This answer](http://stackoverflow.com/a/9229932/1252647) should do it. :-) – Noel Llevares Nov 23 '14 at 23:23
  • @georg in the last example, (uniq_fast), wouldn't it be easier to use out.push(item) instead of out[j++] = item ?. Update: never mind, you opted to avoid any function call altogether.. +1 – elad.chen Dec 13 '14 at 13:57
  • @georg need your help plz, I written code here to avoid duplicate insertion of values into arry but its not working , can u plz help me http://jsfiddle.net/pewn1d4m/10/ – Dan Apr 08 '15 at 20:51
  • minus one for shortage of dollar signs. *grin* – quillbreaker May 14 '15 at 19:50
  • 1
    btw, "Smart" but naïve way support third argument with full array. uniqueArray = a.filter(function(item, pos, arr) { return arr.indexOf(item) == pos; }) – Denis Sep 15 '15 at 18:59
  • @vasa: thanks for your edit, it was rejected for some reason, but I restored it. ES6 code looks much better now! – georg Feb 04 '16 at 12:21
  • 1
    Convert to a Set, and then back again. Lovely. – superluminary Aug 04 '16 at 09:00
  • 7
    Anyone know how fast the Set conversion solution is, compared to the others? – Eric Nguyen Aug 12 '16 at 19:34
  • Since _the default sort order is according to string Unicode code points_ [...](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/sort) (`[10, 2, 1].sort()` results `[1, 10, 2]`), you'll need to use `.sort((x,y)=>x-y)` instead. – Washington Guedes Sep 29 '16 at 14:47
  • @Guedes: this doesn't matter for this case, because whatever sorting logic you use, equal elements are always next to each other. – georg Sep 29 '16 at 18:04
  • I tried the `uniqBy(a, key)` function, but it said 'key is not a function' and does not exist... Don't I get to fill in a property name of the objects which are in my array? – mesqueeb Feb 25 '17 at 06:43
  • trying the code snippet, i don't get an order of magnitude improvement with the ugly solution any more. it's about twice as fast. – worc Mar 14 '17 at 21:38
  • @worc: right, things improve greatly over time. – georg Mar 14 '17 at 22:59
  • Really nice post. Thanks for it, was really useful (and a really interesting reading). BTW, functions should be declared as `const`, unless you want to leave them open to be overwritten (which is not the case, and not a typical one). – elboletaire Jun 18 '17 at 22:05
  • 1
    @elboletaire: thanks. My personal preference is to only use `const` for actual constant values (as in `const pi = 3.14`). I guess this is a bad naming decision in ES6 to use the `const` keyword for "immutable bindings". Obviously, a function is not a "constant", in any reasonable sense. – georg Jun 19 '17 at 14:00
  • Yeah but makes sense in javascript, where you can easily override a functions code. I'm with you that it's, strange, but I think is something about javascript, not about ES6. – elboletaire Jun 19 '17 at 17:43
  • The one line solution <3 – Nicholas Jan 18 '18 at 10:18
  • @georg hi can you please check this question https://stackoverflow.com/questions/48355599/loading-large-array-in-oi-select-takes-too-much-of-time-in-angularjs thank you... – Sudarshan Kalebere Jan 24 '18 at 09:49
  • Will ES6 effort support older browsers? – Hidayt Rahman Feb 22 '18 at 10:48
  • Keep in mind that if you have an array of objects, each one of them are different references, so in ES6 SET will not filter those – Lug Mar 30 '18 at 04:27
  • Suggestion: Reorder the possible methods, "best" first. – Fred Gandt Apr 05 '18 at 21:07
  • how `(seen[item] = true)` this works? check hashtables code. i am not understanding. can u just elaborate – Prashant Tapase Jul 04 '18 at 12:41
  • 4
    Note: The `Set` version uses shallow comparison, so won't work if the elements in the array are objects. – A Jar of Clay Oct 03 '18 at 16:49
  • Thanks Sir. Explanation is really good. – Ashfaque Rifaye Oct 23 '18 at 06:47
  • if elements of the array are the type of object, we should change smart solution to that via unique key like "name": a.filter(function (item, index, self) { return self.findIndex(child => item.name === child.name) == index; }) – Tugrul Emre Atalay Jun 02 '19 at 11:55
  • Found exact answer I wanted here: https://medium.com/@pankaj.itdeveloper/remove-duplicates-from-javascript-array-f3e15ead09e9 – Soni Kumari Jul 27 '19 at 03:30
  • @SoniKumari: yep, looks like the author forgot to cite their sources. ;) – georg Jul 27 '19 at 07:40
  • Why this is on the second place? – Tahazzot May 21 '20 at 14:52
  • Bench of the different suggested methods : https://jsbench.me/3fkaqz9y16/1 Results may depend on browser (Chrome and Firefox don't give the same results). – Emmanuel Sellier May 29 '20 at 16:44
  • I'd offer that your 'The best from two worlds' scenario could probably be improved by using the hash for everything and simply encoding the type alongside a stringified object – Schalton Oct 27 '20 at 17:38
  • I think a downside to the first solution is that it doesn't keep order. Although not a requirement, it is usually better to keep the order, and we can do that by using reduce: `arr.reduce(([set, acc], item) => set.has(item) ? [set, acc] : [set.add(item), (acc.push(item), acc)], [new Set(), []])[1]` – user239558 Feb 20 '21 at 21:48
  • `[...new Set(a)]` didn't work for me, but `return Array.from(new Set(a))` did. Using an array object I get from `JSON.parse()` if that makes any difference, if anyone happens to wonder.. – Emanuel Lindström May 23 '21 at 23:14
492

Quick and dirty using jQuery:

var names = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"];
var uniqueNames = [];
$.each(names, function(i, el){
    if($.inArray(el, uniqueNames) === -1) uniqueNames.push(el);
});
Martijn Pieters
Roman Bataev
  • 312
    wouldn't mind a non-jquery answer for those who don't use it – Matej Voboril Apr 11 '17 at 18:08
  • 7
    As this was reverted back to the original `inArray` solution by a reputable person, I am going to again mention: this solution is O(n^2), making it inefficient. – Casey Kuball Sep 21 '17 at 16:35
  • 18
    I really wish in 2020 we could start depreciating jQuery and other even-more dated answers... Stackoverflow is starting to show some age here... – Nick Steele Jun 29 '20 at 23:23
  • 3
    I agree @NickSteele but I find it does happen naturally over time if you look at votes and not the accepted answer. The best answer will gravitate towards the top as older deprecated answers get downvoted – Chris Sep 30 '20 at 07:25
  • let uniqueNames = names.filter((item, pos ,self) => self.indexOf(item) == pos); – Sajith Sajan Nov 17 '20 at 20:04
  • If you're using jquery, there is $.unique, though that will also sort the items in the result. The best answer is below (creating a set from the array), this answer is inefficient, and out-of-date. – Casey Kuball Nov 28 '20 at 20:30
  • @Chris It'll gravitate towards the top, but not higher than the accepted answer... I've learnt to just look at the highest-voted, non-accepted answer :D – Tobias Feil Feb 10 '21 at 13:53
  • booooooooooooooo – Jared Beach May 11 '21 at 20:28
  • @NickSteele There is no reason jQuery would go obsolete, react and angular are basically great but they are not a replacement for jQuery. If you have to have a good seo, or you work on WordPress and a lot of other stuff you have to use jQuery anyways. – steve moretz May 20 '21 at 16:52
  • JQuery was and is still made to provide document traversal (now built into browsers), animation (now much faster and cleaner alternatives exist), event handling (now much better implementations exist that also work in Node), and Ajax (Ajax was replaced by WebSocket almost a decade ago). Since all 4 corners of JQuery are comparatively outdated, the only reason to use JQuery is if you already know it and don't have the bandwidth to learn something better. Everything that JQuery does today is done better by other libraries, or has been totally replaced. – Nick Steele May 20 '21 at 18:12
350

Got tired of seeing all the bad examples with for-loops or jQuery. JavaScript has the perfect tools for this nowadays: sort, map and reduce.

Uniq reduce while keeping existing order

var names = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"];

var uniq = names.reduce(function(a,b){
    if (a.indexOf(b) < 0 ) a.push(b);
    return a;
  },[]);

console.log(uniq, names) // [ 'Mike', 'Matt', 'Nancy', 'Adam', 'Jenny', 'Carl' ]

// one liner
return names.reduce(function(a,b){if(a.indexOf(b)<0)a.push(b);return a;},[]);

Faster uniq with sorting

There are probably faster ways but this one is pretty decent.

var uniq = names.slice() // slice makes a copy of the array before sorting it
  .sort(function(a,b){
    return a > b ? 1 : a < b ? -1 : 0; // the comparator must return a number, not a boolean
  })
  .reduce(function(a,b){
    if (a.slice(-1)[0] !== b) a.push(b); // slice(-1)[0] is the last item in the array, without removing it (unlike .pop())
    return a;
  },[]); // this empty array becomes the starting value for a

// one liner
return names.slice().sort(function(a,b){return a > b ? 1 : a < b ? -1 : 0}).reduce(function(a,b){if (a.slice(-1)[0] !== b) a.push(b);return a;},[]);

Update 2015: ES6 version:

In ES6 you have Sets and Spread which makes it very easy and performant to remove all duplicates:

var uniq = [ ...new Set(names) ]; // [ 'Mike', 'Matt', 'Nancy', 'Adam', 'Jenny', 'Carl' ]

Sort based on occurrence:

Someone asked about ordering the results by how many times each name occurs:

var names = ['Mike', 'Matt', 'Nancy', 'Adam', 'Jenny', 'Nancy', 'Carl']

var uniq = names
  .map((name) => {
    return {count: 1, name: name}
  })
  .reduce((a, b) => {
    a[b.name] = (a[b.name] || 0) + b.count
    return a
  }, {})

var sorted = Object.keys(uniq).sort((a, b) => uniq[b] - uniq[a]) // sort by occurrence count, descending - the comparator must return a number, not a boolean

console.log(sorted)
Christian Landgren
  • 5
    This is perfect because unlike filter it actually allows to do some deep manipulation of objects – Necronet Jul 18 '14 at 23:00
  • 4
    This answer deserves more upvotes. Just beautiful, and only Javascript solution as requested by OP! Thank you!! – Mbuso Jul 20 '14 at 12:14
  • 4
    Perfect answer, clean and functional. – amaurs Jul 16 '15 at 01:30
  • Nice! Would it be possible to sort the array based on the frequency of duplicate objects? So that `"Nancy"` in the above example is moved to the front (or back) of the modified array? – ALx Nov 25 '15 at 13:38
  • @ALx - I updated with an example for sorting based on occurrence. – Christian Landgren Dec 02 '15 at 22:21
  • sort() appears to be called incorrectly in your second example: if a is < b then it returns the same value as if a == b, which can lead to unsorted results. Unless you're doing something clever here that I'm missing, should be `.sort(function(a,b){ return a > b ? 1 : a < b ? -1 : 0; })` – Shog9 Feb 05 '16 at 01:51
  • If the data is just an array of names with no requirement other than to eliminate duplicates, why bother with sort, map, and reduce? Just use a set - job done in O(n) time. -- https://msdn.microsoft.com/en-us/library/dn251547 – Dave May 19 '16 at 16:47
  • @Dave yes- see my example on `[...new Set(names)]` above – Christian Landgren May 24 '16 at 14:59
  • 10
    The ES6 version is beautiful. – superluminary Aug 04 '16 at 09:01
  • As per http://kangax.github.io/compat-table/es6/#test-Set , your ECMA6 version would not work in Android 5 and below. – Tushar Shukla Oct 13 '16 at 14:17
  • @TusharShukla - a lot of people are using Babel to transpile their ES6-code to ES5/3 – Christian Landgren Oct 16 '16 at 21:00
  • All examples I've seen so far are all related to ONE element in array. What if you need uniqueness over more than one element? – Jonny Feb 11 '17 at 20:14
  • To make it clearer I would personally use `const uniqueList = [ ...(new Set(names)) ];` (with the extra parentheses). – Alexander Mills Nov 27 '17 at 21:11
  • ES6 Set is case sensitive hence it won't give you a unique string list if it contains multiple strings with the same char but different case. For example, if array of string is `const duplicateArray = [ 'phone', 'PHONE', 'Phone', 'phone' ];` Then, `const uniqArray = [ ...new Set(duplicateArray) ];` would give you result as `[ 'phone', 'PHONE', 'Phone' ]` – Vinayaka S P Jul 26 '20 at 07:21
137

Vanilla JS: Remove duplicates using an Object like a Set

You can always try putting it into an object, and then iterating through its keys:

function remove_duplicates(arr) {
    var obj = {};
    var ret_arr = [];
    for (var i = 0; i < arr.length; i++) {
        obj[arr[i]] = true;
    }
    for (var key in obj) {
        ret_arr.push(key);
    }
    return ret_arr;
}

Vanilla JS: Remove duplicates by tracking already seen values (order-safe)

Or, for an order-safe version, use an object to store all previously seen values, and check each value against it before adding it to the result array.

function remove_duplicates_safe(arr) {
    var seen = {};
    var ret_arr = [];
    for (var i = 0; i < arr.length; i++) {
        if (!(arr[i] in seen)) {
            ret_arr.push(arr[i]);
            seen[arr[i]] = true;
        }
    }
    return ret_arr;

}

ECMAScript 6: Use the new Set data structure (order-safe)

ECMAScript 6 adds the new Set data structure, which lets you store unique values of any type. Set.prototype.values() returns elements in insertion order.

function remove_duplicates_es6(arr) {
    let s = new Set(arr);
    let it = s.values();
    return Array.from(it);
}

Example usage:

a = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"];

b = remove_duplicates(a);
// b:
// ["Adam", "Carl", "Jenny", "Matt", "Mike", "Nancy"]

c = remove_duplicates_safe(a);
// c:
// ["Mike", "Matt", "Nancy", "Adam", "Jenny", "Carl"]

d = remove_duplicates_es6(a);
// d:
// ["Mike", "Matt", "Nancy", "Adam", "Jenny", "Carl"]
Casey Kuball
  • 7
    In more recent browsers, you could even do `var c = Object.keys(b)`. It should be noted that this approach will only work for strings, but it's alright, that's what the original question was asking for. – amenthes Aug 26 '14 at 08:33
  • 1
    It should also be noted that you may lose the order of the array because objects don't keep their properties in order. – Juan Mendes May 27 '15 at 19:38
  • 1
    @JuanMendes I have created an order-safe version, which simply copies to the new array if the value has not been seen before. – Casey Kuball Jun 23 '15 at 21:36
  • What is happening on this line `obj[arr[i]] = true;` ?? – kittu Oct 11 '18 at 16:46
  • 1
    @kittu, that is getting the `i`th element of the array, and putting it into the object (being used as a set). The key is the element, and the value is `true`, which is entirely arbitrary, as we only care about the keys of the object. – Casey Kuball Oct 12 '18 at 02:54
  • @Darthfett So the element will become key here and value could be anything/arbitrary? – kittu Oct 12 '18 at 04:55
  • @kittu precisely. If the element already exists in the object, it doesn't matter, as there can only be one entry for each unique value, thus eliminating the duplicate. – Casey Kuball Oct 12 '18 at 05:12
82

A single line version using array filter and indexOf functions:

arr = arr.filter (function (value, index, array) { 
    return array.indexOf (value) == index;
});
meager
HBP
73

Use Underscore.js

It's a library with a host of functions for manipulating arrays.

It's the tie to go along with jQuery's tux, and Backbone.js's suspenders.

_.uniq

_.uniq(array, [isSorted], [iterator]) Alias: unique
Produces a duplicate-free version of the array, using === to test object equality. If you know in advance that the array is sorted, passing true for isSorted will run a much faster algorithm. If you want to compute unique items based on a transformation, pass an iterator function.

Example

var names = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"];

alert(_.uniq(names, false));

Note: Lo-Dash (an underscore competitor) also offers a comparable .uniq implementation.

Brandon Boone
  • 2
    unfortunately underscore does not provide the ability to define a custom equality function. The callback they do allow is for an 'iteratee' function e.g. with args (item, value, array). – Rene Wooller Nov 17 '14 at 07:02
61

You can simply do it in JavaScript, with the help of the second - index - parameter of the filter method:

var a = [2,3,4,5,5,4];
a.filter(function(value, index){ return a.indexOf(value) == index });

or in short hand

a.filter((v,i) => a.indexOf(v) == i)
Ashutosh Jha
58

One line:

let names = ['Mike','Matt','Nancy','Adam','Jenny','Nancy','Carl', 'Nancy'];
let dup = [...new Set(names)];
console.log(dup);
Jonca33
38

use Array.filter() like this

var actualArr = ['Apple', 'Apple', 'Banana', 'Mango', 'Strawberry', 'Banana'];

console.log('Actual Array: ' + actualArr);

var filteredArr = actualArr.filter(function(item, index) {
  return actualArr.indexOf(item) == index; // return a boolean - returning the item itself would drop falsy values such as 0 or ""
});

console.log('Filtered Array: ' + filteredArr);

this can be made shorter in ES6 to

actualArr.filter((item,index,self) => self.indexOf(item)==index);

Here is nice explanation of Array.filter()

Sumit Joshi
37

The most concise way to remove duplicates from an array using native JavaScript functions is to use a sequence like below:

vals.sort().reduce(function(a, b){ if (b != a[0]) a.unshift(b); return a }, [])

There's no need for slice or indexOf within the reduce function, as I've seen in other examples! It makes sense to use indexOf along with a filter function, though:

vals.filter(function(v, i, a){ return i == a.indexOf(v) })

Yet another ES6 (2015) way of doing this that already works in a few browsers is:

Array.from(new Set(vals))

or even using the spread operator:

[...new Set(vals)]

cheers!

Ivo
  • Set is great and very intuitive for those used to python. Too bad they do not have those great (union, intersect, difference) methods. – caiohamamura Oct 28 '15 at 01:38
  • I went with the simplistic one line of code that utilizes the `set` mechanic. This was for a custom automation task so I was not leery of using it in the latest version of Chrome (within jsfiddle). However, I would still like to know the shortest **all browser compliant** way to de-dupe an array. – Alexander Dixon Jun 10 '16 at 14:07
  • sets are part of the new specification, you should use the sort/reduce combo to assure cross-browser compatibility @AlexanderDixon – Ivo Jun 10 '16 at 14:17
  • `.reduce()` is not cross-browser compatible as I would have to apply a poly-fill. I appreciate your response though. https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/Reduce – Alexander Dixon Jun 10 '16 at 14:56
33

The top answers have complexity of O(n²), but this can be done with just O(n) by using an object as a hash:

function getDistinctArray(arr) {
    var dups = {};
    return arr.filter(function(el) {
        var hash = el.valueOf();
        var isDup = dups[hash];
        dups[hash] = true;
        return !isDup;
    });
}

This will work for strings, numbers, and dates. If your array contains objects, the above solution won't work because when coerced to a string, they will all have a value of "[object Object]" (or something similar) and that isn't suitable as a lookup value. You can get an O(n) implementation for objects by setting a flag on the object itself:

function getDistinctObjArray(arr) {
    var distinctArr = arr.filter(function(el) {
        var isDup = el.inArray;
        el.inArray = true;
        return !isDup;
    });
    distinctArr.forEach(function(el) {
        delete el.inArray;
    });
    return distinctArr;
}

2019 edit: Modern versions of JavaScript make this a much easier problem to solve. Using Set will work, regardless of whether your array contains objects, strings, numbers, or any other type.

function getDistinctArray(arr) {
    return [...new Set(arr)];
}

The implementation is so simple, defining a function is no longer warranted.

gilly3
  • 78,870
  • 23
  • 132
  • 169
22

Solution 1

Array.prototype.unique = function() {
    var a = [];
    for (var i = 0; i < this.length; i++) {
        var current = this[i];
        if (a.indexOf(current) < 0) a.push(current);
    }
    return a;
}

Solution 2 (using Set)

Array.prototype.unique = function() {
    return Array.from(new Set(this));
}

Test

var x=[1,2,3,3,2,1];
x.unique() //[1,2,3]

Performance

When I tested both implementations (with and without Set) for performance in Chrome, I found that the one with Set is much faster!

Array.prototype.unique1 = function() {
    var a = [];
    for (var i = 0; i < this.length; i++) {
        var current = this[i];
        if (a.indexOf(current) < 0) a.push(current);
    }
    return a;
}


Array.prototype.unique2 = function() {
    return Array.from(new Set(this));
}

var x=[];
for(var i=0;i<10000;i++){
 x.push("x"+i);x.push("x"+(i+1));
}

console.time("unique1");
console.log(x.unique1());
console.timeEnd("unique1");



console.time("unique2");
console.log(x.unique2());
console.timeEnd("unique2");
ShAkKiR
  • 765
  • 6
  • 18
  • 3
    Upvote for the use of Set. I don't know the performance comparison though – ken May 24 '18 at 02:03
  • I have read somewhere that an Array is faster than a Set (overall performance), But when I tested in chrome, the implementation with Set was much much faster! see the edited answer :) – ShAkKiR May 25 '18 at 07:04
  • better practice is to use Object.defineProperty(Array.prototype,"unique".. instead of Array.prototype.unique = ... See more info here https://stackoverflow.com/questions/10105824/when-do-you-use-object-defineproperty?utm_medium=organic&utm_source=google_rich_qa&utm_campaign=google_rich_qa – ShAkKiR Jun 06 '18 at 10:11
  • the Set approach doesn't seem to work for me in Node. new Set([5,5]) seems to return [5,5] in some cases. I'm as baffled as you are. Edit: I found out what's happening. new Set([new Number(5), new Number(5)]) returns [5,5]. Apparently Node thinks the two number 5s are different if I instantiate them with new... which is honestly the stupidest thing I've ever seen. – Demonblack Jun 18 '18 at 13:43
  • @Demonblack This is a valid concern. x=new Number(5) and another y=new Number(5) will be two different Objects, as oppose to just var x=5 and var y=5. new keyword will create a new object. I know this explanation is obvious but that's all I know :) – ShAkKiR Aug 28 '18 at 13:47
  • @ShAkKiR After dabbling with JS for a while now I think I get it. 5 is a primitive for JS, and Number("5") is a constructor which gives you that primitive 5. JS being JS, you're allowed to call new even on classes that never meant you to do so, but it usually produces undesired behavior. Since Number wasn't meant to be used with new, it defaults to creating two different Number _objects_, as you said. And since they don't redefine the equality operator (because in js _you can't_) and they aren't special cases with autoboxing like in Java either, Set doesn't know they're supposed to be equal. – Demonblack Aug 28 '18 at 15:38
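The comments above can be summarized with a minimal sketch (made-up values) of how Set treats wrapper objects versus primitives: Set compares objects by reference, so two `new Number(5)` wrappers are distinct entries, while two primitive 5s collapse into one.

```javascript
// Primitives: Set treats equal values as a single entry
var primitives = new Set([5, 5]);
console.log(primitives.size); // 1

// Number objects: each `new Number(5)` is a distinct object reference
var wrappers = new Set([new Number(5), new Number(5)]);
console.log(wrappers.size); // 2

// Unwrapping with valueOf() restores primitive equality
var unwrapped = new Set([new Number(5).valueOf(), new Number(5).valueOf()]);
console.log(unwrapped.size); // 1
```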
21

The simplest one I've run into so far, in ES6:

 var names = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl", "Mike", "Nancy"]

 var noDupe = Array.from(new Set(names))

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Set

Deke
  • 3,837
  • 2
  • 35
  • 53
  • For Mac users, even though this is an ES6 function, it works in macOS 10.11.6 El Capitan, using the Script Editor. – JMichaelTX Apr 06 '17 at 20:08
20

Go for this one:

var uniqueArray = duplicateArray.filter(function(elem, pos) {
    return duplicateArray.indexOf(elem) == pos;
}); 

Now uniqueArray contains no duplicates.

Pang
  • 8,605
  • 144
  • 77
  • 113
Juhan
  • 1,187
  • 1
  • 10
  • 26
17

The following is more than 80% faster than the jQuery method listed (see tests below). It is an answer from a similar question a few years ago. If I come across the person who originally proposed it, I will post credit. Pure JS.

// wrapped in a function so the bare `return` is valid; note the keys come back as strings
function getUnique(array) {
    var temp = {};
    for (var i = 0; i < array.length; i++)
        temp[array[i]] = true;
    var r = [];
    for (var k in temp)
        r.push(k);
    return r;
}

My test case comparison: http://jsperf.com/remove-duplicate-array-tests

mgthomas99
  • 2,960
  • 1
  • 16
  • 20
Levi
  • 1,008
  • 8
  • 8
  • 1
    I add a more fast version in revision 4. Please, review! – seriyPS Feb 03 '13 at 00:46
  • 2
    the test didn't seem to be using arrays??? i've added (yet another) one that seems to be consistently fast over different browsers (see http://jsperf.com/remove-duplicate-array-tests/10) : for (var n = array.length, result = [array[n--]], i; n--;) { i = array[n]; if (!(i in result)) result.push(i); } return result; – imma Aug 09 '13 at 12:58
17

In ECMAScript 6 (aka ECMAScript 2015), Set can be used to filter out duplicates. Then it can be converted back to an array using the spread operator.

var names = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"],
    unique = [...new Set(names)];
Michael Oryl
  • 18,335
  • 14
  • 68
  • 107
Oriol
  • 225,583
  • 46
  • 371
  • 457
17

I did a detailed comparison of duplicate removal at some other question, but having noticed that this is the real place, I wanted to share it here as well.

I believe this is the best way to do it:

var myArray = [100, 200, 100, 200, 100, 100, 200, 200, 200, 200],
    reduced = Object.keys(myArray.reduce((p,c) => (p[c] = true,p),{}));
console.log(reduced);

OK... even though this one is O(n) and the others are O(n²), I was curious to see a benchmark comparison between this reduce / lookup table approach and the filter/indexOf combo (I chose Jeetendra's very nice implementation https://stackoverflow.com/a/37441144/4543207). I prepared a 100K-item array filled with random positive integers in the range 0-9999 and removed the duplicates. I repeated the test 10 times, and the average of the results shows that they are no match in performance.

  • In firefox v47 reduce & lut : 14.85ms vs filter & indexOf : 2836ms
  • In chrome v51 reduce & lut : 23.90ms vs filter & indexOf : 1066ms

Well, OK, so far so good. But let's do it properly this time, ES6 style. It looks so cool! But as of now, how it will perform against the powerful lookup-table solution is a mystery to me. Let's first see the code and then benchmark it.

var myArray = [100, 200, 100, 200, 100, 100, 200, 200, 200, 200],
    reduced = [...myArray.reduce((p,c) => p.set(c,true),new Map()).keys()];
console.log(reduced);

Wow, that was short! But how about the performance? It's beautiful... With the heavy weight of the filter/indexOf lifted from our shoulders, I can now test an array of 1M random items of positive integers in the range 0..99999 and get an average from 10 consecutive tests. I can say this time it's a real match. See the result for yourself :)

var ranar = [],
     red1 = a => Object.keys(a.reduce((p,c) => (p[c] = true,p),{})),
     red2 = a => reduced = [...a.reduce((p,c) => p.set(c,true),new Map()).keys()],
     avg1 = [],
     avg2 = [],
       ts = 0,
       te = 0,
     res1 = [],
     res2 = [],
     count= 10;
for (var i = 0; i<count; i++){
  ranar = (new Array(1000000).fill(true)).map(e => Math.floor(Math.random()*100000));
  ts = performance.now();
  res1 = red1(ranar);
  te = performance.now();
  avg1.push(te-ts);
  ts = performance.now();
  res2 = red2(ranar);
  te = performance.now();
  avg2.push(te-ts);
}

avg1 = avg1.reduce((p,c) => p+c)/count;
avg2 = avg2.reduce((p,c) => p+c)/count;

console.log("reduce & lut took: " + avg1 + "msec");
console.log("map & spread took: " + avg2 + "msec");

Which one would you use? Well, not so fast! Don't be deceived; Map is playing away from home. Notice that in all of the above cases we fill an array of size n with numbers in a range smaller than n. I mean, we have an array of size 100 and we fill it with random numbers 0..9, so there are definite duplicates and "almost" definitely each number has a duplicate. What if we fill an array of size 100 with random numbers 0..9999? Let's now see Map playing at home. This time an array of 100K items, but the random number range is 0..100M. We will do 100 consecutive tests to average the results. OK, let's see the bets! <- no typo

var ranar = [],
     red1 = a => Object.keys(a.reduce((p,c) => (p[c] = true,p),{})),
     red2 = a => reduced = [...a.reduce((p,c) => p.set(c,true),new Map()).keys()],
     avg1 = [],
     avg2 = [],
       ts = 0,
       te = 0,
     res1 = [],
     res2 = [],
     count= 100;
for (var i = 0; i<count; i++){
  ranar = (new Array(100000).fill(true)).map(e => Math.floor(Math.random()*100000000));
  ts = performance.now();
  res1 = red1(ranar);
  te = performance.now();
  avg1.push(te-ts);
  ts = performance.now();
  res2 = red2(ranar);
  te = performance.now();
  avg2.push(te-ts);
}

avg1 = avg1.reduce((p,c) => p+c)/count;
avg2 = avg2.reduce((p,c) => p+c)/count;

console.log("reduce & lut took: " + avg1 + "msec");
console.log("map & spread took: " + avg2 + "msec");

Now this is the spectacular comeback of Map()! Maybe now you can make a better decision when you want to remove the dupes.

Well, OK, we are all happy now. But the lead role always comes last, with some applause. I am sure some of you wonder what the Set object would do. Now that we are open to ES6 and we know Map is the winner of the previous games, let us compare Map with Set in a final. A typical Real Madrid vs Barcelona game this time... or is it? Let's see who will win El Clásico :)

var ranar = [],
     red1 = a => reduced = [...a.reduce((p,c) => p.set(c,true),new Map()).keys()],
     red2 = a => Array.from(new Set(a)),
     avg1 = [],
     avg2 = [],
       ts = 0,
       te = 0,
     res1 = [],
     res2 = [],
     count= 100;
for (var i = 0; i<count; i++){
  ranar = (new Array(100000).fill(true)).map(e => Math.floor(Math.random()*10000000));
  ts = performance.now();
  res1 = red1(ranar);
  te = performance.now();
  avg1.push(te-ts);
  ts = performance.now();
  res2 = red2(ranar);
  te = performance.now();
  avg2.push(te-ts);
}

avg1 = avg1.reduce((p,c) => p+c)/count;
avg2 = avg2.reduce((p,c) => p+c)/count;

console.log("map & spread took: " + avg1 + "msec");
console.log("set & A.from took: " + avg2 + "msec");

Wow, man! Unexpectedly, it didn't turn out to be an El Clásico at all. More like FC Barcelona against CA Osasuna :))

Community
  • 1
  • 1
Redu
  • 19,106
  • 4
  • 44
  • 59
  • Just btw I get `arr.reduce(...).keys(...).slice is not a function` in Typescript trying to use your ES6 method – mjwrazor Dec 01 '17 at 00:37
14

Here is a simple answer to the question.

var names = ["Alex","Tony","James","Suzane", "Marie", "Laurence", "Alex", "Suzane", "Marie", "Marie", "James", "Tony", "Alex"];
var uniqueNames = [];

for (var i in names) {
    if (uniqueNames.indexOf(names[i]) === -1) {
        uniqueNames.push(names[i]);
    }
}
drew7721
  • 1,503
  • 16
  • 19
  • +1 for `===` . It wont work for arrays with mixed types if we dont check for it types. Simple but effective answer – Santhosh Jan 19 '16 at 16:45
11

A simple but effective technique is to use the filter method in combination with the filter function function(value, index){ return this.indexOf(value) == index }.

Code example :

var data = [2,3,4,5,5,4];
var filter = function(value, index){ return this.indexOf(value) == index };
var filteredData = data.filter(filter, data );

document.body.innerHTML = '<pre>' + JSON.stringify(filteredData, null, '\t') +  '</pre>';

See also this Fiddle.

John Slegers
  • 38,420
  • 17
  • 182
  • 152
  • 1
    Genious! And, for instances, if you want to have the repeated ones, (instead of removing them) all you have to do is replace `this.indexOf(value) == index ` by `this.indexOf(value, index+1) > 0 ` Thanks! – Pedro Ferreira Apr 03 '17 at 13:17
  • You could even resume it to a single "filter" line: `filterData = data.filter((v, i) => (data.indexOf(v) == i) ); ` – Pedro Ferreira Apr 04 '17 at 22:03
  • Last time I bother! Sorry... picking up my 1st answer, in 2 lines you could get a JSON `var JSON_dupCounter = {};` with the repeated ones and how many times they were repeated: `data.filter((testItem, index) => (data.indexOf(testItem, index + 1) > 0)).forEach((found_duplicated) => (JSON_dupCounter[found_duplicated] = (JSON_dupCounter [found_duplicated] || 1) + 1));` – Pedro Ferreira Apr 05 '17 at 00:12
  • this only works for arrays of primitives? – frozen Jul 16 '17 at 03:37
  • 1
    @frozen : If works with everything where `==` can be used to determine equality. So, if you're dealing with eg. arrays, objects or functions, the filter will work only for different entries that are references to the same array, object or function ([**see demo**](https://jsfiddle.net/s6skxLtz/6/)). If you want to determine equality based on [**different criteria**](https://stackoverflow.com/questions/1068834/object-comparison-in-javascript), you'll need to include those criteria in your filter. – John Slegers Jul 16 '17 at 09:07
10

Here is code that is very simple to understand and works anywhere (even in PhotoshopScript). Check it out!

var peoplenames = new Array("Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl");

peoplenames = unique(peoplenames);
alert(peoplenames);

function unique(array){
    var len = array.length;
    for(var i = 0; i < len; i++) for(var j = i + 1; j < len; j++) 
        if(array[j] == array[i]){
            array.splice(j,1);
            j--;
            len--;
        }
    return array;
}

//*result* peoplenames == ["Mike","Matt","Nancy","Adam","Jenny","Carl"]
bodich
  • 904
  • 8
  • 22
9

So the options are:

let a = [11,22,11,22];
let b = []


b = [ ...new Set(a) ];     
// b = [11, 22]

b = Array.from( new Set(a))   
// b = [11, 22]

b = a.filter((val,i)=>{
  return a.indexOf(val)==i
})                        
// b = [11, 22]
ofir_aghai
  • 2,223
  • 1
  • 28
  • 33
9

Here is a simple method, without any special libraries or special functions:

name_list = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"];
get_uniq = name_list.filter(function(val,ind) { return name_list.indexOf(val) == ind; })

console.log("Original name list:"+name_list.length, name_list)
console.log("\n Unique name list:"+get_uniq.length, get_uniq)


Mohideen bin Mohammed
  • 14,471
  • 7
  • 86
  • 95
8

Apart from being simpler and more terse than the current answers (minus the future-looking ES6 ones), I perf-tested this and it was much faster as well:

var uniqueArray = dupeArray.filter(function(item, i, self){
  return self.lastIndexOf(item) == i;
});

One caveat: Array.lastIndexOf() was added in IE9, so if you need to go lower than that, you'll need to look elsewhere.
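If you do need to support those older browsers, one alternative (a rough sketch, not from the original answer; the function name is illustrative) is to track already-kept values in a plain object instead of calling lastIndexOf. Note it keeps the first occurrence of each value, whereas the lastIndexOf version keeps the last:

```javascript
// Sketch of a pre-IE9-friendly alternative: a "seen" lookup object
// instead of Array.prototype.lastIndexOf
function uniqueForOldBrowsers(arr) {
    var seen = {};
    var result = [];
    for (var i = 0; i < arr.length; i++) {
        // prefix with typeof so the number 1 and the string "1" stay distinct
        var key = typeof arr[i] + ":" + arr[i];
        if (!seen.hasOwnProperty(key)) {
            seen[key] = true;
            result.push(arr[i]);
        }
    }
    return result;
}

console.log(uniqueForOldBrowsers(["a", "b", "a", 1, "1"])); // ["a", "b", 1, "1"]
```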

csuwldcat
  • 7,103
  • 1
  • 35
  • 31
7

Generic Functional Approach

Here is a generic and strictly functional approach with ES2015:

// small, reusable auxiliary functions

const apply = f => a => f(a);

const flip = f => b => a => f(a) (b);

const uncurry = f => (a, b) => f(a) (b);

const push = x => xs => (xs.push(x), xs);

const foldl = f => acc => xs => xs.reduce(uncurry(f), acc);

const some = f => xs => xs.some(apply(f));


// the actual de-duplicate function

const uniqueBy = f => foldl(
   acc => x => some(f(x)) (acc)
    ? acc
    : push(x) (acc)
 ) ([]);


// comparators

const eq = y => x => x === y;

// string equality case insensitive :D
const seqCI = y => x => x.toLowerCase() === y.toLowerCase();


// mock data

const xs = [1,2,3,1,2,3,4];

const ys = ["a", "b", "c", "A", "B", "C", "D"];


console.log( uniqueBy(eq) (xs) );

console.log( uniqueBy(seqCI) (ys) );

We can easily derive unique from uniqueBy, or use the faster implementation utilizing Sets:

const unique = uniqueBy(eq);

// const unique = xs => Array.from(new Set(xs));

Benefits of this approach:

  • generic solution by using a separate comparator function
  • declarative and succinct implementation
  • reuse of other small, generic functions

Performance Considerations

uniqueBy isn't as fast as an imperative implementation with loops, but it is way more expressive due to its genericity.

If you identify uniqueBy as the cause of a concrete performance penalty in your app, replace it with optimized code. That is, write your code first in a functional, declarative way. Afterwards, if you encounter performance issues, try to optimize the code at the locations that cause the problem.

Memory Consumption and Garbage Collection

uniqueBy utilizes mutations (push(x) (acc)) hidden inside its body. It reuses the accumulator instead of throwing it away after each iteration. This reduces memory consumption and GC pressure. Since this side effect is wrapped inside the function, everything outside remains pure.

6
var newArray = [];
for (var i = 0; i < originalArray.length; i++) {
    if (!newArray.includes(originalArray[i])) {
        newArray.push(originalArray[i]);
    }
}
MBJH
  • 1,390
  • 2
  • 14
  • 32
5

If by any chance you were using

D3.js

You could do

d3.set(["foo", "bar", "foo", "baz"]).values() ==> ["foo", "bar", "baz"]

https://github.com/mbostock/d3/wiki/Arrays#set_values

Shankar ARUL
  • 9,495
  • 8
  • 59
  • 62
  • Beautiful, but loading the full fledged powerful visualization library to only filter duplicates seems overkill. Lucky I need the lib for some purpose and I will be using this. Thanks very much. – Vijay Kumar Kanta May 11 '19 at 10:03
4

A slight modification of thg435's excellent answer to use a custom comparator:

function contains(array, obj) {
    for (var i = 0; i < array.length; i++) {
        if (isEqual(array[i], obj)) return true;
    }
    return false;
}
//comparator
function isEqual(obj1, obj2) {
    if (obj1.name == obj2.name) return true;
    return false;
}
function removeDuplicates(ary) {
    var arr = [];
    return ary.filter(function(x) {
        return !contains(arr, x) && arr.push(x);
    });
}
Pang
  • 8,605
  • 144
  • 77
  • 113
vin_schumi
  • 41
  • 1
4
$(document).ready(function() {

    var arr1=["dog","dog","fish","cat","cat","fish","apple","orange"]

    var arr2=["cat","fish","mango","apple"]

    var uniquevalue=[];
    var seconduniquevalue=[];
    var finalarray=[];

    $.each(arr1,function(key,value){

       if($.inArray (value,uniquevalue) === -1)
       {
           uniquevalue.push(value)

       }

    });

     $.each(arr2,function(key,value){

       if($.inArray (value,seconduniquevalue) === -1)
       {
           seconduniquevalue.push(value)

       }

    });

    $.each(uniquevalue,function(ikey,ivalue){

        $.each(seconduniquevalue,function(ukey,uvalue){

            if( ivalue == uvalue)

            {
                finalarray.push(ivalue);
            }   

        });

    });
    alert(finalarray);
});
Gwenc37
  • 2,038
  • 7
  • 16
  • 22
4

The following script returns a new array containing only unique values. It works on strings and numbers. No additional libraries required, only vanilla JS.

Browser support:

Feature Chrome  Firefox (Gecko)     Internet Explorer   Opera   Safari
Basic support   (Yes)   1.5 (1.8)   9                   (Yes)   (Yes)

https://jsfiddle.net/fzmcgcxv/3/

var duplicates = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl","Mike","Mike","Nancy","Carl"]; 
var unique = duplicates.filter(function(elem, pos) {
    return duplicates.indexOf(elem) == pos;
  }); 
alert(unique);
GibboK
  • 64,078
  • 128
  • 380
  • 620
4

Although the ES6 solution is the best, I'm baffled as to why nobody has shown the following solution:

function removeDuplicates(arr){
    var o = {};
    arr.forEach(function(e){
        o[e] = true;
    });
    return Object.keys(o);
}

The thing to remember here is that objects MUST have unique keys. We are exploiting this to remove all the duplicates. I would have thought this would be the fastest solution (before ES6).

Bear in mind, though, that this converts every element to a string, and that integer-like keys come back in ascending numeric order, so the result is effectively sorted.
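A quick illustration of that caveat (a sketch; the input array is made up):

```javascript
// removeDuplicates from the answer above, repeated so this runs standalone
function removeDuplicates(arr) {
    var o = {};
    arr.forEach(function (e) { o[e] = true; });
    return Object.keys(o);
}

// Numbers come back as strings, and integer-like keys are returned
// in ascending numeric order regardless of the input order
console.log(removeDuplicates([3, 1, 3, 2])); // ["1", "2", "3"]
```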

Sancarn
  • 2,047
  • 11
  • 29
3

https://jsfiddle.net/2w0k5tz8/

function remove_duplicates(array_){
    var ret_array = new Array();
    for (var a = array_.length - 1; a >= 0; a--) {
        for (var b = array_.length - 1; b >= 0; b--) {
            if(array_[a] == array_[b] && a != b){
                delete array_[b];
            }
        };
        if(array_[a] != undefined)
            ret_array.push(array_[a]);
    };
    return ret_array;
}

console.log(remove_duplicates(Array(1,1,1,2,2,2,3,3,3)));

Loop through, remove duplicates, and push the survivors into a clone array, because the array indexes are not updated after a delete.

Looping backward gives better performance (your loop won't need to keep checking the length of your array).

THE AMAZING
  • 1,374
  • 2
  • 15
  • 37
3

This is just another solution, but different from the rest.

function diffArray(arr1, arr2) {
  var newArr = arr1.concat(arr2);
  newArr.sort();
  var finalArr = [];
  for(var i = 0;i<newArr.length;i++) {
   if(!(newArr[i] === newArr[i+1] || newArr[i] === newArr[i-1])) {
     finalArr.push(newArr[i]);
   } 
  }
  return finalArr;
}
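Worth noting: because only elements whose neighbors differ survive the sorted pass, any value that occurs more than once is dropped entirely rather than kept once. A quick sketch with made-up inputs:

```javascript
// diffArray from the answer above, repeated so this runs standalone
function diffArray(arr1, arr2) {
  var newArr = arr1.concat(arr2);
  newArr.sort();
  var finalArr = [];
  for (var i = 0; i < newArr.length; i++) {
    if (!(newArr[i] === newArr[i + 1] || newArr[i] === newArr[i - 1])) {
      finalArr.push(newArr[i]);
    }
  }
  return finalArr;
}

// 2 appears in both inputs, so it vanishes from the result entirely
console.log(diffArray([1, 2], [2, 3])); // [1, 3]
```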
Isaac Pak
  • 2,865
  • 1
  • 30
  • 41
2

This is probably one of the fastest ways to permanently remove duplicates from an array: 10× faster than most of the functions here, and 78× faster in Safari.

function toUnique(a, b, c) {    // array, placeholder, placeholder
    b = a.length;
    while (c = --b)
        while (c--)
            a[b] !== a[c] || a.splice(c, 1);
}
  1. Test: http://jsperf.com/wgu
  2. Demo: http://jsfiddle.net/46S7g/
  3. More: https://stackoverflow.com/a/25082874/2450730

If you can't read the code above, ask, read a JavaScript book, or see these explanations of shorter code: https://stackoverflow.com/a/21353032/2450730

Community
  • 1
  • 1
cocco
  • 15,256
  • 6
  • 53
  • 73
  • 7
    I'd guess its due to posting *minified* code as a solution... – Mark K Cowan Nov 06 '14 at 12:26
  • 2
    You would deserve a downvote for claiming to be "fastest" when using while within while and splice (which is a loop too). The fastest way uses a hash-map (JS-Object) and a single loop and is O(n). Try to write the function only using hasOwnProperty(), push() and forEach() and it will be fast. See: https://en.wikipedia.org/wiki/Big_O_notation – cat Jul 26 '15 at 10:17
  • From tests i made and read on many other sites ... direct setting is faster than push (x[0] vs x.push()), forEach is much slower then while & hasOwnProperty() is useless if you don't mess up your array. but i would be happy if you post a code which is faster . You can also downvote. anyway i wrote the function in many different ways , and even if i find it also strange that this one works, this one gave the best results in most browsers on normal arrays. – cocco Jul 26 '15 at 14:18
  • by normal arrays i mean in the range of 10-100000 elements. Note... again i would be really happy to find a better way as i'm using this function and i'm also someone who tries to avoi useless loops. check out my other functions. – cocco Jul 26 '15 at 14:24
  • At the other side by fastest i mean fastest code based on the one listed in this specific post.. copy and past those codes and try it yourself... none of the codes here is faster than this strange whilewhile function. – cocco Jul 26 '15 at 14:31
  • May I suggest add spaces and indentation to make it more readable? – McKay M Mar 15 '21 at 02:41
2

For anyone looking to flatten arrays with duplicate elements into one unique array:

function flattenUniq() {
  var args = Array.prototype.slice.call(arguments);

  var array = [].concat.apply([], args)

  var result = array.reduce(function(prev, curr){
    if (prev.indexOf(curr) < 0) prev.push(curr);
    return prev;
  },[]);

  return result;
}
cjjenkinson
  • 289
  • 4
  • 6
1

Another way of doing this without writing much code is using the ES5 Object.keys method:

var arrayWithDuplicates = ['a','b','c','d','a','c'],
    deduper = {};
arrayWithDuplicates.forEach(function (item) {
    deduper[item] = null;
});
var dedupedArray = Object.keys(deduper); // ["a", "b", "c", "d"]

Extracted in a function

function removeDuplicates (arr) {
    var deduper = {}
    arr.forEach(function (item) {
        deduper[item] = null;
    });
    return Object.keys(deduper);
}
Willem de Wit
  • 8,216
  • 8
  • 54
  • 88
1

A simple way to remove duplicates is to sort the array, loop over it, and push each element that differs from its neighbor into a new array:

 var array = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"];

 var removeDublicate = function(arr){
    var result = [];
    arr.sort(); // required: the comparison below relies on sorted order
    for (var i = 0; i < arr.length; i++) {
        if (arr[i + 1] !== arr[i]) {
            result.push(arr[i]);
        }
    }
    return result;
 }
console.log(removeDublicate(array))
==>  ["Adam", "Carl", "Jenny", "Matt", "Mike", "Nancy"]
user3449311
  • 167
  • 1
  • 5
1

A nested-loop method for removing duplicates from an array while preserving the original order of elements.

var array = [1, 3, 2, 1, [5], 2, [4]]; // INPUT

var element = 0;
var decrement = array.length - 1;
while(element < array.length) {
  while(element < decrement) {
    if (array[element] === array[decrement]) {
      array.splice(decrement, 1);
      decrement--;
    } else {
      decrement--;
    }
  }
  decrement = array.length - 1;
  element++;
}

console.log(array);// [1, 3, 2, [5], [4]]

Explanation: The inner loop compares the current element of the array with all other elements, starting with the element at the highest index. Decrementing towards the current element, each duplicate is spliced from the array.

When the inner loop is finished, the outer loop increments to the next element for comparison and resets to the new length of the array.

y2knoproblem
  • 421
  • 6
  • 8
1
function arrayDuplicateRemove(arr){
    var c = 0;
    var tempArray = [];
    arr.sort();
    for (var i = arr.length - 1; i >= 0; i--) {
        if (arr[i] != tempArray[c - 1]) {
            tempArray.push(arr[i]);
            c++;
        }
    }
    tempArray.sort();
    return tempArray;
}
M.A.K. Ripon
  • 1,561
  • 3
  • 24
  • 42
1
const numbers = [1, 1, 2, 3, 4, 4];

function unique(array) {
  return array.reduce((a,b) => {
    let isIn = a.find(element => {
        return element === b;
    });
    if(!isIn){
      a.push(b);
    }
    return a;
  },[]);
}

let ret = unique(numbers); // [1, 2, 3, 4]

This approach uses reduce and find.

gnujoow
  • 359
  • 1
  • 3
  • 15
0

Here is another approach using jQuery,

function uniqueArray(array){
  if ($.isArray(array)){
    var dupes = {}; var len, i;
    for (i=0,len=array.length;i<len;i++){
      var test = array[i].toString();
      if (dupes[test]) { array.splice(i,1); len--; i--; } else { dupes[test] = true; }
    }
  } 
  else {
    if (window.console) console.log('Not passing an array to uniqueArray, returning whatever you sent it - not filtered!');
      return(array);
  }
  return(array);
}

Author: William Skidmore

Mathankumar
  • 567
  • 1
  • 8
  • 18
0
function removeDuplicates(inputArray) {
    var outputArray = new Array();

    if (inputArray.length > 0) {
        jQuery.each(inputArray, function(index, value) {
            if (jQuery.inArray(value, outputArray) == -1) {
                outputArray.push(value);
            }
        });
    }
    return outputArray;
}
realmag777
  • 1,893
  • 1
  • 23
  • 19
0

If you don't want to include a whole library, you can use this one off to add a method that any array can use:

Array.prototype.uniq = function uniq() {
  return this.reduce(function(accum, cur) { 
    if (accum.indexOf(cur) === -1) accum.push(cur); 
    return accum; 
  }, [] );
}

["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"].uniq()
Jonah
  • 14,529
  • 18
  • 76
  • 146
0

If you're creating the array yourself, you can save yourself a loop and the extra unique filter by doing the check as you're inserting the data;

var values = [];
$.each(collection, function() {
    var x = $(this).value;
    if ($.inArray(x, values) === -1) {
        values.push(x);
    }
});
  • Be careful with using the jQuery inArray method: it returns the index of the element in array, **not a boolean value**. Check the documentation: [jQuery.inArray()](https://api.jquery.com/jQuery.inArray/) – xonya Apr 24 '15 at 08:52
0

The easiest way to remove string duplicates is to use an object as an associative map, and then iterate over it to build the list/array back.

Like below:

var toHash = {};
var toList = [];

// add from your data list to the hash
$(data.pointsToList).each(function(index, Element) {
    toHash[Element.nameTo]= Element.nameTo;
});

// now convert the hash to an array
// don't forget hasOwnProperty, or you may pick up inherited properties
for (var key in toHash)  {
    if (toHash.hasOwnProperty(key)) { 
      toList.push(toHash[key]);
   }
}

Voila, now duplicates are gone!

Pranav 웃
  • 8,094
  • 5
  • 36
  • 48
nondescript
  • 1,298
  • 1
  • 11
  • 16
0

I know I'm a little late, but here is another option using jinqJs.

See Fiddle

var result = jinqJs().from(["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"]).distinct().select();
NYTom
  • 494
  • 2
  • 14
0

Vanilla JS solutions with complexity of O(n) (fastest possible for this problem). Modify the hashFunction to distinguish the objects (e.g. 1 and "1") if needed. The first solution avoids hidden loops (common in functions provided by Array).

var dedupe = function(a) 
{
    var hash={},ret=[];
    var hashFunction = function(v) { return ""+v; };
    var collect = function(h)
    {
        if(hash.hasOwnProperty(hashFunction(h)) == false) // O(1)
        {
            hash[hashFunction(h)]=1;
            ret.push(h); // should be O(1) for Arrays
            return;
        }
    };

    for(var i=0; i<a.length; i++) // this is a loop: O(n)
        collect(a[i]);
    //OR: a.forEach(collect); // this is a loop: O(n)

    return ret;
}

var dedupe = function(a) 
{
    var hash={};
    var isdupe = function(h)
    {
        if(hash.hasOwnProperty(h) == false) // O(1)
        {
            hash[h]=1;
            return true;
        }

        return false;
    };

    return a.filter(isdupe); // this is a loop: O(n)
}
cat
  • 2,721
  • 1
  • 21
  • 28
0
var duplicates = function(arr){
    var sorted = arr.sort();
    var dup = [];
    for (var i = 0; i < sorted.length; i++) {
        var rest = sorted.slice(i + 1); // slice the rest of the array
        if (rest.indexOf(sorted[i]) > -1) { // do indexOf
            if (dup.indexOf(sorted[i]) == -1)
                dup.push(sorted[i]); // store it in another array
        }
    }
    console.log(dup);
}

duplicates(["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"]);
neelmeg
  • 2,316
  • 6
  • 32
  • 43
0
function removeDuplicates (array) {
  var sorted = array.slice().sort()
  var result = []

  sorted.forEach((item, index) => {
    if (sorted[index + 1] !== item) {
      result.push(item)
    }
  })
  return result
}
Matt MacPherson
  • 175
  • 1
  • 8
0
var lines = ["Mike", "Matt", "Nancy", "Adam", "Jenny", "Nancy", "Carl"];
var uniqueNames = [];

for(var i = 0; i < lines.length; i++)
{
    if(uniqueNames.indexOf(lines[i]) == -1)
        uniqueNames.push(lines[i]);
}
for(var i = 0; i < uniqueNames.length; i++)
{
    document.write(uniqueNames[i]);
    document.write("<br/>");
}
saniales
  • 391
  • 2
  • 13
Vishnu
  • 11
  • Your code works great. but the code 'uniqueNames.pop()' is removing the last array element for no reason. It makes the 'Carl' not listing from the array. – Santosh Mar 16 '17 at 17:51
0

Quick and easy using lodash (for a flat array of primitives, `_.uniq(array)` is enough; `_.uniqWith` with `_.isEqual` also handles arrays of objects):

var array = ["12346","12347","12348","12349","12349"];
console.log(_.uniqWith(array, _.isEqual)); // ["12346", "12347", "12348", "12349"]

Anand Somani
  • 711
  • 1
  • 5
  • 14
0

aLinks is a plain JavaScript array. For each element, `indexOf` finds the index of its first occurrence; if that index is smaller than the current one, the current element is a duplicate and is deleted. The pass is repeated until nothing was deleted, since a single pass can remove several records.

var srt_ = 0;
var pos_ = 0;
do {
    var srt_ = 0;
    for (var i in aLinks) {
        pos_ = aLinks.indexOf(aLinks[i].valueOf(), 0);
        if (pos_ < i) {
            delete aLinks[i];
            srt_++;
        }
    }
} while (srt_ != 0);
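A standalone version of the same loop, with `aLinks` initialised to the question's names (an assumption, since the original `aLinks` is not shown). Note that `delete` only empties a slot, so the array ends up sparse and keeps its length:

```javascript
var aLinks = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"];

var srt_ = 0;
do {
    srt_ = 0;
    for (var i in aLinks) {
        // index of the first occurrence of this value
        var pos_ = aLinks.indexOf(aLinks[i].valueOf(), 0);
        if (pos_ < i) {
            delete aLinks[i]; // leaves a hole: the array becomes sparse
            srt_++;
        }
    }
} while (srt_ != 0);

console.log(aLinks.length);          // still 7: delete does not shrink the array
console.log(aLinks.filter(Boolean)); // ["Mike", "Matt", "Nancy", "Adam", "Jenny", "Carl"]
```

To get a dense array back, compact it afterwards, e.g. with `aLinks.filter(Boolean)` (safe here because all names are truthy).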
0

This solution uses a new array and an object map inside the function. It loops through the original array and adds each integer as a key on the object map. If, while looping through the original array, it comes across a repeat, the

`if (!unique[int])`

check catches it, because a key with the same number already exists on the object. That number is then skipped and not pushed into the new array.

    function removeRepeats(ints) {
      var unique = {}
      var newInts = []

      for (var i = 0; i < ints.length; i++) {
        var int = ints[i]

        if (!unique[int]) {
          unique[int] = 1
          newInts.push(int)
        }
      }
      return newInts
    }

    var example = [100, 100, 100, 100, 500]
    console.log(removeRepeats(example)) // prints [100, 500]
Dan Zuzevich
  • 1,935
  • 1
  • 14
  • 28
-1
var uniqueCompanies = function(companyArray) {
    var arrayUniqueCompanies = [],
        found, x, y;

    for (x = 0; x < companyArray.length; x++) {
        found = undefined;
        for (y = 0; y < arrayUniqueCompanies.length; y++) {
            if (companyArray[x] === arrayUniqueCompanies[y]) {
                found = true;
                break;
            }
        }

        if ( ! found) {
            arrayUniqueCompanies.push(companyArray[x]);
        }
    }

    return arrayUniqueCompanies;
}

var arr = [
    "Adobe Systems Incorporated",
    "IBX",
    "IBX",
    "BlackRock, Inc.",
    "BlackRock, Inc.",
];
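A quick standalone check of this O(n²) nested scan against the sample array (function body condensed here):

```javascript
// Nested-loop dedupe: for each input item, scan the result so far
// and push the item only if it was not already collected.
function uniqueCompanies(companyArray) {
  var result = [];
  for (var x = 0; x < companyArray.length; x++) {
    var found = false;
    for (var y = 0; y < result.length; y++) {
      if (companyArray[x] === result[y]) { found = true; break; }
    }
    if (!found) result.push(companyArray[x]);
  }
  return result;
}

console.log(uniqueCompanies([
  "Adobe Systems Incorporated",
  "IBX",
  "IBX",
  "BlackRock, Inc.",
  "BlackRock, Inc.",
]));
// ["Adobe Systems Incorporated", "IBX", "BlackRock, Inc."]
```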
Zayn Ali
  • 4,055
  • 1
  • 23
  • 37
-8

ES2015, 1-liner, which chains well with map, but only works for integers:

[1, 4, 1].sort().filter((current, next) => current !== next)

[1, 4]

user1429980
  • 6,044
  • 2
  • 39
  • 47
  • That works with anything, but only removes sequential duplicates. e.g. `[1,1,2,2,3,3]` -> `[1,2,3]` but `[1,2,3,1,2,3]` -> `[1,2,3,1,2,3]` – Kroltan Oct 05 '17 at 16:18
  • 3
    @Kroltan It's actually not a matter of sequential duplicates, but it's a big issue about understanding what's passed to `filter`: it's `(value, index)` not `(current, next)`, so it would work for `[1,4,1]` but not for `[2,4,2]`... – Xenos Dec 20 '17 at 10:34
  • @Xenos You're right! I skimmed over it too fast xD – Kroltan Dec 20 '17 at 20:17
  • I think the approach is good and can easlily work for arrays of other types too, with a slight modification: `["1", "4", "1"].sort().filter((value, index, array) => value !== array[index + 1])` – scipper Nov 26 '20 at 06:38
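As the comments note, `filter` passes `(value, index)` to the callback, so a corrected version of this idea compares each value with its successor in a sorted copy (essentially scipper's variant):

```javascript
// Sort a copy, then keep each element that differs from its next neighbour.
const uniq = arr => arr.slice().sort().filter((v, i, a) => v !== a[i + 1]);

console.log(uniq([1, 2, 3, 1, 2, 3])); // [1, 2, 3]
console.log(uniq(["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"]));
// ["Adam", "Carl", "Jenny", "Matt", "Mike", "Nancy"]
```

This works for strings and numbers alike, but the result comes back in sorted order, and the default sort is lexicographic, so supply a comparator for multi-digit numbers.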