I have a problem where I need to filter duplicate values out of an array, e.g. [1,2,3,3,3,4] -> [1,2,3,4].
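I know the usual one-liner for this is just spreading a Set (a Set can't hold duplicate values), something like the sketch below, where simpleDedupe is just a placeholder name:

const simpleDedupe = (arr) => [...new Set(arr)]; // a single pass of deduplication

but that only removes the duplicates once, which is exactly what worries me.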
Currently I've written the following code, which works, but I don't think there is enough redundancy.
const deduper = (arrayToDedupe) =>
  Object.values(
    // Round-trip through a plain object keyed by array index, then pull the
    // values back out in index order.
    Object.assign({}, [
      // A second Set, in case the first Set missed a duplicate.
      ...new Set(
        // A Set cannot hold duplicate values, so this pass alone already dedupes.
        [...new Set(arrayToDedupe)]
          // Keep only the first occurrence of each element.
          .filter((element, index, array) => index === array.indexOf(element))
          // Rebuild the array, skipping anything already accumulated.
          .reduce((acc, elementv2) => {
            if (acc.includes(elementv2)) {
              return acc;
            } else {
              acc.push(elementv2);
              return acc;
            }
          }, [])
          // Replace every non-first occurrence with undefined...
          .map((elementv3, indexv3, arrayv3) => {
            if (indexv3 === arrayv3.indexOf(elementv3)) {
              return elementv3;
            } else {
              return undefined;
            }
          })
          // ...then drop the undefined markers (comparing against undefined so
          // falsy values such as 0 are not thrown away by mistake).
          .filter((x) => x !== undefined)
      ),
    ])
  )
    // Default sort compares as strings, but it still puts equal values next to
    // each other, so the final map/filter can drop adjacent duplicates.
    .sort()
    .map((element, index, array) => {
      if (array[index + 1] === element) return undefined;
      return element;
    })
    .filter((x) => x !== undefined);
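Calling it on the example input from above does give the expected result:

deduper([1, 2, 3, 3, 3, 4]); // [1, 2, 3, 4]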
Is there a way to really, really, really ensure, without a doubt, that the returned array will not have any duplicates? Do I need to add more chained array methods?
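For context, this is the kind of check I had in mind for "no duplicates" (hasDuplicates is just a helper name I made up, and it assumes that comparing the array's length against the size of a Set built from it counts as proof):

const hasDuplicates = (arr) => new Set(arr).size !== arr.length;

console.log(hasDuplicates(deduper([1, 2, 3, 3, 3, 4]))); // false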