Let's say I have this client-side JSON input:
{
   "id": "5",
   "types": [
      {"id": "1", "types": []},
      {"id": "2", "types": []},
      {"id": "1", "types": []}
   ]
}
I have this class:
class Entity {
    private String id;
    private Set<Entity> types = new LinkedHashSet<>();
    public String getId() {
        return this.id;
    }
    public void setId(String id) {
        this.id = id;
    }
    public Set<Entity> getTypes() {
        return types;
    }
    @JsonDeserialize(as=LinkedHashSet.class)
    public void setTypes(Set<Entity> types) {
        this.types = types;
    }
    @Override
    public boolean equals(Object o){
        if (!(o instanceof Entity)) { // instanceof already handles null
            return false;
        }
        return this.getId().equals(((Entity)o).getId());
    }
}
And I have this Spring endpoint, which receives the input in the body of a POST request:
@RequestMapping(value = "api/entity", method = RequestMethod.POST)
public Entity createEntity(@RequestBody final Entity in) {
    Set<Entity> types = in.getTypes();
    [...]
}
I would like
Set<Entity> types = in.getTypes();
to contain only two entries, in insertion order, since one of them is a duplicate based on its id. Instead, the LinkedHashSet keeps the duplicate (!)
I thought that with the code above duplicates would be removed automatically, but apparently they are not.
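The same behavior can be reproduced without Spring or Jackson at all (a minimal sketch; the Main class and its stripped-down Entity are mine, mirroring the equals-only override above):

```java
import java.util.LinkedHashSet;
import java.util.Set;

public class Main {
    // Stripped-down copy of the Entity above: equals is overridden, hashCode is not.
    static class Entity {
        final String id;
        Entity(String id) { this.id = id; }
        @Override
        public boolean equals(Object o) {
            if (!(o instanceof Entity)) {
                return false;
            }
            return this.id.equals(((Entity) o).id);
        }
    }

    public static void main(String[] args) {
        Set<Entity> types = new LinkedHashSet<>();
        types.add(new Entity("1"));
        types.add(new Entity("2"));
        types.add(new Entity("1")); // equal by id to the first entry
        // Without a matching hashCode override, LinkedHashSet buckets each
        // instance by its default identity hash, so all three entries are kept.
        System.out.println(types.size()); // prints 3, not 2
    }
}
```

So the duplicates survive even when the set is populated by hand, which suggests the problem is in the Entity class rather than in how Spring/Jackson builds the set.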
This question has a broader context than Why do I need to override the equals and hashCode methods in Java?, since it involves implicit Jackson deserialization through Spring.