Enums are full types. While they are internally represented as integers and can be explicitly cast to and from them (the underlying number type can be changed; it defaults to int), each enum is a distinct type.
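A minimal sketch of that, assuming C# (EngineType is the example type used later in this answer; the member names are illustrative):

```csharp
// The underlying numeric type defaults to int; here it is changed to byte.
enum EngineType : byte
{
    Diesel,
    Petrol,
    Electric
}

class Program
{
    static void Main()
    {
        EngineType engine = EngineType.Petrol;

        // Conversions to and from the underlying integer must be explicit:
        byte raw = (byte)engine;            // 1
        EngineType back = (EngineType)raw;  // EngineType.Petrol
        // byte bad = engine;               // would not compile: no implicit conversion

        System.Console.WriteLine($"{engine} -> {raw} -> {back}");
    }
}
```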
This is exactly what enables all those additional type checks that make them useful. We recently had a question about why we do not just use int constants instead of enums:
Another enum vs. int
`is` compares types and only types. If you do not give it a type, it will just go and find the type of whatever you did give it. With primitive values that can be ambiguous: which integer type a bare constant like 1 ends up as (Int16, Int32 or Int64) is a compiler/runtime detail, and it will likely vary with the digit length of the constant I give it. Not something I can rely on.
It could probably tell you that 1.2 is not the same type as 1 (unless some braindead designer decided to use decimal or float values in both cases for once).
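To illustrate, boxing the constants lets `is` check their actual runtime types (a sketch; the commented results are what unsuffixed literals get from current C# compilers):

```csharp
object whole = 1;        // an unsuffixed integer literal compiles as Int32
System.Console.WriteLine(whole is short);  // False
System.Console.WriteLine(whole is int);    // True
System.Console.WriteLine(whole is long);   // False: 1L would have been Int64

object fraction = 1.2;   // an unsuffixed fractional literal compiles as Double
System.Console.WriteLine(fraction is double); // True
System.Console.WriteLine(fraction is int);    // False: 1.2 is not the same type as 1
```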
But all possible values of the type EngineType are of the type EngineType. That is a big reason enums are a thing in the first place: exactly to allow such type checks.
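A short sketch of that guarantee (Describe is a hypothetical helper, EngineType as above):

```csharp
enum EngineType { Diesel, Petrol, Electric }

class Demo
{
    // Only EngineType values are accepted here; Describe(2) would not compile.
    static string Describe(EngineType engine) => $"Engine: {engine}";

    static void Main()
    {
        object value = EngineType.Electric;

        // The check is reliable: every EngineType value is an EngineType.
        System.Console.WriteLine(value is EngineType);           // True
        System.Console.WriteLine(Describe(EngineType.Electric)); // Engine: Electric
        // System.Console.WriteLine(Describe(2));                // compile error
    }
}
```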