tl;dr
Use code points, not char/Character.
"d".codePointAt( 0 ) == 100 // true.
Details
The Answer by Alex Shesterov is correct. But in the bigger picture, you should not be using Character objects at all.
Character is broken
The Character class is a wrapper class for the primitive type char. The char/Character type has been essentially legacy since Java 2, and is essentially broken. As a 16-bit value, it is physically incapable of representing most of the characters defined in Unicode, whose code points range up to U+10FFFF and need 21 bits.
For example, a char cannot hold any character outside the Basic Multilingual Plane, which includes most emoji. This line fails to compile, because FACE WITH MEDICAL MASK (😷, U+1F637) occupies two UTF-16 code units and so cannot fit into a single char literal:
System.out.println( Character.valueOf( '😷' ) ) ;
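You can see the same breakage at runtime. A sketch comparing a String's char-based length with its count of actual characters (code points); the emoji is an arbitrary example character:

```java
public class CharIsBroken
{
    public static void main ( String[] args )
    {
        String face = "😷";  // FACE WITH MEDICAL MASK, U+1F637, outside the BMP.

        System.out.println( face.length() );  // 2 — counts UTF-16 code units, not characters.
        System.out.println( face.codePointCount( 0 , face.length() ) );  // 1 — the actual character count.
        System.out.println( face.charAt( 0 ) == '\uD83D' );  // true — charAt returns half a surrogate pair.
    }
}
```

Any code doing char-by-char processing silently mangles such text, handing you meaningless surrogate halves.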
Code points
Instead, when working with individual characters, use code point integer numbers. In Java that means using the int/Integer type.
If you look around classes such as String, StringBuilder, and Character you will find codePoint methods.
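A few of those codePoint-based methods in action; the strings here are arbitrary examples, and Character.toString( int ) requires Java 11 or later:

```java
public class CodePointMethods
{
    public static void main ( String[] args )
    {
        String input = "café";

        // Stream of code points, one int per character.
        input.codePoints().forEach( cp -> System.out.println( cp + " = " + Character.getName( cp ) ) );

        // Build a String from a single code point (Java 11+).
        String d = Character.toString( 100 );  // "d", LATIN SMALL LETTER D.
        System.out.println( d );
    }
}
```

Note that String#codePoints yields an IntStream, keeping you in the world of int primitives throughout.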
Let's revise your code snippet. We will change the names to be more descriptive, and swap the Character and char usage for plain int primitives. As a result, we can compare our int values directly using == or !=.
package work.basil.text;

public class App7
{
    public static void main ( String[] args )
    {
        int codePointOf_LATIN_SMALL_LETTER_D = "d".codePointAt( 0 ); // Annoying zero-based index counting, not ordinal.
        int codePoint2 = getInt();
        boolean sameCharacter = ( codePointOf_LATIN_SMALL_LETTER_D == codePoint2 ); // Comparing `int` primitives with double-equals.
        System.out.println( sameCharacter );
    }

    public static int getInt ()
    {
        return 100; // Code point 100 is LATIN SMALL LETTER D, `d`.
    }
}
When run:
true
Of course, if you use auto-boxing or otherwise mix the wrapper class Integer with the primitive int, then the same caveat from that other Answer applies here too: == on Integer objects compares object references, not numeric values.
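To illustrate that pitfall: boxed Integer values compared with == behave as expected only inside the small range the JVM caches (-128 to 127 by default). A minimal sketch, assuming default JVM settings:

```java
public class BoxedComparison
{
    public static void main ( String[] args )
    {
        Integer a = 1000 , b = 1000;  // Outside the Integer cache; two distinct objects.
        System.out.println( a == b );  // false with default settings — compares object references.
        System.out.println( a.equals( b ) );  // true — compares the wrapped values.

        Integer c = 100 , d = 100;  // Inside the cache; the same object is reused.
        System.out.println( c == d );  // true — but only by accident of caching.
    }
}
```

Sticking with the int primitive, as in the revised snippet above, sidesteps this trap entirely.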