I am trying to model a request for a column that will be declared VARCHAR(120) but will actually occupy up to 480 bytes.
The length of 480 comes from VARCHAR(120) = 120 x 4 = 480 bytes. This is due to the CODEUNITS32 string unit: the length is counted in characters (Unicode code points), and up to 4 bytes are reserved per character.
I am not certain of the syntax for the DDL. My research suggests there should be a clause in the CREATE TABLE statement that sets the character set to UTF-32, but I am not sure. This is for Db2 mid-tier (LUW, not mainframe).
Here is something I found, but I am not sure it is valid for Db2 (it looks like MySQL ALTER TABLE syntax), nor do I understand how COLLATE works:
ALTER TABLE table_name
MODIFY column_name VARCHAR(255)
CHARACTER SET utf8
COLLATE utf8_unicode_ci;
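Based on my reading of the Db2 LUW documentation, I think the DDL might look like the sketch below, assuming a Unicode (UTF-8) database on Db2 11.1 or later; the table, column, and database names are placeholders. I would appreciate confirmation that this is right:

```sql
-- Sketch only: assumes Db2 for LUW 11.1+ with a Unicode database.
-- my_table / my_col / my_db are placeholder names.
CREATE TABLE my_table (
    my_col VARCHAR(120 CODEUNITS32)  -- 120 characters; up to 480 bytes reserved
);

-- Alternatively, change the database default so a plain VARCHAR(120)
-- is interpreted as 120 code points rather than 120 bytes:
-- UPDATE DB CFG FOR my_db USING string_units CODEUNITS32;
```

My understanding is that the data is still stored as UTF-8 rather than UTF-32; CODEUNITS32 only changes how the declared length is counted, which is why the byte limit becomes 4 x 120 = 480. Is that correct?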
Any help would be appreciated!