I have the following class:
  @interface MyObj : NSObject {
      NSInteger _x;
      NSInteger _y;
      ....
  }
  @property(nonatomic, readonly) NSInteger x;
  @property(nonatomic, readonly) NSInteger y;
  ....
  - (id)initWithX:(NSInteger)posX Y:(NSInteger)posY;
  @end
and the implementation:
  @implementation MyObj
  @synthesize x = _x;
  @synthesize y = _y;
  - (id)initWithX:(NSInteger)posX Y:(NSInteger)posY {
      self = [super init];
      if (self) {
          _x = posX;
          _y = posY;
      }
      return self;
  }
  @end
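The intended usage is straightforward, e.g. (the literal arguments here are just for illustration):

    MyObj *obj = [[MyObj alloc] initWithX:3 Y:4];
    NSLog(@"x = %ld, y = %ld", (long)obj.x, (long)obj.y);   // expect "x = 3, y = 4"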
I am trying to initialize an array of MyObj objects like this:
    for (NSInteger y = 0; y < 5; ++y) {
        NSMutableArray *rowContent = [[NSMutableArray alloc] init];
        for (NSInteger x = 0; x < 5; ++x) {
            MyObj *myObj = [[MyObj alloc] initWithX:x Y:y];
            ....
            [rowContent addObject:myObj];
        }
    }
On the first iteration, when x=0 and y=0, myObj is created as expected, but when x=1, y=0, I can see inside initWithX:Y: that the posX argument is 1065353216. So I end up with an invalid myObj whose ivar _x is 1065353216. I can't understand why this happens.
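For completeness, the same bogus value shows up if the arguments are logged at the top of the initializer (the NSLog below is the only change from the code above); on the second inner iteration it prints posX = 1065353216:

    - (id)initWithX:(NSInteger)posX Y:(NSInteger)posY {
        NSLog(@"initWithX: posX = %ld, posY = %ld", (long)posX, (long)posY);
        ....
    }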
