There is no single correct answer to this question.
- Historically, colder was considered better.
- Computers are designed to work in an environment suitable for humans, so
20-23 °C is typical for an office - and 19-22 °C is typical for a data center.
- Some places (like Google) have found it cheaper to run DCs hotter, because
the cost of the slightly higher hardware failure rate is lower than the
savings on cooling bills (see the sketch after this list).
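To see why that can pay off, here is a minimal back-of-envelope sketch of the trade-off. Every figure below is invented purely for illustration - plug in your own cooling savings, failure rates, and replacement costs:

```python
# Hypothetical numbers for illustration only; real figures vary per facility.
servers = 1000
cooling_saving_per_server = 30.0   # assumed $/year saved by running a few degrees warmer
baseline_failure_rate = 0.020      # assumed annual failure probability per server
warmer_failure_rate = 0.025        # assumed rate at the higher temperature
replacement_cost = 400.0           # assumed $ per failed server

saving = servers * cooling_saving_per_server
extra_failures = servers * (warmer_failure_rate - baseline_failure_rate)
extra_cost = extra_failures * replacement_cost

print(f"Cooling saved:      ${saving:,.0f}/yr")
print(f"Extra failure cost: ${extra_cost:,.0f}/yr")
print("Run warmer" if saving > extra_cost else "Keep it cooler")
```

With these made-up numbers the cooling savings win comfortably; the point is only that the decision is an arithmetic one, not a matter of "colder is always better".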
I would be more concerned about having a single air conditioning unit - when it fails, the room temperature changes rapidly, and that thermal cycling is stressful to computer components - more so, I expect, than a constant but warmer temperature.
Air conditioning units can typically maintain a reasonable humidity level for a DC - aim for 45-55% RH if you can control it - enough that static is not too much of a problem, and neither is condensation. Both concerns are easy to watch for, as sketched below.
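A minimal monitoring sketch covering both points - catching an AC failure quickly via the rate of temperature change, and flagging humidity drifting out of the 45-55% band. Here `read_temp_c()`, `read_rh()` and `alert()` are hypothetical stand-ins for your actual sensors and paging setup, and the thresholds are assumptions to tune:

```python
import random
import time

def read_temp_c() -> float:
    """Hypothetical sensor read; replace with your hardware's API."""
    return 21.0 + random.uniform(-0.5, 0.5)

def read_rh() -> float:
    """Hypothetical relative-humidity read."""
    return 50.0 + random.uniform(-3.0, 3.0)

def alert(msg: str) -> None:
    print("ALERT:", msg)  # swap for email/pager in practice

MAX_DELTA_C_PER_MIN = 1.0    # assumed threshold: faster swings suggest AC trouble
RH_LOW, RH_HIGH = 45.0, 55.0

last_temp = read_temp_c()
while True:
    time.sleep(60)
    temp, rh = read_temp_c(), read_rh()
    if abs(temp - last_temp) > MAX_DELTA_C_PER_MIN:
        alert(f"Temperature moving fast: {last_temp:.1f} -> {temp:.1f} C in a minute")
    if not RH_LOW <= rh <= RH_HIGH:
        alert(f"Humidity out of band: {rh:.1f}% RH")
    last_temp = temp
```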
With respect to condensation, you probably don't need to worry too much about it (except for your air conditioner freezing up if it's overworked). The amount of water air can hold rises with temperature, so any condensation will occur where the air is cooled down - at the air conditioner, not at the computers. (In fact, because computers generate heat, they drive condensation away: the warmer air surrounding them can absorb more water.) This is also why air conditioners are installed with a drain outlet for the condensed water.
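If you want to put a number on it, the dew point tells you how cold a surface must be before condensation forms on it. A small sketch using the standard Magnus approximation (Alduchov-Eskridge constants; the room figures are just illustrative):

```python
import math

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Dew point via the Magnus approximation (Alduchov-Eskridge constants)."""
    a, b = 17.625, 243.04
    gamma = math.log(rh_percent / 100.0) + (a * temp_c) / (b + temp_c)
    return (b * gamma) / (a - gamma)

# Room air at 22 C and 50% RH:
print(f"{dew_point_c(22, 50):.1f} C")  # ~11.1 C
```

At 22 °C and 50% RH the dew point is about 11 °C: the AC's cooling coil typically runs below that (hence the drain), while the servers and their surroundings run well above it, so the water ends up at the air conditioner rather than on your hardware.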
[ Another tangentially related thought: if you have ever had a hot shower in a cold room - particularly with an extractor fan running - you will have noticed mist forming. The warm, wet air from the shower is cooled by the colder air moving past it, and the excess moisture is squeezed out as water droplets. This happens at the point of cooling - i.e. at the air conditioner. And if your mirror fogs up and you blow over it with a hair dryer, it clears rapidly - the same effect a computer's heat has on moist air. ]