Is it possible to optimize this code using Parallel.ForEach or something similar?
using (var zipStream = new ZipOutputStream(OpenWriteArchive()))
{
    zipStream.CompressionLevel = CompressionLevel.Level9;
    foreach (var document in documents)
    {
        zipStream.PutNextEntry(GetZipEntryName(type));
        using (var targetStream = new MemoryStream()) // in-memory buffer for the serialized document
        {
            DocumentHelper.SaveDocument(document.Value, targetStream, type);
            targetStream.Position = 0;
            targetStream.CopyTo(zipStream);
        }
        GC.Collect(); // forced collection to keep memory down between documents
    }
}
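For illustration, a naive parallel version would look roughly like this (a sketch reusing zipStream, documents, and type from the snippet above):

// Naive attempt (sketch): serialize and write entries from multiple threads.
// ZipOutputStream is not thread-safe, so the interleaved writes corrupt
// the archive or throw.
Parallel.ForEach(documents, document =>
{
    using (var targetStream = new MemoryStream())
    {
        DocumentHelper.SaveDocument(document.Value, targetStream, type);
        targetStream.Position = 0;
        zipStream.PutNextEntry(GetZipEntryName(type)); // races with other threads
        targetStream.CopyTo(zipStream);                // writes interleave incorrectly
    }
});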
The problem is that neither DotNetZip's nor SharpZipLib's ZipOutputStream supports changing position or seeking, so writing to the zip stream from multiple threads, as above, leads to errors. It's also impossible to accumulate the result streams in a ConcurrentStack, because the application can work with 1000+ documents and has to compress the streams and save them to the cloud on the fly.
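To make the ConcurrentStack point concrete, this is roughly the approach I mean (a sketch, same assumed names as above, requires System.Collections.Concurrent); it parallelizes serialization but keeps every buffered stream alive until the final sequential loop:

// Rejected idea (sketch): buffer all serialized documents, then write sequentially.
// With 1000+ documents this holds every MemoryStream in memory at the same time.
var buffers = new ConcurrentStack<MemoryStream>();
Parallel.ForEach(documents, document =>
{
    var targetStream = new MemoryStream();
    DocumentHelper.SaveDocument(document.Value, targetStream, type);
    targetStream.Position = 0;
    buffers.Push(targetStream);
});
foreach (var buffer in buffers)
{
    zipStream.PutNextEntry(GetZipEntryName(type));
    using (buffer)
        buffer.CopyTo(zipStream); // single-threaded writes, but only after buffering everything
}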
Is there any way to solve this?