This is just an outline of what I'm doing in a Windows service. I got the idea from this video to do it with parallel processing.
I have two methods and a model class.
Model Class code:
public class Email{
    public string Recipient { get; set; }
    public string Message { get; set; }
}
The methods are something like this:
public void LoadData(){
    while(Main.IsProcessRunning){
        // 1. Get all queued emails.
        var emails = new dummyRepositories().GetAllEmails(); // This will return List<Email>.
        // 2. Send them.
        // After sending, assume the data is moved to another table so it will not be queried again on the next loop.
        SendDataParallel(emails); // Will this behave asynchronously even though the calling method is synchronous?
        // Will execution continue here, or wait until the emails have actually been sent?
        // If it continues before they are sent, is there a chance the same emails are
        // fetched again on the next loop and sent twice? (See the sketch right after this method.)
    }
}
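If execution does continue immediately, I'm guessing the loop would need to block on the batch, something like this (just a rough sketch, assuming SendDataParallel is changed to return a Task):

// Sketch: block until the whole batch has been sent before looping again.
SendDataParallel(emails).Wait();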
// This will send the emails in parallel.
public async void SendDataParallel(IList<Email> emails){
    var allTasks = emails.Select(SendDataAsync);
    await TaskEx.WhenAll(allTasks);
}
// Assume this code sends an email asynchronously. (It will not really send an email; it is for the sample only.)
public async Task SendDataAsync(Email email){
    using (var client = new HttpClient())
    {
        // Placeholder call so the sample compiles; the real code posts the email to a send endpoint.
        await client.PostAsync("https://example.com/send", new StringContent(email.Message));
    }
}
I just want to get all queued emails, send them in parallel, and then wait until they have all been sent.
I'm trying to avoid using a foreach over every email that I get.
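For reference, this is roughly the shape I have in mind, in case it makes the question clearer. It's only a sketch: the endpoint URL is a placeholder, Main.IsProcessRunning and dummyRepositories are the same ones as above, and I'm assuming .NET 4.5+ so Task.WhenAll is available (otherwise TaskEx.WhenAll from the async targeting pack).

using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

public class EmailSender{
    // One shared HttpClient for all sends.
    private static readonly HttpClient Client = new HttpClient();

    public void LoadData(){
        while(Main.IsProcessRunning){
            var emails = new dummyRepositories().GetAllEmails();
            // Block here so the next iteration cannot pick up the same emails again.
            SendDataParallel(emails).Wait();
        }
    }

    // Returns a Task so the caller can wait for the whole batch.
    public async Task SendDataParallel(IList<Email> emails){
        var allTasks = emails.Select(SendDataAsync);
        await Task.WhenAll(allTasks);
    }

    // Placeholder send: posts the message body to a dummy endpoint.
    public async Task SendDataAsync(Email email){
        await Client.PostAsync("https://example.com/send", new StringContent(email.Message));
    }
}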