
I have written a PowerShell script that copies any e-book I download to a predetermined directory; my e-book manager periodically scans that directory and adds new books to my library. The script runs immediately after every download.
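For context, the copy step itself is trivial; it boils down to something like the following (the parameter name and paths here are placeholders rather than my actual script):

param([string] $BookPath)

# folder that my e-book manager watches for new books (placeholder path)
$libraryInbox = "D:\EbookLibrary\Import"

Copy-Item -Path $BookPath -Destination $libraryInbox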

The problem is that when several books are downloaded at or around the same time, the e-book manager locks up and becomes unresponsive.

Therefore, I would like to queue the copying using PowerShell jobs, but I do not know how to create a single queue (concurrency of one) in which each subsequent job waits for every older job to complete.

That is, I would like the script to create a job (let's call it a "Book Job") that periodically checks the queue of running Book Jobs to see if all older Book Jobs have finished before it runs. When it completes, a Book Job should declare that it has finished in some way that can be detected by younger Book Jobs.

Does anyone know how I can do this? I have been looking at a similar question here: Powershell background tasks. However, in my case I am running the script multiple times (once after every new download).
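Within a single PowerShell session this would be straightforward, something like the sketch below (the variable names are placeholders). The trouble is that every download launches a fresh powershell.exe, and Get-Job only sees jobs created in the current session, so a new instance has no way to see the older Book Jobs.

# what I would do if everything ran in ONE session:
# wait for any previously started Book Jobs, then start a new one.
# $bookPath and $libraryInbox are placeholders for the downloaded file and the watched folder.
Get-Job -Name "BookJob*" -ErrorAction SilentlyContinue | Wait-Job | Remove-Job

Start-Job -Name "BookJob_$(Get-Date -Format 'HHmmssfff')" -ScriptBlock {
    param($source, $destination)
    Copy-Item -Path $source -Destination $destination
} -ArgumentList $bookPath, $libraryInbox | Out-Null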

1 Answer


My thought is to establish a queue by creating one lock file per new instance of your script. When the script runs, it checks a directory dedicated to tracking the queue for locks left by other instances. If there are none, the script adds itself to the front of the queue, does its work (your copy code), then deletes its lock. If locks already exist, the new instance adds its lock to the end of the queue and keeps checking until it reaches the front.

This lets you run the same script any number of times, with each instance managing itself by checking the externally visible queue.

The lock file names are structured as index, delimiter ("_"), process ID, with a .lck extension; for example, "0_19831.lck".

Clear-Host

$queue = "C:\locks\"

function New-Lock ([int] $index) {
    $newLock = "$index" + "_" + $pid + ".lck"
    New-Item $queue$newLock | Out-Null
}

# helper: list the lock files sorted by their numeric index
# (a plain name sort is lexicographic, so "10_..." would land before "2_...")
function Get-Locks {
    gci $queue -Filter *.lck |
        select -ExpandProperty Name |
        sort { [int]($_.Split("_")[0]) }
}

# find the end of the stack
$locks = Get-Locks

# if locks exist, find the end of the stack by selecting the index of the last lock
if ($locks) {
    # gets the last lock file, selects the index by splitting on the delimiter
    [int]$last = [convert]::ToInt32(($locks | select -Last 1).Split("_")[0], 10)

    # add our lock to the end of the stack
    New-Lock ($last + 1)
}
# if no locks exist, create one at the top of the stack
else {
    New-Lock 0
}

# wait until we're at the top of the stack
do {
    $locks = Get-Locks

    # this is the PID on the top of the stack
    [int]$top = [convert]::ToInt32(($locks | select -First 1).Split("_")[1].Split(".")[0], 10)

    if ($pid -ne $top) {
        Write-Verbose "not at the top..."
        sleep 1
    }
} until ($pid -eq $top)

# if we're here, we've reached the top. it's our turn to do something
Write-Verbose "we've reached the top!"
# <do something. put your code here>
# might be good to add some Start-Sleep here
# </do something put your code here>

# now that we're done, delete our own lock (the one ending in our PID)
gci $queue -Filter *.lck | where { $_.Name -like "*_$pid.lck" } | Remove-Item
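Your download tool then just launches this same script once per finished download, for example (the script path is a placeholder, and this assumes your downloader can run a command when a download completes):

powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\scripts\Copy-Book.ps1"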

Below is a fictitious timeline in which you've downloaded three files (I've chosen random PIDs).

  • File 1 is downloaded and launches the script. There are no existing locks. Create lock "0_19831". We're at the top of the stack, so your code is executed. This is a big e-book, so your file transfer code will take a full minute to run.
  • File 2 is downloaded and launches the script. Lock(s) exist. Create lock "1_332". We're not at the top of the stack, so we'll wait in our do/until and keep checking until we're first in line.
  • File 1 finished copying. Delete lock "0_19831".
  • File 3 is downloaded and launches the script. Lock(s) exist. Create lock "2_7582". We're not at the top of the stack, wait until we are.
  • File 2 finished copying. Delete lock "1_332".
  • File 3 finished copying. Delete lock "2_7582".

This solution isn't bulletproof, but it might work well enough depending on the scale.
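One obvious gap: if an instance crashes or is killed while it is waiting (or while copying), its lock never gets deleted and everything queued behind it waits forever. A rough guard, offered only as a sketch, is to prune locks whose process no longer exists before checking the stack:

# prune locks left behind by dead processes before checking the stack
gci $queue -Filter *.lck | where {
    $lockPid = [int]$_.Name.Split("_")[1].Split(".")[0]
    -not (Get-Process -Id $lockPid -ErrorAction SilentlyContinue)
} | Remove-Item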
