I've read several threads on SO about checking whether a URL exists in bash, e.g. #37345831, and the recommended solution was to use wget with --spider. However, the --spider option appears to fail when used with AWS S3 presigned URLs.
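For context, the existence-check pattern recommended in those answers is roughly the following sketch, which relies on the exit status of wget --spider:

# Exit status is 0 when the remote file exists, non-zero otherwise.
if wget -q --spider "${URL}"; then
    echo "URL exists"
else
    echo "URL does not exist"
fi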
Calling:
wget -S --spider "${URL}" 2>&1
Results in:
HTTP request sent, awaiting response...
  HTTP/1.1 403 Forbidden
  x-amz-request-id: [REF]
  x-amz-id-2: [REF]
  Content-Type: application/xml
  Date: [DATE]
  Server: AmazonS3
Remote file does not exist -- broken link!!!
Whereas the following returns HTTP/1.1 200 OK, as expected, for the same URL:
wget -S "${URL}" -O /dev/stdout | head
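In case it's relevant, the kind of check I'd like to end up with is a sketch along these lines: keep the plain GET (which evidently succeeds) but request only the first byte so the object isn't downloaded in full. This assumes the Range header is not covered by the presigned signature:

# Sketch of a GET-based existence check that avoids downloading the full object.
# Assumes the presigned signature does not include the Range header.
if wget -q -O /dev/null --header="Range: bytes=0-0" "${URL}"; then
    echo "URL exists"
else
    echo "URL does not exist"
fi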
The version of wget I'm running is:
GNU Wget 1.20.3 built on linux-gnu.
Any clue as to what's going on?