There are two factors here which both contribute.
- You are running `curl` in a pipeline. The `set -e` functionality only applies to the last command in a pipe; this is how the shell is designed (see the short demonstration after this list).
 
- There is no error message from `curl` because you are running it with `-s`.
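
To illustrate the first point, here is a minimal demonstration, independent of your script:

false | true
echo "$?"   # prints 0: the pipeline reports the exit status of 'true', the last command,
            # so 'set -e' would not abort here even though 'false' failed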
 
Capturing the output from a command and checking its exit status at the same time is easy, but then you end up with the results in a shell variable. Depending on your scenario, this may or may not be cumbersome.
if curlout=$(curl -s https://api.github.com/repos/dnote-io/cli/tags); then
    # curl succeeded; *now* parse the result
    latest="$(sed -n '/.*"name": "\(v[0-9]*\.[0-9]*\.[0-9]*\)",.*/!d;s//\1/p;q' <<<"$curlout")"
else
    rc=$?
    echo "$0: curl failed: $rc" >&2
    exit "$rc"
fi
I refactored your long pipeline into a single sed script, since that was relatively easy to do. I also doubt you can find many `grep -E` implementations which actually understand the Perl regex-ism `\d`, so maybe your code simply wasn't working in the first place. If the result is JSON, the proper solution is to use a proper JSON parser like `jq` instead.
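For the record, a rough `jq` equivalent could look like the line below; it assumes `jq` is installed and that the GitHub tags endpoint lists the newest tag first, which is not guaranteed:

# Pull the "name" field of the first tag object out of the captured JSON
latest=$(jq -r '.[0].name' <<<"$curlout")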
Also, note that your variables should be lower case; uppercase variable names are generally reserved for system use.
Your shebang says `#!/bin/sh`, so you can't use Bash features like `pipefail` with it. Perhaps you actually want `#!/bin/bash` in the shebang so that you can use Bash features in your script (seeing as you tagged this question bash anyway).
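
If you do switch to Bash, the top of the script could look something like this sketch (everything after these options would be your existing code):

#!/bin/bash
set -e -o pipefail
# With pipefail, a failure anywhere in a pipeline makes the whole pipeline fail,
# so 'set -e' now aborts the script even when curl is not the last command in a pipe.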