I had a similar problem and couldn't find anything better than scraping the downloads page. You mentioned curl, so I'm assuming you want a shell script. I ended up with this:
url='https://www.python.org/ftp/python/'
curl --silent "$url" |
    sed -n 's!.*href="\([0-9]\+\.[0-9]\+\.[0-9]\+\)/".*!\1!p' |
    sort -rV |
while read -r version; do
    filename="Python-$version.tar.xz"
    # Versions which only have alpha, beta, or rc releases will fail here.
    # Stop when we find one with a final release.
    if curl --fail --silent --remote-name "$url$version/$filename"; then
        echo "$filename"
        break
    fi
done
This relies on sort -V (version sort), which is a GNU coreutils extension. That shouldn't be a problem on Ubuntu, but it may not be available in other sort implementations (for example, the one shipped with macOS).
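To see why version sort matters here: a plain lexical sort compares version strings character by character, so "3.9.1" sorts after "3.10.0". A quick comparison (using made-up version numbers):

```shell
# Lexical reverse sort: '9' > '1', so 3.9.1 wrongly comes out on top.
printf '3.9.1\n3.10.0\n' | sort -r  | head -n 1

# Version-aware reverse sort: 3.10.0 correctly comes out on top.
printf '3.9.1\n3.10.0\n' | sort -rV | head -n 1
```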
If you're using this in a larger script and want to use the version or filename variables after the loop, see How to pipe input to a Bash while loop and preserve variables after loop ends.
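The short version of that answer: because the pipe runs the while loop in a subshell, assignments made inside it vanish when the loop ends. In bash you can feed the loop with process substitution instead of a pipe, so it runs in the current shell. A minimal sketch with hard-coded versions standing in for the curl/sed output:

```shell
# bash-only: < <(...) is process substitution, which keeps the loop
# in the current shell so $latest survives past the loop.
latest=''
while read -r version; do
    latest="$version"
    break
done < <(printf '3.9.1\n3.10.0\n' | sort -rV)

echo "$latest"   # → 3.10.0
```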