You can basically do this:
- get the list of files
- get the creation time for each of them (or os.path.getmtime() if you care about modification time)
- use the datetime module to build the cutoff value to compare against (now minus 1 hour)
- compare
For that I've used a dictionary to store both paths and timestamps in a compact format. You can then sort the dictionary by its values (each value is a float timestamp) to get the files created within the last hour in chronological order, e.g. with the sorted(...) function (see the sorting sketch right after the code):
import os
import glob
from datetime import datetime, timedelta
hour_files = {
    key: val
    for key, val in {
        path: os.path.getctime(path)   # creation / metadata-change timestamp
        for path in glob.glob("./*")
    }.items()
    # keep only entries whose timestamp falls within the last hour
    if datetime.fromtimestamp(val) >= datetime.now() - timedelta(hours=1)
}
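To actually get them in order, sort the filtered dictionary by its values with sorted(); a minimal sketch (the sorted_files name is just illustrative):
# order the (path, timestamp) pairs by timestamp, oldest first
sorted_files = sorted(hour_files.items(), key=lambda item: item[1])
for path, timestamp in sorted_files:
    print(path, datetime.fromtimestamp(timestamp))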
Alternatively, the same filtering without the nested comprehension:
files = glob.glob("./*")
times = {}
for path in files:
    times[path] = os.path.getctime(path)
hour_files = {}
for key, val in times.items():
    if datetime.fromtimestamp(val) < datetime.now() - timedelta(hours=1):
        continue
    hour_files[key] = val
Or, if your folder is a mess and contains a lot of files, approach it incrementally and filter as you go instead of collecting all the timestamps first:
hour_files = {}
for file in glob.glob("./*"):
    timestamp = os.path.getctime(file)
    if datetime.fromtimestamp(timestamp) < datetime.now() - timedelta(hours=1):
        continue
    hour_files[file] = timestamp
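A note on the timestamp source: os.path.getctime() is creation time on Windows but the time of the last metadata change on most Unix systems. If what you actually want is "modified within the last hour" (the os.path.getmtime() hinted at above), the same pattern applies; a minimal sketch, with recent_mods being just an illustrative name:
# reuses the imports from the snippets above
recent_mods = {
    path: os.path.getmtime(path)   # modification time instead of creation time
    for path in glob.glob("./*")
    if datetime.fromtimestamp(os.path.getmtime(path)) >= datetime.now() - timedelta(hours=1)
}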