In Snowflake/SQL we can do:
SELECT * FROM myTbl 
WHERE date_col 
BETWEEN 
  CONVERT_TIMEZONE('UTC','America/Los_Angeles', some_date_string_col)::DATE - INTERVAL '7 DAY'
AND 
  CONVERT_TIMEZONE('UTC','America/Los_Angeles', some_date_string_col)::DATE - INTERVAL '1 DAY'
Is there a PySpark translation of this for DataFrames?
I imagine it would be something like this:
myDf.filter(
  col("date_col") >= to_utc_timestamp(...)
)
But how can we express the BETWEEN and the interval arithmetic?
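For reference, here is a minimal sketch of one way this might translate, assuming `myDf` has the `date_col` and `some_date_string_col` columns from the SQL above and that `myTbl` is registered as a table (both names are taken from the question, not verified): `CONVERT_TIMEZONE('UTC', 'America/Los_Angeles', x)::DATE` roughly maps to `to_date(from_utc_timestamp(x, 'America/Los_Angeles'))`, `BETWEEN` maps to `Column.between()`, and the `INTERVAL` arithmetic maps to `date_sub()`.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()
myDf = spark.table("myTbl")  # assumption: the source table is available under this name

# Convert the UTC timestamp string to America/Los_Angeles local time, then take the date part
local_date = F.to_date(
    F.from_utc_timestamp(F.col("some_date_string_col"), "America/Los_Angeles")
)

# BETWEEN lower AND upper -> Column.between(lower, upper);
# `- INTERVAL 'n DAY'` -> date_sub(date, n)
result = myDf.filter(
    F.col("date_col").between(
        F.date_sub(local_date, 7),  # lower bound: local date minus 7 days
        F.date_sub(local_date, 1),  # upper bound: local date minus 1 day
    )
)

Alternatively, the original SQL expression (including the `INTERVAL` syntax) can usually be passed almost verbatim through `F.expr(...)` inside `filter`, if keeping it as a SQL string is preferable.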