I have a CSV file that contains 8 columns and more than 2000 rows, and I need to import that data into a PostgreSQL table.
I am using pgAdmin 4 and tried to import the data into my table, but it gives me the following error:
ERROR: invalid input syntax for type timestamp: "earliest_d"
CONTEXT: COPY cors_data, line 1, column earliest_d: "earliest_d"
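As far as I understand, the pgAdmin import dialog runs a COPY statement for this, roughly like the one below (the file path is a placeholder and I am only guessing at the exact options pgAdmin generates):

COPY public.cors_data (id, "interval", lat, lon, earliest_d, latest_dat, "Name", "stnID")
FROM '/path/to/cors_data.csv' -- placeholder path
WITH (FORMAT csv);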
I think the reason is that the CSV date fields are not recognized as timestamps. The two date columns look like this:
earliest_d : 2015 Oct 15 21:18
latest_dat : 2018 Feb 26 23:59
How can I make PostgreSQL recognize these CSV date fields?
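Based on the pattern documentation for to_timestamp, my best guess at a matching format string is the one below, but I do not know how to apply a format string during the CSV import itself:

-- my guess at a pattern matching the values above; not verified against the whole file
SELECT to_timestamp('2015 Oct 15 21:18', 'YYYY Mon DD HH24:MI');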
This is the SQL code generated for the table in PostgreSQL:
-- Table: public.cors_data
-- DROP TABLE public.cors_data;
CREATE TABLE public.cors_data
(
    id character varying(80) COLLATE pg_catalog."default" NOT NULL,
    "interval" character varying(80) COLLATE pg_catalog."default",
    lat character varying(120) COLLATE pg_catalog."default",
    lon character varying(120) COLLATE pg_catalog."default",
    earliest_d timestamp without time zone,
    latest_dat timestamp without time zone,
    "Name" character varying(120) COLLATE pg_catalog."default",
    "stnID" character varying(4) COLLATE pg_catalog."default",
    CONSTRAINT cors_data_pkey PRIMARY KEY (id),
    CONSTRAINT cors_data_interval_key UNIQUE ("interval"),
    CONSTRAINT cors_data_lat_key UNIQUE (lat),
    CONSTRAINT cors_data_lon_key UNIQUE (lon)
)
WITH (
    OIDS = FALSE
)
TABLESPACE pg_default;
ALTER TABLE public.cors_data
    OWNER to postgres;
I have looked at some string-to-date conversion formats, but found nothing that I could apply to the string date format in my CSV during the import.
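The only workaround I can think of is to load the two date columns as text into a staging table (hypothetical name cors_data_staging) and convert them afterwards, roughly as sketched below, but I would prefer to import the dates directly into cors_data:

-- staging table with the date columns as plain text (hypothetical workaround)
CREATE TABLE public.cors_data_staging
(
    id character varying(80),
    "interval" character varying(80),
    lat character varying(120),
    lon character varying(120),
    earliest_d text,
    latest_dat text,
    "Name" character varying(120),
    "stnID" character varying(4)
);

-- after copying the CSV into the staging table, convert the strings while inserting
INSERT INTO public.cors_data
SELECT id, "interval", lat, lon,
       to_timestamp(earliest_d, 'YYYY Mon DD HH24:MI'),
       to_timestamp(latest_dat, 'YYYY Mon DD HH24:MI'),
       "Name", "stnID"
FROM public.cors_data_staging;

Is there a way to make the timestamps import directly from the CSV instead?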