If you want to run against HDFS, the command awk -F "/" '{print $NF}' will give you just the file name:
[cloudera@quickstart ~]$ hadoop fs -ls /user/cloudera/departments|awk -F "/" '{print $NF}'|egrep -v 'Found|_SUCCESS'
part-m-00000
part-m-00001
[cloudera@quickstart ~]$ 
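To see what the pipeline does without a running cluster, here is a minimal sketch that feeds it a simulated hadoop fs -ls listing (the listing text is made up for illustration; real output comes from HDFS):

```shell
# Simulated `hadoop fs -ls` output -- sample lines only, for demonstration
listing='Found 2 items
-rw-r--r--   1 cloudera cloudera   60 2017-01-01 10:00 /user/cloudera/departments/_SUCCESS
-rw-r--r--   1 cloudera cloudera   60 2017-01-01 10:00 /user/cloudera/departments/part-m-00000'

# Same pipeline as above: awk keeps only the last "/"-separated field,
# egrep drops the "Found n items" header and the _SUCCESS marker file
echo "$listing" | awk -F "/" '{print $NF}' | egrep -v 'Found|_SUCCESS'
# prints: part-m-00000
```

Note that the "Found 2 items" line contains no "/", so awk passes it through whole and the egrep filter is what removes it.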
If you want to run against the local file system, the command ls -1 will give you the file names directly. You can also use awk -F "/" '{print $NF}' to strip the path.
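For the local case, a small self-contained sketch (the directory and file names are created here just for demonstration):

```shell
# Create a temporary directory with sample part files (hypothetical names)
dir=$(mktemp -d)
touch "$dir/part-m-00000" "$dir/part-m-00001"

# ls -1 prints one bare file name per line -- no path component to strip
ls -1 "$dir"
```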
You can create a shell script like this (uncomment the hive line to actually run the inserts):
#!/bin/sh
# List files under the HDFS directory, keeping only the file names
files=$(hadoop fs -ls /user/cloudera/departments | awk -F "/" '{print $NF}' | egrep -v 'Found|_SUCCESS')
for file in $files
do
  #hive -e "insert into table t(name) values (\"$file\");"
  echo "insert into table t(name) values (\"$file\");"
done
Running it should insert into the Hive table (with the hive line still commented out, it just prints the statements):
[cloudera@quickstart ~]$ ./test.sh 
insert into table t(name) values ("part-m-00000");
insert into table t(name) values ("part-m-00001");
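One caveat: each hive -e call starts a new session, so with many files the loop above is slow. Hive's INSERT INTO ... VALUES accepts multiple rows, so you can build one statement and invoke hive once. A sketch, using hardcoded sample names in place of the hadoop fs -ls pipeline:

```shell
#!/bin/sh
# Sample file names; in practice this comes from the hadoop fs -ls pipeline
files="part-m-00000
part-m-00001"

# Build a comma-separated VALUES list so hive is invoked only once;
# ${values:+, } expands to ", " only when $values is already non-empty
values=""
for file in $files
do
  values="$values${values:+, }(\"$file\")"
done

#hive -e "insert into table t(name) values $values;"
echo "insert into table t(name) values $values;"
# prints: insert into table t(name) values ("part-m-00000"), ("part-m-00001");
```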