I have an issue while trying to insert data into HBase. I am running Scala code in the Spark shell on Google Cloud and trying to insert data from an RDD into HBase (Bigtable).
Format of hbaseRDD: RDD[(String, Map[String, String])]
The String is the row key, and the map holds that row's column qualifiers and their values.
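For illustration, here is a tiny hypothetical sample of what hbaseRDD holds (the row keys and column names below are made up):

    import org.apache.spark.rdd.RDD

    // sc is the SparkContext provided by the Spark shell
    val hbaseRDD: RDD[(String, Map[String, String])] = sc.parallelize(Seq(
      ("row-001", Map("pageViews" -> "12", "country" -> "US")),
      ("row-002", Map("pageViews" -> "3", "country" -> "DE"))
    ))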
The code looks like this:
    val tableName: String = "omniture"
    val connection = BigtableConfiguration.connect("*******", "**********")
    val admin = connection.getAdmin()
    val table = connection.getTable(TableName.valueOf(tableName))
TRY 1:

    hbaseRDD.foreach { w =>
      val put = new Put(Bytes.toBytes(w._1))
      val columnValue = w._2
      columnValue.foreach { x =>
        put.addColumn(Bytes.toBytes("u"), Bytes.toBytes(x._1), Bytes.toBytes(x._2))
      }
      table.put(put)
    }
TRY 2:

    hbaseRDD.map { w =>
      val put = new Put(Bytes.toBytes(w._1))
      val columnValue = w._2
      columnValue.map { x =>
        put.addColumn(Bytes.toBytes("u"), Bytes.toBytes(x._1), Bytes.toBytes(x._2))
      }
      table.put(put)
    }
Below is the error I am getting:
org.apache.spark.SparkException: Task not serializable
Caused by: java.io.NotSerializableException: com.google.cloud.bigtable.hbase.BigtableTable
Serialization stack:
        - object not serializable (class: com.google.cloud.bigtable.hbase.BigtableTable, value: BigtableTable{hashCode=0x7d96618, project=cdp-dev-201706-01, instance=cdp-dev-cl-hbase-instance, table=omniture, host=bigtable.googleapis.com})
        - field (class: logic.ingestion.Ingestion$$anonfun$insertTransactionData$1, name: table$1, type: interface org.apache.hadoop.hbase.client.Table)
        - object (class logic.ingestion.Ingestion$$anonfun$insertTransactionData$1, <function1>)
        at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
        at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:46)
        at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:100)
        at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:295)
        ... 27 more
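From the stack trace, my understanding is that the table handle created on the driver is being captured in the foreach/map closure and shipped to the executors, and BigtableTable is not serializable. I assume the usual fix is to open the connection inside foreachPartition so it is created on each executor, something like the sketch below (project and instance ids masked as above), but I am not sure this is the right approach for Bigtable:

    import org.apache.hadoop.hbase.TableName
    import org.apache.hadoop.hbase.client.Put
    import org.apache.hadoop.hbase.util.Bytes
    import com.google.cloud.bigtable.hbase.BigtableConfiguration

    hbaseRDD.foreachPartition { partition =>
      // The connection is opened on the executor, so nothing
      // non-serializable is captured in the closure sent from the driver.
      val connection = BigtableConfiguration.connect("*******", "**********")
      val table = connection.getTable(TableName.valueOf("omniture"))
      partition.foreach { case (rowKey, columns) =>
        val put = new Put(Bytes.toBytes(rowKey))
        columns.foreach { case (qualifier, value) =>
          put.addColumn(Bytes.toBytes("u"), Bytes.toBytes(qualifier), Bytes.toBytes(value))
        }
        table.put(put)
      }
      table.close()
      connection.close()
    }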
Any help would be appreciated. Thanks in advance.