How do I add the lz4 native libraries for use by Spark workers?
I have tried adding them via both LD_LIBRARY_PATH and SPARK_LIBRARY_PATH (the latter as suggested — though with no accepted or even upvoted answer — in Apache Spark Native Libraries). Neither works; we get:
java.lang.RuntimeException: native lz4 library not available
  at org.apache.hadoop.io.compress.Lz4Codec.getCompressorType(Lz4Codec.java:125)
  at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:150)
  at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:165)
  at org.apache.hadoop.io.SequenceFile$Writer.init(SequenceFile.java:1201)
  at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1094)
  at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.<init>(SequenceFile.java:1444)
  at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:277)
  at BIDMat.HDFSIO.writeThing(HDFSIO.scala:96)
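For reference, here is roughly what I tried, as exports in the shell and in conf/spark-env.sh (a sketch of my attempts; the lz4 path matches the Homebrew install shown below):

```shell
# Sketch of the attempted setup; the lz4 path is from my machine
export LD_LIBRARY_PATH=/usr/local/Cellar/lz4/r131/lib:$LD_LIBRARY_PATH
export SPARK_LIBRARY_PATH=/usr/local/Cellar/lz4/r131/lib
```

The Spark workers were restarted after setting these, with no change in the error.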
Here is the LD_LIBRARY_PATH:
$echo $LD_LIBRARY_PATH
/usr/local/Cellar/lz4/r131/lib:/usr/local/Cellar/hadoop/2.7.2/libexec/lib:
and here are the contents of the lz4-related entry on that path:
$ll /usr/local/Cellar/lz4/r131/lib
total 528
-r--r--r--  1 macuser  admin  71144 Sep 21  2015 liblz4.a
drwxr-xr-x  7 macuser  admin    238 Sep 21  2015 .
drwxr-xr-x  3 macuser  admin    102 Jun 13 10:41 pkgconfig
-r--r--r--  1 macuser  admin  64120 Jun 13 10:41 liblz4.dylib
-r--r--r--  1 macuser  admin  64120 Jun 13 10:41 liblz4.1.dylib
-r--r--r--  1 macuser  admin  64120 Jun 13 10:41 liblz4.1.7.1.dylib
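In case it matters, I am aware that Spark also accepts a native-library path through its documented extraLibraryPath properties; a sketch of that configuration (the lz4 path is from my machine):

```
# conf/spark-defaults.conf (sketch)
spark.driver.extraLibraryPath    /usr/local/Cellar/lz4/r131/lib
spark.executor.extraLibraryPath  /usr/local/Cellar/lz4/r131/lib
```

Is this the right mechanism for making native lz4 visible to the workers, or does Hadoop's Lz4Codec need something else entirely?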