How about, instead of setting lr_mult: 0 for the parameters of every layer before the new fully connected layer, simply stopping backpropagation at the new layer so no gradients reach the earlier layers?
You can do that by setting propagate_down: false.
For example:
layer {
  name: "new_layer"
  type: "InnerProduct"
  ...
  inner_product_param {
    ...
  }
  propagate_down: false # do not propagate gradients to this layer's bottom blobs (the layers below)
}
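To make it concrete: propagate_down only blocks the gradients flowing to the layer's bottom blobs; the new layer's own weights still receive gradients and keep learning as long as their lr_mult is non-zero. A rough sketch of what the full definition could look like (the names "fc8_new" and "fc7", the num_output value and the fillers are placeholders for your own setup):

layer {
  name: "fc8_new"
  type: "InnerProduct"
  bottom: "fc7"                        # output of the last pre-trained layer
  top: "fc8_new"
  param { lr_mult: 1 decay_mult: 1 }   # the new layer's weights keep learning
  param { lr_mult: 2 decay_mult: 0 }   # bias term
  inner_product_param {
    num_output: 10                     # placeholder: number of classes in the new task
    weight_filler { type: "xavier" }
    bias_filler { type: "constant" value: 0 }
  }
  propagate_down: false                # no gradients are sent back to "fc7" and below
}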
Alternatively, you can use sed, a command-line utility, to directly change all lr_mult entries in your prototxt file:
~$ sed -i -E 's/lr_mult *: *[0-9.]+/lr_mult: 0/g' train_val.prototxt
This one line changes every lr_mult in your train_val.prototxt to zero (the [0-9.]+ pattern also catches fractional values such as 0.1). You'll only need to set the lr_mult for the new layer back manually.
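To verify the result, and to find the line you still need to edit by hand, you can list every remaining lr_mult line together with its line number, for example:

~$ grep -n 'lr_mult' train_val.prototxt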