I am having a hard time understanding how to get TensorBoard to work properly from a notebook running on Google Colab. Below is the series of code snippets I use to work with TensorBoard.
TensorFlow version:  2.2.0 
Eager mode:  True 
Hub version:  0.8.0 
GPU is available 
%load_ext tensorboard
import tensorflow as tf
from tensorboard.plugins.hparams import api as hp
from tensorflow.keras.callbacks import EarlyStopping, ModelCheckpoint

def create_callbacks(monitor_metric, minimum_delta, patience_limit,
                     verbose_value, mode_value, weights_fname,
                     logdir, hparams, tbc):
    callbacks = [
        EarlyStopping(monitor=monitor_metric,
                      min_delta=minimum_delta,
                      patience=patience_limit,
                      verbose=verbose_value,
                      mode=mode_value,
                      restore_best_weights=True),
        ModelCheckpoint(filepath=weights_fname,
                        monitor=monitor_metric,
                        verbose=verbose_value,
                        save_best_only=True,
                        save_weights_only=True),
        tf.keras.callbacks.TensorBoard(logdir),  # used here
        TensorBoardColabCallback(tbc),
        hp.KerasCallback(logdir, hparams)  # used here
    ]

    return callbacks
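For context, this is roughly how the TensorBoard callback ends up being used during training. The model, data, and log path below are only illustrative placeholders, not my real setup:

```python
import tensorflow as tf

# Illustrative stand-in model and data; the real model is larger.
logdir = "model_one/logs/fit_demo"  # placeholder log path
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(2,))])
model.compile(optimizer="adam", loss="mse")

x = tf.random.normal((16, 2))
y = tf.random.normal((16, 1))

# The TensorBoard callback writes event files under logdir while fit() runs.
model.fit(x, y, epochs=1, verbose=0,
          callbacks=[tf.keras.callbacks.TensorBoard(logdir)])
```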
Initializing the hyperparameters that will be logged by TensorBoard:
HP_HIDDEN_UNITS = hp.HParam('batch_size', hp.Discrete([128]))
HP_EMBEDDING_DIM = hp.HParam('embedding_dim', hp.Discrete([50, 100]))
HP_LEARNING_RATE = hp.HParam('learning_rate', hp.Discrete([0.01])) # Adam default: 0.001, SGD default: 0.01, RMSprop default: 0.001
HP_DECAY_STEPS_MULTIPLIER = hp.HParam('decay_steps_multiplier', hp.Discrete([10, 100]))
METRIC_ACCURACY = "hamming_loss"
Writing the hparams configuration file to TensorBoard's logging directory:
import os

hp_logging_directory = os.path.join(os.getcwd(), "model_one/logs/hparam_tuning")
with tf.summary.create_file_writer(hp_logging_directory).as_default():
    hp.hparams_config(
        hparams=[HP_HIDDEN_UNITS, HP_EMBEDDING_DIM, HP_LEARNING_RATE, HP_DECAY_STEPS_MULTIPLIER],
        metrics=[hp.Metric(METRIC_ACCURACY, display_name='hamming_loss')],
    )
    
# os.path.exists() returns a bool and never raises for a missing path,
# so a try/except cannot detect the missing directory; use if/else instead.
if os.path.exists(hp_logging_directory):
    print("Directory of hyper-parameter logging exists!")
else:
    print("Directory not found!")
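For completeness, this is my understanding of how each individual training run records its own hyperparameter values and metric under that directory (a minimal sketch; the run name and the concrete values are placeholders drawn from the domains defined above):

```python
import os
import tensorflow as tf
from tensorboard.plugins.hparams import api as hp

hp_logging_directory = os.path.join(os.getcwd(), "model_one/logs/hparam_tuning")

# Placeholder values for one trial.
hparams = {
    "embedding_dim": 50,
    "learning_rate": 0.01,
}

run_dir = os.path.join(hp_logging_directory, "run-0")  # illustrative run name
with tf.summary.create_file_writer(run_dir).as_default():
    hp.hparams(hparams)  # record the hyperparameter values for this trial
    tf.summary.scalar("hamming_loss", 0.1, step=1)  # illustrative metric value
```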
Launching the TensorBoard UI:
%tensorboard --logdir model_one/logs/hparam_tuning
Links I have consulted:
- https://www.tensorflow.org/tensorboard/hyperparameter_tuning_with_hparams
- A related Stack Overflow question -> I tried many different options from that question without any luck.
I have also installed the tensorboardcolab module:
from tensorboardcolab import *

tbc = TensorBoardColab()  # creating a TensorBoardColab object automatically creates a link
writer = tbc.get_writer()  # create a FileWriter
writer.add_graph(tf.get_default_graph())  # add the graph
writer.flush()
Executing the above, I get the following error: AttributeError: module 'tensorboard.summary._tf.summary' has no attribute 'FileWriter'
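As far as I understand, tf.summary.FileWriter was removed in TF 2.x, and graphs are exported by tracing a tf.function instead. Here is a minimal sketch of that approach (the toy function and log path are placeholders, not my actual model):

```python
import tensorflow as tf

logdir = "model_one/logs/graph"  # placeholder log path
writer = tf.summary.create_file_writer(logdir)

@tf.function
def forward(x):
    # Toy computation standing in for the model's forward pass.
    return x * 2.0

tf.summary.trace_on(graph=True)   # start recording the graph
forward(tf.constant([1.0, 2.0]))  # trace one call of the tf.function
with writer.as_default():
    tf.summary.trace_export(name="model_graph", step=0)
writer.flush()
```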
When I try to access localhost:6006 in the browser, I get an error saying that the site can't be reached.
Please check my Colab notebook and kindly write in the comments if any additional information is missing that I may have forgotten to include.

