Expanding on @Ivailo Bardarov's answer, I wrote the following script to duplicate tables from a remote DynamoDB into a local one:
#!/bin/bash
# Tables to copy from the remote account into DynamoDB Local
declare -a arr=("table1" "table2" "table3" "table4")
for i in "${arr[@]}"
do
    TABLE=$i
    maxItems=25   # batch-write-item accepts at most 25 put requests per call
    index=0       # page counter, used only for progress output
    echo "Getting table description of $TABLE from remote database..."
    aws dynamodb describe-table --table-name "$TABLE" > table-description.json
    echo
    echo "Creating table $TABLE in the local database..."
    ATTRIBUTE_DEFINITIONS=$(jq .Table.AttributeDefinitions table-description.json)
    KEY_SCHEMA=$(jq .Table.KeySchema table-description.json)
    BILLING_MODE=$(jq .Table.BillingModeSummary.BillingMode table-description.json)
    READ_CAPACITY_UNITS=$(jq .Table.ProvisionedThroughput.ReadCapacityUnits table-description.json)
    WRITE_CAPACITY_UNITS=$(jq .Table.ProvisionedThroughput.WriteCapacityUnits table-description.json)
    TABLE_DEFINITION=""
    if [[ "$READ_CAPACITY_UNITS" > 0 && "$WRITE_CAPACITY_UNITS" > 0 ]]
    then
    TABLE_DEFINITION="{\"AttributeDefinitions\":$ATTRIBUTE_DEFINITIONS,\"TableName\":\"$TABLE\",\"KeySchema\":$KEY_SCHEMA,\"ProvisionedThroughput\":{\"ReadCapacityUnits\":$READ_CAPACITY_UNITS,\"WriteCapacityUnits\":$WRITE_CAPACITY_UNITS}}"
    else
    TABLE_DEFINITION="{\"AttributeDefinitions\":$ATTRIBUTE_DEFINITIONS,\"TableName\":\"$TABLE\",\"KeySchema\":$KEY_SCHEMA,\"BillingMode\":$BILLING_MODE}"
    fi
    echo $TABLE_DEFINITION > create-table.json
    aws dynamodb create-table --cli-input-json file://create-table.json --endpoint-url http://localhost:8000
    echo "Querying table $TABLE from remote..."
    DATA=$(aws dynamodb scan --table-name "$TABLE" --max-items $maxItems)
    ((index+=1))
    echo "Saving batch $index of remote table [$TABLE] to inserts.json file..."
    echo "$DATA" | jq ".Items | {\"$TABLE\": [{\"PutRequest\": { \"Item\": .[]}}]}" > inserts.json
    echo "Inserting rows to $TABLE in local database..."
    aws dynamodb batch-write-item --request-items file://inserts.json --endpoint-url http://localhost:8000
    # -r strips the quotes so the raw token can be passed back via --starting-token
    nextToken=$(echo "$DATA" | jq -r '.NextToken')
    while [[ "$nextToken" != "" && "$nextToken" != "null" ]]
    do
      echo "Querying table $TABLE from remote..."
      DATA=$(aws dynamodb scan --table-name "$TABLE" --max-items $maxItems --starting-token "$nextToken")
      ((index+=1))
      echo "Saving batch $index of remote table [$TABLE] to inserts.json file..."
      echo "$DATA" | jq ".Items | {\"$TABLE\": [{\"PutRequest\": { \"Item\": .[]}}]}" > inserts.json
      echo "Inserting rows to $TABLE in local database..."
      aws dynamodb batch-write-item --request-items file://inserts.json --endpoint-url http://localhost:8000
      nextToken=$(echo "$DATA" | jq '.NextToken')
    done
done
echo "Deleting temporary files..."
rm -f table-description.json
rm -f create-table.json
rm -f inserts.json
echo "Database sync complete!"
The script loops over the array of table names. For each table it first fetches the remote table description, builds a create-table JSON file containing only the minimum required parameters, and creates the table in the local database. It then uses the rest of @Ivailo Bardarov's logic to generate batch inserts and push them into the newly created table, paging through the remote scan until no NextToken is left. Finally, it cleans up the generated JSON files.
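For reference, the generated create-table.json ends up looking roughly like this for a hypothetical provisioned table keyed on a string id attribute (values are illustrative; your attribute definitions and capacity units will differ):

{
  "AttributeDefinitions": [ { "AttributeName": "id", "AttributeType": "S" } ],
  "TableName": "table1",
  "KeySchema": [ { "AttributeName": "id", "KeyType": "HASH" } ],
  "ProvisionedThroughput": { "ReadCapacityUnits": 5, "WriteCapacityUnits": 5 }
}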
Keep in mind, my purpose was just to create a rough duplicate of the tables (hence the minimum required parameters) for development purposes.
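To run it you need jq and the AWS CLI installed, credentials configured for the remote account, and DynamoDB Local listening on port 8000. Something along these lines should work (sync-tables.sh is just a placeholder name for the script above):

# Start DynamoDB Local (here via Docker) and run the script
docker run -d -p 8000:8000 amazon/dynamodb-local
chmod +x sync-tables.sh
./sync-tables.sh

# Spot-check the copied data against the local endpoint
aws dynamodb list-tables --endpoint-url http://localhost:8000
aws dynamodb scan --table-name table1 --max-items 5 --endpoint-url http://localhost:8000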