I built a dictionary in a Jupyter Notebook in Azure Machine Learning Studio:

```python
w_att = {
    '398465': 0,
    '8837.58': 1,
    '74967': 2,
    'jjpereza1': 3,
    '3180311358': 4,
    '56450': 5,
    '812723.990000033': 6,
    'guaba': 7}
```
`len(w_att)` is 1372600, so when I tried to serialize the object and display it in the notebook with this code:

```python
import json

json_object = json.dumps(w_att, indent=4)
print(json_object)
```
I obtained this error:
> IOPub data rate exceeded.
> The notebook server will temporarily stop sending output
> to the client in order to avoid crashing it.
> To change this limit, set the config variable
> `--NotebookApp.iopub_data_rate_limit`.
>
> Current values:
> NotebookApp.iopub_data_rate_limit=1000000.0 (bytes/sec)
> NotebookApp.rate_limit_window=3.0 (secs)
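(For context, the rate limit only applies to output streamed to the browser; dumping the JSON straight to a file on the notebook VM's disk sidesteps it entirely. A minimal sketch, with `w_att.json` as an arbitrary file name and a small stand-in dict:)

```python
import json

# Stand-in for the full 1372600-entry dictionary from the question
w_att = {'398465': 0, 'guaba': 7}

# Writing to a local file produces no notebook output, so the
# IOPub data rate limit never triggers.
with open('w_att.json', 'w') as f:
    json.dump(w_att, f, indent=4)

# Round-trip check: read the file back into a dict
with open('w_att.json') as f:
    restored = json.load(f)
```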
Then I tried:

```python
import azureml.core
from azureml.core import Workspace, Datastore
import json

ws = Workspace.from_config()
datastore = Datastore.get(ws, datastore_name='xxx')
datastore.upload_files(json.dumps(w_att, indent=4), overwrite=True)
```
And I got this error:

```
UserErrorException: UserErrorException:
    Message: '{' does not point to a file. Please upload the file to cloud first if running in a cloud notebook.
    InnerException None
    ErrorResponse
{
    "error": {
        "code": "UserError",
        "message": "'{' does not point to a file. Please upload the file to cloud first if running in a cloud notebook."
    }
}
```
How can I save the object `w_att` directly to my storage account as a JSON file?
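(The second error occurs because `Datastore.upload_files` expects a list of local file paths as its first argument, not a JSON string, so `'{'` at the start of the string is treated as a path. A sketch of the usual workaround, writing the file locally first and then uploading it; the helper name, `local_path`, and `target_path` values here are illustrative, not from the Azure ML docs:)

```python
import json

def save_dict_to_datastore(d, datastore, local_path='./w_att.json',
                           target_path='w_att'):
    """Serialize a dict to a local JSON file, then upload that file.

    `datastore` is expected to behave like an azureml.core Datastore;
    the file and folder names are arbitrary choices.
    """
    with open(local_path, 'w') as f:
        json.dump(d, f, indent=4)
    # upload_files takes a *list of local file paths*, not file contents
    datastore.upload_files(files=[local_path],
                           target_path=target_path,
                           overwrite=True)
    return local_path
```

With a real workspace this would presumably be called as `save_dict_to_datastore(w_att, Datastore.get(ws, datastore_name='xxx'))`.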