I created a Resource Manager template with the resources to run a function app (essentially the default template for this purpose). I can manually copy files to my File Storage under site/wwwroot, and when I curl the function app it works.
What I'm wondering is: what is the right way to deploy a set of files to update the actual function app without using git? I can see examples of how to use git, but I want to keep all my functions in a single repo rather than a bunch of smaller repos (I want a monorepo).
In AWS you just upload your code as a .zip file to S3, then when creating the Lambda you give it a path to the zip. Later there is a simple API you can call to push up a new zip, and presto.
What is the equivalent API in Azure to just give it a zip and have it unpack the files into the right directory?
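For comparison, the AWS flow I'm describing looks roughly like this (a sketch; the bucket and function names are placeholders):

$ aws s3 cp index.js.zip s3://my-bucket/index.js.zip
$ aws lambda update-function-code --function-name my-function --zip-file fileb://index.js.zip

The second command is the "push up a new zip" step; I'm looking for the Azure equivalent of that.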
EDIT:
I finally managed to figure it out. The top answer was essentially right, but there were a few extra steps I had to work out, so I'll put those here.
The specific doc on curl is here. My issue thread with the team is here.
The actual curl call I used is this:
$ curl -XPUT --data-binary @index.js.zip "https://abc:xyz@ugh.scm.azurewebsites.net/api/zip/site/wwwroot/hello"
Notes on this:
- You must use `--data-binary`; `-d` results in a bad payload.
- To create the zip in bash I used: `$ zip index.js.zip index.js`
- `abc:xyz` is the deployment username and password.
- The right path to send is `site/wwwroot/hello`, where `hello` is the name of the function.
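Putting those notes together, a minimal deploy sketch looks like this (`ugh`, `hello`, and `abc:xyz` are placeholders for the function app name, function name, and deployment credentials):

$ APP=ugh FUNC=hello CREDS=abc:xyz
$ zip "$FUNC.zip" index.js
$ curl -X PUT --data-binary "@$FUNC.zip" "https://$CREDS@$APP.scm.azurewebsites.net/api/zip/site/wwwroot/$FUNC"

Re-running the last two commands is the whole update cycle; the zip API unpacks the archive into the target folder.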
You can set the username and password in the Azure portal UI (documented here), but you can also set them with the Azure CLI like so:
$ az login
$ az webapp deployment user set --user-name abc --password xyz
And that was the trick: once you've set up the deployment credentials, you can use them as basic auth to call the Kudu API.
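As an aside, if you'd rather not set account-level deployment credentials, the app's own publishing credentials should work too. Something like this (an untested sketch, with `ugh` and `my-rg` as placeholder app and resource group names) pulls the SCM URL with the credentials already embedded and deploys in one go:

$ SCM_URI=$(az webapp deployment list-publishing-credentials --name ugh --resource-group my-rg --query scmUri --output tsv)
$ curl -X PUT --data-binary @index.js.zip "$SCM_URI/api/zip/site/wwwroot/hello"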