This repository was archived by the owner on Jan 3, 2023. It is now read-only.
@@ -38,8 +38,7 @@ To view the license for cuDNN included in the cuda base image, click [here](http
## Contents
1. [Repo Layout](#repo-layout)
-2. [Quickstart](#Quickstart)
-3. [Quickstart Tutorial](#Quickstart-Tutorial)
+2. [Quickstart Tutorial](#Quickstart-Tutorial)
1. [Choose a base image or example](#Choose-a-base-image-or-example)
2. [Insert code to call your model](#Insert-code-to-call-your-model)
3. [Input handling](#Input-handling)
@@ -53,7 +52,7 @@ To view the license for cuDNN included in the cuda base image, click [here](http
11. [Publish to Azure Container Registry](#Publish-to-Azure-Container-Registry)
12. [Run your container in ACI](#Run-your-container-in-ACI)
13. [FAQs](#FAQs)
-4. [Contributing](#Contributing)
+3. [Contributing](#Contributing)
## Repo Layout
- Containers
@@ -118,18 +117,19 @@ AI for Earth APIs are all built from an AI for Earth base image. You may use a
In general, if you're using Python, you will want to use an image or example built on the base-py or blob-py images. If you are using R, you will want one built on the base-r or blob-r images. The difference between them: a blob-* image contains everything that the corresponding base-* image contains, plus additional support for mounting [Azure blob storage](https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blobs-introduction). This may be useful if you need to process, for example, a batch of images all at once: you can upload them all to Azure blob storage, and the container in which your model is running can mount that storage and access it as if it were local storage.
## Asynchronous (async) vs. Synchronous (sync) Endpoint
-In addition to your language choice, you should think about whether your API call should be synchronous or asynchronous. A synchronous API call will invoke your model, get results, and return immediately. This is a good paradigm to use if you want to perform classification with your model on a single image, for example. An asynchronous API call should be used for long-running tasks, like processing a whole folder of images, performing object detection on each image with your model, and storing the results.
+In addition to your language choice, think about whether your API call should be synchronous or asynchronous.
+- A synchronous API call will invoke your model, get results, and return immediately. This is a good paradigm to use if you want to perform classification with your model on a single image, for example.
+- An asynchronous API call should be used for long-running tasks, like processing a whole folder of images using your model and storing the results, or constructing a forecasting model from historical data that the user provides.
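The contrast can be sketched in plain Python. This is an illustrative in-process sketch only; names like `process_async` and `task_status` are hypothetical and are not part of the AI for Earth tools:

```python
import threading
import time
import uuid

# In-memory task store; a real service would use the task tracking tools
# described below, not a module-level dict.
_results = {}

def classify_sync(image_bytes):
    # Synchronous style: the caller blocks until the model returns a result.
    time.sleep(0.01)  # stand-in for model inference
    return {"label": "example", "size": len(image_bytes)}

def process_async(image_batch):
    # Asynchronous style: return a taskId immediately and do the work
    # on a background thread.
    task_id = str(uuid.uuid4())
    _results[task_id] = {"status": "running"}

    def worker():
        outputs = [classify_sync(img) for img in image_batch]
        _results[task_id] = {"status": "completed", "outputs": outputs}

    threading.Thread(target=worker, daemon=True).start()
    return task_id

def task_status(task_id):
    # Callers poll this with the taskId they received from process_async.
    return _results.get(task_id, {"status": "not found"})
```

Here the caller of `process_async` gets a taskId back at once and checks `task_status` later, while `classify_sync` does not return until the result exists.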
### Asynchronous Implementation Examples
The following examples demonstrate async endpoints:
@@ -141,18 +141,18 @@ While input patterns can be used for sync or async designs, your output design i
#### Binary Input
Many applications of AI apply models to image/binary inputs. Here are some approaches:
-- Send the image directly via request data. See the [tensorflow](./examples/tensorflow/tf_iNat_api/runserver.py) example to see how it is accomplished.
+- Send the image directly via request data. See the [tensorflow](./Examples/tensorflow/tf_iNat_api/runserver.py) example to see how it is accomplished.
- Upload your binary input to an Azure Blob, create a [SAS key](https://docs.microsoft.com/en-us/azure/storage/common/storage-dotnet-shared-access-signature-part-1), and add a JSON field for it.
- If you would like users to use your own Azure blob storage, we provide tools to [mount blobs as local drives](https://github.com/Azure/azure-storage-fuse) within your service. You may then use this virtual file system locally.
- Serializing your payload is a very efficient method for transmission. [BSON](http://bsonspec.org/) is an open standard, binary-encoded serialization for such purposes.
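From the caller's side, the first two approaches might look like the sketch below. The endpoint URL, JSON field name, and SAS URL are all hypothetical placeholders, not values defined by this repository:

```python
import json
import urllib.request

# Hypothetical endpoint for illustration only.
API_URL = "http://localhost:8081/v1/my_api/classify"

# Approach 1: send the image bytes directly as the request body.
image_bytes = b"\x89PNG fake image bytes"  # stand-in for real image data
direct_req = urllib.request.Request(
    API_URL,
    data=image_bytes,
    headers={"Content-Type": "application/octet-stream"},
    method="POST",
)

# Approach 2: upload the input to blob storage first, then POST a small
# JSON body containing a SAS-keyed URL to the uploaded blob.
sas_payload = {"input_sas_url":
               "https://myaccount.blob.core.windows.net/inputs/img.png?sv=<sas-token>"}
sas_req = urllib.request.Request(
    API_URL,
    data=json.dumps(sas_payload).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
```

Neither request is sent here; the snippet only shows the two payload shapes. Approach 2 keeps request bodies small, at the cost of an extra upload step.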
### Asynchronous Pattern
-The preferred way of handling asynchronous API calls is to provide a task status endpoint to your users. When a request is submitted, a new taskId is immediately returned to the caller to track the status of their request as it is processed.
+The preferred way of handling asynchronous API calls is to provide a task status endpoint to your users. When a request is submitted, a new `taskId` is immediately returned to the caller to track the status of their request as it is processed.
We have several tools to help with task tracking that you can use for local development and testing. These tools create a database within the service instance and are not recommended for production use.
Once a task is completed, the user needs to retrieve the result of their service call. This can be accomplished in several ways:
-- Return a SAS-keyed URL to an Azure Blob Container via a call to the task endpoint.
+- Return a SAS-keyed URL to an Azure Blob Container via a call to the `task` endpoint.
- Request that a writable SAS-keyed URL is provided as input to your API call. Indicate completion via the task interface and write the output to that URL.
- If you would like users to use your own Azure blob storage, you can write directly to a virtually-mounted drive.
@@ -291,7 +291,7 @@ Each decorator contains the following parameters:
-```maximum_concurrent_requests = 5```: If the number of requests exceeds this limit, a 503 is returned to the caller.
-```content_types = ['application/json']```: An array of accepted content types. If the requested type is not found in the array, a 503 will be returned.
-```content_max_length = 1000```: The maximum length of the request data (in bytes) permitted. If the length of the data exceeds this setting, a 503 will be returned.
-```trace_name = 'post:my_long_running_funct'```: A trace name to associate with this function. This allows you to search logs and metrics for this particular function.
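To illustrate what these limits do, here is a plain-Python sketch of a request guard with the same parameters. This is not the actual AI for Earth decorator; the `request_guard` name and the dict-shaped request object are hypothetical:

```python
import functools
import threading

def request_guard(maximum_concurrent_requests=5,
                  content_types=('application/json',),
                  content_max_length=1000):
    # Returns (503, reason) when a limit is exceeded, otherwise
    # (200, result) from the wrapped function.
    semaphore = threading.BoundedSemaphore(maximum_concurrent_requests)

    def decorator(func):
        @functools.wraps(func)
        def wrapper(request, *args, **kwargs):
            if request["content_type"] not in content_types:
                return 503, "unsupported content type"
            if len(request["data"]) > content_max_length:
                return 503, "request data too large"
            if not semaphore.acquire(blocking=False):
                return 503, "too many concurrent requests"
            try:
                return 200, func(request, *args, **kwargs)
            finally:
                semaphore.release()
        return wrapper
    return decorator

@request_guard(maximum_concurrent_requests=2, content_max_length=16)
def my_long_running_funct(request):
    return {"echo": request["data"].decode()}
```

The semaphore caps in-flight calls, while the content-type and length checks reject bad requests before any model work is done.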
## Create AppInsights instrumentation keys
[Application Insights](https://docs.microsoft.com/en-us/azure/application-insights/app-insights-overview) is an Azure service for application performance management. We have integrated with Application Insights to provide advanced monitoring capabilities. You will need to generate both an instrumentation key and an API key to use in your application.
@@ -324,7 +324,6 @@ Now, let's look at the Dockerfile in your code. Update the Dockerfile to instal
```Dockerfile
RUN /usr/local/envs/ai4e_py_api/bin/pip install grpcio opencensus
```
-```
- apt-get
```Dockerfile
@@ -410,7 +409,7 @@ In the above command, the -p switch designates the local port mapping to the con
```Dockerfile
EXPOSE 80
```
-TIP: Depending on your git settings and your operating system, the "docker run" command may fail with the error 'standard_init_linux.go:190: exec user process caused "no such file or directory"'. If this happens, you need to change the end-of-line characters in startup.sh to LF. One way to do this is using VS Code; open the startup.sh file and click on CRLF in the bottom right corner in the blue bar and select LF instead, then save.
+TIP: Depending on your git settings and your operating system, the `docker run` command may fail with the error `standard_init_linux.go:190: exec user process caused "no such file or directory"`. If this happens, change the end-of-line characters in startup.sh to LF. One way to do this is in VS Code: open startup.sh, click CRLF in the blue bar at the bottom right, select LF, then save.
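As an alternative to the VS Code route, a short Python snippet can perform the same CRLF-to-LF conversion (the path to startup.sh is assumed here; adjust it to your repo layout):

```python
from pathlib import Path

def convert_to_lf(path):
    # Rewrite the file with Unix (LF) line endings so the container's shell
    # can execute it. Reading and writing bytes avoids newline translation.
    data = Path(path).read_bytes()
    Path(path).write_bytes(data.replace(b"\r\n", b"\n"))
```

For example, `convert_to_lf("startup.sh")` before rebuilding the image.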
If you find that there are errors and you need to go back and rebuild your docker container, run the following commands: