Cognitive Services with PowerApps using Custom Connectors
Microsoft Cognitive Services are a set of APIs, SDKs and services available to developers to make their applications more intelligent, engaging and discoverable. This blog is an attempt to share an approach for PowerApps makers to use these Cognitive Services using custom connectors.
There is an out-of-the-box Computer Vision API connector; however, the API uses dynamic types, which are not yet supported in PowerApps. There is also an out-of-the-box Face API connector, but it isn’t very useful in enterprise scenarios yet: it is an API-key-based connector, so when a PowerApps app created with it is shared with other users, each user has to bring their own API key to use the app.
About this blog
In the approach described in this blog, the Cognitive Services APIs are called from an Azure API App, using the client libraries, which are simply C# wrappers for the REST APIs. We shall be using the Microsoft.ProjectOxford.Face and Microsoft.ProjectOxford.Vision NuGet packages in our API App. I have shared the complete source code for the API App for download, as well as the PowerApps app file to use and test the APIs. I have also combined the code for uploading an image to Azure Blob storage in this new API App – ImageUtilities – so that one can use all these functions together in an app if needed.
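To give a flavour of what the client libraries look like, here is a minimal sketch of an analyze call through the Microsoft.ProjectOxford.Vision package. The key and endpoint values are placeholders you would supply yourself, and the actual controller code in the download differs; the two-argument constructor that accepts an endpoint is available in recent versions of the package.

```csharp
// Sketch: calling the Vision API through the Microsoft.ProjectOxford.Vision
// client library, roughly as the sample's AnalyzeController does.
using System.IO;
using System.Threading.Tasks;
using Microsoft.ProjectOxford.Vision;
using Microsoft.ProjectOxford.Vision.Contract;

public class VisionSample
{
    public async Task<AnalysisResult> AnalyzeAsync(Stream imageStream)
    {
        // Placeholder key/endpoint - in the sample these come from Web.config
        var client = new VisionServiceClient(
            "<your-vision-api-key>",
            "https://westus.api.cognitive.microsoft.com/vision/v1.0");

        // Ask for a description, tags and face rectangles in a single call
        var features = new[] { VisualFeature.Description, VisualFeature.Tags, VisualFeature.Faces };
        return await client.AnalyzeImageAsync(imageStream, features);
    }
}
```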
First we shall create the necessary Azure Services (App Service, Blob Storage, Computer Vision API & Face API), then publish the API App to Azure.
Then, we shall register a new custom connector in PowerApps using the swagger definition from the API App.
Finally, we shall create the PowerApps app using this new custom connector.
In PowerApps, when a maker shares an app created using a custom connector, the connector is automatically shared and hence other users can use the app (without needing to type in an API key). In the blog below, we shall also protect our API app endpoint with AAD and also register an app in Azure AD for the custom connector, so that other users can use their own AAD credentials to authenticate and get access to the APIs.
Let’s get started.
Here are the steps:
- Create the API App project
- Publish the API to Azure
- Protect the API with Azure AD authentication (optional but highly recommended)
- Register another app in Azure AD for the custom connector (optional but highly recommended)
- Register a new custom connector in PowerApps
- Create a PowerApps App to use the connector
- Save, Publish and Test the App
Create the API App project
I have chosen an API App for the custom connector over Azure Functions primarily because API Apps can generate the Swagger document for use with PowerApps via the Swashbuckle NuGet package. This auto-generates the definitions for all the referenced classes, such as AnalysisResult, used by the APIs in the Swagger document. Since I have used the Swashbuckle feature to include XML comments from my assembly to generate Summary and Description tags in the Swagger document, it becomes quite handy both for the API registration and for providing IntelliSense when authoring the functions from the PowerApps app.
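As a rough sketch of the Swashbuckle configuration involved (the exact file in the download may differ, and the XML file path assumes the project's XML documentation output is enabled and emitted as ImageUtilities.xml):

```csharp
// Sketch of the relevant setup in App_Start/SwaggerConfig.cs
using System.Web.Http;
using Swashbuckle.Application;

public class SwaggerConfig
{
    public static void Register()
    {
        GlobalConfiguration.Configuration
            .EnableSwagger(c =>
            {
                c.SingleApiVersion("v1", "ImageUtilities");
                // Pull the <summary>/<remarks> tags from the build-time XML doc file
                c.IncludeXmlComments(GetXmlCommentsPath());
            })
            .EnableSwaggerUi();
    }

    private static string GetXmlCommentsPath()
    {
        // Assumes "XML documentation file" is enabled in the project's Build settings
        return string.Format(@"{0}\bin\ImageUtilities.xml",
            System.AppDomain.CurrentDomain.BaseDirectory);
    }
}
```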
You will need the following Azure Services setup before you can complete this step:
- Azure Subscription (you can set up a free Azure account with a $200 credit to use on any Azure product for 30 days)
- Azure Blob Storage
- Create blob storage container named “images” to store the Images that you may upload
- Create a Computer Vision API service and obtain subscription keys (sign-up is free). Note down the subscription key and the endpoint address – e.g. https://westus.api.cognitive.microsoft.com/vision/v1.0/
- Create a Face API service and obtain subscription keys (sign-up is free). Note down the subscription key and the endpoint address – e.g. https://westus.api.cognitive.microsoft.com/face/v1.0/
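If you prefer to create the “images” container from code rather than through the portal, a sketch using the WindowsAzure.Storage NuGet package might look like this (the connection string shown is a placeholder, in the same format as the one used in Web.config later):

```csharp
// Sketch: create the "images" blob container used by this sample if it
// does not exist yet.
using Microsoft.WindowsAzure.Storage;

var account = CloudStorageAccount.Parse(
    "DefaultEndpointsProtocol=https;AccountName=myAccount;AccountKey=myKey;");
var blobClient = account.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("images");
container.CreateIfNotExists();
```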
Download the code for the API App from this location: https://aka.ms/ImageUtilitiesCode
If you haven’t created an API App for use with PowerApps before, I recommend reading this tutorial to familiarize yourself with the basic concepts. I have used Visual Studio 2017 for the code in this app. Extract the contents of the zip file and open ImageUtilities.sln to launch the project in Visual Studio. The following controllers have been added to correspond to the respective APIs from Azure Cognitive Services:
- AnalyzeController –> Analyze Image
- DescribeController –> Describe Image
- HandwrittenTextController –> Recognize Handwritten Text
- OCRController –> OCR
- ImageTagsController –> Tag Image
- ThumbnailController –> Get Thumbnail
Open the Web.config file and replace the highlighted values with the corresponding values from your Azure services.
For StorageConnectionString, use the format DefaultEndpointsProtocol=https;AccountName=myAccount;AccountKey=myKey;
Use the API keys for the Computer Vision and Face APIs that you noted down earlier, and also populate the root endpoints for both services.
<appSettings>
  <!--<add key="StorageConnectionString" value="UseDevelopmentStorage=true" />-->
  <add key="StorageConnectionString" value="DummyValue" />
  <add key="VisionAPIKey" value="DummyValue" />
  <add key="FaceAPIKey" value="DummyValue" />
  <add key="VisionAPIRoot" value="https://westus.api.cognitive.microsoft.com/vision/v1.0"/>
  <add key="FaceAPIRoot" value="https://westus.api.cognitive.microsoft.com/face/v1.0"/>
</appSettings>
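For context, a sketch of how these settings might be read at run time to construct the client objects (the ClientFactory class and its method names are illustrative, not the names used in the download; the two-argument FaceServiceClient constructor is available in recent versions of the package):

```csharp
// Sketch: reading the appSettings above and building the Cognitive Services clients
using System.Configuration;
using Microsoft.ProjectOxford.Face;
using Microsoft.ProjectOxford.Vision;

public static class ClientFactory
{
    public static VisionServiceClient CreateVisionClient()
    {
        return new VisionServiceClient(
            ConfigurationManager.AppSettings["VisionAPIKey"],
            ConfigurationManager.AppSettings["VisionAPIRoot"]);
    }

    public static FaceServiceClient CreateFaceClient()
    {
        return new FaceServiceClient(
            ConfigurationManager.AppSettings["FaceAPIKey"],
            ConfigurationManager.AppSettings["FaceAPIRoot"]);
    }
}
```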
Compile and test the APIs locally using the Swagger UI (press CTRL+F5 and navigate to http://localhost:64279/swagger).
Publish the API to Azure
Once satisfied, publish the app to Azure. Refer to this tutorial on publishing to Azure with Visual Studio. Test your API endpoint by browsing to https://<yoursitename>.azurewebsites.net/swagger/.
Download the Swagger document: browse to the Swagger URL shown in the Swagger UI of your API (https://<yoursitename>.azurewebsites.net/swagger/docs/v1) in IE, save the JSON file to disk, and name it ImageUtilitiesSwaggerv1.json.
Protect the API with Azure AD Authentication
Follow the steps mentioned in this article to secure your API with Azure Active Directory. If your Azure Subscription is in a different directory than your organizational Azure AD (as mine was), follow the steps in the Alternative Method, but make sure you do the app registration in your organization’s Active Directory (tip: you can open the AAD-specific portal at https://aad.portal.azure.com in another browser tab or window). I have registered my app as ImageUtilitiesAPI in my Active Directory.
Register another app in Azure AD for the PowerApps custom connector
We now need to register a second application in AAD for the PowerApps custom connector. This application secures the custom connector registration itself and acquires delegated access to the API protected by the first application (ImageUtilitiesAPI), so that other users within the organization can authenticate against the connector with their own AAD credentials. I have registered this app as ImageUtilities_PAConnector in my Azure Active Directory.
- Sign-on URL:
- Reply URL:
- Add permissions to have delegated access to ImageUtilitiesAPI.
- Note down the Application ID of this application; you will need it later.
- Generate a client key and store it somewhere safe; you will need this key later.
Register a new Custom Connector in PowerApps
Browse to the PowerApps portal, and add a custom connector as described in this article: Register and use custom connectors. Note down the Environment name where you are creating the Custom connector.
Once you have uploaded the OpenAPI (Swagger) file, the wizard auto-detects that you are using AAD authentication for your web API. (If you are curious how: a security definition has already been added in the SwaggerConfig file itself, using the c.OAuth2("oauth2") method.)
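A sketch of what that security definition might look like inside the EnableSwagger lambda in SwaggerConfig.cs (the description text, authorization URL and scope name here are illustrative assumptions for an AAD-protected API, not necessarily the exact values in the download):

```csharp
// Fragment: OAuth2 security definition registered with Swashbuckle so the
// generated Swagger document advertises AAD (OAuth2 implicit) authentication
c.OAuth2("oauth2")
    .Description("OAuth2 Implicit Grant against Azure AD")
    .Flow("implicit")
    .AuthorizationUrl("https://login.windows.net/common/oauth2/authorize")
    .Scopes(scopes => scopes.Add("user_impersonation",
        "Access the API as the signed-in user"));
```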
Configure the AAD authentication for the custom connector.
- Client ID: Client ID of ImageUtilities_PAConnector
- Secret: Client key of ImageUtilities_PAConnector
- Login URL: https://login.windows.net
- ResourceUri: Application ID of ImageUtilitiesAPI
You should be able to see all the actions and references imported from the Swagger file in the Definition section, as per the screenshot below.
Click Create Connector to create the custom connector.
Create a PowerApps App to use the custom connector
Open the PowerApps Portal. Make sure that you have the same Environment selected as the one in which you created the Custom connector in the previous step.
Click on Apps from the left nav bar.
Download the PowerApps app resources from this location: https://aka.ms/CognitiveServicesPowerAppsDemo. Right-click the downloaded zip file (on Windows), select Properties, and check the Unblock checkbox to unblock the contents before extracting the zip file.
Then click on Create an app from the top right. This navigates you to the PowerApps web-authoring site. Click on Open from the left nav, then click on Browse.
Then select the CognitiveDemos.msapp file that you extracted from the downloaded zip file. This will load the PowerApps app in edit mode.
Click on View > DataSources
Click on + Add data source
Click on + New connection, and select ImageUtilitiesAPI (the custom connector you created earlier). Authenticate with your AAD credentials to add it to your application. In case you are wondering about the other two connections already present in the app: they are static data sources imported from Excel. The source file, AppData.xlsx, is shared along with the PowerApps app file.
Save, Publish and Test the app
Click on File > Save > The cloud (Save to PowerApps), give your app a unique name, and click Save. After every subsequent save, you will have to publish your app.
Let’s run the app in Preview mode by pressing F5 or clicking the Preview the app button. There are 6 options to choose from for the API you want to call and 4 choices for selecting an image.
Select Analyze Image and Select from sample images. This displays a gallery of sample images below for you to choose from. Select any one from the gallery and click Next to display the results.
There are a lot of hidden nuggets in the app for you to discover and enjoy! Share the app with your colleagues by clicking the Share button.
Please share any feedback in the comments section below.