Commit a2210e1

Merge pull request #4 from Microsoft/Development
Merging up to master
2 parents: ab7e1aa + 0caebe1


5 files changed: +164 -0 lines changed

Lines changed: 145 additions & 0 deletions

Explore a basic Windows application that uses the Emotion API to recognize the emotions expressed by the faces in an image. The example lets you submit an image URL or a locally stored file. You can use this open-source example as a template for building your own Windows app with the Emotion API and WPF (Windows Presentation Foundation), part of the .NET Framework.

## Table of Contents

[Prerequisites](#Prerequisites)

[Step 1: Open the example](#Step1)

[Step 2: Build the example](#Step2)

[Step 3: Run the example](#Step3)

[Review and Learn](#Review)

[Related Topics](#Related)
## <a name="Requisites">Prerequisites </a>

#### * Platform requirements

This example was developed for the .NET Framework using [Visual Studio 2015, Community Edition](https://www.visualstudio.com/products/visual-studio-community-vs).

#### * Subscribe to the Emotion API and get a subscription key

Before building the example, you must subscribe to the Emotion API, which is part of the Microsoft Project Oxford services. For subscription and key management details, see [Subscription Management](https://www.projectoxford.ai/doc/general/subscription-key-mgmt). Either the primary or the secondary key can be used in this tutorial. Make sure to follow best practices for keeping your API key secret and secure.
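
The sample app lets you paste the key into its UI at run time (see Step 3), but if you write your own code against the API, one common way to keep the key out of source control is to read it from the environment. The snippet below is only a sketch of that idea; the `EMOTION_API_KEY` variable name is illustrative and not something the sample defines.

```csharp
using System;

// Illustrative sketch only: load the subscription key from an environment
// variable so it is never hard-coded or committed to source control.
// "EMOTION_API_KEY" is a name chosen for this example, not used by the sample.
static class SubscriptionKeyLoader
{
    public static string Load()
    {
        string key = Environment.GetEnvironmentVariable("EMOTION_API_KEY");
        if (string.IsNullOrWhiteSpace(key))
        {
            throw new InvalidOperationException(
                "Set the EMOTION_API_KEY environment variable before running the app.");
        }
        return key;
    }
}
```

You could then pass the loaded key to `EmotionServiceClient` instead of pasting it into the UI.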

#### * Get the client library and example

You can download the Emotion API client library and example from [https://www.projectoxford.ai/sdk](https://www.projectoxford.ai/sdk) or access them on [GitHub](https://github.com/Microsoft/ProjectOxford-ClientSDK/tree/master/Emotion/Windows). Extract the downloaded zip file to a folder of your choice; many users choose the Visual Studio 2015 folder.

## <a name="Step1">Step 1: Open the example</a>

1. Start Microsoft Visual Studio 2015 and click **File**, select **Open**, then **Project/Solution**.
2. Browse to the folder where you saved the downloaded Emotion API files. Click **Emotion**, then **Windows**, and then the **Sample-WPF** folder.
3. Double-click the Visual Studio 2015 Solution (.sln) file named **EmotionAPI-WPF-Samples.sln** to open the solution in Visual Studio.

## <a name="Step2">Step 2: Build the example</a>

1. In **Solution Explorer**, right-click **References** and select **Manage NuGet Packages**.

   ![Open NuGet Package Manager](./Images/EmotionNuget.png)

2. The **NuGet Package Manager** window opens. Select **Browse** in the upper-left corner, type "Newtonsoft.Json" in the search box, select the **Newtonsoft.Json** package, and click **Install** (a Package Manager Console alternative is noted after this list).

   ![Browse to NuGet Package](./Images/EmotionNugetBrowse.png)

3. Press Ctrl+Shift+B, or click **Build** on the ribbon menu and select **Build Solution**.
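
If you prefer the command line, the same package can be installed from the **Package Manager Console** (under **Tools**, **NuGet Package Manager**) by running `Install-Package Newtonsoft.Json`; the result is equivalent to installing the package in step 2 of this list.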

## <a name="Step3">Step 3: Run the example</a>

1. After the build is complete, press **F5** or click **Start** on the ribbon menu to run the example.

2. Locate the Emotion API window with the text box reading "**Paste your subscription key here to start**" and paste your subscription key into it, as shown in the screenshot below. You can persist your subscription key on your PC or laptop by clicking the **Save Key** button. To remove the subscription key from the system, click **Delete Key**.

   ![Emotion Functionality Interface](./Images/EmotionKey.png)

3. Under **Select Scenario**, click either of the two scenarios, "**Detect emotion using a stream**" or "**Detect emotion using a URL**", and follow the instructions on the screen. Microsoft receives the images you upload and may use them to improve the Emotion API and related services. By submitting an image, you confirm that you have followed our [Developer Code of Conduct](http://research.microsoft.com/en-us/UM/legal/ProjectOxford_CodeOfConduct.htm).

4. Example images for use with this application are available on [GitHub](https://github.com/Microsoft/ProjectOxford-ClientSDK/tree/master/Face/Windows/Data) under **Face**, **Windows**, in the **Data** folder. Note that these images are licensed for fair use: you may use them to test this example, but not republish them.

## <a name="Review">Review and Learn</a>

Now that you have a running application, let's review how this example app integrates with Project Oxford technology. This will make it easier to continue building onto this app or to develop your own app using Project Oxford's Emotion API.

This example app uses the Emotion API Client Library, a thin C# client wrapper for the Project Oxford Emotion API. When you built the example app as described above, you got the Client Library from a NuGet package. You can review the Client Library source code in the "[Client Library](https://github.com/Microsoft/ProjectOxford-ClientSDK/tree/master/Emotion/Windows/ClientLibrary)" folder under **Emotion**, **Windows**, **Client Library**, which is part of the downloaded repository mentioned above in [Prerequisites](#Prerequisites).

You can also see how the Client Library code is used in **Solution Explorer**: under **EmotionAPI-WPF_Samples**, expand **DetectEmotionUsingStreamPage.xaml** to locate **DetectEmotionUsingStreamPage.xaml.cs**, which is used to browse to a locally stored file, or expand **DetectEmotionUsingURLPage.xaml** to find **DetectEmotionUsingURLPage.xaml.cs**, which is used when submitting an image URL. Double-click the .xaml.cs files to open them in new windows in Visual Studio.

To see how the Emotion Client Library is used in the example app, let's look at two code snippets, one from **DetectEmotionUsingURLPage.xaml.cs** and one from **DetectEmotionUsingStreamPage.xaml.cs**. Each file contains code comments reading "KEY SAMPLE CODE STARTS HERE" and "KEY SAMPLE CODE ENDS HERE" to help you locate the snippets reproduced below.

The Emotion API accepts either an image URL or binary image data (in the form of an octet stream) as input. Both options are reviewed below. In both cases, you first need a using directive that lets you use the Emotion Client Library.

```csharp
// -----------------------------------------------------------------------
// KEY SAMPLE CODE STARTS HERE
// Use the following namespace for EmotionServiceClient
// -----------------------------------------------------------------------
using Microsoft.ProjectOxford.Emotion;
using Microsoft.ProjectOxford.Emotion.Contract;
// -----------------------------------------------------------------------
// KEY SAMPLE CODE ENDS HERE
// -----------------------------------------------------------------------
```

#### DetectEmotionUsingURLPage.xaml.cs

This code snippet shows how to use the Client Library to submit your subscription key and a photo URL to the Emotion API service.

```csharp
// -----------------------------------------------------------------------
// KEY SAMPLE CODE STARTS HERE
// -----------------------------------------------------------------------

window.Log("EmotionServiceClient is created");

//
// Create Project Oxford Emotion API Service client
//
EmotionServiceClient emotionServiceClient = new EmotionServiceClient(subscriptionKey);

window.Log("Calling EmotionServiceClient.RecognizeAsync()...");
try
{
    //
    // Detect the emotions in the URL
    //
    Emotion[] emotionResult = await emotionServiceClient.RecognizeAsync(url);
    return emotionResult;
}
catch (Exception exception)
{
    window.Log("Detection failed. Please make sure that you have the right subscription key and proper URL to detect.");
    window.Log(exception.ToString());
    return null;
}
// -----------------------------------------------------------------------
// KEY SAMPLE CODE ENDS HERE
// -----------------------------------------------------------------------
```
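
`RecognizeAsync` returns one `Emotion` element per face detected in the image. The snippet below is a rough sketch, not code from the sample, of how you might log the top-ranked emotion for each face; it assumes the `Emotion`, `Scores`, and `FaceRectangle` types from `Microsoft.ProjectOxford.Emotion.Contract` shown in the using directives above, and that `Scores.ToRankedList()` returns the per-emotion scores in descending order. If that helper is not available in your version of the Client Library, you can rank the individual score properties yourself.

```csharp
using System;
using System.Linq;
using Microsoft.ProjectOxford.Emotion.Contract;

// Illustrative sketch only: report the strongest emotion for each detected face.
static class EmotionResultLogger
{
    public static void LogTopEmotions(Emotion[] emotionResult)
    {
        if (emotionResult == null) return;

        foreach (Emotion emotion in emotionResult)
        {
            // ToRankedList() is assumed to order the emotion scores from highest to lowest.
            var top = emotion.Scores.ToRankedList().First();
            Console.WriteLine(
                "Face at ({0},{1}): {2} ({3:P1})",
                emotion.FaceRectangle.Left,
                emotion.FaceRectangle.Top,
                top.Key,
                top.Value);
        }
    }
}
```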

#### DetectEmotionUsingStreamPage.xaml.cs

Shown below is how to submit your subscription key and a locally stored image to the Emotion API.

```csharp
// -----------------------------------------------------------------------
// KEY SAMPLE CODE STARTS HERE
// -----------------------------------------------------------------------

//
// Create Project Oxford Emotion API Service client
//
EmotionServiceClient emotionServiceClient = new EmotionServiceClient(subscriptionKey);

window.Log("Calling EmotionServiceClient.RecognizeAsync()...");
try
{
    Emotion[] emotionResult;
    using (Stream imageFileStream = File.OpenRead(imageFilePath))
    {
        //
        // Detect the emotions in the image
        //
        emotionResult = await emotionServiceClient.RecognizeAsync(imageFileStream);
        return emotionResult;
    }
}
catch (Exception exception)
{
    window.Log(exception.ToString());
    return null;
}
// -----------------------------------------------------------------------
// KEY SAMPLE CODE ENDS HERE
// -----------------------------------------------------------------------
```
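
Both snippets use `await`, so they must run inside an `async` method; in the sample, the surrounding page code handles this for you. As a minimal, self-contained sketch of the same pattern (the class and method names below are placeholders, not part of the sample), the stream-based call can be wrapped like this:

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.ProjectOxford.Emotion;
using Microsoft.ProjectOxford.Emotion.Contract;

// Illustrative sketch only: a minimal async wrapper around the stream-based call.
static class EmotionDetector
{
    public static async Task<Emotion[]> DetectFromFileAsync(string subscriptionKey, string imageFilePath)
    {
        var client = new EmotionServiceClient(subscriptionKey);
        using (Stream imageFileStream = File.OpenRead(imageFilePath))
        {
            // RecognizeAsync also accepts an image URL string, as in the previous snippet.
            return await client.RecognizeAsync(imageFileStream);
        }
    }
}
```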

## <a name="Related">Related Topics</a>

[Emotion API Overview](Emotion%20API%20Overview.xml)

Content/en-us/Emotion/Home.md

Lines changed: 19 additions & 0 deletions

Welcome to the Microsoft Project Oxford Emotion API, which allows you to build more personalized apps with Microsoft's cutting-edge, cloud-based emotion-recognition algorithm.

#### Emotion Detection

The Emotion API beta takes an image as input and returns the confidence across a set of emotions for each face in the image, as well as a bounding box for the face, from the Face API. The emotions detected are happiness, sadness, surprise, anger, fear, contempt, disgust, and neutral. These emotions are communicated cross-culturally and universally via the same basic facial expressions, which are identified by the Emotion API.

When interpreting results from the Emotion API, the detected emotion should be taken to be the one with the highest score, since scores are normalized to sum to one. Users may choose to set a higher confidence threshold within their application, depending on their needs. If you have already called the Face API, you can submit the face rectangle as an optional input.
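
As a small illustration of that interpretation rule (this is not code from the service or the sample; it works on a plain dictionary of scores purely for clarity), picking the detected emotion and applying a caller-chosen confidence threshold might look like this:

```csharp
using System.Collections.Generic;
using System.Linq;

// Illustrative sketch only: interpret a set of normalized emotion scores.
static class ScoreInterpreter
{
    // Scores are normalized to sum to one, so the detected emotion is simply
    // the entry with the highest score. The threshold is the caller's choice.
    public static string Interpret(IReadOnlyDictionary<string, float> scores, float minimumConfidence = 0.5f)
    {
        var top = scores.OrderByDescending(pair => pair.Value).First();
        return top.Value >= minimumConfidence ? top.Key : "uncertain";
    }
}
```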

For more details about emotion detection, please refer to the [API Reference](https://dev.projectoxford.ai/docs/services/5639d931ca73072154c1ce89).

Three image files were also added in this commit (26 KB, 17 KB, and 62.5 KB).
