Xam.Plugins.OnDeviceCustomVision 0.1.1-alpha

The Azure Custom Vision service (https://customvision.ai/) is able to create models that can be exported as CoreML or Tensorflow models to do image classification.

This plugin makes it easy to download and use these models offline from inside your mobile app, using CoreML on iOS or Tensorflow on Android. These models can then be called from a .NET Standard library or PCL, using something like Xam.Plugins.Media to take photos for classification.

This is a prerelease version of Xam.Plugins.OnDeviceCustomVision.
There is a newer version of this package available.
See the version list below for details.

Requires NuGet 2.8.1 or higher.

Install-Package Xam.Plugins.OnDeviceCustomVision -Version 0.1.1-alpha

dotnet add package Xam.Plugins.OnDeviceCustomVision --version 0.1.1-alpha

<PackageReference Include="Xam.Plugins.OnDeviceCustomVision" Version="0.1.1-alpha" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.

paket add Xam.Plugins.OnDeviceCustomVision --version 0.1.1-alpha

Xam.Plugins.OnDeviceCustomVision

The Azure Custom Vision service is able to create models that can be exported as CoreML or Tensorflow models to do image classification.

This plugin makes it easy to download and use these models offline from inside your mobile app, using CoreML on iOS or Tensorflow on Android. These models can then be called from a .NET Standard library or PCL, using something like Xam.Plugins.Media to take photos for classification.

Setup
  • Available on NuGet: https://www.nuget.org/packages/Xam.Plugins.OnDeviceCustomVision/
  • Install into your PCL/.NET Standard project and iOS and Android client projects.

Platform Support

|Platform|Version|
| ------------------- | :------------------: |
|Xamarin.iOS|iOS 11+|
|Xamarin.Android|API 21+|

Usage

Before you can use this API, you need to initialise it with the model file downloaded from CustomVision.

iOS

Download the Core ML model from Custom Vision. Compile it using:

xcrun coremlcompiler compile <model file name>.mlmodel <model name>.mlmodelc

This will spit out a folder called <model name>.mlmodelc containing a number of files. Add this entire folder to the Resources folder in your iOS app. Once this has been added, add a call to Init to your app delegate, passing in the name of your compiled model (i.e. the folder name without the .mlmodelc extension):

public override bool FinishedLaunching(UIApplication uiApplication, NSDictionary launchOptions)
{
   ...
   CrossImageClassifier.Current.Init("<model name>");
   return base.FinishedLaunching(uiApplication, launchOptions);
}
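
For example, if the compiled folder you added to Resources were called FruitClassifier.mlmodelc (a hypothetical name used here purely for illustration), the Init call would be:

// "FruitClassifier" is a hypothetical model name - pass your folder name without .mlmodelc
CrossImageClassifier.Current.Init("FruitClassifier");
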
Android

Download the Tensorflow model from Custom Vision. This will be a folder containing two files:

  • labels.txt
  • model.pb

Add both of these files to the Assets folder in your Android app. Once they are added, add a call to Init to your main activity, passing in the name of the model file:

protected override void OnCreate(Bundle savedInstanceState)
{
   ...
   CrossImageClassifier.Current.Init("model.pb");
}

Note - the labels file must be present and called labels.txt.

Calling this from your PCL/.NET Standard library

To classify an image, call:

var tags = await CrossImageClassifier.Current.ClassifyImage(stream);

Pass the image in as a stream. You can use a library like Xam.Plugins.Media to get an image as a stream from the camera or image library.

This will return a list of ImageClassification instances, one per tag in the model, each with the probability that the image matches that tag. Probabilities are doubles in the range 0 to 1, with 1 being 100% probability that the image matches the tag. To find the most likely classification, use:

tags.OrderByDescending(t => t.Probability)
    .First().Tag;
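
Putting it together, here is a minimal sketch of taking a photo and classifying it. It assumes the Plugin.Media (Xam.Plugins.Media) camera API and that the classifier types live in the Xam.Plugins.OnDeviceCustomVision namespace; the using directives, class name and photo options below are illustrative assumptions, not taken from the package documentation.

using System.Linq;
using System.Threading.Tasks;
using Plugin.Media;                        // Xam.Plugins.Media (assumed)
using Plugin.Media.Abstractions;
using Xam.Plugins.OnDeviceCustomVision;    // assumed namespace for CrossImageClassifier

public class PhotoClassifier
{
    // Takes a photo and returns the most likely tag, or null if no photo was taken.
    // Init must already have been called in each platform project (see above).
    public async Task<string> ClassifyPhotoAsync()
    {
        var photo = await CrossMedia.Current.TakePhotoAsync(
            new StoreCameraMediaOptions { PhotoSize = PhotoSize.Medium });

        if (photo == null)
            return null;

        using (var stream = photo.GetStream())
        {
            var tags = await CrossImageClassifier.Current.ClassifyImage(stream);

            // Highest-probability tag wins
            return tags.OrderByDescending(t => t.Probability)
                       .First().Tag;
        }
    }
}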

Dependencies

This package has no dependencies.

The top GitHub repository that depends on Xam.Plugins.OnDeviceCustomVision:

  • microsoft/ConferenceVision: Sample Xamarin.Forms phone app shown at Microsoft Build 2018

Version History

|Version|Downloads|Last updated|
| ------------------- | :------------------: | :------------------: |
|2.1.1|226|6/10/2019|
|2.1.0-alpha|83|6/6/2019|
|2.0.0|671|7/18/2018|
|1.0.0|3,187|2/26/2018|
|0.1.5-alpha|301|1/24/2018|
|0.1.1-alpha|297|1/9/2018|
|0.1.0-alpha|375|1/9/2018|