GestureRecognizer
class GestureRecognizer : NSObject
@brief Performs gesture recognition on images.
This API expects a pre-trained TFLite hand gesture recognizer model or a custom one created using MediaPipe Solutions Model Maker. See https://developers.google.com/mediapipe/solutions/model_maker.
-
Creates a new instance of GestureRecognizer from an absolute path to a TensorFlow Lite model file stored locally on the device and the default GestureRecognizerOptions.

Declaration

Swift

convenience init(modelPath: String) throws

Parameters

modelPath
An absolute path to a TensorFlow Lite model file stored locally on the device.
error
An optional error parameter populated when there is an error in initializing the gesture recognizer.

Return Value

A new instance of GestureRecognizer with the given model path. nil if there is an error in initializing the gesture recognizer.
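A minimal sketch of this initializer in use, assuming a model file named `gesture_recognizer.task` bundled with the app (the file name is a placeholder) and the `MediaPipeTasksVision` module:

```swift
import MediaPipeTasksVision

// "gesture_recognizer" / "task" is a placeholder bundle resource name.
guard let modelPath = Bundle.main.path(forResource: "gesture_recognizer",
                                       ofType: "task") else {
  fatalError("Gesture recognizer model not found in the app bundle.")
}

do {
  // Uses the default GestureRecognizerOptions (image running mode).
  let gestureRecognizer = try GestureRecognizer(modelPath: modelPath)
  // ... use gestureRecognizer ...
} catch {
  print("Failed to initialize the gesture recognizer: \(error)")
}
```

The initializer is marked `throws`, so initialization failures surface as thrown Swift errors rather than a populated `error` out-parameter.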
-
Creates a new instance of GestureRecognizer from the given GestureRecognizerOptions.

Declaration

Swift

init(options: GestureRecognizerOptions) throws

Parameters

options
The options of type GestureRecognizerOptions to use for configuring the GestureRecognizer.
error
An optional error parameter populated when there is an error in initializing the gesture recognizer.

Return Value

A new instance of GestureRecognizer with the given options. nil if there is an error in initializing the gesture recognizer.
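A sketch of configuring a recognizer through options. The option properties shown (`baseOptions.modelAssetPath`, `runningMode`, `numHands`, `minHandDetectionConfidence`) are assumed from MediaPipe's published iOS API, and the model path is a placeholder:

```swift
import MediaPipeTasksVision

let options = GestureRecognizerOptions()
// Placeholder path; point this at a model file on the device.
options.baseOptions.modelAssetPath = "/path/to/gesture_recognizer.task"
options.runningMode = .image
options.numHands = 2                        // detect up to two hands
options.minHandDetectionConfidence = 0.5    // default-style threshold

do {
  let gestureRecognizer = try GestureRecognizer(options: options)
  // ... use gestureRecognizer ...
} catch {
  print("Failed to initialize the gesture recognizer: \(error)")
}
```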
-
Performs gesture recognition on the provided MPImage using the whole image as the region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the GestureRecognizer is created with the running mode .image.

This method supports performing gesture recognition on RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use kCVPixelFormatType_32BGRA as its pixel format. If your MPImage has a source type of .image, ensure that the color space is RGB with an alpha channel.

Declaration

Swift

func recognize(image: MPImage) throws -> GestureRecognizerResult

Parameters

image
The MPImage on which gesture recognition is to be performed.
error
An optional error parameter populated when there is an error in performing gesture recognition on the input image.

Return Value

A GestureRecognizerResult object that contains the hand gesture recognition results.
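For example, image-mode recognition on a `UIImage` might look like the following sketch; the `MPImage(uiImage:)` initializer and the `gestures` / `categoryName` / `score` result fields are assumed from MediaPipe's iOS API:

```swift
import UIKit
import MediaPipeTasksVision

// Sketch: run image-mode gesture recognition on a UIImage.
func recognizeGestures(in uiImage: UIImage, with recognizer: GestureRecognizer) {
  do {
    let mpImage = try MPImage(uiImage: uiImage)
    let result = try recognizer.recognize(image: mpImage)
    // `gestures` holds one array of ranked categories per detected hand.
    for handGestures in result.gestures {
      if let topGesture = handGestures.first {
        print("Gesture: \(topGesture.categoryName ?? "unknown"), " +
              "score: \(topGesture.score)")
      }
    }
  } catch {
    print("Gesture recognition failed: \(error)")
  }
}
```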
-
Performs gesture recognition on the provided video frame of type MPImage using the whole image as the region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the GestureRecognizer is created with the running mode .video.

It’s required to provide the video frame’s timestamp (in milliseconds). The input timestamps must be monotonically increasing.

This method supports performing gesture recognition on RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use kCVPixelFormatType_32BGRA as its pixel format. If your MPImage has a source type of .image, ensure that the color space is RGB with an alpha channel.

Declaration

Swift

func recognize(videoFrame image: MPImage, timestampInMilliseconds: Int) throws -> GestureRecognizerResult

Parameters

image
The MPImage on which gesture recognition is to be performed.
timestampInMilliseconds
The video frame’s timestamp (in milliseconds). The input timestamps must be monotonically increasing.
error
An optional error parameter populated when there is an error in performing gesture recognition on the input video frame.

Return Value

A GestureRecognizerResult object that contains the hand gesture recognition results.
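One way to satisfy the monotonically increasing timestamp requirement is to derive timestamps from the frame index and frame rate. A sketch, where `videoFrames` is a hypothetical array of decoded frames and 30 fps is an assumed rate:

```swift
import MediaPipeTasksVision

// Sketch: video-mode recognition over a sequence of decoded frames.
// `videoFrames` is a placeholder for frames obtained elsewhere.
func recognize(videoFrames: [MPImage], with recognizer: GestureRecognizer) {
  let frameRate = 30.0  // assumed frame rate of the source video
  for (index, frame) in videoFrames.enumerated() {
    // Frame-index-based timestamps are guaranteed to increase monotonically.
    let timestampMs = Int(Double(index) * 1000.0 / frameRate)
    do {
      let result = try recognizer.recognize(videoFrame: frame,
                                            timestampInMilliseconds: timestampMs)
      print("Frame \(index): \(result.gestures.count) hand(s) detected")
    } catch {
      print("Recognition failed at frame \(index): \(error)")
    }
  }
}
```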
-
Sends live stream image data of type MPImage to perform gesture recognition using the whole image as the region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the GestureRecognizer is created with the running mode .liveStream.

The object which needs to be continuously notified of the available results of gesture recognition must conform to the GestureRecognizerLiveStreamDelegate protocol and implement the gestureRecognizer(_:didFinishRecognitionWithResult:timestampInMilliseconds:error:) delegate method.

It’s required to provide a timestamp (in milliseconds) to indicate when the input image is sent to the gesture recognizer. The input timestamps must be monotonically increasing.

This method supports performing gesture recognition on RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use kCVPixelFormatType_32BGRA as its pixel format. If the input MPImage has a source type of .image, ensure that the color space is RGB with an alpha channel.

If this method is used for performing gesture recognition on live camera frames using AVFoundation, ensure that you request AVCaptureVideoDataOutput to output frames in kCMPixelFormat_32BGRA using its videoSettings property.

Declaration

Swift

func recognizeAsync(image: MPImage, timestampInMilliseconds: Int) throws

Parameters

image
A live stream image of type MPImage on which gesture recognition is to be performed.
timestampInMilliseconds
The timestamp (in milliseconds) which indicates when the input image is sent to the gesture recognizer. The input timestamps must be monotonically increasing.
error
An optional error parameter populated when there is an error in performing gesture recognition on the input live stream image data.

Return Value

YES if the image was sent to the task successfully, otherwise NO.
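A sketch of live-stream usage tying the pieces together: a delegate conforming to the protocol named above, live-stream options, and an AVCaptureVideoDataOutput requesting BGRA frames. The delegate method's argument labels follow the selector given in this document; treat exact labels, and the `gestureRecognizerLiveStreamDelegate` option property, as assumptions:

```swift
import AVFoundation
import MediaPipeTasksVision

// Receives asynchronous recognition results for each submitted frame.
class GestureHandler: NSObject, GestureRecognizerLiveStreamDelegate {
  func gestureRecognizer(_ gestureRecognizer: GestureRecognizer,
                         didFinishRecognitionWithResult result: GestureRecognizerResult?,
                         timestampInMilliseconds: Int,
                         error: Error?) {
    if let error = error {
      print("Recognition error at \(timestampInMilliseconds) ms: \(error)")
    } else if let result = result {
      print("\(result.gestures.count) hand(s) at \(timestampInMilliseconds) ms")
    }
  }
}

let options = GestureRecognizerOptions()
options.baseOptions.modelAssetPath = "/path/to/gesture_recognizer.task"  // placeholder
options.runningMode = .liveStream
let handler = GestureHandler()
options.gestureRecognizerLiveStreamDelegate = handler

// Request BGRA pixel buffers from the camera, as required above.
let videoOutput = AVCaptureVideoDataOutput()
videoOutput.videoSettings =
  [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]

// In the capture callback, wrap the sample buffer and submit it, e.g.:
//   let mpImage = try MPImage(sampleBuffer: sampleBuffer)
//   try recognizer.recognizeAsync(image: mpImage,
//                                 timestampInMilliseconds: timestampMs)
```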