PoseLandmarker
class PoseLandmarker : NSObject

Performs pose landmarks detection on images.
This API expects a pre-trained pose landmarks model asset bundle.
-
The array of connections between all the landmarks in the detected pose.
Declaration
Swift
class var poseLandmarks: [Connection] { get }

-
Creates a new instance of
PoseLandmarker from an absolute path to a model asset bundle stored locally on the device and the default PoseLandmarkerOptions.

Declaration
Swift
convenience init(modelPath: String) throws

Parameters
modelPath: An absolute path to a model asset bundle stored locally on the device.
error: An optional error parameter populated when there is an error in initializing the pose landmarker.
Return Value
A new instance of
PoseLandmarker with the given model path, or nil if there is an error in initializing the pose landmarker.

-
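A minimal usage sketch of this initializer. The resource name "pose_landmarker" is an illustrative assumption, and the MediaPipe Tasks Vision framework is assumed to be linked:

```swift
import MediaPipeTasksVision

// Locate the model asset bundle shipped inside the app bundle
// ("pose_landmarker.task" is a hypothetical resource name).
guard let modelPath = Bundle.main.path(forResource: "pose_landmarker",
                                       ofType: "task") else {
    fatalError("Model asset bundle not found in the app bundle.")
}

do {
    // Uses the default PoseLandmarkerOptions (running mode .image).
    let poseLandmarker = try PoseLandmarker(modelPath: modelPath)
    // poseLandmarker is now ready for detect(image:) calls.
} catch {
    print("Failed to initialize PoseLandmarker: \(error)")
}
```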
Creates a new instance of
PoseLandmarker from the given PoseLandmarkerOptions.

Declaration
Swift
init(options: PoseLandmarkerOptions) throws

Parameters
options: The options of type PoseLandmarkerOptions to use for configuring the PoseLandmarker.
error: An optional error parameter populated when there is an error in initializing the pose landmarker.
Return Value
A new instance of
PoseLandmarker with the given options, or nil if there is an error in initializing the pose landmarker.

-
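A sketch of configuring the landmarker explicitly rather than relying on the defaults. The model path and the option values shown are illustrative assumptions:

```swift
import MediaPipeTasksVision

// Configure the landmarker explicitly (values below are assumptions).
let options = PoseLandmarkerOptions()
options.baseOptions.modelAssetPath = "/path/to/pose_landmarker.task"
options.runningMode = .image
options.numPoses = 1

do {
    let poseLandmarker = try PoseLandmarker(options: options)
    // Use poseLandmarker for detection with the configured options.
} catch {
    print("Failed to initialize PoseLandmarker: \(error)")
}
```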
Performs pose landmarks detection on the provided
MPImage using the whole image as region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the PoseLandmarker is created with running mode .image.

This method supports performing pose landmarks detection on RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use kCVPixelFormatType_32BGRA as its pixel format. If your MPImage has a source type of .image, ensure that the color space is RGB with an alpha channel.

Declaration
Swift
func detect(image: MPImage) throws -> PoseLandmarkerResult

Parameters
image: The MPImage on which pose landmarks detection is to be performed.
error: An optional error parameter populated when there is an error in performing pose landmark detection on the input image.
Return Value
A PoseLandmarkerResult object that contains the pose landmarks detection results.

-
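A sketch of single-image detection, assuming `poseLandmarker` was created with running mode .image and `uiImage` is an RGB image with an alpha channel (both names are illustrative):

```swift
import MediaPipeTasksVision
import UIKit

// Runs pose landmark detection on a single still image.
func detectPose(in uiImage: UIImage, with poseLandmarker: PoseLandmarker) {
    do {
        // Wrap the UIImage; rotation follows the image's orientation.
        let mpImage = try MPImage(uiImage: uiImage)
        let result = try poseLandmarker.detect(image: mpImage)
        // Each element of result.landmarks holds the landmarks of one pose.
        print("Detected \(result.landmarks.count) pose(s).")
    } catch {
        print("Pose landmark detection failed: \(error)")
    }
}
```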
Performs pose landmarks detection on the provided video frame of type
MPImage using the whole image as region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the PoseLandmarker is created with running mode .video.

It’s required to provide the video frame’s timestamp (in milliseconds). The input timestamps must be monotonically increasing.

This method supports performing pose landmarks detection on RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use kCVPixelFormatType_32BGRA as its pixel format. If your MPImage has a source type of .image, ensure that the color space is RGB with an alpha channel.

Declaration
Swift
func detect(videoFrame image: MPImage, timestampInMilliseconds: Int) throws -> PoseLandmarkerResult

Parameters
image: The MPImage on which pose landmarks detection is to be performed.
timestampInMilliseconds: The video frame’s timestamp (in milliseconds). The input timestamps must be monotonically increasing.
error: An optional error parameter populated when there is an error in performing pose landmark detection on the input video frame.
Return Value
A PoseLandmarkerResult object that contains the pose landmarks detection results.

-
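A sketch of video-mode detection, assuming `poseLandmarker` was created with running mode .video and `frames` is a pre-decoded array of MPImage video frames (illustrative names); timestamps are derived from the frame index so they increase monotonically:

```swift
import MediaPipeTasksVision

// Runs detection over decoded video frames in presentation order.
func detectPoses(in frames: [MPImage],
                 frameRate: Double,
                 with poseLandmarker: PoseLandmarker) {
    for (index, frame) in frames.enumerated() {
        // Timestamps must be monotonically increasing across calls.
        let timestampMs = Int(Double(index) * 1000.0 / frameRate)
        do {
            let result = try poseLandmarker.detect(
                videoFrame: frame,
                timestampInMilliseconds: timestampMs)
            print("Frame \(index): \(result.landmarks.count) pose(s)")
        } catch {
            print("Detection failed at frame \(index): \(error)")
        }
    }
}
```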
Sends live stream image data of type
MPImage to perform pose landmarks detection using the whole image as region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the PoseLandmarker is created with running mode .liveStream.

The object which needs to be continuously notified of the available results of pose landmark detection must conform to the PoseLandmarkerLiveStreamDelegate protocol and implement the poseLandmarker(_:didFinishDetectionWithResult:timestampInMilliseconds:error:) delegate method.

It’s required to provide a timestamp (in milliseconds) to indicate when the input image is sent to the pose landmarker. The input timestamps must be monotonically increasing.

This method supports performing pose landmarks detection on RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use kCVPixelFormatType_32BGRA as its pixel format. If the input MPImage has a source type of .image, ensure that the color space is RGB with an alpha channel.

If this method is used for performing pose landmarks detection on live camera frames using AVFoundation, ensure that you request AVCaptureVideoDataOutput to output frames in kCMPixelFormat_32BGRA using its videoSettings property.

Declaration
Swift
func detectAsync(image: MPImage, timestampInMilliseconds: Int) throws

Parameters
image: Live stream image data of type MPImage on which pose landmarks detection is to be performed.
timestampInMilliseconds: The timestamp (in milliseconds) which indicates when the input image is sent to the pose landmarker. The input timestamps must be monotonically increasing.
error: An optional error parameter populated when there is an error in performing pose landmark detection on the input live stream image data.
Return Value
YES if the image was sent to the task successfully, otherwise NO.

-
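A sketch of live-stream usage with a delegate. The class name, model path, and helper method are illustrative assumptions, and the delegate method's exact Swift signature is inferred from the selector named above; results arrive asynchronously on the delegate rather than as a return value:

```swift
import MediaPipeTasksVision

// Illustrative holder that owns the landmarker and receives its results.
final class PoseStreamHandler: NSObject, PoseLandmarkerLiveStreamDelegate {
    var poseLandmarker: PoseLandmarker?

    override init() {
        super.init()
        let options = PoseLandmarkerOptions()
        options.baseOptions.modelAssetPath = "/path/to/pose_landmarker.task"
        options.runningMode = .liveStream
        options.poseLandmarkerLiveStreamDelegate = self
        poseLandmarker = try? PoseLandmarker(options: options)
    }

    // Feed each camera frame; timestamps must increase monotonically.
    func process(frame: MPImage, timestampMs: Int) {
        try? poseLandmarker?.detectAsync(image: frame,
                                         timestampInMilliseconds: timestampMs)
    }

    // Delegate callback delivering the result for a given timestamp.
    func poseLandmarker(_ poseLandmarker: PoseLandmarker,
                        didFinishDetection result: PoseLandmarkerResult?,
                        timestampInMilliseconds: Int,
                        error: Error?) {
        if let result = result {
            print("t=\(timestampInMilliseconds)ms: \(result.landmarks.count) pose(s)")
        }
    }
}
```

When feeding camera frames from AVFoundation, remember to set AVCaptureVideoDataOutput's videoSettings so frames are delivered in kCMPixelFormat_32BGRA, as noted above.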
Undocumented
-
Undocumented