Merged
5 changes: 3 additions & 2 deletions packages/camera/camera_avfoundation/CHANGELOG.md
Original file line number Diff line number Diff line change
@@ -1,5 +1,6 @@
## NEXT
## 0.9.17+4

* Adds possibility to use any supported FPS and fixes crash when using unsupported FPS.
Contributor: *Adds ability to...

Collaborator: Where is the new test that triggers the codepath that used to crash?

Contributor Author: When the real captureDevice.activeVideoMinFrameDuration (or ...Max...) is set to a value outside what the active format supports, it immediately throws an Objective-C exception. For tests, mocked wrappers are used, so this cannot be triggered. However, the test testSettings_ShouldSelectFormatWhichSupports60FPS tries to set 60 fps, which actually triggered this crash on my device for media presets other than 1280x720.

* Updates minimum supported SDK version to Flutter 3.19/Dart 3.3.

## 0.9.17+3
@@ -12,7 +13,7 @@

## 0.9.17+1

* Fixes a crash due to appending sample buffers when readyForMoreMediaData is NO
* Fixes a crash due to appending sample buffers when readyForMoreMediaData is NO.

## 0.9.17

@@ -148,9 +148,9 @@ - (void)testSettings_shouldPassConfigurationToCameraDeviceAndWriter {

// Expect FPS configuration is passed to camera device.
[self waitForExpectations:@[
injectedWrapper.lockExpectation, injectedWrapper.beginConfigurationExpectation,
injectedWrapper.beginConfigurationExpectation, injectedWrapper.lockExpectation,
injectedWrapper.minFrameDurationExpectation, injectedWrapper.maxFrameDurationExpectation,
injectedWrapper.commitConfigurationExpectation, injectedWrapper.unlockExpectation
injectedWrapper.unlockExpectation, injectedWrapper.commitConfigurationExpectation
]
timeout:1
enforceOrder:YES];
@@ -202,4 +202,20 @@ - (void)testSettings_ShouldBeSupportedByMethodCall {
XCTAssertNotNil(resultValue);
}

- (void)testSettings_ShouldSelectFormatWhichSupports60FPS {
FCPPlatformMediaSettings *settings =
[FCPPlatformMediaSettings makeWithResolutionPreset:gTestResolutionPreset
framesPerSecond:@(60)
videoBitrate:@(gTestVideoBitrate)
audioBitrate:@(gTestAudioBitrate)
enableAudio:gTestEnableAudio];

FLTCam *camera = FLTCreateCamWithCaptureSessionQueueAndMediaSettings(
dispatch_queue_create("test", NULL), settings, nil, nil);

AVFrameRateRange *range = camera.captureDevice.activeFormat.videoSupportedFrameRateRanges[0];
XCTAssertLessThanOrEqual(range.minFrameRate, 60);
XCTAssertGreaterThanOrEqual(range.maxFrameRate, 60);
}

@end
@@ -52,14 +52,44 @@
OCMStub([audioSessionMock addInputWithNoConnections:[OCMArg any]]);
OCMStub([audioSessionMock canSetSessionPreset:[OCMArg any]]).andReturn(YES);

id frameRateRangeMock1 = OCMClassMock([AVFrameRateRange class]);
OCMStub([frameRateRangeMock1 minFrameRate]).andReturn(3);
OCMStub([frameRateRangeMock1 maxFrameRate]).andReturn(30);
id captureDeviceFormatMock1 = OCMClassMock([AVCaptureDeviceFormat class]);
OCMStub([captureDeviceFormatMock1 videoSupportedFrameRateRanges]).andReturn(@[
frameRateRangeMock1
]);

id frameRateRangeMock2 = OCMClassMock([AVFrameRateRange class]);
OCMStub([frameRateRangeMock2 minFrameRate]).andReturn(3);
OCMStub([frameRateRangeMock2 maxFrameRate]).andReturn(60);
id captureDeviceFormatMock2 = OCMClassMock([AVCaptureDeviceFormat class]);
OCMStub([captureDeviceFormatMock2 videoSupportedFrameRateRanges]).andReturn(@[
frameRateRangeMock2
]);

id captureDeviceMock = OCMClassMock([AVCaptureDevice class]);
OCMStub([captureDeviceMock lockForConfiguration:[OCMArg setTo:nil]]).andReturn(YES);
OCMStub([captureDeviceMock formats]).andReturn((@[
captureDeviceFormatMock1, captureDeviceFormatMock2
]));
__block AVCaptureDeviceFormat *format = captureDeviceFormatMock1;
OCMStub([captureDeviceMock setActiveFormat:[OCMArg any]]).andDo(^(NSInvocation *invocation) {
[invocation retainArguments];
[invocation getArgument:&format atIndex:2];
});
OCMStub([captureDeviceMock activeFormat]).andDo(^(NSInvocation *invocation) {
[invocation setReturnValue:&format];
});

id fltCam = [[FLTCam alloc] initWithMediaSettings:mediaSettings
mediaSettingsAVWrapper:mediaSettingsAVWrapper
orientation:UIDeviceOrientationPortrait
videoCaptureSession:videoSessionMock
audioCaptureSession:audioSessionMock
captureSessionQueue:captureSessionQueue
captureDeviceFactory:captureDeviceFactory ?: ^AVCaptureDevice *(void) {
return [AVCaptureDevice deviceWithUniqueID:@"camera"];
return captureDeviceMock;
}
videoDimensionsForFormat:^CMVideoDimensions(AVCaptureDeviceFormat *format) {
return CMVideoFormatDescriptionGetDimensions(format.formatDescription);
@@ -211,20 +211,50 @@ - (instancetype)initWithMediaSettings:(FCPPlatformMediaSettings *)mediaSettings
[_motionManager startAccelerometerUpdates];

if (_mediaSettings.framesPerSecond) {
[_mediaSettingsAVWrapper beginConfigurationForSession:_videoCaptureSession];

// Possible values for presets are hard-coded in FLT interface having
Collaborator: Why is all this code moving out of the locked section; is that safe?

Contributor Author (misos1, Sep 4, 2024): I did not see anything saying that the order of beginConfiguration and lockForConfiguration matters, or that setting the session preset must be done under that lock. There is an example where they are used in that order: https://developer.apple.com/documentation/avfoundation/avcapturedevice/1389221-activeformat?language=objc

I moved setCaptureSessionPreset outside of the lock because it can itself call lockForConfiguration and unlockForConfiguration, and I did not see anything about whether that lock is recursive.

Contributor Author: From my observation, lockForConfiguration actually is recursive (I could not reproduce the mentioned exceptions until now): locking twice and unlocking once does not cause an exception, but unlocking twice does. So I can revert this change (?).

Also, it seems not all commitConfiguration calls go through _mediaSettingsAVWrapper.

// corresponding AVCaptureSessionPreset counterparts.
// If _resolutionPreset is not supported by camera there is
// fallback to lower resolution presets.
// If none can be selected there is error condition.
if (![self setCaptureSessionPreset:_mediaSettings.resolutionPreset withError:error]) {
[_videoCaptureSession commitConfiguration];
return nil;
}

// The frame rate can be changed only on a locked for configuration device.
if ([mediaSettingsAVWrapper lockDevice:_captureDevice error:error]) {
[_mediaSettingsAVWrapper beginConfigurationForSession:_videoCaptureSession];

// Possible values for presets are hard-coded in FLT interface having
// corresponding AVCaptureSessionPreset counterparts.
// If _resolutionPreset is not supported by camera there is
// fallback to lower resolution presets.
// If none can be selected there is error condition.
if (![self setCaptureSessionPreset:_mediaSettings.resolutionPreset withError:error]) {
[_videoCaptureSession commitConfiguration];
[_captureDevice unlockForConfiguration];
return nil;
// find the format which frame rate ranges are closest to the wanted frame rate
Collaborator: Comments need to follow standard sentence rules. https://google.github.io/styleguide/objcguide.html#comments

CMVideoDimensions targetRes = self.videoDimensionsForFormat(_captureDevice.activeFormat);
Collaborator: You've added several calls to self in this init method. The fact that there was already such a call is a problem, and that should not be made worse by adding more.

If there are general, stateless utilities that you want to call from init, they should be turned into freestanding static functions.

Collaborator: Please don't abbreviate variable names. https://google.github.io/styleguide/objcguide.html#naming

double targetFrameRate = _mediaSettings.framesPerSecond.doubleValue;
FourCharCode preferredSubType =
CMFormatDescriptionGetMediaSubType(_captureDevice.activeFormat.formatDescription);
AVCaptureDeviceFormat *bestFormat = _captureDevice.activeFormat;
double bestFrameRate = [self frameRateForFormat:bestFormat closestTo:targetFrameRate];
double minDistance = fabs(bestFrameRate - targetFrameRate);
int bestSubTypeScore = 1;
for (AVCaptureDeviceFormat *format in _captureDevice.formats) {
Contributor: can you also move this to a helper function?

CMVideoDimensions res = self.videoDimensionsForFormat(format);
if (res.width != targetRes.width || res.height != targetRes.height) {
continue;
}
double frameRate = [self frameRateForFormat:format closestTo:targetFrameRate];
double distance = fabs(frameRate - targetFrameRate);
FourCharCode subType = CMFormatDescriptionGetMediaSubType(format.formatDescription);
int subTypeScore = subType == preferredSubType ? 1 : 0;
if (distance < minDistance ||
(distance == minDistance && subTypeScore > bestSubTypeScore)) {
Collaborator: I'm confused by this condition. bestSubTypeScore is initialized to 1, and subTypeScore is always either 0 or 1. How could this last piece of the condition ever be true?

I'm not clear on what this score even is; this whole algorithm needs some explanatory comment at the beginning.

Contributor Author (misos1, Aug 30, 2024): If distance < minDistance, then bestSubTypeScore can be overwritten by 0. This line explains it: subTypeScore = subType == preferredSubType ? 1 : 0; the score of a subtype is 1 when it is the preferred subtype and 0 otherwise. This is how to choose by minimum distance first, and then among formats with equal distance choose the one with the preferred subtype; it gives subtypes an order. Instead of subTypeScore > bestSubTypeScore there could be something like bestSubType != preferredSubType && subType == preferredSubType, but that makes it less clear, as it deviates from the usual way of ordering by multiple properties: current.A < other.A || (current.A == other.A && current.B < other.B).

Collaborator:

> but that makes it less clear

I strongly disagree that abstracting the thing that matters (whether the current type is a preferred type) behind a completely arbitrary numeric value makes this code easier to understand. The code you have there that you described as "less clear" makes sense to me when I read it, whereas I did not understand what this score was, or why something that could only have two possible values was being called a score in the first place.

bestFormat = format;
bestFrameRate = frameRate;
minDistance = distance;
bestSubTypeScore = subTypeScore;
}
}
if (![bestFormat isEqual:_captureDevice.activeFormat]) {
_captureDevice.activeFormat = bestFormat;
}
_mediaSettings.framesPerSecond = @(bestFrameRate);

// Set frame rate with 1/10 precision allowing not integral values.
int fpsNominator = floor([_mediaSettings.framesPerSecond doubleValue] * 10.0);
@@ -233,9 +263,10 @@ - (instancetype)initWithMediaSettings:(FCPPlatformMediaSettings *)mediaSettings
[mediaSettingsAVWrapper setMinFrameDuration:duration onDevice:_captureDevice];
[mediaSettingsAVWrapper setMaxFrameDuration:duration onDevice:_captureDevice];

[_mediaSettingsAVWrapper commitConfigurationForSession:_videoCaptureSession];
[_mediaSettingsAVWrapper unlockDevice:_captureDevice];
[_mediaSettingsAVWrapper commitConfigurationForSession:_videoCaptureSession];
} else {
[_mediaSettingsAVWrapper commitConfigurationForSession:_videoCaptureSession];
return nil;
}
} else {
@@ -251,6 +282,20 @@ - (instancetype)initWithMediaSettings:(FCPPlatformMediaSettings *)mediaSettings
return self;
}

- (double)frameRateForFormat:(AVCaptureDeviceFormat *)format closestTo:(double)targetFrameRate {
double bestFrameRate = 0;
double minDistance = DBL_MAX;
for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
double frameRate = MIN(MAX(targetFrameRate, range.minFrameRate), range.maxFrameRate);
double distance = fabs(frameRate - targetFrameRate);
if (distance < minDistance) {
bestFrameRate = frameRate;
minDistance = distance;
}
}
return bestFrameRate;
}

- (AVCaptureConnection *)createConnection:(NSError **)error {
// Setup video capture input.
_captureVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:_captureDevice error:error];
@@ -474,56 +519,42 @@ - (BOOL)setCaptureSessionPreset:(FCPPlatformResolutionPreset)resolutionPreset
// Set the best device format found and finish the device configuration.
_captureDevice.activeFormat = bestFormat;
[_captureDevice unlockForConfiguration];

// Set the preview size based on values from the current capture device.
_previewSize =
CGSizeMake(_captureDevice.activeFormat.highResolutionStillImageDimensions.width,
_captureDevice.activeFormat.highResolutionStillImageDimensions.height);
break;
}
}
}
case FCPPlatformResolutionPresetUltraHigh:
if ([_videoCaptureSession canSetSessionPreset:AVCaptureSessionPreset3840x2160]) {
_videoCaptureSession.sessionPreset = AVCaptureSessionPreset3840x2160;
_previewSize = CGSizeMake(3840, 2160);
break;
}
if ([_videoCaptureSession canSetSessionPreset:AVCaptureSessionPresetHigh]) {
_videoCaptureSession.sessionPreset = AVCaptureSessionPresetHigh;
_previewSize =
CGSizeMake(_captureDevice.activeFormat.highResolutionStillImageDimensions.width,
_captureDevice.activeFormat.highResolutionStillImageDimensions.height);
break;
}
case FCPPlatformResolutionPresetVeryHigh:
if ([_videoCaptureSession canSetSessionPreset:AVCaptureSessionPreset1920x1080]) {
_videoCaptureSession.sessionPreset = AVCaptureSessionPreset1920x1080;
_previewSize = CGSizeMake(1920, 1080);
break;
}
case FCPPlatformResolutionPresetHigh:
if ([_videoCaptureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
_videoCaptureSession.sessionPreset = AVCaptureSessionPreset1280x720;
_previewSize = CGSizeMake(1280, 720);
break;
}
case FCPPlatformResolutionPresetMedium:
if ([_videoCaptureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) {
_videoCaptureSession.sessionPreset = AVCaptureSessionPreset640x480;
_previewSize = CGSizeMake(640, 480);
break;
}
case FCPPlatformResolutionPresetLow:
if ([_videoCaptureSession canSetSessionPreset:AVCaptureSessionPreset352x288]) {
_videoCaptureSession.sessionPreset = AVCaptureSessionPreset352x288;
_previewSize = CGSizeMake(352, 288);
break;
}
default:
if ([_videoCaptureSession canSetSessionPreset:AVCaptureSessionPresetLow]) {
_videoCaptureSession.sessionPreset = AVCaptureSessionPresetLow;
_previewSize = CGSizeMake(352, 288);
} else {
if (error != nil) {
*error =
@@ -537,23 +568,32 @@ - (BOOL)setCaptureSessionPreset:(FCPPlatformResolutionPreset)resolutionPreset
return NO;
}
}
CMVideoDimensions size = self.videoDimensionsForFormat(_captureDevice.activeFormat);
_previewSize = CGSizeMake(size.width, size.height);
_audioCaptureSession.sessionPreset = _videoCaptureSession.sessionPreset;
return YES;
}

/// Finds the highest available resolution in terms of pixel count for the given device.
Contributor: Nit: "... if same pixel count, use the subtype as the tie breaker."

- (AVCaptureDeviceFormat *)highestResolutionFormatForCaptureDevice:
(AVCaptureDevice *)captureDevice {
FourCharCode preferredSubType =
CMFormatDescriptionGetMediaSubType(_captureDevice.activeFormat.formatDescription);
AVCaptureDeviceFormat *bestFormat = nil;
NSUInteger maxPixelCount = 0;
int bestSubTypeScore = 0;
for (AVCaptureDeviceFormat *format in _captureDevice.formats) {
CMVideoDimensions res = self.videoDimensionsForFormat(format);
NSUInteger height = res.height;
NSUInteger width = res.width;
NSUInteger pixelCount = height * width;
if (pixelCount > maxPixelCount) {
maxPixelCount = pixelCount;
FourCharCode subType = CMFormatDescriptionGetMediaSubType(format.formatDescription);
int subTypeScore = subType == preferredSubType ? 1 : 0;
if (pixelCount > maxPixelCount ||
(pixelCount == maxPixelCount && subTypeScore > bestSubTypeScore)) {
bestFormat = format;
maxPixelCount = pixelCount;
bestSubTypeScore = subTypeScore;
}
}
return bestFormat;
2 changes: 1 addition & 1 deletion packages/camera/camera_avfoundation/pubspec.yaml
@@ -2,7 +2,7 @@ name: camera_avfoundation
description: iOS implementation of the camera plugin.
repository: https://github.com/flutter/packages/tree/main/packages/camera/camera_avfoundation
issue_tracker: https://github.com/flutter/flutter/issues?q=is%3Aissue+is%3Aopen+label%3A%22p%3A+camera%22
version: 0.9.17+3
version: 0.9.17+4

environment:
sdk: ^3.3.0