diff --git a/README.md b/README.md index 8bc5655..23073be 100644 --- a/README.md +++ b/README.md @@ -7,57 +7,59 @@ Handling media-routes/sensors/events during a audio/video chat on React Native ## Purpose: -The purpose of this module is to handle actions/events during a phone call (audio/video) on `react-native`, ex: -* manage devices events like wired-headset plugged in state, proximity sensors and expose functionalities to javascript. -* automatically route audio to proper devices based on events and platform API. -* toggle speaker or microphone on/off, toggle flash light on/off -* play ringtone/ringback/dtmftone - -basically, it is a telecommunication module which handles most of requirements when making/receiving/talking with a call. -This module is desinged to work with [react-native-webrtc](https://github.com/oney/react-native-webrtc) -you can find demo here: https://github.com/oney/RCTWebRTCDemo +The purpose of this module is to handle actions/events during a phone call (audio/video) on `react-native`, ex: +* Manage device events like wired-headset plugged-in state and proximity sensors, and expose functionality to JavaScript. +* Automatically route audio to proper devices based on events and platform API. +* Toggle speaker or microphone on/off, toggle flashlight on/off +* Play ringtone/ringback/dtmftone + +Basically, it is a telecommunication module which handles most of the requirements when making, receiving, or talking on a call. + +This module is designed to work with [react-native-webrtc](https://github.com/oney/react-native-webrtc) + +## TODO / Contribution Wanted: + +* Make operations run on the main thread. ( iOS/Android ) +* Fix iOS audio shared instance singleton conflict with internal webrtc. +* Detect hardware button press event and react to it. + ex: press bluetooth button, send an event to JS to answer/hangup. + ex: press power button to mute incoming ringtone. 
+* Use a config-based approach to decide which events should start and be reported, and maybe control behavior as well. +* Flash API on Android. + ## Installation: - -#### BREAKING NOTE: - -* since `2.1.0`, you should use `RN 40+` and upgrade your xcode to support `swift 3`. - after upgrading xcode, `Edit -> Convert -> To Current Swift Syntax` to invoke `Swift Migration Assistant` - see [Migrating to Swift 2.3 or Swift 3 from Swift 2.2](https://swift.org/migration-guide/) - -* for old RN versions (RN < 0.40) please use version `1.5.4` ( Swift 2.2~2.3 ) - - -**from npm package**: `npm install react-native-incall-manager` -**from git package**: `npm install git://github.com/zxcpoiu/react-native-incall-manager.git` +**From npm package**: `npm install react-native-incall-manager` +**From git package**: `npm install git://github.com/zxcpoiu/react-native-incall-manager.git` =================================================== -### android: - +### Android: + +Note: you might need the `android.permission.BLUETOOTH` permission for Bluetooth to work. + After install, you can use `rnpm` (`npm install rnpm -g`) to link android. -use `rnpm link react-native-incall-manager` to link or manually if you like. +Use `react-native link react-native-incall-manager` to link, or link manually if you like. We use android support library v4 to check/request permissions. -You should add `compile "com.android.support:support-v4:23.0.1"` in `$your_project/android/app/build.gradle` dependencies on android. +You should add `compile "com.android.support:support-v4:$YOUR_VERSION"` in `$YOUR_PROJECT/android/app/build.gradle` dependencies on android. -#### Manually Link +#### Manually Linking -if rnpm link doesn't work. 
( see: https://github.com/zxcpoiu/react-native-incall-manager/issues/21#issuecomment-279575516 ) -please add it manually in your main project: +If `react-native link` doesn't work ( see: https://github.com/zxcpoiu/react-native-incall-manager/issues/21#issuecomment-279575516 ), please add it manually in your main project: -1. in `android/app/build.gradle` - should have a line `compile(project(':react-native-incall-manager'))` in `dependencies {}` section +1. In `android/app/build.gradle` + It should have a line `compile(project(':react-native-incall-manager'))` in the `dependencies {}` section -2. in `android/settings.gradle` - should have: +2. In `android/settings.gradle` + It should have: ``` include ':react-native-incall-manager' -project(':react-native-incall-manager').projectDir = new File(rootProject.projectDir, '../node_modules/react-native-incall-manager/android') + project(':react-native-incall-manager').projectDir = new File(rootProject.projectDir, '../node_modules/react-native-incall-manager/android') ``` -3. in `MainApplication.java` +3. In `MainApplication.java` ```java import com.zxcpoiu.incallmanager.InCallManagerPackage; @@ -69,9 +71,9 @@ project(':react-native-incall-manager').projectDir = new File(rootProject.projec ); } ``` -#### optional sound files on android +#### Optional sound files on android -if you want to use bundled ringtone/ringback/busytone sound instead of system sound, +If you want to use bundled ringtone/ringback/busytone sound instead of system sound, put files in `android/app/src/main/res/raw` and rename file correspond to sound type: @@ -81,49 +83,50 @@ incallmanager_ringback.mp3 incallmanager_ringtone.mp3 ``` -on android, as long as your file extension supported by android, this module will load it. +On Android, as long as your file extension is supported by Android, this module will load it. 
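The `MainApplication.java` snippet above shows only the import and the tail of the registration; for context, a minimal sketch of the full `getPackages()` wiring might look like the following. Only the `InCallManagerPackage` line is specific to this module; the surrounding code is standard React Native boilerplate, shown here as an illustrative excerpt rather than a drop-in file.

```java
// Illustrative excerpt of MainApplication.java (not a complete file).
import com.facebook.react.ReactPackage;
import com.facebook.react.shell.MainReactPackage;
import com.zxcpoiu.incallmanager.InCallManagerPackage;
import java.util.Arrays;
import java.util.List;

// inside your MainApplication class:
@Override
protected List<ReactPackage> getPackages() {
  return Arrays.<ReactPackage>asList(
      new MainReactPackage(),
      new InCallManagerPackage() // registers this module's native package
  );
}
```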
=================================================== ### ios: -since ios part written in swift and it doesn't support static library yet. -before that, you should add this project manually -please do it step by step carefully :pray: : +`react-native link react-native-incall-manager` + +#### Using CocoaPods + +Update the following line with your path to node_modules/ and add it to your Podfile: -#### Add files in to your project: +`pod 'ReactNativeIncallManager', :path => '../node_modules/react-native-incall-manager'` - 1. Open your project in xcode - 2. find your_project directory under your project's xcodeproject root. ( it's a sub-directoory, not root xcodeproject itself ) - 3. you can do either: - (recommended) directly drag your node_modules/react-native-incall-manager/ios/RNInCallManager/ into it. - (may have some [path issue](https://github.com/zxcpoiu/react-native-incall-manager/issues/39)) right click on your_project directory, `add files` to your project and add `node_modules/react-native-incall-manager/ios/RNInCallManager/` - 4. on the pou-up window, uncheck `Copy items if needed` and select `Added folders: Create groups` then add it. you will see a new directory named `RNInCallmanager under your_project` directory. +#### Manually Linking -#### Setup Objective-C Bridging Header: - 1. click your `project's xcodeproject root`, go to `build setting` and search `Objective-C Bridging Header` - 2. 
set you header location, the default path is: `ReactNativeProjectRoot/ios/`, - in this case, you should set `../node_modules/react-native-incall-manager/ios/RNInCallManager/RNInCallManager-Bridging-Header.h` +In case `react-native link` doesn't work, + +- Drag `node_modules/react-native-incall-manager/ios/RNInCallManager.xcodeproj` under `/Libraries` +- Select `` --> `Build Phases` --> `Link Binary With Libraries` + - Drag `Libraries/RNInCallManager.xcodeproj/Products/libRNInCallManager.a` to `Link Binary With Libraries` +- Select `` --> `Build Settings` + - In `Header Search Paths`, add `$(SRCROOT)/../node_modules/react-native-incall-manager/ios/RNInCallManager` #### Clean project if messed up: - The installation steps are a bit complex, it might related your xcode version, xcode cache, converting swift version, and your own path configurations. if something messed up, please folow steps below to clean this project, then do it again steps by steps. - 1. delete all project/directory in xcode related to incall-manager - 2. delete `react-native-incall-manager` in node_modules ( rm -rf ) + The installation steps are a bit complex; they might be affected by your Xcode version, Xcode cache, Swift version conversion, and your own path configuration. If something is messed up, please follow the steps below to clean the project, then go through the installation again step by step. + + 1. Delete all project/directory in xcode related to incall-manager + 2. Delete `react-native-incall-manager` in node_modules ( rm -rf ) 3. Xcode -> Product -> clean - 4. close xcode - 5. npm install again - 6. open xcode and try the install process again steps by steps + 4. Close Xcode + 5. Run `npm install` again + 6. Open Xcode and try the install process again step by step - if someone knows a simpler way to set this project up, let me know plz. + If someone knows a simpler way to set this project up, please let me know. 
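As a complement to the CocoaPods instruction above, the `pod` line goes inside your app target in the Podfile. A minimal sketch follows; the target name `YourApp` is a placeholder, and the path assumes the default location of `node_modules` relative to the `ios/` directory.

```ruby
# Podfile sketch -- 'YourApp' is a placeholder target name.
platform :ios, '9.0'

target 'YourApp' do
  # Path is relative to the Podfile; adjust it if your node_modules lives elsewhere.
  pod 'ReactNativeIncallManager', :path => '../node_modules/react-native-incall-manager'
end
```

After editing the Podfile, run `pod install` from the `ios/` directory and open the generated `.xcworkspace` instead of the `.xcodeproj`.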
-#### optional sound files on android +#### Optional sound files on iOS -if you want to use bundled ringtone/ringback/busytone sound instead of system sound +If you want to use bundled ringtone/ringback/busytone sounds instead of system sounds: -1. add files into your_project directory under your project's xcodeproject root. ( or drag into it as described above. ) -2. check `copy file if needed` -3. make sure filename correspond to sound type: +1. Add files into your_project directory under your project's xcodeproject root. ( or drag into it as described above. ) +2. Check `copy file if needed` +3. Make sure filenames correspond to sound types: ``` incallmanager_busytone.mp3 @@ -131,11 +134,11 @@ incallmanager_ringback.mp3 incallmanager_ringtone.mp3 ``` -on ios, we only support mp3 files currently. +On iOS, we only support mp3 files currently. ## Usage: -This module implement a basic handle logic automatically, just: +This module implements basic handling logic automatically; just: ```javascript import InCallManager from 'react-native-incall-manager'; @@ -148,10 +151,10 @@ InCallManager.start({media: 'audio'}); // audio/video, default: audio // --- On Call Hangup: InCallManager.stop(); -// ... it will also remote event listeners ... +// ... it will also remove event listeners ... ``` -if you want to use ringback: +If you want to use ringback: ```javascript // ringback is basically for OUTGOING call. and is part of start(). @@ -161,7 +164,7 @@ InCallManager.start({media: 'audio', ringback: '_BUNDLE_'}); // or _DEFAULT_ or InCallManager.stopRingback(); ``` -if you want to use busytone: +If you want to use busytone: ```javascript // busytone is basically for OUTGOING call. and is part of stop() @@ -170,7 +173,7 @@ if you want to use busytone: InCallManager.stop({busytone: '_DTMF_'}); // or _BUNDLE_ or _DEFAULT_ ``` -if you want to use ringtone: +If you want to use ringtone: ```javascript // ringtone is basically for INCOMING call. 
it's independent to start() and stop() @@ -188,8 +191,8 @@ InCallManager.stop(); ``` -also can interact with events if you want: -see API section. +You can also interact with events if you want: +See the API section. ```javascript import { DeviceEventEmitter } from 'react-native'; @@ -200,89 +203,28 @@ DeviceEventEmitter.addListener('Proximity', function (data) { ``` -## About Permission: - - -since version 1.2.0, two functions and a property were added: - -```javascript -// --- function -async checkRecordPermission() // return promise -async requestRecordPermission() // return promise - -// --- property -recordPermission = 'unknow' or 'granted' or 'denied', default is 'unknow' -``` - -After incall-manager initialized, it will check current state of record permission and set to `recordPermission` property. -so you can just write below code in your `ComponentDidMount` like: - -```javascript -if (InCallManager.recordPermission !== 'granted') { - InCallManager.requestRecordPermission() - .then((requestedRecordPermissionResult) => { - console.log("InCallManager.requestRecordPermission() requestedRecordPermissionResult: ", requestedRecordPermissionResult); - }) - .catch((err) => { - console.log("InCallManager.requestRecordPermission() catch: ", err); - }); -} -``` - -We use android support library v4 to check/request permissions. -You should add `compile "com.android.support:support-v4:23.0.1"` in `$your_project/android/app/build.gradle` dependencies on android. - - -**NOTE for android:** - -React Native does not officially support api 23 currently ( it is on api 22 now. see: [RN known issues](https://facebook.github.io/react-native/docs/known-issues.html#android-m-permissions)) and android supports request permission at runtime since api 23, so it will always return 'granted' immediately after calling `checkRecordPermission()` or `requestRecordPermission()`. 
- -If you really need the functionality, you can do the following to make them work but at your own risk: -( I've tested it though, but who knows :) ) - -Step 1: change your `targetSdkVersion` to 23 in `$your_project/android/app/build.gradle` -Step 2: override `onRequestPermissionsResult` in your `MainActivity.java` like: - -``` - @Override - public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) { - InCallManagerPackage.onRequestPermissionsResult(requestCode, permissions, grantResults); - super.onRequestPermissionsResult(requestCode, permissions, grantResults); - } -``` - -then you can test it on android 6 now. - -**Another thing you should know is:** - -If you change targetSdkVersion to 23, the `red box` which React Native used to display errors in development mode requires permission `Draw Over Other Apps`. -So in **development mode**, you should manually grant permission in `app settings` on your device or declare `android.permission.SYSTEM_ALERT_WINDOW` in your manifest. -You don't have to do this in **release mode** since there are no red box. - - -checkout this awesome project: [react-native-android-permissions](https://github.com/lucasferreira/react-native-android-permissions) by @lucasferreira for more information. ## Automatic Basic Behavior: -**on start:** -* store current settings, set KeepScreenOn flag = true, and register some event listeners. -* if media type is `audio`, route voice to earpiece, otherwise route to speaker. -* audio will enable proximity sensor which is disabled by default if media=video -* when proximity detect user closed to screen, turn off screen to avoid accident touch and route voice to earpiece. -* when newly external device plugged, such as wired-headset, route audio to external device. -* optional play ringback +**On start:** +* Store current settings, set KeepScreenOn flag = true, and register some event listeners. 
+* If media type is `audio`, route voice to earpiece, otherwise route to speaker. +* If media type is `audio`, enable the proximity sensor; it is disabled by default when media=video. +* When proximity detects user close to screen, turn off screen to avoid accidental touches and route voice to the earpiece. +* When a new external device is plugged in, such as a wired headset, route audio to it. +* Optionally play ringback -**on stop:** +**On stop:** -* set KeepScreenOn flag = false, remote event listeners, restore original user settings. -* optional play busytone +* Set KeepScreenOn flag = false, remove event listeners, restore original user settings. +* Optionally play busytone ## Custom Behavior: -you can custom behavior use API/events exposed by this module. see `API` section. +You can customize behavior using API/events exposed by this module. See `API` section. -note: ios only supports `auto` currently. +Note: iOS only supports `auto` currently. ## API: @@ -298,8 +240,6 @@ note: ios only supports `auto` currently. | setSpeakerphoneOn(`enable: ?boolean`) | :smile: | :rage: | toggle speaker ON/OFF once. but not force
default: false | | setForceSpeakerphoneOn(`flag: ?boolean`) | :smile: | :smile: | true -> force speaker on
false -> force speaker off
null -> use default behavior according to media type
default: null | | setMicrophoneMute(`enable: ?boolean`) | :smile: | :rage: | mute/unmute micophone
default: false
p.s. if you use webrtc, you can just use `track.enabled = false` to mute | -| async checkRecordPermission() | :smile: | :smile: | check record permission without promt. return Promise. see **about permission** section above | -| async requestRecordPermission() | :smile: | :smile: | request record permission to user. return Promise. see **about permission** section above | | async getAudioUriJS() | :smile: | :smile: | get audio Uri path. this would be useful when you want to pass Uri into another module. | | startRingtone(`ringtone: string, ?vibrate_pattern: array, ?ios_category: string, ?seconds: number`) | :smile: | :smile: | play ringtone.
`ringtone`: '_DEFAULT_' or '_BUNDLE_'
`vibrate_pattern`: same as RN, but does not support repeat
`ios_category`: ios only, if you want to use specific audio category
`seconds`: android only, specify how long do you want to play rather than play once nor repeat. in sec.| | stopRingtone() | :smile: | :smile: | stop play ringtone if previous started via `startRingtone()` | @@ -315,17 +255,16 @@ note: ios only supports `auto` currently. | :--- | :---: | :---: | :--- | | 'Proximity' | :smile: | :smile: | proximity sensor detected changes.
data: `{'isNear': boolean}` | | 'WiredHeadset'| :smile: | :smile: | fire when wired headset plug/unplug
data: `{'isPlugged': boolean, 'hasMic': boolean, 'deviceName': string }` | -| 'NoisyAudio' | :smile: | :rage: | see [andriod doc](http://developer.android.com/reference/android/media/AudioManager.html#ACTION_AUDIO_BECOMING_NOISY).
data: `null` | +| 'NoisyAudio' | :smile: | :rage: | see [android doc](http://developer.android.com/reference/android/media/AudioManager.html#ACTION_AUDIO_BECOMING_NOISY).
data: `null` | | 'MediaButton' | :smile: | :rage: | when external device controler pressed button. see [android doc](http://developer.android.com/reference/android/content/Intent.html#ACTION_MEDIA_BUTTON)
data: `{'eventText': string, 'eventCode': number }` | -| 'onAudioFocusChange' | :smile: | :rage: | see [andriod doc](http://developer.android.com/reference/android/media/AudioManager.OnAudioFocusChangeListener.html#onAudioFocusChange(int))
data: `{'eventText': string, 'eventCode': number }` | +| 'onAudioFocusChange' | :smile: | :rage: | see [android doc](http://developer.android.com/reference/android/media/AudioManager.OnAudioFocusChangeListener.html#onAudioFocusChange(int))
data: `{'eventText': string, 'eventCode': number }` | -**NOTE: platform OS always has the final decision, so some toggle api may not work in some case -be care when customize your own behavior** +**NOTE: platform OS always has the final decision, so some toggle API may not work in some cases +be careful when customizing your own behavior** ## LICENSE: **[ISC License](https://opensource.org/licenses/ISC)** ( functionality equivalent to **MIT License** ) -## Contributing: - -I'm not expert neither on ios nor android, any suggestions, pull request, corrections are really appreciated and welcome. +## Original Author: +[![zxcpoiu](https://github.com/zxcpoiu.png)](https://github.com/zxcpoiu) diff --git a/ReactNativeIncallManager.podspec b/ReactNativeIncallManager.podspec new file mode 100644 index 0000000..30650b0 --- /dev/null +++ b/ReactNativeIncallManager.podspec @@ -0,0 +1,22 @@ +require 'json' + +package = JSON.parse(File.read(File.join(__dir__, 'package.json'))) + +Pod::Spec.new do |s| + s.name = 'ReactNativeIncallManager' + s.version = package['version'] + s.summary = package['description'] + s.description = package['description'] + s.homepage = package['homepage'] + s.license = package['license'] + s.author = package['author'] + s.source = { :git => 'https://github.com/zxcpoiu/react-native-incall-manager.git', :tag => s.version } + + s.platform = :ios, '9.0' + s.ios.deployment_target = '8.0' + + s.preserve_paths = 'LICENSE', 'package.json' + s.source_files = '**/*.{h,m}' + s.exclude_files = 'example/**/*' + s.dependency 'React-Core' +end diff --git a/android/build.gradle b/android/build.gradle index 3898c28..93e44cb 100644 --- a/android/build.gradle +++ b/android/build.gradle @@ -1,17 +1,27 @@ apply plugin: 'com.android.library' +def safeExtGet(prop, fallback) { + rootProject.ext.has(prop) ? 
rootProject.ext.get(prop) : fallback +} + android { - compileSdkVersion 23 - buildToolsVersion "23.0.1" + def agpVersion = com.android.Version.ANDROID_GRADLE_PLUGIN_VERSION + if (agpVersion.tokenize('.')[0].toInteger() >= 7) { + namespace "com.zxcpoiu.incallmanager" + } + + compileSdkVersion safeExtGet('compileSdkVersion', 33) + buildToolsVersion safeExtGet('buildToolsVersion', "30.0.2") defaultConfig { - minSdkVersion 16 - targetSdkVersion 22 + minSdkVersion safeExtGet('minSdkVersion', 21) + targetSdkVersion safeExtGet('targetSdkVersion', 30) versionCode 1 versionName "1.0" } } dependencies { - compile 'com.facebook.react:react-native:+' + implementation 'com.facebook.react:react-native:+' + implementation "androidx.media:media:1.4.3" } diff --git a/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/AppRTCBluetoothManager.java b/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/AppRTCBluetoothManager.java index 672977a..0d897fe 100644 --- a/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/AppRTCBluetoothManager.java +++ b/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/AppRTCBluetoothManager.java @@ -7,10 +7,9 @@ * in the file PATENTS. All contributing project authors may * be found in the AUTHORS file in the root of the source tree. 
*/ - package com.zxcpoiu.incallmanager.AppRTC; - import android.annotation.SuppressLint; +import android.bluetooth.BluetoothClass; import android.bluetooth.BluetoothAdapter; import android.bluetooth.BluetoothDevice; import android.bluetooth.BluetoothHeadset; @@ -20,28 +19,33 @@ import android.content.Intent; import android.content.IntentFilter; import android.content.pm.PackageManager; +import android.media.AudioDeviceCallback; +import android.media.AudioDeviceInfo; import android.media.AudioManager; +import android.os.Build; import android.os.Handler; import android.os.Looper; import android.os.Process; import android.util.Log; +import androidx.annotation.Nullable; +import androidx.annotation.RequiresApi; + import java.util.List; import java.util.Set; - +import java.util.ArrayList; +import com.zxcpoiu.incallmanager.AppRTC.AppRTCUtils; +import com.zxcpoiu.incallmanager.AppRTC.ThreadUtils; import com.zxcpoiu.incallmanager.InCallManagerModule; - /** * AppRTCProximitySensor manages functions related to Bluetoth devices in the * AppRTC demo. */ public class AppRTCBluetoothManager { private static final String TAG = "AppRTCBluetoothManager"; - // Timeout interval for starting or stopping audio to a Bluetooth SCO device. - private static final int BLUETOOTH_SCO_TIMEOUT_MS = 4000; + private static final int BLUETOOTH_SCO_TIMEOUT_MS = 6000; // Maximum number of SCO connection attempts. - private static final int MAX_SCO_CONNECTION_ATTEMPTS = 2; - + private static final int MAX_SCO_CONNECTION_ATTEMPTS = 10; // Bluetooth connection state. public enum State { // Bluetooth is not available; no adapter or Bluetooth is off. @@ -61,20 +65,26 @@ public enum State { // Bluetooth audio SCO connection with remote device is established. 
SCO_CONNECTED } - private final Context apprtcContext; private final InCallManagerModule apprtcAudioManager; + @Nullable private final AudioManager audioManager; private final Handler handler; - int scoConnectionAttempts; private State bluetoothState; private final BluetoothProfile.ServiceListener bluetoothServiceListener; + @Nullable private BluetoothAdapter bluetoothAdapter; + @Nullable private BluetoothHeadset bluetoothHeadset; + @Nullable private BluetoothDevice bluetoothDevice; - private final BroadcastReceiver bluetoothHeadsetReceiver; + @Nullable + private AudioDeviceInfo bluetoothAudioDevice; + + private AudioDeviceCallback bluetoothAudioDeviceCallback; + private final BroadcastReceiver bluetoothHeadsetReceiver; // Runs when the Bluetooth timeout expires. We use that timeout after calling // startScoAudio() or stopScoAudio() because we're not guaranteed to get a // callback after those calls. @@ -84,7 +94,6 @@ public void run() { bluetoothTimeout(); } }; - /** * Implementation of an interface that notifies BluetoothProfile IPC clients when they have been * connected to or disconnected from the service. @@ -104,7 +113,6 @@ public void onServiceConnected(int profile, BluetoothProfile proxy) { updateAudioDeviceState(); Log.d(TAG, "onServiceConnected done: BT state=" + bluetoothState); } - @Override /** Notifies the client when the proxy object has been disconnected from the service. 
*/ public void onServiceDisconnected(int profile) { @@ -121,6 +129,34 @@ public void onServiceDisconnected(int profile) { } } + @RequiresApi(api = Build.VERSION_CODES.S) + private class BluetoothAudioDeviceCallback extends AudioDeviceCallback { + @Override + public void onAudioDevicesAdded(AudioDeviceInfo[] addedDevices) { + updateDeviceList(); + } + + public void onAudioDevicesRemoved(AudioDeviceInfo[] removedDevices) { + updateDeviceList(); + } + + private void updateDeviceList() { + final AudioDeviceInfo newBtDevice = getScoDevice(); + boolean needChange = false; + if (bluetoothAudioDevice != null && newBtDevice == null) { + needChange = true; + bluetoothState = State.HEADSET_UNAVAILABLE; + } else if (bluetoothAudioDevice == null && newBtDevice != null) { + needChange = true; + } else if (bluetoothAudioDevice != null && bluetoothAudioDevice.getId() != newBtDevice.getId()) { + needChange = true; + } + if (needChange) { + updateAudioDeviceState(); + } + } + } + // Intent broadcast receiver which handles changes in Bluetooth device availability. // Detects headset changes and Bluetooth SCO state changes. private class BluetoothHeadsetBroadcastReceiver extends BroadcastReceiver { @@ -188,29 +224,30 @@ public void onReceive(Context context, Intent intent) { Log.d(TAG, "onReceive done: BT state=" + bluetoothState); } } - /** Construction. 
*/ public static AppRTCBluetoothManager create(Context context, InCallManagerModule audioManager) { - Log.d(TAG, "create"); + Log.d(TAG, "create" + AppRTCUtils.getThreadInfo()); return new AppRTCBluetoothManager(context, audioManager); } - protected AppRTCBluetoothManager(Context context, InCallManagerModule audioManager) { Log.d(TAG, "ctor"); + ThreadUtils.checkIsOnMainThread(); apprtcContext = context; apprtcAudioManager = audioManager; this.audioManager = getAudioManager(context); bluetoothState = State.UNINITIALIZED; bluetoothServiceListener = new BluetoothServiceListener(); bluetoothHeadsetReceiver = new BluetoothHeadsetBroadcastReceiver(); + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) { + bluetoothAudioDeviceCallback = new BluetoothAudioDeviceCallback(); + } handler = new Handler(Looper.getMainLooper()); } - /** Returns the internal state. */ public State getState() { + ThreadUtils.checkIsOnMainThread(); return bluetoothState; } - /** * Activates components required to detect Bluetooth devices and to enable * BT SCO (audio is routed via BT SCO) for the headset profile. The end @@ -224,9 +261,12 @@ public State getState() { * Note that the AppRTCAudioManager is also involved in driving this state * change. */ + @SuppressLint("MissingPermission") public void start() { + ThreadUtils.checkIsOnMainThread(); Log.d(TAG, "start"); - if (!hasPermission(apprtcContext, android.Manifest.permission.BLUETOOTH)) { + String p = Build.VERSION.SDK_INT >= Build.VERSION_CODES.S ? android.Manifest.permission.BLUETOOTH_CONNECT : android.Manifest.permission.BLUETOOTH; + if (!hasPermission(apprtcContext, p)) { Log.w(TAG, "Process (pid=" + Process.myPid() + ") lacks BLUETOOTH permission"); return; } @@ -256,22 +296,26 @@ public void start() { Log.e(TAG, "BluetoothAdapter.getProfileProxy(HEADSET) failed"); return; } - // Register receivers for BluetoothHeadset change notifications. 
- IntentFilter bluetoothHeadsetFilter = new IntentFilter(); - // Register receiver for change in connection state of the Headset profile. - bluetoothHeadsetFilter.addAction(BluetoothHeadset.ACTION_CONNECTION_STATE_CHANGED); - // Register receiver for change in audio connection state of the Headset profile. - bluetoothHeadsetFilter.addAction(BluetoothHeadset.ACTION_AUDIO_STATE_CHANGED); - registerReceiver(bluetoothHeadsetReceiver, bluetoothHeadsetFilter); - Log.d(TAG, "HEADSET profile state: " - + stateToString(bluetoothAdapter.getProfileConnectionState(BluetoothProfile.HEADSET))); + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) { + audioManager.registerAudioDeviceCallback(bluetoothAudioDeviceCallback, null); + } else { + // Register receivers for BluetoothHeadset change notifications. + IntentFilter bluetoothHeadsetFilter = new IntentFilter(); + // Register receiver for change in connection state of the Headset profile. + bluetoothHeadsetFilter.addAction(BluetoothHeadset.ACTION_CONNECTION_STATE_CHANGED); + // Register receiver for change in audio connection state of the Headset profile. + bluetoothHeadsetFilter.addAction(BluetoothHeadset.ACTION_AUDIO_STATE_CHANGED); + registerReceiver(bluetoothHeadsetReceiver, bluetoothHeadsetFilter); + Log.d(TAG, "HEADSET profile state: " + + stateToString(bluetoothAdapter.getProfileConnectionState(BluetoothProfile.HEADSET))); + } Log.d(TAG, "Bluetooth proxy for headset profile has started"); bluetoothState = State.HEADSET_UNAVAILABLE; Log.d(TAG, "start done: BT state=" + bluetoothState); } - /** Stops and closes all components related to Bluetooth audio. 
*/ public void stop() { + ThreadUtils.checkIsOnMainThread(); Log.d(TAG, "stop: BT state=" + bluetoothState); if (bluetoothAdapter == null) { return; @@ -282,8 +326,12 @@ public void stop() { if (bluetoothState == State.UNINITIALIZED) { return; } - unregisterReceiver(bluetoothHeadsetReceiver); - cancelTimer(); + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) { + audioManager.unregisterAudioDeviceCallback(bluetoothAudioDeviceCallback); + } else { + unregisterReceiver(bluetoothHeadsetReceiver); + cancelTimer(); + } if (bluetoothHeadset != null) { bluetoothAdapter.closeProfileProxy(BluetoothProfile.HEADSET, bluetoothHeadset); bluetoothHeadset = null; @@ -293,7 +341,6 @@ public void stop() { bluetoothState = State.UNINITIALIZED; Log.d(TAG, "stop done: BT state=" + bluetoothState); } - /** * Starts Bluetooth SCO connection with remote device. * Note that the phone application always has the priority on the usage of the SCO connection @@ -308,6 +355,7 @@ public void stop() { * accept SCO audio without a "call". */ public boolean startScoAudio() { + ThreadUtils.checkIsOnMainThread(); Log.d(TAG, "startSco: BT state=" + bluetoothState + ", " + "attempts: " + scoConnectionAttempts + ", " + "SCO is on: " + isScoOn()); @@ -319,95 +367,143 @@ public boolean startScoAudio() { Log.e(TAG, "BT SCO connection fails - no headset available"); return false; } - // Start BT SCO channel and wait for ACTION_AUDIO_STATE_CHANGED. - Log.d(TAG, "Starting Bluetooth SCO and waits for ACTION_AUDIO_STATE_CHANGED..."); - // The SCO connection establishment can take several seconds, hence we cannot rely on the - // connection to be available when the method returns but instead register to receive the - // intent ACTION_SCO_AUDIO_STATE_UPDATED and wait for the state to be SCO_AUDIO_STATE_CONNECTED. 
- bluetoothState = State.SCO_CONNECTING; - audioManager.startBluetoothSco(); - audioManager.setBluetoothScoOn(true); - scoConnectionAttempts++; - startTimer(); - Log.d(TAG, "startScoAudio done: BT state=" + bluetoothState + ", " - + "SCO is on: " + isScoOn()); + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) { + if (bluetoothAudioDevice != null) { + audioManager.setCommunicationDevice(bluetoothAudioDevice); + bluetoothState = State.SCO_CONNECTED; + Log.d(TAG, "Set bluetooth audio device as communication device: " + + "id=" + bluetoothAudioDevice.getId()); + } else { + bluetoothState = State.SCO_DISCONNECTING; + Log.d(TAG, "Cannot find any bluetooth SCO device to set as communication device"); + } + updateAudioDeviceState(); + } else { + // The SCO connection establishment can take several seconds, hence we cannot rely on the + // connection to be available when the method returns but instead register to receive the + // intent ACTION_SCO_AUDIO_STATE_UPDATED and wait for the state to be SCO_AUDIO_STATE_CONNECTED. + // Start BT SCO channel and wait for ACTION_AUDIO_STATE_CHANGED. + Log.d(TAG, "Starting Bluetooth SCO and waits for ACTION_AUDIO_STATE_CHANGED..."); + bluetoothState = State.SCO_CONNECTING; + startTimer(); + audioManager.startBluetoothSco(); + audioManager.setBluetoothScoOn(true); + scoConnectionAttempts++; + Log.d(TAG, "startScoAudio done: BT state=" + bluetoothState + ", " + + "SCO is on: " + isScoOn()); + } return true; } + private List<BluetoothDevice> getFinalConnectedDevices() { + List<BluetoothDevice> connectedDevices = bluetoothHeadset.getConnectedDevices(); + List<BluetoothDevice> finalDevices = new ArrayList<>(); + + for (BluetoothDevice device : connectedDevices) { + int majorClass = device.getBluetoothClass().getMajorDeviceClass(); + if (majorClass == BluetoothClass.Device.Major.AUDIO_VIDEO) { + finalDevices.add(device); + } + } + return finalDevices; + } /** Stops Bluetooth SCO connection with remote device. 
*/ public void stopScoAudio() { + ThreadUtils.checkIsOnMainThread(); Log.d(TAG, "stopScoAudio: BT state=" + bluetoothState + ", " + "SCO is on: " + isScoOn()); if (bluetoothState != State.SCO_CONNECTING && bluetoothState != State.SCO_CONNECTED) { return; } - cancelTimer(); - audioManager.stopBluetoothSco(); - audioManager.setBluetoothScoOn(false); + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) { + audioManager.clearCommunicationDevice(); + } else { + cancelTimer(); + audioManager.stopBluetoothSco(); + audioManager.setBluetoothScoOn(false); + } bluetoothState = State.SCO_DISCONNECTING; Log.d(TAG, "stopScoAudio done: BT state=" + bluetoothState + ", " + "SCO is on: " + isScoOn()); } - /** * Use the BluetoothHeadset proxy object (controls the Bluetooth Headset * Service via IPC) to update the list of connected devices for the HEADSET * profile. The internal state will change to HEADSET_UNAVAILABLE or to - * HEADSET_AVAILABLE and |bluetoothDevice| will be mapped to the connected + * HEADSET_AVAILABLE and `bluetoothDevice` will be mapped to the connected * device if available. */ + @SuppressLint("MissingPermission") public void updateDevice() { if (bluetoothState == State.UNINITIALIZED || bluetoothHeadset == null) { return; } Log.d(TAG, "updateDevice"); - // Get connected devices for the headset profile. Returns the set of - // devices which are in state STATE_CONNECTED. The BluetoothDevice class - // is just a thin wrapper for a Bluetooth hardware address. 
- List devices = bluetoothHeadset.getConnectedDevices(); - if (devices.isEmpty()) { - bluetoothDevice = null; - bluetoothState = State.HEADSET_UNAVAILABLE; - Log.d(TAG, "No connected bluetooth headset"); + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) { + bluetoothAudioDevice = getScoDevice(); + if (bluetoothAudioDevice != null) { + bluetoothState = State.HEADSET_AVAILABLE; + Log.d(TAG, "Connected bluetooth headset: " + + "name=" + bluetoothAudioDevice.getProductName()); + } else { + bluetoothState = State.HEADSET_UNAVAILABLE; + } } else { - // Always use first device in list. Android only supports one device. - bluetoothDevice = devices.get(0); - bluetoothState = State.HEADSET_AVAILABLE; - Log.d(TAG, "Connected bluetooth headset: " - + "name=" + bluetoothDevice.getName() + ", " - + "state=" + stateToString(bluetoothHeadset.getConnectionState(bluetoothDevice)) - + ", SCO audio=" + bluetoothHeadset.isAudioConnected(bluetoothDevice)); + // Get connected devices for the headset profile. Returns the set of + // devices which are in state STATE_CONNECTED. The BluetoothDevice class + // is just a thin wrapper for a Bluetooth hardware address. + List devices = getFinalConnectedDevices(); + if (devices.isEmpty()) { + bluetoothDevice = null; + bluetoothState = State.HEADSET_UNAVAILABLE; + Log.d(TAG, "No connected bluetooth headset"); + } else { + // Always use first device in list. Android only supports one device. + bluetoothDevice = devices.get(0); + bluetoothState = State.HEADSET_AVAILABLE; + Log.d(TAG, "Connected bluetooth headset: " + + "name=" + bluetoothDevice.getName() + ", " + + "state=" + stateToString(bluetoothHeadset.getConnectionState(bluetoothDevice)) + + ", SCO audio=" + bluetoothHeadset.isAudioConnected(bluetoothDevice)); + } } Log.d(TAG, "updateDevice done: BT state=" + bluetoothState); } - /** * Stubs for test mocks. 
*/ + @Nullable protected AudioManager getAudioManager(Context context) { return (AudioManager) context.getSystemService(Context.AUDIO_SERVICE); } - protected void registerReceiver(BroadcastReceiver receiver, IntentFilter filter) { - apprtcContext.registerReceiver(receiver, filter); + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.TIRAMISU) { + apprtcContext.registerReceiver(receiver, filter, Context.RECEIVER_NOT_EXPORTED); + } else { + apprtcContext.registerReceiver(receiver, filter); + } } - protected void unregisterReceiver(BroadcastReceiver receiver) { - apprtcContext.unregisterReceiver(receiver); + if (receiver != null) { + try { + apprtcContext.unregisterReceiver(receiver); + } catch (final Exception exception) { + // The receiver was not registered. + // There is nothing to do in that case. + // Everything is fine. + } + } } - protected boolean getBluetoothProfileProxy( Context context, BluetoothProfile.ServiceListener listener, int profile) { return bluetoothAdapter.getProfileProxy(context, listener, profile); } - protected boolean hasPermission(Context context, String permission) { return apprtcContext.checkPermission(permission, Process.myPid(), Process.myUid()) == PackageManager.PERMISSION_GRANTED; } - /** Logs the state of the local Bluetooth adapter. */ - @SuppressLint("HardwareIds") + @SuppressLint({"HardwareIds", "MissingPermission"}) protected void logBluetoothAdapterInfo(BluetoothAdapter localAdapter) { Log.d(TAG, "BluetoothAdapter: " + "enabled=" + localAdapter.isEnabled() + ", " @@ -415,77 +511,85 @@ protected void logBluetoothAdapterInfo(BluetoothAdapter localAdapter) { + "name=" + localAdapter.getName() + ", " + "address=" + localAdapter.getAddress()); // Log the set of BluetoothDevice objects that are bonded (paired) to the local adapter. 
- Set pairedDevices = localAdapter.getBondedDevices(); + Set pairedDevices = localAdapter.getBondedDevices(); if (!pairedDevices.isEmpty()) { Log.d(TAG, "paired devices:"); for (BluetoothDevice device : pairedDevices) { - Log.d(TAG, " name=" + device.getName() + ", address=" + device.getAddress()); + Log.d(TAG, " name=" + device.getName() + ", address=" + device.getAddress() + ", deviceClass=" + String.valueOf(device.getBluetoothClass().getDeviceClass()) + ", deviceMajorClass=" + String.valueOf(device.getBluetoothClass().getMajorDeviceClass())); } } } - /** Ensures that the audio manager updates its list of available audio devices. */ private void updateAudioDeviceState() { + ThreadUtils.checkIsOnMainThread(); Log.d(TAG, "updateAudioDeviceState"); apprtcAudioManager.updateAudioDeviceState(); } - /** Starts timer which times out after BLUETOOTH_SCO_TIMEOUT_MS milliseconds. */ private void startTimer() { + ThreadUtils.checkIsOnMainThread(); Log.d(TAG, "startTimer"); handler.postDelayed(bluetoothTimeoutRunnable, BLUETOOTH_SCO_TIMEOUT_MS); } - /** Cancels any outstanding timer tasks. */ private void cancelTimer() { + ThreadUtils.checkIsOnMainThread(); Log.d(TAG, "cancelTimer"); handler.removeCallbacks(bluetoothTimeoutRunnable); } - /** * Called when start of the BT SCO channel takes too long time. Usually * happens when the BT device has been turned on during an ongoing call. */ + @SuppressLint("MissingPermission") private void bluetoothTimeout() { + ThreadUtils.checkIsOnMainThread(); if (bluetoothState == State.UNINITIALIZED || bluetoothHeadset == null) { return; } - Log.d(TAG, "bluetoothTimeout: BT state=" + bluetoothState + ", " - + "attempts: " + scoConnectionAttempts + ", " - + "SCO is on: " + isScoOn()); - if (bluetoothState != State.SCO_CONNECTING) { - return; - } - // Bluetooth SCO should be connecting; check the latest result. 
- boolean scoConnected = false; - List devices = bluetoothHeadset.getConnectedDevices(); - if (devices.size() > 0) { - bluetoothDevice = devices.get(0); - if (bluetoothHeadset.isAudioConnected(bluetoothDevice)) { - Log.d(TAG, "SCO connected with " + bluetoothDevice.getName()); - scoConnected = true; + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) { + Log.w(TAG, "Invalid state, the timeout should not be running on the version: " + Build.VERSION.SDK_INT); + } else { + Log.d(TAG, "bluetoothTimeout: BT state=" + bluetoothState + ", " + + "attempts: " + scoConnectionAttempts + ", " + + "SCO is on: " + isScoOn()); + if (bluetoothState != State.SCO_CONNECTING) { + return; + } + // Bluetooth SCO should be connecting; check the latest result. + boolean scoConnected = false; + List devices = getFinalConnectedDevices(); + if (devices.size() > 0) { + bluetoothDevice = devices.get(0); + if (bluetoothHeadset.isAudioConnected(bluetoothDevice)) { + Log.d(TAG, "SCO connected with " + bluetoothDevice.getName()); + scoConnected = true; + } else { + Log.d(TAG, "SCO is not connected with " + bluetoothDevice.getName()); + } + } + if (scoConnected) { + // We thought BT had timed out, but it's actually on; updating state. + bluetoothState = State.SCO_CONNECTED; + scoConnectionAttempts = 0; } else { - Log.d(TAG, "SCO is not connected with " + bluetoothDevice.getName()); + // Give up and "cancel" our request by calling stopBluetoothSco(). + Log.w(TAG, "BT failed to connect after timeout"); + stopScoAudio(); } } - if (scoConnected) { - // We thought BT had timed out, but it's actually on; updating state. - bluetoothState = State.SCO_CONNECTED; - scoConnectionAttempts = 0; - } else { - // Give up and "cancel" our request by calling stopBluetoothSco(). - Log.w(TAG, "BT failed to connect after timeout"); - stopScoAudio(); - } updateAudioDeviceState(); Log.d(TAG, "bluetoothTimeout done: BT state=" + bluetoothState); } - /** Checks whether audio uses Bluetooth SCO. 
*/ private boolean isScoOn() { - return audioManager.isBluetoothScoOn(); + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) { + AudioDeviceInfo communicationDevice = audioManager.getCommunicationDevice(); + return communicationDevice != null && bluetoothAudioDevice != null && communicationDevice.getId() == bluetoothAudioDevice.getId(); + } else { + return audioManager.isBluetoothScoOn(); + } } - /** Converts BluetoothAdapter states into local string representations. */ private String stateToString(int state) { switch (state) { @@ -513,4 +617,19 @@ private String stateToString(int state) { return "INVALID"; } } + + @Nullable + @RequiresApi(api = Build.VERSION_CODES.S) + private AudioDeviceInfo getScoDevice() { + if (audioManager != null) { + List devices = audioManager.getAvailableCommunicationDevices(); + for (AudioDeviceInfo device : devices) { + if (device.getType() == AudioDeviceInfo.TYPE_BLE_HEADSET + || device.getType() == AudioDeviceInfo.TYPE_BLUETOOTH_SCO) { + return device; + } + } + } + return null; + } } diff --git a/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/AppRTCProximitySensor.java b/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/AppRTCProximitySensor.java index a8f07ba..7ae52fc 100644 --- a/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/AppRTCProximitySensor.java +++ b/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/AppRTCProximitySensor.java @@ -7,9 +7,7 @@ * in the file PATENTS. All contributing project authors may * be found in the AUTHORS file in the root of the source tree. 
  */
-
 package com.zxcpoiu.incallmanager.AppRTC;
-
 import android.content.Context;
 import android.hardware.Sensor;
 import android.hardware.SensorEvent;
@@ -17,7 +15,9 @@
 import android.hardware.SensorManager;
 import android.os.Build;
 import android.util.Log;
-
+import androidx.annotation.Nullable;
+import com.zxcpoiu.incallmanager.AppRTC.AppRTCUtils;
+import com.zxcpoiu.incallmanager.AppRTC.ThreadUtils;
 /**
  * AppRTCProximitySensor manages functions related to the proximity sensor in
  * the AppRTC demo.
@@ -29,33 +29,30 @@
  */
 public class AppRTCProximitySensor implements SensorEventListener {
   private static final String TAG = "AppRTCProximitySensor";
-  // This class should be created, started and stopped on one thread
-  // (e.g. the main thread). We use |nonThreadSafe| to ensure that this is
-  // the case. Only active when |DEBUG| is set to true.
-
+  // (e.g. the main thread). We use `nonThreadSafe` to ensure that this is
+  // the case. Only active when `DEBUG` is set to true.
+  private final ThreadUtils.ThreadChecker threadChecker = new ThreadUtils.ThreadChecker();
   private final Runnable onSensorStateListener;
   private final SensorManager sensorManager;
-  private Sensor proximitySensor = null;
-  private boolean lastStateReportIsNear = false;
-
+  @Nullable private Sensor proximitySensor;
+  private boolean lastStateReportIsNear;
   /** Construction */
   public static AppRTCProximitySensor create(Context context, Runnable sensorStateListener) {
     return new AppRTCProximitySensor(context, sensorStateListener);
   }
-
   private AppRTCProximitySensor(Context context, Runnable sensorStateListener) {
-    Log.d(TAG, "AppRTCProximitySensor");
+    Log.d(TAG, "AppRTCProximitySensor" + AppRTCUtils.getThreadInfo());
     onSensorStateListener = sensorStateListener;
     sensorManager = ((SensorManager) context.getSystemService(Context.SENSOR_SERVICE));
   }
-
   /**
    * Activate the proximity sensor. Also do initialization if called for the
    * first time.
    */
   public boolean start() {
-    Log.d(TAG, "start");
+    threadChecker.checkIsOnValidThread();
+    Log.d(TAG, "start" + AppRTCUtils.getThreadInfo());
     if (!initDefaultSensor()) {
       // Proximity sensor is not supported on this device.
       return false;
@@ -63,30 +60,32 @@ public boolean start() {
     sensorManager.registerListener(this, proximitySensor, SensorManager.SENSOR_DELAY_NORMAL);
     return true;
   }
-
   /** Deactivate the proximity sensor. */
   public void stop() {
-    Log.d(TAG, "stop");
+    threadChecker.checkIsOnValidThread();
+    Log.d(TAG, "stop" + AppRTCUtils.getThreadInfo());
     if (proximitySensor == null) {
       return;
     }
     sensorManager.unregisterListener(this, proximitySensor);
   }
-
   /** Getter for last reported state. Set to true if "near" is reported. */
   public boolean sensorReportsNearState() {
+    threadChecker.checkIsOnValidThread();
     return lastStateReportIsNear;
   }
-
   @Override
   public final void onAccuracyChanged(Sensor sensor, int accuracy) {
+    threadChecker.checkIsOnValidThread();
+    AppRTCUtils.assertIsTrue(sensor.getType() == Sensor.TYPE_PROXIMITY);
     if (accuracy == SensorManager.SENSOR_STATUS_UNRELIABLE) {
       Log.e(TAG, "The values returned by this sensor cannot be trusted");
     }
   }
-
   @Override
   public final void onSensorChanged(SensorEvent event) {
+    threadChecker.checkIsOnValidThread();
+    AppRTCUtils.assertIsTrue(event.sensor.getType() == Sensor.TYPE_PROXIMITY);
     // As a best practice; do as little as possible within this method and
     // avoid blocking.
     float distanceInCentimeters = event.values[0];
@@ -97,18 +96,15 @@ public final void onSensorChanged(SensorEvent event) {
       Log.d(TAG, "Proximity sensor => FAR state");
       lastStateReportIsNear = false;
     }
-
     // Report about new state to listening client. Client can then call
     // sensorReportsNearState() to query the current state (NEAR or FAR).
     if (onSensorStateListener != null) {
       onSensorStateListener.run();
     }
-
-    Log.d(TAG, "onSensorChanged" + ": "
+    Log.d(TAG, "onSensorChanged" + AppRTCUtils.getThreadInfo() + ": "
         + "accuracy=" + event.accuracy + ", timestamp=" + event.timestamp + ", distance="
         + event.values[0]);
   }
-
   /**
    * Get default proximity sensor if it exists. Tablet devices (e.g. Nexus 7)
    * does not support this type of sensor and false will be returned in such
@@ -125,7 +121,6 @@ private boolean initDefaultSensor() {
     logProximitySensorInfo();
     return true;
   }
-
   /** Helper method for logging information about the proximity sensor. */
   private void logProximitySensorInfo() {
     if (proximitySensor == null) {
@@ -138,16 +133,10 @@ private void logProximitySensorInfo() {
     info.append(", resolution: ").append(proximitySensor.getResolution());
     info.append(", max range: ").append(proximitySensor.getMaximumRange());
     info.append(", min delay: ").append(proximitySensor.getMinDelay());
-    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT_WATCH) {
-      // Added in API level 20.
-      info.append(", type: ").append(proximitySensor.getStringType());
-    }
-    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
-      // Added in API level 21.
-      info.append(", max delay: ").append(proximitySensor.getMaxDelay());
-      info.append(", reporting mode: ").append(proximitySensor.getReportingMode());
-      info.append(", isWakeUpSensor: ").append(proximitySensor.isWakeUpSensor());
-    }
+    info.append(", type: ").append(proximitySensor.getStringType());
+    info.append(", max delay: ").append(proximitySensor.getMaxDelay());
+    info.append(", reporting mode: ").append(proximitySensor.getReportingMode());
+    info.append(", isWakeUpSensor: ").append(proximitySensor.isWakeUpSensor());
     Log.d(TAG, info.toString());
   }
 }
diff --git a/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/diff/AppRTCBluetoothManager.java.diff b/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/diff/AppRTCBluetoothManager.java.diff
index c0a83ae..f8172f8 100644
--- a/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/diff/AppRTCBluetoothManager.java.diff
+++ b/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/diff/AppRTCBluetoothManager.java.diff
@@ -1,119 +1,64 @@
---- /home/zxcpoiu/git/webrtcbuilds/out/src/examples/androidapp/src/org/appspot/apprtc/AppRTCBluetoothManager.java 2017-11-30 16:59:50.918956062 +0800
-+++ AppRTCBluetoothManager.java 2017-12-08 18:01:45.348130079 +0800
-@@ -8,7 +8,7 @@
+--- AppRTCBluetoothManager.orig.java
++++ AppRTCBluetoothManager.java
+@@ -7,7 +7,7 @@
+ * in the file PATENTS. All contributing project authors may
+ * be found in the AUTHORS file in the root of the source tree.
 */
-
 -package org.appspot.apprtc;
 +package com.zxcpoiu.incallmanager.AppRTC;
-
 import android.annotation.SuppressLint;
 import android.bluetooth.BluetoothAdapter;
-@@ -27,8 +27,8 @@
- import android.util.Log;
+ import android.bluetooth.BluetoothDevice;
+@@ -19,6 +19,7 @@
+ import android.content.IntentFilter;
+ import android.content.pm.PackageManager;
+ import android.media.AudioManager;
++import android.os.Build;
+ import android.os.Handler;
+ import android.os.Looper;
+ import android.os.Process;
+@@ -26,8 +27,9 @@
+ import androidx.annotation.Nullable;
 import java.util.List;
 import java.util.Set;
 -import org.appspot.apprtc.util.AppRTCUtils;
 -import org.webrtc.ThreadUtils;
-+
++import com.zxcpoiu.incallmanager.AppRTC.AppRTCUtils;
++import com.zxcpoiu.incallmanager.AppRTC.ThreadUtils;
 +import com.zxcpoiu.incallmanager.InCallManagerModule;
-
 /**
  * AppRTCProximitySensor manages functions related to Bluetoth devices in the
-@@ -63,7 +63,7 @@
+ * AppRTC demo.
+@@ -58,7 +60,7 @@
+ SCO_CONNECTED
 }
-
 private final Context apprtcContext;
 -private final AppRTCAudioManager apprtcAudioManager;
 +private final InCallManagerModule apprtcAudioManager;
+ @Nullable private final AudioManager audioManager;
 private final Handler handler;
-
-@@ -190,14 +190,13 @@
+@@ -183,11 +185,11 @@
+ }
 }
- /** Construction. */
- static AppRTCBluetoothManager create(Context context, AppRTCAudioManager audioManager) {
-- Log.d(TAG, "create" + AppRTCUtils.getThreadInfo());
+ public static AppRTCBluetoothManager create(Context context, InCallManagerModule audioManager) {
-+ Log.d(TAG, "create");
+ Log.d(TAG, "create" + AppRTCUtils.getThreadInfo());
 return new AppRTCBluetoothManager(context, audioManager);
 }
-
- protected AppRTCBluetoothManager(Context context, AppRTCAudioManager audioManager) {
+ protected AppRTCBluetoothManager(Context context, InCallManagerModule audioManager) {
 Log.d(TAG, "ctor");
-- ThreadUtils.checkIsOnMainThread();
+ ThreadUtils.checkIsOnMainThread();
 apprtcContext = context;
-@@ -209,7 +208,6 @@
-
- /** Returns the internal state. */
- public State getState() {
-- ThreadUtils.checkIsOnMainThread();
- return bluetoothState;
- }
-
-@@ -227,7 +225,6 @@
- * change.
- */
+@@ -219,7 +221,8 @@
 public void start() {
-- ThreadUtils.checkIsOnMainThread();
+ ThreadUtils.checkIsOnMainThread();
 Log.d(TAG, "start");
- if (!hasPermission(apprtcContext, android.Manifest.permission.BLUETOOTH)) {
+- if (!hasPermission(apprtcContext, android.Manifest.permission.BLUETOOTH)) {
++ String p = Build.VERSION.SDK_INT >= Build.VERSION_CODES.S ? android.Manifest.permission.BLUETOOTH_CONNECT : android.Manifest.permission.BLUETOOTH;
++ if (!hasPermission(apprtcContext, p)) {
 Log.w(TAG, "Process (pid=" + Process.myPid() + ") lacks BLUETOOTH permission");
-@@ -275,7 +272,6 @@
-
- /** Stops and closes all components related to Bluetooth audio. */
- public void stop() {
-- ThreadUtils.checkIsOnMainThread();
- Log.d(TAG, "stop: BT state=" + bluetoothState);
- if (bluetoothAdapter == null) {
- return;
-@@ -312,7 +308,6 @@
- * accept SCO audio without a "call".
- */
- public boolean startScoAudio() {
-- ThreadUtils.checkIsOnMainThread();
- Log.d(TAG, "startSco: BT state=" + bluetoothState + ", "
- + "attempts: " + scoConnectionAttempts + ", "
- + "SCO is on: " + isScoOn());
-@@ -341,7 +336,6 @@
-
- /** Stops Bluetooth SCO connection with remote device. */
- public void stopScoAudio() {
-- ThreadUtils.checkIsOnMainThread();
- Log.d(TAG, "stopScoAudio: BT state=" + bluetoothState + ", "
- + "SCO is on: " + isScoOn());
- if (bluetoothState != State.SCO_CONNECTING && bluetoothState != State.SCO_CONNECTED) {
-@@ -432,21 +426,18 @@
-
- /** Ensures that the audio manager updates its list of available audio devices. */
- private void updateAudioDeviceState() {
-- ThreadUtils.checkIsOnMainThread();
- Log.d(TAG, "updateAudioDeviceState");
- apprtcAudioManager.updateAudioDeviceState();
- }
-
- /** Starts timer which times out after BLUETOOTH_SCO_TIMEOUT_MS milliseconds. */
- private void startTimer() {
-- ThreadUtils.checkIsOnMainThread();
- Log.d(TAG, "startTimer");
- handler.postDelayed(bluetoothTimeoutRunnable, BLUETOOTH_SCO_TIMEOUT_MS);
- }
-
- /** Cancels any outstanding timer tasks. */
- private void cancelTimer() {
-- ThreadUtils.checkIsOnMainThread();
- Log.d(TAG, "cancelTimer");
- handler.removeCallbacks(bluetoothTimeoutRunnable);
- }
-@@ -456,7 +447,6 @@
- * happens when the BT device has been turned on during an ongoing call.
- */
- private void bluetoothTimeout() {
-- ThreadUtils.checkIsOnMainThread();
- if (bluetoothState == State.UNINITIALIZED || bluetoothHeadset == null) {
 return;
 }
diff --git a/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/diff/AppRTCProximitySensor.java.diff b/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/diff/AppRTCProximitySensor.java.diff
index 8a4ed9d..bbd9bc6 100644
--- a/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/diff/AppRTCProximitySensor.java.diff
+++ b/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/diff/AppRTCProximitySensor.java.diff
@@ -1,96 +1,31 @@
---- /home/zxcpoiu/git/webrtcbuilds/out/src/examples/androidapp/src/org/appspot/apprtc/AppRTCProximitySensor.java 2017-11-30 16:59:50.918956062 +0800
-+++ AppRTCProximitySensor.java 2017-12-08 18:02:05.004106849 +0800
-@@ -8,7 +8,7 @@
+--- AppRTCProximitySensor.orig.java
++++ AppRTCProximitySensor.java
+@@ -7,7 +7,7 @@
+ * in the file PATENTS. All contributing project authors may
+ * be found in the AUTHORS file in the root of the source tree.
 */
-
-package org.appspot.apprtc;
+package com.zxcpoiu.incallmanager.AppRTC;
-
 import android.content.Context;
 import android.hardware.Sensor;
-@@ -17,8 +17,6 @@
- import android.hardware.SensorManager;
 import android.hardware.SensorEvent;
+@@ -16,8 +16,8 @@
 import android.os.Build;
 import android.util.Log;
+ import androidx.annotation.Nullable;
 -import org.appspot.apprtc.util.AppRTCUtils;
 -import org.webrtc.ThreadUtils;
-
++import com.zxcpoiu.incallmanager.AppRTC.AppRTCUtils;
++import com.zxcpoiu.incallmanager.AppRTC.ThreadUtils;
 /**
 * AppRTCProximitySensor manages functions related to the proximity sensor in
 * the AppRTC demo.
-@@ -35,7 +33,6 @@
- // This class should be created, started and stopped on one thread
- // (e.g. the main thread). We use |nonThreadSafe| to ensure that this is
- // the case. Only active when |DEBUG| is set to true.
-- private final ThreadUtils.ThreadChecker threadChecker = new ThreadUtils.ThreadChecker();
-
- private final Runnable onSensorStateListener;
- private final SensorManager sensorManager;
-@@ -43,12 +40,12 @@
- private boolean lastStateReportIsNear = false;
-
+@@ -38,7 +38,7 @@
+ @Nullable private Sensor proximitySensor;
+ private boolean lastStateReportIsNear;
 /** Construction */
- static AppRTCProximitySensor create(Context context, Runnable sensorStateListener) {
+ public static AppRTCProximitySensor create(Context context, Runnable sensorStateListener) {
 return new AppRTCProximitySensor(context, sensorStateListener);
 }
- private AppRTCProximitySensor(Context context, Runnable sensorStateListener) {
-- Log.d(TAG, "AppRTCProximitySensor" + AppRTCUtils.getThreadInfo());
-+ Log.d(TAG, "AppRTCProximitySensor");
- onSensorStateListener = sensorStateListener;
- sensorManager = ((SensorManager) context.getSystemService(Context.SENSOR_SERVICE));
- }
-@@ -58,8 +55,7 @@
- * first time.
- */
- public boolean start() {
-- threadChecker.checkIsOnValidThread();
-- Log.d(TAG, "start" + AppRTCUtils.getThreadInfo());
-+ Log.d(TAG, "start");
- if (!initDefaultSensor()) {
- // Proximity sensor is not supported on this device.
- return false;
-@@ -70,8 +66,7 @@
-
- /** Deactivate the proximity sensor. */
- public void stop() {
-- threadChecker.checkIsOnValidThread();
-- Log.d(TAG, "stop" + AppRTCUtils.getThreadInfo());
-+ Log.d(TAG, "stop");
- if (proximitySensor == null) {
- return;
- }
-@@ -80,14 +75,11 @@
-
- /** Getter for last reported state. Set to true if "near" is reported. */
- public boolean sensorReportsNearState() {
-- threadChecker.checkIsOnValidThread();
- return lastStateReportIsNear;
- }
-
- @Override
- public final void onAccuracyChanged(Sensor sensor, int accuracy) {
-- threadChecker.checkIsOnValidThread();
-- AppRTCUtils.assertIsTrue(sensor.getType() == Sensor.TYPE_PROXIMITY);
- if (accuracy == SensorManager.SENSOR_STATUS_UNRELIABLE) {
- Log.e(TAG, "The values returned by this sensor cannot be trusted");
- }
-@@ -95,8 +87,6 @@
-
- @Override
- public final void onSensorChanged(SensorEvent event) {
-- threadChecker.checkIsOnValidThread();
-- AppRTCUtils.assertIsTrue(event.sensor.getType() == Sensor.TYPE_PROXIMITY);
- // As a best practice; do as little as possible within this method and
- // avoid blocking.
- float distanceInCentimeters = event.values[0];
-@@ -114,7 +104,7 @@
- onSensorStateListener.run();
- }
-
-- Log.d(TAG, "onSensorChanged" + AppRTCUtils.getThreadInfo() + ": "
-+ Log.d(TAG, "onSensorChanged" + ": "
- + "accuracy=" + event.accuracy + ", timestamp=" + event.timestamp + ", distance="
- + event.values[0]);
- }
diff --git a/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/diff/AppRTCUtils.java.diff b/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/diff/AppRTCUtils.java.diff
new file mode 100644
index 0000000..db0a93a
--- /dev/null
+++ b/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/diff/AppRTCUtils.java.diff
@@ -0,0 +1,11 @@
+--- AppRTCUtils.orig.java 2022-05-26 06:35:49.532008067 +0800
++++ AppRTCUtils.java 2022-05-26 06:36:31.007700973 +0800
+@@ -7,7 +7,7 @@
+ * in the file PATENTS. All contributing project authors may
+ * be found in the AUTHORS file in the root of the source tree.
+ */
+-package org.appspot.apprtc.util;
++package com.zxcpoiu.incallmanager.AppRTC;
+ import android.os.Build;
+ import android.util.Log;
+ /**
diff --git a/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/diff/ThreadUtils.java.diff b/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/diff/ThreadUtils.java.diff
new file mode 100644
index 0000000..65fcba9
--- /dev/null
+++ b/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/diff/ThreadUtils.java.diff
@@ -0,0 +1,176 @@
+--- ThreadUtils.orig.java 2022-05-26 06:50:16.889507506 +0800
++++ ThreadUtils.java 2022-05-26 06:49:53.697682611 +0800
+@@ -7,14 +7,9 @@
+ * in the file PATENTS. All contributing project authors may
+ * be found in the AUTHORS file in the root of the source tree.
+ */
+-package org.webrtc;
+-import android.os.Handler;
++package com.zxcpoiu.incallmanager.AppRTC;
+ import android.os.Looper;
+-import android.os.SystemClock;
+ import androidx.annotation.Nullable;
+-import java.util.concurrent.Callable;
+-import java.util.concurrent.CountDownLatch;
+-import java.util.concurrent.TimeUnit;
+ public class ThreadUtils {
+ /**
+ * Utility class to be used for checking that a method is called on the correct thread.
+@@ -41,157 +36,4 @@
+ throw new IllegalStateException("Not on main thread!");
+ }
+ }
+- /**
+- * Utility interface to be used with executeUninterruptibly() to wait for blocking operations
+- * to complete without getting interrupted..
+- */
+- public interface BlockingOperation { void run() throws InterruptedException; }
+- /**
+- * Utility method to make sure a blocking operation is executed to completion without getting
+- * interrupted. This should be used in cases where the operation is waiting for some critical
+- * work, e.g. cleanup, that must complete before returning. If the thread is interrupted during
+- * the blocking operation, this function will re-run the operation until completion, and only then
+- * re-interrupt the thread.
+- */
+- public static void executeUninterruptibly(BlockingOperation operation) {
+- boolean wasInterrupted = false;
+- while (true) {
+- try {
+- operation.run();
+- break;
+- } catch (InterruptedException e) {
+- // Someone is asking us to return early at our convenience. We can't cancel this operation,
+- // but we should preserve the information and pass it along.
+- wasInterrupted = true;
+- }
+- }
+- // Pass interruption information along.
+- if (wasInterrupted) {
+- Thread.currentThread().interrupt();
+- }
+- }
+- public static boolean joinUninterruptibly(final Thread thread, long timeoutMs) {
+- final long startTimeMs = SystemClock.elapsedRealtime();
+- long timeRemainingMs = timeoutMs;
+- boolean wasInterrupted = false;
+- while (timeRemainingMs > 0) {
+- try {
+- thread.join(timeRemainingMs);
+- break;
+- } catch (InterruptedException e) {
+- // Someone is asking us to return early at our convenience. We can't cancel this operation,
+- // but we should preserve the information and pass it along.
+- wasInterrupted = true;
+- final long elapsedTimeMs = SystemClock.elapsedRealtime() - startTimeMs;
+- timeRemainingMs = timeoutMs - elapsedTimeMs;
+- }
+- }
+- // Pass interruption information along.
+- if (wasInterrupted) {
+- Thread.currentThread().interrupt();
+- }
+- return !thread.isAlive();
+- }
+- public static void joinUninterruptibly(final Thread thread) {
+- executeUninterruptibly(new BlockingOperation() {
+- @Override
+- public void run() throws InterruptedException {
+- thread.join();
+- }
+- });
+- }
+- public static void awaitUninterruptibly(final CountDownLatch latch) {
+- executeUninterruptibly(new BlockingOperation() {
+- @Override
+- public void run() throws InterruptedException {
+- latch.await();
+- }
+- });
+- }
+- public static boolean awaitUninterruptibly(CountDownLatch barrier, long timeoutMs) {
+- final long startTimeMs = SystemClock.elapsedRealtime();
+- long timeRemainingMs = timeoutMs;
+- boolean wasInterrupted = false;
+- boolean result = false;
+- do {
+- try {
+- result = barrier.await(timeRemainingMs, TimeUnit.MILLISECONDS);
+- break;
+- } catch (InterruptedException e) {
+- // Someone is asking us to return early at our convenience. We can't cancel this operation,
+- // but we should preserve the information and pass it along.
+- wasInterrupted = true;
+- final long elapsedTimeMs = SystemClock.elapsedRealtime() - startTimeMs;
+- timeRemainingMs = timeoutMs - elapsedTimeMs;
+- }
+- } while (timeRemainingMs > 0);
+- // Pass interruption information along.
+- if (wasInterrupted) {
+- Thread.currentThread().interrupt();
+- }
+- return result;
+- }
+- /**
+- * Post `callable` to `handler` and wait for the result.
+- */
+- public static <V> V invokeAtFrontUninterruptibly(
+- final Handler handler, final Callable<V> callable) {
+- if (handler.getLooper().getThread() == Thread.currentThread()) {
+- try {
+- return callable.call();
+- } catch (Exception e) {
+- throw new RuntimeException(e);
+- }
+- }
+- // Place-holder classes that are assignable inside nested class.
+- class CaughtException {
+- Exception e;
+- }
+- class Result {
+- public V value;
+- }
+- final Result result = new Result();
+- final CaughtException caughtException = new CaughtException();
+- final CountDownLatch barrier = new CountDownLatch(1);
+- handler.post(new Runnable() {
+- @Override
+- public void run() {
+- try {
+- result.value = callable.call();
+- } catch (Exception e) {
+- caughtException.e = e;
+- }
+- barrier.countDown();
+- }
+- });
+- awaitUninterruptibly(barrier);
+- // Re-throw any runtime exception caught inside the other thread. Since this is an invoke, add
+- // stack trace for the waiting thread as well.
+- if (caughtException.e != null) {
+- final RuntimeException runtimeException = new RuntimeException(caughtException.e);
+- runtimeException.setStackTrace(
+- concatStackTraces(caughtException.e.getStackTrace(), runtimeException.getStackTrace()));
+- throw runtimeException;
+- }
+- return result.value;
+- }
+- /**
+- * Post `runner` to `handler`, at the front, and wait for completion.
+- */
+- public static void invokeAtFrontUninterruptibly(final Handler handler, final Runnable runner) {
+- invokeAtFrontUninterruptibly(handler, new Callable<Void>() {
+- @Override
+- public Void call() {
+- runner.run();
+- return null;
+- }
+- });
+- }
+- static StackTraceElement[] concatStackTraces(
+- StackTraceElement[] inner, StackTraceElement[] outer) {
+- final StackTraceElement[] combined = new StackTraceElement[inner.length + outer.length];
+- System.arraycopy(inner, 0, combined, 0, inner.length);
+- System.arraycopy(outer, 0, combined, inner.length, outer.length);
+- return combined;
+- }
+ }
diff --git a/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/util/AppRTCUtils.java b/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/util/AppRTCUtils.java
new file mode 100644
index 0000000..64fad3c
--- /dev/null
+++ b/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/util/AppRTCUtils.java
@@ -0,0 +1,41 @@
+/*
+ * Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ * Use of this source code is governed by a BSD-style license
+ * that can be found in the LICENSE file in the root of the source
+ * tree. An additional intellectual property rights grant can be found
+ * in the file PATENTS. All contributing project authors may
+ * be found in the AUTHORS file in the root of the source tree.
+ */
+package com.zxcpoiu.incallmanager.AppRTC;
+import android.os.Build;
+import android.util.Log;
+/**
+ * AppRTCUtils provides helper functions for managing thread safety.
+ */
+public final class AppRTCUtils {
+  private AppRTCUtils() {}
+  /** Helper method which throws an exception when an assertion has failed. */
+  public static void assertIsTrue(boolean condition) {
+    if (!condition) {
+      throw new AssertionError("Expected condition to be true");
+    }
+  }
+  /** Helper method for building a string of thread information.*/
+  public static String getThreadInfo() {
+    return "@[name=" + Thread.currentThread().getName() + ", id=" + Thread.currentThread().getId()
+        + "]";
+  }
+  /** Information about the current build, taken from system properties. */
+  public static void logDeviceInfo(String tag) {
+    Log.d(tag, "Android SDK: " + Build.VERSION.SDK_INT + ", "
+            + "Release: " + Build.VERSION.RELEASE + ", "
+            + "Brand: " + Build.BRAND + ", "
+            + "Device: " + Build.DEVICE + ", "
+            + "Id: " + Build.ID + ", "
+            + "Hardware: " + Build.HARDWARE + ", "
+            + "Manufacturer: " + Build.MANUFACTURER + ", "
+            + "Model: " + Build.MODEL + ", "
+            + "Product: " + Build.PRODUCT);
+  }
+}
diff --git a/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/util/ThreadUtils.java b/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/util/ThreadUtils.java
new file mode 100644
index 0000000..2780b17
--- /dev/null
+++ b/android/src/main/java/com/zxcpoiu/incallmanager/AppRTC/util/ThreadUtils.java
@@ -0,0 +1,39 @@
+/*
+ * Copyright 2015 The WebRTC project authors. All Rights Reserved.
+ *
+ * Use of this source code is governed by a BSD-style license
+ * that can be found in the LICENSE file in the root of the source
+ * tree. An additional intellectual property rights grant can be found
+ * in the file PATENTS. All contributing project authors may
+ * be found in the AUTHORS file in the root of the source tree.
+ */
+package com.zxcpoiu.incallmanager.AppRTC;
+import android.os.Looper;
+import androidx.annotation.Nullable;
+public class ThreadUtils {
+  /**
+   * Utility class to be used for checking that a method is called on the correct thread.
+ */ + public static class ThreadChecker { + @Nullable private Thread thread = Thread.currentThread(); + public void checkIsOnValidThread() { + if (thread == null) { + thread = Thread.currentThread(); + } + if (Thread.currentThread() != thread) { + throw new IllegalStateException("Wrong thread"); + } + } + public void detachThread() { + thread = null; + } + } + /** + * Throws exception if called from other than main thread. + */ + public static void checkIsOnMainThread() { + if (Thread.currentThread() != Looper.getMainLooper().getThread()) { + throw new IllegalStateException("Not on main thread!"); + } + } +} diff --git a/android/src/main/java/com/zxcpoiu/incallmanager/InCallManagerModule.java b/android/src/main/java/com/zxcpoiu/incallmanager/InCallManagerModule.java index bbd14d0..10cb1e5 100644 --- a/android/src/main/java/com/zxcpoiu/incallmanager/InCallManagerModule.java +++ b/android/src/main/java/com/zxcpoiu/incallmanager/InCallManagerModule.java @@ -22,20 +22,19 @@ import android.content.IntentFilter; import android.content.BroadcastReceiver; import android.content.pm.PackageManager; -import android.Manifest.permission; -//import android.media.AudioAttributes; // --- for API 21+ -import android.media.AudioManager; +import android.media.AudioAttributes; import android.media.AudioDeviceInfo; +import android.media.AudioFocusRequest; +import android.media.AudioManager; import android.media.MediaPlayer; import android.media.ToneGenerator; import android.net.Uri; import android.os.PowerManager; import android.os.Build; import android.os.Handler; +import android.os.Looper; import android.provider.Settings; -import android.support.annotation.Nullable; -import android.support.v4.app.ActivityCompat; -import android.support.v4.content.ContextCompat; +import androidx.annotation.Nullable; import android.util.Log; import android.util.SparseArray; import android.view.Display; @@ -66,11 +65,9 @@ import com.zxcpoiu.incallmanager.AppRTC.AppRTCBluetoothManager; -public class 
InCallManagerModule extends ReactContextBaseJavaModule implements LifecycleEventListener { +public class InCallManagerModule extends ReactContextBaseJavaModule implements LifecycleEventListener, AudioManager.OnAudioFocusChangeListener { private static final String REACT_NATIVE_MODULE_NAME = "InCallManager"; private static final String TAG = REACT_NATIVE_MODULE_NAME; - private static SparseArray mRequestPermissionCodePromises; - private static SparseArray mRequestPermissionCodeTargetPermission; private String mPackageName = "com.zxcpoiu.incallmanager"; // --- Screen Manager @@ -82,11 +79,11 @@ public class InCallManagerModule extends ReactContextBaseJavaModule implements L private AudioManager audioManager; private boolean audioManagerActivated = false; private boolean isAudioFocused = false; + //private final Object mAudioFocusLock = new Object(); private boolean isOrigAudioSetupStored = false; private boolean origIsSpeakerPhoneOn = false; private boolean origIsMicrophoneMute = false; private int origAudioMode = AudioManager.MODE_INVALID; - private int origRingerMode = AudioManager.RINGER_MODE_NORMAL; private boolean defaultSpeakerOn = false; private int defaultAudioMode = AudioManager.MODE_IN_COMMUNICATION; private int forceSpeakerOn = 0; @@ -97,7 +94,8 @@ public class InCallManagerModule extends ReactContextBaseJavaModule implements L private BroadcastReceiver wiredHeadsetReceiver; private BroadcastReceiver noisyAudioReceiver; private BroadcastReceiver mediaButtonReceiver; - private OnFocusChangeListener mOnFocusChangeListener; + private AudioAttributes mAudioAttributes; + private AudioFocusRequest mAudioFocusRequest; // --- same as: RingtoneManager.getActualDefaultRingtoneUri(reactContext, RingtoneManager.TYPE_RINGTONE); private Uri defaultRingtoneUri = Settings.System.DEFAULT_RINGTONE_URI; @@ -113,8 +111,6 @@ public class InCallManagerModule extends ReactContextBaseJavaModule implements L private MyPlayerInterface mBusytone; private Handler 
mRingtoneCountDownHandler; private String media = "audio"; - private static String recordPermission = "unknow"; - private static String cameraPermission = "unknow"; private static final String SPEAKERPHONE_AUTO = "auto"; private static final String SPEAKERPHONE_TRUE = "true"; @@ -159,7 +155,7 @@ public enum AudioManagerState { private final String useSpeakerphone = SPEAKERPHONE_AUTO; // Handles all tasks related to Bluetooth headset devices. - private final AppRTCBluetoothManager bluetoothManager; + private AppRTCBluetoothManager bluetoothManager = null; private final InCallProximityManager proximityManager; @@ -169,9 +165,6 @@ public enum AudioManagerState { // avoid duplicate elements. private Set audioDevices = new HashSet<>(); - // Callback method for changes in audio focus. - private AudioManager.OnAudioFocusChangeListener audioFocusChangeListener; - interface MyPlayerInterface { public boolean isPlaying(); public void startPlay(Map data); @@ -197,25 +190,27 @@ public InCallManagerModule(ReactApplicationContext reactContext) { audioUriMap.put("bundleRingtoneUri", bundleRingtoneUri); audioUriMap.put("bundleRingbackUri", bundleRingbackUri); audioUriMap.put("bundleBusytoneUri", bundleBusytoneUri); - mRequestPermissionCodePromises = new SparseArray(); - mRequestPermissionCodeTargetPermission = new SparseArray(); - mOnFocusChangeListener = new OnFocusChangeListener(); - bluetoothManager = AppRTCBluetoothManager.create(reactContext, this); - proximityManager = InCallProximityManager.create(reactContext, this); wakeLockUtils = new InCallWakeLockUtils(reactContext); + proximityManager = InCallProximityManager.create(reactContext, this); + + UiThreadUtil.runOnUiThread(() -> { + bluetoothManager = AppRTCBluetoothManager.create(reactContext, this); + }); Log.d(TAG, "InCallManager initialized"); } private void manualTurnScreenOff() { Log.d(TAG, "manualTurnScreenOff()"); + Activity mCurrentActivity = getCurrentActivity(); + + if (mCurrentActivity == null) { + Log.d(TAG, 
"ReactContext doesn't have any Activity attached."); + return; + } + UiThreadUtil.runOnUiThread(new Runnable() { public void run() { - Activity mCurrentActivity = getCurrentActivity(); - if (mCurrentActivity == null) { - Log.d(TAG, "ReactContext doesn't hava any Activity attached."); - return; - } Window window = mCurrentActivity.getWindow(); WindowManager.LayoutParams params = window.getAttributes(); lastLayoutParams = params; // --- store last param @@ -228,13 +223,15 @@ public void run() { private void manualTurnScreenOn() { Log.d(TAG, "manualTurnScreenOn()"); + Activity mCurrentActivity = getCurrentActivity(); + + if (mCurrentActivity == null) { + Log.d(TAG, "ReactContext doesn't have any Activity attached."); + return; + } + UiThreadUtil.runOnUiThread(new Runnable() { public void run() { - Activity mCurrentActivity = getCurrentActivity(); - if (mCurrentActivity == null) { - Log.d(TAG, "ReactContext doesn't hava any Activity attached."); - return; - } Window window = mCurrentActivity.getWindow(); if (lastLayoutParams != null) { window.setAttributes(lastLayoutParams); @@ -251,7 +248,6 @@ public void run() { private void storeOriginalAudioSetup() { Log.d(TAG, "storeOriginalAudioSetup()"); if (!isOrigAudioSetupStored) { - origRingerMode = audioManager.getRingerMode(); origAudioMode = audioManager.getMode(); origIsSpeakerPhoneOn = audioManager.isSpeakerphoneOn(); origIsMicrophoneMute = audioManager.isMicrophoneMute(); @@ -265,7 +261,6 @@ private void restoreOriginalAudioSetup() { setSpeakerphoneOn(origIsSpeakerPhoneOn); setMicrophoneMute(origIsMicrophoneMute); audioManager.setMode(origAudioMode); - audioManager.setRingerMode(origRingerMode); if (getCurrentActivity() != null) { getCurrentActivity().setVolumeControlStream(AudioManager.USE_DEFAULT_STREAM_TYPE); } @@ -281,7 +276,7 @@ private void startWiredHeadsetEvent() { @Override public void onReceive(Context context, Intent intent) { if (ACTION_HEADSET_PLUG.equals(intent.getAction())) { - hasWiredHeadset = true; + 
hasWiredHeadset = intent.getIntExtra("state", 0) == 1; updateAudioRoute(); String deviceName = intent.getStringExtra("name"); if (deviceName == null) { @@ -297,24 +292,14 @@ public void onReceive(Context context, Intent intent) { } } }; - ReactContext reactContext = getReactApplicationContext(); - if (reactContext != null) { - reactContext.registerReceiver(wiredHeadsetReceiver, filter); - } else { - Log.d(TAG, "startWiredHeadsetEvent() reactContext is null"); - } + this.registerReceiver(wiredHeadsetReceiver, filter); } } private void stopWiredHeadsetEvent() { if (wiredHeadsetReceiver != null) { Log.d(TAG, "stopWiredHeadsetEvent()"); - ReactContext reactContext = getReactApplicationContext(); - if (reactContext != null) { - reactContext.unregisterReceiver(wiredHeadsetReceiver); - } else { - Log.d(TAG, "stopWiredHeadsetEvent() reactContext is null"); - } + this.unregisterReceiver(this.wiredHeadsetReceiver); wiredHeadsetReceiver = null; } } @@ -332,24 +317,14 @@ public void onReceive(Context context, Intent intent) { } } }; - ReactContext reactContext = getReactApplicationContext(); - if (reactContext != null) { - reactContext.registerReceiver(noisyAudioReceiver, filter); - } else { - Log.d(TAG, "startNoisyAudioEvent() reactContext is null"); - } + this.registerReceiver(noisyAudioReceiver, filter); } } private void stopNoisyAudioEvent() { if (noisyAudioReceiver != null) { Log.d(TAG, "stopNoisyAudioEvent()"); - ReactContext reactContext = getReactApplicationContext(); - if (reactContext != null) { - reactContext.unregisterReceiver(noisyAudioReceiver); - } else { - Log.d(TAG, "stopNoisyAudioEvent() reactContext is null"); - } + this.unregisterReceiver(this.noisyAudioReceiver); noisyAudioReceiver = null; } } @@ -404,24 +379,15 @@ public void onReceive(Context context, Intent intent) { } } }; - ReactContext reactContext = getReactApplicationContext(); - if (reactContext != null) { - reactContext.registerReceiver(mediaButtonReceiver, filter); - } else { - Log.d(TAG, 
"startMediaButtonEvent() reactContext is null"); - } + + this.registerReceiver(mediaButtonReceiver, filter); } } private void stopMediaButtonEvent() { if (mediaButtonReceiver != null) { Log.d(TAG, "stopMediaButtonEvent()"); - ReactContext reactContext = getReactApplicationContext(); - if (reactContext != null) { - reactContext.unregisterReceiver(mediaButtonReceiver); - } else { - Log.d(TAG, "stopMediaButtonEvent() reactContext is null"); - } + this.unregisterReceiver(this.mediaButtonReceiver); mediaButtonReceiver = null; } } @@ -440,8 +406,8 @@ public void onProximitySensorChangedState(boolean isNear) { sendEvent("Proximity", data); } - - private void startProximitySensor() { + @ReactMethod + public void startProximitySensor() { if (!proximityManager.isProximitySupported()) { Log.d(TAG, "Proximity Sensor is not supported."); return; @@ -459,7 +425,8 @@ private void startProximitySensor() { isProximityRegistered = true; } - private void stopProximitySensor() { + @ReactMethod + public void stopProximitySensor() { if (!proximityManager.isProximitySupported()) { Log.d(TAG, "Proximity Sensor is not supported."); return; @@ -473,45 +440,46 @@ private void stopProximitySensor() { isProximityRegistered = false; } - private class OnFocusChangeListener implements AudioManager.OnAudioFocusChangeListener { - - @Override - public void onAudioFocusChange(final int focusChange) { - String focusChangeStr; - switch (focusChange) { - case AudioManager.AUDIOFOCUS_GAIN: - focusChangeStr = "AUDIOFOCUS_GAIN"; - break; - case AudioManager.AUDIOFOCUS_GAIN_TRANSIENT: - focusChangeStr = "AUDIOFOCUS_GAIN_TRANSIENT"; - break; - case AudioManager.AUDIOFOCUS_GAIN_TRANSIENT_EXCLUSIVE: - focusChangeStr = "AUDIOFOCUS_GAIN_TRANSIENT_EXCLUSIVE"; - break; - case AudioManager.AUDIOFOCUS_GAIN_TRANSIENT_MAY_DUCK: - focusChangeStr = "AUDIOFOCUS_GAIN_TRANSIENT_MAY_DUCK"; - break; - case AudioManager.AUDIOFOCUS_LOSS: - focusChangeStr = "AUDIOFOCUS_LOSS"; - break; - case 
AudioManager.AUDIOFOCUS_LOSS_TRANSIENT: - focusChangeStr = "AUDIOFOCUS_LOSS_TRANSIENT"; - break; - case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK: - focusChangeStr = "AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK"; - break; - default: - focusChangeStr = "AUDIOFOCUS_UNKNOW"; - break; - } + // --- see: https://developer.android.com/reference/android/media/AudioManager + @Override + public void onAudioFocusChange(int focusChange) { + String focusChangeStr; + switch (focusChange) { + case AudioManager.AUDIOFOCUS_GAIN: + focusChangeStr = "AUDIOFOCUS_GAIN"; + break; + case AudioManager.AUDIOFOCUS_GAIN_TRANSIENT: + focusChangeStr = "AUDIOFOCUS_GAIN_TRANSIENT"; + break; + case AudioManager.AUDIOFOCUS_GAIN_TRANSIENT_EXCLUSIVE: + focusChangeStr = "AUDIOFOCUS_GAIN_TRANSIENT_EXCLUSIVE"; + break; + case AudioManager.AUDIOFOCUS_GAIN_TRANSIENT_MAY_DUCK: + focusChangeStr = "AUDIOFOCUS_GAIN_TRANSIENT_MAY_DUCK"; + break; + case AudioManager.AUDIOFOCUS_LOSS: + focusChangeStr = "AUDIOFOCUS_LOSS"; + break; + case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT: + focusChangeStr = "AUDIOFOCUS_LOSS_TRANSIENT"; + break; + case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK: + focusChangeStr = "AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK"; + break; + case AudioManager.AUDIOFOCUS_NONE: + focusChangeStr = "AUDIOFOCUS_NONE"; + break; + default: + focusChangeStr = "AUDIOFOCUS_UNKNOW"; + break; + } - Log.d(TAG, "onAudioFocusChange: " + focusChange + " - " + focusChangeStr); + Log.d(TAG, "onAudioFocusChange(): " + focusChange + " - " + focusChangeStr); - WritableMap data = Arguments.createMap(); - data.putString("eventText", focusChangeStr); - data.putInt("eventCode", focusChange); - sendEvent("onAudioFocusChange", data); - } + WritableMap data = Arguments.createMap(); + data.putString("eventText", focusChangeStr); + data.putInt("eventCode", focusChange); + sendEvent("onAudioFocusChange", data); } /* @@ -591,7 +559,9 @@ public void start(final String _media, final boolean auto, final String ringback 
storeOriginalAudioSetup(); requestAudioFocus(); startEvents(); - bluetoothManager.start(); + UiThreadUtil.runOnUiThread(() -> { + bluetoothManager.start(); + }); // TODO: even if not acquired focus, we can still play sounds. but need figure out which is better. //getCurrentActivity().setVolumeControlStream(AudioManager.STREAM_VOICE_CALL); audioManager.setMode(defaultAudioMode); @@ -630,9 +600,11 @@ public void stop(final String busytoneUriType) { setSpeakerphoneOn(false); setMicrophoneMute(false); forceSpeakerOn = 0; - bluetoothManager.stop(); + UiThreadUtil.runOnUiThread(() -> { + bluetoothManager.stop(); + }); restoreOriginalAudioSetup(); - releaseAudioFocus(); + abandonAudioFocus(); audioManagerActivated = false; } wakeLockUtils.releasePartialWakeLock(); @@ -643,10 +615,7 @@ private void startEvents() { startWiredHeadsetEvent(); startNoisyAudioEvent(); startMediaButtonEvent(); - if (!defaultSpeakerOn) { - // video, default disable proximity - startProximitySensor(); - } + startProximitySensor(); // --- proximity event is always enabled, but only turns the screen off when audio is routed to the earpiece. setKeepScreenOn(true); } @@ -659,28 +628,148 @@ private void stopEvents() { turnScreenOn(); } - private void requestAudioFocus() { - if (!isAudioFocused) { - int result = audioManager.requestAudioFocus(mOnFocusChangeListener, AudioManager.STREAM_VOICE_CALL, AudioManager.AUDIOFOCUS_GAIN); - if (result == AudioManager.AUDIOFOCUS_REQUEST_GRANTED) { - Log.d(TAG, "AudioFocus granted"); + @ReactMethod + public void requestAudioFocusJS(Promise promise) { + promise.resolve(requestAudioFocus()); + } + + private String requestAudioFocus() { + String requestAudioFocusResStr = (android.os.Build.VERSION.SDK_INT >= 26) + ?
requestAudioFocusV26() + : requestAudioFocusOld(); + Log.d(TAG, "requestAudioFocus(): res = " + requestAudioFocusResStr); + return requestAudioFocusResStr; + } + + private String requestAudioFocusV26() { + if (isAudioFocused) { + return ""; + } + + if (mAudioAttributes == null) { + mAudioAttributes = new AudioAttributes.Builder() + .setUsage(AudioAttributes.USAGE_VOICE_COMMUNICATION) + .setContentType(AudioAttributes.CONTENT_TYPE_SPEECH) + .build(); + } + + if (mAudioFocusRequest == null) { + mAudioFocusRequest = new AudioFocusRequest.Builder(AudioManager.AUDIOFOCUS_GAIN_TRANSIENT) + .setAudioAttributes(mAudioAttributes) + .setAcceptsDelayedFocusGain(false) + .setWillPauseWhenDucked(false) + .setOnAudioFocusChangeListener(this) + .build(); + } + + int requestAudioFocusRes = audioManager.requestAudioFocus(mAudioFocusRequest); + + String requestAudioFocusResStr; + switch (requestAudioFocusRes) { + case AudioManager.AUDIOFOCUS_REQUEST_FAILED: + requestAudioFocusResStr = "AUDIOFOCUS_REQUEST_FAILED"; + break; + case AudioManager.AUDIOFOCUS_REQUEST_GRANTED: isAudioFocused = true; - } else if (result == AudioManager.AUDIOFOCUS_REQUEST_FAILED) { - Log.d(TAG, "AudioFocus failed"); - isAudioFocused = false; - } + requestAudioFocusResStr = "AUDIOFOCUS_REQUEST_GRANTED"; + break; + case AudioManager.AUDIOFOCUS_REQUEST_DELAYED: + requestAudioFocusResStr = "AUDIOFOCUS_REQUEST_DELAYED"; + break; + default: + requestAudioFocusResStr = "AUDIOFOCUS_REQUEST_UNKNOWN"; + break; } + + return requestAudioFocusResStr; } - private void releaseAudioFocus() { + private String requestAudioFocusOld() { if (isAudioFocused) { - audioManager.abandonAudioFocus(null); - isAudioFocused = false; + return ""; + } + + int requestAudioFocusRes = audioManager.requestAudioFocus(this, AudioManager.STREAM_VOICE_CALL, AudioManager.AUDIOFOCUS_GAIN_TRANSIENT); + + String requestAudioFocusResStr; + switch (requestAudioFocusRes) { + case AudioManager.AUDIOFOCUS_REQUEST_FAILED: + requestAudioFocusResStr = 
"AUDIOFOCUS_REQUEST_FAILED"; + break; + case AudioManager.AUDIOFOCUS_REQUEST_GRANTED: + isAudioFocused = true; + requestAudioFocusResStr = "AUDIOFOCUS_REQUEST_GRANTED"; + break; + default: + requestAudioFocusResStr = "AUDIOFOCUS_REQUEST_UNKNOWN"; + break; + } + + return requestAudioFocusResStr; + } + + @ReactMethod + public void abandonAudioFocusJS(Promise promise) { + promise.resolve(abandonAudioFocus()); + } + + private String abandonAudioFocus() { + String abandonAudioFocusResStr = (android.os.Build.VERSION.SDK_INT >= 26) + ? abandonAudioFocusV26() + : abandonAudioFocusOld(); + Log.d(TAG, "abandonAudioFocus(): res = " + abandonAudioFocusResStr); + return abandonAudioFocusResStr; + } + + private String abandonAudioFocusV26() { + if (!isAudioFocused || mAudioFocusRequest == null) { + return ""; + } + + int abandonAudioFocusRes = audioManager.abandonAudioFocusRequest(mAudioFocusRequest); + String abandonAudioFocusResStr; + switch (abandonAudioFocusRes) { + case AudioManager.AUDIOFOCUS_REQUEST_FAILED: + abandonAudioFocusResStr = "AUDIOFOCUS_REQUEST_FAILED"; + break; + case AudioManager.AUDIOFOCUS_REQUEST_GRANTED: + isAudioFocused = false; + abandonAudioFocusResStr = "AUDIOFOCUS_REQUEST_GRANTED"; + break; + default: + abandonAudioFocusResStr = "AUDIOFOCUS_REQUEST_UNKNOWN"; + break; + } + + return abandonAudioFocusResStr; + } + + private String abandonAudioFocusOld() { + if (!isAudioFocused) { + return ""; + } + + int abandonAudioFocusRes = audioManager.abandonAudioFocus(this); + + String abandonAudioFocusResStr; + switch (abandonAudioFocusRes) { + case AudioManager.AUDIOFOCUS_REQUEST_FAILED: + abandonAudioFocusResStr = "AUDIOFOCUS_REQUEST_FAILED"; + break; + case AudioManager.AUDIOFOCUS_REQUEST_GRANTED: + isAudioFocused = false; + abandonAudioFocusResStr = "AUDIOFOCUS_REQUEST_GRANTED"; + break; + default: + abandonAudioFocusResStr = "AUDIOFOCUS_REQUEST_UNKNOWN"; + break; } + + return abandonAudioFocusResStr; } @ReactMethod - public void pokeScreen(long timeout) { + 
public void pokeScreen(int timeout) { Log.d(TAG, "pokeScreen()"); wakeLockUtils.acquirePokeFullWakeLockReleaseAfter(timeout); // --- default 3000 ms } @@ -749,14 +838,18 @@ public void turnScreenOff() { @ReactMethod public void setKeepScreenOn(final boolean enable) { Log.d(TAG, "setKeepScreenOn() " + enable); + + Activity mCurrentActivity = getCurrentActivity(); + + if (mCurrentActivity == null) { + Log.d(TAG, "ReactContext doesn't have any Activity attached."); + return; + } + UiThreadUtil.runOnUiThread(new Runnable() { public void run() { - Activity mCurrentActivity = getCurrentActivity(); - if (mCurrentActivity == null) { - Log.d(TAG, "ReactContext doesn't hava any Activity attached."); - return; - } Window window = mCurrentActivity.getWindow(); + if (enable) { window.addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON); } else { @@ -770,6 +863,7 @@ public void run() { public void setSpeakerphoneOn(final boolean enable) { if (enable != audioManager.isSpeakerphoneOn()) { Log.d(TAG, "setSpeakerphoneOn(): " + enable); + audioManager.setMode(defaultAudioMode); audioManager.setSpeakerphoneOn(enable); } } @@ -789,7 +883,15 @@ public void setForceSpeakerphoneOn(final int flag) { Log.d(TAG, "setForceSpeakerphoneOn() flag: " + flag); forceSpeakerOn = flag; - selectAudioDevice((flag == 1) ? AudioDevice.SPEAKER_PHONE : AudioDevice.NONE); // --- will call updateAudioDeviceState() + // --- will call updateAudioDeviceState() + // --- Note: on some devices, the specified route may not exist and thus will not take effect. + if (flag == 1) { + selectAudioDevice(AudioDevice.SPEAKER_PHONE); + } else if (flag == -1) { + selectAudioDevice(AudioDevice.EARPIECE); // --- use the most common earpiece to force `speaker off` + } else { + selectAudioDevice(AudioDevice.NONE); // --- NONE will follow the default route; the default route of a `video` call is the speaker.
+ } } // --- TODO (zxcpoiu): Implement api to let user choose audio devices @@ -806,50 +908,55 @@ public void setMicrophoneMute(final boolean enable) { * This is part of start() process. * ringbackUriType must not empty. empty means do not play. */ + @ReactMethod public void startRingback(final String ringbackUriType) { if (ringbackUriType.isEmpty()) { return; } try { Log.d(TAG, "startRingback(): UriType=" + ringbackUriType); + if (mRingback != null) { if (mRingback.isPlaying()) { Log.d(TAG, "startRingback(): is already playing"); return; - } else { - stopRingback(); // --- use brandnew instance } + + stopRingback(); // --- use a brand-new instance } Uri ringbackUri; Map data = new HashMap(); data.put("name", "mRingback"); + + // --- use ToneGenerator instead of a file uri if (ringbackUriType.equals("_DTMF_")) { mRingback = new myToneGenerator(myToneGenerator.RINGBACK); mRingback.startPlay(data); return; - } else { - ringbackUri = getRingbackUri(ringbackUriType); - if (ringbackUri == null) { - Log.d(TAG, "startRingback(): no available media"); - return; - } + } + + ringbackUri = getRingbackUri(ringbackUriType); + if (ringbackUri == null) { + Log.d(TAG, "startRingback(): no available media"); + return; } mRingback = new myMediaPlayer(); data.put("sourceUri", ringbackUri); data.put("setLooping", true); - data.put("audioStream", AudioManager.STREAM_VOICE_CALL); - /* - TODO: for API 21 - data.put("audioFlag", AudioAttributes.FLAG_AUDIBILITY_ENFORCED); - data.put("audioUsage", AudioAttributes.USAGE_VOICE_COMMUNICATION); // USAGE_VOICE_COMMUNICATION_SIGNALLING ? - data.put("audioContentType", AudioAttributes.CONTENT_TYPE_SPEECH); // CONTENT_TYPE_MUSIC ? - */ + + //data.put("audioStream", AudioManager.STREAM_VOICE_CALL); // --- legacy + // --- The ringback doesn't have to be a DTMF. + // --- Should use VOICE_COMMUNICATION for sound during a call or it may be silenced.
+ data.put("audioUsage", AudioAttributes.USAGE_VOICE_COMMUNICATION); + data.put("audioContentType", AudioAttributes.CONTENT_TYPE_MUSIC); + setMediaPlayerEvents((MediaPlayer)mRingback, "mRingback"); + mRingback.startPlay(data); } catch(Exception e) { - Log.d(TAG, "startRingback() failed"); + Log.d(TAG, "startRingback() failed", e); } } @@ -881,42 +988,42 @@ public boolean startBusytone(final String busytoneUriType) { if (mBusytone.isPlaying()) { Log.d(TAG, "startBusytone(): is already playing"); return false; - } else { - stopBusytone(); // --- use brandnew instance } + + stopBusytone(); // --- use a brand-new instance } Uri busytoneUri; Map data = new HashMap(); data.put("name", "mBusytone"); + + // --- use ToneGenerator instead of a file uri if (busytoneUriType.equals("_DTMF_")) { mBusytone = new myToneGenerator(myToneGenerator.BUSY); mBusytone.startPlay(data); return true; - } else { - busytoneUri = getBusytoneUri(busytoneUriType); - if (busytoneUri == null) { - Log.d(TAG, "startBusytone(): no available media"); - return false; - } + } + + busytoneUri = getBusytoneUri(busytoneUriType); + if (busytoneUri == null) { + Log.d(TAG, "startBusytone(): no available media"); + return false; } mBusytone = new myMediaPlayer(); + data.put("sourceUri", busytoneUri); data.put("setLooping", false); - data.put("audioStream", AudioManager.STREAM_VOICE_CALL); - /* - TODO: for API 21 - data.put("audioFlag", AudioAttributes.FLAG_AUDIBILITY_ENFORCED); - data.put("audioUsage", AudioAttributes.USAGE_VOICE_COMMUNICATION_SIGNALLING); // USAGE_VOICE_COMMUNICATION ? - data.put("audioContentType", AudioAttributes.CONTENT_TYPE_SPEECH); - */ + //data.put("audioStream", AudioManager.STREAM_VOICE_CALL); // --- legacy + // --- Should use VOICE_COMMUNICATION for sound during a call or it may be silenced. + data.put("audioUsage", AudioAttributes.USAGE_VOICE_COMMUNICATION); + data.put("audioContentType", AudioAttributes.CONTENT_TYPE_SONIFICATION); // --- CONTENT_TYPE_MUSIC?
+ setMediaPlayerEvents((MediaPlayer)mBusytone, "mBusytone"); mBusytone.startPlay(data); return true; } catch(Exception e) { - Log.d(TAG, "startBusytone() failed"); - Log.d(TAG, e.getMessage()); + Log.d(TAG, "startBusytone() failed", e); return false; } } @@ -934,88 +1041,106 @@ public void stopBusytone() { @ReactMethod public void startRingtone(final String ringtoneUriType, final int seconds) { - try { - Log.d(TAG, "startRingtone(): UriType=" + ringtoneUriType); - if (mRingtone != null) { - if (mRingtone.isPlaying()) { - Log.d(TAG, "startRingtone(): is already playing"); - return; - } else { - stopRingtone(); // --- use brandnew instance - } - } + Thread thread = new Thread() { + @Override + public void run() { + try { + Looper.prepare(); + + Log.d(TAG, "startRingtone(): UriType=" + ringtoneUriType); + if (mRingtone != null) { + if (mRingtone.isPlaying()) { + Log.d(TAG, "startRingtone(): is already playing"); + return; + } else { + stopRingtone(); // --- use brandnew instance + } + } - //if (!audioManager.isStreamMute(AudioManager.STREAM_RING)) { - //if (origRingerMode == AudioManager.RINGER_MODE_NORMAL) { - if (audioManager.getStreamVolume(AudioManager.STREAM_RING) == 0) { - Log.d(TAG, "startRingtone(): ringer is silent. leave without play."); - return; - } + //if (!audioManager.isStreamMute(AudioManager.STREAM_RING)) { + //if (origRingerMode == AudioManager.RINGER_MODE_NORMAL) { + if (audioManager.getStreamVolume(AudioManager.STREAM_RING) == 0) { + Log.d(TAG, "startRingtone(): ringer is silent. 
leave without playing."); + return; + } - // --- there is no _DTMF_ option in startRingtone() - Uri ringtoneUri = getRingtoneUri(ringtoneUriType); - if (ringtoneUri == null) { - Log.d(TAG, "startRingtone(): no available media"); - return; - } + // --- there is no _DTMF_ option in startRingtone() + Uri ringtoneUri = getRingtoneUri(ringtoneUriType); + if (ringtoneUri == null) { + Log.d(TAG, "startRingtone(): no available media"); + return; + } - if (audioManagerActivated) { - stop(); - } + if (audioManagerActivated) { + InCallManagerModule.this.stop(); + } - wakeLockUtils.acquirePartialWakeLock(); + wakeLockUtils.acquirePartialWakeLock(); - storeOriginalAudioSetup(); - Map data = new HashMap(); - mRingtone = new myMediaPlayer(); - data.put("name", "mRingtone"); - data.put("sourceUri", ringtoneUri); - data.put("setLooping", true); - data.put("audioStream", AudioManager.STREAM_RING); - /* - TODO: for API 21 - data.put("audioFlag", 0); - data.put("audioUsage", AudioAttributes.USAGE_NOTIFICATION_RINGTONE); // USAGE_NOTIFICATION_COMMUNICATION_REQUEST ? - data.put("audioContentType", AudioAttributes.CONTENT_TYPE_MUSIC); - */ - setMediaPlayerEvents((MediaPlayer) mRingtone, "mRingtone"); - mRingtone.startPlay(data); - - if (seconds > 0) { - mRingtoneCountDownHandler = new Handler(); - mRingtoneCountDownHandler.postDelayed(new Runnable() { - public void run() { - try { - Log.d(TAG, String.format("mRingtoneCountDownHandler.stopRingtone() timeout after %d seconds", seconds)); - stopRingtone(); - } catch(Exception e) { - Log.d(TAG, "mRingtoneCountDownHandler.stopRingtone() failed."); - } + storeOriginalAudioSetup(); + Map data = new HashMap(); + mRingtone = new myMediaPlayer(); + + data.put("name", "mRingtone"); + data.put("sourceUri", ringtoneUri); + data.put("setLooping", true); + + //data.put("audioStream", AudioManager.STREAM_RING); // --- legacy + data.put("audioUsage", AudioAttributes.USAGE_NOTIFICATION_RINGTONE); // --- USAGE_NOTIFICATION_COMMUNICATION_REQUEST?
+ data.put("audioContentType", AudioAttributes.CONTENT_TYPE_MUSIC); + + setMediaPlayerEvents((MediaPlayer) mRingtone, "mRingtone"); + + mRingtone.startPlay(data); + + if (seconds > 0) { + mRingtoneCountDownHandler = new Handler(); + mRingtoneCountDownHandler.postDelayed(new Runnable() { + public void run() { + try { + Log.d(TAG, String.format("mRingtoneCountDownHandler.stopRingtone() timeout after %d seconds", seconds)); + stopRingtone(); + } catch(Exception e) { + Log.d(TAG, "mRingtoneCountDownHandler.stopRingtone() failed."); + } + } + }, seconds * 1000); } - }, seconds * 1000); + + Looper.loop(); + } catch(Exception e) { + wakeLockUtils.releasePartialWakeLock(); + Log.e(TAG, "startRingtone() failed", e); + } } - } catch(Exception e) { - wakeLockUtils.releasePartialWakeLock(); - Log.d(TAG, "startRingtone() failed"); - } + }; + + thread.start(); } @ReactMethod public void stopRingtone() { - try { - if (mRingtone != null) { - mRingtone.stopPlay(); - mRingtone = null; - restoreOriginalAudioSetup(); - } - if (mRingtoneCountDownHandler != null) { - mRingtoneCountDownHandler.removeCallbacksAndMessages(null); - mRingtoneCountDownHandler = null; + Thread thread = new Thread() { + @Override + public void run() { + try { + if (mRingtone != null) { + mRingtone.stopPlay(); + mRingtone = null; + restoreOriginalAudioSetup(); + } + if (mRingtoneCountDownHandler != null) { + mRingtoneCountDownHandler.removeCallbacksAndMessages(null); + mRingtoneCountDownHandler = null; + } + } catch (Exception e) { + Log.d(TAG, "stopRingtone() failed"); + } + wakeLockUtils.releasePartialWakeLock(); } - } catch(Exception e) { - Log.d(TAG, "stopRingtone() failed"); - } - wakeLockUtils.releasePartialWakeLock(); + }; + + thread.start(); } private void setMediaPlayerEvents(MediaPlayer mp, final String name) { @@ -1349,10 +1474,6 @@ public void run() { private class myMediaPlayer extends MediaPlayer implements MyPlayerInterface { - //myMediaPlayer() { - // super(); - //} - @Override public void 
stopPlay() { stop(); @@ -1363,38 +1484,24 @@ public void stopPlay() { @Override public void startPlay(final Map data) { try { - Uri sourceUri = (Uri) data.get("sourceUri"); - boolean setLooping = (Boolean) data.get("setLooping"); - int stream = (Integer) data.get("audioStream"); - String name = (String) data.get("name"); - ReactContext reactContext = getReactApplicationContext(); - setDataSource(reactContext, sourceUri); - setLooping(setLooping); - setAudioStreamType(stream); // is better using STREAM_DTMF for ToneGenerator? - /* - // TODO: use modern and more explicit audio stream api - if (android.os.Build.VERSION.SDK_INT >= 21) { - int audioFlag = (Integer) data.get("audioFlag"); - int audioUsage = (Integer) data.get("audioUsage"); - int audioContentType = (Integer) data.get("audioContentType"); - - setAudioAttributes( - new AudioAttributes.Builder() - .setFlags(audioFlag) - .setLegacyStreamType(stream) - .setUsage(audioUsage) - .setContentType(audioContentType) - .build() - ); - } - */ + setDataSource(reactContext, (Uri) data.get("sourceUri")); + setLooping((Boolean) data.get("setLooping")); + + // --- the `minSdkVersion` is 21 since RN 64, + // --- if you want to suuport api < 21, comment out `setAudioAttributes` and use `setAudioStreamType((Integer) data.get("audioStream"))` instead + setAudioAttributes( + new AudioAttributes.Builder() + .setUsage((Integer) data.get("audioUsage")) + .setContentType((Integer) data.get("audioContentType")) + .build() + ); // -- will start at onPrepared() event prepareAsync(); } catch (Exception e) { - Log.d(TAG, "startPlay() failed"); + Log.d(TAG, "startPlay() failed", e); } } @@ -1405,104 +1512,20 @@ public boolean isPlaying() { } // ===== Internal Classes End ===== -// ===== Permission Start ===== @ReactMethod - public void checkRecordPermission(Promise promise) { - Log.d(TAG, "RNInCallManager.checkRecordPermission(): enter"); - _checkRecordPermission(); - if (recordPermission.equals("unknow")) { - Log.d(TAG, 
"RNInCallManager.checkRecordPermission(): failed"); - promise.reject(new Exception("checkRecordPermission failed")); - } else { - promise.resolve(recordPermission); - } - } + public void chooseAudioRoute(String audioRoute, Promise promise) { + Log.d(TAG, "RNInCallManager.chooseAudioRoute(): user choose audioDevice = " + audioRoute); - @ReactMethod - public void checkCameraPermission(Promise promise) { - Log.d(TAG, "RNInCallManager.checkCameraPermission(): enter"); - _checkCameraPermission(); - if (cameraPermission.equals("unknow")) { - Log.d(TAG, "RNInCallManager.checkCameraPermission(): failed"); - promise.reject(new Exception("checkCameraPermission failed")); - } else { - promise.resolve(cameraPermission); - } - } - - private void _checkRecordPermission() { - recordPermission = _checkPermission(permission.RECORD_AUDIO); - Log.d(TAG, String.format("RNInCallManager.checkRecordPermission(): recordPermission=%s", recordPermission)); - } - - private void _checkCameraPermission() { - cameraPermission = _checkPermission(permission.CAMERA); - Log.d(TAG, String.format("RNInCallManager.checkCameraPermission(): cameraPermission=%s", cameraPermission)); - } - - private String _checkPermission(String targetPermission) { - try { - ReactContext reactContext = getReactApplicationContext(); - if (ContextCompat.checkSelfPermission(reactContext, targetPermission) == PackageManager.PERMISSION_GRANTED) { - return "granted"; - } else { - return "denied"; - } - } catch (Exception e) { - Log.d(TAG, "_checkPermission() catch"); - return "denied"; - } - } - - @ReactMethod - public void requestRecordPermission(Promise promise) { - Log.d(TAG, "RNInCallManager.requestRecordPermission(): enter"); - _checkRecordPermission(); - if (!recordPermission.equals("granted")) { - _requestPermission(permission.RECORD_AUDIO, promise); - } else { - // --- already granted - promise.resolve(recordPermission); - } - } - - @ReactMethod - public void requestCameraPermission(Promise promise) { - Log.d(TAG, 
"RNInCallManager.requestCameraPermission(): enter"); - _checkCameraPermission(); - if (!cameraPermission.equals("granted")) { - _requestPermission(permission.CAMERA, promise); - } else { - // --- already granted - promise.resolve(cameraPermission); - } - } - - private void _requestPermission(String targetPermission, Promise promise) { - Activity currentActivity = getCurrentActivity(); - if (currentActivity == null) { - Log.d(TAG, String.format("RNInCallManager._requestPermission(): ReactContext doesn't hava any Activity attached when requesting %s", targetPermission)); - promise.reject(new Exception("_requestPermission(): currentActivity is not attached")); - return; - } - int requestPermissionCode = getRandomInteger(1, 99999999); - while (mRequestPermissionCodePromises.get(requestPermissionCode, null) != null) { - requestPermissionCode = getRandomInteger(1, 99999999); - } - mRequestPermissionCodePromises.put(requestPermissionCode, promise); - mRequestPermissionCodeTargetPermission.put(requestPermissionCode, targetPermission); - /* - if (ActivityCompat.shouldShowRequestPermissionRationale(currentActivity, permission.RECORD_AUDIO)) { - showMessageOKCancel("You need to allow access to microphone for making call", new DialogInterface.OnClickListener() { - @Override - public void onClick(DialogInterface dialog, int which) { - ActivityCompat.requestPermissions(currentActivity, new String[] {permission.RECORD_AUDIO}, requestPermissionCode); - } - }); - return; + if (audioRoute.equals(AudioDevice.EARPIECE.name())) { + selectAudioDevice(AudioDevice.EARPIECE); + } else if (audioRoute.equals(AudioDevice.SPEAKER_PHONE.name())) { + selectAudioDevice(AudioDevice.SPEAKER_PHONE); + } else if (audioRoute.equals(AudioDevice.WIRED_HEADSET.name())) { + selectAudioDevice(AudioDevice.WIRED_HEADSET); + } else if (audioRoute.equals(AudioDevice.BLUETOOTH.name())) { + selectAudioDevice(AudioDevice.BLUETOOTH); } - */ - ActivityCompat.requestPermissions(currentActivity, new String[] 
{targetPermission}, requestPermissionCode); + promise.resolve(getAudioDeviceStatusMap()); } private static int getRandomInteger(int min, int max) { @@ -1513,47 +1536,6 @@ private static int getRandomInteger(int min, int max) { return random.nextInt((max - min) + 1) + min; } - protected static void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) { - Log.d(TAG, "RNInCallManager.onRequestPermissionsResult(): enter"); - Promise promise = mRequestPermissionCodePromises.get(requestCode, null); - String targetPermission = mRequestPermissionCodeTargetPermission.get(requestCode, null); - mRequestPermissionCodePromises.delete(requestCode); - mRequestPermissionCodeTargetPermission.delete(requestCode); - if (promise != null && targetPermission != null) { - - Map permissionResultMap = new HashMap(); - - for (int i = 0; i < permissions.length; i++) { - permissionResultMap.put(permissions[i], grantResults[i]); - } - - if (!permissionResultMap.containsKey(targetPermission)) { - Log.wtf(TAG, String.format("RNInCallManager.onRequestPermissionsResult(): requested permission %s but did not appear", targetPermission)); - promise.reject(String.format("%s_PERMISSION_NOT_FOUND", targetPermission), String.format("requested permission %s but did not appear", targetPermission)); - return; - } - - String _requestPermissionResult = "unknow"; - if (permissionResultMap.get(targetPermission) == PackageManager.PERMISSION_GRANTED) { - _requestPermissionResult = "granted"; - } else { - _requestPermissionResult = "denied"; - } - - if (targetPermission.equals(permission.RECORD_AUDIO)) { - recordPermission = _requestPermissionResult; - } else if (targetPermission.equals(permission.CAMERA)) { - cameraPermission = _requestPermissionResult; - } - promise.resolve(_requestPermissionResult); - } else { - //super.onRequestPermissionsResult(requestCode, permissions, grantResults); - Log.wtf(TAG, "RNInCallManager.onRequestPermissionsResult(): request code not found"); - 
promise.reject("PERMISSION_REQUEST_CODE_NOT_FOUND", "request code not found"); - } - } -// ===== Permission End ===== - private void pause() { if (audioManagerActivated) { Log.d(TAG, "pause audioRouteManager"); @@ -1653,8 +1635,9 @@ public void setDefaultAudioDevice(AudioDevice defaultDevice) { /** Changes selection of the currently active audio device. */ public void selectAudioDevice(AudioDevice device) { - if (!audioDevices.contains(device)) { - Log.e(TAG, "Can not select " + device + " from available " + audioDevices); + if (device != AudioDevice.NONE && !audioDevices.contains(device)) { + Log.e(TAG, "selectAudioDevice() Can not select " + device + " from available " + audioDevices); + return; } userSelectedAudioDevice = device; updateAudioDeviceState(); @@ -1672,12 +1655,30 @@ public AudioDevice getSelectedAudioDevice() { /** Helper method for receiver registration. */ private void registerReceiver(BroadcastReceiver receiver, IntentFilter filter) { - getReactApplicationContext().registerReceiver(receiver, filter); + final ReactContext reactContext = getReactApplicationContext(); + if (reactContext != null) { + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.TIRAMISU) { + reactContext.registerReceiver(receiver, filter, Context.RECEIVER_NOT_EXPORTED); + } else { + reactContext.registerReceiver(receiver, filter); + } + } else { + Log.d(TAG, "registerReceiver() reactContext is null"); + } } /** Helper method for unregistration of an existing receiver. 
*/ - private void unregisterReceiver(BroadcastReceiver receiver) { - getReactApplicationContext().unregisterReceiver(receiver); + private void unregisterReceiver(final BroadcastReceiver receiver) { + final ReactContext reactContext = this.getReactApplicationContext(); + if (reactContext != null) { + try { + reactContext.unregisterReceiver(receiver); + } catch (final Exception e) { + Log.d(TAG, "unregisterReceiver() failed"); + } + } else { + Log.d(TAG, "unregisterReceiver() reactContext is null"); + } } /** Sets the speaker phone mode. */ @@ -1728,150 +1729,171 @@ private boolean hasWiredHeadset() { } else if (type == AudioDeviceInfo.TYPE_USB_DEVICE) { Log.d(TAG, "hasWiredHeadset: found USB audio device"); return true; + } else if (type == AudioDeviceInfo.TYPE_WIRED_HEADPHONES) { + Log.d(TAG, "hasWiredHeadset: found wired headphones"); + return true; } } return false; } } + @ReactMethod + public void getIsWiredHeadsetPluggedIn(Promise promise) { + promise.resolve(this.hasWiredHeadset()); + } /** * Updates list of possible audio devices and make new device selection. */ public void updateAudioDeviceState() { - Log.d(TAG, "--- updateAudioDeviceState: " - + "wired headset=" + hasWiredHeadset + ", " - + "BT state=" + bluetoothManager.getState()); - Log.d(TAG, "Device status: " - + "available=" + audioDevices + ", " - + "selected=" + selectedAudioDevice + ", " - + "user selected=" + userSelectedAudioDevice); - - // Check if any Bluetooth headset is connected. The internal BT state will - // change accordingly. - // TODO(henrika): perhaps wrap required state into BT manager. - if (bluetoothManager.getState() == AppRTCBluetoothManager.State.HEADSET_AVAILABLE - || bluetoothManager.getState() == AppRTCBluetoothManager.State.HEADSET_UNAVAILABLE - || bluetoothManager.getState() == AppRTCBluetoothManager.State.SCO_DISCONNECTING) { - bluetoothManager.updateDevice(); - } - - // Update the set of available audio devices. 
- Set newAudioDevices = new HashSet<>(); - - if (bluetoothManager.getState() == AppRTCBluetoothManager.State.SCO_CONNECTED - || bluetoothManager.getState() == AppRTCBluetoothManager.State.SCO_CONNECTING - || bluetoothManager.getState() == AppRTCBluetoothManager.State.HEADSET_AVAILABLE) { - newAudioDevices.add(AudioDevice.BLUETOOTH); - } - - if (hasWiredHeadset) { - // If a wired headset is connected, then it is the only possible option. - newAudioDevices.add(AudioDevice.WIRED_HEADSET); - } else { - // No wired headset, hence the audio-device list can contain speaker - // phone (on a tablet), or speaker phone and earpiece (on mobile phone). + UiThreadUtil.runOnUiThread(() -> { + Log.d(TAG, "--- updateAudioDeviceState: " + + "wired headset=" + hasWiredHeadset + ", " + + "BT state=" + bluetoothManager.getState()); + Log.d(TAG, "Device status: " + + "available=" + audioDevices + ", " + + "selected=" + selectedAudioDevice + ", " + + "user selected=" + userSelectedAudioDevice); + + // Check if any Bluetooth headset is connected. The internal BT state will + // change accordingly. + // TODO(henrika): perhaps wrap required state into BT manager. + if (bluetoothManager.getState() == AppRTCBluetoothManager.State.HEADSET_AVAILABLE + || bluetoothManager.getState() == AppRTCBluetoothManager.State.HEADSET_UNAVAILABLE + || bluetoothManager.getState() == AppRTCBluetoothManager.State.SCO_DISCONNECTING) { + bluetoothManager.updateDevice(); + } + + // Update the set of available audio devices. 
+ Set newAudioDevices = new HashSet<>(); + + // always assume device has speaker phone newAudioDevices.add(AudioDevice.SPEAKER_PHONE); + + if (bluetoothManager.getState() == AppRTCBluetoothManager.State.SCO_CONNECTED + || bluetoothManager.getState() == AppRTCBluetoothManager.State.SCO_CONNECTING + || bluetoothManager.getState() == AppRTCBluetoothManager.State.HEADSET_AVAILABLE) { + newAudioDevices.add(AudioDevice.BLUETOOTH); + } + + if (hasWiredHeadset) { + newAudioDevices.add(AudioDevice.WIRED_HEADSET); + } + if (hasEarpiece()) { newAudioDevices.add(AudioDevice.EARPIECE); } - } - // Store state which is set to true if the device list has changed. - boolean audioDeviceSetUpdated = !audioDevices.equals(newAudioDevices); - // Update the existing audio device set. - audioDevices = newAudioDevices; - // Correct user selected audio devices if needed. - if (bluetoothManager.getState() == AppRTCBluetoothManager.State.HEADSET_UNAVAILABLE - && userSelectedAudioDevice == AudioDevice.BLUETOOTH) { - // If BT is not available, it can't be the user selection. - userSelectedAudioDevice = AudioDevice.NONE; - } - if (hasWiredHeadset && userSelectedAudioDevice == AudioDevice.SPEAKER_PHONE) { - // If user selected speaker phone, but then plugged wired headset then make - // wired headset as user selected device. - userSelectedAudioDevice = AudioDevice.WIRED_HEADSET; - } - if (!hasWiredHeadset && userSelectedAudioDevice == AudioDevice.WIRED_HEADSET) { - // If user selected wired headset, but then unplugged wired headset then make - // speaker phone as user selected device. - userSelectedAudioDevice = AudioDevice.SPEAKER_PHONE; - } - // Need to start Bluetooth if it is available and user either selected it explicitly or - // user did not select any output device. 
- boolean needBluetoothAudioStart = - bluetoothManager.getState() == AppRTCBluetoothManager.State.HEADSET_AVAILABLE - && (userSelectedAudioDevice == AudioDevice.NONE - || userSelectedAudioDevice == AudioDevice.BLUETOOTH); + // --- check whether user selected audio device is available + if (userSelectedAudioDevice != null + && userSelectedAudioDevice != AudioDevice.NONE + && !newAudioDevices.contains(userSelectedAudioDevice)) { + userSelectedAudioDevice = AudioDevice.NONE; + } + + // Store state which is set to true if the device list has changed. + boolean audioDeviceSetUpdated = !audioDevices.equals(newAudioDevices); + // Update the existing audio device set. + audioDevices = newAudioDevices; + + AudioDevice newAudioDevice = getPreferredAudioDevice(); - // Need to stop Bluetooth audio if user selected different device and - // Bluetooth SCO connection is established or in the process. - boolean needBluetoothAudioStop = - (bluetoothManager.getState() == AppRTCBluetoothManager.State.SCO_CONNECTED + // --- stop bluetooth if needed + if (selectedAudioDevice == AudioDevice.BLUETOOTH + && newAudioDevice != AudioDevice.BLUETOOTH + && (bluetoothManager.getState() == AppRTCBluetoothManager.State.SCO_CONNECTED || bluetoothManager.getState() == AppRTCBluetoothManager.State.SCO_CONNECTING) - && (userSelectedAudioDevice != AudioDevice.NONE - && userSelectedAudioDevice != AudioDevice.BLUETOOTH); - - if (bluetoothManager.getState() == AppRTCBluetoothManager.State.HEADSET_AVAILABLE - || bluetoothManager.getState() == AppRTCBluetoothManager.State.SCO_CONNECTING - || bluetoothManager.getState() == AppRTCBluetoothManager.State.SCO_CONNECTED) { - Log.d(TAG, "Need BT audio: start=" + needBluetoothAudioStart + ", " - + "stop=" + needBluetoothAudioStop + ", " - + "BT state=" + bluetoothManager.getState()); - } + ) { + bluetoothManager.stopScoAudio(); + bluetoothManager.updateDevice(); + } - // Start or stop Bluetooth SCO connection given states set earlier. 
- if (needBluetoothAudioStop) { - bluetoothManager.stopScoAudio(); - bluetoothManager.updateDevice(); - } + // --- start bluetooth if needed + if (selectedAudioDevice != AudioDevice.BLUETOOTH + && newAudioDevice == AudioDevice.BLUETOOTH + && bluetoothManager.getState() == AppRTCBluetoothManager.State.HEADSET_AVAILABLE) { + // Attempt to start Bluetooth SCO audio (takes a few second to start). + if (!bluetoothManager.startScoAudio()) { + // Remove BLUETOOTH from list of available devices since SCO failed. + audioDevices.remove(AudioDevice.BLUETOOTH); + audioDeviceSetUpdated = true; + if (userSelectedAudioDevice == AudioDevice.BLUETOOTH) { + userSelectedAudioDevice = AudioDevice.NONE; + } + newAudioDevice = getPreferredAudioDevice(); + } + } + + if (newAudioDevice == AudioDevice.BLUETOOTH + && bluetoothManager.getState() != AppRTCBluetoothManager.State.SCO_CONNECTED) { + newAudioDevice = getPreferredAudioDevice(true); // --- skip bluetooth + } + + // Switch to new device but only if there has been any changes. + if (newAudioDevice != selectedAudioDevice || audioDeviceSetUpdated) { - if (needBluetoothAudioStart && !needBluetoothAudioStop) { - // Attempt to start Bluetooth SCO audio (takes a few second to start). - if (!bluetoothManager.startScoAudio()) { - // Remove BLUETOOTH from list of available devices since SCO failed. - audioDevices.remove(AudioDevice.BLUETOOTH); - audioDeviceSetUpdated = true; + // Do the required device switch. + setAudioDeviceInternal(newAudioDevice); + Log.d(TAG, "New device status: " + + "available=" + audioDevices + ", " + + "selected=" + newAudioDevice); + /* + if (audioManagerEvents != null) { + // Notify a listening client that audio device has been changed. 
+ audioManagerEvents.onAudioDeviceChanged(selectedAudioDevice, audioDevices); + } + */ + sendEvent("onAudioDeviceChanged", getAudioDeviceStatusMap()); } + Log.d(TAG, "--- updateAudioDeviceState done"); + }); + } + + private WritableMap getAudioDeviceStatusMap() { + WritableMap data = Arguments.createMap(); + String audioDevicesJson = "["; + for (AudioDevice s: audioDevices) { + audioDevicesJson += "\"" + s.name() + "\","; + } + + // --- strip the last `,` + if (audioDevicesJson.length() > 1) { + audioDevicesJson = audioDevicesJson.substring(0, audioDevicesJson.length() - 1); } + audioDevicesJson += "]"; + + data.putString("availableAudioDeviceList", audioDevicesJson); + data.putString("selectedAudioDevice", (selectedAudioDevice == null) ? "" : selectedAudioDevice.name()); + + return data; + } + + private AudioDevice getPreferredAudioDevice() { + return getPreferredAudioDevice(false); + } - // Update selected audio device. + private AudioDevice getPreferredAudioDevice(boolean skipBluetooth) { final AudioDevice newAudioDevice; - if (bluetoothManager.getState() == AppRTCBluetoothManager.State.SCO_CONNECTED) { + if (userSelectedAudioDevice != null && userSelectedAudioDevice != AudioDevice.NONE) { + newAudioDevice = userSelectedAudioDevice; + } else if (!skipBluetooth && audioDevices.contains(AudioDevice.BLUETOOTH)) { // If a Bluetooth is connected, then it should be used as output audio // device. Note that it is not sufficient that a headset is available; // an active SCO channel must also be up and running. newAudioDevice = AudioDevice.BLUETOOTH; - } else if (hasWiredHeadset) { + } else if (audioDevices.contains(AudioDevice.WIRED_HEADSET)) { // If a wired headset is connected, but Bluetooth is not, then wired headset is used as // audio device. 
newAudioDevice = AudioDevice.WIRED_HEADSET; - } else if (userSelectedAudioDevice != null - && userSelectedAudioDevice != AudioDevice.NONE - && userSelectedAudioDevice != defaultAudioDevice) { - newAudioDevice = userSelectedAudioDevice; - } else { - // No wired headset and no Bluetooth, hence the audio-device list can contain speaker - // phone (on a tablet), or speaker phone and earpiece (on mobile phone). - // |defaultAudioDevice| contains either AudioDevice.SPEAKER_PHONE or AudioDevice.EARPIECE - // depending on the user's selgection. + } else if (audioDevices.contains(defaultAudioDevice)) { newAudioDevice = defaultAudioDevice; + } else { + newAudioDevice = AudioDevice.SPEAKER_PHONE; } - // Switch to new device but only if there has been any changes. - if (newAudioDevice != selectedAudioDevice || audioDeviceSetUpdated) { - // Do the required device switch. - setAudioDeviceInternal(newAudioDevice); - Log.d(TAG, "New device status: " - + "available=" + audioDevices + ", " - + "selected=" + newAudioDevice); - /* - if (audioManagerEvents != null) { - // Notify a listening client that audio device has been changed. 
- audioManagerEvents.onAudioDeviceChanged(selectedAudioDevice, audioDevices); - } - */ - } - Log.d(TAG, "--- updateAudioDeviceState done"); + + return newAudioDevice; } } + diff --git a/android/src/main/java/com/zxcpoiu/incallmanager/InCallManagerPackage.java b/android/src/main/java/com/zxcpoiu/incallmanager/InCallManagerPackage.java index 055286f..faf3681 100644 --- a/android/src/main/java/com/zxcpoiu/incallmanager/InCallManagerPackage.java +++ b/android/src/main/java/com/zxcpoiu/incallmanager/InCallManagerPackage.java @@ -42,7 +42,4 @@ public List createViewManagers(ReactApplicationContext reactContext return Collections.emptyList(); } - public static void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) { - InCallManagerModule.onRequestPermissionsResult(requestCode, permissions, grantResults); - } } diff --git a/android/src/main/java/com/zxcpoiu/incallmanager/InCallProximityManager.java b/android/src/main/java/com/zxcpoiu/incallmanager/InCallProximityManager.java index 65e4429..1411063 100644 --- a/android/src/main/java/com/zxcpoiu/incallmanager/InCallProximityManager.java +++ b/android/src/main/java/com/zxcpoiu/incallmanager/InCallProximityManager.java @@ -27,6 +27,8 @@ import java.lang.reflect.Method; import java.lang.Runnable; +import com.facebook.react.bridge.UiThreadUtil; + import com.zxcpoiu.incallmanager.AppRTC.AppRTCProximitySensor; public class InCallProximityManager { @@ -46,14 +48,11 @@ private InCallProximityManager(Context context, final InCallManagerModule inCall Log.d(TAG, "InCallProximityManager"); checkProximitySupport(context); if (proximitySupported) { - proximitySensor = AppRTCProximitySensor.create(context, - new Runnable() { - @Override - public void run() { - inCallManager.onProximitySensorChangedState(proximitySensor.sensorReportsNearState()); - } - } - ); + UiThreadUtil.runOnUiThread(() -> { + proximitySensor = AppRTCProximitySensor.create(context, () -> { + 
inCallManager.onProximitySensorChangedState(proximitySensor.sensorReportsNearState());
+            });
+        });
         }
     }

@@ -108,11 +107,16 @@ public boolean start() {
         if (!proximitySupported) {
             return false;
         }
-        return proximitySensor.start();
+        UiThreadUtil.runOnUiThread(() -> {
+            proximitySensor.start();
+        });
+        return true;
     }

     public void stop() {
-        proximitySensor.stop();
+        UiThreadUtil.runOnUiThread(() -> {
+            proximitySensor.stop();
+        });
     }

     public boolean isProximitySupported() {
diff --git a/index.d.ts b/index.d.ts
new file mode 100644
index 0000000..c8ebe9d
--- /dev/null
+++ b/index.d.ts
@@ -0,0 +1,64 @@
+declare class InCallManager {
+    vibrate: boolean;
+    audioUriMap: {
+        ringtone: { _BUNDLE_: null; _DEFAULT_: null };
+        ringback: { _BUNDLE_: null; _DEFAULT_: null };
+        busytone: { _BUNDLE_: null; _DEFAULT_: null };
+    };
+
+    constructor();
+
+    start(setup?: {
+        auto?: boolean;
+        media?: "video" | "audio";
+        ringback?: string;
+    }): void;
+
+    stop(setup?: { busytone?: string }): void;
+
+    turnScreenOff(): void;
+
+    turnScreenOn(): void;
+
+    getIsWiredHeadsetPluggedIn(): Promise<{ isWiredHeadsetPluggedIn: boolean }>;
+
+    setFlashOn(enable: boolean, brightness: number): void;
+
+    setKeepScreenOn(enable: boolean): void;
+
+    setSpeakerphoneOn(enable: boolean): void;
+
+    setForceSpeakerphoneOn(flag: boolean): void;
+
+    setMicrophoneMute(enable: boolean): void;
+
+    startRingtone(
+        ringtone: string,
+        vibrate_pattern: number | number[],
+        ios_category: string,
+        seconds: number
+    ): void;
+
+    stopRingtone(): void;
+
+    startProximitySensor(): void;
+
+    stopProximitySensor(): void;
+
+    startRingback(ringback: string): void;
+
+    stopRingback(): void;
+
+    pokeScreen(timeout: number): void;
+
+    getAudioUri(audioType: string, fileType: string): Promise;
+
+    chooseAudioRoute(route: string): Promise;
+
+    requestAudioFocus(): Promise;
+
+    abandonAudioFocus(): Promise;
+}
+
+declare const inCallManager: InCallManager;
+export default inCallManager;
diff --git a/index.js b/index.js
index 
9923066..ac5cafb 100644
--- a/index.js
+++ b/index.js
@@ -8,19 +8,11 @@ import {
 class InCallManager {
     constructor() {
         this.vibrate = false;
-        this.recordPermission = 'unknow';
-        this.cameraPermission = 'unknow';
         this.audioUriMap = {
             ringtone: { _BUNDLE_: null, _DEFAULT_: null},
             ringback: { _BUNDLE_: null, _DEFAULT_: null},
             busytone: { _BUNDLE_: null, _DEFAULT_: null},
         };
-        this.checkRecordPermission = this.checkRecordPermission.bind(this);
-        this.requestRecordPermission = this.requestRecordPermission.bind(this);
-        this.checkCameraPermission = this.checkCameraPermission.bind(this);
-        this.requestCameraPermission = this.requestCameraPermission.bind(this);
-        this.checkRecordPermission();
-        this.checkCameraPermission();
     }

     start(setup) {
@@ -46,12 +38,8 @@ class InCallManager {
     }

     async getIsWiredHeadsetPluggedIn() {
-        if (Platform.OS === 'ios') {
-            return await _InCallManager.getIsWiredHeadsetPluggedIn();
-        } else {
-            console.log("Android doesn't support getIsWiredHeadsetPluggedIn() yet.");
-            return null;
-        }
+        let isPluggedIn = await _InCallManager.getIsWiredHeadsetPluggedIn();
+        return { isWiredHeadsetPluggedIn: isPluggedIn };
     }

     setFlashOn(enable, brightness) {
@@ -110,36 +98,21 @@ class InCallManager {
         _InCallManager.stopRingtone();
     }

-    stopRingback() {
-        _InCallManager.stopRingback();
-    }
-
-    async checkRecordPermission() {
-        // --- on android which api < 23, it will always be "granted"
-        let result = await _InCallManager.checkRecordPermission();
-        this.recordPermission = result;
-        return result;
+    startProximitySensor() {
+        _InCallManager.startProximitySensor();
     }
-
-    async requestRecordPermission() {
-        // --- on android which api < 23, it will always be "granted"
-        let result = await _InCallManager.requestRecordPermission();
-        this.recordPermission = result;
-        return result;
+
+    stopProximitySensor() {
+        _InCallManager.stopProximitySensor();
     }

-    async checkCameraPermission() {
-        // --- on android which api < 23, it will always be "granted"
-        let result = await _InCallManager.checkCameraPermission();
-        this.cameraPermission = result;
-        return result;
+    startRingback(ringback) {
+        ringback = (typeof ringback === 'string') ? ringback : "_DTMF_";
+        _InCallManager.startRingback(ringback);
     }

-    async requestCameraPermission() {
-        // --- on android which api < 23, it will always be "granted"
-        let result = await _InCallManager.requestCameraPermission();
-        this.cameraPermission = result;
-        return result;
+    stopRingback() {
+        _InCallManager.stopRingback();
     }

     pokeScreen(_timeout) {
@@ -171,6 +144,27 @@ class InCallManager {
         }
     }
+
+    async chooseAudioRoute(route) {
+        let result = await _InCallManager.chooseAudioRoute(route);
+        return result;
+    }
+
+    async requestAudioFocus() {
+        if (Platform.OS === 'android') {
+            return await _InCallManager.requestAudioFocusJS();
+        } else {
+            console.log("ios doesn't support requestAudioFocus()");
+        }
+    }
+
+    async abandonAudioFocus() {
+        if (Platform.OS === 'android') {
+            return await _InCallManager.abandonAudioFocusJS();
+        } else {
+            console.log("ios doesn't support abandonAudioFocus()");
+        }
+    }
 }

 export default new InCallManager();
diff --git a/ios/RNInCallManager.xcodeproj/project.pbxproj b/ios/RNInCallManager.xcodeproj/project.pbxproj
new file mode 100644
index 0000000..22e610d
--- /dev/null
+++ b/ios/RNInCallManager.xcodeproj/project.pbxproj
@@ -0,0 +1,283 @@
+// !$*UTF8*$! 
+{ + archiveVersion = 1; + classes = { + }; + objectVersion = 48; + objects = { + +/* Begin PBXBuildFile section */ + 231CD25B1FD68A17004DD25D /* RNInCallManager.m in Sources */ = {isa = PBXBuildFile; fileRef = 231CD25A1FD68A17004DD25D /* RNInCallManager.m */; }; + 231CD25C1FD68A17004DD25D /* RNInCallManager.h in CopyFiles */ = {isa = PBXBuildFile; fileRef = 231CD2591FD68A17004DD25D /* RNInCallManager.h */; }; +/* End PBXBuildFile section */ + +/* Begin PBXCopyFilesBuildPhase section */ + 231CD2541FD68A17004DD25D /* CopyFiles */ = { + isa = PBXCopyFilesBuildPhase; + buildActionMask = 2147483647; + dstPath = "include/$(PRODUCT_NAME)"; + dstSubfolderSpec = 16; + files = ( + 231CD25C1FD68A17004DD25D /* RNInCallManager.h in CopyFiles */, + ); + runOnlyForDeploymentPostprocessing = 0; + }; +/* End PBXCopyFilesBuildPhase section */ + +/* Begin PBXFileReference section */ + 231CD2561FD68A17004DD25D /* libRNInCallManager.a */ = {isa = PBXFileReference; explicitFileType = archive.ar; includeInIndex = 0; path = libRNInCallManager.a; sourceTree = BUILT_PRODUCTS_DIR; }; + 231CD2591FD68A17004DD25D /* RNInCallManager.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = RNInCallManager.h; sourceTree = ""; }; + 231CD25A1FD68A17004DD25D /* RNInCallManager.m */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.objc; path = RNInCallManager.m; sourceTree = ""; }; +/* End PBXFileReference section */ + +/* Begin PBXFrameworksBuildPhase section */ + 231CD2531FD68A17004DD25D /* Frameworks */ = { + isa = PBXFrameworksBuildPhase; + buildActionMask = 2147483647; + files = ( + ); + runOnlyForDeploymentPostprocessing = 0; + }; +/* End PBXFrameworksBuildPhase section */ + +/* Begin PBXGroup section */ + 231CD24D1FD68A17004DD25D = { + isa = PBXGroup; + children = ( + 231CD2581FD68A17004DD25D /* RNInCallManager */, + 231CD2571FD68A17004DD25D /* Products */, + ); + sourceTree = ""; + }; + 231CD2571FD68A17004DD25D /* Products */ = { + isa = PBXGroup; + children = ( + 
231CD2561FD68A17004DD25D /* libRNInCallManager.a */, + ); + name = Products; + sourceTree = ""; + }; + 231CD2581FD68A17004DD25D /* RNInCallManager */ = { + isa = PBXGroup; + children = ( + 231CD2591FD68A17004DD25D /* RNInCallManager.h */, + 231CD25A1FD68A17004DD25D /* RNInCallManager.m */, + ); + path = RNInCallManager; + sourceTree = ""; + }; +/* End PBXGroup section */ + +/* Begin PBXNativeTarget section */ + 231CD2551FD68A17004DD25D /* RNInCallManager */ = { + isa = PBXNativeTarget; + buildConfigurationList = 231CD25F1FD68A17004DD25D /* Build configuration list for PBXNativeTarget "RNInCallManager" */; + buildPhases = ( + 231CD2521FD68A17004DD25D /* Sources */, + 231CD2531FD68A17004DD25D /* Frameworks */, + 231CD2541FD68A17004DD25D /* CopyFiles */, + ); + buildRules = ( + ); + dependencies = ( + ); + name = RNInCallManager; + productName = RNInCallManager; + productReference = 231CD2561FD68A17004DD25D /* libRNInCallManager.a */; + productType = "com.apple.product-type.library.static"; + }; +/* End PBXNativeTarget section */ + +/* Begin PBXProject section */ + 231CD24E1FD68A17004DD25D /* Project object */ = { + isa = PBXProject; + attributes = { + LastUpgradeCheck = 0910; + ORGANIZATIONNAME = zxcpoiu; + TargetAttributes = { + 231CD2551FD68A17004DD25D = { + CreatedOnToolsVersion = 9.1; + ProvisioningStyle = Automatic; + }; + }; + }; + buildConfigurationList = 231CD2511FD68A17004DD25D /* Build configuration list for PBXProject "RNInCallManager" */; + compatibilityVersion = "Xcode 8.0"; + developmentRegion = en; + hasScannedForEncodings = 0; + knownRegions = ( + en, + ); + mainGroup = 231CD24D1FD68A17004DD25D; + productRefGroup = 231CD2571FD68A17004DD25D /* Products */; + projectDirPath = ""; + projectRoot = ""; + targets = ( + 231CD2551FD68A17004DD25D /* RNInCallManager */, + ); + }; +/* End PBXProject section */ + +/* Begin PBXSourcesBuildPhase section */ + 231CD2521FD68A17004DD25D /* Sources */ = { + isa = PBXSourcesBuildPhase; + buildActionMask = 2147483647; + 
files = ( + 231CD25B1FD68A17004DD25D /* RNInCallManager.m in Sources */, + ); + runOnlyForDeploymentPostprocessing = 0; + }; +/* End PBXSourcesBuildPhase section */ + +/* Begin XCBuildConfiguration section */ + 231CD25D1FD68A17004DD25D /* Debug */ = { + isa = XCBuildConfiguration; + buildSettings = { + ALWAYS_SEARCH_USER_PATHS = NO; + CLANG_ANALYZER_NONNULL = YES; + CLANG_ANALYZER_NUMBER_OBJECT_CONVERSION = YES_AGGRESSIVE; + CLANG_CXX_LANGUAGE_STANDARD = "gnu++14"; + CLANG_CXX_LIBRARY = "libc++"; + CLANG_ENABLE_MODULES = YES; + CLANG_ENABLE_OBJC_ARC = YES; + CLANG_WARN_BLOCK_CAPTURE_AUTORELEASING = YES; + CLANG_WARN_BOOL_CONVERSION = YES; + CLANG_WARN_COMMA = YES; + CLANG_WARN_CONSTANT_CONVERSION = YES; + CLANG_WARN_DIRECT_OBJC_ISA_USAGE = YES_ERROR; + CLANG_WARN_DOCUMENTATION_COMMENTS = YES; + CLANG_WARN_EMPTY_BODY = YES; + CLANG_WARN_ENUM_CONVERSION = YES; + CLANG_WARN_INFINITE_RECURSION = YES; + CLANG_WARN_INT_CONVERSION = YES; + CLANG_WARN_NON_LITERAL_NULL_CONVERSION = YES; + CLANG_WARN_OBJC_LITERAL_CONVERSION = YES; + CLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR; + CLANG_WARN_RANGE_LOOP_ANALYSIS = YES; + CLANG_WARN_STRICT_PROTOTYPES = YES; + CLANG_WARN_SUSPICIOUS_MOVE = YES; + CLANG_WARN_UNGUARDED_AVAILABILITY = YES_AGGRESSIVE; + CLANG_WARN_UNREACHABLE_CODE = YES; + CLANG_WARN__DUPLICATE_METHOD_MATCH = YES; + CODE_SIGN_IDENTITY = "iPhone Developer"; + COPY_PHASE_STRIP = NO; + DEBUG_INFORMATION_FORMAT = dwarf; + ENABLE_STRICT_OBJC_MSGSEND = YES; + ENABLE_TESTABILITY = YES; + GCC_C_LANGUAGE_STANDARD = gnu11; + GCC_DYNAMIC_NO_PIC = NO; + GCC_NO_COMMON_BLOCKS = YES; + GCC_OPTIMIZATION_LEVEL = 0; + GCC_PREPROCESSOR_DEFINITIONS = ( + "DEBUG=1", + "$(inherited)", + ); + GCC_WARN_64_TO_32_BIT_CONVERSION = YES; + GCC_WARN_ABOUT_RETURN_TYPE = YES_ERROR; + GCC_WARN_UNDECLARED_SELECTOR = YES; + GCC_WARN_UNINITIALIZED_AUTOS = YES_AGGRESSIVE; + GCC_WARN_UNUSED_FUNCTION = YES; + GCC_WARN_UNUSED_VARIABLE = YES; + IPHONEOS_DEPLOYMENT_TARGET = 8.0; + MTL_ENABLE_DEBUG_INFO = YES; + 
ONLY_ACTIVE_ARCH = YES; + SDKROOT = iphoneos; + }; + name = Debug; + }; + 231CD25E1FD68A17004DD25D /* Release */ = { + isa = XCBuildConfiguration; + buildSettings = { + ALWAYS_SEARCH_USER_PATHS = NO; + CLANG_ANALYZER_NONNULL = YES; + CLANG_ANALYZER_NUMBER_OBJECT_CONVERSION = YES_AGGRESSIVE; + CLANG_CXX_LANGUAGE_STANDARD = "gnu++14"; + CLANG_CXX_LIBRARY = "libc++"; + CLANG_ENABLE_MODULES = YES; + CLANG_ENABLE_OBJC_ARC = YES; + CLANG_WARN_BLOCK_CAPTURE_AUTORELEASING = YES; + CLANG_WARN_BOOL_CONVERSION = YES; + CLANG_WARN_COMMA = YES; + CLANG_WARN_CONSTANT_CONVERSION = YES; + CLANG_WARN_DIRECT_OBJC_ISA_USAGE = YES_ERROR; + CLANG_WARN_DOCUMENTATION_COMMENTS = YES; + CLANG_WARN_EMPTY_BODY = YES; + CLANG_WARN_ENUM_CONVERSION = YES; + CLANG_WARN_INFINITE_RECURSION = YES; + CLANG_WARN_INT_CONVERSION = YES; + CLANG_WARN_NON_LITERAL_NULL_CONVERSION = YES; + CLANG_WARN_OBJC_LITERAL_CONVERSION = YES; + CLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR; + CLANG_WARN_RANGE_LOOP_ANALYSIS = YES; + CLANG_WARN_STRICT_PROTOTYPES = YES; + CLANG_WARN_SUSPICIOUS_MOVE = YES; + CLANG_WARN_UNGUARDED_AVAILABILITY = YES_AGGRESSIVE; + CLANG_WARN_UNREACHABLE_CODE = YES; + CLANG_WARN__DUPLICATE_METHOD_MATCH = YES; + CODE_SIGN_IDENTITY = "iPhone Developer"; + COPY_PHASE_STRIP = NO; + DEBUG_INFORMATION_FORMAT = "dwarf-with-dsym"; + ENABLE_NS_ASSERTIONS = NO; + ENABLE_STRICT_OBJC_MSGSEND = YES; + GCC_C_LANGUAGE_STANDARD = gnu11; + GCC_NO_COMMON_BLOCKS = YES; + GCC_WARN_64_TO_32_BIT_CONVERSION = YES; + GCC_WARN_ABOUT_RETURN_TYPE = YES_ERROR; + GCC_WARN_UNDECLARED_SELECTOR = YES; + GCC_WARN_UNINITIALIZED_AUTOS = YES_AGGRESSIVE; + GCC_WARN_UNUSED_FUNCTION = YES; + GCC_WARN_UNUSED_VARIABLE = YES; + IPHONEOS_DEPLOYMENT_TARGET = 8.0; + MTL_ENABLE_DEBUG_INFO = NO; + SDKROOT = iphoneos; + VALIDATE_PRODUCT = YES; + }; + name = Release; + }; + 231CD2601FD68A17004DD25D /* Debug */ = { + isa = XCBuildConfiguration; + buildSettings = { + CODE_SIGN_STYLE = Automatic; + OTHER_LDFLAGS = "-ObjC"; + PRODUCT_NAME = 
"$(TARGET_NAME)"; + SKIP_INSTALL = YES; + TARGETED_DEVICE_FAMILY = "1,2"; + }; + name = Debug; + }; + 231CD2611FD68A17004DD25D /* Release */ = { + isa = XCBuildConfiguration; + buildSettings = { + CODE_SIGN_STYLE = Automatic; + OTHER_LDFLAGS = "-ObjC"; + PRODUCT_NAME = "$(TARGET_NAME)"; + SKIP_INSTALL = YES; + TARGETED_DEVICE_FAMILY = "1,2"; + }; + name = Release; + }; +/* End XCBuildConfiguration section */ + +/* Begin XCConfigurationList section */ + 231CD2511FD68A17004DD25D /* Build configuration list for PBXProject "RNInCallManager" */ = { + isa = XCConfigurationList; + buildConfigurations = ( + 231CD25D1FD68A17004DD25D /* Debug */, + 231CD25E1FD68A17004DD25D /* Release */, + ); + defaultConfigurationIsVisible = 0; + defaultConfigurationName = Release; + }; + 231CD25F1FD68A17004DD25D /* Build configuration list for PBXNativeTarget "RNInCallManager" */ = { + isa = XCConfigurationList; + buildConfigurations = ( + 231CD2601FD68A17004DD25D /* Debug */, + 231CD2611FD68A17004DD25D /* Release */, + ); + defaultConfigurationIsVisible = 0; + defaultConfigurationName = Release; + }; +/* End XCConfigurationList section */ + }; + rootObject = 231CD24E1FD68A17004DD25D /* Project object */; +} diff --git a/ios/RNInCallManager/RNInCallManager-Bridging-Header.h b/ios/RNInCallManager/RNInCallManager-Bridging-Header.h deleted file mode 100644 index 6cad69c..0000000 --- a/ios/RNInCallManager/RNInCallManager-Bridging-Header.h +++ /dev/null @@ -1,14 +0,0 @@ -// RNInCallManager-Bridging-Header.h -// RNInCallManager -// -// Created by zxcpoiu, Henry Hung-Hsien Lin on 2016-04-10 -// Copyright 2016 Facebook. All rights reserved. 
-// - -#ifndef RNInCallManager_Bridging_Header_h -#define RNInCallManager_Bridging_Header_h -#import -#import -#import - -#endif /* RNInCallManager_Bridging_Header_h */ diff --git a/ios/RNInCallManager/RNInCallManager.h b/ios/RNInCallManager/RNInCallManager.h new file mode 100644 index 0000000..9013970 --- /dev/null +++ b/ios/RNInCallManager/RNInCallManager.h @@ -0,0 +1,17 @@ +// +// RNInCallManager.h +// RNInCallManager +// +// Created by Ian Yu-Hsun Lin (@ianlin) on 05/12/2017. +// Copyright © 2017 zxcpoiu. All rights reserved. +// + +#import <Foundation/Foundation.h> +#import <AVFoundation/AVFoundation.h> + +#import <React/RCTBridgeModule.h> +#import <React/RCTEventEmitter.h> + +@interface RNInCallManager : RCTEventEmitter <AVAudioPlayerDelegate> + +@end diff --git a/ios/RNInCallManager/RNInCallManager.m b/ios/RNInCallManager/RNInCallManager.m new file mode 100644 index 0000000..7c54f6e --- /dev/null +++ b/ios/RNInCallManager/RNInCallManager.m @@ -0,0 +1,1284 @@ +// +// RNInCallManager.m +// RNInCallManager +// +// Created by Ian Yu-Hsun Lin (@ianlin) on 05/12/2017. +// Copyright © 2017 zxcpoiu. All rights reserved. 
+// + +#import "RNInCallManager.h" + +#import +#import +#import +#import + +//static BOOL const automatic = YES; + +@implementation RNInCallManager +{ + UIDevice *_currentDevice; + + AVAudioSession *_audioSession; + AVAudioPlayer *_ringtone; + AVAudioPlayer *_ringback; + AVAudioPlayer *_busytone; + + NSURL *_defaultRingtoneUri; + NSURL *_defaultRingbackUri; + NSURL *_defaultBusytoneUri; + NSURL *_bundleRingtoneUri; + NSURL *_bundleRingbackUri; + NSURL *_bundleBusytoneUri; + + //BOOL isProximitySupported; + BOOL _proximityIsNear; + + // --- flags indicating which observers have been added + BOOL _isProximityRegistered; + BOOL _isAudioSessionInterruptionRegistered; + BOOL _isAudioSessionRouteChangeRegistered; + BOOL _isAudioSessionMediaServicesWereLostRegistered; + BOOL _isAudioSessionMediaServicesWereResetRegistered; + BOOL _isAudioSessionSilenceSecondaryAudioHintRegistered; + + // --- notification observers + id _proximityObserver; + id _audioSessionInterruptionObserver; + id _audioSessionRouteChangeObserver; + id _audioSessionMediaServicesWereLostObserver; + id _audioSessionMediaServicesWereResetObserver; + id _audioSessionSilenceSecondaryAudioHintObserver; + + NSString *_incallAudioMode; + NSString *_incallAudioCategory; + NSString *_origAudioCategory; + NSString *_origAudioMode; + BOOL _audioSessionInitialized; + int _forceSpeakerOn; + NSString *_media; +} + ++ (BOOL)requiresMainQueueSetup +{ + return NO; +} + +RCT_EXPORT_MODULE(InCallManager) + +- (instancetype)init +{ + if (self = [super init]) { + _currentDevice = [UIDevice currentDevice]; + _audioSession = [AVAudioSession sharedInstance]; + _ringtone = nil; + _ringback = nil; + _busytone = nil; + + _defaultRingtoneUri = nil; + _defaultRingbackUri = nil; + _defaultBusytoneUri = nil; + _bundleRingtoneUri = nil; + _bundleRingbackUri = nil; + _bundleBusytoneUri = nil; + + _proximityIsNear = NO; + + _isProximityRegistered = NO; + _isAudioSessionInterruptionRegistered = NO; + _isAudioSessionRouteChangeRegistered = NO; + 
_isAudioSessionMediaServicesWereLostRegistered = NO; + _isAudioSessionMediaServicesWereResetRegistered = NO; + _isAudioSessionSilenceSecondaryAudioHintRegistered = NO; + + _proximityObserver = nil; + _audioSessionInterruptionObserver = nil; + _audioSessionRouteChangeObserver = nil; + _audioSessionMediaServicesWereLostObserver = nil; + _audioSessionMediaServicesWereResetObserver = nil; + _audioSessionSilenceSecondaryAudioHintObserver = nil; + + _incallAudioMode = AVAudioSessionModeVoiceChat; + _incallAudioCategory = AVAudioSessionCategoryPlayAndRecord; + _origAudioCategory = nil; + _origAudioMode = nil; + _audioSessionInitialized = NO; + _forceSpeakerOn = 0; + _media = @"audio"; + + NSLog(@"RNInCallManager.init(): initialized"); + } + return self; +} + +- (void)dealloc +{ + [[NSNotificationCenter defaultCenter] removeObserver:self]; + [self stop:@""]; +} + +- (NSArray *)supportedEvents +{ + return @[@"Proximity", + @"WiredHeadset"]; +} + +RCT_EXPORT_METHOD(start:(NSString *)mediaType + auto:(BOOL)_auto + ringbackUriType:(NSString *)ringbackUriType) +{ + if (_audioSessionInitialized) { + return; + } + _media = mediaType; + + // --- auto is always true on ios + if ([_media isEqualToString:@"video"]) { + _incallAudioMode = AVAudioSessionModeVideoChat; + } else { + _incallAudioMode = AVAudioSessionModeVoiceChat; + } + NSLog(@"RNInCallManager.start() start InCallManager. media=%@, type=%@, mode=%@", _media, _media, _incallAudioMode); + [self storeOriginalAudioSetup]; + _forceSpeakerOn = 0; + [self startAudioSessionNotification]; + [self audioSessionSetCategory:_incallAudioCategory + options:0 + callerMemo:NSStringFromSelector(_cmd)]; + [self audioSessionSetMode:_incallAudioMode + callerMemo:NSStringFromSelector(_cmd)]; + [self audioSessionSetActive:YES + options:0 + callerMemo:NSStringFromSelector(_cmd)]; + + if (ringbackUriType.length > 0) { + NSLog(@"RNInCallManager.start() play ringback first. 
type=%@", ringbackUriType); + [self startRingback:ringbackUriType]; + } + + if ([_media isEqualToString:@"audio"]) { + [self startProximitySensor]; + } + [self setKeepScreenOn:YES]; + _audioSessionInitialized = YES; + //self.debugAudioSession() +} + +RCT_EXPORT_METHOD(stop:(NSString *)busytoneUriType) +{ + if (!_audioSessionInitialized) { + return; + } + + [self stopRingback]; + + if (busytoneUriType.length > 0 && [self startBusytone:busytoneUriType]) { + // play busytone first, and call this func again when finish + NSLog(@"RNInCallManager.stop(): play busytone before stop"); + return; + } else { + NSLog(@"RNInCallManager.stop(): stop InCallManager"); + [self restoreOriginalAudioSetup]; + [self stopBusytone]; + [self stopProximitySensor]; + [self audioSessionSetActive:NO + options:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation + callerMemo:NSStringFromSelector(_cmd)]; + [self setKeepScreenOn:NO]; + [self stopAudioSessionNotification]; + [[NSNotificationCenter defaultCenter] removeObserver:self]; + _forceSpeakerOn = 0; + _audioSessionInitialized = NO; + } +} + +RCT_EXPORT_METHOD(turnScreenOn) +{ + NSLog(@"RNInCallManager.turnScreenOn(): ios doesn't support turnScreenOn()"); +} + +RCT_EXPORT_METHOD(turnScreenOff) +{ + NSLog(@"RNInCallManager.turnScreenOff(): ios doesn't support turnScreenOff()"); +} + +RCT_EXPORT_METHOD(setFlashOn:(BOOL)enable + brightness:(nonnull NSNumber *)brightness) +{ + if ([AVCaptureDevice class]) { + AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]; + if (device.hasTorch && device.position == AVCaptureDevicePositionBack) { + @try { + [device lockForConfiguration:nil]; + + if (enable) { + [device setTorchMode:AVCaptureTorchModeOn]; + } else { + [device setTorchMode:AVCaptureTorchModeOff]; + } + + [device unlockForConfiguration]; + } @catch (NSException *e) {} + } + } +} + +RCT_EXPORT_METHOD(setKeepScreenOn:(BOOL)enable) +{ + NSLog(@"RNInCallManager.setKeepScreenOn(): enable: %@", enable ? 
@"YES" : @"NO"); + dispatch_async(dispatch_get_main_queue(), ^{ + [[UIApplication sharedApplication] setIdleTimerDisabled:enable]; + }); +} + +RCT_EXPORT_METHOD(setSpeakerphoneOn:(BOOL)enable) +{ + BOOL success; + NSError *error = nil; + NSArray* routes = [_audioSession availableInputs]; + + if(!enable){ + NSLog(@"Routing audio via Earpiece"); + @try { + success = [_audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error]; + if (!success) NSLog(@"Cannot set category due to error: %@", error); + success = [_audioSession setMode:AVAudioSessionModeVoiceChat error:&error]; + if (!success) NSLog(@"Cannot set mode due to error: %@", error); + [_audioSession setPreferredOutputNumberOfChannels:0 error:nil]; + success = [_audioSession overrideOutputAudioPort:AVAudioSessionPortOverrideNone error:&error]; + if (!success) NSLog(@"Port override failed due to: %@", error); + success = [_audioSession setActive:YES error:&error]; + if (!success) NSLog(@"Audio session override failed: %@", error); + else NSLog(@"AudioSession override is successful"); + + } @catch (NSException *e) { + NSLog(@"Error occurred while routing audio via Earpiece: %@", e.reason); + } + } else { + NSLog(@"Routing audio via Loudspeaker"); + @try { + NSLog(@"Available routes: %@", routes); + success = [_audioSession setCategory:AVAudioSessionCategoryPlayAndRecord + withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker + error:&error]; + if (!success) NSLog(@"Cannot set category due to error: %@", error); + success = [_audioSession setMode:AVAudioSessionModeVoiceChat error:&error]; + if (!success) NSLog(@"Cannot set mode due to error: %@", error); + [_audioSession setPreferredOutputNumberOfChannels:0 error:nil]; + success = [_audioSession overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&error]; + if (!success) NSLog(@"Port override failed due to: %@", error); + success = [_audioSession setActive:YES error:&error]; + if (!success) NSLog(@"Audio session override 
failed: %@", error); + else NSLog(@"AudioSession override is successful"); + } @catch (NSException *e) { + NSLog(@"Error occurred while routing audio via Loudspeaker: %@", e.reason); + } + } +} + +RCT_EXPORT_METHOD(setForceSpeakerphoneOn:(int)flag) +{ + _forceSpeakerOn = flag; + NSLog(@"RNInCallManager.setForceSpeakerphoneOn(): flag: %d", flag); + [self updateAudioRoute]; +} + +RCT_EXPORT_METHOD(setMicrophoneMute:(BOOL)enable) +{ + NSLog(@"RNInCallManager.setMicrophoneMute(): ios doesn't support setMicrophoneMute()"); +} + +RCT_EXPORT_METHOD(startRingback:(NSString *)_ringbackUriType) +{ + // you may be rejected by Apple when publishing your app if you use system sounds instead of bundled sounds. + NSLog(@"RNInCallManager.startRingback(): type=%@", _ringbackUriType); + + @try { + if (_ringback != nil) { + if ([_ringback isPlaying]) { + NSLog(@"RNInCallManager.startRingback(): is already playing"); + return; + } else { + [self stopRingback]; + } + } + // iOS doesn't have an embedded DTMF tone generator. use system DTMF sound files. + NSString *ringbackUriType = [_ringbackUriType isEqualToString:@"_DTMF_"] + ? 
@"_DEFAULT_" + : _ringbackUriType; + NSURL *ringbackUri = [self getRingbackUri:ringbackUriType]; + if (ringbackUri == nil) { + NSLog(@"RNInCallManager.startRingback(): no available media"); + return; + } + //self.storeOriginalAudioSetup() + _ringback = [[AVAudioPlayer alloc] initWithContentsOfURL:ringbackUri error:nil]; + _ringback.delegate = self; + _ringback.numberOfLoops = -1; // you need to stop it explicitly + [_ringback prepareToPlay]; + + //self.audioSessionSetCategory(self.incallAudioCategory, [.DefaultToSpeaker, .AllowBluetooth], #function) + [self audioSessionSetCategory:_incallAudioCategory + options:0 + callerMemo:NSStringFromSelector(_cmd)]; + [self audioSessionSetMode:_incallAudioMode + callerMemo:NSStringFromSelector(_cmd)]; + [_ringback play]; + } @catch (NSException *e) { + NSLog(@"RNInCallManager.startRingback(): caught error=%@", e.reason); + } +} + +RCT_EXPORT_METHOD(stopRingback) +{ + if (_ringback != nil) { + NSLog(@"RNInCallManager.stopRingback()"); + [_ringback stop]; + _ringback = nil; + // --- need to reset the route based on config because WebRTC seems to switch the audio mode automatically when the call is established. + //[self updateAudioRoute]; + } +} + +RCT_EXPORT_METHOD(startRingtone:(NSString *)ringtoneUriType + ringtoneCategory:(NSString *)ringtoneCategory) +{ + // you may be rejected by Apple when publishing your app if you use system sounds instead of bundled sounds. + NSLog(@"RNInCallManager.startRingtone(): type: %@", ringtoneUriType); + @try { + if (_ringtone != nil) { + if ([_ringtone isPlaying]) { + NSLog(@"RNInCallManager.startRingtone(): is already playing."); + return; + } else { + [self stopRingtone]; + } + } + NSURL *ringtoneUri = [self getRingtoneUri:ringtoneUriType]; + if (ringtoneUri == nil) { + NSLog(@"RNInCallManager.startRingtone(): no available media"); + return; + } + + // --- iOS has a Ringer/Silent switch, so just play without checking ringer volume. 
+ [self storeOriginalAudioSetup]; + _ringtone = [[AVAudioPlayer alloc] initWithContentsOfURL:ringtoneUri error:nil]; + _ringtone.delegate = self; + _ringtone.numberOfLoops = -1; // you need to stop it explicitly + [_ringtone prepareToPlay]; + + // --- 1. if we use Playback, it supports background playback (when started from the foreground), but it does not obey the Ring/Silent switch. + // --- make sure you have enabled the 'audio' tag ( or 'voip' tag ) at Xcode -> Capabilities -> Background Modes + // --- 2. if we use SoloAmbient, it obeys the Ring/Silent switch in the foreground, but does not support background playback; + // --- thus, you should play the ringtone again via a local notification after the user returns to the home screen during a ring session. + + // we prefer 2. by default, since most users don't want to be interrupted by a ringtone when Silent mode is on. + + //self.audioSessionSetCategory(AVAudioSessionCategoryPlayback, [.DuckOthers], #function) + if ([ringtoneCategory isEqualToString:@"playback"]) { + [self audioSessionSetCategory:AVAudioSessionCategoryPlayback + options:0 + callerMemo:NSStringFromSelector(_cmd)]; + } else { + [self audioSessionSetCategory:AVAudioSessionCategorySoloAmbient + options:0 + callerMemo:NSStringFromSelector(_cmd)]; + } + [self audioSessionSetMode:AVAudioSessionModeDefault + callerMemo:NSStringFromSelector(_cmd)]; + //[self audioSessionSetActive:YES + // options:nil + // callerMemo:NSStringFromSelector(_cmd)]; + [_ringtone play]; + } @catch (NSException *e) { + NSLog(@"RNInCallManager.startRingtone(): caught error = %@", e.reason); + } +} + +RCT_EXPORT_METHOD(stopRingtone) +{ + if (_ringtone != nil) { + NSLog(@"RNInCallManager.stopRingtone()"); + [_ringtone stop]; + _ringtone = nil; + [self restoreOriginalAudioSetup]; + [self audioSessionSetActive:NO + options:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation + callerMemo:NSStringFromSelector(_cmd)]; + } +} + +RCT_EXPORT_METHOD(getAudioUriJS:(NSString *)audioType + fileType:(NSString *)fileType + 
resolve:(RCTPromiseResolveBlock)resolve + reject:(RCTPromiseRejectBlock)reject) +{ + NSURL *result = nil; + if ([audioType isEqualToString:@"ringback"]) { + result = [self getRingbackUri:fileType]; + } else if ([audioType isEqualToString:@"busytone"]) { + result = [self getBusytoneUri:fileType]; + } else if ([audioType isEqualToString:@"ringtone"]) { + result = [self getRingtoneUri:fileType]; + } + if (result != nil) { + if (result.absoluteString.length > 0) { + resolve(result.absoluteString); + return; + } + } + reject(@"error_code", @"getAudioUriJS() failed", RCTErrorWithMessage(@"getAudioUriJS() failed")); +} + +RCT_EXPORT_METHOD(getIsWiredHeadsetPluggedIn:(RCTPromiseResolveBlock)resolve + reject:(RCTPromiseRejectBlock)reject) +{ + BOOL wiredHeadsetPluggedIn = [self isWiredHeadsetPluggedIn]; + resolve(wiredHeadsetPluggedIn ? @YES : @NO); +} + +- (void)updateAudioRoute +{ + NSLog(@"RNInCallManager.updateAudioRoute(): [Enter] forceSpeakerOn flag=%d media=%@ category=%@ mode=%@", _forceSpeakerOn, _media, _audioSession.category, _audioSession.mode); + //self.debugAudioSession() + + //AVAudioSessionPortOverride overrideAudioPort; + int overrideAudioPort; + NSString *overrideAudioPortString = @""; + NSString *audioMode = @""; + + // --- WebRTC native code will change the audio mode automatically when the call is established. + // --- There would be a race condition if we changed the audio mode at the same time as webrtc. + // --- So we should change the audio mode as little as possible; only for a default video call that wants to force the speaker off. + // --- audio: only override speaker on/off; video: change the category if needed and handle the proximity sensor. ( because proximity is off by default during a video call ) + if (_forceSpeakerOn == 1) { + // --- force ON, override speaker only, keep the audio mode unchanged. 
+ overrideAudioPort = AVAudioSessionPortOverrideSpeaker; + overrideAudioPortString = @".Speaker"; + if ([_media isEqualToString:@"video"]) { + audioMode = AVAudioSessionModeVideoChat; + [self stopProximitySensor]; + } + } else if (_forceSpeakerOn == -1) { + // --- force off + overrideAudioPort = AVAudioSessionPortOverrideNone; + overrideAudioPortString = @".None"; + if ([_media isEqualToString:@"video"]) { + audioMode = AVAudioSessionModeVoiceChat; + [self startProximitySensor]; + } + } else { // use default behavior + overrideAudioPort = AVAudioSessionPortOverrideNone; + overrideAudioPortString = @".None"; + if ([_media isEqualToString:@"video"]) { + audioMode = AVAudioSessionModeVideoChat; + [self stopProximitySensor]; + } + } + + BOOL isCurrentRouteToSpeaker; + isCurrentRouteToSpeaker = [self checkAudioRoute:@[AVAudioSessionPortBuiltInSpeaker] + routeType:@"output"]; + if ((overrideAudioPort == AVAudioSessionPortOverrideSpeaker && !isCurrentRouteToSpeaker) + || (overrideAudioPort == AVAudioSessionPortOverrideNone && isCurrentRouteToSpeaker)) { + @try { + [_audioSession overrideOutputAudioPort:overrideAudioPort error:nil]; + NSLog(@"RNInCallManager.updateAudioRoute(): audioSession.overrideOutputAudioPort(%@) success", overrideAudioPortString); + } @catch (NSException *e) { + NSLog(@"RNInCallManager.updateAudioRoute(): audioSession.overrideOutputAudioPort(%@) fail: %@", overrideAudioPortString, e.reason); + } + } else { + NSLog(@"RNInCallManager.updateAudioRoute(): did NOT overrideOutputAudioPort()"); + } + + if (![_audioSession.category isEqualToString:_incallAudioCategory]) { + [self audioSessionSetCategory:_incallAudioCategory + options:0 + callerMemo:NSStringFromSelector(_cmd)]; + NSLog(@"RNInCallManager.updateAudioRoute() audio category has changed to %@", _incallAudioCategory); + } else { + NSLog(@"RNInCallManager.updateAudioRoute() did NOT change audio category"); + } + + if (audioMode.length > 0 && ![_audioSession.mode isEqualToString:audioMode]) { + [self 
audioSessionSetMode:audioMode + callerMemo:NSStringFromSelector(_cmd)]; + NSLog(@"RNInCallManager.updateAudioRoute() audio mode has changed to %@", audioMode); + } else { + NSLog(@"RNInCallManager.updateAudioRoute() did NOT change audio mode"); + } + //self.debugAudioSession() +} + +- (BOOL)checkAudioRoute:(NSArray *)targetPortTypeArray + routeType:(NSString *)routeType +{ + AVAudioSessionRouteDescription *currentRoute = _audioSession.currentRoute; + + if (currentRoute != nil) { + NSArray *routes = [routeType isEqualToString:@"input"] + ? currentRoute.inputs + : currentRoute.outputs; + for (AVAudioSessionPortDescription *portDescription in routes) { + if ([targetPortTypeArray containsObject:portDescription.portType]) { + return YES; + } + } + } + return NO; +} + +- (BOOL)startBusytone:(NSString *)_busytoneUriType +{ + // you may be rejected by Apple when publishing your app if you use system sounds instead of bundled sounds. + NSLog(@"RNInCallManager.startBusytone(): type: %@", _busytoneUriType); + @try { + if (_busytone != nil) { + if ([_busytone isPlaying]) { + NSLog(@"RNInCallManager.startBusytone(): is already playing"); + return NO; + } else { + [self stopBusytone]; + } + } + + // iOS doesn't have an embedded DTMF tone generator. use system DTMF sound files. + NSString *busytoneUriType = [_busytoneUriType isEqualToString:@"_DTMF_"] + ? 
@"_DEFAULT_" + : _busytoneUriType; + NSURL *busytoneUri = [self getBusytoneUri:busytoneUriType]; + if (busytoneUri == nil) { + NSLog(@"RNInCallManager.startBusytone(): no available media"); + return NO; + } + //[self storeOriginalAudioSetup]; + _busytone = [[AVAudioPlayer alloc] initWithContentsOfURL:busytoneUri error:nil]; + _busytone.delegate = self; + _busytone.numberOfLoops = 0; // it's part of start(), will stop at stop() + [_busytone prepareToPlay]; + + //self.audioSessionSetCategory(self.incallAudioCategory, [.DefaultToSpeaker, .AllowBluetooth], #function) + [self audioSessionSetCategory:_incallAudioCategory + options:0 + callerMemo:NSStringFromSelector(_cmd)]; + [self audioSessionSetMode:_incallAudioMode + callerMemo:NSStringFromSelector(_cmd)]; + [_busytone play]; + } @catch (NSException *e) { + NSLog(@"RNInCallManager.startBusytone(): caught error = %@", e.reason); + return NO; + } + return YES; +} + +- (void)stopBusytone +{ + if (_busytone != nil) { + NSLog(@"RNInCallManager.stopBusytone()"); + [_busytone stop]; + _busytone = nil; + } +} + +- (BOOL)isWiredHeadsetPluggedIn +{ + // --- only check for an audio device plugged into the headset port, not bluetooth/usb/hdmi + return [self checkAudioRoute:@[AVAudioSessionPortHeadphones] + routeType:@"output"] + || [self checkAudioRoute:@[AVAudioSessionPortHeadsetMic] + routeType:@"input"]; +} + +- (void)audioSessionSetCategory:(NSString *)audioCategory + options:(AVAudioSessionCategoryOptions)options + callerMemo:(NSString *)callerMemo +{ + @try { + if (options != 0) { + [_audioSession setCategory:audioCategory + withOptions:options + error:nil]; + } else { + [_audioSession setCategory:audioCategory + error:nil]; + } + NSLog(@"RNInCallManager.%@: audioSession.setCategory: %@, withOptions: %lu success", callerMemo, audioCategory, (unsigned long)options); + } @catch (NSException *e) { + NSLog(@"RNInCallManager.%@: audioSession.setCategory: %@, withOptions: %lu fail: %@", callerMemo, audioCategory, (unsigned 
long)options, e.reason); + } +} + +- (void)audioSessionSetMode:(NSString *)audioMode + callerMemo:(NSString *)callerMemo +{ + @try { + [_audioSession setMode:audioMode error:nil]; + NSLog(@"RNInCallManager.%@: audioSession.setMode(%@) success", callerMemo, audioMode); + } @catch (NSException *e) { + NSLog(@"RNInCallManager.%@: audioSession.setMode(%@) fail: %@", callerMemo, audioMode, e.reason); + } +} + +- (void)audioSessionSetActive:(BOOL)audioActive + options:(AVAudioSessionSetActiveOptions)options + callerMemo:(NSString *)callerMemo +{ + @try { + if (options != 0) { + [_audioSession setActive:audioActive + withOptions:options + error:nil]; + } else { + [_audioSession setActive:audioActive + error:nil]; + } + NSLog(@"RNInCallManager.%@: audioSession.setActive(%@), withOptions: %lu success", callerMemo, audioActive ? @"YES" : @"NO", (unsigned long)options); + } @catch (NSException *e) { + NSLog(@"RNInCallManager.%@: audioSession.setActive(%@), withOptions: %lu fail: %@", callerMemo, audioActive ? 
@"YES" : @"NO", (unsigned long)options, e.reason); + } +} + +- (void)storeOriginalAudioSetup +{ + NSLog(@"RNInCallManager.storeOriginalAudioSetup(): origAudioCategory=%@, origAudioMode=%@", _audioSession.category, _audioSession.mode); + _origAudioCategory = _audioSession.category; + _origAudioMode = _audioSession.mode; +} + +- (void)restoreOriginalAudioSetup +{ + NSLog(@"RNInCallManager.restoreOriginalAudioSetup(): origAudioCategory=%@, origAudioMode=%@", _audioSession.category, _audioSession.mode); + [self audioSessionSetCategory:_origAudioCategory + options:0 + callerMemo:NSStringFromSelector(_cmd)]; + [self audioSessionSetMode:_origAudioMode + callerMemo:NSStringFromSelector(_cmd)]; +} + +RCT_EXPORT_METHOD(startProximitySensor) +{ + if (_isProximityRegistered) { + return; + } + + NSLog(@"RNInCallManager.startProximitySensor()"); + dispatch_async(dispatch_get_main_queue(), ^{ + self->_currentDevice.proximityMonitoringEnabled = YES; + }); + + // --- in case it didn't deallocate when ViewDidUnload + [self stopObserve:_proximityObserver + name:UIDeviceProximityStateDidChangeNotification + object:nil]; + + _proximityObserver = [self startObserve:UIDeviceProximityStateDidChangeNotification + object:_currentDevice + queue: nil + block:^(NSNotification *notification) { + BOOL state = self->_currentDevice.proximityState; + if (state != self->_proximityIsNear) { + NSLog(@"RNInCallManager.UIDeviceProximityStateDidChangeNotification(): isNear: %@", state ? @"YES" : @"NO"); + self->_proximityIsNear = state; + [self sendEventWithName:@"Proximity" body:@{@"isNear": state ? 
@YES : @NO}]; + } + }]; + + _isProximityRegistered = YES; +} + +RCT_EXPORT_METHOD(stopProximitySensor) +{ + if (!_isProximityRegistered) { + return; + } + + NSLog(@"RNInCallManager.stopProximitySensor()"); + dispatch_async(dispatch_get_main_queue(), ^{ + self->_currentDevice.proximityMonitoringEnabled = NO; + }); + + // --- remove all no matter what object + [self stopObserve:_proximityObserver + name:UIDeviceProximityStateDidChangeNotification + object:nil]; + + _isProximityRegistered = NO; +} + +- (void)startAudioSessionNotification +{ + NSLog(@"RNInCallManager.startAudioSessionNotification() starting..."); + [self startAudioSessionInterruptionNotification]; + [self startAudioSessionRouteChangeNotification]; + [self startAudioSessionMediaServicesWereLostNotification]; + [self startAudioSessionMediaServicesWereResetNotification]; + [self startAudioSessionSilenceSecondaryAudioHintNotification]; +} + +- (void)stopAudioSessionNotification +{ + NSLog(@"RNInCallManager.stopAudioSessionNotification() stopping..."); + [self stopAudioSessionInterruptionNotification]; + [self stopAudioSessionRouteChangeNotification]; + [self stopAudioSessionMediaServicesWereLostNotification]; + [self stopAudioSessionMediaServicesWereResetNotification]; + [self stopAudioSessionSilenceSecondaryAudioHintNotification]; +} + +- (void)startAudioSessionInterruptionNotification +{ + if (_isAudioSessionInterruptionRegistered) { + return; + } + NSLog(@"RNInCallManager.startAudioSessionInterruptionNotification()"); + + // --- in case it didn't deallocate when ViewDidUnload + [self stopObserve:_audioSessionInterruptionObserver + name:AVAudioSessionInterruptionNotification + object:nil]; + + _audioSessionInterruptionObserver = [self startObserve:AVAudioSessionInterruptionNotification + object:nil + queue:nil + block:^(NSNotification *notification) { + if (notification.userInfo == nil + || ![notification.name isEqualToString:AVAudioSessionInterruptionNotification]) { + return; + } + + //NSUInteger 
rawValue = notification.userInfo[AVAudioSessionInterruptionTypeKey].unsignedIntegerValue; + NSNumber *interruptType = [notification.userInfo objectForKey:@"AVAudioSessionInterruptionTypeKey"]; + if ([interruptType unsignedIntegerValue] == AVAudioSessionInterruptionTypeBegan) { + NSLog(@"RNInCallManager.AudioSessionInterruptionNotification: Began"); + } else if ([interruptType unsignedIntegerValue] == AVAudioSessionInterruptionTypeEnded) { + NSLog(@"RNInCallManager.AudioSessionInterruptionNotification: Ended"); + } else { + NSLog(@"RNInCallManager.AudioSessionInterruptionNotification: Unknown Value"); + } + //NSLog(@"RNInCallManager.AudioSessionInterruptionNotification: could not resolve notification"); + }]; + + _isAudioSessionInterruptionRegistered = YES; +} + +- (void)stopAudioSessionInterruptionNotification +{ + if (!_isAudioSessionInterruptionRegistered) { + return; + } + NSLog(@"RNInCallManager.stopAudioSessionInterruptionNotification()"); + // --- remove all no matter what object + [self stopObserve:_audioSessionInterruptionObserver + name:AVAudioSessionInterruptionNotification + object: nil]; + _isAudioSessionInterruptionRegistered = NO; +} + +- (void)startAudioSessionRouteChangeNotification +{ + if (_isAudioSessionRouteChangeRegistered) { + return; + } + + NSLog(@"RNInCallManager.startAudioSessionRouteChangeNotification()"); + + // --- in case it didn't deallocate when ViewDidUnload + [self stopObserve:_audioSessionRouteChangeObserver + name: AVAudioSessionRouteChangeNotification + object: nil]; + + _audioSessionRouteChangeObserver = [self startObserve:AVAudioSessionRouteChangeNotification + object: nil + queue: nil + block:^(NSNotification *notification) { + if (notification.userInfo == nil + || ![notification.name isEqualToString:AVAudioSessionRouteChangeNotification]) { + return; + } + + NSNumber *routeChangeType = [notification.userInfo objectForKey:@"AVAudioSessionRouteChangeReasonKey"]; + NSUInteger routeChangeTypeValue = [routeChangeType 
unsignedIntegerValue]; + + switch (routeChangeTypeValue) { + case AVAudioSessionRouteChangeReasonUnknown: + NSLog(@"RNInCallManager.AudioRouteChange.Reason: Unknown"); + break; + case AVAudioSessionRouteChangeReasonNewDeviceAvailable: + NSLog(@"RNInCallManager.AudioRouteChange.Reason: NewDeviceAvailable"); + if ([self checkAudioRoute:@[AVAudioSessionPortHeadsetMic] + routeType:@"input"]) { + [self sendEventWithName:@"WiredHeadset" + body:@{ + @"isPlugged": @YES, + @"hasMic": @YES, + @"deviceName": AVAudioSessionPortHeadsetMic, + }]; + } else if ([self checkAudioRoute:@[AVAudioSessionPortHeadphones] + routeType:@"output"]) { + [self sendEventWithName:@"WiredHeadset" + body:@{ + @"isPlugged": @YES, + @"hasMic": @NO, + @"deviceName": AVAudioSessionPortHeadphones, + }]; + } + break; + case AVAudioSessionRouteChangeReasonOldDeviceUnavailable: + NSLog(@"RNInCallManager.AudioRouteChange.Reason: OldDeviceUnavailable"); + if (![self isWiredHeadsetPluggedIn]) { + [self sendEventWithName:@"WiredHeadset" + body:@{ + @"isPlugged": @NO, + @"hasMic": @NO, + @"deviceName": @"", + }]; + } + break; + case AVAudioSessionRouteChangeReasonCategoryChange: + NSLog(@"RNInCallManager.AudioRouteChange.Reason: CategoryChange. category=%@ mode=%@", self->_audioSession.category, self->_audioSession.mode); + [self updateAudioRoute]; + break; + case AVAudioSessionRouteChangeReasonOverride: + NSLog(@"RNInCallManager.AudioRouteChange.Reason: Override"); + break; + case AVAudioSessionRouteChangeReasonWakeFromSleep: + NSLog(@"RNInCallManager.AudioRouteChange.Reason: WakeFromSleep"); + break; + case AVAudioSessionRouteChangeReasonNoSuitableRouteForCategory: + NSLog(@"RNInCallManager.AudioRouteChange.Reason: NoSuitableRouteForCategory"); + break; + case AVAudioSessionRouteChangeReasonRouteConfigurationChange: + NSLog(@"RNInCallManager.AudioRouteChange.Reason: RouteConfigurationChange. 
category=%@ mode=%@", self->_audioSession.category, self->_audioSession.mode); + break; + default: + NSLog(@"RNInCallManager.AudioRouteChange.Reason: Unknown Value"); + break; + } + + NSNumber *silenceSecondaryAudioHintType = [notification.userInfo objectForKey:@"AVAudioSessionSilenceSecondaryAudioHintTypeKey"]; + NSUInteger silenceSecondaryAudioHintTypeValue = [silenceSecondaryAudioHintType unsignedIntegerValue]; + switch (silenceSecondaryAudioHintTypeValue) { + case AVAudioSessionSilenceSecondaryAudioHintTypeBegin: + NSLog(@"RNInCallManager.AudioRouteChange.SilenceSecondaryAudioHint: Begin"); + break; + case AVAudioSessionSilenceSecondaryAudioHintTypeEnd: + NSLog(@"RNInCallManager.AudioRouteChange.SilenceSecondaryAudioHint: End"); + break; + default: + NSLog(@"RNInCallManager.AudioRouteChange.SilenceSecondaryAudioHint: Unknown Value"); + break; + } + }]; + + _isAudioSessionRouteChangeRegistered = YES; +} + +- (void)stopAudioSessionRouteChangeNotification +{ + if (!_isAudioSessionRouteChangeRegistered) { + return; + } + + NSLog(@"RNInCallManager.stopAudioSessionRouteChangeNotification()"); + // --- remove all no matter what object + [self stopObserve:_audioSessionRouteChangeObserver + name:AVAudioSessionRouteChangeNotification + object:nil]; + _isAudioSessionRouteChangeRegistered = NO; +} + +- (void)startAudioSessionMediaServicesWereLostNotification +{ + if (_isAudioSessionMediaServicesWereLostRegistered) { + return; + } + + NSLog(@"RNInCallManager.startAudioSessionMediaServicesWereLostNotification()"); + + // --- in case it didn't deallocate when ViewDidUnload + [self stopObserve:_audioSessionMediaServicesWereLostObserver + name:AVAudioSessionMediaServicesWereLostNotification + object:nil]; + + _audioSessionMediaServicesWereLostObserver = [self startObserve:AVAudioSessionMediaServicesWereLostNotification + object:nil + queue:nil + block:^(NSNotification *notification) { + // --- This notification has no userInfo dictionary. 
+ NSLog(@"RNInCallManager.AudioSessionMediaServicesWereLostNotification: Media Services Were Lost"); + }]; + + _isAudioSessionMediaServicesWereLostRegistered = YES; +} + +- (void)stopAudioSessionMediaServicesWereLostNotification +{ + if (!_isAudioSessionMediaServicesWereLostRegistered) { + return; + } + + NSLog(@"RNInCallManager.stopAudioSessionMediaServicesWereLostNotification()"); + + // --- remove all no matter what object + [self stopObserve:_audioSessionMediaServicesWereLostObserver + name:AVAudioSessionMediaServicesWereLostNotification + object:nil]; + + _isAudioSessionMediaServicesWereLostRegistered = NO; +} + +- (void)startAudioSessionMediaServicesWereResetNotification +{ + if (_isAudioSessionMediaServicesWereResetRegistered) { + return; + } + + NSLog(@"RNInCallManager.startAudioSessionMediaServicesWereResetNotification()"); + + // --- in case it didn't deallocate when ViewDidUnload + [self stopObserve:_audioSessionMediaServicesWereResetObserver + name:AVAudioSessionMediaServicesWereResetNotification + object:nil]; + + _audioSessionMediaServicesWereResetObserver = [self startObserve:AVAudioSessionMediaServicesWereResetNotification + object:nil + queue:nil + block:^(NSNotification *notification) { + // --- This notification has no userInfo dictionary. 
+ NSLog(@"RNInCallManager.AudioSessionMediaServicesWereResetNotification: Media Services Were Reset"); + }]; + + _isAudioSessionMediaServicesWereResetRegistered = YES; +} + +- (void)stopAudioSessionMediaServicesWereResetNotification +{ + if (!_isAudioSessionMediaServicesWereResetRegistered) { + return; + } + + NSLog(@"RNInCallManager.stopAudioSessionMediaServicesWereResetNotification()"); + + // --- remove all no matter what object + [self stopObserve:_audioSessionMediaServicesWereResetObserver + name:AVAudioSessionMediaServicesWereResetNotification + object:nil]; + + _isAudioSessionMediaServicesWereResetRegistered = NO; +} + +- (void)startAudioSessionSilenceSecondaryAudioHintNotification +{ + if (_isAudioSessionSilenceSecondaryAudioHintRegistered) { + return; + } + + NSLog(@"RNInCallManager.startAudioSessionSilenceSecondaryAudioHintNotification()"); + + // --- in case it didn't deallocate when ViewDidUnload + [self stopObserve:_audioSessionSilenceSecondaryAudioHintObserver + name:AVAudioSessionSilenceSecondaryAudioHintNotification + object:nil]; + + _audioSessionSilenceSecondaryAudioHintObserver = [self startObserve:AVAudioSessionSilenceSecondaryAudioHintNotification + object:nil + queue:nil + block:^(NSNotification *notification) { + if (notification.userInfo == nil + || ![notification.name isEqualToString:AVAudioSessionSilenceSecondaryAudioHintNotification]) { + return; + } + + NSNumber *silenceSecondaryAudioHintType = [notification.userInfo objectForKey:@"AVAudioSessionSilenceSecondaryAudioHintTypeKey"]; + NSUInteger silenceSecondaryAudioHintTypeValue = [silenceSecondaryAudioHintType unsignedIntegerValue]; + switch (silenceSecondaryAudioHintTypeValue) { + case AVAudioSessionSilenceSecondaryAudioHintTypeBegin: + NSLog(@"RNInCallManager.AVAudioSessionSilenceSecondaryAudioHintNotification: Begin"); + break; + case AVAudioSessionSilenceSecondaryAudioHintTypeEnd: + NSLog(@"RNInCallManager.AVAudioSessionSilenceSecondaryAudioHintNotification: End"); + break; + 
default: + NSLog(@"RNInCallManager.AVAudioSessionSilenceSecondaryAudioHintNotification: Unknown Value"); + break; + } + }]; + _isAudioSessionSilenceSecondaryAudioHintRegistered = YES; +} + +- (void)stopAudioSessionSilenceSecondaryAudioHintNotification +{ + if (!_isAudioSessionSilenceSecondaryAudioHintRegistered) { + return; + } + + NSLog(@"RNInCallManager.stopAudioSessionSilenceSecondaryAudioHintNotification()"); + // --- remove all no matter what object + [self stopObserve:_audioSessionSilenceSecondaryAudioHintObserver + name:AVAudioSessionSilenceSecondaryAudioHintNotification + object:nil]; + + _isAudioSessionSilenceSecondaryAudioHintRegistered = NO; +} + +- (id)startObserve:(NSString *)name + object:(id)object + queue:(NSOperationQueue *)queue + block:(void (^)(NSNotification *))block +{ + return [[NSNotificationCenter defaultCenter] addObserverForName:name + object:object + queue:queue + usingBlock:block]; +} + +- (void)stopObserve:(id)observer + name:(NSString *)name + object:(id)object +{ + if (observer == nil) return; + [[NSNotificationCenter defaultCenter] removeObserver:observer + name:name + object:object]; +} + +- (NSURL *)getRingbackUri:(NSString *)_type +{ + NSString *fileBundle = @"incallmanager_ringback"; + NSString *fileBundleExt = @"mp3"; + //NSString *fileSysWithExt = @"vc~ringing.caf"; // --- ringtone of facetime, but can't play it. + //NSString *fileSysPath = @"/System/Library/Audio/UISounds"; + NSString *fileSysWithExt = @"Marimba.m4r"; + NSString *fileSysPath = @"/Library/Ringtones"; + + // --- you can't get the default user preference sound in ios + NSString *type = [_type isEqualToString:@""] || [_type isEqualToString:@"_DEFAULT_"] + ? 
fileSysWithExt + : _type; + + NSURL *bundleUri = _bundleRingbackUri; + NSURL *defaultUri = _defaultRingbackUri; + + NSURL *uri = [self getAudioUri:type + fileBundle:fileBundle + fileBundleExt:fileBundleExt + fileSysWithExt:fileSysWithExt + fileSysPath:fileSysPath + uriBundle:&bundleUri + uriDefault:&defaultUri]; + + _bundleRingbackUri = bundleUri; + _defaultRingbackUri = defaultUri; + + return uri; +} + +- (NSURL *)getBusytoneUri:(NSString *)_type +{ + NSString *fileBundle = @"incallmanager_busytone"; + NSString *fileBundleExt = @"mp3"; + NSString *fileSysWithExt = @"ct-busy.caf"; //ct-congestion.caf + NSString *fileSysPath = @"/System/Library/Audio/UISounds"; + // --- you can't get the default user preference sound in ios + NSString *type = [_type isEqualToString:@""] || [_type isEqualToString:@"_DEFAULT_"] + ? fileSysWithExt + : _type; + + NSURL *bundleUri = _bundleBusytoneUri; + NSURL *defaultUri = _defaultBusytoneUri; + + NSURL *uri = [self getAudioUri:type + fileBundle:fileBundle + fileBundleExt:fileBundleExt + fileSysWithExt:fileSysWithExt + fileSysPath:fileSysPath + uriBundle:&bundleUri + uriDefault:&defaultUri]; + + _bundleBusytoneUri = bundleUri; + _defaultBusytoneUri = defaultUri; + + return uri; +} + +- (NSURL *)getRingtoneUri:(NSString *)_type +{ + NSString *fileBundle = @"incallmanager_ringtone"; + NSString *fileBundleExt = @"mp3"; + NSString *fileSysWithExt = @"Opening.m4r"; //Marimba.m4r + NSString *fileSysPath = @"/Library/Ringtones"; + // --- you can't get the default user preference sound in ios + NSString *type = [_type isEqualToString:@""] || [_type isEqualToString:@"_DEFAULT_"] + ? 
fileSysWithExt + : _type; + + NSURL *bundleUri = _bundleRingtoneUri; + NSURL *defaultUri = _defaultRingtoneUri; + + NSURL *uri = [self getAudioUri:type + fileBundle:fileBundle + fileBundleExt:fileBundleExt + fileSysWithExt:fileSysWithExt + fileSysPath:fileSysPath + uriBundle:&bundleUri + uriDefault:&defaultUri]; + + _bundleRingtoneUri = bundleUri; + _defaultRingtoneUri = defaultUri; + + return uri; +} + +- (NSURL *)getAudioUri:(NSString *)_type + fileBundle:(NSString *)fileBundle + fileBundleExt:(NSString *)fileBundleExt + fileSysWithExt:(NSString *)fileSysWithExt + fileSysPath:(NSString *)fileSysPath + uriBundle:(NSURL **)uriBundle + uriDefault:(NSURL **)uriDefault +{ + NSString *type = _type; + if ([type isEqualToString:@"_BUNDLE_"]) { + if (*uriBundle == nil) { + *uriBundle = [[NSBundle mainBundle] URLForResource:fileBundle withExtension:fileBundleExt]; + if (*uriBundle == nil) { + NSLog(@"RNInCallManager.getAudioUri(): %@.%@ not found in bundle.", fileBundle, fileBundleExt); + type = fileSysWithExt; + } else { + return *uriBundle; + } + } else { + return *uriBundle; + } + } + + if (*uriDefault == nil) { + NSString *target = [NSString stringWithFormat:@"%@/%@", fileSysPath, type]; + *uriDefault = [self getSysFileUri:target]; + } + return *uriDefault; +} + +- (NSURL *)getSysFileUri:(NSString *)target +{ + NSURL *url = [[NSURL alloc] initFileURLWithPath:target isDirectory:NO]; + + if (url != nil) { + NSString *path = url.path; + if (path != nil) { + NSFileManager *fileManager = [[NSFileManager alloc] init]; + BOOL isTargetDirectory; + if ([fileManager fileExistsAtPath:path isDirectory:&isTargetDirectory]) { + if (!isTargetDirectory) { + return url; + } + } + } + } + NSLog(@"RNInCallManager.getSysFileUri(): cannot get url for %@", target); + return nil; +} + +#pragma mark - AVAudioPlayerDelegate + +// --- this is only called after all loops have finished playing; an infinite loop (numberOfLoops = -1) never reaches here. 
+- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player + successfully:(BOOL)flag +{ + NSString *filename = player.url.URLByDeletingPathExtension.lastPathComponent; + NSLog(@"RNInCallManager.audioPlayerDidFinishPlaying(): finished playing: %@", filename); + if ([filename isEqualToString:_bundleBusytoneUri.URLByDeletingPathExtension.lastPathComponent] + || [filename isEqualToString:_defaultBusytoneUri.URLByDeletingPathExtension.lastPathComponent]) { + //[self stopBusytone]; + NSLog(@"RNInCallManager.audioPlayerDidFinishPlaying(): busytone finished, invoke stop()"); + [self stop:@""]; + } +} + +- (void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player + error:(NSError *)error +{ + NSString *filename = player.url.URLByDeletingPathExtension.lastPathComponent; + NSLog(@"RNInCallManager.audioPlayerDecodeErrorDidOccur(): player=%@, error=%@", filename, error.localizedDescription); +} + +// --- Deprecated in iOS 8.0. +//- (void)audioPlayerBeginInterruption:(AVAudioPlayer *)player +//{ +//} + +// --- Deprecated in iOS 8.0. 
+//- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player +//{ +//} + +//- (void)debugAudioSession +//{ +// let currentRoute: Dictionary = ["input": self.audioSession.currentRoute.inputs[0].uid, "output": self.audioSession.currentRoute.outputs[0].uid] +// var categoryOptions = "" +// switch self.audioSession.categoryOptions { +// case AVAudioSessionCategoryOptions.mixWithOthers: +// categoryOptions = "MixWithOthers" +// case AVAudioSessionCategoryOptions.duckOthers: +// categoryOptions = "DuckOthers" +// case AVAudioSessionCategoryOptions.allowBluetooth: +// categoryOptions = "AllowBluetooth" +// case AVAudioSessionCategoryOptions.defaultToSpeaker: +// categoryOptions = "DefaultToSpeaker" +// default: +// categoryOptions = "unknow" +// } +// if #available(iOS 9, *) { +// if categoryOptions == "unknow" && self.audioSession.categoryOptions == AVAudioSessionCategoryOptions.interruptSpokenAudioAndMixWithOthers { +// categoryOptions = "InterruptSpokenAudioAndMixWithOthers" +// } +// } +// self._checkRecordPermission() +// let audioSessionProperties: Dictionary = [ +// "category": self.audioSession.category, +// "categoryOptions": categoryOptions, +// "mode": self.audioSession.mode, +// //"inputAvailable": self.audioSession.inputAvailable, +// "otherAudioPlaying": self.audioSession.isOtherAudioPlaying, +// "recordPermission" : self.recordPermission, +// //"availableInputs": self.audioSession.availableInputs, +// //"preferredInput": self.audioSession.preferredInput, +// //"inputDataSources": self.audioSession.inputDataSources, +// //"inputDataSource": self.audioSession.inputDataSource, +// //"outputDataSources": self.audioSession.outputDataSources, +// //"outputDataSource": self.audioSession.outputDataSource, +// "currentRoute": currentRoute, +// "outputVolume": self.audioSession.outputVolume, +// "inputGain": self.audioSession.inputGain, +// "inputGainSettable": self.audioSession.isInputGainSettable, +// "inputLatency": self.audioSession.inputLatency, +// 
"outputLatency": self.audioSession.outputLatency, +// "sampleRate": self.audioSession.sampleRate, +// "preferredSampleRate": self.audioSession.preferredSampleRate, +// "IOBufferDuration": self.audioSession.ioBufferDuration, +// "preferredIOBufferDuration": self.audioSession.preferredIOBufferDuration, +// "inputNumberOfChannels": self.audioSession.inputNumberOfChannels, +// "maximumInputNumberOfChannels": self.audioSession.maximumInputNumberOfChannels, +// "preferredInputNumberOfChannels": self.audioSession.preferredInputNumberOfChannels, +// "outputNumberOfChannels": self.audioSession.outputNumberOfChannels, +// "maximumOutputNumberOfChannels": self.audioSession.maximumOutputNumberOfChannels, +// "preferredOutputNumberOfChannels": self.audioSession.preferredOutputNumberOfChannels +// ] +// /* +// // --- Too noisy +// if #available(iOS 8, *) { +// //audioSessionProperties["secondaryAudioShouldBeSilencedHint"] = self.audioSession.secondaryAudioShouldBeSilencedHint +// } else { +// //audioSessionProperties["secondaryAudioShouldBeSilencedHint"] = "unknow" +// } +// if #available(iOS 9, *) { +// //audioSessionProperties["availableCategories"] = self.audioSession.availableCategories +// //audioSessionProperties["availableModes"] = self.audioSession.availableModes +// } +// */ +// NSLog("RNInCallManager.debugAudioSession(): ==========BEGIN==========") +// // iterate over all keys +// for (key, value) in audioSessionProperties { +// NSLog("\(key) = \(value)") +// } +// NSLog("RNInCallManager.debugAudioSession(): ==========END==========") +//} + +@end diff --git a/ios/RNInCallManager/RNInCallManager.swift b/ios/RNInCallManager/RNInCallManager.swift deleted file mode 100644 index 09fbfc7..0000000 --- a/ios/RNInCallManager/RNInCallManager.swift +++ /dev/null @@ -1,996 +0,0 @@ -// RNInCallManager.swift -// RNInCallManager -// -// Created by zxcpoiu, Henry Hung-Hsien Lin on 2016-04-10 -// Copyright 2016 Facebook. All rights reserved. 
-// - -import Foundation -import UIKit -import NotificationCenter -import AVFoundation - -@objc(RNInCallManager) -class RNInCallManager: NSObject, AVAudioPlayerDelegate { - var bridge: RCTBridge! // this is synthesized - var currentDevice: UIDevice! - var audioSession: AVAudioSession! - var mRingtone: AVAudioPlayer! - var mRingback: AVAudioPlayer! - var mBusytone: AVAudioPlayer! - - var defaultRingtoneUri: URL! - var defaultRingbackUri: URL! - var defaultBusytoneUri: URL! - var bundleRingtoneUri: URL! - var bundleRingbackUri: URL! - var bundleBusytoneUri: URL! - - var isProximitySupported: Bool = false - var proximityIsNear: Bool = false - - // --- tags to indicating which observer has added - var isProximityRegistered: Bool = false - var isAudioSessionInterruptionRegistered: Bool = false - var isAudioSessionRouteChangeRegistered: Bool = false - var isAudioSessionMediaServicesWereLostRegistered: Bool = false - var isAudioSessionMediaServicesWereResetRegistered: Bool = false - var isAudioSessionSilenceSecondaryAudioHintRegistered: Bool = false - - // -- notification observers - var proximityObserver: NSObjectProtocol? - var audioSessionInterruptionObserver: NSObjectProtocol? - var audioSessionRouteChangeObserver: NSObjectProtocol? - var audioSessionMediaServicesWereLostObserver: NSObjectProtocol? - var audioSessionMediaServicesWereResetObserver: NSObjectProtocol? - var audioSessionSilenceSecondaryAudioHintObserver: NSObjectProtocol? - - var incallAudioMode: String = AVAudioSessionModeVoiceChat - var incallAudioCategory: String = AVAudioSessionCategoryPlayAndRecord - var origAudioCategory: String! - var origAudioMode: String! - var audioSessionInitialized: Bool = false - let automatic: Bool = true - var forceSpeakerOn: Int = 0 //UInt8? - var recordPermission: String! - var cameraPermission: String! - var media: String = "audio" - - private lazy var device: AVCaptureDevice? 
= { AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo) }() - - // --- AVAudioSessionCategoryOptionAllowBluetooth: - // --- Valid only if the audio session category is AVAudioSessionCategoryPlayAndRecord or AVAudioSessionCategoryRecord. - // --- Using VoiceChat/VideoChat mode has the side effect of enabling the AVAudioSessionCategoryOptionAllowBluetooth category option. - // --- So basically, we don't have to add AllowBluetooth options by hand. - - //@objc func initWithBridge(_bridge: RCTBridge) { - //self.bridge = _bridge - override init() { - super.init() - self.currentDevice = UIDevice.current - self.audioSession = AVAudioSession.sharedInstance() - self.checkProximitySupport() - NSLog("RNInCallManager.init(): initialized") - } - - deinit { - self.stop("") - } - - @objc func start(_ media: String, auto: Bool, ringbackUriType: String) -> Void { - guard !self.audioSessionInitialized else { return } - guard self.recordPermission == "granted" else { - NSLog("RNInCallManager.start(): recordPermission should be granted. state: \(self.recordPermission)") - return - } - self.media = media - - // --- auto is always true on ios - if self.media == "video" { - self.incallAudioMode = AVAudioSessionModeVideoChat - } else { - self.incallAudioMode = AVAudioSessionModeVoiceChat - } - NSLog("RNInCallManager.start() start InCallManager. media=\(self.media), type=\(self.media), mode=\(self.incallAudioMode)") - self.storeOriginalAudioSetup() - self.forceSpeakerOn = 0; - self.startAudioSessionNotification() - //self.audioSessionSetCategory(self.incallAudioCategory, [.DefaultToSpeaker, .AllowBluetooth], #function) - self.audioSessionSetCategory(self.incallAudioCategory, nil, #function) - self.audioSessionSetMode(self.incallAudioMode, #function) - self.audioSessionSetActive(true, nil, #function) - if !(ringbackUriType ?? "").isEmpty { - NSLog("RNInCallManager.start() play ringback first. 
type=\(ringbackUriType)") - self.startRingback(ringbackUriType) - } - - if self.media == "audio" { - self.startProximitySensor() - } - self.setKeepScreenOn(true) - self.audioSessionInitialized = true - //self.debugAudioSession() - } - - @objc func stop(_ busytoneUriType: String) -> Void { - guard self.audioSessionInitialized else { return } - - self.stopRingback() - if !(busytoneUriType ?? "").isEmpty && self.startBusytone(busytoneUriType) { - // play busytone first, and call this func again when finish - NSLog("RNInCallManager.stop(): play busytone before stop") - return - } else { - NSLog("RNInCallManager.stop(): stop InCallManager") - self.restoreOriginalAudioSetup() - self.stopBusytone() - self.stopProximitySensor() - self.audioSessionSetActive(false, .notifyOthersOnDeactivation, #function) - self.setKeepScreenOn(false) - self.stopAudioSessionNotification() - NotificationCenter.default.removeObserver(self) - self.forceSpeakerOn = 0; - self.audioSessionInitialized = false - } - } - - @objc func turnScreenOn() -> Void { - NSLog("RNInCallManager.turnScreenOn(): ios doesn't support turnScreenOn()") - } - - @objc func turnScreenOff() -> Void { - NSLog("RNInCallManager.turnScreenOff(): ios doesn't support turnScreenOff()") - } - - func updateAudioRoute() -> Void { - NSLog("RNInCallManager.updateAudioRoute(): [Enter] forceSpeakerOn flag=\(self.forceSpeakerOn) media=\(self.media) category=\(self.audioSession.category) mode=\(self.audioSession.mode)") - //self.debugAudioSession() - var overrideAudioPort: AVAudioSessionPortOverride - var overrideAudioPortString: String = "" - var audioMode: String = "" - - // --- WebRTC native code will change audio mode automatically when established. - // --- It would have some race condition if we change audio mode with webrtc at the same time. - // --- So we should not change audio mode as possible as we can. Only when default video call which wants to force speaker off. 
- // --- audio: only override speaker on/off; video: should change category if needed and handle proximity sensor. ( because default proximity is off when video call ) - if self.forceSpeakerOn == 1 { - // --- force ON, override speaker only, keep audio mode remain. - overrideAudioPort = .speaker - overrideAudioPortString = ".Speaker" - if self.media == "video" { - audioMode = AVAudioSessionModeVideoChat - self.stopProximitySensor() - } - } else if self.forceSpeakerOn == -1 { - // --- force off - overrideAudioPort = .none - overrideAudioPortString = ".None" - if self.media == "video" { - audioMode = AVAudioSessionModeVoiceChat - self.startProximitySensor() - } - } else { // use default behavior - overrideAudioPort = .none - overrideAudioPortString = ".None" - if self.media == "video" { - audioMode = AVAudioSessionModeVideoChat - self.stopProximitySensor() - } - } - - let isCurrentRouteToSpeaker: Bool = self.checkAudioRoute([AVAudioSessionPortBuiltInSpeaker], "output") - if (overrideAudioPort == .speaker && !isCurrentRouteToSpeaker) || (overrideAudioPort == .none && isCurrentRouteToSpeaker) { - do { - try self.audioSession.overrideOutputAudioPort(overrideAudioPort) - NSLog("RNInCallManager.updateAudioRoute(): audioSession.overrideOutputAudioPort(\(overrideAudioPortString)) success") - } catch let err { - NSLog("RNInCallManager.updateAudioRoute(): audioSession.overrideOutputAudioPort(\(overrideAudioPortString)) failed: \(err)") - } - } else { - NSLog("RNInCallManager.updateAudioRoute(): did NOT overrideOutputAudioPort()") - } - - if !audioMode.isEmpty && self.audioSession.mode != audioMode { - self.audioSessionSetMode(audioMode, #function) - NSLog("RNInCallManager.updateAudioRoute() audio mode has changed to \(audioMode)") - } else { - NSLog("RNInCallManager.updateAudioRoute() did NOT change audio mode") - } - //self.debugAudioSession() - } - - func checkAudioRoute(_ targetPortTypeArray: [String], _ routeType: String) -> Bool { - if let currentRoute: 
AVAudioSessionRouteDescription = self.audioSession.currentRoute { - let routes: [AVAudioSessionPortDescription] = (routeType == "input" ? currentRoute.inputs : currentRoute.outputs) - for _portDescription in routes { - let portDescription: AVAudioSessionPortDescription = _portDescription as AVAudioSessionPortDescription - if targetPortTypeArray.contains(portDescription.portType) { - return true - } - } - } - return false - } - - @objc func getIsWiredHeadsetPluggedIn(_ resolve: RCTPromiseResolveBlock, reject: RCTPromiseRejectBlock) -> Void { - let isWiredHeadsetPluggedIn = self.isWiredHeadsetPluggedIn() - resolve([ - ["isWiredHeadsetPluggedIn": isWiredHeadsetPluggedIn] - ]) - } - - func isWiredHeadsetPluggedIn() -> Bool { - // --- only check for a audio device plugged into headset port instead bluetooth/usb/hdmi - return self.checkAudioRoute([AVAudioSessionPortHeadphones], "output") || self.checkAudioRoute([AVAudioSessionPortHeadsetMic], "input") - } - - func audioSessionSetCategory(_ audioCategory: String, _ options: AVAudioSessionCategoryOptions?, _ callerMemo: String) -> Void { - do { - if let withOptions = options { - try self.audioSession.setCategory(audioCategory, with: withOptions) - } else { - try self.audioSession.setCategory(audioCategory) - } - NSLog("RNInCallManager.\(callerMemo): audioSession.setCategory(\(audioCategory), withOptions: \(options)) success") - } catch let err { - NSLog("RNInCallManager.\(callerMemo): audioSession.setCategory(\(audioCategory), withOptions: \(options)) failed: \(err)") - } - } - - func audioSessionSetMode(_ audioMode: String, _ callerMemo: String) -> Void { - do { - try self.audioSession.setMode(audioMode) - NSLog("RNInCallManager.\(callerMemo): audioSession.setMode(\(audioMode)) success") - } catch let err { - NSLog("RNInCallManager.\(callerMemo): audioSession.setMode(\(audioMode)) failed: \(err)") - } - } - - func audioSessionSetActive(_ audioActive: Bool, _ options:AVAudioSessionSetActiveOptions?, _ callerMemo: String) 
-> Void {
-        do {
-            if let withOptions = options {
-                try self.audioSession.setActive(audioActive, with: withOptions)
-            } else {
-                try self.audioSession.setActive(audioActive)
-            }
-            NSLog("RNInCallManager.\(callerMemo): audioSession.setActive(\(audioActive), withOptions: \(options)) success")
-        } catch let err {
-            NSLog("RNInCallManager.\(callerMemo): audioSession.setActive(\(audioActive), withOptions: \(options)) failed: \(err)")
-        }
-    }
-
-    @objc func setFlashOn(enable: Bool, brightness: NSNumber) -> Void {
-        guard let device = device else { return }
-        if device.hasTorch && device.position == .back {
-            do {
-                try device.lockForConfiguration()
-                if enable {
-                    try device.setTorchModeOnWithLevel(brightness.floatValue)
-                } else {
-                    device.torchMode = .off
-                }
-                NSLog("RNInCallManager.setForceSpeakerphoneOn(): enable: \(enable)")
-                device.unlockForConfiguration()
-            } catch let error {
-                NSLog("RNInCallManager.setFlashOn error != \(error)")
-            }
-        }
-    }
-
-    @objc func setKeepScreenOn(_ enable: Bool) -> Void {
-        NSLog("RNInCallManager.setKeepScreenOn(): enable: \(enable)")
-        UIApplication.shared.isIdleTimerDisabled = enable
-    }
-
-    @objc func setSpeakerphoneOn(_ enable: Bool) -> Void {
-        NSLog("RNInCallManager.setSpeakerphoneOn(): ios doesn't support setSpeakerphoneOn()")
-    }
-
-    @objc func setForceSpeakerphoneOn(_ flag: Int) -> Void {
-        self.forceSpeakerOn = flag
-        NSLog("RNInCallManager.setForceSpeakerphoneOn(): flag=\(flag)")
-        self.updateAudioRoute()
-    }
-
-    @objc func setMicrophoneMute(_ enable: Bool) -> Void {
-        NSLog("RNInCallManager.setMicrophoneMute(): ios doesn't support setMicrophoneMute()")
-    }
-
-    func storeOriginalAudioSetup() -> Void {
-        NSLog("RNInCallManager.storeOriginalAudioSetup(): origAudioCategory=\(self.audioSession.category), origAudioMode=\(self.audioSession.mode)")
-        self.origAudioCategory = self.audioSession.category
-        self.origAudioMode = self.audioSession.mode
-    }
-
-    func restoreOriginalAudioSetup() -> Void {
-        NSLog("RNInCallManager.restoreOriginalAudioSetup(): origAudioCategory=\(self.audioSession.category), origAudioMode=\(self.audioSession.mode)")
-        self.audioSessionSetCategory(self.origAudioCategory, nil, #function)
-        self.audioSessionSetMode(self.origAudioMode, #function)
-    }
-
-    func checkProximitySupport() -> Void {
-        self.currentDevice.isProximityMonitoringEnabled = true
-        self.isProximitySupported = self.currentDevice.isProximityMonitoringEnabled
-        self.currentDevice.isProximityMonitoringEnabled = false
-        NSLog("RNInCallManager.checkProximitySupport(): isProximitySupported=\(self.isProximitySupported)")
-    }
-
-    func startProximitySensor() -> Void {
-        guard !self.isProximityRegistered else { return }
-
-        NSLog("RNInCallManager.startProximitySensor()")
-        self.currentDevice.isProximityMonitoringEnabled = true
-
-        self.stopObserve(self.proximityObserver, name: NSNotification.Name.UIDeviceProximityStateDidChange.rawValue, object: nil) // --- in case it didn't deallocate when ViewDidUnload
-        self.proximityObserver = self.startObserve(NSNotification.Name.UIDeviceProximityStateDidChange.rawValue, object: self.currentDevice, queue: nil) { notification in
-            let state: Bool = self.currentDevice.proximityState
-            if state != self.proximityIsNear {
-                NSLog("RNInCallManager.UIDeviceProximityStateDidChangeNotification(): isNear: \(state)")
-                self.proximityIsNear = state
-                self.bridge.eventDispatcher().sendDeviceEvent(withName: "Proximity", body: ["isNear": state])
-            }
-        }
-
-        self.isProximityRegistered = true
-    }
-
-    func stopProximitySensor() -> Void {
-        guard self.isProximityRegistered else { return }
-
-        NSLog("RNInCallManager.stopProximitySensor()")
-        self.currentDevice.isProximityMonitoringEnabled = false
-        self.stopObserve(self.proximityObserver, name: NSNotification.Name.UIDeviceProximityStateDidChange.rawValue, object: nil) // --- remove all no matter what object
-        self.isProximityRegistered = false
-    }
-
-    func startAudioSessionNotification() -> Void {
-        NSLog("RNInCallManager.startAudioSessionNotification() starting...")
-        self.startAudioSessionInterruptionNotification()
-        self.startAudioSessionRouteChangeNotification()
-        self.startAudioSessionMediaServicesWereLostNotification()
-        self.startAudioSessionMediaServicesWereResetNotification()
-        self.startAudioSessionSilenceSecondaryAudioHintNotification()
-    }
-
-    func stopAudioSessionNotification() -> Void {
-        NSLog("RNInCallManager.startAudioSessionNotification() stopping...")
-        self.stopAudioSessionInterruptionNotification()
-        self.stopAudioSessionRouteChangeNotification()
-        self.stopAudioSessionMediaServicesWereLostNotification()
-        self.stopAudioSessionMediaServicesWereResetNotification()
-        self.stopAudioSessionSilenceSecondaryAudioHintNotification()
-    }
-
-    func startAudioSessionInterruptionNotification() -> Void {
-        guard !self.isAudioSessionInterruptionRegistered else { return }
-        NSLog("RNInCallManager.startAudioSessionInterruptionNotification()")
-
-        self.stopObserve(self.audioSessionInterruptionObserver, name: NSNotification.Name.AVAudioSessionInterruption.rawValue, object: nil) // --- in case it didn't deallocate when ViewDidUnload
-        self.audioSessionInterruptionObserver = self.startObserve(NSNotification.Name.AVAudioSessionInterruption.rawValue, object: nil, queue: nil) { notification in
-            guard notification.name == NSNotification.Name.AVAudioSessionInterruption && notification.userInfo != nil else { return }
-
-            if let rawValue = (notification.userInfo?[AVAudioSessionInterruptionTypeKey] as AnyObject).uintValue {
-                //if let type = AVAudioSessionInterruptionType.fromRaw(rawValue) {
-                if let type = AVAudioSessionInterruptionType(rawValue: rawValue) {
-                    switch type {
-                    case .began:
-                        NSLog("RNInCallManager.AudioSessionInterruptionNotification: Began")
-                    case .ended:
-                        NSLog("RNInCallManager.AudioSessionInterruptionNotification: Ended")
-                    default:
-                        NSLog("RNInCallManager.AudioSessionInterruptionNotification: Unknow Value")
-                    }
-                    return
-                }
-            }
-            NSLog("RNInCallManager.AudioSessionInterruptionNotification: could not resolve notification")
-        }
-        self.isAudioSessionInterruptionRegistered = true
-    }
-
-    func stopAudioSessionInterruptionNotification() -> Void {
-        guard self.isAudioSessionInterruptionRegistered else { return }
-
-        NSLog("RNInCallManager.stopAudioSessionInterruptionNotification()")
-        self.stopObserve(self.audioSessionInterruptionObserver, name: NSNotification.Name.AVAudioSessionInterruption.rawValue, object: nil) // --- remove all no matter what object
-        self.isAudioSessionInterruptionRegistered = false
-    }
-
-    func startAudioSessionRouteChangeNotification() -> Void {
-        guard !self.isAudioSessionRouteChangeRegistered else { return }
-
-        NSLog("RNInCallManager.startAudioSessionRouteChangeNotification()")
-        self.stopObserve(self.audioSessionRouteChangeObserver, name: NSNotification.Name.AVAudioSessionRouteChange.rawValue, object: nil) // --- in case it didn't deallocate when ViewDidUnload
-        self.audioSessionRouteChangeObserver = self.startObserve(NSNotification.Name.AVAudioSessionRouteChange.rawValue, object: nil, queue: nil) { notification in
-            guard notification.name == NSNotification.Name.AVAudioSessionRouteChange && notification.userInfo != nil else { return }
-
-            if let rawValue = notification.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt {
-                if let type = AVAudioSessionRouteChangeReason(rawValue: rawValue) {
-                    switch type {
-                    case .unknown:
-                        NSLog("RNInCallManager.AudioRouteChange.Reason: Unknown")
-                    case .newDeviceAvailable:
-                        NSLog("RNInCallManager.AudioRouteChange.Reason: NewDeviceAvailable")
-                        if self.checkAudioRoute([AVAudioSessionPortHeadsetMic], "input") {
-                            self.bridge.eventDispatcher().sendDeviceEvent(withName: "WiredHeadset", body: ["isPlugged": true, "hasMic": true, "deviceName": AVAudioSessionPortHeadsetMic])
-                        } else if self.checkAudioRoute([AVAudioSessionPortHeadphones], "output") {
-                            self.bridge.eventDispatcher().sendDeviceEvent(withName: "WiredHeadset", body: ["isPlugged": true, "hasMic": false, "deviceName": AVAudioSessionPortHeadphones])
-                        }
-                    case .oldDeviceUnavailable:
-                        NSLog("RNInCallManager.AudioRouteChange.Reason: OldDeviceUnavailable")
-                        if !self.isWiredHeadsetPluggedIn() {
-                            self.bridge.eventDispatcher().sendDeviceEvent(withName: "WiredHeadset", body: ["isPlugged": false, "hasMic": false, "deviceName": ""])
-                        }
-                    case .categoryChange:
-                        NSLog("RNInCallManager.AudioRouteChange.Reason: CategoryChange. category=\(self.audioSession.category) mode=\(self.audioSession.mode)")
-                        self.updateAudioRoute()
-                    case .override:
-                        NSLog("RNInCallManager.AudioRouteChange.Reason: Override")
-                    case .wakeFromSleep:
-                        NSLog("RNInCallManager.AudioRouteChange.Reason: WakeFromSleep")
-                    case .noSuitableRouteForCategory:
-                        NSLog("RNInCallManager.AudioRouteChange.Reason: NoSuitableRouteForCategory")
-                    case .routeConfigurationChange:
-                        NSLog("RNInCallManager.AudioRouteChange.Reason: RouteConfigurationChange. category=\(self.audioSession.category) mode=\(self.audioSession.mode)")
-                    default:
-                        NSLog("RNInCallManager.AudioRouteChange.Reason: Unknow Value")
-                    }
-                } else {
-                    NSLog("RNInCallManager.AudioRouteChange.Reason: cound not resolve notification")
-                }
-            } else {
-                NSLog("RNInCallManager.AudioRouteChange.Reason: cound not resolve notification")
-            }
-            if #available(iOS 8, *) {
-                if let rawValue = (notification.userInfo?[AVAudioSessionSilenceSecondaryAudioHintTypeKey] as AnyObject).uintValue {
-                    if let type = AVAudioSessionSilenceSecondaryAudioHintType(rawValue: rawValue) {
-                        switch type {
-                        case .begin:
-                            NSLog("RNInCallManager.AudioRouteChange.SilenceSecondaryAudioHint: Begin")
-                        case .end:
-                            NSLog("RNInCallManager.AudioRouteChange.SilenceSecondaryAudioHint: End")
-                        default:
-                            NSLog("RNInCallManager.AudioRouteChange.SilenceSecondaryAudioHint: Unknow Value")
-                        }
-                    } else {
-                        NSLog("RNInCallManager.AudioRouteChange.SilenceSecondaryAudioHint: cound not resolve notification")
-                    }
-                } else {
-                    NSLog("RNInCallManager.AudioRouteChange.SilenceSecondaryAudioHint: cound not resolve notification")
-                }
-            }
-        }
-        self.isAudioSessionRouteChangeRegistered = true
-    }
-
-    func stopAudioSessionRouteChangeNotification() -> Void {
-        guard self.isAudioSessionRouteChangeRegistered else { return }
-
-        NSLog("RNInCallManager.stopAudioSessionRouteChangeNotification()")
-        self.stopObserve(self.audioSessionRouteChangeObserver, name: NSNotification.Name.AVAudioSessionRouteChange.rawValue, object: nil) // --- remove all no matter what object
-        self.isAudioSessionRouteChangeRegistered = false
-    }
-
-    func startAudioSessionMediaServicesWereLostNotification() -> Void {
-        guard !self.isAudioSessionMediaServicesWereLostRegistered else { return }
-
-        NSLog("RNInCallManager.startAudioSessionMediaServicesWereLostNotification()")
-        self.stopObserve(self.audioSessionMediaServicesWereLostObserver, name: NSNotification.Name.AVAudioSessionMediaServicesWereLost.rawValue, object: nil) // --- in case it didn't deallocate when ViewDidUnload
-        self.audioSessionMediaServicesWereLostObserver = self.startObserve(NSNotification.Name.AVAudioSessionMediaServicesWereLost.rawValue, object: nil, queue: nil) { notification in
-            // --- This notification has no userInfo dictionary.
-            NSLog("RNInCallManager.AudioSessionMediaServicesWereLostNotification: Media Services Were Lost")
-        }
-        self.isAudioSessionMediaServicesWereLostRegistered = true
-    }
-
-    func stopAudioSessionMediaServicesWereLostNotification() -> Void {
-        guard self.isAudioSessionMediaServicesWereLostRegistered else { return }
-
-        NSLog("RNInCallManager.stopAudioSessionMediaServicesWereLostNotification()")
-        self.stopObserve(self.audioSessionMediaServicesWereLostObserver, name: NSNotification.Name.AVAudioSessionMediaServicesWereLost.rawValue, object: nil) // --- remove all no matter what object
-        self.isAudioSessionMediaServicesWereLostRegistered = false
-    }
-
-    func startAudioSessionMediaServicesWereResetNotification() -> Void {
-        guard !self.isAudioSessionMediaServicesWereResetRegistered else { return }
-
-        NSLog("RNInCallManager.startAudioSessionMediaServicesWereResetNotification()")
-        self.stopObserve(self.audioSessionMediaServicesWereResetObserver, name: NSNotification.Name.AVAudioSessionMediaServicesWereReset.rawValue, object: nil) // --- in case it didn't deallocate when ViewDidUnload
-        self.audioSessionMediaServicesWereResetObserver = self.startObserve(NSNotification.Name.AVAudioSessionMediaServicesWereReset.rawValue, object: nil, queue: nil) { notification in
-            // --- This notification has no userInfo dictionary.
-            NSLog("RNInCallManager.AudioSessionMediaServicesWereResetNotification: Media Services Were Reset")
-        }
-        self.isAudioSessionMediaServicesWereResetRegistered = true
-    }
-
-    func stopAudioSessionMediaServicesWereResetNotification() -> Void {
-        guard self.isAudioSessionMediaServicesWereResetRegistered else { return }
-
-        NSLog("RNInCallManager.stopAudioSessionMediaServicesWereResetNotification()")
-        self.stopObserve(self.audioSessionMediaServicesWereResetObserver, name: NSNotification.Name.AVAudioSessionMediaServicesWereReset.rawValue, object: nil) // --- remove all no matter what object
-        self.isAudioSessionMediaServicesWereResetRegistered = false
-    }
-
-    func startAudioSessionSilenceSecondaryAudioHintNotification() -> Void {
-        guard #available(iOS 8, *) else { return }
-        guard !self.isAudioSessionSilenceSecondaryAudioHintRegistered else { return }
-
-        NSLog("RNInCallManager.startAudioSessionSilenceSecondaryAudioHintNotification()")
-        self.stopObserve(self.audioSessionSilenceSecondaryAudioHintObserver, name: NSNotification.Name.AVAudioSessionSilenceSecondaryAudioHint.rawValue, object: nil) // --- in case it didn't deallocate when ViewDidUnload
-        self.audioSessionSilenceSecondaryAudioHintObserver = self.startObserve(NSNotification.Name.AVAudioSessionSilenceSecondaryAudioHint.rawValue, object: nil, queue: nil) { notification in
-            guard notification.name == NSNotification.Name.AVAudioSessionSilenceSecondaryAudioHint && notification.userInfo != nil else { return }
-
-            if let rawValue = (notification.userInfo?[AVAudioSessionSilenceSecondaryAudioHintTypeKey] as AnyObject).uintValue {
-                if let type = AVAudioSessionSilenceSecondaryAudioHintType(rawValue: rawValue) {
-                    switch type {
-                    case .begin:
-                        NSLog("RNInCallManager.AVAudioSessionSilenceSecondaryAudioHintNotification: Begin")
-                    case .end:
-                        NSLog("RNInCallManager.AVAudioSessionSilenceSecondaryAudioHintNotification: End")
-                    default:
-                        NSLog("RNInCallManager.AVAudioSessionSilenceSecondaryAudioHintNotification: Unknow Value")
-                    }
-                    return
-                }
-            }
-            NSLog("RNInCallManager.AVAudioSessionSilenceSecondaryAudioHintNotification: could not resolve notification")
-        }
-        self.isAudioSessionSilenceSecondaryAudioHintRegistered = true
-    }
-
-    func stopAudioSessionSilenceSecondaryAudioHintNotification() -> Void {
-        guard #available(iOS 8, *) else { return }
-        guard self.isAudioSessionSilenceSecondaryAudioHintRegistered else { return }
-
-        NSLog("RNInCallManager.stopAudioSessionSilenceSecondaryAudioHintNotification()")
-        self.stopObserve(self.audioSessionSilenceSecondaryAudioHintObserver, name: NSNotification.Name.AVAudioSessionSilenceSecondaryAudioHint.rawValue, object: nil) // --- remove all no matter what object
-        self.isAudioSessionSilenceSecondaryAudioHintRegistered = false
-    }
-
-    func startObserve(_ name: String, object: AnyObject?, queue: OperationQueue?, block: @escaping (Notification) -> ()) -> NSObjectProtocol {
-        return NotificationCenter.default.addObserver(forName: NSNotification.Name(rawValue: name), object: object, queue: queue, using: block)
-    }
-
-    func stopObserve(_ _observer: AnyObject?, name: String?, object: AnyObject?) -> Void {
-        if let observer = _observer {
-            NotificationCenter.default.removeObserver(observer, name: name.map { NSNotification.Name(rawValue: $0) }, object: object)
-        }
-    }
-
-    // --- _ringbackUriType: never go here with be empty string.
-    func startRingback(_ _ringbackUriType: String) -> Void {
-        // you may rejected by apple when publish app if you use system sound instead of bundled sound.
-        NSLog("RNInCallManager.startRingback(): type=\(_ringbackUriType)")
-        do {
-            if self.mRingback != nil {
-                if self.mRingback.isPlaying {
-                    NSLog("RNInCallManager.startRingback(): is already playing")
-                    return
-                } else {
-                    self.stopRingback()
-                }
-            }
-            // ios don't have embedded DTMF tone generator. use system dtmf sound files.
-            let ringbackUriType: String = (_ringbackUriType == "_DTMF_" ? "_DEFAULT_" : _ringbackUriType)
-            let ringbackUri: URL? = getRingbackUri(ringbackUriType)
-            if ringbackUri == nil {
-                NSLog("RNInCallManager.startRingback(): no available media")
-                return
-            }
-            //self.storeOriginalAudioSetup()
-            self.mRingback = try AVAudioPlayer(contentsOf: ringbackUri!)
-            self.mRingback.delegate = self
-            self.mRingback.numberOfLoops = -1 // you need to stop it explicitly
-            self.mRingback.prepareToPlay()
-
-            //self.audioSessionSetCategory(self.incallAudioCategory, [.DefaultToSpeaker, .AllowBluetooth], #function)
-            self.audioSessionSetCategory(self.incallAudioCategory, nil, #function)
-            self.audioSessionSetMode(self.incallAudioMode, #function)
-            self.mRingback.play()
-        } catch let err {
-            NSLog("RNInCallManager.startRingback(): caught error=\(err)")
-        }
-    }
-
-    @objc func stopRingback() -> Void {
-        if self.mRingback != nil {
-            NSLog("RNInCallManager.stopRingback()")
-            self.mRingback.stop()
-            self.mRingback = nil
-            // --- need to reset route based on config because WebRTC seems will switch audio mode automatically when call established.
-            //self.updateAudioRoute()
-        }
-    }
-
-    // --- _busytoneUriType: never go here with be empty string.
-    func startBusytone(_ _busytoneUriType: String) -> Bool {
-        // you may rejected by apple when publish app if you use system sound instead of bundled sound.
-        NSLog("RNInCallManager.startBusytone(): type=\(_busytoneUriType)")
-        do {
-            if self.mBusytone != nil {
-                if self.mBusytone.isPlaying {
-                    NSLog("RNInCallManager.startBusytone(): is already playing")
-                    return false
-                } else {
-                    self.stopBusytone()
-                }
-            }
-
-            // ios don't have embedded DTMF tone generator. use system dtmf sound files.
-            let busytoneUriType: String = (_busytoneUriType == "_DTMF_" ? "_DEFAULT_" : _busytoneUriType)
-            let busytoneUri: URL? = getBusytoneUri(busytoneUriType)
-            if busytoneUri == nil {
-                NSLog("RNInCallManager.startBusytone(): no available media")
-                return false
-            }
-            //self.storeOriginalAudioSetup()
-            self.mBusytone = try AVAudioPlayer(contentsOf: busytoneUri!)
-            self.mBusytone.delegate = self
-            self.mBusytone.numberOfLoops = 0 // it's part of start(), will stop at stop()
-            self.mBusytone.prepareToPlay()
-
-            //self.audioSessionSetCategory(self.incallAudioCategory, [.DefaultToSpeaker, .AllowBluetooth], #function)
-            self.audioSessionSetCategory(self.incallAudioCategory, nil, #function)
-            self.audioSessionSetMode(self.incallAudioMode, #function)
-            self.mBusytone.play()
-        } catch let err {
-            NSLog("RNInCallManager.startBusytone(): caught error=\(err)")
-            return false
-        }
-        return true
-    }
-
-    func stopBusytone() -> Void {
-        if self.mBusytone != nil {
-            NSLog("RNInCallManager.stopBusytone()")
-            self.mBusytone.stop()
-            self.mBusytone = nil
-        }
-    }
-
-    // --- ringtoneUriType May be empty
-    @objc func startRingtone(_ ringtoneUriType: String, ringtoneCategory: String) -> Void {
-        // you may rejected by apple when publish app if you use system sound instead of bundled sound.
-        NSLog("RNInCallManager.startRingtone(): type=\(ringtoneUriType)")
-        do {
-            if self.mRingtone != nil {
-                if self.mRingtone.isPlaying {
-                    NSLog("RNInCallManager.startRingtone(): is already playing.")
-                    return
-                } else {
-                    self.stopRingtone()
-                }
-            }
-            let ringtoneUri: URL? = getRingtoneUri(ringtoneUriType)
-            if ringtoneUri == nil {
-                NSLog("RNInCallManager.startRingtone(): no available media")
-                return
-            }
-
-            // --- ios has Ringer/Silent switch, so just play without check ringer volume.
-            self.storeOriginalAudioSetup()
-            self.mRingtone = try AVAudioPlayer(contentsOf: ringtoneUri!)
-            self.mRingtone.delegate = self
-            self.mRingtone.numberOfLoops = -1 // you need to stop it explicitly
-            self.mRingtone.prepareToPlay()
-
-            // --- 1. if we use Playback, it can supports background playing (starting from foreground), but it would not obey Ring/Silent switch.
-            // ---    make sure you have enabled 'audio' tag ( or 'voip' tag ) at XCode -> Capabilities -> BackgroundMode
-            // --- 2. if we use SoloAmbient, it would obey Ring/Silent switch in the foreground, but does not support background playing,
-            // ---    thus, then you should play ringtone again via local notification after back to home during a ring session.
-
-            // we prefer 2. by default, since most of users doesn't want to interrupted by a ringtone if Silent mode is on.
-
-            //self.audioSessionSetCategory(AVAudioSessionCategoryPlayback, [.DuckOthers], #function)
-            if ringtoneCategory == "playback" {
-                self.audioSessionSetCategory(AVAudioSessionCategoryPlayback, nil, #function)
-            } else {
-                self.audioSessionSetCategory(AVAudioSessionCategorySoloAmbient, nil, #function)
-            }
-            self.audioSessionSetMode(AVAudioSessionModeDefault, #function)
-            //self.audioSessionSetActive(true, nil, #function)
-            self.mRingtone.play()
-        } catch let err {
-            NSLog("RNInCallManager.startRingtone(): caught error=\(err)")
-        }
-    }
-
-    @objc func stopRingtone() -> Void {
-        if self.mRingtone != nil {
-            NSLog("RNInCallManager.stopRingtone()")
-            self.mRingtone.stop()
-            self.mRingtone = nil
-            self.restoreOriginalAudioSetup()
-            self.audioSessionSetActive(false, .notifyOthersOnDeactivation, #function)
-        }
-    }
-
-    @objc func getAudioUriJS(_ audioType: String, fileType: String, resolve: RCTPromiseResolveBlock, reject: RCTPromiseRejectBlock) {
-        var _result: URL? = nil
-        if audioType == "ringback" {
-            _result = getRingbackUri(fileType)
-        } else if audioType == "busytone" {
-            _result = getBusytoneUri(fileType)
-        } else if audioType == "ringtone" {
-            _result = getRingtoneUri(fileType)
-        }
-        if let result: URL? = _result {
-            if let urlString = result?.absoluteString {
-                resolve(urlString)
-                return
-            }
-        }
-        reject("error_code", "getAudioUriJS() failed", NSError(domain:"getAudioUriJS", code: 0, userInfo: nil))
-    }
-
-    func getRingbackUri(_ _type: String) -> URL? {
-        let fileBundle: String = "incallmanager_ringback"
-        let fileBundleExt: String = "mp3"
-        //let fileSysWithExt: String = "vc~ringing.caf" // --- ringtone of facetine, but can't play it.
-        //let fileSysPath: String = "/System/Library/Audio/UISounds"
-        let fileSysWithExt: String = "Marimba.m4r"
-        let fileSysPath: String = "/Library/Ringtones"
-        let type = (_type == "" || _type == "_DEFAULT_" ? fileSysWithExt : _type) // --- you can't get default user perfrence sound in ios
-        return self.getAudioUri(type, fileBundle, fileBundleExt, fileSysWithExt, fileSysPath, &self.bundleRingbackUri, &self.defaultRingbackUri)
-    }
-
-    func getBusytoneUri(_ _type: String) -> URL? {
-        let fileBundle: String = "incallmanager_busytone"
-        let fileBundleExt: String = "mp3"
-        let fileSysWithExt: String = "ct-busy.caf" //ct-congestion.caf
-        let fileSysPath: String = "/System/Library/Audio/UISounds"
-        let type = (_type == "" || _type == "_DEFAULT_" ? fileSysWithExt : _type) // --- you can't get default user perfrence sound in ios
-        return self.getAudioUri(type, fileBundle, fileBundleExt, fileSysWithExt, fileSysPath, &self.bundleBusytoneUri, &self.defaultBusytoneUri)
-    }
-
-    func getRingtoneUri(_ _type: String) -> URL? {
-        let fileBundle: String = "incallmanager_ringtone"
-        let fileBundleExt: String = "mp3"
-        let fileSysWithExt: String = "Opening.m4r" //Marimba.m4r
-        let fileSysPath: String = "/Library/Ringtones"
-        let type = (_type == "" || _type == "_DEFAULT_" ? fileSysWithExt : _type) // --- you can't get default user perfrence sound in ios
-        return self.getAudioUri(type, fileBundle, fileBundleExt, fileSysWithExt, fileSysPath, &self.bundleRingtoneUri, &self.defaultRingtoneUri)
-    }
-
-    func getAudioUri(_ _type: String, _ fileBundle: String, _ fileBundleExt: String, _ fileSysWithExt: String, _ fileSysPath: String, _ uriBundle: inout URL!, _ uriDefault: inout URL!) -> URL? {
-        var type = _type
-        if type == "_BUNDLE_" {
-            if uriBundle == nil {
-                uriBundle = Bundle.main.url(forResource: fileBundle, withExtension: fileBundleExt)
-                if uriBundle == nil {
-                    NSLog("RNInCallManager.getAudioUri(): \(fileBundle).\(fileBundleExt) not found in bundle.")
-                    type = fileSysWithExt
-                } else {
-                    return uriBundle
-                }
-            } else {
-                return uriBundle
-            }
-        }
-
-        if uriDefault == nil {
-            let target: String = "\(fileSysPath)/\(type)"
-            uriDefault = self.getSysFileUri(target)
-        }
-        return uriDefault
-    }
-
-    func getSysFileUri(_ target: String) -> URL? {
-        if let url: URL? = URL(fileURLWithPath: target, isDirectory: false) {
-            if let path = url?.path {
-                let fileManager: FileManager = FileManager()
-                var isTargetDirectory: ObjCBool = ObjCBool(false)
-                if fileManager.fileExists(atPath: path, isDirectory: &isTargetDirectory) {
-                    if !isTargetDirectory.boolValue {
-                        return url
-                    }
-                }
-            }
-        }
-        NSLog("RNInCallManager.getSysFileUri(): can not get url for \(target)")
-        return nil
-    }
-
-    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) -> Void {
-        // --- this only called when all loop played. it means, an infinite (numberOfLoops = -1) loop will never into here.
-        //if player.url!.isFileReferenceURL() {
-        let filename = player.url?.deletingPathExtension().lastPathComponent
-        NSLog("RNInCallManager.audioPlayerDidFinishPlaying(): finished playing: \(filename)")
-        if filename == self.bundleBusytoneUri?.deletingPathExtension().lastPathComponent
-            || filename == self.defaultBusytoneUri?.deletingPathExtension().lastPathComponent {
-            //self.stopBusytone()
-            NSLog("RNInCallManager.audioPlayerDidFinishPlaying(): busytone finished, invoke stop()")
-            self.stop("")
-        }
-    }
-
-    func audioPlayerDecodeErrorDidOccur(_ player: AVAudioPlayer, error: Error?) -> Void {
-        let filename = player.url?.deletingPathExtension().lastPathComponent
-        NSLog("RNInCallManager.audioPlayerDecodeErrorDidOccur(): player=\(filename), error=\(error?.localizedDescription)")
-    }
-
-    // --- Deprecated in iOS 8.0.
-    func audioPlayerBeginInterruption(_ player: AVAudioPlayer) -> Void {
-        let filename = player.url?.deletingPathExtension().lastPathComponent
-        NSLog("RNInCallManager.audioPlayerBeginInterruption(): player=\(filename)")
-    }
-
-    // --- Deprecated in iOS 8.0.
-//    func audioPlayerEndInterruption(_ player: AVAudioPlayer) -> Void {
-//        let filename = player.url?.deletingPathExtension().lastPathComponent
-//        NSLog("RNInCallManager.audioPlayerEndInterruption(): player=\(filename)")
-//    }
-
-    func debugAudioSession() -> Void {
-        let currentRoute: Dictionary = ["input": self.audioSession.currentRoute.inputs[0].uid, "output": self.audioSession.currentRoute.outputs[0].uid]
-        var categoryOptions = ""
-        switch self.audioSession.categoryOptions {
-        case AVAudioSessionCategoryOptions.mixWithOthers:
-            categoryOptions = "MixWithOthers"
-        case AVAudioSessionCategoryOptions.duckOthers:
-            categoryOptions = "DuckOthers"
-        case AVAudioSessionCategoryOptions.allowBluetooth:
-            categoryOptions = "AllowBluetooth"
-        case AVAudioSessionCategoryOptions.defaultToSpeaker:
-            categoryOptions = "DefaultToSpeaker"
-        default:
-            categoryOptions = "unknow"
-        }
-        if #available(iOS 9, *) {
-            if categoryOptions == "unknow" && self.audioSession.categoryOptions == AVAudioSessionCategoryOptions.interruptSpokenAudioAndMixWithOthers {
-                categoryOptions = "InterruptSpokenAudioAndMixWithOthers"
-            }
-        }
-        self._checkRecordPermission()
-        let audioSessionProperties: Dictionary = [
-            "category": self.audioSession.category,
-            "categoryOptions": categoryOptions,
-            "mode": self.audioSession.mode,
-            //"inputAvailable": self.audioSession.inputAvailable,
-            "otherAudioPlaying": self.audioSession.isOtherAudioPlaying,
-            "recordPermission" : self.recordPermission,
-            //"availableInputs": self.audioSession.availableInputs,
-            //"preferredInput": self.audioSession.preferredInput,
-            //"inputDataSources": self.audioSession.inputDataSources,
-            //"inputDataSource": self.audioSession.inputDataSource,
-            //"outputDataSources": self.audioSession.outputDataSources,
-            //"outputDataSource": self.audioSession.outputDataSource,
-            "currentRoute": currentRoute,
-            "outputVolume": self.audioSession.outputVolume,
-            "inputGain": self.audioSession.inputGain,
-            "inputGainSettable": self.audioSession.isInputGainSettable,
-            "inputLatency": self.audioSession.inputLatency,
-            "outputLatency": self.audioSession.outputLatency,
-            "sampleRate": self.audioSession.sampleRate,
-            "preferredSampleRate": self.audioSession.preferredSampleRate,
-            "IOBufferDuration": self.audioSession.ioBufferDuration,
-            "preferredIOBufferDuration": self.audioSession.preferredIOBufferDuration,
-            "inputNumberOfChannels": self.audioSession.inputNumberOfChannels,
-            "maximumInputNumberOfChannels": self.audioSession.maximumInputNumberOfChannels,
-            "preferredInputNumberOfChannels": self.audioSession.preferredInputNumberOfChannels,
-            "outputNumberOfChannels": self.audioSession.outputNumberOfChannels,
-            "maximumOutputNumberOfChannels": self.audioSession.maximumOutputNumberOfChannels,
-            "preferredOutputNumberOfChannels": self.audioSession.preferredOutputNumberOfChannels
-        ]
-        /*
-        // --- Too noisy
-        if #available(iOS 8, *) {
-            //audioSessionProperties["secondaryAudioShouldBeSilencedHint"] = self.audioSession.secondaryAudioShouldBeSilencedHint
-        } else {
-            //audioSessionProperties["secondaryAudioShouldBeSilencedHint"] = "unknow"
-        }
-        if #available(iOS 9, *) {
-            //audioSessionProperties["availableCategories"] = self.audioSession.availableCategories
-            //audioSessionProperties["availableModes"] = self.audioSession.availableModes
-        }
-        */
-        NSLog("RNInCallManager.debugAudioSession(): ==========BEGIN==========")
-        // iterate over all keys
-        for (key, value) in audioSessionProperties {
-            NSLog("\(key) = \(value)")
-        }
-        NSLog("RNInCallManager.debugAudioSession(): ==========END==========")
-    }
-
-    @objc func checkRecordPermission(_ resolve: RCTPromiseResolveBlock, reject: RCTPromiseRejectBlock) {
-        self._checkRecordPermission()
-        if self.recordPermission != nil {
-            resolve(self.recordPermission)
-        } else {
-            reject("error_code", "error message", NSError(domain:"checkRecordPermission", code: 0, userInfo: nil))
-        }
-    }
-
-    func _checkRecordPermission() {
-        var recordPermission: String = "unsupported"
-        var usingApi: String = ""
-        if #available(iOS 8, *) {
-            usingApi = "iOS8+"
-            switch self.audioSession.recordPermission() {
-            case AVAudioSessionRecordPermission.granted:
-                recordPermission = "granted"
-            case AVAudioSessionRecordPermission.denied:
-                recordPermission = "denied"
-            case AVAudioSessionRecordPermission.undetermined:
-                recordPermission = "undetermined"
-            default:
-                recordPermission = "unknow"
-            }
-        } else {
-            // --- target api at least iOS7+
-            usingApi = "iOS7"
-            recordPermission = self._checkMediaPermission(AVMediaTypeAudio)
-        }
-        self.recordPermission = recordPermission
-        NSLog("RNInCallManager._checkRecordPermission(): using \(usingApi) api. recordPermission=\(self.recordPermission)")
-    }
-
-    @objc func requestRecordPermission(_ resolve: @escaping RCTPromiseResolveBlock, reject: RCTPromiseRejectBlock) {
-        NSLog("RNInCallManager.requestRecordPermission(): waiting for user confirmation...")
-        self.audioSession.requestRecordPermission({(granted: Bool) -> Void in
-            if granted {
-                self.recordPermission = "granted"
-            } else {
-                self.recordPermission = "denied"
-            }
-            NSLog("RNInCallManager.requestRecordPermission(): \(self.recordPermission)")
-            resolve(self.recordPermission)
-        })
-    }
-
-    @objc func checkCameraPermission(_ resolve: RCTPromiseResolveBlock, reject: RCTPromiseRejectBlock) {
-        self._checkCameraPermission()
-        if self.cameraPermission != nil {
-            resolve(self.cameraPermission)
-        } else {
-            reject("error_code", "error message", NSError(domain:"checkCameraPermission", code: 0, userInfo: nil))
-        }
-    }
-
-    func _checkCameraPermission() -> Void {
-        self.cameraPermission = self._checkMediaPermission(AVMediaTypeVideo)
-        NSLog("RNInCallManager._checkCameraPermission(): using iOS7 api. cameraPermission=\(self.cameraPermission)")
-    }
-
-    @objc func requestCameraPermission(_ resolve: @escaping RCTPromiseResolveBlock, reject: RCTPromiseRejectBlock) {
-        NSLog("RNInCallManager.requestCameraPermission(): waiting for user confirmation...")
-        AVCaptureDevice.requestAccess(forMediaType: AVMediaTypeVideo, completionHandler: {(granted: Bool) -> Void in
-            if granted {
-                self.cameraPermission = "granted"
-            } else {
-                self.cameraPermission = "denied"
-            }
-            NSLog("RNInCallManager.requestCameraPermission(): \(self.cameraPermission)")
-            resolve(self.cameraPermission)
-        })
-    }
-
-    func _checkMediaPermission(_ targetMediaType: String) -> String {
-        switch AVCaptureDevice.authorizationStatus(forMediaType: targetMediaType) {
-        case AVAuthorizationStatus.authorized:
-            return "granted"
-        case AVAuthorizationStatus.denied:
-            return "denied"
-        case AVAuthorizationStatus.notDetermined:
-            return "undetermined"
-        case AVAuthorizationStatus.restricted:
-            return "restricted"
-        default:
-            return "unknow"
-        }
-    }
-
-    func debugApplicationState() -> Void {
-        var appState = "unknow"
-        switch UIApplication.shared.applicationState {
-        case UIApplicationState.active:
-            appState = "Active"
-        case UIApplicationState.inactive:
-            appState = "Inactive"
-        case UIApplicationState.background:
-            appState = "Background"
-        }
-
-        NSLog("RNInCallManage ZXCPOIU: appState: \(appState)")
-    }
-}
diff --git a/ios/RNInCallManager/RNInCallManagerBridge.m b/ios/RNInCallManager/RNInCallManagerBridge.m
deleted file mode 100644
index 77d5f91..0000000
--- a/ios/RNInCallManager/RNInCallManagerBridge.m
+++ /dev/null
@@ -1,30 +0,0 @@
-// RNInCallManagerBridge.m
-// RNInCallManager
-//
-// Created by zxcpoiu, Henry Hung-Hsien Lin on 2016-04-10
-// Copyright 2016 Facebook. All rights reserved.
-//
-
-#import
-
-@interface RCT_EXTERN_REMAP_MODULE(InCallManager, RNInCallManager, NSObject)
-
-RCT_EXTERN_METHOD(start:(NSString *)mediaType auto:(BOOL)auto ringbackUriType:(NSString *)ringbackUriType)
-RCT_EXTERN_METHOD(stop:(NSString *)busytone)
-RCT_EXTERN_METHOD(turnScreenOn)
-RCT_EXTERN_METHOD(turnScreenOff)
-RCT_EXTERN_METHOD(setFlashOn:(BOOL)enable brightness:(nonnull NSNumber *)brightness)
-RCT_EXTERN_METHOD(setKeepScreenOn:(BOOL)enable)
-RCT_EXTERN_METHOD(setSpeakerphoneOn:(BOOL)enable)
-RCT_EXTERN_METHOD(setForceSpeakerphoneOn:(int)flag)
-RCT_EXTERN_METHOD(setMicrophoneMute:(BOOL)enable)
-RCT_EXTERN_METHOD(stopRingback)
-RCT_EXTERN_METHOD(startRingtone:(NSString *)ringtoneUriType ringtoneCategory:(NSString *)ringtoneCategory)
-RCT_EXTERN_METHOD(stopRingtone)
-RCT_EXTERN_METHOD(checkRecordPermission:(RCTPromiseResolveBlock)resolve reject:(RCTPromiseRejectBlock)reject)
-RCT_EXTERN_METHOD(requestRecordPermission:(RCTPromiseResolveBlock)resolve reject:(RCTPromiseRejectBlock)reject)
-RCT_EXTERN_METHOD(checkCameraPermission:(RCTPromiseResolveBlock)resolve reject:(RCTPromiseRejectBlock)reject)
-RCT_EXTERN_METHOD(requestCameraPermission:(RCTPromiseResolveBlock)resolve reject:(RCTPromiseRejectBlock)reject)
-RCT_EXTERN_METHOD(getAudioUriJS:(NSString *)audioType fileType:(NSString *)fileType resolve:(RCTPromiseResolveBlock)resolve reject:(RCTPromiseRejectBlock)reject)
-RCT_EXTERN_METHOD(getIsWiredHeadsetPluggedIn:(RCTPromiseResolveBlock)resolve reject:(RCTPromiseRejectBlock)reject)
-@end
diff --git a/package.json b/package.json
index f7cbcba..d4e89c1 100644
--- a/package.json
+++ b/package.json
@@ -1,6 +1,6 @@
 {
   "name": "react-native-incall-manager",
-  "version": "2.2.0",
+  "version": "4.2.1",
   "description": "Handling media-routes/sensors/events during a audio/video chat on React Native",
   "main": "index.js",
   "scripts": {
@@ -8,7 +8,7 @@
   },
   "repository": {
     "type": "git",
-    "url": "git+https://github.com/zxcpoiu/react-native-incall-manager.git"
+    "url": "git+https://github.com/react-native-webrtc/react-native-incall-manager.git"
   },
   "keywords": [
     "React",
@@ -24,9 +24,9 @@
   "author": "Henry Lin ",
   "license": "ISC",
   "bugs": {
-    "url": "https://github.com/zxcpoiu/react-native-incall-manager/issues"
+    "url": "https://github.com/react-native-webrtc/react-native-incall-manager/issues"
   },
-  "homepage": "https://github.com/zxcpoiu/react-native-incall-manager#readme",
+  "homepage": "https://github.com/react-native-webrtc/react-native-incall-manager#readme",
  "peerDependencies": {
    "react-native": ">=0.40.0"
  }