How to Integrate Voice and Video Calling into Your iOS App with Nexconn Call SDK

Leo
Product Director at Nexconn, overseeing Chat and Call suites. Transforms complex telecom infrastructure into developer-friendly SDKs.

Real-time voice and video calling has moved from a premium feature to a baseline user expectation. Whether you're building a dating app, a telehealth platform, a marketplace, or a social product, the moment a user wants to move from messaging to a live conversation — your infrastructure either handles it or it doesn't.

This guide walks through a complete integration of Nexconn's Call SDK into an iOS app, from dependency setup to placing and managing your first call. With a working Chat SDK setup already in place, the full integration takes under 30 minutes.

Quick Start: 30-Minute iOS SDK Integration Guide

Prerequisites

Before starting, make sure the following are in place:

  • Xcode 9.0 or later
  • A physical iOS device running iOS 10.0 or later — audio and video call testing requires real hardware, not a simulator
  • A completed Nexconn Chat SDK initialization and sign-in flow
  • CocoaPods 1.10.0 or later, since the Call SDK uses XCFramework and full XCFramework support starts from that version

If CocoaPods isn't set up yet, run sudo gem install cocoapods in Terminal before proceeding.

Step 1: Add the NexconnCall Framework

If your project doesn't already have a Podfile, open Terminal, navigate to your project root, and run:

pod init

Open the generated Podfile and add the following entry:

pod 'NexconnCall/Call'

Then install the dependency:

pod install

If Terminal reports that the required version can't be found, run pod repo update first, then run pod install again.

Once installation completes, open the .xcworkspace file in Xcode — not the .xcodeproj file. From this point, all development should happen through the workspace.
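For reference, a complete Podfile for a single-target app might look like the sketch below. The target name MyApp, the platform version, and the commented-out Chat pod name are placeholders; substitute your own project's values.

```ruby
# Podfile at the project root
platform :ios, '10.0'

target 'MyApp' do
  use_frameworks!

  # Your existing Chat SDK dependency, if declared via CocoaPods
  # (pod name shown here is a placeholder):
  # pod 'NexconnChat'

  # Nexconn Call SDK
  pod 'NexconnCall/Call'
end
```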

Step 2: Initialize the Call SDK

The Call SDK initialization has a specific order that matters. Skipping steps or initializing out of sequence will cause unexpected behavior.

First, register with the Chat bridge. This must happen before initialize: and before any Call SDK features are used. Call it once at app startup:

[NCCallEngine install];

Second, initialize the Chat SDK with your app key:

NCInitParams *params = [[NCInitParams alloc] initWithAppKey:@"appKey"];
[NCEngine initializeWithParams:params];

Complete your existing Chat connection and sign-in flow at this point. The Call SDK depends on an authenticated Chat session.

Third, once Chat is initialized and the user is signed in, configure and initialize the Call engine:

NCCallInitParams *initParams = [NCCallInitParams new];
initParams.pubLowResolutionStream = YES;
initParams.enableHardwareEncoderHighProfile = NO;

[[NCCallEngine getInstance] initialize:initParams];

pubLowResolutionStream controls whether a secondary low-resolution stream is published alongside the main stream — useful for layouts with multiple video tiles. enableHardwareEncoderHighProfile should generally remain NO for broad device compatibility.

Step 3: Register Call Event Handlers

Handlers need to be registered during app startup, before any calls are placed or received. Missing this step means incoming call events won't be processed.

@interface MyCallHandler () <NCCallEventHandler, NCCallAPIResultHandler>
@end

@implementation MyCallHandler

- (void)setupCallHandlers {
    NCCallEngine *engine = [NCCallEngine getInstance];
    [engine setCallEventHandler:self];
    [engine setAPIResultHandler:self];
}

@end

NCCallEventHandler handles real-time call events — incoming calls, state changes, participant updates. NCCallAPIResultHandler handles the result callbacks from API operations like startCall: and endCall:. Both are needed for a complete implementation.
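As a sketch of how the two protocols fit together, a minimal handler might implement the incoming-call event alongside a result callback. onCallReceived: is shown later in this guide; the result-callback method and error-code names below (onStartCallResult:, NCCallErrorCode) are illustrative assumptions, so check your SDK version's headers for the exact signatures.

```objectivec
@implementation MyCallHandler

// NCCallEventHandler: fires on the callee's device when a call arrives.
- (void)onCallReceived:(NCCallReceivedEvent *)event {
    // Surface your incoming-call UI here before accepting or rejecting.
    NSLog(@"Incoming call with ID: %@", event.session.callId);
}

// NCCallAPIResultHandler: illustrative callback shape only. The method
// name and error-code enum are assumptions; consult the SDK headers.
- (void)onStartCallResult:(NCCallErrorCode)code {
    if (code != NCCallErrorCodeSuccess) {
        NSLog(@"startCall: failed with code %ld", (long)code);
    }
}

@end
```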

Step 4: Set Up Video Views

Before placing or accepting a video call, configure the local and remote video views. This wires the SDK's rendering layer to your UI components.

NCCallLocalVideoView *localView = [NCCallLocalVideoView new];
localView.userId = self.currentUserId;
localView.renderMode = NCCallRenderModeAspectFit;

NCCallRemoteVideoView *remoteView = [NCCallRemoteVideoView new];
remoteView.userId = self.remoteUserId;
remoteView.renderMode = NCCallRenderModeAspectFit;
remoteView.enableLowResolutionStream = NO;

NCCallEngine *engine = [NCCallEngine getInstance];
[engine setLocalVideoView:localView];
[engine setRemoteVideoView:@[ remoteView ]];

NCCallRenderModeAspectFit maintains the original aspect ratio within the view bounds — the right default for most calling interfaces. enableLowResolutionStream on the remote view controls whether to subscribe to the secondary stream when one is available; set to YES for thumbnail views in multi-party layouts.

Note that setRemoteVideoView: accepts an array, which means multi-party video layouts are handled by passing multiple NCCallRemoteVideoView instances.
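For example, a layout with two remote participants (a hypothetical three-person call; the user ID variables below are placeholders) would configure one view per remote user, subscribing the thumbnail to the low-resolution stream:

```objectivec
// One remote view per participant; the thumbnail subscribes to the
// secondary low-resolution stream to save bandwidth.
NCCallRemoteVideoView *mainView = [NCCallRemoteVideoView new];
mainView.userId = firstRemoteUserId;
mainView.renderMode = NCCallRenderModeAspectFit;
mainView.enableLowResolutionStream = NO;

NCCallRemoteVideoView *thumbView = [NCCallRemoteVideoView new];
thumbView.userId = secondRemoteUserId;
thumbView.renderMode = NCCallRenderModeAspectFit;
thumbView.enableLowResolutionStream = YES;

[[NCCallEngine getInstance] setRemoteVideoView:@[ mainView, thumbView ]];
```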

Step 5: Place a Call

With the SDK initialized and video views configured, placing a call requires building an NCCallStartCallParams object and passing it to startCall::

NCCallStartCallParams *params =
    [[NCCallStartCallParams alloc] initWithCalleeIds:@[ self.remoteUserId ]
                                            callType:NCCallTypeSingle
                                           mediaType:NCCallMediaTypeAudioVideo];
[[NCCallEngine getInstance] startCall:params];

NCCallTypeSingle initiates a one-to-one call. For multi-party calls, use the appropriate call type and pass multiple callee IDs. NCCallMediaTypeAudioVideo starts with both audio and video active — use NCCallMediaTypeAudio for voice-only calls.
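A multi-party variant would follow the same shape, as in the sketch below. The group call-type enum value (shown here as NCCallTypeMulti) and the user ID variables are assumptions; verify the exact enum name in your SDK version's headers.

```objectivec
// Hypothetical group-call sketch: NCCallTypeMulti is an assumed enum
// value, and userA/userB/userC are placeholder callee IDs.
NCCallStartCallParams *groupParams =
    [[NCCallStartCallParams alloc] initWithCalleeIds:@[ userA, userB, userC ]
                                            callType:NCCallTypeMulti
                                           mediaType:NCCallMediaTypeAudio];
[[NCCallEngine getInstance] startCall:groupParams];
```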

Step 6: Accept, Reject, and End Calls

Accepting an incoming call happens inside onCallReceived:, which fires on the callee's device when a call arrives. Read the call ID from the event and pass it to acceptCall::

- (void)onCallReceived:(NCCallReceivedEvent *)event {
    NCCallAcceptCallParams *accept =
        [[NCCallAcceptCallParams alloc] initWithCallId:event.session.callId];
    [[NCCallEngine getInstance] acceptCall:accept];
}

NCCallReceivedEvent carries the session object, which contains the call ID, caller information, and media type. Use these fields to build your incoming call UI before accepting.
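Rejecting an incoming call typically mirrors the accept path. The method and params class names below (hangupCall:, NCCallHangupCallParams) are assumptions for illustration; check your SDK headers for the actual decline API.

```objectivec
// Hypothetical reject path: method and class names are assumptions,
// shown only to illustrate the expected shape of a decline call.
- (void)declineIncomingCall:(NCCallReceivedEvent *)event {
    NCCallHangupCallParams *hangup =
        [[NCCallHangupCallParams alloc] initWithCallId:event.session.callId];
    [[NCCallEngine getInstance] hangupCall:hangup];
}
```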

Ending a call requires fetching the current session and passing its call ID to endCall::

NCCallSession *currentSession = [[NCCallEngine getInstance] getCurrentCallSession];
if (currentSession == nil) {
    return;
}

NCCallEndCallParams *endParams = [[NCCallEndCallParams alloc] initWithCallId:currentSession.callId];
endParams.pushConfig = nil;
[[NCCallEngine getInstance] endCall:endParams];

The nil check on getCurrentCallSession is important — calling endCall: without an active session will return an error. pushConfig can be set to send a push notification to the remote party when the call ends; nil skips the notification.

Step 7: Control Devices During a Call

Once a call is connected, microphone, speaker, and camera controls are exposed directly on the NCCallEngine instance:

NCCallEngine *engine = [NCCallEngine getInstance];

engine.enableMicrophone = YES;
engine.enableSpeaker = YES;

[engine enableCamera:YES];
[engine switchCamera];

A few notes on device behavior: the camera is enabled by default after a video call connects, so you don't need to call enableCamera:YES on connection unless you've explicitly disabled it. switchCamera toggles between the front and rear cameras; call it in response to a user action in your UI. Microphone and speaker state can be toggled at any point during the call.
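Wired to UI actions, the controls above might look like the following sketch. The button actions are placeholders, and reading back engine.enableMicrophone assumes the property is readable as well as writable:

```objectivec
// Placeholder IBActions; assumes enableMicrophone is a readwrite property.
- (IBAction)muteTapped:(UIButton *)sender {
    NCCallEngine *engine = [NCCallEngine getInstance];
    engine.enableMicrophone = !engine.enableMicrophone;
    sender.selected = !engine.enableMicrophone; // selected state = muted
}

- (IBAction)flipCameraTapped:(UIButton *)sender {
    [[NCCallEngine getInstance] switchCamera];
}
```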


What You've Built

At this point, your iOS app can:

  • Initialize the Nexconn Call SDK alongside the Chat SDK in the correct sequence
  • Register handlers to receive and process incoming call events
  • Configure local and remote video rendering
  • Place outbound one-to-one audio and video calls
  • Accept incoming calls and read session metadata
  • End calls and handle the session lifecycle cleanly
  • Control microphone, speaker, and camera state during a live call

What Comes Next

This walkthrough covers the core one-to-one calling flow. Nexconn's Call SDK goes further on several dimensions that matter for production deployments:

Weak network resilience is built into the infrastructure layer. The SDK maintains clear audio with up to 80% packet loss and stable video with up to 60% packet loss — handled automatically through adaptive bitrate streaming, packet loss concealment, and anti-jitter algorithms. For apps serving users in Southeast Asia, the Middle East, or any market with inconsistent mobile infrastructure, this behavior happens without additional implementation work.

AI background noise reduction strips environmental audio — traffic, keyboard clicks, ambient noise — from the call stream. This is particularly relevant for dating and social apps where calls happen in unpredictable environments.

Beauty filters and virtual backgrounds are available for video calls, directly relevant for social discovery and dating products where user confidence in their appearance affects call adoption rates.

Cloud recording captures sessions server-side with configurable retention — useful for telehealth compliance, marketplace dispute resolution, and legal platform requirements.

Call timing and billing infrastructure provides high-precision duration tracking for per-minute billing models, premium consultation tiers, and audit-ready records.

Real-time content moderation monitors audio and video streams for inappropriate content, running automatically without requiring manual review workflows.

For the full architecture behind production calling deployments — latency optimization, cross-platform consistency strategies, monetization models, and trust and safety frameworks — download the Real-Time Presence Guide 2026.

Download the Nexconn Real-Time Presence Guide 2026: Strategic Architectures for Dating, Gaming and Social Discovery

Frequently Asked Questions

What's the minimum iOS version supported?

iOS 10.0 or later for audio and video call functionality.

How does the SDK handle calls when the network switches between WiFi and cellular?

Nexconn's Weak Network Resilience Engine handles network transitions automatically. When a user switches between WiFi and cellular data, the SDK maintains the call session and adjusts stream quality based on available bandwidth rather than dropping the connection.

Does Nexconn Call support group calls?

Yes. Multi-party calling is supported — pass multiple callee IDs in NCCallStartCallParams and provide an array of NCCallRemoteVideoView instances to handle multiple remote video streams.

What CocoaPods version is required?

CocoaPods 1.10.0 or later. The Call SDK uses XCFramework, and full XCFramework support starts from that version.

Contact us
We'd love to discuss how Nexconn's real-time communication solutions can support your business. Request a demo, explore pricing, or get tailored onboarding guidance.

Related Articles

  • Sendbird Alternatives in 2026: Why Engineering Teams Are Moving On
  • What is a Chat API? The Definitive Guide to In-App Messaging (2026)
  • How to Integrate In-App Chat into Your Flutter App with Nexconn SDK