How I Wrote a PeerJS (WebRTC) Client for Android
Recently I had to write an Android client for a server that sets up video calls between users using the PeerJS library. This library is a wrapper over WebRTC, or something along those lines.
I approached the task with enthusiasm, since I had never done anything this complicated before.
Naturally, the first step was to search for libraries and projects implementing this functionality.
I found a WebRTC sample, and then a project that implemented all of this more simply.
I started reworking it for my needs, because the broker I use is peerjs.
I changed the JavaScript connection code and started catching error messages. Not right away, but I eventually figured out that the whole problem is that the standard WebView (through which we execute the JavaScript) does not support WebRTC.
"How do I get around this?" I asked myself. I googled for a long time and found nothing useful.
So I decided it would be worth digging into the PeerJS API and seeing how it is all implemented.
I found several requests the library sends that are easy to reproduce, and I also figured out how peerjs connects clients: over WebSocket!
After that I came across another project, and it settled everything!
Below I will show the code, which fits in a single Activity.
First, take the libs folder from the last project mentioned and add it to your project.
Next, create an Activity and declare the fields we will initialize later:
private static final String TAG = "RTCActivity";
private static boolean factoryStaticInitialized;
private GLSurfaceView surfaceView;
private VideoRenderer.Callbacks localRender;
private VideoRenderer.Callbacks remoteRender;
private VideoRenderer localRenderer;
private VideoSource videoSource;
private VideoTrack videoTrack;
private AudioTrack audioTrack;
private MediaStream localMediaStream;
private boolean videoSourceStopped;
private boolean initiator = false;
private boolean video = true;
private boolean audio = true;
private WebSocketClient client;
private PeerConnectionFactory factory;
private PeerConnection peerConnection;
private AudioManager audioManager;
private AudioManager.OnAudioFocusChangeListener audioFocusListener;
private final PCObserver pcObserver = new PCObserver();
private final SDPObserver sdpObserver = new SDPObserver();
private MediaConstraints sdpMediaConstraints;
private LinkedList<PeerConnection.IceServer> iceServers = new LinkedList<PeerConnection.IceServer>();
private LinkedList<IceCandidate> queuedRemoteCandidates = new LinkedList<IceCandidate>();
private Toast logToast;
private final Boolean[] quit = new Boolean[] { false };
private String id;
private String friendId; // id of the peer we are talking to
private String roomKey = "lwjd5qra8257b9"; // the room key; described below
private String token = "packagename"; // any set of characters; I use the package name without the dots
private String connectionId = "mc_packagename"; // also a random set of characters, but starting with "mc_"
In onCreate, create a GLSurfaceView into which the video will be rendered, and add it to some container:
surfaceView = new GLSurfaceView(this);
surfaceView.setLayoutParams(new LayoutParams(LayoutParams.MATCH_PARENT, LayoutParams.MATCH_PARENT));
LinearLayout content = (LinearLayout)findViewById(R.id.activity_webrtc_content);
content.addView(surfaceView);
Next, configure where the video will be displayed:
VideoRendererGui.setView(surfaceView);
remoteRender = VideoRendererGui.create(0, 0, 100, 100);
localRender = VideoRendererGui.create(1, 74, 25, 25);
Here we specify that the incoming video fills 100% of the width and height of the surfaceView, while the video from our camera is shown in the lower-left corner: offset 1% of the width from the left and 74% of the height from the top, taking up 25% of the width and height.
if (!factoryStaticInitialized) {
PeerConnectionFactory.initializeAndroidGlobals(this, true, true);
factoryStaticInitialized = true;
}
audioManager = ((AudioManager) getSystemService(AUDIO_SERVICE));
@SuppressWarnings("deprecation")
boolean isWiredHeadsetOn = audioManager.isWiredHeadsetOn();
audioManager.setMode(isWiredHeadsetOn ? AudioManager.MODE_IN_CALL : AudioManager.MODE_IN_COMMUNICATION);
audioManager.setSpeakerphoneOn(!isWiredHeadsetOn);
sdpMediaConstraints = new MediaConstraints();
sdpMediaConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveAudio", "true"));
sdpMediaConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveVideo", "true"));
iceServers.add(new PeerConnection.IceServer("stun:stun.l.google.com:19302"));
createPC();
initializeAndroidGlobals - don't ask what this method is for. I only know that without it you cannot create a connection.
iceServers - read up on these on the Internet. Sometimes video is not transmitted because you need to add more servers, in the same way as here. And on the server side, if you plan to talk to the website as well, you need to add the same iceServer.
Next, we implement the createPC () method:
void createPC(){
factory = new PeerConnectionFactory();
MediaConstraints pcConstraints = new MediaConstraints();
pcConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveAudio", "true"));
pcConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveVideo", "true"));
pcConstraints.optional.add(new MediaConstraints.KeyValuePair("RtpDataChannels", "true"));
pcConstraints.optional.add(new MediaConstraints.KeyValuePair("DtlsSrtpKeyAgreement", "true"));
peerConnection = factory.createPeerConnection(iceServers, pcConstraints, pcObserver);
// and this is our connection itself
createDataChannelToRegressionTestBug2302(peerConnection);
// perform a connection check (a bug workaround)
logAndToast("Creating local video source...");
MediaConstraints videoConstraints = new MediaConstraints();
videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxHeight", "240"));
videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxWidth", "320"));
// the video size can also be left unspecified
localMediaStream = factory.createLocalMediaStream("ARDAMS");
VideoCapturer capturer = getVideoCapturer();
videoSource = factory.createVideoSource(capturer, videoConstraints);
videoTrack = factory.createVideoTrack("ARDAMSv0", videoSource);
localRenderer = new VideoRenderer(localRender);
videoTrack.addRenderer(localRenderer); // our video, which we will later be able to toggle off
localMediaStream.addTrack(videoTrack);
audioTrack = factory.createAudioTrack("ARDAMSa0", factory.createAudioSource(new MediaConstraints())); // our audio from the microphone
localMediaStream.addTrack(audioTrack);
peerConnection.addStream(localMediaStream, new MediaConstraints());
GetID getId = new GetID();
try {
getId.execute(); // start the async task that performs an HTTP request to obtain an id from the broker
}catch (Exception e) {
logAndToast("No Internet connection");
disconnectAndExit();
}
}
private class GetID extends AsyncTask<Void, Void, String> {
@Override
protected String doInBackground(Void... params) {
NetHelper a = NetHelper.getInstance(RTCActivity.this);
// perform the HTTP request that returns our id
String result = a.executeHttpGet("http://0.peerjs.com:9000/" + roomKey + "/id?ts=" + Calendar.getInstance().getTimeInMillis() + ".7330598266421392");
// roomKey is the room key; you can get one on the PeerJS site or use the standard demo key "lwjd5qra8257b9"
// 7330598266421392 is a random string of digits that keeps the id unique
if (result==null)
return null;
result = result.replace("\n", ""); // the id comes back with a trailing newline, so remove it
return result;
}
@Override
protected void onPostExecute(String result) {
super.onPostExecute(result);
if (result==null)
return;
id = result;
// create a listener that will handle the events received on the socket
WebSocketClient.Listener listener = new WebSocketClient.Listener() {
@Override
public void onMessage(byte[] arg0) {
}
@Override
public void onMessage(final String data) {
runOnUiThread(new Runnable() {
public void run() {
try {
JSONObject json = new JSONObject(data);
String type = (String) json.get("type");
if (type.equalsIgnoreCase("candidate")) {
JSONObject jsonCandidate = json.getJSONObject("payload").getJSONObject("candidate");
IceCandidate candidate = new IceCandidate(
(String) jsonCandidate.get("sdpMid"),
jsonCandidate.getInt("sdpMLineIndex"),
(String) jsonCandidate.get("candidate"));
if (queuedRemoteCandidates != null) {
queuedRemoteCandidates.add(candidate);
} else {
peerConnection.addIceCandidate(candidate);
}
} else if (type.equalsIgnoreCase("answer") || type.equalsIgnoreCase("offer")) {
connectionId = json.getJSONObject("payload").getString("connectionId");
friendId = json.getString("src");
JSONObject jsonSdp = json.getJSONObject("payload").getJSONObject("sdp");
SessionDescription sdp = new SessionDescription(
SessionDescription.Type.fromCanonicalForm(type),
preferISAC((String) jsonSdp.get("sdp")));
peerConnection.setRemoteDescription(sdpObserver, sdp);
} else if (type.equalsIgnoreCase("bye")) {
logAndToast("Remote end hung up; dropping PeerConnection");
disconnectAndExit();
} else {
//throw new RuntimeException("Unexpected message: " + data);
}
} catch (JSONException e) {
//throw new RuntimeException(e);
}
}
});
}
@Override
public void onError(Exception arg0) {
runOnUiThread(new Runnable() {
public void run() {
disconnectAndExit();
}
});
}
@Override
public void onDisconnect(int arg0, String arg1) {
runOnUiThread(new Runnable() {
public void run() {
disconnectAndExit();
}
});
}
@Override
public void onConnect() {
// called once the socket has connected
runOnUiThread(new Runnable() {
public void run() {
if (initiator){
logAndToast("Creating offer...");
peerConnection.createOffer(sdpObserver, sdpMediaConstraints);
}
}
});
}
};
URI uri = null;
try {
// build the URI for the socket; for the peerjs broker it must look like this
uri = new URI("ws", "", "0.peerjs.com", 9000, "/peerjs", "key=" + roomKey + "&id=" + id + "&token=" + token, "");
// roomKey - already described above; use the same one
// id - the id we just received from the broker
// token - a random set of characters (I use the package name without the dots)
} catch (URISyntaxException e) {
disconnectAndExit();
}
client = new WebSocketClient(uri, listener, null); // create the socket itself
client.connect();
}
}
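To summarize the two peerjs endpoints used above, here is a small standalone sketch; the class and method names are my own, but the URL shapes are taken directly from the request and URI built in the code above:

```java
// Hypothetical helper that documents the two peerjs broker endpoints.
public final class PeerJsUrls {
    private PeerJsUrls() {}

    // HTTP GET endpoint that returns a fresh peer id (with a trailing newline).
    // randomSuffix is a random digit string that keeps the id unique.
    public static String idUrl(String roomKey, long timestampMillis, String randomSuffix) {
        return "http://0.peerjs.com:9000/" + roomKey + "/id?ts=" + timestampMillis + "." + randomSuffix;
    }

    // WebSocket endpoint that the signaling socket connects to.
    public static String socketUrl(String roomKey, String id, String token) {
        return "ws://0.peerjs.com:9000/peerjs?key=" + roomKey + "&id=" + id + "&token=" + token;
    }
}
```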
If you know the id (issued by the broker) of the user you want to connect to, then you need to create an offer in the socket's onConnect method.
If not, do nothing.
Note that code in the socket listener's callbacks must be executed on the UI thread, wrapped like this:
runOnUiThread(new Runnable() {
public void run() {
}
});
In the onMessage(final String data) method we receive a message. It can be:
- an offer, which another user sent in order to connect to us;
- an answer, sent by the user to whom we sent an offer;
- a candidate, carrying iceServer information, which we add to our connection.
Note that these messages have a structure defined by peerjs, so with other brokers they will have to be parsed differently.
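For reference, here is roughly what one of these broker messages looks like on the wire - an illustrative candidate envelope matching the fields the code above reads and writes (all values are made up):

```json
{
  "type": "CANDIDATE",
  "src": "our-peer-id",
  "dst": "friend-peer-id",
  "payload": {
    "type": "media",
    "connectionId": "mc_packagename",
    "candidate": {
      "sdpMid": "audio",
      "sdpMLineIndex": 0,
      "candidate": "candidate:0 1 udp 2122260223 192.168.0.2 56143 typ host"
    }
  }
}
```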
We implement such a class:
private class PCObserver implements PeerConnection.Observer {
@Override
public void onIceCandidate(final IceCandidate candidate){
runOnUiThread(new Runnable() {
public void run() {
JSONObject json = new JSONObject();
JSONObject payload = new JSONObject();
JSONObject jsonCandidate = new JSONObject();
jsonPut(json, "type", "CANDIDATE");
jsonPut(jsonCandidate, "sdpMid", candidate.sdpMid);
jsonPut(jsonCandidate, "sdpMLineIndex", candidate.sdpMLineIndex);
jsonPut(jsonCandidate, "candidate", candidate.sdp);
jsonPut(payload, "candidate", jsonCandidate);
jsonPut(payload, "type", "media");
jsonPut(payload, "connectionId", connectionId);
jsonPut(json, "payload", payload);
jsonPut(json, "dst", friendId);
jsonPut(json, "src", id);
sendMessage(json);
}
});
}
@Override
public void onError(){
runOnUiThread(new Runnable() {
public void run() {
disconnectAndExit();
}
});
}
@Override
public void onSignalingChange(PeerConnection.SignalingState newState) {
}
@Override
public void onIceConnectionChange(PeerConnection.IceConnectionState newState) {
}
@Override
public void onIceGatheringChange(PeerConnection.IceGatheringState newState) {
}
@Override
public void onAddStream(final MediaStream stream){
runOnUiThread(new Runnable() {
public void run() {
if (stream.videoTracks.size() == 1) {
stream.videoTracks.get(0).addRenderer(new VideoRenderer(remoteRender));
}
}
});
}
@Override
public void onRemoveStream(final MediaStream stream){
runOnUiThread(new Runnable() {
public void run() {
stream.videoTracks.get(0).dispose();
}
});
}
@Override
public void onDataChannel(final DataChannel dc) {
}
@Override
public void onRenegotiationNeeded() {
}
}
This class sends our ICE candidates to the other user and receives the incoming stream with video and audio.
And one more class:
private class SDPObserver implements SdpObserver {
private SessionDescription localSdp;
@Override
public void onCreateSuccess(final SessionDescription origSdp) {
final SessionDescription sdp = new SessionDescription(origSdp.type, preferISAC(origSdp.description));
localSdp = sdp;
runOnUiThread(new Runnable() {
public void run() {
peerConnection.setLocalDescription(sdpObserver, sdp);
}
});
}
private void sendLocalDescription() {
logAndToast("Sending " + localSdp.type);
JSONObject json = new JSONObject();
JSONObject payload = new JSONObject();
JSONObject sdp = new JSONObject();
jsonPut(json, "type", localSdp.type.canonicalForm().toUpperCase());
jsonPut(sdp, "sdp", localSdp.description);
jsonPut(sdp, "type", localSdp.type.canonicalForm().toLowerCase());
jsonPut(payload, "sdp", sdp);
jsonPut(payload, "type", "media");
jsonPut(payload, "connectionId", connectionId);
jsonPut(payload, "browser", "Chrome");
jsonPut(json, "payload", payload);
jsonPut(json, "dst", friendId);
sendMessage(json);
}
@Override
public void onSetSuccess() {
runOnUiThread(new Runnable() {
public void run() {
if (initiator) {
if (peerConnection.getRemoteDescription() != null) {
drainRemoteCandidates();
} else {
sendLocalDescription();
}
} else {
if (peerConnection.getLocalDescription() == null) {
logAndToast("Creating answer");
peerConnection.createAnswer(SDPObserver.this, sdpMediaConstraints);
} else {
sendLocalDescription();
drainRemoteCandidates();
}
}
}
});
}
@Override
public void onCreateFailure(final String error) {
}
@Override
public void onSetFailure(final String error) {
}
private void drainRemoteCandidates() {
for (IceCandidate candidate : queuedRemoteCandidates) {
peerConnection.addIceCandidate(candidate);
}
queuedRemoteCandidates = null;
}
}
This class handles SDP, the protocol used to exchange iceServer information and the offer/answer. It is also this class that sends the offer/answer, in the sendLocalDescription() method; the offer/answer likewise has a peerjs-specific structure.
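For example, the offer envelope assembled in sendLocalDescription() looks roughly like this (the values are illustrative, and the SDP text is truncated):

```json
{
  "type": "OFFER",
  "dst": "friend-peer-id",
  "payload": {
    "type": "media",
    "connectionId": "mc_packagename",
    "browser": "Chrome",
    "sdp": {
      "type": "offer",
      "sdp": "v=0\r\no=- 46117 2 IN IP4 127.0.0.1\r\n..."
    }
  }
}
```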
We also add a few helper methods:
// here we grab the local video from the camera
private VideoCapturer getVideoCapturer() {
String[] cameraFacing = { "front", "back" };
int[] cameraIndex = { 0, 1 };
int[] cameraOrientation = { 0, 90, 180, 270 };
for (String facing : cameraFacing) {
for (int index : cameraIndex) {
for (int orientation : cameraOrientation) {
String name = "Camera " + index + ", Facing " + facing + ", Orientation " + orientation;
VideoCapturer capturer = VideoCapturer.create(name);
if (capturer != null) {
logAndToast("Using camera: " + name);
return capturer;
}
}
}
}
return null;
}
@Override
protected void onDestroy() {
disconnectAndExit();
super.onDestroy();
}
private void logAndToast(String msg) {
Log.d(TAG, msg);
if (logToast != null) {
logToast.cancel();
}
logToast = Toast.makeText(this, msg, Toast.LENGTH_SHORT);
logToast.show();
}
private void sendMessage(JSONObject json) {
client.send(json.toString()); // send the message
}
private static void jsonPut(JSONObject json, String key, Object value) {
try {
json.put(key, value);
} catch (JSONException e) {
}
}
// a complex method that I did not touch; it parses the SDP parameters
private static String preferISAC(String sdpDescription) {
String[] lines = sdpDescription.split("\r\n");
int mLineIndex = -1;
String isac16kRtpMap = null;
Pattern isac16kPattern = Pattern.compile("^a=rtpmap:(\\d+) ISAC/16000[\r]?$");
for (int i = 0; (i < lines.length) && (mLineIndex == -1 || isac16kRtpMap == null); ++i) {
if (lines[i].startsWith("m=audio ")) {
mLineIndex = i;
continue;
}
Matcher isac16kMatcher = isac16kPattern.matcher(lines[i]);
if (isac16kMatcher.matches()) {
isac16kRtpMap = isac16kMatcher.group(1);
continue;
}
}
if (mLineIndex == -1) {
Log.d(TAG, "No m=audio line, so can't prefer iSAC");
return sdpDescription;
}
if (isac16kRtpMap == null) {
Log.d(TAG, "No ISAC/16000 line, so can't prefer iSAC");
return sdpDescription;
}
String[] origMLineParts = lines[mLineIndex].split(" ");
StringBuilder newMLine = new StringBuilder();
int origPartIndex = 0;
newMLine.append(origMLineParts[origPartIndex++]).append(" ");
newMLine.append(origMLineParts[origPartIndex++]).append(" ");
newMLine.append(origMLineParts[origPartIndex++]).append(" ");
newMLine.append(isac16kRtpMap);
for (; origPartIndex < origMLineParts.length; ++origPartIndex) {
if (!origMLineParts[origPartIndex].equals(isac16kRtpMap)) {
newMLine.append(" ").append(origMLineParts[origPartIndex]);
}
}
lines[mLineIndex] = newMLine.toString();
StringBuilder newSdpDescription = new StringBuilder();
for (String line : lines) {
newSdpDescription.append(line).append("\r\n");
}
return newSdpDescription.toString();
}
// release all resources and exit
private void disconnectAndExit() {
synchronized (quit[0]) {
if (quit[0]) {
return;
}
quit[0] = true;
if (peerConnection != null) {
peerConnection.dispose();
peerConnection = null;
}
if (client != null) {
client.send("{\"type\": \"bye\"}");
client.disconnect();
client = null;
}
if (videoSource != null) {
videoSource.dispose();
videoSource = null;
}
if (factory != null) {
factory.dispose();
factory = null;
}
if (audioManager!=null)
audioManager.abandonAudioFocus(audioFocusListener);
finish();
}
}
@Override
public void onStop() {
disconnectAndExit();
super.onStop();
}
@Override
public void onPause() {
super.onPause();
surfaceView.onPause();
if (videoSource != null) {
videoSource.stop(); // stop streaming video
videoSourceStopped = true;
}
}
@Override
public void onResume() {
super.onResume();
surfaceView.onResume();
if (videoSource != null && videoSourceStopped) {
videoSource.restart(); // resume streaming video
}
}
// works around a known WebRTC bug (2302) by creating and immediately disposing a data channel
private static void createDataChannelToRegressionTestBug2302(PeerConnection pc) {
DataChannel dc = pc.createDataChannel("dcLabel", new DataChannel.Init());
dc.close();
dc.dispose();
}
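The preferISAC() method above is easier to understand on a toy input. Here is a simplified standalone version of the same transformation (the class name and SDP strings are my own): it finds the ISAC/16000 payload id among the rtpmap lines and moves it to the front of the codec list on the m=audio line, which makes iSAC the preferred audio codec.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public final class IsacDemo {
    public static String preferIsac(String sdp) {
        String[] lines = sdp.split("\r\n");
        int mLineIndex = -1;
        String isacId = null;
        Pattern p = Pattern.compile("^a=rtpmap:(\\d+) ISAC/16000[\r]?$");
        for (int i = 0; i < lines.length; i++) {
            if (lines[i].startsWith("m=audio ")) mLineIndex = i;
            Matcher m = p.matcher(lines[i]);
            if (m.matches()) isacId = m.group(1);
        }
        if (mLineIndex == -1 || isacId == null) return sdp; // nothing to do
        String[] parts = lines[mLineIndex].split(" ");
        StringBuilder mLine = new StringBuilder();
        // keep "m=audio", the port and the profile, then put the iSAC payload id first
        mLine.append(parts[0]).append(' ').append(parts[1]).append(' ')
             .append(parts[2]).append(' ').append(isacId);
        for (int i = 3; i < parts.length; i++) {
            if (!parts[i].equals(isacId)) mLine.append(' ').append(parts[i]);
        }
        lines[mLineIndex] = mLine.toString();
        StringBuilder out = new StringBuilder();
        for (String line : lines) out.append(line).append("\r\n");
        return out.toString();
    }
}
```

On an input whose m-line is `m=audio 9 RTP/SAVPF 111 103` with `103` mapped to ISAC/16000, the line becomes `m=audio 9 RTP/SAVPF 103 111`.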
We have an initiator variable, false by default. It indicates whether you are calling someone or waiting for a call.
In an asynchronous task I look up the id of the user I am calling in the database on the server. If it is found, we connect to the broker and I set initiator = true. As soon as the socket connects, the offer is created immediately and we connect to the other user.
If there is no user id, we simply wait. When someone wants to call us, they must find out our id and send us an offer.
If you want buttons that disable video or audio transmission, this code will help:
final ImageView noVideo = (ImageView)findViewById(R.id.activity_webrtc_video);
noVideo.setOnClickListener(new OnClickListener(){
@Override
public void onClick(View v) {
if (video){
noVideo.setImageResource(R.drawable.video_off);
video = false;
videoTrack.setEnabled(false);
}else{
noVideo.setImageResource(R.drawable.video_on);
video = true;
videoTrack.setEnabled(true);
}
}
});
final ImageView noAudio = (ImageView)findViewById(R.id.activity_webrtc_voice);
noAudio.setOnClickListener(new OnClickListener(){
@Override
public void onClick(View v) {
if (audio){
noAudio.setImageResource(R.drawable.voice_off);
audio = false;
audioTrack.setEnabled(false);
}else{
noAudio.setImageResource(R.drawable.voice_on);
audio = true;
audioTrack.setEnabled(true);
}
}
});
And, of course, add the activity and the permissions to the manifest:
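The original snippet is not shown here, but a minimal sketch might look like this, assuming the Activity is the RTCActivity referenced in the code above and that all four permissions are needed (INTERNET for the broker, CAMERA and RECORD_AUDIO for the media, MODIFY_AUDIO_SETTINGS for the speakerphone switching):

```xml
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />

<activity android:name=".RTCActivity" />
```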
Well, that's all. Please don't take offense at the somewhat chaotic story, and the code quality is hardly exemplary.
I hope it comes in handy for someone.