Compare commits

...

52 Commits

Author SHA1 Message Date
linkmauve
8d0d92a437 Log the amount of local tracks properly
This changes a log message from “initialized with %s local tracks 2” to “initialized with 2 local tracks”.
2018-08-14 10:53:47 +02:00
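The difference can be reproduced with a toy logger (a sketch; jitsi-meet-logger's real formatting differs, but as the old output above shows, it likewise appends extra arguments rather than substituting `%s`):

```javascript
// Toy logger that, like the one described above, appends extra arguments
// instead of substituting printf-style placeholders.
function log(...args) {
    return args.map(String).join(' ');
}

const tracks = [ {}, {} ];

// Old call: '%s' is left as-is and the count is appended at the end.
log('initialized with %s local tracks', tracks.length);
// -> 'initialized with %s local tracks 2'

// Fixed call: a template literal interpolates the count in place.
log(`initialized with ${tracks.length} local tracks`);
// -> 'initialized with 2 local tracks'
```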
linkmauve
faada0abae Print a nicer log message on participant join/part
This makes the logs more readable.
2018-08-14 10:53:18 +02:00
Ritwik Heda
1d99abc4a4 removes the need for eslint-disable-next-line react/jsx-wrap-multilines and eslint-disable-line no-extra-parens 2018-08-12 17:06:35 -05:00
Lyubo Marinov
9aed4df6d2 react-native-webrtc: android: pass correct constraints map to VideoCaptureController 2018-08-11 18:03:05 -05:00
Saúl Ibarra Corretgé
d92b720704 [RN] Update calendar-events dependency
Includes a fix for not running expensive operations on the main thread.
2018-08-10 15:11:37 +02:00
bgrozev
25aaa74edc Merge pull request #3223 from ztl8702/local-recording
Feature: Local recording (Ready for review)
2018-08-08 19:35:11 -05:00
Boris Grozev
195462a1a8 Merge branch 'master' into pr/3223 2018-08-08 15:35:40 -05:00
bgrozev
9c03e95bf1 npm: Updates lib-jitsi-meet to 4a28a196160411d657518022de8bded7c02ad679. (#3357) 2018-08-08 14:42:32 -05:00
virtuacoplenny
c353e9377f feat(tile-view): initial implementation for tile view (#3317)
* feat(tile-view): initial implementation for tile view

- Modify the classname on the app root so layout can adjust
  depending on the desired layout mode--vertical filmstrip,
  horizontal filmstrip, and tile view.
- Create a button for toggling tile view.
- Add a StateListenerRegistry to automatically update the
  selected participant and max receiver frame height on tile
  view toggle.
- Resize thumbnails when switching in and out of tile view.
- Move the local video when switching in and out of tile view.
- Update reactified pieces of thumbnails when switching in and
  out of tile view.
- Cap the max receiver video quality in tile view based on tile
  size.
- Use CSS to hide UI components that should not display in tile
  view.
- Signal follow me changes.

* change local video id for tests

* change approach: leverage more css

* squash: fix some formatting

* squash: prevent pinning, hide pin border in tile view

* squash: change logic for maxReceiverQuality due to sidestepping resizing logic

* squash: fix typo, columns configurable, remove unused constants

* squash: resize with js again

* squash: use yana's math for calculating tile size
2018-08-08 13:48:23 -05:00
Radium Zheng
913c56c408 fix comments and docs 2018-08-08 11:58:38 +10:00
bgrozev
2f1223f721 fix: Handles the case of e2eRtt being undefined. (#3354) 2018-08-07 18:39:10 -07:00
Radium Zheng
4f1aaf89bf update package-lock.json 2018-08-08 09:26:49 +10:00
Radium Zheng
df6df1c6c3 refactor: AbstractAudioContextAdapter
move duplicate code from WavAdapter and FlacAdapter to a base class
2018-08-08 09:19:53 +10:00
Radium Zheng
1e804e552e fix: FlacAdapter get sampleRate 2018-08-08 09:19:53 +10:00
Radium Zheng
b284f25fde Refactor how download works. Cleaner filenames. 2018-08-08 09:19:53 +10:00
Radium Zheng
49bdd53bee Fix issue on mobile platforms 2018-08-08 09:19:53 +10:00
Radium Zheng
0827e02de9 use official repo for libflac.js 2018-08-08 09:19:53 +10:00
Radium Zheng
0410af9e5e add guard before APP in middleware.js 2018-08-08 09:19:28 +10:00
Radium Zheng
5a051024e6 clean up WavAdapter 2018-08-08 09:19:28 +10:00
Radium Zheng
e2def5f88b simplify Promise chaining in FlacAdapter 2018-08-08 09:19:28 +10:00
Radium Zheng
1078fa9d05 remove 'localRecording' from interface_config.js 2018-08-08 09:19:28 +10:00
Radium Zheng
dda7568a48 UI: refine LocalRecordingInfoDialog 2018-08-08 09:19:28 +10:00
Radium Zheng
4550848eac fix comments in flac-related codebase 2018-08-08 09:19:28 +10:00
Radium Zheng
7822831b1e UI: add a "Local Recording" label 2018-08-08 09:19:28 +10:00
Radium Zheng
e03126e422 fix sampleRate issues in flac and wav 2018-08-08 09:19:28 +10:00
Radium Zheng
61652c69b3 SessionManager 2018-08-08 09:19:28 +10:00
Radium Zheng
b6e1a49d33 Switching microphone on the fly: flac and wav support 2018-08-08 09:19:28 +10:00
Radium Zheng
e0ac3efb5c comment out section in config.js 2018-08-08 09:19:28 +10:00
Radium Zheng
65c76dcde5 Muting support
fix Promise in setMuted
2018-08-08 09:19:28 +10:00
Radium Zheng
5daa91ec1b update libflac.js to 4 and use proper fork 2018-08-08 09:19:28 +10:00
Radium Zheng
473ba28171 feature flag 2018-08-08 09:18:16 +10:00
Radium Zheng
52b55d65a0 change LocalRecordingInfoDialog 2018-08-08 09:18:16 +10:00
Radium Zheng
8ebf2b7e47 analytics: keyboard shortcut 2018-08-08 09:18:16 +10:00
Radium Zheng
cc38fcc5d0 register shortcuts in the middleware 2018-08-08 09:18:16 +10:00
Radium Zheng
a277421ecb WIP: Convert inline dialog to modal dialog 2018-08-08 09:18:16 +10:00
Radium Zheng
2f2e69a6f5 Add keyboard shortcuts for LocalRecordingInfoDialog
Which key should we use? Using "L" for now.
2018-08-08 09:18:16 +10:00
Radium Zheng
0490a3cf73 Refactor RecordingController 2018-08-08 09:18:16 +10:00
Radium Zheng
bfc8ecfaa6 changed one comment line 2018-08-08 09:18:16 +10:00
Radium Zheng
42c827434c clean up in LocalRecordingInfoDialog 2018-08-08 09:18:16 +10:00
Radium Zheng
0f3b67e53e reducer should be a pure function 2018-08-08 09:18:16 +10:00
Radium Zheng
2dfb107c57 UI strings: durationNA and moderator's finish message 2018-08-08 09:18:16 +10:00
Radium Zheng
f8c01646c7 Temp fix: newly joined clients miss the commands
When newly joined clients register for XMPP events upon
CONFERENCE_JOINED, the events carried by presence (e.g. START_COMMAND)
have already fired.
The temporary solution is to let the client send a ping message after
registering its XMPP event listeners. The moderator responds with a
pong, which forces the presence to be resent.
2018-08-08 09:18:16 +10:00
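The workaround can be sketched with an in-memory stand-in for the XMPP layer (all names here are illustrative, not the actual jitsi-meet API):

```javascript
// The moderator broadcasts its presence (carrying e.g. START_COMMAND) once;
// clients that register listeners later would normally miss it, so a ping
// forces the presence to be re-delivered as the pong.
class Moderator {
    constructor() {
        this.presence = { command: 'START_COMMAND' };
        this.listeners = [];
    }

    broadcast() {
        this.listeners.forEach(listener => listener(this.presence));
    }

    ping(respond) {
        respond(this.presence); // the pong carries presence again
    }
}

class LateClient {
    constructor() {
        this.received = null;
    }

    join(moderator) {
        // Register the presence listener first...
        moderator.listeners.push(p => {
            this.received = p.command;
        });

        // ...then ping, so any presence fired before this point is resent.
        moderator.ping(p => {
            this.received = p.command;
        });
    }
}

const moderator = new Moderator();

moderator.broadcast();       // START_COMMAND fires before the client joins
const client = new LateClient();

client.join(moderator);      // the pong re-delivers it
// client.received === 'START_COMMAND'
```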
Radium Zheng
0f0f9ea1b2 bug fix: multiple StartCommands
Handles the situation where the RecordingController receives a new
START_COMMAND while it is still initializing the recording adapter for
the previous START_COMMAND.
2018-08-08 09:18:16 +10:00
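A minimal sketch of the guard (illustrative names, not the actual RecordingController API): a START_COMMAND arriving while the adapter for a previous command is still initializing gets dropped.

```javascript
// State machine guarding adapter initialization: only an idle controller
// accepts a START_COMMAND; a duplicate arriving mid-initialization is dropped.
class RecordingController {
    constructor() {
        this.state = 'idle';
        this.sessionsStarted = 0;
    }

    onStartCommand() {
        if (this.state !== 'idle') {
            return false; // adapter busy initializing, or already recording
        }
        this.state = 'initializing';

        return true;
    }

    onAdapterReady() {
        this.state = 'recording';
        this.sessionsStarted += 1;
    }
}

const controller = new RecordingController();

controller.onStartCommand(); // -> true, begins adapter initialization
controller.onStartCommand(); // -> false, duplicate dropped
controller.onAdapterReady(); // -> one recording session, not two
```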
Radium Zheng
ce308eaa8b refactor: remove ensureInitialized 2018-08-08 09:18:16 +10:00
Radium Zheng
337cea6488 don't use params to switch actionType 2018-08-08 09:18:16 +10:00
Radium Zheng
e125861b29 refactor: use createLocalTracks instead of gUM; fix some docs; 2018-08-08 09:18:16 +10:00
Radium Zheng
3241c7a929 guard LocalRecordingButton with _shouldShowButton 2018-08-08 09:18:16 +10:00
Radium Zheng
55a2ef30a0 a11y label 2018-08-08 09:18:16 +10:00
Radium Zheng
ae0bd9e64e remove excessive comments in flacEncodeWorker.js 2018-08-08 09:18:16 +10:00
Radium Zheng
9c769a650e fix a missing doc string in Toolbox.js; reorder props alphabetically 2018-08-08 09:18:16 +10:00
Radium Zheng
07bc70c2f5 Implement local recording
index.js of local recording

local-recording(ui): recording button

local-recording(encoding): flac support with libflac.js

Fixes in RecordingController; integration with UI

local-recording(controller): coordinate recording on different clients

local-recording(controller): allow recording on remote participants

local-recording(controller): global singleton

local-recording(controller): use middleware to init LocalRecording

cleanup and documentation in RecordingController

local-recording(refactor): "Delegate" -> "Adapter"

code style

stop eslint and flow from complaining

temp save: client status

fix linter issues

fix some docs; remove global LocalRecording instance

use node.js packaging for libflac.js; remove vendor/ folder

code style: flacEncodeWorker.js

use moment.js to do time diff

remove the use of console.log

code style: flac related files

remove excessive empty lines; and more docs

remove the use of clockTick for UI updates

initialize flacEncodeWorker properly, to avoid premature audio data transmission

move the realization of recordingController events
from LocalRecordingButton to middleware

i18n strings

minor markup changes in LocalRecordingInfoDialog

fix documentation
2018-08-08 09:18:16 +10:00
bgrozev
2ee1bf9351 feat: Displays the E2E RTT in the connection stats table. (#3344)
* feat: Displays the E2E RTT in the connection stats table.

* fix: Whitelists the ping config properties.

* ref: Addresses feedback.

* npm: Updates lib-jitsi-meet to e097a1189ed99838605d90b959e129155bc0e50a.

* ref: Moves the e2ertt and region to the existing stats object.
2018-08-07 11:31:51 -07:00
87 changed files with 4535 additions and 166 deletions

View File

@@ -2,6 +2,7 @@ BUILD_DIR = build
CLEANCSS = ./node_modules/.bin/cleancss
DEPLOY_DIR = libs
LIBJITSIMEET_DIR = node_modules/lib-jitsi-meet/
LIBFLAC_DIR = node_modules/libflacjs/dist/min/
NODE_SASS = ./node_modules/.bin/node-sass
NPM = npm
OUTPUT_DIR = .
@@ -19,7 +20,7 @@ compile:
clean:
rm -fr $(BUILD_DIR)
deploy: deploy-init deploy-appbundle deploy-lib-jitsi-meet deploy-css deploy-local
deploy: deploy-init deploy-appbundle deploy-lib-jitsi-meet deploy-libflac deploy-css deploy-local
deploy-init:
rm -fr $(DEPLOY_DIR)
@@ -33,6 +34,8 @@ deploy-appbundle:
$(BUILD_DIR)/do_external_connect.min.map \
$(BUILD_DIR)/external_api.min.js \
$(BUILD_DIR)/external_api.min.map \
$(BUILD_DIR)/flacEncodeWorker.min.js \
$(BUILD_DIR)/flacEncodeWorker.min.map \
$(BUILD_DIR)/device_selection_popup_bundle.min.js \
$(BUILD_DIR)/device_selection_popup_bundle.min.map \
$(BUILD_DIR)/dial_in_info_bundle.min.js \
@@ -50,6 +53,12 @@ deploy-lib-jitsi-meet:
$(LIBJITSIMEET_DIR)/modules/browser/capabilities.json \
$(DEPLOY_DIR)
deploy-libflac:
cp \
$(LIBFLAC_DIR)/libflac4-1.3.2.min.js \
$(LIBFLAC_DIR)/libflac4-1.3.2.min.js.mem \
$(DEPLOY_DIR)
deploy-css:
$(NODE_SASS) $(STYLES_MAIN) $(STYLES_BUNDLE) && \
$(CLEANCSS) $(STYLES_BUNDLE) > $(STYLES_DESTINATION) ; \
@@ -58,7 +67,7 @@ deploy-css:
deploy-local:
([ ! -x deploy-local.sh ] || ./deploy-local.sh)
dev: deploy-init deploy-css deploy-lib-jitsi-meet
dev: deploy-init deploy-css deploy-lib-jitsi-meet deploy-libflac
$(WEBPACK_DEV_SERVER)
source-package:

View File

@@ -704,7 +704,7 @@ export default {
track.mute();
}
});
logger.log('initialized with %s local tracks', tracks.length);
logger.log(`initialized with ${tracks.length} local tracks`);
this._localTracksInitialized = true;
con.addEventListener(
JitsiConnectionEvents.CONNECTION_FAILED,
@@ -1678,7 +1678,7 @@ export default {
role: user.getRole()
}));
logger.log('USER %s connnected', id, user);
logger.log(`USER ${id} connnected:`, user);
APP.API.notifyUserJoined(id, {
displayName,
formattedDisplayName: appendSuffix(
@@ -1698,7 +1698,7 @@ export default {
}
APP.store.dispatch(participantLeft(id, room));
logger.log('USER %s LEFT', id, user);
logger.log(`USER ${id} LEFT:`, user);
APP.API.notifyUserLeft(id);
APP.UI.messageHandler.participantNotification(
user.getDisplayName(),

View File

@@ -347,6 +347,36 @@ var config = {
// userRegion: "asia"
}
// Local Recording
//
// localRecording: {
// Enables local recording.
// Additionally, 'localrecording' (all lowercase) needs to be added to
// TOOLBAR_BUTTONS in interface_config.js for the Local Recording
// button to show up on the toolbar.
//
// enabled: true,
//
// The recording format, can be one of 'ogg', 'flac' or 'wav'.
// format: 'flac'
//
// }
// Options related to end-to-end (participant to participant) ping.
// e2eping: {
// // The interval in milliseconds at which pings will be sent.
// // Defaults to 10000, set to <= 0 to disable.
// pingInterval: 10000,
//
// // The interval in milliseconds at which analytics events
// // with the measured RTT will be sent. Defaults to 60000, set
// // to <= 0 to disable.
// analyticsInterval: 60000,
// }
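Uncommented, the two new config blocks above amount to the following (the values shown are the examples and defaults documented in the comments):

```javascript
/* eslint-disable no-var */
var config = {
    // Local recording. 'localrecording' (all lowercase) must also be added
    // to TOOLBAR_BUTTONS in interface_config.js for the button to show up.
    localRecording: {
        enabled: true,
        format: 'flac' // one of 'ogg', 'flac' or 'wav'
    },

    // End-to-end (participant to participant) ping.
    e2eping: {
        pingInterval: 10000,     // ms between pings; <= 0 disables
        analyticsInterval: 60000 // ms between RTT analytics events; <= 0 disables
    }
};
```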
// List of undocumented settings used in jitsi-meet
/**
_immediateReloadThreshold
@@ -396,6 +426,7 @@ var config = {
nick
startBitrate
*/
};
/* eslint-enable no-unused-vars, no-var */

View File

@@ -14,14 +14,9 @@
* Focused video thumbnail.
*/
&.videoContainerFocused {
transition-duration: 0.5s;
-webkit-transition-duration: 0.5s;
-webkit-animation-name: greyPulse;
-webkit-animation-duration: 2s;
-webkit-animation-iteration-count: 1;
border: $thumbnailVideoBorder solid $videoThumbnailSelected !important;
border: $thumbnailVideoBorder solid $videoThumbnailSelected;
box-shadow: inset 0 0 3px $videoThumbnailSelected,
0 0 3px $videoThumbnailSelected !important;
0 0 3px $videoThumbnailSelected;
}
.remotevideomenu > .icon-menu {
@@ -31,7 +26,7 @@
/**
* Hovered video thumbnail.
*/
&:hover {
&:hover:not(.videoContainerFocused):not(.active-speaker) {
cursor: hand;
border: $thumbnailVideoBorder solid $videoThumbnailHovered;
box-shadow: inset 0 0 3px $videoThumbnailHovered,

View File

@@ -0,0 +1,113 @@
/**
* CSS styles that are specific to the filmstrip that shows the thumbnail tiles.
*/
.tile-view {
/**
* Add a border around the active speaker to make the thumbnail easier to
* see.
*/
.active-speaker {
border: $thumbnailVideoBorder solid $videoThumbnailSelected;
box-shadow: inset 0 0 3px $videoThumbnailSelected,
0 0 3px $videoThumbnailSelected;
}
#filmstripRemoteVideos {
align-items: center;
box-sizing: border-box;
display: flex;
flex-direction: column;
height: 100vh;
width: 100vw;
}
.filmstrip__videos .videocontainer {
&:not(.active-speaker),
&:hover:not(.active-speaker) {
border: none;
box-shadow: none;
}
}
#remoteVideos {
/**
* Height is modified with an inline style in horizontal filmstrip mode
* so !important is used to override that.
*/
height: 100% !important;
width: 100%;
}
.filmstrip {
align-items: center;
display: flex;
height: 100%;
justify-content: center;
left: 0;
position: fixed;
top: 0;
width: 100%;
z-index: $filmstripVideosZ
}
/**
* Regardless of the user setting, do not let the filmstrip be in a hidden
* state.
*/
.filmstrip__videos.hidden {
display: block;
}
#filmstripRemoteVideos {
box-sizing: border-box;
/**
* Allow scrolling of the thumbnails.
*/
overflow: auto;
}
/**
* The size of the thumbnails should be set with javascript, based on
* desired column count and window width. The rows are created using flex
* and allowing the thumbnails to wrap.
*/
#filmstripRemoteVideosContainer {
align-content: center;
align-items: center;
box-sizing: border-box;
display: flex;
flex-wrap: wrap;
height: 100vh;
justify-content: center;
padding: 100px 0;
.videocontainer {
box-sizing: border-box;
display: block;
margin: 5px;
}
video {
object-fit: contain;
}
}
.has-overflow#filmstripRemoteVideosContainer {
align-content: baseline;
}
.has-overflow .videocontainer {
align-self: baseline;
}
/**
* Firefox flex acts a little differently. To make sure the bottom row of
* thumbnails is not overlapped by the horizontal toolbar, margin is added
* to the local thumbnail to keep it from the bottom of the screen. It is
* assumed the local thumbnail will always be on the bottom row.
*/
.has-overflow #localVideoContainer {
margin-bottom: 100px !important;
}
}

View File

@@ -0,0 +1,47 @@
/**
* Various overrides outside of the filmstrip to style the app to support a
* tiled thumbnail experience.
*/
.tile-view {
/**
* Let the avatar grow with the tile.
*/
.userAvatar {
max-height: initial;
max-width: initial;
}
/**
* Hide various features that should not be displayed while in tile view.
*/
#dominantSpeaker,
#filmstripLocalVideoThumbnail,
#largeVideoElementsContainer,
#sharedVideo,
.filmstrip__toolbar {
display: none;
}
#localConnectionMessage,
#remoteConnectionMessage,
.watermark {
z-index: $filmstripVideosZ + 1;
}
/**
* The following styling uses !important to override inline styles set with
* javascript.
*
* TODO: These overrides should be easier to remove and should be removed
* when the components are in react so their rendering is done declaratively,
* making conditional styling easier to apply.
*/
#largeVideoElementsContainer,
#remoteConnectionMessage,
#remotePresenceMessage {
display: none !important;
}
#largeVideoContainer {
background-color: $defaultBackground !important;
}
}

View File

@@ -45,6 +45,7 @@
@import 'modals/settings/settings';
@import 'modals/speaker_stats/speaker_stats';
@import 'modals/video-quality/video-quality';
@import 'modals/local-recording/local-recording';
@import 'videolayout_default';
@import 'notice';
@import 'popup_menu';
@@ -72,6 +73,8 @@
@import 'filmstrip/filmstrip_toolbar';
@import 'filmstrip/horizontal_filmstrip';
@import 'filmstrip/small_video';
@import 'filmstrip/tile_view';
@import 'filmstrip/tile_view_overrides';
@import 'filmstrip/vertical_filmstrip';
@import 'filmstrip/vertical_filmstrip_overrides';
@import 'unsupported-browser/main';

View File

@@ -0,0 +1,92 @@
.localrec-participant-stats {
list-style: none;
padding: 0;
width: 100%;
font-weight: 500;
.localrec-participant-stats-item__status-dot {
position: relative;
display: block;
width: 9px;
height: 9px;
border-radius: 50%;
margin: 0 auto;
&.status-on {
background: green;
}
&.status-off {
background: gray;
}
&.status-unknown {
background: darkgoldenrod;
}
&.status-error {
background: darkred;
}
}
.localrec-participant-stats-item__status,
.localrec-participant-stats-item__name,
.localrec-participant-stats-item__sessionid {
display: inline-block;
margin: 5px 0;
vertical-align: middle;
}
.localrec-participant-stats-item__status {
width: 5%;
}
.localrec-participant-stats-item__name {
width: 40%;
}
.localrec-participant-stats-item__sessionid {
width: 55%;
}
.localrec-participant-stats-item__name,
.localrec-participant-stats-item__sessionid {
overflow: hidden;
text-overflow: ellipsis;
white-space: nowrap;
}
}
.localrec-control-info-label {
font-weight: bold;
}
.localrec-control-info-label:after {
content: ' ';
}
.localrec-control-action-link {
display: inline-block;
line-height: 1.5em;
a {
cursor: pointer;
vertical-align: middle;
}
}
.localrec-control-action-link:before {
color: $linkFontColor;
content: '\2022';
font-size: 1.5em;
padding: 0 10px;
vertical-align: middle;
}
.localrec-control-action-link:first-child:before {
content: '';
padding: 0;
}
.localrec-control-action-links {
font-weight: bold;
margin-top: 10px;
white-space: nowrap;
}

View File

@@ -168,6 +168,10 @@
background: #FF5630;
}
.circular-label.local-rec {
background: #FF5630;
}
.circular-label.stream {
background: #0065FF;
}

View File

@@ -48,7 +48,8 @@ var interfaceConfig = {
'microphone', 'camera', 'closedcaptions', 'desktop', 'fullscreen',
'fodeviceselection', 'hangup', 'profile', 'info', 'chat', 'recording',
'livestreaming', 'etherpad', 'sharedvideo', 'settings', 'raisehand',
'videoquality', 'filmstrip', 'invite', 'feedback', 'stats', 'shortcuts'
'videoquality', 'filmstrip', 'invite', 'feedback', 'stats', 'shortcuts',
'tileview'
],
SETTINGS_SECTIONS: [ 'devices', 'language', 'moderator', 'profile' ],
@@ -172,6 +173,12 @@ var interfaceConfig = {
*/
RECENT_LIST_ENABLED: true
/**
* How many columns the tile view can expand to. The respected range is
* between 1 and 5.
*/
// TILE_VIEW_MAX_COLUMNS: 5,
/**
* Specify custom URL for downloading android mobile app.
*/

View File

@@ -11,7 +11,7 @@ PODS:
- React/Core (= 0.55.4)
- react-native-background-timer (2.0.0):
- React
- react-native-calendar-events (1.6.0):
- react-native-calendar-events (1.6.1):
- React
- react-native-fast-image (4.0.14):
- FLAnimatedImage

View File

@@ -43,7 +43,8 @@
"mute": "Mute or unmute your microphone",
"fullScreen": "View or exit full screen",
"videoMute": "Start or stop your camera",
"showSpeakerStats": "Show speaker stats"
"showSpeakerStats": "Show speaker stats",
"localRecording": "Show or hide local recording controls"
},
"welcomepage":{
"accessibilityLabel": {
@@ -87,6 +88,7 @@
"fullScreen": "Toggle full screen",
"hangup": "Leave the call",
"invite": "Invite people",
"localRecording": "Toggle local recording controls",
"lockRoom": "Toggle room lock",
"moreActions": "Toggle more actions menu",
"moreActionsMenu": "More actions menu",
@@ -102,6 +104,7 @@
"shortcuts": "Toggle shortcuts",
"speakerStats": "Toggle speaker statistics",
"toggleCamera": "Toggle camera",
"tileView": "Toggle tile view",
"videomute": "Toggle mute video"
},
"addPeople": "Add people to your call",
@@ -144,6 +147,7 @@
"raiseHand": "Raise / Lower your hand",
"shortcuts": "View shortcuts",
"speakerStats": "Speaker stats",
"tileViewToggle": "Toggle tile view",
"invite": "Invite people"
},
"chat":{
@@ -198,6 +202,7 @@
"packetloss": "Packet loss:",
"resolution": "Resolution:",
"framerate": "Frame rate:",
"e2e_rtt": "E2E RTT:",
"less": "Show less",
"more": "Show more",
"address": "Address:",
@@ -665,5 +670,34 @@
"decline": "Dismiss",
"productLabel": "from Jitsi Meet",
"videoCallTitle": "Incoming video call"
},
"localRecording": {
"localRecording": "Local Recording",
"dialogTitle": "Local Recording Controls",
"start": "Start Recording",
"stop": "Stop Recording",
"moderator": "Moderator",
"me": "Me",
"duration": "Duration",
"durationNA": "N/A",
"encoding": "Encoding",
"participantStats": "Participant Stats",
"participant": "Participant",
"sessionToken": "Session Token",
"clientState": {
"on": "On",
"off": "Off",
"unknown": "Unknown"
},
"messages": {
"engaged": "Local recording engaged.",
"finished": "Recording session __token__ finished. Please send the recorded file to the moderator.",
"finishedModerator": "Recording session __token__ finished. The recording of the local track has been saved. Please ask the other participants to submit their recordings.",
"notModerator": "You are not the moderator. You cannot start or stop local recording."
},
"yes": "Yes",
"no": "No",
"label": "LOR",
"labelToolTip": "Local recording is engaged"
}
}

View File

@@ -21,6 +21,7 @@ import {
getPinnedParticipant,
pinParticipant
} from '../react/features/base/participants';
import { setTileView } from '../react/features/video-layout';
import UIEvents from '../service/UI/UIEvents';
import VideoLayout from './UI/videolayout/VideoLayout';
@@ -117,6 +118,31 @@ class State {
}
}
/**
* A getter for this object instance to know the state of tile view.
*
* @returns {boolean} True if tile view is enabled.
*/
get tileViewEnabled() {
return this._tileViewEnabled;
}
/**
* A setter for {@link tileViewEnabled}. Fires a property change event for
* other participants to follow.
*
* @param {boolean} b - Whether or not tile view is enabled.
* @returns {void}
*/
set tileViewEnabled(b) {
const oldValue = this._tileViewEnabled;
if (oldValue !== b) {
this._tileViewEnabled = b;
this._firePropertyChange('tileViewEnabled', oldValue, b);
}
}
/**
* Invokes {_propertyChangeCallback} to notify it that {property} had its
* value changed from {oldValue} to {newValue}.
@@ -189,6 +215,10 @@ class FollowMe {
this._sharedDocumentToggled
.bind(this, this._UI.getSharedDocumentManager().isVisible());
}
this._tileViewToggled.bind(
this,
APP.store.getState()['features/video-layout'].tileViewEnabled);
}
/**
@@ -214,6 +244,10 @@ class FollowMe {
this.sharedDocEventHandler = this._sharedDocumentToggled.bind(this);
this._UI.addListener(UIEvents.TOGGLED_SHARED_DOCUMENT,
this.sharedDocEventHandler);
this.tileViewEventHandler = this._tileViewToggled.bind(this);
this._UI.addListener(UIEvents.TOGGLED_TILE_VIEW,
this.tileViewEventHandler);
}
/**
@@ -227,6 +261,8 @@ class FollowMe {
this.sharedDocEventHandler);
this._UI.removeListener(UIEvents.PINNED_ENDPOINT,
this.pinnedEndpointEventHandler);
this._UI.removeListener(UIEvents.TOGGLED_TILE_VIEW,
this.tileViewEventHandler);
}
/**
@@ -266,6 +302,18 @@ class FollowMe {
this._local.sharedDocumentVisible = sharedDocumentVisible;
}
/**
* Notifies this instance that the tile view mode has been enabled or
* disabled.
*
* @param {boolean} enabled - True if tile view has been enabled, false
* if has been disabled.
* @returns {void}
*/
_tileViewToggled(enabled) {
this._local.tileViewEnabled = enabled;
}
/**
* Changes the nextOnStage property value.
*
@@ -316,7 +364,8 @@ class FollowMe {
attributes: {
filmstripVisible: local.filmstripVisible,
nextOnStage: local.nextOnStage,
sharedDocumentVisible: local.sharedDocumentVisible
sharedDocumentVisible: local.sharedDocumentVisible,
tileViewEnabled: local.tileViewEnabled
}
});
}
@@ -355,6 +404,7 @@ class FollowMe {
this._onFilmstripVisible(attributes.filmstripVisible);
this._onNextOnStage(attributes.nextOnStage);
this._onSharedDocumentVisible(attributes.sharedDocumentVisible);
this._onTileViewEnabled(attributes.tileViewEnabled);
}
/**
@@ -434,6 +484,21 @@ class FollowMe {
}
}
/**
* Process a tile view enabled / disabled event received from FOLLOW-ME.
*
* @param {boolean} enabled - Whether or not tile view should be shown.
* @private
* @returns {void}
*/
_onTileViewEnabled(enabled) {
if (typeof enabled === 'undefined') {
return;
}
APP.store.dispatch(setTileView(enabled === 'true'));
}
/**
* Pins / unpins the video thumbnail given by clickId.
*

View File

@@ -1,4 +1,6 @@
/* global $ */
/* global $, APP */
import { shouldDisplayTileView } from '../../../react/features/video-layout';
import SmallVideo from '../videolayout/SmallVideo';
const logger = require('jitsi-meet-logger').getLogger(__filename);
@@ -64,7 +66,9 @@ SharedVideoThumb.prototype.createContainer = function(spanId) {
* The thumb click handler.
*/
SharedVideoThumb.prototype.videoClick = function() {
this._togglePin();
if (!shouldDisplayTileView(APP.store.getState())) {
this._togglePin();
}
};
/**

View File

@@ -1,6 +1,13 @@
/* global $, APP, interfaceConfig */
import { setFilmstripVisible } from '../../../react/features/filmstrip';
import {
LAYOUTS,
getCurrentLayout,
getMaxColumnCount,
getTileViewGridDimensions,
shouldDisplayTileView
} from '../../../react/features/video-layout';
import UIEvents from '../../../service/UI/UIEvents';
import UIUtil from '../util/UIUtil';
@@ -233,6 +240,10 @@ const Filmstrip = {
* @returns {*|{localVideo, remoteVideo}}
*/
calculateThumbnailSize() {
if (shouldDisplayTileView(APP.store.getState())) {
return this._calculateThumbnailSizeForTileView();
}
const availableSizes = this.calculateAvailableSize();
const width = availableSizes.availableWidth;
const height = availableSizes.availableHeight;
@@ -247,11 +258,10 @@ const Filmstrip = {
* @returns {{availableWidth: number, availableHeight: number}}
*/
calculateAvailableSize() {
let availableHeight = interfaceConfig.FILM_STRIP_MAX_HEIGHT;
const thumbs = this.getThumbs(true);
const numvids = thumbs.remoteThumbs.length;
const localVideoContainer = $('#localVideoContainer');
const state = APP.store.getState();
const currentLayout = getCurrentLayout(state);
const isHorizontalFilmstripView
= currentLayout === LAYOUTS.HORIZONTAL_FILMSTRIP_VIEW;
/**
* If the videoAreaAvailableWidth is set we use this one to calculate
@@ -268,10 +278,15 @@ const Filmstrip = {
- UIUtil.parseCssInt(this.filmstrip.css('borderRightWidth'), 10)
- 5;
let availableHeight = interfaceConfig.FILM_STRIP_MAX_HEIGHT;
let availableWidth = videoAreaAvailableWidth;
const thumbs = this.getThumbs(true);
// If local thumb is not hidden
if (thumbs.localThumb) {
const localVideoContainer = $('#localVideoContainer');
availableWidth = Math.floor(
videoAreaAvailableWidth - (
UIUtil.parseCssInt(
@@ -289,10 +304,12 @@ const Filmstrip = {
);
}
// If the number of videos is 0 or undefined or we're in vertical
// If the number of videos is 0 or undefined or we're not in horizontal
// filmstrip mode we don't need to calculate further any adjustments
// to width based on the number of videos present.
if (numvids && !interfaceConfig.VERTICAL_FILMSTRIP) {
const numvids = thumbs.remoteThumbs.length;
if (numvids && isHorizontalFilmstripView) {
const remoteVideoContainer = thumbs.remoteThumbs.eq(0);
availableWidth = Math.floor(
@@ -322,8 +339,10 @@ const Filmstrip = {
availableHeight
= Math.min(maxHeight, window.innerHeight - 18);
return { availableWidth,
availableHeight };
return {
availableHeight,
availableWidth
};
},
/**
@@ -434,6 +453,51 @@ const Filmstrip = {
};
},
/**
* Calculates the size for thumbnails when in tile view layout.
*
* @returns {{localVideo, remoteVideo}}
*/
_calculateThumbnailSizeForTileView() {
const tileAspectRatio = 16 / 9;
// The distance from the top and bottom of the screen, as set by CSS, to
// avoid overlapping UI elements.
const topBottomPadding = 200;
// Minimum space to keep between the sides of the tiles and the sides
// of the window.
const sideMargins = 30 * 2;
const state = APP.store.getState();
const viewWidth = document.body.clientWidth - sideMargins;
const viewHeight = document.body.clientHeight - topBottomPadding;
const {
columns,
visibleRows
} = getTileViewGridDimensions(state, getMaxColumnCount());
const initialWidth = viewWidth / columns;
const aspectRatioHeight = initialWidth / tileAspectRatio;
const heightOfEach = Math.min(
aspectRatioHeight,
viewHeight / visibleRows);
const widthOfEach = tileAspectRatio * heightOfEach;
return {
localVideo: {
thumbWidth: widthOfEach,
thumbHeight: heightOfEach
},
remoteVideo: {
thumbWidth: widthOfEach,
thumbHeight: heightOfEach
}
};
},
/**
* Resizes thumbnails
* @param local
@@ -443,6 +507,28 @@ const Filmstrip = {
*/
// eslint-disable-next-line max-params
resizeThumbnails(local, remote, forceUpdate = false) {
const state = APP.store.getState();
if (shouldDisplayTileView(state)) {
// The size of the side margins for each tile as set in CSS.
const sideMargins = 10 * 2;
const {
columns,
rows
} = getTileViewGridDimensions(state, getMaxColumnCount());
const hasOverflow = rows > columns;
// Width is set so that the flex layout can automatically wrap
// tiles onto new rows.
this.filmstripRemoteVideos.css({
width: (local.thumbWidth * columns) + (columns * sideMargins)
});
this.filmstripRemoteVideos.toggleClass('has-overflow', hasOverflow);
} else {
this.filmstripRemoteVideos.css('width', '');
}
const thumbs = this.getThumbs(!forceUpdate);
if (thumbs.localThumb) {
@@ -466,13 +552,15 @@ const Filmstrip = {
});
}
const currentLayout = getCurrentLayout(APP.store.getState());
// Let CSS take care of height in vertical filmstrip mode.
if (interfaceConfig.VERTICAL_FILMSTRIP) {
if (currentLayout === LAYOUTS.VERTICAL_FILMSTRIP_VIEW) {
$('#filmstripLocalVideo').css({
// adds 4 px because of small video 2px border
width: `${local.thumbWidth + 4}px`
});
} else {
} else if (currentLayout === LAYOUTS.HORIZONTAL_FILMSTRIP_VIEW) {
this.filmstrip.css({
// adds 4 px because of small video 2px border
height: `${remote.thumbHeight + 4}px`

View File

@@ -11,6 +11,7 @@ import {
getAvatarURLByParticipantId
} from '../../../react/features/base/participants';
import { updateSettings } from '../../../react/features/base/settings';
import { shouldDisplayTileView } from '../../../react/features/video-layout';
/* eslint-enable no-unused-vars */
const logger = require('jitsi-meet-logger').getLogger(__filename);
@@ -26,7 +27,7 @@ function LocalVideo(VideoLayout, emitter, streamEndedCallback) {
this.streamEndedCallback = streamEndedCallback;
this.container = this.createContainer();
this.$container = $(this.container);
$('#filmstripLocalVideoThumbnail').append(this.container);
this.updateDOMLocation();
this.localVideoId = null;
this.bindHoverHandler();
@@ -109,16 +110,7 @@ LocalVideo.prototype.changeVideo = function(stream) {
this.localVideoId = `localVideo_${stream.getId()}`;
const localVideoContainer = document.getElementById('localVideoWrapper');
ReactDOM.render(
<Provider store = { APP.store }>
<VideoTrack
id = { this.localVideoId }
videoTrack = {{ jitsiTrack: stream }} />
</Provider>,
localVideoContainer
);
this._updateVideoElement();
// eslint-disable-next-line eqeqeq
const isVideo = stream.videoType != 'desktop';
@@ -128,12 +120,14 @@ LocalVideo.prototype.changeVideo = function(stream) {
this.setFlipX(isVideo ? settings.localFlipX : false);
const endedHandler = () => {
const localVideoContainer
= document.getElementById('localVideoWrapper');
// Only remove if there is no video and not a transition state.
// Previous non-react logic created a new video element with each track
// removal whereas react reuses the video component so it could be the
// stream ended but a new one is being used.
if (this.videoStream.isEnded()) {
if (localVideoContainer && this.videoStream.isEnded()) {
ReactDOM.unmountComponentAtNode(localVideoContainer);
}
@@ -235,6 +229,29 @@ LocalVideo.prototype._enableDisableContextMenu = function(enable) {
}
};
/**
* Places the {@code LocalVideo} in the DOM based on the current video layout.
*
* @returns {void}
*/
LocalVideo.prototype.updateDOMLocation = function() {
if (!this.container) {
return;
}
if (this.container.parentElement) {
this.container.parentElement.removeChild(this.container);
}
const appendTarget = shouldDisplayTileView(APP.store.getState())
? document.getElementById('localVideoTileViewContainer')
: document.getElementById('filmstripLocalVideoThumbnail');
appendTarget && appendTarget.appendChild(this.container);
this._updateVideoElement();
};
/**
* Callback invoked when the thumbnail is clicked. Will directly call
* VideoLayout to handle thumbnail click if certain elements have not been
@@ -258,7 +275,9 @@ LocalVideo.prototype._onContainerClick = function(event) {
= $source.parents('.displayNameContainer').length > 0;
const clickedOnPopover = $source.parents('.popover').length > 0
|| classList.contains('popover');
const ignoreClick = clickedOnDisplayName || clickedOnPopover;
const ignoreClick = clickedOnDisplayName
|| clickedOnPopover
|| shouldDisplayTileView(APP.store.getState());
if (event.stopPropagation && !ignoreClick) {
event.stopPropagation();
@@ -269,4 +288,28 @@ LocalVideo.prototype._onContainerClick = function(event) {
}
};
/**
* Renders the React Element for displaying video in {@code LocalVideo}.
*
*/
LocalVideo.prototype._updateVideoElement = function() {
const localVideoContainer = document.getElementById('localVideoWrapper');
ReactDOM.render(
<Provider store = { APP.store }>
<VideoTrack
id = 'localVideo_container'
videoTrack = {{ jitsiTrack: this.videoStream }} />
</Provider>,
localVideoContainer
);
// Ensure the video gets play() called on it. This may be necessary in the
// case where the local video container was moved and re-attached, in which
// case video does not autoplay.
const video = this.container.querySelector('video');
video && video.play();
};
export default LocalVideo;


@@ -20,6 +20,11 @@ import {
REMOTE_CONTROL_MENU_STATES,
RemoteVideoMenuTriggerButton
} from '../../../react/features/remote-video-menu';
import {
LAYOUTS,
getCurrentLayout,
shouldDisplayTileView
} from '../../../react/features/video-layout';
/* eslint-enable no-unused-vars */
const logger = require('jitsi-meet-logger').getLogger(__filename);
@@ -163,8 +168,17 @@ RemoteVideo.prototype._generatePopupContent = function() {
const onVolumeChange = this._setAudioVolume;
const { isModerator } = APP.conference;
const participantID = this.id;
const menuPosition = interfaceConfig.VERTICAL_FILMSTRIP
? 'left bottom' : 'top center';
const currentLayout = getCurrentLayout(APP.store.getState());
let remoteMenuPosition;
if (currentLayout === LAYOUTS.TILE_VIEW) {
remoteMenuPosition = 'left top';
} else if (currentLayout === LAYOUTS.VERTICAL_FILMSTRIP_VIEW) {
remoteMenuPosition = 'left bottom';
} else {
remoteMenuPosition = 'top center';
}
ReactDOM.render(
<Provider store = { APP.store }>
@@ -174,7 +188,7 @@ RemoteVideo.prototype._generatePopupContent = function() {
initialVolumeValue = { initialVolumeValue }
isAudioMuted = { this.isAudioMuted }
isModerator = { isModerator }
menuPosition = { menuPosition }
menuPosition = { remoteMenuPosition }
onMenuDisplay
= {this._onRemoteVideoMenuDisplay.bind(this)}
onRemoteControlToggle = { onRemoteControlToggle }
@@ -613,7 +627,8 @@ RemoteVideo.prototype._onContainerClick = function(event) {
const { classList } = event.target;
const ignoreClick = $source.parents('.popover').length > 0
|| classList.contains('popover');
|| classList.contains('popover')
|| shouldDisplayTileView(APP.store.getState());
if (!ignoreClick) {
this._togglePin();


@@ -27,6 +27,11 @@ import {
RaisedHandIndicator,
VideoMutedIndicator
} from '../../../react/features/filmstrip';
import {
LAYOUTS,
getCurrentLayout,
shouldDisplayTileView
} from '../../../react/features/video-layout';
/* eslint-enable no-unused-vars */
const logger = require('jitsi-meet-logger').getLogger(__filename);
@@ -328,7 +333,21 @@ SmallVideo.prototype.setVideoMutedView = function(isMuted) {
SmallVideo.prototype.updateStatusBar = function() {
const statusBarContainer
= this.container.querySelector('.videocontainer__toolbar');
const tooltipPosition = interfaceConfig.VERTICAL_FILMSTRIP ? 'left' : 'top';
if (!statusBarContainer) {
return;
}
const currentLayout = getCurrentLayout(APP.store.getState());
let tooltipPosition;
if (currentLayout === LAYOUTS.TILE_VIEW) {
tooltipPosition = 'right';
} else if (currentLayout === LAYOUTS.VERTICAL_FILMSTRIP_VIEW) {
tooltipPosition = 'left';
} else {
tooltipPosition = 'top';
}
ReactDOM.render(
<I18nextProvider i18n = { i18next }>
@@ -547,7 +566,8 @@ SmallVideo.prototype.isVideoPlayable = function() {
*/
SmallVideo.prototype.selectDisplayMode = function() {
// Display name is always and only displayed when user is on the stage
if (this.isCurrentlyOnLargeVideo()) {
if (this.isCurrentlyOnLargeVideo()
&& !shouldDisplayTileView(APP.store.getState())) {
return this.isVideoPlayable() && !APP.conference.isAudioOnly()
? DISPLAY_BLACKNESS_WITH_NAME : DISPLAY_AVATAR_WITH_NAME;
} else if (this.isVideoPlayable()
@@ -685,7 +705,10 @@ SmallVideo.prototype.showDominantSpeakerIndicator = function(show) {
this._showDominantSpeaker = show;
this.$container.toggleClass('active-speaker', this._showDominantSpeaker);
this.updateIndicators();
this.updateView();
};
/**
@@ -765,6 +788,18 @@ SmallVideo.prototype.initBrowserSpecificProperties = function() {
}
};
/**
* Helper function for re-rendering multiple react components of the small
* video.
*
* @returns {void}
*/
SmallVideo.prototype.rerender = function() {
this.updateIndicators();
this.updateStatusBar();
this.updateView();
};
/**
* Updates the React element responsible for showing connection status, dominant
* speaker, and raised hand icons. Uses instance variables to get the necessary
@@ -784,7 +819,19 @@ SmallVideo.prototype.updateIndicators = function() {
const iconSize = UIUtil.getIndicatorFontSize();
const showConnectionIndicator = this.videoIsHovered
|| !interfaceConfig.CONNECTION_INDICATOR_AUTO_HIDE_ENABLED;
const tooltipPosition = interfaceConfig.VERTICAL_FILMSTRIP ? 'left' : 'top';
const currentLayout = getCurrentLayout(APP.store.getState());
let statsPopoverPosition, tooltipPosition;
if (currentLayout === LAYOUTS.TILE_VIEW) {
statsPopoverPosition = 'right top';
tooltipPosition = 'right';
} else if (currentLayout === LAYOUTS.VERTICAL_FILMSTRIP_VIEW) {
statsPopoverPosition = this.statsPopoverLocation;
tooltipPosition = 'left';
} else {
statsPopoverPosition = this.statsPopoverLocation;
tooltipPosition = 'top';
}
ReactDOM.render(
<I18nextProvider i18n = { i18next }>
@@ -799,7 +846,7 @@ SmallVideo.prototype.updateIndicators = function() {
enableStatsDisplay
= { !interfaceConfig.filmStripOnly }
statsPopoverPosition
= { this.statsPopoverLocation }
= { statsPopoverPosition }
userID = { this.id } />
: null }
{ this._showRaisedHand


@@ -1,6 +1,10 @@
/* global APP, $, interfaceConfig */
const logger = require('jitsi-meet-logger').getLogger(__filename);
import {
getNearestReceiverVideoQualityLevel,
setMaxReceiverVideoQuality
} from '../../../react/features/base/conference';
import {
JitsiParticipantConnectionStatus
} from '../../../react/features/base/lib-jitsi-meet';
@@ -9,6 +13,9 @@ import {
getPinnedParticipant,
pinParticipant
} from '../../../react/features/base/participants';
import {
shouldDisplayTileView
} from '../../../react/features/video-layout';
import { SHARED_VIDEO_CONTAINER_TYPE } from '../shared_video/SharedVideo';
import SharedVideoThumb from '../shared_video/SharedVideoThumb';
@@ -594,12 +601,19 @@ const VideoLayout = {
Filmstrip.resizeThumbnails(localVideo, remoteVideo, forceUpdate);
if (shouldDisplayTileView(APP.store.getState())) {
const height
= (localVideo && localVideo.thumbHeight)
|| (remoteVideo && remoteVideo.thumbHeight)
|| 0;
const qualityLevel = getNearestReceiverVideoQualityLevel(height);
APP.store.dispatch(setMaxReceiverVideoQuality(qualityLevel));
}
if (onComplete && typeof onComplete === 'function') {
onComplete();
}
return { localVideo,
remoteVideo };
},
/**
@@ -1142,6 +1156,22 @@ const VideoLayout = {
);
},
/**
* Helper method to invoke when the video layout has changed and elements
* have to be re-arranged and resized.
*
* @returns {void}
*/
refreshLayout() {
localVideoThumbnail && localVideoThumbnail.updateDOMLocation();
VideoLayout.resizeVideoArea();
localVideoThumbnail && localVideoThumbnail.rerender();
Object.values(remoteVideos).forEach(
remoteVideo => remoteVideo.rerender()
);
},
/**
* Triggers an update of large video if the passed in participant is
* currently displayed on large video.

package-lock.json generated

@@ -6449,8 +6449,8 @@
}
},
"eslint-config-jitsi": {
"version": "github:jitsi/eslint-config-jitsi#3d193df6476a73f827582e137a67a8612130a455",
"from": "github:jitsi/eslint-config-jitsi#v0.1.0",
"version": "github:jitsi/eslint-config-jitsi#7474f6668515eb5852f1273dc5a50b940a550d3f",
"from": "github:jitsi/eslint-config-jitsi#7474f6668515eb5852f1273dc5a50b940a550d3f",
"dev": true
},
"eslint-import-resolver-node": {
@@ -9719,8 +9719,8 @@
}
},
"lib-jitsi-meet": {
"version": "github:jitsi/lib-jitsi-meet#2be752fc88ff71e454c6b9178b21a33b59c53f41",
"from": "github:jitsi/lib-jitsi-meet#2be752fc88ff71e454c6b9178b21a33b59c53f41",
"version": "github:jitsi/lib-jitsi-meet#4a28a196160411d657518022de8bded7c02ad679",
"from": "github:jitsi/lib-jitsi-meet#4a28a196160411d657518022de8bded7c02ad679",
"requires": {
"@jitsi/sdp-interop": "0.1.13",
"@jitsi/sdp-simulcast": "0.2.1",
@@ -9736,6 +9736,10 @@
"yaeti": "1.0.1"
}
},
"libflacjs": {
"version": "github:mmig/libflac.js#93d37e7f811f01cf7d8b6a603e38bd3c3810907d",
"from": "github:mmig/libflac.js#93d37e7f811f01cf7d8b6a603e38bd3c3810907d"
},
"load-json-file": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/load-json-file/-/load-json-file-2.0.0.tgz",
@@ -12712,8 +12716,8 @@
"integrity": "sha512-vLNJIedXQZN4p3ChFsAgVHacnJqQMnLl+wBsnZuliRkmsjEHo8kQOA9fnLih/OoiDi1O3eHQvXC5L8f+RYiKgw=="
},
"react-native-calendar-events": {
"version": "github:jitsi/react-native-calendar-events#cad37355f36d17587d84af72b0095e8cc5fd3df9",
"from": "github:jitsi/react-native-calendar-events#cad37355f36d17587d84af72b0095e8cc5fd3df9"
"version": "github:jitsi/react-native-calendar-events#03babdb99e7fea3539796804cecdef8a907f2a3f",
"from": "github:jitsi/react-native-calendar-events#03babdb99e7fea3539796804cecdef8a907f2a3f"
},
"react-native-callstats": {
"version": "3.52.0",
@@ -12802,8 +12806,8 @@
}
},
"react-native-webrtc": {
"version": "github:jitsi/react-native-webrtc#6b0ea124414f6f5b7f234a7d5cec75d30f5f6312",
"from": "github:jitsi/react-native-webrtc#6b0ea124414f6f5b7f234a7d5cec75d30f5f6312",
"version": "github:jitsi/react-native-webrtc#bed49210a51cf53081954028589d720381e7cf40",
"from": "github:jitsi/react-native-webrtc#bed49210a51cf53081954028589d720381e7cf40",
"requires": {
"base64-js": "^1.1.2",
"event-target-shim": "^1.0.5",
@@ -14298,9 +14302,9 @@
}
},
"sdp": {
"version": "2.7.4",
"resolved": "https://registry.npmjs.org/sdp/-/sdp-2.7.4.tgz",
"integrity": "sha512-0+wTfgvUUEGcvvFoHIC0aiGbx6gzwAUm8FkKt5Oqqkjf9mEEDLgwnoDKX7MYTGXrNNwzikVbutJ+OVNAGmJBQw=="
"version": "2.8.0",
"resolved": "https://registry.npmjs.org/sdp/-/sdp-2.8.0.tgz",
"integrity": "sha512-wRSES07rAwKWAR7aev9UuClT7kdf9ZTdeUK5gTgHue9vlhs19Fbm3ccNEGJO4y2IitH4/JzS4sdzyPl6H2KQLw=="
},
"sdp-transform": {
"version": "2.3.0",


@@ -47,7 +47,8 @@
"js-md5": "0.6.1",
"jsc-android": "224109.1.0",
"jwt-decode": "2.2.0",
"lib-jitsi-meet": "github:jitsi/lib-jitsi-meet#2be752fc88ff71e454c6b9178b21a33b59c53f41",
"lib-jitsi-meet": "github:jitsi/lib-jitsi-meet#4a28a196160411d657518022de8bded7c02ad679",
"libflacjs": "github:mmig/libflac.js#93d37e7f811f01cf7d8b6a603e38bd3c3810907d",
"lodash": "4.17.4",
"moment": "2.19.4",
"moment-duration-format": "2.2.2",
@@ -58,7 +59,7 @@
"react-i18next": "4.8.0",
"react-native": "0.55.4",
"react-native-background-timer": "2.0.0",
"react-native-calendar-events": "github:jitsi/react-native-calendar-events#cad37355f36d17587d84af72b0095e8cc5fd3df9",
"react-native-calendar-events": "github:jitsi/react-native-calendar-events#03babdb99e7fea3539796804cecdef8a907f2a3f",
"react-native-callstats": "3.52.0",
"react-native-fast-image": "4.0.14",
"react-native-immersive": "1.1.0",
@@ -69,7 +70,7 @@
"react-native-prompt": "1.0.0",
"react-native-sound": "0.10.9",
"react-native-vector-icons": "4.4.2",
"react-native-webrtc": "github:jitsi/react-native-webrtc#6b0ea124414f6f5b7f234a7d5cec75d30f5f6312",
"react-native-webrtc": "github:jitsi/react-native-webrtc#bed49210a51cf53081954028589d720381e7cf40",
"react-redux": "5.0.7",
"redux": "4.0.0",
"redux-thunk": "2.2.0",
@@ -87,7 +88,7 @@
"clean-css": "3.4.25",
"css-loader": "0.28.7",
"eslint": "4.12.1",
"eslint-config-jitsi": "github:jitsi/eslint-config-jitsi#v0.1.0",
"eslint-config-jitsi": "github:jitsi/eslint-config-jitsi#7474f6668515eb5852f1273dc5a50b940a550d3f",
"eslint-plugin-flowtype": "2.39.1",
"eslint-plugin-import": "2.8.0",
"eslint-plugin-jsdoc": "3.2.0",


@@ -8,7 +8,8 @@ import {
AVATAR_ID_COMMAND,
AVATAR_URL_COMMAND,
EMAIL_COMMAND,
JITSI_CONFERENCE_URL_KEY
JITSI_CONFERENCE_URL_KEY,
VIDEO_QUALITY_LEVELS
} from './constants';
const logger = require('jitsi-meet-logger').getLogger(__filename);
@@ -102,6 +103,38 @@ export function getCurrentConference(stateful: Function | Object) {
: joining);
}
/**
* Finds the nearest match for the passed in {@link availableHeight} to an
* enumerated value in {@code VIDEO_QUALITY_LEVELS}.
*
* @param {number} availableHeight - The height to which a matching video
* quality level should be found.
* @returns {number} The closest matching value from
* {@code VIDEO_QUALITY_LEVELS}.
*/
export function getNearestReceiverVideoQualityLevel(availableHeight: number) {
const qualityLevels = [
VIDEO_QUALITY_LEVELS.HIGH,
VIDEO_QUALITY_LEVELS.STANDARD,
VIDEO_QUALITY_LEVELS.LOW
];
let selectedLevel = qualityLevels[0];
for (let i = 1; i < qualityLevels.length; i++) {
const previousValue = qualityLevels[i - 1];
const currentValue = qualityLevels[i];
const diffWithCurrent = Math.abs(availableHeight - currentValue);
const diffWithPrevious = Math.abs(availableHeight - previousValue);
if (diffWithCurrent < diffWithPrevious) {
selectedLevel = currentValue;
}
}
return selectedLevel;
}
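The nearest-match walk above can be exercised in isolation. A minimal sketch, assuming placeholder values of 720/360/180 for the HIGH/STANDARD/LOW entries of `VIDEO_QUALITY_LEVELS` (the real constants live in the feature's `constants.js` and may differ):

```javascript
// Placeholder values for VIDEO_QUALITY_LEVELS; illustration only.
const VIDEO_QUALITY_LEVELS = {
    HIGH: 720,
    STANDARD: 360,
    LOW: 180
};

// Same logic as the function above: compare each level against its
// predecessor and keep whichever is closer to the available height.
function getNearestReceiverVideoQualityLevel(availableHeight) {
    const qualityLevels = [
        VIDEO_QUALITY_LEVELS.HIGH,
        VIDEO_QUALITY_LEVELS.STANDARD,
        VIDEO_QUALITY_LEVELS.LOW
    ];
    let selectedLevel = qualityLevels[0];

    for (let i = 1; i < qualityLevels.length; i++) {
        const previousValue = qualityLevels[i - 1];
        const currentValue = qualityLevels[i];

        if (Math.abs(availableHeight - currentValue)
                < Math.abs(availableHeight - previousValue)) {
            selectedLevel = currentValue;
        }
    }

    return selectedLevel;
}

console.log(getNearestReceiverVideoQualityLevel(200)); // 180 with these values
console.log(getNearestReceiverVideoQualityLevel(500)); // 360 with these values
```

Because the levels are sorted in descending order, comparing only consecutive pairs is enough to find the global nearest value.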
/**
* Handle an error thrown by the backend (i.e. lib-jitsi-meet) while
* manipulating a conference participant (e.g. pin or select participant).


@@ -96,6 +96,7 @@ const WHITELISTED_KEYS = [
'disableRtx',
'disableSuspendVideo',
'displayJids',
'e2eping',
'enableDisplayNameInStats',
'enableLayerSuspension',
'enableLipSync',


@@ -113,7 +113,7 @@ class Dialog extends AbstractDialog<Props, State> {
[_TAG_KEY]: _SUBMIT_TEXT_TAG_VALUE
};
let el: ?React$Element<*> = ( // eslint-disable-line no-extra-parens
let el: ?React$Element<*> = (
<Prompt
cancelButtonTextStyle = { cancelButtonTextStyle }
cancelText = { t(cancelTitleKey) }


@@ -14,6 +14,7 @@ export const JitsiConnectionErrors = JitsiMeetJS.errors.connection;
export const JitsiConnectionEvents = JitsiMeetJS.events.connection;
export const JitsiConnectionQualityEvents
= JitsiMeetJS.events.connectionQuality;
export const JitsiE2ePingEvents = JitsiMeetJS.events.e2eping;
export const JitsiMediaDevicesEvents = JitsiMeetJS.events.mediaDevices;
export const JitsiParticipantConnectionStatus
= JitsiMeetJS.constants.participantConnectionStatus;


@@ -93,7 +93,7 @@ export default class Video extends Component<*> {
? 'contain'
: (style && style.objectFit) || 'cover';
const rtcView
= ( // eslint-disable-line no-extra-parens
= (
<RTCView
mirror = { this.props.mirror }
objectFit = { objectFit }


@@ -41,7 +41,7 @@ class InlineDialogFailure extends Component<*> {
const supportString = t('inlineDialogFailure.supportMsg');
const supportLinkElem
= supportLink
? ( // eslint-disable-line no-extra-parens
? (
<div className = 'inline-dialog-error-text'>
<span>{ supportString.padEnd(supportString.length + 1) }
</span>


@@ -244,7 +244,7 @@ class MultiSelectAutocomplete extends Component {
if (!this.state.error) {
return null;
}
const content = ( // eslint-disable-line no-extra-parens
const content = (
<div className = 'autocomplete-error'>
<InlineDialogFailure
onRetry = { this._onRetry } />


@@ -60,7 +60,6 @@ export default class SectionList extends Component<Props> {
*/
if (sections) {
return (
/* eslint-disable no-extra-parens */
<Container
className = 'navigate-section-list'>
{
@@ -83,7 +82,6 @@ export default class SectionList extends Component<Props> {
)
}
</Container>
/* eslint-enable no-extra-parens */
);
}


@@ -105,7 +105,7 @@ class Watermarks extends Component<*, *> {
let reactElement = null;
if (this.state.showBrandWatermark) {
reactElement = ( // eslint-disable-line no-extra-parens
reactElement = (
<div
className = 'watermark rightwatermark'
style = { _RIGHT_WATERMARK_STYLE } />
@@ -114,7 +114,7 @@ class Watermarks extends Component<*, *> {
const { brandWatermarkLink } = this.state;
if (brandWatermarkLink) {
reactElement = ( // eslint-disable-line no-extra-parens
reactElement = (
<a
href = { brandWatermarkLink }
target = '_new'>
@@ -144,7 +144,7 @@ class Watermarks extends Component<*, *> {
const { jitsiWatermarkLink } = this.state;
if (jitsiWatermarkLink) {
reactElement = ( // eslint-disable-line no-extra-parens
reactElement = (
<a
href = { jitsiWatermarkLink }
target = '_new'>


@@ -67,7 +67,7 @@ export default class ToolboxItem extends AbstractToolboxItem<Props> {
// XXX TouchableHighlight requires 1 child. If there's a need to
// show both the icon and the label, then these two need to be
// wrapped in a View.
children = ( // eslint-disable-line no-extra-parens
children = (
<View style = { style }>
{ children }
<Text style = { styles && styles.labelStyle }>


@@ -35,7 +35,6 @@ export default class ToolboxItem extends AbstractToolboxItem<Props> {
};
const elementType = showLabel ? 'li' : 'div';
const useTooltip = this.tooltip && this.tooltip.length > 0;
// eslint-disable-next-line no-extra-parens
let children = (
<Fragment>
{ this._renderIcon() }
@@ -47,7 +46,6 @@ export default class ToolboxItem extends AbstractToolboxItem<Props> {
);
if (useTooltip) {
// eslint-disable-next-line no-extra-parens
children = (
<Tooltip
content = { this.tooltip }


@@ -4,6 +4,8 @@ import _ from 'lodash';
import React, { Component } from 'react';
import { connect as reactReduxConnect } from 'react-redux';
import VideoLayout from '../../../../modules/UI/videolayout/VideoLayout';
import { obtainConfig } from '../../base/config';
import { connect, disconnect } from '../../base/connection';
import { DialogContainer } from '../../base/dialog';
@@ -13,6 +15,12 @@ import { CalleeInfoContainer } from '../../invite';
import { LargeVideo } from '../../large-video';
import { NotificationsContainer } from '../../notifications';
import { SidePanel } from '../../side-panel';
import {
LAYOUTS,
getCurrentLayout,
shouldDisplayTileView
} from '../../video-layout';
import { default as Notice } from './Notice';
import {
Toolbox,
@@ -49,9 +57,10 @@ const FULL_SCREEN_EVENTS = [
* @private
* @type {Object}
*/
const LAYOUT_CLASSES = {
HORIZONTAL_FILMSTRIP: 'horizontal-filmstrip',
VERTICAL_FILMSTRIP: 'vertical-filmstrip'
const LAYOUT_CLASSNAMES = {
[LAYOUTS.HORIZONTAL_FILMSTRIP_VIEW]: 'horizontal-filmstrip',
[LAYOUTS.TILE_VIEW]: 'tile-view',
[LAYOUTS.VERTICAL_FILMSTRIP_VIEW]: 'vertical-filmstrip'
};
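The replacement mapping uses computed property names so the keys resolve against the `LAYOUTS` enum at module load. A small sketch with assumed placeholder strings for the `LAYOUTS` constants (the real values are defined in `react/features/video-layout`):

```javascript
// Placeholder strings for the LAYOUTS constants; illustration only.
const LAYOUTS = {
    HORIZONTAL_FILMSTRIP_VIEW: 'horizontal-filmstrip-view',
    TILE_VIEW: 'tile-view',
    VERTICAL_FILMSTRIP_VIEW: 'vertical-filmstrip-view'
};

// Mirrors LAYOUT_CLASSNAMES above: each layout maps to the CSS class
// applied to the app root so stylesheets can adjust the whole UI.
const LAYOUT_CLASSNAMES = {
    [LAYOUTS.HORIZONTAL_FILMSTRIP_VIEW]: 'horizontal-filmstrip',
    [LAYOUTS.TILE_VIEW]: 'tile-view',
    [LAYOUTS.VERTICAL_FILMSTRIP_VIEW]: 'vertical-filmstrip'
};

console.log(LAYOUT_CLASSNAMES[LAYOUTS.TILE_VIEW]); // 'tile-view'
```

Keying by the enum instead of hard-coded names means adding a new layout only requires a new entry here, not a change to the lookup in `_mapStateToProps`.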
/**
@@ -68,13 +77,18 @@ type Props = {
* The CSS class to apply to the root of {@link Conference} to modify the
* application layout.
*/
_layoutModeClassName: string,
_layoutClassName: string,
/**
* Conference room name.
*/
_room: string,
/**
* Whether or not the current UI layout should be in tile view.
*/
_shouldDisplayTileView: boolean,
dispatch: Function,
t: Function
}
@@ -143,6 +157,25 @@ class Conference extends Component<Props> {
}
}
/**
* Calls into legacy UI to update the application layout, if necessary.
*
* @inheritdoc
* returns {void}
*/
componentDidUpdate(prevProps) {
if (this.props._shouldDisplayTileView
=== prevProps._shouldDisplayTileView) {
return;
}
// TODO: For now VideoLayout is being called as LargeVideo and Filmstrip
// sizing logic is still handled outside of React. Once all components
// are in react they should calculate size on their own as much as
// possible and pass down sizings.
VideoLayout.refreshLayout();
}
/**
* Disconnect from the conference when component will be
* unmounted.
@@ -180,7 +213,7 @@ class Conference extends Component<Props> {
return (
<div
className = { this.props._layoutModeClassName }
className = { this.props._layoutClassName }
id = 'videoconference_page'
onMouseMove = { this._onShowToolbar }>
<Notice />
@@ -257,29 +290,19 @@ class Conference extends Component<Props> {
* @private
* @returns {{
* _iAmRecorder: boolean,
* _room: ?string
* _layoutClassName: string,
* _room: ?string,
* _shouldDisplayTileView: boolean
* }}
*/
function _mapStateToProps(state) {
const { room } = state['features/base/conference'];
const { iAmRecorder } = state['features/base/config'];
const currentLayout = getCurrentLayout(state);
return {
/**
* Whether the local participant is recording the conference.
*
* @private
*/
_iAmRecorder: iAmRecorder,
_layoutModeClassName: interfaceConfig.VERTICAL_FILMSTRIP
? LAYOUT_CLASSES.VERTICAL_FILMSTRIP
: LAYOUT_CLASSES.HORIZONTAL_FILMSTRIP,
/**
* Conference room name.
*/
_room: room
_iAmRecorder: state['features/base/config'].iAmRecorder,
_layoutClassName: LAYOUT_CLASSNAMES[currentLayout],
_room: state['features/base/conference'].room,
_shouldDisplayTileView: shouldDisplayTileView(state)
};
}


@@ -324,6 +324,7 @@ class ConnectionIndicator extends Component {
* @returns {void}
*/
_onStatsUpdated(stats = {}) {
// Rely on React to batch setState actions.
const { connectionQuality } = stats;
const newPercentageState = typeof connectionQuality === 'undefined'
? {} : { percent: connectionQuality };
@@ -337,7 +338,6 @@ class ConnectionIndicator extends Component {
stats: newStats
});
// Rely on React to batch setState actions.
this._updateIndicatorAutoHide(newStats.percent);
}
@@ -410,8 +410,10 @@ class ConnectionIndicator extends Component {
const {
bandwidth,
bitrate,
e2eRtt,
framerate,
packetLoss,
region,
resolution,
transport
} = this.state.stats;
@@ -421,10 +423,12 @@ class ConnectionIndicator extends Component {
bandwidth = { bandwidth }
bitrate = { bitrate }
connectionSummary = { this._getConnectionStatusTip() }
e2eRtt = { e2eRtt }
framerate = { framerate }
isLocalVideo = { this.props.isLocalVideo }
onShowMore = { this._onToggleShowMore }
packetLoss = { packetLoss }
region = { region }
resolution = { resolution }
shouldShowMore = { this.state.showMoreStats }
transport = { transport } />


@@ -2,7 +2,10 @@
import _ from 'lodash';
import { JitsiConnectionQualityEvents } from '../base/lib-jitsi-meet';
import {
JitsiConnectionQualityEvents,
JitsiE2ePingEvents
} from '../base/lib-jitsi-meet';
declare var APP: Object;
@@ -33,6 +36,17 @@ const statsEmitter = {
conference.on(JitsiConnectionQualityEvents.REMOTE_STATS_UPDATED,
(id, stats) => this._emitStatsUpdate(id, stats));
conference.on(
JitsiE2ePingEvents.E2E_RTT_CHANGED,
(participant, e2eRtt) => {
const stats = {
e2eRtt,
region: participant.getProperty('region')
};
this._emitStatsUpdate(participant.getId(), stats);
});
},
/**


@@ -39,7 +39,12 @@ class ConnectionStatsTable extends Component {
connectionSummary: PropTypes.string,
/**
* Statistics related to framerates for each ssrc.
* The end-to-end round-trip-time.
*/
e2eRtt: PropTypes.number,
/**
* Statistics related to frame rates for each ssrc.
* {{
* [ ssrc ]: Number
* }}
@@ -47,7 +52,7 @@ class ConnectionStatsTable extends Component {
framerate: PropTypes.object,
/**
* Whether or not the statitics are for local video.
* Whether or not the statistics are for local video.
*/
isLocalVideo: PropTypes.bool,
@@ -65,6 +70,11 @@ class ConnectionStatsTable extends Component {
*/
packetLoss: PropTypes.object,
/**
* The region.
*/
region: PropTypes.string,
/**
* Statistics related to display resolutions for each ssrc.
* {{
@@ -208,6 +218,31 @@ class ConnectionStatsTable extends Component {
);
}
/**
* Creates a table row as a ReactElement for displaying end-to-end RTT and
* the region.
*
* @returns {ReactElement}
* @private
*/
_renderE2eRtt() {
const { e2eRtt, region, t } = this.props;
let str = e2eRtt ? `${e2eRtt.toFixed(0)}ms` : 'N/A';
if (region) {
str += ` (${region})`;
}
return (
<tr>
<td>
<span>{ t('connectionindicator.e2e_rtt') }</span>
</td>
<td>{ str }</td>
</tr>
);
}
/**
* Creates a table row as a ReactElement for displaying frame rate related
* statistics.
@@ -245,7 +280,6 @@ class ConnectionStatsTable extends Component {
if (packetLoss) {
const { download, upload } = packetLoss;
// eslint-disable-next-line no-extra-parens
packetLossTableData = (
<td>
<span className = 'connection-info__download'>
@@ -330,12 +364,15 @@ class ConnectionStatsTable extends Component {
* @returns {ReactElement}
*/
_renderStatistics() {
const isRemoteVideo = !this.props.isLocalVideo;
return (
<table className = 'connection-info__container'>
<tbody>
{ this._renderConnectionSummary() }
{ this._renderBitrate() }
{ this._renderPacketLoss() }
{ isRemoteVideo ? this._renderE2eRtt() : null }
{ this._renderResolution() }
{ this._renderFrameRate() }
</tbody>
@@ -354,7 +391,6 @@ class ConnectionStatsTable extends Component {
const { t, transport } = this.props;
if (!transport || transport.length === 0) {
// eslint-disable-next-line no-extra-parens
const NA = (
<tr key = 'address'>
<td>


@@ -61,18 +61,16 @@ class DesktopPickerPane extends Component {
const classNames
= `desktop-picker-pane default-scrollbar source-type-${type}`;
const previews
= sources ? sources.map(
source =>
// eslint-disable-next-line react/jsx-wrap-multilines
= sources
? sources.map(source => (
<DesktopSourcePreview
key = { source.id }
onClick = { onClick }
onDoubleClick = { onDoubleClick }
selected = { source.id === selectedSourceId }
source = { source }
type = { type } />)
: ( // eslint-disable-line no-extra-parens
type = { type } />))
: (
<div className = 'desktop-picker-pane-spinner'>
<Spinner
isCompleting = { false }


@@ -121,17 +121,15 @@ class Filmstrip extends Component<Props> {
&& <LocalThumbnail />
}
{
/* eslint-disable react/jsx-wrap-multilines */
this._sort(
this.props._participants,
isNarrowAspectRatio_)
.map(p =>
.map(p => (
<Thumbnail
key = { p.id }
participant = { p } />)
participant = { p } />))
/* eslint-enable react/jsx-wrap-multilines */
}
{
!this._separateLocalThumbnail


@@ -8,6 +8,7 @@ import { dockToolbox } from '../../../toolbox';
import { setFilmstripHovered } from '../../actions';
import { shouldRemoteVideosBeVisible } from '../../functions';
import Toolbar from './Toolbar';
declare var interfaceConfig: Object;
@@ -185,9 +186,8 @@ function _mapStateToProps(state) {
&& state['features/toolbox'].visible
&& interfaceConfig.TOOLBAR_BUTTONS.length;
const remoteVideosVisible = shouldRemoteVideosBeVisible(state);
const className = `${remoteVideosVisible ? '' : 'hide-videos'} ${
reduceHeight ? 'reduce-height' : ''}`;
reduceHeight ? 'reduce-height' : ''}`.trim();
return {
_className: className,


@@ -477,7 +477,7 @@ class AddPeopleDialog extends Component<*, *> {
const supportString = t('inlineDialogFailure.supportMsg');
const supportLink = interfaceConfig.SUPPORT_URL;
const supportLinkContent
= ( // eslint-disable-line no-extra-parens
= (
<span>
<span>
{ supportString.padEnd(supportString.length + 1) }


@@ -6,7 +6,9 @@ import {
} from '../analytics';
import { _handleParticipantError } from '../base/conference';
import { MEDIA_TYPE } from '../base/media';
import { getParticipants } from '../base/participants';
import { reportError } from '../base/util';
import { shouldDisplayTileView } from '../video-layout';
import {
SELECT_LARGE_VIDEO_PARTICIPANT,
@@ -26,17 +28,19 @@ export function selectParticipant() {
const { conference } = state['features/base/conference'];
if (conference) {
const largeVideo = state['features/large-video'];
const id = largeVideo.participantId;
const ids = shouldDisplayTileView(state)
? getParticipants(state).map(participant => participant.id)
: [ state['features/large-video'].participantId ];
try {
conference.selectParticipant(id);
conference.selectParticipants(ids);
} catch (err) {
_handleParticipantError(err);
sendAnalytics(createSelectParticipantFailedEvent(err));
reportError(err, `Failed to select participant ${id}`);
reportError(
err, `Failed to select participants ${ids.toString()}`);
}
}
};
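The branch above can be sketched with stubbed selectors: in tile view every participant ID is handed to `selectParticipants()`, otherwise only the participant on large video. The state shape and selector bodies here are simplified assumptions for illustration, not the real store layout:

```javascript
// Simplified Redux state; the real store shape is richer than this.
const makeState = tileViewEnabled => ({
    'features/video-layout': { tileViewEnabled },
    'features/base/participants': [ { id: 'alice' }, { id: 'bob' } ],
    'features/large-video': { participantId: 'alice' }
});

// Stand-ins for the real selectors (assumed implementations).
const shouldDisplayTileView = state =>
    state['features/video-layout'].tileViewEnabled;
const getParticipants = state => state['features/base/participants'];

// Mirrors the diff: which endpoint IDs get selected for receiving video.
function idsToSelect(state) {
    return shouldDisplayTileView(state)
        ? getParticipants(state).map(participant => participant.id)
        : [ state['features/large-video'].participantId ];
}

console.log(idsToSelect(makeState(true)));  // [ 'alice', 'bob' ]
console.log(idsToSelect(makeState(false))); // [ 'alice' ]
```

Selecting all participants in tile view is what lets the bridge forward video for every visible tile rather than just the pinned or dominant speaker.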


@@ -3,7 +3,9 @@
import React, { Component } from 'react';
import { isFilmstripVisible } from '../../filmstrip';
import { LocalRecordingLabel } from '../../local-recording';
import { RecordingLabel } from '../../recording';
import { shouldDisplayTileView } from '../../video-layout';
import { VideoQualityLabel } from '../../video-quality';
import { TranscribingLabel } from '../../transcribing/';
@@ -17,6 +19,11 @@ export type Props = {
* determine display classes to set.
*/
_filmstripVisible: boolean,
/**
* Whether or not the video quality label should be displayed.
*/
_showVideoQualityLabel: boolean
};
/**
@@ -63,6 +70,18 @@ export default class AbstractLabels<P: Props, S> extends Component<P, S> {
<TranscribingLabel />
);
}
/**
* Renders the {@code LocalRecordingLabel}.
*
* @returns {React$Element}
* @protected
*/
_renderLocalRecordingLabel() {
return (
<LocalRecordingLabel />
);
}
}
/**
@@ -72,11 +91,13 @@ export default class AbstractLabels<P: Props, S> extends Component<P, S> {
* @param {Object} state - The Redux state.
* @private
* @returns {{
* _filmstripVisible: boolean
* _filmstripVisible: boolean,
* _showVideoQualityLabel: boolean
* }}
*/
export function _abstractMapStateToProps(state: Object) {
return {
_filmstripVisible: isFilmstripVisible(state)
_filmstripVisible: isFilmstripVisible(state),
_showVideoQualityLabel: !shouldDisplayTileView(state)
};
}


@@ -85,11 +85,15 @@ class Labels extends AbstractLabels<Props, State> {
this._renderRecordingLabel(
JitsiRecordingConstants.mode.STREAM)
}
{
this._renderLocalRecordingLabel()
}
{
this._renderTranscribingLabel()
}
{
this._renderVideoQualityLabel()
this.props._showVideoQualityLabel
&& this._renderVideoQualityLabel()
}
</div>
);
@@ -100,6 +104,8 @@ class Labels extends AbstractLabels<Props, State> {
_renderVideoQualityLabel: () => React$Element<*>
_renderTranscribingLabel: () => React$Element<*>
_renderLocalRecordingLabel: () => React$Element<*>
}
export default connect(_mapStateToProps)(Labels);


@@ -50,7 +50,7 @@ export default class LargeVideo extends Component<*> {
</div>
<div id = 'remotePresenceMessage' />
<span id = 'remoteConnectionMessage' />
<div>
<div id = 'largeVideoElementsContainer'>
<div id = 'largeVideoBackgroundContainer' />
{


@@ -0,0 +1,32 @@
/**
* Action to signal that the local client has started to perform recording,
* (as in: {@code RecordingAdapter} is actively collecting audio data).
*
* {
* type: LOCAL_RECORDING_ENGAGED,
* recordingEngagedAt: Date
* }
*/
export const LOCAL_RECORDING_ENGAGED = Symbol('LOCAL_RECORDING_ENGAGED');
/**
* Action to signal that the local client has stopped recording,
* (as in: {@code RecordingAdapter} is no longer collecting audio data).
*
* {
* type: LOCAL_RECORDING_UNENGAGED
* }
*/
export const LOCAL_RECORDING_UNENGAGED = Symbol('LOCAL_RECORDING_UNENGAGED');
/**
* Action to update {@code LocalRecordingInfoDialog} with stats from all
* clients.
*
* {
* type: LOCAL_RECORDING_STATS_UPDATE,
* stats: Object
* }
*/
export const LOCAL_RECORDING_STATS_UPDATE
= Symbol('LOCAL_RECORDING_STATS_UPDATE');


@@ -0,0 +1,59 @@
/* @flow */
import {
LOCAL_RECORDING_ENGAGED,
LOCAL_RECORDING_UNENGAGED,
LOCAL_RECORDING_STATS_UPDATE
} from './actionTypes';
// The following two actions signal state changes in local recording engagement.
// In other words, the events of the local WebWorker / MediaRecorder starting to
// record and finishing recording.
// Note that this is not the event fired when the user tries to start the
// recording in the UI.
/**
* Signals that local recording has been engaged.
*
* @param {Date} startTime - Time when the recording is engaged.
* @returns {{
* type: LOCAL_RECORDING_ENGAGED,
* recordingEngagedAt: Date
* }}
*/
export function localRecordingEngaged(startTime: Date) {
return {
type: LOCAL_RECORDING_ENGAGED,
recordingEngagedAt: startTime
};
}
/**
* Signals that local recording has finished.
*
* @returns {{
* type: LOCAL_RECORDING_UNENGAGED
* }}
*/
export function localRecordingUnengaged() {
return {
type: LOCAL_RECORDING_UNENGAGED
};
}
/**
* Updates the local recording stats from each client,
* to be displayed on {@code LocalRecordingInfoDialog}.
*
* @param {*} stats - The stats object.
* @returns {{
* type: LOCAL_RECORDING_STATS_UPDATE,
* stats: Object
* }}
*/
export function statsUpdate(stats: Object) {
return {
type: LOCAL_RECORDING_STATS_UPDATE,
stats
};
}
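
The action creators above pair with a reducer for the `features/local-recording` state slice, which is not included in this hunk. A minimal sketch of what such a reducer could look like (the default-state shape and field names are assumptions inferred from `_mapStateToProps` elsewhere in this PR, not the PR's actual reducer):

```javascript
const LOCAL_RECORDING_ENGAGED = Symbol('LOCAL_RECORDING_ENGAGED');
const LOCAL_RECORDING_UNENGAGED = Symbol('LOCAL_RECORDING_UNENGAGED');
const LOCAL_RECORDING_STATS_UPDATE = Symbol('LOCAL_RECORDING_STATS_UPDATE');

// Hypothetical default state; the real reducer may track more fields
// (e.g. encodingFormat).
const DEFAULT_STATE = {
    isEngaged: false,
    recordingEngagedAt: null,
    stats: undefined
};

function reducer(state = DEFAULT_STATE, action) {
    switch (action.type) {
    case LOCAL_RECORDING_ENGAGED:
        return {
            ...state,
            isEngaged: true,
            recordingEngagedAt: action.recordingEngagedAt
        };
    case LOCAL_RECORDING_UNENGAGED:
        return {
            ...state,
            isEngaged: false,
            recordingEngagedAt: null
        };
    case LOCAL_RECORDING_STATS_UPDATE:
        return {
            ...state,
            stats: action.stats
        };
    default:
        return state;
    }
}
```

Because the action types are `Symbol`s, equality in the `switch` is by identity, so only actions built from these exact constants match.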

View File

@@ -0,0 +1,86 @@
/* @flow */
import React, { Component } from 'react';
import { translate } from '../../base/i18n';
import { ToolbarButton } from '../../toolbox';
/**
* The type of the React {@code Component} props of
* {@link LocalRecordingButton}.
*/
type Props = {
/**
* Whether or not {@link LocalRecordingInfoDialog} should be displayed.
*/
isDialogShown: boolean,
/**
* Callback function called when {@link LocalRecordingButton} is clicked.
*/
onClick: Function,
/**
* Invoked to obtain translated strings.
*/
t: Function
}
/**
* A React {@code Component} for opening or closing the
* {@code LocalRecordingInfoDialog}.
*
* @extends Component
*/
class LocalRecordingButton extends Component<Props> {
/**
* Initializes a new {@code LocalRecordingButton} instance.
*
* @param {Object} props - The read-only properties with which the new
* instance is to be initialized.
*/
constructor(props: Props) {
super(props);
// Bind event handlers so they are only bound once per instance.
this._onClick = this._onClick.bind(this);
}
/**
* Implements React's {@link Component#render()}.
*
* @inheritdoc
* @returns {ReactElement}
*/
render() {
const { isDialogShown, t } = this.props;
const iconClasses
= `icon-thumb-menu ${isDialogShown
? 'icon-rec toggled' : 'icon-rec'}`;
return (
<ToolbarButton
accessibilityLabel
= { t('toolbar.accessibilityLabel.localRecording') }
iconName = { iconClasses }
onClick = { this._onClick }
tooltip = { t('localRecording.dialogTitle') } />
);
}
_onClick: () => void;
/**
* Callback invoked when the Toolbar button is clicked.
*
* @private
* @returns {void}
*/
_onClick() {
this.props.onClick();
}
}
export default translate(LocalRecordingButton);

View File

@@ -0,0 +1,403 @@
/* @flow */
import moment from 'moment';
import React, { Component } from 'react';
import { connect } from 'react-redux';
import { Dialog } from '../../base/dialog';
import { translate } from '../../base/i18n';
import {
PARTICIPANT_ROLE,
getLocalParticipant
} from '../../base/participants';
import { statsUpdate } from '../actions';
import { recordingController } from '../controller';
/**
* The type of the React {@code Component} props of
* {@link LocalRecordingInfoDialog}.
*/
type Props = {
/**
* Redux store dispatch function.
*/
dispatch: Dispatch<*>,
/**
* Current encoding format.
*/
encodingFormat: string,
/**
* Whether the local user is the moderator.
*/
isModerator: boolean,
/**
* Whether local recording is engaged.
*/
isEngaged: boolean,
/**
* The start time of the current local recording session.
* Used to calculate the duration of recording.
*/
recordingEngagedAt: Date,
/**
* Stats of all the participants.
*/
stats: Object,
/**
* Invoked to obtain translated strings.
*/
t: Function
}
/**
* The type of the React {@code Component} state of
* {@link LocalRecordingInfoDialog}.
*/
type State = {
/**
* The recording duration string to be displayed on the UI.
*/
durationString: string
}
/**
* A React Component with the contents for a dialog that shows information about
* local recording. For users with moderator rights, this is also the "control
* panel" for starting/stopping local recording on all clients.
*
* @extends Component
*/
class LocalRecordingInfoDialog extends Component<Props, State> {
/**
* Saves a handle to the timer for UI updates,
* so that it can be cancelled when the component unmounts.
*/
_timer: ?IntervalID;
/**
* Initializes a new {@code LocalRecordingInfoDialog} instance.
*
* @param {Props} props - The React {@code Component} props to initialize
* the new {@code LocalRecordingInfoDialog} instance with.
*/
constructor(props: Props) {
super(props);
this.state = {
durationString: ''
};
}
/**
* Implements React's {@link Component#componentDidMount()}.
*
* @returns {void}
*/
componentDidMount() {
this._timer = setInterval(
() => {
this.setState((_prevState, props) => {
const nowTime = new Date();
return {
durationString: this._getDuration(nowTime,
props.recordingEngagedAt)
};
});
try {
this.props.dispatch(
statsUpdate(recordingController
.getParticipantsStats()));
} catch (e) {
// do nothing
}
},
1000
);
}
/**
* Implements React's {@link Component#componentWillUnmount()}.
*
* @returns {void}
*/
componentWillUnmount() {
if (this._timer) {
clearInterval(this._timer);
this._timer = null;
}
}
/**
* Implements React's {@link Component#render()}.
*
* @inheritdoc
* @returns {ReactElement}
*/
render() {
const { isModerator, t } = this.props;
return (
<Dialog
cancelTitleKey = { 'dialog.close' }
submitDisabled = { true }
titleKey = 'localRecording.dialogTitle'>
<div className = 'localrec-control'>
<span className = 'localrec-control-info-label'>
{`${t('localRecording.moderator')}:`}
</span>
<span className = 'info-value'>
{ isModerator
? t('localRecording.yes')
: t('localRecording.no') }
</span>
</div>
{ this._renderModeratorControls() }
{ this._renderDurationAndFormat() }
</Dialog>
);
}
/**
* Renders the recording duration and encoding format. Only shown if local
* recording is engaged.
*
* @private
* @returns {ReactElement|null}
*/
_renderDurationAndFormat() {
const { encodingFormat, isEngaged, t } = this.props;
const { durationString } = this.state;
if (!isEngaged) {
return null;
}
return (
<div>
<div>
<span className = 'localrec-control-info-label'>
{`${t('localRecording.duration')}:`}
</span>
<span className = 'info-value'>
{ durationString === ''
? t('localRecording.durationNA')
: durationString }
</span>
</div>
<div>
<span className = 'localrec-control-info-label'>
{`${t('localRecording.encoding')}:`}
</span>
<span className = 'info-value'>
{ encodingFormat }
</span>
</div>
</div>
);
}
/**
* Returns React elements for displaying the local recording stats of
* each participant.
*
* @private
* @returns {ReactElement|null}
*/
_renderStats() {
const { stats } = this.props;
if (stats === undefined) {
return null;
}
const ids = Object.keys(stats);
return (
<div className = 'localrec-participant-stats' >
{ this._renderStatsHeader() }
{ ids.map((id, i) => this._renderStatsLine(i, id)) }
</div>
);
}
/**
* Renders the stats for one participant.
*
* @private
* @param {*} lineKey - The key required by React for elements in lists.
* @param {*} id - The ID of the participant.
* @returns {ReactElement}
*/
_renderStatsLine(lineKey, id) {
const { stats } = this.props;
let statusClass = 'localrec-participant-stats-item__status-dot ';
statusClass += stats[id].recordingStats
? stats[id].recordingStats.isRecording
? 'status-on'
: 'status-off'
: 'status-unknown';
return (
<div
className = 'localrec-participant-stats-item'
key = { lineKey } >
<div className = 'localrec-participant-stats-item__status'>
<span className = { statusClass } />
</div>
<div className = 'localrec-participant-stats-item__name'>
{ stats[id].displayName || id }
</div>
<div className = 'localrec-participant-stats-item__sessionid'>
{ stats[id].recordingStats.currentSessionToken }
</div>
</div>
);
}
/**
* Renders the participant stats header line.
*
* @private
* @returns {ReactElement}
*/
_renderStatsHeader() {
const { t } = this.props;
return (
<div className = 'localrec-participant-stats-item'>
<div className = 'localrec-participant-stats-item__status' />
<div className = 'localrec-participant-stats-item__name'>
{ t('localRecording.participant') }
</div>
<div className = 'localrec-participant-stats-item__sessionid'>
{ t('localRecording.sessionToken') }
</div>
</div>
);
}
/**
* Renders the moderator-only controls, i.e. stats of all users and the
* action links.
*
* @private
* @returns {ReactElement|null}
*/
_renderModeratorControls() {
const { isModerator, isEngaged, t } = this.props;
if (!isModerator) {
return null;
}
return (
<div>
<div className = 'localrec-control-action-links'>
<div className = 'localrec-control-action-link'>
{ isEngaged ? <a
onClick = { this._onStop }>
{ t('localRecording.stop') }
</a>
: <a
onClick = { this._onStart }>
{ t('localRecording.start') }
</a>
}
</div>
</div>
<div>
<span className = 'localrec-control-info-label'>
{`${t('localRecording.participantStats')}:`}
</span>
</div>
{ this._renderStats() }
</div>
);
}
/**
* Creates a duration string "HH:MM:SS" from two Date objects.
*
* @param {Date} now - Current time.
* @param {Date} prev - Previous time, the time to be subtracted.
* @returns {string}
*/
_getDuration(now, prev) {
if (prev === null || prev === undefined) {
return '';
}
// Still a hack, as moment.js does not support formatting of duration
// (i.e. TimeDelta). Only works if total duration < 24 hours.
// But who is going to have a 24-hour long conference?
return moment(now - prev).utc()
.format('HH:mm:ss');
}
/**
* Callback function for the Start UI action.
*
* @private
* @returns {void}
*/
_onStart() {
recordingController.startRecording();
}
/**
* Callback function for the Stop UI action.
*
* @private
* @returns {void}
*/
_onStop() {
recordingController.stopRecording();
}
}
/**
* Maps (parts of) the Redux state to the associated props for the
* {@code LocalRecordingInfoDialog} component.
*
* @param {Object} state - The Redux state.
* @private
* @returns {{
* encodingFormat: string,
* isModerator: boolean,
* isEngaged: boolean,
* recordingEngagedAt: Date,
* stats: Object
* }}
*/
function _mapStateToProps(state) {
const {
encodingFormat,
isEngaged,
recordingEngagedAt,
stats
} = state['features/local-recording'];
const isModerator
= getLocalParticipant(state).role === PARTICIPANT_ROLE.MODERATOR;
return {
encodingFormat,
isModerator,
isEngaged,
recordingEngagedAt,
stats
};
}
export default translate(connect(_mapStateToProps)(LocalRecordingInfoDialog));
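
The `_getDuration` helper above leans on moment.js to format a millisecond difference as wall-clock time, which (as its comment notes) only works for durations under 24 hours. The same `HH:mm:ss` formatting can be sketched without the dependency (a hypothetical standalone helper, not part of the PR, with the same under-24-hours limitation):

```javascript
// Formats the elapsed time between two Dates as "HH:mm:ss".
// Mirrors _getDuration(): returns '' when there is no start time,
// and wraps at 24 hours just like the moment.js-based original.
function formatDuration(now, prev) {
    if (prev === null || prev === undefined) {
        return '';
    }

    const totalSeconds = Math.floor((now - prev) / 1000);
    const pad = n => String(n).padStart(2, '0');
    const hours = Math.floor(totalSeconds / 3600) % 24;
    const minutes = Math.floor(totalSeconds / 60) % 60;
    const seconds = totalSeconds % 60;

    return `${pad(hours)}:${pad(minutes)}:${pad(seconds)}`;
}
```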

View File

@@ -0,0 +1,75 @@
// @flow
import Tooltip from '@atlaskit/tooltip';
import React, { Component } from 'react';
import { connect } from 'react-redux';
import { translate } from '../../base/i18n/index';
import { CircularLabel } from '../../base/label/index';
/**
* The type of the React {@code Component} props of {@link LocalRecordingLabel}.
*/
type Props = {
/**
* Invoked to obtain translated strings.
*/
t: Function,
/**
* Whether local recording is engaged or not.
*/
isEngaged: boolean
};
/**
* React Component for displaying a label when local recording is engaged.
*
* @extends Component
*/
class LocalRecordingLabel extends Component<Props> {
/**
* Implements React's {@link Component#render()}.
*
* @inheritdoc
* @returns {ReactElement}
*/
render() {
if (!this.props.isEngaged) {
return null;
}
return (
<Tooltip
content = { this.props.t('localRecording.labelToolTip') }
position = { 'left' }>
<CircularLabel
className = 'local-rec'
label = { this.props.t('localRecording.label') } />
</Tooltip>
);
}
}
/**
* Maps (parts of) the Redux state to the associated props for the
* {@code LocalRecordingLabel} component.
*
* @param {Object} state - The Redux state.
* @private
* @returns {{
*     isEngaged: boolean
* }}
*/
function _mapStateToProps(state) {
const { isEngaged } = state['features/local-recording'];
return {
isEngaged
};
}
export default translate(connect(_mapStateToProps)(LocalRecordingLabel));

View File

@@ -0,0 +1,5 @@
export { default as LocalRecordingButton } from './LocalRecordingButton';
export { default as LocalRecordingLabel } from './LocalRecordingLabel';
export {
default as LocalRecordingInfoDialog
} from './LocalRecordingInfoDialog';

View File

@@ -0,0 +1,687 @@
/* @flow */
import { i18next } from '../../base/i18n';
import {
FlacAdapter,
OggAdapter,
WavAdapter,
downloadBlob
} from '../recording';
import { sessionManager } from '../session';
const logger = require('jitsi-meet-logger').getLogger(__filename);
/**
* XMPP command for signaling the start of local recording to all clients.
* Should be sent by the moderator only.
*/
const COMMAND_START = 'localRecStart';
/**
* XMPP command for signaling the stop of local recording to all clients.
* Should be sent by the moderator only.
*/
const COMMAND_STOP = 'localRecStop';
/**
* One-time command used to trigger the moderator to resend the commands.
* This is a workaround for newly-joined clients to receive remote presence.
*/
const COMMAND_PING = 'localRecPing';
/**
* One-time command sent upon receiving a {@code COMMAND_PING}.
* Only the moderator sends this command.
* This command does not carry any information itself, but rather forces the
* XMPP server to resend the remote presence.
*/
const COMMAND_PONG = 'localRecPong';
/**
* Participant property key for local recording stats.
*/
const PROPERTY_STATS = 'localRecStats';
/**
* Supported recording formats.
*/
const RECORDING_FORMATS = new Set([ 'flac', 'wav', 'ogg' ]);
/**
* Default recording format.
*/
const DEFAULT_RECORDING_FORMAT = 'flac';
/**
* States of the {@code RecordingController}.
*/
const ControllerState = Object.freeze({
/**
* Idle (not recording).
*/
IDLE: Symbol('IDLE'),
/**
* Starting.
*/
STARTING: Symbol('STARTING'),
/**
* Engaged (recording).
*/
RECORDING: Symbol('RECORDING'),
/**
* Stopping.
*/
STOPPING: Symbol('STOPPING'),
/**
* Failed, due to error during starting / stopping process.
*/
FAILED: Symbol('FAILED')
});
/**
* Type of the stats reported by each participant (client).
*/
type RecordingStats = {
/**
* Current local recording session token used by the participant.
*/
currentSessionToken: number,
/**
* Whether local recording is engaged on the participant's device.
*/
isRecording: boolean,
/**
* Total recorded bytes. (Reserved for future use.)
*/
recordedBytes: number,
/**
* Total recording duration. (Reserved for future use.)
*/
recordedLength: number
}
/**
* The component responsible for the coordination of local recording, across
* multiple participants.
* Current implementation requires that there is only one moderator in a room.
*/
class RecordingController {
/**
* For each recording session, there is a separate {@code RecordingAdapter}
* instance so that encoded bits from the previous sessions can still be
* retrieved after they ended.
*
* @private
*/
_adapters = {};
/**
* The {@code JitsiConference} instance.
*
* @private
*/
_conference: * = null;
/**
* Current recording session token.
* Session token is a number generated by the moderator, to ensure every
* client is in the same recording state.
*
* @private
*/
_currentSessionToken: number = -1;
/**
* Current state of {@code RecordingController}.
*
* @private
*/
_state = ControllerState.IDLE;
/**
* Whether or not the audio is muted in the UI. This is stored as internal
* state of {@code RecordingController} because we might have recording
* sessions that start muted.
*/
_isMuted = false;
/**
* The ID of the active microphone.
*
* @private
*/
_micDeviceId = 'default';
/**
* Current recording format. This will be in effect from the next
* recording session, i.e., if this value is changed during an ongoing
* recording session, that ongoing session will not use the new format.
*
* @private
*/
_format = DEFAULT_RECORDING_FORMAT;
/**
* Whether or not the {@code RecordingController} has registered for
* XMPP events. Prevents initialization from happening multiple times.
*
* @private
*/
_registered = false;
/**
* FIXME: callback function for the {@code RecordingController} to notify
* UI it wants to display a notice. Keeps {@code RecordingController}
* decoupled from UI.
*/
_onNotify: ?(messageKey: string, messageParams?: Object) => void;
/**
* FIXME: callback function for the {@code RecordingController} to notify
* UI it wants to display a warning. Keeps {@code RecordingController}
* decoupled from UI.
*/
_onWarning: ?(messageKey: string, messageParams?: Object) => void;
/**
* FIXME: callback function for the {@code RecordingController} to notify
* UI that the local recording state has changed.
*/
_onStateChanged: ?(boolean) => void;
/**
* Constructor.
*
* @returns {void}
*/
constructor() {
this.registerEvents = this.registerEvents.bind(this);
this.getParticipantsStats = this.getParticipantsStats.bind(this);
this._onStartCommand = this._onStartCommand.bind(this);
this._onStopCommand = this._onStopCommand.bind(this);
this._onPingCommand = this._onPingCommand.bind(this);
this._doStartRecording = this._doStartRecording.bind(this);
this._doStopRecording = this._doStopRecording.bind(this);
this._updateStats = this._updateStats.bind(this);
this._switchToNewSession = this._switchToNewSession.bind(this);
}
registerEvents: () => void;
/**
* Registers listeners for XMPP events.
*
* @param {JitsiConference} conference - {@code JitsiConference} instance.
* @returns {void}
*/
registerEvents(conference: Object) {
if (!this._registered) {
this._conference = conference;
if (this._conference) {
this._conference
.addCommandListener(COMMAND_STOP, this._onStopCommand);
this._conference
.addCommandListener(COMMAND_START, this._onStartCommand);
this._conference
.addCommandListener(COMMAND_PING, this._onPingCommand);
this._registered = true;
}
if (!this._conference.isModerator()) {
this._conference.sendCommandOnce(COMMAND_PING, {});
}
}
}
/**
* Sets the event handler for {@code onStateChanged}.
*
* @param {Function} delegate - The event handler.
* @returns {void}
*/
set onStateChanged(delegate: Function) {
this._onStateChanged = delegate;
}
/**
* Sets the event handler for {@code onNotify}.
*
* @param {Function} delegate - The event handler.
* @returns {void}
*/
set onNotify(delegate: Function) {
this._onNotify = delegate;
}
/**
* Sets the event handler for {@code onWarning}.
*
* @param {Function} delegate - The event handler.
* @returns {void}
*/
set onWarning(delegate: Function) {
this._onWarning = delegate;
}
/**
* Signals the participants to start local recording.
*
* @returns {void}
*/
startRecording() {
this.registerEvents();
if (this._conference && this._conference.isModerator()) {
this._conference.removeCommand(COMMAND_STOP);
this._conference.sendCommand(COMMAND_START, {
attributes: {
sessionToken: this._getRandomToken(),
format: this._format
}
});
} else if (this._onWarning) {
this._onWarning('localRecording.messages.notModerator');
}
}
/**
* Signals the participants to stop local recording.
*
* @returns {void}
*/
stopRecording() {
if (this._conference) {
if (this._conference.isModerator()) {
this._conference.removeCommand(COMMAND_START);
this._conference.sendCommand(COMMAND_STOP, {
attributes: {
sessionToken: this._currentSessionToken
}
});
} else if (this._onWarning) {
this._onWarning('localRecording.messages.notModerator');
}
}
}
/**
* Triggers the download of recorded data.
* Browser only.
*
* @param {number} sessionToken - The token of the session to download.
* @returns {void}
*/
downloadRecordedData(sessionToken: number) {
if (this._adapters[sessionToken]) {
this._adapters[sessionToken].exportRecordedData()
.then(args => {
const { data, format } = args;
const filename = `session_${sessionToken}`
+ `_${this._conference.myUserId()}.${format}`;
downloadBlob(data, filename);
})
.catch(error => {
logger.error('Failed to download audio for'
+ ` session ${sessionToken}. Error: ${error}`);
});
} else {
logger.error(`Invalid session token for download ${sessionToken}`);
}
}
/**
* Changes the current microphone.
*
* @param {string} micDeviceId - The new microphone device ID.
* @returns {void}
*/
setMicDevice(micDeviceId: string) {
if (micDeviceId !== this._micDeviceId) {
this._micDeviceId = String(micDeviceId);
if (this._state === ControllerState.RECORDING) {
// sessionManager.endSegment(this._currentSessionToken);
logger.log('Before switching microphone...');
this._adapters[this._currentSessionToken]
.setMicDevice(this._micDeviceId)
.then(() => {
logger.log('Finished switching microphone.');
// sessionManager.beginSegment(this._currentSessionToken);
})
.catch(() => {
logger.error('Failed to switch microphone');
});
}
logger.log(`Switch microphone to ${this._micDeviceId}`);
}
}
/**
* Mute or unmute audio. When muted, the ongoing local recording should
* produce silence.
*
* @param {boolean} muted - If the audio should be muted.
* @returns {void}
*/
setMuted(muted: boolean) {
this._isMuted = Boolean(muted);
if (this._state === ControllerState.RECORDING) {
this._adapters[this._currentSessionToken].setMuted(this._isMuted);
}
}
/**
* Switches the recording format.
*
* @param {string} newFormat - The new format.
* @returns {void}
*/
switchFormat(newFormat: string) {
if (!RECORDING_FORMATS.has(newFormat)) {
logger.log(`Unknown format ${newFormat}. Ignoring...`);
return;
}
this._format = newFormat;
logger.log(`Recording format switched to ${newFormat}`);
// the new format will be used in the next recording session
}
/**
* Returns the local recording stats.
*
* @returns {RecordingStats}
*/
getLocalStats(): RecordingStats {
return {
currentSessionToken: this._currentSessionToken,
isRecording: this._state === ControllerState.RECORDING,
recordedBytes: 0,
recordedLength: 0
};
}
getParticipantsStats: () => *;
/**
* Returns the remote participants' local recording stats.
*
* @returns {*}
*/
getParticipantsStats() {
const members
= this._conference.getParticipants()
.map(member => {
return {
id: member.getId(),
displayName: member.getDisplayName(),
recordingStats:
JSON.parse(member.getProperty(PROPERTY_STATS) || '{}'),
isSelf: false
};
});
// transform into a dictionary for consistent ordering
const result = {};
for (let i = 0; i < members.length; ++i) {
result[members[i].id] = members[i];
}
const localId = this._conference.myUserId();
result[localId] = {
id: localId,
displayName: i18next.t('localRecording.me'),
recordingStats: this.getLocalStats(),
isSelf: true
};
return result;
}
_changeState: (Symbol) => void;
/**
* Changes the current state of {@code RecordingController}.
*
* @private
* @param {Symbol} newState - The new state.
* @returns {void}
*/
_changeState(newState: Symbol) {
if (this._state !== newState) {
logger.log(`state change: ${this._state.toString()} -> `
+ `${newState.toString()}`);
this._state = newState;
}
}
_updateStats: () => void;
/**
* Sends out updates about the local recording stats via XMPP.
*
* @private
* @returns {void}
*/
_updateStats() {
if (this._conference) {
this._conference.setLocalParticipantProperty(PROPERTY_STATS,
JSON.stringify(this.getLocalStats()));
}
}
_onStartCommand: (*) => void;
/**
* Callback function for XMPP event.
*
* @private
* @param {*} value - The event args.
* @returns {void}
*/
_onStartCommand(value) {
const { sessionToken, format } = value.attributes;
if (this._state === ControllerState.IDLE) {
this._changeState(ControllerState.STARTING);
this._switchToNewSession(sessionToken, format);
this._doStartRecording();
} else if (this._state === ControllerState.RECORDING
&& this._currentSessionToken !== sessionToken) {
// There is local recording going on, but not for the same session.
// This means the current state might be out-of-sync with the
// moderator's, so we need to restart the recording.
this._changeState(ControllerState.STOPPING);
this._doStopRecording().then(() => {
this._changeState(ControllerState.STARTING);
this._switchToNewSession(sessionToken, format);
this._doStartRecording();
});
}
}
_onStopCommand: (*) => void;
/**
* Callback function for XMPP event.
*
* @private
* @param {*} value - The event args.
* @returns {void}
*/
_onStopCommand(value) {
if (this._state === ControllerState.RECORDING
&& this._currentSessionToken === value.attributes.sessionToken) {
this._changeState(ControllerState.STOPPING);
this._doStopRecording();
}
}
_onPingCommand: () => void;
/**
* Callback function for XMPP event.
*
* @private
* @returns {void}
*/
_onPingCommand() {
if (this._conference.isModerator()) {
logger.log('Received ping, sending pong.');
this._conference.sendCommandOnce(COMMAND_PONG, {});
}
}
/**
* Generates a token that can be used to distinguish each local recording
* session.
*
* @returns {number}
*/
_getRandomToken() {
return Math.floor(Math.random() * 100000000) + 1;
}
_doStartRecording: () => void;
/**
* Starts the recording locally.
*
* @private
* @returns {void}
*/
_doStartRecording() {
if (this._state === ControllerState.STARTING) {
const delegate = this._adapters[this._currentSessionToken];
delegate.start(this._micDeviceId)
.then(() => {
this._changeState(ControllerState.RECORDING);
sessionManager.beginSegment(this._currentSessionToken);
logger.log('Local recording engaged.');
if (this._onNotify) {
this._onNotify('localRecording.messages.engaged');
}
if (this._onStateChanged) {
this._onStateChanged(true);
}
delegate.setMuted(this._isMuted);
this._updateStats();
})
.catch(err => {
logger.error('Failed to start local recording.', err);
});
}
}
_doStopRecording: () => Promise<void>;
/**
* Stops the recording locally.
*
* @private
* @returns {Promise<void>}
*/
_doStopRecording() {
if (this._state === ControllerState.STOPPING) {
const token = this._currentSessionToken;
return this._adapters[this._currentSessionToken]
.stop()
.then(() => {
this._changeState(ControllerState.IDLE);
sessionManager.endSegment(this._currentSessionToken);
logger.log('Local recording unengaged.');
this.downloadRecordedData(token);
const messageKey
= this._conference.isModerator()
? 'localRecording.messages.finishedModerator'
: 'localRecording.messages.finished';
const messageParams = {
token
};
if (this._onNotify) {
this._onNotify(messageKey, messageParams);
}
if (this._onStateChanged) {
this._onStateChanged(false);
}
this._updateStats();
})
.catch(err => {
logger.error('Failed to stop local recording.', err);
});
}
/* eslint-disable */
return (Promise.resolve(): Promise<void>);
// FIXME: better ways to satisfy flow and ESLint at the same time?
/* eslint-enable */
}
_switchToNewSession: (string, string) => void;
/**
* Switches to a new local recording session.
*
* @param {string} sessionToken - The session Token.
* @param {string} format - The recording format for the session.
* @returns {void}
*/
_switchToNewSession(sessionToken, format) {
this._format = format;
this._currentSessionToken = sessionToken;
logger.log(`New session: ${this._currentSessionToken}, `
+ `format: ${this._format}`);
this._adapters[sessionToken]
= this._createRecordingAdapter();
sessionManager.createSession(sessionToken, this._format);
}
/**
* Creates a recording adapter according to the current recording format.
*
* @private
* @returns {RecordingAdapter}
*/
_createRecordingAdapter() {
logger.debug('[RecordingController] creating recording'
+ ` adapter for ${this._format} format.`);
switch (this._format) {
case 'ogg':
return new OggAdapter();
case 'flac':
return new FlacAdapter();
case 'wav':
return new WavAdapter();
default:
throw new Error(`Unknown format: ${this._format}`);
}
}
}
/**
* Global singleton of {@code RecordingController}.
*/
export const recordingController = new RecordingController();
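
The `_onStartCommand` handler above reconciles session tokens: a client already recording under a different token tears down and restarts, so every client converges on the moderator's session. Stripped of the asynchronous adapter and XMPP plumbing, that branching can be modeled as follows (a hypothetical synchronous sketch, not the controller's actual API):

```javascript
const State = Object.freeze({
    IDLE: 'IDLE',
    RECORDING: 'RECORDING'
});

// Toy model of the token-reconciliation logic in _onStartCommand()
// and _onStopCommand(); the real controller goes through STARTING /
// STOPPING states and Promise-based adapter calls.
class TokenSync {
    constructor() {
        this.state = State.IDLE;
        this.token = -1;
    }

    onStartCommand(token) {
        if (this.state === State.IDLE) {
            // Not recording yet: adopt the moderator's session.
            this.token = token;
            this.state = State.RECORDING;
        } else if (this.state === State.RECORDING && this.token !== token) {
            // Recording under a stale token: restart under the new one.
            this.token = token;
        }
    }

    onStopCommand(token) {
        // Only stop if the command targets the session we are in.
        if (this.state === State.RECORDING && this.token === token) {
            this.state = State.IDLE;
        }
    }
}
```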

View File

@@ -0,0 +1 @@
export * from './RecordingController';

View File

@@ -0,0 +1,7 @@
export * from './actions';
export * from './actionTypes';
export * from './components';
export * from './controller';
import './middleware';
import './reducer';

View File

@@ -0,0 +1,92 @@
/* @flow */
import { createShortcutEvent, sendAnalytics } from '../analytics';
import { APP_WILL_MOUNT, APP_WILL_UNMOUNT } from '../base/app';
import { CONFERENCE_JOINED } from '../base/conference';
import { toggleDialog } from '../base/dialog';
import { i18next } from '../base/i18n';
import { SET_AUDIO_MUTED } from '../base/media';
import { MiddlewareRegistry } from '../base/redux';
import { SETTINGS_UPDATED } from '../base/settings/actionTypes';
import { showNotification } from '../notifications';
import { localRecordingEngaged, localRecordingUnengaged } from './actions';
import { LocalRecordingInfoDialog } from './components';
import { recordingController } from './controller';
declare var APP: Object;
declare var config: Object;
const isFeatureEnabled = typeof config === 'object' && config.localRecording
&& config.localRecording.enabled === true;
isFeatureEnabled
&& MiddlewareRegistry.register(({ getState, dispatch }) => next => action => {
const result = next(action);
switch (action.type) {
case CONFERENCE_JOINED: {
const { conference } = getState()['features/base/conference'];
const { localRecording } = getState()['features/base/config'];
if (localRecording && localRecording.format) {
recordingController.switchFormat(localRecording.format);
}
recordingController.registerEvents(conference);
break;
}
case APP_WILL_MOUNT:
// realize the delegates on recordingController, allowing the UI to
// react to state changes in recordingController.
recordingController.onStateChanged = isEngaged => {
if (isEngaged) {
const nowTime = new Date();
dispatch(localRecordingEngaged(nowTime));
} else {
dispatch(localRecordingUnengaged());
}
};
recordingController.onWarning = (messageKey, messageParams) => {
dispatch(showNotification({
title: i18next.t('localRecording.localRecording'),
description: i18next.t(messageKey, messageParams)
}, 10000));
};
recordingController.onNotify = (messageKey, messageParams) => {
dispatch(showNotification({
title: i18next.t('localRecording.localRecording'),
description: i18next.t(messageKey, messageParams)
}, 10000));
};
typeof APP === 'object' && typeof APP.keyboardshortcut === 'object'
&& APP.keyboardshortcut.registerShortcut('L', null, () => {
sendAnalytics(createShortcutEvent('local.recording'));
dispatch(toggleDialog(LocalRecordingInfoDialog));
}, 'keyboardShortcuts.localRecording');
break;
case APP_WILL_UNMOUNT:
recordingController.onStateChanged = null;
recordingController.onNotify = null;
recordingController.onWarning = null;
break;
case SET_AUDIO_MUTED:
recordingController.setMuted(action.muted);
break;
case SETTINGS_UPDATED: {
const { micDeviceId } = getState()['features/base/settings'];
if (micDeviceId) {
recordingController.setMicDevice(micDeviceId);
}
break;
}
}
return result;
});
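
The middleware above follows the standard Redux shape: it lets every action pass through to `next` first, then reacts to the action's type as a side effect. The `SET_AUDIO_MUTED` branch in isolation looks like this (a minimal sketch with a stub object standing in for `recordingController`; not the PR's actual wiring):

```javascript
const SET_AUDIO_MUTED = 'SET_AUDIO_MUTED';

// Stub standing in for recordingController.
const controller = {
    muted: null,
    setMuted(muted) {
        this.muted = muted;
    }
};

// Pass the action through first, then react to its type.
const middleware = () => next => action => {
    const result = next(action);

    switch (action.type) {
    case SET_AUDIO_MUTED:
        controller.setMuted(action.muted);
        break;
    }

    return result;
};

// Drive it without a real Redux store, using the identity function as next.
const handle = middleware()(action => action);

handle({ type: SET_AUDIO_MUTED, muted: true });
```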

View File

@@ -0,0 +1,129 @@
import { RecordingAdapter } from './RecordingAdapter';
const logger = require('jitsi-meet-logger').getLogger(__filename);
/**
* Base class for {@code AudioContext}-based recording adapters.
*/
export class AbstractAudioContextAdapter extends RecordingAdapter {
/**
* The {@code AudioContext} instance.
*/
_audioContext = null;
/**
* The {@code ScriptProcessorNode} instance.
*/
_audioProcessingNode = null;
/**
* The {@code MediaStreamAudioSourceNode} instance.
*/
_audioSource = null;
/**
* The {@code MediaStream} instance, representing the current audio device.
*/
_stream = null;
/**
* Sample rate.
*/
_sampleRate = 44100;
/**
* Constructor.
*/
constructor() {
super();
// sampleRate is browser and OS dependent.
// Setting sampleRate explicitly is in the specs but not implemented
// by browsers.
// See: https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/
// AudioContext#Browser_compatibility
// And https://bugs.chromium.org/p/chromium/issues/detail?id=432248
this._audioContext = new AudioContext();
this._sampleRate = this._audioContext.sampleRate;
logger.log(`Current sampleRate ${this._sampleRate}.`);
}
/**
* Sets up the audio graph in the AudioContext.
*
* @protected
* @param {string} micDeviceId - The current microphone device ID.
* @param {Function} callback - Callback function to
* handle AudioProcessingEvents.
* @returns {Promise}
*/
_initializeAudioContext(micDeviceId, callback) {
if (typeof callback !== 'function') {
return Promise.reject('a callback function is required.');
}
return this._getAudioStream(micDeviceId)
.then(stream => {
this._stream = stream;
this._audioSource
= this._audioContext.createMediaStreamSource(stream);
this._audioProcessingNode
= this._audioContext.createScriptProcessor(4096, 1, 1);
this._audioProcessingNode.onaudioprocess = callback;
logger.debug('AudioContext is set up.');
})
.catch(err => {
logger.error(`Error calling getUserMedia(): ${err}`);
return Promise.reject(err);
});
}
/**
* Connects the nodes in the {@code AudioContext} to start the flow of
* audio data.
*
* @protected
* @returns {void}
*/
_connectAudioGraph() {
this._audioSource.connect(this._audioProcessingNode);
this._audioProcessingNode.connect(this._audioContext.destination);
}
/**
* Disconnects the nodes in the {@code AudioContext}.
*
* @protected
* @returns {void}
*/
_disconnectAudioGraph() {
this._audioProcessingNode.onaudioprocess = undefined;
this._audioProcessingNode.disconnect();
this._audioSource.disconnect();
}
/**
* Replaces the current microphone MediaStream.
*
* @protected
* @param {string} micDeviceId - New microphone ID.
* @returns {Promise}
*/
_replaceMic(micDeviceId) {
if (this._audioContext && this._audioProcessingNode) {
return this._getAudioStream(micDeviceId).then(newStream => {
const newSource = this._audioContext
.createMediaStreamSource(newStream);
this._audioSource.disconnect();
newSource.connect(this._audioProcessingNode);
this._stream = newStream;
this._audioSource = newSource;
});
}
return Promise.resolve();
}
}
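The graph wired by `_connectAudioGraph` is `source → processor → destination`, and `_disconnectAudioGraph` tears it down in reverse. A standalone sketch of that lifecycle, using hypothetical stub objects in place of the Web Audio nodes (which require a browser):

```javascript
// Stub nodes stand in for MediaStreamAudioSourceNode and
// ScriptProcessorNode; each records the calls made on it.
function makeStubNode(log, name) {
    return {
        name,
        connect: target => log.push(`${name}->${target.name}`),
        disconnect: () => log.push(`${name} disconnected`)
    };
}

const log = [];
const destination = { name: 'destination' };
const source = makeStubNode(log, 'source');
const processor = makeStubNode(log, 'processor');

// _connectAudioGraph: source -> processor -> destination
source.connect(processor);
processor.connect(destination);

// _disconnectAudioGraph: detach the processor first, then the source
processor.disconnect();
source.disconnect();
```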


@@ -0,0 +1,143 @@
import { RecordingAdapter } from './RecordingAdapter';
const logger = require('jitsi-meet-logger').getLogger(__filename);
/**
* Recording adapter that uses {@code MediaRecorder} (default browser encoding
* with Opus codec).
*/
export class OggAdapter extends RecordingAdapter {
/**
* Instance of MediaRecorder.
* @private
*/
_mediaRecorder = null;
/**
* Initialization promise.
* @private
*/
_initPromise = null;
/**
* The recorded audio file.
* @private
*/
_recordedData = null;
/**
* Implements {@link RecordingAdapter#start()}.
*
* @inheritdoc
*/
start(micDeviceId) {
if (!this._initPromise) {
this._initPromise = this._initialize(micDeviceId);
}
return this._initPromise.then(() =>
new Promise(resolve => {
this._mediaRecorder.start();
resolve();
})
);
}
/**
* Implements {@link RecordingAdapter#stop()}.
*
* @inheritdoc
*/
stop() {
return new Promise(
resolve => {
this._mediaRecorder.onstop = () => resolve();
this._mediaRecorder.stop();
}
);
}
/**
* Implements {@link RecordingAdapter#exportRecordedData()}.
*
* @inheritdoc
*/
exportRecordedData() {
if (this._recordedData !== null) {
return Promise.resolve({
data: this._recordedData,
format: 'ogg'
});
}
return Promise.reject('No audio data recorded.');
}
/**
* Implements {@link RecordingAdapter#setMuted()}.
*
* @inheritdoc
*/
setMuted(muted) {
const shouldEnable = !muted;
if (!this._stream) {
return Promise.resolve();
}
const track = this._stream.getAudioTracks()[0];
if (!track) {
logger.error('Cannot mute/unmute. Track not found!');
return Promise.resolve();
}
if (track.enabled !== shouldEnable) {
track.enabled = shouldEnable;
logger.log(muted ? 'Mute' : 'Unmute');
}
return Promise.resolve();
}
/**
* Initialize the adapter.
*
* @private
* @param {string} micDeviceId - The current microphone device ID.
* @returns {Promise}
*/
_initialize(micDeviceId) {
if (this._mediaRecorder) {
return Promise.resolve();
}
return new Promise((resolve, reject) => {
this._getAudioStream(micDeviceId)
.then(stream => {
this._stream = stream;
this._mediaRecorder = new MediaRecorder(stream);
this._mediaRecorder.ondataavailable
= e => this._saveMediaData(e.data);
resolve();
})
.catch(err => {
logger.error(`Error calling getUserMedia(): ${err}`);
reject(err);
});
});
}
/**
* Callback for storing the encoded data.
*
* @private
* @param {Blob} data - Encoded data.
* @returns {void}
*/
_saveMediaData(data) {
this._recordedData = data;
}
}
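`start()` above caches `_initPromise` so that repeated calls share a single initialization. A minimal standalone sketch of that caching pattern, with a hypothetical `_initialize` stub in place of the real getUserMedia/MediaRecorder setup:

```javascript
// The one-time-initialization pattern used by the recording adapters:
// the first start() kicks off _initialize(); later calls reuse the
// same promise instead of initializing again.
class LazyInitRecorder {
    constructor() {
        this._initPromise = null;
        this.initCount = 0;
    }

    _initialize() {
        this.initCount += 1;

        return Promise.resolve();
    }

    start() {
        if (!this._initPromise) {
            this._initPromise = this._initialize();
        }

        return this._initPromise;
    }
}

const recorder = new LazyInitRecorder();

recorder.start();
recorder.start();
```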


@@ -0,0 +1,85 @@
import JitsiMeetJS from '../../base/lib-jitsi-meet';
/**
* Base class for recording backends.
*/
export class RecordingAdapter {
/**
* Starts recording.
*
* @param {string} micDeviceId - The microphone to record on.
* @returns {Promise}
*/
// eslint-disable-next-line no-unused-vars
start(micDeviceId) {
throw new Error('Not implemented');
}
/**
* Stops recording.
*
* @returns {Promise}
*/
stop() {
throw new Error('Not implemented');
}
/**
* Export the recorded and encoded audio file.
*
* @returns {Promise<Object>}
*/
exportRecordedData() {
throw new Error('Not implemented');
}
/**
* Mutes or unmutes the current recording.
*
* @param {boolean} muted - Whether to mute or to unmute.
* @returns {Promise}
*/
// eslint-disable-next-line no-unused-vars
setMuted(muted) {
throw new Error('Not implemented');
}
/**
* Changes the current microphone.
*
* @param {string} micDeviceId - The new microphone device ID.
* @returns {Promise}
*/
// eslint-disable-next-line no-unused-vars
setMicDevice(micDeviceId) {
throw new Error('Not implemented');
}
/**
* Helper method for getting an audio {@code MediaStream}. Use this instead
* of calling browser APIs directly.
*
* @protected
* @param {string} micDeviceId - The ID of the current audio device.
* @returns {Promise}
*/
_getAudioStream(micDeviceId) {
return JitsiMeetJS.createLocalTracks({
devices: [ 'audio' ],
micDeviceId
}).then(result => {
if (result.length !== 1) {
throw new Error('Unexpected number of streams '
+ 'from createLocalTracks.');
}
const mediaStream = result[0].stream;
if (mediaStream === undefined) {
throw new Error('Failed to create local track.');
}
return mediaStream;
});
}
}
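Each backend (Ogg, WAV, FLAC) implements this contract. For illustration only, a toy in-memory adapter that satisfies the same `start`/`stop`/`exportRecordedData` shape, with microphone capture replaced by a hypothetical `push()` method so it runs outside a browser:

```javascript
// Toy adapter following the RecordingAdapter contract; push() stands
// in for the browser audio capture. Illustration only.
class InMemoryAdapter {
    constructor() {
        this._recording = false;
        this._chunks = [];
    }

    start() {
        this._recording = true;

        return Promise.resolve();
    }

    stop() {
        this._recording = false;

        return Promise.resolve();
    }

    push(chunk) {
        if (this._recording) {
            this._chunks.push(chunk);
        }
    }

    exportRecordedData() {
        if (this._chunks.length) {
            return Promise.resolve({
                data: this._chunks,
                format: 'raw'
            });
        }

        return Promise.reject('No audio data recorded.');
    }
}

const adapter = new InMemoryAdapter();

adapter.start();
adapter.push([ 0, 1 ]);
adapter.stop();
adapter.push([ 2 ]); // dropped: not recording anymore
```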


@@ -0,0 +1,20 @@
/**
* Force download of Blob in browser by faking an <a> tag.
*
* @param {Blob} blob - The {@code Blob} to be downloaded.
* @param {string} fileName - The filename to appear in the download dialog.
* @returns {void}
*/
export function downloadBlob(blob, fileName = 'recording.ogg') {
const objectUrl = window.URL.createObjectURL(blob);
// fake an anchor tag
const a = document.createElement('a');
a.style = 'display: none';
a.href = objectUrl;
a.download = fileName;
document.body.appendChild(a);
a.click();
document.body.removeChild(a);
window.URL.revokeObjectURL(objectUrl);
}


@@ -0,0 +1,290 @@
import { AbstractAudioContextAdapter } from './AbstractAudioContextAdapter';
const logger = require('jitsi-meet-logger').getLogger(__filename);
const WAV_BITS_PER_SAMPLE = 16;
/**
* Recording adapter for raw WAVE format.
*/
export class WavAdapter extends AbstractAudioContextAdapter {
/**
* Length of the WAVE file, in number of samples.
*/
_wavLength = 0;
/**
* The {@code ArrayBuffer}s that stores the PCM bits.
*/
_wavBuffers = [];
/**
* Whether or not the {@code WavAdapter} is in a ready state.
*/
_isInitialized = false;
/**
* Initialization promise.
*/
_initPromise = null;
/**
* Constructor.
*/
constructor() {
super();
this._onAudioProcess = this._onAudioProcess.bind(this);
}
/**
* Implements {@link RecordingAdapter#start()}.
*
* @inheritdoc
*/
start(micDeviceId) {
if (!this._initPromise) {
this._initPromise = this._initialize(micDeviceId);
}
return this._initPromise.then(() => {
this._wavBuffers = [];
this._wavLength = 0;
this._connectAudioGraph();
});
}
/**
* Implements {@link RecordingAdapter#stop()}.
*
* @inheritdoc
*/
stop() {
this._disconnectAudioGraph();
this._data = this._exportMonoWAV(this._wavBuffers, this._wavLength);
this._audioProcessingNode = null;
this._audioSource = null;
this._isInitialized = false;
return Promise.resolve();
}
/**
* Implements {@link RecordingAdapter#exportRecordedData()}.
*
* @inheritdoc
*/
exportRecordedData() {
if (this._data !== null) {
return Promise.resolve({
data: this._data,
format: 'wav'
});
}
return Promise.reject('No audio data recorded.');
}
/**
* Implements {@link RecordingAdapter#setMuted()}.
*
* @inheritdoc
*/
setMuted(muted) {
const shouldEnable = !muted;
if (!this._stream) {
return Promise.resolve();
}
const track = this._stream.getAudioTracks()[0];
if (!track) {
logger.error('Cannot mute/unmute. Track not found!');
return Promise.resolve();
}
if (track.enabled !== shouldEnable) {
track.enabled = shouldEnable;
logger.log(muted ? 'Mute' : 'Unmute');
}
return Promise.resolve();
}
/**
* Implements {@link RecordingAdapter#setMicDevice()}.
*
* @inheritdoc
*/
setMicDevice(micDeviceId) {
return this._replaceMic(micDeviceId);
}
/**
* Creates a WAVE file header.
*
* @private
* @param {number} dataLength - Length of the payload (PCM data), in bytes.
* @returns {Uint8Array}
*/
_createWavHeader(dataLength) {
// adapted from
// https://github.com/mmig/speech-to-flac/blob/master/encoder.js
// ref: http://soundfile.sapp.org/doc/WaveFormat/
// create our WAVE file header
const buffer = new ArrayBuffer(44);
const view = new DataView(buffer);
// RIFF chunk descriptor
writeUTFBytes(view, 0, 'RIFF');
// set file size at the end
writeUTFBytes(view, 8, 'WAVE');
// FMT sub-chunk
writeUTFBytes(view, 12, 'fmt ');
// Subchunk1Size: 16 for PCM
view.setUint32(16, 16, true);
// AudioFormat: 1 for PCM (linear quantization)
view.setUint16(20, 1, true);
// NumChannels
view.setUint16(22, 1, true);
// SampleRate
view.setUint32(24, this._sampleRate, true);
// ByteRate
view.setUint32(28,
Number(this._sampleRate) * 1 * WAV_BITS_PER_SAMPLE / 8, true);
// BlockAlign
view.setUint16(32, 1 * Number(WAV_BITS_PER_SAMPLE) / 8, true);
view.setUint16(34, WAV_BITS_PER_SAMPLE, true);
// data sub-chunk
writeUTFBytes(view, 36, 'data');
// RIFF chunk size: 36 + data length, per the WAVE format spec
view.setUint32(4, 36 + dataLength, true);
// data chunk length
view.setUint32(40, dataLength, true);
return new Uint8Array(buffer);
}
/**
* Initialize the adapter.
*
* @private
* @param {string} micDeviceId - The current microphone device ID.
* @returns {Promise}
*/
_initialize(micDeviceId) {
if (this._isInitialized) {
return Promise.resolve();
}
return this._initializeAudioContext(micDeviceId, this._onAudioProcess)
.then(() => {
this._isInitialized = true;
});
}
/**
* Callback function for handling AudioProcessingEvents.
*
* @private
* @param {AudioProcessingEvent} e - The event containing the raw PCM.
* @returns {void}
*/
_onAudioProcess(e) {
// See: https://developer.mozilla.org/en-US/docs/Web/API/
// AudioBuffer/getChannelData
// The returned value is a Float32Array.
const channelLeft = e.inputBuffer.getChannelData(0);
// Need to copy the Float32Array:
// unlike passing to WebWorker, this data is passed by reference,
// so we need to copy it, otherwise the resulting audio file will be
// just repeating the last segment.
this._wavBuffers.push(new Float32Array(channelLeft));
this._wavLength += channelLeft.length;
}
/**
* Combines buffers and export to a wav file.
*
* @private
* @param {Float32Array[]} buffers - The stored buffers.
* @param {number} length - Total length (number of samples).
* @returns {Blob}
*/
_exportMonoWAV(buffers, length) {
const dataLength = length * 2; // each sample = 16 bit = 2 bytes
const buffer = new ArrayBuffer(44 + dataLength);
const view = new DataView(buffer);
// copy WAV header data into the array buffer
const header = this._createWavHeader(dataLength);
const len = header.length;
for (let i = 0; i < len; ++i) {
view.setUint8(i, header[i]);
}
// write audio data
floatTo16BitPCM(view, 44, buffers);
return new Blob([ view ], { type: 'audio/wav' });
}
}
/**
* Helper function. Writes an ASCII string into memory,
* byte by byte. Required by WAVE headers.
*
* @param {ArrayBuffer} view - The view to memory.
* @param {number} offset - Offset.
* @param {string} string - The string to be written.
* @returns {void}
*/
function writeUTFBytes(view, offset, string) {
const lng = string.length;
// write one byte per character
for (let i = 0; i < lng; ++i) {
view.setUint8(offset + i, string.charCodeAt(i));
}
}
/**
* Helper function for converting Float32Array to Int16Array.
*
* @param {DataView} output - View to the output buffer.
* @param {number} offset - The offset in output buffer to write from.
* @param {Float32Array[]} inputBuffers - The input buffers.
* @returns {void}
*/
function floatTo16BitPCM(output, offset, inputBuffers) {
let i, j;
let input, s, sampleCount;
const bufferCount = inputBuffers.length;
let o = offset;
for (i = 0; i < bufferCount; ++i) {
input = inputBuffers[i];
sampleCount = input.length;
for (j = 0; j < sampleCount; ++j, o += 2) {
s = Math.max(-1, Math.min(1, input[j]));
output.setInt16(o, s < 0 ? s * 0x8000 : s * 0x7FFF, true);
}
}
}
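The clamping and scaling performed by `floatTo16BitPCM` can be exercised in isolation. A self-contained copy of the conversion (restructured slightly for brevity), showing that out-of-range samples are clamped and in-range samples are scaled to the signed 16-bit range:

```javascript
// Same conversion as floatTo16BitPCM above: clamp to [-1, 1], scale
// negatives by 0x8000 and positives by 0x7FFF, write little-endian.
function floatTo16BitPCM(output, offset, inputBuffers) {
    let o = offset;

    for (const input of inputBuffers) {
        for (const sample of input) {
            const s = Math.max(-1, Math.min(1, sample));

            output.setInt16(o, s < 0 ? s * 0x8000 : s * 0x7FFF, true);
            o += 2;
        }
    }
}

const view = new DataView(new ArrayBuffer(8));

// 2.0 is out of range and gets clamped to 1.0 before scaling.
floatTo16BitPCM(view, 0, [ new Float32Array([ 1, -1, 0.5, 2 ]) ]);
```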


@@ -0,0 +1,262 @@
import {
DEBUG,
MAIN_THREAD_FINISH,
MAIN_THREAD_INIT,
MAIN_THREAD_NEW_DATA_ARRIVED,
WORKER_BLOB_READY,
WORKER_LIBFLAC_READY
} from './messageTypes';
import { AbstractAudioContextAdapter } from '../AbstractAudioContextAdapter';
const logger = require('jitsi-meet-logger').getLogger(__filename);
/**
* Recording adapter that uses libflac.js in the background.
*/
export class FlacAdapter extends AbstractAudioContextAdapter {
/**
* Instance of WebWorker (flacEncodeWorker).
*/
_encoder = null;
/**
* Resolve function of the Promise returned by {@code stop()}.
* This is called after the WebWorker sends back {@code WORKER_BLOB_READY}.
*/
_stopPromiseResolver = null;
/**
* Resolve function of the Promise that initializes the flacEncodeWorker.
*/
_initWorkerPromiseResolver = null;
/**
* Initialization promise.
*/
_initPromise = null;
/**
* Constructor.
*/
constructor() {
super();
this._onAudioProcess = this._onAudioProcess.bind(this);
this._onWorkerMessage = this._onWorkerMessage.bind(this);
}
/**
* Implements {@link RecordingAdapter#start()}.
*
* @inheritdoc
*/
start(micDeviceId) {
if (!this._initPromise) {
this._initPromise = this._initialize(micDeviceId);
}
return this._initPromise.then(() => {
this._connectAudioGraph();
});
}
/**
* Implements {@link RecordingAdapter#stop()}.
*
* @inheritdoc
*/
stop() {
if (!this._encoder) {
logger.error('Attempting to stop, but there is nothing to stop.');
return Promise.reject();
}
return new Promise(resolve => {
this._initPromise = null;
this._disconnectAudioGraph();
this._stopPromiseResolver = resolve;
this._encoder.postMessage({
command: MAIN_THREAD_FINISH
});
});
}
/**
* Implements {@link RecordingAdapter#exportRecordedData()}.
*
* @inheritdoc
*/
exportRecordedData() {
if (this._data !== null) {
return Promise.resolve({
data: this._data,
format: 'flac'
});
}
return Promise.reject('No audio data recorded.');
}
/**
* Implements {@link RecordingAdapter#setMuted()}.
*
* @inheritdoc
*/
setMuted(muted) {
const shouldEnable = !muted;
if (!this._stream) {
return Promise.resolve();
}
const track = this._stream.getAudioTracks()[0];
if (!track) {
logger.error('Cannot mute/unmute. Track not found!');
return Promise.resolve();
}
if (track.enabled !== shouldEnable) {
track.enabled = shouldEnable;
logger.log(muted ? 'Mute' : 'Unmute');
}
return Promise.resolve();
}
/**
* Implements {@link RecordingAdapter#setMicDevice()}.
*
* @inheritdoc
*/
setMicDevice(micDeviceId) {
return this._replaceMic(micDeviceId);
}
/**
* Initialize the adapter.
*
* @private
* @param {string} micDeviceId - The current microphone device ID.
* @returns {Promise}
*/
_initialize(micDeviceId) {
if (this._encoder !== null) {
return Promise.resolve();
}
const promiseInitWorker = new Promise((resolve, reject) => {
try {
this._loadWebWorker();
} catch (e) {
reject(e);

return;
}
// Save the Promise's resolver to resolve it later.
// This Promise is only resolved in _onWorkerMessage when we
// receive WORKER_LIBFLAC_READY from the WebWorker.
this._initWorkerPromiseResolver = resolve;
// set up listener for messages from the WebWorker
this._encoder.onmessage = this._onWorkerMessage;
this._encoder.postMessage({
command: MAIN_THREAD_INIT,
config: {
sampleRate: this._sampleRate,
bps: 16
}
});
});
// Arrow function is used here because we want AudioContext to be
// initialized only **after** promiseInitWorker is resolved.
return promiseInitWorker
.then(() =>
this._initializeAudioContext(
micDeviceId,
this._onAudioProcess
));
}
/**
* Callback function for handling AudioProcessingEvents.
*
* @private
* @param {AudioProcessingEvent} e - The event containing the raw PCM.
* @returns {void}
*/
_onAudioProcess(e) {
// Delegates to the WebWorker to do the encoding.
// The return of getChannelData() is a Float32Array,
// each element representing one sample.
const channelLeft = e.inputBuffer.getChannelData(0);
this._encoder.postMessage({
command: MAIN_THREAD_NEW_DATA_ARRIVED,
buf: channelLeft
});
}
/**
* Handler for messages from flacEncodeWorker.
*
* @private
* @param {MessageEvent} e - The event sent by the WebWorker.
* @returns {void}
*/
_onWorkerMessage(e) {
switch (e.data.command) {
case WORKER_BLOB_READY:
// Received a Blob representing an encoded FLAC file.
this._data = e.data.buf;
if (this._stopPromiseResolver !== null) {
this._stopPromiseResolver();
this._stopPromiseResolver = null;
this._encoder.terminate();
this._encoder = null;
}
break;
case DEBUG:
logger.log(e.data);
break;
case WORKER_LIBFLAC_READY:
logger.log('libflac is ready.');
this._initWorkerPromiseResolver();
break;
default:
logger.error(
`Unknown event from encoder (WebWorker): "${e.data.command}"!`);
break;
}
}
/**
* Loads the WebWorker.
*
* @private
* @returns {void}
*/
_loadWebWorker() {
// FIXME: Workaround for different file names in development/
// production environments.
// We cannot import flacEncodeWorker as a webpack module,
// because it is in a different bundle and should be lazy-loaded
// only when flac recording is in use.
try {
// try to load the minified version first
this._encoder = new Worker('/libs/flacEncodeWorker.min.js');
} catch (exception1) {
// if failed, try unminified version
try {
this._encoder = new Worker('/libs/flacEncodeWorker.js');
} catch (exception2) {
throw new Error('Failed to load flacEncodeWorker.');
}
}
}
}


@@ -0,0 +1,397 @@
import {
MAIN_THREAD_FINISH,
MAIN_THREAD_INIT,
MAIN_THREAD_NEW_DATA_ARRIVED,
WORKER_BLOB_READY,
WORKER_LIBFLAC_READY
} from './messageTypes';
const logger = require('jitsi-meet-logger').getLogger(__filename);
/**
* WebWorker that does FLAC encoding using libflac.js
*/
self.FLAC_SCRIPT_LOCATION = '/libs/';
/* eslint-disable */
importScripts('/libs/libflac4-1.3.2.min.js');
/* eslint-enable */
// There are a number of API calls to libflac.js which do not conform
// to the camelCase naming convention, but we cannot change them.
// So we disable the ESLint rule `new-cap` in this file.
/* eslint-disable new-cap */
// Flow will complain about the number keys in `FLAC_ERRORS`,
// ESLint will complain about the `declare` statement.
// As the current workaround, add an exception for eslint.
/* eslint-disable flowtype/no-types-missing-file-annotation */
declare var Flac: Object;
const FLAC_ERRORS = {
// The encoder is in the normal OK state and samples can be processed.
0: 'FLAC__STREAM_ENCODER_OK',
// The encoder is in the uninitialized state one of the
// FLAC__stream_encoder_init_*() functions must be called before samples can
// be processed.
1: 'FLAC__STREAM_ENCODER_UNINITIALIZED',
// An error occurred in the underlying Ogg layer.
2: 'FLAC__STREAM_ENCODER_OGG_ERROR',
// An error occurred in the underlying verify stream decoder; check
// FLAC__stream_encoder_get_verify_decoder_state().
3: 'FLAC__STREAM_ENCODER_VERIFY_DECODER_ERROR',
// The verify decoder detected a mismatch between the original audio signal
// and the decoded audio signal.
4: 'FLAC__STREAM_ENCODER_VERIFY_MISMATCH_IN_AUDIO_DATA',
// One of the callbacks returned a fatal error.
5: 'FLAC__STREAM_ENCODER_CLIENT_ERROR',
// An I/O error occurred while opening/reading/writing a file. Check errno.
6: 'FLAC__STREAM_ENCODER_IO_ERROR',
// An error occurred while writing the stream; usually, the write_callback
// returned an error.
7: 'FLAC__STREAM_ENCODER_FRAMING_ERROR',
// Memory allocation failed.
8: 'FLAC__STREAM_ENCODER_MEMORY_ALLOCATION_ERROR'
};
/**
* States of the {@code Encoder}.
*/
const EncoderState = Object.freeze({
/**
* Initial state, when libflac.js is not initialized.
*/
UNINTIALIZED: Symbol('uninitialized'),
/**
* Actively encoding new audio bits.
*/
WORKING: Symbol('working'),
/**
* Encoding has finished and encoded bits are available.
*/
FINISHED: Symbol('finished')
});
/**
* Default FLAC compression level.
*/
const FLAC_COMPRESSION_LEVEL = 5;
/**
* Concat multiple Uint8Arrays into one.
*
* @param {Uint8Array[]} arrays - Array of Uint8 arrays.
* @param {number} totalLength - Total length of all Uint8Arrays.
* @returns {Uint8Array}
*/
function mergeUint8Arrays(arrays, totalLength) {
const result = new Uint8Array(totalLength);
let offset = 0;
const len = arrays.length;
for (let i = 0; i < len; i++) {
const buffer = arrays[i];
result.set(buffer, offset);
offset += buffer.length;
}
return result;
}
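For example, merging the per-callback FLAC chunks preserves both order and byte count (standalone copy of the helper above, for illustration):

```javascript
// Standalone copy of mergeUint8Arrays: concatenate chunks into one
// contiguous Uint8Array of the given total length.
function mergeUint8Arrays(arrays, totalLength) {
    const result = new Uint8Array(totalLength);
    let offset = 0;

    for (const buffer of arrays) {
        result.set(buffer, offset);
        offset += buffer.length;
    }

    return result;
}

const merged = mergeUint8Arrays(
    [ new Uint8Array([ 1, 2 ]), new Uint8Array([ 3, 4, 5 ]) ], 5);
```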
/**
* Wrapper class around libflac API.
*/
class Encoder {
/**
* Flac encoder instance ID. (As per libflac.js API).
* @private
*/
_encoderId = 0;
/**
* Sample rate.
* @private
*/
_sampleRate;
/**
* Bit depth (bits per sample).
* @private
*/
_bitDepth;
/**
* Buffer size.
* @private
*/
_bufferSize;
/**
* Buffers to store encoded bits temporarily.
*/
_flacBuffers = [];
/**
* Length of encoded FLAC bits.
*/
_flacLength = 0;
/**
* The current state of the {@code Encoder}.
*/
_state = EncoderState.UNINTIALIZED;
/**
* The ready-for-grab downloadable Blob.
*/
_data = null;
/**
* Constructor.
* Note: only create instance when Flac.isReady() returns true.
*
* @param {number} sampleRate - Sample rate of the raw audio data.
* @param {number} bitDepth - Bit depth (bit per sample).
* @param {number} bufferSize - The size of each batch.
*/
constructor(sampleRate, bitDepth = 16, bufferSize = 4096) {
if (!Flac.isReady()) {
throw new Error('libflac is not ready yet!');
}
this._sampleRate = sampleRate;
this._bitDepth = bitDepth;
this._bufferSize = bufferSize;
// create the encoder
this._encoderId = Flac.init_libflac_encoder(
this._sampleRate,
// Mono channel
1,
this._bitDepth,
FLAC_COMPRESSION_LEVEL,
// Pass 0 because the total number of samples is unknown.
0,
// checksum, FIXME: double-check whether this is necessary
true,
// Auto-determine block size (samples per frame)
0
);
if (this._encoderId === 0) {
throw new Error('Failed to create libflac encoder.');
}
// initialize the encoder
const initResult = Flac.init_encoder_stream(
this._encoderId,
this._onEncodedData.bind(this),
this._onMetadataAvailable.bind(this)
);
if (initResult !== 0) {
throw new Error('Failed to initialize libflac encoder.');
}
this._state = EncoderState.WORKING;
}
/**
* Receive and encode new data.
*
* @param {Float32Array} audioData - Raw audio data.
* @returns {void}
*/
encode(audioData) {
if (this._state !== EncoderState.WORKING) {
throw new Error('Encoder is not ready or has finished.');
}
if (!Flac.isReady()) {
throw new Error('Flac not ready');
}
const bufferLength = audioData.length;
// Convert sample to signed 32-bit integers.
// According to libflac documentation:
// each sample in the buffers should be a signed integer,
// right-justified to the resolution set by
// FLAC__stream_encoder_set_bits_per_sample().
// Here we are using 16 bits per sample, the samples should all be in
// the range [-32768, 32767]. This is achieved by multiplying the
// Float32 samples by 0x7FFF.
const bufferI32 = new Int32Array(bufferLength);
const view = new DataView(bufferI32.buffer);
const volume = 1;
let index = 0;
for (let i = 0; i < bufferLength; i++) {
view.setInt32(index, audioData[i] * (0x7FFF * volume), true);
index += 4; // 4 bytes (32-bit)
}
// pass it to libflac
const status = Flac.FLAC__stream_encoder_process_interleaved(
this._encoderId,
bufferI32,
bufferI32.length
);
if (status !== 1) {
// gets error number
const errorNo
= Flac.FLAC__stream_encoder_get_state(this._encoderId);
logger.error('Error during encoding', FLAC_ERRORS[errorNo]);
}
}
/**
* Signals the termination of encoding.
*
* @returns {void}
*/
finish() {
if (this._state === EncoderState.WORKING) {
this._state = EncoderState.FINISHED;
const status = Flac.FLAC__stream_encoder_finish(this._encoderId);
logger.log('Flac encoding finished: ', status);
// free up resources
Flac.FLAC__stream_encoder_delete(this._encoderId);
this._data = this._exportFlacBlob();
}
}
/**
* Gets the encoded flac file.
*
* @returns {Blob} - The encoded flac file.
*/
getBlob() {
if (this._state === EncoderState.FINISHED) {
return this._data;
}
return null;
}
/**
* Converts flac buffer to a Blob.
*
* @private
* @returns {void}
*/
_exportFlacBlob() {
const samples = mergeUint8Arrays(this._flacBuffers, this._flacLength);
const blob = new Blob([ samples ], { type: 'audio/flac' });
return blob;
}
/* eslint-disable no-unused-vars */
/**
* Callback function for saving encoded Flac data.
* This is invoked by libflac.
*
* @private
* @param {Uint8Array} buffer - The encoded Flac data.
* @param {number} bytes - Number of bytes in the data.
* @returns {void}
*/
_onEncodedData(buffer, bytes) {
this._flacBuffers.push(buffer);
this._flacLength += buffer.byteLength;
}
/* eslint-enable no-unused-vars */
/**
* Callback function for receiving metadata.
*
* @private
* @returns {void}
*/
_onMetadataAvailable = () => {
// reserved for future use
}
}
let encoder = null;
self.onmessage = function(e) {
switch (e.data.command) {
case MAIN_THREAD_INIT:
{
const bps = e.data.config.bps;
const sampleRate = e.data.config.sampleRate;
if (Flac.isReady()) {
encoder = new Encoder(sampleRate, bps);
self.postMessage({
command: WORKER_LIBFLAC_READY
});
} else {
Flac.onready = function() {
setTimeout(() => {
encoder = new Encoder(sampleRate, bps);
self.postMessage({
command: WORKER_LIBFLAC_READY
});
}, 0);
};
}
break;
}
case MAIN_THREAD_NEW_DATA_ARRIVED:
if (encoder === null) {
logger.error('flacEncoderWorker received data when the encoder is'
+ ' not ready.');
} else {
encoder.encode(e.data.buf);
}
break;
case MAIN_THREAD_FINISH:
if (encoder !== null) {
encoder.finish();
const data = encoder.getBlob();
self.postMessage(
{
command: WORKER_BLOB_READY,
buf: data
}
);
encoder = null;
}
break;
}
};


@@ -0,0 +1 @@
export * from './FlacAdapter';


@@ -0,0 +1,44 @@
/**
* Types of messages that are passed between the main thread and the WebWorker
* ({@code flacEncodeWorker})
*/
// Messages sent by the main thread
/**
* Message type that signals the termination of encoding,
* after which no new audio bits should be sent to the
* WebWorker.
*/
export const MAIN_THREAD_FINISH = 'MAIN_THREAD_FINISH';
/**
* Message type that carries initial parameters for
* the WebWorker.
*/
export const MAIN_THREAD_INIT = 'MAIN_THREAD_INIT';
/**
* Message type that carries the newly received raw audio bits
* for the WebWorker to encode.
*/
export const MAIN_THREAD_NEW_DATA_ARRIVED = 'MAIN_THREAD_NEW_DATA_ARRIVED';
// Messages sent by the WebWorker
/**
* Message type that signals libflac is ready to receive audio bits.
*/
export const WORKER_LIBFLAC_READY = 'WORKER_LIBFLAC_READY';
/**
* Message type that carries the encoded FLAC file as a Blob.
*/
export const WORKER_BLOB_READY = 'WORKER_BLOB_READY';
// Messages sent by either the main thread or the WebWorker
/**
* Debug messages.
*/
export const DEBUG = 'DEBUG';
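The intended sequence is: the main thread sends `MAIN_THREAD_INIT`, the worker answers `WORKER_LIBFLAC_READY`; audio then flows via `MAIN_THREAD_NEW_DATA_ARRIVED`; finally `MAIN_THREAD_FINISH` is answered with `WORKER_BLOB_READY`. A sketch of that handshake with a toy worker (plain function calls standing in for `postMessage`):

```javascript
const MAIN_THREAD_FINISH = 'MAIN_THREAD_FINISH';
const MAIN_THREAD_INIT = 'MAIN_THREAD_INIT';
const MAIN_THREAD_NEW_DATA_ARRIVED = 'MAIN_THREAD_NEW_DATA_ARRIVED';
const WORKER_BLOB_READY = 'WORKER_BLOB_READY';
const WORKER_LIBFLAC_READY = 'WORKER_LIBFLAC_READY';

// Messages the toy worker posts back to the main thread.
const workerReplies = [];

// A toy worker: replies to INIT and FINISH, silently consumes data.
function toyWorker(message) {
    if (message.command === MAIN_THREAD_INIT) {
        workerReplies.push(WORKER_LIBFLAC_READY);
    } else if (message.command === MAIN_THREAD_FINISH) {
        workerReplies.push(WORKER_BLOB_READY);
    }
}

toyWorker({ command: MAIN_THREAD_INIT,
    config: { sampleRate: 44100, bps: 16 } });
toyWorker({ command: MAIN_THREAD_NEW_DATA_ARRIVED,
    buf: new Float32Array(0) });
toyWorker({ command: MAIN_THREAD_FINISH });
```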


@@ -0,0 +1,5 @@
export * from './OggAdapter';
export * from './RecordingAdapter';
export * from './Utils';
export * from './WavAdapter';
export * from './flac';


@@ -0,0 +1,35 @@
/* @flow */
import { ReducerRegistry } from '../base/redux';
import {
LOCAL_RECORDING_ENGAGED,
LOCAL_RECORDING_STATS_UPDATE,
LOCAL_RECORDING_UNENGAGED
} from './actionTypes';
import { recordingController } from './controller';
ReducerRegistry.register('features/local-recording', (state = {}, action) => {
switch (action.type) {
case LOCAL_RECORDING_ENGAGED: {
return {
...state,
isEngaged: true,
recordingEngagedAt: action.recordingEngagedAt,
encodingFormat: recordingController._format
};
}
case LOCAL_RECORDING_UNENGAGED:
return {
...state,
isEngaged: false,
recordingEngagedAt: null
};
case LOCAL_RECORDING_STATS_UPDATE:
return {
...state,
stats: action.stats
};
default:
return state;
}
});
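Since the reducer is a pure function of `(state, action)`, the engage/unengage transitions are easy to check standalone. A sketch with the action-type strings inlined and `encodingFormat` taken from the action rather than the controller (an illustrative simplification):

```javascript
// Simplified stand-in for the reducer registered above.
function localRecordingReducer(state = {}, action) {
    switch (action.type) {
    case 'LOCAL_RECORDING_ENGAGED':
        return {
            ...state,
            isEngaged: true,
            recordingEngagedAt: action.recordingEngagedAt,
            encodingFormat: action.encodingFormat
        };
    case 'LOCAL_RECORDING_UNENGAGED':
        return {
            ...state,
            isEngaged: false,
            recordingEngagedAt: null
        };
    default:
        return state;
    }
}

const engaged = localRecordingReducer(undefined, {
    type: 'LOCAL_RECORDING_ENGAGED',
    recordingEngagedAt: 1000,
    encodingFormat: 'flac'
});
const unengaged = localRecordingReducer(engaged, {
    type: 'LOCAL_RECORDING_UNENGAGED'
});
```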


@@ -0,0 +1,439 @@
/* @flow */
import jitsiLocalStorage from '../../../../modules/util/JitsiLocalStorage';
const logger = require('jitsi-meet-logger').getLogger(__filename);
/**
* Gets high precision system time.
*
* @returns {number}
*/
function highPrecisionTime(): number {
return window.performance
&& window.performance.now
&& window.performance.timing
&& window.performance.timing.navigationStart
? window.performance.now() + window.performance.timing.navigationStart
: Date.now();
}
// Have to use string literals here, instead of Symbols,
// because these values need to be JSON-serializable.
/**
* Types of SessionEvents.
*/
const SessionEventType = Object.freeze({
/**
* Start of local recording session. This is recorded when the
* {@code RecordingController} receives the signal to start local recording,
* before the actual adapter is engaged.
*/
SESSION_STARTED: 'SESSION_STARTED',
/**
* Start of a continuous segment. This is recorded when the adapter is
* engaged. Can happen multiple times in a local recording session,
* due to browser reloads or switching of recording device.
*/
SEGMENT_STARTED: 'SEGMENT_STARTED',
/**
* End of a continuous segment. This is recorded when the adapter unengages.
*/
SEGMENT_ENDED: 'SEGMENT_ENDED'
});
/**
* Represents an event during a local recording session.
* The event can be either that the adapter started recording, or stopped
* recording.
*/
type SessionEvent = {
/**
* The type of the event.
* Should be one of the values in {@code SessionEventType}.
*/
type: string,
/**
* The timestamp of the event.
*/
timestamp: number
};
/**
* Representation of the metadata of a segment.
*/
type SegmentInfo = {
/**
* The length of gap before this segment, in milliseconds.
* null if unknown.
*/
gapBefore?: ?number,
/**
* The duration of this segment, in milliseconds.
* null if unknown or the segment is not finished.
*/
duration?: ?number,
/**
* The start time, in milliseconds.
*/
start?: ?number,
/**
* The end time, in milliseconds.
* null if unknown, the segment is not finished, or the recording is
* interrupted (e.g. browser reload).
*/
end?: ?number
};
/**
* Representation of metadata of a local recording session.
*/
type SessionInfo = {
/**
* The session token.
*/
sessionToken: string,
/**
* The start time of the session.
*/
start: ?number,
/**
* The recording format.
*/
format: string,
/**
* Array of segments in the session.
*/
segments: SegmentInfo[]
}
/**
* {@code localStorage} key.
*/
const LOCAL_STORAGE_KEY = 'localRecordingMetadataVersion1';
/**
* SessionManager manages the metadata of each segment during each local
* recording session.
*
* A segment is a continuous portion of recording done using the same adapter
* on the same microphone device.
*
* Browser refreshes and switching of the microphone will cause new
* segments to be created.
*
* A recording session can consist of one or more segments.
*/
class SessionManager {
/**
* The metadata.
*/
_sessionsMetadata = {};
/**
* Constructor.
*/
constructor() {
this._loadMetadata();
}
/**
* Loads metadata from localStorage.
*
* @private
* @returns {void}
*/
_loadMetadata() {
const dataStr = jitsiLocalStorage.getItem(LOCAL_STORAGE_KEY);
if (dataStr !== null) {
try {
const dataObject = JSON.parse(dataStr);
this._sessionsMetadata = dataObject;
} catch (e) {
logger.warn('Failed to parse localStorage item.');
return;
}
}
}
/**
* Persists metadata to localStorage.
*
* @private
* @returns {void}
*/
_saveMetadata() {
jitsiLocalStorage.setItem(LOCAL_STORAGE_KEY,
JSON.stringify(this._sessionsMetadata));
}
/**
* Creates a session if not exists.
*
* @param {string} sessionToken - The local recording session token.
* @param {string} format - The local recording format.
* @returns {void}
*/
createSession(sessionToken: string, format: string) {
if (this._sessionsMetadata[sessionToken] === undefined) {
this._sessionsMetadata[sessionToken] = {
format,
events: []
};
this._sessionsMetadata[sessionToken].events.push({
type: SessionEventType.SESSION_STARTED,
timestamp: highPrecisionTime()
});
this._saveMetadata();
} else {
logger.warn(`Session ${sessionToken} already exists`);
}
}
/**
* Gets all the Sessions.
*
* @returns {SessionInfo[]}
*/
getSessions(): SessionInfo[] {
const sessionTokens = Object.keys(this._sessionsMetadata);
const output = [];
for (let i = 0; i < sessionTokens.length; ++i) {
const thisSession = this._sessionsMetadata[sessionTokens[i]];
const newSessionInfo: SessionInfo = {
start: thisSession.events[0].timestamp,
format: thisSession.format,
sessionToken: sessionTokens[i],
segments: this.getSegments(sessionTokens[i])
};
output.push(newSessionInfo);
}
output.sort((a, b) => (a.start || 0) - (b.start || 0));
return output;
}
/**
* Removes session metadata.
*
* @param {string} sessionToken - The session token.
* @returns {void}
*/
removeSession(sessionToken: string) {
delete this._sessionsMetadata[sessionToken];
this._saveMetadata();
}
/**
* Gets the segments of a given session.
*
* @param {string} sessionToken - The session token.
* @returns {SegmentInfo[]}
*/
getSegments(sessionToken: string): SegmentInfo[] {
const thisSession = this._sessionsMetadata[sessionToken];
if (thisSession) {
return this._constructSegments(thisSession.events);
}
return [];
}
/**
* Marks the start of a new segment.
* This should be invoked by {@code RecordingAdapter}s when they need to
* start asynchronous operations (such as switching tracks) that interrupt
* recording.
*
* @param {string} sessionToken - The token of the session to start a new
* segment in.
* @returns {number} - Current segment index.
*/
beginSegment(sessionToken: string): number {
if (this._sessionsMetadata[sessionToken] === undefined) {
logger.warn('Attempting to add segments to nonexistent'
+ ` session ${sessionToken}`);
return -1;
}
this._sessionsMetadata[sessionToken].events.push({
type: SessionEventType.SEGMENT_STARTED,
timestamp: highPrecisionTime()
});
this._saveMetadata();
return this.getSegments(sessionToken).length - 1;
}
/**
* Gets the current segment index, starting from 0 for the first
* segment.
*
* @param {string} sessionToken - The session token.
* @returns {number}
*/
getCurrentSegmentIndex(sessionToken: string): number {
if (this._sessionsMetadata[sessionToken] === undefined) {
return -1;
}
const segments = this.getSegments(sessionToken);
if (segments.length === 0) {
return -1;
}
const lastSegment = segments[segments.length - 1];
if (lastSegment.end) {
// last segment is already ended
return -1;
}
return segments.length - 1;
}
/**
* Marks the end of the last segment in a session.
*
* @param {string} sessionToken - The session token.
* @returns {void}
*/
endSegment(sessionToken: string) {
if (this._sessionsMetadata[sessionToken] === undefined) {
logger.warn('Attempting to end a segment in nonexistent'
+ ` session ${sessionToken}`);
} else {
this._sessionsMetadata[sessionToken].events.push({
type: SessionEventType.SEGMENT_ENDED,
timestamp: highPrecisionTime()
});
this._saveMetadata();
}
}
/**
* Constructs an array of {@code SegmentInfo} from an array of
* {@code SessionEvent}s.
*
* @private
* @param {SessionEvent[]} events - The array of {@code SessionEvent}s.
* @returns {SegmentInfo[]}
*/
_constructSegments(events: SessionEvent[]): SegmentInfo[] {
if (events.length === 0) {
return [];
}
const output = [];
let sessionStartTime = null;
let currentSegment: SegmentInfo = {};
/**
* Helper function for adding a new {@code SegmentInfo} object to the
* output.
*
* @returns {void}
*/
function commit() {
if (currentSegment.gapBefore === undefined
|| currentSegment.gapBefore === null) {
if (output.length > 0 && output[output.length - 1].end) {
const lastSegment = output[output.length - 1];
if (currentSegment.start && lastSegment.end) {
currentSegment.gapBefore = currentSegment.start
- lastSegment.end;
} else {
currentSegment.gapBefore = null;
}
} else if (sessionStartTime !== null && output.length === 0) {
currentSegment.gapBefore = currentSegment.start
? currentSegment.start - sessionStartTime
: null;
} else {
currentSegment.gapBefore = null;
}
}
currentSegment.duration = currentSegment.end && currentSegment.start
? currentSegment.end - currentSegment.start
: null;
output.push(currentSegment);
currentSegment = {};
}
for (let i = 0; i < events.length; ++i) {
const currentEvent = events[i];
switch (currentEvent.type) {
case SessionEventType.SESSION_STARTED:
if (sessionStartTime === null) {
sessionStartTime = currentEvent.timestamp;
} else {
logger.warn('Unexpected SESSION_STARTED event.'
, currentEvent);
}
break;
case SessionEventType.SEGMENT_STARTED:
if (currentSegment.start === undefined
|| currentSegment.start === null) {
currentSegment.start = currentEvent.timestamp;
} else {
commit();
currentSegment.start = currentEvent.timestamp;
}
break;
case SessionEventType.SEGMENT_ENDED:
if (currentSegment.start === undefined
|| currentSegment.start === null) {
logger.warn('Unexpected SEGMENT_ENDED event', currentEvent);
} else {
currentSegment.end = currentEvent.timestamp;
commit();
}
break;
default:
logger.warn('Unexpected event type in _constructSegments.', currentEvent);
break;
}
}
if (currentSegment.start) {
commit();
}
return output;
}
}
/**
* Global singleton of {@code SessionManager}.
*/
export const sessionManager = new SessionManager();
// For debug only. To remove later.
window.sessionManager = sessionManager;
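The event-log-to-segment reconstruction performed by {@code _constructSegments} can be sketched standalone. The following is a simplified, self-contained model (plain numbers stand in for the high-precision timestamps; the names here are illustrative, not the module's API):

```javascript
// Simplified sketch of how SessionManager rebuilds segments from an
// append-only event log. A SEGMENT_STARTED arriving while a segment is
// still open (e.g. after a page refresh) closes the previous one.
const SESSION_STARTED = 'SESSION_STARTED';
const SEGMENT_STARTED = 'SEGMENT_STARTED';
const SEGMENT_ENDED = 'SEGMENT_ENDED';

function constructSegments(events) {
    const output = [];
    let sessionStart = null;
    let current = {};

    const commit = () => {
        const prev = output[output.length - 1];

        if (prev && prev.end && current.start) {
            // Gap between the end of the previous segment and this one.
            current.gapBefore = current.start - prev.end;
        } else if (sessionStart !== null && !prev && current.start) {
            // First segment: gap since the session itself started.
            current.gapBefore = current.start - sessionStart;
        } else {
            current.gapBefore = null;
        }
        current.duration = current.start && current.end
            ? current.end - current.start
            : null;
        output.push(current);
        current = {};
    };

    for (const e of events) {
        switch (e.type) {
        case SESSION_STARTED:
            sessionStart = e.timestamp;
            break;
        case SEGMENT_STARTED:
            if (current.start != null) {
                commit(); // previous segment was never explicitly ended
            }
            current.start = e.timestamp;
            break;
        case SEGMENT_ENDED:
            if (current.start != null) {
                current.end = e.timestamp;
                commit();
            }
            break;
        }
    }
    if (current.start != null) {
        commit(); // trailing, still-open segment
    }

    return output;
}

// A session with two segments separated by a 2-unit pause:
const segments = constructSegments([
    { type: SESSION_STARTED, timestamp: 0 },
    { type: SEGMENT_STARTED, timestamp: 1 },
    { type: SEGMENT_ENDED, timestamp: 5 },
    { type: SEGMENT_STARTED, timestamp: 7 },
    { type: SEGMENT_ENDED, timestamp: 9 }
]);
```

This mirrors the shape of the real implementation: segments only ever gain {@code start}, {@code end}, {@code gapBefore} and {@code duration}, and unmatched events are dropped rather than corrupting earlier segments.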


@@ -0,0 +1 @@
export * from './SessionManager';


@@ -76,7 +76,6 @@ class Notification extends AbstractNotification<Props> {
pointerEvents = 'box-none'
style = { styles.notificationContent }>
{
// eslint-disable-next-line no-extra-parens
this._getDescription().map((line, index) => (
<Text
key = { index }


@@ -86,19 +86,20 @@ class BroadcastsDropdown extends PureComponent {
render() {
const { broadcasts, selectedBoundStreamID, t } = this.props;
const dropdownItems = broadcasts.map(broadcast =>
// eslint-disable-next-line react/jsx-wrap-multilines
<DropdownItem
key = { broadcast.boundStreamID }
// eslint-disable-next-line react/jsx-no-bind
onClick = { () => this._onSelect(broadcast.boundStreamID) }>
{ broadcast.title }
</DropdownItem>
);
const selected = this.props.broadcasts.find(
broadcast => broadcast.boundStreamID === selectedBoundStreamID);
const triggerText = (selected && selected.title)
|| t('liveStreaming.choose');
const dropdownItems
= broadcasts.map(broadcast => (
<DropdownItem
key = { broadcast.boundStreamID }
// eslint-disable-next-line react/jsx-no-bind
onClick = { () => this._onSelect(broadcast.boundStreamID) }>
{ broadcast.title }
</DropdownItem>));
const selected
= this.props.broadcasts.find(
broadcast => broadcast.boundStreamID === selectedBoundStreamID);
const triggerText
= (selected && selected.title) || t('liveStreaming.choose');
return (
<div className = 'broadcast-dropdown'>


@@ -237,7 +237,7 @@ class StartLiveStreamDialog
switch (this.props._googleAPIState) {
case GOOGLE_API_STATES.LOADED:
googleContent = ( // eslint-disable-line no-extra-parens
googleContent = (
<GoogleSignInButton
onClick = { this._onGoogleSignIn }
text = { t('liveStreaming.signIn') } />
@@ -247,7 +247,7 @@ class StartLiveStreamDialog
break;
case GOOGLE_API_STATES.SIGNED_IN:
googleContent = ( // eslint-disable-line no-extra-parens
googleContent = (
<BroadcastsDropdown
broadcasts = { broadcasts }
onBroadcastSelected = { this._onYouTubeBroadcastIDSelected }
@@ -259,7 +259,7 @@ class StartLiveStreamDialog
* that also accepts the anchor. This can be done using the Trans
* component of react-i18next but I couldn't get it working...
*/
helpText = ( // eslint-disable-line no-extra-parens
helpText = (
<div>
{ `${t('liveStreaming.chooseCTA',
{ email: _googleProfileEmail })} ` }
@@ -273,7 +273,7 @@ class StartLiveStreamDialog
case GOOGLE_API_STATES.NEEDS_LOADING:
default:
googleContent = ( // eslint-disable-line no-extra-parens
googleContent = (
<Spinner
isCompleting = { false }
size = 'medium' />
@@ -283,7 +283,7 @@ class StartLiveStreamDialog
}
if (this.state.errorType !== undefined) {
googleContent = ( // eslint-disable-line no-extra-parens
googleContent = (
<GoogleSignInButton
onClick = { this._onRequestGoogleSignIn }
text = { t('liveStreaming.signIn') } />


@@ -143,16 +143,16 @@ class MoreTab extends AbstractDialogTab<Props, State> {
t
} = this.props;
const languageItems = languages.map(language =>
// eslint-disable-next-line react/jsx-wrap-multilines
<DropdownItem
key = { language }
// eslint-disable-next-line react/jsx-no-bind
onClick = {
() => super._onChange({ currentLanguage: language }) }>
{ t(`languages:${language}`) }
</DropdownItem>
);
const languageItems
= languages.map(language => (
<DropdownItem
key = { language }
// eslint-disable-next-line react/jsx-no-bind
onClick = {
() => super._onChange({ currentLanguage: language }) }>
{ t(`languages:${language}`) }
</DropdownItem>));
return (
<div


@@ -101,7 +101,7 @@ class OverflowMenuItem extends Component<Props> {
* @returns {ReactElement}
*/
_renderText() {
const textElement = ( // eslint-disable-line no-extra-parens
const textElement = (
<span className = 'overflow-menu-item-text'>
{ this.props.text }
</span>


@@ -28,6 +28,10 @@ import {
isDialOutEnabled
} from '../../../invite';
import { openKeyboardShortcutsDialog } from '../../../keyboard-shortcuts';
import {
LocalRecordingButton,
LocalRecordingInfoDialog
} from '../../../local-recording';
import {
LiveStreamButton,
RecordButton
@@ -40,6 +44,7 @@ import {
import { toggleSharedVideo } from '../../../shared-video';
import { toggleChat } from '../../../side-panel';
import { SpeakerStats } from '../../../speaker-stats';
import { TileViewButton } from '../../../video-layout';
import {
OverflowMenuVideoQualityItem,
VideoQualityDialog
@@ -128,6 +133,11 @@ type Props = {
*/
_localParticipantID: String,
/**
* The subsection of Redux state for local recording.
*/
_localRecState: Object,
/**
* Whether or not the overflow menu is visible.
*/
@@ -158,6 +168,7 @@ type Props = {
*/
_visible: boolean,
/**
* Set with the buttons which this Toolbox should display.
*/
@@ -227,6 +238,8 @@ class Toolbox extends Component<Props> {
= this._onToolbarToggleScreenshare.bind(this);
this._onToolbarToggleSharedVideo
= this._onToolbarToggleSharedVideo.bind(this);
this._onToolbarOpenLocalRecordingInfoDialog
= this._onToolbarOpenLocalRecordingInfoDialog.bind(this);
}
/**
@@ -369,6 +382,14 @@ class Toolbox extends Component<Props> {
visible = { this._shouldShowButton('camera') } />
</div>
<div className = 'button-group-right'>
{ this._shouldShowButton('localrecording')
&& <LocalRecordingButton
onClick = {
this._onToolbarOpenLocalRecordingInfoDialog
} />
}
{ this._shouldShowButton('tileview')
&& <TileViewButton /> }
{ this._shouldShowButton('invite')
&& !_hideInviteButton
&& <ToolbarButton
@@ -839,6 +860,20 @@ class Toolbox extends Component<Props> {
this._doToggleSharedVideo();
}
_onToolbarOpenLocalRecordingInfoDialog: () => void;
/**
* Opens the {@code LocalRecordingInfoDialog}.
*
* @private
* @returns {void}
*/
_onToolbarOpenLocalRecordingInfoDialog() {
sendAnalytics(createToolbarEvent('local.recording'));
this.props.dispatch(openDialog(LocalRecordingInfoDialog));
}
/**
* Renders a button for toggleing screen sharing.
*
@@ -981,7 +1016,7 @@ class Toolbox extends Component<Props> {
* Returns if a button name has been explicitly configured to be displayed.
*
* @param {string} buttonName - The name of the button, as expected in
* {@link intefaceConfig}.
* {@link interfaceConfig}.
* @private
* @returns {boolean} True if the button should be displayed.
*/
@@ -1018,6 +1053,7 @@ function _mapStateToProps(state) {
visible
} = state['features/toolbox'];
const localParticipant = getLocalParticipant(state);
const localRecordingStates = state['features/local-recording'];
const localVideo = getLocalVideoTrack(state['features/base/tracks']);
const addPeopleEnabled = isAddPeopleEnabled(state);
const dialOutEnabled = isDialOutEnabled(state);
@@ -1058,6 +1094,7 @@ function _mapStateToProps(state) {
_isGuest: state['features/base/jwt'].isGuest,
_fullScreen: fullScreen,
_localParticipantID: localParticipant.id,
_localRecState: localRecordingStates,
_overflowMenuVisible: overflowMenuVisible,
_raisedHand: localParticipant.raisedHand,
_screensharing: localVideo && localVideo.videoType === 'desktop',


@@ -0,0 +1,10 @@
/**
* The type of the action which enables or disables the feature for showing
* video thumbnails in a two-axis tile view.
*
* @returns {{
* type: SET_TILE_VIEW,
* enabled: boolean
* }}
*/
export const SET_TILE_VIEW = Symbol('SET_TILE_VIEW');


@@ -0,0 +1,20 @@
// @flow
import { SET_TILE_VIEW } from './actionTypes';
/**
* Creates a (redux) action which signals to set the UI layout to be tiled view
* or not.
*
* @param {boolean} enabled - Whether or not tile view should be shown.
* @returns {{
* type: SET_TILE_VIEW,
* enabled: boolean
* }}
*/
export function setTileView(enabled: boolean) {
return {
type: SET_TILE_VIEW,
enabled
};
}


@@ -0,0 +1,90 @@
// @flow
import { connect } from 'react-redux';
import {
createToolbarEvent,
sendAnalytics
} from '../../analytics';
import { translate } from '../../base/i18n';
import {
AbstractButton,
type AbstractButtonProps
} from '../../base/toolbox';
import { setTileView } from '../actions';
/**
* The type of the React {@code Component} props of {@link TileViewButton}.
*/
type Props = AbstractButtonProps & {
/**
* Whether or not tile view layout has been enabled as the user preference.
*/
_tileViewEnabled: boolean,
/**
* Used to dispatch actions from the buttons.
*/
dispatch: Dispatch<*>
};
/**
* Component that renders a toolbar button for toggling the tile layout view.
*
* @extends AbstractButton
*/
class TileViewButton<P: Props> extends AbstractButton<P, *> {
accessibilityLabel = 'toolbar.accessibilityLabel.tileView';
iconName = 'icon-tiles-many';
toggledIconName = 'icon-tiles-many toggled';
tooltip = 'toolbar.tileViewToggle';
/**
* Handles clicking / pressing the button.
*
* @override
* @protected
* @returns {void}
*/
_handleClick() {
const { _tileViewEnabled, dispatch } = this.props;
sendAnalytics(createToolbarEvent(
'tileview.button',
{
'is_enabled': _tileViewEnabled
}));
dispatch(setTileView(!_tileViewEnabled));
}
/**
* Indicates whether this button is in toggled state or not.
*
* @override
* @protected
* @returns {boolean}
*/
_isToggled() {
return this.props._tileViewEnabled;
}
}
/**
* Maps (parts of) the redux state to the associated props for the
* {@code TileViewButton} component.
*
* @param {Object} state - The Redux state.
* @returns {{
* _tileViewEnabled: boolean
* }}
*/
function _mapStateToProps(state) {
return {
_tileViewEnabled: state['features/video-layout'].tileViewEnabled
};
}
export default translate(connect(_mapStateToProps)(TileViewButton));


@@ -0,0 +1 @@
export { default as TileViewButton } from './TileViewButton';


@@ -0,0 +1,10 @@
/**
* An enumeration of the different display layouts supported by the application.
*
* @type {Object}
*/
export const LAYOUTS = {
HORIZONTAL_FILMSTRIP_VIEW: 'horizontal-filmstrip-view',
TILE_VIEW: 'tile-view',
VERTICAL_FILMSTRIP_VIEW: 'vertical-filmstrip-view'
};


@@ -0,0 +1,78 @@
// @flow
import { LAYOUTS } from './constants';
declare var interfaceConfig: Object;
/**
* Returns the {@code LAYOUTS} constant associated with the layout
* the application should currently be in.
*
* @param {Object} state - The redux state.
* @returns {string}
*/
export function getCurrentLayout(state: Object) {
if (shouldDisplayTileView(state)) {
return LAYOUTS.TILE_VIEW;
} else if (interfaceConfig.VERTICAL_FILMSTRIP) {
return LAYOUTS.VERTICAL_FILMSTRIP_VIEW;
}
return LAYOUTS.HORIZONTAL_FILMSTRIP_VIEW;
}
/**
* Returns how many columns should be displayed in tile view. The number
* returned will be between 1 and 5, inclusive.
*
* @returns {number}
*/
export function getMaxColumnCount() {
const configuredMax = interfaceConfig.TILE_VIEW_MAX_COLUMNS || 5;
// Clamp the configured value into the supported [1, 5] range.
return Math.min(Math.max(configuredMax, 1), 5);
}
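Note the order of the calls matters: the usual clamp idiom is {@code Math.min(Math.max(x, lo), hi)}; reversing it returns the upper bound for every input. A quick standalone sketch (the function name is illustrative):

```javascript
// Clamp a configured column count into the supported [1, 5] range.
function clampColumns(configured) {
    return Math.min(Math.max(configured, 1), 5);
}

// clampColumns(3) → 3, clampColumns(0) → 1, clampColumns(9) → 5
```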
/**
* Returns the cell count dimensions for tile view. Tile view tries to keep
* the number of rows and columns equal until maxColumns is reached, after
* which rows are added but no more columns.
*
* @param {Object} state - The redux state.
* @param {number} maxColumns - The maximum number of columns that can be
* displayed.
* @returns {Object} An object with the desired number of columns, rows, and
* visible rows (the rest should overflow) for the tile view layout.
*/
export function getTileViewGridDimensions(state: Object, maxColumns: number) {
// Purposefully include all participants, which includes fake participants
// that should show a thumbnail.
const potentialThumbnails = state['features/base/participants'].length;
const columnsToMaintainASquare = Math.ceil(Math.sqrt(potentialThumbnails));
const columns = Math.min(columnsToMaintainASquare, maxColumns);
const rows = Math.ceil(potentialThumbnails / columns);
const visibleRows = Math.min(maxColumns, rows);
return {
columns,
rows,
visibleRows
};
}
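For example, with 7 thumbnails and a maximum of 5 columns the math above yields a 3×3 grid, and with 30 thumbnails it caps out at 5 columns with rows overflowing. A standalone sketch of the same arithmetic (illustrative name, not the module's export):

```javascript
// Standalone sketch of the tile view grid math: keep the grid as close
// to square as possible, capped at maxColumns columns wide and tall.
function gridDimensions(thumbnailCount, maxColumns) {
    const columnsToMaintainASquare = Math.ceil(Math.sqrt(thumbnailCount));
    const columns = Math.min(columnsToMaintainASquare, maxColumns);
    const rows = Math.ceil(thumbnailCount / columns);
    const visibleRows = Math.min(maxColumns, rows);

    return { columns, rows, visibleRows };
}

// 7 participants → 3 columns (ceil(sqrt(7)) = 3), 3 rows, 3 visible.
// 30 participants → 5 columns (capped), 6 rows, 5 visible rows.
```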
/**
* Selector for determining if the UI layout should be in tile view. Tile view
* requires more than just the tile view setting being enabled, as, for
* example, one-on-one calls and etherpad editing should not use tile view.
*
* @param {Object} state - The redux state.
* @returns {boolean} True if tile view should be displayed.
*/
export function shouldDisplayTileView(state: Object = {}) {
return Boolean(
state['features/video-layout']
&& state['features/video-layout'].tileViewEnabled
&& !state['features/etherpad'].editing
);
}
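The selector's behaviour can be checked against sample state shapes. A simplified standalone copy (with a small guard added for the missing-etherpad-slice case, so the sketch runs on bare objects):

```javascript
// Simplified copy of the tile view selector, for illustration only.
// Tile view is shown only when the feature flag is set AND etherpad
// editing is not in progress.
function shouldDisplayTileView(state = {}) {
    return Boolean(
        state['features/video-layout']
        && state['features/video-layout'].tileViewEnabled
        && !(state['features/etherpad'] || {}).editing
    );
}

// {} → false; enabled but editing → false; enabled and not editing → true.
```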


@@ -1 +1,9 @@
export * from './actions';
export * from './actionTypes';
export * from './components';
export * from './constants';
export * from './functions';
import './middleware';
import './reducer';
import './subscriber';


@@ -15,6 +15,8 @@ import {
import { MiddlewareRegistry } from '../base/redux';
import { TRACK_ADDED } from '../base/tracks';
import { SET_TILE_VIEW } from './actionTypes';
declare var APP: Object;
/**
@@ -71,6 +73,10 @@ MiddlewareRegistry.register(store => next => action => {
Boolean(action.participant.id));
break;
case SET_TILE_VIEW:
APP.UI.emitEvent(UIEvents.TOGGLED_TILE_VIEW, action.enabled);
break;
case TRACK_ADDED:
if (!action.track.local) {
VideoLayout.onRemoteStreamAdded(action.track.jitsiTrack);


@@ -0,0 +1,17 @@
// @flow
import { ReducerRegistry } from '../base/redux';
import { SET_TILE_VIEW } from './actionTypes';
ReducerRegistry.register('features/video-layout', (state = {}, action) => {
switch (action.type) {
case SET_TILE_VIEW:
return {
...state,
tileViewEnabled: action.enabled
};
}
return state;
});


@@ -0,0 +1,24 @@
// @flow
import {
VIDEO_QUALITY_LEVELS,
setMaxReceiverVideoQuality
} from '../base/conference';
import { StateListenerRegistry } from '../base/redux';
import { selectParticipant } from '../large-video';
import { shouldDisplayTileView } from './functions';
/**
* StateListenerRegistry provides a reliable way of detecting changes to
* preferred layout state and dispatching additional actions.
*/
StateListenerRegistry.register(
/* selector */ state => shouldDisplayTileView(state),
/* listener */ (displayTileView, { dispatch }) => {
dispatch(selectParticipant());
if (!displayTileView) {
dispatch(setMaxReceiverVideoQuality(VIDEO_QUALITY_LEVELS.HIGH));
}
}
);


@@ -230,7 +230,6 @@ class WelcomePage extends AbstractWelcomePage {
const { t } = this.props;
let children;
/* eslint-disable no-extra-parens */
if (this.state.joining) {
// TouchableHighlight is picky about what its children can be, so
@@ -251,7 +250,6 @@ class WelcomePage extends AbstractWelcomePage {
);
}
/* eslint-enable no-extra-parens */
const buttonDisabled = this._isJoinDisabled();


@@ -59,6 +59,7 @@ export default {
TOGGLED_FILMSTRIP: 'UI.toggled_filmstrip',
TOGGLE_SCREENSHARING: 'UI.toggle_screensharing',
TOGGLED_SHARED_DOCUMENT: 'UI.toggled_shared_document',
TOGGLED_TILE_VIEW: 'UI.toggled_tile_view',
HANGUP: 'UI.hangup',
LOGOUT: 'UI.logout',
VIDEO_DEVICE_CHANGED: 'UI.video_device_changed',


@@ -149,7 +149,11 @@ module.exports = [
],
'do_external_connect':
'./connection_optimization/do_external_connect.js'
'./connection_optimization/do_external_connect.js',
'flacEncodeWorker':
'./react/features/local-recording/'
+ 'recording/flac/flacEncodeWorker.js'
}
}),