Compare commits


31 Commits

Author SHA1 Message Date
Saúl Ibarra Corretgé
013212b753 fix(chat) hide private message option in context menu
If disablePrivateChat is configured.
2025-06-27 13:49:15 +02:00
Mihaela Dumitru
d741fcdd1c fix(recordings) create missing local tracks when unmuting after consent (#16119)
* fix(recordings) create missing local tracks when unmuting after consent

* fix(conference) Avoid creating duplicate tracks on unmute

* squash: Ignore TS linter error

---------

Co-authored-by: Jaya Allamsetty <jaya.allamsetty@8x8.com>
2025-06-10 13:36:23 -04:00
damencho
744818c225 fix(permissions): Adds an option to force-send permissions.
If the backend modifies permissions, it can force sending those on the initial presence.
2025-05-23 21:38:03 -05:00
damencho
7d30a665f7 feat(prosody): Check granted identity for recordings. 2025-05-23 12:23:32 -05:00
damencho
d432f1c881 feat(av-moderation): Updates startMuted policy in metadata. 2025-05-22 10:25:30 -05:00
damencho
dfec5b73c0 feat(av-moderation): Disable start muted settings when av moderation is on. 2025-05-22 10:25:19 -05:00
damencho
4898160200 feat(metadata): Pushes metadata early before join. 2025-05-22 10:25:05 -05:00
Jaya Allamsetty
ea47070dd2 fix(conference) Mute user when startMuted policy update is received in conference meta data (#16025) 2025-05-22 10:20:34 -04:00
Saúl Ibarra Corretgé
2cf8ae838c fix(spot) make Spot TV detection more resilient
Setting the UA string in Electron doesn't propagate the change to the
iframe where the meeting is loaded (🤦).

Thus make it more resilient by trying different things:

- A freshly introduced "iAmSpot" config option, similar to Jibri
- The app ID is present in the UA string, so we can test for that
- As a last-ditch effort, check if the display name is the default
  "Meeting Room"
2025-05-16 15:08:42 +02:00
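The fallback chain this commit describes can be sketched roughly as follows. The identifiers (`isSpotTV`, `SPOT_APP_ID`, `DetectionContext`) and the app-ID marker value are illustrative stand-ins, not the actual jitsi-meet code:

```typescript
// Hypothetical sketch of the layered Spot TV detection described above.
const SPOT_APP_ID = 'JitsiSpot'; // assumed marker embedded in the UA string
const DEFAULT_SPOT_DISPLAY_NAME = 'Meeting Room';

interface DetectionContext {
    config: { iAmSpot?: boolean };
    userAgent: string;
    displayName?: string;
}

function isSpotTV({ config, userAgent, displayName }: DetectionContext): boolean {
    // 1. Explicit config option, similar to Jibri's.
    if (config.iAmSpot) {
        return true;
    }

    // 2. The app ID may survive in the UA string even when Electron fails to
    //    propagate a custom UA to the iframe where the meeting is loaded.
    if (userAgent.includes(SPOT_APP_ID)) {
        return true;
    }

    // 3. Last-ditch effort: the default display name.
    return displayName === DEFAULT_SPOT_DISPLAY_NAME;
}
```

Each check is cheaper and more reliable than the next, so the explicit config option wins when present.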
Saúl Ibarra Corretgé
f162a56adb fix(recording) fix matching initiator
LJM will use either a JitsiParticipant object or a string for the
recording session initiator; handle both cases when checking if it's
ourselves.
2025-05-15 21:06:28 -05:00
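A minimal sketch of the normalization this commit describes, assuming a `getId()` accessor on the participant object; the types and names here are stand-ins for the real lib-jitsi-meet ones:

```typescript
// Stand-in for lib-jitsi-meet's JitsiParticipant.
interface JitsiParticipantLike {
    getId(): string;
}

type Initiator = JitsiParticipantLike | string | undefined;

function isInitiatedByMe(initiator: Initiator, myId: string): boolean {
    // LJM may hand us either a participant object or a bare ID string;
    // normalize to an ID before comparing.
    const initiatorId = typeof initiator === 'string' ? initiator : initiator?.getId();

    return initiatorId === myId;
}
```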
Hristo Terezov
865207649a Revert typography values in tokens to px from rem (#16026)
* Revert "feat(base/ui/native): Convert rem to px  (#15934)"

This reverts commit 057dc0e4d2.

* Revert "fix(StageParticipantNameLabel): size"

This reverts commit a01f4468a0.

* Revert "fix(subtitles): position part1"

This reverts commit 6c6ed8d7a8.

* Revert "fix(ITypographyType): wrong type of fontSize and lineHeight props"

This reverts commit bffcc9092b.

* revert(Tokens): font sizes and line heights back to px from rem

Turns out there are many places that do not expect rem. Temporarily reverting this change from commit 6fa94b0bb4. We should bring it back along with proper handling of rem everywhere.
2025-05-14 10:04:20 -05:00
damencho
3e46011352 fix: Fixes ljm branch. 2025-05-12 16:22:19 -05:00
Hristo Terezov
243acb4a0f fix(ITypographyType): wrong type of fontSize and lineHeight props
In a previous commit about accessibility we changed the font size and line height to use rem (expressed as a string) instead of numbers for px, but the types for the interface were not updated.
2025-05-07 20:31:32 -05:00
Hristo Terezov
def8062141 fix(StageParticipantNameLabel): size
Fixes an issue where StageParticipantNameLabel is smaller. This happens because the font size and line height props are calculated to an invalid (NaN) value after we started using rem instead of px for lineHeight and fontSize in the theme.
Reference: #15917
2025-05-07 19:37:36 -05:00
Hristo Terezov
15ec3a25cb fix(subtitles): position part1
Fixes an issue where subtitles are displayed in the middle of the screen. This happens because the bottom prop is calculated to an invalid (NaN) value after we started using rem instead of px for lineHeight in the theme.
Reference: https://github.com/jitsi/jitsi-meet/pull/15917
2025-05-07 19:36:07 -05:00
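The root cause in both of these fixes is the same: arithmetic on a theme token that became a string like `'1.25rem'` silently yields NaN. A small illustration; the helper name and the 16px root font size are assumptions, not the jitsi-meet code:

```typescript
// Arithmetic that assumed a numeric px value produces NaN once the token
// becomes a rem string: ('1.25rem' as any) * 2 === NaN.
function bottomOffset(lineHeight: number | string, lines: number): number {
    if (typeof lineHeight === 'string') {
        // Defensive handling a consumer would need: parse the rem value and
        // convert with an assumed 16px root font size.
        return parseFloat(lineHeight) * 16 * lines;
    }

    return lineHeight * lines;
}
```

Without the string branch, the NaN propagates into CSS (`bottom: NaNpx`), which browsers discard, leaving the element at its default position.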
Saúl Ibarra Corretgé
603d239884 feat(recording) add ability to skip consent in-meeting
When turned on, the consent dialog won't be displayed for the users who
are already in the meeting; it will only be displayed to those who join
after the recording has started.
2025-05-07 15:17:25 +03:00
Saúl Ibarra Corretgé
67fcfeff43 fix(recording) prevent multiple consent requests
A given recording should only trigger a single consent request.

The mechanism to notify about recording status updates may fire multiple
times since it's tied to XMPP presence and may send updates such as when
the live stream view URL is set.

Rather than trying to handle all possible corner cases to make sure we
only show the consent dialog once, keep track of the recording session
IDs for which we _have_ asked for consent and skip the dialog in case we
have done it already.
2025-05-07 15:16:07 +03:00
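The session-ID bookkeeping this commit describes could look roughly like this; the names are illustrative, not the actual implementation:

```typescript
// Recording session IDs for which we have already asked for consent.
const consentRequestedSessions = new Set<string>();

function shouldShowConsentDialog(sessionId: string): boolean {
    if (consentRequestedSessions.has(sessionId)) {
        // Presence-driven status updates may fire several times per session
        // (e.g. when the live stream view URL is set); ask only once.
        return false;
    }

    consentRequestedSessions.add(sessionId);

    return true;
}
```

Keying on the session ID sidesteps having to enumerate every event that can re-fire the recording status update.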
Дамян Минков
905cfce884 feat(tests): Use more predictable room names. (#15998)
* feat(tests): Use more predictable room names.

* squash: Make sure room name is in lowercase.
2025-05-06 12:09:37 -05:00
Andrei Gavrilescu
734d7e3bd0 fix(popover): touch interaction closes overflow drawer without triggering action
* automatic drawer toolbox on mobile browser

* fix touch interaction on Popover
2025-05-06 16:23:27 +03:00
Saúl Ibarra Corretgé
92c22be002 feat(lang,settings) remove experimental label from multi-pinning 2025-05-06 15:21:51 +02:00
Saúl Ibarra Corretgé
8a300be738 feat(recording) refactor consent dialog (#15985)
* feat(recording) refactor consent dialog

Offer 2 choices and add a configurable "learn more" link.

* hide dialog and display link conditionally

* native changes

---------

Co-authored-by: Mihaela Dumitru <mihdmt@gmail.com>
2025-05-06 15:43:09 +03:00
damencho
3859b8a8c2 feat(tests): Validate-shard tests improvements.
feat(tests): Prefer to generate token for dial in.

feat(tests): Adds invite test. (#15986)

* feat(tests): Adds invite test.

Tests dial-in, dial-out and inviting sip-jibri.

* squash: Extract duplicate code in a function.

* squash: Fixes comments.

feat(tests): Handle and final transcriptions.

feat(tests): Adds debug log for webhooks.
2025-05-05 08:36:54 -05:00
Hristo Terezov
e93990a1f2 chore(package.json): Use LJM from release branch 2025-04-30 10:08:32 -05:00
damencho
9fb4618ffe fix(prosody): Adds a nil check for ends_with utility. 2025-04-28 15:44:36 -05:00
damencho
f06359b9d1 fix(prosody): Fixes filter rayo message when int id is used.
Make sure we add string values to the stanza.
2025-04-28 14:18:12 -05:00
Hristo Terezov
0e0e18ad52 feat(toolbar): Enable 9th and 10th button 2025-04-22 15:51:07 -05:00
Saúl Ibarra Corretgé
0c0bb4991e fix(recording) skip consent dialog on Spot TV 2025-04-17 21:35:13 +02:00
Saúl Ibarra Corretgé
b2578c140e fix(polls) halt processing of malformed polls
We need to return something other than nil in order to halt the
processing of the event.

https://prosody.im/doc/developers/moduleapi#modulehook_event_name_handler_priority
2025-04-17 21:35:03 +02:00
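The Prosody convention this commit relies on is that event handlers run in priority order until one returns a non-nil value. A generic TypeScript sketch of that idea (not Prosody's actual Lua API):

```typescript
type Handler<E> = (event: E) => unknown;

// Run handlers in order; a non-nil (here: non-null/undefined) return value
// halts further processing, mirroring Prosody's module:hook semantics.
function fireEvent<E>(handlers: Handler<E>[], event: E): unknown {
    for (const handler of handlers) {
        const result = handler(event);

        if (result !== undefined && result !== null) {
            return result;
        }
    }

    return undefined;
}
```

This is why the malformed-polls handler must return something other than nil: returning nil would let later handlers process the bad stanza anyway.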
damencho
33d3e971ca fix(prosody): Fixes extracting domain from rooms without a domain. 2025-04-11 11:21:30 -05:00
Дамян Минков
5092407555 feat(tests): Simplifies display names and participant create.
* feat(tests): Simplifies display names and participant create.

Moves token creation only when needed.

* squash: Skip webhook check of user id for guest participants.

* squash: Waits for kick reason dialog.

* squash: Simplifies by matching participant name and display name.

* squash: Drop displayname field.
2025-04-11 11:20:55 -05:00
Hristo Terezov
f4bf25ba6c fix(DesktopPicker): Stops displaying if closed too fast.
If the desktop picker window is closed before we load the sources, a JS error is thrown. From there the app goes into a broken state where nothing happens when the screen sharing button is pressed. Explanation:
When the error from the _onCloseModal handler is thrown, we never reach the line that calls the onSourceChoose callback. The result is that we never call the callback received by setDisplayMediaRequestHandler. It seems that when this happens, Electron won't call the setDisplayMediaRequestHandler handler on subsequent gDM calls and therefore we don't display the desktop picker.
2025-04-11 10:13:02 -05:00
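The guarantee this fix needs (the callback must run even when loading the sources fails) can be sketched with a try/finally. The names `pickSourceSafe` and `loadSources` are hypothetical, not the actual DesktopPicker code:

```typescript
// Always resolve the request, even when the picker window was closed early,
// so subsequent getDisplayMedia calls still reach the Electron handler.
function pickSourceSafe(
    loadSources: () => string[],
    onSourceChoose: (source: string | null) => void
): void {
    let chosen: string | null = null;

    try {
        const sources = loadSources();

        chosen = sources[0] ?? null;
    } catch {
        // Window closed before sources loaded; fall through with null.
    } finally {
        onSourceChoose(chosen);
    }
}
```

Calling the callback with `null` signals "nothing chosen" instead of silently dropping the request.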
103 changed files with 1656 additions and 2427 deletions


@@ -89,7 +89,7 @@ import {
setVideoMuted,
setVideoUnmutePermissions
} from './react/features/base/media/actions';
import { MEDIA_TYPE, VIDEO_TYPE } from './react/features/base/media/constants';
import { MEDIA_TYPE, VIDEO_MUTISM_AUTHORITY, VIDEO_TYPE } from './react/features/base/media/constants';
import {
getStartWithAudioMuted,
getStartWithVideoMuted,
@@ -131,7 +131,6 @@ import {
createLocalTracksF,
getLocalJitsiAudioTrack,
getLocalJitsiVideoTrack,
getLocalTracks,
getLocalVideoTrack,
isLocalTrackMuted,
isUserInteractionRequiredForUnmute
@@ -206,23 +205,6 @@ function sendData(command, value) {
room.sendCommand(command, { value });
}
/**
* Mute or unmute local audio stream if it exists.
* @param {boolean} muted - if audio stream should be muted or unmuted.
*/
function muteLocalAudio(muted) {
APP.store.dispatch(setAudioMuted(muted));
}
/**
* Mute or unmute local video stream if it exists.
* @param {boolean} muted if video stream should be muted or unmuted.
*
*/
function muteLocalVideo(muted) {
APP.store.dispatch(setVideoMuted(muted));
}
/**
* A queue for the async replaceLocalTrack action so that multiple audio
* replacements cannot happen simultaneously. This solves the issue where
@@ -709,11 +691,10 @@ export default {
* Simulates toolbar button click for audio mute. Used by shortcuts and API.
*
* @param {boolean} mute true for mute and false for unmute.
* @param {boolean} [showUI] when set to false will not display any error
* dialogs in case of media permissions error.
* @returns {Promise}
*/
async muteAudio(mute, showUI = true) {
async muteAudio(mute) {
const state = APP.store.getState();
if (!mute
@@ -732,47 +713,7 @@ export default {
return;
}
// Not ready to modify track's state yet
if (!this._localTracksInitialized) {
// This will only modify base/media.audio.muted which is then synced
// up with the track at the end of local tracks initialization.
muteLocalAudio(mute);
this.updateAudioIconEnabled();
return;
} else if (this.isLocalAudioMuted() === mute) {
// NO-OP
return;
}
const localAudio = getLocalJitsiAudioTrack(APP.store.getState());
if (!localAudio && !mute) {
const maybeShowErrorDialog = error => {
showUI && APP.store.dispatch(notifyMicError(error));
};
APP.store.dispatch(gumPending([ MEDIA_TYPE.AUDIO ], IGUMPendingState.PENDING_UNMUTE));
await createLocalTracksF({ devices: [ 'audio' ] })
.then(([ audioTrack ]) => audioTrack)
.catch(error => {
maybeShowErrorDialog(error);
// Rollback the audio muted status by using null track
return null;
})
.then(async audioTrack => {
await this._maybeApplyAudioMixerEffect(audioTrack);
return this.useAudioStream(audioTrack);
})
.finally(() => {
APP.store.dispatch(gumPending([ MEDIA_TYPE.AUDIO ], IGUMPendingState.NONE));
});
} else {
muteLocalAudio(mute);
}
await APP.store.dispatch(setAudioMuted(mute, true));
},
/**
@@ -802,10 +743,9 @@ export default {
/**
* Simulates toolbar button click for video mute. Used by shortcuts and API.
* @param mute true for mute and false for unmute.
* @param {boolean} [showUI] when set to false will not display any error
* dialogs in case of media permissions error.
*/
muteVideo(mute, showUI = true) {
muteVideo(mute) {
if (this.videoSwitchInProgress) {
logger.warn('muteVideo - unable to perform operations while video switch is in progress');
@@ -826,60 +766,7 @@ export default {
return;
}
// If not ready to modify track's state yet adjust the base/media
if (!this._localTracksInitialized) {
// This will only modify base/media.video.muted which is then synced
// up with the track at the end of local tracks initialization.
muteLocalVideo(mute);
this.setVideoMuteStatus();
return;
} else if (this.isLocalVideoMuted() === mute) {
// NO-OP
return;
}
const localVideo = getLocalJitsiVideoTrack(state);
if (!localVideo && !mute && !this.isCreatingLocalTrack) {
const maybeShowErrorDialog = error => {
showUI && APP.store.dispatch(notifyCameraError(error));
};
this.isCreatingLocalTrack = true;
APP.store.dispatch(gumPending([ MEDIA_TYPE.VIDEO ], IGUMPendingState.PENDING_UNMUTE));
// Try to create local video if there wasn't any.
// This handles the case when user joined with no video
// (dismissed screen sharing screen or in audio only mode), but
// decided to add it later on by clicking on muted video icon or
// turning off the audio only mode.
//
// FIXME when local track creation is moved to react/redux
// it should take care of the use case described above
createLocalTracksF({ devices: [ 'video' ] })
.then(([ videoTrack ]) => videoTrack)
.catch(error => {
// FIXME should send some feedback to the API on error ?
maybeShowErrorDialog(error);
// Rollback the video muted status by using null track
return null;
})
.then(videoTrack => {
logger.debug(`muteVideo: calling useVideoStream for track: ${videoTrack}`);
return this.useVideoStream(videoTrack);
})
.finally(() => {
this.isCreatingLocalTrack = false;
APP.store.dispatch(gumPending([ MEDIA_TYPE.VIDEO ], IGUMPendingState.NONE));
});
} else {
// FIXME show error dialog if it fails (should be handled by react)
muteLocalVideo(mute);
}
APP.store.dispatch(setVideoMuted(mute, VIDEO_MUTISM_AUTHORITY.USER, true));
},
/**
@@ -1829,35 +1716,6 @@ export default {
onStartMutedPolicyChanged(audio, video));
}
);
room.on(JitsiConferenceEvents.STARTED_MUTED, () => {
const audioMuted = room.isStartAudioMuted();
const videoMuted = room.isStartVideoMuted();
const localTracks = getLocalTracks(APP.store.getState()['features/base/tracks']);
const promises = [];
APP.store.dispatch(setAudioMuted(audioMuted));
APP.store.dispatch(setVideoMuted(videoMuted));
// Remove the tracks from the peerconnection.
for (const track of localTracks) {
// Always add the track on Safari because of a known issue where audio playout doesn't happen
// if the user joins audio and video muted, i.e., if there is no local media capture.
if (audioMuted && track.jitsiTrack?.getType() === MEDIA_TYPE.AUDIO && !browser.isWebKitBased()) {
promises.push(this.useAudioStream(null));
}
if (videoMuted && track.jitsiTrack?.getType() === MEDIA_TYPE.VIDEO) {
promises.push(this.useVideoStream(null));
}
}
Promise.allSettled(promises)
.then(() => {
APP.store.dispatch(showNotification({
titleKey: 'notify.mutedTitle',
descriptionKey: 'notify.muted'
}, NOTIFICATION_TIMEOUT_TYPE.SHORT));
});
});
room.on(
JitsiConferenceEvents.DATA_CHANNEL_OPENED, () => {


@@ -89,9 +89,6 @@ var config = {
// Enables use of getDisplayMedia in electron
// electronUseGetDisplayMedia: false,
// Enables AV1 codec for FF. Note: By default it is disabled.
// enableAV1ForFF: false,
// Enables the use of the codec selection API supported by the browsers .
// enableCodecSelectionAPI: false,
@@ -401,6 +398,10 @@ var config = {
// // If true, mutes audio and video when a recording begins and displays a dialog
// // explaining the effect of unmuting.
// // requireConsent: true,
// // If true consent will be skipped for users who are already in the meeting.
// // skipConsentInMeeting: true,
// // Link for the recording consent dialog's "Learn more" link.
// // consentLearnMoreLink: 'https://jitsi.org/meet/consent',
// },
// recordingService: {


@@ -15,7 +15,6 @@ external_services = {
cross_domain_bosh = false;
consider_bosh_secure = true;
consider_websocket_secure = true;
-- https_ports = { }; -- Remove this line to prevent listening on port 5284
-- by default prosody 0.12 sends cors headers, if you want to disable it uncomment the following (the config is available on 0.12.1)


@@ -263,7 +263,6 @@
"Remove": "Entfernen",
"Share": "Teilen",
"Submit": "OK",
"Understand": "Verstanden",
"WaitForHostMsg": "Die Konferenz wurde noch nicht gestartet. Falls Sie die Konferenz leiten, authentifizieren Sie sich bitte. Warten Sie andernfalls, bis die Konferenz gestartet wird.",
"WaitForHostNoAuthMsg": "Die Konferenz wurde noch nicht gestartet. Bitte warten Sie, bis die Konferenz gestartet wird.",
"WaitingForHostButton": "Auf Moderation warten",
@@ -394,8 +393,6 @@
"recentlyUsedObjects": "Ihre zuletzt verwendeten Objekte",
"recording": "Aufnahme",
"recordingDisabledBecauseOfActiveLiveStreamingTooltip": "Während eines Livestreams nicht möglich",
"recordingInProgressDescription": "Diese Konferenz wird aufgezeichnet. Ihr Ton und Video ist deaktiviert, wenn Sie es aktivieren, stimmen Sie der Aufzeichnung zu.",
"recordingInProgressTitle": "Aufnahme läuft",
"rejoinNow": "Jetzt erneut beitreten",
"remoteControlAllowedMessage": "{{user}} hat die Anfrage zur Fernsteuerung angenommen!",
"remoteControlDeniedMessage": "{{user}} hat die Anfrage zur Fernsteuerung verweigert!",
@@ -752,8 +749,7 @@
"dataChannelClosedDescriptionWithAudio": "Die Steuerungsverbindung (Bridge Channel) wurde unterbrochen, daher können Video- und Tonprobleme auftreten.",
"dataChannelClosedWithAudio": "Ton- und Videoqualität können beeinträchtigt sein",
"disabledIframe": "Die Einbettung ist nur für Demo-Zwecke vorgesehen. Diese Konferenz wird in {{timeout}} Minuten beendet.",
"disabledIframeSecondaryNative": "Die Einbettung von {{domain}} ist nur für Demo-Zwecke vorgesehen. Diese Konferenz wird in {{timeout}} Minuten beendet.",
"disabledIframeSecondaryWeb": "Die Einbettung von {{domain}} ist nur für Demo-Zwecke vorgesehen. Diese Konferenz wird in {{timeout}} Minuten beendet. Bitte nutzen Sie <a href='{{jaasDomain}}' rel='noopener noreferrer' target='_blank'>Jitsi as a Service</a> für produktive Zwecke!",
"disabledIframeSecondary": "Die Einbettung von {{domain}} ist nur für Demo-Zwecke vorgesehen. Diese Konferenz wird in {{timeout}} Minuten beendet. Bitte nutzen Sie <a href='{{jaasDomain}}' rel='noopener noreferrer' target='_blank'>Jitsi as a Service</a> für produktive Zwecke!",
"disconnected": "getrennt",
"displayNotifications": "Benachrichtigungen anzeigen für",
"dontRemindMe": "Nicht erinnern",
@@ -881,7 +877,6 @@
"waitingLobby": "In der Lobby ({{count}})"
},
"search": "Suche Anwesende",
"searchDescription": "Tippen Sie um die Anwesendenliste zu filtern",
"title": "Anwesende"
},
"passwordDigitsOnly": "Bis zu {{number}} Ziffern",
@@ -1109,7 +1104,6 @@
"signedIn": "Momentan wird auf Kalendertermine von {{email}} zugegriffen. Klicken Sie auf die folgende Schaltfläche „Trennen“, um den Zugriff auf die Kalendertermine zu stoppen.",
"title": "Kalender"
},
"chatWithPermissions": "Chat mit Freigaben",
"desktopShareFramerate": "Framerate für Bildschirmfreigabe",
"desktopShareHighFpsWarning": "Eine höhere Framerate könnte sich auf Ihre Datenrate auswirken. Sie müssen die Bildschirmfreigabe neustarten, damit die Einstellung übernommen wird.",
"desktopShareWarning": "Sie müssen die Bildschirmfreigabe neustarten, damit die Einstellung übernommen wird.",
@@ -1198,7 +1192,6 @@
"neutral": "Neutral",
"sad": "Traurig",
"search": "Suche",
"searchDescription": "Tippen Sie um die Anwesendenliste zu filtern",
"searchHint": "Suche Anwesende",
"seconds": "{{count}} Sek.",
"speakerStats": "Sprechstatistik",
@@ -1277,7 +1270,7 @@
"muteGUMPending": "Verbinde Ihr Mikrofon",
"noiseSuppression": "Rauschunterdrückung",
"openChat": "Chat öffnen",
"participants": "Anwesenheitsliste öffnen. {{participantsCount}} anwesend",
"participants": "Anwesende",
"pip": "Bild-in-Bild-Modus ein-/ausschalten",
"privateMessage": "Private Nachricht senden",
"profile": "Profil bearbeiten",
@@ -1415,8 +1408,7 @@
"ccButtonTooltip": "Untertitel ein-/ausschalten",
"expandedLabel": "Transkribieren ist derzeit eingeschaltet",
"failed": "Transkribieren fehlgeschlagen",
"labelTooltip": "Die Konferenz wird transkribiert",
"labelTooltipExtra": "Zusätzlich wird das Transkript später verfügbar sein.",
"labelToolTip": "Die Konferenz wird transkribiert",
"sourceLanguageDesc": "Aktuell ist die Sprache der Konferenz auf <b>{{sourceLanguage}}</b> eingestellt. <br/> Sie könne dies hier ",
"sourceLanguageHere": "ändern",
"start": "Anzeige der Untertitel starten",


@@ -122,9 +122,7 @@
"nickname": {
"popover": "Choose a nickname",
"title": "Enter a nickname to use chat",
"titleWithCC": "Enter a nickname to use chat and closed captions",
"titleWithPolls": "Enter a nickname to use chat and polls",
"titleWithPollsAndCC": "Enter a nickname to use chat, polls and closed captions"
"titleWithPolls": "Enter a nickname to use chat and polls"
},
"noMessagesMessage": "There are no messages in the meeting yet. Start a conversation here!",
"privateNotice": "Private message to {{recipient}}",
@@ -133,13 +131,10 @@
"systemDisplayName": "System",
"tabs": {
"chat": "Chat",
"closedCaptions": "CC",
"polls": "Polls"
},
"title": "Chat",
"titleWithCC": "Chat and CC",
"titleWithPolls": "Chat and Polls",
"titleWithPollsAndCC": "Chat, Polls and CC",
"you": "you"
},
"chromeExtensionBanner": {
@@ -149,10 +144,6 @@
"dontShowAgain": "Dont show me this again",
"installExtensionText": "Install the extension for Google Calendar and Office 365 integration"
},
"closedCaptionsTab": {
"emptyState": "The closed captions content will be available once a moderator starts it",
"startClosedCaptionsButton": "Start closed captions"
},
"connectingOverlay": {
"joiningRoom": "Connecting you to your meeting…"
},
@@ -272,7 +263,8 @@
"Remove": "Remove",
"Share": "Share",
"Submit": "Submit",
"Understand": "I understand",
"Understand": "I understand, keep me muted for now",
"UnderstandAndUnmute": "I understand, please unmute me",
"WaitForHostMsg": "The conference has not yet started because no moderators have yet arrived. If you'd like to become a moderator please log-in. Otherwise, please wait.",
"WaitForHostNoAuthMsg": "The conference has not yet started because no moderators have yet arrived. Please wait.",
"WaitingForHostButton": "Wait for moderator",
@@ -309,6 +301,7 @@
"conferenceReloadMsg": "We're trying to fix this. Reconnecting in {{seconds}} sec…",
"conferenceReloadTitle": "Unfortunately, something went wrong.",
"confirm": "Confirm",
"confirmBack": "Back",
"confirmNo": "No",
"confirmYes": "Yes",
"connectError": "Oops! Something went wrong and we couldn't connect to the conference.",
@@ -346,6 +339,7 @@
"kickParticipantTitle": "Kick this participant?",
"kickSystemTitle": "Ouch! You were kicked out of the meeting",
"kickTitle": "Ouch! {{participantDisplayName}} kicked you out of the meeting",
"learnMore": "learn more",
"linkMeeting": "Link meeting",
"linkMeetingTitle": "Link meeting to Salesforce",
"liveStreaming": "Live Streaming",
@@ -403,7 +397,9 @@
"recentlyUsedObjects": "Your recently used objects",
"recording": "Recording",
"recordingDisabledBecauseOfActiveLiveStreamingTooltip": "Not possible while a live stream is active",
"recordingInProgressDescription": "This meeting is being recorded. Your audio and video have been muted. If you choose to unmute, you consent to being recorded.",
"recordingInProgressDescription": "This meeting is being recorded and analyzed by AI{{learnMore}}. Your audio and video have been muted. If you choose to unmute, you consent to being recorded.",
"recordingInProgressDescriptionFirstHalf": "This meeting is being recorded and analyzed by AI",
"recordingInProgressDescriptionSecondHalf": ". Your audio and video have been muted. If you choose to unmute, you consent to being recorded.",
"recordingInProgressTitle": "Recording in progress",
"rejoinNow": "Rejoin now",
"remoteControlAllowedMessage": "{{user}} accepted your remote control request!",
@@ -890,7 +886,6 @@
"waitingLobby": "Waiting in lobby ({{count}})"
},
"search": "Search participants",
"searchDescription": "Start typing to filter participants",
"title": "Participants"
},
"passwordDigitsOnly": "Up to {{number}} digits",
@@ -1148,7 +1143,6 @@
"selectMic": "Microphone",
"selfView": "Self view",
"shortcuts": "Shortcuts",
"showSubtitlesOnStage": "Show subtitles on stage",
"speakers": "Speakers",
"startAudioMuted": "Everyone starts muted",
"startReactionsMuted": "Mute reaction sounds for everyone",
@@ -1208,7 +1202,6 @@
"neutral": "Neutral",
"sad": "Sad",
"search": "Search",
"searchDescription": "Start typing to filter participants",
"searchHint": "Search participants",
"seconds": "{{count}}s",
"speakerStats": "Participants Stats",
@@ -1245,7 +1238,6 @@
"closeChat": "Close chat",
"closeMoreActions": "Close more actions menu",
"closeParticipantsPane": "Close participants pane",
"closedCaptions": "Closed captions",
"collapse": "Collapse",
"document": "Toggle shared document",
"documentClose": "Close shared document",
@@ -1336,7 +1328,6 @@
"closeChat": "Close chat",
"closeParticipantsPane": "Close participants pane",
"closeReactionsMenu": "Close reactions menu",
"closedCaptions": "Closed captions",
"disableNoiseSuppression": "Disable extra noise suppression (BETA)",
"disableReactionSounds": "You can disable reaction sounds for this meeting",
"documentClose": "Close shared document",
@@ -1429,16 +1420,13 @@
"failed": "Transcribing failed",
"labelTooltip": "This meeting is being transcribed.",
"labelTooltipExtra": "In addition, a transcript will be available later.",
"openClosedCaptions": "Open closed captions",
"original": "Original",
"sourceLanguageDesc": "Currently the meeting language is set to <b>{{sourceLanguage}}</b>. <br/> You can change it from ",
"sourceLanguageHere": "here",
"start": "Start showing subtitles",
"stop": "Stop showing subtitles",
"subtitles": "Subtitles",
"subtitlesOff": "Off",
"tr": "TR",
"translateTo": "Translate to"
"tr": "TR"
},
"unpinParticipant": "{{participantName}} - Unpin",
"userMedia": {

992 package-lock.json (generated): file diff suppressed because it is too large


@@ -27,6 +27,7 @@
"@jitsi/js-utils": "2.2.1",
"@jitsi/logger": "2.0.2",
"@jitsi/rnnoise-wasm": "0.2.1",
"@jitsi/rtcstats": "9.5.1",
"@matrix-org/olm": "https://gitlab.matrix.org/api/v4/projects/27/packages/npm/@matrix-org/olm/-/@matrix-org/olm-3.2.3.tgz",
"@microsoft/microsoft-graph-client": "3.0.1",
"@mui/material": "5.12.1",
@@ -67,7 +68,7 @@
"js-md5": "0.6.1",
"js-sha512": "0.8.0",
"jwt-decode": "2.2.0",
"lib-jitsi-meet": "https://github.com/jitsi/lib-jitsi-meet/releases/download/v1980.0.0+34a32e86/lib-jitsi-meet.tgz",
"lib-jitsi-meet": "https://github.com/jitsi/lib-jitsi-meet#release-8542",
"lodash-es": "4.17.21",
"moment": "2.29.4",
"moment-duration-format": "2.2.2",
@@ -122,6 +123,7 @@
"util": "0.12.1",
"uuid": "8.3.2",
"wasm-check": "2.0.1",
"webm-duration-fix": "1.0.4",
"windows-iana": "3.1.0",
"zxcvbn": "4.4.2"
},


@@ -72,15 +72,11 @@ export function getInitials(s?: string) {
/**
* Checks if the passed URL should be loaded with CORS.
*
* @param {string | Function} url - The URL (on mobile we use a specific Icon component for avatars).
* @param {string} url - The URL.
* @param {Array<string>} corsURLs - The URL pattern that matches a URL that needs to be handled with CORS.
* @returns {boolean}
* @returns {void}
*/
export function isCORSAvatarURL(url: string | Function, corsURLs: Array<string> = []): boolean {
if (typeof url === 'function') {
return false;
}
export function isCORSAvatarURL(url: string, corsURLs: Array<string> = []): boolean {
return corsURLs.some(pattern => url.startsWith(pattern));
}


@@ -1,5 +1,3 @@
import { createStartMutedConfigurationEvent } from '../../analytics/AnalyticsEvents';
import { sendAnalytics } from '../../analytics/functions';
import { IReduxState, IStore } from '../../app/types';
import { transcriberJoined, transcriberLeft } from '../../transcribing/actions';
import { setIAmVisitor } from '../../visitors/actions';
@@ -11,9 +9,7 @@ import { JITSI_CONNECTION_CONFERENCE_KEY } from '../connection/constants';
import { hasAvailableDevices } from '../devices/functions.any';
import JitsiMeetJS, { JitsiConferenceEvents, JitsiE2ePingEvents } from '../lib-jitsi-meet';
import {
setAudioMuted,
setAudioUnmutePermissions,
setVideoMuted,
setVideoUnmutePermissions
} from '../media/actions';
import { MEDIA_TYPE, MediaType } from '../media/constants';
@@ -31,7 +27,6 @@ import { IJitsiParticipant } from '../participants/types';
import { toState } from '../redux/functions';
import {
destroyLocalTracks,
replaceLocalTrack,
trackAdded,
trackRemoved
} from '../tracks/actions.any';
@@ -163,39 +158,6 @@ function _addConferenceListeners(conference: IJitsiConference, dispatch: IStore[
// Dispatches into features/base/media follow:
conference.on(
JitsiConferenceEvents.STARTED_MUTED,
() => {
const audioMuted = Boolean(conference.isStartAudioMuted());
const videoMuted = Boolean(conference.isStartVideoMuted());
const localTracks = getLocalTracks(state['features/base/tracks']);
sendAnalytics(createStartMutedConfigurationEvent('remote', audioMuted, videoMuted));
logger.log(`Start muted: ${audioMuted ? 'audio, ' : ''}${videoMuted ? 'video' : ''}`);
// XXX Jicofo tells lib-jitsi-meet to start with audio and/or video
// muted i.e. Jicofo expresses an intent. Lib-jitsi-meet has turned
// Jicofo's intent into reality by actually muting the respective
// tracks. The reality is expressed in base/tracks already so what
// is left is to express Jicofo's intent in base/media.
// TODO Maybe the app needs to learn about Jicofo's intent and
// transfer that intent to lib-jitsi-meet instead of lib-jitsi-meet
// acting on Jicofo's intent without the app's knowledge.
dispatch(setAudioMuted(audioMuted));
dispatch(setVideoMuted(videoMuted));
// Remove the tracks from peerconnection as well.
for (const track of localTracks) {
const trackType = track.jitsiTrack.getType();
// Do not remove the audio track on RN. Starting with iOS 15 it will fail to unmute otherwise.
if ((audioMuted && trackType === MEDIA_TYPE.AUDIO && navigator.product !== 'ReactNative')
|| (videoMuted && trackType === MEDIA_TYPE.VIDEO)) {
dispatch(replaceLocalTrack(track.jitsiTrack, null, conference));
}
}
});
conference.on(
JitsiConferenceEvents.AUDIO_UNMUTE_PERMISSIONS_CHANGED,
(disableAudioMuteChange: boolean) => {
@@ -808,10 +770,8 @@ export function nonParticipantMessageReceived(id: string, json: Object) {
/**
* Updates the known state of start muted policies.
*
* @param {boolean} audioMuted - Whether or not members will join the conference
* as audio muted.
* @param {boolean} videoMuted - Whether or not members will join the conference
* as video muted.
* @param {boolean} audioMuted - Whether or not members will join the conference as audio muted.
* @param {boolean} videoMuted - Whether or not members will join the conference as video muted.
* @returns {{
* type: SET_START_MUTED_POLICY,
* startAudioMutedPolicy: boolean,
@@ -1022,10 +982,8 @@ export function setRoom(room?: string) {
/**
* Sets whether or not members should join audio and/or video muted.
*
* @param {boolean} startAudioMuted - Whether or not members will join the
* conference as audio muted.
* @param {boolean} startVideoMuted - Whether or not members will join the
* conference as video muted.
* @param {boolean} startAudioMuted - Whether or not members will join the conference as audio muted.
* @param {boolean} startVideoMuted - Whether or not members will join the conference as video muted.
* @returns {Function}
*/
export function setStartMutedPolicy(
@@ -1037,9 +995,6 @@ export function setStartMutedPolicy(
audio: startAudioMuted,
video: startVideoMuted
});
dispatch(
onStartMutedPolicyChanged(startAudioMuted, startVideoMuted));
};
}


@@ -22,12 +22,14 @@ import { INotificationProps } from '../../notifications/types';
import { hasDisplayName } from '../../prejoin/utils';
import { stopLocalVideoRecording } from '../../recording/actions.any';
import LocalRecordingManager from '../../recording/components/Recording/LocalRecordingManager';
import { AudioMixerEffect } from '../../stream-effects/audio-mixer/AudioMixerEffect';
import { iAmVisitor } from '../../visitors/functions';
import { overwriteConfig } from '../config/actions';
import { CONNECTION_ESTABLISHED, CONNECTION_FAILED } from '../connection/actionTypes';
import { connectionDisconnected, disconnect } from '../connection/actions';
import { validateJwt } from '../jwt/functions';
import { JitsiConferenceErrors, JitsiConferenceEvents, JitsiConnectionErrors } from '../lib-jitsi-meet';
import { MEDIA_TYPE } from '../media/constants';
import { PARTICIPANT_UPDATED, PIN_PARTICIPANT } from '../participants/actionTypes';
import { PARTICIPANT_ROLE } from '../participants/constants';
import {
@@ -70,6 +72,7 @@ import {
} from './functions';
import logger from './logger';
import { IConferenceMetadata } from './reducer';
import './subscriber';
/**
* Handler for before unload event.
@@ -653,7 +656,7 @@ function _setRoom({ dispatch, getState }: IStore, next: Function, action: AnyAct
* @private
* @returns {Object} The value returned by {@code next(action)}.
*/
function _trackAddedOrRemoved(store: IStore, next: Function, action: AnyAction) {
async function _trackAddedOrRemoved(store: IStore, next: Function, action: AnyAction) {
const track = action.track;
// TODO All track swapping should happen here instead of conference.js.
@@ -661,7 +664,6 @@ function _trackAddedOrRemoved(store: IStore, next: Function, action: AnyAction)
const { getState } = store;
const state = getState();
const conference = getCurrentConference(state);
let promise;
if (conference) {
const jitsiTrack = action.track.jitsiTrack;
@@ -670,14 +672,22 @@ function _trackAddedOrRemoved(store: IStore, next: Function, action: AnyAction)
// If gUM is slow and tracks are created after the user has already joined the conference, avoid
// adding the tracks to the conference if the user is a visitor.
if (!iAmVisitor(state)) {
promise = _addLocalTracksToConference(conference, [ jitsiTrack ]);
const { desktopAudioTrack } = state['features/screen-share'];
// If the user is sharing their screen and has a desktop audio track, we need to replace that with
// the audio mixer effect so that the desktop audio is mixed in with the microphone audio.
if (typeof APP !== 'undefined' && desktopAudioTrack && track.mediaType === MEDIA_TYPE.AUDIO) {
await conference.replaceTrack(desktopAudioTrack, null);
const audioMixerEffect = new AudioMixerEffect(desktopAudioTrack);
await jitsiTrack.setEffect(audioMixerEffect);
await conference.replaceTrack(null, jitsiTrack);
} else {
await _addLocalTracksToConference(conference, [ jitsiTrack ]);
}
}
} else {
promise = _removeLocalTracksFromConference(conference, [ jitsiTrack ]);
}
if (promise) {
return promise.then(() => next(action));
await _removeLocalTracksFromConference(conference, [ jitsiTrack ]);
}
}
}
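The unmute branch above detaches the desktop audio track, wraps it in an audio-mixer effect on the freshly created microphone track, and only then attaches the microphone track to the conference. A sketch of that call order, using hypothetical stand-ins for the LJM conference and tracks (the real `replaceTrack`/`setEffect` calls are awaited promises; synchronous fakes keep the sequence easy to follow):

```typescript
// Hypothetical stand-ins for LJM objects; only the call sequence matters here.
interface Track { name: string; effect?: unknown; }

class FakeAudioMixerEffect {
    constructor(public readonly mixedTrack: Track) {}
}

class FakeConference {
    readonly calls: string[] = [];

    replaceTrack(oldTrack: Track | null, newTrack: Track | null): void {
        this.calls.push(`replace(${oldTrack?.name ?? 'null'} -> ${newTrack?.name ?? 'null'})`);
    }
}

// Mirrors the branch in the diff: detach the desktop audio track, wrap it in a
// mixer effect attached to the new microphone track, then add the mic track.
function addMicWithDesktopAudio(conference: FakeConference, micTrack: Track, desktopAudioTrack: Track): void {
    conference.replaceTrack(desktopAudioTrack, null);
    micTrack.effect = new FakeAudioMixerEffect(desktopAudioTrack);
    conference.replaceTrack(null, micTrack);
}
```

Doing the detach first matters: adding the mic track while the bare desktop audio track is still attached would momentarily send two audio tracks, which is what the mixer effect exists to avoid.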

View File

@@ -105,8 +105,6 @@ export interface IJitsiConference {
isLobbySupported: Function;
isP2PActive: Function;
isSIPCallingSupported: Function;
isStartAudioMuted: Function;
isStartVideoMuted: Function;
join: Function;
joinLobby: Function;
kickParticipant: Function;

View File

@@ -0,0 +1,61 @@
import { IStore } from '../../app/types';
import { showNotification } from '../../notifications/actions';
import { NOTIFICATION_TIMEOUT_TYPE } from '../../notifications/constants';
import StateListenerRegistry from '../redux/StateListenerRegistry';
import { setAudioMuted, setVideoMuted } from '../media/actions';
import { VIDEO_MUTISM_AUTHORITY } from '../media/constants';
let hasShownNotification = false;
/**
* Handles changes in the start muted policy for audio and video tracks in the meta data set for the conference.
*/
StateListenerRegistry.register(
/* selector */ state => state['features/base/conference'].startAudioMutedPolicy,
/* listener */ (startAudioMutedPolicy, store) => {
_updateTrackMuteState(store, true);
});
StateListenerRegistry.register(
/* selector */ state => state['features/base/conference'].startVideoMutedPolicy,
/* listener */(startVideoMutedPolicy, store) => {
_updateTrackMuteState(store, false);
});
/**
* Updates the mute state of the track based on the start muted policy.
*
* @param {IStore} store - The redux store.
* @param {boolean} isAudio - Whether the track is audio or video.
* @returns {void}
*/
function _updateTrackMuteState(store: IStore, isAudio: boolean) {
const { dispatch, getState } = store;
const mutedPolicyKey = isAudio ? 'startAudioMutedPolicy' : 'startVideoMutedPolicy';
const mutedPolicyValue = getState()['features/base/conference'][mutedPolicyKey];
// Currently, the policy only supports force muting others, not unmuting them.
if (!mutedPolicyValue) {
return;
}
let muteStateUpdated = false;
const { muted } = isAudio ? getState()['features/base/media'].audio : getState()['features/base/media'].video;
if (isAudio && !Boolean(muted)) {
dispatch(setAudioMuted(mutedPolicyValue, true));
muteStateUpdated = true;
} else if (!isAudio && !Boolean(muted)) {
// TODO: Add a new authority for video mutism for the moderator case.
dispatch(setVideoMuted(mutedPolicyValue, VIDEO_MUTISM_AUTHORITY.USER, true));
muteStateUpdated = true;
}
if (!hasShownNotification && muteStateUpdated) {
hasShownNotification = true;
dispatch(showNotification({
titleKey: 'notify.mutedTitle',
descriptionKey: 'notify.muted'
}, NOTIFICATION_TIMEOUT_TYPE.SHORT));
}
}
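The subscriber above relies on `StateListenerRegistry` firing a listener only when its selector's result changes between store updates. A minimal model of that selector/listener pattern (not the actual `StateListenerRegistry` implementation, which is redux-bound):

```typescript
// Minimal model of the selector/listener pattern: a listener fires only when
// its selector's result actually changes between notifications.
type Selector<S, R> = (state: S) => R;
type Listener<S, R> = (value: R, state: S) => void;

class MiniStateListenerRegistry<S> {
    private entries: Array<{ last: unknown; listener: Listener<S, any>; selector: Selector<S, any>; }> = [];

    register<R>(selector: Selector<S, R>, listener: Listener<S, R>): void {
        this.entries.push({ last: undefined, listener, selector });
    }

    // Called on every state change; compares each selector result by identity.
    notify(state: S): void {
        for (const entry of this.entries) {
            const next = entry.selector(state);

            if (next !== entry.last) {
                entry.last = next;
                entry.listener(next, state);
            }
        }
    }
}
```

This is why `startAudioMutedPolicy` and `startVideoMutedPolicy` get separate registrations in the subscriber: each policy flag is its own selector, so flipping one does not re-run the other's listener.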

View File

@@ -438,6 +438,7 @@ export interface IConfig {
};
iAmRecorder?: boolean;
iAmSipGateway?: boolean;
iAmSpot?: boolean;
ignoreStartMuted?: boolean;
inviteAppName?: string | null;
inviteServiceCallFlowsUrl?: string;
@@ -542,10 +543,12 @@ export interface IConfig {
};
recordingSharingUrl?: string;
recordings?: {
consentLearnMoreLink?: string;
recordAudioAndVideo?: boolean;
requireConsent?: boolean;
showPrejoinWarning?: boolean;
showRecordingLink?: boolean;
skipConsentInMeeting?: boolean;
suggestRecording?: boolean;
};
remoteVideoMenu?: {
@@ -617,7 +620,6 @@ export interface IConfig {
transcription?: {
autoCaptionOnTranscribe?: boolean;
autoTranscribeOnRecord?: boolean;
disableClosedCaptions?: boolean;
enabled?: boolean;
preferredLanguage?: string;
translationLanguages?: Array<string>;

View File

@@ -169,6 +169,7 @@ export default [
'hideLobbyButton',
'iAmRecorder',
'iAmSipGateway',
'iAmSpot',
'ignoreStartMuted',
'inviteAppName',
'liveStreaming.enabled',

View File

@@ -40,6 +40,7 @@ export default class AbstractDialog<P extends IProps, S extends IState = IState>
super(props);
// Bind event handlers so they are only bound once per instance.
this._onBack = this._onBack.bind(this);
this._onCancel = this._onCancel.bind(this);
this._onSubmit = this._onSubmit.bind(this);
this._onSubmitFulfilled = this._onSubmitFulfilled.bind(this);
@@ -75,6 +76,14 @@ export default class AbstractDialog<P extends IProps, S extends IState = IState>
return this.props.dispatch(hideDialog());
}
_onBack() {
const { backDisabled = false, onBack } = this.props;
if (!backDisabled && (!onBack || onBack())) {
this._hide();
}
}
/**
* Dispatches a redux action to hide this dialog when it's canceled.
*

View File

@@ -16,6 +16,11 @@ import styles from './styles';
*/
interface IProps extends AbstractProps, WithTranslation {
/**
* The i18n key of the text label for the back button.
*/
backLabel?: string;
/**
* The i18n key of the text label for the cancel button.
*/
@@ -36,6 +41,11 @@ interface IProps extends AbstractProps, WithTranslation {
*/
descriptionKey?: string | { key: string; params: string; };
/**
* Whether the back button is hidden.
*/
isBackHidden?: Boolean;
/**
* Whether the cancel button is hidden.
*/
@@ -55,6 +65,11 @@ interface IProps extends AbstractProps, WithTranslation {
* Dialog title.
*/
title?: string;
/**
* Renders buttons vertically.
*/
verticalButtons?: boolean;
}
/**
@@ -102,14 +117,17 @@ class ConfirmDialog extends AbstractDialog<IProps> {
*/
override render() {
const {
backLabel,
cancelLabel,
children,
confirmLabel,
isBackHidden = true,
isCancelHidden,
isConfirmDestructive,
isConfirmHidden,
t,
title
title,
verticalButtons
} = this.props;
const dialogButtonStyle
@@ -119,6 +137,7 @@ class ConfirmDialog extends AbstractDialog<IProps> {
return (
<Dialog.Container
coverScreen = { false }
verticalButtons = { verticalButtons }
visible = { true }>
{
title && <Dialog.Title>
@@ -127,6 +146,12 @@ class ConfirmDialog extends AbstractDialog<IProps> {
}
{ this._renderDescription() }
{ children }
{
!isBackHidden && <Dialog.Button
label = { t(backLabel || 'dialog.confirmBack') }
onPress = { this._onBack }
style = { styles.dialogButton } />
}
{
!isCancelHidden && <Dialog.Button
label = { t(cancelLabel || 'dialog.confirmNo') }

View File

@@ -2,6 +2,16 @@ import { ReactNode } from 'react';
export type DialogProps = {
/**
* Whether back button is disabled. Enabled by default.
*/
backDisabled?: boolean;
/**
* Optional i18n key to change the back button title.
*/
backKey?: string;
/**
* Whether cancel button is disabled. Enabled by default.
*/
@@ -27,6 +37,11 @@ export type DialogProps = {
*/
okKey?: string;
/**
* The handler for onBack event.
*/
onBack?: Function;
/**
* The handler for onCancel event.
*/

View File

@@ -176,6 +176,7 @@ class Popover extends Component<IProps, IState> {
this._setContextMenuStyle = this._setContextMenuStyle.bind(this);
this._getCustomDialogStyle = this._getCustomDialogStyle.bind(this);
this._onOutsideClick = this._onOutsideClick.bind(this);
this._onOutsideTouchStart = this._onOutsideTouchStart.bind(this);
}
/**
@@ -185,7 +186,7 @@ class Popover extends Component<IProps, IState> {
* @returns {void}
*/
override componentDidMount() {
window.addEventListener('touchstart', this._onTouchStart);
window.addEventListener('touchstart', this._onOutsideTouchStart);
if (this.props.trigger === 'click') {
// @ts-ignore
window.addEventListener('click', this._onOutsideClick);
@@ -199,7 +200,7 @@ class Popover extends Component<IProps, IState> {
* @returns {void}
*/
override componentWillUnmount() {
window.removeEventListener('touchstart', this._onTouchStart);
window.removeEventListener('touchstart', this._onOutsideTouchStart);
if (this.props.trigger === 'click') {
// @ts-ignore
window.removeEventListener('click', this._onOutsideClick);
@@ -261,6 +262,7 @@ class Popover extends Component<IProps, IState> {
id = { id }
onClick = { this._onClick }
onKeyPress = { this._onKeyPress }
onTouchStart = { this._onTouchStart }
{ ...(trigger === 'hover' ? {
onMouseEnter: this._onShowDialog,
onMouseLeave: this._onHideDialog
@@ -337,7 +339,7 @@ class Popover extends Component<IProps, IState> {
* @private
* @returns {void}
*/
_onTouchStart(event: TouchEvent) {
_onOutsideTouchStart(event: TouchEvent) {
if (this.props.visible
&& !this.props.overflowDrawer
&& !this._contextMenuRef?.contains?.(event.target as Node)
@@ -401,6 +403,24 @@ class Popover extends Component<IProps, IState> {
}
}
/**
* Stops propagation of touchstart events originating from the Popover's trigger container.
* This prevents the window's 'touchstart' listener (_onOutsideTouchStart) from
* immediately closing the Popover if the touch begins on the trigger area itself.
* Without this, the subsequent synthesized 'click' event will not execute
* because the Popover would already be closing or removed, breaking interactions
* within the Popover on touch devices.
*
* e.g. On a mobile device overflow buttons don't execute their click actions.
*
* @param {React.TouchEvent} event - The touch start event.
* @private
* @returns {void}
*/
_onTouchStart(event: React.TouchEvent) {
event.stopPropagation();
}
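The interplay the comment describes can be reduced to a tiny model: the window-level handler closes the popover for any touch, unless the touch began on the trigger, whose own handler stops propagation first. A simplified sketch (the bubbling logic is modeled by hand; in the browser it is the DOM event flow):

```typescript
// Tiny model of the fix: the trigger's touchstart handler stops propagation,
// so the window-level "outside touch" handler never closes the popover for
// touches that begin on the trigger area itself.
class MiniTouchEvent {
    propagationStopped = false;
    stopPropagation(): void { this.propagationStopped = true; }
}

let popoverVisible = true;

// Handler on the trigger container (_onTouchStart in the diff).
function onTriggerTouchStart(event: MiniTouchEvent): void {
    event.stopPropagation();
}

// Handler on window (_onOutsideTouchStart in the diff).
function onOutsideTouchStart(): void {
    popoverVisible = false;
}

// Simplified bubbling: the target handler runs first; the window handler runs
// only if propagation was not stopped along the way.
function dispatchTouch(startsOnTrigger: boolean): void {
    const event = new MiniTouchEvent();

    if (startsOnTrigger) {
        onTriggerTouchStart(event);
    }
    if (!event.propagationStopped) {
        onOutsideTouchStart();
    }
}
```

Without the trigger handler, every touch would reach the window listener, the popover would close before the synthesized click fired, and buttons inside it would never execute — the mobile bug this change fixes.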
/**
* KeyPress handler for accessibility.
*

View File

@@ -12,7 +12,6 @@
* localFlipX: boolean,
* micDeviceId: string,
* serverURL: string,
* showSubtitlesOnStage: boolean,
* startAudioOnly: boolean,
* startWithAudioMuted: boolean,
* startWithVideoMuted: boolean,

View File

@@ -29,7 +29,6 @@ const DEFAULT_STATE: ISettingsState = {
micDeviceId: undefined,
serverURL: undefined,
hideShareAudioHelper: false,
showSubtitlesOnStage: false,
soundsIncomingMessage: true,
soundsParticipantJoined: true,
soundsParticipantKnocking: true,
@@ -68,7 +67,6 @@ export interface ISettingsState {
maxStageParticipants?: number;
micDeviceId?: string | boolean;
serverURL?: string;
showSubtitlesOnStage?: boolean;
soundsIncomingMessage?: boolean;
soundsParticipantJoined?: boolean;
soundsParticipantKnocking?: boolean;

View File

@@ -2,7 +2,6 @@ import { batch } from 'react-redux';
import { IStore } from '../../app/types';
import { _RESET_BREAKOUT_ROOMS } from '../../breakout-rooms/actionTypes';
import { isPrejoinPageVisible } from '../../prejoin/functions';
import { getCurrentConference } from '../conference/functions';
import {
SET_AUDIO_MUTED,
@@ -203,11 +202,8 @@ function _setMuted(store: IStore, { ensureTrack, muted }: {
setTrackMuted(jitsiTrack, muted, state, dispatch)
.catch(() => dispatch(trackMuteUnmuteFailed(localTrack, muted)));
}
} else if (!muted && ensureTrack && (typeof APP === 'undefined' || isPrejoinPageVisible(state))) {
} else if (!muted && ensureTrack) {
typeof APP !== 'undefined' && dispatch(gumPending([ mediaType ], IGUMPendingState.PENDING_UNMUTE));
// FIXME: This only runs on mobile now because web has its own way of
// creating local tracks. Adjust the check once they are unified.
dispatch(createLocalTracksA({ devices: [ mediaType ] })).then(() => {
typeof APP !== 'undefined' && dispatch(gumPending([ mediaType ], IGUMPendingState.NONE));
});

View File

@@ -1,3 +1,4 @@
// Mapping between the token used and the color
export const colorMap = {
// ----- Surfaces -----
@@ -118,8 +119,8 @@ export const colorMap = {
export const font = {
weightRegular: 400,
weightSemiBold: 600
weightRegular: '400',
weightSemiBold: '600'
};
export const shape = {
@@ -129,7 +130,7 @@ export const shape = {
};
export const spacing
= [ '0rem', '0.25rem', '0.5rem', '1rem', '1.5rem', '2rem', '2.5rem', '3rem', '3.5rem', '4rem', '4.5rem', '5rem', '5.5rem', '6rem', '6.5rem', '7rem', '7.5rem', '8rem' ];
= [ 0, 4, 8, 16, 24, 32, 40, 48, 56, 64, 72, 80, 88, 96, 104, 112, 120, 128 ];
export const typography = {
labelRegular: 'label01',
@@ -137,64 +138,64 @@ export const typography = {
labelBold: 'labelBold01',
bodyShortRegularSmall: {
fontSize: '0.625rem',
lineHeight: '1rem',
fontSize: 10,
lineHeight: 16,
fontWeight: font.weightRegular,
letterSpacing: 0
},
bodyShortRegular: {
fontSize: '0.875rem',
lineHeight: '1.25rem',
fontSize: 14,
lineHeight: 20,
fontWeight: font.weightRegular,
letterSpacing: 0
},
bodyShortBold: {
fontSize: '0.875rem',
lineHeight: '1.25rem',
fontSize: 14,
lineHeight: 20,
fontWeight: font.weightSemiBold,
letterSpacing: 0
},
bodyShortRegularLarge: {
fontSize: '1rem',
lineHeight: '1.375rem',
fontSize: 16,
lineHeight: 22,
fontWeight: font.weightRegular,
letterSpacing: 0
},
bodyShortBoldLarge: {
fontSize: '1rem',
lineHeight: '1.375rem',
fontSize: 16,
lineHeight: 22,
fontWeight: font.weightSemiBold,
letterSpacing: 0
},
bodyLongRegular: {
fontSize: '0.875rem',
lineHeight: '1.5rem',
fontSize: 14,
lineHeight: 24,
fontWeight: font.weightRegular,
letterSpacing: 0
},
bodyLongRegularLarge: {
fontSize: '1rem',
lineHeight: '1.625rem',
fontSize: 16,
lineHeight: 26,
fontWeight: font.weightRegular,
letterSpacing: 0
},
bodyLongBold: {
fontSize: '0.875rem',
lineHeight: '1.5rem',
fontSize: 14,
lineHeight: 24,
fontWeight: font.weightSemiBold,
letterSpacing: 0
},
bodyLongBoldLarge: {
fontSize: '1rem',
lineHeight: '1.625rem',
fontSize: 16,
lineHeight: 26,
fontWeight: font.weightSemiBold,
letterSpacing: 0
},
@@ -204,29 +205,29 @@ export const typography = {
heading2: 'heading02',
heading3: {
fontSize: '2rem',
lineHeight: '2.5rem',
fontSize: 32,
lineHeight: 40,
fontWeight: font.weightSemiBold,
letterSpacing: 0
},
heading4: {
fontSize: '1.75rem',
lineHeight: '2.25rem',
fontSize: 28,
lineHeight: 36,
fontWeight: font.weightSemiBold,
letterSpacing: 0
},
heading5: {
fontSize: '1.25rem',
lineHeight: '1.75rem',
fontSize: 20,
lineHeight: 28,
fontWeight: font.weightSemiBold,
letterSpacing: 0
},
heading6: {
fontSize: '1rem',
lineHeight: '1.625rem',
fontSize: 16,
lineHeight: 26,
fontWeight: font.weightSemiBold,
letterSpacing: 0
}
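Every numeric token replacing a rem string in this hunk follows from the standard 16px base font size: `px = round(rem × 16)`. A quick self-check over a few pairs taken from the diff:

```typescript
// Verify the rem-string -> pixel-number mapping used throughout this hunk,
// assuming the standard 16px base (1rem = 16px).
const BASE_FONT_SIZE = 16;

function remToPx(rem: string): number {
    // parseFloat stops at the 'rem' suffix, so '0.875rem' -> 0.875.
    return Math.round(parseFloat(rem) * BASE_FONT_SIZE);
}

// Pairs taken directly from the typography changes above.
const pairs: Array<[ string, number ]> = [
    [ '0.625rem', 10 ],
    [ '0.875rem', 14 ],
    [ '1.375rem', 22 ],
    [ '1.625rem', 26 ],
    [ '2.5rem', 40 ]
];
```

Baking the pixel numbers into the tokens is what lets the native theme file later in this compare drop its `remToPixels`/`convertRemValues` helpers entirely.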

View File

@@ -213,7 +213,7 @@ const ContextMenu = ({
if (offsetTop + height > offsetHeight + scrollTop && height > offsetTop) {
// top offset and + padding + border
container.style.maxHeight = `calc(${offsetTop}px - (${spacing[2]} * 2 + 2px))`;
container.style.maxHeight = `${offsetTop - ((spacing[2] * 2) + 2)}px`;
}
// get the height after style changes

View File

@@ -1,29 +0,0 @@
import React from 'react';
interface IHiddenDescriptionProps {
children: React.ReactNode;
id: string;
}
export const HiddenDescription: React.FC<IHiddenDescriptionProps> = ({ id, children }) => {
const hiddenStyle: React.CSSProperties = {
border: 0,
clip: 'rect(0 0 0 0)',
clipPath: 'inset(50%)',
height: '1px',
margin: '-1px',
overflow: 'hidden',
padding: 0,
position: 'absolute',
width: '1px',
whiteSpace: 'nowrap'
};
return (
<span
id = { id }
style = { hiddenStyle }>
{children}
</span>
);
};

View File

@@ -7,7 +7,6 @@ import Icon from '../../../icons/components/Icon';
import { IconCloseCircle } from '../../../icons/svg';
import { withPixelLineHeight } from '../../../styles/functions.web';
import { IInputProps } from '../types';
import { HiddenDescription } from './HiddenDescription';
interface IProps extends IInputProps {
accessibilityLabel?: string;
@@ -15,7 +14,6 @@ interface IProps extends IInputProps {
autoFocus?: boolean;
bottomLabel?: string;
className?: string;
hiddenDescription?: string; // Text that will be announced by screen readers but not displayed visually.
iconClick?: () => void;
/**
@@ -154,14 +152,13 @@ const useStyles = makeStyles()(theme => {
const Input = React.forwardRef<any, IProps>(({
accessibilityLabel,
autoComplete = 'off',
autoComplete,
autoFocus,
bottomLabel,
className,
clearable = false,
disabled,
error,
hiddenDescription,
icon,
iconClick,
id,
@@ -188,22 +185,11 @@ const Input = React.forwardRef<any, IProps>(({
const { classes: styles, cx } = useStyles();
const isMobile = isMobileBrowser();
const showClearIcon = clearable && value !== '' && !disabled;
const inputAutoCompleteOff = autoComplete === 'off' ? { 'data-1p-ignore': '' } : {};
const handleChange = useCallback((e: React.ChangeEvent<HTMLInputElement | HTMLTextAreaElement>) =>
onChange?.(e.target.value), []);
const clearInput = useCallback(() => onChange?.(''), []);
const hiddenDescriptionId = `${id}-hidden-description`;
let ariaDescribedById: string | undefined;
if (bottomLabel) {
ariaDescribedById = `${id}-description`;
} else if (hiddenDescription) {
ariaDescribedById = hiddenDescriptionId;
} else {
ariaDescribedById = undefined;
}
return (
<div className = { cx(styles.inputContainer, className) }>
@@ -221,7 +207,6 @@ const Input = React.forwardRef<any, IProps>(({
src = { icon } />}
{textarea ? (
<TextareaAutosize
aria-describedby = { ariaDescribedById }
aria-label = { accessibilityLabel }
autoComplete = { autoComplete }
autoFocus = { autoFocus }
@@ -242,7 +227,7 @@ const Input = React.forwardRef<any, IProps>(({
value = { value } />
) : (
<input
aria-describedby = { ariaDescribedById }
aria-describedby = { bottomLabel ? `${id}-description` : undefined }
aria-label = { accessibilityLabel }
autoComplete = { autoComplete }
autoFocus = { autoFocus }
@@ -251,7 +236,6 @@ const Input = React.forwardRef<any, IProps>(({
data-testid = { testId }
disabled = { disabled }
id = { id }
{ ...inputAutoCompleteOff }
{ ...(mode ? { inputmode: mode } : {}) }
{ ...(type === 'number' ? { max: maxValue } : {}) }
maxLength = { maxLength }
@@ -282,7 +266,6 @@ const Input = React.forwardRef<any, IProps>(({
{bottomLabel}
</span>
)}
{!bottomLabel && hiddenDescription && <HiddenDescription id = { hiddenDescriptionId }>{ hiddenDescription }</HiddenDescription>}
</div>
);
});

View File

@@ -2,47 +2,6 @@ import { DefaultTheme } from 'react-native-paper';
import { createColorTokens } from './utils';
// Base font size in pixels (standard is 16px = 1rem)
const BASE_FONT_SIZE = 16;
/**
* Converts rem to pixels.
*
* @param {string} remValue - The value in rem units (e.g. '0.875rem').
* @returns {number}
*/
function remToPixels(remValue: string): number {
const numericValue = parseFloat(remValue.replace('rem', ''));
return Math.round(numericValue * BASE_FONT_SIZE);
}
/**
* Converts all rem to pixels in an object.
*
* @param {Object} obj - The object to convert rem values in.
* @returns {Object}
*/
function convertRemValues(obj: any): any {
const converted: { [key: string]: any; } = {};
if (typeof obj !== 'object' || obj === null) {
return obj;
}
Object.entries(obj).forEach(([ key, value ]) => {
if (typeof value === 'string' && value.includes('rem')) {
converted[key] = remToPixels(value);
} else if (typeof value === 'object' && value !== null) {
converted[key] = convertRemValues(value);
} else {
converted[key] = value;
}
});
return converted;
}
/**
* Creates a React Native Paper theme based on local UI tokens.
*
@@ -54,10 +13,10 @@ export function createNativeTheme({ font, colorMap, shape, spacing, typography }
...DefaultTheme,
palette: createColorTokens(colorMap),
shape,
spacing: spacing.map(remToPixels),
spacing,
typography: {
font,
...convertRemValues(typography)
...typography
}
};
}

View File

@@ -18,7 +18,7 @@ interface ThemeProps {
colorMap: Object;
font: Object;
shape: Object;
spacing: Array<number | string>;
spacing: Array<number>;
typography: Object;
}

View File

@@ -1,79 +0,0 @@
/**
* Interface representing a message that can be grouped.
* Used by both chat messages and subtitles.
*/
export interface IGroupableMessage {
/**
* The ID of the participant who sent the message.
*/
participantId: string;
}
/**
* Interface representing a group of messages from the same sender.
*
* @template T - The type of messages in the group, must extend IGroupableMessage.
*/
export interface IMessageGroup<T extends IGroupableMessage> {
/**
* Array of messages in this group.
*/
messages: T[];
/**
* The ID of the participant who sent all messages in this group.
*/
senderId: string;
}
/**
* Groups an array of messages by sender.
*
* @template T - The type of messages to group, must extend IGroupableMessage.
* @param {T[]} messages - The array of messages to group.
* @returns {IMessageGroup<T>[]} - An array of message groups, where each group contains messages from the same sender.
* @example
* const messages = [
* { participantId: "user1", timestamp: 1000 },
* { participantId: "user1", timestamp: 2000 },
* { participantId: "user2", timestamp: 3000 }
* ];
* const groups = groupMessagesBySender(messages);
* // Returns:
* // [
* // {
* // senderId: "user1",
* // messages: [
* // { participantId: "user1", timestamp: 1000 },
* // { participantId: "user1", timestamp: 2000 }
* // ]
* // },
* // { senderId: "user2", messages: [{ participantId: "user2", timestamp: 3000 }] }
* // ]
*/
export function groupMessagesBySender<T extends IGroupableMessage>(
messages: T[]
): IMessageGroup<T>[] {
if (!messages?.length) {
return [];
}
const groups: IMessageGroup<T>[] = [];
let currentGroup: IMessageGroup<T> | null = null;
for (const message of messages) {
if (!currentGroup || currentGroup.senderId !== message.participantId) {
currentGroup = {
messages: [ message ],
senderId: message.participantId
};
groups.push(currentGroup);
} else {
currentGroup.messages.push(message);
}
}
return groups;
}
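This file is removed in the compare (the native `MessageContainer` reverts to `AbstractMessageContainer`'s own grouping). For reference, a condensed, self-contained version of the behavior being removed — consecutive messages from the same `participantId` collapse into one group, while a different sender (or the same sender returning later) starts a new one:

```typescript
// Condensed version of the removed grouping utility, kept for reference.
interface Groupable { participantId: string; }

function groupBySender<T extends Groupable>(messages: T[]): Array<{ messages: T[]; senderId: string; }> {
    const groups: Array<{ messages: T[]; senderId: string; }> = [];

    for (const message of messages) {
        const last = groups[groups.length - 1];

        if (last && last.senderId === message.participantId) {
            // Same sender as the previous message: extend the current group.
            last.messages.push(message);
        } else {
            // Sender changed (or first message): open a new group.
            groups.push({ messages: [ message ], senderId: message.participantId });
        }
    }

    return groups;
}
```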

View File

@@ -1,9 +1,16 @@
import { IReduxState } from '../../app/types';
/**
* Checks if Jitsi Meet is running on Spot TV.
*
* @param {IReduxState} state - The redux state.
* @returns {boolean} Whether or not Jitsi Meet is running on Spot TV.
*/
export function isSpotTV(): boolean {
return navigator.userAgent.includes('SpotElectron/');
export function isSpotTV(state: IReduxState): boolean {
const { defaultLocalDisplayName, iAmSpot } = state['features/base/config'] || {};
return iAmSpot
|| navigator.userAgent.includes('JitsiSpot/') // Jitsi Spot app
|| navigator.userAgent.includes('8x8MeetingRooms/') // 8x8 Meeting Rooms app
|| defaultLocalDisplayName === 'Meeting Room';
}
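The new detection is pure flag and string checks, so it can be sketched standalone — here the redux state is replaced by a plain config object and the UA string is passed in explicitly, purely for illustration:

```typescript
// Standalone sketch of the Spot TV detection order from the diff: config flag
// first, then UA-string markers, then the default display name as a last resort.
interface SpotDetectionConfig {
    defaultLocalDisplayName?: string;
    iAmSpot?: boolean;
}

function detectSpotTV(config: SpotDetectionConfig, userAgent: string): boolean {
    return Boolean(config.iAmSpot)
        || userAgent.includes('JitsiSpot/')        // Jitsi Spot app
        || userAgent.includes('8x8MeetingRooms/')  // 8x8 Meeting Rooms app
        || config.defaultLocalDisplayName === 'Meeting Room';
}
```

The layered fallbacks exist because, per the commit message, setting the UA string in Electron does not propagate into the iframe where the meeting loads, so no single signal is reliable on its own.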

View File

@@ -98,14 +98,14 @@ export const SEND_REACTION = 'SEND_REACTION';
export const SET_PRIVATE_MESSAGE_RECIPIENT = 'SET_PRIVATE_MESSAGE_RECIPIENT';
/**
* The type of action which signals setting the focused tab.
* The type of action which signals the update a _isPollsTabFocused.
*
* {
* type: SET_FOCUSED_TAB,
* tabId: string
* isPollsTabFocused: boolean,
* type: SET_PRIVATE_MESSAGE_RECIPIENT
* }
*/
export const SET_FOCUSED_TAB = 'SET_FOCUSED_TAB';
export const SET_IS_POLL_TAB_FOCUSED = 'SET_IS_POLL_TAB_FOCUSED';
/**
* The type of action which sets the current recipient for lobby messages.

View File

@@ -10,16 +10,14 @@ import {
CLEAR_MESSAGES,
CLOSE_CHAT,
EDIT_MESSAGE,
OPEN_CHAT,
REMOVE_LOBBY_CHAT_PARTICIPANT,
SEND_MESSAGE,
SEND_REACTION,
SET_FOCUSED_TAB,
SET_IS_POLL_TAB_FOCUSED,
SET_LOBBY_CHAT_ACTIVE_STATE,
SET_LOBBY_CHAT_RECIPIENT,
SET_PRIVATE_MESSAGE_RECIPIENT
} from './actionTypes';
import { ChatTabs } from './constants';
/**
* Adds a chat message to the collection of messages.
@@ -171,36 +169,18 @@ export function setPrivateMessageRecipient(participant?: Object) {
}
/**
* Set the value of the currently focused tab.
* Set the value of _isPollsTabFocused.
*
* @param {string} tabId - The id of the currently focused tab.
* @returns {{
* type: SET_FOCUSED_TAB,
* tabId: string
* }}
* @param {boolean} isPollsTabFocused - The new value for _isPollsTabFocused.
* @returns {Function}
*/
export function setFocusedTab(tabId: ChatTabs) {
export function setIsPollsTabFocused(isPollsTabFocused: boolean) {
return {
type: SET_FOCUSED_TAB,
tabId
isPollsTabFocused,
type: SET_IS_POLL_TAB_FOCUSED
};
}
/**
* Opens the chat panel with CC tab active.
*
* @returns {Object} The redux action.
*/
export function openCCPanel() {
return async (dispatch: IStore['dispatch']) => {
dispatch(setFocusedTab(ChatTabs.CLOSED_CAPTIONS));
dispatch({
type: OPEN_CHAT
});
};
}
/**
* Initiates the sending of messages between a moderator and a lobby attendee.
*

View File

@@ -1,28 +1,26 @@
import React, { Component } from 'react';
import React from 'react';
import { FlatList, Text, TextStyle, View, ViewStyle } from 'react-native';
import { connect } from 'react-redux';
import { translate } from '../../../base/i18n/functions';
import { IMessageGroup, groupMessagesBySender } from '../../../base/util/messageGrouping';
import { IMessage } from '../../types';
import AbstractMessageContainer, { IProps as AbstractProps } from '../AbstractMessageContainer';
import ChatMessageGroup from './ChatMessageGroup';
import styles from './styles';
interface IProps {
messages: IMessage[];
interface IProps extends AbstractProps {
/**
* Function to be used to translate i18n labels.
*/
t: Function;
}
/**
* Implements a container to render all the chat messages in a conference.
*/
class MessageContainer extends Component<IProps, any> {
static defaultProps = {
messages: [] as IMessage[]
};
class MessageContainer extends AbstractMessageContainer<IProps, any> {
/**
* Instantiates a new instance of the component.
*
@@ -34,7 +32,6 @@ class MessageContainer extends Component<IProps, any> {
this._keyExtractor = this._keyExtractor.bind(this);
this._renderListEmptyComponent = this._renderListEmptyComponent.bind(this);
this._renderMessageGroup = this._renderMessageGroup.bind(this);
this._getMessagesGroupedBySender = this._getMessagesGroupedBySender.bind(this);
}
/**
@@ -97,21 +94,9 @@ class MessageContainer extends Component<IProps, any> {
* @param {Array<Object>} messages - The chat message to render.
* @returns {React$Element<*>}
*/
_renderMessageGroup({ item: group }: { item: IMessageGroup<IMessage>; }) {
const { messages } = group;
_renderMessageGroup({ item: messages }: { item: IMessage[]; }) {
return <ChatMessageGroup messages = { messages } />;
}
/**
* Returns an array of message groups, where each group is an array of messages
* grouped by the sender.
*
* @returns {Array<Array<Object>>}
*/
_getMessagesGroupedBySender() {
return groupMessagesBySender(this.props.messages);
}
}
export default translate(connect()(MessageContainer));

View File

@@ -9,14 +9,12 @@ import { withPixelLineHeight } from '../../../base/styles/functions.web';
import Tabs from '../../../base/ui/components/web/Tabs';
import { arePollsDisabled } from '../../../conference/functions.any';
import PollsPane from '../../../polls/components/web/PollsPane';
import { isCCTabEnabled } from '../../../subtitles/functions.any';
import { sendMessage, setFocusedTab, toggleChat } from '../../actions.web';
import { CHAT_SIZE, ChatTabs, SMALL_WIDTH_THRESHOLD } from '../../constants';
import { sendMessage, setIsPollsTabFocused, toggleChat } from '../../actions.web';
import { CHAT_SIZE, CHAT_TABS, SMALL_WIDTH_THRESHOLD } from '../../constants';
import { IChatProps as AbstractProps } from '../../types';
import ChatHeader from './ChatHeader';
import ChatInput from './ChatInput';
import ClosedCaptionsTab from './ClosedCaptionsTab';
import DisplayNameForm from './DisplayNameForm';
import KeyboardAvoider from './KeyboardAvoider';
import MessageContainer from './MessageContainer';
@@ -24,16 +22,6 @@ import MessageRecipient from './MessageRecipient';
interface IProps extends AbstractProps {
/**
* The currently focused tab.
*/
_focusedTab: ChatTabs;
/**
* True if the CC tab is enabled and false otherwise.
*/
_isCCTabEnabled: boolean;
/**
* Whether the chat is opened in a modal or not (computed based on window width).
*/
@@ -49,6 +37,11 @@ interface IProps extends AbstractProps {
*/
_isPollsEnabled: boolean;
/**
* Whether the poll tab is focused or not.
*/
_isPollsTabFocused: boolean;
/**
* Number of unread poll messages.
*/
@@ -154,8 +147,7 @@ const Chat = ({
_isModal,
_isOpen,
_isPollsEnabled,
_isCCTabEnabled,
_focusedTab,
_isPollsTabFocused,
_messages,
_nbUnreadMessages,
_nbUnreadPolls,
@@ -211,8 +203,8 @@ const Chat = ({
* @returns {void}
*/
const onChangeTab = useCallback((id: string) => {
dispatch(setFocusedTab(id as ChatTabs));
}, [ dispatch ]);
dispatch(setIsPollsTabFocused(id !== CHAT_TABS.CHAT));
}, []);
/**
* Returns a React Element for showing chat messages and a form to send new
@@ -224,15 +216,15 @@ const Chat = ({
function renderChat() {
return (
<>
{renderTabs()}
{_isPollsEnabled && renderTabs()}
<div
aria-labelledby = { ChatTabs.CHAT }
aria-labelledby = { CHAT_TABS.CHAT }
className = { cx(
classes.chatPanel,
!_isPollsEnabled && !_isCCTabEnabled && classes.chatPanelNoTabs,
_focusedTab !== ChatTabs.CHAT && 'hide'
!_isPollsEnabled && classes.chatPanelNoTabs,
_isPollsTabFocused && 'hide'
) }
id = { `${ChatTabs.CHAT}-panel` }
id = { `${CHAT_TABS.CHAT}-panel` }
role = 'tabpanel'
tabIndex = { 0 }>
<MessageContainer
@@ -241,76 +233,49 @@ const Chat = ({
<ChatInput
onSend = { onSendMessage } />
</div>
{ _isPollsEnabled && (
{_isPollsEnabled && (
<>
<div
aria-labelledby = { ChatTabs.POLLS }
className = { cx(classes.pollsPanel, _focusedTab !== ChatTabs.POLLS && 'hide') }
id = { `${ChatTabs.POLLS}-panel` }
aria-labelledby = { CHAT_TABS.POLLS }
className = { cx(classes.pollsPanel, !_isPollsTabFocused && 'hide') }
id = { `${CHAT_TABS.POLLS}-panel` }
role = 'tabpanel'
tabIndex = { 1 }>
tabIndex = { 0 }>
<PollsPane />
</div>
<KeyboardAvoider />
</>
)}
{ _isCCTabEnabled && <div
aria-labelledby = { ChatTabs.CLOSED_CAPTIONS }
className = { cx(classes.chatPanel, _focusedTab !== ChatTabs.CLOSED_CAPTIONS && 'hide') }
id = { `${ChatTabs.CLOSED_CAPTIONS}-panel` }
role = 'tabpanel'
tabIndex = { 2 }>
<ClosedCaptionsTab />
</div> }
</>
);
}
/**
* Returns a React Element showing the Chat, Polls and Subtitles tabs.
* Returns a React Element showing the Chat and Polls tab.
*
* @private
* @returns {ReactElement}
*/
function renderTabs() {
const tabs = [
{
accessibilityLabel: t('chat.tabs.chat'),
countBadge:
_focusedTab !== ChatTabs.CHAT && _nbUnreadMessages > 0 ? _nbUnreadMessages : undefined,
id: ChatTabs.CHAT,
controlsId: `${ChatTabs.CHAT}-panel`,
label: t('chat.tabs.chat')
}
];
if (_isPollsEnabled) {
tabs.push({
accessibilityLabel: t('chat.tabs.polls'),
countBadge: _focusedTab !== ChatTabs.POLLS && _nbUnreadPolls > 0 ? _nbUnreadPolls : undefined,
id: ChatTabs.POLLS,
controlsId: `${ChatTabs.POLLS}-panel`,
label: t('chat.tabs.polls')
});
}
if (_isCCTabEnabled) {
tabs.push({
accessibilityLabel: t('chat.tabs.closedCaptions'),
countBadge: undefined,
id: ChatTabs.CLOSED_CAPTIONS,
controlsId: `${ChatTabs.CLOSED_CAPTIONS}-panel`,
label: t('chat.tabs.closedCaptions')
});
}
return (
<Tabs
accessibilityLabel = { t(_isPollsEnabled ? 'chat.titleWithPolls' : 'chat.title') }
onChange = { onChangeTab }
selected = { _focusedTab }
tabs = { tabs } />
selected = { _isPollsTabFocused ? CHAT_TABS.POLLS : CHAT_TABS.CHAT }
tabs = { [ {
accessibilityLabel: t('chat.tabs.chat'),
countBadge: _isPollsTabFocused && _nbUnreadMessages > 0 ? _nbUnreadMessages : undefined,
id: CHAT_TABS.CHAT,
controlsId: `${CHAT_TABS.CHAT}-panel`,
label: t('chat.tabs.chat')
}, {
accessibilityLabel: t('chat.tabs.polls'),
countBadge: !_isPollsTabFocused && _nbUnreadPolls > 0 ? _nbUnreadPolls : undefined,
id: CHAT_TABS.POLLS,
controlsId: `${CHAT_TABS.POLLS}-panel`,
label: t('chat.tabs.polls')
}
] } />
);
}
@@ -321,13 +286,10 @@ const Chat = ({
onKeyDown = { onEscClick } >
<ChatHeader
className = { cx('chat-header', classes.chatHeader) }
isCCTabEnabled = { _isCCTabEnabled }
isPollsEnabled = { _isPollsEnabled }
onCancel = { onToggleChat } />
{_showNamePrompt
? <DisplayNameForm
isCCTabEnabled = { _isCCTabEnabled }
isPollsEnabled = { _isPollsEnabled } />
? <DisplayNameForm isPollsEnabled = { _isPollsEnabled } />
: renderChat()}
</div> : null
);
@@ -344,8 +306,7 @@ const Chat = ({
* _isModal: boolean,
* _isOpen: boolean,
* _isPollsEnabled: boolean,
* _isCCTabEnabled: boolean,
* _focusedTab: string,
* _isPollsTabFocused: boolean,
* _messages: Array<Object>,
* _nbUnreadMessages: number,
* _nbUnreadPolls: number,
@@ -353,7 +314,7 @@ const Chat = ({
* }}
*/
function _mapStateToProps(state: IReduxState, _ownProps: any) {
const { isOpen, focusedTab, messages, nbUnreadMessages } = state['features/chat'];
const { isOpen, isPollsTabFocused, messages, nbUnreadMessages } = state['features/chat'];
const { nbUnreadPolls } = state['features/polls'];
const _localParticipant = getLocalParticipant(state);
@@ -361,8 +322,7 @@ function _mapStateToProps(state: IReduxState, _ownProps: any) {
_isModal: window.innerWidth <= SMALL_WIDTH_THRESHOLD,
_isOpen: isOpen,
_isPollsEnabled: !arePollsDisabled(state),
_isCCTabEnabled: isCCTabEnabled(state),
_focusedTab: focusedTab,
_isPollsTabFocused: isPollsTabFocused,
_messages: messages,
_nbUnreadMessages: nbUnreadMessages,
_nbUnreadPolls: nbUnreadPolls,


@@ -13,11 +13,6 @@ interface IProps {
*/
className: string;
/**
* Whether CC tab is enabled or not.
*/
isCCTabEnabled: boolean;
/**
* Whether the polls feature is enabled or not.
*/
@@ -34,7 +29,7 @@ interface IProps {
*
* @returns {React$Element<any>}
*/
function ChatHeader({ className, isPollsEnabled, isCCTabEnabled }: IProps) {
function ChatHeader({ className, isPollsEnabled }: IProps) {
const dispatch = useDispatch();
const { t } = useTranslation();
@@ -49,23 +44,13 @@ function ChatHeader({ className, isPollsEnabled, isCCTabEnabled }: IProps) {
}
}, []);
let title = 'chat.title';
if (isCCTabEnabled && isPollsEnabled) {
title = 'chat.titleWithPollsAndCC';
} else if (isCCTabEnabled) {
title = 'chat.titleWithCC';
} else if (isPollsEnabled) {
title = 'chat.titleWithPolls';
}
return (
<div
className = { className || 'chat-dialog-header' }>
<span
aria-level = { 1 }
role = 'heading'>
{ t(title) }
{ t(isPollsEnabled ? 'chat.titleWithPolls' : 'chat.title') }
</span>
<Icon
ariaLabel = { t('toolbar.closeChat') }


@@ -413,9 +413,11 @@ const ChatMessage = ({
function _mapStateToProps(state: IReduxState, { message }: IProps) {
const { knocking } = state['features/lobby'];
const localParticipantId = state['features/base/participants'].local?.id;
const { remoteVideoMenu = {} } = state['features/base/config'];
const { disablePrivateChat } = remoteVideoMenu;
return {
shouldDisplayChatMessageMenu: message.participantId !== localParticipantId,
shouldDisplayChatMessageMenu: !disablePrivateChat && message.participantId !== localParticipantId,
knocking,
state
};


@@ -1,178 +0,0 @@
import React, { useCallback, useMemo, useState } from 'react';
import { useTranslation } from 'react-i18next';
import { useDispatch, useSelector } from 'react-redux';
import { makeStyles } from 'tss-react/mui';
import Icon from '../../../base/icons/components/Icon';
import { IconSubtitles } from '../../../base/icons/svg';
import { withPixelLineHeight } from '../../../base/styles/functions.web';
import Button from '../../../base/ui/components/web/Button';
import { groupMessagesBySender } from '../../../base/util/messageGrouping';
import { setRequestingSubtitles } from '../../../subtitles/actions.any';
import LanguageSelector from '../../../subtitles/components/web/LanguageSelector';
import { canStartSubtitles } from '../../../subtitles/functions.any';
import { ISubtitle } from '../../../subtitles/types';
import { isTranscribing } from '../../../transcribing/functions';
import { SubtitlesMessagesContainer } from './SubtitlesMessagesContainer';
import { IReduxState } from '../../../app/types';
/**
* The styles for the ClosedCaptionsTab component.
*/
const useStyles = makeStyles()(theme => {
return {
subtitlesList: {
display: 'flex',
flexDirection: 'column',
height: '100%',
overflowY: 'auto',
padding: '16px',
flex: 1,
boxSizing: 'border-box',
color: theme.palette.text01
},
container: {
display: 'flex',
flexDirection: 'column',
height: '100%',
position: 'relative',
overflow: 'hidden'
},
messagesContainer: {
display: 'flex',
flexDirection: 'column',
flex: 1,
overflow: 'hidden'
},
emptyContent: {
display: 'flex',
alignItems: 'center',
justifyContent: 'center',
height: '100%',
padding: '16px',
boxSizing: 'border-box',
flexDirection: 'column',
gap: '16px',
color: theme.palette.text01,
textAlign: 'center'
},
emptyIcon: {
width: '100px',
padding: '16px',
'& svg': {
width: '100%',
height: 'auto'
}
},
emptyState: {
...withPixelLineHeight(theme.typography.bodyLongBold),
color: theme.palette.text02
}
};
});
/**
* Component that displays the subtitles history in a scrollable list.
*
* @returns {JSX.Element} - The ClosedCaptionsTab component.
*/
export default function ClosedCaptionsTab() {
const { classes, theme } = useStyles();
const dispatch = useDispatch();
const { t } = useTranslation();
const subtitles = useSelector((state: IReduxState) => state['features/subtitles'].subtitlesHistory);
const language = useSelector((state: IReduxState) => state['features/subtitles']._language);
const selectedLanguage = language?.replace('translation-languages:', '');
const _isTranscribing = useSelector(isTranscribing);
const _canStartSubtitles = useSelector(canStartSubtitles);
const [ isButtonPressed, setButtonPressed ] = useState(false);
const filteredSubtitles = useMemo(() => {
// First, create a map of transcription messages by message ID
const transcriptionMessages = new Map(
subtitles
.filter(s => s.isTranscription)
.map(s => [ s.id, s ])
);
if (!selectedLanguage) {
// When no language is selected, show all original transcriptions
return Array.from(transcriptionMessages.values());
}
// Then, create a map of translation messages by message ID
const translationMessages = new Map(
subtitles
.filter(s => !s.isTranscription && s.language === selectedLanguage)
.map(s => [ s.id, s ])
);
// When a language is selected, for each transcription message:
// 1. Use its translation if available
// 2. Fall back to the original transcription if no translation exists
return Array.from(transcriptionMessages.values())
.filter((m: ISubtitle) => !m.interim)
.map(m => translationMessages.get(m.id) ?? m);
}, [ subtitles, selectedLanguage ]);
const groupedSubtitles = useMemo(() =>
groupMessagesBySender(filteredSubtitles), [ filteredSubtitles ]);
const startClosedCaptions = useCallback(() => {
if (isButtonPressed) {
return;
}
dispatch(setRequestingSubtitles(true, false, null));
setButtonPressed(true);
}, [ dispatch, isButtonPressed, setButtonPressed ]);
if (!_isTranscribing) {
if (_canStartSubtitles) {
return (
<div className = { classes.emptyContent }>
<Button
accessibilityLabel = 'Start Closed Captions'
appearance = 'primary'
disabled = { isButtonPressed }
labelKey = 'closedCaptionsTab.startClosedCaptionsButton'
onClick = { startClosedCaptions }
size = 'large'
type = 'primary' />
</div>
);
}
if (isButtonPressed) {
setButtonPressed(false);
}
return (
<div className = { classes.emptyContent }>
<Icon
className = { classes.emptyIcon }
color = { theme.palette.icon03 }
src = { IconSubtitles } />
<span className = { classes.emptyState }>
{ t('closedCaptionsTab.emptyState') }
</span>
</div>
);
}
if (isButtonPressed) {
setButtonPressed(false);
}
return (
<div className = { classes.container }>
<LanguageSelector />
<div className = { classes.messagesContainer }>
<SubtitlesMessagesContainer
groups = { groupedSubtitles }
messages = { filteredSubtitles } />
</div>
</div>
);
}


@@ -20,11 +20,6 @@ interface IProps extends WithTranslation {
*/
dispatch: IStore['dispatch'];
/**
* Whether CC tab is enabled or not.
*/
isCCTabEnabled: boolean;
/**
* Whether the polls feature is enabled or not.
*/
@@ -74,26 +69,16 @@ class DisplayNameForm extends Component<IProps, IState> {
* @returns {ReactElement}
*/
override render() {
const { isCCTabEnabled, isPollsEnabled, t } = this.props;
let title = 'chat.nickname.title';
if (isCCTabEnabled && isPollsEnabled) {
title = 'chat.nickname.titleWithPollsAndCC';
} else if (isCCTabEnabled) {
title = 'chat.nickname.titleWithCC';
} else if (isPollsEnabled) {
title = 'chat.nickname.titleWithPolls';
}
const { isPollsEnabled, t } = this.props;
return (
<div id = 'nickname'>
<form onSubmit = { this._onSubmit }>
<Input
accessibilityLabel = { t(title) }
accessibilityLabel = { t('chat.nickname.title') }
autoFocus = { true }
id = 'nickinput'
label = { t(title) }
label = { t(isPollsEnabled ? 'chat.nickname.titleWithPolls' : 'chat.nickname.title') }
name = 'name'
onChange = { this._onDisplayNameChange }
placeholder = { t('chat.nickname.popover') }


@@ -1,19 +1,13 @@
import { throttle } from 'lodash-es';
import React, { Component, RefObject } from 'react';
import React, { RefObject } from 'react';
import { scrollIntoView } from 'seamless-scroll-polyfill';
import { groupMessagesBySender } from '../../../base/util/messageGrouping';
import { MESSAGE_TYPE_LOCAL, MESSAGE_TYPE_REMOTE } from '../../constants';
import { IMessage } from '../../types';
import AbstractMessageContainer, { IProps } from '../AbstractMessageContainer';
import ChatMessageGroup from './ChatMessageGroup';
import NewMessagesButton from './NewMessagesButton';
interface IProps {
messages: IMessage[];
}
interface IState {
/**
@@ -35,9 +29,9 @@ interface IState {
/**
* Displays all received chat messages, grouped by sender.
*
* @augments Component
* @augments AbstractMessageContainer
*/
export default class MessageContainer extends Component<IProps, IState> {
export default class MessageContainer extends AbstractMessageContainer<IProps, IState> {
/**
* Component state used to decide when the hasNewMessages button should appear
* and where to scroll when clicking on the hasNewMessages button.
@@ -65,10 +59,6 @@ export default class MessageContainer extends Component<IProps, IState> {
*/
_bottomListObserver: IntersectionObserver;
static defaultProps = {
messages: [] as IMessage[]
};
/**
* Initializes a new {@code MessageContainer} instance.
*
@@ -96,15 +86,14 @@ export default class MessageContainer extends Component<IProps, IState> {
*/
override render() {
const groupedMessages = this._getMessagesGroupedBySender();
const content = groupedMessages.map((group, index) => {
const { messages } = group;
const messageType = messages[0]?.messageType;
const messages = groupedMessages.map((group, index) => {
const messageType = group[0]?.messageType;
return (
<ChatMessageGroup
className = { messageType || MESSAGE_TYPE_REMOTE }
key = { index }
messages = { messages } />
messages = { group } />
);
});
@@ -117,7 +106,7 @@ export default class MessageContainer extends Component<IProps, IState> {
ref = { this._messageListRef }
role = 'log'
tabIndex = { 0 }>
{ content }
{ messages }
{ !this.state.isScrolledToBottom && this.state.hasNewMessages
&& <NewMessagesButton
@@ -324,14 +313,4 @@ export default class MessageContainer extends Component<IProps, IState> {
return false;
}
/**
* Returns an array of message groups, where each group is an array of messages
* grouped by the sender.
*
* @returns {Array<Array<Object>>}
*/
_getMessagesGroupedBySender() {
return groupMessagesBySender(this.props.messages);
}
}


@@ -1,97 +0,0 @@
import React from 'react';
import { useSelector } from 'react-redux';
import { makeStyles } from 'tss-react/mui';
import { getParticipantDisplayName } from '../../../base/participants/functions';
import { withPixelLineHeight } from '../../../base/styles/functions.web';
import { ISubtitle } from '../../../subtitles/types';
/**
* Props for the SubtitleMessage component.
*/
interface IProps extends ISubtitle {
/**
* Whether to show the display name of the participant.
*/
showDisplayName: boolean;
}
/**
* The styles for the SubtitleMessage component.
*/
const useStyles = makeStyles()(theme => {
return {
messageContainer: {
backgroundColor: theme.palette.ui02,
borderRadius: '4px 12px 12px 12px',
padding: '12px',
maxWidth: '100%',
marginTop: '4px',
boxSizing: 'border-box',
display: 'inline-flex'
},
messageContent: {
maxWidth: '100%',
overflow: 'hidden',
flex: 1
},
messageHeader: {
...withPixelLineHeight(theme.typography.labelBold),
color: theme.palette.text02,
whiteSpace: 'nowrap',
textOverflow: 'ellipsis',
overflow: 'hidden',
marginBottom: theme.spacing(1),
maxWidth: '130px'
},
messageText: {
...withPixelLineHeight(theme.typography.bodyShortRegular),
color: theme.palette.text01,
whiteSpace: 'pre-wrap',
wordBreak: 'break-word'
},
timestamp: {
...withPixelLineHeight(theme.typography.labelRegular),
color: theme.palette.text03,
marginTop: theme.spacing(1)
},
interim: {
opacity: 0.7
}
};
});
/**
* Component that renders a single subtitle message with the participant's name,
* message content, and timestamp.
*
* @param {IProps} props - The component props.
* @returns {JSX.Element} - The rendered subtitle message.
*/
export default function SubtitleMessage({ participantId, text, timestamp, interim, showDisplayName }: IProps) {
const { classes } = useStyles();
const participantName = useSelector((state: any) =>
getParticipantDisplayName(state, participantId));
return (
<div className = { `${classes.messageContainer} ${interim ? classes.interim : ''}` }>
<div className = { classes.messageContent }>
{showDisplayName && (
<div className = { classes.messageHeader }>
{participantName}
</div>
)}
<div className = { classes.messageText }>{text}</div>
<div className = { classes.timestamp }>
{new Date(timestamp).toLocaleTimeString()}
</div>
</div>
</div>
);
}


@@ -1,76 +0,0 @@
import React from 'react';
import { makeStyles } from 'tss-react/mui';
import Avatar from '../../../base/avatar/components/Avatar';
import { ISubtitle } from '../../../subtitles/types';
import SubtitleMessage from './SubtitleMessage';
/**
* Props for the SubtitlesGroup component.
*/
interface IProps {
/**
* Array of subtitle messages to be displayed in this group.
*/
messages: ISubtitle[];
/**
* The ID of the participant who sent these subtitles.
*/
senderId: string;
}
const useStyles = makeStyles()(theme => {
return {
groupContainer: {
display: 'flex',
marginBottom: theme.spacing(3)
},
avatar: {
marginRight: theme.spacing(2),
alignSelf: 'flex-start'
},
messagesContainer: {
display: 'flex',
flexDirection: 'column',
flex: 1,
maxWidth: 'calc(100% - 56px)', // 40px avatar + 16px margin
gap: theme.spacing(1)
}
};
});
/**
* Component that renders a group of subtitle messages from the same sender.
*
* @param {IProps} props - The props for the component.
* @returns {JSX.Element} - A React component rendering a group of subtitles.
*/
export function SubtitlesGroup({ messages, senderId }: IProps) {
const { classes } = useStyles();
if (!messages.length) {
return null;
}
return (
<div className = { classes.groupContainer }>
<Avatar
className = { classes.avatar }
participantId = { senderId }
size = { 32 } />
<div className = { classes.messagesContainer }>
{messages.map((message, index) => (
<SubtitleMessage
key = { `${message.timestamp}-${message.id}` }
showDisplayName = { index === 0 }
{ ...message } />
))}
</div>
</div>
);
}


@@ -1,154 +0,0 @@
import React, { useCallback, useEffect, useRef, useState } from 'react';
import { scrollIntoView } from 'seamless-scroll-polyfill';
import { makeStyles } from 'tss-react/mui';
import { ISubtitle } from '../../../subtitles/types';
import NewMessagesButton from './NewMessagesButton';
import { SubtitlesGroup } from './SubtitlesGroup';
interface IProps {
groups: Array<{
messages: ISubtitle[];
senderId: string;
}>;
messages: ISubtitle[];
}
/**
* The padding value used for the message list.
*
* @constant {string}
*/
const MESSAGE_LIST_PADDING = '16px';
const useStyles = makeStyles()(() => {
return {
container: {
flex: 1,
overflow: 'hidden',
position: 'relative',
height: '100%'
},
messagesList: {
height: '100%',
overflowY: 'auto',
padding: MESSAGE_LIST_PADDING,
boxSizing: 'border-box'
}
};
});
/**
* Component that handles the display and scrolling behavior of subtitles messages.
* It provides auto-scrolling for new messages and a button to jump to new messages
* when the user has scrolled up.
*
* @returns {JSX.Element} - A React component displaying subtitles messages with scroll functionality.
*/
export function SubtitlesMessagesContainer({ messages, groups }: IProps) {
const { classes } = useStyles();
const [ hasNewMessages, setHasNewMessages ] = useState(false);
const [ isScrolledToBottom, setIsScrolledToBottom ] = useState(true);
const [ observer, setObserver ] = useState<IntersectionObserver | null>(null);
const messagesEndRef = useRef<HTMLDivElement>(null);
const scrollToElement = useCallback((withAnimation: boolean, element: Element | null) => {
const scrollTo = element ? element : messagesEndRef.current;
const block = element ? 'end' : 'nearest';
scrollIntoView(scrollTo as Element, {
behavior: withAnimation ? 'smooth' : 'auto',
block
});
}, [ messagesEndRef.current ]);
const handleNewMessagesClick = useCallback(() => {
scrollToElement(true, null);
}, [ scrollToElement ]);
const handleIntersectBottomList = (entries: IntersectionObserverEntry[]) => {
entries.forEach((entry: IntersectionObserverEntry) => {
if (entry.isIntersecting) {
setIsScrolledToBottom(true);
setHasNewMessages(false);
}
if (!entry.isIntersecting) {
setIsScrolledToBottom(false);
}
});
};
const createBottomListObserver = () => {
const target = document.querySelector('#subtitles-messages-end');
if (target) {
const newObserver = new IntersectionObserver(
handleIntersectBottomList, {
root: document.querySelector('#subtitles-messages-list'),
rootMargin: MESSAGE_LIST_PADDING,
threshold: 1
});
setObserver(newObserver);
newObserver.observe(target);
}
};
useEffect(() => {
scrollToElement(false, null);
createBottomListObserver();
return () => {
if (observer) {
observer.disconnect();
setObserver(null);
}
};
}, []);
const previousMessages = useRef(messages);
useEffect(() => {
const newMessages = messages.filter(message => !previousMessages.current.includes(message));
if (newMessages.length > 0) {
if (isScrolledToBottom) {
scrollToElement(false, null);
} else {
setHasNewMessages(true);
}
}
previousMessages.current = messages;
},
// isScrolledToBottom is not a dependency because we need neither to show the new messages
// button nor to scroll to the bottom when the user has scrolled up.
[ messages, scrollToElement ]);
return (
<div
className = { classes.container }
id = 'subtitles-messages-container'>
<div
className = { classes.messagesList }
id = 'subtitles-messages-list'>
{groups.map(group => (
<SubtitlesGroup
key = { `${group.senderId}-${group.messages[0].timestamp}` }
messages = { group.messages }
senderId = { group.senderId } />
))}
{ !isScrolledToBottom && hasNewMessages && (
<NewMessagesButton
onGoToFirstUnreadMessage = { handleNewMessagesClick } />
)}
<div
id = 'subtitles-messages-end'
ref = { messagesEndRef } />
</div>
</div>
);
}


@@ -39,11 +39,10 @@ export const SMALL_WIDTH_THRESHOLD = 580;
*/
export const LOBBY_CHAT_MESSAGE = 'LOBBY_CHAT_MESSAGE';
export enum ChatTabs {
CHAT = 'chat-tab',
CLOSED_CAPTIONS = 'cc-tab',
POLLS = 'polls-tab'
}
export const CHAT_TABS = {
POLLS: 'polls-tab',
CHAT: 'chat-tab'
};
/**
* Formatter string to display the message timestamp.


@@ -40,12 +40,11 @@ import {
OPEN_CHAT,
SEND_MESSAGE,
SEND_REACTION,
SET_FOCUSED_TAB
SET_IS_POLL_TAB_FOCUSED
} from './actionTypes';
import { addMessage, addMessageReaction, clearMessages, closeChat, setPrivateMessageRecipient } from './actions.any';
import { ChatPrivacyDialog } from './components';
import {
ChatTabs,
INCOMING_MSG_SOUND_ID,
LOBBY_CHAT_MESSAGE,
MESSAGE_TYPE_ERROR,
@@ -104,15 +103,15 @@ MiddlewareRegistry.register(store => next => action => {
break;
case CLOSE_CHAT: {
const { focusedTab } = getState()['features/chat'];
const isPollTabOpen = getState()['features/chat'].isPollsTabFocused;
if (focusedTab === ChatTabs.CHAT) {
unreadCount = 0;
unreadCount = 0;
if (typeof APP !== 'undefined') {
APP.API.notifyChatUpdated(unreadCount, false);
}
} else if (focusedTab === ChatTabs.POLLS) {
if (typeof APP !== 'undefined') {
APP.API.notifyChatUpdated(unreadCount, false);
}
if (isPollTabOpen) {
dispatch(resetNbUnreadPollsMessages());
}
break;
@@ -162,34 +161,32 @@ MiddlewareRegistry.register(store => next => action => {
break;
}
case SET_FOCUSED_TAB:
case OPEN_CHAT: {
const focusedTab = action.tabId || getState()['features/chat'].focusedTab;
unreadCount = 0;
if (focusedTab === ChatTabs.CHAT) {
unreadCount = 0;
if (typeof APP !== 'undefined') {
APP.API.notifyChatUpdated(unreadCount, true);
}
const { privateMessageRecipient } = store.getState()['features/chat'];
if (
isSendGroupChatDisabled(store.getState())
&& privateMessageRecipient
&& !action.participant
) {
const participant = getParticipantById(store.getState(), privateMessageRecipient.id);
if (participant) {
action.participant = participant;
}
}
} else if (focusedTab === ChatTabs.POLLS) {
dispatch(resetNbUnreadPollsMessages());
if (typeof APP !== 'undefined') {
APP.API.notifyChatUpdated(unreadCount, true);
}
const { privateMessageRecipient } = store.getState()['features/chat'];
if (
isSendGroupChatDisabled(store.getState())
&& privateMessageRecipient
&& !action.participant
) {
const participant = getParticipantById(store.getState(), privateMessageRecipient.id);
if (participant) {
action.participant = participant;
}
}
break;
}
case SET_IS_POLL_TAB_FOCUSED: {
dispatch(resetNbUnreadPollsMessages());
break;
}
@@ -259,6 +256,7 @@ MiddlewareRegistry.register(store => next => action => {
lobbyChat: false
}, false, true);
}
break;
}
}
@@ -532,7 +530,8 @@ function _handleReceivedMessage({ dispatch, getState }: IStore,
// skip message notifications on join (the messages having timestamp - coming from the history)
const shouldShowNotification = userSelectedNotifications?.['notify.chatMessages']
&& !hasRead && !isReaction && (!timestamp || lobbyChat);
&& !hasRead && !isReaction
&& (!timestamp || lobbyChat);
if (isGuest) {
displayNameToShow = `${displayNameToShow} ${i18next.t('visitors.chatIndicator')}`;


@@ -1,6 +1,5 @@
import { ILocalParticipant, IParticipant } from '../base/participants/types';
import ReducerRegistry from '../base/redux/ReducerRegistry';
import { ChatTabs } from './constants';
import {
ADD_MESSAGE,
@@ -10,10 +9,10 @@ import {
EDIT_MESSAGE,
OPEN_CHAT,
REMOVE_LOBBY_CHAT_PARTICIPANT,
SET_IS_POLL_TAB_FOCUSED,
SET_LOBBY_CHAT_ACTIVE_STATE,
SET_LOBBY_CHAT_RECIPIENT,
SET_PRIVATE_MESSAGE_RECIPIENT,
SET_FOCUSED_TAB
SET_PRIVATE_MESSAGE_RECIPIENT
} from './actionTypes';
import { IMessage } from './types';
import { UPDATE_CONFERENCE_METADATA } from '../base/conference/actionTypes';
@@ -21,20 +20,21 @@ import { UPDATE_CONFERENCE_METADATA } from '../base/conference/actionTypes';
const DEFAULT_STATE = {
groupChatWithPermissions: false,
isOpen: false,
isPollsTabFocused: false,
lastReadMessage: undefined,
messages: [],
reactions: {},
nbUnreadMessages: 0,
privateMessageRecipient: undefined,
lobbyMessageRecipient: undefined,
isLobbyChatActive: false,
focusedTab: ChatTabs.CHAT
isLobbyChatActive: false
};
export interface IChatState {
focusedTab: ChatTabs;
groupChatWithPermissions: boolean;
isLobbyChatActive: boolean;
isOpen: boolean;
isPollsTabFocused: boolean;
lastReadMessage?: IMessage;
lobbyMessageRecipient?: {
id: string;
@@ -78,7 +78,7 @@ ReducerRegistry.register<IChatState>('features/chat', (state = DEFAULT_STATE, ac
...state,
lastReadMessage:
action.hasRead ? newMessage : state.lastReadMessage,
nbUnreadMessages: state.focusedTab !== ChatTabs.CHAT ? state.nbUnreadMessages + 1 : state.nbUnreadMessages,
nbUnreadMessages: state.isPollsTabFocused ? state.nbUnreadMessages + 1 : state.nbUnreadMessages,
messages
};
}
@@ -170,6 +170,13 @@ ReducerRegistry.register<IChatState>('features/chat', (state = DEFAULT_STATE, ac
isLobbyChatActive: false
};
case SET_IS_POLL_TAB_FOCUSED: {
return {
...state,
isPollsTabFocused: action.isPollsTabFocused,
nbUnreadMessages: 0
}; }
case SET_LOBBY_CHAT_RECIPIENT:
return {
...state,
@@ -208,15 +215,7 @@ ReducerRegistry.register<IChatState>('features/chat', (state = DEFAULT_STATE, ac
groupChatWithPermissions: Boolean(metadata.permissions.groupChatRestricted)
};
}
break;
}
case SET_FOCUSED_TAB:
return {
...state,
focusedTab: action.tabId,
nbUnreadMessages: action.tabId === ChatTabs.CHAT ? 0 : state.nbUnreadMessages
};
}
return state;


@@ -180,6 +180,7 @@ export interface IDynamicBrandingState {
requireRecordingConsent?: boolean;
sharedVideoAllowedURLDomains?: Array<string>;
showGiphyIntegration?: boolean;
skipRecordingConsentInMeeting?: boolean;
supportUrl?: string;
useDynamicBrandingData: boolean;
virtualBackgrounds: Array<Image>;
@@ -206,9 +207,10 @@ ReducerRegistry.register<IDynamicBrandingState>(STORE_NAME, (state = DEFAULT_STA
muiBrandedTheme,
pollCreationRequiresPermission,
premeetingBackground,
requireRecordingConsent,
sharedVideoAllowedURLDomains,
showGiphyIntegration,
requireRecordingConsent,
skipRecordingConsentInMeeting,
supportUrl,
virtualBackgrounds
} = action.value;
@@ -228,9 +230,10 @@ ReducerRegistry.register<IDynamicBrandingState>(STORE_NAME, (state = DEFAULT_STA
muiBrandedTheme,
pollCreationRequiresPermission,
premeetingBackground,
requireRecordingConsent,
sharedVideoAllowedURLDomains,
showGiphyIntegration,
requireRecordingConsent,
skipRecordingConsentInMeeting,
supportUrl,
customizationFailed: false,
customizationReady: true,


@@ -249,8 +249,6 @@ class AddPeopleDialog extends AbstractAddPeopleDialog<IProps, IState> {
const { item } = flatListItem;
switch (item.type) {
// isCORSAvatarURL in this case is false
case INVITE_TYPES.PHONE:
return {
avatar: IconPhoneRinging,


@@ -17,7 +17,6 @@ import { FILMSTRIP_BREAKPOINT } from '../../filmstrip/constants';
import { getVerticalViewMaxWidth, isFilmstripResizable } from '../../filmstrip/functions.web';
import SharedVideo from '../../shared-video/components/web/SharedVideo';
import Captions from '../../subtitles/components/web/Captions';
import { areClosedCaptionsEnabled } from '../../subtitles/functions.any';
import { setTileView } from '../../video-layout/actions.web';
import Whiteboard from '../../whiteboard/components/web/Whiteboard';
import { isWhiteboardEnabled } from '../../whiteboard/functions';
@@ -100,11 +99,6 @@ interface IProps {
*/
_showDominantSpeakerBadge: boolean;
/**
* Whether or not to show subtitles button.
*/
_showSubtitles?: boolean;
/**
* The width of the vertical filmstrip (user resized).
*/
@@ -205,8 +199,7 @@ class LargeVideo extends Component<IProps> {
_isDisplayNameVisible,
_noAutoPlayVideo,
_showDominantSpeakerBadge,
_whiteboardEnabled,
_showSubtitles
_whiteboardEnabled
} = this.props;
const style = this._getCustomStyles();
const className = `videocontainer${_isChatOpen ? ' shift-right' : ''}`;
@@ -254,8 +247,8 @@ class LargeVideo extends Component<IProps> {
playsInline = { true } /* for Safari on iOS to work */ />
</div>
</div>
{ (!interfaceConfig.DISABLE_TRANSCRIPTION_SUBTITLES && _showSubtitles)
&& <Captions /> }
{ interfaceConfig.DISABLE_TRANSCRIPTION_SUBTITLES
|| <Captions /> }
{
_isDisplayNameVisible
&& (
@@ -383,7 +376,7 @@ function _mapStateToProps(state: IReduxState) {
_customBackgroundColor: backgroundColor,
_customBackgroundImageUrl: backgroundImageUrl,
_displayScreenSharingPlaceholder:
Boolean(isLocalScreenshareOnLargeVideo && !seeWhatIsBeingShared && !isSpotTV()),
Boolean(isLocalScreenshareOnLargeVideo && !seeWhatIsBeingShared && !isSpotTV(state)),
_hideSelfView: getHideSelfView(state),
_isChatOpen: isChatOpen,
_isDisplayNameVisible: isDisplayNameVisible(state),
@@ -394,8 +387,6 @@ function _mapStateToProps(state: IReduxState) {
_resizableFilmstrip: isFilmstripResizable(state),
_seeWhatIsBeingShared: Boolean(seeWhatIsBeingShared),
_showDominantSpeakerBadge: !hideDominantSpeakerBadge,
_showSubtitles: areClosedCaptionsEnabled(state)
&& Boolean(state['features/base/settings'].showSubtitlesOnStage),
_verticalFilmstripWidth: verticalFilmstripWidth.current,
_verticalViewMaxWidth: getVerticalViewMaxWidth(state),
_visibleFilmstrip: visible,


@@ -9,9 +9,9 @@ import {
getClientHeight,
getClientWidth
} from '../../../../../base/modal/components/functions';
import { setFocusedTab } from '../../../../../chat/actions.any';
import { setIsPollsTabFocused } from '../../../../../chat/actions.native';
// @ts-ignore
import Chat from '../../../../../chat/components/native/Chat';
import { ChatTabs } from '../../../../../chat/constants';
import { resetNbUnreadPollsMessages } from '../../../../../polls/actions';
import PollsPane from '../../../../../polls/components/native/PollsPane';
import { screen } from '../../../routes';
@@ -23,8 +23,8 @@ const ChatAndPolls = () => {
const clientHeight = useSelector(getClientHeight);
const clientWidth = useSelector(getClientWidth);
const dispatch = useDispatch();
const { focusedTab } = useSelector((state: IReduxState) => state['features/chat']);
const initialRouteName = focusedTab === ChatTabs.POLLS
const { isPollsTabFocused } = useSelector((state: IReduxState) => state['features/chat']);
const initialRouteName = isPollsTabFocused
? screen.conference.chatandpolls.tab.polls
: screen.conference.chatandpolls.tab.chat;
@@ -42,7 +42,7 @@ const ChatAndPolls = () => {
component = { Chat }
listeners = {{
tabPress: () => {
dispatch(setFocusedTab(ChatTabs.CHAT));
dispatch(setIsPollsTabFocused(false));
}
}}
name = { screen.conference.chatandpolls.tab.chat } />
@@ -50,7 +50,7 @@ const ChatAndPolls = () => {
component = { PollsPane }
listeners = {{
tabPress: () => {
dispatch(setFocusedTab(ChatTabs.POLLS));
dispatch(setIsPollsTabFocused(true));
dispatch(resetNbUnreadPollsMessages);
}
}}


@@ -33,7 +33,6 @@ class PageReloadOverlay extends AbstractPageReloadOverlay<IProps> {
className = 'inlay'
role = 'dialog'>
<span
aria-level = { 1 }
className = 'reload_overlay_title'
id = 'reload_overlay_title'
role = 'heading'>


@@ -130,7 +130,6 @@ function MeetingParticipants({
accessibilityLabel = { t('participantsPane.search') }
className = { styles.search }
clearable = { true }
hiddenDescription = { t('participantsPane.searchDescription') }
id = 'participants-search-input'
onChange = { setSearchString }
placeholder = { t('participantsPane.search') }


@@ -10,7 +10,6 @@ import JitsiScreen from '../../../base/modal/components/JitsiScreen';
import { StyleType } from '../../../base/styles/functions.any';
import Button from '../../../base/ui/components/native/Button';
import { BUTTON_TYPES } from '../../../base/ui/constants.native';
import { ChatTabs } from '../../../chat/constants';
import { TabBarLabelCounter }
from '../../../mobile/navigation/components/TabBarLabelCounter';
import AbstractPollsPane from '../AbstractPollsPane';
@@ -23,7 +22,7 @@ import { pollsStyles } from './styles';
const PollsPane = (props: AbstractProps) => {
const { createMode, isCreatePollsDisabled, onCreate, setCreateMode, t } = props;
const navigation = useNavigation();
const isPollsTabFocused = useSelector((state: IReduxState) => state['features/chat'].focusedTab === ChatTabs.POLLS);
const { isPollsTabFocused } = useSelector((state: IReduxState) => state['features/chat']);
const { nbUnreadPolls } = useSelector((state: IReduxState) => state['features/polls']);
useEffect(() => {


@@ -4,7 +4,7 @@ import { getCurrentConference } from '../base/conference/functions';
import MiddlewareRegistry from '../base/redux/MiddlewareRegistry';
import StateListenerRegistry from '../base/redux/StateListenerRegistry';
import { playSound } from '../base/sounds/actions';
import { ChatTabs, INCOMING_MSG_SOUND_ID } from '../chat/constants';
import { INCOMING_MSG_SOUND_ID } from '../chat/constants';
import { arePollsDisabled } from '../conference/functions.any';
import { showNotification } from '../notifications/actions';
import { NOTIFICATION_TIMEOUT_TYPE, NOTIFICATION_TYPE } from '../notifications/constants';
@@ -96,7 +96,7 @@ MiddlewareRegistry.register(({ dispatch, getState }) => next => action => {
}
const isChatOpen: boolean = state['features/chat'].isOpen;
const isPollsTabFocused: boolean = state['features/chat'].focusedTab === ChatTabs.POLLS;
const isPollsTabFocused: boolean = state['features/chat'].isPollsTabFocused;
// Finally, we notify the user that they received a new poll if their pane is not open
if (action.notify && (!isChatOpen || !isPollsTabFocused)) {


@@ -67,9 +67,7 @@ function DeviceStatus() {
role = 'alert'
tabIndex = { -1 }>
{!hasError && <div className = { classes.indicator } />}
<span
aria-level = { 3 }
role = 'heading'>
<span role = 'heading'>
{hasError ? t('prejoin.errorNoPermissions') : t(deviceStatusText ?? '')}
</span>
</div>


@@ -8,6 +8,16 @@
*/
export const CLEAR_RECORDING_SESSIONS = 'CLEAR_RECORDING_SESSIONS';
/**
* The type of Redux action which marks a session ID as consent requested.
*
* {
* type: MARK_CONSENT_REQUESTED,
* sessionId: string
* }
*/
export const MARK_CONSENT_REQUESTED = 'MARK_CONSENT_REQUESTED';
/**
* The type of Redux action which updates the current known state of a recording
* session.


@@ -20,6 +20,7 @@ import { isRecorderTranscriptionsRunning } from '../transcribing/functions';
import {
CLEAR_RECORDING_SESSIONS,
MARK_CONSENT_REQUESTED,
RECORDING_SESSION_UPDATED,
SET_MEETING_HIGHLIGHT_BUTTON_STATE,
SET_PENDING_RECORDING_NOTIFICATION_UID,
@@ -285,19 +286,10 @@ export function showStartedRecordingNotification(
// add the option to copy recording link
if (showRecordingLink) {
const actions = [
...notifyProps.dialogProps.customActionNameKey ?? [],
'recording.copyLink'
];
const handlers = [
...notifyProps.dialogProps.customActionHandler ?? [],
() => copyText(link)
];
notifyProps.dialogProps = {
...notifyProps.dialogProps,
customActionNameKey: actions,
customActionHandler: handlers,
customActionNameKey: [ 'recording.copyLink' ],
customActionHandler: [ () => copyText(link) ],
titleKey: 'recording.on',
descriptionKey: 'recording.linkGenerated'
};
@@ -476,3 +468,17 @@ export function showStartRecordingNotificationWithCallback(openRecordingDialog:
}, NOTIFICATION_TIMEOUT_TYPE.EXTRA_LONG));
};
}
/**
* Marks the given session as consent requested. No further consent requests will be
* made for this session.
*
* @param {string} sessionId - The session id.
* @returns {Object}
*/
export function markConsentRequested(sessionId: string) {
return {
type: MARK_CONSENT_REQUESTED,
sessionId
};
}


@@ -3,7 +3,6 @@ import { IStore } from '../../../app/types';
interface ILocalRecordingManager {
addAudioTrackToLocalRecording: (track: any) => void;
isRecordingLocally: () => boolean;
isSupported: () => boolean;
selfRecording: {
on: boolean;
withVideo: boolean;
@@ -41,15 +40,6 @@ const LocalRecordingManager: ILocalRecordingManager = {
*/
async startLocalRecording() { }, // eslint-disable-line @typescript-eslint/no-empty-function
/**
* Whether or not local recording is supported.
*
* @returns {boolean}
*/
isSupported() {
return false;
},
/**
* Whether or not we're currently recording locally.
*


@@ -1,12 +1,11 @@
import i18next from 'i18next';
import { v4 as uuidV4 } from 'uuid';
import fixWebmDuration from 'webm-duration-fix';
import { IStore } from '../../../app/types';
import { getRoomName } from '../../../base/conference/functions';
import { MEDIA_TYPE } from '../../../base/media/constants';
import { getLocalTrack, getTrackState } from '../../../base/tracks/functions';
import { isMobileBrowser } from '../../../base/environment/utils';
import { browser } from '../../../base/lib-jitsi-meet';
import { isEmbedded } from '../../../base/util/embedUtils';
import { stopLocalVideoRecording } from '../../actions.any';
@@ -19,54 +18,63 @@ interface ILocalRecordingManager {
addAudioTrackToLocalRecording: (track: MediaStreamTrack) => void;
audioContext: AudioContext | undefined;
audioDestination: MediaStreamAudioDestinationNode | undefined;
fileHandle: FileSystemFileHandle | undefined;
getFilename: () => string;
initializeAudioMixer: () => void;
isRecordingLocally: () => boolean;
isSupported: () => boolean;
mediaType: string;
mixAudioStream: (stream: MediaStream) => void;
recorder: MediaRecorder | undefined;
recordingData: Blob[];
roomName: string;
saveRecording: (recordingData: Blob[], filename: string) => void;
selfRecording: ISelfRecording;
startLocalRecording: (store: IStore, onlySelf: boolean) => Promise<void>;
stopLocalRecording: () => void;
stream: MediaStream | undefined;
writableStream: FileSystemWritableFileStream | undefined;
totalSize: number;
}
/**
* We want to use the MP4 container because it avoids the problem of the resulting
* file not being seekable.
*
* The choice of VP9 as the video codec and Opus as the audio codec is for compatibility.
* While Chrome does support avc1 and avc3 (we'd need the latter, since the resolution
* can change), it's not supported across the board.
*/
const PREFERRED_MEDIA_TYPE = 'video/mp4;codecs=vp9,opus';
const getMimeType = (): string => {
const possibleTypes = [
'video/webm;codecs=vp8'
];
for (const type of possibleTypes) {
if (MediaRecorder.isTypeSupported(type)) {
return type;
}
}
throw new Error('No MIME Type supported by MediaRecorder');
};
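The patch above swaps the hard-coded `PREFERRED_MEDIA_TYPE` constant for lazy probing through `MediaRecorder.isTypeSupported`. A minimal sketch of the same selection logic, with the browser check abstracted into a predicate so it can run outside a browser (the `pickMimeType` helper and the stub predicate are illustrative, not part of the patch):

```typescript
// Pick the first container/codec combination the recorder claims to support.
// `isSupported` stands in for MediaRecorder.isTypeSupported in a real browser.
function pickMimeType(candidates: string[], isSupported: (type: string) => boolean): string {
    for (const type of candidates) {
        if (isSupported(type)) {
            return type;
        }
    }
    throw new Error('No MIME Type supported by MediaRecorder');
}

// Stub that only "supports" WebM/VP8, mirroring the patch's single candidate.
const chosen = pickMimeType(
    [ 'video/mp4;codecs=vp9,opus', 'video/webm;codecs=vp8' ],
    type => type === 'video/webm;codecs=vp8'
);
console.log(chosen); // 'video/webm;codecs=vp8'
```

Caching the result in a module-level variable, as the patch does with `preferredMediaType`, avoids re-probing on every `mediaType` access.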
const VIDEO_BIT_RATE = 2500000; // 2.5Mbps in bits
const MAX_SIZE = 1073741824; // 1GB in bytes
// Lazily initialize.
let preferredMediaType: string;
const LocalRecordingManager: ILocalRecordingManager = {
recordingData: [],
recorder: undefined,
stream: undefined,
audioContext: undefined,
audioDestination: undefined,
roomName: '',
totalSize: MAX_SIZE,
selfRecording: {
on: false,
withVideo: false
},
fileHandle: undefined,
writableStream: undefined,
get mediaType() {
if (this.selfRecording.on && !this.selfRecording.withVideo) {
return 'audio/webm;';
}
if (!preferredMediaType) {
preferredMediaType = getMimeType();
}
return PREFERRED_MEDIA_TYPE;
return preferredMediaType;
},
/**
@@ -120,6 +128,27 @@ const LocalRecordingManager: ILocalRecordingManager = {
return `${this.roomName}_${timestamp}`;
},
/**
* Saves local recording to file.
*
* @param {Array} recordingData - The recording data.
* @param {string} filename - The name of the file.
* @returns {void}
*/
async saveRecording(recordingData, filename) {
// @ts-ignore
const blob = await fixWebmDuration(new Blob(recordingData, { type: this.mediaType }));
const url = URL.createObjectURL(blob);
const a = document.createElement('a');
const extension = this.mediaType.slice(this.mediaType.indexOf('/') + 1, this.mediaType.indexOf(';'));
a.style.display = 'none';
a.href = url;
a.download = `${filename}.${extension}`;
a.click();
},
/**
* Stops local recording.
*
@@ -131,10 +160,12 @@ const LocalRecordingManager: ILocalRecordingManager = {
this.recorder = undefined;
this.audioContext = undefined;
this.audioDestination = undefined;
this.writableStream?.close().then(() => {
this.fileHandle = undefined;
this.writableStream = undefined;
});
this.totalSize = MAX_SIZE;
setTimeout(() => {
if (this.recordingData.length > 0) {
this.saveRecording(this.recordingData, this.getFilename());
}
}, 1000);
}
},
@@ -148,23 +179,13 @@ const LocalRecordingManager: ILocalRecordingManager = {
async startLocalRecording(store, onlySelf) {
const { dispatch, getState } = store;
this.roomName = getRoomName(getState()) ?? '';
// Get a handle to the file we are going to write.
const options = {
startIn: 'downloads',
suggestedName: `${this.getFilename()}.mp4`,
};
// @ts-expect-error
this.fileHandle = await window.showSaveFilePicker(options);
this.writableStream = await this.fileHandle?.createWritable();
// @ts-ignore
const supportsCaptureHandle = Boolean(navigator.mediaDevices.setCaptureHandleConfig) && !isEmbedded();
const tabId = uuidV4();
this.selfRecording.on = onlySelf;
this.recordingData = [];
this.roomName = getRoomName(getState()) ?? '';
let gdmStream: MediaStream = new MediaStream();
const tracks = getTrackState(getState());
@@ -259,9 +280,13 @@ const LocalRecordingManager: ILocalRecordingManager = {
mimeType: this.mediaType,
videoBitsPerSecond: VIDEO_BIT_RATE
});
this.recorder.addEventListener('dataavailable', async e => {
if (this.recorder && e.data && e.data.size > 0) {
await this.writableStream?.write(e.data);
this.recorder.addEventListener('dataavailable', e => {
if (e.data && e.data.size > 0) {
this.recordingData.push(e.data);
this.totalSize -= e.data.size;
if (this.totalSize <= 0) {
dispatch(stopLocalVideoRecording());
}
}
});
@@ -283,22 +308,6 @@ const LocalRecordingManager: ILocalRecordingManager = {
this.recorder.start(5000);
},
/**
* Whether or not local recording is supported.
*
* @returns {boolean}
*/
isSupported() {
return browser.isChromiumBased()
&& !browser.isElectron()
&& !browser.isReactNative()
&& !isMobileBrowser()
// @ts-expect-error
&& typeof window.showSaveFilePicker !== 'undefined'
&& MediaRecorder.isTypeSupported(PREFERRED_MEDIA_TYPE);
},
/**
* Whether or not we're currently recording locally.
*

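The reworked `dataavailable` handler above buffers chunks in memory and decrements a 1 GB budget (`totalSize`), dispatching `stopLocalVideoRecording()` once it is exhausted, instead of streaming to a file handle. The bookkeeping can be sketched as a pure function (the `onChunk` helper and `BudgetState` shape are hypothetical names for illustration):

```typescript
const MAX_SIZE = 1073741824; // 1GB in bytes, same constant as the patch

interface BudgetState {
    chunks: number[];   // chunk sizes, standing in for the buffered Blob data
    remaining: number;  // bytes left before the recording must stop
}

// Mirrors the dataavailable handler: store the chunk, shrink the budget,
// and report whether the recording should be stopped.
function onChunk(state: BudgetState, size: number): boolean {
    if (size <= 0) {
        return false; // empty events are ignored, as in the patch
    }
    state.chunks.push(size);
    state.remaining -= size;
    return state.remaining <= 0;
}

const state: BudgetState = { chunks: [], remaining: MAX_SIZE };
onChunk(state, 500_000_000);                  // plenty of budget left
const mustStop = onChunk(state, 600_000_000); // budget now exceeded
console.log(mustStop); // true
```

With `recorder.start(5000)` producing a chunk roughly every five seconds at the 2.5 Mbps bit rate, the cap bounds worst-case memory use before `saveRecording` assembles the final Blob.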

@@ -1,8 +1,14 @@
import React, { useCallback } from 'react';
import { useDispatch } from 'react-redux';
import { useDispatch, useSelector } from 'react-redux';
import { useTranslation } from 'react-i18next';
import Dialog from 'react-native-dialog';
import ConfirmDialog from '../../../../base/dialog/components/native/ConfirmDialog';
import { setAudioUnmutePermissions, setVideoUnmutePermissions } from '../../../../base/media/actions';
import { setAudioMuted, setAudioUnmutePermissions, setVideoMuted, setVideoUnmutePermissions } from '../../../../base/media/actions';
import { VIDEO_MUTISM_AUTHORITY } from '../../../../base/media/constants';
import Link from '../../../../base/react/components/native/Link';
import { IReduxState } from '../../../../app/types';
import styles from '../styles.native';
/**
* Component that renders the dialog for explicit consent for recordings.
@@ -11,6 +17,10 @@ import { setAudioUnmutePermissions, setVideoUnmutePermissions } from '../../../.
*/
export default function RecordingConsentDialog() {
const dispatch = useDispatch();
const { t } = useTranslation();
const { recordings } = useSelector((state: IReduxState) => state['features/base/config']);
const { consentLearnMoreLink } = recordings ?? {};
const consent = useCallback(() => {
dispatch(setAudioUnmutePermissions(false, true));
@@ -19,12 +29,36 @@ export default function RecordingConsentDialog() {
return true;
}, []);
const consentAndUnmute = useCallback(() => {
dispatch(setAudioUnmutePermissions(false, true));
dispatch(setVideoUnmutePermissions(false, true));
dispatch(setAudioMuted(false, true));
dispatch(setVideoMuted(false, VIDEO_MUTISM_AUTHORITY.USER, true));
return true;
}, []);
return (
<ConfirmDialog
backLabel = { 'dialog.UnderstandAndUnmute' }
confirmLabel = { 'dialog.Understand' }
descriptionKey = { 'dialog.recordingInProgressDescription' }
isBackHidden = { false }
isCancelHidden = { true }
onBack = { consentAndUnmute }
onSubmit = { consent }
title = { 'dialog.recordingInProgressTitle' } />
title = { 'dialog.recordingInProgressTitle' }
verticalButtons = { true }>
<Dialog.Description>
{t('dialog.recordingInProgressDescriptionFirstHalf')}
{consentLearnMoreLink && (
<Link
style = { styles.learnMoreLink }
url = { consentLearnMoreLink }>
{`(${t('dialog.learnMore')})`}
</Link>
)}
{t('dialog.recordingInProgressDescriptionSecondHalf')}
</Dialog.Description>
</ConfirmDialog>
);
}


@@ -94,8 +94,11 @@ export default {
highlightDialogButtonsSpace: {
height: 16,
width: '100%'
},
learnMoreLink: {
color: BaseTheme.palette.link01,
fontWeight: 'bold'
}
};
/**


@@ -1,9 +1,18 @@
import React, { useCallback } from 'react';
import { useTranslation } from 'react-i18next';
import { useDispatch } from 'react-redux';
import { batch, useDispatch, useSelector } from 'react-redux';
import { setAudioUnmutePermissions, setVideoUnmutePermissions } from '../../../../base/media/actions';
import { IReduxState } from '../../../../app/types';
import { translateToHTML } from '../../../../base/i18n/functions';
import {
setAudioMuted,
setAudioUnmutePermissions,
setVideoMuted,
setVideoUnmutePermissions
} from '../../../../base/media/actions';
import { VIDEO_MUTISM_AUTHORITY } from '../../../../base/media/constants';
import Dialog from '../../../../base/ui/components/web/Dialog';
import { hideDialog } from '../../../../base/dialog/actions';
/**
* Component that renders the dialog for explicit consent for recordings.
@@ -13,14 +22,34 @@ import Dialog from '../../../../base/ui/components/web/Dialog';
export default function RecordingConsentDialog() {
const { t } = useTranslation();
const dispatch = useDispatch();
const { recordings } = useSelector((state: IReduxState) => state['features/base/config']);
const { consentLearnMoreLink } = recordings ?? {};
const learnMore = ` (<a href="${consentLearnMoreLink}" target="_blank" rel="noopener noreferrer">${t('dialog.learnMore')}</a>)`;
const consent = useCallback(() => {
dispatch(setAudioUnmutePermissions(false, true));
dispatch(setVideoUnmutePermissions(false, true));
batch(() => {
dispatch(setAudioUnmutePermissions(false, true));
dispatch(setVideoUnmutePermissions(false, true));
});
}, []);
const consentAndUnmute = useCallback(() => {
batch(() => {
dispatch(setAudioUnmutePermissions(false, true));
dispatch(setVideoUnmutePermissions(false, true));
dispatch(setAudioMuted(false, true));
dispatch(setVideoMuted(false, VIDEO_MUTISM_AUTHORITY.USER, true));
dispatch(hideDialog());
});
}, []);
return (
<Dialog
back = {{
hidden: false,
onClick: consentAndUnmute,
translationKey: 'dialog.UnderstandAndUnmute'
}}
cancel = {{ hidden: true }}
disableBackdropClose = { true }
disableEscape = { true }
@@ -28,9 +57,7 @@ export default function RecordingConsentDialog() {
ok = {{ translationKey: 'dialog.Understand' }}
onSubmit = { consent }
titleKey = 'dialog.recordingInProgressTitle'>
<div>
{t('dialog.recordingInProgressDescription')}
</div>
{ translateToHTML(t, 'dialog.recordingInProgressDescription', { learnMore }) }
</Dialog>
);
}


@@ -1,9 +1,10 @@
import i18next from 'i18next';
import { IReduxState, IStore } from '../app/types';
import { isMobileBrowser } from '../base/environment/utils';
import { MEET_FEATURES } from '../base/jwt/constants';
import { isJwtFeatureEnabled } from '../base/jwt/functions';
import { JitsiRecordingConstants } from '../base/lib-jitsi-meet';
import { JitsiRecordingConstants, browser } from '../base/lib-jitsi-meet';
import { getSoundFileSrc } from '../base/media/functions';
import { getLocalParticipant, getRemoteParticipants } from '../base/participants/functions';
import { registerSound, unregisterSound } from '../base/sounds/actions';
@@ -151,7 +152,8 @@ export function getSessionStatusToShow(state: IReduxState, mode: string): string
* @returns {boolean} - Whether local recording is supported or not.
*/
export function supportsLocalRecording() {
return LocalRecordingManager.isSupported();
return browser.isChromiumBased() && !browser.isElectron() && !isMobileBrowser()
&& navigator.product !== 'ReactNative';
}
/**
@@ -439,15 +441,18 @@ export function isLiveStreamingButtonVisible({
* @returns {boolean}
*/
export function shouldRequireRecordingConsent(recorderSession: any, state: IReduxState) {
const { requireRecordingConsent } = state['features/dynamic-branding'] || {};
const { requireConsent } = state['features/base/config'].recordings || {};
const { requireRecordingConsent, skipRecordingConsentInMeeting }
= state['features/dynamic-branding'] || {};
const { conference } = state['features/base/conference'] || {};
const { requireConsent, skipConsentInMeeting } = state['features/base/config'].recordings || {};
const { iAmRecorder } = state['features/base/config'];
const { consentRequested } = state['features/recording'];
if (iAmRecorder) {
return false;
}
if (isSpotTV()) {
if (isSpotTV(state)) {
return false;
}
@@ -455,10 +460,25 @@ export function shouldRequireRecordingConsent(recorderSession: any, state: IRedu
return false;
}
if (!recorderSession.getInitiator()
|| recorderSession.getStatus() === JitsiRecordingConstants.status.OFF) {
if (consentRequested.has(recorderSession.getID())) {
return false;
}
return recorderSession.getInitiator() !== getLocalParticipant(state)?.id;
// If we join a meeting that has an ongoing recording, `conference` will be undefined,
// since we get the recording state through the initial presence, which happens
// between the WILL_JOIN and JOINED events.
if (conference && (skipConsentInMeeting || skipRecordingConsentInMeeting)) {
return false;
}
// lib-jitsi-meet may set a JitsiParticipant as the initiator of the recording session,
// or the JID resource when it cannot find one. We need to handle both cases.
const initiator = recorderSession.getInitiator();
const initiatorId = initiator?.getId?.() ?? initiator;
if (!initiatorId || recorderSession.getStatus() === JitsiRecordingConstants.status.OFF) {
return false;
}
return initiatorId !== getLocalParticipant(state)?.id;
}

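The consent check above now tolerates lib-jitsi-meet handing back either a `JitsiParticipant` object or a bare JID resource string for the session initiator. The normalization step, `initiator?.getId?.() ?? initiator`, can be isolated as follows (the `getInitiatorId` wrapper is illustrative, not a function in the patch):

```typescript
// lib-jitsi-meet may provide either a JitsiParticipant-like object or a bare
// string; optional chaining falls through to the raw value when getId is absent.
function getInitiatorId(initiator: any): string | undefined {
    return initiator?.getId?.() ?? initiator;
}

console.log(getInitiatorId({ getId: () => 'abcd1234' })); // 'abcd1234'
console.log(getInitiatorId('room@conference.example.com/abcd1234')); // the string itself
console.log(getInitiatorId(undefined)); // undefined
```

The patch then compares the normalized ID against `getLocalParticipant(state)?.id`, so the local user is never asked to consent to a recording they started themselves.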

@@ -36,6 +36,7 @@ import { isRecorderTranscriptionsRunning } from '../transcribing/functions';
import { RECORDING_SESSION_UPDATED, START_LOCAL_RECORDING, STOP_LOCAL_RECORDING } from './actionTypes';
import {
clearRecordingSessions,
markConsentRequested,
hidePendingRecordingNotification,
showPendingRecordingNotification,
showRecordingError,
@@ -420,6 +421,7 @@ function _showExplicitConsentDialog(recorderSession: any, dispatch: IStore['disp
}
batch(() => {
dispatch(markConsentRequested(recorderSession.getID()));
dispatch(setAudioUnmutePermissions(true, true));
dispatch(setVideoUnmutePermissions(true, true));
dispatch(setAudioMuted(true));


@@ -2,6 +2,7 @@ import ReducerRegistry from '../base/redux/ReducerRegistry';
import {
CLEAR_RECORDING_SESSIONS,
MARK_CONSENT_REQUESTED,
RECORDING_SESSION_UPDATED,
SET_MEETING_HIGHLIGHT_BUTTON_STATE,
SET_PENDING_RECORDING_NOTIFICATION_UID,
@@ -11,6 +12,7 @@ import {
} from './actionTypes';
const DEFAULT_STATE = {
consentRequested: new Set(),
disableHighlightMeetingMoment: false,
pendingNotificationUids: {},
selectedRecordingService: '',
@@ -29,6 +31,7 @@ export interface ISessionData {
}
export interface IRecordingState {
consentRequested: Set<any>;
disableHighlightMeetingMoment: boolean;
pendingNotificationUids: {
[key: string]: string | undefined;
@@ -57,6 +60,15 @@ ReducerRegistry.register<IRecordingState>(STORE_NAME,
sessionDatas: []
};
case MARK_CONSENT_REQUESTED:
return {
...state,
consentRequested: new Set([
...state.consentRequested,
action.sessionId
])
};
case RECORDING_SESSION_UPDATED:
return {
...state,

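The reducer above tracks the session IDs that have already been asked for consent in a `Set`, and the `MARK_CONSENT_REQUESTED` case builds a fresh `Set` rather than mutating the existing one, keeping Redux state immutable. A standalone sketch of that update (types reduced to the one field involved):

```typescript
interface ConsentState {
    consentRequested: Set<string>;
}

// Same shape as the MARK_CONSENT_REQUESTED case: spread the old Set into a
// new one so the previous state object is never mutated.
function markConsentRequested(state: ConsentState, sessionId: string): ConsentState {
    return {
        ...state,
        consentRequested: new Set([ ...state.consentRequested, sessionId ])
    };
}

const before: ConsentState = { consentRequested: new Set([ 'session-1' ]) };
const after = markConsentRequested(before, 'session-2');

console.log(after.consentRequested.has('session-2'));  // true
console.log(before.consentRequested.has('session-2')); // false: old state untouched
```

Because `Set` ignores duplicates, marking the same session twice is harmless, which is what lets `shouldRequireRecordingConsent` use a simple `consentRequested.has(...)` guard.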

@@ -1,3 +1,11 @@
/* eslint-disable lines-around-comment */
import {
PC_CON_STATE_CHANGE,
PC_STATE_CONNECTED,
PC_STATE_FAILED
// @ts-expect-error
} from '@jitsi/rtcstats/events';
import JitsiMeetJS, { RTCStatsEvents } from '../base/lib-jitsi-meet';
import logger from './logger';
@@ -8,11 +16,6 @@ import {
VideoTypeData
} from './types';
// TODO(saghul): expose these in lib-jitsi-meet?
const PC_CON_STATE_CHANGE = 'connectionstatechange';
const PC_STATE_CONNECTED = 'connected';
const PC_STATE_FAILED = 'failed';
/**
* Handle lib-jitsi-meet rtcstats events and send jitsi-meet specific statistics.
*/


@@ -155,10 +155,6 @@ export function submitMoreTab(newState: any) {
conference?.setTranscriptionLanguage(newState.currentLanguage);
}
if (newState.showSubtitlesOnStage !== currentState.showSubtitlesOnStage) {
dispatch(updateSettings({ showSubtitlesOnStage: newState.showSubtitlesOnStage }));
}
};
}


@@ -18,6 +18,7 @@ import FormSection from './FormSection';
const ModeratorSection = () => {
const dispatch = useDispatch();
const {
audioModerationEnabled,
chatWithPermissionsEnabled,
followMeActive,
followMeEnabled,
@@ -25,7 +26,8 @@ const ModeratorSection = () => {
followMeRecorderEnabled,
startAudioMuted,
startVideoMuted,
startReactionsMuted
startReactionsMuted,
videoModerationEnabled
} = useSelector((state: IReduxState) => getModeratorTabProps(state));
const { disableReactionsModeration } = useSelector((state: IReduxState) => state['features/base/config']);
@@ -68,13 +70,13 @@ const ModeratorSection = () => {
const moderationSettings = useMemo(() => {
const moderation = [
{
disabled: false,
disabled: audioModerationEnabled,
label: 'settings.startAudioMuted',
state: startAudioMuted,
onChange: onStartAudioMutedToggled
},
{
disabled: false,
disabled: videoModerationEnabled,
label: 'settings.startVideoMuted',
state: startVideoMuted,
onChange: onStartVideoMutedToggled


@@ -13,6 +13,10 @@ import Checkbox from '../../../base/ui/components/web/Checkbox';
* The type of the React {@code Component} props of {@link ModeratorTab}.
*/
export interface IProps extends AbstractDialogTabProps, WithTranslation {
/**
* Whether the user has selected the audio moderation feature to be enabled.
*/
audioModerationEnabled: boolean;
/**
* Whether the user has selected the chat with permissions feature to be enabled.
@@ -71,6 +75,11 @@ export interface IProps extends AbstractDialogTabProps, WithTranslation {
* enabled.
*/
startVideoMuted: boolean;
/**
* Whether the user has selected the video moderation feature to be enabled.
*/
videoModerationEnabled: boolean;
}
const styles = (theme: Theme) => {
@@ -200,6 +209,7 @@ class ModeratorTab extends AbstractDialogTab<IProps, any> {
*/
override render() {
const {
audioModerationEnabled,
chatWithPermissionsEnabled,
disableChatWithPermissions,
disableReactionsModeration,
@@ -210,7 +220,8 @@ class ModeratorTab extends AbstractDialogTab<IProps, any> {
startAudioMuted,
startVideoMuted,
startReactionsMuted,
t
t,
videoModerationEnabled
} = this.props;
const classes = withStyles.getClasses(this.props);
@@ -223,18 +234,18 @@ class ModeratorTab extends AbstractDialogTab<IProps, any> {
<h2 className = { classes.title }>
{t('settings.moderatorOptions')}
</h2>
<Checkbox
{ !audioModerationEnabled && <Checkbox
checked = { startAudioMuted }
className = { classes.checkbox }
label = { t('settings.startAudioMuted') }
name = 'start-audio-muted'
onChange = { this._onStartAudioMutedChanged } />
<Checkbox
onChange = { this._onStartAudioMutedChanged } /> }
{ !videoModerationEnabled && <Checkbox
checked = { startVideoMuted }
className = { classes.checkbox }
label = { t('settings.startVideoMuted') }
name = 'start-video-muted'
onChange = { this._onStartVideoMutedChanged } />
onChange = { this._onStartVideoMutedChanged } /> }
<Checkbox
checked = { followMeEnabled && !followMeActive && !followMeRecorderChecked }
className = { classes.checkbox }


@@ -17,11 +17,6 @@ import { MAX_ACTIVE_PARTICIPANTS } from '../../../filmstrip/constants';
*/
export interface IProps extends AbstractDialogTabProps, WithTranslation {
/**
* Indicates if closed captions are enabled.
*/
areClosedCaptionsEnabled: boolean;
/**
* CSS classes object.
*/
@@ -83,11 +78,6 @@ export interface IProps extends AbstractDialogTabProps, WithTranslation {
*/
showPrejoinSettings: boolean;
/**
* Whether or not to show subtitles on stage.
*/
showSubtitlesOnStage: boolean;
/**
* Whether or not the stage filmstrip is enabled.
*/
@@ -136,7 +126,6 @@ class MoreTab extends AbstractDialogTab<IProps, any> {
this._renderMaxStageParticipantsSelect = this._renderMaxStageParticipantsSelect.bind(this);
this._onMaxStageParticipantsSelect = this._onMaxStageParticipantsSelect.bind(this);
this._onHideSelfViewChanged = this._onHideSelfViewChanged.bind(this);
this._onShowSubtitlesOnStageChanged = this._onShowSubtitlesOnStageChanged.bind(this);
this._onLanguageItemSelect = this._onLanguageItemSelect.bind(this);
}
@@ -148,13 +137,11 @@ class MoreTab extends AbstractDialogTab<IProps, any> {
*/
override render() {
const {
areClosedCaptionsEnabled,
showPrejoinSettings,
disableHideSelfView,
iAmVisitor,
hideSelfView,
showLanguageSettings,
showSubtitlesOnStage,
t
} = this.props;
const classes = withStyles.getClasses(this.props);
@@ -176,12 +163,6 @@ class MoreTab extends AbstractDialogTab<IProps, any> {
name = 'hide-self-view'
onChange = { this._onHideSelfViewChanged } />
)}
{areClosedCaptionsEnabled && <Checkbox
checked = { showSubtitlesOnStage }
className = { classes.checkbox }
label = { t('settings.showSubtitlesOnStage') }
name = 'show-subtitles-button'
onChange = { this._onShowSubtitlesOnStageChanged } /> }
{showLanguageSettings && this._renderLanguageSelect()}
</div>
);
@@ -223,17 +204,6 @@ class MoreTab extends AbstractDialogTab<IProps, any> {
super._onChange({ hideSelfView: checked });
}
/**
* Callback invoked to select if show subtitles button should be enabled.
*
* @param {Object} e - The key event to handle.
*
* @returns {void}
*/
_onShowSubtitlesOnStageChanged({ target: { checked } }: React.ChangeEvent<HTMLInputElement>) {
super._onChange({ showSubtitlesOnStage: checked });
}
/**
* Callback invoked to select a language from select dropdown.
*

View File

@@ -316,7 +316,6 @@ function _mapStateToProps(state: IReduxState, ownProps: any) {
currentLanguage: tabState?.currentLanguage,
hideSelfView: tabState?.hideSelfView,
showPrejoinPage: tabState?.showPrejoinPage,
showSubtitlesOnStage: tabState?.showSubtitlesOnStage,
maxStageParticipants: tabState?.maxStageParticipants
};
},


@@ -1,9 +1,11 @@
import { IReduxState } from '../app/types';
import { isEnabledFromState } from '../av-moderation/functions';
import { IStateful } from '../base/app/types';
import { isNameReadOnly } from '../base/config/functions.any';
import { SERVER_URL_CHANGE_ENABLED } from '../base/flags/constants';
import { getFeatureFlag } from '../base/flags/functions';
import i18next, { DEFAULT_LANGUAGE, LANGUAGES } from '../base/i18n/i18next';
import { MEDIA_TYPE } from '../base/media/constants';
import { getLocalParticipant } from '../base/participants/functions';
import { toState } from '../base/redux/functions';
import { getHideSelfView } from '../base/settings/functions.any';
@@ -12,7 +14,6 @@ import { isStageFilmstripEnabled } from '../filmstrip/functions';
import { isFollowMeActive, isFollowMeRecorderActive } from '../follow-me/functions';
import { isPrejoinEnabledInConfig } from '../prejoin/functions';
import { isReactionsEnabled } from '../reactions/functions.any';
import { areClosedCaptionsEnabled } from '../subtitles/functions.any';
import { iAmVisitor } from '../visitors/functions';
import { shouldShowModeratorSettings } from './functions';
@@ -108,7 +109,6 @@ export function getMoreTabProps(stateful: IStateful) {
const { disableSelfView, disableSelfViewSettings } = state['features/base/config'];
return {
areClosedCaptionsEnabled: areClosedCaptionsEnabled(state),
currentLanguage: language,
disableHideSelfView: disableSelfViewSettings || disableSelfView,
hideSelfView: getHideSelfView(state),
@@ -118,7 +118,6 @@ export function getMoreTabProps(stateful: IStateful) {
showLanguageSettings: configuredTabs.includes('language'),
showPrejoinPage: !state['features/base/settings'].userSelectedSkipPrejoin,
showPrejoinSettings: isPrejoinEnabledInConfig(state),
showSubtitlesOnStage: state['features/base/settings'].showSubtitlesOnStage,
stageFilmstripEnabled
};
}
@@ -147,9 +146,13 @@ export function getModeratorTabProps(stateful: IStateful) {
const followMeRecorderActive = isFollowMeRecorderActive(state);
const showModeratorSettings = shouldShowModeratorSettings(state);
const disableChatWithPermissions = !conference?.getMetadataHandler().getMetadata().allownersEnabled;
const isAudioModerationEnabled = isEnabledFromState(MEDIA_TYPE.AUDIO, state);
const isVideoModerationEnabled = isEnabledFromState(MEDIA_TYPE.VIDEO, state);
// The settings sections to display.
return {
audioModerationEnabled: isAudioModerationEnabled,
videoModerationEnabled: isVideoModerationEnabled,
chatWithPermissionsEnabled: Boolean(groupChatWithPermissions),
showModeratorSettings: Boolean(conference && showModeratorSettings),
disableChatWithPermissions: Boolean(disableChatWithPermissions),


@@ -9,7 +9,6 @@ import { getFieldValue } from '../../../base/react/functions';
import { withPixelLineHeight } from '../../../base/styles/functions.web';
import { MOBILE_BREAKPOINT } from '../../constants';
import { isSpeakerStatsSearchDisabled } from '../../functions';
import { HiddenDescription } from '../../../base/ui/components/web/HiddenDescription';
const useStyles = makeStyles()(theme => {
return {
@@ -97,9 +96,6 @@ function SpeakerStatsSearch({ onSearch }: IProps) {
return null;
}
const inputId = 'speaker-stats-search';
const inputDescriptionId = `${inputId}-hidden-description`;
return (
<div className = { classes.speakerStatsSearchContainer }>
<Icon
@@ -107,21 +103,17 @@ function SpeakerStatsSearch({ onSearch }: IProps) {
color = { theme.palette.icon03 }
src = { IconSearch } />
<input
aria-describedby = { inputDescriptionId }
aria-label = { t('speakerStats.searchHint') }
autoComplete = 'off'
autoFocus = { false }
className = { classes.speakerStatsSearch }
id = { inputId }
id = 'speaker-stats-search'
name = 'speakerStatsSearch'
onChange = { onChange }
onKeyPress = { preventDismiss }
placeholder = { t('speakerStats.search') }
tabIndex = { 0 }
value = { searchValue } />
<HiddenDescription id = { inputDescriptionId }>
{t('speakerStats.searchDescription')}
</HiddenDescription>
</div>
);
}


@@ -68,6 +68,7 @@ export class AudioMixerEffect {
* @param {MediaStream} audioStream - Audio stream which will be mixed with _mixAudio.
* @returns {MediaStream} - MediaStream containing both audio tracks mixed together.
*/
// @ts-ignore
startEffect(audioStream: MediaStream) {
this._originalStream = audioStream;
this._originalTrack = audioStream.getTracks()[0];


@@ -55,8 +55,3 @@ export const TOGGLE_REQUESTING_SUBTITLES
*/
export const SET_REQUESTING_SUBTITLES
= 'SET_REQUESTING_SUBTITLES';
/**
* Action to store received subtitles in history.
*/
export const STORE_SUBTITLE = 'STORE_SUBTITLE';


@@ -4,11 +4,9 @@ import {
REMOVE_CACHED_TRANSCRIPT_MESSAGE,
REMOVE_TRANSCRIPT_MESSAGE,
SET_REQUESTING_SUBTITLES,
STORE_SUBTITLE,
TOGGLE_REQUESTING_SUBTITLES,
UPDATE_TRANSCRIPT_MESSAGE
} from './actionTypes';
import { ISubtitle } from './types';
/**
* Signals that a transcript has to be removed from the state.
@@ -100,19 +98,3 @@ export function setRequestingSubtitles(
language
};
}
/**
* Stores a received subtitle in the history.
*
* @param {ISubtitle} subtitle - The subtitle to store.
* @returns {{
* type: STORE_SUBTITLE,
* subtitle: ISubtitle
* }}
*/
export function storeSubtitle(subtitle: ISubtitle) {
return {
type: STORE_SUBTITLE,
subtitle
};
}


@@ -4,21 +4,10 @@ import { IReduxState } from '../../app/types';
import { MEET_FEATURES } from '../../base/jwt/constants';
import AbstractButton, { IProps as AbstractButtonProps } from '../../base/toolbox/components/AbstractButton';
import { maybeShowPremiumFeatureDialog } from '../../jaas/actions';
import { canStartSubtitles, isCCTabEnabled } from '../functions.any';
import { canStartSubtitles } from '../functions.any';
/**
* Props interface for the Abstract Closed Caption Button component.
*
* @interface IAbstractProps
* @augments {AbstractButtonProps}
*/
export interface IAbstractProps extends AbstractButtonProps {
/**
* Whether the subtitles tab is enabled in the UI.
*/
_isCCTabEnabled: boolean;
_language: string | null;
/**
@@ -120,7 +109,6 @@ export function _abstractMapStateToProps(state: IReduxState, ownProps: IAbstract
const { visible = canStartSubtitles(state) } = ownProps;
return {
_isCCTabEnabled: isCCTabEnabled(state),
_requestingSubtitles,
_language,
visible

View File

@@ -3,8 +3,11 @@ import { useTranslation } from 'react-i18next';
import { useDispatch, useSelector } from 'react-redux';
import { IReduxState, IStore } from '../../app/types';
import {
TRANSLATION_LANGUAGES,
TRANSLATION_LANGUAGES_HEAD
} from '../../base/i18n/i18next';
import { setRequestingSubtitles } from '../actions.any';
import { getAvailableSubtitlesLanguages } from '../functions.any';
export interface IAbstractLanguageSelectorDialogProps {
@@ -27,30 +30,40 @@ export interface IAbstractLanguageSelectorDialogProps {
const AbstractLanguageSelectorDialog = (Component: ComponentType<IAbstractLanguageSelectorDialogProps>) => () => {
const dispatch = useDispatch();
const { t } = useTranslation();
const language = useSelector((state: IReduxState) => state['features/subtitles']._language);
// The value for the selected language contains "translation-languages:" prefix.
const selectedLanguage = language?.replace('translation-languages:', '');
const languageCodes = useSelector((state: IReduxState) => getAvailableSubtitlesLanguages(state, selectedLanguage));
const noLanguageLabel = 'transcribing.subtitlesOff';
const selected = language ?? noLanguageLabel;
const items = [ noLanguageLabel, ...languageCodes.map((lang: string) => `translation-languages:${lang}`) ];
const listItems = items
const language = useSelector((state: IReduxState) => state['features/subtitles']._language);
const subtitles = language ?? noLanguageLabel;
const transcription = useSelector((state: IReduxState) => state['features/base/config'].transcription);
const translationLanguagesHead = transcription?.translationLanguagesHead ?? TRANSLATION_LANGUAGES_HEAD;
const languagesHead = translationLanguagesHead?.map((lang: string) => `translation-languages:${lang}`);
// The "off" item and the head languages always stay at the top of the list. Once a language
// from translationLanguages is selected, it is moved directly under the fixedItems list
// until a new language is selected. Fixed items keep their positions.
const fixedItems = [ noLanguageLabel, ...languagesHead ];
const translationLanguages = transcription?.translationLanguages ?? TRANSLATION_LANGUAGES;
const languages = translationLanguages
.map((lang: string) => `translation-languages:${lang}`)
.filter((lang: string) => !(lang === subtitles || languagesHead?.includes(lang)));
const listItems = (fixedItems?.includes(subtitles)
? [ ...fixedItems, ...languages ]
: [ ...fixedItems, subtitles, ...languages ])
.map((lang, index) => {
return {
id: lang + index,
lang,
selected: lang === selected
selected: lang === subtitles
};
});
const onLanguageSelected = useCallback((value: string) => {
const _selectedLanguage = value === noLanguageLabel ? null : value;
const enabled = Boolean(_selectedLanguage);
const selectedLanguage = value === noLanguageLabel ? null : value;
const enabled = Boolean(selectedLanguage);
const displaySubtitles = enabled;
dispatch(setRequestingSubtitles(enabled, displaySubtitles, _selectedLanguage));
dispatch(setRequestingSubtitles(enabled, displaySubtitles, selectedLanguage));
}, [ language ]);
return (
@@ -59,7 +72,7 @@ const AbstractLanguageSelectorDialog = (Component: ComponentType<IAbstractLangua
language = { language }
listItems = { listItems }
onLanguageSelected = { onLanguageSelected }
subtitles = { selected }
subtitles = { subtitles }
t = { t } />
);
};
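The ordering rule described in the comment inside the hunk above can be sketched as a small standalone function (the name `orderLanguageList` and the sample language codes are illustrative, not the component's actual code):

```typescript
// Sketch, under stated assumptions: the "off" item and the head languages are
// pinned at the top; a selected language that is not in the fixed set is
// slotted in right after them; everything else follows in order.
function orderLanguageList(
        fixedItems: string[],
        languages: string[],
        selected: string): string[] {
    // Drop the selected language and the fixed items from the tail list.
    const rest = languages.filter(lang => lang !== selected && !fixedItems.includes(lang));

    return fixedItems.includes(selected)
        ? [ ...fixedItems, ...rest ]
        : [ ...fixedItems, selected, ...rest ];
}
```

With head items `[ 'off', 'en' ]`, selecting `'de'` yields `[ 'off', 'en', 'de', 'fr' ]`, while selecting a head language leaves the fixed positions untouched.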

View File

@@ -2,91 +2,37 @@ import { connect } from 'react-redux';
import { translate } from '../../../base/i18n/functions';
import { IconSubtitles } from '../../../base/icons/svg';
import { openCCPanel } from '../../../chat/actions.any';
import { toggleLanguageSelectorDialog } from '../../actions.web';
import { canStartSubtitles, isCCTabEnabled } from '../../functions.any';
import {
AbstractClosedCaptionButton,
IAbstractProps,
_abstractMapStateToProps
} from '../AbstractClosedCaptionButton';
import { IReduxState } from '../../../app/types';
/**
* A button which starts/stops the transcriptions.
*/
class ClosedCaptionButton
extends AbstractClosedCaptionButton {
override accessibilityLabel = 'toolbar.accessibilityLabel.cc';
override icon = IconSubtitles;
override tooltip = 'transcribing.ccButtonTooltip';
override label = 'toolbar.startSubtitles';
override labelProps = {
language: this.props.t(this.props._language ?? 'transcribing.subtitlesOff'),
languages: this.props.t(this.props.languages ?? ''),
languagesHead: this.props.t(this.props.languagesHead ?? '')
};
/**
* Gets the current button label based on the CC tab state.
*
* @returns {string} The button label.
*/
override _getLabel() {
const { _isCCTabEnabled } = this.props;
return _isCCTabEnabled ? 'toolbar.closedCaptions' : 'toolbar.startSubtitles';
}
/**
* Returns the accessibility label for the button.
*
* @returns {string} Accessibility label.
*/
override _getAccessibilityLabel() {
const { _isCCTabEnabled } = this.props;
return _isCCTabEnabled ? 'toolbar.accessibilityLabel.closedCaptions' : 'toolbar.accessibilityLabel.cc';
}
/**
* Returns the tooltip text based on the CC tab state.
*
* @returns {string} The tooltip text.
*/
override _getTooltip() {
const { _isCCTabEnabled } = this.props;
return _isCCTabEnabled ? 'transcribing.openClosedCaptions' : 'transcribing.ccButtonTooltip';
}
/**
* Toggle language selection dialog.
*
* @returns {void}
*/
override _handleClickOpenLanguageSelector() {
const { dispatch, _isCCTabEnabled } = this.props;
const { dispatch } = this.props;
if (_isCCTabEnabled) {
dispatch(openCCPanel());
} else {
dispatch(toggleLanguageSelectorDialog());
}
dispatch(toggleLanguageSelectorDialog());
}
}
/**
* Maps redux state to component props.
*
* @param {Object} state - The redux state.
* @param {Object} ownProps - The component's own props.
* @returns {Object} Mapped props for the component.
*/
function mapStateToProps(state: IReduxState, ownProps: IAbstractProps) {
const { visible = canStartSubtitles(state) || isCCTabEnabled(state) } = ownProps;
return _abstractMapStateToProps(state, {
...ownProps,
visible
});
}
export default translate(connect(mapStateToProps)(ClosedCaptionButton));
export default translate(connect(_abstractMapStateToProps)(ClosedCaptionButton));

View File

@@ -1,104 +0,0 @@
import React, { ChangeEvent, useCallback } from 'react';
import { useTranslation } from 'react-i18next';
import { useDispatch, useSelector } from 'react-redux';
import { makeStyles } from 'tss-react/mui';
import { IReduxState } from '../../../app/types';
import { withPixelLineHeight } from '../../../base/styles/functions.web';
import Select from '../../../base/ui/components/web/Select';
import { setRequestingSubtitles } from '../../actions.any';
import { getAvailableSubtitlesLanguages } from '../../functions.any';
/**
* The styles for the LanguageSelector component.
*
* @param {Theme} theme - The MUI theme.
* @returns {Object} The styles object.
*/
const useStyles = makeStyles()(theme => {
return {
container: {
display: 'flex',
alignItems: 'center',
padding: theme.spacing(2),
gap: theme.spacing(2)
},
select: {
flex: 1,
minWidth: 200
},
label: {
...withPixelLineHeight(theme.typography.bodyShortRegular),
color: theme.palette.text01,
whiteSpace: 'nowrap'
}
};
});
/**
* Component that renders a language selection dropdown.
* Uses the same language options as LanguageSelectorDialog and
* updates the subtitles language preference in Redux.
*
* @param {IProps} props - The component props.
* @returns {JSX.Element} - The rendered component.
*/
function LanguageSelector() {
const { t } = useTranslation();
const { classes } = useStyles();
const dispatch = useDispatch();
const selectedLanguage = useSelector((state: IReduxState) => state['features/subtitles']._language);
const languageCodes = useSelector((state: IReduxState) => getAvailableSubtitlesLanguages(
state,
selectedLanguage?.replace('translation-languages:', '')
));
/**
* Maps available languages to Select component options format.
*
* @type {Array<{value: string, label: string}>}
*/
const languages = [ 'transcribing.original', ...languageCodes.map(lang => `translation-languages:${lang}`) ]
.map(lang => {
return {
value: lang,
label: t(lang)
};
});
/**
* Handles language selection changes.
* Dispatches the setRequestingSubtitles action with the new language.
*
* @param {string} value - The selected language code.
* @returns {void}
*/
const onLanguageChange = useCallback((e: ChangeEvent<HTMLSelectElement>) => {
let { value }: { value?: string | null; } = e.target;
if (value === 'transcribing.original') {
value = null;
}
dispatch(setRequestingSubtitles(true, true, value));
if (value !== null) {
value = value.replace('translation-languages:', '');
}
}, [ dispatch ]);
return (
<div className = { classes.container }>
<span className = { classes.label }>
{t('transcribing.translateTo')}:
</span>
<Select
className = { classes.select }
id = 'subtitles-language-select'
onChange = { onLanguageChange }
options = { languages }
value = { selectedLanguage || 'transcribing.original' } />
</div>
);
}
export default LanguageSelector;

View File

@@ -1,7 +1,4 @@
import { IReduxState } from '../app/types';
import { IStateful } from '../base/app/types';
import { TRANSLATION_LANGUAGES, TRANSLATION_LANGUAGES_HEAD } from '../base/i18n/i18next';
import { toState } from '../base/redux/functions';
import { canAddTranscriber, isTranscribing } from '../transcribing/functions';
/**
@@ -13,59 +10,3 @@ import { canAddTranscriber, isTranscribing } from '../transcribing/functions';
export function canStartSubtitles(state: IReduxState) {
return canAddTranscriber(state) || isTranscribing(state);
}
/**
* Retrieves the list of available subtitles languages. The list consists of head languages (fixed items that stay on
* top) followed by the rest of available translation languages.
*
* @param {IStateful} stateful - The stateful object containing the redux state.
* @param {string} [selectedLanguage] - Optional language code of currently selected language. If provided and not in
* regular translation languages, it will be added after head languages.
* @returns {Array<string>} - Array of language codes. Includes both head languages and regular translation languages.
*/
export function getAvailableSubtitlesLanguages(stateful: IStateful, selectedLanguage?: string | null) {
const state = toState(stateful);
const { transcription } = state['features/base/config'];
const translationLanguagesHead = transcription?.translationLanguagesHead ?? TRANSLATION_LANGUAGES_HEAD;
const translationLanguages
= (transcription?.translationLanguages ?? TRANSLATION_LANGUAGES)
.filter((lang: string) => !translationLanguagesHead?.includes(lang) && lang !== selectedLanguage);
const isSelectedLanguageNotIncluded = Boolean(
selectedLanguage
&& !translationLanguages.includes(selectedLanguage)
&& !translationLanguagesHead.includes(selectedLanguage));
return [
...translationLanguagesHead,
// selectedLanguage is redundant but otherwise TS complains about null elements in the array.
...isSelectedLanguageNotIncluded && selectedLanguage ? [ selectedLanguage ] : [],
...translationLanguages
];
}
/**
* Determines if closed captions are enabled.
*
* @param {IReduxState} state - The Redux state object.
* @returns {boolean} A boolean indicating whether closed captions are enabled.
*/
export function areClosedCaptionsEnabled(state: IReduxState) {
const { transcription } = state['features/base/config'];
return !transcription?.disableClosedCaptions;
}
/**
* Checks whether the subtitles tab should be enabled in the UI.
*
* @param {IReduxState} state - The redux state.
* @returns {boolean} - True if the subtitles tab should be enabled.
*/
export function isCCTabEnabled(state: IReduxState) {
const { showSubtitlesOnStage = false } = state['features/base/settings'];
return areClosedCaptionsEnabled(state) && !showSubtitlesOnStage;
}

View File

@@ -1,9 +1,7 @@
import { useSelector } from 'react-redux';
import { IReduxState } from '../app/types';
import ClosedCaptionButton from './components/web/ClosedCaptionButton';
import { areClosedCaptionsEnabled, canStartSubtitles } from './functions.any';
import { canStartSubtitles } from './functions.any';
const cc = {
key: 'closedcaptions',
@@ -14,18 +12,12 @@ const cc = {
/**
* A hook that returns the CC button if it is enabled and undefined otherwise.
*
* @returns {Object | undefined}
* @returns {Object | undefined}
*/
export function useClosedCaptionButton() {
const isStartSubtitlesButtonVisible = useSelector(canStartSubtitles);
const { showSubtitlesOnStage = false } = useSelector((state: IReduxState) => state['features/base/settings']);
const _areClosedCaptionsEnabled = useSelector(areClosedCaptionsEnabled);
if (!_areClosedCaptionsEnabled) {
return undefined;
}
if (isStartSubtitlesButtonVisible || !showSubtitlesOnStage) {
if (isStartSubtitlesButtonVisible) {
return cc;
}
}

View File

@@ -16,13 +16,12 @@ import {
removeCachedTranscriptMessage,
removeTranscriptMessage,
setRequestingSubtitles,
storeSubtitle,
updateTranscriptMessage
} from './actions.any';
import { notifyTranscriptionChunkReceived } from './functions';
import { areClosedCaptionsEnabled, isCCTabEnabled } from './functions.any';
import logger from './logger';
import { ISubtitle, ITranscriptMessage } from './types';
import { ITranscriptMessage } from './types';
/**
* The type of json-message which indicates that json carries a
@@ -123,7 +122,11 @@ function _endpointMessageReceived(store: IStore, next: Function, action: AnyActi
const { dispatch, getState } = store;
const state = getState();
const _areClosedCaptionsEnabled = areClosedCaptionsEnabled(store.getState());
const language
= state['features/base/conference'].conference
?.getLocalParticipantProperty(P_NAME_TRANSLATION_LANGUAGE);
const { dumpTranscript, skipInterimTranscriptions } = state['features/base/config'].testing ?? {};
const transcriptMessageID = json.message_id;
const { name, id, avatar_url: avatarUrl } = json.participant;
const participant = {
@@ -131,57 +134,25 @@ function _endpointMessageReceived(store: IStore, next: Function, action: AnyActi
id,
name
};
const { timestamp } = json;
const participantId = participant.id;
// Handle transcript messages
const language = state['features/base/conference'].conference
?.getLocalParticipantProperty(P_NAME_TRANSLATION_LANGUAGE);
const { dumpTranscript, skipInterimTranscriptions } = state['features/base/config'].testing ?? {};
let newTranscriptMessage: ITranscriptMessage | undefined;
if (json.type === JSON_TYPE_TRANSLATION_RESULT) {
if (!_areClosedCaptionsEnabled) {
// If closed captions are not enabled, bail out.
return next(action);
}
const translation = json.text?.trim();
if (isCCTabEnabled(state)) {
dispatch(storeSubtitle({
participantId,
text: translation,
language: json.language,
interim: false,
isTranscription: false,
timestamp,
id: transcriptMessageID
}));
return next(action);
}
if (json.language === language) {
// Displays final results in the target language if translation is
// enabled.
newTranscriptMessage = {
clearTimeOut: undefined,
final: json.text?.trim(),
participant
};
}
if (json.type === JSON_TYPE_TRANSLATION_RESULT && json.language === language) {
// Displays final results in the target language if translation is
// enabled.
newTranscriptMessage = {
clearTimeOut: undefined,
final: json.text?.trim(),
participant
};
} else if (json.type === JSON_TYPE_TRANSCRIPTION_RESULT) {
const isInterim = json.is_interim;
// Displays interim and final results without any translation if
// translations are disabled.
const { text } = json.transcript[0];
// First, notify the external API.
if (!(isInterim && skipInterimTranscriptions)) {
if (!(json.is_interim && skipInterimTranscriptions)) {
const txt: any = {};
if (!json.is_interim) {
@@ -221,27 +192,6 @@ function _endpointMessageReceived(store: IStore, next: Function, action: AnyActi
}
}
if (!_areClosedCaptionsEnabled) {
// If closed captions are not enabled, bail out.
return next(action);
}
const subtitle: ISubtitle = {
id: transcriptMessageID,
participantId,
language: json.language,
text,
interim: isInterim,
timestamp,
isTranscription: true
};
if (isCCTabEnabled(state)) {
dispatch(storeSubtitle(subtitle));
return next(action);
}
// If the user is not requesting transcriptions just bail.
// Regex to filter out all possible country codes after language code:
// this should catch all notations like 'en-GB' 'en_GB' and 'enGB'

View File

@@ -5,11 +5,10 @@ import {
REMOVE_CACHED_TRANSCRIPT_MESSAGE,
REMOVE_TRANSCRIPT_MESSAGE,
SET_REQUESTING_SUBTITLES,
STORE_SUBTITLE,
TOGGLE_REQUESTING_SUBTITLES,
UPDATE_TRANSCRIPT_MESSAGE
} from './actionTypes';
import { ISubtitle, ITranscriptMessage } from './types';
import { ITranscriptMessage } from './types';
/**
* Default State for 'features/transcription' feature.
@@ -19,9 +18,7 @@ const defaultState = {
_displaySubtitles: false,
_transcriptMessages: new Map(),
_requestingSubtitles: false,
_language: null,
messages: [],
subtitlesHistory: []
_language: null
};
export interface ISubtitlesState {
@@ -30,8 +27,6 @@ export interface ISubtitlesState {
_language: string | null;
_requestingSubtitles: boolean;
_transcriptMessages: Map<string, ITranscriptMessage>;
messages: ITranscriptMessage[];
subtitlesHistory: Array<ISubtitle>;
}
/**
@@ -64,30 +59,6 @@ ReducerRegistry.register<ISubtitlesState>('features/subtitles', (
...state,
...defaultState
};
case STORE_SUBTITLE: {
const existingIndex = state.subtitlesHistory.findIndex(
subtitle => subtitle.id === action.subtitle.id
);
if (existingIndex >= 0 && state.subtitlesHistory[existingIndex].interim) {
const newHistory = [ ...state.subtitlesHistory ];
newHistory[existingIndex] = action.subtitle;
return {
...state,
subtitlesHistory: newHistory
};
}
return {
...state,
subtitlesHistory: [
...state.subtitlesHistory,
action.subtitle
]
};
}
}
return state;
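The `STORE_SUBTITLE` branch removed in the hunk above replaces an interim entry in place and appends otherwise. A minimal sketch of that behavior, with illustrative names (`storeInHistory`, `SubtitleLike` are not the reducer's actual identifiers):

```typescript
interface SubtitleLike {
    id: string;
    interim?: boolean;
    text: string;
}

// Sketch: an existing interim subtitle with the same id is replaced in place;
// any other subtitle is appended to the end of the history.
function storeInHistory(history: SubtitleLike[], subtitle: SubtitleLike): SubtitleLike[] {
    const existingIndex = history.findIndex(s => s.id === subtitle.id);

    if (existingIndex >= 0 && history[existingIndex].interim) {
        const next = [ ...history ];

        next[existingIndex] = subtitle;

        return next;
    }

    return [ ...history, subtitle ];
}
```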

View File

@@ -1,5 +1,3 @@
import { IGroupableMessage } from '../base/util/messageGrouping';
export interface ITranscriptMessage {
clearTimeOut?: number;
final?: string;
@@ -11,13 +9,3 @@ export interface ITranscriptMessage {
stable?: string;
unstable?: string;
}
export interface ISubtitle extends IGroupableMessage {
id: string;
interim?: boolean;
isTranscription?: boolean;
language?: string;
participantId: string;
text: string;
timestamp: number;
}

View File

@@ -8,7 +8,6 @@ import { isMobileBrowser } from '../../../base/environment/utils';
import { getLocalParticipant, isLocalParticipantModerator } from '../../../base/participants/functions';
import ContextMenu from '../../../base/ui/components/web/ContextMenu';
import { isReactionsButtonEnabled, shouldDisplayReactionsButtons } from '../../../reactions/functions.web';
import { isCCTabEnabled } from '../../../subtitles/functions.any';
import { isTranscribing } from '../../../transcribing/functions';
import {
setHangupMenuVisible,
@@ -92,10 +91,9 @@ export default function Toolbox({
const isDialogVisible = useSelector((state: IReduxState) => Boolean(state['features/base/dialog'].component));
const localParticipant = useSelector(getLocalParticipant);
const transcribing = useSelector(isTranscribing);
const _isCCTabEnabled = useSelector(isCCTabEnabled);
// Do not convert to selector, it returns new array and will cause re-rendering of toolbox on every action.
const jwtDisabledButtons = getJwtDisabledButtons(transcribing, _isCCTabEnabled, localParticipant?.features);
const jwtDisabledButtons = getJwtDisabledButtons(transcribing, localParticipant?.features);
const reactionsButtonEnabled = useSelector(isReactionsButtonEnabled);
const _shouldDisplayReactionsButtons = useSelector(shouldDisplayReactionsButtons);

View File

@@ -27,13 +27,11 @@ export function isAudioMuteButtonDisabled(state: IReduxState) {
* This function is stateless as it returns a new array and may cause re-rendering.
*
* @param {boolean} isTranscribing - Whether there is currently a transcriber in the meeting.
* @param {boolean} isCCTabEnabled - Whether the closed captions tab is enabled.
* @param {ILocalParticipant} localParticipantFeatures - The features of the local participant.
* @returns {string[]} - The disabled by jwt buttons array.
*/
export function getJwtDisabledButtons(
isTranscribing: boolean,
isCCTabEnabled: boolean,
localParticipantFeatures?: IParticipantFeatures) {
const acc = [];
@@ -45,7 +43,7 @@ export function getJwtDisabledButtons(
acc.push('livestreaming');
}
if (!isTranscribing && !isCCTabEnabled && !isJwtFeatureEnabledStateless({
if (!isTranscribing && !isJwtFeatureEnabledStateless({
localParticipantFeatures,
feature: 'transcription',
ifNotInFeatures: false

View File

@@ -7,7 +7,6 @@ import StateListenerRegistry from '../base/redux/StateListenerRegistry';
import { playSound } from '../base/sounds/actions';
import { showNotification } from '../notifications/actions';
import { NOTIFICATION_TIMEOUT_TYPE } from '../notifications/constants';
import { INotificationProps } from '../notifications/types';
import { RECORDING_OFF_SOUND_ID, RECORDING_ON_SOUND_ID } from '../recording/constants';
import { isLiveStreamingRunning, isRecordingRunning } from '../recording/functions';
@@ -59,13 +58,11 @@ function maybeEmitRecordingNotification(dispatch: IStore['dispatch'], getState:
return;
}
const notifyProps: INotificationProps = {
descriptionKey: on ? 'recording.on' : 'recording.off',
titleKey: 'dialog.recording'
};
batch(() => {
dispatch(showNotification(notifyProps, NOTIFICATION_TIMEOUT_TYPE.SHORT));
dispatch(showNotification({
descriptionKey: on ? 'recording.on' : 'recording.off',
titleKey: 'dialog.recording'
}, NOTIFICATION_TIMEOUT_TYPE.SHORT));
dispatch(playSound(on ? RECORDING_ON_SOUND_ID : RECORDING_OFF_SOUND_ID));
});
}

View File

@@ -207,6 +207,20 @@ function on_message(event)
room.av_moderation_actors = {};
end
room.av_moderation[mediaType] = array{};
-- We want to set the startMuted policy in metadata so that newly joining
-- participants respect it; it will be enforced by jicofo
local startMutedMetadata = room.jitsiMetadata.startMuted or {};
-- Keep the previous startMuted value for this mediaType so it can be
-- restored when av moderation is disabled
local av_moderation_startMuted_restore = room.av_moderation_startMuted_restore or {};
av_moderation_startMuted_restore[mediaType] = startMutedMetadata[mediaType];
room.av_moderation_startMuted_restore = av_moderation_startMuted_restore;
startMutedMetadata[mediaType] = true;
room.jitsiMetadata.startMuted = startMutedMetadata;
room.av_moderation_actors[mediaType] = occupant.nick;
end
else
@@ -218,7 +232,11 @@ function on_message(event)
room.av_moderation[mediaType] = nil;
room.av_moderation_actors[mediaType] = nil;
-- clears room.av_moderation if empty
local startMutedMetadata = room.jitsiMetadata.startMuted or {};
local av_moderation_startMuted_restore = room.av_moderation_startMuted_restore or {};
startMutedMetadata[mediaType] = av_moderation_startMuted_restore[mediaType];
room.jitsiMetadata.startMuted = startMutedMetadata;
local is_empty = true;
for key,_ in pairs(room.av_moderation) do
if room.av_moderation[key] then
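The save/restore dance in the Lua hunks above can be illustrated in TypeScript (the module itself is Lua; the names `RoomState` and `setModeration` are made up for this sketch): enabling AV moderation forces `startMuted` for that media type while remembering the previous value, and disabling restores it.

```typescript
type MediaType = 'audio' | 'video';

interface RoomState {
    startMuted: Partial<Record<MediaType, boolean>>;
    startMutedRestore: Partial<Record<MediaType, boolean>>;
}

// Sketch of the per-mediaType save/restore behavior described above.
function setModeration(room: RoomState, mediaType: MediaType, enabled: boolean): void {
    if (enabled) {
        // Remember the previous policy value so it can be restored later.
        room.startMutedRestore[mediaType] = room.startMuted[mediaType];
        room.startMuted[mediaType] = true;
    } else {
        room.startMuted[mediaType] = room.startMutedRestore[mediaType];
    }
}
```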

View File

@@ -44,10 +44,12 @@ local stanza = event.stanza;
if session.jitsi_meet_context_user ~= nil then
initiator.id = session.jitsi_meet_context_user.id;
else
initiator.id = session.granted_jitsi_meet_context_user_id;
end
if session.jitsi_meet_context_group ~= nil then
initiator.group = session.jitsi_meet_context_group;
end
initiator.group
= session.jitsi_meet_context_group or session.granted_jitsi_meet_context_group_id;
app_data.file_recording_metadata.initiator = initiator
update_app_data = true;

View File

@@ -112,36 +112,48 @@ function filter_stanza(stanza, session)
end
local muc_x = stanza:get_child('x', MUC_NS..'#user');
if not muc_x then
if not muc_x or not presence_check_status(muc_x, '110') then
return stanza;
end
local room = get_room_from_jid(room_jid_match_rewrite(jid.bare(stanza.attr.from)));
if not room or not room.send_default_permissions_to or is_healthcheck_room(room.jid) then
if not room or is_healthcheck_room(room.jid) then
return stanza;
end
if session.auth_token and session.jitsi_meet_context_features then -- token and features are set so skip
room.send_default_permissions_to[bare_to] = nil;
return stanza;
if not room.send_default_permissions_to then
room.send_default_permissions_to = {};
end
-- we are sending permissions only when becoming a member
local is_moderator = false;
for item in muc_x:childtags('item') do
if item.attr.role == 'moderator' then
is_moderator = true;
break;
if not session.force_permissions_update then
if session.auth_token and session.jitsi_meet_context_features then -- token and features are set so skip
room.send_default_permissions_to[bare_to] = nil;
return stanza;
end
-- we are sending permissions only when becoming a member
local is_moderator = false;
for item in muc_x:childtags('item') do
if item.attr.role == 'moderator' then
is_moderator = true;
break;
end
end
if not is_moderator then
return stanza;
end
if not room.send_default_permissions_to[bare_to] then
return stanza;
end
end
if not is_moderator or not room.send_default_permissions_to[bare_to]
or not presence_check_status(muc_x, '110') then
return stanza;
end
session.force_permissions_update = false;
local permissions_to_send = session.granted_jitsi_meet_context_features or default_permissions;
local permissions_to_send
= session.jitsi_meet_context_features or session.granted_jitsi_meet_context_features or default_permissions;
room.send_default_permissions_to[bare_to] = nil;

View File

@@ -10,18 +10,21 @@
-- Component "metadata.jitmeet.example.com" "room_metadata_component"
-- muc_component = "conference.jitmeet.example.com"
-- breakout_rooms_component = "breakout.jitmeet.example.com"
local filters = require 'util.filters';
local jid_node = require 'util.jid'.node;
local json = require 'cjson.safe';
local st = require 'util.stanza';
local jid = require 'util.jid';
local util = module:require 'util';
local is_admin = util.is_admin;
local is_healthcheck_room = util.is_healthcheck_room;
local get_room_from_jid = util.get_room_from_jid;
local room_jid_match_rewrite = util.room_jid_match_rewrite;
local internal_room_jid_match_rewrite = util.internal_room_jid_match_rewrite;
local process_host_module = util.process_host_module;
local MUC_NS = 'http://jabber.org/protocol/muc';
local COMPONENT_IDENTITY_TYPE = 'room_metadata';
local FORM_KEY = 'muc#roominfo_jitsimetadata';
@@ -96,6 +99,8 @@ function room_created(event)
if not room.jitsiMetadata then
room.jitsiMetadata = {};
end
room.sent_initial_metadata = {};
end
function on_message(event)
@@ -281,3 +286,57 @@ if breakout_rooms_component_host then
end
end);
end
-- Send a message update for metadata before sending the first self presence
function filter_stanza(stanza, session)
if not stanza.attr or not stanza.attr.to or stanza.name ~= 'presence'
or stanza.attr.type == 'unavailable' or ends_with(stanza.attr.from, '/focus') then
return stanza;
end
local bare_to = jid.bare(stanza.attr.to);
if is_admin(bare_to) then
return stanza;
end
local muc_x = stanza:get_child('x', MUC_NS..'#user');
if not muc_x or not presence_check_status(muc_x, '110') then
return stanza;
end
local room = get_room_from_jid(room_jid_match_rewrite(jid.bare(stanza.attr.from)));
if not room or not room.sent_initial_metadata or is_healthcheck_room(room.jid) then
return stanza;
end
if room.sent_initial_metadata[bare_to] then
return stanza;
end
local occupant;
for _, o in room:each_occupant() do
if o.bare_jid == bare_to then
occupant = o;
end
end
if not occupant then
module:log('warn', 'No occupant %s found for %s', bare_to, room.jid);
return stanza;
end
room.sent_initial_metadata[bare_to] = true;
send_json_msg(occupant.jid, internal_room_jid_match_rewrite(room.jid), getMetadataJSON(room));
return stanza;
end
function filter_session(session)
-- domain mapper filters at default priority 0, allowners at -1 and
-- permissions at -2; we need to run after all of those
filters.add_filter(session, 'stanzas/out', filter_stanza, -3);
end
-- enable filtering presences
filters.add_filter_hook(filter_session);
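The "send once per occupant" bookkeeping in `filter_stanza` above — trigger on the first self-presence (status 110) and remember the recipient — can be sketched in TypeScript (the function name `shouldSendInitialMetadata` is illustrative, not part of the Lua module):

```typescript
// Sketch: returns true exactly once per bare JID, and only for a
// self-presence; subsequent calls for the same JID are skipped.
function shouldSendInitialMetadata(
        sent: Set<string>,
        bareJid: string,
        isSelfPresence: boolean): boolean {
    if (!isSelfPresence || sent.has(bareJid)) {
        return false;
    }

    sent.add(bareJid);

    return true;
}
```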

View File

@@ -3,7 +3,10 @@
#BASE_URL=
# Room name suffix to use when creating new room names
# ROOM_NAME_SUFFIX=
#ROOM_NAME_SUFFIX=
# Room name prefix to use when creating new room names
#ROOM_NAME_PREFIX=
# To be able to match a domain to a specific address
# The format is "MAP example.com 1.2.3.4"
@@ -43,6 +46,12 @@
# A rest URL to be used by dial-in tests to invite jigasi to the conference
#DIAL_IN_REST_URL=
# A destination number to dial out to, that auto answers and sends media
#DIAL_OUT_URL=
# A destination number to dial out to, that auto answers and sends media audio and video
#SIP_JIBRI_DIAL_OUT_URL=
# Whether to use beta for the first participants
#BROWSER_CHROME_BETA=false
#BROWSER_FF_BETA=false

View File

@@ -1,11 +1,13 @@
import fs from 'node:fs';
import WebSocket from 'ws';
/**
* Uses the webhook proxy service to proxy events to the testing clients.
*/
export default class WebhookProxy {
private url;
private secret;
private readonly url;
private readonly secret;
private logFile;
private ws: WebSocket | undefined;
private cache = new Map();
private listeners = new Map();
@@ -15,10 +17,12 @@ export default class WebhookProxy {
* Initializes the webhook proxy.
* @param url
* @param secret
* @param logFile
*/
constructor(url: string, secret: string) {
constructor(url: string, secret: string, logFile: string) {
this.url = url;
this.secret = secret;
this.logFile = logFile;
}
/**
@@ -40,6 +44,8 @@ export default class WebhookProxy {
this.ws.on('message', (data: any) => {
const msg = JSON.parse(data.toString());
this.logInfo(`${msg.eventType} event: ${JSON.stringify(msg)}`);
if (msg.eventType) {
let processed = false;
@@ -85,6 +91,7 @@ export default class WebhookProxy {
* Clear any stored event.
*/
clearCache() {
this.logInfo('cache cleared');
this.cache.clear();
}
@@ -98,7 +105,11 @@ export default class WebhookProxy {
const error = new Error(`Timeout waiting for event:${eventType}`);
return new Promise((resolve, reject) => {
const waiter = setTimeout(() => reject(error), timeout);
const waiter = setTimeout(() => {
this.logInfo(error.message);
return reject(error);
}, timeout);
this.addConsumer(eventType, event => {
clearTimeout(waiter);
@@ -134,6 +145,22 @@ export default class WebhookProxy {
this.ws.close();
console.log('WebhookProxy disconnected');
this.ws = undefined;
this.logInfo('disconnected');
}
}
/**
* Logs a message in the logfile.
*
* @param {string} message - The message to add.
* @returns {void}
*/
logInfo(message: string) {
try {
// @ts-ignore
fs.appendFileSync(this.logFile, `${new Date().toISOString()} ${message}\n`);
} catch (err) {
console.error(err);
}
}
}
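The reject-on-timeout pattern used by `waitForEvent` above generalizes to any callback registration. A minimal sketch, with an assumed generic name `waitFor` (not the class's actual API):

```typescript
// Sketch: resolve when the registered callback fires, reject if the
// timeout elapses first; the timer is cleared on success so the process
// is not kept alive.
function waitFor<T>(
        register: (cb: (value: T) => void) => void,
        timeoutMs: number): Promise<T> {
    return new Promise((resolve, reject) => {
        const waiter = setTimeout(
            () => reject(new Error(`Timeout waiting for event after ${timeoutMs}ms`)), timeoutMs);

        register(value => {
            clearTimeout(waiter);
            resolve(value);
        });
    });
}
```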

View File

@@ -318,7 +318,8 @@ function getToken(ctx: IContext, displayName: string, moderator = true) {
'features': {
'outbound-call': 'true',
'transcription': 'true',
'recording': 'true'
'recording': 'true',
'sip-outbound-call': true
},
},
'room': '*'

tests/helpers/utils.ts Normal file
View File

@@ -0,0 +1,12 @@
/**
* Generates a random number between 1 and the specified maximum value (inclusive).
*
* @param {number} max - The maximum value for the random number (must be a positive integer).
* @param {number} numberOfDigits - The number of digits to pad the random number with leading zeros.
* @returns {string} The random number formatted with leading zeros if needed.
*/
export function getRandomNumberAsStr(max: number, numberOfDigits: number): string {
const randomNumber = Math.floor(Math.random() * max) + 1;
return randomNumber.toString().padStart(numberOfDigits, '0');
}
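A quick usage sketch of the helper above (the `roomSuffix` variable name is illustrative; the function body is reproduced so the sketch is self-contained):

```typescript
// Pads the random 1..max number with leading zeros to the requested width.
function getRandomNumberAsStr(max: number, numberOfDigits: number): string {
    const randomNumber = Math.floor(Math.random() * max) + 1;

    return randomNumber.toString().padStart(numberOfDigits, '0');
}

// e.g. a 4-character numeric suffix for generated room names.
const roomSuffix = getRandomNumberAsStr(999, 4);
```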

View File

@@ -41,10 +41,10 @@ export default class IframeAPI extends BasePageObject {
addEventListener(eventName: string) {
return this.participant.execute(
(event, prefix) => {
-console.log(`${new Date().toISOString()} ${prefix} Adding listener for event: ${event}`);
+console.log(`${new Date().toISOString()} ${prefix}iframeAPI - Adding listener for event: ${event}`);
window.jitsiAPI.addListener(event, evt => {
console.log(
-    `${new Date().toISOString()} ${prefix} Received ${event} event: ${JSON.stringify(evt)}`);
+    `${new Date().toISOString()} ${prefix}iframeAPI - Received ${event} event: ${JSON.stringify(evt)}`);
window.jitsiAPI.test[event] = evt;
});
}, eventName, LOG_PREFIX);
@@ -89,4 +89,24 @@ export default class IframeAPI extends BasePageObject {
dispose() {
return this.participant.execute(() => window.jitsiAPI.dispose());
}
/**
* Invite the given participant to the meeting via PSTN.
*/
invitePhone(value: string) {
return this.participant.execute(v => window.jitsiAPI.invite([ {
type: 'phone',
number: v
} ]), value);
}
/**
* Invite the given participant to the meeting via sip (sip jibri).
*/
inviteSIP(value: string) {
return this.participant.execute(v => window.jitsiAPI.invite([ {
type: 'sip',
address: v
} ]), value);
}
}
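Both new page-object methods forward a single-entry array to the iframe API's `invite` command, differing only in the entry shape (`number` for PSTN, `address` for SIP). A minimal sketch of those payload shapes; the `InviteEntry` type and `buildInvites` helper are illustrative names, not part of the API:

```typescript
// Assumed shapes of the entries passed to window.jitsiAPI.invite([...]),
// mirroring invitePhone() and inviteSIP() above.
type InviteEntry =
    | { type: 'phone'; number: string }
    | { type: 'sip'; address: string };

// Hypothetical helper: collect whichever targets are configured into one
// invite() payload.
function buildInvites(phone?: string, sip?: string): InviteEntry[] {
    const invites: InviteEntry[] = [];

    if (phone) {
        invites.push({ type: 'phone', number: phone });
    }
    if (sip) {
        invites.push({ type: 'sip', address: sip });
    }

    return invites;
}
```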


@@ -197,7 +197,7 @@ async function checkReceivingChunks(p1: Participant, p2: Participant, webhooksPr
// @ts-ignore
const firstEntryData = result[0].value.data;
-const stable = firstEntryData.stable;
+const stable = firstEntryData.stable || firstEntryData.final;
const language = firstEntryData.language;
const messageID = firstEntryData.messageID;
const p1Id = await p1.getEndpointId();
@@ -210,7 +210,7 @@ async function checkReceivingChunks(p1: Participant, p2: Participant, webhooksPr
return v.data;
}).forEach(tr => {
-const checkTranscripts = stable.includes(tr.stable) || tr.stable.includes(stable);
+const checkTranscripts = stable.includes(tr.stable || tr.final) || (tr.stable || tr.final).includes(stable);
if (!checkTranscripts) {
console.log('received events', result);

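The change above makes the transcript comparison tolerate chunks that carry a `final` field instead of `stable`, falling back to whichever is present. A minimal sketch of that fallback, under the assumption that the two fields are mutually exclusive per chunk (`TranscriptChunk`, `chunkText` and `sameUtterance` are illustrative names):

```typescript
// Assumed chunk shape: transcription events carry their text in either
// `stable` or `final`, depending on the backend.
interface TranscriptChunk {
    stable?: string;
    final?: string;
}

// Pick whichever text field is present.
function chunkText(chunk: TranscriptChunk): string {
    return chunk.stable ?? chunk.final ?? '';
}

// Two chunks match when either text contains the other, mirroring the
// includes() check in the test above.
function sameUtterance(a: TranscriptChunk, b: TranscriptChunk): boolean {
    const ta = chunkText(a);
    const tb = chunkText(b);

    return ta.includes(tb) || tb.includes(ta);
}
```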

@@ -30,7 +30,9 @@ describe('Codec selection', () => {
// Check if p1 is sending VP9 and p2 is sending VP8 as per their codec preferences.
// Except on Firefox because it doesn't support VP9 encode.
-if (p1.driver.isFirefox) {
+const majorVersion = parseInt(p1.driver.capabilities.browserVersion || '0', 10);
+
+if (p1.driver.isFirefox && majorVersion < 136) {
expect(await p1.execute(() => JitsiMeetJS.app.testing.isLocalCameraEncodingVp8())).toBe(true);
} else {
expect(await p1.execute(() => JitsiMeetJS.app.testing.isLocalCameraEncodingVp9())).toBe(true);
@@ -52,11 +54,11 @@ describe('Codec selection', () => {
// Check if media is playing on p3.
expect(await p3.execute(() => JitsiMeetJS.app.testing.isLargeVideoReceived())).toBe(true);
-// Check if p1 is encoding in VP9, p2 in VP8 and p3 in AV1 as per their codec preferences.
-// Except on Firefox because it doesn't support AV1/VP9 encode and AV1 decode.
+const majorVersion = parseInt(p1.driver.capabilities.browserVersion || '0', 10);
+
+// Check if p1 is encoding in VP9, p2 in VP8 and p3 in AV1 as per their codec preferences.
+// Except on Firefox because it doesn't support VP9 encode.
-if (p1.driver.isFirefox) {
+if (p1.driver.isFirefox && majorVersion < 136) {
expect(await p1.execute(() => JitsiMeetJS.app.testing.isLocalCameraEncodingVp8())).toBe(true);
} else {
expect(await p1.execute(() => JitsiMeetJS.app.testing.isLocalCameraEncodingVp9())).toBe(true);
@@ -85,7 +87,9 @@ describe('Codec selection', () => {
const { p1, p2 } = ctx;
// Disable this test on Firefox because it doesn't support VP9 encode.
-if (p1.driver.isFirefox) {
+const majorVersion = parseInt(p1.driver.capabilities.browserVersion || '0', 10);
+
+if (p1.driver.isFirefox && majorVersion < 136) {
return;
}
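All three hunks apply the same gate: only Firefox builds older than major version 136 (the threshold used in this diff) are still expected to fall back to VP8. The predicate can be sketched on its own; `expectsVp8` is an illustrative name, not a helper in the test suite:

```typescript
// Assumed gate from the codec tests above: older Firefox lacks VP9 encode,
// so it is expected to send VP8; everything else follows its configured
// codec preference. browserVersion comes from WebDriver capabilities and
// may be undefined, hence the '0' fallback.
function expectsVp8(isFirefox: boolean, browserVersion?: string): boolean {
    const majorVersion = parseInt(browserVersion || '0', 10);

    return isFirefox && majorVersion < 136;
}
```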


@@ -76,7 +76,6 @@ describe('StartMuted', () => {
await p3.getParticipantsPane().assertVideoMuteIconIsDisplayed(p2, true);
});
it('config options test', async () => {
await hangupAllParticipants();
@@ -92,14 +91,20 @@ describe('StartMuted', () => {
};
await ensureOneParticipant(ctx, options);
-await joinSecondParticipant(ctx, { skipInMeetingChecks: true });
+await joinSecondParticipant(ctx, {
+    ...options,
+    skipInMeetingChecks: true
+});
const { p2 } = ctx;
await p2.waitForIceConnected();
await p2.waitForSendReceiveData({ checkSend: false });
-await joinThirdParticipant(ctx, { skipInMeetingChecks: true });
+await joinThirdParticipant(ctx, {
+    ...options,
+    skipInMeetingChecks: true
+});
const { p3 } = ctx;
@@ -110,10 +115,8 @@ describe('StartMuted', () => {
const p2ID = await p2.getEndpointId();
p1.log(`Start configOptionsTest, second participant: ${p2ID}`);
// Participant 3 should be muted, 1 and 2 unmuted.
-await p3.getFilmstrip().assertAudioMuteIconIsDisplayed(p3);
-await p3.getParticipantsPane().assertVideoMuteIconIsDisplayed(p3);


@@ -1,8 +1,7 @@
import https from 'node:https';
import process from 'node:process';
import { ensureOneParticipant } from '../../helpers/participants';
-import { cleanup, isDialInEnabled, waitForAudioFromDialInParticipant } from '../helpers/DialIn';
+import { cleanup, dialIn, isDialInEnabled, retrievePin, waitForAudioFromDialInParticipant } from '../helpers/DialIn';
describe('Dial-In', () => {
it('join participant', async () => {
@@ -13,7 +12,7 @@ describe('Dial-In', () => {
return;
}
-await ensureOneParticipant(ctx);
+await ensureOneParticipant(ctx, { preferGenerateToken: true });
// check dial-in is enabled
if (!await isDialInEnabled(ctx.p1)) {
@@ -22,59 +21,25 @@ describe('Dial-In', () => {
});
it('retrieve pin', async () => {
-let dialInPin;
try {
-    dialInPin = await ctx.p1.getInviteDialog().getPinNumber();
+    await retrievePin(ctx.p1);
} catch (e) {
    console.error('dial-in.test.no-pin');
    ctx.skipSuiteTests = true;
    throw e;
}
-await ctx.p1.getInviteDialog().clickCloseButton();
-if (dialInPin.length === 0) {
+if (ctx.data.dialInPin === 0) {
    console.error('dial-in.test.no-pin');
    ctx.skipSuiteTests = true;
    throw new Error('no pin');
}
-expect(dialInPin.length >= 8).toBe(true);
-ctx.data.dialInPin = dialInPin;
+expect(ctx.data.dialInPin.length >= 8).toBe(true);
});
it('invite dial-in participant', async () => {
if (!await ctx.p1.isInMuc()) {
// local participant did not join abort
return;
}
-const restUrl = process.env.DIAL_IN_REST_URL?.replace('{0}', ctx.data.dialInPin);
-// we have already checked in the first test that DIAL_IN_REST_URL exist so restUrl cannot be ''
-const responseData: string = await new Promise((resolve, reject) => {
-    https.get(restUrl || '', res => {
-        let data = '';
-        res.on('data', chunk => {
-            data += chunk;
-        });
-        res.on('end', () => {
-            ctx.times.restAPIExecutionTS = performance.now();
-            resolve(data);
-        });
-    }).on('error', err => {
-        console.error('dial-in.test.restAPI.request.fail');
-        console.error(err);
-        reject(err);
-    });
-});
-console.log(`dial-in.test.call_session_history_id:${JSON.parse(responseData).call_session_history_id}`);
+await dialIn(ctx.p1);
});
it('wait for audio from dial-in participant', async () => {


@@ -0,0 +1,205 @@
import { ensureOneParticipant } from '../../helpers/participants';
import {
cleanup,
dialIn,
isDialInEnabled,
retrievePin,
waitForAudioFromDialInParticipant
} from '../helpers/DialIn';
import type { Participant } from '../../helpers/Participant';
describe('Invite iframeAPI', () => {
it('join participant', async () => {
await ensureOneParticipant(ctx);
const { p1 } = ctx;
// check for dial-in dial-out sip-jibri maybe
if (await p1.execute(() => config.disableIframeAPI)) {
// skip the test if iframeAPI is disabled
ctx.skipSuiteTests = true;
return;
}
ctx.data.dialOutDisabled = Boolean(!await p1.execute(() => config.dialOutAuthUrl));
ctx.data.sipJibriDisabled = Boolean(!await p1.execute(() => config.inviteServiceUrl));
// check dial-in is enabled
if (!await isDialInEnabled(ctx.p1) || !process.env.DIAL_IN_REST_URL) {
ctx.data.dialInDisabled = true;
}
});
it('dial-in', async () => {
if (ctx.data.dialInDisabled) {
return;
}
const { p1 } = ctx;
await retrievePin(p1);
expect(ctx.data.dialInPin.length >= 8).toBe(true);
await dialIn(p1);
if (!await p1.isInMuc()) {
// local participant did not join abort
return;
}
await waitForAudioFromDialInParticipant(p1);
await checkDialEvents(p1, 'in', 'DIAL_IN_STARTED', 'DIAL_IN_ENDED');
});
it('dial-out', async () => {
if (ctx.data.dialOutDisabled || !process.env.DIAL_OUT_URL) {
return;
}
const { p1 } = ctx;
await p1.switchToAPI();
await p1.getIframeAPI().invitePhone(process.env.DIAL_OUT_URL);
await p1.switchInPage();
await p1.waitForParticipants(1);
await waitForAudioFromDialInParticipant(p1);
await checkDialEvents(p1, 'out', 'DIAL_OUT_STARTED', 'DIAL_OUT_ENDED');
});
it('sip jibri', async () => {
if (ctx.data.sipJibriDisabled || !process.env.SIP_JIBRI_DIAL_OUT_URL) {
return;
}
const { p1 } = ctx;
await p1.switchToAPI();
await p1.getIframeAPI().inviteSIP(process.env.SIP_JIBRI_DIAL_OUT_URL);
await p1.switchInPage();
await p1.waitForParticipants(1);
await waitForAudioFromDialInParticipant(p1);
const { webhooksProxy } = ctx;
if (webhooksProxy) {
const customerId = process.env.IFRAME_TENANT?.replace('vpaas-magic-cookie-', '');
const sipCallOutStartedEvent: {
customerId: string;
data: {
participantFullJid: string;
participantId: string;
participantJid: string;
sipAddress: string;
};
eventType: string;
} = await webhooksProxy.waitForEvent('SIP_CALL_OUT_STARTED');
expect('SIP_CALL_OUT_STARTED').toBe(sipCallOutStartedEvent.eventType);
expect(sipCallOutStartedEvent.data.sipAddress).toBe(`sip:${process.env.SIP_JIBRI_DIAL_OUT_URL}`);
expect(sipCallOutStartedEvent.customerId).toBe(customerId);
const participantId = sipCallOutStartedEvent.data.participantId;
const participantJid = sipCallOutStartedEvent.data.participantJid;
const participantFullJid = sipCallOutStartedEvent.data.participantFullJid;
await cleanup(p1);
const sipCallOutEndedEvent: {
customerId: string;
data: {
direction: string;
participantFullJid: string;
participantId: string;
participantJid: string;
};
eventType: string;
} = await webhooksProxy.waitForEvent('SIP_CALL_OUT_ENDED');
expect('SIP_CALL_OUT_ENDED').toBe(sipCallOutEndedEvent.eventType);
expect(sipCallOutEndedEvent.customerId).toBe(customerId);
expect(sipCallOutEndedEvent.data.participantFullJid).toBe(participantFullJid);
expect(sipCallOutEndedEvent.data.participantId).toBe(participantId);
expect(sipCallOutEndedEvent.data.participantJid).toBe(participantJid);
} else {
await cleanup(p1);
}
});
});
/**
 * Checks the dial events for a participant and cleans up at the end.
 *
 * @param participant
 * @param direction
 * @param startedEventName
 * @param endedEventName
 */
async function checkDialEvents(participant: Participant, direction: string, startedEventName: string, endedEventName: string) {
const { webhooksProxy } = ctx;
if (webhooksProxy) {
const customerId = process.env.IFRAME_TENANT?.replace('vpaas-magic-cookie-', '');
const dialInStartedEvent: {
customerId: string;
data: {
direction: string;
participantFullJid: string;
participantId: string;
participantJid: string;
};
eventType: string;
} = await webhooksProxy.waitForEvent(startedEventName);
expect(startedEventName).toBe(dialInStartedEvent.eventType);
expect(dialInStartedEvent.data.direction).toBe(direction);
expect(dialInStartedEvent.customerId).toBe(customerId);
const participantId = dialInStartedEvent.data.participantId;
const participantJid = dialInStartedEvent.data.participantJid;
const participantFullJid = dialInStartedEvent.data.participantFullJid;
const usageEvent: {
customerId: string;
data: any;
eventType: string;
} = await webhooksProxy.waitForEvent('USAGE');
expect('USAGE').toBe(usageEvent.eventType);
expect(usageEvent.customerId).toBe(customerId);
expect(usageEvent.data.some((el: any) =>
el.participantId === participantId && el.callDirection === direction)).toBe(true);
await cleanup(participant);
const dialInEndedEvent: {
customerId: string;
data: {
direction: string;
participantFullJid: string;
participantId: string;
participantJid: string;
};
eventType: string;
} = await webhooksProxy.waitForEvent(endedEventName);
expect(endedEventName).toBe(dialInEndedEvent.eventType);
expect(dialInEndedEvent.customerId).toBe(customerId);
expect(dialInEndedEvent.data.participantFullJid).toBe(participantFullJid);
expect(dialInEndedEvent.data.participantId).toBe(participantId);
expect(dialInEndedEvent.data.participantJid).toBe(participantJid);
} else {
await cleanup(participant);
}
}
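The `USAGE` assertion inside `checkDialEvents` hinges on one `Array.prototype.some` call: at least one usage entry must match both the participant and the call direction. A minimal sketch of that check; the `UsageEntry` type and `hasUsageFor` helper are illustrative names covering only the two fields the test inspects:

```typescript
// Assumed subset of a USAGE event entry; the real payload carries more
// fields, but the test above only matches on these two.
interface UsageEntry {
    participantId: string;
    callDirection: string;
}

// True when at least one entry belongs to the given participant with the
// expected direction ('in' or 'out'), mirroring the some() check above.
function hasUsageFor(entries: UsageEntry[], participantId: string, direction: string): boolean {
    return entries.some(el =>
        el.participantId === participantId && el.callDirection === direction);
}
```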

Some files were not shown because too many files have changed in this diff.