Compare commits


31 Commits

Author SHA1 Message Date
Saúl Ibarra Corretgé
013212b753 fix(chat) hide private message option in context menu
If disablePrivateChat is configured.
2025-06-27 13:49:15 +02:00
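
For reference, a minimal sketch of the gating logic described above, mirroring the shouldDisplayChatMessageMenu change visible in the diff further down; the helper name here is illustrative, not part of the change.

```typescript
// Minimal sketch: hide the private-message option when the deployment disables private chat.
// The helper name is illustrative; the actual change gates shouldDisplayChatMessageMenu.
function shouldShowPrivateMessageOption(
        disablePrivateChat: boolean | undefined,
        messageParticipantId: string,
        localParticipantId?: string): boolean {
    return !disablePrivateChat && messageParticipantId !== localParticipantId;
}
```
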
Mihaela Dumitru
d741fcdd1c fix(recordings) create missing local tracks when unmuting after consent (#16119)
* fix(recordings) create missing local tracks when unmuting after consent

* fix(conference) Avoid creating duplicate tracks on unmute

* squash: Ignore TS linter error

---------

Co-authored-by: Jaya Allamsetty <jaya.allamsetty@8x8.com>
2025-06-10 13:36:23 -04:00
damencho
744818c225 fix(permissions): Adds an option to force-send permissions.
If the backend modifies permissions, it can force sending those on the initial presence.
2025-05-23 21:38:03 -05:00
damencho
7d30a665f7 feat(prosody): Check granted identity for recordings. 2025-05-23 12:23:32 -05:00
damencho
d432f1c881 feat(av-moderation): Updates startMuted policy in metadata. 2025-05-22 10:25:30 -05:00
damencho
dfec5b73c0 feat(av-moderation): Disable start muted settings when av moderation is on. 2025-05-22 10:25:19 -05:00
damencho
4898160200 feat(metadata): Pushes metadata early before join. 2025-05-22 10:25:05 -05:00
Jaya Allamsetty
ea47070dd2 fix(conference) Mute user when startMuted policy update is received in conference meta data (#16025) 2025-05-22 10:20:34 -04:00
Saúl Ibarra Corretgé
2cf8ae838c fix(spot) make Spot TV detection more resilient
Setting the UA string in Electron doesn't propagate the change to the
iframe where the meeting is loaded (🤦).

Thus make it more resilient by trying different things:

- A freshly introduced "iAmSpot" config option, similar to Jibri
- The app ID is present in the UA string, so we can test for that
- As a last-ditch effort, check if the display name is the default
  "Meeting Room"
2025-05-16 15:08:42 +02:00
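
The resulting detection (see the isSpotTV helper added in the diff below) tries those checks in order; a condensed sketch, with the config shape simplified:

```typescript
// Condensed from the isSpotTV() helper added in this change; config shape simplified.
function isSpotTV(config: { iAmSpot?: boolean; defaultLocalDisplayName?: string; }): boolean {
    return Boolean(config.iAmSpot)                             // explicit config option
        || navigator.userAgent.includes('JitsiSpot/')          // Jitsi Spot app ID in the UA
        || navigator.userAgent.includes('8x8MeetingRooms/')    // 8x8 Meeting Rooms app ID
        || config.defaultLocalDisplayName === 'Meeting Room';  // last-ditch default display name check
}
```
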
Saúl Ibarra Corretgé
f162a56adb fix(recording) fix matching initiator
LJM will use either a JitsiParticipant object or a string for the recording session
initiator; handle both cases when checking whether it's ourselves.
2025-05-15 21:06:28 -05:00
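
A minimal sketch of handling both initiator shapes, assuming a JitsiParticipant-like object exposes getId(); names are illustrative and not the actual change:

```typescript
// Sketch only: normalize the initiator, which may be a participant object or a plain ID string.
interface IParticipantLike {
    getId: () => string;
}

function isInitiatedByLocalUser(
        initiator: IParticipantLike | string | undefined,
        localId: string): boolean {
    const initiatorId = typeof initiator === 'string' ? initiator : initiator?.getId();

    return initiatorId === localId;
}
```
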
Hristo Terezov
865207649a Revert typography values in tokens to px from rem (#16026)
* Revert "feat(base/ui/native): Convert rem to px  (#15934)"

This reverts commit 057dc0e4d2.

* Revert "fix(StageParticipantNameLabel): size"

This reverts commit a01f4468a0.

* Revert "fix(subtitles): position part1"

This reverts commit 6c6ed8d7a8.

* Revert "fix(ITypographyType): wrong type of fontSize and lineHeight props"

This reverts commit bffcc9092b.

* revert(Tokens): font sizes and line heights back to px from rem

Turns out there are many places that do not expect rem. Temporarily reverting this change from commit 6fa94b0bb4. We should bring it back along with proper handling of rem everywhere.
2025-05-14 10:04:20 -05:00
damencho
3e46011352 fix: Fixes ljm branch. 2025-05-12 16:22:19 -05:00
Hristo Terezov
243acb4a0f fix(ITypographyType): wrong type of fontSize and lineHeight props
In a previous commit about accessibility we changed the font size and line height to use rem (expressed as a string) instead of numbers for px, but the types for the interface were not updated.
2025-05-07 20:31:32 -05:00
Hristo Terezov
def8062141 fix(StageParticipantNameLabel): size
Fixes an issue where StageParticipantNameLabel is smaller than expected. This happens because the font size and line height props are calculated to an invalid (NaN) value after we started using rem instead of px for lineHeight and fontSize in the theme.
Reference: #15917
2025-05-07 19:37:36 -05:00
Hristo Terezov
15ec3a25cb fix(subtitles): position part1
Fixes an issue where subtitles are displayed in the middle of the screen. This happens because the bottom prop is calculated to an invalid (NaN) value after we started using rem instead of px for lineHeight in the theme.
Reference: https://github.com/jitsi/jitsi-meet/pull/15917
2025-05-07 19:36:07 -05:00
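
A small illustration of the failure mode shared by the two fixes above: arithmetic on a rem string yields NaN, which then poisons every derived style value. Values are illustrative.

```typescript
// Illustrative only: a rem string where a number is expected breaks layout math.
const lineHeight: unknown = '1.25rem';      // previously 20 (px)
const bottom = (lineHeight as number) * 2;  // NaN at runtime
console.log(Number.isNaN(bottom));          // true — the computed style value is invalid
```
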
Saúl Ibarra Corretgé
603d239884 feat(recording) add ability to skip consent in-meeting
When turned on, the consent dialog won't be displayed for users who are already in
the meeting; it will only be displayed to those who join after the recording has
started.
2025-05-07 15:17:25 +03:00
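
The corresponding options land in config.js (see the commented-out example in the diff below); a sketch of how a deployment might set them, with illustrative values:

```typescript
// Illustrative values; the real options are documented as comments in config.js.
const recordings = {
    requireConsent: true,            // mute A/V when a recording begins and ask for consent
    skipConsentInMeeting: true,      // skip consent for users already in the meeting
    consentLearnMoreLink: 'https://jitsi.org/meet/consent'  // "Learn more" link in the dialog
};
```
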
Saúl Ibarra Corretgé
67fcfeff43 fix(recording) prevent multiple consent requests
A given recording should only trigger a single consent request.

The mechanism that notifies about recording status updates may fire multiple times,
since it's tied to XMPP presence and can send updates, for example when the live
stream view URL is set.

Rather than trying to handle all possible corner cases to make sure we only show
the consent dialog once, keep track of the recording session IDs for which we
_have_ asked for consent and skip the dialog if we have already done so.
2025-05-07 15:16:07 +03:00
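
A minimal sketch of that dedup idea, assuming a simple in-memory set keyed by recording session ID; the real change uses a MARK_CONSENT_REQUESTED Redux action and the consentRequested state in the recording feature (see the diff below).

```typescript
// Sketch only: ask for consent at most once per recording session.
const consentRequestedSessions = new Set<string>();

function maybeRequestConsent(sessionId: string, showConsentDialog: () => void): void {
    if (consentRequestedSessions.has(sessionId)) {
        // We already asked for this session; later presence updates must not re-open the dialog.
        return;
    }
    consentRequestedSessions.add(sessionId);
    showConsentDialog();
}
```
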
Дамян Минков
905cfce884 feat(tests): Use more predictable room names. (#15998)
* feat(tests): Use more predictable room names.

* squash: Make sure room name is in lowercase.
2025-05-06 12:09:37 -05:00
Andrei Gavrilescu
734d7e3bd0 fix(popover): touch interaction closes overflow drawer without triggering action
* automatic drawer toolbox on mobile browser

* fix touch interaction on Popover
2025-05-06 16:23:27 +03:00
Saúl Ibarra Corretgé
92c22be002 feat(lang,settings) remove experimental label from multi-pinning 2025-05-06 15:21:51 +02:00
Saúl Ibarra Corretgé
8a300be738 feat(recording) refactor consent dialog (#15985)
* feat(recording) refactor consent dialog

Offer 2 choices and add a configurable "learn more" link.

* hide dialog and display link conditionally

* native changes

---------

Co-authored-by: Mihaela Dumitru <mihdmt@gmail.com>
2025-05-06 15:43:09 +03:00
damencho
3859b8a8c2 feat(tests): Validate-shard tests improvements.
feat(tests): Prefer to generate token for dial in.

feat(tests): Adds invite test. (#15986)

* feat(tests): Adds invite test.

Tests dial-in, dial-out and inviting sip-jibri.

* squash: Extract duplicate code in a function.

* squash: Fixes comments.

feat(tests): Handle and final transcriptions.

feat(tests): Adds debug log for webhooks.
2025-05-05 08:36:54 -05:00
Hristo Terezov
e93990a1f2 chore(package.json): Use LJM from release branch 2025-04-30 10:08:32 -05:00
damencho
9fb4618ffe fix(prosody): Adds a nil check for ends_with utility. 2025-04-28 15:44:36 -05:00
damencho
f06359b9d1 fix(prosody): Fixes filter rayo message when int id is used.
Make sure we add string values to the stanza.
2025-04-28 14:18:12 -05:00
Hristo Terezov
0e0e18ad52 feat(toolbar): Enable 9th and 10th button 2025-04-22 15:51:07 -05:00
Saúl Ibarra Corretgé
0c0bb4991e fix(recording) skip consent dialog on Spot TV 2025-04-17 21:35:13 +02:00
Saúl Ibarra Corretgé
b2578c140e fix(polls) halt processing of malformed polls
We need to return something other than nil in order to halt the
processing of the event.

https://prosody.im/doc/developers/moduleapi#modulehook_event_name_handler_priority
2025-04-17 21:35:03 +02:00
damencho
33d3e971ca fix(prosody): Fixes extracting domain from rooms without a domain. 2025-04-11 11:21:30 -05:00
Дамян Минков
5092407555 * feat(tests): Simplifies display names and participant create.
* feat(tests): Simplifies display names and participant create.

Moves token creation to only when it is needed.

* squash: Skip webhook check of user id for guest participants.

* squash: Waits for kick reason dialog.

* squash: Simplifies by matching participant name and display name.

* squash: Drop displayname field.
2025-04-11 11:20:55 -05:00
Hristo Terezov
f4bf25ba6c fix(DesktopPicker): Stops displaying if closed too fast.
If the desktop picker window is closed before we load the sources, a JS error is thrown. From there the app goes into a broken state where pressing the screen sharing button does nothing. Explanation:
When the error from the _onCloseModal handler is thrown, we don't reach the line that calls the onSourceChoose callback. As a result we never call the callback received by setDisplayMediaRequestHandler. It seems that when this happens, Electron won't invoke setDisplayMediaRequestHandler on subsequent gDM calls, and therefore we don't display the desktop picker.
2025-04-11 10:13:02 -05:00
89 changed files with 1116 additions and 574 deletions

View File

@@ -89,7 +89,7 @@ import {
setVideoMuted,
setVideoUnmutePermissions
} from './react/features/base/media/actions';
import { MEDIA_TYPE, VIDEO_TYPE } from './react/features/base/media/constants';
import { MEDIA_TYPE, VIDEO_MUTISM_AUTHORITY, VIDEO_TYPE } from './react/features/base/media/constants';
import {
getStartWithAudioMuted,
getStartWithVideoMuted,
@@ -131,7 +131,6 @@ import {
createLocalTracksF,
getLocalJitsiAudioTrack,
getLocalJitsiVideoTrack,
getLocalTracks,
getLocalVideoTrack,
isLocalTrackMuted,
isUserInteractionRequiredForUnmute
@@ -206,23 +205,6 @@ function sendData(command, value) {
room.sendCommand(command, { value });
}
/**
* Mute or unmute local audio stream if it exists.
* @param {boolean} muted - if audio stream should be muted or unmuted.
*/
function muteLocalAudio(muted) {
APP.store.dispatch(setAudioMuted(muted));
}
/**
* Mute or unmute local video stream if it exists.
* @param {boolean} muted if video stream should be muted or unmuted.
*
*/
function muteLocalVideo(muted) {
APP.store.dispatch(setVideoMuted(muted));
}
/**
* A queue for the async replaceLocalTrack action so that multiple audio
* replacements cannot happen simultaneously. This solves the issue where
@@ -709,11 +691,10 @@ export default {
* Simulates toolbar button click for audio mute. Used by shortcuts and API.
*
* @param {boolean} mute true for mute and false for unmute.
* @param {boolean} [showUI] when set to false will not display any error
* dialogs in case of media permissions error.
* @returns {Promise}
*/
async muteAudio(mute, showUI = true) {
async muteAudio(mute) {
const state = APP.store.getState();
if (!mute
@@ -732,47 +713,7 @@ export default {
return;
}
// Not ready to modify track's state yet
if (!this._localTracksInitialized) {
// This will only modify base/media.audio.muted which is then synced
// up with the track at the end of local tracks initialization.
muteLocalAudio(mute);
this.updateAudioIconEnabled();
return;
} else if (this.isLocalAudioMuted() === mute) {
// NO-OP
return;
}
const localAudio = getLocalJitsiAudioTrack(APP.store.getState());
if (!localAudio && !mute) {
const maybeShowErrorDialog = error => {
showUI && APP.store.dispatch(notifyMicError(error));
};
APP.store.dispatch(gumPending([ MEDIA_TYPE.AUDIO ], IGUMPendingState.PENDING_UNMUTE));
await createLocalTracksF({ devices: [ 'audio' ] })
.then(([ audioTrack ]) => audioTrack)
.catch(error => {
maybeShowErrorDialog(error);
// Rollback the audio muted status by using null track
return null;
})
.then(async audioTrack => {
await this._maybeApplyAudioMixerEffect(audioTrack);
return this.useAudioStream(audioTrack);
})
.finally(() => {
APP.store.dispatch(gumPending([ MEDIA_TYPE.AUDIO ], IGUMPendingState.NONE));
});
} else {
muteLocalAudio(mute);
}
await APP.store.dispatch(setAudioMuted(mute, true));
},
/**
@@ -802,10 +743,9 @@ export default {
/**
* Simulates toolbar button click for video mute. Used by shortcuts and API.
* @param mute true for mute and false for unmute.
* @param {boolean} [showUI] when set to false will not display any error
* dialogs in case of media permissions error.
*/
muteVideo(mute, showUI = true) {
muteVideo(mute) {
if (this.videoSwitchInProgress) {
logger.warn('muteVideo - unable to perform operations while video switch is in progress');
@@ -826,60 +766,7 @@ export default {
return;
}
// If not ready to modify track's state yet adjust the base/media
if (!this._localTracksInitialized) {
// This will only modify base/media.video.muted which is then synced
// up with the track at the end of local tracks initialization.
muteLocalVideo(mute);
this.setVideoMuteStatus();
return;
} else if (this.isLocalVideoMuted() === mute) {
// NO-OP
return;
}
const localVideo = getLocalJitsiVideoTrack(state);
if (!localVideo && !mute && !this.isCreatingLocalTrack) {
const maybeShowErrorDialog = error => {
showUI && APP.store.dispatch(notifyCameraError(error));
};
this.isCreatingLocalTrack = true;
APP.store.dispatch(gumPending([ MEDIA_TYPE.VIDEO ], IGUMPendingState.PENDING_UNMUTE));
// Try to create local video if there wasn't any.
// This handles the case when user joined with no video
// (dismissed screen sharing screen or in audio only mode), but
// decided to add it later on by clicking on muted video icon or
// turning off the audio only mode.
//
// FIXME when local track creation is moved to react/redux
// it should take care of the use case described above
createLocalTracksF({ devices: [ 'video' ] })
.then(([ videoTrack ]) => videoTrack)
.catch(error => {
// FIXME should send some feedback to the API on error ?
maybeShowErrorDialog(error);
// Rollback the video muted status by using null track
return null;
})
.then(videoTrack => {
logger.debug(`muteVideo: calling useVideoStream for track: ${videoTrack}`);
return this.useVideoStream(videoTrack);
})
.finally(() => {
this.isCreatingLocalTrack = false;
APP.store.dispatch(gumPending([ MEDIA_TYPE.VIDEO ], IGUMPendingState.NONE));
});
} else {
// FIXME show error dialog if it fails (should be handled by react)
muteLocalVideo(mute);
}
APP.store.dispatch(setVideoMuted(mute, VIDEO_MUTISM_AUTHORITY.USER, true));
},
/**
@@ -1829,35 +1716,6 @@ export default {
onStartMutedPolicyChanged(audio, video));
}
);
room.on(JitsiConferenceEvents.STARTED_MUTED, () => {
const audioMuted = room.isStartAudioMuted();
const videoMuted = room.isStartVideoMuted();
const localTracks = getLocalTracks(APP.store.getState()['features/base/tracks']);
const promises = [];
APP.store.dispatch(setAudioMuted(audioMuted));
APP.store.dispatch(setVideoMuted(videoMuted));
// Remove the tracks from the peerconnection.
for (const track of localTracks) {
// Always add the track on Safari because of a known issue where audio playout doesn't happen
// if the user joins audio and video muted, i.e., if there is no local media capture.
if (audioMuted && track.jitsiTrack?.getType() === MEDIA_TYPE.AUDIO && !browser.isWebKitBased()) {
promises.push(this.useAudioStream(null));
}
if (videoMuted && track.jitsiTrack?.getType() === MEDIA_TYPE.VIDEO) {
promises.push(this.useVideoStream(null));
}
}
Promise.allSettled(promises)
.then(() => {
APP.store.dispatch(showNotification({
titleKey: 'notify.mutedTitle',
descriptionKey: 'notify.muted'
}, NOTIFICATION_TIMEOUT_TYPE.SHORT));
});
});
room.on(
JitsiConferenceEvents.DATA_CHANNEL_OPENED, () => {

View File

@@ -398,6 +398,10 @@ var config = {
// // If true, mutes audio and video when a recording begins and displays a dialog
// // explaining the effect of unmuting.
// // requireConsent: true,
// // If true consent will be skipped for users who are already in the meeting.
// // skipConsentInMeeting: true,
// // Link for the recording consent dialog's "Learn more" link.
// // consentLearnMoreLink: 'https://jitsi.org/meet/consent',
// },
// recordingService: {

View File

@@ -1111,7 +1111,7 @@
"incomingMessage": "Příchozí zpráva",
"language": "Jazyk",
"loggedIn": "Přihlášen/a jako {{name}}",
"maxStageParticipants": "Maximální počet účastníků, které lze připnout na hlavní pódium (EXPERIMENTÁLNÍ)",
"maxStageParticipants": "Maximální počet účastníků, které lze připnout na hlavní pódium",
"microphones": "Mikrofony",
"moderator": "Moderátor",
"moderatorOptions": "Možnosti moderátora",

View File

@@ -984,7 +984,7 @@
"incomingMessage": "Εισερχόμενο μήνυμα",
"language": "Γλώσσα",
"loggedIn": "Συνδέθηκε ως {{name}}",
"maxStageParticipants": "Μέγιστος αριθμός συμμετεχόντων που μπορούν να διατηρηθούν στην κύρια σκηνή (ΠΕΙΡΑΜΑΤΙΚΟ)",
"maxStageParticipants": "Μέγιστος αριθμός συμμετεχόντων που μπορούν να διατηρηθούν στην κύρια σκηνή",
"microphones": "Μικρόφωνα",
"moderator": "Συντονιστής",
"moderatorOptions": "Επιλογές συντονιστή",

View File

@@ -1070,7 +1070,7 @@
"incomingMessage": "Envena mesaĝo",
"language": "Lingvo",
"loggedIn": "Ensalutinta kiels {{name}}",
"maxStageParticipants": "Maksimuma nombro da partoprenantoj, kiuj povas esti alpinglitaj al la ĉefa scenejo (EXPERIMENTA)",
"maxStageParticipants": "Maksimuma nombro da partoprenantoj, kiuj povas esti alpinglitaj al la ĉefa scenejo",
"microphones": "Mikrofonoj",
"moderator": "Kunvenestro",
"moderatorOptions": "Kunvenestaj agordoj",

View File

@@ -1026,7 +1026,7 @@
"incomingMessage": "پیام ورودی",
"language": "زبان",
"loggedIn": "واردشده به عنوان {{name}}",
"maxStageParticipants": "بیشینه تعداد شرکت‌کنندگانی که می‌توانند به صحنه اصلی سنجاق شوند (<b>آزمایشی</b>)",
"maxStageParticipants": "بیشینه تعداد شرکت‌کنندگانی که می‌توانند به صحنه اصلی سنجاق شوند",
"microphones": "میکروفون‌ها",
"moderator": "مدیر",
"moderatorOptions": "گزینه‌های مدیر",

View File

@@ -1111,7 +1111,7 @@
"incomingMessage": "un message arrive",
"language": "Langue",
"loggedIn": "Connecté en tant que {{name}}",
"maxStageParticipants": "Nombre maximum de participants pouvant être épinglé sur laffichage principal (EXPÉRIMENTAL)",
"maxStageParticipants": "Nombre maximum de participants pouvant être épinglé sur laffichage principal",
"microphones": "Microphones",
"moderator": "Modérateur",
"moderatorOptions": "Options de modérateur",

View File

@@ -1077,7 +1077,7 @@
"incomingMessage": "un message arrive",
"language": "Langue",
"loggedIn": "Connecté en tant que {{name}}",
"maxStageParticipants": "Nombre maximum de participants pouvant être épinglé sur laffichage principal (EXPÉRIMENTAL)",
"maxStageParticipants": "Nombre maximum de participants pouvant être épinglé sur laffichage principal",
"microphones": "Microphones",
"moderator": "Modérateur",
"moderatorOptions": "Options de modérateur",

View File

@@ -1088,7 +1088,7 @@
"incomingMessage": "Pesan masuk",
"language": "Bahasa",
"loggedIn": "Masuk sebagai {{name}}",
"maxStageParticipants": "Jumlah maksimum peserta yang dapat ditampilkan di panggung utama (PERCOBAAN)",
"maxStageParticipants": "Jumlah maksimum peserta yang dapat ditampilkan di panggung utama",
"microphones": "Mikrofon",
"moderator": "Moderator",
"moderatorOptions": "Opsi moderator",

View File

@@ -1069,7 +1069,7 @@
"incomingMessage": "Móttekin skilaboð",
"language": "Tungumál",
"loggedIn": "Skráð inn sem {{name}}",
"maxStageParticipants": "Hámarksfjöldi þátttakenda sem hægt er að festa á aðalgluggann (Á TILRAUNASTIGI)",
"maxStageParticipants": "Hámarksfjöldi þátttakenda sem hægt er að festa á aðalgluggann",
"microphones": "Hljóðnemar",
"moderator": "Stjórnandi",
"moderatorOptions": "Valkostir umsjónarmanns",

View File

@@ -1110,7 +1110,7 @@
"incomingMessage": "수신 메시지",
"language": "언어",
"loggedIn": "{{name}}으로 로그인",
"maxStageParticipants": "메인 스테이지에 고정할 수 있는 최대 참가자 수 (실험적 기능)",
"maxStageParticipants": "메인 스테이지에 고정할 수 있는 최대 참가자 수",
"microphones": "마이크",
"moderator": "진행자",
"moderatorOptions": "진행자 옵션",

View File

@@ -1117,7 +1117,7 @@
"incomingMessage": "Ienākošā ziņa",
"language": "Valoda",
"loggedIn": "Ierakstījies kā {{name}}",
"maxStageParticipants": "Maksimālais dalībnieku skaits, kurus var piespraust galvenajai skatuvei (EKSPERIMENTĀLS)",
"maxStageParticipants": "Maksimālais dalībnieku skaits, kurus var piespraust galvenajai skatuvei",
"microphones": "Mikrofoni",
"moderator": "Moderators",
"moderatorOptions": "Moderatora opcijas",

View File

@@ -997,7 +997,7 @@
"incomingMessage": "Ирсэн мессэж",
"language": "Хэл",
"loggedIn": "{{name}} нэвтэрсэн",
"maxStageParticipants": "Үндсэн тайз руу гарах оролцогчийн хамгийн их тоо(Туршилтынх)",
"maxStageParticipants": "Үндсэн тайз руу гарах оролцогчийн хамгийн их тоо",
"microphones": "Микрофон",
"moderator": "Зохицуулагч",
"moderatorOptions": "Зохицуулагчийн сонголт",

View File

@@ -1111,7 +1111,7 @@
"incomingMessage": "Innkommende melding",
"language": "Språk",
"loggedIn": "Logget inn som {{name}}",
"maxStageParticipants": "Maksimalt antall deltakere som kan festes til hovedscenen (EKSPERIMENTELL)",
"maxStageParticipants": "Maksimalt antall deltakere som kan festes til hovedscenen",
"microphones": "Mikrofoner",
"moderator": "Moderator",
"moderatorOptions": "Moderatoralternativer",

View File

@@ -1111,7 +1111,7 @@
"incomingMessage": "Innkommende melding",
"language": "Språk",
"loggedIn": "Logget inn som {{name}}",
"maxStageParticipants": "Maksimalt antall deltakere som kan festes til hovedscenen (EKSPERIMENTELL)",
"maxStageParticipants": "Maksimalt antall deltakere som kan festes til hovedscenen",
"microphones": "Mikrofoner",
"moderator": "Moderator",
"moderatorOptions": "Moderatoralternativer",

View File

@@ -1111,7 +1111,7 @@
"incomingMessage": "Messatge dintrant",
"language": "Lenga",
"loggedIn": "Session a {{name}}",
"maxStageParticipants": "Nombre maximal de participants que se pòt penjar a la scèna principala (EXPERIMENTAL)",
"maxStageParticipants": "Nombre maximal de participants que se pòt penjar a la scèna principala",
"microphones": "Microfòns",
"moderator": "Moderator",
"moderatorOptions": "Opcions de moderacion",

View File

@@ -1097,7 +1097,7 @@
"incomingMessage": "Receber uma mensagem",
"language": "Idioma",
"loggedIn": "Sessão iniciada como {{name}}",
"maxStageParticipants": "Número máximo de participantes que podem ser afixados (EXPERIMENTAL)",
"maxStageParticipants": "Número máximo de participantes que podem ser afixados",
"microphones": "Microfones",
"moderator": "Moderador",
"moderatorOptions": "Opções de moderador",

View File

@@ -1067,7 +1067,7 @@
"incomingMessage": "Mensagem recebida",
"language": "Idioma",
"loggedIn": "Conectado como {{name}}",
"maxStageParticipants": "Número máximo de participantes que podem ser fixados no palco principal (EXPERIMENTAL)",
"maxStageParticipants": "Número máximo de participantes que podem ser fixados no palco principal",
"microphones": "Microfones",
"moderator": "Moderador",
"moderatorOptions": "Opções de moderador",

View File

@@ -1083,7 +1083,7 @@
"incomingMessage": "Входящее сообщение",
"language": "Язык",
"loggedIn": "Вошел как {{name}}",
"maxStageParticipants": "Максимальное количество участников, которых можно закрепить на главной сцене (ЭКСПЕРИМЕНТАЛЬНО)",
"maxStageParticipants": "Максимальное количество участников, которых можно закрепить на главной сцене",
"microphones": "Микрофоны",
"moderator": "Модератор",
"moderatorOptions": "Настройки модератора",

View File

@@ -968,7 +968,7 @@
"incomingMessage": "Messàgiu in intrada",
"language": "Limba",
"loggedIn": "Autenticatzione: {{name}}",
"maxStageParticipants": "Nùmeru màssimu de partetzipantes chi podent èssere apicados a s'iscena printzipale (ISPERIMENTALE)",
"maxStageParticipants": "Nùmeru màssimu de partetzipantes chi podent èssere apicados a s'iscena printzipale",
"microphones": "Micròfonos",
"moderator": "Moderadore",
"more": "Àteru",

View File

@@ -1110,7 +1110,7 @@
"incomingMessage": "Mesazh ardhës",
"language": "Gjuhë",
"loggedIn": "I futur si {{name}}",
"maxStageParticipants": "Numër maksimum pjesëmarrësish që mund të fiksohen te skena kryesore (EKSPERIMENTALe)",
"maxStageParticipants": "Numër maksimum pjesëmarrësish që mund të fiksohen te skena kryesore",
"microphones": "Mikrofona",
"moderator": "Moderator",
"moderatorOptions": "Mundësi moderatori",

View File

@@ -995,7 +995,7 @@
"incomingMessage": "Вхідне повідомлення",
"language": "Мова",
"loggedIn": "Увійшли як {{name}}",
"maxStageParticipants": "Максимальна кількість учасників, яку можна закріпити на головній сцені (ТЕСТУВАННЯ)",
"maxStageParticipants": "Максимальна кількість учасників, яку можна закріпити на головній сцені",
"microphones": "Мікрофони",
"moderator": "Модератор",
"moderatorOptions": "Параметри модерації",

View File

@@ -1081,7 +1081,7 @@
"incomingMessage": "Tin nhắn đang gửi",
"language": "Ngôn ngữ",
"loggedIn": "Đã đăng nhập dưới tên {{name}}",
"maxStageParticipants": "Số lượng người tham gia tối đa có thể được ghim vào sân khấu chính (THỬ NGHIỆM)",
"maxStageParticipants": "Số lượng người tham gia tối đa có thể được ghim vào sân khấu chính",
"microphones": "Micro",
"moderator": "Quản trị viên",
"moderatorOptions": "Tùy chọn quản trị viên",

View File

@@ -1049,7 +1049,7 @@
"incomingMessage": "新消息",
"language": "语言",
"loggedIn": "以{{name}}登录",
"maxStageParticipants": "可以固定的最大参会者人数(实验性功能)",
"maxStageParticipants": "可以固定的最大参会者人数",
"microphones": "麦克风",
"moderator": "主持人",
"moderatorOptions": "主持人选项",

View File

@@ -1066,7 +1066,7 @@
"incomingMessage": "新訊息",
"language": "語言",
"loggedIn": "以{{name}}登入",
"maxStageParticipants": "可被釘選的最大與會者人數(實驗性功能)",
"maxStageParticipants": "可被釘選的最大與會者人數",
"microphones": "麥克風",
"moderator": "主持人",
"moderatorOptions": "主持人選項",

View File

@@ -263,7 +263,8 @@
"Remove": "Remove",
"Share": "Share",
"Submit": "Submit",
"Understand": "I understand",
"Understand": "I understand, keep me muted for now",
"UnderstandAndUnmute": "I understand, please unmute me",
"WaitForHostMsg": "The conference has not yet started because no moderators have yet arrived. If you'd like to become a moderator please log-in. Otherwise, please wait.",
"WaitForHostNoAuthMsg": "The conference has not yet started because no moderators have yet arrived. Please wait.",
"WaitingForHostButton": "Wait for moderator",
@@ -300,6 +301,7 @@
"conferenceReloadMsg": "We're trying to fix this. Reconnecting in {{seconds}} sec…",
"conferenceReloadTitle": "Unfortunately, something went wrong.",
"confirm": "Confirm",
"confirmBack": "Back",
"confirmNo": "No",
"confirmYes": "Yes",
"connectError": "Oops! Something went wrong and we couldn't connect to the conference.",
@@ -337,6 +339,7 @@
"kickParticipantTitle": "Kick this participant?",
"kickSystemTitle": "Ouch! You were kicked out of the meeting",
"kickTitle": "Ouch! {{participantDisplayName}} kicked you out of the meeting",
"learnMore": "learn more",
"linkMeeting": "Link meeting",
"linkMeetingTitle": "Link meeting to Salesforce",
"liveStreaming": "Live Streaming",
@@ -394,7 +397,9 @@
"recentlyUsedObjects": "Your recently used objects",
"recording": "Recording",
"recordingDisabledBecauseOfActiveLiveStreamingTooltip": "Not possible while a live stream is active",
"recordingInProgressDescription": "This meeting is being recorded. Your audio and video have been muted. If you choose to unmute, you consent to being recorded.",
"recordingInProgressDescription": "This meeting is being recorded and analyzed by AI{{learnMore}}. Your audio and video have been muted. If you choose to unmute, you consent to being recorded.",
"recordingInProgressDescriptionFirstHalf": "This meeting is being recorded and analyzed by AI",
"recordingInProgressDescriptionSecondHalf": ". Your audio and video have been muted. If you choose to unmute, you consent to being recorded.",
"recordingInProgressTitle": "Recording in progress",
"rejoinNow": "Rejoin now",
"remoteControlAllowedMessage": "{{user}} accepted your remote control request!",
@@ -1119,7 +1124,7 @@
"incomingMessage": "Incoming message",
"language": "Language",
"loggedIn": "Logged in as {{name}}",
"maxStageParticipants": "Maximum number of participants who can be pinned to the main stage (EXPERIMENTAL)",
"maxStageParticipants": "Maximum number of participants who can be pinned to the main stage",
"microphones": "Microphones",
"moderator": "Moderator",
"moderatorOptions": "Moderator options",

package-lock.json (generated)
View File

@@ -62,7 +62,7 @@
"js-md5": "0.6.1",
"js-sha512": "0.8.0",
"jwt-decode": "2.2.0",
"lib-jitsi-meet": "https://github.com/jitsi/lib-jitsi-meet/releases/download/v1973.0.0+64dcc15c/lib-jitsi-meet.tgz",
"lib-jitsi-meet": "https://github.com/jitsi/lib-jitsi-meet#release-8542",
"lodash-es": "4.17.21",
"moment": "2.29.4",
"moment-duration-format": "2.2.2",
@@ -16891,8 +16891,8 @@
},
"node_modules/lib-jitsi-meet": {
"version": "0.0.0",
"resolved": "https://github.com/jitsi/lib-jitsi-meet/releases/download/v1973.0.0+64dcc15c/lib-jitsi-meet.tgz",
"integrity": "sha512-uwFKP+eZpxA8AXpv/XWk4qbyHNEovPS517Kz8gOhQSzYZpCnaN4smc3kfawInWw5da+GXtljVkkWXCWn3Lergw==",
"resolved": "git+ssh://git@github.com/jitsi/lib-jitsi-meet.git#44c07b5cd396f6345819300d9755fa5031dc934c",
"integrity": "sha512-dkZmPKtXZB/xJ7nME/I/Yfr/1X44ZTH6GEEHQ8dJGl1h3lFD8uBFH0Y5oZqxy0oHyKY12IJjckmk6U4mrkS7uA==",
"license": "Apache-2.0",
"dependencies": {
"@jitsi/js-utils": "2.2.1",
@@ -37377,8 +37377,9 @@
}
},
"lib-jitsi-meet": {
"version": "https://github.com/jitsi/lib-jitsi-meet/releases/download/v1973.0.0+64dcc15c/lib-jitsi-meet.tgz",
"integrity": "sha512-uwFKP+eZpxA8AXpv/XWk4qbyHNEovPS517Kz8gOhQSzYZpCnaN4smc3kfawInWw5da+GXtljVkkWXCWn3Lergw==",
"version": "git+ssh://git@github.com/jitsi/lib-jitsi-meet.git#44c07b5cd396f6345819300d9755fa5031dc934c",
"integrity": "sha512-dkZmPKtXZB/xJ7nME/I/Yfr/1X44ZTH6GEEHQ8dJGl1h3lFD8uBFH0Y5oZqxy0oHyKY12IJjckmk6U4mrkS7uA==",
"from": "lib-jitsi-meet@https://github.com/jitsi/lib-jitsi-meet#release-8542",
"requires": {
"@jitsi/js-utils": "2.2.1",
"@jitsi/logger": "2.0.2",

View File

@@ -68,7 +68,7 @@
"js-md5": "0.6.1",
"js-sha512": "0.8.0",
"jwt-decode": "2.2.0",
"lib-jitsi-meet": "https://github.com/jitsi/lib-jitsi-meet/releases/download/v1973.0.0+64dcc15c/lib-jitsi-meet.tgz",
"lib-jitsi-meet": "https://github.com/jitsi/lib-jitsi-meet#release-8542",
"lodash-es": "4.17.21",
"moment": "2.29.4",
"moment-duration-format": "2.2.2",

View File

@@ -1,5 +1,3 @@
import { createStartMutedConfigurationEvent } from '../../analytics/AnalyticsEvents';
import { sendAnalytics } from '../../analytics/functions';
import { IReduxState, IStore } from '../../app/types';
import { transcriberJoined, transcriberLeft } from '../../transcribing/actions';
import { setIAmVisitor } from '../../visitors/actions';
@@ -11,9 +9,7 @@ import { JITSI_CONNECTION_CONFERENCE_KEY } from '../connection/constants';
import { hasAvailableDevices } from '../devices/functions.any';
import JitsiMeetJS, { JitsiConferenceEvents, JitsiE2ePingEvents } from '../lib-jitsi-meet';
import {
setAudioMuted,
setAudioUnmutePermissions,
setVideoMuted,
setVideoUnmutePermissions
} from '../media/actions';
import { MEDIA_TYPE, MediaType } from '../media/constants';
@@ -31,7 +27,6 @@ import { IJitsiParticipant } from '../participants/types';
import { toState } from '../redux/functions';
import {
destroyLocalTracks,
replaceLocalTrack,
trackAdded,
trackRemoved
} from '../tracks/actions.any';
@@ -163,39 +158,6 @@ function _addConferenceListeners(conference: IJitsiConference, dispatch: IStore[
// Dispatches into features/base/media follow:
conference.on(
JitsiConferenceEvents.STARTED_MUTED,
() => {
const audioMuted = Boolean(conference.isStartAudioMuted());
const videoMuted = Boolean(conference.isStartVideoMuted());
const localTracks = getLocalTracks(state['features/base/tracks']);
sendAnalytics(createStartMutedConfigurationEvent('remote', audioMuted, videoMuted));
logger.log(`Start muted: ${audioMuted ? 'audio, ' : ''}${videoMuted ? 'video' : ''}`);
// XXX Jicofo tells lib-jitsi-meet to start with audio and/or video
// muted i.e. Jicofo expresses an intent. Lib-jitsi-meet has turned
// Jicofo's intent into reality by actually muting the respective
// tracks. The reality is expressed in base/tracks already so what
// is left is to express Jicofo's intent in base/media.
// TODO Maybe the app needs to learn about Jicofo's intent and
// transfer that intent to lib-jitsi-meet instead of lib-jitsi-meet
// acting on Jicofo's intent without the app's knowledge.
dispatch(setAudioMuted(audioMuted));
dispatch(setVideoMuted(videoMuted));
// Remove the tracks from peerconnection as well.
for (const track of localTracks) {
const trackType = track.jitsiTrack.getType();
// Do not remove the audio track on RN. Starting with iOS 15 it will fail to unmute otherwise.
if ((audioMuted && trackType === MEDIA_TYPE.AUDIO && navigator.product !== 'ReactNative')
|| (videoMuted && trackType === MEDIA_TYPE.VIDEO)) {
dispatch(replaceLocalTrack(track.jitsiTrack, null, conference));
}
}
});
conference.on(
JitsiConferenceEvents.AUDIO_UNMUTE_PERMISSIONS_CHANGED,
(disableAudioMuteChange: boolean) => {
@@ -808,10 +770,8 @@ export function nonParticipantMessageReceived(id: string, json: Object) {
/**
* Updates the known state of start muted policies.
*
* @param {boolean} audioMuted - Whether or not members will join the conference
* as audio muted.
* @param {boolean} videoMuted - Whether or not members will join the conference
* as video muted.
* @param {boolean} audioMuted - Whether or not members will join the conference as audio muted.
* @param {boolean} videoMuted - Whether or not members will join the conference as video muted.
* @returns {{
* type: SET_START_MUTED_POLICY,
* startAudioMutedPolicy: boolean,
@@ -1022,10 +982,8 @@ export function setRoom(room?: string) {
/**
* Sets whether or not members should join audio and/or video muted.
*
* @param {boolean} startAudioMuted - Whether or not members will join the
* conference as audio muted.
* @param {boolean} startVideoMuted - Whether or not members will join the
* conference as video muted.
* @param {boolean} startAudioMuted - Whether or not members will join the conference as audio muted.
* @param {boolean} startVideoMuted - Whether or not members will join the conference as video muted.
* @returns {Function}
*/
export function setStartMutedPolicy(
@@ -1037,9 +995,6 @@ export function setStartMutedPolicy(
audio: startAudioMuted,
video: startVideoMuted
});
dispatch(
onStartMutedPolicyChanged(startAudioMuted, startVideoMuted));
};
}

View File

@@ -22,12 +22,14 @@ import { INotificationProps } from '../../notifications/types';
import { hasDisplayName } from '../../prejoin/utils';
import { stopLocalVideoRecording } from '../../recording/actions.any';
import LocalRecordingManager from '../../recording/components/Recording/LocalRecordingManager';
import { AudioMixerEffect } from '../../stream-effects/audio-mixer/AudioMixerEffect';
import { iAmVisitor } from '../../visitors/functions';
import { overwriteConfig } from '../config/actions';
import { CONNECTION_ESTABLISHED, CONNECTION_FAILED } from '../connection/actionTypes';
import { connectionDisconnected, disconnect } from '../connection/actions';
import { validateJwt } from '../jwt/functions';
import { JitsiConferenceErrors, JitsiConferenceEvents, JitsiConnectionErrors } from '../lib-jitsi-meet';
import { MEDIA_TYPE } from '../media/constants';
import { PARTICIPANT_UPDATED, PIN_PARTICIPANT } from '../participants/actionTypes';
import { PARTICIPANT_ROLE } from '../participants/constants';
import {
@@ -70,6 +72,7 @@ import {
} from './functions';
import logger from './logger';
import { IConferenceMetadata } from './reducer';
import './subscriber';
/**
* Handler for before unload event.
@@ -653,7 +656,7 @@ function _setRoom({ dispatch, getState }: IStore, next: Function, action: AnyAct
* @private
* @returns {Object} The value returned by {@code next(action)}.
*/
function _trackAddedOrRemoved(store: IStore, next: Function, action: AnyAction) {
async function _trackAddedOrRemoved(store: IStore, next: Function, action: AnyAction) {
const track = action.track;
// TODO All track swapping should happen here instead of conference.js.
@@ -661,7 +664,6 @@ function _trackAddedOrRemoved(store: IStore, next: Function, action: AnyAction)
const { getState } = store;
const state = getState();
const conference = getCurrentConference(state);
let promise;
if (conference) {
const jitsiTrack = action.track.jitsiTrack;
@@ -670,14 +672,22 @@ function _trackAddedOrRemoved(store: IStore, next: Function, action: AnyAction)
// If gUM is slow and tracks are created after the user has already joined the conference, avoid
// adding the tracks to the conference if the user is a visitor.
if (!iAmVisitor(state)) {
promise = _addLocalTracksToConference(conference, [ jitsiTrack ]);
const { desktopAudioTrack } = state['features/screen-share'];
// If the user is sharing their screen and has a desktop audio track, we need to replace that with
// the audio mixer effect so that the desktop audio is mixed in with the microphone audio.
if (typeof APP !== 'undefined' && desktopAudioTrack && track.mediaType === MEDIA_TYPE.AUDIO) {
await conference.replaceTrack(desktopAudioTrack, null);
const audioMixerEffect = new AudioMixerEffect(desktopAudioTrack);
await jitsiTrack.setEffect(audioMixerEffect);
await conference.replaceTrack(null, jitsiTrack);
} else {
await _addLocalTracksToConference(conference, [ jitsiTrack ]);
}
}
} else {
promise = _removeLocalTracksFromConference(conference, [ jitsiTrack ]);
}
if (promise) {
return promise.then(() => next(action));
await _removeLocalTracksFromConference(conference, [ jitsiTrack ]);
}
}
}

View File

@@ -105,8 +105,6 @@ export interface IJitsiConference {
isLobbySupported: Function;
isP2PActive: Function;
isSIPCallingSupported: Function;
isStartAudioMuted: Function;
isStartVideoMuted: Function;
join: Function;
joinLobby: Function;
kickParticipant: Function;

View File

@@ -0,0 +1,61 @@
import { IStore } from '../../app/types';
import { showNotification } from '../../notifications/actions';
import { NOTIFICATION_TIMEOUT_TYPE } from '../../notifications/constants';
import StateListenerRegistry from '../redux/StateListenerRegistry';
import { setAudioMuted, setVideoMuted } from '../media/actions';
import { VIDEO_MUTISM_AUTHORITY } from '../media/constants';
let hasShownNotification = false;
/**
* Handles changes in the start muted policy for audio and video tracks in the meta data set for the conference.
*/
StateListenerRegistry.register(
/* selector */ state => state['features/base/conference'].startAudioMutedPolicy,
/* listener */ (startAudioMutedPolicy, store) => {
_updateTrackMuteState(store, true);
});
StateListenerRegistry.register(
/* selector */ state => state['features/base/conference'].startVideoMutedPolicy,
/* listener */(startVideoMutedPolicy, store) => {
_updateTrackMuteState(store, false);
});
/**
* Updates the mute state of the track based on the start muted policy.
*
* @param {IStore} store - The redux store.
* @param {boolean} isAudio - Whether the track is audio or video.
* @returns {void}
*/
function _updateTrackMuteState(store: IStore, isAudio: boolean) {
const { dispatch, getState } = store;
const mutedPolicyKey = isAudio ? 'startAudioMutedPolicy' : 'startVideoMutedPolicy';
const mutedPolicyValue = getState()['features/base/conference'][mutedPolicyKey];
// Currently, the policy only supports force muting others, not unmuting them.
if (!mutedPolicyValue) {
return;
}
let muteStateUpdated = false;
const { muted } = isAudio ? getState()['features/base/media'].audio : getState()['features/base/media'].video;
if (isAudio && !Boolean(muted)) {
dispatch(setAudioMuted(mutedPolicyValue, true));
muteStateUpdated = true;
} else if (!isAudio && !Boolean(muted)) {
// TODO: Add a new authority for video mutism for the moderator case.
dispatch(setVideoMuted(mutedPolicyValue, VIDEO_MUTISM_AUTHORITY.USER, true));
muteStateUpdated = true;
}
if (!hasShownNotification && muteStateUpdated) {
hasShownNotification = true;
dispatch(showNotification({
titleKey: 'notify.mutedTitle',
descriptionKey: 'notify.muted'
}, NOTIFICATION_TIMEOUT_TYPE.SHORT));
}
}

View File

@@ -438,6 +438,7 @@ export interface IConfig {
};
iAmRecorder?: boolean;
iAmSipGateway?: boolean;
iAmSpot?: boolean;
ignoreStartMuted?: boolean;
inviteAppName?: string | null;
inviteServiceCallFlowsUrl?: string;
@@ -542,10 +543,12 @@ export interface IConfig {
};
recordingSharingUrl?: string;
recordings?: {
consentLearnMoreLink?: string;
recordAudioAndVideo?: boolean;
requireConsent?: boolean;
showPrejoinWarning?: boolean;
showRecordingLink?: boolean;
skipConsentInMeeting?: boolean;
suggestRecording?: boolean;
};
remoteVideoMenu?: {

View File

@@ -169,6 +169,7 @@ export default [
'hideLobbyButton',
'iAmRecorder',
'iAmSipGateway',
'iAmSpot',
'ignoreStartMuted',
'inviteAppName',
'liveStreaming.enabled',

View File

@@ -40,6 +40,7 @@ export default class AbstractDialog<P extends IProps, S extends IState = IState>
super(props);
// Bind event handlers so they are only bound once per instance.
this._onBack = this._onBack.bind(this);
this._onCancel = this._onCancel.bind(this);
this._onSubmit = this._onSubmit.bind(this);
this._onSubmitFulfilled = this._onSubmitFulfilled.bind(this);
@@ -75,6 +76,14 @@ export default class AbstractDialog<P extends IProps, S extends IState = IState>
return this.props.dispatch(hideDialog());
}
_onBack() {
const { backDisabled = false, onBack } = this.props;
if (!backDisabled && (!onBack || onBack())) {
this._hide();
}
}
/**
* Dispatches a redux action to hide this dialog when it's canceled.
*

View File

@@ -16,6 +16,11 @@ import styles from './styles';
*/
interface IProps extends AbstractProps, WithTranslation {
/**
* The i18n key of the text label for the back button.
*/
backLabel?: string;
/**
* The i18n key of the text label for the cancel button.
*/
@@ -36,6 +41,11 @@ interface IProps extends AbstractProps, WithTranslation {
*/
descriptionKey?: string | { key: string; params: string; };
/**
* Whether the back button is hidden.
*/
isBackHidden?: Boolean;
/**
* Whether the cancel button is hidden.
*/
@@ -55,6 +65,11 @@ interface IProps extends AbstractProps, WithTranslation {
* Dialog title.
*/
title?: string;
/**
* Renders buttons vertically.
*/
verticalButtons?: boolean;
}
/**
@@ -102,14 +117,17 @@ class ConfirmDialog extends AbstractDialog<IProps> {
*/
override render() {
const {
backLabel,
cancelLabel,
children,
confirmLabel,
isBackHidden = true,
isCancelHidden,
isConfirmDestructive,
isConfirmHidden,
t,
title
title,
verticalButtons
} = this.props;
const dialogButtonStyle
@@ -119,6 +137,7 @@ class ConfirmDialog extends AbstractDialog<IProps> {
return (
<Dialog.Container
coverScreen = { false }
verticalButtons = { verticalButtons }
visible = { true }>
{
title && <Dialog.Title>
@@ -127,6 +146,12 @@ class ConfirmDialog extends AbstractDialog<IProps> {
}
{ this._renderDescription() }
{ children }
{
!isBackHidden && <Dialog.Button
label = { t(backLabel || 'dialog.confirmBack') }
onPress = { this._onBack }
style = { styles.dialogButton } />
}
{
!isCancelHidden && <Dialog.Button
label = { t(cancelLabel || 'dialog.confirmNo') }

View File

@@ -2,6 +2,16 @@ import { ReactNode } from 'react';
export type DialogProps = {
/**
* Whether back button is disabled. Enabled by default.
*/
backDisabled?: boolean;
/**
* Optional i18n key to change the back button title.
*/
backKey?: string;
/**
* Whether cancel button is disabled. Enabled by default.
*/
@@ -27,6 +37,11 @@ export type DialogProps = {
*/
okKey?: string;
/**
* The handler for onBack event.
*/
onBack?: Function;
/**
* The handler for onCancel event.
*/

View File

@@ -176,6 +176,7 @@ class Popover extends Component<IProps, IState> {
this._setContextMenuStyle = this._setContextMenuStyle.bind(this);
this._getCustomDialogStyle = this._getCustomDialogStyle.bind(this);
this._onOutsideClick = this._onOutsideClick.bind(this);
this._onOutsideTouchStart = this._onOutsideTouchStart.bind(this);
}
/**
@@ -185,7 +186,7 @@ class Popover extends Component<IProps, IState> {
* @returns {void}
*/
override componentDidMount() {
window.addEventListener('touchstart', this._onTouchStart);
window.addEventListener('touchstart', this._onOutsideTouchStart);
if (this.props.trigger === 'click') {
// @ts-ignore
window.addEventListener('click', this._onOutsideClick);
@@ -199,7 +200,7 @@ class Popover extends Component<IProps, IState> {
* @returns {void}
*/
override componentWillUnmount() {
window.removeEventListener('touchstart', this._onTouchStart);
window.removeEventListener('touchstart', this._onOutsideTouchStart);
if (this.props.trigger === 'click') {
// @ts-ignore
window.removeEventListener('click', this._onOutsideClick);
@@ -261,6 +262,7 @@ class Popover extends Component<IProps, IState> {
id = { id }
onClick = { this._onClick }
onKeyPress = { this._onKeyPress }
onTouchStart = { this._onTouchStart }
{ ...(trigger === 'hover' ? {
onMouseEnter: this._onShowDialog,
onMouseLeave: this._onHideDialog
@@ -337,7 +339,7 @@ class Popover extends Component<IProps, IState> {
* @private
* @returns {void}
*/
_onTouchStart(event: TouchEvent) {
_onOutsideTouchStart(event: TouchEvent) {
if (this.props.visible
&& !this.props.overflowDrawer
&& !this._contextMenuRef?.contains?.(event.target as Node)
@@ -401,6 +403,24 @@ class Popover extends Component<IProps, IState> {
}
}
/**
* Stops propagation of touchstart events originating from the Popover's trigger container.
* This prevents the window's 'touchstart' listener (_onOutsideTouchStart) from
* immediately closing the Popover if the touch begins on the trigger area itself.
* Without this, the subsequent synthesized 'click' event will not execute
* because the Popover would already be closing or removed, breaking interactions
* within the Popover on touch devices.
*
* e.g. On a mobile device overflow buttons don't execute their click actions.
*
* @param {React.TouchEvent} event - The touch start event.
* @private
* @returns {void}
*/
_onTouchStart(event: React.TouchEvent) {
event.stopPropagation();
}
/**
* KeyPress handler for accessibility.
*

View File

@@ -2,7 +2,6 @@ import { batch } from 'react-redux';
import { IStore } from '../../app/types';
import { _RESET_BREAKOUT_ROOMS } from '../../breakout-rooms/actionTypes';
import { isPrejoinPageVisible } from '../../prejoin/functions';
import { getCurrentConference } from '../conference/functions';
import {
SET_AUDIO_MUTED,
@@ -203,11 +202,8 @@ function _setMuted(store: IStore, { ensureTrack, muted }: {
setTrackMuted(jitsiTrack, muted, state, dispatch)
.catch(() => dispatch(trackMuteUnmuteFailed(localTrack, muted)));
}
} else if (!muted && ensureTrack && (typeof APP === 'undefined' || isPrejoinPageVisible(state))) {
} else if (!muted && ensureTrack) {
typeof APP !== 'undefined' && dispatch(gumPending([ mediaType ], IGUMPendingState.PENDING_UNMUTE));
// FIXME: This only runs on mobile now because web has its own way of
// creating local tracks. Adjust the check once they are unified.
dispatch(createLocalTracksA({ devices: [ mediaType ] })).then(() => {
typeof APP !== 'undefined' && dispatch(gumPending([ mediaType ], IGUMPendingState.NONE));
});

View File

@@ -138,64 +138,64 @@ export const typography = {
labelBold: 'labelBold01',
bodyShortRegularSmall: {
fontSize: '0.625rem',
lineHeight: '1rem',
fontSize: 10,
lineHeight: 16,
fontWeight: font.weightRegular,
letterSpacing: 0
},
bodyShortRegular: {
fontSize: '0.875rem',
lineHeight: '1.25rem',
fontSize: 14,
lineHeight: 20,
fontWeight: font.weightRegular,
letterSpacing: 0
},
bodyShortBold: {
fontSize: '0.875rem',
lineHeight: '1.25rem',
fontSize: 14,
lineHeight: 20,
fontWeight: font.weightSemiBold,
letterSpacing: 0
},
bodyShortRegularLarge: {
fontSize: '1rem',
lineHeight: '1.375rem',
fontSize: 16,
lineHeight: 22,
fontWeight: font.weightRegular,
letterSpacing: 0
},
bodyShortBoldLarge: {
fontSize: '1rem',
lineHeight: '1.375rem',
fontSize: 16,
lineHeight: 22,
fontWeight: font.weightSemiBold,
letterSpacing: 0
},
bodyLongRegular: {
fontSize: '0.875rem',
lineHeight: '1.5rem',
fontSize: 14,
lineHeight: 24,
fontWeight: font.weightRegular,
letterSpacing: 0
},
bodyLongRegularLarge: {
fontSize: '1rem',
lineHeight: '1.625rem',
fontSize: 16,
lineHeight: 26,
fontWeight: font.weightRegular,
letterSpacing: 0
},
bodyLongBold: {
fontSize: '0.875rem',
lineHeight: '1.5rem',
fontSize: 14,
lineHeight: 24,
fontWeight: font.weightSemiBold,
letterSpacing: 0
},
bodyLongBoldLarge: {
fontSize: '1rem',
lineHeight: '1.625rem',
fontSize: 16,
lineHeight: 26,
fontWeight: font.weightSemiBold,
letterSpacing: 0
},
@@ -205,29 +205,29 @@ export const typography = {
heading2: 'heading02',
heading3: {
fontSize: '2rem',
lineHeight: '2.5rem',
fontSize: 32,
lineHeight: 40,
fontWeight: font.weightSemiBold,
letterSpacing: 0
},
heading4: {
fontSize: '1.75rem',
lineHeight: '2.25rem',
fontSize: 28,
lineHeight: 36,
fontWeight: font.weightSemiBold,
letterSpacing: 0
},
heading5: {
fontSize: '1.25rem',
lineHeight: '1.75rem',
fontSize: 20,
lineHeight: 28,
fontWeight: font.weightSemiBold,
letterSpacing: 0
},
heading6: {
fontSize: '1rem',
lineHeight: '1.625rem',
fontSize: 16,
lineHeight: 26,
fontWeight: font.weightSemiBold,
letterSpacing: 0
}

View File

@@ -0,0 +1,16 @@
import { IReduxState } from '../../app/types';
/**
* Checks if Jitsi Meet is running on Spot TV.
*
* @param {IReduxState} state - The redux state.
* @returns {boolean} Whether or not Jitsi Meet is running on Spot TV.
*/
export function isSpotTV(state: IReduxState): boolean {
const { defaultLocalDisplayName, iAmSpot } = state['features/base/config'] || {};
return iAmSpot
|| navigator.userAgent.includes('JitsiSpot/') // Jitsi Spot app
|| navigator.userAgent.includes('8x8MeetingRooms/') // 8x8 Meeting Rooms app
|| defaultLocalDisplayName === 'Meeting Room';
}

View File

@@ -413,9 +413,11 @@ const ChatMessage = ({
function _mapStateToProps(state: IReduxState, { message }: IProps) {
const { knocking } = state['features/lobby'];
const localParticipantId = state['features/base/participants'].local?.id;
const { remoteVideoMenu = {} } = state['features/base/config'];
const { disablePrivateChat } = remoteVideoMenu;
return {
shouldDisplayChatMessageMenu: message.participantId !== localParticipantId,
shouldDisplayChatMessageMenu: !disablePrivateChat && message.participantId !== localParticipantId,
knocking,
state
};

View File

@@ -275,7 +275,7 @@ class DesktopPicker extends PureComponent<IProps, IState> {
const { sources } = this.state;
// @ts-ignore
const source = sources.screen.concat(sources.window).find(s => s.id === id);
const source = (sources?.screen ?? []).concat(sources?.window ?? []).find(s => s.id === id);
this.props.onSourceChoose(id, type, screenShareAudio, source);
this.props.dispatch(hideDialog());

View File

@@ -180,6 +180,7 @@ export interface IDynamicBrandingState {
requireRecordingConsent?: boolean;
sharedVideoAllowedURLDomains?: Array<string>;
showGiphyIntegration?: boolean;
skipRecordingConsentInMeeting?: boolean;
supportUrl?: string;
useDynamicBrandingData: boolean;
virtualBackgrounds: Array<Image>;
@@ -206,9 +207,10 @@ ReducerRegistry.register<IDynamicBrandingState>(STORE_NAME, (state = DEFAULT_STA
muiBrandedTheme,
pollCreationRequiresPermission,
premeetingBackground,
requireRecordingConsent,
sharedVideoAllowedURLDomains,
showGiphyIntegration,
requireRecordingConsent,
skipRecordingConsentInMeeting,
supportUrl,
virtualBackgrounds
} = action.value;
@@ -228,9 +230,10 @@ ReducerRegistry.register<IDynamicBrandingState>(STORE_NAME, (state = DEFAULT_STA
muiBrandedTheme,
pollCreationRequiresPermission,
premeetingBackground,
requireRecordingConsent,
sharedVideoAllowedURLDomains,
showGiphyIntegration,
requireRecordingConsent,
skipRecordingConsentInMeeting,
supportUrl,
customizationFailed: false,
customizationReady: true,

View File

@@ -11,6 +11,7 @@ import Watermarks from '../../base/react/components/web/Watermarks';
import { getHideSelfView } from '../../base/settings/functions.any';
import { getVideoTrackByParticipant } from '../../base/tracks/functions.web';
import { setColorAlpha } from '../../base/util/helpers';
import { isSpotTV } from '../../base/util/spot';
import StageParticipantNameLabel from '../../display-name/components/web/StageParticipantNameLabel';
import { FILMSTRIP_BREAKPOINT } from '../../filmstrip/constants';
import { getVerticalViewMaxWidth, isFilmstripResizable } from '../../filmstrip/functions.web';
@@ -24,8 +25,6 @@ import { getLargeVideoParticipant } from '../functions';
import ScreenSharePlaceholder from './ScreenSharePlaceholder.web';
// Hack to detect Spot.
const SPOT_DISPLAY_NAME = 'Meeting Room';
interface IProps {
@@ -364,20 +363,20 @@ function _mapStateToProps(state: IReduxState) {
const { backgroundColor, backgroundImageUrl } = state['features/dynamic-branding'];
const { isOpen: isChatOpen } = state['features/chat'];
const { width: verticalFilmstripWidth, visible } = state['features/filmstrip'];
const { defaultLocalDisplayName, hideDominantSpeakerBadge } = state['features/base/config'];
const { hideDominantSpeakerBadge } = state['features/base/config'];
const { seeWhatIsBeingShared } = state['features/large-video'];
const localParticipantId = getLocalParticipant(state)?.id;
const largeVideoParticipant = getLargeVideoParticipant(state);
const videoTrack = getVideoTrackByParticipant(state, largeVideoParticipant);
const isLocalScreenshareOnLargeVideo = largeVideoParticipant?.id?.includes(localParticipantId ?? '')
&& videoTrack?.videoType === VIDEO_TYPE.DESKTOP;
const isOnSpot = defaultLocalDisplayName === SPOT_DISPLAY_NAME;
return {
_backgroundAlpha: state['features/base/config'].backgroundAlpha,
_customBackgroundColor: backgroundColor,
_customBackgroundImageUrl: backgroundImageUrl,
_displayScreenSharingPlaceholder: Boolean(isLocalScreenshareOnLargeVideo && !seeWhatIsBeingShared && !isOnSpot),
_displayScreenSharingPlaceholder:
Boolean(isLocalScreenshareOnLargeVideo && !seeWhatIsBeingShared && !isSpotTV(state)),
_hideSelfView: getHideSelfView(state),
_isChatOpen: isChatOpen,
_isDisplayNameVisible: isDisplayNameVisible(state),

View File

@@ -8,6 +8,16 @@
*/
export const CLEAR_RECORDING_SESSIONS = 'CLEAR_RECORDING_SESSIONS';
/**
* The type of Redux action which marks a session ID as consent requested.
*
* {
* type: MARK_CONSENT_REQUESTED,
* sessionId: string
* }
*/
export const MARK_CONSENT_REQUESTED = 'MARK_CONSENT_REQUESTED';
/**
* The type of Redux action which updates the current known state of a recording
* session.

View File

@@ -20,6 +20,7 @@ import { isRecorderTranscriptionsRunning } from '../transcribing/functions';
import {
CLEAR_RECORDING_SESSIONS,
MARK_CONSENT_REQUESTED,
RECORDING_SESSION_UPDATED,
SET_MEETING_HIGHLIGHT_BUTTON_STATE,
SET_PENDING_RECORDING_NOTIFICATION_UID,
@@ -467,3 +468,17 @@ export function showStartRecordingNotificationWithCallback(openRecordingDialog:
}, NOTIFICATION_TIMEOUT_TYPE.EXTRA_LONG));
};
}
/**
* Marks the given session as consent requested. No further consent requests will be
* made for this session.
*
* @param {string} sessionId - The session id.
* @returns {Object}
*/
export function markConsentRequested(sessionId: string) {
return {
type: MARK_CONSENT_REQUESTED,
sessionId
};
}

View File

@@ -1,8 +1,14 @@
import React, { useCallback } from 'react';
import { useDispatch } from 'react-redux';
import { useDispatch, useSelector } from 'react-redux';
import { useTranslation } from 'react-i18next';
import Dialog from 'react-native-dialog';
import ConfirmDialog from '../../../../base/dialog/components/native/ConfirmDialog';
import { setAudioUnmutePermissions, setVideoUnmutePermissions } from '../../../../base/media/actions';
import { setAudioMuted, setAudioUnmutePermissions, setVideoMuted, setVideoUnmutePermissions } from '../../../../base/media/actions';
import { VIDEO_MUTISM_AUTHORITY } from '../../../../base/media/constants';
import Link from '../../../../base/react/components/native/Link';
import { IReduxState } from '../../../../app/types';
import styles from '../styles.native';
/**
* Component that renders the dialog for explicit consent for recordings.
@@ -11,6 +17,10 @@ import { setAudioUnmutePermissions, setVideoUnmutePermissions } from '../../../.
*/
export default function RecordingConsentDialog() {
const dispatch = useDispatch();
const { t } = useTranslation();
const { recordings } = useSelector((state: IReduxState) => state['features/base/config']);
const { consentLearnMoreLink } = recordings ?? {};
const consent = useCallback(() => {
dispatch(setAudioUnmutePermissions(false, true));
@@ -19,12 +29,36 @@ export default function RecordingConsentDialog() {
return true;
}, []);
const consentAndUnmute = useCallback(() => {
dispatch(setAudioUnmutePermissions(false, true));
dispatch(setVideoUnmutePermissions(false, true));
dispatch(setAudioMuted(false, true));
dispatch(setVideoMuted(false, VIDEO_MUTISM_AUTHORITY.USER, true));
return true;
}, []);
return (
<ConfirmDialog
backLabel = { 'dialog.UnderstandAndUnmute' }
confirmLabel = { 'dialog.Understand' }
descriptionKey = { 'dialog.recordingInProgressDescription' }
isBackHidden = { false }
isCancelHidden = { true }
onBack = { consentAndUnmute }
onSubmit = { consent }
title = { 'dialog.recordingInProgressTitle' } />
title = { 'dialog.recordingInProgressTitle' }
verticalButtons = { true }>
<Dialog.Description>
{t('dialog.recordingInProgressDescriptionFirstHalf')}
{consentLearnMoreLink && (
<Link
style = { styles.learnMoreLink }
url = { consentLearnMoreLink }>
{`(${t('dialog.learnMore')})`}
</Link>
)}
{t('dialog.recordingInProgressDescriptionSecondHalf')}
</Dialog.Description>
</ConfirmDialog>
);
}

View File

@@ -94,8 +94,11 @@ export default {
highlightDialogButtonsSpace: {
height: 16,
width: '100%'
},
learnMoreLink: {
color: BaseTheme.palette.link01,
fontWeight: 'bold'
}
};
/**

View File

@@ -1,9 +1,18 @@
import React, { useCallback } from 'react';
import { useTranslation } from 'react-i18next';
import { useDispatch } from 'react-redux';
import { batch, useDispatch, useSelector } from 'react-redux';
import { setAudioUnmutePermissions, setVideoUnmutePermissions } from '../../../../base/media/actions';
import { IReduxState } from '../../../../app/types';
import { translateToHTML } from '../../../../base/i18n/functions';
import {
setAudioMuted,
setAudioUnmutePermissions,
setVideoMuted,
setVideoUnmutePermissions
} from '../../../../base/media/actions';
import { VIDEO_MUTISM_AUTHORITY } from '../../../../base/media/constants';
import Dialog from '../../../../base/ui/components/web/Dialog';
import { hideDialog } from '../../../../base/dialog/actions';
/**
* Component that renders the dialog for explicit consent for recordings.
@@ -13,14 +22,34 @@ import Dialog from '../../../../base/ui/components/web/Dialog';
export default function RecordingConsentDialog() {
const { t } = useTranslation();
const dispatch = useDispatch();
const { recordings } = useSelector((state: IReduxState) => state['features/base/config']);
const { consentLearnMoreLink } = recordings ?? {};
const learnMore = ` (<a href="${consentLearnMoreLink}" target="_blank" rel="noopener noreferrer">${t('dialog.learnMore')}</a>)`;
const consent = useCallback(() => {
dispatch(setAudioUnmutePermissions(false, true));
dispatch(setVideoUnmutePermissions(false, true));
batch(() => {
dispatch(setAudioUnmutePermissions(false, true));
dispatch(setVideoUnmutePermissions(false, true));
});
}, []);
const consentAndUnmute = useCallback(() => {
batch(() => {
dispatch(setAudioUnmutePermissions(false, true));
dispatch(setVideoUnmutePermissions(false, true));
dispatch(setAudioMuted(false, true));
dispatch(setVideoMuted(false, VIDEO_MUTISM_AUTHORITY.USER, true));
dispatch(hideDialog());
});
}, []);
return (
<Dialog
back = {{
hidden: false,
onClick: consentAndUnmute,
translationKey: 'dialog.UnderstandAndUnmute'
}}
cancel = {{ hidden: true }}
disableBackdropClose = { true }
disableEscape = { true }
@@ -28,9 +57,7 @@ export default function RecordingConsentDialog() {
ok = {{ translationKey: 'dialog.Understand' }}
onSubmit = { consent }
titleKey = 'dialog.recordingInProgressTitle'>
<div>
{t('dialog.recordingInProgressDescription')}
</div>
{ translateToHTML(t, 'dialog.recordingInProgressDescription', { learnMore }) }
</Dialog>
);
}

View File

@@ -8,6 +8,7 @@ import { JitsiRecordingConstants, browser } from '../base/lib-jitsi-meet';
import { getSoundFileSrc } from '../base/media/functions';
import { getLocalParticipant, getRemoteParticipants } from '../base/participants/functions';
import { registerSound, unregisterSound } from '../base/sounds/actions';
import { isSpotTV } from '../base/util/spot';
import { isInBreakoutRoom as isInBreakoutRoomF } from '../breakout-rooms/functions';
import { isEnabled as isDropboxEnabled } from '../dropbox/functions';
import { extractFqnFromPath } from '../dynamic-branding/functions.any';
@@ -440,22 +441,44 @@ export function isLiveStreamingButtonVisible({
* @returns {boolean}
*/
export function shouldRequireRecordingConsent(recorderSession: any, state: IReduxState) {
const { requireRecordingConsent } = state['features/dynamic-branding'] || {};
const { requireConsent } = state['features/base/config'].recordings || {};
const { requireRecordingConsent, skipRecordingConsentInMeeting }
= state['features/dynamic-branding'] || {};
const { conference } = state['features/base/conference'] || {};
const { requireConsent, skipConsentInMeeting } = state['features/base/config'].recordings || {};
const { iAmRecorder } = state['features/base/config'];
const { consentRequested } = state['features/recording'];
if (iAmRecorder) {
return false;
}
if (isSpotTV(state)) {
return false;
}
if (!requireConsent && !requireRecordingConsent) {
return false;
}
if (!recorderSession.getInitiator()
|| recorderSession.getStatus() === JitsiRecordingConstants.status.OFF) {
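// Only ask for consent once per recording session.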
if (consentRequested.has(recorderSession.getID())) {
return false;
}
return recorderSession.getInitiator() !== getLocalParticipant(state)?.id;
// If we join a meeting that has an ongoing recording, `conference` will be undefined, since
// we get the recording state through the initial presence, which happens between the
// WILL_JOIN and JOINED events.
if (conference && (skipConsentInMeeting || skipRecordingConsentInMeeting)) {
return false;
}
// lib-jitsi-meet may set either a JitsiParticipant object or the JID resource (when it cannot
// find the participant) as the initiator of the recording session, so handle both cases.
const initiator = recorderSession.getInitiator();
const initiatorId = initiator?.getId?.() ?? initiator;
if (!initiatorId || recorderSession.getStatus() === JitsiRecordingConstants.status.OFF) {
return false;
}
return initiatorId !== getLocalParticipant(state)?.id;
}

View File

@@ -36,6 +36,7 @@ import { isRecorderTranscriptionsRunning } from '../transcribing/functions';
import { RECORDING_SESSION_UPDATED, START_LOCAL_RECORDING, STOP_LOCAL_RECORDING } from './actionTypes';
import {
clearRecordingSessions,
markConsentRequested,
hidePendingRecordingNotification,
showPendingRecordingNotification,
showRecordingError,
@@ -420,6 +421,7 @@ function _showExplicitConsentDialog(recorderSession: any, dispatch: IStore['disp
}
batch(() => {
dispatch(markConsentRequested(recorderSession.getID()));
dispatch(setAudioUnmutePermissions(true, true));
dispatch(setVideoUnmutePermissions(true, true));
dispatch(setAudioMuted(true));

View File

@@ -2,6 +2,7 @@ import ReducerRegistry from '../base/redux/ReducerRegistry';
import {
CLEAR_RECORDING_SESSIONS,
MARK_CONSENT_REQUESTED,
RECORDING_SESSION_UPDATED,
SET_MEETING_HIGHLIGHT_BUTTON_STATE,
SET_PENDING_RECORDING_NOTIFICATION_UID,
@@ -11,6 +12,7 @@ import {
} from './actionTypes';
const DEFAULT_STATE = {
consentRequested: new Set(),
disableHighlightMeetingMoment: false,
pendingNotificationUids: {},
selectedRecordingService: '',
@@ -29,6 +31,7 @@ export interface ISessionData {
}
export interface IRecordingState {
consentRequested: Set<any>;
disableHighlightMeetingMoment: boolean;
pendingNotificationUids: {
[key: string]: string | undefined;
@@ -57,6 +60,15 @@ ReducerRegistry.register<IRecordingState>(STORE_NAME,
sessionDatas: []
};
case MARK_CONSENT_REQUESTED:
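// Copy into a new Set so the state update stays immutable.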
return {
...state,
consentRequested: new Set([
...state.consentRequested,
action.sessionId
])
};
case RECORDING_SESSION_UPDATED:
return {
...state,

View File

@@ -18,6 +18,7 @@ import FormSection from './FormSection';
const ModeratorSection = () => {
const dispatch = useDispatch();
const {
audioModerationEnabled,
chatWithPermissionsEnabled,
followMeActive,
followMeEnabled,
@@ -25,7 +26,8 @@ const ModeratorSection = () => {
followMeRecorderEnabled,
startAudioMuted,
startVideoMuted,
startReactionsMuted
startReactionsMuted,
videoModerationEnabled
} = useSelector((state: IReduxState) => getModeratorTabProps(state));
const { disableReactionsModeration } = useSelector((state: IReduxState) => state['features/base/config']);
@@ -68,13 +70,13 @@ const ModeratorSection = () => {
const moderationSettings = useMemo(() => {
const moderation = [
{
disabled: false,
disabled: audioModerationEnabled,
label: 'settings.startAudioMuted',
state: startAudioMuted,
onChange: onStartAudioMutedToggled
},
{
disabled: false,
disabled: videoModerationEnabled,
label: 'settings.startVideoMuted',
state: startVideoMuted,
onChange: onStartVideoMutedToggled

View File

@@ -13,6 +13,10 @@ import Checkbox from '../../../base/ui/components/web/Checkbox';
* The type of the React {@code Component} props of {@link ModeratorTab}.
*/
export interface IProps extends AbstractDialogTabProps, WithTranslation {
/**
* Whether audio moderation is currently enabled.
*/
audioModerationEnabled: boolean;
/**
* Whether the user has selected the chat with permissions feature to be enabled.
@@ -71,6 +75,11 @@ export interface IProps extends AbstractDialogTabProps, WithTranslation {
* enabled.
*/
startVideoMuted: boolean;
/**
* Whether video moderation is currently enabled.
*/
videoModerationEnabled: boolean;
}
const styles = (theme: Theme) => {
@@ -200,6 +209,7 @@ class ModeratorTab extends AbstractDialogTab<IProps, any> {
*/
override render() {
const {
audioModerationEnabled,
chatWithPermissionsEnabled,
disableChatWithPermissions,
disableReactionsModeration,
@@ -210,7 +220,8 @@ class ModeratorTab extends AbstractDialogTab<IProps, any> {
startAudioMuted,
startVideoMuted,
startReactionsMuted,
t
t,
videoModerationEnabled
} = this.props;
const classes = withStyles.getClasses(this.props);
@@ -223,18 +234,18 @@ class ModeratorTab extends AbstractDialogTab<IProps, any> {
<h2 className = { classes.title }>
{t('settings.moderatorOptions')}
</h2>
<Checkbox
{ !audioModerationEnabled && <Checkbox
checked = { startAudioMuted }
className = { classes.checkbox }
label = { t('settings.startAudioMuted') }
name = 'start-audio-muted'
onChange = { this._onStartAudioMutedChanged } />
<Checkbox
onChange = { this._onStartAudioMutedChanged } /> }
{ !videoModerationEnabled && <Checkbox
checked = { startVideoMuted }
className = { classes.checkbox }
label = { t('settings.startVideoMuted') }
name = 'start-video-muted'
onChange = { this._onStartVideoMutedChanged } />
onChange = { this._onStartVideoMutedChanged } /> }
<Checkbox
checked = { followMeEnabled && !followMeActive && !followMeRecorderChecked }
className = { classes.checkbox }

View File

@@ -1,9 +1,11 @@
import { IReduxState } from '../app/types';
import { isEnabledFromState } from '../av-moderation/functions';
import { IStateful } from '../base/app/types';
import { isNameReadOnly } from '../base/config/functions.any';
import { SERVER_URL_CHANGE_ENABLED } from '../base/flags/constants';
import { getFeatureFlag } from '../base/flags/functions';
import i18next, { DEFAULT_LANGUAGE, LANGUAGES } from '../base/i18n/i18next';
import { MEDIA_TYPE } from '../base/media/constants';
import { getLocalParticipant } from '../base/participants/functions';
import { toState } from '../base/redux/functions';
import { getHideSelfView } from '../base/settings/functions.any';
@@ -144,9 +146,13 @@ export function getModeratorTabProps(stateful: IStateful) {
const followMeRecorderActive = isFollowMeRecorderActive(state);
const showModeratorSettings = shouldShowModeratorSettings(state);
const disableChatWithPermissions = !conference?.getMetadataHandler().getMetadata().allownersEnabled;
const isAudioModerationEnabled = isEnabledFromState(MEDIA_TYPE.AUDIO, state);
const isVideoModerationEnabled = isEnabledFromState(MEDIA_TYPE.VIDEO, state);
// The settings sections to display.
return {
audioModerationEnabled: isAudioModerationEnabled,
videoModerationEnabled: isVideoModerationEnabled,
chatWithPermissionsEnabled: Boolean(groupChatWithPermissions),
showModeratorSettings: Boolean(conference && showModeratorSettings),
disableChatWithPermissions: Boolean(disableChatWithPermissions),

View File

@@ -68,6 +68,7 @@ export class AudioMixerEffect {
* @param {MediaStream} audioStream - Audio stream which will be mixed with _mixAudio.
* @returns {MediaStream} - MediaStream containing both audio tracks mixed together.
*/
// @ts-ignore
startEffect(audioStream: MediaStream) {
this._originalStream = audioStream;
this._originalTrack = audioStream.getTracks()[0];

View File

@@ -12,7 +12,8 @@ import {
SET_TOOLBOX_VISIBLE,
TOGGLE_TOOLBOX_VISIBLE
} from './actionTypes';
import { IMainToolbarButtonThresholds } from './types';
import { DUMMY_10_BUTTONS_THRESHOLD_VALUE, DUMMY_9_BUTTONS_THRESHOLD_VALUE } from './constants';
import { IMainToolbarButtonThresholds, IMainToolbarButtonThresholdsUnfiltered } from './types';
/**
* Enables/disables the toolbox.
@@ -127,7 +128,7 @@ export function setShiftUp(shiftUp: boolean) {
* @param {IMainToolbarButtonThresholds} thresholds - Thresholds for screen size and visible main toolbar buttons.
* @returns {Function}
*/
export function setMainToolbarThresholds(thresholds: IMainToolbarButtonThresholds) {
export function setMainToolbarThresholds(thresholds: IMainToolbarButtonThresholdsUnfiltered) {
return (dispatch: IStore['dispatch'], getState: IStore['getState']) => {
const { mainToolbarButtons } = getState()['features/base/config'];
@@ -149,12 +150,27 @@ export function setMainToolbarThresholds(thresholds: IMainToolbarButtonThreshold
});
thresholds.forEach(({ width, order }) => {
let finalOrder = mainToolbarButtonsLengthMap.get(order.length);
let numberOfButtons = 0;
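// Resolve how many buttons this threshold entry represents: a real order array carries its own
// length, while the dummy symbols stand in for the 9- and 10-button presets.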
if (Array.isArray(order)) {
numberOfButtons = order.length;
} else if (order === DUMMY_9_BUTTONS_THRESHOLD_VALUE) {
numberOfButtons = 9;
} else if (order === DUMMY_10_BUTTONS_THRESHOLD_VALUE) {
numberOfButtons = 10;
} else { // Unexpected value. Ignore it.
return;
}
let finalOrder = mainToolbarButtonsLengthMap.get(numberOfButtons);
if (finalOrder) {
orderIsChanged = true;
} else {
} else if (Array.isArray(order)) {
finalOrder = order;
} else {
// Ignore dummy (symbol) values.
return;
}
mainToolbarButtonsThresholds.push({

View File

@@ -1,9 +1,33 @@
import { NativeToolbarButton, ToolbarButton } from './types';
/**
* Dummy toolbar threshold value for 9 buttons. It is used as a placeholder in THRESHOLDS that takes effect only when
* this value is overridden.
*/
export const DUMMY_9_BUTTONS_THRESHOLD_VALUE = Symbol('9_BUTTONS_THRESHOLD_VALUE');
/**
* Dummy toolbar threshold value for 10 buttons. It is used as a placeholder in THRESHOLDS that takes effect only when
* this value is overridden.
*/
export const DUMMY_10_BUTTONS_THRESHOLD_VALUE = Symbol('10_BUTTONS_THRESHOLD_VALUE');
/**
* Thresholds for displaying toolbox buttons.
*/
export const THRESHOLDS = [
// This entry won't be used unless the order is overridden through the mainToolbarButtons config prop.
{
width: 675,
order: DUMMY_10_BUTTONS_THRESHOLD_VALUE
},
// This entry won't be used unless the order is overridden through the mainToolbarButtons config prop.
{
width: 625,
order: DUMMY_9_BUTTONS_THRESHOLD_VALUE
},
{
width: 565,
order: [ 'microphone', 'camera', 'desktop', 'chat', 'raisehand', 'reactions', 'participants-pane', 'tileview' ]

View File
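For context, a minimal sketch of how the dummy 9- and 10-button entries above could be activated through the mainToolbarButtons config override (button names below are illustrative, not taken from this diff):

// hypothetical config override; each inner array's length selects the matching threshold
config.mainToolbarButtons = [
    // 10 buttons -> replaces the 675px dummy entry
    [ 'microphone', 'camera', 'desktop', 'chat', 'raisehand', 'reactions', 'participants-pane', 'tileview', 'invite', 'settings' ],
    // 9 buttons -> replaces the 625px dummy entry
    [ 'microphone', 'camera', 'desktop', 'chat', 'raisehand', 'reactions', 'participants-pane', 'tileview', 'invite' ]
];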

@@ -66,8 +66,8 @@ export function isVideoMuteButtonDisabled(state: IReduxState) {
* @param {IGetVisibleButtonsParams} params - The parameters needed to extract the visible buttons.
* @returns {Object} - The visible buttons arrays .
*/
export function getVisibleNativeButtons({ allButtons, clientWidth, mainToolbarButtonsThresholds, toolbarButtons
}: IGetVisibleNativeButtonsParams) {
export function getVisibleNativeButtons(
{ allButtons, clientWidth, mainToolbarButtonsThresholds, toolbarButtons }: IGetVisibleNativeButtonsParams) {
const filteredButtons = Object.keys(allButtons).filter(key =>
typeof key !== 'undefined' // filter invalid buttons that may be coming from config.mainToolbarButtons override
&& isButtonEnabled(key, toolbarButtons));

View File

@@ -21,6 +21,15 @@ import {
import { NATIVE_THRESHOLDS, THRESHOLDS } from './constants';
import { IMainToolbarButtonThresholds, NOTIFY_CLICK_MODE } from './types';
/**
* Array of thresholds for the main toolbar buttons that includes only the usable entries from the THRESHOLDS array.
*
* Note: the THRESHOLDS array includes some dummy values that users of the iframe API can override and use.
* Note 2: Casting is needed because the isArray type guard does not narrow the type properly in TS. See:
* https://github.com/microsoft/TypeScript/issues/17002.
*/
const FILTERED_THRESHOLDS = THRESHOLDS.filter(({ order }) => Array.isArray(order)) as IMainToolbarButtonThresholds;
/**
* Initial state of toolbox's part of Redux store.
*/
@@ -52,7 +61,7 @@ const INITIAL_STATE = {
/**
* The thresholds for screen size and visible main toolbar buttons.
*/
mainToolbarButtonsThresholds: navigator.product === 'ReactNative' ? NATIVE_THRESHOLDS : THRESHOLDS,
mainToolbarButtonsThresholds: navigator.product === 'ReactNative' ? NATIVE_THRESHOLDS : FILTERED_THRESHOLDS,
participantMenuButtonsWithNotifyClick: new Map(),

View File

@@ -65,6 +65,11 @@ export type IMainToolbarButtonThresholds = Array<{
width: number;
}>;
export type IMainToolbarButtonThresholdsUnfiltered = Array<{
order: Array<ToolbarButton | NativeToolbarButton | string> | Symbol;
width: number;
}>;
export interface ICustomToolbarButton {
Content?: ComponentType<any>;
backgroundColor?: string;

View File

@@ -207,6 +207,20 @@ function on_message(event)
room.av_moderation_actors = {};
end
room.av_moderation[mediaType] = array{};
-- We want to set the startMuted policy in metadata so that newly joining participants respect it;
-- it will be enforced by jicofo
local startMutedMetadata = room.jitsiMetadata.startMuted or {};
-- We want to keep the previous value of startMuted for this mediaType while av moderation is on,
-- so it can be restored when av moderation is disabled
local av_moderation_startMuted_restore = room.av_moderation_startMuted_restore or {};
av_moderation_startMuted_restore[mediaType] = startMutedMetadata[mediaType];
room.av_moderation_startMuted_restore = av_moderation_startMuted_restore;
startMutedMetadata[mediaType] = true;
room.jitsiMetadata.startMuted = startMutedMetadata;
room.av_moderation_actors[mediaType] = occupant.nick;
end
else
@@ -218,7 +232,11 @@ function on_message(event)
room.av_moderation[mediaType] = nil;
room.av_moderation_actors[mediaType] = nil;
-- clears room.av_moderation if empty
local startMutedMetadata = room.jitsiMetadata.startMuted or {};
local av_moderation_startMuted_restore = room.av_moderation_startMuted_restore or {};
startMutedMetadata[mediaType] = av_moderation_startMuted_restore[mediaType];
room.jitsiMetadata.startMuted = startMutedMetadata;
local is_empty = true;
for key,_ in pairs(room.av_moderation) do
if room.av_moderation[key] then

View File

@@ -145,7 +145,7 @@ module:hook("pre-iq/full", function(event)
dial:tag("header", {
xmlns = "urn:xmpp:rayo:1",
name = OUT_INITIATOR_USER_ATTR_NAME,
value = user_id });
value = tostring(user_id)});
dial:up();
-- Add the initiator group information if it is present
@@ -153,7 +153,7 @@ module:hook("pre-iq/full", function(event)
dial:tag("header", {
xmlns = "urn:xmpp:rayo:1",
name = OUT_INITIATOR_GROUP_ATTR_NAME,
value = session.jitsi_meet_context_group });
value = tostring(session.jitsi_meet_context_group) });
dial:up();
end
end

View File

@@ -44,10 +44,12 @@ local stanza = event.stanza;
if session.jitsi_meet_context_user ~= nil then
initiator.id = session.jitsi_meet_context_user.id;
else
initiator.id = session.granted_jitsi_meet_context_user_id;
end
if session.jitsi_meet_context_group ~= nil then
initiator.group = session.jitsi_meet_context_group;
end
initiator.group
= session.jitsi_meet_context_group or session.granted_jitsi_meet_context_group_id;
app_data.file_recording_metadata.initiator = initiator
update_app_data = true;

View File

@@ -112,36 +112,48 @@ function filter_stanza(stanza, session)
end
local muc_x = stanza:get_child('x', MUC_NS..'#user');
if not muc_x then
if not muc_x or not presence_check_status(muc_x, '110') then
return stanza;
end
local room = get_room_from_jid(room_jid_match_rewrite(jid.bare(stanza.attr.from)));
if not room or not room.send_default_permissions_to or is_healthcheck_room(room.jid) then
if not room or is_healthcheck_room(room.jid) then
return stanza;
end
if session.auth_token and session.jitsi_meet_context_features then -- token and features are set so skip
room.send_default_permissions_to[bare_to] = nil;
return stanza;
if not room.send_default_permissions_to then
room.send_default_permissions_to = {};
end
-- we are sending permissions only when becoming a member
local is_moderator = false;
for item in muc_x:childtags('item') do
if item.attr.role == 'moderator' then
is_moderator = true;
break;
if not session.force_permissions_update then
if session.auth_token and session.jitsi_meet_context_features then -- token and features are set so skip
room.send_default_permissions_to[bare_to] = nil;
return stanza;
end
-- we are sending permissions only when becoming a member
local is_moderator = false;
for item in muc_x:childtags('item') do
if item.attr.role == 'moderator' then
is_moderator = true;
break;
end
end
if not is_moderator then
return stanza;
end
if not room.send_default_permissions_to[bare_to] then
return stanza;
end
end
if not is_moderator or not room.send_default_permissions_to[bare_to]
or not presence_check_status(muc_x, '110') then
return stanza;
end
session.force_permissions_update = false;
local permissions_to_send = session.granted_jitsi_meet_context_features or default_permissions;
local permissions_to_send
= session.jitsi_meet_context_features or session.granted_jitsi_meet_context_features or default_permissions;
room.send_default_permissions_to[bare_to] = nil;

View File

@@ -9,6 +9,7 @@ local get_room_from_jid = main_util.get_room_from_jid;
local is_healthcheck_room = main_util.is_healthcheck_room;
local internal_room_jid_match_rewrite = main_util.internal_room_jid_match_rewrite;
local presence_check_status = main_util.presence_check_status;
local extract_subdomain = main_util.extract_subdomain;
local QUEUE_MAX_SIZE = 500;
@@ -223,7 +224,11 @@ module:hook('message/bare', function(event)
transcription.session_id = room._data.meetingId;
local tenant, conference_name, id = extract_subdomain(jid.node(room.jid));
transcription.fqn = tenant..'/'..conference_name;
if tenant then
transcription.fqn = tenant..'/'..conference_name;
else
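-- no tenant for this room, use just the conference name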
transcription.fqn = conference_name;
end
transcription.customer_id = id;
return module:fire_event('jitsi-transcript-received', {

View File

@@ -72,7 +72,7 @@ module:hook('jitsi-endpoint-message-received', function(event)
if string.len(event.raw_message) >= POLL_PAYLOAD_LIMIT then
module:log('error', 'Poll payload too large, discarding. Sender: %s to:%s', stanza.attr.from, stanza.attr.to);
return nil;
return true;
end
if data.type == "new-poll" then
@@ -86,7 +86,7 @@ module:hook('jitsi-endpoint-message-received', function(event)
if room.polls.count >= POLLS_LIMIT then
module:log("error", "Too many polls created in %s", room.jid)
return
return true;
end
if room.polls.by_id[data.pollId] ~= nil then

View File

@@ -10,18 +10,21 @@
-- Component "metadata.jitmeet.example.com" "room_metadata_component"
-- muc_component = "conference.jitmeet.example.com"
-- breakout_rooms_component = "breakout.jitmeet.example.com"
local filters = require 'util.filters';
local jid_node = require 'util.jid'.node;
local json = require 'cjson.safe';
local st = require 'util.stanza';
local jid = require 'util.jid';
local util = module:require 'util';
local is_admin = util.is_admin;
local is_healthcheck_room = util.is_healthcheck_room;
local get_room_from_jid = util.get_room_from_jid;
local room_jid_match_rewrite = util.room_jid_match_rewrite;
local internal_room_jid_match_rewrite = util.internal_room_jid_match_rewrite;
local process_host_module = util.process_host_module;
local MUC_NS = 'http://jabber.org/protocol/muc';
local COMPONENT_IDENTITY_TYPE = 'room_metadata';
local FORM_KEY = 'muc#roominfo_jitsimetadata';
@@ -96,6 +99,8 @@ function room_created(event)
if not room.jitsiMetadata then
room.jitsiMetadata = {};
end
room.sent_initial_metadata = {};
end
function on_message(event)
@@ -281,3 +286,57 @@ if breakout_rooms_component_host then
end
end);
end
-- Send a message update for metadata before sending the first self presence
function filter_stanza(stanza, session)
if not stanza.attr or not stanza.attr.to or stanza.name ~= 'presence'
or stanza.attr.type == 'unavailable' or ends_with(stanza.attr.from, '/focus') then
return stanza;
end
local bare_to = jid.bare(stanza.attr.to);
if is_admin(bare_to) then
return stanza;
end
local muc_x = stanza:get_child('x', MUC_NS..'#user');
if not muc_x or not presence_check_status(muc_x, '110') then
return stanza;
end
local room = get_room_from_jid(room_jid_match_rewrite(jid.bare(stanza.attr.from)));
if not room or not room.sent_initial_metadata or is_healthcheck_room(room.jid) then
return stanza;
end
if room.sent_initial_metadata[bare_to] then
return stanza;
end
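-- look up the occupant for this bare JID so the metadata update can be addressed to them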
local occupant;
for _, o in room:each_occupant() do
if o.bare_jid == bare_to then
occupant = o;
end
end
if not occupant then
module:log('warn', 'No occupant %s found for %s', bare_to, room.jid);
return stanza;
end
room.sent_initial_metadata[bare_to] = true;
send_json_msg(occupant.jid, internal_room_jid_match_rewrite(room.jid), getMetadataJSON(room));
return stanza;
end
function filter_session(session)
-- the domain mapper filters at the default priority (0),
-- allowners at -1 and permissions at -2; we need to run after those
filters.add_filter(session, 'stanzas/out', filter_stanza, -3);
end
-- enable filtering presences
filters.add_filter_hook(filter_session);

View File

@@ -280,6 +280,11 @@ function extract_subdomain(room_node)
end
local subdomain, room_name = room_node:match("^%[([^%]]+)%](.+)$");
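-- when there is no [tenant] prefix, the whole node is the room name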
if not subdomain then
room_name = room_node;
end
local _, customer_id = subdomain and subdomain:match("^(vpaas%-magic%-cookie%-)(.*)$") or nil, nil;
local cache_value = { subdomain=subdomain, room=room_name, customer_id=customer_id };
extract_subdomain_cache:set(room_node, cache_value);
@@ -316,8 +321,11 @@ function starts_with_one_of(str, prefixes)
return false
end
function ends_with(str, ending)
if not str then
return false;
end
return ending == "" or str:sub(-#ending) == ending
end

View File

@@ -3,7 +3,10 @@
#BASE_URL=
# Room name suffix to use when creating new room names
# ROOM_NAME_SUFFIX=
#ROOM_NAME_SUFFIX=
# Room name prefix to use when creating new room names
#ROOM_NAME_PREFIX=
# To be able to match a domain to a specific address
# The format is "MAP example.com 1.2.3.4"
@@ -43,6 +46,12 @@
# A rest URL to be used by dial-in tests to invite jigasi to the conference
#DIAL_IN_REST_URL=
# A destination number to dial out to, that auto-answers and sends media
#DIAL_OUT_URL=
# A SIP destination to dial out to, that auto-answers and sends audio and video media
#SIP_JIBRI_DIAL_OUT_URL=
# Whether to use beta for the first participants
#BROWSER_CHROME_BETA=false
#BROWSER_FF_BETA=false

View File

@@ -25,10 +25,10 @@ import VideoQualityDialog from '../pageobjects/VideoQualityDialog';
import { LOG_PREFIX, logInfo } from './browserLogger';
import { IContext, IJoinOptions } from './types';
export const P1_DISPLAY_NAME = 'p1';
export const P2_DISPLAY_NAME = 'p2';
export const P3_DISPLAY_NAME = 'p3';
export const P4_DISPLAY_NAME = 'p4';
export const P1 = 'p1';
export const P2 = 'p2';
export const P3 = 'p3';
export const P4 = 'p4';
interface IWaitForSendReceiveDataOptions {
checkReceive?: boolean;
@@ -47,7 +47,6 @@ export class Participant {
* @private
*/
private _name: string;
private _displayName: string;
private _endpointId: string;
private _jwt?: string;
@@ -164,13 +163,6 @@ export class Participant {
return this._name;
}
/**
* The name.
*/
get displayName() {
return this._displayName || this.name;
}
/**
* Adds a log to the participants log file.
*
@@ -203,7 +195,7 @@ export class Participant {
if (!options.skipDisplayName) {
// @ts-ignore
config.userInfo = {
displayName: this._displayName = options.displayName || this._name
displayName: this._name
};
}
@@ -756,8 +748,7 @@ export class Participant {
/**
* Returns the audio level for a participant.
*
* @param observer
* @param participant
* @param p
* @return
*/
async getRemoteAudioLevel(p: Participant) {
@@ -818,15 +809,11 @@ export class Participant {
// When testing for muted we don't want to have
// the condition succeeded
if (muted) {
const name = await testee.displayName;
assert.fail(`There was some sound coming from muted: '${name}'`);
assert.fail(`There was some sound coming from muted: '${this.name}'`);
} // else we're good for unmuted participant
} catch (_timeoutE) {
if (!muted) {
const name = await testee.displayName;
assert.fail(`There was no sound from unmuted: '${name}'`);
assert.fail(`There was no sound from unmuted: '${this.name}'`);
} // else we're good for muted participant
}
}
@@ -844,7 +831,7 @@ export class Participant {
endpointId) && !await this.driver.$(
`//span[@id="participant_${endpointId}" and contains(@class, "display-video")]`).isExisting(), {
timeout: 15_000,
timeoutMsg: `expected remote video for ${endpointId} to not be received 15s by ${this.displayName}`
timeoutMsg: `expected remote video for ${endpointId} to not be received 15s by ${this.name}`
});
} else {
await this.driver.waitUntil(async () =>
@@ -852,7 +839,7 @@ export class Participant {
endpointId) && await this.driver.$(
`//span[@id="participant_${endpointId}" and contains(@class, "display-video")]`).isExisting(), {
timeout: 15_000,
timeoutMsg: `expected remote video for ${endpointId} to be received 15s by ${this.displayName}`
timeoutMsg: `expected remote video for ${endpointId} to be received 15s by ${this.name}`
});
}
}
@@ -872,7 +859,7 @@ export class Participant {
await this.driver.$('//span[contains(@class,"videocontainer")]//span[contains(@class,"connection_ninja")]')
.waitForDisplayed({
timeout: 5_000,
timeoutMsg: `expected ninja icon to be displayed in 5s by ${this.displayName}`
timeoutMsg: `expected ninja icon to be displayed in 5s by ${this.name}`
});
}
}

View File

@@ -1,11 +1,13 @@
import fs from 'node:fs';
import WebSocket from 'ws';
/**
* Uses the webhook proxy service to proxy events to the testing clients.
*/
export default class WebhookProxy {
private url;
private secret;
private readonly url;
private readonly secret;
private logFile;
private ws: WebSocket | undefined;
private cache = new Map();
private listeners = new Map();
@@ -15,10 +17,12 @@ export default class WebhookProxy {
* Initializes the webhook proxy.
* @param url
* @param secret
* @param logFile
*/
constructor(url: string, secret: string) {
constructor(url: string, secret: string, logFile: string) {
this.url = url;
this.secret = secret;
this.logFile = logFile;
}
/**
@@ -40,6 +44,8 @@ export default class WebhookProxy {
this.ws.on('message', (data: any) => {
const msg = JSON.parse(data.toString());
this.logInfo(`${msg.eventType} event: ${JSON.stringify(msg)}`);
if (msg.eventType) {
let processed = false;
@@ -85,6 +91,7 @@ export default class WebhookProxy {
* Clear any stored event.
*/
clearCache() {
this.logInfo('cache cleared');
this.cache.clear();
}
@@ -98,7 +105,11 @@ export default class WebhookProxy {
const error = new Error(`Timeout waiting for event:${eventType}`);
return new Promise((resolve, reject) => {
const waiter = setTimeout(() => reject(error), timeout);
const waiter = setTimeout(() => {
this.logInfo(error.message);
return reject(error);
}, timeout);
this.addConsumer(eventType, event => {
clearTimeout(waiter);
@@ -134,6 +145,22 @@ export default class WebhookProxy {
this.ws.close();
console.log('WebhookProxy disconnected');
this.ws = undefined;
this.logInfo('disconnected');
}
}
/**
* Logs a message to the log file.
*
* @param {string} message - The message to add.
* @returns {void}
*/
logInfo(message: string) {
try {
// @ts-ignore
fs.appendFileSync(this.logFile, `${new Date().toISOString()} ${message}\n`);
} catch (err) {
console.error(err);
}
}
}

View File

@@ -3,7 +3,7 @@ import jwt from 'jsonwebtoken';
import process from 'node:process';
import { v4 as uuidv4 } from 'uuid';
import { P1_DISPLAY_NAME, P2_DISPLAY_NAME, P3_DISPLAY_NAME, P4_DISPLAY_NAME, Participant } from './Participant';
import { P1, P2, P3, P4, Participant } from './Participant';
import { IContext, IJoinOptions } from './types';
const SUBJECT_XPATH = '//div[starts-with(@class, "subject-text")]';
@@ -31,18 +31,8 @@ export async function ensureThreeParticipants(ctx: IContext, options: IJoinOptio
// these need to run in Promise.all, so we get the error when one fails
await Promise.all([
_joinParticipant('participant2', ctx.p2, p => {
ctx.p2 = p;
}, {
displayName: P2_DISPLAY_NAME,
...options
}),
_joinParticipant('participant3', ctx.p3, p => {
ctx.p3 = p;
}, {
displayName: P3_DISPLAY_NAME,
...options
})
_joinParticipant(P2, ctx, options),
_joinParticipant(P3, ctx, options)
]);
if (options.skipInMeetingChecks) {
@@ -80,12 +70,7 @@ export function joinFirstParticipant(ctx: IContext, options: IJoinOptions = {}):
* @returns {Promise<void>}
*/
export function joinSecondParticipant(ctx: IContext, options: IJoinOptions = {}): Promise<void> {
return _joinParticipant('participant2', ctx.p2, p => {
ctx.p2 = p;
}, {
displayName: P2_DISPLAY_NAME,
...options
});
return _joinParticipant(P2, ctx, options);
}
/**
@@ -96,12 +81,7 @@ export function joinSecondParticipant(ctx: IContext, options: IJoinOptions = {})
* @returns {Promise<void>}
*/
export function joinThirdParticipant(ctx: IContext, options: IJoinOptions = {}): Promise<void> {
return _joinParticipant('participant3', ctx.p3, p => {
ctx.p3 = p;
}, {
displayName: P3_DISPLAY_NAME,
...options
});
return _joinParticipant(P3, ctx, options);
}
/**
@@ -116,24 +96,9 @@ export async function ensureFourParticipants(ctx: IContext, options: IJoinOption
// these need to run in Promise.all, so we get the error when one fails
await Promise.all([
_joinParticipant('participant2', ctx.p2, p => {
ctx.p2 = p;
}, {
displayName: P2_DISPLAY_NAME,
...options
}),
_joinParticipant('participant3', ctx.p3, p => {
ctx.p3 = p;
}, {
displayName: P3_DISPLAY_NAME,
...options
}),
_joinParticipant('participant4', ctx.p4, p => {
ctx.p4 = p;
}, {
displayName: P4_DISPLAY_NAME,
...options
})
_joinParticipant(P2, ctx, options),
_joinParticipant(P3, ctx, options),
_joinParticipant(P4, ctx, options)
]);
if (options.skipInMeetingChecks) {
@@ -162,28 +127,8 @@ export async function ensureFourParticipants(ctx: IContext, options: IJoinOption
* @returns {Promise<void>}
*/
async function joinTheModeratorAsP1(ctx: IContext, options?: IJoinOptions) {
const p1DisplayName = P1_DISPLAY_NAME;
let token;
if (!options?.skipFirstModerator) {
// we prioritize the access token when iframe is not used and private key is set,
// otherwise if private key is not specified we use the access token if set
if (process.env.JWT_ACCESS_TOKEN
&& ((ctx.jwtPrivateKeyPath && !ctx.iframeAPI && !options?.preferGenerateToken)
|| !ctx.jwtPrivateKeyPath)) {
token = process.env.JWT_ACCESS_TOKEN;
} else if (ctx.jwtPrivateKeyPath) {
token = getToken(ctx, p1DisplayName);
}
}
// make sure the first participant is moderator, if supported by deployment
await _joinParticipant('participant1', ctx.p1, p => {
ctx.p1 = p;
}, {
displayName: p1DisplayName,
...options
}, token);
await _joinParticipant(P1, ctx, options);
}
/**
@@ -195,12 +140,7 @@ async function joinTheModeratorAsP1(ctx: IContext, options?: IJoinOptions) {
export async function ensureTwoParticipants(ctx: IContext, options: IJoinOptions = {}): Promise<void> {
await joinTheModeratorAsP1(ctx, options);
await _joinParticipant('participant2', ctx.p2, p => {
ctx.p2 = p;
}, {
displayName: P2_DISPLAY_NAME,
...options
}, options.preferGenerateToken ? getToken(ctx, P2_DISPLAY_NAME) : undefined);
await _joinParticipant(P2, ctx, options);
if (options.skipInMeetingChecks) {
return Promise.resolve();
@@ -219,17 +159,17 @@ export async function ensureTwoParticipants(ctx: IContext, options: IJoinOptions
/**
* Creates a participant instance or prepares one for re-joining.
* @param name - The name of the participant.
* @param p - The participant instance to prepare or undefined if new one is needed.
* @param setter - The setter to use for setting the new participant instance into the context if needed.
* @param {IContext} ctx - The context.
* @param {IJoinOptions} options - Join options.
* @param {string?} jwtToken - The token to use if any.
*/
async function _joinParticipant( // eslint-disable-line max-params
name: string,
p: Participant,
setter: (p: Participant) => void,
options: IJoinOptions = {},
jwtToken?: string) {
ctx: IContext,
options: IJoinOptions = {}) {
// @ts-ignore
const p = ctx[name] as Participant;
if (p) {
if (ctx.iframeAPI) {
await p.switchInPage();
@@ -250,12 +190,34 @@ async function _joinParticipant( // eslint-disable-line max-params
// we want the participant instance re-created so we clear any kept state, like the endpoint ID
}
let jwtToken;
if (name === P1) {
if (!options?.skipFirstModerator) {
// we prioritize the access token when iframe is not used and private key is set,
// otherwise if private key is not specified we use the access token if set
if (process.env.JWT_ACCESS_TOKEN
&& ((ctx.jwtPrivateKeyPath && !ctx.iframeAPI && !options?.preferGenerateToken)
|| !ctx.jwtPrivateKeyPath)) {
jwtToken = process.env.JWT_ACCESS_TOKEN;
} else if (ctx.jwtPrivateKeyPath) {
jwtToken = getToken(ctx, name);
}
}
} else if (name === P2) {
jwtToken = options.preferGenerateToken ? getToken(ctx, P2) : undefined;
}
const newParticipant = new Participant(name, jwtToken);
// set the new participant instance, pass it to setter
setter(newParticipant);
// set the new participant instance
// @ts-ignore
ctx[name] = newParticipant;
await newParticipant.joinConference(ctx, options);
await newParticipant.joinConference(ctx, {
displayName: name,
...options
});
}
/**
@@ -356,7 +318,8 @@ function getToken(ctx: IContext, displayName: string, moderator = true) {
'features': {
'outbound-call': 'true',
'transcription': 'true',
'recording': 'true'
'recording': 'true',
'sip-outbound-call': true
},
},
'room': '*'

tests/helpers/utils.ts (new file, 12 lines)
View File

@@ -0,0 +1,12 @@
/**
* Generates a random number between 1 and the specified maximum value (inclusive).
*
* @param {number} max - The maximum value for the random number (must be a positive integer).
* @param {number} numberOfDigits - The number of digits to left-pad the random number to with zeros.
* @return {string} The random number formatted with leading zeros if needed.
*/
export function getRandomNumberAsStr(max: number, numberOfDigits: number): string {
const randomNumber = Math.floor(Math.random() * max) + 1;
return randomNumber.toString().padStart(numberOfDigits, '0');
}
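For illustration, a quick usage sketch of this helper (the variable name and values are examples only):

// pads a random value from 1-999 to three digits, e.g. '007' or '042'
const randomSuffix = getRandomNumberAsStr(999, 3);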

View File

@@ -99,8 +99,8 @@ export default class Filmstrip extends BasePageObject {
async () => await this.participant.getLargeVideo().getId() === videoIdToSwitchTo,
{
timeout: 3_000,
timeoutMsg: `${this.participant.displayName} did not switch the large video to ${
participant.displayName}`
timeoutMsg: `${this.participant.name} did not switch the large video to ${
participant.name}`
}
);
}
@@ -120,7 +120,7 @@ export default class Filmstrip extends BasePageObject {
await this.participant.driver.$(`//div[ @id="pin-indicator-${epId}" ]`).waitForDisplayed({
timeout: 2_000,
timeoutMsg: `${this.participant.displayName} did not unpin ${participant.displayName}`,
timeoutMsg: `${this.participant.name} did not unpin ${participant.name}`,
reverse: true
});
}

View File

@@ -41,10 +41,10 @@ export default class IframeAPI extends BasePageObject {
addEventListener(eventName: string) {
return this.participant.execute(
(event, prefix) => {
console.log(`${new Date().toISOString()} ${prefix} Adding listener for event: ${event}`);
console.log(`${new Date().toISOString()} ${prefix}iframeAPI - Adding listener for event: ${event}`);
window.jitsiAPI.addListener(event, evt => {
console.log(
`${new Date().toISOString()} ${prefix} Received ${event} event: ${JSON.stringify(evt)}`);
`${new Date().toISOString()} ${prefix}iframeAPI - Received ${event} event: ${JSON.stringify(evt)}`);
window.jitsiAPI.test[event] = evt;
});
}, eventName, LOG_PREFIX);
@@ -89,4 +89,24 @@ export default class IframeAPI extends BasePageObject {
dispose() {
return this.participant.execute(() => window.jitsiAPI.dispose());
}
/**
* Invites the given phone number into the meeting via PSTN.
*/
invitePhone(value: string) {
return this.participant.execute(v => window.jitsiAPI.invite([ {
type: 'phone',
number: v
} ]), value);
}
/**
* Invites the given SIP address into the meeting (SIP Jibri).
*/
inviteSIP(value: string) {
return this.participant.execute(v => window.jitsiAPI.invite([ {
type: 'sip',
address: v
} ]), value);
}
}

View File

@@ -75,8 +75,8 @@ export default class ParticipantsPane extends BasePageObject {
await this.participant.driver.$(mutedIconXPath).waitForDisplayed({
reverse,
timeout: 2000,
timeoutMsg: `Video mute icon is${reverse ? '' : ' not'} displayed for ${testee.displayName} at ${
this.participant.displayName} side.`
timeoutMsg: `Video mute icon is${reverse ? '' : ' not'} displayed for ${testee.name} at ${
this.participant.name} side.`
});
if (!isOpen) {
@@ -107,8 +107,8 @@ export default class ParticipantsPane extends BasePageObject {
await this.participant.driver.$(mutedIconXPath).waitForDisplayed({
reverse,
timeout: 2000,
timeoutMsg: `Audio mute icon is${reverse ? '' : ' not'} displayed for ${testee.displayName} at ${
this.participant.displayName} side.`
timeoutMsg: `Audio mute icon is${reverse ? '' : ' not'} displayed for ${testee.name} at ${
this.participant.name} side.`
});
if (!isOpen) {

View File

@@ -1,13 +1,13 @@
import { isEqual } from 'lodash-es';
import { P1_DISPLAY_NAME, P2_DISPLAY_NAME, Participant } from '../../helpers/Participant';
import { P1, P2, Participant } from '../../helpers/Participant';
import { ensureTwoParticipants, parseJid } from '../../helpers/participants';
import { IContext } from '../../helpers/types';
/**
* Tests PARTICIPANT_LEFT webhook.
*/
async function checkParticipantLeftHook(ctx: IContext, p: Participant, reason: string) {
async function checkParticipantLeftHook(ctx: IContext, p: Participant, reason: string, checkId = false) {
const { webhooksProxy } = ctx;
if (webhooksProxy) {
@@ -32,13 +32,15 @@ async function checkParticipantLeftHook(ctx: IContext, p: Participant, reason: s
expect(event.data.disconnectReason).toBe(reason);
expect(event.data.isBreakout).toBe(false);
expect(event.data.participantId).toBe(await p.getEndpointId());
expect(event.data.name).toBe(p.displayName);
expect(event.data.name).toBe(p.name);
const jwtPayload = ctx.data[`${p.displayName}-jwt-payload`];
if (checkId) {
const jwtPayload = ctx.data[`${p.name}-jwt-payload`];
expect(event.data.id).toBe(jwtPayload?.context?.user?.id);
expect(event.data.group).toBe(jwtPayload?.context?.group);
expect(event.customerId).toBe(process.env.IFRAME_TENANT?.replace('vpaas-magic-cookie-', ''));
expect(event.data.id).toBe(jwtPayload?.context?.user?.id);
expect(event.data.group).toBe(jwtPayload?.context?.group);
expect(event.customerId).toBe(process.env.IFRAME_TENANT?.replace('vpaas-magic-cookie-', ''));
}
}
}
@@ -239,14 +241,14 @@ describe('Participants presence', () => {
const eventP1 = await p1.driver.waitUntil(() => p1.getIframeAPI().getEventResult('participantKickedOut'), {
timeout: 2000,
timeoutMsg: 'participantKickedOut event not received on participant1 side'
timeoutMsg: 'participantKickedOut event not received on p1 side'
});
const eventP2 = await p2.driver.waitUntil(() => p2.getIframeAPI().getEventResult('participantKickedOut'), {
timeout: 2000,
timeoutMsg: 'participantKickedOut event not received on participant2 side'
timeoutMsg: 'participantKickedOut event not received on p2 side'
});
await checkParticipantLeftHook(ctx, p2, 'kicked');
await checkParticipantLeftHook(ctx, p2, 'kicked', true);
expect(eventP1).toBeDefined();
expect(eventP2).toBeDefined();
@@ -318,7 +320,7 @@ describe('Participants presence', () => {
expect(event.data.moderator).toBe(false);
expect(event.data.name).toBe(await p2.getLocalDisplayName());
expect(event.data.participantId).toBe(await p2.getEndpointId());
expect(event.data.name).toBe(p2.displayName);
expect(event.data.name).toBe(p2.name);
}
await p1.switchToAPI();
@@ -343,8 +345,8 @@ describe('Participants presence', () => {
const p1EpId = await p1.getEndpointId();
const p2EpId = await p2.getEndpointId();
const newP1Name = P1_DISPLAY_NAME;
const newP2Name = P2_DISPLAY_NAME;
const newP1Name = P1;
const newP2Name = P2;
const newNames: ({ id: string; name: string; })[] = [ {
id: p2EpId,
name: newP2Name
@@ -412,7 +414,7 @@ describe('Participants presence', () => {
expect(eventConferenceLeft).toBeDefined();
expect(eventConferenceLeft.roomName).toBe(roomName);
await checkParticipantLeftHook(ctx, p1, 'left');
await checkParticipantLeftHook(ctx, p1, 'left', true);
if (webhooksProxy) {
// ROOM_DESTROYED webhook
// @ts-ignore

View File

@@ -157,13 +157,13 @@ async function checkReceivingChunks(p1: Participant, p2: Participant, webhooksPr
allTranscripts.push(await p1.driver.waitUntil(() => p1.getIframeAPI()
.getEventResult('transcriptionChunkReceived'), {
timeout: 60000,
timeoutMsg: 'transcriptionChunkReceived event not received on participant1 side'
timeoutMsg: 'transcriptionChunkReceived event not received on p1 side'
}));
allTranscripts.push(await p2.driver.waitUntil(() => p2.getIframeAPI()
.getEventResult('transcriptionChunkReceived'), {
timeout: 60000,
timeoutMsg: 'transcriptionChunkReceived event not received on participant2 side'
timeoutMsg: 'transcriptionChunkReceived event not received on p2 side'
}));
if (webhooksProxy) {
@@ -197,7 +197,7 @@ async function checkReceivingChunks(p1: Participant, p2: Participant, webhooksPr
// @ts-ignore
const firstEntryData = result[0].value.data;
const stable = firstEntryData.stable;
const stable = firstEntryData.stable || firstEntryData.final;
const language = firstEntryData.language;
const messageID = firstEntryData.messageID;
const p1Id = await p1.getEndpointId();
@@ -210,7 +210,7 @@ async function checkReceivingChunks(p1: Participant, p2: Participant, webhooksPr
return v.data;
}).forEach(tr => {
const checkTranscripts = stable.includes(tr.stable) || tr.stable.includes(stable);
const checkTranscripts = stable.includes(tr.stable || tr.final) || (tr.stable || tr.final).includes(stable);
if (!checkTranscripts) {
console.log('received events', result);
@@ -220,6 +220,6 @@ async function checkReceivingChunks(p1: Participant, p2: Participant, webhooksPr
expect(tr.language).toBe(language);
expect(tr.messageID).toBe(messageID);
expect(tr.participant.id).toBe(p1Id);
expect(tr.participant.name).toBe(p1.displayName);
expect(tr.participant.name).toBe(p1.name);
});
}

View File

@@ -35,5 +35,10 @@ async function kickParticipant2AndCheck() {
await p1.waitForParticipants(0);
// check that the kicked participant sees the kick reason dialog
expect(await p2.isLeaveReasonDialogOpen()).toBe(true);
// wait up to 2 seconds for the dialog to appear
await p2.driver.waitUntil(
async () => p2.isLeaveReasonDialogOpen(), {
timeout: 2000,
timeoutMsg: 'No leave reason dialog shown for p2'
});
}

View File

@@ -1,4 +1,4 @@
import { P1_DISPLAY_NAME, P3_DISPLAY_NAME, Participant } from '../../helpers/Participant';
import { P1, P3, Participant } from '../../helpers/Participant';
import {
ensureOneParticipant,
ensureThreeParticipants,
@@ -34,8 +34,8 @@ describe('Lobby', () => {
const notificationText = await p2.getNotifications().getLobbyParticipantAccessGranted();
expect(notificationText.includes(P1_DISPLAY_NAME)).toBe(true);
expect(notificationText.includes(P3_DISPLAY_NAME)).toBe(true);
expect(notificationText.includes(P1)).toBe(true);
expect(notificationText.includes(P3)).toBe(true);
await p2.getNotifications().closeLobbyParticipantAccessGranted();
@@ -49,7 +49,7 @@ describe('Lobby', () => {
// now check third one display name in the room, is the one set in the prejoin screen
const name = await p1.getFilmstrip().getRemoteDisplayName(await p3.getEndpointId());
expect(name).toBe(P3_DISPLAY_NAME);
expect(name).toBe(P3);
await p3.hangup();
});
@@ -67,8 +67,8 @@ describe('Lobby', () => {
// deny notification on 2nd participant
const notificationText = await p2.getNotifications().getLobbyParticipantAccessDenied();
expect(notificationText.includes(P1_DISPLAY_NAME)).toBe(true);
expect(notificationText.includes(P3_DISPLAY_NAME)).toBe(true);
expect(notificationText.includes(P1)).toBe(true);
expect(notificationText.includes(P3)).toBe(true);
await p2.getNotifications().closeLobbyParticipantAccessDenied();
@@ -108,7 +108,7 @@ describe('Lobby', () => {
// now check third one display name in the room, is the one set in the prejoin screen
const name = await p1.getFilmstrip().getRemoteDisplayName(await p3.getEndpointId());
expect(name).toBe(P3_DISPLAY_NAME);
expect(name).toBe(P3);
await p3.hangup();
});
@@ -349,7 +349,7 @@ describe('Lobby', () => {
// check that moderator (participant 1) sees notification about participant in lobby
const name = await p1.getNotifications().getKnockingParticipantName();
expect(name).toBe(P3_DISPLAY_NAME);
expect(name).toBe(P3);
expect(await lobbyScreen.isLobbyRoomJoined()).toBe(true);
await p1ParticipantsPane.open();
@@ -379,7 +379,7 @@ async function enableLobby() {
await p1SecurityDialog.toggleLobby();
await p1SecurityDialog.waitForLobbyEnabled();
expect((await p2.getNotifications().getLobbyEnabledText()).includes(p1.displayName)).toBe(true);
expect((await p2.getNotifications().getLobbyEnabledText()).includes(p1.name)).toBe(true);
await p2.getNotifications().closeLobbyEnabled();
@@ -467,7 +467,7 @@ async function enterLobby(participant: Participant, enterDisplayName = false, us
// this check needs to be added once the functionality exists
// enter display name
await screen.enterDisplayName(P3_DISPLAY_NAME);
await screen.enterDisplayName(P3);
// check join button is enabled
classes = await joinButton.getAttribute('class');
@@ -495,7 +495,7 @@ async function enterLobby(participant: Participant, enterDisplayName = false, us
// check that moderator (participant 1) sees notification about participant in lobby
const name = await participant.getNotifications().getKnockingParticipantName();
expect(name).toBe(P3_DISPLAY_NAME);
expect(name).toBe(P3);
expect(await screen.isLobbyRoomJoined()).toBe(true);
return name;

View File

@@ -76,7 +76,6 @@ describe('StartMuted', () => {
await p3.getParticipantsPane().assertVideoMuteIconIsDisplayed(p2, true);
});
it('config options test', async () => {
await hangupAllParticipants();
@@ -92,14 +91,20 @@ describe('StartMuted', () => {
};
await ensureOneParticipant(ctx, options);
await joinSecondParticipant(ctx, { skipInMeetingChecks: true });
await joinSecondParticipant(ctx, {
...options,
skipInMeetingChecks: true
});
const { p2 } = ctx;
await p2.waitForIceConnected();
await p2.waitForSendReceiveData({ checkSend: false });
await joinThirdParticipant(ctx, { skipInMeetingChecks: true });
await joinThirdParticipant(ctx, {
...options,
skipInMeetingChecks: true
});
const { p3 } = ctx;
@@ -110,10 +115,8 @@ describe('StartMuted', () => {
const p2ID = await p2.getEndpointId();
p1.log(`Start configOptionsTest, second participant: ${p2ID}`);
// Participant 3 should be muted, 1 and 2 unmuted.
await p3.getFilmstrip().assertAudioMuteIconIsDisplayed(p3);
await p3.getParticipantsPane().assertVideoMuteIconIsDisplayed(p3);

View File

@@ -1,8 +1,7 @@
import https from 'node:https';
import process from 'node:process';
import { ensureOneParticipant } from '../../helpers/participants';
import { cleanup, isDialInEnabled, waitForAudioFromDialInParticipant } from '../helpers/DialIn';
import { cleanup, dialIn, isDialInEnabled, retrievePin, waitForAudioFromDialInParticipant } from '../helpers/DialIn';
describe('Dial-In', () => {
it('join participant', async () => {
@@ -13,7 +12,7 @@ describe('Dial-In', () => {
return;
}
await ensureOneParticipant(ctx);
await ensureOneParticipant(ctx, { preferGenerateToken: true });
// check dial-in is enabled
if (!await isDialInEnabled(ctx.p1)) {
@@ -22,59 +21,25 @@ describe('Dial-In', () => {
});
it('retrieve pin', async () => {
let dialInPin;
try {
dialInPin = await ctx.p1.getInviteDialog().getPinNumber();
await retrievePin(ctx.p1);
} catch (e) {
console.error('dial-in.test.no-pin');
ctx.skipSuiteTests = true;
throw e;
}
await ctx.p1.getInviteDialog().clickCloseButton();
if (dialInPin.length === 0) {
if (ctx.data.dialInPin === 0) {
console.error('dial-in.test.no-pin');
ctx.skipSuiteTests = true;
throw new Error('no pin');
}
expect(dialInPin.length >= 8).toBe(true);
ctx.data.dialInPin = dialInPin;
expect(ctx.data.dialInPin.length >= 8).toBe(true);
});
it('invite dial-in participant', async () => {
if (!await ctx.p1.isInMuc()) {
// local participant did not join abort
return;
}
const restUrl = process.env.DIAL_IN_REST_URL?.replace('{0}', ctx.data.dialInPin);
// we have already checked in the first test that DIAL_IN_REST_URL exists, so restUrl cannot be ''
const responseData: string = await new Promise((resolve, reject) => {
https.get(restUrl || '', res => {
let data = '';
res.on('data', chunk => {
data += chunk;
});
res.on('end', () => {
ctx.times.restAPIExecutionTS = performance.now();
resolve(data);
});
}).on('error', err => {
console.error('dial-in.test.restAPI.request.fail');
console.error(err);
reject(err);
});
});
console.log(`dial-in.test.call_session_history_id:${JSON.parse(responseData).call_session_history_id}`);
await dialIn(ctx.p1);
});
it('wait for audio from dial-in participant', async () => {

View File

@@ -0,0 +1,205 @@
import { ensureOneParticipant } from '../../helpers/participants';
import {
cleanup,
dialIn,
isDialInEnabled,
retrievePin,
waitForAudioFromDialInParticipant
} from '../helpers/DialIn';
import type { Participant } from '../../helpers/Participant';
describe('Invite iframeAPI', () => {
it('join participant', async () => {
await ensureOneParticipant(ctx);
const { p1 } = ctx;
// check whether dial-in, dial-out and sip-jibri are available
if (await p1.execute(() => config.disableIframeAPI)) {
// skip the test if iframeAPI is disabled
ctx.skipSuiteTests = true;
return;
}
ctx.data.dialOutDisabled = Boolean(!await p1.execute(() => config.dialOutAuthUrl));
ctx.data.sipJibriDisabled = Boolean(!await p1.execute(() => config.inviteServiceUrl));
// check dial-in is enabled
if (!await isDialInEnabled(ctx.p1) || !process.env.DIAL_IN_REST_URL) {
ctx.data.dialInDisabled = true;
}
});
it('dial-in', async () => {
if (ctx.data.dialInDisabled) {
return;
}
const { p1 } = ctx;
await retrievePin(p1);
expect(ctx.data.dialInPin.length >= 8).toBe(true);
await dialIn(p1);
if (!await p1.isInMuc()) {
// the local participant did not join, abort
return;
}
await waitForAudioFromDialInParticipant(p1);
await checkDialEvents(p1, 'in', 'DIAL_IN_STARTED', 'DIAL_IN_ENDED');
});
it('dial-out', async () => {
if (ctx.data.dialOutDisabled || !process.env.DIAL_OUT_URL) {
return;
}
const { p1 } = ctx;
await p1.switchToAPI();
await p1.getIframeAPI().invitePhone(process.env.DIAL_OUT_URL);
await p1.switchInPage();
await p1.waitForParticipants(1);
await waitForAudioFromDialInParticipant(p1);
await checkDialEvents(p1, 'out', 'DIAL_OUT_STARTED', 'DIAL_OUT_ENDED');
});
it('sip jibri', async () => {
if (ctx.data.sipJibriDisabled || !process.env.SIP_JIBRI_DIAL_OUT_URL) {
return;
}
const { p1 } = ctx;
await p1.switchToAPI();
await p1.getIframeAPI().inviteSIP(process.env.SIP_JIBRI_DIAL_OUT_URL);
await p1.switchInPage();
await p1.waitForParticipants(1);
await waitForAudioFromDialInParticipant(p1);
const { webhooksProxy } = ctx;
if (webhooksProxy) {
const customerId = process.env.IFRAME_TENANT?.replace('vpaas-magic-cookie-', '');
const sipCallOutStartedEvent: {
customerId: string;
data: {
participantFullJid: string;
participantId: string;
participantJid: string;
sipAddress: string;
};
eventType: string;
} = await webhooksProxy.waitForEvent('SIP_CALL_OUT_STARTED');
expect('SIP_CALL_OUT_STARTED').toBe(sipCallOutStartedEvent.eventType);
expect(sipCallOutStartedEvent.data.sipAddress).toBe(`sip:${process.env.SIP_JIBRI_DIAL_OUT_URL}`);
expect(sipCallOutStartedEvent.customerId).toBe(customerId);
const participantId = sipCallOutStartedEvent.data.participantId;
const participantJid = sipCallOutStartedEvent.data.participantJid;
const participantFullJid = sipCallOutStartedEvent.data.participantFullJid;
await cleanup(p1);
const sipCallOutEndedEvent: {
customerId: string;
data: {
direction: string;
participantFullJid: string;
participantId: string;
participantJid: string;
};
eventType: string;
} = await webhooksProxy.waitForEvent('SIP_CALL_OUT_ENDED');
expect('SIP_CALL_OUT_ENDED').toBe(sipCallOutEndedEvent.eventType);
expect(sipCallOutEndedEvent.customerId).toBe(customerId);
expect(sipCallOutEndedEvent.data.participantFullJid).toBe(participantFullJid);
expect(sipCallOutEndedEvent.data.participantId).toBe(participantId);
expect(sipCallOutEndedEvent.data.participantJid).toBe(participantJid);
} else {
await cleanup(p1);
}
});
});
/**
* Checks the dial events for a participant and cleans up at the end.
* @param participant
* @param direction
* @param startedEventName
* @param endedEventName
*/
async function checkDialEvents(
        participant: Participant, direction: string, startedEventName: string, endedEventName: string) {
    const { webhooksProxy } = ctx;

    if (webhooksProxy) {
        const customerId = process.env.IFRAME_TENANT?.replace('vpaas-magic-cookie-', '');
        const dialInStartedEvent: {
            customerId: string;
            data: {
                direction: string;
                participantFullJid: string;
                participantId: string;
                participantJid: string;
            };
            eventType: string;
        } = await webhooksProxy.waitForEvent(startedEventName);

        expect(startedEventName).toBe(dialInStartedEvent.eventType);
        expect(dialInStartedEvent.data.direction).toBe(direction);
        expect(dialInStartedEvent.customerId).toBe(customerId);

        const participantId = dialInStartedEvent.data.participantId;
        const participantJid = dialInStartedEvent.data.participantJid;
        const participantFullJid = dialInStartedEvent.data.participantFullJid;

        const usageEvent: {
            customerId: string;
            data: any;
            eventType: string;
        } = await webhooksProxy.waitForEvent('USAGE');

        expect('USAGE').toBe(usageEvent.eventType);
        expect(usageEvent.customerId).toBe(customerId);
        expect(usageEvent.data.some((el: any) =>
            el.participantId === participantId && el.callDirection === direction)).toBe(true);

        await cleanup(participant);

        const dialInEndedEvent: {
            customerId: string;
            data: {
                direction: string;
                participantFullJid: string;
                participantId: string;
                participantJid: string;
            };
            eventType: string;
        } = await webhooksProxy.waitForEvent(endedEventName);

        expect(endedEventName).toBe(dialInEndedEvent.eventType);
        expect(dialInEndedEvent.customerId).toBe(customerId);
        expect(dialInEndedEvent.data.participantFullJid).toBe(participantFullJid);
        expect(dialInEndedEvent.data.participantId).toBe(participantId);
        expect(dialInEndedEvent.data.participantJid).toBe(participantJid);
    } else {
        await cleanup(participant);
    }
}
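The three webhook payloads asserted above share almost the same shape and are typed inline each time. A minimal sketch of a shared type plus a thin wrapper around webhooksProxy.waitForEvent(); the interface and helper names are illustrative only and not part of the change set:

// Illustrative only: shared shape for the dial/SIP webhook events used above.
interface IDialEventData {
    direction?: string;
    participantFullJid: string;
    participantId: string;
    participantJid: string;
    sipAddress?: string;
}

interface IWebhookEvent<T> {
    customerId: string;
    data: T;
    eventType: string;
}

// Waits for a webhook event and checks its eventType, mirroring the expect()
// calls repeated in checkDialEvents() and the sip jibri test.
async function waitForDialEvent(
        proxy: { waitForEvent: (name: string) => Promise<any> },
        eventName: string): Promise<IWebhookEvent<IDialEventData>> {
    const event = await proxy.waitForEvent(eventName) as IWebhookEvent<IDialEventData>;

    expect(event.eventType).toBe(eventName);

    return event;
}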

View File

@@ -1,4 +1,6 @@
 import type { Participant } from '../../helpers/Participant';
+import process from 'node:process';
+import https from 'node:https';

 /**
  * Helper functions for dial-in related operations.
@@ -49,3 +51,51 @@ export async function isDialInEnabled(participant: Participant) {
     return await participant.execute(() => Boolean(
         config.dialInConfCodeUrl && config.dialInNumbersUrl && config.hosts?.muc));
 }
+
+/**
+ * Retrieves the dial-in pin number from the invite dialog of the participant.
+ * @param participant
+ */
+export async function retrievePin(participant: Participant) {
+    const dialInPin = await participant.getInviteDialog().getPinNumber();
+
+    await participant.getInviteDialog().clickCloseButton();
+
+    ctx.data.dialInPin = dialInPin;
+}
+
+/**
+ * Sends a request to the REST API to dial in the participant using the provided pin.
+ * @param participant
+ */
+export async function dialIn(participant: Participant) {
+    if (!await participant.isInMuc()) {
+        // local participant did not join, abort
+        return;
+    }
+
+    const restUrl = process.env.DIAL_IN_REST_URL?.replace('{0}', ctx.data.dialInPin);
+
+    // we have already checked in the first test that DIAL_IN_REST_URL exists, so restUrl cannot be ''
+    const responseData: string = await new Promise((resolve, reject) => {
+        https.get(restUrl || '', res => {
+            let data = '';
+
+            res.on('data', chunk => {
+                data += chunk;
+            });
+            res.on('end', () => {
+                ctx.times.restAPIExecutionTS = performance.now();
+                resolve(data);
+            });
+        }).on('error', err => {
+            console.error('dial-in.test.restAPI.request.fail');
+            console.error(err);
+            reject(err);
+        });
+    });
+
+    console.log(`dial-in.test.call_session_history_id:${JSON.parse(responseData).call_session_history_id}`);
+}

View File

@@ -9,6 +9,7 @@ import pretty from 'pretty';
 import WebhookProxy from './helpers/WebhookProxy';
 import { getLogs, initLogger, logInfo } from './helpers/browserLogger';
 import { IContext } from './helpers/types';
+import { getRandomNumberAsStr } from './helpers/utils';

 // eslint-disable-next-line @typescript-eslint/no-var-requires
 const allure = require('allure-commandline');
@@ -85,7 +86,8 @@ export const config: WebdriverIO.MultiremoteConfig = {
     },
     capabilities: {
-        participant1: {
+        // participant1
+        p1: {
             capabilities: {
                 browserName: 'chrome',
                 browserVersion: process.env.BROWSER_CHROME_BETA ? 'beta' : undefined,
@@ -95,7 +97,8 @@ export const config: WebdriverIO.MultiremoteConfig = {
}
}
},
participant2: {
// participant2
p2: {
capabilities: {
browserName: 'chrome',
'goog:chromeOptions': {
@@ -107,7 +110,8 @@ export const config: WebdriverIO.MultiremoteConfig = {
                 ]
             }
         },
-        participant3: {
+        // participant3
+        p3: {
             capabilities: {
                 browserName: 'chrome',
                 'goog:chromeOptions': {
@@ -120,7 +124,8 @@ export const config: WebdriverIO.MultiremoteConfig = {
                 ]
             }
         },
-        participant4: {
+        // participant4
+        p4: {
             capabilities: {
                 browserName: 'chrome',
                 'goog:chromeOptions': {
@@ -209,14 +214,33 @@ export const config: WebdriverIO.MultiremoteConfig = {
             bInstance.iframePageBase = `file://${path.dirname(rpath)}`;
         }));

-        globalAny.ctx.roomName = `jitsimeettorture-${crypto.randomUUID()}`;
+        globalAny.ctx.roomName = `${testName}-${getRandomNumberAsStr(40, 3)}`;

         if (process.env.ROOM_NAME_PREFIX) {
             globalAny.ctx.roomName = `${process.env.ROOM_NAME_PREFIX.trim()}_${globalAny.ctx.roomName}`;
         }
         if (process.env.ROOM_NAME_SUFFIX) {
             globalAny.ctx.roomName += `_${process.env.ROOM_NAME_SUFFIX.trim()}`;
         }

         globalAny.ctx.roomName = globalAny.ctx.roomName.toLowerCase();

         globalAny.ctx.jwtPrivateKeyPath = process.env.JWT_PRIVATE_KEY_PATH;
         globalAny.ctx.jwtKid = process.env.JWT_KID;
+        globalAny.ctx.isJaasAvailable = () => globalAny.ctx.jwtKid?.startsWith('vpaas-magic-cookie-');
+
+        // If we are running the iFrameApi tests, we need to mark it as such and if needed to create the proxy
+        // and connect to it.
+        if (testName.startsWith('iFrameApi')) {
+            globalAny.ctx.iframeAPI = true;
+
+            if (!globalAny.ctx.webhooksProxy
+                && process.env.WEBHOOKS_PROXY_URL && process.env.WEBHOOKS_PROXY_SHARED_SECRET) {
+                globalAny.ctx.webhooksProxy = new WebhookProxy(
+                    `${process.env.WEBHOOKS_PROXY_URL}&room=${globalAny.ctx.roomName}`,
+                    process.env.WEBHOOKS_PROXY_SHARED_SECRET,
+                    `${TEST_RESULTS_DIR}/webhooks-${cid}-${testName}.log`);
+
+                globalAny.ctx.webhooksProxy.connect();
+            }
+        }
     },

     after() {
@@ -250,22 +274,6 @@
      * @param {Object} suite - Suite details.
      */
     beforeSuite(suite) {
-        const { ctx }: any = global;
-
-        // If we are running the iFrameApi tests, we need to mark it as such and if needed to create the proxy
-        // and connect to it.
-        if (path.basename(suite.file).startsWith('iFrameApi')) {
-            ctx.iframeAPI = true;
-
-            if (!ctx.webhooksProxy
-                && process.env.WEBHOOKS_PROXY_URL && process.env.WEBHOOKS_PROXY_SHARED_SECRET) {
-                ctx.webhooksProxy = new WebhookProxy(
-                    `${process.env.WEBHOOKS_PROXY_URL}&room=${ctx.roomName}`,
-                    process.env.WEBHOOKS_PROXY_SHARED_SECRET);
-                ctx.webhooksProxy.connect();
-            }
-        }
         multiremotebrowser.instances.forEach((instance: string) => {
             logInfo(multiremotebrowser.getInstance(instance),
                 `---=== Begin ${suite.file.substring(suite.file.lastIndexOf('/') + 1)} ===---`);
View File

@@ -22,6 +22,7 @@ if (process.env.HEADLESS === 'true') {
 const ffExcludes = [
     'specs/2way/iFrameApiParticipantsPresence.spec.ts', // FF does not support uploading files (uploadFile)
     'specs/2way/iFrameApiTranscriptions.spec.ts',
+    'specs/alone/iFrameApiInvite.spec.ts',

     // FF does not support setting a file as mic input, no dominant speaker events
     'specs/3way/activeSpeaker.spec.ts',
@@ -38,7 +39,7 @@ const ffExcludes = [
 const mergedConfig = merge(defaultConfig, {
     ffExcludes,
     capabilities: {
-        participant1: {
+        p1: {
             capabilities: {
                 browserName: 'firefox',
                 browserVersion: process.env.BROWSER_FF_BETA ? 'beta' : undefined,
@@ -49,26 +50,26 @@ const mergedConfig = merge(defaultConfig, {
                 acceptInsecureCerts: process.env.ALLOW_INSECURE_CERTS === 'true'
             }
         },
-        participant2: {
+        p2: {
             capabilities: {
                 'wdio:exclude': [
-                    ...defaultConfig.capabilities.participant2.capabilities['wdio:exclude'],
+                    ...defaultConfig.capabilities.p2.capabilities['wdio:exclude'],
                     ...ffExcludes
                 ]
             }
         },
-        participant3: {
+        p3: {
             capabilities: {
                 'wdio:exclude': [
-                    ...defaultConfig.capabilities.participant3.capabilities['wdio:exclude'],
+                    ...defaultConfig.capabilities.p3.capabilities['wdio:exclude'],
                     ...ffExcludes
                 ]
             }
         },
-        participant4: {
+        p4: {
             capabilities: {
                 'wdio:exclude': [
-                    ...defaultConfig.capabilities.participant4.capabilities['wdio:exclude'],
+                    ...defaultConfig.capabilities.p4.capabilities['wdio:exclude'],
                     ...ffExcludes
                 ]
             }
@@ -78,6 +79,6 @@ const mergedConfig = merge(defaultConfig, {
 // Remove the chrome options from the first participant
 // @ts-ignore
-mergedConfig.capabilities.participant1.capabilities['goog:chromeOptions'] = undefined;
+mergedConfig.capabilities.p1.capabilities['goog:chromeOptions'] = undefined;

 export const config = mergedConfig;
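The p2-p4 entries in the merge() call above are identical apart from the key. If that duplication grows, the overrides could also be built programmatically; this is a sketch only, not part of the change (the ffOverrides name is made up, and the cast mirrors the loose typing already tolerated in this file via @ts-ignore):

// Sketch: build the per-participant Firefox exclude overrides in one place.
const ffOverrides: Record<string, object> = {};

for (const p of ['p2', 'p3', 'p4']) {
    ffOverrides[p] = {
        capabilities: {
            'wdio:exclude': [
                ...(defaultConfig.capabilities as any)[p].capabilities['wdio:exclude'],
                ...ffExcludes
            ]
        }
    };
}

// ...which could then be spread into the object passed to merge():
// merge(defaultConfig, { ffExcludes, capabilities: { p1: { /* ... */ }, ...ffOverrides } });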

View File

@@ -17,14 +17,14 @@ const mergedConfig = {
     path: gridUrl.pathname
 };

-mergedConfig.capabilities.participant1.capabilities['goog:chromeOptions'].args
-    = updateRemoteResource(mergedConfig.capabilities.participant1.capabilities['goog:chromeOptions'].args);
-mergedConfig.capabilities.participant2.capabilities['goog:chromeOptions'].args
-    = updateRemoteResource(mergedConfig.capabilities.participant2.capabilities['goog:chromeOptions'].args);
-mergedConfig.capabilities.participant3.capabilities['goog:chromeOptions'].args
-    = updateRemoteResource(mergedConfig.capabilities.participant3.capabilities['goog:chromeOptions'].args);
-mergedConfig.capabilities.participant4.capabilities['goog:chromeOptions'].args
-    = updateRemoteResource(mergedConfig.capabilities.participant4.capabilities['goog:chromeOptions'].args);
+mergedConfig.capabilities.p1.capabilities['goog:chromeOptions'].args
+    = updateRemoteResource(mergedConfig.capabilities.p1.capabilities['goog:chromeOptions'].args);
+mergedConfig.capabilities.p2.capabilities['goog:chromeOptions'].args
+    = updateRemoteResource(mergedConfig.capabilities.p2.capabilities['goog:chromeOptions'].args);
+mergedConfig.capabilities.p3.capabilities['goog:chromeOptions'].args
+    = updateRemoteResource(mergedConfig.capabilities.p3.capabilities['goog:chromeOptions'].args);
+mergedConfig.capabilities.p4.capabilities['goog:chromeOptions'].args
+    = updateRemoteResource(mergedConfig.capabilities.p4.capabilities['goog:chromeOptions'].args);

 export const config = mergedConfig;
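The four updateRemoteResource() assignments above differ only in the participant key, so they could be collapsed into a loop; a sketch only, not part of the change (the cast mirrors the loose typing already used for these capability objects):

// Sketch: rewrite the chrome args for every participant in one pass.
for (const p of ['p1', 'p2', 'p3', 'p4']) {
    const chromeOptions = (mergedConfig.capabilities as any)[p].capabilities['goog:chromeOptions'];

    chromeOptions.args = updateRemoteResource(chromeOptions.args);
}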