# Walk and Talk

Walk and Talk is a platform for mobile teleconferencing in extended reality (XR) using WebXR, built on Babylon.js. The application specifically facilitates virtual walking meetings.
- Motivation and Impact
- Features
- Demo
- System Architecture
- Getting Started
- The Walk and Talk Team
- Getting Help
## Motivation and Impact

Most meetings today, especially with the rise of remote work, happen while sitting in front of a computer or laptop with little change in one's working environment. This has made the workday more sedentary for the average worker and leads to negative physical and psychological consequences, including a higher risk of developing multiple chronic conditions. Light physical activity throughout the workday can counter these consequences: it has health benefits and improves happiness and satisfaction.
Walking meetings are an easy way to integrate light physical activity into the workday and have been shown to boost creative thinking and help people stay focused and energized. However, it is currently difficult to walk or pace around during virtual meetings. Walk and Talk aims to facilitate virtual walking meetings and provide users with a relaxing change of scenery while maintaining a sense of presence, by allowing users to see others' videos, movements, and shared resources.

Participants of virtual walking meetings can choose their own pace, or choose not to walk at all, making participation in walking meetings accessible to everyone — something that isn't possible with walking meetings in the physical world. Virtual walking or mobile meetings are also a natural evolution of the standing desk.
## Features

Walk and Talk allows users to collaborate across different devices and supports both desktop and XR users. Meeting participants can choose their most comfortable position and device.
XR users appear as avatars and desktop users appear as TVs with robot bodies. XR avatars also move their heads based on the corresponding user’s head pose.
XR users can choose between different environments to walk in using their control panel. The images below show a user changing their environment from a forest to a beach scene.
Users can set up the space they can walk around in, and aspects of the environment (like trees or rocks) are generated in areas where they cannot walk, to keep them safe.
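The boundary-filling idea can be sketched as follows. This is a simplified stand-in for the app's PlaySpace/Environment logic, not the actual implementation: it assumes a rectangular play space and illustrative function names.

```javascript
// Illustrative sketch: place obstacles only OUTSIDE the walkable play space,
// so generated scenery never blocks the area the user can physically walk in.
// The rectangular play space and function names are assumptions.

function outsidePlaySpace(playSpace, point) {
  return (
    point.x < playSpace.minX || point.x > playSpace.maxX ||
    point.z < playSpace.minZ || point.z > playSpace.maxZ
  );
}

function placeObstacles(playSpace, candidates) {
  // Keep trees/rocks only where the user cannot walk.
  return candidates.filter((p) => outsidePlaySpace(playSpace, p));
}

const playSpace = { minX: -2, maxX: 2, minZ: -2, maxZ: 2 };
const candidates = [
  { x: 0, z: 0 },   // inside the walkable area -> rejected
  { x: 5, z: 1 },   // outside -> kept
  { x: -3, z: -3 }, // outside -> kept
];
console.log(placeObstacles(playSpace, candidates).length); // 2
```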
Users can toggle between two modes of participant positioning: meeting participants can either stay at a static location in the environment, or follow the user as they walk around.
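The two positioning modes can be sketched like this. The names and smoothing factor are hypothetical, not the repo's actual Follower/Momentum API: a static anchor never moves, while a following anchor eases toward the walking user's position every frame.

```javascript
// Sketch of the two participant-positioning modes (names are illustrative).
// "static": participants stay at a fixed world position.
// "follow": participants ease toward the walking user's position each frame.

function lerp(a, b, t) {
  return a + (b - a) * t;
}

function updateAnchor(anchor, userPos, mode, smoothing = 0.2) {
  if (mode === 'static') return anchor; // never moves
  return {
    x: lerp(anchor.x, userPos.x, smoothing),
    z: lerp(anchor.z, userPos.z, smoothing),
  };
}

let anchor = { x: 0, z: 0 };
const user = { x: 10, z: 0 };
for (let frame = 0; frame < 60; frame++) {
  anchor = updateAnchor(anchor, user, 'follow');
}
console.log(anchor.x.toFixed(2)); // approaches 10 after 60 frames
```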
Users can record audio snippets using a button in their control panel.
Non-XR users can join meetings using the desktop client. These users can also share their screen, which can be viewed by both XR and non-XR participants.
## System Architecture

We use an SFU (Selective Forwarding Unit) architecture for WebRTC (via the mediasoup package) to ensure that numerous clients can connect and communicate simultaneously. This diagram illustrates how a user joins a room or meeting.
These WebRTC connections consist of various identified elements (producers, consumers, and sockets), all of which are initially unlinked. To link them with their respective videos and avatars, we created a series of classes and event systems that allow identifiers to propagate outwards, accumulating pairs by various means so that everything syncs together.
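A minimal sketch of that pairing idea (a simplified stand-in for the repo's PCPair.js, not its actual interface): identifiers arrive independently from different events, and a peer record "completes" once both sides are known.

```javascript
// Sketch: link independently-arriving WebRTC identifiers (socket id,
// producer id, consumer id) under one peer record, so a video element or
// avatar can be attached once all the pieces are known.
// Class and method names are illustrative, not the project's real API.
class PeerRegistry {
  constructor() {
    this.peers = new Map(); // socketId -> { producers: Set, consumers: Set }
  }

  _peer(socketId) {
    if (!this.peers.has(socketId)) {
      this.peers.set(socketId, { producers: new Set(), consumers: new Set() });
    }
    return this.peers.get(socketId);
  }

  addProducer(socketId, producerId) {
    this._peer(socketId).producers.add(producerId);
  }

  addConsumer(socketId, consumerId) {
    this._peer(socketId).consumers.add(consumerId);
  }

  // A peer is "linked" once we know both a producer and a consumer for it.
  isLinked(socketId) {
    const p = this.peers.get(socketId);
    return !!p && p.producers.size > 0 && p.consumers.size > 0;
  }
}

const registry = new PeerRegistry();
registry.addProducer('socketA', 'prod-1');
console.log(registry.isLinked('socketA')); // false: consumer not yet known
registry.addConsumer('socketA', 'cons-1');
console.log(registry.isLinked('socketA')); // true
```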
## Getting Started

- Download Node.js and NPM
  - Windows and Mac: download the installer
  - Linux: `sudo apt install nodejs npm`
- Clone this repository: `git clone https://github.com/WeibelLab-Teaching/CSE_218_118_Fa20_Team_SRSLy_Joking.git`
- Enter the project folder and run `npm install`. Let the downloads finish. This installs a series of dependencies through the Node Package Manager (NPM); these packages are all monitored by NPM and checked for security.
- Set the WebRTC host IP:
  - Get your local IP address:
    - Open a terminal/PowerShell
    - Mac/Linux: `ifconfig | grep 192`
    - Windows: `ipconfig`
    - Use the address that is of the form `192.168.x.x` (e.g. `192.168.1.127`)
  - Add your IP to `.../webrtc_server_scripts/config.js` on line 63: `announcedIp: '192.168.1.127' // replace by public IP address`
- From a terminal, run `npm start` to start the server
- Go to `https://localhost`
- If you want to access from remote devices, you will need to generate a self signed certificate. Instructions for doing this can be found here.
- Copy the `cert` and `key` files into `.../cert/`
- You can now access secure content (video feeds) from any device in your local area network.
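One common way to generate a self-signed certificate is with OpenSSL. The output filenames below are assumptions for illustration; check what filenames the server actually expects under `.../cert/`:

```shell
# Generate a self-signed key/certificate pair valid for one year.
# Filenames (key.pem, cert.pem) are illustrative; rename to match the server's
# expectations before copying them into .../cert/.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout key.pem -out cert.pem \
  -days 365 -subj "/CN=localhost"
```

Browsers will still warn about a self-signed certificate; you will need to accept it manually on each device.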
- `index.js`: the main server file; initializes all routes, the HTTPS server, and all of the webrtc_server_script objects and logic for WebRTC rooms.
- `assets`: the visual assets (like 3D models) used in the app.
- `main`: Main XR application
- `webrtc.html`: A development version of `main`
- `webrtc_nonvr`: Desktop client
- `scripts`: houses the client-side scripts.
  - `RoomClient.js`: Client-side organization of all the WebRTC information, from the client's point of view
  - `Streamer.js`: Client-side organization of the videos of different participants (`VideoStreamer.js` and `AvatarStreamer.js` are children of this class)
  - `PlaySpace.js`: A class for managing the area in which the user can walk around
  - `Environment.js`: Class managing the generation of the virtual environments for the users
  - `Momentum.js` and `Follower.js`: Keep elements with the user as they walk
  - `webrtc_index.js`: Declares all client-side functions that help with joining rooms and setting up mic/video devices for WebRTC
  - `PCPair.js`: Keeps sockets, WebRTC producers and consumers, video streamers, and avatar streamers organized and linked together
- `scripts_serverside`: scripts for running the server and setting up the data for users' WebRTC transports and a meeting room's WebRTC information.
## The Walk and Talk Team

Team Name: SRSLy Joking

Team Members:
- Eric Siu: Undergraduate student at UC San Diego
- Naba Rizvi: PhD student at UC San Diego
- Tommy Sharkey: PhD student at UC San Diego
- Stephen Liu: Undergraduate student at UC San Diego
- Janet Johnson: PhD student at UC San Diego
## Getting Help

The Walk and Talk team is responsible for maintaining this project. Should you need additional help, you can contact our team at [email protected].
You can find general support for the technologies we use here:
Additional background on the project can be found in our Midterm Presentation and our Final Presentation.