SRS - Streaming en stéréoscopie
This document is the Software Requirements Specification (SRS) of the project. It follows a template inspired by the IEEE/ANSI 830-1998 standard.
Version | Date | Authors | Description | Validator | Validation Date
---|---|---|---|---|---
0.1.0 | 18/01/2016 | Zilong ZHAO - Guillaume HAMMOUTI | Creation of this page | TBC | TBC
1.0.0 | 29/02/2016 | Zilong ZHAO - Guillaume HAMMOUTI | Update general description and specific requirements | TBC | TBC
1. Introduction
1.1 Purpose of the requirements document
This Software Requirements Specification (SRS) identifies the requirements and presents a detailed description of our project "Streaming en stéréoscopie".
1.2 Scope of the product
The product allows users to see in 3D, through an Oculus Rift, what a robot sees. It can be used when the terrain is too dangerous for a human operator.
1.3 Definitions, acronyms and abbreviations
Stéréoscopie (stereoscopy): a technique for creating or enhancing the illusion of depth in an image by means of stereopsis for binocular vision.
1.4 References
https://webrtc.github.io/samples/
2. General description
2.1 Product perspective
The product is a web-based system implementing a peer-to-peer model. It provides a more realistic 3D effect by displaying the feeds of two cameras on an Oculus Rift.
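To illustrate the peer-to-peer model, here is a minimal sketch of how the camera-side client could open a connection and send its stream to the viewer. The `signalingChannel` object is a hypothetical wrapper around our WebSocket connection, not actual project code.

```javascript
// Sketch only (assumed names, not the project's actual code):
// open a peer connection and push the local camera stream to the remote viewer.
var pc = new RTCPeerConnection({
  iceServers: [{ urls: 'stun:stun.l.google.com:19302' }]
});

// signalingChannel is a hypothetical wrapper around the WebSocket connection.
pc.onicecandidate = function (event) {
  if (event.candidate) {
    signalingChannel.send(JSON.stringify({ candidate: event.candidate }));
  }
};

function sendStream(localStream) {
  pc.addStream(localStream); // addStream was the common API in 2016-era browsers
  pc.createOffer()
    .then(function (offer) { return pc.setLocalDescription(offer); })
    .then(function () {
      signalingChannel.send(JSON.stringify({ sdp: pc.localDescription }));
    });
}
```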
2.2 User characteristics
The user is assumed to have basic knowledge of using the Internet and the Oculus Rift. The administrator is expected to be familiar with JavaScript and WebSocket.
2.3 General constraints
Since we use WebRTC to stream the videos, the product is currently only supported by the Chrome and Firefox browsers.
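One way to make this constraint visible to users is to check for the WebRTC APIs at page load. The snippet below is only a sketch and not necessarily how the project performs this check.

```javascript
// Sketch of a browser-support check (illustrative; the project may do this differently).
function supportsWebRTC() {
  return typeof RTCPeerConnection !== 'undefined' &&
         navigator.mediaDevices &&
         typeof navigator.mediaDevices.getUserMedia === 'function';
}

if (!supportsWebRTC()) {
  alert('Please use a recent version of Chrome or Firefox.');
}
```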
2.4 Assumptions and dependencies
For the WebSocket server part, we rely on the server implementation referenced here: https://github.com/theturtle32/WebSocket-Node
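For reference, a minimal signaling server built on WebSocket-Node could look like the sketch below. The port number and the relay logic are assumptions for illustration, not the project's actual server code.

```javascript
// Minimal WebSocket-Node signaling server sketch (port and relay logic are assumptions).
var http = require('http');
var WebSocketServer = require('websocket').server;

var httpServer = http.createServer(function (request, response) {
  response.writeHead(404);
  response.end();
});
httpServer.listen(8080, function () {
  console.log('Signaling server listening on port 8080');
});

var wsServer = new WebSocketServer({ httpServer: httpServer });
var connections = [];

wsServer.on('request', function (request) {
  var connection = request.accept(null, request.origin);
  connections.push(connection);

  connection.on('message', function (message) {
    if (message.type === 'utf8') {
      // Relay the signaling message (SDP or ICE candidate) to the other peers.
      connections.forEach(function (other) {
        if (other !== connection) {
          other.sendUTF(message.utf8Data);
        }
      });
    }
  });

  connection.on('close', function () {
    connections.splice(connections.indexOf(connection), 1);
  });
});
```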
3. Specific requirements, covering functional, non-functional and interface requirements
Function:
1. Compose two video streams into a single stream, send it from the client to the server, and display it on the server side to obtain a 3D video effect (a sketch of this composition step is given after this block).
2. Send stereo audio from the client to the server and play it back with a 3D audio effect.
3. Send the Oculus rotation data from the client to the server, then use this data to drive the Arduino that controls the servos, so that they rotate with the Oculus.
Inputs: two video streams, one audio stream, the Oculus rotation data
Outputs: a composed video stream, stereo audio, motors that rotate with the Oculus
Destination: the server-side display (composed video and stereo audio) and the Arduino-driven servos (rotation data)
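To illustrate requirement 1 above, here is a minimal sketch of how the two camera feeds could be composed into a single stream on the client side. The element ids, canvas layout and frame rate are illustrative assumptions, not the project's actual implementation.

```javascript
// Sketch: draw two camera feeds side by side on a canvas and capture the result
// as one composed stream (ids, layout and frame rate are assumptions).
var leftVideo = document.getElementById('leftCamera');
var rightVideo = document.getElementById('rightCamera');
var canvas = document.getElementById('composer');
var ctx = canvas.getContext('2d');

function drawFrame() {
  // Left eye on the left half, right eye on the right half of the canvas.
  ctx.drawImage(leftVideo, 0, 0, canvas.width / 2, canvas.height);
  ctx.drawImage(rightVideo, canvas.width / 2, 0, canvas.width / 2, canvas.height);
  requestAnimationFrame(drawFrame);
}
drawFrame();

// The composed stream can then be attached to the RTCPeerConnection.
var composedStream = canvas.captureStream(30); // 30 fps
```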
Non-functional requirements:
- Based on the Oculus OS X runtime environment, the product runs on OS X El Capitan. For development on Windows or Linux, Oculus lists the following requirements:
  - NVIDIA GTX 970 / AMD 290 equivalent or greater
  - Intel i5-4590 equivalent or greater
  - 8 GB+ RAM
  - Compatible HDMI 1.3 video output
  - 2x USB 3.0 ports
  - Windows 7 SP1 or newer
- Node.js version 4.x or later is required (a minimal version check is sketched after this list).
- There are several types of Arduino boards; our code runs only on the Arduino UNO.
- The part of the code that reads the Oculus rotation data only builds and runs from Xcode.
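As mentioned in the Node.js requirement above, a start-up script could fail fast when the runtime is too old. This check is a hypothetical addition, not part of the project.

```javascript
// Sketch: abort early if the Node.js version is below 4.x (assumed check, not project code).
var major = parseInt(process.versions.node.split('.')[0], 10);
if (major < 4) {
  console.error('Node.js 4.x or newer is required, found ' + process.version);
  process.exit(1);
}
```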
4. Product evolution
The product evolves from a single camera providing 2D video to two cameras providing 3D video, and replaces the traditional screen with a virtual reality environment.