Proj-2013-2014-BrasRobot-Handicap-2/SRS
Revision as of 22:23, 18 April 2014

This document provides a template for the Software Requirements Specification (SRS). It is inspired by the IEEE/ANSI 830-1998 standard.


Document History
Version | Date | Authors | Description | Validator | Validation Date
TBC | TBC | TBC | TBC | TBC | TBC


1. Introduction

1.1 Purpose of the requirements document

This Software Requirements Specification (SRS) identifies the requirements for the RobotArm - Handicap project.

1.2 Scope of the product

The purpose of this project is to build a simulator that makes it possible for the robot to reach a destination, especially a real object:

- One part concerns the implementation of the GUI and the description of the robot
- A second part concerns the communication with the robot
- A third part concerns the communication with the marker-detection program of the other group
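One way the simulator could receive data from the detection program is over a TCP socket; the sketch below shows a possible receiving side, assuming the detection program sends one "dx dy dz" displacement per line as plain text (the address, port, and message format are all assumptions, not taken from the actual program):

```python
import socket

# Assumed address of the detection program (illustrative only).
HOST, PORT = "127.0.0.1", 5005

def parse_displacement(line):
    """Parse one 'dx dy dz' text line into a tuple of three floats."""
    dx, dy, dz = (float(v) for v in line.split())
    return (dx, dy, dz)

def receive_displacements():
    """Connect to the detection program and yield camera displacements."""
    with socket.create_connection((HOST, PORT)) as sock:
        buffer = b""
        while True:
            data = sock.recv(1024)
            if not data:        # connection closed by the sender
                break
            buffer += data
            # A recv() may contain several lines, or only part of one,
            # so split on newlines and keep the remainder in the buffer.
            while b"\n" in buffer:
                line, buffer = buffer.split(b"\n", 1)
                yield parse_displacement(line.decode())
```

The generator hides the buffering, so the simulator can simply loop over `receive_displacements()` and apply each displacement to the simulated camera.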

1.3 Definitions, acronyms and abbreviations

Python : a high-level programming language that supports multiple programming paradigms, including object-oriented, imperative and functional programming

VPython : a set of Python libraries that makes building a three-dimensional GUI easier than in most other languages

1.4 References

Python : For Documentation

VPython : For Documentation

1.5 Overview of the remainder of the document

2. General description

2.1 Product perspective

The aim is to allow a robotic arm with five degrees of freedom to take an object previously detected and to bring it to a specific place. It will allow people with disabilities to take, for example, a glass of water.
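A simulator that moves the clamp in spatial coordinates needs the forward kinematics of the arm, i.e. the mapping from joint angles to the clamp's position. A minimal sketch for a planar two-joint arm is shown below (the real arm has five degrees of freedom; the link lengths and angles here are purely illustrative):

```python
from math import cos, sin

def forward_kinematics(lengths, angles):
    """Return the (x, y) position of the clamp of a planar arm.

    Each joint angle is measured relative to the previous link, so the
    absolute orientation of link i is the sum of angles 0..i.
    """
    x = y = 0.0
    total = 0.0
    for length, angle in zip(lengths, angles):
        total += angle
        x += length * cos(total)
        y += length * sin(total)
    return (x, y)

# Fully extended arm along the x axis: the clamp sits at the sum of the
# link lengths.
print(forward_kinematics([1.0, 0.5], [0.0, 0.0]))  # → (1.5, 0.0)
```

The same accumulation generalizes to the real arm by composing one rotation per degree of freedom in three dimensions.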

2.2 Product functions

2.3 User characteristics

Users do not need to know anything about programming or about how the robotic arm works; the arm has to operate automatically.

2.4 General constraints

Software constraint:

Installing VPython is quite difficult on Windows and complicated on Linux


Material constraint :

There are some problems with the robotic arm

2.5 Assumptions and dependencies

The robotic arm can be controlled by the simulator

3. Specific requirements, covering functional, non-functional and interface requirements

  • document external interfaces,
  • describe system functionality and performance,
  • specify logical database requirements,
  • specify design constraints,
  • describe emergent system properties and quality characteristics.

3.1 Requirement X.Y.Z (in Structured Natural Language)

a. Markers detection

Description: The camera detects the marker on the object and locates it

Inputs: Video stream and markers

Source: ArUco library detection

Outputs: the video stream with the detected markers highlighted with their IDs, and an XML file containing each marker's ID and the coordinates of its four corner points

Destination: All users; any object that can be grasped by the robot

Action: The ArUco library detects contours and analyzes the rectangular regions that may be markers

Non functional requirements: The marker detection should run in real time, as fast as possible

Pre-condition: Using the Creative Camera, the ambient light should be sufficient to see the user's movements correctly. Using the Leap Motion, the hands should be above the sensor. The recognized gesture is present in the database.

Post-condition: The user's hand movement has been correctly recognized.

Side-effects: Bad gesture recognition
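The XML file produced by the marker detection can then be consumed by the simulator. The sketch below shows a possible reading side using only Python's standard library, assuming one <marker> element per detection carrying an id attribute and four <point> children for the corners (the exact schema is an assumption, not the detection program's confirmed format):

```python
import xml.etree.ElementTree as ET

def read_markers(xml_text):
    """Return {marker_id: [(x, y), ...]} from the detection XML."""
    root = ET.fromstring(xml_text)
    markers = {}
    for marker in root.iter("marker"):
        corners = [(float(p.get("x")), float(p.get("y")))
                   for p in marker.iter("point")]
        markers[int(marker.get("id"))] = corners
    return markers

# Hypothetical file content for a single detected marker with id 7.
sample = """<markers>
  <marker id="7">
    <point x="10" y="12"/><point x="40" y="12"/>
    <point x="40" y="42"/><point x="10" y="42"/>
  </marker>
</markers>"""
print(read_markers(sample))  # {7: [(10.0, 12.0), (40.0, 12.0), (40.0, 42.0), (10.0, 42.0)]}
```

From the four corner points the simulator can compute the marker's center and apparent size, which is what it needs to steer the clamp toward the object.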

4. Product evolution

5. Appendices

6. Index