Projets-2016-2017-Plateform Analyse Données IOT

Team

  • Supervisors : Daniel Hilaire, Jean-Louis Bergerand, Nicolas Palix, Olivier Richard
  • Members : Estelle Allard, Lambert Rocher
  • Department : RICM 4, Polytech Grenoble

Week 1 (January 9th - January 15th)

  • Choice of the project and gathering more information about its purpose

Week 2 (January 16th - January 22nd)

  • Discovery of the project
  • Contact with Daniel Hilaire and Jean-Louis Bergerand
  • Search for information on Docker and OpenHab
  • Statement of requirements

Requirements

Meeting with Daniel Hilaire and Jean-Louis Bergerand (Wednesday 18/01/2017)

  • Two possible solutions to retrieve the data collected by the house sensors:
    • KNX gateway, prepared as a binary image that can be written to an SD card for a Raspberry Pi
    • Programming the SpaceLynk in Lua
  • Worthwhile data analyses: temperature and energy consumption compared with the weather, indoor air quality

Week 3 (January 23rd - January 29th)

  • Use case diagram editing

Use case diagram

  • Execution of a Docker image running InfluxDB and Grafana (Dockerfile found on Docker Hub).
  • Manual addition of data to the database using the HTTP API.
  • Visualization of this data in a Grafana dashboard.
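
For reference, pushing a point by hand through the InfluxDB HTTP API looks roughly like this (the database name iotdb and the measurement are placeholders for this sketch; 8086 is InfluxDB's default port and 3000 is Grafana's):

  # create a database, then write one point in InfluxDB line protocol
  curl -XPOST 'http://localhost:8086/query' --data-urlencode "q=CREATE DATABASE iotdb"
  curl -XPOST 'http://localhost:8086/write?db=iotdb' --data-binary 'temperature,room=living value=21.5'

The point then shows up in Grafana (port 3000) once an InfluxDB data source pointing at iotdb has been added.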

Week 4 (January 30th - February 5th)

Session postponed

Week 5 (February 6th - February 12th)

  • Questions about using Grafana, OpenHab, or both for the final platform
-> essential to choose OpenHab in order to stay compatible with other projects (greenhouse ...)
-> then add Grafana visualizations and R processing on top

Week 6 (February 13th - February 19th)

Monday 13th February

  • Creation of a Docker image with Grafana, InfluxDB and R
  • Attempts at linking OpenHab and Grafana

Tuesday 14th February

  • Reading and searching the SpaceLynk documentation to find a way to connect the device to our platform
  • Technology watch on IBM's solutions for IoT platform services and applications

Week 7 (February 27th - March 5th)

Monday 27th February

  • Research and tests on OpenHab v2 (and its phone app)
  • Thinking about the usefulness of OpenHab
  • Found out how the sitemap works, in order to modify it and create a visual interface that matches the layout of the house

Tuesday 28th February

- Adding an R installation to the Docker image
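
A minimal sketch of what that addition can look like in the Dockerfile (the base image and package list are assumptions, not the exact file used):

  # hypothetical excerpt: install R (and curl, handy for the InfluxDB HTTP API) into the image
  RUN apt-get update && \
      apt-get install -y --no-install-recommends r-base curl && \
      rm -rf /var/lib/apt/lists/*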

- Creating an R script that writes data into the InfluxDB database.
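
A small sketch of such a script, assuming the InfluxDB 1.x /write endpoint and the httr package (database and measurement names are placeholders):

  library(httr)

  # one point in InfluxDB line protocol: <measurement>,<tags> <fields>
  point <- "temperature,room=living value=21.5"

  # POST it to the write endpoint of the InfluxDB container (port 8086 by default)
  res <- POST("http://localhost:8086/write?db=iotdb", body = point)
  stopifnot(status_code(res) == 204)  # InfluxDB answers 204 No Content on a successful write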

Friday 3rd March

  • Two solutions for modifying OpenHab sitemaps:

- Looking at how to access the sitemap by creating a script that writes the modifications directly into the sitemap file we want

- Reinstalling OpenHab in order to have the complete version, with all the functions that show us the sitemaps directly (which can then be modified locally) **succeeded**

  • Difficulty finding out how to modify the sitemap; a file test1.sitemaps was created but never used by OpenHab
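
For the record, an OpenHab sitemap is a plain text file placed in the sitemaps configuration folder; its name must end in .sitemap (not .sitemaps) and match the sitemap identifier on the first line, which may be why test1.sitemaps was never picked up. A minimal example (item names are invented), saved as home.sitemap:

  sitemap home label="Terra Nostra" {
      Frame label="Living room" {
          Text item=Temperature_Living label="Temperature [%.1f °C]"
          Text item=Energy_Total label="Energy [%d Wh]"
      }
  }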

Week 8 (March 6th - March 12th)

Monday 6th March

  • We went to the IUT to find a way to transfer the data collected by the sensors to our database. We used the HomeLYnk, which is a logic controller connected to the sensors; our goal was to make the HomeLYnk send its data through a web interface. We tried to connect to the HomeLYnk over FTP but we couldn't see the files containing the data inside it. So we are now thinking about writing a Lua script to program the HomeLYnk. We still managed to save a CSV file containing sensor data manually.

Tuesday 7th March

Mid-term presentation. Thinking about database architecture.

Week 9 (March 13th - March 19th)

Monday 13th March

  • Reading documentation about the SpaceLynk that may help to get sensor data.

http://openrb.com/docs/remote-new.htm

http://openrb.com/example-export-last-hour-csv-object-log-file-to-external-ftp-server-from-lm2/

  • Writing a JavaScript program that makes HTTP requests to collect data from the SpaceLynk, in order to test it at Terra Nostra next week.
  • Trying Node-RED to write data into InfluxDB.
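
The shape of those requests, following the remote-services page linked above, is roughly as follows (the IP address, credentials and group address are placeholders, and the exact parameter names should be checked against that documentation):

  # ask the SpaceLynk remote-services API for the current value of one KNX group address
  curl -u remote:password 'http://192.168.1.10/scada-remote?m=json&r=grp&f=getvalue&alias=1/1/1'
  # the JSON answer can then be reshaped into InfluxDB line protocol and POSTed to /write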


Tuesday 14th March

  • Contacted the two students who work on the Terra Nostra house in order to go and see it next week and test our script.
  • Succeeded in connecting the five Docker images (Node-RED + Node-RED for InfluxDB + InfluxDB + Grafana + R) with Docker Compose
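
The docker-compose file tying the containers together looks roughly like this (image tags, service names and the build path for the R image are illustrative, not the exact file):

  version: "2"
  services:
    influxdb:
      image: influxdb
      ports:
        - "8086:8086"
    grafana:
      image: grafana/grafana
      ports:
        - "3000:3000"
      depends_on:
        - influxdb
    nodered:
      image: nodered/node-red-docker   # the InfluxDB nodes are added inside this container
      ports:
        - "1880:1880"
      depends_on:
        - influxdb
    r:
      build: ./r                       # custom image with R installed (see Week 7)
      depends_on:
        - influxdb

Within the Compose network each service name is also a hostname, so Grafana and Node-RED reach the database at http://influxdb:8086.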


Week 10 (March 20th - March 26th)

Monday 20th March

  • Visiting Terra Nostra:

- Speaking with the students who work on the Terra Nostra construction

- Connecting to the SpaceLynk in order to use our script

- Exporting some samples of sensor data


Tuesday 21st March

Working with Docker in order to make Node-RED write into InfluxDB and make InfluxDB accessible to Grafana.


Week 11 (March 27th - April 2nd)

Monday 27th March

- Creating the HTTP request with Node-RED rather than with a script

- Remark: when writing to a new file, be careful about Node-RED's write permissions.

- Creating a database from Node-RED

- Editing the Grafana dashboard: dynamic configuration is needed (through HTTP requests to the API) because the number of graphs depends on the number of sensors.

- Editing the Grafana data source and linking it to the dashboard
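
Creating the data source programmatically goes through Grafana's HTTP API; a sketch, assuming default admin credentials and the database name used above (all of these are placeholders):

  # add an InfluxDB data source to Grafana
  curl -u admin:admin -H "Content-Type: application/json" -XPOST http://localhost:3000/api/datasources \
       -d '{"name":"iotdb","type":"influxdb","url":"http://influxdb:8086","access":"proxy","database":"iotdb"}'
  # dashboards can be created or updated the same way through POST /api/dashboards/db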

Tuesday 28th March

  • Working on Terra Nostra

- Testing Node-RED directly on the SpaceLynk

- Saving data from the sensors in the house (every minute, for 30 minutes)

Wednesday 29th March

  • Progressing on database creation with Node-RED
  • Writing better documentation about Node-RED / improving the Air wiki page