Search

Search results

University of Waterloo Dataverse
Borealis
Y. Aussat; Costin Ograda-Bratu; S. Huo; S. Keshav (2020-11-16)

Data Owner: Y. Aussat, S. Keshav

Data File: 32.8 MB zip file containing the data files and a description

Data Description: This dataset contains daylight signals collected over approximately 200 days in four unoccupied offices in the Davis Center building at the University of Waterloo; the measurements therefore reflect the daylight available in each room. Light levels were measured with custom-built light-sensing modules based on the Omega Onion microcomputer fitted with a light sensor. An example module is shown in the file sensing-module.png in this directory.

Each sensing module is named with four hex digits. All modules were started on August 30, 2018, which corresponds to minute 0 in the dataset; however, the modules were not deployed immediately. The table below lists when light-data collection started in each office and the corresponding sensing modules.

Office number    Devices        Start time
DC3526           af65, b02d     September 6, 2018, 11:00 am
DC2518           afa7           September 6, 2018, 11:00 am
DC2319           af67, f073     September 21, 2018, 11:00 am
DC3502           afa5, b969     September 21, 2018, 11:00 am

Due to technical problems, the first 6 days for offices 1 and 2 and the first 21 days for offices 3 and 4 are dummy data and should be ignored.

There were also two known outages in DC during data collection:
- from midnight to 4:00 am on September 17, 2018
- from 11:00 pm on October 9, 2018 until 7:45 am on October 10, 2018

Data collection stopped around 2:45 pm on May 16, 2019. This leaves 217 uninterrupted days of clean data, from October 11, 2018 to May 15, 2019.

To handle these problems, we provide a Python notebook, process-lighting-data.ipynb, that extracts clean data from the raw data. Both raw and processed data are provided, as described next.

Raw data: Raw-data folder names correspond to the device names. Each light-sensing module logs (minute_count, visible_light, IR_light) to a file every minute, where minute 0 corresponds to August 30, 2018. Every 1440 minutes (i.e., one day) the current file is saved and a new one is started. The filename format is {device_name}_{starting_minute}; for example, Omega-AF65_28800.csv holds data collected by Omega-AF65 starting at minute 28800. Each folder also contains a metadata file describing the structure of the log files.

Processed data: The folder named 'processed_data' contains the processed data produced by running the notebook. Each file in this directory is named after the device ID; for example, af65.csv stores the processed data of device Omega-AF65. The columns in each file are (see the loading sketch after this entry):
- Minutes: consecutive minute of the experiment
- Illum: illumination level (lux)
- Min_from_midnight: minutes from midnight of the current day
- Day_of_exp: day number, counted from October 11, 2018
- Day_of_year: day of the year

Funding: The Natural Sciences and Engineering Research Council of Canada (NSERC)
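As a brief illustration (not part of the dataset), here is a minimal sketch of loading one processed file with pandas, using the column names listed above; the file path relative to the unpacked archive is an assumption.

```python
# Minimal sketch (assumes pandas and the processed-data columns listed above;
# the file path relative to the unpacked archive is assumed).
import pandas as pd

df = pd.read_csv("processed_data/af65.csv")

# Mean and peak available daylight (lux) for each day of the experiment.
daily = df.groupby("Day_of_exp")["Illum"].agg(["mean", "max"])
print(daily.head())

# Average daylight profile over the day, keyed by minute from midnight.
profile = df.groupby("Min_from_midnight")["Illum"].mean()
print("Brightest minute on average:", profile.idxmax(), "minutes after midnight")
```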
University of Waterloo Dataverse
Borealis
C. Gorenflo; I. Rios; L. Golab; Costin Ograda-Bratu; S. Huo; S. Keshav (2020-11-17)

Data Owner: C. Gorenflo, I. Rios, L. Golab, S. Keshav

Data Description: This dataset contains data collected through the University of Waterloo's WeBike field trial, covering e-bike trips and battery charging sessions from summer 2014 until spring 2017. Each of the 31 participants was given an e-bike monitored by our custom-built sensor kit. A Samsung Galaxy S-III smartphone provides angular speed and acceleration in all three axes through Android's standard API; a Phidget voltage sensor measures the battery voltage, a Phidget current transducer measures the battery charging current, a Digikey current transducer measures the battery discharge current, and a Digikey sensor measures the temperature of the battery. The smartphones were configured to wake up for 4 seconds every minute and collect four data samples, one per second, from all the sensors.

More information can be found at https://iss4e.ca/webike-software/ and https://iss4e.ca/webike-a-three-year-study-on-e-bikes-as-a-mode-of-sustainable-transport-in-a-canadian-city/

To identify trip activities and battery charging sessions from the raw data in webike.json, you can follow the activity detection algorithm in Section 4.1 of "Usage Patterns of Electric Bicycles: An Analysis of the WeBike Project" (https://www.hindawi.com/journals/jat/2017/3739505/); the scripts we used are available at https://github.com/iss4e.

Fields (the unit conversions below are illustrated in the sketch after this entry):
- time: timestamp of the data sample
- acceleration_x: acceleration along the x axis
- acceleration_y: acceleration along the y axis
- acceleration_z: acceleration along the z axis
- ambient_temperature: ambient temperature
- atmospheric_pressure: atmospheric pressure
- battery_temperature: temperature inside the telemetry box (from a sensor in the box, in Celsius)
- charging_current: charging current (Phidget sensor), reported as a float value; to convert to amperes, use Charging Current (A) = Sensor Value * 0.05 (therefore, a sensor value of 400 implies a charging current of 2 A)
- code_version: version of our software stack source code
- discharge_current: discharging current (ISS4E-built sensor): Discharge Current (A) = (Sensor Value - 504) * 0.033; 504 is the value the sensor reports at 0 A and can vary slightly (plus or minus a few units), so adjust accordingly
- gravitational_acceleration: gravitational acceleration
- gyroscope_x: angular speed around the x axis
- gyroscope_y: angular speed around the y axis
- gyroscope_z: angular speed around the z axis
- light_level: light level in lux
- linear_acceleration_x: linear acceleration along the x axis
- linear_acceleration_y: linear acceleration along the y axis
- linear_acceleration_z: linear acceleration along the z axis
- magnetic_field_x: magnetic field along the x axis
- magnetic_field_y: magnetic field along the y axis
- magnetic_field_z: magnetic field along the z axis
- phone_battery_state: the state the phone's battery is in
- proximity_sensor: proximity sensor
- voltage: battery voltage (Phidget sensor, in V); the actual voltage is 32/22 times the value read by the sensor
- phone_battery_percentage: percentage of phone battery left
- phone_charging_or_full: whether the phone is charging or full
- phone_is_AC_charge: whether the phone is charging from an AC adapter
- phone_is_USB_charge: whether the phone is charging over USB
- rotation_scalar: rotation scalar
- rotation_x: rotation around the x axis
- rotation_y: rotation around the y axis
- rotation_z: rotation around the z axis

Funding: Cisco Systems and the Natural Sciences and Engineering Research Council of Canada (NSERC).
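To make the conversion formulas above concrete, the following minimal sketch implements them as Python helpers; the helper names and example values are illustrative and are not part of the dataset or the iss4e scripts.

```python
# Minimal sketch of the unit conversions stated in the field list above.
# Helper names and example values are illustrative only.

def charging_current_amps(raw: float) -> float:
    # Description: Charging Current (A) = Sensor Value * 0.05. Note that the
    # description's own example (400 -> 2 A) would imply a factor of 0.005,
    # so verify the scale against the data before relying on it.
    return raw * 0.05

def discharge_current_amps(raw: float, zero_offset: float = 504.0) -> float:
    # 504 is the nominal sensor reading at 0 A; it can drift by a few units.
    return (raw - zero_offset) * 0.033

def battery_voltage_volts(raw: float) -> float:
    # Actual battery voltage is 32/22 times the value read by the sensor.
    return raw * 32.0 / 22.0

print(charging_current_amps(400))    # 20.0 A with the stated 0.05 factor
print(discharge_current_amps(504))   # 0.0 A (sensor at its zero-current value)
print(battery_voltage_volts(27.5))   # 40.0 V (hypothetical sensor reading)
```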
University of Waterloo Dataverse
Borealis
Jia Ying Lin; Costin Ograda-Bratu; S. Huo; S. Keshav (2020-12-01)

Data Owner: Jia Ying Lin, Costin Ograda-Bratu, and S. Keshav

Data Description: Many anomalies can affect a solar panel's power production, including cloudiness, snow, dust, and shadows. Images of the solar panels are used as input to detect snow, since snow can be easily identified in images. This dataset contains a set of measurements from a solar installation on top of the ERC building at the University of Waterloo, along with photos of the system that could allow image recognition to identify snow on the panels.

Data Size: The dataset is 172.2 MB in total.

Data Field Description: The Images folder contains three subfolders, no_snow, partial, and all_snow, holding images of the solar panels with no snow, partially covered by snow, and completely covered in snow, respectively (see the indexing sketch after this entry). Each jpg file is named with the timestamp at which the image was taken. Images were taken on January 26th, 27th, and 28th; February 2nd, 3rd, and 28th; and March 1st, 2nd, 21st, 22nd, 23rd, 24th, and 25th of 2019.

In the Power_Measurements folder, each file is named after the date on which the measurements were taken. Each entry consists of the timestamp, battery voltage, and charging current. Power measurements were collected from February 27th, 2019 to July 22nd, 2019.

Funding: Cisco Systems and the Natural Sciences and Engineering Research Council of Canada (NSERC).
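As an illustration of working with the folder layout described above, here is a minimal sketch that indexes the images by their snow label; the root path and the use of pathlib are assumptions, not part of the dataset.

```python
# Minimal sketch: index the labelled snow images for a classification experiment.
# The folder names come from the description above; the root path is assumed.
from pathlib import Path

ROOT = Path("Images")
LABELS = ["no_snow", "partial", "all_snow"]

samples = [
    (img_path, label)
    for label in LABELS
    for img_path in sorted((ROOT / label).glob("*.jpg"))
]
print(f"{len(samples)} labelled images")  # each filename is the capture timestamp
```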
University of Waterloo Dataverse
Borealis
Omid Ardakanian; S. Huo; S. Keshav (2020-12-01)

Data Description: Current Cost Envi monitors (http://www.currentcost.com/product-envir.html) measure real power using D-clamp Hall-effect sensors (https://www.fluke.com/en-ca/learn/blog/clamps/inside-hall-effect-clamp-meters) attached to each of the two phases of the household mains at the point of interconnection to the electrical grid. These meters were installed at 30 households in Waterloo, ON, Canada and connected to small notebook computers to record data during 2011-2012. At least 3 months of data were collected from a total of 25 households.

Files in the dataset are named by household ID (e.g., home2.csv contains data collected from the 2nd household). Data are collected every 6 seconds. Each entry consists of the timestamp at the time of data collection, the room temperature (in Fahrenheit), the per-phase power consumption (in watts), and the sum of the power consumption of both phases (see the loading sketch after this entry).

This dataset contains 26 files, one for each household, for a total of 2.8 GB.

Below is a summary of how much data is available from each household:

ID    Available data
2     206 days
3     367 days
4     288 days
5     346 days
6     150 days
7     140 days
8     306 days
9     196 days
10    92 days
12    169 days
13    166 days
14    289 days
15    198 days
16    204 days
17    248 days
18    274 days
19    182 days
20    271 days
21    325 days
22    277 days
23    235 days
24    5 days
25    279 days
26    107 days
27    133 days
30    240 days

Funding: Cisco Systems and the Natural Sciences and Engineering Research Council of Canada (NSERC).
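A minimal sketch of loading one household file and estimating daily energy from the 6-second readings described above; the column order is inferred from the description, and the absence of a header row and the timestamp format are assumptions to verify against the files.

```python
# Minimal sketch: load one household file and estimate daily energy use.
# Column order follows the description above; header and timestamp format
# are assumptions to check against the actual files.
import pandas as pd

cols = ["timestamp", "temperature_F", "phase1_W", "phase2_W", "total_W"]
df = pd.read_csv("home2.csv", header=None, names=cols, parse_dates=["timestamp"])

# Readings arrive every 6 seconds, so each sample covers 6/3600 hours;
# energy in kWh is power (W) * hours / 1000.
df["kwh"] = df["total_W"] * 6 / 3600 / 1000
daily_kwh = df.set_index("timestamp")["kwh"].resample("D").sum()
print(daily_kwh.describe())
```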
University of Waterloo Dataverse
Borealis
Alimohammad Rabbani; Costin Ograda-Bratu; S. Huo; S. Keshav (2020-11-16)

Data Owner: Alimohammad Rabbani, S. Keshav

Data Description: Unlike conventional centralized HVAC systems that heat or cool an entire zone, personal environmental control systems can provide personalized thermal comfort for each individual, but they are expensive and difficult to deploy. The SPOT* system, in contrast, is an individual thermal comfort system that can be deployed rapidly and cost-effectively.

This dataset contains data from a cumulative 58,000 hours of operation of 45 SPOT* systems in 15 offices. Invitations were sent to approximately 1500 residents of four selected campus buildings, and we distributed the systems in first-come-first-served order. Over the course of the data collection, only one person withdrew from the trial (because they left the university), and only two failures occurred, both resolved by re-plugging the device into its power outlet. Details of the design and deployment of the SPOT* system can be found in the paper "The SPOT* Personal Thermal Comfort System" by A. Rabbani and S. Keshav: http://blizzard.cs.uwaterloo.ca/keshav/wiki/images/b/be/Spotstar.pdf

An AD22100 surface-mount temperature sensor with 0.1 °C resolution provides the temperature readings. An AMN22111 passive infrared human-detection sensor outputs analog values that are converted to values between 0 and 1000 on the Raspberry Pi. When there is no movement, the sensor outputs values of approximately 500. Each movement causes the sensor to generate first one value close to 1000 and then another close to 0; the closer these values are to 1000 and 0, the greater the intensity of the movement. Over a 30-second window, a standard deviation close to 0 indicates almost no movement, and thus no occupancy, while higher standard deviations correspond to more movement. Users interact with the SPOT* system through a Web app, and we collect users' comfort preferences through this control app. More information on the data collection process can be found in Sections 3.1 and 4.1 of "The SPOT* Personal Thermal Comfort System".

Files (the PPV calibration step is sketched after this entry):

PPVs.csv:
- time: Epoch Unix timestamp (seconds since Jan 01 1970, UTC)
- Predicted Mean Vote (ASHRAE scale): the Predicted Mean Vote (PMV) model estimates an average worker's comfort level on the 7-point ASHRAE scale as pmv = f_pmv(t_a, t̄_r, v_ar, p_a, M, I_cl), a function of air temperature, mean radiant temperature, relative air velocity, water vapour pressure, metabolic rate, and clothing insulation
- Predicted Personal Vote (ASHRAE scale): PPV is a generalized version of PMV, computed as a*PMV + b; during a training period, the system collects comfort votes from the user and extracts the two user-specific parameters a and b using least-squares regression
- device_id: ID of the device placed in each office

Motions.csv:
- time: Epoch Unix timestamp (seconds since Jan 01 1970, UTC)
- standard deviation of motion sensor in the last 30s: standard deviation of motion values (i.e. motion intensity) during 30-second time windows
- device id: ID of the device placed in each office

Occupancies.csv:
- time: Epoch Unix timestamp (seconds since Jan 01 1970, UTC)
- occupancy: 0 (not occupied), 1 (slight chance), 2 (high chance), 3 (definitely occupied)
- device id: ID of the device placed in each office

Temperatures.csv:
- time: Epoch Unix timestamp (seconds since Jan 01 1970, UTC)
- temperature: degrees C
- device id: ID of the device placed in each office

Funding: Cisco Systems and the Natural Sciences and Engineering Research Council of Canada (NSERC).
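To make the PPV calibration described above concrete, here is a minimal sketch of fitting the user-specific parameters a and b by least squares; the vote arrays are hypothetical, and the trial's actual training procedure is documented in the SPOT* paper.

```python
# Minimal sketch of the calibration described above: fit the user-specific
# parameters a and b in PPV = a * PMV + b by least squares. The arrays below
# are hypothetical example data, not values from the dataset.
import numpy as np

pmv = np.array([-1.2, -0.5, 0.0, 0.4, 1.1])     # model's predicted mean votes (hypothetical)
votes = np.array([-2.0, -1.0, -0.5, 0.0, 0.5])  # user's comfort votes, ASHRAE scale (hypothetical)

A = np.column_stack([pmv, np.ones_like(pmv)])
(a, b), *_ = np.linalg.lstsq(A, votes, rcond=None)
print(f"PPV = {a:.2f} * PMV + {b:.2f}")
```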
University of Waterloo Dataverse
Borealis
S. Doubov; Costin Ograda-Bratu; S. Huo; S. Keshav (2020-11-30)

Data Description: A camera was taped to the ceiling of a lab at the University of Waterloo, and pictures were taken every 3 minutes between 10:15 am and 8 pm on March 9th, 11th, 12th, 13th, 14th, 15th, and 18th. Each image was rotated by -4.5 degrees and split into four quadrants and a central region. A demonstration of how the quadrants are cropped is included in the dataset as demo1.jpg and demo2.jpg.

The files are named by the date the photo was captured, a serial number, and the quadrant number; for example, mar-09-fri_cap_5_0.txt holds the information for the 5th picture taken on Friday, March 9th, cropped to the 0th quadrant (see the parsing sketch after this entry).

Each file contains a binary value denoting occupancy, where 1 indicates that at least one person is present in the quadrant and 0 indicates that no one is.

This dataset includes 3365 files of occupancy information derived from 673 images, each divided into 5 quadrants.

This dataset has a total size of 215 KB.

Funding: Cisco Systems and the Natural Sciences and Engineering Research Council of Canada (NSERC).
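A minimal sketch of parsing the occupancy files according to the naming scheme above; the working-directory layout and the assumption that each file holds a single 0/1 value should be verified against the unpacked dataset.

```python
# Minimal sketch: parse the occupancy files by the naming scheme above.
# Assumes the files sit in the working directory and each contains one 0/1 value.
from pathlib import Path
import re

PATTERN = re.compile(r"(?P<day>[a-z]{3}-\d{2}-[a-z]{3})_cap_(?P<capture>\d+)_(?P<quadrant>\d+)\.txt")

occupancy = {}
for path in Path(".").glob("*_cap_*_*.txt"):
    m = PATTERN.fullmatch(path.name)
    if m:
        key = (m["day"], int(m["capture"]), int(m["quadrant"]))
        occupancy[key] = int(path.read_text().strip())  # 1 = someone in the quadrant

print(len(occupancy), "quadrant observations")
```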

Map search instructions

1. Enable the map filter by clicking the "Limit to the area on the map" button.
2. Move the map to show the area you are interested in. Hold the Shift key and click to outline a specific area to zoom in on. The search results will update as you move the map.
3. To see the details of a location, you can either click an item in the search results, or click a location's pin on the map and then the link associated with its title.
Note: Clusters give a visual overview of where the data are located. Since at most 50 locations can be shown on the map, the map may not accurately reflect the total number of search results.