Netatmo weather station data ingestion to Azure Log Analytics
At the time of writing this blog post, the Azure Log Analytics HTTP Data Collector API had only been released for a few weeks. I wanted to showcase the possibilities of data and log ingestion into the Log Analytics platform, so I created a demo to prove that we can ingest data other than operating system and IT log data, and do something with it.
Problem
Ingest data from a Netatmo Weather Station into Azure Log Analytics and present the data in a visually appealing way.
Solution
The Netatmo Weather Station consists of two sensor units, one indoor and one outdoor, which measure environmental metrics like temperature, humidity, CO2, etc. The general setup of a Netatmo is to download the mobile app to your phone, create a new Netatmo account and then connect the weather station to your account. Once this is done, the weather station uploads sensor readings to Netatmo every few minutes (I'm not sure exactly how often; my guess is every 5 minutes). The app on your phone then queries the Netatmo service and displays rich dashboards of the data.
This also means that you cannot read data from the sensors directly. However, Netatmo exposes an API through its developer program, Netatmo Connect, which lets you read data from the sensors and stations on your account.
The high-level steps for this solution were:
- Create a new app on Netatmo Connect
- Write a PowerShell script to read the sensor readings from the Netatmo API and upload them to the Log Analytics HTTP Data Collector API
- Create some dashboards.
Netatmo Connect
Navigate to https://dev.netatmo.com/dev/createanapp and create a new app. This will provide you with a Client ID and a Client Secret; keep these for later.
Powershell Script
I used the PowerShell script from the examples section of the official Log Analytics HTTP Data Collector API documentation and added the Netatmo API reads. This is the code that does all the work. Note that you need to replace all the fields at the top that are marked ‘xxxxxxxxxxxxxxxxx’ with your own configuration data. Other than that, you can pretty much copy and paste it into something like PowerShell ISE, run it, and it should work.
In short, the script does the following:
- Authenticate to Netatmo
- Read the sensor data from the API
- Convert the JSON response from Netatmo into a PowerShell object
- Correlate the different measurements and add a new log entry for each sensor and its values.
Code:
```powershell
#------------------------------------------------------------------------------
# THIS CODE AND ANY ASSOCIATED INFORMATION ARE PROVIDED "AS IS" WITHOUT
# WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT
# LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS
# FOR A PARTICULAR PURPOSE. THE ENTIRE RISK OF USE, INABILITY TO USE, OR
# RESULTS FROM THE USE OF THIS CODE REMAINS WITH THE USER.
#------------------------------------------------------------------------------

$global:access_token  = $null
$global:refresh_token = $null

# Netatmo details: Client ID and secret from creating the app, and a username and password that has access to the API.
$global:client_id        = 'xxxxxxxxxxxxxxxxxxxxxxxxxxx'
$global:client_secret    = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
$global:Netatmo_username = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
$global:Netatmo_password = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'

# OMS workspace variables
$global:CustomerId = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"   # Workspace ID
$global:SharedKey  = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
$global:LogType    = "Netatmo_WeatherStation"                # Create a Log Type name

# Function to send data to the OMS workspace
function WriteOMSEvent($SensorID, $SensorName, $MeasurementName, $MeasurementValue) {

    # Specify a time in the format YYYY-MM-DDThh:mm:ssZ to set a created datetime for the records.
    $TimeStampField = [DateTime]::UtcNow.ToString("r")

    # Create the json record
    $json = @"
[{
    "Sensor_id": "$SensorID",
    "Sensor_name": "$SensorName",
    "Measurement_name": "$MeasurementName",
    "Value": $MeasurementValue
}]
"@

    # Output the json to the console so that we can see the results.
    $json

    # Function to create the authorization signature.
    Function Build-Signature ($customerId, $SharedKey, $date, $contentLength, $method, $contentType, $resource) {
        $xHeaders       = "x-ms-date:" + $date
        $stringToHash   = $method + "`n" + $contentLength + "`n" + $contentType + "`n" + $xHeaders + "`n" + $resource
        $bytesToHash    = [Text.Encoding]::UTF8.GetBytes($stringToHash)
        $keyBytes       = [Convert]::FromBase64String($SharedKey)
        $sha256         = New-Object System.Security.Cryptography.HMACSHA256
        $sha256.Key     = $keyBytes
        $calculatedHash = $sha256.ComputeHash($bytesToHash)
        $encodedHash    = [Convert]::ToBase64String($calculatedHash)
        $authorization  = 'SharedKey {0}:{1}' -f $customerId, $encodedHash
        return $authorization
    }

    # Function doing the POST to the OMS workspace.
    Function Post-OMSData($customerId, $SharedKey, $body, $LogType) {
        $method        = "POST"
        $contentType   = "application/json"
        $resource      = "/api/logs"
        $rfc1123date   = [DateTime]::UtcNow.ToString("r")
        $contentLength = $body.Length
        $signature     = Build-Signature `
            -customerId $customerId `
            -sharedKey $SharedKey `
            -date $rfc1123date `
            -contentLength $contentLength `
            -method $method `
            -contentType $contentType `
            -resource $resource
        $uri = "https://" + $customerId + ".ods.opinsights.azure.com" + $resource + "?api-version=2016-04-01"

        $headers = @{
            "Authorization"        = $signature;
            "Log-Type"             = $LogType;
            "x-ms-date"            = $rfc1123date;
            "time-generated-field" = $TimeStampField;
        }

        $response = Invoke-WebRequest -Uri $uri -Method $method -ContentType $contentType -Headers $headers -Body $body -UseBasicParsing
        return $response.StatusCode
    }

    # The only call after creating the json document that's doing the work.
    Post-OMSData -customerId $customerId -sharedKey $SharedKey -body ([System.Text.Encoding]::UTF8.GetBytes($json)) -logType $LogType

} #End of function WriteOMSEvent

# Authenticate to the Netatmo API and get an access token.
function Netatmo-DoAuth {
    $postParams = @{
        grant_type    = 'password'
        client_id     = $client_id
        client_secret = $client_secret
        username      = $Netatmo_username
        password      = $Netatmo_password
    }
    $Request = Invoke-WebRequest -Uri 'https://api.netatmo.com/oauth2/token' -Method POST -Body $postParams -UseBasicParsing

    if ($Request.StatusCode -eq 200) {
        $ResponseJson = $Request.Content | ConvertFrom-Json
        $global:access_token  = $ResponseJson.access_token
        $global:refresh_token = $ResponseJson.refresh_token
    }
    else {
        Write-Error -Message $Request.StatusDescription -ErrorId $Request.StatusCode
    }
}

# Read the Netatmo API for sensor data, convert it into an object, and post it to OMS.
function ReadSensorData {
    $SensorpostParams = @{ access_token = $access_token }

    # Request sensor data from Netatmo
    $SensorRequest = Invoke-WebRequest -Uri 'https://api.netatmo.com/api/getstationsdata' -Method POST -Body $SensorpostParams -UseBasicParsing

    if ($SensorRequest.StatusCode -eq 200) {
        # Convert to an object.
        $SensorContent = $SensorRequest.Content | ConvertFrom-Json

        # Indoor unit sensors
        $MainUnit_id             = $SensorContent.body.devices[0]._id
        $MainUnit_module_name    = $SensorContent.body.devices[0].module_name
        $MainUnit_wifi_status    = $SensorContent.body.devices[0].wifi_status
        $MainUnit_Pressure       = $SensorContent.body.devices[0].dashboard_data.Pressure
        $MainUnit_pressure_trend = $SensorContent.body.devices[0].dashboard_data.pressure_trend
        $MainUnit_Temperature    = $SensorContent.body.devices[0].dashboard_data.Temperature
        $MainUnit_temp_trend     = $SensorContent.body.devices[0].dashboard_data.temp_trend
        $MainUnit_Humidity       = $SensorContent.body.devices[0].dashboard_data.Humidity
        $MainUnit_CO2            = $SensorContent.body.devices[0].dashboard_data.CO2

        WriteOMSEvent -SensorID $MainUnit_id -SensorName $MainUnit_module_name -MeasurementName "Pressure" -MeasurementValue $MainUnit_Pressure
        WriteOMSEvent -SensorID $MainUnit_id -SensorName $MainUnit_module_name -MeasurementName "Wifi Signal" -MeasurementValue $MainUnit_wifi_status
        WriteOMSEvent -SensorID $MainUnit_id -SensorName $MainUnit_module_name -MeasurementName "Temperature" -MeasurementValue $MainUnit_Temperature
        WriteOMSEvent -SensorID $MainUnit_id -SensorName $MainUnit_module_name -MeasurementName "Humidity" -MeasurementValue $MainUnit_Humidity
        WriteOMSEvent -SensorID $MainUnit_id -SensorName $MainUnit_module_name -MeasurementName "CO2" -MeasurementValue $MainUnit_CO2

        # Outdoor unit sensors
        $OutdoorUnit_id          = $SensorContent.body.devices[0].modules[0]._id
        $OutdoorUnit_module_name = $SensorContent.body.devices[0].modules[0].module_name
        $OutdoorUnit_Temperature = $SensorContent.body.devices[0].modules[0].dashboard_data.Temperature -as [double]
        $OutdoorUnit_temp_trend  = $SensorContent.body.devices[0].modules[0].dashboard_data.temp_trend
        $OutdoorUnit_Humidity    = $SensorContent.body.devices[0].modules[0].dashboard_data.Humidity -as [double]

        WriteOMSEvent -SensorID $OutdoorUnit_id -SensorName $OutdoorUnit_module_name -MeasurementName "Temperature" -MeasurementValue $OutdoorUnit_Temperature
        WriteOMSEvent -SensorID $OutdoorUnit_id -SensorName $OutdoorUnit_module_name -MeasurementName "Humidity" -MeasurementValue $OutdoorUnit_Humidity
    }
    else {
        if ($SensorRequest.StatusCode -eq 403) {
            Write-Error -Message "Netatmo Unauthorized, refreshing access token"
            Netatmo-DoAuth
            ReadSensorData
        }
        Write-Error -Message $SensorRequest.StatusDescription -ErrorId $SensorRequest.StatusCode
    }
}

# Call the two main functions to do the work
Netatmo-DoAuth
ReadSensorData
```
Using Azure automation to run script on a schedule
I wanted the script to run automatically once an hour. I could do that on my workstation, but I decided to do it in an Azure Automation account on the free SKU… because why not.
To get this done, I created a new Azure automation account:
Once this was done, I created a new runbook, pasted my PowerShell script into it, and set up a schedule to run it once an hour.
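If you prefer to set up the schedule from PowerShell rather than the portal, something along these lines should do it with the AzureRM Automation cmdlets. This is just a sketch; the resource group, automation account and runbook names below are placeholders for whatever you created above.

```powershell
# Sketch: create an hourly schedule and link it to the runbook.
# The names below are placeholders - replace them with your own.
Login-AzureRmAccount

$rg      = 'MyResourceGroup'
$account = 'MyAutomationAccount'
$runbook = 'Netatmo-WeatherStation'

# Hourly schedule, starting 10 minutes from now
New-AzureRmAutomationSchedule -ResourceGroupName $rg -AutomationAccountName $account `
    -Name 'Hourly' -StartTime (Get-Date).AddMinutes(10) -HourInterval 1

# Link the schedule to the runbook
Register-AzureRmAutomationScheduledRunbook -ResourceGroupName $rg -AutomationAccountName $account `
    -RunbookName $runbook -ScheduleName 'Hourly'
```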
I then tested the script, which shows output like this when successful:
Over to Log Analytics
After about an hour of sending events to Log Analytics, you should see the custom fields appear in the Settings section:
Once these fields show up, you should see all of them when doing a search:
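As a side note, the Data Collector API appends _CL to the custom log type and a type suffix to each field (_s for strings, _d for numbers), so the records land as Netatmo_WeatherStation_CL with fields like Sensor_name_s and Value_d. A search like the one below should then return an hourly average per sensor. This is a sketch using the legacy search syntax and the Get-AzureRmOperationalInsightsSearchResults cmdlet, with placeholder workspace and resource group names; you can also paste just the query into the Log Analytics search blade.

```powershell
# Sketch: hourly average of the ingested temperature readings, per sensor.
# Workspace and resource group names are placeholders.
$query = 'Type=Netatmo_WeatherStation_CL Measurement_name_s="Temperature" | measure avg(Value_d) by Sensor_name_s interval 1hour'
Get-AzureRmOperationalInsightsSearchResults -ResourceGroupName 'MyResourceGroup' `
    -WorkspaceName 'MyWorkspace' -Query $query
```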
And that pretty much proves the concept. However, I wanted to make it look a bit more awesome, so I created an extra custom solution to make my data look great.
Drilling down into the solution looks like this.
And once you click on a metric, it shows details, also in a graph.
If you would like to know how I created the custom solution, let me know and I'll write a blog post about it. Also, if you would like to be notified of new posts, please subscribe to my blog at the top left of the page.
Awesome, huge help! I will note that you can also use the ‘Invoke-RestMethod’ command instead of the ‘Invoke-WebRequest’ command. Then you don’t need to convert the output from JSON!
Thanks for the tip Kyle!
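For anyone wanting to try Kyle's suggestion, the Netatmo auth call could be trimmed down to something like this. It's a quick, untested sketch; note that Invoke-RestMethod throws on non-success status codes instead of returning them, so the status-code checks would need to move into a try/catch.

```powershell
# Sketch: same auth call using Invoke-RestMethod, which parses the JSON response for us.
$postParams = @{
    grant_type    = 'password'
    client_id     = $client_id
    client_secret = $client_secret
    username      = $Netatmo_username
    password      = $Netatmo_password
}
$ResponseJson = Invoke-RestMethod -Uri 'https://api.netatmo.com/oauth2/token' -Method POST -Body $postParams
$global:access_token  = $ResponseJson.access_token
$global:refresh_token = $ResponseJson.refresh_token
```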
Awesome