
Getting Weather Data

I decided that I wanted to ingest some weather data through a different route, this time combining a Google Apps Script function that retrieves the data from the API with another script that connects to the MySQL database and deposits the data. Both steps are set up to run on a schedule.

Setting up the API Call: 

I followed the instructions in the following Medium post to create a function in Google Apps Script that calls an API for me. I use it to call the weather API and retrieve the current weather data for where I live.
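
As a rough illustration (not the exact code from the Medium post), a function along these lines would do the job. The endpoint is assumed to be the WeatherAPI.com current-conditions URL, and the API key property, location and sheet name are placeholders for this sketch:

function fetchWeather() {
  // Assumed WeatherAPI.com current-conditions endpoint; key and location are placeholders.
  var apiKey = PropertiesService.getScriptProperties().getProperty('WEATHER_API_KEY');
  var url = 'https://api.weatherapi.com/v1/current.json?key=' + apiKey + '&q=London&aqi=yes';

  var response = UrlFetchApp.fetch(url);
  var data = JSON.parse(response.getContentText());

  // Flatten the nested JSON into a single row of values.
  var row = [
    data.location.name,
    data.location.region,
    data.location.country,
    data.current.last_updated,
    data.current.temp_c,
    data.current.condition.text,
    data.current.humidity,
    data.current.air_quality.pm2_5
    // ...remaining fields follow the same pattern.
  ];

  // Overwrite the single data row in the sheet (sheet name is an assumption).
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Weather');
  sheet.getRange(2, 1, 1, row.length).setValues([row]);
}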

These two things combined import the data into my Google Sheet.


I have then set this up to run on a four-hour schedule within Google Sheets using one of its time-driven triggers.
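
The schedule is configured through the Apps Script Triggers UI, but the same thing can be done in code; a minimal sketch, assuming the fetch function above is named fetchWeather:

function createWeatherTrigger() {
  // Time-driven trigger that runs fetchWeather every 4 hours.
  ScriptApp.newTrigger('fetchWeather')
    .timeBased()
    .everyHours(4)
    .create();
}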

Getting the data into MySQL:

Now that I have the data in Google Sheets, I want to import it into a database regularly, as the sheet only stores a single row at a time (though I could probably get it to persist there as well). Looking at my options with the JDBC drivers available in Apps Script, MySQL would work, Snowflake wouldn't easily, and I didn't really want to run a Keboola Flow several times a day to track this.
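
MySQL works here because Apps Script ships with a built-in Jdbc service that understands jdbc:mysql:// connection strings. A quick connectivity check looks something like this (host, database and credentials are placeholders):

function testMySqlConnection() {
  // Apps Script's built-in Jdbc service supports jdbc:mysql:// URLs out of the box.
  var conn = Jdbc.getConnection('jdbc:mysql://db.example.com:3306/weatherdb', 'dbuser', 'dbpassword');
  Logger.log(conn.getCatalog());
  conn.close();
}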

I found the following post and modified the code as needed for my weather data and MySQL credentials. I then set that to run on a schedule as well, so the data gets imported into the MySQL database regularly and I can integrate it with my parkrun and step data to look for correlations between times, step counts and the weather.
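
The shape of that script is essentially: read the single row out of the sheet and insert it into the weather table with a prepared statement. A simplified sketch, with placeholder connection details and only the first few columns shown:

function exportWeatherToMySql() {
  // Read the single data row from the sheet (sheet name and row are assumptions).
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Weather');
  var row = sheet.getRange(2, 1, 1, sheet.getLastColumn()).getValues()[0];

  // Connect with Apps Script's built-in Jdbc service (placeholder credentials).
  var conn = Jdbc.getConnection('jdbc:mysql://db.example.com:3306/weatherdb', 'dbuser', 'dbpassword');

  // Parameterised insert; the full statement would list every column in the same order as the sheet.
  var stmt = conn.prepareStatement(
    'INSERT INTO weather (LocationName, LocationRegion, LocationCountry, CurrentLastUpdated, CurrentTempC) ' +
    'VALUES (?, ?, ?, ?, ?)');
  for (var i = 0; i < 5; i++) {
    stmt.setString(i + 1, String(row[i]));
  }
  stmt.execute();

  stmt.close();
  conn.close();
}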

Weather API

MySQL Table Script:



-- Destination table for the current weather data; all fields are stored as VARCHAR.
CREATE TABLE weather (
LocationName VARCHAR(100),
LocationRegion VARCHAR(100),
LocationCountry VARCHAR(100),
LocationLat VARCHAR(100),
LocationLon VARCHAR(100),
LocationTzId VARCHAR(100),
LocationLocaltimeEpoch VARCHAR(100),
LocationLocaltime VARCHAR(100),
CurrentLastUpdatedEpoch VARCHAR(100),
CurrentLastUpdated VARCHAR(100),
CurrentTempC VARCHAR(100),
CurrentTempF VARCHAR(100),
CurrentIsDay VARCHAR(100),
CurrentConditionText VARCHAR(100),
CurrentConditionIcon VARCHAR(200),
CurrentConditionCode VARCHAR(100),
CurrentWindMph VARCHAR(100),
CurrentWindKph VARCHAR(100),
CurrentWindDegree VARCHAR(100),
CurrentWindDir VARCHAR(100),
CurrentPressureMb VARCHAR(100),
CurrentPressureIn VARCHAR(100),
CurrentPrecipMm VARCHAR(100),
CurrentPrecipIn VARCHAR(100),
CurrentHumidity VARCHAR(100),
CurrentCloud VARCHAR(100),
CurrentFeelslikeC VARCHAR(100),
CurrentFeelslikeF VARCHAR(100),
CurrentVisKm VARCHAR(100),
CurrentVisMiles VARCHAR(100),
CurrentUv VARCHAR(100),
CurrentGustMph VARCHAR(100),
CurrentGustKph VARCHAR(100),
CurrentAirQualityCo VARCHAR(100),
CurrentAirQualityNo2 VARCHAR(100),
CurrentAirQualityO3 VARCHAR(100),
CurrentAirQualitySo2 VARCHAR(100),
CurrentAirQualityPm25 VARCHAR(100),
CurrentAirQualityPm10 VARCHAR(100),
CurrentAQUsepaindex VARCHAR(100),
CurrentAQGbdefraindex VARCHAR(100));
