Weather at a glance with Raspberry Pi

Scott Edenbaum
Posted on Sep 5, 2017

Introduction

I am a big fan of the Raspberry Pi single-board computing platform, and I've used the devices in numerous projects. I recently acquired a Pimoroni Unicorn pHAT (a Pi Zero-sized "Hardware Attached on Top" board) - a 4 x 8 RGB LED display - and I decided to see how well such a small display would work to convey information at a glance.

I will use a Pi Zero W and the Pimoroni Unicorn pHAT (I know, it's a silly name!) to display local weather conditions in near real time.

To reach this goal, I will use Python with Pandas to scrape the current hourly weather information for a given ZIP code. To make full use of the 4 separate LED columns, the display will show:

  • Temperature
  • Humidity
  • Precipitation
  • Wind Speed

Before stepping into the code there is a conceptual problem - how do I output a float as a combination of 8 LEDs of various colors?

Since the 'screen' will display 4 separate values, we'll need to code them separately.

First is temperature. In the NYC area, where I live, the temperature typically ranges from 0 F to 95 F over the course of a year, so I split the temperature into three groups: Cold, Warm, and Hot, represented by blue, green, and red respectively.

Cold is anything at or below 32 F, Warm is 33 - 74 F, and Hot is 75 F and above.

According to the display below, it is ~85 degrees, ~80% humidity, ~20% precipitation, and a wind speed of ~16 MPH in NYC.

  • Temperature (blue -> cold [0, 32], green -> warm [33, 74], red -> hot [75, 100])
  • Humidity: number of lit LEDs = % humidity * 8 / 100
  • Precipitation: number of lit LEDs = % precipitation * 8 / 100
  • Wind speed in MPH on a log_2 scale (a rough sketch of these mappings follows below)
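As a rough sketch, the percentage and log-scale rules above could map onto LED counts like this (the helper names here are mine for illustration, not from the original script):

import numpy as np

def leds_for_percent(value):
    # Map a 0-100 % reading (humidity or precipitation) onto 0-8 lit LEDs
    return int(round(value * 8 / 100))

def leds_for_wind(mph):
    # One reading of the log_2 rule: 2 MPH -> 1 LED, 4 -> 2, 16 -> 4, capped at 8
    return min(8, int(np.ceil(np.log2(max(mph, 1)))))

With this reading, the ~80% humidity in the NYC example lights six LEDs and the ~16 MPH wind lights four.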

When I took this image it was a bit cooler in Memphis, TN (38101): ~75 degrees, ~90% humidity, ~30% precipitation, and a wind speed of ~16 MPH.

Lastly we have Beaver, WV (25813): ~87 degrees, ~65% humidity, ~0% precipitation, and a wind speed of ~8 MPH.

Code

Below is a snippet of code for setting the color and number of active LEDs in a given row.

 

import numpy as np
import pandas as pd


def set_temp(temp):
    temp = int(temp)

    def linshuffle(linspace, temp):
        # Replace the lowest bin edge with the temperature, sort, and use its
        # sorted position as the number of LEDs to light (at least 1)
        ticks = linspace
        ticks.put(0, int(temp))
        ticks.sort()
        ticks = pd.Series(ticks)
        return max(1, int(ticks[ticks == temp].index[0]))

    if temp <= 32:
        # Cold: blue
        T = linshuffle(np.linspace(0, 32, 10), temp)
        R, G, B = (0, 0, 100)
        set_hpline([0], T, R, G, B)   # light row 0 with T LEDs of this color
    elif temp < 75:
        # Warm: green, brighter as it gets warmer
        T = linshuffle(np.linspace(32, 75, 10), temp)
        R, G, B = (0, int(np.ceil(3 * temp)), 0)
        set_hpline([0], T, R, G, B)
    else:
        # Hot: red, brighter as it gets hotter
        T = linshuffle(np.linspace(75, 100, 10), temp)
        R, G, B = (2 * temp, 0, 0)
        set_hpline([0], T, R, G, B)
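The set_hpline() helper is defined in the full script. As a rough idea of what it does, a minimal sketch using the standard Pimoroni unicornhat library might look like the following (this is my illustration of the behavior, not necessarily how the original helper is written):

import unicornhat as unicorn

unicorn.set_layout(unicorn.PHAT)   # 8 x 4 Unicorn pHAT layout
unicorn.brightness(0.5)

def set_hpline(rows, n_leds, r, g, b):
    # Light the first n_leds pixels of each listed row in one color,
    # switching off the remainder of the row
    for y in rows:
        for x in range(8):
            if x < n_leds:
                unicorn.set_pixel(x, y, r, g, b)
            else:
                unicorn.set_pixel(x, y, 0, 0, 0)
    unicorn.show()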

 

How do we get weather information?

There are two options: the first is to capture the data with a hardware sensor, and the second is to scrape the relevant information from the web. In this instance, we can get the hourly weather information containing temperature (F), humidity (%), precipitation (%), and wind speed (MPH).

The data above is from weather.com, their hourly weather information is exactly what we need!

Here's the Python & Pandas code to quickly scrape the table of data without needing to parse the HTML tree.

def get_weather_df(zip_code):
    # Keep only the first 5-digit block of the supplied ZIP code
    Zip_Code = zip_code.strip().replace(' ', '').split('-')[0]
    if len(Zip_Code) == 5 and Zip_Code.isnumeric():
        url = 'https://weather.com/weather/hourbyhour/l/%s:4:US' % (Zip_Code)
    else:
        url = weather_url()   # fallback helper defined elsewhere in the script
    # read_html() returns a list of DataFrames; the hourly table is the first one
    temp_df = pd.read_html(url)[0]
    temp_df = temp_df.iloc[:, 1:]
    temp_df.columns = ['Time', 'Description', 'Temp', 'Feels', 'Precip', 'Humidity', 'Wind']
    temp_df['Time'] = pd.to_datetime(temp_df['Time'])
    # Strip the trailing degree sign / % symbol and convert to int
    temp_df['Temp'] = temp_df['Temp'].map(lambda x: int(x[:-1]))
    temp_df['Feels'] = temp_df['Feels'].map(lambda x: int(x[:-1]))
    temp_df['Precip'] = temp_df['Precip'].map(lambda x: int(x[:-1]))
    temp_df['Humidity'] = temp_df['Humidity'].map(lambda x: int(x[:-1]))
    # Split the wind string into direction (first token) and speed (second token)
    temp_df['Wind Direction'] = temp_df['Wind'].map(lambda x: x.split()[0])
    temp_df['Wind Speed'] = temp_df['Wind'].map(lambda x: x.split()[1])
    temp_df.index.name = ''
    temp_df.index = temp_df['Time']
    temp_df = temp_df[['Description', 'Temp', 'Feels', 'Precip', 'Humidity', 'Wind Direction', 'Wind Speed']]
    return temp_df

Lucky for us, the Pandas read_html() method is powerful enough to grab the entire table of data in one go! The subsequent lines of code clean up the content - i.e. changing strings to ints and removing the % symbols and degree signs.
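To tie everything together, a simple loop along these lines could refresh the display from the most imminent hourly reading (a hypothetical sketch: set_humidity(), set_precip(), and set_wind() are assumed to follow the same pattern as set_temp() above):

import time

while True:
    df = get_weather_df('10001')      # any 5-digit ZIP code
    now = df.iloc[0]                  # first row = the upcoming hour
    set_temp(now['Temp'])
    set_humidity(now['Humidity'])     # assumed helpers, analogous to set_temp
    set_precip(now['Precip'])
    set_wind(int(now['Wind Speed']))
    time.sleep(600)                   # refresh every 10 minutes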

The full code can be found on my GitHub page: https://github.com/edenbaus/unicornhat/blob/master/unicornweather.py

