Wednesday, April 29, 2015

Repost: R's twitteR and Accessing the 1 Percent of Geotagged Tweets

About 1 percent of all tweets are geotagged.  Fortunately, most of these geotagged tweets appear in Twitter's public stream data.

Of course, this only applies to Twitter.  The percent of geotagged media varies by social network source.  For example, Instagram had up to 25% of photos geotagged by users in 2012, according to the New York Times.

We will take a look at the twitteR package, which provides an R interface to the Twitter API. The main reason I chose it is my familiarity with R.

Getting a Twitter Dev Account
First, head over to https://apps.twitter.com/.  You can use your regular Twitter account to log in. Click on the "Create New App" button in the upper right-hand corner.  Follow the on-screen instructions and be sure to read the Developer Agreement.

After your "app" is created, click on the "Permissions" tab and make sure the last radio button is selected: "Read, Write, and Access direct messages."  Also check out the "Keys and Access Tokens" tab, especially if you are already familiar with connecting to APIs.

Connecting with R
For this blog, I am using R 3.1.3 64-bit.  Start R, then follow the instructions in the screenshot below.
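The screenshot walks through the OAuth handshake. As a sketch, the typical twitteR setup looks like this (the four key values are placeholders — substitute the ones shown on your app's "Keys and Access Tokens" page):

```r
# Install and load the twitteR package
install.packages("twitteR")
library(twitteR)

# Placeholder credentials -- copy yours from apps.twitter.com
consumer_key    <- "YOUR_CONSUMER_KEY"
consumer_secret <- "YOUR_CONSUMER_SECRET"
access_token    <- "YOUR_ACCESS_TOKEN"
access_secret   <- "YOUR_ACCESS_SECRET"

# Authenticate this R session with the Twitter API
setup_twitter_oauth(consumer_key, consumer_secret, access_token, access_secret)
```

Once `setup_twitter_oauth()` succeeds, the rest of the twitteR functions will use those credentials automatically.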

Update #1 12/10/2015:
In addition to the code listed above, you will have to install one additional package:
install.packages('base64enc')

Otherwise, you will get OAuth errors.
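With the connection working, here is a sketch of how to pull geotagged tweets. The `geocode` argument to `searchTwitter()` takes a "latitude,longitude,radius" string (the search term and coordinates below are just examples):

```r
library(twitteR)
# Assumes setup_twitter_oauth() has already been run in this session.

# Search for tweets near a point -- geocode is "lat,long,radius"
tweets <- searchTwitter("gis", n = 100, geocode = "38.9,-77.0,25mi")

# Convert the list of status objects to a data frame
df <- twListToDF(tweets)

# Geotagged tweets carry coordinates in the latitude/longitude columns;
# many rows will be NA, since only about 1 percent of tweets are geotagged.
geo <- df[!is.na(df$latitude), ]
</imports>
```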

Concluding remarks...
Connecting to and using an API may not be your strong suit; it is not mine either!  Hopefully, I've saved you some time and gotten you connected to this valuable source of data!

You can also now follow me on Twitter at: @jontheepi.  I will post blog updates there as well as additional quick insights about open source GIS and mapping.

Wednesday, April 15, 2015

Global Climate Monitor: "Getting Knowledge from Data"

The Global Climate Monitor (GCM) was created by researchers at the University of Seville to "model and [g]eo-visualize global climate data and climate-environmental indicators." If you are familiar with NOAA's monthly climate reports, you will find many of the same types of information here.   Finding and accessing weather and climate data can be daunting.  So, any website that tackles the challenge of making spatiotemporal data more accessible gets kudos.

Users can query several different datasets.  The GCM database can be queried by monthly and annual values, as well as normals and trends back to 1901.  Values are displayed in a grid, with each cell representing roughly 34 x 34 miles.

Moreover, the website allows users to easily select and interact with data across the globe.  In addition, a data download tool has recently been released.  It allows users to download the onscreen data as a square grid in many commonly used formats (*.kml, tiff, jpeg, csv, xlsx, and shp).
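Since the download tool offers shapefiles, the downloaded grid can be pulled straight into R. A minimal sketch, assuming a downloaded shapefile named "gcm_grid" in the working directory (the file name is a placeholder; check what the tool actually gives you):

```r
# rgdal was the standard way to read shapefiles into R at the time;
# install.packages("rgdal") if you don't have it.
library(rgdal)

# dsn = "." reads from the working directory; layer is the shapefile
# name without the .shp extension (an assumed name here).
gcm <- readOGR(dsn = ".", layer = "gcm_grid")

# Quick look at the grid cells and their attribute table
plot(gcm)
head(gcm@data)
```

From there the grid behaves like any other spatial object in R, so you can join it to your own data or restyle it for mapping.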

To download data, look for the small gift/package button at the bottom of the right-hand toolbar.

The three goals of the project are: 1) analysis and management of climate information, 2) spatio-temporal climate variability, and 3) applied climatology.

Lastly, the project uses open source web map development tools including GeoServer and OpenLayers.

For more information, visit the main GCM website at http://www.globalclimatemonitor.org/ or the project page at http://grupo.us.es/climatemonitor/.

Above: Monthly temperature anomalies for February 2015.  The website allows users to choose a basemap, change the transparency of layers, and identify values.