Using Elk to create maps in Grasshopper

Posted on Jun 20, 2013 in data mapping, Grasshopper | 20 Comments

As far as information synthesis goes, Grasshopper is pretty amazing. A frequent starting point for representing data is the conventional geographic map, but creating one in GH is not always an intuitive process, so I thought I’d write a tutorial on creating a geographic map in GH as a base for mapping spatial data. The plugins I use in this example are gHowl and Elk. If you follow along with the exercise, feel free to click on the images below to enlarge them.

First, download your vector information from OpenStreetMap. Use OpenStreetMap’s interface to crop to the area you’d like to map, then download an .osm file by selecting the Export tab and choosing OpenStreetMap XML data.


In GH, use Timothy Logan’s Elk plugin to bring the .osm info in. Connect a File Path component to the P input of the Location module; this identifies the .osm file you’d like to use.


The Location module parses the .osm file and identifies which objects are which. You then need to connect the rest of the Elk modules to pull out the different kinds of data: connect the O and X outputs of the Location module to the O and X inputs of any of the other Elk modules (except the Topo module) to start building your map.
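To give a sense of what that parsing step involves, here is a minimal Python sketch of how an .osm file is structured and resolved into geometry-ready coordinates. This is not Elk’s actual code, just an illustration using a tiny hand-written fragment; real exports from openstreetmap.org use the same node/way/tag structure.

```python
import xml.etree.ElementTree as ET

# A tiny hand-written .osm fragment standing in for a real export.
OSM_XML = """<?xml version="1.0"?>
<osm version="0.6">
  <node id="1" lat="35.0844" lon="-106.6504"/>
  <node id="2" lat="35.0850" lon="-106.6510"/>
  <node id="3" lat="35.0848" lon="-106.6500"/>
  <way id="10">
    <nd ref="1"/><nd ref="2"/><nd ref="3"/>
    <tag k="building" v="yes"/>
  </way>
</osm>"""

root = ET.fromstring(OSM_XML)

# Index every node's coordinates by id.
nodes = {n.get("id"): (float(n.get("lon")), float(n.get("lat")))
         for n in root.iter("node")}

# Resolve each way's node references into an ordered coordinate list,
# keeping its tags so later steps can filter by feature type.
ways = []
for w in root.iter("way"):
    coords = [nodes[nd.get("ref")] for nd in w.iter("nd")]
    tags = {t.get("k"): t.get("v") for t in w.iter("tag")}
    ways.append({"coords": coords, "tags": tags})

print(ways[0]["tags"])         # {'building': 'yes'}
print(len(ways[0]["coords"]))  # 3
```

The key idea is that ways only store references to node ids, so every consumer of the file has to build the node index first, which is roughly the role the Location module plays for the rest of the Elk components.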


The points representing the different objects will be organized in different branches, which will allow you to create different geometry from them.


The GenericOSM module is crucial: it allows you to parse any other OSM features that don’t have dedicated modules, in this case Buildings. You can find a list of OSM features here. You can also open your XML file in a text editor to see which points are tagged with data. Amenity, Highway, and Railway are all worth exploring; Land Use is particularly interesting. You can also use the V input to parse for subcategories of features, and the K output combined with the Text Tag component to label the objects with their metadata. GH3D user Ivor Ip posted a GH attribute list of some of the OSM features, which you can download here.
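Conceptually, filtering by feature key and optional value (what the GenericOSM module’s inputs do) is a simple tag match. The sketch below uses hypothetical parsed features and a made-up `generic_filter` helper to illustrate the key-only versus key-plus-value behavior:

```python
# Hypothetical tagged ways, shaped like the output of parsing an .osm file:
# each feature carries a dictionary of OSM key/value tag pairs.
features = [
    {"tags": {"building": "yes"}},
    {"tags": {"highway": "residential"}},
    {"tags": {"highway": "primary"}},
    {"tags": {"landuse": "park"}},
]

def generic_filter(features, key, value=None):
    """Keep features tagged with `key`; if `value` is given (analogous to
    the V input), also require the tag's value to match."""
    return [f for f in features
            if key in f["tags"] and (value is None or f["tags"][key] == value)]

print(len(generic_filter(features, "highway")))        # 2
print(generic_filter(features, "highway", "primary"))  # [{'tags': {'highway': 'primary'}}]
```

With no value supplied you get every feature carrying the key, which is why browsing the raw tags in a text editor first is so useful: it tells you which subcategories are worth filtering for.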


This is a workable base for many diagrammatic maps; there’s a lot more data we could add, but for now this gives us enough to work with. The next step is being able to position WGS84 coordinates (longitude and latitude information) relative to this map. Use Elk’s sTopo module with SRTM data to create a topography. First we’ll need to download the right SRTM file; I’ve been using the WGS84 info from Elk’s Location module to help find the right file from the USGS repository. Be sure to verify that you’re looking at the right data in QGIS or another program, as the naming convention can be a little tricky. Once you have your .hgt file, use another File Path component to reference it into the sTopo module.
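On the naming convention: an .hgt filename like N35W107.hgt refers to the latitude and longitude of the tile’s southwest corner, which is why it is easy to grab a neighboring tile by mistake. The file itself is just a raw grid of big-endian signed 16-bit elevations. Here is a hedged sketch of that layout, faking a tiny 3×3 tile in memory rather than reading a real 1201×1201 (SRTM3) or 3601×3601 (SRTM1) file:

```python
import struct

# SRTM .hgt files are raw big-endian signed 16-bit elevations in a square
# row-major grid, starting at the tile's northwest corner. We fake a tiny
# 3x3 "tile" in memory; a real file's bytes are read the same way.
SIZE = 3
elevations = [1500, 1510, 1520,
              1530, 1540, 1550,
              1560, 1570, 1580]
hgt_bytes = struct.pack(f">{SIZE * SIZE}h", *elevations)

def sample(data, size, row, col):
    """Elevation in metres at grid position (row, col); row 0 = north edge."""
    (value,) = struct.unpack_from(">h", data, 2 * (row * size + col))
    return value

print(sample(hgt_bytes, SIZE, 0, 0))  # 1500 (northwest corner)
print(sample(hgt_bytes, SIZE, 2, 2))  # 1580 (southeast corner)
```

Knowing that rows run north to south also explains why a mislabeled tile shows up as terrain that looks mirrored or offset when you preview it in QGIS.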


If you’re interested in creating a 3D topography, flattening the points from the sTopo module and feeding them into a Delaunay Mesh component will create a quick and easy mesh. For our purposes, we’re just going to use the points created from the topography as a reference for mapping our coordinates. The goal here is to use gHowl’s Geo To XYZ module to position coordinates relative to our map. Feed the first and last points from the sTopo output into the P1_XYZ and P2_XYZ inputs, then use the Lo and La outputs of Elk’s Location module to supply the P1_Geo and P2_Geo inputs.
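The remapping behind this step is a linear interpolation between two points known in both coordinate systems. The sketch below is a simplified stand-in for what Geo To XYZ performs, not gHowl’s actual code, and the corner values are hypothetical:

```python
def geo_to_xy(lon, lat, p1_geo, p2_geo, p1_xy, p2_xy):
    """Linearly remap a (lon, lat) pair into model space, given two
    reference points known in both coordinate systems."""
    # Normalized position of the input between the two geo references...
    tx = (lon - p1_geo[0]) / (p2_geo[0] - p1_geo[0])
    ty = (lat - p1_geo[1]) / (p2_geo[1] - p1_geo[1])
    # ...applied to the span between the two model-space references.
    x = p1_xy[0] + tx * (p2_xy[0] - p1_xy[0])
    y = p1_xy[1] + ty * (p2_xy[1] - p1_xy[1])
    return (x, y)

# Hypothetical corners: first and last topography points in model units,
# matched to the tile's longitude/latitude extents.
p1_geo, p2_geo = (-107.0, 35.0), (-106.0, 36.0)
p1_xy, p2_xy = (0.0, 0.0), (1000.0, 1000.0)

print(geo_to_xy(-106.5, 35.5, p1_geo, p2_geo, p1_xy, p2_xy))  # (500.0, 500.0)
```

This treats longitude and latitude as a flat grid, which is a fine approximation at the scale of a single map tile; that is exactly why the two reference points need to bracket your area of interest.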


Now any WGS84 coordinates that are loaded into the Geo to XYZ module will be positioned on our map. This is particularly useful when mapping geotagged information, like tweets or other social media info.

The System(s) of the Census Dotmap

Posted on Jan 23, 2013 in data mapping, graphics | No Comments

As usual, the MIT Media Lab cranks out amazing stuff: Brandon Martin-Anderson put together a fascinating infographic that maps every person recorded in the 2010 US census. The graphic itself is very compelling, especially when you adjust the scale, but so is Martin-Anderson’s description of the process. Using GIS shapefiles, Python, Processing, Google Maps, and a 17 GB CSV file (!), he was able to produce an incredibly dense graphic. Read more about his method here.

CNN featured this project on their What’s Next blog and noted the dense population in the eastern part of the country compared to the west. This was attributed to a proximity to agriculture, but Dr. Adelamar Alcantara of the Bureau of Business and Economic Research here at the University of New Mexico would be quick to point out that there are other factors at play as well. According to Dr. Alcantara, the U.S. census is a fairly flawed system, based largely on IRS tax returns validated against county census records, a system that can overlook low-income people in poor counties or in areas without similar infrastructure. Dr. Alcantara is developing a system that uses GIS to evaluate the number of housing units in the state, cross-reference those records with birth and death certificate records, and then evaluate the “gaps” in the addresses. A paper on the BBER’s work related to this subject is here.

While the Census Dotmap is an amazing example of describing immensely large amounts of data, making data understandable should not only posit theories about trends and patterns but also call into question the nature of the data itself. Can we understand a graphic when we cannot assess the information it represents?

global warming diagrams and maps from textmap

Posted on Dec 5, 2009 in data mapping, sustainability | No Comments


Interesting social data on global warming from textmap: it looks at the use of different terms across the web and graphs the frequency of their use, their use by location, and how the terms relate to other terms.