Parametric Urbanism Final Projects

South Lake Union Bike Accidents by Natalia Cundari

In last summer’s Parametric Urbanism seminar we looked at how social media could be aggregated to better understand urban behavior. Using the South Lake Union neighborhood of Seattle as a test site, the students looked at everything from MapMyRun paths to Flickr hashtags. The information was gathered and positioned with the Grasshopper plugins Elk, gHowl, Mosquito, and LunchBox, then exported to Google Earth. Some of the projects are embedded below; please note that for many of them you will have to pan down to see the information.

**Please note:** the GEarthHacks plugin seems to work best with Google Chrome. There may be loading issues with older browsers.

Urban Tribes / Hyper-Local by Sophie Brasfield
An investigation into local subcultures, searching for keywords in Google Maps reviews and on proprietor websites, then mapping the aggregation of those keywords.
Hypothesis: Amazon will not kill small business, but it will change it. What’s left after the internet? Places that offer an experience.

HYPER-LOCAL = PLACES THAT OFFER AN EXPERIENCE.

COMMON TRIBES CONGREGATE AT PLACES OF INTEREST. DIVERSE, VIBRANT COMMUNITIES ALLOW TRIBES TO CROSS OVER AND MINGLE.
PLACES OF SPECTACLE = PEOPLE WATCHING
PLACES OF EXPERIENCE = STREET, CURATED RETAIL, RESTAURANTS, COFFEE

South Lake Union Recreation by Dale Lusk
This file uses MapMyRun data to map the aggregation of running routes in South Lake Union. Highways are described as barriers, parks as nodes, and paths are mapped between them. The paths stack, growing wider, taller, and richer in color based on how many people use them.

Flickr Moods by Zhu Zhu and Stevie Hartman
The lines are collections of different moods: the sad line is composed of sad mood points, the happy line of happy mood points, and so on. Metaballs are also used to show the density of the mood points.

[Image: Zhu Zhu and Stevie Hartman’s Grasshopper definition]

Positive (pink):
happy, joyful, fun, interesting, beautiful, elegant, surprised…
Middle:
drunk
Negative (blue):
sad, cry, tear, depressed, angry, suck, disgusting…

Craigslist Flotsam by Catherine Harris
South Lake Union is a man-shaped lake, its filled-in marshland and extensive dredging mostly accomplished in the early 20th century. I became interested in the idea of flotsam, those items that wash up on shore, as an index of human presence in South Lake Union.
I looked on Craigslist and found all the items people were selling in a particular 24-hour period, then filtered for items that included a Google map. I took those items, found their geo-coordinates, and placed a marker for each in Google Earth. I also used those geo-coordinates to generate metaball geometry through gHowl and Grasshopper, which mapped centers of intensity by creating dome-like structures reflecting the number of contiguous or nearby points.

I used a nearest-point mapping module in Grasshopper to trace the two nearest coastlines, the southern shore of South Lake Union and the nearest ocean front, and from those two lines generated a series of possible lines forming the shortest distances between the two shorelines. This mapping is only possible because of the early 20th-century removal of Denny Hill, which was leveled to create the current topography of South Lake Union. The land can thus be seen as resting only transiently at its current angle of repose.

The conjunction of these two forms, projected on a Google Earth mapping of South Lake Union, gives a reading of the marginalized material goods and their potential trajectories.

Grocery Situation in South Lake Union by Michael Salinas
An investigation into food accessibility in South Lake Union, articulated by geometric aggregations and separations in an attempt to define a range of grocery territory.

All Google Earth Embeds powered by Google Earth Hacks.

Metaball Diagrams with Google Earth and gHowl

Posted on Jul 8, 2013 in data mapping, Grasshopper

Google Earth presents an intuitive, dynamic platform for understanding spatial context. Combined with a parametric modeler like Grasshopper, it can present complex, geo-positioned datasets in an understandable way. With the GH plugin gHowl, GH meshes and lines can be exported in Google Earth’s .kml format and viewed in Google Earth or an enabled web browser.

Creating legible geometry for Google Earth is challenging, but one type of geometry I’ve experimented with is GH’s metaballs, which are about as old school as it gets for 3D curvature. Metaballs, as described by Yoda (Greg Lynn), are “defined as a single surface whose contours result from the intersection and assemblage of the multiple internal fields that define it” (Lynn, “Blobs,” Journal of Philosophy and the Visual Arts, 1995). This aggregation of internal fields can provide an intuitive understanding of various contextual forces relative to the spatial context of a site. While GH metaballs are only curves, not meshes or surfaces, you can easily run them through a Delaunay mesh to begin creating a mesh.

This tutorial will walk through the process of creating metaballs from geo-coordinates. I’m using a map I created with Elk, based on OpenStreetMap data; if you’re interested in doing something similar, look here.

Just click on the images below if you’d like to see them in more detail.

Start by positioning your geo-coordinates in GH space through gHowl’s Geo To XYZ module.

[Image: positioning geo points with gHowl’s Geo To XYZ]

Use the output of the Geo To XYZ module as the point input for GH’s Metaball(t) module. There are several options for creating metaballs; I’ve had the best luck with the (t) one. The next step is to set up a series of section planes for the Metaball(t); the number of planes will define the resolution of the metaball. Instead of creating planes, it’s a little easier and more familiar to create a series of points in the Z direction. The last input you need is the threshold value, a number you want as low as absolutely possible. You probably still want the flexibility of a slider, but on a city-scale metaball you need a number smaller than 0.001, which is the minimum value a slider allows. The trick is to multiply two sliders with small values together and use the result to drive the threshold; you can even use a third or fourth slider if you are working at a bigger scale.

[Image: creating the metaballs in Grasshopper]
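To see why the threshold has to get so small, here’s a minimal Python sketch of the classic metaball field (an illustration of the general technique, not necessarily gHowl’s exact formulation): each point contributes an inverse-square charge, and the ball’s outline is the contour where the summed field equals the threshold.

```python
# A minimal sketch of the classic metaball field -- an illustration,
# not necessarily gHowl's exact formulation. Each point contributes an
# inverse-square "charge"; the metaball outline is the contour where
# the summed field equals the threshold.

def metaball_field(x, y, centers):
    """Sum of inverse-square contributions from every charge point."""
    total = 0.0
    for cx, cy in centers:
        d2 = (x - cx) ** 2 + (y - cy) ** 2
        if d2 == 0:
            return float("inf")  # sampling directly on a charge point
        total += 1.0 / d2
    return total

# At city scale, distances run to hundreds or thousands of units, so
# the 1/d^2 values are tiny -- hence a threshold well below a slider's
# 0.001 floor, built by multiplying two small sliders together.
centers = [(0, 0), (800, 300)]            # two points ~850 units apart
print(metaball_field(400, 150, centers))  # on the order of 1e-5
threshold = 0.005 * 0.001                 # product of two small sliders
```

Lower thresholds push the contour farther from the points, so the balls swell and merge into each other sooner.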

Once you’ve created the metaballs, you can use gHowl’s KML Out module to create a .kml. Use a Path module to set a file path, and have the Metaball(t) output feed the G input of KML Out. Use the KML Style module to set the curve color, and use the same two GEO and XYZ points from the Geo To XYZ module to geo-reference the output. I’ve had the best luck setting the Altitude Mode to Absolute, though I’ve really only done this at sea level so far.

[Image: KML Out setup]

This will create a .kml file that will show your metaball data in Google Earth.
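For reference, here’s a hand-rolled Python sketch of the kind of file KML Out writes; the real module handles the styling and geo-referencing for you, and the coordinates below are placeholders. KML lists coordinates as longitude,latitude,altitude, and the Absolute altitude mode measures height from sea level.

```python
# A hand-rolled sketch of the kind of .kml that KML Out generates; the
# real gHowl module handles styling and geo-referencing for you. KML
# colors are aabbggrr hex, so "ff0000ff" is opaque red.

def curve_to_kml(coords, color="ff0000ff", width=2):
    """coords: (lon, lat, alt) tuples tracing one metaball section."""
    coord_text = " ".join(f"{lon},{lat},{alt}" for lon, lat, alt in coords)
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <Style><LineStyle><color>{color}</color><width>{width}</width></LineStyle></Style>
    <LineString>
      <altitudeMode>absolute</altitudeMode>
      <coordinates>{coord_text}</coordinates>
    </LineString>
  </Placemark>
</kml>"""

# One closed section curve, placed ~10 m above sea level (placeholder
# coordinates near South Lake Union):
section = [(-122.336, 47.627, 10.0), (-122.334, 47.628, 10.0),
           (-122.333, 47.626, 10.0), (-122.336, 47.627, 10.0)]
with open("metaballs.kml", "w") as f:
    f.write(curve_to_kml(section))
```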

[Image: the color-graded metaballs in Google Earth]

Using Elk to create maps in Grasshopper

Posted on Jun 20, 2013 in data mapping, Grasshopper

As far as information synthesis goes, Grasshopper is pretty amazing. A frequent starting point for representing data is the conventional geographic map, but creating one in GH is not always an intuitive process, so I thought I’d put together a tutorial on building a geographic map in GH as a base for mapping spatial data. The plugins I use in this example are gHowl and Elk. If you follow along with the exercise, feel free to click on the images below to enlarge them.

First, download your vector information from OpenStreetMap.org. Use OpenStreetMap’s interface to crop to the area you’d like to map, then download an .osm file by selecting the Export tab and then OpenStreetMap XML data.

[Image: exporting OSM XML from OpenStreetMap.org]
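A side note: if you’d rather script the download than use the Export tab, the Overpass API serves the same OSM XML. A minimal sketch, with a rough guess at a South Lake Union bounding box:

```python
# Fetch an OSM XML extract by bounding box from the Overpass API.
# The bbox is a rough guess at South Lake Union, ordered
# west,south,east,north (min lon, min lat, max lon, max lat).
import urllib.request

bbox = "-122.345,47.618,-122.320,47.632"
url = f"https://overpass-api.de/api/map?bbox={bbox}"
with urllib.request.urlopen(url) as response:
    with open("south_lake_union.osm", "wb") as f:
        f.write(response.read())
```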

In GH, use Timothy Logan’s Elk to bring the .osm info into GH. Connect a File Path module to the P input of the Location module; this identifies the .osm file you’d like to use.

[Image: Elk’s Location module with a File Path input]

The Location module parses the .osm file and identifies which objects are which. You then need to start connecting the rest of the Elk modules to pull out the different kinds of data: connect the O and X outputs of the Location module to the O and X inputs of any of the other Elk modules (except the Topo module) to start building your map.

[Image: connecting Elk’s major roads modules]

The points representing the different objects will be organized in different branches, which will allow you to create different geometry from them.

[Image: polylines built from the major road branches]

The GenericOSM module is crucial: it allows you to parse any OSM features that don’t have dedicated modules, in this case buildings. You can find a list of OSM features here. You can also use a text editor to open your XML file and see which points are tagged with data. Amenity, Highway, and Railway are all worth exploring; Land Use is particularly interesting. You can also use the V input to parse for subcategories of features, and the K output combined with the Text Tag module to label objects with their metadata. GH3D user Ivor Ip posted a GH attribute list of some of the OSM features, which you can download here.

[Image: building footprints parsed with GenericOSM]
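For the curious, here’s a rough Python sketch of what Elk is doing under the hood when it parses the file: nodes carry longitude/latitude, while ways reference nodes by id and carry the key/value tags (building, highway, landuse…) that GenericOSM filters on. The filename is the placeholder from the download step above.

```python
# A rough sketch of the parsing Elk's Location + GenericOSM modules do:
# nodes carry lon/lat; ways reference nodes by id and carry key/value
# tags such as building=yes or landuse=industrial.
import xml.etree.ElementTree as ET

root = ET.parse("south_lake_union.osm").getroot()

# Index every node's coordinates by its id.
nodes = {n.get("id"): (float(n.get("lon")), float(n.get("lat")))
         for n in root.iter("node")}

def ways_with_tag(key, value=None):
    """Yield each matching way as a list of (lon, lat) points."""
    for way in root.iter("way"):
        tags = {t.get("k"): t.get("v") for t in way.findall("tag")}
        if key in tags and (value is None or tags[key] == value):
            yield [nodes[nd.get("ref")] for nd in way.findall("nd")]

buildings = list(ways_with_tag("building"))                # feature key
industrial = list(ways_with_tag("landuse", "industrial"))  # key plus V-style subcategory
print(len(buildings), "building footprints,", len(industrial), "industrial parcels")
```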

This is a workable base for many diagrammatic maps; there’s a lot more data we could add, but for now this should give us enough to work with. The next step is creating the ability to position WGS84 coordinates (longitude and latitude information) relative to this information. Use Elk’s sTopo module with SRTM data to create a topography. First we’ll need to download the right SRTM file; I’ve been using the WGS84 info from Elk’s Location module to help find the right file in the USGS repository. Be sure to verify that you’re looking at the right data with QGIS or another program, as the naming convention can be a little tricky… but once you have your .hgt file, use the File Path module again to reference it to the sTopo module.

[Image: Elk’s sTopo module referencing an .hgt file]
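If you want to sanity-check an .hgt file outside of GH, the format is simple enough to read by hand: a square grid of big-endian signed 16-bit elevations in meters, row-major from the northwest corner, with the filename naming the tile’s southwest corner. A minimal sketch, assuming the 1201×1201 SRTM3 resolution (Seattle falls in N47W123.hgt):

```python
# Read one elevation sample straight out of an SRTM .hgt tile.
# SRTM3 tiles are 1201x1201 big-endian signed 16-bit integers (meters),
# row-major from the northwest corner; use 3601 for SRTM1 tiles.
import struct

SAMPLES = 1201

def read_elevation(path, lat, lon, tile_lat, tile_lon):
    """Nearest-sample elevation for (lat, lon) inside the given tile."""
    row = int(round((tile_lat + 1 - lat) * (SAMPLES - 1)))  # down from north edge
    col = int(round((lon - tile_lon) * (SAMPLES - 1)))      # right from west edge
    with open(path, "rb") as f:
        f.seek(2 * (row * SAMPLES + col))         # 2 bytes per sample
        return struct.unpack(">h", f.read(2))[0]  # big-endian int16

# The tile is named for its southwest corner: N47W123 covers 47-48N, 122-123W.
print(read_elevation("N47W123.hgt", 47.627, -122.336, 47, -123))
```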

If you’re interested in creating a 3D topography, flattening the points from the sTopo module and inputting them into a Delaunay Mesh will create a quick and easy mesh. For our purposes, we’re just going to use the points created from the topography as a reference for mapping our coordinates. The goal here is to use gHowl’s Geo To XYZ module to position coordinates relative to our map. Find the first and last points from the sTopo for the P1_XYZ and P2_XYZ inputs, then use the Lo and La outputs of Elk’s Location module to create info for the P1_Geo and P2_Geo inputs.

[Image: geo-referencing the map with Geo To XYZ]

Now any WGS84 coordinates that are loaded into the Geo to XYZ module will be positioned on our map. This is particularly useful when mapping geotagged information, like tweets or other social media info.
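Conceptually, Geo To XYZ amounts to a linear interpolation between those two reference pairs, which is a fine approximation at neighborhood scale where the earth’s curvature is negligible. A back-of-the-envelope Python version, with placeholder reference values standing in for the sTopo corner points and Elk’s Lo/La outputs:

```python
# A back-of-the-envelope version of the Geo To XYZ mapping: given the
# same two reference points in both coordinate systems, linearly
# interpolate any other lon/lat into model space.

def make_geo_to_xyz(p1_geo, p1_xyz, p2_geo, p2_xyz):
    """p*_geo are (lon, lat); p*_xyz are (x, y) model coordinates."""
    sx = (p2_xyz[0] - p1_xyz[0]) / (p2_geo[0] - p1_geo[0])
    sy = (p2_xyz[1] - p1_xyz[1]) / (p2_geo[1] - p1_geo[1])
    def to_xyz(lon, lat):
        return (p1_xyz[0] + (lon - p1_geo[0]) * sx,
                p1_xyz[1] + (lat - p1_geo[1]) * sy)
    return to_xyz

# Placeholder reference pairs standing in for the sTopo corner points
# and the Location module's Lo/La outputs:
to_xyz = make_geo_to_xyz((-122.345, 47.618), (0.0, 0.0),
                         (-122.320, 47.632), (2000.0, 1600.0))
print(to_xyz(-122.336, 47.627))  # a geotagged point lands on the map
```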

The System(s) of the Census Dotmap

Posted on Jan 23, 2013 in data mapping, graphics

[Image: the Census Dotmap]
As usual, the MIT Media Lab cranks out amazing stuff: Brandon Martin-Anderson put together a fascinating infographic that maps every person counted in the 2010 US census. The graphic itself is very compelling, especially when you adjust the scale, but what is also compelling is Anderson’s description of the process. Using GIS shapefiles, Python, Processing, Google Maps, and a 17 GB CSV file (!), Anderson was able to produce an incredibly dense graphic. Read more about his method here.

CNN featured this project on their What’s Next blog and noted the dense population in the eastern part of the country compared to the west. This was attributed to proximity to agriculture, but Dr. Adelamar Alcantara of the Bureau of Business and Economic Research here at the University of New Mexico would be quick to point out that there are other factors at play as well. According to Dr. Alcantara, the U.S. census is a fairly flawed system, based largely on IRS tax returns validated against county census records, a system which can overlook low-income people in poor counties or in areas without similar infrastructure. Dr. Alcantara is developing a system that uses GIS to evaluate the number of housing units in the state, cross-reference those records with birth and death certificate records, and then evaluate the “gaps” in the addresses. A paper on the BBER’s work related to this subject is here.

While the Census Dotmap is an amazing example of describing immensely large amounts of data, making data understandable should not only posit theories about trends and patterns but also call into question the nature of the data itself. Can we understand a graphic when we cannot assess the information it represents?

intro to design computation

[Image: abstraktAbstrakt, John Powers]

in my computational design classes, one of the first questions asked is “what is computational design?” the definition is not that elusive: computational design is simply using computation as an approach to solve design problems. the follow-up question is a much more difficult one to answer: “why would you want to do that?”

on a basic level, computational design harnesses the processing power of computers to perform millions of mathematical computations to create multiple outcomes. these computations can be anything: form generation, manipulation, or reduction. but what separates this method from any other technique is that the result could only have been created with the aid of a computer; there is no way these designs could have been sketched or sculpted by the creator alone.

sweet… but why is that a big deal? well, it is and it isn’t. while many think design computation and its products are worthy on their own merit, what is incredibly compelling about computational design is its ability to increase design performance in many disciplines. since design computation is built on computations, those computations can operate on data, any data at all. so if the data is relevant to how well the design performs, these techniques have the potential to elevate current design practices to a much higher level.

the highest potential for an increase in performance is in the field of sustainable architecture. the bible for sustainable design is brown + dekay’s sun, wind & light, an absolutely invaluable resource for a designer: it breaks down very complex concepts about passive heating + cooling, solar shading, and thermodynamics so that they are easily understood and implemented. the problem with the text is that most of the calculations are at the scale of the entire building, large moves that deal with the project as a whole. because the calculations are simple enough to be done by hand, the results are broad enough to be reductive and simplistic compared to what a computer could do.

design computation offers the possibility of creating solutions at a much finer grain, and of generating building massings from algorithmic code. architectural strategies for sustainability affect space at a very small scale, so the solutions we create for them must be able to operate at that scale as well. as solutions are generated, more and more input can be added to the equations, producing a more finely-tuned instrument of a building.

as designers become increasingly post-technological, there will be less emphasis on technique and more emphasis on how that technique can be used to increase performance. the shock and awe of controlled chaos will eventually fade to serve the more essential needs of comfort, light, and enclosure. design computation is only a technique, but a technique that uses contemporary tools to solve contemporary problems like few other disciplines can.

some resources for computational design:

grasshopper – a visual scripting plugin for the 3D modeler rhino
nodeBox – a parametric 2D design tool
ecotect – autodesk’s environmental analysis tool
design reform – tutorials on using grasshopper and revit
atelier nGai – ted ngai’s scripts and resources for grasshopper and ecotect
the proving ground – nathan miller’s scripts and resources for grasshopper
utos – thomas grabner’s + ursula frick’s scripts and resources for grasshopper
LIFT architects – LIFT’s blog, resources for grasshopper, and project updates
generator.x – marius watz’ + atle barcley’s blog on computational design
FORM + CODE – the official website of the book, with computational design code examples

global warming diagrams and maps from textmap.com

Posted on Dec 5, 2009 in data mapping, sustainability


interesting social data on global warming from textmap.com. textmap looks at the use of different terms across the web and graphs the frequency of their use, their use by location, and how the terms relate to other terms.

from ::textmap.com