In last summer’s Parametric Urbanism seminar we looked at how social media data could be aggregated to better understand urban behavior. Working in the South Lake Union neighborhood of Seattle, the students looked at everything from MapMyRun paths to Flickr hashtags. The information was gathered and positioned with the Grasshopper plugins Elk, gHowl, Mosquito, and LunchBox, then exported to Google Earth. Some of the projects are embedded below; note that for many of them you will have to pan down to see the information.
**Please note:** The GEarthHacks plugin seems to work best with Google Chrome. There may be loading issues with older browsers.
Urban Tribes / Hyper-Local by Sophie Brasfield
An investigation into local subcultures through searching for keywords on Google Maps reviews and proprietor websites, then mapping the aggregation of these keywords.
Hypothesis: Amazon will not kill small business, but it will change it. What’s left after the internet? Places that offer an experience.
HYPER-LOCAL = PLACES THAT OFFER AN EXPERIENCE.
COMMON TRIBES CONGREGATE AT PLACES OF INTEREST. DIVERSE, VIBRANT COMMUNITIES ALLOW TRIBES TO CROSS OVER AND MINGLE.
PLACES OF SPECTACLE = PEOPLE WATCHING
PLACES OF EXPERIENCE = STREET, CURATED RETAIL, RESTAURANTS, COFFEE
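The keyword-aggregation step can be sketched in a few lines. The venues, review snippets, and tribe keywords below are invented stand-ins for the scraped Google Maps reviews and proprietor websites:

```python
from collections import defaultdict

# Hypothetical review snippets per venue; in the project these came from
# Google Maps reviews and proprietor websites.
reviews = {
    "Cafe A": "great vinyl selection, vegan pastries, climbing club meets here",
    "Bar B": "craft beer and vinyl nights, loud and fun",
    "Gym C": "serious climbing walls, friendly climbing community",
}

# Keywords standing in for a local "tribe" or subculture.
tribes = {
    "record-collectors": ["vinyl"],
    "climbers": ["climbing"],
    "vegans": ["vegan"],
}

# Count keyword hits per venue: these weights would drive the map symbols.
weights = defaultdict(dict)
for venue, text in reviews.items():
    for tribe, keywords in tribes.items():
        hits = sum(text.lower().count(k) for k in keywords)
        if hits:
            weights[venue][tribe] = hits
```

A venue where several tribes score at once is exactly the kind of place where, per the hypothesis, communities cross over and mingle.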
South Lake Union Recreation by Dale Lusk
This project uses MapMyRun data to map the aggregation of running routes in South Lake Union. Highways are treated as barriers, parks as nodes, and paths are mapped between them. The paths stack, growing wider, taller, and more saturated with color based on how many people use them.
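The stacking logic reduces to counting how many routes share each segment. A minimal sketch, with invented routes snapped to a coarse grid in place of the MapMyRun traces:

```python
from collections import Counter

# Hypothetical running routes as sequences of (x, y) grid cells; in the
# project these came from MapMyRun GPS traces.
routes = [
    [(0, 0), (1, 0), (2, 0), (2, 1)],
    [(0, 0), (1, 0), (1, 1), (2, 1)],
    [(1, 0), (2, 0), (2, 1)],
]

# Count how many distinct routes pass through each cell; the count drives
# the width, height, and color of the stacked path geometry.
usage = Counter(cell for route in routes for cell in set(route))

def path_weight(cell, base_width=1.0):
    """Scale a path's drawn width by how many runners use the cell."""
    return base_width * usage[cell]
```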
Flickr Moods by Zhu Zhu and Stevie Hartman
Each line is a collection of points sharing a mood (the sad line is composed of sad mood points, the happy line of happy mood points, and so on). Metaballs are also used to show the density of the mood points.
happy, joyful, fun, interesting, beautiful, elegant, surprised…
sad, cry, tear, depressed, angry, suck, disgusting…
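The metaball density trick is simple to sketch: each point contributes an inverse-square falloff, and thresholding the summed field traces the blobs. The points below are invented stand-ins for geotagged Flickr photos:

```python
# Hypothetical geotagged photo points tagged "happy"; in the project the
# tags came from Flickr metadata.
happy_points = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.5)]

def metaball_field(x, y, points, strength=1.0):
    """Sum of inverse-square contributions: the classic metaball field."""
    total = 0.0
    for px, py in points:
        d2 = (x - px) ** 2 + (y - py) ** 2
        total += strength / (d2 + 1e-9)  # avoid division by zero at a point
    return total

# A location inside the cluster scores far higher than one outside it,
# so an iso-contour of the field outlines the density blob.
inside = metaball_field(0.5, 0.2, happy_points)
outside = metaball_field(5.0, 5.0, happy_points)
```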
Craigslist Flotsam by Catherine Harris
South Lake Union is a man-shaped lake, with filled-in marshland and extensive dredging mostly accomplished in the early 20th century. I became interested in the idea of flotsam, those items that wash up on shore, as an index of human presence in South Lake Union.
I looked on Craigslist and found all the items people were selling in a particular 24-hour period, then filtered for listings that included a Google Map. I took those listings, found their geographic coordinates, and placed markers in Google Earth. I also used those coordinates to generate metaball geometry through gHowl and Grasshopper, which mapped centers of intensity by creating a dome-like structure reflecting the number of contiguous or nearby points.
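The marker step amounts to writing a small KML file that Google Earth can load. A minimal sketch with invented listings (titles and coordinates are made up):

```python
# Hypothetical listings: (title, longitude, latitude) pulled from
# Craigslist posts that included an embedded map.
listings = [
    ("free couch", -122.336, 47.627),
    ("kayak", -122.331, 47.630),
]

def to_kml(items):
    """Build a minimal KML document with one Placemark per listing,
    the kind of file Google Earth loads as a set of markers."""
    placemarks = "".join(
        "<Placemark><name>{}</name>"
        "<Point><coordinates>{},{},0</coordinates></Point>"
        "</Placemark>".format(name, lon, lat)
        for name, lon, lat in items
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<kml xmlns="http://www.opengis.net/kml/2.2">'
            "<Document>{}</Document></kml>".format(placemarks))

kml = to_kml(listings)
```

Note that KML orders coordinates longitude-first, the reverse of the usual latitude/longitude convention.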
I used a nearest-point mapping module in Grasshopper to take a tracing of the two nearest coastlines, the southern shore of South Lake Union and the closest ocean front, and from those two lines generated a series of possible lines forming the shortest distances between the two shorelines. This mapping is only possible because of the early 20th-century removal of Denny Hill, which was leveled to create the current topography of South Lake Union. The land can thus be seen as resting only transiently at its current angle of repose.
The conjunction of these two forms, projected on a Google Earth mapping of South Lake Union, gives a reading of the marginalized material goods and their potential trajectories.
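The nearest-point step translates directly out of Grasshopper. A minimal sketch with two invented, coarsely sampled shorelines:

```python
import math

# Hypothetical sampled shorelines: the south shore of Lake Union and the
# nearest salt-water front, each as a short list of (x, y) points.
lake_shore = [(0.0, 0.0), (1.0, 0.1), (2.0, 0.0)]
ocean_front = [(0.2, 3.0), (1.1, 2.8), (2.2, 3.1)]

def nearest(point, candidates):
    """Return the candidate closest to point (straight-line distance)."""
    return min(candidates, key=lambda c: math.dist(point, c))

# For each point on one shoreline, draw the shortest connector to the
# other: the same idea as Grasshopper's closest-point component.
connectors = [(p, nearest(p, ocean_front)) for p in lake_shore]
```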
Grocery Situation in South Lake Union by Michael Salinas
An investigation into food accessibility in South Lake Union, articulated by geometric aggregations and separations in an attempt to define a range of grocery territory.
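One way to sketch a "range of grocery territory" is a discrete Voronoi partition: assign every point in the neighborhood to its nearest store. The store names and positions below are invented:

```python
import math

# Hypothetical grocery store locations in a unit neighborhood grid.
stores = {"Market A": (0.2, 0.8), "Market B": (0.7, 0.3)}

def territory(point):
    """Assign a point to its nearest store: a discrete Voronoi partition,
    one way to describe a store's territory."""
    return min(stores, key=lambda s: math.dist(stores[s], point))

# Sample the neighborhood on a coarse grid; each cell gets a store.
cells = {(x / 10, y / 10): territory((x / 10, y / 10))
         for x in range(11) for y in range(11)}
```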
All Google Earth Embeds powered by Google Earth Hacks.
As usual, the MIT Media Lab cranks out amazing stuff: Brandon Martin-Anderson put together a fascinating infographic that maps every citizen recorded in the 2010 US census. The graphic itself is very compelling, especially when you adjust the scale, but so is Martin-Anderson’s description of the process. Using GIS shapefiles, Python, Processing, Google Maps, and a 17 GB CSV file (!), he was able to produce an incredibly dense graphic. Read more about his method here.
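A 17 GB CSV cannot be loaded into memory at once, but it can be streamed row by row and binned into a coarse grid in a single pass. A minimal sketch of that idea (the `lon`/`lat` column names and the sample rows are assumptions, not the dotmap’s actual schema):

```python
import csv
import io
from collections import Counter

# A tiny in-memory stand-in for a huge on-disk CSV of point locations.
sample = io.StringIO("lon,lat\n-122.33,47.62\n-122.34,47.63\n-71.06,42.36\n")

def bin_points(rows, cell=1.0):
    """Accumulate a count of points per grid cell: one pass over the
    rows, memory proportional to the number of occupied cells."""
    counts = Counter()
    for row in rows:
        lon, lat = float(row["lon"]), float(row["lat"])
        counts[(int(lon // cell), int(lat // cell))] += 1
    return counts

counts = bin_points(csv.DictReader(sample))
```

With a real file, `csv.DictReader(open(path))` streams from disk the same way, so the 17 GB never has to fit in RAM.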
CNN featured this project on their What’s Next blog and noted the dense population in the eastern part of the country compared to the west. This was attributed to proximity to agriculture, but Dr. Adelamar Alcantara of the Bureau of Business and Economic Research here at the University of New Mexico would be quick to point out that there are other factors at play as well. According to Dr. Alcantara, the U.S. census is a fairly flawed system, based largely on IRS tax returns that are validated by county census records, a system that can overlook low-income people in poor counties or in areas without comparable infrastructure. Dr. Alcantara is developing a system that uses GIS to evaluate the number of housing units for the state, cross-reference those records with birth and death certificate records, and then evaluate the “gaps” in the addresses. A paper on the BBER’s work on this subject is here.
While the Census Dotmap is an amazing example of describing an immensely large amount of data, making data understandable should not only posit theories about trends and patterns but also call into question the nature of the data itself. Can we understand a graphic when we cannot assess the information it represents?
in my computational design classes, one of the first questions asked is “what is computational design?” the definition is not that elusive: computational design is simply using computation as an approach to solve design problems. the follow-up question is much more difficult to answer: “why would you want to do that?”
on a basic level, computational design harnesses the processing power of computers to perform millions of mathematical computations and create multiple outcomes. these computations can be anything: form generation, manipulation, or reduction. but what separates this method from other techniques is that the result could only have been created with the aid of a computer; there is no way these designs could have been sketched or sculpted by the creator alone.
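a toy illustration of “millions of computations, multiple outcomes”: enumerate every combination of a few massing parameters and score each one. the objective below (floor area per unit of envelope) is a stand-in, not a real performance model, and all the numbers are invented:

```python
import itertools

widths = range(10, 31, 5)    # floor-plate width, meters
depths = range(10, 31, 5)    # floor-plate depth, meters
floors = range(2, 11)        # number of floors

def efficiency(w, d, n, floor_height=3.5):
    """total floor area divided by envelope area (walls + roof)."""
    area = w * d * n
    envelope = 2 * (w + d) * floor_height * n + w * d
    return area / envelope

# score every combination, then keep the best performer
variants = [(w, d, n, efficiency(w, d, n))
            for w, d, n in itertools.product(widths, depths, floors)]
best = max(variants, key=lambda v: v[3])
```

swap the scoring function for daylight, structure, or energy and the same sweep becomes a design tool; scale the parameter ranges up and the variant count explodes into territory only a computer can search.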
sweet… but why is that a big deal? well, it is and it isn’t. while many think design computation and its products are worthy on their own merit, what is incredibly compelling about computational design is its ability to increase design performance across many disciplines. because design computation is built on computations, those computations can operate on data, any data. so if the data is relevant to how well the design performs, these techniques have the potential to elevate current design practices to a much higher level.
the highest potential for an increase in performance is in the field of sustainable architecture. the bible of sustainable design is brown + dekay’s sun, wind & light, an absolutely invaluable resource for a designer: it breaks down very complex concepts about passive heating + cooling, solar shading, and thermodynamics so that they are easily understood and implemented. the problem with the text is that most of the calculations are at the scale of the entire building, large moves that deal with the project as a whole. because the calculations are simple enough to be done by hand, the results are broad, reductive, and simplistic compared to what a computer could do.
design computation offers the possibility of creating solutions at a much finer grain, and of generating building massings from algorithmic code. architectural strategies for sustainability affect space at a very small scale, so the solutions we create for them must be able to operate at a small scale as well. as solutions are generated, more and more input can be added to the equations, producing a more finely tuned instrument of a building.
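a concrete example of a hand calculation that computation can repeat per window per hour: the standard shading-overhang rule, where the depth of a horizontal overhang that fully shades a window is the window height divided by the tangent of the solar altitude. a minimal sketch (the window size and sun angles are invented):

```python
import math

def overhang_depth(window_height_m, solar_altitude_deg):
    """depth of a horizontal overhang that fully shades a window when the
    sun is at the given altitude: depth = height / tan(altitude)."""
    return window_height_m / math.tan(math.radians(solar_altitude_deg))

# a 2 m window under a high 60-degree summer sun needs a shallower
# overhang than the same window under a 45-degree sun.
summer = overhang_depth(2.0, 60.0)
spring = overhang_depth(2.0, 45.0)
```

done by hand, this is one number for one facade; run in a loop over every opening and every sun position, it becomes the fine-grained tuning described above.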
as designers become increasingly post-technological, there will be less emphasis on technique and more emphasis on how that technique can be used to increase performance. the shock and awe of controlled chaos will eventually fade to serve the more essential needs of comfort, light, and enclosure. design computation is only a technique, but a technique that uses contemporary tools to solve contemporary problems like few other disciplines can.
some resources for computational design:
grasshopper – a visual scripting plugin for the 3D modeler rhino
nodeBox – a parametric 2D design tool
ecotect – autodesk’s environmental analysis tool
design reform – tutorials on using grasshopper and revit
atelier nGai – ted ngai’s scripts and resources for grasshopper and ecotect
the proving ground – nathan miller’s scripts and resources for grasshopper
utos – thomas grabner’s + ursula frick’s scripts and resources for grasshopper
LIFT architects – LIFT’s blog, resources for grasshopper, and project updates
generator.x – marius watz’ + atle barcley’s blog on computational design
FORM + CODE – the official website of the book, with computational design code examples