Tag Archives: Open Source

Moving to GeoDjango

I’ve been creating a simple GeoDjango application for managing environmental sampling metadata, and it’s been a lot of fun so far. I’ve had experience working with many different forms of metadata tracking, from spreadsheets, to wikis, to online project management tools. All of them have their ups and downs, and it seems like there is always a dealbreaker with each organizational method.

Spreadsheets are easy to edit, but lack any form of relational structure (two sets of data for the same report? I guess I’ll just put two entries into the same cell).

sometimes spreadsheets just don't work...

Wikis are cool and allow easy access to information, but are (in certain cases) a pain for folks to edit. Take the experience of table creation. Dokuwiki, a generic wiki software, requires a series of carefully placed carets and pipes to delineate headers and columns. A common pain comes when adding a value to a new row that exceeds the length of any previous cell: the author then has to expand the column header and all previously entered rows, re-aligning the pipes and carets. Granted, as a slightly OCD GIS analyst, the sight of a well-crafted text table fills me with no less wonder than when I saw Albert Bierstadt’s “Puget Sound on the Pacific Coast”, but it’s just darn tedious at times.

Additionally, as the log of sampling events grows larger, it gets harder to manage. Dokuwiki, AFAIK, provides no way to automatically re-sort entire sections of pages or records in tables alphabetically, which would make searching a particular page much faster as content becomes larger and larger.

^ Column One                ^ Column Two                 ^
| Zee 'z' string            | This is a longer string    |
| 2nd longer string of text | 2nd short string           |
| SuperShort                | A string starting with 'A' |

Online project management tools are interesting as well. They allow rapid collaboration between project members and provide template functionality, allowing status reports for recurring workflows to be generated easily (e.g., create a template for a report, then spawn off an instance of that template for each new project). The downsides to these services are that they cost money, they may not provide a normalized way to store data, and (of most interest to me) they probably don’t allow for the storage or visualization of spatial data.

10 megs free for one user? cool!

In comes GeoDjango. Over the next few posts, I think I’ll record my experiences developing an application for storing metadata in the context of environmental sampling efforts. The goal is to provide a web application that stores both tabular and spatial data in a normalized fashion and visualizes both in an informative way.
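To make that concrete, here is a minimal sketch of the kind of model I have in mind; the model and field names are hypothetical, not the app’s actual schema:

# A sketch only: a normalized GeoDjango model pair, not a finished design.
from django.contrib.gis.db import models

class Report(models.Model):
    title = models.CharField(max_length=200)

class SamplingEvent(models.Model):
    # Normalized: many sampling events can reference one report, instead of
    # cramming two entries into a single spreadsheet cell.
    report = models.ForeignKey(Report)
    sample_date = models.DateField()
    location = models.PointField(srid=4326)  # the spatial piece

    objects = models.GeoManager()  # enables spatial queries on this model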

Hey look Ma', it's GeoJSON!

OSM ‘Minutely’ Tile Updates: Thanks CloudMade!

Another great OSM Mapping Party is underway in Tempe, Arizona. Brandon of CloudMade came out to run the show, with local hosting by James Fee of RSP Architects. Coffee, pizza, libations, and Open Source GIS provided a great foundation for lively discussion of the practical and philosophical aspects of OpenStreetMap.

Of particular interest was a question posed by a first-time OSM’er, who wondered why we had to wait a week for the tiles to render just to see the results of whatever experiment we might be trying out in the digitization and attribution of map features. Well, no one could really give him a good answer other than, ‘it’s just the way it is’.

It appears, though, that CloudMade provided us with an answer just the other day. The Minutely Updated Tile Server is updated ‘every minute from OpenStreetMap diffs’. The results can be seen below: the first image depicts the standard weekly update view, while the second depicts the minutely render. Note the presence of the additional buildings on ASU’s Tempe campus.

Above: OSM Weekly Update

Above: CloudMade’s Minutely Updated Tile Server for OSM

Check out the full post from CloudMade’s blog here: http://blog.cloudmade.com/2009/01/23/nearly-live-tiles/

Geocoding with OpenLayers: A Crash Course In Firebug

The vacation is over. A new job and a new semester are already providing plenty of opportunities to explore those crazy technologies of the geoweb.

The need to incorporate the Yahoo geocoder into a new OpenLayers app has proved to be a great learning experience in navigating the development version of OL. The YahooGeocoder.js addin, created by OL community member ‘sbenthall’, requires two prerequisites to run: a proxy.cgi script and a Yahoo Developer API key. Why would this RESTful service require a proxy, you might find yourself asking? Because the browser’s same-origin policy won’t let an XMLHttpRequest fetch the XML response directly from Yahoo’s domain; a server-side proxy completes the request on the browser’s behalf. Yahoo has a great article for novice web programmers like myself explaining the role of web proxies, which can be found here.

A quick overview of the primary steps to add the YahooGeocoder.js addin follows:

  1. Sign up for a Yahoo App ID to enable access to their geocoding service.
  2. Add the proxy.cgi script to your webserver’s ‘cgi-bin’. Note: when navigating to the proxy.cgi’s URL, you might encounter ‘access denied’ errors. If you do, make sure the proper permissions are set on your cgi-bin directory, e.g. by running ‘chmod 755’ on it.
  3. Edit the proxy.cgi script to include ‘local.yahooapis.com’ in the list of ‘allowedHosts’ (see the snippet just after these steps).

    Above: Modified 'proxy.cgi'

  4. Add the YahooGeocoder.js file to the OpenLayers ‘lib/OpenLayers/Control’ folder.
  5. Add “OpenLayers/Control/YahooGeocoder.js” to the variable array, ‘jsfiles’ inside the “lib/OpenLayers.js” library.

    Above: Modified 'lib/OpenLayers.js'

  6. Test the geocoder’s functionality using the supplied HTML file. (Hopefully it works!)

    Above: Geocoder Result with Properly Formed XML Response.
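As promised in step 3, the allowedHosts edit looks something like the following. The stock OpenLayers proxy.cgi is a Python CGI script; the exact default hosts vary by version, so treat this as a sketch:

# proxy.cgi maintains a whitelist of hosts it will forward requests to;
# the Yahoo geocoding host has to be added to that list.
allowedHosts = ['www.openlayers.org', 'openlayers.org',
                'local.yahooapis.com']  # added for the Yahoo geocoder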

Six simple steps, but it can be challenging if you haven’t tried to install any addins to the OL library before.

In the step 6 image above, the Firebug window can be seen returning a properly formed XML response, having successfully executed the geocoding function. If you enlarge the image, you can compare this to the raw XML response from a properly constructed query. Note in both the response captured from Firebug (above) and the raw XML (below) the presence of the address, 510 High Street, Bellingham WA, broken down into its individual components along with the geocoded result.

Above: Basic XML Return
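You can also fetch that raw XML outside the browser entirely. Here is a quick sketch, assuming Yahoo’s V1 geocoding endpoint as it existed at the time; YOUR_APP_ID is a placeholder for a real key:

# A sketch only: query the Yahoo V1 geocoder directly and print the
# ResultSet XML, the same document Firebug captures above.
import urllib

params = urllib.urlencode({
    'appid': 'YOUR_APP_ID',
    'location': '510 High Street, Bellingham, WA',
})
url = 'http://local.yahooapis.com/MapsService/V1/geocode?' + params
print urllib.urlopen(url).read()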

Diving further into the capabilities of Firebug, we can use the DOM inspector to ensure that the various parameters required to properly execute the Yahoo geocoder are in place. Note in the image below the presence of such necessary information as the App ID key, projection, and class for the ygc variable. If any of these parameters were set incorrectly, it would show up in this view.

Above: The Firebug DOM Inspector

I’m finally starting to appreciate the power of Firebug as a development tool, which just so happens to coincide with my ability to understand it at a basic level. Hopefully, as my experience in GIS web development grows, so will my ability to use the higher-end functions of this tool.

OSM Recap

What an awesome weekend. There was a good turnout for the Phoenix OSM mapping party, put on by CloudMade and hosted locally by Gangplank. On both days we saw a mixture of GIS professionals and Open Source users come out to support the project. A new neologism to add to the list also came out of the event: ‘crowdsourcing’, the work of many locals with inherent expert knowledge of their surroundings replacing what would otherwise be performed by a few skilled professionals.

Here is the result of our work:

Above: ASU Mapping Party

In any event, we decided to focus our efforts on ASU’s campus for the weekend. Before we went out with GPS units, an OSM’er named ‘jfuredy’ had actually taken a stab at digitizing some building rooftops on campus. I’ve already noticed the inherent conflicts that come from digitizing rooftops: tall buildings appear oblique in the imagery, with their roofs offset from their actual footprints. This creates scenarios in which buildings (created from oblique-angled rooftops) overlap walkways.

Above: Digitization Conflict

I’m excited to continue working on the OSM project, and hope to see more mapping parties in the future. Speaking with Brandon, our contact at CloudMade, I mentioned that the end of January could be a potential date. This would be the first week of the new semester for MAS-GIS, and hopefully more of the students will come out and support the project!

whew…

This took a ‘lot’ longer than I expected.

http://www.mkgeomatics.com/apps/openlayers/WA_WFS_WGS84.html

Calling Mapserver as a WFS layer and displaying it with OpenLayers. Many thanks to Chris Schmidt for a quick response on the OL mailing list!

It’s currently set to extract attributes in the OL HTML, but I still need to build in the attribute display. A good example of the necessary JavaScript can be found in the OL Cookbook.

I also sent a quick thank-you to Jeff McKenna at Gateway Geomatics for putting together such a great set of UMN Mapserver WFS and WMS tutorials. He actually wrote back! Oh FOSS4G, what a nice community.

WFS!

Wow, I actually have a working WFS server now. I was able to successfully use QGIS to call upon the same mapfile using both ‘WMS’ and ‘WFS’ service types. As you can see in the image below, a static base layer, ‘COUNTY’ (actually a dissolved shapefile designed to show the physical outline of Washington State), is displayed as a WMS layer, alongside a WFS layer, ‘NATPARK’, whose features can actually be identified.

QGIS Screenshot: Selected feature in yellow, identified feature in pinkish outline.

I’m amazed at the ability of Mapserver to take a shapefile, turn it into GML, and send it almost magically to ANY program that can interpret it.  I’m finally wrapping my noggin around the whole concept of XML, and just how powerful it can be.

So what were my primary issues? Well, it took me about an hour to realize that I was running my ‘DescribeFeatureType’ request against the mapfile entitled ‘WA_WGS84.map’ rather than ‘WA_WFS_WGS84.map’, so that was my own error. Then I realized that while the key field for each of these shapefiles in ArcGIS is simply ‘FID’, it cannot be used for the ‘gml_featureid’ metadata requirement in the mapfile. Another key attribute, such as ‘PARKID’, must be used instead, and the GML output will actually create its own ‘FID’ based on it.

Some other key lessons from today:

  • ESRI ArcGIS does not support WFS out of the box as it does WMS. To enable the GML simple features profile, try out this ESRI Help Article.
  • Querying your WFS server in-browser using ‘GetFeature’ and ‘GetCapabilities’ requests is a really simple and effective way of determining which required pieces of information are missing from your XML. Mapserver will actually insert an in-line comment naming the missing metadata.
  • Creating metadata abstracts for both WFS and WMS in your mapfile is a simple way to see how your GIS software is consuming the data. For example, the only reason I realized I was actually using QGIS to display my WFS mapfile as a WMS was through the use of both the “wms_abstract” and “wfs_abstract” metadata tags. (A sample LAYER block follows this list.)
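For reference, here is a sketch of the LAYER-level metadata in question. The values come from my setup; treat the exact keys as version-dependent rather than a definitive template:

LAYER
  NAME "NATPARK"
  TYPE POLYGON
  DATA "natpark"
  DUMP TRUE  # permits GML output for WFS in this era of Mapserver
  METADATA
    "wfs_title"     "National Parks"
    "wfs_abstract"  "NATPARK served over WFS"
    "gml_featureid" "PARKID"  # a real attribute; the shapefile FID won't work
  END
END

And the in-browser sanity checks look something like this, with the host and map path as placeholders:

http://example.com/cgi-bin/mapserv?map=/path/to/WA_WFS_WGS84.map&SERVICE=WFS&VERSION=1.0.0&REQUEST=GetCapabilities
http://example.com/cgi-bin/mapserv?map=/path/to/WA_WFS_WGS84.map&SERVICE=WFS&VERSION=1.0.0&REQUEST=GetFeature&TYPENAME=NATPARK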

What’s next? Wrap an OpenLayers display around this with selection and identify capabilities.

Nested Tuples and Hours of Patience

I’ve finally got a working Python script that does the following:

  • Reads each of the vertices of a polyline shapefile using ‘OGR’.
  • Runs the vertices through the Douglas-Peucker smoothing algorithm.
  • Generates an output text file via ‘stdout’ with the resulting ‘smoothed’ coordinates (in whatever linear unit of measurement the source shapefile happens to be in).

It has taken quite a while to get this far, but it has been an excellent learning experience. The breakthrough tonight was understanding that the Douglas-Peucker implementation I downloaded from mappinghacks requires a nested tuple as input. Basically, then, I needed to figure out how to translate the vertices extracted from the source shapefile into that tuple object.
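In code, that translation looks roughly like the following sketch. The input filename and tolerance are hypothetical, and ‘simplify_points’ stands in for the downloaded mappinghacks function, whose real name may differ:

# A sketch only: pull vertices out of a polyline shapefile with OGR,
# build the nested tuple the DP algorithm expects, and print the result.
import sys
import ogr  # the pre-osgeo-package import style of the era

from dp import simplify_points  # the mappinghacks DP module; name assumed

ds = ogr.Open('streams.shp')  # hypothetical input shapefile
layer = ds.GetLayer(0)

feature = layer.GetNextFeature()
while feature:
    geom = feature.GetGeometryRef()
    # Nested tuple: ((x0, y0), (x1, y1), ...)
    pts = tuple([(geom.GetX(i), geom.GetY(i))
                 for i in range(geom.GetPointCount())])
    for x, y in simplify_points(pts, 10.0):  # tolerance in the source's linear units
        sys.stdout.write('%f %f\n' % (x, y))
    feature = layer.GetNextFeature()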

Things started to go bad, really bad, when I attempted to create formatted output files for the extracted vertices… and re-import them as input files for the DP algorithm. It’s all good now though, and the code looks (a bit) cleaner.

The next step will be to take this output text file and build another shapefile from it…
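Roughly, that next step with OGR might look like this sketch; the filenames and sample coordinates are placeholders:

# A sketch only: write smoothed coordinates back out as a new polyline
# shapefile using the old-style OGR bindings.
import ogr

smoothed = ((0.0, 0.0), (5.0, 3.0), (10.0, 0.0))  # stand-in for the DP output

driver = ogr.GetDriverByName('ESRI Shapefile')
out_ds = driver.CreateDataSource('smoothed.shp')
out_layer = out_ds.CreateLayer('smoothed', geom_type=ogr.wkbLineString)

line = ogr.Geometry(ogr.wkbLineString)
for x, y in smoothed:
    line.AddPoint(x, y)

feature = ogr.Feature(out_layer.GetLayerDefn())
feature.SetGeometry(line)
out_layer.CreateFeature(feature)
out_ds.Destroy()  # flushes the shapefile to disk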

Here is a link to the heavily commented script (right-click, select ‘save-as’) as it stands thus-far.

Web-Based GRASS!

Whoo-hoo! After a number of tweaks, I was finally able to install command-line GRASS GIS onto this site. I’ve been working off of Aaron Racicot’s tutorial for about a week now. Actually being able to understand and fix some of the errors in the ./configure process feels pretty rewarding. I had to manually set the paths for FFTW and Tcl/Tk using --with-fftw-includes and --with-tcltk-includes respectively. I also ran into an issue with NAD2BIN not being set explicitly, but I was lucky enough to find a forum post stating that I needed to use the command, ‘export NAD2BIN=/home/…/bin/nad2bin’.
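Pieced together, the relevant part of the build setup looked something like this sketch; the include paths and the NAD2BIN location are placeholders for wherever those files live on your host:

# Placeholders throughout; substitute your own paths.
export NAD2BIN=/home/<user>/bin/nad2bin
./configure \
  --with-fftw-includes=/home/<user>/usr/local/include \
  --with-tcltk-includes=/usr/include/tcl8.4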

I’m not all that familiar with the command line syntax of GRASS, so I actually had to use a local install to view some sample syntax, and modify it for use on the web instance. Pretty fun.

I’m wondering if there is any way for a user to access the geoprocessing functionality of this GRASS install w/o logging into the server. Furthermore, could it be possible to tie this into a Mapserver/OpenLayers install? It would be amazing to recreate this ESRI on-the-fly geoprocessing example using only open source GIS tools.

Anyways, it’s a pretty cool feeling to be able to upload a file to the server, run a geoprocessing task on it, and download the results back to my machine!

GRASS 6.2.3 (spearfish):~/usr/local/bin > r.shaded.relief map=elevation.dem shadedmap=shade_test altitude=30 azimuth=270 zmult=2 scale=1
Calculating shading, please stand by.
100%
Color table for [shade_test] set to grey

Shaded relief map created and named [shade_test].
GRASS 6.2.3 (spearfish):~/usr/local/bin > r.out.tiff input=shade_test output=/home/matthewkenny/usr/local/gis_workspace/shade_out compression=none
WARNING: The map <shade_test> in mapset <user1> is a floating point map.
Decimal values will be rounded to integer!
GRASS 6.2.3 (spearfish):~/usr/local/bin >

And the output image downloaded back to my computer:

GDAL / PROJ.4 On Shared Webhost

For the past couple of days, I’ve been putting hours into understanding Aaron Racicot’s ‘guide to GIS on a shared hosting service’. This has been a great learning experience in working w/ SSH and a full Mapserver install from source. My goal is to host a Mapserver WMS directly from this site. I’ve got everything in place… probably not configured correctly, probably full of errors, probably will never work (w/o a healthy dose of patience).

I do however, know that I am on the right track, as GDAL/OGR and PROJ.4 are up and running.

[gotti]$ ./ogrinfo -al -so /home/matthewkenny/usr/local/shp_samples/county_polygon.shp
INFO: Open of `/home/matthewkenny/usr/local/shp_samples/county_polygon.shp'
using driver `ESRI Shapefile' successful.

Layer name: county_polygon
Geometry: Polygon
Feature Count: 39
Extent: (576751.628593, 81877.320813) - (2551197.698864, 1355595.418907)
Layer SRS WKT:
PROJCS["NAD_1983_HARN_StatePlane_Washington_South_FIPS_4602_Feet",
    GEOGCS["GCS_North_American_1983_HARN",
        DATUM["NAD83_High_Accuracy_Regional_Network",
            SPHEROID["GRS_1980",6378137.0,298.257222101]],
        PRIMEM["Greenwich",0.0],
        UNIT["Degree",0.0174532925199433]],
    PROJECTION["Lambert_Conformal_Conic_2SP"],
    PARAMETER["False_Easting",1640416.666666667],
    PARAMETER["False_Northing",0.0],
    PARAMETER["Central_Meridian",-120.5],
    PARAMETER["Standard_Parallel_1",45.83333333333334],
    PARAMETER["Standard_Parallel_2",47.33333333333334],
    PARAMETER["Latitude_Of_Origin",45.33333333333334],
    UNIT["Foot_US",0.3048006096012192]]
COUNTY_COD: Integer (2.0)
COUNTY_FIP: String (3.0)
COUNTY_NM: String (15.0)
ECY_REGION: String (4.0)
AIR_REGION: String (46.0)
[gotti]$