Spatial Networks recently attended two fascinating conferences. The first was the annual meeting of the Association of American Geographers, aptly named AAG 2014. The second was the United States Geospatial Intelligence Foundation's annual symposium, GEOINT 2013*.

Presentations at both conferences demonstrated the rapid growth of remote sensing. At AAG, Gary R. Watmough of the Earth Institute demonstrated how algorithms for analyzing satellite imagery can identify rich and poor areas in India. In a presentation at AAG and a booth at GEOINT, geniuses from the Oak Ridge National Laboratory showcased LandScan, a project that uses remote sensing algorithms to estimate population levels around the world. Many companies at GEOINT are actively developing algorithms for mapping the physical world with LiDAR. Although not presenting at either conference, the World Resources Institute is helping run Global Forest Watch, an innovative program that monitors forests around the world using remote sensing. The Harvard Humanitarian Initiative is also using remote sensing to detect large-scale attacks and human rights abuses. It is clear that remote sensing, especially satellite imagery analysis, is a diverse field full of potential.

With all of that said, for the remote sensing industry to reach its full potential, it will have to leverage the data and services of on-the-ground data collectors such as Spatial Networks. Imagery analysts often validate their algorithms by comparing results against census data and other open datasets. Unfortunately, censuses conducted around the world, especially in many conflict-affected countries, can be highly inaccurate, poorly organized, and politically biased. Testing algorithms against biased data can lead to biased algorithms. Ground-truthers like Spatial Networks have a solution to this problem: we gather data on the ground in hard-to-reach areas all over the world. If you want to know whether your algorithm made a correct prediction about a location, send us over there to find out!

Remote sensing and mobile data collection are essentially two sides of the same coin.  They both seek to ascertain the truth and can work together to mutual benefit.  Just as information collected on the ground can test the accuracy of remote sensing algorithms, these same algorithms can identify interesting areas as candidates for mobile data collection.


Posted by Daniel Dufour
Categories: Event, Article

We just returned from an incredible weekend in San Francisco for the State of the Map US conference, the annual American edition of the OpenStreetMap community conference. Nearly 400 people were in attendance – developers, cartographers, enthusiasts, educators, and more – to talk about new ideas in OpenStreetMap, collaborate on new tools, and discuss how we as a community can take the OSM project to the next level and make it even more amazing.

SotM US 2013 group photo

My talk on Monday reviewed the state of Pushpin, the mobile OpenStreetMap editor we built back in late 2012. Since its release 8 months ago, we've seen incredible adoption – over 43,000 edits and additions to OSM from 100 countries – and we've attracted hundreds of new contributors by offering the community a simpler tool that lowers the complexity curve for editing. As a result, local knowledge from the field is enabling a level of map data detail that's been hard to kickstart among the greater community outside of the core power users. Data such as detailed place addressing, routing issues, and local place name information is the sort of hyperlocal content that OSM excels at, and it's what will enable commercial developers to build incredible tools powered by OSM data. (video + slides from my talk)

Jumping off from the discussion about editors like Pushpin and iD, the topic of community growth and engagement resonated throughout a number of presentations. Martijn, Richard, and Alyssa each gave excellent talks on the state of the community in general. Mikel and Saman presented great ideas about how to bring to the forefront more of the social aspects that make OSM such a fun and rewarding project. The discussion around community improvement continued in a Birds of a Feather session led by Kathleen and Steven, focused on making the community a more fun and inviting place to be, both physically and virtually. As much as the regularly scheduled talks, the ad hoc BoF sessions are valuable for the collaborative atmosphere they provide for getting together and discussing specific issues. I heard great things about other productive sessions on the OSM hardware infrastructure, the ODbL licensing issues, and how OSM can be used as a teaching tool.

BoF sessions

SotM US has become my favorite conference, combining a fun, energetic community with lots of creativity and interesting work. Out of two full days of talks, not a single one felt out of place or slapped together. The high bar set by the presenters in the OSM community makes me honored to be one of them. Other highlights included Artem demoing the "world's smallest tile server" on a Raspberry Pi during his talk on Mapnik, Steven showing the cool work being done at the US Census Bureau with OSM data, and Tom & Eric's showcase of the 2013 "OpenStreetMap Report", like a shareholders meeting for the data community. Ryan Closner's look at exploring OpenStreetMap through Minecraft-style 3D worlds with voxel.js (using a 3D game world as presentation software) was so cool, I might just have to try some "unslides" for my next talk.

If you haven't already, I encourage the US OpenStreetMap community to join the Facebook group and follow the blog for announcements of OSM activities in the US. A huge thank you goes out to the OSM US board and all of the volunteers who helped put this event together, even capturing live video of each talk – the sessions are up now on the conference website for those who couldn't attend. Each event is better than the last, and I'm looking forward to continuing that trend.

(Photo credit: Justin Miller)

Posted by Coleman
Categories: Event

Young Hahn's presentation at FOSS4G NA last year, on his work with the MapBox team to churn out updated map tiles for the whole planet in a day, got me thinking once again about an idea I've had for some time – one that makes sense, but perhaps only to me. I thought I would toss this idea out for general consumption and peer review, scrutiny, critique, and hopefully, limited bruising to my ego. The concept of mapping things (physical or cultural) on the earth over time is challenging at best. There have been excellent examples of doing this in limited areas over slices of time, but I've yet to see a truly global effort to provide a continuum of data as a surface. The closest thing I have seen that even approaches this at a global scale is MapStory.

When trying to describe this to colleagues, I often use the analogy of the moment an airplane breaks through the clouds – when you have a window seat on a clear day and find yourself at eye level with the tops of the clouds. Looking out, you can see an undulating, irregular, and dynamic surface punctuated by pockets (no clouds) or perhaps a boiling thunderstorm rising up in an isolated vertical column.

The turbulence you experience is the impact of the plane slicing through this surface or skidding across patches of it at cruising altitude.  This provides a healthy analogy for continuous mapping of spatio-temporal data across the entire planet.

To do this, we need a way to uniquely index the planet in 4D (x, y, z & t for those of you scoring at home).  My idea (and the latent flaws in it) draws from a combination of these variables, so humor me and let me try to articulate this without additional context or history.

If we look at the maximum practical values for each variable above, we'd have something like this:

Latitude (x) = xx.xxx (i.e. 89.999 in DD)
Longitude (y) = xxx.xxx (i.e. 142.034 in DD)
Elevation (z) = xxxxx (i.e. in meters – yes, it's metric, get over it)
Date (d) = xxxxxxxx (i.e. 31122012 – sorry, Star Wars is cool, but no Star Dates here)
Time (t) = xxxxxx (i.e. 23:59:59)

Total surface area of the earth: 510,072,000 sq km (thanks @colemanm + Wikipedia)

If we fishnet grid the entire planet at 5 meter intervals, we get 20,402,880,000,000 points in the grid (give or take). That's nearly 20.5 trillion points. By any definition, that qualifies as "big data". It gets better.
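If you want to check that math, here's a quick back-of-the-envelope sketch in Python (purely illustrative – the constants are just the figures quoted above):

```python
# Back-of-the-envelope fishnet sizes for the figures quoted in this post.
EARTH_SURFACE_KM2 = 510_072_000                   # total surface area of the earth
EARTH_SURFACE_M2 = EARTH_SURFACE_KM2 * 1_000_000  # convert sq km -> sq m

def fishnet_points(spacing_m: int) -> int:
    """Approximate point count for a global grid: one point per spacing x spacing cell."""
    return EARTH_SURFACE_M2 // spacing_m ** 2

print(f"{fishnet_points(5):,}")   # 20,402,880,000,000   (~20.5 trillion)
print(f"{fishnet_points(1):,}")   # 510,072,000,000,000  (~510 trillion; see below)
```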

If we concatenate (that word always sounds funny to say out loud) the numeric values of the variables above into a single integer (and we stay consistent about it) we get something like:

899,991,420,341,000,031,122,012,245,959, which is more than enough digits to conquer the ~20.5 trillion point fishnet across the surface of the earth.
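To make the concatenation concrete, here's a rough sketch in Python – my own illustration, not a finished spec (signs, hemispheres, and input validation are all hand-waved) – using the field widths above with inputs that reproduce that 30-digit example:

```python
def spacetime_index(lat: float, lon: float, elev_m: int, date: str, time: str) -> int:
    """Concatenate fixed-width, zero-padded fields into a single integer key.

    lat:    decimal degrees to 3 places (xx.xxx)  -> 5 digits
    lon:    decimal degrees to 3 places (xxx.xxx) -> 6 digits
    elev_m: meters (xxxxx)                        -> 5 digits
    date:   DDMMYYYY string (xxxxxxxx)            -> 8 digits
    time:   HHMMSS string (xxxxxx)                -> 6 digits
    """
    lat_part = f"{round(abs(lat) * 1000):05d}"    # 89.999  -> "89999"
    lon_part = f"{round(abs(lon) * 1000):06d}"    # 142.034 -> "142034"
    elev_part = f"{elev_m:05d}"                   # 10000   -> "10000"
    return int(lat_part + lon_part + elev_part + date + time)

print(spacetime_index(89.999, 142.034, 10000, "31122012", "245959"))
# -> 899991420341000031122012245959
```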

'Why bother with all this big math,' someone countered when I brought this up two years ago at WhereCampDC, 'when we already have UTM to do this?' (I admit, I chose the wrong forum at the wrong time to introduce an imperfect idea and was promptly & properly shot down.) The question is still valid, however, and my response at this point would be: this number (whatever I get around to calling it) provides a truly unique spatio-temporal index for just about everything that has ever been or will ever be important to us as humans (and the non-human species around us). It would allow us to reference everything (to the nearest 5 meter coordinate) over time, and if that were to happen consistently and uniformly, we could perform analysis that is currently only possible in the realm of science fiction or DARPA (sort of the same thing at times).

Being able to investigate, interrogate, or query time & space (true 4D) and have all results returned on-demand and presented in a rich UI (i.e. a map) via a browser would be, well, pretty awesome. It leads back to the notion of the top of the cloud deck – active filtering of spatio-temporal data at 5 m resolution could provide true "art of the possible" capabilities and insight that might just make a difference in trying to understand the world around us and our impacts on it and upon each other – maybe even get crazy and start anticipating, if not predicting, strategic events. Or perhaps my idea is just a more modern adventure fashioned after Quixote.
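Continuing the sketch above (and it is only a sketch), the concatenated key is trivially reversible, which is part of what would make on-demand 4D queries plausible – filter on digit ranges of the index, then decode the hits back into coordinates and timestamps:

```python
def decode_index(index: int) -> dict:
    """Inverse of the spacetime_index sketch above: split the 30-digit key
    back into its fixed-width components."""
    s = f"{index:030d}"                 # zero-pad to the full 30 digits
    return {
        "lat": int(s[0:5]) / 1000,      # "89999"    -> 89.999
        "lon": int(s[5:11]) / 1000,     # "142034"   -> 142.034
        "elev_m": int(s[11:16]),        # "10000"    -> 10000
        "date": s[16:24],               # "31122012" (DDMMYYYY)
        "time": s[24:30],               # "245959"   (HHMMSS)
    }

print(decode_index(899991420341000031122012245959))
# {'lat': 89.999, 'lon': 142.034, 'elev_m': 10000, 'date': '31122012', 'time': '245959'}
```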

If you thought that was a stretch, cycle through that math with a 1 meter fishnet and we're at 510,072,000,000,000 points (give or take) – over half a quadrillion. Those numbers venture into the realm of Jeff Jonas and Terry Busch, and that gets truly interesting real quick.

This week Tony and I are out in California at Camp Roberts, working with a group of folks from all over the country, experimenting with tools and methodologies for disaster recovery scenarios.

The RELIEF event series provides an environment for civilian agencies, academic researchers, technologists, and military personnel to collaborate on improving the processes and tech used when first responders arrive on the scene post-disaster to triage and assist affected communities. Our objective during the trip is to experiment with new technologies for FEMA's "survivor-centric" approach to crisis management and response.

Camp Roberts FEMA Corps data collection

We were invited to participate in the exercises, specifically to field-trial the Fulcrum platform as a suite of field data collection tools for first responders assessing the on-the-ground environment, with assistance staff and FEMA Corps volunteers using electronic survey applications in place of older paper-based survey methods.

Fulcrum will be a great all-around tool for this sort of field survey work, particularly given its capabilities as a fully disconnected, offline platform and its foundation on consumer-grade devices. This means that in the low- or no-connectivity environment post-disaster, volunteers can still get the job done, syncing data whenever a connection is available rather than staying permanently tethered to a data uplink back to the central office. Because Fulcrum runs on iOS and Android consumer hardware, there's no expensive device acquisition process, and coordinators can support truly "bring your own device" models for volunteer participation.

We're excited to be part of this effort to improve response agency processes, but also to have the opportunity to put Fulcrum into the "proving ground" for true field testing scenarios. Stay tuned for more later in the week.

Everything needs a little maintenance from time to time. We change our oil, clean the gutters, get our eyes checked, and brush our teeth. Sometimes, things that don't even exist in the physical sense need a little TLC. The ones and zeros that make up Spatialnetworks.com were looking a bit weathered towards the end of 2012, and after tending to the rest of our flock, we were finally able to launch a refreshed version. 

Rediscovery

As we began the discovery phase of this new design, we really wanted to slim everything down and focus on the essentials. We didn't want any sliding image galleries, lengthy, self-affirming dissertations, or redundant content on pages that didn't need to exist. Getting back to basics means answering a few simple questions: Who is coming to the site? What do we want them to know? What action do we want them to take once they're ready to move on?  


One of the inherent benefits of doing a redesign, as opposed to starting from scratch, is that we need only look at our history to answer question number one. Of course, when one question is answered, others quickly arise: Are the visitors meeting our demographic expectations, i.e. are these the people we want coming to the site? Are there other markets in which we could be making an impact? Once these and similar questions are pondered, we can make informed decisions about next steps instead of just grasping in the dark.

Room For Improvement

While making these evaluations, we realized our previous site fell short when it came to addressing how Spatial Networks fits into industry-specific scenarios. In fact, falling short is putting it mildly, as we barely mentioned any specific industries at all, save "geography". We quickly decided to add both "Industries" and "Professional Services" sections and developed a near-term and long-term strategy for each. Since we wanted this redesign to move as rapidly as possible, it made more sense to release a bare version of these sections first. Without needing all the bells, whistles, and underlying features at launch, we could focus on other parts of the site and wrap up the whole project more quickly.

Everything At Your Fingertips

Another departure from the old site – and from traditional website layouts – was to keep all of the content on one longer page instead of breaking it up between individually navigated pages. When we dissected our previous design, it was clear that our messaging had become diffused with the addition & modification of different pages over time. Our solution was to keep it all in front of the reader and provide modals (windows appearing in front of the existing content) for additional content like staff bios and product descriptions. This keeps the reader from straying too far from the core messaging and provides an overall sense of cohesion.

Finding The Right Tone

Coming on the heels of the Fulcrum redesign, the old Spatialnetworks.com was indeed showing its age. We still felt strongly about the logo itself, but the design assets surrounding it needed a new look. Drawing from principles of the Fulcrum redesign – simplicity, texture, and a thoughtful use of negative space – we were able to establish a strong visual tone for the overall look of the site. One thing we wanted to avoid, however, was the sterile, boilerplate look of the average corporate site. We incorporated a more organic palette and a less ubiquitous set of fonts to help achieve this, but we found the initial font choice to be a bit too... fiddly, so we scaled it back to something more sensible.


In terms of designing with a purpose, we decided early on that we wanted something unique, informative, easy to use, and even a little fun. When visitors are finished browsing, we want them to feel as though they've been to the office, had lunch with us, and are leaving with a good sense of who we are and what drives us, both as individuals & as a team.

The SNI Roadmap

Now that the initial pieces are in place, the site will continue to improve as each section matures into its long-term features & goals. We'll also upload our slide decks from speaking engagements around the world and keep all the news & company info updated in our blog. Feel free to contact us with any questions or comments. We look forward to what the rest of the year has to offer.

Posted by Tim
Categories: Article
GPS talk

Earlier this week I spoke to a graduate course on the GPS system at the University of South Florida. The course's objective is to teach the systemic details of GPS – its structure, communications architecture, and practical application.

I was invited over by Sean Barbeau (the course instructor) to give insight into how GPS is applied and used in commercial settings, particularly how we use it in our work to build and augment geospatial data all over the world, and how it’s integral to modern geospatial tools like Fulcrum. I covered the basics of our project work, product development efforts, and design process in how we think about geospatial technology. I think it opened some eyes to approaches and technology that are somewhat non-traditional, certainly in the general academic community.

As I flipped through a quick demonstration of some modern geo tools – like Fulcrum, TileMill, and QGIS – the students in the course had great questions about process, the benefits and drawbacks of GPS on smart devices, and how field data collection can be done on an industrial level. OpenStreetMap figured into the discussion, since Sean has been integrating the collection of GPS traces and data capture into his curriculum as a practical usage of the GPS system and devices. He's even turned the students on to Pushpin as a tool for adding data to OSM in the local area. Several students are now working on their semester project using Fulcrum as a field capture tool – the assignment is to collect something interesting using GPS, then import the useful data into OpenStreetMap. It feels great to introduce students to new alternative technologies.

I always enjoy the opportunity to speak with up-and-coming GIS professionals, as I think an "outside opinion" from someone in industry is refreshing to young people in strictly academic settings. While the theory and research angles on the geographic sciences are critical to the learning process, perspectives from practical application are essential to having the right knowledge set when moving into the commercial space.

FedGeoDay

This week we’ll be at FedGeoDay in Washington, DC, talking with government and industry about modern tools and technologies for doing work with maps and data. The schedule is packed with fantastic speakers talking about things like building beautiful maps with open source, new ways to look at geo analysis, how to tell stories with maps, and the growth of the OpenStreetMap project.

In the afternoon, look for Brian Flood’s session on the awesome additions to Arc2Earth providing syncing services to allow you to work with data collected in Fulcrum. This functionality is huge for users who want to bring data from the field and push it into another platform for analytics or presentation, like MapBox (with TileMill Connect), CartoDB, or Google Fusion Tables. With Fulcrum as a cost-effective alternative for mobile surveying and data collection needs, users no longer need to invest in expensive, proprietary hardware to get valuable information from the field. And with these added capabilities in Arc2Earth, visualization and distribution of data becomes even simpler.

The schedule is packed with great sessions, and it's exciting to see so much innovation in alternatives to proprietary tools. I'll be at the conference all day, so track me down if you want to talk about mobile tools for mapping, data collection, and how we use open source geospatial technology in our work.

Posted by Coleman