Author Archives: Steve Coast

openstreetmap | style is violence

Scott Morrison posted a good article today in the Wall Street Journal about our hire of Steve Coast, OpenStreetMap's founder, and our announcement a week ago that we'd be sharing aerial imagery with OSM.  OpenStreetMap, in case you don't know, is a sort of Wikipedia for maps, contributed to by all, owned by all.  It's been up since 2004.

Steve is a wonderfully creative hacker, both idealistic and sardonic.  (Maybe nothing sums the latter up quite so perfectly as his Fake Mayor iPhone app, which spoofs the Foursquare “you’re the mayor” screen and might score you a free cappuccino at some overly-wired coffeeshop.)  In short, he’d be at home as a character in a Cory Doctorow novel.  Hell, he probably is a character in a Cory Doctorow novel.

Like Steve (and Cory) I'm a fan of Creative Commons.  When we released Photosynth in 2008, we had several CC options among the rights structures selectable for uploaded photos, and shortly afterward I prevailed on our program managers and legal people to change the default to Creative Commons Attribution, the most remixable variety.  I think it's important for people to be aware of, and exercise, their choices in controlling the rights to their own data.  Most people who post media on the Web in a public forum don't plan to sell or license those media.  In that case they should be encouraged to share with each other and with the world in a way that prevents the media from ever becoming a corporation's walled asset.  The CC Attribution and ShareAlike licenses do that.

Shortly after the Photosynth release, I saw the beautiful “OSM 2008: A Year of Edits” video, an animation showing all of the contributions to OSM over 2008.  It’s a lot better than it sounds.  Actually, I remember it sort of putting a lump in my throat at the time.  Eerily, it reminds me a bit of voltage-sensitive dye neural imaging videos.  As if the Earth is a giant brain wiring itself up.

One of the things that’s exciting to me about OSM is the way it empowers grassroots mapping of places where there’s not enough economic incentive to produce the sort of commercial maps Tele Atlas and Navteq specialize in (and that Bing licenses).  Major OSM projects took place last year in Haiti and in Kibera, one of the biggest slums in the world (home to roughly 1M people).  Even in the US, while the commercial providers have far more precise and complete maps of the areas where people tend to navigate, OSM has a surprising density of small roads and paths in the wilder places, and details of footpaths in parks.

We have a collaboration underway with DigitalGlobe to do one of the largest aerial imagery surveys ever undertaken, covering the US and Western Europe at 30cm resolution.  (The camera we’re using to do this is an impressive technical achievement, developed by our Vexcel team in Graz, Austria.)  By sharing use of the imagery with the OSM community, we hope to enable more OSM goodness.  Maybe one day we can find a way to fund this kind of imaging over less developed parts of the world.

Most of the OSM community has responded very positively, though there are a few of the usual anti-M$FT trolls, like spacecube writing:

In my eyes OSM just sold its soul to the devil.

To be clear, OSM’s legal status is like a one-way valve– it’s free and open forever, and any edits made to it from any source become free and open too.  It can be used by anybody, but it can never be “bought” or “owned” by any company.  If a trail over the Rockies can now be positioned with 30cm accuracy by tracing over our aerial imagery, that’s bad for OSM how exactly?

This is quite aside from the question of whether Microsoft can still be considered the devil in the company of its younger brethren – maybe, but at most in an old-fashioned, Rolling Stones sort of way.

Kind words from Blaise

OpenStreetMap Gets Noticed by Microsoft, AOL

By SCOTT MORRISON

OpenStreetMap, a sort of Wikipedia of online maps assembled with contributions from thousands of globe-trotting volunteers, has gotten the attention of two big Internet players: Microsoft Corp. and AOL Inc.

The companies recently invested money and contributed aerial imagery to help OpenStreetMap forge new ground. They see the project as a potential alternative or complement to expensive digital maps built by commercial vendors like Nokia Corp.’s Navteq and TomTom International BV’s Tele Atlas, which license data to Microsoft, AOL, Yahoo Inc. and Google Inc. for all or parts of their online maps.

For Microsoft and AOL’s MapQuest unit, OpenStreetMap presents an opportunity to build new local services or develop new business models while skirting the costs and terms associated with licensed data from the commercial providers. The two companies are estimated to pay Navteq tens of millions of dollars a year for its map data.

“As location becomes an important element in online services, it’s really critical that companies have the flexibility to build the services that consumers want without the constraints of licensing agreements,” said MapQuest general manager Christian Dwyer.

Google last year began moving away from commercial vendors by rolling out a U.S. based map built with government data, satellite and aerial imagery, and data collected by its Street View vehicles. A company spokesperson says Google’s having control of its own maps enables it to do frequent updates and make them available whenever and wherever users need them—online, on mobile devices, or in the car.

“Google has tremendous business flexibility in how they use their map,” says digital mapping consultant Marc Prioleau. “The others have to work with a third-party vendor to make changes to the maps or try new business models.”

OpenStreetMap, meanwhile, is a free map of the world that is being built with government data and supplemented by an army of 300,000 volunteers who use GPS technology to trace and upload their routes to OpenStreetMap’s website.

These community mappers can also use programming tools on the website to fill in features like bicycle paths, traffic restrictions, restaurants and shops, historic sites and sporting venues.

Volunteers so far seem to have mixed reactions to the idea of their contributions being used for commercial purposes. Samat Jain, an IT consultant in New Mexico who contributes to OpenStreetMap, says many members of the open-source community are concerned that Microsoft and AOL might steer the project in the wrong direction as they seek to commercialize the maps, but he personally supports their involvement because they will help push his contributions out to a broader audience.

OpenStreetMap founder Steve Coast, a computer developer and physics dropout, started his nonprofit project in 2004 after recognizing that, unlike open-source software and community encyclopedias, there was no free mapping data available to computer programmers. “Mapping is one of the few things that gets you, as a hacker, out into the streets doing physical things,” says Mr. Coast, who recently joined Microsoft as a map architect.

Mr. Coast acknowledges that the OpenStreetMap project has a long way to go, but he argues that maps of some regions, like the U.K. and Germany, are comparable to, if not more detailed than, those provided by Navteq and Tele Atlas.

The project’s U.S. map, by contrast, still lags. OpenStreetMap used freely available government data to lay out a basic map grid for the entire country and is now relying on community mappers to fill in the details. So far, those mappers have focused on major urban areas like New York City, Houston and San Francisco.

Tiffany Treacy, a senior vice president at Navteq, wouldn’t comment directly on OpenStreetMap, but claims her company can provide the consistency and level of accuracy that consumers demand in their maps.

Patrick McDevitt, vice president of community mapping for TomTom, says that while community maps work for some applications, “those that require consistent high quality, accuracy and extensive coverage will need quality-assured and tested products.”

OpenStreetMap received a major boost this year when MapQuest began rolling out maps of several European countries, including the Netherlands and Switzerland, based on the project’s mapping data. AOL also invested $1 million to help developers build tools that make it easier for mappers to contribute to these maps.

MapQuest’s Mr. Dwyer says the goal is to eventually switch to OpenStreetMap for the entire world, but he estimates it will take three to five years to make that vision a reality.

Microsoft followed MapQuest’s lead last month when it hired Mr. Coast and announced it would provide high-resolution aerial imagery to OpenStreetMap, a step that will help volunteers fill in gaps in the map by tracing streets and other features from the images.

Blaise Aguera y Arcas, architect of Bing Maps at Microsoft, says he sees OpenStreetMap as a source of mapping data that is complementary to the sets provided by Navteq, which powers Bing Maps.

Mr. Aguera y Arcas also notes that commercial mapping providers are focused on the U.S. and European markets, while OSM volunteers have in some cases built highly detailed maps in South America and Asia.

“We have no plans to drop our relationship with Navteq,” he says. But “it would be silly not to provide easy ways for users to use OpenStreetMap in areas in which OpenStreetMap has a lot to offer.”

Write to Scott Morrison at scott.morrison@dowjones.com

Woo

Microsoft Imagery details

“Microsoft is pleased to announce the royalty-free use of the Bing Maps Imagery Editor API, allowing the Open Street Map community to use Bing Maps imagery via the API as a backdrop to your OSM map editors.

Bing Maps imagery must be used in accordance with the API Terms and Conditions [see PDF below] – although this is not legally binding advice, and you are encouraged to read the TOU itself, in sum the TOU says: you are only granted rights to use the aerial imagery; you must use the imagery as presented in the API; you cannot modify or edit the imagery, including the copyright and credit notices; you cannot create permanent, offline copies of the imagery; all of your updates to OSM arising out of the application must be shared with OSM; and the OSM map editor must be free to end users.”


If you have a question, I'm at steve@asklater.com, or you can chat to people live at http://irc.openstreetmap.org/. Richard Fairhurst and others have already been working on the code to use this with Potlatch etc. You should see it go live soon!

Law and the GeoWeb

I’ll be there…

Law and the GeoWeb
==================

A workshop on “Intellectual Property and Geographic Data in the
Internet Era” sponsored by Creative Commons and the United States
Geological Survey (USGS) in conjunction with the annual meeting of
AAG, April 11, 2011, Seattle, Washington. The workshop will be held at
the campus of Microsoft Research, and will be streamed live on the
Internet.

This workshop will focus on intellectual property issues with
geographic data, exploring situations when users and creators ranging
from individuals to local, state and federal agencies as well as
private companies and non-profits create, share and reuse geographic
information from different sources over the Internet in their
projects.

For more information, please see http://punkish.org/geoweb/index.html
or search on Twitter for #lawandgeoweb

Rationale
=========

U.S. Copyright Law protects tangible original works with creative
content, but the law also ensures that facts, that is, data that are
discovered rather than invented, remain free for everyone's benefit.
This idea/expression dichotomy creates a lot of issues in the
Internet age, when information is very easily created, shared, used
and reused.

With inexpensive computing and networking power available to everyone,
geographic datasets are increasingly being created, shared and used by
individuals, grassroots organizations, and private corporations. These
data come with different expectations with regard to how they may be
used, resulting in a hodgepodge of licensing and contractual
obligations that hinders data interoperability. Mixing data of
different provenance creates new data with typically more restrictive
licensing conditions. Public agencies may be unable to mix licensed
data with government data due to the restrictive licensing terms of
the resultant dataset, and thus may be unable to capitalize on and
benefit from user-generated content.

Workshop Structure
==================

The current line-up of speakers from federal, state and local
agencies, Creative Commons, grassroots organizations, intellectual
property lawyers, the geospatial industry, and research and academia
includes:

* Ed Arabas, National States Geographic Information Council
* Greg Babinski, King County, State of Washington
* Michael Brick, Microsoft Legal, Bing Maps
* Steve Coast, Founder, OpenStreetMap 
* Kari Craun, Director, National Geospatial Technical Operations, USGS
* Ed Parsons, Chief Technologist, Google Maps, Google
* Diane Peters, General Counsel, Creative Commons
* Tim Trainor, Bureau Chief, Geography Division, US Census Bureau
* Paul Uhlir, Director, Board for Research, Data and Information, NRC

The format of the workshop will encourage discussion and participation.

Participate
===========

To ensure those directly involved in the topic get a chance to attend
the workshop, attendance is based on a short application form
accessible at http://punkish.org/geoweb/participate/in_person/index.html.
Deadline for applying for the workshop is December 18, 2010. Selected
applicants will be informed by January 15, 2011.

Attendees will also be able to submit longer papers for publication in
a special issue of the peer-reviewed, completely free and open access
online journal “International Journal of Spatial Data Infrastructure
Research” published by the Joint Research Centre of the European
Commission.

Logistics
=========

The workshop is organized in conjunction with the AAG annual meeting.
The workshop will be held on the campus of Microsoft Research, and run
from 1 PM to 5 PM on Monday, April 11, 2011.

There is no fee for this workshop and participants do not have to
register for the AAG Annual Meeting. The workshop is limited to 50
participants to facilitate discussion.

Proceedings of the workshop and selected longer papers will be
published in a special issue of the open-access International Journal
of Spatial Data Infrastructure Research published by the Joint
Research Centre of the European Commission.

Contact
=======

Please contact either Puneet Kishor, Creative Commons
[punkish@creativecommons.org] or Barbara Poore, USGS
[bspoore@usgs.gov] if you have any questions.

Tricky Transit Data. Is Transiki the Answer? | It’s All About Data

I have always loved working with transit data. It is so similar to the road data, like TIGER or OSM, that we work with so often, but at the same time it is so different. While a road network stays largely the same over time, a transit network can change depending on the time of day or the day of the week. This makes working with transit data a much harder problem than working with street data; in fact, transit networks generally include street networks within them!

Services on top of transit data are somewhat rarer than road-network services. Google has had Google Transit for several years now and is slowly expanding it to include more cities. Microsoft recently added transit data for a few cities to Bing Maps. These services are great; I honestly don't think I would be capable of using transit anymore without Google Transit on my iPhone. But what about the data?

Google has opened up the General Transit Feed Specification (GTFS), the interchange format it uses to load data from transit agencies into its system. There are some efforts to make public GTFS data easy to get, as well as a number of very powerful tools that can pull in a GTFS dataset along with an OpenStreetMap road network for analysis and routing. (By the way, GTFS is just a standard schema represented in CSV files, so working with it in FME is very easy too – kudos to Michael Grant of BC Transit, who gave a great presentation on this topic last week.)
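To make that concrete, here is a minimal, hypothetical sketch (not from the original post) of reading one GTFS file, stop_times.txt, as plain CSV. The field names come from the public GTFS spec; the file path is a placeholder and the CSV handling is deliberately naive.

```typescript
// A minimal sketch of reading GTFS stop_times.txt as plain CSV.
// Field names follow the public GTFS spec; the file path is a placeholder.
// Note: this naive split does not handle quoted fields containing commas.
import * as fs from "fs";

interface StopTime {
  tripId: string;
  arrivalTime: string;
  stopId: string;
  stopSequence: number;
}

function readStopTimes(path: string): StopTime[] {
  const lines = fs.readFileSync(path, "utf8").trim().split(/\r?\n/);
  const header = lines[0].split(",");
  const col = (name: string) => header.indexOf(name);

  return lines.slice(1).map((line) => {
    const f = line.split(",");
    return {
      tripId: f[col("trip_id")],
      arrivalTime: f[col("arrival_time")],
      stopId: f[col("stop_id")],
      stopSequence: Number(f[col("stop_sequence")]),
    };
  });
}

// Example: print the ordered stops of the first trip in the file.
// This is where the time dependence of transit data shows up: the schedule
// is keyed by trip and arrival time, not just by geometry.
const stopTimes = readStopTimes("gtfs/stop_times.txt");
const firstTripId = stopTimes[0]?.tripId;
stopTimes
  .filter((st) => st.tripId === firstTripId)
  .sort((a, b) => a.stopSequence - b.stopSequence)
  .forEach((st) => console.log(`${st.arrivalTime}  ${st.stopId}`));
```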

Transiki is a new project from Steve Coast (yes, the same Steve Coast who started OpenStreetMap) to bring the OSM model to transit data. Steve says he had the idea when Google Transit failed him, leaving him on a platform waiting for a train that did not exist except in Google's datasets. With Transiki, Steve may still have been stranded, but at least he could have updated the dataset to prevent others from ending up in the same situation. A wiki-esque transit network could also bring the real-time transit data available in the Bay Area or Portland to other cities through a Waze-like application. The uses for a large, free transit data network are almost limitless.

Will Transiki prove to be the answer to all of our transit woes? The short answer is nobody knows for certain. As with all crowdsourcing initiatives, only time will tell if the support from the community remains and grows. Given the huge success of OpenStreetMap, I am excited and hopeful. To get involved with Transiki, come join me on the mailing list!



Usabilla shoutout

I want to give a shoutout to usabilla.com, who are helping OpenStreetMap with visual feedback on our user experience testing and design.

The concept behind Usabilla is super simple: throw anything from full-fledged designs to mockups in front of potential users and get click-based feedback. Users get to click on things and leave notes around specific features and parts of a page. So rather than trawling through a list of feedback, you get a more visual and engaging experience from both the testee's and the tester's perspectives. And a whole lot more; check out the intro video below:

[youtube http://www.youtube.com/watch?v=d6Xi8IdQLik?wmode=transparent]

OSM User Testing

I’ve written previously about OSM usability studies, and now it’s happening. Nate Bolt from the fantabulous Bolt|Peters is going to help OSM run usability tests and we need your help.

The timeline looks something like this: this week or next we're going to switch on some JavaScript on the OSM signup page that invites a percentage of signups to help OSM run a user survey. Those people fill out a form and are later invited to use some simple online screen-capture software while being asked to complete some simple tasks, and this is where you come in. We need to think of some simple tasks for new users to complete, and we'll put them together over on this wiki page. Add a street? Find a mailing list? Add a point of interest? What should they do? That's up to you.
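For illustration only, here is a tiny sketch of the kind of percentage gate described above; it is not OSM's actual signup-page code, and the 5% fraction is a made-up placeholder.

```typescript
// Hypothetical sketch of a signup-page gate: invite a fixed fraction of
// new signups to the user survey. The fraction is a placeholder assumption.
const INVITE_FRACTION = 0.05;

function shouldInviteToSurvey(): boolean {
  // Uniform random draw; roughly INVITE_FRACTION of signups pass the gate.
  return Math.random() < INVITE_FRACTION;
}

// On the signup page, this decision would control whether the survey
// invitation UI is rendered for the new user.
if (shouldInviteToSurvey()) {
  console.log("Help improve OpenStreetMap: take a short user survey.");
}
```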

Also, if you’re running a mapping party we can give you a super secret link where you can send new users to do the same tasks with screen recording. You mustn’t help them on the first go, as that’s exactly what we’re trying to find out – what goes wrong.

Then on December 8th (tentative) at the Bolt|Peters office in San Francisco, OSMers together with the UX wizards will analyze the videos and make some joint suggestions on how to push things forward. Anyone in SF, or can be in SF around then, please drop me a mail.

Measuring the OpenStreetMap Economy

When I see announcements flying around like MapQuest's $1M commitment to OSM, or CloudMade's $12M VC round, it raises the question: how big is the OSM economy?

Purely as an academic exercise it's interesting to think of OSM as an ecosystem around which people find work and provide goods and services. It might also make a nice exponential curve to show on a slide alongside user growth.

We have some limit cases. In 2004, when the project was founded, the economy was approximately zero. Or was it? Do we measure volunteer hours? How about the power and bandwidth the servers are burning? Or is that negligible compared to the other large numbers thrown around?

Today I would estimate we have about 5 people freelancing on OSM work worldwide. Perhaps 50 who do OSM work as part of their job, say writing a plugin or using the data. Full-time employees working explicitly on OSM? Perhaps 50 again. These are all guesses with some rough education behind them. These numbers would probably follow the kind of growth curves that various projects around Linux did, rather than Wikipedia, I'm guessing: Wikipedia was much more about the destruction of value around Britannica and others, and the secondary market for services around Wikipedia is pretty small (I think?). Unless you count MediaWiki itself.
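Just to make the shape of that estimate concrete, here is a back-of-envelope sketch using the headcounts above; the per-person dollar figures are placeholder assumptions, not numbers from this post.

```typescript
// Back-of-envelope sketch of the "OSM economy" headcounts above.
// The headcounts come from the post; the dollar figures are placeholder
// assumptions, not data.
const freelancers = 5;   // people freelancing on OSM work worldwide
const partTimers = 50;   // people doing OSM work as part of their job
const fullTimers = 50;   // full-time employees working explicitly on OSM

// Placeholder annual figures per head (assumptions only).
const perFreelancer = 60_000;
const perPartTimer = 20_000; // only a slice of their job is OSM work
const perFullTimer = 80_000;

const roughAnnualSize =
  freelancers * perFreelancer +
  partTimers * perPartTimer +
  fullTimers * perFullTimer;

// Roughly $5.3M a year under these made-up assumptions.
console.log(`Rough annual "OSM economy": ~$${(roughAnnualSize / 1e6).toFixed(1)}M`);
```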

Once you have the criteria for what goes into the measuring pot of the “OSM economy”, you still have large error bars on the data for each item. For example, are those freelancers going to tell you what kind of money they're making?

Still, an interesting thought exercise.