Free Our Data: the blog

A Guardian Technology campaign for free public access to data about the UK and its citizens

Archive for the 'Media coverage' Category

Government asks Free Our Data to work with OPSI on web channel for users

Thursday, July 26th, 2007

We did say that the meeting with Michael Wills was interesting. In today’s Guardian, we precis the meeting (update: the full text is now on the blog), which boils down to a few key points:

  • Michael Wills, the minister for information at the Department for Justice, thinks the case for free re-use of public sector data in and outside government is “compelling”
  • He thinks that government has to move quickly to adjust to the changing information world
  • He expects the ‘public task’ of Ordnance Survey to be defined within a study that will report to him by the end of this year
  • The trading fund model, set up in the 1970s (under which government agencies charge for their output, reducing their need for direct tax funding), needs re-examination
  • He is reluctant, however, to disturb a model (trading funds) which has produced high-quality information

The key point, though, is that he wants Free Our Data to work with the Office of Public Sector Information to set up a web channel through which the public can request public data, and specify what form they want it in. That, to us, marks a significant recognition of the importance of this campaign.

Read more at The minister will hear you now in the Guardian. The full transcript of the meeting will be posted once it has been re-checked for accuracy.

Who’s who after the reshuffle

Thursday, July 12th, 2007

Today’s Guardian asks “Can the new ministers make a difference?” and looks at who is now who following the reshuffle by Gordon Brown.

Key players now are:

  • Michael Wills (in charge of OPSI and the National Archives, and Freedom of Information)
  • Stephen Timms, now minister of state for competitiveness, will have a key role in setting the rules under which trading funds such as the Meteorological Office and Ordnance Survey operate in the information market
  • Derek Twigg at the MoD (which has some say over the UK Hydrographic Office and Met Office)
  • Lord (Jeff) Rooker at Defra, who will have to oversee the implementation of Inspire
  • Baroness Andrews at DCLG, which includes overseeing Ordnance Survey
  • Ed Miliband at Cabinet Office, who’ll have to look after web 2.0 and the implementation of the ideas accepted from The Power Of Information report.

We’ve linked to their Parliamentary profiles on TheyWorkForYou so you have some idea of what they’ve been interested in previously… though of course that’s not necessarily a guide to how they’ll act as ministers.

We’ll have an announcement regarding Michael Wills in a later post.

Europe’s Galileo satellite program faces more obstacles

Wednesday, July 11th, 2007

In the recent debate in the House of Commons – which included a useful potted history of the US’s Global Positioning System – it was pointed out, or at least alleged, that GPS now underpins the entire US economy, yet nobody could come up with a convincing reason why we need Galileo too.

Owen Paterson, for the Tories, said:

Last week, I had meetings with representatives of Trafficmaster plc, a highly successful company selling navigation services to more than 100,000 vehicles in the UK. Its technical director, Christopher Barnes, said that

“the free to air GPS service is sufficient for vehicle navigation and therefore we are unlikely to be interested in paying (either voluntarily or through a compulsory tax) to use a European service, even if technically it would be better.”

There is extremely limited application for the higher accuracy that Galileo will offer and, in any event, any such advantage will last only until the US deploys Block III Navstar, which promises equivalence.

(That’s an interesting statistic: 100,000 vehicles using Trafficmaster. And that’s only one of the many satnav systems on sale. How much taxable revenue does GPS – a free government data service – generate?)

Now the Guardian notes that Galileo faces further obstacles: in “Funding row pushes GPS system further off course“, it quotes Olivier Houssin, head of the commercial and security operations of French electronics group Thales, saying Europe runs the risk of being left behind in key commercial and military applications by the US, China, Russia and India if it doesn’t back Galileo.

“If Galileo collapses it will be the collapse of the most important EU programme outside the common agricultural policy,” he said in an interview. “Europe is stagnating in space.” EU transport ministers agreed last month to scrap the public private partnership for building and running the 30-satellite Galileo system, which promises greater accuracy than the American GPS, to control air and road traffic. It will also provide enhanced civil security and even help to pilot driverless trains. Mr Houssin dismissed the PPP/PFI as “a false good idea” because Galileo was a “strategic infrastructure” that Europe had to fund publicly. He said the French and Germans were now at loggerheads over how to provide the extra €2.4bn required.

Berlin favours making extra voluntary contributions to the European Space Agency, which is in charge of the overall project, in return for a greater share of the workload. “They want to take over the technological leadership of the programme and centre it around the activities of Astrium [the space arm of EADS] in Munich,” he said.

The French would prefer to see the project financed as an EU investment, with cash sourced from other European commission budgets.

Britain, which still clings to the notion of a PPP/PFI, has refused to give the go-ahead for switching entirely to public funds.

It is very difficult, as Owen Paterson pointed out, to find a way in which Galileo is necessary when GPS III is on the way – unless, that is, its real value lies in military application. But that supposes that we are not an ally of the US, which is a very peculiar worldview to start from.

(Thanks to Rob for pointing to the debate.)

Environment Agency gives its reasons for stopping flood data being used

Thursday, July 5th, 2007

Today’s Guardian Technology includes “Rising tide of frustration at flood maps’ restrictions“, following on from the decision by the Environment Agency (for England and Wales) to forbid OnOneMap from republishing a scraped version of its flood map data – the sort of data one would like to have when considering where to buy or rent next.

Interestingly, OnOneMap had also – it tells us – scraped the Scottish Environment Protection Agency’s flood map data (which doesn’t use the same criteria as the England and Wales map; some joined-up thinking needed?) but hadn’t started using it.

So why is the flood map data not available beyond the EA site? The EA says it is – but at a price. “The charge for commercial use of the whole Flood Map dataset for England and Wales starts at £4,000 per year. We will charge a royalty fee on those companies that sell the information on,” the agency said.

“More than 50 commercial companies are licensed to use the flood map and/or our other flood risk data. Hundreds more commercial licences are issued for use of flood risk data for local studies. However, some do not pay because they are statutory customers, such as gas and electricity companies and utilities.” The data has been available since 2001.

Philip Sheldrake, managing director of the OnOneMap site, thinks it’s daft.

One irony Sheldrake points out is that the SEPA data is held in a different format – meaning that OnOneMap was the first service that pulled all this information together in one place (although it had not started offering the Scottish flood data when the EA complained).

“We have not applied for a licence [from the EA] as we believe the data should be in the public domain and moreover they have advised us that their licence terms still would not permit us to provide the data to the public via a website!” Sheldrake says.

Flood data could be provided as an XML feed for all (guaranteeing it would be up to date) and thus displayed on any map overlay people wanted; instead we’re stuck – at present – with it pasted onto Ordnance Survey bitmaps. Is this really the best we can do?
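
To make the idea concrete: if the flood zones were published as a simple XML feed, anyone could parse it and draw the polygons on whatever map they liked. The feed format below is entirely hypothetical – the Environment Agency publishes no such feed – but a minimal sketch of the consuming side might look like this:

```python
# Sketch of parsing a *hypothetical* flood-zone XML feed.
# The element names and risk bands are invented for illustration only.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<floodZones updated="2007-07-05">
  <zone id="z1" riskBand="significant">
    <polygon>51.501,-0.142 51.503,-0.139 51.500,-0.137</polygon>
  </zone>
  <zone id="z2" riskBand="moderate">
    <polygon>51.510,-0.150 51.512,-0.148 51.509,-0.146</polygon>
  </zone>
</floodZones>"""

def parse_zones(feed_xml):
    """Return a list of (zone id, risk band, [(lat, lon), ...]) tuples."""
    root = ET.fromstring(feed_xml)
    zones = []
    for zone in root.findall("zone"):
        # Each polygon is a whitespace-separated list of "lat,lon" pairs.
        points = [tuple(float(c) for c in pair.split(","))
                  for pair in zone.findtext("polygon").split()]
        zones.append((zone.get("id"), zone.get("riskBand"), points))
    return zones

for zone_id, band, points in parse_zones(SAMPLE_FEED):
    print(zone_id, band, len(points), "points")
```

The point of the exercise is that, once the data exist in a neutral format like this, the choice of base map becomes the consumer’s, not the publisher’s.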

Catching up: government responds to OFT and Power of Information reports

Thursday, July 5th, 2007

Should have posted this earlier, but I was wondering whether the disappearance of the DTI would kill the links. (It hasn’t – they still work; the department just has a different name. By their web addresses ye shall know them…)

Anyway, on the last Monday of June the government finally replied to the OFT “CUPI” report on the Commercial Use of Public Information, and to Tom Steinberg’s and Ed May’s Power Of Information report.

The response to CUPI is from the DTI or, as it’s now known, the Department for Business, Enterprise and Regulatory Reform. The PDF response is at (78kB).

The Cabinet Office response to the Power Of Information is at a page titled “Power Of Information principles get go-ahead from Government.”

There’s also a “Have your say” page attached to the Cabinet Office page; see what you think.

The key question though: will trading funds be forced to justify themselves? Yes. The government is asking for an investigation into the cost-benefit justification for organisations working as trading funds.

We reported this as “Government on the back foot over policies for pricing data“:

The long-awaited reports, from the Department of Trade and Industry and the Cabinet Office, recommend that the government make more data available without strings to community and commercial ventures. The Cabinet Office also paves the way to government sites opening self-help forums for citizens, and civil servants engaging openly in independent forums, blogs and wikis.

Both reports, however, avoid the central demand of Technology Guardian’s Free Our Data campaign – that government should stop running information businesses, and instead give all data away to stimulate the knowledge economy.

On the plus side,

Notably, it urges Ordnance Survey to launch its postponed OpenSpace project, which would provide a Google Maps-like interface for mashups, by the end of December.


However, the response warns that more ambitious steps towards free data could threaten the business of trading funds and their “vital role in the UK economy”. It agrees with the Power of Information review, which says that the government should get hard data from an independent study of the costs and benefits of the trading fund model.

So we’ll ask for an on-the-record ministerial interview once we’ve figured out who, under the new regime, is responsible for Cabinet Office, trading funds and public sector information…

In today’s Guardian: Galileo to be publicly-funded – but why compete with GPS?

Thursday, June 21st, 2007

Today’s Guardian Technology section wonders “Will Galileo ever achieve orbit?” That being the project to create a GPS-alike system, but funded from Europe rather than the US.

The costs involved are large –

a 2001 report commissioned by the EU estimated that developing and deploying Galileo would cost €3.4bn (£2.3bn). Philip Davies, senior account manager at Surrey Satellite Technology Ltd, says he’s seen estimated running costs in the range of €8bn to €10bn over 20 years. These numbers will be part of what is scrutinised between now and September, the deadline the Transport Council has given the European Commission to come up with alternative proposals for funding and managing the system.

GPS-style systems have the benefit that they’re government-generated data made available for free: that creates private-sector opportunities. Very large ones:

The business plan published by the Galileo Joint Undertaking at the outset of all this estimated that the market for satellite navigation applications would grow from €30bn in 2004 to €276bn by 2020. This estimate was conservative compared to some of those in the report the EU commissioned in 2001 from PricewaterhouseCoopers, which projected a market of €276bn by 2015 for personal communications and location services.

Clearly there’s a strong multiplier at work in GPS – comparatively small government input produces big private-sector revenues (and hence taxes, which in theory pay for GPS), as well as all sorts of other benefits: reducing congestion, making people easier to locate if they’re in danger, stopping planes colliding. Putting a value on the bad things that don’t happen is hard, but it must be counted as part of the benefit of satellite location.
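
A rough back-of-envelope version of that multiplier can be sketched from the figures quoted in this post – the €3.4bn build estimate, the €8bn-€10bn running-cost range (here taken at its midpoint, which is our assumption), and the €276bn market projection for 2020. The comparison is crude: the market figure covers the whole satellite-navigation market, not Galileo alone.

```python
# Back-of-envelope multiplier; all values in billions of euros.
build_cost = 3.4                 # 2001 EU-commissioned estimate
running_cost = (8 + 10) / 2      # midpoint of SSTL's 20-year range (assumption)
total_public_cost = build_cost + running_cost

projected_market_2020 = 276      # Galileo Joint Undertaking estimate

multiplier = projected_market_2020 / total_public_cost
print(f"Public outlay: about EUR {total_public_cost:.1f}bn")
print(f"Projected market: EUR {projected_market_2020}bn")
print(f"Implied multiplier: roughly {multiplier:.0f}x")
```

Even on these loose numbers, a public outlay on the order of €12bn set against a projected market in the hundreds of billions makes the multiplier argument easy to see.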

Galileo remains something of a me-too, but it’ll be interesting to see whether the EU decides to focus on its public and private-sector benefits as a justification for the spending – or whether instead it just kills it. Somehow, it doesn’t look likely to kill it.

“Power of Information” review from Cabinet Office: government could do more with our data

Thursday, June 7th, 2007

Tom Steinberg of MySociety (behind theyworkforyou) has co-authored an important new report, “The Power of Information Review” which suggests that online advice sites could improve citizen empowerment.

But it does much more: it looks at government use of public data and suggests more could be done, including beefing up the Office of Public Sector Information.

We’ll have more when we’ve read it in detail.

Tom Steinberg’s blogged his brief comments; here’s the official PDF; and here’s a site where you can comment on the report, rather as you can with Hansard discussion on theyworkforyou.

Meanwhile today’s Guardian looks at the collapse of the National Spatial Address Infrastructure effort in “Address plan finally abandoned“. RIP NSAI – you tried but didn’t have the right backing, ultimately.

If councils move to Google Maps does that help or hinder Ordnance Survey?

Thursday, May 31st, 2007

Today’s Guardian Technology looks at how a number of councils, notably including the London Borough of Brent, and even some central government organisations, are moving to use Google Maps for their consumer-facing displays of map data.

In Councils bypass Ordnance Survey for Google Maps, Heather Brooke looks at the shift, which councils are making because in the first instance, Google Maps is free and comparatively easy both to program and use:

Traditional geographical information systems provide “complex data, complex systems”, said Dane Wright, IT service manager at Brent council in north London, at the annual conference of GIS in the Public Sector earlier this month. Google Maps, by contrast, provides “complex data, simple systems”.

Wright told the conference: “What we are doing is moving to Google Maps as the primary interface for casual use by public users. This will leave the GIS system for more specialist users. The reason for doing this is to provide a better user experience – familiar interface, easy to use, integrated aerial imagery, attractive, no need for training or large manuals.”

But, you say, OS is the source for pretty much all of Google Maps’ data – and where it isn’t, the company that supplies it sources that data from OS.

That’s true – but it does mean that OS becomes vulnerable if Google decides that it would like to shift to someone else for its mapping data. And without knowing the precise details of the Google Maps licence with OS – does it pay per map displayed, per frame downloaded, or is it a lump sum? – one has to wonder what the effect will be.

Meanwhile, even the government’s Directgov system for finding a school in a locality uses Google Maps (although a number of other Directgov systems don’t). Other examples of Google Maps (or indeed Yahoo! Maps or Live Local Maps) being used rather than OS for customer-facing products are welcome. Seen any?

Postcodes: local authorities vs Royal Mail still arguing; want to sign a petition?

Thursday, May 24th, 2007

We’ve got intellectual property rows, a petition, and a question you might be able to answer this time.

This week’s Guardian returns, in Royal Mail fails to address database issue, to the vexed question of whether RM will let local authorities retain some intellectual property in the addresses they provide to it, as well as paying them for it – neither of which happens at present.

According to leaked letters we’ve seen, RM isn’t in favour. But the authorities are. Impasse. And as the article points out,

The saga provides a graphic example of an issue at the heart of Technology Guardian’s Free Our Data campaign – the bureaucracy and waste that ensue when state bodies treat vital data as an asset that must be made directly profitable.

The addresses go into the Postcode Address File, which unlike thousands of post offices

is profitable, making £1.58m on revenues of £18.36m in 2005-06 (Royal Mail’s postcode database reveals its profitable side, April 26). Councils in England and Wales spend about £2.5m a year on postcodes (paid to Ordnance Survey and commercial businesses, as well as Royal Mail).

After protests last year over price rises, Royal Mail said that it would consider “reasonable remuneration” to local authorities supplying data on new addresses. But negotiations on the terms have foundered over intellectual property rights.

This one could run and run – and already has.

This ludicrous position is a result of conflicting responsibilities placed on state-owned bodies. As a commercial (albeit state-owned) enterprise, Royal Mail has to make its assets pay, and that includes a national resource such as postcodes. Local authorities are being squeezed by council-tax caps and efficiency targets; selling data is thus a rare new source of income. The agency, meanwhile, wants to ensure the future of the National Land and Property Gazetteer, which it sees as a vital tool for modernisation. Its commercial contractor, Intelligent Addressing, is embroiled in a separate dispute with Ordnance Survey over addressing data.

The Free Our Data campaign proposes that all public bodies free up their address databases, funded from central taxation. We’re not alone. A petition urging the prime minister “to end the address dispute between local government, Royal Mail and Ordnance Survey” has 370 signatures.

In case you haven’t been to see (or sign) it, here’s the address mess petition. (It was started by Robert Kimber of Luton council; he’s been quoted here earlier.) We found it via the NLPG’s April e-zine – which is itself an interesting byway. The NLPG, of course, is the one which handles all the addresses (except for Birmingham’s. Why doesn’t Birmingham belong to the NLPG? Answers in a comment, please.)

Environment Agency charges for data that were free – creating risk to water

Thursday, May 17th, 2007

In “Free groundwater information dries up“, today’s Guardian looks at the example that was passed to us of the Environment Agency, which used to make available the data about the location of “source protection zones” – essentially, areas around groundwater sources which must be protected from pollution to avoid contamination of drinking water supplies.

One-third of the water Britons drink comes from the ground; in dry and crowded south-east England that proportion rises to three-quarters. But groundwater is a limited resource and vulnerable to pollution. Contaminants can be difficult to detect until it is too late. According to the Environment Agency, the main danger comes from constant small leaks of sewage, agricultural chemicals and oil.

The first step to safeguarding our hidden groundwater is to tell people where it is. For this purpose, the agency has identified some 2,000 “source protection zones” around wells, boreholes and springs supplying public drinking water. These zones cover the total area of land needed to support removal of water from a source. They identify places where there is a risk of contamination and allow the authorities to monitor and control polluting activities.

Those data used to be available free as a download from its website; consultants and others considering where and what to do in places they suspect might be SPZs could consult them. Government-collected data, available for free, with a public benefit in its being free.

Now the EA is charging companies for the data:

One consultant says he has been quoted £750 for an annual licence to access data on source protection zones. The agency confirmed that it charges business users. The policy began “a couple of years ago”, it added.

Strangely, it seems to say that individuals can get it for free. (How about self-employed ones?) To which consultants reply that this will probably lead to companies taking the cheaper way out – using the old datasets. But because SPZs can change, that implies a risk to groundwater.

From the article:

Technology Guardian’s Free Our Data campaign, which argues that all impersonal electronic data collected by the government in the course of its public duties be made available free to all comers, agrees. Apart from the direct risks arising from data not being available, there is also a chilling effect on the wider knowledge economy: innovative ways of disseminating these data may never be developed if they remain controlled by government.

There is also a practical issue: does the revenue from licensing to conscientious professionals (for unscrupulous ones may find their own sources) really outweigh the cost of administering and policing the charging regime? And is there an overall benefit beyond any (undemonstrated) financial one? With data held on a web server, issues of scarcity do not exist; unlike a well, a server will never run dry of the necessary 0s and 1s to make a copy of a dataset. Yet the Environment Agency is seeking to impose an artificial constraint on the supply of this data without any evidence that such a constraint is necessary.

Note: this is, as so many of the examples in this campaign are, an example – sent in by a reader – of charging for data that arguably would be better free. We’re always grateful for these, and fascinated by how wide and deep your experience goes. Please do let us know, either by leaving a comment here or emailing me ( works well), about other examples of data being charged for that should be free. The government is beginning to listen, we think.

How home information packs give local government a pricey monopoly

Thursday, May 10th, 2007

In today’s Guardian, “Government ducks the issue on property search data” looks at how the arrival of home information packs will give local councils a monopoly in land searches for properties. The trouble is, it’s not a monopoly they’re entirely happy having, because it ends up costing them.

However, local authorities can be sticky about releasing “unrefined” information. Nearly two years ago an investigation by the Office of Fair Trading (OFT) found that one in 10 authorities provided no access to records of Town and Country Planning Act notices – even though by law this information should be openly available for free. A personal search company told the office that one third of local authorities do not provide information on highway developments.

Councils argue that they are in a cleft stick. The fee levels set by central government for access to certain items of unrefined data do not cover the cost of dealing with personal search companies. On average, each request takes council staff 70 minutes. They therefore have to subsidise this service out of fees charged for compiled searches. Charges range from £55 to £269, with an average cost of £119.

What’s the problem?

This is another example of what happens when a public body with a monopoly in raw data tries to sell products compiled from that data.

Couldn’t the government do something definitive?

The government responded this month with a consultation document on “Good Practice Guidance” that urges councils to play fair. One key concern of the OFT – price transparency – is ducked entirely. This will be subject to “a further consultation exercise later this year”.

It’s strange how the government keeps putting off big decisions like this. It’s almost as if they were waiting for someone important to leave, or someone important to take on a new job, so that things could settle down.

Meanwhile we’re expecting the DTI response to the OFT report on public sector information to be released any day now. Let’s hope it’s not bad news – it might get buried.

APPSI comes out in favour of Ordnance Survey on addressing – but it’s two-edged

Thursday, May 3rd, 2007

Deep waters here: this is a case where what goes on in public is more subtle than at first appears. Read on, and you’ll find – we think – that the government is being forced to define precisely where the Ordnance Survey’s “national obligation” ends and its “commercial” (that is, optional, non-core) activities begin…

Today in The Guardian we report on how the Advisory Panel for Public Sector Information has determined in favour of Ordnance Survey in the row between OS and Intelligent Addressing, which runs the National Land and Property Gazetteer (NLPG).

The APPSI said that it couldn’t really rule on the matter – which drew what could be seen as an affronted response from the Office of Public Sector Information, which said that APPSI was making a “literal interpretation” of the rules governing PSI.

As the article explains,

Intelligent Addressing, which operates a gazetteer compiled and run by local councils, complained in February 2006 about the way Ordnance Survey licenses its address database, called AddressPoint.

Intelligent Addressing complained to OPSI, formerly Her Majesty’s Stationery Office, which oversees two compliance mechanisms: the public sector information regulations and a “fair trader” scheme. In July, OPSI’s report backed some of the firm’s complaints. Both sides then asked the APPSI, an expert group responsible to the Department for Constitutional Affairs, to review the findings.

In a 17-page report published on Monday, the advisory panel says that Ordnance Survey’s AddressPoint product is not part of the mapping agency’s “public task”. As such, it cannot breach regulations covering the supply of public sector information. The APPSI recommends that the company take any further complaints to the Office of Fair Trading (OFT).

Here’s the interesting point. If AddressPoint isn’t something that OS needs to do (because it’s not “public task”), then that must be something that lies on the commercial, not PSI, side of its operations. In which case there must be other data that needs to be defined as being the public task.

But equally, one would expect that – like the Met Office, which has to charge itself fairly for the data it collects and then resells – this will prevent the OS cross-subsidising itself. In effect, it would strip what the OS does back to a “public task” bone.

Richard Susskind, the head of APPSI, may have made a clever move by palming this off. It in effect forces OPSI to determine what OS’s public task is, and what the limits of PSI are.

We’ll watch with interest.

Revealed: how profitable Royal Mail’s Postcode Address File (PAF) really is

Thursday, April 26th, 2007

The Royal Mail’s Postcode Address File (PAF) is pretty much indispensable for anyone doing direct marketing work. Now the Postcomm report on PAF that we mentioned last week has come up with something not seen before, at least outside RM: the profit and loss accounts for PAF. (Postcomm is the postal regulator, independent of RM.)

Turns out that PAF makes a profit of £1.58m on revenues of £18.36m – an 8.6% return on revenue. Most of the revenue comes from PAF resellers (£14.9m). You’ll have to read as far as Annex 5 of the Postcomm report to find the figures.

But we’ve looked at what this means in The Guardian. PAF underpins billions of pounds of the Royal Mail’s business, the direct marketing business, and the burgeoning market for satellite navigation (from memory, worth about £400m last year – corrections welcomed).

That profit, however, is made without any payment to local authorities, who have been cutting up rough about having no intellectual property rights recognised as suppliers of new address data. (There’s no information from Postcomm about how much PAF revenue comes from the public sector, which would be a useful figure. Royal Mail was chary about providing any information on PAF, one discovers on reading the report.)

RM told us it is “confident” about reaching a settlement. But isn’t this just more of a money-go-round, if they’re paid for supplying data to a product they then buy? Wouldn’t it make more sense to have PAF centrally funded, ringfenced by Postcomm’s recommendations and available for free to all? You might even encourage people to offer updates directly – which would cut the cost of maintenance, which at £9m or so comprises more than half the cost of running PAF for RM.
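
The arithmetic behind the figures quoted in this post is worth laying out, since it shows how small PAF’s margin is relative to its maintenance bill (the £9m is Royal Mail’s approximate figure; treating it as exact is our simplification):

```python
# Checking the PAF figures quoted above (all in GBP millions, 2005-06).
revenue = 18.36
profit = 1.58
maintenance = 9.0   # "GBP 9m or so" -- approximate

costs = revenue - profit
print(f"Return on revenue: {profit / revenue:.1%}")        # the 8.6% quoted
print(f"Total running cost: GBP {costs:.2f}m")
print(f"Maintenance share of cost: {maintenance / costs:.0%}")
```

So maintenance does indeed account for more than half of PAF’s running costs – which is why crowdsourced address updates, by cutting that line item, could plausibly outweigh the £1.58m profit that charging brings in.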

Could free data have helped the Rural Payments Agency?

Thursday, April 19th, 2007

The Rural Payments Agency has quickly become famous, at least in Westminster circles, for losing £20m of our money and seeing its head fired for the failure which left farmers out of pocket.

This week’s Technology Guardian looks at the extent to which bad mapping led to the problems, and asks whether free data – on the lines we’re advocating – could have averted it.

The answer in truth is “not on its own”, because the RPA was a many-headed failure: the principal cause of problems was that the RPA decided to go beyond the European recommendations and provide payments for parcels of land as small as 0.1 hectare (1,000 square metres) – one-third of the minimum size required under European Union rules.

That followed all sorts of problems where maps were inaccurate or out of date, and where changes noted by farmers weren’t incorporated.

Maps printed from the Land Register were sent to every farmer claiming subsidy to check. According to Julie Robinson, a lawyer with the National Farmers’ Union, this is where the system went wrong. “Many of the maps sent back to farmers to check turned out to be seriously inaccurate.” The maps missed land lost to floods, hedges and shadows from lines of trees. “It is all at the mercy of accurate mapping. The farmer depends on them to get it right.” The main problem, she says, was that the system was not matched to the needs of the users.

Somehow we do feel that the Ordnance Survey and its MasterMap would have coped better with the problem, as well as giving the Land Registry a huge boost in its attempts to find out who owns which parts of England. (Scotland and Wales coped better.)

Could free mapping data have prevented the disaster? Probably not – mapping was only one factor in a complex mess of policy and management failures. But the fact that Defra was allowed to commission its own geographical database to an unworkably high specification suggests flaws in the government’s current way of working.

There are two opportunities for change. One is a new geographical information strategy for the UK, now before ministers and expected to be published this summer. The second is the process of implementing the European Inspire directive, to create a “geospatial data infrastructure” across Europe. The lead department in transposing this directive into UK law is Defra. We suggest that when implementing Inspire it errs on the side of openness.

In print: Canada’s maps go free – but here’s more background: it’s not so simple

Thursday, April 12th, 2007

Today’s Guardian has Canada drops licences and adopts free model for map data, which makes pretty much all the same points (possibly fewer, due to the limits of print space) as this previous post.

Since writing it however I’ve also been contacted by Tracey Lauriault, of the Geomatics and Cartographic Research Centre, Department of Geography and Environmental Studies at Carleton University in Ottawa. She points out a number of things about the ‘free mapping’ movement in Canada, which I’ll quote at length since they’re all worth absorbing.

The short form, though, is: Canada’s federal maps might be free, but the really useful data lie closer to the local level – and those are still charged for, quite substantially in some cases. Here’s what Tracey said by email, quoted with her permission (since sometimes people don’t want emails reprinted):

I applaud what NRCan has done with its national framework data. Do keep in mind that the NRCan topo data that was just made available is out of date and NRCan almost closed the office – read this Hill Times article (subscription required for full text). We suspect they are making these data public and free to avoid having to continue the provision of paper maps. Canada is a country of wilderness lovers and tons of outdoors enthusiasts (canoers, campers, trekkers, hunters), forests, tundra, mines etc. where people go to very remote areas and just cannot navigate using a blackberry screen, or where there is no satellite coverage. See how the population is distributed along the US-Canada border to understand how critical it is to have up to date topo maps in rural and remote areas – which is most of the country!

GeoBase ( is very innovative indeed with how it is distributing national scale framework data and Geogratis ( distributes old, out-of-date free data. Both are part of the Canadian Geospatial Data Infrastructure. Both are going in the right direction; however, the good stuff we all want is not there. And both are setting the bar high for other Canadian organizations.

Also read the Maps for Canadians site by Heather McAdam, the GIS specialist for MADGIC (Maps Data and Government Information Centre) at Carleton University. [There is more than one MADGIC in Canada – CA] Heather ran an excellent one-woman campaign and mobilized our behind-the-times mapping associations in Canada… She is the reason for the topo map announcement.

Canada is a huge country, with tons of geography and few people with three levels of government and division of powers – Federal, Provincial and Municipal. The feds have released the national framework maps, the shells of the national scale. The data to fill the shells such as health canada, statistics canada, environment canada etc. are by no means free or accessible or void of crown copyright. Also, to do any useful analysis at the local scale we need the provincial and municipal data sets which are harder to get – Manitoba being an exception. Other provinces sell the data at a very high price or with very restrictive use policies.

Canada positions itself as being in between UK and US with cost recovery but not extreme like the UK. Statistics canada is very close to the UK model.

The PR person for NRCan was very clever with their headline but this is not a huge breakthrough, just a nice press release, Geogratis and Geobase are far more interesting.