Free Our Data: the blog

A Guardian Technology campaign for free public access to data about the UK and its citizens

Archive for 2007

Europe’s Galileo satellite program faces more obstacles

Wednesday, July 11th, 2007

In the recent debate in the House of Commons it was pointed out – or at least alleged – that the US’s Global Positioning System now underpins the entire US economy (the debate includes a useful potted history); yet nobody could come up with a convincing reason why we need Galileo too.

Owen Paterson, for the Tories, said:

Last week, I had meetings with representatives of Trafficmaster plc, a highly successful company selling navigation services to more than 100,000 vehicles in the UK. Its technical director, Christopher Barnes, said that

“the free to air GPS service is sufficient for vehicle navigation and therefore we are unlikely to be interested in paying (either voluntarily or through a compulsory tax) to use a European service, even if technically it would be better.”

There is extremely limited application for the higher accuracy that Galileo will offer and, in any event, any such advantage will last only until the US deploys Block III Navstar, which promises equivalence.

(That’s an interesting statistic: 100,000 vehicles using TrafficMaster. And that’s only one of the many satnav systems on sale. How much taxable revenue does GPS – a free government data service – generate?)

Now the Guardian notes that Galileo faces further obstacles: in “Funding row pushes GPS system further off course”, it quotes Olivier Houssin, head of the commercial and security operations of French electronics group Thales, saying Europe runs the risk of being left behind in key commercial and military applications by the US, China, Russia and India if it doesn’t back Galileo.

“If Galileo collapses it will be the collapse of the most important EU programme outside the common agricultural policy,” he said in an interview. “Europe is stagnating in space.” EU transport ministers agreed last month to scrap the public private partnership for building and running the 30-satellite Galileo system, which promises greater accuracy than the American GPS, to control air and road traffic. It will also provide enhanced civil security and even help to pilot driverless trains. Mr Houssin dismissed the PPP/PFI as “a false good idea” because Galileo was a “strategic infrastructure” that Europe had to fund publicly. He said the French and Germans were now at loggerheads over how to provide the extra €2.4bn required.

Berlin favours making extra voluntary contributions to the European Space Agency, which is in charge of the overall project, in return for a greater share of the workload. “They want to take over the technological leadership of the programme and centre it around the activities of Astrium [the space arm of EADS] in Munich,” he said.

The French would prefer to see the project financed as an EU investment, with cash sourced from other European commission budgets.

Britain, which still clings to the notion of a PPP/PFI, has refused to give the go-ahead for switching entirely to public funds.

It is very difficult, as Owen Paterson pointed out, to find a way in which Galileo is necessary when GPS III is on the way – unless, that is, its real value lies in military application. But that supposes us not being an ally of the US, which is a very peculiar worldview to start from.

(Thanks to Rob for pointing to the debate.)

Environment Agency gives its reasons for stopping flood data being used

Thursday, July 5th, 2007

Today’s Guardian Technology includes “Rising tide of frustration at flood maps’ restrictions”, following on from the decision of the Environment Agency (for England and Wales) to forbid OnOneMap from republishing a scraped version of its flood map data – which is exactly the sort of data one would like to have when considering where to buy or rent next.

Interestingly, OnOneMap had also – it tells us – scraped the Scottish Environment Protection Agency’s map data (which doesn’t use the same criteria as the England and Wales agency; some joined-up thinking needed?) but hadn’t started using it.

So why is the flood map data not available beyond the EA site? The EA says it is – but at a price. “The charge for commercial use of the whole Flood Map dataset for England and Wales starts at £4,000 per year. We will charge a royalty fee on those companies that sell the information on,” the agency said.

“More than 50 commercial companies are licensed to use the flood map and/or our other flood risk data. Hundreds more commercial licences are issued for use of flood risk data for local studies. However, some do not pay because they are statutory customers, such as gas and electricity companies and utilities.” The data has been available since 2001.

Philip Sheldrake, managing director of the OnOneMap site, thinks it’s daft.

One irony Sheldrake points out is that the SEPA data is held in a different format – meaning that OnOneMap was the first service that pulled all this information together in one place (although it had not started offering the Scottish flood data when the EA complained).

“We have not applied for a licence [from the EA] as we believe the data should be in the public domain and moreover they have advised us that their licence terms still would not permit us to provide the data to the public via a website!” Sheldrake says.

The flood data itself? It could be provided as an XML feed for all (guaranteeing it would be up to date) and thus displayed on any map overlay people wanted; instead we’re stuck, at present, with it pasted onto Ordnance Survey bitmaps. Is this really the best we can do?
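To sketch the XML-feed idea: the feed schema, element names, zone identifiers and coordinates below are entirely invented for illustration – the Environment Agency offers no such service – but they show how trivially a third-party site could parse a live feed and re-plot the zones on any map it liked.

```python
# Hypothetical sketch only: an invented XML feed of flood-risk zones, and
# the few lines a site like OnOneMap would need to parse it. Nothing here
# reflects a real Environment Agency data format.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<?xml version="1.0"?>
<floodZones updated="2007-07-05">
  <zone id="ZONE-0001" risk="significant">
    <postcode>GL7 2BX</postcode>
    <centre lat="51.719" lon="-1.968"/>
  </zone>
  <zone id="ZONE-0002" risk="moderate">
    <postcode>OX10 0HA</postcode>
    <centre lat="51.599" lon="-1.125"/>
  </zone>
</floodZones>
"""

def parse_zones(xml_text):
    """Return a list of dicts, one per flood zone in the feed."""
    root = ET.fromstring(xml_text)
    zones = []
    for zone in root.findall("zone"):
        centre = zone.find("centre")
        zones.append({
            "id": zone.get("id"),
            "risk": zone.get("risk"),
            "postcode": zone.findtext("postcode"),
            "lat": float(centre.get("lat")),
            "lon": float(centre.get("lon")),
        })
    return zones

if __name__ == "__main__":
    for z in parse_zones(SAMPLE_FEED):
        print(f"{z['postcode']}: {z['risk']} risk at ({z['lat']}, {z['lon']})")
```

Any web map – Google Maps included – could then drop a marker or shade a polygon at each parsed coordinate, and a live feed would always be exactly as current as the agency’s own site.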

Catching up: government responds to OFT and Power of Information reports

Thursday, July 5th, 2007

Should have posted this earlier, but I was wondering whether the disappearance of the DTI would kill the links. (It hasn’t – they still work; the department just has a different name. By their web addresses ye shall know them…)

Anyway, on the last Monday of June the government finally replied to the OFT “CUPI” report on the Commercial Use of Public Information, and to Tom Steinberg’s and Ed Mayo’s Power Of Information report.

The response to CUPI is from the DTI, or as it’s now known the Dept for Business, Enterprise and Regulatory Reform. The PDF response is at (78kB).

The Cabinet Office response to the Power Of Information is at a page titled “Power Of Information principles get go-ahead from Government.”

There’s also a “Have your say” page attached to the Cabinet Office page; see what you think.

The key question, though: will trading funds be forced to justify themselves? Yes. The government is asking for an investigation into the cost-benefit justification for organisations operating as trading funds.

We reported this as “Government on the back foot over policies for pricing data”:

The long-awaited reports, from the Department of Trade and Industry and the Cabinet Office, recommend that the government make more data available without strings to community and commercial ventures. The Cabinet Office also paves the way to government sites opening self-help forums for citizens, and civil servants engaging openly in independent forums, blogs and wikis.

Both reports, however, avoid the central demand of Technology Guardian’s Free Our Data campaign – that government should stop running information businesses, and instead give all data away to stimulate the knowledge economy.

On the plus side,

Notably, it urges Ordnance Survey to launch its postponed OpenSpace project, which would provide a Google Maps-like interface for mashups, by the end of December.


However, the response warns that more ambitious steps towards free data could threaten the business of trading funds and their “vital role in the UK economy”. It agrees with the Power of Information review, which says that the government should get hard data from an independent study of the costs and benefits of the trading fund model.

So we’ll ask for an on-the-record ministerial interview once we’ve figured out who, under the new regime, is responsible for Cabinet Office, trading funds and public sector information…

Environment Agency yanks flood data from OnOneMap site

Saturday, June 30th, 2007

We’ve written in the past about OnOneMap, which took the interesting step of taking the mobile phone mast data from Ofcom and making it available on its property/rental search site.

But recently it did something much more interesting – and, given the weather, useful. At the start of June it added the Environment Agency’s flood risk data to its Google Maps implementation.

The Environment Agency’s flood data is, to be honest, not as useful as it could be. You can do a postcode search, but when you then look at it, it is difficult to work out quite what’s at risk.

An article in today’s Guardian – Agency’s flood maps fail to hold water – makes this point more elegantly. It appeared in the Money section of the paper.

As the article points out,

users will find the site lacks crucial details. For example, it fails to show the location of a home in relation to the area at risk of flooding.

OnOneMap managed to grab the Environment Agency data (for England and Wales; we’re checking on Scotland) and added it as a layer to its Google Maps.

Then the Environment Agency got in touch: the data, it asserted, was its copyright, and it wasn’t happy about it being used in this way – even though OnOneMap is not (at present) a for-profit site. It asserted its ownership of database copyright in the data, which is hard to rebut, and threatened to take OnOneMap to court.

(As I write this, the news is on, saying that the estimated costs of the flooding this month are £1 billion.)

Without the resources to fight such a battle, OnOneMap removed the data – but not without making a note to that effect on its blog:

The Environment Agency claims they have copyright over the information, and despite the fact that tax-payers’ money has paid for it to be collected in the first place, apparently the tax-payer cannot benefit from innovations like our housing and flood map combination.

The comments on the post are quite illuminating, for example:

This is absolutely outrageous given our tax money has paid for this. Surely it is up to the government agencies to ensure this information is widespread, ESPECIALLY during this time period when we are being inundated with water!!


Ridiculous, especially right now, that people need to find out if their home is at risk of flooding… greedy buggers, the Environment Agency… I loved the feature on, shame you had to withdraw it

Interestingly, there is a site which does have very detailed data – gathered by Norwich Union – but you have to pay for it (unsurprising, since it cost Norwich Union something like £5m to gather). But wouldn’t it be better if the Environment Agency data was available to all of us, free, without having to go to its site?

In today’s Guardian: Galileo to be publicly-funded – but why compete with GPS?

Thursday, June 21st, 2007

Today’s Guardian Technology section wonders “Will Galileo ever achieve orbit?” That being the project to create a GPS-alike system, but funded from Europe rather than the US.

The costs involved are large –

a 2001 report commissioned by the EU estimated that developing and deploying Galileo would cost €3.4bn (£2.3bn). Philip Davies, senior account manager at Surrey Satellite Technology Ltd, says he’s seen estimated running costs in the range of €8bn to €10bn over 20 years. These numbers will be part of what is scrutinised between now and September, the deadline the Transport Council has given the European Commission to come up with alternative proposals for funding and managing the system.

GPS-style systems have the benefit that they’re government-generated data made available for free: that creates private-sector opportunities. Very large ones:

The business plan published by the Galileo Joint Undertaking at the outset of all this estimated that the market for satellite navigation applications would grow from €30bn in 2004 to €276bn by 2020. This estimate was conservative compared to some of those in the report the EU commissioned in 2001 from PricewaterhouseCoopers, which projected a market of €276bn by 2015 for personal communications and location services.

Clearly there’s a strong multiplier at work in GPS: comparatively small government input produces big private-sector revenues (and hence taxes, which in theory pay for GPS), along with all sorts of other benefits – reducing congestion, making people easier to locate if they’re in danger, stopping planes colliding. Putting a value on the bad things that don’t happen is hard, but it must be counted as part of the benefit of satellite location.

Galileo remains something of a me-too, but it’ll be interesting to see whether the EU decides to focus on its public and private-sector benefits as a justification for the spending – or whether instead it just kills it. Somehow, it doesn’t look likely to kill it.

New Zealand makes statistics data free to encourage business – but where’s the logic?

Friday, June 15th, 2007

In an announcement that we’re struggling to find much coverage of, New Zealand’s statistics are to be made available free from this August; some datasets previously cost as much as NZ$25,000 (£9,500).

That’s quite a radical move, explained by the NZ minister for statistics Clayton Cosgrove thus:

“I am pleased to announce that information to help businesses identify market opportunities, assess their competitiveness, and implement informed investment planning will be made freely available. The roll-out of information will include a host of industry-specific information for the building, retail and tourism sectors, and for importers and exporters. The data will also be useful for local authorities and communities,” Mr Cosgrove said.

And also – chiming with the Free Our Data rationale –

“Previously the information could be ordered at a cost from Statistics New Zealand, but in future, trade figures, for example, which were charged out at around $400 per customised request, or Digital Boundaries files that cost up to $25,000, will be available free.”

“The Labour-led Government is committed to giving businesses every opportunity to grow and prosper by providing the tools to support well informed decision making. Making key information available at no charge will encourage more businesses to identify new markets, for example, and plan for the future.”

Politics aside, businesses are keen on it:

Phil O’Reilly, Chief Executive of Business New Zealand, said, “Business groups have consistently advocated that this valuable information be made freely available, as it is in Australia. I am pleased the Government has taken this step.”

Well, yes, anything that reduces a direct business cost is going to be welcomed. But of course this will have to be balanced against increased spending from taxation: the NZ government is putting NZ$6m into Statistics NZ over the next four years to ease the process.

And of course in the UK, National Statistics makes all its raw non-personal data (correct me if I’m wrong) available for free, except where it’s forbidden by copyright involving mapping agencies, and only charges for custom-made datasets.

There’s some media coverage – New Zealand Herald interviews the head of Statistics NZ.

An interesting NZ government press release from August 2006, when Mr Cosgrove talked about how statistics can help small businesses. (But not if they can’t afford them, eh?) And the Salvation Army pointed out that the free data is good news for the charity sector:

‘Many of the statistics are currently prohibitively expensive for non-profit groups, so removing the charges will make available a wealth of information otherwise inaccessible.’

A full list of what’s being made available (the government press release plus accompanying PDF).

And here’s a key part, from the FAQs: the expected growth.

Have other countries done this?
Yes. Australia and Denmark have both seen big surges in use of data following similar initiatives to make statistics freely available. The Australian Bureau of Statistics reports data downloads have approximately tripled since they made similar information free in 2005.
What is the uptake of the data expected to be?
A similar upsurge in data uptake is expected in New Zealand. In 2003 Statistics New Zealand made Census information freely available on the internet. This has resulted in a significant increase in public usage from around 250 paying subscribers in 1993 to over 20,000 accesses in the last year alone. [Emphasis added – CA.] The INFOS system currently has 93 annual subscribers. Once the system is redeveloped for easy use on the web, based on international experience, usage could increase to between 1500 to 2,000 users per month, and businesses would become the predominant sector using the information.

What we can’t find is anything leading up to the announcement, nor any indication of the economic analysis that surely must have been done before embarking on this. Any pointers?

Minister confirms government response to OFT on PSI by end of June

Thursday, June 14th, 2007

Wonderful thing, theyworkforyou – it takes government (well, Parliamentary) data and repurposes it to create something more useful.

We’ve just noticed that among the written questions is one by Mark Todd, who wanted to know when the government will respond to the OFT’s CUPI report.

The response of Ian McCartney, DTI minister:

The OFT’s report contains some challenging recommendations. Given the importance and potential impact of the recommendations, and the wider constituency of interested parties both within and outside Government, more time was required to properly frame the response. The Government expect to publish their response to the report before the end of June.

We’re intrigued by the idea that there are “challenging” recommendations in the OFT report. But at least this gives us a clear indication that things are moving.

In today’s Guardian: more examination of ‘The Power Of Information’

Thursday, June 14th, 2007

Today’s Guardian looks at the Ed Mayo/Tom Steinberg report The Power Of Information and asks what sort of Whitehall it would be that could open up, and what the effects will be.

At the moment, the government’s attitude to the web is a mixture of aloofness and outright hostility. This should change, the report argues, partly because some of the most popular user-driven communities – MoneySavingExpert, for example – are closely linked to government policy. Others directly contribute to the public good: in Los Angeles, when the government started putting the results of food safety inspections online, the incidence of food-borne illnesses fell compared with that in neighbouring jurisdictions.

The government should also be involved because online communities are big users of a repository of data generated by public bodies ranging from tide tables to school league tables. The internet greatly increases the value of this information. The humble postcode, originally developed for a single purpose, now underpins countless public, private and voluntary services.

We’ve been around postcodes before, of course, but what’s interesting is that it has become a datum whose use has expanded far beyond its original intention by virtue of being mashed up with something else – geographical location. (In fact we might call postcode analysis the first mashup.)

However, Government 2.0 is not yet official policy. The Cabinet Office will respond “in due course”, officials said, almost certainly to coincide with the government’s overdue response to the Office of Fair Trading’s report on the commercial use of public-sector information.

As we said – long overdue. As in three months overdue. But it makes sense to lump these together, even if the Mayo/Steinberg report took only one-eighth the time to write by my estimate.

The snag is that the response will need approval from arms of government whose income is likely to be hit by the proposals. If Ordnance Survey or the Meteorological Office had to give away information for which they charge today, they would look to their sponsor departments, Communities and Local Government and the Ministry of Defence, to fill the gap with tax revenues. With a tight three-year spending squeeze to be launched by the Comprehensive Spending Review in October, this would not be popular.

Hmm.. but on the plus side, the report recommends that

• By March next year, the government independently review the cost and benefits of supplying public information through trading funds. This would examine the five largest trading funds, the trade-off between revenue from sales of information, the wider economic benefits of giving the data away and the potential impact on the quality of data.

• Public bodies, including trading funds, only to charge the marginal cost of distribution for raw information – which online is usually zero. The only exceptions should be where independent analysis shows that this does not serve the interests of citizens.

• All trading funds consider introducing free licences for non-commercial re-use of PSI.

• Ordnance Survey should launch its proposed OpenSpace scheme, allowing non-commercial users free access to data, by December. The service is currently on hold, the review says, because smaller commercial users object to data being made available freely to potential competitors.

That’s got to be positive, surely.

“Power of Information” review from Cabinet Office: government could do more with our data

Thursday, June 7th, 2007

Tom Steinberg of MySociety (behind theyworkforyou) has co-authored an important new report, “The Power of Information Review” which suggests that online advice sites could improve citizen empowerment.

But it does much more: it looks at government use of public data and suggests more could be done, including beefing up the Office of Public Sector Information.

We’ll have more when we’ve read it in detail…

Tom Steinberg’s blogged his brief comments; here’s the official PDF; and here’s a site where you can comment on the report, rather as you can with Hansard discussion on theyworkforyou.

Meanwhile today’s Guardian looks at the collapse of the National Spatial Address Infrastructure effort in “Address plan finally abandoned“. RIP NSAI – you tried but didn’t have the right backing, ultimately.

If councils move to Google Maps does that help or hinder Ordnance Survey?

Thursday, May 31st, 2007

Today’s Guardian Technology looks at how a number of councils, notably including the London Borough of Brent, and even some central government organisations, are moving to use Google Maps for their consumer-facing displays of map data.

In Councils bypass Ordnance Survey for Google Maps, Heather Brooke looks at the shift, which councils are making because in the first instance, Google Maps is free and comparatively easy both to program and use:

Traditional geographical information systems provide “complex data, complex systems”, said Dane Wright, IT service manager at Brent council in north London, at the annual conference of GIS in the Public Sector earlier this month. Google Maps, by contrast, provides “complex data, simple systems”.

Wright told the conference: “What we are doing is moving to Google Maps as the primary interface for casual use by public users. This will leave the GIS system for more specialist users. The reason for doing this is to provide a better user experience – familiar interface, easy to use, integrated aerial imagery, attractive, no need for training or large manuals.”

But, you say, OS is the source for pretty much all of Google Maps’ data – and where it isn’t, the company that supplies it sources that from OS.

That’s true – but it does mean that OS becomes vulnerable if Google decides that it would like to shift to someone else for its mapping data. And without knowing the precise details of the Google Maps licence with OS – does it pay per map displayed, per frame downloaded, or is it a lump sum? – one has to wonder what the effect will be.

Meanwhile, even the government’s Directgov system for finding a school in a locality uses Google Maps (although a number of other Directgov systems don’t). Other examples of Google Maps (or indeed Yahoo! Maps or Live Local Maps) being used rather than OS for customer-facing products are welcome. Seen any?

Postcodes: local authorities vs Royal Mail still arguing; want to sign a petition?

Thursday, May 24th, 2007

We’ve got intellectual property rows, a petition, and a question you might be able to answer this time.

This week’s Guardian returns, in Royal Mail fails to address database issue, to the vexed question of whether RM will let local authorities retain some intellectual property in the addresses they provide to it, as well as paying them for it – neither of which happens at present.

According to leaked letters we’ve seen, RM isn’t in favour. But the authorities are. Impasse. And as the article points out,

The saga provides a graphic example of an issue at the heart of Technology Guardian’s Free Our Data campaign – the bureaucracy and waste that ensue when state bodies treat vital data as an asset that must be made directly profitable.

The addresses go into the Postcode Address File, which unlike thousands of post offices

is profitable, making £1.58m on revenues of £18.36m in 2005-06 (Royal Mail’s postcode database reveals its profitable side, April 26). Councils in England and Wales spend about £2.5m a year on postcodes (paid to Ordnance Survey and commercial businesses, as well as Royal Mail).

After protests last year over price rises, Royal Mail said that it would consider “reasonable remuneration” to local authorities supplying data on new addresses. But negotiations on the terms have foundered over intellectual property rights.

This one could run and run – and already has.

This ludicrous position is a result of conflicting responsibilities placed on state-owned bodies. As a commercial (albeit state-owned) enterprise, Royal Mail has to make its assets pay, and that includes a national resource such as postcodes. Local authorities are being squeezed by council-tax caps and efficiency targets; selling data is thus a rare new source of income. The agency, meanwhile, wants to ensure the future of the National Land and Property Gazetteer, which it sees as a vital tool for modernisation. Its commercial contractor, Intelligent Addressing, is embroiled in a separate dispute with Ordnance Survey over addressing data.

The Free Our Data campaign proposes that all public bodies free up their address databases, funded from central taxation. We’re not alone. A petition urging the prime minister “to end the address dispute between local government, Royal Mail and Ordnance Survey” has 370 signatures.

In case you haven’t been to see (or sign) it, here’s the address mess petition. (It was started by Robert Kimber of Luton council; he’s been quoted here earlier.) We found it via the NLPG’s April e-zine – which is itself an interesting byway. The NLPG, of course, is the one which handles all the addresses (except for Birmingham’s. Why doesn’t Birmingham belong to the NLPG? Answers in a comment, please.)

Environment Agency charges for data that were free – creating risk to water

Thursday, May 17th, 2007

In “Free groundwater information dries up”, today’s Guardian looks at the example that was passed to us of the Environment Agency, which used to make available the data about the location of “source protection zones” – essentially, areas around groundwater sources which must be protected from pollution to avoid contamination of drinking water supplies.

One-third of the water Britons drink comes from the ground; in dry and crowded south-east England that proportion rises to three-quarters. But groundwater is a limited resource and vulnerable to pollution. Contaminants can be difficult to detect until it is too late. According to the Environment Agency, the main danger comes from constant small leaks of sewage, agricultural chemicals and oil.

The first step to safeguarding our hidden groundwater is to tell people where it is. For this purpose, the agency has identified some 2,000 “source protection zones” around wells, boreholes and springs supplying public drinking water. These zones cover the total area of land needed to support removal of water from a source. They identify places where there is a risk of contamination and allow the authorities to monitor and control polluting activities.

Those data used to be available free as a download from its website; consultants and others considering where and what to do in places they suspect might be SPZs could consult them. Government-collected data, available for free, with a public benefit in its being free.

Now the EA is charging companies for the data:

One consultant says he has been quoted £750 for an annual licence to access data on source protection zones. The agency confirmed that it charges business users. The policy began “a couple of years ago”, it added.

Strangely, it seems to say that individuals can get it for free. (How about self-employed ones?) To which consultants reply that this will probably lead to companies taking the cheaper way out and using the old datasets. But because SPZs can change, that implies a risk to groundwater.

From the article:

Technology Guardian’s Free Our Data campaign, which argues that all impersonal electronic data collected by the government in the course of its public duties be made available free to all comers, agrees. Apart from the direct risks arising from data not being available, there is also a chilling effect to the wider knowledge economy: innovative ways of disseminating these data may never be developed if it remains controlled by government.

There is also a practical issue: does the revenue from licensing to conscientious professionals (for unscrupulous ones may find their own sources) really outweigh the cost of administering and policing the charging regime? And is there an overall benefit beyond any (undemonstrated) financial one? With data held on a web server, issues of scarcity do not exist; unlike a well, a server will never run dry of the necessary 0s and 1s to make a copy of a dataset. Yet the Environment Agency is seeking to impose an artificial constraint on the supply of this data without any evidence that such a constraint is necessary.

Note: this is, like so many of the examples in this campaign, an example sent in by a reader of charging for data that arguably would be better free. We’re always grateful for these, and fascinated by how wide and deep your experience goes. Please do let us know, either by leaving a comment here or by emailing me, about other examples of data being charged for that should be free. The government is beginning to listen, we think.

How home information packs give local government a pricey monopoly

Thursday, May 10th, 2007

In today’s Guardian, “Government ducks the issue on property search data” looks at how the arrival of home information packs will give local councils a monopoly in land searches for properties. Trouble is, it’s not one that they’re entirely happy having because it ends up costing them.

However, local authorities can be sticky about releasing “unrefined” information. Nearly two years ago an investigation by the Office of Fair Trading (OFT) found that one in 10 authorities provided no access to records of Town and Country Planning Act notices – even though by law this information should be openly available for free. A personal search company told the office that one third of local authorities do not provide information on highway developments.

Councils argue that they are in a cleft stick. The fee levels set by central government for access to certain items of unrefined data do not cover the cost of dealing with personal search companies. On average, each request takes council staff 70 minutes. They therefore have to subsidise this service out of fees charged for compiled searches. Charges range from £55 to £269, with an average cost of £119.

What’s the problem?

This is another example of what happens when a public body with a monopoly in raw data tries to sell products compiled from that data.

Couldn’t the government do something definitive?

The government responded this month with a consultation document on “Good Practice Guidance” that urges councils to play fair. One key concern of the OFT – price transparency – is ducked entirely. This will be subject to “a further consultation exercise later this year”.

It’s strange how the government keeps putting off big decisions like this. It’s almost as if it were waiting for someone important to leave, or someone important to take on a new job, so that things could settle down.

Meanwhile we’re expecting the DTI response to the OFT report on public sector information to be released any day now. Let’s hope it’s not bad news – it might get buried.

APPSI comes out in favour of Ordnance Survey on addressing – but it’s two-edged

Thursday, May 3rd, 2007

Deep waters here: this is a case where what goes on in public is more subtle than at first appears. Read on, and you’ll find – we think – that the government is being forced to define precisely where the Ordnance Survey’s “national obligation” ends and its “commercial” (that is, optional, non-core) activities begin…

Today in The Guardian we report on how the Advisory Panel for Public Sector Information has determined in favour of Ordnance Survey in the row between OS and Intelligent Addressing, which runs the National Land and Property Gazetteer (NLPG).

The APPSI said that it couldn’t really rule on the matter – which drew what could be seen as an affronted response from the Office of Public Sector Information, which said that APPSI was making a “literal interpretation” of the rules governing PSI.

As the article explains,

Intelligent Addressing, which operates a gazetteer compiled and run by local councils, complained in February 2006 about the way Ordnance Survey licenses its address database, called AddressPoint.

Intelligent Addressing complained to OPSI, formerly Her Majesty’s Stationery Office, which oversees two compliance mechanisms: the public sector information regulations and a “fair trader” scheme. In July, OPSI’s report backed some of the firm’s complaints. Both sides then asked the APPSI, an expert group responsible to the Department for Constitutional Affairs, to review the findings.

In a 17-page report published on Monday, the advisory panel says that Ordnance Survey’s AddressPoint product is not part of the mapping agency’s “public task”. As such, it cannot breach regulations covering the supply of public sector information. The APPSI recommends that the company take any further complaints to the Office of Fair Trading (OFT).

Here’s the interesting point. If AddressPoint isn’t something that OS needs to do (because it’s not “public task”), then that must be something that lies on the commercial, not PSI, side of its operations. In which case there must be other data that needs to be defined as being the public task.

But equally, one would expect that, like the Met Office, which has to charge itself fairly for the data it collects and then resells, OS would be prevented from cross-subsidising itself. In effect, that would strip what the OS does back to a “public task” bone.

Richard Susskind, the head of APPSI, may have made a clever move by palming this off. It in effect forces OPSI to determine what OS’s public task is, and what the limits of PSI are.

We’ll watch with interest.

Revealed: how profitable Royal Mail’s Postcode Address File (PAF) really is

Thursday, April 26th, 2007

The Royal Mail’s Postcode Address File (PAF) is pretty much indispensable for anyone doing direct marketing work. Now the Postcomm report on PAF that we mentioned last week has come up with something not seen before, at least outside RM: the profit and loss accounts for PAF. (Postcomm is the postal regulator, independent of RM.)

Turns out that PAF makes a profit of £1.58m on revenues of £18.36m, an 8.6% return on revenue. Most of the revenue comes from PAF resellers (£14.9m). The figures are tucked away in Annex 5 of the Postcomm report.

But we’ve looked at what this means in The Guardian. PAF underpins billions of pounds of the Royal Mail’s business, the direct marketing business, and the burgeoning market for satellite navigation (from memory, worth about £400m last year – corrections welcomed).

That profit, however, comes without any payment to local authorities, which have been cutting up rough about having no recognition of their intellectual property rights as suppliers of new address data. (There’s no information from Postcomm about how much PAF revenue comes from the public sector, which would be a useful figure. Royal Mail was chary about providing any information on PAF, one discovers on reading the report.)

RM told us it is “confident” about reaching a settlement. But isn’t this just more of a money-go-round, if councils are paid for supplying data to a product they then buy back? Wouldn’t it make more sense to have PAF centrally funded, ringfenced by Postcomm’s recommendations and available for free to all? You might even encourage people to offer updates directly, cutting the cost of maintenance – which, at £9m or so, comprises more than half the cost of running PAF for RM.
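For what it’s worth, the figures quoted above do hang together. Here’s a minimal sketch checking them, using only the numbers given in this post and in Annex 5 of the Postcomm report (all in £m):

```python
# Sanity-check the PAF figures quoted from the Postcomm report.
revenue = 18.36    # £m: total PAF revenue
profit = 1.58      # £m: PAF profit
maintenance = 9.0  # £m: approximate cost of maintaining the file ("£9m or so")

margin = profit / revenue        # the "8.6% return on revenue"
cost = revenue - profit          # implied total cost of running PAF
maintenance_share = maintenance / cost

print(f"Return on revenue: {margin:.1%}")
print(f"Implied running cost: £{cost:.2f}m")
print(f"Maintenance share of running cost: {maintenance_share:.0%}")
```

The maintenance figure really is more than half of the implied £16.78m running cost – which is why reader-supplied updates look like such an attractive way to trim it.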