The UK’s Digital Economy Bill – failure of an Act

There seems to be a cross-party consensus that anything called a “Digital Economy” Bill will demonstrate how out of touch with digital the authors of Government policy can be – especially when they are unable to say no to special interests making a narrow argument.

The current attempt tries to enhance the digital economy with adult content filtering, and then censorship, as a “digital” fix to the social problem of people getting enjoyment from looking at porn. Good luck with that (although it’s one way for the Cabinet Office to get 25 million accounts on Verify).

In the 8 months since the Digital Economy Bill was introduced, the digital world has moved on – treating broken policy intent as damage and simply engineering around it. The Bill received Royal Assent yesterday and is now an Act.

For example, the Act introduces mandatory blocking of websites which don’t implement restrictions on access. That block operates on the domain name (so your browser can’t work out how to show you what’s on www.ParliamentPorn.uk).

Coincidentally, also yesterday, Ethereum announced the launch of its distributed-ledger implementation of DNS, which goes from a human-readable web address to the content/IP address using the data on the ledger (or at least, bits of it), rather than asking your ISP – who is now required by law to lie to you in the UK, and already routinely would in other countries.
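For the technically curious, the core of that ledger-based lookup is a deterministic “namehash”: a recursive hash that turns a dotted name into a fixed 32-byte key, which contracts on the ledger then resolve to an address – no ISP involved. A minimal Python sketch, using the standard library’s sha3-256 as a stand-in for the keccak-256 hash the real system uses (so the outputs differ from on-chain values, but the structure is the same):

```python
import hashlib

def namehash(name: str) -> bytes:
    """Recursively hash a dotted name into a fixed 32-byte ledger key.

    Real ENS uses keccak-256; hashlib's sha3_256 stands in here, so
    results differ from on-chain values, but the scheme is identical.
    """
    if name == "":
        return b"\x00" * 32  # root of the namespace
    label, _, remainder = name.partition(".")
    inner = namehash(remainder) + hashlib.sha3_256(label.encode()).digest()
    return hashlib.sha3_256(inner).digest()

# The key is deterministic, so any node consulting the ledger
# computes the same one without asking anybody's resolver.
key = namehash("parliamentporn.uk")
```

Because every client derives the same key locally, there is no resolver in the path for an ISP to be ordered to falsify.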

The Act therefore incentivises the use of censorship-resistant technologies by those most likely to be interested in looking at prohibited content… At least UK teenagers will be more engaged in the digital economy… The Bill assumed that people would use credit cards to assert their identity – because nothing could possibly go wrong with that.

Elsewhere in Ethereum

With the Ethereum browser plugin giving cryptography, services, and payments, there wouldn’t be any “hard to explain” entries in your credit card history – just another single-use address amongst many others, fully protected from state snoopers.

Ethereum contains its own currency with micropayments built in (it is the number 2 cryptocurrency behind bitcoin), and is now finalising the design of zero-knowledge proofs – so you can pay someone money with the sender, recipient, and amount all fully anonymous (a stronger guarantee than bitcoin, which is merely pseudonymous).

The systemic incentives are in many ways obvious – using internet agreements, it’s already been shown that a set of creators and contributors to a project can agree a contract that automatically shares the proceeds in a pre-agreed and privacy-preserving way – the “fair trade music industry”. Whether this project gets used or not is up to the world, but it’s possible, and possibility is what a digital economy can produce.
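The sharing itself is just arithmetic – what a ledger adds is that nobody has to trust anyone else to run it. The pre-agreed split might look like the sketch below; this is Python rather than an actual contract language, and the names and basis-point convention are illustrative, not taken from any real project:

```python
def split_proceeds(total_wei: int, shares_bps: dict) -> dict:
    """Divide a payment among contributors by pre-agreed basis points.

    Integer arithmetic only, as a ledger contract would use; any
    rounding remainder goes to the first-listed contributor.
    """
    if sum(shares_bps.values()) != 10_000:
        raise ValueError("shares must sum to 10,000 basis points")
    payouts = {who: total_wei * bps // 10_000 for who, bps in shares_bps.items()}
    remainder = total_wei - sum(payouts.values())
    first = next(iter(payouts))
    payouts[first] += remainder  # nothing is ever lost to rounding
    return payouts

# A track with two writers and a producer, split 40/40/20:
payouts = split_proceeds(1_000_001, {"writer_a": 4000, "writer_b": 4000, "producer": 2000})
```

Encoded in a contract, the split executes on every payment with no label, collecting society, or intermediary taking a cut or a copy of the data.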

As a side effect of something else, the technically possible has already undermined the Digital Economy Act’s social intent.

A viewer can access something they want to, that otherwise they couldn’t, in a manner that is fully privacy preserving, and that also ensures producers get paid. You can get a trial app for your phone here (which also does secure comms using Signal, based on your wallet address rather than an email address/phone number).

That is what a digital economy should enable, and indeed it does as a side effect… It’d just be nice if one day they did it deliberately…


Disclosure: in my day job, I was looking at (only) Part 5 of the Digital Economy Bill, which is also broken. The above is just me, just for fun.

Ps – the use of ether addresses as Signal endpoints is a rather genius idea to build anything else on top of. Although this story isn’t entirely new.


posted: 28 Apr 2017

Some interim innovations enabling future deployments of Verify infrastructure

These are not ideas for GOV.UK Verify itself, but for other conversations on the side where reuse of that model may be being considered – that service is referenced here as “Verify”.

The principal consideration

The identity assurance principles, as overseen by PCAG, should apply to all deployments of a Verify hub model. Indeed, the Verify Hub codebase should probably include a configuration check that variable “follows_pcag_principles” is set to “yes” in order to start.

Care.data lessons, unlearned

HSCIC has a form you can fill in to opt yourself out of the various HSCIC datasets – it’s 12 pages long.  The equivalent form, ready to be handed to your GP, is 1 side of A4 and contains 2 tick boxes (plus space for your name, address, etc).

The HSCIC 12-page form has the same tick boxes, but the other eleven and a half pages are all there to verify identity, so that a remote institution which very rarely deals with patients knows that the right person filled in the form. A process done at the wrong level can generate that much extra paperwork. It doesn’t need to happen in a trusted system.

We assume the NHS is a trusted system.

  1. Service reauthentication (A)

Normally there is a single step where you log in, and after that identity is assured – which, in the case of legal identity, is entirely fine. But not every service can be like Facebook: institute a real-names policy and force citizens to produce papers on request.

For many services, there are points where a high level of contemporaneous confirmation is required (just as some websites make you re-enter a password in order to change certain settings, or see certain information).
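One way to express that pattern: the service records when the user last actively authenticated, and sensitive operations insist that the confirmation be recent. A minimal Python sketch (the five-minute window and function names are illustrative, not from any Verify specification):

```python
import time

REAUTH_WINDOW_SECONDS = 300  # sensitive actions need a confirmation this fresh

def needs_reauthentication(last_auth_at, now=None):
    """True if the session's last active login is too old for a sensitive step."""
    if now is None:
        now = time.time()
    return (now - last_auth_at) > REAUTH_WINDOW_SECONDS

# Logged in an hour ago: fine for browsing, but changing settings
# should bounce the user back through the hub for a fresh assertion.
stale = needs_reauthentication(last_auth_at=time.time() - 3600)
```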

  2. Service within a service

Like reauthentication, but where the process is less about reauthorisation and more about landing on another, institutionally separate, service. That second service is only available to those who have logged in to the first (so there is knowledge of valid accounts), but the interim step is the only point that knows where the individual went and where they came from. The second service may have no attributes by which to work out their identity; on arrival, it simply knows that the session belongs to someone with valid credentials.

There would be many small such services, each protecting legally separate sensitive services (e.g. SH24), ensuring there would need to be failures of at least three silos in order to compromise an individual’s identity. (C)
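The interim step can be modelled as a broker which swaps a valid first-service session for an opaque, single-use token; the second service can check the token without ever learning who is holding it. A toy Python model (the class and method names are hypothetical, not from any real deployment):

```python
import secrets

class HandoffBroker:
    """Knows which first-service sessions are valid; nothing else leaves it."""

    def __init__(self):
        self._tokens = set()

    def issue_token(self, session_is_valid):
        # Only the broker ever sees which session (and so which person) this was.
        if not session_is_valid:
            return None
        token = secrets.token_urlsafe(32)  # unguessable, carries no identity
        self._tokens.add(token)
        return token

    def redeem(self, token):
        # The second service learns only "someone with valid credentials arrived".
        if token not in self._tokens:
            return False
        self._tokens.discard(token)  # single use
        return True

broker = HandoffBroker()
token = broker.issue_token(session_is_valid=True)
```

Because the token is random and single-use, compromising the second service alone reveals no identities; compromising the broker alone reveals no content.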

  3. Define identity (e.g. NHS)

For example, the NHS has spent 20 years moving the NHS number to be the NHS equivalent of a passport number. But the criteria for getting an NHS number and a passport are different. Define who your customer group is.

If there are different definitions of identity and different assurance models, a service within a service, using multiple implementations of the Verify infrastructure, can bridge between identity models where that is appropriate.  (B)


Without this capability, the choice is either accepting the clinical definition of identity, or reworking every patient interaction in the NHS. Either decision may be unwise.

  4. Getting there from here

Existing identity providers need an on-ramp that isn’t a full deployment of an IdP infrastructure on day 1. Given the existence of a trusted, shared identifier within different institutions (e.g. the NHS Number), that doesn’t matter, as there is pre-shared knowledge.


Providers wishing to implement a light version of the infrastructure can hand off using a custom API. The code they’d need is roughly:

When a browser has logged in with the relevant hidden field set:

$nonce = bin2hex(random_bytes(16));  // unguessable per-request value
$timestamp = gmdate('YmdHis');       // e.g. 20161127191800
$hash = hash('sha3-256', $nonce . $timestamp . $nhsno . $providerID);
$response = file_get_contents("https://verifyheavy.nhsdigital.nhs.uk/private/confirmvalid?nhsno=$nhsno&timestamp=$timestamp&hash=$hash");
if ($response !== false) {
    header("Location: https://verifyheavy.nhsdigital.nhs.uk/clientrequest?hash=$hash");
}

And when the browser shows up at NHS Digital, if the hash is current and valid, then the NHS number is a confirmed attribute from the private request made earlier, and attribute exchange can continue from there as normal for a Verify framework.
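On the NHS Digital side, the private confirmvalid call and the browser’s later arrival at clientrequest meet in a short-lived table keyed by the hash. A Python sketch of that check (the in-memory store, function names, and five-minute validity window are all illustrative assumptions):

```python
import time

HASH_TTL_SECONDS = 300  # how long a private confirmation stays current

# hash -> (nhs_number, confirmed_at); a real service would persist this securely
pending = {}

def record_confirmation(hash_value, nhs_number, now=None):
    """Called when a provider's private confirmvalid request checks out."""
    pending[hash_value] = (nhs_number, time.time() if now is None else now)

def redeem_hash(hash_value, now=None):
    """Called when the browser arrives at clientrequest.

    Returns the confirmed NHS number, or None if the hash is unknown,
    already used, or stale.
    """
    if now is None:
        now = time.time()
    entry = pending.pop(hash_value, None)  # single use, fresh or stale
    if entry is None:
        return None
    nhs_number, confirmed_at = entry
    if now - confirmed_at > HASH_TTL_SECONDS:
        return None
    return nhs_number
```

The hash never carries the NHS number itself across the browser; the browser only ever sees an opaque value that the two back ends have already agreed on privately.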


Over time, this lite client can be replaced as full pseudonymous attribute exchange becomes available.

i.e.:

[screenshot omitted: screen-shot-2016-12-01-at-23-02-07]

posted: 01 Dec 2016