This page is your concise dossier on the UK story: the actual laws, government papers and original documents, each paired with a plain-English translation so you don’t need a law degree to know what they mean. The originals are linked so you can read the source, and the translation tells you exactly what the dry policy language will do in practice. Expect clear explanations of the terms they hide behind and why those terms matter to your daily life.
You’ll find the statutory material and guidance that make One Login, the GOV.UK Wallet and Companies House checks operate in the real world, plus short PDFs on informed consent, annotated extracts from the Acts and guidance, and ready-made letter templates you can send. Everything is aimed at making the state’s moves visible and usable: read the law, read the translation, and use the documents themselves as evidence when talking to friends, advisers or "officials."
This page will grow. New legislative reads, video explainers, expert commentary and practical briefings will be added over time, all UK-focused so you can follow the thread from policy text to practical effect. Use this as your reference shelf: hard facts, plain language, and the receipts you need to show how a single login and a government wallet become the building blocks of a universal digital identity.
Read the documents. Read my translations. Share the evidence. Resist the convenience that becomes compulsion.
Digital ID is the quiet doorway to a different world: it looks like a harmless app, a QR code, a green tick — but those little conveniences are actually the master key to everything you do. That single, verifiable credential can be made to open bank accounts, confirm work status, hand over health records, clear travel and gate access to benefits. When one credential becomes the norm, it ceases to be optional. It becomes the thing you must have to eat, to earn, to move. In a system designed around one key, refusal doesn’t look like rebellion — it looks like exclusion.
Once identity is remodelled into infrastructure, control goes structural. A unified digital ID can be wired into payment rails, employer systems, insurance databases and surveillance networks so that behavioural rules—late payments, missed check-ins, “non-compliant” health records—can trigger automated penalties: blocked transactions, denied job clearance, restricted travel. Geofencing can enforce presence or absence in places; automated flags can shrink access to services in real time. Combine scoring algorithms with location gates and you have a system that can punish, quarantine or downgrade citizens without a single new law: it happens through APIs (Application Programming Interfaces: the rules and doors that let one piece of software talk to another), contracts and business rules.
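To make that mechanism concrete, here is a minimal, entirely hypothetical sketch of how a behavioural rule attached to a single shared credential could decline a payment automatically. Every name, flag and rule below is invented for illustration; it is not taken from any real system, only a picture of the kind of logic that APIs and business rules make possible.

```python
# Hypothetical sketch: a business rule tied to one shared credential gating a
# payment. All identifiers, flags and rules below are invented for illustration.

def fetch_flags(credential_id: str) -> set[str]:
    # In a deployed system this would be an API call to a shared identity or
    # fraud service; here it is stubbed with fixed example data.
    return {"missed_checkin", "late_payment"} if credential_id == "user-123" else set()

# The "policy" lives in a data structure set by contract, not by statute.
BLOCKING_FLAGS = {"late_payment", "noncompliant_record"}

def authorise_payment(credential_id: str, amount: float) -> bool:
    flags = fetch_flags(credential_id)
    if flags & BLOCKING_FLAGS:
        return False  # transaction declined automatically, no human judgement
    return True

print(authorise_payment("user-123", 20.0))  # False: blocked by a behavioural flag
print(authorise_payment("user-456", 20.0))  # True: no flags, payment allowed
```

The point of the sketch is that nothing in it requires a new law: changing one entry in a rule set changes what a person can do.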
Social scoring is not sci-fi when identity, transactions and movement are fused. Every purchase, every medical appointment, every work check-in becomes a data point. Private scoring engines or government-adopted vendor metrics can translate those points into privileges: fast lanes, loans, or the opposite — slow lanes, higher costs, exclusions. Algorithms trained on biased data will amplify disadvantage; automated enforcement will multiply it. The veneer of “efficiency” masks a system that ranks people and then gates life according to those ranks.
Look at what happened here: millions pushed back. A UK petition against digital ID gathered millions of signatures, and the government’s formal response has been to press ahead — framing the rollout as routine public service reform even as millions demanded it be scrapped. That shrug is not minor: it shows a democratic short-circuit where public outrage becomes a footnote while technical programmes slip forward under the cover of “efficiency” and “border security.”
This isn’t a lone-state fantasy. Global institutions have canonised identity as infrastructure: the UN treats “legal identity” as a development target, the World Bank builds programmes and funding streams for national digital ID systems, and the World Economic Forum publishes the blueprints and “trust frameworks” that bake interoperability, governance models and acceptable standards into the playbook. That international imprimatur turns a policy idea into a financed, standardised ecosystem that favours the architects of the stack. When identity is rebranded as a public good, the people who build the plumbing decide its shape.
“Philanthropy” and private capital make the takeover low-risk and high-speed. Foundations seed pilots, write playbooks and convene the experts whose reports are then presented as “evidence” that the model works. Big tech and finance supply the platforms and the integration logic: standardised data languages, audit trails and platform operating systems that scale across states. Once those platforms are embedded in government workflows, “vendor lock-in” becomes civic lock-in — a handful of private stacks become the levers of public life.
The endgame is structural and slow: infrastructure is hard to rip up. Build the credential, wire it into the money, work and welfare systems, and the normal rhythms of life will reconfigure around it. That’s how a society moves from optional convenience to soft coercion: not through a single dramatic law, but through procurement rules, international standards, philanthropic playbooks and private platform logic that together make opting out practically impossible.
Picture a world where a slipped score, a flagged movement, or a closed vendor contract can quietly clamp down on someone’s ability to eat, travel or earn — and you have the real face of digital ID: not neutral tech, but a totalising architecture of observation, ranking and control.
If you wanted 1984, this is it in slow motion: the tick on your phone becomes the key in the gate, algorithms replace judgement, corporations host the switches, and international legitimacy dresses it all up as modernisation. The raw truth is ugly and simple — hand over identity to an engineered, global system and you hand over the levers of everyday life. The petition numbers and the steady programme-building show that this is not hypothetical: the machinery is being built and the democratic protests have so far been met with policy momentum, not retreat.
It’s not MPs in Parliament who are designing this architecture or pulling the strings; the real control sits with the architects — international agencies, big foundations, finance and platform companies that design, fund and sell the infrastructure. If you want to stop what’s coming you need to know who builds the machinery: read "The Architects" page. Understanding the enemy is not a game; it’s the first step toward resisting a system that turns identity into a lever of control.
The UK’s shrug at millions of petition signatures shows democracy being sidelined while a global technical and financial machine builds the scaffolding of surveillance, scoring and exclusion. Make that visible, expose the networks, and force the debate into the open — because once this infrastructure is in place, it becomes the quiet, permanent method of governing people’s lives.
That’s why this isn’t just about one government. It’s about stopping a global architecture of identity before it becomes irreversible.
The World Economic Forum itself admits that for many people meaningful informed consent is impossible. In its own words: “For populations lacking digital literacy, it may be impossible to obtain meaningful informed consent.” We will use this chink in their armour. They are building systems so complex, and so quickly tied to essential services, that a consent tick box becomes a fiction. Once access to work, benefits, schooling and banking depends on a digital pass, refusal is exclusion and "choice" is erased. Technically literate or not, you cannot consent to what you cannot fully see, opt out of, or stop — and that admission is the rope we should use to pull the whole project apart.
The architects admit consent will fail — refuse any digital ID you cannot walk away from.
Lawyer Clare Wills Harrison has written this piece to warn readers that genuine informed consent for digital ID is effectively impossible, to spell out the practical risks and legal gaps, to arm people with plain-English questions and receipts, and to urge refusal of any system that cannot be opted out of without penalty.
Read this, from Lawyer Clare Wills Harrison, and arm yourselves with questions, receipts and the will to refuse.
Connecting the Dots: One Login, Companies House and the GOV.UK Wallet
One Login is already functioning as the UK’s de facto digital ID: the single GOV.UK sign-on and digital-verification gateway that millions have used to “prove who they are.” Coverage of the Think Digital Identity and Cybersecurity for Government event (Think Digital Partners, 1st Oct 2025) quotes Natalie Jones, the Government Digital Service’s director of digital identity, saying “More than 11 million people have now verified their identity using One Login.”
With GOV.UK Wallet and Companies House identity checks plugging into the same system, the infrastructure for a universal digital ID is already in place.
One Login is not a simple sign-in system; in practice it is already becoming the government’s single digital key: sign in with One Login to reach multiple services, to “engage with the state,” and over time (and very quickly) it will replace older logins across GOV.UK. That’s not just a convenience upgrade — it ties your government accounts to one central identity gateway, the very definition of a digital ID in practice.
GOV.UK Wallet is the next stitch in that fabric. The Wallet is explicitly built to work with One Login and to let you store government documents on your phone: a driving licence, a veteran’s card, “eligibility” documents — so you can “prove” things to businesses and services using a digital token on your device. The Wallet documentation says you will need a GOV.UK One Login to use it; in other words, the wallet’s convenience depends on the one-login gate. When your official papers live as tokens in a phone app that in turn depends on a single account, the whole setup behaves like a government-issued digital identity system even if ministers call it a “wallet.”
Companies House’s move to force identity verification is the clearest example of how the net is being drawn tighter. From 18th November 2025 Companies House is phasing in mandatory ID checks for new directors and people with significant control, and the guidance requires people to sign in to or create a GOV.UK One Login and verify with photo ID. That is a concrete case where commercial and civic obligations now demand a verifiable digital identity linked to government systems: a “corporate duty” to prove who you are through the same corridor that houses the Wallet and One Login.
Put those three facts together and the outline becomes unmistakable: One Login is the gate, the Wallet is the container for state-issued credentials, and Companies House (and other services) are the first to require that gate be used as proof. That means government identity, business regulation and everyday transactions are being routed through a single interoperable stack. That stack is the platform on which surveillance, automated rules and exclusion can be built, because once services accept the same machine-verifiable credential, they can share signals, deny access, and enforce policy through code, API calls and contracts rather than statute alone.
Companies House expects 6–7 million people will need to verify their identity as this requirement rolls out, and millions of citizens hold driving licences and other documents that will be digitised into the Wallet. The more people funnel into the same digital lanes, the easier it is to normalise the system as the only way to do paperwork, set up a company, access services or prove eligibility. Normalisation is the quiet move from “voluntary” to “default,” and defaults harden into de facto mandates long before any single law makes them compulsory.
Stitch One Login, Wallet tokens and compulsory ID checks together with corporate databases, payment rails and employer systems, and you create a single, machine-readable identity layer. That layer can be used to enable things you might welcome as “convenience” — faster services, fewer forms — but also to enable geofencing, automated access controls, scoring and real-time penalties tied to behavioural rules. Those are not inevitable outcomes, but they are systemic possibilities once identity is consolidated into digital, verifiable credentials controlled by a small set of systems and vendors. APIs and technical standards become the policy levers; procurement and platform choices become the statutes.
One Login is live and intended to become the main sign-in; GOV.UK Wallet will rely on that login and hold official documents; Companies House is moving to mandatory identity verification that uses the GOV.UK route from November 2025. Those are not conspiracy theories; they are current pieces of policy being put into place.
The conclusion, that these pieces together are the beginning of a universal digital ID system with all the attendant risks of centralisation, exclusion and automated control, is a logical reading when you join the dots.
The above letter template works as a formal, written record of objection and a request for information. Directors can use it to register clear, dated concerns with Companies House about being forced to verify via GOV.UK One Login, to demand details of data sharing and retention, to insist on alternative verification routes, and to ask for the legal basis for any enforcement action. Framed as a polite but exacting request, it puts the recipient on notice that the director is aware of security and rights issues and is seeking a proper administrative response before proceeding.
It is a practical tool for any company director who wishes to protect their personal and corporate rights while complying with transparency requirements. It can be sent by an individual director, by a company secretary on the director’s instruction, or adapted by advisers acting for multiple clients; it acts as an evidential step that you raised concerns in good time, sought clarification in writing, and offered to comply by other means if necessary. Keep an exact copy, note the method and date of delivery, and attach any supporting material you reference so you have a clear contemporaneous file.
Finally, this letter can support later actions if the response is inadequate: it becomes part of the paper trail for complaints to Companies House, the Information Commissioner’s Office, or for legal advice about next steps. Expect, however, that Companies House may initially reply with standard guidance or refer technical questions to GDS; the value of the letter is that it documents your objections and requests and makes Companies House’s position — and any limitations in that position — explicit.
Thank you to Mr and Mrs Wilson for this letter template.

Legislation: this section gathers the actual laws and regulatory texts that enable the digital-ID roll-out — starting with the Data Use and Access Act and the Trust Framework — and explains, in plain English, what the legal wording will do in practice. For each item you’ll get a short translation that cuts through the euphemisms.
Every entry also links to the original statute, statutory instrument or official guidance so you can read the source yourself. More acts, regulations and guidance will be added as they appear — treat this as a growing legal reference shelf: the law, the translation, and the receipts.
Data Use and Access Act unmasked: how sweeping powers and a DVS rulebook make privacy optional and industrialise data access.
What the Act says:
The Data (Use and Access) Act 2025 gives the Secretary of State and the Treasury broad powers to make regulations requiring organisations to produce, publish or provide customer and business data, to establish a statutory scheme for digital verification services (a DVS trust framework, supplementary codes, a public register, an information gateway and a trust mark) and to create enforcement, monitoring and financial mechanisms to make those rules stick. The Act also sets out investigative powers for enforcers, levy and fee powers, and substantial amendment and cross-referencing to existing data protection and privacy law.
What it really means:
This is not merely a technical tidy-up. The Act creates a legal scaffolding that allows ministers to require commercial and public actors to share vast amounts of customer and business data under delegated regulations, while simultaneously codifying and licensing a state backed market for certified digital verification services. In practice the combination does three things at once. First, it legalises easy access routes to private commercial records and machine readable data about people and households. Second, it industrialises identity and verification by making a government maintained register, trust framework and supplementary sector codes the preferred commercial route to operate. Third, it substitutes a regulatory and contractual enforcement regime for the old, slower, political checks: audits, compliance notices, criminal offences, unlimited fines and levies make the system coercive without a full public debate. The upshot: more data, pushed more often, to more recipients, under a regime that privileges certified providers and generates ongoing costs and lock-in.
Key levers hidden in plain sight:
Regulation by ministerial order: the Secretary of State or the Treasury may by regulation require data holders to produce, retain, change or provide customer and business data and to use specified digital interfaces and APIs. That is delegated lawmaking that can be shaped later without primary debate.
DVS trust framework as market gate: the Act creates a statutory trust framework, supplementary codes and a DVS register; certificates and registration give certified providers clear commercial advantage and form the gateway to sectoral use.
Enforcement and investigatory muscle: enforcers may be empowered by regulations to demand documents, compel attendance, and exercise powers of entry, inspection, search and seizure (subject to some statutory limits). Compliance notices, criminal offences for obstruction and financial penalties backstop the regime.
Operational lock in via interfaces and standards: regulations may require use of specified dashboard services, APIs, interface bodies and interface standards, steering the technical stack to a small set of prescribed formats and providers.
Financial pressure: fees, levies and powers to require financial assistance or to impose costs can be used to compel participation or subsidise the infrastructure that benefits certified providers.
Receipts:
“This Part confers powers on the Secretary of State and the Treasury to make provision in connection with access to customer data and business data.”
“The Secretary of State must prepare and publish a document ‘the DVS trust framework’ setting out rules concerning the provision of digital verification services.”
“The regulations may confer powers of investigation on an enforcer, including… powers of entry, inspection, search and seizure.”
“Regulations under this Part may provide for the processing of information in accordance with the regulations not to be in breach of any obligation of confidence.”
Ominous implications:
Delegated power is where the danger lies. The Act hands ministers the power to design the detail later by regulation: what counts as customer data, who may receive it, how often it must be handed over, which interfaces are compulsory — all set by secondary law. That makes the law both sweeping and flexible for future expansion.
Privacy protections are weakened in practice. The Act allows regulations to state that processing in accordance with those regulations will not be a breach of obligations of confidence. While the Act nominally preserves the main data protection legislation, the wording invites regulatory carve outs and broad exceptions that can erode practical privacy safeguards. That is a real concern for citizens.
Enforcement tools create compliance by fear. The combination of investigative powers, criminal offences for obstruction, unlimited or large fines, and public statements by enforcers means organisations will comply quickly — and individuals will find it harder to challenge overbroad data flows.
Certification becomes the fast lane to power. The DVS register, certificates and supplementary codes create a two-tier market: certified providers have privileged status and relying parties will gravitate to them, embedding private companies as de facto public infrastructure.
Soft law becomes hard reality. Interface standards, guidance and periodic reviews institutionalise the chosen technical models. Once businesses and the public sector adapt to a given stack the political cost of reversing that choice becomes high.
Tone check:
This Act reads like a ministerial toolkit for rapid, large scale data mobilisation and market creation. It was drafted to enable the state to extract and standardise commercial and public data, while simultaneously creating a certified market for verification services. The rhetoric of reliability, trust and inclusion masks a programme that privileges delegated regulation, private conformity assessment bodies and interface governance. Read it as government organised market formation dressed as public protection.
Plain English Definitions:
Data holder: the trader or business that holds customer or business data; essentially the organisation that keeps the records.
Customer data: information about a person who has bought or received goods, services or digital content from a trader; this can include how they used services and performance data.
Business data: operational data about goods, services, prices, feedback and how they were supplied.
Authorised person / third party recipient: someone specified in regulations who may be allowed to receive customer or business data, potentially at the customer’s request or at the authorised person’s request.
Interface body / interface standards: groups or technical rules that run the APIs, dashboards and technical gates that data flows through. Once mandated, using those interfaces becomes compulsory.
DVS trust framework / supplementary code / DVS register: the statutory rulebook for internet based identity and verification services, plus sectoral addenda and a public list of certified providers; together they create the official marketplace for identity services.
Enforcer: a public authority given powers to monitor and enforce the regulations, including issuing compliance notices, fines and in some cases search and seizure.
Do not be reassured by the language of trust, reliability and user control. The Data (Use and Access) Act 2025 is a legislative vehicle that replaces many practical privacy checks with delegated regulatory power, certification regimes and enforceable contractual and monetary levers. Ministers can later decide what data must be handed over, who may receive it and which technical gates are mandatory. Coupled with the DVS trust framework and its register of certified providers, the Act turns data sharing into a state enabled market and a routine operational requirement. People should be rightly concerned: this law hands powerful new tools to the state and to certified private actors while making it harder to challenge broad data extraction in practice.
Trust Framework unmasked: a technical rulebook that quietly turns everyday services into digital identity marketplaces.
What this document says:
The UK digital identity and attributes trust framework is presented as a set of rules that organisations must follow if they want their service certified as a trustworthy digital verification service, or DVS. It promises privacy, transparency, inclusion, interoperability, proportionality and good governance, and it sets out roles, certification routes, a public register, auditing and supplementary codes for specific use cases. The framework is maintained by the Office for Digital Identities and Attributes (OfDIA), part of DSIT, and this gamma publication comes into force on 1 July 2025.
What it really means:
This is not merely technical housekeeping. It is a deliberate market and governance instrument to normalise certified digital identity products across the economy. By defining certifiable roles, by publishing a data schema and by allowing sector specific supplementary codes, the trust framework creates a certified supply chain of wallet providers, identity verifiers, attribute issuers, orchestration platforms and component vendors. Certification, a public register and flow down contractual terms make certified services commercially preferable. In plain terms: get certified and you are more likely to win business from banks, schools, landlords, utilities and government, and to be embedded into everyday systems such as mortgages, driving licences and benefits. The rules, audits and re certification cycles lock participants into an evolving market architecture that becomes harder to opt out of over time.
Key levers hidden in plain sight:
Certifiable roles and certification as market advantage. Defining identity service provider, attribute service provider, holder service provider, orchestration provider and component provider channels market activity into certifiable business models and gives certified providers a commercial edge.
Supplementary codes as expansion mechanism. Supplementary codes let OfDIA create sector specific rule sets, for example right to work and right to rent now, and any other sector later. Providers must be certified against the trust framework before they can be certified against a supplementary code, so the main framework is the gateway.
Flow down contractual terms. Relying parties do not need certification, but they will be contractually obliged by certified providers to follow flow down terms that enforce fraud controls, data retention, data minimisation, incident handling and other obligations. That is how a voluntary market practice becomes contractual normality.
Interoperability plus a published data schema. The data schema on GitHub encourages common machine readable exchanges and steers implementations toward a small set of compatible formats and protocols, aligning vendors to the same technical stack (see the sketch after this list).
Audits, register and re certification. Certificates last three years, but regular audits and the prospect of having to uplift to newer publications create ongoing switching costs and governance lock in.
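As an illustration of why a common, machine-readable format matters, here is a minimal hypothetical sketch of an attribute exchange between a certified provider and a relying party. The field names and values are invented for this example and are not taken from the published schema; the point is only that once every party speaks the same format, an eligibility decision becomes a parsing exercise.

```python
# Hypothetical sketch of a machine-readable attribute exchange. Field names and
# values are invented for illustration and are NOT from the published GOV.UK schema.
import json

# What a certified provider might hand over on the user's behalf.
attribute_assertion = json.dumps({
    "subject": "wallet-holder-001",
    "attribute": "right_to_rent",
    "value": True,
    "issuer": "certified-provider-example",
    "issued_at": "2025-07-01",
})

def relying_party_check(assertion_json: str) -> bool:
    # A landlord's system only needs to parse the common format and trust the
    # certified issuer; no document is seen and no human judgement is involved.
    assertion = json.loads(assertion_json)
    return assertion["attribute"] == "right_to_rent" and assertion["value"] is True

print(relying_party_check(attribute_assertion))  # True: access granted on the strength of the token
```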
Receipts:
“The trust framework is a set of rules for an organisation to follow if they want to have their service certified as a trustworthy digital verification service.”
“Providers that successfully get services certified against the trust framework can apply to appear on the register of certified digital identity and attribute services.”
“Supplementary codes will be owned and run by OfDIA, prepared through stakeholder engagement and certified against by approved CABs.”
“When sharing a user’s digital identity or attribute information with another service or relying party, you must provide enough information for them to be able to identify the person and/or decide if the person or organisation is eligible for something.”
Ominous implications:
Scope creep. Supplementary codes now cover right to work, right to rent and DBS. The framework explicitly invites more supplementary codes where there is a market and user need. That is a built in expansion path to mortgages, driving licences, school admissions, benefits and beyond.
Private gatekeepers and vendor capture. The roles and component model let specialist vendors, biometric suppliers and orchestration platforms embed into the stack. Multiple relying parties tied to the same fraud monitor or broker create single points of dependency that OfDIA will monitor. That centralisation is attractive to corporations.
Contractual normalisation not legislative change. Relying parties are not required to be certified but contractual flow down terms will make them behave as if they are. That makes this a governance by contract route rather than by public debate.
Long lived infrastructure. Certification cycles, audits, public registers and standards alignment create technical and political lock in that is hard to unwind. Providers must plan transition arrangements if they withdraw, but transition obligations focus on migration not on the political choice to dismantle the system.
Tone check:
This is an industry facing rulebook that reads like a commercialisation plan for identity technology. It was developed iteratively with industry, the ICO and hundreds of stakeholders, and it is maintained by a government office inside DSIT. The language of inclusion, privacy and transparency is used throughout, but the practical content is about certifying markets, steering standards and building contractual norms that favour certified providers. Consider the provenance as government organised market formation rather than a purely citizen protection exercise.
Plain English Definitions:
Certified service, certified provider, registered service. A certified service is a product that has passed the trust framework audit and sits on the public register. Being certified is a market credential that proves compliance with the trust framework rules.
Holder service, holder service provider, digital wallet. A holder service is the app or device that stores your verified identity and attributes, for example a wallet on your phone. It includes an account and can be remotely managed by the provider for recovery and fraud response.
Attribute and attribute issuer. An attribute is a single fact about you such as age, a mortgage, or a bank balance. An attribute issuer is the organisation that vouches for that fact, public or private.
Relying party. The organisation that asks for your identity or attributes, for example a bank, landlord or school. They do not have to be certified but they will be bound by contractual flow down terms from certified providers.
Supplementary code. A set of extra certifiable rules for a particular sector or use case which sits on top of the main trust framework. Today there are codes for right to work, right to rent and DBS checks.
Do not be fooled by the language about privacy, inclusion and user control. The trust framework is a market making and governance tool. It certifies providers, publishes a register, prescribes data formats and allows sector specific codes. Together those mechanisms create powerful commercial and contractual incentives to use certified identity products across the economy. This is how voluntary standards become de facto infrastructure. The trust framework is the operational playbook for scaling identity into every part of life, not a limited technical convenience for a few government checks.
What the Bill says:
The Bill updates child safeguarding and school rules. It creates duties to share information between a wide range of public bodies and designated agencies, requires local authorities to keep registers of children not in school with detailed personal data, and gives the Secretary of State wide regulation powers — including the ability to define a “consistent identifier” for children and to prescribe who counts as a designated body. The Bill repeatedly states that certain disclosures “do not breach any obligation of confidence” and allows disclosures for safeguarding even where normally protected.
What it really means:
On the surface it is about safeguarding children. Underneath it builds legal scaffolding for universal, linked data flows. The “duty to share” plus a mandated “consistent identifier” is the starter kit for a mass-indexed child record that can be read by schools, social workers, police nominees, inspectorates and outsourced service providers. Registers stocked with name, address, parent details, protected characteristics and attendance patterns become searchable dossiers. The repeated clause that sharing “does not breach any obligation of confidence” removes a usual legal brake on disclosure. Put bluntly: this Bill turns fractured local records into a routable, standardised dataset that can be widened by regulation.
Key levers hidden in plain sight:
The duty to disclose where information “is relevant to safeguarding” and the rider that such disclosures “do not breach any obligation of confidence.” This is the legal button that makes previously private records easily shareable.
The power to create a “consistent identifier” by regulation and to oblige designated persons to include it in records. That is the technical glue for cross-system linking (see the sketch after this list).
Wide regulatory powers for the Secretary of State to designate agencies, prescribe requirements and set registers. The heavy lifting is moved from Parliament to secondary regulation.
Detailed registers of children “not in school” with prescribed fields including home address, parent details, protected characteristics and the specifics of alternative education arrangements — perfect fodder for profiling and automated decisioning.
Clauses that say the duty survives unless a disclosure would be “more detrimental to the child than not disclosing” — a vague safety valve that will be interpreted administratively, not democratically.
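To see why a shared identifier is such powerful glue, here is a minimal hypothetical sketch of record linkage. The registers, field names and data are all invented; the point is that once every service stores the same identifier, assembling a combined dossier is a few lines of code rather than a deliberate policy decision.

```python
# Hypothetical sketch: separate registers joined through one "consistent
# identifier". All registers, fields and data below are invented.

school_register = {"child-0001": {"attendance": "87%"}}
health_records = {"child-0001": {"health_flag": "missed appointment"}}
not_in_school_register = {"child-0001": {"home_address": "example address", "parents": ["parent A"]}}

def build_profile(consistent_id: str) -> dict:
    # One lookup per register, keyed by the same identifier, merged into a
    # single searchable profile.
    profile: dict = {}
    for register in (school_register, health_records, not_in_school_register):
        profile.update(register.get(consistent_id, {}))
    return profile

print(build_profile("child-0001"))
# {'attendance': '87%', 'health_flag': 'missed appointment',
#  'home_address': 'example address', 'parents': ['parent A']}
```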
Receipts:
“A disclosure of information under this section does not breach any obligation of confidence owed by the person making the disclosure.” (Section inserted after 16L; see page 4).
“The Secretary of State may by regulations specify a description of consistent identifier for the purposes of this section.” (Consistent identifiers for children, section 16LB; see page 7).
“If this subsection applies the designated person must include the consistent identifier in the information processed.” (Section 16LB; mandatory inclusion, subject to limited carve outs; see page 7).
“A local authority must maintain a register of children who are eligible to be registered by the authority under this section.” (Register of children not in school, section 436B; see page 56–57).
“A register under section 436B must contain the following information in respect of a child registered in it — (a) the child’s name, date of birth and home address; (b) the name and home address of each parent of the child; … (f) in the case of a child who is in the area of a local authority in Wales, whether the child has any additional learning needs…” (Content and maintenance of registers, section 436C; see page 57).
“Except as provided by subsection (3), a disclosure of information authorised or required under this section does not breach — (a) any obligation of confidence owed by the person making the disclosure, or (b) any other restriction on the disclosure of information (however imposed).” (Processing of information, section 436T; see page 80).
Ominous implications:
This is not merely better data management. The Bill creates the legal and technical building blocks for: a routable national child identifier, persistent registries that include sensitive attributes, regulatory routes to force provider and contractor participation, and a low threshold for sharing that weakens confidentiality. Once identifiers and registers exist, they intersect with other systems: welfare, health, education databases, and any central trust framework that later standardises credentials and verification. That is the anatomy of mission creep: yesterday’s safeguarding file becomes tomorrow’s universal record, and universal records become the backbone of identity and access systems.
Tone check:
The Bill reads like a managerial, technocratic fix: problem defined, registry prescribed, regulation enabled, and information flows mandated. Its language is bureaucratic, benign and “for the child’s welfare.” The effect is to normalise routine sharing and to recast confidential files as administrative inputs. That soft, moral tone hides a shift of power to administrative actors, regulators and the speculative vendors who will supply the tech and manage the registers.
This Bill is sold as better safeguarding, but it also builds a legal and technical path from local case records to a routable, regulated, indexed data architecture for children. The “consistent identifier” and the immunity from confidentiality for authorised disclosures are the exact tools you would design if your aim was to make every child’s record linkable, searchable and reusable across services. If you are worried about mission creep, privacy erosion, vendor lock and the normalisation of persistent identifiers, this Bill is an important tipping point. Read the receipts, note the pages I have quoted, and understand that once identifiers and registers are established and normalised by regulation, reversing the reach of the system becomes politically and practically very hard.
Note: The Bill does not name the “trust framework” explicitly, but it hands ministers the exact powers needed to feed a children’s identifier and registers straight into a DVS / trust-framework ecosystem. In practice the connection is indirect but immediate and easily made by regulation.