Editorial · Opinion

Digital Europe Is Funding Machines. Someone Needs to Fund the People.

April 6, 2026 · Audience: Governments

The European Commission's March 2026 amendment to the Digital Europe Programme allocates €1.3 billion to AI infrastructure, testing facilities, and innovation hubs — and nothing to the workers who are legally required to understand AI by August 2026, or the students who will enter a workforce built on systems they were never taught to question.

The problem

On 19 March 2026, the European Commission amended its Digital Europe Programme work programme for 2025–2027. The amendment is the second revision of the cycle and carries real money: the overall DIGITAL budget for 2021–2027 exceeds €8.1 billion, with the current three-year phase directing a reported €1.3 billion toward digital skills, artificial intelligence, and cybersecurity. That framing, which lists digital skills alongside AI and cybersecurity as a priority, creates the impression of a programme as concerned with people as with infrastructure. Reading the actual text of the amendment corrects that impression quickly.

The amendment funds four categories of AI-related activity:

  • The first is Testing and Experimentation Facilities: physical and digital environments where companies can test AI systems against real-world conditions before deployment.
  • The second is the European Digital Innovation Hub network: 83 hubs selected in October 2025 and now explicitly tasked with delivering AI training for businesses and public administrations.
  • The third is ELEVATE, the European League of Advanced Digital Skills Academies, which funds advanced technical programmes inside universities.
  • The fourth is DIGITAL-2026-AI-09, the AI Continent call, which opens in late April 2026 and targets institutional capacity for AI deployment across the EU.

ELEVATE is worth dwelling on for a moment, because it is the instrument in this amendment closest to training people rather than testing machines. Advanced AI curricula inside universities are not a waste of money. Europe needs researchers, engineers, and technical staff who understand AI at a deep level, and funding that pipeline has long-term value. But ELEVATE operates at the far end of the education system, serving students who have already cleared every prior barrier: secondary school, university entry, and enrolment in a programme technical enough to qualify. It is, in other words, an investment in the people who need this investment least.

The workers whose AI literacy is both required and practically absent are not in ELEVATE cohorts. They are in warehouses, hospitals, classrooms, and call centres, and DIGITAL has no instrument for them.

Every one of these four instruments is aimed at institutions. TEFs exist to serve companies with products to test. EDIHs are, by the Commission’s own documentation, one-stop shops for companies and public sector bodies, not for individual workers. ELEVATE targets universities. DIGITAL-2026-AI-09 targets AI infrastructure at continental scale. The amendment also introduces digital infrastructure for schools, which sounds like education investment until you read it as what it is: a connectivity and hardware grant, not a curriculum.

Meanwhile, Article 4 of the EU AI Act has been binding since 2 February 2025. It requires every employer in the EU to ensure their workforce has sufficient AI literacy to work with or be subject to AI systems. Full compliance is due by 2 August 2026. The Commission is the institution that wrote that obligation. The Commission is also the institution that designed the March 2026 amendment to DIGITAL. No instrument in that amendment funds Article 4 compliance. The connection between the two, an enforceable literacy obligation and a €1.3 billion skills programme, does not exist.

This is a structural choice, and I think it is the wrong one.

Why it matters

The gap between what DIGITAL funds and what Article 4 requires is not a technicality. It is the difference between infrastructure for AI deployment and preparation for AI exposure.

Consider who Article 4 actually covers. The regulation applies to all employees working with or subject to AI systems. In a mature economy in 2026, that category is effectively the entire workforce. A hospital administrator whose scheduling is managed by an AI system is subject to it. A warehouse picker whose task allocation comes from an algorithmic routing engine is subject to it. A teacher whose students are assessed by an AI-assisted grading tool is subject to it. The Article 4 obligation was written broadly because AI exposure is broad, not because the Commission expected every worker to understand how transformers work.

The EDIH network cannot serve these workers. EDIHs were designed for SMEs and public bodies seeking to adopt AI. They provide technical consultation, testing access, and implementation support. Their AI training offer, reinforced in this amendment, is directed at companies learning to deploy AI, not at employees learning to work safely and critically alongside it. These are related but separate problems, and treating the EDIH as the delivery mechanism for Article 4 literacy is a category error.

The student question is more pointed still. The amendment introduces digital infrastructure for schools as a new action strand. Infrastructure for schools is not unwelcome. But connectivity and hardware are the condition for AI education, not the substance of it. A classroom with faster broadband and newer tablets is not a classroom where students understand what AI is, how it makes decisions, what its failure modes are, or what rights they have in relation to systems that assess them.

I am also conscious that the August 2026 deadline for Article 4 compliance is now less than four months away. There is no funded mechanism in the EU’s own digital programme to help the employers who are now legally obligated to train their workers. There is no DIGITAL call for AI literacy at the individual level. There is no instrument designed to reach the care worker, the logistics operative, or the secondary school teacher. The obligation exists. The funding architecture to meet it does not.

What should happen

Three things need to change, and they are not complicated to name even if they are politically difficult to deliver.

First, the Commission should open a dedicated DIGITAL call for AI literacy at the workforce level. Not for companies deploying AI, not for universities developing AI curricula, but for the organisations delivering training to workers who are now legally required to have it, whether those organisations are employers, trade unions, adult education providers, or regional authorities. The call should specify minimum content: how AI systems work at a functional level, how to identify AI-assisted decisions, what data rights apply, and how to raise concerns. It should be funded at a scale proportionate to the obligation, not as a marginal line item.

Second, the school infrastructure investment needs a curriculum companion. The Commission cannot fund broadband for classrooms and then leave curriculum to Member States without guidance or minimum standards. The result will be hardware-rich, knowledge-poor classrooms across the EU, with the widest gaps in countries that have the least capacity to design AI education independently. A framework for what students should understand about AI, by age cohort and by the time they enter the workforce, is within the Commission’s competence to propose. It has not done so.

Third, Article 4 needs an enforcement mechanism that has teeth, and the deadline needs to mean something. An obligation with no penalty for non-compliance is a recommendation with extra steps. The August 2026 deadline is already being ignored by a significant proportion of EU employers. If the Commission allows that deadline to pass without consequence, it will have established that AI literacy for workers is aspirational, not required, and every subsequent attempt to build on that foundation will be weakened by the precedent.

What already exists

The legal obligation is documented in the research archive at EU AI Act Article 4. The March 2026 DIGITAL amendment itself is at Digital Europe Programme Work Programme Amendment. The gap between what Member States have done individually and what the Commission has funded centrally is visible in the country-level research: Denmark, Belgium, Slovakia, and Austria have each developed national approaches to Article 4 implementation, each different, none funded by a DIGITAL instrument.

The sentiment research is consistent on the scale of the problem. IDC projects $5.5 trillion in global losses from IT and AI skills gaps by 2026, with AI identified as the primary driver. The OECD and European Commission jointly published an AI literacy framework in 2025 that defines four dimensions of AI competence, the most complete current definition of what the obligation in Article 4 should produce. That framework exists. Its connection to funded delivery does not.

What you can do

If you work in EU policy or have access to Commission consultation processes: the next DIGITAL work programme revision is an opportunity to name this gap explicitly and push for a workforce-level literacy call. The precedent for individual-facing digital skills funding exists inside DIGITAL itself; the programme has previously funded digital skills campaigns and public awareness initiatives. Extending that logic to AI literacy is not a stretch. It requires someone to ask for it.

If you are an employer covered by Article 4 and have not yet acted: August 2026 is four months away. The absence of a funded EU instrument does not change your legal obligation. The most practical step available now is a structured audit of which roles in your organisation are subject to AI systems, followed by a minimum training specification for each. Waiting for the Commission to fund a solution is not a compliance strategy.

If you work in a school or education authority: the digital infrastructure investment arriving through DIGITAL needs a curriculum to justify it. The Commission has not provided one, but the OECD AI literacy framework offers a usable starting point. The question worth asking of your national education authority is not whether AI tools are permitted in classrooms, but what students are expected to understand about AI by the time they leave school. That question does not currently have a consistent answer anywhere in the EU.

The Commission has committed €1.3 billion to digital skills in the current cycle. The people who most need that investment are not the companies testing AI in TEFs or the universities running advanced skills academies. They are the workers who will be evaluated, sorted, and managed by AI systems before the end of this year, and the students who will enter a workforce built on those systems. Neither group has a DIGITAL instrument with their name on it.