ETHICA SOCIETAS-Rivista di scienze umane e sociali

THE FRIA OBLIGATION FOR LOCAL POLICE – Sergio Bedessi

The FRIA as a new mandatory requirement for video surveillance and high-risk AI systems

Sergio Bedessi

Abstract: Law 132/2025, implementing the European AI Act, makes the Fundamental Rights Impact Assessment (FRIA) mandatory for all high-risk artificial intelligence systems, which are widely used by Local Police forces in video surveillance and urban security services. With the ACN designated as the authority responsible for oversight and for sanctions of up to 35 million euros, municipalities must classify the technologies they employ, assess their impact on fundamental rights, and prepare the FRIA by 2 February 2026. The article outlines the regulatory obligations, the operational implications, and the relationship between the FRIA and the DPIA, highlighting the urgent challenges facing local authorities, particularly those using video surveillance systems, which now account for the vast majority of high-risk deployments.

Keywords: #localpolice #ArtificialIntelligence #AI #AILaw #AIAct #Law1322025 #AINormativeFramework #DeepFake #EuropeanAIRegulation #AIImplementation #AIRegulation #AILawStudies #AIEthics #RiskBasedApproach #ProhibitedAI #HighRiskAI #LimitedRiskAI #MinimalRiskAI #FundamentalRights #Privacy #HumanDignity #NonDiscrimination #DataProtection #AITransparency #HumanOversight #SergioBedessi #ethicasocietas #ethicasocietasjournal #scientificjournal #law #ethicasocietasupli


Sergio Bedessi (b. 1958) holds degrees in architecture, political science, and methodology and empirical research in the social sciences. He worked for many years in public administration, first as a public works technician and later as chief of local police in several departments. A registered journalist (pubblicista), he is the author of more than 30 books and hundreds of articles, including "Artificial Intelligence and Social Phenomena. Forecasting with Neural Networks" (Maggioli). He is currently President of CEDUS (Urban Security and Local Police Documentation Center) and teaches in numerous courses, including at the university level.




Law No. 132 of 29 September 2025, "Provisions and Delegations to the Government on Artificial Intelligence", constitutes, as already noted, the Italian implementing legislation (not a transposition, since the source is a directly applicable, self-executing regulation) of EU Regulation 2024/1689 of the European Parliament and of the Council, known as the AI Act. It has now identified the national authority responsible for imposing the sanctions provided for by the AI Act. As a result, all those who had so far remained inactive are now compelled to adopt the FRIA (Fundamental Rights Impact Assessment), an obligation the European Regulation imposes on all users of high-risk artificial intelligence systems.

Given the exponential proliferation of urban video-surveillance systems equipped with multiple AI-assisted functions, it is evident that virtually all Local Police forces (i.e., municipalities) are required to produce this document.

In fact, two deadlines for adopting the FRIA have already passed (the most recent on 2 August 2025), but since the competent Italian authority for supervision and sanctions had not yet been designated, no consequences followed.

With the next—and final—deadline of 2 February 2026 approaching, all deployers (as defined by the AI Act) of high-risk AI systems must now adopt the FRIA.

Aside from prohibited systems (e.g., real-time remote biometric identification in publicly accessible spaces, or biometric categorisation based on sensitive traits, even gender alone), there exists a very broad range of AI-enabled systems that may potentially affect fundamental rights. These include urban video-surveillance systems, dashcams, bodycams, "camera traps", and unmanned aircraft systems (drones).

Following an analysis of the functions embedded in such systems and their classification according to the AI Act, any function that falls under the category of high-risk systems must undergo a Fundamental Rights Impact Assessment.

It should be noted that the FRIA must be drafted even if the relevant function is not currently in use but is potentially usable.

Annex III of the AI Act provides a non-exhaustive list of AI-based technologies for which a FRIA must be prepared in relation to policing activities; among these are systems used for road-traffic management, systems for assessing and classifying emergency calls, predictive-policing systems, and many others.

WHAT MUST A LOCAL POLICE COMMAND DO?

The key questions to ask are: Are we using prohibited systems? Are we using high-risk systems for which an impact assessment is required?

To answer these two questions, the technologies in use must be analysed (and the same analysis must be conducted for future purchases), with particular attention to technologies that capture or process images and video streams, such as video-surveillance systems. The aim is to determine whether each technology falls into one of the following categories: unacceptable risk (prohibited), high risk, limited risk, or minimal risk (unregulated).
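The inventory-and-classification step described above can be pictured as a simple triage over the functions embedded in each device. The sketch below is purely illustrative and is not legal advice: the function names and their tier assignments are hypothetical examples drawn from the categories mentioned in this article, and any real classification must follow the AI Act itself and qualified legal review.

```python
# Illustrative triage sketch: group the AI-assisted functions found in a
# device inventory by AI Act risk tier. Tier assignments here are example
# placeholders, not authoritative classifications.

RISK_TIERS = {
    # Examples of prohibited practices cited in the article
    "real_time_remote_biometric_id": "prohibited",
    "biometric_categorisation_sensitive_traits": "prohibited",
    # Examples of Annex III high-risk functions cited in the article
    "predictive_policing": "high",
    "emergency_call_triage": "high",
    "road_traffic_management": "high",
    # A hypothetical example of a function with no AI Act relevance
    "simple_motion_detection": "minimal",
}

def triage(functions):
    """Group functions by risk tier; anything not yet classified is
    flagged for manual legal assessment rather than silently ignored."""
    result = {"prohibited": [], "high": [], "limited": [],
              "minimal": [], "to_assess": []}
    for f in functions:
        result[RISK_TIERS.get(f, "to_assess")].append(f)
    return result

# Hypothetical inventory of one video-surveillance installation:
report = triage(["road_traffic_management",
                 "simple_motion_detection",
                 "licence_plate_reading"])
# Any "prohibited" entry must be disabled; every "high" entry triggers
# the FRIA obligation; "to_assess" entries need expert classification.
```

The point of the sketch is the workflow, not the mapping: unclassified functions go to a "to assess" bucket so that nothing escapes review, mirroring the article's warning that even functions not currently in use, but potentially usable, must be assessed.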

Once the analysis has been completed, a Fundamental Rights Impact Assessment must be drafted, followed by the adoption of a series of measures aimed at mitigating such impact and, consequently, the risks to individuals’ fundamental rights.

It should be noted that the FRIA has strong connections with the Data Protection Impact Assessment (DPIA), and the two assessments should therefore be carried out in parallel.

The preparation of the FRIA should be entrusted to someone who combines technical, technological, and legal expertise, also involving the manufacturers or distributors of the AI-based systems in use (for example, the producer of the video-surveillance system).

Now that national legislation has finally designated the ACN (the National Cybersecurity Agency) as the authority responsible for oversight and sanctions, there is a concrete risk that, just as happened with data-protection compliance, many Municipalities and Local Police departments will find themselves without a FRIA by the final deadline of 2 February 2026, exposing themselves to potential fines of up to 35 million euros.




Ethica Societas is a free, non-profit review published by a non-profit social cooperative.
Copyright Ethica Societas, Human&Social Science Review © 2025 by Ethica Societas UPLI onlus.
ISSN 2785-602X. Licensed under CC BY-NC 4.0
