
When Palma's squares are watched: AI cameras, new jackets and the question of trust
The city of Palma wants to install 13 AI cameras at the Plaza de España and Parc de Ses Estacions. Will that really help, or will it change our public squares?
More eyes, more questions: Palma's new view from the pole
In the early morning, when bakers on the Plaza de España fire up their ovens and the tram rattles past, it will soon not be only the pigeons watching: the city plans 13 cameras with AI video analysis on the Plaza de España and in Parc de Ses Estacions, as reported in "Más ojos en Palma: cámaras con IA en la Plaza de España y el Parque de Ses Estacions – ¿más seguridad o más control?". Some €139,000 is budgeted for the cameras, plus around €100,000 for uniforms, protective gear and vehicles for the municipal police, as detailed in "Palma se equipa: más cámaras, drones y la gran pregunta sobre la privacidad". That sounds like modern order, but to some residents it already feels like tightened control.
What is technically planned, and what it means
The cameras are intended not only to record but to automatically detect 'suspicious behaviour' and trigger alerts. An algorithm filters the footage in advance; people at a monitor are then supposed to decide. Practical when groups get loud at night or shoplifting occurs; uncomfortable when nobody can say exactly how the machine defines 'suspicious'. Movement patterns? Facial features? Clothing? A student at the Estació Intermodal shrugs: 'Who programs their distrust?'
The less visible problems
Discussions often revolve around security or data protection. The important questions rarely make it onto the table: What data does the AI learn from? Are the training datasets diverse, or biased? Who operates, updates and controls the software: the city itself, or an external service provider? And what happens when errors occur: if the machine raises a false alarm, stigma and unnecessary police interventions can follow. Such false positives have faces, not just numbers.
How surveillance changes public space
A square with cameras feels different. Conversations grow quieter, encounters more formal. Young people, homeless people and migrant workers notice this immediately: more presence can mean protection, or exclusion. And displacement often follows: problems move to less monitored side streets. It also matters whether facial recognition is planned or 'only' behavioural analysis, a difference with direct consequences for civil liberties.
Transparency is often missing where it is most needed
The city points to legal requirements. That is necessary, but not enough. Who is allowed to access the recordings? How long are they stored? Who sets the alarm criteria? Without publicly available answers, distrust remains. Access for independent data protection officers and open audit reports would be first steps towards building trust.
Concrete proposals instead of blind faith in technology
From the neighbourhood and from civil society groups come practical proposals:
1) Clear limits: No facial recognition, short retention periods (e.g. 24–72 hours), automatic deletion and pseudonymisation where possible.
2) Public oversight: An independent body with citizen representatives, regular audit reports and a public log of all AI alerts.
3) Humans remain the decision-makers: Every alert must be verified by trained personnel before any intervention. No automated measures.
4) Pilot phase with evaluation: A transparent trial with clear success criteria (less vandalism, faster response times) and a fixed evaluation period, after which continuation or adjustment is decided publicly.
5) Invest socially: Allocate part of the funds to better lighting, street workers, youth and cultural programs — visible presence that not only deters but connects.
What it feels like in Palma
In the morning, when the first cafés open on Carrer de Sant Miquel and a woman hurries past the Plaza de España with a baguette, it is the mix of coffee aroma, the distant bell of the tram and conversations in Mallorquín that makes the square feel alive. A police officer in a new jacket and a camera on a pole change this scene in more than just its look. What matters is who writes the rules and who exercises control.
Conclusion: Use the opportunities, demand transparency
The new technology can ease real problems. But it must not open a Pandora's box that turns the squares into technologised zones where closeness and mistrust sit side by side. The coming months are an opportunity: for transparent rules, citizen participation and a balance that does not pit security against privacy but protects both.