Data Protection in the Workplace: Practical, Legal, and Strategic Implications of the GDPR

Data protection is often seen as a topic exclusive to legal or IT departments. In this article, we’ll dive into the key points discussed in ECOMPLY’s webinar on Data Protection in Employment Relationships, highlighting the practical, legal, and strategic implications for companies seeking GDPR compliance and preparing their teams for the challenges of digital transformation.

Our guest speaker, Jéssica Rocha, Senior Privacy Lawyer at Viseu Advogados (the webinar was held in Portuguese), emphasized that workplace data protection must involve all departments, especially Human Resources, leadership, and technology.

The reason is simple: most of the personal data handled by companies comes from hiring, managing, and offboarding employees—processes that require sensitivity, legitimacy, and respect for fundamental rights. That’s why GDPR should be treated as a cross-functional responsibility.

In Brazil, we can draw parallels with these scenarios under the LGPD (General Data Protection Law), which shares many of the GDPR's principles while also reflecting local nuances.

Consent: A Weak Legal Basis in Employment Relationships

One of the webinar’s key points was the limited usefulness of consent as a legal basis for processing employee and candidate data. The GDPR requires consent to be freely given, informed, and unambiguous, which is difficult to ensure in hierarchical workplace settings where individuals may feel pressured to agree. The requirements for valid consent are set out in Article 7 and further specified in Recital 32 of the GDPR.

If a company asks for consent to use employee data for internal marketing, such as a birthday campaign, the employee may feel compelled to accept, for example to avoid looking bad. This undermines the notion of free and voluntary consent.

Acceptable uses of consent include participation in voluntary, non-mandatory initiatives, such as optional wellness programs, employee benefits, or internal communications campaigns (with a clear opt-out option).

Transparency Is a Principle—Not a Legal Basis

Another important point was clarifying the distinction between transparency and consent. Transparency is mandatory regardless of the legal basis used. This means that even when processing data under legitimate interest or contract performance, employees and candidates have the right to know:  

  • What data is being collected;  
  • For what purposes;  
  • How long it will be retained;  
  • Who it will be shared with;  
  • How to exercise their rights (access, rectification, objection, etc.)

Recommended Best Practices

  • Draft clear and specific privacy notices for candidates
  • Avoid legal jargon; use plain language
  • Apply privacy by design principles from the recruitment stage

Risk of Excessive Data Collection During Hiring

Recruitment and selection processes require extra caution. Jéssica Rocha shared two important case law precedents (from Brazil, in reference to the LGPD):

  • TST (pre-LGPD): The Brazilian Superior Labour Court ruled that indiscriminately requiring criminal records from candidates violated their privacy.
  • TRT-15 (post-LGPD): The Regional Labour Court of the 15th Region fined two companies BRL 200,000 for collective moral damages due to the excessive and sensitive data collected via a third-party recruitment platform.

AI in Recruitment and Management: Real Risks of Discrimination

The use of AI in recruitment is growing—but so are the risks of algorithmic discrimination. AI models are trained on historical data, which can embed past biases. In a well-known case, a tech giant’s recruitment algorithm favored white male candidates, as this matched the “ideal” profile from past hires.

Discrimination risks include:

  • Automatic penalization of female candidates;
  • Favoring CVs from certain regions (based on postal codes);
  • Filtering out diverse profiles without a clear rationale.

Avoiding the input of sensitive data into AI systems doesn’t ensure fairness, because discrimination can occur through seemingly neutral data. Therefore, companies must conduct algorithmic impact assessments, ensure human intervention in automated decisions, and document decision criteria and algorithm usage.

In recruitment, data should be collected progressively, as candidates advance toward a real prospect of hiring, to avoid gathering unnecessary information on applicants who will not proceed.
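One way to enforce this progressive, minimisation-first approach is to tie each recruitment stage to an explicit allow-list of fields. The sketch below is illustrative only; the stage names and fields are hypothetical, not a prescribed schema:

```python
# Hypothetical staged data-collection schema: each recruitment stage
# unlocks only the fields that stage can justify (data minimisation).
STAGE_FIELDS = {
    "application": {"name", "email", "cv"},
    "interview": {"name", "email", "cv", "phone", "availability"},
    "offer": {"name", "email", "cv", "phone", "availability",
              "address", "bank_details"},
}


def validate_collection(stage: str, requested_fields) -> bool:
    """Raise if a form requests fields beyond what the stage justifies."""
    extra = set(requested_fields) - STAGE_FIELDS[stage]
    if extra:
        raise ValueError(
            f"Fields {sorted(extra)} exceed what stage '{stage}' justifies"
        )
    return True
```

In practice this check would sit in front of every candidate-facing form, so that asking an early-stage applicant for, say, bank details fails loudly instead of silently accumulating data.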

What Legal Guarantees Are Involved?

Firstly, the right to non-discrimination. This is a constitutionally protected right, not granted solely by the GDPR. It’s reinforced by multiple international treaties to which EU Member States and Brazil are signatories. Discrimination risks are particularly relevant in the context of Artificial Intelligence, and require robust governance measures to safeguard this right.

The European Union protects non-discrimination through several treaties and legal instruments. These instruments prohibit discrimination based on a variety of factors, including sex, race, religion, sexual orientation, age and disability.

Main Treaties and Instruments:

  • Treaty on European Union (TEU): This treaty establishes the foundations of the European Union and includes provisions on the protection of fundamental rights, including non-discrimination.
  • Charter of Fundamental Rights of the European Union: This binding legal instrument prohibits discrimination and establishes equality between EU citizens.

Algorithmic Management: How Much Monitoring Is Reasonable?

Beyond hiring, AI-based performance and productivity management is becoming more common. However, excessive monitoring—such as camera surveillance, GPS tracking, or keyloggers—may be deemed illegitimate, especially in low-risk environments.

Monitoring is only acceptable when:

  • It serves a legitimate purpose (e.g., safety in mines or oil rigs);
  • It is proportionate to the risk involved;
  • It respects the employee’s reasonable expectations;
  • It is transparent and auditable.

An office worker does not expect to be tracked; a miner, on the other hand, might.

Recommended Governance Measures

To ensure compliance, companies should adopt structured measures such as:

  • Algorithmic governance policies
  • Data Protection Impact Assessments (DPIA)
  • Ongoing training for managers and employees
  • Justification and documentation of the legal basis used
  • Audits and impact metrics (e.g., the four-fifths rule for detecting adverse impact)
  • Mandatory human oversight in automated decisions
  • Careful selection of technology vendors
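The four-fifths rule mentioned above compares each group's selection rate to that of the best-performing group; a ratio below 0.8 is a common flag for potential adverse impact. A minimal sketch of such an audit metric, with purely illustrative group names and numbers:

```python
# Minimal sketch of a four-fifths (4/5) adverse-impact audit.
# Group names and counts below are illustrative, not from any real case.

def four_fifths_check(outcomes, threshold=0.8):
    """outcomes maps group -> (selected, applicants).

    Flags groups whose selection rate falls below `threshold` times
    the rate of the best-performing group.
    """
    rates = {g: sel / total for g, (sel, total) in outcomes.items()}
    top = max(rates.values())
    return {
        g: {
            "rate": rate,
            "impact_ratio": rate / top,
            "passes": rate / top >= threshold,
        }
        for g, rate in rates.items()
    }


audit = four_fifths_check({
    "group_a": (50, 100),  # 50% selection rate (reference group)
    "group_b": (30, 100),  # 30% rate -> impact ratio 0.6, below 4/5
})
```

A metric like this is only a screening tool: a failing ratio should trigger the human review and documentation steps listed above, not an automatic conclusion of discrimination.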

Conclusion: Privacy Is Culture, Not Just Compliance

The main takeaway is clear: privacy must be understood, lived, and implemented across the entire organization. Legal departments alone cannot shoulder this responsibility.

Excessive surveillance may even lead to mental health issues and harm employee well-being. Studies have shown that algorithmic management can erode autonomy, increase pressure, and contribute to workplace stress—creating a hostile, unsustainable environment.

Where Should You Start? Begin with a strong governance policy, the appointment of a qualified data protection lead, and a thorough vendor risk assessment; vendors lacking proper security measures can expose companies to serious risks.

ECOMPLY supports companies with the technology and knowledge to make GDPR compliance a daily, practical reality—ethical, efficient, and culturally integrated.

ECOMPLY is a unified platform that allows you to input data once for easy and repeated use. That way, you can focus your time on what really matters.

Request a demo and discover the advantages of an intelligent compliance operating system, trusted by over 2,000 companies—capable of managing all GDPR tasks, such as automated reports, data subject request channels, incident management, and much more.

ECOMPLY is a GDPR compliance management software that assists in building and maintaining compliance documentation. Check out our website or contact us for more information.

Hauke Holtkamp, CEO ECOMPLY GmbH