One of the main bones of contention among social partners negotiating labour regulation for platform workers is whether these workers have the right to know the algorithm used by digital platforms. While the unions argue in favour of recognising this right for workers, employers shield themselves behind the claim that the algorithm is a trade secret.
Even accepting that the algorithm is a trade secret, the fact is that workers’ right to know the logic used in algorithmic management and organisation is already legally recognised in Article 14.2(g) of the General Data Protection Regulation (GDPR), in conjunction with Article 22 thereof, which regulates automated decisions – understood as decisions made in a fully automated manner, without human intervention.
According to Article 22 GDPR, an individual has the right “not to be subject to a decision based solely on automated processing, including profiling”, which produces legal effects concerning him or her or similarly significantly affects him or her. However, fully automated decision-making is allowed when necessary for entering into, or performance of, a contract.
In the labour sphere, decisions such as the selection of personnel for employment, the allocation of tasks among workers or the resolution of a professional promotion process can be fully automated, despite having legal or similarly significant effects, on the grounds that automation is considered necessary for entering into, or performance of, the contract.
Digital platforms are among the first companies to have introduced automated decision-making to manage work activity. Using sophisticated algorithms, digital platforms distribute tasks and allocate preferred time slots in a fully automated manner. Some even dismiss workers automatically – by disconnecting them from the platform – when they fail to reach a minimum score or a minimum number of services.
Companies that make automated decisions – for example, digital platforms – must comply with the data protection obligations of the General Data Protection Regulation, namely the principles of lawfulness, fairness and transparency, accuracy, purpose limitation, and data minimisation in the processing of personal data (Article 5 GDPR).
Furthermore, people subject to automated decision-making with legal or similarly significant effects have a right to information (Article 14.2(g) GDPR). That is, individuals who are subject to automated decisions without any human intervention with legal or similarly significant effects – such as access to employment or remuneration, for example – have the right to know the logic applied in that automated decision-making. Specifically, they are entitled to know (i) the use made by the company of automated formulas for work management and organisation without human intervention, (ii) the logic used for this decision-making, and (iii) the significance and envisaged consequences of these automated decisions.
The right to obtain information on the logic used for automated decision-making cannot be equated with the right to know the algorithm itself. Firstly, the algorithm may constitute a trade secret, one that may determine the greater or lesser profitability of the business. And secondly, accessing the algorithm – or algorithms, as it is common to use several – would be counterproductive and confusing, insofar as it may consist of pages and pages of indecipherable code.
The right to obtain information on automated decisions, based on the fundamental principle of transparency that governs the protection of personal data, can be understood as the right to obtain clear and simple information on the functioning of the process of automated decision-making, as has been clarified by the Article 29 Working Party.
In the context of digital platforms, from my point of view, this right to information implies the right to obtain information on the variables or metrics used by the algorithm for the distribution of tasks, the allocation of time slots, remuneration or disconnection from the platform; the right to know their weighting or relative importance within the equation; and lastly, the right to know the consequences that might ensue from not reaching these metrics or standards.
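To see how modest this disclosure obligation really is, consider a minimal sketch of the kind of scoring logic a platform might use. Everything here is a hypothetical illustration – the metric names, weights and threshold are invented, not any real platform's algorithm – but it shows that the three elements above (the metrics, their weighting, and the consequences of a low score) can be explained without handing over a single line of proprietary code:

```python
# Hypothetical metrics and weights a platform might use to score workers
# for time-slot allocation. All names and numbers are illustrative only.
METRIC_WEIGHTS = {
    "acceptance_rate": 0.5,    # share of offered tasks the worker accepts
    "customer_rating": 0.3,    # average rating, normalised to 0-1
    "peak_hour_activity": 0.2, # share of work done in high-demand slots
}

# Hypothetical cut-off below which access to the platform may be restricted.
DISCONNECTION_THRESHOLD = 0.4


def worker_score(metrics: dict) -> float:
    """Weighted sum of the worker's normalised metrics (each in 0-1)."""
    return sum(weight * metrics.get(name, 0.0)
               for name, weight in METRIC_WEIGHTS.items())


def envisaged_consequence(score: float) -> str:
    """Plain-language statement of what the score entails for the worker."""
    if score < DISCONNECTION_THRESHOLD:
        return "risk of disconnection from the platform"
    return "eligible for preferred time slots"


example = {"acceptance_rate": 0.9, "customer_rating": 0.8,
           "peak_hour_activity": 0.5}
score = worker_score(example)
print(round(score, 2), "->", envisaged_consequence(score))
```

Disclosing a table of this shape – which variables count, how much each weighs, and what happens below the threshold – satisfies the transparency the GDPR demands while leaving the underlying code, and any trade secret it embodies, untouched.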
Article developed in the context of the LABORAlgorithm project, funded by: FEDER/Ministerio de Ciencia, Innovación y Universidades – Agencia Estatal de Investigación/Proyecto PGC2018-100918-A-100