Generative AI
Foundation models and general purpose AI systems
“‘general-purpose AI model’ means an AI model, including when trained with a large amount of data using self-supervision at scale, that displays significant generality and is capable of competently performing a wide range of distinct tasks regardless of the way the model is placed on the market and that can be integrated into a variety of downstream systems or applications. This does not cover AI models that are used before release on the market for research, development and prototyping activities” (EU AI Act Art. 3, par. 1, n. 63)
→ AI developers must follow general rules (called “horizontal compliance obligations”) when building general-purpose AI models, which the European Parliament refers to as “foundation models”. They must clearly state whether they used copyrighted material (such as books, articles, or images) when training their model. This requirement comes from Recital 105 of the EU AI Act
General obligations for GenAI
- Technical documentation of the model
- Documentation for downstream providers
- Policy to comply with copyright law
- Summary of the training content
- Appointing a representative in the EU
Art. 51-55 Systemic Risk
- Model evaluation → assess and mitigate systemic risks → incident reporting → cybersecurity measures
- Classification based on the risk posed by the GPAI model
- Considered “systemic risk” if:
- has high impact capabilities assessed on the basis of appropriate technical tools
- a general-purpose AI model has capabilities or impact equivalent to those referred to above
- the classification will initially depend on a quantitative threshold measured in Floating Point Operations (FLOPs)
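The quantitative threshold above can be sketched in a few lines of code. Art. 51(2) of the AI Act presumes a GPAI model has high-impact capabilities when the cumulative compute used for its training exceeds 10^25 FLOPs; the function and variable names below are illustrative, not terms from the Act.

```python
# Presumption of "systemic risk" for GPAI models based on the
# quantitative FLOPs threshold in Art. 51(2) AI Act (10^25 FLOPs).
# Names are illustrative; this is a sketch, not legal advice.

SYSTEMIC_RISK_FLOPS_THRESHOLD = 1e25  # cumulative training compute, Art. 51(2)

def presumed_systemic_risk(training_flops: float) -> bool:
    """True if cumulative training compute exceeds the Art. 51(2) threshold."""
    return training_flops > SYSTEMIC_RISK_FLOPS_THRESHOLD

print(presumed_systemic_risk(2e25))  # frontier-scale run → True
print(presumed_systemic_risk(1e23))  # smaller model → False
```

Note that this presumption is only the initial classification step: the Commission may also designate a model as systemic-risk on the basis of other criteria.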
Biometric systems
Regulation of biometric identification systems
→ Biometric data involves physical or behavioural characteristics that uniquely identify individuals, such as facial images, fingerprints, iris scans, …
→ to count as biometric data, the data must meet certain criteria, mainly about what it is, not how it is used
→ biometric identification systems usually fall into the category of deep learning systems. They work through a process of detection, alignment (normalisation), feature extraction and template matching
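The final step of that pipeline, template matching, can be sketched as comparing a probe feature vector against stored reference templates. Everything below is a toy illustration: real systems extract the feature vectors with deep networks, and all names, thresholds and data here are hypothetical.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def identify(probe, reference_db, threshold=0.9):
    """Template matching: return the best-matching identity from the
    reference database, or None if no template exceeds the threshold."""
    best_id, best_score = None, threshold
    for identity, template in reference_db.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id

# Hypothetical reference database of stored templates
db = {"alice": [0.9, 0.1, 0.3], "bob": [0.1, 0.8, 0.5]}
print(identify([0.88, 0.12, 0.31], db))  # close to alice's template
```

The comparison against a reference database is exactly what Rec. 15 (quoted below) describes as “biometric identification”; when no match clears the threshold, the system reports no identity.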
→ Protected by:
- GDPR: biometric data as a “special category” (Art. 9)
- LED: provides guidelines for using biometrics in crime prevention
- AI Act: introduces classification for high-risk applications of biometrics, especially real-time facial recognition in public spaces
- Rec. 15: “biometric identification” should be defined as the automated recognition of physical, physiological and behavioural human features such as the face, eye movement, body shape, … for the purpose of establishing an individual’s identity by comparing biometric data of that individual to stored biometric data of individuals in a reference database