Face Recognition Algorithm

Top 5 Most Agreed-Upon Display Items

  1. Name
    • Mentioned directly or implied in almost every response
    • Viewed as the basic, minimal identifier
  2. Age / Estimated Age Group
    • Frequently paired with name
    • Often linked to visual observation or government records
  3. Gender
    • Inferred visually or from prior records
    • Commonly mentioned, though viewed as more ethically sensitive than the other items
  4. Job Title / Role / Organization
    • Especially in professional or school contexts
    • Students often saw this as relevant contextual info
  5. Emotions / Facial Expressions
    • Many noted emotional state, affective prosody, or facial dynamics
    • Tied to human-AI interaction, though ethically debated
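
Taken together, the five items above could be modeled as a simple record in which every field is optional, so that "display nothing" is the default. The sketch below is purely illustrative; the class and field names are assumptions, not anything respondents proposed.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DisplayProfile:
    """Illustrative record of the five most agreed-upon display items.

    Every field defaults to None: a value is populated (and shown) only
    if the person has explicitly consented to that specific item.
    """
    name: Optional[str] = None        # 1. basic, minimal identifier
    age_group: Optional[str] = None   # 2. e.g. "20-29", estimated or on record
    gender: Optional[str] = None      # 3. inferred or from prior records
    role: Optional[str] = None        # 4. job title / role / organization
    emotion: Optional[str] = None     # 5. inferred emotional state

    def visible_items(self) -> dict:
        """Return only the fields that actually hold a value."""
        return {k: v for k, v in self.__dict__.items() if v is not None}

# Example: only consented-to fields are ever set, so only they are shown.
print(DisplayProfile(name="J. Doe", role="Teacher").visible_items())
# -> {'name': 'J. Doe', 'role': 'Teacher'}
```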

🔐 Ethical and Privacy Concerns Expressed

🟠 1. Privacy and Consent

  • Lack of user consent was a major concern: individuals should have full control over what data is displayed.
  • Some noted that even basic identifiers like a name can lead to deeper online exposure.
  • The display of personal data without consent (e.g., age, job, address, political preference) was seen as intrusive.
  • Several suggested that users should opt in to having their info displayed and be clearly informed beforehand (a minimal sketch of such a check follows).
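
A minimal sketch of what an opt-in check could look like, assuming a hypothetical `consent_registry` that records which fields each person has explicitly allowed; anyone not in the registry gets nothing displayed:

```python
# Hypothetical per-person consent registry: person_id -> fields opted in to.
consent_registry: dict[str, set[str]] = {
    "person-001": {"name", "role"},  # opted in to name and role only
    "person-002": set(),             # opted in to nothing
}

def displayable_fields(person_id: str, available: dict[str, str]) -> dict[str, str]:
    """Return only the fields this person has explicitly opted in to display.

    Unknown persons default to an empty set, i.e. display nothing.
    """
    allowed = consent_registry.get(person_id, set())
    return {k: v for k, v in available.items() if k in allowed}

# Age is known to the system but was never consented to, so it is filtered out.
print(displayable_fields("person-001", {"name": "J. Doe", "age": "25", "role": "Student"}))
# -> {'name': 'J. Doe', 'role': 'Student'}
```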

🛑 2. Data Misuse and Security

  • Students feared data leaks, unauthorized access, and the risk of info falling into the “wrong hands.”
  • Concerns about deepfakes, impersonation, and hacking of biometric data (e.g., face scans) were frequent.
  • Questions were raised about how securely information is stored, especially when cloud-based.
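
One standard mitigation for the storage worry is encrypting biometric templates at rest, so that a leaked database does not expose raw face data. The sketch below uses the Fernet recipe from the `cryptography` package as one possible approach; key management (where `key` actually lives) is deliberately omitted and is the genuinely hard part in practice.

```python
from cryptography.fernet import Fernet

# In practice the key must live in a key-management service, never beside the data.
key = Fernet.generate_key()
cipher = Fernet(key)

def store_template(face_embedding: bytes) -> bytes:
    """Encrypt a face template before writing it to (possibly cloud) storage."""
    return cipher.encrypt(face_embedding)

def load_template(ciphertext: bytes) -> bytes:
    """Decrypt a stored template; raises InvalidToken if the data was tampered with."""
    return cipher.decrypt(ciphertext)

token = store_template(b"\x01\x02\x03")  # placeholder embedding bytes
assert load_template(token) == b"\x01\x02\x03"
```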

⚖️ 3. Discrimination, Bias, and Social Stigma

  • Some worried about profiling and bias, especially related to ethnicity, political beliefs, or educational background.
  • Displayed data could lead to unemployment or social discrimination (e.g., someone being rejected due to political views).
  • A few warned of hegemonic control or surveillance-state risks, citing China as an example.

🧍 4. Loss of Anonymity and Autonomy

  • Multiple students emphasized the right to remain anonymous in public spaces.
  • The idea that a person should choose how and when they introduce themselves was frequently mentioned.
  • There was unease about “always-on” identification, which blurs public/private boundaries.

🏫 5. Context and Purpose Limitations

  • Many said info should only be shown if relevant to the specific context (e.g., in schools, only for teachers).
  • The concept of “fit-for-purpose” data was seen as important: only display what is truly needed (sketched after this list).
  • School-related data, if used, should remain strictly internal and controlled.
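
The “fit-for-purpose” rule can be expressed as a per-context allowlist intersected with the person's consent choices, so a field is shown only when the context needs it and the person allowed it. The context names and field sets below are assumptions for illustration:

```python
# Hypothetical context allowlists; a real deployment would define these per policy.
CONTEXT_ALLOWLIST: dict[str, set[str]] = {
    "school_staff": {"name", "role"},  # e.g. visible to teachers, kept internal
    "public_space": set(),             # no identification in public by default
}

def fit_for_purpose(context: str, consented: set[str],
                    known: dict[str, str]) -> dict[str, str]:
    """Show a field only if the context needs it AND the person consented to it."""
    visible = CONTEXT_ALLOWLIST.get(context, set()) & consented
    return {k: v for k, v in known.items() if k in visible}

record = {"name": "J. Doe", "role": "Teacher", "age": "41"}
print(fit_for_purpose("school_staff", {"name", "role", "age"}, record))
# -> {'name': 'J. Doe', 'role': 'Teacher'}  (age is not needed in this context)
print(fit_for_purpose("public_space", {"name", "role", "age"}, record))
# -> {}  (nothing is fit-for-purpose in a public space)
```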

📣 6. Reliability of Displayed Info

  • Some students questioned the accuracy of inferred data, like emotion, height, or political stance (a threshold-based filter is sketched below).
  • Others noted that even publicly available information may be false or outdated, leading to reputational harm.
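
One way to act on this concern is to suppress inferred attributes whose model confidence is low and to label whatever remains as an estimate rather than a fact. A minimal sketch; the 0.9 threshold is an arbitrary assumption and would need per-attribute validation:

```python
CONFIDENCE_THRESHOLD = 0.9  # assumed cutoff; not derived from any real model

def filter_inferred(predictions: dict[str, tuple[str, float]]) -> dict[str, str]:
    """Drop low-confidence inferences and mark the survivors as estimates.

    `predictions` maps an attribute name to (predicted value, model confidence).
    """
    return {
        attr: f"{value} (estimated)"
        for attr, (value, confidence) in predictions.items()
        if confidence >= CONFIDENCE_THRESHOLD
    }

print(filter_inferred({
    "emotion": ("neutral", 0.95),           # kept, but labeled as an estimate
    "political_stance": ("unknown", 0.40),  # dropped: far too uncertain to show
}))
# -> {'emotion': 'neutral (estimated)'}
```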