Fitting rooms in the cloud: privacy implications of VTO in retail

What matters

As retailers embrace AI to improve digital shopping, virtual try-on tech blends convenience with personalisation. Yet, in the race for seamless experiences, critical data protection concerns are often overlooked.

As retailers increasingly adopt artificial intelligence (“AI”) to enhance digital shopping journeys, virtual try-on (“VTO”) technology offers a compelling combination of convenience, personalisation, and engagement. Whether helping customers see how a lipstick shade suits their complexion or visualising furniture in their living room, VTO aims to reduce product returns and increase buyer confidence.

However, this innovation also raises complex data protection questions that are too often overlooked in the pursuit of frictionless consumer experiences. VTO solutions rely on processing a wide range of personal data, often including biometric data, which raises difficult compliance issues under the UK GDPR and related data protection laws.

How VTO technology works

Retailers are integrating VTO tools across a variety of product categories, including:

  • makeup
    Apps like ModiFace (acquired by L’Oréal) allow users to digitally apply makeup such as lipstick or foundation in real time. These platforms capture facial features using augmented reality (“AR”) and facial mapping technology to simulate product application.
  • skincare
    ModiFace and similar platforms can analyse uploaded selfies to assess skin conditions and recommend products. Some tools go further, offering personalised skincare routines and even supplement or lifestyle advice, based on “inferred” health data.
  • clothing
    VTO platforms for clothing use mobile device cameras and AR to generate 3D body scans. These scans are then processed using machine learning to match users with appropriate sizing across different brands based on body shape and preferences.

These technologies enable rich, personalised consumer experiences, while also generating valuable data for retailers’ marketing, analytics, and product development teams.
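
To make the mechanics above concrete, the sketch below shows how a browser-based VTO feature might drive the capture-and-mapping loop. It is a minimal illustration only: `detectFaceLandmarks` and `renderProductOverlay` are hypothetical stand-ins for a vendor SDK or on-device model, not real library calls; only the camera APIs (`getUserMedia`, `createImageBitmap`) are standard browser features.

```typescript
// Minimal sketch of a browser-based VTO capture-and-mapping loop.
// detectFaceLandmarks and renderProductOverlay are hypothetical
// stand-ins for a vendor SDK or on-device model.

type Landmark = { x: number; y: number };

declare function detectFaceLandmarks(frame: ImageBitmap): Promise<Landmark[]>;
declare function renderProductOverlay(landmarks: Landmark[]): void;

async function startTryOnPreview(video: HTMLVideoElement): Promise<void> {
  // getUserMedia is the standard browser camera API; calling it triggers
  // the browser's own permission prompt (distinct from GDPR consent).
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: "user" },
  });
  video.srcObject = stream;
  await video.play();

  const processFrame = async (): Promise<void> => {
    // Capture the current frame and map facial geometry so the product
    // overlay (e.g. a lipstick shade) can be rendered in real time.
    const frame = await createImageBitmap(video);
    const landmarks = await detectFaceLandmarks(frame);
    renderProductOverlay(landmarks);
    frame.close();
    requestAnimationFrame(processFrame);
  };
  requestAnimationFrame(processFrame);
}
```

In this sketch the frames never leave the device; where a vendor instead uploads frames or scans to the cloud, the transfer, retention, and third-party issues discussed below come into play.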

What personal data is collected?

VTO solutions may collect and process the following categories of personal data:

  • biometric data: facial geometry, skin tone, and body dimensions captured by makeup, skincare, and clothing applications, which may qualify as special category data under Article 9 of the UK GDPR. This is particularly significant because, given the heightened risk such processing poses to the data subject, Article 9 prohibits the collection and processing of special category data unless the data subject’s explicit consent has been obtained in advance.
  • photographic and video data: user images may inadvertently capture sensitive information, such as religious attire, disability markers, or background elements disclosing political views or health status.
  • device and location data: IP address, GPS data, device model, and operating system details.
  • user-provided data: email address, age, gender, and preferences, particularly where account creation is required to receive personalised recommendations.
  • behavioural and interaction data: logs of user interactions, products tried, time spent on each item, and inferred purchase intent.
  • “inferred data”: AI-generated insights about a particular individual, on matters such as skin tone category, body size, or product fit suitability.
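
One practical way to keep these categories visible is to record them in a data inventory or record of processing. The TypeScript sketch below is illustrative only; the schema and field names are assumptions, not a prescribed format, and the lawful-basis entries are examples rather than legal conclusions.

```typescript
// Illustrative data-inventory entries for a VTO feature. The schema and
// the example values are assumptions for the sketch, not legal advice.

interface VtoDataCategory {
  name: string;                          // e.g. "facial geometry"
  specialCategory: boolean;              // Article 9 UK GDPR data?
  lawfulBasis: "explicit-consent" | "consent" | "legitimate-interests";
  retention: string;                     // e.g. "session only"
  sharedWithThirdParties: boolean;
}

const vtoInventory: VtoDataCategory[] = [
  { name: "facial geometry (biometric)", specialCategory: true,
    lawfulBasis: "explicit-consent", retention: "session only",
    sharedWithThirdParties: false },
  { name: "device and location data", specialCategory: false,
    lawfulBasis: "consent", retention: "90 days",
    sharedWithThirdParties: true },
  { name: "inferred data (e.g. skin tone category)", specialCategory: true,
    lawfulBasis: "explicit-consent", retention: "account lifespan",
    sharedWithThirdParties: false },
];
```

Tagging special category and inferred data in this way helps route those records to the stricter controls discussed below.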

Key compliance risks

  • transparency and privacy notices
    Many retailers fail to clearly communicate how VTO technologies work or what data they collect, and generic website privacy notices are often insufficient. Guidance from the ICO, the UK data protection regulator, stresses the need for clear, just-in-time notices and meaningful transparency for AI-enabled technologies, particularly where ‘special category’ or ‘inferred’ data is involved. It is therefore imperative that retailers have tailored privacy notices in place, provided to users at their first interaction with the VTO app or platform.
  • legal basis and consent
    Where biometric data is processed, explicit consent is typically required. Retailers must ensure consent is granular, freely given, and clearly distinguishable from broader terms of service; bundled or implied consent will not satisfy UK GDPR standards. Guidance on facial recognition from the CNIL, the French data protection regulator, confirms that biometric processing for commercial services generally requires explicit consent and must remain proportionate and necessary to the stated purpose. A sketch of what a granular consent gate might look like in practice follows this list.
  • scope and sensitivity of data
    VTO tools may process sensitive personal data, which requires enhanced protection. If such data is compromised, for example through a cyberattack, it could be used for identity theft or impersonation via facial recognition technologies. In May 2024, Outabox, an Australian company that implemented facial recognition technology in bars and clubs, suffered a substantial data breach that compromised large amounts of biometric data.
  • third-party access and data sharing
    VTO platforms often rely on third-party providers, who may use customer data for analytics, algorithm training, or advertising. This creates challenges around transparency, accountability, and data subject rights.
  • data retention and secondary use
    Retailers may retain user photos or scans for training AI models or marketing purposes. Without proper consents, disclosures, and retention limits, this practice poses legal and ethical risks, particularly where users have withdrawn consent or closed their accounts. Retailers’ systems must also be set up to remove such data automatically.
  • authentication and social logins
    Platforms that allow login via social media accounts may access a broader range of personal data than users realise, increasing privacy risk and complicating data governance.
  • algorithmic bias and fairness
    AI models trained on non-diverse datasets may produce inaccurate or discriminatory results, for example recommending unsuitable products for individuals with darker skin tones or less common body types. The ICO’s guidance “Explaining decisions made with AI”, produced with the Alan Turing Institute, urges businesses to mitigate bias and ensure fairness in AI outcomes.
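
As flagged under “legal basis and consent” above, explicit consent must be granular, purpose-specific, and auditable. The sketch below shows one way a front end might gate camera activation on a recorded consent; `showJustInTimeNotice` and `saveConsentRecord` are hypothetical helpers named for illustration, not calls from any real library.

```typescript
// Sketch of a granular, auditable consent gate for biometric processing.
// showJustInTimeNotice and saveConsentRecord are hypothetical helpers.

interface ConsentRecord {
  userId: string;
  purpose: "vto-facial-mapping";   // one record per distinct purpose
  noticeVersion: string;           // which privacy notice the user saw
  givenAt: string;                 // ISO 8601 timestamp
  withdrawn: boolean;
}

declare function showJustInTimeNotice(noticeVersion: string): Promise<boolean>;
declare function saveConsentRecord(record: ConsentRecord): Promise<void>;

async function requestBiometricConsent(userId: string): Promise<boolean> {
  // Present a contextual, unbundled notice: no pre-ticked boxes, and
  // kept separate from the general terms of service.
  const accepted = await showJustInTimeNotice("vto-notice-v2");
  if (!accepted) {
    return false; // the app should still work, minus the VTO feature
  }
  await saveConsentRecord({
    userId,
    purpose: "vto-facial-mapping",
    noticeVersion: "vto-notice-v2",
    givenAt: new Date().toISOString(),
    withdrawn: false,
  });
  return true; // only now may the camera or scanning feature be activated
}
```

Recording the notice version alongside the timestamp makes it possible to evidence exactly what the user agreed to, which also supports the transparency points above.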

Best practice recommendations

To align VTO deployments with data protection compliance obligations, retailers should adopt the following best practices:

  • privacy by design and default
    Conduct DPIAs (Data Protection Impact Assessments) before implementation, and embed privacy safeguards into the technology stack rather than bolting them on afterwards. The ICO’s AI and Data Protection Risk Toolkit offers useful risk-assessment resources.
  • informed, granular consent
    Avoid implied consent or pre-ticked boxes, provide clear opt-in choices before activating cameras or scanning features, and offer opt-out functionality without degrading the user experience.
  • tailored and just-in-time privacy notices
    Use contextual notices at the point of data capture, explaining: what data is collected (e.g., facial scans); why it is collected (e.g., product recommendations); the lawful basis (e.g., explicit consent); the retention period; any third-party involvement; and how users can exercise their rights.
  • data security and retention controls
    Use encryption, anonymisation, and real-time rendering, so that rendered data is encrypted in transit, available only temporarily rather than permanently stored, and automatically times out or is masked after viewing. Limit retention of biometric data to the duration of the session or the user account lifespan, and implement easy-to-use “delete my data” features in apps or accounts (a sketch of automated deletion follows).
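
As noted in the retention controls above, deletion should be automatic rather than dependent on manual housekeeping. The sketch below shows a periodic purge job under the assumption that biometric artefacts are stored with a capture timestamp; `listBiometricArtefacts` and `hardDelete` are hypothetical storage calls, and the 30-minute session lifetime is an example value.

```typescript
// Sketch of automated retention enforcement for biometric artefacts.
// listBiometricArtefacts and hardDelete are hypothetical storage calls.

interface BiometricArtefact {
  id: string;
  userId: string;
  capturedAt: number; // epoch milliseconds
}

declare function listBiometricArtefacts(): Promise<BiometricArtefact[]>;
declare function hardDelete(id: string): Promise<void>;

const SESSION_TTL_MS = 30 * 60 * 1000; // example: 30-minute session lifetime

// Run periodically (e.g. from a scheduler) so expiry never depends on a
// user remembering to ask for deletion.
async function purgeExpiredBiometricData(now = Date.now()): Promise<void> {
  for (const artefact of await listBiometricArtefacts()) {
    if (now - artefact.capturedAt > SESSION_TTL_MS) {
      await hardDelete(artefact.id); // irreversible removal, not a soft delete
    }
  }
}
```

A user-facing “delete my data” feature, and withdrawal of consent or account closure, can reuse the same `hardDelete` path so that removal is immediate rather than waiting for the scheduled purge.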

Conclusion

VTO technology presents significant commercial opportunities but also heightened data protection responsibilities. With the processing of biometric and other sensitive personal data, compliance with the UK GDPR is not optional. Retailers must ensure that VTO platforms are designed with privacy in mind, informed by data protection principles such as purpose limitation, data minimisation, and accountability.

Retailers that proactively implement strong data governance, user transparency, and ethical AI practices will not only mitigate regulatory and reputational risk but also build deeper consumer trust in an increasingly privacy-conscious marketplace.

Disclaimer

This information is for general information purposes only and does not constitute legal advice. It is recommended that specific professional advice is sought before acting on any of the information given. Please contact us for specific advice on your circumstances. © Shoosmiths LLP 2025.
