This Acceptable Use Policy (“AUP”) is part of the Terms of Service and explains what you can and cannot do with LiveSwap. We wrote it in plain language because these rules are non-negotiable: violating them puts real people at risk and will get your account suspended or terminated, with reports to law enforcement where required.
1. The core rule: consent
LiveSwap transforms a live video stream into one that depicts a different face. Before you upload any face image as a swap source, you must affirmatively confirm that you either own the image outright, or that every identifiable person depicted has given you clear, informed permission to use their likeness with the Service.
This consent gate is logged with your IP address, user agent, the exact text of the statement you accepted, and the version of this policy in effect at the time. We retain these records for evidentiary and audit purposes and may disclose them in response to lawful requests, takedown notices, or rights-holder claims.
2. Who can use the Service
- You must be at least 18 years old (or the age of majority in your jurisdiction).
- One human, one account. Sharing or reselling your account is not allowed.
- You are responsible for everything that happens through your account.
3. Prohibited uses
You may not use LiveSwap to create, stream, or distribute content that:
3.1 Impersonates real people without consent
- Depicts an identifiable real person without their clear, informed consent — including public figures, celebrities, journalists, colleagues, classmates, or anyone you can name.
- Is intended to deceive viewers into believing the depicted person said or did something they did not, in any context where that deception could cause harm (financial, reputational, electoral, political, social, or otherwise).
- Is used to bypass identity verification, KYC, biometric checks, age-verification systems, secure-video onboarding, or any authentication built around “face liveness.”
3.2 Sexualises real people or involves minors
- Sexual content depicting any real person without their explicit, documented consent. This includes nudification, non-consensual intimate imagery (“deepfake porn”), and pornography using a real person’s likeness.
- Any sexual content depicting, or appearing to depict, a minor — full stop, regardless of consent or claimed age. We report suspected CSAM to NCMEC and equivalent authorities.
3.3 Facilitates fraud, harassment, or other illegal acts
- Financial scams, romance scams, business email compromise (BEC), or any attempt to extract money or credentials by impersonating someone.
- Election interference, voter manipulation, or any content that falsely attributes statements or actions to candidates, officials, or election workers.
- Targeted harassment, doxxing, stalking, defamation, threats of violence, or content designed to intimidate or silence a specific person or group.
- Hate speech, incitement to violence, or content that promotes, glorifies, or advocates harm against people based on race, ethnicity, national origin, religion, sex, gender identity, sexual orientation, disability, or other protected characteristic.
- Anything that violates applicable law in your jurisdiction or ours.
3.4 Abuses the platform
- Reverse-engineering, scraping, or attempting to extract our models, weights, or proprietary code.
- Bypassing rate limits, quotas, safety filters, watermarking, consent gates, or any other technical control.
- Sharing access tokens, API keys, or session credentials, or using the Service to build a competing face-swap product.
- Submitting deliberately malformed inputs to probe for vulnerabilities outside of our published security disclosure process.
4. Disclosure when you publish
When you publish output that depicts a real human face — even your own — we strongly recommend you disclose that the video was AI-generated or AI-modified. Some jurisdictions and platforms require this disclosure by law or by policy; examples include the EU AI Act, statutes in several U.S. states, and the policies of major social platforms. You are responsible for complying with the rules that apply to you.
5. Reporting abuse
If you believe content created with LiveSwap is being used to impersonate, harass, defraud, or sexualise you or someone you represent, contact abuse@liveswap.io. Include enough detail (URLs, timestamps, descriptions) for us to investigate. We aim to acknowledge reports within one business day and to action verified reports as quickly as possible — typically within 72 hours, faster for imminent harm.
Rights-holders sending DMCA or equivalent takedown notices should email legal@liveswap.io.
6. Enforcement
We may, at our discretion and without prior notice, remove content, suspend or terminate accounts, withhold or refund payments, ban billing identifiers, and report illegal activity to law enforcement. We may share information with payment processors, hosting providers, and other platforms when required to investigate abuse or comply with legal process.
We do not issue refunds for accounts terminated for AUP violations unless required by law.
7. Changes
We may update this AUP from time to time. We will revise the “Last updated” date and bump the internal version stamp. Material changes will be communicated via email or in-app notice before they take effect. Continued use of the Service after the effective date constitutes acceptance of the updated rules.
8. Contact
- Abuse reports: abuse@liveswap.io
- Legal & takedowns: legal@liveswap.io
- Everything else: hello@liveswap.io