
Community Safety

Our checks, balances, and zero-tolerance policies.


Session is built on a foundation of trust. Because we facilitate 1-on-1 private VR instances, maintaining a safe, verified, and strictly adult userbase is our absolute highest priority.


Zero-Tolerance Policy

We do not offer warnings or temporary suspensions for severe violations of trust.

Any account found to be engaging in the following behaviors will be permanently and immediately banned from the Session platform:

  • Identity Spoofing: Pretending to be someone else, using someone else’s VRChat account, or misrepresenting your gender or age to manipulate the matchmaking system.
  • Underage Users: VRChat’s 18+ Verification system is strict, but it is possible for a user to borrow an older sibling’s or parent’s account. If an underage user is discovered operating an 18+ verified account on Session, they will be permanently banned.

Checks and Balances

Our moderation system relies on a combination of automated API checks and Community-Curated Moderation. We believe that the community itself is the best line of defense against bad actors.

1. The Aura System

Aura is the backbone of Session’s community moderation. It is a live reputation score that goes up when you have positive, healthy interactions, and drops rapidly if you act poorly.

  • After every single match, both users are prompted to anonymously rate the interaction.
  • If a user receives a Negative rating, their Aura takes a significant, sustained hit.
  • If a user’s Aura drops below the community threshold, they are automatically shadow-banned and placed into an isolated matchmaking pool with other low-Aura users, keeping the main ecosystem clean.
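The rating-to-shadow-ban flow above can be sketched roughly as follows. This is a minimal illustration, not Session's actual implementation: the threshold, gain, and penalty values are invented for the example, and the real calculation is described separately in the Aura documentation.

```python
# Hypothetical sketch of the Aura flow. All numeric tuning values
# (threshold, gain, penalty) are illustrative assumptions.

COMMUNITY_THRESHOLD = 40   # assumed cutoff below which a user is shadow-banned
POSITIVE_GAIN = 2          # assumed small gain per positive rating
NEGATIVE_PENALTY = 25      # assumed large, sustained penalty per negative rating

class User:
    def __init__(self, name, aura=100):
        self.name = name
        self.aura = aura
        self.shadow_banned = False

def rate_match(user, rating):
    """Apply an anonymous post-match rating, then re-check the threshold."""
    if rating == "positive":
        user.aura = min(100, user.aura + POSITIVE_GAIN)
    elif rating == "negative":
        user.aura = max(0, user.aura - NEGATIVE_PENALTY)
    # Dropping below the community threshold moves the user out of
    # the main ecosystem automatically.
    user.shadow_banned = user.aura < COMMUNITY_THRESHOLD

def matchmaking_pools(users):
    """Split users so low-Aura accounts only ever match each other."""
    main = [u for u in users if not u.shadow_banned]
    isolated = [u for u in users if u.shadow_banned]
    return main, isolated
```

The key design point is that the shadow-ban is automatic and reversible in principle: it is driven entirely by the live score, with no moderator action required.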

Learn more about how Aura is calculated.

2. Immediate Match Reporting

If you are matched with someone who is violating our zero-tolerance policies (e.g., they sound like a minor, they are harassing you, or they are clearly not who their profile says they are), you have the power to instantly terminate the match.

  1. Open the Session mobile app.
  2. Select the active match.
  3. Tap Report User.

  4. Select the specific reason for the report (e.g., “Underage User”).

[!IMPORTANT] Accounts that receive direct behavioral reports are flagged for immediate manual review by the Session moderation team. If the report of severe misconduct (such as age misrepresentation) is verified via logs or community consensus, the offending account will be removed from the platform.
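The report flow described above can be sketched as a simple handler. This is a hypothetical illustration only: the function name, reason labels, and ticket fields are assumptions for the example, not Session's real API.

```python
# Hypothetical sketch of the zero-tolerance report flow.
# Reason labels and ticket fields are illustrative assumptions.

ZERO_TOLERANCE_REASONS = {"underage_user", "identity_spoofing"}

def report_user(match, reason):
    """Instantly terminate the match and flag the account for manual review."""
    match["active"] = False  # the match is terminated immediately
    return {
        "reported": match["partner"],
        "reason": reason,
        "needs_manual_review": True,  # every direct report is reviewed by staff
        "zero_tolerance": reason in ZERO_TOLERANCE_REASONS,
    }
```

The design choice worth noting is that termination happens before review: the reporter is removed from the match immediately, and verification (via logs or community consensus) only gates the permanent removal of the reported account.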

We rely on you to help keep the network secure. If something feels wrong, report it immediately.


Off-Platform Activity & VRChat Moderation

Session serves purely as a matchmaking layer. Once a match is made, the resulting private instances are generated and hosted entirely on VRChat’s servers.

Session accounts are derived directly from each user’s VRChat account, and because we do not own or operate VRChat’s systems, we have no central moderation over what happens in-game:

  • No Central Instance Moderation: Session has no visibility into, monitoring of, or moderation control over what occurs inside these private VRChat instances.
  • User Responsibility: We do not monitor or condone any specific activities performed in these private spaces. Everything that happens off the Session platform is the responsibility of the users and VRChat.
  • Reporting In-Game Issues: If you encounter severe issues, harassment, or safety violations while inside the VRChat instance, you must report those users directly using VRChat’s in-game reporting tools or by contacting VRChat Support.

While you should always use Session’s Aura rating system to report bad actors on our platform and prevent future matches, any in-game intervention must be handled by VRChat.


Get Verified

If you need to verify your age to use Session or confirm your current Verification Marker settings, please visit the official VRChat Verification Portal.