Most teams that struggle with VR app development don't fail because the technology is hard. They fail because they treat a VR app like a mobile app with a headset attached — and discover the difference only when the frame rate collapses, the store rejects the build, or users report nausea after two minutes.
We've shipped VR development projects across museum installations, enterprise training platforms, consumer apps on the Meta Quest App Store, and live event environments. The decisions that actually determine whether a project ships on time — and works when it does — are rarely the ones covered in beginner tutorials. This post covers them.
Platform and Engine Selection: The Decision That Locks In Everything Else
The first architectural decision in any VR app development project is also the most consequential. Platform and engine choice cascades through performance targets, asset pipelines, SDK dependencies, and submission requirements. Changing course mid-project is expensive. Getting it wrong at the start is more expensive.
Meta held 72.2% of the XR market in 2025[^1]. For most studios, that market reality makes Meta Quest the default starting point — and it's a reasonable default. But "shipping on Quest" is not a single specification. Quest hardware runs a mobile Snapdragon processor, not a desktop GPU. Every performance decision you make — texture compression, polygon counts, draw calls, lighting approach — is governed by that constraint.
For VR development on Unity versus Unreal, the calculus specific to Quest is clear. Unity holds a 25.13% market share in game development overall versus Unreal's 14.99%[^2], and its advantage on mobile VR is more pronounced than those aggregate numbers suggest. Unity's lightweight architecture, mature XR Plugin Management framework, and ASTC texture compression pipeline give it practical advantages on standalone hardware that Unreal's superior visual tooling doesn't offset. We build primarily in Unity for Quest work — across Immersive Exposure, Iman VR, Empathy Lab, and NBK Virtugate — because it gives us the most direct control over mobile performance budgets.
Unreal earns its place on high-end PC VR and enterprise simulation workstations where the GPU can absorb its overhead. For standalone mobile VR, it introduces file size, memory, and build complexity that fights the hardware rather than working with it.
Scene Architecture: Decisions You Can't Undo Later
Scene architecture in VR is not just an organizational preference — it's a performance contract. The choices you make about tracking origins, spatial anchors, and rendering pipelines early in development determine whether the application can be optimized later or requires a rebuild.
Meta's spatial computing model distinguishes between Stage-based tracking (positioning content relative to a defined physical boundary) and Room-scale tracking. For mixed reality work, using anchors to position virtual elements is best practice — not simply defaulting to Stage positioning[^3]. Applications that ignore this distinction produce virtual content that drifts relative to physical surfaces, which breaks immersion immediately and visibly.
For Iman VR — an immersive VR journey through the life of the Prophet Muhammad, built for the International Fair and Museum of the Prophet's Biography — scene architecture had to serve two competing requirements simultaneously: historically accurate environmental reconstruction at a level appropriate for a museum audience, and real-time performance on VR hardware in a live installation environment. Those requirements don't naturally coexist. Historical accuracy pushes toward high-polygon geometry and high-resolution textures. Real-time VR performance pushes hard in the opposite direction.
The resolution was establishing asset budgets before modeling began — not after. Every environment had a polygon ceiling, every texture had a resolution limit, and every scene had a draw call budget. These constraints were set by the target hardware's performance envelope, not by artistic preference. That discipline is what made the project deliverable. You can read more about the lessons from that build in our museum and enterprise VR development breakdown.
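What budget-first discipline can look like in tooling is sketched below: a scene audit that compares measured stats against fixed ceilings. The specific numbers and field names are hypothetical illustrations, not Iman VR's actual budgets; the point is that the ceilings exist, in writing, before modeling starts.

```python
# Illustrative budget-first audit. The ceilings are hypothetical examples,
# not Iman VR's actual figures; the point is that they exist before modeling.
SCENE_BUDGETS = {
    "max_triangles": 300_000,  # polygon ceiling per environment
    "max_texture_px": 2048,    # resolution limit per texture edge
    "max_draw_calls": 150,     # draw call budget per scene
}

def audit_scene(stats: dict) -> list[str]:
    """Return every budget violation for a scene's measured stats."""
    violations = []
    if stats["triangles"] > SCENE_BUDGETS["max_triangles"]:
        violations.append(f"triangles {stats['triangles']} over ceiling")
    if stats["largest_texture_px"] > SCENE_BUDGETS["max_texture_px"]:
        violations.append(f"texture edge {stats['largest_texture_px']}px over limit")
    if stats["draw_calls"] > SCENE_BUDGETS["max_draw_calls"]:
        violations.append(f"draw calls {stats['draw_calls']} over budget")
    return violations

# A scene that blows its draw call budget fails the audit before art review:
print(audit_scene({"triangles": 280_000, "largest_texture_px": 2048, "draw_calls": 210}))
```

Wiring a check like this into the build pipeline turns budget violations into build failures rather than late-stage optimization debates.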
The Scene Mesh — the 3D reconstruction of the physical environment captured by device sensors — offers tempting shortcuts for environmental interaction. Meta's own guidance is instructive here: use Scene Mesh for fast collision and obstacle avoidance, such as bouncing physics objects, but not for resting content or complex environmental interactions where mesh artifacts become visible[^3]. Developers who use Scene Mesh as a general-purpose environmental solution discover its coverage limitations only when users encounter visible artifacts that break immersion at the worst possible moment.
Performance Budgets: The Numbers That Govern Everything
Ninety frames per second is the minimum threshold for comfortable VR on Meta Quest 2 hardware running at its 90 Hz refresh rate. This is not a guideline — it's a physiological constraint. Frame rates below the display's refresh rate introduce judder that directly triggers motion sickness in a significant portion of users. The refresh rate spectrum now spans 72 Hz entry points through 120 Hz high-performance modes[^4], and applications must be tested across that spectrum, not optimized for a single assumed target.
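The arithmetic behind these targets is worth keeping in view: each refresh rate fixes the total time window the CPU and GPU pipeline gets per frame, and everything the application does must fit inside it. A quick sketch:

```python
# Each refresh rate fixes the time window the whole CPU + GPU pipeline gets
# per frame; missing it even occasionally produces visible judder.
def frame_budget_ms(hz: int) -> float:
    return 1000.0 / hz

for hz in (72, 90, 120):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.2f} ms per frame")
# 90 Hz leaves ~11.1 ms; 120 Hz tightens that to ~8.3 ms.
```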
Motion reprojection — Application SpaceWarp (AppSW) on Quest hardware — synthesizes frames when the GPU falls short of the native refresh rate. It can mask performance problems in demos. It should not be treated as a performance strategy. Reprojection introduces latency, produces visual artifacts on fast-moving objects, and can trigger motion sickness in users sensitive to extrapolation errors. The target is native frame rate, consistently.
Foveated rendering is the most impactful single optimization available on Quest hardware. Rendering the peripheral region at lower resolution than the foveal center is perceptually transparent to users — human vision concentrates acuity where the eye is directed, and peripheral resolution loss is not consciously perceived. The performance saving is material: applying stencil optimization saves approximately 21% of pixels shaded, and using a 70% inset saves up to 30%[^5]. On Immersive Exposure, this optimization was non-negotiable for maintaining stable 90 FPS on standalone hardware while rendering complex interactive 3D environments.
The implementation caveat matters: foveated rendering requires rendering scenes four times — two views per eye, one for the inset and one for the outset[^5]. Without the multiview extension, this doubles rendering overhead and eliminates the performance benefit. It also only helps GPU-bound applications. If your application is CPU-bound — limited by draw calls, script execution, or physics — foveated rendering does nothing. Diagnose the bottleneck before applying the optimization.
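A back-of-envelope model (our simplification for illustration, not Meta's published formula) shows where savings of this magnitude come from: full resolution inside the inset, reduced resolution outside it.

```python
# Back-of-envelope model (a simplification for illustration, not Meta's
# published formula) of pixel work under inset-based foveation.
def pixels_shaded_fraction(inset: float, peripheral_scale: float) -> float:
    """Fraction of full-resolution pixel work remaining, for a square inset
    covering `inset` of each axis and a periphery rendered at
    `peripheral_scale` resolution per axis."""
    inset_area = inset ** 2
    periphery_area = 1.0 - inset_area
    return inset_area + periphery_area * peripheral_scale ** 2

# A 70% inset with the periphery at half resolution per axis:
remaining = pixels_shaded_fraction(0.70, 0.5)
print(f"{1 - remaining:.0%} of pixel work saved")  # ~38% under these assumptions
```

The exact figure depends on the peripheral resolution scale and stencil shape; Meta's published benchmarks are the numbers to plan against.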
Texture compression is the other high-leverage optimization for Quest. ASTC (Adaptive Scalable Texture Compression) is the right choice for mobile VR. The alternative, ETC compression, leaves textures with higher memory footprints and lower visual quality[^6]. ASTC's configurable block sizes — starting at 5x5 or 6x6 for high-detail textures and scaling up for distant surfaces — give developers direct control over the quality-to-compression tradeoff matched to visual importance. Textures that consume significant screen space get higher-quality compression; textures on distant geometry get more aggressive compression. This is not a one-size-fits-all setting.
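The footprint math is straightforward because ASTC encodes every block in 128 bits regardless of block dimensions, so larger blocks simply spread those bits over more pixels. A sketch of the tradeoff for a 2048x2048 texture:

```python
import math

# ASTC encodes every block in 128 bits (16 bytes) regardless of block size,
# so larger blocks mean fewer bits per pixel at lower visual quality.
def astc_bytes(width: int, height: int, block: int) -> int:
    """Compressed size in bytes of a single-mip texture at a given ASTC block size."""
    return math.ceil(width / block) * math.ceil(height / block) * 16

for block in (4, 6, 8, 12):
    size = astc_bytes(2048, 2048, block)
    bpp = size * 8 / (2048 * 2048)
    print(f"{block}x{block}: {size / 2**20:.2f} MiB ({bpp:.2f} bpp)")
# Uncompressed RGBA8 at 2048x2048 is 16 MiB (32 bpp).
```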
Level of Detail systems resolve the tension between artistic quality and polygon budgets by showing lower-polygon mesh versions as camera distance increases[^6]. In complex VR scenes where users can see across significant distances, LOD systems keep vertex counts manageable and prevent micro-triangle artifacts — the visual degradation that occurs when distant geometry becomes smaller than a screen pixel.
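A minimal sketch of the selection logic is below. The thresholds and triangle counts are invented for illustration, and Unity's own LODGroup component switches on screen-relative height rather than raw distance — but the budgeting logic is the same.

```python
# Illustrative distance-based LOD table; thresholds and triangle counts are
# invented for the sketch. (Unity's LODGroup actually switches on
# screen-relative height, but the budgeting logic is the same.)
LOD_LEVELS = [            # (max_distance_m, triangles)
    (10.0, 50_000),       # LOD0: full-detail mesh up close
    (30.0, 12_000),       # LOD1
    (80.0, 3_000),        # LOD2
    (float("inf"), 400),  # LOD3: imposter-level mesh for the horizon
]

def select_lod(distance_m: float) -> int:
    """Index of the cheapest mesh whose distance band contains the camera."""
    for level, (max_dist, _) in enumerate(LOD_LEVELS):
        if distance_m <= max_dist:
            return level
    return len(LOD_LEVELS) - 1

print(select_lod(5.0), select_lod(50.0), select_lod(200.0))  # 0 2 3
```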
Building for Immersive Exposure: What the Meta Quest App Store Actually Requires
Shipping Immersive Exposure — an interactive VR education platform with 3D photography lessons and a virtual community room, released on the Meta Quest App Store — required navigating Meta's submission process with the same rigor applied to the technical build. The client's feedback was direct: "Released early on the Meta Quest app store, meeting expectations. Responds quickly and follows up promptly." That outcome required planning the submission timeline as carefully as the development sprint schedule.
Meta requires a minimum of two weeks' lead time before a target launch date[^7]. That timeline is not negotiable, and it runs sequentially across three review phases: technical review against VRC guidelines, content review for polish and policy compliance, and publishing review for release logistics. Each phase can surface issues requiring correction — and each correction resets the clock on that phase.
The most reliable predictor of rejection is not technical failure. Most rejections stem from issues that have nothing to do with application quality:
- Over-requested permissions. Developers request every available permission as a precaution. Meta reviewers interpret this as unnecessary data access and reject the build[^8]. Request only permissions the application genuinely uses, with explicit justification for each.
- Inadequate screencasts. Meta reviewers do not independently explore applications; the screencast is their primary reference. Screencasts that lack narration, show mock screens rather than actual functionality, or fail to walk through core user journeys are rejected at high frequency[^8]. Treat the screencast as a structured product demonstration, not a recording of someone using the app.
- Privacy policy failures. Meta checks that privacy policies load instantly, display business name and contact information clearly, explain data usage explicitly, and match permission requests in scope[^8]. A slow-loading policy page triggers rejection regardless of content quality.
- Dashboard misconfiguration. Wrong app type selection, missing test user accounts, app not switched to Live mode, and required products not enabled in Meta infrastructure all generate instant rejections[^8]. These are entirely preventable with a pre-submission verification pass.
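Because these failures are mechanical, they can be caught mechanically. The sketch below shows what a pre-submission verification pass might look like as a script; the check names mirror the rejection patterns above, and the configuration dictionary is a hypothetical stand-in for values you would read off the Meta developer dashboard by hand — there is no real API assumed here.

```python
# Sketch of a mechanical pre-submission verification pass. Check names mirror
# the rejection patterns above; the cfg dict is a hypothetical stand-in for
# values read manually from the Meta developer dashboard (not a real API).
CHECKS = {
    "permissions_justified": lambda cfg: all(
        cfg["permission_justifications"].get(p) for p in cfg["requested_permissions"]
    ),
    "screencast_adequate": lambda cfg: cfg["screencast"]["narrated"]
    and cfg["screencast"]["shows_core_journeys"],
    "privacy_policy_fast": lambda cfg: cfg["privacy_policy_load_ms"] < 2000,
    "dashboard_ready": lambda cfg: cfg["app_mode"] == "live"
    and cfg["test_users_created"],
}

def preflight(cfg: dict) -> list[str]:
    """Return the names of every failed check; an empty list means submit."""
    return [name for name, check in CHECKS.items() if not check(cfg)]

cfg = {
    "requested_permissions": ["MICROPHONE"],
    "permission_justifications": {"MICROPHONE": "voice chat in the community room"},
    "screencast": {"narrated": True, "shows_core_journeys": True},
    "privacy_policy_load_ms": 800,
    "app_mode": "development",  # forgot to flip to Live mode
    "test_users_created": True,
}
print(preflight(cfg))  # -> ['dashboard_ready']
```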
Our detailed walkthrough of the full submission process is available in the guide to publishing a VR app on the Meta Quest Store.
Motion Sickness and User Comfort: Test Continuously, Not at the End
Motion sickness is not a polish issue. It is a fundamental design and engineering constraint that must be tested continuously throughout development, not addressed as a QA item in the final sprint.
Frame drops are the primary trigger. Any substantial drop below the target refresh rate creates perceptible judder that exacerbates nausea in susceptible users. But frame rate stability is not the only variable. Locomotion design — how users move through virtual space — has an equally significant impact. Artificial locomotion (moving the camera without corresponding physical movement) is more likely to induce discomfort than teleportation or stationary interaction models. For applications requiring artificial locomotion, comfort settings including vignetting during movement and reduced field-of-view during acceleration are standard mitigations, not optional features.
New VR users typically need 5–15 minute initial sessions to build tolerance[^9]. Applications designed for extended sessions should include explicit comfort settings and session length guidance. This is not a weakness to hide — it is responsible product design.
The Economics: What Budgets Actually Look Like
Basic VR apps with limited interaction and single-platform deployment start at USD 20,000–50,000 and take 2–4 months[^10]. Standard applications with custom 3D environments and multi-platform support run USD 50,000–150,000 over 4–8 months. Advanced applications with complex networking and extensive custom content reach USD 150,000–500,000 or more, requiring 8–18 months. Enterprise training simulations — with system integration requirements and compliance validation — frequently exceed USD 1,000,000[^10].
Senior VR developers charge USD 80–150 per hour; junior developers USD 40–80 per hour[^10]. A moderate 6-month project with 4–6 specialists generates USD 200,000–400,000 in labor costs before hardware, licensing, and QA expenses.
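The arithmetic behind that range, under assumed staffing and blended rates (illustrative figures chosen to sit inside the hourly ranges above, at roughly 160 billable hours per person-month):

```python
# Rough labor-cost arithmetic behind the quoted range. Rates and headcounts
# are assumptions chosen to sit inside the hourly figures above.
def labor_cost(people: int, months: int, blended_rate_usd: float,
               hours_per_month: int = 160) -> float:
    """Total labor cost before hardware, licensing, and QA expenses."""
    return people * months * hours_per_month * blended_rate_usd

low = labor_cost(4, 6, 55)   # smaller, junior-weighted team
high = labor_cost(6, 6, 70)  # larger, senior-weighted team
print(f"USD {low:,.0f} to {high:,.0f}")  # USD 211,200 to 403,200
```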
On the revenue side: the Meta Horizon Store now has over 50,000 applications[^11], and over 100 games generated USD 1 million or more in gross revenue in 2025[^12]. But revenue concentrates sharply — the top 124 games account for approximately 85% of total store revenue[^12]. Premium sales remain the largest revenue driver; in-app purchases grew over 10% in 2025, and the number of IAP applications reaching USD 500,000+ increased 20%[^12]. Enterprise deployment models — where revenue comes from per-implementation contracts rather than storefront sales — offer substantially different economics and are worth serious consideration for training and simulation applications.
Pre-Shipping Checklist
Before submitting any VR application to the Meta Horizon Store, verify each of the following:
- [ ] Application maintains native refresh rate (90 FPS minimum for Quest 2) across all core user journeys without relying on reprojection
- [ ] Foveated rendering is implemented, the application is confirmed GPU-bound, and the resulting performance benefit is measurable
- [ ] All textures use ASTC compression with block sizes matched to visual importance
- [ ] LOD systems are applied to all scene geometry visible across distances
- [ ] Only genuinely required permissions are requested, with explicit justification documented
- [ ] Screencast is a structured walkthrough of core functionality with narration — not a passive recording
- [ ] Privacy policy loads instantly, lists business name and contact, and maps directly to permission requests
- [ ] App dashboard is fully configured: correct app type, test user accounts created, Live mode enabled
- [ ] Motion sickness testing has been conducted with representative users across the full user journey — not just the demo path
- [ ] Submission is planned at minimum two weeks before the target launch date
Related Reading
- VR Development — Hub
- How to Publish a VR App on the Meta Quest Store
- Custom VR Experience Development: Museum Lessons for Enterprise
- Unity vs Unreal for Enterprise VR Training
- Immersive Exposure — Project Case Study
- Iman VR — Project Case Study
If you're planning a VR application and want to work through platform selection, performance architecture, or submission strategy with a team that has shipped across Meta Quest, museum installations, and enterprise training environments, talk to us at VVS. We'll tell you what the build actually requires — before you're committed to a direction that's expensive to change.
[^1]: Precedence Research, VR Market Size & Forecast, 2025–2034
[^2]: Unity vs Unreal Engine market share, 2025 developer survey data
[^3]: Meta Developer Documentation — Scene Best Practices, 2025
[^4]: Display refresh rate spectrum for Meta Quest hardware, 2025
[^5]: Meta Developer Documentation — Foveated Rendering, pixel saving benchmarks
[^6]: Meta Developer Documentation — Texture Compression and LOD Best Practices
[^7]: Meta Horizon Store submission timeline requirements
[^8]: Meta Horizon Store rejection pattern analysis, practitioner documentation
[^9]: VR user comfort and session length guidelines, XR practitioner research
[^10]: VR app development cost benchmarks, 2025
[^11]: Meta Horizon Store ecosystem data, 2025
[^12]: Meta Horizon Store revenue distribution data, 2025