Will Roblox and Fortnite cause a reinterpretation of our reality?

The advancement of physics-based programming in games will make the metaverse a mind-bending universe that collapses the notion of reality in on itself. To understand the gravity of this statement, we need to consider how the relationship between technology, human perception, and the concept of reality has evolved over time. For years, we have defined reality as the tangible world of solid objects, verifiable facts, and measurable events. But as computational sophistication marches forward—particularly in gaming—we are poised to enter an era where simulated physics and immersive digital environments challenge our most fundamental assumptions.

In the very near future, this will force brands to license everything in two realities simultaneously. Imagine a world where your favorite brand’s clothing line doesn’t just exist on a physical rack in a department store but also in a fully realized, physics-accurate virtual environment accessible via headsets, screens, or holographic interfaces. These parallel licensing agreements, one for physical products and one for their digital counterparts, will reflect a world where consumers don’t just buy products—they buy experiences and identities that transcend the boundaries we currently take for granted.

As we approach this future, it’s becoming clear that Roblox and Fortnite will be leaders in the creation of these worlds, not, as we thought, companies like Meta and Google. The intuition has long been that tech giants with massive resources, research facilities, and global influence would naturally dominate the metaverse. Instead, the crown will belong to the platforms that have already cultivated massive user bases and fostered communities of creators who understand how to build interactive, physics-driven experiences. Roblox and Fortnite have proven that user-generated content, flexible economic systems, and community-driven evolution can outperform top-down corporate designs. Meta and Google, by contrast, might find themselves providing the underlying infrastructure and hardware—the computing power, the network bandwidth, and the augmented reality glasses—while ceding creative direction and cultural influence to the platforms that understand players and creators best.

In this dynamic, Meta and Google will serve primarily as hardware and foundational-technology providers, creating the essential building blocks—the processors, the displays, the input devices—that enable the metaverse to function. But the true artistry, the cultural relevance, and the social capital will lie in the hands of the game-based metaverse platforms, where physics-based programming is refined and innovated on a daily basis. By opening their platforms to endless customization, modification, and iteration, these game worlds become living laboratories for new forms of expression, commerce, and interaction.

Further, VR and AR will both play significant roles in how alternate realities blend into our everyday existence. Virtual Reality will provide the fully immersive, closed-loop experiences that transport us entirely into digital realms. In these worlds, physics-based programming can simulate every nuance of gravity, friction, wind resistance, and material density, allowing us to build and explore environments that feel shockingly authentic. Augmented Reality, on the other hand, will overlay these digital constructs onto our immediate surroundings. Suddenly, that coffee table in your living room can be home to a tiny, interactive marketplace of digital goods, or your city park can become a sprawling fantasy kingdom populated by mythical creatures visible only through AR glasses or your phone. The interplay between VR and AR will enrich our world with layers of meaning and existence that challenge any singular definition of “reality.”
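To make “physics-based programming” a bit more concrete, here is a minimal sketch in Python of the kind of update loop a game might run every frame to simulate gravity and friction. It is purely illustrative: every name and constant in it (Body, step, GRAVITY, FRICTION_COEFF, TIME_STEP) is a hypothetical example rather than anything drawn from Roblox, Fortnite, or any real engine.

```python
# Minimal, illustrative physics step: semi-implicit Euler integration of one
# object under gravity, with simple friction while it slides on the ground.
# All names and constants are hypothetical examples, not from a real engine.

from dataclasses import dataclass

GRAVITY = -9.81          # m/s^2, acceleration along the y-axis
FRICTION_COEFF = 0.8     # fraction of sliding speed kept per second on the ground
TIME_STEP = 1.0 / 60.0   # simulate at 60 updates per second


@dataclass
class Body:
    x: float = 0.0   # horizontal position (m)
    y: float = 10.0  # height above the ground (m)
    vx: float = 5.0  # horizontal velocity (m/s)
    vy: float = 0.0  # vertical velocity (m/s)


def step(body: Body, dt: float = TIME_STEP) -> None:
    """Advance the body by one simulation step of length dt."""
    # Gravity accelerates the body downward.
    body.vy += GRAVITY * dt
    # Semi-implicit Euler: positions are updated with the new velocity.
    body.x += body.vx * dt
    body.y += body.vy * dt
    # Crude ground collision: stop falling and apply friction while grounded.
    if body.y <= 0.0:
        body.y = 0.0
        body.vy = 0.0
        body.vx *= FRICTION_COEFF ** dt  # exponential decay of sliding speed


if __name__ == "__main__":
    ball = Body()
    for _ in range(180):  # simulate three seconds of motion
        step(ball)
    print(f"after 3s: x={ball.x:.2f} m, y={ball.y:.2f} m, vx={ball.vx:.2f} m/s")
```

Loops like this, run sixty or more times a second across thousands of objects, are what give game worlds their sense of weight and resistance; the creative platforms simply expose this machinery to millions of builders.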

The question of what counts as reality will blur the line between things we can physically touch and things that only our minds can touch. As physics-based simulations become more convincing, and as our AR and VR technologies improve, we will find ourselves questioning whether something needs to have mass or matter to be considered “real.” When we can see, hear, and even feel haptic feedback from digital objects, they will begin to occupy a peculiar ontological space. Are they less “real” because they are generated by software code rather than atoms? Or do they become their own category of reality—one based on our neural interpretations, emotional responses, and shared social consensus? As this happens, the boundary between the physically tangible and the mentally immersive will become increasingly fuzzy.

Our perceptions of those two worlds, and how we interact with other humans in them, will guide a notion of reality that transcends today’s definitions and erases traditional ideas around the firmament as the sole decider of who and what we are. Right now, we look to the physical universe as the ultimate reference frame for truth. We believe that the firmness of the ground beneath our feet, the solidity of a brick wall, or the certainty of physical boundaries defines what is “real.” But as the metaverse grows more intricate and as physics-based programming becomes more lifelike, these “traditional ideas around the firmament” begin to fracture. We may find ourselves placing equal value on digital experiences that feel genuine, authentic, and significant—experiences that allow us to form deep bonds, forge new identities, and engage in meaningful social, cultural, and economic activities. Over time, this could lead us to revise our metaphysical assumptions, acknowledging that reality may not be limited to the physical domain.

Our bodies are, if nothing else, the ultimate piece of hardware allowing our minds to interpret what we believe to be reality. In other words, the human body is the interface through which we download the world. Our senses take in information—light, sound, pressure, temperature—and our brains interpret these signals to form coherent mental models of “what is.” When we introduce digital layers that produce similar inputs to our senses, our brains don’t necessarily distinguish between what’s physical and what’s computational. We only know what we feel, see, and experience. As metaverse environments deliver increasingly realistic stimuli, our perception of reality will be shaped as much by physics-based programming as by the laws of nature.

The implications for brands, content creators, and everyday people are profound. Companies looking to stay relevant must anticipate a future where they negotiate licensing deals in parallel universes—both the physical and the digital. Content creators will pioneer new forms of storytelling, entertainment, and art that exist in a multi-layered reality. And everyday people will wake up to a world where their morning commute could be accompanied by interactive digital companions, where the coffee mug they pick up might simultaneously exist in a game world, and where the question “Is this real?” no longer has a straightforward answer.

In the end, the advancement of physics-based programming in games and the rise of the metaverse—led by platforms like Roblox and Fortnite—will usher in a paradigm shift, one that collapses the notion of reality onto itself. We will live in two worlds at once, negotiate meaning in both, and ultimately discover that reality is not anchored solely in physical matter but also in the perceptions and interpretations of our minds. This moment in history will push us beyond all previous concepts of truth, existence, and identity, and toward a new understanding of what it means to be human in a universe of blended realities.

By Marcos Sanchez
