Our mission is to bring people together through immersive multi-user experiences that enhance every user's reality. Our games and experiences also aim to let users become creators and share their own personal visions of this new reality with others.
Mohx games started out with a vision: to create experiences that enhance reality and usher in a new genre of entertainment. We knew that for XR technologies to become mainstream, we would need an experience that was immersive and fostered long-term engagement. To us, that meant a game, and what better kind of game to introduce a new reality than a fantasy game?
With this in mind, we began development of Wizards Academy. To succeed, the game would need great stories, great graphics and gameplay, and a multiplayer experience. The technology to do all of this in XR did not exist, so we set about creating it ourselves.
For an experience to be good, we feel it needs a few key features: multiplayer, enhanced graphics, and environmental interaction. The most important of these is the ability for multiple people to share the experience at the same time. The technology for multiple AR users to see the same objects in the same place at the same time did not exist, which is why we created our cross-platform multiplayer system. This system allows users on any device (iOS, Android, or XR glasses) to see objects in an experience with near-perfect positioning. Building on this positioning breakthrough, we wanted remote users (people in different physical locations) to be able to share an experience: to see and interact with the same experience and with avatars of one another. We added this feature to the latest build, and now any user on any platform in any location can share in an experience. This is our vision of the Metaverse.
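One common technique behind this kind of shared positioning is a shared spatial anchor: each device localizes the same physical reference point in its own coordinate frame, and object positions are exchanged relative to that anchor rather than in any single device's frame. The sketch below illustrates the idea only; it is not our production code, the names and values are hypothetical, and rotation is ignored for brevity.

```python
from dataclasses import dataclass


@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def __sub__(self, o: "Vec3") -> "Vec3":
        return Vec3(self.x - o.x, self.y - o.y, self.z - o.z)

    def __add__(self, o: "Vec3") -> "Vec3":
        return Vec3(self.x + o.x, self.y + o.y, self.z + o.z)


def to_anchor_space(world_pos: Vec3, anchor_pos: Vec3) -> Vec3:
    """Express a position relative to the shared anchor (rotation ignored)."""
    return world_pos - anchor_pos


def from_anchor_space(rel_pos: Vec3, anchor_pos: Vec3) -> Vec3:
    """Convert an anchor-relative position back into a device's world frame."""
    return rel_pos + anchor_pos


# Both devices have localized the same physical anchor, each in its own frame.
anchor_a = Vec3(1.0, 0.0, 2.0)    # anchor as seen by device A
anchor_b = Vec3(-0.5, 0.0, 0.25)  # the same anchor as seen by device B
obj_a = Vec3(1.5, 1.0, 2.5)       # object placed by device A in its own frame

shared = to_anchor_space(obj_a, anchor_a)    # what goes over the network
obj_b = from_anchor_space(shared, anchor_b)  # where device B renders the object
```

Because only anchor-relative coordinates cross the network, each device can render the object in its own frame, which is what lets every participant see it "in the same place."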
Our experiences are optimized to render the best graphics possible using the processor on a user's device. However, this is not enough for the true-to-life graphics we envision for this new reality. To that end, our team is starting development of Edge rendering, which will allow true-to-life graphics to be rendered in real time over a Wi-Fi or 5G network and viewed seamlessly on a user's device. With this we can create cinematic scenes in games, NPC interactions, and artist performances that look as though they are actual living, breathing beings in a user's physical reality.
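Split rendering of this kind generally follows a simple loop: the device streams its head pose to an edge server, the server renders a frame on powerful GPUs, and a compressed frame is streamed back for display. The sketch below is only an illustration of that loop, with the GPU render step mocked out and every name hypothetical.

```python
import zlib


def render_on_edge(pose: tuple) -> bytes:
    """Stand-in for a GPU render pass on an edge server (hypothetical).

    Produces a tiny fake frame buffer derived from the pose so the
    round trip below is deterministic and checkable.
    """
    width, height = 4, 4
    seed = int(sum(pose) * 1000)
    return bytes((seed + i) % 256 for i in range(width * height))


def stream_frame(frame: bytes) -> bytes:
    """Compress the frame before sending it back over Wi-Fi or 5G (sketch)."""
    return zlib.compress(frame)


def device_loop(pose: tuple) -> bytes:
    """One iteration: send the pose up, receive a compressed frame, decode it."""
    packet = stream_frame(render_on_edge(pose))
    return zlib.decompress(packet)


frame = device_loop((0.0, 1.6, 0.0))  # head pose: x, y (eye height), z
```

In a real system the hard problems are latency and pose prediction, not compression; the point here is only the division of labor between device and edge.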
Lastly, environmental interaction is what takes place when a digital object is influenced by something in the physical world. For this, an AI system needs to "see" and understand the physical world around a user's device. Our system allows AI to "see" the world and to both interact with and reference the user's real-world environment. Examples include an avatar talking to you about items in the room in which you are standing, a character walking behind a car after being told to do so, or an assistant helping students with their homework based on the physical papers or items in front of them. In 2023, our teams intend to fully integrate environmental interaction into all of our games and experiences using optimized on-device processing, as well as Edge computing when necessary and available.
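Conceptually, environmental interaction turns a scene-understanding layer's detections into behavior. The toy sketch below illustrates the "avatar talks about items in your room" example: detections are hard-coded stand-ins for what a device's perception stack might report, and all names are hypothetical rather than our actual API.

```python
# Hypothetical detections, as a scene-understanding layer might report them.
detections = [
    {"label": "car", "position": (2.0, 0.0, 4.0)},
    {"label": "chair", "position": (-1.0, 0.0, 1.5)},
]


def nearest_object(objects: list, point: tuple) -> dict:
    """Return the detected object closest to a 3D point."""
    def dist_sq(obj: dict) -> float:
        return sum((a - b) ** 2 for a, b in zip(obj["position"], point))
    return min(objects, key=dist_sq)


def grounded_remark(objects: list, user_pos: tuple) -> str:
    """Compose a line of dialogue that references the user's real surroundings."""
    obj = nearest_object(objects, user_pos)
    return f"I see you're standing near the {obj['label']}."


print(grounded_remark(detections, (0.0, 0.0, 1.0)))
```

The same pattern, detect, select, act, also covers the other examples: a character's pathfinding would consume the car's position as an obstacle instead of feeding it into dialogue.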