Human beings are known for many things, but above all, they are known for getting better on a consistent basis. This commitment to improvement, under every possible circumstance, has already brought the world some huge milestones, with technology emerging as a major member of that group. The reason we hold technology in such high regard is largely its skill set, which guided us toward a reality nobody could have imagined otherwise. Nevertheless, if we look beyond the surface for a moment, it becomes clear that this journey was also shaped by how we applied those skills in real-world environments. That latter component, in fact, did a lot to give the creation a spectrum-wide presence and, as a result, initiated a full-blown tech revolution. This revolution, of course, went on to scale up the human experience through some truly unique avenues, yet even after a feat so notable, technology continues to bring forth the right goods. The same has only become more evident in recent times, and assuming one new partnership delivers the desired impact, it will put that trend on an even higher pedestal moving forward.
MSBAI, an Air Force Techstars 2020 company, along with Princeton University, has officially been awarded a $1.25 million Phase 2 contract to build the future of immersive, collaborative training for space flight operations. According to certain reports, the stated arrangement will focus on bringing in GURU, an autonomous system that drives expert workflows in software, to facilitate space flight mission training scenario generation and visualization. In case you weren’t aware, GURU is a cognitive AI assistant that lets you set up simulations far quicker than what we have come to consider the norm for specialized simulation software, such as the tools used in Nuclear Energy Advanced Modeling and Simulation (NEAMS). In that setting, the AI assistant reduces the time and cost it takes to design a nuclear reactor, thus helping solve a wide range of nuclear, civil, and systems analysis problems.

To realize the stated goal, both MSBAI and Princeton University will work in conjunction with the Air Force Research Laboratory Space Vehicles Directorate’s (AFRL RV) Defense Readiness Agile Gaming Online Network (DRAGON). Here, they will hyper-enable its White Cell Trainer group to set up comprehensive mission design, analysis, and training scenarios in no more than a few minutes. The importance of such speed can be contextualized once you consider how teaching trainers to create effective training scenarios for space flight mission operations is a task that requires careful planning and adaptation, meaning any misstep can easily produce inconsistencies, gaps, and inefficiencies throughout the training process. Talking about how the development in question will address that problem, it will deliver at your disposal a user-friendly and intuitive system, one well-equipped with multiple visualization options for space operations. Next up, it is going to integrate with AFRL’s White Cell Console for the purpose of autonomously driving the simulation tools. Then, there is the prospect of creating comprehensive visualizations of training scenarios in both 2D and 3D.
“U.S. Space Force, the Space Development Agency, and other key stakeholders must lead DoD efforts to build comprehensive military space power through a trained, equipped, and ready force that is integrated into Joint Force plans to support the Department’s objectives to compete, deter, and win across the spectrum of conflict,” said U.S. Army General James H. Dickinson, Commander of U.S. Space Command.
Among other details, GURU will leverage immersive virtual world building to maximize team understanding through collaborative visualization, and in case that’s not enough, it will also put to use software such as Unreal Engine to achieve this very objective. By doing so, the system will go a long way toward supporting spatial computing and mixed-reality experiences on the latest devices, such as Apple’s Vision Pro, Meta’s Quest 3, and Ray-Ban Wayfarer.
“Our team could not be more proud to be part of the future of space operations by hyper-enabling trainers, analysts, and operators to have a fast onramp to the best mission design and analysis tools, and to use them quickly and collaboratively,” said Allan Grosvenor, CEO of MSBAI.
Coming back to AFRL, the laboratory has a big part to play here, given its record of leading the discovery, development, and integration of affordable warfighting technologies for air, space, and cyberspace forces. Boasting a focal point of nine technology areas and 40 other operations, AFRL provides a diverse portfolio of science and technology, ranging from fundamental and advanced research to eventual technology development.