Industrial Metaverse to Be Enabled by Siemens and NVIDIA

Siemens and NVIDIA have announced an expansion of their partnership to enable the industrial metaverse and accelerate the adoption of AI-driven digital twin technology.

As a first step in the expanded collaboration, the companies intend to link Siemens Xcelerator, Siemens' open digital business platform, with NVIDIA Omniverse, a platform for 3D design and collaboration, with applications across industries, including energy.

Siemens Process Simulate (left) and NVIDIA Omniverse (right) are linked to create a real-time, photorealistic digital twin with full design fidelity. | Courtesy Siemens.

The convergence of IT and OT is made possible by the Siemens Xcelerator platform, which links mechanical, electrical, and software domains throughout the product and production processes. Live digital twins are made possible by NVIDIA Omniverse, an AI-enabled, physically-simulated, and industrial-scale virtual world engine.

To speed up the use of digital twins that can improve productivity and processes throughout the production and product lifecycles, Omniverse has been added to the open Siemens Xcelerator partner ecosystem.

According to Siemens, companies of all sizes will be able to:

  • Utilize digital twins and real-time performance data,
  • Develop new industrial IoT solutions,
  • Apply actionable insights from analytics at the edge or in the cloud.
Siemens AG CEO Roland Busch (right) and Nvidia founder and CEO Jensen Huang at the Siemens Xcelerator launch event on June 29, 2022 in Munich.

“Photorealistic, physics-based digital twins embedded in the industrial metaverse offer enormous potential to transform our economies and industries by providing a virtual world where people can interact and collaborate to solve real-world problems. Through this partnership, we will make the industrial metaverse a reality for companies of all sizes,” said Roland Busch, president and CEO of Siemens AG.

He continued, “When Siemens Xcelerator is connected to Omniverse, we will enable a real-time, immersive metaverse that connects hardware and software, from the edge to the cloud with rich data from Siemens’ software and solutions.”

“Siemens and NVIDIA share a common vision that the industrial metaverse will drive digital transformation. This is just the first step in our joint effort to make this vision real for our customers and all parts of the global manufacturing industry,” said Jensen Huang, founder and CEO of NVIDIA.

“The connection to Siemens Xcelerator will open NVIDIA’s Omniverse and AI ecosystem to a whole new world of industrial automation that is built using Siemens’ mechanical, electrical, software, IoT and edge solutions.”

How Does Artificial Intelligence Shape the Metaverse?

1) Development of virtual environments

The creation of a virtual equivalent of the real world is at the heart of the metaverse. Artificial intelligence can help by generating highly authentic 3D scenes from still photos of places that actually exist. This makes it possible to accurately recreate almost any real location, from the Tower of London to the veranda of our beach house.

Artificial intelligence can also create entirely fictitious locations. Developers might design the first few environments by hand, but from that point on, reinforcement learning can take over, with AI algorithms generating environments that are progressively more interesting or pleasant for users. The AI could identify which scenes make us feel happiest or most relaxed, extract their highlights, and explore variations to see whether they are even more entertaining or calming, refining the strategy with each iteration until environments are tailored to each user's needs.
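The refinement loop described above can be sketched in miniature. This is a toy illustration, not a real environment generator: an "environment" is just a dictionary of tunable parameters, and `user_reward` is a hypothetical stand-in for the user-engagement feedback a production system would collect.

```python
import random

def mutate(env):
    """Perturb one parameter to propose a new candidate environment."""
    candidate = dict(env)
    key = random.choice(list(candidate))
    candidate[key] = min(1.0, max(0.0, candidate[key] + random.uniform(-0.2, 0.2)))
    return candidate

def user_reward(env):
    """Hypothetical stand-in for real user feedback; here, prefer warm, quiet scenes."""
    return env["warmth"] * 0.6 + (1.0 - env["noise"]) * 0.4

def refine(env, steps=200):
    """Greedy refinement: keep any mutation the 'user' scores higher."""
    best, best_reward = env, user_reward(env)
    for _ in range(steps):
        candidate = mutate(best)
        reward = user_reward(candidate)
        if reward > best_reward:
            best, best_reward = candidate, reward
    return best, best_reward

random.seed(0)
start = {"warmth": 0.3, "noise": 0.7}
final, score = refine(start)
```

A real system would replace the greedy loop with a proper reinforcement learning policy and the hand-written reward with measured user behavior, but the shape of the iteration is the same.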

2) Making of Avatars

Even though nobody in the metaverse necessarily knows who you are, there will be situations, such as metaverse-hosted conferences, where hiding behind a distinctive username and a Salvador Dali mask may not be acceptable behavior. In those circumstances, it will be essential to be recognizable by one's real name and an avatar that resembles us as closely as possible. AI models that analyze our photos and reconstruct a 3D avatar in our image and likeness can help here.

3) Mapping of Body Movements

As anyone who has tried modern VR knows, its current interfaces are far from perfect. That undermines the goal of getting users to log in as often as possible and keeping them in the metaverse for as long as possible. One aim is to make VR interactions stable enough that people can carry out tasks as naturally as picking up an object or waving a hand. To achieve this, artificial intelligence will analyze our body movements, capturing them through a variety of sensors and translating them into commands or motions for our avatars.

Shaking hands with someone should be about as simple as it is in the real world, without gripping any controllers. Similarly, opening or closing a virtual panel should be quick and easy, with the AI accurately interpreting all of your body movements.

The mapping of body movements will not end there. Artificial intelligence can also replicate our facial expressions on our avatars so that, for example, when we smile, the avatar smiles too. It can map a growing range of expressions, such as frowns, yawns, surprise, and squints, onto our digital twins to make the transition from the physical to the virtual world as seamless as possible.
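The expression-mapping step can be sketched as a small classifier over face-tracking signals. This is a deliberately simplified illustration: the input names (`mouth_open`, `mouth_corner_up`, `eye_open`) are hypothetical normalized tracker values, and a real system would use a learned model over dozens of facial landmarks rather than hand-tuned thresholds.

```python
def classify_expression(mouth_open, mouth_corner_up, eye_open):
    """Turn raw tracker values (each in 0..1) into a discrete avatar expression."""
    if mouth_corner_up > 0.6 and mouth_open < 0.5:
        return "smile"
    if mouth_open > 0.7 and eye_open < 0.3:
        return "yawn"
    if eye_open > 0.9 and mouth_open > 0.5:
        return "surprise"
    if eye_open < 0.2:
        return "squint"
    return "neutral"

def drive_avatar(frame):
    """frame: dict of sensor values captured each tick; returns an avatar rig command."""
    expression = classify_expression(
        frame["mouth_open"], frame["mouth_corner_up"], frame["eye_open"]
    )
    return {"expression": expression}
```

Run once per captured frame, this is the "sensor reading in, avatar command out" loop the section describes.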

4) Engaging the Metaverse's Digital Inhabitants

Thanks to artificial intelligence, NPCs and virtual coworkers will look completely different in the metaverse, performing "intelligent" actions and considerably more complex tasks. Imagine a digital assistant guiding new users through the metaverse, pointing out their mistakes and suggesting ways to fix them (or, sometimes, actually getting them out of trouble). Or imagine a digital assistant that notifies us of messages it receives while we are in a metaverse-based meeting.

We can also imagine regions of the metaverse where digital avatars act as companions or even friends, to talk with, confide in, or form romantic relationships with, since this already happens in various mobile applications. We should not be surprised: before long, digital romance may become a more widespread "luxury" thanks to AI's ability to create photorealistic human portrayals and hold conversations of real depth. That, too, is possible in the metaverse.

5) Real-Time Translation

Real-time translation is one of the use cases that Meta explicitly acknowledges, and it plans to dedicate a portion of its supercomputer to the task. The idea is to enable groups from different countries, each speaking a different language, to communicate and interact as if in person. To accomplish this, the AI model must first understand the language the user speaks, recognize the meaning of each word, translate it accurately into the languages of the other users, and then render the translated text as audio, possibly in the same voice as the original speaker or a convincing imitation of it.
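The four stages just described form a pipeline: speech-to-text, language identification, translation, and speech synthesis. The sketch below wires those stages together with toy stand-ins; every function here is hypothetical, and a production system would use real speech recognition, a neural machine translation model, and voice synthesis at each step.

```python
# Tiny bilingual lexicon, for illustration only.
PHRASES = {
    ("de", "en"): {"hallo": "hello", "welt": "world"},
}

def transcribe(audio):
    """Stage 1: speech-to-text (stubbed: 'audio' is already text here)."""
    return audio.lower()

def detect_language(text):
    """Stage 2: identify the speaker's language (stubbed heuristic)."""
    return "de" if "hallo" in text else "en"

def translate(text, src, dst):
    """Stage 3: word-by-word lookup; a real system uses a neural MT model."""
    table = PHRASES.get((src, dst), {})
    return " ".join(table.get(word, word) for word in text.split())

def synthesize(text, voice):
    """Stage 4: text-to-speech (stubbed: tag the output with a voice id)."""
    return f"[{voice}] {text}"

def interpret(audio, listener_lang, voice="speaker-clone"):
    """Run the full pipeline for one utterance."""
    text = transcribe(audio)
    src = detect_language(text)
    return synthesize(translate(text, src, listener_lang), voice)
```

For example, `interpret("Hallo Welt", "en")` pushes one German utterance through all four stages. The hard part at metaverse scale is not the pipeline shape but running each stage with low enough latency that the conversation still feels face-to-face.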

All of this is achievable today. In practice, it demands enormous resources, especially if it must run continuously and at the scale the metaverse requires, but Meta has been building up those resources for a long time. Meta has stated that this work will likely lead to a universal translator.

We already know that much of the research begun some time ago focused on how people from different countries could communicate with one another in their native languages, and the metaverse is a natural application for it.

Artificial intelligence has a significant impact on how the metaverse is built. To obtain accurate results and give users an immersive experience, it is therefore advisable for artificial intelligence teams and metaverse developers to work closely together.
