Orbyta presents its innovative projects at WMF 2024
AI mindset, mixed reality for real estate and accessibility: Turin-based Orbyta presents its innovative projects at WMF 2024
From sign language translation to immersive environments for real estate to a campaign for developing an AI mindset, Turin-based Orbyta took part in WMF, the International Fair and Festival on AI and Tech Innovation, presenting its projects on emerging technologies.
WMF – We Make Future 2024
Turin, June 16, 2024. Orbyta presented its projects related to emerging technologies at WMF – We Make Future, the International Fair and Festival on AI and Tech Innovation, held in Bologna on June 12-14.
Established in 2020 on the strength of a decade of consulting experience, the Orbyta group currently has more than 250 employees and continues its mission of supporting companies through the ongoing technological revolutions, with a focus on AI transformation and extended reality.

Mixed reality environments for real estate
Orbyta Tech and Orbyta Engineering, the two group companies specializing in ICT solutions and in civil engineering and architecture, respectively, are developing immersive pre-visualization apps.
Developers selling new apartments, or companies furnishing their headquarters or physical stores, can create mixed reality environments to visualize spaces in immersive 3D before they are built.
#PostcardFromFuture: developing vision soft skills and an AI mindset
#PostcardFromFuture is a campaign created to spread the culture of generative AI; from it grew a method for developing the soft skills needed in AI transformation: vision, prompt design, creativity, and an AI mindset.
The method can be used in team building, in training courses for social media managers, marketers, and designers, and as brainstorming support for the development of communication projects.
LIS2Speech: automatic sign language translation
LIS2Speech is a research project born out of the desire to contribute Orbyta’s expertise in AI and cross-platform development to the creation of accessibility technologies for deaf people.
Specifically, the project aims to develop an open platform that combines neural networks, deep learning, and computer vision to build apps capable of recognizing and translating sign language, and to make the results available to public and private initiatives, particularly in the public administration, healthcare, culture, and tourism sectors.