CES 2024: Sony Debuts Evolved Afeela Prototype EV, Metaverse Content Creation Hardware
Sony is back at the Consumer Electronics Show (CES) to share what it's been cooking up across different business segments.
Under the theme "Powering Creativity with Technology", the Japanese tech giant hosted an exhibit on the show floor from Tuesday, 9 January, showcasing new initiatives and tech.
This year's exhibit focuses on how the company is empowering creators to explore and create more by providing solutions for storytelling, entertainment and beyond, and enabling new ways to fill the world with emotion. Among the main exhibits are Sony Pictures Entertainment's Torchlight, a new advanced visualisation space for creating digital scenes in movies, and fan engagement experiences with new PlayStation products.
Ahead of the exhibit's opening, Sony executives, including Kenichiro Yoshida, Chairman and CEO of Sony Group Corporation, took the CES stage to talk about other new developments.
Chief among them is the evolved Afeela EV prototype, which was driven on-stage by Afeela President and COO Izumi Kawanishi using a PlayStation 5 DualSense controller. Afeela, for those unfamiliar, is the automotive joint venture between Sony Group Corporation and Honda Motor Company. Kawanishi did, however, say that the controller was for demo purposes only, suggesting that such functionality won't be integrated into the final release.
This year marks the third appearance of the EV, and Sony has shared more details about it. For example, Afeela will be using Epic Games' Unreal Engine 5.3 to power the 3D graphics that fill its ultrawide dashboard display. These include 3D maps and virtual spaces, as well as augmented reality (AR) views of the vehicle's surroundings.
Passengers can access Sony's slate of TV shows, movies and games to keep themselves entertained while on the road.
Unreal Engine is also being used to train the EV's driver assistance systems, since it can render highly realistic environments that improve the accuracy of the data fed to the visual models and neural network processing systems.
On the artificial intelligence (AI) front, Sony Honda Mobility announced a new collaboration with Microsoft to co-develop a conversational agent for the EV's system using the Azure AI service. The voice-enabled agent would reportedly converse with a human-like accent and tone, allowing drivers to talk to it as if they were talking to a real person.
Sony Honda Mobility has yet to reveal when it plans to bring the EV to market.
Alongside the prototype EV, Sony also announced a spatial content creation system comprising an XR (extended reality) head-mounted display and a pair of controllers. The new device is aimed at creators in the 3D content and metaverse space, and features proprietary rendering technology for real-time 3D rendering of both objects and human facial expressions. It's equipped with six cameras and sensors that enable video see-through functionality and spatial recognition. The controllers, meanwhile, are designed so they can be used for 3D modeling, paired with a keyboard, while wearing the head-mounted display.
Under the hood, the device houses Qualcomm's most capable XR processor yet, the Snapdragon XR2+ Gen 2. The head-mounted display sports a 1.3-type 4K OLED panel with 96% DCI-P3 colour gamut coverage. It also includes a flip-up mechanism that allows wearers to quickly switch between virtual and physical views.
Sony said it plans to launch the spatial content creation system sometime in 2024.