What kind of GPU can take automotive electronics further?

Whether it is new powertrains, in-vehicle infotainment, or self-driving capability, automotive technology is advancing at an unprecedented rate. Disruptive technologies and new industry players are challenging traditional automotive thinking, and the driving experience of tomorrow will be very different from today's.

Automated vehicles are already on the road in some regions, and by 2030 a quarter of the cars on the road are expected to be automated to some degree. What will the cockpit of the future look like? How will cars and drivers communicate? And how will relieving the driver of driving responsibility affect the design of the cockpit and the infotainment system?

Potential challenges

By around 2025, fully automated cars will be quite different from today's. They will offer passengers a wide range of functions on the go: the cabin can serve as an office, a living room, or a rest and entertainment center. Car design will break completely with the current form: there may be no steering wheel, and seats may face inward toward the cabin. The driver will no longer need to watch what is happening beyond the windshield. Windows may shrink or disappear, giving the car more private space. Although it will take time to get there, some major trends are already visible.

Future cars will be equipped with a large number of screens, which will greatly increase the demand for powerful GPUs that can drive massive pixel counts and support augmented reality (AR), gesture control, and advanced human-machine interfaces (HMIs).

It has been suggested that the next generation of cars could carry up to ten screens, offering a combined resolution of up to 72K (through multiple 4K-resolution screens). This includes the head-up displays (HUDs) already deployed in high-end cars. HUD imagery is typically projected onto the windshield, or onto a dedicated combiner, at a long focal distance (effectively at infinity), so the driver does not have to refocus from the road to read the information, as they would with a conventional dashboard. These head-up displays will become more numerous and more complex. A windshield-based split-view HUD divides the display into driver and passenger portions, so that only the person sitting in the corresponding seat sees each portion. At the heart of all of this is GPU technology, which renders not only the dashboard screens but the head-up displays as well.
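To get a feel for the pixel load such numbers imply, here is a back-of-envelope calculation. The panel resolution (3840x2160), 60 Hz refresh rate, and 4 bytes per pixel are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope pixel and bandwidth budget for a multi-display cockpit.
# Assumptions (illustrative): 3840x2160 panels, 60 Hz refresh, 4 bytes
# per pixel (RGBA8). Real systems vary widely.

NUM_SCREENS = 10
WIDTH, HEIGHT = 3840, 2160      # one "4K" panel
REFRESH_HZ = 60
BYTES_PER_PIXEL = 4

pixels_per_frame = NUM_SCREENS * WIDTH * HEIGHT
pixels_per_second = pixels_per_frame * REFRESH_HZ
bandwidth_gb_s = pixels_per_second * BYTES_PER_PIXEL / 1e9

print(f"{pixels_per_frame / 1e6:.0f} Mpixels per frame")       # ~83 Mpixels
print(f"{pixels_per_second / 1e9:.1f} Gpixels per second")     # ~5.0 Gpixels/s
print(f"~{bandwidth_gb_s:.0f} GB/s of raw scan-out bandwidth") # ~20 GB/s
```

Even before any 3D rendering, composition, or AR overlay work, simply scanning out that many pixels is a substantial, sustained workload, which is why GPU throughput and power efficiency matter so much here.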

For example, gaze tracking can be combined with the HUD in a driver-operated car to place driving-related information at the center of the driver's line of sight. Gaze direction can also be used to determine whether the driver's attention is on the road; if it is not, an algorithm running on the GPU can detect this and issue a warning, as the sketch below illustrates.
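The following is a minimal sketch of that kind of attention check. The gaze-cone threshold, dwell-time window, and sampling rate are illustrative assumptions, not a production driver-monitoring algorithm:

```python
import math
from collections import deque

# Illustrative attention monitor: if the driver's gaze stays too far from
# the road-ahead direction for too long, raise a warning.

ROAD_AHEAD = (0.0, 0.0, 1.0)           # unit vector: straight out the windshield
MAX_OFF_ROAD_ANGLE = math.radians(20)  # gaze cone considered "on road"
MAX_OFF_ROAD_SECONDS = 2.0
SAMPLE_HZ = 30

window = deque(maxlen=int(MAX_OFF_ROAD_SECONDS * SAMPLE_HZ))

def angle_from_road(gaze):
    """Angle between a unit gaze vector and the road-ahead direction."""
    dot = sum(g * r for g, r in zip(gaze, ROAD_AHEAD))
    return math.acos(max(-1.0, min(1.0, dot)))

def update(gaze):
    """Feed one gaze sample; return True if a warning should be issued."""
    window.append(angle_from_road(gaze) > MAX_OFF_ROAD_ANGLE)
    return len(window) == window.maxlen and all(window)

# Example: driver looks ~30 degrees to the side for the whole window.
side_gaze = (math.sin(math.radians(30)), 0.0, math.cos(math.radians(30)))
for _ in range(int(MAX_OFF_ROAD_SECONDS * SAMPLE_HZ)):
    warn = update(side_gaze)
print("warning:", warn)  # True
```

In a real system the gaze vector itself would come from a camera-based eye-tracking pipeline, much of which can also run as a GPU workload.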

The importance of the human-machine interface

As autonomous driving progresses from Level 1 to Level 5, the HMI becomes increasingly important at Levels 2 and 3, where control sometimes has to be handed back to the driver. In addition, when the car acts on the driver's behalf, the driver needs to be notified so that sudden maneuvers do not startle them; the sketch below illustrates one possible takeover flow.
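As a rough illustration of the Level 2/3 handover problem, the following models a takeover request as a small state machine that escalates alerts before control is returned. The states, timings, and alert modalities are illustrative assumptions, not any standardized HMI:

```python
from enum import Enum, auto

class Mode(Enum):
    AUTOMATED = auto()         # system is driving
    TAKEOVER_REQUEST = auto()  # driver asked to take over, alerts escalating
    MANUAL = auto()            # driver is driving
    MINIMAL_RISK = auto()      # driver never responded; car pulls over safely

# Escalating alerts issued during the takeover window (illustrative).
ALERT_SCHEDULE = [(0.0, "visual icon"), (2.0, "chime"), (4.0, "seat vibration")]
TAKEOVER_WINDOW_S = 8.0

def step(mode, t_in_request, hands_on_wheel):
    """Advance the handover state machine by one tick."""
    if mode is Mode.TAKEOVER_REQUEST:
        for t_alert, alert in ALERT_SCHEDULE:
            if abs(t_in_request - t_alert) < 1e-9:
                print(f"t={t_in_request:.0f}s: {alert}")
        if hands_on_wheel:
            return Mode.MANUAL
        if t_in_request >= TAKEOVER_WINDOW_S:
            return Mode.MINIMAL_RISK
    return mode

# Example: driver takes over 5 seconds into the request.
mode, t = Mode.TAKEOVER_REQUEST, 0.0
while mode is Mode.TAKEOVER_REQUEST:
    mode = step(mode, t, hands_on_wheel=(t >= 5.0))
    t += 1.0
print("final mode:", mode.name)  # MANUAL
```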

Today, people's interaction with the dashboard is largely visual. In the future, voice interaction, audio feedback, haptic alerts, gesture control, and visual warnings projected onto head-up displays will be used more and more.

The new HMI will also play an important social role in helping users adapt to the experience of self-driving cars. During the transition period, people will have to learn to trust driverless vehicles. Passengers should always know what the car is doing: why it chose this lane, which cars are around it, which roads are blocked, and how the route was calculated. A well-designed HMI with visual and audio elements will be the basis for accepting autonomous vehicles. It should present the car's decision-making process in a natural way, making passengers feel safer and more comfortable. For example, automotive augmented reality (AR) can use composite glass with a holographic film that, like a lens, reflects only specific wavelengths; projected video or an interactive interface then appears through the composite glass of the windshield.

Of course, challenges will continue to emerge. For example, people may have to change ingrained driving habits, such as checking the rear-view mirror before maneuvering. And developers must prove that the new HMI features provide a better, more accurate experience than what drivers have today.

What changes will AR bring to the car?

What will AR bring to the car in the next decade? AR is a technology that overlays virtual objects on the real world, augmenting the viewer's scene with icons, objects, information, and the like.

Some companies already use the windshield as the projection surface for an AR HUD. Smaller, unobtrusive versions are on the market today, but future versions may span the entire windshield. The AR view can be split across the windshield: a simplified driver view provides only critical information, reducing distraction, while the passenger side can display richer content such as restaurants, stores, the nearest parking lot, and other points of interest. Another advantage of an AR HUD is that an in-cabin camera can infer the driver's gaze direction, so key driving information can be projected at the center of gaze and little head movement is required; the sketch below shows the core projection step.
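Here is an illustrative sketch of the registration step an AR HUD must perform: given the driver's eye position and a real-world target, compute where on the virtual HUD image plane a marker must be drawn so that it overlays the target. The flat image plane and the coordinate conventions are simplifying assumptions; real HUDs use curved optics:

```python
def hud_marker_position(eye, target, hud_distance):
    """Where on a flat virtual HUD plane to draw a marker so that it
    overlays `target` as seen from `eye`.

    Coordinates (illustrative): x right, y up, z forward from the driver.
    The HUD is modeled as a plane at z = eye_z + hud_distance.
    """
    ex, ey, ez = eye
    tx, ty, tz = target
    if tz <= ez:
        raise ValueError("target must be in front of the eye")
    # Intersect the eye->target ray with the HUD plane.
    s = hud_distance / (tz - ez)
    return (ex + s * (tx - ex), ey + s * (ty - ey))

# Example: a hazard 40 m ahead and 3 m to the right, HUD plane 2.5 m out.
eye = (0.0, 1.2, 0.0)            # driver's eye ~1.2 m above the road
hazard = (3.0, 0.0, 40.0)
x, y = hud_marker_position(eye, hazard, hud_distance=2.5)
print(f"draw marker at x={x:.2f} m, y={y:.2f} m on the HUD plane")
# x ~ 0.19 m right of the eye axis, y ~ 1.13 m (slightly below eye level)
```

Because the eye position moves constantly, this computation has to be redone every frame from fresh head- and eye-tracking data, which is one reason AR HUDs lean on GPU throughput.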

GPUs provide the computation for ADAS

On the way to full autonomous driving (Level 5), cars require more and more driver-assistance functions, including automatic emergency braking, lane-departure warning, pedestrian detection, driver alerting, blind-spot detection, intersection assistance, and more.

Advanced GPUs are parallel processors well suited to the repetitive algorithms behind many ADAS functions. The GPU's parallelism and multiply/accumulate structure make it a natural fit for implementing convolutional neural networks (CNNs), as the sketch below illustrates.
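A naive 2-D convolution makes the point concrete: every output element is an independent multiply/accumulate over a small window, so the work maps directly onto thousands of parallel GPU threads. This is a teaching sketch, not an optimized kernel:

```python
import numpy as np

def conv2d(image, kernel):
    """Naive 2-D convolution (really cross-correlation, as in most CNNs).

    Each output pixel is an independent multiply/accumulate over a small
    window, which is why the workload maps so well onto a GPU: each
    output element can be computed by its own thread.
    """
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):          # on a GPU: one thread per (y, x)
        for x in range(out.shape[1]):
            acc = 0.0
            for ky in range(kh):           # the multiply/accumulate loop
                for kx in range(kw):
                    acc += image[y + ky, x + kx] * kernel[ky, kx]
            out[y, x] = acc
    return out

# Example: a 3x3 vertical-edge filter over a random "camera frame".
frame = np.random.rand(8, 8)
sobel_x = np.array([[-1.0, 0.0, 1.0],
                    [-2.0, 0.0, 2.0],
                    [-1.0, 0.0, 1.0]])
print(conv2d(frame, sobel_x).shape)  # (6, 6)
```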

What kind of GPU can take automotive electronics further?

CNN algorithms have been around for more than 30 years, but until recently they were the preserve of server farms. With advances in SoC process technology, more efficient CNN algorithms, and advances in GPUs, it is now possible to run them at the edge of the network, not just in the cloud.

Among ADAS functions, any that involve a degree of image processing, such as extracting road-sign information, are well suited to CNNs. Embedded high-end PowerVR GPUs can deliver up to 20 times the performance of standard high-end CPUs at lower power.

Eventually, CNNs will be accelerated in dedicated hardware, because their operations are well understood and even standardized. Until fully optimized solutions arrive, however, GPU compute can support the prototyping and deployment of new neural networks and other acceleration techniques.

To improve security, hardware virtualization can be applied to the many applications and services in a car. PowerVR GPUs support hardware virtualization, so no two operating-system/application combinations can use the same memory space; a two-level MMU (memory management unit) in the core guarantees that this basic requirement is always met. If malware is downloaded, only its own container (the operating system/application running under the security manager) can crash. Other systems are unaffected: an infotainment system might go down while the ADAS and cluster/HUD workloads running on the same GPU keep running, as the sketch below illustrates.
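The isolation rests on two-stage address translation: each guest OS translates its virtual addresses to what it believes are physical addresses, and a second, hypervisor-owned stage maps those onto real machine memory, so one guest can never name another guest's pages. The following is a toy page-table model of that idea; real MMUs use multi-level tables and hardware page walkers:

```python
PAGE = 4096  # toy 4 KiB pages

class Stage:
    """One translation stage: a page table mapping page number -> page number."""
    def __init__(self, mapping):
        self.mapping = mapping
    def translate(self, addr):
        page, offset = divmod(addr, PAGE)
        if page not in self.mapping:
            raise MemoryError(f"fault: page {page} not mapped")
        return self.mapping[page] * PAGE + offset

# Stage 1 (owned by each guest OS): guest-virtual -> guest-physical.
infotainment_s1 = Stage({0: 5, 1: 6})
adas_s1         = Stage({0: 5, 1: 6})   # guests may reuse the same numbers

# Stage 2 (owned by the hypervisor): guest-physical -> machine-physical.
# Disjoint machine pages mean the guests can never touch each other's memory.
infotainment_s2 = Stage({5: 100, 6: 101})
adas_s2         = Stage({5: 200, 6: 201})

def mmu(addr, s1, s2):
    """Two-stage translation, as performed in hardware by a two-level MMU."""
    return s2.translate(s1.translate(addr))

va = 1 * PAGE + 42  # the same guest-virtual address in both guests...
print(mmu(va, infotainment_s1, infotainment_s2))  # address in machine page 101
print(mmu(va, adas_s1, adas_s2))                  # address in machine page 201
```

Because the stage-2 tables are owned by the hypervisor alone, even a fully compromised guest cannot construct an address that reaches outside its assigned machine pages.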

Looking to the future

Imagination believes it must anticipate the direction of the autonomous-driving market and deliver silicon IP four to five years ahead of time: a car designed in 2018 will reach the market around 2023, and a car going on sale in 2021 has essentially finished its silicon design already. That is why today's robotaxis use chips from companies such as Intel, NVIDIA, Bosch, Renesas, and TI.

Humanity is entering the most exciting period in the history of automotive design. Changes to the cockpit are the most critical, and these revolutions will be realized as we move toward ever higher levels of autonomous driving. To deliver the required rich functionality and ADAS within an acceptable power budget, the HMI and its components will depend heavily on the processing power of future high-performance, low-power GPUs.

Imagination Technologies has a long history of providing high-end GPUs to the world's top five automotive chip suppliers, which offer PowerVR-based solutions to world-renowned automotive brands. Imagination will leverage this advantage to help realize the future of autonomous driving.
