Samsung has expressed its confidence in the Exynos 2400 SoC, claiming that its GPU performance is better than that of its competitors.

While Qualcomm officially announced the Snapdragon 8 Gen 3 a few days ago, Samsung has only confirmed the existence of the Exynos 2400. As expected, the Exynos 2400 will power the Galaxy S24 and Galaxy S24+, while the Galaxy S24 Ultra will use only the Snapdragon 8 Gen 3. But are the two SoCs on par?

Samsung has expressed confidence in the performance of the Exynos 2400 compared to its competitors. The company says its SoC has a better GPU.

Samsung claims the Exynos 2400 has better GPU performance than the competition

Park Yong-in, President of Samsung's System LSI business, seems confident about the performance of the Exynos 2400. He said: "It will perform well because it has better GPU (graphics processing unit) performance than its competitors." He made the statement to Korean reporters after his presentation at Semiconductor Expo 2023, held at COEX in Seoul's Gangnam District, early yesterday.

Samsung revealed a few days ago that the Exynos 2400 has a 70% faster CPU and a 14.7 times faster NPU than the Exynos 2200. Reports suggest the Exynos 2400 was developed with particular care after the troubled Exynos 2200, which launched the previous year. The new SoC also supports on-device generative AI, enabling text-to-image conversion even when the device is not connected to a network.

Next year's Exynos SoC is confirmed to use a 3 nm process

The System LSI president also confirmed that next year's Exynos SoC will use a 3 nm process. The name of this processor has not yet been revealed, but it is highly likely to be called the Exynos 2500. The new SoC is expected to use Samsung Foundry's new 3 nm GAA process, which is said to deliver better performance and power efficiency than TSMC's 3 nm process.

Additionally, the company revealed that NPUs will be used for AI processing instead of GPUs in the future. GPUs are currently still used for AI in cloud servers, but they are expensive and consume a lot of energy. Park Yong-in therefore stated that NPUs should be used for AI use cases instead.