Samsung unveils ultra-slim chips for faster AI on mobile devices

Samsung has begun mass production of its LPDDR5X DRAM chips, ultra-thin memory packages about as thin as a fingernail, designed to improve how mobile devices run AI workloads.

They are the industry’s thinnest 12 nanometer (nm)-class memory chips and come in two capacities: 12GB and 16GB.

The chips are designed to process memory workloads directly on the device, allowing the phone’s operating system to work with storage more quickly and handle AI workloads more efficiently.

Samsung’s new LPDDR5X packages are ultra-slim, freeing up space inside a mobile device.

The slim design accommodates a larger processor dedicated to AI tasks, enhancing performance while improving airflow — a crucial feature as advanced AI applications generate more heat.

The chips are 9% thinner than Samsung’s previous 12 nm-class DRAM packages and offer around 21% better heat resistance.

To read the complete article, visit IoT World Today.
