Apple's M3 Max chip, configurable with up to 128 GB of unified memory, has proven suitable for running inference on large language models. This development opens up new possibilities for on-device AI processing.
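As one concrete illustration of what this makes possible, the sketch below runs a quantized model entirely on-device using Apple's open-source MLX framework via the mlx-lm Python package; the model identifier, prompt, and token limit are illustrative examples, not details from the source.

```python
# Minimal sketch of on-device LLM inference on Apple Silicon,
# assuming the open-source mlx-lm package is installed (pip install mlx-lm).
from mlx_lm import load, generate

# Load weights and tokenizer into unified memory. A 4-bit 7B model fits in a
# few GB; 128 GB of unified memory leaves headroom for far larger models.
# The model name is an example from the mlx-community hub, not from the source.
model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.2-4bit")

# Generate a completion locally; no data leaves the machine.
response = generate(
    model,
    tokenizer,
    prompt="Explain unified memory in one paragraph.",
    max_tokens=128,
)
print(response)
```

Because the CPU and GPU share the same unified memory pool, model weights do not need to be copied to a separate accelerator, which is what lets a single machine with a large memory configuration hold models that would otherwise require multi-GPU setups.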