Thread

6 tweets

1
In Apple's Monday presentation, they mentioned a couple of ML-related things. From what I remember:
- transformer models for speech-to-text and autocorrect running on-device
- a claim that the new M2 Ultra with 192GB of unified memory can be used to train LLMs
2
Speech-to-text and autocorrect are long overdue for an update, so I'm looking forward to those (and hoping my iPhone 11 can run those models, too 😅)
3
Re: the latter, Apple has been working hard on ML support, for example by providing the MPS backend for PyTorch, but most people still stick with Nvidia GPUs for training.
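For anyone curious, using the MPS backend in PyTorch is just a device switch; a minimal sketch (the layer sizes here are arbitrary, and the code falls back to CPU on machines without Apple Silicon):

```python
import torch

# Pick the MPS (Metal) device when available, otherwise fall back to CPU.
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

x = torch.randn(64, 128, device=device)       # a batch of random inputs
layer = torch.nn.Linear(128, 10).to(device)   # move the layer to the same device
y = layer(x)                                  # forward pass runs on MPS (or CPU)
print(y.shape)
```

The same `.to(device)` pattern works for whole models, which is what makes trying MPS cheap: existing CUDA-style training scripts mostly just need the device string changed.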
4
The M1 Max GPU in my MBP benchmarks a bit slower than the GTX 1660S in my PC. That's not that impressive, but the new M2 Ultra has more than double the GPU cores, so that would put it somewhere around the xx70 series? (assuming performance roughly doubles from one tier to the next)
6
So yeah, I don't really buy the LLM-training claim (and the machine is way too expensive :)), but it's definitely awesome performance for video editing and graphics!