Mikio Braun (@mikiobraun), 2022 — part of a thread
Also, the use cases where you actually want to deploy a model behind an API in a scalable fashion are quite specific. Very often, computing predictions in batch and storing them in a database is good enough. No need for k8s.
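The pattern described above can be sketched in a few lines: score everything offline, write the results to a database, and let "serving" be a plain key-value lookup. This is only an illustration, not anything from the thread — the `predict` function, table schema, and SQLite backend are all placeholder assumptions standing in for a real trained model and production database.

```python
import sqlite3

# Hypothetical stand-in for a trained model: any callable mapping a
# feature vector to a score. In practice this would be a loaded
# sklearn/xgboost/etc. model.
def predict(features):
    return sum(features) / len(features)

def batch_score(conn, rows):
    """Compute predictions for all rows offline and store them."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS predictions "
        "(user_id TEXT PRIMARY KEY, score REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO predictions VALUES (?, ?)",
        [(user_id, predict(features)) for user_id, features in rows],
    )
    conn.commit()

def lookup(conn, user_id):
    """'Serving' is just a database read -- no model server, no k8s."""
    row = conn.execute(
        "SELECT score FROM predictions WHERE user_id = ?", (user_id,)
    ).fetchone()
    return row[0] if row else None

# Nightly batch job: score all known users in one pass.
conn = sqlite3.connect(":memory:")
batch_score(conn, [("alice", [0.2, 0.4]), ("bob", [0.9, 0.7])])

# Request time: a lookup, cheap and trivially scalable.
score = lookup(conn, "alice")
```

The batch job can run on a cron schedule or a workflow scheduler; the serving path inherits whatever scalability the database already has, which is the point of the tweet.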