Can we have the best of both worlds? Can the same Python codebase achieve low latency in a real-time setup (an API) and, at the same time, high enough throughput to process big data in batch? At the RATE Periodicity team we have combined Polars and PySpark to get high performance in both situations! We currently process all daily payment transactions with low latency as they come in, while the same codebase is also used to process years of historical data in a matter of hours. In this talk I will explain how Polars helped us achieve this!