Four Infrastructure Trends Reshaping Modern Systems
At Prefect Summit, infrastructure veteran Chris Riccomini shared insights into four emerging trends that are transforming how we build and operate modern systems. Having spent two decades engineering infrastructure in Silicon Valley at companies like PayPal and LinkedIn, Chris brings a practitioner's perspective to these developments. Here's my analysis of the key trends he highlighted and their implications for the future of infrastructure.
The Rise of Object Storage: Simplifying System Architecture
One of the most significant shifts in modern system design is the growing adoption of object storage solutions like Amazon S3. Traditionally, systems managed their own persistence, leading to complex challenges around data replication and consistency. Chris outlined three emerging approaches to incorporating object storage:
- Tiered Storage: The most incremental approach, where less frequently accessed data is moved to object storage while hot data stays on traditional storage (see the sketch after this list).
- Write-Ahead Log Architecture: A more sophisticated approach where only the write-ahead log (recent mutations) lives outside object storage, while the bulk of the data resides in object storage. Companies like Neon are using this approach to create "bottomless" Postgres implementations.
- Zero Disk Architecture: The most radical approach, championed by companies like WarpStream, where systems abandon local disks entirely in favor of object storage.
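To make the tiered approach concrete, here is a minimal sketch of a background job that ages cold segments out to S3 and removes the local copies. The bucket name, segment directory, and 24-hour threshold are illustrative assumptions, not details from the talk; the upload itself uses boto3's standard `upload_file` call.

```python
import os
import time

import boto3  # AWS SDK for Python

# Illustrative assumptions: bucket, local segment directory, and a
# 24-hour "cold" threshold are placeholders, not values from the talk.
BUCKET = "my-tiered-storage-bucket"
SEGMENT_DIR = "/var/lib/mydb/segments"
COLD_AFTER_SECONDS = 24 * 60 * 60

s3 = boto3.client("s3")

def tier_cold_segments() -> None:
    """Move segments untouched for COLD_AFTER_SECONDS to object storage."""
    now = time.time()
    for name in os.listdir(SEGMENT_DIR):
        path = os.path.join(SEGMENT_DIR, name)
        if now - os.path.getmtime(path) < COLD_AFTER_SECONDS:
            continue  # still hot: keep it on local disk
        s3.upload_file(path, BUCKET, f"segments/{name}")
        os.remove(path)  # local copy is now redundant; reads fall back to S3
```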
The implications are profound: simpler operational processes, easier system integration, and greater flexibility in balancing cost, latency, and durability. Perhaps most intriguingly, this shift could fundamentally change how we think about data integration, potentially reducing our reliance on message queues and traditional ETL processes.
The Great Database Decomposition
The second trend Chris identified is the fascinating decomposition of traditionally monolithic databases into separate, reusable components. Modern projects are breaking apart core database functionalities – query parsing, optimization, and execution – into shared libraries that can be used across different database implementations.
This decomposition, enabled by projects like Apache DataFusion and Meta's Velox, is creating three distinct opportunities (a brief sketch of the embedded-engine experience follows the list):
- Database Platforms: Focus on developer experience and operational aspects like schema management and data integration
- Multi-Model Databases: Systems that can efficiently handle multiple workload types
- Vertical-Specific Databases: Specialized solutions for specific industries (like TigerBeetle for finance)
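To see what this decomposition looks like from the application side, here is a minimal sketch using the `datafusion` Python bindings, which expose Apache DataFusion's SQL parser, optimizer, and execution engine as an embeddable library. The file name and query are illustrative assumptions.

```python
from datafusion import SessionContext  # Apache DataFusion's Python bindings

# The engine is just a library here: no server process, no monolithic database.
ctx = SessionContext()

# Illustrative assumption: an "events.csv" file with a `user_id` column.
ctx.register_csv("events", "events.csv")

# DataFusion parses, plans, optimizes, and executes this SQL in-process.
df = ctx.sql("SELECT user_id, COUNT(*) AS n FROM events GROUP BY user_id")
for batch in df.collect():  # results come back as Arrow record batches
    print(batch)
```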
The trade-off question Chris raised is particularly interesting: Will this standardization commoditize databases, or will it enable innovation in other areas?
Postgres's Renaissance
The rapid growth of PostgreSQL adoption caught my attention. Chris attributes this surge to several factors:
- Performance improvements that have addressed historical limitations
- A developer-friendly, permissive open-source license
- Exceptional extensibility through its plugin framework
This extensibility has created a powerful flywheel effect: as more developers adopt Postgres, infrastructure engineers are more motivated to build on top of it, which in turn attracts more users. We're seeing this play out in two ways:
- Protocol/Dialect Compatibility: New databases maintaining compatibility with Postgres's protocols or SQL dialect
- Native Extensions: Direct extensions of Postgres's functionality through its API
The result? A smaller operational footprint as Postgres increasingly serves multiple use cases, from OLAP workloads to vector search.
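To see that flywheel in code: the sketch below uses psycopg2 against a Postgres instance that has the pgvector extension installed (both are real projects, though the connection string and table are illustrative). One `CREATE EXTENSION` statement gives the same database that serves OLTP traffic a vector type and nearest-neighbor search.

```python
import psycopg2  # standard Postgres driver; connection string is illustrative

conn = psycopg2.connect("dbname=app user=app host=localhost")
cur = conn.cursor()

# pgvector is a native extension: one statement adds a vector type,
# distance operators, and index support to a stock Postgres installation.
cur.execute("CREATE EXTENSION IF NOT EXISTS vector")
cur.execute("""
    CREATE TABLE IF NOT EXISTS items (
        id bigserial PRIMARY KEY,
        embedding vector(3)
    )
""")
cur.execute("INSERT INTO items (embedding) VALUES ('[1, 2, 3]')")

# <-> is pgvector's L2-distance operator: nearest-neighbor search in plain SQL.
cur.execute("SELECT id FROM items ORDER BY embedding <-> '[3, 1, 2]' LIMIT 5")
print(cur.fetchall())
conn.commit()
```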
The Promise and Challenge of Durable Execution
The final trend Chris discussed – durable execution – might be the most complex but also the most promising. As systems become more distributed, ensuring exactly-once semantics across multiple services becomes increasingly challenging. Durable execution frameworks attempt to solve this by logging all operations and managing replays after failures.
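The core mechanic is easier to see in code. The toy sketch below is my own illustration, not any specific framework: each step's result is journaled to disk, and on a rerun after a crash, completed steps are replayed from the journal instead of re-executed. This is what gives effectively-once behavior, and it is also why steps must be deterministic.

```python
import json
import os

JOURNAL = "workflow.journal"  # illustrative path for the durable log

def load_journal() -> dict:
    """Read back the results of steps that completed before a crash."""
    results = {}
    if os.path.exists(JOURNAL):
        with open(JOURNAL) as f:
            for line in f:
                entry = json.loads(line)
                results[entry["step"]] = entry["result"]
    return results

def step(name: str, fn, journal: dict):
    """Run fn once; on replay, return the journaled result instead."""
    if name in journal:
        return journal[name]  # replay: skip re-execution after a failure
    result = fn()  # fn must be deterministic for replay to be safe
    with open(JOURNAL, "a") as f:
        f.write(json.dumps({"step": name, "result": result}) + "\n")
        f.flush()
        os.fsync(f.fileno())  # make the log entry durable before proceeding
    return result

def workflow():
    journal = load_journal()
    order = step("create_order", lambda: {"order_id": 42}, journal)
    charge = step("charge_card", lambda: {"charged": order["order_id"]}, journal)
    return charge

print(workflow())  # rerun after a crash: completed steps come from the journal
```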
While powerful, current implementations often require deep understanding of concepts like determinism and careful management of code deployments. Chris highlighted three emerging approaches that might make durable execution more accessible:
- Language Primitives: Projects like Resonate that leverage async/await semantics
- Stream Processing: Applying lessons learned from decades of stream processing
- Workflow Orchestration: Using workflow systems to handle some durable execution use cases
The evolution of these approaches, particularly in tools like Prefect 3, suggests we're moving toward more practical and maintainable solutions for ensuring reliable execution in distributed systems.
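For the workflow-orchestration flavor, here is a minimal Prefect sketch. The `@flow` and `@task` decorators and the `retries`, `retry_delay_seconds`, and `persist_result` options are real Prefect APIs; the pipeline itself is a made-up example rather than anything from the talk.

```python
from prefect import flow, task  # Prefect's core decorators

# retries and persist_result are real Prefect task options; the pipeline
# below is an illustrative example, not from the keynote.
@task(retries=3, retry_delay_seconds=10, persist_result=True)
def extract() -> list[int]:
    return [1, 2, 3]

@task(retries=3, retry_delay_seconds=10, persist_result=True)
def load(rows: list[int]) -> int:
    return sum(rows)

@flow
def pipeline() -> int:
    # Prefect tracks each task run's state and result, so failed runs can
    # be retried without redoing work that already succeeded.
    return load(extract())

if __name__ == "__main__":
    print(pipeline())
```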
Looking Ahead
These trends point to a future where infrastructure becomes both more powerful and more accessible. The decomposition of monolithic systems, whether databases or storage layers, combined with new paradigms for ensuring reliability, suggests we're entering an era of more flexible and maintainable infrastructure.
For developers and architects, understanding these trends is crucial for making informed decisions about system design and technology choices. As Chris's insights reveal, we're not just seeing incremental improvements but fundamental shifts in how we build and operate modern systems.
This article is based on Chris Riccomini's keynote presentation at Prefect Summit. For more insights on infrastructure trends, follow Chris's newsletter at materializedview.io.