Snowflake

Software Engineer

Location

PL-Warsaw

Type

Full Time

Salary

USD 160,000 – 220,000

Level

Mid

Role

Backend Engineer

Posted

Apr 29, 2026


The role

Summary

This Software Engineer role at Snowflake focuses on building industry-leading data integration and connector infrastructure for cloud-based data warehousing. You'll design scalable applications that replicate data from diverse sources into Snowflake using change data capture (CDC) patterns, develop a robust connector platform with APIs and frameworks for both internal and third-party developers, and optimize performance for an enterprise SaaS operation serving hundreds of customers that process millions of complex queries daily. The ideal candidate brings 3+ years of experience building large-scale distributed systems, strong Java and SQL expertise, and the ability to thrive in a fast-moving environment while helping redefine how enterprise data integration works.

What you'll do

Data Integration Architecture Design: Design and develop data integration and processing applications that replicate data from diverse sources including relational databases, SaaS platforms, and data streams into Snowflake, implementing change data capture (CDC) patterns to ensure data consistency and reliability across enterprise systems.
Connector Platform Development: Develop and extend a robust, standardized connector platform that accelerates development of connectors for both internal Snowflake engineering teams and third-party external developers, including building core services like scheduling systems that ensure timely task execution and optimal resource allocation.
Performance Optimization and Troubleshooting: Optimize ingestion pipeline performance, conduct technical investigations to troubleshoot data integration issues, ensure secure data transfer from external systems, and meet performance SLAs for customers across the Snowflake ecosystem.
Design Documentation and Stakeholder Communication: Create comprehensive design documents describing architectural decisions, data flows, and technical approaches, presenting these designs to local architects, technical leadership, and other stakeholders for feedback and alignment on engineering direction.
Cross-functional Collaboration: Coordinate synchronous and asynchronous communication across engineering teams to ensure project goals are met, collaborate with Product Managers and enterprise customers to understand business requirements, and translate them into scalable connector solutions.
API and Framework Design: Create well-architected APIs and frameworks that enable other developers to build connectors efficiently, following best practices for API design, documentation, and developer experience to support the Snowflake connector ecosystem.
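To give a concrete feel for the connector and CDC work described above, here is a minimal Java sketch. It is a hypothetical illustration only, not Snowflake's actual connector API: the `SourceConnector` interface, `ChangeEvent` record, and `Op` enum are invented names showing the general shape of a change-data-capture polling contract that a connector platform might standardize.

```java
import java.util.List;

public class ConnectorSketch {
    // Hypothetical CDC operation types a source might emit.
    enum Op { INSERT, UPDATE, DELETE }

    // Hypothetical change event: operation, affected row key, and row payload
    // (null for deletes, where only the key is needed).
    record ChangeEvent(Op op, String key, String payload) {}

    // Hypothetical connector contract: poll a source for the next batch of
    // change events. A real platform would add offsets, retries, and schema.
    interface SourceConnector {
        List<ChangeEvent> poll();
    }

    public static void main(String[] args) {
        // Stub connector returning a fixed batch, for illustration only.
        SourceConnector stub = () -> List.of(
            new ChangeEvent(Op.INSERT, "user:1", "{\"name\":\"Ada\"}"),
            new ChangeEvent(Op.DELETE, "user:2", null)
        );
        for (ChangeEvent e : stub.poll()) {
            System.out.println(e.op() + " " + e.key());
        }
    }
}
```

A production design would extend this contract with checkpointed offsets and scheduling hooks so the platform's scheduler, rather than each connector, decides when `poll()` runs.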

What we look for

Technical

Java Proficiency: Strong, production-level fluency in Java including deep understanding of object-oriented design, concurrency patterns, memory management, and Java ecosystem tools for building high-performance backend systems.
Distributed Systems Design: Proven experience designing, building, and supporting large-scale distributed systems in cloud environments, with strong understanding of scalability challenges, eventual consistency, fault tolerance, and cloud infrastructure patterns.
SQL and Relational Databases: Solid experience with SQL query optimization, relational database architecture, transaction management, and data modeling to effectively work with various data sources and Snowflake's data warehouse capabilities.
High-Performance Systems Engineering: Experience building high-performance, scalable software systems in internet-scale cloud environments with attention to latency optimization, throughput maximization, resource efficiency, and multi-tenancy considerations.
API Design and Development: Demonstrated interest in creating well-thought-out APIs and infrastructure abstractions that serve both internal teams and external developers, with understanding of API versioning, backward compatibility, and developer experience.

Education

Bachelor's Degree in Computer Science or Related Field: Formal education in Computer Science, Computer Engineering, Software Engineering, or equivalent field providing foundational knowledge in algorithms, data structures, and software architecture principles.

Experience

3+ Years Large-Scale Systems Experience: Minimum 3 years of professional industry experience designing, building, and supporting large-scale systems, demonstrating ability to navigate complex technical challenges and deliver production-quality solutions at scale.
Cloud Platform Engineering: Experience working with cloud platforms and internet-scale distributed systems, understanding cloud service models, networking, storage, and compute optimization relevant to enterprise SaaS environments.
Data Systems or Integration Experience (Preferred): Background with data pipelines, ETL systems, data warehousing platforms, or data integration tools is valuable for understanding data flow patterns and optimization techniques in modern data platforms.

Skills

Required skills

Java: Production-grade Java development for building scalable backend systems and distributed applications
SQL: Advanced SQL for query optimization, data modeling, and integration with relational database systems
Distributed Systems Design: Architectural knowledge of building scalable, fault-tolerant systems in cloud environments
API Design: Designing well-structured, maintainable APIs that serve diverse consumer needs
System Performance Optimization: Techniques for identifying and resolving performance bottlenecks in large-scale systems

Nice to have

Change Data Capture (CDC) Patterns: Familiarity with CDC implementation approaches for data replication and event streaming
Kafka or Stream Processing: Experience with event streaming platforms or real-time data processing frameworks
Data Warehouse Technologies: Knowledge of data warehouse platforms, cloud data integration, or similar systems
Microservices Architecture: Experience designing and building microservices-based systems with appropriate service boundaries
Cloud Platforms (AWS, GCP, Azure): Hands-on experience deploying and scaling applications on major cloud platforms
Kubernetes and Containerization: Experience with Docker and Kubernetes for container orchestration in production environments

Compensation & benefits

Salary

USD 160,000 – 220,000 (annual)

Stock options

Available

Benefits

Industry-Leading Data Platform Innovation

Work on cutting-edge cloud technologies powering the era of the agentic enterprise, building an industry-leading data platform that hundreds of customers depend on for millions of complex queries daily.

Learning and Skill Development

Gain deep expertise in robust enterprise SaaS architecture, highly scalable data processing platforms running on thousands of machines, and modern user interface design bridging enterprise and consumer experiences.

World-Class Engineering Team

Collaborate with industry veterans and rising stars in a fast-moving environment that values innovation, experimentation, and low-ego problem-solving with rapid testing of emerging capabilities.

Significant Business Impact

Help redefine how work gets done in the AI-native enterprise by building critical data integration infrastructure connecting Fortune 500 companies to Snowflake's cloud data warehouse.

Career Growth at Scaling Company

Join Snowflake during rapid growth phase where the company is scaling its engineering team, providing opportunities to grow leadership skills and technical expertise alongside the organization.


Interview process

  1. Initial Screening: Phone conversation with a recruiter to assess background, technical interests, and alignment with Snowflake's culture of innovation and AI-native thinking in problem-solving.
  2. Technical Phone Screen: Technical conversation with an engineer on the data connectors team covering data integration concepts, system design fundamentals, Java proficiency, and approach to building scalable distributed systems.
  3. System Design Interview: In-depth technical discussion on designing a data connector system, handling distributed architecture challenges, API design principles, and scalability considerations relevant to Snowflake's platform.
  4. Coding Interview: Live coding session focused on Java problem-solving, likely involving data processing logic, performance optimization, or API implementation scenarios relevant to connector platform development.
  5. Team Collaboration Interview: Conversation with the hiring manager and team members exploring your experience collaborating with cross-functional teams, communication style, ability to work in fast-moving environments, and alignment with Snowflake's values.
  6. Final Round Discussion: Meeting with technical leadership or an architect discussing your architectural thinking, long-term career goals, and how you approach building well-thought-out infrastructure and APIs for developer ecosystems.



Snowflake


Snowflake is an American cloud computing company offering data warehousing and analytics platforms.

Bozeman, Montana, United States. Founded 2012. snowflake.com

Tech Stack

Languages
Java, SQL, Python
Frameworks
Spring Framework, Kafka, Protocol Buffers
Databases
Snowflake, PostgreSQL, MySQL
Tools
Git, Docker, Kubernetes, CI/CD Pipelines
Other
Cloud Infrastructure (AWS/GCP/Azure), Monitoring and Observability, API Documentation (OpenAPI/Swagger)
