Snowflake

Senior Software Engineer- Sharing Foundations

Location

US-CA-Menlo Park

Type

Full Time

Salary

USD 200,000 – 287,500

Level

Senior

Role

Senior Software Engineer

Posted

Dec 11, 2025


The role

Summary

Snowflake is seeking a Senior Software Engineer for its Sharing Foundations team to build revolutionary Data Sharing technology and Snowflake Data Marketplace infrastructure. The role requires 7+ years of distributed-systems experience, with expertise in Java, Scala, or C++, cloud-native services, and open-source data lake formats such as Apache Iceberg.

What you'll do

Lead Data Sharing Initiatives: Lead highly impactful initiatives around Snowflake Data Sharing and Snowflake Data Marketplace development
Build Secure Software Solutions: Innovate and build highly secure and reliable software to enable data-driven customer solutions
Architect Integration Systems: Design and build systems that integrate open source technologies with Snowflake for massive data lake architectures
Open Source Collaboration: Collaborate with Snowflake's open-source team and Apache Iceberg community to contribute features and enhance REST specifications
Design Distributed Platforms: Design and implement highly available distributed platforms within the global Snowflake infrastructure
Revolutionize Data Distribution: Transform how organizations distribute, consume, and use data as a strategic business asset
Ensure Operational Excellence: Maintain operational readiness of services and meet customer commitments for reliability, availability, and performance

What we look for

Technical

Distributed Systems Experience: 7+ years designing, building, and supporting large-scale distributed systems in production
Programming Proficiency: Strong skills in Java, Scala, or C++ with an emphasis on performance and reliability
Database Expertise: Deep understanding of distributed transaction processing, concurrency control, and high-performance query engines
Data Lake Experience: Experience with open-source data lake formats (Apache Iceberg, Parquet, Delta) and multi-engine interoperability
Cloud-Native Development: Experience building cloud-native services on AWS, Azure, or GCP
Data Governance Knowledge: Familiarity with data governance, security, and access control models in distributed systems

Education

Computer Science Degree: BS/MS/PhD in Computer Science or a related major, or equivalent experience

Experience

Production Systems: 7+ years of industry experience with large-scale distributed systems
Open Source Engagement: Passion for open-source software and community engagement in the data ecosystem

Skills

Required skills

Java Programming: Expert-level Java development skills for enterprise systems
Scala Programming: Proficiency in Scala for functional programming and big data processing
Distributed Systems: Deep expertise in designing and implementing large-scale distributed architectures
Cloud Platforms: Hands-on experience with AWS, Azure, or Google Cloud Platform
Apache Iceberg: Experience with the Apache Iceberg table format and data lake technologies
Performance Optimization: Skills in optimizing system performance and reliability

Nice to have

C++ Programming: Additional systems programming experience for performance-critical components
Apache Parquet: Experience with columnar storage formats and data processing optimization
Data Governance: Knowledge of data security, access control, and governance frameworks
Open Source Contribution: Active participation in open-source data ecosystem projects
REST API Design: Experience designing and implementing RESTful service architectures

Compensation & benefits

Salary

USD 200,000 – 287,500 (annual)

Stock options

Available

Benefits

Equity Compensation

Stock options and equity participation in Snowflake's growth

Healthcare Benefits

Comprehensive health insurance coverage

Professional Development

Opportunities for career advancement and technical skill development

Innovation Culture

Impact-driven, collaborative environment focused on innovation

Open Source Engagement

Opportunities to contribute to open-source projects and community engagement


Interview process

  1. Initial Screening: Phone or video call with a recruiter to discuss background and role fit
  2. Technical Phone Screen: 45-minute technical interview focusing on distributed systems and programming concepts
  3. System Design Interview: Architecture and design discussion for large-scale data systems
  4. Coding Assessment: Live coding session in Java/Scala focusing on algorithms and data structures
  5. Team Interview: Meet with team members to discuss collaboration and technical approach
  6. Final Round: On-site or virtual panel interview with senior engineers and the hiring manager

Apply for this position

You'll be redirected to the company's application page


Snowflake


Snowflake is an American cloud computing company offering data warehousing and analytics platforms.

Bozeman, Montana, United States · Founded 2012 · snowflake.com

Tech Stack

Languages
Java, Scala, C++
Frameworks
Apache Iceberg, Spring Boot, Apache Parquet
Databases
Snowflake Data Cloud, Distributed Data Systems, Data Lake Architectures
Tools
AWS, Azure, GCP, Docker, Kubernetes
Other
REST APIs, Microservices, Data Governance, Delta Lake
