Data Engineer Intern - Pune (2026)
Job ID: 8904ed85-75d6-48cf-b01b-7941677fde4a
Date posted: 04/21/2026

At Snowflake, we are powering the era of the agentic enterprise. To usher in this new era, we seek AI-native thinkers across every function who are energized by the opportunity to reinvent how they work. You don’t just use tools; you possess an innate curiosity, treating AI as a high-trust collaborator that is core to how you solve problems and accelerate your impact. We look for low-ego individuals who thrive in dynamic and fast-moving environments and move with an experimental mindset — who rapidly test emerging capabilities to discover simpler, more powerful ways to deliver results. At Snowflake, your role isn't just to execute a function, but to help redefine the future of how work gets done.
There is only one Data Cloud. Snowflake’s founders started from scratch and designed a data platform built for the cloud that is effective, affordable, and accessible to all data users. But it didn’t stop there. They engineered Snowflake to power the Data Cloud, where thousands of organizations unlock the value of their data with near-unlimited scale, concurrency, and performance. This is our vision: a world with endless insights to tackle the challenges and opportunities of today and reveal the possibilities of tomorrow.
We’re looking for dedicated students who share our passion for ground-breaking technology and want to create a lasting future for themselves and Snowflake. The Data Engineering Team designs, builds, and oversees the deployment and operation of the technology architecture, solutions, and software that capture, manage, store, and utilize structured and unstructured data from internal and external sources. The team establishes processes and structures, based on business and technical requirements, to channel data from multiple inputs, route it appropriately, and store it using any combination of distributed (cloud) structures, local databases, and other applicable storage forms. It also reviews internal and external business and product requirements for data operations and activity, and suggests changes and upgrades to systems and storage to accommodate ongoing needs.
WHAT WE OFFER:
Paid, full-time internships in the heart of the software industry
Post-internship career opportunities (full-time and/or additional internships)
Exposure to a fast-paced, fun and inclusive culture
A chance to work with world-class experts on challenging projects
Opportunity to provide meaningful contributions to a real system used by customers
High level of access to supervisors (manager and mentor), detailed direction without micromanagement, feedback throughout your internship, and a final evaluation
WHAT WE EXPECT:
Must be actively enrolled in an accredited college/university program during the time of the internship
Desired Class Level: B.Tech/M.Tech in progress
Desired majors: Computer Science, Software Engineering, Technology, or a related field.
Strong proficiency in Python for data manipulation, automation, and application development, and a solid understanding of SQL and relational database concepts.
Preliminary experience with big data technologies across multiple domains.
Ability to interact professionally with a diverse group of functional/technical teams and business leaders.
Ability to continuously innovate and drive process improvement.
Problem solving and critical thinking skills, technical acumen, and excellent communication skills with the ability to clearly articulate complex ideas.
Bonus experience: hands-on experience with Snowflake (SnowSQL); experience with CI/CD pipelines and version control (e.g., GitHub/GitLab); familiarity with R or other statistical scripting languages.
Strong desire to learn; you are energized when new features and technologies become available and eager to understand them and maximize their potential.
Good interpersonal skills and a strong sense of ownership.
WHAT YOU WILL LEARN / GAIN:
How to develop technical tools and programs that leverage artificial intelligence, machine learning, and big-data techniques to cleanse, organize, and transform data, and to maintain, defend, and update data structures and integrity on an automated basis.
How to create and manage declarative automations and data validation to support business processes.
Exposure to partnering directly with business stakeholders to gather requirements and translate business needs into technical data solutions.
How to design and develop innovative applications that leverage AI and machine learning at their core to drive automated insights.
How to follow best practices to produce clean, efficient, and maintainable data pipelines and configurations.
POSSIBLE TEAMS / WORK FOCUS AREAS (if applicable):
Data Platform & Enablement: Focus on building and scaling the underlying infrastructure that powers the Data Cloud. You will work on automating platform setup and creating self-service tools for other engineering teams.
AI-Driven Automation: Design and implement intelligent solutions that use AI/ML to replace manual tasks, specifically focusing on automating the lifecycle of data pipeline creation.
Observability & Operations: Build advanced monitoring and alerting frameworks that utilize AI to predict and resolve pipeline failures before they impact the business.
Data Engineering Excellence: Collaborate with the core engineering team to refine data architectures, ensuring high-quality, high-integrity data flows across the Snowflake ecosystem.
Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake.
How do you want to make your impact?
For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com