
Job Title: Data Engineering Intern

Duration: 6 Months
Stipend: ₹15,000/month
Location: Remote (Work From Home)
Eligibility: Open to Freshers & Final-Year Students
PPO Opportunity: Yes

About Us

Renan Partners is an AI transformation company founded in 2024. We help businesses adopt and leverage AI to drive growth and efficiency.

Our founders are serial entrepreneurs and graduates of IIT Bombay with 40+ years of combined experience. They have previously led teams of 25,000+ at organizations such as Amazon, HSBC, Trell, CoinDelta, and Jar.

Our Products and Services:

● Friday – AI Business Application for Data Analytics
● Databrewery – AI Model Training Platform
● AI Consulting & Project Execution Practice

Role Overview

As a Data Engineering Intern, you'll work closely with experienced engineers and data
professionals to build, maintain, and scale data pipelines and infrastructure. This role is ideal
for someone looking to gain practical skills in data ingestion, transformation, and storage
using industry tools and technologies.

Key Responsibilities

● Understand and document data architecture and infrastructure
● Assist in developing and maintaining ETL/ELT pipelines
● Work with structured and unstructured data across different sources
● Help optimize data workflows for performance and scalability
● Participate in designing and maintaining data warehouses and data lakes
● Collaborate with analysts and data scientists to ensure data readiness
● Ensure data quality and integrity across systems and processes
● Learn and apply best practices in data engineering and version control

Requirements

● Basic understanding of SQL and database systems
● Exposure to Python or another programming/scripting language
● Familiarity with cloud platforms (AWS, GCP, or Azure) is a plus
● Knowledge of data formats (CSV, JSON, Parquet) and APIs
● Understanding of ETL/ELT concepts and data pipeline architecture
● Eagerness to learn data engineering tools such as Apache Airflow, dbt, or Spark

What We're Looking For

● Passion for working in a fast-paced startup environment
● Curiosity and willingness to explore new technologies
● Self-starter with strong problem-solving skills
● Attention to detail and commitment to data reliability
● Strong communication and collaboration skills
● Ability to manage time and meet deadlines independently


What We Offer

● Mentorship from IIT Bombay alumni and industry leaders
● Hands-on experience with real-world data engineering challenges
● Opportunity to work on projects with direct business and AI impact
● Exposure to scalable data pipelines, infrastructure, and tools
● A learning-focused culture with a clear path to a PPO (Pre-Placement Offer)
