Responsibilities:
Assist in designing, building, and maintaining data pipelines and ETL/ELT processes
Support data ingestion from various sources (APIs, databases, files, streaming data)
Help clean, transform, and validate data to ensure quality and accuracy
Work with cloud-based data platforms (e.g., AWS, GCP, or Azure)
Support database management, including data warehousing and data lakes
Collaborate with cross-functional teams to understand data requirements
Assist in monitoring data workflows and troubleshooting basic issues
Document data processes, pipelines, and technical workflows
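The pipeline work described above can be pictured as a minimal extract-transform-load step. The sketch below is purely illustrative and not part of the role description: the CSV snippet, the `orders` table, and all column names are hypothetical, and it uses only the Python standard library.

```python
import csv
import io
import sqlite3

# Hypothetical raw extract, e.g. a CSV export pulled from an API or file drop.
RAW_CSV = """order_id,amount,country
1,19.99,US
2,,DE
3,5.50,US
"""

def extract(source: str) -> list:
    """Read raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list) -> list:
    """Validate and clean: drop rows with a missing amount, cast types."""
    clean = []
    for row in rows:
        if not row["amount"]:  # basic data-quality check
            continue
        clean.append((int(row["order_id"]), float(row["amount"]), row["country"]))
    return clean

def load(rows: list) -> sqlite3.Connection:
    """Load cleaned rows into an in-memory SQLite 'warehouse' table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, country TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()
    return conn

conn = load(transform(extract(RAW_CSV)))
total = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(total)  # row 2 has no amount and is dropped by validation, so 2 rows load
```

Real pipelines would read from live sources and write to a managed warehouse, but the extract/transform/load split and the validation step are the same shape at any scale.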
Qualifications:
Currently pursuing or recently completed a degree in Computer Science, Information Systems, Data Science, Engineering, or a related field
Basic understanding of data concepts (databases, tables, schemas, ETL)
Familiarity with SQL and basic data querying
Basic programming knowledge in Python, Java, or Scala
Understanding of cloud computing fundamentals
Strong analytical thinking and problem-solving skills
Willingness to learn and ability to work in a team environment
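The level of SQL familiarity listed above is roughly "write a basic aggregate query." As an illustration only (the `events` table and its contents are hypothetical), here is that kind of query run through Python's built-in sqlite3 module:

```python
import sqlite3

# Hypothetical events table, created in memory purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, event_type TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, "click"), (1, "purchase"), (2, "click"), (3, "click")],
)

# A typical starter query: aggregate with GROUP BY and sort the result.
rows = conn.execute(
    """
    SELECT event_type, COUNT(*) AS n
    FROM events
    GROUP BY event_type
    ORDER BY n DESC
    """
).fetchall()
print(rows)  # [('click', 3), ('purchase', 1)]
```

Comfort with SELECT, WHERE, GROUP BY, and JOIN, plus enough Python to move rows in and out of a database, covers most of the day-to-day querying an intern would do.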