Discover the Technata Job Board

Find your next tech job in Kanata North, Canada’s largest technology park. Then explore endless international opportunities and dream about where your career will take you. With the country’s highest density of technology companies, ranging from promising startups to leading global giants, Kanata North is the place to be if you are serious about a career in tech.

Data Engineer - I

Siemens

Data Science
Posted on Dec 17, 2025

Data Engineer - I

Job ID
487357
Date posted
10-Dec-2025
Organization
Smart Infrastructure
Field of work
Research & Development
Company
Brightly Software India Private Limited
Experience level
Early Professional
Job type
Full-time
Work mode
Hybrid (Remote/Office)
Employment type
Permanent
Location(s)
  • Noida - Uttar Pradesh - India

Who we are:
We are seeking a motivated Data Engineer to join our data engineering team. In this role, you will help build, maintain, and optimize data pipelines and infrastructure within our AWS and Snowflake environments. This position is ideal for someone with strong Python and SQL skills and a growing interest in cloud-based data engineering.
What you will be doing:
  • Assist in building and maintaining scalable data pipelines that support analytics, reporting, and application needs.
  • Work with senior engineers to design and optimize data models in Snowflake.
  • Support ingestion, transformation, and integration of data from multiple sources.
  • Develop and maintain ETL/ELT scripts using Python (e.g., for data transformations, API integrations, automation).
  • Write efficient, well-structured SQL to extract, analyze, and validate data.
  • Assist with deploying and managing data workflows using AWS services such as S3, Lambda, Glue, and Step Functions (exposure/understanding expected; will be mentored).
  • Monitor pipeline performance, troubleshoot data quality issues, and contribute to continuous improvement.
  • Collaborate with data analysts, data scientists, and business partners to understand requirements and support data needs.
  • Follow best practices around version control, documentation, and data governance.
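The ETL/ELT work described above follows a common extract–transform–load pattern. As a rough illustration of that pattern (not Brightly's actual stack or data; SQLite stands in for Snowflake, and the records are hypothetical), a minimal Python pipeline might look like:

```python
import sqlite3

# Extract: hypothetical source records; in practice these might come
# from an API, S3 object, or upstream database.
def extract():
    return [
        {"id": 1, "temp_c": 21.5},
        {"id": 2, "temp_c": None},  # incomplete reading, filtered out below
        {"id": 3, "temp_c": 19.0},
    ]

# Transform: drop incomplete rows and convert Celsius to Fahrenheit.
def transform(rows):
    return [
        {"id": r["id"], "temp_f": r["temp_c"] * 9 / 5 + 32}
        for r in rows
        if r["temp_c"] is not None
    ]

# Load: write into a relational table (SQLite here as a stand-in
# for a cloud warehouse such as Snowflake).
def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS readings (id INTEGER PRIMARY KEY, temp_f REAL)"
    )
    conn.executemany(
        "INSERT INTO readings (id, temp_f) VALUES (:id, :temp_f)", rows
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
count, = conn.execute("SELECT COUNT(*) FROM readings").fetchone()
print(count)  # 2 valid rows loaded
```

Production pipelines add orchestration (e.g., Step Functions or Airflow), retries, and monitoring around this core shape, but the extract/transform/load separation stays the same.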
What you need:
  • 1–3 years of experience in data engineering, ETL development, or related field.
  • Solid understanding of SQL and relational database concepts.
  • Strong Python skills, including experience working with data libraries (e.g., Pandas, PySpark, Boto3).
  • Exposure to or foundational knowledge of AWS data services (e.g., S3, Lambda, Glue, Athena, DynamoDB).
  • Exposure to Snowflake or other cloud data warehouse platforms.
  • Basic understanding of data pipelines, batch/stream processing, and data modeling principles.
  • Strong problem-solving skills, attention to detail, and willingness to learn.
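The SQL skills listed above frequently show up in data-validation queries. As a hedged sketch (hypothetical table and data; SQLite here, though the same SQL runs on Snowflake with minor dialect changes), a single query can check for duplicate keys and missing values in one pass:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO orders VALUES
        (1, 10, 25.0),
        (2, 11, NULL),
        (2, 11, NULL),  -- duplicated order_id with missing amount
        (3, 12, 40.0);
""")

# Validation query: duplicate key count and NULL count in one scan.
checks = conn.execute("""
    SELECT
        COUNT(*) - COUNT(DISTINCT order_id)                AS duplicate_ids,
        SUM(CASE WHEN amount IS NULL THEN 1 ELSE 0 END)    AS null_amounts
    FROM orders
""").fetchone()

print(checks)  # (1, 2): one duplicated order_id, two NULL amounts
```

Checks like this are often wired into a pipeline step that fails loudly when the counts are nonzero, which is the idea that tools such as dbt tests formalize.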
What makes you stand out:
  • Familiarity with dbt, Airflow, or similar orchestration tools.
  • Experience with Git or other version control systems.
  • Knowledge of data quality and testing frameworks.
  • Understanding of CI/CD concepts.
What we offer:
  • Mentorship from experienced data engineers.
  • Opportunities to learn advanced AWS and Snowflake capabilities.
  • A collaborative environment focused on growth, innovation, and continuous learning.
  • Access to training resources and professional development programs.
The Brightly culture
We’re guided by a vision of community that serves the ambitions and wellbeing of all people, and our professional communities are no exception. We model that ideal every day by being supportive, collaborative partners to one another, conscientiously making space for our colleagues to grow and thrive. Our passionate team is driven to create a future where smarter infrastructure protects the environments that shape and connect us all. That brighter future starts with us.