Data Engineering - Working Student (Wroclaw)
Nokia
Working Student is a long-term paid internship that allows you to kick-start your journey in the IT world. As our trainee, you will gain experience in areas that correspond to full-time roles. The internship usually lasts from 6 to 24 months. We are open to flexible working hours so that you can balance working with us and your studies. #OpenToYou.
As a Working Student in Data Engineering, you will contribute to designing, building, and maintaining data pipelines and storage systems that support AI/ML workflows. You will work on practical tasks such as data preparation, pipeline development, and performance monitoring, gaining hands-on experience with modern data engineering tools and practices. This role offers an opportunity to learn and apply software development lifecycle (SDLC) principles in real-world AI/ML projects.
Location: Wroclaw, Poland
Nokia is a global leader in connectivity for the AI era. With expertise across fixed, mobile and transport networks, powered by the innovation of Nokia Bell Labs, we’re advancing connectivity to secure a brighter world.
- Flexible and hybrid working schemes
- Well-being programs to support your mental and physical health
- Opportunities to join and receive support from Nokia Employee Resource Groups (NERGs)
- Employee Growth Solutions to support your personalized career & skills development
- Diverse pool of Coaches & Mentors to whom you have easy access
- A learning environment which promotes personal growth and professional development - for your role and beyond
Kick-start your career in Data Engineering by working on real AI/ML projects that make an impact. As a Working Student, you’ll gain hands-on experience with modern data pipelines, cloud technologies, and best engineering practices while learning from an experienced team. This is your chance to turn theory into practice and grow in a fast-moving, data-driven environment.
Must-Have:
- Active student status for at least one year
- Knowledge of Python programming
- Understanding of SQL and relational databases
- Familiarity with data analysis libraries (e.g., Pandas)
- Interest in data engineering principles and ETL processes
- Willingness to learn workflow orchestration tools (e.g., Airflow, Kubeflow)
- Ability to work independently on assigned tasks and communicate effectively
It would be nice if you also had:
- Exposure to AI/ML concepts and frameworks (e.g., scikit-learn, TensorFlow)
- Experience with version control (Git) and basic software engineering practices
- Familiarity with cloud platforms (AWS, Azure, GCP)
- Previous academic or project experience in data processing or analytics
As part of our team, you will:
- Assist in building and maintaining data pipelines for batch, ETL, and real-time processing
- Support optimization of data processing and storage systems
- Participate in data transformation, enrichment, and integration workflows
- Help monitor data quality, pipeline performance, and model metrics
- Contribute to technical documentation and testing activities
- Collaborate with team members to ensure data readiness for AI/ML models
- Learn and apply best practices in software engineering and DevOps