Data Engineering and Analytics

Submit your application: your first step toward a successful career with us!

All received applications will be taken into consideration, and candidates will be notified about their application status in a timely manner.

Data is everywhere, from sensor logs to satellite images. Data fuels the modern world, but raw data is not enough. To turn it into insight, we need smart, scalable, and reliable data pipelines. If you’re curious about how large volumes of data are processed, transformed, and analyzed in the cloud, this internship is your gateway. Over the course of three months, you’ll explore the world of cloud platforms and data engineering hands-on. Get ready to challenge yourself, learn fast, and build something meaningful around real-life data. Here is what the internship looks like:

Internship Timeline

A guiding internship timeline is given below. We’ll stay agile along the way and adapt the plan based on progress and new ideas.

Week 1: Onboarding & Python Basics
Week 2: Introduction to Databricks & Data Exploration
Week 3: Ingestion & Bronze Layer
Week 4: Data Cleaning & Silver Layer
Week 5: Feature Engineering for Gold Layer
Week 6: Visual Analytics
Week 7: Access Control
Week 8: Data Anonymization
Week 9: Testing, Logging, Pipeline Finalization
Week 10: Anomaly Detection (optional)
Week 11: Advanced Analytics (optional)
Week 12: Wrap-up & Presentation

Employee Benefits:

Office-based or remote work, depending on the situation

Professional & dynamic team

Professional development opportunities

Competitive salaries & benefits

Additional health insurance, sport & social activities

International work environment & traveling opportunities

Required Skills and Qualifications

We’re not expecting you to be an expert — this internship is designed to help you learn. Still, to get the most out of the experience, you should bring:

Basic programming knowledge, preferably in Python

Data foundations: tables, SQL, data types

Interest in cloud technologies: curiosity about platforms like Azure or AWS

Good command of English, both written and spoken

Logical thinking and a problem-solving mindset

Motivation to learn, experiment, and ask questions

Responsibility and consistency: the project will build week by week

Nice to have

Familiarity with Microsoft Azure services or the Azure portal

Understanding of ETL concepts or data pipelines

Experience with Jupyter Notebooks and Python

Exposure to Power BI, Excel, or any data visualization tool

Previous project work (personal or academic) involving data processing

Basic knowledge of version control (Git) and/or Scrum (Jira)

We look forward to reviewing your application.

Best of luck in the selection process!