We are seeking a Junior Data Engineer who is eager to learn, grow, and contribute to the way we move, store, and process data within our Corporate Risk and Broking (CRB) business. Whether you have experience with cloud platforms such as Azure, Google Cloud, or AWS, or you’re just starting out, what matters most is your curiosity, motivation, and willingness to learn.
If you’re someone who thrives in a fast-paced, collaborative environment, and you’re keen to develop your skills in modern data engineering practices, we’d love to hear from you.
The Role
- Designing, building, and optimizing data pipelines using Azure Synapse, Azure Data Factory, and Microsoft Fabric.
- Writing and fine-tuning PySpark notebooks to handle massive data workloads efficiently.
- Troubleshooting and enhancing ETL/ELT workflows in Azure Synapse.
- Managing and organizing Data Lakes to ensure seamless data access and performance.
- Integrating AI/LLM models into data pipelines to drive innovation and insights.
- Collaborating with Data Scientists, AI Engineers, Data Analysts, and Business domain experts to create powerful data-driven solutions.
- Participating in and assisting with data security, governance, and compliance within our Azure ecosystem.
- Staying ahead of the curve with emerging cloud, AI, and big data technologies.
Qualifications
- 1+ years of experience in Data Engineering, with competency in Cloud Data Tools.
- Solid programming skills in a language such as C# or Python.
- Understanding of data warehousing, data modeling, and basic data architecture, gained through experience or education.
- Strong problem-solving skills and a knack for debugging tricky data issues.
- Excitement about working with data and the business outcomes it can drive.
- Deep curiosity about how things work and are built.
- Great communication skills and a team-player attitude.
Bonus Points If You Have
- Azure Data Engineer Associate or Azure Solutions Architect certification.
- Experience with real-time streaming solutions like Azure Stream Analytics, Kafka, or Event Hubs.
- Familiarity with Databricks and its integration with Azure Synapse.
- Knowledge of Graph Databases and NoSQL technologies.
- Knowledge of AI/LLM applications and how they connect with data pipelines.
- Experience with CI/CD pipelines, DevOps, and Infrastructure as Code (IaC).