Are you a skilled and experienced Data Engineering Specialist with a passion for working in an enterprise-level environment? Do you thrive on tackling complex challenges and shaping data integration systems that drive operational excellence and compliance? If so, we have a fantastic opportunity for you!
We are seeking a Data Engineering Specialist to join our client’s team, a global leader in the financial industry. As a Data Engineering Specialist, you will play a pivotal role in the finance integration of special-case legal entities on a global scale. This position requires strong expertise in Azure Databricks, Azure Data Factory, and Azure Data Lake, coupled with a deep understanding of data engineering best practices.
The role responsibilities:
- Troubleshoot user issues related to Databricks (Spark) and recommend optimisations for resource-intensive jobs.
- Provide guidance to users on Databricks cluster management best practices.
- Collaborate with diverse teams to develop, implement, and maintain high-performance, scalable data-processing and data integration systems.
- Enhance data access methods to ensure a secure and compliant self-service platform.
- Assist users in finding cost-effective setups while maintaining efficiency.
- Implement observability solutions in Databricks to optimise costs.
- Drive the adoption of cutting-edge Databricks features.
- Review existing infrastructure to identify and implement cost-saving opportunities.
The person we’re looking for:
- 3+ years of experience in data engineering application development using Spark, preferably in an enterprise environment on Microsoft Azure.
- 5+ years of experience in building enterprise big data/data engineering applications using continuous integration tools like Azure DevOps.
- Strong expertise in Databricks and Spark, including the ability to train and guide others.
- Proficiency in Python/PySpark and SQL (Spark SQL).
- Familiarity with Git version control.
- Sound understanding of reusable software design patterns.
- Experience with Azure Data Factory, Azure Data Lake, Azure Log Analytics, and Azure Key Vault.
- Proven track record in designing and implementing production-grade solutions.
- Commitment to delivering high-quality code with a focus on simplicity, performance, maintainability, and scalability.
- Practical experience in applying agile methodologies and DevOps practices.
- Excellent analytical and conceptual skills to comprehend complex technology stacks and their dependencies.
- Experience with enterprise system integration.
- Knowledge of the insurance industry.
- Familiarity with finance & accounting or actuarial concepts.
Join us in this exciting opportunity to shape the future of data engineering within a global financial leader. Apply your expertise to drive operational excellence, compliance, and cost savings while working with cutting-edge technologies in a collaborative and innovative environment.
Note: The client’s name and specific details will be disclosed during the later stages of the selection process. Apply now to explore this exciting opportunity further.
By applying for this opportunity, you agree that Zenith People Ltd may share your details with the end client at the shortlist stage. If you have the relevant skills and the drive to enhance your career, we would love to speak with you. Apply now or give us a call – 0191 428 6444.
Job Category: IT and Digital
Salary: £650 Per Day