Division: Group Digital Capabilities (GDC)
Is data technology your passion? Can you autonomously design, develop and deploy data pipelines, ensuring they deliver high-quality data that serves as input for our machine learning engineers? Then this is the job for you.
In this role, you will collaborate closely with our machine learning engineers and cloud architects on projects, always with a quality-focused, end-to-end approach. You will support the reporting teams in the data exploration and data preparation phases and implement data quality controls. You will liaise with IT infrastructure teams to address infrastructure issues and to ensure that the components and software used on the platform are consistent.
As a senior, you will also build and share knowledge with your colleagues while enjoying specialized training to keep you at the top of your field.
Join Euroclear
Euroclear is a financial services company that specializes in the settlement of securities transactions, as well as the safekeeping and asset servicing of these securities. We are located in Brussels and in several major cities in Europe and around the world. We are deeply convinced that diversity of talents, backgrounds and opinions is key to success, as it encourages engagement, energy and innovation.
You will join the AIR (Analytics, Insight and Reporting) tribe in the GDC division and our dedicated in-house team of data specialists, who take a pragmatic best-tool-for-the-job approach to optimizing our hybrid infrastructure. With a solid focus on DataOps and MLOps, we firmly believe that robust, production-ready solutions are a crucial part of our work. The result? Our team provides an ecosystem of data-driven products to internal and external consumers.
We will train you to become a data engineer with a proven foundation in the field. You will excel at building digital, data-driven solutions and infrastructure, and become an architectural futurist who thoughtfully designs, develops and deploys infrastructure and data pipelines. Our languages? Java, Scala, Spark and SQL. When not coding, we speak English, as we have colleagues from all around the world; you will join a truly international team.
Job requirements
The Skills
1. Experience with the analysis and creation of data pipelines, data architecture and ETL/ELT development, and with processing structured and unstructured data, including post-go-live activities. Ability to analyse data, identify issues (e.g. gaps, inconsistencies) and troubleshoot them
2. Experience using data stored in RDBMSs, and experience with or at least some understanding of NoSQL databases
3. Knowledge of Scala and Spark, and a good understanding of the Hadoop ecosystem including Hadoop file formats like Parquet and ORC
4. Can write performant Scala code and SQL statements and design modular, future-proof solutions that are fit for purpose
5. Autonomous in working on Unix-based systems.
6. You have a true agile mentality and are capable of and willing to tackle tasks outside of your strengths to help the team
7. Experience in working with customers to identify and clarify requirements
8. You have a strong curiosity about fintech and data-related technologies.
9. You have good communication skills!
Nice to haves
10. Experience with open-source technologies used in data analytics, such as Spark, Hive, HBase and Kafka
11. Knowledge of Cloudera, IBM mainframe and/or Azure
12. You speak English fluently; any other language is a plus.
Don’t tick all the boxes? Not to worry: we are a people organization that believes in a growth mindset, so if you are an enthusiastic learner with a passion for technology, you’ll fit right in.