You will design and implement scalable data pipelines, ensuring efficient data flow and processing. You will optimize data storage, enable real-time analytics, and enhance data security. By leveraging GCP's advanced tools, you will facilitate data-driven decision-making, improve operational efficiency, and support business growth through insightful analytics and streamlined data management.
Your expertise drives impactful data solutions.
* Design, develop, maintain, and optimize data pipelines on GCP.
* Build and manage data warehouses and data lakes using GCP services like BigQuery, Cloud Storage, and Dataflow.
* Strong proficiency in GCP services such as BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and Cloud Composer.
* Expertise in SQL and experience with relational and NoSQL databases.
* Proficiency in programming languages such as Python or Scala, and experience with frameworks such as Apache Spark.
* Experience with ETL tools and frameworks.
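The pipeline work described above follows the classic extract-transform-load pattern. Below is a minimal, dependency-free sketch of that flow; it is an illustration only, not a Dataflow job, and every name in it (the functions, the `orders` table, the field names) is hypothetical. In practice the source would be something like Cloud Pub/Sub and the sink something like BigQuery.

```python
def extract(records):
    """Extract: yield raw events from a source (stand-in for Pub/Sub)."""
    for rec in records:
        yield rec

def transform(events):
    """Transform: drop malformed rows and normalize field names/units."""
    for ev in events:
        if "user" in ev and "amount" in ev:
            yield {
                "user_id": ev["user"].lower(),
                "amount_cents": int(ev["amount"] * 100),
            }

def load(rows, warehouse):
    """Load: append rows to an in-memory table (stand-in for BigQuery)."""
    warehouse.setdefault("orders", []).extend(rows)
    return len(warehouse["orders"])

# Example run: one malformed record is filtered out in the transform step.
raw = [{"user": "Ada", "amount": 12.5}, {"bad": "row"}, {"user": "Lin", "amount": 3.0}]
warehouse = {}
loaded = load(transform(extract(raw)), warehouse)
print(loaded)  # → 2
```

A managed pipeline adds scheduling (Cloud Composer), parallelism (Dataflow), and durable storage (BigQuery, Cloud Storage) around these same three stages.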
* Work with cutting-edge technology in the field of data engineering.
* Stay current with the latest developments and best practices on GCP.
* Engage in continuous (peer) learning.
* Participate at a tactical level within our company.
* Enjoy an open and informal company culture.
* Collaborate with awesome colleagues.