Google Pillar GCP Data Architect, Remote

  • Lisboa
  • Devoteam
At Devoteam, we believe that technology with strong human values can actively drive change for the better. Discover how Tech for People unlocks the future, creating a positive impact on the people and the world around us.

We are a leading global player in Digital Transformation for major organizations across EMEA, with a revenue of €652M. We believe in transforming technology to create value for our clients, partners and employees in a world where technology is developed for people. We are proud of the culture we have built together. We are proud of our people at the service of technology. We are proud of our diverse environment. Because we are #TechforPeople.

Join our multidisciplinary team of Cloud experts, Designers, Business consultants, Security experts, Engineers, Developers and other extraordinary talents, spread across more than 18 EMEA countries. Become one of our 8,000+ tech and business leaders on cloud, data and cyber security. Let's fuse creativity with technology together and build innovative solutions that actively change things for the better.

Our Devoteam G Cloud team is looking for a Google Cloud Data Architect to join our Google Cloud Platform specialists, working with one of our clients in the telco sector.

Responsibilities:

  • Deliver Data projects focused on the Engineering component;
  • Work with GCP Data Services such as BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub and Dataplex;
  • Write efficient SQL queries;
  • Develop data processing pipelines using programming frameworks like Apache Beam;
  • Automate data engineering tasks;
  • Build and manage data pipelines, with a deep understanding of workflow orchestration, task scheduling, and dependency management;
  • Integrate and stream data, including ingestion from various sources (such as databases, APIs, or logs) into GCP.
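The orchestration responsibilities above (task scheduling and dependency management) are the core idea behind tools like Apache Airflow and Google Cloud Composer: tasks form a dependency graph and run in an order that respects it. Purely as an illustration of that idea, not any vendor's API, here is a minimal pure-Python sketch; the task names (`extract`, `transform`, `load`, `report`) are invented for the example.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
deps = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

def run_pipeline(deps, actions):
    """Run each task exactly once, only after all its dependencies finished."""
    order = list(TopologicalSorter(deps).static_order())
    for task in order:
        actions[task]()
    return order

# Stand-in actions that just record execution; a real pipeline would
# call BigQuery, Dataflow, or other GCP services here.
log = []
actions = {name: (lambda n=name: log.append(n)) for name in deps}

order = run_pipeline(deps, actions)
print(order)
```

Real orchestrators add retries, scheduling, and parallel execution of independent tasks on top of exactly this dependency-ordering core.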
Requirements:

  • Bachelor's degree in IT or similar;
  • More than 6 years of professional experience, with expertise in the delivery of Data Engineering projects;
  • Experience with GCP Data Services: BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub and Dataplex;
  • Knowledge of programming languages: Python, Java, or SQL;
  • Experience with tools like Apache Airflow, Google Cloud Composer, or Cloud Data Fusion;
  • Knowledge of streaming data processing using tools like Apache Kafka;
  • GCP Certifications: Professional Data Engineer, Professional Cloud Database Engineer and/or Associate Cloud Engineer (nice to have);
  • Proficiency in English (written and spoken).
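Streaming data processing with tools like Kafka, Pub/Sub, or Beam typically means grouping unbounded event streams into time windows. As a rough, framework-free sketch of the tumbling-window idea only (the event data below is invented, and real systems also handle late and out-of-order events):

```python
from collections import Counter

def tumbling_window_counts(events, window_secs=60):
    """Count events per key within fixed (tumbling) time windows.

    `events` is an iterable of (timestamp_secs, key) pairs, roughly as a
    Pub/Sub or Kafka consumer might yield them after decoding messages.
    """
    counts = Counter()
    for ts, key in events:
        # Assign each event to the window containing its timestamp.
        window_start = int(ts // window_secs) * window_secs
        counts[(window_start, key)] += 1
    return counts

# Invented sample: three events in the first minute, one in the next.
events = [(3, "click"), (42, "click"), (59, "view"), (61, "click")]
result = tumbling_window_counts(events, window_secs=60)
print(result)
```

In Beam the same grouping is expressed declaratively (fixed windows plus a combine step), and the runner handles distribution and watermarking.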