Sr. Google Cloud Platform (GCP) Data Engineer / GCP Architect - Atlanta, GA (onsite; local candidates only; face-to-face interview required) - with experience in BigQuery, Dataflow, Pub/Sub, and Cloud Run; 8+ years of data engineering, ETL/ELT, SQL, Python, Linux scripting, data modeling/warehousing, ETL tools (DataStage/Informatica/SSIS), CI/CD, Git, and Terraform.
Job Summary:
We are seeking a skilled Google Cloud Platform (GCP) Data Engineer to design, build, and optimize data pipelines and analytics solutions in the cloud. The ideal candidate has hands-on experience with GCP data services, strong ETL/ELT development skills, and a solid understanding of data architecture, data modeling, data warehousing, and performance optimization.

Key Responsibilities:
- Develop ETL/ELT processes to extract data from various sources, transform it, and load it into BigQuery or other target systems.
- Build and maintain data models, data warehouses, and data lakes for analytics and reporting.
- Design and implement scalable, secure, and efficient data pipelines on Google Cloud Platform using tools such as Dataflow, Pub/Sub, Cloud Run, Python, and Linux scripting.
- Optimize BigQuery queries, manage partitioning and clustering, and handle cost optimization.
- Integrate data from on-premises and cloud systems using Cloud Storage and APIs.
- Work closely with DevOps teams to automate deployments using Terraform, Cloud Build, or CI/CD pipelines.
- Ensure security and compliance by applying IAM roles, encryption, and network controls.
- Collaborate with data analysts, data scientists, and application teams to deliver high-quality data solutions.
- Implement best practices for data quality, monitoring, and governance.

Required Skills and Experience:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum 8 years of experience in data engineering, preferably in a cloud environment.
- Minimum 3 years of strong hands-on expertise in Google Cloud Platform services:
  o BigQuery, Cloud Storage, Cloud Run, Dataflow, Cloud SQL, AlloyDB, Cloud Load Balancing, Pub/Sub, IAM, Logging, and Monitoring.
- Proficiency in SQL, Python, and Linux scripting.
- Prior experience with ETL tools such as DataStage, Informatica, or SSIS.
- Familiarity with data modeling (star/snowflake schemas) and data warehouse concepts.
- Understanding of CI/CD, version control (Git), and Infrastructure as Code (Terraform).
- Strong problem-solving and analytical mindset.
- Effective communication and collaboration skills.
- Ability to work in an agile, fast-paced environment.
- Google Cloud Professional Data Engineer or Professional Cloud Architect certification is a plus.