Data Engineer, Expert
Oakland, CA, US, 94612
Requisition ID # 167473
Job Category: Information Technology
Job Level: Individual Contributor
Business Unit: Information Technology
Work Type: Hybrid
Job Location: Oakland
Department Overview
Information Systems Technology Services is a unified organization composed of departments that collaborate to deliver high-quality technology solutions.
Position Summary
The Data Analytics and Insights team is seeking an experienced and talented Expert Data Engineer to join our growing team of analytics experts. As a key member of our team, you will play an essential role in the design, development, and maintenance of data pipelines and data and analytic products, including enterprise datasets, data applications, reports, and dashboards. We are looking for a proactive, detail-oriented, and motivated individual who can thrive in a fast-paced environment and help us scale our product development to meet our clients' ever-evolving needs. The data engineer will collaborate with our cross-functional team, including solution architects, data pipeline engineers, data analysts, and data scientists, on mission-critical initiatives and will ensure optimal delivery of data and analytic products.
You will have a unique opportunity to be at the forefront of the utility industry and gain a comprehensive view of the nation’s most advanced smart grid. It is the perfect role for someone who would like to continue to build upon their professional experience and help advance PG&E’s sustainability goals.
This position is hybrid, working from your remote office and Oakland, CA based on business needs.
PG&E is providing the salary range that can reasonably be expected for this position at the time of the job posting. This salary range is specific to the locality of the job. The actual salary paid to an individual will be based on multiple factors, including, but not limited to, internal equity, specific skills, education, licenses or certifications, experience, market value, and geographic location. The decision will be made on a case-by-case basis related to these factors. This job is also eligible to participate in PG&E’s discretionary incentive compensation programs.
Bay Area: $132,000 - $196,900
Job Responsibilities
- Work closely with Subject Matter Experts (SMEs) to design and develop data models, data pipelines, and front-end applications.
- Develop and optimize cloud-based data storage and processing solutions using Snowflake.
- Design, implement, and maintain robust data pipelines and ETL processes using Informatica.
- Collaborate with data analysts and data scientists to understand data requirements and deliver high-quality data solutions.
- Ensure data integrity and security across all data workflows and storage solutions.
- Monitor and troubleshoot data pipelines, addressing any issues promptly to ensure the smooth flow of data.
- Contribute to the development and implementation of data governance and best practices.
- Stay current with industry trends and advancements in data engineering technologies and methodologies.
- Lead, mentor, and support less experienced data engineers.
- Architect and standardize Snowflake environments across development, QA, and production, ensuring consistency in roles, permissions, and data access patterns.
- Lead Snowflake migration initiatives, including legacy system retirement and secure data sharing with external partners.
- Define and enforce Snowflake usage standards, including performance optimization, scalability, and security protocols.
- Coordinate with cross-functional teams (e.g., data governance, cybersecurity, business stewards) to align Snowflake implementations with enterprise data strategy.
Qualifications
Minimum:
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience.
- Minimum of 7 years of experience in data engineering.
Desired:
- Proven experience with ETL tools and Snowflake.
- Strong proficiency in SQL, Python, and other scripting technologies.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with big data technologies such as Hadoop, Spark, and Kafka is a plus.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
- Knowledge of data visualization tools such as Tableau or Power BI.
- Experience with machine learning engineering principles and frameworks.
Nearest Major Market: San Francisco
Nearest Secondary Market: Oakland