Data Engineer, Expert
San Ramon, CA, US, 94583
Requisition ID # 170276
Job Category: Information Technology
Job Level: Individual Contributor
Business Unit: Electric Operations
Work Type: Hybrid
Job Location: San Ramon
Department Overview
Electric Operations ensures the delivery of clean, safe, reliable and affordable energy to nearly 16 million people in Northern and Central California. Electric Operations is responsible for every aspect of PG&E's electric distribution and transmission operations, including planning, engineering, maintenance and construction, asset management, business planning, restoration and emergency response.
The Quality Management organization is made up of over 300 coworkers and contractors who help ensure the safe and reliable delivery of electricity to approximately 16 million people throughout a 70,000-square-mile service area in Northern and Central California. The organization is accountable for the company's program to underground 10,000 miles of electric distribution lines to reduce wildfire risk, the System Inspection Program to identify potential risks to the safety and reliability of the system, and the Maintenance & Construction Programs that ensure safe and reliable delivery of electricity. The Quality Management Planning & Improvement (P&I) team supports the organization in meeting or exceeding established operational targets, including compliance requirements, KPIs, budget, and risk reduction strategies that support the company's Purpose, Virtues and Stands.
Position Summary
Designs, develops, modifies, configures, debugs, and evaluates jobs that extract data from various sources, implement transformation logic, and store data in formats fit for use by stakeholders. Collects metadata about jobs, including data lineage and transformation logic. Works with teams, clients, data owners, and leadership throughout the development cycle, practicing continuous improvement.
This position is hybrid, working from your remote office and your assigned location based on business need.
PG&E is providing the salary range that the company in good faith believes it might pay for this position at the time of the job posting. This compensation range is specific to the locality of the job. The actual salary paid to an individual will be based on multiple factors, including, but not limited to, specific skills, education, licenses or certifications, experience, market value, geographic location, and internal equity. Although we estimate the successful candidate hired into this role will be placed towards the middle or entry point of the range, the decision will be made on a case-by-case basis related to these factors.
Bay Minimum: $140,000
Bay Maximum: $238,000
This job is also eligible to participate in PG&E’s discretionary incentive compensation programs.
Job Responsibilities
- Leads a team on moderately complex to complex data- and analytics-centric problems with broad impact that require in-depth analysis and judgment to obtain results or solutions.
- May contribute to the resolution of uniquely complex data- and analytics-centric problems with significant impact.
- Identifies, designs, and implements internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
- Resolves application programming analysis problems of broad scope within procedural guidelines.
- Provides assistance to other programmers/analysts on unusual or especially complex problems that cross multiple functional/technology areas.
- Conceptualizes and builds infrastructure that allows big data to be accessed and analyzed with verified data quality, and ensures metadata is appropriately captured and catalogued.
- Collaborates with peers to develop departmental standards, norms, and new goals/objectives.
- Plans work to meet assigned general objectives and reviews progress regularly; solutions may provide an opportunity for creative/non-standard approaches.
- Assesses data pipeline performance and suggests/implements changes as required.
- Communicates recommendations orally and in writing.
- Mentors/provides guidance to less experienced colleagues.
Qualifications
Minimum:
- BA/BS in Computer Science, Management Information Systems or related field of study, or equivalent experience
- 7 years of experience with data engineering/ETL ecosystems such as Palantir Foundry, Spark, Informatica, SAP BODS, OBIEE
- Experience with multiple data engineering/ETL ecosystems
- Experience with machine learning algorithm deployment
Desired:
- Master’s degree in Computer Science, Management Information Systems or related field, or equivalent experience
- Experience leading development teams
- Business Intelligence and data access tool expertise, including advanced SQL, data modeling, and performance optimization
- Strong software engineering fundamentals (Git, CI/CD, unit and integration testing) and experience with production data pipelines
- Proficiency in Python and SQL within Palantir Foundry, including PySpark-based transformations and data workflows (see the illustrative sketch below)
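As an illustration of the Foundry proficiency described in the last bullet, a minimal PySpark transform written against Foundry's transforms API might look like the sketch below. The dataset paths, function name, and column names are hypothetical placeholders and do not refer to any PG&E system.

```python
# A minimal sketch of a Foundry-style PySpark transform, for illustration only.
# All dataset paths and column names below are hypothetical.
from pyspark.sql import functions as F
from transforms.api import transform_df, Input, Output


@transform_df(
    Output("/Company/pipelines/datasets/inspections_clean"),   # hypothetical output dataset
    raw=Input("/Company/pipelines/datasets/inspections_raw"),  # hypothetical input dataset
)
def clean_inspections(raw):
    # Standardize the date column, drop rows missing an asset identifier,
    # and keep a deduplicated, analysis-ready view of the raw records.
    return (
        raw.withColumn("inspection_date", F.to_date("inspection_date", "yyyy-MM-dd"))
        .filter(F.col("asset_id").isNotNull())
        .dropDuplicates(["asset_id", "inspection_date"])
    )
```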