Expert Data Analyst
Oakland, CA, US, 94612
Requisition ID # 162232
Job Category: Business Operations / Strategy
Job Level: Individual Contributor
Business Unit: Electric Engineering
Work Type: Hybrid
Job Location: Oakland
Department Overview
Electric Asset Management is responsible for the electric system engineering and planning, asset strategy, and risk management across transmission, distribution, and substation asset families. This centralized, risk-informed approach allows PG&E to manage electric risk, asset and system health, interconnections, and performance by using consistent standards, work methods, prioritization, and program sponsorship, while leveraging lessons learned from inspections and asset data to inform asset management decisions. The organization is accountable for asset planning and strategy, standards and work methods, and asset data management for Electric.
The Asset Knowledge Management (AKM) team, within Electric Asset Management, is responsible for the management, quality, and access of PG&E’s electric asset data. The team’s objective is to maximize the use and ensure the trustworthiness of PG&E’s critical electric data assets.
Position Summary
The Data & Analytics Product Development team is looking for an Expert-level data analyst who develops, maintains, and enhances products and analytics that drive engineering and asset management improvements and insights. The individual in this role must have demonstrated success supporting leadership and building cross-functional products; the ability to respond to analytical requests and communicate concepts to managers, stakeholders, and sponsors across the organization; strong attention to detail; and the ability to identify gaps in data requirements.
In this role, you will work closely with fellow product developers, product managers, and partners throughout the Electric organization to understand their needs to develop valuable products and analytical reports.
Works on process improvement and product development/enhancement. Works on all technical development phases: data engineering, analytics, and visualization/user interface. Interacts with technical and non-technical clients to resolve analysis and technical issues. Works with product managers, team members, clients, and senior leadership throughout the development cycle, practicing continuous improvement.
PG&E is providing the salary range that the company in good faith believes it might pay for this position at the time of the job posting. This compensation range is specific to the locality of the job. The actual salary paid to an individual will be based on multiple factors, including, but not limited to, specific skills, education, licenses or certifications, experience, market value, geographic location, and internal equity.
A reasonable salary range is:
Bay Area Minimum: $122,000
Bay Area Maximum: $194,000
Job Responsibilities
- Acts as lead to provide direction to less experienced employees.
- Analyzes data using various advanced statistical tools to develop complex ad hoc reports, statistics, trends, and profiles.
- Demonstrates and uses broad expertise in data processing and data analysis and applies this knowledge to several different issues.
- Designs data analysis to achieve business objectives; processes and analyzes data; writes clear and concise data findings; and tailors communications to specific audiences. Creates automated data routines and processes for reporting and data delivery, ensuring accurate data manipulation and data cleansing.
- Provides product demos to various audiences.
- Manages complex data analysis projects from initial request to presentation of results, requiring supervision only on the most complex projects, and ensures that each project is completed on time and within budget.
- Provides new and innovative solutions and recommendations to enhance database(s) with new data sources and improved data processes.
- Works with Principals and Managers across functions to influence decisions.
Application Development:
- Work closely with Subject Matter Experts (SMEs) to design and develop full-stack applications.
- Work on both frontend and backend components, ensuring seamless integration and functionality.
- Improve application user interface and visualization design.
- Implement operational applications and data visualizations.
- Develop interactive workflow UIs.
- Build and access data versions within the tool as necessary to enable analytics.
- Develop write-back functionality in the tool as needed.
- Maintain applications as usage grows and requirements change.
Data Integration and Management:
- Develop data pipelines using PySpark to create datasets, objects, and user interface applications, primarily within the data management and development platform.
- Work with our Data Engineers to understand the ontology (data model) and data pipelines supporting the applications.
- End-to-end data pipeline development: use Python and PySpark to perform ETL and derive new datasets necessary for business applications.
- Design and build high-scale, data-intensive workflows.
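As a sense of the kind of ETL derivation described above, here is a minimal stdlib sketch of the extract/transform/load shape; in this role the work would use PySpark DataFrames, and the field names (`asset_id`, `inspect_score`) and the review threshold are illustrative assumptions, not PG&E's actual data model.

```python
# Stdlib sketch of an ETL derivation step. A real pipeline in this role would
# use PySpark DataFrames, but the extract -> transform -> load shape is the same.
# Field names and the threshold below are illustrative assumptions.

def extract(raw_rows):
    """Extract: keep only rows that carry the fields the derivation needs."""
    return [r for r in raw_rows if "asset_id" in r and "inspect_score" in r]

def transform(rows):
    """Transform: derive a simple review flag from the inspection score."""
    return [
        {
            "asset_id": r["asset_id"],
            "inspect_score": r["inspect_score"],
            "needs_review": r["inspect_score"] < 60,  # hypothetical threshold
        }
        for r in rows
    ]

def load(rows):
    """Load: return the derived dataset; a real pipeline would write it
    back to the data management platform."""
    return rows

def run_pipeline(raw_rows):
    return load(transform(extract(raw_rows)))
```

In PySpark the same step would typically be a `filter` plus a `withColumn` on a DataFrame; the pure-Python version keeps the sketch self-contained.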
Collaboration and Communication:
- Collaborate with data scientists, analysts, and other stakeholders to understand requirements and deliver solutions.
- Communicate technical concepts and project status to non-technical stakeholders.
Performance Optimization:
- Optimize application performance, including both frontend responsiveness and backend processing speed.
- Ensure scalability and reliability of applications.
User Experience (UX) and Interface Design:
- Design and develop intuitive user interfaces using modern frontend frameworks and libraries.
- Focus on enhancing user experience and usability of applications.
Security and Compliance:
- Implement security best practices to protect data and applications.
- Ensure compliance with relevant data protection regulations and company policies.
Testing and Quality Assurance:
- Perform data validations and analysis; use PyTest to create and implement unit tests, both one-time tests and automated tests embedded within a pipeline.
- Implement expectations within health checks to automate validations such as build freshness, data freshness, primary key, and schema checks, as well as more complex data validations that should abort a build or raise a warning notification. Monitor and debug critical issues such as data staleness or data quality.
- Develop and execute unit tests, integration tests, and end-to-end tests to ensure software quality.
- Debug and resolve issues reported by users or identified through testing.
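The health-check expectations listed above (schema, primary key, freshness) can be sketched as plain Python predicates that PyTest-style tests then assert against; the column names and the one-day freshness window here are assumptions for illustration, not the platform's actual check API.

```python
# Sketch of automated data-quality checks like those described above.
# Column names and the freshness window are illustrative assumptions.
from datetime import datetime, timedelta

EXPECTED_SCHEMA = {"asset_id", "inspect_score", "updated_at"}  # assumed columns

def check_schema(rows):
    """Schema check: every row carries exactly the expected columns."""
    return all(set(r) == EXPECTED_SCHEMA for r in rows)

def check_primary_key(rows):
    """Primary-key check: asset_id must be unique across the dataset."""
    ids = [r["asset_id"] for r in rows]
    return len(ids) == len(set(ids))

def check_freshness(rows, now, max_age=timedelta(days=1)):
    """Freshness check: the newest row must be within max_age of now."""
    newest = max(r["updated_at"] for r in rows)
    return now - newest <= max_age
```

In a pipeline, a failed `check_primary_key` or `check_schema` would typically abort the build, while a stale-but-present dataset might only raise a warning; PyTest tests for these checks are ordinary `assert` statements.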
Documentation and Support:
- Create and maintain technical documentation for developed applications and data pipelines.
- Provide ongoing support and troubleshooting for deployed applications as needed.
Continuous Improvement:
- Stay updated with the latest developments in technologies.
- Continuously improve code quality, architecture, and development processes.
Project Management:
- Participate in project planning including task estimation and timeline management.
- Track progress and ensure timely delivery of project milestones.
Qualifications
Minimum:
- BA/BS Degree in Marketing, Business, Computer Science, Engineering or other related field or equivalent work experience
- Job-related experience, 6 years
Desired:
- Master’s Degree or equivalent experience
- Job-related experience, 8 years
- Foundry workflow development experience
- Experience with asset conditions and work management processes
- Influence skills
- Proficiency with the steps in the data analytics lifecycle: data gathering and preparation, feature engineering, development, and testing.
- Advanced skill in Microsoft Excel and PowerPoint
- Experience collaborating, working on a team, teaching and/or mentoring junior colleagues
- Proficiency in synthesizing complex information into clear insights and translating those insights into decisions and actions
- Ability to clearly communicate complex technical details and insights to colleagues, stakeholders, and leadership
- Proficiency with programming best practices, including documentation, version control (e.g., Git or equivalent), unit testing, etc.
- Proficiency in relevant programming languages and techniques, such as Python, object-oriented programming, SQL, etc.
- Experience with data visualization tools like Power BI or Tableau
- Strong organizational, prioritization and multi-tasking skills
- Strong problem-solving skills
Nearest Major Market: San Francisco
Nearest Secondary Market: Oakland