Expert Technical Lead - Snowflake and Informatica
Oakland, CA, US, 94612
Requisition ID # 165669
Job Category: Information Technology
Job Level: Individual Contributor
Business Unit: Information Technology
Work Type: Hybrid
Job Location: Oakland
Department Overview
The Data Solutions Architecture Team at Pacific Gas & Electric Company is responsible for driving long-term enterprise-wide data solutions, target state architecture, and overall excellence with the application of data, analytics, and information to critical business challenges and opportunities. This team is chartered to develop the strategy, roadmap, and accompanying standards that will enable better use of data and information and to develop analytics maturity at PG&E.
Position Summary
The Digital Utility runs on data and information. PG&E believes that one of the most critical drivers of our future success is our ability to extract insights and information from the data we manage. The Expert Technical Lead fills a critical role in PG&E's success in building the Digital Utility.
We are seeking an experienced and highly skilled Expert Technical Lead to drive the design, implementation, and management of our Snowflake and Informatica Cloud environments. This role requires a combination of software engineering, operations, and solution architecture expertise. The ideal candidate will play a key role in defining and automating our cloud infrastructure, optimizing data pipelines, and ensuring long-term maintainability and cost efficiency.
We strive for a team that will make a difference in the new PG&E. As a Technical Lead, you will have a direct impact on day-to-day data solutions and delivery, and on the safety of California. You will collaborate with other technical leaders to define and implement enterprise standards for how data works and operates. You will have a supportive manager and a collaborative team, with the opportunity to be groundbreaking and forward thinking, playing a critical role in revolutionizing solutions that are industry changing as we are one of the most advanced Digital Utilities.
PG&E is providing the salary range that the company in good faith believes it might pay for this position at the time of the job posting. This compensation range is specific to the locality of the job. The actual salary paid to an individual will be based on multiple factors, including, but not limited to, specific skills, education, licenses or certifications, experience, market value, geographic location, and internal equity. We would not anticipate that the individual hired into this role would land at or near the top half of the range described below, but the decision will be dependent on the facts and circumstances of each case.
A reasonable salary range is:
Bay Area Minimum: $136,000.00
Bay Area Maximum: $232,000.00
Job Responsibilities
- Snowflake and Informatica Cloud Development: Lead the design, development, and optimization of Snowflake and Informatica Cloud solutions, ensuring best practices for data storage, processing, and security.
- Informatica Cloud Administration: Oversee the administration and management of Informatica Cloud, ensuring system stability, security, and optimal performance.
- Data Integration & Pipeline Orchestration: Design, develop, and manage data integration jobs and workflows in Informatica Cloud, ensuring seamless data movement between systems.
- Infrastructure Automation: Use Terraform to create and manage AWS assets, ensuring a fully automated, repeatable, and scalable infrastructure.
- Cloud Security Best Practices: Implement and manage cloud security measures to ensure data integrity, compliance, and secure access management across environments.
- CI/CD & Data Pipeline Automation: Develop and implement CI/CD pipelines for data integration and processing workflows, improving efficiency and reliability.
- Solution Architecture & Design: Collaborate with stakeholders to gather requirements and design solutions within the existing technical landscape, with a focus on long-term operations, maintainability, and cost management.
- Performance Optimization & Monitoring: Continuously optimize system performance, monitor for issues, and proactively implement improvements.
- Collaboration & Leadership: Work closely with engineering, data, and operations teams to ensure alignment with business objectives and technical standards.
Qualifications
Minimum:
- Bachelor's Degree in Computer Science or job-related discipline or equivalent experience
- 5 years of job-related experience
Desired:
- Proven experience in designing, implementing, and managing Snowflake and Informatica Cloud environments
- Strong expertise in AWS infrastructure and Terraform for infrastructure-as-code automation
- Experience in administration and management of Informatica Cloud, including configuring and monitoring services
- Hands-on experience in developing and orchestrating Informatica Cloud data integration jobs and pipelines
- Deep understanding of cloud security best practices and access management strategies
- Hands-on experience with CI/CD pipelines and automation of data pipelines
- Strong knowledge of solution architecture, including gathering business requirements and designing scalable, maintainable, and cost-effective solutions
- Experience optimizing cloud costs and ensuring operational efficiency
- Excellent problem-solving and troubleshooting skills
- Strong communication and collaboration skills
- Experience with enterprise business architecture principles and industry-standard architecture frameworks
- Ability to achieve a deep understanding of line-of-business strategies, priorities, needs, and current capabilities
- Ability to work collaboratively to engage and influence business and IT stakeholders, senior leadership, and external partners
- Familiarity with two or more of: Scaled Agile, Scrum development methodology, DevOps/DevSecOps, LEAN, Six Sigma, or ITIL practices
- Experience with any of the following: Data Architecture, Airflow, Jenkins, Palantir Foundry, Data Quality tools, Collibra, MDM, Spark, Teradata, SAP Business Warehouse, SAP BusinessObjects, Tableau, SAS Enterprise Miner, Power BI, and other database and BI technologies; open-source Hadoop and related technologies; data access languages such as SQL, SAS, R, Python, and Scala
- Excellent written and oral communication skills across all levels; ability to communicate complex technical concepts to leaders, business sponsors, and stakeholders in clear, concise language that inspires confidence and earns trust
- Experience with UX research, design thinking (DT), or bringing products to market
- Experience working in the utility industry and a working knowledge of utility concepts and challenges a plus