
Lead Data Engineer Snowflake

San Francisco, CA · Information Technology
Our client, a major bank, is looking for a talented Lead Data Engineer (Snowflake) for its San Francisco, CA location.

This is a permanent position with a great benefits package and competitive compensation, DOE.
Relocation assistance is available.
The role is currently 100% remote but will eventually require a couple of days on-site under a hybrid office/home schedule.

H-1B visa sponsorship is not available for this position.
** Must be authorized to work for ANY employer in the US.

Role: 

- Design and develop ETL/ELT pipelines using Python with Snowflake and SQL Server, and develop and automate a project through its entire lifecycle (see the illustrative sketch after this list).
- Apply knowledge of data warehouse/data mart design and implementation, carrying a project through its entire lifecycle.
- Build distributed, reusable, and efficient backend applications using Python.
- Implement security and data protection measures.
- Provide guidance on new technologies and continuous improvement of best practices.
- Research, implement, and develop software development tools.
- Understand repeatable, automated processes for building, testing, documenting, and deploying the application at scale.
- Establish quality processes to deliver a stable and reliable solution.
- Write complex SQL and stored procedures efficiently in Snowflake and SQL Server.
- Prepare documentation (data mapping, technical specifications, production support, data dictionaries, test cases, etc.) for all projects.
- Coach junior team members and help the team continuously improve by contributing to tooling, documentation, and development practices.
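For context, here is a minimal sketch of the kind of Python-driven ELT step this role involves, using the snowflake-connector-python package. The connection details, stage, and table names below are hypothetical placeholders for illustration only, not the client's actual environment.

    # Minimal ELT sketch (illustrative only): load staged files into Snowflake,
    # then merge them into a curated target table.
    import os
    import snowflake.connector

    def run_elt_step():
        # Credentials come from the environment; warehouse/database/schema are hypothetical.
        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="ETL_WH",
            database="ANALYTICS",
            schema="STAGING",
        )
        try:
            cur = conn.cursor()
            # 1. Load: copy raw files from an external stage into a staging table.
            cur.execute("""
                COPY INTO STAGING.TRADES_RAW
                FROM @RAW_STAGE/trades/
                FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
            """)
            # 2. Transform: merge staged rows into the curated target table.
            cur.execute("""
                MERGE INTO ANALYTICS.CURATED.TRADES t
                USING STAGING.TRADES_RAW s
                  ON t.TRADE_ID = s.TRADE_ID
                WHEN MATCHED THEN UPDATE SET t.PRICE = s.PRICE, t.QTY = s.QTY
                WHEN NOT MATCHED THEN INSERT (TRADE_ID, PRICE, QTY)
                     VALUES (s.TRADE_ID, s.PRICE, s.QTY)
            """)
            conn.commit()
        finally:
            conn.close()

    if __name__ == "__main__":
        run_elt_step()

In practice, a step like this would be wrapped in the team's scheduling and automation tooling rather than run by hand.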


Qualifications


Experience & Education:
- 10+ years of experience leading data engineering solutions and efforts.
- 3+ years of experience with Snowflake and SQL Server/Oracle, plus knowledge of NoSQL databases such as MongoDB.
- 2+ years of experience with Python.
- 2+ years of experience in an ETL developer role, with deep knowledge of data processing tools such as Informatica and SSIS.
- Experience with and knowledge of cloud technologies.
- Strong experience building data warehouse solutions and data modeling.
- Strong ETL performance-tuning skills and the ability to analyze and optimize production volumes and batch schedules.
- Experience with ETL, Unix/Linux, and Git or other version control systems.
- Expertise in operational data stores and real-time data integration.
- Experience with development methodologies, database platforms, and data modeling tools (Erwin/Model Manager).
- Expert-level skill in modeling, managing, scaling, and performance-tuning high-volume transactional databases.
- Bachelor's degree in Computer Science or equivalent experience.
 
Plus Skills / Experience:
- Capital markets knowledge and experience are highly desired.
- Understanding of traditional and alternative asset class investment data, including but not limited to equity, fixed income, private equity, real estate, derivatives, mutual funds, ETFs, global assets, foreign exchange, etc.
- Focus on developing and improving frameworks to support repeatable and scalable solutions.



Please email your resume or
Keywords: data, snowflake, python, etl, modeling, unix, linux, sql, t-sql, mdm, cloud