
Jobs With No Boss

The easiest way to find and promote career opportunities in boss-less organizations from all over the world.

Lead Snowflake Engineer

Intellibus

Posted on Apr 15, 2024

Imagine working at Intellibus to engineer platforms that impact billions of lives around the world. With your passion and focus, we will accomplish great things together!

Our Platform Engineering Team is working to solve the Multiplicity Problem. We are trusted by some of the most reputable and established FinTech Firms. Recently, our team has spearheaded the Conversion & Go Live of apps that support the backbone of the Financial Trading Industry.

Are you a data enthusiast with a natural ability for analytics? We’re looking for skilled Data/Analytics Engineers to fill multiple roles for an exciting new client. This is your chance to shine, demonstrating your dedication and commitment in a role that promises both challenge and reward.

What We Offer:

  • A dynamic environment where your skills will make a direct impact.
  • The opportunity to work with cutting-edge technologies and innovative projects.
  • A collaborative team that values your passion and focus.

We are looking for Engineers who can:

  • Design, develop, and maintain data pipelines to ingest, transform, and load data from various sources into Snowflake.
  • Implement ETL (Extract, Transform, Load) processes using Snowflake's features such as Snowpipe, Streams, and Tasks (a minimal sketch of this pattern follows this list).
  • Design and implement efficient data models and schemas within Snowflake to support reporting, analytics, and business intelligence needs.
  • Optimize data warehouse performance and scalability using Snowflake features like clustering, partitioning, and materialized views.
  • Integrate Snowflake with external systems and data sources, including on-premises databases, cloud storage, and third-party APIs.
  • Implement data synchronization processes to ensure consistency and accuracy of data across different systems.
  • Monitor and optimize query performance and resource utilization within Snowflake using query profiling, query optimization techniques, and workload management features.
  • Identify and resolve performance bottlenecks and optimize data warehouse configurations for maximum efficiency.
  • Work on Snowflake modeling – roles, databases, schemas – and on ETL tools, with cloud-driven skills
  • Work on SQL performance measuring, query tuning, and database tuning
  • Work with the SQL language and cloud-based technologies
  • Set up the RBAC model at the infrastructure and data levels.
  • Work on Data Masking / Encryption / Tokenization, Data Wrangling / Data Pipeline orchestration (tasks).
  • Set up AWS S3/EC2, configure external stages, and wire up SQS/SNS notifications
  • Perform data integration, e.g. MSK Kafka Connect and other partners such as Delta Lake (Databricks)
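
To give a purely illustrative flavor of the Snowpipe / Streams / Tasks pattern referenced in the list above, the sketch below uses the snowflake-connector-python package to create an S3-backed external stage, an auto-ingest pipe, a change stream, and a scheduled transformation task. Every object name, the bucket path, the storage integration, and the credentials are hypothetical placeholders, not details of this role.

    # Minimal sketch, assuming a RAW_EVENTS table with a single VARIANT column
    # already exists. All names, paths, and credentials are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",        # hypothetical account identifier
        user="etl_user",             # hypothetical service user
        password="***",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    cur = conn.cursor()

    # External stage over an S3 prefix (assumes a storage integration is configured).
    cur.execute("""
        CREATE STAGE IF NOT EXISTS raw_events_stage
          URL = 's3://example-bucket/events/'
          STORAGE_INTEGRATION = s3_int
          FILE_FORMAT = (TYPE = JSON)
    """)

    # Snowpipe with auto-ingest: S3 event notifications (via SQS) trigger the COPY.
    cur.execute("""
        CREATE PIPE IF NOT EXISTS raw_events_pipe AUTO_INGEST = TRUE AS
          COPY INTO raw_events FROM @raw_events_stage
    """)

    # Stream captures newly landed rows (change data capture on the raw table).
    cur.execute("CREATE STREAM IF NOT EXISTS raw_events_stream ON TABLE raw_events")

    # Task periodically flattens the JSON payload into a curated reporting table.
    cur.execute("""
        CREATE TASK IF NOT EXISTS transform_events_task
          WAREHOUSE = ETL_WH
          SCHEDULE = '5 MINUTE'
          WHEN SYSTEM$STREAM_HAS_DATA('raw_events_stream')
        AS
          INSERT INTO analytics.curated.events (event_id, event_ts, user_id)
          SELECT payload:id, payload:ts::timestamp, payload:user_id
          FROM raw_events_stream
    """)
    cur.execute("ALTER TASK transform_events_task RESUME")

In practice the DDL would more likely live in versioned SQL migrations or an orchestration tool; the Python wrapper here just keeps the example in one self-contained, runnable file.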

Key Skills & Qualifications:

  • ETL – Experience with ETL processes for data integration.
  • SQL – Strong SQL skills for querying and data manipulation
  • Python – Strong command of Python, especially in AWS Boto3, JSON handling, and dictionary operations (see the sketch after this list)
  • Unix – Competent in Unix for file operations, searches, and regular expressions
  • AWS – Proficient with AWS services including EC2, Glue, S3, Step Functions, and Lambda for scalable cloud solutions
  • Database Modeling – Solid grasp of database design principles, including logical and physical data models, and change data capture (CDC) mechanisms.
  • Snowflake – Experienced in Snowflake for efficient data integration, utilizing features like Snowpipe, Streams, Tasks, and Stored Procedures.
  • Airflow – Fundamental knowledge of Airflow for orchestrating complex data workflows and setting up automated pipelines
  • Bachelor's degree in Computer Science, or a related field is preferred. Relevant work experience may be considered in lieu of a degree.
  • Excellent communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams and stakeholders.
  • Proven leadership abilities, with experience mentoring junior developers and driving technical excellence within the team.
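
As a small illustration of the Python skills called out above (Boto3, JSON handling, dictionary operations), here is a minimal, hypothetical sketch that pulls a JSON document from S3, aggregates it with plain dictionary operations, and writes the summary back. The bucket, object keys, and field names are invented for the example.

    # Minimal sketch, assuming an S3 object containing a JSON list of orders such as
    # [{"id": 1, "customer": "acme", "amount": 12.5}, ...]. Names are placeholders.
    import json
    import boto3

    s3 = boto3.client("s3")

    def summarize_orders(bucket: str, src_key: str, dst_key: str) -> dict:
        # Download and parse the raw JSON document.
        raw = s3.get_object(Bucket=bucket, Key=src_key)["Body"].read()
        orders = json.loads(raw)

        # Aggregate order amounts per customer with ordinary dict operations.
        totals: dict[str, float] = {}
        for order in orders:
            customer = order.get("customer", "unknown")
            totals[customer] = totals.get(customer, 0.0) + float(order.get("amount", 0))

        # Write the summary back to S3 as JSON.
        s3.put_object(
            Bucket=bucket,
            Key=dst_key,
            Body=json.dumps(totals, indent=2).encode("utf-8"),
        )
        return totals

    if __name__ == "__main__":
        print(summarize_orders("example-bucket", "raw/orders.json", "curated/order_totals.json"))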

We work closely with:

  • Data Wrangling
  • ETL
  • Talend
  • Jasper
  • Java
  • Python
  • Unix
  • AWS
  • Data Warehousing
  • Data Modeling
  • Database Migration
  • RBAC model
  • Data Migration

Experience Needed:

  • At least 7 years of Data Wrangling Experience
  • At least 7 years of Snowflake Experience
  • At least 7 years of ETL Experience

Our Process:

  • Schedule a 15 min Video Call with someone from our Team
  • 4 Proctored GQ Tests (< 2 hours)
  • 30-45 min Final Video Interview
  • Receive Job Offer

If you are interested, please apply and our team will contact you within the hour.