What’s in it for you as an employee of QFG?
- Health & wellbeing resources and programs
- Paid vacation, personal, and sick days for work-life balance
- Competitive compensation and benefits packages
- Work-life balance in a hybrid environment with at least 3 days in office
- Career growth and development opportunities
- Opportunities to contribute to community causes
- Work with diverse team members in an inclusive and collaborative environment
This job posting is for an existing vacancy.
We’re looking for our next Principal Data Engineer. Could It Be You?
The ideal Principal Data Engineer will be an experienced professional ready to work in an agile environment. This role requires in-depth knowledge and understanding of data ingestion, orchestration, compute, automation, and modeling, particularly within the high-velocity domain of brokerage technology and Digital Investing Engineering.
Need more details? Keep reading…
- Design, develop, and maintain robust, scalable, and high-performance brokerage data pipelines and ETL/ELT processes, ensuring data quality, integrity, and timely availability for consumption.
- Spearhead the creation of innovative data products and cross-domain data assets that align with QuestEnterprise's top-line OKRs, specifically supporting the high-velocity demands of Digital Investing Engineering.
- Act as a technical leader and subject matter expert on data architecture, modeling, and best practices, driving the modernization of data infrastructure to leverage cutting-edge cloud technologies (GCP/Databricks).
- Drive end-to-end automation of data workflows, monitoring, alerting, and deployment processes to enhance operational efficiency and reliability.
- Enable and operationalize AI tooling and machine learning pipelines in close collaboration with Data Science and ML Ops teams, translating complex models into production-ready data flows.
- Provide expert data consultation and enablement for self-service capabilities, empowering business analysts and stakeholders with tools (e.g., PowerBI, Looker) to access data and derive insights independently.
- Serve as the primary liaison between business stakeholders, software engineering, data science/ML Ops, and Enterprise data/AI enablement teams, translating business needs into technical data solutions.
- Support audits and operational due diligence by ensuring comprehensive data lineage, governance, security, and compliance across all data products and infrastructure.
- Mentor and coach junior and intermediate data engineers, fostering a culture of engineering excellence, continuous learning, and technical innovation within the team.
So are YOU our next Principal Data Engineer? You are if you…
- Have 8+ years of progressive experience in the data engineering field.
- Bring expert-level proficiency with GCP data engineering services, including BigQuery, Dataflow, Airflow (or Cloud Composer), Pub/Sub, Data Catalog, and Cloud SQL, or equivalent expertise with Databricks.
- Have demonstrated experience with relational data stores such as MSSQL or MySQL.
- Have strong knowledge of SQL and Python.
- Have experience in data modeling for both on-premises and cloud consumption, including expertise in technical architecture, infrastructure, and robust ETL/ELT pipeline development, with a focus on data ingestion, orchestration, and compute optimization.
Technical Leadership and AI/BI Specialization
- Spearhead the implementation of self-service Business Intelligence (BI) solutions, leveraging tools such as PowerBI and Looker, or advanced technologies like conversational AI agents (e.g., Google Cloud BigQuery and Databricks AI agents).
- Bring practical experience with generative AI developer tools (e.g., Claude, Cursor, GitHub Copilot) to boost coding efficiency, accelerate development, and enhance data pipeline quality.
- Act as a technical leader to enable the adoption of new technologies both within the immediate team and across the broader organization.
- Work in close collaboration with Solution Architects and Data Science teams to design and refine data ingestion pipelines and define comprehensive data modeling strategies for consumption.
- Collaborate with the team to decide on the most appropriate tools and methodologies for various data integration scenarios.
Project Management and Mentorship
- Have a verifiable track record of successfully leading multiple concurrent projects, including proactively troubleshooting technical challenges and resolving production issues with the team in a timely manner.
- Provide guidance and mentorship to new and current team members to facilitate their upskilling and professional growth.
- Thrive in ambiguity, effectively prioritize competing needs, and consistently deliver results in a dynamic, fast-paced environment.
Communication and Stakeholder Influence
- Have exceptional presentation and communication skills (e.g., PowerPoint, Google Slides).
- Communicate effectively and influence a diverse group of stakeholders, including external engineering teams, product development teams, business stakeholders, and external partners.
- Participate in and present novel technologies or concepts at enterprise-wide forums (e.g., QuestTalk).
Good-to-have skills:
- Databricks
- Experience working in a SAFe (Scaled Agile) development process
- Experience designing, documenting, and developing complex data pipelines and cross-domain data products
- Knowledge of the financial industry (investment and trading/brokerage technology)
- Google Cloud Professional Data Engineer certification
Compensation Information:
- Base salary range: $140,000–$160,000
- The final compensation package will be commensurate with the successful candidate's experience, skills, and geographic location (Canada). It includes a comprehensive benefits plan and a competitive incentive (bonus) program for Full-Time Permanent roles.
Sounds like you? Click below to apply!
#LI-Hybrid