Location: REMOTE / Toronto, Ontario
This job allows you to work remotely.
Our client stands at the forefront of narrative intelligence, harnessing advanced machine learning to safeguard enterprises and government agencies from social media manipulation and narrative attacks. Its platform rapidly sifts through millions of unstructured, cross-channel media records, transforming them into actionable insights. By identifying adversarial online messaging, understanding its audience, and providing crucial context such as source credibility, it empowers customers to respond effectively to digital threats.
The team takes on complex media challenges, from crisis management to countering state-backed disinformation, helping clients stay ahead of social media manipulation and emerging threats by spotting and stopping harmful online trends.
They're seeking a seasoned, forward-thinking Senior Data Engineer & DevOps Lead to drive the design, implementation, and growth of their data infrastructure and DevOps strategies. In this role, you'll work closely with the Data, Product, and Engineering teams to develop a unified vision for scalable, secure, and efficient systems. You'll also play a key role in building and mentoring a high-performing team of data engineers and DevOps professionals. Your expertise will be essential in aligning data systems with business needs and fostering innovation across the organization.
What you will accomplish:
- Develop and execute the long-term vision for the organization’s data engineering and DevOps strategies.
- Collaborate with senior leadership to prioritize initiatives, set objectives, and define measurable outcomes.
- Stay ahead of industry trends, tools, and technologies to ensure competitive advantage and efficiency.
- Build, mentor, and lead a diverse team of data engineers and DevOps professionals.
- Foster a culture of innovation, accountability, and collaboration within the team.
- Establish best practices for performance management, career development, and skills growth.
- Oversee the design, development, and maintenance of scalable data pipelines, warehouses, and processing frameworks.
- Build the infrastructure and codebase required to extract, transform, and load data from a wide variety of sources using SQL and AWS technologies (a brief illustrative sketch follows this list).
- Drive the implementation of best practices in data governance, quality, and security.
- Ensure the availability, reliability, and performance of data systems.
- Lead the adoption of modern DevOps methodologies to streamline CI/CD pipelines and deployment processes.
- Ensure robust monitoring, logging, and alerting systems are in place for all applications and data infrastructure.
- Partner with cross-functional teams, including product, analytics, and engineering, to understand and deliver on business needs.
- Present project updates, performance metrics, and strategic initiatives to leadership.
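To make the ETL responsibility above concrete, here is a minimal sketch of the kind of SQL-on-AWS extraction step it describes, using boto3's Athena client. The database name, query, table, and S3 results bucket are placeholders invented for the example, not details from this role.

```python
# A minimal, hypothetical sketch of running a SQL extraction step on AWS Athena.
# The database, table, and S3 output location below are placeholders.
import time

import boto3

athena = boto3.client("athena", region_name="us-east-1")


def run_query(sql: str) -> str:
    """Start an Athena query, block until it finishes, and return the execution id."""
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": "media_lake"},  # placeholder database
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},  # placeholder bucket
    )["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(1)  # poll until the query reaches a terminal state
    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")
    return qid


run_query("SELECT source, COUNT(*) AS posts FROM mentions GROUP BY source")
```

This is a sketch under those naming assumptions, not the client's actual pipeline; production code would add retries, pagination over results, and IAM-scoped credentials.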
Special Perks:
- Remote-first culture (working hours aligned with Eastern Time)
- Health insurance, 401(k), and other benefits
- Flexible work schedule and unlimited time off
- Opportunities for professional growth and development
- A vibrant and inclusive company culture
Must Have Skills:
- 13+ years of engineering experience, including 5+ years in data engineering, DataOps, or related roles.
- Proven experience in designing and implementing data architectures, ETL processes, and DevOps pipelines.
- Expertise in cloud platforms (AWS, Azure, or GCP) and cloud-native solutions.
- Strong understanding of data governance, security, and compliance standards (e.g., GDPR, HIPAA).
- Hands-on experience with Elasticsearch or OpenSearch.
- Proficiency in programming and scripting languages (e.g., Python, Java, Bash).
- Experience with modern DevOps tools such as Kubernetes, Docker, Terraform, Jenkins, or similar.
- Experience with big data technologies (e.g., Kafka).
- Familiarity with ML/AI infrastructure and frameworks.
- Hands-on experience with both relational (SQL) and non-relational (NoSQL) databases.
- Experience maintaining data management systems and building new data pipelines from scratch.
- Comfort automating data flows with resilient Python code.
- Experience with Dagster or another data orchestration platform such as Airflow (see the pipeline sketch after this list).
- Strong knowledge of database architecture design and management for both structured and unstructured data.
- Extensive experience with SQL, SQLAlchemy with Postgres, Athena, Parquet, and Iceberg.
- Ability to define and communicate data architecture requirements while keeping current with data management best practices.
- Track record of successfully managing and scaling high-performing technical teams.
- Exceptional leadership, communication, and decision-making abilities.
- Strong analytical mindset with a solution-oriented approach.
- Ability to balance strategic vision with tactical execution.
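As a hypothetical illustration of the orchestration and storage stack named above (Dagster, Python, Parquet), here is a minimal asset sketch. The asset names, sample data, and output path are invented for the example and are not taken from the posting.

```python
# A minimal, hypothetical Dagster pipeline sketch. Asset names, sample data,
# and the output path are placeholders, not details from this role.
import pandas as pd
from dagster import Definitions, asset


@asset
def mentions_raw() -> pd.DataFrame:
    # Placeholder extract step; in practice this would pull from an
    # upstream API, queue, or cross-channel media source.
    return pd.DataFrame({"post_id": [1, 2], "text": ["...", "..."]})


@asset
def mentions_parquet(mentions_raw: pd.DataFrame) -> None:
    # Persist to Parquet (requires pyarrow); in production this might land in
    # an S3 prefix queried via Athena, or in an Iceberg table, instead.
    mentions_raw.to_parquet("mentions.parquet", index=False)


defs = Definitions(assets=[mentions_raw, mentions_parquet])
```

Under these assumptions, `dagster dev -f <file>` would let you materialize the two assets locally and inspect the dependency graph.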
Nice to Have Skills:
- Certifications in cloud platforms or DevOps practices.