
Data Systems Analyst

  • Information Technology
  • Ottawa
  • Permanent

To apply, please forward your resume and cover letter by Feb 15, 2026.

The Mint is hiring a Data Systems Analyst who can thrive in a dynamic and inclusive environment.

Reporting to the Manager, Enterprise Data Architecture, the Data Systems Analyst combines data engineering, analytics, and data science to enable advanced analytics and reporting, predictive analysis, and planning and forecasting capabilities across the Mint. The position serves as the technical backbone for enterprise data and AI initiatives, ensuring reliable data integration, transformation, and modeling across platforms such as D365 F&O, Microsoft Fabric, Azure Synapse, and Power BI. The role focuses on building and optimizing data pipelines, designing semantic models, and delivering high-value analytics and AI solutions that move the organization from diagnostic reporting toward predictive and prescriptive insights.

In addition to technical delivery, the Analyst collaborates closely with Finance, Operations, Commercial, HR, Legal, and other business and IT stakeholders to align data solutions with business objectives, while contributing to governance, best practices, and continuous improvement. The position requires strong problem-solving skills, hands-on experience with modern data platforms, and the ability to communicate complex technical concepts to non-technical audiences in a clear and actionable manner.

Your Responsibilities:

  • Build and maintain reliable, scalable data pipelines and integrations across enterprise systems (e.g., ERP, PLM, MES) and analytics platforms.
  • Ingest, cleanse, transform, and model data to produce trusted datasets for reporting, advanced analytics, and AI use cases (see the illustrative sketch after this list).
  • Design and optimize semantic models to enable self-serve analytics and consistent performance and business logic.
  • Develop and maintain dashboards, reports, and analytics products that support timely, data-driven decision-making.
  • Apply statistical analysis and machine learning techniques (e.g., forecasting, A/B testing, anomaly detection) to generate predictive insights.
  • Translate business problems into clear analytical requirements, partnering with stakeholders to define success metrics, deliverables, and adoption plans.
  • Diagnose and resolve data quality, pipeline, and production issues; perform root-cause analysis and implement preventative controls.
  • Document data flows, business definitions, and metadata in the data catalog to improve transparency, reuse, and governance.
  • Contribute to data governance, security, and standards, ensuring compliant access controls and best practices across the data lifecycle.
  • Drive continuous improvement through automation, monitoring, performance tuning, and modern engineering practices.
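
For illustration only, the following minimal PySpark sketch shows the kind of cleanse-and-transform step described in the list above; the table names, columns, and bronze/silver layer names are hypothetical assumptions, not the Mint's actual schema.

    # Illustrative sketch only: all tables, columns, and layer names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("sales_orders_curation").getOrCreate()

    # Ingest a raw ERP extract landed in a hypothetical bronze table.
    raw = spark.read.table("bronze.sales_orders")

    curated = (
        raw
        # Cleanse: drop rows missing the business key and deduplicate on it.
        .filter(F.col("order_id").isNotNull())
        .dropDuplicates(["order_id"])
        # Transform: standardize types and derive reporting attributes.
        .withColumn("order_date", F.to_date("order_date"))
        .withColumn("net_amount",
                    F.col("gross_amount") - F.coalesce(F.col("discount"), F.lit(0)))
        .withColumn("order_month", F.date_trunc("month", F.col("order_date")))
    )

    # Publish a trusted dataset for reporting and downstream modeling.
    curated.write.mode("overwrite").saveAsTable("silver.sales_orders_curated")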


This position offers a hybrid work arrangement, requiring on-site presence at our Ottawa location three to five days per week, depending on meeting requirements and other activities.

This selection process may also qualify you for other positions with similar requirements and qualifications.

Qualifications:

Language Requirements:

Proficiency in both official languages (English / French) is required at the time of hire. If no qualified candidate fully meets the required levels, other candidate language profiles may be considered in accordance with the Mint’s Staffing Policy. The linguistic profile for this position is:

  • Written Comprehension: B Intermediate
  • Written Expression: B Intermediate
  • Oral Proficiency: B Intermediate


Education & Designation:

  • University degree in computer science, systems analysis, or a related field of study; an equivalent combination of education and extensive experience may be considered


Experience:

  • 5+ years in data engineering/BI with D365 F&O exposure and at least one production Synapse Link implementation
  • Experience with Azure DevOps for version control, CI/CD and environment promotion
  • Experience with Direct Lake semantic models and Fabric Warehouse vs. Lakehouse design choices is an asset


Additional Qualifications:

Skills and Abilities:

  • Strong communication: Ability to explain technical concepts to business stakeholders
  • Data engineering expertise: Building robust pipelines and models across Fabric and Synapse
  • Analytical thinking: Ability to apply statistical and ML methods for forecasting and optimization
  • Problem-solving: Addressing performance issues and ensuring data quality
  • Collaboration: Working effectively with Finance, Operations, HR, and other business and IT teams
  • Adaptability: Navigating evolving technologies and business requirements
  • Data governance: Working with governance tools (e.g., Purview) and standardizing KPIs/metrics
  • Forecasting: Applying classical time-series or ML methods to finance, commerce, and operations datasets (illustrated in the sketch below)
  • Python/PySpark: Writing transformations and tuning the performance of large fact tables and slowly changing dimensions
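
As a purely illustrative sketch of the classical time-series forecasting mentioned above (the file name, column names, and seasonality settings are assumptions, not actual Mint data or methods):

    # Illustrative sketch only: hypothetical monthly demand data and columns.
    import pandas as pd
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    # Load a hypothetical monthly demand history indexed by month start.
    history = pd.read_csv("monthly_demand.csv", parse_dates=["month"], index_col="month")
    series = history["units_sold"].asfreq("MS")

    # Additive trend and yearly seasonality are assumptions; in practice the
    # model choice would be validated against holdout data and business input.
    model = ExponentialSmoothing(series, trend="add", seasonal="add", seasonal_periods=12)
    fit = model.fit()

    # Forecast the next six months as an input to planning and budgeting.
    print(fit.forecast(6))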


Knowledge:

  • D365 F&O and Synapse Link configuration
  • Strong SQL (T-SQL/Spark SQL) and data modeling skills (a brief sketch follows this list)
  • Hands-on experience with Power BI (Power Query, DAX, model security)
  • Knowledge of Microsoft Fabric (Dataflows Gen2, Lakehouse, Pipelines)
  • Understanding of BPA/BPP or equivalent EPM tools for budgeting and forecasting
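
Purely as a sketch of the Spark SQL and dimensional-modeling skills listed above (the fact and dimension tables, columns, and silver/gold layer names are hypothetical):

    # Illustrative sketch only: joining a hypothetical fact source to a
    # type-2 (slowly changing) customer dimension with Spark SQL.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("fact_sales_model").getOrCreate()

    fact_sales = spark.sql("""
        SELECT
            o.order_id,
            o.order_date,
            o.net_amount,
            c.customer_key,        -- surrogate key from the dimension
            c.customer_segment
        FROM silver.sales_orders_curated AS o
        JOIN gold.dim_customer AS c
          ON  o.customer_id = c.customer_id
          AND o.order_date >= c.valid_from   -- pick the dimension row that was
          AND o.order_date <  c.valid_to     -- valid on the order date
    """)

    fact_sales.write.mode("overwrite").saveAsTable("gold.fact_sales")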


Apply Now