We are looking for a Data Engineer and Developer in Montreal. This person will be responsible for participating in the design, development, deployment and optimization of data pipelines, data lakes, and data warehouses for varied data integration/ingestion use cases. You will contribute to complex and diverse initiatives by building automated solutions, including robotic process automation, automating orchestrated jobs, and enhancing other integrations for file transfers and back-end database updates through the front end (and vice versa).
Responsibilities:
- Participate in the design, development, deployment and optimization of data pipelines, data lakes, and data warehouses for varied data integration/ingestion use cases (for example: interfaces between systems, IoT data point ingestion);
- Collaborate with Analysts and Architects on the design of data warehouse and integration datasets.
- Contribute to brainstorming discussions and solution definition related to information architecture and design, data collection, data analysis methodology, data normalization, enterprise data dictionaries, cross system data integration.
- Perform problem solving, hypothesis testing, and complex data analysis.
- Perform data modelling for proper analysis and insight.
- Document and implement best practices.
- Write and maintain unit tests, integration tests and performance tests.
- Analyze, identify, diagnose, document and assist in resolving application and data issues.
- Deploy and merge solutions using Docker.
Technology stack (area: tool / languages):
- Orchestration/DataOps: GitLab / multiple (Python, Ruby, etc.)
- Orchestration/DataOps: Airflow / Python
- Orchestration/DataOps: Docker / multiple (Go, Python, etc.)
- Orchestration/DataOps: SonarQube / multiple (Java, JavaScript)
- Back-End: uWSGI / C, Lua, Perl, Python, etc.
- Back-End: Django / Python
- Front-End: NGINX / C, Lua, Perl, Python, etc.
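To give a flavour of the "automating orchestrated jobs" work above, here is a minimal sketch of an extract-transform-load task graph run in dependency order, in the spirit of an Airflow DAG but using only the Python standard library. The task names, the sample data, and the dependency graph are illustrative assumptions, not the employer's actual pipelines.

```python
# Minimal sketch of an orchestrated ETL job graph (illustrative only).
from graphlib import TopologicalSorter

def extract():
    # Stand-in for pulling data via REST API, SFTP, etc.
    return {"rows": [1, 2, 3]}

def transform(data):
    # Stand-in for cleaning/normalizing the extracted records.
    return {"rows": [r * 2 for r in data["rows"]]}

def load(data):
    # Stand-in for writing to a Data Lake / Data Warehouse; returns row count.
    return len(data["rows"])

# Dependencies: extract -> transform -> load
dag = {"transform": {"extract"}, "load": {"transform"}}

def run_pipeline():
    results = {}
    # static_order() yields tasks so that every dependency runs first.
    for task in TopologicalSorter(dag).static_order():
        if task == "extract":
            results[task] = extract()
        elif task == "transform":
            results[task] = transform(results["extract"])
        elif task == "load":
            results[task] = load(results["transform"])
    return results["load"]

loaded = run_pipeline()
```

In a real deployment each function would be an Airflow task and the `dag` mapping would be expressed as task dependencies; the scheduling, retries, and monitoring come from the orchestrator rather than hand-rolled code.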
Necessary Qualifications:
- Good hands-on data engineering experience:
- Experience coding with SQL, T-SQL, NoSQL, Python, and C#
- Building and integrating pipelines
- Familiarity with relational and non-relational databases
- Experience with ELT/ETL: ingestion from disparate systems (via REST API, SFTP, SQL databases, etc.) into CSL's Data Layer (Data Lake / Data Warehouse)
- Automated testing
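As a rough illustration of the ELT/ETL ingestion experience described above, the sketch below lands a raw CSV payload (as it might arrive over SFTP or from a REST API) into a SQLite staging table. The table name, column names, and sample data are illustrative assumptions, not CSL's actual schema.

```python
# Minimal ELT-style ingestion sketch (illustrative assumptions throughout).
import csv
import io
import sqlite3

# Stand-in for a file fetched over SFTP or an API response body.
RAW_CSV = "sensor_id,reading\nA1,10.5\nA2,7.25\n"

def ingest(raw_csv: str, conn: sqlite3.Connection) -> int:
    """Land raw CSV rows into a staging table; return the row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS staging_readings "
        "(sensor_id TEXT, reading REAL)"
    )
    rows = [
        (r["sensor_id"], float(r["reading"]))
        for r in csv.DictReader(io.StringIO(raw_csv))
    ]
    conn.executemany("INSERT INTO staging_readings VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
count = ingest(RAW_CSV, conn)
```

Landing data raw into a staging table first (then transforming downstream) is the ELT pattern the posting alludes to; automated tests would typically assert on the staged row counts and types.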
Nice to have:
- Experience with Agile methodology
- Knowledge of big data ingestion
- Experience with BI / Analytics
- Bilingual (French/English)
Job Types: Full-time, Permanent
Work Location: In person