Job Description - Data Engineer - ETL / Python / Hadoop / Snowflake
Responsibilities
Translate business needs into ETL/ELT logical models, and design data structures flexible enough to scale with business solutions
Design and implement data pipelines using AWS Glue, AWS Lambda, Spark, and Python
Define and deliver reusable components for the ETL/ELT framework
Define optimal data flow for system integration and data migration
Integrate new data management technologies and software engineering tools into existing structures
Qualifications
Experienced in the design, development, and implementation of large-scale projects in the financial industry using data warehousing ETL tools (Pentaho)
Experience creating and scheduling ETL transformations and jobs with Pentaho Data Integration (Kettle), including the Spoon designer
Strong knowledge of and hands-on experience with SQL, Snowflake, Python, and Spark
Experience with Big Data/distributed frameworks such as Spark, Kubernetes, Hadoop, and Hive
Ability to design ETL/ELT solutions based on user reporting and archival requirements
Strong sense of customer service to consistently and effectively address client needs
Self-motivated; comfortable working independently under general direction
Primary Skills:
Spark
Python
Snowflake
Airflow
Hadoop
Education:
Bachelor's degree
Additional client information:
Original job Data Engineer - ETL / Python / Hadoop / Snowflake posted on GrabJobs ©.