Data Engineer, Python / AWS

Company: Credera
Job Type: Full Time


Job Description - Data Engineer, Python / AWS

REMOTE ROLE for contractors anywhere in North America
12-month contract with possible extension
Rate: $65-80/hr USD, based on experience

Description:

Data Engineers at TA Digital work closely with Subject Matter Experts (SMEs) to design the ontology (data model), develop data pipelines, and integrate Foundry with the external systems that contain the data. Data Engineers also provide guidance and support on accessing and leveraging the data foundation to create new workflows or analyze data.

Responsibilities Include:

Leverage Generative AI on AWS data
Integrate new data sources into Foundry using Data Connection
Implement 2-way integrations between Foundry and external systems
Develop pipelines transforming tabular or unstructured data
Implement data transformations in PySpark / PySpark SQL to derive new datasets or create ontology objects (see the sketch after this list)
Set up support structures for pipelines running in production
Monitor and debug critical issues such as data staleness or data quality problems
Improve performance of data pipelines (latency, resource usage)
Design and implement an ontology based on business requirements and available data
Provide data engineering context for application development
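As a hedged illustration of the pipeline responsibilities above, the sketch below derives a new aggregated dataset from a tabular input in plain PySpark and flags stale partitions. The table name (raw_orders), column names (order_ts, amount, region), and the 7-day freshness threshold are hypothetical; in Foundry this logic would normally be wrapped in a Foundry transform, which is not shown here.

```python
# Hypothetical sketch: derive an aggregated dataset from raw orders and flag stale rows.
# Table and column names (raw_orders, order_ts, amount, region) are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("derive_order_summary").getOrCreate()

raw_orders = spark.table("raw_orders")  # assumed registered tabular input

order_summary = (
    raw_orders
    .filter(F.col("amount") > 0)                      # basic data-quality filter
    .withColumn("order_date", F.to_date("order_ts"))  # normalize timestamp to date
    .groupBy("region", "order_date")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("amount").alias("total_amount"),
    )
    # flag partitions older than 7 days so staleness is visible downstream
    .withColumn("is_stale", F.col("order_date") < F.date_sub(F.current_date(), 7))
)

order_summary.write.mode("overwrite").saveAsTable("order_summary")
```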

Requirements:

Generative AI on AWS – experience with services such as Amazon Bedrock, Amazon SageMaker, Amazon EC2, Amazon EC2 UltraClusters, AWS Trainium, or AWS Inferentia
Python – complete language proficiency
SQL – proficiency in querying language (join types, filtering, aggregation) and data modeling (relationship types, constraints)
PySpark – basic familiarity (DataFrame operations, PySpark SQL functions) and awareness of how it differs from other DataFrame implementations such as Pandas (see the comparison sketch after this list)
Distributed compute – conceptual knowledge of Hadoop and Spark (driver, executors, partitions)
Databases – general familiarity with common relational database models and proprietary implementations such as SAP, Salesforce, etc.
Git – knowledge of version control / collaboration workflows and best practices
Iterative working – familiarity with agile and iterative methodologies and with rapid user-feedback gathering
Data quality – best practices
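
To ground the SQL and PySpark items above, here is a small hedged sketch of the same left join plus aggregation written in both PySpark and Pandas. The table and column names (customers, orders, customer_id, region, amount) are made up; the point is the API difference: PySpark builds a lazy, distributed query plan that only executes on an action, while Pandas operates eagerly on local in-memory data.

```python
# Illustrative only: customers/orders tables and their columns are hypothetical.
import pandas as pd
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pyspark_vs_pandas").getOrCreate()

# --- PySpark: lazy and distributed; nothing runs until an action like .show() ---
customers = spark.table("customers")  # columns: customer_id, region
orders = spark.table("orders")        # columns: customer_id, amount
spark_result = (
    orders.join(customers, on="customer_id", how="left")  # left join, as in SQL
    .groupBy("region")
    .agg(F.sum("amount").alias("total_amount"))
)
spark_result.show()  # action: triggers execution of the plan

# --- Pandas: eager, single-machine, in-memory ---
customers_pd = pd.DataFrame({"customer_id": [1, 2], "region": ["east", "west"]})
orders_pd = pd.DataFrame({"customer_id": [1, 1, 2], "amount": [10.0, 5.0, 7.5]})
pandas_result = (
    orders_pd.merge(customers_pd, on="customer_id", how="left")
    .groupby("region", as_index=False)["amount"].sum()
    .rename(columns={"amount": "total_amount"})
)
print(pandas_result)
```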

Location: Washington, District of Columbia
