AWS/Azure Data Engineer Job in United States | Yulys

Job Title: AWS/Azure Data Engineer

Company Name: Precision Technologies
Salary: USD 105,000.00 - USD 125,000.00 Yearly
Job Industry: Program Development
Job Type: Full time
Workplace Type: Remote
Location: United States
Required Candidates: 1 Candidate
Skills:
Magento Module Development
Extension Development
REST API Integration
Job Description:


We’re hiring an AWS/Azure Data Engineer with 5+ years of experience for our direct clients and implementation partners with active full-time openings! (No C2C)


Client Locations: New York, New Jersey, Texas, Georgia, Pennsylvania, Connecticut, Massachusetts, and Chicago, IL.


📢 We welcome applications from candidates looking to restart their careers after a break!


Title: AWS/Azure Data Engineer (Full-Time)

What We’re Looking For:

  1. 5+ years of hands-on experience in designing, building, and optimizing scalable data pipelines, data integration workflows, and data warehousing solutions across enterprise-level environments.
  2. Deep expertise in ETL/ELT development, data ingestion, and data transformation using industry-standard tools such as Apache Spark, Airflow, Talend, or Informatica.
  3. Proven experience working with cloud platforms including AWS (Glue, Redshift, S3), Azure (Data Factory, Synapse, Blob Storage), or GCP (BigQuery, Dataflow) for data engineering workloads.
  4. Advanced proficiency in SQL, Python, or Scala for implementing robust, efficient, and reusable data processing logic.
  5. Strong familiarity with data lake and lakehouse architectures, leveraging file formats such as Parquet, Avro, or ORC for optimized storage and performance.
  6. Solid experience in data modeling, dimensional modeling, data governance, and metadata management, ensuring accuracy, security, and compliance.
  7. Hands-on experience with real-time streaming and event-driven architectures using tools like Apache Kafka, AWS Kinesis, or Azure Event Hubs.
  8. Skilled in CI/CD pipeline setup and source control practices using tools like Git, Jenkins, Azure DevOps, or GitLab CI/CD for automated and collaborative delivery.
  9. Demonstrated success in performance tuning, data quality validation, and maintaining pipeline reliability for mission-critical data workloads.
  10. Excellent problem-solving, root cause analysis, and debugging skills with a proactive approach to improving data accuracy, efficiency, and platform scalability.
  11. Experience working in Agile/Scrum environments, collaborating cross-functionally with data scientists, business analysts, and DevOps teams to deliver end-to-end data solutions.
  12. Bonus: Exposure to data cataloging tools, data observability platforms, or integration with BI dashboards (e.g., Tableau, Power BI, Superset).


Please reach out to me at +1 732-348-5149, or send your updated resume to javeed@ptcit.com.


📢 Next Steps: I apologize in advance if I miss your call or if there is a delay in my response. Please be assured that I will get back to you at the earliest opportunity. All applications will be reviewed carefully, and we sincerely appreciate your patience and understanding during this process.
