
JobFinder Spain
Position:
Scala or Java Data Engineer (Senior/Lead) ID28611 – Campinas
Requirements:
Posted on 01/23/2025
Job Information
City: Campinas
State/Province: São Paulo
Postal Code: 13000-000
Industry: IT Services
Job Description
AgileEngine is one of the Inc. 5000 fastest-growing companies in the U.S. and a top-3 ranked dev shop according to Clutch. We create award-winning custom software solutions that help companies across 15+ industries change the lives of millions.
If you like a challenging environment where you’re working with the best and are encouraged to learn and experiment every day, there’s no better place - guaranteed!
What you will do
- Design, develop, maintain, and enhance highly scalable data engineering solutions leveraging AWS services.
- Design, build, document, and implement scalable pipelines with a clear focus on data quality and reliability.
- Ingest and transform structured, semi-structured, and unstructured data from multiple sources.
- Build an enterprise-level ETL/ELT solution.
- Innovate and build proprietary algorithms to tackle complex problems involving interesting data challenges.
- Execute and continually optimize new customer data ingestion and model implementation processes.
- Integrate business knowledge with technical functionalities.
- Collaborate regularly with application engineers, data scientists, product managers, and product delivery teams.
- Develop solutions at the intersection of data and ML.
- Monitor workflow performance and reliability, and ensure SLA targets are met.
- Automate existing code and processes using scripting, CI/CD, infrastructure-as-code, and configuration management tools.
- Work on AI problems such as NLP, image-analysis featurization, and OCR labeling.
Must haves
- 5+ years of experience with Scala (preferred) OR Java for data engineering and ETL.
- 5+ years of experience with data pipeline tools such as Spark.
- 5+ years of experience working in the AWS (preferred) or GCP ecosystem.
- High proficiency in SQL programming with relational databases – experience writing complex SQL queries is a must.
- Experience applying best practices when using cloud-provider AI services.
- Ability to contribute in an agile, collaborative, and fast-paced environment.
- Excellent problem-solving ability, including thinking outside the box to solve common problems.
Nice to haves
- 5+ years of experience with DAG orchestration and workflow management tools like Airflow or AWS Step Functions.
- 3+ years of experience using cloud-provider AI services.
- 3+ years of experience with Kubernetes.
- 3+ years of hands-on experience developing ETL solutions using RDS and warehouse solutions using AWS services (S3, IAM, Lambda, RDS, Redshift, Glue, SQS, EKS, ECR).
- Experience working with distributed computing tools like Hive.
- Experience with Git and CI/CD tools like Jenkins, GitLab CI/CD, or GitHub Actions.
- Experience with containers/orchestration tools like Docker or Helm.
- Experience in a fast-paced agile development environment.
- AWS certifications (AWS Certified Solutions Architect, Developer, or DevOps).
- Knowledge of commercial claims management systems.
The benefits of joining us
- Professional growth: Accelerate your professional journey with mentorship, TechTalks, and personalized growth roadmaps.
- Competitive compensation: We match your ever-growing skills, talent, and contributions with competitive USD-based compensation and budgets for education, fitness, and team activities.
- A selection of exciting projects: Join projects with modern solutions development and top-tier clients that include Fortune 500 enterprises and leading product brands.
- Flextime: Tailor your schedule for an optimal work-life balance, with the option of working from home or at the office – whatever makes you happiest and most productive.
Salary:
To be agreed
Benefits:
To be agreed