StreamSets Developer (Senior Level)

Job Highlights

  • Laptop to be provided
  • Upon regularization: 30 paid leaves per year
  • Upon regularization: HMO for you plus one dependent, worth P150,000

Role Summary

Lead the effort to design, build, and configure applications, acting as the primary point of contact.

Responsibilities

  • Manage critical data pipelines that power analytics for various business units.
  • Develop pipelines in StreamSets according to the business owners' requirements, using Python, JSON, and Groovy scripting to deploy StreamSets pipelines to the server.
  • Build, deploy, and manage data pipelines using StreamSets Data Collector (SDC) and the StreamSets DataOps Platform.
  • Design real-time and batch data workflows to support analytics, reporting, and operational systems.
  • Ensure data pipelines are optimized for performance and scalability.
  • Connect and integrate data from diverse sources such as relational databases, cloud services, APIs, and streaming platforms (e.g., Kafka).
  • Handle complex data transformations, data cleansing, and enrichment processes.
  • Collaborate with stakeholders to understand data requirements and provide tailored solutions.
  • Monitor and troubleshoot data pipelines to ensure reliable data flow and minimal downtime.
  • Automate pipeline deployments using CI/CD tools and implement version control.
  • Implement data quality and governance frameworks.

Qualifications

  • The ideal candidate will possess a strong educational background in computer science, software engineering, or a related field.
  • Must have a minimum of 4 years of experience in StreamSets.
  • 4+ years of experience in data engineering or related roles.
  • Proficiency in StreamSets Data Collector and Transformer.
  • Strong understanding of ETL/ELT processes and data pipeline design principles.
  • Knowledge of databases (SQL, NoSQL), data warehousing, and big data platforms (Hadoop, Spark).
  • Familiarity with streaming platforms such as Kafka or AWS Kinesis.
  • Proficiency in programming languages like Python, Java, or Scala.
  • Experience with cloud platforms like AWS, Azure, or GCP.
  • Familiarity with DevOps tools such as Jenkins, Git, and Docker.
  • StreamSets Professional Certification.
  • AWS Certified Data Analytics – Specialty or equivalent cloud certifications.

Additional Information

Career Level

Senior

Work Location

BGC Taguig

Work Setup

RTO (Dayshift)

Job Type

Full Time/Project Based
