PP - Java with Data Processing Engineer - Job0118

Remote
Contracted
Experienced




Job Summary

We are seeking a highly skilled Java with Data Processing Engineer to join our dynamic team. This role is pivotal in designing and developing data products and platforms that enhance our customer applications. The ideal candidate will possess a strong background in software engineering, particularly in Java, and have extensive experience in data processing and analytics. You will collaborate closely with product teams to identify business requirements and architect large-scale solutions, ensuring that our data products meet the highest standards of performance and reliability.

Job Responsibilities

  • Design and Development: Lead the design and development of data products and platforms tailored for customer applications, ensuring scalability and performance.
  • Collaboration: Work closely with product teams to gather business requirements and translate them into technical specifications for large-scale solution architecture.
  • Ownership: Take full ownership of key analytics components, services, and offerings from ideation through to production deployment.
  • Experimentation and Automation: Engage in experimentation, automation, and simulation to enhance product features and performance.
  • Performance Tuning: Contribute to performance tuning, improvements, and load testing to ensure optimal system performance.
  • Data Processing: Utilize Java for data processing and ingestion into Snowflake data warehouses, ensuring efficient data flow and integrity.
  • Mentorship: Provide guidance and mentorship to junior engineers, fostering a culture of knowledge sharing and continuous improvement.

Basic Qualifications

  • BS/MS/PhD in Computer Science, Mathematics, or a related field.
  • Minimum of 5 years of software engineering experience with a focus on data processing.
  • Strong experience in Java and the Spring Boot framework.
  • Proficient in Java for data processing and ingestion into Snowflake warehouses.
  • Experience with Ruby on Rails, with a demonstrated ability to contribute to projects using this framework.
  • Hands-on experience with big data tools/libraries and the MapReduce framework.
  • Proven experience in designing, developing, debugging, and monitoring distributed systems.
  • Proficient in testing Java code, primarily with JUnit, with a strong focus on data processing.
  • Strong experience in SQL, including stored procedures, for database management.
  • Experience with SnowSQL & PostgreSQL for database management.
  • Familiarity with QLess and REST APIs for service integration.
  • Proficient in Terraform for infrastructure as code.
  • Experience with Gradle for build automation.
  • Familiarity with Kafka for real-time data streaming.
  • Understanding of Protobufs for data serialization.
  • Experience with XML and SFTP for data interchange.
  • Knowledge of Kubernetes (K8s) and Linux is a nice-to-have.
  • Strong experience in PL/SQL, PL/pgSQL, and Snowflake.

Nice to Have

  • Knowledge of cloud platforms such as AWS & GCP; AWS experience is a plus as the company transitions to GCP.
  • Hands-on experience implementing machine learning models for actionable insights.
  • Proven track record in addressing business problems related to reporting and big data analytics.
  • Production-grade experience with tools such as Spark, Hive, Airflow, and SQL.
  • Strong sense of platform feature ownership from ideation to deployment.
  • Experience in fintech and payment workflow processes.


