Position title
Kafka/Spark Streaming Engineer
Description

As part of Daman’s Data Engineering team, you will architect and deliver highly scalable, high-performance data integration and transformation platforms. The solutions you work on will span cloud, hybrid, and legacy environments and will require a broad and deep stack of data engineering skills. You will use core cloud data warehouse tools, Hadoop, Spark, event streaming platforms, and other data management technologies. You will also engage in requirements and solution concept development, which calls for strong analytical and communication skills.

Responsibilities
  • Function as the solution lead for building the data pipelines that support the development and enablement of Information Supply Chains within our client organizations – this could include building (1) data provisioning frameworks, (2) data integration into the data warehouse, data marts, and other analytical repositories, (3) integration of analytical results into operational systems, and (4) data lakes and other data archival stores.
  • Leverage data integration tool components to develop efficient solutions for data management, data wrangling, data packaging, and integration. Develop the overall design and determine the division of labor across architectural components.
  • Deploy and customize Daman Standard Architecture components
  • Mentor client personnel and train them on the Daman Integration Methodology and related supplemental solutions
  • Provide feedback and enhance Daman intellectual property related to data management technology deployments
  • Assist in the development of task plans including schedule and effort estimation
Qualifications
  • Continuous Data Movement / Streaming / Messaging:
    • Experience using Kafka as a distributed messaging system
    • Experience with Kafka producer and consumer APIs
    • Understanding of event-based application patterns & streaming data
    • Experience with related technologies (e.g., Spark Streaming or other message brokers such as MQ) is a PLUS
  • Batch Data Movement – ETL (DataStage experience is preferred, but deep knowledge of data integration concepts is more important)
  • Other core data skills:
    • Competency in core data & analytics concepts
    • Competency in SQL, including advanced concepts
    • Traditional relational databases – DB2/UDB preferred
    • Distributed data & compute (partitioning, indexes, access patterns)
    • NoSQL platforms & concepts (document store – Couchbase, wide column store – Cassandra, graph – DataStax Enterprise Graph) is a PLUS
    • Machine learning is a PLUS
  • Bachelor’s Degree or foreign equivalent in Computer Science, Electrical Engineering, Mathematics, Computer Applications, Information Systems or Engineering is required

 

Daman is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Employment Type
Full-time
Job Location
Austin, Texas, United States
Date posted
April 16, 2021