Big Data Engineer

Warsaw, full time

Key duties:

  •     design and deliver automated transformation of large data sets leveraging MapReduce, streaming, and other emerging technologies
  •     leverage Kafka, HBase, Elasticsearch, etc. to ingest transformed data at scale
  •     collaborate with security experts to deliver high-impact web-based APIs
  •     analyze, monitor, and optimize for performance
  •     implement high-volume data integration solutions
  •     write and maintain documentation

The successful candidate will have:

  •    hands-on experience with large-scale, complex internet systems processing huge amounts of data in real time
  •    broad IT knowledge and a deep understanding of Linux/UNIX
  •    ability to independently analyze and solve problems
  •    commitment to assigned projects and a creative approach to the tasks at hand

Additionally, it's good to have:

  •    very good knowledge of technologies like Kafka, Hadoop, Spark, HBase
  •    knowledge of ad-serving systems
  •    experience programming in languages such as C++, Python, Java

To apply, write to us, adding the position name in the subject line.

Please note that we will contact only selected candidates.