Careers at Bigabid

Data Engineer

Bigabid is an innovative technology company, led by data scientists and engineers, devoted to mobile app growth. Our proprietary, machine-learning-powered ad platforms are the culmination of that devotion. We deliver valuable results and insights for a fast-growing clientele of major app developers using elite programmatic user-acquisition and retargeting technologies.

Our ever-evolving, state-of-the-art machine learning technology analyzes tens of terabytes of raw data per day to produce millions of ad recommendations in real time.

As a data engineer, you will work on our massive (petabyte-scale) data pipelines, making sure the data is clean, complete, and accessible. Your team's goal is to build ground-breaking tools that make our data scientists more productive and agile.

If you love working on complex data pipelines, understand the importance of reliable data, have felt the pain of big data inconsistencies, and are the type who thinks up great solutions and wants to bring them to life, Bigabid is your best challenge.


Responsibilities:

  • Create and maintain optimal data pipeline architecture
  • Build and maintain our feature store and machine learning orchestration mechanism
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Spark and AWS ‘big data’ technologies
  • Work with stakeholders, including the executive, product, and marketing teams, to assist with data-related technical issues and support their data infrastructure needs
  • Create data tools for analytics and data scientist team members
  • Work with data and analytics experts to strive for greater functionality in our data systems


Requirements:

  • 5+ years of coding experience (preferably Python)
  • Strong SQL abilities
  • 2+ years of experience with big data tools: Hadoop, Spark, Kafka, Presto, EMR, etc.
  • Experience building and optimizing ‘big data’ pipelines, including message queuing, stream processing, and highly scalable data sets
  • Experience performing root cause analysis on internal and external data and processes
  • Strong organizational skills with the ability to juggle multiple tasks within constrained timelines


Bonus Skills:

  • Experience with Airflow or other workflow management software
  • Familiarity with the Linux environment and bash scripting
  • Familiarity with machine learning techniques

Please fill out the form below to submit your interest.

Please review our privacy practices: read privacy policy.