We are seeking a skilled Big Data Developer to join our remote team in the United Arab Emirates. The ideal candidate will have a strong background in data engineering, experience working in large-scale data environments, and a track record of developing solutions with in-memory computing tools. You will be responsible for designing and building data pipelines, working with both SQL and NoSQL databases, and ensuring data quality through various processes.

Ideal candidates will have working knowledge of the Hadoop ecosystem, relational data stores, data integration techniques, XML, Python, Spark, SAS, R, emerging big data tools and technologies, visualization tools for big data, and ETL techniques in the Hadoop and AWS ecosystems. Projects leverage Agile methodology to enable new business capabilities.

Key Responsibilities:

  • Work efficiently in a UNIX/Linux environment to develop, manage, and optimize big data applications.
  • Develop in-memory computing solutions using R, Python, PySpark, and Scala.
  • Parse and process data from XML and JSON sources, utilizing shell scripting and SQL for data handling.
  • Design and implement data pipelines with a focus on data sourcing and data quality functions like standardization, transformation, linking, and matching.
  • Manage data storage and retrieval across SQL (DB2, SQL Server, Sybase, Oracle) and NoSQL (MongoDB, DynamoDB) databases.
  • Apply knowledge of data, master data, and metadata standards, processes, and technologies to support data-driven solutions.
  • Leverage DevOps tools and CI/CD pipelines using Docker and Jenkins, and apply Test-Driven Development (TDD) practices for continuous integration and deployment.

Required Skills & Experience:

  • 4-6 years of experience in big data development, with hands-on experience in UNIX/Linux environments.
  • Expertise in in-memory computing using R, Python, PySpark, and Scala.
  • Experience in parsing and shredding data from XML and JSON using shell scripting and SQL.
  • Proficiency with SQL (DB2, SQL Server, Sybase, Oracle) and NoSQL databases (MongoDB, DynamoDB).
  • Experience designing data pipelines and sourcing routines with a focus on data quality.
  • Familiarity with DevOps and CI/CD practices using tools such as Docker and Jenkins, and with Test-Driven Development (TDD).

Preferred Skills:

  • Experience in financial services or information management is a plus.
  • Familiarity with Agile methodologies, including Jira and VersionOne.
  • Ability to work in a fast-paced environment, managing multiple priorities and deadlines.
  • Knowledge of AWS Cloud services (EC2, EMR, ECS, S3, SNS, SQS, CloudFormation, CloudWatch, Lambda) is an advantage.

Benefits:

  • Competitive salary based on experience.
  • Full remote working flexibility.
  • Opportunities for professional development and career advancement.

To apply, please send your CV and a cover letter to info@raasrpo.com.

Job Category: Big Data Developer
Job Type: Full Time
Job Location: United Arab Emirates
