Big Data Engineer – BDEIS211

Job Information

  • Category: Data Engineer Jobs
  • Posted On: Dec 21, 2021
  • Qualifications: Bachelor's Degree
  • Employer Name: DelonJobs
  • Contact Email: cv@delon.ng

Job Description

DelonJobs is seeking to hire a Big Data Engineer who will collect, store, process, and analyze huge sets of data. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them. The individual will also be responsible for integrating them with the architecture used across the company. The role sits at the intersection of data, engineering, and product. Salary is negotiable. Candidates must possess a bachelor's degree in computer science/engineering with a minimum of 3 years' experience in their area of technical expertise. If you consider yourself a match and are interested in this role, kindly send your CV with the job title and code (Big Data Engineer – BDEIS211) as the subject to cv@delon.ng.


JOB RESPONSIBILITIES 

  • Selecting and integrating any Big Data tools and frameworks required to provide requested capabilities.
  • Implementing ETL processes.
  • Designing, building, maintaining, and testing robust data pipelines for different types of data streams.
  • Monitoring performance and advising on any necessary infrastructure changes.
  • Analyzing data and defining data retention policies.

KEY COMPETENCIES

  • Proficient understanding of distributed computing principles.
  • Management of a Hadoop cluster, with all included services.
  • Ability to solve any ongoing issues with operating the cluster.
  • Proficiency with Hadoop v2, MapReduce, HDFS.
  • Experience building stream-processing systems using solutions such as Storm or Spark Streaming.
  • Good knowledge of Big Data querying tools, such as Drill, Pig, Hive, and Impala.
  • Experience with Spark.
  • Experience with integration of data from multiple data sources.
  • Experience with NoSQL databases, such as HBase, Cassandra, MongoDB.
  • Knowledge of various ETL techniques and frameworks, such as Flume or Sqoop.
  • Experience with various messaging systems, such as Kafka or RabbitMQ.
  • Experience with Big Data ML toolkits, such as Mahout, SparkML, or H2O.
  • Good understanding of Lambda Architecture, along with its advantages and drawbacks.
  • Experience with Cloudera/MapR/Hortonworks.