Job Description

Cloudera is seeking a Senior Solutions Consultant to join its APAC Professional Services team in Indonesia. In this role you’ll develop massively scalable solutions to complex data problems using Hadoop, NiFi, Spark, and related Big Data technologies. This is a client-facing role that combines consulting skills with deep technical design and development in the Big Data space, and it offers the successful candidate the opportunity to travel across Asia Pacific, working with large customer organisations across multiple industries. Responsibilities:
  • Work directly with customers to implement Big Data solutions at scale using the Cloudera Data Platform and Cloudera Dataflow
  • Design and implement Hadoop and NiFi platform architectures and configurations for customers
  • Perform platform installation and upgrades for advanced secured cluster configurations
  • Analyse complex distributed production deployments, and make recommendations to optimise performance
  • Document and present complex architectures for customers’ technical teams
  • Work closely with Cloudera teams at all levels to help ensure the success of consulting engagements with customers
  • Drive projects with customers to successful completion
  • Write and produce technical documentation, blogs and knowledge-base articles
  • Participate in the pre- and post-sales process, helping both the sales and product teams to interpret customers’ requirements
  • Keep current with Hadoop and Big Data ecosystem technologies
  • Attend speaking engagements when needed
  • Travel up to 75%
Qualifications:
  • 10+ years of Information Technology and system architecture experience
  • 5+ years of Professional Services (customer-facing) experience architecting large-scale storage, data centre, and/or globally distributed solutions
  • 5+ years designing and deploying three-tier architectures or large-scale Hadoop solutions
  • Ability to understand big data use-cases and recommend standard design patterns commonly used in Hadoop-based and streaming data deployments.
  • Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, and data integration
  • Ability to understand and translate customer requirements into technical requirements
  • Experience implementing data transformation and processing solutions
  • Experience designing queries against data in HDFS using tools such as Apache Hive
  • Experience setting up multi-node Hadoop clusters
  • Experience configuring cluster security (LDAP/AD, Kerberos/SPNEGO)
  • Cloudera and/or HDP certification (HDPCA/HDPCD) is a plus
  • Strong experience implementing software and/or solutions in the enterprise Linux environment
  • Strong understanding of enterprise security solutions such as LDAP and/or Kerberos
  • Strong understanding of network configuration, devices, protocols, speeds and optimizations
  • Strong understanding of the Java ecosystem including debugging, logging, monitoring and profiling tools
  • Familiarity with scripting tools such as bash shell scripts, Python and/or Perl, Ansible, Chef, Puppet
  • Experience architecting data centre solutions, properly selecting server and storage hardware based on performance, availability, and ROI requirements

Job Detail

  • Job Id: 8b65c988a2e4d9e8
  • Location: id
  • Company:
  • Type: Private
  • Employment Status: Permanent
  • Positions: Available
  • Career Level: Experience
  • Gender: Male/Female