MindsMapped IT Consultants | H1B Sponsorship | H1b Visa Transfer | H1B Visa Employer | H1B Visa Sponsor | H1B Visa Jobs | H1B Visa Application | H1B Visa Approval | OPT CPT JOBS | Work Visa Sponsorship | Green card process | IT Staffing | H1B Transfer | H1b Jobs | IT Staffing | H1B Sponsorship | IT Staffing | H1B Sponsorship | Immigration Services | Online IT Training | Staff Augmentation | Outsourcing | IT Consultants | Work visa sponsorship | H1b visa transfer | IT Training | Green card process | OPT/CPT Candidates | IT Jobs | Java J2ee jobs | BigData Hadoop jobs | dotnet jobs | sap developer jobs | Informatica jobs | Business Intelligence | Cognos jobs | Business Analyst Jobs | Quality Analyst Jobs | Software Testing Jobs


Responsibilities

  • Have an excellent understanding of how Hadoop components work and be able to troubleshoot issues quickly.
  • Continuously evaluate Hadoop infrastructure requirements and design/deploy solutions (high availability, big data clusters, etc.).
  • Understand the end-to-end architecture and process flow.
  • Understand business requirements and participate in design discussions.
  • Design, develop, and maintain high-volume, Scala-based data processing batch jobs using industry-standard tools and frameworks in the Hadoop ecosystem, such as Spark, Kafka, Scalding, Cascading, Hive, Impala, Avro, Flume, Oozie, and Sqoop.
  • Design and maintain schemas in the Hadoop/Vertica analytics database and write efficient SQL for loading and querying analytics data.
  • Integrate data processing jobs and services with applications such as Coremetrics and Twitter, using technologies such as Flume, Kafka, RabbitMQ, Spring, MongoDB, ElasticSearch, Coherence, MySQL, etc.
  • Write appropriate unit, integration, and load tests using industry-standard frameworks such as Specs2, ScalaTest, ScalaCheck, JMeter, JUnit, Cucumber, and Grinder.
  • Maintain an ideas-to-implementation innovation strategy by exploring new technologies, languages, and techniques in the rapidly evolving world of high-volume data processing.
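
To make the batch-processing responsibilities above concrete, here is a hypothetical sketch (not part of the posting) of the kind of aggregation such a job performs. Plain Java streams stand in for a Spark Dataset so the example stays self-contained; the `Event` record and its field names are illustrative only.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class EventBatchSketch {
    // Simplified event record; field names are illustrative, not from the posting.
    record Event(String userId, String eventType) {}

    // Count events per (userId, eventType) pair, analogous to
    // dataset.groupBy("userId", "eventType").count() in Spark.
    static Map<String, Long> countByUserAndType(List<Event> events) {
        return events.stream().collect(Collectors.groupingBy(
                e -> e.userId() + "/" + e.eventType(), Collectors.counting()));
    }

    public static void main(String[] args) {
        List<Event> events = List.of(
                new Event("u1", "click"), new Event("u1", "click"),
                new Event("u2", "view"));
        countByUserAndType(events).forEach((k, n) -> System.out.println(k + " -> " + n));
    }
}
```

In a real Spark job the same grouping would run distributed across a cluster; the logic of the transformation, however, is what interviews for a role like this typically probe.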


Requirements

  • Very strong server-side Java experience, especially in an open-source, data-intensive, distributed environment; strong in Web Services.
  • Experience with open source products
  • Hands-on implementation experience with, and in-depth knowledge of, various Java, J2EE, and EAI patterns.
  • Well aware of various architectural concepts (multi-tenancy, SOA, SCA, etc.) and NFRs (performance, scalability, monitoring, etc.).
  • Good understanding of algorithms, data structures, and performance optimization techniques.
  • Experience working with batch-processing/real-time systems using various open-source technologies such as Solr, Hadoop, NoSQL, Spark, Hive, etc.
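
As a toy illustration (hypothetical, not from the posting) of the algorithm and data-structure fluency the requirements describe: finding the top-k most frequent items with a hash map for O(n) counting plus a size-bounded min-heap, so selection uses O(k) extra memory instead of sorting all counts.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.PriorityQueue;

public class TopK {
    // Return the k most frequent items, highest count first.
    static List<Map.Entry<String, Long>> topK(List<String> items, int k) {
        // O(n) frequency counting with a hash map.
        Map<String, Long> counts = new HashMap<>();
        for (String item : items) {
            counts.merge(item, 1L, Long::sum);
        }
        // Min-heap by count: once the heap exceeds k entries, evict the
        // smallest so only the k largest counts survive the pass.
        PriorityQueue<Map.Entry<String, Long>> heap =
                new PriorityQueue<>(Map.Entry.comparingByValue());
        for (Map.Entry<String, Long> e : counts.entrySet()) {
            heap.offer(e);
            if (heap.size() > k) {
                heap.poll();
            }
        }
        List<Map.Entry<String, Long>> result = new ArrayList<>();
        while (!heap.isEmpty()) {
            // poll() yields the smallest count first; prepend to sort descending.
            result.add(0, heap.poll());
        }
        return result;
    }
}
```

The same bounded-heap pattern shows up in high-volume pipelines (e.g. per-key rollups in Spark or Hive post-processing) whenever only the heaviest hitters need to survive a pass over large data.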

Experience Required: 0-20 years.

Candidates must be legally authorized to work in the U.S.