Tuesday, 3 February 2015

Hadoop and Big Data Application Architect (Anywhere, USA), preferably U.S. West or U.S. East – Remote Project

Job Title: Hadoop and Big Data Application Architect (Anywhere, USA)
Duration: Long Term
Client: Nvent Solutions
Location: U.S. West or U.S. East preferred – work remotely when you are not at the client site (US). Must be willing to travel to the client site when needed; travel will be paid for. Travel may be up to 80%.

Nvent is looking for an architect to join our services team. As a Big Data Application Architect on our consulting team, you will work directly with our customers and partners to optimize their plans and objectives for architecting, designing, implementing, and deploying Big Data applications and platforms, including technologies such as Apache Hadoop, Cassandra, Cascading, Storm, and Kafka. You must be very curious, self-driven, and have great communication skills. You will be working with both local and distributed team members.
Salary: Open and Depends on Experience – We want you!
Big Data Systems Architect Job Description:
Key responsibilities include:
  • Work directly with customers’ technical resources to devise and recommend solutions based on the understood requirements 
  • Analyze complex distributed production deployments, and make recommendations to optimize performance 
  • Document and present complex architectures for customers’ technical teams
  • Work closely with Nvent’s teams at all levels to help ensure the success of project consulting engagements with customers
  • Help design and implement Hadoop architectures and configurations for customers
  • Drive projects with customers to successful completion 
  • Write and produce technical documentation, knowledge base articles and screencasts
  • Participate in the pre- and post-sales process, helping both the sales and product teams to interpret customers’ requirements
  • Keep current with Hadoop, Storm, and other Big Data ecosystem technologies
  • Attend speaking engagements when needed 
  • Travel up to 80%
Required expertise:
  • More than five years of Professional Services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions
  • 2+ years designing and deploying 3-tier architectures or large-scale Hadoop solutions
  • Ability to understand and translate customer requirements into technical requirements 
  • Experience implementing data transformation and processing solutions using Apache Pig
  • Experience designing data queries against data in the HDFS environment using tools such as Apache Hive
  • Experience implementing MapReduce jobs
  • Experience setting up multi-node Hadoop clusters
  • Strong experience implementing software and/or solutions in the enterprise Linux or Unix environment 
  • Strong understanding of various enterprise security solutions such as LDAP and/or Kerberos
  • Strong understanding of network configuration, devices, protocols, speeds and optimizations 
  • Familiarity with scripting tools such as bash shell scripts, Python and/or Perl
  • Strong understanding of the Java ecosystem and enterprise offerings, including debugging and profiling tools (jconsole), logging and monitoring tools (log4j, JMX), and security offerings (Kerberos/SPNEGO). 
  • Significant previous work writing to network-based APIs, preferably REST/JSON or XML/SOAP 
  • Solid background in database administration or design – Oracle RAC a plus
  • Excellent verbal and written communications
  • Experience in architecting data center solutions – properly selecting server and storage hardware based on performance, availability and ROI requirements
  • Demonstrable experience using R and the algorithms provided by Mahout 
Nice to have, but not required:
  • Ability to understand big data use-cases, and recommend standard design patterns commonly used in Hadoop-based deployments. 
  • Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, data integration, etc.

If you enjoy solving challenging data science problems with large datasets, and are looking for an opportunity to work at a company that fosters innovation and creativity while delivering cutting-edge technologies, then submit your resume today along with a contact number and a time/day when you can be reached!

We have a very good relationship with the Hiring Manager and for the right candidate we can arrange an interview within a few days.

We look forward to connecting with you.

Thanks & Regards

Aravind
COMPQSOFT, Inc.
HUB Zone, SDB, MBE Certified
An ISO 9001:2008 & ISO 27001:2005 Certified Company
Phone: 281-978-4434  |  Fax: 281-657-6717