Thursday 9 November 2017

Required: Onsite Full-time Permanent Big Data Architect in Plano, TX

Role: Big Data Architect
Location: Plano, TX
Employment Type: Full-time, Permanent

Job Details:
•             Bachelor’s degree or foreign equivalent required. One year of relevant work experience may be considered in lieu of each year of education.
•             Minimum of 9 years of work experience in the Information Technology field.
•             Minimum of 5 years of hands-on experience with Big Data technologies.
•             Demonstrated strong leadership, analytical, and organizational skills, with excellent communication and articulation.
•             Experience with architecture and design of large-scale shared data solutions with multiple stakeholders.
•             Expertise with enterprise Hadoop distributions such as Cloudera or Hortonworks.
•             Expertise in Hadoop ecosystem components such as HDFS, MapReduce 2, Hive, Pig, ZooKeeper, Oozie, Flume, Impala, Spark, and Sqoop.
•             Expertise in building distributed systems, query processing, database internals, or analytic systems.
•             Expertise with data schemas, including logical and physical data modeling.
•             Hands-on development experience with Spark, HBase, Java (MapReduce), and Python (Linux shell-style scripting).
•             Able to independently architect solutions, lead code reviews and ensure quality throughout the life of each project.
•             Experience in the full software development lifecycle of data warehousing projects.
•             Strong knowledge of database modeling principles, techniques and enterprise data management best practices.
•             Hands-on experience loading data into HDFS from heterogeneous databases — DB2, Oracle, and SQL Server — using Apache Sqoop.
•             Experience analyzing data using Hive, Pig, and Impala, and managing and navigating data and tables using Hue.
•             Extensive experience with Oozie, Flume, Sqoop, Spark, Storm, and Hive for data loading and analytics.
•             Experience exploring data lakes with big data tools for fraud and risk detection, sentiment analysis, etc., using Spark libraries.
•             Experience with data mining techniques and analytics functions.
•             Predictive analytics experience using R.
•             Proficient in writing SQL with complex joins and aggregations, as well as UNIX scripts.
•             Good understanding of Data Warehouse modeling concepts.
•             Must be able to quickly provide solutions or enhancements for issues reported by clients or users.
•             Flexibility to self-learn and understand the system, and to assist with query tuning and application performance.
•             Ability to effectively manage day-to-day interactions and relationships with a diverse group of team members

Preferred

•             At least 7 years of experience in software development life cycle stages
•             At least 5 years of experience with Big Data technologies and the surrounding ecosystem
•             At least 7 years of experience in Project life cycle activities on development and maintenance projects
•             At least 3 years of experience in Design and Architecture review
•             At least 2 years of experience in application support and maintenance (including some on-call support experience)
•             Good analytical skills
•             High-impact communication skills
•             Ability to ramp up in new technologies
•             Ability to work in a team and in diverse, multi-stakeholder environments

•             Experience in, and desire to work in, a global delivery environment

NEERAJ KUSHWAHA
Desk: 609-853-0818, Ext. 2260
Fax: 609-799-5746
neeraj.kushwaha@nityo.com
www.nityo.com
For more openings follow: http://usajobs-resumes.blogspot.in/