Careers

Find Jobs Below:

If you are a Data Warehousing professional with Big Data skills, we want to talk with you.

This is a client-facing position with travel where you will deliver technical solutions to satisfy business needs. You will be responsible for providing the framework for positioning our approach to Enterprise Data Architecture and Big Data at a client site. You will be adept at managing both client and partner expectations to deliver quality services and solutions associated with our core offerings, and will possess excellent presentation skills, strong documentation skills, and the ability to communicate with various levels of management and technical staff. In addition, the Senior Big Data Architect plays an active role in pre-sales activities, including building and promoting new offerings, building and delivering presentations, supporting scoping and estimating activities, and developing statements of work (SOWs).

Design, architect, and build a data platform on Big Data technologies:

  • Lead innovation by exploring, investigating, recommending, benchmarking, and implementing data-centric technologies for the platform
  • Serve as the technical architect and point person for the data platform

Skills:

  • Have a passion for Big Data technologies and a flexible, creative approach to problem solving.
  • Excellent problem solving and programming skills; proven technical leadership and communication skills
  • Have extensive experience with data implementations, data storage and distribution
  • Have made active contributions to open source projects like Apache Hadoop or Cassandra
  • Have a solid track record of building large scale systems utilizing Big Data Technologies
  • Excellent understanding of Big Data Analytics platforms and ETL in the context of Big Data

We are looking for an individual who has built Big Data systems from the ground up, has extensive experience implementing Big Data solutions, and is an expert in the Hadoop stack.

The vision is to architect a reliable, scalable data platform and provide standard query and analytics interfaces for large security-related data sets that are as transparent, efficient, and easy to access as possible for the client's varied applications. This individual also understands the challenges of Big Data across distributed systems, including replication and synchronization across many machines.

Big Data Skills of interest

  • Apache Hadoop
  • MapReduce
  • HBase
  • Hive
  • nPath
  • Hortonworks
  • HCatalog
  • Cloudera
  • NoSQL
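The MapReduce model behind several of the tools above can be illustrated without a cluster. Below is a minimal single-machine word-count sketch in plain Java streams; the class and method names are illustrative, and Hadoop's real API distributes the same map/shuffle/reduce phases across many machines:

```java
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

public class WordCount {
    // Simulates the map phase (tokenize into words) and the reduce phase
    // (count occurrences per key) of MapReduce on one machine. Hadoop
    // adds a shuffle step between the two phases to route each key to
    // the reducer responsible for it.
    public static Map<String, Long> count(String text) {
        return Arrays.stream(text.toLowerCase().split("\\W+"))
                .filter(w -> !w.isEmpty())                       // drop empty tokens
                .collect(Collectors.groupingBy(                  // group by word (the key)
                        w -> w,
                        Collectors.counting()));                 // reduce: count per key
    }
}
```

The same pairing of a per-record map function with a per-key reduce function is what Hive and Pig queries compile down to on a Hadoop cluster.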

Other technologies of interest

  • Java/J2EE/Spring
  • Application servers: JBoss, WebLogic, and WebSphere
  • Web servers: Apache, Tomcat, and Netscape

Strong Java/J2EE professional with Big Data exposure (first preference)

  • Excellent Big Data skills
  • Design and architecture exposure
  • Framework exposure
  • Experience working on large enterprise applications
  • Excellent database exposure
  • Exposure to Big Data technologies such as Hadoop, MapReduce, NoSQL, Lucene, Pig, Hive, Cassandra, CouchDB, MongoDB, DynamoDB, Flume, MapR, and Mahout
  • Experience setting up the Hadoop ecosystem: environments, clusters, etc.

Location: Phoenix, AZ / Portland, OR / Atlanta, GA / Minneapolis, MN


Essential Skills:

  • Expert understanding of Magento's code structure, extension architecture, theming hierarchy, and fallback components
  • Expertise in authoring and extending Magento extensions
  • Awareness of the design patterns used in Magento 2 is a plus
  • Experience working with third-party Magento extensions
  • Firm grasp of Git-based source control
  • Competent with PHP object-oriented programming and MVC applications
  • Competent with MySQL-based database concepts
  • PHP 5, Magento, MySQL, Ajax, CSS, HTML5, JavaScript, jQuery, and other e-commerce or CMS platforms
  • Strong knowledge of Zend or at least one framework, plus MySQL
  • Thorough knowledge of e-commerce application development


Role description:

  • Develop new projects built on the Magento platform, writing custom extensions
  • Maintain features (existing as well as new) built on Magento 2
  • Perform unit testing and automated testing of the features developed
  • Create and update design documents and code review documents
  • Conduct feature demos for customers (internal as well as external)

 


Location: Atlanta, GA

Job Description:

  • Develop and maintain REST/SOAP web services using Java 8, Spring MVC, Spring Boot, and Tomcat
  • Develop a scheduler using Spring Boot
  • Develop a web-based dashboard using Spring MVC, JSP, and Tomcat
  • Looking for a person with a Java background and the ability to write scripts and programs for the testing team

Required experience, skills, and qualifications:

  • Java 8, Spring MVC, Spring Boot, Tomcat, JSP, and microservices
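As a dependency-free sketch of the request/response pattern such a REST service exposes, the JDK's built-in `HttpServer` stands in for Spring Boot below; the `HealthEndpoint` class name and the `/health` route are invented examples, not part of the actual project:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;

public class HealthEndpoint {
    // Starts a minimal HTTP server exposing GET /health with a JSON body.
    // Passing port 0 binds an ephemeral port, which is handy for tests.
    public static HttpServer start(int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/health", exchange -> {
            byte[] body = "{\"status\":\"UP\"}".getBytes();
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);  // status + length
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }
}
```

In Spring MVC the same endpoint would be a `@RestController` method; the underlying contract of mapping a route to a handler that writes a status code and body is identical.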



Role: Big Data Engineer

Location: NYC, NY

Duration: 12 months


Job Description:

  • A formal background and proven experience in engineering, mathematics and computer science, particularly within the financial services sector
  • Hands-on programming/scripting experience (Python, Java, Scala, Bash)
  • Hands-on with Ansible playbooks to create Hadoop builds and deployments
  • DevOps tools (Chef, Docker, Puppet, Bamboo, Jenkins)
  • Linux/Windows command line; an understanding of Unix/Linux, including system administration and shell scripting
  • Proficiency with Hadoop v2, MapReduce, HDFS, Spark
  • Management of Hadoop cluster, with all included services
  • Good knowledge of Big Data querying tools, such as Pig, Hive, Impala and Spark
  • Data Concepts (ETL, near-/real-time streaming, data structures, metadata and workflow management)
  • The ability to function within a multidisciplinary, global team. Be a self-starter with a strong curiosity for extracting knowledge from data and the ability to elicit technical requirements from a non-technical audience
  • Collaboration with team members, business stakeholders and data SMEs to elicit, translate, and prescribe requirements. Cultivate sustained innovation to deliver exceptional products to customers
  • Experience with integration of data from multiple data sources
  • Strong communication skills and the ability to present deep technical findings to a business audience
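The ETL pattern named in the data-concepts bullet above can be sketched on a single machine. The example below is illustrative only: the `MiniEtl` class and the "user,bytes" row layout are made-up stand-ins, not a real client schema:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class MiniEtl {
    // Extract rows from a raw feed, transform them into typed fields,
    // and load them into an aggregated view (total bytes per user).
    public static Map<String, Long> bytesPerUser(List<String> rawLines) {
        return rawLines.stream()
                .map(String::trim)
                .filter(l -> !l.isEmpty() && l.contains(","))    // extract: keep well-formed rows
                .map(l -> l.split(","))                          // transform: split into fields
                .collect(Collectors.groupingBy(                  // load: aggregate per user
                        f -> f[0],
                        Collectors.summingLong(f -> Long.parseLong(f[1].trim()))));
    }
}
```

At cluster scale the same three stages run over HDFS inputs via Spark or MapReduce rather than an in-memory list, but the extract/transform/load decomposition is unchanged.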